
      Restricted Science




          Abstract

Introduction

In 2004, the National Science Advisory Board for Biosecurity (NSABB) was created as an independent federal advisory body. Its role was to advise the U.S. government on strategies to prevent the misuse of dual-use research. Since its inception, the NSABB has ruled on two cases: the 1918 flu-virus synthesis conducted by government scientists in 2005 and the H5N1 experiment conducted in 2011 by two separate university teams in the Netherlands and the United States. In the first case, without much public debate, the NSABB quickly decided to support publication of the experiment's findings; in the second, it initially requested a halt on publication and the removal of methodological details from the proposed articles, for fear that they could be used by malevolent actors to create a pandemic among humans. The decision was reversed 6 months later, but it sparked a worldwide firestorm, engaging the scientific and security communities in a heated debate about whether the dissemination of scientific data should be regulated and what types of research should be conducted.

Yet the key question that triggered the overall controversy remains largely ignored: under what conditions could the H5N1 experiment be reproduced, if at all, by malevolent actors using only published data? The lack of attention to the issue of reproducibility stems from a widespread belief that science is inherently reproducible and that published data are the primary tool allowing such replication. Empirical evidence suggests otherwise. Analysis of recent dual-use research projects and past bioweapons programs shows that reproducing past work faces stiff challenges, especially when using written protocols alone. Translating a scientific idea into a product that functions reliably is a challenge routinely encountered in the pharmaceutical industry, as well as in past bioweapons programs.
In this article, we start by emphasizing the challenges associated with reproducing scientific experiments and applying them to specific purposes, based on empirical research conducted by the authors. We then suggest criteria to weigh security risks against the health benefits of dual-use research for the purpose of producing more accurate threat assessments, without imposing unnecessary restrictions on the diffusion of knowledge.

Sources of Reproducibility Challenges in Science

While the H5N1 controversy was raging, the National Institutes of Health (NIH) revealed that much of its past funded research could not be reproduced. In 2012, for example, the drug company Amgen reported that it had failed to reproduce 89% of the findings from 53 major cancer-related papers (1). The previous year, the pharmaceutical company Bayer in Germany indicated that it could not validate the results of two-thirds of its own preclinical studies (1). Interestingly, no connections were made between these revelations and the H5N1 experiment, also funded by the NIH.

Empirical research shows that some experiments are extremely difficult to replicate, due to the contingencies associated with experimental work and the nature of knowledge. First, replication of past work using published documents is problematic because scientific articles rarely provide a detailed account of all stages of an experiment and their associated contingencies. The methods section of a scientific paper is usually brief and provides only an overview of the experimental methods to show that a concept has been implemented; it is not intended to be a step-by-step protocol (2). Second, scientific articles rarely delve into the problems that researchers encountered during the experiment, nor do they explain how long it took to resolve such problems. For example, the article describing the 2010 creation of a self-replicating Mycoplasma mycoides cell by researchers at the J. Craig Venter Institute (JCVI) includes a two-sentence statement indicating that the team faced challenges with transplantation, which were eventually overcome (3). However, interviews with JCVI scientists reveal that transplantation attempts routinely failed for 2 years, leading the scientist responsible for transplantation to consider abandoning the project. As her supervisor explains:

After two years of just seven days a week [of continuous work], she came into my office saying she wanted to work on a new project; she couldn't do this anymore …. We tried lots and lots of different approaches. And we had suspicions of something we thought might work … but these were hard experiments to do with a lot of reagent prep for every experiment …. Everything you could possibly think of that might allow you to move a really big piece of DNA into a cell [we tried] (4).

Publications often play down the long and painstaking process of systematic problem solving that is required to resolve the difficulties involved in experimental work, leaving the false impression that problems can be readily overcome. Experimental work also sometimes requires the development of new techniques and protocols that cannot easily be used for other purposes or by other individuals. In the M. mycoides transplantation case, a new protocol had to be designed for the experiment and was published in 2007. Yet, 6 years later, the researchers were not able to use this protocol for work with another organism (4). Additionally, the researchers worked with large pieces of DNA that break easily during pipetting, introducing an additional hurdle to replicating the experiment. To prevent damage, the team emphasized the importance of pipetting "gently" and of using pipette tips with wide openings through which large pieces of DNA could pass unobstructed (5). Although pipetting is a common technique, not all scientists were able to pipette the M. mycoides DNA gently enough to keep it intact.
As one researcher explains:

Our genome transplanters are really good at this [keeping supercoiled DNA intact] … I sat in the same hood … with Carole [Lartigue, the expert] and we used the same reagents … the only thing different was each of us had our own pipettes and plates, and I did a transplant in parallel with her … she got … 2,000 colonies [successful transplants] and I got 20. I thought I was doing exactly what she was doing in pipetting slowly. [But] doing these tricks is still very much a magic hand sort of thing (4).

This highlights a problem well known among practicing scientists but generally ignored in evaluations of the potential reproducibility of dual-use experiments: the importance of expertise acquired through years of practice in the laboratory. Much of this expertise involves tacit skills, such as the muscle memory that allows a researcher to know what constitutes "gentle" pipetting, that are not easily translated into words or acquired and replicated by others, even when a technique is demonstrated in person or an experiment is done in cooperation with the technique's designer (6–8). Moreover, laboratory disciplines and routines often contribute to the development of laboratory-specific skills that cannot be standardized or transferred to a new location. The State University of New York at Stony Brook virologists who synthesized poliovirus in 2002 emphasized the importance of maintaining "sameness" in their laboratory routines, materials, and technicians to ensure successful results. Tellingly, a post-doctoral fellow who spent 6 years in the New York laboratory could not replicate his work in his home laboratory in Belgium (9). Thus, the tacit, personal, and local nature of knowledge constitutes a strong barrier to reproducibility. Because know-how does not easily translate into words, its importance for experimental success is frequently ignored in threat assessments.
Application to Nefarious Objectives

The NSABB's initial decision to edit the H5N1-related article before its publication was followed by the Dutch government's decision to impose export-control restrictions on the Dutch team's article. Dutch authorities claimed that the research fell under European Council Regulation EC 428/2009, which seeks to prevent the spread of nuclear, chemical, and biological weapons by requiring an export license before publication (10). These moves rest on the assumption that innovations achieved in the laboratory can easily be fashioned into a harmful agent or a bioweapon. Yet past bioweapons work shows that transforming a scientific concept developed in the laboratory into a product that has a specific, applied purpose and functions reliably and effectively can take several decades and require a variety of expertise. Specifically, the passage from laboratory concept to specific application faces the challenges of scaling up fragile microorganisms for large-scale production and of developing a delivery mechanism that will protect the agents from environmental degradation when released as a weapon.

For example, within the Soviet bioweapons program, the development of an antibiotic-resistant strain of the bacterium that causes plague took 20 years and involved teams at three institutes. Scaling up anthrax and smallpox weapons took Soviet researchers about 5 years and required the involvement of large teams of scientists, including the designers of the original strains. And within the U.S. bioweapons program, scientists discovered that the botulinum toxin weapon they had produced eventually lost some of its toxicity upon aerosol release. These examples demonstrate that laboratory successes do not necessarily lead to successful application to a specific purpose. Instead, specialized skills honed over years of practice in production and weaponization work are critical to success (11).
New Analytical Framework

Seen against this background, fears that the H5N1-related articles might support replication by malevolent actors seem exaggerated. They ignore the fact that science is a cumulative process in which knowledge is acquired and built through many years of personal and collective experimentation. It is therefore neither easily acquired nor easily transferred, and even less so by means of published articles. More importantly, these fears also indicate that the NSABB's initial decision to edit the H5N1 article before its publication was not rooted in a risk/benefit analysis that considered the determinants of success in scientific work. Indeed, even though the Board interviewed the lead authors and a variety of influenza experts, it did not interview the scientists and technicians who actually conducted the laboratory work (12). In fact, important details about the experiment's difficulty were revealed after the Board issued its recommendation, and only as a result of the controversy, not as a result of the Board's inquiry.

Therefore, any future review of dual-use research should be based on a careful analysis of the tacit, personal, and laboratory-specific skills required to perform scientific experiments. This implies that NSABB reviewers conduct face-to-face interviews with the scientists and technicians who executed the laboratory work to identify the hidden contingencies associated with key stages of an experiment, including the development of laboratory- or agent-specific techniques or protocols that may not transfer easily to a new location. A laboratory visit may also reveal hidden laboratory idiosyncrasies that contribute to experimental success and may prevent replication elsewhere. To improve the NSABB's ability to assess the ease of replication by terrorists or states, its reviewers should also include an expert who has hands-on experience working with the microorganism under consideration.
In the H5N1 case, NSABB members had access to outside influenza experts, but their lack of experience working with the influenza virus itself, notwithstanding their expertise in other areas, did not allow some of them to appreciate the importance of experimental details that could have affected the ultimate threat assessment.¹ Without a major change in the NSABB's approach, future restrictions might have two equally negative consequences. First, suspicions among foreign entities that restrictions on scientific work are hiding U.S. government bioweapons work might increase. Second, scientists may avoid U.S.-funded research for fear that the government might block their work from being published. To wit, the Dutch scientist whose H5N1 research was temporarily blocked by the NSABB recently published a follow-up study in the journal Cell; in the acknowledgments, he stipulated that the work was not funded by the NIH (13).

Conflict of Interest Statement

The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.


          Most cited references (7)


          Tacit Knowledge, Weapons Design, and the Uninvention of Nuclear Weapons


            Framing biosecurity: an alternative to the biotech revolution model?


              Dual-Use Research and Technological Diffusion: Reconsidering the Bioterrorism Threat Spectrum

              The global security community continues to view a potential bioterrorist event with concern. Kofi Annan, former Secretary General of the United Nations, stated “the most important under-addressed threat relating to terrorism…is that of terrorists using a biological weapon” [1]. The European Commission believes that biological weapons “may have particular attractions for terrorists” [2]. The United States Commission on the Prevention of Weapons of Mass Destruction Proliferation and Terrorism believes it is very likely that a weapon of mass destruction will be used in a terrorist attack by the end of 2013, and that an attack with a biological weapon is more likely than one with a nuclear weapon [3]. There is good reason for concern. Infectious diseases elicit instinctive fears that some terrorist organizations appear to have the intent to exploit [4]. The 2001 anthrax attacks in the United States, believed to have been caused by a single actor [5], were a keen reminder of the ability of bioterrorism to cause death and societal disruption. Such concerns have been linked to the rapid progress in life science research. The most advanced techniques 20 years ago are today routine (and some, like DNA synthesis, are also much cheaper [6]), while new fields, notably synthetic biology [7], [8], have opened frontiers previously inconceivable. Furthermore, expertise in life science research is globally dispersed, and methodologies for synthesizing and/or altering the virulence of pathogens in the laboratory have already been published in high-profile scientific journals. Activities that have garnered substantial attention include chemically synthesizing the poliovirus [9] and the ΦX174 bacteriophage [10], demonstrating the importance of a variola virus gene for its virulence [11], and reconstituting the 1918 influenza virus [12]. 
Each has been classified as dual use research of concern (DURC), defined by the US National Science Advisory Board for Biosecurity (NSABB) as "research that, based on current understanding, can be reasonably anticipated to provide knowledge, products, or technologies that could be directly misapplied by others" [13]. DURC creates a tension between freedom of research and national security [14]–[17]. As security communities have pushed for tighter oversight of research, scientific communities have been quick to grasp that certain biosecurity regulations, such as export controls [18] or visa controls for foreign scientists [19], run the risk of being inadvertently disruptive [20]–[24]. Members of the US NSABB have even argued that the inhibition of life science research could itself be considered a threat to national security and public health [25]. Yet on the rationale for biosecurity controls, the scientific community has been generally muted. Although this may be related to the secrecy surrounding intelligence about terrorist organizations, classified snippets of information should not take priority over expert technical input. Ceding the debate to the security community could lead to inaccurate threat assessments and the adoption of inappropriate biosecurity control measures.

The European Centre for Disease Prevention and Control (ECDC) was established in 2005 with the mandate to strengthen Europe's defenses against infectious diseases by developing European Union–wide surveillance networks and early warning systems, coordinating scientific studies, and identifying emerging health threats [26]. As part of ECDC efforts to evaluate potential bioterrorism threats, we reviewed 27 assessments (published between 1997 and 2008) that address the links between life science research and bioterrorism, with the objective of identifying DURC relevant for public health (Text S1).
The focus of the review was limited to the application of DURC by terrorist organizations; it did not consider state-sponsored biological weapons programs. The 27 assessments were selected based upon a literature review and interviews with a panel of international experts. Collectively, the 27 assessments explicitly cite a wide range of DURC activities. Based upon these, we conducted a threat assessment during an expert workshop. The purpose of this threat assessment was to identify those DURC activities that would be the most easily deployed by bioterrorists. The key parameters for this assessment were the level of expertise and the level of equipment required to conduct any given DURC activity. An estimated threat level was calculated for each DURC activity by giving a score ranging from 1 (high threshold) to 3 (low threshold) for both parameters, and then multiplying these scores to yield the final threat, which could be 1, 2, 3, 4, 6, or 9. Higher scores indicate a higher likelihood of success if the activity were to be undertaken by bioterrorists (Text S1).

The overall ranking provides an indication of the threat spectrum related to the ability of bioterrorists to exploit life science research (Table 1), and it suggests that "low tech" activities may be especially attractive to bioterrorists. This opposes the tendency of biosecurity discussions to focus on "high tech" research: typically, the potential negative consequences of research falling into the wrong hands are accentuated while the likelihood of this occurring is inadequately considered. Is the availability of material, methodologies, and high-level expertise, none of which should be taken for granted, even adequate for the development of a sophisticated bioweapon? Technology is much more than the sum of its material and informational aspects.
Social contingencies and tacit knowledge, serendipity and unpredictability, institutional memory, and many other factors are essential to the successful design and deployment of any given technology, including (if not especially) biological weapons [27], [28]. Interviews with the Wimmer group about the poliovirus synthesis [9], for example, highlight that replicating the experiment is a very challenging and time-consuming procedure even for virologists familiar with the experimental system [29]. It is not obvious that extrapolating the methods from this work for other purposes—or to another laboratory—would have been successful. The challenge is surely even greater when resource, time, or other constraints (such as the need to be clandestine) are involved.

Table 1. Threat assessment for research areas of concern. Each DURC activity is scored on two parameters, expertise threshold and equipment threshold (low = 3, medium = 2, high = 1); the threat level is the product of the two scores.

  Enhance the dissemination of a biological agent by contamination of food or water supplies late in a distribution chain: expertise 3, equipment 3, threat 9
  Increase the environmental stability of a biological agent by mechanical means, e.g., microencapsulation: expertise 2, equipment 2, threat 4
  Confer resistance to therapeutically useful antibiotics or antiviral agents: expertise 2, equipment 2, threat 4
  Facilitate the production of biological agents: expertise 2, equipment 2, threat 4
  Enhance the dissemination of a biological agent by contamination of food or water supplies early in a distribution chain: expertise 3, equipment 1, threat 3
  Enhance the dissemination of a biological agent as powder or aerosol: expertise 1, equipment 2, threat 2
  Synthetic creation of viruses: expertise 2, equipment 1, threat 2
  Render a vaccine ineffective: expertise 1, equipment 1, threat 1
  Enhance the virulence of a biological agent: expertise 1, equipment 1, threat 1
  Increase the transmissibility of a biological agent: expertise 1, equipment 1, threat 1
  Enhance the infectivity of a biological agent: expertise 1, equipment 1, threat 1
  Alter the host range of a biological agent: expertise 1, equipment 1, threat 1
  Render a non-pathogenic biological agent virulent: expertise 1, equipment 1, threat 1
  Insertion of virulence factors: expertise 1, equipment 1, threat 1
  Enhance the resistance of a biological agent to host immunological defence: expertise 1, equipment 1, threat 1
  Insertion of host genes into a biological agent to alter the immune or neural response: expertise 1, equipment 1, threat 1
  Generate a novel pathogen: expertise 1, equipment 1, threat 1
  Increase the environmental stability of a biological agent by genetic modification: expertise 1, equipment 1, threat 1
  Enable the evasion of diagnostic or detection modalities: expertise 1, equipment 1, threat 1
  Targeting materials to specific locations in the body: expertise 1, equipment 1, threat 1

Calculated according to the formula total threat = (expertise threshold) × (equipment threshold), this table presents individual DURC activities according to the ease with which a terrorist organization could be expected to replicate the work, based on expertise and equipment thresholds. The highest threat level comes from DURC activities that were deemed to require overcoming only low expertise and low equipment thresholds (such as contaminating a food or water source with an unaltered pathogen). Conversely, the lowest threat comes from highly sophisticated DURC activities that would need to overcome high equipment and expertise thresholds, such as those that would be required to substantially alter the genetic nature of a pathogen.

The recent history of bioterrorism also suggests that more attention should be allotted to low tech threats [30]. An extensive review of biocrimes in the 20th century argued that although bioterrorists might acquire some capabilities, there is "reason to doubt the ease with which such groups could cause mass casualties" [31]. Aum Shinrikyo, for example, was not successful in procuring, producing, or dispersing anthrax and botulinum toxin in the 1990s, while Al Qaeda is believed to have failed to obtain and work with pathogens by the early 2000s [32], and this likely remains the case. In comparison, the contamination of food and water, and the direct injection or application of a pathogen, have much lower technical hurdles and might be expected to be deployed rather more successfully [31].
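As an aside for readers working through the numbers: the workshop's scoring scheme (each parameter scored from 1, a high threshold, up to 3, a low threshold, with the threat level being the product of the two) can be sketched in a few lines of Python. This is purely an illustration, not code from the study; the sample scores are drawn from Table 1, and the function and variable names are invented.

```python
def threat_level(expertise: int, equipment: int) -> int:
    """Combine the two threshold scores (each 1-3) into a threat level.

    1 = high threshold (hard for a terrorist group to overcome),
    3 = low threshold (easy). The product can be 1, 2, 3, 4, 6, or 9.
    """
    for score in (expertise, equipment):
        if score not in (1, 2, 3):
            raise ValueError("threshold scores must be 1, 2, or 3")
    return expertise * equipment

# A few sample activities with (expertise, equipment) scores from Table 1.
activities = {
    "Contaminate food/water late in a distribution chain": (3, 3),
    "Microencapsulation for environmental stability": (2, 2),
    "Synthetic creation of viruses": (2, 1),
    "Enhance the virulence of a biological agent": (1, 1),
}

# Rank activities from highest to lowest estimated threat.
ranked = sorted(activities.items(),
                key=lambda item: threat_level(*item[1]),
                reverse=True)
for name, (exp, equip) in ranked:
    print(f"{threat_level(exp, equip)}  {name}")
```

Note that the multiplicative rule makes the scale coarse by design: any activity requiring high expertise or high equipment collapses toward the bottom of the ranking, which is exactly the "low tech over high tech" pattern the table exhibits.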
The best-known example is the contamination of salad bars with Salmonella by the Rajneeshee cult in 1984, which led to 751 illnesses and 45 hospitalizations [33]. It remains the only known incident in which a terrorist organization, rather than an individual, deployed a biological agent in the US [31]. We do not suggest that high tech bioterrorism threats do not exist; rather, their likelihoods should be re-evaluated. Biosecurity policy discussions could gain more nuance and credibility by adopting more sophisticated notions about the challenges inherent in conducting and replicating advanced research. The life sciences community has an obvious self-interest in this, and might best achieve it by emphasizing the oft-unacknowledged factors inherent to successful high tech research, including those related to social contingencies and tacit knowledge. Thus far, when life scientists have entered the fray, they have tended to reinforce the "high tech" perspective, even when their objectives have been to argue against strict biosecurity controls and/or to encourage the life sciences to engage in debates about the risks and benefits of their research [34]–[36].

Many agree about the importance of threat mitigation measures that prepare for the eventuality of a bioterrorism attack, irrespective of its source [37], [38]. Examples include encouraging the development of diagnostics, vaccines, and therapeutics, as well as empowering public health agencies to strengthen defenses against communicable diseases. Such approaches have the additional advantage of taking the broadest possible view of the threat spectrum by also preparing for attacks by the most successful "bioterrorists" of all, nature and globalization, which have led to the emergence of numerous new communicable diseases in recent years [39]–[41].
A focus on strengthening global health security has been put forward by the Obama administration [42] and the European Commission [38], and has also gained prominence in fora such as the Biological and Toxin Weapons Convention [43]. Public health, too, is dual use: it can be leveraged to counter both natural and intentional disease outbreaks.

Supporting Information

Text S1. Dual-use assessments reviewed in this study (in reverse chronological order). (0.05 MB PDF)

                Author and article information

                Contributors
                URI: http://frontiersin.org/people/u/126665
                Journal
                Front. Public Health (Frontiers in Public Health)
                Frontiers Media S.A.
                ISSN: 2296-2565
                17 July 2014; 24 September 2014
                Volume 2, Article 158
                Affiliations
                1. Biodefense Program, School of Policy, Government and International Affairs, George Mason University, Fairfax, VA, USA
                Author notes

                Edited and reviewed by: Jonathan E. Suk, European Centre for Disease Prevention and Control, Sweden

                *Correspondence: sbenouag@gmu.edu

                This article was submitted to Infectious Diseases, a section of the journal Frontiers in Public Health.

                Article
                DOI: 10.3389/fpubh.2014.00158
                PMCID: PMC4173220
                Copyright © 2014 Ben Ouagrham-Gormley and Fye.

                This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) or licensor are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

                History
                Received: 10 May 2014
                Accepted: 10 September 2014
                Page count
                Figures: 0, Tables: 0, Equations: 0, References: 13, Pages: 3, Words: 2389
                Categories
                Public Health
                Opinion Article

                Keywords: H5N1, NSABB, regulating dual-use research, Venter Institute, M. mycoides, bioweapon
