Little Prevention, Less Cure: Synthetic Biology, Existential Risk, and Ethics

Brian Patrick Green, Markkula Center for Applied Ethics and School of Engineering, Santa Clara University, California, USA

The biosecurity, biosafety, bioweapon, and biodefense risks of synthetic biology are enormous and have been discussed in some detail (e.g., Petro et al., 2003; Lemon and Relman, 2006). Such risks may include everything up to the destruction of most life on Earth. These worst-case scenarios should not be discounted, because there are not only individual cults and terrorist groups that would be happy to perform such heinous acts, but possibly also entire states, such as North Korea. Historically, many states have been involved in bioweapon research and production, and as the power of biotechnology is democratized we should expect non-state actors to become involved as well, as indeed some already have (e.g., the Rajneeshees in 1984, the 2001 anthrax attacker).

This raises the question of global catastrophic and existential risks. The philosopher Nick Bostrom has described global catastrophic risks as those that threaten massive global disaster, and existential risks as those that threaten human extinction (Bostrom, 2002). Synthetic biology presents such risks, especially if permitted as a DIY hobby that anyone, including terrorists, could pick up. Because synbio permits such significant changes to living organisms, we should not expect to be able to prepare for all the diverse and unpredictable bioweapons that a fully democratized DIY synbio milieu could produce. Indeed, we cannot even effectively deal with the natural biological problems that nature throws at us now.

The philosopher Hans Jonas has argued that the first and most important rule of ethics, his “imperative of responsibility,” is that humankind must exist in the future (Jonas, 1984). One is not allowed to play a “va banque” game with humanity.
Therefore anything that puts humanity at risk ought to be carefully controlled or, if possible, eliminated. There are many risky things that we cannot control, but synthetic biology need not be one of them. Recalling the “risk equation” (risk = harm × probability), Michael Davis has argued that for any unacceptable harm with a non-zero probability, the risk is too high (Davis, 2012). Human extinction should qualify as an unacceptable harm; therefore, since DIY synbio carries a non-zero probability of that harm, it presents an unacceptable risk that ought not to be permitted. As we enter the risk terrain of DIY synbio, we – or at least some of us – are deciding that we are willing to risk everything on the finite goods synbio might give us. Reasonable gamblers do not risk everything, including their own lives, on a finite win.

Given the dangers presented by synbio and the ethical rule that humans ought to exist in the future (a rule we should hardly need, as self-interest would hopefully suffice), we need a strong governance and policy response to this threat. The current response of the Presidential Commission for the Study of Bioethical Issues, “prudent vigilance,” is insufficient. “Prudent vigilance” would have been an odd solution to the dangers of nuclear power, for example. Synthetic biology permits the creation of destructive capacities worse than nuclear weapons, and with much less difficulty. Adaptation to and mitigation of these risks will therefore likely need to be even more significant than the changes the world underwent with the advent of nuclear weapons. Perhaps it is only because the power of nuclear weapons was made clear at Hiroshima and Nagasaki that nuclear technology has been controlled as well as it has. Lacking examples of the destructive power of synbio, our collective imaginations seem to fail.

Workshop on the Research Agendas in the Societal Aspects of Synthetic Biology
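The expected-value reasoning behind the risk equation can be sketched in a few lines of Python. The numbers are hypothetical stand-ins, and treating an unacceptable harm such as extinction as unbounded (via `math.inf`) is a modeling assumption of this sketch, not Davis's own notation:

```python
import math

# Hypothetical sketch of the "risk equation": risk = harm x probability.
# The numeric harm values below are assumptions invented for illustration.

def risk(harm: float, probability: float) -> float:
    """Expected risk: harm weighted by the probability of its occurrence."""
    return harm * probability

# An ordinary, bounded harm: halving the probability halves the risk,
# so some sufficiently large finite benefit could, in principle, outweigh it.
assert risk(10.0, 0.5) == 5.0
assert risk(10.0, 0.25) == 2.5

# Davis's point, on this modeling assumption: if a harm is unacceptable no
# matter what (modeled as unbounded), then ANY non-zero probability leaves
# the risk unbounded, and no finite benefit can compensate for it.
assert risk(math.inf, 1e-12) == math.inf
assert risk(math.inf, 1e-100) == math.inf
```

On this reading, the argument does not depend on estimating the probability precisely; any probability greater than zero suffices to make the risk unacceptable.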
How can we respond to this failure of the imagination? We need to present these ideas to the public as best we can. Mitigation of and adaptation to the risks of synbio should be a top priority. This will require policy responses that include governance over scientific research and technological development. When people try to build nuclear reactors at home (as has happened more than once), the public and the government should be concerned. Reactors are peaceful uses of nuclear power, but they still do not belong in people’s homes. Likewise, when someone tries to do synthetic biology “at home,” the public and the government should be concerned. Glowing plants are not weapons, but the same methods that produce them could produce much worse things.

No finite benefit can justify the risk of human extinction, but what of the risks of smaller accidents or attacks that might kill “only” millions of people? Can any benefit justify that level of risk? I would think not, but this is a question for the public to decide, not for academics, scientists, engineers, DIY inventors, or any one group. This is a question of the common good, and so the decision makers should be everyone. Synthetic biology needs intense scrutiny, public discussion, democratic process, and limitations and enforcement to prevent unacceptable scenarios, so that we produce the best possible future with these technologies and not the worst.

References

Bostrom, Nick. “Existential Risks: Analyzing Human Extinction Scenarios and Related Hazards.” Journal of Evolution and Technology 9, March 2002.

Davis, Michael. “Three Nuclear Disasters and a Hurricane.” Journal of Applied Ethics and Philosophy 4, August 2012.

Jonas, Hans. The Imperative of Responsibility. Chicago: University of Chicago Press, 1984.

Lemon, Stanley M., and David A. Relman, et al. Globalization, Biosecurity, and the Future of the Life Sciences. Washington, D.C.: National Academies Press, 2006. Available at: http://www.nap.edu/catalog/11567.html

Petro, James B., et al. “Biotechnology: Impact on Biological Warfare and Defense.” Biosecurity and Bioterrorism: Biodefense Strategy, Practice, and Science 1, 2003.