Andrés Martinez, co-director of the Future Tense Initiative and director of the Schwartz Fellows Program at the New America Foundation, took the stage at Washington, D.C.'s Google office to welcome the assembled audience and frame the day-and-a-half-long event. Among the primary questions to be pondered, he said, were: "How, as a democratic society, can we exercise oversight over scientific inquiry? Is that a fool's errand?" Martinez also explained the background of the conference title, "Here Be Dragons." It refers to an old cartography convention by which mapmakers would label unexplored territories with the words "Here be dragons." The phrase connotes mystery, unexplored territory, and consequences. Given the event's focus on synthetic biology, the future of the Internet, and other developing technological and scientific fields, it is an appropriate title.
Synthetic Biology 101: Can’t We Make the Dragons?
Andrew Hessel, co-chair of bioinformatics and biotechnology at Singularity University and founding director of the Pink Army Cooperative, next took the stage to offer "a brief primer on what synthetic technology is," giving an audience of varying levels of familiarity with the emerging technology some background on the science. To explain the fundamentals, Hessel drew many comparisons between the development of the computer and the current state of synthetic biology, saying, for instance, "bacteria are actually biological computers," with DNA rather similar to software. Extending the analogy, Hessel argued that just as computers were once expensive, reserved for a small cadre of experts, and baffling to most, we will soon see the field of synthetic biology democratized as it becomes cheaper and easier for everyday people with an interest to begin "printing" their own organisms and doing their own research via "DIY bio" and community organizations that people can join. In his opinion, "synthetic biology is the next IT industry": It could eventually affect the fuels and plastics we use, the pets we house, and the medicine we take, with cancer treatment and reproductive assistance showing particular promise.
The Promise and Perils of Synthetic Biology Today
The panel, moderated by Future Tense Fellow Robert Wright, attempted to reconcile the potential benefits and dangers of synthetic biology, as well as the ethical considerations. The panelists included George Church, professor of genetics, Harvard Medical School, and director, Center for Computational Genetics; Dan Sarewitz, associate director, Center for Nanotechnology in Society, Arizona State University, and co-director of the Consortium for Science, Policy and Outcomes; and Robert J. Sawyer, author of Flashforward, Mindscan, and Factoring Humanity. Church and Sawyer agreed that the falling cost of synthetic biology will greatly expand its potential. Wright asked the group to consider the potential dangers of synthetic biology, but Sarewitz rejected the binary "good or bad" premise, saying, "The problem I have with the promise-or-peril framing is it makes it look like technologies deliver their impacts in little, discrete bites." Instead, he argued, we should focus on "the systemic ways [new technologies] transform society and culture." The discussion then turned to how we can prevent disasters, whether caused by a mistake in a home laboratory or an individual with nefarious intentions. Church said that "active surveillance of all possible participants" will be vital and stressed that international accord will be important, as this is a field that will grow not only within the U.S. but also internationally. Sawyer noted that today, scanners sold in the U.S. are unable to scan bank notes; he proposed that any "DIY synthetic bio" kits could come with similar built-in mechanisms to prevent abuse, misuse, and accidents.
When Wright asked whether the pace of technology could be slowed to ensure that such safety measures are in place in time, Sarewitz said that it is likely impossible to stop such scientific progress artificially; Church concurred, pointing out that international agreement and surveillance would be much easier than forcing researchers to slow down. The far-ranging discussion also covered augmenting humans, with Sawyer positing that people will likely start off wanting to replace what they have lost, such as hair, before trying to use synthetic biology to go beyond the bodies and capacities they were born with.
Groping for the Online Master Switch: The Elusive Quest to Govern the Internet
Bruce Gottlieb, general counsel of the Atlantic Media Company and former chief counsel to the chairman of the FCC, appeared next to discuss how the communications industry and its government controls have evolved. The primary goal here was to illuminate the lessons that Internet governance, in particular, could hold for regulating synthetic biology and other emerging technologies. He traced the history of government involvement with regulating communications, including regulations involving content like fair use and copyright; busting the AT&T national monopoly; and distributing licenses, among other things. From there, he turned to the confusing way that the Internet is regulated, what he termed a “fractured and incoherent system of regulation” involving numerous governmental agencies. While this can cause headaches, one de facto policy that Gottlieb praised was how the Internet has been allowed to develop without “a mother-may-I system of regulation,” whereby companies do not have to ask permission before exploring many new frontiers.
Connecting the Genes and the Bytes
Bruce Gottlieb remained onstage, joined by fellow panelist Andrew Hessel, who earlier in the morning gave the primer on synthetic biology. The two, moderated by Jacob Weisberg, chairman and editor-in-chief of the Slate Group, discussed how the government will play a part in the development of synthetic biology. "How can government help, how can government hurt, and does the government need to do anything to restrain the unfettered [research]?" Weisberg asked. One thing Hessel cautioned against is "overreacting to the potential risk." He pointed out that nature already performs its own "genetic engineering" and said that he thinks there will be ways to set standards to prevent, say, someone from releasing a tweaked or wholly new virus. Researchers are currently working only with bacteria, the lowest level of life; regulation, Hessel said, "will come in the wake of … success." As for whether government can help guide development, Gottlieb pointed to the National Institutes of Health grant model and said that innovation comes quickest when there is little regulation. Perhaps, he proposed, the way forward is to distinguish between the aspects of synthetic biology research that should require oversight and permission and those that should not; the former will, of course, develop more slowly. Weisberg then asked how the U.S. can encourage research, and Hessel proposed "play spaces," or community labs, where individuals could learn and take part in the development of synthetic biology even if they don't have access to a university research laboratory.
Can Technology Policy Be Democratic?
Jacob Weisberg returned to moderate another panel, this time featuring Michael Crow, president of Arizona State University, and Neal Stephenson, author of Cryptonomicon, The Diamond Age, Snow Crash, and Zodiac. When asked "whether democracy is well-suited to" nourishing technology like synthetic biology, Crow argued that the two must be highly compatible, because the U.S. has helped incubate enormous developments; however, he cautioned that this might not always be the case. Stephenson agreed that the U.S. has helped develop new technology—but stipulated that in the 20th century, the motivation came primarily when a non-democratic government "want[ed] to destroy [the U.S.] or [was] believed to want to destroy it." Though the U.S. currently faces terrorism, as Weisberg pointed out, Stephenson said that the terrorism threat has not imbued the U.S. with the same competitive drive that Russia did. Another challenge that Crow and Stephenson discussed was the gap between the scientific community and the general population, and the distrust between the two. One problem Crow cited was that scientists frequently fail to focus on social outcomes, and the grants they receive do not necessarily demand demonstrations of success beyond journal publication. Stephenson said that one recent development has been that more scientists are moving to privately funded research, and he cited the J. Craig Venter Institute's 2010 breakthrough announcement of a synthetic, self-replicating cell. One important consideration, Crow said, was to ensure that there is understanding between the technical and scientific elite and the general population; Stephenson proposed, "half-jokingly," that science-fiction writers like himself help bridge that gap.
Bio, the Hollywood Treatment
This panel opened with a crowd-pleasing video created by Brian Malow, a science comedian and video contributor to Time.com, about how genetic engineering and synthetic biology have been portrayed by Hollywood.
After the video ended, Malow and Robert J. Sawyer discussed how all of the movies in Malow's video were essentially horror films: In movies like Jurassic Park, scientists fail to think through the consequences of their attempts to mold nature. When it comes to biotechnology, Sawyer said, "The Hollywood message … is 'don't go there.' " The cautionary tales are not necessarily without value, Sawyer added; Malow also said that some of these movies raise interesting ethical issues, particularly Gattaca and Blade Runner. In the latter, he said, there is an interesting debate about what man owes his creations.
Can Washington Keep Up With the Next Big Thing?
This panel featured Larry Downes, fellow, Center for Internet & Society, Stanford Law School; Gary Marchant, Lincoln Professor of Emerging Technologies, Law and Ethics, Sandra Day O'Connor College of Law, Arizona State University, and Senior Sustainability Scientist, Global Institute of Sustainability; and Jim Thomas, research program manager and writer, ETC Group. The moderator, Brink Lindsey, a senior scholar in research and policy at the Ewing Marion Kauffman Foundation, opened the discussion by saying that the panel ought to have been titled "Can Washington keep from screwing up the next big thing? Because we need to be cognizant of risks on both sides." Marchant argued that Washington will never be able to "keep pace" with breaking technology; however, Thomas disagreed, saying that certain parts of Washington can indeed keep pace, and he pointed to nanotechnology as an example of an emerging technology that was adequately understood by policymakers. Downes noted that democracies have been built to be "deliberate, incremental, and … slow to change"; while that is usually good, "the mismatch is obvious" when governments attempt to regulate technology, and there are occasions when the government gives in to the "terror reaction" to new technology and responds with too-stringent regulation. In an attempt to pinpoint examples of technologies that the government does or does not properly regulate, Marchant and Thomas disagreed about current oversight of genetically modified crops, leading to a discussion about economic progress and how the profit motive can shape the development of new technology. Thomas noted that energy companies are heavily investing in synthetic biology, possibly driving it in a "destructive" direction. The market, too, can affect the development of new technology; Marchant cited RFID as a technology that was crippled, delayed by at least 10 years, because of a negative reaction from consumers.
He argued that self-regulation could be effective in ensuring that those who best understand a technology are responsible for it. Downes responded by pointing out that industries that self-regulate can at times over-regulate, citing the movie industry as an example.
The Curious Case of Wikileaks
Robert Wright returned to his role of moderator to guide a discussion about Wikileaks. The participants were Don E. Kash, professor emeritus, School of Public Policy, George Mason University; Rebecca MacKinnon, Senior Schwartz Fellow, New America Foundation, and co-founder, Global Voices Online; and Bruce Sterling, professor of Internet Studies and Science Fiction at the European Graduate School. The first question posed to the group was whether Julian Assange and Wikileaks were inevitable; MacKinnon said no, arguing that while some large-scale leak may have happened under someone else, Assange and his team were unique. She added that democracies are most vulnerable to leaks, but Kash countered that the governments currently paying the price for Wikileaks revelations have been authoritarian in nature. There is almost certainly more such web warfare to come, argued Sterling, who said that piracy, botnet attacks, and other such Internet-based disruptions are on the horizon. The group discussed some other potential problems of the future, such as an "electronic Pearl Harbor" that shuts down the U.S. Internet, or companies clamping down on disruptive or unpopular speech. Sterling argued that Assange is a "precursor of something," a harbinger of more disruption to come; the panel then debated whether Assange needed the support of established print media to make Wikileaks' cable dump a success. MacKinnon argued that though the newspapers helped, Wikileaks would have soon hit front pages regardless.
Brian Malow’s Footnotes
Science comedian Brian Malow returned to the stage to bring the day to an end. “I’m supposed to sum up what we learned here today?” he asked bemusedly before launching into jokes about genetically engineered glow-in-the-dark bunnies, whether evolution can hit a dead end, and more.
Day 2: Friday, Feb. 4, 2011
Stranger Than Fiction: Technology’s Challenge for Storytellers
The second day began with a discussion between Sascha Meinrath, director of the Open Technology Initiative at the New America Foundation, and science-fiction writer Neal Stephenson, author of Cryptonomicon, The Diamond Age, Snow Crash, and Zodiac. Stephenson discussed how his research and "geeky pursuits" frequently end up influencing his fiction. Meinrath asked Stephenson about the role of writers in influencing events like the Egyptian protests and their use of technology, but Stephenson was unsure. "I don't know nowadays if it's really writers who are going to have that role. I think we've certainly been eclipsed in the popular imagination by more charismatic humans," he said. One thing Stephenson said he finds intriguing is that recent technological advances seem to be on a smaller scale than they were in the early parts of the 20th century, when we first developed automobiles, the television, and more; now, technological innovation is less about engineering than digital technology, which he said he thinks is a shame. Stephenson also criticized "lock-in path dependency," a bad habit of both consumers and companies that limits our ability to think of new ways of doing things. He pointed to our slow move away from fossil-fuel dependency and the fact that we continue to launch space shuttles the same way we always have as examples of this "lock-in path dependency." How is storytelling going to change as technology develops? One way, Stephenson said, is a current project he is working on with several other writers: a serialized novel being published in installments online.
The Dragons Online: The Internet’s Coming Surprises
Andrés Martinez of the New America Foundation moderated a conversation with Alan Davidson, director of government relations and public policy for the Americas, Google; and Tim Wu, Future Tense Fellow, New America Foundation, and professor, Columbia Law School. Wu is also the author of the recent book The Master Switch: The Rise and Fall of Information Empires. The trio discussed whether the Internet is a U.S.-centric industry—it is truly a "global medium," according to Davidson, with policymakers in Europe particularly active—before the conversation turned to how much is left to be invented. Wu put forth the provocative argument that we are currently seeing a dearth of imagination, a deficit of big, new ideas; he contrasted today with the 1960s, when he believes there was a great deal more innovation. But Davidson disagreed, saying that even just with the Internet, we cannot yet imagine how, for instance, we will be able to accommodate 5 billion people online, whereas at the moment there are "only" 2 billion. Martinez offered that one difference between now and 50 years ago is that thinkers are "siloed" within their own research cultures, whereas in the 1960s, research was more unified. Areas where the group agreed there was room for development included health information technology, energy, and other big societal problems. But one challenge to the Internet's continued growth in a relatively regulation-free environment will be privacy concerns. Davidson agreed that privacy is a cause for concern, but said that he hoped to find a role for government and regulation "that doesn't undermine these basic architectural features that" have allowed the Internet to become a free space to create and share ideas.
Will Synthetic Biology End Human History?
The conversation returned to a focus on synthetic biology with this panel moderated by Michael Specter, a staff writer at The New Yorker. His first question to panelists Drew Endy, a synthetic biologist at Stanford University, and Francis Fukuyama, a senior fellow at Stanford's Freeman Spogli Institute for International Studies, was about whether we should be scared that we are "no longer hemmed in by nature." Fukuyama dismissed the frightening implication, calling it "silly" and saying it ignores the fact that we are able to adjust technology and its applications as we encounter outcomes—positive or negative—and backlash. The conversation turned again to regulation, with Fukuyama proposing that a new regulatory agency may be required to deal with synthetic biology and other cutting-edge technologies. When discussing the possibility of a Three Mile Island-type mishap involving synthetic biology, Fukuyama suggested that such an accident could occur not in the U.S. but in China, perhaps leading Americans to say "this is what happens when you do this without the adequate framework" for safety, increasing domestic pressure on the U.S. government. But such a framework would not necessarily entail just laws and regulations, said Endy; it would also require education of society and the development of new norms.
Public Beneficence in the Pursuit of Science
In the final event of "Here Be Dragons," New America Foundation President Steve Coll joined Amy Guttman, president of the University of Pennsylvania and chair of the Presidential Commission for the Study of Bioethical Issues. The commission recently released a report, requested by the president, addressing synthetic biology, its potential consequences, and how best to monitor it. The report was inspired by the J. Craig Venter Institute's synthetic biology breakthrough in 2010. To complete the report, Guttman said, the commission talked to bioethicists, researchers, and other experts, a lengthy process. In the end, the commission elected to avoid the precautionary principle because of synthetic biology's potential to save a great many human lives in the near future—for instance, through the development of cheaper anti-malarial drugs for the developing world; to proceed too cautiously could cost lives that might otherwise have been saved. But, Guttman pointed out, the commission also decided against recommending that we simply "let science rip"; instead, she and her colleagues proposed that the government allow science to proceed, but with caution and with guidance. They stopped short of recommending new legislation, regulatory agencies, and the like, but urged careful observation. After discussing with the audience the different factors the commission considered, Guttman wrapped up by telling the assembly about the commission's upcoming reports: one on clinical trials held overseas by U.S. companies and the other on the ethics of genetic and neurotesting.
Future Tense is a partnership of Arizona State University, the New America Foundation and Slate magazine.
Follow along on Twitter: #futuretense