Biotechnology: The Potential and Perils of Innovation
Speakers:
Drew Endy, Associate Professor of Bioengineering, Stanford University (via videoconference)
Laurie Garrett, Senior Fellow for Global Health, Council on Foreign Relations
Julie Gerberding, Executive Vice President, Strategic Communications, Global Public Policy, and Population Health, Merck & Co., Inc.
In conversation with Rodney W. Nichols, Consultant on Science and Technology Policy, Drew Endy, Associate Professor of Bioengineering at Stanford University; Laurie Garrett, CFR Senior Fellow for Global Health; and Julie Gerberding, Executive Vice President, Strategic Communications, Global Public Policy, and Population Health, Merck & Co., Inc., discuss the latest developments in synthetic biology and biotechnology, their implications for U.S. national security over the next decade, and their policy prescriptions going forward.
The Emerging Technology series explores the science behind innovative new technologies and the effects they will have on U.S. foreign policy, international relations, and the global economy.
NICHOLS: This is part of the Emerging Technology Series at the Council on Foreign Relations, and today’s topic is “Biotechnology: Potential and Perils of Innovation.” My name is Rod Nichols, and I’m honored to be presiding today.
You’re in for a treat. Those of you who wanted an intellectual treat, you’re in for one, because it’s a brilliant panel. They’re both deep and broad, which is not often the case with these kinds of topics.
And the topic is hot in science and engineering circles in our country and around the world, and in national policies and priorities. One of the technologies that will be discussed by some of my more knowledgeable colleagues, CRISPR, was named “Breakthrough of the Year” last year by a major scientific society.
So my job is, first, the plan for the meeting is I will introduce the speakers very briefly. You have their bios. They’ll then make a few remarks—you’ll hear what they’re going to be talking about in a moment—and then we’ll have your comments and questions.
Drew Endy, on the screen looking very thoughtful and waving a finger or two at us—(laughter)—is associate professor of bioengineering at Stanford, where he developed a new lab. He’s really a pioneer in this field himself, a leader globally, an innovator for sure, and an advisor on policy.
Laurie Garrett, on my immediate left, has been an incisive leader at the Council on Foreign Relations for many years, as many of you know, in global health. She is fearless, as you can see from her crutches. She will go anywhere, ride her bike up and down anywhere, from Croatia to Manhattan. She got hurt recently, but it doesn’t seem to have affected her head at all. (Laughter.) She is deeply informed and globally alert, which is one of the things we want to emphasize at the Council.
Julie Gerberding, on my far left, has had a remarkable career, as you can see in her bio. I asked her a few minutes ago whether she would object to my calling her a renaissance woman, whether that was politically incorrect, and she said, well, she thought that was OK. Very few people have done as many different kinds of things as she has done. Her work spans medicine at the bedside, with research at the bench, and at the highest level of policy circles, including her current work at Merck.
What we’re going to start with, I’m asking each one of the panelists to give us one example of the potential for contemporary biotechnology, which is very powerful, and one example, as they see it, of a peril. And we’re going to start with Drew from Stanford. So, Drew, you’re up.
ENDY: Rod, thanks so much. Can you hear me OK?
ENDY: Great. You know, I thought I would offer an example from the realm of medicine, and talk about opiates, medicines that are used to treat pain—morphine, hydrocodone, and so on.
ENDY: Right now, the sourcing of these medicines is realized by growing poppies—agricultural fields of poppies, opium poppies, that are harvested. And the compounds are extracted and brought into preparations and used to treat pain. There are a number of problems with this current practice. As most are aware domestically, we have massive problems with addiction. We have side effects from the medicines themselves. They’re declared as essential medicines by the World Health Organization. And that’s important because when you look at who has access to these medicines, only about 2 ½ billion people on Earth have access to these essential medicines.
So what does this have to do with the frontiers of biology and biotechnology? The reason I’m sorry not to be able to join you in person is that my wife is expecting our second child, due shortly, so I’m not allowed to travel. But last year, she and her technical team were able to recapitulate the biosynthesis of opiates in baker’s yeast, so that you could realize the same biochemistry and the same final active pharmaceutical ingredients not by starting with a field of poppies, but by brewing—starting with sugar, a fermenter, and an engineered strain of yeast to produce the same active pharmaceutical ingredients.
Now, the technical significance of this—set aside what the compounds are—is that they are performing about 30 different biochemical steps: 30 different enzymes, brought from different organisms into baker’s yeast, to make an engineered strain that’s able to do a significant amount of biochemistry. Why might they wish to do this? Well, if you could make a diversity of these molecules, maybe you’d eventually have a chance of screening for alternates that don’t have some of the side effects related to addiction or respiratory depression and so on—i.e., we could just get better medicines. That would be a promise of some ambition.
Moreover, the material supply chain would be based on brewing and not poppy farming—today, for example, much of the material used in the United States is sourced from poppies grown in Tasmania. If we had a brewing process that could be distributed, maybe we would have a chance of changing the material supply chain and providing treatments for pain globally in a much more responsible way. So that’s a promise.
Now, typically when I bring up this example, or others bring it up, the perils are straightforward. There’s an illicit market of some significance around these compounds. There are many existing problems. And naïve deployment of a new supply chain could lead to more addiction, and so on and so forth. Let me zoom out, then, and try to put this in some context. At Stanford, within the engineering school, we’re posing a question. The question is: How good can we get at engineering living matter? Much like 50 years ago, we posed questions like how good could we get at programming computers.
And we’re 10 or 15 years into answering that question, not just at Stanford but globally. And the early indications are, we’re going to get really good at engineering living matter—as good as we got at engineering silicon semiconductor-based systems. So then the peril, in my mind, is to figure out how we can create and sustain venues where a strategy is realized: a strategy that is needed to determine what it would mean to flourish on top of future biotechnologies, but also a strategy to mitigate some of the negative outcomes.
And I’m sure we’ll talk about safety, so I don’t want to preempt that, but I bring up security as a particular topic not because I view it as a massive imminent threat, but a threat that’s developing over time. And to speak practically in the United States and elsewhere, but certainly within the United States, we have no venue—politically, from a policy perspective, or even a technical perspective—we have no venue where scholarship can be sustained and a strategy for realizing long-term effective biosecurity might be realized. So I think that’s the biggest peril we’re facing, just simply an absence of sustained dialogue around what strategies might be for realizing a full-on flourishing bioeconomy, combined with venues by which we can mitigate some of the minuses.
NICHOLS: Great. Thank you very much, Drew.
Laurie will give you her idea of a potential and a peril.
GARRETT: Well, yesterday the Recombinant DNA Advisory Committee, the RAC, at NIH, which was founded back in the 1970s, as the result of a prior revolution in biology, the original genetic engineering revolution, approved the first experiment on human beings with CRISPR, which will be conducted shortly. And it is funded by Sean Parker, billionaire of Napster fame. So you see a confluence of different interests here. And the target is cancer cells and modifications that could result in curing people of cancer. So we’re on our first big step towards the utilization of this technology in human beings. And certainly we will see some extraordinary breakthroughs that will be very, very exciting in the near future.
I would say the thing I’m most enthused about, from my own, you know, bias, as potential applications of synthetic biology, CRISPR, and where it overlaps with basic microbiome research—the other great revolution going on at the same time—is in developing a far broader understanding of pathogens, whether they’re viruses, bacteria, or parasites, all over the world; understanding the ecological niches they occupy; and having a better appreciation for how we can rapidly diagnose them and how we can assess their likelihood of jumping zoonotically from one animal species to our species or to our livestock, or from one wild plant to an essential food grain of some sort. So I’m very, very excited about where this technology could go in that direction.
Of course, like all new biologies, we see that things get dictated heavily by where the stock market chooses to invest. And I don’t know if the stock market is interested in preventing pathogens. There may be other priorities that Sean Parker and his pals devote their money to. And that takes me to my downside concern. Our understanding of how the microbial world works is so primitive; it’s phenomenal how little we know. We have only relatively recently come to appreciate that every human being in this room has far more microbes in you than cells with your own DNA in them. We’re only at the cusp of appreciating how those microbes interact with other things around you—around your chair, on that table, in the food you just ate—much less how the same interactions are occurring with animals and plants and so on, in the oceans, in the air we breathe, in the soil, et cetera.
I am very concerned that we will have untoward consequences, mistakes made, that result in altering some essential microbial balance that has been in place for millennia, and that is very vital to some ecological system that we care about. It might be the human body. It might be the body of a dog. It might be soil that we grow wheat in. I don’t know. It could be many things. And the reason I’m concerned about this is that—and by the way, this means that we’re deliberately or unintentionally altering evolution, right? I mean, a fundamental change can occur, especially using gene drive, that results in an absolute switch in the evolutionary trajectory.
And I worry about this because the technology, especially CRISPR, is so darned easy. You can now buy kits and do it at home. There’s a whole DIY movement developing. It’s cheap. High school students are learning how to do these things. Drew has played a vital role in enhancing the capacities in high schools. And I think that’s fantastic. But I’m worried about what escapes our attention in our hubris that we think we know the world. And something we do could make a fundamental change.
NICHOLS: In a few minutes we might come back and give our group a tutorial on what is CRISPR, but it’s basically gene editing—take out a gene or put in a gene.
Julie, what are your potential and peril favorites?
GERBERDING: Well, first of all, thank you so much for letting me participate in this panel. We actually had a preface to this meeting yesterday on the phone, and we probably could have talked all day because we were exploring these ideas in a great deal of granularity. But I think I’ll come back to Drew’s approach and really talk about medicine because, yes, we will be starting the first clinical trials using CRISPR gene editing to modify our body’s own immune cells, the T-cells, in such a way that they are either more aggressive in attacking cancers, or the brakes that have been put on them are released.
And obviously this is of tremendous interest on a global basis, and something that could extend what we’re currently learning from the new immuno-oncology drugs that have been in the news—the Jimmy Carter drug, for example. But it’s just taking that same concept and accelerating it, and allowing us to think about truly novel ways of attacking cancers, using our own natural immune system, which is certainly preferable to chemotherapy or radiation. So it’s very exciting. As Laurie said, these technologies are cheap. We’ll see an extraordinarily rapid uptake, I’m sure, and a pressure to do more and more.
And the peril is obvious. It’s safety. It’s the specificity of these techniques. Are they only editing the DNA that we want them to edit, or are they actually spilling over and affecting other parts of the genome? When you’re doing this technique outside of the body, that threat is not so important. But it’s different when you’re giving this in ways that can be inherited—for example, affecting the germ cells so that the children of treated people are affected—or in ways that can spread ecologically across species, or into the biome, or into the environment in unexpected ways. We have to very much temper our enthusiasm for the benefits with a very humble recognition that there is an awful lot we don’t know right now about how these techniques spread.
And to extend that a bit more globally, right now, for example, in China, the amount of effort that’s going into applying this gene editing and gene drive technology is extraordinary. If you do a Google search for gene editing and China and look at what comes up, you’ll see the myriad applications in rice, in silk worms, in human embryos, in various other cell lines and tissues. So on a global basis we’re seeing people run with this technology in environments that don’t necessarily have a RAC or any of the other mechanisms that we try to use in the United States to balance technology with a safety assessment.
And I think that creates a threat, even in well-intentioned people, let alone the people who can adapt this technology for more nefarious purposes. And CRISPR did appear—well, it wasn’t named CRISPR—but gene editing did appear on the list of weapons of mass destruction of national interest just very recently. So there is a downside.
NICHOLS: The theme in all three of your remarks has been the sort of political, cultural, ethical, regulatory environment in which this powerful new technology is being introduced. And we probably ought to come back to that. Drew, you didn’t make a particular recommendation, but how would you proceed to maximize the potential and minimize the peril? And we’ll ask the other panelists to comment. We certainly don’t want to suppress the science and engineering.
ENDY: Yeah, I think the most important thing I could try to relate is how much of a vacuum there is around sustained, strategic thinking and scholarship around where we’re trying to get to. So if we were talking about nuclear security for a moment, I bet most everybody in the room could summon up schools of thought and strategy—you know, mutual assured destruction and other approaches to mitigating risks associated with nuclear threats. How many people in the room could start to rattle off the names of strategies for mitigating biological threats? And speaking as somebody who is a practitioner, both of the biotechnology but also in the circles of biosecurity to a degree, there’s just a gap there.
It’s not that there aren’t good ideas, but those ideas aren’t being turned over and debated and worked through with the vigor and rigor that I would wish to see, let alone being coupled to strategic investment in public health and science and technology development. Again, it’s not that such things aren’t happening, but they’re happening on a very idiosyncratic and ad hoc basis. And to the extent that biology is central to many of the things we care about—starting with ourselves, but certainly the environment—and to the extent that our civilization is going through a potential transition where, for example, when I was born there were 3 ½ billion people on Earth, and now there’s twice as many people. When I was born, the ecologists tell me there were twice as many animals in the natural environments as there are today, at least the ones we can see.
You know, how do we transition our civilization from living on Earth to living with Earth? And what is the role of advanced biology and biotechnology in navigating and realizing that transition? We need to have a very holistic set of conversations and sustained scholarship around what our strategy should be. So I’ll just—I’ll just park it there. There’s a lot more to say, but, yeah, I think that’s the biggest gap.
NICHOLS: Laurie, I think you wanted to make a comment—thank you.
GARRETT: Well, I mean, Drew’s put his finger on the number-one issue. And many things that are—so just to back up a second, because there may be some that are a little bit bewildered, and you’re not quite up to speed on some of the terms we’ve used.
So let me just say that what we’re looking at now with CRISPR—and that’s an acronym, and it doesn’t matter what it stands for—is in fact a very ancient mechanism, billions of years old, that microbes have used, particularly bacteria, to expel unwanted genetic material brought in by phage, or viruses that invade bacteria. So you can think of it as a roughly 3-billion-year-old immune system. It’s only very recently that our species, Homo sapiens, discovered this ancient thing that’s been out there forever and has been the way that bacteria maintain their genetic integrity against the constant onslaught of phage in their environments—and probably the way they plucked useful things from the phage when they needed them, such as the capacity to resist antibiotics.
Now, the problem is we discovered this, and it turns out to be so easy—wow, boom, cheap. And then we discovered there’s another layer to it, which is gene drive, which allows you to essentially tell a genetic segment to just keep on running, in the next generation and the next generation and the next generation, so that it becomes a permanent feature on the landscape. Now, just last week I listened in via the web on a briefing at the NIH, where the heads of two institutes were briefing the head of the NIH about Zika. It was a terrifying meeting. But at one point Francis Collins asked Tony Fauci, the head of the National Institute of Allergy and Infectious Diseases, well, what about genetically modified mosquitos? Can’t we use CRISPR to take care of this problem? And then we won’t have mosquitos carrying dengue, chikungunya, yellow fever, and Zika.
And Tony, very wisely, said, well, there are two problems with that idea—even though the stock market, by the way, really likes it and is spending a lot of money on this. One, these mosquitos only travel about 200 yards in their lifetime. So if you’re going to release a single-generation CRISPR modification, it’s going to have to be released every 200 yards, which is a pretty tough way to control mosquitos. And then, if you’re going to say, well, no, we want it to go on forever and make it a gene drive technology, you better know what you’re doing, and you better have a really good idea how you’re affecting evolution out there if you deliberately and permanently alter Aedes aegypti mosquitos. And Tony’s advice was, let’s not do it.
So that gets to what Drew said. Where is that wisdom over time, that builds a series of baselines about the yes and the no? Where are the lines? How do we decide? What’s the process? Where do we go? Many elements of the debate we’re in right now—a brand-new debate in its infancy—overlap with prior debates, starting with the Asilomar meeting in 1975, when scientists themselves said, whoa, genetic engineering: if we alter viruses, what might happen? Let’s all come together and make a rule book. That went on for some time, with some permutations. Then, in the early 2000s, Dual Use Research of Concern, DURC, arose because of deliberate modifications of flu viruses that were going on to make super-virulent viruses and to determine how likely they were to infect human cells. This raised a mountain of concerns among bioterrorism and national security experts, and brought all these issues to the fore again.
Then a similar pattern arose just a few years later with so-called gain-of-function research, or GOF: the idea that you would deliberately insert a genetic capacity into a microorganism in order to see if that made it more virulent or more transmissible across a broader range of species. Once again, all the red flags went up. Everybody freaked out. The same debate over and over again. And now here we are in the next stage of the revolution. And I think Drew’s absolutely right. We don’t have a learning curve. We have a very tiny cluster of people in most of the key high-tech countries in the world who, when we have meetings, see familiar faces. Oh, right, I saw you ten years ago at the DURC meeting. You know, here we are again. And yet—Drew’s metaphor is perfect—we don’t see, as we did with nuclear, both fission and fusion research, a permanent raising of the bar of the discussion, so that each layer is a learning curve over the prior one, and the sort of assumptions among nations improve.
And I would just add that—as a final note—that, you know, everything that we think are the values and the norms that we would aspire to in the United States on these issues are quite different in other countries. And let’s—we don’t have to talk about Russia, China, you know, Iran, Pakistan, what have you. Let’s just look at the U.K. Their Human Subjects Committee decided it was fine to make alterations in gametes with the intention of improving in vitro fertilization. So something that we think is kind of taboo here, and we haven’t even had an effort to consider it formally, is already OK in the U.K. We’ll see if Brexit is—
NICHOLS: Julie, since Laurie’s last remark is about medicine, medicine has an old view, do no harm. Are you worried about the ethical implications of the gene editing in terms of the progress of medicine?
GERBERDING: Well, absolutely. And particularly in the germ line example that Laurie just gave. I think we have to be extremely cautious in that front. And I totally agree that we need a system of risk assessment and of the rulebook, so to speak, that applies globally, although I also believe that’s going to be a very hard energy to harness. But it’s also true that we’ve been here and done this before, not just going back to the gain-of-function opportunity. I was part of that conversation at CDC, since we were part of the science behind all of that.
But you know, go back even to, you know, smallpox vaccine, or the early days of vaccinology, where the same debates occurred between the technological opportunity to prevent infectious disease and the safety concerns in the academe. So these are ongoing tensions that exist between scientific advances and concerns about unknown risks. I can’t really think of an example where the science has stopped because of the safety concerns, but I do think that the boundary conditions are important to face up to. And in this case, the stakes are particularly high.
NICHOLS: Do you think we’ve made any mistakes so far, with respect to the potential medical applications?
GERBERDING: Well, there’s what we might not understand, and there are the mistakes that we’ve actually made. And, you know, the biome is one example where I don’t think we really fully understand what our manipulation—even our antimicrobial use, for example, or anything that we put into our bodies—is really doing to our internal ecosystems, let alone the environmental ecosystem. And if you take an ecological approach and really think about emerging infectious diseases and their zoonotic origins—about 80 percent of the time—we really probably have underestimated the perturbations that we’ve caused in the system. So, you know, I don’t have a startling example to give you, but I have a strong sense that we probably don’t know what we don’t know.
GARRETT: That’s a good way of putting it. Can I throw a question to Drew?
NICHOLS: Sure, please.
GARRETT: To Drew, I point, you know, in the air, wherever. (Laughs.) Your colleague, George Church, coined a phrase, directed evolution, about three or four years ago, partly because of what Craig Venter was doing out in San Diego with deliberately creating previously non-existent microorganisms from nucleotides on up. And George was talking, OK, directed evolution, the idea that humans are not just in the Anthropocene affecting evolution because we’re throwing pollutants out and what have you, but we’re making actual, deliberate choices. And he also convened a meeting recently at Harvard that was, quote, “secret” to talk about directed production of a human genome. Can you talk about both the concept of directed evolution, and what was this secret meeting about, and how do you feel about the idea of it being secret?
ENDY: I’m happy to respond to some of those questions. (Laughter.) So there’s directed evolution, as Pim Stemmer and Frances Arnold and others have practiced for a long time, which is in a laboratory to create diversity and then select to provide the features you want, sort of like breeding of a dog, but with molecules and cells. You’re picking up on something, in my opinion, equally if not more important, but I’m hopelessly biased. So for a little bit of quick context, I teach in the undergraduate program in bioengineering at Stanford. This is the newest academic department in the Engineering School in about 50 years. So last time we started a program of the scale of bioengineering it was called computer science. (Laughter.)
And in the laboratory course we offer to our bioengineering students, we have outlawed gene editing. We refuse to teach our students how to cut and paste DNA with any of the techniques, including CRISPR, because we consider that to be a waste of their time—the opportunity cost is so high. Instead, what we wish our students to learn is how to become better at choosing what to work on, designing a living organism, contracting somebody else to manufacture that organism for them, and then receiving that and seeing if it worked. So, in other words, we decouple design and fabrication, like an architect might design a building and a contractor will realize that design for you. And we’ve pioneered that in practice over the last six years.
For that to work practically—and this gets back to the framing of your question—a new technology needs to be made quite practical. It’s called DNA synthesis. Can I go from information on a computer—the letters of DNA, A, T, C, G, in whatever combination I want—send that information over the internet to a foundry that will take that string of information and print from scratch the physical DNA molecule, and return that molecule to me, either in a cell or for me to put in a cell, so that the genetic instructions are then carried out? When I first started teaching at MIT in 2003, the cost of going through this process—of paying somebody to print DNA for me—was $4 for every letter, or base pair. So if a student wanted a gene that was a thousand bases of DNA long, I’d have to give the company $4,000 to print that gene from scratch so that the student wouldn’t have to clone it or splice it or whatever.
George and I have been involved with starting two companies. And as a result, and due to the work of other corporations as well, over the last 13 years the price of printing long DNA fragments from scratch has dropped by a factor of 400. So today, the cost of printing DNA from scratch for the things we want is not $4 a letter; it’s one penny a letter. And it continues to look like it will come down by about a factor of two every other year for the next 10 years or so.
GARRETT: It’s faster than Murphy’s—it’s faster than Murphy’s Law.
GARRETT: Moore’s Law, I mean. Murphy’s—(laughs)—god help us, that’s biosafety.
NICHOLS: That’s a different problem.
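[The figures Endy cites can be checked with quick arithmetic. The sketch below uses the round numbers from the discussion—roughly $4 per base in 2003, a penny per base today, and a halving every other year—which are illustrative rather than exact market prices.]

```python
# Sketch of the DNA-synthesis price decline described above.
# All figures are the round numbers quoted in the discussion.

start_price_cents = 400   # ~$4.00 per base when Endy began teaching (2003)
today_price_cents = 1     # ~one penny per base (13 years later)

# Factor by which the per-base price has dropped.
drop_factor = start_price_cents // today_price_cents
print(drop_factor)  # 400

# Projected price if it halves every other year for the next 10 years
# (five more halvings), as suggested.
halvings = 10 // 2
projected_cents = today_price_cents / 2 ** halvings
print(projected_cents)  # 0.03125, i.e., about 1/32 of a cent per base
```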
ENDY: So we love this technology as would-be engineers of biology, because it allows us to specialize on thinking what would be a good thing to make and then, you know, getting it made professionally.
NICHOLS: Give us an example, Drew—give us an example, Drew, of what you’re making now.
ENDY: Well, the earlier example I started with was making pain medications, right? So the teams—the teams who are programming the baker’s yeast are not, by themselves, going and starting with the plant materials to harvest the genes and then cutting and pasting them into yeast. They’re starting with computer databases of sequence information that are available over the internet, based in Canada or China or wherever. They’re using computer algorithms to find the putative sequences that they suspect will help do the biochemistry they want. They’re paying a company to print the DNA from scratch and mail order it, you know, through FedEx to the lab at Stanford, and instantiating those synthetic genetic programs into the organism, and off it goes.
But there’s many things you could do because think about what biology can do, right? So it’s a massively functional green nanotechnology that’s already taken over the planet. And to the extent you can ship information or instructions, you know, via the internet and overnight shipping, you can instantiate manufacturing of new things wherever you wish, so long as you can operate the system.
Now, to just return, Laurie, to your comment. Because the prices have been dropping faster than computing has been improving, it’s now become imaginable to finance a project to synthesize all the DNA encoding a human being. The human genome is about 3 billion bases long, plus or minus. At a penny a base, that’s 3 billion pennies—divide by a hundred and you get $30 million as the initial cost to print the DNA encoding a human being. Now, that would not be a continuous genome. It would be in short chunks of 4,000 to 10,000 bases, and you’d have to figure out how to physically assemble it. And so a meeting was held to discuss this, as have other meetings. And there’s now a proposal to underwrite and pursue, for the first time, the development of capacities to print from scratch and assemble human genomes.
GARRETT: And that meeting was secret. (Laughter.)
GERBERDING: Obviously not.
GARRETT: After the fact we all heard about it.
NICHOLS: We agreed on a conference call yesterday, among the four of us, that this is not a good idea to have a secret meeting, actually. Science thrives best when it’s very open, very transparent, very much debated. And I think we’ve all agreed on that.
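[The $30 million figure Endy mentioned follows directly from the round numbers quoted. A minimal sketch of the arithmetic—it ignores assembly, labor, and error-correction costs:]

```python
# Arithmetic behind the "$30 million" cost to print the DNA of a
# human genome from scratch, using the round figures quoted above.

genome_bases = 3_000_000_000  # ~3 billion base pairs, plus or minus
price_cents_per_base = 1      # one penny per base

cost_dollars = genome_bases * price_cents_per_base // 100
print(cost_dollars)  # 30000000, i.e., $30 million

# The DNA would arrive in short chunks of 4,000 to 10,000 bases,
# so physically assembling it is a large problem in itself.
fewest_chunks = genome_bases // 10_000
most_chunks = genome_bases // 4_000
print(fewest_chunks, most_chunks)  # 300000 750000
```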
Before we go to comments from our colleagues here, what are the military implications of this, if any? The Council on Foreign Relations is concerned with foreign relations and military relations. Are any of you worried about the peril of this falling into the wrong hands and leading to weapons?
GARRETT: Julie do you want to start?
GERBERDING: Yeah, falling into the wrong hands, I think, is probably not an applicable concept here since, you know, a sixth-grader can probably use these technologies. So we should assume that everybody has the capability. The question is—
NICHOLS: But that’s a significant statement in and of itself. It’s ubiquitous.
GERBERDING: Exactly. Which is, I’m sure, why this is on the list of WMDs that our national security advisor just created. But you know, it’s like the concern about any biological threat. Just because you can do it doesn’t mean it’s a weapon of choice, doesn’t mean it would be the most terrifying weapon of choice. Mother Nature is a very good terrorist herself. But I certainly think it deserves consideration in that paradigm.
NICHOLS: Mmm hmm.
GARRETT: I am less concerned—
ENDY: Could I just—sorry.
GARRETT: Oh, sorry, Drew. Do you want to go?
ENDY: No, please go ahead, Laurie.
GARRETT: OK. I’m less concerned about deliberate use, deliberate modification, compared to accidental and biosafety concerns. You know, Julie used to run the CDC. And in the era after she left, we’ve seen, what, eight major incidents of biosafety breaches? And that’s at the Centers for Disease Control and Prevention, our most secure civilian facility for biological research.
NICHOLS: Accidents happen. Accidents happen.
GARRETT: Accidents happen. And when you get to the point where it’s kids in high school labs, or, you know, mentally confused college kids playing around and imagining they’re the next “Breaking Bad” mad scientist, or what have you, then I start getting very worried and very concerned about whether or not we’ve really thought through how to create a safety paradigm. And you know, when Drew described this whole new algorithm of how people are approaching biology, following a trajectory of distributed creativity and production like the one that made the tech and computer revolution, look at what has resulted on the computer side.
We’ve gone from outsourcing more and more and more of computer production and chip production and smartphones and everything to China and, to a lesser degree, to Russia. And now we have this massive cybersecurity crisis, where nobody in the world feels like their data is secure, that their very personality, their very thoughts are accessible, whether it’s to their own government or to some hacker somewhere out there, or to a foreign government. And so we’re squarely in the realm of all aspects of national security. And I think, you know, this biology revolution is taking the genome to the place where the cyber has been. It’ll have different dimensions, but it’s similarly so diffuse and distributed across so many national boundaries and involves so many different legal systems that essentially it’s beyond regulation.
NICHOLS: Mmm hmm. Drew, you wanted to make a point on the military—potential military applications or foreign policy implications?
ENDY: Sure. It’s just a retro comment. The last time I was at the Council I think I was joined by Matt Meselson. And one of the important things to remember, it’s slightly before my time, is, you know, we in the U.S. used to have an offensive biological weapons program and made a very good decision heading into the Nixon administration to stand that down. And so it’s just worth remembering that and bringing that wisdom forward. We should not now, in my opinion, take actions that even inadvertently might lead others to remilitarize biology. So although we don’t say it out loud very much, it is absolutely critical that nation-states do not remilitarize biology on top of the new capacities that are developed. And it’s a retro comment, because we worked it through many decades ago, but let’s not lose that one.
NICHOLS: Yeah. A norm was set. And I think one of the things we were talking about earlier, there are few norms here in this business.
GARRETT: Well, there was a norm set, but to Drew’s point, it was massively violated. I mean, there was a treaty signed, it was in the Nixon administration, everybody was getting rid of their bioweapons, and that’s when the Soviets, in fact, revved their entire program up so that they had this massive biological weapons program, that included use of primitive genetic engineering to enhance—
NICHOLS: Which was secret, and at the time it was very often doubted whether they really had it.
NICHOLS: We want to open it up to all of you for comments and questions. I think there are some mics roaming around. The question in the center here, he’s got his card up. Right in the center. Could you identify yourself?
Q: Thank you. Yes, I shall. Richard Huber. I’m the chairman of a company called InVina Wine. But I’m here really as an investor and a director of AquaBounty, a company that has got some notoriety for having the first genetically modified animal product that has been approved.
And my question is related to that, and you’ve all touched upon it. The U.S. regulatory process, having seen it at close hand, 13 years to get approval from the FDA, is a shambles. And supposedly science-based regulation is a myth. Politics is all over the place. Can you suggest some better way of having a regulatory framework that both provides protection—something you commented on many times—but also allows the process to move at a somewhat faster pace than 13 years? Thank you.
NICHOLS: That’s a tough question, Mr. Huber, I think. And I wish Peggy Hamburg were here. I’m sure she didn’t—she didn’t block your product, I’m sure.
GERBERDING: I don’t know what Peggy would say, and I certainly don’t intend to speak for her, but I am part of the bio-innovation trade association executive committee. And of course, we’re very keen to look at how the FDA can have what it needs to be able to improve its performance. And speed is one of those performance metrics. So I believe the agency wants to improve. The 21st Century Cures legislation that’s passed the House but not the Senate is an important step in that direction.
But there’s also the missing underpinning of regulatory science. And when we think about scientific investments in the United States, I think we zoom to the NIH or maybe the National Academy of Sciences. But we don’t necessarily think about the surround sound that has to go on to make those scientific advances available to people. And regulatory science is a big piece of that. So what we’re really talking about today is a prime example of where we’re going to need a whole new framework for regulatory science to be able to do what we all want to do, which is accelerate the value of the innovations that we’re creating.
NICHOLS: I also think the staffing at FDA has been a problem for Peggy Hamburg and her predecessors. There are not very many first-rate molecular biologists and geneticists who’d rather work for the FDA than Stanford, or even CDC.
GARRETT: Why would they? I mean, look, let’s be real. It’s just like trying to get top-of-the-line cyber folks working for the FBI. Why would you want to take a salary of $90,000 a year, or $110,000 a year, with all the government bureaucracy you have to live with, when you could make 10 times that with a smart play crossing between academia and the private sector?
GERBERDING: But, Laurie, some of us do it because we care and we think it’s the right thing to do and, you know, we want to serve our country.
GARRETT: Oh, right, but, you know, that’s the minority. That’s a distinct minority.
GERBERDING: But I do think that—you know, I want to be very respectful of the quality of the scientists that are at the FDA, because there are some very outstanding people there. And again, part of the 21st Century Cures legislation is to give the FDA the hiring flexibilities and the salary support that they need to try to be more attractive to people. So I really want to get behind them and say how can we help you, because I completely agree with your point. We all agree on the problem; we need to help them solve it.
NICHOLS: Another question? Yes, here, on the right. Yes, sir, your name and affiliation, please.
Q: John Chain (sp). I’m a columnist, and I do other things.
Do we know anything at all about what’s happening at Biopreparat, the Russian militarization of bio?
GARRETT: That’s been long since destroyed, Biopreparat. What we don’t know—so, under the Soviets there were two giant arms of biological weapons production. One was Biopreparat, which was allegedly civilian. It answered directly to the Politburo. And the other was what was going on inside the military. Biopreparat became transparent. There were many years of programs with our National Academy of Sciences to transform the entire infrastructure to public health good and find employment for those scientists outside of weapons production. What we don’t know almost anything about is what really was going on inside the military labs, and what may or may not persist in Russian military labs. That we don’t know.
NICHOLS: So you can’t write a column about a secret. We don’t know the answer. Another question or comment? Yes, sir, over here on the left. Your name and affiliation, please.
Q: Stanley Arkin, I’m a member of the Council.
My son, a professor, works at Cal in bioengineering, synthetic biology, Adam Arkin. And you may know him.
ENDY: A very dear friend and colleague, yes.
Q: And from time to time when he speaks to me about the issue, what’s occurred to me is at what point does security attach? He’s given me hypotheticals which would curl your hair. At what point—is it an outside governmental agency? Does the university have some kind of special authority or obligation? But at what point in the course of research does security intervene and therefore seek to protect us all?
NICHOLS: Drew, I think that was directed at you, but other panelists may want to make a quick comment too.
ENDY: Sure. So a pleasure to meet you through the video link, sir. Adam is a dear friend, and an amazing person.
I think it depends on the aspect of security specifically that arises. So you know, there’s not a blanket statement that is responsive to everything that could be practiced. One of the hallmarks of how modern biotechnology flourishes is that there is a distribution of responsibility to the nodes, to the community, to the protagonists, to the individual actors. And so in just the conceiving of an idea, a good citizen is mindful of safety and security aspects. And attempting to implement, you know, only a top-down control framework doesn’t scale.
There are exceptions to this, which have been developed, in my opinion, on a responsive and ad hoc basis. So, for example, I spent more time over the last 20 months than I might have wished as a member of the National Science Advisory Board for Biosecurity, trying to be responsive to government requests around if and how work involving gain-of-function with flu and MERS and SARS might proceed or not. And about a month ago, we shipped our recommendations into the government for how to think about—
NICHOLS: You better explain a little more of what that means, what the problem was. But take a minute or two to.
ENDY: Sure. But I mean, but it’s a good example to illustrate how much we’re on our heels in many respects in terms of responding to security concerns. So a number of years ago, as many might know, there was public funding for work to explore how particular strains of flu might be adapted to potentially create a new type of flu variant that might lead to a pandemic. So, could you take bird flu and have it be adapted so it would passage through mammals, create casualties in mammals. Very controversial work. When it was brought forward by the research community, it first emerged in the context of scientific papers that were being considered for publication. And at that point, very late in the practice of the work, concerns were raised that resulted in debate about whether or not such research should be published ever, never mind whether or not the work should have been done.
This experience, combined with imperfect inventory control, imperfect management of shipping of samples from U.S. government labs, combined with other factors, led to the federal government putting a pause on all funding for doing gain-of-function research on specific human pathogens—viral pathogens, like flu. And it took us the better part of 20 months to work through the aspects of that and come up with a set of recommendations that the federal government could consider implementing in terms of if and how to allow such work to proceed. The point I’m trying to communicate, if I zoom out for a second, is this is a lot of work for a specific case. We’re operating on our heels a little bit. It’s very ad hoc and idiosyncratic. It doesn’t really scale as an approach. And if you look at where this landscape is going, and how the technology trends are moving, it seems like we have a lot more on the horizon and beyond the horizon to consider.
So I’ll say something that sounds a little bit strange. If I live in the future, where we’ve made living matter fully engineerable, and we then ask how do we wish to flourish, how are we going to manage safety and security, what is our plan? Safety, I think, is both practice and culture. Security is a little bit different. But the lessons I look to come out of Ohio in the 1960s: the Supreme Court cases, Jacobellis versus Ohio, having to do with obscenity and speech, and Brandenburg versus Ohio, having to do with inciting lawlessness, and how Justice Potter Stewart and others rendered opinions that recognized that they could not, in a top-down fashion, prescribe whether or not a particular work was obscene or was likely to incite lawlessness and intended to do so.
The wisdom of those two court decisions is they recognized the open-ended, generative nature of speech. And I think what we’re looking at in biology is a tremendous open-ended capacity to be constructive and creative. We have to create a culture and a practice, and promulgate a culture and a practice, that I believe would lead to overwhelming practice of constructive behavior, and a technical and public health capacity to respond when the practice is accidentally harmful, or when we have mal-intentioned actors. And it’s an—
GARRETT: If I may—
NICHOLS: Let’s get another comment—we’ll come back, Laurie—another comment or question. Yes, sir, in the center.
Q: Jeffrey Glueck from Foursquare.
A question for Professor Endy and one for Laurie. Is it possible that you could engineer a gene drive, professor, that you could insert into a nation’s food supply, that would render the entire food supply vulnerable to a trigger chemical or the like, as a bioweapon, or engineer an insect to bite humans and insert a gene drive into humans? Are these—are these conceivable bioweapons with the new gene drive technology?
And for Laurie, the U.S. is signatory to the ENMOD, the Environmental Modifications Treaty, banning environmental modifications. Should that global entity be convened as a forum for some of these global regulations?
NICHOLS: So the first question was to Drew, I think, gene drive in the food supply.
ENDY: Yeah, very quickly—yeah, very quickly, we don’t know, but it’s conceivable. The patent filings coming out of Harvard suggest that that could be practiced, but nobody’s practiced it. I’d refer you to Kevin Esvelt and others for thoughtful, technical responses. The one thing I would point out regarding gene drives is I have not seen any experimental evidence of a gene drive being 100 percent successful in terms of penetrating a population, even in the most carefully controlled laboratory experiments with relatively homogeneous strains of yeast. You get to the high 90s, but I’ve never seen anything to total completion. Now, I might not have the latest data, but I think that’s relevant from a policy perspective. Could you still nevertheless realize disturbances of some significance? Yes.
And then just as a placeholder, your second question, you know, implies, in a way, being responsive to some of the comments coming from the ETC Group and others, in terms of how one might want to consider the regulation of gene drives. So I can point you to that offline, if you like.
GARRETT: Yeah. I think I’d kind of prefer to take it jumping off the previous question, because it doesn’t matter which international entity you imagine activating to address which specific problem. We have a first problem in front of that, which Drew hinted at when he said, you know, safety and security are cultural issues. They’re about the culture of how you do science, and they’re about the culture in which science is occurring—what’s going on around the outside of it. It’s about the nature of governance in any given setting—whether it’s how a university governs safety on campus or how a nation governs safety within and without its borders.
And when we’ve had various international-level meetings looking at GOF, looking at DURC, and now looking at synthetic biology questions, what you see is that the world cannot even agree on the word security, OK? And it’s very serious. I see it now also in trying to improve the World Health Organization so that it’s capable of responding better than it did to Ebola to the next great outbreak—and the next great outbreak is already here; it’s called Zika. And what you see is that the word “security” is so charged that when Americans say the term, it’s perceived as: we’re about security for us, not you. But we want you to do things so that we feel safer. And while we’re at it, we define security blended in with terrorism.
And if you go to poor countries, particularly in Africa and South Asia, their attitude is: You’re just trying to come in here and mobilize us, and get us to buy into your tech and your parameters of control. But it’s all in your interests, not ours. And one of the biggest complaints you hear at a lot of meetings on this topic is that the more you securitize biology, the harder it is for people from many parts of the world to engage with biological enterprises in Western Europe and North America, because they become security risks.
So if you’re a really smart 18-year-old in Afghanistan who knows a lot, and is super precocious, and ready to take on the next great CRISPR revolution, it’s a lot harder for you to get clearance to come to Stanford compared to, say, a kid from Shenzhen or a kid from Palo Alto. So I think, you know, when we activate these words and think about the policy implications of them, it’s incumbent on Americans to step back and see how is this being perceived by the non-American? What are we projecting as our notion of security and safety?
NICHOLS: Thank you. Another comment? Let’s see, over there on the left here. Yes, sir. Yeah.
Q: Thank you. Exceptional discussion. Earl Carr, representing Momentum Advisors.
What would it take to get countries like China, Russia, to collaborate more with the scientific community to focus on safety? And in particular when I say countries, I’m specifically referring to governments.
NICHOLS: That’s a tough one. Julie, you’ve done a lot of international work, both at CDC and now at Merck. What’s your—
GERBERDING: Well, it’s a really hard question to answer today in this kind of ultra-nationalistic wave that we’re experiencing. So I’m not sure that we have the mechanisms to reliably do that. And we were just talking about culture. So what is the interest of the country in this technology and what is the culture and the spirit of collaboration and biosecurity in that culture? So of course, I think relationships matter and global norms matter, global—above national agencies should matter. I wish we had a WHO that was capable and confident to provide that kind of global leadership. And I think it’s something we could talk about at another panel someday.
But, you know, it’s a very sobering question, because I don’t think we have the ability to do that, nor do we have it with any other technology, such as cybersecurity, or any of the other global threats that we’re experiencing. So I don’t see this as a unique problem. I just see it as a problem that is part of the world—the flat world in which we’re living.
GARRETT: I think you need to separate the governments from everybody else, because actually there’s very, very aggressive collaboration going on right now, particularly between Americans, Europeans and Chinese. China is, you know, the biggest sequencer on the planet, and the biggest databases—although I don’t know if Drew agrees with me. But I think the largest databases on genetic sequences are all pretty much in China right now.
But when you talk about governments, that’s a whole other kettle of fish. Two problems always come forward. First of all, government, as a thoughtful organism, or whatever the heck it is, is always way behind. So the science is over here, and government’s back here arguing—we’re still really arguing 20th century paradigms in a lot of the sort of collective international debate about regulation—whether it’s about regulating counterfeit drugs in pharmaceuticals, or regulating how you would implement various types of synthetic biology. So government is always slow.
And then, just to the cultural thing, one anecdote I think is worth remembering. The question about Biopreparat was brought up previously. So the only way the West even knew Biopreparat existed was that a defector gave the information to the British government, and Margaret Thatcher confronted Boris Yeltsin. So this is after the fall of the Soviet Union. It’s still going on in Russia, in Kazakhstan, in a few other places. And in that confrontation, she is informed: well, when Nixon signed the agreement with Brezhnev eliminating biological weapons, Brezhnev went running back to Moscow and said, the Americans must have a huge bioweapons program, because they just had us sign this. So let’s get going, right? (Laughter.) So remember, government is based on its politics, its ideology, and its culture. And not every government is going to see this problem the same way.
NICHOLS: A further footnote on your—this last question is there are science advisors now to prime ministers and presidents around the world. And very often, some good ministers have helped, as Julie and I were discussing, as I’m sure Laurie knows. So there is intellectual capacity in a lot of governments, but it has to deal with all these political and cultural and ideological barriers before that intellectual capacity can get deployed in the way that I think you had in mind.
GERBERDING: I mean, one thing I would just add to that is that I sense that I’m in a room of elites, some really, really, really brilliant people who have a breadth of knowledge and scientific literacy and a lot of capabilities. But, you know, talk about the 0.1 percent: we live in a world where scientific competency, numeracy, literacy is increasingly the focus of enormous disparity. And so between the people at the level who can take in this kind of conversation and participate in the debate or the dialogue, versus the citizenry of the nation, you know, there is becoming an increasingly large chasm—
NICHOLS: A gap.
GERBERDING: A huge gap. And you know, we can’t even count on citizens to make good health decisions about ordinary things that are fairly easy to understand, let alone debate the merits of CRISPR, gene drives, or gene editing. And what happens when people don’t understand science is they default to very black-and-white solutions. Like it’s either really bad, no GMO, that’s off the table, or they just give up and, you know, don’t pay attention and don’t engage in the kind of citizenly manner that would be helpful. So I think we have to accept the reality that not only do we have to deal with the safety and security environment of these technologies, but we also have to deal with the fact that we really need to help people understand what they are. And that’s a tall order.
NICHOLS: In the back, on the left, thank you.
Q: David Bard, American Securities.
Drew, early on you alluded to a lack of policy, writ large. And you compared it to nuclear policy, as that is well done and this is not. We’ve hit on a lot of pieces of what that policy might look like throughout this conversation, but could you, and then maybe others, lay out what the framework would look like if you could do it and you didn’t have a lot of these constraints? And then what reading material would you suggest for people who are total novices and don’t understand some of the acronyms that are sort of flying around?
NICHOLS: Drew, you want to take a crack at that in 60 seconds? (Laughter.)
ENDY: Sure. So to the second question, I would point you and others to iGEM, yet another acronym, I-G-E-M. And if you go to iGEM.org on the web, you’ll see something remarkable. It’s an International Genetically Engineered Machines competition. It’s a genetic engineering Olympics. It will take over the Hynes Convention Center in Boston this fall, where you’ll have about 4,000 teenagers showing up with their constructions or creations, their living organisms that they’ve made: 300 teams from 40 countries, the most creative up-and-coming biotechnology leaders on the planet, to a first approximation. We have evidence now because there are 30,000 alumni of this program. And they’re running many of the most interesting start-up companies in biotechnology across the planet. So I would just check out what the iGEM students are doing. All their presentation videos are free to watch online. And there’s a tremendous resourcing of information and materials through the iGEM website.
If you wanted to promulgate a new standard of safety or security or any sort of cultural practice globally, and actually deploy that next year, one of the things you could do is change the judging requirements in iGEM. And now, all of a sudden, 5,000 people around the world would try to practice that to earn the medals in the competition. So just a placeholder policy comment, to come back to the first question: there’s a tremendous amount of cultural or soft power. There’s a tremendous amount of network effects being established. And were we to be strategic about realizing what we wished for, we would embrace those possibilities and figure out how to make the world better.
I think of Robbie Barbero at OSTP around the topic of regulation—the gentleman from AquaBounty raised it. What Robbie has been trying to do, and it’s a struggle, is to basically implement a process wherein things are being considered on an ongoing basis: not just the specific cases that are coming forward for review and perhaps approval, but how the practice is being carried out, how the systems are working or not working, how different agencies could do better. Are there gaps in the portfolio of agencies or boards? I think that way of thinking recognizes that when you have a set of technologies in biology and biotech that are moving so quickly, you have to sustain work to figure out what the plan should be, right?
So when Gretzky describes why he’s a good hockey player, he says he skates to where the puck is going to be. But even with ice hockey there’s friction on the rink and the puck is slowing down. We’ve got a little rocket on the puck. The puck is accelerating. And we need to figure out how to skate to where the puck is going to be if we’re still to be strategic. And so I’d like to see sustained schools of scholarship. I’d like to see sustained debates among the agencies. I’d like to see sustained investments.
You know, when the Human Genome Project was invested in for sequencing, it was 3 percent bolted on for ethics discussions. The problem with funding things on that basis is it’s typically all piecemeal. If you have to do significant new regulatory science, if you have to do something that requires block funding, you just can’t get the support for it. So I think there needs to be a holistic reconsideration of how we’re strategically approaching biology—period.
NICHOLS: Thank you, Drew. The OSTP he mentioned, that’s the Office of Science and Technology Policy in the White House. And I think Drew is correct, at the White House and at the director of NIH and the director of CDC, those are the leaders that will put together the policy pillars—
GARRETT: For the United States.
NICHOLS: Yeah, of course.
GARRETT: And for publicly funded science. But I mean, let’s keep in mind—
NICHOLS: But those norms usually apply.
GARRETT: Ninety percent, you know, of what’s out there is not the United States doing this science. And a huge percent of it is in the private sector. So the only thing I would add to Drew’s really smart analysis, so that you don’t all go out of here paralyzed with terror and fear, is that there is self-correction going on all the time within the scientific community. And just a great example, because yesterday was worldwide Subway Microbiome Day. In 54 cities around the world, teenagers and college students were out swabbing subways and bringing samples back to figure out what microbes were in all the subways of the world.
This is a build-on from a study done just in the New York subway systems two years ago, that had a really bad outcome, which is to say they went public with their findings, even though they didn’t have great reference strains to—reference sequences to be sure what strains they were really finding, and whether it was accurate or not. And here in a city where we were subjected to 9/11 and an anthrax attack, they said they found plague and anthrax in our subways, and were very surprised that the public was upset and that the city health department was less than happy with their announcement.
A learning curve has happened. A lot of scientists piled onto those folks and said, let’s do this right. Let’s be much more accurate, much more careful. Let’s think about what we’re saying to the public. If this is a public good, let’s frame it right. And now, yesterday, 54 cities executed—under a whole kind of mutually agreed framework that’s a really sophisticated improvement.
NICHOLS: Thank you, Laurie. I think—join me in thanking our panel. (Applause.)
GARRETT: Thank you, Drew.
NICHOLS: Thank you, Drew, from Stanford. (Laughter.)