As Technological Change Accelerates, Can Policy Keep Up?
Technology, Policymaking, and the Future
Lincoln Professor of Law, Culture, and Values, Arizona State University; Future Tense Fellow, New America Foundation
Founder, Practical Futurist
Cofounder and Publisher, HyperVocal.com
Joel Garreau of the Garreau Group and Michael Rogers of the Practical Futurist join HyperVocal.com's Lee Brenner to discuss the latest advances in information technology and biotechnology and their implications for public policy. The panelists note that though government support for basic research has been crucial to the development of many advanced technologies, the proper level of governmental involvement in the regulation of those technologies is contested. They also discuss the economic challenges created as technology alters the demand for particular skills in the labor force.
BRENNER: OK, welcome to today's Council on Foreign Relations meeting. My name is Lee Brenner. I am the Co-Founder and Publisher of HyperVocal.com, which is a next-generation news and media company.
I also co-host a Sirius XM show called "Politics Powered by Twitter," looking at how politics and policy have been altered, changed and are affected by social media and what is playing in the prism of social media.
So great to have you all today. We're going to talk about the future, right? So we've got a lot—a lot to cover. But really focused on policy, technology, how those things interact. I've got two great people with us: Joel Garreau, who's the Lincoln Professor of Law, Culture, and Values at Arizona State University and a Future Tense Fellow at the New America Foundation. He spent almost 40 years at "The Washington Post," covering...
BRENNER: Almost 40.
GARREAU: Almost 40, yes, yes.
BRENNER: And covering society and culture and how things were changing in this country. And things are changing. We also have Michael Rogers who's Founder of Practical Futurist, is one of—former titles, Futurist-in-Residence at "The New York Times," also has a great title.
But I'm going to ask each of them to just give a—a quick one-minute overview of what they kind of are paying attention to and kind of jump into a conversation for a bit. And then we'll open it up to your questions halfway through.
So let's start first with Joel.
Give—give—give me a sense of what you've been working on in terms of technology and policy.
GARREAU: Yes. My area of interest is that we're at an inflection point in history. For the first time in hundreds of thousands of years, our technologies have—are not aimed outward at modifying our environment in the fashion of fire, clothes, cities, agriculture, space travel.
Increasingly, these technologies are aimed inward at modifying our minds, our memories, our metabolisms, our personalities and our kids. And if you can do all that, you're in the stunning position of being the first species to take control of your own evolution, not in some distant science fiction future but right now, on our watch, and I've written a book about this called "Radical Evolution." And the technologies that are driving this are called the GRIN technologies, for genetics, robotics, information and nano—and the significance of this is we're not just talking about the internet.
ROGERS: One of the things I work with a lot of companies on now—and that's sort of what I do now—is, having worked for "The Washington Post," "Newsweek" and "The New York Times" moving into the new century, I declared victory there and moved on to help other corporations who have very similar issues.
And what it all comes down to for services and information is what I call the virtualization of the world, that we are moving into a period of time that the best analogy I can think of is Lewis Mumford once wrote an essay called "The Natural History of Urbanization" in which he pointed out that the move into cities for us happened very, very quickly as a species over about a thousand years. And when we move into cities, it was a fundamental shift in society for several reasons.
One, we needed new business models and new laws to live together as opposed to sort of the family groups we had. Number two, it took us one step away from the source of our food.
It was a step of abstraction away from the physical world. And of course, urbanization was an enormous shift. I think virtualization, which is the creation of a virtual world that runs parallel to the physical world, in which we live all of the time, is the next big step after urbanization.
And right now, we're at the very beginning of virtualization. But what it means ultimately is that we will all be connected all of the time to the virtual world 24 hours a day, either consciously through a new generation of devices or unconsciously through all of the objects around us.
And just as urbanization really transforms society, our laws, the way we live, I think that virtualization will be the same.
BRENNER: Oh, my god, I think that's actually a great place to start. Now, we've—I was talking on the radio show, actually recently about the internet of things, which is basically that concept of everything being connected that we use—your refrigerators, everything else that can tell you how to live your life.
I guess the general question is: is it a good thing for society? Are we going to get to a point—or are we coming very quickly to a point—where there's going to be a major conflict in society, where there are people saying, we have to stop this—not necessarily conservative politically, but people saying, look, this is moving too fast?
Or are things moving so quickly and it's just the way it is where people are going to adapt and it is coming and people are going to learn to live with these technologies?
ROGERS: I think it's a combination of things. I think people are adapting to technology more quickly than they used to.
And the example that I like to use is, my clients will say it took an entire generation to teach people to use ATMs, automatic teller machines. We actually got the taking money out part pretty quickly but the putting the money in took about a generation, took about 20 years.
And I like to flash forward to the biggest dating site, match.com. I once interviewed one of the founders of match.com and asked him what he would have done differently if he'd known what online dating would become.
And he said, I would have invested much more, earlier, because he had no idea that online dating would go from something that was really pretty weird and risky to mainstream within eight years. And then finally, Facebook—you know, when Facebook launched in 2006, it was .edu.
It was college students. Within five years, the fastest-growing segment on Facebook was women over 55. So I think we're seeing acceleration of the adoption rate.
BRENNER: Right. And some would say that last part is part of the reason that Facebook is no longer cool. It is really—no, no, no, not for the weird part. No, no, I'm sorry, for—let me finish—cool for...
GARREAU: The elders.
BRENNER: No, cool for kids, teenagers because their mothers are on the platform right now.
BRENNER: And so they're not adopting it. So they are just using mobile technologies in the Net (ph).
ROGERS: Nothing worse than being friended by your mom.
BRENNER: No, and—and that's—and that's actually become now a cultural point.
But so—so looking at—at that technology of—of things and the things people use, Joel, are we looking at the new technologies, you know, there's now a generation that's just coming into adulthood that has never known life without the internet, right? And so they are—they are coming out and obviously, anyone who's born now, internet means a completely different thing than even it means to these 18-year-olds.
How does the technology change for kids today—or the way, you know, society works? Are there going to be these rifts between adults of all different ages and children that cannot be remedied?
GARREAU: Well, my interest is in culture and values—who we are, how we got that way, where we're headed and what makes us tick. That's—that's my thing.
And I'm not as big a nerd as I might sound being up here but I mean, I just walk up to the technology to see what light it reflects our human nature. I'm not—I don't care about the gear per se.
But one of the things that looks pretty clear to me is that technology moves faster than culture. There is a culture lag. And I guess everybody knows what I mean when I say Moore's Law, right—this curve of accelerating change?
Well, that's also true now, increasingly, for biology and robotics and so on. If the curve of accelerating change is going up like this, and if our responses are more or less flat—like we're waiting for House Judiciary to solve our problems—well, we're obviously toast, right, because the gap just keeps on getting wider and wider.
I have an optimistic view, hoping that our responses will also come up on a second curve. But in the meantime, I think the reason that you're seeing more and more social weirdness—Tea Party, take notice—is that the ground is moving beneath our feet.
And—and that's—if anything, it's increasing. And I think when the ground moves beneath your feet, any sane primate looks for something solid to hang onto. And I think what you're seeing is people who are increasingly buying simple narratives that sound right.
I mean, you know—you know, of all—all the commentators, you know, I mean, cable commentators and stuff like that, all of whom are offering simple narratives, which, by the way, are probably wrong. But they are something solid to hang onto.
And that's what worries me. I think we've got to accelerate our narratives.
ROGERS: Lee, let me follow up on that—you used the phrase "the next generation." That is always a real hot button in any audience I'm dealing with, even if I'm dealing with venture capitalists and we're talking about investments with 10-year time frames and the most technical stuff you can think of. Afterwards, there's always a question: what about these kids?
And the fact is, the millennial generation, the largest generation in American history, is the one that is really growing up with technology. It's interesting—some researchers want to divide the millennials into two halves, the 10 to 20-year-olds and the 20 to 30-year-olds, and call the 20 to 30-year-olds the senior millennials, which I love.
I love to tell a 26-year-old they're now a senior. But the reason is that the whole relationship to the virtual world of the 10 to 20-year-olds and their younger siblings is so different.
"If the curve accelerating change is going up like this, and if our responses are more or less flat, like we're waiting for House judiciary to solve our problems, well, we're obviously toast, right, because the gap just keeps on getting wider and wider."
And that, to me, you know, where is this going? And I have to say, this is one enormous science experiment, because we genuinely don't know what the effects are of growing up so attached to the virtual world, virtual relationships.
But the virtual relationship piece, I think, is crucial. I think that this generation will grow up being able to create and maintain meaningful virtual relationships, in personal lives and business lives, in a way that the boomers just can't understand. And that will really change this notion of virtualization.
BRENNER: And a general topic, almost—Joel, you were referring to this—as technology advances very quickly and society may not necessarily be caught up to the speed at which it is moving, a general question could be: are the internet and technology making people smarter or dumber? I was talking about that earlier.
But on a kind of angle of that: will the next generation, the young people who are growing up with this technology—because they are living with it from day one—be adapted so that their learning curve will be better, so that they will be smarter and able to use it constantly? Some people are going to say, oh, the young people are actually going to solve all the problems here.
But maybe, they have, at least, more tools than people that hadn't grown up with this technology. And it's—and it's harder for them to adapt even though they're, you know, they will be around for the next few decades.
GARREAU: I'd be happy with employed. You know, I mean, I don't know if they're adapting. I don't really care.
I just want them out of my basement. I have two daughters. They're both—and one of them is here and they're both very happily employed.
Thank you very much. God bless you dear (ph). The—but—but no, seriously, I mean, one of the things that I'm very worried about—I think we're all—we all are, is, you know, what happens if this time, it is different.
You know, what happened—I mean, the people who, you know, there's this whole argument that the Luddites were crazy because, you know, the—their jobs were taken away but they ended up moving to the cities and getting better jobs and blah, blah, blah, which is all true.
But meanwhile, they lost their jobs, you know. And there was an awful lot of revolution and war and dead people in the course of what we now smooth out as this curve.
And you know, we—I can easily imagine it. I mean, I, well, all politics is local. So I was a reporter and editor of "The Washington Post." And then in 2009, I ended up having to take the last buyout of senior staff before the layoff started because there is no business model.
Now, I'm in a university. You know, if this happens to me again and that industry gets shut off from under me, I'm going to start taking this stuff personally, you know.
And I'm just very cognizant of the fact that the last time we had 25 percent unemployment, in the first half of the 20th century, national socialism began looking good to a lot of people—of course, so did the New Deal.
But this could be some pretty heavy upheaval. And I imagine it will start with the young.
BRENNER: What role is—does government play in that, then, because of potential technology advances which are replacing so many jobs. And so much money is being put into—through venture capitalists, into technologies and companies that don't necessarily hire the same way that a General Motors and some of these other companies in the past did with similar market capitalization.
How does—what—what does the government have to do to make sure or to help people have jobs in the future?
ROGERS: You know, that—that's a very broad question. And it ranges from income redistribution if necessary because we may end up 30 years out with a class of people who have a lot of time on their hands and are relatively well-educated.
And we don't quite know what to do with them. So income redistribution is one way to keep the revolution out of the streets. A second, though, is to be more realistic about what jobs will continue in the future and which won't. And I used this example when we were talking earlier.
I just finished working with a group that represents plumbers, electricians and heating and air conditioning contractors. And their problem is finding young workers because the United States really focuses on everyone goes to a four-year college, gets a degree and sits in an office.
And those jobs are so easily automated or outsourced that—that that's a fiction almost. I work with legal firms now. And young lawyers are having a difficult time partly because of outsourcing and partly because of automation.
So the—the HVAC contractors want to bring a message to parents, school districts and legislators that says, these are good jobs. Being a plumber can't be automated and it can't be outsourced.
And it's a perfectly dignified thing to do. One guy said to me, you know, my son's friends all went to college, four-year degree, came home with $40,000 in debt.
And they live at home. He went to work for me and, at the same time, is buying his first house, thinking about getting married and having a kid. So we have to be more realistic about what jobs are really going to be there.
GARREAU: I mean, I think we're coming out of a phase, for the last 15 years or so at least, of a kind of techno-utopianism—that this was all going to be great and we were going to solve, you know, pain, suffering, stupidity, ignorance, death.
You know—and there are still a bunch of people who believe that, a lot of them in Silicon Valley and a lot of them in the U.S. government.
One of the things you have to keep track of is the fact that just about all of this change that we're talking about was financed by the U.S. government. Steve Jobs, you know, that whole myth about the garage, I mean, he was a packager and a designer, I mean, and a great one, don't get me wrong.
But every single technology in your smart phone was created by the U.S. government, from Siri to GPS to the touch screen. These are the guys—DARPA, or the Defense Advanced Research Projects Agency, where I was embedded for a year. And I'm really struck by—not yesterday's news about information technology, but tomorrow's news about biology, which is where DARPA just launched a new directorate, BTO, the Biological Technologies Office.
It's very rare for them to create a new directorate. And the reason they finally did was because they had so many biology programs going across their other directorates that they finally had to create one place to get all of these things together and try to get their arms around it.
And the kind of—I mean, I commend it to your attention. Go to the DARPA Web site and go to BTO, and look at what they're doing there. It's pretty—I mean, they haven't gotten the memo yet.
They're—they're still very optimistic. One of the things that they're doing is a program called Living Foundries: 1000 Molecules.
Well, what this is about is that at Arizona State University, in Biodesign—which is a big biotechnology lab—for example, on the roof, we've got creatures that eat CO2 and poop gasoline, seriously, thereby hopefully solving the Middle East and climate change in the near future. Well, the way you do that is, there are 27 chemical steps to go from carbon dioxide to crude equivalent.
And they all occur in nature—but not in one creature. So what you do is you take these 27 steps, genetically stitch them together into one bacterium or alga, and bam, you've got crude equivalent.
Well, what DARPA is working on right now in the Living Foundries business is creating a thousand novel molecules that you've never seen before—novel materials, essentially, in the same way. And in the list of what they're working on, there's another one called Prophecy.
"Every single technology in your smart phone was created by the U.S. government, from Siri to GPS to the touch screen to all of this."
It's another one of their current programs, which is about biological invulnerability. There's another one called Biochronicity, which, among other things, is meant to reverse aging.
Now—I mean, yeah, check out the DARPA Web site. Check out BTO. Now, the interesting thing to me is that the reaction to all of this is also largely government-funded so far.
Like, for example—I mean, one of my big things is ethics. I'm an ethics professor. And when it comes to the ethics of this technology, some of the people who are most ahead of it are the Navy, for example.
I give them tons of credit, over here in Annapolis and over on the West Coast. They've got lethal autonomous robots. They've got robots that do not need a human to pull the trigger. They're called smart mines.
These smart mines register whether this is an enemy ship or not. And then they decide whether to blow it up. And the Navy has this problem, right, because you can't put electricity very easily through water.
So they have a hard time communicating with their robots. So they are in the forefront of creating ethical robots. Are they going to succeed?
I have no idea. Am I glad somebody's working on it? You bet.
BRENNER: What role, I mean, some of these things that are being created, and obviously, I assume the DARPA Web site doesn't have everything they're working on most likely but...
GARREAU: You'll be amazed actually.
BRENNER: ...I mean, they'll—but—but in the sense of things that they're creating, obviously, there's plenty of practical uses and sometimes people haven't thought of but for medicines and—and science and all these things that can help humanity, but obviously, then there's the other side of things where they may be creating molecules that they don't necessarily know exactly what it'll turn into 30 years from now...
BRENNER: ...how the body will react to it, what role—Michael, what role do you think the media has in shaping cultural and societal opinions about these types of things? I mean, how—how does—how important—is there a media anymore that actually has that role in terms of shaping opinion?
Or is it so democratized that there isn't necessarily those—those cultural icons in media?
ROGERS: I think we're going through a period now where media is being reinvented. And that kind of scientific journalism is really what's necessary: to understand the technology, you need to understand some of the science behind it, you know.
And there's a long great tradition of science journalism in the United States. But there are fewer and fewer places that can support that kind of journalism.
We have a big hole. And we're not quite sure how we're going to fill it. And I often say that the last century was the golden age of journalism, not because we did such great things but because we had so much gold.
When you were a monopoly and you owned the printing press or the broadcasting tower, you made so much money—you really did—that we invented this church-state relationship in the media. In other words, they made so much money that the business people actually didn't care that much what we reporters did.
You know, we'd upset the local car dealer maybe. But you know, he wouldn't advertise for six months. But he'd come back. So we invented this church-state relationship and did some amazing journalism because they basically let us do whatever we wanted.
And there was great science journalism. I mean, going back to the '70s was one of the first times that the specter of genetic engineering came up.
That was the invention of recombinant DNA, which is basically the underpinning of many of these technologies today. There was a conference of scientists to talk about its threats and possibilities—very thoughtfully reported by dozens of newspapers.
Craziness appeared. But the craziness actually sort of got damped down. I don't think there is an equivalent to that today. And we need to solve that problem.
BRENNER: And you know, there's been a lot of—there's been tons of discussion about the role of—the size of government, regulation and a lot of these things. And obviously, people, now you—you mention, tea party, but there's a lot of people who say, government needs to be smaller.
But with a smaller government, they don't necessarily create all of these advances that people don't know are part of their phones and all these other things. And then the other side of that is: what role does the government have in regulating the Steve Jobs creation that actually repackages or advances something that the government has created?
And then we can obviously talk a little bit about data and everything else that's going on. But what—what role generally do you think government has in regulating? And then are there people in government that are focused on that?
Because oftentimes, government is not as advanced as—and DARPA is probably the—the most advanced and the rest of the people who are actually regulating aren't as advanced.
GARREAU: Well, from my sense, I attend a lot of conferences on this subject. And I think it's doomed. I mean, one of—I mean, this comes, you know, if you think that culture moves slower than—than innovation, well, then—then comes regulation in Washington and all that.
I mean, you go to these conferences with all these people, and they act as if it's possible—I mean, one of the things I'm interested in is, how do you govern technologies that don't exist yet? That's one of my core interests.
That's why I'm in the law school. And they're talking about spending five or 10 years to regulate technologies that are already five or 10 years old. I see nothing good coming out of that.
I just have a lot of trouble with it. I mean, I'm a congenital optimist. So what I'm hoping replaces this industrial-age, mechanistic reg writing—which I view as hopeless—is, I'm wondering, how do you accelerate the curve of human response?
My hunch is that the way you would do that is in a bottom-up, flock-like way. I mean, I think the idea of top-down hierarchical stuff is just not fast enough. There's nothing wrong with it in the abstract.
It's just not fast enough. So what I'm very interested in is when you see, for example, DIY biology—do-it-yourself biology—coming up, which is happening, and people stitching stuff together for fun. I'm going to be very interested to see what kind of ethics and morals evolve fast around that in a bottom-up way.
I've got a lot more optimism about the people who are doing it, coming up with ways to make sure we don't destroy the human race than I do in the FDA, frankly.
BRENNER: And is that a question, then, of whether there's going to be self-regulation that will take a bigger role than government? Or maybe, you know, we have crowd-sourcing on everything from fundraising to ideas—is that maybe a sense where there'll be much lower government walls and much more crowd-sourcing of regulation?
GARREAU: Depends on what you mean by regulation.
ROGERS: Self-regulation often makes me nervous because it—it's not the little guy who's doing the self-regulating. It's the big corporations.
ROGERS: Let's take, for example, the regulation of the internet. Within the United States, the internet was launched by DARPA. But it was essentially invented by university professors.
It was DARPA money, but it was guys at Stanford—and mostly guys at UCLA—who invented this thing in the late '60s, when their biggest goal was to see that an e-mail went from Palo Alto to Los Angeles. That was a triumph.
I went to school in Silicon Valley. And I know some of these guys. And you know, once over beers, one of them said, you know, if we had known what the internet was going to turn into, we never would have built it the way we did, because it's utterly insecure.
It's hopelessly insecure. It's the Wild West. So one of the things I think we're going to see in the narrative in terms of government intervention over the next decade is the rule of law on the internet.
And one of the countries that is resisting that the most is the United States. They are very laissez faire about the internet and the whole virtual world, which was a good thing for a while, but now, we have such powerful constituencies—the internet advertising companies, Google, Facebook, gigantic lobbying interests that are still doing their best to keep the internet from being regulated.
It's interesting, a lot of the best thinking, I believe, about privacy, data retention, things like that is going on in Europe because they are not hindered by the enormous lobbying efforts in the United States towards self-regulation.
GARREAU: I think some of the most interesting conversation in the United States that's going on about biology is occurring in the sports pages. Technologies are always adopted first wherever you see the greatest competition.
And so it's not too surprising that when you're talking about human enhancement—enhancing human cognition, human memory capabilities, aging, the whole deal, coming up with version 2.0 humans, all of which is in the works—the place where this conversation is at its most thoughtful is the sports pages, where you're asking whether Barry Bonds should go around with an asterisk on his forehead for the rest of his life because he's not the same kind of human as Willie Mays, his godfather, and all of the people whose records they broke. I mean, it's not all knee-jerk.
You're having nuanced conversations there, where people are saying, on the one hand, well, all people want to see is spectacle, and if you can hit more home runs, mazel tov. And other people are saying, but wait a minute—is that moral, though?
I mean, the—the whole narrative behind sports is about human competition and about excelling and becoming more of yourself and blah, blah, blah. And what happens if you just become a machine that has got a great pharmaceutical crew?
You know, that's—suppose that's what it turns out to be. We're having that conversation right now in a bottom-up way.
And I think that's a lot smarter and more interesting than most of the regulatory efforts that I've seen here in Washington.
ROGERS: Although that might be aimed at regulation—at least a sort of private regulation: how do you regulate what an athlete is allowed to do to their body? And...
GARREAU: Well, this is what I mean for regulation. What do you mean by regulation?
ROGERS: Yes. Right.
GARREAU: If you're talking about FDA, I think that's hopeless. If you're talking about deciding what is OK and what is not OK in the—in terms of narrative, and what I will buy a ticket for and what I won't, I mean, already, there's a—I mean, I don't expect, for example, to see human cloning anytime soon, not because it's not technologically possible.
Of course it is—but because there has been such a revulsion to it, even more so than to GMOs. Now, if you count that kind of bottom-up narrative as regulation, that's where I end up being a little bit more optimistic.
ROGERS: I think the other interesting point is that we don't really think about these technologies until we see examples of what they can do. And as you say, sports is where it first happened. So we think about these in the abstract and we become overwhelmed.
But when the specifics come along, I'm a little optimistic about the human character.
BRENNER: Well, let's—we're going to actually open it up and invite audience members to join in in this discussion. Please wait for the microphone and speak directly into it.
Stand, state your name and affiliation. And keep your questions and comments concise to allow as many attendees as possible to speak. So I will start calling on people.
In the green there?
QUESTION: And if I could just say that the Web was also supported by the National Science Foundation and not just DARPA—a small commercial—but the real question is, I have really no faith in either bottom-up or top-down regulation. But I think there's a middle ground, like the Human Fertilisation and Embryology Authority in Great Britain.
And I wondered if either of you were familiar with that model and whether you think that that might, in fact, be a way for the future.
ROGERS (?): It's an—sort of semi-institutional...
QUESTION: Not to hog the microphone, but the first test-tube baby, Louise Brown, was of course born in Britain. And the British had something called the Warnock Commission and have developed an authority over human embryonic fertilization and other genetic issues. It is made up of a public board that is very carefully selected, largely reflects the general ethics, if you will, of the British population, and has kept up very well with the science—it has changed with the science—and really, I think, is a model that we should take a look at more here in the United States.
QUESTION: I'm impressed with what you're saying because I—I grew up with parents that were always way ahead of the technology. The dilemma I find is human beings have great difficulty.
I mean, change is really hard. The younger kids are going to live in a world that's going to be very easy for them until things change. There was a wonderful article in "Scientific American" the other day that I was reading about how, in the brain, basically, the first thing you learn is the easiest thing for you to keep going back to.
So you've got this question of adaptation that is here. One—another thing that I think is terribly important, which gets back to your journalism thing, is training people to be able to tell stories. And we don't do that.
That's not considered a very important skill in our educational system. And on the other hand, if you can't tell stories, how can people figure out really what these alternatives are?
I, too, worry about all this mechanization in our lives. And also, the question of jobs, we're going to have to rethink what work is or what people get—will get paid to do.
The other day, I saw some people picking up stuff on the street wearing red vests. And I went out and said, thank you for doing that. We really need people doing that.
You've got to find a way to give people respect for doing a lot of work that we wouldn't want to do. And machines aren't going to change that.
But I think it's desperately needed.
BRENNER: Gentleman in the back?
QUESTION: Joel, Lee, Michael, could you address the question—this question, are we headed for the dystopia portrayed by Dave Eggers in "The Circle" inevitably?
And how can we avoid it if we want to?
GARREAU: Oh, thank you for that. This is my hobby horse, thank you. Wait, so I don't make predictions. I don't have a crystal ball.
I don't know anybody who does. I'm still waiting for my—my jet pack. So what I do is scenarios, which are systematic, rational, logical stories, narratives about what the future might be like given the facts that we've actually got on the ground right now.
And when you talk about human enhancement, you basically end up with heaven, hell and prevail as the three scenarios. In the heaven scenario, the curve goes straight up.
You conquer pain, suffering, stupidity, ignorance, death. We merge with our machines. We all live happily ever after in this utopia.
This is the Ray Kurzweil memorial scenario in the singularity universe. Might happen, perfectly credible. It's mapping Moore's Law onto the future.
The hell scenario is the mirror opposite. You have the same rapid change, but it gets into the hands of madmen or fools. And we wipe out the human race in the next two days.
Again, a perfectly credible scenario, believe me, that's the business. And it's the one that most people most easily identify with.
I guess because we had a lousy 20th century. Anytime you say, well, it's all going to go to hell, everybody says, yes, you got that straight.
But the third scenario is not the one I'm predicting but it's the one I'm rooting for, and that's prevail. And prevail is not some middle ground between heaven and hell. Heaven and hell are both techno-deterministic, meaning that they both assume that what matters is how many transistors you can get to talk to each other, you know, on the basis of Moore's Law.
Prevail is way over in its own territory because it's got a fundamentally different proposition, which is maybe what matters is how many ornery, cussed, imaginative, surprising humans you can hook up. There are reasons for guarded optimism that the human connection is what matters.
You look out at the future of the human race from 1200 A.D., you see marauding hordes, you see plague. You can say, OK, it's over for this species. Fourteen-fifty, you get movable type and the printing press.
All of a sudden, you've got a brand new way of storing and sharing, collecting and distributing your ideas. And the results are amazing. First, you get the renaissance and then you get the enlightenment and then the world we got today.
Lots of examples like that. And throughout our literature and our stories, we've got a lot of prevail stories. So the question I'm asking myself is, can we figure out a way to accelerate the second curve of human response in ways identical to the way DARPA accelerates the first curve of technological challenges?
And I've launched something called the Prevail Project, which is intended to see if we can do exactly that. In that scenario, you ask yourself, how would you know if the prevail scenario was happening?
And I would guess that what you'd see is an increased pace of out-of-nowhere, bottom-up, flock-like, unexpected things. Have we seen that much lately?
Well, of course, you know, what about eBay? That's not just the world's biggest flea market. That's hundreds of millions of people doing very complicated things without leaders.
What about Facebook? What about Twitter? I have no idea what Twitter is good for. But if it flips out every tyrant in the Middle East, I'm interested.
BRENNER: That's—you know, in the bow tie.
QUESTION: My question is about the relationship between technology and truth, coming from the millennial generation, I guess, if I can take that label on. Think of a high-brow example and maybe a lower-brow example: going to Wikipedia and assuming that something you read is true, or, on the other hand, some of the more recent developments like Vox and FiveThirtyEight, sites that attempt to use data or various simple ways to depict very complex situations, affecting the way in which people understand what's going on. This idea of very boiled-down, simple models masquerading as truth: is that related to the technology, and is it important to think about as we continue to look at the role of technology?
ROGERS: I think we're seeing a fundamental shift in communications skills. And it's just starting now, but one of the more controversial things that I say, particularly in audiences like this, is that I do believe we're coming to the end of widespread long-form reading and writing over the next couple of decades, that those skills will be in very rapid decline.
And the fact is it's because you don't need to read and write that much to get a lot of information about the world. And I tell the story of my mother, a second-grade teacher.
Long ago, if a little boy didn't know how to read, didn't get the reading thing, she'd find out he was interested in antique cars. And she'd bring in some books on antique cars.
He'd start going through them. And lo and behold, he's a reader. He doesn't need to look at books now. He can see all the videos and audio and slideshows about it.
And so we see that. We see long-form reading and writing, and by that I mean anything over about 200 words, declining very, very fast because it's simply not going to be necessary.
And I think what we don't understand is what that does to the thinking processes, to what extent the thinking process is formed by learning to read. But in my more dystopian moments, I say that out around 2025, 2030, you know, we will look back with some humor at these days when we used to look for kids with reading disabilities, because we will understand that reading was not a natural skill to begin with.
Now, in 2025, we will look for the kids with reading abilities and just like star athletes, we'll say, OK, we are going to teach you to read and write really well. So I think that's why we're starting to see this world of boiled down jargon journalism or graphic journalism.
And I don't think it's a good thing.
GARREAU: Can I just build on that for a minute?
BRENNER: Oh, please.
GARREAU: That's a scary scenario. The—one of the things that—I mean, as a professional storyteller, I mean, I note that—that storytellers have been getting the best piece of meat around the camp fire for an extremely long time.
And you know, the—I don't expect that to change in the future because that's such a—we're pattern-seeking storytelling animals. That's who we are.
You know, rather than look up into the night sky and deal with the fact that maybe all those bright dots are a random distribution, no, we come up with the most amazing stories about bears and princesses and lions and swords and so forth. And we just can't stand the possibility that it's random.
And we create stories instead. If that were to change, if all of a sudden we were no longer the storytelling species, that would be a profound shift in what it means to be human. And it's one that I would have a hard time—I'm going to have a hard time writing that scenario over the next 20 or 30 or 40 years.
I mean, my problem is the getting of the best piece of meat around the camp fire. Rewarding the storytellers is the challenge.
And that's where I can imagine seeing, I agree with you, a really dark next 10 or 20 years between the collapse of the old business model, which has occurred already, and the rise of the new one. But to say that there is not going to be storytelling as part of the human species in important and rich and complex ways, wow, I don't want to think about that scenario.
I just have a—a hard time with that.
BRENNER: Although—although maybe storytelling, I think, will always exist. I mean, it's just the platform may change, I think...
BRENNER: ...for someone will...
BRENNER: ...learn how to—they may tell, you know, an entire novel will come out in 140-character bursts, and Nike will decide...
GARREAU: You really believe that?
BRENNER: Oh, yes. And Nike will decide, listen, they've got 12 million people following them. I actually will pay this person to keep doing that. And they can put in an ad for Nike every once in a while.
So maybe that's—I'm not saying that's an entirely new concept or model, but maybe that's an area.
Yes, you there.
QUESTION: Your comments on unemployment in this country I found incredibly pregnant given that there's a surplus of labor in this country and a very robust and active carceral state, which ensnares overwhelmingly black bodies and some brown bodies, too. So I'd love to hear your insights on incarceration, millennials, and surplus labor in this country. Thank you.
ROGERS: Well, you know, I think Mitsy said something earlier on that was a good point, which is we need to do two things about finding jobs for people. One, we need to give sort of service work more dignity.
And we probably need to pay more for it, to give people true living wages. It's not unlike the early days of factories, when the factory workers went in and, you know, they didn't make much money at all. And it was not until unionization that suddenly they were the middle class.
And the middle class drove our economic engines. Now, it's pretty clear that service workers are going to be driving—those are going to be the jobs. They're going to be driving the economy.
But—but not if they make $7 an hour. So incarceration, I think, is what you try to prevent and we really need to re-look again at the nature of work in society, I think.
GARREAU: If I could just add to that. Again, if you're looking at the technology that's aimed inward at modifying what we are as humans, like cognition and memory, all that jazz, you can pretty easily imagine scenarios in which, in the not-too-distant future, we end up with at least three different kinds of humans: the enhanced, the naturals and the rest.
The enhanced are the ones who embrace these technologies for themselves and for their children, something brand new every six months. I mean, I've got some of this stuff in my pack, you know, things like Provigil, which shuts off the human trigger to sleep. It exists. FDA approved.
You know, so—so you end up with more and more enhanced humans that—that jump at this. And their kids are the ones who end up being smarter and better able to get into the best colleges and so forth.
And you have to decide how you feel about that. Then there are the naturals. These are the ones who do have access to these technologies but choose not to indulge, like today's fundamentalists who eschew modern pleasures.
And then you have the rest. These are the people who do not have access to these technologies for reasons of geography or class or whatever. That could get real ugly real fast.
It's been a long time since we've seen more than one kind of human walking around at the same time, 25,000 or 40,000 years, depending on how you read the fossil evidence. You know, I mean, you look at some of our wars right now, you look at Afghanistan, and you look at our war fighters versus people who essentially haven't changed a hell of a lot since the 14th century, and you say, wow, is this what it looks like?
"Now, it's pretty clear that service workers are going to be driving—those are going to be the jobs. They're going to be driving the economy. But not if they make $7 an hour."
Is this the beginning? Scary scenarios.
QUESTION: You've talked about regulation with respect to emerging technologies. I wonder about safety and security that goes beyond regulation.
You mentioned the ability of mad men to wipe out the human race in two days. Whether it's a mad man or somebody who makes a mistake, how do we create safety valves or law enforcement or security service that can prevent such an accident or deliberate act from happening?
GARREAU: I'm real pessimistic about police forces doing this. I'm—I'm much more optimistic about—I mean, my cockeyed optimism is based on—on what bottom-up flock-like surprising solutions look like.
So for example, if you had told—I mean, 30 years ago, if we had said that in the year 2014, every day, every second of every day, our most important computers that regulate everything from our—you know, everything in the world would be attacked by the most incredibly malicious and imaginative and sophisticated pieces of software, you know, bugs and worms and everything, you know, we would—in the '70s, you would have said, OK, it's over for this species.
We're toast, right? But now, in fact, what's happened is that, without really a hell of a lot in the way of government regulation, we've developed bottom-up responses, and there are entire industries designed to scan for this stuff. I mean, I'm not saying it's perfect. I'm just saying they exist.
I don't—don't get me wrong. I'm not suggesting that any of this is a panacea. I'm just saying, I'm—I'm a student of good-enough solutions.
And using this as an analogy, you've got good-enough solutions. Does everybody's credit card get stolen? Sure. Do we indict half the Chinese military? Sure.
I mean, is this a perfect solution? No. But it's not the catastrophe that you would have imagined had I given you the scenario in—in the '70s.
So I mean, this species has a history of muddling through. I mean, that's part of what prevail is all about. It's heroic muddling through, like Huckleberry Finn, like Exodus in the Bible, all our prevail stories.
And so the question I've got is, can we imagine scenarios where we have heroic muddling through? Maybe that's what Control Risks does for a living.
ROGERS: Let me—can I just add a couple of quick thoughts on regulation. The first is in the life sciences. I think we have seen some pretty good self-regulation. Just recently, for example, over what to do with the final smallpox virus stocks and the modified bird flu question. Within the scientific community, there are the roots of some self-regulation that we've seen over the years, and it's not a bad thing, and it works.
I think when it gets into the capitalist system, regulation—self-regulation begins to break down. I mean, look at General Motors. There were a bunch of engineers who knew that too heavy a key chain would cause the driver to die, right?
But that never got reported because it was going to cost 18 cents more per switch to fix it. So there's self-regulation, but it can be overcome very easily by economics.
The second piece is when it comes to the internet specifically. I think this is a case where we will see much more global activity. Right now, it's very disparate. We've got autocratic countries controlling parts of their internet. We've got the United States being quite laissez-faire. Ultimately, I think we're going to move towards some elements of a more international set of guidelines.
One thing I think we're going to see is internet passports, some ability to have a real identity on the internet. It's astonishing that such an important thing doesn't have legal identities at this point.
So every country is sort of working on this, all the Western democracies at least, working on how you make a legal identity, probably with some biometric, so that when you enter into the virtual world, we're pretty sure we know who you are.
Sure, it's spoofable just the way a passport or driver's license...
GARREAU: Putin is going to love that.
ROGERS: Yes, well, that's why the Americans are being so careful about it because, of course, national I.D. is a third rail. But these won't be IDs that you have to use all the time, just the way a driver's license isn't.
But if you'd like to get on an airplane, it's really handy to have a driver's license. So I think real identities on the internet are going to be a big step forward.
GARREAU: And meanwhile, the most effective regulators we've got are the leakers.
GARREAU: I mean, you know, in the last year, now that we've learned what the NSA does for a living, that's made more difference than the entire regulatory apparatus of Washington, D.C., I would argue.
BRENNER: Thanks. So there are regulators who—and I won't name names—say that 99 percent of scientists are wrong on certain things.
GARREAU: And they admit it.
GARREAU: That's part of the deal.
BRENNER (?): Right.
QUESTION: So my question is about the I.T. industry. What can you say about the ability of the industry to meet people's needs today and to stay ahead of the pack? Where is it going, and how does it fall into your vision of the future under the different scenarios? How helpful is this industry? What else do you think?
GARREAU: The I.T. guy.
ROGERS: Well, I think the I.T. industry is absolutely crucial in the sense that, to talk about the scenarios, we are, you know, running into a lot of challenges between now and 2050, given the population increase that's inevitable, the move towards sustainability, which we're really going to need to have, and the move away from the constant-growth notion.
I.T., the combination of a global network, smart objects, and very intelligent software of the, say, IBM Watson class that is able to really look at a lot of data and come up with novel solutions to problems, I think is enormously helpful. In terms of the whole world, you know, I know analysts in the telecom industry who seriously say that by 2020 or 2021, absolutely everyone on the planet can have a phone if they want, which is an amazing, amazing concept.
It's a combination of very low-cost Chinese hardware and really the Indian business model that lets you sell mobile service to people who earn $2 a day and make a profit. So you know, Pakistan just the other day, was one of the last countries to approve 3G networks.
So a lot of those people, by the early 2020s, are going to have smart phones and be connected to the internet. It's really going to happen.
It's hard for us to imagine how transformative that will be. But I think I.T., along with smart approaches to biology, can do it, though biology takes a little longer. I.T., I think, can make positive changes very, very quickly.
GARREAU: Can I build on that for a minute. In—in your question is embedded the notion that information technology is a thing, a separate thing, you know, that is somehow distinct from nanorobotics, genetics, ordinary life, you know.
I wonder about that. I mean, you know, everything that we seem to be heading towards is smaller, more ubiquitous. And if we're not carrying it on ourselves personally, knowingly, we're being watched by somebody else who's got a reason to do that for us.
And you know, when it becomes that ubiquitous, when—when the chair becomes smart, when the water glass becomes smart, is it useful to think of that as information technology? Or is this some new state of being?
And how does Hewlett Packard make any money off this?
BRENNER: Well, and then the question is, who controls all of the information, whether it's just...
BRENNER: ...individuals, the government, the companies themselves. I think we probably have time for one more question.
Before we take this, I want to remind all participants that this meeting has been on the record. So everything you've said will be held against you in a court of law.
Yes, in the white jacket?
QUESTION: I was wondering if you can talk a little bit about the education of policymakers on the state of science and scientists on the state of policy and how you would advocate that we make sure there's not a gap between the two communities.
GARREAU: Good final question.
ROGERS (?): Yes.
GARREAU: I'm less interested in regulation than I am in ethics, frankly. You know, we're at this stage, we're at this amazing point in history where we can do just about anything with—with material science, with energy and with even biology. When you can do just about anything, then the core question is what you should do.
The—and that's the core question of ethics. And I—and I mean, in my optimistic moments, it's because I'm seeing more people asking, given the opportunity to do anything, what should we do? And that's why I'm glad to see government entities like the military, like the Navy, like our foreign policy establishment, asking more "should" questions.
And if you can arrive at some consensus, or even some novel thoughts, about what we should do, that strikes me as being much more important than writing the regs.
ROGERS: I think it's a really interesting thing that can start in scientific education. One example: for the last few years, I've gone to Cold Spring Harbor Laboratory. For those of you who follow genetics, you know, that was sort of the wellspring of a lot of interesting work.
And just to talk to some of the graduate students about how society and science and the media sort of work together, and you know, that's a—a direct connection, I think, to policymaking there, but you know, it's a little like—I think as science and technology affect us more and more quickly, that should be a part of the curriculum.
And we're sort of seeing that shift now in medicine, for example. Medical school education takes a more holistic approach. Journalism education now includes entrepreneurialism.
So you can go out and figure out how to make a living. So educations are shifting, and I think scientific education should as well. And I think it's got to come from that side.
I think the policymakers, it's hard—you need someone to go to them. They don't necessarily come to you.
BRENNER: Are you seeing enough of that conversation happening? I mean, they should be doing it but is—is enough of it happening to make the—the most necessary changes, at least from the policymaker's side?
GARREAU: Actually, for example, again, generalizing from a university of one: Biodesign at Arizona State University is 350,000 square feet of creatures that do not exist in nature. They're, you know, I mean, they're out there.
And increasingly, you're seeing people who are embedded in Biodesign who are in the ethics game. So, for example, a woman scientist was involved in learning everything there was to know about some disease for which there was no cure.
And the ethicist said, would there be something interesting for you to do that might involve a molecule or an organism for which we could create a cure? I don't mean to dump on scientists.
But this is so classic. It had never occurred to her. I mean, scientists do not wake up in the morning thinking about how they're going to change the human race.
They wake up in the morning thinking about how they're going to wire the goddamn monkey. You know, they don't read news. They read journals and stuff like that.
And the idea of opening up their world to a much larger consideration of the impact of what they're doing is, I think, crucial. And I think I see it happening.
ROGERS (?): Slowly, but you know...
GARREAU: It has to be fast enough.
ROGERS: No, in the I.T. world right now, one of the big, big questions is net neutrality. The FCC, Tom Wheeler's dealing with this. I am pretty convinced that the FCC is hearing all sides of the question.
And then we have advocacy groups on every side. Some of them are a lot richer than the others. But all sides of the argument are being brought to the FCC.
I mean, I'm—I'm comfortable with that. Can the regulators do the right thing? There probably isn't a right thing. Can they do something that causes the least damage?
I think that's probably the case.
GARREAU: At least in the short term.
ROGERS: Yes, exactly.
ROGERS: Right, right.
BRENNER: All right, well, I think we are at our end. I want to thank our panelists, Michael and Joel, for joining us.
BRENNER: And thank you all for attending.