For the past three-quarters of a century, the United States has led the world in technological innovation and development. The nation now risks falling behind its competitors, principally China. Innovation and National Security: Keeping Our Edge, the report of a CFR-sponsored Independent Task Force, outlines a strategy to ensure the U.S. remains the predominant power in a range of emerging technologies, and the national security implications if it fails to do so.
*This meeting will take place at the Council on Foreign Relations in both New York and Washington, DC.*
Members may bring a guest to this event.
SCHMEMANN: Good morning, everyone. I’m Anya Schmemann, director of the Independent Taskforce Program here at the Council. And I’m delighted to introduce the latest taskforce report, Innovation and National Security: Keeping Our Edge. I invite you to also visit the report online to view our video and engage with our interactive graphics. Our thanks go to our co-chairs, Dr. James Manyika and Admiral Bill McRaven, for their inspiring leadership throughout this project, and to my super colleague Adam Segal, who is joining us from New York, for serving as project director and being the pen of the group.
We are really grateful to all of our dedicated taskforce members and observers who have lent their time and expertise to this project. We do have a number of them with us today. Members, please raise your hands, also in New York. We thank you for your service on the taskforce. I’d also like to recognize Congressman Jim Himes who has been a really great spokesperson for this set of issues. Thank you, Congressman. I’d also like to thank my hardworking colleagues Chelie Setzer and Sara Shah in D.C., Lauren Dudley in New York, and all the staff members of CFR who make these projects possible.
Taskforce reports represent a consensus among members. Our group was truly a model of civil discourse and meaningful collaboration on a set of issues that are truly critical for our country. I hope you enjoy today’s discussion. I hope you read the report and share it if you’re so inclined.
I now turn it over to Kara Swisher to guide our conversation. Thank you.
SWISHER: Thank you very much.
Welcome to today’s Council on Foreign Relations report launch for the CFR Independent Taskforce on innovation and national security. I’m Kara Swisher, an editor at Recode, and I write a weekly column for the New York Times on technology and issues like this. I’ll be presiding over today’s discussion. I would also like to welcome CFR members around the nation and the world participating in this meeting through the teleconference. We’ll hear from them during the question-and-answer session. And, hello, Adam, on the tiny screen down here. It’s good to see you. (Laughs.) That’s funny.
So I read all the parts I needed to read. So I want to get started with both of you. Thank you for turning closer. Why don’t we start with you, Admiral, about where you think the most important parts of this report—why don’t we go through the report first, then we’ll discuss sort of the key issues, which I think is probably China. But let’s go through what you all think the—what you think the most important part of this is.
MCRAVEN: Yeah, well, thanks, Kara. And let me also thank my co-chair James Manyika and Adam Segal there in New York for the great work he did, and the rest of the members of the taskforce. You know, we start off the report by talking about this being a Sputnik moment. And I do think, you know, when we take a look at the challenges that are out there, the pace of innovation, the rise of China, the fact that we have a workforce that really needs an infusion of STEM employees, we realize that this is a moment that we have got to seize if we don’t want to fall behind in our national security efforts.
So I think when you look at these issues, and particularly the pace of innovation, the rise of China, the need for more STEM-related folks within the workforce, when you look at the problems that the Department of Defense has in terms of their ability to innovate—they have this great desire to innovate, but there are some constraints in the regulations, in the budgeting process, in a lot of things that we need to take a hard look at. These are important concepts, important ideas. I don’t think any of them are fundamentally new ideas. What I’m hoping the taskforce brings to light is the fact that we’ve got to tackle these ideas now, we’ve got to tackle these issues now. If not, we will find ourselves in ten years well behind China. And so I would say that’s kind of the crux of what we see in the taskforce report.
SWISHER: James, why don’t we talk about that? Where do you see we are right now? Because I think many people think China is surpassing us, I certainly do, in many areas that are critical, like AI, quantum computing, robotics, and automation.
MANYIKA: Yeah, just to build on what Bill just described, I think we now have a set of foundational technologies that are pretty fundamental to innovation, not just across the economy and industries, but also for military use. These are sometimes called dual-use technologies. This set of foundational technologies includes the advanced connectivity technologies; 5G is a current example, and there will be subsequent generations of that. Artificial intelligence, what’s starting to happen with quantum computing. We’ve also got the new revolutions that are coming out of the biosciences, genomics and so forth. These have become quite foundational technologies. And the pace of innovation and investment around those is moving pretty dramatically.
And I think one of the things that’s important to keep in mind is that the U.S. has had an unbelievable track record of innovation and leading the world in innovation. And I think that system and playbook of innovation is not guaranteed. We still have an enormous lead. If you think about it, we’ve led the world in terms of R&D investments. We’ve led the world in kind of developing these advanced, amazing technologies. We’ve led the world in being at the center of the technology innovation ecosystem, domestically and globally. And we’ve been unbelievably impressive in the past at how the military and the national defense system kind of adopts these technologies and brings them to scale.
On every one of those, we have a new and amazing challenger now. China is on pace to spend as much as we do, if not more, as their economy grows. In fact, I think they’re on pace to spend close to about 2.5 percent of GDP within a decade. You know, at our peak we spent about 2 percent, in 1964. It kind of tailed off a little bit in the ’70s to about, you know, 1.86 percent. Today we spend about 0.6 percent. So the pace and scale of that is challenged. China is now playing everywhere around the world. So where we used to be the central node of innovation and the ecosystem, China is now part of global supply chains and value chains everywhere. They’re competing for markets as much as we are everywhere around the world.
So the playbook has kind of changed a bit. And we’ve got an unusually fierce competitor. But we still have the opportunity to keep our edge. And that’s why we actually titled the report Keeping Our Edge.
SWISHER: So, Adam, why don’t we check in with you here? One of the things that you were talking about is we were the most innovative.
MANYIKA: We still are.
SWISHER: Yeah, in certain areas it’s moving very quickly, from what I understand and people I interview. But one of the things about technology is the young eats its old, really quite significantly. And I don’t mean young people necessarily, but just it doesn’t matter—(laughter)—although that’s fine. Who knows what diet they’re having in Silicon Valley these days? (Laughter.) But does it matter? Does it matter as these things start to progress and they leapfrog over each other? But most of the innovation has been from the U.S. completely.
Adam, can you talk to, like, how we got here, and what—sort of give us sort of a broader sense of the kind of investments that need to be made?
SEGAL: Sure. You know, the U.S. set up a system at the end of World War II whose vision was laid out by Vannevar Bush, who was the science advisor to the president. And he basically envisioned a kind of pipeline that started with basic science that the federal government funded; it moved to the universities, where they did research and training; and then it ended up in the commercial sector, where companies did further, kind of applied, R&D, and venture capital helped grow those industries.
That’s what we have. That’s what has succeeded immensely over the last seventy years. To be quite honest, that’s what the Chinese have been trying to recreate. A lot of what they’ve been trying to do is create spinoffs, create entrepreneurs, improve links between the universities and the private sector. But a big part of that was what the federal government put in at the beginning of the pipeline. And as James and Bill have both noted, federal R&D was a huge part of that, because the federal government can make big bets on risky technologies that the private sector cannot.
So the private sector is playing an incredibly important role in driving R&D. They’re now, you know, funding more than the federal government is. It’s a huge driver in the United States. U.S. companies spend about five times as much on R&D as their Chinese competitors do. So there’s still a huge gap there. But there is a certain type of technology that addresses national security concerns, social concerns, that the private sector is not good at funding. And that’s really where we have to get back involved in a more ambitious and aggressive funding effort there.
SWISHER: Admiral, why don’t you talk about the relationship between government and the tech sector, and the funding? Because one of the things most people in Silicon Valley tell me is that the Snowden revelations really broke a little bit of the relationship, at probably a bad time. But talk about that idea, because there used to be a much more permeating relationship between them, and now it’s not the same. You had the Defense Innovation Unit and all kinds—you know, obviously the CIA even has a venture firm out there. Talk a little bit about how that relationship is right now.
MCRAVEN: Well, it is one of the things we talk about in the report, and the need to kind of solidify and strengthen that relationship. I think, you know, the federal government, and particularly the Department of Defense, has had a longstanding, strong relationship with the private sector, with the universities, in terms of how the DOD builds on their innovation. Now, there have been some concerns lately. As you well know, we had a relationship with Google, working on Project Maven. The folks at Google decided that they didn’t want to support Project Maven because of the facial recognition aspect of it, and the potential that it could be used in a manner that they weren’t comfortable with.
I think those are important discussions to have. We probably needed to have that discussion before we moved into Project Maven. But these are important conversations, not just between the Department of Defense and the private sector; I think the entire federal government and the private sector need to have an understanding of how the technology is going to be used, why we think it is important to have this technology. You know, when you look at the Department of Defense, we don’t want to inadvertently, you know, kill innocent people. So, again, I think things like the Defense Innovation Board are vastly important for the Department of Defense, not only to have this relationship with the private sector, but also to help the Department of Defense understand how the private sector is innovating today.
One of the problems we have in DOD is that we have a bit of a laborious budgeting process, which is important to make sure that we have the appropriate congressional oversight, that the money the taxpayers authorize for us to spend is being spent correctly. But it is difficult in the Department of Defense to iterate and to fail fast, as Silicon Valley does. Somehow we’ve got to be able to capture that ability within the Department of Defense so that we can be more innovative, particularly in areas like software. Now, the fact of the matter is, when you have a man in the loop, when you are building, you know, fighters, and submarines, and ships with a man in the loop, or a woman in the loop, you have to be very, very careful about how quickly you iterate. You don’t want to fail fast when the potential for life and limb is out there.
But in areas like, you know, software and some hardware, we’ve got to find a better way to do that. The private sector has figured that out. We need to, again, kind of marry what the private sector is doing with what DOD can do. And we need Congress’ help on that. And, again, we need to continue to have this dialogue between the federal government and the private sector, so there’s an understanding of how this technology is going to be used.
SWISHER: So talk a little bit about that talent pipeline, because they make more money staying where they are, at the companies they’re at, than working with the federal government. I mean, I think the guy who founded Oculus, his new company, which is working on border protection, whatever he’s doing, just got valued at a billion dollars by a very famous venture capital fund in Silicon Valley, Andreessen Horowitz. Tell me, how do you get that happening? And, obviously, at the same time, facing those objections from a lot of workers in Silicon Valley to working with the government?
MANYIKA: Well, I think part of it is just creating a little bit more fluidity between Silicon Valley and the Department of Defense. I mean, we’ve got many examples, some of whom were actually on the taskforce by the way, who’ve been in both places. And in fact, one of the things we encourage is, you know, is there a way to create career paths that allow people to come—from the private sector—to come spend time in the Department of Defense, and vice versa, and be able to go back and forth. We even encourage—one of the recommendations we make is about what we called ROTC tech, which is, is there a way for people to actually serve, much like in ROTC, but where they’re bringing tech capabilities into those situations? I think a big part of it is just simply having the conversation.
SWISHER: A question, will there be pushups? Because that might be a problem. (Laughter.)
MANYIKA: There will be pushups.
MCRAVEN: There are always pushups. (Laughter.)
SWISHER: Well, I’m so sorry.
MANYIKA: No, but I think the talent question is one of the most fundamental ones, both in terms of talent domestically but also immigrants. One of the things on the domestic front is, if you look at just our STEM pipeline, it’s pretty sparse. We don’t even fully take into account all the other people who should participate in that—women, minorities, and others. If you look at people getting degrees in STEM disciplines, we don’t fully take advantage of our STEM pipeline. The same thing with immigrants. You only have to look at Silicon Valley at the number of companies, and innovations, patents, whatever, that have been developed by immigrants. So I think the talent pipeline is one of the big areas of recommendations that we actually make.
One of the things—I like that Adam mentioned Vannevar Bush. One of the things that happened in the ’50s was the National Defense Education Act. I think one of the recommendations we make is that we need a twenty-first-century version of that, which is: How do we fundamentally, at a very different scale, invest in education and these talent questions to improve the talent pipeline? We haven’t done enough on that.
SWISHER: So, Adam, why don’t you talk about Congress and the White House in this effort, because a lot of this stuff has been previously funded. Do we have a chief technology officer anymore? I don’t think—do we? Perhaps.
SEGAL: Yes, we do. We do.
SWISHER: It took a long time. But talk about focusing in on these things and funding them more significantly. I mean, nobody can agree on lunch anymore, I get that. But these are critical issues, national security particularly, especially as a country like China is not having those arguments among themselves in terms of funding. What kind of funding is needed? And it has to come from the federal government in order to do this, especially around some of the newer technologies.
SEGAL: Yeah. So I think that the taskforce really found that there are lots of components of what would be a national strategy out there. And there are things that the Trump administration has done that are addressing these issues, in particular addressing technology leakage or technology flow to China. Addressing the IP theft issue. Looking at CFIUS reform and export control laws. There’s been—
SWISHER: That’s just dealing with stealing and shoplifting, really. Like, I’m talking about actually funding massive innovation.
SEGAL: Yeah. And so I think also we’ve seen some increases in the budget for things like DARPA and the DIU, which you mentioned before, the Defense Innovation Unit. But you are exactly right that a comprehensive effort is going to require Congress and the White House to come to agreements about budgets. And, you know, unfortunately the White House has basically led with cutting science budgets in every budget that it has submitted. NSF, ARPA-E, right, the Energy ARPA, the DOE funding—all have been cut. So I think, as we said and James and Bill mentioned, you know, one of the big recommendations, of course, is to raise federal support for R&D to historical levels, about 1.6 percent, but also looking at a five-year, $20 billion initiative funded at universities that brings together interdisciplinary and transformative research. So I think there is a lot of need for both the White House and Congress to agree on budgets and to focus on discretionary spending, where this money is going to come from.
MANYIKA: Yeah, I was just going to add, you know, the—one of the things I think that’s been important that the White House has actually done has brought attention to these kind of foundational frontier technologies. You’ve probably seen the commitments, for example, made to quantum computing, to artificial intelligence. There’s been a slew of kind of assertions and, I think, important steps to highlight the importance of frontier technologies. I think what’s needed is then to put real action and funding behind those things. But certainly when it comes to identifying what are those frontier foundational technologies, I think they’ve done that. I think it’s important.
SWISHER: Well, I mean, I went to way too many events where they were—all of them, the Obama administration, this administration—a lot of this, not a lot of spending on it. So how do you get there, Admiral?
MCRAVEN: Yeah, yeah. One of the things I would offer as well is it’s not that this administration, I think, isn’t focused on AI and machine learning. I think we need to give them credit where credit is due. I think they have tackled a lot of these issues. As Adam mentioned, they have increased the budget at DARPA and the Department of Defense. Where we think they need to put some added emphasis is into basic research. This is where you really see this kind of shortage in terms of funding, funding at the universities, but basic research kind of across the spectrum—NIH, NSF, et cetera. Basic research is really what’s going to drive, I think, the foundational changes.
What you see the private sector investing in is innovation that’s going to be commercially relevant in a short period of time, whereas what you’ve seen historically with the federal government is, you know, the development of the internet, the development of GPS, et cetera. These were basic research items that were funded early on that required a long funding stream before they really came to fruition. You just don’t see that happening in the private sector. And this is where we would offer for the administration and for Congress to really begin to invest in basic research in whatever areas we can find, so that, frankly, it will pay dividends ten years down the line.
SWISHER: One of the interesting trends, though, is that these billionaires are spending in certain areas that government used to spend. In space, right now it’s Elon Musk and Jeff Bezos doing that. In cars, it’s Elon Musk. In education it’s Mark Zuckerberg. These are things the government used to do. And I’m not so sure we want to get our cues from billionaires’ decisions on what they want to spend. So how do you get that—in fact, I’m sure we don’t. (Laughter.) So how do you get—where would you put the money—from each of you. Why don’t each of you talk about—Adam why don’t you start—where you think the key areas are? You mentioned several, five, in this report. But what do you think—you know, sort of stack rank them in terms of where the most important—I would say AI above all else. But they’re all systemic and important for each other.
SEGAL: Yeah. I think, as you mentioned, the taskforce listed about five to eight of them, AI being the most important. I think there’s a large focus on genomics and other health technologies. You know, what to do about the next generation of communication, since we seem to have lost the race for 5G. So what comes next? How do we make sure that we can at least be competitive and think about what comes next, if it’s 6G? And I think a focus on robotics. But, you know, one of the things I think the taskforce struggled with is coming up with a list like that, because five years from now maybe it’s a different set of technologies that we’re really talking about. So I think we want to look at the underlying processes: interdisciplinary work being incredibly important, managing to bridge the stovepipes in the federal government and the private sector and universities. So you’re looking at work that brings together this convergence of physical, digital, and biological, where we think most of the innovation is going to happen.
SWISHER: What about you?
MANYIKA: The one thing I would add—I think the technologies are the right ones, because those are the general, foundational, general-purpose technologies. But I think there’s a whole set of ancillary things that go around those technologies. Quite often, if you look at AI and machine learning, in addition to the science of it and the algorithms, there are all the data questions that go with that. And how do we think about harnessing the data potential, because you need data to train the algorithms? How do we think about that? The data question also comes up with genomics. We need to understand that. And this is, by the way, one of the areas where our formidable competitor—partly because of their permissive approach to data and so forth—
SWISHER: (Laughs.) Permissive. That’s a nice word.
MANYIKA: Will probably make—yeah—will probably make more progress. So I think we have to navigate these questions about how do we invest in the science, but also the associated components, like data, that are required and critical for that?
Another example also: take the 5G question. Part of it is the technology itself, the radios and the chips and so on, but you also have the spectrum question. One of the reasons why we’ve not done as well is we haven’t quite freed up the kind of spectrum use that we need now. Again, to their credit, the Trump administration has actually mandated that the Department of Commerce and other federal agencies really look at spectrum usage, to make sure that we actually find a way to free up spectrum use in a way that enables these technologies.
So I think this is not just about the science, although I agree with Bill, the science is pretty fundamental and has to be long-term. But we also have to look at the other system aspects of it that are required for these to be successful.
MCRAVEN: And, Kara, what I’d offer: when I was chancellor of the University of Texas System I’d have forums like this. And somebody would invariably ask me what I thought the number-one national security issue was. And I think they always thought I would answer, you know, North Korea, Iran. And my answer was always the same: K-12 education. So what I would offer, as we begin to look at how we ensure that China doesn’t pass us, that we remain the leader in national security, is: we have to invest in education. We have to invest in this pipeline. At the end of the day, as Adam points out, the technology will change. And obviously the budget becomes an issue.
You know, we’re recommending in the report that we go from, you know, about 0.7 percent of GDP up to 1.1. That’s about 146 billion (dollars) up to 230 billion (dollars). We realize that’s a lot of money. But I would offer, as we begin to look at how we’re going to allocate resources to improve our national security, to improve innovation: let’s invest in education. Let’s invest in these great young men and women that are coming up, that are part of the pipeline, because if we invest in them early and often and sufficiently enough, then they will bring the brainpower necessary to figure out what the next technology is going to be, and what’s going to be needed in the future. If we focus too much on the technology, I think we’re missing the point. We’ve got to start early.
SWISHER: I think you’re 100 percent right. One of the things I always say is there’s not a lack of talent; it’s a lack of opportunity and access. But that’s another thing that doesn’t get funded in a lot of ways. But you’re right, that’s the beginning of it.
So I want to finish up sort of by spinning it forward, and then we’ll get to questions from the audience in all places. You know, obviously we will probably get a lot of questions about China. One of the things you talked about is “permissive,” and “authoritarian,” actually, is the word you’re looking for. (Laughter.)
MANYIKA: Well, I was talking about the data part, yes.
SWISHER: And creepy. And creepy. And a surveillance state. That’s pretty much all the words you can use. (Laughter.) When you’re talking about the idea of data—Kai-Fu Lee had a book about this—the amount of data they’re collecting is astonishing. The number of sensors, the number of—and that’s something, Adam, I think you left out, the idea of sensors and facial recognition and everything else. I don’t think we want to win an arms race in facial recognition with the Chinese government, because it plays into all these authoritarian problems. But when you think about the data, how do you honestly collect the data so that we hew to our basic feelings that democracy’s a good thing, and that invasiveness into people’s lives and more is problematic? So let’s finish up, each of you, sort of thinking about: What technology perhaps most scares you, and yet is critical to do correctly?
MANYIKA: Well, I think for me the data question is one of the most critical ones, partly because when we think about data use—I mean, at some level AI technology and machine learning rely on data, whether it’s behavioral data about people or whether it’s even, you know, traffic data or satellite data. So when we think about data, we should think about all the data types that we’re talking about. Now, there’s some that we need to have much better governance systems for, such as surveillance data and so forth; we need those to be very, very rigorous. And there we should not compromise our values whatsoever. But there are other data sets, data on climate change, data on health, that can be collected and used to train these algorithms. So I think we can’t lose sight of the fact that, you know, that’s one of the things we’re going to need to do within the context of our values. But data’s an important part of developing a lot of these technologies and using them. And I think that’s one of the reasons why it’s important that, you know, American technologies and approaches to using them actually win everywhere, and other people and ourselves build on those technologies, as opposed to having other systems win in those markets.
MCRAVEN: And I would offer, you know, when we went around and talked to the kind of experts on the East Coast and the West Coast and the middle coast, the issue of data was a little confusing for the layman, for me. You know, you think the more data, the better your AI is going to be. But the fact of the matter is it’s got to be clean data, it’s got to be the right data. And in fact, if you just collect a lot of data, and it is not clean and it is not the correct data, then the algorithm is going to be worthless or, worse, bad for you. So I think understanding the nature of the data and what data is going to contribute to successful health care, national security, pick something, is going to be crucial. That’s going to be the hard part, I think, when it comes to data.
SEGAL: So I think the fact that China is leading in a lot of this surveillance technology underscores the point in the report that before, when the U.S. was the unchallenged leader in this space, it could help set the norms and policies around the use and governance of those technologies. And now we face a real challenge. We see China exporting these technologies to Africa and other places in the developing world. And so the issue is, how do you make sure that U.S. values and interests are supported? And how do you shape these technologies as they emerge? And it’s not just AI, right? We see it with CRISPR and other things that China is really competitive in.
So the taskforce talks about a tech alliance. Others have floated this idea of a technology alliance. And that is really working with our European, Japanese, and other partners on: What do we think the goals are? How do you govern these technologies? How do you think about privacy and data? And really having a serious, robust international discussion about where we think we can be effective in stopping the flow of some of these technologies, where we’re more effective in working together, where we’re effective in trying to develop some international norms.
SWISHER: It’s certainly true. I don’t think—the U.S. has led this internet revolution for twenty years now, twenty-five years. And certainly a frightening thought to think about China—the next internet era being owned and controlled by the Chinese, for sure, I mean, especially around the surveillance issues and other things. I mean, certainly we do our share of surveillance, but at least we feel bad about it. (Laughter.) Some of us. Some of us do. Others not so much.
SEGAL: When we get caught.
SWISHER: Anyway, at this time I’d like to invite members to join our conversation with their questions. A reminder, this meeting is on the record, which means Twitter. Wait for the microphone and speak directly into it. Please state your name and affiliation and speak into the microphone. Please limit yourself to one question, and keep it concise, to allow as many members as possible to speak. And I would like to remind national members to email their questions to questions@CFR.org. I have this lovely iPad here which will collect some of the questions.
But why don’t we start now, and someone will come around. Why don’t you pick people?
Q: Hi. Michele Flournoy.
So I wanted you to think about, if all of your recommendations were implemented, you know, the representative champions them on the Hill, all that money comes, you build the digital service academy, you get the ROTC program, you have this huge influx of tech talent, in the ideal world, talk about the receiving end. Right now, if you threw 25,000 tech-talented people at the Department of Defense—(laughs)—they would not have a very good experience. You know, I think Raj can speak to this—it feels like launching an insurgency when you’re running DIU or DDS, and so forth. So what has to happen on the receiving end in terms of how we manage careers and develop talent? In terms of how we approach acquisition and procurement? In terms of how we approach change management? Like, what are the changes on the receiving end that would have to happen to complement the excellent sort of supply-side recommendations that you make?
MCRAVEN: Well, I would offer, Michele, on the Department of Defense side, as you know better than anyone, you know, we have had kind of similar processes in place for probably the last, you know, fifty or sixty years in terms of how we do budgeting, how we do manpower. And it’s not that we haven’t looked at that. I think we have refined those kind of on the margins. But we really need to do kind of a top to bottom review in terms of ensuring that we are developing the pipelines for the entry of these great STEM folks. And, again, the 25,000 obviously won’t come in all in one lump sum. It will—it will be infused in over time. But even over time, I’m not sure the pipeline is ready to receive them.
To the point about how do we innovate? As you know, contrary to what a lot of people think, the Department of Defense is very innovative. And the men and women in the Department of Defense want to innovate. The issue becomes some of the federal acquisition regulations that kind of hamper them sometimes. It is the sense that, you know, we don’t want to do anything that causes us to fail too fast, because we are spending taxpayers’ dollars and we can’t go back to Congress and say: Thanks for giving us $20 million, but it didn’t turn out to be of any value. Now we want another 20 million (dollars).
So we clearly have got to have, I think, a change in philosophy, a change in structures in terms of the Department of Defense. But that’s really going to require—I think it’s going to require Congress to decide how we want to do it. If you go back and you look at 1986 when we did the Goldwater-Nichols Act, I mean, this was led by Congress. A fundamental change in the entire Department of Defense. A fundamental change that made us the finest military in the world, bar none. We need to have Congress take a hard look at the Department of Defense today and say: How are we going to reorganize them to be—to ensure that they are ready for this digital revolution in a way that’s going to ensure that we maintain our superiority in terms of national defense? I think we are tackling it on the margins. I think what it needs is a full-up, top-to-bottom review. And that’s really going to have to be led by the Department of Defense and Congress, I think.
SWISHER: That’s an excellent answer. I thought you were going to say: We need a special juice bar and intermittent fasting.
MCRAVEN: Well, that never hurts either. (Laughter.)
SWISHER: Yeah. It’s critical to—
MCRAVEN: With Red Bull.
SWISHER: Red Bull, that’s right.
MCRAVEN: Red Bull, yeah.
SWISHER: They don’t drink Red Bull. But anyway, Adam, why don’t you get someone from New York asking a question. Go ahead.
Q: Hi. Good morning. I’m Abby Joseph Cohen from Goldman Sachs.
And I want to thank you all for this terrific work—to you and the rest of the taskforce. I’d like to go back to a point that was made earlier, but to discuss it in a more provocative way. And that is, this is not the first wonderful report, nor is it the first wonderful initiative. Indeed, James and I served on a taskforce that was hosted, if you will, by the White House during the Clinton administration under the COMPETES Act, which described exactly what that act was supposed to do. And that proposal included more than twenty suggestions, almost none of which were taken on by what was then a very Republican and hostile Congress. And so I don’t mean to make this a partisan issue, but how do we go about convincing political leaders at this point that these issues do in fact have the time urgency that so many here in New York and in Washington, in your room, seem to believe they do? How do we educate those who are making the decisions on budget and on priorities?
SWISHER: Which one of you wants to take that one?
MCRAVEN: Yeah, let me, if I can. You know, we talk about this kind of collegial attitude we had in the—in the taskforce. And I think that’s true. But at one point in time we did have a rather heated discussion about the kind of oh shit moment. And it was, you know, or maybe it was the holy shit moment. I forget. (Laughter.) But it was, we’re here. We need to make sure that the American public understands that now is the time to do something. To your point, I think people have seen these moments coming for a long time. And as we talk about this kind of curve progressing, and we talk about the rise of China, the gap is narrowing dramatically. If now is not our holy shit moment, let me tell you, that moment’s going to be a lot worse ten years from now.
So what we hope this report does is, again, reemphasize some of the critical points that were brought up, I’m sure, in many taskforces before now, but also brings with it this red flag that says, if not now, when? And, oh by the way, it’s just going to get harder the further we go into the future. And at some point in time, when those lines cross, when China becomes stronger than we are, then our ability to turn that around is going to be increasingly difficult, if not impossible. So now has got to be that moment. We talk about it being the Sputnik moment. It really is the holy shit moment. Let’s get on this.
SWISHER: Right. Well, holy shit moment. That’s a good way to put it.
MCRAVEN: I don’t know, can I say that at CFR?
SWISHER: You can say anything you want. (Laughter.) I’m good with it.
But, James, talk about that idea. Is that it’s—Abby’s making a point. Y’all write a lot of reports here. I’m not from Washington, but there’s a lot of reports coming out of Washington. (Laughter.)
MANYIKA: No, I mean, I think Abby’s right. As Abby knows, if we had looked at when we did that work with America COMPETES Act, China was not the economy that it is today. China was not winning in technological places as much as it is today. China wasn’t as built into the global value chains and technology value chains that it is today. So to Bill’s point, if it was urgent then, oh boy is it urgent now, just in terms of where we are. China’s on track to be—to have an economy bigger than the United States in a decade or two. So this—to Bill’s point—this only gets harder. So I think if this isn’t the moment to take on these questions in a nonpartisan way—these are foundational issues. These are not Republican or Democratic. These are foundational issues for American competitiveness. So if not now, I don’t know when.
SWISHER: And it’s also getting rid of the idea that China is just an intellectual property thief, or whatever. It’s more—they’re very innovative. And some of the most innovative stuff is coming, even in the consumer sector.
MANYIKA: No, it is the case. And one of the things that—you know, of course, there’s been a lot of IP theft in the past, and all of that. But I think even if you look at the actual research now, where they are particularly in areas like artificial intelligence, there have been lots of assessments made about where some of the breakthrough papers and research are coming from. And China is one of those. Even if you look at a ranking in artificial intelligence—how would you rank the ten most, you know, impressive research universities in artificial intelligence? And Chinese universities are on that list of the best ten. So to think of this as simply an IP theft issue—of course, there’s a lot of that, and there’s a lot of defense and protection of American innovation and technology that we need to do, and I think we should keep doing that. But this is about competing with, winning against, and out-innovating a formidable competitor.
SWISHER: Yeah. That’s leaving out TikTok. Anyway—(laughter)—I can’t use that service.
Anyway, this is a question from national member Kimberly Mullen from Weston, Connecticut. How should we be looking at the role of ethics as it relates to innovation at the national security level, both holistically as well as focusing on specific segments—for example, AI, autonomous machines?
That’s an excellent question. It’s one that the commercial tech sector is grappling with as well. Adam, why don’t you take that one.
SEGAL: Sure. I think the taskforce grappled with that problem at two levels. I think the first is, as Bill was talking about, the growing gap between the Valley and the national security community. So a lot of this has been driven by debates about the legitimate uses of AI and what role the tech community should play in supporting those missions. Project Maven is just one, but, you know, we saw it with ICE and Microsoft, and then the op-ed from Palantir two weeks ago, where basically they kind of washed their hands of any of the ethical concerns and said, you know, these are policy decisions. And I think, you know, one of the things the taskforce talks about is we’re never going to get anywhere without people actually talking to each other and engaging each other.
And so a lot of the policies that we recommend that have to do with addressing talent shortages in the DOD, we also see as being useful in addressing this problem. There are people who flow back and forth. You’d have some more interaction. You’d be able to have these discussions. I think the DOD’s AI strategy goes some way to addressing this issue, right? They really headline using AI for humanitarian concerns, forest fire fighting, other things that would not be as controversial. I think the second level is what I talked about earlier, on the international side. Right, again, can we develop a tech alliance? Can we work with our allies to come up with some shared norms in this space moving forward?
MANYIKA: Yeah, and you’re starting to see one of the interesting things with ethics, particularly for AI—you’re starting to see a bit of a convergence around a set of principles around these things. The various actors in the private sector have come up with particular principles. The Defense Department has come up with theirs. The OECD has as well. And mostly people endorse these. I think the question is, how do we actually live by them? Because I think, you know, what should be in those is roughly coming into sight. But the effectiveness, the enforcement, the peer pressure—whether it’s legislative or peer pressure—whatever mechanism we use to make sure that we actually operate by those principles, I think that’s what’s needed now.
SWISHER: OK, is there any—which one of those things—peer pressure isn’t working. But what do you imagine being the most important?
MANYIKA: I think all of them. I actually think all of them. I think the peer pressure also helps because, you know, citizens and consumers will complain, and they’ll express their opinions. I think legislation helps. So I think it’s all of them.
MCRAVEN: Again, we have to start off with a fundamental understanding of who we are as a nation. And we are a values-based nation. We are a nation of laws. And I do think we have to have these conversations, they have to be hard conversations, to determine what are the ethical guidelines under which we want to use technology, and what direction do we want to go? I don’t think we’re having those conversations enough, particularly between the federal government and the private sector. To Adam’s point about Maven, it’s not just Project Maven. It’s a whole host of things.
Well, let’s engage in those discussions. Let’s have an understanding of what kind of the baseline expectations are. And then let’s move forward kind of together in this. But always with an understanding that the things we do have got to be moral, legal and ethical. If we violate any of those three fundamentals, then I don’t know that the technology becomes of any value to us.
SWISHER: Great. Next question. Adam, why don’t you—oh, let’s do one from here. Sorry. Can you pick? Sorry, I don’t have my glasses.
MANYIKA: We’ll go to this side of the room.
Q: Hi. Catherine Mann.
So, James, you brought up the issue of IP theft and IP protection. That is particularly important with these technologies that are fundamentally network externalities, winner take all. So how do we manage the process of general-purpose technology being funded by the federal government, through universities which increasingly are taking a particularly strong attitude on IP, through to industry which kind of is winner take all as well right now. And in that environment, the benefits of these technologies do not filter down to enhance productivity growth more generally. So it starts with winner take all, starts with network externalities, you’ve got IP on top of that, but the end game is everybody has to use it. How do you get there? (Laughter.)
MANYIKA: Well, I think, Catherine, you point out one of the—this is one of the challenges of these particular classes of technologies, which we haven’t quite encountered before. The externalities you just described, the network effects, and quite frankly the fact that they’re foundational. I think that’s one of the challenges. And I think our whole set of intellectual property frameworks have not quite been built for this, the way we’ve thought about them, in the sense of how we think about—you know, as you know, you’re an economist. Things that are non-rival technologies, how those get shared.
I think there’s a real rethinking of our intellectual property frameworks, even as we protect what we’ve got today. But I think one thing that would be a mistake is to focus on the intellectual property protection in a way that cuts us off from competing and winning markets, both here and abroad. I think, you know, ultimately we win and we benefit when, in fact, the economy grows, drives productivity growth, and in fact some of that actually translates into prosperity, and jobs, and income and kind of income growth. So let’s not tackle the issue in a way that cuts off those potential benefits. I think that’s one of the things we haven’t quite fully thought through yet as a society.
SWISHER: OK, Adam, why don’t you do one from New York?
Q: Jove Oliver with Oliver Global. Thank you for a great discussion today.
You’ve talked a lot about the STEM pipeline. I wanted to spend a moment on the STEM curriculum. I’ve been doing a lot of work with Mitchell Baker, the co-founder of Mozilla, who argues that the current STEM curriculum produces a generation of, you know, developers, technologists, who have had a lot of issues with privacy and the surveillance economy. So my question is, do we need to be updating the STEM curriculum with sort of broader behavioral, political, and economic issues, so that the people developing these technologies understand the broader implications of their work? Thank you.
SWISHER: So are you talking—can I just ask—are you talking about adding humanities? I wrote a column about this for the Times, that the problem is many of these people haven’t taken any humanities courses, so they don’t have any sense of anything else besides STEM. Is that what you’re talking about? Or STEAM, or—
Q: More the humanities, the behavioral sciences, yeah, economics, politics. Thank you.
SWISHER: OK, great.
MANYIKA: Yeah, I couldn’t agree with the question more. And, by the way, I’m a big fan of what Mitchell Baker’s been advocating at the Mozilla Foundation. She’s been doing amazing work there. But I think you’re getting at a fundamental issue, which is I think the more we think about STEM or technological education as purely about the math and the science, the more we’re missing something. And listen to this conversation we just had today: we’re talking about ethics, we’re talking about values. Those don’t come out of the math, right? They come out of other things. So how do we make sure that as we educate and think about the talent pipeline, and address the kind of K-12 issues that Bill was describing, it’s actually a complete education and preparation, so we don’t just have engineers who know nothing but coding, and math, and science? So I couldn’t agree with that more.
SEGAL: I’ll just add—oh, sorry, go Bill.
SWISHER: Go ahead. Go ahead, Adam.
SEGAL: I’ll just add that one of the other things that the taskforce brushes on, but doesn’t go into in particular detail, is how to address the high numbers of American students who, you know, say in their first year that their major is STEM, and then drop out soon after. What we know works to keep people—to keep women, to keep minorities—in these fields tends to be more experiential learning, group projects driven by social questions, as opposed to just, you know, pure kind of class-based things. And I think in those types of formats you’re going to be challenged with more of the political and ethical concerns than you would in just a straight, you know, here is the lecture on whatever this S&T, STEM topic is.
MCRAVEN: Yeah, to the questioner’s point, I can tell you across the University of Texas system, when we looked at our kind of STEM areas, particularly in the engineering field, we recognized that we were training folks to be great engineers across the system, but we were not incorporating enough of the humanities. The problem you run into is the curricula are so tight, so well defined, that it is difficult to continue to add humanities courses. So something has to be cut if you’re going to meet the criteria necessary to graduate a great engineer.
Somehow we’ve got to find a better balance between just the math and the science, and we have to bring in the humanities. I couldn’t agree more with the individual asking the question, because if you don’t have the humanities and you don’t have a foundation for why the math and science is important, then I think you come out with the wrong conclusion.
MANYIKA: Yeah, I wanted just to point out one of the—one of my favorite examples. I mean, you look at what Stanford has tried to do. So Stanford has now set up the human-centered AI institution, which by definition is actually set up to be a multidisciplinary institute so that, yes, people work on AI. It’s not just computer science and the math part of it, but you also have legal scholars. You also have people in humanities. In fact, the co-director is actually a philosopher, co-director with a computer science AI scientist. So I think you’re starting to see a movement towards those kind of multidisciplinary approaches.
SWISHER: I’ve always thought that our democracy would be in better shape this year if Mark Zuckerberg took one existentialism course. (Laughter.) But that’s just me. Back here—OK, back over here.
Q: Hi. Mike Clauser with Access Partnership, a global tech policy firm.
With respect to the Admiral, my apologies, I kind of thought the Army Futures Command missed an opportunity in putting its headquarters in Texas, as opposed to Moffett Field in the Valley or Hanscom in Boston. (Laughter.) And I saw that there was kind of a dissenting opinion that was sort of buried in the back of the report that called for building out more military presence near tech hubs to get that sort of cross-pollination. I was wondering why that didn’t get baked into the overall report, and why—I mean, you could agree on spending $84 billion more on R&D. Why not put some more folks near the tech hubs?
SWISHER: It’s a big empty building there, too.
MCRAVEN: Yeah, I think what you’ll—I think what you’ll notice is that it’s not just dissenting reports. It’s dissenting and additional comments. And I think this was an additional comment to the report, that in fact it would be great if we, you know, reopened The Presidio and Treasure Island, and these sorts of things, where you could have more kind of civil-military engagement. I think the reason the Army decided on putting Army Futures Command in Austin was the growing tech industry within Austin, and the sense that it was a little bit more centrally located, and therefore the ability for the commander of Army Futures Command to move, you know, in both directions quickly was there. It’s not that they couldn’t have put it up in Boston near MIT or in San Francisco in the Bay Area. But Austin seemed to be a good choice for a whole number of reasons.
Do I think we need to have more military engagement with, again, the Boston sector and the Silicon Valley? Absolutely. I think the military is trying to do that. We are putting kind of liaison officers, we are putting young interns, you know, young captains and majors and lieutenant commanders out in Silicon Valley and up at MIT so that, frankly, they can see what the military looks like. Back to our discussion about, look, not all of us in the military, you know, have this kind of Dr. Strangelove issue. Most of us want to do—
SWISHER: Not all of us? None of you, I hope. (Laughter.)
MCRAVEN: Most of us want to do very well by the country. And the folks in Silicon Valley, they need to have those kind of personal relationships so they have a better understanding of the culture in the military, and the sense that we’re doing things that we think are noble and honorable for the country, just like the folks in Silicon Valley, and MIT, and Boston are trying to do.
SWISHER: Certainly appeals to patriotism. And, by the way, the rents in The Presidio are insane. So you know that’s not going to work.
But, go ahead, Adam. Want to do another one from New York?
SEGAL: Sure. Yes, sir.
Q: Jamaal Glenn, Alumni Ventures.
Thank you for this conversation today. How much did the taskforce talk about, if at all, deep fake technology? Or specifically—or, more generally, technology that facilitates misinformation? If we look at sort of where it’s come from, if we look at our challenges in the last presidential election, and we sort of look at the trajectory of where technology goes, I don’t think it’s too crazy to imagine a future where it’s really difficult to distinguish between truth and fiction. How much did you guys talk about this generally or specifically as a national security threat?
SWISHER: That’s a great question. I spent the day at the Federal Election Commission yesterday talking about this issue. Who would like to?
SEGAL: So, I mean, I do think it was raised in a discussion about oncoming threats, and some of the ways that the technology is going in directions that we can’t control. It was a little bit outside of the scope of the report, in the sense that we were not, you know, looking at specific technologies’ impact on national security. But I do think that, again, to come back to this idea about how do you govern these technologies, there is a section on cyber norms and discussions with our allies about what red lines would be. And so disinformation, electoral interference, all those things would be in that area of discussion. But we didn’t spend a lot of time specifically on deep fakes.
SWISHER: But the issue of propaganda is an enormous one since the beginning of time, it’s just more weaponized and amplified now with these new technologies.
MANYIKA: Yeah, and I think—you know, the discussion we did have, although I think Adam’s right it wasn’t the focus of this—was a recognition that in fact there are all these uses and misuses of these technologies. That’s why we did recognize that these are dual use technologies. They can be weaponized. They can be taken advantage of, much like other technologies have been in the past. Think of, you know, nuclear science and biological sciences and chemical weapons. So I think we—so we recognize the dual use nature of these and the need for these governance mechanisms but didn’t really go into detail on the deep fakes issue in particular.
SWISHER: For the questioner in New York, there’s a new book by the president of Microsoft, Brad Smith, called Tools and Weapons. I did a great podcast with him recently. It’s really—it does cover a lot of these issues. And it’s—I mean, you understand, from tools and weapons, that technology is that.
So another question here. I don’t mean to be, like, all Facebook’s new Ray-Ban spy glasses, but I left my other—I’m really blind as a bat.
Q: My name is Peter Fatelnig. I’m working for the European Union here in D.C.
And I’m glad that Europe didn’t come up, because I guess it wouldn’t result in many compliments. But has the taskforce—(laughter)—has the taskforce considered how other regions of the world, or other countries deal with exactly the same dilemma or question?
SWISHER: By the way, I love Margrethe Vestager. So go Margrethe. (Laughter.)
MANYIKA: We did, in the following sense. I think we thought about that from the point of view that other regions, other parts of the world are allies of the United States. They have been part of the innovation ecosystem, particularly Europe and Japan. How do we work together with them much more fully to, you know, compete and make sure that technology with the right values, and all of that, is kind of built in? So we spent a lot of time on it, in that sense. We also spent time on it in the sense that we need to understand that they will make their own choices too about which technologies they adopt, how they adopt them. So I think there’s a dance in that sense that needs to happen. But these have been historical allies, and also market opportunities for the United States. So I think we thought about it in that sense. But we were not developing recommendations for the EU, but rather for the United States.
SWISHER: There’s obviously been differences around privacy, around Huawei, all kinds of things. But it definitely—the idea of allies is a real important one here.
OK, last question. Let’s do it from New York. Make it a good one. (Laughter.)
Q: Hi. John Paul Farmer. I’m as of the last few months the chief technology officer of New York City.
And I’m curious what role you see state and local governments playing in helping the United States keep its edge in relation to national security.
SWISHER: It’s a great question. Admiral, why don’t you answer that, because it’s about grid security and all kinds of things?
MCRAVEN: Yeah. You know, I think what we found after 9/11, interestingly enough, is that the relationship between the federal government and the state and local government in terms of our national security apparatus wasn’t nearly as tight as we needed it to be. The information didn’t flow from the federal government down to the state level. And, you know, the 9/11 Commission kind of brought out a lot of this. So when we take a look at national security moving forward, I think we need to make sure that we underpin everything we’re trying to do at the federal government with the state and local government. And the fact of the matter is, you know, to play off Tom Friedman a little bit, the United States is very flat. And we need to make sure that the systems that we invest in, that the pipelines we invest in recognize that it starts at the local level before it ever gets to the top. And therefore, all the systems that we develop, all the—you know, all the proposals we make have to have kind of a top-to-bottom look.
MANYIKA: Yeah. I—actually, at this point, I was actually going to see if we could invite one or two of our taskforce members to comment on this question, because I know we spent a fair amount of time on it. At least we get to hear their voices too. I don’t know if Raj or Nicholas or others want to jump in.
SWISHER: Come on, Raj. (Laughter.)
MANYIKA: No, just if you could—
SWISHER: State and local.
MANYIKA: Just talking on the state and local question, but also even your dual experience, because you’ve lived in both worlds—both the Silicon Valley world but also in the national security.
Q: So I think, you know, the one thing the taskforce report was trying to do is—you know, the members looked at and tried to bridge some of these gaps. And I think we live in a time where there’s a high level of specialization. And so you have folks that spend a lot of time just deep in technology, or just deep in the government, and there’s not as much crossflow and understanding between the two. And as these topics, particularly around technology and its impact, get so complex, it’s hard to make the right decision with full context. So finding pathways, like, you know, John Farmer there, who has gone from federal government to local and technology, is really important. And I think we’ve tried to do that in the taskforce, to highlight a couple pathways.
SEGAL: I’ll just add real quickly that—because I wrote it and then cut it in the report—there is now a line or two in there referring to the role that regional governments, state governments play in creating these ecosystems, you know, innovation ecosystems. We see them now emerging in lots of places. At the end of the Obama administration, right, the manufacturing initiative tried to expand those. We’d—again, we have a line or two in the report that refers to those, and the need to build on them. And so we had a much larger discussion actually in the taskforce about how you would fund those and support those from the federal level. But I think implicit in the report also is state and local governments’ importance on education. So in particular, state support for university budgets and for community colleges.
SWISHER: Great. You can add it in. The internet’s endless. There’s lots of space. Just post it.
OK, I’m going to add this last question, because we are almost out of time, but this is from a member nationally from Pinehurst, North Carolina. It’s from Patrick Dewar from the Trenton Group, LLC. And we’ll finish on this one, if you all can be brief.
Given the U.S. government’s focus on hypersonics and high-density energy storage and generation, as well as its much higher sophistication in AI, robotics, material science, et cetera, wouldn’t a more serious focus and public relations push by the government on these technologies be the answer to the issue in front of us, instead of the U.S. government trying to dabble in commercial technology?
MCRAVEN: Oh, James, this is all about you. (Laughter.)
MANYIKA: No, but I think one of the—and I think, actually, Bill, you said it. I think something about bringing the public onboard to understand the importance of this moment, I think is part of what we should all be doing, because I think the more the public understands and fully—you know, understands why this is important and comes along, and any kind of publicity, or mobilization, or excitement of the public in that way I would think is very, very important. So I would agree with the question.
SWISHER: OK. Very last. One most important takeaway from each of you from the report. Why don’t we start with you, Admiral?
MCRAVEN: Yeah, I’ll start back with the very lead paragraph, that this is our Sputnik moment. And if we don’t recognize it as a Sputnik moment, then it’s just going to be another taskforce report that sits on somebody’s desk, that nothing gets done. So I hope the public sees this for what it’s worth. We need to take action now. If not, whatever action we take in the future is going to be twice as hard, cost twice as much, and probably be half as effective.
SWISHER: Also holy shit, right? Got it. (Laughter.) OK.
MANYIKA: I can’t top that.
SWISHER: Come on. Try.
MANYIKA: No, I think Bill just said it. We have to recognize this moment for what it is. And, you know, it’s not a political moment. It’s not a partisan moment. It’s about American leadership and innovation, and how we continue to lead the world of innovation.
SWISHER: All right. Adam, finish up, author.
SEGAL: Yeah, so I’ll just end with the point that I’ve been making I think consistently, which is that the U.S. is not alone in this competition, right? There’s no way the U.S. is going to outspend China. We’re never going to have enough science and engineering graduates in the United States. You know, we’re 300 million versus 1.3 billion people. And the U.S.’s great strength has always been being a node that draws in innovation from Europe, and Japan, and our Asian partners. And we have to get back to that—to that vision.
SWISHER: OK. Thank you very much, all of you. And thank you to the members, and everyone for participating. (Applause.)