Michael Chertoff, executive chairman and cofounder of the Chertoff Group; Adam Segal, director of CFR's digital and cyberspace policy program; and Cyrus R. Vance, Jr., New York's district attorney, join L. Gordon Crovitz, partner at NextNews Ventures, to discuss the trade-off between privacy and security in the debate over government access to encrypted data, and the implications for business, counterterrorism, and user security. The panelists consider the recent legal case between Apple and the Federal Bureau of Investigation.
CROVITZ: Good evening, everyone. I’m your presider, Gordon Crovitz. And welcome to the Council on Foreign Relations meeting on privacy and security in the digital age. We will, as is always the case, have plenty of time for questions from members, so please prepare those. We’re very lucky to have a panel of people who are both highly experienced and highly opinionated. So we should have a very good evening.
There are bios in your materials, but Michael Chertoff is chairman of the Chertoff Group, formerly served as secretary of Homeland Security, was also a prosecutor and a federal appeals court judge. Cyrus Vance, Jr. is the district attorney for New York, better known in the movies and thrillers as the Manhattan DA. (Laughter.) And to my right is Adam Segal, who directs the Council’s digital and cyberspace policy program.
So I think the biggest issue in this area of privacy and security is what the founders would have made of this—(laughter)—and all the other smartphones in the room. The Fourth Amendment says reasonable searches and seizures are permitted. What happens when technology makes them not possible? So I wanted to start by asking Cy Vance, if you would, to talk a little bit about the analog era. So before all of these digital issues came up, before you had issues like in San Bernardino where we have the dispute between Apple and the FBI over whether Apple should be obliged to help the FBI access that phone, remind us what the world was like in the analog era. What kinds of information could prosecutors get with a traditional warrant?
VANCE: The backdrop to this is I’m speaking really on state and local law enforcement issues, just so I don’t appear to be a federal expert when I’m not.
And really up until September of 2014, we accessed the contents of mobile devices, smartphones, if we could get a warrant from a judge focused on a specific phone. And the range of information that we would get out of that phone was enormous from time to time, and extremely helpful. I think it’s fair to say that in the last 15 years, up until the fall of 2014, and whether it was a case involving homicide or financial crime, digital evidence, very, very often on smartphones, was a key part of gathering the evidence for a case. So the way the world worked up until late September 2014 is among the many avenues we would investigate for evidence relevant to the case would be in smartphones, as well as laptops, and other devices.
I just want to add that it was very important and is very important for us in terms of building cases, but I also think it’s very important in terms of making sure that prosecutors had all the information they needed to make the right calls. We have had cases in our office where because of our ability to access a phone, when police arrested someone for homicide and I had a very smart assistant over the course of the weekend who just didn’t think that the timeline was adding up, we could get that phone out to Apple, fly it out with a warrant, bring it back, and in this case my assistant was able to cut loose the gentleman who’d been identified and arrested for murder by the police, find the true killer, who is now serving a lengthy term in prison.
So the point is, it was important not only to build cases—or essential. It’s also important to make sure that we are exercising our responsibilities as prosecutors to the degree that I think the court expects us to. And there is going to come a time, because of the inability to access phones which hold a huge amount of evidence, that judges are going to be dealing with motions by lawyers who are saying that we believe in that phone is information that could exculpate my client. And we’re going to have to sort through how we move forward in those cases. I think everybody in this room would want the prosecutor investigating your case, or someone you know, to have all the facts.
We now live in a world where we are not getting all the facts. Many of the facts are on smartphones because criminals, just like you and me, have moved off paper and onto digital devices. So that’s the pre-world, and the post-world—I’m sure we’ll talk about.
CROVITZ: And just to really level-set this discussion, even before—going back before the smartphone, prosecutors could get access to bank safe-deposit boxes, to health records, all sorts of things that we in the digital era would say are on our phones, but at that time were more available.
VANCE: Search warrants enabled us to get access to really anything, but only with the authority of a court. And, you know, the devices today are really the only—as far as I can tell, the only warrant-proof devices that exist. And that is posing a challenge I think to both sides.
SEGAL: I wonder—if I can ask—because it strikes me that, Gordon, you’re getting to an important point, which is that qualitatively all of the data on the phone is different because all of those things are in one place, where in the analog era, right, you’d have to do the legwork to get all of those specific things.
VANCE: I mean, maybe, maybe not. If you—when Mike was U.S. attorney he probably, like I have, when officers go out and do search warrants, they probably have three or four locations—office, home, associates. It wasn’t—it’s not that complicated. And maybe you hit 20 places with search warrants, but still it is true that much of this information is now collapsed into a single device, but it’s the same kind of information that would have been accessible to a warrant, whether it was in a car, a home, a safety deposit box, or some physical location.
CROVITZ: And last time I saw, you had estimated that in your office there are 200 iPhones or so that you would like to be able to get access to, but have not been able to.
VANCE: It was clear to me when Apple moved to reengineer its phones to be incapable of being opened, even with a warrant, that this was going to have an impact on the way we did business. We in the Manhattan DA’s office created our own cyber lab over the last several years because the NYPD simply can’t keep up with the volume of devices that we’re asking them to interrogate at the speed we’re asking them to do it. So we started our own lab. And of the 750 phones that have been submitted to our lab—we’ve gotten warrants for all the Apple devices—230 of those devices are inaccessible to us. Most at this point are iPhone 6s running iOS 8 and above.
CROVITZ: So let me turn now to Michael Chertoff. You’ve been on different sides of this issue. You’ve been a prosecutor, head of Homeland Security. You’ve been one of the more public proponents, I think, of the argument that there should not be any ban or tinkering with end-to-end encryption, of the kind that Apple, and Google, and others have now executed for their operating systems. Explain that.
CHERTOFF: And first, let me come back to the way the world was prior to smartphones, because since time immemorial people have tried to frustrate law enforcement’s ability to get evidence, even when there’s a lawful basis to get it.
So we were talking earlier, back in the day when I was a prosecutor and I was getting wiretap or electronic surveillance warrants, you know, that allowed us to lawfully put a tap on a phone or put a recording device in a building. It wasn’t long before bad guys figured that out, so what they would do is they would turn the radio up so you couldn’t have an audible, discernible voice. Or they’d communicate in sign language. Or they’d walk around the block and have a conversation outside of earshot. Now, those were all frustrating the ability of law enforcement to use a warrant to get information, but we accepted that’s the nature of free society, is you can’t make people sit in a room and talk audibly. (Laughter.) So I would argue that this is a species of a problem we’ve faced on and off for many, many years.
I also think that one of the things that you have to consider is end-to-end encryption is only one of a number of different ways people will secure data. For example, you have, you know, various kinds of apps that will cause messages to disappear after you read them. That’s a security device, as is encryption. So the challenge is, to what extent, as you balance the legitimate and understandable—and I don’t minimize this—desire of law enforcement to get evidence that can be critical in a case—how do you balance that against the ability of people to secure their own data, particularly in a world in which the government tells people with respect to cybersecurity: Don’t count on us to protect you. You have to protect your own data. And so to me, what you see is really a tradeoff not between security and privacy, but between two types of security.
The last thing I would say, which is relevant to the smartphone, is that the volume of data is now hugely greater. And one of the interesting legal issues that’s emerging now is to what extent even a warrant for a phone, or an accepted warrantless legal basis to search a phone, changes when the phone itself becomes a different kind of device. I think, because it’s handling so much information, it’s far beyond what in the old days you could have considered, you know, within a person’s personal control. But beyond that, because often the data’s no longer on the device itself. The device is literally a key to data being stored in the cloud.
And this comes up, for example, in the area of border searches. The rule has been—and I argued for this when I was secretary of Homeland Security—that when you cross a border the government is entitled to look at everything you bring with you. And if it’s a briefcase, they can look in the briefcase. And the law was, if it was a laptop, anything on the laptop could be looked at because you’re bringing in what might be contraband. But now I ask myself, does that change when in effect the phone is no longer the repository of the data, but is the key to get the data that’s already in the cloud. And that would be a little bit, to my mind, like if I crossed the border with my house key and someone said, OK, now I want to have your key and then go to your house and look at everything in your house. I don’t think we would consider that a border search. So in many ways, the technology is confounding even the preexisting legal categories.
CROVITZ: You co-wrote an op-ed in The Washington Post about a year ago on this topic that I thought had a fascinating conclusion. I’ve been curious what you meant. And this is my big chance to ask you, so I’m going to. (Laughter.) I’m going to remind you by quoting you. What you wrote was: If law enforcement and intelligence organizations face a future without assured access to encrypted communications, they will develop technologies and techniques to meet their legitimate mission goals. And of course, in the case later, after you wrote this, the FBI did get from a third party a means for accessing the San Bernardino terrorist’s phone. Were you suggesting that law enforcement and intelligence agencies won’t go dark, as the FBI director has warned?
CHERTOFF: I was saying two things. One is, I think it’s actually the nature of encryption that it’s never clear to me that you reach a stage where you have perfect encryption. If you know anything about public and private key encryption, the essence of creating the private key is to rely on the product of very large prime numbers that would require years of constant computing to factor. But actually I was—today I was shown a chart about various contests that have been run over the last few years to see whether people could break ever-longer encryption keys. And not surprisingly, over time people have found ways to break them.
Now, there will always be another key that’s even longer—although there’ll be some constraints on the ability to do that because the ability then to manage the data itself will become a little bit challenging. But my point is, it’s not in the nature of encryption that it is a static issue. There will always be new innovations. And there will also be new countermeasures. And I think we saw that in the ’90s when there was a debate about whether there needed to be—was there going to be concerns about telephonic encryption and whether there needed to be a requirement that that be limited as well.
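Chertoff's point about the key-breaking contests can be sketched with a toy example. This is purely an editorial illustration, not anything discussed on the panel: a real RSA modulus is 2048 bits or longer, but the same factoring attack that wins those contests recovers a deliberately tiny key instantly. The primes below are invented for the sketch.

```python
# Toy sketch of why short public keys fall: recover an RSA private
# exponent by factoring a deliberately tiny modulus n = p * q.

def factor(n):
    """Trial division: find the prime factors of a small odd modulus."""
    f = 3
    while f * f <= n:
        if n % f == 0:
            return f, n // f
        f += 2
    raise ValueError("no odd factor found")

# Key owner picks two primes (invented, absurdly small for illustration).
p, q = 65521, 65537
n, e = p * q, 17                      # public key: modulus and exponent
d = pow(e, -1, (p - 1) * (q - 1))     # private exponent, kept secret

# An attacker who sees only (n, e) factors n and derives the same d.
pa, qa = factor(n)
d_attacker = pow(e, -1, (pa - 1) * (qa - 1))

msg = 42
cipher = pow(msg, e, n)               # "encrypt" with the public key
print(pow(cipher, d_attacker, n))     # attacker decrypts: prints 42
```

At 2048-bit scale the same trial division would take astronomically long, which is exactly the arms race between key length and computing power he describes.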
The second point I was making was that there’s now vastly more information available—apart from the encrypted content itself—than we had even 10, 15 years ago. I mean, this is a paradise for intelligence agents and law enforcement agents because of the amount of data that’s generated from unencrypted locational data, metadata, financial data, et cetera, et cetera. And in terms of what is the intelligence challenge of locating people who are potential threats, which is finding the needle in the haystack, the haystack is now so large that it’s the kind of thing we only fantasized about 15 years ago after 9/11.
So again, without minimizing the challenges—particularly if you’re going to present a case in court—that are posed by encryption, I think encryption is not by its nature ever in a permanent state of being perfect. And second, there are so many other kinds of information and evidence out there that net, if I’m an investigator, I’d much rather be where I am in 2015 than where I was, let’s say, in 2010 or 2005.
CROVITZ: So, Adam, in your new book, “The Hacked World Order,” you made the point that as we discuss these issues in the U.S., we’re really not discussing them in a vacuum. And we’ll talk in a moment about potential legislation. But describe what you meant by that, that these kinds of issues are—may be in that way a little bit more complicated than a generation ago when we had telecom regulations that applied to domestic telecom companies.
SEGAL: Well, I think first is the reality that the U.S. has no monopoly on the technology, right? Even if we were to come to some decision that we were going to ban certain types of encryption, it will be developed other places, right? So I think that is a huge problem. The second is clearly there’s a huge demonstration effect going on. We can see that the Chinese and others have looked at the debate. You know, the Indians introduced and then very quickly removed a ban on encryption when they realized that everyone was going to have to hold onto their plaintext for years, and they decided that it was not manageable. So there’s clearly a kind of demonstration effect that’s going on to other countries.
And then, you know, there is the larger issue of what we want—what the goals are for U.S. technology companies, right? So what are—what are we thinking about their competitiveness and how they’re going to be in other markets. You know, the FBI director, of course, had kind of thrown it off and said that this is a marketing strategy, but we saw, you know, last week when a Brazilian judge knocked WhatsApp offline for about a day and a half, and several millions—tens of millions of Brazilian users didn’t get access to it, that all of these interactions are closely tied to security and economics.
CROVITZ: So, Cy Vance, I wanted to ask you a question along these lines. We’ve heard good arguments about the complications of potential regulation, and the technology is complex. On the other hand, Apple and Google, in a way, created a change in the situation by moving to default encryption. So before they did this in 2014, were there reasons for them to be concerned about hacking?
VANCE: This is what has puzzled me personally about the communication on this issue after Apple moved to default device encryption, September 2014. At that time, Apple was operating on iOS 7. We are now on iOS 9. And it was, I think, before the iPhone 6 came out. Now, in that time period, this was the time period where we were presenting affidavits and warrants to judges for them to review. And if they approved it, we took them to Apple, and Apple did whatever magic it did, and returned to us the responsive material. There was no indication that Apple’s ability to maintain the security of the phone content was an issue. In fact, Apple said exactly the opposite, that their iOS 7 was state of the art. And also, interestingly, said they felt they had a responsibility as a corporation to make sure that law enforcement was able to get the information it needed, to find the Alzheimer’s patient or other issues on an emergent basis. So then default device encryption came in in September ’14.
So I think the question that Apple needs to answer, from my position, is: has there ever been a situation where a law enforcement agency sent a phone out to Apple to be decrypted and the data on that phone was compromised? I mean, that’s really the fundamental thing that started all this—the concern about what is called a backdoor, which I disagree with, and that somehow the world’s going to end. But I’ve asked this question in letters to Apple and Google in March of 2015. I’m still waiting for the courtesy of a response. And I think we need—if we are going to measure, as we have to, the risk of the company maintaining the ability to open a phone—to ask what added risk that puts us in harm’s way of, versus what the consequence is to law enforcement of not being able to access those phones.
I can tell you I have 230 cases in my office, from homicide to sex abuse to child abuse to drugs to guns, where I can’t get into those phones. What is Apple telling me about how many times data was ever compromised on a phone that was delivered to Apple by a law enforcement agency and unlocked pursuant to court order? So I think we need an answer.
CHERTOFF: So I think, Gordon, there are two types of encryption, and we have to be careful not to confuse them. The more general issue has to do with end-to-end encryption, which has to do with when you’re either dealing with data in motion or data at rest you encrypt it, except at the point of initiation and the point of receipt. And obviously there that’s a critical element in securing against people hacking into a network and stealing your data, and relevant to financial institutions, health care institutions, et cetera.
Now, the issue that Cy is talking about is a somewhat narrower issue that has to do with a feature of the phone. The encryption itself is very simple. You’re talking about four digits. But it’s designed to defeat efforts to repetitively try different variations by slowing up the time between each entry and ultimately stopping anything further on the phone and wiping it if you do it more than a certain number of times. That’s really designed to deal with the problem of somebody stealing your phone; if they didn’t have that feature, they could infinitely go and try, you know, a number of combinations and crack the phone, you know, relatively easily.
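The lockout feature Chertoff describes can be sketched in a few lines. This is a hypothetical editorial illustration, not Apple's actual implementation: the delay schedule and the ten-attempt limit are invented for the sketch, and a real device enforces the waits and the wipe in secure hardware.

```python
# Hypothetical sketch of a passcode lockout policy: escalating delays
# between failed attempts, and a wipe after too many failures.

MAX_ATTEMPTS = 10
DELAYS = [0, 0, 0, 0, 60, 300, 900, 3600, 3600, 3600]  # seconds (invented)

class Device:
    def __init__(self, passcode):
        self._passcode = passcode
        self._failures = 0
        self.wiped = False

    def try_unlock(self, guess):
        """Return (unlocked, delay_enforced_before_next_try)."""
        if self.wiped:
            return False, 0                     # key material is gone
        if guess == self._passcode:
            self._failures = 0
            return True, 0
        delay = DELAYS[min(self._failures, len(DELAYS) - 1)]
        self._failures += 1
        if self._failures >= MAX_ATTEMPTS:
            self.wiped = True                   # erase the encryption key
        return False, delay                     # a real device sleeps here

d = Device("1234")
for guess in ("0000", "1111", "9999", "4321", "8888"):
    print(d.try_unlock(guess))                  # delays kick in after a few misses
```

Even a four-digit space of only 10,000 codes becomes impractical to brute-force once each miss costs up to an hour and the tenth miss destroys the key, which is the tradeoff at the center of the Apple dispute.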
The issue in Apple, which is technical—and I should just disclose, we do work with Apple, so it’s out there—was whether creating a tool that would allow you to basically substitute out the operating system that currently exists for a system that did not have that feature—namely, the one that shuts down the phone after you have too many tries—whether that, once it was created, would potentially get released and compromise other phones. At the end of the day, in a way, this is what lawyers call a moot point, because inevitably what you’re going to have going forward is a new system that’s configured in such a way as to be even more protective of the device itself. And if the U.S. government were to say it’s illegal to do that, I guarantee you, you would be able to buy a phone that does that within 10 minutes on the dark web.
So again, I think in some cases, on the narrower issue, the lifespan of what we’ve been fighting about is very short. The broader issue however is very, very important, which is do we preserve the right to encrypt data at rest or in motion within a network and on the Internet.
VANCE: I’ve got to disagree with my colleague, because I don’t think that’s—I think that’s a framing of the issue, but I don’t think that’s how it was framed for us by Apple. Tim Cook said that were there to be, in his words, a backdoor, by which he means Apple maintaining a key to open its device, it would threaten millions of Apple consumers. He likened it to causing cancer. So it was very much about—if Apple maintains a digital key, as it did up until September 2014, is that such a risk that we should forego the ability to access those phones in the service of making cases for victims? And for what increased level of security?
I simply don’t think Apple has, as I indicated, answered the question. And, yes, there are many aspects of data in transit. That’s very important. But in terms of just accessing a phone, I’m just not aware that there’s ever been—
SEGAL: I think perhaps the answer is out there in the sense that by 2014, or certainly if not by now, there is nobody who can say we can defend ourselves. So even if we don’t know that Apple has ever been breached and that this tool has been stolen, this general sense would be nobody could ever say we can protect whatever it is we created.
VANCE: But you have to measure—you have to quantify it. If you are going to say: We are taking a whole body of evidence that is essential—that has been essential for centuries, use of warrants based upon probable cause, reasonable searches, to get evidence to build cases against people committing crimes, you have to quantify the risk that results from this ability to unlock your phone. It’s never been done. Did it happen once? Did it happen a thousand times? Has it happened a hundred thousand times? Right now, that number has never been put on the table. And until we quantify that number, you’re asking the entire United States, all of law enforcement and all the public, to say it’s a crisis.
SEGAL: Well, plus the risk that Secretary Chertoff said, that now with the golden age of surveillance there are other ways of perhaps breaking those keys. So—
VANCE: Well, that’s not a risk, that’s—
SEGAL: Well, that’s a mitigating factor. So—
VANCE: You know, I’ve never met a victim yet—a mother of a daughter who has been sexually assaulted, a parent of a child that’s been murdered, a police officer investigating a terrorism case—who said to me, oh, I don’t really care if you get the conversation, I just want to know the metadata. Juries expect proof beyond a reasonable doubt. And the content of communications is what proves intent in a criminal case. And it’s not just being able to say that Cy Vance emailed Mike Chertoff on Saturday morning. That will only get you so far. But I’m pretty clear that that doesn’t get you where you—if it was personal to you, you would want the government to be able to build the evidence.
CHERTOFF: Yeah, so I guess I would say this. Again, I totally understand that if your job is to be a prosecutor you want every bit of evidence you can get. And that’s why prosecutors don’t want to suppress evidence, have evidence suppressed when it’s gotten illegally, for example. But let me tell you, I mean—use the example I talked about earlier. In the pre-smartphone days, when we had electronic surveillance, and I did organized crime cases.
And the guys I investigated did not talk. They used sign language or they walked around the block. And I remember convicting people of murder, and I didn’t have conversations. What I did have was pictures of people walking around the block—(laughter)—and records of telephone calls from point A to point B. And somehow, you know, we were able to fumble our way to getting those folks sentenced to a hundred years in prison. So I don’t want to minimize the importance, but to me it’s not an all or nothing proposition how you can do that.
VANCE: And those cases were likely built on informants who took the stand and said: I talked to Mickey. Mickey told me to kill—
CHERTOFF: Actually not, but I won’t bore you with war stories. (Laughter.)
CROVITZ: So we’re about to go to the question period. And please wait for a microphone. And the usual Council rules—stand up, identify yourself, be brief in your question. But before we get to questions, I just want to take the presider’s right here to observe that the Democratic Party member represents the law and order side of this panel. (Laughter.) Michael Chertoff, who has worked with Republican administrations, is opposing law enforcement’s request. (Laughter.) And yet another example for those of us from New York that Washington is becoming increasingly unpredictable. (Laughter.)
CHERTOFF: Gordon, I—in all seriousness, Gordon, I do want to make a point, which is, to me this actually comes down to a choice between two types of security. And I do think that the issue of being able to have—particularly, I’m not talking so much about device encryption as end-to-end network encryption—is really—for someone like all of you who transact with banks online or do other personal things online, or even have personal emails that are sensitive that you don’t want to have out there—the ability to have those encrypted is part of securing what’s valuable to me. And since the government doesn’t take the responsibility, and shouldn’t, of operating and owning everything on my network, it’s on me to do it. And all I want is the tools to carry it out.
CROVITZ: I think the first question is in the front row.
Q: Amitai Etzioni, George Washington.
VANCE: I don’t think your mic is on.
Q: OK. Thank you.
I wonder if there’s a whole other level of discussion which I would love to learn about, if these things are being decided on a completely different level kind of politically. People are afraid to touch Google and Apple. They have strong lobbying. They have libertarian ideologies to support them. So all these intellectual arguments, which I cherish—and that’s the way I spend my life—are really not going to decide this issue. And therefore, we hear these arguments which don’t stand a minimal examination—for instance, the notion that if Americans cannot get Apple phones they’re going to take Chinese phones and use them, and trust China not to have a backdoor; or it doesn’t matter if you’re going to be out of business for five years, dark, because five years from now somebody else will come up with a way to break encryptions. None of these arguments hold a minimal exam. But isn’t that really not what’s happening here? Isn’t it just a political struggle, including the fact the president was very reluctant to speak to this issue?
CROVITZ: Who would like to take this question?
VANCE: Well, I’ll just jump in. I don’t think it’s just a political issue. And I’m just speaking about myself personally. You know, I am elected to be the head law enforcement elected official for New York, which is the number one—New York City—number one terrorism target in the country. I have some of the most serious crimes occurring in my jurisdiction. So this is not—neither academic nor political, it’s very practical. And for folks like me around the country, 95 percent of the cases—criminal cases in the country don’t occur in federal courts, they occur in state courts.
And so the biggest impact of inability to access at the very least the content of the devices themselves, putting aside for the moment the data in transit, matters greatly. And of course, what we’re doing is talking to political leaders, Senators and others, to try to convince them, in my view, that they should address this with federal legislation. I think that has to be the solution. And they need to do it quickly because I have a—and I think we should all have—a sense of urgency around this issue not being resolved. So, respectfully, politics doesn’t enter into this for me at all. And I think I represent a large percentage of the law enforcement community that is trying to wrestle with this.
CROVITZ: Just to rephrase the question a little bit, and try to elicit another response, when Apple announced its change to its operating system in 2014, it posted on its website this line, that said, it’s not—it’s no longer technically feasible for us to respond to government warrants, which I think may be one of the examples that the FBI director used to say they were marketing their inability even to respond to government warrants. So—
VANCE: Gordon, another screen had that same phrase when they just—they came out with iOS 7 and said: Apple, unlike its competitors, cannot open its devices. So I think—
SEGAL: And we haven’t said the word “Snowden” yet. I mean, I think we do have to realize that there was a motivation from the companies to create distance between them and the U.S. government and all other governments. So you know, I do think there was that framing that was out there that we just can’t ignore.
CROVITZ: In the back.
Q: Alan Raul, Sidley Austin, and former vice chair of the Privacy and Civil Liberties Oversight Board.
So your reframing of the prior question and Adam’s response really goes to my question, which is: Would you address the kind of background context of what the U.S. government can do to address the concerns that were inflamed by Snowden, and the sense of overreach in the acquisition of data by the U.S. government, which I don’t think is a fair perception, but it is the context? And this panel was titled privacy and security, and this debate. And the privacy side is this perception that too much data is being acquired by the U.S. government, without justification. Again, unfair, but without an effective rebuttal by the Federal government, you know, by city and state governments.
So what can be done to address that? And perhaps, you know, the government needs to look at some of the cases that it brings. You know, there’s pending before the 2nd Circuit now the extraterritorial search warrant case, right? So that is an example, perhaps, of the U.S. government overreaching into servers in other countries that, again, gives rise to this dynamic of creating distance between the companies and the government. And I think if you would address, what can the government do to address that perhaps, you know, mitigate the tension?
CHERTOFF: So let me take this first. And you know, Alan here has raised two separate issues.
On the issue—the first issue you raised, I think that really is what they call the 215 issue, or the metadata issue. There is—and first of all, I mean, I have nothing good to say about Snowden, nor do I credit his version of what he put out there as being accurate. But the sense got out there that—because the government collected and stored a lot of metadata, which is literally phone number, IP address, originating communication, the address to which it went, and duration—not even a name—which is kept so you can look for patterns, if you have a predicate to do so—you know, that was treated as somehow a gross violation of the law.
Now, the first answer is, when the government finally, belatedly, declassified much of the legal foundation for this, including opinions written by the FISA Court, it became clear that in fact this was comfortably within the law; it didn’t go outside the law. And beyond that, that it was really a minimal intrusion into privacy—essentially what a telephone billing record is in the modern age. And I think there was a failure—a disappointing failure—on the part of the government to explain, in fact, that this was pretty modest. And there’s enormous value to be gained by doing this. And I say this as someone who in the days after 9/11 had the responsibility to see who the heck else was out there. And you track that by looking at all these metadata collections.
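The kind of pattern analysis Chertoff alludes to—finding "who the heck else was out there" from call records alone—can be sketched as a contact-graph search. Everything below is invented for illustration: the record format, the phone numbers, and the two-hop rule are assumptions, not a description of any actual program.

```python
# Illustrative sketch: from bare call metadata (no names, no content),
# find every number within a given number of hops of a seed number.

from collections import defaultdict

records = [
    # (originating number, destination number, duration in seconds)
    ("212-555-0101", "212-555-0102", 120),
    ("212-555-0102", "212-555-0103", 45),
    ("212-555-0104", "212-555-0105", 600),
    ("212-555-0103", "212-555-0106", 30),
]

contacts = defaultdict(set)
for src, dst, _ in records:
    contacts[src].add(dst)          # treat contact as bidirectional
    contacts[dst].add(src)

def within_hops(seed, hops):
    """Numbers reachable from `seed` in at most `hops` calls."""
    frontier, seen = {seed}, {seed}
    for _ in range(hops):
        frontier = {n for f in frontier for n in contacts[f]} - seen
        seen |= frontier
    return seen - {seed}

print(sorted(within_hops("212-555-0101", 2)))
# prints ['212-555-0102', '212-555-0103']
```

The debate in the transcript is precisely over this layer: whether links and durations without content are a modest billing-record analogue, as Chertoff argues, or a serious intrusion, as the questioner suggests the public perceived.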
The second point you raised, though, which is somewhat distinct, is a little bit related. This has to do with the fact that the U.S. government, when it seeks to get information, data, about, let’s say, a foreigner, who’s really in a foreign country, in a data center that’s in a foreign country, will go against the service provider that’s doing business in the United States and say: Well, you have to turn this over to us because we’re going to subpoena you. And that’s an issue we used to confront with banks. And the question is, is that subpoena like asking for bank records, which we typically allow the government to do? Or is it like telling a U.S. branch of a Swiss bank: You have to break into the safety deposit box of somebody in Switzerland? And we’re not going to go to Swiss court. We’re going to go to a U.S. court.
That highlights the problem of a world in which our law is still sovereign-centric but data moves around the globe. And the real answer is not to eliminate the ability to get foreign evidence, but to find a cooperative and efficient way to get multiple sovereigns to agree on a process to do this quickly—one that respects the sovereignty of the country, for example, whose citizen is involved. So again, this is not about closing that off, but it is about avoiding what happened in Brazil, where a judge shut down WhatsApp because the judge ordered—I guess it was Facebook—to turn over data when doing so would have violated U.S. law. And the company said, well, what do I do? And that’s not a good situation for anybody to be in.
SEGAL: But I mean, the impact of the Snowden revelations was more than the metadata, right? We have allegations that they undermined encryption, that they hacked into Yahoo and Google servers, that they were sitting on zero-days, that they intercepted equipment and put implants in it. So I think the impact on the companies is clearly much broader than the metadata. And that has never been repaired.
VANCE: Right. And I think it’s an important point that what caused this stir was federal action. (Laughter.) We, who are handling 95 percent of the criminal cases in the country, are now trying to push ahead to do our job when upset has been caused that has nothing to do with us. And we only get access to devices through search warrants. So again, obviously these are overlapping issues, but the Snowden revelations clearly have, I think, had a big impact on public opinion. I understand clearly the relationship between those revelations and the actions of the companies.
But if—looking at Apple—if iOS 7, up to September 2014 was as secure as they said it was, and if we don’t have any indication that there’s ever been a breach, at least of the data at rest, by delivering a phone to them, what is the problem Apple is trying to cure? Is it a public relations problem or is it a privacy problem? And I think it may have been more a public relations problem than a real privacy problem.
CROVITZ: Yes, ma’am.
Q: Thank you. My name is Courtney Radsch.
And a couple of questions. First, there were reports that the reason the FBI lost access to the phone is that the password was changed. So I’d like to know if that’s true or not, because then that actually seems to be more of a failure on the government’s part that they should deal with, not the tech company’s. Also, given that we’re at the Council on Foreign Relations, I think it’s a bit ironic that we’re talking about this as if it’s only a U.S. problem, because the fact is that journalists and activists around the world depend on encryption. And so this does not happen in a United States vacuum. And with respect to local law enforcement, I think one of the things we’ve seen is the militarization of local law enforcement and the transfer not only of military equipment but also of intelligence equipment to local law enforcement.
And so this is, you know, within a broader lack of trust in either local or national governments. And when you see law enforcement abuses ranging from, you know, Black Lives Matter and Ferguson, up through mass surveillance of U.S. citizens, I think it’s hard for us to trust the government to do what’s right. And there’s a difference between communication that occurs when people feel that they’re under surveillance and when they feel that they’re able to communicate in private, especially when you’re a member of a minority group, a Muslim, or a journalist and you have to think about source protection, because that metadata can reveal the sources of journalists.
CROVITZ: Several questions there.
VANCE: Well, Mike—
CHERTOFF: No, you go first. (Laughter.)
VANCE: Great questions. And many questions. And no simple answers. I think, first, what law enforcement needs to do—and I think smart people in law enforcement are doing this—is to address these questions directly. I certainly try to do that, for example, on the issue of race and criminal justice. These are big issues, and prosecutors and police have to explain what we’re doing and why. And to answer some of your questions: there’s a technical question with regard to whether we lost the opportunity in San Bernardino to access the phone. Jim Comey has answered that question. If I try to repeat what he said I’m sure I’m going to get it wrong. But I think there was some apparent confusion over getting access to a phone owned by the county, and somehow that ended up getting the phone locked up. So that’s number one.
And from the local law enforcement perspective, I’ve never used a Stingray in our office—the device that captures signals from all the phones in an area. Our investigators don’t use it. If they did use them, they’d be getting a search warrant for them. And you had a lot of other questions which were, it seems to me, based upon the fact that some Americans have lost trust in law enforcement. And I understand that. And it is why, whether the issue is race and criminal justice or the issue is how we need to adhere to constitutional standards every time we are trying to get evidence through court order, I think we have to communicate better about what we’re doing, because we should get your trust back, if we can.
CROVITZ: Please. Oh, I’m sorry.
Q: Hi. Thank you. My name is Brooke (sp), from Congressman Hurd’s office.
So I was wondering if you can comment on a part of this debate which was sort of touched on, and it has to do with the companies themselves. So the backdoor debate aside, I was wondering what is the legal precedent for forcing a company to do something like create in every case some sort of ability to open up encrypted data, or, you know, if they can’t because they’ve created an encrypted phone that they don’t have the, quote, unquote, “key” for, to force them to create the key for it? And I had one more question, but I don’t remember it. (Laughter.) So I was wondering if you could comment on that.
CHERTOFF: So I’ll start on this one. I think there are a couple of issues. There’s a narrow issue in the case of San Bernardino, which I guess has not been resolved because it’s moot, about what happens in the context of a search warrant. Under the All Writs Act, basically what the judge says is: I’m granting this warrant, and the company has to do anything necessary to effectuate it. So, for example, if you’re the landlord of a house and there’s a search warrant, you have to, you know, open the door.
The question that was raised, which has not been resolved, is how far does that power go? Can you say to somebody: You have to write—spend six months writing code to get around a problem? Well, that sounds a little bit like you’re enlisting them against their will in the police. The broader question is whether you could require as a congressional matter, or pass a law that would say you have to have the capability with respect to any encrypted data for the company or the host to be able to get into that data. I think, you know, if Congress passed that law, it would probably be constitutional, off the top of my head. I think it would be very unwise, because, again, if you understand the way encryption works—and again, I’m talking not so much about device encryption here but about end-to-end encryption of data on a network.
In order to create the ability to do that, you would either have to have a duplicate key in possession of the company, which would then become a target for being stolen, or you’d have to build into the encryption a flaw in the implementation, which is typically the way we tend to crack encryption now, because the fact of the matter is even people who are trying to be perfect in encryption often fail. For example, they’ll use the same prime number twice or something. And that radically reduces the amount of time it takes to decrypt something.
So in the broader case, I think it’s not so much a question of legal authority as of the tradeoff between your ability to get general encryption protection versus the individual case. And I know that in Congress now there’s a bill to create some kind of digital commission to look at this—I don’t know that these issues are well resolved in one-off cases. And I say this as a former judge. I don’t know that judges are always the most technologically astute and well-informed people. (Laughter.) Getting a group to look at this who have the technical background might be illuminating, actually.
VANCE: OK, I’ll jump in.
Companies historically in America have been asked—or required—to do many things in order to address law enforcement imperatives. So banks didn’t volunteer to file currency transaction reports when more than $10,000 of cash is moved. They have to file them because we found out that drug dealers were moving cash through banks. The phone companies didn’t volunteer to make their lines accessible to law enforcement with Title III warrants. They had to, because everyone knew that bad guys were using the phone to commit crimes and it was necessary to listen to them.
So we aren’t really, in my mind, in that different a place today. We know criminals are using smartphone devices for criminal purposes. There’s a phone conversation—we actually have a report from our office, which we’ll make available to anybody who wants it outside—where there’s a conversation of an inmate in Rikers Island who’s talking with his confederate outside. And the inmate is asking the confederate: Did you make sure you got the iOS 8 devices? Because the government can’t get into them and it’s a gift from god. So the criminals have figured this out.
So the question is, is Apple really any different from the banks, with all the money laundering regulations they have to follow and report on and the documents they have to keep? Or is it that different from the phone companies, who have to provide access, requiring them to set up a connection—something probably involving software development as well as a physical location? And I think the answer is, not really.
SEGAL: I think that there is a difference, in the sense of the earlier question about our other foreign policy goals of protecting dissidents and journalists and others in other places. So you can imagine a situation where the companies are going to be forced by numerous governments to do this. And then the question is, who are they going to say no to, right?
So are they going to say, well, we’ve agreed to do this for the United States. We’ve agreed to do it for India. We’ll do it for Germany. We’re not going to do it for China. Maybe we’ll do it—you know, so then I think the situation becomes slightly different. In the financial sector, we’ve been more able to shape norms on antiterrorism and crime—there’s kind of a shared interest there. On the free speech issue, we’re going to face a fundamental division that we’re not going to be able to resolve.
VANCE: Let’s test that. Let’s look at China. Let’s look at the requests China made to Apple for information on Apple accounts before iOS 8 and after. Before iOS 8, between July and December of 2014, China made 31 requests to Apple for account data on 39 accounts. And Apple provided data in response to the Chinese requests 32 percent of the time, OK? Between January and June of 2015, the next six months—now this is after iOS 8 has come in—China made 24 requests for 85 accounts, and Apple responded to 29 percent of those requests. And Apple describes these as the equivalent of a search warrant. And then July to December 2015, 32 requests covering many thousands of accounts, which I think relates to a phishing investigation—there was a lot of phishing going on; phishing with a P-H. And Apple responded to 53 percent.
So my point is that we have data from Apple suggesting that when Apple went to full-disk encryption, it really wasn’t that different from when Apple had access through iOS 7 in response to government search warrants. This was happening before, and it’s happened after the switchover. And so when we are told that the world’s going to end, and that China’s going to use this power thousands of times, I’m not sure Apple’s own data supports it.
CHERTOFF: Yeah, just to be clear, there’s a bit of a difference about the issue that was presented in the Apple case, which was getting onto the device, versus the data which had been uploaded to the cloud. My recollection from the press reporting in San Bernardino was that any of the data that was backed up to the cloud was actually furnished to the FBI. And the reason they had a problem at the end was that somebody, by altering the password, stopped the automatic backup system. So it’s not been the case, as I understand it, that the company says: If we have unencrypted data we will refuse to turn it over. I think they agree they have to. The question is whether they need to forbear from building a system which they don’t have access to, because if they don’t have access to it they will not be able to comply. And that’s really what the issue is about.
SEGAL: And all Apple user data has to be stored in China. So they may have just simply turned over the data that they had.
CHERTOFF: Right. Yeah.
VANCE: Does it have to be stored in China? Or is that an accommodation?
CHERTOFF: Well, I think that’s the condition for Chinese data. And that gets us to the point that Alan raised earlier, which is that you have to reconcile the fact that you have multiple sovereigns and you have global data. It would be one thing if a condition of entry into a particular marketplace—and you can debate whether you should obey it or not—is that the data generated in that particular country needs to be held there. It would be quite something else if the Chinese requested data of an American held in the U.S., and that were turned over without appropriate process. And that gets us exactly to the issue Alan raised about how you reconcile these conflicting jurisdictional issues.
Q: Henry Farrell at George Washington University.
Just building on all of that, does U.S. law enforcement pay attention to these questions that Adam raised about possible demonstration effects, setting precedents, creating weaknesses that might be exploited by less sort of careful actors in the world than the U.S.? And should it pay attention to these questions, or is that outside of its mandate?
VANCE: Well, from my perspective, the questions are very big ones. And what I disagree with is this point, following on your question: Other companies historically have had to adjust when it was clear that the product they were selling was being used by criminals and it was a real problem. Apple’s response, for a variety of reasons, was to essentially cut off access to the data in the devices themselves. And it did so on its own—it made its decision. The consequences of that, however, are broad. They will touch all jurisdictions of the United States, and they will impact the victims in cases all over the United States. And when something that big happens, my question is: Who gets to decide where the line should be drawn between privacy and public safety?
Should it be two companies whose operating systems run 96.7 percent of the world’s smartphone market? I’m not worried about Apple running out of money. I’m not worried about Apple losing market share. They own it. These are two phenomenally wealthy companies, and they decided: we’re going to draw the line between public safety and privacy over here. I really don’t think they get to do that. I think Congress gets to do that. And that’s why I believe there should be a political and legislative discussion about this. Our cases in court have to move according to timetables set by statute. Statutes of limitations are starting to run—and can run out—on cases that are very important. We don’t have the luxury, particularly in state court, of tons of time. And that’s the reality of how this is playing out in our state courts. And it’s a reality that, if you’re a victim, impacts you.
CROVITZ: So I’m going to take the privilege of the presider by asking the last question of the evening, which is, you know, the Senate Intelligence Committee, in a bipartisan manner, introduced a draft bill that would require technology companies essentially to be able to respond to requests for information, much like telecom companies have had to for many years. And at the beginning of that bill, it says no entity is above the law, reflecting some of the views that Cy Vance has expressed here. My question is, there’s also legislation in France and in Britain. Of the three countries—France, Britain, the United States—which one is likeliest to pass the law first? (Laughter.) Or maybe I should say last.
VANCE: Well, I think they’re likely to pass it before we do. The leaders of each of those countries have been very outspoken about the need to take on this issue of encryption. I don’t know when it will happen. But I think it actually may help America sort itself out in this process, because if the French and the British create laws which address this issue of what can and cannot be encrypted, that I think will have an impact on American legislators, and may give them, I hope, comfort that they can pass legislation without fear that the world’s going to end.
CHERTOFF: So, Gordon, let me ask this question, because this is, as I said earlier, not just about getting to a particular phone. If you passed a bill like that, what would happen to WhatsApp? You know, disappearing messages. That would become illegal, because if the message isn’t stored, if it disappears as soon as you read it, you can’t produce it. So now what you’re really doing is you’re basically making it illegal almost to destroy any communication. Maybe it should be impossible to delete your email. Maybe there should be a rule that you can never shred a piece of paper.
I mean, we don’t generally operate in this country on the assumption that you have to organize your life so as to make it easy in the event somebody wants to subpoena you or investigate you. In fact, I can make it easier still. Let’s have a law that says this has to be on all the time, has to record everything that I say, in video and audio. And then if I do something wrong it’ll be really easy. It would have made my job very much easier when I was a prosecutor. But the phone, to me, is a small subset of a larger issue. And that’s why I said earlier—I agree with Cy—it is worthy of study. It shouldn’t be decided by the happenstance of a particular case. But I do think we need to understand that there’s a broad implication of this beyond just the particular configuration of a particular phone.
VANCE: And I think there are actually ways to go forward on this. Right now we have two sides who really are polar opposites. And there really doesn’t seem to be a public discussion about how we can move forward. Imagine a world in which there were different levels of encryption. There might be one level of encryption for communications and content, and another level of encryption for autonomous vehicles—self-driving vehicles. That second group is something where obviously, if it’s captured, that’s immediately dangerous. But content and communications may be different.
We should really sit down and explore whether it is in fact true. And I think part of the problem is that Apple in particular has said hell no, no way, no time, no how. And I don’t think that’s helpful to anybody in resolving this, although I respect their ingenuity and the passion of their beliefs. It’s not helping us figure this out.
SEGAL: I suspect we won’t have the luxury of having that rational debate. I don’t think we can discount the role that fear has played in all of these discussions. And so you can imagine, you know, an event that happens here or in France or in Britain that would totally shift the debate. And people will rush to make a law so this won’t happen again, even though the law will not be effective.
VANCE: And if that happens, that law is going to look a lot worse for Apple and Google than the law that we could perhaps move forward to if we sat at the table and sorted this out.
CROVITZ: And in the meantime, the Council on Foreign Relations, always trying to do its bit for the world—Mike Chertoff has dictated a Chinese statute requiring Chinese citizens to keep their phones on. It’s a fantastic idea. (Laughter.)
So please thank the panel in the customary way. (Applause.)