- Expert Roundup
- CFR fellows and outside experts weigh in to provide a variety of perspectives on a foreign policy topic in the news.
The move by major technology companies like Apple and Google to sell products with advanced encryption has pushed the debate over digital privacy and security to a critical stage. Some policymakers are pushing for new laws that would require tech manufacturers to ensure that government investigators could access suspects’ digital information. Meanwhile, privacy advocates say such measures are unnecessary and may undermine security for all. CFR asked three experts to weigh in on how technology firms, in designing their products and services, should balance the privacy demands of their customers with the security concerns of police and counterterrorism agencies.
Apple’s announcement in September that its iOS 8 mobile operating system would feature encryption by default has launched a spirited public debate over whether technology firms should be legally required to compromise the otherwise secure systems they market to consumers.
Law enforcement, most notably the FBI, has answered with a resounding "Yes." Officials claim that as more data is encrypted, they are increasingly unable "to access the evidence [they] need to prosecute crime and prevent terrorism even with lawful authority." They call this phenomenon "going dark."
But the numbers don’t back up these assertions. In 2013, encryption foiled only nine of 3,576 federal and state wiretaps, according to the federal judiciary. It is a huge leap from one quarter of one percent to "going dark." Increasing the security of our digital systems won’t stop law enforcement from prosecuting and preventing crime. Police have a wide variety of investigative tools at their disposal, and only an incredibly intelligent criminal could stymie every single one (and such criminals have already had access to strong cryptography for years).
Would introducing backdoors (secret access methods that investigators can use to overcome otherwise secure systems) make law enforcement’s job easier? Of course. But there are lots of other tools that would make their job easier, and we’ve decided as a nation that these would violate our basic rights enshrined in the Fourth Amendment.
The problem is that backdoors also make criminals’ jobs easier. There is no such thing as a system that is insecure enough to give police access yet secure enough to keep out criminals, malicious foreign agencies, and other bad actors. Computer science just doesn’t work that way.
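The structural point can be illustrated with a deliberately insecure toy sketch (all names and the hash-based "cipher" are illustrative assumptions, not real cryptography): in a key-escrow design, a single master key unwraps every user's key, so whoever holds that master key, whether an investigator or an attacker who stole it, can read everything. The system has no way to tell the two apart.

```python
import hashlib

def keystream(key: bytes, n: int) -> bytes:
    # Toy keystream from repeated SHA-256 -- for illustration only, NOT a real cipher.
    out, block = b"", key
    while len(out) < n:
        block = hashlib.sha256(block).digest()
        out += block
    return out[:n]

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    # XOR plaintext with the keystream; XOR is its own inverse, so this also decrypts.
    return bytes(a ^ b for a, b in zip(plaintext, keystream(key, len(plaintext))))

decrypt = encrypt

# End-to-end case: data encrypted under the user's key alone.
# Only the key holder can read it; a wrong key yields garbage.
user_key = b"user-passcode-derived-key"
ct = encrypt(user_key, b"private message")
assert decrypt(user_key, ct) == b"private message"
assert decrypt(b"wrong key", ct) != b"private message"

# Backdoored case: the user's key is also wrapped under an escrowed master key.
# Anyone who obtains master_key -- police or thief -- recovers every user's key.
master_key = b"escrowed-master-key"
wrapped_user_key = encrypt(master_key, user_key)
recovered_key = decrypt(master_key, wrapped_user_key)
assert decrypt(recovered_key, ct) == b"private message"
```

The sketch shows why the access channel is all-or-nothing: nothing in the math distinguishes a lawful holder of the master key from an unlawful one.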
Indeed, we have examples of backdoors that led to major digital breaches: the hacking of Greece’s cell phone system in 2006, a similar incident in Italy between 1996 and 2006, and the hacking of Gmail in 2010. Instead of protecting us, law enforcement is supporting policies that would make us and our private information less safe. Regrettably, they are trying to frame this debate as one of privacy versus security, when in reality we can and should have both.
Companies must reflect the values of the countries where they do business, at least if they want to stay in business. Unfortunately, in the most recent encryption debate, much of Silicon Valley has mistaken its own left-libertarian values for those of the world. In fact, surprisingly few people outside the Silicon Valley bubble want to live with the potentially dangerous consequences of giving unbreakable end-to-end encryption to everyone.
Let’s start with corporations, by far the biggest market for security, especially now that many have suffered unrelenting state-sponsored cyberattacks. They all want some encryption, but almost none want to give employees the ability to hide communications from the corporate systems administrator. The same is true for countries. Encryption is regulated in many places around the globe, and not just in authoritarian regimes like China and Russia. In recent memory, France, South Africa, Turkey, and India have all had policies that discouraged strong encryption.
Even in a country as close to the United States as the United Kingdom, a parliamentary committee investigating an Islamist murder of a British soldier in London criticized a U.S. technology company for not monitoring the killer’s posts more carefully. Said the committee, "this company does not appear to regard itself as under any obligation to ... take action or notify the authorities when its communications services appear to be used by terrorists."
And that was before the Charlie Hebdo attacks. Afterwards, British Prime Minister David Cameron promised legislation to override Silicon Valley’s encryption strategy, winning some support from President Obama and a strong endorsement from the EU’s counterterrorism leader, who suggested adopting the policy Europe-wide.
U.S. technology firms may be misreading the foreign government criticism they encountered in the wake of Edward Snowden’s revelations. Calls from foreign governments to "localize" cloud storage of personal data may have stemmed more from their desire to invade their citizens’ privacy than to preserve it.
Law enforcement officials around the world are deeply frustrated to find their investigations thwarted by Silicon Valley companies that cough up evidence only in response to requests that meet the standards of U.S. law, which are often far more demanding and protective of privacy than the standards those officials are used to. Seizing on the Snowden revelations to localize the data takes U.S. law out of the picture, serving the interests of both cops and spies in the home country—but not the privacy interests of local internet users.
The real lesson is that it’s always easy for other countries to find reasons to hate big American technology companies. Much as they would like to, Google, Apple, and other firms won’t be able to escape responsibility for how their technology is used simply by adding universal unbreakable encryption. These companies are at the center of our new social lives, and that means that they’re stuck with having to conform to the laws and expectations of countries that are very far from Silicon Valley, in every way.
As a matter of process and good governance, the security concerns of police and counterterrorism agencies are not the responsibility of corporate service providers in the technology community. They are, properly, the concern of Congress and the Executive Branch—those charged by the Constitution with providing for the “common defense” of the Republic. Technology firms like Apple, Google, and Microsoft have a responsibility to their shareholders to make profitable products. To achieve that objective, they have an obligation to their customers to make products that meet their needs and desires at a price that consumers are willing to pay.
Thus, in the wake of the Snowden revelations, it is perfectly appropriate for the technology industry to respond to changing consumer demand by offering products that are more privacy protective—at least it is if they are correct in perceiving that market demand for such products has increased. Indeed, it would be irresponsible of them to fail to change their development and marketing plans in the face of changing demand.
If, on the other hand, the U.S. government is of the view that these new product offerings are a threat to national security, then it, equally appropriately, should both say so publicly and, to the extent necessary, seek the regulatory or legislative changes it thinks are required. That is exactly how the process has worked in the past when the U.S. government has prohibited, for example, the export of some products (like tanks), and when it has regulated the manufacture of other products (such as mandating that telecommunications providers include interception capabilities in their products). But the burden of persuasion is on the government. If Congress or the Executive Branch cannot agree that such regulation is necessary then, in a free society, products that are market-responsive should be favored, not disfavored.
Finally, I would not be in favor of legislation prohibiting the production of products with enhanced encryption technology. In the first place, it would make American products less competitive than those manufactured abroad—those without decryption requirements will naturally be favored by privacy-sensitive customers. In the second place, requiring an encryption key backdoor might create unintended vulnerabilities. And, perhaps most importantly, given the globalization of information transfer, trying to stop the spread of encryption technology is a fruitless Canutian task. Ideas can’t, in the end, be suppressed.