From Net Politics and the Digital and Cyberspace Policy Program

New Cyber Brief: Disinformation on Steroids

This image, taken from a fake video featuring former President Barack Obama, shows elements of the facial mapping used in new technology that lets anyone make videos of real people appearing to say things they never said. AP Photo

Deep fakes are a profoundly serious problem for democratic governments and the world order. A new Council on Foreign Relations Cyber Brief argues that a combination of technology, education, and public policy can reduce their effectiveness.

October 16, 2018

Blog Post
Blog posts represent the views of CFR fellows and staff and not those of CFR, which takes no institutional positions.

The Digital and Cyberspace Policy Program has launched a new Cyber Brief. This one explains the challenge that deep fakes pose to democracy and the international system. The brief was written by Robert M. Chesney, James Baker Chair at the University of Texas School of Law, and Danielle K. Citron, Morton and Sophia Macht Professor of Law at the University of Maryland Francis King Carey School of Law.

Here's the introduction:

Disinformation and distrust online are set to take a turn for the worse. Rapid advances in deep-learning algorithms to synthesize video and audio content have made possible the production of “deep fakes”—highly realistic and difficult-to-detect depictions of real people doing or saying things they never said or did. As this technology spreads, the ability to produce bogus yet credible video and audio content will come within the reach of an ever-larger array of governments, nonstate actors, and individuals. As a result, the ability to advance lies using hyperrealistic, fake evidence is poised for a great leap forward.

The array of potential harms that deep fakes could entail is stunning. A well-timed and thoughtfully scripted deep fake or series of deep fakes could tip an election, spark violence in a city primed for civil unrest, bolster insurgent narratives about an enemy’s supposed atrocities, or exacerbate political divisions in a society. The opportunities for the sabotage of rivals are legion—for example, sinking a trade deal by slipping to a foreign leader a deep fake purporting to reveal the insulting true beliefs or intentions of U.S. officials.

The prospect of a comprehensive technical solution is limited for the time being, as are the options for legal or regulatory responses to deep fakes. A combination of technical, legislative, and personal solutions could help stem the problem.

You can find the full brief here.

This work is licensed under the Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International (CC BY-NC-ND 4.0) License.