Data is the New Gold, But May Threaten Democracy and Dignity
from Women Around the World and Women and Foreign Policy Program

Few safeguards protect our private data in today’s information economy. What can be done when our personal images and data are exploited, threatening personal as well as national security?
Professor Danielle Citron testifies before the House Committee on Energy and Commerce on fostering a healthier internet for consumers in Washington, DC, on October 16, 2019. Scavone Photography/David Scavone

This post was originally published on CFR’s Net Politics blog.

As the world has moved into an era of information and disinformation, Danielle Citron and Robert Chesney penned an important article for Foreign Affairs in 2019, “Deepfakes and the New Disinformation War: The Coming Age of Post-Truth Geopolitics.” This past fall, I had an opportunity to host Danielle Citron, a MacArthur “Genius” Fellow, at CFR to further connect the dots between threats to our personal information and the ways disinformation can threaten democracy, dignity, and national security. An expert on privacy law, Citron is the Jefferson Scholars Foundation Schenck Distinguished Professor in Law at the University of Virginia and the author of a new book, The Fight for Privacy: Protecting Dignity, Identity, and Love in the Digital Age.

More on:

Women and Women's Rights

Privacy

Technology and Innovation

Democracy

If you’ve engaged with any form of technology recently—whether through a smartphone, social media, a fitness tracker, or even a seemingly innocuous game like Candy Crush—you have generated a substantial trail of intimate data. Intimate data ranges from your location and when you fall asleep to even more closely guarded information like your menstrual cycle or sexual partners. Every day, this data is scraped, bought, and sold by data brokers to third parties. Beyond violating our privacy, this repurposing of our personal data undermines our security: images and other personal data can be used in unauthorized and sometimes fraudulent ways, as with deepfaked photos and videos.

If this makes you want to throw all your devices out the window, know that even that may not be enough. Julia Angwin, a journalist and the author of Dragnet Nation: A Quest for Privacy, Security, and Freedom in a World of Relentless Surveillance, investigated whether anyone could retain a modicum of privacy without retreating from society entirely. After buying a burner phone, scrambling phone calls, wrapping her phone in tinfoil, and even creating a second identity, she estimated she had achieved only about “50 percent of what is possible.” Even deleting an app and voting with your feet cannot protect your digital privacy: you may think your data is gone, but it continues to be sold and repurposed to monetize your digital behavior.

In our conversation, Professor Citron explained the central argument of her new book: threats to intimate privacy (broadly defined as information about our bodies, health, sexual orientation, sexual partners, closest relationships, innermost thoughts, and more) deny people their rights to full freedom of expression, equal opportunity, and personal identity, and as such, intimate privacy should be recognized as a civil and human right under the law.

The threats technology poses to intimate privacy extend beyond individual expression, Citron warned; they also endanger national security. Deepfakes, increasingly realistic fabricated audio or video recordings, could alter the outcome of an election. Imagine if a convincing video of President Biden admitting to “stealing” U.S. elections, or a speech altered to look and sound like a declaration of war, proliferated across social media.

And as deepfakes become more realistic, democracy and democratic discourse suffer. Citron pointed out that women, people of color, the LGBTQ+ community, and other vulnerable groups are often the “canaries in the coal mine” for digital threats that later extend to the larger population. Currently, around 95 percent of deepfakes circulating on the internet consist of women’s faces placed onto pornography. The practice is not only deeply traumatizing for victims but also pulls women offline, as evidenced by the case of Indian journalist Rana Ayyub, whose face was placed on a pornographic video and who endured internet harassment and doxxing aimed at silencing her reporting on the Modi government.


In a post-Dobbs environment, violations of intimate privacy become even more dangerous, according to Citron. Data from period-tracking apps, health records, and location services could be sold by data brokers to law enforcement and used to prosecute pregnant people who seek or obtain an abortion. As UC Davis law professor Elizabeth Joh warns, “Dobbs reminds us of how little control we have over our digital selves and emphasizes how digital rights are also reproductive rights.” Digital surveillance of women’s reproductive lives will likely disproportionately affect women of color and women with fewer resources.

During the conversation, several participants asked what can be done, legally and personally, to protect us from the perils of deepfakes and other intimate privacy violations. Citron proposes reforms to Section 230 of the Communications Decency Act, which grants tech platforms broad immunity for content posted by third parties, even discriminatory content, unless the platform materially contributed to it. In testimony before Congress, Citron contended that not all platforms on the internet should be treated as good-faith actors. Some “bad Samaritans,” as she calls them, such as revenge porn websites, should be denied Section 230’s blanket immunity. Such reforms complement proposals that Professor Mary Anne Franks discussed at CFR this past summer, when she shared steps the White House Gender Policy Council is taking through the launch of a task force to address online harassment and abuse.

Ultimately, regulating private companies’ ability to collect certain types of data, like intimate data, will make us less vulnerable to its weaponization. If intimate privacy were legally a civil right, Citron argues, companies would be obligated to protect the information rather than profit from it.

Citron is generally wary of privacy solutions that focus on individual responsibility, but she acknowledges the power of activism to hold institutions’ feet to the fire and protect the right to intimate privacy. In an inspiring example from her book, Citron pointed to South Korea, where just a few years ago women faced an epidemic of invasive cameras in public restrooms. Following months of protests—with hundreds of thousands of people taking to the streets—the South Korean government eventually conceded to demands and established a Digital Sex Crimes Information Unit to address the problem. While Citron firmly believes a legal framework is needed to defend intimate privacy in the long run, she says “individuals can move mountains,” and moral suasion from the public can push companies to protect this information in the meantime.

Alexandra Dent, research associate at the Council on Foreign Relations, contributed to the development of this blog post.

This work is licensed under Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International (CC BY-NC-ND 4.0) License.