Guarding the Ballot: Addressing Foreign Disinformation and Election Interference
Panelists discuss the escalating threat of foreign disinformation and other forms of election interference and what the United States and its allies can do to combat this risk.
TEMPLE-RASTON: Good morning. My name is Dina Temple-Raston. And I’m the host and managing editor of the Click Here podcast from Recorded Future News. We’re about all things cyber and intelligence. And we’re here today at the Council on Foreign Relations to discuss Guarding the Ballot: Addressing Foreign Disinformation and Election Interference. And this meeting is part of the Council’s Diamonstein-Spielvogel Meeting Series on Democracy. We’re joined today by CFR members attending here in person in Washington, and over 200 attending virtually via Zoom.
You have the bios of our panelists, so I won’t repeat them. But briefly, in order, we’re speaking with the Global Engagement Center’s James Rubin, Anne Applebaum, who is a staff writer at the Atlantic, and Jon Bateman, who’s a senior fellow in the Technology and International Affairs Program at the Carnegie Endowment for International Peace. So if you could join me in welcoming them here this morning. (Applause.)
So I thought we’d start by setting the table a little bit and talk about how misinformation in 2016 is different from what we’re seeing in 2024, and how the tactics have changed. And fortuitously, Anne, you have an article that’s out today that takes a look at the sweep of this and how it’s changed. So could you just tell us briefly what’s different this time around? Not just in the U.S., but more generally.
APPLEBAUM: Mmm hmm. So, first of all, thank you very much. I’m delighted to be here. And welcome to everybody who’s out in the ether, in cyberspace, as it were. (Laughs.)
I have just published an article. It’s actually a book excerpt, and it’s the cover of the Atlantic’s June issue. And it is about the new ways in which Russia, China, but also Iran, Venezuela, and other autocratic states have converged—not around a plan; it’s not a conspiracy—but around a set of ideas about how to promote authoritarianism and autocracy. It’s not an entirely new subject. Some of the elements of it are familiar. I think the new part is the number of countries that are now participating and the degree to which China, for example, is now using its enormous resources, including its media resources in Africa, and Asia, and Latin America, to promote it.
You know, we can talk in the course of this conversation about different pieces of this. The ideas will be familiar to many of you. The idea is to promote the notion that autocracies are stable and safe, and democracies are divisive and chaotic and degenerate. The Russian version of this argument has a particular kind of traditionalist, I would say, fake family values component. You know: We’re opposed to LGBT people—and they would use the expression LGBT ideology—we are in favor of traditional marriages. And they use that language. And that also is echoed by other autocrats in different places.
But they also increasingly seek to promote those narratives not just at home, to keep their own power safe, but around the world. I think the new element was an understanding that—certainly in Russia and China, but also elsewhere—over the last decade both Putin and Xi Jinping have spoken publicly and privately about the threat that the language of democracy poses to them. And by the “language of democracy,” they mean people who talk about transparency, accountability, free elections, freedom of speech. And they speak about how it’s a problem for them. It’s language that inspires their own domestic oppositions. They see it coming from abroad, both from their own exile communities but also from us.
And so they have come to understand that they need to undermine those ideas wherever they are—again, promoting their narratives not just at home but around the world: in the Global South, as I’m sure Jamie can speak about, but also in the United States. There are a couple of new elements that make this different from 2016. One is it’s much more sophisticated. You know, the Russians are now engaged in an international project of information laundering. So they seek to get their messages to people not only through RT or through official Russian websites, and not only through Facebook, but through, for example, magazines and newspapers that they have taken control of in Latin America, and through media in Africa.
They also use websites that purport to be British or French—I think we’re going to talk about that a little bit later too—and they smuggle their narratives into those. The idea, I think, is to take the narrative that autocracy is good and democracy is bad—this oversimplified version—and make it appear native, so that it appears in all kinds of places, in many different languages, in many communities.
The other element that is not entirely new, but is much stronger than it was, is that they have clear allies inside the democratic world. They have people in the United States, certainly in Europe, who also pick up, retweet, repost, reuse their narratives, and make the same argument. So they have found a really willing audience among people who want to hear that the system is degenerate, it’s decrepit, we need a revolution, we need destruction, we need to end the European Union, we need to end NATO, et cetera, et cetera. So that’s an introduction to the idea of the article. And I’m sure we can talk about more details as we go along.
TEMPLE-RASTON: So in 2016, one of the things that we really saw were personas that were built up within social media. They would actually be trying to gather an audience and that sort of thing. The big switch has been this idea that you don’t necessarily need a social media following to get people to believe your narrative. Instead, you inject your narrative into social media followings that are already there. There’s a group called Doppelganger, which has been around for a while, but it’s basically a Russian-backed group that is a combination of commercial entities and Russian intelligence. And I wonder, Jon, have you looked at this at all? Can you talk a little bit about how that has changed?
BATEMAN: Yeah. I think we’ve seen a lot of shifting tactics by the Russians and by the others. And much of this is in response to the U.S. and Western response to the initial round of Russian and other foreign activity. So, post-2016, post-2017, of course, there were huge revelations about all this foreign influence and interference. And as a response, in the United States there are now hundreds of people in the U.S. government, in the intelligence community, in the major platforms, major news organizations, civil society, philanthropists, practitioners elsewhere—you have organizations like the Atlantic Council’s DFRLab, Graphika, Bellingcat. You have this whole trust and safety discipline that’s been built up, a huge intelligence and information-sharing edifice.
And so much of this has been trained on trying to identify, monitor, stop, or limit the tactics that we saw in 2016 and subsequently. And so then, of course, you would expect some tactical shift. And the intelligence community has documented this—different attempts by the Russians and others to be more indirect, in some cases. So we saw a few years ago that instead of developing their own personas out of Moscow, they were using some people in Ghana to do this for them. This was a kind of short-lived and abortive tactic, but it was an effort to get around some of the initial guardrails and surveillance that the United States and others had set up.
I’ll say that when we talk about this problem it’s worth just taking a step back and thinking: What do we really know? And how do we know what we know? I think a lot of what we know is about the activity that is being perpetrated against the United States and about the activity that folks in America and elsewhere are doing in response. We actually know very little about the effectiveness of these foreign disinformation actions and the ultimate influence they have, quantitatively, on our elections and on our society—and, by the same token, about the effectiveness of many of our responses. It’s still an enormous research gap. So in many ways, we’re still seeing through a glass darkly.
TEMPLE-RASTON: Because it’s too early to actually tell? It has a long tail, is that the idea?
BATEMAN: To some extent that’s right. So I guess it depends on what effects you want to measure, right? If you want to immediately measure, for example, the impact of Russia’s 2016 operations on the 2016 election, you know, first of all, we were very poorly postured to evaluate that. But some of the best empirical studies that have been done do cast doubt on whether that was a significant influence on the U.S. populace. Now, it was such a close election that almost anything could be said to have been decisive, in that sense. But subsequently, we really lack the data in many cases, and the kind of long-term, longitudinal studies, to understand all the second- and third-order consequences. Because the information environment is so complex, there are so many things clashing and combining together, and there can be paradoxical effects. So we’re still trying to understand that in many ways.
TEMPLE-RASTON: So, Jamie, tell us about the Global Engagement Center and some of the things you’re doing—Anne mentioned Africa—like the Africa Initiative.
RUBIN: First, let me say that we have to start by understanding the essential asymmetry that Anne’s article talks about, that all of us need to appreciate. That is that the days when we watched Tiananmen Square live on television, when we watched Yeltsin get on the tank in Russia during a revolution, those are over forever. There is a wall that is blocking the authoritarian countries from the rest of the world. It’s called the firewall in China. Russia is building a wall as a result of the Ukraine war, and has almost completed it. And so we need to understand that that asymmetry means that when Russia and China are operating in the rest of the world, it’s an advantage for them. It’s free for them to operate in the rest of the world, while we have a much more difficult time getting our message inside those countries. So that’s point one, asymmetry.
Point two, measurement metrics. You know, I’ve been around Washington a long time. And one of the horrible phrases that I now hear over and over again is metrics and measurements. We think we have to measure and metric and meter out every single thing. Some things are obvious. To me, it’s obvious why the Global South can’t be on Ukraine’s side in a war that is so clearly, obviously the invasion of a neighbor with hundreds of thousands of troops, for no good reason, begun one morning when Putin woke up. And yet, much of the Global South is still hedging. Why is that? Well, to me it’s obvious that a decade’s worth of information manipulation by Russia and China has affected the populations in the Global South, broadly speaking.
Some of that may be related to anti-American foreign policy, colonialism. You know, some of it may be that the ANC’s people were connected to the Russians. But overall, I think we just know in our gut that for a war that is so obviously one-sided to be questioned in so much of the world is a result of the tens of billions of dollars the Russians and the Chinese have spent around the world. And we need to be very candid about this. We have been on the back foot. We thought that after these media tools were created in San Francisco and Silicon Valley, somehow democracy was going to spread naturally around the world, and they were tools of truth.
And for many, many years, we were promoting them. Foreign policy initiatives of our government were to teach other governments to use these tools. Well, these tools have now been turned against us. And there is a deep and dark side to them. And Russia and China have figured out how to use them to block their own countries’ access and to use them against us. I’ll very briefly mention the Africa Initiative, because it’s the thing I’m most proud of having done—I’ve been at the GEC a little more than a year. And we discovered—and you can imagine how we discovered it, because we work very, very closely with all agencies of the U.S. government—that Russia was planning to cast doubt on Western medical research and Western medical activities in Africa.
First by suggesting that it was American research causing dengue fever in West Africa, then using the famous trope that big pharma was testing drugs on Africans. All of which was intended to cast doubt on Western medical programs like PEPFAR, the most successful American program in history, something the Russians obviously hate. And think about the effect. This is a case where disinformation kills. People who would read, see, or hear that disinformation might be deterred from getting Western medical care that could save their lives. We were able to expose it before it got started. You know, inoculation is, to me, the most effective way to deal with these essentially covert operations of the Russians—to get them out early, so that people and journalists and governments know, when they start seeing something that looks unusual, that it’s coming from the Russian government, from the Russian intelligence agencies.
We were able to name the individuals, where they worked, how it started, and we believe we had an effect on that program’s efficacy. We did that also in Latin America. But I’ll stop for the moment, simply to say that we need to realize that our adversaries in this conflict that Anne has so eloquently described—between authoritarianism and those who believe in democratic values—believe information is a crucial aspect of their government statecraft. They do it at the highest levels of government.
Think about what the Communist Party is. It is propaganda. And in Russia, just think about the fact that the Protocols of the Elders of Zion was created in the Kremlin when it was under the czars. So disinformation by Russia and propagandist information dominance by the Chinese is what they’re seeking. And they are investing huge sums of money in it. And we are really just recently becoming aware of the danger, the risks, the consequences. And starting to, I hope, have an impact.
TEMPLE-RASTON: Right. And what you’re essentially talking about is intervention. And I wanted to talk to you a little bit about this, Jon. You’ve done a lot of research on this, trying to figure out these metrics. Can you talk a little bit about that?
BATEMAN: Yeah, sure. And just to follow up on the point that was just made: it’s great to have metrics, but that just can’t be a stopping point in the conversation. If we can’t measure something, that doesn’t mean we can’t take action against things that are obviously concerning. So that was kind of the spirit of some research that we did at Carnegie, where we wanted to get a better handle on what actually works to counter disinformation, whether it’s foreign, domestic, or otherwise. And so we looked at a wide range of interventions that have been proposed.
Some of them are very tactical. So with social media companies, there are takedowns of fake accounts, right? That’s a very well-known intervention. Or the application of fact-checking or labeling to content on social media. People also talk about reducing algorithms’ focus on engagement, and potentially even increasing privacy protection so that there can be less microtargeting, less of the very sophisticated advertising and, again, kind of engagement farming. So there’s a cluster of social media-based interventions that are all very familiar to us. There are also other things that maybe don’t get as much attention—media literacy education, supporting local journalism, other things that the government can be doing as far as, you know, pre-bunking or inoculating the populace.
And so the goal of this research was to ask: What do we actually know about any of these things? And the research is generally sparse. There’s kind of fragmentary social science in each of these areas, as far as hard empirical data. But I think what we can say at a high level is that there is no silver bullet. There is no single technocratic intervention that’s really going to eliminate disinformation as a problem. It’s been with us for millennia. It will always be with us. We’ll always be arguing about what’s true and false.
But there are some things that seem to be more important that we haven’t invested much in. There are two I’d mention. One is investing in local journalism. That seems to have a huge effect on people’s levels of civic knowledge, civic engagement, civic trust. Those are all things that we know make people resilient to false information.
TEMPLE-RASTON: What does that mean, investing in local journalism?
BATEMAN: Just financially. I mean, the numbers are staggering. And, you know, I’m sure Anne knows this as a writer and someone who’s involved in journalism, but more than 2,000 newspapers have shuttered in the United States in a twenty-year period. The number of jobs in newsrooms at newspapers is less than half of what it was twenty years ago. The revenue is less than half of what it was twenty years ago. We know that newspapers, and local newspapers, are incredibly important for creating a connective tissue that then helps build knowledge of local issues. And when that’s gone, people look to less important—or, I should say, less trustworthy—sources, whether it’s social media rumors, friends and family, or TV news, which in many cases is less accurate and more sensational. So really what we need is a massive injection of money into this sector. And it’s extraordinarily expensive. So there are different ideas that have been floated for how to seed that.
The other thing I just wanted to mention briefly is media literacy education. You might ask what that is too, because that can come in all different kinds, and shapes, and sizes. It can be a class in primary school. It can be a module that’s given to older people. It can be an online ad encouraging people to think about the kind of advertising or narratives they might come across, and how to better discern true from false. The task of being information literate has vastly changed over the digital era. And there are actually programs that seem to work. It takes time. It takes money. But I think some of that is actually more promising than some of the more tactical efforts that get extraordinary amounts of public attention.
TEMPLE-RASTON: Anne, did you have something to say?
APPLEBAUM: Can I just say, about local journalism, it’s a well-known authoritarian tactic or practice. So in Russia people often know very little about how their local cities or towns are run. And instead, they constantly see these kind of big geopolitical battles—us against NATO, these sort of grand stories that don’t actually touch their lives and that they don’t have a way of judging. I mean, none of us have a way of judging them. And something similar has happened in the United States, but for different reasons. In our case, it’s because local journalism went bankrupt. But you have the phenomenon of people not feeling attached to real issues on the ground that affect them, and instead being moved by, again, these kind of celebrities battling in the ether, which is really what national politics has become.
And it’s not just that it creates a better atmosphere for disinformation. It’s also very bad for democracy, because it means that people don’t feel connected to each other and they don’t feel connected to reality, or really to their own issues. So you’re right that it’s very central. I would also just say one thing about measurement. We actually don’t really know how any information moves anybody—good information or bad information, true information or false information. So the fact that we can’t pinpoint exactly what this Russian narrative or that Chinese narrative does is not surprising.
TEMPLE-RASTON: We can see it. It’s spreading. But we can’t see what its effect is.
APPLEBAUM: We can see it spreading. And, in the case of Russian narratives, we could hear members of Congress repeating stories that weren’t true. So we know that there was some kind of impact. Senator Thom Tillis—he’s a Republican senator—spoke about hearing his colleagues talk in the Senate about how we don’t want to give money to Ukraine because Zelensky will spend it on yachts. Zelensky doesn’t own any yachts. But there was a false story about yachts, which included some pictures of other people’s yachts, that did spread on the internet. And clearly, some members of the Senate had seen it.
TEMPLE-RASTON: Yeah, he certainly wasn’t the only one. Yeah.
RUBIN: Let me just add a point about the difference between Russia and China, because there is, generally speaking, a pretty big difference. Russia—disinformation, lies, biological weapons, yachts—that’s their bread and butter. That’s what they do. Their playbook is not often that different. In Africa, they were using disease in the same way they once claimed the CIA started AIDS. So their playbook is pretty well known, and it’s based on lies.
I would say, broadly speaking, China is using domination—just financial and physical domination. So take Honduras, a small country. It had a change of government. They took the entire press corps of Honduras to China for two weeks on a paid trip to indoctrinate them in how wonderful China is. And then gave them access to Chinese media, which is not filled with lies but presents facts in a way that is manipulative. So every single story in the United States of a crime, or a rape, or a pollution problem, or a corruption case is played up in the Chinese Xinhua service. And in China, only wonderful things happen. And none of the things that we know make China dangerous.
Why is it that all over the Muslim world the Uyghur monstrosity is not known? This is the program about which Xi Jinping used the phrase “show them absolutely no mercy”—that comes from the Politburo meeting in which they were describing what they were going to do to the Uyghurs. Why does the Muslim world not know about that? Why do they not care about that? Now, some of it is because their leaders don’t actually really care about things they say they care about. But some of it is because China is so effective at filling the media space with their versions of things that are basically true, and fighting to the death to keep things that are also true from being seen.
And they will fight just to prevent one of the Uyghur representatives from going to a conference in some European country. Every diplomat in China’s embassy will be ordered to try to prevent that meeting from happening. That’s manipulation. It’s not disinformation. But it can have a very, very powerful effect.
TEMPLE-RASTON: Yeah. China has actually created news sites that local newspapers can fill their papers from, in the same way we used to use AP, or UPI, or one of those. They’re putting those stories into newspapers, and people don’t realize that they’re skewed.
So I wanted to end not necessarily on a happy note, but at least on a note about the danger that by talking so much about this we give it power. There’s a great article in Foreign Affairs magazine that just came out that looks at this problem of striking that balance—don’t hype the disinformation threat—and, as fate would have it, the Click Here podcast has an episode that dropped today about this. So can you talk a little bit about how we strike that balance?
RUBIN: I’d like to start with that, because to the extent that people read that as referring to the rest of the world, I think it would be a terrible tragedy, because we are not hyping the disinformation threat in Africa, in Latin America, in Asia. We are underplaying it. We still don’t have in our government a formal threat assessment of the damage done to our foreign policy interests around the world as a result of Chinese and Russian information operations, funded in the tens of billions of dollars and conducted at the highest levels of their governments.
So please, yes, there may be some exaggeration in the domestic space. And now everyone who works for me is going to tell me: Don’t say anything about the domestic space! (Laughter.) The GEC is not working on the domestic space. I work for the State Department. I don’t talk about that. I did happen to work for Hillary Clinton in 2016. And I’d love to talk to people afterwards in a personal capacity about what happened back then. But in my official capacity, let me just say that I don’t believe we are hyping the disinformation threat around the world. I think we are underplaying it. We don’t realize the extent to which it has an impact on those middle countries, those hedging countries, those countries we spend so much time trying to affect.
And we wonder why they can’t choose between the United States and China. Well, maybe one of the reasons is because they don’t know enough about what really goes on with Chinese economic projects, what really happens after the Chinese come and leave, and have destroyed a place, or done environmental damage. Or they really don’t know what’s happened to the Uyghurs. Or they really don’t understand how China has skewed markets, and then somehow is claiming that they’re a victim and a developing country. And yet they have the power they have around the world to spend tens of billions of dollars trying to manipulate things. Maybe it’s true in the domestic space, but it’s certainly not true in the field of battle I work on, which is Latin America, Africa, and Asia.
TEMPLE-RASTON: Fair enough. And, Jon, if you could just quickly give me your thoughts on this. And then we’ll go to questions from members.
APPLEBAUM: I mean, my thought is that—I’ll agree with Jamie about the rest of the world—domestically, they’re really doing the same thing. So it’s not as if one is influencing the other, as if there’s this malign Russian influence that’s making Americans do X or Y. It’s a joint project. I don’t think it’s a conspiracy. There’s not, like, a secret room where they all plan things together, or anything like that. It’s just that the language that Russia prefers to use, whether it’s about the war in Ukraine or the government of Joe Biden, is the same language that’s used by, I would say, the extreme right and the extreme left in the United States. And, as I said, they’re engaged in the same project. And therefore, they overlap with one another and amplify one another. I would describe it more like that.
TEMPLE-RASTON: And Jon.
BATEMAN: So it’s a serious problem, foreign disinformation. And I’m very glad that organizations like the GEC exist for now and that people like Anne are writing about this issue and calling attention to it. I’m so glad that the word has gotten out on those things.
I think there is a real danger of overhype, certainly at the domestic level. And I worry about this quite a bit. So one way to think about it is: in a given presidential election cycle—if we want to just isolate it to that—what percentage of political content that a typical American comes across would be some kind of foreign subterfuge? Infinitesimal. I mean, you can always trace the way a narrative kind of worms its way through politics, but often even some of the things that we identify as Russian narratives actually had an earlier origin in some domestic extremist narratives. That’s the claim that’s been made about this yacht story, for example, in the Foreign Affairs piece.
So ultimately we’re now living in a country where the two parties and their super PACs are spending billions of dollars each election cycle—far outstripping any foreign effort to directly influence us. And that’s before you start to consider the whole richness of other voices—private media, civil society, friends and neighbors, community leaders. A typical American is just awash in political information. And I just don’t think it’s realistic to say that foreign subterfuge is a major driver of American political behavior.
And I just want to say, why does this matter? Because, of course, we need to be tackling this. It’s a foreign policy problem. It’s a legitimate national security threat. And certainly around the world, it can be very damaging. But I think there are at least two risks here. One is the risk of McCarthyism. There is a risk that tarring someone as influenced by foreign boogeymen becomes one of the go-to forms of political abuse: Oh, you’re just repeating what some Russian person or some Chinese person is saying. I think that securitization of the American political discourse is a concerning trend in its own right.
And the other thing is distraction, in that I think, realistically, our political maladies are homegrown. They’ve been in plain sight for decades. Polarization, truth decay—you can go on and on through all these different kinds of phenomena. Those are the core problems. And so while we need national security officials to address the foreign-facing element of it, we shouldn’t believe that it is the central driver of our political maladies. We actually just need to heal ourselves, in many ways.
TEMPLE-RASTON: Got it. So at this time I’d like to invite members and guests, both in person and on Zoom, to join the conversation with questions. A reminder, this meeting is on the record. And we’ll start from a question here in Washington. And how about Jane Harman?
Q: I’m Jane Harman at Freedom House.
And a comment. Jamie, your idea about a damage assessment I think is fabulous. One other comment: there are metrics to measure the impact of information. People who run for political office have pollsters and, you know, various advisers who help with that all the time. But my question is about a group that hasn’t been raised. And that’s Hamas. I think that Hamas is brilliant at weaponizing disinformation. And I just wonder if you agree, and can tell us how they learned this. But also, my question is whether Hamas or some other foreign organization could be driving the protests on our campuses through information—disinformation, but also funding of tents, signs, you know, agitators, et cetera. I don’t think this has been investigated adequately.
RUBIN: I’ll just take a little piece of that, because most of it was domestic. The little piece that I would take is that, you know, I’m lucky. I was hired by Secretary Blinken. He’s someone who really does understand the information environment. And, you know, he’s given me a free run of the place, and his time and effort. We’ve been building a coalition of countries that want to deal with it in the way I’ve described. And he’s signed MOUs with various countries.
But he’s pointed out—and again, it’s, you know, a tricky subject right now, the Middle East, and I want to state this very carefully—whether or not you agree with what Israel has done in response to October 7 and how they’ve gone about attacking Hamas to prevent the possibility of that recurring, something terrible did happen on October 7. How terrible it is and how specific it is has actually been the subject of disinformation. And we know that. And we’ve seen other governments try to minimize the massive assault and the horror, even though Hamas itself promoted most of the knowledge we have by taking videos of it. So, as Secretary Blinken has pointed out in many countries, the response of the Israelis—whether you agree with it or not, and I don’t want to get into that now unless I absolutely have to—(laughs)—is to something real, and something horrific, and something that has never been seen in Israel in its entire history. And something that has been downplayed by Russia, China, and Iran, in very clear government attempts to try to alter the circumstances.
And here, we need to get used to the fact that the modern era is now one in which Russia, China, and Iran—unless they change their government policies—are choosing to blame the United States for every major crisis. Think about it. COVID. Whether you think it was from a lab or from an animal, it clearly started in China. Russia, China, and Iran said it happened in the United States, blamed it on the United States. Ukraine—Vladimir Putin decided to invade Ukraine. Russia, China, and Iran have blamed the United States, saying U.S. companies are trying to make money and that’s how the war started. And now we get to Hamas and October 7, and they are blaming the United States for a war that clearly began on October 7. So for those of you old enough to remember, there used to be something called the blame-America-first crowd. That was something that Republicans used to call us Democrats. There is now a concerted effort on the part of our adversaries in the information space—Russia, China, and Iran—to blame the United States as much as they can, in every way they can, around the world when there’s a major crisis.
TEMPLE-RASTON: Go ahead, Jon. Can you talk a little bit about Hamas?
BATEMAN: Yeah. So Hamas and its allies in the Middle East are huge perpetrators of disinformation—massive deception about the truth of October 7, what led up to it, what followed. The Middle East is kind of a mess in terms of the information environment. It’s very chaotic. People believe all sorts of conspiracies and crap. On the U.S. campus protests—let me try to give my hunch as to what’s going on here, because I don’t know either. But there was a great Washington Post analysis on this recently. If I were someone in the Internet Research Agency or a similar organization conducting propaganda in Russia or China, I would absolutely be seizing on this as a fantastic opportunity to exploit divisions in the United States and sow chaos.
But ultimately, from all I can see, this is a legitimate mass youth movement. We see these things all the time. Young people are much more favorable to the Palestinians than they are to Israel—or at least those protesting, a subset of those on campus, are. It’s not clear that engaging in some kind of encampment is really that costly an activity, or that it needs any kind of foreign sponsor. So I just have to say, as people are starting to pose this question, I squirm a little bit at the idea that a mass political movement that, by all accounts, is fairly organic and doesn’t really need much further explanation now prompts us to ask: OK, do we need to investigate this, see if there’s some kind of hidden hand there? I think that kind of orientation, in some ways, could lead us down the wrong path and create dangers of its own.
TEMPLE-RASTON: We also kind of forget what it’s like being students, right? Your choice is to go to the library and study or to hang out with your friends and talk about political things. It’s pretty easy to be motivated that way. Go ahead.
APPLEBAUM: Yeah, I would push back pretty strongly against the idea that you need anybody to organize encampments. So I was in college in the 1980s. And this was at the time of the anti-apartheid movement against South Africa. And I remember, I think it wasn’t a very big encampment, but there was a kind of shantytown built at the center of my university, as a sort of demonstration of something. (Laughs.) Unclear what. But it’s not a new idea. It’s pretty old. It’s not very expensive. And the idea that protests are somehow seeded by outsiders seems extremely unlikely to me. And I would push back against it.
Are the protests being used? Is the language of division being promoted? Yes, absolutely. And that’s been shown—I was going to quote the same Washington Post article. There was also a New York Times article that showed how, both inside the United States and around the world, images of division and narratives about American chaos, and hypocrisy, and so on have been pumped out like crazy. So, yes, of course it’s being used.
I mean, there was one incident just at the very beginning that was actually attributed to the Doppelganger network. This was in France, where there were Stars of David being painted on walls in Paris. And it turned out they were being painted by a team of Moldovans who were hired for the purpose. And that was actually exposed by a French Foreign Ministry project—a think tank that showed, you know, how that was done and why. So are they trying to provoke, you know, distress in Europe and the United States? Yeah, probably. But I would not attribute the student movement to anything but students.
TEMPLE-RASTON: Yeah. Let me take one more from the room and then we’ll take one from Zoom. Yes. Just—the mic is coming.
Q: Sorry. Thanks. Marcus Brauchli.
I wanted to just ask about solutions. And in particular, ask you to think about—talk a bit about the responsibility of some big U.S. technology companies in enabling the spread of misinformation/disinformation. And how you think the U.S., given the constraints of the First Amendment and so on, can address the responsibility of technology companies in distributing misinformation/disinformation.
RUBIN: Yeah. I’m going to start, and then Anne will probably have to finish, because I can’t say all that much. This is another case where my staff will be panicking. Because we don’t have regulation in this country—we’ve chosen not to have regulation in this country—there’s an extreme limitation on what the U.S. government can do. There are also lawsuits that put an extreme limitation on what I can say about it. However, I do think it’s worth addressing the question of what to do about it. So this is the hardest intellectual job I’ve ever had—and I’ve worked in foreign affairs for thirty-plus years—to try to figure out what role the U.S. government should have, given this set of circumstances. No regulation. First Amendment. You know, I was married to a journalist for a long time; I believe in journalism. Censorship is obviously not on the books.
So what do you actually do? Well, that’s how we at the GEC came up with these two main lines of effort. One is exposure. I’m very lucky. There’s nowhere else in the U.S. government that I’m aware of where you can have individuals who work with the most sensitive intelligence agencies in our government sitting next to people who work on public diplomacy. That was created by Congress when it created the GEC. And we’ve been able to take advantage of that in a couple of cases—limited, but successful. So one, yes, we need to use our advantage, which is the intelligence community, to deal with how Russia and China are exploiting that social media asymmetry we were just talking about.
The second thing is basic coalition building, and how to get people to think about it in a way that brings along those famous middle countries—those famous hedging countries we always hear about. And it’s taken me a while to figure out how to talk about it, but at least the current version is as follows: We need to think of the information domain as a domain like other domains—cyber, space, land, sea, air. And if we do that, we should be able to have countries believe that that domain is sovereign. So what does that mean? That means if Russia or China were to enter that domain without admitting that it was Russia or China, that would be a covert action. That would be not OK.
So if we say that the information space is sovereign, and if we get enough countries in the world—and remember, we still have most of Europe, most of Asia, most of the countries in the world—who would want to say they didn’t want their country’s information space to be sovereign? That’s a pretty hard argument to make. And once you establish that sovereignty, then you’re not asking to shut anything down. And in some countries, of course, they do have regulation of social media. They can do things—very strong things that we can’t do in the United States. And what they do may have an impact globally, if they do it strongly enough.
So I work very closely with the Europeans. They obviously do have legislation and regulation that’s just starting to kick in. It’s really just begun in the last six months to have an effect. But we can get each country in a coalition of all the major countries that are not the authoritarian states Anne was talking about to say: This is sovereign space. And if you’re going to enter my space, you have to have your name on it. You have to be labeled. The provenance of this information has to be known. Because if you boil all of this down—in Bulgaria, for example, there’s a particular issue. They have a lot of Russian speakers.
And it was explained to me, if they see on a screen, on the radio, on TV, whatever the mode—and, frankly, a lot of disinformation has nothing to do with social media; it’s done in the traditional radio, television, and newspaper mode, although maybe newspapers on the internet, but still old media, as we call it—and it says, the United States has bioweapons in Ukraine, well, a lot of people might believe that, because they watch TV, and they see stories about bioweapons. But if they saw, and they had to see, “Russia says the United States has bioweapons in Ukraine,” that would change their perception of the story.
So if we could create a coalition of countries who regard their information space as sovereign and demand transparency to that effect—that these covert operations are not allowed, and if they were conducted there would be sanctions, there would be penalties, there would be criminal consequences—I believe over time we can get at a big chunk of this. But, again, I don’t think we can solve it all through that. And until the United States makes some changes legislatively, the U.S. government’s hands are quite tied in terms of what we can say or do with the social media companies.
TEMPLE-RASTON: Jamie, do you think that’s worked for cyber? Do you think cyber is actually sovereign? Cyberspace, that’s one of the things you mentioned.
RUBIN: I think, yes. Well, we would like there to be one open, free world, but it’s not turned out to be that way. Because, again, certain countries have chosen to use cyberspace to attack us. And we’ve had to go through whole new levels of deterrence and responses and all of that to try to minimize the cyber criminality that’s taking place. And what I’m saying is, if we can establish that if you don’t have your name on it and you initiated it, that’s a crime in the information space, we can begin at least to establish a coalition approach to this, which is the only way that will work.
Not going to the U.N., where the Russians and the Chinese will ruin it, but slowly building out. We already have—let me see, is it thirteen countries that have joined us? That’s South Korea, Japan, Germany; we’ve worked our way through some of the Europeans. I think we have our first African country, Ivory Coast. I was just in Argentina, and I hope to get some Latin American countries. If we get all those countries that believe in the sovereignty of their country to say that covert insertion of narratives without a label is not OK, we can begin to address the problem—in the absence of what you were obviously implying.
TEMPLE-RASTON: So I just want to be able to—we have a Zoom question that’s been waiting quite a long time. So if we want to go ahead and listen to that, please.
OPERATOR: We’ll take our next question from Fred Hochberg.
Q: Thank you for this great conversation. Fred Hochberg. I’m also chair of Meridian International and served in the Obama administration.
My question is, can you identify which democracies—which robust democracies, and you can include the U.S.—are doing a better job at pushing back on this? So which are the strong democracies that are actually doing a good job? You know, we’ve talked a lot about Latin America and Sub-Saharan Africa, but which democracies?
TEMPLE-RASTON: I think that one’s for you, Jamie? (Laughter.)
RUBIN: All right. I’ll try, but I think I need help on this.
APPLEBAUM: I can.
TEMPLE-RASTON: Go ahead.
APPLEBAUM: So there are a couple of countries that have thought about this and worked on it in different ways. I did a project a few years ago with Sweden—with the Swedish government. And they were very interested in tracking Russian narratives as a government—it wasn’t merely, as it would be in this country, outside groups and independent researchers and academics doing the tracking. This was in the run-up to a Swedish election campaign. It was a government agency. And first of all, they were interested in tracking it. They wanted to pull together teams of people from around the world who would help them do it.
But they—and the Baltic countries have similar kinds of projects—also have, I’m not sure what the right word is, kind of civic defense education projects that they work on together. You’re nodding. You know what I’m talking about. Some of it is almost, you know, 1950s. I mean, they will send out little leaflets to people’s houses. And they seek to build ties between people. And they will talk about, you know, the need to maintain a Swedish conversation. So it’s actually part of a government project to make people aware that this is a problem, to bring them on board, and to build a sense of community around it.
I mean, certainly all the Baltic states have this, particularly Latvia and Estonia, which have big Russian-speaking communities. They’ve made big efforts to include Russian speakers in a national conversation, and to speak in public, in schools, in all kinds of spaces about how we push back against this. I’m not saying it’s 100 percent successful, but we have nothing like that. We don’t even have anything approaching that.
I would just say briefly, in answer to the previous question, I think we will, sooner or later, get around to regulating social media. Just like once upon a time it was thought to be impossible to regulate food production. You know, how could you possibly control that?
RUBIN: Tobacco.
APPLEBAUM: Sorry. Tobacco. And you think about some things that we do regulate. Think about international financial markets, where billions of dollars move around the world at the speed of light all the time. Everybody wants to cheat. Everyone wants to make money. And yet, they are regulated. And they can be regulated. And I think eventually we will get to that. And by regulation, I don’t mean censorship. I mean rather the regulation of algorithms, the use of ombudsmen. We may get to the point where we require social media companies to comply with the law, so that anything that appears on their platform, they become legally responsible for—which would change the nature of social media overnight.
It may be that we, you know, alter anonymity, or make it—make it—
RUBIN: Less.
APPLEBAUM: At least create some fora where there’s no anonymity. I mean, that’s beginning to exist already. But the Europeans are ahead of us. They do have a conversation about how this should be done. I’m not saying it’s 100 percent successful, or everything’s great. But they will eventually begin to do it. And some of their ideas will kick in. And then it may even be that U.S. social media companies will have to comply with their laws. Since we won’t do it—because of lobbying, or because of cowardice, or whatever—it may be that the Europeans effectively do it for us until we catch up.
TEMPLE-RASTON: Do you think it’ll start with the rolling back of Section 230, or do you think it’ll be more subtle than that?
APPLEBAUM: I can’t predict what will happen. I would be in favor of that, but I’m probably still alone on this for now.
RUBIN: On the democracy part, let me just say, as Anne pointed out, that the Europeans in this respect are ahead of us. And then there’s a whole subject which I hesitate to bring up, but, if we’re going to be honest about it, is messaging. When I first took this job, since I used to be a spokesman, people thought we were supposed to do messaging—supposed to fight back by, you know, answering the disinformation and telling the truth about Russia and China. And I have to admit that messaging is a governmentwide problem. It’s a problem in a democracy, in a system where one of the results of having perfect information in real time is that every single White House official is looking at every single ambassador’s statement instantly. And if there’s an adjective or an adverb that’s misstated, they get calls. That causes ambassadors and messengers to be careful, and cautious, and boring.
Meanwhile, on the other side, you can’t really get criticized for lying too much or too little. Nobody’s going to call you up and say, oh, you lied too little, or you lied too much. And so there’s an inherent advantage for them in this messaging war. I don’t have a great answer for that. We used to have something called the USIA. That was part of a Cold War containment global approach. I would like to think that the threat from Russian and Chinese disinformation and information manipulation will generate additional funding for a real effort to do messaging properly, because until we do that there’s always going to be this advantage.
And I think I can’t do it out of a little office in the State Department. That’s something that’s, you know, done by the president and his people on downward. And having done a spokesman’s job, I know a little bit about how that works. You have to start at the top. And you have to want to be in an information “war,” with quotes, with the other side. Because if you look at what Russia and China say about us every day, they’re using information warfare. They are saying the most horrible things about the United States every day, in every possible way. And the only thing that happens when we’re having a good meeting is that it just dips a little bit for those two or three days, and then it goes back to the United States is the root of all evil.
And so messaging is not a problem that I decided I could affect. Rather, it was more important to get at the countering-disinformation part that we discussed—exposing it early and getting better at that. And we’re going to get better at it. You know, these two programs that I’ve described in Latin America and Africa are going to be the first of many. And I really hope—you know, so far, we all should thank the retired General Nakasone from the NSA. For those of us who’ve been around for a while, when we grew up you couldn’t even mention the word NSA. And by the time he left, that information was much more widely used, in a sanitized version, in a downgraded version.
That’s made a huge impact. And most importantly, we were right about Ukraine. Because until that happened, we had the weapons of mass destruction in Iraq hanging over our shoulders. But I think that is the best way to counter it. And then we need other countries to care about it as much as we do. And then we can create one operational picture. And when those Russian and Chinese narratives begin, and are being sent back and forth in an attempt to create coordinated inauthentic behavior between Russia, China, and Iran, we can do something about it much more quickly.
TEMPLE-RASTON: Let me just take this last question—
APPLEBAUM: You were nodding about Sweden and the Baltic states.
TEMPLE-RASTON: Let me just take this last question. You can add to that, because we want to finish on time. And this gentleman has been very patient.
Q: Thank you. Henri Barkey from the Council and Lehigh University.
One thing that hasn’t been mentioned is AI—AI as both offensive and defensive. And how do you see this being integrated? The other point I would like to make is that, as someone who’s been a professor for a very long time, our domestic problem is one of declining analytical capability. I mean, our students—it’s shocking. It’s because of social media, because they’re distracted. They only read, you know, messages that are two lines long. And that is also something we need to combat. And I have no answer to that. (Laughter.) But anyway. AI is the question.
TEMPLE-RASTON: Yeah.
BATEMAN: So, yeah, quickly on AI. I think time will tell. What people are concerned about right now, principally, is that generative AI can be used to create very realistic, personalized false content. I will say, on those specific concerns, it’s not clear from research that realism or personalization are that persuasive to people. The stop-the-steal movement required no persuasive, realistic visual evidence. And in studies of microtargeted political ads, which we talked about earlier, they’re a little bit more effective than regular ads, but they’re not a magic bullet, by any means.
Where I’m more concerned about generative AI is in a replay of the social media story. What we saw with social media is not just that it became a spreader of disinformation itself, but also that it changed the economy, such that dollars that used to go to journalistic organizations now flow to social media companies because of the change in the digital advertising economy. So that’s the kind of frog-boiling, structural difference that I’m maybe more worried about. What are the pro-social information authorities that are going to be losing revenue to the AI companies? And what can we do about that?
And then, just quickly on the issue of solutions. I think we have to think about solutions in a social and political and cultural context. So the Northern European and the Baltic states have done the best in many ways. But they’re also the best postured in many ways. They’re fairly small and internally cohesive. They’ve been monitoring these threats for decades because of their proximity to Russia. We don’t have those advantages here. We’re very fractious. We’re very divided.
I actually think any solution in the United States that’s pitched in terms of countering disinformation is probably dead on arrival, in terms of bipartisan consensus. What we really need is to reframe some of these things around kids, privacy, antitrust. Those are some of the framings that could help us get toward solutions—and transparency of tech platforms—that might actually get at this problem, without having to say this is fighting disinformation. Because that’s kind of just DOA.
TEMPLE-RASTON: Let me just quickly say something about AI. And that is, right now AI is very good at spotting AI. And so there are a number of AI tools that we talked about in the podcast that have been very good at spotting any little modicum of AI within a piece of content. One of them I just want to give a plug to. It’s called TrueMedia.org. It was put together by a guy named Oren Etzioni, who is in Washington. You basically sign up for it—it’s been out for about a month, and ten thousand people have already signed up—and you upload whatever it is you’re suspicious about, and it will give you some sort of scale to say whether or not it’s AI-manipulated or AI-generated. And we’ll be hearing more and more about that, I think.
And, Jamie, you wanted to—
RUBIN: Just ten seconds to end on a more optimistic note. I think if you compare the response of the leading AI companies to government regulation and government interaction with that of the social media companies, you’re seeing something very, very different. They came to the government and said: You know, we really are concerned about making sure we don’t—well, I don’t want to say what I really think about this—but that we don’t have the same problem with AI technology that we created with social media technology. And the company leadership, the intellectual leadership, is much more willing to consider and develop, you know, joint private-public ways of working together, which gives me some hope that it might be a little less dangerous than it would otherwise be.
TEMPLE-RASTON: Well, we’re at time. Thank you for joining today’s meeting. And I’d like to thank Anne, Jamie, and Jon for joining us today. I’d like to recommend Anne’s excellent article in the Atlantic, “Democracy Is Losing the Propaganda War.” Foreign Affairs has a great article, “Don’t Hype the Disinformation Threat.” And, as fate would have it, Click Here has a new episode out today that also deals with some of these issues.
Please note that the video and transcript of this session will be posted on the CFR website. And we’d like to thank you very much. And please join me in thanking our panel for joining us today. (Applause.)
(END)