Digital Diplomacy's New Dawn—Decoding Foreign Disinformation and Fostering Resilience

Friday, May 31, 2024

Speaker: Liz Allen, Undersecretary for Public Diplomacy and Public Affairs, U.S. Department of State

Presider: Suzanne Nossel, Chief Executive Officer, PEN America; CFR Member

Liz Allen, U.S. Undersecretary of State for Public Diplomacy and Public Affairs, discusses her role in countering disinformation, combating foreign malign influence, and fostering a resilient global information space.

NOSSEL: Thanks so much. Hello, everyone. Great to see you and glad to be here. I’m Suzanne Nossel. I’m the chief executive of PEN America, and I’m delighted to be presiding over today’s discussion with Undersecretary Liz Allen. Everyone has her bio. And so we’re going to kick it off, and I’m really looking forward to an important conversation on our information ecosystem and how it fits into our foreign policy and a whole series of challenges. 

For me, just kind of a word on how I come at this conversation. At PEN America, it was after the 2016 election that we really started as a free expression organization to hone in on the problem of disinformation and the kind of contradiction or tension between the idea that most disinformation was a protected exercise of First Amendment rights—you couldn’t clamp down on it, you couldn’t outlaw or punish it—and yet it also had the potential to degrade free expression profoundly. Because free expression rights are not just the right to scream at the top of your lungs, of course; it’s the right to engage in a give and take, to persuade, to elevate truthful information, and all of that seemed to be at risk. 

And so we’ve been involved in trying to tackle this problem in the ensuing years, and it’s complex and it’s daunting. One of the things—I’d say our first insights was you can’t really control the supply side— 

ALLEN: Yes. 

NOSSEL: —you know, as a free country. You know, turning off the spigot of information is pretty much impossible. And so, you know, then you confront this incredibly protean, enormous, varied, oftentimes shadowy realm of how people get information. 

And so Liz is right on the top of it, and I want to— 

ALLEN: Lucky me. 

NOSSEL: —just open up by asking you to give us an overview of how you think about this problem, whether you think it’s solvable, and how. 

ALLEN: Well, thank you. Thank you for that. 

It’s great to be here, everybody. Thanks for spending some time on your Friday with us. It’s a beautiful Friday. We’re eager to talk about this issue, to hear from you; frankly, think about—hear from you about how you’re thinking about this not just as a foreign policy matter, but frankly, many of us are managing issues of information consumption/disinformation in our own lives with friends, family, and ourselves as we think about information diets. So very eager to hear from folks. 

I would say that in the last about two years the State Department as a cross-departmental effort has really taken a new look and a much more holistic look at the issue of the information space, as we call it, to your point. Disinformation, I think, for a long time—as Suzanne has laid out—was thought of as a content problem, right? Can we just get a better Twitter strategy—(laughs)—and overcome this? Can we rely on the platform companies to moderate dis- and misinformation? And what we do know, again, is that that spigot cannot be turned off. And AI only stands to turbocharge that spigot as we heard yesterday, frankly, from OpenAI, which, for those of you who may not have seen, OpenAI put out a voluntary report yesterday saying our tools are being used by Russia, Iran, China, and Israel to perpetrate disinformation campaigns. So this is a mainstream problem and it’s a structural problem. 

And so the thing I would offer to all of you, to table-set this conversation, is that we’re now thinking about dis- and misinformation as a matter of national security, right? It isn’t a matter of communications only. It isn’t a matter of intelligence only. It’s a fundamental structural issue that applies to every other issue set. Because our view is if we are not operating on common truths, we can’t hope to solve any other problem. 

So my colleagues who have worked, for example, in biological and chemical weapons have been subject to disinformation narratives about those issues. My colleagues working in public health are, of course, subject to disinformation narratives and foreign malign influence on their issue. So every person working on matters of policy stands to benefit or lose from how helpfully we can encourage the information space to thrive. And we know we’re up against really daunting equities here. 

So what we know to be true is that we have to look at this much more holistically than we ever have before, and we have to look at the supply side, and we have to look at the demand side, and we have to look at resiliency. So what I would offer to everyone is that for people in the foreign policy apparatus who’ve long thought about the, quote, “information war”—and we are in one—and those who have long been on the tip of the spear working on democratic institutions, on the sort of free media/free expression side, we’re now having to marry all those lines of effort to work together. They’re not discrete lines of effort; they’re lines of effort that contribute to a healthier information space. 

So we’re trying to do three things. 

We’re trying to disincentivize disinformation and foreign malign influence to begin with, and that’s kind of the defense part of our strategy where we analyze, expose, and sometimes try to pre-but malign influence campaigns and disinformation narratives, which we can get into, right? We have to try to disincentivize actors like Russia and China in particular from using information as an influence weapon, which they are doing rampantly. But with tools like sanctions or preemption or declassification of intelligence, we can start to chip away at that. So that’s the disincentivization piece. 

At the same time, we have to incentivize better, healthier information to get into the space to begin with. So I would put, for example, our efforts to support independent media into that incentivization of truthful investigative information with integrity to get into the space. And by the way, for those of us who work in government or in trusted institutions, it’s a bigger imperative on us to have better, more proactive strategic communications campaigns, right? Messaging is no longer an afterthought. It can’t be. So the imperative for us to move faster with better information, with more results-oriented information sooner is on us in terms of incentivizing better information. I would say there’s no better way to counter disinformation than to actually get ahead of it, and we know that governments in particular don’t always move fast enough to do that. So we’re working on that. 

And then, finally, to your question about whether this is winnable, to the extent I think we have opportunity, it’s to actually make audiences more resilient, right? So while we’re actually working on disincentivizing bad information/incentivizing good information, we have to help audiences understand their role as an information consumer. So some of the most promising policy prescriptions that we see are, for example, media literacy and digital literacy efforts. 

And this is, indeed, a daunting challenge. But I think to the extent we hope to make a difference, it’s with all of those things working together, not just one silver-bullet solution. 

NOSSEL: Yeah. I like that three-part formulation: to disincentivize bad information, elevate good information, and empower people to be able to tell the difference. 

ALLEN: Yeah. 

NOSSEL: Let’s start with disincentivizing bad information. I was reading some of your materials in that the Global Engagement Center, which has the job of rebutting and countering these disinformation campaigns, has put out, I think, 750 reports, and it’s on things like Chinese influence over news outlets in Africa or Russian efforts to skew Latin American opinion on the war in Ukraine. And so you know, it’s gratifying when that sees the light of day, and you can pull away the curtain and show who’s behind these campaigns, but do you think it works? I mean, we’ve learned on social media that fact-checking sometimes perversely can only elevate a false narrative. People see the fact check and that intrigues them, and then they go and ferret out the underlying facts, and who knows what they believe, and they may be just as likely to credit the underlying as they are the corrective. And so I’m wondering, you know, how you think about that. And are you able to analyze the implications of these exposés and find out how that actually, you know, steers the information that you’re trying to, you know, push off-course? 

ALLEN: Yeah. This is a great question because we often think: Are we doing ourselves more harm than good by adding oxygen to disinformation narratives, particularly with the voice of the U.S. government, right? The thing I should note here is that this effort, this fight, this policy solution is one in which civil society, academia, independent media, and government all have to work together, because I have very clear eyes about the credibility and reach limits of what the U.S. government is saying. And frankly, to your point, we have to be really careful about when we choose to, for example, pre-but or preempt a disinformation campaign, because we better be sure that that information is truthful, right? 

I think for a lot of us, you know, the declassification of intelligence around Russia’s invasion of Ukraine in 2022 created a lot of appetite for the U.S. government to be expected to know everything ahead of time and to be able to pre-but it. And the reality is we’re going to get it right most of the time, but we have to be careful about, I think, setting that expectation that we’re going to be able to do that every time. And so when we think about, you know, exposing or preempting disinformation campaigns, I would put it in that broader context of: Does it meet a threshold where we are going to do a benefit to the public good, where we are so sure that this information is true that we feel OK about exposing it, and that we’re not overplaying our hand on the tactic? 

So what I would offer to you is while the Global Engagement Center, which is—who’s familiar with GEC? Anyone? OK, not too many. So we have, like, four bureaus in my public diplomacy family in Washington. The GEC—Global Engagement Center—is charged with the interagency response to countering foreign mis- and disinformation and foreign malign influence. We do not work in the U.S. domestic space, just to help people ground in the conversation. 

And in the last year we’ve done a preemption campaign in Africa, to your point, and we’ve done one in Latin America—a Russian campaign in Latin America, a Chinese campaign in Africa. And in each case, we felt like it met a threshold to either benefit the public good—because in the case of Africa, it was about public health disinformation that was going to actively give bad people—excuse me, people bad public health advice—or we see a public good in making sure that we’re exposing that. In Latin America, it was going to be that legitimate independent media outlets were being used to pilfer Russia disinformation through a hidden hand without their knowledge. 

So there has to be a threshold question is what I would say. And part of the question of does it work isn’t that we expect the disinformation not to travel as much; it’s that we are putting Russia and China on their back foot to know that we’re onto some of their tactics. And there’s a benefit in that as well. 

NOSSEL: That’s interesting. I mean, is there evidence that that kind of stigma actually—I mean, I was—I was interested that you used the word “disincentivize” because I could see how these tactics, you know, would, in their calculus, you know, say to them: Look, sometimes we’re not going to succeed, we’re not going to get away with it. So we’re going to get caught, you know, 30 percent of the time. But you know, do they see a penalty? I mean, is that 30 percent—does the 30 percent discredit them, or make the other campaigns more difficult, or in some way shape their future behavior beyond just saying let’s do two hundred instead of one hundred so that we— 

ALLEN: Right. 

NOSSEL: —you know, we increase our overall number of hits? 

ALLEN: I think we see a benefit to putting their tactics into question and into play. If the, you know, Russian intelligence services who are behind a lot of these disinformation campaigns, for example, know that we know what they’re planning, then they’re by definition forced to try to—try to adjust their tactics in a lot of ways. And we see some benefit to that, right? That’s different than the benefit of are we preventing this narrative from being spread. We’re trying to watch both, but again, it’s hard to—hard to prove a negative. 

NOSSEL: All right. I want to talk a little bit, Liz, about scale. You know, one thing we came to at PEN America—we were a tiny organization, but we were looking at this question of resiliency in relation to the 2020 election and, you know, we found, you know, you can train people to better discern the authenticity of the news they were reading, but it was impossible to reach anything like scale. So where we started to focus was on newsrooms and journalists as kind of vectors, that they—you know, if they can be equipped to deal with disinformation in a more considered way in their reporting, or if they’re targeted by disinformation, then, you know, you can have some ripple effect. 

But you know, I—in reading through some of the information about your programs, it’s clear you’re training individuals, you’re doing exchanges, you’re working with journalists. But what struck me is how we’re out-resourced dramatically, particularly by the Chinese, and that even at the USG level it doesn’t sound as if we’re anywhere near approaching scale. And sort of how do you—how do you think about that? 

ALLEN: This is a good question because for a lot of us who have worked in foreign policy I think you can agree that in the face of such daunting challenges we find peace in the idea that incremental change is still change. But that’s actually not satisfying on this issue set—(laughs)—because it’s expanding so rapidly and the stakes are so high. 

Here’s a couple things I would say about scale. 

The first is that I have found a lot of purpose in naming and diagnosing this challenge, and that’s partly what we’re trying to do from the State Department. It’s why Secretary Blinken gave a speech in March at the Summit for Democracy in Seoul. It’s why we’re trying to have conversations like this with people who, you know, have access to networks who are thinking about what kind of policy solutions and contributions they can make on big structural issues like this. And what I think for too long happened is disinformation became synonymous with communications and content, and what we’ve needed to do in this moment is to say to the broader foreign policy apparatus, and our allies and partners, and academia, and NGOs this is a big structural problem and we have to do the naming and diagnosing first, because we can’t hope to scale programs and tactics if people don’t understand why it matters or what they can do about it. So I wouldn’t want to underestimate this phase of diagnosing that I think we’re still in and making some progress. 

Secondly on scale, I would say there is an enormous appetite from our own allies and partners to work together with us on these issues. I think much like we’ve long had strategic dialogues around, you know, economics and trade, or we’ve had strategic dialogues on science and tech, we are now having MOUs and strategic dialogues on the issue of information integrity. So we need to put policy structure around this to then have to scale it, because the U.S. government certainly cannot be the ones to scale this alone. But if we combine with all of our allies and partners around the world, I do think we have some hope, right? So there was a reason why we signed seventeen MOUs just in the last year, frankly, on this issue. Secretary Blinken just signed the seventeenth yesterday in Prague. I signed the sixteenth last week in Kosovo. And so the scaling, I think, is going to come in governments not just working together, but governments working with civil society on doing this. 

I will say in terms of effective programs—I mentioned this—I think media literacy has a lot of promise, because we aren’t playing the same game as Russia and China. It’s not just that we’re not spending as much money or that we could never imagine spending as much money; it’s that we’re playing a different game, right? We are bound by truth and credibility as our foundational bottom line. They are not. So we shouldn’t pretend that we can keep up. The question is, how do you educate people to tell the difference, which is what you said? 

And so one of our most promising scalable programs is actually working with governments around the world to incentivize media literacy programs to start early on in education, right? And this is where USAID is actually doing a lot of work too, right? The State Department is rarely involved in elementary school education around the world, but AID is. And I think teaching people—teaching students to be information consumers just as much as you’re teaching them math and science should become the norm, frankly. 

The last thing I’ll say on scale, just to give people a sense of what some of the solutions are. You think about an area like the Pacific Islands and Southeast Asia. Extremely contested areas between the U.S. and China, right in the middle of great-power competition, although certainly we try not to say it like that because they don’t want to be put in the middle. But there is a dearth of good information in a lot of parts of the world, so let’s just take that one as an example. We started—the State Department—about two years ago a Southeast Asia Free Media Initiative where we worked to make sure that the AP or Reuters newsfeeds could get into those media markets, and we are funding that. But again, that’s one example. 

So when we started working with our partners in Japan and South Korea—both of whom we have MOUs with, signed in December—we said, hey, one thing you could do working together is provide your free media, your independent media, access in other parts of that region. And we’re looking at that, and they are. 

And so, again, just to give you an example, we can start with pilot programs. We can start with templates and then work with countries around the world to try to do the same. 

NOSSEL: Talking about scale, I want to turn to the role of the tech companies. I was looking at a speech that Secretary Blinken gave on the information ecosystem in March, and you know, he talks about many of the points and areas of activity that you cover, but he also speaks about tech. He says, we’ve invited the tech industry to take steps like improving transparency of algorithms, establishing political ad policies, developing indicators for gauging accuracy of news sources. And you know, it’s that verb “inviting” that really caught my eye. Alongside my work at PEN America, I’m also a member of the Meta Oversight Board where we play a role in adjudicating complex content moderation questions across Meta platforms. And I’ve learned that “inviting” doesn’t mean much—(laughter)—a lot of the time, and that, you know, these companies have their own incentives. They’re sprawling. The way that their decisions get made is opaque. And yet, they are, you know, without question the primary vehicle and vessel for the kinds of information operations that you are talking about. 

So I’m wondering sort of how you think about that from a regulatory perspective, whether you think the EU—you know, we’ve been more or less paralyzed in this country to take any kind of legislative action or regulatory action on issues like the spread of disinformation. That’s not the case in the EU, where they now have some powerful instruments that are being implemented. And so I’m wondering kind of how you see that. Do you think the EU approaches are beginning to work? Or, you know, if not, do you see other tools and methods to lever a more proactive and affirmative role on the part of these companies? 

ALLEN: A couple thoughts, and then if you’re willing would love to hear some perspective from your role to the extent you can talk about it. 

I think what I would make the distinction on the word “invite” is that the opposite of that; what we’re not trying to do is compel. The U.S. government should not be compelling content moderation because that makes us the arbiters of truth in a way that I don’t think our institution is set up to do or really any institution is set up to do, which is, frankly, one of the central challenges to this entire problem. So it’s important to be clear that the U.S. State Department, our Global Engagement Center, our public affairs officers do not, cannot, and in my view should not be compelling content to be taken down on any platform, because it does get into issues of free speech. And we could spend a whole other session on how this issue generally, particularly domestically although not my current portfolio, is ripe for litigiousness and politicization, right, because of how adjacent the issues of censorship are, right? And so we know that. 

But what is important is that the social media companies have their own terms of service, and their terms of service state that users and accounts should not and cannot be misleading, should not be conducting foreign influence campaigns. And so where we have an invitation for collaboration with the platforms is on us helping share information with them that they can then decide either violates or not their terms of service, right? 

And what I would say in a practical matter on what this actually looks like is let’s just stipulate, whether we like it or not, that we can’t boil the ocean on solving the, quote, “disinformation challenge,” but we can make progress, I think, on the foreign malign influence part of disinformation. We can work together to, you know, again, reduce how much Russia, China, Iran, and others are able to use the platforms for their own campaigns. You know, the platforms themselves are interested in doing that. 

Similarly, I think they have a commitment to really tackling the hate speech part of a lot of this, right? Hate speech, you know, incitement of violence, that’s a different kind of disinformation and misinformation than others with less potential for harm. And so when they come to us—because we have invited them—and say please help us better understand what’s happening, we want to be partners in doing that. 

NOSSEL: I mean, how do you distill the foreign malign influence dimension of the problem that you feel, I think rightly, better situated to go after and on firmer ground to go after—but how do you distill that in a(n) ecosystem where it’s so fluid? And you know, if you’re looking at, you know, something false, you don’t know the origin of it. And you know— 

ALLEN: Yeah. 

NOSSEL: So can you act on that? Can you ferret it out? Or, you know, do you have to desist because it’s sort of free speech? 

ALLEN: Yeah. I mean, look, I’m curious what people think about this in the discussion. There’s a difference between false information, right—misinformation—and willfully manipulated information. And I think that’s an important distinction, number one. 

And number two, we actually can in many cases understand the origin or we can understand the backend, you know, technology behind, for example, bot farms, or, you know, fake accounts, or things like that. So when we have information that helps make the case, we can do that internally and we can share, you know, what we would otherwise share publicly. 

NOSSEL: Yeah. I mean, one observation I would make, you know, to your question what I have observed at Meta is in the EU regulatory context, you know, for the most part they have tried to skirt that line, and avoid being an arbiter of truth, and instead double down on requiring that the companies implement their own systems, and audit those systems, and be rigorous in ensuring that the standards that are articulated in those systems are actually met across these sprawling platforms. I think there—I think there’s some power in that. You know, there are all kinds of excuses and flaws and lags, and you know, the incentives, frankly, here in this country for doing that are not all that great. And we see that in the context of our work at Meta, that there are a lot of enforcement areas; that, you know, there are many times where the, you know, things sort of slip through the enormous cracks, and that’s why the case comes to us. And so I do think, you know, there can be a part to play for regulation—whether in this country, you know, or not an open question, but in terms of trying to push some rigor in how those policies are actually implemented and enforced. 

I’m curious, Liz, about a piece of this that you talk—you’ve talked about publicly, which is sort of how the U.S.—and you touched on it in your first answer—sort of how the U.S. communicates itself in its own voice, and sort of the complexity of that, you know, in this era where institutions writ large are all distrusted, where our government I think is particularly distrusted, you know, and yet, you know, must be a voice. And you know, there have been periods of experimentation, whether it’s chiefs of mission doing their own social media or, you know— 

ALLEN: That always ends up well. (Laughter.) 

NOSSEL: —U.S.-sponsored campaigns. What have we learned? What have we—like, what have we learned? What do you see? What do you actually think is effective? And you had two phrases that you mentioned in something that I saw online, audience analytics and people-centered storytelling, which seem to be pieces of sort of what you’ve distilled as meaningful in terms of how strategically the U.S. uses its voice. So maybe weave those in if you want. 

ALLEN: Yeah. This is—this is fun, because I think if we have any communicators in the room you’re sensitive to the idea that, despite their best intentions, a lot of times our policy colleagues say we have a messaging problem. And I’m like—(laughs)—well, we also have a policy problem. So—(laughter)—and those are sometimes synonymous, and sometimes they’re not. And what I would offer, though, is that we have to make sure we are not looking at communications and messaging as a silver bullet to overcome legitimate policy challenges or areas in which the U.S. just simply cannot deliver on what we’re trying to promise. 

What we know now is that audiences around the world—and that’s where most of our State Department analytics lie, so I’ll say nothing of the U.S. domestic audience—have very little patience for the gap between rhetoric and results. And I think the U.S. for many decades was carried on our values with a degree of benefit of the doubt of what we were saying. That just isn’t true any longer, and some of that’s related to our own actions, and some of that’s related to allegations of hypocrisy where that may or may not be true, and some of that is actually just a reflection of the modern media environment. And most institutions are struggling to communicate beyond their values to actually show results. The question is, to what end? 

And so part of what we’re trying to do as a strategic communications matter and a messaging matter is to better help people understand the benefits, the results, the tangibility of what the U.S. foreign policy priorities are, right? I was just in an eight-day, four-country road trip through the Western Balkans. Fascinating, and we covered a lot of these issues. And one of the things I heard over and over from audiences in those countries about their EU accession prospects was we don’t actually understand what it’s going to do for us. We don’t believe it’s going to make that tangible a difference in our daily lives. So where we’ve all grown up hearing from leaders in this country about kitchen-table issues, that’s true all over the world. And so we need to adjust a lot of our own messaging to reflect the fact that people need to understand what lofty rhetoric is actually going to mean in their lives in terms of results, and we have to be quick about it. So the personal storytelling piece is critically important. 

The second thing I’d say is because we have to be humble about our own credibility and reach, and frankly decide when using our voice makes the most sense, we are increasingly working with influential voices elsewhere, right? It’s no surprise to anyone that digital influencers are reaching particularly young generations these days. 

So to give you an example, I was in Sri Lanka in March, and I—February, March? What day is it? February. And in Sri Lanka, we have a very thriving young civil society, right? Sri Lanka is a country in which our U.S. policies are actually making a difference in terms of bringing economic levers, development levers, communications levers to the—to the fore. And we’ll see what happens in terms of their tack toward the U.S. or China. But it’s important to be talking to these influential young leaders where they are. And so while I gave a set of remarks in front of six TV cameras and the traditional media in Sri Lanka, I also did an event with 150 content creators. That reached a million people by the end of that week, right? 

And so the point is, is we have to be doing both. We should not abandon traditional media. I would never suggest that. There’s always a place in the world for traditional independent media. But as the definition of media—and I’d be curious for your thoughts on this—as the definition of media widens, and the definition of journalism widens, and as we’re all struggling with what editorial control looks like, we also—we do have to be enlisting people who are already talking to very large audiences, which means not just bringing in your columnist that you read in the New York Times and the Washington Post every day, for example, to get briefed at the State Department, but also bringing in influence—(audio break). So just—that’s just a little bit about how we’re thinking about strategic comms. 

NOSSEL: OK. I’m going to open it up in a minute. I’ll ask one more question, and then please be ready with yours. 

I want to put two things together that I think are transforming this landscape, encryption and AI. As so much more communication moves into a realm where it travels kind of unseen, and as it becomes so much easier to create and propagate synthetic information and manipulated media, how do we begin—I mean, these—you know, the problems you’ve talked about are so deep and complex and sprawling as is, and you know, it just seems to me these two overlays, you know, really compound it. (Laughs.) 

ALLEN: This is—I mean, this is sort of the question for the moment, right? Is AI going to be the thing that absolutely takes disinformation and misinformation to a level beyond which we have a capacity to actually reverse any tide, or is the fact that the technology around generative AI evolving in such a way that we can actually come in and influence that technology as it’s being developed right now? And I would say right now we’re still in the latter bucket. 

And this is actually where the voluntary nature of tech companies working with the U.S. government has worked. People know that the—you know, the White House earlier this year elicited voluntary commitments on AI regulation and safety from the tech companies. And that’s an area where they said, you know, respectfully to the U.S. government: We don’t believe you’re going to be able to regulate or legislate this with the accuracy or the speed at which it would be effective regulation. It will be obsolete by the time our technology is still evolving. So what we’re doing is coming to the table willingly as U.S. companies who want to preserve our ability to innovate but recognize we have a role to play on the safety piece of this and that’s worked quite well, we think, on the AI piece.  

And I would say that we’re at the point where the AI companies are able to show that they can authenticate AI-generated content. It’s just a matter of can they scale it. And I would just say to everyone in this room, one thing that we struggle to understand is whether it’s ever possible: even if you get to a content authentication model for AI-generated content that is, for example, 99 percent accurate, does that 1 percent make the whole thing obsolete?  

That’s a philosophical discussion that’s being had in the tech community with us right now: how do you do content authentication in a way that you can be sure it’s always a hundred percent correct, because that’s a difficult number to achieve.  
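To make concrete why that last 1 percent matters so much, here is a back-of-the-envelope sketch; the daily content volume used below is a purely illustrative assumption, not a figure from this discussion.

```python
# Back-of-the-envelope sketch of the base-rate problem raised above:
# even a 99 percent accurate authentication model produces a very large
# absolute number of errors at platform scale.
daily_items = 1_000_000_000   # hypothetical items checked per day (assumed)
accuracy = 0.99               # the "99 percent" model from the discussion

mislabeled_per_day = round(daily_items * (1 - accuracy))
print(f"{mislabeled_per_day:,} items mislabeled per day")  # 10,000,000
```

At that assumed scale, a 1 percent error rate still means millions of items a day carrying a wrong authentication label, which is why the "is 99 percent good enough?" question is so hard.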

So a couple thoughts on authentication. I think there is a place in the world for that. Some of the AI companies and the social media platforms have already started doing authenticated content and labeled content related to the U.S. election and other elections around the world. So they’re starting with some very specific use cases and then we’ll see if that technology can be scaled more widely.  

But I would be remiss if I didn’t offer one sort of thought on optimism around AI and the information space because, again, we have to believe that there are some possible ways to chip away at this. 

I would argue that we should think about AI as a really effective force multiplier on proactive strategic comms as well, and we’re doing this in two ways at the State Department, if I may, because while we’re very focused on the harms of deepfakes and the harms of AI-generated disinformation, which are plentiful, we also have to be focused on adopting AI soon enough to stay ahead of the curve on how to use it to be helpful. 

So two ways in which we’re doing that at the State Department. One, language translations. Think about how many audiences around the world we are missing because we don’t have language experts sitting at a desk at the State Department manually translating content into that language, right? 

We can’t afford that, let alone find those people, let alone do it fast enough. But the opportunity to use AI-enabled tools gives us an opportunity to translate all of our messaging and our content—this is how media outlets are thinking about it, too—into languages that then reach more audiences. So I think that’s an incredible opportunity.  

Secondly, I don’t know how many of you, you know, have either gotten or get now media analysis from your teams. If you’re at the top of any organization, our ambassadors, all of us in D.C., we need to know what’s going on every day in the media, right? 

The ability for AI to help us understand what’s going on in the media every day, do media monitoring, do clips analysis, is potentially transformative for organizations like the U.S. government and probably many of yours, right? 

We just debuted at the State Department an AI-enabled media monitoring platform tool that we built ourselves over the last year, and once we scale it to every embassy overseas we are going to save 180,000 PD working hours next year because people can use AI to do media analysis and clip summaries instead of sitting there for four, six, and eight hours of time. That’s an enormous opportunity for us.  

What am I going to do with 180,000 hours of my workforce? A lot of the in-person people-to-people diplomacy, media relations, talking to exchange participants, meeting with students, that actually is what the U.S. competitive advantage is over, frankly, China and Russia. 

People want to talk to us. They want to be educated by us. They want to learn English. They want to go on our programs, and our ability to scale all of that, I think, is related to our ability to use AI to clear the space to be able to do more, and ask Congress for more money.  

NOSSEL: That’s fascinating. You know, I was just thinking as you’re saying that, you know, someone will think about how they can influence that USG media monitoring system to propagate their messages.  

But that’s the spiral we’re in and then you’ll figure out a better way to stop that from happening and to expose them.  

OK. I’m going to open it up. Just a reminder, this meeting is on the record and please state your name. Don’t put down your hand—(laughter)—because of that. 

Yeah, Yaël?  

ALLEN: Nice to see you. 

Q: Hi. Oh, I forgot we have mics here. Hi. Yaël Eisenstat with Cybersecurity for Democracy.  

I want to preempt my question with I understand if there are legal reasons why you might not be able to answer my question and I accept that.  

So one of the things that many of us have seen, and that I’ve been calling out for a while, is this multi-front assault against people working to expose the threats of disinformation, and that’s whether it is the lawsuits against civil society orgs, whether it is certain congressional hearings against disinformation researchers, but also, obviously, Missouri v. Murthy right now.  

And so I’m curious from your perspective in your seat in government, I mean, we already know—I’ll be very transparent. I’ve been on the receiving end of these threats. How are you thinking about this? We have an election coming up. There is no doubt that foreign disinformation campaigns will be—will affect it.  

But you’ve got Missouri trying to—Missouri v. Murthy trying to stifle government collaboration with the platforms. I’m just curious how you think about that piece of it and if there’s a real chilling effect on that necessary collaboration. 

NOSSEL: Do you want to just explain in a sentence what Missouri v. Murthy is? Or I can. 

ALLEN: Please. My lawyers would be happy for you to do it. 

NOSSEL: I mean, it’s basically a lawsuit that was filed challenging government contacts with social media companies, for example, when government was reaching out to social media companies to try to deal with false information about COVID. That’s one example, and the argument is that that’s a violation of the First Amendment because they’re using government authority to try to control, you know, what gets posted or taken down from social media.  

And I know the State Department, I think, is an exception to the order that’s currently in force that limits some of those contacts while this case unfolds. 

ALLEN: We were until recently. So yes, to be transparent the State Department is a defendant in this case so I am by definition limited on what I can say.  

But I think I would offer two things, generally speaking, because I did mention the sort of political incentives that exist to use disinformation as sort of a political lever when we’re seeing that, no question.  

This is where I think it’s really important, on the diagnosis of the problem, the educating people about the structural challenges and what we’re doing about it, to differentiate, for example, what the State Department’s doing, which is exclusively foreign malign influence and disinformation in overseas audiences, from what other parts of the U.S. government are doing, right?  

DHS and DOJ are in charge of foreign election interference for the United States, right, and one of the things we can continue to do is make clear to stakeholders who’s doing what because it reduces the ability for a litigious environment or unfounded accusations to have legs if people better understand what everyone’s mandates are. So we’re doing that.  

I think in my own discussions with philanthropies and academic institutions we have seen an unfortunate chilling effect on very necessary research into these issues. But, again, I think that there’s some opportunity in better scoping that type of information and so one of the things we’ve talked to some of our own partners about is how do you scope your research in such a way that you can actually try to differentiate between what’s happening in the U.S. domestically and overseas because for our purposes that’s quite helpful.  

NOSSEL: Please.  


Q: I’m not sure whether this—Bob Hormats, a former State Department official.  

ALLEN: Hi.  

Q: Hi. Good to see you again.  

Here’s the question I have and that is we know what other countries are doing. We may not know where they’re doing it to the extent but it’s substantial and it’s growing. My question is on the international side of the equation. 

Do you have conversations with other countries who we know are doing this and go to them and say, look, you have vulnerabilities too and we have the ability—I would say this pretty openly—we have the ability to create a lot of misinformation in your country about your leaders.  

Is there any possibility, or is this just whistling in the wind, of having a serious enough conversation with them so that—you went to the question of deterrence earlier on. I would say that a lot of these countries are very worried about green or yellow or pink or orange revolutions themselves.  

Is there a way of taking a much tougher view than I perceive we are, frankly, and saying we know how to play this game, we can play it better than you, and we can play it in a way that’s quite destabilizing from your point of view?  

Is there a possibility of having such a conversation? I mean, that’s not the Justice Department because they’re doing stuff here. The State Department, however, is doing the kind of the negotiating that you’re talking about.  

Why is that not there? You know, we did this during the Cold War. This is, just as you say, an information war. Why are we not doing the same thing with gusto?  

ALLEN: Can we play offense? 

Q: Yeah, a big offense and a threatened offense can be helpful to these countries, because they’re vulnerable too, and they know they’re vulnerable. They all do.  

NOSSEL: Go ahead. 

ALLEN: I’d say a couple of things, and I appreciate the premise of the question. I’m limited in what I can say both about private diplomatic conversations and about what kind of authorities some of our— 

Q: (Off mic.) 

ALLEN: Yeah. Sure.  

I think—look, I think playing offense looks a lot of different ways and I think when it comes to U.S. election interference, certainly, that’s part of the diplomatic agenda, no question. Yes. 

What I would offer is that in a lot of our conversations with countries around the world who are some of the offenders it’s actually that they’re more afraid of the public opinion piece which, you know, is influenced by information. But we can get at your question in other ways that we have found to be, I would say, more effective.  

The corollary is also that we are talking to leaders around the world, and I just did this in the Balkans, who even if they are not the origin of disinformation campaigns and foreign malign influence, the media environment in their country, either because it’s corrupt, right, or owned by oligarchs, is allowing disinformation to spread, for example, right, or we know that it’s an environment in which propaganda is incentivized. Certainly, those conversations are all happening too. Yeah. 

NOSSEL: A question over here. Thank you.  

Q: Pete Mathias, term member.  

ALLEN: Hi.  

Q: Is there false hope here? If the truth were obvious, we wouldn’t have courts. But humans have relied on courts for millennia to help us approximate or decide what the truth is. So is there false hope that the State Department can decide in minutes, in days, in weeks something that in some cases can take months or even years for courts to decide? 

ALLEN: When it comes to tactics, or when it comes to disinformation narrative results? 

Q: When it comes to truth. When it comes to truth. 

ALLEN: Yeah. 

Q: And then maybe your answer to this is going to be, well, you know, we don’t—there’s certain truths that we don’t opine on, for example, what happened. But even in those cases, where do you draw the line? Where do you draw the line for, like, the truths that you’re OK deciding if it’s true or false and the truths that you decide, OK, well, the jury is out?  

ALLEN: Yeah.  

NOSSEL: I mean, you know, recent examples you might cite are, you know, the lab leak theory where, you know, there was a lot of sense that this was just a falsehood and it was fanning anti-Asian-American hatred and that it shouldn’t be allowed to spread freely, you know, and then later on some researchers came forward and said, no, this was, you know, prematurely dismissed and it needs to be looked into.  

So, you know, that’s, you know, one of many examples of, you know, moments where I don’t think it’s the U.S. government but it’s certainly tech companies that are making these line drawing decisions in real time.  

ALLEN: Yeah. And, look, I think the root of your question is as we have an information space in which people are seeing so much with their own eyes but it’s so often devoid of context, right, how do we actually discern truth. And I would just offer that from where I sit I think we talk a lot about mis- and disinformation; we’re not talking enough about amongst the, you know, publics and the masses incomplete information.  

People are forming judgments and views not necessarily informed by mis- and disinformation but informed by an incomplete information set because it’s just so overwhelming and algorithms are built to reinforce bias and keep people in their information silos, which we haven’t even talked about here yet, right?  

The business model and the structural model of the way people get information is not one in which context and nuance and healthy arguments and two things being true at the same time thrive, which is actually the only hope we do have for trying to get to common understanding.  

I would offer that I think what you’re hearing today is not a false hope, naive assessment of our possibility to make change. It’s the reality that if we don’t try, what’s the alternative? Unilaterally disarming? That doesn’t feel satisfying either, right? 

And so I think, knowing that we are playing an absolutely asymmetric game here and against some of the toughest odds and, frankly, really daunting, you know, indicators for going into the future, that’s where you constantly try to reassess where is the most effective place to put limited resources, and it’s why we’re trying things like preemption.  

Is that going to work? We have to try. But it’s why, you know, for example, we’re investing in media literacy, which are programs not put on by the U.S. government even if funded by us, because we don’t think we should be the ones trying to say here’s what’s truthful and what’s not in most cases.  

That’s where civil society has an enormous role to play, and by even having conversations to awaken people to the fact that this is foundational to everything else brings people to the table in a way that says, hey, we all have a role to play. We got to try.  

NOSSEL: Right here. 

Q: Thank you very much, Undersecretary, for this great, great presentation.  

I’m Mark Magnier with the South China Morning Post. 

ALLEN: Great.  

Q: And the kind of conventional thinking since 2015 or so as we all started to wake up was that Russia, truly, truly bad boy; China edging in, has a little better stake in the global order than Russia does; and then sort of North Korea and Iran, further off with less capabilities. Can you give us a sense of what you see as China’s trajectory here, since they’re clearly the ones with amazing tools, amazing technology, and amazing brainpower that can be kind of deployed? Thank you.  

ALLEN: Thank you.  

I would commend to you a report from our Global Engagement Center that was issued I believe late last year that does lay out from the U.S. government standpoint with substantiation how China is manipulating the information space, right—how they are using information as a lever of their power, and so there’s a lot to be said about that.  

What I would offer, and having just gone to China about a month ago with Secretary Blinken and sitting for hours, is that we find that our relationship with China is one in which we have to manage to, you know, chart the future of the world in a way that is different, and there are necessary opportunities for collaboration that, frankly, include AI.  

I would offer that we just two weeks ago had the first U.S.-PRC AI dialogue, and so I think it’s important as we are, on one hand, calling out manipulation where we see it we also have a relationship in which we are managing it responsibly, you know, in terms of shared issues.  

NOSSEL: Can I—I’m wondering if you could—I mean, I’m curious. I don’t know if this is what you were asking but if you can say anything more just from the perspective of sharing, you know, what you’re seeing in terms of their tactics and their capabilities and how that is unfolding. 

ALLEN: Yeah. I mean, one of the things we’re seeing across the world, and I alluded to this a little bit before with how to get independent information into really, you know, low-sophistication media environments, is that the PRC has been very aggressive and successful in offering their media services to countries around the world, particularly in, for example, Africa and Southeast Asia. As I mentioned, also in Latin America.  

So where we’ve seen, you know, that structural advantage for them, by getting Xinhua into communities around the world and teaching Mandarin to communities around the world, that’s where I think we can say—that’s not—you know, that’s a different kind of information manipulation than disinformation campaigns, for example, and the media policy tools at their disposal, alongside economic and commercial diplomacy tools, right—ports, bridges, roads, all the things—have proven to be really effective, right—5G, Huawei.  

All those things together are things that the PRC is doing around the world, and so as we think about our public diplomacy tools we have to think beyond just the information space even though that’s most of our conversation here. I mentioned before quickly but I would just say if my friends in the U.S. Congress said to me we’ll 10X your budget tomorrow, I would spend a lot of that teaching English to people around the world. I think we have to think beyond just people’s information consumption on the internet— 

NOSSEL: (Sneezes.) 

ALLEN: Bless you. 

NOSSEL: Thank you. 

ALLEN: —as—you know, as the way in which we bring policy solutions to the table. You know, we hear all over the world from young students—from young civil society leaders, we want to be educated in the U.S. Help us learn English. Give us access to capital. How do you help us find markets for our businesses? 

Some of this is just the United States’ ability to show up as a partner because the appetite is there and so there are a lot of economic diplomacy, commercial diplomacy, and public diplomacy options, programs, and tools to take this on as a bigger structural matter than just information. 

Can I—we have a lot of questions. Can you just maybe for a minute, Suzanne, talk a little bit about how you find your members or colleagues in the media in terms of how they see their own role in this?  

Do you think people well understand the whole environment? Because one thing that I’ve talked to a lot of our colleagues in the media about and reporters all over the world is until people see themselves as part of a solution or as part of this ecosystem, you know, we can’t make progress.  

And that’s where I think the question about do we have false hope about the USG, no, it’s about the USG helping convene people around a table to make the environment more healthy. 

NOSSEL: Yeah. I mean, I’ll give you just briefly, we did a survey of journalists and we asked them, do you have and are you being provided with the tools that you need to deal with disinformation in your work? And basically four out of five of them said no. This was probably about eighteen months ago, and we’ve developed on the basis of that a program that is dedicated to filling those gaps and equipping them, you know, whether they’re reporting on disinformation, they’re targeted by disinformation, or they’re trying to sort through, you know, what may be disinformation that they encounter incidentally in the course of their reporting. 

And so there’s a lot of work to be done, you know, particularly for small newsrooms. I think the New York Times, obviously—you know, Reuters, AP, have tremendous resources that they’re dedicating to this. But, you know, we didn’t really get into the question of local news but I know that’s a centerpiece of your conception of information resiliency and we agree because local news is by far the most trusted around the world and here in this country.  

And so in particular it’s those small newsrooms that, you know, are in dire need of these sophisticated tools and it’s also—you know, it’s evolving and changing. So it’s not something you can educate people on as a one shot thing.  


Q: Thank you for being here. I’m Kristen Kaufman. I work in the New York City Mayor’s Office for International Affairs. 

One area where we’ve seen a large amount of disinformation from foreign media sources, foreign social media platforms, is on the migrant crisis. We’ve spoken to your Western Hemisphere division actually just yesterday on how we can collaborate better to fight this disinformation. 

Would love your thoughts on how we can be more aggressive at stemming that tide through the tools you have at your disposal.  

NOSSEL: Kristen, can you give an example of the kind of misinformation that you typically see?  

Q: Sure. A lot of it is going through the Darien Gap. Most of it we see is information—when you get to New York City you’re going to get a free house. They’re going to give you a job. They’re going to give you money. You don’t have to work. You go on this path. You go to this entrance at this border checkpoint. You go to this nonprofit once you cross the border—and this is turning into a national security threat because of the quantity of people coming in. 

ALLEN: Yeah. This is a great question. Great to see you. I know some of your colleagues. Thanks for being here.  

OK. This is a really illustrative example because I actually can’t believe we’re almost done with this and we haven’t said the words TikTok yet because it’s so central to so much, frankly.  

The migration disinformation ecosystem is a really good example of why disinformation has been allowed to thrive and it’s because it’s mostly gone into closed channels—WhatsApp, Telegram, and recently TikTok, which is an open channel. 

But our challenge on countering disinformation and getting truthful information into the, quote, “information environment” has been made even more difficult in the last few years because people are increasingly going to, you know, closed messaging spaces, right? 

If they don’t believe what’s on Facebook or they don’t believe traditional news, most of the migration disinformation is being carried through WhatsApp and people-to-people networks, texting networks, mass texting networks. 

How can we even begin to get our heads around that? One, we’re chipping away. To your point, for example, the U.S. State Department just launched our first commercial WhatsApp channel in the Latin American region specifically to counter migration disinformation.  

What that means is—and how many of you are on WhatsApp? OK. Most, and probably most of you either use it for groups or for person-to-person texting, right? We know that. But a lot of times companies—private sector companies can buy WhatsApp channels to do mass advertisements. 

We are now doing that as the U.S. government. We have a WhatsApp channel specific in Latin America to try to get accurate information into WhatsApp ecosystems. Again, that’s not a silver bullet that’s going to solve the problem tomorrow but it’s an example of us knowing that we have to push ourselves on where information needs to be to be effective. 

What’s also true on the migration piece, and this has been a big learning for us, is no amount of U.S. government Facebook ads is, like, going to change people’s minds on this, but their own local radio reporters, right, are going to. 

So one of the things we’ve invested in is bringing reporters from around Latin America to the Darien Gap and to the U.S.-Mexico border and they can go back to their communities, report to their own listeners and viewers what they saw, and that’s actually proving to be most effective.  

So some of that’s a little tactical, right, but what I would offer is that I think we’ve seen in the migration crisis that messaging is one thing. Where we’re doing that messaging has been the more consequential piece of the migration crisis, and we’ve had to push ourselves.  

Yeah. Thank you.  

NOSSEL: In the back there. Yes.  

Q: Good afternoon. Thank you. Chris Stockel, Civil Affairs Association, as in the military. 

So we talked about tools, scope, staffing, the effects that they’re going to have. The Global Engagement Center, right, leads, directs, synchronizes, integrates the entire U.S. government, right? So are they properly resourced to accomplish what their mission statement is, especially with regards to, let’s just take, public affairs at OSD or the JCS staff? Not to pick on the military, but they’re pretty big, right? And so if you could talk about the comparison of appropriated resources. Thanks.  

ALLEN: Yeah. Thank you so much for the question. I’m really glad for the opportunity to talk about this.  

The short answer to are we resourced enough to tackle this problem is no, absolutely not, which I say to bipartisan members of Congress all the time because this is an issue that, largely, has bipartisan support despite some of the politicization on disinformation because people understand how fundamental it is.  

Here’s what I would say. Right now DOD has an enormously higher budget than the State Department on this issue and DOD also has different authorities, right? This is an open—this is an on the record environment so I’m going to keep it high level.  

But there are different authorities that different agencies have to tackle the information space and one of the things we recognized a couple years ago upon coming in is that there just needs to be better State Department-DOD collaboration on the information space, and there has been, which we think is really good.  

One of the first trips I took when I was in this job, and I was acting for a year before I was Senate confirmed, was to INDOPACOM, and I sat down with Admiral Aquilino and said, what are you all doing? We know some of what you’re doing. Here’s what we’re doing.  

I’ve been to other COCOMs, SOUTHCOM. My colleagues have been to CENTCOM, EUCOM, and AFRICOM, and what we have found as State Department officials is that we are, largely, knocking on an open door with our DOD colleagues, who are saying, you know, we’d love to better understand what the State Department’s strategic messaging priorities are. We want to understand how we’re talking about the U.S. government’s foreign policy priorities.  

It makes everybody who’s working in the information space better to be working off the same kind of strategy, and so we have detailed public diplomacy State Department Foreign Service officers to most of the combatant commands as an example of the kind of thing that’s increasing connective tissue while we’re, on the side, working on more budget.  

Thanks for the question.  

NOSSEL: I think we’ll probably have the last question in the back there. Thank you.  

Q: Hi. Sander Gerber. Thanks for the interesting discussion. If you repeat a message again and again and again, people believe it, and we’ve always been bad at information operations as a country relative to our adversaries. But now these platforms are so broad and so asymmetric, as you said, I don’t see how education is going to fix it. I don’t see how voluntary actions by the media platforms are going to fix it.  

And it’s complicated also because there are so many multiple authorities by all these different government agencies. But this issue is a threat to our democracy, so why isn’t there a blue-ribbon commission that raises the profile of this on a national level?  

I don’t hear a national debate that what’s going on is actually a threat to our democracy to at least raise it and start to bring together the resources to put a full USG effort to deal with this.  

ALLEN: Yeah. So a couple things.  

Agree with you that more attention is needed, absolutely. I mean, and I’m not saying that just because it’s my portfolio at State and because we recognize alongside a lot of our colleagues across the whole State Department, right, this is a national security issue. This is a, like, democratic imperative all over the world.  

And so I would say I don’t think we think education is the solution. It’s one of the solutions, and the question—and I think this is, like, a little bit sort of where we started—is, you know, is the accumulation of every possible line of effort going to make a dent or not? I think it’s hard to say right now.  

But, again, I think we have to try. I think to your point about repetition, it raises an area in which I would like to see more research by academic bodies, which is the psychology of information, and this is just my personal view, again, based on sort of my own friends and family experience.  

I think there’s a lot of issues of agency and pride in first impressions involved in disinformation and information consumption that no amount of policy prescription can overcome, right, except for maybe learning how to be a better information—a more discerning information consumer to begin with because it is difficult to overcome first impressions.  

It is difficult to help people understand that they are often being acted upon in the information space, either by a Russia or a China or even by their own algorithm that dictates what they’re seeing every day, right? So we have to try to get people comfortable with the idea that they don’t have as much agency in their information diets as they think. But I don’t view that as a job for government, right? I think we have a sort of whole-of-society solution set here, and that’s where I do think more attention is needed.  

So I would be very happy for people to advocate with your own members of Congress and with your own leadership about the fact that this is not something that, A, has a silver bullet solution; that, B, government can or should do alone; or that, C, we’re going to make quick progress on. But it’s really just a matter of trying to tackle the structural pieces. But I appreciate the point.  

NOSSEL: And, I mean, you didn’t mention it but, of course, there was an effort to stand up a disinformation commission and it was immediately attacked and, you know, basically dead on arrival because it triggered so much of a sense that this was the government stepping into a realm that it shouldn’t be in, to some of the points raised earlier, in terms of free expression and the First Amendment.  

And so, you know, that just underscores that there are these fault lines and tripwires in this sensitive area that can be set off and can make it very difficult to pursue this work and, you know, for me that just underscores how really impressive it is to have somebody as kind of nimble and thoughtful and experimental and practical as Liz Allen getting her arms around this challenge and explaining it to us in such a compelling way.  

And so really grateful to you. I’ve really enjoyed this. Thank you so much. 

ALLEN: Thank you. Thanks for being here. Thank you for having this. (Applause.) 

