Dolores Albarracín, professor and director of the Social Action Lab and the science of science communications division of the Annenberg Public Policy Center at the University of Pennsylvania, discusses ways to address misinformation. Dana S. LaFon, national intelligence fellow at CFR, discusses malign influence campaigns, how to combat them, and their implications for national security and democracy. The host for the webinar is Carla Anne Robbins, senior fellow at CFR and former deputy editorial page editor at the New York Times.
FASKIANOS: Welcome to the Council on Foreign Relations Local Journalists Webinar. I’m Irina Faskianos, vice president for the National Program and Outreach here at CFR.
CFR is an independent and nonpartisan membership organization, think tank, and publisher focused on U.S. foreign policy. CFR is also the publisher of Foreign Affairs magazine, and as always CFR takes no institutional positions on matters of policy.
This webinar is part of CFR’s Local Journalists Initiative, created to help you draw connections between the local issues you cover and national and international dynamics. Our programming puts you in touch with CFR resources and expertise on international issues and provides a forum for sharing best practices.
Thank you all for taking the time to join us. I want to remind everyone that this webinar is on the record and the video and transcript will be posted on our website after the fact at CFR.org/localjournalists.
We are pleased to have Dolores Albarracín, Dana LaFon, and host Carla Anne Robbins with us to just have this discussion.
Dolores Albarracín is a Penn Integrates Knowledge professor and renowned scholar in the fields of attitudes, communication, and behavior. She has published close to 200 journal articles and book chapters in leading scientific outlets and has been cited over 20,000 times. She is also the author of six books and her forthcoming book is titled Creating Conspiracy Beliefs: How Thoughts Are Formed.
Dana LaFon is a national intelligence fellow here at CFR. She most recently served as chief and founder of the National Security Agency’s office of operational psychology, which is responsible for scaling psychologically-based insights for government operations to counter some of the most egregious national security threats, and she’s an expert in the fields of remote psychology assessment, influence psychology, and malign influence campaigns.
And Carla Anne Robbins is a senior fellow at CFR and co-host of CFR podcast “The World Next Week.” She also serves as the faculty director of the Master of International Affairs Program and clinical professor of national security studies at Baruch College’s Marxe School of Public and International Affairs and prior to that she was deputy editorial page editor at The New York Times and chief diplomatic correspondent at the Wall Street Journal.
So thank you all for being with us. I’m going to turn the conversation now over to Carla.
ROBBINS: Irina, thank you so much, and it’s great to have everybody here today and this is a—quite a hot topic and just going to have the conversation among us for about twenty, twenty-five minutes, and you all are journalists—I’m sure you’ll have lots of questions. So feel free, please throw up your hands. Put questions in the chat and we’ll go from there.
So, you know, this is the year of elections. There are going to be more than sixty elections around the world. In the U.S. we have a highly polarized presidential election. We have races for thirty-four Senate seats, every House seat, eleven gubernatorial elections, forty-four states with eighty-five legislative chambers up for grabs.
So, Dana, if I may call you that—
LAFON: Of course.
ROBBINS: —if we can start with you.
You know, what do we have to watch out for in the 2024 elections around the world and in the U.S. itself? When I think about misinformation I’m sort of still caught in 2016, to be perfectly frank. It’s the Russians doing it to us.
You know, are the dangers in the U.S. still mainly foreign? How much of this is homegrown and is it different from what we saw in 2016?
LAFON: Thank you, Carla. Thank you for having me.
And I just want to remind everyone—I know this is on the record—I speak for myself and not any government agency or their constituent agencies.
So thank you very much for hosting me and this topic. So here’s the thing. The problem of misinformation and disinformation, which often get lumped together as misinformation, is larger than the average citizen in the U.S., in my opinion, understands.
It’s a larger problem than we experienced in 2016. Nation-states such as Russia, China, Iran, and more are actively motivated. Their motivation, if you will, is to keep us distracted and divided, driving wedges into U.S. social discourse, and what this does for them is allow certain autocratic nation-states to achieve their global goals with less Western interference.
So if we’re occupied with other things it’s easier for those folks to achieve their goals, and this strategy goes back decades, but we are seeing an increase, particularly from both China and Russia, and it is also a domestic issue. There is a domestic threat that is on the rise. I can’t give you numbers as far as percentages. I don’t know how we would know. But from our observations we can see that it’s a very large increase.
For example, China. I’ll give you a few examples. They have a state initiative of multilingual influencers. In other words, they have multilingual influencers online who reach an audience of over a hundred and three million people in forty different languages.
So if you think about the influence strategies and techniques that they’re perpetrating, they’re using things like likability, because they’re speaking the same language. They’re able to pose as native to the country they’re targeting. In the run-up to the 2022 elections we saw activities such as Facebook removing Chinese Communist Party false personas that were targeting pro-democracy activities.
They use what’s called a seed-and-amplifier strategy. In other words, they post a seed, and then they’ll come back and refer to that seed to validate their position, and that then ignites an amplification strategy.
So that’s one basic core of misinformation amplification that we see used through false personas, and from a techniques perspective this engages influence principles such as authority, believing those in an authoritative position, perhaps through fake or pseudo-news sites that they’ve sponsored; likability, because of the commonality they have with their targets; and social proof. If they can get folks to start amplifying and authentically sharing, then that amplifies the strategies.
We’re also seeing some things ahead of the 2024 elections. We’ve seen Meta remove thousands of Chinese Facebook accounts that were false personas posing as Americans to post on U.S. politics and relations between Washington and Beijing.
They all have their distinct profiles of action for how they conduct their activities, but they’re also getting quite savvy at it. So as we progress it will be even harder to recognize those false strategies without strategies of our own in place to recognize them.
Russia has a role, too. They have even more motivation to interfere in the 2024 elections than they had, in my personal opinion, in the 2022 elections. Their goal, at a high level, has traditionally been to denigrate democracy through sowing that divisiveness, that confusion, that distraction, and through other means.
They’ll use state-owned media. They’ll leverage generative AI. They’ll use false personas and covert Russian intel-directed websites, as well as cyber-enabled influence operations like we saw in 2016 when they hacked the DNC.
We’ll see more of that, in my opinion. How we combat it, I think, is something we’ll discuss later in this forum. But there are ways to combat it, and so I’ll pause there on the threat intel picture and we can come back to that topic.
ROBBINS: Thank you.
So, Dolores, I do want to talk about the APA report you co-authored on how to deal with this. But can we just—is there a difference between disinformation and misinformation? Because we tend to use these words interchangeably. Are we making a mistake using them interchangeably or can we get away with that?
ALBARRACÍN: I prefer an umbrella term, misinformation, used more broadly. Disinformation, of course, denotes a concerted attempt at introducing known falsities with intent.
Regardless, even if it’s unintentional, misinformation can be equally harmful. So I don’t think the distinction is necessary, although, of course, everything that Dana has been describing would be clear disinformation. However, once its impact has spread beyond the initial influencers, it is equally harmful, or perhaps more so, and ultimately serves the intended goal. Yeah.
ROBBINS: So we shared your report and I do want to—I want to talk about—with both of you about some of the recommendations.
You know, as journalists we’ve really struggled since 2016 with how to deal with misinformation and outright lies, and we’ve been warned again and again that flagging something as untrue would only draw more attention to it, make it more viral, and for some people even more trusted. You know, if you journalists write about it—I don’t trust you journalists in the first place, so it’s probably true if you tell us it’s not true.
So this APA report, Using Psychology to Understand and Fight Health Misinformation, has a series of recommendations that seem really relevant not just for people covering health but also for people covering politics. And I wanted to start with “avoid repeating misinformation without including a correction.” Can we talk about this? You know, people talked about truth sandwiches. What can you recommend to this group embarking on a year of really intense election coverage? If we know there’s a lot of misinformation or disinformation on a particular topic, when do we know that it gets to a point at which we can’t ignore it?
Is there, you know, some rule of thumb? You say something’s a lie, then talk about the lie, then remind people it’s a lie. You know, how do we deal with that?
ALBARRACÍN: One rule of thumb is if something is present in at least 10 percent of the population that makes it a highly salient belief and it’s likely to be used in various decisions throughout the day.
So in that case I would recommend some degree of acknowledgment with all the caveats that you mentioned, yeah, like, you know, say it’s false each time you bring it up and then have the proper correction that’s detailed enough to tie fully to the initial representation of the event.
So those are the general recommendations. They continue to be true. At the same time I would say that other approaches such as what in my research we’ve called bypassing can be equally impactful. So when trying to combat misinformation the question you want to be asking is, well, what do I want to achieve? Am I worried about the fact that if I say the election was stolen people are going to go storm the Capitol?
Well, that’s one approach. Then you start there. What would prevent storming the Capitol is a slightly different question from how do I make sure they know the election was not stolen, and there are different pathways, one being that you maybe address the misinformation but especially reinforce any belief and attitude that would protect the Capitol. That’s our most urgent goal, in my opinion. It’s not the misinformation in and of itself. It’s the actual consequences.
So with that in mind, I think in addition to corrections even more important sometimes is simply talking about everything that’s going to essentially lead to your goal and making sure that information is properly received.
So, for instance, if you’re trying to get people to support GMOs, how are you going to get there? You can say, well, one problem is people think they cause cancer. I can go ahead and correct that. But at the same time you can emphasize and introduce new ideas that people don’t have in mind, and that is actually a lot more effective than trying to simply correct the misinformation.
But that begins with knowing that you want to change the ultimate attitude towards GMOs. You don’t necessarily care about simply their belief in the cancer effects, if that makes sense.
ROBBINS: Of course, it wouldn’t be the role of a newspaper to get people to, you know, support GMOs. But I say this as a former deputy editorial page editor, which is different from being a reporter, though I’ve been both: on both sides it certainly is the role of newspapers to support free speech and democracy and the truth.
I struggle—we certainly struggled, even on the editorial page when I was at the Times, with saying somebody was lying. You know, we didn’t used to use the word lie because lie implied intent. I think the first time we even used the word lie on the edit page at the Times was when we finally got around to saying that Dick Cheney was lying. The lie had been said so many times that we finally decided we could do it, and when the Times finally used the word lie with Trump on the front page it made news. People were writing stories about the fact that the Times had said on the front page that Trump had lied.
And part of it was, you know, the Gray Lady had finally done it but part of it was this question of were we drawing too much attention—were we undercutting the credibility.
But I think there is this other thing. Is there a certain percentage of the population that you’re just not going to reach and what percentage of the population is up for grabs even for this conversation, right?
I mean, I suppose that’s really the question. You know, Dana, to go back to you: can you change people’s minds with good coverage?
LAFON: Yeah, I think it’s important to that final point you make, Carla, that you have to know your audience—you know, who are you reaching. It’s intuitive to believe that truth will counter misinformation and that’s psychologically not so—that there are certain cognitive biases and influence strategies that are going on that make the opposite actually true, right?
So if you commit to one belief you will behave consistently to demonstrate your belief, and when you’re faced with, you know, someone telling you, no, you’re wrong, human beings will traditionally dig their heels in and look for evidence that supports their belief and naturally not even observe evidence that dissuades them. That’s confirmation bias. It’s a very normal human response.
Facts don’t move people. Stories move people, and I think that is the core essence of a good misinformation campaign. It’s that narrative that speaks to an emotive, knee-jerk response. The goal of a misinformation campaign is to get people to do something that’s in the narrator’s favor.
So simply changing a belief is not the end goal of a misinformation campaign. The end goal is to provoke an action. Whether it’s that we’re distracted, whether we increase our social discord, whatever the action, that’s the goal of the misinformation campaign.
So journalists, I think, are in a really interesting yet very difficult position because countering it with truth is not savvy enough to counter the effects of the misinformation campaign. In my opinion I think it would be useful to use some of those strategies, for example, prebunking, right—getting the information out before it’s used in a misinformation campaign.
Perhaps calling out the alternative or the critic’s side in your piece, to say, you know, my critics may say this, which describes the potential narrative of a misinformation campaign, and then you address it by, you know, demonstrating the actual motivations of whoever is providing this information.
So that’s one example. We saw that at the beginning of the Ukraine invasion, when the U.S. government declassified some information that we understood Mr. Putin was going to use as fodder for misinformation campaigns, and if you notice, those campaigns were really, really thwarted, because there’s a phenomenon that whoever says it first is believed. So getting that information out there first lends credibility to that information.
There’s also repeatability. You know, I’ll say repeatability is believability. So having the journalist’s article, their writing, repeated through various forums, various mediums, and different sources is a way to build the credibility and veracity of that article.
Highlighting the motivations of the source and, you know, taking the reader through the story of why this would be in the best interest of the narrator: here’s what it appears they’re trying to accomplish.
Now, I’m not saying don’t use facts, of course, but if you can marry it into your own narrative into a story that is fact based I think you have a much more robust possibility to counter the effects.
You know, the tricky thing is once that misinformation or disinformation campaign is out there, you can’t put that cat back in the bag, as they say. It’s out there. It’s going to have an effect because it works on implicit levels, unconsciously. But you can shape a response to build that immunity for future influence campaigns.
ROBBINS: Dolores, Dana went through quite a few things that were on your list. I wanted to have your reaction to them: this notion of journalists playing a role in prebunking, and of journalists establishing their credibility by taking us behind the curtain, you know, explaining the motivations of sources.
Can you talk about those things?
ALBARRACÍN: Yes. I think the—I mean, the impact on actions brings us back to what is the ultimate goal here, and prebunking can be effective and corrections can be effective but in the end they will come up with a new false piece to bring to the table.
So, in that sense, anything that’s tied to specific content that you have to go out and prebunk has a limited effect, because they have ample time to come up with new falsehoods that you then need to prebunk, and we cannot even fully anticipate them.
So I would say that a more general approach of fortifying the knowledge base of our citizens is key so then they have enough structures in their minds to be on the alert and be able to recognize it themselves, not have to be told each thing that’s false. Yeah.
So that general structure is important, and second—I mean, secondly, of course, is protecting the integrity of our institutions, which you know how to do probably better than any other player in society.
So what do we do about attacks on science? What do we do about attacks on academia, all the respected sources kind of falling apart? This seems to me like the real problem, more than having to prebunk each potential misinformation piece that’s going to be produced.
So if you protect some of the sources and they remain a critical reference in society, that is going to be a lot more impactful, as a strategy and as a nation, than operating at the level of each misinformation piece.
What else can I tell you? The other, of course, is denouncing the actions. Yes, we care about misinformation. It can be harmful. But still, there is a very long way between misinformation and behavior, and for the most part the impact is very, very small, and this is something we show in the report. It’s not like, oh, you hear a piece of misinformation and you move from there to, you know, walking into the White House with some assassination attempt.
So this is something important to keep in mind. What is really a source of concern, in my view and based on a lot of research, is not just the misinformation but what we do when influencers online introduce new processes by which we should take down, you know, university leaders. It’s the actions, not the misinformation. It’s that this has happened and now I’m going to move to do X, creating opportunities for a congressional hearing on X issue.
So, to me, a very good cataloging and denouncing of the actions of these actors is where I would go, because that is very close to behavior, and I would remain, you know, focused on that aspect. I don’t know if that answers—
ROBBINS: The decline of trust in institutions in this country is across the board. I mean, Congress is lower than universities. But if you look at things like the Edelman Trust Barometer, and Gallup does this year after year, you see the decline of trust. Trust in the, quote, “mainstream media” is low. You know, we’re not going to be going around propping up trust in the president of Penn or the president of Harvard, because people don’t trust us in the first place.
I mean, I think we certainly have to think about why people don’t trust us and how we can change that. One of your recommendations is to leverage trusted sources to counter misinformation and provide accurate information in, say, health.
However much we interact with the community, I think we have to figure out who people do trust when we tell our stories, because we can’t rely, as we did fifty years ago, on people just trusting us because we are the press.
So is there a way of building up trust in our institutions so at least people can listen to what we believe we’re purveying, which is as close to the truth as we can get it?
LAFON: I think what we’re talking about there is building credibility, right, and that’s a different animal than influence inoculation. I know that’s not the topic of our discussion, but there are different ways to build credibility, and I think having that reputation is key, right?
Particularly in a crisis, reputation is made or lost, and I think that having that long-term reputation, and also that repeatability of the same messaging that your stories are providing from a journalistic standpoint, is valuable.
But if you’re talking about how we build institutional credibility, I think that’s a whole other topic for another journalism seminar. There are certainly psychological ways to build that credibility, though.
ROBBINS: I wanted to ask both of you a question: have we seen countries that have done a better job of getting through a barrage of misinformation, of outside or inside attempts to get people to throw up their hands and say, well, I just don’t trust anybody, so I’m just not going to trust reality?
Certainly, we saw that in Ukraine and that started, you know, from 2014 on, long predated what we went through, and you see this even today. I was just preparing for something for our podcast and I was looking at something in the pro-Russian press saying there were threats against the Hungarian foreign minister who’s going to have a meeting in Ukraine and he possibly couldn’t set foot in Ukraine.
I mean, they are fabulists at disinformation. The EU has something called the Disinformation Review, and every week you see these sort of viral stories they’re pushing out there. But people in Ukraine, I think, have just gotten to the point where they shrug this off, because they’ve had a decade of it.
Taiwan seems to have done a very good job of getting through their election of not taking it particularly seriously because they were warned about it, whatever. I mean, are there countries that have figured out how to deal with this better than we have? Have you seen, you know, sort of strategies to prepare going into an election season, either of you?
ALBARRACÍN: I’ll let Dana answer this question.
I mean, I know that there is variability in specific forms of trust. So, for instance, Argentina is highest in trust in science, but I don’t know that there is one particular country that has managed to fight misinformation as a whole.
LAFON: I think if we look to countries, particularly in the near abroad for Russia, they’ve been dealing with this challenge for a long time, and they’ve had some success living in the solution space, you know, of how to counter those offenses against their country.
We’ve seen it in Georgia. We’ve seen it in countries that border Russia. We see them do it through entertainment. There is a country, I want to say Georgia, but it might be one of the neighboring countries, forgive me, that uses a television show to address disinformation campaigns and to present, in an entertaining way, the alternative narrative, the narrator, and what they’re proposing. Very interesting.
One country—I believe this was Georgia—used the artists, right? So they would give the narrative to the artists and artists would reinterpret it through their art and then communicate that to the populace and that was extremely, extremely useful in countering the narratives that came from these disinformation campaigns.
And it also acts continually to inoculate the people viewing those programs, so that they’re getting a booster on their ability to build healthy skepticism, to slow down their thinking, and to think more critically.
So all of those attempts that we’ve seen in those near-abroad countries have had some success. Now, we can’t simply translate those successful activities into different cultures without some thought, without some appreciation for how they would play in different cultures and countries.
But I think there’s a body of work there, so look to those countries for the solution space and take note.
ROBBINS: We do have a question already, but first I just wanted to very quickly ask Dolores: if I wanted to write a story about programs that are trying to prepare people, to better train them to resist misinformation, are there any cool programs out there that I could, you know, go and write about?
I mean, I know it’s in early stages, and I know that right after January 6 the military was looking into this, into how to prepare new recruits so that they weren’t radicalized. Are there effective programs out there, or at least ones in early stages that are worth looking at for training, whether for young kids or for young adults or—
ALBARRACÍN: There are media literacy types of training. I would argue that the K-12 system should be the system that does that, because it’s not a matter of, OK, give me two lessons on how to identify misinformation. It’s going to change next month in response to what we teach them.
So it’s the critical thinking skills, the fact that, you know, scientific and health decisions are well informed by science, and you don’t politicize science, because it’s almost sinful. It’s really building the values that are going to protect society from actors manipulating these institutions.
I think that’s the only long-term solution I can imagine. There’s not going to be a quick fix by the time people have no understanding of science, or cannot even think about the word “evolution” without thinking it’s potentially problematic religiously.
You know, I mean, I don’t really know that we can change specific pieces without a massive overhaul of how we instruct kids on all of these issues and create spaces in which there are certain institutions that are trustworthy and others that shouldn’t be involved in certain spheres.
ROBBINS: Can I ask a research question of either of you? Both of you talked about how important it is to sort of get ahead of a story, because it’s very hard to get the cat back in the bag. So how do I do that? I’m now going to date myself by how long I’ve been out of the daily business, but we had people at the Times whose job it was to monitor Twitter to see whether, you know, news was breaking on Twitter before it broke on the wires.
But how do we know? Dolores, you said when 10 percent of the population starts believing something; you know, how do we know that? How do we know when something is previral but threatening to become viral, so that we can get out ahead of a misinformation or disinformation story?
ALBARRACÍN: I mean, in our world we would know through surveys or some sort of check. I think journalists are pretty well tuned to whatever is in the mix that could potentially be growing, yes, and that could be leveraged to explain new things, especially, you know, in a false way.
I don’t know any other way than following your gut on what might become problematic and putting it out there as early as possible.
ROBBINS: So, Dana, you come from the intelligence community. How do you guys, you know, know when something’s coming that hasn’t totally blasted into mainstream consciousness?
LAFON: Yeah. And we may not always know either, right? So I don’t think it’s reasonable to ask journalists to get in front of that narrative. It’s impossible.
I think our challenge to you as journalists is, once it’s there, how do you report it in a way that begins that inoculation process, by, as Dolores spoke to, linking it to the misinformation and explaining why. Not just that it’s misinformation, but why, and then explaining what influence technique was used. That is basically the influence inoculation process, right?
You’re aware that it happens, you understand the link, and then you refute it through identifying the technique and then explaining how you could refute that, how you could counter that.
If you could shape the writing to integrate those steps of the inoculation process, that is one way, but you’re not going to do it as a society with one approach alone. To Dolores’ point, education is key, and I absolutely agree K through 12 should increase critical thinking skills, particularly in looking for the ways we’re influenced. And you can use different examples; marketing is full of influence campaigns, for good or for bad. Being able to recognize them helps inoculate us.
And then I would put out a challenge. You know, here at the Council we talk a lot about AI and so I would challenge AI to help us identify what are the strategies to get in front of these information campaigns or misinformation campaigns that are going to be at our doorstep.
So it’s education, building that immunity, building that critical thinking and healthy skepticism. It’s as journalists hopefully being able to report it in a way that increases that inoculation and connects the story that you’re writing to the misinformation and explaining why, and then looking to our technical—our technological advances to help us with this problem.
ROBBINS: So both AI for good and evil, and I do want to get into—
LAFON: There you go.
ROBBINS: We have a question from Robert Chaney from the Missoulian.
Robert, do you want to ask your question?
I can read it but I’d much rather have him read it or speak it.
Q: Can you hear me?
ROBBINS: Yes, absolutely. Please.
Q: Hi there. We’re in a situation in Montana where the state Republican Party is about to start a big campaign promoting mail ballot use for the 2024 elections. But there is a faction of our state GOP that is an extreme John Birch Society election-denier crew, and they are bound and determined to counter their own colleagues with a lot of mail-ballots-are-insecure, mail-ballots-are-a-fraud, election-offices-are-untrustworthy messaging.
And, unfortunately, an awful lot of them consider most of the traditional media in the state as also untrustworthy so they’re not real receptive to us covering their own campaigns for mail ballots.
So we’re kind of looking at this as: do we just cover their internal battle over who’s going to be right about mail ballot security, or do we try to somehow—I’m just looking for strategies for how to cover this, because we’re going to see a whole bunch of anti-mail-ballot, election-fraud stuff floating around on the airwaves, in large part from a party that doesn’t want to have it.
LAFON: I’m happy to let Dolores speak to it, or I can speak to it.
ALBARRACÍN: Is it possible to not cover it? If that’s an option, I mean, deemphasizing and minimizing it would probably be the best solution in this case, rather than giving them a megaphone to further disseminate their false claims. Is that an option?
ROBBINS: More than 10 percent of the population believes it. I mean, by your own standard, I mean—
ALBARRACÍN: Well, no, no, no. That was about how we know whether something is potentially impactful. It may be 10 percent of the population, but it’s not impactful. So, for instance, if people believe the Earth is flat it has no consequences for us. So that’s the other part.
ROBBINS: Robert, is this impactful? It sounds like this is potentially impactful, isn’t it?
Q: We’ve had a couple of county election offices won in political races by election deniers who have then caused all kinds of mayhem for school board and local mayor elections because they’re—they don’t know how to run the system they hate it so much.
So we’ve got active internal disputes within the GOP here, with factions vying for power and influence at the statewide level, and part of this group is actively pushing that the mail ballot system is illegitimate.
So I don’t think we can not cover it—
—but we’re going to have a lot of difficulty getting what you might call the majority portion of the GOP here to look at us with any more credibility than they look at their own internal adversaries.
ALBARRACÍN: And how large, proportionally, is this group, the deniers?
Q: I would give them between 15 and 30 percent.
ALBARRACÍN: So, I mean, anything emphasizing that it’s a minority and probably putting them in a different context with experts from other states so then you kind of change the weight of the 15 percent as evidence for the credibility of the claim. That might be a way of doing it, sort of going outside Montana, see what this really, you know, says about election processes more broadly, perhaps.
LAFON: And I might add that this is a local—very big local challenge for you and I want to be sensitive to the journalism integrity as you are that, you know, you report what’s happening but at the same time if you know that—if you know the potential narratives that come through the disinformation of saying mail-in ballots are not effective, saying mail-in ballots are not going—or lead to false election results, you know, you can put all of that in your stories ahead of time when you’re covering, you know, mail-in ballots.
When you’re covering mail-in ballots you can cover, you know, the critics say this might not work or here’s some information about, you know, the veracity of mail-in ballots and their effectiveness prior to their campaign if possible. I don’t know the timing that you’re dealing with but I would recommend that, you know, getting ahead of it and then repeating it is a potential option.
But I think it’s—yeah, it’s a local problem that’s not avoidable.
ROBBINS: Particularly because it’s also a national story. I mean, there’s this gap within the GOP in lots of states: there are a lot of states in which the RNC is pushing for people to vote early or to vote by mail; and then you have former President Trump, who is on the campaign trail saying: Don’t vote early. This stuff—the integrity of this is just—so this is going to go on between now and the election.
So it’s a story. I don’t think you can avoid it. But I think it’s—you know, this is going to be a truth sandwich story also, which is every time you write about it you’re going to have to say there is no proof that mail-in ballots are insecure.
So we have Gabrielle Gurley from the American Prospect. Gabrielle, do you want to ask your question?
Q: Sure. I’ll read it. Let me just find where I am.
And is there any evidence that hostile state actors are working with domestic groups to create or facilitate misinformation campaigns? Do you want me to—
ROBBINS: And you referenced—can you describe the New Hampshire incident that you referenced?
Q: Oh, the New Hampshire robocall incident is the robocall using—it’s a voice cloning, apparently, where President Biden’s voice was used telling people not to vote in the primary and wait until November to vote that went out from a former New Hampshire Democratic official’s phone number, apparently.
ROBBINS: But you’re not suggesting that was from a hostile state actor?
Q: No, I’m not. I was just asking.
ROBBINS: So these are separate questions. Got it.
Q: These are two separate questions, yes.
ROBBINS: So, Dana, hostile state actors working with domestic groups to create—facilitating misinformation campaigns.
LAFON: Yeah. I’ve not personally seen any evidence of that. I would also suggest that there certainly could be a false flag which means that it could be Iran posing as someone within the U.S. or someone within the U.S. posing as Russia.
So there are false flag operations that go on and that is quite common. So I don’t see any—I personally don’t see any evidence. It doesn’t mean it doesn’t exist or the evidence isn’t out there. I haven’t seen it in my work, in my readings. But there are false flag operations.
ROBBINS: Can I follow up with a question, which is that before the 2020 elections and before the midterm elections there were—there was a pretty big effort by Homeland Security to work with states to avoid hacking and a variety of other things. People seem to be quite vigilant about it and nothing bad happened that we’re aware of.
Have they let down their guard or is there continued vigilance going into this election that you’re aware of, Dana?
LAFON: From what I see, CISA is the organization working with states, and I also see the FBI domestically working with states to shore up those secure environments. So I think there is a lot of domestic effort to ensure the security of the elections, particularly around the technology. I can’t speak to any particular action, but I think that’s under the auspices of CISA at this point.
ROBBINS: So there’s probably quite good stories for people to be looking at what CISA is doing in individual states because you’re already hearing, you know, people are going to be raising questions about the integrity of the vote and voting machines and ballots and there will only be paper ballots, you know, going forward—there will never be another electronic voting machine and all of that. So I think that will probably be a story closer to the election.
But I’d be really interested to see what CISA is doing—particularly since, you have to remember, Krebs was forced out and there were all of these things that happened at the end of the Trump administration. I don’t think people have been covering CISA very much, but I think there are probably some really good state stories about state funding, CISA, and what people have done to strengthen the integrity of the election. There are going to be a lot of them, because I’ve heard the Trump campaign talking a lot about electronic voting machines and how unreliable they are, and if they don’t have paper trails I think there are interesting questions there as well.
LAFON: I would love to see that. I would love to see the journalists on this call dig into, you know, what is DHS doing—you know, what are they doing at a state level. I don’t know personally those stories so I would love to see that.
ROBBINS: Christina Shockley, who is with public radio—a local host of All Things Considered at WUOM in Ann Arbor, Michigan. Christina, do you want to ask your question?
Q: Hi. Yeah. So I’m no longer at Michigan Radio, which is now Michigan Public, but I am still heavily working with stations across the country. And because of the relationship that we tend to like to have with our listeners, and because of how we are funded, which is primarily through listener contributions, I’m wondering if you think there are any specific, unique ways that public media can go about fighting disinformation and misinformation.
LAFON: I would love to see public media advertise the inoculation process and make that a common nomenclature amongst your audience so that they are understanding and getting boosters and actually exercising that building of the immunity to influence attacks.
I think there are two ways we can stop disinformation: we can stop the actors doing it, or we can harden ourselves cognitively to be more prepared to thwart it. So I would love to see—I would love to see that.
ROBBINS: Dolores, can I ask you about the role of the social media companies?
Among your recommendations is demanding data access and transparency from social media companies for scientific research on misinformation. I mean, I have seen reporting that says that Meta has laid off all sorts of people who were supposed to be, you know, hardening the platform against misinformation in the run-up to elections. Are they taking this seriously, and are they willing to be transparent?
Can you talk about the relation between researchers and what you’re seeing about their preparation for elections?
ALBARRACÍN: Yes. It’s been a problem to ensure access to the data—especially, of course, Twitter, which was completely cut off and became incredibly expensive. So our ability even to get a glimpse has diminished, if not gone away completely. That’s what I can say about the researcher side of things.
Companies are, obviously, not motivated by the goal of ensuring national security or anything like that; it’s all financial gain, basically. So whatever efforts they have tend to be very short-lived and often too late, and then they are gone, and until we’re up to here in misinformation they typically don’t activate new methods.
But I think Twitter has been perhaps the most problematic in terms of ensuring any sort of filter.
ROBBINS: And does Meta share information? Because Dana in the beginning cited some numbers from Meta. I was just wondering whether—does Meta share data with you?
ALBARRACÍN: No, they don’t. They’ve done the occasional project in which they will share results of what they analyzed internally but they normally don’t share anything broadly, and Twitter used to be the most accessible but still had a lot of limitations. You never knew what the sample really looked like and it’s all fairly restricted. Yeah. So that’s unfortunate. I don’t know what Dana has to say about this.
LAFON: I know from what I’ve read in open-source reporting—I think there was a Recorded Future article I reference in my paper for Renewing America for CFR—that Meta had reportedly taken down a significant number of false persona accounts attempting to pretend to be U.S. citizens.
So what I know I learned from the folks in this call in the journalist environment. But there is some reporting of different companies. There’s also a wonderful Microsoft report on misinformation and disinformation campaigns from China, Russia, and North Korea that I would strongly recommend to folks who are interested in getting some understanding of activities and how they’re approaching social media and what—a little bit about what social media is doing. I would send your audience to that.
ROBBINS: We can share links to all of that with everybody. We’ll push them out after this.
Can we talk a little bit about AI while I’m—do we have another—we have another question? Yea? OK.
Jordan Coll, can you—would you like to voice your question and tell us who you work with? I’m sorry. I’m not looking at the list right now.
Q: Hi. Can you hear me?
Q: Wonderful. Wonderful. Yes.
So thank you so much. Jordan Coll here. So recently—I am a freelance reporter and I just finished my graduate program at Columbia graduate journalism school so I just finished there. My question is in two parts.
One, we know that there seems to be, especially with Twitter, you know, journalists consumed by the platform. I think a lot of the messaging—you know, a lot of journalists that cover politics are on X because there seems to be a lot of—you know, you have your senators, you have politicians on these platforms.
So my first question is how, especially given the fact that, like, CEOs like Elon Musk—tech company heads—are replatforming figures that were clearly spreading, like, disinformation. Like, we have Alex Jones, for instance, with the whole defamation case over Sandy Hook, and then Trump as well spreading these.
So how do we resolve this issue, A, and where—in terms of ad space some subscribers, you know, are being—like, advertisers are pulling out of X significantly because of what CEOs are—you know, the decision making process that they’ve placed.
I’m not sure if I’ve clarified the question but one of them would be how do we resolve those actions taken by higher execs that permit—to give them open access to these people to rejoin these accounts again when they clearly have violated the guidelines and the misinformation guidelines, et cetera?
ROBBINS: Do you mean from a public policy point of view? Do you mean you want—you think government should be doing something or do you mean that we as journalists should be boycotting?
Q: I think it would be, like, more so the journalists part of it because, yeah, what are we—our job is not to—you know, we simply—again, we don’t write the policies that they’ve put for themselves. So how do we become more, I guess, alert and how to navigate through the—through that.
ROBBINS: Would we—do we do the—Dolores, sort of just ignore them? You know, that’s sort of—I mean, I must admit that I’m so profoundly ambivalent to the point of—I mean, we still—the Council still do things on X. We do X Spaces, what used to be Twitter Spaces, and—(groans)—that was an editorial board language groaning. (Laughter.)
So, I don’t know. Do you guys think that everybody should just boycott Twitter and get it over with?
ALBARRACÍN: We all think that, but institutionally people don’t find a way of getting out because they have their following and nothing has emerged that’s equally successful, you know, or popular. So nobody can leave, because how else do the credible institutions that still have X accounts get their information out?
LAFON: One point is, and I think Dolores spoke about this earlier, is that where do you get your news source from. You know, people should not be getting their news sources from social media. I wouldn’t—I shouldn’t say should.
People—I encourage people not to get their news sources from social media. It’s a way to absorb information and that’s great. But this is a very difficult challenge, right? This has been a challenge for many years.
There’s no easy button here, Jordan. It’s—and to—I can’t speak to how journalists could counter, you know, these effects. But I think looking at it as a social media platform versus a news platform is something that you could educate your audience on.
And congrats for finishing your grad school.
ROBBINS: Let’s hope you haven’t gathered—
ALBARRACÍN: About Twitter, I mean, Twitter used to be an excellent source for all kinds of news so why couldn’t we have a platform like that that’s maintained by, you know, the main media sources and use it for that purpose, a more centralized—
LAFON: Like a news source that is that media—a news media. Yeah, it’s a great idea.
ROBBINS: Well, we are going to have—you know, if President Trump is reelected we’re going to have a real challenge, which is how much do we quote, you know, things that sound like social media posts every time they come out of his mouth, which is that’s a—as it is right now people are being very careful about how much they—people quote the things he says on Truth Social.
But once they’re utterances from the Oval Office, these are real—these are major journalistic challenges. You know, how much are you repeating things that are frightening versus just sort of describing that they are frightening? These are major, major challenges to come.
Well, I just want to thank you both for really—I wish it were a happier conversation, but a really interesting conversation. Thank you, everybody, for raising questions. It’s always good when we end with more questions.
And Emily Bell is wonderful to talk to about this—for Jordan.
So, Jordan, I hope you don’t—haven’t graduated with too much debt and that you get hired soon.
So I’m going to turn it back to Irina. Thank you all so much.
FASKIANOS: Thank you very much.
As Carla said, we appreciate it and we will be sending out the link to the transcript and the video after this as well as links to the resources that were mentioned.
You can follow us, our work, on CFR.org, ForeignAffairs.com, and ThinkGlobalHealth.org for the latest developments and analysis on international trends and how they’re affecting the United States, and do email us your suggestions for future webinars. You can email [email protected].
So, again, thank you all for being with us today.