Webinar

Disinformation and Election 2020

Thursday, August 6, 2020
Speaker

Bret Schafer
Media and Digital Disinformation Fellow, Alliance for Securing Democracy

Presider

Carla Anne Robbins
Adjunct Senior Fellow, Council on Foreign Relations

Introductory Remarks

Irina Faskianos
Vice President for National Program and Outreach, Council on Foreign Relations

Bret Schafer, media and digital disinformation fellow at the Alliance for Securing Democracy, an initiative of the German Marshall Fund of the United States, discusses disinformation and the threat of foreign interference in the upcoming election. Carla Anne Robbins, CFR adjunct senior fellow and former deputy editorial page editor at the New York Times, hosts the webinar.

FASKIANOS: Good afternoon. Welcome to the Council on Foreign Relations Local Journalists Webinar. Today we'll be discussing disinformation and the threat of foreign interference in the upcoming election with our speaker, Bret Schafer, and host, Carla Anne Robbins. And I want to acknowledge that today is the 55th anniversary of the Voting Rights Act, making this a particularly appropriate day to have this discussion on the state of our democracy. I'm Irina Faskianos, Vice President for the National Program and Outreach at CFR. As you know, CFR is an independent and nonpartisan organization and think tank focusing on U.S. foreign policy. This webinar is part of CFR’s Local Journalists Initiative, created to help you connect the local issues you cover in your communities with global dynamics. Our programming puts you in touch with CFR resources and expertise on international issues and provides a forum for sharing best practices. Today's webinar is on the record and the video and transcript will be posted online on our website after the fact at CFR.org/localjournalists.

We shared our full bios for our speakers with you previously, but I'll just give you a few highlights. Bret Schafer is the media and digital disinformation fellow at the Alliance for Securing Democracy, a bipartisan initiative housed at the German Marshall Fund of the United States, and an expert in computational propaganda. He's a frequent commentator on national news outlets. Mr. Schafer previously held roles in the television and film industry. Carla Anne Robbins, our host, is an adjunct senior fellow at CFR. She is faculty director of the master of international affairs program and clinical professor of national security studies at Baruch College's Marxe School of Public and International Affairs. She was previously deputy editorial page editor at the New York Times and chief diplomatic correspondent at the Wall Street Journal. Welcome Bret and Carla. I'm going to turn it now to you Carla for our opening exchange.

ROBBINS: Thanks so much, Irina, as ever. And Bret, welcome, and thanks for doing this. And thanks, as ever, to the reporters on the call for all the work you do, which is incredibly important. These are particularly trying and dangerous circumstances, so thanks so much for what you're doing. I don't even know how you do it under these circumstances. So I thought we might start off with a very quick show-and-tell from the Hamilton project so you folks can see what it is that Bret does, and how it might help in your reporting. So I'm going to turn it over to you for a little show-and-tell.

SCHAFER: Thanks, Carla. Let me just try to do a screen share quickly here. Okay, so you should be seeing our dashboard now, which we house on our web page. The dashboard is essentially a collection of every Twitter account that we have connected to either a government official or a diplomat from Russia, Iran, and China, along with YouTube outputs from RT UK, RT America, CGTN, and CCTV+, and then all of their major English-language websites. So it's essentially just a content aggregator and a way to see what, at any given time, the three countries are focusing on in terms of their sort of narrative focus. The landing page will get you to the Twitter outputs. So you'll see here the most active accounts during the past week, but also which countries they're talking about, which hashtags they're using, the top topics, key phrases, and linked URLs. But what the dashboard also allows you to do is some longitudinal research. So if you have a specific topic of interest, for example. Well, COVID is not a great example, because there's just so much content that it would be tough to find something useful there.

But if you're looking, for example, at how China has used the Fort Detrick conspiracy theory to talk about COVID-19, we have a search function right at the top here which will allow you to dig into the data. We now have 2 million tweets, about a half million articles, and probably 50,000 broadcasts that we have collected. So it's a pretty quick way of going in and seeing how these three countries are talking about a specific topic. You can also refine things by the specific dates you're looking at and dig into individual accounts. One of the things that's particularly interesting is who these accounts are retweeting and engaging with. We're actually putting out a report today with CNN, just on some of the retweeting patterns we've seen from Chinese diplomats and the Western voices that they like to amplify as a way of getting beneficial information into the ecosystem. So that's just a very quick, three-minute overview of the dashboard. Happy to dig into it in more detail when we get to Q&A.

ROBBINS: That's great, and we'll be sending links out for everybody with recommended sources, and we'll talk more about that, since this is very much intended to help you all do reporting on this topic. So, by this point, you know, we're all pretty well versed in how the Russians messed with the national psyche during the 2016 election: hacking, hyping divisive social media posts, the internet trolling and sock puppets. And then how China, Iran, and domestic groups said, wow, that really worked well, and quickly got into the game as well. We also read about how the social media platforms went about taking things down; you know, after denying that they even had a problem, they said, okay, well, maybe there was a problem. And then, of course, the whole debate back and forth about political censorship, with Facebook versus Twitter and all that. So now we're getting into the 2020 campaign, certainly the strangest one in my lifetime, coming in the midst of a pandemic where disinformation is rife. What are we seeing now, and how different does what you're seeing feel from what we know about 2016? Should we expect a repeat of 2016? Should we be looking for different actors, different tactics, different platforms?

SCHAFER: So I think the major difference is that it's unlikely we'll see the sort of carpet-bombing campaign that we saw Russia run in 2016, where they just had thousands of troll accounts and bot accounts that were flooding the zone with information. The platforms are looking out for it too much at this point to be able to run that type of operation. And frankly, I don't think it would be particularly successful. I don't think it was, frankly, all that successful in 2016. I think what was most successful was the hack-and-leak campaign, which was run not by the Internet Research Agency but, of course, by the intelligence services. And that, I think, is where we're still vulnerable, not just to the Russians but to the Chinese and Iranians, although I should caveat that I don't think the Chinese actually will get involved in election interference. But that's what I'm particularly concerned about: the more surgical strikes, because I don't think we are in a better place right now to deal with those, particularly if they are savvy about how they first leak that information into the information space. If they somehow get a genuine U.S. voice or U.S. outlet to be the first to publish it, that's going to tie the hands of Twitter and Facebook, because for some obviously legitimate reasons, they are not going to want to take down content placed by a political figure or a U.S. news outlet. So my concern, again, is that they will be savvy about how hacked and leaked information first appears on social media, and when I say they, I'm including many different actors: Russia and Iran, as well as domestic actors. They will likely try to place that information either with a U.S. person or into a U.S. media outlet, because then that's going to tie the hands of the companies to take action.

ROBBINS: So I'm of two minds about this as well. Listen, there's no question in my mind that we in the mainstream media did a pretty bad job of handling the hacks and the “what about her emails?” story. Just look at the number of front-page stories, even in the New York Times, or particularly in the New York Times, in the sixty-nine days before the election. That said, what you're talking about is not a problem of disinformation, but the interpretation of actually true information. So first of all, why are you worried about that? You know, why do you think that's going to be the chosen tactic? That's the first question. And the second question is, why do you think we're not smarter in the press? I think we're smarter.

SCHAFER: I think you're smarter too. I think it's, frankly, really challenging for the press. Because, of course, if content comes out and you can verify that it's credible, and there's a sort of public interest element to it, I understand from a reporter's perspective that you have to cover it. What I'm hoping for this time around is that there'll be more context given to that kind of reporting: why it is being released now, who could be responsible for releasing it, the motivations behind it. And this is where I'm less concerned about reputable journalists than about some of the fringe blogs out there that have, you know, more political motives. With the Podesta leaks, among others going back to WikiLeaks, a lot of the content that was covered was just sort of salacious in nature and just made the targets of those leaks look bad, frankly, but didn't actually have much substance to it. So my hope is that in the more reputable outlets, reporting of that kind of content will be stripped away. If there's something of real news value there, of course, it's got to be reported. But again, my concern is that it will be placed in a fringe blog that has sort of partisan leanings, and once it's out there, that is going to force other journalists to cover it, because it's gaining traction on social media. It's the whole process we call information laundering: placing a piece of information into the information ecosystem and hoping it gets moved around enough, either through legitimate or, you know, fake accounts or fake websites, that it actually gains the attention of reputable journalists. And then the New York Times and everyone else is sort of forced to cover it, because it has gained so much traction and has so much visibility that they really have no choice.

ROBBINS: We're seeing so much disinformation and how easily it takes off in the midst of the pandemic. And campaigns take on an overheated quality as well, so it's a great petri dish for that. Why wouldn't the Russians, the Iranians, the Chinese be doing the same thing, whether it's to favor a particular candidate or to further divide, knowing that that might help a particular candidate? I understand that the social media platforms see patterns, so it's harder to place, you know, sock puppet accounts and all of that. But planting disinformation out there, I mean, look at just the COVID disinformation that's out there, whether it's coming from Russia or from domestic sources. It is spreading; I don't know, what cliche shall I use for that? Why do you think that these countries would not be doing that in this very vulnerable period of time from now until the election? And beyond that, perhaps the most vulnerable time is going to be between when the polls close and when we actually get results, which, judging from the fact that we only just got the results of two primaries in New York yesterday, could be a matter of weeks. It's certainly not going to be John King standing in front of the map, you know, saying I'm calling this county. You know, why wouldn't they be doing that?

SCHAFER: So first, I think you're absolutely right: the major period of concern is actually going to be the aftermath of the election. I'd written a piece before the 2018 midterms that we needed to be paying more attention to the couple of days, back when things were sort of normal, right after election results came in, when you would get the typical narratives we see about voter fraud, you know, everything that you've seen for decades. That, of course, is just going to be amplified to another dimension because of COVID. And one of the things we're particularly concerned about is that it's not just going to be a delay between when the polls close and when we get results; it's that you will see the results in some states shift. So if you take Pennsylvania, for example, the first counties that are going to report results are probably going to be smaller rural counties, just because they're going to have fewer mail-in ballots to count. So early on, I think you may see Trump with a significant lead in some of these states that will slowly start to be chipped away as the larger urban centers count their ballots. So you're going to see shifting results, I think, which is going to open a huge opportunity for disinformation campaigns to start talking about, you know, voter fraud, problems with mail-in ballots, all of these things that, of course, the president has himself talked about. So we're going to have a real issue, I think, with domestic and foreign players playing off that space between when the polls close and when we get results. So I completely agree with you there.

The other concern, too, is when you look at the resources being thrown at the election, and I'm talking about both the Department of Homeland Security and the social media companies. I'm hoping they're prepared for it this time. But the problem we had last time, and I wrote that piece prior to 2018, is, you know, Facebook had this war room that they set up two weeks before the election, and then they essentially shut the lights off the night of the election. I thought that was too short a time to be looking for election interference, because of course it's a long-term process if you're talking about changing people's opinions. I mean, the reason candidates don't start putting up political ads two weeks before an election is the same reason that, if you're running a disinformation campaign, you don't start two weeks before an election. But the real concern, again, is that as soon as the votes were cast, they decided their work was done. I'm really hoping this time that they will extend those war rooms past the day of the election for weeks, if not months, because I think we're going to have real problems there. So on the first part of the question, of why wouldn't countries do it? I think they would; I think it's just going to look different this time. Placing specific disinformation narratives is something I still think they might do, again, particularly around a piece of hacked material. I just don't think they need to do what they did in 2016, which was to run a wide network of fake accounts to push this content out. Because, frankly, we're doing such a good job of running disinformation campaigns against ourselves at this point that they just need to get that piece of information to the right influencer, and then we'll do the work for them. I just don't think they need to run an Internet Research Agency campaign anymore, because the conversations online are so toxic, the polarization is so bad. There are enough websites out there that will do all that work for them.

So just from a resource perspective, my question is sort of, why? They just don't need to do it. Which, again, doesn't mean that they won't put a specific piece of information out there; I think that's possible. You know, in the non-election context, what we've seen recently, and right now this is largely in the Russian-language space, is disinfo about U.S. virus or vaccine tests in Ukraine that are killing Ukrainian soldiers. This is not true. But it's seeding these narratives, which are starting to pick up some traction in English-language Facebook groups. Why? Because the Russians are about to roll out their own vaccine, and it will slow down the progress of any sort of U.S. vaccine trials and feed all of the normal concerns that come up in a democratic society. So I still think there is definitely the possibility that they will place disinformation out there, very specific pieces. I just don't see them effectively running this massive network of fake accounts the way they did in 2016. And I definitely don't see those accounts remaining undetected. You know, you had the fake Tennessee GOP account; it was operational for two and a half years and had 100,000 followers. I think it's highly unlikely that that would remain undetected if they tried it this time.

ROBBINS: So I want to turn this over, but I did want to ask one particular question about targeting local media. You mentioned in our conversation leading up to this that one of the things the troll factory, the Internet Research Agency, got about the United States was how much Americans trust the local press. And one of the ways they manifested that is that, you know, they ended up setting up these fake local social media accounts. And the Times mentioned yesterday, in a story about the State Department report on Russian disinformation, Michael Averko, who is a particular favorite of mine, who's a writer for something called the Strategic Culture Foundation, which is a front for the Russian intelligence services. And how he had published repeated op-eds in the Yonkers Tribune, which is a local Westchester, New York, newspaper, attacking a former Obama official, Evelyn Farkas, who was making a run in a primary. And it's interesting that the local paper didn't check where he came from or what his agenda was. Do you see this sort of targeting in other local campaigns? And how do local media, which have fewer resources--and I'm not saying that the big papers don't make mistakes, too--how do we protect ourselves?

SCHAFER: Yeah, it's a great question. So going back to 2016, we found at least 50 of these fake local news accounts they set up on Twitter. We saw a lot of targeting of local journalists through some of those accounts, so we would see local reporters tagged in some of the IRA posts, just trying to get their attention, essentially, to cover a story. But I do think the real concern is actually trying to place these opinion pieces into local outlets. Or again, I'm a little bit more concerned about some of the blogs that are out there, the partisan blogs, which would be less likely to do due diligence to figure out who these people are, if the content sort of fits with their worldview and the narratives they want to push. But we know that this is something the Russians did in 2016: they had this fake persona named Alice Donovan, who I think had something like 20 articles placed, mainly in left-leaning blogs. And then some of the Gulf states have been suspected of running these fake experts who are placing opinion pieces in various outlets.

So I think there's the targeting of local news sites. The other thing, if we put aside the foreign element for a second: it's really hard to influence a national election. I mean, I think most Americans probably more or less have their minds made up about who they're voting for for president. But if you get down to a local race where a few hundred votes can decide an election, particularly during a primary, it is just far easier to swing an election, also because people probably have less sort of, you know, deep-seated opinions about the candidates. So I'm very concerned about domestic disinformation campaigns at the local level, just because they're far easier to run and would take fewer resources. And one of the things we see now is that you don't have to have any particular technical savvy to run a disinformation campaign. If you go to Google and you want to understand how to manipulate Google search results, you just Google it, and you'll get 50 companies that will run disinformation campaigns for you. And this exists on Twitter, on Facebook. All of these companies, which are usually in Belarus or the Philippines, for a few thousand dollars will set up fake accounts and will give you fake views.

If you have a damaging narrative about a candidate, you can get that into the bloodstream of a local news ecosystem very easily and cheaply. So I have a huge concern about what will happen at the local level. Evelyn Farkas is, I think, a specific example, of course, because of her sort of foreign policy views. And you know, she was a former colleague of mine at GMF. You know, I don't know why, necessarily. Well, I'm sort of changing my opinion, because I can see why the Chinese would be interested in, you know, a local election somewhere in LA County, for example, where they have a bunch of business interests. So there you've got multiple things working against you: the national media is probably not covering it in the same way, because they're just not looking for it. For groups like mine, it's probably a little bit less on our radar; we're just, you know, not running keyword searches to see if anything's there. And then, as you mentioned, a lot of local news outlets have fewer resources to be able to do the open-source intelligence work that it takes to figure out, you know, whether or not a person exists, which should be easy enough. Frankly, that's one that I would just call sloppy journalism. I mean, you need to get the person on the phone and confirm that they are who they say they are. But in the sense of figuring out, you know, the example in the Yonkers Tribune, that would be a little bit more difficult. I mean, you need to connect dots.

And you need to understand what this website is that this guy works for, and things like Global Research Canada, for example, which was also talked about in the State Department report. It's just challenging. I mean, it's challenging. You know, I've done this for four years and I know the Russian information space well, and there are still sites that pop up, that people are affiliated with, where I'm just not sure who they are or what they're there to do. So I think there are a lot of things working against local outlets in terms of being able to verify who these experts are who try to get their attention, because they know that, again, Americans trust local news outlets, and it can be particularly damaging to get a piece of content placed there. Again, there's the amplifier effect: you place something in a smaller local news site, and it will likely gain some traction and jump to larger and larger outlets, depending on the national interest. But you place it somewhere and you do the work to try to get it picked up by other outlets.

ROBBINS: Thanks, that's a happy thought. Nice to know the power that local news organizations have, though. So let's turn it over to the reporters. I think we have some questions on the line, Irina?

FASKIANOS: Great. Thank you very much. So as you all know, if you click on the participants icon at the bottom of your screen, you can raise your hand there; and also, please share your video if you can. And if you're on a tablet, click the more button and you can raise your hand in that window. And please say who you are and what news outlet you work with to give us context, and accept the unmute prompt. We already have two hands, so I'm going to first go to John D’Anna.

Q: Hi, John D’Anna, I'm with the Arizona Republic in Phoenix. And Bret, first of all, thank you for doing this; it's very informative. But I wonder if you could speak for a minute about opportunity cost. If we as journalists are spending all this time and energy trying to debunk things, particularly in such a polarized atmosphere where very few minds are able to be changed, what are we missing out on? And what are the implications for democracy?

SCHAFER: Well, I think it's not only opportunity cost in terms of what you're missing out on by covering some things and debunking them; you're actually doing the work for them at times. Even coverage that disputes a specific narrative is sometimes beneficial, because, again, it just raises the visibility of that narrative. It gives it oxygen. And there's no perfect formula for it. You know, I think I've certainly been wrong in some of the things that I've covered that I should have just let die in the wilderness. But it's very hard to know when something is a fringy narrative that should just be left alone, because by covering it you're, again, giving it more visibility, and when it's about to jump into the wider mainstream ecosystem, where covering it early would actually have been beneficial, because you would have warned people. So I don't have a great answer to that, because it's a question we've been wrestling with for three years. I'm not a huge fan, frankly, of debunking things in general, because I think you're right, I don't think it's particularly effective. I've had this issue a bit with Twitter and Facebook fact-checking politicians, for example, and I think the context matters. You know, when it's Trump in a paid advertisement, it's very different. But if the president puts something out that is probably false, I'm not sure of the benefit, necessarily, of Facebook labeling that as false, because the people who know it's false aren't going to be further convinced, and it's probably going to drive his supporters further away from the platform. So I am less a fan of debunking than I am of, and this is sort of a cheesy term, pre-bunking the information space around certain topics.

So, around the election, for example, if there are changes to, you know, the date, the time, how you vote, you can flood the information space with accurate information in advance, anticipating what some of the disinformation narratives might be and getting out ahead of them, as opposed to continuing to put out a thousand different disinformation fires, which is just too much; it would suck up all of your time, and you wouldn't cover anything else. And I don't think it's particularly effective, because all of the research has actually shown that by pointing to something false that somebody already believes, you tend to reaffirm their belief in the false statement. So that's a sort of rambling answer for you: it's a case-by-case basis, because I do think sometimes a narrative is salient enough that it has to be covered for those who are able to be convinced. We always talk about sort of the fence-sitters. The people who are deep into the conspiracy theory world, they're a bit lost. I'm more concerned about the high school student who's running a generic Google search where the top thing that pops up is a disinformation narrative. One of the things we've pointed out is just how effective the Russians are at getting their content in front of people, which I can talk about a bit with data voids. Or just, you know, your aunt or uncle who's on Facebook, who's maybe not a conspiracy theorist but is getting this stuff pumped to them constantly. There, actually getting a debunking or fact-check article in front of them is helpful, something to point to and say, hey, you know, this actually is a little bit suspicious. So it's a case-by-case basis, honestly, and I don't have the perfect formula for it. Because, again, I think I've been wrong many, many times in covering things that I shouldn't have covered.

FASKIANOS: Let's go to Rickey Bevington.

Q: Hi, there. Thanks so much. This is wonderful. I'm a Marshall Memorial Fellow, class of 2014, so part of the family, and yeah, I'm a big fan of the German Marshall Fund, obviously. And I've traveled through Serbia and Ukraine and Hungary actually studying disinformation and press freedom. So here in Georgia, I work for Georgia Public Broadcasting, a statewide public radio and television network. Rickey Bevington. We may be sending our first QAnon member of Congress to Washington; there's a primary runoff on Tuesday, Marjorie Taylor Greene. And so for me as a journalist, and this is actually a question maybe anybody can help answer, my strategy so far has been simply, you know, there's no convincing a QAnon voter, but I'm going to keep doing what I do so that when they're ready, they can come back and learn from my political coverage. Rather, and this goes back to the previous question about opportunity cost, we're going to keep doing what we do, and when people are ready, they can come to us, you know, if they choose to. But I actually just wanted to ask, particularly as a local journalist covering a QAnon candidate, do you have any ideas on what we might be doing as a newsroom? Rather than simply throwing our hands up and saying, well, people in the fourteenth congressional district are going to do what they're going to do, is there something that a local journalism organization can be doing more of in this particular instance?

SCHAFER: That's a very challenging question. And I'll say I sort of have multiple connections to your path, because I lived in Hungary for three years, which is how I initially became interested in conspiracy theories, just because it is such a great breeding ground for them for some reason. I also went to high school in Savannah, so I have a Georgia connection there. The QAnon stuff is really challenging, and conspiracy theories are so much more challenging than just traditional disinformation, because they're almost irrefutable. If you continue to point out the fallacies in conspiracy theories, or even if you get a former conspiracy theorist to sort of, you know, flip their position, even, you know, the couple of times Trump has lightly denounced QAnon, it all gets worked into the conspiracy, that these are just, you know, ploys being used for cover. So it's an impossible thing to fight against in many ways.

So I don't have a great answer on how you cover QAnon and conspiracies in this context, other than I think it is more beneficial to continue to point to some of the mechanisms of how conspiracies spread in general, without actually calling out QAnon specifically, because I think that, again, drives anyone who's sort of a QAnon supporter further into that camp. Because, of course: look, here's the publicly funded media telling us again not to believe QAnon, which is exactly why we should believe QAnon, because they're all part of this massive conspiracy. So I think what's helpful is going to the architecture, sometimes, of how disinformation narratives spread, and by that I mean pointing out, for example, the use of automation and how systems are manipulated. So you're really getting down more to the tactics that are used to try to get narratives in front of people, as opposed to pointing out the fallacies in some of the narratives themselves. It's a more complicated story to tell, because there usually needs to be a little bit of technical savvy to be able to point to, okay, here's how this narrative started in the wilderness of a 4chan message board and then ended up, you know, in this Facebook group. But I think sometimes telling that A-to-B-to-C-to-D story is effective. And it also gives you a bit of arm's-length distance from just calling it out as being crazy, which it is, but that's not usually a very effective approach if you're talking about trying to actually convince people who are leaning in that direction to come back over to the other side.

FASKIANOS: Thank you, let’s go to Kala West.

Q: Good afternoon. I'm Kala. I'm from Philadelphia. I work for WURD radio. In covering elections, I focus on a millennial-based show for the younger audience. Now, with millennials and the generations under us, we are very social media heavy, and, you know, that clickbait gets us. My question is, as a producer and a host, how do I let them know that what they clicked on was really a catfish and that's not the truth? Because we already have an issue with them going to the polls. So now the polls are closed and you have to register to get a mail-in ballot, which is an extra step that's not technology friendly, because you have to literally write in the ballot, which some people are just not doing. So how do we cover that and get them to understand that, hey, this is important as well, just as going to the polls? You need to know this and you need to listen to this type of content, versus looking at what's coming up in the advertising section of TikTok and Instagram.

SCHAFER: Yeah, I think the key there is being the credible local voice for communities. So you're the one out front saying, here's exactly how to vote, logistically. These are the steps you take. If you're hearing any other narrative out there, because again, there are going to be too many for you to cover all of them, it's false. This is how it works. Here are the people you can go to for reliable information. Here's your local election board. So the point I try to stress when I talk to groups like the National Urban League or the League of Women Voters, who are more of the voting activists, is to be the person in your Facebook communities or on Twitter who's out there saying, you have a badge of credibility because of who you're connected to, whether you're a reporter or an activist, and saying, these are the steps, this is it, this is how it works. Anything else you're hearing is not accurate. That's really the only way I think you can fight it, because to your point, there is so much out there in terms of false narratives that by the time you have done the legwork to dispute one thing, there are fifty others that have popped up, and you're just constantly chasing your own tail.

So this is where I'm more of a fan of doing this pre-bunking, where you're just aggressively flooding the zone with correct information, so those who want to find accurate information are finding it. And this is maybe a good time to talk very briefly about data voids, because I think it's really important. The concept of a data void, essentially, what that means is: search engines work really, really well when there's a lot of credible information on a topic. You run a query and it's going to give you something reliable most of the time. So if you run a query on, you know, World War II history, it's going to take you most of the time to a credible site. If there is a lack of credible information, the search engine still works to give you whatever is available. And disinformation operators understand this, so they target these data voids, where there's a low amount of information, to push out false narratives. And they're very, very savvy about this. I mean, white nationalists have exploited this for years in ways that are, frankly, I'm careful about using this term, but fairly sophisticated. So they'll misspell things like "Holocaust" in posts, because they know that one out of every twenty users, when they type it into a Google search, will have a typo in there. And if they have a typo, you're now getting, like, Stormfront as your number one source of material on Holocaust information.

So there are constantly ways bad actors, and again the Russians are amazing at this, sort of carpet-bomb search queries that are important to them and flood the zone for when people search for them, when they're looking for good information. These are not the conspiracy theorists who are deep, deep, deep into their own web of weirdness on 4chan and Reddit. The people I'm talking about are, like, the fifteen- or sixteen-year-old who's writing a report for school, or someone who just wants to know, where do I vote? We need to flood that information space with credible information, because it actually works on the back end with the algorithms as well. Google knows, you know, you're a local reporter, you're credible; if you put out enough stuff there, it will serve up your content at the top. But if you're not putting out that content, or others, you know, the activist base, are not putting out that content, it's just going to serve up what it has, which could include, you know, Gateway Pundit and all of these different fringy sites in the U.S., but also Sputnik, RT, and others. So that's the way I think you can really fight it at the algorithm level, to make sure that those who are just running queries because they want to find information are actually getting good information. Because there's not much, frankly, the Googles of the world can do; they can't anticipate every search query. And if there's not information there, it's just going to give them what's available. And if what's available is low-quality information, or specifically targeted to give them bad information, that's what they're going to get.
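The typo-targeting tactic Schafer describes can be sketched in a few lines. This is an illustrative toy, not any actual operator's tooling: it simply enumerates the one-character misspellings of a query term that a data-void exploiter would seed content against, and that a newsroom or researcher could equally monitor to see what those queries return.

```python
def typo_variants(word):
    """Generate simple one-character typo variants of a search term:
    single-character deletions plus adjacent-character transpositions."""
    variants = set()
    for i in range(len(word)):
        # deletion, e.g. "holocast" from "holocaust"
        variants.add(word[:i] + word[i + 1:])
    for i in range(len(word) - 1):
        # adjacent transposition, e.g. "holocuast"
        variants.add(word[:i] + word[i + 1] + word[i] + word[i + 2:])
    variants.discard(word)  # keep only actual misspellings
    return sorted(variants)

# Each variant is a low-competition query a bad actor can dominate.
print(typo_variants("holocaust"))
```

In practice one would run each variant through a search API and check whether credible sources appear in the top results; an empty or fringe-dominated result set marks a data void worth filling.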

FASKIANOS: Thank you. Let's go to Charles Robinson next.

Q: Hi, this is Charles Robinson from Maryland Public Television. Last go-around we had a lot of actors posing as African Americans, kind of, you know, telling people that you don't want to vote for Hillary, or they were Bernie supporters or whatever. And then later, we found that they weren't. I want to talk about that, but I also want to drill down on this idea that during this election, many states are looking at either giving out absentee ballots versus having people physically come in. I can tell you, in my state we've been going through rounds. They were supposed to come up with a decision yesterday. They didn't. And I'm waiting right now to find out what that's going to be like.

SCHAFER: So there are kind of two questions there. One, I've done a lot of work actually on the targeting of the African American community in 2016. I wrote a paper for the Urban League last year about this. Some great research has been done by Kate Starbird at the University of Washington, who, before we knew about the Internet Research Agency's targeting, had mapped out the conversation on Twitter between Black Lives Matter activists and Blue Lives Matter activists. She had figured out the community, who the influencers were, the patterns of conversation. And once we had the list of accounts that were, you know, determined to be Internet Research Agency accounts, she went back in and sort of colored in those dots in the network diagram. And they were dead center in the conversation on both sides, and were oftentimes the most polarizing voices. So while there was obviously, you know, a legitimate movement on both sides, you had the bad actors coming in, poisoning it just enough to move the needle a bit in one direction or the other. So that's the concern. We get these questions all the time, sometimes domestically, often overseas, of a protest movement popping up, and it's like, are the Russians behind it? If it's of any scale, my answer is almost always no. But can they integrate themselves into a protest movement with more divisive narratives, more polarizing content? We look at it as a brush fire that's already started, and they're just moving it at a slightly different angle so it burns the way they want it to burn.
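The kind of network analysis Schafer attributes to Starbird's team can be shown in miniature. The account names and edges below are entirely hypothetical, and real research uses far richer graphs and measures, but the core step, checking how central known troll accounts sit in an interaction network, reduces to a centrality computation:

```python
from collections import defaultdict

def degree_centrality(edges):
    """Degree centrality over an undirected interaction graph: for each
    account, the fraction of all other accounts it interacts with."""
    adj = defaultdict(set)
    for a, b in edges:
        adj[a].add(b)
        adj[b].add(a)
    n = len(adj)
    return {node: len(nbrs) / (n - 1) for node, nbrs in adj.items()}

# Hypothetical interaction edges; "troll_1" sits at the hub of the graph.
edges = [("troll_1", f"user_{i}") for i in range(5)] + [("user_0", "user_1")]
centrality = degree_centrality(edges)

# "Coloring in the dots": score only the accounts on a known-troll list.
suspected = {"troll_1"}
print({acct: round(score, 2) for acct, score in centrality.items()
       if acct in suspected})
```

A suspected account scoring near the top of the centrality ranking, as the IRA accounts did in Starbird's diagrams, is the "dead center of the conversation" pattern described above.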

To the second question, about the lack of clarity in how people can vote this time around: this, I think, is the number one concern, and the closer we get to Election Day without clarity, the more problematic this is going to be. Because anytime there's an information void, a gap in information, it is going to be filled by people who want to get their own opinions out there to manipulate the conversation. Wisconsin is a great example, where you saw things flip back and forth ten different times before the primary; nobody really knew what you could do. We are talking with a lot of election officials about the need to come up with a plan now and start actively and aggressively launching a public communications campaign around it. To think that the average voter is going to be able to find this information on their own is really, really problematic. And if you are just waiting to see what the state of U.S. public health looks like on October 15, and then coming up with your contingency plan, we are in big, big, big, big trouble. So we're reliant in some ways on them getting their act together. I think some states have. We've had conversations with New York for months now, and at least they're asking the right questions.

But it's really going to come down to them having a plan well in advance and communicating that plan. Because otherwise, every possible opportunity is out there for bad actors to try to suppress the vote by giving people false information. And I'd say the one thing to look at specifically here: we look a lot at Facebook, at Twitter, that's obviously the major concern, but also smaller communities like Nextdoor, which gets really down to the neighborhood level. If you're running a campaign that is trying to suppress the vote in a very specific community, Nextdoor is a perfect place to run it, because you can target down to, again, a couple-block radius and say, hey, you know, we're having a massive outbreak of coronavirus right now, which may not be true, so you scare people away from the polls. Or you say, you know, they're having this problem, so we're switching the date when you can register, this polling place is closed. All of these opportunities are going to be out there. And the fact that you can get down to the community level on some of these social media platforms is very, very concerning, because of course, if you're talking again about the Black vote, and you want to suppress the vote in a certain community, you go into that page on Facebook, that community on Nextdoor, and you just target that community with very, very specific false information about COVID. And if the state, or I should say the county, the local election board, hasn't actively communicated what its plan is, you know, people are going to believe the information that's out there.

FASKIANOS: I'm going to Andrea Scott next.

Q: Hey, thank you so much. I'm Andrea Scott. I'm the editor of Marine Corps Times and Military Times. I'm a little behind the curve here, but could you talk a little more about the specific disinformation operators you're seeing the most? Are these big think tanks in Russia? Is this the government? Are these white nationalists? Who are they, and what are their tactics? I think it's very interesting.

SCHAFER: So, just to qualify, we look specifically at foreign actors in my work. I mean, it's almost impossible to fully separate the two out, because of course the foreign disinformation actors are trying to weave their narratives into domestic communities, so the two are intertwined. But I don't specifically look at what's happening in white nationalist communities; that's not my point of origin for looking at disinformation narratives. In general, I would say I'm definitely more concerned about domestic actors than foreign actors, for sure. You know, the motivation is there, the money is there, they understand the local context better, the communities are authentic and organic. It's just easier to do. But in terms of who those specific people are domestically, it's sort of the usual players; I'm not sure I could tell you one group to be more concerned about than others. I will say, given the outlets you write for, we've always been very concerned about the targeting of people with any affiliation to the military. There's one guy, Kristofer Goldsmith, I think is his name, who worked for Vietnam Veterans of America and, in his free time, started looking into all of these Facebook groups that were being set up basically spoofing his organization and other military organizations on Facebook, and found, I mean, thousands that were targeting military members. Some were just clearly poorly run operations out of the Philippines; it really wasn't clear what they were trying to do. But we know Russia has tried this forever, because of course members of the military have more credibility; their voices have more credibility in their communities. You know, they're influential. So they are the key influencers that some of the disinformation actors want to get their content in front of, because their voice means more in communities.

In terms of the foreign players, election-wise, I don't think China will get involved in direct election interference. Russia probably will. Iran has the capability to run hack-and-leak campaigns and to set up fake websites, but their messaging is really not particularly sophisticated, just based on the tracking that we do on our dashboard. Of the three, I would rank Russia far and away as the best, having the most established ecosystem. The State Department report yesterday points to just how many of these sort of carve-outs and cut-outs and semi-affiliated websites they have a piece of, which the other two actors do not. But China is rapidly catching up. One of the things about China is that their outreach on Western social media platforms is very, very new: 75 percent of Chinese diplomats and government officials have set up their Twitter accounts in the last year and a half. When we started looking at this in March of 2019, they had about 40 official accounts on Twitter; they now have 160. Many of them have a huge amount of influence there, and their state media has enough reach to actually be able to run a campaign that will resonate. Iran just doesn't have that; their state media is not well funded, they're almost a caricature of propaganda sites sometimes, and their messaging is just not good. We saw all three countries message aggressively around the George Floyd protests. Russia really, really gets it; they can tailor the message for specific communities on both sides, they understand the terminology. China is getting there; they're not quite as good. But we saw, for example, the Ministry of Foreign Affairs spokesperson for China, in response to a State Department tweet criticizing their actions in Hong Kong, just respond with "I can't breathe," you know.
So they're getting the language of the internet and how to work Western audiences. We saw Iran putting stuff out about, you know, the Great Satan, and this just being typical of the racist U.S., which just doesn't resonate particularly well. So of the three, I'm less concerned about Iran, but Iran still could be damaging in terms of leaking material. Russia would be the best. China is always a concern, but again, I think China is still too risk-averse to directly interfere in this election the way we saw Russia do it. My primary concern, though, is domestic players, for sure.

FASKIANOS: Thank you. Carla. Let's go back to you.

ROBBINS: Great, thank you. So just one thought and a question. When you were talking about that sort of community void, because politicians are failing or flailing, or both, at this point, I mean, this is a role for journalism, the same way we have had hotlines in the past: you know, tell us about your local rug-cleaning company that ripped you off. I know we've all got limited resources, but it would seem to me that if people feel like they're not getting adequate information, or they're getting disinformation, or they're getting something that they're confused by, you know, tweet your local newspaper and say, is it true that they moved my local voting place? Is it true that they've changed the date of the primary? Even by doing the outreach to your local community and saying to them, come to us if you're confused, be on the alert for this sort of disinformation or this gap or lack of information, A, you shame your local officials because you really point out the gap, and B, you play what is the most fundamental role of journalism. Listen, I understand there are limited resources, but that sort of hotline role in the run-up to the election seems like a pretty fundamental role that could be played. Certainly if I were back in the daily business, it's something I would be doing.

That's the first thing. But I suppose the question that I have is, I want to go back to that period that I'm obsessed with, which is, thanks John D'Anna, you can hire me, I work cheap these days. (Laughs.) I'm an academic. Which is this period of time from when the polls close to when we actually get the results, which is going to be a completely lunatic time, because we're into instant gratification and we're not going to get instant gratification. If I'm a local reporter covering the results in my state or in my community, how can I look at the trends, at what has picked up, who the influencers are, who's pushing what narrative in my local community? I think back to the French election, in which the French have this thing, which I find abhorrent, this cooling-down period for thirty-six hours before the vote, a blackout period. And it was during that period that the Russians dumped all of this hacked Macron information, with some Easter eggs in the midst of it that weren't true, or maybe they were, doggy poop, whatever you want to call them. And the French reporters, in part because they were blacked out, and in part because they saw how we had chased our own tails, didn't pick them up. But Jack Posobiec pushed it all back into France at a very fast rate. And thanks to Ben Nimmo and other people who went and tracked it and said, wait a minute, you're talking about 3,500 retweets in about a minute and a half, and this is, you know, a pattern that looks like Cozy Bear is behind it. Because they could look at the pattern of retweets, and the speed at which it was coming in, and the fact that it was coming from the U.S. back to France, they basically debunked it, because people who might have been tempted to run with it, including the American press, because we had no blackout, saw the pattern. So you guys do pattern work. You guys have the resources that local news organizations don't have. You don't work on domestic.
But you could help in this, because you do pattern work. How can you help, and who else is out there to help local organizations in this incredibly fragile time, to look at the patterns and to say, you know, there's something going on here, guys? And the fever-swamp theorists are out there. How can we track it? Because local news organizations aren't going to be able to do that; somebody has to help them.
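The velocity heuristic Robbins credits to Ben Nimmo and others can be sketched simply. This is a toy under stated assumptions: retweet timestamps are available (e.g., pulled from a platform API), and the window and threshold values here are illustrative, not operational cutoffs used by any real research team.

```python
def flag_bursts(times_sec, window=90.0, threshold=1000):
    """Return the peak number of retweets observed inside any sliding
    `window`-second interval, plus whether that peak crosses `threshold`.
    `times_sec` is a list of retweet timestamps in seconds."""
    times = sorted(times_sec)
    peak, start = 0, 0
    for end in range(len(times)):
        # shrink the window until it spans at most `window` seconds
        while times[end] - times[start] > window:
            start += 1
        peak = max(peak, end - start + 1)
    return peak, peak >= threshold

# 3,500 retweets packed into under 90 seconds: the Macron-leak pattern.
burst = [i * 0.025 for i in range(3500)]
print(flag_bursts(burst))

# 100 retweets spread evenly over hours: unremarkable organic sharing.
steady = [i * 100.0 for i in range(100)]
print(flag_bursts(steady))
```

Coordinated amplification tends to produce sharp spikes no organic audience matches, so even this crude sliding-window count separates the two patterns; real attribution layers on account age, creation patterns, and cross-border flow, as the Macron debunking did.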

SCHAFER: Yeah, so there are a lot of things in those points I want to touch on, so I'll try to be brief on a couple of them before we get to that last question, which I think is key. On having local reporters monitoring their own space, as a place for people in the community who are not getting accurate information to go: one of the things that's also important there is escalating that to the platforms themselves. A lot of this, if it's at the local level, sits below the radar and doesn't rise to the level of getting Twitter and Facebook to take action, which they will in the election context. If there's misinformation or disinformation about how, when, and where to vote, that absolutely violates their terms of service, and they will take it down quickly if they see it. One of the problems is there's, again, so much stuff out there that they have trouble sifting through the millions of reports a day they're getting from average users. So this is where media can play a huge role, by flagging something and saying, essentially, hello Twitter, hello Facebook, this is happening, here's this thing, it's been up here two days, and then that content will be taken down. So I think that's an important role.

The other part of that, too, in terms of the period after the vote, when we're not going to have results: I think the key for local reporters there is to prep audiences or readers in advance for the fact that this is going to take a long time this year, and to lay out what it's going to look like, why it's going to take this much time, which counties are going to be able to report first, just the mechanics of how the vote-counting process will work this year, because it's totally different from anything we've ever seen before. And to your point, we're all conditioned to knowing the next day. So as this goes from a day to two days to three days to four days, suspicion is just going to start bubbling up. So I think there's got to be a ton of work done in advance to condition audiences, to say we're just not going to know this year, this is why, and here are the steps that are being taken to protect the vote, etc., etc.

The final question, of resources: understand that resources for us are, frankly, tough sometimes, because in terms of disinformation, it's really just me at ASD who works on it, and every day I have more interesting leads than I can track down. But there are a couple of groups out there specifically around the election. I'm forgetting some of them, but Common Cause and a couple of other election advocates are working on a system that allows trusted reporters, I think is what they're calling them, to escalate information to them to track down. There's another initiative being run by Stanford, Stanford's Internet Observatory, working with Graphika, which of course is the private social media analysis company in New York that we do some work with, the University of Washington, and the University of Wisconsin, doing the same thing. It's essentially a ticketing system for trusted reporters to be able to flag things for them. Stanford's got a team of, you know, forty grad students who can do the initial legwork to go through and say, hey, this is false, but it just seems to be misinformation posted by your crazy uncle, as opposed to, this appears to be an actual info op or disinformation campaign, and they can then escalate it. So I think the election integrity project at Stanford would be a good one to get in touch with. I know they're actively reaching out right now to try to find, partners is probably the wrong term, but, again, trusted reporters who they know are going to be a bit more credible in terms of what they're reporting to them to track down. DFRLab at the Atlantic Council is pretty good, too, but again, all of these groups are pretty overwhelmed with the amount of incoming per day. So the Stanford project would be one.

ROBBINS: Is First Draft News still doing it for the American election?

SCHAFER: I know less specifically about what First Draft is doing, other than that they do great work.

ROBBINS: And if you can come up with a list and email us, email Irina. I'll do a little bit of research, we'll try to pull together something, and we'll also ask you guys who are on this call, if you have sources, to shoot them to us at the Council. We'll try to pull something together in the next couple of weeks and then share it with the group. Because I think this time period is going to be so fraught that any resources everybody can have to turn to, to basically hold down the brush fire in those days, seem incredibly important.

FASKIANOS: We have one last question I want to try to squeeze in, from Hezi Aris. If you can make it brief, we'll go for just a couple minutes more.

Q: Well, thanks very much for taking my question. I'm actually the publisher and the editor of the Yonkers Tribune. I was really shocked to hear that you were going to talk about us, but what you should know, and you may not have known, is, first of all, I learned about Ms. Faskas' campaign. EMILY's List backed her, and I gave note of that. And then I gave note of another supporter that wrote a letter supporting them. And then Mr. Averko came up with another position. So we took it not just as a false narrative, but rather as an opinion, and it sounds to me like much of this conversation has tended toward paternalism rather than simply reporting what was out there, and people do have an opinion. In fact, I tried very, very desperately, many, many times to reach the campaign; they never returned the call. The only time they did return a call, it was not directly from her, but from someone who claimed they were a representative for her campaign. And what was interesting is they wouldn't give a name, and yet I spent an hour with them on the phone. And all I suggested to them was, okay, you don't need to agree with anything that was said, but we are open to your perspective. Just write what you want to say, tell us what's wrong or tell us what isn't wrong, tell us where you think we failed. You've got carte blanche to write what you like. Never heard from them. And yet we published two other op-ed pieces from individuals who were supportive of them. And I wasn't going to chastise them or tell them what to think or how to think. And maybe it's a new world, maybe they're bad actors, as you're suggesting, Mr. Schafer. And no, I don't have the time to track everyone down. I was aware of Mr. Averko, but it's just an opinion. So what are we supposed to do when we call up a campaign and they don't respond?

ROBBINS: I'm sorry, can I respond to that as a former deputy editorial page editor at the Times? And it's not that we haven't made mistakes, too. I think the issue, when you have someone writing an op-ed for you, is the identification of their interest and what their sourcing is. If the Strategic Culture Foundation is indeed a front for the SVR, the Russian intelligence service, and you don't tell your readers what their background is and who's paying the bills, then it's not just a question of an opinion; you haven't adequately informed your readers of who's got what dog in what fight. I don't think that's paternalism.

SCHAFER: I think the context is the key, particularly in the U.S. information space, where, you know, we have the First Amendment. I think it was actually a mistake a couple of years ago when a lot of the conversation around disinformation focused on content, which was always going to be inherently problematic, as opposed to the actors and the behaviors being used to push out the content. But it's complicated, because a lot of times you do see these figures who are Americans, we know that for sure, who have anti-U.S., anti-Western foreign policy views, fine, and who appear on RT and Sputnik and write for all of these outlets. And it's very difficult to disentangle who just has a genuine opinion, who has used some of these sites for, you know, more leverage for that opinion, and who actually has deeper connections to sites that are funded by foreign intelligence services, foreign adversaries. So I would just say the context is key. I don't think there's necessarily a problem with running that op-ed. I think the issue, which understandably wouldn't have been known because this report has not come out yet, is the fact that this is a guy who writes for an outlet that is funded by a foreign intelligence service. I think that's key for a reader to know.

FASKIANOS: With that, we are at the end of our time, so I want to thank you both, and all of you, for this discussion. It's really been a fruitful hour. As Carla said, we're going to collect sources and circulate them to you to help with your reporting. So, Bret Schafer, thank you very much for being with us today. We really appreciate it. As a reminder, you can follow Bret's work at the Alliance for Securing Democracy @securedemocracy, and you can follow Carla on Twitter @RobbinsCarla. So please follow them. Please visit CFR.org and ForeignAffairs.com for the latest on Election 2020 as well as other foreign policy issues. And again, please share your suggestions and feedback with us for future Local Journalists Webinars by emailing us at local [email protected]. Thank you.
