Political Disruptions: Combating Disinformation and Fake News

Speakers
Benjamin T. Decker

Research Fellow, Misinformation Project, Shorenstein Center on Media, Politics and Public Policy, Harvard Kennedy School

Joan Donovan

Media Manipulation Research Lead, Data & Society

Kelly M. Greenhill

Associate Professor and Director, International Relations Program, Tufts University; Research Fellow, International Security Program, Belfer Center for Science and International Affairs, Harvard Kennedy School

Presider
Richard Stengel

Political Analyst, MSNBC; Distinguished Fellow, Digital Forensic Research Lab, Atlantic Council; Former U.S. Undersecretary of State for Public Diplomacy and Public Affairs (2014–2016)

Introductory Remarks

Irina Faskianos

Vice President, National Program and Outreach, Council on Foreign Relations

As the final session of the 2018 College and University Educators Workshop, Kelly M. Greenhill, Joan Donovan, and Benjamin T. Decker assess, with presider Richard Stengel, the challenges of disinformation and media literacy and their role in U.S. democracy.

FASKIANOS: I am pleased to welcome you back to Peterson Hall for our final session, which I’m going to turn over to Richard Stengel, who is the longest-serving undersecretary of state for public diplomacy and public affairs in American history. I’m sure you all recognize him—(laughter)—from his daily appearances on MSNBC as an analyst. And he’s working on a forthcoming book on combating violent extremist messaging and disinformation around the world. So nobody better to run this panel than Richard Stengel—

STENGEL: Thank you.

FASKIANOS: —on combating disinformation and fake news. So, Rick, over to you. Thank you.

STENGEL: Thank you, Irina.

Hey! Good morning. Good afternoon.

AUDIENCE MEMBERS: Good afternoon.

STENGEL: Wow. How was your day?

(Cross talk.)

STENGEL: Hi. So what did you—so you’ve been here since what time?

AUDIENCE MEMBERS: Yesterday.

STENGEL: So are we—we’re the last thing that you’re doing today? And really—and the most interesting thing, right?

AUDIENCE MEMBER: Yes.

AUDIENCE MEMBER: Uh-huh.

STENGEL: And then do you go home tonight, or you stay?

AUDIENCE MEMBER: First we go to the bar. After you, we go to the bar.

STENGEL: (Laughs.) OK. (Laughter.) Well, I asked if they could have a bar here, but they said no—(laughter)—because drinking contributes to disinformation. I don’t know if you know that.

All right. So welcome, everybody. So I’m going to do a very lazy thing, which is instead of giving that nice introduction that I got, I’m going to ask everybody to introduce herself or himself. And it’s really—I mean, I think—we actually have been chatting for the last half an hour. This is—this is something we’re all obsessed with. This is something that’s really of the moment. I imagine for folks in academia this is something that students want to talk about and know about, and I do think when and if we get answers to all of this it’ll come from people who are digital natives.

Just to annotate what was said: at the State Department, I looked after two things, one of which was the only entity in government that combated ISIS’s disinformation. And then I started the only non-classified entity in government that combated Russian disinformation. And the book is about sort of the combination of those two things, and I learned a tremendous amount just in the last half an hour.

So, Kelly, why don’t you start, and then we’ll go down there. And we’ll start with, like, what’s the nature of the problem we’re looking at.

GREENHILL: I am Kelly Greenhill. I am a political scientist. I am on the faculty at Tufts University and a research fellow at the Belfer Center at Harvard Kennedy School of Government.

DONOVAN: Hi, everybody. I’m Joan Donovan, and I’m the media manipulation and platform accountability research lead at Data & Society, which is a nonprofit independent research institute just down the road in Chelsea. So I’m really happy to be here. And what we study at Data & Society is the impact of sociotechnical systems on society, and so my research really looks at disinformation and media manipulation from that point of view.

DECKER: And my name is Ben Decker. I am a research fellow at the Shorenstein Center at the Harvard Kennedy School, principally focused on political disinformation around the midterm elections and the 2020 presidential.

STENGEL: Great. So, you know, when we have these discussions—and I assume you have the same reaction I do, and you probably even more so—is that we don’t really have a great glossary of terms of defining disinformation, fake news, et cetera. And that’s the preamble to what is the—what is the big problem here that we’re all looking at? And what is it that you’re looking at in particular that embodies that larger problem? And you can do a little bit of a definition of the terms, too, while you’re talking about it, Kelly.

GREENHILL: Well, the definition of the terms I’m not sure I want to tackle because they’re highly contentious and different people use different definitions.

STENGEL: All right, out the window.

GREENHILL: And I can—I can share the terms that I use, but I have to say they’re not universally accepted. And it’s not because I’m particularly embracing controversial terms, but rather that the terms themselves are under contestation, and even the nature of the problem remains the subject of debate and deep contestation.

I would argue that the fundamental problem, or the most significant problems, are not what we would think of as fake news per se—as in demonstrably fake, easily identifiable as fake by most people—but rather misinformation, disinformation, and what I call extra-factual information: information that is unverifiable, or whose truth you can’t determine at the time of transmission, but which nevertheless serves as an actionable form of intelligence for those who either believe it or treat it as true, or those who would use it because they believe other people will treat it as true.

And so, when we have these elisions of the true and the false, or things that seem like they could be true—things that are “truthy”—that’s—those are the pieces of information I think are particularly problematic. And that’s the sphere where it’s particularly difficult to debunk them and where it’s easier for those who spread this kind of information to push back and say, well, this part’s true, this is true—or they don’t even say this part’s true, but how could you contest X, Y, and Z, because here are other mainstream media sources saying X, Y, and Z are, in fact, true?

So it’s this gray region, this gray zone, where it’s not traditional disinformation but a combination of misinformation and play on rumors, conspiracy theories, and this sort of gray propaganda. That’s where I think the nub or the crux of the problem lies.

STENGEL: By the way, those terms—

GREENHILL: Yes.

STENGEL: —the gray zone—all from Russian active measures. They’ve been doing it for a million years, and Ben will talk a little bit about that.

But, Joan, piggybacking on that—hasn’t this always been around? I mean, confirmation bias has existed since the Garden of Eden, I believe, which is where it was first written about. (Laughter.) I mean, so how do you—how do you combat that?

DONOVAN: It was a good apple. (Laughter.)

GREENHILL: Or a quince. (Laughs.)

DONOVAN: Yeah, maybe. I’ll have to check Wikipedia. I think—(laughter)—it’s an authoritative source these days—(laughter)—as, you know, I think Facebook and YouTube are now leaning on it to be the arbiter of truth in this space.

Yeah, I mean, we’ve always had attempts to deceive, right? Delusion/deception has always been something that is the hallmark of persuasion, right? It’s part of our entire media ecosystem. Advertising is built on persuading people that products do things that are magical. There’s—you know, Stuart Hall’s got some great writing on this.

And so when we research media manipulation at Data & Society, we think about how is media used to manipulate people in groups, but then also how do people in groups use media to manipulate others. And I think that what we’re dealing with now is this moment where we’ve moved beyond gatekeepers. So the cost of broadcasting right now is very, very low; and the cost to society, then, is very high because we don’t have enough people that are good arbiters of truth that do the peer review collective work.

You know, truth is a collective response to a problem, right? We decide that things are true. I don’t want to sound too much like a social constructionist here, but we decide when a table is a table, right—(laughs)—and not a chair. And so, you know, but a 3-year-old might think differently, which is why you have to teach people these things. You have to socialize people into truth. And I think that with the rapid uptake of the internet and the ubiquity of computing, we’re at this moment where a lot of the gatekeepers that we’ve come to rely on—such as educational systems, journalists, family and friends even—we are unable to decide what is truth in the moment.

And the people that we study who use media to manipulate other people know that, and they know that there are times when information is critical—especially, what we study, during breaking news cycles, where we see a lot of what we call source hacking, which is hacking source material. It’s making shit up—(laughter)—so that in the moment they can piggyback their ideas or their political opinions into the story. So it doesn’t surprise my research lab, every time we see a mass shooting, that we see the misidentification of the shooter—identifying either this young internet comedian named Sam Hyde, or Antifa, which is a popular conspiracy theory, or crisis actors. And in these moments we know what to look for because we are a highly skilled research lab, but it doesn’t mean that anyone popping onto Twitter and looking at these keywords knows the difference between a post that is targeted disinformation and a post that is maybe what we call misinformation—information that’s being circulated in the moment that we all know is unverified.

Right now we have this mixing of verification signals, too, that make it even more difficult to discern who is telling us the truth. For instance, on Twitter you’ll see the blue checkmark. Lots of people interpret that as being verified and sanctioned by Twitter as someone who has been vetted and is a good source of information. Of course, when Twitter hears me say something like that they say, no, all we’ve done is say we know that this person is this person because they’ve provided their driver’s license to us. But the signals that the public receives are very different.

And I’ll shut up there. (Laughter.)

STENGEL: So, Ben, I wanted—

GREENHILL: Can I just pick up—say one thing in addition?

STENGEL: Absolutely.

GREENHILL: To speak to Joan’s point, all of this is exacerbated by the fact that we’re in the midst of a fundamental trust crisis. We don’t trust our institutions the way we used to. It’s not simply that the gatekeepers have been pushed aside; many of the gatekeepers are not trusted. So Congress—you know, 12 percent of Americans think that Congress can be trusted. About the same number think that the president can be trusted. Even the media, newspapers—on a good day, right, it’s about 30 percent.

STENGEL: But I—my reaction to that always is that there was a false amount of trust once upon a time. I trusted the government when they told me if I put my head under my desk I would be prevented from dying from a nuclear holocaust. (Laughter.) I trusted that.

GREENHILL: Yeah, that’s right.

DONOVAN: You had Wikipedia, though. (Laughter.)

STENGEL: Yeah, so—(laughter)—

GREENHILL: But it does make it harder in this context—you know, who is supposed to be the arbiter of truth?

STENGEL: So, Ben, you’re—so I was actually at the Shorenstein Center last year, and they do fantastic work. And Nicco Mele, who’s the head of it, is a—is a great guy—for hiring you, too, by the way.

DECKER: Thank you. (Laughs.)

STENGEL: But you guys are actually doing some stuff about this. And one of the things that I always say is that we don’t have a fake news problem, we have a media illiteracy problem, that people don’t actually understand. And now that I’ve been out of media for a while, it’s like, I’m amazed that people—they don’t—they don’t understand the sources of information or how to verify anything. Like, even my flippin’ kids don’t know that. But you guys are going to solve that problem, right?

DECKER: Yeah, and I think it’s very interesting because there’s been—(laughter)—yeah, and we are going to solve it, by the way, just casually—(laughter)—because that’s what we do.

You know, it goes beyond a public media literacy problem; it’s actually become almost a newsroom literacy problem, where, you know, journalists, obviously, are very overwhelmed in this day and age at the number of, you know, breaking news stories and potential misinformation that just show up in their, you know, digital monitoring streams every day. And people don’t have time to really understand that nefarious actors on 4chan might be getting ready to, you know, blitz the French presidential election with a falsified story about, you know, millions of euros in a fake Cayman Islands bank account. Nobody has the sort of foresight to understand, oh, like, who is this alt-right community, what is the alt-right, why do they matter to me. And, you know, for, you know, ABC, the alt-right only mattered after they were tricked into falsely, you know, reporting the alleged white nationalist connection of the Parkland shooter. And, you know, journalists don’t really understand that source hacking is taking place. So, you know, one of the biggest things that we’re trying to do is, you know, talk to, you know, newsrooms across the board about things that, you know, you can do to verify and understand information.

You know, I take it that everyone’s probably seen the Emma Gonzalez GIF tearing up the Constitution, as it’s, you know, circulated pretty widely. Now, what’s crazy about that sort of disinformation campaign is that to debunk it and understand that it emerged from 4chan, made by people who wanted to tarnish her image, it took thirty seconds to do a Google reverse image search. And thankfully, that 4chan thread was actually indexed by Google, and the image hashing that, you know, searches through images on Google actually properly identified, you know, that instance. So, you know, I think that there’s a lot of very advanced, you know, technology around media manipulation; however, you know, on a very sort of scalable human level there are really simple things.
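(A minimal sketch of the image-matching idea Decker describes, for readers who want to experiment: perceptual hashing reduces an image to a short fingerprint so that near-duplicates can be found even after resizing or recompression. This sketch assumes the third-party Python packages Pillow and ImageHash, and the file names are hypothetical; it illustrates the general technique, not Google’s actual reverse-image-search pipeline.)

    from PIL import Image
    import imagehash

    # Perceptual hash: a 64-bit fingerprint derived from the image's
    # low-frequency structure, robust to resizing and recompression.
    original = imagehash.phash(Image.open("original_photo.jpg"))       # hypothetical file
    suspect = imagehash.phash(Image.open("circulating_version.jpg"))   # hypothetical file

    # Subtracting two hashes gives the Hamming distance between them:
    # the number of bits on which the fingerprints disagree.
    distance = original - suspect
    print(f"Hamming distance: {distance}")

    # Rule-of-thumb threshold: near-identical images land within a few
    # bits of each other; a doctored or unrelated image lands farther out.
    if distance <= 8:
        print("Likely the same underlying image")
    else:
        print("Substantially different; worth a closer look")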

And, you know, we should be talking about reverse image searches at the dinner table once in a while. And I think that, you know, there needs to be this sort of trickling down of skillsets from the expert to, you know, your everyday digital navigator because, you know, it’s almost, you know, like when we get on an airplane, we put on a seatbelt. That is just something that we are told to do and we understand to do. I think we need to incorporate some more of these things into the daily digital appetite because, you know, I think what, you know, has really ultimately been said here—and by the way, the terminology I think we all do agree a lot more on without ever having spoken about it—is really, you know, the overamplification of the Rashomon effect.

I think one of the biggest sort of teaching, you know, media examples to understand this is the 1950 Kurosawa film Rashomon, where, you know, essentially there’s my truth, your truth, and the truth, and nobody’s lying—creating this ultimate, you know, lack of trust and understanding about, you know, what is—is my truth the social truth, or is that limited to my perception? And what is the value of my perception or another’s perception? And I think source hacking has really created a sense where we don’t know who we should be listening to.

STENGEL: Well, that—I mean, if we’re going back to the nature of the problem—not to get highfalutin about it—the nature of the problem is truth. You know, is there such a thing? And I think people think that there is. And there are Rashomon truths, and there are other truths, like two plus two equals four. But I think this expectation that there is a single, uniform, universal truth is part of the problem, that people get deceived. So—

DONOVAN: Well, there’s that, but then there’s also just events that happen that, when you are putting so much of your newsroom efforts into looking at what’s happening online, that you don’t get the sense of the local anymore. And so even our basic reporting that would have, you know, come up under BBC no-comment style streaming reporting, we’re not even getting the basic information about events that happen. We’ve seen attempts to spam Twitter into getting reporters to cover fake weather events, as well as fake chemical explosions, and these are beta tests for trying to get newsrooms to cover more contentious events. And so even the nature of when an event happens, being able to check back on the local has become a very big problem for journalists.

STENGEL: Well, but, like, I’ll reverse that. Once upon a time people have bemoaned the fact—and, I mean, I was in the media business my whole life. Well, now everybody cuts back. They don’t have correspondents anymore. They don’t have, you know, people in Paris or Istanbul. Well, now people there have phones and they can record things in real time. And we used to send correspondents there because the people reading the Chicago newspaper couldn’t read the New York newspaper, so they each had to have a correspondent in Paris. Now you have millions of people who are reporters who are giving you what seems like real information. Shouldn’t that be a gigantic asset to news organizations?

GREENHILL: Seems, right? It seems like it should be. The question is—and I guess—I mean, I’m hesitant to use this analogy, but I’m going to do it anyway. There can be a surfeit of information, or we could think of it—like, the NSA is getting bombarded with information; which pieces of it are you going to pay attention to? Which of these photos are you going to treat as true if they’re in conflict with one another? Because, you know, yes, people are there with cellphones, but we also know that cellphones can be—photos can be doctored.

STENGEL: So that—

GREENHILL: And so which pieces of information do the folks in the newsroom, who don’t have someone in the Paris Bureau, decide to run with if they’re on deadline, time is short? It’s not that corrections can’t happen, but once the erroneous, unverified, problematic report is out there, we can try to undo some of the damage, but you can’t—

DONOVAN: Well, there’s also the issue around they’re not even in control of the distribution of those images anymore. They don’t have time—

GREENHILL: The newsrooms.

DONOVAN: They don’t have time to vet them. They’re showing up on Twitter. And the issue with making sense of that source material is that in that sea of information there are authentic pieces and then there is stuff that—there are things that people have made in order to leverage the moment. And knowing when that leveraging is happening is—it’s an increasingly I don’t want to say new tactic, but it’s a—it’s a tactic that these what I call media movements have taken up.

So there are movements online of people that only care about destabilizing the media industry, right? So they’ll even sometimes impersonate white nationalists in order to try to get the media to pick up a story about white nationalism because what they really want to do is make journalists have to issue retractions and make journalists have to make corrections, and to destabilize the entire news network. And so you’re in this adversarial media space right now where we’re used to thinking about adversarial media as a political spectrum where you’ve got one outlet that leans more right versus one outlet that leans more left, but you’re not taking into account this third space of people that really just want to make either of those poles look foolish.

STENGEL: What do you do about that, Ben?

DECKER: I mean, I think that’s where the inherent lack of context through which we ingest information online is, I mean, the single greatest issue. And I think, you know, earlier we were kind of talking about the stale approach of fact-checking in 2018—you could almost equate it to putting a Band-Aid on a broken leg. You know, this is a tried method that we’ve been using, you know, for decades to counter Russian disinformation about how the government, you know, created AIDS. You know, that same sort of fact-check doesn’t work anymore. So I think being able to understand that this image originated on 4chan, and that this was the conversation that led to its creation—to really identify, essentially, the purveyors of disinformation, or the operators, what have you—and almost start to think more in terms of how an intelligence operation would work, based on proactivity rather than the reactive nature, I think, of our current media.

So, you know, I think one of the things that, you know, brought Joan and I together in terms of knowing that, you know, we were both, you know, real expert ninjas in this space was, you know, when we started talking about or being concerned about Charlottesville. You know, I think for most people Friday night, when the tiki torches came out, was the first real red flag. But I think, you know, we can concur that, you know, throughout July there had been a very large buildup of conversations, and it was hard not to assume that the Charlottesville rally was going to be a very big event. And, you know, unfortunately, you know, history kind of proved that correct. But again, if you shift towards this proactive understanding and building context for your audiences, you know, we could have socially, I think, been prepared as a society for what happened, whereas the media landscape didn’t sort of prepare the public for, you know, the shock and awe.

STENGEL: Well, one of the—and one of the things we were talking about beforehand is how technology is changing perception, and we were talking about—did people see the Jordan Peele Barack Obama video this past week? How many people saw that? Do you know what I’m talking about? Yeah. So this idea of invented video. So it was a video of Barack Obama allegedly slamming President Trump and various other things, and it was—and it was put together through I don’t even know what means.

DONOVAN: BuzzFeed.

DECKER: It was BuzzFeed.

DONOVAN: Yeah. It was a—it’s a—it’s what we call a deep fake. So this is the ability to make it look as if video is—the person in the video is saying—it’s a manipulation of the mouth movements to make it look like they’re saying what you think they’re saying. And Jordan Peele is an interesting choice because he does a very good Barack Obama impression, but there’s also software now that does it for audio. So it only needs a few minutes of someone’s normal speaking voice in order to parse out the syllables to recreate any typed conversation that it wants in someone’s voice. And I was having lunch with a BuzzFeed reporter today, and we were talking about wouldn’t it have been even crazier if they had used both technologies at once.

And what’s interesting about these kinds of technologies is that the industries they’re made for aren’t the purposes we’re scared about—they’re made for pornography, right? And a lot of our technology online starts with Vitamin P. (Laughter.) And, like, pornography’s role in the development of online technology, the development of online advertising, the indexing of search engines, even—you know, that’s one of the places that, as researchers, you tend to look for advances in technology, because they’re going to be at the cutting edge of how to make money off the evolution of these technologies. They were the first to leverage photo and video. And so you can look to other industries to start to understand. You know, unfortunately, political operatives are often the last technology adopters. And so it’s an interesting thing to see this technology, which probably would have been, like, what we call a nothing burger, you know, in that space, or maybe would be used to do funny things, now becoming a threat to national security.

STENGEL: So what’s—that’s a very “stormy” proposition.

DONOVAN: At the same time—(laughter). Zing!

DECKER: On top of his game today. (Laughter.)

DONOVAN: He is. Plenty of time for—

STENGEL: By the way, this is the most laughter there’s ever been at a Council event. (Laughter.)

DONOVAN: Yes, yes, it’s the end of the day, but we’re not done.

STENGEL: So I want to make sure that we’re getting around to the stuff that you all are interested in, and I think there’s going to be questions—right?—starting how soon? Yeah.

So but the issue, though—and even when we talk with each other and even when regular people go, well, what can we do about all this—I mean, as you were saying, you know, the political class is often the last to know, and the consumers even further behind. And the idea that people, in order to read their daily diet of information, are going to start using tools to, you know, reverse engineer pictures seems to me like kind of an unlikely proposition. What are—what can people do?

GREENHILL: I’m going to defer to my colleagues in terms of—I mean, I could talk about the variety of things that could be done with respect to platforms, but I think the two—

STENGEL: But is that, you know—

GREENHILL: I mean, that’s one piece of it. And then there’s taking steps that can empower individuals to be more discriminating and more intelligent—and I don’t mean that in any kind of you’re-not-smart way, but shall we say more discriminating and more skeptical—consumers of online media, as well as other media.

I would say, however—and this is a big but—that all of these actors—and they are actors—who are bombarding us with this information spend a whole lot of time making it as difficult as possible for us to distinguish between the true and the false. And what we know from the 2016 election, as well as from before and after it, is that a lot of targeting happens in ways that are designed to make you not want to go in and figure out whether or not this is true.

STENGEL: Yes.

GREENHILL: Because you’ve been—you have been chosen. You are the chosen one to get the content that speaks to what your worldview already says, like, yeah, this fits with what I believe about the world and it speaks to my concerns and my fears.

STENGEL: And that happens on the left and the right.

GREENHILL: That happens, yes, on the left and the right.

And to build off of something that Joan said, it’s not just the porn industry. There’s another industry that’s been playing in this pond in ways that we didn’t—or they didn’t—think about at the time, which is marketing, and how marketers build brand loyalty.

DONOVAN: Yes.

DECKER: Yeah.

GREENHILL: You know, there’s this entire strand of marketing called neuromarketing, and they’re figuring out how to trigger, you know, your amygdala—things you’re afraid of, things you’re concerned about. And there’s a little part of your brain that speaks to perception of your body and to issues of disgust and hate. And they use MRIs. They use EEGs. They use a variety of sensors to figure out what pushes our buttons. And so we can talk about a variety of steps that can be taken to be smarter consumers of information and to check, you know, or try to fact-check, but they’ve got our number.

STENGEL: Yeah.

GREENHILL: And they’ve got our individual numbers.

STENGEL: And we talked a little bit about this earlier, about advertising—having once upon a time been out talking to advertisers and trying to get them to advertise in Time magazine. The problem with advertising in media is that there’s an illusion fostered right at the beginning, which is that you’re not paying for the content. The cost of the content is supplied by the advertiser. You’re not paying for the correspondent in the Middle East, or whatever. So there’s an illusion there already. And with this whole issue of Facebook, I always think, well, how do people think they get all of this stuff for free? Well, it’s not free. The cost is advertising, and advertising wants your information. So this whole ecosystem is a little bit poisoned by that.

DECKER: Do we socially understand—

STENGEL: By the way, that’s why I lost every advertiser whenever I—

GREENHILL: So—but it’s more than that, because these—it’s the ads, but then it’s the political actors who know how and why the ads work, who then say, oh, brain science tells me that if I trigger the amygdala—

STENGEL: Yeah.

GREENHILL: —you know, so it’s coming from the commercial sphere. It’s commoditized for a reason. We have no choice if we’re going to have access to free content. But then that technology and that knowledge is taken and used in ways that can affect our democracy and our political system.

STENGEL: And they’re still amateurs at it compared to the real marketers in advertising.

GREENHILL: That’s true, but they’re learning. Yes.

Oh, anyway, I’m sorry, Ben, you were—

DECKER: Oh, no. I mean, you know, I think socially, like, does the public—do we even as a community in this room—understand what we are paying in order to have these free services? You know, I don’t know if anyone’s ever downloaded the data that Google collects, but, you know, I did it the other day just out of curiosity, and there’s about 600,000 pages’ worth of information.

STENGEL: Wow.

DECKER: You know, that is not a small cost that I am paying Alphabet to, you know, use the internet. And, you know, I think, you know, in the wake of Cambridge Analytica and a lot of this, you know, sort of Facebook or platform privacy issues that are now becoming much more a part of the dinner table discussion, we’re only starting to understand how much of our lives we’re putting into the internet.

And as far as solutions go, you know, personally, you know, to keep my own sanity, I think the easiest thing is just to get offline. You know, there is, like, a huge, you know, value, whether it’s, you know, romanticized or just practical, to reading a newspaper, or to finding, you know, a sort of non-exploitable format to consume information. And I think that if we look at, you know, cultures in America and, you know, on every continent, we have these, like, cultural holy spaces—public libraries, bowling alleys, even the office watercooler—where somehow we all sort of come together and just participate in a group conversation. And I think that there’s an inherent trust that we feel in these spaces. And I think that, you know, ultimately, whether it’s the media industry, academia, you know, the research community, we need to find ways—and I think Joan probably would know how to do this better than anyone else I’ve ever met on this planet—to create these kinds of clean, safe spaces to trust what’s happening around you on the internet.

STENGEL: That is a tall order.

GREENHILL: Yeah.

STENGEL: I’d love to open it to questions now, because I want to make sure we’re getting at the things that everybody’s interested in.

You had your hand up first. Now, are you all bored with each other? Can you tell us who you are and where you’re from?

Q: OK, can you hear? Hi, I’m Katherine Barbieri from University of South Carolina.

I just wanted to say I’m a big fan of Kelly Greenhill’s book. She wrote a book with Peter Andreas—if you guys haven’t read it, it’s one of my favorites—called Sex, Drugs, and Body Counts, about the politics of numbers and crime statistics in war. And so I just keep thinking this is—you know, again, part of my thinking about disinformation is based on what you wrote, and what I’ve written on this topic, that’s gone on forever in politics. So how is it different today? I mean, I even tell students to read your book, because it shows—I do quantitative research on war and illegal operations—and you’ve talked about this for a long, long time. So how’s it different? Can you tell people about your book? (Laughter.)

GREENHILL: Can I tell people about my book? Maybe offline?

How’s it different today. So we published the book in 2010, so it seems like a long time ago, and in many ways it seems like a lifetime ago. But the only thing that’s sort of fundamentally different today is the explosion of social media. And I—when I say only, that’s implying that it’s not somehow revolutionarily different. And in many ways, I would say it is revolutionarily different in terms of scope and scale and speed at which information can be disseminated.

Now, to go back to one of Richard, Rich or—

STENGEL: Rick, Richard, either—

GREENHILL: —earlier point—

STENGEL: Disinformation, right there.

GREENHILL: —it is also theoretically possible to make sure the truth, you know, comes out—to debunk falsehoods unprecedentedly quickly, with unprecedented speed, scale, and scope, and so on and so forth. But, again, for all the reasons I’ve already articulated, it’s not that easy. The issues that we raised in Sex, Drugs, and Body Counts are timeless. They held long before we wrote the book, and they’re still true today. And I would say some of the things that we know about how new technologies come along and upend how information is transmitted, and how these new technologies can be politicized and also used to mobilize people, go back to the Gutenberg printing press, if not before. So along comes the Gutenberg printing press, and the Church is really nervous about it—and rightly so. They don’t know why; they don’t necessarily know about Martin Luther or the Ninety-five Theses. But it upends everything, and they were right to be afraid.

But then the printing press also enabled an untold number of social movements to succeed—the kind of social movements that we want to succeed, civil rights being just one example. So information is power. New technologies upend everything, but they don’t change how our brains work. They don’t change what presses people’s buttons. New technologies don’t fundamentally change worldviews or beliefs. They might change them on the margins, and they change how we interact with the world, but they don’t change fundamental human cognition—with the arguable exception that maybe we are all now suffering from attention deficit disorder because of interaction with our devices, but that’s a separate issue. And we learn, over time, how to make peace with these technologies. It was true for the printing press. It was true for radio. We understand how they can be used and manipulated to mobilize people for good and for ill, and we move on until the next technology. So some of these issues are timeless, and I won’t talk about—

STENGEL: I’d love you guys to weigh in on the same question. Like, what is different now. What—

DONOVAN: I think—let me take you guys back to 2015, a long time ago.

GREENHILL: (Laughs.) Ages.

STENGEL: A very long time ago.

DONOVAN: We’ve been through some stuff.

In 2015, I was at UCLA as a post-doc at the Institute for Society and Genetics, researching white nationalist use of DNA ancestry tests to prove racial purity. So, again, really lightweight research projects. (Laughter.) I thank my wife for coming today because she’s heard so many of these stories and has had to live through a lot of these moments in my research career, where I’m just a researcher, I’m interested in social movements, I’m in this position where I’m doing medical sociology, and I have to figure out where white nationalists are hanging out online. In 2015, all you had to type into Google to find them was “white nationalist” and it would take you directly to Stormfront. And there was a community of people ready to engage with me and hear me out and listen to my interests and, you know, hear my grievances.

And as I watched that community develop and mapped that community’s use of DNA ancestry testing and science, I also was in the moment when white nationalists were seeing the changes in the Republican Party and figuring out where they could hitch their cart, to which candidates, and why, and what their issues were going to be going forward, particularly around isolationism, protectionism, anti-immigration, and infrastructure. They were—you know, the money for jobs was really important to them.

And as I watched the political sphere play out, we also saw massive media attention to white nationalists. We saw the rise of the alt-right. And if not for media attention to the alt-right, we probably wouldn’t have gotten the size and scope of that movement coming into public space as we did. So it’s a combination of the technology, but also the attention that we pay as a society to these issues, that brings those kinds of movements into public space. And I would say that that also works for disinformation.

And so when I think about technology’s relationship to society, I’m also thinking about our relationships to each other, and our relationships to things that might be much more difficult to seek out had we not had this structure of the internet that is beholden to these key words, where there are no gatekeepers stopping you from engaging with, in my case, white supremacists. And so one of the things that I worry about in this space around psychology especially is how easy it is for someone to get into what we call a rabbit hole and go down with one of these conspiracy theories, to the very depths of it, and then convince themselves that one of their fundamental values is somehow flawed. And they’re—in doing that, they have many, many more tools and there’s so much more media to pull on to keep you hearing that same message over and over and over again. And so as you’re thinking about teaching and possibly even working on these issues around disinformation, think about those rabbit holes and how they might even change you as you go through and start to shape your opinions of these spaces and shape your opinions of these conspiracies.

DECKER: And I think there are two really good books that kind of lump a lot of these things together in some ways. One—and I’m shocked that I’m the first person to mention the title; I would have figured you’d bring it up—is Algorithms of Oppression, which is really, I mean, of tremendous value for understanding, you know, intentional and unintentional big-data warfare, which, you know, from the book is clearly nothing new, even if some of the tools and the tactics have adjusted slightly. And the other one is Jamie Bartlett’s The Dark Net, in which, you know, he interviews people from some of the, I would say, more socially threatening and socially nonthreatening very dark, cavernous spaces of the internet, to understand even how they’re using, you know, big data to convince people to sort of enter their space, you know, whether it’s through kind of, you know, psychological or, you know, even sort of technical means to get someone to see, you know, a photo or document. You know, it’s really important that we start to look at it from all angles, and particularly the more social examples that can help make sense of this. It’s why I think, you know, using, you know, the crime data and war data is so valuable, because that’s always kind of been a mainstay of media reporting around, you know, conflicts and wars, and the interaction with the public. But now anyone, you know, can technically go get a data science degree from somewhere and use all types of, you know, free open-source, you know, software to make it look pretty and prove a point. And, you know, you can shape the data however you want. So, you know, it’s just looking at it in a new norm, I think.

STENGEL: So I’m just going to add one thing, since Joan mentioned going all the way back to 2015. Things have happened so fast. Part of what I’m writing about in my book, and what I saw at the State Department, was that after the annexation of Crimea in 2014 and the invasion of eastern Ukraine in 2014-2015, the stuff the Russians were doing in the Russian periphery and in the Baltics was a trial run for what they did here in 2016. I mean, the same stuff, in Russian. And because we are so susceptible, even though their English is poor, people still reacted in the same way to that kind of disinformation. I mean, it’s really—it’s so not old. It’s still fresh and going on right now.

DONOVAN: Well, in that era I remember seeing people were sharing YouTube videos of citizens in Crimea saying, oh, there’s no Russians here, right? And it was just so obvious that they were lying. I didn’t even speak the language, but you could see the subtitles.

STENGEL: Right.

DONOVAN: And it was just this blatant sort of everything’s fine. You know, but it was on YouTube.

STENGEL: And you had the president of the Russian Federation, a guy named Vladimir Putin, saying directly to camera there are no Russian soldiers in Ukraine and Crimea—I mean, a direct bald-faced lie. Where did that come from?

But at any rate, next question. Did you want to weigh in on this?

DECKER: I kind of wanted to ask you a question, actually, if we could flip the entire script for a second. (Laughter.)

You know, in sort of the Crimea and Eastern Ukrainian space, you know, some of the researchers I’ve kind of spoken to over the years have said the biggest difference for them and us really is that disinformation is taking place over all mass media.

STENGEL: Right.

DECKER: So it’s not just this dark corner of 4chan /pol/ that is bubbling into, like, my grandmother’s Facebook feed. This is what I’m seeing on CNN, NBC, ABC, et cetera. So, I mean, how does that—

STENGEL: Well, in fact, one of the things—and that goes to the counter-ISIS stuff we did, because the—we have this illusion that everything’s happening on social media. I mean, one reason it was so hard to fight against ISIS messaging is because they had kiosks and billboards and flyers. Like, we can’t compete with flyers, right? We can compete with social media. And in the Russian periphery and in the Baltics and in Russia itself, 90 percent of people get 90 percent of their information from TV. And so, yes, it seeps up into TV because it eventually gets there, but it—social media is really just the tip of the iceberg.

GREENHILL: Which is why I would say, to go back to Katherine’s question, that I think it’s less different than people say—

STENGEL: Yes. Yeah.

GREENHILL: —because we think about the Nazi example. The reason it worked was it was full-spectrum, orchestrated, every source of information. There were billboards and there were flyers and radio. And it was—

STENGEL: And repetition, repetition, repetition.

GREENHILL: Yeah, repetition, repetition, repetition. Textbooks. You were bombarded. And so social media is a piece. The internet is a piece. But in these orchestrated, broad-spectrum information operations, there’s nothing like being up close and personal.

STENGEL: It’s full spectrum. And that’s—and that’s the Russians.

GREENHILL: Yes.

STENGEL: They are completely full spectrum. And that’s what the ISIS guys got, too.

OK. This gentleman right here.

Q: This is fascinating. Thank you.

Without trying to be Pollyannaish—we’ve mentioned some dangers, some real threats—what are some hopeful signs, at least beginnings? One, for example, that I think of is that the internet has destroyed newspapers as we know them, but, what, The New York Times now has over 2 million people subscribing to its website? Who could have expected that? I don’t know what the figures are for The Washington Post. The Economist also has people all over the world. So what are some other hopeful signs? And do you agree with me—

STENGEL: No, well, actually it was funny—beforehand we, amazingly, came up with some interesting hopeful ideas, which are actually different from that. I don’t know who wants to start with that, but—

GREENHILL: Actually, Joan, you didn’t get to answer the platforms question. So do you want to run with this and then we’ll jump in? (Laughter.)

DECKER: Yeah. Run, Joan, run.

GREENHILL: You—(inaudible).

DECKER: Go for it.

DONOVAN: Run? I want to run out the door—(laughter)—is where I want to run.

What’s been difficult about this is I think we lack basic science around a lot of this—basic research is lacking. And one of the things that the platform companies do is they look at their products, and they look at their products only. And so it’s been really an uphill battle convincing these platforms that they do interlock in different places—so, for instance, during breaking news events Google News will return popular things on Twitter. And so, if you can break Twitter, then you can break Google too, right? The interdependence of these technologies is something that we’ve had to really work on and shape to get them to understand that these are cross-platform problems, and that behavior that is not allowed on a place like YouTube may be fine on another platform. So that other platform is where they’ll get organized, and then YouTube will be where they deploy things.

But, for me, where I look for hope is the work that groups like ProPublica have been doing—I mean, they’re the only journalistic organization that’s really doing long-form investigative journalism to audit some of these algorithms and how these algorithms are shaping our lives. They did a really impactful piece using leaked information on this group of far-right extremists called Atomwaffen, who used Discord chats to coordinate. ProPublica got access to those Discord chats and has been doing the work of finding—I mean, these people were bragging about murdering someone in their Discord chat. So I look to journalists in a lot of ways as our first line of cover.

But then I’m really looking to university researchers to do the basic research that it’s going to take to understand: What are the pathways of this disinformation? Who are the targets of this disinformation? When I looked at what Russian trolls were up to and I saw that they were impersonating Black Lives Matter pages, it didn’t surprise me. But at the same time, when I went to people in the Black Lives Matter movement and said did you report these pages, they said yes. And so I know that the flagging practices on these platforms are broken. And so there’s another intervention there, trying to get the internal platform companies to fix their auditing systems.

And then also thinking in terms of impact—community impact around these fake news stories. What happened in August in Charlottesville is still going on every day for people there. And there are white supremacists on Twitter who will often invoke ideas or sub-tweet messages suggesting that they might show up at any time. And so community members are alarmed. It’s really upended the city government—the city council structure—there. And so I’m trying to understand what are the IRL, right—what are the in-real-life ways that we can start to combat some of this, but also remembering that we built these systems and we can dismantle them, right? And if they’re not serving us and they’re not serving the public good, then another platform is possible. I mean—

STENGEL: So we—one of the causes for optimism is the work that people here on the panel are doing. And, Ben, you were talking about some of the work that other people are doing that you admire that also gives you a little bit of a—of a good feeling about—

DECKER: Absolutely.

STENGEL: —that it’s possible to remedy this.

DECKER: For sure. I mean, I think step one in really creating or developing optimism is recognizing that this problem is way too big for one person, one team, one company, one organization, one university. We need to bring together, in an almost interdisciplinary fashion, as many different skillsets as possible to target every sort of micro and macro level of this issue.

And if journalism is the first line of defense, you know, we are, especially at the Shorenstein Center, building a collaborative model for reporting on disinformation around elections. We have essentially tested this model in France, the U.K., and Germany, and we are going to be applying it to the midterm elections. So could you imagine 70, 80, 90, you know, national and local, you know, media organizations across the political spectrum all sort of unifying and joining together to ensure that the stream of reporting stays clean and, I guess, as nontoxic as humanly possible? So we can have a collaborative journalism model that sort of goes beyond the profit margins.

And in universities we have, like, digitally literate students who understand problems that we might not even have understood in university, to address these things—people who are armed with these digital-literacy skills and who are then going into jobs that maybe didn’t necessarily require those skills five years ago, but where now it has to be a requirement. So, you know, as, you know, coming generations become more digitally native, I think that, like, you know, the future really is in these future generations.

STENGEL: To cast a little shadow over this: you mentioned that these platforms—these are American companies. And when I was at the State Department and traveling around the world talking about this, I can’t tell you how many times people said to me, well, these are your companies, fix it. Or, these are your companies; the propaganda and disinformation is coming from you. Vladimir Putin, when they launched Russia Today, said it was an antidote to the American-English hegemony over the world media system. That’s how people saw it. Which is another reason why so many countries are passing what are known as data-localization laws, where the actual information has to flow through servers on their physical soil. And because they don’t have a First Amendment, they can then use that information against their own citizens, which is the little scary side of all this. I’m sorry that—

DECKER: Well, there’s one more sign for optimism, then. So thank you for bringing that up, Richard.

STENGEL: I bet. (Laughter.)

DECKER: Because while, you know, RT is, you know, invading every, I would say, weekly finance media space to prop up a new bureau, you know, in America and across the world, the donor community is very eager to address this problem and very eager to work with communities of researchers, academics, journalists, et cetera, to target this problem. So, you know, I think that there is an appetite to solve this from the top down. So if you can find, you know, three friends out of this day and a couple of researchers down the road, you should apply for grant money to bring your best effort and approach—your micro-expertise—to this larger problem.

STENGEL: Great. The young lady over here.

Q: Hi. Nienke Grossman, University of Baltimore Law School.

So been interested to hear about the possible solutions to these problems, and I’ve heard you talk about, for example, the news media getting together and working on it and other sort of non-state responses to fixing this. And I was wondering if you could talk a little bit about the role of our government in fixing this problem and whether—what role does legislation play here, you know, in terms of regulating these platforms and making sure that our elections are not tainted? Thank you.

DONOVAN: So I did a session with the Congressional Progressive Caucus—that’s a tongue-twister for me—on Wednesday, right, and their staffers are very concerned with the use of algorithms that encode bias—racial bias, bias in housing, bias in credit. They’re interested in understanding how Google and Facebook and Twitter shape our information landscape and how we’re served goods and services. But I can tell you from the looks on their faces—a room full of 50 staffers—that it’s going to be a lo-ong day, because they don’t really know where to start. The one thing that they fear is that they don’t have enough technological knowledge to understand the problem in order to do regulation that doesn’t overreach. There’s a lot of work that we need to do in the mind of the public to unwind the rhetoric of Silicon Valley that casts these as technologies of liberation. So we’ve spent a lot of the last 10 years talking about the internet as public space. It’s more like a McDonald’s at 2 a.m., right? There are rules. (Laughter.) There’s a lot of stuff happening, but you could get kicked out, right? So it’s a different mode of operating, right, even if you’re trying to convince someone that on its best day the internet is a hotel lobby. But that means understanding what those rules should be and what measures of free speech should be tolerated, as well as understanding censorship.

And then, from the platform side, because the U.S. doesn’t have laws about hate speech, you may go to Germany and get a fundamentally different set of returns when you type in certain keywords, because they have laws against it that must be abided by. And for researchers, some of our hope was that the GDPR in the EU would, you know—(laughs)—sort of protect some of us Americans, because Facebook had a bunch of our data that was being stored in Ireland. That data is now being ported and stored somewhere else, where they’ll be less accountable for protections.

So there’s a lot to learn from—I also briefed members of the U.K. House of Commons around the fake news issues as they were prepping to deal with Cambridge Analytica. And even in that space there’s a lot of work that needs to be done to educate staffers and decision-makers alike on what kind of regulation we need. Right now the regulation is really stuck on Section 230, where platforms are treated like telephones, in a way: they’re not responsible for the deals that are made over those wires. And in not being responsible for the content, platforms are investing in content moderation out of the goodness of their own brand reputation.

But there needs to be enforcement, transparency, and auditing, with a real eye toward this notion of fairness around bias—because we haven’t really talked about the other problems with the internet beyond disinformation. Ben and I could be sitting right next to each other and get served different advertising when we’re looking for housing. And Ben might be served better deals than me, and it might have to do with how they’ve encoded each of us in terms of class, related to our data profiles.

And so there’s a lot of stuff that’s in the air about legislation, but targeting that legislation is going to be really difficult because of the lack of technological knowledge and then the vastness of the problem.

STENGEL: Do you all have thoughts about that?

GREENHILL: I would just say there’s also politics and incompatible views about what ought to be done. So everything she said, add politics. (Laughter.)

DECKER: Yeah. But even setting aside my personal views on politics, or this administration versus the last one—is government agile enough to respond to the speed at which the internet evolves? You know, even the best White House with the best budget and the best teams—is it too big to adapt? So, I mean, I think about something like autonomous vehicles and, you know, the tens of millions of algorithmic decisions that will essentially save a pregnant woman and her child and instead kill two, you know, senior citizens who are minorities. You know, that computer, that mechanism, is inherently making a decision to, you know, put a higher value on, you know, this pregnant woman in the car, because of age, race, et cetera—you know, a million other things. You know, we’re talking about algorithms of oppression.

So at the end of the day, I don’t know if the business value of making money and profiting is ever going to really clash, you know, and have a proper counterbalance in government.

STENGEL: And I have to say, having just come out of government, I’m very chastened by how little knowledge and information there is about technology in government. People in Washington are 10, 15, 20 years behind.

I mean, if you watched the hearing with Mark Zuckerberg the other day—it’s like, oh, my god. It reminded me of when I did the counter-ISIS work. Congress was very interested in that at the time, and I remember a Democratic congresswoman who said to me, Mr. Stengel, can’t we just close off that piece of the Internet where they are?

That is a—that is a typical question.

DONOVAN: Yeah.

STENGEL: That’s not a rare Luddite. I mean, that was the normal question. So I’m just not optimistic about government being able to do it.

One of the interesting things I found—to Ben’s point about hate speech—is that, in traveling around the world, the First Amendment is one of the biggest anomalies about America. Nobody, but nobody around the world understands it. When I said, well, we have to defend the right of that minister in Florida to burn a Quran, there was not one person in the Islamic world who understood that.

Having once been almost a First Amendment absolutist, I’ve really moved my position on it, because I just think, for practical reasons and for society, we have to rethink some of those things. Government might be able to do something like that, but I wouldn’t even be confident of that. The thing is, though, all of the platforms have their own constitutions, which are their terms-of-service agreements, and that’s where some of these problems can be reckoned with, I think.

GREENHILL: Look, the question is what will incentivize them to reckon with it, and it seems like—as imperfect as it is, government is one of the few recourses because, unless it is in their—

STENGEL: Yes.

GREENHILL: —so if the public decides essentially we’re walking, then they might be incentivized, but absent that, as slow and problematic as it is—

STENGEL: Yeah, but it is—

GREENHILL: —regulation and laws are our only recourse.

STENGEL: And it’s super hard. I remember, in this ISIL messaging—most of their messaging was in Arabic, and most of it was positive about the caliphate. So here’s the most bloodthirsty terrorist organization in history, and they have something on Twitter that’s a basket of apples, and it says in Arabic, “the Caliphate is bountiful.” Can you ban that? Can you ban that off your platform? Isn’t that protected speech? I mean, it’s just not an easy problem.

Someone—who has had their hand up—you’ve had your hand up a long time, right? OK.

Q: I didn’t mean to—(off mic).

STENGEL: OK.

Q: Thank you.

Q: Is this on? Yeah. My name is Rich Kiefer. I work at a community college, Waubonsee Community College.

And you were talking earlier in your presentation about the idea of hijacking images—media hijacking. A lot of my younger students are really aware of that phenomenon. They understand how easy it is, and the spillover effect is that they then question legitimate news as, well, obviously that’s manipulated, too.

And I’m wondering, in the classroom, do you have any suggestions how we can maybe counter that? And thank you for your comments this afternoon.

DONOVAN: Yes, so we have a primer. We do reports—you can go to datasociety.net and check them out—that are written to help educators and to be used as course content, with supplementary syllabi.

One of the issues around media literacy that we talk about a lot in the office is whether media literacy has backfired. Have we made students so critical that they begin from the presupposition that maybe it’s false? Rather than assuming everything is true, they start from: everything is false; teach me otherwise. And what’s really difficult about unlearning a conspiracy theory is that there’s just far more content that supports it, because nobody makes content saying the frogs are not gay after they’ve been exposed to fluoride—you don’t create counter-messaging for crazy! (Laughter.)

So you’re in this space where students will come to you and say, oh, that’s just, you know, liberal BS. I’ve heard about—I was on YouTube. There are 45 videos on YouTube about the frogs being gay, and you’re just like, do I take them to the swamp? Do we go get frogs? (Laughter.) You know? Like, do we—like, what are we doing here? (Laughter.) You know? And—you know, do I go to the biology teacher? Do I bring—you know, how do we remedy this?

And one of the things is helping them understand that media is meant to persuade—that media is a message and a medium—and that they can decode those messages. Ultimately it’s not only up to them to decide what is true; it’s also up to them to vet the legitimacy of sources and to understand that the guy trying to sell you fluoride-free toothpaste, who is also hawking the conspiracy about fluoride making frogs gay—those might be related. (Laughter.) Right? So you can work with them on these things, and you can show them how a good critique can work for them.

But ultimately those kinds of questions are going to keep coming up. And media literacy at this time is incomplete: we don’t have really great mechanisms for evaluating how students are learning from the Internet and how they’re getting their news from YouTube, and we don’t have the tools in place yet to teach them how to create their own media, how to create their own messaging, how to understand and decode those processes. So there’s more to do, but there are also some interesting new methods for teaching that need to be circulated.

STENGEL: And the bad actors use journalistic objectivity against us.

DONOVAN: Uh-huh.

STENGEL: And the Russians, in particular, are smart about this, because what they do is have a guy saying the frogs are gay, so then CNN has to have a scientist on who says the frogs are not gay. And that’s how they do it all the time.

So Peter Pomerantsev—I don’t know if you know him, but he’s a brilliant writer on Russian disinformation—says that when a settled issue is framed as a debate, we’ve lost. And that’s what they do all the time, and that’s what CNN did—I don’t work for CNN, so I can say that. I remember, at the State Department, passing a television, and the chyron would say, candidates clash over whether Earth revolves around the Sun. Really? (Laughter.) The bad guys have won that debate.

GREENHILL: Yeah. They’ve also won in another way because we’re distracting attention from other issues that we might be covering—

STENGEL: Yes.

GREENHILL: —that are probably more important.

DECKER: But, you know, the purveyors of falsehoods or disinformation—you know, like Joan said—wear a number of hats.

I would love to hear about discussions happening in classrooms where people are looking at Infowars as a dietary supplement company and not a media organization, because the financial model is selling brain supplements and anti-zombie creams and—(laughter)—other sort of doomsday-prep ingredients, so to speak, as a way to fuel the media machine.

So we should question everything to a certain degree, and there’s a reason that I now probably read a bit more of The Wall Street Journal and a bit less of, say, The Washington Post or The New York Times: I would just like a little less opinion, even if it’s my own. And that’s good, that’s healthy, that’s debate, that’s democracy. That is, I think, the America that we all aspire to and want to exist in.

It’s just that journalism is based on evidentiary reporting, so once we lose the evidence, that’s gone. If something on the right—say, a Fox News-led investigation—is based on evidence and fact, great; it’s contributing to healthy debate and democracy. So let’s just look at the context and peel a couple more layers back, and see if we get more truth. And I think that younger people, who are so digitally literate, do kind of understand that: oh, if I peel back the onion a bit, it’s actually a rotten apple inside. It’s not even an onion at all.

STENGEL: How much more time do we have?

STAFF: Five minutes.

STENGEL: OK, great.

DONOVAN: We’ve got to get the question from the woman here earlier—

GREENHILL: Yeah, she has been really, really patient.

DECKER: Yeah, this woman’s got to ask—

STENGEL: Yes.

DONOVAN: In the blue glasses.

Q: Thank you. So I’m going to go against the grain here and challenge us to think a little bit about whose fake news we mean, because we’re paying so much attention to the fake news of the moment—we’re all talking basically about the same strand of fake news. But there are dominant narratives that have been part of the mainstream—I’m not taking it to the Nazi extreme that you took—master narratives that people carried around, unchallenged, for so long. And we didn’t think about them as fake news, and we still don’t think about them as fake news, even though they are.

And then, to take it one step further, think about the media cartels, which control and dominate the way the discourse is shaped. Is that not fake news in certain ways? And in that sense, can we not also highlight the potential of this multiplicity to actually break into some of the consensus about what is, and is not, fake news?

So, for example, I’ll talk about the events in Gaza three weeks ago. Every mainstream American and world outlet wrote a headline that said “Palestinians killed” or “die in clashes with Israel.” There were snipers, hundreds of feet away, shooting at unarmed demonstrators, but every major news outlet called it clashes. No one calls that fake news.

So there’s a question here about how we are defining and what merits our attention as fake news that we need—

STENGEL: So give me examples of the master narratives that existed—or still exist—that maybe need to be questioned.

Q: The American master narrative about—what is it—manifest destiny. The American narrative about entitlement to American land despite Native Americans’ presence and the massacres against them. The denial of the continuing aftermath of slavery, which persists in American mainstream media and mainstream consciousness. There is a whole list of things. Basically, every nationalist narrative is a narrative that we need to be questioning, and every nationalist narrative is a part of the mainstream that we accept. And this is something that I think we need to be more conscious about.

So, for me, if my students ask questions about everything they hear, that’s a good thing. That’s a good outcome.

DONOVAN: Yeah, so in this space, since 2010 I’ve been researching activist communication networks online, and we had this heyday around 2011 where we were really celebrating the live streamer. We were really celebrating the citizen journalists. We were celebrating the participatory capacity of people to make and do media online.

And I think that we hit sort of the apex of that around the Black Lives Matter movement being able to challenge and show, explicitly, police brutality in many different forms. We even saw news outlets that would never have taken up police shootings as a data-journalism project—such as The Guardian or The Washington Post—making that part of the journalism they do in house. And those kinds of data-journalism projects were considered experimental—experimental because they came out of movements, out of this online, participatory way of doing the work: doing counter-data action, doing stat activism.

And, like I was saying, the political operatives are the last to catch up. I do mark a period around 2015 where things flipped. We were post-Ferguson then, and we had very robust citizen-journalism networks that had broken some of these master narratives. But they could not overtake those narratives, because they didn’t have the money and the infrastructure to produce content at the level of the groups that then saw the power of citizen journalism to spread mainly their white-nationalist messaging.

So we saw someone like Milo Yiannopoulos take the stage—with not a huge infusion of cash—and be able to run his own media narrative and his own media network. One of the things we study at Data & Society is these alternative media networks and how, with a small cash infusion, they are able to leverage broadcast in a way that rivals the big media companies. But we are not seeing investments in movement-style journalism the way we are seeing investments in disinformation—investments that are part and parcel of getting more mainstream outlets to cover these stories in ways that support, or at least don’t challenge, these master narratives.

And I’ll tell you that I was in a lot of these right-wing groups looking at their reactions to what was going on in Israel as well as looking at what they were reacting to around the Syria strikes, and they, too, don’t feel represented. They, too, don’t feel that their anti-war messaging has reached peak normie, as they like to call it.

And so even in the extremist spaces, they don’t feel represented by the media, which speaks to your point about some of these being cartels: journalists also have a kind of window of things that they will acceptably report on, and if you try to break those frames, your editor might say, oh, this is best in op-ed, right? It’s no longer something that’s going to be on the front page.

So I see what you are saying here, and I also think there is room to create alternative media networks that have cash infusions, that do stylistic content and audience development, and that bring out these other issues and stories that must be covered. But I also see that mainstream media is fragmenting, and I don’t think it’s going to be healthy to be the owner of The New York Times in 10 years, for sure.

STENGEL: Do you—do you all want to weigh in on that?

DECKER: I mean, the only thing I would add to what Joan said is that, on the whole, one of the biggest consequences of this global information disorder, let’s just call it, is that everyone feels that way—everybody is competing against these kinds of master narratives. And that’s probably, to a certain degree, due to this democratized tech arms race we’ve been talking about. If everyone is competing, everyone is going to continue to feel that way and further dig the moats and build up the walls around their own community echo chambers.

STENGEL: So you’ll be the last question, but I just want to weigh in on that for one second. There’s another word for master narratives. It’s called history.

Basically every country creates its own narrative story. You know, my old job at the State Department was what people used to joke was the chief propagandist job. We haven’t talked about propaganda. I’m not against propaganda. Every country does it, and they have to do it to their own population, and I don’t necessarily think it’s that awful. And this idea of a news cartel—I was editor of Time in 2012, during that election, and you were competing against cartels and everybody. I remember being on a panel with the then-editor of The New York Times, who said, it’s really hard to break through these days. This is the editor of The New York Times saying it’s hard to break through. I wanted to jump off the platform—like, what’s it like for everybody else? So, I mean, there are cartels, but the cartels don’t have hegemony like they used to.

The gentleman right there. Last question.

Q: I don’t think you all addressed her issue, which I would put in terms of understanding what happens in the world, because what is happening in America now is what the United States did to the Global South and the Third World, which we lived with for many, many years: a master narrative that was, and still is, propaganda.

STENGEL: You know what? I hate last questions—(laughter)—don’t you? I never—I usually just want to end something before the last question.

But at any rate, I want to thank this fantastic panel here today. (Applause.)

And I do want to say—talk about optimism—the optimism is all of you out there figuring out how to teach your students about this, using some of the techniques and sources we’ve talked about here today. I hope you are successful.

Thank you very much. (Applause.)

(END)
