Joan Donovan, director of the technology and social change research project at the Harvard Kennedy School’s Shorenstein Center; Amy S. Mitchell, director of journalism research at the Pew Research Center; and Claire Wardle, executive director of First Draft; discuss trends in disinformation and practical ways journalists can build trust with news consumers. Susan McGregor, assistant director of the Tow Center for Digital Journalism and assistant professor at Columbia Journalism School, moderates.
MCGREGOR: Well, hello, everyone. It’s wonderful to see you all here today. My name is Susan McGregor. I’m going to be moderating the panel this afternoon. And I’m very excited to be joined by three exceptional scholars, researchers, and explorers of disinformation and misinformation. And we’re also looking forward to having your questions towards the end of the panel. So please feel free to make notes of things that interest you as we have our conversation.
So starting to my far right, Joan Donovan.
DONOVAN: Don’t say that.
MCGREGOR: Huh? (Laughs.)
DONOVAN: Far right? Come on.
MCGREGOR: Oh, yes. (Laughter.) Right off the bat.
DONOVAN: Right off the bat. You’re going to label me? Come on. (Laughter.)
MCGREGOR: Right off the bat we started that way. Amy Mitchell and Claire Wardle.
And so the topic of our conversation today is misinformation, disinformation, and of course local journalism. And so I know that I’m thrilled to be having this conversation with all of you, because it gives me a chance to also get some questions answered.
I think the first thing I’d like to start out with, and just kind of throw to all of you, is I think for most of us we have a sense that misinformation and disinformation are very much things that have been facilitated and exacerbated by the fact that so many people now get their news from social media. Given that these tech platforms are, you know, global in nature, they’re huge, they’re very technically driven, I’d love to just hear what you all think the role is of local media and local journalists when it comes to both handling and countering these things, you know, as smaller organizations working within constrained geographies, very often. What’s really possible on the local level to help challenge these massive campaigns that are happening on these platforms? So anyone want to take the fall first?
MITCHELL: I can offer some thoughts to kick it off, I suppose. You know, if we think about the audiences that you’re trying to reach, I mean, it is true that we’ve seen such a growth in the digital landscape overall, and the use of social media, where you now have about seven in ten Americans sort of getting some of their news through social media, which functions in a very different way, where a lot of that is coming at you. You’re not even looking for it. You may be aware or not aware of the sources that are actually sending you that information.
So when you think about that—the fact that the vast majority of Americans are now in that space, online in general and then on social media. And we have this political divisiveness in our country which we are seeing at local levels as well, and there's a lot of money to be made. So there's a lot—that brings a lot of different types of content providers into that space. And the majority of Americans have also said to us in our survey work that they are overwhelmed by the amount of news and information that's out there today.
So one of the things we did, that it's worth, I think, local journalists thinking about, is that we looked at people's ability to differentiate between factual statements in the news and opinions. And we were interested to see what characteristics have some impact or don't have some impact on that. And what we saw was pretty striking, which was that those who had high digital savviness compared with low digital savviness did far better in their ability to distinguish between those two. The same was true for higher versus lower political awareness of issues and events occurring in the country, and for higher versus lower trust in the news media.
And so if we think about relationship building with audience, part of that is having them understand who you are, what you’re putting out, and the content that they can expect from you, and what your content looks like—which we know becomes more and more important as there are these counterfeit, you know, misrepresented examples of certain types of outlets.
MCGREGOR: Yeah. Joan, I was going to say, I know you’ve done a lot of work in this area. And I was recently looking at one of your pieces from last year, the weaponizing the digital influence machine. And it seems to me that obviously advertising—you know, we know that advertising is part of what’s being used to spread misinformation and disinformation. I’m curious if you could speak to what’s happening there a little bit, and also, again, sort of what’s the role for local media organizations? Is there an opportunity to recoup some of the money that’s being made by participating in this? Or is it something that, you know, media organizations—especially smaller ones—should just be staying away from? Is there a balance there to be struck?
DONOVAN: Yeah. I think that one of the things that—oh, by the way, I'm Joan. I'm at the Shorenstein Center at Harvard right now. But prior to that, I did a series of reports as the media manipulation lead at Data & Society. So that's where that report is currently. And all of the research at Data & Society is free and accessible on the website. Which I think is really important in this day and age, because what I've found is anything that you can access on Google is information that's failed to be monetized in any other way than through advertising. And so when you're looking up events, when you're looking up breaking news through Google, there's all of these strange opportunities for search engine optimization and advertisers, and, you know, misinformation to leverage the system in ways that maybe you and your newsroom are not thinking about when you're just trying to put together a very factual headline, you know, and get things in order.
But there are other people in other ways in which they’re using the algorithmic structure in order to ensure that their content shows up first and fast. And so some of the ways that they do this is by buying keywords on Google. Or, they might change the name of their Facebook page to take advantage of a breaking-news event. One thing that happened right after Parkland was someone who had a substantial-sized page on Facebook changed their name to: Official March for Our Lives, right? And they were easily able to capture a whole bunch of people who were looking for information about that emerging movement. And so as you’re reporting and thinking about, well, what are the local issues, you actually have to now double down on people in the streets, fact checking. If it’s a political race, you’ve got to call the communications manager and double check that that’s a Facebook page or an Instagram account.
We have a report out called Data Craft as well, which looks at Rep. Babin’s account on Instagram, and how there are a few of them. And they’re not blue check-marked or verified, but it’s really difficult to tell the imposter from the real, because the imposters tend to lay in wait, and they tend to serve very innocuous content. The stuff you—you know, entertainment news, and what have you. But then as it gets closer to the race, or it gets closer to decision-making, or a certain hot-button wedge issue like immigration comes up, that’s when they spring into action and change their content, or maybe even change the name of their page.
So as local reporters are starting to think about, well, how am I going to show my audience what I have, you also have to think about: Well, how is my story talking to these algorithms? What keywords am I using? Is it advantageous to purchase a boost post, to ensure that it does get seen by the people who’ve already liked your page, or spreads out on Twitter? And that can be a really—both an expensive proposition, but also one that feels kind of dirty, in a way. You know, it’s like you’re paying to have the news in front of the audience. But at the same time, this distribution model has changed so much from, you know, newsstands and even email subscriptions to this wider networked world that you actually have to think about serving the news and reaching people differently.
And so one of the things that our reports are trying to show is that advertising algorithms are both being leveraged for bad information—we have a series of fake accounts or counterfeit material sort of lying in wait that then will use algorithms and advertising in order to break through to new audiences. And, you know, local journalists are still working within, you know, an ethical framework or, you know, a set—a code of conduct where those things feel a little bit like a digital dirty trick. And so you have to think about, you know, weighing those things against each other, especially if you are reporting on a very important wedge issue in your specific locality.
WARDLE: Can I just jump in to say, A, it’s an absolute pleasure to be here in front of so many local journalists. This topic is focused on a national level. And everybody worries about the Washington Post, the New York Times, and CNN, and how they’re covering disinformation. And their focus is on Russia, and more Russia. And that is part of the story. But to Joan’s point, it’s very easy to target local audiences, partly from a financial basis. If you run a page that says that Tom Hanks is moving to Greensboro, North Carolina, that works in Greensboro, North Carolina and it works in many, many, many other cities. You have the same story. You just change the location.
So local is really a vulnerable spot. And local newsrooms are the newsrooms that have been stripped of resources. And you’re being asked to do incredible journalism. Oh, and also, can you help your audiences navigate the information ecosystem. Oh, right, but we’re not going to give you any more money to do that. So, A, I’m just really glad that we’re having this conversation. And to Joan’s point, these—understanding what these tactics are, and they really are tactics, whether they’re for financial gain or they’re for political gain. There are disinformation agents that are targeting local areas and using the machinery of Facebook and other digital devices to target audiences. And you really do have an incredibly strong role to play, because you still have a lot of trust with your audiences. And that’s so key in an increasingly divided country.
We did a project in France around the French election in spring 2017, and we had a number of Parisian news outlets. But a lot of the French people said: Oh, the Parisian elite. Same as, like, inside the Beltway. But we actually worked with a lot of local newsrooms in France, which meant that we had people from both the left and the right, because they didn't care where they were from; they just really cared about Strasbourg coverage, or Calais coverage. So that—for me, having these conversations at the local level is so vital.
MITCHELL: Yeah. And if you think about it, I mean, if there’s the thought that, well, maybe it’s kind of a dirty thing to do, to be paying to have my content up there, but that’s what the public wants, right? What they’re looking for is really good information on this topic, if that’s what they’re searching for. So you’re actually really doing them a better service by putting forth your content in front, and by helping them understand what it is that they may come across in the local space because, as Claire is saying, so much of what they see in the media is the national story, and the Russia story, and the big—you know, the big national and international political story, when there’s so much else that’s happening.
You ask them about things like bots, what they’ve heard of is Russia. They think they’re only used for bad purposes. You know, when there’s—it’s like SEO, you know, in many ways. You can use bots to do all kinds of other sorts of pushing forward of content that’s not meant in a misleading or false way.
MCGREGOR: Well, I think—and I apologize, I don’t have a reference for this. But, I mean, one thing that we do know I think—and this may actually be a Pew piece—is that trust in local news is actually still quite high in America. I mean, at sort of the national level there’s a lot of concern about the national media organizations. But actually, communities really connect with their local news organizations.
And so, I think—Joan, to one of the things that you were saying—I'm not sure if I'm using the term correctly—it sounds like part of this is what we would think of as astroturfing, sort of this idea that there are, even at the local level and even particularly at the local level, there actually are groups that are kind of seeding content, waiting for an opportune moment to then activate people around it. So they may be targeting a local area and kind of waiting for something to happen there, and then it looks like they've had this Facebook page for a long time, or they've had this Twitter page for a long time. And then it can be very difficult for journalists to kind of tell the difference between something that's genuinely grassroots, that's actually happening locally, and something that's kind of just been set up in advance waiting for an opportune moment.
So, I mean, it seems like one of the things obviously that local journalists can do in that case is kind of get out on the ground. Are there other strategies? I mean, Claire, obviously you’re a leading expert in kind of verification and fact checking. Are there particular things that journalists should be looking for when examining social media groups? Things that they’re seeing online—as many of us are going there for information, even as reporters—processes that we should be thinking about as we try to verify these things and make sure they’re genuine actors?
WARDLE: Well, I mean, to be fair to Facebook, they do now allow you to see the history of the page. And it is pretty amazing when you see the history of the page. Like, oh, a year ago this was selling health supplements and now it's a local Facebook page. So it is worth checking. I can't remember the actual process—I think if you Google it, it's on the right-hand side. I can't remember exactly where you'll find it, but it's the history of the page.
DONOVAN: I think to your point about lying in wait, one of the strange data points from last year that sticks—it really sticks in my mind is seasoned accounts. So there’s a black market for all of, you know, likes, views, retweets, shares. So when someone says a botnet, usually what they’re referring to is someone bought a bunch of retweets, or they bought a bunch of followers. And it’s a network of fake accounts that someone—there was an excellent New York Times piece, Mark Hansen, on the follower factory, where, you know, you can buy these things. But seasoned accounts, the price of them went up last year. So these are accounts that people have abandoned and, you know, people have either cracked the passwords or reregistered them and have been able to say: OK, you know, well, I want to sell you ten thousand accounts that are three years old. And that helps you establish more legitimacy in this space, because Twitter has been cracking down especially on new accounts that are registered in batches and then are all linked together.
And so as the platform companies become more savvy, so does this black market for gaming algorithmic systems. And when people view those metrics, there's a natural association between legitimacy and five hundred thousand views, right? And so, you know, we have to actually hedge against that in thinking about, well, what does that mean for fact checking and journalism? Are those views real? Are they true? In our report on Data Craft, we have a set of—sort of like a map of how you go through, and verify, and look at all these different ways in which an account or a post might have been gamed. And there are things that trip people up a lot. Sometimes it's the location of the post, or they've retroactively changed the date on a blog. Things like this you can look for using the Internet Archive. And you can go back and try to get a sense of things that have been changed.
But this black market for these things—platform companies know that they're being gamed. And they have teams of people in place. These teams are not always authorized to do the right thing, because growth is a real incentive for many of the shareholders in platform companies. And so, you know, Claire and I can talk all day about what the problems are, but if they're not willing to fix them, or they're trying to apply a strictly technological fix, there'll be another market for getting around whatever they fix within a day or two. The hackers will have figured out exactly how to get around it.
WARDLE: I think also, to that point, there's a huge focus on fact checking now. And that's a good thing. But I increasingly use the term source checking. Because if we focus so much on what the content is, and particularly—we understand about confirmation bias. It's very difficult to change people's minds. Fact checking's very important for ensuring that we have a public record that's correct. But we would save a lot of time, sometimes, on fact checking if we did source checking first, and realized it's got absolutely no credibility because it was tweeted by ten accounts with exactly the same thing, and they've all got a combined following of ten.
I mean, there's a whole host of things that we could be better at. And I think this goes to journalism training. I think this is not just about local newsrooms. This is all newsrooms. There are very few newsrooms in America where journalists have the skills to do this kind of digital forensic analysis of networks and accounts. And it's just so necessary now. And we haven't really put that training into J schools. And, again, resources are being stripped, newsroom training budgets are being stripped, right at a time when we actually really need to provide these kinds of skills.
This isn't, oh, you know, the smart intern in the corner. These are skills that every journalist needs to have, because if one of us gets it wrong, then, A, it moves through the system, but it also means that journalism becomes less credible. It allows people to point to it and say, as we know, F-asterisk-asterisk-asterisk news. (Laughter.) So it's not in any of our interests to get it wrong. But mistakes are being made because we're under increasing pressure and the training hasn't kept up to speed.
MCGREGOR: Well, and I think—speaking of training—this is something that I think has been a lot in the discussion as well in terms of how to counter the effects of misinformation and disinformation is the idea of sort of media literacy, and media literacy for the public in particular. So obviously, yes, as journalists we need to get up to speed and find ways to—and I think this is a great example of the kind of event where, you know, being able to pick up new tips and skills about how to improve our practices. To what extent do you all think that this also needs to become a subject for newsrooms to then educate their own readers?
I mean, Amy, you spoke to the idea of making clear what is the legitimate content of the news organization so that news consumers can tell the difference between that and, you know, content that's trying to look like news. But, you know, from one of the reports that you mentioned, you know, things like digital savvy, news interest, political awareness, trust in media—these are all things that make news consumers better at distinguishing between fact and opinion. I mean, does this—not to add another thing to the plate—but, like, does this need to be something that local news organizations, who have these close relationships with their audiences, need to be kind of putting front and center, almost like another beat, to say: Hey, you know, this stuff is out there. You need to be informed. And here are the ways that you can manage your own news consumption?
MITCHELL: And it works to—you know, doing that also works to just foster the relationships. Which we see in all of our data: having that close relationship with people in your community, having strong community engagement among members of the public themselves, all leads to better interest in the news, to being loyal to particular sources. So all of that, creating a conversation and a dialogue, means the public is going to continue to look to you for that kind of knowledge and for that information, at the same time that they're just getting to know you better and your content better.
So, you know, it certainly isn't something that can, you know, I would think, have a negative impact. And it's going to—if you think about the digital space and you think about online in general, not even going to social media, in the older forms there was a very defined news space. I mean, the concern that was discussed at panels way back then was the advertorial section of the printed newspaper, because the border looked different, right, and that was their way of doing it. That was the huge flag that everybody was talking about in the room. There's nothing that fully defines what news is in this space. And that's even why in our surveys we start to define what we're intending them to think about when we say "news" in our surveys.
Because there's now such a breadth of what the heck can fall into that, it's a real challenge for the public to get there, to even know: What is it supposed to look like? How am I supposed to be able to see? Especially if it's in a social media feed where there's not a lot of distinction from one piece of content to the next.
WARDLE: But I think it’s a real opportunity—and everybody’s talking now about we need more transparency because it’s going to help with trust. But sometimes writing about just traditional journalism, nobody really wants to know how the sausage is made. Like, I was on hold for ages with city hall, and then somebody didn’t call me back, and then I was really struggling to get my second source. I mean, that’s not exciting to write about and it’s—we’ve never done it. But I think when you’re trying to explain the processes for doing digital verification—the reason I love verification is because it makes you feel like Sherlock Holmes. But it’s a really great way for engaging with audiences, because all the tools that you use to verify accounts, they’re available to you but they’re also available to your audience.
So it’s strange now in the way that journalists don’t have this special hotline to city hall. It’s like, you can also do this work and learn how to assess these things. And certainly we’ve done projects in France, and Brazil, and currently in Nigeria around election monitoring. And in the reports, we explain how we did the work. And when we evaluate it, it’s astonishing how many audience members say: I loved it because I learned, as I read, how to do this myself. And I don’t normally get the opportunity. So it feels very odd, again, to Joan’s point—to do journalism in the age of disinformation is requiring us to do things that have not necessarily been the norm. But I think time requires it.
MCGREGOR: And I think that's a great point, because, as many of us remember, on the national level in the 2016 election, you know, David Fahrenthold's work. You know, one thing that was really remarkable about it, and kind of unusual, was showing this really kind of piecemeal process and saying, you know: This is what happened today, and this is the bit that I got today. And I think there's—happily, right, that's potentially an opportunity because it provides that transparency, it enhances the skills of the people that are following that. But it also lets us gain visibility and kind of audience for the work that we're doing anyway, right? If we're putting in dozens of hours to get a story, being able to put out little bits of information and say: This is what's happening today—even if the full "story," quote/unquote, is coming a week or two later. You know, everyone has already too much to do. It lets us actually make something of those pieces as they're coming together, rather than, you know, kind of nothing, nothing, nothing, here's the headline.
WARDLE: I mean, there are amazing local journalists, particularly in TV news, who are out there constantly sharing what they’re doing with the audience. I think it’s harder if you’re a desk-based journalist to think that you’ve got anything to say.
DONOVAN: Yeah. And I think one of the things that really changed with the way that news moved online—especially around that BuzzFeed model, where the tweet is a source now, but also Twitter is a beat, right? Like, the internet became a beat at some point in this very strange way. And now it's almost like the world has inverted, where you have all these journalists doing desk work where what happens on the internet is the thing they're reporting on. And they're not in the streets, looking around, making those connections. And, you know, so BuzzFeed has really pioneered this man-in-the-tweets style of journalism.
And what’s advantageous about having a journalist locally is they can look left, they can look right, they can look outside the frame, they can question people around and get that color, and get a better story. The difficulty, I think, with the desk journalism as well is you are in competition with a bunch of weirdos. (Laughter.) I’m just going to be frank. You know, a story breaks, there’s a Twitter thread, people start assembling information in a hashtag. You go over to Reddit and there’s a bunch of super sleuths that are, like, drawing red lines between license plates and the shape of trees. There’s all these weird things where people are collating and amassing information and trying to do the work of journalists.
And in some cases, that can really trip us up. We saw very clearly that the Boston bombing was a mess because of the way the police were crowdsourcing information, and journalists were relegated to that crowdsourcing model. But the crowd is here to stay, right? And so one of the things that we researched around influence online is we looked at a lot of people that would call themselves independent journalists, and how they cultivate audiences on YouTube. And a lot of it has to do with these para-social relationships that they create with their audience, where they're in the bedroom reporting the news, you know. And they've got some screenshots they're going to show you. And they're going to tell you about what's happening, you know, in AOC's campaign, or something.
And it’s really just aggregated news that they’ve brought into a particular ten-minute clip or whatever. But those para-social relations that they are creating are durable. That’s what’s strange about it, is that people do go back time and time again for this more editorialized news. And a lot of it has to do with the structure of YouTube that pings people and reminds them that there’s new content to come back and read. And so there are opportunities for flexing that kind of influencer model in doing short more op-ed-y pieces and cultivating audiences and followers that way.
But, I mean, I really don’t envy you in the sense that when you’re trying to do all of this fact checking you’re in competition with all of these different methods and motives and opportunists that are really trying to both beat you to the punch, but also, in some cases, particularly certain beats, they’re targeting you and your legitimacy as well. So you have to think about, you know, how do you also do your news across platforms to ensure that many different audiences see the work that you’re doing?
And Twitter isn’t the only game here. But if you pay attention to—and this is going to sound crazy—but sort of the different constituencies online that follow maybe some of the conspiracies and whatnot on your beat, you might actually gain a perspective and a way of understanding how information flows across these spaces, so as to make an intervention with your articles and your pieces.
MCGREGOR: Yeah. I mean, I think you touch on something that—I love that phrase, the man in the tweets. (Laughs.) But, I mean, I think this also touches on something. I mean, obviously there’s a lot of opportunity in digital, and social, and all of this, right? I mean, it has its pros and cons. But I think, you know, the very topic of the discussion right now—misinformation, disinformation—we also know that these are really used to not just manipulate the public, but also manipulate the attention and efforts of journalists themselves.
And so I’m wondering if you all have thoughts or ideas about how, especially for local journalists where, you know, there is that unique and really important opportunity to actually be on site, right, and to have those strong ties with the local community, so that you know if that’s a real activist group, you know, that’s allegedly doing something in your community. So how do you balance that sort of traditional beat reporting, the time allocated to that? Which we know—I mean, I was struck—I know there’s been a lot of work on this, but—Joan, in one of your reports—the loss of two hundred thousand, more than 50 percent, of newspaper jobs between 2001 and 2016, right? I mean, a massive, massive hit to the human resources available to the industry. And so, again, everyone being asked to do more, you know, with the same amount of time and the same resources.
Any thoughts on how to kind of find that sweet spot, where we’re not letting go of that beat reporting and just kind of following the memes, while also staying abreast of what’s happening—what is happening online, because that’s—we all know too, that’s a lot of what our audience is coming with. Our audience is searching—you know, they are searching online, and they are seeing things on social media. We’re in competition with the rabbit hole of the internet at the same time. I’m just curious if you all have any thoughts about where folks might go with those things.
MITCHELL: I mean, I would just add that it's not the same time, same resources. It's fewer people, same time, and fewer resources. So it even adds more to that challenge. So it's incredibly hard to think about how to approach it. At the same time, part of what we've seen is there's some more coming together, one, within newsrooms—so, who's got sort of this subject matter expertise? How can they share this with somebody else who may have to go cover that, you know, at the spur of the moment, because they're here in the newsroom and the other person is out. So being able to really work together as a newsroom, and then to work on cross collaborations, both within the local area and with people across the country and even outside the country, that are covering the same sorts of issues that you or people on your team cover.
I think that's one of the things that all of this has led to, in a positive way: journalists coming together to say, hey, we've got to be sure that this is accurate, that we're working together to get this story out. There certainly seems to be more of it happening. Nonprofits working with some of the longer-established news organizations, because they have a staff of three but more time on their hands to be able to go do things. So some of that innovative collaboration. And then saving resources in a way that other people in that mix can access them. That's something that Claire's team did with the work in the election over in France, wasn't it, where everybody was sharing, you know, across many different newsrooms what they were looking at, in terms of the fact-checking role.
WARDLE: Can I just—yeah, just really quickly. I mean, to that point, it’s completely pointless to have five journalists in five different local newsrooms all debunking the same Twitter image. I mean, we shouldn’t be competing around disinformation, because if we get it wrong it hurts all of us. But so that means that in this era of collaboration it takes a strong editor to say, yes, for these circumstances, it makes sense for us to have either a shared Slack community or other types of environments that we can actually work with one another to make sure that we don’t get hoaxed. So, yeah, in France we worked with thirty different newsrooms, national, regional, and local. We just finished in Brazil with twenty-four of the biggest Brazilian newsrooms, and in Nigeria with twelve of the biggest newsrooms.
And I really didn’t think it was going to work. And my colleagues were like, yes. It did involve us going to a chateau outside Paris, getting drunk, and singing karaoke. (Laughter.) But once people had actually let down their guard, and at 2:00 in the morning sang “Living on a Prayer,” it meant that two weeks later on Slack they could say to a journalist from another newsroom, can somebody help me with this? And actually, in these spaces, we do need to be helping one another. And I think in those resources, who—we can’t also just monitor everything. But there’s a combination of beat reporting and you do need to be in all of your local Facebook pages, because there’s going to be a ton of stuff in there. Some of it weird. Some of it, it’s what your audience really cares about.
And I know that you know this, but there’s so much online to monitor that, again, within your newsroom, who does this? How do you manage that workflow? I think these are all questions that no newsroom has got right. The New York Times has got its own silo. The Washington Post has got its own silo. I don’t know how you’re doing it either, but it’s a challenge that every newsroom is facing.
MCGREGOR: So I think what that means is we should have a karaoke machine set up later. We expect that at the reception. (Laughter.) Well, I want to sort of take that—again, we’re talking about local journalism here, but as we’ve touched on a few times, Claire, you’ve been looking at these things globally. And I know that you’ve seen a lot of things that are great kind of learning opportunities for how to handle these things. You know, we are here at the Council of Foreign Relations. Obviously we’re in that sense also thinking in an international context. Are there things that you’ve seen that are happening internationally that, you know, journalists here—you know, local journalists here can learn from and share with one another, that we should be on top of? And do you have suggestions for ways to stay on top of—you know, stay abreast of those innovations as they happen?
WARDLE: So the biggest thing globally is the rise of closed messaging apps. And that’s partly because people are nervous about their own digital footprint. And so they are moving into spaces like WhatsApp and Facebook Messenger, and small group discussion.
MCGREGOR: You mean the public is doing that.
WARDLE: The public, yeah. So what that means is when you’re trying to understand what rumors are circulating in Brazil, the challenge was this was encrypted. There was no way of knowing. Yet, that’s where most of the content was circulating. So we needed to build really strong relationships with the audience to try and get them to send us content. But I’m sure by 2020 the same is going to happen in the U.S. And, surprise, surprise in the U.S., there are high uses of closed messaging apps. They just tend to be in diaspora communities and different ethnic groups, who aren’t necessarily represented in newsrooms.
But it’s not that there isn’t that usage happening in the U.S. We do need to be much more aware of that and ensure that we have relationships with our audiences. But also things like Facebook Groups. I mean, we don’t spend a lot of time in Facebook Groups, but they are increasingly popular, partly because Mark Zuckerberg decided about a year ago that he was going to make the relationships we have with friends and family more important. But Groups are a place where a lot of disinformation circulates.
So a lot of this is just being aware of these trends. And if you train all your journalists on how to monitor Facebook pages, but in two years’ time everybody’s somewhere else, it just—we just need to be ahead of these digital trends, so that we don’t get caught out.
DONOVAN: I think it’s also important to realize how much rumor turns this wheel of disinformation. And so we will often see in different, more obscure spaces on the web, lots of—you know, let’s do Jobs not Mobs. It’s trying to figure out a way to rebrand the caravan, because we’re not getting the kind of uptake that we want. You know, you have people that are—you know, they may be political operatives in these spaces. They may just be, you know, chaos trolls type people. They may be—in my line of work a lot of them are ardent white supremacists and misogynists who are interested in pushing this worldview into the way that we—into our lexicon, into our common, you know, thoughts about immigration. We should also be thinking about, well, maybe a white enclave is the way forward, right? These are the kinds of things that they want you to think automatically.
And so a lot of these things germinate in these other spaces. And to Claire’s point about we don’t need a bunch of reporters that are spending all day in these spaces. What we need are collaborative ways of fact checking and seeing where these rumors are percolating, and then what influencers pick them up and make them trend or make them a central issue in a certain political platform or part of some kind of a discussion that we’re having about a particular issue. And so looking at and tracing those information lines out of the rumors and out of these operations and into how they target either journalists or they target D-list celebrities, or if they know that a certain set of reporters are paying attention to, you know, someone who’s had some kind of scandal recently. They will, like, actively target those people, because they know journalists are watching them, and they know that they can get their slogan or their story to piggyback on that issue.
And I think it’s also really important to understand when political operatives are seeing movements in these rumors as politically advantageous opportunities, right? During the Kavanaugh hearings, we saw a lot of talk about #MeToo. I don’t believe a lot of that was authentic in a lot of ways, right? What they were doing was kind of riding the movement for whatever kind of political opportunity they thought they could get. And so you saw a lot of influencers tweeting about Kavanaugh and tweeting about this. And you saw an incredible polarization on the issue. I mean, it was just left-right, no nuance, no middle, no possibility of discussion. And when I see stuff like that happen, where there’s, like, a very clear stereo effect of the polarization, I know that there’s many different coordinated actors at play that are building a coalition on the left and the right.
And so you have to pay attention to who’s working together. Where is this rumor coming from? Is this part of a movement? Is this just out of nowhere? And if it comes out of nowhere, it’s usually something that’s been seeded in some of these spaces that, you know, you shouldn’t have to go to, you know, logically. But my research team does.
WARDLE: Because also a part of that too is that the agents of disinformation are hoping that you will report on this stuff. And sometimes it’s very tempting as a reporter to find a pretty niche rumor that’s like—because it’s the, oh—like that moment as a journalist. Like, well, this is a good story. But that’s what they’re hoping you’re going to do because even in the form of a debunk—they see debunks as a form of engagement because you’re giving oxygen to that. And there’s real challenges. And there’s a lot of academic research that’s still mixed about actually do we do more harm by writing about disinformation? And certainly if we write headlines poorly and irresponsibly and simply just repeat the falsehood, we do more damage.
So there are ways of how do you word a headline? We should actually focus on the truth, rather than the falsehood. So rather than saying, Obama is not a Muslim, you say, Obama is a Christian. I mean, that’s a bad headline—(laughter)—but it’s a simple way of remembering. By simply saying “not” or putting myth, myth, myth, myth, your brain remembers the myths. Like, there’s all these types of things that make it really hard to do journalism in the age of disinformation. But we can’t turn our back on it. We need more research. And in the meantime we need to do, learn, throw spaghetti at the wall, because we’re at a pretty crazy time.
MCGREGOR: Well, I think you kind of just—you started to answer a question I had as I was listening to this discussion, which is when we start to see the rumors, if we are in the Facebook Groups, if we are in the WhatsApp channels or whatever, and we see rumors starting to gain traction, I mean, should journalists be jumping in and saying: Hey, no, that’s not happening? Or is that—you know, is that a place where we’re going to start to go down that rabbit hole and really lose a lot of time? Is that something that’s worth doing? I mean, obviously also there is this question of, you know, by repeating it or debunking it are we adding to the problem? I mean—(laughs)—
WARDLE: We talk about this—we talk about this tipping point, which if you go too early you do damage, if you leave it too late you can’t pull the rumor back—Pizzagate. So there’s this kind of magic hour. And we’re still trying to figure out what it is. But to your point, there’s real ethical considerations about reporters wading around in WhatsApp groups and Facebook, you know, Groups. And I don’t think we’ve done a good enough job as an industry of talking about what should the ethical limitations be of sourcing from closed spaces. Is it the equivalent of deep background? Or is it that you can use it to inform other reporting if you get other sourcing? Or is it that we can say we found it here. It’s—I got into the group. So it was said. I mean, it’s not public. It’s semi—there’s a whole host of things. And I would suggest that most editors—I don’t know how many editors are in the room—but whenever I bring it to the editors they’re not normally across these types of things because they didn’t grow up in a newsroom where digital was the norm. And so there are new challenges that we need to talk about.
MCGREGOR: Yeah. Well, I want to sort of take a little bit of a pivot here, because I’m just curious. If it’s all right, if I poll the room. I’m curious if any of the local journalists here are—in their newsroom—are there platform people who have been working in your newsroom, or platforms who have been supporting work in your newsrooms? Is anybody willing to say yes or no? Yes? We’ve got a couple of hands. A few. I’m just—I mean, I’m curious—I mean, I’m curious too because I think, you know, we all know that, again, huge shrinking of resources in local newsrooms. And I know that something—this is something in research that’s a challenge as well. The question is, where does the support for important work come from?
You know, I’ve definitely had, you know, friends and colleagues where, you know, they were the Snapchat channel person, or—you know, I’m curious, again, how do we—because there’s so much out there, whether it’s a direct sponsorship of having people in the newsroom or, you know, these platforms are constantly coming out with tools. You know, how do we—how do local—you know, I think a larger—everyone faces it, right? Larger news organizations there’s a little bit more depth. There’s maybe a little bit more margin for, you know, experimenting with things. You know, how should local journalists and local news organizations be thinking about operating with these relationships with platform companies, when, you know, we know there are a lot of kind of not necessarily parallel incentives or parallel priorities? Any thoughts?
MITCHELL: Well, I’d even be curious at how many folks in the newsrooms—how many of those newsrooms are having conversations about this. Is it something where there are regular conversations about it? Or does it still feel kind of out there, or we’ll deal with it when it’s in front of our face? Are there—are there conversations about how do we strategize? Because I think that’s the first thing that has to happen, right? There has to be a conversation about it. And then thinking about how can we work together, and talking to other organizations about what they’ve done, if they’ve seemed to have some success. But I think we’re—it seems to me that we’re still in the pretty early stages of maybe even talking about it among journalists themselves.
DONOVAN: Yeah, I think the platform question is pivotal in the sense that a lot of journalists feel very individualized, but also the newspaper has been disaggregated. People share individual articles, right? And so the uptake of that means that people assemble their own sort of scattershot newspaper every day, where they don’t get a good balance of entertainment, and sports, and weather, and opinion. You know, one of the things that’s, like, interesting about the New York Times editorial opinions is that they really travel very far and wide, and sometimes much further than any other section in the newspaper. And so you’re cultivating a younger generation of news consumers that don’t differentiate between the different sections of the newspaper. And part of this is aided by the infrastructure itself, which is that Twitter and Facebook really want to be known for the news. People—you know, they want people to say, well, I read it on Facebook, not I read it on the Guardian, right? (Laughs.) You know?
And so one of the ways in which they co-opt those organizations and that branding is by really creating these content cards that make it very hard to see who is serving this news. And I think the reason why we’re in this moment that we’re in, where disinformation is so potent, is because people were able to leverage that very banal infrastructure, you know, these design decisions. No one really thought, well, what are the downstream impacts of losing the branding of a major news organization, and it being in this tiny little Twitter content card, where the picture is very big, and the link is in gray. And many of us—you know, I wear glasses—sometimes you don’t even see it, right, until you click. And sometimes the way that the browsers are now set up in apps it doesn’t even go all the way to the website. Sometimes it’s just within that browser.
And so I think it’s a very difficult thing for us to, you know, face that not only are the newsrooms becoming disaggregated, the journalists are becoming atomized, but also the content and the way that this stuff is traveling is affecting these audiences, where they’re not actually able to enjoy, like, a healthy, you know, smattering of different types of news. And so I think that, you know, a lot of talk has been about, you know, reviving alt-weeklies, and bringing back magazines and things of this nature, because people are also craving that sense of breadth and depth. And so, you know, I think it’s important for newsrooms to also think about, you know, is there a possibility of reviving some print maybe once a week or things, in order to make sure that people have access to a more robust set of information all in one place.
MCGREGOR: So now is my favorite portion, which is where we get to take questions from you all. So I think we have at least—we’ve got a couple of mics around the room. So please raise your hands, get someone’s attention, and let’s get started. So, sure, yes, in the middle here.
Q: Andy Revkin. Former New York Times, where I had a blog for ten years, that I learned how to deal with the commenters and all this, and Twitter.
But I also wrote a lot about this issue. And my questions are how much of this is—what is the media’s responsibility to start working with its audience? A lot of this was a compartmentalized discussion about what we need to do in the newsroom. What’s the education component here? The Center for Humane Tech, you know, there’s this bigger universe of people who have become convinced that it’s impossible, essentially, to tamp this down without a cultural—literally people have to work at it like yoga or something. It’s a practice—
MCGREGOR: You mean the audience members?
Q: Yeah. You know, I started a hashtag a couple years ago, #MakeRealityCoolAgain. It has to have pull, reality, and it doesn’t right now. Whether you’re a Green New Deal advocate or someone on the other side. So I don’t know what the question—what’s the role of media in training, or sort of facilitating, or sort of employing the audience as part of the solution?
MITCHELL: I think that was a little bit what Susan was trying to get at earlier. You know, it seems to me that if we think about the core of the success of the business of the newsroom is that relationship with our audience and therefore having an audience. And so it is of interest, I would say, you know, primary interest to be sure your audience feels like you are with them in this, and you are working with them to get them the information, and to be able to help them navigate it. As Claire was saying, talking about your own navigation or the tools that you used to learn are also ones the public can then use. So by doing that, you’re educating the public in certain ways. But the more that there’s a relationship with members of your audience, the more they have you in mind and they may be likely to check, to come back to your source, to say: Oh, hey, did they report on this also? Or, I didn’t see that rumor headline in their newspaper.
So I certainly think it’s a key element. Can journalists take this on by themselves? Probably not. (Laughs.) In terms of getting the public up to speed. We also see there’s a lot of the public that thinks it’s fun to share this. And even some of these sites that put in huge letters, you know: Satire! This is fake! People don’t care. This is what they want to see, and so they’re going to spread that anyway. But there is a lot—the largest portion of U.S. adults is still in that middle core where they are not on the ends politically. They are not hyper political on either side of the spectrum. They’re also less—not as staunch followers of news. So they tend to be less politically driven. But they’re also not the drivers of news conversation in their community. So there is opportunity there to connect with those folks before they get down this rabbit hole, not having a clue what to do.
MCGREGOR: I’m just going to speak out of turn for a minute here, because I also think that part of this, in addition to being transparent about our process as journalists, reporters, editors, I also think there’s a massive value to literally writing down what your practices are as a news organization and making them available. I constantly refer—and, you know, I’m an assistant professor at Columbia Journalism School—I constantly refer my students to the AP news values. And I’m always impressed as I work with people outside of the journalism and media space, they don’t actually know.
Like, they have no—I’ve had colleagues say, I don’t know, are there rules about what you’re allowed to do to a photograph before it goes in a newspaper? Yes, of course there are. But this is—you know, these are very educated, very savvy people. But I think sometimes as organizations we’re not really saying, hey, this is how we do what we do. And these are the things that we do not do. And I think that, you know, that’s actually something that can be—you know, it’s a substantial effort, but kind of a one-time effort, to say: This is who we are and how we do what we do. And if you want to know what that process is, we’re happy to, you know, put it in writing and stand by that, so.
MITCHELL: And, to be honest, I mean, one of the things that was interesting that we saw in one of our recent surveys, was it’s the case that most of the public thinks that most of what journalists are going to report is going to be accurate, even across the political lines. But they also think journalists cover up mistakes and aren’t transparent when they make mistakes, so they’re not willing to share that with the public. So sort of what did I do in my report and, oh, what didn’t work here? How did I get taken by that rumor? Those sorts of things can go a long way to building that relationship, as well as educating.
DONOVAN: And my answer to you, just briefly, is sweepstake prizes. (Laughter.) It’s how we used to do it, right? You know, like, get them to send in a little coupon, you know? But in our research at Data & Society on how conservatives consume the news, we did an audience study where we had a researcher for a year spend time with conservatives in the South and look at what exactly is their social media practice? And one of the things that conservatives love to do is read five or six stories about the same thing, and spot liberal bias. And so the trust in Fox isn’t about that Fox is telling me exactly what happened, it’s that they admit their bias, and so I can take it seriously.
And so she found—Francesca Tripodi is an incredible sociologist—found time and time again, young and old, they were reading Al Jazeera, New York Times, CNN, because they wanted to see how badly they were maligning Trump. Or, you know, one of her, you know, groups, she went to Bible study. And the pastor says: You know, we need to go back to the original text, and he breaks out the tax code. Because he’s saying that journalists are not fairly covering the tax code. They’re saying all these things are in it. It’s just not there. Read the original text. And if any of you are Christians in the room, that’s biblical practice. You know, that’s just scripture, right? And so they’re referring to this and trying to, you know, abstract what might be the values of journalists that aren’t reading the original text, that aren’t going back, and are, you know, inserting analysis and opinion over fact.
And that report on scriptural inference I think really also helped us inform how conservatives were using Google search in a precognitive mode. Which is, they were using search to fact check things. And they would put quotes around very specific statistics. And then they would get into a realm of fake news, because those statistics were the things that were being reported. And so when you’re looking at adversarial media, or media that’s something that you think, oh, maybe I need to debunk this and things, throw some really—you know, the keywords and statistics into Google and see what else comes up, and see what that news ecosystem looks like. And you can get a broader sense of why people are maybe misunderstanding what’s happening. And then you can write more pointed rebuttals.
MCGREGOR: So there’s one here, and I see one in the back also. So, yes. Yeah, please go ahead.
Q: OK. I’m Jessica Rosegaard with WWNO Public Radio in New Orleans.
I mean, I like the idea of writing down your process and making it accessible, except that it’s that old adage of you can lead a horse to water. So, you know, personally, on a very small level, I’ve been engaging with friends on social media and trying to get them to stop using the F-A-K-E news term, because—
Q: —I don’t want to give it breath. And I don’t want my friends to. And fortunately for me, my friends trust me as a person and as a journalist. (Laughs.) So, you know, those connections do have some weight. And I think for those of us that are in local newsrooms, it’s really community engagement. And even through your friends. And somebody posted a picture of a border agent allegedly with a very large gun going after a mother and a child. And it looked kind of weird to me, and I didn’t recognize the source. And I’m also not trying to force my friends to consume only mainstream media sites, because I don’t want them to—you know, to take issue with that. I’m not trying to snuff out any voices. But it didn’t look right to me. So I went, and I googled. And sure enough, it very easily could have been photoshopped because the colors on the uniform did not match Getty pictures. So I just gently said, hey, hey—(laughs)—please be careful.
You know, a friend posted an article about Steve Scalise and speaking to a white supremacist rally and being in New Orleans. My antenna went up and I was like, oh my God, is this something I need to cover? Is this breaking news? And it was from a couple of years ago. And I said, hey, please, yes, this is still important. But please—you know, so I’m basically trying to get my friends to have some journalistic integrity on their social media pages because they’re not held to any accountability in what they share, which stinks, because they have a platform, you know? And we’re held to the highest accountability. And I’m—you know, I’m hoping that just that one little pebble ripples. You know, it’s a very small effort. And sometimes results in late nights and a bottle of wine, and me angrily on social media trying to debunk my friends. But, again, in local newsrooms, I feel like we have that access to our communities, and we can use that.
MCGREGOR: Yeah. I mean, I think that’s absolutely true. And, again, as we’ve touched on—and we haven’t talked a lot about those kind of in-person relationships. But I think over the—over the last decade or so, one place where local journalism organizations have really thrived is also in curating and creating in-person events where journalists connect with the local community and help bring the local community together. And I think that those are—that is obviously a place where local news organizations are irreplaceable, right? You can do that in way that national organizations can’t. And that’s definitely something to leverage.
DONOVAN: Yeah, one of—a point about this is my friend and collaborator danah boyd at Data & Society has talked a lot about how the denigration of the social fabric has a lot to do with lots of—the loss of particular important social institutions. And one of these is just nobody knows who reporters are anymore—you know, we don’t have friends who are reporters. And so it is on you to, you know, maybe be nice—you know, Uncle Larry, you know, he’s crazy, you know? (Laughter.) But the rule—you know, the coda is, avoid lurid curiosity, right? That’s part of the practice here. And so, you know, you can just kind of link back to that and be like, this seems a little far-fetched. And we have rules around here that we abide by. And if you get people to, you know, see that they are part of the distribution system, you know, they might be a little bit more responsible.
But the other thing is, you know, it really feels good to share information that comes—that confirms your biases, you know, because it doesn’t just mean that you’ve thought of them on your own, but it points to you being part of a group of people and a community of people that, you know, do believe that Hillary Clinton, you know, is a Satanic witch. Which is fine, I guess, but, like, her clothes, they’re just not—I mean, I’m from Boston, so Salem, we know our witches. (Laughter.) Like, it’s just not—it’s just not happening.
MCGREGOR: It’s just not—they don’t do the colorful pantsuits.
DONOVAN: It’s just not the look, you know? Yeah, it’s just—you know, even in best days with witches, you know?
MCGREGOR: So, sorry, all the way in the back there? The gentleman? Yeah.
Q: My name’s Fred Melo. I’m a local news reporter in St. Paul, Minnesota for the St. Paul Pioneer Press.
And one of the ironies in this conversation is that one of the biggest sources of misinformation is our own website, that you scroll down to the bottom of any one of my articles, you get this, you know, comment section where you can read the most scandalous things. And unless there’s violence, or somebody’s home address that’s been posted, we’re probably not going to take it down. Now, our competition—which is kind of a bigger paper—polices that a little bit more and doesn’t allow those comments if it’s a crime story or maybe something with overt racial overtones. But even there, there’s a lot of misinformation being posted. What do we do about that? Do we allow that? Do I slide into the comments and say, hey, you’re all wrong, you conservatives? And by the way, I’m not a liberal. You’re just all wrong, you conservatives? You know, it exposes you to so much, I don’t want to say liability, but it really confirms their bias, which they already have to an umpteenth degree. What do we do? Do we police these pages? Do we take them down? They’re a revenue source? We need the money.
WARDLE: Yeah, I mean, I would say this debate about comment has been going on for a long time. The places that do it really, really well, surprise, surprise, have spent a lot of money on paying moderators. And if you do that really well—like, the Guardian has been doing it for fifteen years—you get huge payback. You have a proper community. Ends up being easier to police, et cetera, et cetera. If you are a small paper, there has to be a really strong decision that—a cost-benefit decision there. So I think that it’s—I don’t think it’s possible unless it’s resourced properly. And until we come up with a better business model for local, I wouldn’t recommend that you spend your time there.
Other than to recognize—I mean, there is—I did some research on this fifteen years ago. The more that commenters think that they’re being read by actual journalists or politicians, the quality goes sky high. If they think that nobody’s reading, apart from the other crazy guy, that’s when you get the wild west. And so there is a balance there. But it takes a while to cultivate that, for you to find that actually the comments are useful to you. Because when that community develops and blooms you’re like, wow, that guy’s telling you about this new source, or this new tip, and it’s great. But let’s not kid ourselves, like, that takes time and it takes resources. So this—you know, we’ve been talking about this for fifteen years and we haven’t come up with a solution.
MITCHELL: Yeah, and there may also be differences by—you know, by community in terms of how digitally oriented it is. If a lot of that is moving into the social media space instead of in the comment section it may be easier to be able to kind of walk away from that comment section, than an area that’s kind of just getting online, you know, and they’re just getting—that’s been more tied to legacy products until now. So it is still a—it’s an ongoing challenge.
MCGREGOR: Yeah. I think there’s definitely a challenge there also to see, you know, how much of the revenue is it really? Because, you know, it’s like even looking at things like remnant advertising—I actually thought where you were going with that was, like, the outbrain links at the bottom, right? (Laughs.) Everybody’s nodding, right? And I’m surprised sometimes too when I go to news, and I see these things. And I go, I mean, this is obviously—and, you know, from my own understanding from industry reports, you know, a lot of news organizations it’s in, like, maybe the high single digits, the percent of advertising revenue that’s—and that’s just of advertising revenue that’s going to that.
So, you know, I think, as Claire suggests too, there’s a real—you know, there’s a real discussion to be had, right? Are the comments worth keeping? Are these things, you know, things that we’re going to put resources to, or putting in enough resources to make them worth the trade off? It’s obviously a very tough conversation. And local newsrooms especially are operating on very small margins. So it’s—yeah.
MITCHELL: And what other avenues do your audiences have to connect with you and to offer their feedback? I think that’s a critical part as well, because they need to feel like they can, in some fashion.
MCGREGOR: Yeah. Other questions? Oh, good. We’re getting—sorry, I think there was one—yeah, there, and then in the middle.
Q: Hi. I’m Bryan Clark. I’m the editorial page editor and run the social media at the Post Register. It’s a small newspaper in Idaho.
I had a—something that happened with a botnet. And I had a—I have a particular way I addressed it. But it was a novel encounter for me. So I just figured maybe I’d relate it. You tell me what maybe I should have been thinking of, and if there’s a better way to handle it. So we had a referendum coming up. And I post an op-ed that takes one side of this referendum. And suddenly it gets, like, 10,000 likes from Central Asia. (Laughter.) So clearly—and the only solution I could come up with is I just post on it, like: Look, jerk, if you do this again, I’m deleting the post. And then I realized, now I’ve given them the ability to delete any post they want, if I actually implement that policy. So I actually don’t know what I should do and how I should think about it.
DONOVAN: Yeah. In that case you can contact Twitter directly and say, hey, I got hit. You know, can you take these—and they can, like, erase those things. And they’re supposed to be actually proactive about that. But something like that might only cost someone twenty bucks. You know, it’s not expensive to put ten thousand likes on a thing. And so, yeah. So I would just contact Twitter directly and say, hey, listen, we’re getting gamed. And this is—and this is very normal now. Anybody that tries to run a poll online. This is actually why we’re really afraid of the Census going online, is because we know that there are ways of gaming every digital system. And the Census is the biggest data project of, you know, our country, right? It’s the longest-running, most important set of data we have. But we know that all these digital systems can be gamed, including the Time Person of the Year.
MCGREGOR: (Laughs.) Right. OK, here. And then in the front.
Q: So I was on a panel once—well, no. I was teaching a workshop where I had to teach kids to do fact finding, and to do what journalists do. And I started, as I was doing my research, and I realized how do I tell them what a valid news source is. I mean, we all kind of just know this, right? So I started looking around and just said, you know, does AP have a list? Does somebody have a list? You know, journalism trade groups, is there a list somewhere? And I did find a couple of lists. But I guess this is very controversial in a country that celebrates free speech, because I’m not talking about censorship, but have any of you given some thought about some kind of credential? Some kind of credential that would go with—I know, I know. But maybe this will lead to some other better idea that you have. Maybe it’ll inspire an idea. But I’m just wondering. I have parents who have no clue, and they’re reading this stuff. And there’s really no way, it seems, for them to know what’s valid and what’s fake.
MCGREGOR: So you’re talking about the credentialing of outlets?
MITCHELL: Like the Good Housekeeping seal of approval?
Q: Kind of. But I don’t mean—I mean, there could be conservative ones, liberal ones, whatever. Not based on ideology or anything. But just when something is deemed fake, or, like, Russia Today, or—you know, I don’t know. Anything you got?
MCGREGOR: Well, I think, Amy, you had some work on this about federal—about the idea of government regulating some of this stuff.
MITCHELL: Certainly. I mean, it’s certainly the case that the public would put freedom of speech over the challenge of addressing misinformation when it comes to the government taking steps. So do you want the government to take steps, et cetera, and potentially lose some access to—the publishing access that you want, or have misinformation? And they opt for misinformation. So that’s a very, very strong belief. More willingness for technology companies to do so, but there’s a lot of tension there. But I would say, as with comments, this idea of somehow validating news organizations has also been talked about for years, and years, and years.
So one of the challenges—there’s actually another one that just launched I think this week—Steve Brill’s new one? NewsGuard. But they—one of the challenges is, how do you—who’s making the ratings? How do you make them, right? Again, just like we see with algorithms, there are people there who are judging whether something is good journalism or not, right? Is it accurate? Is it transparent? Do they ever publish something that’s wrong or intentionally misleading? There are things to arm members of the public with when they’re looking at the breadth of what a source does. It’s a lot harder to create a checklist.
It’s also hard to bring it to scale, to get people to have willingness to go through the steps too. There was another one a few years ago that launched that, again, just sort of fizzled because they couldn’t get enough—it takes time for a news organization to put forward their content, right, to say, look, we do all these things. So the idea of how does one arm members of the public and help them distinguish what is a legitimate news operation that’s seeking to do the right thing, even if it can’t always do the right thing because of the challenges that are faced in that field, versus those who are really looking to mislead.
But into that goes the political divide. So you would have people say very legitimate news organizations aren’t putting out real news. You know, they’re not putting out what’s accurate. They’re putting out what’s not, something that’s false. So it’s a real challenge. It is something that people have talked about for a long time. It’s one of the right things to think about in terms of how to help the public. The other question is, just like checkmarks that are—you know, this has been fact checked—how much will the public actually notice it?
DONOVAN: One thing to think about too is does anybody ever wonder why Donald Trump’s Twitter handle is @RealDonaldTrump? That’s pretty weird, right? But this is because at the beginning, Twitter didn’t have these blue checkmarks. No one was being verified. And so they were just like, so, what we’ll do is we’ll put these checkmarks next to people’s names, and these people have submitted to us verification of their identity. And we will do it for celebrities and public figures. And it kind of unlocks a different set of moderation tools, where you can set certain limits on who can get to you, and how many followers they need to have, and things like that.
So the blue checkmark turns into a thing that then becomes even stranger, a mark of legitimacy and endorsement. But no one has ever come out and said: This is what the blue checkmark really means, which is that somebody sent in a picture of their license and confirmed their identity. And then the controversy that we faced last summer, or the summer before, around the Unite the Right rally is that as white supremacists gained status and became newsworthy figures, they were applying for blue checkmarks. And instantly, people were transferring a kind of informal authority to them as legitimate independent journalists, in some respects. And they were podcast hosts, right? I mean, anybody can do a podcast. (Laughter.) It’s true.
MITCHELL: But even to take it back, there was a study that was done of young folks about their ability to, you know, tell what was sort of a legit, accurate story, and one that wasn’t. But the fact that there was data in it meant it was OK. You know, it made them feel it was OK. And that’s, you know, one of the things we spend our time with. OK, well, what’s real data and what’s not real data? So there’s—it’s—there’s a lot of—a lot of different areas to think about in that space.
WARDLE: But there aren’t—there’s Reporters Without Borders doing something. CredCo, Trust Project—
DONOVAN: Yeah, CredCo is great.
WARDLE: —NewsGuard. So there are a number of people working on it. But for all of the things that you said, it’s not without its challenges.
MCGREGOR: Yeah, and just to add one last thing. I think one thing that’s interesting is to think about how we—what went away that we want to now add back. And it strikes me that we’ve touched on this idea that in the digital space there’s a real loss of essentially branding and sourcing that’s communicated to the viewer. And so when I find something on a search engine, or I see something on Facebook, it’s not really obvious where that comes from. And I always point out that, you know, when you used to go to a newsstand or the supermarket checkout aisle, none of us had a lot of difficulty distinguishing between tabloids and broadsheets, right? We knew what the National Enquirer was, and it looked a certain way, and it had a certain kind of presentation. A lot of those distinctions have been erased in the digital space and by social media and search engines. And so I think one thing to think about is how do we add back some of those things that we knew about media originally that are now missing.
I think—are we till 4:00 or 4:15? Till 4:00. All right. So we have time for one more question. Sorry.
Q: Thanks. Bennett Hanson, Global Press.
Early in the discussion, I think Joan listed some pros and cons of a newsroom buying social media bots to boost followers, likes, and readership. Would you recommend that? And can you name any news organizations who have done that?
MCGREGOR: Oh. (Laughter.)
DONOVAN: Well, I mean, there was a great story yesterday about Sputnik, right, that’s running a net of accounts on Facebook, where they’ve spread out the branding. And then what they do is they, like, leverage different keywords that are popular that day. They might change the name of the Facebook page.
WARDLE: That wasn’t just for the sake of news and information. That was propaganda. (Laughs.)
DONOVAN: Yes. Yeah, it was propaganda. But, you know, I mean, some of our best tactics, right? You know? (Laughter.) I’m saying no. (Laughter.) But also, I mean, come on. (Laughter.) The problem is that it is considered a digital dirty trick. And ethically it is—you know, if you get caught, the costs are very high, right? You lose your social media accounts. You’re, you know, lambasted in other press outlets.
There are other ways of going about that, where you would buy, like, in-platform advertising, which is very much the same thing except it’s a more premium price. And we still haven’t figured out if when you’re buying things in platform if you’re actually buying followers from Twitter, or if you’re buying likes from accounts on Facebook, or if that’s also supported by this algorithmically generated crowd, right? So there’s a lot to be learned there. And I think that my main point is about we don’t know how these tools work, because there’s not a lot of transparency. But what we do know is people are making billions off of advertising online.
And the people who are spending a decent amount of money to infuse the information ecosystem with all of this garbage are winning the day in terms of the fact that we are now all scrambling to recover an information ecosystem that felt very stable, and we felt very at home in it, even when it had problems where, yeah, from time to time major news organizations would get hoaxed or they would print the wrong thing, or WMDs, right? (Laughs.) These things happened. But we had course correction, we had new—you know, we faced those challenges.
And I think algorithmically what we fail to understand is people are talking to computers now. They’re not talking to each other. And in doing that, you know, when they enter something into search, you need to consider that they’re not maybe thinking with the same keywords and the same structures that you are. And so you have to think more broadly about, well, what is my audience actually looking for? And how can I show up there, in those responses? And, you know, the person who’s mildly unethical might juice that for their advantage.
WARDLE: But definitely national newsrooms are spending a fortune on these platforms—a fortune. And, I mean, I remember ten years ago when I was working at the BBC, Al Jazeera bought the keyword “Syria.” And the BBC people were like, that’s terrible.
DONOVAN: Why did you do this?
WARDLE: But it was genius. (Laughter.) It was just a few cents to do it. Al Jazeera came to the top. So, again, it’s a conversation we need to have. We’re not having it. Newsrooms—when I’ve had a conversation with a newsroom and they tell me how much money they’ve spent. And they’re like, don’t tell a soul. You know, and it’s, like, we should be more transparent. Like, if we’re actually spending a ton of money on Facebook, right, let’s have a conversation with Facebook that says: Why don’t you give newsrooms some advertising dollars, Facebook? You’ve got your new $300 million. Why don’t you have a fund that allows local newsrooms to boost their posts? You know, like, we should be having a transparent conversation, so we can actually have leverage to talk to the platforms about what’s needed.
MCGREGOR: Well, I don’t want to stand between anyone and a reception. But I do just want to say thank you to our wonderful panelists, and also to all of you. I know it was—I’m very excited to have been in this space and looking forward to hopefully talking with many more of you soon. So thank you very much.
WARDLE: Thank you. (Applause.)