All the (Fake) News That’s Fit to Share
5:52 minutes
In the days just after the presidential election, the top result returned by the Google News service for the search “final election results” was an article from a dubious website that claimed—wrongly—that Donald Trump had won the popular vote. And over the course of the campaign, false articles spread like wildfire over social networks such as Facebook, carrying both fake statistics and accounts of events that had never happened. In fact, an analysis by BuzzFeed News found that in the final three months of the election season, fake news stories on Facebook outperformed top real news stories published by reputable news providers such as The New York Times, The Washington Post, and NBC News.
Both Google and Facebook have said that they will put measures in place designed to slow the spread of fake news, mainly by restricting the placement of advertisements that earn money for viral sites, though Facebook’s Mark Zuckerberg said last week that he doubted fake news had had much impact on the election. Slate’s Will Oremus joins Ira to talk about the challenges of putting a damper on the viral spread of fake news, and what readers can do to be more aware of their online news diet.
Will Oremus is a senior technology writer for Slate in New York, New York.
SPEAKER 1: And now it’s time to play Good Thing, Bad Thing.
[MUSIC PLAYING]
SPEAKER 1: Because every story has a flip side. In the days just after the presidential election, if you searched the words “final election results” on Google News, the top result was an article from a dubious website that claimed wrongly that Donald Trump had won the popular vote. And, of course, it’s not an isolated incident.
Over the course of the campaign, false articles spread like wildfire over social networks like Facebook, spreading both fake statistics and accounts of events that had never happened. In fact, an analysis by BuzzFeed News found that in the final three months of the election season, fake news stories on Facebook outperformed top real news stories published by news providers like The New York Times, The Washington Post, and NBC News. Both Google and Facebook have said that they will put measures in place designed to slow the spread of fake news. But is that enough?
Joining me now is Slate’s Will Oremus. Welcome back to Science Friday.
WILL OREMUS: Good to be here.
SPEAKER 1: So tell us about– some people tag the recent rise in fake news to something that sounds like a good thing. How could this be a good thing?
WILL OREMUS: That it’s a good thing that we’re having more fake news?
[LAUGHTER]
WILL OREMUS: Well, you know Facebook CEO Mark Zuckerberg has tried to defend the company against a lot of the criticism it’s been getting in the wake of the election from people saying that people were misinformed, that voters got the wrong information. You had fake news stories like the one you mentioned. And actually the fake news stories, according to several data analyses, skewed heavily pro-Trump. And so you had stories about the Pope endorsing Trump, Denzel Washington endorsing Trump, Hillary Clinton being indicted.
But what Mark Zuckerberg has said is, look, the vast majority of content on Facebook is authentic. We do have mechanisms in place for people to flag fake news. And his perspective was that really, this is a distraction from the fact that what Facebook has done is to have opened up the media to a much wider array of voices than ever before. So he was trying to sort of spin this as a good thing. I don’t think a lot of people bought that.
But if you are looking for a silver lining here, it could be the fact that companies like Facebook and Google do have the power to change this if they want to. It’s not an easy problem. And it’s not always easy to draw the line between what’s fake news and what’s just a false story, what’s a misunderstanding. But what they’ve done so far is they’ve taken a step whereby they are no longer allowing identified fake news sites to advertise on their networks.
The deeper issue would be to address the way an algorithm like Facebook’s actually seems to prioritize fake news. I mean the algorithm is geared toward what gets clicks, what gets likes, and that lends itself to sensationalism. So what Facebook will have to do is find probably an algorithmic solution if it really wants to tackle this and to make truth a value that it optimizes for, along with all the other things that it optimizes for in the news feed algorithm.
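[To make the idea concrete: here is a minimal, purely illustrative sketch of what “making truth a value the algorithm optimizes for” could mean. This is not Facebook’s actual ranking code; the signal names, weights, and the notion of a “truthfulness” score are all assumptions for illustration.]

```python
# Hypothetical sketch only -- not Facebook's actual news feed ranking.
# Signal names, weights, and the "truthfulness" score are assumptions.

from dataclasses import dataclass

@dataclass
class Story:
    title: str
    predicted_clicks: float   # engagement signal (assumed, scaled 0-1)
    predicted_likes: float    # engagement signal (assumed, scaled 0-1)
    truthfulness: float       # e.g., from fact-check flags (assumed, 0-1)

def rank_score(story: Story, truth_weight: float = 0.0) -> float:
    """Blend engagement signals with an optional truthfulness term."""
    engagement = 0.6 * story.predicted_clicks + 0.4 * story.predicted_likes
    return (1 - truth_weight) * engagement + truth_weight * story.truthfulness

stories = [
    Story("Pope endorses candidate (fabricated)", 0.9, 0.8, 0.05),
    Story("Election results certified (accurate)", 0.5, 0.4, 0.95),
]

# With truth_weight=0 the sensational fabricated story ranks first;
# raising the weight flips the ordering in favor of the accurate story.
for w in (0.0, 0.5):
    ordered = sorted(stories, key=lambda s: rank_score(s, w), reverse=True)
    print(w, [s.title for s in ordered])
```

[The point of the toy example is only that a feed tuned purely to predicted clicks and likes will tend to surface sensational material, and that any fix requires explicitly weighting some other signal against engagement.]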
SPEAKER 1: And of course, though, these are all businesses, Facebook and the others, they’re businesses. Won’t they be losing money if they do that?
WILL OREMUS: Yeah, that’s a really good question. I mean Facebook– if you ask them what the goal of their news feed is, the goal is to show users what they want to see. And in fact, when they train their news feed ranking algorithm, their optimal outcome is they go and ask a Facebook user– out of all the stories we could have shown you at the top of your feed, which ones would you have liked to see first? And then that’s what they test their algorithm against. So they are there to please their users.
Now if it turns out that their users love reading stories that confirm their political viewpoint, whether or not they’re true, that presents a little bit of a conflict for Facebook. Because then you have the option to either keep pleasing people by feeding them fake and misleading stories, or to try to fulfill some sort of democratic obligation to inform the public or to challenge people’s viewpoints. It’s not clear that it’s in their interest to do that.
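[The evaluation Oremus describes, surveying users on which stories they would have wanted to see first and testing the ranking against that, could be pictured with a sketch like the one below. The function name, metric, and story identifiers are hypothetical, chosen only to illustrate the idea of scoring an algorithm against users’ stated preferences.]

```python
# Hypothetical sketch of testing a ranking against a user's stated preference.
# The metric (top-k overlap) and all identifiers are assumptions.

def agreement_at_k(algo_order: list[str], user_order: list[str], k: int = 5) -> float:
    """Fraction of the user's top-k picks that also appear in the algorithm's top k."""
    return len(set(algo_order[:k]) & set(user_order[:k])) / k

algo_order = ["story_a", "story_b", "story_c", "story_d", "story_e", "story_f"]
user_order = ["story_b", "story_a", "story_f", "story_c", "story_e", "story_d"]

# ~0.67: two of the user's top-3 picks appear in the algorithm's top 3.
print(agreement_at_k(algo_order, user_order, k=3))
```

[If users’ stated preferences themselves favor confirming but false stories, a ranking that scores well on this kind of test can still amplify misinformation, which is exactly the conflict described above.]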
SPEAKER 1: But are they going to go in that direction for now? Are they going to try to limit the fake news? And can they do it? I mean you mentioned the algorithms. Is it actually possible to do that?
WILL OREMUS: Yeah, it’s a good question. I mean I think the underlying story here is really a conflict over what Facebook is as a company. So Facebook considers itself a technology company. It has built software tools that allow people to connect with each other.
But if you ask a lot of the people in the media, politicians– Facebook has become a dominant force in the news industry. And they see Facebook as refusing to own up to the roles that a media company usually plays, which are to inform and educate and not just to connect people regardless of what the content is. So I think that Facebook will respond to that pressure.
There are already signs that they feel it. They do not want to be perceived as a bad company. And even if CEO Mark Zuckerberg has been denying that this is really a problem, there are reports that there are internal groups of Facebook employees who are meeting in secret to figure out what they can do and how they can pressure him.
SPEAKER 1: Could they just take the ads off of the phony news stories and then there is no money to be made on that?
WILL OREMUS: That’s part of it. And then that just leaves the much deeper problem of the fact that Facebook’s entire algorithm is structured to show people what they already want to see, as opposed to expose them to different points of view.
SPEAKER 1: So we’ll see how this all plays out in the future. This is the future. It is pretty hard. All right. Thank you, Will.
WILL OREMUS: Thanks.
SPEAKER 1: Will Oremus, excuse me, Will Oremus, Senior Technology Writer for Slate.
We’re going to take a break. When we come back, it’s the Cold Show. I don’t mean the show itself, but we’re dedicating the rest of the hour to that perennial enemy, the common cold. How you get it, the science behind it, do the cold remedies work? Just in time for that holiday season, I think you’ll learn a little bit about spending time with other people.
So stay with us. We’ll be right back after this break.
Copyright © 2016 Science Friday Initiative. All rights reserved. Science Friday transcripts are produced on a tight deadline by 3Play Media. Fidelity to the original aired/published audio or video file might vary, and text might be updated or amended in the future. For the authoritative record of Science Friday’s programming, please visit the original aired/published recording. For terms of use and more information, visit our policies pages at http://www.sciencefriday.com/about/policies/
As Science Friday’s director and senior producer, Charles Bergquist channels the chaos of a live production studio into something sounding like a radio program. Favorite topics include planetary sciences, chemistry, materials, and shiny things with blinking lights.