The proliferation of politically biased, fake news stories on Facebook has become widespread. Photograph: Money Sharma/AFP/Getty Images

Facebook’s failure: did fake news and polarized politics get Trump elected?


The company is being accused of abdicating its responsibility to clamp down on fake news stories and counter the echo chamber that defined this election

“If I were to run, I’d run as a Republican. They are the dumbest group of voters in the country. They believe anything on Fox News. I could lie and they’d still eat it up. I bet my numbers would be terrific.”

Many Guardian readers will have seen this quote, attributed to a 1998 interview with Donald Trump in People magazine, in their Facebook news feed.

It’s a great quote, but he never said it.

It typifies the kind of fake news and misinformation that has plagued the 2016 election on an unprecedented scale. In the wake of the surprise election of Donald Trump as president of the United States, pressure is growing on Facebook to not only tackle the problem but also to find ways to encourage healthier discourse between people with different political views.

Rather than connecting people – as Facebook’s euphoric mission statement claims – the bitter polarization of the social network over the last eighteen months suggests Facebook is actually doing more to divide the world.

“People have unfriended friends and family members because the style of discourse is so harsh,” said Claire Wardle, research director at the Tow Center for Digital Journalism. “Facebook stumbled into the news business without systems, editorial frameworks and editorial guidelines, and now it’s trying to course-correct.”

Facebook will need to change its business model if it does want to address these editorial challenges. Currently, the truth of a piece of content is less important than whether it is shared, liked and monetized. These “engagement” metrics distort the media landscape, allowing clickbait, hyperbole and misinformation to proliferate. And on Facebook’s voracious news feed, the emphasis is on the quantity of posts, not spending time on powerful, authoritative, well-researched journalism.

The more we click, like and share stuff that resonates with our own world views the more Facebook feeds us with similar posts. This has progressively divided the political narrative into two distinct filter bubbles – one for conservatives and one for liberals (a blue feed and a red feed), pulling further and further apart in the run-up to election day.
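Purely as an illustration of that feedback loop, here is a minimal Python sketch of an engagement-ranked feed. The scoring function and every number in it are invented, not Facebook's actual system: it simply assumes stories closer to a reader's own lean earn more clicks, likes and shares, and shows that two readers with opposite leans end up with almost no stories in common.

```python
import random

# Toy model only: stories have a political lean from -1.0 (left) to +1.0 (right),
# drawn from a balanced pool.
random.seed(1)
stories = [{"id": i, "lean": random.uniform(-1, 1)} for i in range(200)]

def predicted_engagement(story, user_lean):
    # Hypothetical scoring: accuracy plays no part, only expected engagement,
    # which is assumed to rise the closer a story sits to the reader's world view.
    return 1.0 - abs(story["lean"] - user_lean)

def rank_feed(user_lean, top_k=20):
    # Rank the whole pool by predicted engagement and keep the top stories.
    return sorted(stories, key=lambda s: predicted_engagement(s, user_lean), reverse=True)[:top_k]

blue_feed = rank_feed(user_lean=-0.8)   # a left-leaning reader
red_feed = rank_feed(user_lean=+0.8)    # a right-leaning reader

overlap = {s["id"] for s in blue_feed} & {s["id"] for s in red_feed}
print("stories shown to both readers:", len(overlap))  # typically zero
print("average lean of the blue feed:", round(sum(s["lean"] for s in blue_feed) / len(blue_feed), 2))
print("average lean of the red feed: ", round(sum(s["lean"] for s in red_feed) / len(red_feed), 2))
```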

‘Dust cloud of nonsense’

These information bubbles didn’t burst on 8 November, but the election result has highlighted how mainstream media and polling systems underestimated the power of alt-right news sources and smaller conservative sites that largely rely on Facebook to reach an audience. The Pew Research Center found that 44% of Americans get their news from Facebook.

Yet fake news is not a uniquely Republican problem. An analysis by BuzzFeed found that 38% of posts shared from three large rightwing politics pages on Facebook included “false or misleading information” and that three large leftwing pages did the same 19% of the time.

What is a uniquely Republican problem is the validation given to fake news by the now president-elect. Trump has routinely repeated false news stories and whipped up conspiracy theories – whether that’s questioning Obama’s heritage, calling climate change a hoax or questioning “crooked” Hillary Clinton’s health – during high-profile rallies, while urging his followers not to trust corrupt traditional media.

The conspiracy theories are amplified by a network of highly partisan media outlets with questionable editorial policies, including a website called the Denver Guardian peddling stories about Clinton murdering people and a cluster of pro-Trump sites founded by teenagers in Veles, Macedonia, motivated only by the advertising dollars they can accrue if enough people click on their links.

The situation is so dire that this week President Obama spoke about the “crazy conspiracy theorizing” that spreads on Facebook, creating a “dust cloud of nonsense”.

Newspapers used to be the gatekeepers of news. Now 44% of Americans get their news from Facebook, according to one study. Photograph: Alba Vigaray/EPA

“There is a cottage industry of websites that just fabricate fake news designed to make one group or another group particularly riled up,” said Fil Menczer, a professor at Indiana University who studies the spread of misinformation. “If you like Donald Trump and hate Hillary Clinton it’s easy for you to believe a fake piece of news about some terrible thing Hillary has done. These fake news websites often generate the same news just changing the name to get people on either side to be outraged.”

Menczer and his Indiana University colleagues hope to better understand how fake news, and the pieces that debunk it, spread through social media, and plan to launch a range of analytical, non-profit tools later this year.

Looking for what we want to hear

The misinformation being spread doesn’t always involve outlandish conspiracy theories. There’s a long tail of insidious half truths and misleading interpretations that fall squarely in the grey area, particularly when dealing with complex issues like immigration, climate change or the economy.

“Not everything is true or false, and in the gaps between what we can check and what is missing from our control we can create a narrative,” said Italian computer scientist Walter Quattrociocchi, who has studied the spread of false information. “Trump won at this. He was able to gather all the distrust in institutional power by providing an option for people looking for a change.”

“These things are very hard to detect automatically if they are true or not,” said Menczer. “Even professional fact-checkers can’t keep up.”

Donald Trump on election night. Facebook’s algorithm means people are more likely to see stories with political viewpoints that match their own. Photograph: Mike Segar/Reuters

According to Menczer’s research there’s a lag of around 13 hours between the publication of a false report and the subsequent debunking. That’s enough time for a story to be read by hundreds of thousands if not millions of people. Within Facebook’s digital echo chamber, misinformation that aligns with our beliefs spreads like wildfire, thanks to confirmation bias.
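As a back-of-the-envelope illustration of why that lag matters, the toy calculation below assumes a hypothetical seed audience and doubling time for shares while a story is "hot"; only the 13-hour figure comes from Menczer's research, everything else is an assumption.

```python
# Back-of-the-envelope only: the seed audience and doubling time are assumptions,
# not measured figures; the 13-hour lag is the figure from Menczer's research.
initial_readers = 1_000        # assumed audience in the first hour after publication
doubling_time_hours = 2.0      # assumed doubling time for shares while the story is "hot"
debunk_lag_hours = 13          # average lag before a debunking appears

readers_before_debunk = initial_readers * 2 ** (debunk_lag_hours / doubling_time_hours)
print(f"readers reached before the correction exists: ~{readers_before_debunk:,.0f}")
# roughly 90,000 with these toy numbers; a faster doubling time pushes the figure into the millions.
```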

“People are more prone to accept false information and ignore dissenting information,” said Quattrociocchi. “We are just looking for what we want to hear.”

It’s a quirk of human psychology that the UK Independence party (Ukip) toyed with during the campaign for Britain to leave the EU. Arron Banks, Ukip’s largest donor, told the Guardian that facts weren’t necessary for winning. “It was taking an American-style media approach. What they said early on was ‘facts don’t work’ and that’s it. You have got to connect with people emotionally. It’s the Trump success.”

While it’s human nature to believe what we want to hear, Facebook’s algorithms reinforce political polarization. “You are being manipulated by the system [for falling for the fake news] and you become the perpetrator because you share it to your friends who trust you and so the outbreak continues,” said Menczer.

It’s a perfect feedback loop. So how do you break it? Menczer says the solution is to create a filter. Before social media, the filter was provided by media companies, who acted as gatekeepers to the news and had staff trained in fact-checking and verifying information. In an age of budget cuts in traditional media, and the rise of clickbait and race-to-the-bottom journalism, standards have slipped across the board.

“Now the filter is us. But that’s not our job so we’re not good at it. Then the Facebook algorithm leverages that and amplifies the effect,” said Menczer.

A story about Hillary Clinton murdering people was among the conspiracy theories spread on Facebook during the election. Photograph: Justin Sullivan/Getty Images

And so we come back to the algorithm.

Despite continually insisting that it’s a neutral technology platform and not a media company, Facebook is all too aware of the influence it has to drive footfall to the polling stations.

Around 340,000 extra people turned out to vote in the 2010 US congressional elections because of a single election-day Facebook message, according to a study published in Nature.

In a separate study the social networking site worked out how to make people feel happier or sadder by manipulating the information posted on 689,000 users’ news feeds. It found it could make people feel more positive or negative through a process of “emotional contagion”.

So what should Facebook do? It’s certainly not going to be easy. It has tried – and failed – to get a grip on the problem before, launching a tool to let users report false information in January 2015. (That ultimately failed because it relied on users, who turned out not to be very good at spotting fake news and were also prone to report a story as “fake” simply because they didn’t agree with it.) In September 2016, the company joined a coalition, along with Twitter, to improve the quality of reporting on social media and cut down on fake news. We have yet to see the fruits of this alliance.

Human v automated editors

In the interim, Facebook found itself in trouble over the team of humans who were curating its trending news section. According to a former journalist who worked on the project, the team was routinely told to suppress news stories of interest to conservative readers. The company was widely criticized for playing the role of censor and being biased against Republicans.

That led Facebook to fire the editors and let the algorithm decide what’s trending. Since then fake news has repeatedly found its way into the highly influential trending list.

“Instead of hiring more editors to check the facts, they got rid of the editors and now they are even more likely to spread misinformation,” said Menczer. “They don’t see themselves as a media company and they run the risk of being told they are picking sides. They are in a tough spot, but they are also making a lot of money.”

Facebook’s continued rejection of the idea that it is a media company doesn’t sit well with some critics. “It sounds like bullshit,” said high-profile investor Dave McClure, speaking from the Web Summit in Lisbon a few hours after an expletive-filled on-stage rant about Trump. “It’s clearly a source of news and information for billions of people. If that’s not a media organization then I don’t know what is.”

Wow, @davemcclure has a MELTDOWN on #WebSummit stage over Donald Trump election pic.twitter.com/aRJmFpYyQA

— Adrian Weckler (@adrianweckler) November 9, 2016

He added that technology entrepreneurs have a responsibility to enable a “more well-rounded experience” for their audiences. “A lot of them are only thinking about how to make money. Maybe we need to mix in having ethics and principles and caring about the fact that people have a reasonable and rational experience of the information they process. Although that sounds a little too utopian.”

One solution could be to try to reduce the effect of filter bubbles by showing users a wider variety of opinions than their own. Even if people have a tendency to reject those opinions, at least they’ll be exposed to a diversity of views.
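One hedged sketch of what that could look like, reusing the invented story-and-lean setup from the earlier snippet: reserve a fixed share of feed slots for stories from the other side of a reader's own lean, rather than ranking purely on predicted engagement. The quota and all other parameters are invented for illustration.

```python
import random

# Toy sketch again: widen the feed by setting aside a share of slots for stories whose
# lean sits on the opposite side of the reader's own. All parameters are invented.
random.seed(2)
stories = [{"id": i, "lean": random.uniform(-1, 1)} for i in range(200)]

def diversified_feed(user_lean, top_k=20, opposing_share=0.3):
    # Stories closest to the reader's own lean...
    agreeable = sorted(stories, key=lambda s: abs(s["lean"] - user_lean))
    # ...and stories closest to the mirror-image lean on the other side.
    opposing = sorted(stories, key=lambda s: abs(s["lean"] + user_lean))
    n_opposing = int(top_k * opposing_share)
    return agreeable[: top_k - n_opposing] + opposing[:n_opposing]

feed = diversified_feed(user_lean=-0.8)
share_opposing = sum(1 for s in feed if s["lean"] > 0) / len(feed)
print("share of right-leaning stories shown to a left-leaning reader:", share_opposing)
```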

Wardle suggests that to tackle fake news, Facebook could introduce a mechanism to allow fact-checking organisations to report false stories to Facebook so they don’t continually circulate. “Of course, people will shout censorship, so maybe Facebook could choose to change the way it displays certain stories instead,” she said.

This is problematic because Facebook would have to manipulate the algorithm to make it less likely you would see something from a site categorized as disreputable. This would potentially involve discounting content your friends were interested in. “Then we would not like the platform as much because we like seeing stuff our friends are liking and sharing,” said Menczer.
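A hedged sketch of the kind of mechanism Wardle and Menczer are describing, and not any real Facebook system: stories flagged by fact-checkers, or published by domains rated as disreputable, are down-weighted rather than removed, and the final score still multiplies in friend engagement, which is exactly the trade-off Menczer points to. Every name, rating and number below is invented.

```python
# Hypothetical trust ratings and fact-check flags; every name and number is invented.
DOMAIN_TRUST = {"denverguardian-example.com": 0.1, "established-newspaper-example.com": 0.9}
FACT_CHECK_FLAGS = {"clinton-murder-story"}   # story ids reported as false by fact-checkers

def rank_score(story, friend_engagement, flag_penalty=0.2, default_trust=0.5):
    # Multiply friend engagement by a credibility weight; flagged stories are
    # demoted rather than deleted, to blunt the "censorship" objection Wardle raises.
    trust = DOMAIN_TRUST.get(story["domain"], default_trust)
    if story["id"] in FACT_CHECK_FLAGS:
        trust *= flag_penalty
    return friend_engagement * trust

viral_fake = {"id": "clinton-murder-story", "domain": "denverguardian-example.com"}
sober_report = {"id": "budget-analysis", "domain": "established-newspaper-example.com"}

# The fake story has far more friend engagement, but the credibility weight flips the order,
# at the cost of discounting content friends were clearly interested in.
print(rank_score(viral_fake, friend_engagement=500))    # 10.0
print(rank_score(sober_report, friend_engagement=80))   # 72.0
```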

All of these issues point towards the inevitability of Facebook acknowledging that it’s no longer just a technology company, but a media company – the media company.

In Mark Zuckerberg’s first Facebook update post-election, he talked about the need for everyone to work together. “We are all blessed to have the ability to make the world better, and we have the responsibility to do it. Let’s go work even harder,” he said.

Wardle is skeptical. “That’s all well and good – but start by changing your platform.”
