Facebook said it had ‘removed a small number of the pages shared with us for violating our policies’. Photograph: Avishek Das/SOPA Images/REX/Shutterstock

Facebook ‘still making money from anti-vax sites’


Social network allowing dangerous Covid theories to be shared, says Bureau of Investigative Journalism

Facebook is allowing users to profit from the spread of potentially dangerous false theories and misinformation about the pandemic and vaccines, including deploying money-raising tools on pages with content flagged up by the social media giant’s own factcheckers.

An investigation has found 430 pages – followed by 45 million people – using Facebook’s tools, including virtual “shops” and fan subscriptions, while spreading false information about Covid-19 or vaccinations.

The findings come despite a promise the platform made last year that no user or company should directly profit from false information about immunisation against Covid-19.

Facebook generally does not take a share of this income, though it occasionally takes a cut, and it benefits financially when users engage with content and stay on its services, where they are exposed to more ads.

The research, by the London-headquartered Bureau of Investigative Journalism, is likely to have uncovered only a tiny snapshot of the vast amount of monetised misinformation on Facebook related to the pandemic and vaccines.

A Facebook spokesman said the company was investigating the examples brought to its attention, and had “removed a small number of the pages shared with us for violating our policies”.

However, many of the posts identified as misinformation do not violate Facebook rules, the spokesman added, without providing any details.

“Our initial investigation shows a large number of the pages flagged had zero violations against our harmful misinformation policies, and we’d dispute the overall accuracy of the data being provided,” he said.

The pages identified included sites for comedians and religious leaders, social media personalities and traditional media reporters.

A large number are alternative health sites, covering subjects from nutrition to yoga and wellness. Only a minority focus squarely on the pandemic or anti-vaccine sentiment; the rest share such content with broader audiences.

Seven languages are represented – including German, Hebrew, Polish and Spanish – reaching readers around the world.

More than 260 of the pages the bureau identified have posted misinformation about vaccines. The remainder include false information on the pandemic, on vaccines more broadly, or a combination of the two. More than 20 of the pages carry Facebook’s blue tick, which signals authenticity.

For Facebook, offering ways to make money is probably a route to encouraging people to use its platform rather than its competitors’, according to Dr Claire Wardle, executive director of First Draft, a US-based non-profit organisation fighting online misinformation, which contributed to the bureau’s research.

However, Facebook can also profit from the popularity of brands and individuals who spread misinformation. It takes a cut of 5% to 30% on its Stars currency, used by fans to tip creators who stream live video.

From January last year, Facebook also took up to 30% of the fees paid by new fan subscribers, before reversing the charge in August.

The bureau found two pages using Stars: An0maly and Sid Roth’s It’s Supernatural, a religious site which has blamed the pandemic on abortion and has featured guests describing a dream in which God showed them the virus being created in a Chinese lab. Between them, the pages have reached more than 2.6 million people.

The site run by An0maly, real name AJ Feleski, who describes himself as a “news analyst & hip-hop artist”, is one of the most influential pages sharing misinformation to be identified by the investigation, with more than 1.5 million followers.

A video from last March, in which he questions whether the pandemic is “bio-terrorism”, is one of at least three posts on the page that Facebook’s factcheckers have flagged as containing false or partly false information.

Yet as of Saturday, a strap still appeared under the videos inviting viewers to pay to “Become a supporter” and “Support An0maly and enjoy special benefits”.

Facebook’s policies for creators using monetisation tools include rules against misinformation, especially medical misinformation.

In November, Facebook, along with Google and Twitter, agreed a joint statement with the UK government committing to “the principle that no user or company should directly profit from Covid-19 vaccine mis/disinformation. This removes an incentive for this type of content to be promoted, produced and be circulated.”

The bureau’s findings suggest Facebook has breached this agreement and failed to enforce its own policies.

A Facebook spokesman said: “Pages which repeatedly violate our community standards – including those which spread misinformation about Covid-19 and vaccines – are prohibited from monetising on our platform.

“We take aggressive steps to remove Covid misinformation that leads to imminent physical harm, including false information about approved vaccines.”

The company removed 12 million pieces of Covid misinformation between March and October, and placed factcheck warning labels on 167 million other pieces of content, he added.

Organisations including the UN, the World Health Organization and Unesco said in September that online misinformation “continues to undermine the global response and jeopardises measures to control the pandemic”.

Some of the pages identified in the investigation also directed their followers to more extreme content that has been largely scrubbed from social media.

Veganize, a Portuguese-language page based in Brazil with 129,000 followers, offers paid supporter subscriptions.

A “pinned post”, which is fixed at the top of the page even as new content is added, carries a link to a collection of files hosted on Google including “Plandemic”, a pair of conspiracy-laden, thoroughly discredited videos that briefly went viral last summer before social networks made strenuous efforts to remove them.

Groups spreading information flagged by factcheckers as false have also used Facebook to fundraise. The Informed Consent Action Network (Ican), a US non-profit, is one of the best-funded organisations in the US opposing vaccination.

Facebook and YouTube removed the pages of Highwire, an online show run by Ican founder Del Bigtree that made claims repeatedly rated false by factcheckers; Ican says it is suing the tech companies over the removals.

Yet despite removing the Highwire page, Facebook still allows Ican to solicit donations from its more than 44,000 followers on a page that has had at least two posts flagged by factcheckers. According to its page, Ican has raised almost £24,000 since February 2020.

Facebook must approve organisations before they can raise funds on the platform, and its rules explicitly cite vaccine misinformation as grounds for removing an organisation’s fundraising tools.

Wardle, from First Draft, believes the money-making systems Facebook offers could encourage people to spread misinformation.

“It is human nature. We know one of the motivations is financial,” she said.

“They have started to believe these things, but when you are in that circle, you also realise there is a way to make money, then you realise that the more you get hits the more money you are making. It’s more than the dopamine hit – it’s dopamine plus dollars.”
