
Want to Avoid Conspiracy Theories and Misinformation on Facebook? Good Luck

Facebook allegedly knew about the dangers of misinformation well before the 2020 election.

By Nathaniel Mott
October 23, 2021
(Illustration: Malte Mueller/Getty Images)

Facebook has been accused of knowing that its platform enabled misinformation to reach a significant number of people, especially in the US, long before the 2020 presidential election and the Jan. 6 attack on the US Capitol led to increased scrutiny of the company's policies.

NBC News reports that a Facebook employee created an experimental account called Carol Smith in 2019 to research the platform's effect on users. Smith described herself "as a politically conservative mother from Wilmington, North Carolina" with "an interest in politics, parenting, and Christianity" and followed groups related to Fox News and former President Donald Trump. (Other accounts were set up with different characteristics, but the report centers on Smith.)

It took just two days for Facebook to recommend groups related to the QAnon conspiracy theory to the fake account. "Smith didn’t follow the recommended QAnon groups," NBC News says, "but whatever algorithm Facebook was using to determine how she should engage with the platform pushed ahead just the same. Within one week, Smith’s feed was full of groups and pages that had violated Facebook’s own rules, including those against hate speech and disinformation."

The New York Times reports that numerous Facebook employees warned about the proliferation of misinformation on the social network in the days following the 2020 presidential election. One employee said two days after the election that comments on popular posts included "combustible election misinformation," the Times says, while another warned four days later that 10% of the political material viewed by users in the US claimed the vote was fraudulent.

NBC News and the Times say their reporting is based at least partly on internal documents shared with journalists and the US Securities and Exchange Commission by Frances Haugen, a former Facebook product manager turned whistleblower who testified about the company's response to misinformation before the Senate Commerce Committee in early October. (Haugen's documents also provided the basis for earlier reporting from The Wall Street Journal.)

But not all of the research was shared by Haugen, the Times says, and The Washington Post reports that on Oct. 13 another whistleblower told the SEC that Facebook "prizes growth and profits over combating hate speech, misinformation, and other threats to the public." These disclosures suggest Haugen wasn't the only Facebook employee disappointed by the company's response, or lack thereof, to misinformation before and after the presidential election.

Facebook has repeatedly defended itself against the criticism resulting from the documents shared by Haugen, her testimony, and the latest reports from the Times and the Post. That defense continued on Oct. 22 with a blog post attributed to Guy Rosen, its VP of Integrity, titled "Our Comprehensive Approach to Protecting the US 2020 Elections Through Inauguration Day." That blog post begins with Rosen saying Facebook started planning for this election years ago:

Long before the US election period began last year, we expected that the 2020 election would be one of the most contentious in history — and that was before we even knew it would be conducted in the midst of a pandemic. We worked since 2016 to invest in people, technologies, policies and processes to ensure that we were ready, and began our planning for the 2020 election itself two years in advance. We built our strategy to run all the way through Inauguration Day in 2021, knowing that there was a high likelihood that the election results would be contested. So we planned specifically for that scenario. This election planning was built upon all of our other integrity work and investments we’ve made since 2016.

Rosen also says that "to blame what happened on January 6 on how we implemented just one item of the above list"—namely the "break the glass" measures used to limit the spread of misinformation by limiting the distribution of live videos, for example, or automatically deleting content that would normally be reviewed by content moderators first—"is absurd."

"We are a significant social media platform so it’s only natural for content about major events like that to show up on Facebook," Rosen says. "But responsibility for the insurrection itself falls squarely on the insurrectionists who broke the law and those who incited them."

We're unaware of legitimate allegations that Facebook did absolutely nothing to stop the spread of misinformation on its platform, however, or that it's somehow more responsible for the Jan. 6 insurrection than the people who participated in it. The criticism of the company has instead focused on the assertions that its responses have proven inadequate, that its platform exposed millions of Americans to inflammatory content, and that it has prioritized money over morals.

These criticisms are backed by increasing amounts of internal research provided to media outlets, Haugen's testimony before the Senate Commerce Committee, and individual outlets' anonymous sources familiar with Facebook's operations. The company has complained that its research hasn't been presented fairly, or that its current and former employees aren't being truthful, but those are some of the only available sources of information about its workings.

Its inability to facilitate outside research—and its efforts to quash it outright—have seen to that.

