
Facebook employees said its ability to detect vaccine hesitancy and misinformation is 'bad in English and basically nonexistent' in other languages, reports show

Facebook CEO Mark Zuckerberg (left) and former Facebook employee Frances Haugen. Matt McClain-Pool/Getty Images/Andrew Harnik/AP

  • Facebook is struggling to manage COVID-19 vaccine hesitancy and misinformation, internal documents suggest.
  • Facebook's "ability to detect vaccine hesitancy comments is bad in English and basically non-existent elsewhere," according to reports.
  • Facebook claims vaccine hesitancy on the platform declined by 50%, but experts remain skeptical. 

Facebook is struggling to manage COVID-19 vaccine hesitancy and misinformation despite previously touting the platform as a vital pandemic resource, internal documents suggest.

According to internal Facebook documents from February and March 2021, the tech company's internal systems failed to identify, remove, or prevent anti-vaccine comments from appearing on the site. The reports, which were obtained by CNN, state that Facebook's "ability to detect vaccine hesitancy comments is bad in English and basically non-existent elsewhere." 

"We have no idea about the scale of the [Covid-19 vaccine hesitancy] problem when it comes to comments," said a report posted to Facebook's internal site in February 2021, a year into the pandemic, according to CNN. "Our internal systems are not yet identifying, demoting and/or removing anti-vaccine comments often enough." 

The report is part of a set of leaked internal documents referred to as "The Facebook Papers," which were reviewed by 17 major US news organizations on Monday. The documents are part of the redacted files shared with Congress by the legal counsel of Facebook whistleblower Frances Haugen, who testified earlier this month after filing at least eight complaints with the Securities and Exchange Commission alleging that the company is hiding research about its shortcomings from investors and the public, according to Insider.


The "most active" Facebook groups in the US "have been the hundreds of anti-quarantine groups in addition to the standard set that have been most active for months/years (Trump 2020, Tucker Carlson, etc.)," a May 2020 post to Facebook's internal site said, according to CNN.

According to a Facebook spokesperson, the company has since added additional safety controls to manage groups on its platforms.

"There are no one-size-fits-all solutions to stopping the spread of misinformation, but we're committed to building new tools and policies that help make comments sections safer," the spokesperson told Insider.

In March 2020, at the onset of the pandemic, Facebook CEO Mark Zuckerberg posted that Facebook would work closely with the World Health Organization and local health authorities to stop "hoaxes and harmful misinformation" related to COVID-19. In his message, Zuckerberg committed Facebook to "removing false claims and conspiracy theories that have been flagged by leading global health organizations" as dangerous posts that violate the platform's community guidelines.


However, when data scientists at Facebook asked for resources to monitor COVID-19 misinformation on the platform, they were ignored by company leadership, according to a report from The New York Times. In July, Facebook rejected President Joe Biden's claim that the spread of misinformation surrounding the coronavirus pandemic on its platforms was "killing people."

"We will not be distracted by accusations which aren't supported by the facts," a Facebook spokesperson said in a statement to Insider in July. "The fact is that more than 2 billion people have viewed authoritative information about COVID-19 and vaccines on Facebook, which is more than any other place on the internet. More than 3.3 million Americans have also used our vaccine finder tool to find out where and how to get a vaccine."

In a rebuttal blog post in July titled "Moving Past the Finger Pointing," Facebook Vice President of Integrity Guy Rosen wrote that vaccine hesitancy on the platform had declined by 50%.

According to a 2020 report from the Center for Countering Digital Hate, COVID-19 misinformation communities online can easily take advantage of engagement-driven algorithms to spread their messages on social media. Fewer than 1 in 20 false posts were removed across Facebook, Twitter, Instagram, and YouTube, even after users had reported the content, Insider reported.
