Apple almost pulled Facebook and Instagram from iPhones after finding human trafficking was organised on the apps
Internal documents, reportedly seen by the Associated Press, show that Facebook was “under-enforcing on confirmed abusive activity”.
Facebook’s investigation into its platform found that “domestic workers frequently complained to their recruitment agencies of being locked in their homes, starved, forced to extend their contracts indefinitely, unpaid, and repeatedly sold to other employers without their consent” but that, “in response, agencies commonly told them to be more agreeable.”
It also found that recruitment agencies dismissed “more serious crimes, such as physical or sexual assault, rather than helping domestic workers.”
Apple did not respond to a request for comment from The Independent before the time of publication, but an internal analysis from Facebook stated that the removal of its apps “from Apple platforms would have … potentially severe consequences to the business, including depriving millions of users of access.”
Facebook found that three-quarters of all problematic posts happened on Instagram, while links to maid-selling sites were predominantly on Facebook. Over 60 per cent of the material was from Saudi Arabia, and a quarter came from Egypt, according to analysis conducted by Facebook in 2019.
It eventually disabled over 1,000 accounts but, worryingly, only two per cent of the problematic content was reported by users. Facebook has been criticised in the past for not being proactive enough with its moderation policies and enforcement, instead relying on its user base to highlight issues. Facebook acknowledged in its report that “domestic servitude content remained on the platform.”
The company had apparently been aware of the problem since 2018. It deemed the practice “domestic servitude” – defined as a “form of trafficking of people for the purpose of working inside private homes through the use of force, fraud, coercion or deception” – and the issue was pervasive enough that Facebook had its own acronym for it (‘HEx’, or ‘human exploitation’).
Facebook’s documents reportedly suggested a pilot program that would start this year to target Filipinas with pop-up messages and adverts warning them about the dangers of working internationally. It is unclear whether the program ever began.
Facebook did not provide comment to The Independent before the time of publication but told the Associated Press: “We prohibit human exploitation in no uncertain terms. We’ve been combating human trafficking on our platform for many years and our goal remains to prevent anyone who seeks to exploit others from having a home on our platform.”
This news comes amid a flurry of scandals being reported on as a result of whistleblower Frances Haugen providing a collection of documents, known colloquially as the ‘Facebook Files’, to the Wall Street Journal.
Haugen recently gave evidence to a parliamentary committee in which she accused Facebook of putting profits ahead of its users and lacking moderator support in numerous countries – risking riots similar to those seen in the United States on 6 January.
Compared to countries like Brazil, India and the United States – which were in a high-priority “tier zero” and were monitored continuously – Myanmar, Pakistan, and Ethiopia lacked misinformation classifiers despite being designated at the highest risk.