Why Did Instagram Remove Summer Walker’s Post Criticizing the KKK?

White supremacist groups are thriving on social media, but platforms like Instagram are silencing those who speak out against racist hate.
By Kristin Corry
Queens, US
Summer Walker stares directly into the camera as a Facebook-style hand icon covers her mouth, silencing her.
Image by Cathryn Virginia | Photo by Amy Sussman via Getty Images

Summer Walker’s Instagram page tends to stir up as much conversation as her rousing R&B. In the past year, she’s received so much backlash for posting about her opposition to vaccines and sharing COVID-19 conspiracy theories with her nearly 5 million followers that she’s proposed starting her own social media app altogether. But Walker generated controversy of a different kind last week after Instagram removed a meme she reposted following the indictments of rappers like Young Thug and Gunna on racketeering charges. The 88-page indictment filed on May 9 accuses Thug of more than 30 offenses since 2013, including terroristic threats and drug crimes, while Gunna faces drug and firearm charges. In response to the arrests, Walker’s seven-word post asked in plain text: “Why [the] KKK never got a RICO?” 

Created in 1970, the Racketeer Influenced and Corrupt Organizations (RICO) Act was once used to target groups like the mafia, bringing down organizations like the Genovese and Gambino crime families by detailing years of conspiracies under the umbrella of their criminal enterprises. But today, these laws often focus on street gangs turned rap crews. This modern usage has racialized the racketeering law, effectively criminalizing rap music by turning lyrics into makeshift confessions. Thug’s lyrics are being used as evidence in his indictment, including “Slatt” from Young Stoner Life’s 2020 compilation, Slime Language 2. “I killed his man in front of his momma,” Thug raps. 

Walker’s post challenges the legal ramifications of ignoring crimes committed by a hate group like the Ku Klux Klan, an organization that has traditionally used violence to maintain racial order. The post’s removal silences those brave enough to question white supremacy, its history, and ultimately, how it continues to fester in violent hate crimes across the country. 

According to a screenshot Walker posted of a message she received, Instagram said her post was removed for breaching its Community Guidelines, which the company said were created to “encourage people to express themselves in a way that’s respectful to everyone.” The question she posed violated its guidelines regarding “violence and dangerous organization,” the company said, but Instagram’s response to Walker did not explain how the post contravened those policies. 

In an Instagram post published after the initial meme was deleted, Walker pushed back: If videos of police brutality can be disseminated widely and without warning, then who do these tech giants truly consider a “protected” class? “That question went against @Instagram guidelines but I’ve literally seen black people beaten/abused in hella different ways by cops but that type of racist activity seems to have no issue living rent free on this platform,” she wrote in an Instagram caption. “I’m hella confused.”

At the time of publication, Instagram’s parent company, Meta, had not responded to questions regarding the post’s removal. Instagram’s Community Guidelines don’t provide much clarity on the issue, either: The company says it doesn’t condone the “support or praise of terrorism, organized crime, or hate groups,” and that credible threats will be removed from the platform. 

The post is yet another instance of Meta’s tortured history regarding content moderation and how it regulates content by or about far-right groups like the KKK on the platform. According to a 2022 report by the Anti-Defamation League (ADL), a huge part of Meta’s moderation problem is that extremist speech and mainstream speech are so similar that algorithms cannot accurately determine what hate speech is. Its research found that right-wing groups often use “coded language to evade detection by content-moderation algorithms,” while speech that is not harmful may be flagged because moderation teams do not understand the nuance and context behind it.

A key example of this issue was a 2017 controversy surrounding Facebook, another Meta social media property, informally banning the word “dyke,” which had been flagged by moderators as hate speech. Although the word may be used in a discriminatory manner, lesbian groups like the national motorcycle club Dykes on Bikes and the annual Dyke March parade have reclaimed the epithet. For many, the term—when used by the community to refer to itself—is now one of empowerment.

Meta’s shoddy moderation has caused many to question its motives. In 2019, the company hired representatives from The Daily Caller, a right-wing news site co-founded by conservative Fox News commentator Tucker Carlson, to help fact-check articles ahead of the 2020 election. The publication’s integrity has been under fire for years: Leading up to the 2016 presidential race, The Daily Caller was a leading source of anti-Muslim and anti–Hillary Clinton content, according to a Harvard report on misinformation. When asked about the partnership, Meta spokesperson Lauren Svensson cited the need for a “diverse set of fact-checking partners.”

Meta’s moderation woes and unclear guidelines have allowed Instagram to become an incubator for pages dedicated to uplifting white supremacy. These networks of pages, also known as “racistgram,” go against the very Community Guidelines that Meta claims make its platforms a safe space. At least seven pages in this network remain on Instagram without any repercussions from Meta, according to a new report from the progressive watchdog group Media Matters for America.  

Walker’s criticism of Instagram couldn’t be more timely, as violent white supremacy continues to proliferate both on- and offline. On Saturday, the author of a 180-page white nationalist manifesto drove three hours to a predominantly Black neighborhood and killed 10 Black people at a supermarket. Much of the shooter’s manifesto invokes a fear of “the great replacement,” the idea that the white population, and thus its racial dominance, is on the decline. Unfortunately, the shooting was not an isolated incident: According to the ADL, more than 240 people have been killed by white supremacist terrorists since 2012. Homicides motivated by white supremacy accounted for three out of four right-wing extremist-related killings over that period.

Summer Walker was right to question when this country will truly reckon with its own culture of hate. Meta’s removal of her post only further underscores the blind spots of the company, and of the tech world writ large, when it comes to allowing hate speech of all kinds to thrive online. The company has a white supremacy problem, and Black people, both in digital spaces and in real life, are facing the consequences. 

Kristin Corry is a senior staff writer at VICE.