Social Media Is Speciating Human Thought | Opinion

Like finches in the Galapagos, we're unwittingly evolving away from our neighbors on our own isolated islands of thought.

In the past, we disagreed about who to elect president. These days, we can't even agree on who we did elect. A year ago, a third of Americans thought the real winner of the 2020 presidential election was the incumbent, and a few of them stormed the nation's Capitol in protest of perceived widespread fraud.

Though a majority of Americans on both the Left and Right think our democracy is "in crisis and at risk of failing," we can't agree on why.

But there's a logical, apolitical explanation for our society's current discord: Personalized machine-learning algorithms are pushing us into our own isolated digital ecosystems to the point that our world views are becoming fundamentally incompatible, and so are we. See: Darwin.

When Charles Darwin visited the Galapagos Islands in 1835, he encountered small brown finches with fascinating quirks that varied from island to island. Some had short stout beaks for cracking seeds; others had long pointy beaks for snacking on insects; still others had beaks suited to producing particular vocalizations. The finches, Darwin theorized, had evolved into separate species from a common ancestor in response to their unique island environments.

In a sense, speciation is also happening on social media, but it's speciation of thought. Whether it's on Facebook, Twitter, TikTok, YouTube or any other major platform, our personalized feeds are separating us onto our own individual islands of ideas, impelling our division and imperiling our democracy.

Speciation happens in three basic stages: separation, adaptation, and division.

First, a barrier separates groups of organisms—like the Pacific Ocean for finches, the Grand Canyon for certain squirrels, or the land bridge between North and South America for shrimp species. Likewise, highly personalized social media and search algorithms silo our world views by surveilling our every online move and serving each of us content, unique to us, that they predict will keep us scrolling—typically content that provokes fear, outrage and loyalty to one's own group. These algorithms surround all of us—all countries, parties, classes, and identities—all the time, separating what we see and ultimately what we believe.

Second, separated groups adapt to their distinct environments. Darwin's finches developed unique traits that allowed them to thrive on their particular islands. In social media's case, the algorithms first adapt to us, learning what we like and what keeps us engaged. This is how machine learning operates: with every post, every emoji reaction and every moment we spend staring at a piece of content, the algorithm learns more about us and refines its recommendations to drive still greater engagement. Our thoughts in turn adapt to our increasingly extreme and individualized information landscape, moving in the direction we're pushed. As Cathy O'Neil, author of Weapons of Math Destruction, says, "Algorithms don't predict the future, they cause the future."
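For readers who want to see that feedback loop in concrete terms, here is a minimal, purely hypothetical sketch in Python. The class name, signals and weights are illustrative assumptions, not any platform's actual code; it simply shows how rewarding whatever holds our attention naturally narrows what we're shown next.

```python
# A minimal, purely hypothetical sketch of an engagement-driven feed;
# the signals and weights below are illustrative assumptions,
# not any platform's actual code.
from collections import defaultdict

class EngagementFeed:
    """Ranks posts higher on topics this user has engaged with before."""

    def __init__(self):
        # Learned affinity score per topic, updated from observed behavior.
        self.topic_affinity = defaultdict(float)

    def record_interaction(self, topic, seconds_viewed, reacted):
        # Every lingering moment and reaction nudges the model toward more of the same.
        self.topic_affinity[topic] += 0.1 * seconds_viewed + (1.0 if reacted else 0.0)

    def rank(self, posts):
        # Posts on already-engaging topics float to the top; everything else sinks.
        return sorted(posts, key=lambda p: self.topic_affinity[p["topic"]], reverse=True)

feed = EngagementFeed()
feed.record_interaction("election fraud", seconds_viewed=45, reacted=True)
feed.record_interaction("gardening", seconds_viewed=2, reacted=False)

posts = [{"title": "Tomatoes 101", "topic": "gardening"},
         {"title": "PROOF the vote was rigged?!", "topic": "election fraud"}]
print([p["title"] for p in feed.rank(posts)])
# -> the outrage-adjacent post ranks first, and the loop repeats.
```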

Finally, the separated groups stop reproducing with one another, instead multiplying amongst themselves. Darwin's finches reinforced their differences through sexual selection. People radicalized by personalized algorithms likewise become less interested in, tolerant of, or even aware of others' views. As we hole up on our own information islands, alternative perspectives come to feel unreal and uncomfortable to us. A recent Pew Research Center poll found that six in 10 Americans find it stressful to talk politics with people who disagree with them. Our hesitancy to have the conversations that could bridge our divides only perpetuates them. Our world views thus drift toward incompatibility.

Through this process, the Galapagos finches evolved into multiple species, each uniquely fit for its environment and able to live peacefully apart. Humans on social media, however, are transformed into infinite "species" that are isolated from others' views and incapable of getting along in the society we all must share. And because we only see the narrow perspective our algorithms show us, we often don't realize just how far we've evolved away from our neighbors.

January 6, 2021, was an inevitable outcome of our diverging world views. That week, Trump voters and Biden voters saw just 5 percent of the same information on their Facebook feeds. It's no surprise that 60 percent of Americans feel pessimistic about whether America can overcome its divisions to solve its biggest problems. Our personalized feeds have separated and evolved us to the point where we can't even agree on what those problems are: Was January 6 an illegal and reckless attack on democracy, or the last stand of our proudest patriots protesting a stolen election? Is climate change the most dire threat humanity faces—a comet hurtling toward total destruction—or a comparatively minor concern that can be de-prioritized? Are mask and vaccine mandates essential public health measures, or paranoid and tyrannical violations of our personal liberties? We are divided to our cores.

And we're only a decade into this technological experiment. What will our world look like in another 10 years? If these personalized algorithms continue unchecked, the speciation of thought will only get worse. The machine-learning algorithms get better every day, and our online information islands will continue to drift further and further apart. This technology is accelerating our differences; it is fundamentally at odds with a healthy, sustainable democracy, and it is undermining our ability to come together to understand and solve our world's other big problems.

Darwin could never have developed his theory of evolution without rigorous research and the ability to observe, in the wild, the conditions shaping these species. But when it comes to the opaque technologies ruling our thoughts, behaviors and discourse, we're still in the dark. If social media platforms believe in their mission to connect the world, then it's time they open up these black-box algorithms so our legislators and researchers can hold them accountable. Just as we protect our environmental ecosystems through measures that promote biodiversity and curb pollution, we need regulation of our information ecosystem before it tears us apart beyond repair.

Jeff Orlowski is the director of The Social Dilemma and the founder of Exposure Labs, a film and impact production studio. Newsweek and The Social Dilemma have partnered to create The Social Dilemma Debate Project, an initiative that serves to combat the polarization, hate and gridlock that define today's culture and politics with a new generation of strong debaters. Join us.

The views in this article are the writer's own.
