The Internet Should Be a Communist Utopia. It’s Not, Because of the Filter Bubble

When you talk about the filter bubble, you’re talking about something quite specific: the heavily curated ecosystem of the internet, a set of rules that filters all of the world’s information and organizes it into what algorithms expect you to want to see, whether that’s the algorithm suggesting your next video on YouTube or the one showing you an article on Facebook.

At first, it can seem like a user-friendly way to prioritize and curate the internet according to a set of personalized boundaries. You only see relevant content, and it brings order to the sprawling, chaotic internet.

In an era where fake news and propaganda spread virally across Facebook, the world’s biggest news aggregation platform, the filter bubble has gone from a user-friendly convenience to a threat to how we perceive the world around us. It skews our perception of reality, which in turn affects the way we behave; it’s part of the reason behind Trump’s election.

Facebook founder Mark Zuckerberg addressed the problem in a manifesto published in February 2017:

“Giving everyone a voice has historically been a very positive force for public discourse because it increases the diversity of ideas shared. But the past year has also shown it may fragment our shared sense of reality.”

Why?

Sensationalism and Polarization

While Zuckerberg acknowledges the existence of the filter bubble in the manifesto, he argues that “sensationalism and polarization” are even more to blame.

On sensationalism: research shows that content with emotional triggers like fear, disgust and anger is more persuasive. When evaluating what to believe, readers react more to their emotions than to the facts. Sensationalized content plays on that, and it’s more likely to get exposure on social media.

The easiest way to sensationalize a story? Fabricate it from scratch.


A fake news story that was shared over 2,000,000 times.

On polarization: polarization is the division of people into two contrasting groups by opinion and belief, the ‘us and them’ attitude responsible for some of history’s worst atrocities. It’s also a product of the filter bubble and a factor behind the spread of fake news. In his manifesto, Zuckerberg argues that showing content that contrasts with the reader’s opinion makes the problem of polarization worse:

“Research shows that some of the most obvious ideas, like showing people an article from the opposite perspective, actually deepen polarization by framing other perspectives as foreign.”

Recode editor Kara Swisher firmly believes Facebook needs to take responsibility for becoming an echo chamber of misinformation, and that it can’t get away with it by calling itself a ‘platform’. She says Facebook is a media company and should act like one: fact-checking, and being careful not to spread lies in the first place.

“When you gather people’s attention, and sell that attention to advertisers, guess what? You’re a media company.”

Facebook is now the number one news source in the world, and the days of people going to a specific set of publications for their news are coming to an end. It may have originally started out as a platform, but when it acts as an editor and curates stories on such a massive scale, there’s no denying it is a media company.

The Media Has Always Been Limited in Scope

Traditionally, the media has been segmented along political lines. That hasn’t changed just because there are more sources available.

News outlets by political leaning

Before social media, in the golden age of print and broadcast journalism, there were fewer sources but the same level of political bias. Even with just three biased news sources, people still read the one that mirrors their own values and perception of the world, further perpetuating their existing views.

The filter bubble is nothing new. Instead, it represents the organization of chaos and the collapse of media diversity. With sources ever more diverse and an algorithm that keeps viewpoints insular, the system eventually implodes and everything goes back to the way it always was: people read what they already agree with, and the cycle continues.

The Original Philosophy of the Internet

The internet has been through many iterations before finally morphing into what we understand it as today. First, it was ARPANET: a decentralized communications network designed to survive nuclear war, because no single blast could knock the whole thing down. Then, it was used by academics as an upgraded version of a physical library: to store and share the world’s information. Organizations like Archive.org and Wikipedia have actively been trying to improve and preserve the internet’s information with the original goal of these academics in mind.

But, of course, with digital real estate commercialized in exactly the same way newspapers sell ad space, the internet could never be truly flat in its organization of information. All information is available, but some is more available than others.

Mark Fisher, writing in his book Capitalist Realism: Is There No Alternative?, explains this inevitable state in relation to the widespread acceptance of neoliberalism as the only political system that works:

“emancipatory politics must always destroy the appearance of a ‘natural order’, must reveal what is presented as necessary and inevitable to be a mere contingency, just as it must make what was previously deemed to be impossible seem attainable.”

Facebook’s capitalist drive to keep users on its site for as long as possible manifests itself in showing users only the content they will agree with. It now seems impossible to imagine a future where information is organized in any other way.

The filter bubble: Is there no alternative?

The Filter Bubble Was Predicted in the 1980s

The filter bubble is just a pattern, and patterns are everywhere. In nature, they’re apparent: grass grows in a certain way, yet its precise spread can’t be predicted. It’s a pattern on one level, and chaos on another.

Philosopher Jean Baudrillard writes about these patterns in his essay AIDS: Virulence, or Prophylaxis. The gist of it is that every complex system goes through a series of changes: disorganization, then organization.


In some cases, like political infrastructures, it starts out as an attempt to organize the disorganized (rudimentary tribal governance), then turns into hard organization with written laws and policies. Baudrillard theorizes that all of these attempts at organization will always create negative virulence, whether that’s the mobilization of terrorism, the spread of AIDS amongst organized, connected human cells, or the hyper-regimented organization of information through content discovery platforms and search engines.

In effect, Baudrillard predicted the filter bubble by noticing the common patterns throughout the history of society: the internet, which started as a disorganized utopian free-for-all, ends up as a tightly controlled corporate machine where the most profitable content is shown to the audiences most likely to buy. Or, in the case of the recent election, the most sensational content is shown to those most likely to be swayed by it.

The Historical Filter Bubble of the Soviet Union

In the USSR, there were only around three different types of suitcase. If you look for second-hand suitcases past a certain age in Latvia, you’re going to stumble across one of that limited range.

Obviously, the ex-Soviet countries are still reeling from the Soviet-imposed filter bubble, but again, it’s a pattern. Before the advent of the printing press, news spread by word of mouth, and that in itself was a filter by today’s standards. Even today, a lot of news outside the internet is local only. We only notice the filter bubble because we should have such unbridled access to the outside world, yet the places we get our information from are tightly controlled by proprietary gatekeepers. The communist utopia of the internet is dead, controlled now by corporate forces, the highest bidder, or those with the marketing education and sensationalist skill to get their content in front of a wide audience.

Google’s mission is to organize information on a global level. Facebook’s is to organize it by blinkered relevance. The internet’s mission to make all information available to everyone in equal measure has been derailed by capitalist agendas.

But what else can you expect in an era where social media itself is responsible for powering the election of Trump?
