Hear the writers discuss this subject on the Secret Cave Podcast!
We’re inside the filter bubble.
All the content you see on sites like Facebook, Netflix, Twitter, Instagram and Google (news, music, images, movies, search results) is controlled by algorithms.
Over two-thirds of content discovery happens online, in the land of the algorithms…
These algorithms use a (rudimentary) form of machine learning to profile your tastes into crude categories, crunch the data and show you what they expect you to like. If it’s anything like the profiling in Google Analytics, you’ve been pegged as things like ‘Movie freak’ and ‘Technophile’ by countless lines of code in the past.
Now, if you’re the sort of person who can’t stand cat memes — and never likes them on Facebook — you can be sure Facebook’s not going to take the risk of showing you any. Algorithms are programmed to be scared and stupid: scared to show you anything you might not like, and too stupid to extrapolate information to truly work out your deep range of interests past the point of a few rough guidelines.
Case in point:
A few weeks back, an algorithm on my Facebook feed snapped and said “Oh yeah? You like technology news? Why don’t you see only technology news?!”. Clicked something about Apple last week? Prepare to never see anything else ever again.
And, unless I actively go about finding sources manually, there’s no chance for me to discover anything that might lie slightly on the fringes of what I’m interested in.
This isn’t just my own moaning. I’m actually recounting an example that perfectly illustrates the filter bubble theory. The thoughts we have about the content we consume are a direct result of control by algorithms. It’s like an infinite loop: the more interest we show in what we’re being given, the narrower the range of content becomes.
Here’s a rough idea of the process:
- You sign up for Facebook. It has no idea who you are or what you like.
- You share an announcement about Android on your timeline.
- Facebook latches on and starts stabbing in the dark, displaying general tech news and hoping you’ll narrow your interests further.
- You like a few specific stories on your timeline from this selection.
- You start getting adverts, content suggestions and sponsored page results related only to these narrow fields of interest.
- The more you interact with them, the narrower the field gets.
- Eventually, you’re only shown specific niche areas of these topics, and would never see content related to, say, cybersecurity or Samsung.
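The narrowing loop above can be sketched as a toy simulation. To be clear, this is a deliberate caricature, not any platform’s real ranking code, and every name and number in it is hypothetical: each topic starts with equal weight, the feed samples one story per round from the weighted distribution, and every click on your one expressed interest boosts that topic’s weight, so the feed concentrates on it over time.

```python
import random
from collections import Counter

def run_feed(topics, clicked_topic="android", rounds=200, boost=1.0, seed=42):
    """Toy model of an interest-narrowing feed (illustrative only).

    Every topic starts with equal weight. Each round the feed samples
    one story from the weighted distribution; whenever the story
    matches the user's single expressed interest, that topic's weight
    grows, so the feed shows it more and more often.
    """
    rng = random.Random(seed)
    weights = {t: 1.0 for t in topics}
    shown = Counter()
    for _ in range(rounds):
        # Sample one story in proportion to current topic weights.
        story = rng.choices(list(weights), weights=list(weights.values()), k=1)[0]
        shown[story] += 1
        if story == clicked_topic:   # the user only ever clicks one topic...
            weights[story] += boost  # ...so the algorithm keeps boosting it
    return shown, weights

shown, weights = run_feed(["android", "cybersecurity", "samsung", "apple"])
```

Run it and the clicked topic ends up dominating both the weights and what gets shown, while the unclicked topics (the cybersecurity and Samsung of the list above) keep their starting weight and fade into the background.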
Unfortunately, it means you have to be careful about what you express interest in. If you don’t often interact with content on social media, the few clicks you do make send strong signals to the algorithm about things that could easily be one-offs.
We end up living in our own ‘personal ecosystems’, sheltered from whatever might actually be going on in the world because algorithms won’t show us anything that could accidentally cause a disagreement.
It’s easy to argue that this makes for a dull and overly relevant experience online, but it’s just as easy to argue something worse: unless these aggregation algorithms start curating content more holistically, the majority of the population (over two-thirds say social media is their source for ‘new’ content) will each become enclosed in their own personal filter bubble, totally unaware of new ideas, disagreeable events or sources of entertainment they didn’t already know about.
Why do I care about the echo chamber?
I feel so strongly about this because I started Secret Cave to be a place where those with one interest (sci-fi films, for example) can read material about things they might otherwise be shielded from by the usual sources of content discovery — like wrestling.
I’ve been lucky enough to have personal introductions to a ton of art and media that’s simply not inside the scope of algorithmic curation.
We’re entering an age in which the internet isn’t a platform for true discovery, but a noisy echo chamber. One that doesn’t echo new work, but the existing thoughts inside our own heads, over and over, until we give up, click a link, and the cycle refines and starts over.
Origins and evolution of the echo chamber
The first personalized content algorithm came from Google way back in 2004. It used browsing history, location and language to determine what you meant by the terms you searched for. That’s much more useful, and less insidious, than Facebook’s news feed algorithm, which came seven years later in 2011. Since search is a far less prominent feature on Facebook, that algorithm was written to show you what it thinks you’ll like.
There are definite benefits to algorithms of this kind: relevant search results, related items when shopping online. And, for the most part, these two use cases are objective.
If you’re searching for a book shop, you don’t need one in Florida if you live in Texas. If you just bought a toothbrush, you might want to buy toothpaste. No harm in offering, right?
But if you like journalism following the Trump campaign, it doesn’t mean you’re down to give that a disproportionate amount of your attention. Imagine if Amazon suggested you buy toothpaste every day for the rest of your life until you finally caved, and then it started up with toothbrush suggestions again…
With the 2011 Facebook algorithmic news feed, something ironic happened.
Because Facebook shows you links based on what your friends view, a disproportionate amount of depthless clickbait started showing up. In a bid to display the most relevant, highest quality content, Facebook accidentally did the opposite and filled timelines with sensationalized tripe.
It took a full two years to correct this, all the while allowing viral shite warehouses (where the content is buried 15 clicks deep) to thrive.
And, ‘you’ll never guess what happens next!’ — Facebook cracked down on titles using the curiosity gap to get clicks, and stopped the shite warehouses from watering down the quality of content.
The timeline hasn’t changed much since, leaving us with a refined yet repetitive echo chamber of tightly tailored content, housing us in a cosy little bubble and making sure we don’t see anything we don’t agree with, subscribe to, or have a strong (in their eyes) interest in.
Twitter’s take on the echo chamber
It’s a shame, but Twitter has a dramatically smaller user base than Facebook — 317 million vs. 1.6 billion.
While Twitter is for news, conversations and sharing content, it resisted algorithms for years. Until recently, tweets were shown chronologically. That’s because, unlike Facebook, it’s supposed to be a timeline of events as they happen, leaving nothing out.
The problem, though, is that as Twitter grew, it evolved from a place where users would follow 20 friends and a few celebrities to a place where users follow thousands of random users and clutter their timelines past the point of any usefulness. It turned the timeline from a platform for tightly curated content to a platform for random chance. You could just as easily see three exceptional stories in a row as you could see two emoji-spams followed by an ancient meme.
Twitter’s solution — to increase engagement on its dying platform — was to introduce a Facebook-style algorithmic timeline that displays tweets from people whose posts you like and retweet often, as well as popular tweets from your circle.
The way it’s engineered — that is, less refined and less tightly targeted — now makes Twitter a better place than Facebook to chance upon something that isn’t laser-targeted at your existing ideals.
Popping the filter bubble
Is the future to tone down the algorithmic control, to tighten it up and make it smarter, or to go ‘off the grid’ entirely, abandon the algorithms and just rely on a few sources you can trust to provide fresh, insightful, high quality content?
There are a few ways to fight against it — selectively signing up to email newsletters, handpicked Twitter lists, etc. — but honestly, who has the time or energy when they can load up social media and be instantly presented with a string of mildly interesting items that’ll hold their attention for as long as it takes to have a cigarette or choke down a rank sandwich at work?
As we care less about our email inboxes, watch huge media outlets (BuzzFeed & co.) monopolize the space, and rely on algorithms to read our minds and make shallow predictions about our tastes, we’re trapping ourselves in our own private ecosystems of beliefs, assumed interests and overbearing media outlets.
Unless there’s a dramatic change, it won’t be long before our interests stagnate and we become a population of drones with no original thought.
That might be an overstatement. What I’m saying is, while I’m having a cigarette or choking down a rank sandwich at work, show me something that isn’t less-than-mildly-interesting Apple announcements.