The conditions that enable and encourage trolling aren’t exclusive to the internet, but they are more prevalent online than off, mostly thanks to the ease of anonymity and effects of crowd psychology. Territorial behavior — based either on platform loyalty or tight-knit communities — is amplified when geographical constraints no longer play a part. Internet crusades targeting other groups or individuals have become so commonplace that major platforms like Twitter have had to rethink their stance on free speech.
The internet population is growing, but it’s also fragmenting as real world issues polarize mainstream and fringe subcultures alike. In this article, I’ll examine the phenomena of internet territorialism and those who coordinate trolling on a large scale.
Trolling as territorialism
Trolling is now universally regarded with negative connotations, but at its origin it was more like a playful game than a malicious crusade. Usenet groups in the 1990s — specifically alt.folklore.urban (AFU), which went on to inspire Snopes.com — are the first recorded homes of internet trolls. Trolls on AFU (often experienced users) would bait initiates with misinformation or deliberate misspellings, hoping to highlight the naivety of new users who weren’t in on the joke. In this instance, trolls truly were lurking under the bridge, allowing entry only to those who could pass the test, solve the riddle, or answer the questions correctly.
“AFU has its own ways of doing things: everything from dealing with trolls to investigating urban legends has been reduced to a formula over the years. Many people posting to the group have been there for years: five or six years isn’t unusual. The group has developed its own humour and culture of in-jokes. If you’re new you will occasionally see things you don’t understand but after the second or third time you’ll notice a pattern about it and will figure-out what’s going on.”
Trolling in its formative state consisted of harmless jokes and tongue-in-cheek hazing. It was a way for users to feel part of the in-crowd, or to feel a momentary sense of superiority, or even empathy for those being hazed. The kindly worded extract above, from the group’s style guide, doesn’t stink of malicious intent — its very existence proves that the group made efforts to induct newcomers. In the years between the ‘90s and the present day, internet communities have become more fragmented and more dangerous. Trolling has since evolved from an induction test into something that can ruin careers, incite gun violence, and cause suicides.
As trolling became more widespread, it shifted from a joke to a method of motivating and mobilizing groups of anonymous users in tight-knit internet territories. We see real-life parallels in studies of anonymity and crowd psychology. Offline, these factors usually come together during riots and protests, but the internet magnifies uncommon human tendencies by bringing like-minded people together in an echo chamber of ideas and common goals.
The AFU example is interesting because it offers an early instance of trolling as a form of territorialism. In that early case, all were welcome as long as they learned a few in-jokes. Now, trolling has become synonymous with online harassment, and is used as a way to pressure people out of the group — especially people who don’t share the extreme views of the established “old hands”.
Twitter is an especially attractive platform for trolls; unlike Facebook and LinkedIn, it doesn’t require users to sign up under a real name. Users can direct tweets at anybody on the service and, for a long time, its code of conduct favored free speech above all, “except in limited circumstances”.
In 2016, a notorious bout of Twitter abuse involving Milo Yiannopoulos, Leslie Jones, and numerous anonymous fake accounts prompted Twitter to rethink its policies on harassment.
The shift away from Twitter as the self-described “free speech wing of the free speech party” indicates the rise of a new kind of trolling: coordinated abuse against a specific target. Twitter, however, isn’t always the place where this abuse is planned or exacted.
One example we came into contact with first-hand at Secret Cave is the entourage of trolls stirring up animosity on The Dick Show’s subreddit after a dispute between Maddox and Dick Masterson. For a community focused on Masterson’s podcast, the ratio of posts that directly concern Maddox is startling. Here, fans left bitter by the untimely end of a podcast co-hosted by Maddox and Masterson gather information and coordinate attempts to defame and harass Maddox.
This is just one symptom of the coalescence of a sidelined population and the internet’s ability to bring them together and stir them into a frenzy. Outside the analog realm of riots and protests, this kind of behavior was previously impossible on such a large scale.
As Andy Tyrrell told us on a recent podcast about trolling, users fall into becoming trolls because they see that the methods work. It’s been over two decades since the dawn of the internet age, and that’s a lot of time for this vicious feedback loop to gain momentum and prove to trolls that their efforts will not go unrewarded. So much so that trolling has become less the casual pastime of online troublemakers and more a weapon for crusaders.
Trolling as a crusade
Before Facebook and Twitter (but during the decline of Usenet), 4chan established itself as the home and destination of the internet’s most dedicated trolls. Trolling was still territorial, and echoed the gatekeeper-style tactics of AFU, but 4chan’s popularity and huge user base made it possible for large groups to come together and crusade for a collective cause. The internet was not prepared for the mass mobilization of malicious users. By sheer strength of numbers, users of 4chan’s /b/ board managed to manipulate Google search results and flood YouTube with porn.
One particularly upsetting event even involved a group of anonymous users spamming the Facebook page of a dead girl with hundreds of horrific messages.
While these incidents reflect a sadistic sense of humor rather than a concerted effort to effect change, they serve to illustrate the power a coordinated group of anonymous users really has.
As with the outrage surrounding Dick Masterson and Maddox, the cancellation of alt-right comedian Sam Hyde’s show, Million Dollar Extreme, was met with explicit calls to action from fans, Trump supporters, and Hyde himself. In a tweet that has since been deleted, Hyde called on fans to protest Adult Swim’s parent company, Turner:
Best way to get show back is to put pressure on Turner. I’ll make an announcement vid soon, prob tmrw
— Sam Hyde (@Night0fFire) December 6, 2016
This resonated with The_Donald, a subreddit for Trump supporters.
In a conversation in which Sam Hyde confronted Tim Heidecker about the show’s cancellation, Heidecker speaks of the coordinated online harassment he was subjected to as a result:
Hyde: I have heard that you campaigned to get our show taken off the air.
Heidecker: That is wrong, Sam, and listen to me: It is wrong. Now, like I have said on this show many times, I have expressed to people there and to people around me that I have been attacked online by people in your name. I don’t know if it’s you or whoever it is, but it’s people saying…
Hyde: It’s not me.
Heidecker: Okay, it’s not you, but it’s people over and over again, coordinated, consistent, nasty, violent…
Heidecker: No, listen. Violent death threats — all that shit coming at me, and all I said — all I ever said to anybody about this subject — was “That audience is an issue.” That is an issue that [it] doesn’t seem to be dealing with, so…
Below is just one of the thousands of tweets directed at Heidecker in the wake of Hyde’s accusations:
I’m sure that his show would definitely of been aired if T&E never had a show. They’re worthless!
— Taylor K Conrad (@TaylorKConrad) December 25, 2016
The Hyde/Heidecker case is one example of a troll campaign that had offline impact, but perhaps the most memorable incident of the past 12 months has been the Pizzagate hoax.
Following the leak of John Podesta’s emails during the 2016 election, members of 4chan’s alt-right /pol/ board cooked up a plan to defame Hillary Clinton and help bring about the election of Trump.
Here’s the post that started it all:
Whether the author was sincere or not, the hoax was picked up by influencers with large, committed followings, including Alex Jones and now-renowned alt-right conspiracy theorist, Mike Cernovich.
— Mike Cernovich 🇺🇸 (@Cernovich) November 21, 2016
Convinced that the leaked email exchanges used words like pizza, hotdog and walnut as code for pedophilic activities, armies of trolls rose up from the depths of 4chan to harass Podesta and gather information.
Their quest led them to Comet Ping Pong, a pizzeria in Washington DC. At the height of the events, an armed man who believed Clinton officials were locking children in the restaurant’s basement fired an assault rifle inside the building.
Alex Jones — whose own coverage of Pizzagate prompted Edgar Maddison Welch to fire shots inside the pizzeria — has since apologized for his false accusations. However, this has done little to dissuade those invested in the theory.
Alex Jones disowning Pizzagate hasn’t dissuaded them — if anything, his “silencing” is proof of a broader conspiracy.
— Will Sommer (@willsommer) March 25, 2017
It’s important to remember that this complex, distressing sequence of events originated with a poisonous collective that has a solid reputation for creating and proliferating misinformation.
The ripple effect spread from 4chan to The Washington Post in a matter of days, once again proving that trolls are among the most influential forces on the internet — and, indeed, the physical world.