Last month, I decided to embark on an experiment. I created a new TikTok account with the specific goal of funneling a hypothetical teenage boy down the TikTok rabbit hole, otherwise known as the “alt-right pipeline.” The result was shocking.
But first, methodology: I only watched content from my “For You” page, and I did not use TikTok’s search function, so all videos in this experiment were suggested to me by the algorithm.
If I saw a video that I thought would take me further down the rabbit hole, I would “like” it. If I was very confident that it would, I would also follow the creator or I would comment on the video.
After skipping through dog videos and recipes, I found my starting point: a comedy video made by an Asian-American user and based on the stereotype of strict Asian parents who speak with exaggerated Chinese accents. From there, the algorithm recommended a “Karen” video. (Karen is a term used to describe white, usually middle-aged women who are exceptionally rude and act entitled.) Videos of alleged “Karen freakouts” are popular on TikTok; they’ve also been controversial, with many arguing that the stereotype is used to attack women rather than call out bad behavior.
I decided that a dash of misogyny would help this experiment, and gave that video a like. The next video I liked was posted by Barstool Sports, a site focused on “bro humor” that Media Matters has called “a cesspool of misogyny and bigotry.” Their page was an obvious follow.
In the early stages of this experiment, most TikToks masked their bigotry with humor. But some of the jokes grew more bigoted, and even violent. One video featured a man screaming racial stereotypes at a shooting-range target. In another, a man threw his girlfriend into a snowbank. Transphobic humor became frequent, and jokes began to raise the question of who was supposed to be laughing. My feed also began to populate with male fitness content. Most of these videos are not political but play to the insecurities of teenage boys, making them vulnerable to influence.
My feed then took a bizarre turn. I began to see videos from the “pedophile hunter” movement, which features creators disguising themselves as children to confront alleged pedophiles in public, à la the reality TV series To Catch a Predator. At first glance, these videos don’t seem hateful (who amongst us is not opposed to molesting children?) but they do not exist in a vacuum. With charges that LGBTQ+ people are “groomers” becoming commonplace on the far right, and a broader obsession with pedophilic conspiracy theories in movements like QAnon, these videos can play to an audience with ulterior motives.
Next, I came across perhaps the most pivotal person in this journey: Andrew Tate. Over the last few years, Tate, who in December was arrested on suspicion of human trafficking in Romania, has become a folk hero to the far right. He’s a retired kickboxer prone to flaunting his wealth and bragging about his physical prowess and willingness to fight. And while adults may not be impressed by this bravado, many insecure teens find Tate cool, even someone to emulate. His videos became representative of my feed’s rightward shift: What started with a video mocking skydivers for holding hands led to one in which Tate warns men that women will leave them at the first sign of cowardice. And just three likes later, I reached a video where Tate says that the devil is an active force in the world and, in a seeming dig at trans people, that boys are “cutting their dicks off from a psyop [psychological operation] from Disney”—a long way from skydiving.
Tate was banned from TikTok and most other social media platforms in 2022, although his Twitter account was reinstated after Elon Musk took over that platform. But that hasn’t stopped Tate’s content from proliferating: his fans post and repost his clips more rapidly than TikTok can remove them.
My feed’s rightward push soon went into hyperdrive. At first it was videos in support of Kanye West’s antisemitic ravings, which quickly gave way to videos making coded claims about small groups of “elites” controlling society. From there, it was just a few swipes to a video of David Icke, best known for his conspiracy theory that the world is controlled by an elite race of reptilian “lizard people.” In the one I saw, he claimed that Israel’s intelligence agency, the Mossad, along with several other intelligence agencies, controls people in power by filming them sexually abusing children. I had reached my destination—an outright antisemitic conspiracy theory.
I checked my app’s usage time. I had been on TikTok for 3 hours and 21 minutes: just over two days’ worth of usage for a teen who spends the average of 99 minutes on the app each day.
As difficult as these videos were to watch, I learned a few things from this experiment. The first is that the bigotries promoted on the platform are connected: misogyny, racism, homophobia, transphobia, and xenophobia were necessary traveling companions in my journey down the rabbit hole. It also became clear that irony is an essential tool in the early stages of radicalization. The content started out with jokes, but by the time I got to David Icke, comedy night was over.