ARE ALGORITHMS DANGEROUS?
Andrew Michael’s first-person account “From Zero To Hate In Just a Tik And a Tok” (Winter 2023), describing how TikTok’s recommendation algorithm can lead users to Nazi content even when they never sought it out, generated a spirited response online. On the web forum Reddit, a debate ensued over the process of algorithmic radicalization the author described. “This whole concept of algorithmic extremism is frankly nonsense,” a commenter with the username Party_Reception_4209 wrote. “What the algorithms do is optimize for engagement. If you constantly signal to the platform that you want a certain kind of content then you’ll get it. No platform serves up extremism when all the user ordered was memes of hot moms dancing in their kitchen, which is all I ever see on the app.”

Disagreeing, another commenter (username: Vecrin) argued that it is the ability of algorithms to shift a user’s recommended content over time that makes them insidious. “Let’s say you’re a random teenager who likes gaming. TikTok feeds you amazing Fortnite clips. All good. But then it starts to push this guy who starts making fun of some ‘crazy’ woman who is ‘ruining gaming.’”

Over time, Vecrin noted, a user’s feed continues to drift. “You happen upon a TikTok talking about how feminists are trying to ruin culture…Pretty soon you’re listening to people complain about how women aren’t having relationships with guys like you anymore. Feminism has corrupted them…So you start to listen to that type of content more. It’s fun. They’re just joking around. What’s the harm in listening to a few slurs drop every once in a while?…So you continue to listen and think, huh, maybe they’re right about this stuff.” They warned, “It starts with a normal, fun subject (like gaming) and slowly pushes you down the rabbit hole.”
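The mechanism the two commenters disagree about can be made concrete with a toy simulation. The Python sketch below is not TikTok’s actual system; the topic labels, the adjacency map, and the 15 percent “probe” rate are all invented for illustration. It models a recommender that mostly serves more of whatever the user last watched, which is Party_Reception_4209’s point, but occasionally probes an adjacent topic in search of extra engagement. Because each probe that lands becomes the new baseline signal, the small nudges compound into the drift Vecrin describes.

import random

# Hypothetical topic graph: each topic lists the "adjacent" topics the
# recommender may drift toward. Labels and structure are invented.
ADJACENT = {
    "gaming": ["anti-feminist"],
    "anti-feminist": ["extremist"],
    "extremist": [],  # nowhere further to drift
}

def next_clip(last_topic):
    """Pick the next clip's topic from the last one the user watched.

    Mostly serve more of the same; occasionally (15% of the time, an
    assumed rate) probe an adjacent topic to chase extra engagement.
    """
    neighbors = ADJACENT[last_topic]
    if neighbors and random.random() < 0.15:
        return random.choice(neighbors)
    return last_topic

random.seed(0)  # fixed seed so the run is reproducible
feed = ["gaming"]
for _ in range(300):
    # The feed's only signal is the most recent clip, so every probe
    # that lands quietly resets the baseline.
    feed.append(next_clip(feed[-1]))

print("first clips:", feed[:3])
print("last clips: ", feed[-3:])

Because the only randomness is a small forward probe and there is no path back, a run of this length almost always ends at the most extreme topic even though the simulated user never requested anything but gaming clips. Real recommenders are vastly more complicated, but the one-way-ratchet dynamic is the crux of the disagreement.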