The transphobia pipeline – TikTok’s algorithm and harmful content

I was scrolling through TikTok, as I usually do, and sometimes you come across genuinely insightful and interesting content. One of those videos appeared on my For You Page (FYP), and it talked about how engaging with transphobic content can lead to right-leaning, alt-right uploads being recommended to the user. While that side of the political spectrum is generally against supporting trans people, it is interesting how the TikTok algorithm groups these types of content together, categorises them as similar, and recommends them to people. This can be genuinely harmful, as some users can get sucked into the world of ‘crypto-terfs’ and accidentally drift into far more dangerous territory.

The first video I saw was from a creator named Abbie Richards, with the username @tofology. As of this moment, the video has about 2.5 million views, almost 600 thousand likes, and around 14 thousand comments. In it, she describes the article she co-wrote with Olivia Little, titled ‘TikTok’s algorithm leads users from transphobic videos to far-right rabbit holes’. It is a very interesting read, with a well-thought-out methodology designed to test the hypothesis. The researchers demonstrated how transphobia can be a gateway to the radicalisation of a social media timeline. The experiment involved creating a brand new account, following known transphobic creators, and interacting with content deemed harmful. The first 450 videos on the account’s FYP were then recorded and reviewed for their themes and topics.

The research found that interacting solely with transphobic content leads to the recommended feed being filled with homophobia, racism, misogyny, white supremacist content, anti-vax videos, antisemitism, ableism, conspiracy theories, and videos promoting violence.1 To be more precise:

Of the 360 total recommended videos included in our analysis, 103 contained anti-trans and/or homophobic narratives, 42 were misogynistic, 29 contained racist narratives or white supremacist messaging, and 14 endorsed violence.

Olivia Little and Abbie Richards in ‘TikTok’s algorithm leads users from transphobic videos to far-right rabbit holes’2
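
Those counts are easier to grasp as shares of the whole feed. Here is a minimal Python sketch (using only the figures quoted above) that converts them into percentages of the 360 recommended videos:

```python
# Counts quoted from Little and Richards' analysis of 360 recommended videos.
counts = {
    "anti-trans and/or homophobic": 103,
    "misogynistic": 42,
    "racist or white supremacist": 29,
    "endorsing violence": 14,
}
total = 360

for label, n in counts.items():
    print(f"{label}: {n}/{total} = {n / total:.1%}")
```

Run it and the first line alone is striking: roughly 28.6% of the recommended feed, more than one video in four, was anti-trans and/or homophobic, after the account had done nothing but interact with transphobic content.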

The researchers found that the longer they scrolled and interacted only with transphobic content, the more violent the recommended videos became, some all but calling for harm to come to LGBT people. Many of those videos get hundreds of thousands of views, along with a great number of ‘positive’ comments supporting the creator. The article describes how the format of TikTok videos, combining audio, video and text, is almost perfect for spreading harmful content. The platform’s ease of use, and the ability to frame hateful uploads as jokes, is key to how large the hate-speech community has become. The authors mention many individual examples of videos promoting transphobia and featuring far-right figures such as Nick Fuentes, Ben Shapiro, the former leader of the British Union of Fascists Oswald Mosley, and Paul Nicholas Miller, among others.

This research also aligns with another video I came across on my FYP, which talks about how the TERF (trans-exclusionary radical feminist) pipeline resembles the alt-right pipeline: a young, impressionable person can easily be drawn into such a harmful community. It can start with sympathy for the hateful beliefs and end with full immersion in such a space. The TikTok creator Lauren, who posts under the handle @gothamshitty, explains the term ‘crypto-terf’, which refers to people who hide their true beliefs or avoid talking about trans issues altogether. She points to her own previous beliefs and argues that identifying hidden transphobic content is important when trying to educate yourself as a new feminist. Identifying or sympathising with possible terf content can lead to misinformation. She gives the example of a particular creator claiming that ‘radical feminism is better than liberal feminism’, which can lead a viewer to assume those are the only ways of being a feminist, reducing a complex movement to a simple choice between one or two options. Lauren then explains the mechanism behind the previously mentioned experiment. When a person decides to follow a creator, their recommended content can quickly align with the videos that the creator engages with. By following a possibly harmful individual, a young, impressionable audience can fall into a dangerous rabbit hole.
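
TikTok’s actual recommender is proprietary, so nobody outside the company can show its real code, but the general mechanic Lauren describes, items engaged with by overlapping audiences being treated as similar, can be illustrated with a toy sketch. Everything below (the users, the topic labels, the cosine-similarity scoring) is a hypothetical simplification, not TikTok’s method:

```python
# Toy model of engagement-based similarity: topics that share an audience
# end up looking 'similar' to a recommender. All data here is made up.
from itertools import combinations
from math import sqrt

# Which hypothetical users engaged with which topic clusters.
engagements = {
    "user_a": {"transphobic", "anti-feminist"},
    "user_b": {"transphobic", "far-right"},
    "user_c": {"anti-feminist", "far-right"},
    "user_d": {"cooking", "gardening"},
}

# Invert the mapping: for each topic, the set of users who engaged with it.
topic_users: dict[str, set[str]] = {}
for user, topics in engagements.items():
    for topic in topics:
        topic_users.setdefault(topic, set()).add(user)

# Cosine similarity over shared audiences: |A & B| / sqrt(|A| * |B|).
for t1, t2 in combinations(sorted(topic_users), 2):
    shared = len(topic_users[t1] & topic_users[t2])
    similarity = shared / sqrt(len(topic_users[t1]) * len(topic_users[t2]))
    if similarity > 0:
        print(f"{t1} ~ {t2}: {similarity:.2f}")
```

In this toy example, ‘transphobic’ and ‘far-right’ score as similar purely because one user engaged with both, so an account that only ever touched transphobic videos would still be nudged towards far-right ones. Scaled up to millions of users, that is the drift the researchers measured.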

This new way of accessing harmful content, through a high-speed, short-attention-span format, can endanger many young people. Radicalisation of a feed like TikTok’s FYP can lead people to follow and engage with hate speech, violence and other awful things. It is a scary turn of events: TikTok’s guidelines do not allow ‘content that attacks, threatens, incites violence against, or otherwise dehumanizes an individual or group on the basis of’ things like a person’s gender and gender identity, and yet this type of content is not only left up but even promoted.

1. https://www.mediamatters.org/tiktok/tiktoks-algorithm-leads-users-transphobic-videos-far-right-rabbit-holes

2. Ibid.