Algorithms and Short-Form Content

Short-form content is everywhere nowadays, with TikTok, Instagram Reels, and YouTube Shorts seemingly holding our collective attention in an iron grip. At least that's how it has felt for me ever since Google introduced YouTube Shorts to the platform. I managed to dodge TikTok, thankfully, as I was working as a substitute teacher at an elementary school when it took off, thereby forever associating it with 10-year-olds in my mind. YouTube Shorts, however, were harder to avoid. I have been using YouTube since around 2007, and it has long been a big part of my leisure routine, which meant that this new short-form content could easily sneak up on me.

YouTube Shorts

Since 2020 I have increasingly found myself watching and scrolling through these shorts, sometimes for hours, and often when I'd actually much rather be doing something else. The quick dopamine hits that come from short-form content seem to be incredibly addictive to my brain, even though on a conscious level consuming content this way doesn't feel particularly enjoyable. The worst part for me is that it feels like I'm letting my mood be decided by whatever content pops up, chosen by a complicated algorithm that combines what data there is about me (gender, age, location, whether or not I have children, and so on) with my internet habits to figure out what content it thinks I in particular will find "engaging". Engaging in this case does not mean enjoyable: I'm just as likely to engage with a short featuring cute animals as with one depicting graphic police brutality. The only choice I get to make in the matter is whether or not to interact with the YouTube algorithm at all, which is hard to avoid since consuming content on the site is a deeply ingrained habit.
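To make the gap between "engaging" and "enjoyable" concrete, here is a toy sketch of what ranking purely on predicted engagement looks like. Everything in it is invented for illustration (the feature names, the weights, the example shorts); this is not YouTube's actual system, just the general shape of engagement-optimized ranking.

```python
from dataclasses import dataclass

@dataclass
class Short:
    title: str
    predicted_watch_time: float    # seconds the model expects you to watch
    predicted_interactions: float  # expected likes, comments, shares

def engagement_score(short: Short) -> float:
    # The ranker only cares about measurable engagement signals.
    # Nothing here asks whether the viewer feels good afterwards.
    return 0.7 * short.predicted_watch_time + 0.3 * short.predicted_interactions

feed = [
    Short("Cute puppies compilation", predicted_watch_time=45, predicted_interactions=12),
    Short("Graphic police brutality clip", predicted_watch_time=48, predicted_interactions=11),
    Short("Calm woodworking tutorial", predicted_watch_time=20, predicted_interactions=3),
]

# Both the heartwarming and the distressing clip rank near the top,
# because both hold attention; the relaxing video loses out.
for short in sorted(feed, key=engagement_score, reverse=True):
    print(f"{engagement_score(short):6.1f}  {short.title}")
```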

Does it have any value?

This got me thinking about content algorithms more broadly and whether they have any value at all. Since an algorithm shows you what it thinks you will engage with based on what you've watched previously, you can quickly find yourself being recommended a lot of the same kind of content. This can be a good thing or a bad thing, depending entirely on which direction the algorithm takes you. For example, if you are a closeted queer teenager in a queerphobic and unsupportive family, interacting with queer-supportive content on a platform like TikTok or YouTube can be a huge help in accepting yourself, since the algorithm will tend to show you more similar content based on your user data. On the other hand, if you encounter more toxic content, such as conspiracy theories or "manosphere" content, you are at risk of being exposed to more and more extreme far-right content. With short-form content in particular this effect is amplified, since you scroll through content far faster than you would watch traditional YouTube videos.

Overall, I feel that these types of content algorithms aren't necessarily bad; what they do is "mostify". They take whatever content is being presented to you and give you more of it, and more concentrated versions of it. This means they can be a tool of radicalization just as much as a tool of infinite cute puppies. While this is true, it is important to keep the creators behind the algorithm in mind as well. The thought that Google has such power over my feelings through the content loop I'm being fed is troubling, to say the least. Google the corporation has a profit motive, but not necessarily a human one.
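As a minimal sketch of that "mostify" loop, consider the toy simulation below. It assumes a single user, a handful of made-up categories, and a recommender that simply samples from the user's own interaction history; it is a rich-get-richer process for illustration, not anyone's real recommender.

```python
import random
from collections import Counter

random.seed(42)  # fixed seed so the run is reproducible

categories = ["puppies", "cooking", "conspiracy", "sports", "music"]
# No history yet: the feed starts out uniform across categories.
history = Counter({c: 1 for c in categories})

def recommend(history: Counter) -> str:
    # Recommend in proportion to past engagement: the more you have
    # engaged with a category, the more of it you are shown.
    cats, weights = zip(*history.items())
    return random.choices(cats, weights=weights)[0]

for _ in range(200):
    short = recommend(history)
    # Suppose the user watches whatever appears; every watch feeds
    # back into the history the recommender samples from.
    history[short] += 1

# After a couple hundred scrolls the feed skews heavily toward
# whichever categories got lucky early on: the loop concentrates
# the feed rather than diversifying it.
print(history.most_common())
```

Run it with different seeds and a different category wins each time, but the feed almost always ends up lopsided. The loop doesn't care whether it is concentrating puppies or conspiracies; it just amplifies whatever you already engaged with.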