Short-form content is everywhere nowadays, with TikTok, Instagram Reels and YouTube Shorts seemingly grabbing our collective attention span with an iron fist. At least that’s how it has felt for me ever since Google introduced Shorts to YouTube. I managed to dodge TikTok – thankfully – as I was working as a substitute teacher at an elementary school when it took off, thereby forever associating it with 10-year-olds in my mind. YouTube Shorts, however, were harder to avoid. I have been using YouTube since around 2007 and it has long been a big part of my leisure routine, which meant that this new short-form content could easily sneak up on me.
YouTube Shorts
Since 2020 I have increasingly found myself watching and scrolling through these Shorts, sometimes for hours and often when I’d actually much rather be doing something else. The quick dopamine hits that come from short-form content seem to be incredibly addictive to my brain, even though on a conscious level consuming content in this way doesn’t feel particularly enjoyable. The worst part for me is that it feels like I’m letting my mood be decided by whatever kind of content pops up, determined by a complicated algorithm that uses what data there is about me (for example gender, age, location, whether or not I have children, etc.) and my internet habits to figure out what content it thinks I in particular will find “engaging”. Engaging in this case does not equal enjoyable: I’m just as likely to engage with a Short that has cute animals in it as with one that depicts graphic police brutality. The only choice I get to make in the matter is whether or not to interact with the YouTube algorithm at all, which is hard to avoid since consuming content on the site is a deeply ingrained habit.
Does it have any value?
This got me thinking about content algorithms more broadly and whether they have any value at all. Since the algorithm shows you what it thinks you will engage with based on what you’ve watched previously, you can quickly find yourself being recommended a lot of the same kinds of content. This can be a good thing or a bad thing, depending entirely on which direction the algorithm takes you. For example, if you are a closeted queer teenager in a queerphobic and unsupportive family, interacting with queer-supportive content on a platform like TikTok or YouTube can be a huge help in accepting yourself, since the algorithm will tend to show you more similar content based on your user data. On the other hand, if you encounter more toxic content such as conspiracy theories or “manosphere” content, you are at risk of being exposed to more and more extreme far-right material. In the case of short-form content in particular this effect is amplified, since you scroll through different content even faster than you would watch traditional YouTube videos.
Overall I feel that these types of content algorithms aren’t necessarily bad in themselves; what they do is ‘mostify’. They take whatever content is being presented to you and give you more of it, in ever more concentrated versions. This means they can be a tool of radicalization just as much as a tool of infinite cute puppies. While this is true, it is important to keep the creators behind the algorithm in mind as well. The thought that Google has such power over my feelings through the content loop I’m being fed is troubling to say the least. Google the corporation has a profit motive, but not necessarily a human one.
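To make the ‘mostify’ idea a bit more concrete, here is a minimal toy sketch in Python of that feedback loop. The categories, the weighting, and the assumption that the viewer engages with everything shown are all invented for illustration; this is not how YouTube’s actual recommender works, it only shows how a small initial bias can compound when a system keeps serving more of whatever it already served.

```python
import random
from collections import Counter

# Toy illustration of the "mostify" feedback loop described above.
# Categories and weights are made up; this is NOT YouTube's recommender,
# just a sketch of how an initial bias compounds over time.

CATEGORIES = ["puppies", "cooking", "conspiracy", "sports"]

def pick_next_short(engagement: Counter) -> str:
    """Sample the next Short, weighted by how often each category
    has been engaged with so far (plus 1 so nothing is impossible)."""
    weights = [engagement[c] + 1 for c in CATEGORIES]
    return random.choices(CATEGORIES, weights=weights, k=1)[0]

def simulate(scrolls: int, seed_category: str) -> Counter:
    """Start with a single engagement and let the loop feed on itself."""
    engagement = Counter({seed_category: 1})
    for _ in range(scrolls):
        shown = pick_next_short(engagement)
        # Assume the viewer engages with whatever is shown;
        # that assumption is what makes the loop self-reinforcing.
        engagement[shown] += 1
    return engagement

if __name__ == "__main__":
    print(simulate(scrolls=500, seed_category="puppies"))
    print(simulate(scrolls=500, seed_category="conspiracy"))
```

Run a few times, the seed category tends to dominate the final counts, and the dynamic is exactly the same whether the seed is puppies or conspiracy content, which is the point: the loop doesn’t care what it is concentrating.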
Comments

This is a funny one! It’s nice to know that people out there, even people not deeply involved in the programming scene, are well aware of the inner workings of social media platforms.
You mentioned something very important in your blog: ‘radicalisation’. Other blogs talk about ‘echo chambers’. I think this is one of the worst downsides of consuming media en masse. I have also noticed that people nowadays tend to ‘train their algorithms’ (I don’t really approve of using the term ‘algorithm’ this way, but that’s how some people use it nowadays). I have also noticed some changes in social media use and practices, and there are some concerning reports pointing to the same thing: there is a tendency nowadays to use social media in closed groups. In the very early days of Facebook, for instance, and me being an early user, I remember people having open profiles publicly displaying their activities and photos. Nowadays everything is closed, and people tend to communicate in closed groups. In sum, I think the purpose of social media platforms has shifted from socialising to feeding one’s ego, in which case it may no longer really be social media at all.
I used to get a lot of these toxic, manosphere kinds of Shorts. In the beginning I liked them because I thought they were funny: some very insecure men talking about how cool they were and how women were not. However, I realized there are a lot of people who take these ideas, packed into a 15-second Short, very seriously and become radicalized. Thankfully, I came to see it as very harmful content, but I think a lot of people still don’t realize this yet. I personally think Shorts are a bad development, but I hope the positives can outweigh the negatives.