Have you ever been scrolling through YouTube, TikTok, Reddit, Twitter, or any other media platform and thought to yourself: “How did I end up here?” Sometimes you are looking for something specific, like on Google, but often you catch yourself watching or reading something you never intended to open in the first place. Other times it is a piece of content you are not familiar with or do not agree with. But the longer you stay on the platform, the rarer this feeling of unfamiliarity becomes. Most people do not realize this, but it is something we need to be aware of in our day-to-day digital lives, because it can seriously warp our perception of reality.
The almighty algorithm
While it might look like the content you see on the homepage of your media platform of choice is completely random, it most likely is not. Based on the content you have consumed before, an algorithm behind the scenes decides which content you are most likely to click on next. Nobody knows exactly how these algorithms make their choices, not even the software engineers who built them. But we do know that they can be very effective. It is important to media platforms that you stay on their website and keep getting fed content you like: more time spent on the website equals more time spent looking at advertisements. Even if you have never visited a specific website before, an algorithm can make an educated guess about your interests and what you are most likely to click on next through a process called fingerprinting [1], but that is a subject for a different time.
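To make the idea concrete, here is a deliberately simplified sketch of content-based recommendation in Python. Everything in it is hypothetical (the topics, the catalog, the scoring), and real platforms use far more sophisticated machine-learned models, but it shows the basic mechanism: the more you watch of a topic, the more of that topic you get recommended.

```python
# A deliberately simplified, hypothetical sketch of content-based
# recommendation. Not any real platform's system.

# Topics of videos this user has already watched.
WATCH_HISTORY = ["cooking", "cooking", "politics", "cooking"]

# A tiny catalog mapping each available item to its topic.
CATALOG = {
    "5-minute pasta recipes": "cooking",
    "late-night talk show clip": "politics",
    "cat compilation #37": "animals",
}

def interest_profile(history):
    """Count how often each topic appears in the viewing history."""
    profile = {}
    for topic in history:
        profile[topic] = profile.get(topic, 0) + 1
    return profile

def recommend(history, catalog):
    """Pick the catalog item whose topic best matches the profile."""
    profile = interest_profile(history)
    # Items on topics you already watch score highest, which is the
    # feedback loop that slowly narrows what you get shown.
    return max(catalog, key=lambda item: profile.get(catalog[item], 0))

print(recommend(WATCH_HISTORY, CATALOG))  # -> 5-minute pasta recipes
```

Notice that the "cat compilation" never stands a chance: nothing in the history points to it, so the loop reinforces what is already there. That, in miniature, is the dynamic the rest of this post is about.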
You see what you want to see
“So, the algorithm does its best to give you the things you like. What’s the problem here?” The problem is that there is often a very thin line between things people like to see and things people agree with. Human psychology is very predictable in the sense that we want others to confirm our views. If you find a piece of content that completely aligns with your views, you are more likely to engage with the creator and, in turn, more likely to stay engaged on the platform. This is very valuable to the platform’s mysterious algorithms, because it can be exploited to keep you on the website for as long as possible.
However, humans are not perfect, and their opinions can be even less so. There are many ways to look at subjects such as politics, science, and religion, but some views are generally agreed to be malicious, and we should stay away from them. The algorithm does not always care about that. All it cares about is engagement, because it is owned by a commercial company that needs to make as much money as it can. So the algorithm keeps showing people their favorite content, even when that content is a bad influence.
Repetition legitimizes
Can you guess what happens if someone only ever sees things from one perspective, without even knowing there might be another side to it? That person’s reality can get warped without them being aware of it. In a way, this is a modern kind of propaganda, where the distributor is not a big organization or government but an algorithm that not everyone is aware of. This is dangerous, because it is impossible to know how hard the algorithm is pushing ideas that can be harmful to society. It can lead to extremism and deepen the divisions between beliefs.
How do we solve this?
Unfortunately, there is no easy way to keep the algorithm in check. If you reduce the effectiveness of the algorithm, companies lose money, and money is the most important thing for companies. Simply refusing to recommend content linked to specific ideas can be even worse, because there is no single authority that can decide which beliefs are allowed without also censoring other important beliefs.
So we must work on the root cause of this problem: education. By improving education in the field of digital media, we can drastically improve the way we use our favorite websites without sacrificing our enjoyment. Universities already offer courses on these subjects, but that is not enough. To effectively educate people about the dangers of digital media, we need to reach people at all stages of life. Teaching people to be critical of the content they consume, and to distinguish entertainment from propaganda, is essential to keeping these conflicts to a minimum.
References
- [1] “Your Social Media Fingerprint,” https://robinlinus.github.io/socialmedia-leak/
Oh, the good old concept of ‘The Wikipedia Game’! But I do agree that one of the best solutions is education. It is a good starting point for making newer generations less susceptible to social media’s manipulative algorithms. I think another good starting point would be to limit time spent on social media in general (of course, it is not possible to avoid it entirely, given that most of us keep in touch with loved ones all over the world). A more realistic solution, when it comes to politics, would be reading or following different parties and news outlets to paint a slightly less biased picture of events. And yes, limiting the algorithm has its drawbacks: monopoly and even more control over people’s minds and emotions. However, even if we manage to keep the effect of social media under control, I do think the issue would remain (to a lesser extent, for sure), since the same applies to television, the paper press, or even IRL friend groups. We are, after all, keener to be friends with people who share our stances on the world. So it is a dilemma: if we try to “free” people from imposed opinions online, what stops them from being “victims” offline?
Sometimes I’m like, ugh, I hate algorithms; it’s like you’re living in a bubble of your own circle, and you know all too well that your TikTok feed is literally your For You page, controlling and manipulating your worldview. But actually, sometimes I kind of like it. When I want to be surrounded by something, I’ll intentionally keep watching one kind of video, searching the relevant hashtags, saving and liking posts to make them appear more on my FYP, until all of the content is exactly what I expect. That’s how I avoid ‘toxic’ and ‘baiting’ content, I guess? But still, with the ‘almighty algorithm’, it’s hard to tell who is really the one getting manipulated here lol