A few weeks ago, I had to write a paper about a major challenge in global mental health, and one of the discussions that came up was about social media. From what I have learned so far, social media may or may not be bad for you; there are many theories, studies, and even experiments that try to judge whether the applications on our phones really have an impact on our mental wellbeing¹. As I was browsing this topic, I came across a phenomenon called Algorithmic Anxiety.
Algorithmic Anxiety
Popularized by Kyle Chayka in his essay “The Age of Algorithmic Anxiety,” published in The New Yorker, algorithmic anxiety is the unease that stems from the uncertainty surrounding the algorithms on digital platforms; the worry is rooted in the unpredictable and vague way these algorithms work. It is not an actual subtype of any anxiety diagnosis, but social media algorithms have nonetheless affected many people’s lives.
So essentially, one of the core “problems” with social media algorithms is that, most of the time, they are not transparent, which can make using social media an anxiety-inducing experience. The essay also discusses the opaque mechanism of Airbnb’s algorithm, which creates disadvantages for renters and tenants trying to find suitable listings. This seems to be the case for other social media as well, except that some try to tailor your experience to be as personalised as possible. But is that even helpful?
Well, a user on Medium discusses her self-diagnosed algorithmic anxiety and how it shaped her view of digital platform recommendations and social media. She talks about how she couldn’t recreate her Pinterest board, and how stressed she felt that she could not live up to the expectations her board of ideas had set. She developed negative emotions toward herself and was not doing well, mentally or physically, which led her to pose the question, “Is this what I want or what the algorithm wants me to want? […]”
Echo Chamber
Algorithmic anxiety is also closely related to the digital echo chamber: an online environment where people are only exposed to content, information, or opinions that reinforce their existing beliefs and views. At its core, the algorithm merely predicts your interests, or what you are looking for next, which is essentially a way to keep you engaged; and once you are stuck in that vicious loop, it is hard to get out.
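That vicious loop can be illustrated with a toy sketch. To be clear, this is purely illustrative; the topics, scores, and greedy ranking rule here are my own assumptions, not how any real platform’s system works. The point is just that when every view is counted as a signal of interest, the top topic only ever gains ground:

```python
from collections import Counter

# Hypothetical topic pool and starting interest scores:
# the user once lingered slightly longer on a meme.
TOPICS = ["cooking", "fitness", "politics", "memes", "travel"]
interest = {t: 1.0 for t in TOPICS}
interest["memes"] = 1.1

def recommend(interest):
    """Greedy toy recommender: always serve the highest-scoring topic."""
    return max(interest, key=interest.get)

shown = []
for _ in range(100):
    topic = recommend(interest)
    shown.append(topic)
    # The feedback loop: any view counts as engagement, so the shown
    # topic's score rises while the other topics never get a chance.
    interest[topic] += 0.1

print(Counter(shown))  # → Counter({'memes': 100})
```

After a hundred rounds, the feed is 100% memes: the tiny initial lead compounds into a monoculture, which is exactly the echo-chamber dynamic. Real recommenders mix in exploration and many more signals, but the self-reinforcing core is the same.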
I encountered my very own echo chamber during one of my nightly reel scrolls. I have two Instagram accounts, a personal one and a finsta (basically a fake, private one). I don’t usually log on to my personal account, but one night I decided to do my scrolling there. To my surprise, the reels I was shown were the exact same ones I had already watched on my other account. This really annoyed me, because why would I watch the same set of reels twice? It’s not even that funny. This happened so often that I gave up scrolling for reels on my personal account. It made me realize that, even with different accounts, because I have shown interest in the same type of content, I will keep getting that same type of content over and over across my social media. I feel like this defeats the purpose of using social media; I can’t learn from or expose myself to different perspectives and views.
Final Thoughts
It is ironic that social media is now a prominent issue in our society when its original purpose was to connect people. In reality, however, social media companies do not seem to care about this emerging problem. Nevertheless, I do believe that transparency about data use and about how the algorithm works could be communicated better to users.
Recommended Readings
- https://www.injectionmag.com/post/algorithmic-anxiety
- https://www.newyorker.com/culture/infinite-scroll/the-age-of-algorithmic-anxiety
- https://robinalbin.medium.com/my-name-is-robin-and-i-suffer-from-alogorithmic-anxiety-da2e754f9b63
- Srivastava K, Chaudhury S, Prakash J, Dhamija S. Social media and mental health challenges. Ind Psychiatry J. 2019 Jul-Dec;28(2):155-159. doi: 10.4103/ipj.ipj_154_20. Epub 2020 Aug 14. PMID: 33223706; PMCID: PMC7660000. ↩︎
I think this is a very interesting phenomenon. I personally sometimes feel like I don’t have a clear picture of recent events in the news, purely because I am probably in some heavily algorithm-based echo chambers. This definitely makes me doubt my ability to interpret the news sometimes.
I can totally relate to this, as I experience algorithmic anxiety from time to time. The algorithm is biased; it only shows you what you are interested in and curious about. For example, you accidentally spend four extra seconds on a topic you have never heard of, and the algorithm concludes that you love this content. However, you were just curious. After that, this content gets glued to your feed. Even though you realize you don’t like the topic, or even find it repulsive, you just cannot get it off.
Such a great read! I don’t really feel anxious about the content I see online, but I can totally understand how it could get overwhelming. Honestly, I think there are even more reasons why people feel stressed out from being online than you mentioned, which is kind of sad when you think about it. Things like the pressure to keep up with trends, idealized images everywhere, and FOMO really pile on. It’s unfortunate that platforms, initially meant to connect and inspire, have evolved in ways that can lead to so much mental strain.
I really enjoyed your blog! I think the topic of social media and its effects on our mental health is very important. Personally, I would say I observe negative effects on myself. For me, algorithmic anxiety is the fear of no longer having control over myself, and it is triggered when I get stuck in a nightly reels scroll and don’t stop even though I realise that I’m no longer feeling good.
However, in addition to these effects on a personal level, I also associate algorithms with a socio-political anxiety. How stable is our democracy if the public sphere increasingly moves into the digital realm? It sounds good at first, because digital platforms can make information available to a very large number of users with a very low barrier to entry, and many new voices can be heard on them. However, the phenomena of filter bubbles and echo chambers also harbour many dangers: strong fragmentation, polarisation and radicalisation. All of them threaten the basis of democracy, a healthy public sphere. Fortunately, I recently read an article showing that the effects of filter bubbles and echo chambers have so far been vastly overestimated and are not reflected in empirical evidence (Birgit Stark et al., „Maßlos überschätzt. Ein Überblick über theoretische Annahmen und empirische Befunde zu Filterblasen und Echokammern,“ in Digitaler Strukturwandel der Öffentlichkeit: Mediensymposium, ed. Mark Eisenegger et al. (Wiesbaden: Springer, 2021), 303–21). Nevertheless, I am sure that this is a very important topic for our democratic future.
I strongly agree that social media companies should be upfront and transparent about their data use policies, and they should have policies in place that prevent echo chambers from forming. I think the formation of these echo chambers and content loops, dictated by an algorithm that only cares about watch time and retention rates, is very harmful for society overall. It causes conspiracy theories and dangerous, radical social and political beliefs to spread.