Be Aware of the Algorithm

Comparing one's 'Explore' or 'For You' page on a platform with someone else's shows how unalike personalised content can be. This personalisation of content on online platforms is produced by algorithms, which vary across platforms. According to Swart, most 'ordinary' users are often not even aware of the presence of algorithms, because algorithms (should) blend into the background when behaving as expected.1 She also states that netizens learn about, conceptualise and understand algorithms through use and experience with them.2 It can thus often be difficult (especially in the beginning) to recognise or be aware of the algorithm in different online spaces, a difficulty that lessens with more experience in the digital space.

“Make sure to feed the algorithm with a like and leave a comment for dessert and engagement.”

Edvasian, 20233

How We View the Algorithm

According to Swart's findings, users hold three perceptions of the algorithm: 1) positive: algorithms are seen as useful guides that highlight relevant stories with a beneficial recommender system, 2) neutral: algorithms are "a mathematical, rational process" "that requires input to generate a certain output", and 3) negative: algorithms are "powerful forces intended to stimulate purchases" and/or are "linked to issues of censorship".4

As I've said earlier, Swart states that people make sense of algorithms through use and experience with them, but she continues to explain that this revolves around three elements: 1) platforms, to "compare how different platforms worked and assess algorithms in relation to each other", 2) features, "to compare algorithms within an app or website", and 3) type of content, "i.e. recommendations vs advertisements vs 'regular' content".5

Swart also discovered that users' expectations rise when they are aware of being surveilled (by the algorithm), because "there should be a fair trade-off between giving up data privacy and ease of use".6 However, she also saw that users reflect on algorithms spontaneously when confronted with unexpected or confusing results.7 She explained that "algorithms that functioned too well created the uncomfortable feeling of being watched or even exploited, especially when users could not logically deduce its assumptions from the data they had consciously supplied".8

(Dangerous) Power of the Algorithm

Dogruel, Masur & Joeckel stated that "algorithms govern our everyday internet experiences" and explained that algorithmic decision-making/curation has been associated with various potential risks and an increase in negative effects on an individual and societal level, which can range "from privacy infringement, forms of political and economic manipulation, censorship, and discrimination to biases in computing outputs".9 They explained further that algorithmic operations can exploit and manipulate the decision-making of users, which negatively affects users' "autonomy in navigating online environments".10

According to Dogruel, Masur & Joeckel, it is still challenging to know and/or understand "the degree to which algorithms are based on their own constantly evolving nature or (…) their context-dependency".11 They also mentioned that algorithms have a "mutual shaping with user interactions", which (alongside their evolution over time) creates difficulties when they are approached with "predefined assumptions about factual knowledge".12

https://youtu.be/nfczi2cI6Cs?si=harNU_OLykinRUkE

Liberation from the Algorithm

There is a way to break free from the algorithm, which is by raising our 'algorithm literacy', according to Dogruel, Masur & Joeckel.13 They defined algorithm literacy as "being aware of the use of algorithms in online" spaces and "knowing how algorithms work" ("the types, functions, and scopes of algorithmic systems" in online spaces).14 They also predicted that raising one's algorithm literacy "may serve as types of defense mechanisms against" the "potentially negative effects of algorithmic curation" by identifying, understanding and countering its impact.15 Awareness and knowledge of algorithms in online spaces are thus crucial.

Another method, besides raising our algorithm literacy, is what Swart calls "tactics to 'challenge' the algorithm", but she discovered that users rarely employ them because: 1) users imagine "their own role in shaping algorithms as limited", 2) challenging the algorithm conceptualises "users as rational beings" who engage consciously and deliberately, 3) challenging algorithms takes effort, and 4) users are overall "reasonably content with how recommender algorithms classified them".16

While the aforementioned methods rely on the users of digital spaces themselves, Dogruel, Masur & Joeckel also argue that "successful changes in policy measures (…) might impact users' awareness of algorithm use in such a manner that adjustments to the scale might be necessary for future applications".17

“(…) we’re interacting with algorithms, in our everyday life, more and more. We are training them, and they’re training us. So we have to study this so we understand it better and we don’t let it go in directions that are harmful to society or to certain groups of people.”

Guillaume Chaslot, 202118

Conclusion

While none of the information in this blog post is a secret, and many netizens already have some idea of the algorithm in digital spaces, it does serve as a reminder to be aware of the algorithm. Swart stated that users hold three perceptions of the algorithm: positive, neutral, and negative. She also identified three elements that matter when users try to make sense of algorithms online: platforms, features, and type of content. In addition, she saw that users reflect on algorithms spontaneously when confronted with unexpected or confusing results.

While the perception of the algorithm can be positive, Dogruel, Masur & Joeckel explained that algorithmic decision-making has been associated with various potential risks and an increase in negative effects on an individual and societal level. They continued to explain that it is challenging to know and/or understand "the degree to which algorithms are based on their own constantly evolving nature or (…) their context-dependency".

Liberation from the algorithms in digital spaces can be pursued by raising our 'algorithm literacy', and Swart described "tactics to 'challenge' the algorithm", though participation in them is low. Dogruel, Masur & Joeckel also argue that "successful changes in policy measures (…) might impact users' awareness of algorithm use in such a manner that adjustments to the scale might be necessary for future applications". Many things can be done to make users more aware of algorithms in digital spaces, which can benefit them.


  1. Swart, 2021. 3. ↩︎
  2. Swart, 2021. 3. ↩︎
  3. Edvasian, 2023. 15:55-16:00. ↩︎
  4. Swart, 2021. 6. ↩︎
  5. Swart, 2021. 5. ↩︎
  6. Swart, 2021. 6. ↩︎
  7. Swart, 2021. 5. ↩︎
  8. Swart, 2021. 6. ↩︎
  9. Dogruel, Masur & Joeckel, 2022. 116-129. ↩︎
  10. Dogruel, Masur & Joeckel, 2022. 116. ↩︎
  11. Dogruel, Masur & Joeckel, 2022. 119. ↩︎
  12. Dogruel, Masur & Joeckel, 2022. 130. ↩︎
  13. Dogruel, Masur & Joeckel, 2022. 116. ↩︎
  14. Dogruel, Masur & Joeckel, 2022. 118. ↩︎
  15. Dogruel, Masur & Joeckel, 2022. 116-120. ↩︎
  16. Swart, 2021. 7. ↩︎
  17. Dogruel, Masur & Joeckel, 2022. 130. ↩︎
  18. The Wall Street Journal, 2021. 12:31-12:55. ↩︎

Bibliography

  • Edvasian. “The Chronically Online Generation – Over Sharing is Bad.” YouTube, January 18, 2023. Video, 16 min., 14 sec. https://youtu.be/yfHy7yxwuEs?si=iS9lP1luob9L2mrj.
  • Dogruel, Leyla, Philipp Masur, and Sven Joeckel. “Development and Validation of an Algorithm Literacy Scale for Internet Users.” Communication Methods and Measures 16, no. 2 (2022): 115–33. doi:10.1080/19312458.2021.1968361.
  • Swart, Joëlle. “Experiencing Algorithms: How Young People Understand, Feel About, and Engage With Algorithmic News Selection on Social Media.” Social Media + Society 7, no. 2 (2021). doi:10.1177/20563051211008828.
  • The Wall Street Journal. “How TikTok’s Algorithm Figures You Out | WSJ.” Video investigation, July 21, 2021. YouTube, 13 min., 2 sec. https://youtu.be/nfczi2cI6Cs?si=02oeSU4OfsFEVZZ3.