The term was coined by Eli Pariser, who explains it in depth in his TED Talk. The phenomenon was discussed in a previous DMSC class, and I wanted to expand on it. In short, it is the effect that combined algorithms have on the content you get to see. The content you are shown becomes personalised, creating a bubble of your own, shaped by your behaviour.
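To make the mechanism concrete, here is a toy sketch of click-based personalisation (not any real platform's algorithm; the catalogue, topics, and `recommend` function are all made up for illustration). The point is the feedback loop: one click tilts every future recommendation toward the same topic.

```python
from collections import Counter

# Hypothetical catalogue mapping items to topics -- purely illustrative.
CATALOGUE = {
    "cat video": "pets", "dog video": "pets",
    "op-ed A": "politics", "op-ed B": "politics",
    "trailer": "movies", "review": "movies",
}

def recommend(click_history, k=3):
    """Rank items by how often their topic appears in the click history."""
    topic_counts = Counter(CATALOGUE[item] for item in click_history)
    return sorted(CATALOGUE, key=lambda item: -topic_counts[CATALOGUE[item]])[:k]

# A single click on politics content already pushes both politics items
# to the top of the next recommendation list.
print(recommend(["op-ed A"]))  # -> ['op-ed A', 'op-ed B', 'cat video']
```

Real systems are vastly more sophisticated, but the self-reinforcing loop is the same: what you clicked determines what you see, which determines what you can click next.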
This does not pose a problem when it comes to entertainment preferences: why yes, Netflix and YouTube, I would love some more of That Nonsense. One click leads to another, and the next thing you know, you are hours into a binge-watching marathon. This is relatively harmless.
However, it is problematic when it comes to information sharing and spreading. According to Pariser, even the news you get from major outlets is filtered, so your news differs from what others get to see. Should your search results become so personalised that all conflicting information, articles and authors are edited out, then everything you do find confirms your existing information, views and biases.
But this echo effect is nothing new, at least not to me. At some point I had looked up videos of speeches given after school shootings. The comments were mostly in line with the stance taken in each video, with little to no pushback from the opposing side. After all, it is unlikely that someone would click on a video that contradicts their own views.
Within these echo chambers, the mutual influence and reinforcement between like-minded users can become so strong that the shared information is easily perceived as truth. Although I find Reddit great, it also leaves its users very susceptible to this kind of filter bubble. (It is said 4chan is worse, but I have no personal experience with that forum.) Of course, you are more interested in content related to your own views than in diverging or even opposing content, but that is exactly the crux here.
The most concerning example of this, in my opinion, is how incels are perceived. Some argue that the term is a stigmatising caricature. Others paint a more violent picture.
The thing is, it is often unclear whether posts are mocking or earnest. I have the same problem with flat-earthers and anti-vaxxers: is this real or fake? It can be challenging to discern, but it becomes a real problem when threats of violence turn into acts, like the Toronto van attack.
Nevertheless, for those who take everything at face value, these echo chambers cannot be constructive (in the sense of toning the rhetoric down). The subreddit r/incels/ was banned, but r/TheRedPill/, which is similarly extreme or even more so, has only been quarantined, not banned.
What would help, then, to counteract this filter bubble? Randomising your input into the algorithms by clicking out of character? Undoing all personalisation by forgoing those algorithms altogether? Actively pursuing opposing views to try and beat the system?
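The first idea, randomising your input, resembles what recommender-system literature calls exploration. A minimal sketch of the principle (the function and parameters are my own illustration, not an existing API): with some probability, surface a random item instead of the top personalised pick.

```python
import random

def recommend_with_exploration(ranked_items, epsilon=0.3, rng=random):
    """With probability epsilon, return a random item instead of the
    top-ranked one -- a crude stand-in for 'clicking out of character'."""
    if rng.random() < epsilon:
        return rng.choice(ranked_items)
    return ranked_items[0]

# epsilon=0.0 always yields the personalised top pick;
# epsilon=1.0 always yields a random item from the list.
print(recommend_with_exploration(["a", "b", "c"], epsilon=0.0))  # -> a
```

Whether a user can meaningfully apply this from the outside, by deliberately feeding the algorithm noise, is exactly the open question.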
To be frank, I don’t know. But being aware of what is happening is a decent start.