The role of social media in spreading misinformation is undeniable, harmful to individuals and dangerous to society, according to Facebook itself.
What is the Golden Rule of social media? One might say: don’t post anything you wouldn’t want your grandmother to see. Others say: be careful of what you like or whom you befriend. Whatever rule you find most important, it’s usually linked to information, because from a user’s perspective that’s what social media is all about: the exchange of information. You gain knowledge of others, and others (users and the platform as a whole) gain knowledge of you.
As with any form of exchange, it’s entirely possible to be taken advantage of: the information we receive may be worth far less than the information we provide.
This can be seen as one of many explanations for why falsehoods thrive on social media. Conspiracy theories, fake news and alternative facts are all streams of misinformation. Here there is a significant distinction to make from disinformation, which intentionally seeks to deceive. Because the evidence required to certify genuine disinformation is naturally very difficult to obtain, the term can be misused, conflating misinformation with unsubstantiated opinion and thereby perpetuating falsehoods.
Straight from the Horse’s Mouth
But couldn’t all of this be quite easily denounced as yet more biased information? Perhaps, yet it becomes rather more reliable when it comes from the largest of the social media giants.
What mainly motivated me to write about this topic was last week’s news story that Frances Haugen, a former product manager at Facebook, revealed herself to be the whistleblower who leaked thousands of pages of Facebook’s internal research to the Wall Street Journal, in what is perhaps the company’s largest leak since its conception1.
The documents Haugen provided reveal the full extent of Facebook’s knowledge of its effect on users and confirm what many had already guessed: the company was far more concerned with profits than with the public. Internal research papers and copied message boards show how Facebook’s own research concluded that its algorithms, changed in 2018, prioritise negative content because it keeps users on the platform longer and so increases the likelihood of them clicking on adverts. The research found that users engage with negative content (which often perpetuates hate speech or violence) overwhelmingly more than with positive content, since it is comparatively easy to push users to anger rather than to any other emotion. Haugen’s leaked documents also show that Facebook recognised the harmful effect of Instagram’s algorithm on vulnerable teenage girls. Yet still, harmful content was prioritised in order to make those girls spend more time on the app.
“Facebook has realised that if they change the algorithm to be safer, people will spend less time on the site, they’ll click on less ads, they’ll make less money.”2
Frances Haugen, 60 Minutes interview
Haugen further mentions how, following the 2020 US presidential election, Facebook no longer saw the relevance of her civic integrity division, disbanding it and “distributing” the department’s work. This was in spite of the fact that, during the election, Facebook had recognised the danger of harmful misinformation and made the algorithm safer. The highly adverse effects of Facebook’s algorithm have also been noted by other organisations: according to another of Haugen’s leaked internal reports, several European political parties strongly opposed Facebook’s algorithm change because the prioritisation of negative content was pushing them towards increasingly extreme policy. Other threatening details from Haugen’s documents demonstrate how Facebook was utilised by the Myanmar military in 2018 to organise and commit a genocide.
The Science of Conspiracy Theories
Casey Klofstad, Joseph Uscinski and Matthew Atkinson define conspiracy theories as:
“a proposed explanation of events that cites as a main causal factor a small group of persons (the conspirators) acting in secret for their own benefit, against the common good”3
Arguably stemming from distrust, they seek to provide simple answers to complex questions by following the path of that distrust. They range from the benign, like flat-earth theory, to the malicious and harmful, like QAnon.
In an academic paper published in Political Behavior in June 2021, Adam M. Enders and others suggest that increased exposure to social media increases the likelihood of believing in some types of conspiracy and misinformation4. Yet they argue this effect is moderated by a level of paranoia known as “conspiracy thinking”, which causes any occurrence that is difficult to explain, or simply disagreeable, to be misinterpreted as conspiratorial. Examples include the malevolent, far-fetched and deeply sinophobic conspiracies surrounding COVID-19.
A further idea which may support this argument is that people tend to consume a greater proportion of content with which they agree; the dominance of agreeable content can therefore easily create an echo chamber of falsity. This perpetuates both the belief in misinformation and the continued consumption of such content on social media, which, thanks to Haugen, we know is something Facebook recognises and actively allows.
How Can We Tackle Misinformation?
In light of all this, it might seem like social media is out to get us and the only way to stop the cycle is to break the wheel. But I don’t think that’s the answer. This is a collective issue: each of us has to realise the danger of misinformation and learn ways to identify and combat it.
In her 60 Minutes interview and her testimony to the US Congress, Haugen argued that the US government needs to impose sweeping restrictions on social media giants, citing how they have abused their monopolistic power and cannot be left to regulate themselves. Large-scale solutions to the problem require systemic changes such as ardent journalistic rigour and widespread literacy5, which may help to dismantle the distrust many hold in public institutions. These should go hand in hand with individual changes to our personal usage of social media; namely, more varied sources of information and open discussion.
It can easily snowball out of control. Now, back to the beginning: what about that Golden Rule of social media? This might go against the spirit of what I’ve said, but in an apt if almost reductive sense, I think it’s integral to bear one key thing in mind: don’t believe everything you read.