Regulation of Social Media Platform Policy and Monopolies

Note: this is my submission for the retake essay.


The Problems with Social Media

What happens on social media? It seems that it’s not the users who decide; instead, “The Algorithm” is responsible for deciding how content appears on the platforms. What exactly the algorithm does wrong depends on who you ask (Twitter, for example, seems to have at once a liberal and a conservative bias [Huszár et al. 2022][1]), but the consensus is that the workings and policies of social media platforms have effects that reach outside the platforms and into wider society.
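
To make “The Algorithm” a little less abstract, here is a minimal sketch of engagement-based feed ranking. Everything in it (the weights, the field names, the decay formula) is invented for illustration; real ranking systems are far more complex and largely secret, which is part of the problem.

    # A deliberately minimal sketch of engagement-based feed ranking.
    # All weights and field names are invented for illustration.
    from dataclasses import dataclass

    @dataclass
    class Post:
        likes: int
        reshares: int
        age_hours: float

    def score(post: Post) -> float:
        # Reshares weigh more than likes because they expose the post
        # to new audiences; this is what favors viral content.
        engagement = post.likes + 5 * post.reshares
        # Newer posts are boosted; older ones decay out of the feed.
        freshness = 1 / (1 + post.age_hours)
        return engagement * freshness

    def rank_feed(posts: list[Post]) -> list[Post]:
        # The feed is simply the highest-scoring posts: the users
        # don't decide what appears on top, the formula does.
        return sorted(posts, key=score, reverse=True)

Note that nothing in this toy version checks whether a post is true, fair, or even legal; whatever scores well gets amplified.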

A particularly egregious set of examples is Mr. Musk’s latest stunts with his hobby social media platform, Twitter. Here are a few:

  • suspending the accounts of several journalists who reported critically on him
  • banning the @ElonJet account that tracked his private jet, despite an earlier promise not to
  • prohibiting links to competing platforms such as Facebook, Instagram, and Mastodon

These examples highlight two ways in which Twitter negatively affects society:

  1. its policies are opaque, ever-changing, and allow it to influence what is discussed on the platform
  2. it uses its dominance in the microblogging market to harm its competition

Platform Policies

With a society that increasingly relies on platforms for everything from social contact to news aggregation, platform algorithms and policies have effects that reach far outside the platforms themselves.

Because of the role of Twitter—and many other social media platforms—the importance of basic rules that enforce fair and free discussions and protect the rights of users cannot be overstated.

Competition, or Lack Thereof

In an open market, users can move to competing platforms whose policies they like better. But the platforms are often monopolies in their niche:

  • TikTok for short videos
  • Twitter for microblogging
  • YouTube for longer videos
  • Instagram for photo-sharing

And so the platforms can govern in a one-directional way: rules are handed down from above, the users need to adhere to them, and they have no leverage to influence those rules because they have nowhere else to go.

The Role of Governments

If users cannot affect platform policies themselves, governmental institutions and regulatory bodies should step in. Currently, these bodies have hardly any influence on how the platforms are run.

However, the European Union is currently working on implementing the Digital Services Act (DSA) and the Digital Markets Act (DMA), which aim to ensure safe and accountable online environments and fair online markets, respectively.

Both acts are far-reaching and intervene deeply in the governance of platforms. However, they are very much necessary. In this article, I explain why stricter and more proactive regulation of social media platforms, such as the DSA and the DMA, is essential, and which aspects of this regulation are important.

Okay, but What Are ‘Social Media Platforms’?

What is meant by ‘social media platforms’? Instinctively, we understand the term to describe online platforms in which users interact and generate content, such as Facebook, YouTube, and Twitter. But a precise definition is hard to pin down: “I’ll know it when I see it”. Academics face the same problem [Carr and Hayes 2015].

As such, I will stick with the vague description of ‘online platforms in which users interact and generate content’, and hope that the examples I’ll give of how specific platforms have impacted society will give you an idea as well.

Which ‘Effects on Society’ Are We Talking About?

Cultural Impacts

As we’ve all read (right?) in [Nieborg and Poell 2018]’s paper on the influence of platforms on cultural production, “cultural production is contingent on platform policy”. What this means is that the way creative, cultural content is created and shaped is influenced by the policies of the platforms it is published on.

For example, cultural media is much more likely to be viewed if it goes viral, because “platform sharing practices and algorithmic curation tend to favor viral content” [Nieborg and Poell 2018].

Platforms also interfere in the relationship between readers and news organizations by placing themselves as middlemen. This allows them to aggregate and profit from content produced by others (for example Google News, which Australian publishers took issue with), and to influence, through changes in platform policy, how many people reach the sites of media organizations [Nieborg and Poell 2018].

As mentioned before, platform monopolies ensure that users and producers have no alternatives to turn to. A possible consequence is that users and producers adhere to the rules overly strictly, self-censoring out of fear of losing access to the only platform they have.

Public Opinion and Politics

Platforms also gain power beyond the content they host: they can shape public opinion, even of people who are not on the platform.

The moderation of platforms can have a clear political bias [Huszár et al. 2022]. This is not inherently problematic—imagine if there were laws that declared left-wing forums illegal!—but if the moderation is not transparent, the platforms are allowed to shape discussion without oversight.

Another place where the moderation of platforms can be problematic is misinformation. Platforms have a responsibility not to let misinformation proliferate, because the democratic system cannot withstand such influences. See, for example, the “failure of Facebook and other digital platforms to prevent the circulating of misinformation during the 2016 US elections” [Nieborg and Poell 2018] and how this influenced said election [Allcott and Gentzkow 2017].

It Always Comes Back to Geopolitics

All of this is problematic on its own, but with tensions between the great powers, these influences can even be used maliciously by nation-states.

These platforms are each based in some country, but can hold a monopoly in others. While most social media platforms popular in the EU are based in the US, a notable exception is TikTok, owned by China’s ByteDance. It is not without controversy: it is suspected of spying on users for the Chinese government, and is accused of making its app more addictive in markets outside China. But even the US has encouraged its own platforms to spread its ideas and values to other markets [Jin 2013].

In short, states can use the dominance of platforms for geopolitical ends: imperialism by means of internet platforms, or platform imperialism [Jin 2013].

Necessary Regulations

So now that the impacts of social media platforms are clear, it is possible to outline aspects where regulation is necessary:

Allowing Fair and Free Discussions

First, because discussion on social media platforms has such an impact on the rest of society, it is important that it can take place fairly and freely. This means that platform policies must ensure that no opaque censorship happens, provide mechanisms to dispute moderation decisions, fight misinformation and illegal content, protect the privacy of users, and protect vulnerable groups such as children and minorities. In short, certain fundamental rights of users must be protected.

Preventing Market Dominance Abuse

Second, I’ve mentioned that the monopolies of popular platforms in their respective niches ensure that users don’t have alternatives. Even if certain rights are guaranteed, other things users might want can simply be ignored, because a monopolist faces no competitive pressure to provide them.

So it makes sense to think about how market monopolies and dominance should be regulated, to ensure that competing platforms have a chance to exist. Even if a platform doesn’t have a strict monopoly, it can affect other platforms with its dominance over the market, for example by prohibiting linking to competing platforms.

While in the US monopolies are not viewed as inherently unwanted, the EU places a “special responsibility” on dominant firms not to “distort competition” in any market [Keyte 2018].

But online platforms are very likely to use initial successes to gain dominant market positions [Nieborg and Poell 2018], resulting in effective monopolies such as YouTube, TikTok, and Instagram.

We’ve already seen Twitter abuse its position by banning links to other platforms. Social media platforms should not be allowed to use their market dominance to stifle competition.

Preventing Outside Influences

I’ve previously mentioned how TikTok is accused of exerting influence in other countries on behalf of the Chinese government. This highlights the importance of being able to hold a firm accountable, even if its origins lie outside the EU. Spying is another issue, because users’ data can be siphoned to countries where EU legislation offers no protection against abuse.

What’s This About the DMA and DSA?

Here’s a quick overview of both of the acts.

The Digital Markets Act (DMA) is meant to regulate the digital economy, specifically by identifying dominant players and preventing them from abusing their positions. These platforms are referred to as ‘gatekeepers’ because without regulation, their position allows them to stifle competition. With regards to the necessary regulations I identified, this act focuses on preventing market dominance abuse.
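
To give an idea of how a platform ends up designated as a gatekeeper, here is a rough sketch of the DMA’s quantitative criteria as I understand them. It is a simplification: actual designation is a formal procedure by the European Commission that also weighs qualitative factors.

    # Simplified sketch of the DMA's quantitative gatekeeper criteria.
    # Thresholds paraphrase the act; real designation is a formal
    # procedure by the European Commission, not a pure formula.
    def is_presumed_gatekeeper(
        eea_turnover_eur: float,        # annual turnover in the EEA
        market_cap_eur: float,          # or average market capitalisation
        monthly_end_users_eu: int,      # active end users in the EU
        yearly_business_users_eu: int,  # active business users in the EU
    ) -> bool:
        # A firm must be both large as a business...
        significant_size = (
            eea_turnover_eur >= 7_500_000_000
            or market_cap_eur >= 75_000_000_000
        )
        # ...and an important gateway between businesses and consumers.
        important_gateway = (
            monthly_end_users_eu >= 45_000_000
            and yearly_business_users_eu >= 10_000
        )
        return significant_size and important_gateway

Once designated, a gatekeeper faces extra obligations, such as a ban on favoring its own services over those of competitors.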

The Digital Services Act (DSA) introduces new obligations for different types of platforms, progressively more extensive and strict the larger a platform is. With these rules, the act aims to ensure free and fair discussion by forcing platforms to be transparent and fair in their moderation, to proactively fight misinformation and illegal content, and to protect users’ privacy.
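
As a rough illustration of that tiered structure, the sketch below maps platform size to obligations. The threshold of 45 million monthly active EU users for ‘very large online platforms’ is taken from the act, but the obligation lists are my own abbreviated paraphrases, not the legal text.

    # Sketch of the DSA's tiered obligations. The 45 million user
    # threshold for "very large online platforms" (VLOPs) is real;
    # the obligation lists are abbreviated paraphrases.
    VLOP_THRESHOLD = 45_000_000

    PLATFORM_OBLIGATIONS = [
        "publish transparent terms of service",
        "act on notices of illegal content",
        "offer an internal complaint mechanism for moderation decisions",
        "publish transparency reports on moderation",
    ]
    VLOP_OBLIGATIONS = PLATFORM_OBLIGATIONS + [
        "assess systemic risks such as disinformation",
        "submit to independent audits",
        "give vetted researchers access to data",
    ]

    def obligations(monthly_active_eu_users: int) -> list[str]:
        # The larger the platform, the longer the list of obligations.
        if monthly_active_eu_users >= VLOP_THRESHOLD:
            return VLOP_OBLIGATIONS
        return PLATFORM_OBLIGATIONS

The point of the tiering is visible in the code: obligations grow with reach, so the heaviest duties fall only on the platforms whose decisions affect the most people.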

We’re Making Progress, But…

Both the DMA and the DSA come into full force early in 2024. I identified three main topics where legislation is needed:

  1. Allowing fair and free discussions
  2. Preventing market dominance abuse
  3. Preventing outside influences and data gathering

Good Things

The first two topics are covered relatively well by the DSA and the DMA, respectively, as discussed in the previous section. The DSA additionally offers guarantees about the privacy of users and their data.

In my opinion, the DMA and DSA are very significant steps towards mitigating the harmful effects of social media platforms.

One thing I specifically like is the way both acts impose progressively stricter and more extensive obligations and rules on platforms as they get larger and more influential; I think this will allow smaller competitors to punch above their weight and compete more effectively with the larger ‘gatekeepers’.

Monopolies Are Still Allowed to Exist

However, effective monopolies are still allowed to exist under this regulation. With the problems made clear, I think the market of social media platforms is one where monopolies are inherently unwanted. Only time will tell if the new anti-trust legislation will create a market where no single platform is dominant, but I would’ve liked to see this enforced by the new acts.

Personal Data in Other Countries

We don’t trust firms with Chinese origins with our data, that much is clear. But should we trust the US, whose platforms dominate the European market? Personal data transfers to the US used to be regulated by the Privacy Shield framework, but it was struck down in 2020, and its ‘2.0’ successor almost certainly faces the same fate. I’d hoped to see this addressed more explicitly in the DSA, but it appears we’ll have to wait and see whether Privacy Shield 2.0 holds up.

Enforcement Could Be an Issue

Legislation needs to be enforced to work. As we’ve seen, the GDPR, a similarly ambitious set of regulations, has suffered from a lack of teeth because of inefficient enforcement. The European Commission has previously said that a team of 80 enforcers is needed, but now plans for a team of only 40. Hopefully that will be enough.

We’ll see the effects of the Digital Services Act in February 2024, when it comes into full force, and of the Digital Markets Act in March 2024, when all stated obligations begin to apply.


Notes

[1]: Note that the case for a conservative bias is much stronger. Read the paper; it’s very insightful and a good refutation of accusations of a liberal bias on social media platforms.

Bibliography

  • Allcott, Hunt, and Matthew Gentzkow, ‘Social Media and Fake News in the 2016 Election’, Journal of Economic Perspectives, 31.2 (2017), 211–36 https://doi.org/10.1257/jep.31.2.211
  • Carr, Caleb T., and Rebecca A. Hayes, ‘Social Media: Defining, Developing, and Divining’, Atlantic Journal of Communication, 23.1 (2015), 46–65 https://doi.org/10.1080/15456870.2015.972282
  • Fatema, Shafaq, Li Yanbin, and Dong Fugui, ‘Social Media Influence on Politicians’ and Citizens’ Relationship through the Moderating Effect of Political Slogans’, Frontiers in Communication, 7 (2022) https://www.frontiersin.org/articles/10.3389/fcomm.2022.955493
  • Huszár, Ferenc, Sofia Ira Ktena, Conor O’Brien, Luca Belli, Andrew Schlaikjer, and Moritz Hardt, ‘Algorithmic Amplification of Politics on Twitter’, Proceedings of the National Academy of Sciences, 119.1 (2022) https://doi.org/10.1073/pnas.2025334119
  • Jin, Dal Yong, ‘The Construction of Platform Imperialism in the Globalization Era’, TripleC: Communication, Capitalism & Critique. Open Access Journal for a Global Sustainable Information Society, 11.1 (2013), 145–72 https://doi.org/10.31269/triplec.v11i1.458
  • Keyte, James, ‘Why the Atlantic Divide on Monopoly/Dominance Law and Enforcement Is So Difficult to Bridge’, Antitrust, 33.1 (Fall 2018)
  • Nieborg, David B, and Thomas Poell, ‘The Platformization of Cultural Production: Theorizing the Contingent Cultural Commodity’, New Media & Society, 20.11 (2018), 4275–92 https://doi.org/10.1177/1461444818769694