YouTube censorship

In June 2019, YouTube temporarily suspended the account of the archive of Alkmaar. YouTube's reason? The archive had uploaded historical footage to its account, including a few videos on World War II, which was considered inappropriate under YouTube's updated guidelines. After national media coverage and repeated appeals to YouTube, the archive managed to recover its account. An online discussion unfolded. Was YouTube doing the right thing, or was this unnecessary censorship? And why did the archive upload these videos to YouTube instead of hosting them on its own website? Let’s take a brief look at what happened.

YouTube and its struggle with extremist content

For years, platforms like Facebook, Twitter and YouTube have been criticized for letting extremists like neo-Nazis, jihadis and conspiracy theorists use their services to promote their ideas. Since it is free to upload content and share it with millions of users worldwide, YouTube in particular is a convenient and effective medium to use. In 2014, Isis used YouTube for its ‘one billion campaign’, which aimed to recruit users to its cause. Although the terrorist group's more violent videos were removed within hours of being uploaded, videos in which it promoted its ideology could still be found on YouTube.1 Later, when live video was introduced on YouTube, toxic channels eagerly used it to stream their broadcasts.

YouTube was widely criticized for not doing enough against toxic content. Another important point of criticism was the way YouTube recommended videos to users. The algorithm it used to recommend videos tended to favour extremist content.2 It could thereby pull users into a downward spiral towards the ‘dark side of YouTube’, even if they had not initially searched for these kinds of videos.

All of this led YouTube to become more active in its crackdown on toxic content. When Isis was still on the rise, YouTube reportedly created a technology that redirected users interested in Isis towards videos debunking Isis propaganda.3 In 2017 it began limiting recommendations and features on supremacist videos. In June 2019, YouTube went a step further and updated its hate-speech policy, after which it worked more actively to crack down on supremacist content.

Deleting historical videos on World War II

Although YouTube’s willingness to tackle toxic content seems to be a good thing, the June policy change also had unintended consequences. Within a week, multiple media outlets reported on historical videos being taken offline. In Great Britain, history teachers complained that videos in which they lectured about World War II were removed.4 In the Netherlands, the issue was mainly brought to attention by a tweet from the archive of Alkmaar.

The archive’s YouTube account was suspended, apparently for spreading hate speech. It turned out that the reason behind the suspension was a video on its account of Dutch fascist leader Anton Mussert paying a visit to Alkmaar in 1942.

An online discussion followed. Most comments on the tweet supported the archive, arguing that historical footage should not be censored because it educates us about the past. A few comments even compared the censorship to book burnings. Although I think this comparison is a bit of an exaggeration, I agree with the point that historical footage, even on sensitive topics, should be visible online. Those who disagreed with the archive stated that YouTube has the right to choose what content it allows on its website and that we simply have to accept this. In my opinion, this argument is too simplistic. In the online world, platforms are not just products that users consume, obliging them to accept every decision made by the provider. YouTube’s popularity also comes with the responsibility to listen to its users. YouTube is not just an online platform; it is a public space in which users have an active voice.

Some people on Twitter raised questions for the archive. Why did it use YouTube? Couldn’t it simply upload the content to its own website? The archive pointed to the benefits already mentioned: YouTube is convenient to use and reaches a larger audience than the archive could attract elsewhere. A few users also suggested other outlets that would not remove this kind of content. The downside is that these do not attract anywhere near the number of viewers that YouTube does. Moreover, a few of these outlets have arguably taken not censoring videos too far, as toxic and even pornographic content can be found on some of them. That, of course, is not the right place for an official institution to share its content.

Fortunately for the archive of Alkmaar, it recovered its account and YouTube acknowledged that it had made a mistake. However, other historical footage that used to be on YouTube is no longer there, including some footage that I remember being used in my classes. In my opinion, YouTube is the right place to share historical content, even on sensitive topics, but it requires safeguards. The primary distinction between toxic and non-toxic content is that the former creates a place for users to voice and reinforce their toxic ideas. Of course, the latter can also be misused, for instance by neo-Nazis commenting on educational videos about World War II. The main aim, then, should be to target the creation of these kinds of toxic spaces. Besides asking YouTube to work on this, it also requires a certain awareness from those who upload historical content. Should I enable comments on this video? Am I doing enough to put the content in the right context? In this way, the incident with the Alkmaar archive may actually have been an eye-opener, as it forces uploaders to become more aware of the possible consequences of their content.

Footnotes

  1. Imran Awan, ‘Cyber-Extremism: Isis and the Power of Social Media’, Society 54:2 (2017) 138-149.
  2. Chris Stokel-Walker, ‘Youtube’s algorithm keeps suggesting extremist content’, New Scientist 243 (2019) 14.
  3. Séraphin Alava, Divina Frau-Meigs and Ghayda Hassan, Youth and violent extremism on social media: mapping the research (Paris 2017) 16.
  4. Jim Waterson, ‘YouTube blocks history teachers uploading archive videos of Hitler’, The Guardian (2019) <https://www.theguardian.com/technology/2019/jun/06/youtube-blocks-history-teachers-uploading-archive-videos-of-hitler>.