Deepfake Technology in the Age of Fake News

Events and information being purposely misrepresented in misleading or simply malicious ways is nothing new, but since Donald Trump's 2016 campaign this kind of misleading information has commonly been referred to as "fake news". An example of the practice can be seen in the BBC's editing of British Prime Minister Boris Johnson's appearance on Question Time: the laughing audience members heard in the original recording were replaced with the sound of applause, which might give viewers the impression that Johnson enjoyed unanimous approval among the audience. Replaced soundtracks, suspicious cuts in video footage, omitted facts and misleading headlines have been the forerunners of fake news, yet by the standard of today's technology these are noticeably simplistic strategies.

This brings us to deepfakes, an extremely problematic and effective editing technology currently being toyed with by hobbyists and researchers alike. Deepfake technology uses artificial neural networks to replace one person's face with another's in photographs and video footage, almost completely seamlessly. Naturally, the targeted person and the person whose face is being used have to have a similar skin tone in order for the face to integrate cleanly with the rest of the body.
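To make the neural-network part less abstract, here is a minimal sketch of the architecture commonly used in early deepfake tools: one shared encoder compresses any face into a small latent code, and a separate decoder is trained per identity. The swap happens at inference time by encoding person A's face and reconstructing it through person B's decoder. Everything below (layer sizes, names like `face_swap`, and the random untrained weights) is illustrative, not the code of any particular tool.

```python
import numpy as np

rng = np.random.default_rng(0)

LATENT = 32      # size of the shared latent code (illustrative)
FACE = 64 * 64   # flattened grayscale face crop (illustrative)

def layer(in_dim, out_dim):
    """Random weight matrix standing in for a trained layer."""
    return rng.standard_normal((in_dim, out_dim)) * 0.01

W_enc = layer(FACE, LATENT)     # shared encoder: learns general face structure
W_dec_a = layer(LATENT, FACE)   # decoder trained only on person A's faces
W_dec_b = layer(LATENT, FACE)   # decoder trained only on person B's faces

def encode(face):
    # Compress a face into the identity-agnostic latent code.
    return np.tanh(face @ W_enc)

def face_swap(face_of_a):
    """Encode A's face, then reconstruct it with B's decoder."""
    return encode(face_of_a) @ W_dec_b

fake = face_swap(rng.standard_normal(FACE))
print(fake.shape)  # a full-size face image comes back out: (4096,)
```

Because the encoder is shared, it learns pose and expression common to both people, while each decoder learns one person's appearance; that split is what lets B's decoder "repaint" A's expression with B's face. In real tools the linear layers are deep convolutional networks trained on thousands of frames.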

Here is an example of deepfake technology being used by MIT researchers. They applied it to Richard Nixon and, using hours of voice recordings, had 'his' voice read a speech they had written. The researchers' aim was to demonstrate the technology by creating a video in which Nixon gives a speech following a hypothetical Apollo 11 catastrophe.

More (harmless) examples can be found online, and chances are one wouldn't be able to tell that most of them aren't genuine. This all seems like fun and games, a sort of party trick to show friends, but it raises the question of how far this technology can be pushed for malicious purposes. If simplistic strategies such as omitting certain pieces of information and writing misleading headlines can have a massive effect on political outcomes, then this highly advanced and effective footage-editing technology is undoubtedly able to do just as much damage as traditional approaches to 'fake news', if not more. Nor is the technology being gatekept by governments or research institutions. While deepfake apps aren't available on the Google Play Store, a Chinese app called "Zao" lets users scan their face and, using deepfake technology, edit themselves into popular TV shows and movies; it is available as an .apk that anyone can download onto an Android phone. More deepfake projects can be found on GitHub, ready to be downloaded by anyone who wishes to have fun or create drama. Impressive as the technology is, it is unrestricted, and it allows anyone to fabricate harmful or merely entertaining fictional exchanges between any people they have pictures or videos of. In this day and age, the creation of deepfake technology and its potential for malicious use shouldn't be shocking, yet it is still worrying to think about what other, perhaps exclusively harmful, technologies are currently being developed.


1 Comment

  • ManonvdG
Posted November 27, 2019 at 9:16 am

Interesting piece. I have heard of and seen this technology before but didn't know it was called deepfake. It is creepy! Because let's be real, there will always be powerful malicious people in this world. I just hope that as this technology evolves, experts keep evolving in their ability to detect such alterations to videos. It is interesting how technology makes this kind of curve. First photography, video, audio, DNA and all these things helped innocent people confirm their innocence. But we will come to an age where everything can be faked and fabricated. For good or bad.
