The term “Deepfake” emerged in 2017; it is a portmanteau of “deep learning” and “fake”. The technique is widely used to swap faces, manipulate lip movements, implant fake audio, and produce pornographic, political-propaganda, or parody videos, and many fake videos made with it are circulating on the Internet. Media (pictures, audio, video, etc.) is synthesized with artificial intelligence and machine learning to graft the faces and voices of chosen celebrities, politicians, and other well-known figures into adult videos and promotional messages. At first glance, this kind of “movie face-changing” technology can convince unsuspecting viewers that the victim of the spoof actually took part in the performance.
Now that “face-changing” software is readily available, Deepfake spoof videos are more popular than ever, and the technology lends itself to crime when used inappropriately. Deepfake videos have evolved into social-engineering scam tools that are relatively difficult to identify, which makes them very effective at fooling audiences.
Altering media information is now common and cannot be ignored, so how should people recognize a Deepfake? Wired.com interviewed Sabah Jassim (professor of mathematics and computer science at the University of Buckingham) and Bill Posters (co-founder of Spectre), and they suggested four ways to identify Deepfake videos:
- Blink rate: Deepfake subjects blink less often than real people do
- Poor synchronization between voice and lip movement
- Emotional incongruence between expression and speech
- Blurring, stutters, or discoloration in the picture
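The first cue above can be sketched in a few lines of code. This is a minimal illustration only: real human blink rates are roughly 15–20 blinks per minute, and the eye-aspect-ratio (EAR) values below would normally come from a facial-landmark detector; here the values, threshold, and function names are all assumptions made for the sketch.

```python
def count_blinks(ear_per_frame, closed_threshold=0.2):
    """Count blinks as runs of frames where the eye-aspect-ratio
    drops below a threshold (threshold chosen for illustration)."""
    blinks = 0
    eye_closed = False
    for ear in ear_per_frame:
        if ear < closed_threshold and not eye_closed:
            blinks += 1          # eye just closed: one new blink
            eye_closed = True
        elif ear >= closed_threshold:
            eye_closed = False   # eye reopened
    return blinks

def blinks_per_minute(ear_per_frame, fps):
    """Convert a per-frame EAR sequence into a blinks-per-minute rate."""
    minutes = len(ear_per_frame) / fps / 60
    return count_blinks(ear_per_frame) / minutes

# Illustrative 60-second clip at 30 fps: 0.3 = open eye, 0.1 = closed.
# Only one blink in a full minute, far below a typical human rate.
suspicious_clip = [0.3] * 1798 + [0.1] * 2
rate = blinks_per_minute(suspicious_clip, fps=30)
print(f"{rate:.0f} blink(s)/min")   # prints "1 blink(s)/min"
```

A real detector would extract eye landmarks frame by frame and smooth the EAR signal before thresholding, but the comparison against a normal human blink rate is the same idea.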
At present, CEO fraud using Deepfake technology has already appeared. In September 2019, a voice generated with Deepfake technology was used to trick a senior manager in the UK into remitting $243,000 to an account opened by a criminal group. According to the victimized company's insurer, the fake voice imitated not only the voice of the person being impersonated but also his intonation, phrasing, and accent.

Deepfake content is becoming more and more common on the Internet. It uses deep-learning techniques from artificial intelligence to alter the actions and sounds in videos and deliver distorted, untrue content. It often exploits celebrity footage to achieve rapid and widespread dissemination, and it can cause negative effects such as swaying public opinion, disrupting markets, and destroying reputations. The most important defenses against Deepfakes are alertness and critical thinking.

The video below shows how easily a Deepfake video can be produced.

In addition, there are three things to do when you suspect a video is fake:
- Stop: If you have any concerns about the video, do not immediately respond to it, share it, or comment on it.
- Question: What is the original source of this video? Does the person's speech in the video differ from usual? Why is this person or organization sharing the video online?
- Report: When you see suspicious content on the Internet and you are worried or firmly believe it is a Deepfake, report it to the website or application platform where it appeared. YouTube, Facebook, and Twitter are all trying to filter and delete Deepfakes.
https://www.theguardian.com/technology/2020/jan/13/what-are-deepfakes-and-how-can-you-spot-them
I saw the title for this one and I just had to read it! I remember one of the first times I heard about Deepfakes was in the UK at Christmas. Each Christmas, Channel 4 (one of the major TV stations) broadcasts an alternative Queen's speech, usually designed to highlight social issues or poke fun at prominent people in Britain. One particular year, much of the media noted how astoundingly accurate that year's broadcast was: a Deepfake of the Queen. The section on how to identify the use of Deepfake software was insightful and truly shows how dangerously deceptive Deepfakes can be. One thing I did not realise was how easy they are to create!