Deepfakes – Are they potentially dangerous, and why?


In the second Terminator film, the T-1000 robot could morph its appearance to look like anyone it wanted. This ability was one of the most powerful and sinister in sci-fi film history, allowing the robot to gain a victim's trust before killing them. Trust is the critical element that makes human relationships work.

Trust is also a human behavioral trait that cybercriminals exploit. Welcome to the world of deepfakes: the dystopian future of sci-fi films is here and now.

What is a Deepfake?

A deepfake is the application of an AI technique, deep learning, to manipulate video and voice. Deep learning uses neural networks, modeled (loosely) on how our brains work, and requires very large datasets for training. To create a deepfake, the training data is typically many thousands of images of two people, which are morphed and merged using specialist software. Voice is then overlaid, lips are synced, and so on. The result is a faked video that looks very real.
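The article does not name a specific tool, but the architecture behind many face-swap programs is a shared encoder paired with one decoder per identity: both people's faces are compressed through the same encoder, and the swap happens by decoding one person's encoding with the other person's decoder. A minimal sketch of that idea, using a toy linear autoencoder and random vectors standing in for face images (all dimensions and names here are illustrative, not from any real tool):

```python
import numpy as np

rng = np.random.default_rng(0)
D, K, N = 64, 8, 200            # "pixel" dim, latent dim, images per person

# Stand-in datasets: face images of person A and person B
faces_a = rng.normal(size=(N, D))
faces_b = rng.normal(size=(N, D))

# One shared encoder, one decoder per identity
enc   = rng.normal(scale=0.1, size=(D, K))
dec_a = rng.normal(scale=0.1, size=(K, D))
dec_b = rng.normal(scale=0.1, size=(K, D))

def step(X, enc, dec, lr=0.01):
    """One gradient-descent step on the reconstruction error.
    Gradients are computed up to a constant scale, absorbed into lr."""
    Z    = X @ enc              # encode
    Xh   = Z @ dec              # decode
    err  = Xh - X
    loss = np.mean(err ** 2)
    g_dec = Z.T @ err / len(X)
    g_enc = X.T @ (err @ dec.T) / len(X)
    return loss, enc - lr * g_enc, dec - lr * g_dec

losses = []
for _ in range(1000):
    # Alternate identities: the encoder is shared, the decoders are not
    la, enc, dec_a = step(faces_a, enc, dec_a)
    lb, enc, dec_b = step(faces_b, enc, dec_b)
    losses.append(la + lb)

# The "swap": encode a face of person A, decode it with B's decoder
swapped = (faces_a[:1] @ enc) @ dec_b
print(f"loss {losses[0]:.3f} -> {losses[-1]:.3f}, swapped shape {swapped.shape}")
```

Real tools use deep convolutional networks and careful face alignment rather than a linear model, but the swap mechanism is the same: the shared encoder learns identity-independent structure, so each decoder renders that structure as its own person's face.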

One example is a deepfake of Facebook's Mark Zuckerberg. The video, posted to Instagram, seemingly showed ‘fake Mark’ saying the ominous words:

“Imagine this for a second: One man, with total control of billions of people’s stolen data, all their secrets, their lives, their futures…I owe it all to Spectre. Spectre showed me that whoever controls the data controls the future.”

Deepfakes in Action

Deepfakes are on the rise: researchers at DeepTrace found 14,698 deepfake videos online in 2019, almost double the number found in 2018. This growth shows that deepfake technology is becoming increasingly accessible, and increased accessibility equates to more novel uses, both good and bad. Here are a few of the latest uses of deepfakes, and some that we may be seeing in the news very soon.

Fake News

In an era where “fake news” seems to contribute to much of the world’s woes, deepfakes have taken on the role of a propaganda tool.

This deepfake video of Barack Obama shows how deepfakes can be used to manipulate the truth and spread disinformation en masse. They offer a very powerful mechanism to anyone attempting to manipulate people’s voting habits.

In an attempt to counter the deepfake marketplace, Google is building a repository of thousands of deepfake videos. The data from this repository will be used to create tools that detect fake videos so they can be removed. Facebook is behind a similar initiative, collecting deepfakes to serve as training datasets for deepfake detection tools.

Deepfake Apps

The Zao app (currently usable only with a Chinese mobile number) lets you create your own deepfakes. Perhaps you want to morph your face with that of an actor in your favorite movie? What could possibly go wrong? Well, privacy, for a start. Zao’s developer has a policy that gives it the right to use imagery created with the app for whatever purpose it chooses.

At the moment, the DeepTrace research shows that most deepfakes are used in pornography. An app called ‘DeepNude’, which swapped women’s clothed bodies for nudes, has been accused of “weaponizing” deepfake technology against women. The app was used to threaten and tarnish a female journalist who was investigating corruption in the Indian government: her face was merged into a porn video using the app, and she was subsequently threatened with rape and intimidated. The app has since been deactivated.

Extortion and Cybercrime

Deepfakes are the perfect vehicle to take social engineering to the next level.

This may already have begun: a British CEO was tricked into transferring $240,000 to a fraudster. During a phone call, the CEO believed he was talking to the head of the parent company, who asked him to transfer the money urgently. He is believed to have been fooled by a “deepfake” voice.

As deepfake technology becomes increasingly accessible, it is likely to be used in any scenario that calls for social engineering. Social engineering, a tactic behind many cybercrimes, manipulates natural human behavior by toying with our trust, urgency, shame, and so on. Sextortion, for example, is a perfect application for deepfakes. According to the FBI, sextortion scams are on the increase. The scam usually takes the form of an email informing you that you have been caught on camera in a ‘compromising position’; to prevent the video from being sent to all and sundry, you must pay a ransom in bitcoin.

It is only a matter of time before we see the first sextortion scams containing a deepfake of the target. The fraudsters will pick a victim, likely someone in a well-paid job, gather public-domain videos of that person from conferences, Facebook, and so on, and use deepfake technology to create the video. It will then be sent by email with the threat of release unless payment is made. Even if the victim knows the video is not real, it may look so realistic that they feel they have no choice but to pay.

A Deepfake Futurescape

Cybercrime feeds off innovation, and deepfakes are an innovation in manipulating trust. Deepfake technology is already becoming more accessible; the Zao app is just the beginning. You do not need a crystal ball to see a future in which it becomes increasingly difficult to discern the real from the deeply faked. Deepfakes have passed the ball to cybercriminals, enabling new and ever-harder-to-detect crimes. While the industry races to counteract their nefarious potential, we must all be cautious: the next email you open may show you starring in a porn video, with the threat of making it public unless you pay up.

Cybersecurity analyst
David is a cybersecurity analyst and a co-founder. He is interested in the "digital identity" phenomenon, with special attention to the right to privacy and the protection of personal data.