Every now and again something arises from the shady regions of the internet that makes us all sit up and take notice.
When the phenomenon of the deepfake first arose in 2017, it was an immediate sensation. Using AI, deepfakers can transpose a person’s face and voice (usually a politician’s or celebrity’s) onto someone else’s body, making them appear to do or say things they never did. The famous video of Barack Obama saying things he clearly never said, created by filmmaker Jordan Peele, was a warning of the new era of fake news and deepfake videos.
Deepfake porn quickly became a popular genre, transplanting the faces of well-known people onto the bodies of porn actors, much to the celebrities’ distress. (Scarlett Johansson, a frequent victim, is known to have ‘given up’ trying to combat the deepfake porn featuring her image, declaring the fight a waste of time because the material is so pervasive.) But now from this genre arises the unsavoury spectre of deepfake porn used as blackmail material.
The AI is now so refined that it can scan publicly available social media photos and create deepfake porn from ordinary images – of you and me, if we haven’t sorted our privacy settings – which can then be used to extort and blackmail the unwitting victims. Or, in some cases, simply to humiliate them publicly, as in the case of Rana Ayyub, the Indian investigative journalist who faced a barrage of abuse for trying to expose the case of a child rapist in her home country. She was subjected to a horrific deepfake porn attack that, in her own words, ‘ended up on almost every phone in India’.
According to a report from the security firm Trend Micro, chatter in the more nefarious corners of the web suggests that ransomware using deepfake porn to blackmail victims is not far away: “The attacker starts with an incriminating Deepfake video, created from videos of the victim’s face and samples of their voice collected from social media accounts. To further pressure the victim, the attacker could start a countdown clock and include a link to a fake video...If the victim does not pay before the deadline, all contacts in their address books will receive the link.”
The threat of sending sexual images or videos of a victim to all of their contacts is not new, but the ability to create that imagery out of nothing certainly is. Once again the internet proves that while it’s one of the greatest things ever created, it is also one of the most terrifying.
Sarah Smith
Foundation Speaker