AI-Generated 'Revenge Porn' Is Our New, Unfortunate Reality


AI-generated pornography – known as “deepfakes” – is becoming more convincing, seamless and real. People with rudimentary computing knowledge can now use artificial intelligence to swap the faces of actors in pornographic videos with those of people they know. Welcome to a new, terrifying era of revenge porn.

In January this year, a new app was released that gives users the ability to swap out faces in a video with a different face obtained from another photo or video – similar to Snapchat’s “face swap” feature. It’s an everyday version of the kind of high-tech computer-generated imagery (CGI) we see in the movies.


You might recognise it from the cameo of a young Princess Leia in the 2016 Star Wars film Rogue One, which used the body of another actor and footage from the first Star Wars film created 39 years earlier.

Now, anyone with a high-powered computer, a graphics processing unit (GPU) and time on their hands can create realistic fake videos – known as “deepfakes” – using artificial intelligence (AI).

Sounds fun, right?

The problem is that these same tools are accessible to those who seek to create non-consensual pornography of friends, work colleagues, classmates, ex-partners and complete strangers – and post it online.

The evolution of deepfakes

In December 2017, Motherboard broke the story of a Reddit user known as “deepfakes”, who used AI to swap the faces of actors in pornographic videos with the faces of well-known celebrities. Another Reddit user then created a desktop application called FakeApp.

It allows anyone – even those without technical skills – to create their own fake videos using Google’s TensorFlow open source machine learning framework.

The technology uses an AI method known as “deep learning”, in which a neural network learns patterns from large amounts of example data. In the case of fake porn, the software is trained on many images of a person’s face so that it can convincingly map that face onto an actor’s face in a pornographic video.
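For readers curious about the mechanics, the core idea is often described as a shared encoder paired with a separate decoder per person: the encoder compresses any face into a common representation, and decoding with the *other* person’s decoder produces the swap. The following is a deliberately toy sketch of that structure only, not a working deepfake tool; all shapes, weights and names here are illustrative assumptions, and a real system would train these networks on thousands of images.

```python
# Conceptual sketch of the shared-encoder / per-person-decoder idea
# behind face-swap deep learning. Toy linear maps stand in for trained
# neural networks; nothing here is a real face-swapping implementation.
import numpy as np

rng = np.random.default_rng(0)

LATENT = 8   # size of the shared face representation (assumed)
PIXELS = 64  # a tiny stand-in for a face image (assumed)

# One shared encoder compresses any face into the latent space...
encoder = rng.standard_normal((LATENT, PIXELS)) * 0.1
# ...and each person gets their own decoder that reconstructs *their* face.
decoder_a = rng.standard_normal((PIXELS, LATENT)) * 0.1
decoder_b = rng.standard_normal((PIXELS, LATENT)) * 0.1

def encode(face):
    """Compress a face into the shared latent representation."""
    return encoder @ face

def decode(latent, decoder):
    """Reconstruct a face from the latent using one person's decoder."""
    return decoder @ latent

# The "swap": encode person A's face, then decode it with person B's
# decoder, producing B's face in A's pose and expression.
face_a = rng.standard_normal(PIXELS)
swapped = decode(encode(face_a), decoder_b)

print(swapped.shape)  # (64,)
```

The point of the design is that the encoder is trained on both people, so it learns pose and expression rather than identity; identity lives in the decoders, which is why routing one person’s encoding through the other’s decoder yields the swap.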

Known as “morph” porn, or “parasite porn”, fake sex videos or photographs are not a new phenomenon. But what makes deepfakes a new and concerning problem is that AI-generated pornography looks significantly more convincing and real.

Another form of image-based sexual abuse

Creating, distributing or threatening to distribute fake pornography without the consent of the person whose face appears in the video is a form of “image-based sexual abuse” (IBSA). Also known as “non-consensual pornography” or “revenge porn”, it is an invasion of privacy and a violation of the right to dignity, sexual autonomy and freedom of expression.

In one case of morph porn, an Australian woman’s photos were stolen from her social media accounts, superimposed onto pornographic images and then posted on multiple websites. She described the experience as causing her to feel:

physically sick, disgusted, angry, degraded, dehumanised

Yet responses to this kind of sexual abuse remain inconsistent. Regulation is lacking in Australia and elsewhere.

Recourse under Australian criminal law

South Australia, NSW, Victoria and the ACT have specific criminal offences for image-based sexual abuse, with penalties of up to four years’ imprisonment. South Australia, NSW and the ACT explicitly define an “intimate” or “invasive” image as including images that have been altered or manipulated.

Jurisdictions without specific criminal offences could rely on more general criminal laws. For example, the federal telecommunications offence of “using a carriage service to menace, harass or cause offence”, or state and territory offences such as unlawful filming, indecency, stalking, voyeurism or blackmail.

But it is unclear whether such laws would apply to instances of “fake porn”, meaning that currently, the criminal law provides inconsistent protection for image-based sexual abuse victims across Australia.

Recourse under Australian civil law

Victims have little recourse under copyright law unless they can prove they are the owner of the image. It is unclear whether that means the owner of the face image or the owner of the original video. They may have better luck under defamation law. Here the plaintiff must prove that the defendant published false and disparaging material that identifies them.

Pursuing civil litigation, however, is time-consuming and costly. It will do little to stop the spread of non-consensual nude or sexual images on the internet. Also, Australian civil and criminal laws will be ineffective if the perpetrator is located overseas, or if the perpetrator is an anonymous content publisher.

Artificial intelligence makes it easier for people to scrape facial imagery from social media accounts and superimpose it into pornographic videos.

Addressing the gap in legislation

The Australian Parliament is currently debating the Enhancing Online Safety (Non-Consensual Sharing of Intimate Images) Bill 2017. This bill, which is yet to become law, seeks to give the Office of the eSafety Commissioner the power to administer a complaints system and impose formal warnings, removal notices or civil penalties on those posting or hosting non-consensual intimate images.

Civil penalties are up to A$105,000 for “end-users” (the individuals posting the images) or A$525,000 for a social media, internet service or hosting service provider.

Importantly, the proposed legislation covers images which have been altered, and so could apply to instances of deepfakes or other kinds of fake porn.

Prevention and response beyond the law

While clear and consistent laws are crucial, online platforms also play an important role in preventing and responding to fake porn. Platforms such as Reddit, Twitter and PornHub have already banned deepfakes. However, at the time of writing, the clips continue to be available on some of these sites, as well as being posted and hosted on other websites.

A key challenge is that it is difficult for online platforms to distinguish between what is fake and what is real, unless victims themselves discover their images are online and contact the site to request those images be removed.

Yet victims may only become aware of the fake porn when they start receiving harassing communications, sexual requests, or are otherwise alerted to the images. By then, the harm is often already done. Technical solutions, such as better automated detection of altered imagery, may offer a way forward.

To adequately address the issue of fake porn, it will take a combination of better laws, cooperation from online platforms and technical solutions. As with other forms of image-based sexual abuse, support services and prevention education are also important.

Nicola Henry, Associate Professor & Vice-Chancellor's Principal Research Fellow, RMIT University; Anastasia Powell, Associate Professor and ARC DECRA Fellow, Criminology and Justice Studies, RMIT University, and Asher Flynn, Senior Lecturer in Criminology, Monash University

This article was originally published on The Conversation.


Comments

    You can't put the toothpaste back in the tube. The only way to beat this is for deepfakes to become MORE prolific and well-known in the media. Then if someone makes one of you, people will just respond with "oh, another one".

    What really needs to be done is to increase the punishment for these acts. People do this because they don't think they will be punished.

    I'm reminded of Robin Williams - apparently in his will, there was a section banning the use of his likeness [after death] through digital manipulation.
