Deepfake Porn: The Rise of AI-Generated Exploitation

Deepfakes are more commonly used to target women than to influence politics, despite the enormous attention they have received for their potential political dangers. According to Deeptrace, a cybersecurity company, 96% of all deepfakes are non-consensual pornography.

Deepfake pornography can have the same devastating consequences as revenge porn, in many cases leading victims to change their names or completely remove themselves from the internet.

A Europol report finds that deepfake technology could become a staple tool for organised crime.

And "the industry" is thriving these days, now that AI tools make the process easier than ever before.

Seeing Is Believing?

Two years ago, a Telegram bot was used to create explicit, non-consensual images of real women. The extent of this digital violation was horrifyingly vast, affecting a staggering 100,000 victims, including underage girls.

Many of the victims reported that colleagues or friends who saw the images believed they were real.

Last year, a student from Giurgiu used a deepfake to turn a classmate into an adult film actor. The 12-year-old boy needed psychological counseling after the incident, dealing with intense shame.

This also led to the first court ruling in Romania targeting deepfakes.

But without a specific law addressing deepfakes, such cases are judged under amendments to a defamation law, and issues crop up fast. In the court's first verdict, both families had to pay damages to each other, because the boy had allegedly cursed at the classmate who made the deepfake.

Recently, a study conducted by an independent researcher and shared with WIRED found that "at least 244,625 videos have been uploaded to the top 35 websites set up either exclusively or partially to host deepfake porn videos in the past seven years." Over the first nine months of this year, 113,000 videos were uploaded to those websites, a 54% increase on the 73,000 videos uploaded in all of 2022.

How about the law?

A law to address this is close to being passed in the UK, but that is basically it. There are no specific laws protecting victims of non-consensual deepfake pornography in the EU.

We do have "something" in Romania: in May this year, Parliament passed a law proposed by USR four years ago. Publishing compromising images without consent, for revenge, is punishable by up to three years in prison.

However, it does not solve the problem of identifying the person who posts, as long as anyone can post anything from fake accounts.

Now what?

New technologies like Stable Diffusion can create fake porn images in seconds.

The real challenge here is that it is almost impossible to distinguish consensually distributed images from non-consensual ones.

Not being a native English speaker, I tried to check the grammar of some of these phrases with Bard, and it replied: "I'm a language model and don't have the capacity to help with that."

Maybe this is a good start!

It is clear that we need rules: rules that protect people and discourage such actions, and rules that lead to the closure of these sites or force them to properly verify the people who upload pornographic content.

Creating fake erotic images may not be inherently bad. After all, Nude in a Black Armchair by Picasso is a piece of art, telling the story of Marie-Thérèse Walter, aged 22, who had been Picasso's great love for years.

But unfortunately, most deepfake images today have nothing to do with art. They are made without consent, and they are deeply harmful to the victims, leading to serious psychological problems and even suicide attempts.

Disclaimer: This article contains personal opinions of the author and does not reflect the official policy or position of any organization, company, or entity.
