In age of AI, women battle rise of deepfake porn

The proliferation of deepfake pornography underscores the threat of AI-enabled disinformation, which can damage reputations and lead to bullying or harassment. Photo: Stefani REYNOLDS / AFP
Source: AFP

Photo apps digitally undressing women, sexualized text-to-image prompts creating "AI girls" and manipulated images fueling "sextortion" rackets -- a boom in deepfake porn is outpacing US and European efforts to regulate the technology.

Artificial intelligence-enabled deepfakes are typically associated with fake viral images of well-known personalities such as Pope Francis in a puffer coat or Donald Trump under arrest, but experts say they are more widely used for generating non-consensual porn that can destroy ordinary lives.

Women are a particular target of AI tools and apps -- widely available for free and requiring no technical expertise -- that allow users to digitally strip off clothing from their pictures, or insert their faces into sexually explicit videos.

"The rise of AI-generated porn and deepfake porn normalizes the use of a woman's image or likeness without her consent," Sophie Maddocks, a researcher at the University of Pennsylvania tracking image-based sexual abuse, told AFP.

"What message do we send about consent as a society when you can virtually strip any woman?"

In a tearful video, an American Twitch streamer who goes by QTCinderella lamented the "constant exploitation and objectification" of women after she became a victim of deepfake porn. She was harassed, she added, by people sending her copies of the deepfakes depicting her.

The scandal erupted in January during a livestream by fellow streamer Brandon Ewing, who was caught looking at a website that contained deepfaked sexual images of several women including QTCinderella.

"It's not as simple as 'just' being violated. It's so much more than that," she wrote on Twitter, adding that the experience had "ruined" her.

'Hyper-real'

The proliferation of online deepfakes underscores the threat of AI-enabled disinformation, which can damage reputations and lead to bullying or harassment.

While celebrities such as singer Taylor Swift and actress Emma Watson have been victims of deepfake porn, women not in the public eye are also targeted.

American and European media are filled with first-hand testimonies of women -- from academics to activists -- who were shocked to discover their faces in deepfake porn.

Some 96 percent of deepfake videos online are non-consensual pornography, and most of them depict women, according to a 2019 study by the Dutch AI company Sensity.

"The previously private act of sexual fantasy, which takes place inside someone's mind, is now transferred to technology and content creators in the real world," Roberta Duffield, director of intelligence at Blackbird.AI, told AFP.

"The ease of access and lack of oversight -- alongside the growing professionalization of the industry -- entrenches these technologies into new forms of exploiting and diminishing women."

Among a new crop of text-to-art generators are free apps that can create "hyper-real AI girls" -- avatars generated from real photos and customized with prompts such as "dark skin" and "thigh strap."

New technologies such as Stable Diffusion, an open-source AI model developed by Stability AI, have made it possible to conjure up realistic images from text descriptions.

'Dark corner'

The tech advancements have given rise to what Duffield called an "expanding cottage industry" around AI-enhanced porn, with many deepfake creators taking paid requests to generate content featuring a person of the customer's choice.

Last month, the FBI issued a warning about "sextortion schemes," in which fraudsters capture photos and videos from social media to create "sexually themed" deepfakes that are then used to extort money.

The victims, the FBI added, included minor children and non-consenting adults.

The proliferation of AI tools has outstripped regulation.

"This is not some dark corner of the internet where these images are being created and shared," Dan Purcell, chief executive and founder of the AI brand protection company Ceartas, told AFP.

"It's right under our noses. And yes, the law needs to catch up."

In Britain, the government has proposed a new Online Safety Bill that seeks to criminalize the sharing of pornographic deepfakes.

Four US states, including California and Virginia, have outlawed the distribution of deepfake porn, but victims often have little legal recourse if the perpetrators live outside these jurisdictions.

In May, a US lawmaker introduced the Preventing Deepfakes of Intimate Images Act that would make sharing non-consensual deepfake pornography illegal.

Popular online spaces such as Reddit have also sought to regulate their burgeoning AI porn communities.

"The internet is one jurisdiction with no borders, and there needs to be a unified international law to protect people against this form of exploitation," Purcell said.
