According to a concerning report, there has been a surge in the development of artificial intelligence (AI) applications that generate nude images of women based on their fully clothed pictures.
According to Graphika’s analysis released today (Friday), these specialized services are moving beyond “niche internet forums” to become automated, widespread online businesses.
These technologies alter existing images and videos of individuals without their consent to create simulated nude depictions.
In a recent incident, a high school student in New Jersey fabricated pornographic images of his female classmates, part of a disturbing pattern of AI-generated abuse. In a small city in Spain, 20 girls aged between 11 and 17 reported being targeted by similar AI tools.
Graphika’s research reveals that referral links to these services on platforms such as Reddit and X (formerly Twitter) have surged by more than 2,000% since the beginning of 2023. In September alone, websites offering such AI-generated nude content drew 24 million unique visitors.
These AI nude image apps leverage open-source AI image generators to build a full-fledged online industry that mirrors established e-commerce practices.
According to Graphika, several services adopt a freemium model, offering initial image generations for free before charging for advanced features such as higher-quality exports, age and body customization, and AI photo-editing options like inpainting.
Payments for these services are processed through mainstream channels like PayPal and Stripe, as well as cryptocurrency payment services such as Coinbase Commerce.
Impact on Victims
Victims of these applications endure feelings of shame and vulnerability. Parents of New Jersey schoolchildren expressed fear and uncertainty about the potential spread of nude images of their daughters.
Dorota Mani, whose 14-year-old daughter’s image was used in an AI-generated nude, expressed concerns about the long-term repercussions on her daughter’s professional, academic, and social life.
In Spain, a student reportedly tried to extort money from a classmate and, when she refused, threatened to circulate an AI-generated nude image of her.
While there is no federal law specifically prohibiting the creation of fake pornographic content, the production of such material involving minors is illegal in the United States.
In a recent case, a North Carolina child psychiatrist received a 40-year prison sentence for utilizing AI applications to produce nude images of patients, marking a significant legal precedent in this domain.