
Leveraging AI for the Increasingly Popular “Nudification” of Women in Photos: Exploring the Legal Void

Experts report a surge in the use of applications and websites that employ artificial intelligence to digitally remove clothing from images of people.

In September alone, as reported by the social network analysis company Graphika, approximately 24 million individuals accessed platforms that offer such image manipulation services.

These services, often referred to as undressing or “nudify” apps, rely heavily on popular social networks for marketing. Analysts note that the volume of links advertising these apps on platforms such as X and Reddit has spiked by more than 2,400% since the beginning of the year.

Utilizing AI technology, these services digitally remove clothing from people in photos. Notably, many of them work only on images of women.

These applications exemplify a concerning trend: non-consensual intimate content produced using advances in artificial intelligence, commonly known as deepfakes. The images are often sourced from social media and then altered and distributed without the subjects’ consent, raising significant legal and ethical concerns.

Recent reports from Japan highlight a concerning rise in AI-generated explicit content involving minors, prompting calls for stricter regulation.

The rapid rise in the popularity of these services can be attributed to the proliferation of open-source AI models capable of producing highly realistic images. Because these models are freely available, app developers can use them to build such products.

Santiago Lakatos, a researcher at Graphika, noted that the quality of deepfake images has improved significantly, producing far more realistic results than earlier efforts.

Instances of inappropriate advertising have been observed, such as an image on X promoting an undressing app with language that could encourage harassment of the people depicted. Additionally, sponsored content on Google’s YouTube promotes similar software, in violation of Google’s policies against sexually explicit material.

Providers offering these services for US$9.99 a month claim to be attracting substantial user bases and growing website traffic, even as they face scrutiny for potentially breaching advertising rules. Efforts to reach X and Reddit for comment have been unsuccessful.

Privacy advocates express concern over the misuse of AI to create non-consensual intimate content, a longstanding problem exacerbated by recent advances in generative AI.

While federal laws do not explicitly prohibit deepfake pornography, regulations do prohibit producing such material involving minors. Notably, a North Carolina child psychiatrist received a 40-year prison sentence for using undressing apps on photos of his patients, a significant legal precedent.

Major platforms like TikTok and Meta Platforms Inc. have taken steps to restrict searches related to undressing software, acknowledging the potential harm associated with such content. However, the widespread availability of these services poses challenges for law enforcement and victims seeking recourse against such violations.
