
### Surging Demand for AI-Powered “Nudify” Apps Revolutionizing Photo Editing

It’s part of a worrying trend of non-consensual “deepfake” pornography being developed and distributed.

Experts suggest that the demand for applications and websites utilizing artificial intelligence to manipulate images of individuals without clothing is on the rise.

In September alone, as reported by the social network analysis company Graphika, approximately 24 million individuals accessed websites that offer image manipulation services to undress people.

These services, often referred to as undressing or “nudify” apps, rely heavily on prominent social networks for promotion. According to Graphika, the number of links advertising such apps on social media platforms such as X and Reddit has surged by 2,400% since the beginning of the year. Notably, many of these services work only on images of women.

The proliferation of these programs exemplifies a concerning trend of non-consensual content generated and disseminated in response to advancements in artificial intelligence, particularly in the realm of algorithmically generated media. The unauthorized distribution of images sourced from social media poses significant legal and ethical dilemmas as the subjects are often unaware and have not consented to such manipulations.

One instance highlighted on X involved an advertisement for an app implying that users could create a nude image and send it to the person depicted in the original clothed photo, potentially inciting harassment. Additionally, sponsored content from one of these apps appears prominently in Google’s YouTube search results for the term “nudify.”

Google has stated that advertisements featuring sexually explicit content are prohibited and that it is removing any ads that violate its policies. Neither X nor Reddit responded to requests for comment on the matter.

The issue of non-consensual sexual content has plagued the internet for a considerable time, but privacy advocates are increasingly concerned about the accessibility and efficiency of algorithmic tools enabled by AI advancements.

Eva Galperin, director of cybersecurity at the Electronic Frontier Foundation, notes a worrying shift: ordinary individuals, including high school and college students, are now able to do this with ease.

Victims of such manipulations are often unaware of the existence of these edited images. Even for those who are aware, seeking legal recourse or intervention from law enforcement can be challenging, as highlighted by Galperin.

The ongoing debate surrounding the regulation of AI technologies and their applications remains contentious.

While U.S. law prohibits the production of explicit images involving minors, there is currently no federal legislation specifically addressing the creation of deepfake pornography. Notably, in a recent case a child psychiatrist from North Carolina was sentenced to 40 years in prison for using undressing apps on photos of his patients, a significant legal precedent.

Platforms such as TikTok and Meta have taken steps to restrict searches related to undressing software: TikTok warns users that such content may violate its guidelines, and Meta has blocked keywords associated with searches for it.

Last modified: February 9, 2024