
### The Surge in Popularity of AI-Powered “Nudify” Apps

The services, many of which only work on women, are part of a worrying trend of non-consensual porn…

Researchers say apps and websites that use artificial intelligence to digitally remove clothing from images of people are surging in popularity.

In September alone, an estimated 24 million people visited websites offering these image manipulation services, according to the social network analysis company Graphika.

These services, often referred to as “undressing” or “nudify” apps, frequently rely on prominent social networks for promotion. According to Graphika, the volume of advertisements for these apps on platforms such as X and Reddit has spiked by more than 2,400% since the beginning of the current month. The services use AI algorithms to digitally undress people in photos, and many of them work only on images of women.

The proliferation of such applications underscores a concerning trend: non-consensual pornography generated and circulated thanks to advances in artificial intelligence, a category of fabricated content commonly known as deepfakes or synthetic media. Because the source images are typically taken from social media without the subject’s knowledge, these services raise significant legal and ethical problems.

Notably, an advertisement on X for one image manipulation app used language suggesting that users could create altered images of a person and then send them to that person without their consent, effectively inviting harassment. Sponsored content for another of these apps also appears prominently in results on Google’s YouTube for the search term “nudify.”

Google has stated that advertisements featuring “sexually explicit content” are prohibited and that it has removed ads that violate its policies. X and Reddit did not respond to requests for comment.

The rise of AI technology has made manipulated content far easier to produce, raising alarm among privacy advocates. Eva Galperin, director of cybersecurity at the Electronic Frontier Foundation, notes that a growing number of ordinary people, including students, are creating such images.

While existing law bans producing non-consensual explicit material involving minors, there is currently no federal legislation specifically addressing deepfake pornography. Notably, a North Carolina child psychiatrist received a 40-year prison sentence for using image manipulation software on photos of his patients, marking the first prosecution under laws prohibiting the creation of such material.

Platforms such as TikTok and Meta Platforms Inc. have moved to restrict searches related to image manipulation software, citing potential violations of their guidelines; TikTok, for instance, has blocked searches for the term “undress.” Representatives from both companies declined to comment on these measures.
