
### Popular Apps That Digitally "Undress" Women: What Are the Risks?


According to a recent study on social media trends, artificial intelligence programs that digitally undress people in photos without their consent are growing rapidly in popularity. These programs manipulate existing images and videos of real people to create a naked appearance, overwhelmingly targeting women.

A report by Graphika, a social media analytics company, examined 34 providers of these services, which generate what researchers call non-consensual intimate imagery (NCII). Shockingly, these platforms collectively attracted 24 million unique visitors in September alone.

One advertisement for such a service boldly states, "Undress Any Girl You Want," highlighting the disturbing nature of this online trend. The accessibility of open-source AI models for image manipulation has fueled the proliferation of these unethical apps and websites. The volume of links advertising these services on Reddit and X has surged by over 2,400% in the past month, underscoring the rapid expansion of this industry.

These NCII enterprises now function as full-fledged online ventures, adopting sophisticated marketing tactics akin to those of established e-commerce businesses. They leverage online payment technologies, influencer marketing, customer referral programs, and advertising on popular social media platforms to promote their services.

While the unauthorized posting of nude images online is not a new phenomenon, the emergence of AI-powered tools has made the creation of fake explicit content alarmingly easy. This poses challenges in distinguishing between authentic and manipulated images.

Law enforcement agencies like the FBI have raised concerns about the use of manipulated photos for sextortion and other malicious activities. Perpetrators exploit advanced editing technologies to create realistic yet fake explicit content, often sourced from social media accounts or the internet, to extort victims. Minors are particularly vulnerable to such exploitation, as evidenced by incidents involving AI-generated nude images of young girls circulated in Spain and New Jersey.

The absence of federal regulations specifically addressing the creation of these manipulated images exacerbates the risks associated with their proliferation. While platforms like TikTok and Meta have taken steps to restrict access to such content by blocking search terms and removing related advertisements, the challenge of combating this unethical practice persists.

Last modified: February 27, 2024