When Miriam al-Adib returned from a business trip, one of her daughters was waiting for her. “Mum, I have something to show you,” the 14-year-old said, opening her phone to a picture of herself, naked. Adib, a gynaecologist and mother of four daughters in Almendralejo, a town in south-western Spain, said: “It’s shocking to see. If I didn’t know my daughter’s body, I would have thought the image was real.”
The photos were deepfakes: numerous fabricated nude images of teenagers were circulating in WhatsApp groups among students in Almendralejo.
The spread of the images led to public bullying; some victims refused to attend school, and others suffered panic attacks. “I feared that these images had made their way to pornographic websites that are still unknown to us,” Adib told the Guardian from her clinic.
The prosecutor’s office said that several individuals face charges for using an app, downloaded from the internet, to generate the images. The creators of the software itself, however, believed to be based in eastern Europe, remain unidentified.
Almendralejo, a quiet town of faded renaissance-era churches near the Portuguese border, thus became an early flashpoint in the rise of AI tools that can create hyper-realistic images with minimal effort. In the weeks that followed, the Spanish case drew global media attention.
While deepfakes of celebrities such as Taylor Swift attract the bulk of media coverage, illicit images of ordinary people cause distress of their own, and law enforcement has struggled to combat them.
A similar scenario unfolded at Westfield High School in New Jersey, where students created and shared explicit deepfake images of many girls, sparking a civil lawsuit and legislative efforts to ban the production and dissemination of such content.
The application central to both incidents in New Jersey and Spain is known as ClothOff.
Though its operators have gone to considerable lengths to remain anonymous, a six-month investigation conducted for a new Guardian audio series, Black Box, has uncovered individuals with apparent links to ClothOff.
The trail leads from Belarus and Russia through European-registered entities to major companies headquartered in London.
ClothOff invites users to “undress anyone using AI” and attracts more than 4 million visitors a month. Accessible from a mobile device, the service asks users to confirm their age and pay roughly £8.50 for 25 credits, after which they can upload photos of individuals and receive versions with the clothing digitally removed.
A Relative and an Associate in Belarus
The investigation identified a Telegram account in the name of Dasha Babicheva conducting business on ClothOff’s behalf, including negotiations with bankers, changes to the website, and business collaborations. The account appears to be based in Minsk, Belarus.
Alaiksandr Babichau, reported to be a relative of Dasha Babicheva, also appears closely linked to ClothOff.
Babichau’s connections to ClothOff extend to interactions with potential business partners; one ClothOff worker identified the app’s “founder” as a person using the Telegram alias “Al”.
Transactions involving ClothOff showed clear efforts to conceal identities, including the creation of Texture Oasis, a firm registered in London that appears designed to obscure financial dealings related to the app.
The investigation also uncovered ties between ClothOff and GGSel, an online gaming platform, raising questions about financial transactions and potential sanctions evasion.
The tangle of these connections underscores how difficult it has become to distinguish authentic online personas from fabricated ones, particularly when they are accompanied by convincing photos and videos.
For further details on this investigation, tune in to the upcoming Black Box episode next Thursday.
For additional information or leads on this story, contact [email protected].