A student from New Jersey is taking legal action against a peer for allegedly producing and circulating AI-generated pornographic images of her and other classmates.
According to a lawsuit filed in the United States District Court for the District of New Jersey, a classmate used an “AI software or website” to manipulate photos of a 15-year-old student, referred to as Jane Doe, and other female students at Westfield High School. The photos had originally been shared on Instagram.
The AI tool was allegedly used to digitally remove the clothing of Jane Doe and her classmates in the photos, producing new images that depicted the girls nude while keeping their faces recognizable, the lawsuit states.
The lawsuit asserts that these nude images of Jane Doe and other young women were virtually indistinguishable from authentic, unaltered photos.
During the summer of 2023, the classmate who allegedly created the images distributed them to classmates, and potentially others, over the internet and Snapchat. Snap, the parent company of Snapchat, told CBS News that its policies explicitly prohibit the sharing of such images and that its platform cannot be used for such purposes.
In a statement, Snap emphasized that it has a zero-tolerance policy for any form of sexual exploitation within its community.
In October 2023, Jane Doe’s parents were informed by her Union County high school, which is not named in the lawsuit, that the images existed. The school’s assistant principal confirmed that Jane Doe was a “victim” and that the images were real, and the assistant superintendent said a student had reported seeing nude images of Jane Doe to school authorities.
The lawsuit also says Jane Doe’s father was in contact with the Westfield Police Department. Although the department promptly opened an investigation, no charges were filed because the information school officials had could not be used in the inquiry.
The lawsuit further alleges that the defendant and potential witnesses have not cooperated with law enforcement, hindering efforts to determine how widely the images were shared and to ensure that they are deleted and never published.
The lawsuit underscores the lasting social and psychological harm suffered by victims like Jane Doe from the creation and circulation of such images, including distress, anguish, stress, shame and humiliation, harms that legal remedies may not fully redress.
The lawsuit seeks a temporary restraining order barring the defendant from sharing the photos or disclosing Jane Doe’s identity, damages of $150,000 for each disclosure of a nude image, and compensatory and punitive damages to be determined at trial. It also asks that the defendant be required to turn over all images to Jane Doe so that they can be deleted and any copies destroyed.
Jane Doe’s attorney, Shane Vogt, said he hopes the case will show that victims have ways to protect themselves against the spread of AI-generated explicit content.
With the rise of AI-generated explicit imagery, several states have enacted legislation making its dissemination unlawful. New Jersey is currently developing a bill that would criminalize deepfake pornography and impose penalties, including imprisonment, on people who share such manipulated content. President Joe Biden’s executive order in October barred the use of AI to create non-consensual sexual imagery or child sexual abuse material.
Kerry Breen
Kerry Breen is a news editor at CBSNews.com. A graduate of New York University’s Arthur L. Carter Journalism Institute, she previously worked at NBC News’ TODAY Digital. She covers current events, breaking news and issues including substance use.