In its quarterly assessment released on Tuesday, the National Center for Missing & Exploited Children (NCMEC) reported a concerning rise in online child sexual abuse, including the creation of abusive images and videos using artificial intelligence. The organization highlighted a more than 12% increase in online child abuse reports in 2023, with over 36.2 million reports submitted to its CyberTipline.
The majority of reports NCMEC received concerned the dissemination of child sexual abuse material (CSAM), including videos and photos, as well as cases of financial extortion. The organization also identified a disturbing trend: predators using AI-generated CSAM to extort children and their families for money.
The proliferation of deepfaked explicit content involving minors poses a significant challenge because it hinders the identification of real child victims. A Department of Justice attorney in Massachusetts emphasized that producing any sexually explicit material depicting a minor is a federal offense.
NCMEC expressed particular concern about the escalating use of AI to create deceptive content and its harmful impact on children and communities. Even with more than 35 million reports received in 2023, the organization noted challenges with both the quality and the volume of submitted information, underscoring the need for closer collaboration among the public, technology companies, and law enforcement to combat online child exploitation effectively.