U.S. Lawmakers Aim to Combat AI-Generated Child Sexual Abuse Material

Congressman introduces bill to address AI-generated child sexual abuse material, highlighting the urgent need to update legal frameworks and bolster technical awareness to prevent, detect, and prosecute these crimes.

Congressman Nick Langworthy (NY-23) has introduced the Child Exploitation & Artificial Intelligence Expert Commission Act to address the growing problem of child sexual abuse material (CSAM) created using artificial intelligence (AI). The proposed legislation would establish a commission to develop a legal framework to assist law enforcement in preventing, detecting, and prosecuting AI-generated crimes against children.

The commission would investigate how AI can be used to commit child exploitation crimes, evaluate whether such crimes can be prevented, detected, and prosecuted under current law, and analyze the adequacy of the existing legal framework for charging individuals suspected of using AI to create CSAM. The bill is co-sponsored by several representatives and supported by various organizations, which point to the urgent need to address the challenges posed by the rapid advancement of AI and its potential impact on the exploitation of children.

Why this matters: The proliferation of AI-generated CSAM poses significant challenges for law enforcement and impedes the identification of real child victims. Addressing this issue requires revisiting legal frameworks and bolstering technical awareness to effectively prevent, detect, and prosecute AI-enabled crimes against children as the technological environment rapidly evolves.

According to an annual assessment by the National Center for Missing & Exploited Children (NCMEC), reports of child sexual exploitation online increased by more than 12% in 2023 compared to the previous year, surpassing 36.2 million. The majority of tips were related to the circulation of CSAM, but there was also an increase in reports of financial sexual extortion, where predators lure children into sending nude images or videos and then demand money. The NCMEC also received 4,700 reports of images or videos of the sexual exploitation of children made by generative AI, a new category it started tracking in 2023.

Creating visual depictions of minors engaging in sexually explicit conduct is a federal crime in the United States. The Department of Justice has acknowledged that AI-generated images violating child protection laws remain a concern, but there have been few documented instances of suspects being charged or successfully prosecuted for creating such content, raising questions about whether current legal frameworks can adequately address the issue.

The NCMEC report emphasizes the continued need for action from Congress and the global tech community to address the growing threat of online child sexual exploitation, including the use of AI-generated CSAM. As Congressman Langworthy stated, "The commission would bring together experts to develop tools to effectively address the challenges posed by the rapid growth of AI in the context of child exploitation."

Key Takeaways

  • Congressman Langworthy introduced a bill to address AI-generated child sexual abuse material
  • The proposed commission would develop a legal framework to prevent, detect, and prosecute AI-enabled crimes against children
  • Reports of online child sexual exploitation increased by more than 12% in 2023, surpassing 36.2 million
  • AI-generated images/videos of child exploitation a new category tracked by NCMEC in 2023
  • Current legal frameworks struggle to effectively address AI-generated child exploitation content