AI-generated child pornography is surging − a legal scholar explains why the fight against it is complicated and how the law could catch up
(The Conversation is an independent and nonprofit source of news, analysis and commentary from academic experts.)
Wayne Unger, Quinnipiac University
(THE CONVERSATION) The Internet Watch Foundation, an organization that tracks child sexual abuse material posted online, has documented a surge in realistic, AI-generated sexually explicit videos depicting minors over the first half of 2025. Some of the material was derived from images of real minors, and some was wholly synthetic.
The Supreme Court has implicitly concluded that computer-generated pornographic images based on images of real children are illegal. The use of generative AI technologies to make deepfake pornographic images of minors almost certainly falls within the scope of that ruling.
But the legality of the newer, wholly synthetic content is less clear. As a legal scholar who studies the intersection of constitutional law and emerging technologies, I see images that are completely fake yet indistinguishable from real photos as a challenge to the legal status quo.
Policing child sexual abuse material
While the internet’s architecture has always made it difficult to control what is shared online, there are a few kinds of content that most regulatory...