Research on AI-Generated Child Sexual Abuse Material and Global Prosecution Strategies
Legal & Prosecution Frameworks for AI-Generated CSAM
Key elements
Definition and scope: CSAM (child sexual abuse material) traditionally refers to images, videos, or other media depicting real children engaged in sexual activity, or sexualised imagery of minors. AI-generated CSAM adds a new dimension: material created entirely or partly by generative AI (images or videos) that depicts minors in sexualised or abusive contexts, even where no actual child was involved.
Legal treatment: Many jurisdictions have extended their CSAM laws to cover "pseudo-photographs", "computer-generated imagery" or "deepfakes" of minors. For example, the U.S. DOJ has publicly declared that "CSAM generated by AI is still CSAM."
Prosecutorial strategy:
Production/creation of AI-CSAM (even without a real victim).
Distribution/sale of such material (sharing online, selling access).
Possession/access of AI-CSAM.
Soliciting/commissioning such content (i.e., paying for custom AI-CSAM).
Challenges:
Attribution: linking AI-generated material to a specific user and proving criminal intent.
Evidence: capturing metadata, prompt logs, and machine-learning artefacts.
Jurisdiction & global cooperation: content may be produced in one country, hosted in another, distributed globally.
Existing legislation: many legal frameworks were drafted before AI-generated imagery existed; prosecutors must interpret "obscene depiction of minors" or "pseudo-photographs" to cover AI content.
Global cooperation: Because platforms and user networks cross borders, investigations often involve mutual legal assistance treaties (MLATs), multi-national task forces (e.g., Europol), and coordinated raids/seizures of servers and devices globally.
Prosecution strategy:
Seize the device or system where the AI prompts/logs are stored.
Trace payments/subscriptions for AI-CSAM generation.
Use forensic tools to distinguish images of real children from synthetic output, but treat both as criminal where they depict sexualised minors.
Employ sentencing enhancements for AI usage or custom-commissioned content.
Update statutory language (in some jurisdictions) to explicitly cover "computer-generated or AI-generated sexual images of minors".
Case Studies
Case 1: United States – Steven Anderegg (Wisconsin, 2024)
Facts: A 42-year-old software engineer, Steven Anderegg, allegedly used an AI image-generation model (a Stable Diffusion variant) to create more than 13,000 sexualised and explicit images of pre-pubescent minors. He used highly specific prompts (including "negative prompts" to steer output away from adult features) to generate the content. He is also alleged to have sent images to a 15-year-old boy via Instagram, reportedly boasting of his AI skills in his messages.
Legal framework/charges: The U.S. DOJ charged him with producing, distributing and possessing obscene visual depictions of minors engaged in sexually explicit conduct, as well as transferring obscene material to a minor under 16. The key legal principle: even though no real children were depicted, federal law bans obscene visual depictions of minors, and courts have accepted that AI-generated CSAM falls within that prohibition.
Prosecution strategy: Law enforcement traced the Instagram chats; the device seizure revealed the AI model, prompt logs, and thousands of images. Prosecutors emphasised his "special skill" in generative AI, his advanced custom prompt engineering, and the distribution to a minor.
Outcome: The case is ongoing (as of this writing) but could carry up to 70 years' imprisonment if he is convicted on all counts. Significance: this is among the first federal prosecutions in the U.S. to focus on AI-generated CSAM.
Lessons: Sets a precedent that AI-generated sexualised imagery of children is treated as CSAM; demonstrates that prosecutors can treat "synthetic" as "real" under law; underscores the importance of prompt logs and forensic records of AI usage.
Case 2: United Kingdom – Hugh Nelson (2024)
Facts: Hugh Nelson, 27, used 3D modelling/AI software (Daz 3D with AI functions) to generate child sexual abuse imagery. He took commissions from buyers to create custom images of children being abused, using real photographs of children as a starting point and transforming them with AI. He sold the images online and also shared some for free. Police discovered the network via an undercover operation in a chat room.
Legal framework/charges: Under UK law, the creation and distribution of indecent images of children (including pseudo-photographs) is criminal. The court accepted that images derived via AI from photographs of real children count as indecent images.
Prosecution strategy: Investigators traced purchases and payments, peer-to-peer networks, and the use of real photographs of children. At sentencing, the judge noted the "depths of depravity" in the images and the commercial nature of the enterprise.
Outcome: Nelson was convicted of 16 child sexual abuse offences in 2024 and sentenced to 18 years in prison.
Lessons: Confirms that AI-enabled CSAM generation can be prosecuted in the UK; shows that the involvement of real photographs increases gravity; demonstrates that custom orders and commissions act as aggravating factors.
Case 3: Australia – David Bradley Dillon-Henderson (New South Wales, 2024–25)
Facts: In Wollongong, NSW, a 22-year-old man (David Bradley Dillon-Henderson) was found with AI-generated child sexual abuse material on his computer. He admitted using prompts such as "schoolgirl, skirt and leggings" to generate explicit images via an AI tool, storing the resulting material in a folder labelled "open me". The images were AI-generated rather than depictions of real children being physically abused. His girlfriend discovered them and reported him.
Legal framework/charges: Under Australian law (the Criminal Code Act 1995, among other statutes), production and possession of child abuse material is criminal, whether real or simulated. The court examined whether AI-generated imagery qualifies; in this case it supported a guilty verdict.
Prosecution strategy: The case hinged on his admission of prompt usage, his computer logs, the stored AI-generated material depicting children, and the contextual folder naming. The defence argued ignorance of illegality; the court found sufficient evidence of intent and knowledge.
Outcome: He was found guilty; sentencing was pending (scheduled for December).
Lessons: Reinforces that using AI to generate sexualised images of minors attracts criminal liability even without direct victim photographs; shows how prompt logs and folder organisation provide evidentiary support.
Case 4: Denmark / Global Ring – Operation "Cumberland" (2025)
Facts: A coordinated multinational law enforcement operation (led by Danish authorities with support from Europol and partner countries) targeted a criminal ring that offered subscription access to AI-generated CSAM. The service allowed users to pay a symbolic fee and receive AI-generated child sexual abuse images. The operation spanned 19 countries; 25 suspects were arrested, 173 devices seized, and 273 suspected members identified.
Legal framework/charges: Many jurisdictions treat distribution of CSAM (including synthetic material) as criminal. The operation illustrates global cooperation under cybercrime frameworks.
Prosecution strategy: Law enforcement traced payment systems, platform infrastructure, and distribution channels; seized servers and devices; and coordinated across states. They identified the AI-generation platform, the subscription model, and users worldwide.
Outcome: Arrests have been made, devices seized, and prosecutions are ongoing. The case demonstrates how AI-CSAM distribution networks span jurisdictions and require collaborative responses.
Lessons: Highlights the global dimension of AI-CSAM; shows how commercial distribution models increase scale; indicates the need for cross-border legal frameworks and harmonised legislation.
Summary Table of Key Features
| Case | Jurisdiction | AI Aspect | Legal Framework / Charges | Key Evidence | Outcome / Significance |
|---|---|---|---|---|---|
| Anderegg (USA) | U.S. (Wisconsin) | Generated >13,000 AI images via Stable Diffusion | Production/distribution/possession of CSAM | Prompt logs, AI model use, Instagram chats | Landmark federal case treating AI-CSAM as CSAM |
| Nelson (UK) | United Kingdom | 3D/AI tool creating custom images from photos of real children | UK indecent-images law covers pseudo-photographs | Commission orders, sale records, real-photo inputs | First major UK conviction for AI-CSAM |
| Dillon-Henderson (Australia) | Australia (NSW) | AI prompts to create sexualised imagery of minors | Australian production/possession offences | Prompts, folder naming, analysis of AI output | Confirms Australian law applies to AI-CSAM |
| Operation Cumberland | Denmark/19 countries | Subscription-based AI-CSAM distribution network | Distribution of CSAM across jurisdictions | Payment logs, server seizures, multi-state cooperation | Illustrates the scale and global reach of AI-CSAM networks |
Observations & Strategic Insights
Statutory clarity is required: some jurisdictions had to interpret existing CSAM laws (drafted for images of real children) to cover fully synthetic images.
Prompt logs and AI tool metadata matter: prosecutors increasingly rely on logs of AI prompts, negative prompts, and records of which model versions were used.
The commercial dimension aggravates liability: commissioned generation and sales or subscriptions of AI-CSAM elevate the offences.
Global distribution amplifies complexity: networks operating across borders require international law enforcement collaboration and harmonised legislation.
A victimless image does not mean a harmless one: even without real children, AI-CSAM is treated as illegal because it normalises abuse, may facilitate grooming or luring, and in many cases uses images of real children as inputs.
Sentencing is severe and evolving: courts are treating AI-CSAM offences seriously, and case law is emerging rapidly.
Prevention and platform liability: AI model developers and platforms that enable generation or distribution may become part of the enforcement ecosystem (filters, prompt blocking, licensing controls).
Challenges remain: distinguishing lawful fantasy art from illegal sexualised depictions of minors; managing encryption and anonymous networks; and delays in cross-jurisdiction enforcement.