
AI-Generated CSAM and Deepfakes: What Schools Must Know in 2025

Artificial intelligence can now generate child sexual abuse material with no real victim, and it can create convincing fake intimate images of real young people. Both are criminal. Both are happening in UK schools. Here is what every designated safeguarding lead (DSL) and parent needs to understand.

By The Safeguard Hub Team · May 2026 · Last reviewed May 2026 · 10 min read · Part of The Safeguard Hub Articles Series

The Scale of the Problem

The Internet Watch Foundation's 2024 annual report confirmed that reports of AI-generated CSAM to the IWF rose by over 300% between 2022 and 2024.[1] The most realistic of these images are indistinguishable from genuine photographs. Separately, deepfake intimate images of real young people, often created by peers at the same school, are an increasingly common form of image-based abuse.

The Law: What Changed in 2023–2025

AI-Generated CSAM — Already Criminal

Under the Protection of Children Act 1978, taking, making, or distributing an indecent photograph or pseudo-photograph of a child is a criminal offence, and an AI-generated image that appears to be a photograph of a child counts as a pseudo-photograph, regardless of whether any real child was involved. Simple possession is an offence under section 160 of the Criminal Justice Act 1988. Maximum sentences: 10 years for making or distributing; 5 years for possession.

Deepfake Intimate Images — New Offence (2025)

The offence of creating a sexually explicit deepfake of an adult without consent, regardless of any intent to share it, was proposed in the Criminal Justice Bill 2024 (which fell at the dissolution of Parliament) and enacted in the Data (Use and Access) Act 2025. Where the person depicted is under 18, the image is in any case an indecent pseudo-photograph under the 1978 Act. A pupil who creates such an image of a classmate is therefore committing a criminal offence even if they never share it.

When a School Discovers CSAM or Deepfake Images

  1. Do not view, copy, or forward the images. Record a description of what was seen rather than the material itself; only police officers with the appropriate authority may lawfully view suspected CSAM.
  2. Secure the device in a sealed bag, note the time of discovery and who found the material, and preserve the device unaltered for the police.
  3. Report to the police: call 999 if a child is at immediate risk, and 101 otherwise, whether the material is CSAM or a deepfake intimate image. Both should also be reported to the IWF at iwf.org.uk.
  4. Make a same-day referral to your local Multi-Agency Safeguarding Hub (MASH). The child depicted must be treated as a victim of harm, and any disclosure to them handled with extreme sensitivity.
  5. Treat the perpetrator as a safeguarding concern too — not only a disciplinary matter. Young people who create CSAM may themselves have been groomed or exploited.

Citations

[1] Internet Watch Foundation (2024). IWF Annual Report 2024. iwf.org.uk.

[2] CPS (2024). Intimate Image Abuse — Legal Guidance. cps.gov.uk.

[3] HM Government (2023). Online Safety Act 2023. legislation.gov.uk.

[4] NSPCC (2024). Indecent Images of Children: Information for Professionals. NSPCC Learning.

