Artificial intelligence now generates child sexual abuse material without a real victim — and creates convincing fake intimate images of real young people. Both are criminal. Both are happening in UK schools. Here is what every DSL and parent needs to understand.
The Internet Watch Foundation's 2024 annual report confirmed that reports of AI-generated CSAM to the IWF rose by over 300% between 2022 and 2024.[1] Many of these images are now indistinguishable from real photographs. Separately, deepfake intimate images of real young people, often created by peers at the same school, are an increasingly common form of image-based abuse.
AI-Generated CSAM — Already Criminal
Under the Protection of Children Act 1978, taking, making, or distributing any indecent photograph or pseudo-photograph of a child is criminal, and an AI-generated image that appears to be a photograph counts as a pseudo-photograph regardless of whether a real child was involved. Maximum sentences: 10 years for making or distributing; 5 years for possession, which is a separate offence under section 160 of the Criminal Justice Act 1988.[4]
Deepfake Intimate Images — Sharing and Creation Offences
Sharing an intimate image without consent, including a deepfake, is an offence under provisions the Online Safety Act 2023 inserted into the Sexual Offences Act 2003.[2][3] The Criminal Justice Bill 2024 went further, proposing an offence of creating a deepfake intimate image without consent, regardless of any intent to share. And where the person depicted is under 18, creation is already criminal under the 1978 Act: a pupil who creates such an image of a classmate, even if they never share it, is committing a criminal offence.
Citations
[1] Internet Watch Foundation (2024). IWF Annual Report 2024. iwf.org.uk.
[2] CPS (2024). Intimate Image Abuse — Legal Guidance. cps.gov.uk.
[3] HM Government (2023). Online Safety Act 2023. legislation.gov.uk.
[4] NSPCC (2024). Indecent Images of Children: Information for Professionals. NSPCC Learning.