Podcast Episode

UNICEF Reveals Alarming Scale of AI Deepfake Child Abuse

February 5, 2026


UNICEF reports that over 1.2 million children across 11 countries have had their images manipulated into sexually explicit deepfakes in the past year. The agency is calling on governments worldwide to criminalize AI-generated child sexual abuse material.

A Disturbing Global Reality

The United Nations children's agency has revealed the shocking scale of AI-generated child sexual abuse, with at least 1.2 million children across 11 countries disclosing that their images were manipulated into sexually explicit deepfakes in the past year alone.

The joint research by UNICEF, INTERPOL, and ECPAT found that in some countries, one in every 25 children has been victimised, roughly equivalent to one child in every typical classroom.

The Harm Is Real

UNICEF has been unequivocal in its messaging: deepfake abuse is abuse, and there is nothing fake about the harm it causes. The agency expressed particular alarm over what it terms "nudification", in which AI tools strip or alter clothing in photos to create fabricated nude or sexualised images of minors.

Children themselves are acutely aware of the danger. In some surveyed countries, up to two-thirds of young people said they worry that AI could be used to create fake sexual images or videos of them.

Calls for Urgent Action

The agency outlined several immediate steps needed to confront this escalating threat. Governments should expand definitions of child sexual abuse material to explicitly include AI-generated content and criminalise its creation, possession, and distribution. AI developers must implement safety-by-design approaches and robust guardrails, whilst digital companies should prevent circulation of such material rather than merely removing it after the fact.

Legislative Progress

Some nations have already taken action. The United Kingdom announced legislation criminalising the possession, creation, or distribution of AI tools designed to generate child sexual abuse material, with offenders facing up to five years in prison. In the United States, the Take It Down Act makes it a federal crime to publish sexually explicit images without consent, with up to three years imprisonment for content depicting minors.
