UK Government to Ban Nudification AI Apps in New Strategy to Protect Women and Children
On 18 December 2025 the UK government announced plans to ban nudification AI apps, which edit images to remove clothing. The move forms part of a broader strategy to halve violence against women and girls.
The new legislation will criminalise the creation and distribution of nudification apps, and those who profit from or enable their use will also face prosecution. This builds on existing protections against sexually explicit deepfakes and intimate image abuse under the Online Safety Act.
The plan specifically targets non-consensual sexually explicit imagery and child sexual abuse material (CSAM). Beyond banning the tools themselves, the government aims to make it impossible for children to take, share, or view nude images on their phones, and seeks to outlaw AI tools designed to create or distribute CSAM.
The strategy also includes collaboration with technology firms to combat intimate image abuse. It continues ongoing work with SafeToNet on technology that identifies sexual content and blocks a device's camera when such content is detected.
The Internet Watch Foundation, which stated that 19% of its confirmed reports involved manipulated imagery, welcomed the steps to ban nudification apps. Children's Commissioner Dame Rachel de Souza had previously called for a total ban on such apps. The NSPCC, meanwhile, urged mandatory device-level protections, which the current plan does not include.