
UK Government to Ban Nudification AI Apps to Combat Violence Against Women and Girls

Posted 20th Dec 2025


On 18 December 2025, the UK government announced a ban on nudification AI apps, which are software tools that edit images to remove clothing. This move is part of a broader strategy aimed at halving violence against women and girls.

The new legislation will criminalise the creation and distribution of these nudification apps, building upon existing laws on sexually explicit deepfakes and intimate image abuse. Notably, non-consensual deepfakes are already illegal under the Online Safety Act. Offenders, as well as those who profit from or assist in the use of these apps, will face the full force of the law.

To further combat intimate image abuse, the government plans to collaborate with technology companies, continuing partnerships with organisations such as SafeToNet to identify and block sexual content and to deactivate device cameras when such content is detected.

The overarching plan also aims to prevent children from taking, sharing, or viewing nude images on their phones, and to outlaw AI tools that create or distribute child sexual abuse material (CSAM).

The Internet Watch Foundation welcomed the measures. The NSPCC also supported the announcements, while urging mandatory device-level protections and improved detection and prevention of CSAM, including within private messages.

Earlier this year, in April, Dame Rachel de Souza, the Children's Commissioner for England, called for a total ban on nudification apps, highlighting growing concern over the issue.

Sources
BBC: https://bbc.co.uk/news/articles/cq8dp2y0z7wo
* This article has been summarised using Artificial Intelligence and may contain inaccuracies. Please fact-check details with the sources provided.