UK to Ban Nudification AI Apps as Part of Wider Strategy to Tackle Violence Against Women and Girls
The UK government has announced plans to ban nudification AI apps, which edit images to remove clothing, as part of a broader strategy to halve violence against women and girls. This new legislation will criminalise the creation or distribution of nudification tools, building on existing provisions in the Online Safety Act concerning sexually explicit deepfakes and intimate image abuse.
Technology Secretary Liz Kendall stated that the law will focus on those who profit from or enable these apps, ensuring enforcement action against such activities. The government will collaborate with technology companies, including SafeToNet, which says its software can identify and block sexual content and disable a device's camera when such content is detected.
The strategy relies on existing platform filters, such as those used by Meta, to detect nudity and prevent distribution. The government aims to make it impossible for children to take, share, or view nude images on their phones and to outlaw AI tools that create or distribute child sexual abuse material (CSAM).
The Internet Watch Foundation welcomed the measures, with its chief executive highlighting that nudification apps have no valid purpose and pose harm to children. The organisation noted that in 19% of confirmed reports, the person reporting said their imagery had been manipulated by such apps. The NSPCC also praised the announcement but urged mandatory device-level protections and more accessible tools to help firms identify and prevent CSAM, including in private messages.
In April, the Children's Commissioner Dame Rachel de Souza had called for a total ban on nudification apps, a position the government's new plans now reflect.