UK Government Targets Nudification Apps
The UK government announced plans to ban so-called “nudification” apps. This move is part of a wider strategy to reduce violence against women and girls.
The new laws will make it illegal to create or supply AI tools that edit images to make it appear as though someone’s clothing has been removed.
Building on Existing Laws
Current legislation under the Online Safety Act already criminalizes creating deepfake explicit images without consent. The new rules expand on this by targeting apps specifically designed to “de-clothe” people digitally.
Technology Secretary Liz Kendall said, “Women and girls deserve to be safe online as well as offline. We will not allow technology to be used to abuse, humiliate, or exploit them.”
Risks of Nudification Apps
Nudification apps use generative AI to realistically remove clothing from images or videos. Experts warn that these apps can cause serious harm, especially when used to create child sexual abuse material (CSAM).
In April, Children’s Commissioner for England Dame Rachel de Souza called for a total ban, stating, “The act of making such an image is rightly illegal – the technology enabling it should also be.”
Collaboration with Tech Companies
The UK government plans to work with tech firms to prevent intimate image abuse. This includes partnering with UK safety tech company SafeToNet, which develops AI software that detects and blocks sexual content and can even disable a device's camera when such material is detected.
Platforms such as Meta already use similar filters to prevent children from taking or sharing intimate images.
Child Protection and Expert Views
The Internet Watch Foundation (IWF) reported that 19% of people whose reports it confirmed said their imagery had been manipulated. Its CEO Kerry Smith welcomed the ban, adding, “Apps like this put children at greater risk and have no reason to exist.”
The NSPCC also supported the move but said the government should go further by implementing device-level protections to prevent children from taking, sharing, or viewing nude images.
Tackling AI-Generated CSAM
The government aims to outlaw AI tools designed to create or distribute CSAM. These measures are part of broader efforts to make digital spaces safer for children and reduce online sexual exploitation.
