Liv McMahon, Technology reporter
The UK government says it will ban so-called "nudification" apps as part of efforts to combat online misogyny.
The new laws, announced on Thursday as part of a wider strategy to halve violence against women and girls, will make it illegal to create and supply artificial intelligence tools that allow users to edit images so a person appears to have had their clothes removed.
The government said the new offences would build on existing laws covering sexual deepfakes and intimate image abuse.
"Women and girls deserve to be safe both online and offline," said Technology Secretary Liz Kendall.
“We will not stand by while technology is used to abuse, humiliate and exploit people through the creation of non-consensual, sexually explicit deepfakes.”
Creating a sexually explicit deepfake image of someone without their consent is already a criminal offence under the Online Safety Act.
Ms Kendall said the new offence, which makes it illegal to create or supply nudification apps, would mean "those who profit from them or allow their use will feel the full force of the law".
Nudification or "undressing" apps use generative artificial intelligence to make it appear, realistically, as if the person in an image or video has had their clothing removed.
Experts have warned of the growth of such apps and the potential for fake nude images to cause serious harm to victims, particularly when used to create child sexual abuse material (CSAM).
In April, the Children's Commissioner for England, Rachel de Souza, called for a complete ban on nudification apps.
"Creating such an image is rightly illegal, and the technology that enables it to be created should be illegal too," she said in her report.
On Thursday, the government said it would “join forces with technology companies” to develop methods to combat the abuse of intimate images.
This will include continued collaboration with British safety technology firm SafeToNet, it said.
The firm has developed artificial intelligence software that it says can identify and block sexually explicit content, and disable device cameras when they detect sexually explicit material being filmed.
Such technology builds on existing filters implemented by platforms such as Meta to detect and flag potential nudity in images, often with the goal of preventing children from taking or sharing intimate images of themselves.
"No reason to exist"
Plans to ban nudification apps follow previous calls from child safety charities for the government to crack down on the technology.
The Internet Watch Foundation (IWF), whose Report Remove tool allows under-18s to confidentially report explicit images of themselves online, said 19% of verified reports involved images that had been partly or wholly manipulated.
Its chief executive, Kerry Smith, welcomed the measures.
"We are also pleased to see concrete steps to ban these so-called nudification apps, which have no reason to exist as a product," she said.
“Apps like this put real children at even greater risk of harm, and we're seeing the resulting images collected in the darkest corners of the internet.”
While children's charity the NSPCC welcomed the news, its director of strategy, Dr Maria Neophytou, said she was "disappointed" to see no similar "ambition" to introduce mandatory protections at device level.
The charity is among groups calling on the government to make it easier for tech companies to detect and prevent the spread of CSAM on parts of their services such as private messaging.
The government said on Thursday it would make it "impossible" for children to take, post or view nude images on their phones.
It also plans to outlaw artificial intelligence tools designed to create or distribute CSAM.