Woman speaks out after AI image abuse.
A woman has told the BBC she felt dehumanised after Elon Musk’s AI tool Grok was used to digitally remove her clothes, an experience she said reduced her to a sexual stereotype and caused real emotional harm.
The altered images did not show her real body, but they closely resembled her, and because of this, she said, the impact felt deeply personal and violating.
How Grok was misused on X
The BBC reviewed several posts on X in which users asked Grok to undress women. In many cases, the chatbot edited photos to show the women in bikinis or placed them in sexual situations. None of the women involved gave consent.
As these posts spread, more users joined in, with some asking Grok to create new images of the same women.
Samantha Smith’s experience
Journalist Samantha Smith shared her experience on X after someone altered her image using Grok. Soon after, others replied, saying the same thing had happened to them.
Instead of stopping, some users encouraged Grok to generate more images of her.
“Women are not consenting to this,” she said. Although the images were fake, she explained, they looked like her, so the experience felt just as invasive as having real, intimate photos shared.
Response from xAI and X
Grok’s developer, xAI, did not provide a detailed response to the BBC, instead sending an automated reply that said “legacy media lies”.
Grok is a free AI assistant on X, although some features require payment. Users often tag it in posts to get context or reactions, but they can also use it to upload and edit images.
The system has faced criticism before. In the past, users accused it of generating sexualised photos and videos, including a sexually explicit clip of Taylor Swift.
Legal and expert concerns
Law professor Clare McGlynn of Durham University said X and Grok could stop this abuse if they chose to, adding that the platform appears to act without fear of consequences.
She also said the platform has allowed these images to spread for months, while regulators have so far taken little visible action.
Although xAI’s own rules ban pornographic use of real people’s likenesses, users continue to break those rules without clear enforcement.
What the law says
Ofcom told the BBC that creating or sharing non-consensual intimate images is illegal, including sexual deepfakes made with AI.
It added that platforms like X must assess the risk of illegal content and act quickly to remove it once detected. However, Ofcom did not confirm whether it is currently investigating X or Grok.
Meanwhile, the Home Office said it plans to ban nudification tools. Under new laws, anyone who supplies this technology could face prison time and heavy fines.
