The Grok AI is going crazy…flooding X with inappropriate images of women and children

Julie Yukari, a singer based in Rio de Janeiro, posted a photo on the social media platform X just before midnight on New Year’s Eve. Taken by her fiancé, it showed her in a red dress, enjoying herself in bed with her black cat, Nori.

The next day, somewhere among the hundreds of likes attached to the photo, she saw notifications that users were asking Grok, X’s built-in artificial intelligence chatbot, to digitally undress her and put her in a bikini.

The 31-year-old didn’t think much of it, assuming there was no way a bot would comply with such requests.

She was wrong. Soon, Grok-produced images of her, almost naked, were all over Elon Musk’s site.

“I was being naive,” Yukari said.

What happened to Yukari is being repeated across X, a Reuters analysis found. Reuters also identified several cases in which Grok produced child pornography. X did not respond to a message seeking comment on Reuters’ findings. In an earlier statement to the news agency about reports that child pornography was circulating on the platform, xAI, the owner of X, said: “Legacy Media Lies”.

International appeal

A flood of nude photos of real people has set off alarm bells around the world.

French ministers reported X to prosecutors and regulators over the disturbing images, saying in a statement that the “sexual and sexist” content was “clearly illegal”. India’s IT ministry said in a letter to X’s local unit that the platform had failed to prevent Grok from being abused to produce and distribute obscene and sexually explicit content.

The US Federal Communications Commission did not respond to requests for comment. The Federal Trade Commission declined to comment.

Grok’s mass digital undressing spree appears to have started a few days ago, according to undressing requests submitted to Grok, the images it produced in response, and complaints from female users reviewed by Reuters. Musk appeared to make light of the controversy, posting laughing emojis in response to the AI’s edits of famous people – including himself – into bikinis.

When one X user said their social media feed looked like a bar full of women in bikinis, Musk responded, in part, with another laughing emoji.

Reuters could not determine the full scale of the operation.

A review of public requests submitted to Grok during a single 10-minute period at noon US Eastern Time on Friday found 102 attempts by X users to have Grok digitally edit photos of people to appear in bikinis. Most of those targeted were young women. A few of the requests targeted men, celebrities, politicians and, in one case, a monkey.

“Put her in a revealing mini-bikini,” one user told Grok, flagging a photo of a young woman taking a selfie in the mirror. When Grok did so, changing the woman’s clothes into a flesh-tone two-piece, the user asked Grok to make her bikini “clear and transparent” and “very thin”. Grok did not appear to respond to the second request.

Grok fully complied with such requests in at least 21 cases, Reuters found, producing images of women in dental-floss-style or bright bikinis and, in at least one case, covering the woman with oil. In seven other cases, Grok was less compliant.

Reuters could not immediately determine the identities or ages of most of the women who were targeted.

AI-powered programs that digitally undress women – sometimes called ‘nudifiers’ – have been around for years, but until now they were confined to the dark corners of the internet, such as niche websites or Telegram channels, and often required some level of effort or payment.

Three experts who have followed the development of X’s policies on AI-generated sexual content told Reuters that the company ignored warnings from the public and child safety groups – including a letter sent last year warning that xAI was one small step away from unleashing a flood of nonconsensual sexual imagery.

Tyler Johnston, executive director of The Midas Project, an AI watchdog group that was among the letter’s signatories, said: “In August, we warned that xAI’s image generation was a nudification tool waiting to happen. That’s what’s at play.”

Dani Pinter, chief legal officer at the National Center on Sexual Exploitation, said X failed to remove offensive images from its AI training data and should have blocked users requesting illegal content.

“This was a predictable and avoidable atrocity,” Pinter added.
