X says it’s stopping Grok from putting real people in bikinis on X
After public and government uproar over sexualized deepfakes of women and children, X’s Safety account posted Wednesday evening that it is no longer allowing the Grok account on X to generate “images of real people in revealing clothing such as bikinis.” The xAI-owned company also said it restricted image generation and editing via Grok on X more broadly to paid subscribers.
For what it’s worth, a subscriber reply to X Safety’s post asking Grok to put the tweet “in a bikini” prompted the chatbot to post an image of a woman in a bikini — though she does not appear to be a real person. I’m not a paid X subscriber but, in the process of reporting this piece, I was able to edit the image to be “younger” and “17 years old.”
The post also did not address what the changes mean for Grok’s stand-alone app, which currently ranks No. 5 among free apps in Apple’s App Store. Previous reporting from NBC News found that users could also still generate offensive images using the app.
Tesla and xAI CEO Elon Musk, for his part, said Wednesday that he was “not aware of any naked underage images generated by Grok.”