Recently, social media site X (formerly known as Twitter before the world’s richest asshole got his hands on it) introduced a new feature for its generative AI chatbot, Grok. Images now have an Edit Image button attached, enabling users to modify the original poster’s picture posts without their permission. Users can also post a natural language prompt in the post’s thread, using Grok to modify the image to their liking.
It cannot be overstated how bad an idea this was, or how easy it was to see where it would go wrong.
Immediately, some users began using this feature on photos of children, prompting Grok to depict them in their underwear. In other words, they prompted Grok to generate child pornography – and Grok fulfilled the requests.
Engineers at X acknowledged the “issue” and said they’re working to add “guardrails” to Grok to prevent it. It’s not clear, however, how far these guardrails will extend. In addition to generating child porn, Grok processes requests to edit arbitrary women’s photos and sexualize them – e.g., by redrawing a woman in plain, everyday clothes so that she’s wearing a bikini.
It shouldn’t need to be said that there’s nothing wrong with sex and sexuality. (Some of you out there are thick, though, so I’ll say it anyway.) The issue is nonconsensual sexualization. These are women who never agreed – and, in most cases, explicitly do not want – to have their images sexualized. Yet men do it anyway, despite the petabytes of free and paid porn already available online.
This behavior isn’t restricted to English-speaking X. It’s stirring anger in Japan, too, as some users wield Grok to sexualize everyone from a popular cosplayer to a royal princess.
Cosplayer protests
One of the women who’s brought this to light in Japan is cosplayer Yukina (@nikuyukina2). Yukina noticed comments on this post of her cosplaying Shinjō Akane from the anime SSSS.GRIDMAN asking Grok to “zoom out and change her outfit to a single dental floss.”