Grok Exposes Sensitive Content Involving Minors

xAI’s Grok is facing criticism for removing clothing from people’s images without their consent, following the recent launch of a feature that lets users edit any image with the bot without the original poster’s permission. The original poster isn’t notified when their picture is altered, and there appear to be few safeguards restricting anything short of explicit nudity. In recent days, X has been flooded with edited images of women and children, some depicted as pregnant or in sexualized situations. Prominent figures, including world leaders and celebrities, have also appeared in these generated images.

According to Copyleaks, a company specializing in AI authentication, the trend started when adult-content creators began asking Grok for racy images of themselves. Users soon began applying similar edits to photos of others, mostly women, who had not consented. Multiple women have told news outlets such as Metro and PetaPixel about a growing number of deepfakes of themselves circulating on X. Grok could already modify images in suggestive ways when tagged, but the new “Edit Image” tool has significantly accelerated the trend.

One now-removed post showed Grok editing a photo of two young girls into revealing outfits and suggestive poses. In another instance, a user prompted Grok to apologize for generating an image of girls estimated to be between 12 and 16 years old in inappropriate attire. The user called it a “failure in safeguards” that may violate xAI’s policies and possibly US law. While it’s uncertain whether Grok’s images meet those legal thresholds, realistic AI-generated explicit imagery of identifiable adults or children can be illegal. In a reply to another user, Grok suggested reporting the issue to the FBI as child sexual abuse material (CSAM) and said it is “urgently fixing” its “lapses in safeguards.”

However, Grok’s responses, like a “heartfelt apology note,” don’t show any actual understanding of the harm and don’t necessarily reflect the views of xAI. When approached for comment, xAI responded to Reuters with just three words, “Legacy Media Lies,” and did not respond to The Verge in time for publication.

Elon Musk appears to have ignited the flurry of bikini edits after he asked Grok to replace a popular image of actor Ben Affleck with one of himself in a bikini. Shortly thereafter, Kim Jong Un’s leather jacket was swapped for a colorful spaghetti-strap bikini, with Donald Trump nearby in a matching swimsuit. Musk even reposted a picture of a toaster in a bikini, joking that “Grok can put a bikini on everything.”

While some edits, like the toaster, were clearly meant as humor, others crossed into more risqué territory, with users specifically asking Grok to depict people scantily clad or even to remove skirts entirely. Grok complied, though reports indicate it did not produce full nudity in these images. It even fulfilled requests to replace a toddler’s clothing with a bikini.

xAI’s AI products are heavily sexualized in their marketing and carry minimal safeguards. For instance, xAI’s AI companion Ani flirted with a Verge reporter, and Grok’s video generator has reportedly created topless deepfakes of celebrities like Taylor Swift, despite existing policies prohibiting such depictions. In contrast, competitors like Google’s Veo and OpenAI’s Sora offer guardrails against generating NSFW content, although Sora has similarly been used to produce explicit content involving children.

Deepfake imagery is on the rise, according to a report from cybersecurity firm DeepStrike, and much of it involves nonconsensual portrayals. A 2024 survey found that 40 percent of US students were aware of a deepfake involving someone they knew, and 15 percent had encountered nonconsensual explicit or intimate deepfakes.

When questioned about the transformation of women’s images into bikini pictures, Grok responded: “These are AI creations based on requests, not real photo edits without consent.” How seriously one takes an AI bot’s denials is left up to the individual.
