In a lawsuit filed in California federal court, three anonymous plaintiffs claim Elon Musk's company xAI should be held accountable for its AI models producing explicit sexual images of identifiable minors. The plaintiffs seek to represent a class of individuals whose real images, taken when they were minors, were altered into sexual content by Grok. They accuse xAI of neglecting basic safeguards that other AI labs use to prevent image models from generating pornographic content depicting real people and minors.
The case, titled Jane Doe 1, Jane Doe 2, a minor, and Jane Doe 3, a minor v. x.AI Corp. and x.AI LLC, was filed in the U.S. District Court for the Northern District of California. The complaint asserts that xAI failed to implement the standard measures other makers of deep-learning image generators use to prevent ordinary photographs from being transformed into child pornography.
The suit highlights Musk's public promotion of Grok's ability to generate sensual images and portray real people in revealing outfits, arguing that this endorsement is relevant to the company's liability. xAI did not respond to TechCrunch's request for comment.
One plaintiff, Jane Doe 1, discovered after an anonymous Instagram tip that her high school homecoming and yearbook photos had been altered by Grok to depict her undressed. The images were shared online, and a Discord server hosted similar images of her and other schoolmates.
Jane Doe 2 learned from criminal investigators about sexualized images of her produced by a third-party app built on Grok models, while Jane Doe 3 was informed that a pornographic image altered from a photo of her had been found on a phone during a criminal investigation. The plaintiffs' lawyers argue that because such third-party uses still run on xAI's code and servers, the company bears responsibility.
All three plaintiffs, two of whom are minors, report severe distress over the distribution of these images, fearing lasting harm to their reputations and social lives. They seek civil penalties under laws designed to protect exploited children and to hold companies liable for negligence.
