Teens File Lawsuit Against xAI Over Grok's Alleged Issues with Generating Sexual Images
The class-action lawsuit accuses Elon Musk of exploiting an opportunity to profit from sexual predation. Three Jane Does, two of whom are minors, filed suit against xAI over the creation of child sexual abuse material (CSAM) using Musk's AI tool, Grok. The complaint alleges that while other AI firms implement safeguards against misuse by child sex predators, xAI failed to do so. Instead, the suit contends, xAI and Musk saw a potential profit in the sexual exploitation of individuals, including minors.

In January, it was reported that xAI acknowledged Grok’s capacity to generate images of minors in minimal clothing. A subsequent report noted Grok generated approximately three million sexualized images, including 23,000 of children, over a span of ten days. Various international entities, including those in France, the UK, Ireland, India, and Brazil, are investigating Grok, and California has also initiated an investigation.

The lawsuit, filed on March 16 in California federal court, details significant harm to the Jane Does, Tennessee residents, from xAI’s CSAM production. In December 2025, Jane Doe 1 found out via Instagram that images of her had been generated and shared on Discord by someone she knew. This individual had created images of at least 18 other girls, many of them familiar to Jane Doe 1. Jane Doe 1 is now an adult but was a minor when the original images were taken.

In February, Jane Does 2 and 3 were informed by law enforcement that their images had similarly been used to generate CSAM. Both remain minors. A separate lawsuit against xAI was filed in the same court on January 23 by an adult Jane Doe, who claimed Grok "undressed" her image, portraying her in a bikini. Mashable has reached out to xAI for comment.