A group of Tennessee teens say Elon Musk's AI startup helped turn their school photos into child sexual abuse material—and they're suing. In a proposed class action filed Monday in federal court in California, three plaintiffs allege xAI's Grok chatbot was used to digitally strip clothing from images of more than 18 girls, many from the same school, creating nude and sexualized images that were then traded on Discord and Telegram, the Washington Post reports. Police have arrested an alleged perpetrator who, per the lawsuit, traded the CSAM files for other sexually explicit content featuring minors in group chats with hundreds of other people.
The complaint accuses xAI and Musk of effectively enabling child pornography by rolling out editing features—via Grok's "Spicy" mode and related tools—that could "undress" real people and were promoted as a way to boost usage. The teens are seeking damages and a court order blocking similar image-editing capabilities, Reuters reports. Musk has previously said he was unaware of any underage explicit images created by Grok and that the system is designed to reject illegal requests, blaming any lapses on "adversarial hacking."
Regulators in California, Europe, and the UK have already opened investigations into xAI's sexual-image tools, which researchers estimate churned out some 23,000 images appearing to depict children in just 11 days. On Monday, Australia's online safety regulator warned that CSAM is more "systemic" and accessible on X than on any other mainstream social media service, the Guardian reports.