Nashville, Tennessee – Three Tennessee teenagers sued Elon Musk's xAI this week, claiming the company's image-generation tools were used to transform real photos of them into sexually explicit images.
The high school students, who are seeking to proceed under pseudonyms, filed the lawsuit in California, where xAI, Musk's artificial intelligence company, is based. They are seeking class-action status to represent what the lawsuit describes as thousands of victims like them who are minors, or were minors when sexually explicit images of them were created.
According to the lawsuit, Jane Doe 1 was anonymously alerted in December that someone was distributing sexually explicit images of her on a social media website.
“At least five of these files, one video and four images, depicted her actual face and body in familiar environments, but transformed into sexually explicit poses,” the lawsuit states. It claims that the person who distributed the images knew Doe and used xAI's imaging tools to convert real photos of her into images of sexual abuse. One of the images was made from a photo taken at her home. Another was made from a high school yearbook photo.
The person who distributed the images also created explicit images of at least 18 other girls, two of whom are co-plaintiffs in the lawsuit. At the end of December, local police arrested the perpetrator and confiscated his phone. They discovered that he had uploaded the images to several platforms where he exchanged them for sexually explicit images of other minors.
Other AI companies have barred their image generators from producing any sexually explicit content, even of adults. Musk saw a business opportunity in this and touted the ability of xAI's Grok chatbot to create “spicy” content, according to the lawsuit. However, there is currently no way to allow the generation of explicit images of adults while completely blocking the generation of such images of children, the lawsuit states. It also claims that xAI knew Grok could produce sexually explicit images of children, but released it anyway.
The lawsuit alleges that the person who distributed the plaintiffs' images used an app that licensed xAI's technology or “otherwise purchased their access to Grok, and was used as a cutout or middleman.”
xAI did not respond to an email from The Associated Press seeking comment. But on January 14, a post about the controversy was published on the social media platform X:
“We take steps to remove high-priority violating content, including Child Sexual Abuse Material (CSAM) and non-consensual nudity, taking appropriate action against accounts that violate our X Standards. We also report accounts seeking Child Sexual Exploitation material to law enforcement authorities as necessary.”
Meanwhile, the student plaintiffs say they worry that the images created of them will live forever on the Internet. They fear being harassed because their first names and the name of their school appear in the files. They worry that their friends and classmates have seen the photos and videos, which look real, and they worry about who will see them in the future.