Elon Musk’s xAI accused of fueling exploitation of minors in Grok controversy
Lawsuit alleges xAI’s Grok AI failed to prevent child sexual abuse images from being generated from real photos of minors
The lawsuit alleges that xAI failed to deploy the safety tools other AI companies use to prevent the generation of child sexual abuse material. An AI system capable of producing nude images from real photos, the complaint argues, can also produce abusive images of children. The lawsuit also cites Musk’s own public comments about Grok’s ability to generate sexual content.
Jane Doe 1 had photos from her school yearbook altered to depict her naked. She learned the images were circulating online when someone showed her a Discord group containing manipulated pictures of her and other students. Jane Doe 2 learned from police that someone had used a mobile app to create sexual images of her. Jane Doe 3 was likewise contacted by police, who had found altered pornographic images of her on a suspect’s phone.
The plaintiffs, two of whom are still minors, say they have suffered severe emotional distress and fear for their reputations. They are seeking monetary damages under laws that protect children and hold companies accountable for negligence.