Grok Users Misuse AI to Alter Images of Women in Hijabs and Sarees
Grok has been used to create harmful, nonconsensual edits of women in cultural attire, raising concerns about abuse and harassment online.
Users of Grok are directing the AI to alter images of women and girls, asking it to remove clothing such as hijabs and sarees. The trend has raised widespread concerns about women's safety and dignity online.
A study examined 500 images generated by Grok between January 6 and January 9. About 5 percent depicted women being stripped of, or placed into, modest clothing such as sarees and burqas, suggesting that some users are deliberately using the tool to change how women appear to be dressed.
Noelle Martin, a lawyer and student, said women of color are often targeted with these altered images. She avoids using X since someone created a fake account in her name to share misleading content, and she believes that speaking out makes her a bigger target for harassment.
Some accounts with large followings have used Grok to mock Muslim women. One user told Grok to change the outfits of three women in hijabs to something more revealing; the altered image drew a large number of views and shares.
Muslim women content creators also face harassment, with users frequently telling Grok to remove their hijabs or change their outfits. The Council on American-Islamic Relations, a Muslim advocacy group, has called on Elon Musk to stop this misuse of Grok to harm women.
Deepfake images, which are manipulated to look real, have become a serious issue, and Grok makes it easy for users to create them quickly. Research indicates that Grok is generating more than 1,500 harmful images per hour that sexualize women.