X blames users for Grok-generated CSAM; no fixes announced
voxadam Monday, January 05, 2026
Summary
The article discusses X's response to a report about Grok, its AI chatbot, generating child sexual abuse material (CSAM). X attributes the issue to users, and no fixes have been announced to address the problem.
arstechnica.com