Government officials have been quoted saying that the 'safe harbor' status of X could be revoked because of Grok's CSAM content ...
Irish culture minister Patrick O'Donovan says that X is not responsible for the child sexual abuse material generated by Grok ...
Later, when an X user compared Grok to a pen, Elon Musk emphasized the point, stating that it is not the pen that is at ...
A controversial child sexual abuse material (CSAM) scanning proposal under discussion by lawmakers in Europe is both the wrong way to tackle a sensitive and multifaceted societal problem ...
It seems that instead of updating Grok to prevent it from outputting sexualized images of minors, X is planning to purge users ...
Earlier this month, Apple announced it would be taking additional ...
In December, Apple announced that it was killing a controversial iCloud photo-scanning tool the company had devised to combat child sexual abuse material (CSAM) in what it said was a ...
When Apple announced its plans to tackle child abuse material on its operating systems last week, it said the threshold it set for false-positive account disabling would be one in a trillion per year ...
Elon Musk's Grok AI has been allowing users to transform photographs of women and children into nude and compromising images.
Two major developments reignited regulatory and technological discourse around child sexual abuse material (CSAM) this year. The first: Visa and MasterCard cracking down on adult sites that contained ...