Two years ago, Apple first announced a photo-scanning technology aimed at detecting CSAM—child sexual abuse material—and then, after receiving widespread criticism, put those plans on hold. ...
Key negotiators in the European Parliament have announced a breakthrough in talks to set MEPs’ position on a controversial legislative proposal aimed at regulating how platforms should respond ...
As part of its content filtering service, DNSFilter automatically blocks CSAM content and generates detailed reports on related activity. The company expanded its blocklist by hundreds of thousands of ...
Apple Inc.'s (NASDAQ:AAPL) decision to abandon plans to scan iPhones for child sexual abuse material (CSAM) has invited the wrath of protesters, who have now set up banners in front of Apple Park to ...
New service makes high-precision CSAM identification and classification capability available to platforms and services through the world's leading trust & safety intelligence provider. LEEDS, United ...
A child protection organization says it has found more instances of abuse imagery on Apple platforms in the UK than Apple has reported globally. In 2022, Apple abandoned its plans for Child Sexual Abuse ...
Apple Inc. (NASDAQ:AAPL) is facing a $1.2 billion lawsuit filed on Saturday in U.S. District Court in Northern California for discontinuing its child sexual abuse material detection feature. What ...
The CSAM detection system preserved user privacy, data encryption, and more, but it also introduced many potential new attack vectors that may be abused by authoritarian governments. For example, if ...
Apple is being sued by victims of child sexual abuse over its failure to follow through with plans to scan iCloud for child sexual abuse materials (CSAM), The New York Times reports. In 2021, Apple ...
Rhiannon was just thirteen when she was groomed online, coerced and sexually abused. Her perpetrator was charged, but the impact of his crimes runs deep. “I didn't speak about my abuse for a very long ...