Apple, CSAM
Apple Hit With $1.2B Lawsuit Over Abandoned CSAM Detection System
Apple is facing a lawsuit seeking $1.2 billion in damages over its decision to abandon plans for scanning iCloud photos for child sexual abuse material (CSAM).
Apple hit with $1.2B lawsuit after killing controversial CSAM-detecting tool
Thousands of victims have sued Apple over its alleged failure to detect and report illegal child pornography, also known as child sex abuse materials (CSAM).
Apple hit with another lawsuit over iCloud-scanning u-turn
It's the second suit over Apple's decision not to scan everyone's iCloud photos for CSAM.
Apple Defeats Lawsuit Related to iCloud's Measly 5GB of Free Storage
The U.S. Court of Appeals for the Ninth Circuit this week upheld a lower court's dismissal of a lawsuit alleging that Apple illegally ...
Apple faces billion-dollar lawsuit over allegedly failing to block iCloud child abuse material
Apple faces lawsuits alleging failure to curb child abuse content on iCloud, prompting questions about privacy and accountability.
Apple's Abandonment of iCloud CSAM Scanner Is Hurting Victims, Lawsuit Alleges
A second suit says Apple isn't doing enough to stop the spread of harmful images and videos and that it's revictimizing the subjects of those materials.
Apple Faces $1.2B Lawsuit Over Scrapped CSAM Tool
Apple faces a $1.2 billion lawsuit for failing to address child sex abuse material (CSAM) after cancelling a detection tool.
Apple faces lawsuit over abandoned CSAM detection on iCloud, NYT reports
Apple (AAPL) has been hit with a lawsuit over its decision not to implement a system that would have scanned iCloud files for child sexual abuse material (CSAM).
Apple sued over abandoning CSAM detection for iCloud
Apple is being sued over its decision not to implement a system that would have scanned iCloud photos for child sexual abuse material (CSAM). The lawsuit argues that by not doing more to prevent the spread of this material ...
Apple Faces Lawsuit Over Dropped CSAM Detection Plans for iCloud
Apple faces legal action over its decision to abandon a system designed to detect child sexual abuse material (CSAM) in iCloud. The lawsuit, filed in Northern California, accuses Apple of neglecting to implement promised measures to combat the circulation of CSAM ...
Apple allowed child sex abuse materials to proliferate, according to class action lawsuit (2d)
Apple is once again facing a billion-dollar lawsuit, as thousands of victims come out against the company for its alleged ...
Apple’s iPhone Hit By FBI Warning And Lawsuit Before iOS 18.2 Release (3d)
For users eagerly anticipating iOS 18.2’s Apple Intelligence upgrade, the net result is that the security all iPhone users ...
Apple Sued for Failing to Curtail Child Sexual Abuse Material on iCloud (4d)
Victims of abuse are seeking more than $1.2 billion in damages, arguing that the company abandoned a 2021 system it developed ...
Employee lawsuit accuses Apple of spying on its workers (10d)
The suit alleges Apple forces employees to give up their personal privacy rights and surveils them through iCloud accounts ...
Lawsuit Claims Apple Spies on Employees’ Personal Devices (9d)
A digital advertising specialist says the tech giant forces people to install software that lets the company track their ...