I have no problem with this. This should be done with all online cloud services. HOWEVER, this is a dilemma for big-government Cons and Libertarians. Or is it? Is it only acceptable for anything you deem "really, really bad"?
You guys better sleep with one eye open.
Apple will report images of child sexual abuse detected on iCloud to law enforcement
www.cnbc.com
Apple will report images of child exploitation uploaded to iCloud in the U.S. to law enforcement, the company said on Thursday.
The new system will detect images called Child Sexual Abuse Material (CSAM) using a process called hashing, where images are transformed into unique numbers that correspond to that image.
Apple started testing the system on Thursday, but most U.S. iPhone users won’t be part of it until an iOS 15 update later this year, Apple said.
The move brings Apple in line with other cloud services which already scan user files, often using hashing systems, for content that violates their terms of service, including child exploitation images.
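The hash-matching idea the article describes can be sketched in a few lines. This is a simplified illustration only: it uses a cryptographic hash (SHA-256), which matches byte-identical files, whereas Apple's actual system (NeuralHash) is a perceptual hash designed to survive resizing and re-encoding. The flagged-image database here is hypothetical.

```python
import hashlib

def fingerprint(data: bytes) -> str:
    """Reduce content to a fixed-size number (hex digest) that
    corresponds uniquely to that content."""
    return hashlib.sha256(data).hexdigest()

# Hypothetical database of fingerprints of known flagged images.
# In a real system this would come from an organization like NCMEC.
known_hashes = {fingerprint(b"example flagged image bytes")}

def is_flagged(data: bytes) -> bool:
    """Compare an upload's fingerprint against the known-hash set;
    the service never needs to inspect the image content directly."""
    return fingerprint(data) in known_hashes

print(is_flagged(b"example flagged image bytes"))  # exact match
print(is_flagged(b"an ordinary vacation photo"))   # no match
```

The design point is that matching happens on fingerprints, not images: the service compares short numbers, so an ordinary photo that isn't in the database produces no match and reveals nothing.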