Apple will report images of child exploitation uploaded to iCloud in the U.S. to law enforcement, the company said on Thursday.
The new system will detect images known as Child Sexual Abuse Material (CSAM) using a process called hashing, in which each image is transformed into a unique number that corresponds to it.
Apple started testing the system on Thursday, but most U.S. iPhone users won’t be part of it until an iOS 15 update later this year, Apple said.
The move brings Apple in line with other cloud services which already scan user files, often using hashing systems, for content that violates their terms of service, including child exploitation images.
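In rough terms, hash matching of this kind looks like the sketch below. This is a toy illustration only, not Apple's implementation: Apple describes a perceptual hash (NeuralHash) with cryptographic matching, whereas this sketch substitutes an ordinary SHA-256 digest and a plain set lookup, and every name in it is invented.

```python
import hashlib

# Hypothetical database of fingerprints of known CSAM images. In a real
# system these would be perceptual hashes supplied by child-safety
# organizations; this set is an empty placeholder.
KNOWN_HASHES: set[str] = set()

def image_fingerprint(image_bytes: bytes) -> str:
    # An ordinary cryptographic digest stands in for a perceptual hash.
    # A real perceptual hash (PhotoDNA, NeuralHash) is designed so that
    # resized or re-encoded copies of an image still match; SHA-256
    # deliberately has no such property.
    return hashlib.sha256(image_bytes).hexdigest()

def matches_known_material(image_bytes: bytes) -> bool:
    # Membership test against the known-bad fingerprint set.
    return image_fingerprint(image_bytes) in KNOWN_HASHES
```

The key property of this approach is that it only tests whether a fingerprint appears on a known list; it is not image recognition that judges what a novel photo depicts, so a photo that isn't already in the database can't match, whatever it shows.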
Reign of Terror. We are also encouraging people to report those who don't follow the Covid protocols. Sounds kind of anti-democratic.
I have no problem with this. This should be done with all online cloud services. HOWEVER, this is a dilemma for Big gov Cons and Libertarians. Or is it? Is it only acceptable for anything you deem "really, really, bad?"
You guys better sleep with one eye open.
Apple will report images of child sexual abuse detected on iCloud to law enforcement
Apple started testing the system on Thursday, but most U.S. iPhone users won't be part of it until an iOS 15 update later this year, Apple said.
www.cnbc.com
The matching process is done on the user’s iPhone, not in the cloud, Apple said.
The system only works on images uploaded to iCloud, which users can turn off, Apple said. Photos or other images on a device that haven’t been uploaded to Apple servers won’t be part of the system.
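Put together, those two statements describe a gate like the following sketch (hypothetical names, same toy SHA-256 fingerprinting as the earlier sketch; Apple's real pipeline is more involved):

```python
import hashlib

KNOWN_HASHES: set[str] = set()  # placeholder known-CSAM fingerprint set

def check_photo_before_upload(photo_bytes: bytes, icloud_photos_enabled: bool) -> bool:
    # Matching runs on the device itself, and only for photos headed to
    # iCloud; with iCloud Photos disabled, nothing is checked at all.
    if not icloud_photos_enabled:
        return False  # local-only photos never enter the system
    fingerprint = hashlib.sha256(photo_bytes).hexdigest()
    return fingerprint in KNOWN_HASHES
```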
This ties into the point @Checkerboard Strangler made, and I agree that there are definitely concerns around how these images are being identified. The last thing we need is people being investigated for pictures of their naked babies in a bathtub.
They should post their algorithm so we can evaluate the criteria they are using to determine such pictures.
No, I mean that I can make my point without having someone try to twist it into something I didn't say, and take a left turn at Albuquerque I have no desire to make.
You mean so James O'Keefe can turn it into an exposé about "how Democrats enable NAMBLA?"
I guarantee you, posting their algorithm will only invite third party voluntarist CT manufacturing.
No one will even bother trying to UNDERSTAND the algorithm, because the same people won't even bother trying to understand how a f****g vaccine works.
The bigger issue is what happens in the future when Apple acquiesces to the Chinese government (or other authoritarian states) and starts handing over user data that contains anti-regime images or messages.
We already know that capitalists don’t give a **** about people’s lives, just continued short term profits.
How long before a couple of doting grandparents get hauled up on charges because Apple noticed a clip of "baby's first bath" at grandma's house?
See, I think it's great to try and leverage technology to catch these monsters, but my faith in Apple to do it properly is not very strong.
They will screw up big time and refuse all accountability for their mistakes, which will be legion.
Just like Facebook is currently doing.
That already exists. It is getting access to the cloud, and decrypting the information, that can be the problem.
That's my concern too. It might not even be Apple's doing - all it takes is someone creating an application to scan user data in the cloud.
I think there is an unsubtle chasm between a child's first bath and a child that's covered in bruises, undernourished, and... worse.
There's plenty to crap on Apple for, but when it comes to privacy they're basically the only act in town.
I automatically assume that Apple has bad motives, no matter what the issue is.
I think there is an unsubtle chasm between a child's first bath and a child that's covered in bruises, undernourished, and... worse.
Touché.
There's also some overlap between the child abuse you just referenced above and child MOLESTATION, some of which leaves only INTERNAL damage.
My issue is with some unaccountable corporate monolith taking charge of deciding BY ALGORITHM what is or is not child molesting.
We're basically allowing Apple, or any other phone mfr, to become Omni Consumer Products.