Aug 9 (Reuters) - Apple Inc (AAPL.O) on Monday said that iPhone users' entire photo libraries will be checked for known child abuse images if they are stored in the online iCloud service.
The disclosure came in a series of media briefings in which Apple is seeking to dispel alarm over its announcement last week that it will scan users' phones, tablets and computers for millions of illegal pictures.
NOT CLIENT SIDE, but on iCloud, IF the photos ARE STORED in iCloud.
Nowhere do they claim to run client-side detection on the iPhone for images that are NOT stored in iCloud.
Also, considering Apple's abysmal battery runtimes and how little ML power Apple SoCs have compared to server-side GPU clusters, it would make zero sense to run it on the client side.
Apple’s method of detecting known CSAM is designed with user privacy in mind. Instead of scanning images in the cloud, the system performs on-device matching using a database of known CSAM image hashes provided by NCMEC and other child safety organizations. Apple further transforms this database into an unreadable set of hashes that is securely stored on users’ devices.
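The matching Apple describes is membership testing against a database of perceptual hashes, not cryptographic ones, so near-identical images hash the same way. As a toy illustration only (this is NOT Apple's actual NeuralHash or its private set intersection protocol, and `known_hashes` is a made-up placeholder), a simple average-hash check looks like this:

```python
# Toy sketch of hash-database matching (NOT Apple's real NeuralHash/PSI
# scheme): a 64-bit average-hash over an 8x8 grayscale "image", tested
# for membership in a set of known hashes.

def average_hash(pixels):
    """Compute a 64-bit perceptual hash from 64 grayscale pixels (0-255).

    Each bit is 1 if the pixel is above the mean brightness, so a
    uniform brightness shift produces the SAME hash -- unlike a
    cryptographic hash, which would change completely.
    """
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

# Hypothetical database of known-image hashes (illustrative values only).
known_hashes = {average_hash([10] * 32 + [200] * 32)}

def matches_known(pixels, db=known_hashes):
    """Membership test: does this image's hash appear in the database?"""
    return average_hash(pixels) in db

# A brightness-shifted copy of the same pattern still matches:
shifted = [30] * 32 + [220] * 32
print(matches_known(shifted))  # True
```

The real system additionally "blinds" the hash database before shipping it to devices, so the device can test membership without being able to read the database entries; that cryptographic layer is omitted here.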
Nope, that's reading it wrong.
It was always only about iCloud, and they are still claiming they don't run client-side detection on photos that are stored only on the device.
I don't trust Apple farther than I can throw them (not very far), but that's what they are claiming.
Anyway, anybody who stores anything of privacy value in the cloud is a fucking moron.
Yes, that's what it says:
NOT CLIENT SIDE, but on iCloud, IF the photos ARE STORED in iCloud.
Nowhere do they claim to run client-side detection on the iPhone for images that are NOT stored in iCloud.
Also, considering Apple's abysmal battery runtimes and how little ML power Apple SoCs have compared to server-side GPU clusters, it would make zero sense to run it on the client side.
https://www.apple.com/child-safety/
https://educatedguesswork.org/posts/apple-csam-intro/
https://9to5mac.com/2021/08/05/report-apple-photos-casm-content-scanning/
Thank you. I stand corrected. Didn't read that.
It was only announced this week, hence the "that was quick" part; it already has mission creep.
Never put stuff that's personal on third-party sites.
I think Apple won't be searching your phone for child pr0n, but rather for right-wing memes to ID conservatives. Just a wild theory.