Aug 9 (Reuters) - Apple Inc (AAPL.O) on Monday said that iPhone users' entire photo libraries will be checked for known child abuse images if they are stored in the online iCloud service.
The disclosure came in a series of media briefings in which Apple is seeking to dispel alarm over its announcement last week that it will scan users' phones, tablets and computers for millions of illegal pictures.
NOT CLIENT SIDE, but on iCLOUD IF the photos ARE STORED on iCLOUD.
Nowhere do they claim to run client-side detection on the iPhone for images that are NOT stored on iCloud.
Also, considering Apple's abysmal battery runtimes and how little ML power Apple's SoCs have compared to server-side GPU clusters, it would make zero sense to run it client-side.
Apple’s method of detecting known CSAM is designed with user privacy in mind. Instead of scanning images in the cloud, the system performs on-device matching using a database of known CSAM image hashes provided by NCMEC and other child safety organizations. Apple further transforms this database into an unreadable set of hashes that is securely stored on users’ devices.
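The quoted passage describes matching each image's hash against a database of known hashes. As a rough illustration only: the sketch below uses a plain cryptographic hash and an in-memory set, whereas the real system uses a perceptual hash (NeuralHash) so that visually similar images match, and a blinded database with private set intersection so neither the device nor the server learns non-matching results. All names here are hypothetical.

```python
import hashlib

def image_hash(image_bytes: bytes) -> str:
    # Stand-in for a perceptual hash. A real perceptual hash maps visually
    # similar images to the same value; SHA-256 only matches exact bytes.
    return hashlib.sha256(image_bytes).hexdigest()

# Hypothetical database of known-image hashes. On-device, Apple stores a
# blinded (unreadable) transform of the database, not raw hashes like this.
known_hashes = {image_hash(b"example-known-image")}

def matches_known(image_bytes: bytes) -> bool:
    # The membership test; in the real protocol the result is encrypted
    # and only decryptable server-side past a match threshold.
    return image_hash(image_bytes) in known_hashes
```

This is a sketch of the matching concept only, not of the cryptographic protocol that keeps the database and the match results hidden from the user and from Apple below the reporting threshold.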
Yes, that's what it says:
NOT CLIENT SIDE, but on iCLOUD IF the photos ARE STORED on iCLOUD.
Nowhere do they claim to run client-side detection on the iPhone for images that are NOT stored on iCloud.
Also, considering Apple's abysmal battery runtimes and how little ML power Apple's SoCs have compared to server-side GPU clusters, it would make zero sense to run it client-side.
https://www.apple.com/child-safety/
https://educatedguesswork.org/posts/apple-csam-intro/
https://9to5mac.com/2021/08/05/report-apple-photos-casm-content-scanning/
Thank you. I stand corrected. Didn't read that.
It was only announced this week, hence the "that was quick" part; it already has mission creep.