
Code for extraction

https://github.com/AsuharietYgvar/AppleNeuralHash2ONNX
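
For context, here's a minimal sketch of what computing a hash with the extracted model looks like, assuming you've already dumped the ONNX model and the seed file per the repo's instructions. The file paths below are placeholders, and the preprocessing (360x360 RGB, pixels scaled to [-1, 1], a 96x128 seed matrix after a 128-byte header) follows what the repo's script does:

```python
# Sketch: compute a NeuralHash from the extracted ONNX model.
# Paths are placeholders; adapt them to wherever your extraction landed.
import numpy as np
import onnxruntime
from PIL import Image

def neuralhash(image_path, model_path="model.onnx",
               seed_path="neuralhash_128x96_seed1.dat"):
    # Load the 96x128 seed matrix that projects the model's 128-dim
    # embedding down to 96 bits (the file has a 128-byte header).
    seed = np.frombuffer(open(seed_path, "rb").read()[128:], dtype=np.float32)
    seed = seed.reshape([96, 128])

    # Preprocess: 360x360 RGB, pixel values scaled to [-1, 1], NCHW layout.
    img = Image.open(image_path).convert("RGB").resize((360, 360))
    arr = np.asarray(img).astype(np.float32) / 255.0 * 2.0 - 1.0
    arr = arr.transpose(2, 0, 1).reshape(1, 3, 360, 360)

    # Run the network, project the embedding, and take the sign bits.
    session = onnxruntime.InferenceSession(model_path)
    input_name = session.get_inputs()[0].name
    embedding = session.run(None, {input_name: arr})[0].flatten()
    bits = "".join("1" if b >= 0 else "0" for b in seed @ embedding)

    # Pack the 96 bits into a 24-digit hex string.
    return "{:0{}x}".format(int(bits, 2), len(bits) // 4)
```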

Working collision

https://github.com/AsuharietYgvar/AppleNeuralHash2ONNX/issues/1
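
To be clear about what a "working collision" means: two visually unrelated images that produce the identical 96-bit hash. Using the neuralhash() sketch above, verifying a claimed collision pair is just a string comparison (the image file names here are placeholders):

```python
# Sketch: check whether two images collide under NeuralHash.
hash_a = neuralhash("natural_photo.png")       # an ordinary image
hash_b = neuralhash("crafted_image.png")       # an adversarially crafted image

print(hash_a, hash_b)
if hash_a == hash_b:
    print("Collision: two distinct images share one NeuralHash.")
```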

Apple's claim

https://www.apple.com/child-safety/

Apple’s method of detecting known CSAM is designed with user privacy in mind. Instead of scanning images in the cloud, the system performs on-device matching using a database of known CSAM image hashes provided by NCMEC and other child safety organizations. Apple further transforms this database into an unreadable set of hashes that is securely stored on users’ devices.
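
Stripped of the cryptography, the matching step Apple describes boils down to checking an image's perceptual hash against a database of known hashes before anything leaves the device. The real system blinds the database and uses private set intersection so the device never learns the raw database entries, so the following is only a conceptual sketch with made-up values:

```python
# Conceptual sketch of on-device matching: hash the image, look it up in a
# set of known hashes. NOT Apple's actual protocol (which blinds the
# database); the hex values below are invented placeholders.
known_hashes = {
    "0123456789abcdef01234567",
    "fedcba9876543210fedcba98",
}

def matches_database(image_path):
    return neuralhash(image_path) in known_hashes
```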

What's more, according to this reddit post:

Believe it or not, [the NeuralHash algorithm for on-device CSAM detection] already exists as early as iOS 14.3, hidden under obfuscated class names.
