Code for extraction
https://github.com/AsuharietYgvar/AppleNeuralHash2ONNX
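Based on the pipeline that repo's README describes (the extracted ONNX model maps an image to a 128-dimensional embedding, which is multiplied by a 96x128 "seed" matrix pulled from iOS, and the sign of each projected value becomes one hash bit), the final hashing step can be sketched roughly like this. The function name and the random stand-in data are mine, not from the repo; a real run would use the actual model output and seed matrix.

```python
import numpy as np

def embedding_to_hash(embedding: np.ndarray, seed: np.ndarray) -> str:
    """Project a 128-dim embedding to 96 bits and format as hex.

    Sketch of the reported NeuralHash post-processing step, assuming
    the 96x128 seed-matrix projection described in the extraction repo.
    """
    bits = (seed @ embedding) >= 0      # sign of each projection = one hash bit
    value = 0
    for bit in bits:
        value = (value << 1) | int(bit)
    return f"{value:024x}"              # 96 bits -> 24 hex digits

# Demo with random stand-in data (NOT a real model embedding or seed matrix)
rng = np.random.default_rng(0)
seed = rng.standard_normal((96, 128))
emb = rng.standard_normal(128)
print(embedding_to_hash(emb, seed))
```

Because only the sign survives, many different embeddings map to the same 96-bit string, which is what makes the collision linked below possible in principle.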
Working collision
https://github.com/AsuharietYgvar/AppleNeuralHash2ONNX/issues/1
Apple's claim
https://www.apple.com/child-safety/
Apple’s method of detecting known CSAM is designed with user privacy in mind. Instead of scanning images in the cloud, the system performs on-device matching using a database of known CSAM image hashes provided by NCMEC and other child safety organizations. Apple further transforms this database into an unreadable set of hashes that is securely stored on users’ devices.
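To make the quoted claim concrete, here is a toy sketch of "matching on-device against an unreadable set of hashes." This is emphatically not Apple's actual protocol (which uses private set intersection and threshold secret sharing); the key, the blinding scheme, and the sample hashes below are all hypothetical, and the sketch only illustrates the idea of comparing a locally computed perceptual hash against a one-way-transformed database.

```python
import hashlib

def blind(neural_hash_hex: str, server_key: bytes) -> str:
    """Hypothetical one-way transform applied before shipping the DB to devices."""
    return hashlib.sha256(server_key + bytes.fromhex(neural_hash_hex)).hexdigest()

server_key = b"demo-key"                  # hypothetical; NOT a real key
known_hashes = {"ab" * 12, "cd" * 12}     # stand-in 96-bit NeuralHashes (24 hex chars)
blinded_db = {blind(h, server_key) for h in known_hashes}

def matches(local_hash_hex: str) -> bool:
    # In this toy version the device would need the server key to check
    # membership, which is exactly the leak the real system's private set
    # intersection protocol is designed to avoid.
    return blind(local_hash_hex, server_key) in blinded_db

print(matches("ab" * 12))   # True  (in the known set)
print(matches("ef" * 12))   # False (not in the known set)
```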
And not only that, according to this Reddit post:
Believe it or not, [the NeuralHash algorithm for on-device CSAM detection] already exists as early as iOS 14.3, hidden under obfuscated class names.
I'm not tech savvy enough to know what this means, other than that they're already implementing what they said they were going to do in the future.
Thanks for the explanation, and I surmise that you're correct about memes saved on your phone.
My Apple device is never getting synced again, and I will be getting a Gab phone when those come out. Alas, I lose years of text messages on the Apple device and have to physically write my contacts down.