Code for extraction:
https://github.com/AsuharietYgvar/AppleNeuralHash2ONNX

Working collision:
https://github.com/AsuharietYgvar/AppleNeuralHash2ONNX/issues/1

Apple's claim:
https://www.apple.com/child-safety/
Apple’s method of detecting known CSAM is designed with user privacy in mind. Instead of scanning images in the cloud, the system performs on-device matching using a database of known CSAM image hashes provided by NCMEC and other child safety organizations. Apple further transforms this database into an unreadable set of hashes that is securely stored on users’ devices.
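As a rough illustration of the idea described above (this is NOT Apple's actual protocol, which uses NeuralHash plus private set intersection; the function names, the blinding scheme, and the sample hash values here are all hypothetical), on-device matching against an "unreadable" transformed hash database can be sketched like this:

```python
# Hypothetical sketch of hash-database matching, NOT Apple's real system.
# The real design uses NeuralHash and private set intersection so that
# neither side learns non-matching entries; here we only illustrate the
# concept of comparing a local hash against a one-way-transformed set.
import hashlib

def blind(h: bytes, secret: bytes) -> bytes:
    # One-way transform: without `secret`, the stored set reveals nothing
    # about the original hashes.
    return hashlib.sha256(secret + h).digest()

# Stand-in values; real perceptual hashes would come from a neural model.
known_hashes = [b"\x01" * 12, b"\x02" * 12]
secret = b"hypothetical blinding key"
blinded_db = {blind(h, secret) for h in known_hashes}

def matches(device_hash: bytes) -> bool:
    # The device compares its locally computed hash (after blinding)
    # against the transformed database shipped to it.
    return blind(device_hash, secret) in blinded_db

print(matches(b"\x01" * 12))  # True: hash is in the database
print(matches(b"\xff" * 12))  # False: unknown image
```

The collision linked above matters precisely because this whole scheme assumes distinct images rarely produce the same perceptual hash; a practical way to craft colliding inputs undermines that assumption.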
And not only that: according to this Reddit post,
Believe it or not, [the NeuralHash algorithm for on-device CSAM detection] already exists as early as iOS 14.3, hidden under obfuscated class names.
What you just said has been plastered all over discussions of this topic. Someone doesn't want people to go there.