Apple intends to start detecting child sexual abuse material (CSAM) in iCloud Photos, using the NeuralHash algorithm so that detection can happen without actually looking at anyone's photos.

Hash functions convert data into short strings of characters that act as fingerprints for files: different files produce different hashes. But conventional hash functions aren't great for images, because the computed hash changes dramatically even if the image is only slightly different (rotated, flipped, cropped, resized).
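For instance, with a cryptographic hash such as SHA-256, flipping a single bit of a file changes the digest completely. A minimal Python sketch (the byte string stands in for an image file's raw contents):

import hashlib

# Stand-in for an image file's raw bytes; a real photo behaves the same way.
original = bytes(1_000_000)
modified = bytearray(original)
modified[12345] ^= 0x01        # flip one bit -- say, a one-pixel change

print(hashlib.sha256(original).hexdigest())
print(hashlib.sha256(bytes(modified)).hexdigest())
# The two digests share essentially nothing, even though the files differ by one bit.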

NeuralHash solves this by generating hashes that remain largely consistent even when the image is changed slightly, so two visually similar photos map to the same (or nearly the same) hash.
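Apple's NeuralHash itself is a neural-network model, but the underlying idea can be illustrated with a classic perceptual hash. The sketch below uses pHash from the Python imagehash library as a stand-in (not Apple's algorithm) and builds a synthetic test image so no external files are needed:

from PIL import Image
import imagehash

# Build a simple synthetic test image (a diagonal gradient) so the example
# needs no external files; any photo would work the same way.
img = Image.new("L", (256, 256))
img.putdata([(x + y) // 2 for y in range(256) for x in range(256)])

edited = img.resize((200, 200)).rotate(2)   # a slight resize and rotation

h1 = imagehash.phash(img)
h2 = imagehash.phash(edited)

# Subtracting two ImageHash objects gives the Hamming distance (number of
# differing bits); a small distance means the hashes still match despite the edit.
print(h1, h2, "distance:", h1 - h2)

A small Hamming distance between two hashes is treated as a match; NeuralHash applies the same matching idea, but derives the hash from a neural-network embedding of the image.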

Read more here!

Click on the images to use your own. Hashes are computed on your device and nothing is uploaded.
Note: This does not work on iOS for now.
Some naturally occurring hash collisions can be found here.


[Interactive demo: Image A and Image B, each shown with its computed NeuralHash.]