Developer Finds NeuralHash Machine Learning Model Related To Apple’s CSAM Scanning Initiative… Apple Goes Into Spin Mode

Apple’s system to scan iCloud-bound photos on iOS devices for illegal child sexual abuse material, or CSAM, is supposed to ship in iOS 15 later this year. But there may be evidence that pieces of it have already shipped. The NeuralHash machine-learning model that handles the on-device scanning for CSAM appears to have been present on iOS devices at least since the December 14, 2020 release of iOS 14.3.

That sounds pretty sketchy. At the very least, it looks like they were laying the groundwork for this months ago without letting the public know about it. And the cynic in me says that they might have been using it to test something in the wild, seeing as iOS 14.3 is a public release. If that’s the case, it’s all sorts of “shady A.F.”

People are looking at this discovery now, but given that this NeuralHash machine-learning model is encrypted, that may take a while. Surprisingly, Apple was quick to comment. In a statement to Motherboard, Apple said that the version of NeuralHash that developer Asuhariet Ygvar reverse-engineered is not the same as the final implementation that will be used with the CSAM system. Apple also said that it made the algorithm publicly available for security researchers to verify, but that there is a second, private server-side algorithm that verifies a CSAM match after a threshold of matches is exceeded, along with human verification. So in effect, Apple has admitted in a backhanded way that this NeuralHash is related to their CSAM system.
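To make the threshold idea concrete, here’s a minimal sketch of how perceptual-hash matching with a match threshold generally works. All of the names, the distance metric, and the parameter values below are illustrative assumptions of mine; Apple has not published NeuralHash’s internals, and the real system layers on private set intersection, a server-side secondary check, and human review.

```python
# Hypothetical sketch of threshold-based perceptual-hash matching.
# Function names, the Hamming-distance comparison, and the threshold value
# are assumptions for illustration, not Apple's actual implementation.

def hamming_distance(a: int, b: int) -> int:
    """Count the bits that differ between two hash values."""
    return bin(a ^ b).count("1")

def matches_database(photo_hash: int, known_hashes: set, max_distance: int = 0) -> bool:
    """A photo 'matches' if its hash is within max_distance bits of any known hash."""
    return any(hamming_distance(photo_hash, h) <= max_distance for h in known_hashes)

def account_exceeds_threshold(photo_hashes: list, known_hashes: set, threshold: int = 30) -> bool:
    """Nothing is escalated until the number of matches exceeds the threshold."""
    matches = sum(matches_database(p, known_hashes) for p in photo_hashes)
    return matches > threshold
```

The point of the threshold design, as Apple describes it, is that a single stray match on one photo is never acted on; only an account crossing the match threshold triggers the server-side verification and human review.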

Let’s think about this for a second. An amateur developer finds this NeuralHash in a version of iOS from late last year, and Apple, in an uncharacteristic move, is quick to say it’s not the finished product. There’s absolutely nothing suspicious about that, is there?


Here’s my $0.02 worth: people who are giving Apple the benefit of the doubt here are making a tremendous number of assumptions. This kind of tech never remains limited to its intended use. No matter which way you spin it, this is invasive. There are a ton of ways for a system like this to go wrong, be exploited, or worse. Worse being repurposed for other means. And the result is that it will ruin innocent people’s lives. And if you do end up being one of those people, you will have exactly zero recourse to prove your innocence. Apple needs to stop trying to ram this down users’ throats and start working with the people who are raising objections to come up with something that is far less controversial, just as I suggested here. Otherwise, Apple will lose the trust of a user base that expects them to do far better than this.
