Apple Scanning For CSAM Material Isn’t A Black And White Issue

Earlier this week Apple announced that they would eventually start scanning users' photos in iCloud for CSAM, or Child Sexual Abuse Material. Apple posted a very technical document on this that is very much worth reading. But here's what you need to know:

  • Apple will be using a database of known images to scan against. The database is not publicly available.
  • Apple will report instances of CSAM to the National Center for Missing and Exploited Children, a non-profit organization that works in collaboration with law enforcement agencies across the United States.
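At its core, the matching step described above is a lookup against a database of known-image fingerprints. Apple's actual system uses a perceptual hash (NeuralHash) and cryptographic techniques that are not public, so the minimal Python sketch below substitutes a plain cryptographic hash to illustrate the idea; `KNOWN_HASHES` and the function names are hypothetical.

```python
import hashlib

# Hypothetical database of known-image fingerprints. Apple's real database
# is not public, and its NeuralHash is a perceptual hash, not SHA-256;
# a plain cryptographic hash stands in here purely for illustration.
KNOWN_HASHES = {
    # SHA-256 digest of the bytes b"foo", standing in for a known image
    "2c26b46b68ffc68ff99b453c1d30413413422d706483bfa0f98a5e886266e7ae",
}

def fingerprint(data: bytes) -> str:
    """Return a hex digest fingerprint for the uploaded image bytes."""
    return hashlib.sha256(data).hexdigest()

def is_known_match(data: bytes) -> bool:
    """Check an uploaded image against the known-fingerprint database."""
    return fingerprint(data) in KNOWN_HASHES
```

One consequence of this design is worth noting: a system like this can only flag images already in the database; it does not interpret the content of new photos. That is also why the choice of what goes into the database matters so much, a point the rest of this piece turns on.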

This set off a firestorm of controversy, mainly because of the thinking that this is the thin edge of the wedge in terms of Apple looking for anything that it considers to be illegal and reporting it to authorities. The EFF, for example, has published a very good document on what it finds problematic about this. And there's the perception that a company that allegedly values privacy is now invading your privacy.

The thing is that I don’t think it’s that simple.

Google for example does this already and has done this for years. Twitter, Microsoft, Facebook, and others use the sort of methods that Apple plans to use to look for and report known images of child abuse. So Apple could be seen as being late to this party. But in reality they’re not. In 2019 Apple updated its privacy policies to note that it would scan uploaded content for “potentially illegal content, including child sexual exploitation material”. So Apple has been doing this for a while. It’s just that nobody noticed or cared.

So what this says to me is that this is much ado about nothing. Right?

Not so fast. Let’s go back to the fact that many think that this is the thin edge of the wedge in terms of scanning for anything that is deemed to be illegal and reporting it to the relevant authorities. Apple does plan on rolling this out on a country-by-country basis. So who’s to say that Apple, in order to comply with some country’s requirements, won’t start scanning for any of the following:

  • Photos of people not wearing Covid masks.
  • Photos of Chinese people disrespecting the Chinese government.
  • Photos of Middle Eastern women not wearing hijabs.

To be clear, I am picking some somewhat touchy examples to make the following point: some country may want Apple to scan for items that we in Canada or the USA would consider to be problematic, but that the country in question finds perfectly acceptable. And it isn’t like Apple hasn’t gone along with something like this before. Apple does have a completely separate iCloud instance for China to comply with local regulations. So it’s not a far-fetched idea in my mind.

Then there’s the fact that Apple could expand this to any illegal activity in places like Canada and the USA. Would they do that? I don’t know. But it is possible. And Apple has been under pressure for years to do just that, mostly in the realm of preventing and investigating terrorism. So I can see a scenario where Apple could start doing something like this for terrorism investigations, and beyond. And imagine how angry users would be if that happened.

I think there are no easy answers here. Nobody wants scumbags exploiting children, and we as a society should do everything to stop it. But I think what needs to happen is that Apple, Google, Facebook, and others need to have a very nuanced discussion with lawmakers and users about how to balance the need to protect children with the need for privacy. Because as far as I am concerned, this isn’t a black and white issue. And as a result, you shouldn’t expect a black and white answer.
