Macworld’s take on the iPhone adding scanning for Child Sexual Abuse Material, and privacy

I have been reading a lot of articles on Apple’s recent move to scan for CSAM, or Child Sexual Abuse Material, in any photos you sync with iCloud (which is all of them if you have iCloud backup on). I really appreciate the depth with which Jason Snell at Macworld has covered the issue, and why it is an issue.

Apple’s approach here calls all of that into question, and I suspect that’s the source of some of the greatest criticism of this announcement. Apple is making decisions that it thinks will enhance privacy. Nobody at Apple is scanning your photos, and nobody at Apple can even look at the potential CSAM images until a threshold has passed that reduces the chance of false positives. Only your device sees your data. Which is great, because our devices are sacred and they belong to us.

The risk for Apple here is huge. It has invested an awful lot of time in equating on-device actions with privacy, and it risks poisoning all of that work with the perception that our phones are no longer our castles.
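To make the threshold idea in the quote above a little more concrete, here is a minimal sketch of how on-device matching against a list of known hashes might gate human review behind a match count. This is purely illustrative: the type and property names are hypothetical, and it is not Apple’s NeuralHash system or its actual safety-voucher protocol, just the general shape of “count matches locally, reveal nothing until a threshold is crossed.”

```swift
import Foundation

// Illustrative sketch only, not Apple's implementation.
// Models the idea from the quote: matches are counted on the device,
// and nothing is surfaced for human review until a threshold is crossed,
// which is what reduces the chance of false positives.

struct PhotoFingerprint {
    let hash: String   // stand-in for a perceptual hash of one photo
}

struct OnDeviceMatcher {
    let knownHashes: Set<String>   // stand-in for the database of known hashes
    let reviewThreshold: Int       // matches required before anything is flagged

    /// Returns true only if the number of matching photos meets the threshold.
    /// Below the threshold, the device reveals nothing.
    func shouldFlagForReview(photos: [PhotoFingerprint]) -> Bool {
        let matchCount = photos.filter { knownHashes.contains($0.hash) }.count
        return matchCount >= reviewThreshold
    }
}

// Hypothetical usage: one match against a threshold of 30 flags nothing.
let matcher = OnDeviceMatcher(knownHashes: ["abc123", "def456"], reviewThreshold: 30)
let library = [PhotoFingerprint(hash: "zzz999"), PhotoFingerprint(hash: "abc123")]
print(matcher.shouldFlagForReview(photos: library))   // false
```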

And while it is noble to try to do something about child sexual abuse, the move also flies in the face of Apple positioning itself as the arbiter of privacy. That isn’t even getting into false positives. And then there is the question of where this leads: if they are scanning your photos, won’t they soon be scanning everything, and where is the privacy in that?
