Privacy Pinpoint: Apple to scan photos on all iPhones

Apple is back in the news, but instead of protecting its users as it did in 2016, when it refused to help the FBI unlock the San Bernardino shooter's iPhone, it now has plans to scan your messages and photos. Not just the ones in iCloud (it has been scanning those for years), but the ones on your iPhone itself. That device you paid your hard-earned money for, the one holding some of your most precious information, will be scanned for "malicious" content. This from a company that has spent enormous sums on ad campaigns boasting about how much it values its users' privacy, and about how it has gone (and will go) above and beyond to protect them.

Just a couple of years ago, Attorney General William Barr championed new legislation called the EARN IT Act. On the surface, this legislation was commendable: it sought innovative ways to hold online entities accountable for "online child sexual abuse material." Just beneath the surface, however, it was a mess. It would create a congressionally appointed commission to develop "best practices" that all websites, applications, Internet Service Providers, and other online entities would have to follow to avoid liability for users uploading child abuse material. The main issue: if a tech company has millions of users, how can it be held liable for user-uploaded content, especially content it has no knowledge of? And yes, the bill covers any tech company that provides encrypted storage services; how could such a company know what content is uploaded if it has no access to it? This is a bill that would attack encryption altogether, because it would strip companies of their Section 230 protections and hold them liable, forcing them to find a way to report the content (and therefore to break encryption, or to retain their users' private keys so they can be handed over to law enforcement later).

Let's not pretend otherwise: any decent human being believes that people who abuse children, or who traffic in such disgusting content, should be buried under a prison. But putting a backdoor in encryption, or otherwise hampering its effectiveness under the guise of "it's for the children," is a joke. It's the technology version of taking guns away from law-abiding citizens because a criminal used one to commit murder; of course they did, they're a criminal, and they'll find criminal ways to get hold of a firearm regardless of the laws we set forth. If technologists poked a hole like that in the encryption mechanisms we use, it could and would be found, used, and abused by governments and plenty of other bad actors. Things are hard enough to keep private and secure these days; we don't need to willingly cut a big hole for malicious people and entities to climb through. This would hurt the vast majority of everyday technology users far more than it would hurt criminals. To reiterate: we are sure everyone reading this agrees that anyone using or creating this type of content should be stopped, and that the world would be a much better place without them. But turning our devices against us, whether we're committing these atrocities or not, is wrong, and this type of software can be used for a plethora of terrifyingly bad things.

All of that background matters because even though Apple protected its users back in 2016, it is now flipping the switch. The company has announced new features billed as "protection for children," the same disguise Barr used. These features will search iPhone messages and photos for child abuse imagery using a few proprietary tools; one, reported as NeuralMatch (Apple's own documentation calls the underlying perceptual-hashing system NeuralHash), scans photo libraries for anything matching a database of known child abuse imagery. Again, on the surface this is a commendable and reasonable set of features that would protect our kids. Under the hood, it is a privacy mess: Apple is rolling out features that should be considered mass surveillance rather than protection for our children.
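To make the mechanism concrete, here is a minimal, purely illustrative sketch of perceptual-hash matching in Python. This is not Apple's NeuralHash, whose design is proprietary and neural-network based; the average_hash function, the 8x8 grid, the Hamming-distance threshold, the KNOWN_HASHES values, and the file names are all assumptions for demonstration.

```python
# Toy perceptual-hash scanner -- illustrative only, NOT Apple's NeuralHash.
# Assumes Pillow is installed (pip install Pillow).
from PIL import Image


def average_hash(path: str, grid: int = 8) -> int:
    """Shrink the image to grid x grid grayscale pixels and build a bitmask:
    1 where a pixel is brighter than the mean, 0 otherwise. Visually similar
    images (resized, recompressed, lightly edited) yield similar hashes."""
    img = Image.open(path).convert("L").resize((grid, grid))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits


def hamming(a: int, b: int) -> int:
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")


# A blocklist of hashes of "known bad" images, supplied by some authority.
# The device never needs the images themselves -- only their hashes.
KNOWN_HASHES = {0x8F3C_1E7A_5B2D_90C4}  # hypothetical placeholder value


def scan_library(paths: list[str], max_dist: int = 5) -> list[str]:
    """Flag any photo whose hash is within max_dist bits of a known hash."""
    flagged = []
    for p in paths:
        h = average_hash(p)
        if any(hamming(h, k) <= max_dist for k in KNOWN_HASHES):
            flagged.append(p)
    return flagged


if __name__ == "__main__":
    # Placeholder file names; point these at real images to try it out.
    for match in scan_library(["vacation.jpg", "screenshot.png"]):
        print(f"flagged: {match}")
```

Note that the scanner never needs the offending images themselves, only their hashes, and Apple's technical summary describes extra cryptographic machinery (private set intersection, a match threshold) before anything is reported. The core logic, though, is what's sketched above: hash every photo, compare against a blocklist, flag the matches.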

"No matter how well-intentioned, Apple is rolling out mass surveillance to the entire world with this...Make no mistake: if they can scan for kiddie porn today, they can scan for anything tomorrow. They turned a trillion dollars of devices into iNarcs – without asking."  – Edward Snowden

Cryptography researchers and privacy activists around the world have raised quite a few hypotheticals that deserve heavy consideration. These are the kinds of tools authoritarian governments (ahem, the CCP) use to control the masses by scanning for whatever they want, and in some cases to censor entire countries. Apple has been under government pressure for years to allow increased surveillance of its users' encrypted data. The EFF warns that "all it would take to widen the narrow backdoor that Apple is building is an expansion of the machine learning parameters to look for additional types of content ... That's not a slippery slope; that's a fully built system just waiting for external pressure to make the slightest change." Whether the target is anti-government messages, LGBTQ content, or any number of other things that could be flagged maliciously, this is a bad thing indeed, as the snippet below makes concrete.
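The EFF's warning maps directly onto the toy scanner sketched earlier: nothing in scan_library is specific to child abuse imagery. Widening it requires no engineering at all, only new entries in the blocklist. The hash values below are hypothetical placeholders:

```python
# Retargeting the scanner above is a data change, not a code change.
KNOWN_HASHES = {
    0x8F3C_1E7A_5B2D_90C4,  # today: known child abuse imagery
    0x11A9_42F0_7C3E_D588,  # tomorrow: a banned protest flyer?
    0x6B02_9ACD_3310_4EF7,  # next year: content a regime dislikes?
}
```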

Apple says, today, that these are the operating procedures. But what about tomorrow, when a government pushes it to do X, Y, or Z? What methods do we have to protect ourselves? None. How can we stop these features from arriving on our phones through the update process? We can't; Apple controls those mechanisms, not us. Apple only cares about your privacy until something better comes along, and right now, this is it. These devices are not private, and they are definitely not secure, especially with spyware like Pegasus around. We need companion devices that we fully control, like a laptop running Linux, so we can ensure our technology works for us and keeps our information secure and private without question.

Thanks for reading, see you all next week. Stay safe, stay secure, stay private.