Editorial: Apple ditches privacy for porn policing
In this world of snooping and snitching, truly private conversations are increasingly endangered. Apple devices once provided a refuge from corporate and governmental prying of all kinds, and the iconic Silicon Valley company had been studying how to lock even itself out of its users’ private communications by expanding end-to-end encryption, from the device to the cloud and back again, so that only users themselves can access their own information.
But Apple is now taking a deeply disappointing step in the opposite direction with its plan to scan photos collected on U.S. iPhones and iPads in a puzzling search for child pornography.
The company that has long vowed not to create back doors to encrypted user data is now building just such a door and is making a key. And when that key is in hand, who else will demand to use it, or figure out how to snatch it away?
Apple announced its plan on Thursday to scan devices for photos that are uploaded to its iCloud Photos service. (Many companies scan photos after they reach the cloud; Apple will do the scanning on the device itself, as photos are uploaded to iCloud.)
Software will compare digital fingerprints of the scanned photos to a database of fingerprints derived from known sexually exploitative images of children. Matches will be reviewed by human beings, and if confirmed they will be flagged for the National Center for Missing and Exploited Children, a private nonprofit child protection organization. From there the information could conceivably be referred for criminal prosecution.
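For readers who want a concrete picture of what that comparison amounts to, the sketch below shows the basic shape of fingerprint matching: compute a fingerprint of each photo and check it against a set of known fingerprints. It is only a simplified illustration with hypothetical names (knownFingerprints, shouldFlagForReview) and an ordinary cryptographic hash standing in for the real thing; Apple’s announced system uses a perceptual “NeuralHash,” cryptographic blinding and a match threshold before any human review, none of which is shown here.

```swift
import Foundation
import CryptoKit

// Hypothetical, illustrative-only fingerprint database. In a real system the
// fingerprints of known images would be supplied by child-safety
// organizations, not hard-coded.
let knownFingerprints: Set<String> = [
    "0000000000000000000000000000000000000000000000000000000000000000"
]

/// Computes a fingerprint for a photo's raw bytes.
/// A cryptographic SHA-256 hash stands in for the perceptual hash a real
/// system would need; a cryptographic hash changes completely if a single
/// pixel changes, so it could not match edited copies of an image.
func fingerprint(of photoData: Data) -> String {
    SHA256.hash(data: photoData)
        .map { String(format: "%02x", $0) }
        .joined()
}

/// Returns true if the photo's fingerprint matches a known entry, i.e. the
/// photo would be queued for human review before any report is made.
func shouldFlagForReview(_ photoData: Data) -> Bool {
    knownFingerprints.contains(fingerprint(of: photoData))
}
```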
Apple distinguishes its program from others by noting that competing companies scan all user photos in the cloud — the global network of servers that collectively store uploaded data. Apple claims its program of scanning individual devices is more secure.
More secure, perhaps, but also more intrusive and creepy. And the larger point is that Apple has abandoned its laudable quest for user-only access. Why?
It could be because of pressure from the Justice Department and Congress, which believe we are safer and more secure when government can compel private companies to disclose user data. That was the gist of the showdown between Apple and the FBI following the 2015 terror attack in San Bernardino, in which authorities wanted Apple to help them break into the iPhone left behind by Syed Rizwan Farook.
Apple refused, angering many Americans who believed it was possibly standing in the way of their safety by protecting the privacy of a deceased killer. But the company was also standing in the way of government forcing itself into all of our devices and communications, and in the process it was standing up for privacy. The company is well aware that there is no hardware, software or policy that safeguards only the privacy of the good guys and permits surveillance only of criminals and terrorists.
Preventing access by anyone means locking out not just criminal prosecutors, but also foreign governments on the lookout for dissidents and others they want to control, criminals who want to get their hands on personal information, commercial interests who want to find out what the competition is doing, and spies and miscreants of all sorts.
Child sex trafficking and the exploitation of children are serious problems. Still, if Apple is going to open a door into otherwise private customer photos, why has it zeroed in on this issue as opposed to, say, terrorist threats, murder-for-hire plots or other serious crimes?
Perhaps because the crime is so photo-oriented, and because it has long been a target of Congress. But now that there is a door, won’t it be easier for the government to push it open wider and demand access to images that hint at other activities, criminal or otherwise?
“Apple will refuse any such demands,” the company said in a statement. And the showdown over the San Bernardino iPhone suggests that it may well mean it.
But this move, which may be meant to fend off government pressure, could just as easily encourage government to exert further pressure — for direct access to illegal photos in devices, and then to other communications that it argues provide evidence of crime.
Private communication that cannot be accessed by the prying eyes and ears of governments, companies or crooks is an essential element of freedom, and Apple has been right in the past to promote it. The change in direction is a very serious setback.