On the heels of the celebrity nude photo leak, Apple published a detailed statement on its commitment to user privacy. The statement, which followed the announcement of the new iOS 8 software, asserts that data on Apple devices is so secure that even company employees cannot access it. As a result, personal information cannot be handed over to law enforcement, even with a valid warrant. The security changes were met with praise from civil liberties groups — and with good reason — but the response has not been universally positive.
In a briefing, FBI director James Comey said he could not understand why Apple would “allow people to place themselves beyond the law.” Other police officials were unequivocal in their condemnation. “Apple [iPhones] will become the phone of choice for the pedophile,” said John Escalante, head of the Chicago Police Department’s Bureau of Detectives. The implication is that, shielded by Apple’s stringent new protections, users with bad intentions will flock to the iPhone to carry out illicit activities. Outgoing Attorney General Eric Holder agreed, saying companies like Apple could thwart law enforcement’s ability to do its job.
While criticisms of Apple’s strict security sound reasonable, it is important to remember that physical obstacles — doors, locks, walls — also prevent law enforcement from doing its job. One must ask whether Apple has an obligation to make user data available when police have a warrant, and whether the public can trust that Apple’s new encryption is truly as strong as the company says.
Warrants are certainly part of effective policing, but granting law enforcement access to all the data on a device that stores so much personal information is bound to lead to abuses of power, including domestic phone data-mining, which a federal judge ruled likely unconstitutional in December 2013. Police officers tracked down criminals before smartphones existed — the restoration of personal privacy will no more lead users to commit crimes than the invention of the deadbolt did.
In its privacy statement, Apple makes bold claims. In the smartphone market, the tech giant uses superior consumer privacy as another way to beat Google, its main competitor. Additionally, iPhone software is closed-source and proprietary, so independent software engineers have no way of verifying the claims Apple is making.
Ultimately, there is little stopping Apple from inserting a security vulnerability into a future software release. Apple is hoping consumers will accept, at face value, its image as a benevolent handler of personal data — and they would be right to. Apple deserves praise for making consumer privacy a standard, especially because these changes come in the wake of government attempts to do away with any semblance of online privacy.
A version of this article appeared in the Tuesday, Oct. 7 print edition. Email Tommy Collison at [email protected]