Security and privacy have always been at the core of what Apple does. Whatever product or service Apple launches, user privacy and security come first. The company's Secure Enclave chip is a crucial component of that effort. It debuted with the iPhone 5s, where it was used to store the fingerprint data captured by the Touch ID sensor.
The Secure Enclave chip ensured that this highly sensitive fingerprint data could not be stolen by other apps or have its integrity compromised in any way. Over the years, the role of the Secure Enclave has only grown, and it now protects a range of critical data for iPhone and iPad users.
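To give a sense of how apps lean on the Secure Enclave today, here is a minimal Swift sketch using Apple's CryptoKit framework. It assumes a physical device with a Secure Enclave (the simulator has none), and the message and function name are purely illustrative.

```swift
import CryptoKit
import Foundation

// Illustrative sketch: keep a signing key inside the Secure Enclave.
// The private key material never leaves the enclave; the app only holds
// an opaque reference to it plus the corresponding public key.
func signWithSecureEnclave() throws {
    guard SecureEnclave.isAvailable else {
        print("No Secure Enclave on this device (e.g. the simulator)")
        return
    }

    // The key is generated inside the enclave itself.
    let key = try SecureEnclave.P256.Signing.PrivateKey()

    // Signing happens inside the enclave; only the signature comes out.
    let message = Data("sensitive payload".utf8)   // placeholder data
    let signature = try key.signature(for: message)

    // The public key can verify the signature anywhere.
    print("Valid:", key.publicKey.isValidSignature(signature, for: message))
}

try? signWithSecureEnclave()
```

Because the private key never leaves the enclave, even a compromised app or operating system cannot simply copy it out, which is the same property Apple relies on for Touch ID and Face ID data.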
The Independent got a good look at the facility where Apple tests its Secure Enclave chips. Apple puts the chips through a series of harsh environments to make sure they hold up. That matters because attackers can subject these chips to extreme stress in an attempt to recover data from them.
In a huge room somewhere near Apple’s glistening new campus, highly advanced machines are heating, cooling, pushing, shocking and otherwise abusing chips. Those chips – the silicon that will power the iPhones and other Apple products of the future – are being put through the most gruelling and intense work of their young and secretive lives. Throughout the room are hundreds of circuit boards, into which those chips are wired – those hundreds of boards are placed in hundreds of boxes, where these trying processes take place.
Those chips are here to see whether they can withstand whatever assault anyone might try on them when they make their way out into the world. If they succeed here, then they should succeed anywhere; that’s important, because if they fail out in the world then so would Apple. These chips are the great line of defence in a battle that Apple never stops fighting as it tries to keep users’ data private.
More importantly, the report includes quotes from Apple's SVP Craig Federighi, who addresses the jibe from Google CEO Sundar Pichai that Apple sells privacy as a luxury good, along with other privacy-related concerns around Apple products. He says that for Apple, privacy considerations come at the beginning of whatever it does, not at the end. And while Federighi was surprised by Pichai's jibe, he says it is gratifying to see other companies take privacy seriously.
“I don’t buy into the luxury good dig,” says Federighi, giving the impression he was genuinely surprised by the public attack.
“On the one hand gratifying that other companies in space over the last few months, seemed to be making a lot of positive noises about caring about privacy. I think it’s a deeper issue than then, what a couple of months and a couple of press releases would make. I think you’ve got to look fundamentally at company cultures and values and business model. And those don’t change overnight.”
Apple has come under criticism for storing Chinese user data on servers located in China, a move the company was forced to make to meet local regulatory requirements from the Chinese government. Federighi argues that since the data is encrypted, where it is stored matters far less.
Federighi says that where the data is stored matters less when the amount of information collected is minimised in the first place, and whatever is collected is stored in ways that stop people from prying into it.
“Step one, of course, is the extent that all of our data minimisation techniques, and our keeping data on device and protecting devices from external access – all of these things mean that that data isn’t in any cloud in the first place to be accessed by anyone,” he says. Because the data is never collected, there is nothing for officials in China or anywhere else to read or abuse, Apple claims.
What’s more, Federighi argues that because the data is encrypted, even if it were intercepted – even if someone were actually holding the disk drives that store the data itself – it couldn’t be read. Only the two users sending and receiving iMessages can read them, for example, so the fact that they pass through a Chinese server should be irrelevant if the security works. Anyone else should see only a garbled message that needs a special key to be unlocked.
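To make that point concrete, here is a small, hypothetical Swift sketch of that kind of end-to-end scheme using Apple's CryptoKit. It is not Apple's actual iMessage protocol, just the general idea: each side derives the same shared key from a key agreement, and a server in the middle only ever sees the sealed ciphertext.

```swift
import CryptoKit
import Foundation

// Hypothetical sketch of end-to-end encryption (not the real iMessage protocol):
// sender and receiver each hold a key pair and derive the same symmetric key.
do {
    let alice = Curve25519.KeyAgreement.PrivateKey()
    let bob = Curve25519.KeyAgreement.PrivateKey()

    // Alice derives a symmetric key from her private key and Bob's public key.
    let aliceKey = try alice.sharedSecretFromKeyAgreement(with: bob.publicKey)
        .hkdfDerivedSymmetricKey(using: SHA256.self, salt: Data(),
                                 sharedInfo: Data(), outputByteCount: 32)

    // She encrypts the message; this sealed box is all a relaying server sees.
    let sealed = try ChaChaPoly.seal(Data("Hello Bob".utf8), using: aliceKey)
    print("What the server sees:", sealed.combined.base64EncodedString())

    // Bob derives the same key from his private key and Alice's public key,
    // and only then can the "garbled message" be unlocked.
    let bobKey = try bob.sharedSecretFromKeyAgreement(with: alice.publicKey)
        .hkdfDerivedSymmetricKey(using: SHA256.self, salt: Data(),
                                 sharedInfo: Data(), outputByteCount: 32)
    let plaintext = try ChaChaPoly.open(sealed, using: bobKey)
    print("What Bob reads:", String(decoding: plaintext, as: UTF8.self))
} catch {
    print("Encryption demo failed:", error)
}
```

In a real messaging system the public keys would of course be exchanged and authenticated through a key directory, which is where much of the hard engineering lies; the sketch only shows why a server that merely relays ciphertext has nothing useful to hand over.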
The whole article is definitely worth a read, especially if you value privacy.
[Via The Independent]