I am glad the Department of Justice picked a fight with Apple. Let me explain why.
For a long time, Apple has taken a differentiated stance on how its security choices protect the privacy of user data, one the tech giant has often sought to contrast with that of its competitors, especially Google. Where it once seemed that Apple's rigid stance on privacy and encryption hobbled its ability to deliver a continuous user experience across its devices and services, that same rigor has now equipped the company to mount a principled defense of user privacy against the DOJ.
The careful considerations Apple has made toward encryption and device security throughout its device lineup, such as those detailed in its iOS Security Guide, are what make that defense credible.
Since Apple won't easily be intimidated, the company can offer a stiffer challenge to the DOJ than others the government has battled. Take the encrypted email service Lavabit, for example: ordered to surrender its SSL private keys to the federal government, it chose to shut down rather than comply.
What is being asked of Apple and why should it concern us?
The FBI has asked Apple to provide custom firmware, to be installed on the device in question at either an Apple or FBI facility, that would: a) remove the rate-limiting restrictions that exist to discourage endless attempts at entering a device PIN code; and b) remove the auto-wipe security measure that erases the data on the device after 10 incorrect PIN code attempts.
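To make the request concrete, here is a minimal sketch in Python of the kind of guard logic at issue. The delay schedule, wipe threshold and class name are illustrative assumptions of mine; Apple's actual enforcement lives in dedicated hardware and differs in detail.

```python
import time

# Illustrative delay schedule (seconds) after successive failed attempts.
# The real escalation on iOS differs; this only sketches the idea.
DELAYS = {5: 60, 6: 300, 7: 900, 8: 3600, 9: 3600}
WIPE_THRESHOLD = 10  # the auto-wipe measure (b)

class PasscodeGuard:
    """Hypothetical model of the two protections the FBI wants removed."""

    def __init__(self, correct_pin: str):
        self._pin = correct_pin
        self._failures = 0
        self.wiped = False

    def try_unlock(self, pin: str) -> bool:
        if self.wiped:
            raise RuntimeError("data keys destroyed; device is unrecoverable")
        if pin == self._pin:
            self._failures = 0
            return True
        self._failures += 1
        if self._failures >= WIPE_THRESHOLD:
            self.wiped = True  # b) wipe after 10 incorrect attempts
        else:
            time.sleep(DELAYS.get(self._failures, 0))  # a) rate limiting
        return False
```

Strip out both checks and a four-digit PIN falls to at most 10,000 unthrottled guesses, which a machine can walk through in short order; that is the entire value of the custom firmware to the FBI.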
Lacking the encryption keys Apple uses to sign its firmware, the FBI cannot attempt such a modification itself. Because Apple has remained silent on the technical feasibility of the FBI's request to break into an iPhone 5c, there has been much industry debate about whether such an attempt would work on newer iOS devices. The FBI contends not only that Apple is able to comply, but also that it can ensure the custom firmware it creates is limited to the specific device.
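The reason the signing keys are the chokepoint is ordinary code signing. Below is a hypothetical sketch using Ed25519 from the Python `cryptography` package as a stand-in for Apple's actual signing scheme: a device's boot chain will accept any image signed with the private key, and nothing else.

```python
# pip install cryptography
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# Stand-in key pair; Apple's real scheme and key material differ,
# but the trust model is the same.
signing_key = Ed25519PrivateKey.generate()    # held only by the vendor
verify_key = signing_key.public_key()         # baked into the device

firmware_image = b"custom firmware bytes"
signature = signing_key.sign(firmware_image)  # only the key holder can sign

def device_will_install(image: bytes, sig: bytes) -> bool:
    """The device installs an image only if the signature verifies."""
    try:
        verify_key.verify(sig, image)
        return True
    except InvalidSignature:
        return False

assert device_will_install(firmware_image, signature)
assert not device_will_install(b"tampered firmware", signature)
```

This is why the FBI cannot build the image itself, and why surrendering the key outright, the Lavabit scenario, would reach every device rather than one.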
Irrespective of whether Apple can limit the firmware's applicability to the device in question, the risk is that the legal precedent may guide future adjudication in matters relating to encryption, data security and privacy.
It could also mean that other tech and security companies may be pressured similarly, and the outcome may not be scoped to a specific device; instead it could produce poorly conceived and badly implemented back doors or master keys. Stuxnet is the perfect example of such an asymmetrical weapon: engineered to sabotage Iran's nuclear centrifuges, it eventually escaped its intended target and spread across the world.
Regardless of whether Apple designs custom firmware specifically for the iPhone 5c in question, the larger concern is that once such firmware exists, it amounts to a back door that cannot be unlearned. If Apple's own private keys are what give such firmware the legitimacy and trust to be installed on a device, then what happens if the federal government asks Apple to surrender those keys, as it did with Lavabit under a gag order? Apple is in an unenviable position: any decision the company makes for a single device can constrain the security decisions it is allowed to make on everything it hasn't yet designed.
What does this impact?
Apple has commendably abstracted away all of the complexity around pervasive and unbreakable security in its devices so user data can remain secure, private and conveniently accessible. By encrypting the data on a device with a key derived from both the PIN code and the unique device ID, Apple ensures that no one can break into your device without authorization. You can back up your data through iTunes onto a computer, where it is again encrypted by the same key combination, putting it out of reach of any malicious or unauthorized access. The iCloud backup remains the third route for either you or law enforcement to access your files: the former using your Apple ID, the latter with a subpoena.
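Mechanically, deriving the key from both the PIN code and the unique device ID is what forces every guess to run on the device itself, where the rate limits and wipe counter apply. Here is a minimal sketch, assuming PBKDF2 as the derivation function; Apple actually tangles the passcode with a hardware-fused UID key inside the device, which neither software nor Apple itself can read out.

```python
import hashlib
import os

# Hypothetical 256-bit device UID; on a real iPhone this is fused into
# the silicon at manufacture and never leaves the hardware.
DEVICE_UID = os.urandom(32)

def derive_data_key(pin: str, device_uid: bytes) -> bytes:
    """Entangle a low-entropy PIN with the device-bound secret.

    Because the UID cannot be extracted, the derived key cannot be
    recomputed on a data center's worth of machines: brute force is
    confined to this one device.
    """
    return hashlib.pbkdf2_hmac(
        "sha256",
        pin.encode(),  # what the user knows
        device_uid,    # what the device holds
        200_000,       # illustrative work factor to slow each guess
        dklen=32,
    )

data_key = derive_data_key("1234", DEVICE_UID)
```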
The question then becomes: How will Apple respond in the future, especially in how it designs security into its devices, to counter what it sees as an overreach by the U.S. government?
It has already removed itself from the equation for on-device encryption by deriving keys from user-supplied PIN codes and unique device IDs that aren't known to Apple. But will it go even further? Will Apple discourage future law enforcement requests to weaken device security by triggering a data wipe if new firmware is forced onto a device without the correct PIN code? Will Apple stop using its own encryption keys for iCloud backups and instead bring backups in line with how user data is encrypted in the other two scenarios, via a two-factor method that uses a user-supplied PIN code? These decisions, if Apple chooses to implement them, could cost it the careful balance it now strikes between user experience and pervasive and unbreakable security.
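The iCloud question above is really a question of who holds the wrapping key. Below is a hypothetical sketch of the two designs, using AES-GCM from the `cryptography` package; the names and parameters are illustrative, not Apple's.

```python
# pip install cryptography
import hashlib
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

backup = b"contacts, photos, messages"

# Design 1 (today, per the article): the provider encrypts backups with
# a key it holds, so it can decrypt them in response to a subpoena.
provider_key = AESGCM.generate_key(bit_length=256)  # lives on the servers
nonce1 = os.urandom(12)
escrowed = AESGCM(provider_key).encrypt(nonce1, backup, None)

# Design 2 (the change contemplated above): the key is derived from a
# secret only the user knows, so the provider has nothing to hand over.
def user_key(pin: str, salt: bytes) -> bytes:
    return hashlib.pbkdf2_hmac("sha256", pin.encode(), salt, 200_000, dklen=32)

salt, nonce2 = os.urandom(16), os.urandom(12)
user_only = AESGCM(user_key("1234", salt)).encrypt(nonce2, backup, None)
# Without the PIN, neither the provider nor a court order can recover
# `user_only`; the price is that a forgotten PIN means a lost backup.
```

That recoverability trade-off is exactly the balance between user experience and unbreakable security described above.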
And what if the government or the courts begin to bar companies like Apple from engineering stronger security? Instead of recommending minimum thresholds in system security to protect consumer data and commerce, will we begin to dictate how tall the ceiling can be? For instance, could Apple be stopped from changing how iCloud backups are encrypted, a change that would obviate any and all incoming subpoenas for access to encrypted data? Should judges, who are woefully behind in understanding technology, be proposing boundaries in cryptography?
What is at risk requires strong articulation in a climate of fear, and Apple and Tim Cook deserve all the credit for trying their best. Choices in security affect us all. Cryptography is not a tool for the criminal, and math has no agenda.
Cherian Abraham is a mobile security and payments consultant. He writes regularly on topics surrounding fraud, identity and payments on his blog.