3 Sept 2014

Apple Doesn’t Take Customer Security Seriously - 5 Irresponsible and Shocking Lapses

By Michael Krieger: I’m the furthest thing in the world from a technology or security expert, but what I have learned in recent years is that a dedicated, sophisticated and well-funded hacker can pretty much own your data no matter how many precautions you take. Nevertheless, the major technology companies on the planet shouldn’t go out of their way to make this as easy as possible.
In the wake of the theft of private images from several prominent celebrities, many people are rightly wondering just how vulnerable their data is. The answer appears to be “very,” and if you use Apple, the following article from Slate may leave you seething with a sense of anger and betrayal.

David Auerbach wrote the following for Slate. Read it and weep:

In the wake of the theft of the private data and photos of dozens of celebrities, there is at least one major culprit. Not the alleged leakers, though obviously they’re to blame, but the company that has most prominently overstated its security in the first place: Apple.  
What is clear is that Apple has had a known security vulnerability in its iCloud service for months and has been careless about protecting its users.
Apple patched this vulnerability shortly after the leak, so even if we’re not sure of exactly how the photos got hacked, evidently Apple thinks it might have had something to do with it. Whether or not this particular vulnerability was used to gather some of the photos—Apple is not commenting, as usual, but the ubiquity and popularity of Apple’s products certainly point to iCloud as a likely source—its existence is reason enough for users to be deeply upset at their beloved company for not taking security seriously enough. Here are five reasons why you should not trust Apple with your nude photos or, really, with any of your data.

1. The vulnerability is Security 101 stuff.
Up until Monday, Apple had a significant and known brute-force vulnerability in its Find My iPhone service, where you type in your Apple ID and password on your computer in order to locate your iPhone on a map. Most services that use passwords, from Facebook to Google to banks, will lock your account or at least throttle logon attempts after a certain number of failed tries, to prevent a person who is not you from making endless guesses at common passwords. Apple itself will do this in most places—but not through its Find My iPhone service, where hackers are allowed unlimited attempts at guessing passwords. You can endlessly try password after password, as quickly as you like. Once a correct Apple ID password is confirmed through Find My iPhone, a hacker then has access to your iCloud account.
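To make that point concrete: the fix is not exotic. Here is a rough Python sketch (mine, not Apple’s code; the limits, the in-memory store, and the function name are invented for illustration) of the kind of lockout check most login services run before they even look at the password.

```python
# Illustrative sketch only -- not Apple's implementation. It shows the
# basic throttling the article says Find My iPhone lacked: lock an
# account after a handful of failed guesses.
import time

MAX_ATTEMPTS = 5           # failed tries allowed before lockout (made-up number)
LOCKOUT_SECONDS = 15 * 60  # how long the account stays locked (made-up number)

failed = {}                # account id -> (failure count, time of last failure)

def allow_login(account_id, password_is_correct):
    """Return True if the login succeeds, False if it fails or is throttled."""
    count, last = failed.get(account_id, (0, 0.0))

    # Still inside the lockout window? Refuse without even checking the
    # password, so endless guessing gains an attacker nothing.
    if count >= MAX_ATTEMPTS and time.time() - last < LOCKOUT_SECONDS:
        return False

    if password_is_correct:
        failed.pop(account_id, None)   # reset the counter on success
        return True

    failed[account_id] = (count + 1, time.time())
    return False
```

With nothing like this in front of Find My iPhone, a script could simply loop through a password list until one worked.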
2. The vulnerability had been publicly known since May.
A Russian security group called HackApp released iBrute, a proof-of-concept tool to exploit this vulnerability, on Aug. 30. But don’t blame them, because the celebrity hacking probably took place quite a while before that. The Register publicized the lack of any sort of limit on iCloud logon attempts in May, and Apple did nothing about it, giving hackers plenty of time to bash away at accounts. Even after iBrute was publicly released, Apple didn’t patch the vulnerability until Sept. 1 and did nothing to secure accounts in the meantime.
3. Apple defaults users into the cloud.
Clouds are wispy and ephemeral, the very opposite of secure, so why would you want to store anything in them? No one particularly does: Cloud storage has been forced on users because it suits tech companies, not because it’s what’s best for consumers. But Apple makes it very hard not to store photos in its cloud, nude or otherwise. Camera Roll automatically backs up photos (all photos) to the cloud by default, and Apple makes it difficult for average users to change the default. It’s worked. And it’s too bad, because whatever you store on the cloud has far less legal and security protection than what’s on your own computer. Even deleting photos from your phone doesn’t delete them from the cloud, as security expert Nik Cubrilovic pointed out on Twitter. (The American Civil Liberties Union’s Christopher Soghoian has wisely suggested a “private photo” feature that doesn’t upload certain photos to the cloud.) Defaulting to the cloud is like checking baggage on an airline: People might look through your stuff, and even steal it. And like the airlines, Apple’s liability is strictly limited by the extremely generous (to Apple) agreement you sign when you purchase any of its products.
4. Apple does not encourage two-factor authentication.
Two-factor authentication, in which physical possession of a particular device (like a phone) is necessary to log in to an account, is one of the most common and effective supplements to the problematic security of regular passwords. Google, Yahoo, Facebook, Twitter, and many other services offer two-factor, though rarely by default. Still, as the Daily Dot writes, “For reasons that defy all logic, Apple makes it extraordinarily difficult to enable two-step verification,” making users wait three days just to turn it on. (In other words, if you had found out about the vulnerability on Aug. 30, you couldn’t have protected yourself until Sept. 2.) Apple barely publicizes its two-factor authentication and has not encouraged users to adopt it. Apple controls the default user experience for its products, and it has the responsibility for that default to be reasonably secure—which it currently is not.
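For readers who have never set it up, the mechanism behind most two-factor systems is simple: the account holds a shared secret enrolled on your phone, and logging in requires a short code derived from that secret and the current time. Below is a minimal, generic Python sketch of time-based one-time passwords (TOTP, RFC 6238); it is an illustration of the technique, not Apple’s system, and the function names and demo secret are made up for this post.

```python
# Generic TOTP sketch (RFC 6238) -- illustrative only, not Apple's code.
import base64, hashlib, hmac, struct, time

def totp(secret_b32, at_time=None, digits=6, step=30):
    """Compute the one-time code for a base32 shared secret."""
    key = base64.b32decode(secret_b32, casefold=True)
    counter = int((time.time() if at_time is None else at_time) // step)
    digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F
    number = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(number % 10 ** digits).zfill(digits)

def verify_code(secret_b32, submitted):
    """Accept the code for the current 30-second window or the previous one."""
    now = time.time()
    return any(hmac.compare_digest(totp(secret_b32, now - drift), submitted)
               for drift in (0, 30))

# The phone and the server both derive the same code from the same
# enrolled secret, so only someone holding the device can supply it.
secret = "JBSWY3DPEHPK3PXP"   # made-up demo secret
print(totp(secret), verify_code(secret, totp(secret)))
```

The point is that a stolen password alone no longer unlocks the account, unless, as the next item explains, the service never asks for the code in the first place.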
5. Two-factor authentication wouldn’t have worked anyway.
Even if you were a celebrity who had enabled two-factor authentication, it wouldn’t have helped in this case, because Apple doesn’t enforce two-factor authentication for iCloud logons even if you have it turned on, as Ars Technica reported all the way back in May of 2013. Apple primarily uses two-factor to prevent unauthorized credit card purchases, not to protect the privacy of your data.
At this point, I want to highlight two previously published articles:
Apple’s Massive Security Flaw: NSA Exploit or an Honest Mistake?
Apple Directors Overrule and Reject Shareholder Proposal to Protect User Privacy
But sure, go ahead and camp out for 19 days for that iPhone 6.


Have no fear. Even if Apple did protect your data, NSA employees will still pass around your nude pics with reckless abandon. Let’s not forget what Snowden said in a July interview:

“You’ve got young enlisted guys, 18 to 22 years old,” Snowden said. “They’ve suddenly been thrust into a position of extraordinary responsibility where they now have access to all of your private records. In the course of their daily work they stumble across something that is completely unrelated to their work in any sort of necessary sense. For example, an intimate nude photo of someone in a sexually compromising position. But they’re extremely attractive.
“So what do they do? They turn around in their chair and show their co-worker. The co-worker says: ‘Hey that’s great. Send that to Bill down the way.’ And then Bill sends it to George and George sends it to Tom. And sooner or later this person’s whole life has been seen by all of these other people. It’s never reported. Nobody ever knows about it because the auditing of these systems is incredibly weak. The fact that your private images, records of your private lives, records of your intimate moments have been taken from your private communications stream from the intended recipient and given to the government without any specific authorization without any specific need is itself a violation of your rights. Why is that in a government database?”
“It’s routine enough, depending on the company that you keep, it could be more or less frequent. These are seen as the fringe benefits of surveillance positions.”
Because terrorism…
Full Slate article here.
In Liberty,
Michael Krieger


Source
