Mobile trust and security
The other day I wandered into Best Buy at the mall. Nobody’s around and I’m alone with the sales guy. “Umm, what’s the most secure device you have here?” He takes a step back.
To paraphrase our brief conversation: Apple and Samsung make up 95% of his sales, and he thinks Apple is safer. “Is Apple safer because they screen apps better?” Head nods.
“I heard Blackberry is working to secure Android for business users.” Sales guy had nothing to say about that.
Why do people trust Apple?
I wouldn’t take security advice from a Best Buy sales guy, but it does seem that people trust Apple more. Maybe because Apple stood up to the FBI in a very public way. Great marketing, Apple.
Most likely, Apple does care about the slippery slope of unlocking devices for law enforcement. (The same way Google cared when it learned user data was being intercepted on its private cables under the ocean.) But I don’t know Tim Cook personally. Even if I did, I wouldn’t feel more or less confident using Apple products, because Tim’s not omniscient – he can’t see or control everything going on within Apple.
What’s different about Android?
“I think people can generally trust me, but they can trust me exactly because they know they don’t have to.” –Linus Torvalds
What does that even mean? Well, Linus created Linux, and a customized version of the Linux kernel is the core of the Android operating system.
In other words, Linus Torvalds is the core genius inside every Samsung-Android smartphone at Best Buy.
Linux is “open source,” which means anyone can look at the code and point out flaws. In that sense, Linus Torvalds doesn’t have to be as omniscient as Tim Cook, because the Linux source code isn’t hidden behind closed doors. My understanding is, all the Linux code is out there for anyone to scrutinize, which is why certain countries feel safer using it – there’s no hidden agenda or secret “back door” lurking in the shadows that no outsider could ever check for. Does that mean Android phones are safer? That’s up for debate.
How security has changed
For a long time, Apple had the “security through obscurity” thing going for it. In simple terms, the bad guys go for the low-hanging fruit first, the easy score – and with its small market share, Apple wasn’t it. Windows was the low-hanging fruit. But now that Apple is more popular, it has a bigger target on its back.
As we depend more and more on smartphones – more people, more money, more at risk – there’s more incentive for hackers to penetrate deep into our devices.
If you read the book “Hackers” by Steven Levy, you know the original hackers were all about the “Hacker Ethic” which boils down to “Information wants to be free.” Sounds harmless enough. For whatever reason, the original hackers found secrets offensive, or they just saw “locked doors” as a technical challenge. Maybe they were idealists, but somewhere along the way, other interests crept in.
That leads us to the zero-day Apple exploit that has people concerned about their iPhones.
The origins of “zero day”
First, what does “zero day” even mean?
Back in the early 90s, a couple of my classmates were into downloading “0 day warez” – nerd speak for pirated games released the same day as the official product. Games had copy protection, so you couldn’t just buy a game and copy it for your friends; you had to buy your own copy. Hackers who figured out how to break the copy protection called themselves “crackers,” and crackers were competitive about who could crack a new game first.
For bragging rights, their goal was to crack a game within 24 hours, and that was the “zero day” game, as a full day had not gone by yet.
Fast-forward 20 years. Now you can watch the “Zero Day” movie on Netflix, and the meaning has shifted. Today a “zero day” is a security hole the software maker doesn’t know about yet – zero days have passed since the vendor learned of it, so no patch exists. Code exploiting such a hole could lurk undetected in your computer for years, because your anti-virus scanner can’t flag what nobody has seen before. As far as the actual terminology used among hackers, who knows?
Should you be concerned? Almost by definition, most people aren’t targeted by zero-day exploits. Once an exploit is released into the wild and exposed, it’s no longer as useful to attackers, because then it can be studied and whatever hole it used (to penetrate your phone) can be “patched” to block future intrusions. Then again, older unpatched phones could remain vulnerable and ordinary people could be affected.
Patches for Apple vs. Android
In Apple’s case, it controls both the hardware and the operating system, so it can patch these holes within days and push the fix to every supported iPhone at once. For Android, it might not be as fast, depending on the problem. It might take months for a patch to reach everybody, or the fix might never come – for example, it sounds like Samsung mostly prioritizes security updates for its flagship phones.
Why the difference? My understanding is, Google can fix apps and push out patches at the “app level” (through the Play Store) as fast as Apple, if the problem is specific to a certain app. Fixes to the operating system itself are another story: the Android market is larger, with more devices, and each phone manufacturer ships a slightly different, tweaked version of the core Android operating system. So each manufacturer pushes out OS updates on its own timeline.
Your best bet
If you want the latest (hopefully safest) operating system straight from Google as soon as possible, you’ll want an official Google phone – a “Nexus” branded device. From what I’ve read, Android 7 addresses this shortcoming to some degree with “seamless updates” that download and install in the background. But for now, the Android ecosystem remains fragmented.
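If you’re curious where your own Android phone stands, one way to check (assuming you have Android’s `adb` tool from the platform tools installed, and USB debugging enabled on the phone) is to ask the device what it reports:

```shell
# Ask a connected Android device which OS version and security patch
# level it reports. Requires adb (Android platform tools) and a phone
# with USB debugging enabled.
adb shell getprop ro.build.version.release         # Android version, e.g. "7.0"
adb shell getprop ro.build.version.security_patch  # patch date, e.g. "2016-09-01"
```

On the phone itself, the same date appears under Settings → About phone as the “Android security patch level” (on Android 6.0 and later). If that date is many months old, your manufacturer has stopped – or paused – sending you fixes.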
For the average person, what’s at risk? Identity theft, botnet spam, corporate espionage, and loss of privacy.