February 24, 2014

A Very Serious Apple Flaw, Which Apple Must Answer For

Over the weekend Apple quietly rolled out a patch to its mobile iOS operating system that security experts say fixes one of the most serious security flaws ever. But this time, simply fixing it will not suffice.


Apple released iOS 7.0.6 for its iPhone, iPad and iPod line of products this weekend. The update was rolled out rather quietly, but once the tech world tore into it, what it discovered was something far scarier.

The culprit behind what may be one of Apple’s biggest security snafus is an extra “goto” in one part of the authentication code, Wired reported. That spurious line of code bypasses the rest of the authentication checks.

The bug could allow hackers to intercept email and other communications that are meant to be encrypted, according to a Reuters report issued late Friday night.

Having seen the affected code, I can say it could be a simple mistake. I’ve been a developer for almost 30 years and God knows a day doesn’t go by where I don’t make a quick mistake. But mistakes are easy to find with proper practice. Let’s use this one as a perfect example of how proper practice can stave off disaster.

The offending code was very simple:

if ((err = SSLHashSHA1.update(&hashCtx, &signedParams)) != 0)
    goto fail;
    goto fail;

The problem was that line “goto fail;” appearing twice. If-then blocks are one of the most common things we write in development. They control the logic flow of our programs, which is like oxygen to life. In the code above, taken from the Apple flaw, if the “if” test evaluates to true, then the next line, and only the next line, of code is run. That means the second “goto fail;” is always executed, no matter the outcome of that “if” statement.

When I am coding, I never write my “if” statements like that. Instead I always put the logic that occurs when the “if” statement tests true inside curly brackets. So if I were writing this, my code would look like this:

if ((err = SSLHashSHA1.update(&hashCtx, &signedParams)) != 0) {
    goto fail;
    goto fail;
}

What the curly brackets do is tell the code to execute everything within them only if the statement evaluates to true. It’s known as a “code block”. So by taking the two extra keystrokes to add the curly brackets, that second “goto fail;” becomes a non-issue: it only runs when the test is true, and jumping to “fail” twice in a row is harmless. Something so simple would have prevented a security flaw that exposed millions of Apple users to a huge security hole.

Not only that, but there are also things referred to as unit tests in the development world. In a nutshell, these are small extra programs that run against the code you write, exercising both the success and failure paths. They are boring and tedious to write, but I find them an absolute necessity. Apparently Apple doesn’t follow this practice here, which is really alarming to me. Of course, when you are a company like Apple, worried about release deadlines, you quickly forgo certain steps, like writing tests. After all, they want the big stock market hit of releasing on time, stability or user security be damned!

But this is just the technical side of things. There’s much more to be alarmed about here.

Are Mac laptops and desktops still vulnerable?

The answer to that is yes:

Then researchers found the same bug is also included in Apple’s desktop OSX operating system, a gaping Web security hole that leaves users of Safari at risk of having their traffic hijacked. 

Apple has promised a fix “soon”, but as I’ve shown, the fix is very minor, so why wait? In a developer’s world, this patch is about as simple as it gets. But Apple would rather screw around and let the users of their computers risk being hacked? Even worse is the fact that this bug is now widely known, meaning hackers can immediately start attacking computers using it. Apple dragging its feet on a patch for this can only be described as neglect.

But could the NSA be behind all this?

If you have read my blog in the past then you know I have played devil’s advocate on a lot of the NSA revelations, but there is alarming evidence to suggest that maybe the NSA was behind this. Instead of me going into all the details, I recommend reading this article at Raw Story and I’ll give you the very brief bullet points:

  • Sept. 24, 2012 – Apple releases iOS 6, which is where this “bug” first appeared. Yes, it has been out there that long, Apple has known about it, and it hasn’t been fixed. More on that shortly.
  • Oct. 2012 – NSA documents from that month, later released by Edward Snowden, show the agency bragging that it had hacked Apple.

Now this could very well be coincidence, but let me put my developer hat back on and explain why it is very easy to believe.

I showed you above how easy it is to read programming code. Computer languages are made that way so that they are easy to work in. That is what’s referred to as the “source code”. But what you run on your computer isn’t the source code, unless it’s a scripting language. Instead there is a process between writing the code and getting something you can run. That process is called compiling. What happens here is that all that code, which is easy for humans to read, is transformed into something computers actually understand. This is the binary, and that’s what makes the programs on your computer actually run.

Now trying to hack into compiled code takes a lot of work. You need to attempt to decompile the code from its raw computer language back into something easy for humans to read. In the process a lot of things get mangled. A developer can get an idea of what SSLHashSHA1.update would do from its name, but decompiling loses that friendly name.

Knowing that, the easiest way for a hacker to affect code is at the source code level, before it is compiled into that binary format. Now almost every company uses what’s called a source code repository. I have one for my company, which tracks all source code, changes and anything else I want. Of course this repository sits on a network and is accessible via the internet. I’ve never worked for Apple, but I can safely assume they do something similar, as it is just that common and helps enormously with productivity.

So now that we know about compiling and source code repositories, if I were the NSA wanting to infect millions of devices with minimal work, I would simply hack into a company’s code repository, change that easy-to-read human code, and let that company distribute my “backdoor” for me. Talk about a way of saving manpower. It’s a hacker’s dream and a trick I could easily see the NSA employing, especially given the evidence above.

What should happen now?

As I said, Apple reportedly knew about this bug almost 18 months ago. This is no minor thing either. We’re talking about a flaw that allows an attacker to see what data is sent between you and your bank, email and anything else. It is an enormous flaw and Apple chose to ignore the very simple fix for well over a year.

To me this is absolutely unacceptable and should be criminal. What motivation did Apple have for leaving this bug in there for so long and not fixing it? They knew it would get out eventually and that ignoring the issue would create a PR nightmare. So there must have been a compelling reason to take that route.

With what we have seen recently with the Target hack and others, this issue cannot be swept under the rug. I firmly believe the DOJ should open an investigation into this flaw, why it was let go for so long, and why it is still exposed on one of Apple’s main product lines. Apple isn’t only risking its users’ privacy; it has opened a huge hole for financial fraud.

If the DOJ won’t look into this, then I wonder if we may see some civil action. Like I said, this just developed over the weekend, so I can see the story growing even more in the coming days and attorneys starting to ask questions. Perhaps someone in Congress should do the same.

The final thing I’m sitting here wondering as I write this is how this would be handled if it were Microsoft instead of Apple. We know that the legal minds love going after Microsoft, and I have a feeling we would already be hearing rumblings of lawsuits and legal action. Shouldn’t Apple be held to the same standards?

I know this has greatly reduced my trust in Apple, and you would be a fool if it didn’t do the same to you. Apple really needs to answer for this, and if they don’t, then hopefully consumers will show their outrage by not purchasing Apple products. That is the only hope we may have in our free market society.
