There are quite a few tools well known to the Android reversing community. The primary one is most likely smali/baksmali, an open-source pair that disassembles Android's DEX format (the bytecode run by Dalvik, Android's native VM) into smali, a format very similar to assembly language, and assembles it back again. A lot of people also like dex2jar, which further enhances the experience: it takes an unpacked APK, pulls out the compiled DEX classes, and attempts to convert them into readable JAR files. To make it even simpler, you can take that JAR, open it in something like JD-GUI to turn it back into readable Java code, and be off and running. For the lazy, there's also apktool, which does most of the above for you in a simple one-stop shop.
These are all great tools, but what else is out there? That's what I'll be covering in the next few articles. Today I'd like to point your attention to JEB (http://www.android-decompiler.com/). I discovered it back in February when it made its first public release. At the time I was knee deep in Android application security assessments as part of our IPA process, still primarily using the tools mentioned above, so it was nice to find something different (it doesn't use the open-source smali code as its disassembler) that is also an all-in-one solution for exploring and analyzing code.
I'll go into the main features in more depth and explain why they add so much benefit to this tool. The first is the Dalvik decompiler. From day one I noticed a difference between it and the smali disassembler: JEB uses the built-in properties and metadata of the DEX file to help interpret the code. I've also seen this change over the past couple of months as they've improved it; it's been optimized quite a few times. For me, the code decompiled by JEB was broken down further and easier to read. Even as a Java programmer, I prefer code to be in the simplest form when trying to analyze it, especially when dealing with unknown classes or libraries, not to mention obfuscated code.
JEB also has a few built-in features that make analysis great, allowing you to examine cross-references; rename methods, fields, classes, and packages; navigate between code and data; add comments anywhere; and even keep a notes tab. Putting this into practice is quite easy as well. I'd give a few examples, but JEB's video does a great job itself.
JEB not only gives you assembly and decompiled Java views; it breaks down the entire APK, giving you access to all the assets, manifests, string lists, and constants as well. Again, the other tools are capable of this, but they simply leave you with the basic file directories, whereas JEB gives you a tab for each item, listed out.
The JEB team also listens to its users and is active in development, with frequent updates and added features or changes coming directly from the community.
I think the only things I've noticed that I don't like about JEB are mostly personal and nit-picky. First, because it's much like an IDE in its layout, there's no easy way to change the styling of the code views. Yes, you can change it within the options, but you have to go through every item (comments, keywords, strings, etc.) and change each one separately. I did this once, but then accidentally overwrote those settings when I did an update. Which brings me to the second item: the update system requires you to download the latest version, archived with a password specific to you; you then unpack it and do what you want from there. JEB does a check on load to tell you whether there's an update, but you still have to apply it manually. I've spoken to the developers, and this is something they're working on, but it's not a priority. Lastly, the decompiler omits try/catch statements. It leaves a comment saying where they were, but it doesn't preserve any information that might have been needed. Specifically, if an application spits out an error message in a catch block, that string isn't there, and sometimes that string is useful for pinpointing certain items.
Otherwise, JEB has been a great addition to my toolset; it's one of the first things I reach for when I need a quick analysis of an APK. I'm just scratching the surface in this introduction to JEB, and I'll be doing a follow-up comparing it more deeply with some of its competitors and the other tools I'll be covering in these articles. Stay tuned!
On Saturday I was saved by a second factor of authentication.
I was playing the new SimCity game on my home computer in the basement, when my gaming session (surprisingly, it was playable that day) was abruptly terminated because my account had been logged on in a different location.
Seeing as how I only had one computer with the Origin software installed, I was surprised by this, so I restarted the game. It told me that I was logged on somewhere else, and if I logged on it would log me off the other location. “Sure, sure, whatever.”
A minute or two later, the same thing happens. Then I realize what’s going on.
I’ll admit, my Origin.com password was horrible. It was four characters long. That said, I was surprised that someone had bothered to capture it. The only game I have is the new SimCity, and it’s problematic as I alluded to in the earlier link.
So I again logged on, forced out the other user, and went straight to the “change password” link. Here's where things get fun. Origin's password reset system requires you to have not only the current password but also the answer to your “secret question”. The problem is, my secret question was now in Cyrillic. (Looks like this isn't uncommon.) So, I didn't know the answer.
Luckily, the “I forgot my password so please send it to my email” link does not require the secret question to get involved. So I clicked that, got the email, and used the link in the email to reset my password. After that, I spent 30 minutes waiting for an EA technician on chat support to help me change the “secret question”. So now I feel like my account is safe.
Then I walked upstairs and was greeted by six voicemail messages on my cellphone, all of them just strings of digits. This is one of the messages. It turns out the attacker had then tried to gain access to my email account using the password I had used on Origin, and had kicked off some password reset attempts. Since I had configured my email to require a second factor of authentication (a call to my cellphone) before allowing the password to be changed, my email account remained under my control – for now.
So the moral(s) of the story? Use a better password than I did, and make sure you set up second factors of authentication on all the accounts you can. As Smokey the Bear would say if he switched his focus to information security: “Only you can prevent account hijacking.”
Last week at the RSA Conference I had the opportunity to attend the “Mobile Security Battle Royale”, featuring a great panel of experts on mobile phone security. Moderated by Zach Lanier, the panel featured Tiago Assumpção and Collin Mulliner paired off against Charlie Miller and Dino Dai Zovi (co-authors of iOS Hacker’s Handbook).
As many great panels typically do, this panel featured no slides and no set talking points. Instead, Zach asked the panel some great questions to just get the ball rolling, and the panel started firing off great quotes left and right. I got busy live-tweeting the session and got (and re-tweeted) a few great quotes from many of the panel members which I have embedded below.
One of the recurring themes was “which is better”, comparing iOS to Android. BlackBerry/RIM got a few mentions as well, since Tiago worked for RIM for a long time. The panelists did not come to any final conclusion; all the platforms have their benefits and their drawbacks. However, as a “battle royale”, there was a certain amount of desire from the moderator and the audience to declare a winner. My belief is that iOS is currently ahead, but the battle is close. I'd tip my hat toward iOS at this time for two reasons. First, it is slightly more expensive and difficult to get an app into the Apple App Store than into Google Play, which makes things slightly harder for malware developers. Second, Apple iOS devices are generally running the latest version of the operating system, unlike the fragmented Android ecosystem, which has over half of its active devices running multiple major revisions behind.
Enjoy these quotes (paraphrased a little, I don’t have an eidetic memory) from this great panel discussion. I look forward to the rematch at next year’s conference.
“We’ve proven that we can get to the moon. I personally don’t have the resources so the moon is safe from my attacks.” @dinodaizovi
— DennisF (@DennisF) February 28, 2013
Good point by @collinrm there are 3 parties needed to upgrade android: carrier, hw manufacturer, and Google, only 1 has incentive to upgrade
— Peter Hesse (@pmhesse) February 28, 2013
I saw this article come across my news feed today, and I thought to myself “what a great idea for an article!” The title is The Petraeus Affair: Human Nature Beats IT Security Every Time.
I was thinking the article was going to be how General Petraeus and Paula Broadwell out-foxed the IT security measures in place at their various organizations to engage in (what they thought was) clandestine electronic communication. I figured the CIA would block access to GMail for security reasons, and yet these individuals were so determined to communicate they would have found a way. After all, most security controls can only defend against those willing to play by the rules.
Reading the article disappointed me because it wasn’t about that at all. Instead it was a simple attack on human nature. The following is a quote from the article:
No matter how much you try to drill security into your co-workers and families, human nature can always countermand common sense and security measures will be rendered worthless.
Saying phrases like “users are the weakest link”, “we have a layer 8 problem”, and “there’s no patch for stupid” elicits knowing head-nods from security consultants everywhere. I believe this mindset and approach toward security awareness is fundamentally wrong. Yet, it seems to be the majority view of the “thought leaders” of the information security industry.
Why is this a bad thing? Quite simply, it sets “security professionals” against the people they are seeking to protect.
- Arguing with negativity breeds divisiveness and creates a negative feedback loop. Take political campaign commercials as an example: while they may create short-term popularity boosts, ultimately they drive the two parties further apart.
- Calling people “users” objectifies them and creates distance as was so eloquently written about by Michael Santarcangelo.
- When you call individuals stupid and state they can’t help but “render security measures worthless”, you create a self-fulfilling prophecy. Children believing the negative things they are told about themselves is a contributing factor in generational cycles of poverty and poor educational performance.
I wish our industry would realize that we are all on the same team. We will never be able to address all the threats and risks to our information systems if we do not have the support and willingness of every individual who uses them. Why do we think it is a good idea to alienate them?
An attack on the South Carolina Department of Revenue exposed 3.6 million social security numbers, and about 387,000 credit and debit card numbers of South Carolina residents. Data breaches like this are so common, they are barely newsworthy… and we certainly try not to cover every single data breach event on this blog.
However, today’s followup to the story is what made it interesting. Governor Nikki Haley went on the record in a press conference trying to defend their lack of good practices. I’ve embedded the video below and hopefully it will start at the good part, 12:43 into the video:
This is a really good example of sending the wrong kind of message. I understand her desire to defend the state workers who failed to foresee this type of breach and to adequately protect their citizens’ information. I also agree that she might be right – there are many situations in which social security numbers don’t get encrypted. However, I’d like to break down some specific problems with the way she made this statement.
- By saying “a lot of banks don’t encrypt” she is essentially lumping the practices of the banks in with the practices by the S.C. Department of Revenue. However, I don’t think I’m going out on a limb by saying most banks have better security controls and incident response capabilities than the S.C. Department of Revenue. Not encrypting is not the same as not protecting, and there are definitely different ways to protect information.
- Another statement she made against encryption was “because it is very complicated.” Yes, these days we are facing complex challenges and sometimes the actions we have to take in response are also complex. Encryption is meant to be complicated. You wouldn’t want just anyone to get those social security numbers, right?
- “It is cumbersome and there’s a lot of numbers involved with it.” Again, making too much of how complicated it is. Never mind the fact that encryption is actually pretty easy these days, you have a social, governmental, and fiduciary responsibility to protect that information. And “a lot of numbers”? Really? Are we channeling Teen Talk Barbie?
- “It’s not just that this was a Department of Revenue situation, this is an industry situation.” Actually, this is just a Department of Revenue situation. The industry is working to get better. The industry and government are working together to pass standards and regulations. Forward-thinking organizations are proactively assessing themselves and trying to get better. The industry is being held accountable, and so should the state of South Carolina.
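On the “very complicated” point: modern libraries have made symmetric encryption a few lines of code. Here's a minimal sketch using the third-party Python `cryptography` package (my own example, not anything from the press conference; any mainstream crypto library works similarly):

```python
from cryptography.fernet import Fernet  # third-party: pip install cryptography

# Generate a key once and store it safely (e.g., in a key management system).
key = Fernet.generate_key()
f = Fernet(key)

# Encrypting a sensitive value, like a social security number, is one call...
token = f.encrypt(b"123-45-6789")

# ...and so is decrypting it, for any code that holds the key.
assert f.decrypt(token) == b"123-45-6789"
```

The genuinely hard part is key management and deciding where decryption is allowed to happen, not the encryption itself.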
Governor Haley, you sent the wrong message to the public today. You tried to deflect blame and throw other organizations and the industry under the bus. Instead, you need to take a long look at what you’re doing to protect information and promise to your citizens that you’ll work to do better.
Yesterday, this story on Wired was making the rounds: How a Google Headhunter’s E-mail Unraveled a Massive Net Security Hole. Sure, the title is probably hyperbole, but it is an interesting story. At a high level, mathematician Zach Harris noticed that emails from Google – and from several other prominent domains including eBay, PayPal, Yahoo, Amazon, etc. – could be spoofed.
Anyone who has ever run telnet to port 25 and sent an email from firstname.lastname@example.org or email@example.com knows that email has always been pretty easy to spoof. Given the rise in unsolicited email, also known as spam, something had to be done. In 2006, a working group was formed to try to create a standard that would make email harder to spoof, called DomainKeys Identified Mail, or DKIM. The way DKIM works is that the sending email server applies a digital signature to the mail message headers (completely transparently to the user). The public key that can be used to verify the signature is published in the sending domain’s DNS records.
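For reference, the public key lives in a TXT record at `<selector>._domainkey.<domain>`. A hypothetical record (the selector name and the truncated key value here are made up for illustration) looks something like this:

```
s2013._domainkey.example.com.  IN  TXT  "v=DKIM1; k=rsa; p=MIGfMA0GCSqGSIb3DQEBAQUAA4GNADCBiQ..."
```

The `p=` tag holds the base64-encoded RSA public key, which is where the key sizes discussed below come from.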
Since it uses some fancy technology like digital signatures and relies on DNS – a core capability of the internet – email with a valid DKIM signature can generally be trusted to be legitimately from the domain listed in the From field… well, that is, if you’re doing DKIM correctly.
Mr. Harris found that the DKIM keys for a number of popular domains were so weak that they were very easily cracked. While the DKIM standard calls for 1024-bit keys, he found popular domains with 768-bit, 512-bit, and even 384-bit keys. As he points out in the article, he was able to crack the 384-bit keys on his laptop, and for $75 he cracked some 512-bit keys by outsourcing the computing power to Amazon Web Services.
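To make the key-length point concrete, here's a toy sketch of my own (not from the article): recovering the secret prime factors of a comically small "RSA modulus" by brute force. Real attacks on 384- and 512-bit keys use far better algorithms (the general number field sieve) plus rented compute, but the principle is the same: a too-small modulus can simply be factored, and factoring the modulus reveals the private key.

```python
def smallest_factor(n):
    """Naive trial division: hopeless against real key sizes,
    instant against a toy modulus."""
    i = 2
    while i * i <= n:
        if n % i == 0:
            return i
        i += 1
    return n

# A toy modulus built from two well-known small primes.
# (A real 1024-bit RSA modulus is over 300 decimal digits long.)
p, q = 104729, 1299709
n = p * q

f = smallest_factor(n)
print(f, n // f)  # recovers 104729 and 1299709 almost instantly
```

Each added bit of key length doubles the work for this kind of search, which is why the jump from 512 to 1024 bits (and today, to 2048) matters so much.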
The best quote in the entire article, and the reason I bring this story to your attention is the following:
“People who use cryptographic tools need to realize that local configurations need to be maintained just like software updates need to be maintained,” he says. “In 1998 it was an academic breakthrough of great concerted effort to crack a 512 bit key. Today little old me can do it by myself in 72 hours on AWS. The field of cryptography keeps developing and breaking new ground just like everything else, and you can’t just install a private key, or select a hash algorithm, and expect it to be good forever.”
Wise words, indeed. If connected to any network, you must consider your operating system, your software, and your cryptographic algorithms and keys all as perishable items. They need to be examined for mold and rot periodically, and replaced when necessary. You can no longer afford to “set and forget” anything and consider it dependable.