Bring-your-own-device (BYOD) policies have brought stress and consternation to the world of corporate data security. Plenty of attention is being paid to issues like network management and the potential security threat that poorly engineered devices pose to enterprise organizations. However, poor software development practices in the mobile space might be creating a much greater threat to corporate security than any mismanaged network ever could. A lost or stolen cell phone in the wrong hands has the potential to wreak havoc on the corporate intranet, and it's all because of the code. That's the message Godfrey Nolan, founder of Research Into Internet Systems LLC, or RIIS, and author of Decompiling Java and Decompiling Android, shared at the AnDevCon event last week in San Francisco.
Is your 'Droid full of holes?
The fact is, it's as easy for hackers to exploit an Android application as it was for Obi-Wan Kenobi to sneak onto the Death Star undetected. And the results can be just as catastrophic for your corporate IT department if a cell phone running even the simplest piece of corporate software falls into the wrong hands. Of the many corporate data security risks the chief security officer (CSO) should be worried about, here are three major, code-based security holes Nolan encounters on a regular basis:
Hackers can gain complete access to developers' source code using simple, readily available decompilation tools. "People may not realize that hackers have complete access to a developer's code, because they can reverse-engineer it using some fairly simple decompilation tools," Nolan said. The APKs (Android application package files) are just sitting out there on mobile devices, waiting to be reverse-engineered and abused.
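There is nothing exotic about peeking inside an APK: it is an ordinary ZIP archive, and the compiled code lives in an entry named classes.dex that any unzip routine can pull out before a decompiler goes to work. The sketch below builds a stand-in "APK" in a temp file and lists its entries with nothing but the standard library; the file names mirror a real APK's layout, but the archive itself is fabricated for illustration.

```java
import java.io.*;
import java.util.*;
import java.util.zip.*;

// An APK is just a ZIP archive. This demo builds a stand-in "APK" and
// lists its entries -- the same first step attacker tooling takes to
// locate classes.dex before reverse-engineering it.
public class ApkPeek {
    // Create a minimal archive with the entries a real APK would carry.
    static File buildFakeApk() throws IOException {
        File f = File.createTempFile("demo", ".apk");
        try (ZipOutputStream zos = new ZipOutputStream(new FileOutputStream(f))) {
            for (String name : new String[] {"AndroidManifest.xml", "classes.dex", "resources.arsc"}) {
                zos.putNextEntry(new ZipEntry(name));
                zos.write(new byte[] {0}); // placeholder content
                zos.closeEntry();
            }
        }
        return f;
    }

    // No special tools needed to inspect it -- just java.util.zip.
    static List<String> listEntries(File apk) throws IOException {
        List<String> names = new ArrayList<>();
        try (ZipFile zf = new ZipFile(apk)) {
            Enumeration<? extends ZipEntry> e = zf.entries();
            while (e.hasMoreElements()) names.add(e.nextElement().getName());
        }
        return names;
    }

    public static void main(String[] args) throws IOException {
        File apk = buildFakeApk();
        System.out.println(listEntries(apk)); // classes.dex is right there
        apk.delete();
    }
}
```

In a real attack, the extracted classes.dex would then be fed to a decompiler to recover readable source, which is why shipping an unprotected APK amounts to publishing the code.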
Organizations are not making sure that corporate data is secure. Even usernames and passwords are often sitting there, unencrypted, on the device. "Usernames and passwords are often stored in plain text, so if the phone is lost, anyone that gets access to the phone may potentially get access to somebody's data," Nolan said.
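The fix for plain-text credentials is an old one from the server side: never persist the password itself, only a salted one-way hash of it. Below is a minimal sketch using the JDK's built-in PBKDF2 support; the class and method names are illustrative, not from any particular app Nolan examined.

```java
import javax.crypto.SecretKeyFactory;
import javax.crypto.spec.PBEKeySpec;
import java.security.SecureRandom;
import java.util.Base64;

// Sketch: store a salted PBKDF2 hash instead of the plain-text password.
// If the phone is lost, the attacker recovers only the salt and the hash,
// neither of which is usable as a credential on its own.
public class CredentialStore {
    // Derive a hash from the password; only the salt + hash get persisted.
    static String hashPassword(char[] password, byte[] salt) throws Exception {
        PBEKeySpec spec = new PBEKeySpec(password, salt, 100_000, 256);
        SecretKeyFactory skf = SecretKeyFactory.getInstance("PBKDF2WithHmacSHA256");
        byte[] hash = skf.generateSecret(spec).getEncoded();
        return Base64.getEncoder().encodeToString(hash);
    }

    public static void main(String[] args) throws Exception {
        byte[] salt = new byte[16];
        new SecureRandom().nextBytes(salt); // a fresh random salt per user
        String stored = hashPassword("s3cret".toCharArray(), salt);
        System.out.println("Persisted value: " + stored);
    }
}
```

The same derived key can later be recomputed from the user's input and compared against the stored value, so the real password never needs to touch the device's storage at all.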
Few companies are seriously considering the threat posed to their back-end systems by mobile devices. "Mobile phones, the way I see it, are client/server apps -- they communicate with the back end -- if they're communicating via an API [application programming interface] with a username and password, that is often done in clear text," Nolan said. This creates a gateway for hackers to directly access the server, and this is something that should make CSOs cry. Once ne'er-do-wells have their mitts on the right logon, they can engage in everything from massive theft of data and intellectual property to sabotage and outright destruction of information systems.
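It is worth spelling out why "clear text" credentials on an API call are so dangerous: the common HTTP Basic scheme merely Base64-encodes "username:password", and Base64 is an encoding, not encryption. The short demo below, with made-up credentials, shows how trivially anyone who intercepts the header recovers the logon.

```java
import java.nio.charset.StandardCharsets;
import java.util.Base64;

// HTTP Basic auth just Base64-encodes "username:password". Anyone who
// sniffs the request off an unencrypted connection can reverse it
// instantly. The credentials below are fabricated for illustration.
public class BasicAuthDemo {
    // Build the value that would follow "Authorization: Basic " on the wire.
    static String encode(String user, String pass) {
        return Base64.getEncoder()
                     .encodeToString((user + ":" + pass).getBytes(StandardCharsets.UTF_8));
    }

    // What an eavesdropper does with the intercepted header value.
    static String decode(String headerValue) {
        return new String(Base64.getDecoder().decode(headerValue), StandardCharsets.UTF_8);
    }

    public static void main(String[] args) {
        String header = encode("alice", "s3cret");
        System.out.println("Authorization: Basic " + header);
        System.out.println("Recovered: " + decode(header)); // alice:s3cret
    }
}
```

The standard mitigation is to send credentials only over TLS (HTTPS) and, better still, to exchange them once for a short-lived token rather than replaying the password on every request.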
And just how bad is the problem? Nolan describes downloading and testing a number of Android applications with his team, and finding that only one was well-protected from an application security standpoint. "Out of the hundred or so APKs that we've downloaded, we would say that only one was well-protected, and everything else had information that was leaking or was available just in plain text if you reverse-engineered the code," he said.
Mobile application developers can take steps to seal up these security holes. Putting a proper governance structure in place is a good start. "From a governance model, one of the things we should do is take the same sharp look at the code as we do on the server-side part of the house, and apply that to Android," Nolan said. Governance models around code quality and testing have always existed on the server side, but for some reason, Java developers in the mobile space don't seem to code within the same rigorous guidelines.
As organizations rush into the mobile space, and developers are rushing to release mobile applications on shorter development cycles than ever before, it's no surprise to discover that many common-sense practices are falling by the wayside. But if developers continue to pepper their software with plain text passwords and deploy applications with unobfuscated code, the reports of security breaches and access violations will begin to mount, and secure mobile development in the Java space will have its reputation unnecessarily sullied. Employing common-sense development practices is not only good for the Android development market, it's also part of the basic responsibility we all have as developers.
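One concrete, low-effort step against decompilation is enabling an obfuscator such as ProGuard, which ships with the Android SDK tools. The fragment below is an illustrative minimal rules file, not a complete configuration; the kept classes are the framework entry points that Android must be able to find by name, while everything else gets renamed to meaningless identifiers.

```
# proguard-project.txt -- illustrative minimal rules
# Keep the entry points the Android framework instantiates by name:
-keep public class * extends android.app.Activity
-keep public class * extends android.app.Service

# Preserve generic type signatures needed for reflection-based libraries:
-keepattributes Signature
```

Obfuscation is not encryption, and a determined attacker can still work through renamed code, but it raises the cost of reverse engineering considerably compared with shipping an APK as-is.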