The huge increase in corporate mobile applications over recent years has also brought with it a need to secure the information they contain. The integration of security into the software development lifecycle is becoming more and more important, especially in mobile environments.
Why is this?
For many organizations in the past, security was an afterthought. Once the functional requirements of the software had been met, security was added but only if time and budget allowed.
Not only is valuable corporate data potentially at stake; attackers are ever more advanced, and the threat of large fines for data breaches has grown with the EU General Data Protection Regulation (GDPR) and the Payment Card Industry (PCI) requirements for those handling payment card details. Organizations are realizing they can't afford to ignore any aspect of information security.
Given that mobile applications can contain a wide range of data, from corporate documents and banking information to flight details and sensitive personal information, it is no surprise that they are being targeted by hackers and need to be every bit as resilient as their web application counterparts.
What does this mean for the applications we develop?
Organizations that create these applications need to devote enough time to ensure security is given the same priority as the functional requirements. This is done through an appropriate Secure Software Development Lifecycle (SSDLC).
Although this sounds complicated, in its basic form this is simply ensuring that:
- Applications have defined security requirements from the beginning
- Developers understand secure coding techniques/principles
- Time is allocated for security testing
Defined security requirements
Most software projects have a set of requirements from the outset (e.g. what the application is for and what it must do), and security needs to be part of this definition. These form an additional set of functional requirements for the application development and are applicable to both in-house and externally developed applications.
As an example, where a mobile application has a requirement for a user to log in on a device while offline, the security component of that stipulation might be that the login is performed securely. This ensures that the envisaged user need is fulfilled without compromising data security. These requirements need to cover the application's data in transit, in use and at rest. During previous application testing engagements, BSI Cybersecurity and Information Resilience (BSI) has seen a range of security issues at this stage, from being able to download and view username and password combinations for the entire user base to being able to bypass the login dialogue completely.
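To make the 'secure offline login' requirement concrete, here is a minimal sketch (in Python rather than a mobile language, purely for brevity): the device stores only a salted PBKDF2 hash of the credential, never the plaintext, and verification uses a constant-time comparison. The function names are illustrative, not taken from any particular framework.

```python
import hashlib
import hmac
import os

def hash_password(password, salt=None):
    """Derive a salted PBKDF2 hash suitable for offline verification.

    Only the (salt, digest) pair is stored on the device; the plaintext
    password is never written to disk.
    """
    salt = salt or os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 200_000)
    return salt, digest

def verify_password(password, salt, expected):
    """Re-derive the hash from the entered password and compare."""
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 200_000)
    # Constant-time comparison avoids leaking information via timing.
    return hmac.compare_digest(candidate, expected)

salt, stored = hash_password("correct horse battery staple")
print(verify_password("correct horse battery staple", salt, stored))  # True
print(verify_password("wrong guess", salt, stored))                   # False
```

On a real device, the stored hash would additionally live in platform-protected storage (e.g. the Android Keystore or iOS Keychain) rather than a plain file.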
Security-trained developers
In order to 'bake in' security, developers need to know best-practice security principles and what good security looks like. Understanding how hackers attack applications, and perhaps performing some of those attacks themselves, ensures that vulnerabilities are caught early and dealt with as part of the development process.
This also extends to the supporting web services that the mobile applications point to. These are often a point of compromise, as developers overlook them as an entry point because the mobile applications that interact with them have limited capabilities.
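One of the most common ways those supporting web services are compromised is injection through unvalidated input. A brief sketch of the standard defence, using parameterized queries (shown here with Python's built-in sqlite3 module as a stand-in for whatever database the service actually uses):

```python
import sqlite3

# In-memory database standing in for the web service's backend store.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, secret TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 's3cret')")

def lookup_user(name):
    # Parameterized query: the user-supplied value is bound as data, never
    # spliced into the SQL string, so an injection payload is treated as a
    # literal string rather than executable SQL.
    return conn.execute(
        "SELECT name FROM users WHERE name = ?", (name,)
    ).fetchall()

print(lookup_user("alice"))             # [('alice',)]
print(lookup_user("alice' OR '1'='1"))  # [] - the payload matches nothing
```

Had the query been built by string concatenation, the second call would have returned every row in the table.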
As an example, in a recent mobile application test, BSI discovered that the application was logging too much information to the device, including credentials used to access the server. This meant that anyone able to physically access the phone would have been able to access these credentials and therefore the user's account. Combined with other issues affecting the application, a malicious user could potentially have compromised all the data within the server database, including sensitive personal details of all users.
Of course, new vulnerabilities are found all the time, so even the best mobile applications won't be 100% vulnerability-free forever. Solid security knowledge combined with an effective SSDLC will help developers avoid the types of mistakes that see an app fall at the first hurdle, or worse, end up on the front page of a newspaper.
Time allocated for security testing
Once the applications are being developed with security in mind, the next step is to ensure they are thoroughly tested before deployment. This requires building in time (and cost) for penetration testing to ensure applications are independently tested.
It's important this is carried out by someone independent of the developers; after all, the developers know how the application works (or should work) and are less likely to take the avenues a real attacker would. Let's face it, it's very hard to thoroughly and objectively attack your own work.
It's also important not to leave this until after the 'go-live' stage as the application could be vulnerable and already being exploited by the time any remediation work following the testing is carried out.
As part of our testing activities we have seen applications with show-stopping vulnerabilities, whether in the applications themselves or in the supporting web services. These could have been avoided with some security knowledge and an early penetration test.
It's important to understand that although mobile applications look and feel different from other software, they can be attacked in very similar ways. It is also important for organizations to understand that the information their applications transfer and process needs to be protected as with any other asset. Implementing a suitable SSDLC would go a long way to ensuring applications are built with this in mind.