Historically the goal of security for most companies was nice and simple: keep the bad guys out. And it was easy to classify who the bad guys were. The bad guys were everyone outside the company on untrusted external networks.
This approach worked pretty well until a number of reports emerged stating that internal threats accounted for around 50 to 60 percent of the total security threats to a company. Now the bad guys were harder to classify. This revelation required a new approach to security, one that revolved around keeping the bad guys out even if they were already inside the network. Companies ensured all their systems were patched and “hardened” so that no attacker would be able to break in. This approach too lasted a number of years and was largely successful.
Recently, however, there has been a marked move towards a new security approach, one that revolves around securing the data itself as opposed to just the systems and networks that hold it.
Why the somewhat radical shift in focus and the need for a new approach? There are a multitude of reasons, but the primary drivers are increased awareness of the value of data and the failure of existing security approaches to protect it.
Companies are realizing, now more than ever, that their confidential data is in many cases the lifeblood of their business, and that its loss or theft could be critical.
A simple example is the change in how lost laptops are treated. Previously, a laptop left in the back of a taxi would have been written off as just the replacement cost of the hardware; the total cost to the business would be perhaps €500. Now, however, companies realize that the data stored on the laptop may be worth a whole lot more. How much would that laptop be worth if it held all your customer records? While it’s hard to put an exact figure on it, you can guarantee it’s a lot more than €500, especially if you didn’t have backups of the data!
Of course this is just one example of how former security controls don’t adequately secure the data. Another major source of new risks to data is the fact that companies now want to share and integrate more with their customers and suppliers than ever before. Integration is no longer a competitive advantage; it’s a requirement. This often involves giving external users access to internal systems and applications, many of which were previously hidden from public view by layers of firewalls. This increased requirement to expose internal applications to the public has emerged in conjunction with the explosion in web application security research. The problem here is that many of the vulnerabilities within web applications are exploited through the normal functionality of the application, to which firewalls and traditional security measures have no visibility.
So what does this shift in approach actually entail for the average security manager? One of the most significant changes is that the security manager must now interact more with the data owners. Previously, if a new financial application was being introduced, the security team would harden the server and then restrict access to it using a firewall. At that point their job was often done. The finance team would administer the application and that would be the end of it. Under the new data security approach, however, the security team would also have to speak with the finance team to determine what kind of data will be stored in the new system, how sensitive that data will be, who will require access, and where the data will flow both within and outside of the application.
Essentially, the main elements of data security are data classification, data encryption, data integrity and data access control.
Data classification is a key element of the data security model because unless you know how sensitive the data is, you can’t assign adequate security controls to protect it. For example, if you have two file servers on your network, one which stores all your company’s intellectual property and one which stores employees’ personal photos, which are you going to prioritise and assign greater levels of control to? When stated like that it seems obvious, but how many people really know what’s stored on their file servers? And if you don’t know what’s there, you’ll either over-spend on protecting data that isn’t sensitive or under-spend and leave sensitive data exposed.
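In its simplest form, a classification scheme is just a lookup from sensitivity label to minimum controls. The labels and control names below are illustrative assumptions, not any particular standard:

```python
# Minimal sketch: map illustrative sensitivity labels to minimum controls.
# Label names and control lists are assumptions for illustration only.
CONTROLS = {
    "public":       [],
    "internal":     ["access_logging"],
    "confidential": ["access_logging", "encryption_at_rest"],
    "restricted":   ["access_logging", "encryption_at_rest", "drm"],
}

def required_controls(label: str) -> list:
    """Return the minimum controls for a classification label."""
    # Data whose classification is unknown defaults to the strictest handling,
    # since unclassified sensitive data is exactly the risk described above.
    return CONTROLS.get(label, CONTROLS["restricted"])
```

The point of the default is the one made above: if you don’t know what a data store holds, the safe assumption is that it’s sensitive.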
Data encryption is a rapidly growing area of security. While security managers have long been familiar with the use of encryption for securing data in transit over public networks such as the Internet, encrypting data at rest on file servers or in databases is a relatively new concept.
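As a sketch of what at-rest encryption looks like in practice, the snippet below uses Fernet symmetric encryption from the third-party Python `cryptography` package (one tooling choice among many, assumed here for illustration) to encrypt a record before it is stored:

```python
from cryptography.fernet import Fernet  # third-party: pip install cryptography

# In practice the key would live in a key-management system, never beside the data.
key = Fernet.generate_key()
cipher = Fernet(key)

record = b"customer record: Jane Doe, account 12345"
token = cipher.encrypt(record)  # ciphertext, safe to write to disk or a database

# Only a holder of the key can recover the plaintext.
assert cipher.decrypt(token) == record
```

This is also why the laptop-in-a-taxi example changes: if the disk holds only ciphertext and the key is elsewhere, the loss reverts to roughly the cost of the hardware.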
If data encryption is a new concept for many, data integrity is stranger still. Data integrity boils down to a simple question, but one that is often unanswerable: how do you know the data hasn’t been changed since you entered it? There are many cases where data integrity is even more important than data encryption. For example, if your company produced medicine, would you know if someone altered the formula just prior to a new batch being produced?
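One common way to make that question answerable is a keyed message authentication code (MAC): store a tag alongside the data when it is entered, and recompute it on every read. A minimal sketch using Python’s standard library (the key handling and field contents are hypothetical):

```python
import hmac
import hashlib

SECRET_KEY = b"keep-this-in-a-key-store"  # assumption: a managed secret, not hardcoded

def sign(data: bytes) -> str:
    """Compute an HMAC-SHA256 tag to store alongside the data."""
    return hmac.new(SECRET_KEY, data, hashlib.sha256).hexdigest()

def verify(data: bytes, tag: str) -> bool:
    """Recompute the tag and compare in constant time."""
    return hmac.compare_digest(sign(data), tag)

formula = b"compound A: 500mg per tablet"
tag = sign(formula)

assert verify(formula, tag)                              # untouched data verifies
assert not verify(b"compound A: 900mg per tablet", tag)  # tampering is detected
```

Without the secret key an attacker can alter the data but cannot produce a matching tag, so the change in the medicine formula would be caught before the batch was produced.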
Data access control may seem like an area that’s already addressed by existing access controls, but can you restrict access to the data throughout its life? You may only allow the finance department to access the budget files, but how do you restrict access to a budget file once it’s been copied to a USB key or emailed to an anonymous email account? Technologies such as DRM allow you to ensure that only members of your finance department can open the file, so that if the file is sent outside the organisation it will be useless to anyone else.
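The policy decision at the heart of that idea can be sketched as a group check bound to the file rather than to a server share. Real DRM enforces this wherever the file travels, by keeping the content encrypted and checking a licence before releasing the key; the simplified sketch below (users, groups and filenames are all hypothetical) shows only the decision itself:

```python
# Minimal sketch of group-based access to a file, independent of its location.
# The user/group mapping and per-file ACL are illustrative assumptions.
GROUPS = {"alice": {"finance"}, "bob": {"engineering"}}
FILE_ACL = {"budget.xlsx": {"finance"}}

def can_open(user: str, filename: str) -> bool:
    """A user may open a file only if they share a group with its ACL."""
    return bool(GROUPS.get(user, set()) & FILE_ACL.get(filename, set()))

assert can_open("alice", "budget.xlsx")       # finance member: allowed
assert not can_open("bob", "budget.xlsx")     # outside finance: denied
assert not can_open("mallory", "budget.xlsx") # unknown user: denied
```

The crucial difference from a file-server ACL is where the check runs: because it is evaluated when the file is opened, the copy on a USB key or in an outsider’s inbox remains useless.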
Naturally, each of these new controls brings with it many obstacles to overcome, such as locating data, educating employees on classification, managing keys and supporting mobile workers. These are the challenges of the future.
The next time you think about security, don’t just think about how to keep the bad guys out, think about how to keep the data secure.