Critical data exchange in the cloud
The pace of business today requires that critical information be accessible anywhere, anytime. Furthermore, this sharing of information often happens among both internal and external parties. Hosted services are an important tool for enabling this exchange of critical information, but while they facilitate communication, they bring additional risks and challenges to the organizations that use them. Technology is rapidly evolving to meet this challenge.
One example of this new technology is the use of cloud-based services. Despite the benefits cloud-based services offer, including an enhanced security network effect, lower costs, easy implementation and on-demand capacity, there are some significant impediments to adoption. After moving to the cloud, it can be difficult for organizations to demonstrate compliance with their security policies and with any regulatory requirements under which they operate. Many organizations, in an attempt to circumvent IT or to quickly capitalize on lower costs, move into the cloud without checking the compliance or security ramifications. As a result, IT teams feel they can lose control over location or access rules for data that needs to be shared within and outside the firewall. When that happens, the first audit often shuts down the project and indefinitely suppresses the appetite to try any cloud-based service at all.
To gain the benefits of cloud computing without losing control, it is imperative to vet any prospective cloud service provider against your compliance and audit requirements. At a minimum, a cloud services provider must be able to account for the location of your data at all times, protect it from theft while in their custody, both at rest and in transit, and keep audit trails that record who accessed data and when. Demand flexible data classification and authorization schemes that give you control over who can access which parts of your data. Additionally, your provider should offer premium security features such as “stronger-than-password” authentication and protection of data “in use”.
Basic security considerations
The foundation of every service is the infrastructure on which it runs. Saving money by selecting a cut-rate data hosting facility is not a good strategy if the target customer base is security-savvy or regulated, as in the financial services or life sciences industries. Hosting providers must hold security certifications, such as SAS 70 Type II and ISO 27001, which can be shared with customers for review. Another important benefit of certification is that it promotes good practices and helps identify weaknesses in processes.
Disaster recovery sites should be of equal quality and certification so that switching between facilities will not put the service provider or the customer out of compliance. The vendor’s network design must provide a means to clearly separate customer data. If the data is not separated at this level, then it should provide the architecture for the database and the application tiers to enforce data boundaries. A good design will also make applications independent from data location, so physical files can be stored in any geographic location the customer requires.
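One way to enforce the data boundaries described above is to apply the customer (tenant) filter inside the data-access layer itself rather than trusting each caller to remember it. The sketch below illustrates the idea with hypothetical names and in-memory rows; a real system would apply the same pattern at the database or application tier.

```python
# Rows tagged with a tenant id; the data-access layer always filters by it,
# so one customer's queries can never return another customer's rows.
# Illustrative sketch only -- names and data are hypothetical.
ROWS = [
    {"tenant": "acme", "doc": "roadmap.pdf"},
    {"tenant": "globex", "doc": "forecast.xlsx"},
]

class TenantRepo:
    """A repository bound to a single tenant at construction time."""

    def __init__(self, tenant: str):
        self._tenant = tenant

    def documents(self) -> list[str]:
        # The tenant filter is applied here, not left to each caller.
        return [r["doc"] for r in ROWS if r["tenant"] == self._tenant]

assert TenantRepo("acme").documents() == ["roadmap.pdf"]
```

Because the tenant is fixed when the repository is created, application code above this layer cannot accidentally query across customer boundaries.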
If a provider hosts customer data, they are taking responsibility for ensuring its confidentiality and integrity. No matter how strong the measures to prevent data leaks are, there is always the potential for a leak to happen, whether by accident or by the deliberate action of a malicious insider. Providers must ensure that “lost” data is unusable to unauthorized users. Cryptography is essential and should be implemented so that no single insider can compromise data confidentiality. Strong encryption algorithms and multi-tier key management systems are a must. Protecting information in transit is also important and requires TLS; SSL 3.0 and earlier are no longer considered secure. All ciphers weaker than 128-bit must be disabled on the server side. This may mean rejecting connections from browsers that cannot negotiate 128-bit connections, but ensuring that data cannot be compromised in transit is worth it.
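The multi-tier key management mentioned above is often realized as envelope encryption: each document gets its own data key, and the data key is stored only in wrapped (encrypted) form under a master key held by a key-management system. The sketch below illustrates the two tiers; the cipher is a deliberately simple SHA-256 keystream for readability, not a real algorithm (production systems use vetted primitives such as AES-GCM), and all names are hypothetical.

```python
import hashlib
import secrets

def keystream_xor(key: bytes, data: bytes) -> bytes:
    """Toy stream cipher: XOR data with a SHA-256-derived keystream.
    Illustration only -- real systems use AES-GCM or similar."""
    stream = bytearray()
    counter = 0
    while len(stream) < len(data):
        stream.extend(hashlib.sha256(key + counter.to_bytes(8, "big")).digest())
        counter += 1
    return bytes(a ^ b for a, b in zip(data, stream))

# Tier 1: a master key held in a key-management system, never stored with data.
master_key = secrets.token_bytes(32)

# Tier 2: a fresh data key per document, stored only in wrapped form.
data_key = secrets.token_bytes(32)
wrapped_key = keystream_xor(master_key, data_key)

document = b"Q3 financials - confidential"
ciphertext = keystream_xor(data_key, document)

# To read: unwrap the data key with the master key, then decrypt the document.
recovered_key = keystream_xor(master_key, wrapped_key)
plaintext = keystream_xor(recovered_key, ciphertext)
assert plaintext == document
```

The point of the two tiers is that an insider with access to the stored data sees only ciphertext and wrapped keys; without the separately guarded master key, neither is usable.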
People who handle customer data and processes can also be targeted by attackers, and there is always the possibility of someone trying to steal valuable stored data. Comprehensive and ongoing background checks can help eliminate weak links. Maintaining detailed audit logs helps in identifying and prosecuting data thieves. It may even prevent the loss, since unusual activity can trigger an investigation and halt a theft in progress.
When protecting data, it is important to keep the vulnerability surface of the system to a minimum. By using cryptography, for example, it is possible to reduce the data that needs protection down to the size of the key, a much smaller vulnerability surface to protect. Classifying the data and setting up appropriate authorization rules further strengthens the protection levels. Not all information used for business purposes needs the same level of protection. Some information, like marketing collateral, needs to be accessed by as many people as possible. Other documents such as company financials need to be kept behind strong locks. By properly setting user roles and assigning role-based permissions to data, you can streamline complex authorization schemes and make user administration easier and less error-prone.
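The role-based scheme described above can be sketched in a few lines: roles map to the data classifications they may read, and users are assigned roles rather than individual permissions. The role names, classifications, and users below are hypothetical examples.

```python
# Role-based access control: permissions attach to roles, not to users.
# All names here are illustrative.
ROLE_PERMISSIONS = {
    "employee":  {"public"},
    "analyst":   {"public", "internal"},
    "executive": {"public", "internal", "restricted"},
}

USER_ROLES = {"alice": "executive", "bob": "employee"}

def can_access(user: str, classification: str) -> bool:
    """A user may read a document if their role covers its classification."""
    role = USER_ROLES.get(user)
    return classification in ROLE_PERMISSIONS.get(role, set())

assert can_access("alice", "restricted")      # executives see financials
assert not can_access("bob", "internal")      # employees see only public data
```

Administration stays simple because granting access is a one-line role assignment, and a role change updates a user's permissions everywhere at once.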
The measures described above are the basic or “table-stakes” security necessary to launch a public cloud-based service for businesses or government. Unfortunately, some very popular cloud providers lack even those. To a great extent this is related to their heritage, as they attempt to convert consumer-oriented systems to provide services to business customers. It is very difficult and costly to do that, and the jury is still out on whether they will ever be able to bolt on enough security. The picture is drastically different for systems that were designed from the beginning to be highly secure. Properly implemented security features like cryptography and granular authorization schemes attached to highly segmented data give a head start in implementing extra features to achieve premium security.
Advanced security considerations
Most online applications require only an ID and password for login. The reason is not purely technical: integrating a multi-factor authentication server is fairly easy. The problem lies in managing those additional factors. User populations of modern online business applications are increasingly “consumer-like”, and strong authentication introduces major inconveniences for users. If required at initial login, it is the equivalent of putting the vault door at the entrance to a bank, requiring everyone to pass through that highly secure door even if they just want to learn current interest rates and have no intention of handling money. That is why it makes sense to organize data within an online application so that high-level security protects only sensitive data and actions.
There are really two choices available: either predetermine the parts of the application that require higher security, e.g., those that initiate banking transactions or grant access to intellectual property, or allow the data owner to specify which information needs extra protection. The second option, though more difficult to implement, is preferable, especially for multi-tenant applications where each tenant has its own security policy and data classification rules. This approach brings us closer to delegating security decisions to the people who know the most about the data.
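The vault-door analogy above translates into step-up authentication: a password alone unlocks low-sensitivity resources, while high-sensitivity ones demand a second factor at the moment of access. The sketch below uses hypothetical resource names and an owner-specified sensitivity map, per the second option described above.

```python
from dataclasses import dataclass

@dataclass
class Session:
    user: str
    password_verified: bool = False
    second_factor_verified: bool = False

# Owner-specified sensitivity per resource (hypothetical sample data).
SENSITIVITY = {"rates.pdf": "low", "wire-transfer": "high"}

class StepUpRequired(Exception):
    """Signal the UI to prompt for a second factor before proceeding."""

def authorize(session: Session, resource: str) -> None:
    if not session.password_verified:
        raise PermissionError("login required")
    # Unknown resources default to high sensitivity (fail closed).
    if SENSITIVITY.get(resource, "high") == "high" \
            and not session.second_factor_verified:
        raise StepUpRequired(resource)

s = Session(user="alice", password_verified=True)
authorize(s, "rates.pdf")          # low sensitivity: password alone suffices
try:
    authorize(s, "wire-transfer")  # high sensitivity: step-up prompt
except StepUpRequired:
    s.second_factor_verified = True  # e.g. an OTP verified out of band
authorize(s, "wire-transfer")      # now permitted
```

Only users who actually touch sensitive data ever see the second-factor prompt; everyone else enjoys ordinary password login.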
It is well understood that you cannot put critical information in the clear on the Internet and retain control over it. Once it is downloaded, information can be lost or stolen without your knowledge, let alone your being able to prove it was stolen from you. From the perspective of maintaining control over data, sharing a document online is not much different from mailing out a physical copy. But in properly designed and highly secure online systems, sharing an electronic copy can actually be more secure, because there are technologies that allow tight access control and even remote shredding.
To address this, highly secure cloud systems implement in-use data protection, where content is always encrypted, even when it is sent down to the user’s browser. All temporary and permanent copies on the user’s disk are encrypted, and readers cannot open them without the key. When the owner of the document removes access permission for a particular user, the encryption key is destroyed, which amounts to digitally shredding the remote document. To make this even more secure, some systems have implemented a Diffie-Hellman key exchange in which the key is never sent to or stored at the client side; it is calculated each time before use.
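The remote-shredding idea can be made concrete: the server escrows one key per document and reader, the reader caches only ciphertext, and revoking access deletes the key, rendering every cached copy unreadable. The sketch below again uses a toy SHA-256 keystream cipher purely for illustration (real systems use vetted encryption), and all identifiers are hypothetical.

```python
import hashlib
import secrets

def xor_cipher(key: bytes, data: bytes) -> bytes:
    """Toy symmetric cipher (SHA-256 keystream XOR); illustration only."""
    stream = bytearray()
    n = 0
    while len(stream) < len(data):
        stream.extend(hashlib.sha256(key + n.to_bytes(8, "big")).digest())
        n += 1
    return bytes(a ^ b for a, b in zip(data, stream))

key_store = {}  # server-side escrow: one key per (document, reader) pair

def share(doc_id: str, reader: str, content: bytes) -> bytes:
    """Grant access: the reader receives and caches only ciphertext."""
    key = secrets.token_bytes(32)
    key_store[(doc_id, reader)] = key
    return xor_cipher(key, content)

def read(doc_id: str, reader: str, ciphertext: bytes) -> bytes:
    key = key_store[(doc_id, reader)]  # raises KeyError once revoked
    return xor_cipher(key, ciphertext)

def shred(doc_id: str, reader: str) -> None:
    """Revoke access: with the key gone, cached copies are unreadable."""
    del key_store[(doc_id, reader)]

ct = share("q3.xlsx", "bob", b"secret figures")
assert read("q3.xlsx", "bob", ct) == b"secret figures"
shred("q3.xlsx", "bob")
# read("q3.xlsx", "bob", ct) now raises KeyError: the copy is shredded
```

The document owner never has to reach the reader's machine; destroying the server-held key is enough to invalidate every copy, which is the essence of digital shredding.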
In conclusion, the cloud enables a critical business requirement, namely secure information exchange. It has a unique ability to meet that requirement while keeping businesses secure and compliant. But for that to take place, the infrastructure that runs the applications, and the applications themselves, must be designed with built-in security, from basic encryption and granular authorization schemes to multi-factor authentication and in-use content protection. The IT organization can manage those systems and regain control over security by understanding cloud security principles and requiring strict, ongoing adherence to those principles from their cloud vendors.