Saturday, June 25, 2011

A Blueprint for Cloud Security

Time and time again, the most common question about cloud computing is security. The question comes in many forms: how do I ensure my data is secure? How do I guarantee unauthorized people do not access my data? What if my processes are not set up to accommodate servers we do not own? These questions, and many more, all come from the same fear: a reluctance to make a major change in architecture and strategy because of a lack of understanding, a lack of tools or a lack of knowledge.


Security is a broad topic. It can encompass many components, including the network layer, server location, data center access, data storage devices, application architecture, logging, authentication, monitoring, business process and compliance with regulations, just to name a few. The focus of this article is best practices for ensuring that a plan for security in a cloud environment is complete and well thought out.


First, what is cloud computing in the context of security? In this context, cloud computing is the use of computing resources provided from shared servers, data centers and environments. Cloud computing blurs the lines that traditionally separated the physical components of one application or company from another. This shared aspect is important when planning security because, historically, many security policies assumed the server and data were physically located in a data center the company controlled. This is no longer the case: companies may have thousands of IT resources they will never physically touch, with no more control than a remote login and the use of the resource.


As with all discussions around security, we must make some assumptions. These set the base of our understanding and focus our later best practices within the context of security for cloud computing:

  • Assume elasticity – Cloud environments can scale up and down quickly. From a security perspective, this means things are constantly changing, and the security policies, models, processes and tools must automatically support this dynamic environment.

  • Shared physical locality – Through the use of cloud computing, you will inevitably have your application on a server that also hosts applications from other companies; these can be partners, competitors, aggressors, hackers or customers. Security policies within a cloud environment should accommodate this proximity to possible threats.

  • Data will take one of three forms – at rest, in transit or in process – Security policies for cloud computing should accommodate data state and ensure that all states are adequately protected, and that data is passed securely between states.

  • Physical security cannot be guaranteed – Many cloud providers have instituted physical security well beyond what was possible in a corporate-managed data center. This does not mean that all is without risk. Any time a resource within a shared facility is being used, there is a possibility the equipment could be physically accessed. Data, processes and applications should be architected so that this cannot compromise data or impact availability.

  • Assume the server could disappear – Expanding on the above assumption, cloud security solutions should assume the server housing the data could disappear without warning. There have been documented cases of servers that house multiple customers being seized, as well as servers failing and never being returned to service. Security plans should ensure that, should a server disappear, the risk of data loss is as low as possible.

  • There is no edge – There is no longer a distinct line that can be drawn around where people will access a cloud-based environment or where it will be managed from. The clear boundary that once could be defended with a firewall must now be guarded by policies, monitoring, intrusion detection and application penetration testing.


Now that we have reviewed the assumptions behind all cloud environments, we can list some best practices (in no particular order of priority) for managing security relating to data and application access within a cloud environment:

  • Centralized Authentication and Authorization – Any cloud-based environment should use a single, centralized method for authentication and authorization. This ensures that any rogue accounts can be quickly identified and that accounts can be rapidly disabled for those who no longer need them. This single mechanism should cover both the staff that manage the application and data and the users that access and consume the application and associated data (a token-validation sketch follows this list).

  • Centralized Key Management – Encryption for data storage and validation should be employed across all cloud environments, implemented via a centralized key management solution so that data access can be revoked if necessary. A central key management solution enables staff to grant data access to those who are authorized and to remove that access when warranted (an envelope-encryption sketch follows this list).

  • Encrypt all at-rest and in-transit data – All data not actively being processed should be encrypted. This includes system log files, databases, unstructured data and data the application generates while running. While this has a high overhead in CPU cycles and time, the risk of missing data that should be encrypted is often too great to ignore (a TLS example follows this list).

  • Securely handle in-process data – Any data actively being used by the application should be handled in a way that minimizes the risk of exposure between processes, applications, users and virtual machines, and should only ever be written to storage that is persistent and encrypted. Logging should be done in a way that minimizes the exposure of user data to those troubleshooting the environment (a log-redaction sketch follows this list). All in-process data should be held for as short a time as possible to minimize the risk of exposure.

  • Use of host-based firewalls – As with traditional security best practices going back many years, host-based firewalls should be utilized on all systems, regardless of whether they are accessed internally or externally.

  • Regular penetration testing by outsiders – Any company providing a publicly available site hosted via a cloud computing solution should employ the services of an external firm to periodically execute vulnerability assessments and complete penetration testing of the environment. This outside perspective is important to review and test the design and implementation of the application and data security controls.

  • Staff Training – Training is critical for all team members whose roles are expanding to include cloud computing. All staff need to be educated on the new process requirements, the new rules for deployment and the methods by which cloud computing is being employed. This training ensures staff are comfortable with the new technology and are working from a common base of knowledge and experience.

  • Accountability – Staff should be held accountable for what they deploy to the cloud and how. Risk assessments should be done prior to large, complex changes to ensure staff have adequately assessed the risks, planned mitigation strategies and implemented safeguards. Staff should be empowered to suggest changes and make improvements.

  • Change Management – Automated solutions for change and configuration management should be utilized to ensure that all software and servers deployed meet the same baseline standards for configuration. Change and configuration management systems simplify deployments and minimize the chance that an oversight leads to a vulnerability (a baseline drift check is sketched after this list).
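
To make the centralized authentication and authorization practice concrete, here is a minimal sketch in Python using the PyJWT library: every service validates tokens issued by one central identity provider instead of keeping its own user store. The issuer URL, audience and key file name are illustrative assumptions, not part of any specific product.

```python
# Every service validates tokens issued by one central identity provider,
# rather than keeping its own user database. Issuer, audience and key file
# names below are hypothetical placeholders.
import jwt  # PyJWT

IDP_PUBLIC_KEY = open("idp_public_key.pem").read()  # public key published by the central IdP

def authenticate(token: str) -> dict:
    """Return the verified claims for a request, or raise if the token is invalid."""
    return jwt.decode(
        token,
        IDP_PUBLIC_KEY,
        algorithms=["RS256"],
        audience="example-app",            # hypothetical audience for this application
        issuer="https://idp.example.com",  # hypothetical central identity provider
    )
# claims["sub"] identifies the user; a claim such as claims["roles"] can drive authorization
```

Because every service trusts the same issuer, disabling an account at the identity provider revokes access everywhere at once.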
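
Centralized key management is often implemented as envelope encryption: each piece of data is encrypted with its own data key, and the data key is wrapped by a master key that never leaves the key service, so access can be revoked simply by refusing to unwrap. The sketch below, using the Python cryptography package's Fernet primitive, is a simplified illustration; the class and method names are assumptions rather than any real product's API.

```python
# Illustrative envelope-encryption sketch using the "cryptography" package.
# Applications only ever see per-object data keys; the master key stays with
# the central key service, which can refuse to unwrap keys for revoked callers.
from cryptography.fernet import Fernet

class CentralKeyService:
    """Stand-in for a real key-management service (hypothetical interface)."""
    def __init__(self):
        self._master = Fernet(Fernet.generate_key())  # master key never leaves the service
        self.revoked = set()

    def new_data_key(self):
        data_key = Fernet.generate_key()
        return data_key, self._master.encrypt(data_key)  # plaintext key plus wrapped copy

    def unwrap(self, wrapped_key, requester):
        if requester in self.revoked:
            raise PermissionError("access revoked")
        return self._master.decrypt(wrapped_key)

kms = CentralKeyService()
data_key, wrapped = kms.new_data_key()
ciphertext = Fernet(data_key).encrypt(b"customer record")  # store ciphertext plus the wrapped key
plaintext = Fernet(kms.unwrap(wrapped, requester="app-1")).decrypt(ciphertext)
```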
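
For data in transit, the main point is to refuse any connection that is not verified TLS; the at-rest half can reuse the envelope-encryption sketch above. The snippet below shows the in-transit half with Python's standard ssl module (the URL is a placeholder).

```python
# Refuse any connection that is not certificate-verified TLS 1.2 or better.
import ssl
import urllib.request

context = ssl.create_default_context()             # verifies certificates and hostnames by default
context.minimum_version = ssl.TLSVersion.TLSv1_2   # reject older protocol versions

with urllib.request.urlopen("https://api.example.com/data", context=context) as response:
    body = response.read()
```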
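
One way to keep in-process data out of troubleshooting logs is a logging filter that masks sensitive fields before records are written. The field names matched below are assumptions; a real deployment would maintain its own list.

```python
# Sketch: mask fields commonly considered sensitive before log records are emitted.
import logging
import re

SENSITIVE = re.compile(r"(password|ssn|card)=\S+", re.IGNORECASE)

class RedactingFilter(logging.Filter):
    def filter(self, record):
        record.msg = SENSITIVE.sub(r"\1=[REDACTED]", str(record.msg))
        return True  # keep the record, just with the sensitive values masked

logging.basicConfig(level=logging.INFO)
logger = logging.getLogger("app")
logger.addFilter(RedactingFilter())
logger.info("login attempt user=alice password=hunter2")  # logged as password=[REDACTED]
```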
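
For change and configuration management, even a simple drift check against an agreed baseline catches many oversights. The sketch below assumes a hypothetical get_server_config() callable that returns each server's current settings; in practice that data would come from your configuration management or inventory tool.

```python
# Sketch: compare each server's reported configuration against one agreed
# baseline and flag any drift. The baseline keys are illustrative assumptions.
BASELINE = {
    "ssh_password_auth": "no",
    "firewall_enabled": "yes",
    "tls_min_version": "1.2",
}

def check_drift(server_name, get_server_config):
    actual = get_server_config(server_name)
    drift = {k: (v, actual.get(k)) for k, v in BASELINE.items() if actual.get(k) != v}
    return drift  # an empty dict means the server matches the baseline

# Example: a server still allowing password SSH logins is reported as drifted.
print(check_drift("web-01", lambda name: {"ssh_password_auth": "yes",
                                          "firewall_enabled": "yes",
                                          "tls_min_version": "1.2"}))
```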


It is a new world out there: IT managers have more options than ever before when considering how to deploy and utilize new services. Cloud computing is an entirely new way of thinking for many people and creates many new opportunities for scale, efficiency and improved operational models. Despite all that, there are three truths we must account for when deploying solutions in cloud environments:

  1. There is no edge any more; people consume and create resources from a multitude of places and from a variety of devices.

  2. There is no stable state. Applications are elastic and change regularly; we simply cannot have a security checklist for new servers any more, we must use process and automation to ensure compliance.

  3. Data is king. The amount being produced today is monumental, and it has huge corporate value. Data must be protected, regardless of state, in a variety of circumstances that are no longer under the control of the company's data center manager.
