Securing Multi-Cloud Environments
The modern cloud context doesn’t negate the need for sound security engineering based on threat modeling with consideration for proprietary data and use cases.
May 21, 2019
Nick Deshpande, CISSP, CCSP, is responsible for Principal Product Strategy at Oracle Cloud Infrastructure.
Meeting security and compliance needs across different Cloud Service Providers (CSPs) and an organization’s own data centers remains a thorny challenge. Today, many enterprises are migrating business-critical workloads to the cloud, or have done so recently. Others are taking a ‘lift-and-shift’ approach for select applications, leveraging new integrations for legacy systems. This journey may lead them to different providers, for a variety of good reasons.
An IT team might select a private cloud for certain business functions, accounting for the location of a provider’s data centers and its compliance footing. Meanwhile, a public cloud could be the ideal environment to host internet-facing web applications based on factors like cost, ease of deployment, coverage, and performance. This new setup might leverage low-latency edge computing to serve functions to a disparate user base.
And the same organization might unlock greater value by moving internal applications once run on-premises to a vendor-managed instance in a private cloud environment. Think ERP, where tighter integrations let teams draw deeper insights from a wealth of data, much of it sensitive. It’s imperative to have an open ecosystem, but one that remains accessible to authorized users only.
Creating and Maintaining Security Policy Is Critical
A hybrid/multi-cloud architecture makes for a fuzzy perimeter, and that fuzzy perimeter is a top concern for today’s IT security professionals. In the past, security concerns hindered cloud migrations, especially when the right tools weren’t available to maintain control and visibility over an organization’s applications and the sensitive data they process. Today, security can be an enabler that allows aspiring digital leaders to leverage everything the modern cloud offers.
Depending on the shared responsibility model and the services consumed, an organization can remain in charge of its workloads, and certainly of the data therein. The modern cloud context doesn’t negate the need for sound security engineering based on threat modeling with consideration for proprietary data and use cases. The outcome is an understanding of which controls to deploy to account for various sources of risk (regulatory requirements, human error, threat actors, and so on).
When those instances are spread across different providers and environments, security teams might feel obligated to deploy a different set of controls in each, some supplied by the CSPs themselves and some by third parties. But that is no longer necessary.
Today’s IT security practitioners are embracing Cloud Security Operations (CloudSecOps) tools designed for “cross-cloud” migrators, the organizations that will eventually become cloud-native digital leaders. Such tools orchestrate security management functions that provide consistent policies across diverse environments. The advantages are numerous (and mutually supportive): predictability, centralized logging and unification, scalability, and integration.
Predictability — multi-environment controls are a huge advantage during audits, or when internal governance and compliance teams come knocking to confirm that written policies have been applied effectively. The evidence such teams seek is readily available and in a common format, allowing them to verify that systems are subject to controls commensurate with their sensitivity and significance. If a gap is identified, closing it once closes it in every environment, rather than through piecemeal fixes. And rather than spend valuable time explaining to an audit team a myriad of controls that achieve similar functionality, demonstrating the efficacy of a single, common control benefits everyone.
Centralized logging and unification — logs never lie. With common formatting, security operations centers (SOCs) can correlate and respond to events that may indicate a malicious actor targeting different endpoints with similar techniques and procedures, and teams cut down on the time needed to stitch together disparate data in a data lake or separate repository. Unification, the ability to “collect security and operational data in a single data set to correlate and analyze cyber threats,” is a key criterion when selecting a CSP. (A rough sketch of this kind of log normalization follows this list.)
Scalability — as the organization grows, applications are required to process more data, and more instances are deployed, the security layer must keep pace. DevOps teams want a security control that scales with them, sends performance telemetry to a single pane of glass, and lets alerts be created and managed with minimal process overhead.
Integration and orchestration — no security control is an island, and the controls that protect workloads across environments should not be islands either. For teams with mature DevOps capabilities, tight integrations between controls make it possible to build out automated orchestration and management functions. Rather than design automation for a myriad of controls, a common security platform that comes integration-ready will save time, money, and effort. Need to deploy? Do so once, not multiple times for the same function. Changing policies? A single platform extends countermeasures across all properties in near real time, regardless of where they are hosted. (A sketch of that single-pass rollout appears after the log-normalization example below.)
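To make the centralized-logging point above a bit more concrete, here is a minimal Python sketch of log unification. The provider names, field names, and record layouts are hypothetical, not any CSP’s actual log format; the only point is that mapping everything into one schema makes cross-environment correlation straightforward.

```python
# Minimal sketch: normalize audit events from two hypothetical providers into a
# common schema, then group them by caller IP so cross-cloud activity stands out.
from collections import defaultdict
from datetime import datetime, timezone


def normalize_provider_a(event: dict) -> dict:
    """Map a hypothetical provider-A audit record to the common schema."""
    return {
        "timestamp": datetime.fromisoformat(event["eventTime"]).astimezone(timezone.utc),
        "source_ip": event["sourceIPAddress"],
        "action": event["eventName"],
        "resource": event["resources"][0]["name"],
        "provider": "provider_a",
    }


def normalize_provider_b(event: dict) -> dict:
    """Map a hypothetical provider-B activity record to the common schema."""
    return {
        "timestamp": datetime.fromtimestamp(event["ts_epoch"], tz=timezone.utc),
        "source_ip": event["caller_ip"],
        "action": event["operation"],
        "resource": event["target"],
        "provider": "provider_b",
    }


def correlate_by_source(events: list[dict]) -> dict[str, list[dict]]:
    """Group normalized events by caller IP so repeated techniques are visible."""
    grouped = defaultdict(list)
    for event in events:
        grouped[event["source_ip"]].append(event)
    return grouped


if __name__ == "__main__":
    events = [
        normalize_provider_a({"eventTime": "2019-05-21T10:00:00+00:00",
                              "sourceIPAddress": "203.0.113.7",
                              "eventName": "ListBuckets",
                              "resources": [{"name": "finance-archive"}]}),
        normalize_provider_b({"ts_epoch": 1558432800,
                              "caller_ip": "203.0.113.7",
                              "operation": "ListObjects",
                              "target": "hr-exports"}),
    ]
    # The same caller IP appears in both clouds: one correlated view, one data set.
    for ip, hits in correlate_by_source(events).items():
        print(ip, [(e["provider"], e["action"]) for e in hits])
```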
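The “deploy once” idea can be sketched the same way. The adapter class and apply_rule() method below are illustrative stand-ins, assuming each environment exposes some API for applying a control; a real platform would call the providers’ SDKs rather than print a message.

```python
# Minimal sketch: one declarative policy pushed to every environment through
# per-provider adapters, so a policy change is made once and rolled out everywhere.
from dataclasses import dataclass


@dataclass(frozen=True)
class FirewallRule:
    name: str
    port: int
    allowed_cidr: str


class ProviderAdapter:
    """Translates the common rule into one environment's native control (hypothetical)."""

    def __init__(self, provider: str):
        self.provider = provider

    def apply_rule(self, rule: FirewallRule) -> None:
        # In practice this would call the provider's SDK or API; the print stands
        # in for that call so the sketch stays self-contained.
        print(f"[{self.provider}] applying {rule.name}: "
              f"allow {rule.allowed_cidr} on port {rule.port}")


def apply_everywhere(rule: FirewallRule, adapters: list[ProviderAdapter]) -> None:
    """Push one policy change to all environments in a single pass."""
    for adapter in adapters:
        adapter.apply_rule(rule)


if __name__ == "__main__":
    rule = FirewallRule(name="restrict-admin", port=22, allowed_cidr="10.0.0.0/8")
    apply_everywhere(rule, [ProviderAdapter("private-cloud"),
                            ProviderAdapter("public-cloud"),
                            ProviderAdapter("on-premises")])
```

The design choice that matters here is the single FirewallRule definition: change it once, and every adapter picks up the new policy in the same rollout.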
In security, diversity matters, especially when it comes to defense in depth. Yet when choosing the security control that will counter threats and achieve compliance across cloud environments, commonality is preferred. I’ve highlighted four benefits of taking this approach, and there are many others. Most of all, as organizations move to a cloud-first model, a single platform will be highly familiar, enabling ease of use and assurance for all stakeholders. To take this approach, security teams must get in on the ground floor to help define the journey and make best practices easy practices. Common tooling is one such “best and easy” practice for organizations to consider.