Why a Multi-Cloud Strategy Is Critical for Enterprise Data Protection
With the rapid rate of data creation and modification, many enterprises need to up their frequency of data protection to mere minutes.
November 21, 2018
Dave Russell is VP of Enterprise Strategy for Veeam.
Enterprises have embraced the cloud, whether as part of the firm’s infrastructure, for hosted applications or as the platform of choice, and public cloud in particular continues to grow in popularity. A recent Voice of the Enterprise survey from the 451 Group found that by 2020, the majority of enterprise workloads will be spread across public clouds, driven by increased use of IaaS, SaaS and PaaS.
What is more interesting about enterprise cloud use, however, is how organizations are diversifying their implementations. A RightScale survey found that companies are evaluating, and ramping up, their use of several public clouds rather than staying with a single provider, and ESG’s 2017 report corroborates these findings, with 81 percent of organizations reporting a multi-cloud strategy and implementation plan. So, why the focus on multi-cloud? By diversifying, enterprises can minimize risk while remaining agile, speed time to market, increase innovation and optimize costs.
Why Diversification Is Needed
While public cloud providers such as Amazon, Google and Microsoft offer very robust and reliable services, there are times when having multiple clouds and providers is essential. It’s not often that these cloud providers have outages, but when they do, it makes national news. Enterprises feel bereft without access to applications, servers and most importantly, their data.
These outages – from the multi-region Amazon outage earlier this year to the more recent Microsoft Azure downtime – highlight a key fact that enterprise IT teams and executives may overlook: Enterprises are responsible for their own data protection. Public cloud providers are not responsible for your data. Service level agreements (SLAs) may promise 99.99 percent, but they cover only network availability and the durability of the infrastructure, not customer data or data availability. Data falls under what is often termed the shared responsibility model.
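To put that availability figure in perspective, here is a quick back-of-the-envelope calculation (a hypothetical illustration, not any specific provider’s contractual terms) showing how much downtime a given SLA percentage still permits – and the SLA says nothing about whether the data itself is backed up or recoverable.

```python
# Hypothetical illustration: how much downtime an availability SLA still allows.
# The percentages are examples, not any particular provider's contractual terms.

MINUTES_PER_YEAR = 365 * 24 * 60  # 525,600 minutes

for availability in (99.9, 99.99, 99.999):
    allowed_downtime = MINUTES_PER_YEAR * (1 - availability / 100)
    print(f"{availability}% availability -> ~{allowed_downtime:.0f} minutes of downtime per year")

# Output (approximate):
#   99.9%   -> ~526 minutes (~8.8 hours) per year
#   99.99%  -> ~53 minutes per year
#   99.999% -> ~5 minutes per year
# Even at 99.99 percent, the SLA only addresses whether the service is reachable;
# it does not guarantee that customer data is protected or recoverable.
```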
Each cloud provider has a different shared responsibility model, so it is important that IT executives read the details and specifics carefully. Amazon’s agreement makes the user’s responsibility very clear, stating that Amazon is not responsible for the security of any content, nor accountable if any data is lost or altered. Microsoft’s shared responsibility agreement is a bit different: it states that on-premises, IaaS and PaaS use does not include data responsibility on Microsoft’s part. However, Microsoft SaaS does cover data access – but only for 30 to 60 days. So, companies looking to keep data available and protected after that timeframe must find another method.
How to Prepare and Manage Multi-Clouds
The goal of every enterprise is to get data management and protection procedures under control by introducing intelligent methods that keep the business hyper-available around the clock. The first step is to ensure that the data – the critical ingredient of an efficient company and the lifeblood of an organization – is available. It only stands to reason that when data is unavailable, lost or unprotected, a company stands to lose millions of dollars from downtime, whether through lost business opportunities, stalled innovation or poor customer experiences.
Companies can no longer rely on the “old” method of backing up or protecting data once a day to tape or the cloud. With the rapid rate of data creation and modification, many enterprises need to increase the frequency of data protection to mere minutes – making sure that data is always available and that data loss is avoided. One side effect of today’s multi-location, multi-data center, multi-cloud strategy is hyper-sprawl: data is federated everywhere, across multiple clouds, databases and devices, which can be a concern for IT as well.
The problem is that without intelligent data management for these multi-clouds and growing data stores, enterprise IT teams will be overwhelmed and lose track of where and how their data is protected. The ideal method for multi-cloud data protection includes five main steps for success, illustrated with a simple sketch after the list:
Backup: It sounds simple, but far too many organizations struggle to provide reliable, consistent availability and to ensure they are recoverable in the event of outages, attack, loss or theft. APIs are critical here, allowing deep integration with the application, hypervisor and/or infrastructure to manage data ingestion and trigger proper protection.
Aggregation: The key to this step is a singular, extensible platform for delivering availability to make sure that all vital data is protected in an increasingly multi-data center, multi-cloud environment.
Visibility: Teams should be able to manage from a single pane of glass that provides visibility into where data is protected and stored. This allows enterprises to move from a reactive to a proactive availability approach, with optimized capacity planning, resource monitoring and infrastructure allocation. This stage also makes it possible to dynamically create isolated instances of protected servers for disaster recovery, DevOps, patch and security testing, or compliance.
Orchestration: The goal of this stage is to seamlessly move data to the best location across multi-cloud environments to ensure business continuity, compliance, security and optimal use of resources for business operations.
Automation: While ambitious and aspirational in nature, this final stage enables data to become self-managing – learning to back up and migrate to the ideal location(s) based on business needs in real time, to secure itself during anomalous activity and to be instantly recoverable.
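To make these steps a bit more concrete, the sketch below models a hypothetical multi-cloud protection policy in plain Python. The cloud regions, intervals, field names and helper function are assumptions chosen for illustration – not any real product’s API – but they show how backup frequency, aggregation across clouds, visibility metadata and orchestrated recovery can fit together.

```python
from dataclasses import dataclass, field
from datetime import timedelta

@dataclass
class ProtectionPolicy:
    """Hypothetical multi-cloud data protection policy (illustration only)."""
    workload: str
    backup_interval: timedelta                               # Backup: how often data is captured
    copy_targets: list[str] = field(default_factory=list)    # Aggregation: where copies are kept
    tags: dict[str, str] = field(default_factory=dict)       # Visibility: metadata for reporting

def choose_restore_target(policy: ProtectionPolicy, unavailable: set[str]) -> str:
    """Orchestration: pick the first protected copy that is not in an impacted cloud."""
    for target in policy.copy_targets:
        if target not in unavailable:
            return target
    raise RuntimeError(f"No available copy of {policy.workload}")

# Example policy: protect a CRM database every 15 minutes, keeping copies in two clouds.
crm_policy = ProtectionPolicy(
    workload="crm-database",
    backup_interval=timedelta(minutes=15),
    copy_targets=["aws-us-east-1", "azure-westeurope"],
    tags={"owner": "sales-it", "compliance": "gdpr"},
)

# Automation (simplified): if one cloud region suffers an outage,
# recovery is steered to the surviving copy without manual intervention.
impacted_regions = {"aws-us-east-1"}
print(choose_restore_target(crm_policy, impacted_regions))  # -> azure-westeurope
```

In practice, a commercial data management platform would express these decisions through policies and automation rather than hand-written code, but the shape of the problem is the same.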
Start Diversification Now
We live in a hybrid world, and enterprises require an extensible platform that can handle the complexity of multi-cloud and on-premises environments. By diversifying data protection across various public clouds, enterprises can ensure greater protection of their data. The critical point, however, is for data center personnel and leadership to appreciate that data remains primarily the responsibility of the owning organization, not of the cloud service provider. Outages may be infrequent, but enterprises must continually remind themselves that total responsibility for data availability lies in their hands. By implementing a proper multi-cloud strategy with a data management layer on top of it, enterprises can enable the efficient movement and handling of data and resources.
Opinions expressed in the article above do not necessarily reflect the opinions of Data Center Knowledge and Informa.