Security Solutions that Combine Productivity with Hyper-efficiency
Modern hybrid and hyperscale data center architectures must include security that can keep pace with the demands of productivity and scalability.
November 11, 2020
Sponsored Content
Increasing productivity, even with a remote workforce, is a critical goal for many organizations. But with only so many working hours in a day, productivity gains depend largely on the efficiency of the tools being used. In today’s environments, securing the proliferation of new applications is only part of the challenge for firewalls. The sheer volume of data being processed also needs to be addressed, and this is where traditional network firewalls fall flat.
The following are several critical considerations for enterprise IT teams as they design security for modern data center infrastructure.
Visibility and control. Managing security risks to high-performance networks means proactively reducing the attack surface. That requires complete visibility and control of the entire environment: the endpoints, the network segments, the traffic flowing through those segments, the applications, and the data being accessed. Any device connecting to a data center network is a potential threat vector. Securing a modern data center also goes well beyond the traditional on-premises facility: it demands visibility of every deployed security element across all environments (on-premises, colocation, cloud, etc.), as well as of users, applications, and devices. It further includes intrusion prevention systems (IPS) that guard against advanced threats by monitoring the network in real time.
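To make that loop concrete, here is a minimal Python sketch of visibility plus control: an inventory that records each connecting device and the applications it uses, combined with a simple IPS-style signature match on live payloads. The device fields, signature list, and function names are illustrative assumptions, not any particular product's API.

```python
from dataclasses import dataclass, field

@dataclass
class Device:
    """Inventory record for anything that connects to the network."""
    mac: str
    segment: str
    seen_apps: set = field(default_factory=set)

# Hypothetical byte patterns an IPS might match in real time.
KNOWN_BAD = [b"\x90\x90\x90\x90", b"cmd.exe /c"]

inventory: dict[str, Device] = {}

def observe_packet(mac: str, segment: str, app: str, payload: bytes) -> bool:
    """Record the device and application (visibility) and return True
    if the payload trips a signature (control)."""
    device = inventory.setdefault(mac, Device(mac, segment))
    device.seen_apps.add(app)
    return any(sig in payload for sig in KNOWN_BAD)

if observe_packet("aa:bb:cc:dd:ee:ff", "dmz", "shell", b"cmd.exe /c whoami"):
    print("alert: signature match in segment dmz")
```

In practice the inventory and signature set live in dedicated appliances, but the shape of the problem is the same: every packet must be attributed to a device, a segment, and an application before a policy decision can be made.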
Zero-trust principles. Zero-trust principles center on least-privilege access and adaptive trust. As a model, zero trust treats every transaction, movement, or iteration of data as suspicious. When properly implemented, a zero-trust system tracks user and network behavior (user-to-user, user-to-machine, and machine-to-machine) and data flows in real time, alerting teams or revoking an account’s access the moment anomalous behavior is detected.
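A toy sketch of that detect-and-revoke loop might look like the following. The per-account baseline, the three-sigma threshold, and the revoke_access() hook are all hypothetical stand-ins for the telemetry and control plane a real zero-trust deployment would use.

```python
from collections import defaultdict
from statistics import mean, pstdev

# Rolling history of bytes moved per account, used as a behavior baseline.
history: defaultdict[str, list[float]] = defaultdict(list)

def revoke_access(account: str) -> None:
    # Stand-in for a real control-plane call (disable token, kill sessions).
    print(f"access revoked for {account}")

def check_transaction(account: str, bytes_moved: float) -> None:
    past = history[account]
    if len(past) >= 10:  # need a baseline before judging behavior
        mu, sigma = mean(past), pstdev(past) or 1.0
        if abs(bytes_moved - mu) > 3 * sigma:  # treat >3 sigma as anomalous
            revoke_access(account)
            return
    past.append(bytes_moved)
```

Real systems score many more signals (time of day, peer identity, device posture), but the principle is the same: trust is continuously recalculated, never granted once.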
Segmentation. Segmenting network traffic establishes control points and reduces attackers’ ability to move laterally and exploit weaknesses elsewhere in the data center. This means classifying all traffic into distinct segments, especially at the application and port levels. Network segmentation also simplifies how organizations enforce security policy by following the principle of defense in depth.
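A minimal sketch of that idea in Python: place each host in a segment, then permit only explicitly listed segment-to-segment flows at the port level, so lateral movement is denied by default. The addresses, segment names, and rules are illustrative assumptions.

```python
# Which segment each host belongs to (illustrative addresses).
HOST_SEGMENT = {"10.0.1.5": "web", "10.0.2.7": "app", "10.0.3.9": "db"}

# Allowed (source segment, destination segment, destination port) flows.
ALLOW = {("web", "app", 8080), ("app", "db", 5432)}

def permit(src_ip: str, dst_ip: str, dst_port: int) -> bool:
    rule = (HOST_SEGMENT.get(src_ip), HOST_SEGMENT.get(dst_ip), dst_port)
    return rule in ALLOW  # default deny: anything unlisted is dropped

print(permit("10.0.1.5", "10.0.2.7", 8080))  # True: web tier may reach app tier
print(permit("10.0.1.5", "10.0.3.9", 5432))  # False: web cannot reach the database
```

The default-deny posture is the defense-in-depth payoff: an attacker who compromises a web server still cannot open a connection straight to the database tier.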
Time to service. Many current data center solutions deliver low performance and high latency, meaning organizations can’t roll out services with the speed, agility, and reliability the hyperscale era demands. Services need to be segmented and to interoperate across a massive number of physical and virtual assets. Modern data center firewalls must therefore offer hardware acceleration for Virtual Extensible LAN (VXLAN) termination and re-origination, along with dynamic support for Layer 4 and Layer 7 security. Even a tiny amount of downtime or a minuscule service delivery delay can cost companies millions in lost revenue, trust, and brand reputation.
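For a sense of what that VXLAN work involves, here is a software sketch of termination and re-origination following the RFC 7348 header layout. Doing this per packet in a CPU, with Layer 4-7 inspection in between, is precisely the cost that dedicated hardware acceleration removes; the inspection step is omitted here, and everything beyond the header format is an illustrative assumption.

```python
import struct

VXLAN_FLAG_VNI = 0x08  # "I" flag: a valid VNI is present (RFC 7348)

def terminate(vxlan_packet: bytes) -> tuple[int, bytes]:
    """Strip the 8-byte VXLAN header, returning (VNI, inner Ethernet frame)."""
    flags, vni_raw = struct.unpack("!B3x3sx", vxlan_packet[:8])
    if not flags & VXLAN_FLAG_VNI:
        raise ValueError("packet carries no valid VNI")
    return int.from_bytes(vni_raw, "big"), vxlan_packet[8:]

def reoriginate(vni: int, inner_frame: bytes) -> bytes:
    """Re-encapsulate an inspected frame, possibly toward a different VNI."""
    header = struct.pack("!B3x3sx", VXLAN_FLAG_VNI, vni.to_bytes(3, "big"))
    return header + inner_frame

# Round trip: encapsulate a dummy frame on VNI 42, then terminate it.
vni, frame = terminate(reoriginate(42, b"\x00" * 64))
assert vni == 42 and len(frame) == 64
```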
Capacity. Many security infrastructures struggle when immense datasets, known as “elephant flows,” are transferred over single connections. But elephant flows are a routine need in the hyperscale era, especially for organizations in industries such as pharmaceuticals, e-commerce, aeronautics, or financial brokerage that must securely encrypt and transfer large datasets using high-throughput flows between data centers, or between data centers and multiple clouds. Network firewalls deployed in hyperscale data centers must be able to perform at these levels every day.
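As a simplified illustration of securing such a flow, the sketch below encrypts a large dataset in fixed-size chunks with AES-GCM so it can be streamed rather than buffered whole. It uses the real Python `cryptography` package; the chunk size, nonce scheme, and function names are illustrative choices, and a production system would offload this work to dedicated crypto hardware to reach line rate.

```python
import io
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

CHUNK = 4 * 1024 * 1024  # 4 MiB chunks; tune for the link and hardware

def encrypt_stream(key: bytes, reader, writer) -> None:
    """Encrypt a large flow chunk by chunk, each with a unique nonce."""
    aead = AESGCM(key)
    prefix = os.urandom(8)  # random per-flow nonce prefix
    seq = 0
    while chunk := reader.read(CHUNK):
        nonce = prefix + seq.to_bytes(4, "big")  # unique 12-byte nonce per chunk
        writer.write(aead.encrypt(nonce, chunk, None))
        seq += 1

# Example: encrypt two chunks' worth of data held in memory.
key = AESGCM.generate_key(bit_length=256)
encrypt_stream(key, io.BytesIO(b"x" * (2 * CHUNK)), io.BytesIO())
```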
Imagine a cutting-edge pharmaceutical research company looking to develop new medicines while delivering value to shareholders. Testing, modeling, and 3D rendering are key to that process, and they require processing and transferring very large datasets – elephant flows often comprising tens of terabytes or more – to AI/ML simulators as quickly as possible. That speed lets new medicines be developed faster, at lower cost and with reduced risk to human life.
But that data also needs to be secured. The surreptitious injection of bad data can ruin months or years of research, and competitors or even nation-states may look to skip the time and expense of research by stealing this intellectual property. Yet few security devices on the market include the purpose-built hardware needed to keep up at these data rates.
This is only the beginning. The next generation of smart cars, smart cities, and smart infrastructure – including transportation, power grids, manufacturing, and more, all augmented by AI and machine learning – will require managing and processing massive amounts of data. Supporting these new architectures will demand even faster and more efficient infrastructure. And for most security vendors, this is a looming challenge that isn’t even on their drawing boards – which puts the future of the digital revolution at risk.