
Beyond Hardware: Software Defined Storage

With news of acquisitions and new products, the notion of the software-defined data center is becoming more and more clear to the broader IT audience, writes Woody Hutsell of Fusion-io.

Industry Perspectives

December 3, 2012


Woody Hutsell is Senior Director of Product Management, Fusion-io


As companies like VMware and Intel make acquisitions like Nicira and Virtutech and introduce new products, the notion of the software-defined data center is becoming more and more clear to the broader IT audience.

According to Steve Herrod, CTO of VMware, the software-defined data center is an environment in which “all infrastructure is virtualized and delivered as a service, and the control of this datacenter is entirely automated by software.”

Discussed recently at IP Expo in London, the software-defined data center is a vision of using off-the-shelf commodity hardware and achieving flexibility and scalability through software. This creates a more nimble infrastructure that can keep up with the constantly changing demands put on it by new applications such as open-source databases or big data analytics.

Unlocking Possibilities with Software

The comparative ease of use that comes along with the software-defined data center is compelling. Such environments simplify complicated infrastructures, providing IT professionals with more control and easier management of their environment through the software layer.

Previously, specialists were required to handle proprietary, individual elements of data center upkeep, from server deployments to networking to mechanical storage arrays. This placed strains on IT budgets that were exacerbated by the need to scale out architectures as data demand increased and as each cycle of Moore's Law raised the performance expected of essential applications.

With the software-defined data center, IT professionals have significantly more freedom to work with the commodity hardware they are comfortable with, while software takes on much of the maintenance that once required on-site, specialized skills. This frees these experts to be more creative and proactive, allowing them to offer new solutions to business problems instead of living in constant maintenance mode, chasing problems that software can now address with little or no human intervention.

As infrastructures continue to be virtualized, with physical machines hosting multiple operating systems to do the work of many servers and with switching and routing taking place at the software layer, the need to supply those infrastructures with data becomes ever more critical.

Overcoming Old Architectures

The mechanical disk array is a bulky, expensive, single-function appliance. Today, it is increasingly seen as unsuitable for meeting the demands of the modernized, software-defined infrastructure. As with the rest of the infrastructure, the data supply is becoming software-defined as well.

This is because mechanical infrastructures cannot keep up with the data demand of today's multi-core processors. SAS and SATA protocols were developed to extract every last bit of performance from mechanical hard disk drives, but these interfaces add unnecessary latency and performance bottlenecks when used with flash memory-based devices. The resulting performance degradation diminishes the ability of a flash memory solution to supply data to the application.
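
As a rough, back-of-the-envelope illustration of that bottleneck, the sketch below compares how much of a single read's total time the disk-era interface consumes in front of a mechanical drive versus a flash device. Every figure is an assumed ballpark value for 2012-era hardware, chosen only to show the shape of the problem, not a measured specification.

```python
# Back-of-envelope latency budget for one small read.
# All figures are assumed ballpark values, not measurements.

hdd_media_us = 7000    # assumed ~7 ms seek + rotational delay for a 7,200 RPM disk
nand_read_us = 50      # assumed ~50 microseconds to read a NAND flash page
interface_us = 30      # assumed SAS/SATA controller and protocol stack overhead

for name, media_us in (("Mechanical disk", hdd_media_us), ("Flash", nand_read_us)):
    total_us = media_us + interface_us
    share = interface_us / total_us * 100
    print(f"{name}: {total_us} us per read, interface overhead = {share:.0f}% of the total")
```

Against a mechanical disk the interface overhead disappears into the seek time; against flash it becomes a large fraction of every access, which is exactly the degradation described above.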

A Light of Flash at the End of the Tunnel

As flash memory became economically viable for data center deployment, the first implementations of this new medium were retrofitted into existing disk-based infrastructures, essentially treating flash as a fast disk. Treating flash as an extension of memory is, by contrast, the architecturally superior approach for delivering low-latency performance.

Luckily, innovative companies realized the disk-era method of deployment hindered the native advantages of flash. They developed a server-deployed form factor that placed the flash medium closer to the CPU via PCI Express, along with software that cut out the latency-inducing disk-based protocols. This was the first step toward making software-defined storage in the data center a reality, significantly improving data performance for applications while reducing infrastructure, capital, and operational expenditures in the process.
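
A simple way to picture what the server-deployed approach removes is to list the components a read request traverses in each model. The hop names below are a generalized assumption about a typical networked-array path, not a description of any specific product.

```python
# Illustrative I/O paths; the hop names are assumptions for comparison only.

array_path = [
    "application", "OS block layer", "SCSI/SATA protocol stack",
    "host bus adapter", "storage network", "array controller", "flash or disk media",
]

server_flash_path = [
    "application", "OS block layer", "native flash driver", "PCIe flash device",
]

removed = [hop for hop in array_path if hop not in server_flash_path]
print("Hops removed by server-deployed PCIe flash:", ", ".join(removed))
```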

This essential evolution in computing architectures solved the problem of how to deploy flash memory to gain maximum value, but it still didn’t solve the problem of data acceleration for networked environments. Many enterprise applications have been written with those environments specifically in mind, and this direct-attached model for flash memory did not address this need.

Today, further advances in flash memory deployment have eliminated that barrier. Software solutions now available allow users to load servers with several flash memory devices and run software on top of that hardware, giving commodity servers the characteristics of a networked storage device compatible with standard network protocols. Proprietary, vertically integrated disk-based arrays can now be replaced with flash-equipped commodity servers, significantly reducing cost while delivering the order-of-magnitude performance improvement inherent to flash.
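
To put a rough number on that order-of-magnitude claim, consider the sketch below. Every figure is an assumption chosen purely for illustration; per-drive IOPS, drive counts, and per-card flash IOPS vary widely by product and workload.

```python
# Hedged, illustrative comparison only; none of these figures are vendor specifications.

disk_iops_each = 180          # assumed random IOPS for one 15K RPM drive
disks_in_array = 120          # assumed drive count in a midrange array

flash_iops_each = 100_000     # assumed random IOPS for one 2012-era PCIe flash card
flash_cards_per_server = 4    # assumed number of cards in one commodity server

array_iops = disk_iops_each * disks_in_array
flash_server_iops = flash_iops_each * flash_cards_per_server

print(f"Disk array:          {array_iops:,} IOPS")
print(f"Flash-filled server: {flash_server_iops:,} IOPS")
print(f"Improvement:         {flash_server_iops / array_iops:.0f}x")
```

Even with conservative assumptions, the single flash-equipped server comes out more than an order of magnitude ahead.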

With storage following the software-defined trend in innovation, relying on commodity hardware and allowing software to do the heavy lifting, IT administrators can now realize the fully software-defined data center, with flash memory delivering the kind of application acceleration needed to meet the data demand of the information economy. This frees IT professionals to be more creative and less reactive, opening up a whole new realm of possibility for how businesses can innovate through mastering data demands.

Industry Perspectives is a content channel at Data Center Knowledge highlighting thought leadership in the data center arena. See our guidelines and submission process for information on participating. View previously published Industry Perspectives in our Knowledge Library.
