
Storage: A Secret Weapon to Lower TCO in the Data Center

Adopting the right storage strategy can make a vast difference in your ability to lower Total Cost of Ownership (TCO), writes Brendan Collins of HGST. Customizing and tiering infrastructure workloads and tuning drives by application and function will help you more efficiently handle ever-increasing amounts of data. It will also allow you to optimize power consumption, cooling, storage density and performance to deliver new data center economic models.

Brendan Collins is the Vice President of Enterprise Storage at HGST, a Western Digital Company, and has more than 20 years of technical and management experience in the disk drive industry.

Whether you are building storage solutions for public clouds, private clouds or traditional data centers, adopting the right storage strategy can make a vast difference in your ability to lower Total Cost of Ownership (TCO). The secret to successfully impacting TCO goes well beyond Hard Disk Drive (HDD) acquisition costs alone. Customizing and tiering infrastructure workloads and tuning drives by application and function will help you more efficiently handle ever-increasing amounts of data. It will also allow you to optimize power consumption, cooling, storage density and performance to deliver new data center economic models.

Storage is No Longer a Commodity

Storage is no longer an interchangeable, one-size-fits-all commodity. Increasingly, it is the critical foundation and enabler for your entire infrastructure as the need for online or cloud data storage grows exponentially. Analysts predict 80 percent of net new apps will target the cloud [1] and over a third of all IT spending will move off premises by 2013 [2]. Fueled by mobile and enterprise applications and “big data,” the amount of unstructured and “semi-structured” content is far outweighing the structured data that storage solutions and tools were originally designed to store, protect and manage. New tools like Hadoop and shifts to alternative storage architectures are rapidly emerging to address this avalanche of data.

Today’s “hyper-scale” data centers, where tens of thousands of servers coupled with petabytes (or even exabytes) of data host applications such as search, social networking, big data analytics, content sharing, online banking and retail, are all designed with the assumptions that operating expenses (OPEX) are just as critical as capital expenses (CAPEX), and that both problems and costs are best solved at the system or even data center level.

New HDDs Use Less Power

One of the most significant costs in data centers is power and cooling. New 4TB hard drives available on the market use 24 percent fewer watts per gigabyte than lower capacities and support multiple power modes to reduce power consumption by up to 59 percent, depending on the mode and how fast the HDD needs to respond to requests. When set to the lowest power modes like idle or standby, these drives use less than one watt of power, keeping data online at virtually no power cost and freeing valuable headroom for other enterprise needs.
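To make the impact concrete, here is a rough, back-of-envelope sketch of fleet-level power costs. The wattage figures, idle fraction, PUE and electricity rate are illustrative assumptions, not drive specifications:

```python
# Illustrative sketch: estimating annual power-and-cooling cost for a drive fleet.
# All figures below are assumptions chosen for the example, not published specs.

DRIVES = 10_000            # drives in the fleet
ACTIVE_W = 7.5             # assumed active power per drive (watts)
IDLE_W = 0.9               # assumed low-power-mode draw (watts)
IDLE_FRACTION = 0.6        # assumed share of time drives can sit in idle
PUE = 1.5                  # assumed power usage effectiveness of the facility
COST_PER_KWH = 0.10        # assumed electricity rate ($/kWh)

avg_watts_per_drive = ACTIVE_W * (1 - IDLE_FRACTION) + IDLE_W * IDLE_FRACTION
facility_kw = DRIVES * avg_watts_per_drive * PUE / 1000
annual_cost = facility_kw * 24 * 365 * COST_PER_KWH

print(f"Average draw per drive: {avg_watts_per_drive:.2f} W")
print(f"Facility load for the fleet: {facility_kw:.1f} kW")
print(f"Annual power + cooling cost: ${annual_cost:,.0f}")
```

Even small per-drive savings compound quickly once they are multiplied by drive count, PUE and hours in a year.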

Reducing downtime is another key factor in managing TCO. Purchasing reliable, quality HDDs with the highest possible mean-time-between-failures (MTBF) keeps data safe and accessible, while reducing both maintenance costs and the performance degradation that occurs when storage systems are recovering from HDD failures. HDDs offering five-year limited warranties and MTBF [3] specifications of 2 million hours deliver 800,000 additional hours of durability over typical 1.2-million-hour drives while greatly reducing service calls.
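A quick way to see what an MTBF specification means in practice is to convert it to an annualized failure rate (AFR). The sketch below compares the 2-million-hour figure cited above with the implied 1.2-million-hour baseline; the fleet size is an assumption for illustration:

```python
# Back-of-envelope: converting MTBF into an expected annualized failure rate (AFR)
# for always-on drives, approximated as hours per year / MTBF.

HOURS_PER_YEAR = 24 * 365
FLEET = 10_000  # assumed fleet size for illustration

def annual_failure_rate(mtbf_hours: float) -> float:
    """Approximate AFR for drives powered on 24x7."""
    return HOURS_PER_YEAR / mtbf_hours

for mtbf in (1_200_000, 2_000_000):
    afr = annual_failure_rate(mtbf)
    print(f"MTBF {mtbf:,} h -> AFR {afr:.2%}, "
          f"~{FLEET * afr:.0f} expected failures/year in a {FLEET:,}-drive fleet")
```

Fewer expected failures means fewer service calls and fewer rebuild windows dragging down performance.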

Benefits of Increased Capacity, Density

Storage density is dictated by the capacity of the HDD and how many HDDs you can integrate into rack-mounted servers and storage solutions. Massive 4TB drives provide 33 percent more capacity than the previous generation 3TB HDDs. With this higher capacity, a fully loaded standard 19-inch rack can provide up to 2.4PB of data storage. There are two ways to use this capacity advantage: lower TCO by upgrading HDDs and reducing the number of servers, racks, cables and networking gear, or keep expenses flat and quadruple storage capacity (depending on what HDDs you are currently using, of course).
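The consolidation math is easy to sketch. Assuming roughly 600 drives per fully loaded rack (2.4 PB divided by 4 TB) and a hypothetical 10 PB capacity target, the rack count falls quickly as drive capacity rises:

```python
import math

# Sketch of the consolidation math above. 2.4 PB per rack with 4 TB drives
# implies roughly 600 drive slots per rack; the 10 PB target and the legacy
# drive sizes are assumptions for illustration.

DRIVES_PER_RACK = 600
REQUIRED_PB = 10

def racks_needed(required_pb: float, drive_tb: float) -> int:
    drives = math.ceil(required_pb * 1000 / drive_tb)
    return math.ceil(drives / DRIVES_PER_RACK)

for drive_tb in (1, 2, 3, 4):
    print(f"{REQUIRED_PB} PB on {drive_tb} TB drives: "
          f"{racks_needed(REQUIRED_PB, drive_tb)} racks")
```

Every rack you avoid also removes its servers, cables, switch ports, floor space and power draw from the TCO equation.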

Although areal density may be slowing, there are other ways to keep driving capacity. High capacity Advanced Format technology will improve how the surface of the drive is utilized, leading to optimized capacity use, better data integrity and higher storage densities. When Advanced Format drives are offered with emulation mode you can ensure compatibility with legacy systems and applications, evolve the infrastructure as your business requires, and save the cost of expensive new capital investments in the process.
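One practical detail worth checking when Advanced Format drives run in 512-byte emulation mode is partition alignment: partitions that do not start on a 4 KiB boundary pay a read-modify-write penalty. A minimal sketch, using sample offsets rather than values read from a live system:

```python
# Minimal sketch: verifying that partition start offsets are aligned to the
# 4,096-byte physical sectors typical of Advanced Format drives. Misaligned
# partitions on 512-byte-emulation drives incur read-modify-write overhead.
# The offsets below are sample values, not read from a real system.

PHYSICAL_SECTOR = 4096

partitions = {
    "/dev/sda1": 1_048_576,  # 1 MiB start offset (aligned)
    "/dev/sda2": 32_256,     # legacy 63-sector offset (misaligned)
}

for name, offset_bytes in partitions.items():
    aligned = offset_bytes % PHYSICAL_SECTOR == 0
    status = "aligned" if aligned else "MISALIGNED, expect RMW penalty"
    print(f"{name}: start {offset_bytes:,} bytes -> {status}")
```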

Storage Tiering Has Positive Impacts

Lastly, whether you have a large hyper-scale cloud data center or a small internal data center, you can significantly improve TCO and utilization with storage tiering.

Tiering introduces layers of storage that are best matched to the needs of a company’s applications and workloads. There are no hard-and-fast rules for tiering; it depends on the workloads, applications, performance needs and data types. Some of the major tiers and functions to consider include (a rough selection sketch follows the list):

  • Using high-performance SSDs in Tier 0 for mission-critical transactional applications provides the best IOPS/$ and the best IOPS/Watt.

  • 10K/15K RPM SAS HDDs provide the best balance of performance and capacity and are ideal for Tier 1 business-critical applications.

  • High capacity 7,200 RPM SAS and SATA HDDs deliver the best $/GB and good power numbers for Tier 2, where the bulk of cloud storage resides.

  • And at the bottom of the “storage pyramid” we find long-term archiving and long-tail content, where data is accessed infrequently but still needs to be online. Low-RPM HDDs and, in some cases, tape provide the lowest $/GB and the best TB/W.
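As a rough illustration of how these tiers map to workloads, the toy rule of thumb below picks a tier from a workload profile. The thresholds are invented for the example; real tiering decisions weigh IOPS, latency, $/GB and power together:

```python
# Illustrative sketch only: a toy rule of thumb that maps a workload profile to
# one of the tiers described above. The thresholds are invented for the example.

def suggest_tier(iops_per_tb: float, accesses_per_month: float,
                 mission_critical: bool) -> str:
    if mission_critical and iops_per_tb > 5000:
        return "Tier 0: high-performance SSD (best IOPS/$ and IOPS/W)"
    if iops_per_tb > 500:
        return "Tier 1: 10K/15K RPM SAS HDD (performance/capacity balance)"
    if accesses_per_month >= 1:
        return "Tier 2: 7,200 RPM SAS/SATA HDD (best $/GB for bulk cloud storage)"
    return "Archive tier: low-RPM HDD or tape (lowest $/GB, best TB/W)"

print(suggest_tier(iops_per_tb=8000, accesses_per_month=1e6, mission_critical=True))
print(suggest_tier(iops_per_tb=50, accesses_per_month=0.1, mission_critical=False))
```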

By looking at your requirements and solutions holistically, at the system and data center levels, and by taking into account OPEX, infrastructure and the broader CAPEX picture, you can better customize your storage solutions to the use case. Learning how best to tier, pool, deploy and secure data across public and private clouds as well as on premises will enable you to deliver optimal performance and efficiency with reduced system and energy costs.
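One way to make that holistic view tangible is a simple cost-per-usable-terabyte model that folds drive CAPEX, power OPEX and expected replacements into a single number. Every figure below is an assumption chosen for illustration, not a quoted price or specification:

```python
# Hedged sketch of a holistic TCO comparison: cost per terabyte over a planning
# horizon, combining drive price, power and cooling, and expected replacements.
# All inputs are assumptions for illustration.

HOURS_PER_YEAR = 24 * 365

def tco_per_tb(capacity_tb, price, watts, mtbf_hours, years=5,
               kwh_rate=0.10, pue=1.5):
    power_cost = watts * pue / 1000 * HOURS_PER_YEAR * years * kwh_rate
    replacements = (HOURS_PER_YEAR * years / mtbf_hours) * price
    return (price + power_cost + replacements) / capacity_tb

print(f"4 TB drive: ${tco_per_tb(4, price=300, watts=7.5, mtbf_hours=2_000_000):.1f}/TB over 5 years")
print(f"2 TB drive: ${tco_per_tb(2, price=180, watts=9.0, mtbf_hours=1_200_000):.1f}/TB over 5 years")
```

However rough, a model like this makes it easy to see when a higher-capacity, lower-power, longer-lived drive earns back its price premium over the life of the system.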

