Reducing Energy Consumption and Cost in the Data Center
Data center managers are faced with planning for the future and the mandate to change their current rate of spending on equipment and operations, writes Rich Gadomski of Fujifilm Recording Media. One area of focus has been the massive energy consumption of data centers and the impact of storage.
December 11, 2014
Rich Gadomski is a member of the Active Archive Alliance and VP of Marketing at Fujifilm Recording Media.
It is probably safe to say that most data center managers are dealing with the challenge of increasing data growth and limited IT resources and budgets. With “save everything forever” strategies becoming more prevalent for many organizations, the strain on IT resources will only get worse over time.
Data center managers are faced with planning for the future and the mandate to change their current rate of spending on equipment and operations. One area of focus has been the massive energy consumption of data centers and the role storage plays in it.
Energy Consumption in the Data Center
Although the dire predictions of the 2007 EPA report on data center energy consumption have not panned out, energy consumption remains an ongoing concern and data centers are not off the hook.
Earlier this year, a report by Greenpeace criticized big data centers for using dirty energy (coal, gas, nuclear) as opposed to clean energy (wind, solar). A more recent report from the Natural Resources Defense Council (NRDC) points to waste and inefficiency in U.S. data centers, which consumed a massive 91 billion kWh of electricity in 2013, and projects that consumption will grow to 140 billion kWh by 2020, the equivalent output of 50 large (500 megawatt) power plants. However, the 2014 Uptime Institute annual data center survey reveals that data center power usage effectiveness (PUE) metrics have plateaued at around 1.7 after several years of steady improvement.
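A quick back-of-the-envelope calculation shows how those figures relate. The Python sketch below is illustrative only; the roughly 65 percent plant capacity factor is an assumption on my part, not a number taken from the NRDC report.

```python
# Back-of-the-envelope check of the figures above (illustrative only).
# The ~65% capacity factor is an assumption, not a number from the NRDC report.

ANNUAL_HOURS = 24 * 365                             # 8,760 hours per year

# One "large" 500 MW plant, assuming a ~65% capacity factor
plant_output_kwh = 500_000 * ANNUAL_HOURS * 0.65    # kW * h = kWh, ~2.8 billion kWh/year

projected_2020_kwh = 140e9                          # 140 billion kWh (NRDC projection)
plants_needed = projected_2020_kwh / plant_output_kwh
print(f"Equivalent plants: {plants_needed:.0f}")    # roughly 50

# PUE = total facility energy / IT equipment energy.
# A PUE of 1.7 means about 0.7 kWh of cooling and other overhead
# for every 1 kWh delivered to servers, storage, and network gear.
pue = 1.7
it_energy_kwh = projected_2020_kwh / pue
print(f"IT load share of total facility energy: {it_energy_kwh / projected_2020_kwh:.0%}")  # ~59%
```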
One reason for the heavy energy consumption of data centers is that they rely heavily on spinning hard disk drive technology to store their data. Often the response to increasing data growth has been simply to add more disk arrays. A hard disk drive platter spinning 24/7/365 at 7,200 or 10,000 RPM requires power not only to spin it, but to cool it as well; otherwise the heat generated by the constant spinning would corrupt and eventually destroy the data.
Managing Data Center Growth
While every organization will have different data profiles, studies show that over an average life cycle, data becomes inactive after a short period of 30 to 90 days. If that is the case, it makes sense to move that data from expensive primary disk storage to more economical tiers of storage, such as low-cost capacity disk and/or tape.
Moving files from high-cost to low-cost tiers does not have to sacrifice data accessibility. One way to ensure it is not sacrificed is an active archiving strategy. An active archive file system gives you the ability to store and access all of your data by extending the file system across all tiers of storage, and it does so transparently to users. As a result, active archives provide organizations with a persistent view of the data in their archive and make it easy to access files whenever needed.
By policy, files migrate from expensive primary storage to economy tiers, freeing up primary storage for new data, reducing backup volume, and reducing overall cost by storing data according to its usage profile. The idea is to stop backing up (copying) data that is no longer changing or is seldom retrieved, and move it into an active archive instead.
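To make the policy concrete, here is a minimal sketch of an age-based migration pass. It is illustrative only: commercial active archive software applies such policies automatically and preserves a unified view across tiers, and the mount paths and 90-day threshold below are assumptions.

```python
"""Minimal sketch of an age-based migration policy (illustrative only).

A real active archive applies policies like this automatically and transparently;
the paths and the 90-day threshold here are assumptions for illustration.
"""
import os
import shutil
import time

PRIMARY_TIER = "/mnt/primary"      # expensive primary disk (hypothetical path)
ARCHIVE_TIER = "/mnt/archive"      # economy tier, e.g. capacity disk or a tape NAS share
MAX_IDLE_DAYS = 90                 # files untouched this long become archive candidates

def migrate_inactive_files():
    cutoff = time.time() - MAX_IDLE_DAYS * 24 * 3600
    for dirpath, _dirnames, filenames in os.walk(PRIMARY_TIER):
        for name in filenames:
            src = os.path.join(dirpath, name)
            # Use last-access time as the activity signal
            if os.path.getatime(src) < cutoff:
                rel = os.path.relpath(src, PRIMARY_TIER)
                dst = os.path.join(ARCHIVE_TIER, rel)
                os.makedirs(os.path.dirname(dst), exist_ok=True)
                shutil.move(src, dst)          # frees primary capacity
                print(f"archived {rel}")

if __name__ == "__main__":
    migrate_inactive_files()
```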
Optimizing Performance in Storage Environments
Today, automated tape libraries capable of scaling into the petabytes play a key role in active archiving: the data remains easily accessible and well protected, but the media consume no energy until data is retrieved. TCO studies of disk vs. tape show a significant advantage for tape, with disk consuming up to 105 times more energy than the equivalent amount of tape storage.
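The sketch below shows how a gap of that size arises. Every input number is an assumption chosen to illustrate the shape of the calculation, not a figure from a specific TCO study.

```python
# Illustrative disk-vs-tape energy comparison (all input numbers are assumptions,
# not figures from a specific TCO study).

ARCHIVE_TB = 1_000                    # 1 PB of archive data
DISK_W_PER_TB = 8.0                   # assumed watts per TB for always-spinning disk
TAPE_W_PER_TB = 0.08                  # assumed average watts per TB for a tape library
                                      # (drives and robotics active only on retrieval)
PRICE_PER_KWH = 0.10                  # assumed electricity price, $/kWh
HOURS_PER_YEAR = 24 * 365

def annual_cost(watts_per_tb: float) -> float:
    """Annual electricity cost for the archive at the given power density."""
    kwh = watts_per_tb * ARCHIVE_TB * HOURS_PER_YEAR / 1000
    return kwh * PRICE_PER_KWH

disk_cost = annual_cost(DISK_W_PER_TB)
tape_cost = annual_cost(TAPE_W_PER_TB)
print(f"Disk archive: ${disk_cost:,.0f}/year")
print(f"Tape archive: ${tape_cost:,.0f}/year")
print(f"Ratio: {disk_cost / tape_cost:.0f}x")   # ~100x with these assumptions
```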
While disk and now flash technology get a lot of attention, today's tape is performance optimized with LTFS (Linear Tape File System) and tape NAS appliances. Its capacity is constantly increasing thanks to advanced tape drives, high-density automated tape libraries and new media innovations like Barium Ferrite (BaFe), which has been shown to deliver superior performance and longer archival life than conventional metal particle (MP) tape. BaFe particles are already used in LTO-6 tape and in multi-terabyte enterprise data tape, supporting large-volume backup and archival workloads across diverse industries.
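Because LTFS and tape NAS appliances present tape as an ordinary file system or network share, applications need no tape-specific code. The snippet below is a hypothetical example: the mount point and file names are assumptions, and the main practical difference from disk is the latency of the first access while the library loads and positions the tape.

```python
# A tape-backed share mounted via LTFS or a tape NAS appliance behaves like any
# other file system, so standard file I/O works unchanged. The mount point and
# file names below are hypothetical.
from pathlib import Path
import shutil

TAPE_SHARE = Path("/mnt/tape_archive")              # assumed tape-backed mount

# Writing to the archive tier is an ordinary copy...
shutil.copy2("/data/projects/finished_project.mov",
             TAPE_SHARE / "finished_project.mov")

# ...and reading it back is an ordinary open(). The first bytes may take a
# minute or two to arrive while the tape is mounted and positioned.
with open(TAPE_SHARE / "finished_project.mov", "rb") as f:
    header = f.read(4096)
print(f"Read {len(header)} bytes back from the tape-backed share")
```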
Tape can serve as an efficient active archive tier that fits into the same commonly used network storage environments. Organizations can easily move hundreds of terabytes or even petabytes of content onto a network share powered by tape without introducing any change for their users. By moving data from expensive, energy-hungry primary disk storage to cost-effective tape technology within an active archive environment, organizations can significantly decrease energy consumption and space requirements, leading to an overall decrease in data center expenses.