Facebook Saves Big By Retooling its Cooling

Facebook retooled the cooling system in one of its existing data centers in Santa Clara, reducing the facility's annual energy bill by $229,000 and earning a $294,761 rebate from Silicon Valley Power. Here are details of its presentation at the Data Center Efficiency Summit.

Rich Miller

October 14, 2010

4 Min Read

These before-and-after CFD modeling diagrams show the difference in airflow management once Facebook implemented cold aisle containment.

Facebook's efforts to make its data centers more energy efficient aren't limited to its new facility in Oregon. The social network recently retooled the cooling system in one of its existing data centers in Santa Clara, Calif., slashing the facility's annual energy bill by $229,000 and earning a $294,761 rebate from Silicon Valley Power.

The company shared details of its efficiency project today at the Data Center Efficiency Summit 2010 hosted by the Silicon Valley Leadership Group (SVLG) on the Brocade campus in Santa Clara. Facebook pursued a multi-faceted approach in its retrofit of a 56,000 square foot data center, in which it is the only tenant and holds a long-term lease. The facility did not offer the option of using economizers, which allow data center operators to save money by using fresh air for cooling in place of energy-intensive chillers.

Series of Best Practices
With economization off the table, Facebook implemented a series of best practices to dramatically reduce its energy use, including thermal modeling, installing cold aisle containment, reducing fan energy in servers, and raising the temperature of both supply air and chilled water. At the SVLG Summit, Facebook's refinements were described by Director of Datacenter Engineering Jay Park and engineers Veerendra Mulay and Daniel Lee.

Here's a look at the components of the project:

CFD Modeling: The first step in Facebook's efficiency project was creating a thermal model of the data center using computational fluid dynamics (CFD) software, which builds a 3D model of how cold air moves through the facility and identifies potential “hot spots” as well as areas receiving more cold air than needed, wasting cooling and energy. The CFD study revealed that some of the cold air entering the room through the raised floor was bypassing servers, cooling the room rather than the IT equipment, while warm exhaust air from the hot aisle was mixing with cold air in key areas.
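
Commercial CFD tools produce far richer output than this, but the basic analysis can be sketched in a few lines of Python: given a grid of modeled rack-inlet temperatures, flag cells that run too warm (hot spots) and cells that are much colder than they need to be (a sign of bypass air). The readings and thresholds below are illustrative assumptions, not values from Facebook's study.

```python
# Minimal sketch (not Facebook's CFD tooling): flag potential hot spots and
# over-cooled zones in a grid of modeled rack-inlet temperatures.
# All temperatures and thresholds below are illustrative assumptions.
inlet_temps_f = [
    [68, 70, 79, 83],   # one row of racks, degrees F
    [67, 69, 72, 74],
    [64, 66, 71, 88],
]

HOT_SPOT_F = 80     # assumed upper limit for rack-inlet air
OVERCOOLED_F = 68   # assumed sign of bypass air / wasted cooling

hot_spots = [(r, c) for r, row in enumerate(inlet_temps_f)
             for c, t in enumerate(row) if t > HOT_SPOT_F]
overcooled = [(r, c) for r, row in enumerate(inlet_temps_f)
              for c, t in enumerate(row) if t < OVERCOOLED_F]

print("Hot spots (row, rack):", hot_spots)
print("Over-cooled (row, rack):", overcooled)
```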

Cold Aisle Containment: Facebook took several steps to address the airflow problems identified in the CFD modeling. It began by installing a cold aisle containment system to isolate the hot and cold air in the data center. Roof panels were installed over the cold aisles, with fusible links to allow for adequate overhead fire suppression. Doors at each end of the aisle allowed access for tech staff. Facebook also took steps to seal every area where cold air could escape, using blanking plates, skirts for PDUs (power distribution units) and sealing cut-outs for cabling.

Reducing the Number of CRAH Units: Once the cold aisle was encapsulated, less airflow was required to cool the equipment. This allowed Facebook to turn off 15 computer room air handlers (CRAHs), saving the energy required to operate those excess units.
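
To get a feel for what shutting down air handlers is worth, here is a rough back-of-the-envelope sketch; the per-unit fan power and electricity rate are assumptions for illustration, not figures from Facebook's presentation.

```python
# Rough estimate of annual savings from shutting off CRAH units.
# Per-unit fan power and electricity rate are illustrative assumptions.
crah_units_off = 15          # from the retrofit
fan_kw_per_crah = 7.5        # assumed fan draw per CRAH (kW)
hours_per_year = 8760        # continuous operation
rate_per_kwh = 0.10          # assumed $/kWh

kwh_saved = crah_units_off * fan_kw_per_crah * hours_per_year
print(f"Energy saved: {kwh_saved:,.0f} kWh/yr")
print(f"Cost saved:   ${kwh_saved * rate_per_kwh:,.0f}/yr")
```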

Reducing Server Fan Energy: Further savings were gained through adjustments to the server fans. "These fans are PWM fans - pulse-width modulation," Park explained. "They're typically pre-set by the manufacturer to run at higher speeds. You modulate the fans to a lower speed and you bring less air through the servers. You can set this through software. Intel can tell you how to do this."
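
The reason slowing fans pays off so well is the fan affinity laws: airflow scales roughly linearly with fan speed, while fan power scales roughly with the cube of speed, so a modest speed reduction produces an outsized power reduction. A minimal sketch of that relationship, using assumed numbers:

```python
# Fan affinity laws: airflow ~ speed, power ~ speed**3.
# Baseline fan power and speed reduction are illustrative assumptions.
baseline_fan_w = 15.0        # assumed fan power per server at full speed (W)
speed_fraction = 0.80        # run fans at 80% of the factory preset speed

new_fan_w = baseline_fan_w * speed_fraction ** 3
print(f"Airflow drops to ~{speed_fraction:.0%} of baseline")
print(f"Fan power drops to ~{new_fan_w:.1f} W "
      f"({new_fan_w / baseline_fan_w:.0%} of baseline)")
```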

Raising the Air Temperature: Facebook next concentrated on raising the rack inlet temperature as high as it could without triggering additional fan activity. Optimizing the cold aisle and server fan speed allowed Facebook to raise the temperature at the CRAH return from 72 degrees F to 81 degrees F.
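
One way to see why a warmer return helps: the sensible heat an air handler removes is approximately Q (BTU/hr) = 1.08 x CFM x delta-T, so widening the gap between return and supply air lets each CRAH do more cooling for the same airflow. The sketch below uses an assumed supply temperature and airflow, not figures from the presentation.

```python
# Sensible cooling of air: Q (BTU/hr) ~ 1.08 * CFM * delta_T (deg F).
# Supply temperature and airflow below are illustrative assumptions.
cfm = 10_000            # assumed airflow through one CRAH
supply_f = 60           # assumed supply-air temperature

for return_f in (72, 81):   # CRAH return temperature before and after
    q_btu_hr = 1.08 * cfm * (return_f - supply_f)
    tons = q_btu_hr / 12_000    # 12,000 BTU/hr = 1 ton of cooling
    print(f"Return {return_f}F: ~{tons:.1f} tons of cooling per unit")
```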

Raising the Water Temperature: The higher air temperature then allowed Facebook to raise the temperature of the supply water coming from its chillers, requiring less energy for refrigeration. The chilled water supply temperature was raised by 8 degrees, from 44 degrees F to 52 degrees F.
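
A commonly cited rule of thumb holds that chiller energy drops on the order of 1 to 2 percent for every degree Fahrenheit the chilled water supply is raised. The sketch below applies that rule to an assumed baseline chiller load; none of the numbers come from Facebook's presentation.

```python
# Rough chiller savings from warmer chilled water, using the common
# rule of thumb of ~1-2% less chiller energy per degree F raised.
# Baseline load and savings rate are illustrative assumptions.
baseline_chiller_kwh = 2_000_000   # assumed annual chiller energy (kWh)
degrees_raised = 8                 # 44F -> 52F, from the retrofit
savings_per_degree = 0.015         # assumed 1.5% per degree F

kwh_saved = baseline_chiller_kwh * savings_per_degree * degrees_raised
pct = savings_per_degree * degrees_raised
print(f"Estimated chiller savings: {kwh_saved:,.0f} kWh/yr (~{pct:.0%})")
```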

Facebook's systematic application of best practices illustrated how energy efficiency projects can create a "waterfall effect" of cascading benefits. The company's approach to the project made an impression upon its landlord, Digital Realty Trust.

"Facebook has distinguished itself as one of the leading efficiency teams among our global portfolio of thousands of customers," said Jim Smith, CTO of Digital Realty Trust. "In addition, Facebook has been open and collaborative in their approach, enabling us to implement some of their strategies with our other customers.  Thus, we have the potential to multiply the energy savings and environmental protection across the infrastructure of many other companies."
