Bloomberg Data Centers: Where the “Go”s Go
Hurricane Sandy forces market-data and news giant to rethink infrastructure
The kind of resiliency test Hurricane Sandy forced on Bloomberg’s Manhattan data center is not something John O’Connor wants to go through again. As the storm surge flooded the streets of lower Manhattan in late October 2012, the water level in the facility’s basement reached 18 inches at one point.
There were fuel tanks and fuel controls in the basement, all of which could easily have malfunctioned had any more water entered the building, but “Mother Nature allowed us to win that battle,” recalls O’Connor, manager of data center operations at Bloomberg. His team kept the water from rising further; the facility switched to generator power and hummed through the crisis without incident.
But the episode made upper management at Bloomberg “uncomfortable enough to write a big check” to build a new data center in the quiet New York suburb of Orangetown, as far away from Manhattan as practically possible. “We wanted to have more baskets to put our eggs in,” O’Connor said.
The data center came online this spring. The company hasn’t shut down the downtown facility, which has been in operation for about 18 years, but O’Connor’s team has been moving a lot of the workloads from there to the new one.
Where the “Go”s Go
Bloomberg data centers support its bread-and-butter professional service, which today has more than 325,000 subscribers. They pay the company “for this always-on, always-fast access to data and analytics about all things money,” O’Connor said.
He likes to say Bloomberg’s data centers are “where the ‘Go’s go,” referring to the “Go” key that replaces “Enter” on keyboards that come with the service. “When they hit ‘Go,’ it’s coming back to the data center to get the answers.”
Three Hot Sites
The older facility in Manhattan, the new one in Orangetown, and a third elsewhere in the New York metro area are the three primary Bloomberg data centers. All three are “hot,” which means workloads can be quickly shifted from site to site as needed.
The load is split among the sites, sometimes 50-50 between two sites, and sometimes one-third per site. “It’s all designed to be very flexible,” O’Connor said.
If one of the sites goes down, the top-tier workloads fail over automatically, while the rest must be transferred manually by data center operators.
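In rough terms, that tiered failover might look like the Python sketch below. It is purely illustrative – the site names, workload names, and tiering scheme are invented for the example, not Bloomberg’s actual tooling.

# Hypothetical sketch of tiered failover across three hot sites.
# Site names, tiers, and workloads are invented for illustration.

SITES = {"site-a", "site-b", "site-c"}

class Workload:
    def __init__(self, name, tier, site):
        self.name = name
        self.tier = tier    # 1 = top tier: fails over automatically
        self.site = site

def fail_over(workloads, failed_site):
    """Move top-tier workloads off a failed site; queue the rest for operators."""
    healthy = sorted(SITES - {failed_site})
    manual_queue = []
    stranded = [w for w in workloads if w.site == failed_site]
    for i, w in enumerate(stranded):
        if w.tier == 1:
            # Top-tier workloads shift automatically, spread across
            # the remaining hot sites.
            w.site = healthy[i % len(healthy)]
        else:
            # Everything else waits for a data center operator.
            manual_queue.append(w)
    return manual_queue

loads = [Workload("terminal-queries", 1, "site-a"),
         Workload("batch-reports", 2, "site-a")]
pending = fail_over(loads, "site-a")
print([w.name for w in pending])    # ['batch-reports']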
A Telco-Like Global WAN
Bloomberg’s own dedicated fiber infrastructure interconnects all three facilities to make failover faster. Given the nature of its business, the company invests heavily in network infrastructure. In addition to the three primary East Coast data centers, it has more than 100 nodes in data centers around the world, all communicating with the core infrastructure over Bloomberg’s sizable global wide area network. “We’re essentially a telco provider,” O’Connor said.
The company uses other carriers’ physical fiber for connectivity between the nodes and the primary sites, but “‘other’ is the wrong word,” he said. “All [carriers] is probably more accurate.” Each Bloomberg data center has two sizable network meet-me rooms where carriers interconnect with each other and with Bloomberg’s infrastructure. Most data centers have only one.
Contain Hot Air, Use Free Cooling, Put VFDs on Everything
The company expects its new data center to be 20 percent more energy efficient than the one in Manhattan. There is nothing extraordinary about the facility’s design that gets it to that level of efficiency, O’Connor said. His team simply followed as many efficiency best practices as it could find and implement.
The 7 MW facility has all the latest and greatest IT gear (new hardware is usually far more efficient than preceding generations) and uses as much free cooling as possible at any given time. But the biggest efficiency gain comes from the complete isolation of hot and cold air, O’Connor said.
Exhaust air escapes the servers and rises into an overhead plenum through chimneys attached to the tops of the cabinets. This means the air entering the servers can be warmer than usual, which reduces the cooling system’s energy use. “The whole room is a cold aisle, and it doesn’t have to be as cold, because there’s no hot air mixing,” he said.
There are variable-frequency drives (VFDs) “on everything,” and the system automatically adjusts the amount of mechanical cooling based on outside-air temperature, which it gets from a simple weather station on the building’s roof.
Automating this piece of data center management and having better weather data from a local weather station (as opposed to getting data from the nearest airport) enables the facility to use more free cooling and save energy. “If you can get an extra hour of free cooling – and you do that several times a year – that’s extra money,” O’Connor said.
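In code, that control decision comes down to a couple of comparisons. The Python sketch below is purely illustrative: the cutoff temperature, supply setpoint, airflow model, and avoided-chiller-power figure are all assumptions made for the example, not published Bloomberg numbers.

# Hypothetical sketch of economizer control driven by a rooftop
# temperature reading. All setpoints and power figures are assumed.

FREE_COOLING_MAX_C = 18.0   # assumed cutoff: outside air alone suffices below this
SUPPLY_TARGET_C = 24.0      # assumed cold-aisle supply setpoint
MIN_VFD_PCT, MAX_VFD_PCT = 30.0, 100.0

def choose_mode(outside_c):
    """Free cooling whenever the rooftop station reads cool enough."""
    return "free" if outside_c <= FREE_COOLING_MAX_C else "mechanical"

def fan_vfd_pct(outside_c, load_fraction):
    """Crude airflow model: required flow scales with IT load and falls
    as colder outside air widens the usable temperature difference."""
    delta_t = max(1.0, SUPPLY_TARGET_C - outside_c)
    pct = 100.0 * load_fraction * (5.0 / delta_t)   # 5 C taken as nominal delta-T
    return max(MIN_VFD_PCT, min(MAX_VFD_PCT, pct))

print(choose_mode(16.0), fan_vfd_pct(16.0, 0.8))   # free 50.0

# "An extra hour of free cooling ... that's extra money": assuming
# 500 kW of avoided chiller draw at $0.10/kWh, each extra hour is ~$50.
print(500 * 0.10, "dollars saved per extra free-cooling hour")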
Keeping in Step With Peers
Like other big players in the financial-services market, Bloomberg has IT and data center teams that spend quite a bit of time experimenting with new technologies. The company has a “sizable” private OpenStack cloud, currently used for development and some minor customer-facing services, and it intends to shift other types of workloads in that direction as the technology matures. All new applications Bloomberg develops are built for elastic environments, O’Connor said.
Bloomberg has also “dabbled” in hardware designed to Open Compute Project specifications. OCP is a Facebook-led open source data center and hardware design initiative. There are some Open Compute servers running at Bloomberg, O’Connor said. The company is actively involved in OCP, and he expects to submit proposals for a rack design to the project later this year.
Like other participants in OCP, Bloomberg is involved in the project to drive the open source community in the direction that fits its needs, he explained.
Green Story Born Out of a Crisis
Sandy, doubtless, made many organizations with critical infrastructure in downtown New York – and perhaps in other low-lying coastal areas – take a hard look at their choice of location. It’s hard to say whether Bloomberg would have built a new data center in the suburbs had it not been for the hurricane and the flood that followed; companies expand infrastructure periodically and for various reasons, and more and more choose to build data centers outside major cities – more often than not for economic reasons rather than fear of natural disasters.
But Sandy was clearly a catalyst in Bloomberg’s case. As a result, its services gained in resiliency, and the company scored some “green” points for building a facility with advanced energy-efficiency features. It is one of the first data centers to receive certification under the US Green Building Council’s newest LEED v4 benchmark, which includes requirements unique to certain types of buildings, including data centers. Bloomberg’s Orangetown data center received LEED Gold earlier this year.