Cooling and Powering Florida Poly’s New Supercomputer
Data center supporting the HPC system in the nation’s newest technology and science school features liquid cooling and unusual electrical design.
As the futuristic new campus of Florida Polytechnic University prepares to welcome its first 500-student class later this month, the university, together with IBM, announced the installation of an IBM supercomputer in the building.
The school will teach science, technology, engineering and mathematics, and the supercomputer will support its cybersecurity, Big Data and analytics, cloud and virtualization programs.
While not extremely large, the system is powerful and requires very high power density. Ed Turetzky, senior architect at Flagship Solutions Group, who worked on the team that installed the system, said the three racks currently populated would consume 32.7 kW when running at full steam.
For comparison, median power density in today’s enterprise and colocation data centers is lower than 5 kW per rack, according to a recent report by the Uptime Institute. High performance computing systems, such as Florida Poly’s new supercomputer, usually need much higher power densities than systems housed in enterprise and colocation facilities and require a different approach to data center design, especially mechanical and electrical infrastructure.
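A quick back-of-the-envelope calculation illustrates the density gap. The figures come from this article; the per-rack split assumes the 32.7 kW full-load draw is spread evenly across the three racks, which the article does not state explicitly:

```python
# Back-of-the-envelope power density comparison.
# Assumes the 32.7 kW full-load figure is spread evenly across the racks.

total_load_kw = 32.7      # full-load draw of the three populated racks
racks = 3
median_density_kw = 5.0   # approximate per-rack median (Uptime Institute)

per_rack_kw = total_load_kw / racks
print(f"Per-rack density: {per_rack_kw:.1f} kW")                          # ~10.9 kW
print(f"Ratio to ~5 kW median: {per_rack_kw / median_density_kw:.1f}x")   # ~2.2x
```

On that assumption, each rack runs at more than twice the density of a typical enterprise or colocation rack.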
Chilled water to the rack
Florida Poly’s 1,000-core system is cooled with a rear-door heat exchanger, bolted onto the back of one of the racks. The exchanger is made by IBM, and its cooling capacity is about 50,000 BTUs per hour, Turetzky said.
The supercomputer needs about 70,000 BTUs per hour of cooling capacity, and the ambient air conditioning in the room takes care of the difference. As the system scales and more compute nodes are added, another heat exchanger will be installed on another rack.
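For readers more used to kilowatts than BTUs, a short conversion sketch shows how the heat load splits between the rear-door exchanger and the room's air conditioning. The load figures are from this article; the conversion constant is the standard 1 kW ≈ 3,412 BTU/hr:

```python
# Split of the supercomputer's heat load between the rear-door
# heat exchanger and the room's ambient air conditioning.
BTU_HR_PER_KW = 3412.14         # 1 kW of heat ~= 3,412 BTU per hour

total_load_btu_hr = 70_000      # cooling the system needs
door_capacity_btu_hr = 50_000   # rear-door heat exchanger capacity

room_ac_btu_hr = total_load_btu_hr - door_capacity_btu_hr
print(f"Rear door: {door_capacity_btu_hr / BTU_HR_PER_KW:.1f} kW")  # ~14.7 kW
print(f"Room AC:   {room_ac_btu_hr / BTU_HR_PER_KW:.1f} kW")        # ~5.9 kW
```

In other words, the room's general air conditioning only has to absorb roughly 6 kW of the system's heat; the rear door handles the rest.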
The new Florida Polytechnic University will open its doors to the inaugural 500-student class later this month. (Photo by Jeane H. Vincent/Florida Polytechnic University)
The system pumps deionized water through the heat exchanger, which is part of a closed loop dedicated to the supercomputer. The loop extends outside of the computer room, where it passes through another heat exchanger that cools it with the regular chilled water used by the building’s own air conditioning system.
“There’s an apparatus outside of the computer room that monitors the heat that is exchanged, so it will regulate itself to keep that [deionized] water at a steady temperature,” Tom Mitchell, vice president of sales at Flagship, said. Water used in the supercomputer’s cooling loop is treated for impurities to prevent corrosion in the pipes, he explained.
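The article doesn’t describe how that regulating apparatus works internally. As a rough illustration only, a self-regulating loop of the kind Mitchell describes can be sketched as a simple proportional controller that modulates a chilled-water valve to hold the loop at a setpoint. Every name and value here is hypothetical, not Florida Poly’s actual control logic:

```python
# Hypothetical sketch of a proportional controller holding a closed
# cooling loop at a setpoint by modulating a chilled-water valve.
# Illustration only; setpoint and gain are arbitrary assumptions.

SETPOINT_C = 18.0   # assumed target loop temperature, degrees Celsius
GAIN = 0.15         # proportional gain, chosen for illustration

def valve_position(loop_temp_c: float) -> float:
    """Return chilled-water valve opening in [0, 1]."""
    error = loop_temp_c - SETPOINT_C          # positive when loop runs warm
    return max(0.0, min(1.0, GAIN * error))   # open wider as loop heats up

for temp in (17.5, 18.0, 20.0, 25.0):
    print(f"loop at {temp:.1f} C -> valve {valve_position(temp):.2f}")
```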
Power transformer directly on data center floor
The HPC system does not have a dedicated utility feed and relies on the building’s electrical infrastructure for power. Building power comes into the 2,500-square-foot room, where it goes through a “step-down” transformer and an Eaton uninterruptible power supply system. Turetzky said this setup, where the transformer is placed directly on the data center floor, was unusual.
Typically, power is stepped down in a separate electrical closet outside the data hall, and when an operator needs to add more power, they have to bring an additional feed from that closet. The setup at Florida Poly makes it easier to add power when the time comes.
With the transformer right there next to the IT racks, adding power capacity is simply a matter of adding circuits, he said. There is capacity to support 28 circuits, and the system is currently using only eight.
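Treating each of the eight in-use circuits as carrying an equal share of the full 32.7 kW load gives a rough sense of the headroom. This is a back-of-the-envelope estimate built on that equal-share assumption, not a capacity figure stated in the article:

```python
# Rough headroom estimate: assumes all 28 circuits could carry the same
# average load as the eight currently in use. Illustrative only.

load_kw = 32.7
circuits_in_use = 8
circuits_total = 28

kw_per_circuit = load_kw / circuits_in_use       # ~4.1 kW per circuit
potential_kw = kw_per_circuit * circuits_total   # ~114 kW if all populated
print(f"~{kw_per_circuit:.1f} kW per circuit; "
      f"~{potential_kw:.0f} kW at full build-out")
```

On those assumptions, the room could grow to several times its current load without bringing in a new feed.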
Building a Big Data work force
The campus is the only one in the State University System of Florida dedicated strictly to science, technology, engineering and mathematics. It is one of 28 schools IBM said it would partner with to train students for the millions of jobs in various Big Data fields it expects to be created around the world by next year.