Sensors and Predictive Modeling: A Formidable Match
July 6, 2017
Sensors are decreasing in cost and increasing in functionality. They are becoming more accurate, more reliable, and easier to install. Can they form the basis for a more accurate predictive modeling system in the data center? Ideally, data from sensors could be used to arrive at an optimal operating point that minimizes energy consumption. How might this work, and are data center managers ready for it?
Typically, where predictive modeling is required, computational fluid dynamics (CFD) systems come into play. Soeren Brogaard Jensen, former VP for enterprise software and services at Schneider Electric and currently CTO of Trackunit, envisaged usage models for CFD without sensors.
“That’s where you base things on a fairly sophisticated model, where you have your equipment mapped out, and you understand the relationships and how it all connects together,” he said. This theoretical model can be used to map out what-if scenarios.
The next level involves overlaying a sensor model on top of the theoretical model to gain a real-world view of the data center’s environment and equipment.
CFD tools are best used as an indicator of trends rather than as a precise forensic tool, but the closer the data in the model is to the real data in the room, the better informed the resulting decisions can be. Many models make assumptions based on the type and location of the installed equipment, but these can then be calibrated against measured data from the sensors to validate the model.
UK-based Future Facilities, which sells a CFD system for data center managers, manually measures environmental parameters to calibrate models based on theoretical specifications.
“When we do a model of a DC we do calibration. We survey the DC, inspect the cabinets, take temperature, pressure, airflow, and we will make sure that the model matches reality,” said Matt Warner, former manager of the firm’s UK development team.
The Benefits of Sensors
Implementing sensor technologies in a data center can help to automate data gathering for managers intent on modeling scenarios in the facility. The readings can then be fed into a CFD modeling system for planning what happens under various scenarios, such as mechanical failures or the installation of more equipment in the racks.
Managers can examine various parameters in these scenarios. One is the coefficient of performance (the ratio of the heat rejected from the room to the energy expended removing it). Another key metric is the rack cooling index, which quantifies the level of compliance with ASHRAE standards.
ASHRAE publishes allowable temperature and humidity levels within data centers, setting clear operating parameters for computing equipment. One of the biggest benefits of installing environmental sensors in the data center is that, judiciously placed, they can help managers to raise temperatures safely within these bounds.
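To make the rack cooling index concrete, here is a simplified sketch (not any particular vendor's implementation) of its over-temperature variant, RCI-HI, using ASHRAE class A1 thresholds; the intake readings are invented:

```python
# Simplified sketch of the Rack Cooling Index over-temperature variant
# (RCI-HI). Thresholds follow ASHRAE class A1 guidance (recommended max
# 27 C, allowable max 32 C); the intake readings below are invented.

RECOMMENDED_MAX_C = 27.0  # top of the ASHRAE recommended range
ALLOWABLE_MAX_C = 32.0    # top of the ASHRAE allowable range (class A1)

def rci_hi(intake_temps_c):
    """Return RCI-HI as a percentage; 100 means full compliance."""
    over = sum(max(0.0, t - RECOMMENDED_MAX_C) for t in intake_temps_c)
    max_over = (ALLOWABLE_MAX_C - RECOMMENDED_MAX_C) * len(intake_temps_c)
    return (1.0 - over / max_over) * 100.0

# Example: intake readings from rack-mounted sensors
readings = [24.5, 26.0, 28.5, 23.0, 27.5, 25.0]
print(f"RCI-HI: {rci_hi(readings):.1f}%")  # 100% = no over-temperature
```

A facility scoring below 100 percent is running some intakes above the recommended maximum; the closer those intakes creep toward the allowable limit, the lower the score drops.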
If the computing equipment is running too cold, the data center manager risks over-using the cooling equipment within the data center and wasting power.
James Cerwinski, former director of DCIM at Raritan, cited Gartner figures suggesting that for every degree of over-cooling that data center managers can avoid, they can save 3-4 percent on cooling expenses.
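Taken at face value, that rule of thumb compounds with each degree. A back-of-the-envelope sketch, using an invented annual cooling budget:

```python
# Back-of-the-envelope application of the cited 3-4 percent-per-degree
# figure. The annual cooling spend is an invented example number.
annual_cooling_cost = 500_000.0  # USD per year (assumed)
degrees_raised = 3               # setpoint raised by three degrees

for rate in (0.03, 0.04):
    saved = annual_cooling_cost * (1 - (1 - rate) ** degrees_raised)
    print(f"At {rate:.0%} per degree: roughly ${saved:,.0f} saved per year")
```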
“There is another consideration. If you go too hot, depending on the configuration of your servers you may drive up energy consumption by excessive fans running in the servers. So every customer has to understand their own environment,” he warned.
Types of Measurement
Temperature and humidity sensors often go together in data center installations. The ambient temperature, combined with the level of moisture in the air, determines when condensation occurs, as water forms droplets on nearby surfaces and endangers operations.
Another reason to install sensors is to prevent hotspots from building up in specific areas of the data center. Airflow sensors can detect the amount of cooling air arriving via an underfloor conduit, for example, ensuring that equipment such as networking cables is not blocking the conduit and choking off the supply of chilled air. Airflow sensors can also be deployed to ensure that hot-air returns are similarly free from obstruction.
These sensors can help to feed models that factor in the return temperature index (a measure of recirculation and bypass air in the data center, for which the ideal value is 100 percent). This is another metric that can be put to good use in a CFD package.
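For illustration, a minimal sketch of that calculation, using the standard definition of the return temperature index (the cooling unit's temperature rise divided by the IT equipment's temperature rise); all readings are invented:

```python
# Sketch of the Return Temperature Index (RTI): the cooling unit's
# temperature rise divided by the IT equipment's temperature rise.
# 100% is ideal; above 100% suggests recirculation of hot exhaust,
# below 100% suggests bypass air. All readings here are invented.

def rti(return_c, supply_c, equip_out_c, equip_in_c):
    return (return_c - supply_c) / (equip_out_c - equip_in_c) * 100.0

# Example: CRAC return/supply vs. average server exhaust/intake temps
print(f"RTI: {rti(32.0, 18.0, 36.0, 24.0):.0f}%")  # >100% -> recirculation
```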
Differential air pressure sensors can also be used to detect differences in pressure between parts of the data center, such as the hot and cold aisles. If the pressure differential goes above a certain threshold, it can result in leaks and the mingling of air at different temperatures.
Philip Squire, design director at UK-based colocation provider Ark Data Centres, worked with a third-party partner to design the company's free-air cooling system from the ground up, and designed its monitoring infrastructure to fit.
“We have four sensors in each aisle, two on each side, for 26 cabinets,” he said. “We’re using air pressure sensors, because they measure the pressure differential between the hot aisle at the front and the cold aisle at the back. Those control the volume of air that we need to go into the aisle, and they are monitoring to ensure that we have a four pascal difference in pressure.”
The benefits of installing sensors in the data center extend beyond the technical. “Many customers work in secure environments, and we can install sensors that trigger an alarm if a door opens and closes,” Cerwinski said. These sensors can alert managers if cabinets containing sensitive hardware are opened by unauthorized personnel. Smoke particle sensors can also alert staff to fires in the data center environment.
Where to Place Sensors
Depending on what kind of sensors are being used and what is being monitored, data center managers can place them at different points in the environment. Sensors designed to support ASHRAE temperature and humidity guidelines can be installed within the rack, at the top, middle and bottom. Another can be installed at the back of the rack to measure containment, and this configuration can be repeated every fourth rack, Cerwinski said.
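As a sketch only, that placement pattern can be expressed as a simple deployment plan; the rack naming scheme and helper below are invented for illustration:

```python
# Illustrative sketch of the placement pattern described above: three
# intake sensors (top/middle/bottom) at the front of every fourth rack,
# plus one at the back to check containment. Naming scheme is invented.

def placement_plan(rack_count, interval=4):
    plan = []
    for rack in range(1, rack_count + 1, interval):
        for position in ("front-top", "front-middle", "front-bottom"):
            plan.append((f"rack-{rack:02d}", position))
        plan.append((f"rack-{rack:02d}", "back"))
    return plan

for rack, position in placement_plan(12):
    print(rack, position)  # racks 01, 05, 09 each get four sensors
```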
“Sensing at the cabinet level is important, because many data centers have difficulty providing real-time power and cooling data at the cabinet,” explained Aaron Carman, worldwide critical facilities strategy leader at HP.
Device-level Sensing
Experts caution data center managers not to ignore device-level sensing, which can be particularly useful in analyzing how effectively IT equipment is operating.
These on-device sensors have only entered widespread use during the last five years, argued Jensen, but they can bring significant benefits by helping to merge two domains: facilities-level monitoring and IT infrastructure.
“In recent years, we are now overlaying the IT sensors. You have those in the cabinet itself, mounted inside the rack. That offers another element of information in the modeling of the data center,” he said.
Many of these sensors sit directly on the motherboard, while others sit behind the front plate where the air enters the chassis.
Deployment Costs
All of this means that capital expenditure may be the smallest part of the overall cost of a data center sensor initiative. Jensen argues that capital outlay on the sensor equipment itself may be less than 20 percent of the overall cost. Data integration can be more costly, along with other aspects of deployment that data center managers often underestimate, he suggests. These include location mapping, information sharing, and alerts, in addition to the training and tools needed for post-deployment usage.
Nevertheless, sensor technology has developed significantly in recent years, offering increasing levels of functionality that can help to ease some of those deployment costs.
In the past, data center managers would typically deploy sensors as part of the existing data center network. In some cases they could be inserted into rack PDUs as plug-and-play options, letting the sensor communications piggyback on the network already used to monitor the PDUs.
The alternative is to deploy them as a separate overlay network. An overlay network requires a controller with its own network connections, which can make the approach relatively expensive, because dedicated network drops also have to be deployed.
Wireless Sensors
More recently, wireless sensor technologies have emerged that bring several benefits to deployment teams. Such sensors often use the Zigbee standard, connecting via a wireless mesh network that introduces a level of redundancy into the system. The readings are consolidated by a controller, which relays the data to a DCIM system, potentially reducing the cost.
“These mesh networks allow them to be more fault tolerant in the way that they connect across the data center,” Jensen said.
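Conceptually, the controller's job is to consolidate readings arriving over the mesh and relay them onwards. A minimal sketch of one polling cycle, in which the DCIM endpoint, the payload shape, and the read_mesh() stand-in are all hypothetical:

```python
# Sketch of the controller's role: gather readings from the wireless
# mesh and relay them to a DCIM system. The endpoint URL, payload
# shape, and read_mesh() stand-in are hypothetical.
import json
import urllib.request

def read_mesh():
    # Stand-in for readings collected from the Zigbee mesh nodes.
    return [
        {"node": "aisle1-rack03-top", "temp_c": 24.5, "rh_pct": 45.0},
        {"node": "aisle1-rack03-mid", "temp_c": 25.1, "rh_pct": 44.2},
    ]

def relay_to_dcim(readings, url="http://dcim.example.com/api/readings"):
    body = json.dumps(readings).encode()
    req = urllib.request.Request(
        url, data=body, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:  # HTTP POST
        return resp.status

# relay_to_dcim(read_mesh())  # one polling cycle of the controller
```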
There are other advances, too. Sensors have become smaller, and accuracy has increased, he said. But perhaps one of the biggest drivers has been the move towards controlling cooling units directly with sensor data.
“Wireless sensors are also turning into wireless control systems. A couple of years ago we didn’t see a lot of controls in this space, but now we are,” he said.
Using Sensors to Control Airflow
Some companies are using sensor technology as the basis for control loops that regulate data center temperature. Ark Data Centres' Squire explained that he used the differential pressure sensors installed along his cabinets to control airflow.
“We measure pressure in the cold aisle, and we use pressure to control airflow,” he said. “At the other end of the airflow system, where the hot air comes out of the servers and then returns back to the input side of our air optimizers, it measures temperature and humidity as well as pressure. There, we control how much it mixes with outside air to deliver the right conditions for the cooling system.”
On the other side of the air optimizer, the system measures pressure, temperature, and humidity to ensure that the right mixture of air is delivered on the cool side. These controllers and sensors interact to make the best use of the waste heat coming out of the servers, and of the ambient external air.
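A control loop of this kind can be sketched in a few lines. The proportional gain and the sensor and fan interfaces below are invented for illustration; only the 4 pascal setpoint comes from Squire's description:

```python
# Minimal sketch of a proportional control loop like the one described:
# hold a 4 Pa aisle-to-aisle pressure differential by adjusting fan
# speed. The gain and interfaces are invented; only the setpoint is
# taken from the article.

SETPOINT_PA = 4.0
GAIN = 2.0  # percent of fan speed per pascal of error (assumed)

def next_fan_speed(current_speed_pct, measured_diff_pa):
    error = SETPOINT_PA - measured_diff_pa   # positive -> need more air
    speed = current_speed_pct + GAIN * error
    return max(0.0, min(100.0, speed))       # clamp to a valid range

# If the differential sags to 3.2 Pa, nudge the fans up:
print(next_fan_speed(55.0, 3.2))  # -> 56.6
```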
Controlling data center operations in real time can provide significant efficiency savings, but it is also possible to use this sensor data for strategic outcomes. It enables data center managers to build an influence map, said Jensen.
“It basically shows you in real time, given any constraint that you have in the room, how every CRAC unit is impacting every area of the room,” he said. “Often, you may find that a CRAC is impacting an area in the room that you’d never thought about.”
Maps like these can provide actionable data on issues such as data center overcooling, he explained. Understanding the temperature dynamics of the data center in real time can enable managers to turn down cooling equipment in certain parts of the facility, he suggested, based on predictive analyses of what will happen. In some cases, he believes, data center teams may be able to realize ROI on sensor deployments in under a year.
“CFD is critically important in the planning cycle, for understanding impact and what happens when you run out of certain resources,” Jensen said, arguing that data center operations benefit from both live data and CFD for effective planning.
The idea here is to create a closed feedback loop: sensor data is used to validate and tune theoretical CFD models, which then enable managers to find the optimal operating point in a data center. The sensors can then be used to maintain that point, especially if they form part of a control system acting on key metrics such as airflow.
With sensor technology evolving, and with pressure on data center managers increasing, every piece of actionable intelligence is useful. Together, sensors and predictive modeling tools can be a formidable weapon in the planners’ arsenal.
This article was originally published in the March/April 2015 issue of AFCOM's Data Center Management magazine.