The New Measurement Model

Analytics seem to be all the rage these days, probably due to integration. Technology is now an enterprise effort, not just the domain of the IT department. This results in the need for solid numbers to explain what really happens in the data center.

Jerry Gentry

September 20, 2012


Analytics seem to be all the rage these days. Maybe it is because of budget season, but I think it goes beyond that. It is a result of integration. Technology is now an enterprise effort, not just the domain of the IT department. I’ve alluded to this in a few of my earlier articles on governance and sparse data. The undercurrent of change is affecting all of us, and data center metrics have to evolve to reflect the transition in technology and sourcing strategies.

In data center environments, the summarization of metrics splits into two groups. The first falls under the domain of performance: measures against SLAs or expected utilization thresholds of the existing capacity. These metrics focus on the technical components of the infrastructure; although they may be collected in real time, they are usually reported over a short time span.
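To make that concrete, here is a minimal sketch in Python of a short-window performance check. The utilization samples, the 85 percent threshold, and the allowed breach count are illustrative assumptions, not figures from any real SLA.

```python
# Minimal sketch: check recent utilization samples against an SLA-style
# threshold over one short reporting window. All values are assumed.
from statistics import mean

UTILIZATION_THRESHOLD = 0.85   # assumed capacity threshold
SLA_MAX_BREACHES = 2           # assumed number of breaches tolerated per window

# One short reporting window of utilization samples (e.g., 5-minute polls)
samples = [0.62, 0.71, 0.88, 0.93, 0.79, 0.66, 0.84, 0.90]

breaches = [s for s in samples if s > UTILIZATION_THRESHOLD]

print(f"Average utilization: {mean(samples):.0%}")
print(f"Threshold breaches:  {len(breaches)} (SLA allows {SLA_MAX_BREACHES})")
print("SLA met" if len(breaches) <= SLA_MAX_BREACHES else "SLA missed")
```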

The second group consists of trending statistics, measured against the same technical components as well as other areas with physical constraints (floor space, power, HVAC, etc.). These statistics are shown as points of measurement across an extended period of time.
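Again as a sketch only: the fragment below reduces an extended run of measurements to one trend point per period and compares it against a fixed physical constraint. The monthly peaks and the 500 kW power capacity are assumed values for illustration.

```python
# Minimal sketch: trending statistics as points of measurement over time,
# here monthly peak power draw against a fixed physical capacity.
POWER_CAPACITY_KW = 500        # assumed physical constraint

# (month, peak power draw in kW) -- one trend point per reporting period
monthly_peaks = [
    ("2012-04", 310),
    ("2012-05", 325),
    ("2012-06", 345),
    ("2012-07", 360),
    ("2012-08", 380),
]

for month, peak in monthly_peaks:
    print(f"{month}: {peak} kW peak ({peak / POWER_CAPACITY_KW:.0%} of capacity)")

# Crude growth estimate from the first and last trend points
growth_per_month = (monthly_peaks[-1][1] - monthly_peaks[0][1]) / (len(monthly_peaks) - 1)
headroom_kw = POWER_CAPACITY_KW - monthly_peaks[-1][1]
print(f"~{growth_per_month:.0f} kW growth/month; "
      f"~{headroom_kw / growth_per_month:.0f} months of headroom at this rate")
```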

What is changing?

For both types of measures, the information the data represents ties back to a financial decision. In the case of performance and SLA data, the end result is a decision either to invest to improve performance or not to invest and accept some risk of a lower level of performance. The benefit of accepting a lower level of performance is the ability to target investment funds at another project with a potentially greater payback.

Trending data, on the other hand, tells us whether the investments already made are achieving the goals intended for them. Those investments may be in support processes and personnel, hardware and software upgrades, or additional capacity. Looking back and associating events, such as upgrades, with the performance trends can be a very powerful tool.
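A minimal sketch of that idea, assuming a hypothetical storage-upgrade date and made-up response-time trend points, might look like this:

```python
# Minimal sketch: associate an investment event (an assumed upgrade date)
# with a performance trend by comparing averages before and after it.
from statistics import mean

UPGRADE_DATE = "2012-06-15"    # assumed event: storage upgrade

# (date, average response time in ms) trend points -- illustrative values
trend = [
    ("2012-05-01", 420), ("2012-05-15", 435), ("2012-06-01", 440),
    ("2012-07-01", 310), ("2012-07-15", 305), ("2012-08-01", 298),
]

before = [ms for date, ms in trend if date < UPGRADE_DATE]
after = [ms for date, ms in trend if date >= UPGRADE_DATE]

print(f"Average before upgrade: {mean(before):.0f} ms")
print(f"Average after upgrade:  {mean(after):.0f} ms")
print(f"Change: {mean(after) - mean(before):+.0f} ms")
```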

What does that mean to us as data center managers?

For traditional applications, there are performance monitors that provide valid information about end-to-end, user-to-application performance. In the new environment of smart devices and clouds, that end-to-end model changes. What are the right elements to monitor and, more important, what data will lead to the right investment decision?

The answers aren’t clear. I haven’t found anything consistent in the app store for monitoring. As with any new technology, it will be up to those who manage it to cobble together a method of measurement ahead of market availability. Once the novelty of the devices wears off, enterprise decision makers will start to ask the same questions: the ones that lead back to an investment decision. We need to be ready to respond. Right now, I don’t think we are.

To get more useful data center management strategies and insight from Nemertes Research, download the Q2 Data Center Knowledge Guide to Enterprise Data Center Strategies – Volume 2.
