Applying Baseball Lessons to Data Centers
Making the right decisions means understanding the right data and correlations with unquestioned accuracy, from the baseball field all the way to the IT infrastructure.
April 4, 2016
John Gentry is CTO for Virtual Instruments.
It’s safe to assume most people have heard of “Moneyball,” the book and blockbuster film that used professional baseball to bring one simple concept into the limelight: You can’t know what to buy if you don’t know what you’re trying to accomplish. “Moneyball” author Michael Lewis acknowledged this concept could have wide-ranging applications when he wrote:
“If gross miscalculations of a person's value could occur on a baseball field, before a live audience of 30,000, and a television audience of millions more, what did that say about the measurement of performance in other lines of work?”
This idea is now widely accepted and has made the rounds in plenty of contexts beyond pro sports. However, there’s one key area that continues to be overlooked but would benefit greatly from its wisdom: enterprise IT architecture, engineering and operations. As enterprise data centers keep shifting in size, adopting new technology and growing in complexity, it has become clear that data center activity and IT output are critical areas where performance monitoring and measurement are inadequate and analytics are incomplete.
Performance Analytics
When making decisions about budgetary investments, you need a clear understanding of what you’re trying to accomplish. Breaking baseball down to its most granular level means spending money on the individuals who are most likely to generate runs at the plate and prevent them in the field. For IT, it means identifying accurate baselines for performance and availability, and developing the infrastructure that can support your benchmarks and needs at the appropriate cost. Every application and workload has different needs, and additional factors, such as budget and scale, will determine the best direction. Just because something worked for one company – or worked for you in the past – doesn’t mean it’s the best decision for your organization going forward.
If you’re the decision-maker behind a major infrastructure overhaul at a financial services company, for example, you need applications and services that minimize latency and guarantee availability for end users. Understanding what you’re trying to accomplish as granularly as possible, and tracking and measuring to that end with authoritative analysis, is the only way to spend wisely. A telecom provider has different needs than a healthcare provider, just as the right infrastructure investment for a college or university isn’t necessarily the ideal fit for a national retail chain. What is common, though, is that the consequences of a poorly planned IT investment can be felt for years to come. You have a budget, and you have performance benchmarks specific to your organization. Collecting and analyzing that data will produce analytics that show you exactly what you need, how to get more out of what you already have, and how to avoid spending on unnecessary devices, components, capacity and so on.
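To make that concrete, here is a minimal sketch, in Python, of what deriving performance and availability baselines from collected measurements could look like. The sample values, the choice of the 95th percentile and the variable names are illustrative assumptions for this article, not a prescribed method.

    # Illustrative sketch: deriving performance and availability baselines
    # from raw monitoring samples. Values and thresholds are hypothetical.
    from statistics import quantiles

    # Hypothetical samples: response times in milliseconds, plus a simple
    # up/down flag recorded for each polling interval.
    response_times_ms = [12.1, 14.7, 11.9, 35.2, 13.0, 12.4, 90.5, 13.8, 12.9, 14.1]
    availability_checks = [True, True, True, True, False, True, True, True, True, True]

    # Performance baseline: the 95th percentile of observed latency.
    p95_latency_ms = quantiles(response_times_ms, n=100)[94]

    # Availability baseline: the fraction of intervals the service was reachable.
    availability = sum(availability_checks) / len(availability_checks)

    print(f"p95 latency baseline: {p95_latency_ms:.1f} ms")
    print(f"measured availability: {availability:.1%}")

A baseline like this only has value if it is tracked per application and workload; numbers that are acceptable for a batch reporting job would be a serious problem for a latency-sensitive trading platform.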
IT infrastructure administrators, CIOs and others are under a clear mandate to innovate constantly and support strategic business goals while budgets remain flat. Billy Beane, the Oakland Athletics general manager at the center of “Moneyball,” and the other small-market franchises that have followed his lead have built dominant baseball teams by emphasizing efficiency and relying on analytics to make spending decisions that worked specifically for them. In the enterprise world, advanced metrics and detailed statistical analysis have found their way into marketing and sales departments because they help teams optimize performance and spending. IT teams focused on performance need to do the same.
Incomplete Data or Personal Bias
Developing a high-performance, reliable IT infrastructure has to come down to more than intuition and non-specific data sets and metrics. Those approaches worked when we didn’t have access to robust, deep data that measured end-to-end I/O activity across the open-systems stack. The analytics that determine optimal configurations and investments must be based on all available data and designed to deliver a correlated understanding in the appropriate context. It can’t just be about statistics that measure IOPS with no correlated understanding of how the workloads are performing across the stack; nor can IT teams expect one solution to work across the board. A whole new infrastructure won’t support healthcare facilities just because it worked well for managing an e-commerce organization.
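As a simple illustration of what a correlated understanding can mean in practice, the sketch below, again in Python with hypothetical numbers, relates a storage-side metric such as IOPS to application response time rather than reporting either in isolation.

    # Illustrative sketch with hypothetical data: a raw IOPS figure alone says
    # little; correlating it with end-to-end response time adds context.
    from statistics import correlation

    # Hypothetical per-interval measurements for one workload.
    array_iops = [8200, 8600, 9100, 9500, 10200, 11000, 11800, 12500]
    app_latency_ms = [11.8, 12.0, 12.3, 14.9, 18.7, 25.4, 33.0, 41.2]

    # Pearson correlation between storage throughput and user-facing latency.
    r = correlation(array_iops, app_latency_ms)
    print(f"IOPS vs. application latency: r = {r:.2f}")

A strong positive correlation here would suggest the workload is pushing the array toward saturation, something the IOPS number on its own would never reveal; a weak one would point the investigation elsewhere in the stack.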
Much of Beane’s inspiration for his project in Oakland has been based on the work of Bill James, an original innovator in baseball analysis who, in 1985, made the following three points about baseball statistics:
Baseball statistics have the ability to conjure images.
Baseball statistics can tell stories.
Baseball statistics acquire, from these other properties, a powerful ability to delude us.
Replace “baseball” with “IT performance” and “statistics” with “metrics” in each of those sentences, and they’re still quite true. Falling victim to analytics based on incomplete or out-of-context data can result in truly poor investment and configuration decisions and poor experiences for end users. Making the right decisions means understanding the right data and correlations with unquestioned accuracy, and truly understanding what you’re trying to accomplish, whether on the baseball field or across the IT infrastructure.