
No Need to Worry About "Legacy Loss"

Did you know that the computer system responsible for the American nuclear arsenal still runs on 8-inch floppy disks?

Industry Perspectives

September 6, 2018



Oren Eini is CEO of Hibernating Rhinos.

About 20 years ago, I had a Celeron 800 MHz computer with 386 megabytes of RAM. Never would I have imagined that one of my biggest challenges today would be fitting software onto something with the same capacity.

Nearly two decades later, how can we get the most out of legacy hardware? Mastering this skill will determine how organizations navigate the next generation of technology.

Legacy Machines Aren’t Going Anywhere ... For Now

Did you know that the computer system responsible for the American nuclear arsenal still runs on 8-inch floppy disks? Most kids today don’t even know what a floppy disk is. Here’s another one: The IRS still uses systems built in the 1980s. Even large enterprises are running computers that depend on Windows Vista.

There are two main reasons why older machines still command real estate in the server rooms of large enterprises worldwide:

  • It costs less to maintain dinosaurs than it does to replace them. Legacy systems have code nobody at the organization remembers or even understands. Removing these solutions can prove too disruptive. 

  • For a company with more than 100 employees, you need a lot of machines. To get the best ROI on resources, plan ahead. That means buying a large volume of machines and signing a five-year support contract.

Even if the next “I gotta have it” technology comes out in two years, it makes little sense to let that support contract go to waste. And even if you do adopt something new, the older machines will be repurposed for something else. At some point, you end up with a lot of machines five to 10 years old still in use.

Legacy Loss is Obsolete

According to Moore’s Law, the computing power of a central processing unit (CPU) doubles roughly every two years. That implies that after four years, a computer suffers a “legacy loss”: it is only 25 percent as powerful as its modern-day peers.
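As a back-of-the-envelope check, the 25 percent figure falls straight out of the doubling rule. The function below is just an illustration of that arithmetic, not anything from the original article:

```python
# Sketch: if CPU power doubles every `doubling_period` years (Moore's Law
# in its popular form), a machine's power relative to a brand-new peer
# after `age_years` years is 1 / 2**(age_years / doubling_period).

def relative_power(age_years, doubling_period=2.0):
    """Fraction of a current machine's power a machine of this age retains."""
    return 1.0 / 2 ** (age_years / doubling_period)

print(relative_power(4))   # 0.25 -- the 25 percent figure above
print(relative_power(10))  # 0.03125 -- a decade-old box under strict doubling
```

Note that the formula assumes the doubling period stays fixed; as the next paragraph argues, it no longer does.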

Moore’s Law has held steady for the last half-century. A software developer could write a piece of software knowing that by the time it went into full production, it would be running much faster than originally planned on stronger hardware. However, as transistors shrink toward the size of mere atoms, the limits of physics and rising research costs are preventing CPUs from doubling in power as quickly.

That means the degree of “advancement” today’s computers have over their counterparts from five years ago is smaller than the edge those five-year-old machines had over computers from five years before them. This shrinking legacy loss also means that computers released five years from now will hold only a marginal advantage over those in use today.

As time goes on, the impact of legacy loss will continue to diminish.

New Rules for Performance

If growth in raw computing power has reached the end of the road, how do you continue to improve performance? Focus on making the most of the resources you already have. Resource utilization is the new priority. The challenge is to get applications running as close as possible to 100 percent of capacity on every machine in use.

Software running on a five-year-old machine while utilizing all of its memory, disk I/O, network bandwidth and CPU can outperform the same application underutilizing a machine fresh off the assembly line. To work faster on legacy hardware, the applications of today need to work smarter. This is a matter of design and logic. How can processes within a database, a security system, even the application itself be remodeled to work more efficiently? What rules can be changed to boost performance or cut execution time?
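As a toy illustration of “working smarter” instead of waiting for faster hardware, consider the classic swap of a linear scan for a hash lookup. The function names and data below are invented purely for the example:

```python
# Sketch: the same lookup task done two ways. Squeezing more out of
# existing hardware is often a matter of algorithm choice, not new CPUs.

def find_common_naive(a, b):
    # O(len(a) * len(b)): re-scans list b once for every element of a
    return [x for x in a if x in b]

def find_common_smart(a, b):
    # O(len(a) + len(b)): one pass to build a hash set, one pass to probe it
    b_set = set(b)
    return [x for x in a if x in b_set]

a = list(range(0, 10000, 2))   # synthetic data, for illustration only
b = list(range(0, 10000, 3))
assert find_common_naive(a, b) == find_common_smart(a, b)
```

On a few thousand elements the two are indistinguishable; on the data volumes a five-year-old server actually handles, the second version is the difference between saturating the CPU usefully and burning it on redundant scans.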

The best developers will reinvent the wheel not to fit it on a new model, but to get better mileage out of the current one.

Legacy’s Legacy for the Future is Bright

What you need to do in order to get the most out of yesterday’s machines is exactly what you need to get the most out of tomorrow’s machines. The next generation of platforms demands the same efficiencies. Many of them have the same abilities as my old Celeron, utilizing new technologies like:

  • Smaller servers. Raspberry Pi servers literally fit in the palm of your hand. They cost less than standard servers and use around 10 percent of the energy, creating significant cost savings. 

  • Mobile devices. ARM chips are poised to turn your smartphone, tablet or even toaster oven into a hybrid server/client machine. The next generation of mobile applications will have unprecedented capabilities that will revolutionize Big Data and the Internet of Things.

  • Partitioned computers. Another way to save on hardware costs is to buy a computer with multiple CPUs and assign a different application to each CPU. This gives the software making up these applications clear boundaries on its “allotted space.” The more efficiently the software runs, the further it can stretch that allotment of memory and computing power.
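On Linux, that kind of partitioning can be sketched with CPU affinity. The snippet below pins the current process to a chosen CPU using Python’s `os.sched_setaffinity` (a Linux-only call); the CPU number is an assumption chosen purely for illustration:

```python
# Sketch: "partitioning" a machine by pinning a process to specific CPUs.
# os.sched_setaffinity is Linux-only, so we guard on its presence.
import os

def pin_to_cpus(cpus):
    """Restrict the current process to the given set of CPU indices."""
    os.sched_setaffinity(0, cpus)  # pid 0 means the calling process

if hasattr(os, "sched_setaffinity"):
    pin_to_cpus({0})                    # confine this process to CPU 0
    print(os.sched_getaffinity(0))      # the kernel reports the new mask
```

In practice, each application would be launched with its own CPU set (for example via `taskset` or a service manager), so a badly behaved process can only exhaust its own partition.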

While there’s a lot of chatter about the death of legacy systems, it’s all over-hyped. Before running to put all of your financial resources and manpower into brand new technology, stop and think about how you can use what you already have. Your budgeting team will thank you.

Opinions expressed in the article above do not necessarily reflect the opinions of Data Center Knowledge and Informa.

Industry Perspectives is a content channel at Data Center Knowledge highlighting thought leadership in the data center arena. See our guidelines and submission process for information on participating.
