Why a Special-Purpose CPU May Be in Your Storage Future
A dedicated, special-purpose CPU could create more balance among system resources.
May 27, 2020
CPUs are being asked to do more than ever, including supporting storage-related tasks, which is why a special-purpose CPU may be in your storage future.
For as long as I can remember, there has always been something of an imbalance among computing resources. For example, a system might have far more CPU resources, but far less storage, than software needs. In fact, server virtualization was popularized because of the imbalance between server hardware and the demands made by line-of-business applications at the time. Server hardware was becoming ever more powerful, and applications needed only a fraction of the available resources. Hence, organizations were able to drive down costs by running multiple workloads on a physical server rather than depending on dedicated hardware.
Conventional wisdom has long held that most servers have plenty of CPU resources and that performance is almost always bottlenecked by storage or by some other component. More recently, though, the trend seems to be reversing. Modern flash storage arrays deliver performance that was once unthinkable. At the same time, CPUs are being bogged down by having to juggle more tasks than ever before.
This raises the question of what hardware vendors will do to ensure that performance continues to improve from one generation of hardware to the next.
While it is impossible to predict the future with any certainty, it seems likely that hardware vendors will move toward processors designed for specific purposes.
The concept of a special-purpose processor, one designed to handle a very specific task, is not new. In fact, there are countless examples of task-oriented processors. Many network adapters, for example, have onboard processors that handle TCP offloading. Likewise, graphics adapters include dedicated GPUs, which are also sometimes used for machine learning tasks so as not to overburden a system’s primary CPU. Even Microsoft’s HoloLens 2 has a dedicated holographic processor.
As previously noted, modern flash storage delivers much higher levels of performance than was previously possible. At the same time, though, storage is far more CPU-dependent than it once was. Tasks such as deduplication, integrity hashing, encryption and compression are all CPU-intensive. This isn’t a big deal for stand-alone storage appliances, which usually have dedicated hardware for such tasks. In other cases, however, such as with servers that use direct-attached storage, storage-related tasks are handled by the same CPU that runs business workloads. And that CPU is being pushed like never before, just as Moore’s Law is starting to break down, meaning that future CPUs probably won’t deliver the exponential performance increases we have seen in the past.
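To make that CPU cost concrete, here is a minimal Python sketch of block-level deduplication. The block size and function names are illustrative, not any vendor's implementation; the point is that every block written must be hashed before the system can decide whether it is a duplicate, so the work scales with every byte of data.

```python
import hashlib

BLOCK_SIZE = 4096  # illustrative block size; real systems vary


def deduplicate(data: bytes) -> tuple[list[bytes], dict[bytes, bytes]]:
    """Split data into fixed-size blocks and store only the unique ones.

    Every block is hashed (SHA-256 here), which is why deduplication
    consumes CPU cycles in proportion to the volume of data written.
    """
    order: list[bytes] = []          # hash of each logical block, in write order
    store: dict[bytes, bytes] = {}   # unique blocks, keyed by their hash
    for i in range(0, len(data), BLOCK_SIZE):
        block = data[i:i + BLOCK_SIZE]
        digest = hashlib.sha256(block).digest()  # the CPU-intensive step
        if digest not in store:
            store[digest] = block    # first time we've seen this block
        order.append(digest)         # duplicates cost a hash but no storage
    return order, store
```

On a dedicated appliance, this hashing runs on hardware bought for the purpose; on a server with direct-attached storage, it competes directly with business workloads for the same cores.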
While offloading storage-related processing to a dedicated, special-purpose CPU is one way to reduce the load on a system’s main CPU, another possible option is composable infrastructure.
Composable infrastructure is an architecture in which hardware resources such as CPU, memory and storage are grouped into resource pools. Unlike in a traditional server virtualization environment, an administrator may not have to manually allocate hardware resources to workloads (depending on which vendor’s solution is being used). Instead, the composable infrastructure allocates hardware resources to workloads on demand, then releases those resources back into the pools when they are no longer required.
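As a rough illustration of that allocation model, here is a hedged Python sketch of a shared resource pool: workloads draw CPU, memory and storage on demand and return them when finished. The class and method names are hypothetical, chosen only to show the idea; no vendor's product exposes exactly this interface.

```python
# Hypothetical model of composable-infrastructure allocation: resources are
# drawn from shared pools on demand and released when the workload finishes.

class ResourcePool:
    def __init__(self, cpus: int, memory_gb: int, storage_tb: int):
        self.free = {"cpus": cpus, "memory_gb": memory_gb, "storage_tb": storage_tb}

    def allocate(self, **request: int) -> dict:
        """Carve out resources for a workload, failing if the pool runs short."""
        for resource, amount in request.items():
            if self.free.get(resource, 0) < amount:
                raise RuntimeError(f"pool exhausted: {resource}")
        for resource, amount in request.items():
            self.free[resource] -= amount
        return dict(request)

    def release(self, grant: dict) -> None:
        """Return a finished workload's resources to the shared pool."""
        for resource, amount in grant.items():
            self.free[resource] += amount


# A workload borrows resources only for as long as it actually runs.
pool = ResourcePool(cpus=64, memory_gb=512, storage_tb=100)
grant = pool.allocate(cpus=8, memory_gb=64, storage_tb=2)
# ... run the workload ...
pool.release(grant)  # capacity flows back to the pool for other workloads
```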
Recently, a vendor called Fungible blended the concepts of purpose-built CPUs and composable infrastructure into “data-centric infrastructure.”
Data-centric infrastructure is the opposite of the compute-centric architecture that has been in use for generations. In a compute-centric architecture, the CPU handles two primary jobs: running applications and moving data between devices such as storage and the network. The problem is that storage devices and networks have become much faster in recent years, while CPU performance is beginning to level off.
Fungible’s data-centric infrastructure is based on a new type of processor that the company calls a Data Processing Unit, or DPU. The DPU’s job is to act as a data traffic controller, completely offloading data transfer tasks from the CPU. Because the DPU was purpose-built for this task, it has the potential to be far more efficient than a general-purpose CPU.
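The division of labor can be modeled in software, even though the DPU itself is hardware. In the hedged Python sketch below, a separate worker process stands in for the DPU, absorbing data-path chores (here, checksumming a large buffer) while the main process keeps running application logic. Nothing here reflects Fungible's actual interface; the function names are invented for illustration.

```python
import hashlib
from concurrent.futures import ProcessPoolExecutor


def move_and_checksum(buffer: bytes) -> str:
    """Stand-in for the data-path work (copying, hashing) a DPU would absorb."""
    return hashlib.sha256(buffer).hexdigest()


def run_application_logic() -> int:
    """Stand-in for the business workload the main CPU should focus on."""
    return sum(i * i for i in range(1_000_000))


if __name__ == "__main__":
    data = bytes(64 * 1024 * 1024)  # a 64 MB buffer to "move"
    with ProcessPoolExecutor(max_workers=1) as offload:
        pending = offload.submit(move_and_checksum, data)  # offloaded data path
        result = run_application_logic()                   # CPU keeps computing
        checksum = pending.result()                        # collect when done
    print(result, checksum)
```

In a real data-centric system the offload target is a purpose-built processor rather than another general-purpose core, which is where the efficiency gain comes from.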
What makes the DPU truly unique, however, is that it has been designed for use in composable infrastructure architectures. As such, it handles the task of transferring data among all of the various components that make up a composable infrastructure.