What is an IPU (Infrastructure Processing Unit) and How Does it Work?

IPUs (Infrastructure Processing Units) are hardware accelerators designed to offload compute-intensive infrastructure tasks like packet processing, traffic shaping, and virtual switching from CPUs.

Salvatore Salamone, Managing editor

December 14, 2023


IPUs (Infrastructure Processing Units) take their place in the data center alongside other specialty processors designed to accelerate workloads and offload tasks traditionally performed by Central Processing Units (CPUs).

In much the same way Graphics Processing Units (GPUs), with their highly parallel structure, are used to speed up non-graphics calculations involving highly parallel problems, IPUs, introduced by Intel, accelerate network infrastructure tasks and infrastructure services such as virtual switching. Shifting these operations to a dedicated processor designed specifically for them frees up CPU cycles. The result is improved application performance and the ability to run more workloads with fewer CPUs.
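
To make the offload idea concrete, here is a minimal sketch, not tied to any particular IPU product, of how an administrator might push virtual switching work off host CPU cores using Open vSwitch's standard hardware-offload setting. The ovs-vsctl command and the hw-offload flag are real OVS options; the surrounding script and function name are purely illustrative, and the service name varies by distribution.

```python
import subprocess

def enable_ovs_hw_offload() -> None:
    # Standard Open vSwitch knob: allow flows to be offloaded to capable
    # hardware (e.g., a SmartNIC or IPU data path) instead of being
    # processed by host CPU cores.
    subprocess.run(
        ["ovs-vsctl", "set", "Open_vSwitch", ".", "other_config:hw-offload=true"],
        check=True,
    )
    # Restart the vswitch daemon so the setting takes effect.
    # Note: the service is named "openvswitch-switch" on Debian/Ubuntu and
    # "openvswitch" on some other distributions; adjust as needed.
    subprocess.run(["systemctl", "restart", "openvswitch-switch"], check=True)

if __name__ == "__main__":
    enable_ovs_hw_offload()
```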

Diving into the World of IPUs

An IPU, like a Data Processing Unit (DPU) and Compute Express Link (CXL), brings a new type of acceleration technology to the data center. While GPUs, FPGAs, ASICs, and other hardware accelerators offload computing tasks from CPUs, IPUs, DPUs, and CXL focus on speeding up data handling, data movement, and networking chores.

There is growing interest in infrastructure task acceleration more broadly. Over the last year, many vendors have introduced solutions that address the same issues.


For example, NVIDIA recently introduced a SuperNIC, which it described as a “new class of networking accelerator designed to supercharge AI workloads in Ethernet-based networks.” It is designed to provide ultra-fast networking for GPU-to-GPU communications.

Additionally, there are many other new accelerators designed to speed up particular workloads. Examples include Graphcore’s Intelligence Processing Unit (also called an IPU) and Google Cloud’s Tensor Processing Unit (TPU)…

Read the rest of this article on Network Computing

About the Author

Salvatore Salamone

Managing editor, Network Computing

Salvatore Salamone is the managing editor of Network Computing. He has worked as a writer and editor covering business, technology and science; written three business technology books; and served as an editor at IT industry publications including Network World, Byte, Bio-IT World, Data Communications, LAN Times and InternetWeek.

