Intel Offers DL Boost as Alternative to Accelerators in AI Servers

The feature in the new 2nd Gen Intel Xeon Scalable chips expedites a key deep learning function, but it’s more basic than you might expect.

Scott Fulton III, Contributor

June 27, 2019

Intel CEO Bob Swan (right) and Navin Shenoy, Intel executive VP and general manager of the Data Center Group, speaking at an Intel event in San Francisco on April 2, 2019. (Photo: Intel)

With the virtuous cycle of Moore’s Law having very likely run its course, Intel is looking for ways to boost its processors’ performance other than, as its founder once put it, “cramming” transistors onto chips. We’ve seen how the company uses microcode to fast-track the execution of certain classes of code, recognizing, for example, when code calls the open source OpenSSL library and routing it to faster hardware-backed microcode instead. Now the CPU leader is looking to make performance inroads any way it can, even if it means one market segment at a time.

Yet with Intel’s DL Boost architecture for its Xeon server processors, launched in April, the company is attempting a curious new twist on this approach. On the surface, DL Boost is presented as a feature of certain 2nd Generation Intel Xeon Scalable processors (formerly code-named “Cascade Lake”) that fast-tracks instructions used in deep learning operations.

“Intel is trying to counter the perception that GPUs are required for deep learning inference by adding extensions to its AVX-512 vector processing module that accelerate DL calculations,” Marko Insights Principal Analyst Kurt Marko says.
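The AVX-512 extensions Marko refers to are the Vector Neural Network Instructions (VNNI), whose VPDPBUSD instruction multiplies unsigned 8-bit activations by signed 8-bit weights and accumulates the products into 32-bit sums, collapsing what previously took three instructions into one. Below is a minimal Python sketch of the per-lane arithmetic; the function name and the explicit 32-bit wrap-around handling are illustrative, not Intel’s API.

```python
# Illustrative sketch (not Intel code): emulates what one 32-bit lane of the
# AVX-512 VNNI instruction VPDPBUSD computes -- four unsigned 8-bit activations
# times four signed 8-bit weights, accumulated into a signed 32-bit sum.
def vnni_lane_dot(acc, activations, weights):
    """acc: signed 32-bit accumulator; activations: four values in 0..255;
    weights: four values in -128..127. Returns the updated accumulator."""
    assert len(activations) == len(weights) == 4
    for a, w in zip(activations, weights):
        assert 0 <= a <= 255 and -128 <= w <= 127
        acc += a * w
    # Wrap to a signed 32-bit result, as the hardware register would.
    acc &= 0xFFFFFFFF
    return acc - 0x100000000 if acc >= 0x80000000 else acc

print(vnni_lane_dot(0, [10, 20, 30, 40], [1, -1, 2, -2]))  # 10 - 20 + 60 - 80 = -30
```

A full 512-bit register holds 16 such lanes, so one VPDPBUSD performs 64 multiply-accumulates per cycle per port, which is where the claimed inference speedup over plain FP32 AVX-512 math comes from.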

Intel is stacking up its Xeon CPUs directly against Nvidia’s GPUs, which in recent years have seized the artificial intelligence (AI) hardware market and transformed the brand formerly associated with gaming and PC enthusiasts into a key player in high-performance servers in the data center.

“Inference, I don’t think, was ever really an all-GPU game,” Ian Steiner, senior principal engineer with Intel, tells Data Center Knowledge.
