HPE: $4 Billion Says Intelligent Edge is the Future of Computing

• Hewlett Packard Enterprise unveiled a new $4 billion strategic investment in the intelligent edge
• Company CEO Antonio Neri made the announcement in his opening keynote for HPE Discover
• HPE believes the intelligent edge is its “next big opportunity”
• The money will be spent on developing technology and services to enable an edge-to-cloud architecture of the future, where most of the data generated by connected devices is processed at the edge

Yevgeniy Sverdlik, Former Editor-in-Chief

June 20, 2018


Hewlett Packard Enterprise on Tuesday unveiled a new strategy it plans to spend $4 billion pursuing over the next four years.

The company will invest that much in technology and services to enable the intelligent edge, a catch-all phrase for the myriad of things like smart sensors and cameras, along with the devices that aggregate and process the data they produce upstream in the network, such as routers, gateways, or servers. What makes them “edge” devices is their location at the source of the data rather than in a big data center somewhere far away. What makes them “intelligent” is the computing capacity and software to analyze the data in near-real-time, as it’s being generated, and to make decisions based on insights gleaned from that analysis.
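
To make the pattern concrete, here is a minimal sketch in Python, assuming a purely hypothetical temperature sensor: the edge node analyzes each reading as it arrives and flags anomalies locally, so the raw stream never has to travel to a faraway data center. The sensor driver, window size, and threshold are illustrative stand-ins, not any particular HPE product.

```python
import random
import statistics
from collections import deque

WINDOW = 30           # rolling window of recent readings
THRESHOLD_STDS = 3.0  # flag readings more than 3 std devs from the mean

def read_sensor() -> float:
    """Stand-in for a real sensor driver; returns a temperature in C."""
    return random.gauss(40.0, 2.0)

def edge_loop(iterations: int = 1000) -> None:
    window = deque(maxlen=WINDOW)
    for _ in range(iterations):
        reading = read_sensor()
        if len(window) >= 2:
            mean = statistics.mean(window)
            std = statistics.stdev(window) or 1e-9  # guard against zero spread
            if abs(reading - mean) > THRESHOLD_STDS * std:
                # The decision happens at the edge, in near-real-time;
                # only the anomaly, not the raw stream, goes upstream.
                print(f"anomaly: {reading:.1f}C (window mean {mean:.1f}C)")
        window.append(reading)

if __name__ == "__main__":
    edge_loop()
```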

The idea of the intelligent edge has been gaining steam in the market as companies figure out the best ways to take advantage of the explosion of data driven by the rapid digitization of traditional businesses, the continued growth of new businesses born in the digital age, and the rapid proliferation of network-connected devices.

The edge-to-cloud architecture HPE – and others – are envisioning will consist of “distributed clouds” that process most of the data at the edge while sending and receiving some data from centralized data centers, operated either by the end-user companies or by third-party cloud providers like Amazon Web Services and Microsoft Azure.
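
As a rough sketch of that flow, again in Python with hypothetical names, the edge node digests the full raw stream locally and ships only a compact summary upstream; `upload_summary` here stands in for whatever cloud API a real deployment would call, whether AWS, Azure, or a private data center.

```python
import json
import time

def summarize(readings: list[float]) -> dict:
    """Reduce a batch of raw readings to a compact summary record."""
    return {
        "count": len(readings),
        "min": min(readings),
        "max": max(readings),
        "avg": sum(readings) / len(readings),
        "ts": time.time(),
    }

def upload_summary(summary: dict) -> None:
    """Stand-in for a real cloud API call (e.g., an HTTPS POST)."""
    print("to cloud:", json.dumps(summary))

def flush_batch(batch: list[float]) -> None:
    if batch:  # raw readings never leave the edge; only the digest does
        upload_summary(summarize(batch))

if __name__ == "__main__":
    flush_batch([39.8, 40.1, 40.6, 41.0])
```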


“I believe the edge is the next big opportunity for all of us,” HPE CEO Antonio Neri said in his opening keynote for the company’s big annual Discover conference in Las Vegas Tuesday afternoon. “It will be an edge that’s intelligent and cloud-enabled. That’s the future as we see it.”

The $4 billion will be spent on R&D in security, AI and machine learning, automation, and edge computing, the company said.

While his keynote was light on details, Neri highlighted two initial technologies that will play a specific part in the new strategy. The first is a cloud-managed software-defined WAN (Wide Area Network) solution for branch locations from Aruba Networks, a company HPE’s predecessor Hewlett-Packard acquired for $2.7 billion in 2015.

The solution, called SD-Branch and announced at Discover, lets companies deploy and manage everything related to network connectivity in their branch offices from a central location.

The other technology is HPE’s “memory-driven computing” architecture, a product of The Machine research project, which according to the company is “the largest and most complex research project” in its history. The architecture delivers extreme performance and will accelerate data processing at the edge, HPE said.



A prototype of The Machine by HPE

The basic building blocks of memory-driven computing (illustrated in the sketch after this list) are:

  • Giving all processors in a system access to a single large pool of non-volatile memory instead of dedicating a sliver of memory to every chip;

  • Using a single fabric to interconnect all the components in the system, so they can communicate using a single protocol instead of having a separate type of interconnect and protocol for every type of device;

  • Replacing the electrons that run over copper wires to transmit data between components with photonics, or microscopic lasers carrying data over tiny fibers.
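
The real architecture pools non-volatile memory over a dedicated fabric with photonic links, which no short program can reproduce; as a loose operating-system-level analogy, the Python sketch below shows several processes working directly in one shared pool of memory rather than each keeping a private copy of the data.

```python
from multiprocessing import Process
from multiprocessing.shared_memory import SharedMemory

POOL_SIZE = 1024  # one pool, visible to every worker

def worker(pool_name: str, offset: int) -> None:
    pool = SharedMemory(name=pool_name)  # attach to the existing pool
    pool.buf[offset] += 1                # operate directly on shared bytes
    pool.close()

if __name__ == "__main__":
    pool = SharedMemory(create=True, size=POOL_SIZE)
    pool.buf[:4] = bytes(4)  # zero the slots the workers will use
    workers = [Process(target=worker, args=(pool.name, i)) for i in range(4)]
    for w in workers:
        w.start()
    for w in workers:
        w.join()
    print(bytes(pool.buf[:4]))  # b'\x01\x01\x01\x01': each worker wrote its slot
    pool.close()
    pool.unlink()  # release the pool
```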

“Memory-driven computing will be a game-changer,” Neri said, announcing during his keynote the availability of a memory-driven computing sandbox designed to give developers a test-drive of the new architecture. (The company unveiled a prototype of The Machine last May.)

Running on HPE’s high-performance Superdome Flex servers, it is “the biggest, baddest in-memory system out there,” he said. “Those systems are right now on the factory floor.”
