Microsoft Unveils AI Stack for On-Premises Data Centers and Edge Computing Sites
‘Intelligent Edge’ solution combines rugged FPGA-powered hardware and Microsoft’s deep neural network models
May 8, 2018
Microsoft expects 30 billion smart devices to make their way into homes and businesses by 2020. Together, they will pump out so much data that uploading all of it directly to the cloud for processing will be impractical, especially if those devices do things like understand speech or recognize objects in images. Much of that work will have to be done by AI at the edge.
“In the next ten years, billions of everyday devices will be connected; smart devices that can see, listen, reason, predict, and more – without a twenty-four-seven dependence on the cloud,” Frank Shaw, a Microsoft corporate VP for communications, told Data Center Knowledge in an interview. Microsoft’s concept of the “intelligent edge,” he said, is an “interface between the computer and the real world.”
Each person will generate about 1.5GB of data per day, while a smart home will generate as much as 50GB (roughly an hour-long 4K video), he said. A smart building would generate 150GB, a connected stadium 200 terabytes, a connected factory a petabyte, and a smart city 250 petabytes per day.
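To put those volumes in perspective, here is a quick back-of-the-envelope sketch; the per-day figures are Shaw's, but the bandwidth arithmetic is ours, showing the sustained uplink each source would need just to ship its daily output to the cloud:

```python
# Sustained uplink needed to push each source's daily output to the cloud.
# Per-day volumes come from the article; the conversion math is ours.
DAY_SECONDS = 24 * 60 * 60

daily_output_bytes = {
    "person": 1.5e9,              # 1.5GB/day
    "smart home": 50e9,           # 50GB/day
    "smart building": 150e9,      # 150GB/day
    "connected stadium": 200e12,  # 200TB/day
    "connected factory": 1e15,    # 1PB/day
    "smart city": 250e15,         # 250PB/day
}

for source, nbytes in daily_output_bytes.items():
    mbps = nbytes * 8 / DAY_SECONDS / 1e6  # megabits per second, sustained
    print(f"{source}: {mbps:,.1f} Mbps sustained uplink")

# The smart city works out to roughly 23 Tbps of nonstop upload, which is
# the article's point: most of this data has to be handled at the edge.
```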
By 2019, 45 percent of this IoT-created data flood will be processed and acted on at or close to the edge of the network, IDC predicts. Some of it will be processed on hyperconverged infrastructure like Microsoft’s Azure Stack. Scott Guthrie, executive VP of the Microsoft Cloud + AI group, told us the on-premises version of the company’s cloud is used for edge computing much more than it is for “lift-and-shift” private cloud.
“People are really recognizing the importance of being able to leverage computing resources on the edge to do more,” he said. “If you have a drone doing visual inspection of bridges looking for cracks, the amount of video that drone creates is enormous, and trying to upload it to the internet using 3G or 4G or even 5G is pretty bandwidth-intensive. The ability to have an Azure Stack instance on the truck where the engineer is doing video processing, doing AI on the edge, makes a tremendous difference as to whether that solution could work or not.”
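Guthrie's drone example is easy to sanity-check. Here is a rough sketch (the bitrates are our assumptions, not figures from Microsoft) of how long an hour of 4K inspection footage would take to upload over cellular, versus never leaving the truck at all:

```python
# How long it takes to upload one hour of 4K inspection footage over
# cellular uplinks. All bitrates below are illustrative assumptions.
video_mbps = 100                                   # high-quality 4K capture
uplink_mbps = {"3G": 2, "4G/LTE": 20, "5G (early)": 100}

footage_megabits = 1 * 3600 * video_mbps           # one hour of footage
for network, mbps in uplink_mbps.items():
    hours = footage_megabits / mbps / 3600
    print(f"{network}: ~{hours:.1f} hours to upload 1 hour of footage")

# Over LTE, an hour of footage takes roughly five hours to move; on an
# Azure Stack instance in the truck, it is processed where it was shot.
```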
Intelligence at the Edge
As Microsoft distinguished engineer Doug Burger told us recently, Project Brainwave, the FPGA platform that accelerates machine learning inside Azure, is coming to the edge solution to power what he calls “real-time AI.” It’s now available as a service that developers can use for deep learning in the cloud, called Azure Machine Learning Hardware Accelerated Models (initially supporting ResNet deep neural networks).
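Microsoft hasn't published a stable client API for the preview, so treat the following as a purely hypothetical sketch of what scoring an image against a cloud-hosted, FPGA-accelerated ResNet model looks like conceptually; the class, method, and endpoint names below are our inventions, not Microsoft's:

```python
# Hypothetical client for the Hardware Accelerated Models service. Every
# identifier here is invented for illustration; only the idea (send an
# image, get class probabilities back from an FPGA-hosted ResNet) is real.
class HypotheticalBrainwaveClient:
    def __init__(self, endpoint: str, access_key: str):
        self.endpoint, self.access_key = endpoint, access_key

    def score_image(self, path: str) -> list:
        # The real service would run the image through an accelerated
        # ResNet in Azure; this stub returns a dummy probability vector.
        return [0.0] * 1000  # ImageNet-sized output, for illustration

client = HypotheticalBrainwaveClient("myservice.example.azureml.net", "KEY")
probs = client.score_image("bridge_crack.jpg")
print("top class index:", max(range(len(probs)), key=probs.__getitem__))
```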
Microsoft officially unveiled Brainwave at the Edge at its Build conference in Seattle this week. It will come as a server equipped with a Microsoft-designed FPGA board that can be deployed on-premises, ideal both for reducing latency and for handling sensitive data, Guthrie explained.
The hardware will be a rugged server, initially from Hewlett Packard Enterprise or Dell EMC. The Brainwave system will include Azure IoT Edge to manage updates to the Azure machine learning model, “maintaining symmetry between the cloud, where the model may have been trained, and the edge.”
A user will be able to download a deep neural network model from the Azure ML repository along with the Project Brainwave FPGA image and run it on the local Intel FPGA. Once the model is loaded, the edge device doesn’t need to be connected. “Internet connectivity is required only when updating the model; that means you can run disconnected essentially forever if you don't want to update the model,” Guthrie said.
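In code, the deploy-then-disconnect workflow Guthrie describes might look something like the sketch below. The helper functions are stand-ins we made up for illustration; only the workflow itself (download the model and FPGA image once while connected, then run offline indefinitely) comes from Microsoft's description:

```python
# Hypothetical deploy-then-disconnect workflow. The fetch/load helpers
# are stubs standing in for the real Azure ML and FPGA-programming calls.
import os
import random

MODEL_DIR = "/tmp/brainwave/resnet50"

def fetch_model(dest: str) -> None:
    # Stand-in for downloading the DNN model plus the Project Brainwave
    # FPGA image from the Azure ML repository; needs connectivity.
    os.makedirs(dest, exist_ok=True)
    open(os.path.join(dest, "model.bin"), "wb").close()

def load_model(path: str):
    # Stand-in for programming the local Intel FPGA with the image.
    return lambda frame: random.random()  # fake defect score

def main() -> None:
    if not os.path.exists(MODEL_DIR):
        fetch_model(MODEL_DIR)  # the only step that touches the internet
    score = load_model(MODEL_DIR)
    # From here on, the device can run disconnected "essentially forever";
    # connectivity matters again only when the model needs updating.
    for frame in range(3):      # stand-in for a camera feed
        print("defect score:", round(score(frame), 3))

if __name__ == "__main__":
    main()
```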
The Brainwave servers are in limited preview now, and Microsoft hasn’t announced what the solution will cost or how it will charge for it (a single upfront payment or consumption-based). In addition to Azure Stack, Brainwave will also be available for the 100TB Azure Data Box appliance, used for physically shipping data to Azure. That’s if you want to put your data through the machine learning model before it’s ingested by the cloud.
Automating the Factory Floor
One of the use cases the intelligent edge is aimed at is industrial automation. It can shrink the latency of an image-recognition system that automatically checks the quality of manufactured parts, for example.
Not being dependent on internet connectivity also makes the system more reliable. Every additional network connection is a potential point of failure.
In an internet-dependent system, “if the internet connectivity comes down, this multimillion-dollar assembly line would screech to a halt,” Guthrie said. The ability to keep an assembly line running can make all the difference in a company’s decision to adopt cloud technology.
The factory-floor systems might be powered by Azure Stack. Intelligence might also be embedded on the IoT devices themselves. Azure IoT Edge, Microsoft’s platform that ties it all together on the backend, uses Docker containers to run the same machine learning model in the cloud and at the edge.
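A heavily trimmed IoT Edge deployment manifest illustrates the idea. It is shown here as a Python dict for readability; the module name and image tag are our inventions, and a real manifest carries more required fields. The key point from the article is that the image the edge device pulls is the same Docker image that serves the model in the cloud:

```python
# Trimmed, hypothetical Azure IoT Edge deployment manifest. The structure
# follows IoT Edge's $edgeAgent desired-properties format; names are ours.
import json

deployment = {
    "modulesContent": {
        "$edgeAgent": {
            "properties.desired": {
                "modules": {
                    "visual-inspect": {  # hypothetical inference module
                        "type": "docker",
                        "status": "running",
                        "restartPolicy": "always",
                        "settings": {
                            # the same image you would run in the cloud
                            "image": "myregistry.azurecr.io/inspect:1.0",
                        },
                    }
                }
            }
        }
    }
}

print(json.dumps(deployment, indent=2))
```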
Tools for Building Intelligence at the Edge
Custom Vision is the first of the Azure Cognitive Services to run on Azure IoT Edge, with more expected in the coming months, Guthrie said. Custom Vision provides image recognition for tasks like a drone spotting a cracked pipe or an assembly-line camera catching a faulty component. Microsoft is working with DJI, the largest commercial-drone vendor, and Qualcomm to help developers build these systems.
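Once a Custom Vision model is exported and running as a local container, querying it is a simple HTTP call. The sketch below assumes the container listens on port 8080 with an /image route; check your own export, since the port and route here are our assumptions:

```python
# Query a Custom Vision model running locally as an IoT Edge container.
# The port and /image route are assumptions; adjust to match your export.
import requests

with open("pipe_section.jpg", "rb") as f:
    resp = requests.post(
        "http://localhost:8080/image",
        data=f.read(),
        headers={"Content-Type": "application/octet-stream"},
    )

for pred in resp.json().get("predictions", []):
    print(pred["tagName"], round(pred["probability"], 3))
```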
If you need more than standard video, Project Kinect for Azure is a package of sensors, including 3D cameras from the ill-fated Xbox Kinect (yes, Microsoft is bringing it back, but not for gaming) and from HoloLens, plus onboard computing for running machine learning models on the device you build.
A Kinect system could use hand tracking for gesture control or to monitor and assist technicians working with physical parts. It could also be used for spatial mapping. (Thyssenkrupp, for example, uses the HoloLens hardware to generate 3D models of staircases for installing custom stair lifts.) Project Kinect is due to come on the market next year.
There’s also a new Speech Devices SDK for building speech recognition into drive-through ordering systems in restaurants or airport kiosks.
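The Speech Devices SDK itself targets dedicated hardware dev kits, but to give a flavor of building on Microsoft's speech stack, here is a minimal sketch using the company's general Speech SDK for Python, a related but distinct offering; the key and region are placeholders:

```python
# Minimal speech-to-text sketch using Microsoft's general Speech SDK for
# Python (a stand-in here for the device-focused Speech Devices SDK).
import azure.cognitiveservices.speech as speechsdk

config = speechsdk.SpeechConfig(subscription="YOUR_KEY", region="westus")
recognizer = speechsdk.SpeechRecognizer(speech_config=config)

print("Say your order...")
result = recognizer.recognize_once()  # one utterance from the default mic
if result.reason == speechsdk.ResultReason.RecognizedSpeech:
    print("Heard:", result.text)      # e.g. hand off to the ordering system
```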
What unifies this mix of hardware, tools, and services is the ability to extend the same development effort from the cloud to the edge and further down to the end device itself. “It’s the flexibility of having the same programming model on an IoT device as on Azure Stack inside your building as on Azure public cloud,” Guthrie said.