IoT World: The Edge is Getting Smarter, Smaller, and Moving Further Out
‘Data gravity’ means the IoT of the future will be less like a collection of dumb connected devices and more like a distributed compute fabric.
If you are familiar with the Internet of Things, you likely picture IoT devices as sensors, household appliances, wearables, and the like, all connected to a network. Being connected is how a collection of “things” becomes an “internet of things.” And that’s been the general understanding of the term to date.
Soon, however, connectivity alone may not be enough to qualify. The Internet of Things of the future may be less like a bunch of dumb devices sending data to a server somewhere far away and more like a distributed computing fabric, where many of the IoT devices themselves can analyze the data and make decisions on their own.
Numerous experts speaking at this week’s Internet of Things World conference in Silicon Valley said IoT is pushing the IT industry once again toward a distributed computing model – this time more distributed and intelligent than ever.
Edge computing is a space we cover a lot on Data Center Knowledge, including in the context of IoT. Typically, this computing infrastructure has been similar to what you would find in any data center, just less of it, deployed in an IT closet or a micro-data center (a specially designed enclosure) in an office building, a retail store, or a factory. Another tier of edge computing infrastructure consists of routers and other boxes that sit between groups of IoT devices on one end and a big remote data center or public cloud platform on the other. Those boxes are increasingly being designed with more computing power.
Those types of edge computing infrastructure aggregate data from IoT devices in the same location for storage and/or processing, bringing compute closer to the sources of data. Increasingly, however, intelligence is also being pushed further out, to the end devices themselves.
While in most cases these devices will still send data to a remote centralized cloud, they will send only a fraction of the data they collect. Most importantly, however, they will be able to function autonomously if needed.
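As a rough illustration of what that looks like in practice, an edge device of this kind might score each reading locally and forward only the outliers, falling back to a local buffer when the uplink is down. The sketch below is a minimal, hypothetical example; the z-score threshold, the read_sensor and send_to_cloud helpers, and the buffer size are assumptions, not any vendor’s API.

```python
import collections
import random
import statistics
import time

# Hypothetical store-and-forward buffer used when the uplink is unavailable.
UPLINK_QUEUE = collections.deque(maxlen=1000)


def read_sensor() -> float:
    """Stand-in for a real sensor driver (assumption for this sketch)."""
    return 20.0 + random.gauss(0, 0.5)


def send_to_cloud(payload: dict) -> bool:
    """Stand-in for an MQTT/HTTPS publish; returns False if the link is down."""
    return random.random() > 0.1  # pretend the link drops ~10% of the time


def run_edge_loop(window: int = 60, z_threshold: float = 3.0) -> None:
    """Process every reading locally; forward only statistical outliers."""
    recent = collections.deque(maxlen=window)
    while True:
        value = read_sensor()
        recent.append(value)
        if len(recent) >= 10:
            mean = statistics.fmean(recent)
            stdev = statistics.pstdev(recent) or 1e-9
            if abs(value - mean) / stdev > z_threshold:
                event = {"ts": time.time(), "value": value, "mean": mean}
                # The decision is made on the device regardless of connectivity;
                # the cloud only ever sees the small fraction of readings that matter.
                if not send_to_cloud(event):
                    UPLINK_QUEUE.append(event)  # keep working autonomously, sync later
        time.sleep(1.0)
```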
Data Gravity at the Edge
Bandwidth costs and latency are two forces that combine to create what Tony Shakib, general manager for IoT and Intelligent Cloud at Microsoft, called “data gravity” at the edge. Put simply, in many cases you can get more value out of IoT applications if most of the device data is processed on the spot.
In some cases, the application simply cannot deliver what it promises unless latency is sufficiently low. In manufacturing, for example, low latency is necessary to keep factory machines in stable operating condition. And in fields like healthcare or autonomous vehicles, an extra millisecond of latency can be a matter of life and death.
Many companies jumped into IoT projects with a cloud-first mindset, doing most of the processing of IoT device data on public cloud platforms like AWS or Azure. Other than the IoT devices themselves, their edge infrastructure would typically include a lot of storage to accumulate data before it’s shipped to the cloud, Stephen Goldberg, CEO at HarperDB, said during an edge computing panel at IoT World.
The bandwidth needed to push data to the cloud and the edge storage infrastructure itself are expensive, accounting for much of the cost of an IoT deployment, he said. A more distributed computing infrastructure, where the edge devices already in place do as much computing as possible, is a more rational approach, he argued.
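A quick back-of-the-envelope calculation shows why the bandwidth side of that argument adds up. All of the numbers below are assumptions made for illustration, not figures cited on the panel:

```python
# Illustrative, assumed numbers -- not from the conference presentations.
sensors = 10_000                 # devices at one site
reading_bytes = 200              # payload per reading
readings_per_sec = 10            # sampling rate per device
cost_per_gb = 0.10               # assumed blended cost per GB over the WAN/cellular link, USD

raw_gb_per_month = sensors * reading_bytes * readings_per_sec * 86_400 * 30 / 1e9
print(f"raw upload: {raw_gb_per_month:,.0f} GB/month")

# If edge devices summarize locally and ship, say, 1% of the raw volume:
filtered_gb_per_month = raw_gb_per_month * 0.01
print(f"after edge filtering: {filtered_gb_per_month:,.0f} GB/month")
print(f"bandwidth savings at ${cost_per_gb}/GB: "
      f"${(raw_gb_per_month - filtered_gb_per_month) * cost_per_gb:,.0f}/month")
```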
A “real intelligent edge, not a data storage edge,” means you can have IoT analytics much closer to real-time, Goldberg added.
Jesse DeMesa, strategy partner at the venture capital firm Momenta Partners, whose sole focus is the industrial IoT space, agreed that the cloud-first or data center-first approach to IoT analytics infrastructure won’t work, and that companies will ultimately “go towards more autonomous systems.”
Many current IoT adopters are still focused on “connect, collect, and store,” while the “real value of data has a shelf life often measured in seconds,” said one of DeMesa’s presentation slides.
Microsoft’s ‘Intelligent Edge’
One of the biggest proponents of this architectural shift has been Microsoft. The company announced an “Intelligent Edge” strategy a year ago at its Build conference in Seattle. Earlier this month, at this year’s Build, it outlined specific steps to execute it.
Those included open sourcing the Azure IoT Edge Runtime, enabling the company’s computer vision software to run on IoT Edge devices, such as drones and industrial equipment, and partnering with the mobile-chip maker Qualcomm Technologies to build an AI developer kit engineers can use to build smart cameras that can run machine learning, stream analytics, and cognitive services right on the device.
Today, hardware for Azure IoT Edge can be as small as a Raspberry Pi 3 (quad-core 1.2GHz Broadcom CPU, 1GB RAM, Micro SD port for storage) and as powerful as necessary. It runs on Windows and Linux and supports both x86 and ARM processors. Azure services that can run on IoT Edge devices include IoT Hub, Machine Learning, Stream Analytics, Functions, and Cognitive Services.
IoT Edge users can deploy applications on a Raspberry Pi packaged in Docker containers, Tom Davis, director for Azure IoT, said in a presentation at IoT World. But Microsoft is working to drive these capabilities down to smaller and smaller devices in the future, he said.
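For a sense of what such a containerized module might look like, below is a minimal sketch of a Python service of the kind Davis described, packaged in a Docker container and deployed to a device like a Raspberry Pi. It assumes the azure-iot-device Python SDK; the output name “filteredOutput,” the temperature band, and the sensor stub are illustrative, not taken from Microsoft’s samples.

```python
# Minimal, illustrative Azure IoT Edge module (assumes `pip install azure-iot-device`
# and a deployment-manifest route wired to an output named "filteredOutput").
import json
import random
import time

from azure.iot.device import IoTHubModuleClient, Message


def read_temperature() -> float:
    """Stand-in for real hardware I/O; replace with an actual sensor read."""
    return 20.0 + random.gauss(0, 2.0)


def main() -> None:
    # When the container runs under the IoT Edge runtime, connection details are
    # injected as environment variables; this factory picks them up.
    client = IoTHubModuleClient.create_from_edge_environment()
    client.connect()
    try:
        while True:
            temp = read_temperature()
            # Filter at the edge: only forward readings outside a normal band.
            if abs(temp - 20.0) > 5.0:
                payload = json.dumps({"temperature": temp, "ts": time.time()})
                client.send_message_to_output(Message(payload), "filteredOutput")
            time.sleep(5)
    finally:
        client.disconnect()


if __name__ == "__main__":
    main()
```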
One Codebase, Many Platforms
Application containers and container orchestration systems like Kubernetes reduce the amount of platform-specific software that needs to be written for IoT applications. By splitting applications into microservices that can be easily deployed on different kinds of infrastructure, they let a developer write an application without worrying about whether it will run in the cloud, on an IoT edge router, or on a smart CCTV camera.
Ultimately, where IoT data is processed will be determined by the needs of every specific application and the associated costs. That makes the flexibility that containers and microservices enable an essential characteristic of the distributed computing infrastructure that’s now starting to be built.
“We basically want to be fluid in where we want that compute to take place,” Momenta’s DeMesa said.
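As a rough sketch of what that fluidity can mean in code, the example below keeps the business logic platform-neutral and wraps it in two thin adapters, one for a cloud HTTP endpoint and one for a local edge loop. The RUN_MODE switch, the detect_anomaly function, and the threshold are hypothetical; the point is that placement becomes a deploy-time choice rather than a code change.

```python
# Illustrative only: one processing function, two thin adapters.
import json
import os
import random
import time
from http.server import BaseHTTPRequestHandler, HTTPServer


def detect_anomaly(reading: dict) -> dict:
    """Platform-neutral logic: the same code runs in the cloud or at the edge."""
    threshold = float(os.getenv("THRESHOLD", "25.0"))  # assumed config knob
    return {"anomaly": reading.get("temperature", 0.0) > threshold, **reading}


class CloudHandler(BaseHTTPRequestHandler):
    """Adapter for a cloud deployment: expose the same logic over HTTP."""

    def do_POST(self):
        body = self.rfile.read(int(self.headers.get("Content-Length", 0)))
        result = json.dumps(detect_anomaly(json.loads(body or b"{}"))).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(result)


def edge_loop(read_sensor) -> None:
    """Adapter for an edge deployment: run the same logic on local readings."""
    while True:
        result = detect_anomaly(read_sensor())
        if result["anomaly"]:
            print(json.dumps(result))  # hand off to a local actuator or uplink queue
        time.sleep(1.0)


if __name__ == "__main__":
    # Which adapter runs is a deploy-time choice (e.g., an env var set on the
    # container image), not a code change.
    if os.getenv("RUN_MODE", "cloud") == "cloud":
        HTTPServer(("0.0.0.0", 8080), CloudHandler).serve_forever()
    else:
        edge_loop(lambda: {"temperature": 20.0 + random.gauss(0, 3.0)})
```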