
Managing the Edge: Critical Success Factors

Edge computing is not just a closet with a server or a computer inside a car or smart watch; it's a critical aspect of a high-availability, low-latency IT infrastructure.


Jonathan Luce is Senior Vice President, Strategic Partnerships for RF Code.

Data at the edge. With the Internet of Things, cloud applications, and the growing need for real-time services and analytics, it's more critical than ever for enterprises to process certain types of data physically close to where they are created and used. Sending that data to the cloud or a central data center just isn't going to cut it for many of today's use cases. And because edge computing often takes place remotely, outside a traditional data center environment, managing and monitoring these critical IT assets poses unique challenges.

Edge computing is not just a closet with a server or a computer inside a car or smart watch; it is a critical aspect of a high-availability, low-latency IT infrastructure. It is the processing and storage of data as close as possible to the source, whether in a local server, a mini data center, or an intelligent device with its own computing capabilities (such as a sensor). The typical use for edge computing, and a significant driver of its growth, is processing data from Internet of Things (IoT) devices, like health monitors or environmental sensors. Another key motivation leading enterprises to the edge is the ever-increasing need to analyze and act on data in real time – for applications like self-driving cars, financial services transactions, predictive analytics, and environmental monitoring.

Enterprises are quickly moving in this direction to meet their business imperatives. According to Gartner, only about 10 percent of enterprise-generated data is created and processed outside a traditional data center or cloud today, but by 2022 that figure is expected to reach 75 percent.

The challenging reality is that many of these edge computing devices sit in remote or difficult-to-access locations, far from the main data center and less accessible to the IT team. For enterprises engaged in edge computing, a critical success factor is the ability to remotely monitor and manage these systems – both to ensure uninterrupted operations and to secure the assets and the network.

Devices deployed at the edge need to be self-monitoring, or equipped with sensors that monitor, track, and report operational and environmental conditions. They also need advanced reporting capabilities so the IT team can monitor and manage operations from afar. When devices can troubleshoot their own health and operability, or even self-heal, they help safeguard the edge infrastructure and the data it houses.
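
To make the idea concrete, here is a minimal, illustrative Python sketch of an edge node reporting its own operational and environmental readings to a central monitoring service. The endpoint URL, device ID, sensor fields, and threshold are hypothetical placeholders, not the API of any particular product, and real deployments would use whatever DCIM or monitoring platform the enterprise already runs.

```python
import json
import time
import urllib.request

# Hypothetical endpoint for the central monitoring service.
MONITORING_URL = "https://monitoring.example.com/api/edge-health"

def read_local_conditions():
    """Collect operational and environmental readings.

    The placeholder values stand in for whatever onboard sensors
    (temperature, humidity, power draw) the edge device exposes.
    """
    return {
        "device_id": "edge-node-042",
        "timestamp": time.time(),
        "cpu_temp_c": 58.2,
        "inlet_temp_c": 24.7,
        "humidity_pct": 41.0,
        "power_draw_w": 310.5,
    }

def report_health(readings, threshold_c=80.0):
    """Send readings upstream and run a local health check."""
    payload = json.dumps(readings).encode("utf-8")
    request = urllib.request.Request(
        MONITORING_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(request, timeout=10)
    # The local check lets the device raise an alarm even if the uplink fails.
    return readings["cpu_temp_c"] < threshold_c

if __name__ == "__main__":
    readings = read_local_conditions()
    healthy = report_health(readings)
    print("healthy" if healthy else "over temperature threshold")
```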

Edge computing equipment and IoT devices increase the number of access points to an enterprise's network, making it more vulnerable to both physical and cyber-security threats. Ensuring the safety of the network requires close monitoring of the assets, their behavior, and their environment. Security is required in on-site data centers too, but the unique locations and devices that house edge computing services often demand hands-off protection from afar.

Enterprises that get the remote operations and monitoring formula right are rewarded with improved operational efficiency, greater speed, and lower costs.

Operational Efficiency

Because edge computing devices are designed to conduct their own real-time analysis and execution, they limit the amount of data sent to the data center or cloud. Much of the data created at the edge, such as that from a network of sensors or a health monitoring device, can be dealt with locally. Since it doesn't depend on a central data center for processing, the data can be processed, analyzed, and acted on much faster. And when data does need to be stored or processed centrally, it's usually a small subset of what the edge device handled, or an aggregate of it, which can be programmed to cross the network in batches or at times when more bandwidth is available.
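
As a rough illustration of that pattern, the Python sketch below acts on each reading locally and forwards only a small aggregate during an assumed off-peak window. The central endpoint, alert threshold, and schedule are hypothetical; the point is simply that raw readings stay on the device and only a summary crosses the network.

```python
import json
import statistics
import time
import urllib.request
from datetime import datetime

# Hypothetical central ingest endpoint and off-peak window.
CENTRAL_URL = "https://datacenter.example.com/api/ingest"
OFF_PEAK_HOURS = range(1, 5)  # 01:00-04:59 local time

buffer = []  # raw readings handled locally on the edge device

def handle_reading(value, alarm_threshold=90.0):
    """Act on each reading immediately at the edge, then buffer it."""
    if value > alarm_threshold:
        print(f"local alert: reading {value} exceeds threshold")
    buffer.append(value)

def flush_if_off_peak():
    """Send only a small aggregate upstream, and only during off-peak hours."""
    if not buffer or datetime.now().hour not in OFF_PEAK_HOURS:
        return
    summary = {
        "count": len(buffer),
        "mean": statistics.mean(buffer),
        "max": max(buffer),
        "window_end": time.time(),
    }
    payload = json.dumps(summary).encode("utf-8")
    request = urllib.request.Request(
        CENTRAL_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(request, timeout=10)
    buffer.clear()
```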

In this case, decentralizing data management and processing increases efficiency. As Santhosh Rao, principal research analyst at Gartner, notes: “Organizations that have embarked on a digital business journey have realized that a more decentralized approach is required to address digital business infrastructure requirements. As the volume and velocity of data increases, so too does the inefficiency of streaming all this information to a cloud or data center for processing.”

Speed

The physical location of an edge device depends on the use case – a financial services firm requiring high-speed trades may need an edge server within its own physical space. A manufacturing facility may require a device on the factory floor. A utility company or municipality may require sensors in multiple geographically dispersed outdoor locations. The physical location matters: with the storage and processing of data moved to an edge device as close as possible to the source, enterprises can get close to zero latency for transaction processing.

This is a critical business driver for many enterprises. Financial services firms make multi-million-dollar deals, healthcare companies make life-changing decisions, and retailers may deliver real-time augmented reality to customers. All of these activities require the fastest possible speed to meet the business's mission.

Lower Costs

Managing data on the edge device minimizes network traffic and reduces the amount of data that needs to be stored in the cloud or in a data center. That lowers cost, because enterprises are not paying to transport or store data they will never need.

Edge computing is part of a complete, modern IT infrastructure for today's enterprises. With their real-time processing and analysis, these distributed mini data centers complement full data centers, cloud services, and colocation facilities. And because the edge is often remote, reducing latency and speeding the delivery of its services and data requires real-time monitoring, visibility, and analytics to keep these assets secure and operational.

Opinions expressed in the article above do not necessarily reflect the opinions of Data Center Knowledge and Informa.

Industry Perspectives is a content channel at Data Center Knowledge highlighting thought leadership in the data center arena. See our guidelines and submission process for information on participating.

