Defining Your Edge
Here are three considerations to keep in mind as you look to understand the edge, and how it can be used to benefit your specific environment.
August 30, 2018
John Fryer is Senior Director of Industry Solutions for Stratus Technologies.
The emergence of edge technology has created a lot of hype, and also a lot of questions. What exactly is the edge? What are the benefits of processing data at the edge? How is the edge different from the cloud?
There is no one definitive answer to any of these questions. The technology is so new that many aspects of the edge have yet to be determined. In fact, dueling definitions are emerging across the industry, each attempting to explain exactly what the edge is and what it entails. According to the OpenFog Consortium, edge computing is the practice of placing data and data-intensive applications at the edge to reduce the volume and distance that data must be moved. The Linux Foundation takes it one step further, defining edge computing as a tool to improve the performance, operating cost and reliability of applications and services. The Linux Foundation also says that by shortening the distance between devices and the cloud, edge computing mitigates the latency and bandwidth constraints of today's Internet, ushering in new classes of applications.
In a sense, both of these definitions are right. It's agreed that edge computing enables data gathering and analytics to occur at the source by pushing applications, data and services away from centralized locations to the edge of the network. Where the two definitions differ is in the approach to, and purpose of, deploying edge devices.
While these definitions can be helpful to explain the different capabilities and applications of edge technology, they (and others out there) can be confusing for professionals who are considering adopting edge solutions. Here are three considerations to keep in mind as you look to understand the edge, and how it can be used to benefit your specific environment:
Data Processing and Storage
New research from Gartner estimates that by 2022, 75 percent of data will be created and processed at the edge. This estimate depends on more companies adopting edge technologies, gathering more data and processing more information than ever before.
To understand how to best apply edge technology to your business, you first need to consider how much data, and what kind, you're looking to store at the edge. Currently, we aren't sure what types of data will be sent to the edge and, therefore, cannot predict the amount of data that will live there. We do know that connectivity levels will affect how much data is stored at the edge, but this will vary based on regulatory standards, compliance needs and real-time applications. Despite these unknowns, real-time processing, lower latency and reduced costs remain strong drivers of edge adoption.
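To make the volume and latency benefit concrete, here is a minimal sketch of edge-side aggregation: readings are summarized locally and only a compact digest, plus any anomalies, is forwarded upstream. The sensor, thresholds and endpoint are illustrative assumptions, not part of any specific product.

```python
"""Minimal sketch of edge-side aggregation, assuming a simulated
sensor and a hypothetical cloud endpoint (cloud.example.com)."""
import json
import random
import statistics
import time

CLOUD_ENDPOINT = "https://cloud.example.com/ingest"  # hypothetical
WINDOW_SIZE = 60          # readings per local summary
ALERT_THRESHOLD = 90.0    # forward raw readings above this immediately

def read_sensor() -> float:
    # Stand-in for a real sensor read (e.g., temperature in degrees C).
    return random.gauss(70.0, 10.0)

def send_to_cloud(payload: dict) -> None:
    # Stand-in for an upload; a real device would POST and retry on failure.
    print("->", json.dumps(payload))

def run_once() -> None:
    window = [read_sensor() for _ in range(WINDOW_SIZE)]
    # Forward only anomalies plus a compact summary, not every reading.
    for value in window:
        if value > ALERT_THRESHOLD:
            send_to_cloud({"type": "alert", "value": round(value, 2)})
    send_to_cloud({
        "type": "summary",
        "count": len(window),
        "mean": round(statistics.mean(window), 2),
        "max": round(max(window), 2),
        "ts": time.time(),
    })

if __name__ == "__main__":
    run_once()
```

The design choice here is the essence of edge processing: sixty raw readings collapse into one summary message, so only a fraction of the data ever crosses the network.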
Connectivity
Edge technology does not require high connectivity to perform its services. However, most edge definitions assume that data connectivity is involved. In reality, a high percentage of the environments currently described as "edge" are minimally connected, or not connected at all. Take industrial sites, for example. Most oil and gas rigs, food and beverage factories, and water treatment plants run legacy systems with little or no connectivity to the outside world. For a variety of reasons, including security, they cannot currently scale their old data storage or application infrastructure to be compatible with the edge, which makes connectivity difficult. So, are these unconnected computing resources really "edge"? Today, we would say yes, but data connectivity will play a larger role in defining edge computing in the future.
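For minimally connected sites like these, a common pattern is store-and-forward: buffer data durably on the local device and drain the buffer opportunistically whenever a link appears. The sketch below assumes a SQLite buffer; the connectivity probe and upload call are hypothetical stand-ins.

```python
"""Minimal store-and-forward sketch for an intermittently connected
edge site. The link probe and upload are hypothetical stand-ins."""
import json
import sqlite3
import time

db = sqlite3.connect("edge_buffer.db")
db.execute("CREATE TABLE IF NOT EXISTS outbox (id INTEGER PRIMARY KEY, payload TEXT)")

def enqueue(reading: dict) -> None:
    # Write locally first, so durability never depends on the uplink.
    db.execute("INSERT INTO outbox (payload) VALUES (?)", (json.dumps(reading),))
    db.commit()

def link_is_up() -> bool:
    # Stand-in for a real connectivity probe (ping, modem status, etc.).
    return False

def upload(payload: str) -> None:
    # Stand-in for a real HTTPS POST to a hypothetical cloud endpoint.
    print("uploaded:", payload)

def flush_outbox() -> None:
    # Drain buffered readings whenever a connection becomes available.
    if not link_is_up():
        return
    rows = db.execute("SELECT id, payload FROM outbox ORDER BY id").fetchall()
    for row_id, payload in rows:
        upload(payload)
        db.execute("DELETE FROM outbox WHERE id = ?", (row_id,))
    db.commit()

enqueue({"sensor": "pump-3", "psi": 42.7, "ts": time.time()})
flush_outbox()
```

Because every reading lands on local disk before any upload is attempted, the site keeps operating, and keeps its data, through hours or days of lost connectivity.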
Security
To determine how the edge will fit into your business model, you need to think about where the device itself will live. Edge devices need to handle specialized environments, with the ability to change scaling parameters for different requests, all while serving as real-time processors between the edge and the cloud. These functions require a higher level of security than the traditional cloud-driven model of storing data in a secure data center, but current edge definitions do not reflect this. Most cloud-driven edge definitions assume that applications are created in the cloud to be deployed and run at the edge. While this approach aligns with cloud computing, it neglects to acknowledge that edge devices require a different management system than a cloud data center, and an additional layer of security to adequately protect edge technologies and the devices they interact with.
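One concrete example of that additional layer is mutual TLS, where the device authenticates the management service and the service authenticates the device, rather than the one-way authentication a typical web client performs. Below is a minimal sketch using Python's standard ssl module; the host name, port and certificate file paths are assumptions for illustration.

```python
"""Minimal sketch of mutually authenticated TLS between an edge device
and a management service. Host, port and file paths are assumptions."""
import socket
import ssl

MGMT_HOST = "mgmt.example.com"  # hypothetical management endpoint
MGMT_PORT = 8443

# Both sides present certificates: the device verifies the server
# against a private CA, and loads its own cert so the server can
# verify the device in turn.
context = ssl.create_default_context(ssl.Purpose.SERVER_AUTH, cafile="ca.pem")
context.load_cert_chain(certfile="device.pem", keyfile="device.key")

with socket.create_connection((MGMT_HOST, MGMT_PORT)) as sock:
    with context.wrap_socket(sock, server_hostname=MGMT_HOST) as tls:
        print("negotiated:", tls.version())
        print("server identity:", tls.getpeercert()["subject"])
```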
There is a vast array of applications suitable for edge computing, and many considerations to keep in mind when determining how to best utilize the edge, including storage, processing power, security and data connectivity. Because of that variety, there is no single way to define "the edge." What's more, new definitions will continue to appear as the technology evolves and is put to work across different industries. What matters is that you develop your own definition of the edge and weigh each of these aspects to determine how it can best benefit your operations.
Opinions expressed in the article above do not necessarily reflect the opinions of Data Center Knowledge and Informa.