Fog Computing for the Internet of Things Needs Smarter Gateways
With the Internet of Things expected to include 26 billion connected units by 2020, data center managers must deploy more forward-looking strategies for meeting the needs of businesses. That means embracing "fog computing" as a way to bridge IoT devices to remote data centers and to process the huge data sets this maturing technology will produce.
April 8, 2015
Rick Stevenson is the CEO of Opengear, a company that builds next-generation intelligent solutions for managing critical IT and communications infrastructure.
Without question, cloud computing and the Internet of Things (IoT) are two of the hottest mega-trends in technology – and for good reason. The rise of cloud, of course, has changed the equation for organizations calculating infrastructure costs. It has done so by offering on-demand computing, storage and network services with easy scalability and, often, massive savings compared to conventional on-premises approaches.
The IoT promises to bring the advantages of cloud computing to an earthly level, permeating every home, vehicle, and workplace with smart, Internet-connected devices. But as our dependence on these newly connected devices grows along with the benefits and uses of a maturing technology, the gateways that make the IoT a functional reality must become reliable enough to make uptime a near guarantee.
As every appliance, light, door, piece of clothing, and every other object in your home and office becomes potentially Internet-enabled, the Internet of Things is poised to apply major stresses to current Internet and data center infrastructure. Gartner predicts that the IoT may include 26 billion connected units by 2020. The popular current approach is to centralize cloud data processing in a single site, resulting in lower costs and strong application security. But with the sheer amount of input data that will arrive from globally distributed sources, this centralized processing structure will require backup.
The concept of “fog computing” has been introduced as a bridge between IoT devices in the field and remote data centers. IoT devices can produce huge data sets that need to be processed. With fog computing, some of that processing load can be handled by computing resources at the edge, which filter and summarize the data to reduce its volume and increase its value and relevance.
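To make the filtering-and-summarizing idea concrete, here is a minimal sketch in Python. Everything in it (the function name, the window size, the summary fields) is illustrative rather than drawn from any particular fog platform: a window of raw sensor readings is collapsed into one compact record before anything is sent upstream.

```python
# Minimal sketch: summarize a window of raw sensor readings at the edge
# so only a compact digest, not the raw stream, travels to the cloud.
from statistics import mean

def summarize_window(sensor_id, readings):
    """Reduce a list of raw numeric readings to a single summary record."""
    if not readings:
        return None
    return {
        "sensor_id": sensor_id,
        "count": len(readings),
        "min": min(readings),
        "max": max(readings),
        "mean": round(mean(readings), 2),
    }

# Example: 600 one-second temperature samples become one small record.
window = [21.0 + (i % 7) * 0.1 for i in range(600)]
print(summarize_window("temp-041", window))
```

The exact summarization policy would vary by application, but the principle holds: the gateway keeps the raw firehose local and sends the cloud only what adds value.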
IoT devices also create a need for low-latency handling of time-critical tasks such as analysis and decision-making. Fog computing, thought of as a “low to the ground” extension of the cloud to nearby gateways, ably provides for this need. As Gartner networking analyst Joe Skorupa puts it: “The enormous number of devices, coupled with the sheer volume, velocity and structure of IoT data, creates challenges, particularly in the areas of security, data, storage management, servers and the data center network, as real-time business processes are at stake. Data center managers will need to deploy more forward-looking capacity management in these areas to be able to proactively meet the business priorities associated with IoT.”
For the data-handling and backhaul issues that shadow the IoT’s future, fog computing offers a functional solution. Cisco, a vendor proposing such a framework, envisions the use of routers with industrial-strength reliability, running a combination of open Linux and JVM platforms embedded with Cisco’s own proprietary IOS. Because these platforms are open, applications could be ported to Cisco’s infrastructure using a programming environment that is familiar and supported by multiple vendors.
These robust gateways would strengthen the entire IoT infrastructure by absorbing the brunt of processing work before passing it to the cloud. Fog computing can meet requirements for reliable, low-latency responses by processing at the edge, and it can cope with high traffic volumes through smart filtering and selective transmission. In this way, smart edge gateways can either handle or intelligently redirect the millions of tasks coming from the myriad sensors and monitors of the IoT, transmitting only summary and exception data to the cloud proper.
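As a companion to the earlier summarization sketch, the selective-transmission side might look like the following. The thresholds, names, and in-memory queues are assumptions for illustration, not any vendor’s API: routine readings are absorbed at the edge, while exception data is forwarded to the cloud immediately.

```python
# Illustrative sketch of selective transmission at a smart edge gateway:
# routine readings stay local; only exception data is pushed to the cloud.
local_buffer = []   # readings absorbed at the edge, summarized later
cloud_queue = []    # stand-in for the uplink to the central data center

def handle_reading(sensor_id, value, low=10.0, high=80.0):
    """Forward out-of-range readings upstream; keep everything else local."""
    if value < low or value > high:
        cloud_queue.append({"sensor_id": sensor_id, "value": value, "type": "exception"})
    else:
        local_buffer.append((sensor_id, value))

for v in [22.5, 23.1, 95.4, 22.8]:   # one out-of-range spike among normal readings
    handle_reading("pressure-07", v)

print(len(local_buffer), "readings kept at the edge;", len(cloud_queue), "sent upstream")
```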
The success of fog computing hinges directly on the resilience of those smart gateways directing countless tasks on an Internet teeming with IoT devices. IT resilience will be a necessity for the business continuity of IoT operations, with redundancy, security, monitoring of power and cooling, and failover solutions in place to ensure maximum uptime. According to Gartner, every hour of downtime can cost an organization up to $300,000. Speed of deployment, cost-effective scalability, and ease of management with limited resources are also chief concerns.
To produce a bulletproof end-user experience, this smartening of edge gateways will need to rely on features such as out-of-band access, automatic detection of and recovery from outages, 4G LTE cellular connectivity with 3G fallback, and military-grade FIPS 140-2 security. 3G/4G technology offers cost-effective, always-available connectivity, and with a wireless 4G LTE failover solution, distributed enterprises can have the same reliability and competitive advantages as large enterprises. Having complete and secure control over gateways even when networks are down – and having the ability to provision, maintain and repair critical infrastructure – will be integral to maintaining services that touch the daily lives of consumers, who will become increasingly dependent on IoT functionality while remaining, surely, oblivious to the complexities at work behind the scenes.
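The automatic detection and failover behavior described above can be sketched as a simple health-check loop. The probe target, link names, and threshold below are placeholders rather than Opengear’s or anyone else’s implementation: the gateway probes its primary uplink and moves traffic to a cellular path after repeated failures.

```python
# Illustrative sketch of outage detection with cellular failover.
# The probe target, link names, and threshold are placeholders, not a product API.
import socket

def primary_link_up(host="203.0.113.1", port=443, timeout=2.0):
    """Crude reachability probe: try to open a TCP connection over the primary uplink."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

def choose_uplink(consecutive_failures, threshold=3):
    """Switch to the cellular path after `threshold` consecutive failed probes."""
    return "cellular-4g-lte" if consecutive_failures >= threshold else "primary-wan"

failures = 0
for _ in range(5):                      # in practice this loop runs on a timer
    failures = 0 if primary_link_up() else failures + 1
print("active uplink:", choose_uplink(failures))
```

A production gateway would of course drive real routing changes and report the event over its out-of-band path, but the decision logic reduces to something this simple.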
Moving the intelligent processing of data to the edge only raises the stakes for maintaining the availability of these smart gateways and their communication path to the cloud. When the IoT provides methods that allow people to manage their daily lives, from locking their homes to checking their schedules to cooking their meals, gateway downtime in the fog computing world becomes a critical issue. Additionally, resilience and failover solutions that safeguard those processes will become even more essential.