Assessing the State of the Cloud and Containers at OpenStack Silicon Valley
August 28, 2015
This article originally appeared at The WHIR
At the OpenStack Silicon Valley event this week, cloud professionals are discussing an OpenStack ecosystem that’s drastically different from what it was when the event first launched five years ago. The software has considerably matured, but it’s also taking on even greater roles, including this year’s hot topic: containers.
OpenStack is no longer just the open-source alternative to proprietary cloud software such as VMware’s; it is increasingly the preferred software for building and managing private clouds. In that role, it will undoubtedly shape the way application containers are adopted and managed.
Finding the Right Tool for the Job
As Jonathan Bryce, executive director of the OpenStack Foundation, noted, OpenStack benefits from dozens of active projects dealing with issues such as object storage, network functions virtualization and monitoring. “There are many OpenStack projects now,” Bryce said, and organizations are able to find the “right tool for the right job.”
He said projects tend to progress from “experiments,” which have low adoption and maturity but high potential for innovation, toward broader use. Over the past five years, many of these projects have become widely adopted and very mature. Even some of the less mature projects can still be useful. “It just might require more effort,” Bryce said.
A Variety of Management Tools
Craig McLuckie, original product lead for Google Compute Engine, said he found that people didn’t like Platform-as-a-Service (PaaS). Customers risked tying themselves to a platform that made it difficult to add new functionality the platform couldn’t accommodate.
Google’s container cluster manager, Kubernetes, McLuckie said, is designed to extract the best of PaaS orchestration, much as Docker extracts the best of PaaS packaging, essentially creating a PaaS-like environment that preserves flexibility.
However, there are several competing options. Kubernetes, Mesos and Docker Swarm currently occupy different niches in the broad area of container orchestration and management.
Boris Renski, co-founder and CMO of Mirantis, notes that while many of these projects claim to support one another, there is significant competition to become the preferred platform.
OpenStack, on the other hand, can be thought of as the glue that binds the different modules of a cloud together, making it easier to swap in different technologies. By integrating components that are in direct or indirect competition, it provides a foundation for exchanging one technology for another, making it a good bet when experimenting with containers.
Rethinking Organizational Roles to Make Way for Innovation
James Staten, Microsoft’s cloud and enterprise chief strategist, said it’s not technology that’s holding IT departments back from taking full advantage of cloud technology; it’s organizational psychology. And most IT departments aren’t ready.
Developers and other people in the organization are buying public cloud solutions, and they don’t care whether IT supports them. Unable to keep up and unable to stop employees from using public cloud services, IT departments start to be seen as irrelevant, even though they carry out important work around security and customer privacy.
So, when employees go to public cloud services for the IT they need, he said, admin teams shouldn’t get angry, but instead “try to understand why your employees are trying to circumvent what you provide.” IT could be putting unnecessary barriers around services or providing the wrong services.
IT needs to rethink its role in the organization and learn how it can help the company provide services more effectively. That may even mean reassigning employees to different roles: from developer to DevOps engineer, from statistician or analyst to data scientist, or from database architect to information architect.
He said the only people in IT who seem content with yesterday’s environment are those close to retirement, and even for them it’s not particularly appealing. IT staff need to be sold on the idea that they are helping the company innovate and transform through technology, which should motivate them to make the transition.
In Time, People Will Learn the Strengths and Weaknesses of Containers (as They Did with Cloud)
Hype around containers has reached a fever pitch, leading many people to believe they can be anything and everything to an organization. However, as Gartner’s hype cycle model notes, the “peak of inflated expectations” is nearly always followed by a “trough of disillusionment,” when the limitations of the technology become known.
For instance, McLuckie says that containers offer good resource isolation, but that virtual machines are still needed for security isolation.
Over time, organizations are learning how to build and manage clouds and leverage containers in ways that improve service levels and economics while still addressing persistent concerns such as security and privacy.
DirecTV, which AT&T acquired last month for $49 billion, is using OpenStack and is even exploring containers. Its role going forward will be helping AT&T provide new video services over-the-top and on mobile devices; its architectural experiments will be necessary to deal with the new ways Americans consume media and will provide valuable experience in how to roll out services at massive scale. Other broadcasters such as Comcast are also cautiously experimenting with containers.
Bryce notes that while there’s a great deal of interest in containers, very few people are using them in production. “The next phase is operational models that tie it into enterprise environments,” he said. With greater adoption, the community is expected to rally around the tools that are needed, and maturity will grow.
The Frontier of OpenStack and Containers
Last month, Rackspace and Intel announced a collaboration on an OpenStack Innovation Center (OSIC) in San Antonio. The center has been recruiting and training open source developers and fostering collaboration between Intel and Rackspace engineers, who will work to improve the scalability, manageability and reliability of OpenStack.
A key part of OSIC is its two 1,000-node OpenStack hybrid cloud clusters, which the OpenStack community can use for testing applications at enterprise scale. Rackspace runs one of the world’s largest OpenStack-powered clouds, and further research involving different industry players will help OpenStack run larger and more diverse workloads and test out new container-oriented architectures.
Renski notes that one of the big challenges ahead is making OpenStack better at handling hybrid clouds, as orchestration tools that let services cross data center boundaries continue to grow.
Lew Tucker, VP and CTO of Cloud Computing at Cisco Systems, said he’s excited about the Magnum project, which aims to deliver containers “as a service” on an OpenStack platform, as well as Kolla, which provides production-ready containers and deployment tools for operating OpenStack clouds.
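To give a concrete flavor of what containers “as a service” can look like on OpenStack, here is a minimal sketch that uses the openstacksdk Python library to request a Kubernetes cluster from a Magnum-enabled cloud. The cloud name, cluster template name, and node count below are illustrative assumptions, not details from the talks, and the exact attributes available depend on the Magnum and SDK versions deployed.

```python
# Minimal sketch: asking OpenStack Magnum (containers "as a service")
# for a Kubernetes cluster via the openstacksdk library.
# Assumptions (not from the article): a cloud named "mycloud" defined in
# clouds.yaml with the Magnum service enabled, and a pre-created
# cluster template named "k8s-template".
import openstack

conn = openstack.connect(cloud="mycloud")
coe = conn.container_infrastructure_management  # Magnum service proxy

# Find the pre-created template that describes the cluster
# (COE type such as Kubernetes, images, flavors, networking).
template = next(
    t for t in coe.cluster_templates() if t.name == "k8s-template"
)

# Ask Magnum to provision a small three-node cluster from the template.
cluster = coe.create_cluster(
    name="demo-cluster",
    cluster_template_id=template.id,
    node_count=3,
)

print("Cluster requested, id:", cluster.id)
```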
Diane M. Bryant, an executive with Intel’s data center group, noted that getting the industry to focus on a common set of goals is also a challenge – one the community has met before with OpenStack and can meet again as container architectures mature. Given the expansion of the OpenStack community to smaller organizations, she envisions that OpenStack could bring the efficiencies of hyper-scale architectures to the masses, not just a few major companies with massive clouds.
This first ran at http://www.thewhir.com/web-hosting-news/assessing-the-state-of-the-cloud-and-containers-at-openstack-silicon-valley