Planning for the New Windows Server Cadence
June 23, 2017
The next version of Windows Server will let you run Linux containers using Hyper-V isolation (and connect to them with bash scripts), encrypt network segments on software-defined networks, and deploy the Host Guardian Service as a Shielded VM rather than needing a three-node physical cluster.
Data deduplication for ReFS means Storage Spaces Direct will have 50 percent better disk utilization. Multi-tenancy for Remote Desktop Services Hosting means you can host multiple clients from a single deployment, and virtual machine load balancing is now OS- and application-aware, so you can prioritize app performance as you balance the load across multiple hosts.
However, if you want to get those features as quickly as possible, you need to consider whether you’re ready to deal with a new version of Windows Server in your data centers every six months.
Even Azure is only now going through its update to Windows Server 2016 (some data centers have upgraded and some haven’t, so only some VM instances support nested virtualization, for example). But the goal is for Azure to have a new release of Windows Server rolling out to its data centers within six months of its arrival. And with the new Windows Server lifecycle, that’s what Microsoft would like you to be considering in your own data centers – at least for some workloads.
Faster Channels
Like the Windows 10 client (and Office), Windows Server (both the Standard and Datacenter SKUs) and System Center will now have two feature updates a year, in spring and fall (most likely in March and September, making the next two releases 1709 and 1803). Like the monthly quality updates, these updates will be cumulative.
You’ll need Software Assurance or an MSDN subscription to get the Semi-Annual Channel, but the release cycle will cover both Nano Server and Server Core.
If you don’t want twice-yearly updates, the Long-Term Servicing Channel (LTSC) will be pretty much the familiar model: a new release every two or three years, with 10 years of support (16 if you pay for Premium Assurance). That’s available for both Server Core and Server with Desktop Experience, which you’ll need for Remote Desktop and other apps that require a GUI. One thing that isn’t yet clear is whether LTSC for Windows Server will have the same silicon support policy as the Windows 10 client, which explicitly doesn’t support software or silicon released after that LTSC version – so if you want to upgrade the CPU, you have to switch to a newer LTSC release. That would be a big change from the current Windows Server policy, and we look forward to Microsoft clarifying it.
The first Semi-Annual Channel release, coming in September, also marks some changes to Nano Server and Server Core. Although Nano Server currently supports a number of infrastructure roles, it’s rarely used for them; telemetry shows that the vast majority of Nano Server instances are used for container scenarios – and in that role, customers want Nano to be even smaller. Because of that, Microsoft is removing the infrastructure roles, which will make images at least 50 percent smaller and improve startup times, so you’ll see better density and performance for containers. “Nano Server going forward will be about new modern-pattern apps like containers, and it will also be optimized for .NET Core 2.0,” Microsoft Enterprise Cloud product marketing director Chris Van Wesep told Data Center Knowledge.
Server Core will take over the infrastructure roles, and should be your default for traditional data center roles. There isn’t yet a full list of what will be removed, but the Server Core image may well get smaller as well. “If what you're trying to do is make an optimized deployment for the modern data center, you might not need the fax server role any more,” he suggested. “Let's just make it the best for what it's trying to be and not be everything to everybody, especially some of that old stuff.”
You’ll want to use Server Core as the host for Nano Server (which means you’ll need the Semi-Annual Channel and SA), but it will also be relevant for running containers using Hyper-V isolation, which doesn’t require Nano Server. “Without any code changes, you can take legacy .NET apps that don't have a GUI dependency, drop them into containers and deploy them on Windows Server and get the benefits of containerization, even with legacy patterns. You can save yourself some money and get yourself on a new platform.”
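As a rough sketch of what that looks like in practice (assuming a Windows Server host running the Docker engine with the Hyper-V role enabled, and using the Docker SDK for Python rather than the docker CLI most admins would actually reach for; the image name and command here are illustrative placeholders, not anything Microsoft prescribes):

import docker

# Connect to the local Docker engine.
client = docker.from_env()

# isolation="hyperv" asks the Windows Docker engine to run this container in a
# lightweight utility VM with its own kernel, rather than as a process sharing
# the host kernel.
container = client.containers.run(
    "microsoft/dotnet-framework:4.7",  # hypothetical base image for a legacy .NET app
    "powershell -Command Get-Date",    # placeholder for the app's real entry point
    isolation="hyperv",
    detach=True,
)

container.wait()
print(container.logs().decode())

On the command line, the equivalent is passing --isolation=hyperv to docker run.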
You can mix and match servers on different channels in your infrastructure. “We expect most customers will find places in their organization where each model is more appropriate,” said Van Wesep. But if you want to switch a server from LTSC to the Semi-Annual Channel (or vice versa), you’ll need to redeploy that server, so the way to deal with this is to pick the right model for your workloads.
“If you have a workload that needs to innovate quickly, that you intend to move forward on a fairly regular basis, the SAC will be the better way of doing that,” suggested Van Wesep. “That could be containers, but I also look at customers like hosting and service providers, who are more on the cutting edge of software-defined data centers. If we’re putting new scale capabilities or clustering functionality into Hyper-V, they may not want to wait two years to get access to those.”
Splitting releases like this makes sense for the diverse Windows Server market, Jim Gaynor, Research Analyst at Directions on Microsoft, told Data Center Knowledge. “Nano Server is turning into a container image component. That means it's going to be changing quickly, not least because containerization is changing rapidly, so staying in the fast lane is a no-brainer. Server Core SAC is for fast new features; Server Core LTSC is for your low-change roles like AD, DNS, file/print, and so on. If you want the fast-developing feature train for your container, VM, IIS or storage hosting, you go with Server Core SAC. This is a logical push for Server Core, since it’s where Microsoft wants people to be. For RDS, Core-incompatible Win32 server apps, and point-and-shoot orgs... there's literally no change.”
“LTSC is what you pick if you’re a set-it-and-forget-it shop that buys a Windows Server Standard or Datacenter license, without SA,” agrees Directions on Microsoft Research Analyst Wes Miller. But as you move into the faster cadence, whether it’s for infrastructure improvements or containers, you will need to take the licensing implications into account. “There's now a higher likelihood that you'd need to have SA across all your user and device CALs, due to random pockets of servers in your organization needing SA.”
“If you go back to Windows Server 2012 and 2012 R2,” Van Wesep reminded us, “we had a one-year gap between them. Previously customers had been saying ‘it’s so frustrating that it takes you three years,’ so we did one release faster – and people said ‘that’s way too fast for us to consume.’ What we realized is that there really are different people that have different needs.”
How Fast Can You Run?
Having Windows, Windows Server and Office aligned like this also makes support simpler, Van Wesep explained. “Every September and March you have a feature update, and those updates will be in market for 18 months, so at any point in time you have three versions; any of the three Office versions will work with any of the three client versions, and now Windows Server and System Center can participate in that.”
So far, so helpful. For some organizations and some workloads, though, updating every six months – or even once a year, because you can choose to skip one Semi-Annual Channel release and still be supported, though you can’t skip two in a row – will be too fast a pace, and LTSC is the answer there. But if you’re adopting DevOps and continuous delivery and turning to containers to make that work, you want the frequent updates to Nano Server that will enable that – and you’ll already be moving at a fast enough pace that six-monthly updates will just become part of your ongoing process. Some customers will also want the ‘latest and greatest’ features for infrastructure.
Keeping Windows Server updated is also much less work than, say, upgrading the last of your Windows Server 2003 systems. “By incorporating things like mixed-mode cluster updates, the process of moving forward shouldn't be nearly as painful as it's been in the past,” Van Wesep claimed, pointing out that “containers are redeployed net new each time anyway. We think, for the workloads people will be using this for, the process of moving forward isn't going to be as arduous as it was. It’s about decoupling the application from the OS from the underlying infrastructure; it's OK to have different cadences of upgrades for all of those layers.”
Getting a new version of Windows Server twice a year doesn’t turn Windows Server into any more of a subscription than it already is with Software Assurance. This new cadence is about innovating faster with Windows Server, especially for containers. As with Windows 10, it’s about turning deployment from a large-scale project that consumes all your IT resources every few years into an ongoing process where you try out Insider builds, pilot Semi-Annual Channel releases for a few months, deploy them to the relevant servers, patch them monthly – and then start again a few months later.
“Living properly in a channel-based (and in some situations container-based) world means organizations likely need to consider their model of deployment and servicing – and treat it as a process, not like an event,” says Directions on Microsoft's Miller.