There was a brief and lively discussion on Twitter recently stemming from someone asking for advice on how to convince management to turn on Remoting.
“Fire Management, if they have to ask” was apparently not an option, although it should have been. I mean, at this stage, you either know the value of PowerShell and its Remoting technology, or you’re being willfully ignorant.
But that wasn’t where the discussion got lively.
The real discussion was about why Remoting was turned on by default in the first place, on newer versions of Windows Server (since Win2012). After all, Remote Desktop Protocol (RDP) is turned off by default. Most Linux distributions, it was pointed out, turn off sshd by default. So why is Remoting turned on? Isn’t the safest bet to just disable everything, and let people turn on what they need?
I think Remoting (and by Remoting, I mean the Windows Remote Management, or WinRM service) being turned on by default gives us a valuable look at Microsoft’s psyche these days.
First, keep in mind that you can turn it off. You can even do that via a Group Policy for domain computers, and you could certainly do so in a server master image if you wanted to. So it’s pretty easy to have an “off by default” setup in your environment if you want. But wouldn’t it therefore be just as easy for Microsoft to leave it off, and let you “default it to on” by whatever means you prefer, if that’s what you want? Sure. But again, I think this is about Microsoft’s psyche, these days.
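To make that concrete, here's a minimal sketch of what "off by default" looks like on an individual server. These are the standard cmdlets for the job; the Group Policy path shown in the comment is where the equivalent domain-wide setting lives.

```powershell
# Turn Remoting off on this server: remove the session configurations,
# then stop and disable the underlying WinRM service so the listener
# goes away entirely.
Disable-PSRemoting -Force
Stop-Service WinRM
Set-Service WinRM -StartupType Disabled

# For domain computers, the same "off by default" result comes from
# Group Policy, under:
#   Computer Configuration > Administrative Templates >
#   Windows Components > Windows Remote Management (WinRM) > WinRM Service
```

Bake those lines into a server master image, or deploy the policy, and you have your "off by default" environment regardless of what Microsoft ships.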
Understand that what follows is conjecture, but it’s conjecture based on more than 20 years of watching this company, and on a pretty good working relationship with many of the company’s technology leaders. This also isn’t intended to make you feel that “on by default” is the right answer for you, nor is it intended to convince you that “on by default” is the right answer for anyone. This is an attempt to speculate about the reasons behind “on by default,” whether the decision itself was correct or not.
The short reason is, “Nano Server.”
If you just nodded and went, “yeah, that would explain their thinking,” then you can skip the rest of this. Keep in mind that Remoting isn’t turned on by default for client computers, which pretty much reinforces the Nano Server reason.
The very long answer is that Microsoft, these days, is building first for themselves. Specifically, for Azure. They believe – and again, you’re free to disagree and I’m not pitching their belief as gospel – that enterprises should manage their datacenter in much the same way Microsoft manages Azure. Microsoft’s argument for this revolves around efficiency, primarily, and specifically efficiency at scale. Reliability factors into the argument, too. So Microsoft’s decisions have to be examined in light of what works in “the cloud,” because that’s how they expect you’re going to be managing your own servers in the future.
Microsoft has been on a long path, since 2008, of breaking down the monolithic Windows Server product into a discrete set of chunks that can be turned on or off at will. We saw that first with the big refactoring of the product into Roles & Features, which could be installed or uninstalled pretty easily. We also saw them ripping out the GUI bits to create the first Server Core. Over the next 5 years, the company refactored Server more and more, through a series of three releases culminating in Windows Server 2012 R2. In that time, Server Core became more and more functional, as more and more of Windows Server was refactored into standalone little bits, and separated from the “GUI stuff.”
Microsoft’s direction here has never been a big secret: they want to ship a fully-functioning version of Windows Server that doesn’t have any… er… windows. They want it, in other words, to be a server, not a client that just happens to have a lot of RAM installed.
Once you kind of buy into the “no GUI on the server” idea, even if just for the sake of discussion, it’s not a far step to “no logging into the server at all, in any way.” Headless servers, in other words, where the host hardware might not even contain video output hardware. After all, if there’s no GUI, then you can by definition do everything via text, which is very easy to transmit over a network. Ask Unix, which has been doing it for decades over Telnet and SSH. If you can do everything remotely, why even support a local login?
Well, that’s Nano Server, an installation option in the version of Windows Server that is expected to ship in 2016.
And if you can’t log on locally at all, then you need some way of connecting to the server to initially configure it. Which is why Remoting is enabled by default, even though little else is. You use your existing OSD infrastructure to deploy new Nano servers, and then you Remote into them to set them up as needed. Unlike most Linux distributions, which allow local login, Nano isn’t even going to provide a means to log into “the console.” At least, as far as we currently know, it won’t; Microsoft’s only made a few statements about it so far.
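That initial-configuration workflow looks roughly like this. It's a sketch, not a prescription: the server names are placeholders, and the script block is just an example of the kind of one-time setup you'd push to a freshly deployed headless box.

```powershell
# Hypothetical first touch of newly deployed headless servers.
# "nano01" and "nano02" are placeholder names from your OSD deployment.
$cred = Get-Credential

# Interactive, one server at a time:
Enter-PSSession -ComputerName nano01 -Credential $cred

# Or non-interactively, across many servers at once:
Invoke-Command -ComputerName nano01, nano02 -Credential $cred -ScriptBlock {
    # whatever per-server setup you need; for example, just confirm identity
    $env:COMPUTERNAME
}
```

The point is that Remoting is the front door. There is no back door, because there's no "console" to walk up to.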
Windows Server’s architect, Jeffrey Snover, put it fairly concisely in the Twitter discussion: “We believe in a world of headless remote mgmt as the norm.” Headless meaning no way to log in locally, no such thing. Ergo, you need some way to log in remotely, and Remoting is it, and it therefore is enabled by default.
Now, in defense of this “on by default” approach, I’ll point out that unlike nearly every preceding remote management protocol introduced by Microsoft, Remoting is incredibly controllable. It uses WS-Management (WS-MAN), which is HTTP-based. It runs on just one incoming port, which is easy to lock down through physical, software, and virtual firewalls. You can certainly have an environment that’s pre-engineered to protect that port. But if you buy into Microsoft’s “headless” approach – and whether you do or not, Microsoft certainly buys in – then they had to enable something so you could configure the server, at least initially.
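For reference, that "just one incoming port" is 5985 for WS-MAN over HTTP and 5986 for HTTPS. Here's a sketch of scoping it down to a management subnet with the built-in firewall cmdlets; the rule display name is the stock one on recent versions of Windows Server, and the subnet is obviously a placeholder for your own.

```powershell
# Restrict inbound Remoting (WinRM over HTTP, port 5985) so that only
# hosts on an assumed management subnet can reach it. The display name
# below is the stock inbound rule name; adjust for your locale/build.
Set-NetFirewallRule -DisplayName "Windows Remote Management (HTTP-In)" `
    -RemoteAddress 10.0.10.0/24
```

One port, one rule, one subnet. Compare that to the RPC port ranges older management protocols dragged around, and "controllable" starts to look like an understatement.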
So whether you agree with this direction or not is entirely up to you – and you’re welcome to add your polite, professional comments to this post. I wanted to write this in an attempt to explain, not justify, why I think Microsoft took this approach, and what I think it means for the long term of Windows Server itself. I think simply knowing that direction can inform a lot of your base infrastructure decisions and planning going forward, whether you buy into the approach or not.