DSC: Must-Have or Just Nice-To-Have?

On a recent PowerScripting Podcast episode, I went off on a bit of a career-oriented rant, and amongst other things mentioned something to the effect of, "if you're not learning DSC now, you're screwed." It hopefully goes without saying that my comment applies to folks working in environments that use Microsoft server products; obviously, in an all-Linux shop you're pretty safe not knowing Microsoft's technologies :).

Some discussion ensued on Twitter - a place I hate for discussions, because 140 characters leaves you enough room to be misunderstood and paraphrased, but not enough room to articulate your perspective. I wanted to follow up on the rant a bit, and by doing so here, hopefully engender a more detailed discussion.

One comment - and this is a nice, succinct one to start with: "Is it a useful tool? Yes; is it the tool that makes or breaks a sysadmin? No." Couldn't disagree more. Maybe it won't make or break you today, but in a few years - absolutely. Unless you're stuck in a company that's going to just run Win2008 forever. So if it's going to be an inevitable part of your future, then you are, in fact, more and more screwed the longer you ignore it. It's like the poor NetWare guys who ignored TCP/IP. They were screwed, in the end, and had to hustle to catch up. I hate playing catch-up; in my mind "screwed" is what you are whenever you're playing "catch up." So maybe knowing my definition of "screwed" will help the discussion a bit!

Another comment - and a good one - was, "[PowerShell] is a must... but I live in a multi-platform world where it is just a part, not a definer, of the whole." Excellent point, but if you must manage Microsoft technologies, then DSC is going to be a part of your life. Perhaps it'll be DSC "as managed by ___" cross-platform solution, but DSC is going to be the underlying API. If you're comfortable being insulated from underlying APIs by tools, fine - but you'll never be as effective as you would be if you knew those tools. Point being, in a multi-platform environment, DSC is not all you need to know, but you must know it (or begin to) if that environment includes Microsoft server products. Could you manage your Microsoft elements without using DSC? Sure. You can also drive a car using mind control, I'm told, but it's not the most effective way of doing so. Folks are quite welcome to disagree, but I do firmly believe that any environment would benefit from DSC. Time will tell if I'm right or wrong there, but personally - and this is very much a "this is how I proceed with my life" thing - I would rather be on the forefront of something than turn around in 5 years and realize I should have been there.

Keep in mind that, 6 years ago, folks felt free to ignore PowerShell. Many now wish they hadn't. It was a lot easier to get into PowerShell in v1, and then "keep up" with new versions, than to dive in now.

Why do I think DSC will be the same? Because DSC is the ultimate outcome of PowerShell. DSC is what PowerShell has been building toward. I think this is perhaps a perspective that other folks don't share. To them, DSC is "just a tool." It isn't doing anything they couldn't have done all along.

But understand something about DSC: This is something Snover planned almost a decade ago. It was the ultimate destination of his "Monad Manifesto." DSC is exactly what PowerShell has been building up to. DSC is the main reason, in many ways, for PowerShell. If you really think about it, DSC removes much of the need for you to learn PowerShell. 

That's a bold statement. Let me explain.

There's no question that PowerShell can be difficult to learn. It's programming, and not everyone has an aptitude for that. There are literally thousands of commands, and that's just from Microsoft product teams. It's a huge product; like any language, it has idiosyncrasies, and you can come at it from a half-dozen different directions. Writing scripts that configure or re-configure computers, or even that report on current configurations, can be complex. Yes, they're faster than doing it manually - but it's not zero effort.

DSC abstracts all of that. To create a DSC configuration, you don't need to know how to program, yet you can potentially leverage all the PowerShell investment Microsoft has been making. You can use PowerShell, and all it can do, without having to really touch much of PowerShell. Sure, there's a middle layer of folks writing DSC resources (which use PowerShell commands as their API), but that's going to be a small subset of folks. A highly-paid subset, I suspect.
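To make that concrete, here's a minimal configuration sketch - the names `WebServerConfig` and `SERVER01` are placeholders for illustration. Note how it reads as a declaration of desired state, not as a script:

```powershell
# A minimal DSC configuration: declarative, no scripting logic required.
# "WebServerConfig" and "SERVER01" are placeholder names.
Configuration WebServerConfig {
    Node 'SERVER01' {
        # Declare that the Web-Server feature must be present; the
        # WindowsFeature resource figures out *how* to make that true.
        WindowsFeature IIS {
            Ensure = 'Present'
            Name   = 'Web-Server'
        }
    }
}

# Running the configuration compiles a MOF file (SERVER01.mof) into
# the .\WebServerConfig folder; it doesn't touch the server yet.
WebServerConfig -OutputPath .\WebServerConfig
```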

If Microsoft had had infinite time, money, and people, they'd have just given us DSC and not mentioned PowerShell at all. PowerShell v1, v2, and v3 were building blocks toward what DSC gives us. DSC was the point, all along. We're just seeing the tippy top of that, now. There's a glacier underneath.

Now, you may be thinking, "bullshit. I can't use DSC to do everything that my job involves, even if I just think about my Microsoft assets." True. Today. But folks, you need to have a little vision. We're dealing with DSC 1.0. Kindergarten DSC. Literally, what you're seeing now is the simplest possible expression of something that the world's largest software company took seven years to deliver. Seven years. Most of Microsoft's PowerShell investment, going forward, is going to be in DSC - I guarantee it. They've done the lower-level building blocks already.

"Can I use DSC to generate configuration reports?" Maybe not today. But have you noticed that a DSC pull server can have a "compliance server" component? Have you looked at its service definition? It's basically a way for servers to report in on the state of their configuration compliance. That's reporting. And that's my point: DSC has a super long way to go. It is going to be everything for an administrator - and that's going to happen fast. Looking at DSC today, that may be tough to imagine. So was PowerShell, in 2006.

And we haven't even seen the tooling that will be layered on top of DSC yet, because it's all so new. The tool where you click a Wizard to add a user... and the tool goes and rewrites four dozen server configuration files, causing the user to exist in AD, in your accounting system, as a home directory on a file server, and so on. Yeah, that'll all happen. Eventually, you won't touch servers anymore - you'll touch their configuration files, and they'll reconfigure themselves appropriately. That's why this is such a big deal. It's not a tool. It's the administrative interface.

So when I say, "if you're not learning DSC right now, you're screwed," it's because I personally believe that to be true. My experience in the industry and my familiarity with how Microsoft pursues these things informs that opinion. You are going to fall behind the curve so fast you won't even realize it's a curve anymore. Today, people look at Infant DSC and see a basic configuration tool. I see Teenager DSC, and Young Adult DSC, coming around the corner, and they are going to absolutely change the way you are required to manage Microsoft products. Yeah, I personally want to be on board with that right now.

"What about a small shop? Isn't DSC meant for large scale?" No, large enterprises just have the most obvious advantage from DSC. It's less obvious to small shops.

You know how Exchange 2007 really impressed everyone, because the GUI was just running PowerShell under the hood? That meant a small shop could still get the GUI, but you could always drop down to PowerShell when you needed to. It also meant that not everything went into the GUI, and sometimes you had to drop into PowerShell anyway. I predict DSC will do the same thing. GUIs won't run PowerShell commands anymore - they'll modify DSC configurations. Those configurations will then be implemented on the affected servers. Your cross-platform management tools? If they're smart, they'll be doing the same thing.

Think about that. DSC isn't going to be "just a tool." It's going to be the entire interface by which you interact with Microsoft server products. It's as important as the mouse or the keyboard. I truly think people aren't seeing the end-game when it comes to this technology.

You know those admins who only know what the GUI shows them? They don't know much about what's happening underneath, and as a result, they're not very good at planning, architecture, troubleshooting, or anything else that requires a deeper knowledge. That's where you stand with DSC. You either ride that bus, or get run over by it. Eventually.

Do you want to risk not knowing this thing? You might. Perhaps in your job position you know it's not going to affect you. For me, I won't risk it. So that's where my perspective comes from. In my world, this thing is a must-have. And yes, that's an enterprise-class world, with large, cross-platform environments. But it's also a perspective from my experience in SMB - I'd have killed for DSC, given the minuscule budgets and staff I worked with in those environments, and given my colleagues' distaste for scripting.

Anyway, that's how I feel about it - in more detail than 140 characters allowed ;). If you have a different perspective, please feel free to share it. I can't promise that you'll change my mind (and I'm not really out to change yours), but it's good for the world in general to see different perspectives, so that folks can make informed decisions about their own career directions.

About the Author

Don Jones

Don Jones is a Windows PowerShell MVP, author of several Windows PowerShell books (and other IT books), Co-founder and President/CEO of PowerShell.org, PowerShell columnist for Microsoft TechNet Magazine, PowerShell educator, and designer/author of several Windows PowerShell courses (including Microsoft's). Power to the shell!

13 Comments

  1. Right or wrong, your comment certainly got my attention! I hadn't gotten around to playing with DSC much, yet, since we're quite a way from being able to use it in the retail space (which is my team's focus at my day job.) I've been reading up on it in my spare time over the last couple of weeks, since the podcast was recorded, and will play around with it in a lab once I build a few more VMs. I'll also be attending all three of Steven Murawski's sessions at the PowerShell Summit.

    Beyond that, time will tell. Writing resources and configuration files looks pretty straightforward for those of us who are already used to writing PowerShell scripts and modules, but the admin side of using those configurations - generating MOF files, configuring pull clients, etc. - still feels a little bit clunky. I assume that's where higher-level management interfaces will come into play, at some point.

    • I think so. The team seems to have a desire to stay away from AD as a pre-req for DSC, but you could EASILY see them adding an SRV record to DNS listing pull servers... thereby centralizing that bit, which is currently awkward. Actually creating MOFs? Easy - you just run the configuration script. Need a command to copy them to the pull server and generate checksums, but that's like a 3-line function. So... yeah.
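      A sketch of that kind of hypothetical helper, under the v1 conventions (the pull server folder shown is the default from the v1 docs, and may differ in your environment; New-DSCCheckSum ships with PowerShell 4.0):

```powershell
# Hypothetical helper: publish compiled MOFs to a pull server.
# $Path is wherever your configuration dumped its MOF files; the
# destination shown is the v1 default pull server folder.
function Publish-DscMof {
    param([string]$Path)
    $pullShare = "$env:ProgramFiles\WindowsPowerShell\DscService\Configuration"
    Copy-Item -Path (Join-Path $Path '*.mof') -Destination $pullShare
    # Generate the .mof.checksum files pull clients use to detect changes
    New-DSCCheckSum -ConfigurationPath $pullShare -Force
}
```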

      • I know, I was just referring to the steps required to on-board a new server into a DSC pull environment:

        Generate a GUID for the new server.
        Compile MOF files and checksums named with that GUID on the DSC pull server.
        Configure the LCM on the new server.

        While that's not exactly difficult, it's still multiple steps with potential for human error. Ideally, when building a new server, the LCM would discover a pull server automatically (or have it assigned via sysprep or something), assign itself a GUID, and pop up in an admin interface somewhere. In that admin interface, you select which configuration to apply to the new server, it generates the MOF / checksum, and you're done in one step.
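        For reference, the per-server step in question is a meta-configuration applied to the Local Configuration Manager; a sketch along v1 lines, where the GUID and pull server URL are placeholders:

```powershell
# Meta-configuration sketch: point a server's LCM at a pull server.
# The ConfigurationID GUID and the ServerUrl are placeholder values.
Configuration PullClient {
    Node 'SERVER01' {
        LocalConfigurationManager {
            ConfigurationID           = '3a15d863-bd25-4d94-8d71-1b4d0b5a0f9e'
            RefreshMode               = 'Pull'
            DownloadManagerName       = 'WebDownloadManager'
            DownloadManagerCustomData = @{
                ServerUrl = 'https://pull.company.pri:8080/PSDSCPullServer.svc'
            }
        }
    }
}

PullClient -OutputPath .\PullClient
# Applies the generated meta.mof to the target server's LCM
Set-DscLocalConfigurationManager -Path .\PullClient
```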

        • Yeah, it's a wee bit much. I think they should (and they may) modify the configuration to also produce a checksum file, and add a common parameter so that you can run the configuration and have it dump the files into your pull server folder, if desired. Actually, that latter bit may be in there. Right now, I just have a function that I run - I give it a configuration name and it takes care of it all.

          Totally agree on the pull server discovery, and I've mentioned it. You should suggest it on Connect, in fact, and let us all vote it up. DNS would be the way to do that. Defaulting to a known identifier like the MAC address (I know, not a GUID, but unique-ish) would be a possibility, I suppose. But yeah, something. I think the reason there isn't one is because in the MS vision, you're only deploying VMs, you're using SCVMM, and it already knows how to inject a MOF.

          • Honestly, any production environment should have a workflow/runbook/build server that does the configuration generation. Having individual admins generating configs leaves some major holes in the workflow (like making sure the required resources are available in the appropriate zipped format on the pull server). Steps are easy when you have them scripted or don't have to do them.

            In my environment, a production build just requires a check-in to our DSC repo. You don't have to run any other commands than that. If I want to test the build process (the same one the build servers use to generate configs), I have a command that runs that same process on my local workstation.

            Having the discrete steps is actually good, as you have interception points for testing and logging. If it were one in-box command that did it all, it would be much tougher to debug failures (and trust me, debugging DSC is tough enough).

            Steve

  2. You expressed my argument perfectly about the quote "If you are not actively figuring out DSC, right NOW, you're screwed" in https://twitter.com/StevenMurawski/status/440553912781643776. Learning and mastering DSC right now is not critical, but it will be of great importance in the future once it evolves and progresses. Even in a Windows-only environment, the sheer mix of Windows versions - and some of the incompatibilities of PowerShell 4.0 with certain server products - make it less than the preferred solution right now, and therefore a non-critical technology to master at this moment. I do agree completely that PowerShell is a must for any sysadmin, operations, security, and incident response team. The reason PowerShell is a must right now is in great part because it is simply the best shell all around, with the added power of being on Windows by default and a cornerstone of server product management. As DSC matures it will become more important, as systems are upgraded and DSC becomes more integrated into products like SCCM. It will be of great value to know, but not a deal breaker in a world where being a specialist is in less demand. I also do not see knowing DSC in the future as an acceptable substitute for learning PowerShell: given how deeply PowerShell has been integrated, and how it grows at such scale with each version, third-party product, and cloud offering, it will be at the forefront of skills to know and master for a long time. One particular area is CIM, especially as Microsoft's CIM involvement keeps spreading into the multi-platform, multi-solution ecosystems of many businesses.

    We agree that it may be of limited value in many environments right now, that many years down the road it will be of great value and importance, and that PowerShell is a must, not an option, for any IT professional right now - but on the argument that it has to be learned and mastered right now, I agree to disagree. Keep doing the great writing you have been doing; many of the ideas you bring up are ideas that need talking about, and many are ignored by the IT community. I back and agree with most of them.

    • That's why I wanted to write this long-form and not on Twitter ;). I think our only point of disagreement is on the "now" part. As I said, for me, when something this important comes along, I feel the longer I wait the further behind I am. When suddenly I wake up one day and it's absolutely the only option, and I'm not up to speed, I'm screwed. I think a lot of people in the MS space get into that rut very easily, which is why I phrased my perspective the way I did. Are you going to get FIRED because you aren't a DSC expert by the end of this week? No. But I didn't say "fired," I said "screwed." DSC is evolving, and gaining coverage, faster than anything else MS has done in the past - so the distance between "don't know anything about it" and "fired" is going to be a lot shorter than similar things in the past.

      Your argument seems to be "I'll have time to catch up when I need to," and if that's the case, I don't disagree with you. But I wasn't making my argument about you in particular; I was making it based on my broader observations of the Microsoft IT pro in general - where I see a tendency to let things go FAR too long.

      So I think some of your disagreement with me might simply be a matter of language. You're reading my words and getting a different meaning from them, because you're interpreting them through your own personal experience. I was trying to make a very strong statement to a general audience. I feel I have to, because it's an audience that often doesn't react appropriately unless there IS a strong statement.

      That said, my own personal opinion is that if you are not LEARNING this technology TODAY, then you are not setting yourself up for success. I did not say USING... I said LEARNING. It WILL become massively important, and "setting yourself up for success" means proactively preparing yourself for that day. Again... that's my personal opinion, and I've tried to qualify it as such as much as possible both here and in the article.

      Definitely appreciate your (and everyone's) attention to DSC. Just the fact that people are thinking about it is heartening.

  3. Interesting article, but I think DSC is merely a copycat of what has already matured elsewhere in a tool like Puppet. I went to PowerShell Summit last year and was surprised when I gave a lightning-round demo. I had created a slew of Windows providers that incorporate PowerShell, building on great open source projects like Chocolatey and Boe Prox's WSUS PowerShell module. Other folks there were clearly behind where I was in their thinking, and I am really just a Windows guy messing with some Ruby, Puppet, and PowerShell.

    Now the curve is accelerating, and config management is becoming less and less heavy. Docker is a game changer for containerization, and it might be the straw that breaks the camel's back. Nothing on Windows can compare. Something has to change in Redmond, and I hope that with the new CEO they can make Midori, or something more sound, replace the NT kernel and put MSFT back on the map as being meaningful.

    In short, treat your servers like cattle that can be shot, not pets needing coddling.

    • I think it's less a question of DSC being a copycat. At some level, most of technology is copied or adapted from something else; what's important is that DSC gives us, on Windows, what has existed elsewhere for other operating systems for some time. I don't want this to be a "Windows vs." argument; the discussion is on the importance of DSC for Windows people. I'm not saying your argument isn't valid, just that it isn't the topic of discussion right here in this thread.

      • The importance of DSC for Windows people should be called into question. Computing in a Windows-only ecosystem is becoming a thing of the past, and building a DSC library just doesn't scale. And why start from the ground up? Puppet (as an example) runs fantastically on Windows across thousands of nodes, for free, with no vendor lock-in. Yes, you might need one Linux server to host it, but then you immediately reap the rewards of all those people doing great things with it. I am not implying DSC is bad; it's just so far behind that you should ask yourself why you'd put a line in the sand so far back. The end game is here, as you say - just fully available today in Puppet, rather than a catch-up by MS. Network vendors are talking about it and, more importantly, actually integrating it into their stacks. VMware is managing vSphere with it. Cloud companies encourage CM with Puppet, Chef, and the like.

        So in summary, IMHO: if you are managing Windows, you are screwed if you don't know PowerShell. If you don't know DSC, that's fine - there are better tools out there. BUT BUT BUT - if you don't know how to define the state of your infrastructure with code, you are screwed. Idempotence is an overdue lesson for many scripters.

        • I think you may have some misconception that it's "DSC vs. Puppet." It isn't. DSC is mainly an API, which something like Puppet could easily leverage (MS has demo'd it, for example). Puppet using DSC would mean faster dev times for the Puppet fans, with a great toolset from the Puppet side. The whole POINT of DSC is that it's been built to work under those other tools.

          And see, like many other folks, you're mis-stating what I've been saying. I didn't say you're screwed if you're not USING DSC. I said you're screwed if you're not LEARNING it. What it actually is. What it actually does. How it works with existing tools and how it plays in a cross-platform environment. Because if you go gung-ho down the "Puppet only" path, you're going to - in the long run - end up wasting a lot of your own time. Or Chef. Or Ansible. Or whatever your tool of choice is.

          That's why MS couldn't just buy into Puppet. Some people use Chef; some people use other tools. DSC - as an API, not as a tool - was designed to work underneath any of them, giving those tools easier access into the Windows stack. I don't see why that's bad. But if you don't LEARN what DSC actually is, then you'd never know that, I suppose.

          I don't disagree with your overall points at all. I just don't think you're thinking through what DSC actually is, big-picture-wise. If all it was was a "competitor" to Puppet, I'd probably agree with you.

          • If DSC is meant to be an API for other tools to interface with, where are the language-agnostic hooks outside of the .NET ecosystem? How does one "reach" the API without calling powershell.exe - say, executing DSC inline, in-process, or via runspaces - from a language like Python, Ruby, or Go, or any language that is not exclusively rooted in the MS stack?

            I just don't see how I am wasting time in the long run by automating all the things using Puppet, with PowerShell built into the providers and modules. I feel the speed at which I am innovating is faster than what is happening elsewhere on the Windows CM scene. DSC was released after the 2013 Summit, yet I had my infrastructure fully represented as code before the product shipped.

            What value do I get now from converting the underpinnings to DSC? Idempotence? A directed acyclic graph? Cross-platform support? If I am defining my infrastructure as code, I want to use the most generalized description that covers all the things.

            What I mean by that: if I declare that a file must be present at path foo, the language I express that in should be common across all the platforms one could manage - from Ubuntu to a Mac to a Windows PC to a firewall. The plugin/provider model, based on, say, Ruby's cross-platform language capabilities, affords that with a common entry point across all the things. Puppet, more so than Chef, simplifies that with its DSL.

            Inside a DevOps-cultured, heterogeneous (Windows and Linux, for example) enterprise, specific tools will be selected on openness, API, cross-platform support, and other criteria. At the present time, DSC seems to home in on newer PowerShell-driven Windows servers, i.e. 2012 R2. This model of building systems is at direct odds with a procedural approach, where the state of the system is ignored in favor of a recipe of steps. So System Center is bloatware at this point, because "all the things" aren't infrastructure as code. Microsoft's vision is cloudy.

            As a side note, where I would have expected innovation from MS is a package manager. Instead we have to rely on the goodness of Chocolatey, since package management is a disaster on Windows.

            I will check out your book, and I do look forward to seeing you at the next conference 🙂 Fun topic for sure!

  4. I agree that DSC is worth learning. But I think the whole write-up assumes that DSC will be Microsoft's tool of choice for the future.
    Active Directory GPOs are bound to AD, so they are limited and will become obsolete over time.
    Jeffrey Snover said in his presentation that System Center is going to do DSC in the future - so welcome to the club....
    I don't know Windows Intune well, but what about it - will it die, or join the DSC club?
    I think we can only know whether we are screwed once we see DSC version 2 (and we know who wins the configuration-tool race at Microsoft ;-)) ).

    any production environment should have a workflow/runbook/build server that does the configuration generation.

    I think that's the problem with this young version of DSC: you have to maintain and adjust the whole configuration production chain yourself right now, and that makes this version of DSC very fragile (in my point of view).
    A chain only stays in line if you pull it correctly; if you push the chain, your work is a mess.
    I currently need DSC for Windows clients, and I am still searching for a way to handle configuration for a large number of clients.
    This version of DSC, in my view, is only good for a manageable number of servers.
    I don't see at the moment how DSC is designed for the masses, or for reproducible history steps.
    And as long as I cannot see those design goals or that vision, DSC feels to me like a deficient product.
    1. DSC and the masses?
    Currently a node is bound to a static MOF file - a hard 1:1 relationship. This bothers me!
    Yes, you can create a repository pool of node-less configurations for any kind of configuration, and import them with "Import-DscResource" into the configurations that are bound to the nodes.
    So for me, these node-less configurations are the first part of the chain.
    Then you use the node-less configurations in one or more configurations for one or more nodes.
    Then you "compile" MOF files, one for each node, and put them on your pull server.

    So what if you change one node-less configuration at the first part of the chain? You have to recompile everything in the chain where that configuration was used, to deploy the change correctly!
    This is hard to handle at the moment!

    Currently the LCM only pulls the MOF and the resources.
    Wouldn't it be better if the LCM triggered the PowerShell configuration file for the node, so that the MOF is produced on the fly and the freshly produced MOF is pulled?
    That would pull the newest changes from the first part of the chain via the "Import-DscResource" command.
    So you would always have a freshly "compiled" MOF containing the newest changes, and no static, outdated MOF!
    With that, you wouldn't have to watch your MOF production chain so closely, and you'd have solved the 1:1 problem!
    All parts of the chain are pulled, and the chain stays in line 😉
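    For what it's worth, some of the 1:1 pain can be eased today with configuration data: one configuration script, compiled against a node list, produces one MOF per node. A sketch (node names are placeholders):

```powershell
# One configuration, many nodes: $AllNodes comes from the hashtable
# passed via -ConfigurationData, so adding a server means editing
# data, not the configuration itself.
$data = @{
    AllNodes = @(
        @{ NodeName = 'SERVER01' },
        @{ NodeName = 'SERVER02' }
    )
}

Configuration BaseConfig {
    Node $AllNodes.NodeName {
        WindowsFeature IIS {
            Ensure = 'Present'
            Name   = 'Web-Server'
        }
    }
}

# Compiles SERVER01.mof and SERVER02.mof - still static files, as
# noted above, but generated from a single source.
BaseConfig -ConfigurationData $data -OutputPath .\BaseConfig
```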

    2. We need reproducible history steps (configuration logging)
    If you cannot reproduce the configuration steps a machine has gone through, that machine has a tattoo you don't know about, and you cannot transfer its state to other machines!
    So I think we need a configuration that is the cumulative result of all the configurations made on a machine over its history. That would be the finest 😉

    Hope my English was well articulated and my ideas too!

    Peter Kriegel
    German speaking PowerShell community
    http://www.PowerShell-Group.eu