Author Archives: Don Jones

About Don Jones

Don Jones is a Windows PowerShell MVP, author of several Windows PowerShell books (and other IT books), Co-founder and President/CEO of, PowerShell columnist for Microsoft TechNet Magazine, PowerShell educator, and designer/author of several Windows PowerShell courses (including Microsoft's). Power to the shell!

[UPDATED] Review: SAPIEN VersionRecall

I recently played around with SAPIEN’s VersionRecall, and thought I’d share a bit about the experience. As a note, SAPIEN provided me with a license key to use. VersionRecall is advertised as a simple, single-user version control system “for the rest of us.” There are no servers, no databases, and nothing complex, according to the marketing copy.

Setup is quick – a 3-screen wizard and you’re done. Installation took under a minute. When you first launch the product, it attempts to find all the places on your computer where you might store scripts, so that it can connect those to a version-control repository. You can skip that bit, but it only took a few moments on my virtual machine. It found my DSC scripts, my PowerShell modules, and several other places I’d dropped scripts. You then indicate where you’d like your version-control repository – this is where old versions of files will be saved. You can also pick a certificate, to have the software automatically sign scripts each time you make a new version. That’s a subtle and very cool feature – and it’s a way to make AllSigned a more convenient execution policy.
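That signing feature is worth a quick illustration. Here's roughly what VersionRecall is automating on each check-in, sketched by hand; the script path is hypothetical, and this assumes you already have a code-signing certificate in your personal store:

```powershell
# Grab an installed code-signing certificate (assumes one exists)
$cert = Get-ChildItem Cert:\CurrentUser\My -CodeSigningCert |
        Select-Object -First 1

# Sign the script so it's trusted under AllSigned
Set-AuthenticodeSignature -FilePath C:\Scripts\Deploy.ps1 -Certificate $cert

# Require valid signatures on every script you run
Set-ExecutionPolicy AllSigned -Scope CurrentUser
```

With the product re-signing files automatically on every new version, the usual AllSigned pain point – re-signing after every edit – goes away.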

I selected an option to have my version control repository updated every day at 4:30pm. That seems to let the software capture a snapshot of any changed files at that time every day; it was clear that you could also manually submit an update to the repository using VersionRecall or Windows’ own File Explorer.

From there, you’re in an Explorer-like view. It includes a tab for each folder where you store scripts. I find that I like that approach a lot – I tend to organize my scripts that way. I’ve got my modules in one spot, some sample scripts in another, stuff I’m playing with in a third, and so on – so the tabbed approach fits my organizational style. You can open files for editing right there. I don’t have PrimalScript installed on this test machine, but files opened in the ISE just fine. Ribbon buttons let you open the shell, the ISE, or SAPIEN’s PrimalScript or PowerShell Studio products.



Here’s how this works: You have to manually submit changed files to the version-control repository, or wait for the daily check-in (remember, I set mine to 4:30pm). This doesn’t magically capture changes throughout the day. But, you can always manually submit an update if you’ve been making significant edits. That’s how most “big boy” source control systems work – only they don’t usually have an automatic daily check-in as a backup plan. VersionRecall does.

You can always compare the current version against a repository version – and it’s a very slick comparison view.



Once you’ve checked in a few versions, you can easily see the complete list, quickly see what each file contains, and either restore a previous version or copy it to a different location. You can also compare two versions to see what’s different.



Notably, VersionRecall doesn’t stick your files into a database or some proprietary storage. Your checked-in files stay files, in their original formats. That means, if you ever need to do so, you can simply go to the folder where VersionRecall’s repository is, and grab the files yourself. It also means files can be indexed by Windows (for filetypes where it does that), found by Windows search, and so on.

Unfortunately, PowerShell Studio doesn’t seem to recognize VersionRecall as a source control provider (at least, it didn’t show up when I tried to configure source control in PowerShell Studio). That means you can’t use the integrated check-in/out controls in PowerShell Studio. Instead, you end up opening files from VersionRecall’s Explorer, saving them in PowerShell Studio, and then submitting them to the repository back in VersionRecall. That’s a shame; automatic check-in/out in PowerShell Studio would make it all a bit simpler.

VersionRecall uses a “Modern” user interface scheme for the most part. Its ribbon is pretty clean and well-organized, and the icons were meaningful. As with most recent SAPIEN products, you can change the theme to one of almost a dozen different styles, so you should be able to find something you like. Icons remain the same either way; all you’re changing is the “chrome” of the UI.

Not much else to say. For a product that bills itself as simple and easy, VersionRecall certainly delivers. It does one thing, and it does it pretty well. It’s definitely easy – and there’s less excuse than ever for not using some kind of version control for your scripts. A FAQ on SAPIEN’s blog answers questions like why VersionRecall doesn’t check in files automagically each time they change, how it compares to something like Git, and more.

[Update: I've removed the section on the license key and activation; SAPIEN's Alex Riedel pointed out that I had some factual errors, because my observations were based on my use of a "real" license key that was issued for my particular use, not a "trial" key. I admit that I find software licensing uninteresting, and none of it has any impact on the usefulness of the software, which is what the article was meant to cover.]

VersionRecall sells for $179 as a standalone product, which includes a year of updates. I think that price might be a bit high, given what the product does. I expect, however, that most people are getting VersionRecall as part of a SAPIEN software bundle. For $789, for example, you get everything they make. For me, the perfect combo is PowerShell Studio and VersionRecall, which retails for $568.


PowerShell Summit N.A. 2014 – Budget

As part of our commitment to being a transparent, community-owned organization, I wanted to share the basic budget for the upcoming Summit. Now that registration is cut off, we have most of our final numbers. Keep in mind that, at live events, things “on the ground” can change quickly – so these are, at present, only our expectations “going in.”

  • $113,833.51 in net registration fees. This is after paying credit card transaction fees.
  • -$398.00 for event insurance (already paid)
  • -$76,466.04 for the venue, which includes A/V, F&B, room rental, etc. (already paid)
  • -$9,335.01 for speaker lodging (hotel)
  • -$3,000 for professional event management (including travel for the event manager)
  • -$1,490 for our registration web site (already paid)
  • -$1,710.51 for deposit on the European Summit
  • -$7,500 for speaker reimbursement

That last number is presently the big question; we have some speakers who paid for their registration, and we need to reimburse them. That’s probably about $4,000. We have another $2,500 in promised travel offset fees to speakers doing 3 sessions. We’re trying to reimburse additional travel expenses for other speakers so they’re not totally out of pocket; the final number may be more than $7,500.

Right now, that puts us at an event profit of roughly $13,933.95. Again, some of that may end up going to additional speaker reimbursement; the rest will help fund ongoing activities (like Azure hosting and so forth; I’ll share a full annual operating budget in June, but it’s about $17,000 per year). We have about $20k in payments coming up for the European Summit.

We have approximately $92,000 on-hand; much of that will go to the expenses above that are still pending. We should end April with around $65,000 on-hand – a lot of that comes from earning back a $40,000 pre-payment for the N.A. Summit that we made in fiscal 2013-2014. We’ll use some of that $65k to cover the remaining $20k fees on the European Summit; the rest of our cash-on-hand will help provide deposits for the 2015 N.A. Summit, and to fund ongoing operations for 2014-2015. We’re in good financial shape – we’re making a bit more than we need, but not very much – which is right where we want to be.

The good news is that, between the Summits and our generous corporate sponsors, we’re on track to actually fund the $17k wish-list budget we’ve put together (which we’re still researching and tweaking; as stated, I’ll share the full thing in June). That means we’ll be able to start spinning up services like the VERIFIED EFFECTIVE program, monthly TechSession webinars, and so on.

PowerShell Summit NA 2014 Shirts Available

If you’re attending PowerShell Summit NA 2014 (or wish you were), we have some new logo items for purchase, including a baseball jersey and a polo shirt! Buy ‘em now and wear ‘em to the Summit. Visit our Zazzle store to buy (or the Canadian store, to save a bit on shipping if you live up there).

Note that the items may take about 24 hours to become visible, so check on April 15th in the afternoon if you don’t see them immediately.

See you at the Summit!

Summit Session Change

Paul Higinbotham’s session on threading in PowerShell has been changed, because his content would have overlapped with other sessions. Instead, Paul will be presenting:

PowerShell Debugging Enhancements
A number of script debugging enhancements were added to PowerShell 4.0 and the WMF 5.0 preview release. In this talk I will discuss these new debugging features and demonstrate how they work. This will include the new support for remote debugging, debugging workflow scripts, debugging PowerShell jobs, ISE enhancements for remote debugging, and the new “Break All” command.

We’ll update the schedule grid and abstract document.

Massive Update to All Seven Free eBooks at

We’ve just finished a massive re-do of all 7 free ebooks.

First, they’re now hosted in a public OneDrive folder. This means you can quickly and easily view them online, download a DOCX, or download a PDF. Anytime, anywhere.

Second, we’ve had folks go through and make the formatting more consistent, using a more modern font and somewhat “airier” spacing. Hopefully that translates to “nicer to read.” All the original code is also accessible, and available for one-click downloading. Note that clicking a .PS1 file may open it for viewing; you need to checkmark the file to download it.

Uploads are now proceeding, so depending on when you read this, some files might still be in progress. The GitHub versions (which were problematic for some folks to download) will be removed shortly. Please update your links; has already been updated.


We Want Your DSC Resource Wish List!

What sorts of things would you want to configure via DSC that don’t already have a resource?

NB: Focusing on the core Windows OS and its components only; Exchange, SharePoint, SQL Server, and other products are off the table for this discussion.

For example, I want a “log file rotator” resource, that lets me specify a log file folder, an archive folder, and a pair of dates. Files older than one date are moved from the log folder to the archive folder; archived files older than the second date are deleted.
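To make the idea concrete, here’s a rough sketch of the logic such a resource would enforce on each consistency check; the folder paths and age thresholds are purely illustrative:

```powershell
# Hypothetical log-rotator logic: two folders, two age cutoffs
$logFolder     = 'C:\Logs'
$archiveFolder = 'C:\LogArchive'
$archiveAfter  = (Get-Date).AddDays(-30)   # archive logs older than 30 days
$deleteAfter   = (Get-Date).AddDays(-90)   # delete archives older than 90 days

# Move aging logs into the archive folder
Get-ChildItem $logFolder -File |
    Where-Object LastWriteTime -lt $archiveAfter |
    Move-Item -Destination $archiveFolder

# Purge archives past the second cutoff
Get-ChildItem $archiveFolder -File |
    Where-Object LastWriteTime -lt $deleteAfter |
    Remove-Item
```

Wrapped in a DSC resource, that logic would re-run on every consistency check, so the folders stay rotated without a scheduled task.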

I’d also like a File Permissions resource. Specify a folder or file, optional recursion, and a set of access control entries (in plain English terms), and it’ll make sure the permissions stay that way.

Maybe also a User Home Folder resource, which would (a) ensure a folder exists for a given set of user accounts, and (b) ensure a set of “template” permissions, so that each individual user has the rights to their folder, plus rights given to global users like admins.

What resources would YOU like to have to ease configuration and maintenance in YOUR environment? Drop a comment!

My DSC Demo-Class Setup Routine

I think I’ve gotten my DSC classroom and demo setup ready. Understand that this isn’t meant to be production-friendly – it doesn’t automate some stuff because I want to cover that stuff in class by walking through it. But, I thought I’d share.

I’ve basically made an ISO that I can carry into class, attach to a Win2012R2 VM and a Win81 VM, and run students through. The server VM is a DC in “company.pri” domain, and the client VM belongs to that domain.

In the root of the ISO are these scripts: ISO_Root (unzip that). Students basically just open PowerShell, set the execution policy to RemoteSigned or Unrestricted, and then run SetupLab -DVD D:, replacing “D:” with the drive letter of the VM’s optical drive. The script isn’t super-intelligent since I demo it at the same time; it needs the colon after the drive letter.

In a folder called DSC_Modules, I add the following DSC modules (unzipped): xActiveDirectory, xComputerManagement, xDscDiagnostics, xDscResourceDesigner, xNetworking, xPSDesiredStateConfiguration_1.1, xSmbShare, xSqlPs, xWebAdministration.

In a folder called DSC_Pull_Examples, I include these scripts: DSC_Pull_Examples (unzip that).

In a folder called eBooks, I include these files: eBooks (unzip that). Those get used in a lot of the demos I do, so I have the lab setup scripts copy over some script modules.

In a folder called Help, I have a ZIP file; it contains everything downloaded by the Save-Help command in PowerShell. The Setup script unzips this into the VM and then runs Update-Help against it, so the VM doesn’t need to be Internet-connected.
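If you want to build a similar offline help stash, the two halves look roughly like this; the paths are examples:

```powershell
# On an Internet-connected machine: download updatable help
# for all installed modules into a folder you can zip up.
Save-Help -DestinationPath C:\HelpDump -Force

# On the offline classroom VM, after unzipping the archive
# to C:\Help: update help from the local copy instead of the web.
Update-Help -SourcePath C:\Help -Force
```

Save-Help only grabs help for modules present on the source machine, so run it somewhere with roughly the same modules installed as the VM.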

In a folder called Hotfix, I have the Windows8.1-KB2883200-x64.msu hot fix installer. I include the 32-bit version also, just in case, but my script doesn’t use it.

In a folder called Installers, I have installers for PrimalScript, PowerShell Studio, and SQL Server Express with Advanced Services. Again, those get used a lot in my classes, but the setup script doesn’t rely on them.

Finally, in a folder called sxs, I have the contents of the Windows 8.1 installation media’s \Sources\sxs folder. Some of the things my setup script does – like adding .NET Framework 3.5 so SQL Server 2012 will work – rely on features that aren’t in a Win8.1 VM, normally. Because I don’t want to rely on the Internet, I include this source so I can install new features from it.
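On a Windows 8.1 client, installing a feature from that local source looks something like this (the sxs path is an example; adjust it to wherever the setup script finds the media):

```powershell
# Add .NET Framework 3.5 from local installation media,
# without reaching out to Windows Update (-LimitAccess)
Enable-WindowsOptionalFeature -Online -FeatureName NetFx3 -All `
    -Source D:\sxs -LimitAccess
```

Without -Source, Windows tries to pull the payload from Windows Update, which defeats the whole offline-classroom setup.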

This is all pretty specific to the way I run classes, but if there’s any use you can make of it, feel free.

The DSC Conversation Continues

Some lovely conversation on DSC over on Reddit… with some comments I wanted to offer an opinion on. From what I’ve seen, these are very common sentiments, and they definitely deserve… not argument or disagreement, but perhaps an alternate viewpoint. I’m not suggesting the commenters are wrong – but that maybe they’re not considering the entire picture.

Certainly if you work with a superset of MS OSs (i.e. you do Linux also), then Puppet or something like it seems like a no brainer. In fact, that is what we’re doing now. Puppet has powershell modules you can install for instance. Personally, I still feel like Powershell is overrated except for small snippets of that’s how something is exposed. Puppet can run powershell commands. AutoIT can run powershell commands… I just don’t see value in Powershell today.

The point is that, until PowerShell, there were no PowerShell commands. Microsoft was incredibly inconsistent about providing automation-friendly commands of any kind. They could have gone down the path of building command-line tools for Cmd.exe; they didn’t. The point of PowerShell is that Microsoft forced themselves to build commands. Now, if you run those from AutoIt, or Puppet, or whatever else – that’s cool. PowerShell is an API, not a tool. Whatever tool you use to access that API is just dandy. Without the API, the tools are useless.

As to DSC – I’m really confused. Why is this separate from Group Policy again? Why is it better? Or is MS giving up on Group Policy as needing a total re-write?

The advantage of Group Policy over DSC, today, is that GP has richer ability to target computers based on OU membership, WMI criteria, etc. Today, DSC targeting isn’t that flexible. On the other hand, GP is extremely difficult to extend, since client extensions are native code. GP was built to manage the registry, although it’s been extended to do more. DSC is built to do whatever PowerShell (and, via CIM, native code) can touch. My opinion? Yeah, DSC will obviate GP over time. Not instantly.

Specifically, as I’ve been rolling out Puppet across Windows and Linux, I see that in some ways, it brings the computer GPO aspect to Linux, and duplicates it a bit on Windows.
Anyway, I won’t be surprised to see someone start writing DSC modules in Puppet, because you’ll want your config management to work across your platforms. And MS is kind of late to the game here – many many people have lots of knowledge already in Puppet, Chef etc…

The guys on the PowerShell team love Chef and Puppet. I think you’re confusing “api” and “tool.” There are two pieces to DSC: Piece one is the ability of PowerShell to read a configuration script and produce a MOF. Piece two is the ability of a Windows computer to receive that MOF and reconfigure itself accordingly. Any tool can do piece one. Use Puppet to produce the MOF. Use Puppet to control which MOFs get sent where. That’s the intent. But Microsoft takes a big burden off the Puppet developers by having Windows know what to do with the MOF. Yeah, MS is late to the game. No question. But they’re joining the game, not reinventing it. What they’re doing works with what everyone else is already doing.
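To make those two pieces concrete, here’s a minimal sketch; the node name, file paths, and share are all made up:

```powershell
# Piece one: PowerShell reads a configuration and produces a MOF.
Configuration DemoConfig {
    Node 'SERVER1' {
        File ExampleFile {
            DestinationPath = 'C:\inetpub\wwwroot\index.htm'
            SourcePath      = '\\deploy\web\index.htm'
            Ensure          = 'Present'
        }
    }
}
DemoConfig -OutputPath C:\DSC    # writes C:\DSC\SERVER1.mof

# Piece two: a Windows computer receives the MOF and enacts it.
Start-DscConfiguration -Path C:\DSC -Wait -Verbose
```

Anything that can emit a standards-based MOF – Puppet, Chef, or a hand-written template – can stand in for piece one; piece two is what Windows now provides natively.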

I would personally carry the sentiment even further and say that investing the bulk of your effort in DSC over something like Puppet would be needlessly tying your own hands. Why focus on something that’s platform specific when there is a good cross-platform alternative. Don’t put all your eggs in one basket as it were.

Wrong. It isn’t an either-or thing. DSC’s introduction at TechEd 2013 included a demo of Puppet (or was it Chef?) being used to send configurations to Windows – much more easily, because with DSC, Windows natively knew what to do with them. If you’ve got tooling like Puppet, use it. DSC is just making Windows work better with it. The whole point of DSC is that it plays the cross-platform game everyone else has already been playing. 

Purely on the Windows side, the need to focus on DSC is more about developing the DSC resources you need, so that you can send a MOF (from Puppet, say) to a Windows computer, and that Windows computer will know how to configure everything you need configured. Microsoft will continue to produce resources for core OS and server application stuff; any LOB stuff is what you’d be focusing on.

Heck, even in a pure-Windows environment, with cross-platform off the table, Puppet provides tooling that DSC does not. You’re going to need those tools, whether it’s Puppet, some future System Center thing, or whatever. DSC is a mid-level API, not a tool.

Configuration management does seem to be the future — I just don’t agree completely with the author’s point of view that it will have to be DSC.

On Windows, DSC will be the underlying API that your configuration management tool talks to. DSC isn’t a configuration management tool. DSC bridges the gap between a text-based MOF and the bajillion proprietary protocols MS uses internally in their products. Remember, on Linux, it’s easier – everything already lives in a text file of some kind, right (oversimplifying, I know, but still)? In Windows, config information lives everyplace; DSC’s main job is to bridge the gap. DSC doesn’t provide management of what configuration goes where; it just provides the implementation mechanism. In PowerShell, there’s a primitive ability to write configurations, because MS has to give you something, but yeah… I think most organizations would benefit from good tooling atop that.

I think this entire discussion is why more people need to start learning (not necessarily using) DSC if you have Windows in your environment. Find out what it is, what it isn’t, and how it’ll play into the other efforts you’ve got underway. There’s a ton of misconception about what it is and where it’s meant to fit in. When I say, “if you’re not learning DSC, you’re screwed,” I don’t mean, “if you’re not using DSC.” I mean learning. Because if you’re not learning it, you’re going to be subject to the same misconceptions about it. You end up spending a lot of time reinventing what it’s willing to do – and what it’s willing to do in conjunction with your existing tools.



Jobs: PowerShell Scripter Wanted

Told you this would eventually start happening ;). Matt Sullivan of Strategic Staffing contacted me with the following job posting; if you’re interested, contact him directly at 781-347-5220.

My name is Matt Sullivan and I am a member of the Strategic Staffing Division at NTT DATA Inc., the sixth largest global IT integrator. We have more than 75,000 employees worldwide, offices in 40 different countries, and we are owned by Nippon Telegraph and Telephone, the largest telecommunications company in the world.

I am currently seeking a Scripting Engineer – PowerShell to join our team in Burlington, VT. The job description can be found below for your review. Please note that your resume will not be submitted to the client until we have discussed your background.

Title: PowerShell Scripter
Location: Burlington, VT
Duration: 1 year

Our Client has a number of projects in flight that require scripting (PowerShell) as part of their automation solution in our Windows environment. This position would require that the contractor meet with other project members, to gather requirements, build, test and document the scripts. He/she will then hand this work off to another vendor to be implemented on the scheduling platform (BMC’s Control-M, a SaaS hosted by Client).

As a second priority, the contractor will work with various departments to examine an existing body of scripts/jobs which also run in our Windows environment. These jobs, having been prioritized by the client, will be converted, if necessary, to PowerShell, tested, and documented before being turned over to Client. This body of work is not expected to be completed in the time allotted, as it is very large. Our goal is to address as many as possible, working from the highest priority down.

PowerShell is the scripting language of choice. A few years of experience, at a minimum, is required, including experience with .NET remoting.

  • Expert level in PowerShell
  • 3+ years experience
  • PowerShell v2 and/or v3
  • Solid understanding of PowerShell Remoting
  • Business Analyst skills
  • Experience in requirements gathering
  • Testing methodologies, test plan development
  • Strong documentation skills
We are dedicated to working with a wide range of IT consultants – for example, corp-to-corp and W-2 hourly contractors – and we offer competitive benefits for candidates applying as W-2 contractors.

Benefits available for W-2 contractors only:
  • Caremark Prescription
  • W-2 Employee Assistance Program
  • Accident Insurance – Workers’ Compensation Insurance and Business Travel Insurance
  • Healthcare Reimbursement Account Programs
  • Credit Union
  • Corporate Mortgage Program

The DSC Opportunity for ISVs

Desired State Configuration offers a number of immediate opportunities for independent software vendors (ISVs) who are smart enough to jump on board now. DSC currently suffers from a marked lack of tooling. That’s partially deliberate; MS obviously needs to deliver the functionality, and they may well rely on third parties or the System Center team to build tools on top of that functionality. But let’s explore some of the immediate opportunities.

Change Control and Versioning. This should be pretty easy. We basically need a way to “check in” a new DSC configuration, possibly have it go through an approvals workflow, and then deploy it. In more detail, I’d want to be able to submit a configuration script to this tool. It would run the config, generate a MOF, and deploy it to a “lab” pull server location. I could then verify its functionality, and “approve” it to deploy the MOF to a production pull server. Deployment would include creating the necessary checksum file. Obviously, rollback capability to a previous version would be nice.
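For reference, the checksum step such a tool would automate is a one-liner today; the path shown is the pull server’s default configuration folder:

```powershell
# After copying the approved MOF to the pull server's configuration
# folder, generate the matching .checksum file alongside each MOF.
New-DscChecksum -ConfigurationPath 'C:\Program Files\WindowsPowerShell\DscService\Configuration' -Force
```

The pull client compares that checksum to decide whether a node’s configuration has changed, so regenerating it on every deployment is exactly the kind of chore an approvals-workflow tool should own.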

Configuration Consolidation. Natively, DSC requires me to specify the nodes I want to push a configuration to. I’d like to see a tool that lets me create server lists somewhat graphically, organizing things so that a single server might appear in a “domain controllers” list, a “New York servers” list, and a “Win2012R2” list. I could target configurations at each list, and the tool would combine those configurations to create the appropriate one for each node based on its “folder memberships.” That might be done through composite resources. This makes DSC work a bit like GPO, with this tool doing the work of combining configurations into a single one per node.

DSC Studio. Using the underlying DSC Resource Kit and Resource Designer for functionality, give me an IDE that lets me graphically design a resource (specify properties) and then spit out the schema MOF and skeleton PSM1 file. This could probably be a very simple PowerShell ISE add-on, in fact.
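The underlying plumbing is already scriptable; a sketch using the xDscResourceDesigner module from the Resource Kit (resource and property names are made up):

```powershell
Import-Module xDscResourceDesigner

# Define the resource's properties: one key, one writable Ensure
$path   = New-xDscResourceProperty -Name Path -Type String -Attribute Key
$ensure = New-xDscResourceProperty -Name Ensure -Type String `
          -Attribute Write -ValidateSet 'Present','Absent'

# Emit the schema MOF and a skeleton .psm1 for the new resource
New-xDscResource -Name cDemoResource -Property $path,$ensure `
    -Path 'C:\Program Files\WindowsPowerShell\Modules' `
    -FriendlyName DemoResource
```

A “DSC Studio” would just be a friendly front-end over those two cmdlets, plus scaffolding for the Get/Set/Test function bodies.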

Node management. In a pull server environment, give me a tool that lets me group servers. The tool should modify the LCM on each group, so that each member of the group has the same DSC configuration ID. That way, they’re all pulling the correct MOF from the pull server. Otherwise, managing GUIDs gets out of hand pretty quickly – I can see a lot of Excel spreadsheets.
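Today, that LCM change has to be scripted per node. Here’s the WMF 4.0-style meta-configuration such a tool would generate behind the scenes; the node name, GUID, and pull server URL are all illustrative:

```powershell
Configuration PullClient {
    Node 'SERVER1' {
        LocalConfigurationManager {
            # Every node in the "group" shares this configuration ID
            ConfigurationID           = '8a7e4ad0-0c41-49f1-9f2a-1234567890ab'
            RefreshMode               = 'Pull'
            DownloadManagerName       = 'WebDownloadManager'
            DownloadManagerCustomData = @{
                ServerUrl               = 'https://pull.company.pri:8080/PSDSCPullServer.svc'
                AllowUnsecureConnection = 'False'
            }
        }
    }
}
PullClient -OutputPath C:\DSC
Set-DscLocalConfigurationManager -Path C:\DSC -ComputerName SERVER1
```

Multiply that by a few hundred nodes and a few dozen groups, and you can see why GUID management by spreadsheet won’t scale.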

Resources. There are obviously a ton of resources to be written. This might be a bit of a bad call for an ISV, as you never know what MS is going to release resources for. Now that MS has built so many PowerShell cmdlets, building resources on top of them gets pretty straightforward. They’ve pumped out two waves of resources pretty fast already.

In short, I think there’s a big opportunity for a smart company. It’s a matter of seeing the “holes” in the technology, which currently focus mainly on management, and filling them in.


DSC: Must-Have or Just Nice-To-Have?

On a recent PowerScripting Podcast episode, I went off on a bit of a career-oriented rant, and amongst other things mentioned something to the effect of, “if you’re not learning DSC now, you’re screwed.” It hopefully goes without saying that my comment applies to folks working in environments that use Microsoft server products; obviously, in an all-Linux shop you’re pretty safe not knowing Microsoft’s technologies :).

Some discussion on Twitter ensued, a place I hate for discussions because 140 characters leaves you enough room to be misunderstood and paraphrased, but not enough room to articulate your perspective. I wanted to follow-up on the rant a bit, and by doing so here hopefully engender a more detailed discussion.

One comment – and this is a nice, succinct one to start with: “Is it a useful tool? Yes; is it the tool that makes or breaks a sysadmin? No.” Couldn’t disagree more. Maybe it won’t make or break you today, but in a few years – absolutely. Unless you’re stuck in a company that’s going to just run Win2008 forever. So if it’s going to be an inevitable part of your future, then you are, in fact, more and more screwed the longer you ignore it. It’s like the poor NetWare guys who ignored TCP/IP. They were screwed, in the end, and had to hustle to catch up. I hate playing catch-up; in my mind “screwed” is what you are whenever you’re playing “catch up.” So maybe knowing my definition of “screwed” will help the discussion a bit!

Another comment – and a good one – was, “[PowerShell] is a must… but I live in a multi-platform world where it is just a part, not a definer, of the whole.” Excellent point, but if you must manage Microsoft technologies, then DSC is going to be a part of your life. Perhaps it’ll be DSC “as managed by ___” cross-platform solution, but DSC is going to be the underlying API. If you’re comfortable being insulated from underlying APIs by tools, fine – but you’ll never be as effective as you would be if you knew those tools. Point being, in a multi-platform environment, DSC is not all you need to know, but you must know it (or begin to) if that environment includes Microsoft server products. Could you manage your Microsoft elements without using DSC? Sure. You can also drive a car using mind control, I’m told, but it’s not the most effective way of doing so. Folks are quite welcome to disagree, but I do firmly believe that any environment would benefit from DSC. Time will tell if I’m right or wrong there, but personally – and this is very much a “this is how I proceed with my life” thing – I would rather be on the forefront of something than turn around in 5 years and realize I should have been there.

Keep in mind that, 6 years ago, folks felt free to ignore PowerShell. Many now wish they hadn’t. It was a lot easier to get into PowerShell in v1, and then “keep up” with new versions, than to dive in now.

Why do I think DSC will be the same? Because DSC is the ultimate outcome of PowerShell. DSC is what PowerShell has been building toward. I think this is perhaps a perspective that other folks don’t share. To them, DSC is “just a tool.” It isn’t doing anything they couldn’t have done all along.

But understand something about DSC: This is something Snover planned almost a decade ago. It was the ultimate destination of his “Monad Manifesto.” DSC is exactly what PowerShell has been building up to. DSC is the main reason, in many ways, for PowerShell. If you really think about it, DSC removes much of the need for you to learn PowerShell. 

That’s a bold statement. Let me explain.

There’s no question that PowerShell can be difficult to learn. It’s programming, and not everyone has an aptitude for that. There are literally thousands of commands, and that’s just from Microsoft product teams. It’s a huge product; like any language, it has idiosyncrasies, and you can come at it from a half-dozen different directions. Writing scripts that configure or re-configure computers, or even that report on current configurations, can be complex. Yes, they’re faster than doing it manually – but it’s not zero effort.

DSC abstracts all of that. To create a DSC configuration, you don’t need to know how to program, yet you can potentially leverage all the PowerShell investment Microsoft has been making. You can use PowerShell, and all it can do, without having to really touch much of PowerShell. Sure, there’s a middle layer of folks writing DSC resources (which use PowerShell commands as their API), but that’s going to be a small subset of folks. A highly-paid subset, I suspect.
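That’s the key point: a configuration reads like a declaration of desired state, not a script. A minimal example, with hypothetical node names – no loops, no error handling, no flow control:

```powershell
Configuration WebServers {
    Node @('WEB1','WEB2') {
        # Declare that IIS should be installed...
        WindowsFeature IIS {
            Name   = 'Web-Server'
            Ensure = 'Present'
        }
        # ...and that its service should be running at boot
        Service W3SVC {
            Name        = 'W3SVC'
            StartupType = 'Automatic'
            State       = 'Running'
        }
    }
}
```

You state *what* should be true; the resources (written by that smaller, PowerShell-fluent subset of folks) handle *how* to make it true.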

If Microsoft had had infinite time, money, and people, they’d have just given us DSC and not mentioned PowerShell at all. PowerShell v1, v2, and v3 were building blocks toward what DSC gives us. DSC was the point, all along. We’re just seeing the tippy top of that, now. There’s a glacier underneath.

Now, you may be thinking, “bullshit. I can’t use DSC to do everything that my job involves, even if I just think about my Microsoft assets.” True. Today. But folks, you need to have a little vision. We’re dealing with DSC 1.0. Kindergarten DSC. Literally, what you’re seeing now is the simplest possible expression of something that the world’s largest software company took seven years to deliver. Seven years. Most of Microsoft’s PowerShell investment, going forward, is going to be in DSC – I guarantee it. They’ve done the lower-level building blocks already.

“Can I use DSC to generate configuration reports?” Maybe not today. But have you noticed that a DSC pull server can have a “compliance server” component? Have you looked at its service definition? It’s basically a way for servers to report in on the state of their configuration compliance. That’s reporting. And that’s my point: DSC has a super long way to go. It is going to be everything for an administrator – and that’s going to happen fast. Looking at DSC today, that may be tough to imagine. So was PowerShell, in 2006.

And we haven’t even seen the tooling that will be layered on top of DSC yet, because it’s all so new. The tool where you click through a wizard to add a user… and the tool goes and rewrites four dozen server configuration files, causing the user to exist in AD, in your accounting system, as a home directory on a file server, and so on. Yeah, that’ll all happen. Eventually, you won’t touch servers anymore – you’ll touch their configuration files, and they’ll reconfigure themselves appropriately. That’s why this is such a big deal. It’s not a tool. It’s the administrative interface.

So when I say, “if you’re not learning DSC right now, you’re screwed,” it’s because I personally believe that to be true. My experience in the industry and my familiarity with how Microsoft pursues these things informs that opinion. You are going to fall behind the curve so fast you won’t even realize it’s a curve anymore. Today, people look at Infant DSC and see a basic configuration tool. I see Teenager DSC, and Young Adult DSC, coming around the corner, and they are going to absolutely change the way you are required to manage Microsoft products. Yeah, I personally want to be on board with that right now.

“What about a small shop? Isn’t DSC meant for large scale?” No, large enterprises just have the most obvious advantage from DSC. It’s less obvious to small shops.

You know how Exchange 2007 really impressed everyone, because the GUI was just running PowerShell under the hood? That meant a small shop could still get the GUI, but you could always drop down to PowerShell when you needed to. It also meant that not everything went into the GUI, and sometimes you had to drop into PowerShell anyway. I predict DSC will do the same thing. GUIs won’t run PowerShell commands anymore – they’ll modify DSC configurations. Those configurations will then be implemented on the affected servers. Your cross-platform management tools? If they’re smart, they’ll be doing the same thing.

Think about that. DSC isn’t going to be “just a tool.” It’s going to be the entire interface by which you interact with Microsoft server products. It’s as important as the mouse or the keyboard. I truly think people aren’t seeing the end-game when it comes to this technology.

You know those admins who only know what the GUI shows them? They don’t know much about what’s happening underneath, and as a result, they’re not very good at planning, architecture, troubleshooting, or anything else that requires a deeper knowledge. That’s where you stand with DSC. You either ride that bus, or get run over by it. Eventually.

Do you want to risk not knowing this thing? You might. Perhaps in your job position you know it’s not going to affect you. For me, I won’t risk it. So that’s where my perspective comes from. In my world, this thing is a must-have. And yes, that’s an enterprise-class world, with large, cross-platform environments. But it’s also a perspective from my experience in SMB – I’d have killed for DSC, given the minuscule budgets and staff I worked with in those environments, and given my colleagues’ distaste for scripting.

Anyway, that’s how I feel about it – in more detail than 140 characters allowed ;). If you have a different perspective, please feel free to share it. I can’t promise that you’ll change my mind (and I’m not really out to change yours), but it’s good for the world in general to see different perspectives, so that folks can make informed decisions about their own career directions.

Julie’s Comments: The Scripting Games – Winter 2014

This post comes to us from Julie Andreacola, one of the members of team Kitton Mittons, who won The Scripting Games – Winter 2014. You’re welcome to submit your thoughts about the Games as well!

The 2014 Scripting Games are over and once again, it was a terrific experience. This was my third scripting games and I was blown away with all that I learned.

The team approach was very appealing to me. I’ve been the PowerShell expert at my workplace, so I was hoping to find a team where someone knew more than I did, since my PowerShell skills are only intermediate. I struggled to put a team together from our local PowerShell user group for the practice event, but it just didn’t work out due to the timing and workload of potential team members. I took to Twitter to find a team that had an open spot and found the Kitton_Mittons.

The team was just what I needed. We had no expectations of winning, and we acknowledged that some weeks, people would not be able to participate. Everyone on the team but me was located in Northern Virginia, so we arranged a Google Hangout each evening around 7 p.m. We also had a shared repository on GitHub. Both of these tools were new for us, but they were invaluable for our team collaboration. I think we only had one night with everyone in attendance. The sessions ranged from discussing elements of the script, to screen sharing (a nice Google Hangout feature), to general geek conversation. Two of the team members traveled to Charlotte, NC, to join me at PowerShell Saturday 007, where we met in person and gained another team member for the final few events.

The learning benefits happened immediately. The first week I learned more about parameters and using them to validate inputs. I immediately began implementing them in my scripts at work, making the scripts more robust and easier to hand off to others as I transitioned to a new job. A couple of days later, our team made our first module. I knew it was easy, but I had never done it – and now my script at work had a module. One of our team members made an install script that put the files and modules in the correct places. I realized the advantage of this, especially when turning scripts over to users unfamiliar with PowerShell. I was able to take the same installer script and quickly customize it for use in my workplace. The following weeks included getting more experience with the efficiencies of script blocks and better error checking. Although many of my evenings were taken up with PowerShell, I found the nightly sessions invaluable, as our team leader, Jason Morgan, took the time to teach and explain the more complex aspects of the scripts.
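For readers who haven’t used them, the kind of parameter validation I’m describing looks something like this (the function and parameter names here are invented for illustration, not taken from our actual entries):

```powershell
# A sketch of an advanced function using parameter validation attributes.
# PowerShell rejects bad input before the function body ever runs.
function Get-ServerReport {
    [CmdletBinding()]
    param(
        # Must be supplied, and can't be an empty string
        [Parameter(Mandatory)]
        [ValidateNotNullOrEmpty()]
        [string[]]$ComputerName,

        # Only values 1 through 100 are accepted
        [ValidateRange(1, 100)]
        [int]$MaxThreads = 10
    )
    foreach ($computer in $ComputerName) {
        # ...query $computer and emit report objects here...
    }
}
```

Attributes like these are what make a script safer to hand off: the person running it gets a clear error from the shell instead of a mysterious failure halfway through.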

The 2014 Scripting Games exceeded my expectations and truly advanced my skills. I also have a new network of System Center IT Pros. I’m starting a new job this week, and I know that what I learned and gained over the last four weeks will help me to excel in this new position. A big thank you to my teammates, coaches, judges, and the PowerShell community. Learning can be fun!

Free eBook from Microsoft’s Scripting Guy: Windows PowerShell Networking Guide

Ed Wilson, Microsoft’s Scripting Guy, has created a free ebook, Windows PowerShell Networking Guide. It’s designed to provide a super-quick PowerShell crash course, and then show you how to manage various networking scenarios by using the shell.

And it’s free! Just click the link to get your copy – and please, tell a friend!


What Should The Scripting Games Look Like Next Time?

If you’ve been following along with The Scripting Games over the past couple of iterations, you know that we’ve been trying some new and different things. For the Winter Games, we ran a team-based series of events that threw some really complex scenarios at you. However, we know some folks would like to see the next Summer Games include a less-complex track, perhaps with a focus on one-liners.

(Not that one-liners are an essential part of a work environment, but they’re fun and a good competitive thing – this is games, after all.)

So we’re looking for your ideas. Drop a comment, and tell us how you think the next Games should be structured.

However, before you comment, understand that judging by official, expert judges gets extremely difficult. Multiply 10 events across 250 entries and you’ve got a metric butt tonne of work for our volunteers to do. Quite frankly, it’s unlikely we’ll be able to provide a score-per-entry with that kind of volume. The folks who do judging just can’t take that much time off work. Seriously, even if a judge only had to spend 2 minutes on each entry, that can easily add up to more than 80 hours of work to look at every entry. It just isn’t do-able.

So, in your comment, include some thoughts on what you’d like to see for the judging/scoring side as well, keeping in mind the desire of judges to also have family lives and jobs. What’s your real goal in participating in the Games? To get community feedback (comments) on what you’ve done? We can arrange that. Is it perhaps educational to have judges pick out “noteworthy” (both good and bad) entries and comment on them, as a learning guide? Or are you solely after having a “known” expert offer commentary on your entry – which isn’t something we can guarantee if there are a large number of entries?

Help us understand what you’re in it for, and give us some ideas for creating a Summer event that’s fun, as well as educational.