I would like to get some design advice from the experts. We are looking at using DSC for configuration management quite extensively across our client base.
This is the flow of the deployment model that we are looking at:
We currently wrap the Chocolatey DSC module inside a composite resource and pass some custom parameters to it. The Chocolatey package has some intelligence added to create a "settings file" from those custom parameters. This is then deployed to a load-balanced group of servers. One caveat: the same server can run multiple versions of the same software package, so we will need to plan for that.
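For illustration, the flow above might look roughly like this as a composite resource wrapping the cChoco module. All names (the package, the settings path, the parameter format) are placeholders, not from the actual deployment:

```powershell
# Hypothetical composite resource sketch, assuming the cChoco DSC module.
Configuration MyAppPackage
{
    param (
        [Parameter(Mandatory)]
        [string]$PackageVersion,

        [string]$SettingsPath = 'C:\MyApp\settings.json'
    )

    Import-DscResource -ModuleName cChoco

    cChocoPackageInstaller MyApp
    {
        Name    = 'myapp'                                  # placeholder package id
        Version = $PackageVersion
        # The package's install script would read these package parameters
        # and generate the "settings file" described above.
        Params  = "/SettingsPath:$SettingsPath"
    }
}
```

The composite resource keeps the cChoco plumbing in one place, so each node configuration only has to supply the custom parameters.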
So, (1) isn't really a DSC design question so much as a broader package management question. We'll see what ideas other folks jump in with on that. But you'll probably need to provide some more context. You say, "the same server can run multiple versions of the same software package so we will need to plan for that," but you don't really mention what the plan needs to address. Is it, "the server will have multiple versions, so we have to figure out which one to install"? I guess, walk through the human mental logic you'd use to solve the problem, and then let's figure out how to code that logic.
(2) depends on your business criteria. Not knowing those, it's impossible to get to "best" or "better." It's probably "fine," then.
(3) If it's working the way it is, leave it the way it is. "Advisable" comes down to "what works for you and is maintainable," and you're the one to decide that.
(4) Stay in school, don't do drugs.
Thanks for the response Don.
Regarding the package versions: I'm just trying to understand how other people handle package versioning in DSC, especially in an Azure DSC environment where you cannot run multiple versions of the same module.
If I use one custom resource and pass the version number as a parameter, then the reporting in Azure will only show the module details.
So I can think of a few scenarios to handle this:
1. Append the version number to the module name, so you can have a module for every version of the software?
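To make option 1 concrete, the idea is one module per software version, distinguished by a suffix in the module name so Azure Automation DSC can host them side by side. Module and resource names here are purely hypothetical:

```powershell
# Illustrative sketch: versioned module names let two versions coexist
# in Azure Automation DSC, which only hosts one version of a given module.
Configuration BranchServers
{
    Import-DscResource -ModuleName 'MyAppDeploy_v2'   # hypothetical module names
    Import-DscResource -ModuleName 'MyAppDeploy_v3'

    Node 'web01'
    {
        MyAppDeploy_v2 AppV2 { Ensure = 'Present' }
        MyAppDeploy_v3 AppV3 { Ensure = 'Present' }
    }
}
```

The compliance report then shows each versioned module by name, which is the visibility the single-resource-with-a-version-parameter approach loses.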
AH, I was thinking packages more generally, not PowerShell modules.
1. is probably the way to go, and it's not without precedent. You'll see Framework classes with a "2" on the class name to differentiate from an older, still-existing class. It's not elegant, but not everything in life is.
On 3... Nnnnnnoooo, not if I'm understanding what you're after. Other than having something in the module name. I mean, I suppose you could log something from the resource, but you'd have to go spelunking for it.
Hmm. I suppose the resource could insert something into its Get- output. Then it'd show up in a compliance report. You could technically accept that as an input to Set-, too. Like, the desired configuration includes using v3 of such-and-such, and if that resource gets called and finds itself not v3, it remediates or bails with an error or something.
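A minimal sketch of that idea, assuming a MOF-based custom resource and using Chocolatey to discover what's installed (function shapes follow the standard Get/Test resource contract; the property names are illustrative):

```powershell
# Sketch: surface the installed version in Get- so it lands in the
# compliance report, and compare against the desired version in Test-.
function Get-TargetResource
{
    param (
        [Parameter(Mandatory)][string]$Name,
        [Parameter(Mandatory)][string]$Version
    )

    # `--limit-output` yields one "name|version" line per package.
    $line = choco list --local-only --limit-output $Name |
        Where-Object { $_ -like "$Name|*" } |
        Select-Object -First 1

    return @{
        Name             = $Name
        Version          = $Version
        InstalledVersion = if ($line) { ($line -split '\|')[1] } else { $null }
    }
}

function Test-TargetResource
{
    param (
        [Parameter(Mandatory)][string]$Name,
        [Parameter(Mandatory)][string]$Version
    )

    $current = Get-TargetResource -Name $Name -Version $Version
    return $current.InstalledVersion -eq $Version
}
```

If Test- returns `$false`, the LCM calls Set-, which could remediate (install the desired version) or throw, per the "remediates or bails with an error" option above.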
That is always an interesting thing to work with – how do you manage your multiple versions simultaneously with Chocolatey? Or is that what you are trying to figure out?
This software, I am going to assume, is different from the PowerShell modules discussed in the rest of the conversation. Please let me know if I'm mistaken.
My first question is always – what is the motivating factor for having multiple versions of the same software deployed and used simultaneously? Understanding the driving context can help shape recommendations here.
Assuming you have major (or major/minor) versions you would be doing this for – examples would be Ruby 1.9, Ruby 2.0, and Ruby 2.2, or Python 2 and Python 3. That is one way to handle this, and it has a good upgrade strategy.
But if you are deploying 1.0.0, 1.0.1, etc and want all of those versions simultaneously, you have more work and what Don is suggesting might be a better approach.
You could do something like this, but it's maybe not the best approach. However, depending on what you determine for number 1 above, this may be an option.
Again, depending on what you are working with: most of DSC is about doing, and reporting only on what it did. So if you created a resource that ran something to produce a report and handed that back, you could get that report.
choco list --local-only --all-versions --limit-output
would produce a parseable report, but you would need to design a custom resource to produce that output.
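As a rough sketch of the parsing side such a resource would need (not the resource itself, just the report-shaping step):

```powershell
# `--limit-output` gives one "name|version" pair per line, so the
# report parses cleanly into objects.
$report = choco list --local-only --all-versions --limit-output |
    ForEach-Object {
        $name, $version = $_ -split '\|'
        [pscustomobject]@{
            Name    = $name
            Version = $version
        }
    }

# $report now holds one object per installed package version.
$report | Sort-Object Name, Version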
I had a look at Azure DSC and also enabled Log Analytics, and it doesn't look like the raw report really goes anywhere. So it seems that if you want to use the raw report, which I assume is where the Get output would land, it will have to be pulled manually from Azure into another reporting tool.
I can't see Rob's post on here anymore but here are the answers to his questions.
With Chocolatey, after some research, we have decided to use the package name + version as the name, which will solve the issue on the Chocolatey side.
Yes, correct: it will be our own in-house developed software; the module we are using is just there to deploy the software via DSC.
The reason we are running multiple versions is that we are moving to a cloud model. Currently a client has one server per branch running one version of our software. Under the new model a client would have, for example, a group of 5 servers running in the cloud, serving up to 20 different branch locations that could each be running a different version.
As you can see, this can become quite cumbersome to manage, because all servers in the group have to be running exactly the same set of versions at all times: an incoming request from a branch client could be redirected to any one of the servers in the group.
I would be happy with this approach I just can't seem to find a clean way for output to go back into Azure or Log Analytics.
Sorry, Rob's post got held – it's back now.
Just to update: as a dirty workaround, we have decided to do the following.
We are writing a DSC resource that will be included in every configuration. It will check every version of the software we are running and post the results to the Azure variables page, as well as send them to an ELK stack. The Azure variables are just for us to verify internally, and the ELK stack will drive a Kibana dashboard for the customer to look at.
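The ELK half of that workaround could look something like the sketch below, gathering installed versions via Chocolatey and shipping them to a Logstash HTTP input. The endpoint URL, package name filter, and JSON shape are all assumptions for illustration:

```powershell
# Sketch of the reporting step (e.g. inside Set-TargetResource).
# 'myapp*' and the Logstash URL are placeholders.
$versions = choco list --local-only --all-versions --limit-output |
    Where-Object { $_ -like 'myapp*' } |
    ForEach-Object {
        $name, $ver = $_ -split '\|'
        [pscustomobject]@{
            Package = $name
            Version = $ver
            Node    = $env:COMPUTERNAME
        }
    }

# Post the version inventory to an ELK stack via a Logstash HTTP input,
# where it can feed the customer-facing Kibana dashboard.
Invoke-RestMethod -Uri 'http://logstash.example.com:8080' `
                  -Method Post `
                  -Body (ConvertTo-Json @($versions)) `
                  -ContentType 'application/json'
```

Wrapping `$versions` in `@()` keeps the JSON an array even when only one package version is found.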