We're using Azure Automation DSC (AADSC) to help our clients set up environments that run our software. The question is how we bring it all together so it's easy for clients with no DSC experience to use.
My biggest headache at the moment is managing updates to configuration files. Currently, when you need to update a software version, you have to manually change a configuration file, upload it to Azure, and then apply it to however many servers you have running. This seems simple if you've worked with DSC, but it becomes cumbersome and error prone when a number of consultants are trying to accomplish it.
Does anyone know of a type of front end we can use to give us a simpler way to manage it?
I'd possibly opine that DSC wasn't meant as a software deployment/update mechanism. But that might just be me.
But to answer the question, no. Kind of the whole weak point with DSC is the utter lack of any management tools. If such a front end existed, DSC would be a lot more popular.
Thanks for the response.
But as far as I'm aware, DSC's sole purpose is to manage the state of a system, and the state of a system will always include software packages?
Choco does a good job of managing the package installs, all that DSC has to do is provide a place to keep track of software versions that Choco will manage.
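To make that division of labour concrete, here's a minimal sketch using the community cChoco DSC resource, where DSC only records the version and Chocolatey does the install. The package id `myapp` and the version are placeholders for your own software:

```powershell
Configuration AppVersion
{
    Import-DscResource -ModuleName cChoco

    Node 'localhost'
    {
        cChocoInstaller InstallChoco
        {
            InstallDir = 'C:\choco'
        }

        # Updating the software is now a one-line change to Version.
        cChocoPackageInstaller MyApp
        {
            Name      = 'myapp'      # hypothetical package id
            Version   = '2.1.0'
            Ensure    = 'Present'
            DependsOn = '[cChocoInstaller]InstallChoco'
        }
    }
}
```

The appeal is that the "change a software version" operation shrinks to editing a single property value rather than restructuring a configuration document.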
Given that you're in Azure Automation, your best bet is to use runbooks to provide the missing glue. From a UX perspective, they aren't great, no disputing that. However, they can potentially allow you to create streamlined interfaces to your configurations and configuration data so that the consultants don't have to face the full breadth of editing a DSC document.
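As a sketch of what that glue might look like: a runbook that accepts only the values consultants care about and recompiles the configuration for them. Resource group, account, and configuration names are placeholders, and the cmdlet shown is from the AzureRM Automation module of that era:

```powershell
# Runbook sketch: the consultant supplies only a version string; the runbook
# recompiles the DSC configuration in Azure Automation with the new data.
param(
    [Parameter(Mandatory)]
    [string]$SoftwareVersion
)

$params = @{
    ResourceGroupName     = 'rg-client'      # placeholder
    AutomationAccountName = 'aa-client'      # placeholder
    ConfigurationName     = 'AppVersion'     # placeholder
    Parameters            = @{ SoftwareVersion = $SoftwareVersion }
}

# Registered nodes pick up the recompiled MOF on their next pull interval.
Start-AzureRmAutomationDscCompilationJob @params
```

The consultant never sees a DSC document; the runbook parameters become the whole interface.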
Thanks Josh, this is plan B if we can't find an off-the-shelf management tool.
I wouldn't mind giving it a try and giving you some feedback.
TL;DR: AFAIK there's no easy solution, and no out-of-the-box product that actually solves the problem.
I agree with this point (and used/tested it in prod):
Bear in mind it's always harder than it looks, and you're handing over a lot to choco, which is not idempotent by design (it depends on the packages, mostly). There's also the boundary between software management and configuration management, which is not always easy to maintain on Windows. But it sounds like you're familiar with this already.
You're actually describing two problems here:
1. Deploying a configuration change to however many nodes you have, without manual steps.
2. Giving consultants who aren't DSC literate a simple interface for making those changes.
You could change the way you use AADSC so that you build your MOFs in your own build pipeline (i.e. using VSTS or something else) instead of letting the service do it for you, and then assign those MOFs to target nodes (that can be automated, IIRC).
The idea is that a change request (a change to the 'configuration document') goes through your pipeline, which deploys the change for you. That leaves you with only the second problem: an interface that is easy to use and declarative enough for non-DSC-literate users.
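A rough sketch of that pattern, compiling locally and pushing the resulting MOF to the service (paths and names are placeholders; the import cmdlet is from the AzureRM module of that era):

```powershell
# Compile the MOF in your own pipeline instead of in Azure Automation.
. .\AppVersion.ps1                 # dot-source the Configuration definition
AppVersion -OutputPath .\mof       # produces .\mof\localhost.mof

# Push the pre-compiled node configuration up to the Automation account.
Import-AzureRmAutomationDscNodeConfiguration `
    -ResourceGroupName 'rg-client' `
    -AutomationAccountName 'aa-client' `
    -ConfigurationName 'AppVersion' `
    -Path .\mof\localhost.mof
```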
The key principle of Infrastructure as Code is to improve the capacity to change. It does not matter how well written your infrastructure code is; if it's impossible, or even just difficult, to change, its usefulness is... limited.
Yet DSC configurations may contain logic to bridge the gap between configuration data that is meaningful to the business and something meaningful to the DSC platform and, eventually, the LCM (each layer has a contract with the one below).
The data now lives outside the configuration, so you need a way to 'inject' it into your DSC configuration, and this is where the DSC tooling problem comes from: there's nothing out of the box that lets you do that.
The format in which the data is stored is no longer DSC specific (as long as we can inject it in a way compatible with DSC, via hashtables), so we can focus on what format makes the most sense for the user making changes to it.
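The separation described above can be sketched with DSC's standard `-ConfigurationData` mechanism; the `AppVersion` property and file path are illustrative only:

```powershell
# Data lives outside the configuration; this hashtable could come from
# a file, a YAML document, or a web form.
$configData = @{
    AllNodes = @(
        @{ NodeName = 'localhost'; AppVersion = '2.1.0' }  # data, not logic
    )
}

Configuration AppStack
{
    Node $AllNodes.NodeName
    {
        # The resource reads injected data instead of a hard-coded value.
        File VersionMarker
        {
            DestinationPath = 'C:\app\version.txt'
            Contents        = $Node.AppVersion
        }
    }
}

AppStack -ConfigurationData $configData -OutputPath .\mof
```

The configuration only knows the *shape* of the data; the business-meaningful values are supplied at compile time.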
The most widely used interface for managing configurations is YAML files stored in source control (i.e. git) and managed in a version control system (which brings added benefits, such as a web interface), because they're terse, easy for the human eye to read, and easy for computers to parse. Source control provides the collaboration workflow and tools to manage (audit, review, version) changes, and usually a CI tool is 'plugged' into the VCS to form a pipeline, so that it automatically tests and builds trust in a change initiated by humans, and eventually releases/deploys it. This is what Chef (Roles, Data Bags), Puppet (Hiera), and Ansible (Roles and Playbooks) use.
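Tying that to the hashtable injection above, here's a sketch of turning a consultant-edited YAML file into DSC configuration data. It assumes the community `powershell-yaml` module, which provides `ConvertFrom-Yaml`; the data shape is the same illustrative one as before:

```powershell
# Consultants edit terse YAML in source control; the pipeline converts it
# to the hashtable DSC expects. Requires the community 'powershell-yaml' module.
Import-Module powershell-yaml

$yaml = @'
AllNodes:
  - NodeName: localhost
    AppVersion: 2.1.0
'@

$configData = ConvertFrom-Yaml $yaml
# $configData can now be passed as -ConfigurationData when compiling.
```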
In the end, the solution you implement depends on how painful your current problem is, and the investment you can make to solve it. 🙂 Hope that helped seeing the bigger picture.
@Gael Thanks for your detailed post. I've been to your GitHub and blog quite a few times and am only starting to grasp the concepts you're building on. You come across as very passionate about what you're doing.
I think our biggest problem at present is that we are trying to productize DSC as a tool to help deployment into client environments. Given this scenario, we can't use the normal CI/CD model of integrating with TFS or GitHub as a source to trigger the builds.
I think our only solution for now is, as Josh mentioned, to use Azure Automation as the glue and then build a front end to trigger webhooks.
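The front-end side of that is pleasantly thin: whatever UI we build only has to POST to the runbook's webhook. A sketch, with a placeholder URI and body shape:

```powershell
# Any simple front end only needs to POST the consultant's input to the
# Azure Automation runbook webhook. URI and body fields are placeholders.
$webhookUri = 'https://s1events.azure-automation.net/webhooks?token=...'  # placeholder

$body = @{ SoftwareVersion = '2.1.0' } | ConvertTo-Json

Invoke-RestMethod -Uri $webhookUri -Method Post -Body $body
```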
This is the first step for us in moving our application stack to the cloud, so it's going to be an interesting one.
Thank you everyone for taking the time to reply.