My company is in the process of moving towards a SaaS model to host our complex web application (a single web app with multiple data store engines, including an RDBMS and MongoDB, among others).
For years we have deployed the product either manually or by using an MSI installer followed by multiple manual steps. However, as we grow the product and incorporate newer features and technologies, the complexity of the installation is growing as well.
Ours is a purely Windows application and we have little or no expertise in cross-platform deployment solutions, so I have settled on DSC as the best option for deploying into our own hosted environment. However, many of our customers remain self-hosted in a tightly controlled environment. It is not uncommon for these environments to be so locked down that we have to manually copy .NET framework updates or Oracle client installers to the box via an RDP client, as it is not possible to download directly to the server.
This means that our ultimate "best" solution has to work equally well in all environments – those that can access a pull server as well as those where we drop the software and packages directly onto the box and run it. I see this latter option as self-defeating when working with customers that won't allow at least access to the PS Gallery, and we may have to keep the MSI approach in our pocket for some time.
I know that what I'm describing is essentially a container package, although I'm behind the learning curve on that at the moment and just trying to catch up with DSC. Any thoughts or suggestions on the best way to deploy a "standalone" package that also works well in push/pull environments?
Though you didn't mention your role in your organization, I'll attempt to address your question on a few levels.
What you are looking at is indeed package management.
I'll point you to two technologies I think you should aim for: NuGet and Chocolatey.
PowerShell works very well with both, and so does DSC.
This, however, will not let you drop the MSI or stop packaging dependencies yourself. It's merely an additional delivery channel.
In my environment, devs used to compile on their own stations and deploy themselves. I've moved them all onto a single source control platform and a single deployment platform. My next step is for each build to produce a NuGet package that is pushed to an on-premise NuGet server (very easy to set one up), which will also host the rest of the packages for software deployment: .NET, for example, or eventually even MS Office, SQL Server, or the Oracle Client. Any Setup.exe- or MSI-based installer that can run in quiet mode or with an answer file, you can package yourself as a Chocolatey/NuGet package.
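To make that concrete, wrapping a silent installer usually comes down to a tiny install script inside the package. A minimal sketch (the package name, URL, and silent arguments below are hypothetical placeholders, not a real package):

```powershell
# tools\chocolateyInstall.ps1 inside the package (illustrative only)
$packageArgs = @{
    PackageName    = 'our-oracle-client'                      # hypothetical package id
    FileType       = 'msi'
    Url            = 'http://nuget.internal/files/client.msi' # hypothetical internal URL
    SilentArgs     = '/qn /norestart'                         # quiet-mode switches for the MSI
    ValidExitCodes = @(0, 3010)                               # 3010 = success, reboot required
}
Install-ChocolateyPackage @packageArgs
```

Running `choco pack` over a nuspec that includes this script produces a package that installs the MSI quietly on any node with Chocolatey present.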
That is where I employ DSC locally to deploy the packages, alongside other components, as I consider them all part of the state of the node.
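For instance, treating an installed prerequisite as part of the node's state can be expressed with the built-in Package DSC resource, roughly like this (the share path, product name, and installer arguments are hypothetical):

```powershell
Configuration AppPrereqs
{
    Import-DscResource -ModuleName PSDesiredStateConfiguration

    Node 'localhost'
    {
        Package OracleClient
        {
            Ensure    = 'Present'
            Name      = 'Oracle Client'                    # must match the name shown in Programs and Features
            Path      = '\\packages\oracle\setup.exe'      # hypothetical internal share
            ProductId = ''                                 # empty for non-MSI installers
            Arguments = '-silent -responseFile client.rsp' # hypothetical answer file
        }
    }
}

# Compile the MOF and push it to the local node
AppPrereqs -OutputPath 'C:\DSC\AppPrereqs'
Start-DscConfiguration -Path 'C:\DSC\AppPrereqs' -Wait -Verbose
```

The same configuration works in push mode (as above) or assigned from a pull server, which is what makes it attractive for both hosted and locked-down scenarios.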
In your customers' environments, you're very limited in what you can do. They will not always let you build a pull server or a NuGet server for them; sometimes it might even cause duplication if they have already implemented a local package management system such as one of the options in SCCM. The only option there is a big installation package that bundles all of the dependencies, like .NET or the Oracle Client, and chains the installations. I don't think that part will ever change for organizations that are very strict or that do not maintain an internal package repository.
You can sell them the idea that deployment over HTTP (as with NuGet/Chocolatey) is better security-wise than a file share hosting the installation files with people given access to it. There are numerous package manager solutions that might be better for their specific needs; you'll just have to package your product in more than one way to suit your needs and your customers'.
I would definitely not use any method that depends on your customer's infrastructure as the means to deploy your packages to them; it's kind of like taking responsibility for someone else's servers. So things like building a DSC environment at your place and then reproducing it at each customer's site will not, I think, be a good thing.
If your customers do have a cloud presence connected to their infrastructure, you could extend your package management solution to your cloud and share it with them as a service. Each customer has their own ways; it won't be easy 🙂
You do not need access to the PSGallery for internal servers. I download the packages I'm interested in, move them to our internal network, and point the servers at that location. I do the same with the PowerShell help files: saving them offline, moving them internally, and setting servers to update their help from that common location via a scheduled task on each node.
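The offline round trip can be as simple as the following, assuming a reachable file share (the share paths and module name here are placeholders):

```powershell
# On an internet-connected machine: save the modules and help content locally
Save-Module -Name xPSDesiredStateConfiguration -Path 'C:\Offline\Modules'
Save-Help -DestinationPath 'C:\Offline\Help' -Force

# After copying C:\Offline to an internal share, on each node:
# drop the module folders into a path listed in $env:PSModulePath...
Copy-Item '\\fileserver\Offline\Modules\*' `
          'C:\Program Files\WindowsPowerShell\Modules' -Recurse -Force

# ...and refresh help from the internal copy (this is the scheduled-task payload)
Update-Help -SourcePath '\\fileserver\Offline\Help' -Force
```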
Hope this sheds some light on how I see things.
Thanks again, Arie.
Our app is HUGE (and over ten years old), consisting of upwards of 30 projects stored in multiple Git repos. We use TeamCity as our build manager and also host a package manager there, but are considering another resource as a PM because of some TeamCity limitations. We use a mix of multiple datastores and front-end technologies. Like many small companies, we face the daily challenge of keeping up with maintenance while adding value and new features for the company and our customers.
Our developers typically build/host the environment on their dev boxes, push to Git, and we have a two-stage build process in TeamCity (all components build independently, then a second step combines them into a deployable application). We use NuGet on TeamCity as our package manager, but are currently only building packages for our own DLLs used in maintenance utilities. Our packages are built from nuspec files stored in a separate repo, and we build them manually at specific releases because the storage cost of building them on every dev check-in is far too high (all projects are built with every check-in). I'm considering expanding the build process to three stages (Dev, QA, Stable) so that we can automate package builds to run only when we build a stable release.
Currently QA and Demo environments are manually installed as needed on local bare-metal or VMs, as available.
So the upshot is that we're growing and learning, but do not have the bandwidth to completely overhaul everything all at once. I'm mostly a one-man show trying to implement deployment automation with the support of several very interested parties. The amount of time I have to commit to this is limited by other duties as well (maintenance development and customer installs).
Your suggestion of a separate PM hosting our releases and dependencies is very well taken and would be a very good early step towards automation.
Since you're using TeamCity, you have built-in support for NuGet; just use an "external" NuGet server outside of TeamCity if it's causing you issues. The best part about NuGet is package dependencies, which can help make sure the customer gets everything they need.
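As an illustration, those dependencies are declared in the application's .nuspec, so installing the top-level package pulls in its prerequisites from the same feed (all IDs and versions below are made up):

```xml
<?xml version="1.0"?>
<package>
  <metadata>
    <id>OurWebApp</id>
    <version>2.1.0</version>
    <authors>OurCompany</authors>
    <description>Deployable release of the application.</description>
    <dependencies>
      <!-- hypothetical prerequisite packages hosted on the same feed -->
      <dependency id="OurWebApp.OracleClient" version="12.1.0" />
      <dependency id="OurWebApp.DotNetPrereqs" version="4.6.0" />
    </dependencies>
  </metadata>
</package>
```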
Based on your description of how it works, I would take a deep look at your source control tree structure, the branching and merging guidelines you're using, and how you handle dependency injection across the projects. Part of the issues you see in your deployment are caused by that, not by what you see on the surface. So as much as adding another stage will help, don't just do it to "push" the problem to the deployment phase 🙂
Initially I started using Release Management for TFS just for software projects, but since RM supports DSC, I'm now working on swapping all my existing deployment pipelines to use RM via DSC, both for software projects and for provisioning.
You can also look into configuration management solutions like Chef, Puppet, or Ansible, as they all work with DSC nowadays, but they would mostly require you to have a Linux server as the main configuration server, which is another good reason to think about hosting some of your infrastructure in the cloud (as long as your contract with your customers doesn't prohibit that, of course).
We're already using TeamCity's NuGet server, we're just not deploying our final product from there. It does have its limitations (such as not being able to deploy multiple package versions from a project, which requires us to maintain a separate "reference" project per release). This is why we're considering a separate package server, but having that server host our actual application releases is a great idea.
As I mentioned, the actual application is quite old, and while we are creating new source control repos for newer work, the legacy work is what it is: too big and costly to justify repackaging just yet with our limited resources. As we solve some of the peripheral issues (and get more of the original project rewritten by the hot newness), the hope is that some of the rework will be simplified and become clearer.
We had looked into Chef &amp; Puppet, but as you pointed out, there is at least a minimal Linux component there, and currently there is not a lot of interest in the company in going that direction. They were actually my main line of original research, but DSC provided an opportunity to get some of their benefits while sticking with Microsoft. Once we have DSC, a future move to one of them becomes easier as well.