Managing scripts centrally and distributing them to endpoints

This topic contains 3 replies, has 2 voices, and was last updated by Chris Parker 5 months, 4 weeks ago.

  • #62037
    Chris Parker
    Participant

    Hi everyone,

I'm familiar with git and source control, but working alone at a small company, I don't have much practical experience using it. The biggest challenge I have is, like the title says, managing scripts centrally and distributing them to endpoints.

For example, I have a script that I wrote that runs on a specific server. I was able to write some of it on my own workstation, but then I had to copy it over to the other machine and continue there. Not wanting to install git on every machine, I am now periodically copying the file back to my workstation to commit.

This is not a great workflow. Some options I came up with are:

    1. change my mindset when it comes to installing git and just install it everywhere it's needed
    2. devise some kind of elaborate distribution system that doesn't involve git being installed
    3. keep doing what I'm doing

I should also mention that another goal I have is continuous deployment. In some cases I could benefit from having a system like this. For example, I have a few of our DNS zones hosted in Azure, where I define the zone through a PowerShell script. It's a start, but it's not great because the script isn't idempotent. My next step for this component is to write a better script so that I can define the zone as a JSON text file and store it in source control. Then, when I commit DNS changes, those can automatically be picked up and the script will make the Azure zone match my JSON file.
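For what it's worth, the reconcile step described above could be sketched roughly like this. This is only an illustration, assuming the Az.Dns module, an existing login (Connect-AzAccount), and made-up names for the zone, resource group, and JSON layout:

    # Sketch only: reconcile an Azure DNS zone against a JSON definition.
    # Zone name, resource group, and the zone.json layout are hypothetical.
    $zoneName      = "example.com"
    $resourceGroup = "rg-dns"

    # zone.json: [ { "Name": "www", "Type": "A", "Ttl": 3600, "Values": ["10.0.0.5"] }, ... ]
    $desired = Get-Content .\zone.json -Raw | ConvertFrom-Json

    foreach ($record in $desired) {
        $existing = Get-AzDnsRecordSet -ZoneName $zoneName -ResourceGroupName $resourceGroup `
            -Name $record.Name -RecordType $record.Type -ErrorAction SilentlyContinue

        if (-not $existing) {
            # Record set missing in Azure: create it (A records shown;
            # other record types would need their own config mapping)
            $configs = $record.Values | ForEach-Object { New-AzDnsRecordConfig -Ipv4Address $_ }
            New-AzDnsRecordSet -ZoneName $zoneName -ResourceGroupName $resourceGroup `
                -Name $record.Name -RecordType $record.Type -Ttl $record.Ttl -DnsRecords $configs
        }
        # A full version would also diff TTLs/values of existing record sets
        # and remove record sets that exist in Azure but not in the JSON file.
    }

The idempotency comes from checking what already exists before creating anything, so re-running after a commit only applies the difference.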

I see the first option as a stepping stone toward this goal, but I'm still unsure whether it's the right way to go about achieving it.

  • #62445
    Chris Parker
    Participant

Has no one else dealt with, or is dealing with, this issue?

  • #62457
    Jeffery Hayes
    Participant

You can use PowerShell profiles, which let you build a profile and then just tell users to copy it to their workstations.

Let me know if this would work for you. What I did was basically use New-Item $profile -Force to create the default PowerShell profile, then Invoke-Item to open it:

New-Item $profile -Force
Invoke-Item $profile

Then I copy in my script that loads the scripts from the DFS share.

Then just use:
$psdir = "\\dfssharename"
Get-ChildItem "$psdir\*.ps1" | ForEach-Object { . $_.FullName }

That, in the profile, will load up any .ps1 file. I'm sure it needs to be adjusted a bit, but you might also be able to pull from GitHub using a profile script.
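If you did want to pull from GitHub instead of a DFS share, the profile could fetch the raw files over HTTPS and dot-source them. A rough sketch, where the repo, branch, and file names are all hypothetical:

    # Sketch: in the PowerShell profile, download and dot-source scripts
    # from a GitHub repo's raw content. Repo/branch/file names are made up.
    $repoRaw = "https://raw.githubusercontent.com/yourorg/ps-scripts/main"
    $scripts = @("Common.ps1", "DnsHelpers.ps1")

    foreach ($name in $scripts) {
        $local = Join-Path $env:TEMP $name
        Invoke-WebRequest -Uri "$repoRaw/$name" -OutFile $local
        . $local   # dot-source so the functions land in the current session
    }

Same idea as the DFS version, just with Invoke-WebRequest doing the copy first.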

    • #62461
      Chris Parker
      Participant

Thanks for your reply, but I'm not distributing scripts to users. My question is about scripts that are used by servers in some capacity, usually as scheduled tasks.

Having said that, your method gives me an idea. One thing I need to do is set up a central share where all my scripts can be found and downloaded (as permitted). The important part is that I need the git client to pull changes automatically so I won't have to update the repo manually. I wonder if git can automatically detect new commits and update itself.
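As far as I know git won't poll on its own, but a scheduled task on the server could do the polling. A sketch, assuming git.exe is on PATH and a hypothetical clone at C:\Scripts:

    # Sketch: register a scheduled task that pulls the repo every 15 minutes.
    # Assumes git.exe is on PATH and the repo was cloned to C:\Scripts (made up).
    $action  = New-ScheduledTaskAction -Execute "git.exe" `
        -Argument "-C C:\Scripts pull --ff-only" -WorkingDirectory "C:\Scripts"
    $trigger = New-ScheduledTaskTrigger -Once -At (Get-Date) `
        -RepetitionInterval (New-TimeSpan -Minutes 15)
    Register-ScheduledTask -TaskName "PullScriptsRepo" -Action $action -Trigger $trigger

The --ff-only flag keeps the server copy strictly following the remote, so a diverged local clone fails loudly instead of creating a merge.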
