Dot-source script to remotely connect to servers or not

This topic contains 3 replies, has 3 voices, and was last updated by 84rusty . 3 months ago.

  • Author
    Posts
  • #53480
    Ronnie Jorgensen
    Participant

    Hi all,

    So this is probably just a question of personal preference, but I would like to ask other PowerShell people how they go about connecting to remote computers when they write a script.

    In my case, all the scripts I work with will need to connect either to an AD domain controller or to Exchange 2010 or Exchange Online. So I already know that I need this pretty often.

    In my PS profile I also load a shared network share into the environment PATH, so I don't have to browse to the network share every time I want to run a script from our script share.

    So I have to decide whether my scripts should have a dependency on another file that is dot-sourced, or whether I should just add the code to connect to AD/Exchange/Exchange Online to each of my scripts and then, at the end of the script, disconnect from AD/Exchange/Exchange Online.

    I like the idea of maintaining the remoting script file in only one place, but I don't like the dependency. What are others doing?
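    To make the trade-off concrete, the dot-sourcing approach might look like this. The share path and the Connect-/Disconnect- function names below are placeholders, not real commands; they stand in for whatever the shared file defines:

    ```powershell
    # Dot-source a shared functions file at the top of each script.
    # Its functions load into the script's scope for the rest of the run.
    . '\\fileserver\Scripts\Connect-Services.ps1'

    try {
        Connect-ExchangeEnvironment      # assumed to be defined in the shared file
        # ... script body ...
    }
    finally {
        Disconnect-ExchangeEnvironment   # clean up even if the script errors out
    }
    ```

    The `try/finally` keeps the disconnect in one place, so every exit path tears the session down.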

  • #53485
    Richard Diphoorn
    Participant

    Be careful when running scripts from a UNC path. It's known to cause problems. Read up on it here: https://blogs.technet.microsoft.com/heyscriptingguy/2011/01/04/run-powershell-scripts-stored-on-a-central-file-share/

    I've placed the connect functions for several services in my profile. So whenever I use a script that, let's say, needs a connection to Exchange, I already have that function available in my profile — and as a result, already available in memory. 🙂

    It depends on your personal preference. If you find yourself sharing your scripts, either include the functions IN the script itself, or keep them separated in a dedicated functions file. You could even take it a step further and create a module which you can hand over to another person.

    As I said, it depends a lot on your own preference, because too much code in your profile makes starting up the ISE or the console slow.
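    A profile-based connect function for on-premises Exchange 2010 typically uses implicit remoting. This is a minimal sketch — the server URI is a placeholder, and Kerberos is assumed for a domain-joined workstation:

    ```powershell
    # Kept in $PROFILE so it is always available in a new console.
    function Connect-Exchange {
        param([string]$Uri = 'http://exch01.local.com/PowerShell/')

        $session = New-PSSession -ConfigurationName Microsoft.Exchange `
                                 -ConnectionUri $Uri -Authentication Kerberos
        # Import the Exchange cmdlets as local proxy functions.
        Import-PSSession -Session $session -AllowClobber | Out-Null
    }
    ```

    Because the function lives in the profile rather than a module, the proxied cmdlets land in the console's own scope and stay usable after the function returns.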

  • #53501
    Ronnie Jorgensen
    Participant

    Hi Richard,

    Thanks for the heads-up on running scripts from a UNC path. Can you tell me how else you would keep multiple scripts on one server so that a team of five can use them from our own workstations?

    Enter-PSSession -ComputerName "mgmtserver.local.com" perhaps, and then from there go to C:\Scripts?

    I have tried playing around with modules, and while they are a great option, I have big problems with the AD connect feature.

    Any tips you can give is hugely appreciated.

    The problem I faced when adding my Exchange connect to a module was that if I started a console and just wanted to run Connect-Exchange, for example, it would run and then the connection would disappear on me, because the module has its own script scope.
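    The scope problem described above has a common workaround: `Import-PSSession` called inside a module imports the proxy cmdlets into the module's scope, where they vanish for the caller. Wrapping the result in `Import-Module -Global` pushes them into the global session instead. A sketch, with a placeholder server URI:

    ```powershell
    # Inside the module's .psm1 file:
    function Connect-Exchange {
        $session = New-PSSession -ConfigurationName Microsoft.Exchange `
                                 -ConnectionUri 'http://exch01.local.com/PowerShell/'
        # Import-PSSession returns a temporary module; re-importing it with
        # -Global makes the Exchange cmdlets visible outside the module scope.
        Import-Module (Import-PSSession -Session $session -AllowClobber) -Global
    }
    ```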

    I wanted to use my Exchange module freely both in scripts and in the console, but it never worked out that way. Not sure if my understanding of how to use modules is good or bad. lol

    With the AD connect it was different: I wanted to check whether the ActiveDirectory module was present, and if it was, just use that — while still making sure my commands passed the admin credentials and not the normal ones. If the AD module was not present, I wanted to connect to a domain controller instead and use the AD commands from there. It was all very messy with my limited understanding.
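    The fallback logic described above can be sketched roughly as follows. The domain controller name, the sample user, and `$AdminCred` are placeholders for whatever your environment provides:

    ```powershell
    if (Get-Module -ListAvailable -Name ActiveDirectory) {
        # RSAT module is installed locally: use it with explicit admin credentials.
        Import-Module ActiveDirectory
        Get-ADUser -Identity someuser -Credential $AdminCred
    }
    else {
        # No local module: fall back to implicit remoting against a DC.
        $s = New-PSSession -ComputerName 'dc01.local.com' -Credential $AdminCred
        Import-Module (Import-PSSession -Session $s -Module ActiveDirectory -AllowClobber) -Global
        # The proxied cmdlets now run in the DC session under the admin credentials.
        Get-ADUser -Identity someuser
    }
    ```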

  • #53848
    84rusty .
    Participant

    Sorry if this is a little off-topic; not sure if it directly correlates with what you're asking, but it might help someone out there. For Exchange 2010 (on premises) I kind of cheat: I have a bunch of scripts on the Exchange server itself, tied to various custom tasks in Task Scheduler that can be called remotely.

    I can remotely call said task(s) with PowerShell or a batch file, and then expect a .csv file (in a network share or similar), an email notification, etc. as a return. You can even query the task status every x seconds and, when it completes, automatically detect and import the return, then process it with PowerShell as you see fit, e.g. manipulate the CSV data. This also has the benefit of running the scripts faster, since they execute local to the database.

    This means you only need to worry about the privilege to run said task via a remote command-line call, provided these scripts are static.
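    The trigger-and-poll pattern above might look like this with `schtasks.exe`; the server, task name, and share path are placeholders:

    ```powershell
    # Start the task remotely on the Exchange server.
    schtasks /Run /S exch01.local.com /TN "ExportMailboxStats"

    # Poll its status until it is no longer running.
    do {
        Start-Sleep -Seconds 10
        $status = schtasks /Query /S exch01.local.com /TN "ExportMailboxStats" /FO CSV |
                  ConvertFrom-Csv
    } while ($status.Status -eq 'Running')

    # Pick up the result the task dropped on the share.
    $data = Import-Csv '\\exch01\Reports\MailboxStats.csv'
    ```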
