This topic contains 2 replies and has 2 voices.
November 1, 2016 at 4:03 pm | #56602 | Participant | Topics: 2 | Replies: 7 | Points: 1 | Rank: Member
I'm about to start building a PS script to create new user accounts. We're a small company (about 100 users), so bulk creation isn't a requirement.
My approach has been to work out how to do one thing at a time in the shell to get the code right, then paste that into a file (in Notepad++, though I'm seriously considering switching to the ISE). It all seems to be coming together nicely.
The driving force for this is that the old VBScript version, which I also wrote, is loooooong (about 2,000 lines, although some of those are redundant functions) and difficult to support. That is, if there's any kind of change, I have to edit the script, then test it, then make it available again. Also, since we moved to Exchange 2007, it no longer creates the user mailbox.
With PowerShell, I'm instead going to use template AD user account objects (one per department) as the basis for each new user's account. Since the changes I usually have to make to the script are things like office address, standard groups, and so on, those changes can instead be made to the template accounts without requiring any script editing or testing.
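The template-account approach described above could look roughly like this sketch. It assumes the ActiveDirectory module is available and that a template account exists per department; the template name, OU path, and property list are all assumptions to adapt:

```powershell
# Sketch only: create a new user from a per-department template account.
# 'tmpl_sales', the OU path, and the property list are hypothetical.
Import-Module ActiveDirectory

# Attributes worth carrying over from the template (adjust as needed).
$propsToCopy = 'City', 'Company', 'Department', 'StreetAddress', 'Title'

$template = Get-ADUser -Identity 'tmpl_sales' -Properties $propsToCopy

# -Instance uses the template object as the source for attributes
# you don't specify explicitly on New-ADUser.
New-ADUser -Name 'Jane Doe' `
           -SamAccountName 'jdoe' `
           -Instance $template `
           -Path 'OU=Sales,DC=example,DC=com' `
           -AccountPassword (Read-Host -AsSecureString 'Initial password') `
           -Enabled $true

# Group membership is NOT copied by -Instance; copy it explicitly.
Get-ADPrincipalGroupMembership -Identity 'tmpl_sales' |
    Where-Object Name -ne 'Domain Users' |
    ForEach-Object { Add-ADGroupMember -Identity $_ -Members 'jdoe' }
```

One nice side effect of this design: changing a department's office address or standard groups becomes an AD edit on the template, with no script change to test or redeploy.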
So far my experience with PS suggests the end script will be more like 100 lines long. Nice one.
That's a pretty long preamble to my question! My question is: should I go ahead and build a single script for this, or should I build lots of little scripts that each do a small part of the process, with a controller script that calls them one by one? I've heard Don Jones' argument (towards the end of this video: https://www.youtube.com/watch?v=KprrLkjPq_c) for using a controller script, since small scripts are easier to maintain, but I wouldn't have thought this script would be particularly big anyway.
November 1, 2016 at 4:09 pm | #56605 | Keymaster | Topics: 13 | Replies: 4872 | Points: 1,813 | Rank: Community Hero
First, I would start using the ISE, along with ISESteroids. Seriously. Easier.
Second, I would humbly suggest reading "Learn PowerShell Toolmaking in a Month of Lunches." What it suggests is that you build a series of single-purpose commands (functions) to do each little task, and then build a short "controller" script that strings those together to automate the process. Everything in your life – debugging, coding, even your love life and driving to work – will be vastly simpler and more satisfying that way.
It isn't a matter of big or small. It's a matter of building and maintaining. A 10-line chunk of code is easier to debug, test, and maintain than a 100-line hunk; having tasks enclosed in functions keeps them tightly scoped, easy to test and debug on their own, etc. If you're going down the route of automated unit testing (and you should) using Pester, having tightly-scoped functions will make your tests easier to write, as well. Monolithic scripts of any size are a sure sign of an inexperienced coder with no foresight :). Highly modularized scripts – e.g., broken into discrete functions – are the sign of a higher form of intelligence with great concern for the future of our species.
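In concrete terms, the toolmaking pattern might look like this sketch: single-purpose functions, then a controller that reads as a short, linear story. All function and parameter names here are hypothetical, and the AD/Exchange calls are stubbed out as comments:

```powershell
# One task per function; each is tightly scoped and testable on its own.

function New-SamAccountName {
    # Pure logic: first initial + surname, lowercased (e.g. 'jdoe').
    param(
        [Parameter(Mandatory)][string]$GivenName,
        [Parameter(Mandatory)][string]$Surname
    )
    ('{0}{1}' -f $GivenName.Substring(0, 1), $Surname).ToLower()
}

function New-CompanyUser {
    param([string]$GivenName, [string]$Surname, [string]$Department)
    # Real version: Get-ADUser on the department's template account,
    # then New-ADUser -Instance $template ... (stubbed here).
}

function Add-CompanyMailbox {
    param([string]$SamAccountName)
    # Real version: Enable-Mailbox for Exchange (stubbed here).
}

# Controller: strings the tasks together in order.
$sam = New-SamAccountName -GivenName 'Jane' -Surname 'Doe'
New-CompanyUser -GivenName 'Jane' -Surname 'Doe' -Department 'Sales'
Add-CompanyMailbox -SamAccountName $sam
```

A pure function like `New-SamAccountName` is exactly the kind of thing a one-line Pester `It` block can verify, which is much harder when the same logic is buried in the middle of a monolithic script.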
But I suppose you already heard that argument in the video ;). It's a good argument, though, and it's firmly where the experienced shellers in the community come down.
November 1, 2016 at 4:41 pm | #56614 | Participant | Topics: 2 | Replies: 7 | Points: 1 | Rank: Member
Thanks, Don. Nice to get a result from the man himself (and so quickly!)
Do the other two videos cover how to build a controller script?
I'm also wondering about the best way to make all these little scripts, plus the controller script, available to the rest of the team.
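One common way to share a function toolkit like this (an assumption about the setup, not the only option) is to put the functions in a `.psm1` module file on a network share and have everyone `Import-Module` it. The sketch below uses a temp folder as a stand-in for the share so it's self-contained; the module name and function body are hypothetical:

```powershell
# Stand-in for a network share such as \\server\PSModules (path assumed).
$moduleDir = Join-Path ([IO.Path]::GetTempPath()) 'CompanyUsers'
New-Item -ItemType Directory -Path $moduleDir -Force | Out-Null

# Write a minimal module file; the real one would hold all the task functions.
@'
function New-CompanyUser {
    param([string]$GivenName, [string]$Surname, [string]$Department)
    # Real version would call New-ADUser; this stub just reports its intent.
    "Would create $GivenName $Surname in $Department"
}
Export-ModuleMember -Function New-CompanyUser
'@ | Set-Content -Path (Join-Path $moduleDir 'CompanyUsers.psm1')

# Team members (or the controller script) load the shared toolkit:
Import-Module (Join-Path $moduleDir 'CompanyUsers.psm1') -Force
New-CompanyUser -GivenName 'Jane' -Surname 'Doe' -Department 'Sales'
```

With the functions in one shared module, the controller script each person runs stays tiny, and a fix to any task function reaches the whole team on their next import.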
The topic ‘Script Structure’ is closed to new replies.