
Writing Courseware: 10961 PowerShell Class

Sep 5, 2013
0

We're in the process of working on a 10961C revision to the Microsoft PowerShell course, and I've been reviewing the anonymous comments submitted by MCTs and students on 10961A (the "B" rev, which was produced after our beta teach, is just now orderable, so we don't have comments on it yet).

By the way - if you're a student or MCT who has taken/delivered 10961A, you're welcome to contact me directly if you want to share any info on typos you found. I'd like to fix those. Microsoft unfortunately didn't bill 10961A as "pre-beta," which it was, and I think that may not have properly set some expectations.

Anyway, if you've ever taken a course and thought anything bad about the courseware (not necessarily the instructor), take a look at these comment excerpts from this one course:

By day 3 (5 day class) most students felt over-whelmed. I had to move some of the chapters around to give them time to acclimate to the product before continuing onto more advanced topics. Students agreed that this shifting around of material was essential, allowing them to absorb what was covered in the first 2 days.

There was not nearly enough material to fill a 5 day class. Students ended up leaving very early on the last two days.

The class had too much repetition of some concepts.

Students were not given enough time or repetition on core fundamentals.

Right. Same class. No idea what to do with that, as a courseware designer.

(and by the way, this is after parsing through hundreds of comments from students who took the class remotely and were extremely dissatisfied with the experience. Believe me, you want to take training live and in-person.)

There's also a question of, "what the heck were you expecting?"

was looking for more examples and understanding of using exchange and AD comandlet.

Missed basic knowledge of Workflows and Web Access.

Should include Flowchart among new features released in Version 3 [as soon as I figure out what feature 'flowchart' is, I'll get right on it]

There was nothing geared toward using PowerShell with SQL Server.

Some material and labs not as relevant for me specifically without a networking/server background. I will likely use exclusively for SharePoint.

The book should have covered creating functions that utilize pipeline content coming in, and Filtering commandlets. Discussion about creating Gui components or a reference to it in the book would be helpful.

Astonishing, because none of these things are mentioned in the course description. Can you imagine writing a generic PowerShell course that included examples specific to [__insert technology here__]? Everyone else in the room would be bored and hate it. Look, you've got one comment from a SharePoint admin with no networking/server experience. Goodness. A few folks suggested more AD examples - which I'd used in 10325, the predecessor course, and gotten tons of comments along the lines of, "I don't do AD in my organization so all of the examples were useless to me." O-kay! Can't win 'em all, I guess.

I think a lot of instructors miss the point on teaching PowerShell, which is to focus on teaching the shell and its discoverability mechanisms. I think setting expectations with students is key, too - let them know you're not covering Exchange or SQL or SharePoint or Lync or whatever, but instead focusing on the core shell. And not even everything the shell does - 5 days isn't enough time. In fact, that's why 55039 is being offered - to provide the functions/programming side of the class.
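
For what it's worth, by "discoverability mechanisms" I mean the handful of commands that let the shell teach you the shell. A quick illustration (the Service noun is just an arbitrary example):

Get-Command -Noun Service        # what commands exist for this noun?
Get-Help Get-Service -Full       # how do I use one of them?
Get-Service | Get-Member         # what's actually in the output?

That's the skill the course tries to build; the product-specific cmdlets can come later, on the job.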

Anywho - I'd love your feedback if you've taught or taken the class! We have a few weeks in which to decide what we're doing with 10961C.

PowerShell.org's Azure Journey, Part 1

Uncategorized
Aug 19, 2013
0

When we started PowerShell.org, my company (Concentrated Tech) donated shared hosting space to get the site up and running. We knew it wouldn't be a permanent solution, but it let us start out for free. We're coming to the point where a move to dedicated hosting will be desirable, and we're looking at the options. Azure and Amazon Web Services are priced roughly the same for what we need, so as a Microsoft-centric community Azure's obviously the way to go.

Azure Technical Fellow Mark Russinovich is having someone on his team connect with me to discuss some of the models in which we could use Azure. What makes the discussion interesting is that PowerShell.org runs on a LAMP (Linux, Apache, MySQL, and PHP) stack. We're not looking to change that; WordPress requires PHP, and the Windows builds of PHP typically lack some of the key PHP extensions we use. I'm not interested in compiling my own PHP build, either - I want off-the-shelf. WordPress more or less requires MySQL; while there's a SQL Server adapter available, it can't handle plugins that don't use WordPress' database abstraction layer, and I just don't want to take the chance of needing such a plugin at some point and not being able to use it.

What's neat about Azure is that it doesn't care. I adore Microsoft for selling a service and not caring what I do with it. Azure runs Linux just fine. Huzzah!

So, we've got two basic models that could work for us. Model 1 is to just buy virtual machines in Azure. We're planning one for the database and another for the Web site itself, so that we can scale-out the Web end if we want to in the future. We're not going to do an availability set; that means we risk some short downtime if Azure experiences hardware problems and needs to move our VM, but we're fine with that because right now we can't afford better availability. We'd probably build CentOS machines using Azure's provided base image (again, adore Microsoft for making this easy for Linux hosting and not just Windows). We know we tend to top out at 250GB of bandwidth a month, and that we need about 1GB of disk space for the Web site. 500MB of space for the database will last us a long time, but we'd probably get 1GB for that, too. It's only like $3 a month. We could probably start with Small VM instances and upgrade later if needed. All-in, we're probably looking at about $125/mo, less any prepay discounts.

Model 2 is to just run a Website. We still get to pick the kind of instance that hosts our site, so if we went with Small and a single instance, we'd be at about $110 including bandwidth and storage. That doesn't include MySQL, though. Interestingly, Microsoft doesn't host MySQL themselves as they do with SQL Azure. Instead, they outsource to ClearDB.com, which provides an Azure-like service for hosted MySQL. Unfortunately, the Azure price calculator doesn't cover the resold ClearDB service. Looking at ClearDB's own pricing, it'd probably push us to about $120-$125 a month - or about the same as having our own virtual machines. The difference is that, with Model 2, Microsoft can float our Web site to whatever virtual hosts they need to at the time to balance performance; with Model 1, they can potentially move our entire VM - although they're unlikely to do so routinely, since it'd involve taking us offline for a brief period. A super-neat part of this model is its integration with Git: I can run a local test version of the site, and as I make changes and commit them to our GitHub repository, Azure can execute a pull and get the latest version of the site code right from Git. Awesome and automated. I love automated.

An appeal of Model 1 is that I can build out the proposed CentOS environment on my own Hyper-V server, hit it with some test traffic loads, and size the machine appropriately. I can then deploy the VHDs right to Azure, knowing that the instance size I picked will be suitable for the traffic we need to handle. It also gives me an opportunity to validate that a dedicated VM will be faster than our current shared hosting system, and to play around with the more advanced caching and optimization options available on a dedicated VM. I can get everything dialed in perfectly, and then deploy.

Azure has other usage models, but these are the two applicable to us. I think it's great that we get these options, and that the pricing is more or less the same regardless. And again, I think it's pure genius that Azure's in the business of making money for Microsoft, and that they're happy to do so running whatever OS I want them to.

I'll continue this series of posts as we move through the process, just for the benefit of anyone who's interested in seeing Azure-ification from start to finish. Let me know if you have any questions or feedback!

Coming Soon: 55039 "PowerShell Scripting and Toolmaking" Course

Aug 12, 2013
0

Later this month, Jason Helmick will be offering a revised "PowerShell Scripting and Toolmaking" course at Interface Technical Training in Phoenix. This new course carries the Microsoft Courseware Marketplace number 55039 - that's right, this is an official, unofficial course that will be available to all Microsoft training partners!

(Courseware Marketplace offerings are not written or endorsed by Microsoft, but they are equivalent to Official Curriculum in many ways, including being eligible for Software Assurance voucher programs. Marketplace offerings supplement Official offerings by providing courses that Microsoft doesn't have the time or resources to generate themselves.)

This course is based directly on Learn PowerShell Toolmaking in a Month of Lunches, and incorporates much of that book's actual text (in fact, a portion of the course's sale price goes to the book publisher, with a portion of that going to the book authors as royalties). That's combined with a full slide deck, some awesome brand-new labs, lab answer key, "starting points" (for lab students who fall behind), and a complete inventory of demo scripts for the instructor to use. It walks through a quick PowerShell review, and moves all the way through creating modules, advanced functions, custom views, and much more. It's a pretty handy course, and even dives into creating "controller" scripts, such as scripts that automate processes or generate HTML reports. We provide a complete 3-VM build guide, and a simple ISO image containing all of the instructor and student files. Students are even welcome to download that ISO themselves for later reference! That URL will be provided in the student manual.

I'm especially proud of the labs, and thankful to Mike Robbins and Jason Helmick for debugging them for me. Through the main part of the course, students have three lab tracks (A, B, and C) to choose from - and overachievers can work on more than one track. Through each module, the labs gradually build from a basic command to a complete, fleshed-out "script cmdlet" packaged in a module, with a custom view and more. It's extremely realistic, and it means much of the classroom time is spent on hands-on labs, where students will get the most value for their money.

This course is designed to complement Microsoft's official 10961 course, which covers substantially the same material as Learn Windows PowerShell in a Month of Lunches, meaning 55039 is kind of a "sequel" course. Training centers are welcome to offer a 5-day accelerated class that combines both courses; that's pretty much the class I teach myself. I don't personally categorize 55039 as "advanced;" rather, it's more of a specific application of PowerShell - building reusable tools. I do offer an advanced course of my own, and there's a chance for that to become a packaged course in the future.

After the beta is complete, the course will be orderable in the Marketplace with a suggested price of $150 per student. It's a full 5-day course, with multiple lab tracks per module, so I felt that was a pretty fair price, especially since students basically get the Toolmaking book "included" in their manual!

If any other trainers would like to know more about the course, they're welcome to contact me. We will be selling it directly as well, for trainers who can't access the Marketplace.

Download the table of contents: 55039-TOC

My PowerShell Workflow Series on TechNet Magazine

As most folks are aware, I've been writing the Windows PowerShell column for Microsoft's TechNet Magazine for... wow, going on 7 years now. For 2013, I was doing a serialized column on PowerShell Workflow, introducing a bit of the technology at a time in each month's article. Eagle-eyed observers will note that the series has "paused," with no new articles in July or August.

First, I'm sorry for the interruption. Unfortunately, right now Microsoft is re-evaluating and re-positioning TechNet Magazine (perhaps in line with a larger reconsideration of the TechNet brand, where they recently discontinued the subscription product), and for the time being the company is sticking with internally generated content for TechNet Magazine. I'm hopeful the company will come to a decision soon, and I'll try to keep you posted here.

My past columns (all 77 of them) are still online and accessible, along with hundreds of other articles stretching back almost 8 years.

How Cloud-First Design Affects You

Today, Brad Anderson (Corporate VP in the Windows Server/System Center unit) posted the first in what should be a series of "What's New in 2012 R2" articles. In it, Anderson focuses on how Microsoft squeezed so many features into the 2012 R2 release in such a short period of time. The short answer, which has been stated by Jeffrey Snover before, is "we build for the cloud first." That means features we're getting in 2012 R2 have, for the most part, already been developed, deployed, and in use in some of Microsoft's own cloud services. This is a huge deal. It means their cloud services (think Azure, O365, and the like) get stuff first, where Microsoft can make sure it's stable. They then package those features and hand them off to us.

It means we get better stability, but it also means we get better manageability. Look, you don't get excited when you have to deploy a new server, right? You want to automate that stuff. Well, Azure gets really ticked off if they can't automate it, because they do it thousands of times more often than you do. So forcing themselves to run a ginormous datacenter also forces the company to make better management tools - which they then hand down to us in an OS release.

If, that is, you're managing your datacenter as if it was your own little... dare I say it, private cloud. In other words, if you think of your datacenter as a wee little cloud, and you manage it like one, then you'll get the tech you need, because Microsoft has to develop that tech for themselves. If you want to keep managing it the old-fashioned way... well, you'll get less love.

This whole approach, for me, is the ultimate expression of the Microsoft phrase, "eat the dogfood." Meaning, use our own products just as our customers would. You just have to make sure you're eating the same flavor of dogfood. Not that MS expects everyone to have their own in-house Azure. No, that's not the point. The point is that they're developing for a world where admins do nothing but create units of automation, and business processes (perhaps outside IT) initiate that automation. You're going to see more and more tools and technologies (um, PowerShell) to facilitate that model of IT operations; you'll see less and less tech that facilitates the old way (meaning, fewer and less robust GUI tools, I'm guessing).

Desired State Configuration (DSC) is probably an ideal example of this new approach. In the past, when you wanted to configure a few hundred machines to look and behave a certain way, you went clicky-click a few hundred times in a GUI. That's imperative configuration; you tell each machine what to do. That doesn't scale to cloud-sized proportions, and so now we're getting DSC. DSC is declarative configuration, meaning you tell a group of machines what to be. The OS itself figures out how to achieve that state of being. So admins have to shift from thinking "what do I make the machine do" to "how do I tell it what to be." It's not unlike Group Policy, actually, which is also declarative, except that DSC will eventually dwarf Group Policy in terms of reach and capability.
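
To make the imperative-versus-declarative distinction concrete, here's a minimal sketch of a DSC configuration (the node name, feature, and output path are placeholders of my own choosing):

Configuration WebServer
{
    Node 'SERVER01'
    {
        # Declare what the machine should BE; the engine figures out how to get there
        WindowsFeature IIS
        {
            Ensure = 'Present'
            Name   = 'Web-Server'
        }
    }
}

# Compiling the configuration produces a MOF file, which DSC then applies
WebServer -OutputPath C:\DSC
Start-DscConfiguration -Path C:\DSC -Wait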

Point being, if you're in the old world of, "I just run through the Wizard and set the machine up," you're not aligned with the new world order. Expect fewer wizards, as product teams shift their investment to building things like DSC resources instead. With 12-18 month product cycles, time is in short supply for each new release. One-at-a-time approaches don't scale to the cloud, so those are likely to get less of that limited amount of time.

Anderson's post is worth a read. It's a little high-level - the man is a Corporate VP, after all - but it shows where Microsoft is pointing their collective brain. It uses the word "delight." It describes in great detail how Microsoft is trying harder to put the customer in the front of every conversation - but, more subtly, it also shows how Microsoft is moving the conversation past "what do customers tell us they want" and more toward "here's what we see customers needing." Henry Ford would be proud.

It's Safe to Run Update-Help - and you should!

Jul 2, 2013
0

I'm informed that sometime today Microsoft will be posting fixed core cmdlet help files for your downloading pleasure - so it's safe to run Update-Help again, and you should definitely do so. There are likely a lot of fixes and improvements to the help text, and you won't be "losing" the parameter value type information from the SYNTAX section.

Maybe schedule an Update-Help for tomorrow morning?
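
If you'd rather not remember to do that by hand, here's one approach - just a sketch using the PSScheduledJob module that ships with PowerShell 3.0, with an arbitrary job name and time:

# Updating help for the core modules needs admin rights
# (see New-ScheduledJobOption -RunElevated if you want the job to run elevated)
Register-ScheduledJob -Name UpdateHelp -ScriptBlock { Update-Help -Force } `
    -Trigger (New-JobTrigger -Daily -At '7:00 AM')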

BTW - kudos to the team at Microsoft for getting this issue fixed so quickly. It's a shame this one snuck past them, but once notified of the problem they really did jump on it. The fact that the problem was (from the public perspective) just with the downloadable help files means it's an easy fix that doesn't involve pushing code out through Windows Update (thank goodness).

[UPDATE: It's Safe] CAUTION: Don't Run Update-Help Right Now

UPDATE 2 JULY 2013: Microsoft is informing MVPs that the fix is in, and new help files should be downloadable by (at latest) the morning of 3 July 2013. So get your Update-Help ready to run. More info.

If you haven't recently run Update-Help... don't. There's a problem with the help files that were produced recently, so that instead of:

-computername <string[]>

You're getting:

-computername

This affects all parameters - no value types will be shown. This has been reported to Microsoft, and they've acknowledged receipt of that report and are investigating. Personally, I believe the problem may be related to internal-use-only tools that are used to create the syntax section of the help files, so hopefully it'll be an easy fix.

The -Full and -Detailed help still shows the correct information, so if you've downloaded the borked help files, you're not totally out of luck.
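
Also, if all you need is the parameter value types, the command metadata always has them no matter which help files you've downloaded - Get-Command's -Syntax switch prints every parameter set, types included. For example:

PS C:\ Get-Command Get-Process -Syntax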

As far as I can determine, this only currently affects core PowerShell cmdlets, not add-in modules from product teams like Exchange, etc. I believe that's because the core cmdlets were just updated and re-published, something the PowerShell team tends to do a bit more frequently than some of the other product groups.

I'll keep you posted as I learn anything new.

Overall Winners of the Scripting Games

Jun 11, 2013
8

Congratulations to our top winners, mikefrobbins and taygibb, as determined by our expert judges (in this case we also considered their CrowdScores). They have just won a free pass to Microsoft TechEd Europe or Microsoft TechEd North America 2014. Instructions for claiming your prize are in your profile. The pass is transferable, but must be claimed or transferred by the end of July.

Congratulations to our top voters/commenters, Klaus_Schulte and Poshsg0606. They were chosen randomly for this award, although I did review their comments and scores to ensure they were all meaningful and consistent. They've won free passes to the PowerShell Summit North America 2014; these are transferable and must be claimed or transferred by the end of July.

Thanks to everyone who participated in The Scripting Games this year. We've received a lot of feedback from you, and very much appreciate the time and spirit you spent to offer it. We're taking it all into consideration for our next event.

The new PowerShell Class is Coming to a CPLS Near You!

May 24, 2013
4

Looking for a great getting-started PowerShell class? Or perhaps you'd like to send a colleague or peer to some PowerShell "zero to hero" training?

We've just finished the official beta-teach of Microsoft's 10961, Automating Administration with Windows PowerShell, and it went great. The sequencing of the class was spot-on, and we had an absolutely incredible group of students. Many were n00bs, which was perfect; a couple had "some" shell experience but wanted to learn "the right way." And they did.

Through a series of 12 modules, you're led through the basics all the way up to writing your own script. The grand semi-finale has you creating a script that provisions a brand-new, freshly-installed Server Core instance - all without logging on to that instance at all. The high moment for me was when one student, after struggling a bit to get started on the provisioning lab, concluded with a "well, that did it." Everything came together for him: command discovery, help, scripting, variables, remoting, all of it. He did the task, from scratch, with practically no help. He's there. 

10961 replaces MS course 10325, and it will soon be supplemented by a Microsoft Courseware Marketplace title that goes further into scripting, error handling, debugging, and more... what I've taken to calling toolmaking. We'll hopefully continue to refresh both courses as PowerShell evolves.

So call your local Microsoft Certified Partner for Learning Solutions ("training center") and see when they're offering 10961. A bit of caution: this is a class where, unfortunately, an inexperienced MCT will be really challenged. While the course book is a full, almost-500-page book (you're welcome), it's tightly timed, and you'll definitely want to check the credentials and experience of whatever trainer is running the class. You can't just "read the slides" to stay a module ahead of the students on this one.

This class is strongly based upon Learn Windows PowerShell 3.0 in a Month of Lunches, in terms of how the material is presented, although the sequence and narrative were altered a bit to better accommodate Microsoft requirements and classroom logistics. I'm really proud of how the course turned out - so if you've got folks who need some PowerShell training, tell 'em to look it up. Many CPLS centers offer remote training, too, meaning you can attend from the comfort of your own home or office.

If you take the class, I'd love to hear what you think.

PSCustomObject: Save Puppies and Avoid Dead Ends

Uncategorized
Apr 24, 2013
13

Welcome to Scripting Games 2013. Here's my favorite hint for improving your functions and scripts: avoid writing to the console or formatting your output. Instead, use PSCustomObject in Windows PowerShell 3.0 and leave the formatting to the end user.

Windows PowerShell provides lots of great ways to return the output of a command or function. You can write to the host program (Write-Host), write to a file (Out-File), and format your output to look really pretty (Format-*). But all of these techniques kill puppies and bring the pipeline to an abrupt halt.

"Puppies?," you ask. Yes! Windows PowerShell MVP and Scripting Games 2013 Viceroy Don Jones (@concentrateddon) famously says that every time you use Write-Host, a puppy dies. So sad!

The Format cmdlets are almost as bad, although no deaths have yet been attributed to them. Instead, when you use a Format cmdlet, a huge STOP sign should appear warning you that you've brought the pipeline to a halt. Not technically, of course, but for all practical purposes.

To see what I mean, take a peek at these two commands. The output of these commands looks very similar, but it's really quite different.

PS C:\ Get-Process csrss

Handles  NPM(K)    PM(K)      WS(K) VM(M)   CPU(s)     Id ProcessName
-------  ------    -----      ----- -----   ------     -- -----------
    885      14     2568       5092    49             516 csrss
    714      19     3996      28036    92             632 csrss

PS C:\ Get-Process csrss | Format-Table

Handles  NPM(K)    PM(K)      WS(K) VM(M)   CPU(s)     Id ProcessName
-------  ------    -----      ----- -----   ------     -- -----------
    885      14     2568       5092    49             516 csrss
    714      19     3996      28036    92             632 csrss

These two commands return different objects and the difference really matters. To see the different output types, you can pipe them to Get-Member. I've used a slightly different approach that gets only the names of types in the output, but it's the same idea.

PS C:\ Get-Process csrss | foreach {$_.gettype().fullname}
System.Diagnostics.Process
System.Diagnostics.Process

PS C:\ Get-Process csrss | Format-Table | foreach {$_.gettype().fullname}
Microsoft.PowerShell.Commands.Internal.Format.FormatStartData
Microsoft.PowerShell.Commands.Internal.Format.GroupStartData
Microsoft.PowerShell.Commands.Internal.Format.FormatEntryData
Microsoft.PowerShell.Commands.Internal.Format.FormatEntryData
Microsoft.PowerShell.Commands.Internal.Format.GroupEndData
Microsoft.PowerShell.Commands.Internal.Format.FormatEndData

Instead of a process object, the formatted command returns a bunch of format objects. You usually discover this when you try to use them in another command. For example, these format objects don't have the properties of a process object, like PagedMemorySize or Handles.

PS C:\ $p = Get-Process csrss
PS C:\ $p | foreach PagedMemorySize
2629632
4075520

PS C:\ $pf = Get-Process csrss | Format-Table
PS C:\ $pf | foreach PagedMemorySize
PS C:\

PS C:\ get-process csrss | sort Handles

Handles  NPM(K)    PM(K)      WS(K) VM(M)   CPU(s)     Id ProcessName
-------  ------    -----      ----- -----   ------     -- -----------
    723      19     3980      28416    87             632 csrss
    881      14     2568       5096    49             516 csrss

PS C:\ get-process csrss | ft | sort Handles
out-lineoutput : The object of type "Microsoft.PowerShell.Commands.
Internal.Format.FormatEntryData" is not valid or not in the correct 
sequence. This is likely caused by a user-specified "format-*" 
command which is conflicting with the default formatting.
    + CategoryInfo          : InvalidData: (:) [out-lineoutput], 
InvalidOperationException
    + FullyQualifiedErrorId : ConsoleLineOutputOutOfSequencePacket,
Microsoft.PowerShell.Commands.OutLineOutputCommand

So you've lost the opportunity to use these objects in subsequent commands. Unless you really want formatting objects, the pipeline is effectively dead. Almost as sad as those puppies.

I realized this problem when some colleagues at Microsoft asked me to generate a report that listed the CDXML files in a CIM module and the CIM commands that were defined in each CDXML file. I wrote a tiny script that produced a nice report that looked like this:

MSFT_NetIPAddress.cdxml-help.xml
------------------------------------
Get-NetIPAddress
Set-NetIPAddress
Remove-NetIPAddress
New-NetIPAddress

MSFT_NetIPInterface.cdxml-help.xml
------------------------------------
Get-NetIPInterface
Set-NetIPInterface

MSFT_NetIPv4Protocol.cdxml-help.xml
------------------------------------
Get-NetIPv4Protocol
Set-NetIPv4Protocol

MSFT_NetIPv6Protocol.cdxml-help.xml
------------------------------------
Get-NetIPv6Protocol
Set-NetIPv6Protocol
. . .

But, instead of being delighted, they reported that they now had data that they couldn't use. I had created a dead end. Pretty, but useless. They were happier with a command that produced usable results, even if they weren't pretty.

PS C:\ (Get-Module $ModuleName).NestedModules | Select-Object Name, Path, ExportedCommands

Name                      Path                                    ExportedCommands
----                      ----                                    ----------------
MSFT_NetIPAddress         C:\windows\system32\WindowsPowerShel... {[Get-NetIPAddres
MSFT_NetIPInterface       C:\windows\system32\WindowsPowerShel... {[Get-NetIPInterf
MSFT_NetIPv4Protocol      C:\windows\system32\WindowsPowerShel... {[Get-NetIPv4Prot
MSFT_NetIPv6Protocol      C:\windows\system32\WindowsPowerShel... {[Get-NetIPv6Prot
. . .

To avoid this dead end in the silly Get-Process case, you just remove the Format-Table command. Or, you can use the Select-Object cmdlet to create an object that is a filtered subset of the current object, if that's what you need.
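
For instance (a throwaway example - pick whichever properties you actually need):

PS C:\ Get-Process csrss | Select-Object Name, Id, Handles, PagedMemorySize

The result is a trimmed-down but still perfectly real object, so sorting, grouping, and exporting all keep working downstream.
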
But how do you manage when you're returning values from different objects? It's easy to put them in a table, but there's a much better way that doesn't stop the pipeline.

Windows PowerShell 3.0 introduces PSCustomObject. You can read all about it, and about Windows PowerShell 2.0 alternatives, in about_Object_Creation. PSCustomObject makes it easy for you to create objects.

As the name implies, PSCustomObject creates a custom object with the properties that you specify. The resulting custom object works just like any .NET class object, so you can pass it through the pipeline and use it in subsequent commands.

PSCustomObject takes a hash table (@{Key = Value; Key = Value...}) in which the keys are property names and the values are property values. Each time a [PSCustomObject] hash table is evaluated in a script or function, Windows PowerShell magically creates an object from it.
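
The bare syntax looks like this at the prompt (the values here are made up for illustration):

PS C:\ [PSCustomObject]@{ModuleName='AppLocker'; Culture='en-US'; Version='3.1.0.0'}

ModuleName              Culture        Version
----------              -------        -------
AppLocker               en-US          3.1.0.0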

Here's how I used it in a little script that tells you the versions of Updatable Help you have on your local machine.

# $helpInfoFiles is assumed to already hold the downloaded *HelpInfo.xml files
# (for example, from Get-ChildItem -Recurse -Filter *HelpInfo.xml)
foreach ($helpInfoFile in $helpInfoFiles)
{
    # The module name is the first underscore-delimited chunk of the file name
    $ModuleName = $HelpInfoFile.Name.Split('_')[0]

    # Pull the supported UI culture element out of the HelpInfo XML
    $CultureInfo = ([xml](Get-Content `
         $HelpInfoFile)).HelpInfo.SupportedUICultures.UICulture

    $UICulture = $CultureInfo.UICultureName
    $Version = $CultureInfo.UICultureVersion

    # Emit one custom object per file - no formatting, just objects
    [PSCustomObject]@{"ModuleName"=$ModuleName;
                      "Culture"=$UICulture;
                      "Version"=$Version}
}

In this case, I was processing a bunch of HelpInfo XML files. I wanted to return an object that contains the module name, the name of the UI culture, and the version number for that UI culture. The details don't matter, except that the property values weren't all in the same object, so I couldn't just select from an object.

PSCustomObject to the rescue! See how easy this is!

In the ForEach loop, I get the values that I need. Then I just define a PSCustomObject and … voila! … I have my objects. The default formatting makes them look nice enough.

ModuleName              Culture        Version
----------              -------        -------
AppLocker               en-US          3.1.0.0
Appx                    en-US          3.1.0.0 
BitLocker               en-US          3.1.0.0                       
BranchCache             en-US          3.1.0.0

But more importantly, the pipeline continues. When I save the script's output in a variable ($u) and pipe it to Get-Member, it shows that I have a usable custom object:

PS C:\ $u | get-member

   TypeName: System.Management.Automation.PSCustomObject

Name        MemberType   Definition                        
----        ----------   ----------                        
Equals      Method       bool Equals(System.Object obj)    
GetHashCode Method       int GetHashCode()                 
GetType     Method       type GetType()                    
ToString    Method       string ToString()                 
Culture     NoteProperty System.String Culture=en-US       
ModuleName  NoteProperty System.String ModuleName=AppLocker
Version     NoteProperty System.String Version=3.1.0.0

And, I can use the output in subsequent commands.

PS C:\ $u | sort Version | group Version

Count Name                      Group
----- ----                      -----
   24 3.0.0.0                   {@{ModuleName=NetSwitchTeam; 
    1 3.0.1.0                   {@{ModuleName=MsDtc; Culture=
    1 3.0.2.0                   {@{ModuleName=Wdac; Culture=e
   15 3.1.0.0                   {@{ModuleName=ScheduledTasks;
    4 3.2.0.0                   {@{ModuleName=Microsoft.WSMan
    1 {3.2.15.3, 3.2.15.0, 3... {@{ModuleName=Show-Calendar; 
    1 3.4.0.0                   {@{ModuleName=NetTCPIP; Cultu

Now, go out and try it! Some of the Scripting Games challenges might require a table, list, or some other formatting, but if they don't, be sure to return a really useful object.

Good luck to everyone!
