PowerShell.org (https://powershell.org), feed generated Fri, 30 Sep 2016 09:32:15 +0000

Recap of DuPSUG PowerShell Saturday 2016
https://powershell.org/2016/09/27/recap-of-dupsug-powershell-saturday-2016/
Tue, 27 Sep 2016 07:28:04 +0000

Last weekend we held our second PowerShell Saturday, this time hosted by IPsoft in Amsterdam. During this event, members of the Dutch PowerShell User Group gathered to view a number of presentations and to engage in lively discussions on the various new developments in the PowerShell world.

For more information about PowerShell Saturday, the Dutch PowerShell User Group, or the slides and code used in the presentations, please head over to the recap blog post here:

Recap of Dutch PowerShell Saturday September 2016

PowerShell Happenings at Ignite 2016
https://powershell.org/2016/09/22/powershell-happenings-at-ignite-2016/
Thu, 22 Sep 2016 15:27:46 +0000

With Ignite fast approaching, here's what's up - and this is intended to be a "community post," meaning I'd love it if you could add your own PowerShell-at-Ignite notes in the comments, including sessions you're looking forward to!

On Sunday evening, while not officially a PowerShell event, a lot of PowerShell glitterati will be at The Krewe's annual gathering, starting at 8pm.

On Monday evening, the Atlanta PowerShell User Group is kindly hosting a meet-and-greet with Jeff Hicks, Jason Helmick, and me. We promise to be educational; registration required (but free).

Tuesday evening is the PowerShell Community Happy Hour (from 4-7; tickets required), including many of the in-attendance team members, most of the PowerShell.org Board, and a bunch of super Shell enthusiasts. We'll have PowerShell.org and The DevOps Collective laptop stickers!

Wednesday, I'm looking forward to PowerShell Unplugged with Jeffrey Snover and me, from 9 to 9:45am. This is nearly always hilarious and fun. Then, from 10-10:30, Jeffrey, Jason Helmick, and I will be signing books and handing out laptop stickers at the Ignite Bookstore. Finally, from 11-11:30, I'll be signing FREE! books at the Conversational Geek booth (#571) (who have some amazing scavenger hunt prizes).

And of course, please stop by the Pluralsight booth to say hi, pick up some swag, register your company for a free pilot subscription, and whatnot.

So... what're YOU looking forward to next week?

Changing of the Guard at PowerShell.org
https://powershell.org/2016/09/22/changing-of-the-guard-at-powershell-org/
Thu, 22 Sep 2016 14:47:30 +0000

It's a bit of a sad day at The DevOps Collective, which is the nonprofit that runs PowerShell.org. One of our Board of Directors members, Dave Wyatt, will be stepping down from his Director position this week. He wants to focus on his personal life a bit more, although he's still going to be responsible for our public Build Service, and he's going to continue contributing to the Pester project, so the community isn't losing him entirely. Dave's been a huge help, and a huge inspiration, at PowerShell.org, and he'll be greatly missed.

But our sadness is balanced by some happy news, too, as PowerShell.org Webmaster Will Anderson has agreed to fill Dave's seat. Will has brought great enthusiasm to our team of volunteers, is also a PowerShell MVP, and, like Dave, resides in Canada. Will's responsible for most of the photography you'll see in the upcoming PowerShell + DevOps Global Summit 2017 brochure, and he's been a great help in keeping PowerShell.org's website up and running.

So please join me in wishing our outgoing Director all the best, and in welcoming Will to the Board!

PowerShell + DevOps Global Summit 2017 Preview
https://powershell.org/2016/09/19/powershell-devops-global-summit-2017-preview/
Mon, 19 Sep 2016 19:55:14 +0000

As a quick reminder, our Call for Topics is still open for a few more days! Summit is very much intended to be a kind of mega-user group, not a "conference," so don't assume all the "professional" speakers have taken up all the speaking slots. We want you to participate!

In the meantime, while we're waiting on the content committee to select topics and before registration opens in early November, I wanted to offer a peek at what we're planning.

Deep Dive Day

Sunday is now a formal "full day" of Summit, rather than a "pre-con" day. That means we'll be presenting both Intermediate and Advanced content, including an opportunity for you to dig into the new open-source PowerShell GitHub repo, learn about the layout of the code, review what the community's been up to with that code, and more. Sunday will also offer two Lab opportunities, one for Advanced Functions and one for DSC Resources. You'll be able to wander in at will, and share some of your work with a domain expert, who'll offer critique and advice. We'll also have some pre-done scenarios, in case you'd like to try your hand and test your skills. The full four-day pass is expected to cost $1500, and will be the first opened for registration in November (3-day is expected to be $950 or $975, and will open in January or February).

All Together Now

Monday (the first day you can attend on a 3-day pass) will feature an opening keynote from me, a full session with ShellFather Jeffrey Snover, an update on PowerShell from team leaders, and our now-famous Lightning Demos from various developers on the team. We'll finish the day with a grand reception, where you can mix and mingle with everyone you've seen, and enjoy some quality food and beverages.

Breakout!

All day Tuesday, as well as Wednesday morning, we'll feature our usual 45-minute breakout sessions on a huge variety of deep topics. We'll be covering DSC, pull servers, JEA, best practices, security, and SO much more, including sessions delivered by members of the PowerShell product team. We've got a full three tracks - more than last year! - of content planned.

Par-ti-ci-pa-tion

We've noticed that Wednesday afternoons can drag a bit - so after lunch, we're going to roll out some great snacks and drinks. Wednesday afternoon will get more interactive, with a variety of Community Lightning Demos (sign up on site with the moderator), panel discussions, focus groups, and more.

On The Air

We've expanded and refined our session recording capabilities, and you can expect better audio, as well as screen-capture recordings for every session (barring technical difficulties), something we haven't been previously able to do with this much content. All sessions are made available on YouTube within a couple of weeks of the event's conclusion (we do not live-stream, and we won't be posting sessions instantly each day).

Networking

It ain't just for routers and switches - Summit remains dedicated to providing plenty of face time with your fellow PowerShell and DevOps enthusiasts. We'll offer additional evening fun (anyone interested in a trip to the Microsoft Museum one evening? We're looking into it), side rooms for breakout conversations, and of course we encourage everyone to participate in breakout sessions by offering comments and asking questions.

Extra Bits

2017 will be the Fifth Anniversary of Summit, and so we're bringing along some extra swag and collectible opportunities. If you attended in 2016, bring your 1-inch button to wear around your badge lanyard and show your alumni status (we'll have 2017 buttons, too). Some merchandise will only be available as an advance purchase, so watch PowerShell.org for details; other merch might be available on-site, but in very limited quantities, so be sure to get that 4-day pass!

Mark Your Calendars

Sunday-Wednesday passes will open for registration the first week of November, 2016; we expect Monday-Wednesday passes to become available in January or February. As always, registration is limited to about 200 attendees (plus our speakers and the product team members), so don't delay. Because registrations are nonrefundable, we do not maintain a waitlist, and we fully expect to sell out - as we have every year.

 

Nearing Last Call for PowerShell Summit Topic Proposals (+ Topic Ideas!)
https://powershell.org/2016/09/06/nearing-last-call-for-powershell-summit-topic-proposals-topic-ideas/
Tue, 06 Sep 2016 14:04:52 +0000

Remember that our Call for Topics is still open until the end of September, if you'd like to submit. And, from our Summit Alumni Slack channel, here are a few things people said they'd like to see...

  • I would love to see a session on what it takes to build a PKI infrastructure in support of PowerShell operations (stuff like passing creds with DSC) - this is something glossed over all the time as if it is not a big deal, but I think it can be quite challenging for a lot of people to implement.
  • Writing for Performance: Tips and Tricks to Write Faster Code
  • Compiled cmdlets - how to create them and why you might want to (this got a lot of thumbs-up)
  • Open-source PowerShell hackathon. Either one multi-hour (2, 3, 4?) window where people can break into groups and work on some open-source PowerShell extension, or two sessions: one at the beginning of the event and one at the end. In the opening session, the presenters/organizers provide a set of possible project ideas, and people interested can sign up or vote for projects, which creates the groups. The closing session gives groups an opportunity to share/demo what they produced. Having a room where people can gather to work on it would be cool. These don't have to be big projects; they could be small things, like knocking off one or more issues for an open source project. The end goal is to have a pull request submitted, a new project posted on GitHub, or a new module submitted to the Gallery. Now, to be clear, this isn't a session - but you can definitely propose it. We have some longer time slots on Wednesday for panels, and this might be something you could do then.
  • examples of real world DSC usage - that was a comment I heard from a number of folks this year
  • Practical Pipelines. ( Illustrate that release pipelines aren't just for DevOps-practicing shops, or public-facing software )
  • Build plans (and tools, like psake)
  • Module design best practices (lots of thumbs-up on this one)
  • Working with Open Source Projects (as a Contributor)
  • Working with Open Source Projects (as a Maintainer)
  • Applying Agile Software Development Methodologies to PowerShell
  • Using <Audit Framework Name Here> for <Security, Audit, Compliance, etc.>. (assumption: someone writes the equivalent of inspec wrapped around Pester)

And if you read the above carefully, you'll notice that we do also have some space for afternoon panels on Wednesday - so if there's a group discussion you'd like to lead, propose it! Just be clear in the description you submit that you're proposing a panel. It'll be up to you to recruit panel members, which you can do on-site. We'll announce panels in need of panelists and direct them to you.

Unit Testing is "Pestering" the Hell Out Of Me
https://powershell.org/2016/09/02/unit-testing-is-pestering-the-hell-out-of-me/
Fri, 02 Sep 2016 17:31:14 +0000

About a week or two before DevOps Camp, the attendees were asked how much experience they had using Pester, because one attendee was preparing a discussion on Pester and wanted to gauge everyone else's comfort level. Learning Pester had been on my to-do list for a while, but I had procrastinated on it for far longer than I intended. I answered "Beginner" - although "complete and utter newbie" would have been more accurate - and I vowed to spend some quality time looking at Pester before arriving at camp.

There are some really great resources out there devoted to Pester, from beginner to intermediate to way-over-my-head. I read articles and watched videos. And I understood, in a conceptual kind of way, how to use Pester. Describe, Context, It, Mock, Assert-MockCalled – I understood what these things were used for. The examples made sense. I was ready to move on to trying it myself. But here is where I stumbled and recovered, and I would like your feedback and opinions on my discoveries.
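For anyone else at that "conceptual" stage, here's how those pieces fit together in a bare-bones test file. Everything here is invented for illustration - the function under test, Get-Greeting, is assumed to call Get-Date internally:

```powershell
# MyScript.Tests.ps1 -- run with Invoke-Pester.
. .\MyScript.ps1   # dot-source the code under test

Describe 'Get-Greeting' {
    Context 'when a name is supplied' {
        # Mock replaces the real Get-Date inside this scope with a canned
        # answer, so the test never depends on the actual clock.
        Mock Get-Date { [datetime]'2016-09-01' }

        It 'mentions the name and asked for the date exactly once' {
            Get-Greeting -Name 'Dave' | Should Match 'Dave'
            Assert-MockCalled Get-Date -Times 1 -Exactly
        }
    }
}
```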

I took a piece of code I was currently working on and decided that a small function in that code was the perfect function to attempt my first unit test on. I mean, it was the tiniest little function - 7 lines of code! What could possibly be easier? Right?

Boy, was I wrong. The struggle IS real.

In a nutshell, my function really is 7 lines – an If/Else statement and a For-loop – and inside each is an external call to an Active Directory cmdlet. Those would definitely need to be mocked. After all, we know or assume that Set-ADAccountControl and Set-ADObject do what they are supposed to. I was stumped at where to even start, because after mocking these external calls – there isn’t actually anything left of the code!

Even after a wise person told me that “This probably isn’t a great example of a “Pester 101” example”, I was still determined to figure out how to write a Pester test to test this function, but I needed to set aside my thoughts of “I can’t figure out how to write a Pester test for this” and instead, start with “Figure out how to write a unit test for this.” My brain freeze wasn’t about Pester – it was about unit testing. What do I need to test? My next step was to do some reading up on general unit testing concepts.
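One answer I eventually landed on: when a function is mostly external calls, the unit test verifies the branching - which cmdlet got called, with what, and which didn't. Set-ADAccountControl and Set-ADObject are the real AD cmdlets; the wrapper function and its logic below are invented for illustration, and mocking them assumes the ActiveDirectory module is loaded so Pester can see the commands:

```powershell
# Hypothetical function under test: one branch per external call.
function Set-MyAccountState {
    param([string]$Identity, [switch]$Enable)
    if ($Enable) {
        Set-ADAccountControl -Identity $Identity -Enabled $true
    }
    else {
        Set-ADObject -Identity $Identity -ProtectedFromAccidentalDeletion $true
    }
}

Describe 'Set-MyAccountState' {
    # Mock both external calls so no domain controller is needed.
    Mock Set-ADAccountControl { }
    Mock Set-ADObject { }

    It 'enables the account (and only that) when -Enable is passed' {
        Set-MyAccountState -Identity 'jdoe' -Enable
        Assert-MockCalled Set-ADAccountControl -Times 1 -Exactly
        Assert-MockCalled Set-ADObject -Times 0 -Exactly
    }
}
```

The test doesn't re-test Microsoft's cmdlets; it tests that my 7 lines route the right input to the right cmdlet.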

I’m not opposed to buying a book on testing concepts, but I wanted some quick answers and not a research project just to get me started. I turned to “Dr. Google” and I found some useful definitions, both formal and informal, on what unit testing really is. But it wasn’t until I found a comment buried deep in a StackExchange forum post that I realized what my next steps were.

 

Red-Green-Refactor-Repeat

Red: Write a test that fails.

Green: Write the simplest code that makes the test pass. For the first pass, don’t handle edge cases, just enough to make the test pass.

Refactor: Clean up the code and optimize if necessary. Make sure the test still passes.

Repeat: Now think about handling those edge cases and repeat the previous steps with tests, then code, to handle them.
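Applied to PowerShell, the first two steps might look like this. Get-Initials is a hypothetical function invented for this sketch:

```powershell
# RED: write the test first. Get-Initials doesn't exist yet, so this fails.
Describe 'Get-Initials' {
    It 'returns the first letter of each word' {
        Get-Initials -Name 'Ada Lovelace' | Should Be 'AL'
    }
}

# GREEN: the simplest code that makes the test pass -- no edge cases yet
# (no handling of extra spaces, empty input, hyphenated names, and so on).
function Get-Initials {
    param([string]$Name)
    ($Name -split ' ' | ForEach-Object { $_[0] }) -join ''
}
```

Refactor and Repeat then work through the edge cases, one new failing test at a time.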

The entire thread can be found here and the detailed explanation of the Red-Green-Refactor-Repeat concept in the comments is definitely worth a read:

http://programmers.stackexchange.com/questions/750/what-should-you-test-with-unit-tests

 

When I started thinking about writing this article, I knew that I was struggling with the concept of unit testing and I had planned to include the code that I was looking to test as part of the blog. After doing the reading to try to wrap my brain around the concepts, I changed my approach. I’ve decided to scrap the original version of this code and try to use the above approach to re-develop the function instead. I plan to blog about my journey through this process in a future post.

Until then, I’d like to initiate a dialog with you, the readers: How do you approach unit testing? What is your thought process? What do you feel is important or not important to include in a unit test?

Ultimate PowerShell Prompt Customization and Git Setup Guide
https://powershell.org/2016/08/26/ultimate-powershell-prompt-customization-and-git-setup-guide/
Fri, 26 Aug 2016 05:25:27 +0000

Do you spend hours a day in PowerShell? Switching back and forth between PowerShell windows getting you down? Have you ever wanted "Quake" mode for your terminal?

If we are going to spend so much time in PowerShell, we may as well make it pretty.

Check out the Ultimate PowerShell Prompt Customization and Git Setup Guide for how to:

  • Install and customize ConEmu
  • Enable Quake Mode for your terminal
  • Set up your PowerShell Profile
  • Install and use Posh-Git
  • Generate and use SSH Keys with GitHub
  • Squash Git commits
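As a taste of the profile piece: customizing the prompt is just a matter of overriding PowerShell's built-in prompt function in $PROFILE. A minimal sketch (the guide goes much further) that shows the current folder and flags a failed command:

```powershell
# Add to $PROFILE. PowerShell calls this function to draw each prompt.
function prompt {
    $lastOk = $?                                    # did the last command succeed?
    $folder = Split-Path -Leaf -Path (Get-Location) # current folder name only
    $marker = if ($lastOk) { '>' } else { '!' }
    "PS $folder$marker "
}
```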
Here's Another Reason to Contribute
https://powershell.org/2016/08/24/heres-another-reason-to-contribute/
Wed, 24 Aug 2016 11:30:14 +0000

Jason Helmick and I were talking last night, and we got onto the topic of expertise and respect. Kind of, "once someone really gets to that expert level, and they surpass their teacher in knowledge, you really respect them." I disagreed, and said, "no, I respect them the minute they start contributing to the world, and helping others."

We all, at some stage, get "outsider syndrome," where we think everyone else is so much smarter than we are, and that we've nothing of value to contribute. But that's never true. First of all, there's this thing called a "birth rate," meaning there are always new people coming into the field. Second, no matter what your level of expertise, you're in it, right then. "Experts" too often forget what it was like to be a beginner; a beginner knows, and can often relate things that another beginner can understand more readily.

Take this wonderful post by Missy Januszko. Missy probably doesn't consider herself an expert, although she certainly held her own at my recent DevOps Camp. And she certainly wasn't the only one writing about open-source, cross-platform PowerShell Core that week. But she did it from a unique perspective, one that a lot of her readers can probably take a lot from. And she did it - instead of just talking vaguely about giving back someday, she just did it, and did it well.

PowerShell.org isn't a curated newsfeed for a select few; it's yours. So if you don't have your own place to publish and share, email webmaster@ and let us set you up to write. Whenever you solve some problem, conquer some gotcha, or have a perspective on the latest PowerShell news, share. You definitely have something to offer.

Microsoft did WHAT?
https://powershell.org/2016/08/23/microsoft-did-what/
Tue, 23 Aug 2016 01:51:05 +0000

Unless you’ve been living under a rock for the last couple of days, you already know that Microsoft announced last Thursday that the shell/scripting language formerly known as “Windows PowerShell” is now supported on Linux and macOS, and that PowerShell has been open-sourced. And for days, thoughts of “how can I use this?” or “I wonder if ‘x’ will be supported” have been flying through the minds of every system architect as we internally grapple with the possibilities of what could be, while at the same time trying to understand Microsoft’s motivation for this radical change.

Only the change isn’t so surprising if you think about the changes that Microsoft has been making leading up to this announcement. Separating PowerShell into Desktop Edition and Core Edition in WMF 5.1. Announcing SQL Server on Linux – after all, IT professionals are going to need a way to administer that SQL instance, and it isn’t going to be through a GUI. Supporting PowerShell on Linux seemed like a logical next step.

But it is likely just a step along the road to heterogeneous system management. Microsoft Technical Fellow and PowerShell inventor Jeffrey Snover isn’t at all secretive about the fact that the vision is built upon Microsoft’s Operations Management Suite (OMS), a suite of automation and management tools that needs to be able to configure, control, manage, monitor, and self-heal a workload that runs anywhere and on any operating system.

From the perspective of a system architect who isn’t typically on the bleeding edge of technology, I am still extremely excited about this announcement. Why? The possibilities seem endless. For one, applications that run on Windows or Linux or a combination of the two can now be configured by the same language, or maybe even the same set of well-designed scripts. Second, the possibility of using Desired State Configuration (DSC), or third-party tooling such as Chef or Puppet in conjunction with DSC, means I can keep *all* servers in compliance with their configurations using the same tooling. Third, what DevOps engineer wouldn’t love having spent a few years learning a scripting language like PowerShell only to have its reach extended to other platforms? This change invariably makes us more valuable to the company by letting us take on additional management responsibilities with the skills we already have. It can then lead to even more cross-platform learning and opportunities. I definitely plan to learn more about Linux and how I can help build cross-platform tools. If you have similar interests, here are some great resources to get you started!

https://www.pluralsight.com/courses/essential-tools-red-hat-enterprise-linux

https://www.pluralsight.com/courses/linux-networking-advanced-lfce
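The "same language on both platforms" idea above can be sketched with a single DSC configuration. The node names are made up, and the Linux side assumes Microsoft's nx DSC resource module (which supplies nxFile and friends) is installed:

```powershell
Configuration AppBaseline {
    Import-DscResource -ModuleName nx   # DSC resources for Linux (nxFile, etc.)

    Node 'winserver01' {
        File AppConfig {
            DestinationPath = 'C:\MyApp\app.conf'
            Contents        = 'LogLevel=Info'
        }
    }

    Node 'linuxserver01' {
        nxFile AppConfig {
            DestinationPath = '/etc/myapp/app.conf'
            Contents        = 'LogLevel=Info'
        }
    }
}

# Compiling the configuration produces one .mof document per node.
AppBaseline -OutputPath .\AppBaseline
```

One configuration, one language, two operating systems kept in compliance by the same tooling.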

I haven’t even scratched the surface of thinking about all of the ways I want to take advantage of PowerShell on Linux, and I have lots of exploring to do to find out what can or can’t be done – but the energy of the entire PowerShell community over these changes certainly carries over to me as well. I’m excited to find out what is possible, to build what may not have been possible, and to contribute back to the PowerShell community. So kudos to you, “new Microsoft”, for energizing the entire community of PowerShell enthusiasts. I can’t wait to see what’s next.

Why "Objects," Remoting, and Consistency are Such a Big Deal in PowerShell
https://powershell.org/2016/08/22/why-objects-remoting-and-consistency-are-such-a-big-deal-in-powershell/
Mon, 22 Aug 2016 20:39:49 +0000

As PowerShell begins to move into a cross-platform world, it's important to really understand "why PowerShell." What is it, exactly, that sets PowerShell apart? Notice that I do not mean, "what makes it better," because "better" is something you'll have to decide on your own. I just want to look at what makes it different. 

It's the Objects

Folks often say that Linux is a text-based OS, whereas Windows is an object-based OS. That's a convenient simplification, but it isn't exactly accurate. And to understand why PowerShell is different, you need to understand the actual differences - and how Linux and Windows have actually come closer together over the years.

*nix - including Unix, macOS, and Linux - is based on very old concepts. "Old" isn't "bad" at all; much of Linux's current flexibility comes from these old concepts. Core to the Unix ethos is the fact that OS configurations come from text files. There's no Registry, there's no database; it's just text files. Kinda like, um, Windows was, back in the old days, with .ini files (and where do you think the idea for those came from?). Text files are super-easy to view, search, modify, and so on. Heck, when I wrote my first Point of Sale system, it was largely text-based, because text files were super-easy for us to troubleshoot remotely, compared to a complex ISAM table structure.

Windows, on the other hand, is an API-based operating system. When you want to query an OS configuration element, you don't just look in a text file - you run code, and query an API. When you need to make a change, you don't just change a text file and hup a daemon - you run code, and submit your changes to an API.

When you need to pass data from one hunk of code to another, you need to have an agreed-upon structure for that data, so that the code on both ends understands the data. These structures are called objects. Traditionally, Unix didn't really have structured data. The file format used by Apache for its configuration was different from the format used by Iptables. Which is totally fine, by the way, because those two things never need to talk to each other. But when you start considering all the things the OS can do - users, file permissions, groups, ports, you name it - you started to end up with a lot of different formats. Indeed, the main reason that Unix had (has?) a reputation for being a complex OS to administer is largely because all of its data is scattered hither and yon, and all in different formats.

That's been changing, though. You're starting to see more and more new projects pop up that rely on structured configuration data, often using JavaScript Object Notation (JSON), although in other cases something like XML. This is a big deal for *nix administration. Why?

Traditionally, re-using the output of a Unix command was complex. Output was pure text, sent to your console via the stdout "channel." Commands typically formatted their output for human eyeball consumption, so if you wanted to send that output instead to another command, you had to do a lot of text parsing. "Skip the first two rows of output, and then for each remaining row, go over 32 columns and grab 5 columns worth of text." Or, "skip the first row, and then in each subsequent row, look for text matching this [regex] and return only the matching text." Unix admins tend to own regular expressions for this reason.

But the problem with all that is that your workflow, and your tooling, becomes very version-bound. Nobody can ever improve tools like ps, because so many scripts rely on the output being exactly as it is today. Instead, you create entire new versions of those tools - which people then take dependencies on, and which can then never change, unless they provide some backward-compatibility switches to force old-version output. The end result is a highly fragmented landscape of tooling, a very high learning curve for incoming administrators, and a high amount of overhead in automating business processes.
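Here's that contrast in miniature. The awk column numbers are exactly the positional coupling described above; the PowerShell version binds to property names instead (note that the CPU property is seconds of processor time, not a percentage):

```powershell
# Text parsing (traditional Unix): bound to column positions, so it breaks
# if the tool ever changes its output layout:
#   ps aux | awk 'NR > 1 && $3 > 50.0 { print $11 }'

# Object pipeline (PowerShell): bound to property names, so extra or
# reordered columns in future versions don't matter.
Get-Process |
    Where-Object { $_.CPU -gt 50 } |
    Select-Object -Property Name, Id, CPU
```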

When you code a command-line utility in 1973, it's easy to imagine it'll never need to change. On the other hand, when you start building APIs in the 1990s, it's much more obvious that change will be constant. By passing objects - structured data - between themselves, APIs provide a kind of inbuilt forward-compatibility. If v1 of an API outputs objects that have 10 properties, v2 can easily add five more without breaking anything downstream. Anything consuming those objects won't care if there's extra data, so long as the data it was expecting is all there. Object-based data doesn't have any sense of "ordering," so it doesn't matter if the "first" property is Name or if the "first" property is Size. Consumers refer to properties by name, not by position, and the magic of the API itself makes it all match up.

Objects also lend themselves to hierarchies of data. A computer object can have a Drives property, which can be a collection of Drive objects, which can have a Files property, which is a collection of File objects, and so on. Structured data like XML and JSON handle these hierarchies with ease, as do object-oriented APIs; textual output - which is essentially a flat-file at best - doesn't.
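In PowerShell terms, that hierarchy is trivial to build and to serialize; the computer, drive, and file objects below are made up purely for illustration:

```powershell
# A hand-built object hierarchy: a computer with drives, each drive with files.
$computer = [pscustomobject]@{
    Name   = 'SERVER01'
    Drives = @(
        [pscustomobject]@{
            Letter = 'C'
            Files  = @(
                [pscustomobject]@{ Name = 'boot.ini'; Size = 1024 }
            )
        }
    )
}

$computer.Drives[0].Files[0].Name    # navigate the hierarchy by property name
$computer | ConvertTo-Json -Depth 4  # the same hierarchy as structured JSON
```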

So what sets PowerShell apart from other shells is the fact that its commands pass objects from one to another. When you reach the end of a "chain," or pipeline, of commands, the shell takes what's left and generates textual output suitable for human eyeball consumption. So you get the advantages of a text-based command - easy to read output - and the advantages of working with an API. For example, in PowerShell for Linux, Microsoft ships a command that wraps around the Cron feature. Cron is configured from a text file; Microsoft's command "understands" the text file format, and turns it into objects. That means nobody will ever have to grep/sed/awk that text file again - instead, you can deal with structured data. That's a really good example of taking something PowerShell is good at - objects - and applying it to something Linux is really good at - Cron. It's not forcing Cron to look like the Windows Task Scheduler in any way; it's simply applying a new shell paradigm to an already-solid OS component.
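The Cron example might look like this in practice; the cmdlet name and properties here are assumptions based on the description above, not a documented interface:

```powershell
# Yesterday: parse crontab's text format by hand.
#   crontab -l | grep 'backup'

# With an object-producing wrapper (illustrative names), the schedule
# becomes structured data you can filter and sort by property:
Get-CronJob |
    Where-Object { $_.Command -match 'backup' } |
    Sort-Object -Property Hour, Minute
```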

This concept of a shell passing objects - again, just structured data - was unique enough that Microsoft was granted a patent for it (the patent also includes other innovations).

 

Remoting

The patent also touches on remoting, which was equally innovative. Yes, I know that Unix has forever had the ability to log into a remote machine, first using things like Telnet, later SSH, and other mechanisms since. But that's remote logon, and it's not Remoting.

With remote logon, you're essentially turning your local computer into a dumb terminal for a remote computer, a concept literally as old as computers themselves. It's a 1:1 connection, and it was fine when a given company didn't have more than a few machines. But modern, cloud-based architecture involves thousands of machines, and 1:1 doesn't cut it. Remoting enables 1:many connections - "here is a command; go tell these 1200 computers to run it individually, using their own local resources, and then send me the results - as objects." Going forward, PowerShell can use either WS-MAN or SSH as the low-level transport for that conversation, but the protocol isn't important. It's the idea of running one command locally, piping that output to another command which runs remotely, and then taking that output and piping it to yet more commands that run locally. This mixing-and-matching of computing resources and runtime locations is huge. 
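A sketch of that 1:many pattern with PowerShell's Invoke-Command, assuming Remoting is enabled on the targets and servers.txt lists their names:

```powershell
# Runs the script block on every listed machine in parallel; each result
# comes back as an object stamped with PSComputerName, so the final
# sorting and selecting happen locally.
$servers = Get-Content -Path .\servers.txt

Invoke-Command -ComputerName $servers -ScriptBlock {
    Get-Service -Name 'Spooler'
} |
    Sort-Object -Property PSComputerName |
    Select-Object -Property PSComputerName, Status
```

Local command feeding remote execution feeding local commands again - that's the mixing-and-matching of runtime locations.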

 

Consistency

And finally, the one argument that's the toughest to make. Plenty of *nix admins, and plenty of old-school MS-DOS command-line admins, take great pride in their mastery of obscure command-line syntax. It sets them apart from lesser humans, provides a veneer of job security, and proves their dominance of their field.

Unfortunately, it's bad for the planet.

Look, maybe your country is in fine economic shape (ahem, Norway), but here in the United States we have a fairly precarious hold on Biggest Economy in the World. We aren't a manufacturing powerhouse. We basically have two exports: information technology and Hollywood, and we're sometimes sorry about the latter. But for our economy to thrive in this century, we need all hands on deck when it comes to IT. That means a high barrier to entry, and the need to memorize arbitrary and obscure syntax, ain't gonna cut it. Computing is hard enough without making it artificially more obscure through syntax.

chmod ugo+rwx sample.sh

Yeah, see, that's too hard to teach a 12-year-old.

Set-FilePermission -FileName sample.sh -Permissions Read,Write,Execute -Principal User,Group,Others -Action Add

See, you still need to know what's going on in both cases, but the syntax is much easier to read and understand without having to look it up. The command syntax becomes less obscure, and more self-documenting. More maintainable. Obviously, this is just a bogus example, but it illustrates the pattern of PowerShell - meaningful command names, meaningful parameter names, and meaningful parameter value enumerations. And I use meaningful in the correct way, as in, "full of meaning."

PowerShell still allows for a shorthand syntax, if you're just in a hurry -

sfp sample.sh -p r,w,x -for u,g,o -a add

- but you're not forced into it, and it's easier to figure out what those things mean (again, this is a bogus example meant to show the shell's syntax pattern, not an actual run-able command).

So... that's the big deal

And so that's what makes PowerShell different. It's not going to displace Bash on Linux anytime soon, although it's happy to let you run your same old text-based commands, and even integrate their output as best it can into its object-based pipeline. But at least now, anyone approaching PowerShell for the first time can understand what makes it different, and decide for themselves if they think that's worth an investment to learn to use PowerShell well.
