
Making Awesome Dashboards from Windows Performance Counters

Having an understanding of your systems' performance is a crucial part of running IT infrastructure.

If a user comes to us and says "why is my application running slowly?", where do we start? Is it their machine? Is it the database server? Is it the file server?

The first thing we usually do is open up perfmon.exe and take a look at some performance counters. You then see the CPU on the database server is at 100% and think, "Was the CPU always at 100%, or did this issue just start today? Was it something I changed? If only I could see what was happening at this time yesterday, when the application was running fine!" It might take you a few hours to find the performance issue in your infrastructure, and you are probably going to need to open up perfmon.exe on a couple of other systems. There is a better way!

What if you could turn your Windows performance counters into dashboards that look like this? How much time would you save?

Full Hyper-V Dashboard

Using a combination of the open source tools InfluxDB to store the performance counter data, Grafana to graph the data and the Telegraf agent to collect Windows performance counters, you will be a master of your metrics in no time!
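Before wiring up the full Telegraf/InfluxDB/Grafana pipeline, you can preview the counters you plan to collect straight from PowerShell; the counter paths below are just examples, so substitute the objects you care about:

```powershell
# Sample two common counters every 5 seconds, three times.
# These paths are illustrative - discover what's available with Get-Counter -ListSet *
Get-Counter -Counter '\Processor(_Total)\% Processor Time',
                     '\Memory\Available MBytes' `
            -SampleInterval 5 -MaxSamples 3
```

If these look right interactively, the same counter paths go into Telegraf's configuration for continuous collection.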

Read the detailed walkthrough over at hodgkins.io


Testing PowerShell Direct with Windows Server 2016 TP3 Hyper-V

Hey there! I thought we could test PowerShell Direct together today. Here's the elevator pitch: In Windows Server 2016 and Windows 10, we can send PowerShell commands from the Hyper-V host directly to its corresponding virtual machines (VMs), even in the absence of guest VM networking. Yeah, that's cool, isn't it?

What's just as impressive is that PowerShell Direct works even if PowerShell remoting is disabled on the guest VM! PowerShell Direct also circumvents Windows Firewall. Note that PowerShell Direct requires that commands are sent only from a Hyper-V host to its local VMs.

Also, PowerShell Direct is supported at this point only by Windows Server 2016 TP3 and Windows 10. That means a Windows Server 2016 TP3 Hyper-V host cannot leverage PowerShell Direct against, say, Windows Server 2012 R2 virtual machines (give the Hyper-V, PowerShell, and Windows Server teams time; I'm sure this will be supported in the future).

The secret sauce behind PowerShell Direct is PowerShell Remoting Protocol (MS-PSRP), which used to be called just plain ol' garden variety "PowerShell remoting."

The Lab Setup

In my test lab, I started with a domain controller and Hyper-V host (yeah, I'm combining server roles--what of it?) named hyperv1.company.pri. That server's running Windows Server 2016 Technical Preview 3.

In Hyper-V I created a single virtual switch named Internal that is connected to the host/guest network. Of course, we don't care about the switch fabric because we're going to use PowerShell Direct.

Next, I built a Windows Server 2016 TP3-based guest VM named server1 and disabled the network adapter as you can see in the following screenshot. No smoke and mirrors here!

Our lab is set up and ready to test PowerShell Direct.

As a final "sanity check" to ensure the guest VM is as theoretically inaccessible as possible, I blocked access to all remote access session configurations and disabled the Windows Remote Management (WinRM) service by running the following command from within the guest (thanks to PowerShell MVP Aleksandar Nikolić for clarification on this point):

Disable-PSRemoting -Force

Get-Service -Name WinRM | Stop-Service -Force -PassThru | Set-Service -StartupType Disabled

Okay. Let's move onto the next phase of our experiment.

Sending Commands to the Guest VM

Let's obtain the name and globally unique identifier (GUID) of our Windows Server 2016 VM (you'll see why in just a moment):

Get-VM | Select-Object -Property Name, VMid

Name        VMId
----        ----
server1     31d787fe-02cd-4363-b50b-16bc8243fc77

PowerShell Direct makes itself manifest by means of two new parameters:

  • -VMName
  • -VMGuid

Handy, eh? The following two cmdlets support the -VMName and -VMGuid parameters as of this writing:

  • Enter-PSSession
  • Invoke-Command

Time to test! Let's start a remote session with the server1 guest VM by specifying its GUID. Note that you will need:

  • Hyper-V administrative privileges on the host
  • Local administrative privileges on the guest
$cred = Get-Credential
Enter-PSSession -VMGuid 31d787fe-02cd-4363-b50b-16bc8243fc77 -Credential $cred

[server1]: PS C:\Users\Administrator\Documents>
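Connecting by VM name instead of GUID works the same way; a quick sketch using the credential object from above:

```powershell
# Equivalent session keyed on the VM's name rather than its GUID
Enter-PSSession -VMName 'server1' -Credential $cred

# ... do your work, then drop back to the host session
Exit-PSSession
```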

We'll finish by using Invoke-Command to send ad-hoc PowerShell pipelines and entire scripts from host to guest:

Invoke-Command -VMName 'server1' -Credential $cred -ScriptBlock { Get-Service | Where-Object {$_.Status -eq 'Stopped'} }

Invoke-Command -VMName 'server1' -FilePath 'D:\scripts\setup-ip.ps1' -Credential $cred

Conclusions

Convenience is the primary advantage that PowerShell Direct brings to us Hyper-V administrators. We can connect to and fully administer our guest virtual machines regardless of their networking, firewall, or WS-Man state. Thanks for reading, and more power to the shell!

Delete Specific E-Mail or E-Mails From All Exchange Mailboxes

Well, this is week number two in my quest to post an article once a week, and I am back with a common request for Exchange administrators. There are a lot of scenarios that bring up a need to remove an e-mail or e-mails from all mailboxes in your environment. Perhaps there was a disgruntled employee, a virus outbreak, or a reply-all to the whole company. We all know that the "Retract" button is best effort (yes, I still miss GroupWise for that purpose).

As always, we can turn to PowerShell for our scripting needs. The Search-Mailbox command is your best friend in these scenarios. With a simple Get-Mailbox | Search-Mailbox you can take control of all your mailboxes. Be extremely cautious when executing; with great power comes great responsibility. For a full rundown on how to accomplish this, head on over to PowerShellBlogger.com. I look forward to seeing everyone again next week!
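As a hedged sketch of the pattern (the subject line here is hypothetical; always estimate before you delete):

```powershell
# Estimate how many messages would match - no changes are made
Get-Mailbox -ResultSize Unlimited |
    Search-Mailbox -SearchQuery 'Subject:"Free cruise winner"' -EstimateResultOnly

# Once the estimate looks right, remove the matching messages
Get-Mailbox -ResultSize Unlimited |
    Search-Mailbox -SearchQuery 'Subject:"Free cruise winner"' -DeleteContent -Force
```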

Automate enabling and disabling Lync / Skype for Business users

Hello PowerShell.org community,

This is my first post here at PowerShell.org, and I have a goal of posting tips, tricks, articles, and solutions once a week. My first exposure to scripting was on my 486 computer, where I would create .bat files to launch my DOS-based games from the root folder. I learned complex scripting through VBScript, automating the rollout and updating of Windows 2000 desktops and servers. I quickly transitioned to PowerShell as my preferred scripting language upon its release. I use PowerShell on a daily basis to administer Windows Server, SQL Server, Exchange, Lync / Skype for Business, Citrix XenApp / XenDesktop, Office 365, and Dell Active Roles Server. I have very much enjoyed watching the progression and adoption of PowerShell as the default scripting language, and I hope my posts will be useful to other administrators around the world.

Today's post deals with automatically enabling and disabling users for Lync / Skype for Business. I kept the script examples simple so that they are easy to understand. If you would like a complex scenario tackled, simply comment on the blog and I will post the solution.
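As a minimal sketch of what the post covers (the pool and domain names here are made up for illustration):

```powershell
# Enable an existing AD user for Lync / Skype for Business
Enable-CsUser -Identity 'CONTOSO\jdoe' `
              -RegistrarPool 'lyncpool01.contoso.com' `
              -SipAddressType SamAccountName `
              -SipDomain 'contoso.com'

# Disable the same user again
Disable-CsUser -Identity 'CONTOSO\jdoe'
```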

Head on over to PowerShellBlogger.com for a full breakdown of enabling and disabling Lync / Skype for Business users locally or remotely.

Best Regards,
Steve Parankewich
Twitter: powershellblog

Up Next: Andrew Mason from Microsoft talks about Nano Server!

We are very excited for tomorrow night's show! Please join us at the usual time (9:30 PM EDT / 6:30 PM PDT) and the usual place (live.powerscripting.net) as we have a chat with Andrew Mason, group program manager at Microsoft, to talk about Nano Server! In case you haven't heard, Nano Server:

  • is a server operating system
  • doesn't have "Windows" in the name
  • has no GUI (nor even a local login console)
  • is distinct from Windows Server Core
  • is tiny (93% smaller VHD footprint than full Windows)
  • is robust (80% fewer reboots)
  • is secure (92% fewer critical bulletins, due to its smaller attack surface)

Want to learn more? Check out this breakout session video from the recent Ignite conference with Jeffrey Snover & Andrew Mason, and then tune into the interview!

And here's a short demo.

Also, if you can't make the show, but have questions you'd like to have answered, share them with us! Best way to do so is to tweet @powerscripting.

 

Automating with Jenkins and PowerShell on Windows

Take a minute to think about how many PowerShell scripts you have written for yourself or your team. Countless functions and modules, helping to automate this or fix that or make your team's lives easier. You spend hours coding, writing in-line help, testing, packaging your script, and distributing it to your team. After all that effort, a lot of the time the script is simply forgotten about! People just go back to doing things the manual way.

I put this down to being out of sight, out of mind. Users who do not use the command line regularly will quickly forget about the amazing PowerShell-ing that you did to try and make their lives easier.

Then there are other problems, like working out the best way to give end users permission to use your function when they aren't administrators. Do you give them remote desktop access to a server and only provide a PowerShell session? Set up PowerShell Web Access? Configure a restricted endpoint? I thought the point of this module was to make your life easier, not harder!

These problems are what an open source tool called Jenkins can solve for you. Traditionally used by developers to automate their build process, it can be leveraged to wrap web interfaces, job tracking and scheduling around the PowerShell scripts you worked so hard on.

The below image shows what a Jenkins build looks like. In this basic example, the build creates a text file on a remote machine by using PowerShell Remoting and the Set-Content cmdlet. The parameters for these commands can be entered into the form, and will be passed to your PowerShell script via variables.

jenkins
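Inside a Jenkins PowerShell build step, string parameters defined on the job arrive as environment variables. A minimal sketch, assuming job parameters named ComputerName and FileContents:

```powershell
# Jenkins exposes job parameters as environment variables
$computer = $env:ComputerName
$content  = $env:FileContents

# Create the text file on the remote machine via PowerShell Remoting
Invoke-Command -ComputerName $computer -ScriptBlock {
    param($Value)
    Set-Content -Path 'C:\temp\jenkins-test.txt' -Value $Value
} -ArgumentList $content
```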

To find out how to start leveraging Jenkins in your environment, take a look at the below blog posts:

The Monad Manifesto Annotation Project

Richard’s log, stardate 2457164.5

Today's destination is the Monad Manifesto Annotation Project.

The idea behind this project is to keep the manifesto intact somewhere on the internet, and to give the community the possibility to annotate the various topics in the manifesto. The idea for this came from Pluralsight author Tim Warner, with the initial annotations being made by Don Jones. Jeffrey Snover gave his permission for this project, but with a big warning: the content can only be shared on the original source page on penflip, and cannot be hosted anywhere else.

I am already in the process of putting all the chapters of the Manifesto into penflip, and applying the right formatting. The idea is to finish this in the coming days; after that, the actual annotation can start.

For more information check the project page on penflip:

https://www.penflip.com/powershellorg/monad-manifesto-annotated

Till the next time, live long and prosper.

PowerShell... An exciting frontier...

PowerShell... An exciting frontier...

These are the voyages of a PowerShell adventurer.

Its continuing mission:

To explore strange new cmdlets...

To seek out new modules; new parameters...

To boldly go where no one has gone before!

Richard's log, stardate 2457163.

Our destination for today is my very first post on PowerShell.org. As you can see from the opening lines, I approach my journey in PowerShell as an exploration into the unknown, just like the crew of Star Trek: The Next Generation did. So far my journey has been a pleasant one, because, you know, exploring PowerShell is a lot of fun! Your exploration should be a lot of fun too, and for that reason I want to share my discoveries and experiences with you. These will help you, I hope, to boldly go where no one has gone before!

About Me, And A Statement

My name is Richard Diphoorn, and I’m an IT professional based in the Netherlands. I have worked in IT for around 14 years now. My daily work consists mostly of automating, scripting, provisioning new servers, and working with System Center, Azure Pack, and SMA. Actually, everything that can be automated is what I work on. I believe in automation; in my opinion it’s the mindset every IT professional should have.

When I started working in IT, it was not called IT in the Netherlands; it was called ‘automatisering’, which in English means ‘automation’. And there you have it: the job I’m doing was always meant to be automation. But I still see a lot of ‘click-next admins’ around in the field. This term was coined by Jeffrey Snover, and it refers to administrators who click their way through provisioning and configuration situations, instead of trying to automate them.

It’s my personal quest to give click-next admins the motivation to learn and use PowerShell. I strive for a transformational change in the admin’s life, by giving them new perspectives on how to ‘do’ things.

For sure, I am not a person who possesses all the knowledge, but at least I want to share my passion and the knowledge I have built up so far with the people who are open to it. And with this I invite you to join me in this exploration of ‘strange new cmdlets’. 😉

A Small Introduction

So, with this personal statement I kick off this first post. Our first mission is the exploration of what this thing ‘PowerShell’ actually is, what kind of advantages it brings you, and why it’s here to stay.

I assume ‘level 200’ as the basic level of my audience, so I will not go into the very basics of how to operate a Windows operating system. I try to avoid unclear technobabble as much as possible, but I don’t want to oversimplify things either. I try to make it as simple as possible, but not simpler (where have we heard that before…hmmm…).

Monad, A Brief History Of Time.

If you are like me, you probably bought a little book called ‘Monad’, written by Andy Oakley, quite some years back. If I remember correctly, I bought it somewhere in late December 2005, when I saw it on a bookshelf in Waterstone’s in London. I bought the book because I had heard of MSH, and I wanted to learn more about it.

I was hooked. 100%

I still encourage people to read this book, because a lot of the information in it is still relevant in terms of concepts. Topics like the pipeline, Verb-Noun syntax, cmdlets, repeatability, and consistency have not changed since the first version of Monad that the PowerShell team released. This is also the reason why you still see ‘C:\windows\system32\WindowsPowerShell\v1.0’ on all Windows OSes to this day: the underlying architecture has not changed. As we continue to explore, you will see what I mean.

This book will explain the very first basic concepts to you, but to really get into the dirt, I encourage you to read the Monad Manifesto, written by Monad’s architect, Jeffrey Snover. This manifesto explains the long-term vision behind Monad, and describes many elements that still exist in PowerShell today. It is a really comprehensive document on how Jeffrey first saw the big problems in the way administrators did their work.

He explains the new approaches to different kinds of models, and how existing problems can be solved with these new approaches. This document will also contribute to your thinking the ‘DevOps’ way, because many concepts in it contribute directly to the way you should ‘do’ DevOps. For example, Jeffrey talks about ‘prayer-based parsing’, which is in direct conflict with predictability in certain DevOps scenarios.

You need to be able to predict what will happen when you go from testing to production. In all cases, deus ex machina situations need to be prevented; you always need to know what is happening and why. In my opinion, DevOps is nothing more than being really good at automating stuff, and PowerShell gives you this possibility.

So, what is PowerShell, and how do I benefit from it?

PowerShell is basically a shell ( a black box, in which you can type 😛 ), in which you can interact with every aspect of the operating system in either an interactive or a programmable manner.

You type commands in a specific format in this window, and magic happens. This is the simple explanation. Now the rest…

The concept of a shell in which you can manipulate the whole Windows system in an interactive or scripted way, with common syntax and semantics, was for me a really powerful and inspiring idea. This new shell helped me work more efficiently, more effectively, and with more fun. It enabled me to explore the whole OS, to boldly go where I had never gone before!

This concept is not new to mainframe operators and *nix admins; it’s something they have already been used to for a long time. If you doubt whether working on a command line is worthwhile, go and talk with the *nix admins in your company. They will happily show you how fast they can work, I’m sure!

So for you, as a Windows administrator, what kind of benefits do you get from learning PowerShell? There are obvious benefits, like getting stuff done more quickly, and doing it the same way every time so that you never make that one mistake again. A less obvious benefit is that you get to know the OS and apps very well, because sometimes you really dig into the system, really deep. This level of knowledge can and will benefit you in terms of understanding how a system works, and how to resolve problems. This hugely contributes to your personal success in your career, because you are ‘the top-notch’ engineer. You will be the Geordi La Forge of your company, so to say. 🙂

PowerShell is dead, long live PowerShell!

PowerShell is here to stay, rest assured. Almost all products made by Microsoft can be manipulated with PowerShell in one way or another, either through a direct API to the product itself, or through a REST interface. A lot of products from third-party suppliers also support PowerShell, like VMware, NetApp and Citrix. PowerShell really is becoming (or already is) a commodity; I actually advise customers to only buy products that can be manipulated with PowerShell.

Be honest here: if a product cannot be automated, how does this product contribute to the WHOLE business? The business thrives on efficient processes, and if all IT processes are efficient, the business profits hugely from that.

Every company I have worked at so far in my IT career makes use of Microsoft software. I believe in the best tools for the job, and PowerShell is such a tool. It’s duct tape, WD-40 and a Swiss Army knife in one; whatever you want to do, PowerShell can do it (and better).

PowerShell is here to stay my fellow IT pro’s, embrace it fully and enjoy the voyage!

I want to thank the crew at PowerShell.org for giving me the opportunity to blog on this site!

Till next time, when we meet again.

Why "Puppet vs. DSC" isn't Even a Thing

After all the DSC-related excitement this week, there have been a few online and Twitter-based discussions including Chef, Puppet, and similar solutions. Many of these discussions start off with a tone I suppose I should be used to: fanboy dissing. "Puppet already does this and is cross-platform! Why should I bother with DSC?" Those people, sadly, miss the point about as entirely as it's possible to do.

Point 1: Coolness

First, what Microsoft has accomplished with DSC is cool. Star Wars Episode V was also cool. These facts do not prevent previous things - Puppet/Chef/etc and Episode IV - from being cool as well. Something new being cool does not make other things less cool. This shouldn't be a discussion of, "Puppet did this first, so nothing else can possibly be interesting at the same time." As IT professionals, we should be looking at everything with an eye toward what it does, and what new ideas it might offer that can be applied to existing approaches.

Point 2: Switching

Have you seen the magazine ads suggesting you ditch Puppet and start using DSC? No, you have not - and you will not. If Puppet/Chef/etc is meeting your needs, keep using it. The fact that Microsoft has introduced a technology that accomplishes similar things (make no mistake, they're not the same and aren't intended to be), doesn't mean Microsoft is trying to convince you to change.

I know where people get confused on this, because in the past that's exactly what Microsoft intended to do. They're not, this time. And I'll explain why in a minute.

Point 3: DSC on Linux

Snover demonstrated a DSC Local Configuration Manager running on Linux, consuming a standard DSC MOF file, being used to set up an Apache website on the server. The underlying DSC resources were native Linux code.

This is not an attempt to convince Linux people to switch to Windows, nor is it an attempt to convince them to use DSC. Saying so is like saying, "Microsoft made PowerShell accept forward slashes as path separators in an attempt to convert Linux people.... but we're too smart for that, hahahahah!" It's idiotic. Microsoft knows you're not going to suddenly break down and switch operating systems. They may be a giant corporation that sometimes makes silly moves, but they're not dumb.

No, DSC on Linux is for Windows admins who choose to use DSC, and who want to extend that skill set to other platforms they have to manage. People who aren't, in other words, faced with a "switch" decision.

Point 4: Puppet/Chef/etc Should Use DSC

Linux is, in many, many ways, a simpler OS than Windows. And I mean that in a very good way, not as a dig. Most config information comes from text files, and text files are ridiculously easy to edit. Getting a solution like Puppet to work on Linux is, from a purely technical perspective, pretty straightforward. Windows, on the other hand, is built around an enormous set of disparate APIs, meaning getting something like Chef/DSC/whatever working on Windows is not only harder, it's essentially a never-ending task.

Microsoft is pouring time and money into creating DSC resources that can, through a very simple and consistent interface, configure tons of the OS. The coverage provided by DSC resources will continue to grow - exponentially, I suspect. That means Microsoft is doing a lot of work that you don't have to.

Even if you're using Puppet/Chef/etc instead of DSC, you can still piggyback on all the completely open and human-readable code that actually makes DSC work. Your recipes and modules can simply call those DSC resources directly. You're not "using" DSC, but you're snarfing its code, so that you don't have to re-invent that wheel yourself. This should make Puppet/Chef people super-happy, because their lives got easier. Yes, you'll doubtless have to write some custom stuff still, but "save me some work" should always be a good thing.

Point 5: Tool vs. Platform

Another thing that sidetracks these discussions is folks not understanding that Puppet/Chef/etc each provide a complete solution stack. They are a management console, they are a domain-specific language, and they are a platform-level implementation. When you adopt Puppet, you adopt it from top to bottom.

DSC isn't like that.

DSC only provides the platform-level implementation. It doesn't come with the management tools you actually need in a large environment, or even in many medium-sized environments. I completely expect tools like System Center Configuration Manager, or something, to provide the management-level tooling on top of DSC at some point - but we aren't discussing System Center.

So arguing "Puppet vs. DSC" is a lot like arguing "Toyota vs. 6-cylinder engine." The argument doesn't make sense. Yes, at the end of the day, Puppet/Chef/etc and DSC are meant to accomplish very similar things, but DSC is only a piece of the picture, which leads to the most important point.

Point 6: Microsoft Did Something Neat

You can't take your Puppet scripts and push them to a Chef agent, nor can you do the reverse. Puppet/Chef/etc are, as I mentioned, fully integrated stacks - and they're proprietary stacks. "Proprietary" is not the same as "closed-source," and I realize that the languages used by these products aren't specifically proprietary. But the Puppet agent only knows how to handle Puppet scripts, and the Chef agent only knows how to read Chef scripts. That's not a dig at those products - being an integrated, proprietary stack isn't a bad thing at all.

But it's interesting that Microsoft took a different approach. Interesting in part because they're usually the ones making fully-integrated stacks, where you can only use their technology if you fully embrace their entire product line. This time, Microsoft bucked the trend and didn't go fully-integrated, proprietary stack. Microsoft did this, and the simple fact that they did is important, even if you don't want to use any of their products.

From the top-down, that is from the management side down, Microsoft isn't forcing you to use PowerShell. They're not forcing you to use Microsoft technology at all, in fact. The configuration file that goes to a managed node is a static MOF file. That's a plain-text file, as in "Management Object Format," as in developed by the Distributed Management Task Force (DMTF). A vendor-neutral standard, in other words.

See, Microsoft isn't pushing DSC as a fully integrated stack. DSC is just the bottom layer that accepts a configuration and implements it. Puppet Labs could absolutely design their product to turn Puppet scripts into the MOF file that DSC needs. You'd be able to completely leverage the OS-native, built-in configuration agent and all its resources, right from Puppet.
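To make the layering concrete, here is a minimal DSC configuration sketch (the node name and output path are illustrative). Compiling it emits exactly the kind of vendor-neutral MOF that any tooling layer could hand to the agent:

```powershell
# Declarative description of the desired state
Configuration WebServer {
    Node 'server1' {
        WindowsFeature IIS {
            Ensure = 'Present'
            Name   = 'Web-Server'
        }
    }
}

# Compiling the configuration produces C:\DSC\server1.mof -
# a plain-text MOF file, not PowerShell
WebServer -OutputPath 'C:\DSC'
```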

Frankly, de-coupling the administrative tooling from the underlying API should make people happy. If we're having a really professional, non-fanboy discussion about declarative configuration, I think you have to admit that Microsoft has kinda done the right thing. In a perfect world, the Puppet/Chef/etc administrative tools would let you write your configuration scripts in their domain-specific language, and then compile those to a MOF. Everyone's agents would accept the same kind of MOF, and execute the MOF using local, native resources. That approach means any OS could be managed by any tool. That's cross-platform. You'd be free to switch tools anytime you wanted, because the underlying agents would all accept the same incoming language - MOF.

I'm not saying Puppet/Chef/etc should do that. But if you're going to make an argument about cross-platform and vendor-agnostic tooling, Microsoft's approach is the right one. They've implemented a service that accepts vendor-neutral configurations (MOF), and implements them using local, native resources. You can swap out the tooling layer anytime you want to. You don't need to write PowerShell; you just need to produce a MOF.

At the End of the Day

I think the folks behind Puppet/Chef/etc totally "get" all this. I think you're probably going to see them taking steps to better leverage the work MS is doing on DSC, simply because it saves them, and their users, work. And I don't think you're going to see Microsoft suggesting you ditch Puppet in favor of DSC. That's a complete non-argument, and nobody at Microsoft even understands why people think the company wants that.

I fully recognize that there's a lot of "Microsoft vs. Linux" animosity in the world - the so-called "OS religions." I've never understood that, and I certainly am not trying to convince anyone of the relative worth of one OS over another. PowerShell.org - a community dedicated to a Microsoft product - runs on a CentOS virtual machine, which should tell you something about my total lack of loyalty when it comes to choosing the right tool for a job. If you're similarly "non-religious" about operating systems, I think DSC is worth taking a look at just to take a look at it. What's it do differently? How can you leverage that in your existing world? Are there any approaches that might be worth considering?

Part of my frustration about the whole "Puppet vs DSC" meme is that it smacks of, "my toys are shinier than your toys," which is just... well, literally childish. And it worries me that people are missing some of the above, very important, points - mainly, that Microsoft is trying really damn hard to play nicely with the other kids in the sandbox for a change. Encourage that attitude, because it benefits everyone.

Once More...

And again, I don't think Microsoft is trying to convince you to use DSC, or any other MS product, here. I'm certainly not trying to do so. I think DSC presents an opportunity for folks who already have a declarative configuration management system, strictly in terms of saving you some work in custom module authoring. And I think for folks that don't have a declarative configuration management solution, and who already have an investment in Microsoft's platform, DSC is going to be an exceptionally critical technology to master. That doesn't in any way diminish the accomplishment of the folks behind Puppet/Chef/etc. In fact, if nothing else, it further validates those products' goals. And I think it's massively interesting that Microsoft took an approach that is open to be used by those other products, rather than trying to make their own top-to-bottom stack. It's a shift in Microsoft's strategic thinking, if nothing else, and an explicit acknowledgement that the world is bigger than Redmond.

Let's at least "cheers" for that shift in attitude.


Episode 268 - PowerScripting Podcast - Paul Long from Microsoft on Message Analyzer

A Podcast about Windows PowerShell. Listen:

In This Episode

Tonight on the PowerScripting Podcast, we talk to Paul Long from Microsoft about Message Analyzer

News

Interview

Guest - Paul Long

Links

 

Chatroom Highlights:

<ScriptingWife> http://www.amazon.com/Network-Monitoring-Analysis-Protocol-Troubleshooting/dp/0130264954/ref=sr_1_7?s=books&ie=UTF8&qid=1398390644&sr=1-7&keywords=network+monitoring

<halr9000> http://www.microsoft.com/en-us/download/details.aspx?id=40308

<halr9000> http://blogs.technet.com/b/messageanalyzer/

<halr9000> http://technet.microsoft.com/en-us/library/jj649776.aspx

<halr9000> http://channel9.msdn.com/Shows/Defrag-Tools/Defrag-Tools-71-Message-Analyzer-Part-1

<ScriptingWife> FYI Ed and I will be the guests for Singapore online meeting Friday night May 2nd 830 PM EST (Sat May 3rd Singapore time) sign up here http://www.eventbrite.sg/e/powerbreakfast-sg-01-tickets-10142282841?aff=eorg

<halr9000> ## you guys know this part

<stevenmurawski> ## Are there any good walk thrus with Message Analyzer?  The learning curve is pretty steep coming from Netmon or Wireshark.

<stevenmurawski> ## can you run captures on server core or or remote instances and analyze locally?

<stevenmurawski> ## Any analyzers for PowerShell remoting or CIM over WSMAN?

<Stuwee> ## halr9000, you need to stay close to the mic, you seem to fade alot.

<stevenmurawski> ## Is there an option to view info (with a filter) in realtime, (like the commandline version of wireshark or tcpdump)?

<Stuwee> ## What can you do with PSH and Message Analyzer?

 

The Question -

  • Super power: omniscient