Prof. PowerShell

What's In Your Pipeline?

The pipeline is what makes automating things in PowerShell so easy. But use it wisely.

If you've been reading Prof. PowerShell for any length of time, you know how much importance I place on the PowerShell pipeline. It's what makes PowerShell such a magnificent automation engine, once you get your head around the whole "objects in a pipeline" concept:

PS C:\> get-hotfix -description "Security Update*" | where {$_.InstalledOn -gt "7/1/2012"} | Export-Csv c:\work\secupdates.csv -NoTypeInformation

With a single, albeit long, command, I've retrieved all security updates installed since July 1, 2012 and exported them to a CSV file that I can use elsewhere. But just because you can write something as a one-line PowerShell expression, should you?

The answer will sound like something your Mom might say: "It depends."

If you are working interactively at a PowerShell prompt, using a pipelined expression lets you get a lot done with minimal effort. The same could also be true if using a command in a script. But not always. Taking my hotfix example, Get-HotFix takes some time to run, especially when querying remote computers. If I want to do several things with the data, it certainly makes sense to break this up into more manageable pieces:

$hot = get-hotfix -description "Security Update*"
$recent = $hot | where {$_.InstalledOn -gt "7/1/2012"}
$recent | Export-Csv c:\work\secupdates.csv -NoTypeInformation

With this, I can insert additional commands based on this data:

$hot = get-hotfix -description "Security Update*"
Write-Host "Found $($hot.count) security updates" -foreground Green
#get a unique list of accounts that installed updates
$hot | select -expand InstalledBy -unique | out-file "c:\work\installers.txt"
if (($hot | select -expand HotfixID) -notcontains 'KB2719985') {
  Write-Warning "Required Security Update Not Found!"
}
$recent = $hot | where {$_.InstalledOn -gt "7/1/2012"}
Write-Host "$($recent.count) are after July 1, 2012" -foreground Green
$file = "c:\work\secupdates.csv"
Write-Host "Exporting to $file" -foreground Green
$recent | Export-Csv -Path $file -NoTypeInformation

So the first thing to think about is re-use and what you might want to do with data you've collected, especially if it takes a long time to collect it.

It is also possible there might be performance gains, although this is hard to predict. My recommendation is to learn how to use Measure-Command. For example, on my computer this block of code took about 5020 milliseconds.

get-process | where {$_.path} | Select Path -unique | get-acl | Select Path,Owner

This variation took about 5 ms longer. Hardly noticeable.

$paths = get-process | where {$_.path} | Select Path -unique
$paths | get-acl | Select Path,Owner

Interestingly, this only took 4966 ms:

$paths = get-process | where {$_.path} | Select Path -unique
foreach ($path in $paths) {
  $path | get-acl | Select Path,Owner
}
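The timings above came from wrapping each variation in Measure-Command, which runs a script block and returns a TimeSpan object. Here's a minimal sketch of how you might capture the milliseconds yourself (the exact numbers will vary from machine to machine and run to run):

Measure-Command {
  get-process | where {$_.path} | Select Path -unique | get-acl | Select Path,Owner
} | Select -expand TotalMilliseconds

Because disk caching and process load skew any single run, it's worth running each variation several times and comparing the averages before drawing conclusions.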

But I don't recommend letting performance totally guide your decisions. Again, think about what else you might need to do with your command or script. The bottom line is that using pipelines or variables or some combination is as much an art as it is a science.

About the Author

Jeffery Hicks is an IT veteran with over 25 years of experience, much of it spent as an IT infrastructure consultant specializing in Microsoft server technologies with an emphasis in automation and efficiency. He is a multi-year recipient of the Microsoft MVP Award in Windows PowerShell. He works today as an independent author, trainer and consultant. Jeff has written for numerous online sites and print publications, is a contributing editor at Petri.com, and a frequent speaker at technology conferences and user groups.
