Archive for the ‘Windows Server’ Category

Mashing CSVs around using PowerShell

February 15, 2012 2 comments

Since I spend most of my day in the console, PowerShell also serves as my ‘Excel’. So, continuing my recent trend of PowerShell related posts, let’s dig into a quick and easy way to parse up CSV files (or most any type of file) by creating objects!

We, of course, need a few rows of example data. Let’s use this pseudo student roster.

Example data:

Student,Code,Product,IUID,TSSOC,Date
123456,e11234,Reading,jsmith,0:18,1/4/2012
123456,e11234,Reading,jsmith,1:04,1/4/2012
123456,e11234,Reading,jsmith,0:27,1/5/2012
123456,e11234,Reading,jsmith,0:19,1/7/2012
123456,e11235,Math,jsmith,0:14,1/7/2012

Now, for reporting, I want my ‘Minutes’ to be a calculation of the TSSOC column (hours:minutes). Easy, we have PowerShell–it can split AND multiply!

The code:

Begin by creating an empty array to hold our output, importing our data into the ‘pipe’, and opening up an iteration (for each) function. The final $out is our return value–calling our array so we can see our results.

$out = @()
import-csv data_example.csv |
   % {

   }
$out

Next, let’s add in our logic to split out the hours and minutes. We have full access to the .NET string methods in PowerShell, which includes .Split(). .Split() returns an array, so since we have HH:MM, the first element is our hours and the second is our minutes. Hours then need to be multiplied by 60 to convert them into minutes.

You’ll also notice the [int] casting–this ensures we can properly multiply. Give it a whirl without the cast and you’ll get sixty 0’s or 1’s back (it multiplies the string).

$out = @()
import-csv data_example.csv |
   % {
	$hours = [int]$_.TSSOC.Split(':')[0] * 60
	$minutes = [int]$_.TSSOC.Split(':')[1]
   }
$out
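
If you’re curious what happens without the cast, here’s a quick check at the prompt (using a literal value in place of $_.TSSOC):

# Without the cast, PowerShell repeats the string rather than doing math.
'1:04'.Split(':')[0] * 60        # sixty 1's
[int]'1:04'.Split(':')[0] * 60   # 60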

The next step is to create a new object to contain our return values. We can use the new PowerShell v2.0 syntax to create a quick hashtable of our properties and values. Once we have our item, add it to our $out array.

$out = @()
import-csv data_example.csv |
   % {
	$hours = [int]$_.TSSOC.Split(':')[0] * 60
	$minutes = [int]$_.TSSOC.Split(':')[1]
        $item = new-object PSObject -Property @{
			Date=$_.Date;
			Minutes=($hours + $minutes);
			UserId=$_.IUID;
			StudentId=$_.Student;
			Code=$_.Code;
			Course=$_.Product
		}
	$out = $out + $item
   }

With that, we’re done: we can pipe $out to Sort-Object for a bit of sorting, grouping, table formatting, or even export it BACK out as another CSV.

$out = @()
import-csv data_example.csv |
   % {
	$hours = [int]$_.TSSOC.Split(':')[0] * 60
	$minutes = [int]$_.TSSOC.Split(':')[1]
        $item = new-object PSObject -Property @{
			Date=$_.Date;
			Minutes=($hours + $minutes);
			UserId=$_.IUID;
			StudentId=$_.Student;
			Code=$_.Code;
			Course=$_.Product
		}
	$out = $out + $item
   }
$out | sort-object Date, Code | ft -a

Quick and dirty CSV manipulation–all without opening anything but the command prompt!

UPDATE: Matt has an excellent point in the comments below. PowerShell isn’t the ‘golden hammer’ for every task; it’s about finding the right tool for the job. We’re a mixed environment (Windows, Solaris, RHEL, Ubuntu), so PowerShell only applies to our Windows boxes. However, as a .NET developer, I spend 80-90% of my time on those Windows boxes. So let’s say it’s a silver hammer. 🙂

Now, the code in this post looks pretty long–and hopping back and forth between notepad, the CLI, and your CSV is tiresome. I bounce back and forth between the CLI and notepad2 with the ‘ed’ and ‘ex’ functions (these commands are ‘borrowed’ from Oracle’s SQL*Plus). More information here.

So how would I type this if my boss ran into my cube with a CSV and needed a count of Minutes?

$out=@();Import-Csv data_example.csv | % { $out += (new-object psobject -prop @{ Date=$_.Date;Minutes=[int]$_.TSSOC.Split(':')[1]+([int]$_.TSSOC.Split(':')[0]*60);UserId=$_.IUID;StudentId=$_.Student;Code=$_.Code;Course=$_.Product }) }; $out | ft -a

Now, that’s quicker to type, but a LOT harder to explain. 😉 I’m sure this can be simplified down–any suggestions? If you could do automatic/implied property names, that’d REALLY cut it down.
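
One possible trim (an untested sketch, not what I actually run) is to lean on Select-Object calculated properties and skip building $out entirely:

Import-Csv data_example.csv |
   Select-Object Date, Code,
      @{Name='Minutes';   Expression={ [int]$_.TSSOC.Split(':')[0] * 60 + [int]$_.TSSOC.Split(':')[1] }},
      @{Name='UserId';    Expression={ $_.IUID }},
      @{Name='StudentId'; Expression={ $_.Student }},
      @{Name='Course';    Expression={ $_.Product }} |
   ft -a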

NuGet Package Restore, Multiple Repositories, and CI Servers

January 20, 2012 1 comment

I really like NuGet’s new Package Restore feature (and so do our git repositories).

We have several common libraries that we’ve moved into a private, local NuGet repository on our network. It’s really helped deal with the dependency and version nightmares between projects and developers.

I checked my first project using full Package Restore and our new local repositories into our CI server (TeamCity) the other day and noticed that the Package Restore feature couldn’t find the packages stored in our local repository.

At first, I thought there was a snag (permissions, network, general unhappiness) with our NuGet share, but all seemed well. To my surprise, repository locations are not stored in that swanky .nuget directory, but as part of the current user profile. %appdata%\NuGet\NuGet.Config to be precise.

Well, that’s nice on my workstation, but NETWORK SERVICE doesn’t have a profile and the All Users AppData directory didn’t seem to take effect.

The solution:

For TeamCity, at least, the solution was to set the TeamCity build agent services to run as a specific user (I chose a domain user in our network, you could use a local user as well). Once you have a profile, go into %drive%:\users\{your service name}\appdata\roaming\nuget and modify the nuget.config file.

Here’s an example of the file:

<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <packageSources>
    <add key="NuGet official package source" value="https://msft.long.url.here" />
    <add key="Student Achievement [local]" value="\\server.domain.com\shared$\nuget" />
  </packageSources>
  <activePackageSource>
    <add key="NuGet official package source" value="https://msft.long.url.here" />
  </activePackageSource>
</configuration>

Package Restore will attempt to find the packages in the ‘activePackageSource’ first, then proceed through the list.

Remember, if you have multiple build agent servers, this must be done on each server.

Wish List: The option to include non-standard repositories as part of the .nuget folder. 🙂

Posting to Campfire using PowerShell (and curl)

January 16, 2012 Comments off

I have a few tasks that kick off nightly that I wanted to post status updates into our team’s Campfire room. Thankfully, 37signals Campfire has an amazing API.  With that knowledge, time to create a quick PowerShell function!

NOTE: I use curl for this. The Linux/Unix folks likely know curl; the Windows folks, however, probably have funny looks on their faces. You can grab the latest curl for Windows here (the Win32 or Win64 generic versions are fine).

The full code for this function is available via gist.

I pass two parameters: the room number (though this could be tweaked to be optional if you only have one room) and the message to post.

param (
 [string]$RoomNumber = (Read-Host "The room to post to (default: 123456) "),
 [string]$Message = (Read-Host "The message to send ")
)
$defaultRoom = "123456"
if ($RoomNumber -eq "") {
 $RoomNumber = $defaultRoom
}

There are two baked-in variables: the authentication token for your posting user (we created a ‘robot’ account that we use) and the YOURDOMAIN prefix for Campfire.

$authToken = "YOUR AUTH TOKEN"
$postUrl = "https://YOURDOMAIN.campfirenow.com/room/{0}/speak.json" -f $RoomNumber

The rest is simply using curl to HTTP POST a bit of JSON back up to the web service. If you’re not familiar with the JSON data format, give this a quick read. The best way I can sum up JSON is that it’s XML objects for the web with less wrist-cutting. 🙂

$data = "`"{'message':{'body':'$message'}}`""

$command = "curl -i --user {0}:X -H 'Content-Type: application/json' --data {1} {2}" -f `
     $authToken, $data, $postUrl

$result = Invoke-Expression ($command)

if ($result[0].EndsWith("Created") -ne $true) {
	Write-Host "Error!" -foregroundcolor red
	$result
}
else {
	Write-Host "Success!" -foregroundcolor green
}
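
Calling it is as simple as passing the room and message. A hypothetical invocation, assuming the script is saved as SendTo-Campfire.ps1:

.\SendTo-Campfire.ps1 -RoomNumber "123456" -Message "Nightly extract completed."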
Running SendTo-Campfire with Feedback

Indeed, there be success!


It’s important to remember that PowerShell IS extremely powerful, but it can become even more powerful when coupled with other available tools–even the web itself!

PowerShell: Recreating SQL*Plus ‘ed’ Command

November 9, 2009 Comments off

Aside from PowerShell, I spend a bit of time day-to-day in Oracle’s SQL*Plus.  I’m slowly replacing my dependency on SQL*Plus with PowerShell scripts, functions, and such, but there are still some core things for which it’s just easier to bring up a console and hack them out.

One thing I really love about SQL*Plus is the ‘ed’ command.  The ‘ed’ command dumps your last statement into a temp file, then opens it in Notepad.  Make your changes in Notepad, save, and rerun (in SQL*Plus, it’s a single slash and then Enter), and the frustration of lengthy command line editing is solved.

So, this is a work in progress—my attempt to simulate the same process in PowerShell.

Example Situation

Let’s say, for example, we have a long command we have been hacking away at, but are getting tired of arrowing back and forth on:

$count = 0; sq mydb "select id from reports" | % { $count++ }; $count

(Yes, yes, it’s not ‘very long’, but that’s why this is called an example!)

So, a simple row counter.

The ‘ed’ Command

To recreate the ‘ed’ experience, I need two things:

  • Access to the command line history via get-history.
  • A temporary file.

touch c:\temp.tmp

get-history |
   ? { $_.commandline -ne "ex" } |
   ? { $_.commandline -ne "ed" } |
   select commandline -last 1 |
   % { $_.CommandLine } > c:\temp.tmp

np c:\temp.tmp

Stepping through the code:

  1. Create a temporary file.  I used c:\temp.tmp, but you could get a bit fancier and use the system’s $TEMP directory or something.
  2. Get-History.  This returns an ascending list of your command history in PowerShell by ID and CommandLine.  We want to ignore any calls to our ‘ed’ or ‘ex’ functions and grab the “last 1”—then insert it into our temp file.
  3. Finally, open up the temp file in NotePad (np is an alias on my PowerShell profile for NotePad2).

So now that I have my ed command, let’s try it out after I’ve already run my simple row counter.

Last Command in NotePad2

Cool.  We can now make changes to this and SAVE the temp file.  The next step would be to jump back to PowerShell and execute the updated command.

The ‘ex’ Command

Since the slash has special meaning in a PowerShell window, I simply named this script ‘ex’ instead.  Ex… execute… surely I can’t forget that, right?

get-content c:\temp.tmp |
   % {
      $_
      invoke-expression $_
   }

The ‘ex’ command is pretty simple.

  1. Get the content of our temporary file from get-content.
  2. For each line (since get-content returns an array of lines), print out the line as text and then invoke (run) it using invoke-expression.

Invoke-Expression takes a string as a command (from our text file) and returns the results.  For more information on Invoke-Expression, check out MSDN.
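
If you’d rather carry these around in your profile instead of as separate scripts, here’s a minimal sketch of the same logic as two functions (same assumptions as above: c:\temp.tmp as the scratch file, plain Notepad as the editor):

function ed {
   get-history |
      ? { $_.CommandLine -ne "ed" -and $_.CommandLine -ne "ex" } |
      select commandline -last 1 |
      % { $_.CommandLine } > c:\temp.tmp
   notepad c:\temp.tmp
}

function ex {
   get-content c:\temp.tmp | % { $_; invoke-expression $_ }
}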

Wrapping It Up

Now that we have our two commands—ed and ex—we’re ready for those “oh, this is short and easy” commands that turn into multi-page monsters—without fighting tabs and arrow keys.

As I said, this is a work-in-progress.  I’ll update the post as new/better ideas come up. 🙂

Updated: PowerShell “Code-Cat” With Colors and Line Breaks

October 13, 2009 Comments off

Motivated by Jeffery Hicks’ fun with my original “code-cat” version, I decided to fill in a few of the original updates I had floating in my head.

The full code is available here via CodePaste.Net.

First off, I loved the idea of colorizing the comments in a file; however, I wanted to keep my line numbers out of the “coloring” and simply color the comments.  To do this, I had to break apart the line numbers from the actual output.

Coloring Code Cat Comments

Second, I really wanted a cleaner way to wrap lines in my code and to have those lines still line up with everything else.

Line Wraps

The full code is available via CodePaste.Net (click here), but let’s step through it and see how things work.

SplitString Function

You’ll notice at the top, I’ve included a custom helper function that takes a string and splits it into an array.  I use this to break up “long” strings into window-sized strings.

function splitString([string]$string, [int]$length)
{
 $lines = @();
 $stringLength = $string.Length;
 $position = 0;

 while ($position -lt $stringLength)
 {
  if (($position + $length) -lt $stringLength)
  {
   $lines += $string.substring($position, $length);
  }
  else
  {
   $lines += $string.substring($position);
  }
  $position += $length;
 }

 return $lines;
}

This method is actually in my PowerShell profile for use in numerous parsing activities.
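
For example, splitting a sentence into 16-character chunks:

splitString "The quick brown fox jumps over the lazy dog" 16
# returns: "The quick brown ", "fox jumps over t", "he lazy dog"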

Colors and Window Size

Unlike the original method, I’ve pulled out the colors into variables so they are easily configured.  I am also using the built-in PowerShell $Host object to determine the width of my console window.  I’ll use this later for the line breaks.

$codeColor = "Gray";
$counterColor = "White";
$commentColor = "Green";
$newLineColor = "Yellow";

# Window size, minus the counter, spacing, and a bit extra padding.
$maxLineSize = $Host.UI.RawUI.WindowSize.Width - 12;

Parsing The Code File

There are a few important points in the $content body, so here we go!

Replacing Tabs With Spaces

I found that the Substring and Length functions got really quirky (is a tab one character or five, etc.) when they came across several tabs (and tabbing is pretty common in a code file, especially some SQL files).  To remedy this, I replace tabs with five spaces.  This also allows me to control the tabs a bit more, so if I wanted to ease my code output to the left, I could use three or four spaces.

# remove tabs; they're counted as one character, not five.
$currentLine = $_.Replace("`t", "     ");

Setting the Color of Comments

Using Jeffery’s example, I’m also setting my comments (for PowerShell, C#, and TSQL) to a bright color.

# Color-code comments green (PowerShell, C#, SQL)
if ($currentLine.Trim().StartsWith("#") -or `
    $currentLine.Trim().StartsWith("//") -or `
    $currentLine.Trim().StartsWith("--"))
{
     $foregroundColor = $commentColor;
}

Separating Line Numbers From Content

I didn’t want my line numbers to show up green for comments and I wanted to treat them with a different color than the rest of the code body, so I’m taking advantage of the “-nonewline” feature of write-host. This returns the content without forcing a new line break into the result.  Very handy.

write-host ("{0:d4} | " -f $counter) -nonewline -foregroundColor $counterColor;

Checking for Line Breaks and Handling Them

The next bit is in “beta” for the time being.  The gist is that it looks at the length of the current line vs. the maximum line size we specified in the variables. 

If the current line is longer, it breaks the line into an array of “max line size” lines.  The first line is output using our normal methods—we want our counter to show up, right?  The ‘overflow’ lines are output using a fancy arrow “\-->” and do not include a counter—simply because they are not ‘new lines’.

Of course, if our line is shorter than our window size, it is simply output.

if ($currentLine.Length -ge $maxLineSize)
{
 $splitLines = splitString $currentLine $maxLineSize;
 $countOfLines = $splitLines.Length;
 for ($i = 0; $i -lt $countOfLines; $i++)
 {
  if ($i -eq 0)
  {
   # our first line doesn't look any different.
   write-host ("{0}" -f $splitLines[$i]) -foregroundColor $foregroundColor;
  }
  else
  {
   # next lines are using the fancy arrow.
   write-host ("\--> |   ") -nonewline -foregroundColor $newLineColor;
   write-host ("{0}`n" -f $splitLines[$i]) -nonewline -foregroundColor $foregroundColor;
  }
 }
}
else
{
 write-host ("{0}" -f $currentLine) -foregroundColor $foregroundColor;
}

Again, the full code is available here via CodePaste.Net.

Like all things, a work in progress!

PowerShell – Easy Line Numbers using Cat (Get-Content)

October 8, 2009 2 comments

I often cat out code pages from the prompt when I simply want to see what’s inside or look for a method.  Cat, of course, is an alias to PowerShell’s get-content cmdlet, but cat brings me back to days in Unix and makes me feel better. 🙂

One thing that was a bit of a trouble was looking for a specific line of code.

For example: Looking through the debug logs, I see an error occurring at line 40 of some file.  That’s great, but who wants to count lines?  Instead, we open up Visual Studio and look.  Well, that’s a hassle too.

Here’s a quick script that does a simple line count with the line in the file.

param ([string] $filename)
$counter = 0;
get-content $filename |
 % { $counter++; write-host "$counter`t| $_" }

Since the result of get-content becomes an enumerable line-by-line list, it’s easy to iterate through and increment a counter.

To call it, just call the script with the file name.  For mine, I called it cat-code.ps1, though I’ll probably set an alias to ‘cc’ or something later on.

.\cat-code.ps1 psake_default.ps1

Code Cat example.

From here, you could add in a check to the $Host.UI.RawUI and handle the overflows a bit better, but that’s for another day!
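
A rough sketch of what that check might look like (truncating, rather than wrapping, anything wider than the window):

param ([string] $filename)
$counter = 0;
$width = $Host.UI.RawUI.WindowSize.Width - 8;   # leave room for the counter column
get-content $filename |
 % {
    $counter++;
    $line = if ($_.Length -gt $width) { $_.Substring(0, $width) } else { $_ };
    write-host "$counter`t| $line"
 }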

Automating Extracts Using External .SQL Files and PowerShell

September 29, 2009 Comments off

Rather than rely on a system task, I wanted to be able to kick off an export of one of our SQL databases on the fly, have it generate a CSV, then email the CSV to one of our developers.

Sounds like a task for PowerShell!

Why PowerShell? Well, frankly, because moving jobs around and such on our mish-mash of SQL versions and servers is a bit annoying and I wanted to see if it worked.

So, to start off, I have a file, call it export_data.sql that contains all of the logic of my SQL query and the format I require.

I’ll also be using the Microsoft SQL query script I discussed in this blog post.

The tricky part is reading in the .sql file as a single “entity”—not the array of rows that get-content usually provides.  Thankfully, there is an easy way to do it.

@(gc 'export_data.sql' -readcount 0)

According to the PowerShell documentation, the readcount parameter serves two purposes: one is to set the number of lines to read at a time, the second (when set to zero) is to pull in all of the lines at once.  Great, that’s exactly what we need.

From there, it’s simply feeding our query to the sql-query function I wrote and exporting to CSV.  Nicely enough, CSV exports are built into PowerShell!

The final command looks something like this:

@(gc 'export_data.sql' -readcount 0) |
   % { sql-query server db $_ } |
   export-csv dump.csv

I could then add another step to mail the CSV using the built-in Send-MailMessage cmdlet.
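
That extra step might look something like this (addresses and server names are placeholders):

Send-MailMessage -To "developer@ourdomain.com" -From "reports@ourdomain.com" `
   -Subject "Nightly extract" -Attachments "dump.csv" -SmtpServer "smtp.ourdomain.com"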

❤ PowerShell.