VMware Snapshot Size Powershell Nagios Script

VMware snapshots are a fantastic feature, and they are easy to create. The problem arises when they have been forgotten about. Not only do they consume disk space, they can also take a very long time to remove. The check_vm_snap_size.ps1 plugin for Nagios/Icinga was written to notify you when any snapshot grows over a certain defined size. While other methods exist for checking snapshot file sizes (like running a check via the service console), this plugin uses the PowerCLI interface released by VMware to gather that information. When utilized along with NSClient++ it can easily report the size of your snapshots back to Nagios. Combine that with your favorite performance graphing utility (e.g. Nagiosgraph) and you can show the growth of your snapshot sizes.

While the plugin itself is fairly simple (I am no PowerShell guru), the steps to get it operating securely with NSClient++ and to minimize load are somewhat involved.

Prerequisites

  • VMware PowerCLI installed on the Windows host that will run the check
  • NSClient++ installed on the same host, with NRPE and external script support
  • A Nagios/Icinga server with the check_nrpe plugin
  • PsExec from Microsoft/Sysinternals (used once, to create the credential file as the System account)

Installation and Configuration

The installation and configuration of the script is fairly straightforward by itself. The difficult parts are related to optimizing your PowerCLI environment to reduce its load time; because the script is reloaded every check interval, an unoptimized environment puts extra load on your host. The other piece is to generate a credential file so that you are not passing usernames/passwords across the network needlessly.

PowerCLI Optimization

Because the PowerCLI is loaded every time the script is run, we want to minimize its impact on the system. One way to accomplish this is to manually compile the .NET PowerCLI XmlSerializers; doing this dramatically reduces the CPU load and startup time of the add-in. You will only need to do this once per computer per version of the PowerCLI. A big thanks goes out to vElemental and vNugglets for the commands to do this.

The following script is what I ran on my host; it compiles all known versions that might be on the host (I was lazy and didn’t really want to figure out which version I actually had). Note that this script needs to be run as Administrator (right-click and select “Run As Administrator”).

For 64-bit Operating Systems

C:\Windows\Microsoft.NET\Framework64\v2.0.50727\ngen.exe install "VimService55.XmlSerializers, Version=5.5.0.0, Culture=neutral, PublicKeyToken=10980b081e887e9f"
C:\Windows\Microsoft.NET\Framework64\v2.0.50727\ngen.exe install "VimService51.XmlSerializers, Version=5.1.0.0, Culture=neutral, PublicKeyToken=10980b081e887e9f"
C:\Windows\Microsoft.NET\Framework64\v2.0.50727\ngen.exe install "VimService50.XmlSerializers, Version=5.0.0.0, Culture=neutral, PublicKeyToken=10980b081e887e9f"
C:\Windows\Microsoft.NET\Framework64\v2.0.50727\ngen.exe install "VimService41.XmlSerializers, Version=4.1.0.0, Culture=neutral, PublicKeyToken=10980b081e887e9f"
C:\Windows\Microsoft.NET\Framework64\v2.0.50727\ngen.exe install "VimService40.XmlSerializers, Version=4.0.0.0, Culture=neutral, PublicKeyToken=10980b081e887e9f"
C:\Windows\Microsoft.NET\Framework64\v2.0.50727\ngen.exe install "VimService25.XmlSerializers, Version=2.5.0.0, Culture=neutral, PublicKeyToken=10980b081e887e9f"

If you have a 32-bit OS, use:

C:\Windows\Microsoft.NET\Framework\v2.0.50727\ngen.exe install "VimService55.XmlSerializers, Version=5.5.0.0, Culture=neutral, PublicKeyToken=10980b081e887e9f" 
C:\Windows\Microsoft.NET\Framework\v2.0.50727\ngen.exe install "VimService51.XmlSerializers, Version=5.1.0.0, Culture=neutral, PublicKeyToken=10980b081e887e9f" 
C:\Windows\Microsoft.NET\Framework\v2.0.50727\ngen.exe install "VimService50.XmlSerializers, Version=5.0.0.0, Culture=neutral, PublicKeyToken=10980b081e887e9f" 
C:\Windows\Microsoft.NET\Framework\v2.0.50727\ngen.exe install "VimService41.XmlSerializers, Version=4.1.0.0, Culture=neutral, PublicKeyToken=10980b081e887e9f" 
C:\Windows\Microsoft.NET\Framework\v2.0.50727\ngen.exe install "VimService40.XmlSerializers, Version=4.0.0.0, Culture=neutral, PublicKeyToken=10980b081e887e9f" 
C:\Windows\Microsoft.NET\Framework\v2.0.50727\ngen.exe install "VimService25.XmlSerializers, Version=2.5.0.0, Culture=neutral, PublicKeyToken=10980b081e887e9f" 

UPDATE – Apr 4, 2017
If you are using PowerShell 3.0+ then you need to run a different set of commands in order to make the optimizations. Thanks to a comment on a VMware blog post, the command you need to run for a 64-bit OS is

c:\Windows\Microsoft.NET\Framework64\v4.0.30319\ngen.exe install "VimService55.XmlSerializers, Version=5.5.0.0, Culture=neutral, PublicKeyToken=10980b081e887e9f"  /ExeConfig:%windir%\system32\WindowsPowerShell\v1.0\PowerShell_ISE.exe

Also, according to another comment on the same post, “This optimization is not possible and not needed any more with version 6.5+ of PowerCLI.”

Credential Store Creation

The second challenge with running this script in an automated fashion via NSClient++ is related to authentication and user rights. The script has been designed to utilize the VI Credential Store feature to securely save a credential file to the machine; you then just pass that file’s location in the command string, so you are not actually storing raw usernames and passwords or passing them across the network. The New-VICredentialStoreItem and Get-VICredentialStoreItem cmdlets allow the file to be created; however, the resulting saved files can only be utilized by the user account that created them. Check out the PowerShell Article of the Week from Professional VMware for more information on secure credential storage.

By default the NSClient++ service runs as the local System account, so we need to launch a PowerCLI session as the System account if we want to utilize this feature. Thanks to a post on Ben Parker’s Blog called How do I run Powershell.exe/command prompt as the LocalSystem Account on Windows 7?, we have the answer. The trick is to use PsExec from Microsoft/Sysinternals. Even though Ben’s blog post is specific to Windows 7, the process works just fine on Windows Server 2008 R2.

  1. Download PsExec from Microsoft
  2. Run PsExec from a command prompt as follows: psexec -i -s Powershell.exe (this will open a new window)
  3. In the new PowerShell console window type whoami and it should respond with NT AUTHORITY\SYSTEM
  4. Create the XML credential file by running New-VICredentialStoreItem -host 'host.example.com' -user 'username' -password 'pword' -file c:\hostname.xml, substituting the correct server, user, password, and file location. Note that the location you choose should have the necessary security rights applied.
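
To verify the file was created and can be read back, run Get-VICredentialStoreItem from the same System-account PowerShell session (this example assumes the same c:\hostname.xml path used above):

Get-VICredentialStoreItem -file c:\hostname.xml

It should return the stored item for host.example.com.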

NSClient++ Configuration

In order to make this script work with NSClient++ you must first make sure that your nsclient.ini is configured for external scripts and NRPE; additionally, you need to enable support for argument passing and for nasty meta chars (this last step may not be needed) for the scripts. A sketch of those settings appears after the script definition below. Assuming you have placed the check_vm_snap_size.ps1 script in the NSClient++ scripts folder, you should then add the following to the [/settings/external scripts/scripts] section of the config file.

Two things to note: first, this should all be on one line (it is shown wrapped here for easier reading); second, the last dash - is required.

check_vm_snap_size = cmd /c echo scripts\\check_vm_snap_size.ps1 -hostname $ARG1$ -crit $ARG2$ -warn $ARG3$ -credfile $ARG4$ -hostexclude $ARG5$ -guestexclude $ARG6$; exit($lastexitcode) | powershell.exe -command -
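
For reference, the external script, NRPE, and argument settings mentioned above might look like this in nsclient.ini (a sketch only; the section and key names assume an NSClient++ 0.4.x-style config, so adjust for your version):

[/modules]
NRPEServer = enabled
CheckExternalScripts = enabled

[/settings/NRPE/server]
allow arguments = true
allow nasty characters = true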

Nagios Configuration

The Nagios configuration is pretty straightforward. The check utilizes the check_nrpe command to pass the request to the host. The following is an example of the configuration for the checkcommands.cfg portion.

define command {
command_name  check_vm_snap_size
command_line  $USER1$/check_nrpe -H $HOSTADDRESS$ -t 45 -c check_vm_snap_size -a $ARG1$ $ARG2$ $ARG3$ $ARG4$ $ARG5$ $ARG6$
}

For the service check you will need to create something like the following; the last two arguments are optional, and they refer to hosts and guests that you might want to exclude from the results. The other thing to note is that both the check_nrpe command and the NSClient++ configuration require that all backslashes be escaped because they are special characters; therefore, for each single backslash in the path to your credential file you must enter four backslashes in the service check config.

define service {
service_description  VMware Snapshot Size
host_name            hostname
check_command        check_vm_snap_size!vcenterserver.example.com!1024!512!c:\\\\credfile.xml!excludehost.example.com!excludeguest
use                  generic-service
contact_groups       vm-admins
}

Known Issues/Limitations

  • While the check_vm_snap_size.ps1 script supports passing an array for both the hostexclude and guestexclude parameter options, that functionality does not yet work when sending via check_nrpe. You can specify a single host and a single guest, but not multiple.

Misc Notes

You may need to run Set-ExecutionPolicy for the 64-bit and/or 32-bit PowerShell, depending upon which version of NSClient++ you have installed.
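
For example, from an elevated PowerShell prompt (RemoteSigned is just one reasonable policy choice; run it in both the 64-bit and 32-bit consoles if you are not sure which one NSClient++ invokes):

Set-ExecutionPolicy RemoteSigned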

To Do

The script still needs internal documentation written, and hopefully solutions can be found for the known issues above.

The Script

Save the following as check_vm_snap_size.ps1

param ( [string] $Hostname = "",
 [double] $crit = 100,
 [double] $warn = 50,
 [string] $CredFile,
 [string] $HostExclude =@(""),
 [string] $GuestExclude =@(""),
 [switch] $help
)
$countLargeSnap = 0
$critcount = 0
$warncount = 0
$snapcount = 0
$crittsize = 0
$warntsize = 0
$snaptsize = 0
$LargeSnapNames = ""
$critSnapNames = ""
$warnSnapNames = ""
 
# parameter error checking
if ( $warn -ge $crit) {
 Write-Host "Error - crit vaule must be larger than warn value" -foregroundcolor "red"
 exit 3
}
if ( $Hostname -eq "") {
 Write-Host "Error - Hostname must be specified" -foregroundcolor "red"
 exit 3
}
 
#load VMware PowerCLI
add-pssnapin VMware.VimAutomation.Core -ErrorAction SilentlyContinue
 
# If no credential file is specified, use the permissions of the account running the script;
# otherwise use the credential file to get the host, user, and password strings
if ($CredFile -eq "" ) {
 Connect-VIServer -Server $Hostname -WarningAction SilentlyContinue > $null
}
else {
 $creds = Get-VICredentialStoreItem -file $CredFile
 # check to see if the hostname specified matches the hostname in the credential file
 if ( $Hostname -eq $creds.Host) {
  Connect-VIServer -Server $creds.Host -User $creds.User -Password $creds.Password -WarningAction SilentlyContinue > $null
 }
 else{
  Write-Host "Unknown - Hostname specific does not match hostname in credentials file" -foregroundcolor "red"
  exit 3
 }
}
 
if ($global:DefaultVIServers.Count -lt 1) {
 write-host "Unknown - Connection to host failed!"
 exit 3
}
 
# Get the list of snapshots to evaluate from the host, excluding hosts and
# guests if defined
$snapshots = get-VMhost | ?{$HostExclude -notcontains $_.Name} | get-vm | ?{$GuestExclude -notcontains $_.Name} | get-snapshot
 
# Loop through each snapshot and see if any sizes exceed the warning or critical
# thresholds. If so then store their names and sizes. Could put these into an array
# but that is for another day.
foreach ( $snap in $snapshots ) {
 $snapcount++
 $snaptsize = $snaptsize + $snap.SizeMB
 if ( $snap.SizeMB -ge $warn -and $snap.SizeMB -lt $crit ) {
  $warncount++
  $wVMName = $snap.VM
  $wVMSize = $snap.SizeMB
  $warntsize = $warntsize + $snap.SizeMB
  if ( $warnSnapNames -eq "" ) {
   $warnSnapNames = "${wVMName}:${wVMSize}MB "
  }
  else {
   $warnSnapNames += "${wVMName}:${wVMSize}MB "
  }
 }
 elseif ( $snap.SizeMB -ge $crit ) {
  $critcount++
  $cVMName = $snap.VM
  $cVMSize = $snap.SizeMB
  $crittsize = $crittsize + $snap.SizeMB
  if ( $critSnapNames -eq "" ) {
   $critSnapNames = "${cVMName}:${cVMSize}MB "
  }
  else {
   $critSnapNames += "${cVMName}:${cVMSize}MB "
  }
 }
}
 
if ( $critcount -gt 0 ) {
 Write-Host "Critical -" $critcount "VMs with snapshots larger than" $crit "MB :" $critSnapNames "|snaps=$snapcount;$warncount;$critcount;; ssize=${snaptsize}MB;$warn;$crit;;"
 exit 2
}
elseif ( $warncount -gt 0 ) {
 Write-Host "Warning -" $warncount "VMs with snapshots larger than" $warn "MB :" $warnSnapNames "|snaps=$snapcount;$warncount;$critcount;; ssize=${snaptsize}MB;$warn;$crit;;"
 exit 1
}
else {
 Write-Host "OK - No VMs with snapshots larger than" $warn "MB or" $crit "MB" "|snaps=$snapcount;$warncount;$critcount;; ssize=${snaptsize}MB;$warn;$crit;;"
 exit 0
}

Review: SB EventLog Monitor

I have only one thing to say about this product: “How did I ever live without it”. If you manage more than one Microsoft Windows Server then you definitely need to be using SB EventLog Monitor.

So what does SB EventLog Monitor do that is so great? It collects, collates, and reports on Microsoft Event Log data via a web interface. The UNIX world has had syslog forever, plus a ton of tools to help you manage the logging data generated by servers. I’ve even tried to shoehorn Microsoft Event Log data into some of those products, but it was never a good fit. SB EventLog Monitor allows you to quickly and easily manage and analyze what is going on across all of your servers. You can view and filter error messages from different servers and identify patterns. This is particularly useful when dealing with multiple servers across slow WAN links.

It collects the Event Log data either via a Microsoft VB script that uses WMI to collect only the new events, or via an agent that you can install on your servers. The other requirements are MySQL, PHP (5.0+), and a web server (Apache, IIS). While the install is geared towards running everything on a Microsoft server, it is possible to run the database and web server on Linux; in fact, that is what I did. The install is really pretty easy, so if you are looking for a relatively simple way to increase the manageability of your servers, then I strongly recommend that you take the time to install the open source SB EventLog Monitor.

Review: Script Your Documentation Instantly

If you are anything like me you probably have little to no documentation on your servers. Probably it is because you don’t have the time or the personnel to perform the tedious (and boring) work required; you know it is important, but other things seem to take priority. Well, you no longer have any excuse for not getting it done.

SYDI (Script Your Documentation Instantly) is an open source solution that will document your Windows Servers, MS SQL Servers, and Exchange Organizations. It is a fantastically easy product to use. In its simplest form, it will query an individual server and produce a Microsoft Word document detailing the hardware, software, networking, user accounts, and storage settings, with a table of contents and loads of other useful information. With a little extra work, you can have it query all of your servers and produce a set of XML files that can be converted into HTML documents (using an included script), which makes publishing a breeze.

Using SYDI is really simple; it is just a VBS script that is launched from a command prompt. You do need to have Microsoft Word installed on the workstation if you want it to produce the documentation in that format. The software is written by an IT Consultant named Patrick Ogenstad. He has posted some really good how-to guides on his site.
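
For reference, launching it is just a matter of running the script under cscript (sydi-server.vbs is the Windows documentation script in the download; the switches for picking remote targets and output formats are covered in those how-to guides):

cscript sydi-server.vbs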

I highly recommend that you take a look at SYDI and use it to help jump-start your server documentation project. I did, and in about 30 minutes I had published our server configs on our Network Management server.

Syncing Files Between PC’s and Mac’s

I don’t know how I missed this, but Microsoft has a really great product/service that allows you to synchronize files between multiple computers (PCs and Macs) across the Internet (thanks Mike for pointing this out). In addition, the product also allows you to share files with your friends. This communication happens in a private peer-to-peer fashion and is encrypted. The product is called FolderShare; Microsoft purchased it late last year and is offering the service for free.

Quoting from their site you can:

  • Synchronize all your devices – Retrieve work files at home or access photos at work. With your devices in sync, you no longer have to be frustrated that your information is on another computer.
  • Share files, photos, and home videos with your peers – Select the content you want to share, invite members, and they will be able to access the shared files directly from their device.
  • Access your computer or device remotely – FolderShare mobile access allows you to access your computer from any web browser.

The service works as advertised and is something that I plan to share with my Dad and Father-in-Law as a way for them to back up their information between their machines (they both have desktops and laptops), as well as provide them a way to back up to my house (if they want).

Initially I had thought that it would be a perfect, easy-to-use solution to my off-site storage problem related to backup. While it works great, it does have a few limitations that make it unsuitable for use in my particular situation.

There are three big limiting factors. First, it only supports files up to 2 GB in size (not that big a deal if you take that into account and split your big files). The real killer for me, though, is the limitation of 10,000 files per “library”; some of my directories have over 25,000 files (think of all the pictures my wife takes). I started thinking about workarounds, but eventually nixed those ideas as just too complex. The third issue is that it runs as a user-level application, so you have to be logged in with the application running in order for it to sync.

This is a great tool for personal and/or small business use. It is really easy to use, it’s free (at least for now), and it fills a niche in most people’s backup strategy by providing off-site backups as well as access to your files (particularly if you are a laptop user). Finally, it is cross-platform (Mac OS X and Windows).

I Want to Backup Both My Mac and PC

So I recently purchased two cheap 250 GB external USB hard drives. I plan to use them solely for backing up both my Mac and Windows PC. I plan to keep one unit at the house and the other off-site, either at my office or maybe shipped to my Dad’s for safe keeping.

I’ve been looking around for the best way to back up both systems to the same hard drive. My problem is that I want the solution to be simple and robust. Ideally, I’d like to be able to plug the hard drive into either computer and back the other up. I’d like to be able to see the files from both OS’s at the same time. I’d like to do daily, weekly, and monthly backups with incrementals and compression. It would be nice to also mirror the boxes so that I have a bootable device as well.

I know that I won’t get all that with just one solution (and basically one drive). I’m going to try a few things and see what works, so this post is basically just to document some initial findings on what I may want to try.

I found the following on the forums at OSXFAQ:

Under Panther, the command line diskutil tool has an option entitled “MBRFormat” for its “partition” verb, which writes the MBR in DOS/Windows format.

In other words, if you have a disk which is accessible through the device node /dev/disk1, and you wish to partition it into two 80GB partitions – one FAT32 and one HFS+J, you do it as follows:

Code:
diskutil partitionDisk /dev/disk1 2 MBRFormat MS-DOS DosDrive 80G "Journaled HFS+" MacDrive 80G

This would allow me to plug the drive into either type of computer and copy the data over using some method. On the Mac I’d be able to see both file system types, but without some commercial software like MacDrive I would only be able to see the FAT32 partition on the PC. I could leave the drive attached to the Mac and copy the files over the network from the PC; I’d then be able to restore directly if I needed to. I’d need some software on the Mac to make this work.

One really neat solution (that is probably overkill for me) is called BackupPC.

BackupPC is a high-performance, enterprise-grade system for backing up Linux and WinXX PCs and laptops to a server’s disk. BackupPC is highly configurable and easy to install and maintain.

Given the ever decreasing cost of disks and raid systems, it is now practical and cost effective to backup a large number of machines onto a server’s local disk or network storage. This is what BackupPC does. For some sites, this might be the complete backup solution …

BackupPC is written in Perl and extracts backup data via SMB using Samba, tar over ssh/rsh/nfs, or rsync. It is robust, reliable, well documented and freely available as Open Source on SourceForge …

BackupPC Features:

  • A clever pooling scheme minimizes disk storage and disk I/O. Identical files across multiple backups of the same or different PCs are stored only once resulting in substantial savings in disk storage and disk I/O.
  • One example of disk use: 95 laptops with each full backup averaging 3.6GB each, and each incremental averaging about 0.3GB. Storing three weekly full backups and six incremental backups per laptop is around 1200GB of raw data, but because of pooling and compression only 150GB is needed.
  • Optional compression support further reducing disk storage. Since only new files (not already pooled) need to be compressed, there is only a modest impact on CPU time…

I’ve also looked at

  • rdiff-backup which uses rsync-like methods plus incremental backups (even of binary files), but they don’t have good Windows support yet
  • SuperDuper which is a free/shareware Mac application that will make a mirrored bootable drive as well as has other backup modes.
  • An article about HOWTO: Backup Your Mac With rsync which I’ve used before
  • Dirvish which is a set of scripts for rsync based backups
  • A script for copying opened files on Windows XP and 2003 Server (it uses VSS)
  • An article from LifeHacker about how to backup your PC (with a software recommendation).

I will probably start with just partitioning my disk into two sections and doing a basic copy to get things going, and experiment from there. Once I finally get a solution that I like I’ll be sure to update everyone, and if you have a particular solution that you like, please leave me a comment.
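
For the basic copy on the Mac side, I will probably start with an rsync invocation along these lines (the paths here are just placeholders for my own layout):

rsync -av --delete /Users/myuser/ /Volumes/MacDrive/Users-backup/

The -a flag preserves permissions and timestamps, -v lists the files as they are copied, and --delete keeps the backup from accumulating files that have been removed from the source.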

Windows Time Service on PDC

Whenever you are moving the PDC emulator role to another computer on a Windows Server 2003 network, you need to make the following changes to the time service on both the new machine and the old.

Configure the Windows Time Service on the new PDC emulator

  1. Open a Command Prompt.
    Type the following command to display the time difference between the local computer and a target computer, and then press ENTER:

    w32tm /stripchart /computer:target /samples:n /dataonly

    target – Specifies the DNS name or IP address of the NTP server that you are comparing the local computer’s time against, such as time.windows.com.

    n – Specifies the number of time samples that will be returned from the target computer to test basic NTP communication.

  2. Open UDP port 123 for outgoing traffic if needed.
  3. Open UDP port 123 (or a different port you have selected) for incoming NTP traffic.
  4. Type the following command to configure the PDC emulator and then press ENTER:

    w32tm /config /manualpeerlist:peers /syncfromflags:manual /reliable:yes /update

    where peers specifies the list of DNS names and/or IP addresses of the NTP time sources that the PDC emulator synchronizes from. For example, you can specify time.windows.com. When specifying multiple peers, use a space as the delimiter and enclose them in quotation marks, as in the example below.
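
    For example, to point the PDC emulator at two public NTP sources (the hostnames here are placeholders for whatever sources you prefer):

    w32tm /config /manualpeerlist:"0.pool.ntp.org 1.pool.ntp.org" /syncfromflags:manual /reliable:yes /update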

Change the Windows Time Service configuration on the old PDC emulator

  1. Open a Command Prompt.
  2. Type the following command and then press ENTER:

    w32tm /config /syncfromflags:domhier /reliable:no /update

  3. Type the following command and then press ENTER:

    net stop w32time

  4. Type the following command and then press ENTER:

    net start w32time

Insecure Protocols, Passwords, Ettercap, and PHP

So I was doing my daily web browsing looking for something cool when I came across something that, for the first time, really hit home how completely insecure the Internet is and how simple it is for people to grab your passwords.

Now, I’ve been on/using the Internet in one shape or form since 1987, and have seen things migrate from the nearly open educational network of those days to the amazing and scary thing that it is today. The problem lies in the fact that a lot of the Internet still uses the same insecure protocols that were popular back in 1987; in particular POP/IMAP email authentication, along with basic HTTP logins, and others.

I’m an IT professional (granted, I’m in management now, but still …) but I was taken aback by how utterly easy it is to now capture username and password information. So you’re thinking: no big deal, I use a “secure” website to access my online banking, etc. Well, do you also use a different password for every site you visit?

What really hit home for me was this post about the Wall of Shame on the Irongeek site. What the site has is some very simple PHP to display the username/password combination sniffed out by Ettercap.

Ettercap is a suite for man in the middle attacks on LAN. It features sniffing of live connections, content filtering on the fly and many other interesting tricks. It supports active and passive dissection of many protocols (even ciphered ones) and includes many features for network and host analysis.

Ettercap is an amazing piece of software with many features. The main site only offers the source code, so, knowing that I needed to run the Irongeek page on my Linux box, I started to compile the ettercap code. My box was missing one of the libraries, and seeing as I only wanted a proof of concept anyway, I downloaded a precompiled Windows binary. Within about 2 minutes I had everything up and running and I was capturing the usernames and passwords of the traffic crossing my network. Wow, that’s really scary.
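
For the curious, the kind of invocation involved is remarkably short. This is a sketch based on the classic ettercap NG syntax (options vary between versions, so check the man page for yours):

ettercap -T -q -M arp // //

That runs the text interface (-T) quietly (-q) and ARP-poisons every host it finds (the // // target spec means everyone), sniffing credentials from the traffic that then flows through your machine.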

Today’s little 30 minute diversion is going to really cause me to change the way I use the Internet, and secure my password information. I’m going to look at SSH/SSL tunneling for all my data when I can’t use a secure stream.

How to Have Your Widget and Use It Too

So you’re running Mac OS X Panther (10.3) or some flavor of Microsoft Windows and have been drooling over Tiger’s (not so) new Dashboard feature. Well there is no need to drool anymore, because you can have your widgets and use them too. Konfabulator is now free. For those that don’t know, Konfabulator was the original cross-platform widget platform from which some claim Apple appropriated the concept.

Konfabulator was recently purchased by Yahoo! and subsequently released for free. There are thousands of free widgets available at their Widget Gallery. A couple of my favorites, besides the included weather widget, are: TVScaper, Wunder Radar, and ShortStat.

Nagiosgraph with Windows support

After reviewing the four main tools for graphing performance with Nagios (APAN, Nagiosgraph, Nagiostat, and PerfParse), I decided that Nagiosgraph was the easiest for me to get up and running. Out of the box it worked great for my Linux systems and my network tests, but I needed to add support for monitoring my Windows servers.

I have used APAN in the past, but it was really tough to configure. I also tried PerfParse and liked it. However, it required a lot more resources for the database than I was prepared to handle, and I could probably only keep 30 days of data. But it worked great.

To make things easier I installed the latest CVS nightly of the 1.4.0alpha Nagios Plugins. As of 20040817 these plugins supported performance data output for the check_nt plugin (the one that works with the NSClient service). Once these plugins were compiled and installed, I updated the nagiosgraph map file. This file is what is used to parse the check output when generating the stats.
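
The map file is a series of Perl match-and-push rules. As a sketch of the kind of entry involved (this assumes check_nt CPULOAD perfdata of roughly the form '5 min avg Load'=2%;80;90, so treat the regex as a starting point rather than gospel):

/perfdata:'5 min avg Load'=(\d+)%/
and push @s, [ 'cpuload',
               [ 'avg5min', GAUGE, $1 ] ];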
