Add a possible owner to a Cluster Shared Volume

Posted: 26th September 2011 by Seth Killey in Virtualization

This is more of a bookmark for myself: when performing a live migration using a previously created CSV, moving the CSV to another node may fail.

Taken from this blog

Once a shared volume becomes a Cluster Shared Volume, you can’t modify the possible owner list from the Failover Cluster Manager console.

If you try to “Move this shared volume to another node” and that node is not a possible owner, you get:

Operation has failed.

The action ‘Move to node <nodename>’ did not complete

Error code: 0x80071398. The operation failed because either the specified cluster node is not the owner of the group, or the node is not a possible owner of the group.

It can be changed from the command line:

cluster . RES "Cluster Disk 1" /ADDOWNER:<nodename>

Hyper-V Live Migration did not succeed at the source

Posted: 26th September 2011 by Seth Killey in Virtualization

After setting up a 2-node Hyper-V cluster I was ready to perform a Live Migration test; however, I got a series of errors which prevented Live Migration from completing successfully.  Event ID 21111 and Event ID 21502:

‘Virtual Machine <servername>’ live migration did not succeed at the source.

Followed by Event ID 20122:

‘<servername>’ failed to migrate: the migration was canceled (Virtual machine ID <guid>)

What I discovered is that I had a different name specified for my NIC under Virtual Network Properties in Hyper-V.  To check, go to Hyper-V Manager, choose Virtual Network Manager from the Actions menu, click on the network card, and verify that all nodes use the same name.  In my case I had 2 different names specified; once they matched, Live Migration worked as expected.

Notes from VMM 2008 R2 P2V conversion

Posted: 19th September 2011 by Seth Killey in Virtualization

This past weekend I used Microsoft’s Virtual Machine Manager 2008 R2 to perform a physical to virtual conversion of a production Windows 2008 server.  I encountered a few stumbling blocks and so I figured I’d document how I overcame them while it’s still fresh in my mind.

  • The first attempt I made at performing a P2V failed with the error below.  After doing some research I decided to temporarily disable all anti-virus real-time protection on the host server.  I use AVG 2011, but each AV program is a little different, so consult your AV documentation.

Error (12700)
VMM cannot complete the Hyper-V operation on the <servername> server because of the error: ‘<servername>’ failed to add device ‘Microsoft Synthetic Ethernet Port’. (Virtual machine ID 1EF318A1-929E-4B70-AA66-2E2FF87A26B0)

The Virtual Machines configuration 1EF318A1-929E-4B70-AA66-2E2FF87A26B0 at ‘\\?\Volume{22af413c-1122-4d78-96e8-75b1aea68d39}\<servername>’ is no longer accessible: The requested operation cannot be performed on a file with a user-mapped section open. (0x800704C8)
(Unknown error (0x8000))

Recommended Action
Resolve the issue in Hyper-V and then try the operation again.

  • Once my anti-virus real-time protection was disabled I attempted to restart my failed job.  However, I got the error below.  I followed this technet article and performed each of the steps listed, attempting to restart my failed job after each one, with the same error every time.  Ultimately, I decided to cancel the failed job and start fresh with AVG disabled, which allowed me to complete the wizard successfully.

Error (2915)
The WS-Management service cannot process the request. Object not found on the <servername> server.

Recommended Action
Ensure that the agent is installed and running. If the error persists, reboot <servername> and then try the operation again.

  • A couple of items related to cleanup after the wizard completes:
    • The Windows licensing on the newly created VM will likely need to be re-activated.
    • On my new VM, I noticed one of my drives was really low on disk space.  At some point during the conversion process a pagefile must have been created on my OS drive, whereas on the physical production server I had moved the pagefile to a different volume.  I unhid hidden folders and operating system files to locate the large pagefile and safely deleted it to free up space.  Obviously, make sure the pagefile is no longer in use before deleting it.
    • While I’m on the topic of pagefiles, for some reason one server allows me to have the pagefile on a virtual SCSI controller, yet a different VM does not, forcing me to put the pagefile on my system drive, which uses IDE.  This KB article explains why it shouldn’t work.  It’s just weird that it does work on one VM and not the other.
    • I’m not sure if I missed a step in the wizard, but if you plan on reusing the hardware from the recently virtualized server, make sure you edit the network card settings in Hyper-V so the VM uses a dynamic MAC address.  Otherwise, if you format and bring back the old hardware, there will be a duplicate MAC address on your network because the VM will be using the same static MAC address.
    • If you encounter an error with the wizard as described below, you may have issues with removing the failed VM from your Hyper-V host.  I found this excellent post which resolved the issue for me.
    • Lastly, this is more MS Exchange specific, but this VM was my hub transport server, and after pulling up the newly created VM, outgoing mail was failing due to MX / DNS lookup failures.  What I determined based on the event log is that the Hub Transport was still referencing the old physical NIC.  I was able to refresh this setting with the newly created virtual NIC by going into my MS Exchange Administrator Console, clicking Server Configuration –> Hub Transport –> Properties on the server object –> External DNS Lookups tab –> selecting the virtual NIC from the drop-down menu, clicking Apply, and then restarting the Hub Transport service.  I did likewise for Internal DNS Lookups for good measure.

Event ID: 6398 SharePoint 2010

Posted: 12th August 2011 by Seth Killey in SharePoint

Yet another error that cropped up for me after SP1.

  • The Execute method of job definition Microsoft.SharePoint.Administration.SPSqmTimerJobDefinition (ID <guid>) threw an exception. More information is included below.

    Data is Null. This method or property cannot be called on Null values.

Refer to this article.  Like the person who posted, this error was triggered for the CEIP Data Collection process, which in common terms is the Customer Experience Improvement Program.  I disabled the job.

Event 5586 after applying SP1 to SharePoint 2010

Posted: 12th August 2011 by Seth Killey in SharePoint

Some more carnage from the SP1 update.  You may get the following:

Unknown SQL Exception 2812 occurred. Additional error information from SQL Server is included below.

Could not find stored procedure ‘proc_UpdateStatisticsNVP’.

The reason for this is that there is a stored procedure called proc_UpdateStatistics which references proc_UpdateStatisticsNVP; however, SP1 fails to create the latter stored procedure in all databases.  This website provides a nice script to identify all databases missing the stored procedure.  Download script to check for missing stored procedure

Now that you know which databases are missing this stored procedure, use the link below to download a script to create it.  Just make sure to select the appropriate database from the drop-down in SQL, or add USE [databasename] followed by GO at the beginning of the script.

SharePoint 2010 SP1 installed. Really? No, not really…

Posted: 10th August 2011 by Seth Killey in SharePoint

Last weekend I installed service pack 1 for SharePoint 2010.  Everything seemed to go as planned.  My intranet website pulled up just fine so I slowly backed away from my computer with a sense of relief.  Of course this is SharePoint so naturally everything didn’t really go as planned.  I was doing my daily event viewer check and noticed since the upgrade my SharePoint 2010 Timer service was terminating unexpectedly every minute and SharePoint Foundation Search was complaining with the following:

The mount operation for the gatherer application <guid> has failed because the schema version of the search administration database is less than the minimum backwards compatibility schema version supported for this gatherer application. The database might not have been upgraded.

It turns out the .exe is really just step 1 of 2 of the upgrade: it only updates the binary files.  Why Microsoft would deploy a service pack via WSUS or standalone installation and give no indication that another step is needed is a mystery.  In any event, I found this article which explains the issue.

After running the appropriate PSCONFIG command I rechecked and verified my database had updated its schema.  Ok, phew… all is well, right?  Well, not exactly, because my timer service was still crashing and the FIMSynchronizationService (Forefront Identity Manager Synchronization Service) would not start either.  I found this article, which gives a solution that worked for me:


  1. Central Admin>System Settings>Manage Services on Server (select the server where the User Profile Service is running)
  2. Stop the User Profile Service
  3. Stop the User Profile Synchronization Service (you will be prompted that this will deprovision the service)
  4. Once the services have stopped, Start the User Profile Service again
  5. Start the User Profile Synchronization Service again (you will be prompted to enter the password for the User Profile Service’s svc account) -note1: this service can take a little while to restart; note2: if it does not restart successfully, restart the server and try again (this has worked for me)
  6. Once complete, the User Profile Service and User Profile Synchronization service should show as Started, and the 2 corresponding FIM services on the server should be running again

After this I needed to do one last IIS restart (for some reason my root web app wasn’t working even though all other apps including SharePoint ran just fine).  Now slap yourself in the face for applying a SharePoint patch when everything was working well anyway.

Up until recently, using Data Collector Sets in Performance Monitor with Windows Server 2008 R2 was a bit of a mystery to me. I understood the purpose, but wasn’t particularly motivated to learn how to use them because Resource Monitor or Task Manager gave me the basic information I needed to monitor performance in real time for a quick fix. However, I’m evaluating disk configurations for a SAN to implement Hyper-V virtualization and needed to know just how much disk speed (IOPS) I needed to support a virtualized environment without hitting an I/O bottleneck. I’ve always just followed the credo of using SAS / 15K drives for SQL or traditionally high-I/O applications and SATA / 7.2K drives for file sharing, print sharing, or low-I/O applications. Yet I couldn’t help but wonder if this advice applies more to medium and large organizations. I’ve jotted down some notes on how I’m using Performance Monitor to calculate production server IOPS and then using IOPS calculators to determine a strategy for purchasing disks in a SAN environment.

Calculating IOPS on production servers

  1. Open Performance Monitor and create a user defined data collector set
  2. Choose Create manually after giving a name for your collector set
  3. Create data logs and check Performance counter
  4. Add the following counters (taken from this article):
    \LogicalDisk\Avg. Disk Sec/Read
    \LogicalDisk\Avg. Disk Sec/Write
    \LogicalDisk\Disk Bytes/Sec
    \LogicalDisk\Disk Reads/Sec
    \LogicalDisk\Disk Writes/Sec
    \LogicalDisk\Split IO/sec
    \LogicalDisk\Disk Transfers/sec
  5. Follow the rest of the wizard
  6. Right-click on the newly created collector set, navigate to the Stop Condition tab and change the stop condition to the period of time you wish to test IOPS.
  7. You can manually run the collection process by right-clicking the collector set and clicking Start, or schedule a task by right-clicking, choosing Properties, and navigating to the Schedule tab.  Note that the Performance Counter service must NOT be disabled on the server for this to work.
  8. Once you’ve run the collection process you can view the report by going to the Reports section of Performance monitor and navigating to your user defined data collector set.  When you open the report you can view the summary with top disk by IO Rate and notate the IO/sec.  If you have multiple disks or want more detailed information you can go to the Disk section and choose Disk Breakdown.
  9. Optionally, you can export and import your custom data collector set to other servers quickly to recreate the test on all servers.  Right-click on the data collector set and choose Save Template.  On all other servers you can start the new data collector set wizard, choose create from a template, and then click the browse button to select your custom template.
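The report view surfaces IO/sec directly, but you can also crunch the raw numbers yourself. As a rough sketch (assuming you’ve converted the collected log to CSV, e.g. with relog -f csv, and that the column headers follow the usual perfmon \\SERVER\LogicalDisk(...)\Counter pattern), something like this averages a counter across all samples:

```python
import csv
import io

def average_counter(csv_text, counter_substring):
    """Average every column whose header contains counter_substring.

    Perfmon CSV exports name columns like
    '\\\\SERVER\\LogicalDisk(C:)\\Disk Transfers/sec'; the first
    column is the timestamp, which this skips.
    """
    reader = csv.reader(io.StringIO(csv_text))
    header = next(reader)
    cols = [i for i, name in enumerate(header) if counter_substring in name]
    sums = {i: 0.0 for i in cols}
    rows = 0
    for row in reader:
        for i in cols:
            sums[i] += float(row[i])
        rows += 1
    return {header[i]: sums[i] / rows for i in cols}

# Two sample rows in the shape of a perfmon CSV export (values made up).
sample = (
    '"(PDH-CSV 4.0)","\\\\SRV1\\LogicalDisk(C:)\\Disk Transfers/sec"\n'
    '"09/19/2011 10:00:00","120.5"\n'
    '"09/19/2011 10:00:15","135.5"\n'
)

print(average_counter(sample, "Disk Transfers/sec"))
```

Since Disk Transfers/sec is the sum of reads and writes per second, its average over the collection window is the IOPS figure you carry into the sizing step.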

Now that you have an understanding of how much I/O load is running on your production servers, you can match that up against an IOPS calculator to determine the disk speed and RAID configuration needed to meet demand.  I used this resource, or you can contact the SAN vendor and ask for IOPS specs.  I found this article on transactional I/O regarding Exchange helpful as well.  So what did I learn?  Although a 15K SAS drive would be great, SATA will often fit the bill for small to medium sized installations.  Add up the IO/sec for all your servers and then add a buffer (I used 25%, but each organization will have to factor in fluctuation for high-demand periods).  It feels much better to make the decision based on logic vs. “best practices” from vendors who profit from the upsell.
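The sizing arithmetic is simple enough to sketch in a few lines. This is only an illustration: the 25% buffer comes from the notes above, while the read/write split and the RAID write penalties (a common rule of thumb: 2 for RAID 1/10, 4 for RAID 5) are assumptions you’d replace with your own measurements.

```python
def required_backend_iops(server_iops, read_pct, raid_write_penalty,
                          buffer_pct=0.25):
    """Estimate backend disk IOPS needed for a set of servers.

    Each logical write costs extra backend I/Os depending on RAID level
    (rule of thumb: RAID 1/10 penalty = 2, RAID 5 penalty = 4).
    buffer_pct adds headroom for peak-demand periods.
    """
    total = sum(server_iops)
    reads = total * read_pct
    writes = total * (1.0 - read_pct)
    backend = reads + writes * raid_write_penalty
    return backend * (1.0 + buffer_pct)

# Three servers measured at 150, 90, and 60 IO/sec, 70% reads, RAID 10:
# total = 300; backend = 210 reads + 90*2 writes = 390; +25% buffer = 487.5
need = required_backend_iops([150, 90, 60], read_pct=0.7, raid_write_penalty=2)
print(round(need, 1))
```

Divide the result by a per-disk IOPS figure (from the vendor or an IOPS calculator) to get a spindle count, and the SAS-vs-SATA decision falls out of the numbers rather than the sales pitch.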

Useful apps for an IT pro: Part II

Posted: 26th April 2011 by Seth Killey in Apps

Here are some other apps I frequently use:

PsTools – Not sure how I forgot this one.  Perhaps the most frequently used toolset for desktop support.  Specifically PsExec (remotely execute any command), PsList (list all running programs; for a remote Task Manager-type view use PsList -s \\computername), and PsLoggedOn (see who’s logged on).
Process Explorer – Another Sysinternals gem… you know what, just get the whole suite.
TeamViewer – Really nice, lightweight app for remote desktop support.  No application install necessary with QuickSupport version
Anti Twin – This little guy is a real time saver when trying to identify identical files.  Especially when you have a very poorly organized folder with lots of big picture files.  Bonus points because no installation is needed.

Lansweeper – This is really a client-server, full-fledged network inventory program, but if I’m analyzing a new network it’s the first thing I install.  Great tool for identifying domain computers, hardware specs, installed software, user login history, license compliance, etc.

Useful apps for an IT pro

Posted: 18th April 2011 by Seth Killey in Apps

This kinda piggybacks off an article on good, lightweight apps found here.  Of these apps, I’ll second the following:

SpaceSniffer – Use this to track down a folder or file that is draining free space on your hard drive.  On Windows 7, I also use Windows Search with the parameter size:gigantic to quickly track down big files.
CPU-Z – This little tool proved immeasurably helpful when doing custom PC builds.  Especially when overclocking RAM or CPU this can help you identify when overclocking is overheating your system so you can determine proper settings or decide if further cooling measures are needed.
Darik’s Boot and Nuke – Easy way to destroy data on your hard drive so you can safely recycle or sell an old hard drive.
CCleaner – Every once in a while I’ll have someone drop off a PC bloated with software and mind-numbingly slow.  Now I’d rather just format the drive and start over, but more often than not the person no longer has disks for all the software they’ve paid for, and I don’t want to rely on someone remembering where all their critical information is saved and then take the fall when I blow away something important.  CCleaner is a nice compromise that more often than not provides real, noticeable speed improvement.
Prime 95 – Another useful tool for PC builders to test CPU performance
Memtest – Great way to determine if you have bad RAM

Personal favorites of my own:

Bluescreenview – I talked about this before, but great at diagnosing blue screen errors
inSSIDer – Great WiFi tool and an alternative to NetStumbler, which doesn’t work on Windows 7
Yamipod – Born out of my absolute hatred for iTunes, I use this tool to manage files on my iPod.  Admittedly a little buggy, but it sure beats forced iTunes and QuickTime updates every other day
FileZilla – My favorite FTP client and FTP server software
WinSCP – I basically use this to transfer config files from Linux servers to my Windows desktop
WinFF – Great video converter utility.  I use this in conjunction with a YouTube downloader to grab videos and then convert them into a Windows DVD Maker-compatible file format
Gparted – My preferred disk partition editor
Microsoft Security Essentials – I can’t believe I’m writing this, but my favorite free, lightweight virus scanner is produced by Microsoft.  I don’t think it’s the absolute best at detecting viruses, but I’ll give a little in that area if the product doesn’t cripple my system when running scans.  If I’m looking to remove a virus I use Malwarebytes

On a somewhat related note, I also love portable apps.  Great way to use some of your favorite apps like Firefox, 7-Zip, ClamWin, Chrome, etc without actually installing.  I like to keep my registry clean.

A handy tool for the dreaded BSOD – bluescreenview

Posted: 5th April 2011 by Seth Killey in Apps

I’ve been on Windows 7 on my network for about a year now and for the first time I had to analyze a dump file for a blue screen of death.  So yeah, Windows 7 is significantly more stable than previous versions of Windows.  In the past, this has been a fairly arduous task that involves loading the Windows debugging tools and then downloading the appropriate symbol files so you can figure out which file is causing the BSOD.  However, today I came across an excellent little tool for analyzing minidump files.  All you need to do is download BlueScreenView and run the exe file… no installation necessary.  Within a minute I was able to quickly identify the likely problematic file.