Yesterday, we decided to bring all three of our XenServer 6.1 servers up to the latest hotfix level, applying every hotfix currently available.
Everything was done according to the provided upgrade path: HA was disabled, and we started with the XenServer pool master.
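For reference, disabling HA and rolling a hotfix out to the pool can be done from the CLI roughly like this (a sketch, not our exact session; the hotfix filename and the UUID are placeholders):

```shell
# Disable HA for the pool before patching (run on the pool master)
xe pool-ha-disable

# Upload a hotfix file to the pool; the command prints the patch UUID
xe patch-upload file-name=XS61E010.xsupdate

# Apply the uploaded hotfix to every host in the pool
xe patch-pool-apply uuid=<patch-uuid>
```

Each host still needs its reboot afterwards, master first.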
Now, everything looked to be going okay, but after the servers had all rebooted they refused to leave maintenance mode.
We tried literally every safe command we could find, still no go: XenCenter refused to connect to the pool, but we were able to SSH our way in!
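For context, the usual way to take a host out of maintenance mode over SSH looks like this (UUIDs are placeholders); none of it helped in our case:

```shell
# List the hosts in the pool to find the UUID
xe host-list

# Try to take the host out of maintenance mode
xe host-enable uuid=<host-uuid>

# Restart the management toolstack if XenCenter still cannot connect
xe-toolstack-restart
```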
After a GoToMeeting session with Citrix support, we decided to reinstall the entire XenServer pool. The issue seems to be related to hotfixes XS61E009 and XS61E010.
The new install is XenServer 6.1 with all hotfixes applied per server before joining them to a new XenServer pool.
So, we started to attach the VMs from the storage, and when we tried to boot them up we ended up with a bluescreen: STOP error 0x0000007B.
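Reattaching the VMs from the old storage goes roughly like this (a sketch assuming an iSCSI SR; all UUIDs and device-config values are placeholders for your own environment):

```shell
# Re-introduce the existing SR by its UUID; this preserves the VDIs on it
xe sr-introduce uuid=<sr-uuid> type=lvmoiscsi name-label="Shared Storage" shared=true

# Create and plug a PBD on each host so it can access the SR
xe pbd-create sr-uuid=<sr-uuid> host-uuid=<host-uuid> \
    device-config:target=<iscsi-ip> device-config:targetIQN=<iqn> device-config:SCSIid=<scsi-id>
xe pbd-plug uuid=<pbd-uuid>

# Attach a VDI from the SR to a newly created VM as its boot disk
xe vbd-create vm-uuid=<vm-uuid> vdi-uuid=<vdi-uuid> device=0 bootable=true type=Disk
```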
After doing some digging around, I found out that in some cases you need to do the following:
Blue Screen on Booting a Windows VM
If the VM that gets the bluescreen has multiple disks attached, you need to initialize the extra disks with Disk Management in Windows Server.
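If the reattached data disks show up as offline or read-only once Windows boots, the same can be done from an elevated prompt with diskpart instead of the Disk Management GUI (the disk number is an example; check with `list disk` first):

```shell
diskpart
DISKPART> list disk
DISKPART> select disk 1
DISKPART> online disk
DISKPART> attributes disk clear readonly
```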
PS. This article may be updated soon, as we are not yet sure we have encountered everything needed to restore our environment 100%.