How to create a Windows Server 2008 Cluster within Hyper-V using simulated iSCSI storage
[Updated May’09: Windows Storage Server 2008 is now available to MSDN/TechNet subscribers. Check out Jose Barreto's Blog for details.]
Familiar with Virtual Server 2005 and shared disks for creating virtual clusters? Well, it's different with Hyper-V. The shared disk option is no longer available (which I did not know when I started testing). You have to use iSCSI instead. Here is a step-by-step method for creating a fail-over cluster within Hyper-V. It's a cheap way of setting up a test lab (assuming you don't have access to Windows Storage Server). In this post I use StarWind to simulate iSCSI storage … it's not an endorsement of the product, I just picked it from amongst the crowd.
Windows Server 2008 fail-over clusters support Serial Attached SCSI (SAS), iSCSI and Fibre Channel disks as storage options. So, how would you go about setting up a virtual Windows Server 2008 test cluster using the new Hyper-V virtualisation product? The method I am about to outline is a little different from what you might be used to with Virtual Server 2005. The following steps detail how I managed to set up a test cluster using simulated iSCSI storage. Before beginning it's worth reviewing this article that outlines the storage options that are available to Hyper-V. By the end of this post you should have a simple two node cluster up and running using simulated iSCSI storage.
Tools for the job:
- A Windows Server 2008 x64 server with the Hyper-V role enabled (I used a Dell Precision 390)
- One Windows Server 2008 VM to act as a Domain Controller (Clusters must be part of a domain)
- Two Windows Server 2008 VMs to act as Cluster Nodes
- One Windows Server 2003 SP2 VM to act as the iSCSI target (or you could use Windows Server 2008 in a Core install to maximise VM performance)
- iSCSI Target Software: I used the StarWind product that is available as a 30 day eval. Windows Storage Server is now available to MSDN/TechNet subscribers.
- iSCSI Initiator software (built into Windows Server 2008)
I won't go into how to create a VM, but you can find more info on the Virtual Guys weblog.
Before I began looking into the iSCSI simulated storage option for my cluster nodes I tried to expose a single VHD to each of my cluster nodes in the hopes that they would share it. I didn’t get very far and was presented with the following error when powering on the VMs:
This error is by design (thanks Justin Zarb for pointing this out) as Windows Server 2008 Hyper-V does not support this sort of storage (see link above for Hyper-V storage options). The above error is simply a file system error as the VHD "is being used by another process" … should have spotted that.
SETTING UP THE LAB
Note: I’m assuming that you know how to install Windows Server 2003 and 2008. I’m also assuming that you know how to install and configure a Windows Server 2008 Domain Controller. If you have any questions leave me a comment and I will see if I can point you in the right direction.
VIRTUAL NETWORK
Create the network with a connection type of “Internal Only”. I enabled Virtual LAN identification and set the default ID to 2 as this will be my public LAN. Setting the default to 2 means that if I don't specify a VLAN on subsequent NICs they will be classified as public connections.
VLAN ids:
- VLAN 2: Public 10.1.1.x/24
- VLAN 3: Heartbeat 192.168.1.x/24
- VLAN 4: iSCSI 192.168.2.x/24
SERVER SETUP
Tip: Be sure to rename each network card on the hosts to make identification easier. If it's the public NIC, call it Public, and so on.
Domain Controller: dc01
- Windows Server 2008 x32
- One VHD IDE fixed size disk 10GB
- 1 x NIC connected to my Virtual Network in VLAN 2
Network settings:
- IP Addr: 10.1.1.10
- Mask: 255.255.255.0
- Gateway: I didn’t bother setting one
- DNS: 10.1.1.10
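The static addressing above can also be applied from the command line rather than through the GUI. Here is a sketch for the Domain Controller, assuming the NIC has been renamed "Public" as per the tip earlier; substitute the matching addresses on each other VM.

```shell
rem Set the static IP and mask on the renamed "Public" NIC (no gateway needed in this lab)
netsh interface ip set address name="Public" static 10.1.1.10 255.255.255.0

rem Point DNS at the DC itself
netsh interface ip set dns name="Public" static 10.1.1.10
```

The same two commands, with the IPs swapped in, cover the node and target NICs below.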
Cluster Nodes:
- Windows Server 2008 x32
- 1 x VHD IDE fixed size disk 10GB
- 3 x NICs connected to my Virtual Network in the following VLANs
- Public card: VLAN 2
- Heartbeat card: VLAN 3
- iSCSI: VLAN 4
Node01
Public NIC: VLAN 2
- IP Addr: 10.1.1.20
- Mask: 255.255.255.0
- Gateway: I didn’t bother setting one
- DNS: 10.1.1.10
Heartbeat NIC: VLAN 3
- IP Addr: 192.168.1.4
- Mask: 255.255.255.0
iSCSI NIC: VLAN 4
- IP Addr: 192.168.2.4
- Mask: 255.255.255.0
Note: On all NICs in VLAN 3/4 be sure to disable the Client for Microsoft Networks, disable DNS registration and disable NetBIOS. Be sure to check your binding order too. The public NIC should be first.
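The NetBIOS and DNS registration tweaks in the note above can be scripted with wmic rather than clicked through on each node. A sketch, where the Index value of 3 is an assumption — list your adapters first and use the Index that matches your heartbeat or iSCSI NIC:

```shell
rem Find the Index of each NIC (the Index=3 below is a placeholder)
wmic nicconfig get Index,Description,IPAddress

rem Disable NetBIOS over TCP/IP on that NIC (2 = disabled)
wmic nicconfig where Index=3 call SetTcpipNetbios 2

rem Disable dynamic DNS registration on the same NIC
wmic nicconfig where Index=3 call SetDynamicDNSRegistration FALSE
```

Disabling Client for Microsoft Networks and fixing the binding order are still easiest done in the Network Connections GUI (Advanced > Advanced Settings).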
Node02
Public NIC: VLAN 2
- IP Addr: 10.1.1.21
- Mask: 255.255.255.0
- Gateway: I didn’t bother setting one
- DNS: 10.1.1.10
Heartbeat NIC: VLAN 3
- IP Addr: 192.168.1.5
- Mask: 255.255.255.0
iSCSI NIC: VLAN 4
- IP Addr: 192.168.2.5
- Mask: 255.255.255.0
Note: On all NICs in VLAN 3/4 be sure to disable the Client for Microsoft Networks, disable DNS registration and disable NetBIOS. Be sure to check your binding order too.
iSCSI Target
- Windows Server 2003 SP2 x32 (see here for notes on W2K3 hosts in Hyper-V)
- 1 x VHD IDE fixed sized disk 10GB
- 2 x VHD SCSI fixed sized disks 1GB and 10GB for Cluster disks
- StarWind iSCSI Target Software
- 2 x NICs connected to my Virtual Network in the following VLANs:
- Public : VLAN 2
- iSCSI : VLAN 4
Public NIC: VLAN 2
- IP Addr: 10.1.1.22
- Mask: 255.255.255.0
- Gateway: I didn’t bother setting one
- DNS: 10.1.1.10
iSCSI NIC: VLAN 4
- IP Addr: 192.168.2.2
- Mask: 255.255.255.0
Note: On the iSCSI NIC in VLAN 4, be sure to disable the Client for Microsoft Networks, disable DNS registration and disable NetBIOS. Be sure to check your binding order too. Make sure you format and assign drive letters to the SCSI VHDs on this VM.
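Before moving on, it's worth confirming that the VLANs actually pass traffic — a common stumbling block is a NIC landing in the wrong VLAN. A quick check from Node01 (IPs are the ones assigned above):

```shell
rem From Node01: verify the iSCSI VLAN (VLAN 4) path to the target VM
ping 192.168.2.2

rem From Node01: verify the heartbeat VLAN (VLAN 3) path to Node02
ping 192.168.1.5
```

If either ping fails, double-check the VLAN ID set on each virtual NIC in the VM settings and any host firewall rules before touching the iSCSI software.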
Setting up the Cluster
Update 17/10/2008: I've also found that using the Image Files option works quite well too. Image files will allow you to pack more than one VM onto a disk partition. Check out https://www.starwindsoftware.com/images/content/StarWind_MSCluster2008.pdf for more info.
Note: Check out how to do the same with Windows Storage Server 2003 R2. https://www.microsoft.com/windowsserversystem/wss2003/productinformation/overview/default.mspx
Update May 09: Windows Storage Server 2008 has now RTM’d and is available online through MSDN and TechNet. https://www.microsoft.com/windowsserver2008/en/us/WSS08.aspx
Configuring the iSCSI target software (Starwind)
- Install the StarWind software on your iSCSI target VM.
- Launch the StarWind management console.
- Under Connections you should see localhost:3260. Right-click localhost and select Connect. If I remember correctly, the first username and password you enter become the default (which you can change later).
- Right-click localhost:3260 and select Add Device.
- Select Disk Bridge Device as the device type and click Next.
- Select the first SCSI disk from the list (more than likely \\.\PhysicalDrive1).
- Select Asynchronous Mode, tick Allow multiple iSCSI connections (clustering), and click Next.
- Give the disk a friendly name.
- Repeat the steps to add the second disk.
Adding disks to the cluster nodes
Each cluster node now needs to be connected to the iSCSI target. Launch the built in iSCSI initiator and follow the steps below:
- If prompted to unblock the Microsoft iSCSI service, always click Yes, otherwise port 3260 will be blocked.
- Click on the Discovery tab and select Add Portal.
- Enter the IP address for the iSCSI target [192.168.2.2]
- Click the Targets tab and you should now see a list of the disks available on the target
- For each disk in the list click Log on and select Automatically restore this connection
- Click on the Volumes and Devices tab and select AutoConfigure. Your disks should now appear as Devices.
- Reboot each cluster node as you add the disks.
- Disks will be offline when you reboot. Ensure that you bring them online in Disk Management.
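The initiator steps above can also be driven from the command line with the built-in iscsicli tool, which is handy if you are repeating this on both nodes. A sketch — the IQN shown is a placeholder; use the target names that ListTargets actually reports from StarWind:

```shell
rem Quick-add the target portal (the iSCSI NIC of the target VM)
iscsicli QAddTargetPortal 192.168.2.2

rem List the targets the portal exposes and note their IQNs
iscsicli ListTargets

rem Log on to a target by IQN (placeholder IQN - substitute one from the list)
iscsicli QLoginTarget iqn.2008-08.com.example:disk1
```

Bringing the disks online afterwards is still done in Disk Management on each node, as noted above.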
When completed (and hosts connected) you should see something like this on the iSCSI target VM.
Installing the Cluster
The new fail-over cluster wizard is quite straightforward and much easier to follow than its Windows Server 2003 counterpart. There isn't much point in going into too much detail … you'll find plenty of info on the web.
Here is a step by step guide to installing a two node file cluster in Windows Server 2008.
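If you prefer the command line to the wizard, the feature install and cluster creation can be sketched as follows. The cluster name and IP address are assumptions for this lab; pick whatever fits your public 10.1.1.x range:

```shell
rem On each node: install the Failover Clustering feature
servermanagercmd -install Failover-Clustering

rem From one node: create the two-node cluster (name and IP are placeholders)
cluster /cluster:testclu01 /create /nodes:"node01 node02" /ipaddr:10.1.1.30/255.255.255.0
```

Run the wizard's validation step (or review the validation report) afterwards — a cluster built on unvalidated storage is unsupported in production, though that hardly matters for a test lab.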
Comments
Anonymous
January 01, 2003
If you are an MSDN or TechNet Plus subscriber, you can now get Windows Storage Server 2008 and the Microsoft iSCSI Software Target 3.2: http://blogs.technet.com/josebda/archive/2009/05/16/windows-storage-server-2008-with-the-microsoft-iscsi-software-target-3-2-available-to-msdn-and-technet-plus-subscribers.aspx
Anonymous
January 01, 2003
Great article! Though I wonder: since the iSCSI service runs on only one server, we introduce a single point of failure. If this server requires maintenance or a reboot then we have a problem. Any ideas for this problem?
Anonymous
January 01, 2003
Thanks for such a good article! Based on it I made my own fail-over cluster with iSCSI. I also tried StarWind, but I used the free version. Nice thing.
Anonymous
January 01, 2003
Here is an article that gives you detailed step-by-step instructions on configuring Windows Server 2008 or Windows Server 2008 R2 failover clusters: http://www.kernsafe.com/article_product.aspx?id=5&&aid=41 or PDF http://www.kernsafe.com/tech/iStorage-Server/iStorage-Server-iSCSI-SAN-for-Windows-Clustering.pdf
Anonymous
January 01, 2003
Not sure what I'm missing, but I cannot get the NICs on the 192.168.2.x LAN to talk to each other. I disabled the firewalls to make sure everything is open. Do you have any suggestions as to how I can fix this?