Building a vSphere home lab is, in my opinion, essential. It is quite a popular subject and there was even a VMworld session dedicated to it:
My reasons for having a home lab are the following:
- Run my home IT infrastructure (firewall, DHCP, Active Directory, Mail server, VoIP, …)
- Try and learn new products
- Get hands on experience needed for certifications
I tried many approaches (VMware Server, VMware Workstation, a Xeon server, …). My goal was to build a fairly powerful lab that would fulfill the reasons stated above, but at the same time be cheap, have low power requirements, not be noisy, and not make too much heat. Here is my current config:
I have two servers. One acts as a ‘unified’ storage server (NAS and iSCSI SAN) and one as the host for all other workloads.
To try advanced virtualization features, shared storage is a must. I needed a huge fileserver even before I started with virtualization to store my DVD collection, so I built a multi-terabyte Linux-based (Debian) fileserver with software RAID, which later became an Openfiler NAS with the iSCSI protocol. However, to learn VMware Site Recovery Manager I needed storage that could replicate and was compatible with SRM. The best low-cost choice is a VSA (Virtual Storage Appliance). VSAs come in OVF format and need either VMware Workstation or an ESX server, so I installed ESXi on my storage server. I virtualized the multi-terabyte fileserver with RDM disks, and next to it I run the HP StorageWorks (LeftHand) VSA, the FalconStor VSA, or the EMC Celerra VSA. For example, for SRM tests I had two copies of the LeftHand VSA replicating to each other.
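On the ESXi side, pointing the software iSCSI initiator at a VSA (or at Openfiler) takes only a few commands. A minimal sketch using the current esxcli syntax follows; the adapter name `vmhba33` and the portal IP are placeholders, not my actual config:

```shell
# Enable the software iSCSI initiator on the host
esxcli iscsi software set --enabled=true

# Point dynamic (SendTargets) discovery at the VSA's iSCSI portal
# (replace the address with your VSA's IP)
esxcli iscsi adapter discovery sendtarget add \
    --adapter=vmhba33 --address=192.168.10.20:3260

# Rescan the adapter so the newly presented LUNs show up
esxcli storage core adapter rescan --adapter=vmhba33
```

After the rescan, the LUNs appear under the host's storage adapters and can be formatted as VMFS datastores.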
The server was built to fit as many hard drives as possible. I use a Chieftec Smart case with 9 external 5.25″ drive bays that are multiplied with 3-in-2 or 4-in-3 disk backplanes. The motherboard is an ASUS P5B-VM DO with 7 SATA ports; I added 4 more with two PCI SATA controllers (Kouwell). I experimented with a hardware RAID card (Dell PERC 5/i), but it made too much heat and local Raw Device Mapping did not work. The OS boots from a USB flash disk, so I can put 11 drives in this server. Currently I have five 1 TB drives (low-power green RE3 Western Digitals) for the NAS in RAID 5 and two 500 GB drives for the VSAs. An Intel Core 2 Duo E6400 CPU with 3 GB RAM is more than enough to run 3 VMs at the moment (the RedHat-based NAS fileserver, FalconStor, and LeftHand). One onboard Intel NIC is coupled with an Intel PRO/1000 PT dual-port PCIe adapter bought cheaply off eBay. One NIC is used for management and fileserver traffic (with VLAN separation), and two NICs are teamed for iSCSI traffic.
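For reference, a five-drive software RAID 5 array like the one above can be created on Debian with mdadm roughly as follows. The device names and mount point are illustrative, not my actual layout; double-check with `lsblk` before running anything, because `mdadm --create` is destructive:

```shell
# Build a RAID 5 array from five whole disks (example device names!)
mdadm --create /dev/md0 --level=5 --raid-devices=5 \
    /dev/sdb /dev/sdc /dev/sdd /dev/sde /dev/sdf

# Put a filesystem on it and mount it (example mount point)
mkfs.ext4 /dev/md0
mount /dev/md0 /srv/nas

# Persist the array definition so it assembles at boot,
# and watch the initial resync progress
mdadm --detail --scan >> /etc/mdadm/mdadm.conf
cat /proc/mdstat
```

The initial resync of five 1 TB drives takes several hours, but the array is usable (slowly) while it runs.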
The purpose of the ‘workload server’ is to run all the VMs needed for infrastructure and testing purposes. From experience I have found that consolidation of non-production servers is usually limited by the available memory, so I was looking for the most cost-effective way to build a server with as much memory as possible. In the end I settled on the ASUS P5Q-VM, which has 4 DIMM slots and supports up to 16 GB of DDR2 RAM. I bought it cheaply off the local eBay equivalent, added an Intel Core 2 Duo E8400 3 GHz processor (also bought used) and 12 GB of RAM (a brand-new 4 GB DIMM costs around 110 EUR). The onboard Realtek NIC is not ESX compatible, so I added an Intel PRO/1000 PT dual-port PCIe adapter plus Intel PRO PCIe and PCI adapters to get 4 network interfaces. The server is diskless, boots from a USB flash disk, and is very quiet, housed in a small micro-ATX case.
At the moment I run my infrastructure VMs (two firewalls – Endian and Vyatta – a Domain Controller, Mail server, vCenter, vMA, TrixBox, and Win XP) and vCloud Director test VMs (vCloud Director, Oracle DB, vShield Manager) without memory ballooning, and CPU usage stays below 25%. Heat and power consumption are acceptable, although the noise of the 7 hard drives, and notably the backplane fans, is substantial. In the future I may go the Iomega StorCenter direction to replace the fileserving functions, as it is very quiet and power efficient, but it will most likely not replace the flexibility of a VSA.
3 thoughts on “My vSphere Home Lab”
Did you build an ESXi 7 homelab? I would like to know if your PCI SATA controllers (Kouwell) run under ESXi 7.
I do not use that hardware any more. I now have 4 Shuttle PCs in my lab and they run vSphere 7.