As I discussed in my first post, ESXi 5.0 AMD Whitebox Server for $500 with Passthrough (IOMMU), my ESXi home lab is a bit different from most: I’m not only looking for a lab, but also a production environment to run virtualized HTPCs (XBMC in a VM) and even virtualized PCs for the house. All hardware had to be consumer-grade, but it also had to support ESXi HA (High Availability), DRS (Distributed Resource Scheduler), FT (Fault Tolerance), and other advanced ESXi features for my home VM lab.
In my first post I gave one of my builds for a node, but I’ve also been working with another motherboard, the ASRock 970 Extreme3, that I’ve come to like as well. It has one fewer PCI-e x1 slot than the Gigabyte GA-990FXA-UD3, but I’ve had good luck grabbing it off eBay at a lower price. It also gives my ESXi home lab IOMMU (AMD’s version of passthrough), and the total build comes in a little cheaper. More importantly, I’ve found a set of hardware, including cards, that I can replicate each time, down to the passthrough, without any additional configuration or headaches. I imagine other combinations work too, but this build is reproducible for me with no additional issues. Plug and play.
This ESXi whitebox with IOMMU currently works out to around $534 ($459 without my 2U rackmount case) if you headhunt parts on eBay. The ASRock 970 Extreme3 is easy to find for ~$75, and it supports the FX-6100/FX-6200 six-core and FX-8120/FX-8150 eight-core (Zambezi) processors, as well as the FX-6300 six-core and FX-8320/FX-8350 eight-core (Vishera) processors, without a BIOS flash. It’s been plug and go with all three of the motherboards I’ve purchased. That wins points with me.
ESXi SATA Passthrough on the ASRock 970 Extreme3
One nice quirk I’ve found with the ASRock 970 Extreme3 is that when you pass through the onboard SATA controller, only 4 of the 5 ports on the board pass through. The 5th SATA port remains available to the ESXi host for use as a local datastore. For me this is a boon rather than a drawback, since VMs with passthrough hardware attached aren’t eligible for vMotion anyway. So why not store them locally and have ESXi boot off the hard drive? I also have an iSCSI target available to the cluster.
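If you want to see (or override) what ESXi considers eligible for passthrough, the host reads overrides from /etc/vmware/passthru.map. Here’s a minimal sketch; the 1002:4390 vendor/device ID below is an assumption (a common ID for AMD SB7x0/SB9x0-family SATA controllers), so verify your actual IDs first with `esxcli hardware pci list` or `lspci -v` from the ESXi shell:

```
# /etc/vmware/passthru.map
# Format: vendor-id  device-id  resetMethod  fptShareable
# NOTE: 1002:4390 is an example ID -- substitute what your own host reports.
1002  4390  d3d0  false
```

After editing the file, reboot the host and toggle the device under Configuration > Advanced Settings (DirectPath I/O) in the vSphere Client.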
ESXi Video Card Passthrough on the ASRock 970 Extreme3
Video card passthrough in ESXi can be tricky, but there are some best practices I’ve come up with for anyone looking to do it. First of all, your VM’s RAM *must* be reserved if you’re passing through a video card. This is a non-arguable point: ESXi routinely moves and re-assigns guest memory, and the video card will die the first time that happens. Reserve your RAM, and you’ll have no stability issues. See my post about ESXi Video Card Passthrough (coming soon) for more information and configuration thoughts.
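For reference, that reservation ends up in the VM’s .vmx file. A minimal sketch of the relevant entries for a hypothetical 4GB VM with one passthrough device (the sizes are examples, and ESXi fills in the actual device address details when you add the card through the client):

```
# Hypothetical .vmx fragment -- sizes are examples only
memSize = "4096"
sched.mem.min = "4096"         # reserve all guest RAM up front
sched.mem.pin = "TRUE"         # pin the pages so ESXi never relocates them
pciPassthru0.present = "TRUE"  # the passthrough device itself
```

You can set the same reservation from the vSphere Client under the VM’s Resources tab (Memory > Reservation) rather than editing the file by hand.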
ESXi AMD Whitebox with IOMMU Parts List
Motherboard: ASRock 970 Extreme3 — $75
CPU: AMD FX-8120 Zambezi 3.1GHz Socket AM3+ 125W Eight-Core — $120
RAM: 32GB (4x8GB) DDR3-1333 — $120
Video Card: HIS Radeon HD6670 PCI-e x16 Low-Profile — $90
ESXi Host Video Card: ATI Rage XL Pro 8MB PCI — $8
Case: iStar D Value D-213-MATX 2U Rackmount – $75
Power Supply: Logisys PS550E12BK 550W Power Supply — $25
NICs: 2xPCI-e GB, 1xPCI GB — $21 ($7 each)
Optional SATA Controller Card: LSI SAS3041E 4-Port SAS/SATA PCI-e x4 — $25
ESXi Whitebox Total Cost without Case: $459
ESXi Whitebox Total Cost w/Case: $534 ($559 w/SATA Card)
Slot Setup for the ESXi AMD Whitebox
PCI-e x16: Radeon HD6670 (Passthrough to VM)
PCI-e x4 : LSI SAS3041E 4-Port SAS/SATA PCI-e x4 (Passthrough to VM)
PCI-e x1 : 5-Port PCI-e USB Card (for Passthrough)
PCI-e x1 : GB NIC (RealTek 8168, used by ESXi host)
PCI : Intel Pro/1000 MT Dual Gigabit PCI-X NIC
PCI : ATI Rage XL Pro 8MB PCI Video Card (Console Video)
Notes: All of this came off eBay, shipping included. Although I had to wait on some deals, I replicated this build down to the dollar on two ESXi hosts, and the parts are plentiful. The case, of course, is totally up to you, and this could be done cheaper with a mid-tower box. I’m rack-mounting, so I wanted a solid case.
April 2013 Note: As of the time of this note, RAM has almost doubled in price from when I wrote this article, and most 32GB (4x8GB) kits are running $175-$225. It may be smarter at this point to purchase two 8GB sticks, and expand out later when the price for RAM decreases.
Also, note that the SATA controller in the PCI-e x4 slot is completely optional. This particular node sits in a 4U Logisys CS4802 case with 8 SATA drives, 7 of which are passed through to a VM (remember that one of the SATA ports on the ASRock 970 Extreme3 stays with the ESXi host). That VM serves as a domain controller and NAS for the house, using FlexRAID across the SATA drives to present an 11TB single-volume RAID that stores the house’s media, documents, photos, and roaming profiles. This setup has also worked perfectly and was plug-and-play. I have also used a video card in this slot and was able to pass it through to a VM without issue.
To fit the extra hard drives, I used a Cooler Master STB-3T4-E3-GP 4-in-3 module, which turns three external 5.25″ bays into four bays for 3.5″ hard drives. It has worked perfectly, has a 120mm fan built in to keep the drives cool, and has given me no issues to date.
Finally, I want four physical NICs on every node to properly segregate traffic for my VM lab. I *could* use a dual GB NIC card, but those take up at least a PCI-e x4 slot, and I’d lose my SATA controller. On my second node of this type I didn’t need the SATA card, so that slot is free for another video card, or possibly a dual GB NIC card, freeing up one of the PCI-e x1 slots for a USB card to pass through. You can also run an Intel Pro/1000 PCI-X dual GB NIC in a regular PCI slot, and these can be found on eBay for <$10. See my article Intel Pro/1000 Dual Gigabit NIC PCI-X Card in PCI Slot for more information and pictures.
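As a sketch of how four NICs get split up, here are the ESXi 5.x esxcli commands to build a separate standard vSwitch per traffic type. The vmnic numbers and portgroup names are my own convention, not anything ESXi mandates, so adjust to taste:

```
# Management stays on the default vSwitch0/vmnic0.
# Dedicate a vSwitch (and its own physical NIC) to vMotion, iSCSI, and VM traffic.
esxcli network vswitch standard add --vswitch-name=vSwitch1
esxcli network vswitch standard uplink add --uplink-name=vmnic1 --vswitch-name=vSwitch1
esxcli network vswitch standard portgroup add --portgroup-name=vMotion --vswitch-name=vSwitch1

esxcli network vswitch standard add --vswitch-name=vSwitch2
esxcli network vswitch standard uplink add --uplink-name=vmnic2 --vswitch-name=vSwitch2
esxcli network vswitch standard portgroup add --portgroup-name=iSCSI --vswitch-name=vSwitch2

esxcli network vswitch standard add --vswitch-name=vSwitch3
esxcli network vswitch standard uplink add --uplink-name=vmnic3 --vswitch-name=vSwitch3
esxcli network vswitch standard portgroup add --portgroup-name=VMTraffic --vswitch-name=vSwitch3
```

The same layout can be built in the vSphere Client under Configuration > Networking; the CLI is just faster to replicate across identical nodes.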
The ATI Rage card stays assigned to the ESXi host as its video card. Once the node boots, you lose console video output if you don’t have a card devoted specifically to the host, so I keep this cheap $8 card in for ESXi host video.
ESXi AMD Whitebox Screenshots
Here are some screenshots from this build to give you an inside view of what’s going on. Some of these are of FlexRAID, so you can see the passthrough drives mounted in a RAID and presented as a single drive. This really helps with the XBMC shares that I present out to my HTPCs.