I won't go into detail about each component here since that was covered in the previous version I made. Most of the components are the same, so the new additions are the interesting part:
- D-Link DGS-1216T. 16-port L2 Ethernet switch, the backbone of my network. 16 gigabit ports, fully managed, used to separate my external network from my internal one using VLAN segmentation. This switch is really old (15+ years) but it works great.
- 32-port KVM switch. This is an old Raritan KVM that uses Ethernet cables to connect to dongles attached to each PC. It has an old Java interface that allows you to access each computer remotely, but due to its outdated Java version and security issues I only use it locally. It works great.
- External 4-bay drive enclosure. Connected via an older SAS SFF-8470 to SFF-8088 cable to one of the SAS controllers in my primary storage enclosure (4). I use this for personal backups.
- Primary storage enclosure. This is the main hard drive storage unit, running FreeNAS and sharing all the drives with the primary server over the iSCSI protocol.
- Intel Xeon E5-2609 running at 2.40GHz (4c/4t).
- ASUS Z9PE-D16 server motherboard supporting dual CPUs and ECC DDR3 RAM (one CPU occupied, half of the memory and PCI-express slots populated).
- 32GB ECC DDR3 RAM.
- 4x 1Gbit network interfaces (onboard, not used).
- 1x Mellanox ConnectX-2 10Gbit network adapter (connected to primary server).
- 1x 16-port LSI SAS-9300-16i SAS HBA controller connected to the hard drive backplane using mini-SAS cables.
- 1x 16-port LSI SAS-9300-16e SAS HBA controller connected to the external 4-bay enclosure (3).
- 16x 3.5" SAS/SATA storage bays for storage expansion.
- Secondary storage enclosure. This enclosure will hold all the backups of the primary enclosure. It's not fully installed yet; the idea is to either fit it with a full server motherboard setup like the primary enclosure, or simply connect its bays to the primary enclosure through external SAS connectors. I also have an extra 10Gbit NIC I can use to double the link speed to 20Gbit if needed.
- Primary server. The center of the entire server system.
- Intel Xeon E5-1650 v4 @ 3.60GHz (6c/12t).
- SuperMicro X10SRH-CLN4F server motherboard.
- 128GB DDR4 ECC RAM.
- 4x 1Gbit network interfaces.
- 2x Mellanox ConnectX-3 10Gbit QDR network adapters (can optionally run at 40Gbit if needed), connected to the primary storage enclosure. Only one adapter is in use at the moment; the second may serve the secondary storage enclosure.
- 3x LSI SAS-9300-4i SAS HBA controllers (one onboard the motherboard, two as PCI-express adapters) feeding the enclosure's hard drive modules:
- 24x 2.5" SAS/SATA storage bays for hard drive expansion. These bays hold the operating system, virtual machine images, cache disks and other data. I only use 6 bays at the moment, all equipped with fast Samsung 850 Pro SSDs (250GB and 500GB models). Plenty of room for future expansion.
- Testing server. HP ProLiant DL360 G8 1U server. This was the old primary server, now demoted to lab server and emergency standby in case the primary server breaks down. At the moment it doesn't run anything, as I haven't yet had the need to configure it.
- 2x Intel Xeon E5-2630 3.2GHz CPUs (6c/12t).
- 288GB of ECC DDR3 RAM.
- 8x 2.5" SAS/SATA storage bays.
- Two power supplies that work in tandem to provide stable power.
- 4x 1000Mbit built-in network interfaces with the option to upgrade to 2x 10Gbit.
All in all I spent around 2000 euros on the primary server, the primary storage enclosure and four 10Gbit Mellanox network cards. I got bits and pieces from work and other sources as well to complete it all. What remains now is getting the backup enclosure up and running, and I also plan on pulling a 10Gbit fibre cable from my server to my workstation, since I always max out my transfers to the server (exactly 112MB/s due to the gigabit limitation) and need to remove that bottleneck.
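For the curious, that 112MB/s figure falls straight out of the gigabit math. Here's a back-of-the-envelope sketch (my own rough estimate, assuming a standard 1500-byte MTU and plain TCP/IPv4, no jumbo frames):

```python
# Rough estimate of the usable throughput on a 1 Gbit Ethernet link.
LINK_BPS = 1_000_000_000          # 1 Gbit/s raw line rate

# Per-frame on-wire cost for a full-size frame:
#   preamble+SFD (8) + Ethernet header (14) + FCS (4) + inter-frame gap (12)
WIRE_OVERHEAD = 8 + 14 + 4 + 12   # 38 bytes
MTU = 1500                        # IP packet size
TCP_IP_HEADERS = 20 + 20          # IPv4 + TCP headers, no options
PAYLOAD = MTU - TCP_IP_HEADERS    # 1460 bytes of actual data per frame

efficiency = PAYLOAD / (MTU + WIRE_OVERHEAD)
goodput_mb_s = LINK_BPS / 8 * efficiency / 1_000_000

print(f"efficiency: {efficiency:.1%}")                  # ~94.9%
print(f"theoretical goodput: {goodput_mb_s:.0f} MB/s")  # ~119 MB/s
```

The file-sharing protocol on top (SMB in my case) eats a bit more, which lands you right around the 112MB/s I see in practice.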
I have not yet maxed out the 10Gbit connection to the storage drives, since even with SSDs you peak at around 4Gbit, so I have plenty of room for expansion. General disk I/O is of course slower over iSCSI than over a directly attached SATA controller, but since I don't push that much random data it doesn't matter for me.
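That 4Gbit peak is simply a single SATA SSD hitting its sequential limit. A quick sketch of the headroom (the 550MB/s figure is an assumed typical sequential read speed for a SATA III SSD like the 850 Pro, not a measurement from my setup):

```python
# Headroom estimate: one SATA SSD versus a 10 Gbit link.
ssd_mb_s = 550                   # assumed sequential read speed (SATA III limit)
ssd_gbit = ssd_mb_s * 8 / 1000   # convert MB/s to Gbit/s

link_gbit = 10
headroom = link_gbit / ssd_gbit  # how many such SSDs the link could feed at once

print(f"one SSD ~ {ssd_gbit:.1f} Gbit/s")                       # ~4.4 Gbit/s
print(f"10 Gbit fits ~ {headroom:.1f} SSDs at full speed")      # ~2.3 SSDs
```

So the link only saturates once several drives stream sequentially at the same time, which matches why I still have room to grow.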
If anyone has experience with iSCSI and knows good tips and tricks to improve performance on Windows, let me know.
Now time for some very important credits!
- Again a very big thanks to BA member dw5304 for all the tips, setup ideas and licenses for the server setup.
- To all the BetaArchive members helping this community grow, and also making sure that I need to run to the store for more and larger hard drives.