Crazy Build with Dual E5-2699v4

IonutZ

Hey guys,

First post here - would love some advice! I'm looking to build a workstation on an Asus Z10PE-D8 with 2x E5-2699 v4. Looking to put in about 128 GB of DDR4-2133 ECC, booting off a Samsung 960 Pro in the M.2 slot. Also going with 2x GTX 1080s in SLI.

My primary concern is that the CPU I'm going with has a lower clock speed than your average i7. I'm upgrading from a 3930K. I'm curious whether I can leverage the crazy processing power that 2x 2699s would give me through virtualization. Is there a hypervisor that would let me pass the graphics cards through to a guest OS while supplying the guest OS with a virtual CPU that combines 8 cores, for example? If I can get Windows to think I'm running a quad-core setup at 16 GHz per core, I'm sure that's better than running 32 cores at 2 GHz per core.

Any advice would be welcome :)

Cheers!
 
Hate to break it to you, but this is not how CPU cores and application threads work. It really IS 32 cores @ 2 GHz per core, and it WILL be slower than your 3930K in lightly threaded applications, where the raw clock speed advantage of the 3930K beats the Xeons. There's no way around this and no way to 'hack' the cores into adding up.

If this is for gaming (which I assume it is because of the 1080 SLI), you might as well look elsewhere, like a highly clocked 6700K or the upcoming 7700K.

Hope this helps.
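
To put rough numbers on it, here's a quick Amdahl's-law sketch in Python. The clock speeds and parallel fractions are illustrative guesses, not benchmarks:

Code:
def runtime(clock_ghz, cores, parallel_fraction):
    # Relative runtime: the serial part runs on one core, the parallel
    # part splits perfectly across all cores (best case for the Xeons).
    serial = (1 - parallel_fraction) / clock_ghz
    parallel = parallel_fraction / (clock_ghz * cores)
    return serial + parallel

for p in (0.2, 0.5, 0.99):  # how parallel the workload is
    t_3930k = runtime(3.8, 6, p)   # ~3.8 GHz turbo, 6 cores (guess)
    t_xeon = runtime(2.8, 22, p)   # ~2.8 GHz all-core, 22 cores (guess)
    winner = "3930K" if t_3930k < t_xeon else "2699 v4"
    print(f"{p:.0%} parallel: {winner} wins")

The Xeon only pulls ahead once a workload is almost perfectly parallel; for anything lightly threaded, the clock speed wins.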
 
He is right. You'd build all that only to find out a rig 1/5 the price beats it in games.
 
But it's not just for gaming - it's also for sound processing and software development. I'd like to be able to run everything virtualized, as in have an OS just for games and shenanigans, and an OS for development, running in parallel so I can switch back and forth as needed.

I'm aware that single-threaded applications will be bottlenecked by the CPU's frequency - at the same time, I was hoping there would be a hypervisor that could pool resources in such a way...

I saw a couple of CPUs in the lineup that hit 3.0-3.6 GHz - are those more suitable? Any suggestions on which CPUs to use?
 
There's no current way to have a hypervisor "pool" your resources the way you'd want, sad to say.

If money is no object, then you're better off allocating a high-clocked 4-6 core CPU to your games/for-fun instance, and however many other cores you need to your sound processing and software development. Only you would know how many cores those latter two tasks need.

Yes, a 3.0-3.6 GHz processor would be more suitable. The exact model would depend on how many cores you're targeting.

Some other high-clocked models if you're willing to drop the core count (based on a quick wiki search):
Xeon E5-2687W v4
Xeon E5-2689 v4
Xeon E5-2689A v4

I would, however, recommend researching the hypervisor side of this just to check whether 1080 SLI will work through it. I have no experience with virtualization whatsoever.
 
You should just go with a single-socket board and a 6950X, which should have plenty of horsepower for your multi-threaded apps as well as your single-threaded ones. It will also be way better for gaming.
 
Well, at some point I'm going to turn this into a server, and I'll want ECC capability... so a consumer CPU and mobo are kind of out of the question. Another reason I want dual-CPU capability is that I'll need more than 40 PCIe lanes if I go triple SLI.
 
Remember that only two-way SLI will be supported in the future.
 
Also, SLI support on commercial-grade or prosumer motherboards is very hard to find. But you've probably figured that out by now.
 

The Asus Z10PE-D8 has SLI support... There are about 3-4 C612 mobos with dual-CPU support and SLI.
 
At this price I would build a gaming system and a developer system and add a KVM switch. The sound processing / developer system could build on http://www.natex.us/Intel-S2600CP2J-Motherboard-Kit-p/s2600cp-sr0h8-128gb-12800.htm and add:

PSU: Seasonic M12 II, $105
Chassis: Phanteks Enthoo Pro, $100
Coolers: Xigmatek Dark Knight or Silverstone AG07 (old coolers for an old system; swap the fans; budget $100 though it'll be less, around $15 per cooler off eBay plus $10-20 for a great fan)

That's about $750, and if your apps scale well with the number of cores/threads, this old system will destroy everything price/performance-wise. If you're planning on a 4K monitor (guessing from the 1080 SLI), add a cheap RX 460; it can be undervolted and underclocked, and there are entirely passive (XFX) or semi-passive (Sapphire Nitro comes to mind) cards, all of which will keep the system quiet. Because even with all this power, the battle plan is to build a quiet system.

And then build a neat little gaming rig with a much cheaper i7 6700K and Z170 platform. Let's budget $3000 for that.

You've spent much less than the price of a single E5-2699V4.

Now you can spend a few bucks on a few SSDs for the first system - you have a bucketload of slots. I'd use something like four Intel 750s in software RAID-10, in whatever size your sound work requires. Don't be sad that the board is PCIe 2.0 only; it's not a severe limitation - the Intel 750 tops out at 2200 MB/s and PCIe 2.0 x4 is about 2000 MB/s. The 400 GB version costs $350, so four of them is only $1400 for 800 GB of workspace. I presume you planned on hard drives for mass media storage anyway; I would get a Supermicro CSE-M35T-1B 3x 5.25" to 5x 3.5" hot-swap cage for $100 - it fits the chassis above and makes servicing easy. Ultrastar He10 drives are $500 each; five of those is $2500. All this storage now costs about the same as a second E5-2699 v4.
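
If you want to sanity-check that math, here's a quick Python back-of-the-envelope (idealized spec-sheet numbers, no RAID or filesystem overhead):

Code:
drives = 4
drive_size_gb = 400
drive_cost_usd = 350
drive_peak_mbs = 2200            # Intel 750 sequential read, spec sheet

pcie2_x4_mbs = 4 * 500           # PCIe 2.0 is ~500 MB/s per lane

usable_gb = drives * drive_size_gb // 2   # RAID-10 mirrors half away
total_usd = drives * drive_cost_usd

print(f"{usable_gb} GB usable for ${total_usd}")
print(f"slot ceiling {pcie2_x4_mbs} MB/s vs drive peak {drive_peak_mbs} MB/s")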
 
I have a NAS that will handle storage, so I'm not worried about that. This computer should last me 5+ years though... I definitely want a quiet system, but I'm hoping to achieve that with some extra water cooling.

I'm still considering a 2699 because it'll boost to a max of 3.6 GHz, and with 8 cores active it still runs at 2.9 GHz. That should be enough for any video game - and running 2x 1080s in SLI will be plenty of power.

I've been looking on Taobao lately for parts; they're significantly cheaper than Newegg or Amazon. They also have this E5-2699C, which seems to be identical to an E5-2699 - but I can't find any information on it anywhere. It doesn't seem to be an ES/QS part either, and it's slightly less expensive. Any idea where I could find more details on it?
 
Actually, it's possibly doable, but not in the way the OP wants. If you can get the hypervisor to pass through a Thunderbolt port, you could plug in an external graphics dock hosting a single Titan XP.
 
The only realistic option for GPU passthrough seems to be QEMU/KVM: http://www.linux-kvm.org/page/Main_Page

It doesn't work with consumer-grade GTX cards on VMware ESXi, as Nvidia doesn't want you doing that; only the expensive cards will do. AMD was a bit better about this, but I don't have any actual information, as I'm from the green side of life.
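
For the curious, the bare-bones shape of a QEMU/KVM launch with VFIO passthrough looks roughly like the sketch below (wrapped in Python for readability). The PCI address and disk path are placeholders, the GPU has to be bound to the vfio-pci driver first, and kvm=off is the usual workaround for the GeForce driver refusing to start (Code 43) when it detects a hypervisor:

Code:
import subprocess

GPU = "01:00.0"  # placeholder PCI address; find yours with lspci

cmd = [
    "qemu-system-x86_64",
    "-enable-kvm",
    "-machine", "q35",
    "-cpu", "host,kvm=off",   # hide the hypervisor from the GeForce driver
    "-smp", "8",              # 8 virtual cores for the guest
    "-m", "16G",
    "-device", f"vfio-pci,host={GPU}",
    "-drive", "file=/path/to/win10.img,format=raw",
]
subprocess.run(cmd, check=True)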
 
IonutZ...

What you've described doing - using virtualization to pool resources - is the fundamental opposite of what virtualization is designed for. Virtualization is not used to pool multiple resources together for use by a single system; the resources are already *on* a single system. It's used to distribute the resources of that single system across multiple other (virtual) systems, achieving greater operational density through more efficient use of the hardware.

It's very similar to adding CPU cores, in a way, and can best be thought of like a highway: the clock speed is the speed limit, and the core count is the number of lanes. Increasing just the core count (number of lanes) doesn't raise the speed limit, and increasing just the clock speed doesn't help when too many cars are vying for the road.

Virtualization's *job* in this scenario is to *put more cars on the road*, because presumably some of the lanes are empty. The entire virtualization ecosystem is built for this purpose. It is *not* designed to let a single car drive faster down the road.

This makes sense, really; hypervisors are middleware between the software you're running (Windows/games/whatever) and the hardware it runs on. There are almost no scenarios in which adding additional layers of middleware between your software and your hardware makes the software faster. They're literally middle management: they can help you do lots more things at the same time, but they rarely if ever help you accomplish any particular task faster than before.

Real server motherboards aren't built for gaming, and real gaming motherboards aren't built for 'server-ing', to invent a word. Good server boards like those made by Supermicro won't let you overclock, likely won't come with fancy features like USB 3.1 or onboard sound, and aren't typically built with quiet in mind. Gaming motherboards won't come with IPMI management, VMware (or otherwise) certified SAS controllers or HBAs, or multiple NICs.

Build separate computers for separate jobs, and save yourself money by not buying the 2699.
 

I have built a system like what you're describing: dual E5-2689 v4, 128 GB DDR4-2400, on an Asus Z10PE-D8, with a 1080 for one VM and a 1070 for the other. It works really well and games well, since the E5-2689 v4 has an all-core turbo of 3.7 GHz - a lot higher all-core than the rest of the Xeon lineup.

If you were going to go Xeon, I would recommend the 2689 v4.
 
Also, I'm using Unraid. It's a KVM-based hypervisor that allows passthrough of Nvidia GeForce cards. It's also a NAS, so you could probably get rid of your NAS - that's what I did. Now I just have one system in the house.
 

Do you use the system for things other than running the NAS and the two gaming workstations? Because I can't see how what you've described building is more cost-effective than just building dedicated PCs.

My guesstimates at your pricing. I'm leaving off things like cases/CPU coolers/etc. because you've got to buy those in either scenario, and though there would be more desktops to equip, the server-grade equivalents are typically more expensive, so it roughly evens out:
Z10PE-D8 - $565
E5-2689v4 - Er, not sure. Call it $1500 x2?
128 GB DDR4 2400 ECC Registered - ~$1000
Total for the box: $4565

For that kind of money, you could easily build a bunch of these:
Decent Z270 board, like Z270 Extreme4 - $150
i7-7700K - $325
G-Skill Ripjaws V 32 GB DDR4 3000 CL14 - $244
Total for each system: $719

And individually they'd each be faster than the VMs you've got in your big Xeon box, thanks to the lack of hypervisor middleware, faster CPU cores and memory, and the ability to overclock even further if you wanted.
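
Running those guesstimates through Python, just to make the comparison concrete:

Code:
xeon_box = 565 + 2 * 1500 + 1000   # board + two E5-2689 v4 + 128 GB ECC
i7_build = 150 + 325 + 244         # Z270 board + 7700K + 32 GB DDR4

print(f"Xeon box: ${xeon_box}")                           # $4565
print(f"Each i7 build: ${i7_build}")                      # $719
print(f"i7 builds per Xeon box: {xeon_box // i7_build}")  # 6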
 

I also run a few small web servers, a TS server, Plex, and a few game servers. I also do some app development and video encoding. I used to have a bunch of dedicated machines, but it's annoying having PC boxes everywhere. Then you've got to turn them on if they're not already going. Then Windows decides to shit itself and you waste half a day fixing that.

This way everything's in one place. Windows shits itself? Just grab the image you backed up a few days ago. All files are available at HDD speeds, not over the network. One PSU. One case. One lot of HDDs. ECC RDIMMs.

Yes, it's not as fast as having dedicated boxes in some applications, but for modern multithreaded games the difference is negligible, and older single-threaded games run fine. Unless you're anal about getting an extra 2 fps, these CPUs game fine. I haven't seen much of a difference, if any, compared to the 4.6 GHz 4790K I got rid of.

The hypervisor has next to no impact on performance. I benched the CPU on a bare-metal install and again running through the hypervisor, and performance was near identical.
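
For anyone who wants to repeat that test, even a crude fixed-work loop timed on bare metal and then inside the VM will show whether the hypervisor is stealing cycles. A rough sketch, not a proper benchmark:

Code:
import time

def crude_cpu_score(iterations=10_000_000):
    # Time a fixed amount of single-threaded integer work;
    # higher score = faster. Run the same script on bare metal
    # and inside the VM, then compare the two scores.
    start = time.perf_counter()
    total = 0
    for i in range(iterations):
        total += i * i
    return iterations / (time.perf_counter() - start)

print(f"{crude_cpu_score():,.0f} loop iterations/sec")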

Sometimes it's not all about the best price-to-performance ratio - unless you just use your PC to game, in which case you can buy any overclockable quad-core and call it a day.
 
I mean, I do something similar, though on a smaller scale and using two boxes.

My "server" PC runs TS3 and Plex, holds my media library, and runs VMware Workstation with 2 permanent VMs. One of the VMs is my work VM, where all my work shit lives, including my programming environment. Occasionally I'll build client VMs in this environment and then export them, but it's rare. Mostly I use it for product testing; I have a bunch of VMs and an entire domain + Exchange environment built, but typically powered off unless I need to test something. I don't do much video editing - mostly remuxing MKVs to incorporate correct subtitles and audio streams, plus the odd edit of a game clip or baby video, but that kind of thing I do on my "gaming" PC.

The difference for me is that, if I need it, I have access to a 16-host VMware ESX cluster that collectively has 4 TB of memory and approximately 80 TB of SSD space. And that's where all my 'real' work gets done because that's the hosted environment we rent out to our customers by the VM.

I was just curious - attempting to build something like you've described would outright murder my budget, because it'd be an "all at the same time" kind of purchase. While I've got a lot of decent hardware in my house, it was all assembled over time and often bought used, so the dollar amount was spread out. Plus, for me, the PCs are spread all across the house, and my wife and I game on all of them except the server; they're not in physical proximity to where such a centralized box would live.
 
I have a 6950X that I'm going to make a dedicated gaming box out of, running 2 VMs for the gaming room. Just have to get around to building it.

Just for anyone who reads this later: I ended up building a 6950X overclocked to 4.4 GHz with 32 GB of 3200 MHz DDR4, and it gets about the same fps as my dual-Xeon 2689 v4 system running DDR4-2400. The only game that's shown any real improvement is Arma 3, which went from 30/40 fps to 50/60 fps. I was kind of hoping for a bigger improvement, to be honest.

So a 3.7 GHz Xeon is perfectly fine for gaming. Will test more and update later.
 