New Intel Core i9-7980XE 18 Core Build

SixFootDuo

Supreme [H]ardness
Joined Oct 5, 2004 | Messages 5,825
Client wants a heavy-lifting workstation for 10+ hour renders, simulation, etc. - SolidWorks, Siemens NX 11, CAMWorks, and the like.

Specs below.

I've built a number of these machines, but I'm always looking for new thinking / suggestions others might share. There is always room to learn.

I specced out a few Epyc and Xeon builds on paper, but I really just want to go with the 4.5 GHz I expect to get with the 7980XE across all 18 cores.

Budget is not an issue. This particular client will write a check for any amount but I'm really not sure we need to spend much more for what his Engineer will use.

Anyone have experience with 3Dconnexion products? I see A LOT of these on desks in the offices I visit. I normally just supply keyboards and mice for input, but I thought we could make an attempt at streamlining his workflow, since I'm told he is working 12- and 14-hour days on a new Ford program. 850,000 units.


Intel Core i9-7980XE 18 Cores / 36 Threads @ 4.5 GHz
Asus X299 LGA 2066 board, 2nd version (Mark 2)
64GB DDR4-2666 RAM (4 x 16GB)
NVIDIA Quadro P4000 8GB GDDR5, PCIe x16
1TB Samsung 960 Pro NVMe SSD
Corsair H150i Pro AIO
Full tower case
Seasonic 1000W Platinum-rated PSU
3Dconnexion SpaceNavigator and 3Dconnexion CadMouse
Slimline comfort keyboard
Windows 10 Professional
USB DVD drive
1TB mechanical storage
64GB USB rescue media
 
I will probably get 3200. That's the quote I gave the client. Memory is super expensive; I'm looking at about $600 as is, I think.
 
In this use case, I call my Dell and Lenovo reps and get quotes on racked workstations.

I am not a fan of troubleshooting in the middle of a client's rushed deadlines.
I really like it when some guy in a logoed shirt swings by with a bunch of replacement hardware instead of me hunting it down.
 
At 4.5 GHz I doubt the motherboard could sustain the power delivery for the processor; the VRMs will likely overheat and throttle the CPU back. I think 4.2 GHz is a more realistic goal. Make sure you have a fan of some kind on the VRMs.
 
I'm sorry, but this isn't very realistic. 4.5 GHz with a Corsair H150i AIO? That's not a bad unit, but I don't see it handling 18 cores at 4.5 GHz. As others have said, Silicon Lottery doesn't even offer a 4.5 GHz 7980XE, and even if they did, that cooler likely isn't enough to handle it. I've built many machines for various customers over the years, and one of the constants I've found is that they never do enough to keep the workstation as cool as I would. I'm not speaking about design, but about environmental variables: ambient temperatures and the system's physical location are usually less than ideal. Usually, if customers were cognizant of such things, they would build their own boxes.
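Just to put rough numbers on that, here's a back-of-envelope sketch (my own, not a measurement; the stock all-core clock, stock voltage, and the assumption that TDP roughly equals stock all-core power are all guesses):

```python
# Very rough dynamic-power scaling estimate, P ~ f * V^2.
# Every input below is an assumption, not a measured value.
BASE_POWER_W  = 165.0   # 7980XE TDP, assumed to approximate stock all-core power
BASE_FREQ_GHZ = 3.4     # assumed stock all-core turbo clock
BASE_VCORE    = 1.00    # assumed stock load voltage

OC_FREQ_GHZ = 4.5
OC_VCORE    = 1.20      # upper end of the voltage range mentioned later in the thread

est_w = BASE_POWER_W * (OC_FREQ_GHZ / BASE_FREQ_GHZ) * (OC_VCORE / BASE_VCORE) ** 2
print(f"Estimated package power at {OC_FREQ_GHZ} GHz / {OC_VCORE} V: ~{est_w:.0f} W")
# -> roughly 314 W even with optimistic inputs; sustained AVX render loads are
#    reported well above that, which is a lot to ask of a 360 mm AIO 24/7.
```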

Active cooling on the motherboard VRMs is going to be a virtual necessity if you want that system to last.
 
I never, EVER overclock client machines. Ever. If they want the highest performance, it absolutely must come with guaranteed stability, including RAM data stability.

For an expensive high-end build I would get the support and backing of a big warranty like Dell or HP. I wouldn't white-box this. And why a 7980XE? Wouldn't a dual-Xeon workstation with ECC be better? OK, maybe it's a few points lower in Cinebench, but you will be able to sleep at night and your reputation will not be crushed.
 
I never, EVER overclock client machines. Ever. If they want the highest performance, it absolutely must come with guaranteed stability, including RAM data stability.

For an expensive high-end build I would get the support and backing of a big warranty like Dell or HP. I wouldn't white-box this. And why a 7980XE? Wouldn't a dual-Xeon workstation with ECC be better? OK, maybe it's a few points lower in Cinebench, but you will be able to sleep at night and your reputation will not be crushed.

OEMs like Dell and HP cut corners like anyone else. White-boxing it is fine if you do it correctly. The build in question isn't the way I'd do it.
 
OEMs like Dell and HP cut corners like anyone else. White-boxing it is fine if you do it correctly. The build in question isn't the way I'd do it.

Yes, white box is fine as long as you're available for the next 3 years to honor replacing parts if they fail. At least with the big-box companies, they'll be around to honor warranties even if you quit or are fired. It's about ethics. IT people never consider it.
 
Must be nice to have the privilege of building a monster system! Reliability is more important than chasing the ultimate speed. Sounds like serious number crunching, and any errors would not be an ideal situation for your reputation. Stick with stock speeds and focus on I/O and memory speeds. IMHO...
 
Intel Core i9-7980XE 18 Cores / 36 Threads @ 4.7 GHz
Asus Rampage VI Extreme motherboard
64GB G.SKILL TridentZ RGB DDR4 RAM @ 4000 MHz
Thermaltake Floe Riing RGB 360 TT Premium Edition liquid cooling system
Cooler Master MasterCase H500P case
 
Yes, white box is fine as long as you're available for the next 3 years to honor replacing parts if they fail. At least with the big-box companies, they'll be around to honor warranties even if you quit or are fired. It's about ethics. IT people never consider it.

Ethics? I don't know if I agree with that term here. It's not unethical to build a white-box machine for a customer. The only thing I'd consider ethically ambiguous is failing to offer a complete list of possible options for meeting your customer's needs. When I did this sort of thing for a living, I always offered my customers a choice between a prebuilt workstation like an HP Kayak workstation or Compaq ProLiant (I did this shit a long time ago) and the white-box offerings I could build. I made it clear what the advantages and disadvantages were, and many made the choice to go with a white box knowing full well what risks they incurred.

For most of your points, I can agree with you. However, just about everything has its trade-offs. Even going with a big-box company's products isn't without some downsides. You can build a better system than HP or Dell, but you run the risk of needing parts and service with no centralized warranty, or with a warranty that isn't worth the paper it's printed on because the company you bought it from went tits up. Then the customer ends up getting stuck with the bill or unit replacement costs. On the other hand, Dell and HP are substandard hardware-wise. At least those machines are covered by a basic parts-and-labor warranty, and for workstations, those warranties are longer than your average consumer-grade desktop's or laptop's. You can also get support agreements for workstations and servers which have a service level agreement that guarantees parts and service within a certain time frame. Unfortunately, you pay a lot more for that peace of mind.

It isn't as if the parts inside a white-box workstation are totally without a warranty. Unfortunately, you end up in a situation where you are at the mercy of some small business to take care of your problems, or you have to navigate the waters of the hardware manufacturers' individual component warranties. This isn't the worst thing in the world by itself, but component warranties lack any type of SLA. Those companies couldn't give two squirts of piss how long your machine is down for repairs. Lost revenue or downtime isn't the manufacturer's responsibility. Of course, that's the case with all warranties, but a service level agreement with HP or Dell at least guarantees you parts and a technician to install them within a given time frame. That's something those manufacturers will take responsibility for.

Properly built, a white box can last as long as anything else. Computers have the same common failure points regardless of who made them: mechanical hard drives, fans, PSUs, etc., in virtually that order. Many people will go with a white box knowing that the biggest risks in doing so are downtime and storage devices. Servers are another matter, and I see very few companies using white-box style servers, and rightly so. Server infrastructure is a bit different than a single workstation. Then again, it depends on the size of the company and several other factors. I've seen this scenario play out in probably every way you can imagine.

My advice: if we are talking about building a professional workstation, the suggestions in this thread aren't it. You don't use standard retail CPUs; you use Xeons. You don't use RAM with shark fins and shit on it; you use ECC RAM. You don't use consumer SSDs or gaming motherboards in some tower with more lights on it than Joel Schumacher's version of the Batmobile.

Throw down a real budget and we can list some actual recommendations. If you don't give me a real budget, I'll do what I normally do and post some $40,000 smart ass spec sheet as a recommendation.
 
I'm not sure "substandard hardware-wise" is necessarily accurate. Motherboard/BIOS support is WAY better than anything custom-built.

Have you ever talked to ASUS/Gigabyte/ASRock, etc., about official Linux support? How about updating Sandy/Ivy Bridge-era motherboards with Spectre fixes when they come out?
 
I am not against building white-box business-critical machines or software.
The last time I built gear, it was because the client took X79 gear into a stadium and it died.
The outfit that built it had only heard of studio use cases, but got sued anyway.

Given some of the workflow cited, there are some glaring issues, so I would call a vendor and get a quote on what they would sell.

Now there's nothing wrong with getting traveling pit crew rates.
I've worked E3, NY Comic Con, PAX, the Oscars, and a bunch of private HBO parties.
I always had extra everything with me, and the client was willing to pay for it.
I am absolutely bespoke-testing all patches and updates, recording my results in inventory, and treating the gear like it's going to get on a plane and can't die for the 4 months it's installed.

Yeah, that happened regularly, and I charged them for it.

I would want cooling, RAM, storage, et al. to be spec'd to what the customer expects.

Even if the customer doesn't know, you'd have quotes from vendors for products validated for the use cases, which the white box(es) hopefully didn't deviate from in any meaningful way that would land you in a lawsuit for damages stemming from loss of anything.
 
Ethics? I don't know if I agree with that term here. It's not unethical to build a white-box machine for a customer. The only thing I'd consider ethically ambiguous is failing to offer a complete list of possible options for meeting your customer's needs. When I did this sort of thing for a living, I always offered my customers a choice between a prebuilt workstation like an HP Kayak workstation or Compaq ProLiant (I did this shit a long time ago) and the white-box offerings I could build. I made it clear what the advantages and disadvantages were, and many made the choice to go with a white box knowing full well what risks they incurred.

For most of your points, I can agree with you. However, just about everything has its trade-offs. Even going with a big-box company's products isn't without some downsides. You can build a better system than HP or Dell, but you run the risk of needing parts and service with no centralized warranty, or with a warranty that isn't worth the paper it's printed on because the company you bought it from went tits up. Then the customer ends up getting stuck with the bill or unit replacement costs. On the other hand, Dell and HP are substandard hardware-wise. At least those machines are covered by a basic parts-and-labor warranty, and for workstations, those warranties are longer than your average consumer-grade desktop's or laptop's. You can also get support agreements for workstations and servers which have a service level agreement that guarantees parts and service within a certain time frame. Unfortunately, you pay a lot more for that peace of mind.

It isn't as if the parts inside a white-box workstation are totally without a warranty. Unfortunately, you end up in a situation where you are at the mercy of some small business to take care of your problems, or you have to navigate the waters of the hardware manufacturers' individual component warranties. This isn't the worst thing in the world by itself, but component warranties lack any type of SLA. Those companies couldn't give two squirts of piss how long your machine is down for repairs. Lost revenue or downtime isn't the manufacturer's responsibility. Of course, that's the case with all warranties, but a service level agreement with HP or Dell at least guarantees you parts and a technician to install them within a given time frame. That's something those manufacturers will take responsibility for.

Properly built, a white box can last as long as anything else. Computers have the same common failure points regardless of who made them: mechanical hard drives, fans, PSUs, etc., in virtually that order. Many people will go with a white box knowing that the biggest risks in doing so are downtime and storage devices. Servers are another matter, and I see very few companies using white-box style servers, and rightly so. Server infrastructure is a bit different than a single workstation. Then again, it depends on the size of the company and several other factors. I've seen this scenario play out in probably every way you can imagine.

My advice: if we are talking about building a professional workstation, the suggestions in this thread aren't it. You don't use standard retail CPUs; you use Xeons. You don't use RAM with shark fins and shit on it; you use ECC RAM. You don't use consumer SSDs or gaming motherboards in some tower with more lights on it than Joel Schumacher's version of the Batmobile.

Throw down a real budget and we can list some actual recommendations. If you don't give me a real budget, I'll do what I normally do and post some $40,000 smart ass spec sheet as a recommendation.


Every studio I've worked at had dedicated design workstations (because designers don't get paid to watch a progress bar), dedicated render boxes, tiered project storage, and a real dedication to network throughput.

This would be 2 people, or 2000.

Scale is everything when dealing with multiple deadlines staring them down.
 
I'm not sure "substandard hardware-wise" is necessarily accurate. Motherboard/BIOS support is WAY better than anything custom-built.

Have you ever talked to ASUS/Gigabyte/ASRock, etc., about official Linux support? How about updating Sandy/Ivy Bridge-era motherboards with Spectre fixes when they come out?

I worked with HP and Dell systems like that for a number of years. "Substandard components" is probably a bit harsh, but they aren't necessarily on par with what's out there in the standard DIY motherboard market in some areas. BIOS support varies by vendor, but I can agree with that. Hardware support is definitely better, but that's a reflection of customer service and the vendor, not the hardware itself. I still maintain that, overall, OEM hardware cuts more corners than a proper DIY build does. I used to repair and refurbish these things, and I don't know how many I've upgraded over the years, but this has universally been true.
 
Must be nice to have the privilege of building a monster system! Reliability is more important than chasing the ultimate speed. Sounds like serious number crunching, and any errors would not be an ideal situation for your reputation. Stick with stock speeds and focus on I/O and memory speeds. IMHO...

I've been able to maintain a very good reputation with nearly all of my clients. The only problems I've ever had were when I would get a call/text about system instability due to the SSD getting full. I always quote systems with a 512GB SSD, purposely allowing myself an upgrade path down the road. While I do have semi-regular maintenance, I like to have some measure of recurring revenue with these companies. All the systems I build over $7K get 1TB SSDs.
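Something as simple as this kind of free-space check would catch those before the call comes in (a minimal sketch; the drive letter and the 15% threshold are just illustrative, not anyone's actual setup):

```python
# Hypothetical free-space check; drive letter and threshold are illustrative.
import shutil

DRIVE = "C:\\"
MIN_FREE_FRACTION = 0.15  # warn when less than 15% of the SSD is free

total, used, free = shutil.disk_usage(DRIVE)
status = "WARNING" if free / total < MIN_FREE_FRACTION else "OK"
print(f"{status}: {free / 2**30:.1f} GiB free ({free / total:.0%}) on {DRIVE}")
```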

I don't cut corners; I over-engineer my systems so that they have more performance than anyone needs. I used to do very, very well with gaming systems back in the mid-'90s, and that run lasted a good solid 11 or 12 years until tablets and smartphones came out. I then just renamed my high-end gaming systems "workstations" and I've been able to survive.

But I do stress test the holy hell out of my systems. And they can handle 8 - 10 - 12+ hour renders without fail.

Even more important, what has saved me an absolute ton of time and money is Windows 10 Lite along with my custom hosts file, which would blow your minds. I am always remotely updating my clients' hosts files to block out malware, toolbars and all the other bullshit. I run portable apps only on these systems; nothing updates, ever. They all have rescue media that will restore everything. Each morning at 5 AM, the system automatically images itself in case we need to do a restore. Some of my clients have kids, so I block out Twitter, Facebook and everything else they would be interested in. Having these systems perform without issue 24/7 is key.
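A minimal sketch of what that hosts-file push could look like (the blocklist URL is a placeholder, not the actual tooling described above, and it assumes the script runs with admin rights on the client box):

```python
# Hypothetical hosts-file blocklist push. The URL is a placeholder, and the
# script assumes it runs with admin rights on the client machine.
import urllib.request
from pathlib import Path

HOSTS = Path(r"C:\Windows\System32\drivers\etc\hosts")
BLOCKLIST_URL = "https://example.com/blocklist.txt"   # placeholder source
MARKER = "# --- managed blocklist below ---"

def update_hosts() -> None:
    blocklist = urllib.request.urlopen(BLOCKLIST_URL).read().decode("utf-8")
    original = HOSTS.read_text(encoding="utf-8")
    # Drop any previously pushed block so repeated runs stay idempotent.
    base = original.split(MARKER)[0].rstrip()
    HOSTS.write_text(f"{base}\n\n{MARKER}\n{blocklist}\n", encoding="utf-8")

if __name__ == "__main__":
    update_hosts()
```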

Software is such a wildcard. I fear it more than hardware failure. If you can completely and utterly control the software, and the user to an extent... that will get you very, very far.

I know a lot of other guys that do workstations and they all ... all cut corners.

I just finished another workstation for a client that I billed out at $19,995, and I built that with 2 x AMD Epyc 7601 CPUs, a Supermicro motherboard (the MBD-H11DSI), 2 x Noctua Epyc coolers, 128GB of DDR4-2133 ECC, a Supermicro server case and PSU, and a second-hand NVIDIA Quadro P6000.

Yes it's true, some of these companies / clients have very deep pocketbooks but they will always take the best of the best when it's proposed / offered.

What would you guys build with a $7K budget?

Workloads are crash simulation, wind tunnel, SolidWorks renders usually in the 8 - 14 hour range, etc. They are throwing a lot at these systems.
 
You can't build in a vacuum with this workflow.
There are very specific build targets that will be tiered as Sm, Med, Lg, XL, Industrial.

The scale of the workpiece and the degree of testing minutiae will affect which fine points from the target builds of the tier above yours should be pulled down to your tier for a "performance boost".

Take some of the Dell rack workstations that can run 4-8 Quadros with essentially their own flash arrays and Fibre Channel out to rackmates, and you'll get the "Industrial" tier.

Autodesk is specifically funny this way; I knew guys in the late '90s/early 2000s who were white-boxing some of the first flash SAN arrays out of old Sun and SGI gear.

My girlfriend was a Web Producer for them, which is amusing because there's zero chance I will join a guy I know running their DevOps team.

You really want to understand the discrete components of their workflow.
I sometimes run into these issues, like yesterday when I was sitting with a bunch of DBAs trying to be data scientists.
They couldn't even figure out how to get their data sorted into targeted data lakes to run AWS services for visualization, much less get their role access working on data crawlers to get the data from the stream they needed to the right volumes.

You end up having to untangle a lot of habits that may be client quirks, and there's no real fault of hardware or software... it's the client's quirky workflow.

That still has to be accounted for and dealt with.

I'm glad I won't do this type of gig anymore, because I will get these random calls for things like China Mobile: I have 20 days, I need 3 teams to write bespoke changes to existing open source projects, I get a budget of $375K, I have to be onsite in China the whole time so I donate project management hours, and then I need to build out gear to be used in a way no one ever intended, and it can't break.

I mean, screw that noise.
As long as there is time to log, you can figure it out.
There are a lot of 3rd party outfits that actually do this type of fill in work for Dell, HP, Lenovo, etc.

Once you dive into it, you rapidly grow past it, because it isn't difficult to start a dialogue with Nvidia, Citrix, Dell, et al. when you have a client doing XXXXX that is potentially cool.
Broadcast the client's work, vendors are always looking for an interesting write up.
You are documenting and encapsulating chunks of their workflow; hopefully you'll see a holistic strategy where the box really makes sense.
You might find some real capability by whiteboxing a solution to a set of problems Autodesk hasn't addressed.
I mean, they have 90+ projects going that need a certain kind of cohesive steering that is presently missing.
Dive into the software; you don't need to write any of it, but you do have to understand how it works for the client's benefit.

They have to work for the man, so anything that helps them stand out in a sea of logoed shirts helps them tremendously.

You start finding these really weird groups within the vendors that are tremendously powerful.
 
I worked with HP and Dell systems like that for a number of years. "Substandard components" is probably a bit harsh, but they aren't necessarily on par with what's out there in the standard DIY motherboard market in some areas. BIOS support varies by vendor, but I can agree with that. Hardware support is definitely better, but that's a reflection of customer service and the vendor, not the hardware itself. I still maintain that, overall, OEM hardware cuts more corners than a proper DIY build does. I used to repair and refurbish these things, and I don't know how many I've upgraded over the years, but this has universally been true.

I could possibly see HP & Dell cutting corners to save a few cents here & there, but even so, that is what the warranty is for, right? And if it's off-lease/off-warranty, they are dumped by the buttload on eBay. I borked up one of my ThinkStation S30s when upgrading a CPU - first time upgrading a Socket 2011 board. Instead of replacing it, I bought 2 new ThinkStation S30s off eBay for $100 a piece (E5-2609, 8GB DDR3, 2 x 500GB SATA, Win7 Pro OEM license, Quadro 400) for the entire machine. I just sold my MacBook Air to upgrade to something newer with better Linux compatibility & bought a Latitude with a 3-year on-site warranty, refurbished from Dell themselves, for $750... Simply can't get that with an Acer, Asus, etc.

If you want the newest bleeding edge stuff with max overclocking, you don't buy HP & Dell. I think the market is different, not necessarily inferior.

In my experience, I've been burned way more times by issues with hardware/BIOS support than by failed components. Failed components are, both time-wise & money-wise, in most cases easier to replace. Nobody is going to go fix their own Spectre BIOS problems on an Ivy Bridge box, while many of the big OEMs will.
 
I could possibly see HP & Dell cutting corners to save a few cents here & there, but even so, that is what the warranty is for, right? And if it's off-lease/off-warranty, they are dumped by the buttload on eBay. I borked up one of my ThinkStation S30s when upgrading a CPU - first time upgrading a Socket 2011 board. Instead of replacing it, I bought 2 new ThinkStation S30s off eBay for $100 a piece (E5-2609, 8GB DDR3, 2 x 500GB SATA, Win7 Pro OEM license, Quadro 400) for the entire machine. I just sold my MacBook Air to upgrade to something newer with better Linux compatibility & bought a Latitude with a 3-year on-site warranty, refurbished from Dell themselves, for $750... Simply can't get that with an Acer, Asus, etc.

If you want the newest bleeding edge stuff with max overclocking, you don't buy HP & Dell. I think the market is different, not necessarily inferior.

In my experience, I've been burned way more times by issues with hardware/BIOS support than by failed components. Failed components are, both time-wise & money-wise, in most cases easier to replace. Nobody is going to go fix their own Spectre BIOS problems on an Ivy Bridge box, while many of the big OEMs will.

Good points, although I wasn't thinking of ASUS when I talk about workstation builds. I'm thinking more about Supermicro boards etc.
 
Update: Parts ordered. I was able to get a new sealed 7980XE for $1,650 off eBay and a brand new Lenovo-branded Samsung PM961 1TB NVMe SSD for $210. The biggest cost I could not avoid was, of course, the 64GB of DDR4-3200 at $720-ish.

I decided to go with the new ASRock X299 Taichi XE motherboard. This addresses my concerns with overclocking, as it has a beefier cooling solution.

https://www.newegg.com/Product/Product.aspx?Item=N82E16813157797

I was able to pick up a new NVIDIA Quadro P4000 for $820 off eBay.

All said and done, I was able to come in very close to $3,995.

Rounded out with the new Corsair H115i Pro, an NZXT S340 Elite, and a Seasonic 850W Gold-rated PSU.

Final specs: 7980XE 18 cores / 36 threads, 64GB DDR4-3200, 1TB Samsung NVMe SSD, NVIDIA Quadro P4000.

For $3,995, I think I was able to do an amazing job.
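Quick tally of where that lands, using only the prices listed above (the last line is just the remainder of the quote, not stated pricing):

```python
# Prices from the posts above; the remainder covers board, cooler, case, PSU.
parts = {
    "i9-7980XE (sealed, eBay)": 1650,
    "Samsung PM961 1TB NVMe":    210,
    "64GB DDR4-3200":            720,
    "Quadro P4000 (eBay)":       820,
}
quote = 3995
known = sum(parts.values())
print(f"Known parts: ${known}")                                # $3,400
print(f"Left for board, cooler, case, PSU: ${quote - known}")  # $595
```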

I'm almost positive I will get 4.3 to 4.5 GHz @ 1.15 to 1.20V.
 
Oh no, not for a client build. Personally? Maybe an 8700K or possibly a 9700K, but those CPUs run great already under good cooling.

I got all the parts in and am waiting for the SSD tomorrow; I will get it built over the weekend and will throw some pics and benches up.

I keep meaning to document these builds for the HardOCP community just for fun. I really wish I had taken pics of the dual AMD Epyc build I just completed. That was a $20K workstation. One of these days I will get around to a live build / video.

I have limited work space so that's another reason I do not do more to document builds.

I have a dual-Xeon 256GB ECC build coming up; I'm getting those parts in next week. Trying out a new Supermicro rackmount stand case that looks cool.

A lot of my custom-built workstations go into very informal work areas, mostly into the homes of engineers who work from home, but I also have several in offices. All my guys know that I'm a small player and that when there are issues, we have to go through the warranty process, but at the same time, all of these guys have backup systems that I also maintain - systems I've built as well, in most cases.

Of the 50+ workstations I've built over the past 7 or 8 years, 99.9% of my problems have been software-related. Second would be the client wanting more memory, a larger SSD and/or additional storage. I never have hardware failures, ever, and I do find that odd. Wait - I take that back. I had a lot of SSD failures, mostly cheap 60GB drives I was occasionally putting in 5 or so years ago. Nearly all of them died, inside or outside warranty. SSDs are a lot more robust these days.

I've bought a few refurbished workstations and they honestly look very well built. I'm sure a lot of these issues are employees digging around in the cases and hardware being mishandled - general BS that causes problems.

While I do float around $4K to $6K a month, I can't just run out and pop in $2K and $4K CPUs or $1K video cards on a whim. I know some of you are wondering how I manage this. It's really not an issue for me or the clients. I did have to cut a vacation short by 2 days a few years ago and come back home due to serious issues with a new $9K workstation and the client's deadline, but that's a rare event for me.


 
The difference between a white box and a Dell workstation is that a Dell will run with the case 3/4 full of dust.
 
Oh no, not for a client build. Personally? Maybe an 8700K or possibly a 9700K, but those CPUs run great already under good cooling.

I got all the parts in and am waiting for the SSD tomorrow; I will get it built over the weekend and will throw some pics and benches up.

I keep meaning to document these builds for the HardOCP community just for fun. I really wish I had taken pics of the dual AMD Epyc build I just completed. That was a $20K workstation. One of these days I will get around to a live build / video.

I have limited work space so that's another reason I do not do more to document builds.

I have a dual-Xeon 256GB ECC build coming up; I'm getting those parts in next week. Trying out a new Supermicro rackmount stand case that looks cool.

A lot of my custom-built workstations go into very informal work areas, mostly into the homes of engineers who work from home, but I also have several in offices. All my guys know that I'm a small player and that when there are issues, we have to go through the warranty process, but at the same time, all of these guys have backup systems that I also maintain - systems I've built as well, in most cases.

Of the 50+ workstations I've built over the past 7 or 8 years, 99.9% of my problems have been software-related. Second would be the client wanting more memory, a larger SSD and/or additional storage. I never have hardware failures, ever, and I do find that odd. Wait - I take that back. I had a lot of SSD failures, mostly cheap 60GB drives I was occasionally putting in 5 or so years ago. Nearly all of them died, inside or outside warranty. SSDs are a lot more robust these days.

I've bought a few refurbished workstations and they honestly look very well built. I'm sure a lot of these issues are employees digging around in the cases and hardware being mishandled - general BS that causes problems.

While I do float around $4K to $6K a month, I can't just run out and pop in $2K and $4K CPUs or $1K video cards on a whim. I know some of you are wondering how I manage this. It's really not an issue for me or the clients. I did have to cut a vacation short by 2 days a few years ago and come back home due to serious issues with a new $9K workstation and the client's deadline, but that's a rare event for me.


Out of curiosity, is a case like the S340 a bad choice for cooling such a setup?
 
I wonder how this will turn out.

Only an 850W PSU? I would certainly go with a Corsair AX1500i or AX1600i, simply because they are the best - in fact, I went with the AX1500i for my PC for various reasons. It works comfortably, more efficiently, and with less heat at about 50% load.

AIO cooler? I would not go with anything but a custom loop + a monoblock for this build. Motherboard: Asus Rampage VI Extreme + EK monoblock, or a Heatkiller Pro + VRM block - alternatively, one of the Asus X299 WS boards with active VRM cooling.

The 7980XE can draw A LOT of power at 4.5 GHz (i.e. 500W), and this thing will be working 10 hours at full load.

I don't know... maybe I am wrong, but if the budget is not an issue I would only choose the best components if it were me - hell, I have got all of the above for my personal build, plus a huge external radiator with 9 x 140mm fans, because I want silence AND performance.
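For a rough sense of PSU headroom, a small sketch using the ~500 W CPU figure above, the P4000's 105 W rated board power, and a guessed 75 W for everything else:

```python
# Rough system load vs. PSU size. The 500 W CPU figure is the worst case cited
# above; 105 W is the Quadro P4000's rated board power; 75 W for everything
# else (drives, fans, pump, board, RAM) is a guess.
CPU_W, GPU_W, MISC_W = 500, 105, 75
load = CPU_W + GPU_W + MISC_W
for psu_w in (850, 1500):
    print(f"{psu_w} W PSU -> ~{load / psu_w:.0%} load at ~{load} W draw")
# 850 W  -> ~80% load: workable, but well past the ~50% efficiency sweet spot
# 1500 W -> ~45% load: where efficiency and fan noise are most comfortable
```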
 
What is he cooling it with? Air? ...Nah, you need an awesome block, a huge rad, and I want to see one of those stooopid pumps at 800 L/h+.
 
IDK, Linus just built an $11K system managing to get all cores on an i9-7980XE to 4.8 GHz. Granted, that system was seriously over the top.

Not all i9-7980XEs are created equal... that, and I am not really sure about long-term stability and CPU endurance under those settings. Did they mention CPU temps under this overclock? The amount of watts and heat this thing produces after a certain point is outrageous. That was the reason I got an i9-7940X instead, in hopes of clocking it higher for better single-core performance in the various Adobe applications I am using. Anyway, I have yet to fire it up; the last parts are coming next week, so I will know first hand whether my decision was wise. There is always the chance that I have a "lemon" CPU... we'll see.
 
Not all i9-7980XEs are created equal... that, and I am not really sure about long-term stability and CPU endurance under those settings. Did they mention CPU temps under this overclock? The amount of watts and heat this thing produces after a certain point is outrageous. That was the reason I got an i9-7940X instead, in hopes of clocking it higher for better single-core performance in the various Adobe applications I am using. Anyway, I have yet to fire it up; the last parts are coming next week, so I will know first hand whether my decision was wise. There is always the chance that I have a "lemon" CPU... we'll see.

He knows how to build cool stuff and run it; I trust what he says. If funds were no issue, oh, that would be fun to put together.
 
You would not build for me!

Overclocking, a Corsair AIO, no ECC, no RAID??? That is no workstation; it's something else, but not a workstation. It's too slow for a gaming machine and doesn't have enough integrity for a WS...

ECC RAM, a better PSU, more RAM, proper RAID (no gimmick RAID), a proper DIY loop with a LARGE rad...

Workstations look different, IMHO. Sorry.
 
... That was the reason I got an i9-7940X instead, in hopes of clocking it higher for better single-core performance in the various Adobe applications I am using. Anyway, I have yet to fire it up; the last parts are coming next week, so I will know first hand whether my decision was wise. There is always the chance that I have a "lemon" CPU... we'll see.

I am really curious about this 7940X of yours. You are like the only person on the internet that I know of who has one. I think Newegg has 3 reviews of them.
 
You would not build for me!

Overclocking, a Corsair AIO, no ECC, no RAID??? That is no workstation; it's something else, but not a workstation. It's too slow for a gaming machine and doesn't have enough integrity for a WS...

ECC RAM, a better PSU, more RAM, proper RAID (no gimmick RAID), a proper DIY loop with a LARGE rad...

Workstations look different, IMHO. Sorry.

100% agree here, sorry OP. This build comes off to me as more of an engineering toy than a dedicated workstation I'd trust my business on, IMO. Especially with threaded workloads running 10+ hours, I'm not sure why dual Xeons on a Supermicro board were not opted for (personally, we use the X10DAi here at work - zero issues).

No ECC RAM; I'd have opted for at least a 1200W PSU for more headroom; no RAID 1 or 10 config for either the OS/software drives or the data drives (I've had HW RAIDs save my ass before at work - gotta love quick-swap rebuilds, and screw mobo RAID); and I'd be wary of running an AIO on a business-critical machine.

No offense meant, just not the route I'd have gone.

Also, regarding the initial post, we have a lot of engineers/designers here at work running the SpaceNavigator Pro mouse, and every one of them absolutely loves it.
 
I am really curious about this 7940X of yours. You are like the only person on the internet that I know of who has one. I think Newegg has 3 reviews of them.

This must be the most time-consuming build I've ever had. It has taken me 3+ months, and I am still gathering parts for it.
It is all coming together soon, though.

I am expecting the final parts for this build, and the damn order is stuck because they are waiting for 2x Bitspower 90-degree fittings to get back in stock. And this order contains my monoblock for the i9-7940X (the EK Asus Rampage VI Extreme monoblock), so I can't continue without it.

I will make a thread about this as soon as I get notification that the order has finally shipped.

And LOL about being the only person with an i9-7940X - there must be more, but the truth is that it's not that popular.
 