The Z Build 2.0

Oops!

Power supply switch flipped on, and we have a Christmas tree!

PXL_20201227_030934793.NIGHT.jpg


(I hope for the love of God those blinding things can be disabled in the BIOS. I don't want to install any Asus LED control software just to turn shit off.)

Lol wut. Even the audio ports emit more light than the alien spaceship in Close Encounters of the Third Kind...

I definitely just whistled the "5 notes" to myself.

PXL_20201227_031312800.NIGHT.jpg




Now to see if it actually posts!
 
PXL_20201227_031804752.jpg


Success!!

After one more trip to Micro Center for the damn M.2 screws and risers, I'll have my system back tomorrow :)

Need to flash the latest BIOS.

The Asus webpage says: "Before running the USB BIOS Flashback tool, please rename the BIOS file (Z2EA.CAP) using BIOSRenamer."

Anyone know if I need a special BIOS renamer tool, or if this is just a matter of renaming the firmware file?
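From what I can tell, BIOSRenamer is just a convenience tool: USB BIOS Flashback scans the stick for a fixed short filename (Z2EA.CAP on this board), and the tool renames the downloaded firmware to match. Any ordinary rename should do the same job; the downloaded filename below is made up for illustration:

```python
import os

# Stand-in for the firmware file as downloaded from Asus (name is hypothetical).
open("ROG-ZENITH-II-EXTREME-ALPHA-BIOS.CAP", "w").close()

# USB BIOS Flashback looks for a fixed short name on the stick; renaming the
# file to it appears to be all the BIOSRenamer tool actually does.
os.rename("ROG-ZENITH-II-EXTREME-ALPHA-BIOS.CAP", "Z2EA.CAP")
```

If memory serves, the stick also needs to be FAT32-formatted with the file sitting in the root directory.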
 
So, the good people at Micro Center comped me a $14.99 M.2 screw, standoff, and heatsink kit to make up for the fact that the motherboard was missing the screws.

I wasn't sure they would, but it never hurts to ask.

Originally I was going there for just the screw/standoff kit, but while the website said they had two in stock, no one could find them. We did find a similar kit with the same screws and standoffs, plus a heatsink.

I installed my secondary M.2 drive on the bottom (in the third M.2 slot) and left the second slot open so I can more easily add a drive in the future.

There is no space for a heatsink between the board and the case, but I think if I put thermal tape on it, it will contact the motherboard tray and serve as a heat spreader.

It's tough to measure, but I think it will work.

PXL_20201228_002544398.jpg


It will be really close. Either it will make contact and lower the temp a few degrees, or it will be a TINY bit too short and will actually act as an insulator.

PXL_20201228_003627564.jpg
 
Alright, we are up and running, but I just poked into the BIOS.

Is it just me, or do these stock voltages look absolutely insane? I reloaded "optimized defaults" just to be sure, and they are still very, very high.


PXL_20201228_025700514.jpg
PXL_20201228_025254154.jpg


(Please forgive the dirt on the spare monitor from the basement)
 
Looks like I was worrying for no reason. According to AMD Robert on Reddit, Threadrippers can burst all the way up to 1.55V at stock settings. Seeing this in the BIOS is normal, and on the desktop at idle I am seeing anywhere from 1.05V to 1.2V.

I'm a little bit jumpy after being forced to RMA two of these in a year.
 
What caused you to lose two TRs? My 3960 has been trouble-free... but like you, the first time I saw 1.46V stock vcore in the BIOS, then 1.55V gaming, I was like, holy hell, that's a lot of vcore... How did the two that died clock vs the new one?
 
Well it's up and running, and I just realized I haven't taken any final pictures. Will have to do that ASAP!
 

I will have to look at this. I haven't looked at clocks yet since installing the new one.
 
Here's a quick cellphone pic. I'll have to break out the DSLR for a real pic at another time (one in which I remove the little stickers over the EK logos :p).

Quite a few changes since last time.

build_Dec2020.jpg


1.) The Noctua fans - as previously explained, flaws and all - are back in there. I'm still having speed control issues, but my Aquaero-based workaround lets me live with them.

2.) The motherboard changed from the Gigabyte board that only caused problems to an Asus ROG Zenith II Extreme Alpha. I never thought I'd buy this expensive a board, with all of its Christmas tree lights, but it was the only one I could get immediately locally, and I was done waiting.

3.) The plumbing has changed significantly.

3.1) There are now two loops sharing the same reservoir, one dedicated to cooling/radiators and one dedicated to the hot blocks, with the coolant mixing in the reservoir - an idea I had originally toyed with in this thread. The motivation is that you can make up for a restrictive loop by adding more pumps in series, but it scales poorly: each pump you add contributes less additional flow than the one before it. Instead, if we reduce the restrictiveness of the loop by breaking it into smaller parts, each pump has to work less hard to deliver the same flow.

3.2) To that end, I used my existing loop with its two pumps as the radiator loop, running a tube behind the motherboard tray up to the top radiators and back down on the bottom right. I also added a third pump to feed the blocks only. Flow has significantly increased, though I don't want to quote numbers right now, as I believe I am having flow sensor problems. (I have two, one per loop, and they give very different readings for the same flow...)

If I can ever get that top bracket, which is always out of stock, I plan on replacing my one 420 radiator up top with two 360 crossflow radiators. Then I plan on splitting up the dual pumps, converting the cold side into two loops, one front and one back, so I'll have a three-loop system: two cold sides, one hot side, everything blending in the reservoir.
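For what it's worth, the diminishing-returns argument behind the split can be shown with a toy model - a linearised pump curve meeting a quadratic system curve. All the numbers here are made up for illustration, not measured from my loop:

```python
import math

def flow(n_pumps, k, p_max=38.0, q_max=1500.0):
    """Operating flow (L/h) where n identical series pumps meet the system curve.

    Pump curve (linearised):  dp = n * p_max * (1 - q / q_max)   [kPa]
    System curve (turbulent): dp = k * q**2
    Equating the two gives a quadratic in q; take the positive root.
    """
    b = n_pumps * p_max / q_max
    c = n_pumps * p_max
    return (-b + math.sqrt(b * b + 4.0 * k * c)) / (2.0 * k)

k_full = 4e-4                      # restriction of the original single loop
flows = [flow(n, k_full) for n in (1, 2, 3)]
gains = [b - a for a, b in zip(flows, flows[1:])]
# gains[0] > gains[1]: the second pump adds more flow than the third does.

k_half = k_full / 2                # a split loop is roughly half as restrictive
# flow(1, k_half) > flow(1, k_full): easing the loop beats stacking pumps.
```

With these made-up numbers, going from one pump to two adds roughly 100 L/h, while the third adds only about 70 - and a single pump on the half-restriction loop already roughly matches the two-pump figure.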

3.3) I decided to add a second Aquaero to help control everything, primarily because I already had one kicking around from my fan speed troubleshooting, and it made life easier. The one with the screen up top controls the pumps and fans on the cold side. There is another one hidden behind the motherboard tray that controls the hot-side pump.

3.4) I have added more thermal sensors than I really need for control, just to help me monitor and diagnose. The Aquaero that controls the cold side has four Aquabus Calitemp sensors: cold loop intake (right under the reservoir, before the two pumps), two cold loop returns (one for the front radiator, one for the rear radiator), and one near the reservoir for the intake to the hot side. The only one actually used for control is the hot-side intake sensor. My theory here is that it is the job of the cold side to provide the hot side with cool coolant; if the hot-side intake is too hot, the cold side isn't working hard enough. Both pumps and fans are controlled off this sensor, with the pump speed revving up half a degree below the fans.

The hot-side Aquaero has two thermal sensors: one intake right before the pump (a duplicate of the one near the reservoir, but the two Aquaero units can't share the same sensors, so...) and the hot-side return temperature sensor. The only sensor used for control on this side is the return sensor. The theory here is that if the return coolant is hotter than expected, the coolant is sitting in the blocks too long and heating up too much, so the hot-side pump speed is too slow.
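The control scheme on both Aquaeros boils down to a temperature-to-duty curve per channel. Here's a rough sketch of the cold-side arrangement, with the pumps ramping half a degree ahead of the fans - the setpoints and duty limits below are illustrative stand-ins, not my actual curve values:

```python
def duty(temp_c, start_c, full_c, min_pct=20.0, max_pct=100.0):
    """Linear ramp from min duty at start_c up to max duty at full_c."""
    if temp_c <= start_c:
        return min_pct
    if temp_c >= full_c:
        return max_pct
    frac = (temp_c - start_c) / (full_c - start_c)
    return min_pct + frac * (max_pct - min_pct)

# Both cold-side channels key off the hot-side *intake* sensor,
# with the pumps starting their ramp 0.5 C before the fans do.
FAN_START, FAN_FULL = 31.0, 36.0

def pump_pct(t):
    return duty(t, FAN_START - 0.5, FAN_FULL - 0.5)

def fan_pct(t):
    return duty(t, FAN_START, FAN_FULL)
```

The half-degree offset means that at any temperature inside the ramp the pumps are always running a bit harder than the fans, so extra flow gets tried before extra noise.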

4.) This board has more addressable RGB lights and headers than you can shake a stick at. I'm not so much into that, so I disabled it all and run it in stealth mode. I was pleasantly surprised that this could be done in BIOS without installing any software. I decided to leave the OLED screen on. While I initially thought it was a silly and excessive idea to have that screen there, it's actually very useful for diagnosis and a quick glance temperature readout when running. I'm still not sure why it needed to be OLED though. A cheaper screen tech would have done just as well.

Here is an example of the OLED screen during POST and boot. Once POST is over, it stops showing diagnostic codes, does a quick ROG logo animation, and then starts displaying CPU temp:



This - at least - is the default behavior. I'm sure it is configurable if I decide to install the Asus software, but I don't feel like doing that.

I also added a single white 5mm LED to a clear end cap at the bottom of the reservoir. This illuminates the reservoir just enough that I can easily see coolant levels. I was concerned it would be too bright, so I soldered the LED to a 4-pin fan connector, plugged it into the hot-side Aquaero, and use the fan header in voltage control mode to alter the brightness. Turns out I didn't need to worry. Due to the dark tint on the side windows, it isn't too bright, even in its brightest mode.
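If anyone wants to copy the idea: a bare LED on a fan header needs a series resistor. The back-of-the-envelope numbers, assuming a typical white LED (~3.2V forward drop, run at ~20mA - your LED's datasheet values may differ):

```python
def series_resistor(v_supply, v_forward=3.2, i_a=0.020):
    """Series resistance (ohms) to limit a typical white LED to ~20 mA."""
    return (v_supply - v_forward) / i_a

r_full = series_resistor(12.0)   # ~440 ohms at the full 12V header voltage
```

In voltage control mode the Aquaero lowers the header voltage, which lowers the current through that same resistor - hence the dimming.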

If I felt like having fun, I could set a fan curve and have the res get brighter the hotter the coolant is, or something like that, but for now I am leaving it as is.

I think that pretty much explains what I have done.

Just hope this CPU lasts longer than the last two did.
 
Looks like I'm a bit too late, but next time, just toss your M.2 drives onto the DIMM.2 card. They get great airflow there and are very easy to access. I believe those are just direct access PCI-E lanes without any proprietary funny business other than the physical shape of the card. I had good results with this on my Asus X299 build.

For your fans, do you have to run them as PWM? My Corsairs work with all three methods that the Aquaero supports (Power/Speed/PWM). If the PWM is what's wonky on the Noctuas, try one of the others and see what happens.

I second your dislike of the Corsair 1000D. I thought it would be a gigantic case that could fit everything and be simple to work in. It almost is, except nearly every critical dimension ends up short by just a couple of mm. The useful capacity would probably double with just a few small tweaks.
 
So, I finally got around to doing some CPU stress testing.

At stock settings I ran 48 Prime95 threads in "mixed" mode for a couple of hours.

Ambient temp was 74°F (23°C).

Once heat-soaked, the coolant held at my setpoint of 31°C, with the fans stabilizing at about 600rpm (out of a max of about 1800rpm).

Once heat-soaked, Tdie reported as varying between 48°C and 52°C depending on which test Prime95 was running at the time, which isn't half bad for a Threadripper, and is much better than what I recorded before, confirming something was definitely wrong with the system. (Either that, or my new "pre-heat and smear the paste on thin with a nitrile glove" method is much more effective than my old methods.)

In modern games the coolant also holds at my 31°C setpoint, with the Pascal Titan hitting no higher than 38°C core temp while boosting at max, which ranges between ~2030 and 2080MHz. The fans go a bit higher during gaming, though, hitting anywhere between 850 and 1100rpm.

The interesting part is that in my old case, a Corsair 750D, those fan speeds would have been audibly bothersome to me, but in the 1000D they are much more mellow. Something about the case just makes the fan noise less noticeable.

I still have some kinks to work out, but in general I am very happy with the results.
 
Alright. I'm starting to get pissed at Noctua again.

The workaround mentioned previously DID work pretty well with winter temps, but in the summer there just isn't enough control resolution to get a good outcome. They keep going back and forth between too slow and too fast, alternating between near-silent and LOUD.

This has me casually looking at Corsair ML120 Pro and ML140 Pro online. I bet they would do the job much better seeing that this case was pretty much designed with these fans in mind, so you must be able to string them together with good results.

I'm a little hesitant to spend another $600 on fans, draining the whole loop and replacing them again, but....
 
I'm still curious whether there is simply not a strong enough pull-up on the PWM line. Just attaching a resistor from the PWM signal to the +5V line from the PSU would fix it in that case. I would try a 2k to 20k resistor. Worst case, it might damage the fan.

I very much doubt it could damage anything on the controller side, since that circuit has to be designed to allow a variety of fans, some of which WILL pull up to 5V.
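The current budget behind that suggestion is easy to sanity-check: with the usual open-drain PWM output, the pin only has to sink the pull-up current while it drives the line low. Resistor values are from the post above; the ~5mA figure is my recollection of the Intel 4-wire fan spec, so treat it as approximate:

```python
def pullup_sink_ma(v_pullup=5.0, r_ohms=2000.0):
    """Current (mA) an open-drain PWM pin sinks through an external pull-up."""
    return v_pullup / r_ohms * 1000.0

low_end = pullup_sink_ma(5.0, 2000.0)     # 2k pull-up  -> 2.5 mA
high_end = pullup_sink_ma(5.0, 20000.0)   # 20k pull-up -> 0.25 mA
# Both sit comfortably under the ~5 mA sink the 4-wire fan spec allows.
```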
 
I'm not sure how much airflow you need, but it's worth noting that there are a few versions of the ML120 Pro with different flow rates. The retail RGB version has a max RPM of about 1600, whereas the RGB version bundled with the AIO cooler will spin to 2500rpm. The difference in max airflow is substantial: 47.3CFM vs 75CFM. The versions with single-color LEDs and with no LEDs are all the 75CFM version.

On my CPU loop (360x60mm in the top rear position), the retail ML120 RGB couldn't quite keep up but the 75CFM version seems to do fine.

Oh, and remember that the rear exhaust can take two 120s or only one 140. In order to fit 2x140 like on my build, you have to machine down the fans by ~2mm.
 

Yeah, I noticed there were different versions before ordering. I got the ones with no LEDs at all. They seem to be the 75CFM versions. Thanks for the heads up.


Oh yeah, I already modded mine to fit two 140mm fans in the back. I did it slightly differently than you, drilling custom mounting holes, but the result is the same.

Then again, the plastic on the ML140s is undoubtedly different than that on the Noctuas, so I may have to use your approach anyway.
 
Interesting. Those Noctuas must just have less dead plastic around the frames than the Corsairs. I found that with the ML140s, the problem was a lack of physical clearance between two lips of the 1000D. The mounting holes were all fine and wouldn't have needed any modification if there had been clearance for two unmodified fans. This is what made it extra clear to me that this was an unintentional screwup by Corsair - that, plus the fact that they had to redo all of their marketing materials, which originally said 2x140 could fit there.
 
Exciting times. I got a new GPU, the XFX Speedster Zero WB 6900 XT, which I posted about here.

I'm going to integrate it into the existing build. I hope fit won't be a problem. The G1/4 ports on this thing are not in the exact same place as on the existing Titan X with its EK full-cover block, and I have some tight-fitting QDCs.


Old block hole location:

510127_64697_IMG_20160816_204739.jpg



New block hole locations:

513386_PXL_20211006_2201065512.jpg


Referenced off the PCIe connector, the right hole on the old one sits sort of in between the two holes on the new one, but probably closer to the left. Hopefully I can make that work without too much surgery. I really don't have the time or the inclination right now.

Here is the tight fit I am referring to:

406725_PXL_20201202_225903179.jpg


I may have to drill new holes and move the pump a little bit, which would be a real bummer. I'm hoping that won't be the case.

Time will tell!

Anyway, right now I am using my old pump/res to clean out the block before use.

Given EKWB's history with corrosion on nickel-plated blocks, I didn't want to go too aggressive with vinegar, so I used some dish soap and distilled water, and I am pumping it through the loop.

PXL_20211007_024118371.jpg


The new house has a shitty old in-law kitchen in the basement (which we mostly use for storage and which I want to knock out and turn into something more interesting), but for now it makes for a good water-loop cleaning station.
 
Here is a side-by-side image of the old GPU (Pascal Titan X with EK full-cover block) vs. the new GPU (XFX Speedster Zero 6900 XT EKWB version):

PXL_20211010_233628098.jpg


I really wish there were a published standard for G1/4 port locations, because this new GPU is just different enough that it is going to make me take things out and drill new holes.

Previously, the intake from below was in the right port. Its front edge is ~11cm from the PCI slot plate.

PXL_20211010_233810595.jpg



On this new GPU, the closest port is the left one and it is about 10 to 10.5cm from the PCI slot plate.

PXL_20211010_233827904.jpg


(pardon my metric, that's just the side of the ruler that wound up being the most convenient to use)

Height is also an issue.

To illustrate the problem, here I attempt to hold the old GPU over the new one:

PXL_20211010_234712179.jpg



If you try to force it, this is what happens:

PXL_20211010_235049005.PORTRAIT.jpg


That's obviously not good...

According to my measurements, I need to go left by ~5mm and out by ~17mm.

The left is a pain but doable. The out by 17mm is not doable, as the PSU shroud I am attaching the pump to kind of just ends.
 
So, I drilled new holes, ~5mm further left and as close as possible to the front wall of the PSU shroud.

My drill bit was too short for my drill press to reach past the front wall, so I had to use the hand drill, and honestly, despite using the center punch, I kind of made a mess of it. Making a hole is easy. Making a hole right next to a bend or another hole is not.

Luckily I was able to hide it pretty well under the EK mount.

I was obviously never going to get the additional 17mm forward I required, so after some thinking I realized I could rotate the pump, moving its outlet forward somewhat and making a not-horrible bend.

PXL_20211011_022201430.jpg


I'm not thrilled with this solution. It may not have been pretty before, but it is less so now. At least it works!
 
Next steps:

I flashed the BIOS on the Asus ROG Zenith II Extreme Alpha from 1303 to 1502 in order to enable Resizable BAR.

After that I did a baseline bench with all default GPU settings. Being an old-school [H]er, I kind of frown on canned benchmarks, but there is literally a shit-ton of 3DMark Time Spy data out there to compare to, so in order to get an idea of how I'm doing, I've decided to use it. I only have the free version; I decided the paid version was not worth $30. Maybe I'll get it when it goes on sale.

timespy_baseline.png


Nothing to write home about, but better than average for a 6900xt. The interesting stuff will come after I spend some time trying to overclock it.

The CPU score seems a little low, but I have read elsewhere that 3DMark struggles with Threadripper's SMT, and the score goes up if you disable it. I don't want to do that, so I'll live with the score as is.
 
I have heard a good deal of praise for AMD's current drivers. I'm not sure I agree.

I have to admit, the Adrenalin release is much better than the last ones I had experience with (Catalyst). From my limited testing thus far, they seem very stable, but oh my god, the bloat is ridiculous. Why do my GPU drivers have a full streaming suite and a web browser built into them? It makes absolutely no sense. I wish there were a debloated version. Game profiles make sense. Streaming features, overlays, and web browsers do not.

Also, the 3D rendering options seem more limited than what I am used to with Nvidia's settings. Maybe there is an advanced tab somewhere I just haven't found yet.

I will say I am very happy that I am now able to see POST and the BIOS on my main screen. With my previous Nvidia GPU, I always had to view the BIOS on my side screens at a 90-degree angle.

Anyway. It's a work night. This is all I have time for tonight.
 
I've come to hate my 1000D case... everything is slightly off when it comes to fitting things.
 