Anyone order Threadripper 2 yet? User feedback is highly valuable.


Did you end up getting that board?

Everything finally arrived, including that board. Did a fresh Win 10 install, and it blue screens consistently when I try to do a Windows Update during the install via wireless. I just skipped that step and did the update after Windows finally installed, but I don't like reproducible crashes. Wondering if it's the motherboard, the memory, or just bad wireless drivers in combination with the Threadripper 2.
 
Did you end up getting that board?

Everything finally arrived, including that board. Did a fresh Win 10 install, and it blue screens consistently when I try to do a Windows Update during the install via wireless. I just skipped that step and did the update after Windows finally installed, but I don't like reproducible crashes. Wondering if it's the motherboard, the memory, or just bad wireless drivers in combination with the Threadripper 2.

There is a bug in the default wireless driver that causes these crashes.
 
Build:

2950x TR2
Wraith Ripper HSF
GIGABYTE X399 AORUS XTREME
64GB G.Skill 14-14-14-34 F4-2933C14D-32GTZRX
EVGA GeForce RTX 2080 Ti XC ULTRA
1TB Samsung 960 Evo
Dark Base 700
EVGA SuperNOVA 1000 G3

All the parts are here but for the 2080Ti and the Wraith Ripper (which, if it doesn't go on sale by the time the 2080Ti ships, I'll probably replace with an AIO water cooler. No effing way I'm stalling the build just for an HSF, even though that HSF looks nifty.)

I went with the AORUS XTREME because I may one day fill it with 4x2080Ti. The MSI Creation doesn't fit 4 GPUs simultaneously.

Although it's using gaming parts, and I spent a few hundred extra bucks to go full RGB, the rig is for ML. That's why it's not running a 2990WX - for my workload, it most likely won't be faster than the 2950X, and might be slower.

These 2080Tis. Woo, boy. They may be overpriced for gaming, but the 2080-series cards are the most cost-efficient ML cards on the market. A 2080Ti has about 90% of the tensor cores, bandwidth, and RAM of a Titan V, but you can buy two and a half 2080Tis for the price of one Titan. You could buy a $69,000 DGX workstation from Nvidia, or you could now build almost the same thing for ~$10k out of gaming parts. (But that DGX looks sweet, and has a custom coolant loop that makes it quiet, so that's really nice.)
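
A quick back-of-envelope sketch of that comparison (the spec and price figures below are approximate, from memory, so treat them as assumptions):

```python
# Rough 2080 Ti vs Titan V comparison for ML; all figures are approximate
# launch specs and list prices, and may be slightly off.
cards = {
    "RTX 2080 Ti": {"tensor_cores": 544, "bandwidth_gbs": 616, "vram_gb": 11, "price_usd": 1199},
    "Titan V":     {"tensor_cores": 640, "bandwidth_gbs": 653, "vram_gb": 12, "price_usd": 2999},
}

ti, tv = cards["RTX 2080 Ti"], cards["Titan V"]
for spec in ("tensor_cores", "bandwidth_gbs", "vram_gb"):
    print(f"{spec}: {ti[spec] / tv[spec]:.0%} of a Titan V")   # ~85%, ~94%, ~92%
print(f"2080 Tis per Titan V dollar: {tv['price_usd'] / ti['price_usd']:.1f}")  # ~2.5
```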

That 2080Ti is the only one I've been able to snag a preorder for; it's a 2.75-slot card, and I'm worried that it's too fat. I *think* one fat GPU hanging off the bottom slot will fit. It sucks that all these parts are gathering dust for at least two more weeks...
 
Similar system

Build

2950x TR2
NZXT Kraken x72
Gigabyte X399 AORUS XTREME
64 GB G.Skill
Titan Xp (just stripped it out of my other computer; going to upgrade to 2x 2080 Tis when they arrive later)
Cooler Master Cosmos C700P
1 TB Samsung 970 Pro
512 GB Samsung 950 Pro
3x Samsung 512 GB 850 Pros
6 TB WD Black
Corsair AX1200i

My old system was an i7 5960x.

So, apparently one of my Titan Xps is busted. I'm in the process of getting it replaced via warranty, so I can't compare it with my old system yet. I do have some old numbers from the Battlefield V Open Alpha, which ran better on the 5960x, but I'm unsure if that was using the same graphics preset as the Open Beta. Same with Monster Hunter World: MHW ran @ 40 fps on 1440p max settings on the i7, and runs @ 70 fps on the 2950x at the same settings, but I'm unsure if that's due to a patch. So there's no way to really compare them. Diablo 3 also seems to run a little worse than on my Intel box, but it's still very playable.

Anyway, outside of the install issue, it's run fairly well. I have to say, I love the way you install AMD CPUs now. My last AMD was an Athlon XP back like 15 years ago, and I destroyed more than my fair share due to pins bending. This new system of sliding it into place is pretty awesome in my opinion, and Intel could learn a lot from that.

My main complaints have to do with the components, though, not the processor. The Cooler Master case comes with 2 HDD trays and 1 SSD tray, and extra trays are unavailable via their store, so adding drives means spending more money on a workaround. Right now I have sort of a hackish solution, and I would like to resolve this in a better way.

My main issue is the NZXT Kraken. I like silence. This particular AIO, while its decibel level is fine, has a very high pitch, which is annoying. Plus, my CPU temperature fluctuates like crazy: it goes from the mid 30s to the mid 50s, only to drop down to the mid 30s again, all in the span of half a minute. I used a Corsair H100i on my 5960x, and that stayed at a rock-solid 30 degrees.

Premiere definitely renders faster, and Visual Studio is a bit faster. Not significantly faster, but anything that can shave off time... I'm probably the wrong type of user for Threadripper, as it seems more like a sidegrade: certain things run faster, certain things run slower, but that seems to be the case with any CPU now, be it Intel or AMD.


I guess my main concern is the cooler. I'm looking for a decent AIO, but supposedly the Enermax has quality control issues.
 
Build:

2950x TR2
Wraith Ripper HSF
GIGABYTE X399 AORUS XTREME
64GB G.Skill 14-14-14-34 F4-2933C14D-32GTZRX
EVGA GeForce RTX 2080 Ti XC ULTRA
1TB Samsung 960 Evo
Dark Base 700
EVGA SuperNOVA 1000 G3

All the parts are here but for the 2080Ti and the Wraith Ripper (which, if it doesn't go on sale by the time the 2080Ti ships, I'll probably replace with an AIO water cooler. No effing way I'm stalling the build just for an HSF, even though that HSF looks nifty.)

I went with the AORUS XTREME because I may one day fill it with 4x2080Ti. The MSI Creation doesn't fit 4 GPUs simultaneously.

Although it's using gaming parts, and I spent a few hundred extra bucks to go full RGB, the rig is for ML. That's why it's not running a 2990WX - for my workload, it most likely won't be faster than the 2950X, and might be slower.

These 2080Tis. Woo, boy. They may be overpriced for gaming, but the 2080-series cards are the most cost-efficient ML cards on the market. A 2080Ti has about 90% of the tensor cores, bandwidth, and RAM of a Titan V, but you can buy two and a half 2080Tis for the price of one Titan. You could buy a $69,000 DGX workstation from Nvidia, or you could now build almost the same thing for ~$10k out of gaming parts. (But that DGX looks sweet, and has a custom coolant loop that makes it quiet, so that's really nice.)

That 2080Ti is the only one I've been able to snag a preorder for; it's a 2.75-slot card, and I'm worried that it's too fat. I *think* one fat GPU hanging off the bottom slot will fit. It sucks that all these parts are gathering dust for at least two more weeks...

I'm in the planning stages of a similar build, but I do a bunch of Spark data engineering and Spark MLlib work to prep and stream the data, which is heavily CPU-based and doesn't support GPU acceleration. So I'm debating a 2990WX with my current 1080 vs a 2950X with a 2080. Cost will be similar, and as 70-80% of my time is spent on the data engineering and processing, I may go 2990. Of course, a lot of that time is coding rather than processing time, so it's not a straightforward decision.
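
One way to frame it is an Amdahl-style weighting of where the compute time actually goes. A minimal sketch; both speedup figures below are made-up placeholders, not measurements:

```python
# Amdahl-style sanity check for the 2990WX vs 2950X + 2080 question.
# The speedup numbers are hypothetical - plug in your own estimates.
def overall_speedup(cpu_fraction, cpu_speedup, gpu_speedup):
    """Overall speedup when cpu_fraction of compute time is CPU-bound."""
    return 1.0 / (cpu_fraction / cpu_speedup + (1 - cpu_fraction) / gpu_speedup)

# e.g. 75% of compute time in Spark, 2990WX assumed 1.6x faster there,
# GPU side unchanged (keeping the 1080):
print(f"{overall_speedup(0.75, 1.6, 1.0):.2f}x overall")  # ~1.39x
```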
 
I'm in the planning stages of a similar build, but I do a bunch of Spark data engineering and Spark MLlib work to prep and stream the data, which is heavily CPU-based and doesn't support GPU acceleration. So I'm debating a 2990WX with my current 1080 vs a 2950X with a 2080. Cost will be similar, and as 70-80% of my time is spent on the data engineering and processing, I may go 2990. Of course, a lot of that time is coding rather than processing time, so it's not a straightforward decision.
Or wait for the 2970WX reviews and save money. I hear it's estimated to be about 90% of the performance, though I don't know from where.
 
Similar system

Extremely similar! Thank you for the heads up on the Kraken; I'm in the market for an AIO so it's good to know what to avoid. If you find a good one let us know that too.


What I've heard is that the 2990WX will do great if the load is CPU-bound and not constrained by memory latency or bandwidth. All training problems are constrained by bandwidth and latency, but if Spark is doing preprocessing of the data set, that could mean anything. So it'll be very specific to whatever your workload actually does.
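
A crude way to get a feel for whether a box is bandwidth-bound is a STREAM-style probe. A minimal sketch (the array size is an arbitrary assumption; shrink it if RAM is tight):

```python
# Crude STREAM-style probe of effective memory bandwidth.
import time
import numpy as np

N = 200_000_000                  # three ~1.6 GB float64 arrays
a, b = np.random.rand(N), np.random.rand(N)
c = np.empty(N)

t0 = time.perf_counter()
np.add(a, b, out=c)              # touches all three arrays: ~3 * N * 8 bytes
dt = time.perf_counter() - t0
print(f"effective bandwidth: {3 * N * 8 / dt / 1e9:.1f} GB/s")
```

If your workload's throughput plateaus near that number as you add threads, it's bandwidth-bound rather than CPU-bound.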
 
I would like to upgrade my Intel box (I hate Intel; if I can switch to AMD I am more than happy)
to a Threadripper 2950X, but it seems that X499 is on the way.

https://www.guru3d.com/news-story/amd-ryzen-threadripper-x499-motherboards-might-launch-q1-2019.html

I really want PCI Express 4 to be more future-proof, but I don't understand why AMD would launch X499 in January.

Shouldn't we see Ryzen gen 3 before Threadripper 3, or will this be a new chipset for Threadripper 2?

You're never going to get future proofing. That's just the way it is. When you need to upgrade, upgrade. If you don't need to upgrade, don't upgrade.

Because come Jan. 2019, if the X499 is launched, I guarantee that in Jun-July of 2019, the next big thing will be out. And after that, a few months later, the next next big thing. And so on.
 
What are you doing that PCIe 3.0 is a bottleneck, or even a potential bottleneck? https://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_1080_PCI_Express_Scaling/25.html It's not like PCIe 3.0 x16 is even being used... the performance gain from 3.0 x4 to x16 is minuscule, and x8 vs x16 is the same.

3DMark shows a 200-point difference between PCIe gen 3 x8 and x16.

You're never going to get future proofing. That's just the way it is. When you need to upgrade, upgrade. If you don't need to upgrade, don't upgrade.

Because come Jan. 2019, if the X499 is launched, I guarantee that in Jun-July of 2019, the next big thing will be out. And after that, a few months later, the next next big thing. And so on.

But a new PCI Express revision is released every 4 years, plus or minus, and this is the only "big thing" that has been announced for the various platforms.
 
3DMark shows a 200-point difference between PCIe gen 3 x8 and x16.



But a new PCI Express revision is released every 4 years, plus or minus, and this is the only "big thing" that has been announced for the various platforms.
..... yet.

The downside to the whole idea of future-proofing is that you are trying to read a crystal ball that is full of clouds. Very few people in this industry can accurately say which new or upcoming technologies will be relevant and which won't. As it stands today there are no hardware applications that come anywhere near saturating PCIe v3.0. I for one don't see anything in the near-to-mid future that will saturate PCIe v4.0 -- except maybe the IFIS links of AMD's Infinity Fabric (which may be a driver for including PCIe 4.0 in future Zen configurations).
 
What are you doing that PCIe 3.0 is a bottleneck
today there are no hardware applications that come anywhere near saturating PCIe v3.0
Have an NVMe SSD installed? The Samsung 970 Pro is rated at 3.5 GB/s read and 2.7 GB/s write, which is already pretty close to the M.2 / PCIe 3.0 x4 limit (~4 GB/s). Possibly next year's model will saturate PCIe 3.0 x4 in sustained reads and writes.
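For reference, that x4 ceiling falls straight out of the line rate and encoding overhead. A quick sketch of the arithmetic:

```python
# Usable PCIe bandwidth from line rate and encoding overhead.
# Gen3 signals at 8 GT/s with 128b/130b encoding; Gen4 doubles the rate.
def pcie_gb_per_s(gt_per_s, lanes, payload_bits=128, line_bits=130):
    return gt_per_s * (payload_bits / line_bits) * lanes / 8

print(f"PCIe 3.0 x4: {pcie_gb_per_s(8, 4):.2f} GB/s")   # ~3.94 GB/s
print(f"PCIe 4.0 x4: {pcie_gb_per_s(16, 4):.2f} GB/s")  # ~7.88 GB/s
```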
 
3DMark shows a 200-point difference between PCIe gen 3 x8 and x16.



But a new PCI Express revision is released every 4 years, plus or minus, and this is the only "big thing" that has been announced for the various platforms.

Yeah, but then DDR5 is coming in 2020 (with a whole new socket), so you might as well wait for that... repeat ad nauseam. There's always a new "big" technology around the corner to wait for.
 
I would like to upgrade my Intel box (I hate Intel; if I can switch to AMD I am more than happy)
to a Threadripper 2950X, but it seems that X499 is on the way.

https://www.guru3d.com/news-story/amd-ryzen-threadripper-x499-motherboards-might-launch-q1-2019.html

I really want PCI Express 4 to be more future-proof, but I don't understand why AMD would launch X499 in January.

Shouldn't we see Ryzen gen 3 before Threadripper 3, or will this be a new chipset for Threadripper 2?
This is good info if true; more leaks/info should surface as time goes on. My build would be around December anyway, but now maybe next year would be more prudent.
 
I just bought an r5 2600x for fun and so far I'm too t
This is good info if true; more leaks/info should surface as time goes on. My build would be around December anyway, but now maybe next year would be more prudent.

Get your 2950x and then get an x499 board later if there is any benefit at all to them. Probably not a significant benefit to be measured. Threadripper is already top of the line.
 
I just bought an r5 2600x for fun and so far I'm too t


Get your 2950x and then get an x499 board later if there is any benefit at all to them. Probably not a significant benefit to be measured. Threadripper is already top of the line.
I usually keep motherboards for a long time, so getting PCIe 4 could make the short wait worthwhile for long-term use. That's probably the #1 reason I like sticking with AMD. Not sure when DDR5 will be mainstream (probably after Intel adopts it, then AMD).
 
Have an NVMe SSD installed? The Samsung 970 Pro is rated at 3.5 GB/s read and 2.7 GB/s write, which is already pretty close to the M.2 / PCIe 3.0 x4 limit (~4 GB/s). Possibly next year's model will saturate PCIe 3.0 x4 in sustained reads and writes.
I do have 3x NVMe SSDs, each on PCIe v3.0 x4. None of them comes even close to saturating it, and if they did, it would be something you'd only see in specialized benchmarks. You really wouldn't notice the difference, any more than you notice it with the drives I currently have running on PCIe 2.0 x4. Remember, to take advantage of PCIe 4.0 your hardware and drivers would need to be written for the updated protocols, and those don't arrive at the same time the protocol is released. Usually that follows a year or two later, once motherboard makers build in support for it.
 
How is everyone cooling their TR2s? The Wraith Ripper releases Sept 27, a full week after the card ships. And it's not clear, but I don't think the Wraith Ripper can be tied into a mobo RGB controller like RGB Fusion on the X399 Aorus Xtreme. So I am in the market for a quiet cooler, and I'm leaning towards the Dark Rock 4 right now. I care more about noise than overclocking, and while AIOs should be able to run quieter, the pump noise tends to make them suck @ idle.
 
Anyway, some initial gaming testing at last, as my second video card finally got back from an RMA. Everything is at stock on both systems.

System 1:

Intel i7 5960x (3 GHz)
64 GB RAM
512 GB Samsung 850 Pro (OS Drive)
1 TB Samsung 970 Pro (Game Drive)
NVidia Titan Xp

System 2:

AMD 2950x
64 GB RAM
512 GB Samsung 850 Pro (OS Drive)
1 TB Samsung 970 Pro (Game Drive)
NVidia Titan Xp

Monster Hunter World 1440p Ultra (everything maxed out, including volume rendering)

5960x FPS: 40-60 fps
2950x FPS: 60-90 fps

I'm guessing that with all the extra threads the game uses, the 2950x benefits tremendously here.

Diablo 3 1440p (all graphics options on high)
5960x FPS: 180-200 fps
2950x FPS: 70-90 fps

Turning on Game Mode for the 2950x will get me to 130-150 fps. The game's not even using much of the CPU, so this surprised me quite a bit (CPU usage per core doesn't even reach 25%). I was expecting it to be a little slower, but not this much.

The Witcher 3 1440p (all graphics on Ultra)
5960x FPS: 60-80 fps
2950x FPS: 60-80 fps

The 5960x is slightly faster, by 2-3 frames, but otherwise indistinguishable. The one thing I don't like is the game is a bit choppier on the initial load, but that might not necessarily be the CPU. (On the 5960x, it's pretty smooth loading, but on the 2950x, it's in single digit fps for the first 2-3 seconds upon load).
 
Anyway, some initial gaming testing at last, as my second video card finally got back from an RMA. Everything is at stock on both systems.

System 1:

Intel i7 5960x (3 GHz)
64 GB RAM
512 GB Samsung 850 Pro (OS Drive)
1 TB Samsung 970 Pro (Game Drive)
NVidia Titan Xp

System 2:

AMD 2950x
64 GB RAM
512 GB Samsung 850 Pro (OS Drive)
1 TB Samsung 970 Pro (Game Drive)
NVidia Titan Xp

Monster Hunter World 1440p Ultra (everything maxed out, including volume rendering)

5960x FPS: 40-60 fps
2950x FPS: 60-90 fps

Diablo 3 1440p (all graphics options on high)
5960x FPS: 180-200 fps
2950x FPS: 70-90 fps

Turning on Game Mode for the 2950x will get me to 130-150 fps. The game's not even using much of the CPU, so this surprised me quite a bit (CPU usage per core doesn't even reach 25%). I was expecting it to be a little slower, but not this much.

You MUST update to the latest nvidia drivers. The latest drivers removed bugs that penalized Threadripper performance drastically.

Source...https://www.pcper.com/reviews/Graph...-Performance-Issue-Threadripper-2990WX-Tested
 
How is everyone cooling their TR2s? The Wraith Ripper releases Sept 27, a full week after the card ships. And it's not clear, but I don't think the Wraith Ripper can be tied into a mobo RGB controller like RGB Fusion on the X399 Aorus Xtreme. So I am in the market for a quiet cooler, and I'm leaning towards the Dark Rock 4 right now. I care more about noise than overclocking, and while AIOs should be able to run quieter, the pump noise tends to make them suck @ idle.

I just picked up my 2950x from microcenter. I'm gonna use my shit ass ek block for now but I will either wraith rip it or xspc block in a few weeks.

Is there any advantage to the wraith ripper over a waterblock?
 
I just picked up my 2950x from microcenter. I'm gonna use my shit ass ek block for now but I will either wraith rip it or xspc block in a few weeks.

Is there any advantage to the wraith ripper over a waterblock?
Technically, no, other than simplicity. They each have their place, and it'll mostly depend on case selection and how much radiator you want or can fit in your case.

Compared to the EK block specifically, the Wraith Ripper is probably superior, though I can't say by how much.
 
You MUST update to the latest nvidia drivers. The latest drivers removed bugs that penalized Threadripper performance drastically.

Source...https://www.pcper.com/reviews/Graph...-Performance-Issue-Threadripper-2990WX-Tested

That's for the 2990WX; I'm using the 2950X. Even if I run @ 800x600 or 4K, my fps remains the same, so it's CPU-limited somehow. Not quite sure why. Even in Game Mode, the clock was pretty much the same as in Creator Mode. It's 25% slower in Game Mode, 50% slower with all 16 cores. I ran 3DMark just to make sure something wasn't up, and it was coming back faster than the 5960x in the CPU test, albeit I can't exactly compare with anyone right now as "the processor is unknown". Have to give 3DMark some time to update.
 
That's for the 2990WX; I'm using the 2950X. Even if I run @ 800x600 or 4K, my fps remains the same, so it's CPU-limited somehow. Not quite sure why. Even in Game Mode, the clock was pretty much the same as in Creator Mode. It's 25% slower in Game Mode, 50% slower with all 16 cores. I ran 3DMark just to make sure something wasn't up, and it was coming back faster than the 5960x in the CPU test, albeit I can't exactly compare with anyone right now as "the processor is unknown". Have to give 3DMark some time to update.


Gotcha,

On another note, is Monster Hunter World worth buying?
 
Here's a rundown of the 1950X stock vs 2950X PBO comparison. Don't focus on the actual numbers too much - it's a year-old Windows install with about 40 apps running in the system tray and two VMs running - the comparison is just to see the uplifts on offer.

The reason why I think this comparison is relevant is that I never ran my 1950X permanently OC'd; in my mind, losing XFR wasn't worth the permanent 4GHz OC. Now, with the 2950X's PBO, it offers the best of both worlds (I'm still tuning PBO, though; I'm sure I can get more out of it).

[attached: 1950X stock vs 2950X PBO benchmark spreadsheet]
 
Here's a rundown of the 1950X stock vs 2950X PBO comparison. Don't focus on the actual numbers too much - it's a year-old Windows install with about 40 apps running in the system tray and two VMs running - the comparison is just to see the uplifts on offer.

The reason why I think this comparison is relevant is that I never ran my 1950X permanently OC'd; in my mind, losing XFR wasn't worth the permanent 4GHz OC. Now, with the 2950X's PBO, it offers the best of both worlds (I'm still tuning PBO, though; I'm sure I can get more out of it).

[attached: 1950X stock vs 2950X PBO benchmark spreadsheet]

In your Excel spreadsheet, can you do an overall average of all of that data (uplift %)? I'm curious.
 
A 2950X with good RAM, a good cooler, and a case with good airflow is really what most people looking to bump up to TR should get. If you're going to join the big-boy club, don't cheap out on any of those; otherwise you've just stuck your grandma's old bald Sunday tires on your shiny new muscle car.

If all that combined is too rich for your blood, stick to AM4 or whatever flavor of intel kool-aid you're hooked on.

The 2990WX is the nerdy edge case, and Windows will continue to take a dump on it for a while due to its core oddities and Because Windows™, but with boost on the 16-core you can get almost everything you should out of it - spend the grand you save on your GPU instead.

/signed, 2990WX nerd.
 
A 2950X with good RAM, a good cooler, and a case with good airflow is really what most people looking to bump up to TR should get. If you're going to join the big-boy club, don't cheap out on any of those; otherwise you've just stuck your grandma's old bald Sunday tires on your shiny new muscle car.

If all that combined is too rich for your blood, stick to AM4 or whatever flavor of intel kool-aid you're hooked on.

The 2990WX is the nerdy edge case, and Windows will continue to take a dump on it for a while due to its core oddities and Because Windows™, but with boost on the 16-core you can get almost everything you should out of it - spend the grand you save on your GPU instead.

/signed, 2990WX nerd.

I don't think anyone reading the HEDT threads intends to do the opposite of what you suggest.

I'm on my second Threadripper, and I'll be on a third.
 
Anyone running the ASRock X399 Taichi ATX board, or know anyone who is, with a series 2 Threadripper?

I have
Diablo 3 1440p (all graphics options on high)
5960x FPS: 180-200 fps
2950x FPS: 70-90 fps

Turning on Game Mode for the 2950x will get me to 130-150 fps. The game's not even using much of the CPU, so this surprised me quite a bit (CPU usage per core doesn't even reach 25%). I was expecting it to be a little slower, but not this much.

You can try pinning Diablo 3 to a single die and see if that helps. This can be done through Process Lasso or PowerShell; more info can be found here: https://forum.level1techs.com/t/what-is-numa-level-one-techs/132060.

It might clear up the weird FPS issue for that game. If I had a 2950X, I would test it out myself...
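
For the scripted route, a minimal sketch of the same idea using Python's psutil (the process-name prefix is a guess - check Task Manager for the real executable - and logical CPUs 0-15 are assumed to map to the first die on a 2950X):

```python
# Pin all Diablo III processes to one die (logical CPUs 0-15 on a 2950X).
import psutil

FIRST_DIE = list(range(16))  # 8 cores x 2 SMT threads, assumed to be die 0

for proc in psutil.process_iter(["name"]):
    name = (proc.info["name"] or "").lower()
    if name.startswith("diablo"):        # hypothetical name prefix
        proc.cpu_affinity(FIRST_DIE)
        print(f"pinned PID {proc.pid} ({proc.info['name']})")
```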

Now looking to step up to a 2950X in a few, but planning to go air: maybe Noctua, but most likely the Wraith Ripper. Just looking at boards now; I don't know yet if I want to go ATX or EATX.

I still do not trust water, AIO or custom loop. While no AIO has leaked on me before, my custom loop just did a few days ago. Calling it a win since the leak was at the pump away from everything, but still...

Edit: What is hanging me up right now is: do I go ATX and try to jam it inside a Meshify C surrounded by Noctua fans, or go EATX and stick it inside the SMA8 (which I already own)?
 
Anyone running the ASRock X399 Taichi ATX board, or know anyone who is, with a series 2 Threadripper?

I have


You can try pinning Diablo 3 to a single die and see if that helps. This can be done through Process Lasso or PowerShell; more info can be found here: https://forum.level1techs.com/t/what-is-numa-level-one-techs/132060.

It might clear up the weird FPS issue for that game. If I had a 2950X, I would test it out myself...

Now looking to step up to a 2950X in a few, but planning to go air: maybe Noctua, but most likely the Wraith Ripper. Just looking at boards now; I don't know yet if I want to go ATX or EATX.

I still do not trust water, AIO or custom loop. While no AIO has leaked on me before, my custom loop just did a few days ago. Calling it a win since the leak was at the pump away from everything, but still...

Edit: What is hanging me up right now is: do I go ATX and try to jam it inside a Meshify C surrounded by Noctua fans, or go EATX and stick it inside the SMA8 (which I already own)?


Here, I made this on the 1950x with Process Lasso a while ago.

Process Lasso absolutely BLOWS Windows' affinity management out of the water... and you can lock a game to a CCX connected directly to RAM, or just limit it to X number of cores, etc.
 
Thanks for posting that. I would like to add that some games have 2 or more processes (e.g., a launcher, a client process, and an anti-hack or watchdog process). If you want to adjust the affinity for those kinds of games, you need to adjust it for all of the processes the game launches, because these components often look for each other in the same memory space.

Beyond that caveat, this is a very workable solution that can be applied on a permanent basis to applications that need access to local memory pools and reduced thread counts.
 
I still do not trust water, AIO or custom loop. While no AIO has leaked on me before

I've used them for six or seven years now, no real issues noted; sample size of a few and I know that I've been lucky, but AIOs are generally reliable and when you have a positive-pressure setup with good intake flow, preferably well-filtered, they get the heat off the CPU and right out of the case efficiently.

And they do it without hanging a huge chunk of metal off of the motherboard, keeping the weight on the enclosure and keeping the area around the CPU clear.

While I'm not interested in TR personally- don't have an application that warrants the cost, yet!- I wouldn't think of using anything else :).
 
Anyway, some initial gaming testing at last, as my second video card finally got back from an RMA. Everything is at stock on both systems.

System 1:

Intel i7 5960x (3 GHz)
64 GB RAM
512 GB Samsung 850 Pro (OS Drive)
1 TB Samsung 970 Pro (Game Drive)
NVidia Titan Xp

System 2:

AMD 2950x
64 GB RAM
512 GB Samsung 850 Pro (OS Drive)
1 TB Samsung 970 Pro (Game Drive)
NVidia Titan Xp

Monster Hunter World 1440p Ultra (everything maxed out, including volume rendering)

5960x FPS: 40-60 fps
2950x FPS: 60-90 fps

I'm guessing that with all the extra threads the game uses, the 2950x benefits tremendously here.

Diablo 3 1440p (all graphics options on high)
5960x FPS: 180-200 fps
2950x FPS: 70-90 fps

Turning on Game Mode for the 2950x will get me to 130-150 fps. The game's not even using much of the CPU, so this surprised me quite a bit (CPU usage per core doesn't even reach 25%). I was expecting it to be a little slower, but not this much.

The Witcher 3 1440p (all graphics on Ultra)
5960x FPS: 60-80 fps
2950x FPS: 60-80 fps

The 5960x is slightly faster, by 2-3 frames, but otherwise indistinguishable. The one thing I don't like is the game is a bit choppier on the initial load, but that might not necessarily be the CPU. (On the 5960x, it's pretty smooth loading, but on the 2950x, it's in single digit fps for the first 2-3 seconds upon load).

Was Diablo running in DX9 or DX11? Just interested in knowing if Threadripper has any issues vs Intel in DX9, as it's important to me.
 



Here, I made this on the 1950x with Process Lasso a while ago.

Process Lasso absolutely BLOWS Windows' affinity management out of the water... and you can lock a game to a CCX connected directly to RAM, or just limit it to X number of cores, etc.


Yep, PL is pretty great; worth throwing the guy a few bucks if you use it a lot. Microsoft should absorb him like they did the Sysinternals dude, though of course I'm sure they would still find a way to dumb it down and fuck it up.
Stuff like this is a poster child for the potential power of open source: when a random hobby dude beats your internal design team on your own product, something is wrong.

I put most of my games at 4 or 8 cores and can leave encodes etc running on their own and notice nothing.
 
Mine has been licensed for about a year now; it is a pretty good tool. One of the pluses is that you can use core affinity to pin processes not just to the first 8 cores; you can do it per package. In other words, unlike Game Mode, you can use the entire 16+ cores for your finicky programs, with no second package sitting there burning power for nothing. Just make sure to pin all the processes of a multi-executable program to the same set of cores, or it may break shared-memory IPC schemes.

The only downside is that it is a little technical for less multiprocessor-savvy folks.
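
As a sketch of that per-package split using Python's psutil (the process names are hypothetical examples, and the 0-15 / 16-31 mapping assumes a 16-core Threadripper with SMT and conventional logical-CPU numbering):

```python
# Keep a game's process family on die 0 and a background encode on die 1.
import psutil

DIE0, DIE1 = list(range(0, 16)), list(range(16, 32))

for proc in psutil.process_iter(["name"]):
    name = (proc.info["name"] or "").lower()
    if name.startswith("witcher3"):   # hypothetical game executable(s)
        proc.cpu_affinity(DIE0)
    elif name.startswith("ffmpeg"):   # hypothetical encode job
        proc.cpu_affinity(DIE1)
```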
 
Yep, PL is pretty great; worth throwing the guy a few bucks if you use it a lot. Microsoft should absorb him like they did the Sysinternals dude, though of course I'm sure they would still find a way to dumb it down and fuck it up.
Stuff like this is a poster child for the potential power of open source: when a random hobby dude beats your internal design team on your own product, something is wrong.

I put most of my games at 4 or 8 cores and can leave encodes etc running on their own and notice nothing.

The guy who makes this software is one of the most responsive and professional dudes... he could easily charge $20 more for a license. I think he is charging too little.
 