AMD Ryzen Threadripper 2990WX Gaming Benchmarks Stunted by Faulty Nvidia Driver

cageymaru

Fully [H]
Joined
Apr 10, 2003
Messages
20,997
Golem.de claims that AMD Ryzen Threadripper 2990WX gaming benchmarks are being held back by a faulty Nvidia driver. Some games run at half speed on the 32-core beast compared to its 16-core 2950X sibling when paired with an Nvidia GeForce GTX 1080 Ti. When the Nvidia GPU was swapped for an AMD Radeon RX Vega 64, the frame rate increased substantially, in some cases doubling.

After switching from the GeForce to a Radeon RX Vega 64, the frame rate rose dramatically in four of the five games tested, in some cases doubling. Only Assassin's Creed Origins runs as slowly on the Threadripper 2990WX with the AMD card as before. We asked Nvidia whether the driver problem is known.
 

MrDeaf

Limp Gawd
Joined
Jun 9, 2017
Messages
428
Interesting.

I recall there used to be a bug in certain games with the six-core Phenoms that never occurred on quad- or eight-core CPUs.
All was well after the bug got patched.
 

dreadcthulhu

Weaksauce
Joined
Apr 10, 2017
Messages
121
I am guessing that Nvidia's driver has some sort of bug that shows up with very-high-core-count CPUs. The Tech Report's recent review of the 2950X also tested a 2990WX locked down to 16 cores; for the games they tested, in 16-core mode the 2990WX was a bit faster than the natively 16-core 2950X, and a lot faster than the 2990WX was with all its cores enabled.

https://techreport.com/review/33987/amd-ryzen-threadripper-2950x-cpu-reviewed
 

Elf_Boy

2[H]4U
Joined
Nov 16, 2007
Messages
2,468
Did they set the Nvidia driver to multithreading in the Nvidia Control Panel? I am guessing so, but still...
 

gamerk2

[H]ard|Gawd
Joined
Jul 9, 2012
Messages
1,970
I know the current driver has some serious issues anyway. Still, shouldn't it be standard practice at this point to test at least one GPU from each vendor when testing game performance, just so things like this get spotted early?
 

dgz

Supreme [H]ardness
Joined
Feb 15, 2010
Messages
5,838
What Kyle needs to be doing right now

 

renz496

Limp Gawd
Joined
Jul 28, 2013
Messages
232
It seems Nvidia is really going to need a CPU that works best with the way their GPUs are designed... they need their own x86 CPU.
 

Wolf_Tech

Limp Gawd
Joined
Sep 19, 2010
Messages
230
Or AMD has put code in their boards and CPUs to slow down Nvidia and make AMD graphics cards faster on their own platform. It has happened before in this business many times.
 

Brackle

Old Timer
Joined
Jun 19, 2003
Messages
7,997
Or AMD has put code in their boards and CPUs to slow down Nvidia and make AMD graphics cards faster on their own platform. It has happened before in this business many times.

If that were the case, it would make their own CPU reviews perform worse. Even AMD knows they aren't top dog when it comes to graphics. It would make no sense for AMD to do that. Why, you ask?

When the 2990WX is compared to their own 1950X with the same Nvidia card, the 1950X shows better gaming performance... which would take sales away from the 2990WX.
 

Lakados

Supreme [H]ardness
Joined
Feb 3, 2014
Messages
5,567
I’m not sure if this is a bug or a selling feature..... Maybe an intended limiter to keep their consumer cards out of servers.
 

mnewxcv

[H]F Junkie
Joined
Mar 4, 2007
Messages
8,784
I’m not sure if this is a bug or a selling feature..... Maybe an intended limiter to keep their consumer cards out of servers.
I can't see how that would be achieved legally if the card is advertised as performing a certain way.
 

power666

Weaksauce
Joined
Jun 23, 2018
Messages
113
My first guess is that the driver has problems dispatching more than 32 threads simultaneously. There is an old limitation in Windows where an application could not have more than 32 threads active at once. This has since been hammered out in newer applications and versions of Windows. I'm unsure how this applies to low-level Windows drivers.

Second guess would be NUMA: two of the four dies don't have memory attached. This could be emulated on the 16-core parts by only putting memory on the channels connected to a single die.
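To make the first guess concrete, here is a hypothetical sketch (not anything from Nvidia's actual driver) of how a fixed-width affinity bitmask caps the number of logical processors a process can address. Classic Windows affinity masks are one machine word wide, and a 2990WX with SMT exposes 64 logical processors, exactly at that boundary; the `mask_for_cpus` helper and its parameters are illustrative assumptions.

```python
# Hypothetical sketch: a fixed-width CPU affinity mask caps how many
# logical processors a process (or driver thread pool) can address.

def mask_for_cpus(cpu_ids, mask_bits=64):
    """Build an affinity bitmask, silently dropping CPUs past the mask width."""
    mask = 0
    for cpu in cpu_ids:
        if cpu < mask_bits:          # CPUs beyond the mask width are invisible
            mask |= 1 << cpu
    return mask

# A 2990WX with SMT exposes 64 logical processors: exactly at the
# 64-bit boundary, so any bookkeeping that assumes fewer bits breaks.
full = mask_for_cpus(range(64))               # all 64 threads fit in 64 bits
print(bin(full).count("1"))                   # -> 64
clipped = mask_for_cpus(range(64), mask_bits=32)
print(bin(clipped).count("1"))                # -> 32: half the CPUs unreachable
```

The point of the sketch is only that software written against a narrower mask simply cannot see the upper half of the CPUs, which matches the "works with fewer cores enabled" symptom described in the reviews.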
 

KazeoHin

[H]F Junkie
Joined
Sep 7, 2011
Messages
8,463
I remember on my old Xeon (28 threads), GTA V wouldn't even load unless I disabled cores. Essentially, if the application saw over 16 threads, its index would overflow and it wouldn't work at all. I'm guessing Nvidia has that thread index set to 5 bits (32 unique values) in the GeForce drivers.
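The overflow described above can be illustrated with a hypothetical 5-bit index field (the field width is the poster's guess, not a confirmed driver detail): any index past 31 wraps around and collides with a low index.

```python
# Hypothetical illustration: a thread/CPU index stored in a 5-bit
# field wraps around once the logical index exceeds 31.
FIELD_BITS = 5
MASK = (1 << FIELD_BITS) - 1   # 0b11111 -> 31 is the largest storable index

def store_index(i: int) -> int:
    """What a 5-bit field actually records for logical index i."""
    return i & MASK

print(store_index(31))  # -> 31: last index that survives intact
print(store_index(32))  # -> 0: wraps, colliding with thread 0
print(store_index(63))  # -> 31
```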
 

Creig

Gawd
Joined
Sep 24, 2004
Messages
785
Reminds me of Batmangate a few years back. Nvidia tried to claim copyright on standard MSAA code in Batman: Arkham Asylum. Nvidia had Eidos put in a hardware vendor ID lockout that would only allow MSAA to be enabled on Nvidia cards. Funnily enough, there was an "error" in the code that forced AMD cards to perform part of the MSAA processing anyway. So even though AMD cards weren't allowed to select that mode, this "Nvidia-only" MSAA implementation was actually slowing down AMD cards.
 

Nightfire

2[H]4U
Joined
Sep 7, 2017
Messages
3,280
Or AMD has put code in their boards and CPUs to slow down Nvidia and make AMD graphics cards faster on their own platform. It has happened before in this business many times.


Wow. Just wow. Most reviewers have been using a flagship Nvidia GPU for CPU gaming benchmarks since forever, and you think AMD would purposely make their CPUs look worse.

They have also been using overclocked Intel CPUs to compare GPU performance.

You really didn't think that one through, did you?
 

tangoseal

[H]F Junkie
Joined
Dec 18, 2010
Messages
9,362
I knew something was wrong with those benches. There was no way an 80 MB cache, 32-core, 4.0 GHz chip should have been that pathetic in games. It just didn't add up.
 

tangoseal

[H]F Junkie
Joined
Dec 18, 2010
Messages
9,362
I remember on my old Xeon (28 threads), GTA V wouldn't even load unless I disabled cores. Essentially, if the application saw over 16 threads, its index would overflow and it wouldn't work at all. I'm guessing Nvidia has that thread index set to 5 bits (32 unique values) in the GeForce drivers.

It still happened on Threadripper. If you set the affinity, the problem was solved.
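For reference, the affinity workaround mentioned above can be sketched programmatically. This is a minimal Linux-flavored example using `os.sched_setaffinity`; on Windows the equivalent would be Task Manager's "Set affinity" or the `SetProcessAffinityMask` API. The `pin_to_first_n` helper and the choice of 16 CPUs are illustrative assumptions.

```python
# Hypothetical sketch: pin a process to its first 16 allowed logical
# CPUs so an application that chokes on high thread counts only ever
# sees a small set.
import os

def pin_to_first_n(pid: int, n: int) -> set:
    """Restrict pid to the first n CPUs it is currently allowed to use."""
    allowed = sorted(os.sched_getaffinity(pid))   # CPUs we may legally pick
    target = set(allowed[:n])
    os.sched_setaffinity(pid, target)
    return target

cpus = pin_to_first_n(0, 16)                 # pid 0 = the calling process
print(len(os.sched_getaffinity(0)) <= 16)    # -> True
```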
 

IdiotInCharge

NVIDIA SHILL
Joined
Jun 13, 2003
Messages
14,679
A lot of these issues just sound like a programmer took shortcuts, like those that relied on clockspeed (consoles) or framerate (physics in Bethesda games).

It's the kind of stuff that AMD should have found in testing and brought up to vendors while also providing guidance to reviewers; their lack of communication (if not also engineering effort) has resulted in a poor initial showing, when they could have passed on responsibility to the vendors that can actually fix the problems and let reviewers focus on the stuff that works as intended on release.
 

mnewxcv

[H]F Junkie
Joined
Mar 4, 2007
Messages
8,784
A lot of these issues just sound like a programmer took shortcuts, like those that relied on clockspeed (consoles) or framerate (physics in Bethesda games).

It's the kind of stuff that AMD should have found in testing and brought up to vendors while also providing guidance to reviewers; their lack of communication (if not also engineering effort) has resulted in a poor initial showing, when they could have passed on responsibility to the vendors that can actually fix the problems and let reviewers focus on the stuff that works as intended on release.
I think it will work itself out in the coming months. The Ryzen 1st gen launch had a ton of problems as well if you recall.
 

IdiotInCharge

NVIDIA SHILL
Joined
Jun 13, 2003
Messages
14,679
I think it will work itself out in the coming months. The Ryzen 1st gen launch had a ton of problems as well if you recall.

Oh I expect it to- none of these are new problems themselves. Just quite visibly untested (or uncommunicated) issues that at the very least should have been included in review materials, not just to ensure that TR2 was given the fairest shake possible, but to give something for reviewers to test for a baseline that fixes may be compared to when they arrive.

And while these issues shouldn't be unexpected and certainly could have already been addressed (as they mostly seem to have been on Linux), they do present an opportunity to ensure that codebases are optimized for n-core scaling while accounting for local vs. remote memory access, which should hold us over until the current computing paradigm changes.
 

Meeho

Supreme [H]ardness
Joined
Aug 16, 2010
Messages
5,396
A lot of these issues just sound like a programmer took shortcuts, like those that relied on clockspeed (consoles) or framerate (physics in Bethesda games).

It's the kind of stuff that AMD should have found in testing and brought up to vendors while also providing guidance to reviewers; their lack of communication (if not also engineering effort) has resulted in a poor initial showing, when they could have passed on responsibility to the vendors that can actually fix the problems and let reviewers focus on the stuff that works as intended on release.
And you know they haven't?
 

IdiotInCharge

NVIDIA SHILL
Joined
Jun 13, 2003
Messages
14,679
And you know they haven't?

You can point to reviewers that have quoted AMD on the subject in their reviews published when the NDA lifted?

Seriously, these issues took investigation by the [H] and others- which means that AMD either didn't do the work, or they didn't communicate their results, and that resulted in their product being shown in a lesser light with lesser results that lacked an accompanying 'why'.
 

Meeho

Supreme [H]ardness
Joined
Aug 16, 2010
Messages
5,396
You can point to reviewers that have quoted AMD on the subject in their reviews published when the NDA lifted?

Seriously, these issues took investigation by the [H] and others- which means that AMD either didn't do the work, or they didn't communicate their results, and that resulted in their product being shown in a lesser light with lesser results that lacked an accompanying 'why'.
I can't point to anything, I'm asking you to link something that made you state AMD didn't do the work you mentioned.

They have been doing some mind boggling stupid things that it wouldn't be unlike them, but I would still like a confirmation that they haven't tried to address any of the mentioned issues before the launch.
 
Joined
Jan 3, 2018
Messages
49
You can point to reviewers that have quoted AMD on the subject in their reviews published when the NDA lifted?

Seriously, these issues took investigation by the [H] and others- which means that AMD either didn't do the work, or they didn't communicate their results, and that resulted in their product being shown in a lesser light with lesser results that lacked an accompanying 'why'.

I didn't realize AMD was supposed to make sure nVidia's drivers work on nVidia GPUs. Silly me, I was under the impression that drivers should be tested by the company that writes them.
 

IdiotInCharge

NVIDIA SHILL
Joined
Jun 13, 2003
Messages
14,679
I can't point to anything, I'm asking you to link something that made you state AMD didn't do the work you mentioned.

I stated that either they didn't do the work, or they didn't communicate with reviewers. I'm just as willing to accept AMD being their typical selves when it comes to communication.

They have been doing some mind boggling stupid things that it wouldn't be unlike them, but I would still like a confirmation that they haven't tried to address any of the mentioned issues before the launch.

I'd love to see confirmation that they're dialing it in!

I didn't realize AMD was supposed to make sure nVidia's drivers work on nVidia GPUs. Silly me, I was under the impression that drivers should be tested by the company that writes them.

The delta is AMD's CPU. Yeah, they should test it with stuff that it's likely to be used with, and if they ship with discrepancies, they should communicate that so that they're not explicitly or implicitly left with the blame.

If Nvidia's drivers are a problem, that information should have been part of the reviewers materials.
 

Meeho

Supreme [H]ardness
Joined
Aug 16, 2010
Messages
5,396
The delta is AMD's CPU. Yeah, they should test it with stuff that it's likely to be used with, and if they ship with discrepancies, they should communicate that so that they're not explicitly or implicitly left with the blame.

If Nvidia's drivers are a problem, that information should have been part of the reviewers materials.
True. Sometimes it seems AMD is their own worst enemy.
 
Joined
Jan 3, 2018
Messages
49
I stated that either they didn't do the work, or they didn't communicate with reviewers. I'm just as willing to accept AMD being their typical selves when it comes to communication.



I'd love to see confirmation that they're dialing it in!



The delta is AMD's CPU. Yeah, they should test it with stuff that it's likely to be used with, and if they ship with discrepancies, they should communicate that so that they're not explicitly or implicitly left with the blame.

If Nvidia's drivers are a problem, that information should have been part of the reviewers materials.

According to this post, it's a problem on Intel CPUs with 64 threads as well. There is something about nVidia's drivers that doesn't like 64 threads.
 
Joined
Jan 3, 2018
Messages
49
And?

Did Intel just release a 'consumer' 64-thread CPU?

No, they didn't but that doesn't mean shit to you based on your posting history (I have been coming here for almost 3 years now, don't let the fact that I actually registered in January fool you), any opportunity you get to shit all over AMD you take it. Please explain to me how this is AMD's fault that nVidia's consumer GPU drivers don't work on 64 threads but their professional drivers do.
 

IdiotInCharge

NVIDIA SHILL
Joined
Jun 13, 2003
Messages
14,679
No, they didn't but that doesn't mean shit to you based on your posting history (I have been coming here for almost 3 years now, don't let the fact that I actually registered in January fool you), any opportunity you get to shit all over AMD you take it. Please explain to me how this is AMD's fault that nVidia's consumer GPU drivers don't work on 64 threads but their professional drivers do.

Nice, way to get personal :ROFLMAO:

I run what gets the job done. I've run everything. I have an AMD GPU right next to my 1080Ti, right now. I'd be running Ryzen if AMD were able to keep up with Intel for games.

And the change here is that AMD released a 64-thread CPU for (very high end) consumer use, and they didn't inform reviewers that Nvidia's driver might not be optimized for that use case, either because they didn't know, or they failed to communicate.
 

Elf_Boy

2[H]4U
Joined
Nov 16, 2007
Messages
2,468
NV has no reason to make amd look bad, right?

I don't see it being practical or reasonable to expect amd to have checked each and every hardware and software configuration possible.

NV released a new driver last week. Should they not have tested?

It is really easy to point the finger and blame. I would have thought everyone here understands how complex the machines and systems involved are and understands these things happen.

Was testing some virtual networking software a couple weeks ago so the game master for the 5e group I play with can use fantasy grounds while traveling and in places he can't get a port open.

After some hours, it turned out the reason one of my friends could not get it to work was that his C: drive was almost full and he had installed it on another drive.

It wouldn't work unless installed on C:.

I really don't see the difference.

An NV driver shortcoming is not AMD's fault any more than the virtual tunnel not working was the fault of Windows or FG.

Someone assumed 640K was more than enough at one time, too.

Really don't understand why so many people seem to need to blame and accuse.

/end soap box rant
 

IdiotInCharge

NVIDIA SHILL
Joined
Jun 13, 2003
Messages
14,679
NV has no reason to make amd look bad, right?

AMD GPUs?

I mean, they don't really have to try.

AMD CPUs?

Nvidia still makes the fastest and most efficient GPUs, so you'd figure higher-performing AMD CPUs mean more Nvidia GPU sales.

And the crux is, TR2-W is something AMD has been developing for years, not something Nvidia has been developing for years. Meaning AMD should have been aware of the issues, and the best way to handle this would have been for AMD to advise Nvidia of their results so that a solution could be baked into the Nvidia driver (and whatever else needs it!) to properly support AMD's new CPU.

Now, does AMD have any reason to make Nvidia look bad?

;)
 