Nvidia publicly bashing Stardock developer over an ALPHA level game

Does anyone really give a shit what AMD or nVidia tells us? Both rampantly lie; the DX12 benchmarks on actual games will tell the true story.

I take everything with a grain of salt until I hear it from [H]. Why am I in this thread again? :)

I agree. In all actuality, both companies will sing whatever song benefits them. It's the community response to this sort of thing that makes me facepalm...

AMD does better in a benchmark: "It's bad code! It's an alpha title! Poor optimization!"

Nvidia does better in a benchmark: "AMD SUXLOLOLOL"

I know it goes both ways, but Nvidia users vastly outnumber AMD users, so the Nvidia opinion spreads much more effectively. I like to highlight, once in a while, that the hardware community is generally geared toward giving the larger, more profitable company the benefit of the doubt. It works the same way in hardware repairs: I've posted numerous times that customers immediately blame the AMD GPU when their system has an issue, yet when a similar system has an identical issue with an Nvidia card, they suddenly become reasonable and suggest a full system test. Both issues would turn out to be non-GPU related.

Nvidia system has an issue: "Hmm, let's figure out what's causing this. Maybe RAM? HDD? Mainboard? We need to be thorough..."

AMD System has an issue: "AMD SUXLOLOLOLOL"

This culture is a community issue, and we should all be more reasonable and open to logical explanations to these things.
 
Something something, grain of salt, until I hear it from [H]... Or at least not WCCF

Guess you missed reading all the other reviews posted in here from other sources?

Not to mention Steve posting an article on it?
 
nVagia getting owned hard once again? Where is the dedicated Nvidia defense squad?
 
This culture is a community issue, and we should all be more reasonable and open to logical explanations to these things.

Nvidia was blaming the developers, while the developers said that all DX12 / Windows 10 drivers would need work.

I don't see anyone saying Nvidia sucks or AMD is superior in here, just that yes, driver updates are needed, and Nvidia was in the wrong for blaming the developers. Nvidia has already released new drivers with a fix for the MSAA bug.
 
Nvidia made this into a bigger deal than it needed to be...by raising a big stink they made themselves look bad...then again, if the benchmark results were reversed, I'm sure AMD would do the same...I'm confident that Nvidia will have excellent DX12 performance in real games and future synthetic benchmarks
 
Guess you missed reading all the other reviews posted in here from other sources?

Not to mention Steve posting an article on it?

I did. Heh

Well then, this should get interesting. Looking at the PCPer data, is it really that the DX12 numbers are great for the 390x, or that the DX11 numbers were pitiful? I had a 980; I thought it was only about 10% faster than a 290x (390x) without overclocking.

Another way to look at it: the DX12 data looks normal (as in, the usual DX11-sized deltas) from what I know. It's the DX11 data that looks off (the 980 being 50% faster rather than 10%).
 
So that's it?

Unless you use hardware with crippled DX11 drivers from a certain company, all we get is a few frames more?

No. If I recall correctly, the Star Swarm demo with Mantle on a 290 vs DX11 on a 780 showed nice improvements in minimum framerates as well.


So even though the average FPS in DX11 is relatively strong on the Nvidia side, you'll get both a boost to FPS and a nice boost to minimum frames for smoother gameplay.
 
Something something, grain of salt, until I hear it from [H]... Or at least not WCCF

This is all good stuff to see, but I for one would not hang my hat on it yet. I have seen a lot of "second coming" API benefits in my time that never truly amounted to jack shit in the broad scheme of things.

As for the benchmark itself, the post by the devs was a good read, but honestly until YOU PLAY THE GAME, it means little. That said, I hope DX12 is full of rainbows and unicorns as we have all been told. :)
 
I think the main issue is that nVidia will blame somebody else first, before looking at themselves.

MSAA BUGGG HURRR HURR. When it turns out that their driver actually had issues.
 
I agree. In all actuality, both companies will sing whatever song benefits them. It's the community response to this sort of thing that makes me facepalm...

AMD does better in a benchmark: "It's bad code! It's an alpha title! Poor optimization!"

Nvidia does better in a benchmark: "AMD SUXLOLOLOL"

I know it goes both ways, but Nvidia users vastly outnumber AMD users, so the Nvidia opinion spreads much more effectively. I like to highlight, once in a while, that the hardware community is generally geared toward giving the larger, more profitable company the benefit of the doubt. It works the same way in hardware repairs: I've posted numerous times that customers immediately blame the AMD GPU when their system has an issue, yet when a similar system has an identical issue with an Nvidia card, they suddenly become reasonable and suggest a full system test. Both issues would turn out to be non-GPU related.

Nvidia system has an issue: "Hmm, let's figure out what's causing this. Maybe RAM? HDD? Mainboard? We need to be thorough..."

AMD System has an issue: "AMD SUXLOLOLOLOL"

This culture is a community issue, and we should all be more reasonable and open to logical explanations to these things.

Could not have said it better myself.

When nVidia excels at something, it's the expected result and nobody questions it. And when they don't, they're given the benefit of the doubt.

When AMD excels at something, it's seen as an anomaly, and when they don't... Well I don't need to elaborate on that one.
 
Could not have said it better myself.

When nVidia excels at something, it's the expected result and nobody questions it. And when they don't, they're given the benefit of the doubt.

When AMD excels at something, it's seen as an anomaly, and when they don't... Well I don't need to elaborate on that one.

Usually these types of attitudes develop because of the trends set by the company. AMD has a history of failing at many of its initiatives and having questionable drivers. NVIDIA, on the other hand, pours a lot more resources into developer relations and driver development, and that's why they get a pass: people know NVIDIA will resolve a driver issue in short order.
 
So if you want good performance in released games with released drivers, buy NVIDIA.

If you want lots of beta drivers and good performance in an alpha release of one game, buy AMD.

/thread
 
Alpha generally means the game engine isn't finalized but the game will run; it also means game features aren't finalized and code is still being added/altered. Beta generally means only bugs are being addressed and the engine is pretty much finished. So public testing pre-beta should always be taken with a grain of salt.
 
So if you want good performance in released games with released drivers, buy NVIDIA.

If you want lots of beta drivers and good performance in an alpha release of one game, buy AMD.

/thread


You're jumping the gun quite a bit there. I think you need to take a seat. Nobody is declaring anything. I don't think anyone but nVidia fanboys has had anything negative or positive to say either way. And yes, it is one game. Do you know of any other DX12 titles that have software out and about to play with? No? Didn't think so. GTFO
 
So if you want good performance in released games with released drivers, buy NVIDIA.

If you want lots of beta drivers and good performance in an alpha release of one game, buy AMD.

/thread
Maybe true, but nobody is basing their next purchase off a single alpha DX12 benchmark.
DX12 won't matter until Pascal / Arctic Islands anyway. Over a year.
 
Maybe true, but nobody is basing their next purchase off a single alpha DX12 benchmark.
DX12 won't matter until Pascal / Arctic Islands anyway. Over a year.
Who knows how long it will take for developers to embrace DX12 fully enough for it to factor into people's GPU purchases.
 
Usually these types of attitudes develop because of the trends set by the company. AMD has a history of failing at many of its initiatives and having questionable drivers. NVIDIA, on the other hand, pours a lot more resources into developer relations and driver development, and that's why they get a pass: people know NVIDIA will resolve a driver issue in short order.

Well, you aren't accounting for runaway confirmation bias:

You have 100 systems: 50 have AMD cards and 50 have Nvidia cards.

20 systems have issues: 10 are Nvidia-based systems, 10 are AMD-based systems.

The 10 AMD based systems' owners immediately blame the AMD card, and have it replaced with an Nvidia card.

(An update: you now have 60 Nvidia-based machines and 40 AMD-based machines.)

As it turns out, 10 of the problem systems had HDD errors; 5 of those were initially AMD systems. The other 10 had genuine GPU issues; again, 5 of those were initially AMD systems.

40 people say "I was smart, and bought an Nvidia card: no issues for me!" +1 rep for Nvidia

5 people say "I bought an Nvidia based system, it had issues, bad luck: it was the card! replaced the card with a new one: it ran just fine" no rep given

5 people say "I bought an Nvidia based system, it had issues, replaced the HDD: it ran just fine!" no rep given

5 people say "I had a system with an AMD card once, it had tons of issues. Replaced the card with an Nvidia card, and replaced the HDD and it worked just fine from then on!" + 1 rep for Nvidia

5 people say "I had a system with an AMD card once, it had tons of issues! I replaced that shit with an Nvidia card, and now its great!"+1 rep for Nvidia

40 people say "I bought an AMD based system, No issues for me!" + 1 rep for AMD

End result:

Nvidia 3 anecdotal rep

AMD 1 anecdotal rep

Next generation of buyers use this info as a buying decision.

You have 100 systems: 33 have AMD cards, 67 have Nvidia cards....


You see how this works? I'm not crying for the underdog, but rather I'm trying to help people realize what I've seen with actual buyers and actual customers.
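If it helps, here's a toy C++ model of that exact feedback loop, using the hypothetical numbers above (identical 20% issue rate for both brands; AMD owners always swap the card, Nvidia owners actually troubleshoot):

```cpp
#include <cstdio>

int main() {
    // Hypothetical numbers from the example above, not real market data.
    double amd = 50.0, nvidia = 50.0;  // installed base, out of 100 systems
    const double issue_rate = 0.20;    // both brands hit issues at the same rate

    for (int gen = 1; gen <= 3; ++gen) {
        // AMD owners with any issue (GPU or HDD) blame the card and swap
        // to Nvidia; Nvidia owners diagnose and keep their card, so only
        // the AMD installed base shrinks.
        double defectors = amd * issue_rate;
        amd    -= defectors;
        nvidia += defectors;
        std::printf("gen %d: AMD %.0f / Nvidia %.0f\n", gen, amd, nvidia);
    }
    // Prints 40/60, then 32/68, then ~26/74: the same drift as the
    // 50 -> 40 -> 33 walkthrough above, despite identical failure rates.
    return 0;
}
```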
 
No. If I recall correctly, the Star Swarm demo with Mantle on a 290 vs DX11 on a 780 showed nice improvements in minimum framerates as well.

I think we'll eventually see what we saw with Mantle... that certain types of work benefit more than others. Workloads that require 400 quadrillion draw calls or are CPU-limited or whatever will benefit nicely. Some Mantle benchmarks that were GPU-limited showed basically no improvement.

Even then, it wouldn't be hard to do better than DX11. At this stage DX11 is older than dirt. Anything would be an improvement.
 
Since I know most people won't actually read the articles and will just comment based on their biases, here are a few quotes:

I wonder if the anti-GW crowd will be up in arms about this too? Or maybe not.

Didn't take long.

Your comment shows a complete lack of comprehension (or willful ignorance) of the issues people raised, correctly or not, about Gameworks. It's an inept comparison.

But but..draw calls..DX12 revolution..to the metalz!
/cringe

Where can I buy this game?

I'd be in the market for a clue or a life first if I was you.
 
Maybe true, but nobody is basing their next purchase off a single alpha DX12 benchmark.
DX12 won't matter until Pascal / Arctic Islands anyway. Over a year.

Except for the part where Ark, one of the most played games right now, period, will be fully DX12 compliant very soon (this month), as will UT and all UE4-based games, both released and upcoming, and that's a long list.
 
Except for the part where Ark, one of the most played games right now, period, will be fully DX12 compliant very soon (this month), as will UT and all UE4-based games, both released and upcoming, and that's a long list.

And guess which vendor the Unreal Engine favors? (Hint: it's not AMD.) Personally I'd LOVE to see AMD kick some ass in DX12. I might even buy a Zen CPU if they can manage 90% of the performance of a similarly priced Intel CPU. AMD just desperately needs a huge cash infusion so it can pour resources into R&D, developer relations, and drivers. They're doing well with what they have to work with, but at the same time they are in this position today because they've been a historically mismanaged company.
 
And guess which vendor the Unreal Engine favors? (Hint: it's not AMD.) Personally I'd LOVE to see AMD kick some ass in DX12. I might even buy a Zen CPU if they can manage 90% of the performance of a similarly priced Intel CPU. AMD just desperately needs a huge cash infusion so it can pour resources into R&D, developer relations, and drivers. They're doing well with what they have to work with, but at the same time they are in this position today because they've been a historically mismanaged company.

I never said anything about any of that; what are you on about here? He said that DX12 will take a year before it matters to anyone, and that's clearly false, end of story. I have no idea what tirade you're on, but it had nothing to do with me.
 
I don't get it. If nVidia did bash anyone, it was unwarranted. The 980 vs. 390x margin was normal in the DX12 benchmarks.

Look at this review:
http://www.hardocp.com/article/2015...trix_directcu_iii_8g_oc_review/5#.VdKJKvlVhuA

390x and 980 trade blows in DX11 (at stock).

It's not that the 390x is doing great in DX12; it's doing normal. It just did piss-poor in DX11, which is making its DX12 results look good.

If you look at the PCPer results, the GTX 980 shows performance regression in DX12 with certain test combinations.

This does suggest that there is an issue somewhere, with the benchmark, the drivers, and/or DX12.

In the two other cases with DX12 benchmarks, Star Swarm and 3DMark (granted, more synthetic), the results have shown significant gains (never regression) regardless of the hardware used.

Also, I'm not sure what, inherent to the hardware architecture, would actually cause regression. With the Mantle regression (another low-level API), the issue was still one of software support (though it's still unclear whether it was on the game, driver, or API side, due to conflicting information from AMD on the matter).

Well, actually, maybe there are some hardware-related possibilities, which might really reopen the question of what constitutes "biased" optimizations (and this might be a much larger issue with DX12).
 
Well, you aren't accounting for runaway confirmation bias:

You have 100 systems: 50 have AMD cards and 50 have Nvidia cards.

20 systems have issues: 10 are Nvidia-based systems, 10 are AMD-based systems.

The 10 AMD based systems' owners immediately blame the AMD card, and have it replaced with an Nvidia card.

(An update: you now have 60 Nvidia-based machines and 40 AMD-based machines.)

As it turns out, 10 of the problem systems had HDD errors; 5 of those were initially AMD systems. The other 10 had genuine GPU issues; again, 5 of those were initially AMD systems.

40 people say "I was smart, and bought an Nvidia card: no issues for me!" +1 rep for Nvidia

5 people say "I bought an Nvidia based system, it had issues, bad luck: it was the card! replaced the card with a new one: it ran just fine" no rep given

5 people say "I bought an Nvidia based system, it had issues, replaced the HDD: it ran just fine!" no rep given

5 people say "I had a system with an AMD card once, it had tons of issues. Replaced the card with an Nvidia card, and replaced the HDD and it worked just fine from then on!" + 1 rep for Nvidia

5 people say "I had a system with an AMD card once, it had tons of issues! I replaced that shit with an Nvidia card, and now its great!"+1 rep for Nvidia

40 people say "I bought an AMD based system, No issues for me!" + 1 rep for AMD

End result:

Nvidia 3 anecdotal rep

AMD 1 anecdotal rep

Next generation of buyers use this info as a buying decision.

You have 100 systems: 33 have AMD cards, 67 have Nvidia cards....


You see how this works? I'm not crying for the underdog, but rather I'm trying to help people realize what I've seen with actual buyers and actual customers.

FWIW

I have had a GTX 970 for the last few months (since maybe April/May). I have had more crashes with this thing than with any other card I've owned in the last few years.

Including...
GTX 680
7970
7970 GHz
280x
290
290x

I'm not sure where this legendary driver quality for NV comes from; I had to DITCH their WHQL driver and use a beta driver since it was crashing so often.
 
FWIW

I have had a GTX 970 for the last few months (since maybe April/May). I have had more crashes with this thing than with any other card I've owned in the last few years.

Including...
GTX 680
7970
7970 GHz
280x
290
290x

I'm not sure where this legendary driver quality for NV comes from; I had to DITCH their WHQL driver and use a beta driver since it was crashing so often.

I haven't used Nvidia since they could not release Crysis drivers on the first day. It took them half a week to get their shit together...
 
Yeah, that's the thing. I had no end of trouble trying to get my 590 to work in Surround, and it had crashing issues as well. I switched to my 7970, and I still have the occasional bug in niche software and driver crashes here and there. Ultimately the experience is better on my 7970, but the general public hears:

"blah blah, faulty 590 blah, blah blah AMD DRIVER CRASH ALL THE TIME blah blah."
 
There's a driver issue, as DX12 performance should not decrease versus DX11 for Nvidia. The game appears to be an AMD Gaming Evolved title, as AMD is linked in the footer of the game's website, so I assume they have had time to optimize properly.
 
There's a driver issue, as DX12 performance should not decrease versus DX11 for Nvidia. The game appears to be an AMD Gaming Evolved title, as AMD is linked in the footer of the game's website, so I assume they have had time to optimize properly.

Try reading the OP, or at least a few posts; #6 has you covered:

http://hardforum.com/showpost.php?p=1041799310&postcount=6

Nvidia already admitted that the MSAA issue was with their drivers and not the game. They've been having issues with their Windows 10 drivers, and even the game company says that drivers will mature and results will get better. The only issue here is that Nvidia tried to blame them first instead of fixing their drivers. Once AMD and Nvidia have good Windows 10 / DX12 drivers, specific games won't need the same optimizations they do now.
 
I haven't used Nvidia since they could not release Crysis drivers on the first day. It took them half a week to get their shit together...

Both companies are guilty of that from time to time. It seems to happen more often lately with AMD, but that might just be due to the number of GameWorks games released these days. We'll see how it goes on the Nvidia driver side when some new AMD titles launch.

Yeah, that's the thing. I had no end of trouble trying to get my 590 to work in Surround, and it had crashing issues as well. I switched to my 7970, and I still have the occasional bug in niche software and driver crashes here and there. Ultimately the experience is better on my 7970, but the general public hears:

"blah blah, faulty 590 blah, blah blah AMD DRIVER CRASH ALL THE TIME blah blah."

I'd say overall, through the years, I've had more issues with Nvidia drivers, but a lot of that was centered around specific periods, especially the early days of Vista. Of course, the infamous "driver has stopped working and needed to recover" thing has popped up now and then over the years.
 
I wonder if the anti-GW crowd will be up in arms about this too? Or maybe not.

Up in arms about what, exactly? All parties (AMD, nVidia, Intel, and MSFT) have source code access, unlike with GameWorks. Is that what you are referring to?
 
All of the developers involved in recent GameWorks scandals said the same thing.
"AMD had access to the code for months", blah blah blah.

Can you show me this? Because I remember them saying AMD had game builds, not source code for GW.
 
I don't find the statements from either company to be particularly controversial (my, we get excited about everything these days), but I did have to lol at Oxide saying their alpha versions are of higher quality than most other companies' release versions. That was pretty ballsy.

I thought telling nVidia to tread lightly was ballsy too. Obviously they don't need free hardware, trips, or precompiled code from them.
 
Also, I'm not sure what, inherent to the hardware architecture, would actually cause regression. With the Mantle regression (another low-level API), the issue was still one of software support (though it's still unclear whether it was on the game, driver, or API side, due to conflicting information from AMD on the matter).

Well, actually, maybe there are some hardware-related possibilities, which might really reopen the question of what constitutes "biased" optimizations (and this might be a much larger issue with DX12).

It depends on whether the API is truly as "low level" as everyone is wanking on about (I doubt it is). But if it is, then the game code has to be written to match the underlying hardware architecture. E.g., CUDA: if you write a CUDA program for Fermi, it's not necessarily (in fact it's unlikely to be) optimal for Kepler or Maxwell, and vice versa. You have to put in separate hooks to run slightly different code on the different architectures... if you want maximum performance.
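To make that concrete, here's roughly what such a per-architecture hook looks like with the CUDA runtime API. Just a sketch; the kernel names and launch configuration are made up:

```cpp
#include <cuda_runtime.h>

// Two tunings of the same (placeholder) kernel, one per architecture.
__global__ void kernel_kepler(float* data)  { /* Kepler-tuned code path  */ }
__global__ void kernel_maxwell(float* data) { /* Maxwell-tuned code path */ }

void launch_best_variant(float* d_data, int n_blocks, int n_threads) {
    // Query the compute capability of the active GPU at runtime.
    cudaDeviceProp prop;
    cudaGetDeviceProperties(&prop, 0);

    if (prop.major >= 5) {
        // Compute capability 5.x = Maxwell.
        kernel_maxwell<<<n_blocks, n_threads>>>(d_data);
    } else {
        // 3.x = Kepler; this branch also catches Fermi (2.x) and older.
        kernel_kepler<<<n_blocks, n_threads>>>(d_data);
    }
}
```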
 