NVIDIA Calls Out Intel For Cheating In Benchmarks

HardOCP News

Remember back in school when you heard someone yell "fight, fight" and then everyone ran around behind the cafeteria to watch two people slug it out? This story reminds me of that. I guess we all better head around to the back of the cafeteria cause I think I just saw NVIDIA take the kid gloves off. ;)

In this case, it looks like Intel opted for the classic using-an-old-version-of-some-benchmarking-software manoeuvre. Intel claimed that a Xeon Phi system is 2.3 times faster at training a neural network than a comparable Maxwell GPU system; Nvidia says that if Intel used an up-to-date version of the benchmark (Caffe AlexNet), the Maxwell system is actually 30 percent faster. And of course, Maxwell is Nvidia's last-gen part; the company says a comparable Pascal-based system would be 90 percent faster.
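A quick back-of-the-envelope (assuming both of Nvidia's percentages are relative to the Phi system, and normalizing throughput to the old-Caffe Maxwell result; that normalization is an illustration, not a published figure) shows the two claims aren't even mutually exclusive: for Intel's 2.3x and Nvidia's 30% to both hold, the benchmark update alone would have to make the very same Maxwell system roughly 3x faster.

#include <stdio.h>

int main(void) {
    /* All throughputs normalized to Maxwell on the old Caffe AlexNet (assumed 1.0). */
    double maxwell_old = 1.0;
    double phi         = 2.3 * maxwell_old; /* Intel's claim: Phi 2.3x faster        */
    double maxwell_new = 1.3 * phi;         /* Nvidia's claim: +30% with new Caffe   */
    double pascal      = 1.9 * phi;         /* Nvidia's claim: Pascal +90% over Phi  */

    /* The software update alone must account for ~3x on the same hardware. */
    printf("implied speedup from updating Caffe: %.2fx\n", maxwell_new / maxwell_old); /* 2.99x */
    printf("Pascal vs old-Caffe Maxwell:         %.2fx\n", pascal / maxwell_old);      /* 4.37x */
    return 0;
}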
 
Wasn't it AMD that made a similar callout on Nvidia like 2 or 3 years ago, about Nvidia turning off a specific setting in a benchmark to make their cards look significantly better than AMD's without saying it was turned off, when the exact opposite was seen when it was enabled?
 
Takes one to know one.
 
This is all marketing crap by marketing people, which means zero. Call me when they get Blast Processing; then I will be impressed.
 
As I posted in another thread.

People buying these HPC processors know exactly how they perform. So this is nothing but PR for the uninformed.

However, GPUs have a huge issue in terms of memory limits. With Xeon Phi, for example, besides Omni-Path, you get six channels of DDR4 you can fill up. And we all know what happens when a GPU runs out of memory.
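Concretely, "what happens" is that the allocation simply fails; a minimal sketch using the CUDA runtime, with a deliberately absurd 64 GiB request standing in for "bigger than VRAM":

#include <stdio.h>
#include <cuda_runtime.h>

int main(void) {
    void  *d_buf = NULL;
    size_t bytes = 64ULL << 30; /* 64 GiB: far beyond any 2016 card's VRAM */

    cudaError_t err = cudaMalloc(&d_buf, bytes);
    if (err != cudaSuccess) {
        /* Typically cudaErrorMemoryAllocation: past this point the dataset
           must be tiled or streamed over PCIe from host RAM, at a large
           performance cost -- the cliff that host-side DDR4 avoids. */
        printf("cudaMalloc failed: %s\n", cudaGetErrorString(err));
        return 1;
    }
    cudaFree(d_buf);
    return 0;
}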

There is a reason why Xeon Phi shipments have exploded with 8x growth.
 
They all 'cheat'. They are all going to show benchmarks that make their products look the best. That's why you need to visit sites like [H]ard|OCP to get real reviews before putting your money down.
 
What, did Intel turn up the tessellation so far that the Nvidia cards fell behind?
Intel compared the performance of its latest Xeon Phi coprocessors to the four-year-old Kepler GPUs in the Titan supercomputer. It's not exactly cheating, but it's very misleading.
 
Every benchmark presented by any developer should be taken with a grain of salt, especially if it shows a product having an amazing advantage over the competition.

It is all marketing bullshit to lure in users who don't, can't, or won't research shit on their own.

NV did it in the past, AMD appears to still do it as part of its "Rebellion," and Intel has done it. Every hardware manufacturer has done it! As the old saying goes, wait for the actual benchmarks from trusted review sites.
 
Nvidia did it in the past? They're doing it now! Whether it's making claims of the 1080 being faster than it is, or those deceptive-ass 1060 charts:

[image: zVVf3U5.jpg]


Yeah that's not being misleading.

I haven't seen much on AMD's side, though I do think them not saying anything about all the 480 rumors was pretty bad. Everyone thought it would be 390X+ speeds, and AMD did nothing to squash that, just let it ride.

But yeah, Nvidia is far from "in the past" with their bullshit. It's just as bad as it ever was, and I don't think it's slowing down anytime soon.
 
Nvidia never released that...
 
Lol, plus in recent memory all I can think of is the Fury X release, with the Fury X beating the 980 Ti at 4K in everything! (in very specific game settings)
 
I believe the chart you used is from VideoCardz's rumor/leak article on the 1060, not an NV doc. Can't say NV is using BS performance charts when it is not their chart. I also don't recall NV using slides comparing Pascal against AMD cards during the announcement of any Pascal card. They used comparative charts against 9XX-series cards, but not AMD. The 1060's own website compares it to the 960, not the 480.

So show me where NV used misleading charts comparing to AMD this round. I can show you where AMD did it, with the dual RX 480 vs. 1080.

Otherwise, we'll stick with "in the past." That doesn't mean they won't do it in the future.
 
The irony coming from Nvidia. They should stop bribing game developers and over-tessellating everything; then they might have room to talk.
 
I think all sides are guilty of this. AMD, NV, and Intel all have devs that are extra "helpful" when it comes to performance on their hardware.
 
Do you have evidence of this, though? Call me out, all while not doing the same. What games have Intel done anything to? Is it odd as well when AMD does better in certain games? Nvidia, to my knowledge, is the only GPU company I've seen (especially in the past couple of years or so) that has truly done things to make themselves look better and everyone else look worse.

Like the whole tessellation thing with The Witcher 3 (among other games), paying developers to use GameWorks (Project Cars, constant issues, etc.), deceptive marketing, and they will most likely hold DX12 back due to their still-shitty way of handling it, etc.

So no, maybe the chart isn't Nvidia's, but clearly Nvidia has been and still is up to their old tricks. Nothing has gotten better on their end.
 
So AMD paying EA millions to use Mantle, using deceptive charts to promote their cards, hell, setting up benchmarks in ways that make their cards win. Or perhaps always using Ashes as a benchmark because the game heavily favors AMD cards, even though nobody plays the game and the game isn't that visually appealing. Selecting reviewers who get review samples based on whether they're willing to promote the cards properly, etc.

You act like Nvidia is a bane on the industry when the fact is both Nvidia and AMD do this. Intel is fairly neutral here because they really only have to care about the CPU part of their chips, and AMD doesn't really compete with Intel, so they can say anything and no one cares, because everyone just rags on AMD for not being able to compete.
 
LOL! Come on, man... trying to use Mantle as some sort of "they're doing it too!" point is totally off. Mantle was a completely different API that didn't interfere with DirectX (unlike GameWorks). It also helped AMD because of what it was: a close-to-the-metal API. Nothing deceptive or twisted to hurt Nvidia, as it didn't even affect Nvidia; they couldn't use it.

Ashes is used because it's the most async-compute-heavy game out and is the best way to show possible DX12 performance. So should we hate on AMD because they take DX12 seriously or something? How is it AMD's fault that they're better at it? I'd be using Ashes too if it showed what my hardware was truly capable of (there's a rough sketch of the async idea after this post).

Also, when it comes to the reviews, I say good on them. It's not that they have a problem with performance benchmarks so much as with giving cards, for free, to sites that clearly have a chip on their shoulder toward AMD. I wouldn't give shit to someone for free either if all they did was bash me in their spare time. That's hardly deceptive anyway.

So yeah, I'm not saying AMD is flawless or anything, but you certainly used very poor examples and did nothing to dispel Nvidia of the claims I made. Also, love that Cr4ckm0nk3y liked your post. Good to know who else believes in poor examples that don't really make a point! But hey, I guess if it's bashing AMD it must be legit!
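For anyone wondering what "async compute" actually buys: the idea is submitting independent GPU work to separate queues so the hardware can overlap it instead of serializing. A rough sketch of the concept using CUDA streams; this is an analogy for illustration only, not the D3D12 multi-queue mechanism Ashes actually uses:

#include <cuda_runtime.h>

__global__ void busy_kernel(float *data, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) {
        float x = data[i];
        for (int k = 0; k < 1000; ++k)  /* artificial ALU load */
            x = x * 1.0001f + 0.0001f;
        data[i] = x;
    }
}

int main(void) {
    const int n = 1 << 20;
    float *a = NULL, *b = NULL;
    cudaMalloc(&a, n * sizeof(float));
    cudaMalloc(&b, n * sizeof(float));

    cudaStream_t s1, s2;
    cudaStreamCreate(&s1);
    cudaStreamCreate(&s2);

    /* Two independent jobs on two queues: hardware that handles concurrent
       queues well can overlap them; hardware that doesn't will serialize,
       and the "async" benefit shrinks toward zero. */
    busy_kernel<<<n / 256, 256, 0, s1>>>(a, n);
    busy_kernel<<<n / 256, 256, 0, s2>>>(b, n);
    cudaDeviceSynchronize();

    cudaStreamDestroy(s1);
    cudaStreamDestroy(s2);
    cudaFree(a);
    cudaFree(b);
    return 0;
}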
 
"Think" means a belief or opinion, NOT A FACT! Me saying I think all sides are guilty is an opinion since there is no proof.

NO ONE has evidence of any company asking a dev for preferential treatment. There are rumors, like AMD with EA, AMD with Stardock, NV with GW, Intel with Sysmark, saying the hardware company paid devs to lean things their way or push the hardware company's tech. But there has yet to be proof the money provided was for these reasons. It is also "marketing" or other nonsense.

You full on deserve to be called out by using a third party rumor chart and saying it was NV's chart to prove NV has used deceptive tactics with Pascal.

If NV is the only GPU company you have seen use relationships or misleading marketing material then you need pull your head out of the sand and do a little more research.

No one is saying NV is free from sin, which is why NV calling out Intel is funny. It is also why I said ALL companies have been guilty of using misleading performance figures.
 
NO ONE has evidence of any company asking a dev for preferential treatment.

Pretty sure Project Cars using PhysX and plastering Nvidia logos all over the tracks says otherwise. Also, do you think Ubisoft started adding PhysX and GameWorks to most of their modern games for no reason at all? You don't need to see signed contracts to tell that what you said isn't really the case.

Also, again, your AMD points are faulty. Mantle = no performance issues for Nvidia, so what "payoff" or whatever is there from AMD? Did it affect Nvidia in any way? And again with Ashes... AMD paid Stardock to do what, exactly? Properly fucking use DX12? Are we to blame AMD for Nvidia's shitty DX12 performance? Nvidia claimed their cards (especially from the 7xx series up) are DX12-capable, so whose fault is that?

Yeah, I fucked up with one graph (though do we have evidence that VideoCardz made it?), but that's far from the only example against Nvidia in recent times. All the points you people are using against AMD aren't really valid.
 
Ashes isn't a pretty game, and it's not a popular game. Also, the big performance boost in the game isn't from async compute: you can toggle the option in the built-in benchmark to force it on and off in DX12, and the difference is barely 3-5%. It's not a coincidence that Oxide Games was a Mantle promoter, with their not-really-popular game engine, before AMD abandoned it.

Review cards are not given to sites to keep forever; usually they are lent to reviewers...

DirectX 12 can very easily be optimized to favor one vendor over another. That's the whole rub of low-level APIs: you can very easily optimize your app's pipeline for just one vendor. The app will still run on the other vendors' hardware, just not nearly as well. If you want to fear Nvidia doing anything, it's that: they can easily pay for the engineers and resources to help game developers optimize their engines for Nvidia cards without caring about any other card. After all, DX12 leaves less room to optimize through the driver, so the work Nvidia put into refining their drivers for every game release can be spent on the games directly, and in certain ways that would be more effective.
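To make that concrete, here is a toy sketch of per-vendor pathing living in application code rather than the driver; the PCI vendor IDs are real, but the render-path functions are hypothetical stand-ins:

#include <stdint.h>
#include <stdio.h>

#define VENDOR_NVIDIA 0x10DE /* real PCI vendor ID for Nvidia */
#define VENDOR_AMD    0x1002 /* real PCI vendor ID for AMD    */

static void render_path_tuned(void)   { puts("hand-tuned pipeline"); }
static void render_path_generic(void) { puts("generic pipeline");    }

/* Whoever funds the engineering gets the tuned branch; everyone else
   still runs, just on the slower generic path. */
static void pick_render_path(uint32_t pci_vendor_id) {
    if (pci_vendor_id == VENDOR_NVIDIA)
        render_path_tuned();
    else
        render_path_generic();
}

int main(void) {
    pick_render_path(VENDOR_NVIDIA); /* hand-tuned pipeline */
    pick_render_path(VENDOR_AMD);    /* generic pipeline    */
    return 0;
}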
 