Benchmark Wars

FrgMstr

Just Plain Mean | Staff member
Joined: May 18, 1997 | Messages: 55,532
You guys know we shitcanned benchmarks years ago because we found those unreliable as indicators of gaming performance and GPU cross-comparisons. While we have moved on, the GPU companies have not. Yesterday we got an email from NVIDIA about a new H.A.W.X. 2 benchmark and the company encouraged us to use it on upcoming reviews. Today we got an email from AMD asking us not to use it on upcoming products.
 
Benchmarks :rolleyes: (Ok, FurMark was good for testing temperatures.)

It might be worth demonstrating how broken the benchmark tool is...
 
The fact that AMD approached the developers with a solution and they chose not to use it is suspicious. I would like to hear a response from Ubisoft about this issue.
 
Nvidia promotes a benchmark where it does well; AMD is disappointed.

In other ground breaking news.... after years of research and billions of dollars scientists have determined that water, is indeed, wet. :rolleyes:
 
nothing like benching the final product imho...

Ubi shits on AMD customers, they WILL get their retribution imho.

like crap sales of H.A.W.X 2... lol
 
Bam! Batman AA anyone? There's got to be some pretty big shenanigans behind the scenes. It seems like nVidia is getting way too much power and control over developers, so much so that developers feel threatened or beholden to make nVidia look good.
 
Has Nvidia really sunk so low that they have to go to this extent in an attempt to sabotage a competitor's product?
 
It may seem kind of funny, but I only care about benches for games I play, have played, or will play, and only from three sources. One does play throughs, the other two use custom timedemos of full retail games.
 
In other ground breaking news.... after years of research and billions of dollars scientists have determined that water, is indeed, wet. :rolleyes:

Water is not wet; wet is a property of another object whose surface is adulterated by water molecules... :) Just as oxygen cannot be oxygenated.
 
I say run it anyways. It would also benefit AMD when we show the before and after scores demonstrating how well AMD did implementing this driver-based solution they mentioned. If they're sure of themselves about the driver-based solution and how it'll increase performance, then they should have no objection to this.
 
If true, an obvious bullshit move on Nvidia's part. You'd have thought after the Batman fiasco they would have wised up.
 
I've owned a lot of Nvidia cards over the years but their corporate ethics of late are leaving a pretty bad impression of them in my mind. They bully partners out of existence and use TWIMTBP to get devs to nerf their games on competitor products. That's hardly a way to support the community they depend on.
 
I've owned a lot of Nvidia cards over the years but their corporate ethics of late are leaving a pretty bad impression of them in my mind. They bully partners out of existence and use TWIMTBP to get devs to nerf their games on competitor products. That's hardly a way to support the community they depend on.

Oh look, more conspiracy people.
 
Oh look, more conspiracy people.

How is it a conspiracy? In order to get Fermi cards, they forced their distribution partners to buy a bunch of older cards they didn't want. They intentionally disabled PhysX when it was paired with an AMD GPU, even though it worked perfectly fine. There are plenty of examples of them doing dodgy things to get a leg up on the competition. I'm sure AMD's not entirely clean either, but I haven't seen anything on that level from them.

Also as far as Ubisoft is concerned, I'm done having anything to do with any of their releases until they get rid of that horrible DRM.
 
Is Ubisoft in NVIDIA's pocket, pushing technology while ignoring the company with the largest DX11 market share? Is NVIDIA pushing a benchmark of a yet-to-be-released game that is somewhat broken on its competitor's cards?
Yes
 
I will look for sites that use this benchmark in their HD6800 reviews, and then remove them from my bookmarks.

I thank [H]ardOCP for giving us a 'behind the scenes' look at things, so we can cut the BS from whatever company. And yes, this move from Nvidia looks a tad desperate, to say the least. Lately all they've been doing is giving me more reasons not to buy their products.
 
Not surprised NVDA is doing this. I'd be more surprised if they didn't do it especially after Batman AA and the PhysX bullshit.

Right now I'm just waiting for the judgment from the class action from the mobile GPU fiasco. Get some of my money back for the dead piece of shit 8400m so I can upgrade to a 6000 series.
 
I'm really surprised more companies didn't "shitcan" benchmarks like HardOCP :(

The play-throughs are mucho-better IMO.
 
Ugh. May we ask what departments these e-mails originated from? Marketing-side or Technical-side? I know where my money is sitting...

I say run it anyways. It would also benefit AMD when we show the before and after scores demonstrating how well AMD did implementing this driver-based solution they mentioned. If they're sure of themselves about the driver-based solution and how it'll increase performance, then they should have no objection to this.
Great idea--not only would it serve to prove AMD's allegations, but it lets us hold their feet to the fire if the driver-based improvements don't materialize.
 
AMD rocks. ATI was a Canadian company. That's why they are angels and care about their customers and the PC community :)
 
The fact that AMD approached the developers with a solution and they chose not to use it is suspicious. I would like to hear a response from Ubisoft about this issue.

This. Hearing Ubisoft's side of the story would be nice.

I've been around way too many marketing groups to believe anything is as it seems on first pass. I wouldn't be surprised if this boils down to "our drivers don't work how they're supposed to. Ubisoft did something that is within the DX11 spec, but we weren't prepared for it. We asked them to change their code instead of us fixing the drivers, but they didn't, and now we've got to try to keep this benchmark out of reviews before it sinks us."

On the other hand, it could very well be that Nvidia helped develop the game for their hardware and their drivers, and as a result it works much better at release on their cards. But then again, isn't that almost every major game these days? AMD isn't putting any money toward testing games at launch, and then we have to listen to "AMD clearly has a driver problem in this game, we'll revisit this later when they fix it" in the reviews.

Whether it's right or wrong to pay developers for early access is a whole other ballgame.
 
I don't care either way; no matter how each card performs with HAWX 2, I won't buy the game, and I won't buy an AMD card.
Most people buy the brand they like; it has nothing to do with A = 10 fps and B = 12 fps, because we all know in most cases we are talking a few fps. I simply don't care, so all in all it really does not matter to me.
When I get ready for new video cards, I look at what game I am playing that needs more power, then I look at benchmarks, which normally show AMD vs NV on [H], and then I get the NV card that gives me what I want. Hey, that's just how I do it; to each their own. Nothing against ATI, but they just do not do it for me.
 
Eh, it's no conspiracy. One company wants to use a benchmark that makes its product look better than another company's product. On the other side, one company doesn't want to use a benchmark that makes its product look worse than the competition's.

Is this any different than any other products out there? Look to the auto industry to see specialized tests that make a specific vehicle look better in that one area (Toyota is great at this recently, trying to make people believe they haven't been selling shit on wheels for the past decade).

It looks like it's business as usual to me.
 
There's no conspiracy here. Ubisoft is just lazy. I don't know how many times they've released half-finished games, dragged their feet on patches, or just dropped support on titles. They are more focused on their DRM scheme than on making a working game.
 
Benchmarks for a game which isn't out yet? The funny things NVidia puts on their "how to review our cards" checklist. (PS. CUDA, CUDA, CUDA!!!!!!!!)

Maybe review sites should just wait until it is released. Then we can all start looking for drivers which search for Quake.exe or GenuineIntel tags or what have you.
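To make the Quake.exe point concrete, here's a toy Python sketch of how per-application detection can work. This is purely hypothetical illustration, not real driver code, and all names in it are invented: the idea is that a driver keys its optimization profile off the executable's name, which is exactly why renaming the EXE can change scores.

```python
# Hypothetical sketch of per-application driver profiles.
# Real drivers are far more involved; profile names here are invented.
DEFAULT_PROFILE = {"texture_filtering": "full", "shader_substitution": False}

APP_PROFILES = {
    # A benchmark the driver "recognizes" gets cheaper, faster settings.
    "quake3.exe": {"texture_filtering": "reduced", "shader_substitution": True},
}

def profile_for(exe_name: str) -> dict:
    """Pick the optimization profile based solely on the EXE file name."""
    return APP_PROFILES.get(exe_name.lower(), DEFAULT_PROFILE)

profile_for("Quake3.exe")  # recognized -> reduced-quality profile, higher score
profile_for("Quack3.exe")  # renamed -> default profile, honest score
```

Same binary, different file name, different results: that's the whole trick the old driver-cheating articles were about.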
 
Different games naturally run differently on different hardware. This seems particularly true for DX11 games.

It is not unheard of for companies to push tests that make their products look the most favorable.

-Dirt 2 seems particularly forgiving of AMD's weak tessellation units in the 5xxx series, which is likely why we see AMD pushing that title all the time.

-Civilization 5, on the other hand, seems to tax the tessellation units hard, and as such really favors Fermi GPUs.

This is why benchmark results should be taken with a grain of salt. A good review would take into account the full spectrum of benchmarks available and speak to any outlier titles that favor one architecture over another.

Now if we are going to play the game of unfair marketing practices, why don't we look at AMD's "leaked" promotional slides from the 6850 and 6870 launch, where they attempted to make their boards look 3 times faster than the GTX 460 by not using a zero axis on their bar charts, thus exaggerating the appearance of a difference.

That is pretty shitty too...
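The non-zero-axis trick is easy to demonstrate with arithmetic. A quick sketch with made-up frame rates (not the actual numbers from AMD's slides):

```python
# Made-up frame rates for illustration only; not the real slide data.
fps_a = 60.0   # "our card"
fps_b = 50.0   # competitor

true_ratio = fps_a / fps_b          # 1.2x faster in reality

# Start the y-axis at 48 instead of 0, and the *visible* bar heights become:
axis_start = 48.0
bar_a = fps_a - axis_start          # 12 units tall
bar_b = fps_b - axis_start          # 2 units tall
apparent_ratio = bar_a / bar_b      # bar looks 6x taller
```

A 20% lead drawn as a 6x-taller bar, just by picking where the axis starts.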
 
I've owned a lot of Nvidia cards over the years but their corporate ethics of late are leaving a pretty bad impression of them in my mind. They bully partners out of existence and use TWIMTBP to get devs to nerf their games on competitor products. That's hardly a way to support the community they depend on.

'Of late'? nVidia isn't doing anything now that they weren't willing to do from day 1. They were accused of cheating in 2003, and then in 2008 by Futuremark over the PhysX benchmarks. Having a company optimize for you and not for the competition is tame compared to the kinds of things they were accused of in the past. Didn't HardOCP even have an article once about driver cheating, and how just renaming an EXE suddenly made huge performance differences?
 
Zarathustra[H] said:
Different games naturally run differently on different hardware. This seems particularly true for DX11 games.

It is not unheard of for companies to push tests that make their products look the most favorable.

-Dirt 2 seems particularly forgiving of AMD's weak tessellation units in the 5xxx series, which is likely why we see AMD pushing that title all the time.

-Civilization 5, on the other hand, seems to tax the tessellation units hard, and as such really favors Fermi GPUs.

This is why benchmark results should be taken with a grain of salt. A good review would take into account the full spectrum of benchmarks available and speak to any outlier titles that favor one architecture over another.

Now if we are going to play the game of unfair marketing practices, why don't we look at AMD's "leaked" promotional slides from the 6850 and 6870 launch, where they attempted to make their boards look 3 times faster than the GTX 460 by not using a zero axis on their bar charts, thus exaggerating the appearance of a difference.

That is pretty shitty too...

Two wrongs don't make a right. And we're aware of some games being better suited to one platform, but this isn't a game. It's a benchmark. The entire purpose of a benchmark is to compare a product with its competitors. It has to be on even ground.

Now if they want to say "our tessellation techniques are better than AMD's, and this shows you we're telling the truth," then fine, they can use this demo - but you can't call it a benchmark, because it's not a level playing field.
 
This is just another reason "games" as a legit medium are not taken seriously. Adam Sessler has a great piece on this right now, about companies only letting people they like review their games.
 
Wasn't it Ubisoft who released a DX10.1 patch for Assassin's Creed, then recalled it when it showed AMD video cards beating Nvidia in benchmarks?
 
Not surprised NVDA is doing this. I'd be more surprised if they didn't do it especially after Batman AA and the PhysX bullshit.

Right now I'm just waiting for the judgment from the class action from the mobile GPU fiasco. Get some of my money back for the dead piece of shit 8400m so I can upgrade to a 6000 series.

Don't count on getting anything out of that. They allocated $2 million to be distributed across all claims. I believe that is $2 million before lawyers' fees, so you will be lucky to get anything.
 
"You guys know we shitcanned benchmarks years ago because we found those unreliable as indicators of gaming performance and GPU cross-comparisons."

Pretty much says it all, so I would encourage all to ignore what both NVIDIA and AMD say or claim, and wait for the real-world performance from [H]ardOCP.

Specifically, look at two scenarios and the historical evidence:

1. NVIDIA's claims are valid and UbiSoft is not in their pocket. Drivers are as important as hardware, and NVIDIA is known for better and more timely drivers and profiles. AMD knows this, but wants to cry foul because they want to make the biggest splash and profit upon release of their new line...before they have the drivers tweaked. (Evidence: AMD still doesn't have the drivers for the old line optimized, and more than one app developer is reliant on users "rolling back" to previous packages that worked. E.g. Arcsoft's TMT3 won't work properly with drivers beyond 10.5. Could be Arcsoft, could be AMD, or some combination of the two. With HTPCs becoming more prevalent, it makes no sense for this to happen. Also, since AMD is known to be untimely with drivers, perhaps they simply didn't submit requested information to UbiSoft when NVIDIA did. Who knows? There is a mound of historical evidence to support this.) Also, AMD fell far behind Intel with their CPU line, accentuated all the positives before release, and left after-the-fact real-world usage to discover that their CPUs were anemic compared to Intel's; only then did they claim they were targeting the low-to-midrange field.

2. UbiSoft IS in NVIDIA's pocket, and is optimizing for NVIDIA and not AMD. Clearly this would be unfair, and as evidence there is NVIDIA's benchmark cheating from years ago. That was quite a scandal that damaged NVIDIA's reputation deeply, and they know it. Still, IF AMD's new line IS superior to theirs, it's reasonable to assume they might try this again. After all, it's business, and corporations, if diagnosed as individuals under the guidelines of the DSM-IV (according to more than one recent study by psychologists), are sociopathic/psychopathic. They don't care about right/wrong/fairness; they just want to make a profit. (However, this last part applies equally to both businesses.)

[I'd also have to throw in that, in dollar-per-performance terms, NVIDIA's 4XX line has been a disappointment in the low-to-midrange field. Not bad products, but not worth it if more performance can be had from AMD for the same or less money. Consequently, this would lead one to believe that NVIDIA is worried about their market share as well.]


So...one is left to decide, using one's best critical thinking, which scenario could be true. The bottom line, however, is that there is simply not enough evidence to tell yet, with the "tell" being the release of the product and the subsequent independent testing.

Either scenario sounds equally likely, but it also doesn't have to be one or the other; it could be some combination of both. (AMD's lame AND NVIDIA's cheating.) Thanks to [H]ard and other sites for being our watchdogs; we can't afford to buy all the products, run our own tests, and keep only what turns out to be best for our use. We need to rely on the experts.
 
ATI cards should work with this demo if it's coded within the DX11 spec (which we can be pretty sure it is, or ATI would have complained about that). This isn't like PhysX - it's not an Nvidia-only feature, it's just using the DX "open standard", one of those things ATI is so fond of telling us it supports. If it doesn't work, it's either because:
a) ATI drivers suck.
b) ATI hardware is slow (at tessellation).

What with Richard Huddy's rant about *too much* tessellation, and now this, it seems that tessellation - the thing their marketing team couldn't stop talking about a year ago - has become the pariah.

I suppose all this really confirms is that the 6 series still can't tessellate as well as the Nvidia 4 series. Not that I really care that much - I haven't seen one game yet that looks noticeably better with tessellation on outside of a screenshot. Until I do, it's not going to overly influence my graphics card purchases.
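For context on why "too much" tessellation gets expensive fast: triangle count grows roughly with the square of the tessellation factor, so high factors explode the workload. A back-of-envelope sketch (a rough model only; real DX11 tessellation patterns are more involved than a clean factor-squared split):

```python
def approx_tessellated_tris(base_tris: int, factor: int) -> int:
    # Rough model: each patch edge is subdivided `factor` times, yielding
    # on the order of factor^2 sub-triangles per input triangle.
    return base_tris * factor * factor

approx_tessellated_tris(10_000, 1)    # 10,000 triangles (no tessellation)
approx_tessellated_tris(10_000, 16)   # 2,560,000 triangles
approx_tessellated_tris(10_000, 64)   # 40,960,000 (64 is DX11's max factor)
```

Which is why a game can drown any GPU in sub-pixel triangles long before the on-screen image looks any better.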
 