Benchmark Wars

Zarathustra[H];1036317636 said:
Now if we are going to play the game of unfair marketing practices, why don't we look at AMD's "leaked" promotional slides from the 6850 and 6870 launch, where they attempted to make their boards look 3 times faster than the GTX 460 by not using a zero axis on their bar charts, thus exaggerating the appearance of a difference.

That is pretty shitty too...

It's very common for graphs to be shown like this, and every release of every generation of card is accompanied by marketing graphs that try to show more of a difference than there actually is. Anyone with half a brain who knows how to read a graph will see that the intervals are clearly stated without being hidden in any way. Not an issue in my eyes.

Pushing canned benchmarks of unreleased games down reviewers' throats, on the other hand, strikes me as bad business. I've had Nvidia and ATI/AMD cards in my gaming history, which actually starts back with 3dfx, and I've had my ups and downs with both manufacturers and their hardware/drivers. I'll buy the best bang/buck regardless of manufacturer, but if the decision isn't clear then I base it on other factors, like the way the company does business, and Nvidia has made some real blunders.
 
nothing like benching the final product imho...

Agreed. This supports our stance that using released games off the shelf to evaluate a game's performance on video cards just makes sense.

The fact that AMD approached the developers with a solution and they chose not to use it is suspicious. I would like to hear a response from Ubisoft about this issue.

Same

Zarathustra[H];1036317636 said:
-Dirt 2 seems particularly forgiving of AMD's weak tessellation units in the 5xxx series, which is likely why we see AMD pushing that title all the time.

-Civilization 5, on the other hand, seems to tax the tessellation units hard, and as such really favors Fermi GPUs.

I don't think the 5xxx series has particularly weak tessellators; it just doesn't have the same granularity as Fermi does. But name a game that needs that ability: there aren't any.

I haven't seen the results you mention bear that out in my Civ5 testing.

Wasn't it Ubisoft who released a DX10.1 patch for Assassin's Creed, then recalled it when it showed AMD video cards beating Nvidia in benchmarks?

I remember that.
 
Just because something's in spec doesn't mean it's efficient. It's trivial to make bog-slow design decisions in OpenGL and DirectX, or anything else for that matter. AMD and Nvidia cards are not identical, and have certain differing architectural quirks that software developers have had to work around since GPUs practically came into existence... provided the amount of specialization is reasonable. Such is part of their jobs. The fact that AMD was able to demonstrate improvements does nothing but support the point that resolving this issue would be trivial, and thus reasonable, which would leave the ball in Ubisoft's court.

The rest of your post is just conjecture, but you sure seem eager to give nVidia preemptive back-patting.
 
So the ultimate point I'm taking from this is "Fuck Ubisoft" and I agree.
 
My reply to this is short, and consists solely of game titles:

Batman: Arkham Asylum
Borderlands
Need for Speed
City of Heroes
Doom 3

Each of these commercially released games was deliberately code sabotaged by Nvidia during the development stages. Nvidia has a long history of code sabotage and of paying developers/publishers to use artificially altered applications to improve perceptive performance of their products while minimizing the performance of a competitor.

Now, I'm not going to say that AMD/ATi has not done the same thing. As far as I am personally aware, directly aware, indirectly aware, or remotely aware, AMD/ATi has never actually been proven to have paid a developer / publisher to sabotage commercially released game code. As far as I am personally aware, directly aware, indirectly aware, or remotely aware, AMD/ATi has never actually been proven to have paid a developer / publisher to sabotage independently released game code.

I am directly aware of AMD/ATi working with developers to implement vendor-neutral solutions. Case in point: City of Heroes: Going Rogue. AMD helped Paragon Studios implement an OpenGL specification rendering engine that gives identical image quality as long as the driver supports ratified OpenGL specification calls. Yes, the changes to the CoH rendering engine did knock out graphics chips and drivers that did not make ratified OpenGL API calls (cough: INTEL :cough), but for the most part, Nvidia and AMD/ATi cards give the same image quality with roughly the same performance among similarly performing cards. I find AMD's statement about providing a neutral code solution to Ubisoft to be believable.
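
For what it's worth, "sticking to ratified calls" mostly just means feature-detecting the ARB/core path and falling back cleanly when a driver doesn't expose it. Here's a rough C sketch of that pattern; the extension choices and fallbacks are purely illustrative, not anything Paragon actually shipped, and it assumes a current GL context.

/* Rough sketch of vendor-neutral feature detection: prefer ratified ARB/core
 * paths, fall back when the driver doesn't expose them. Illustrative only,
 * not Paragon's renderer code; assumes a current OpenGL context. */
#include <string.h>
#include <GL/gl.h>

static int has_extension(const char *name)
{
    /* classic pre-GL3 query; a naive substring match is fine for a sketch */
    const char *ext = (const char *)glGetString(GL_EXTENSIONS);
    return ext != NULL && strstr(ext, name) != NULL;
}

void choose_render_target_path(void)
{
    if (has_extension("GL_ARB_framebuffer_object")) {
        /* ratified path: identical behavior on any conformant driver */
    } else if (has_extension("GL_EXT_framebuffer_object")) {
        /* widely supported EXT fallback */
    } else {
        /* last resort: render-to-texture via glCopyTexSubImage2D */
    }
}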

How long have you been on AMD's payroll? Code sabotage? What a load of bullshit right there. For one, why would any game developer want to potentially hurt sales of their titles in such a way? Everything I have ever heard is that both companies try to work with game/engine devs ahead of time to help improve performance, usually with the video card company working performance tweaks into their drivers ahead of time. Nvidia seems to be better at this; ATI seems to have a lot of driver problems. Blaming ATI's driver problems on sabotage by Nvidia is really quite the leap. "Improve perceptive performance of their products while minimizing the performance of a competitor"... lol, wth are you even saying? This is [H]ardOCP; they test the shit and let us know what's going on in real games. If the performance is better, it's better, "perceptive" notwithstanding.

AMD saying "hey, don't use this benchmark, it makes us look bad": well, what do you expect them to say? But one competitor saying something about the other is the kind of thing you take with a grain of salt.

[H], don't get caught up in the marketing BS from these two companies. Do your normal in-game benchies we all come here for. That's what I will be waiting for (although I'd never even heard of this game until today, so no idea if it is something I will want to play).

Some more likely scenarios for this whole deal: 1) AMD's card is simply performing poorly, and AMD is looking for an excuse to buy time to get their drivers optimized. Nvidia has already done so, knows their card is performing well, and wants to tell people about it (marketing people doing their jobs, nothing unexpected). 2) The developer has a demo using early code, and doesn't want to spend time/money making a new demo with newer code. Entirely believable if they are in crunch mode and trying to get this game out in time for a Christmas holiday release (when is this game supposed to come out?). The early code may indeed contain only partial optimizations, and if true it's only because Nvidia is much more on the ball when it comes to stuff like this.

ATI, pull your shit together and get with game devs in a more timely fashion. This PR-spin stuff isn't going to be any excuse when the game comes out and your stuff runs crappy on it. What about all the other games where the performance isn't what it should be? What about CrossFire? People are paying good money for your products and they deserve better support than they are getting.
 
Just because something's in spec doesn't mean it's efficient. It's trivial to make bog-slow design decisions in OpenGL and DirectX, or anything else for that matter. AMD and Nvidia cards are not identical, and have certain differing architectural quirks that software developers have had to work around since GPUs practically came into existence... provided the amount of specialization is reasonable. Such is part of their jobs. The fact that AMD was able to demonstrate improvements does nothing but support the point that resolving this issue would be trivial, and thus reasonable, which would leave the ball in Ubisoft's court.

The rest of your post is just conjecture, but you sure seem eager to give nVidia preemptive back-patting.

Hm, apparently missed a quote button. :V That was referring to sbuckler's post. :p
 
I don't think the 5xxx series has particularly weak tessellators; it just doesn't have the same granularity as Fermi does. But name a game that needs that ability: there aren't any.
Is that the chicken or the egg? Is it not being implemented because it's not needed or because only 1/2 the cards out there can run it?
 
It is really unfortunate that I bothered to read your entire post before getting to this show stopper.

Reminds me of Batman's AA implementation. They also chose to put the vendor-ID block at the end of the code instead of the beginning, so ATI cards got a performance penalty without any AA being applied. :p
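
To make that complaint concrete, here's a rough C sketch of why the placement of a vendor-ID check matters. The function names and flow are made up purely for illustration; only the PCI vendor IDs are real.

/* Sketch only: stand-in stubs for renderer work, not actual game code. */
#include <stdio.h>

#define VENDOR_NVIDIA 0x10DEu  /* real PCI vendor IDs */
#define VENDOR_AMD    0x1002u

static void render_scene(int msaa)    { printf("render scene, MSAA=%d\n", msaa); }
static void resolve_msaa(void)        { printf("resolve MSAA buffers\n"); }
static void discard_msaa_result(void) { printf("throw away MSAA result\n"); }

/* Check up front: non-whitelisted cards never pay for MSAA they won't get. */
static void frame_with_early_check(unsigned vendor)
{
    if (vendor != VENDOR_NVIDIA) { render_scene(0); return; }
    render_scene(1);
    resolve_msaa();
}

/* Check at the end (what the post alleges): everyone renders with MSAA,
 * but only the whitelisted vendor keeps the resolved output. */
static void frame_with_late_check(unsigned vendor)
{
    render_scene(1);
    if (vendor == VENDOR_NVIDIA) resolve_msaa();
    else                         discard_msaa_result();
}

int main(void)
{
    frame_with_early_check(VENDOR_AMD); /* cheap no-AA path */
    frame_with_late_check(VENDOR_AMD);  /* full MSAA cost, no AA on screen */
    return 0;
}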

Seriously though, this is low if true, so it would be nice to hear all sides of this story.

Kudos to [H] for revealing this in any case!
 
Is that the chicken or the egg? Is it not being implemented because it's not needed or because only 1/2 the cards out there can run it?

I'm still waiting for a game where the benefit of tessellation is readily apparent, heh. I have yet to find a game where I'm impressed by what tessellation brings to the game.
 
Water is not wet; wet is a property of another object whose surface is adulterated by water molecules... :) Just as oxygen cannot be oxygenated.

Mind.
Blown.

Anyway, this type of crap is what kicks ass about the [H] reviewing system: it's basically immune to all this bullshit.

Everyone else = and in this benchmark there is a .8 fps difference, and in this one it's .6 fps the other way...

[H] = This is the experience that you will have.
 
Fuck canned benchmarks, particularly as they relate to unreleased games.

Thankfully [H]ard will actually play the damn game when it's released and let us know what time it really is.

In addition, there are plenty of DX11 titles already out that may be played and reported on to give us a fairly accurate idea of what the 6850|6870 can do.

Thanks for the heads up, Kyle.
 
Some of you guys clearly aren't reading the mini-article that Kyle wrote and are just commenting based on the one-liner at the beginning of this thread.

Kyle, thanks for looking out for us gamers. This crap really needs to stop and Ubisoft just dipped even lower in my view. They were already 10 floors below ground level after that crap DRM they were doing.

Also, shame on Nvidia. I used to be a hardcore Nvidia fan, as they were the video cards I upgraded to every year once 3dfx died. I switched over to ATI once their TWIMTBP crap started.
 
I also want to say, I think it is both AMD's and NV's job to propose code to the game dev that could help improve performance. Who knows their cards better than the companies themselves? They would know how to get better performance on their cards in the game, so I see nothing wrong with proposing code tweaks to the game dev. At that point, it is up to the game dev to decide, and I would hope that the game dev would not implement something that would gimp the other card. I also hope that AMD or NV would not be pushy or bully them into using their code if it does gimp the other card. I would hope the game dev does the right thing for their game. When it comes down to it, I just want the best experience in the game: real, meaningful results. I also think we should use gold game releases, that are available, to relate that game experience to gamers. Preview benchmarks do not represent the final gameplay experience that gamers will have in the actual retail game. This is all my opinion, but that is how I look at things like this.
 
My reply to this is short, and consists solely of game titles:

Batman: Arkham Asylum
Borderlands
Need for Speed
City of Heroes
Doom 3

Each of these commercially released games was deliberately code sabotaged by Nvidia during the development stages. Nvidia has a long history of code sabotage and of paying developers/publishers to use artificially altered applications to improve perceptive performance of their products while minimizing the performance of a competitor.

This really makes me wonder about F@H!
ATI's newest cards are getting HALF the PPD of nVidia's last generation cards.
A 5770 gets 4k PPD and a GTX 260 Core 216 gets 8k PPD... and the ATI card has 4 times more stream processors.
So is this going on with F@H too Kyle?
 
They intentionally disabled PhysX when it was paired with an AMD GPU even though it worked perfectly fine.

They also disabled all support for Ageia PhysX cards, even though they still provide decent physics acceleration, equivalent to an 8800 GTS dedicated to physics. I picked one up for $20, assuming that they were cheap because they were out of production. But, as it turns out, the new PhysX driver disables the card completely. That's really shady.
 
As usual, most people here don't even listen to all sides of the story before passing judgment. AMD said NVIDIA is cheating, so it must be true... How about waiting for what Ubisoft has to say?
How about contacting Ubisoft for a response, Kyle, and then publishing everything?

People like to say "innocent until proven guilty", but that's a load of BS here, considering that most people have already held the trial, convicted one side, and are readying the axe to cut some heads off...
 
As usual, most people here don't even listen to all sides of the story before passing judgment. AMD said NVIDIA is cheating, so it must be true... How about waiting for what Ubisoft has to say?
How about contacting Ubisoft for a response, Kyle, and then publishing everything?

People like to say "innocent until proven guilty", but that's a load of BS here, considering that most people have already held the trial, convicted one side, and are readying the axe to cut some heads off...

I'm sure Kyle's article's sole intention was to call out all the parties involved in this fiasco. Sometimes this is the only way. Otherwise you'd get a lot of "no comments" when asking a party directly.

Lay everything out in the open and let them have at it.
 
How low can NVIDIA stoop?

We all know that Ubisoft is NVIDIA's stooge. Just look at the Assassin's Creed debacle with the removal of DX10.1

Also look how the recent Ubisoft games have all been DX9. A secret ploy between NVIDIA and Ubisoft to give AMD users a poor experience.

Say what you want about Intel but they are angels compared to NVIDIA when it comes to shady software optimization techniques.

Thanks for the update Kyle. :)
 
If I'm buying a GPU, then I review benchmarks to determine which card offers the best performance/quality per dollar in the games that are driving my desire to get better performance/quality. HAWX 2 is not one of those games.

In my mind, who this really makes look bad is Ubisoft (and they certainly don't need any help from anyone to look bad).
 
Now, is Nvidia grasping at straws? I have my own opinions about this, and my own opinion is that Nvidia, as a graphics card company, doesn't have long to live.

If you want to get down to it, they don't have much of a future if the current product family stays 'as is' so to speak.

Their chipset division is on life support with no future growth; that large revenue division is basically dead.

The SoC line has been a disaster: Tegra and Tegra 2 have few design wins and even fewer actual product releases.

Fermi has been a dog for the Quadro/Tesla market.

AMD and Intel are pushing hard over the next few years to basically make the low-end GPU obsolete with the APU/CPU-GPU hybrid chips each company is investing heavily in.

The low-end/OEM GPUs are where the most money is made, on the sheer volume of sales. The market segment we as gamers live in is peanuts and relatively meaningless for overall net revenue. What we provide is the marketing hype and chest-beating material that helps trickle down into brand recognition.

So I can see why they are doing it (well, they have always done it), but now their software advantage is basically all that's left.
 
Does anyone know of a book/article/website that has a good historical collection of the various shenanigans the video card companies have pulled over the years, like the quake fiasco, etc.?
 
They used "benchmarks". The testing I have done is real-world game playing. Stay tuned later this week for some Civ5 results.

I'm looking forward to it. :)

If it includes non-SLI, non-CX and the new AMD 6xxx hardware it will be particularly useful to me :)

I'm sticking with my GTX470 for my desktop at least until after the holidays, but I am looking for a good mid-range (read 1 6pin connector) for my laptop with a DIY Vidock. Right now I'm leaning GTS450, but this could change depending on the testing of the new Barts cores on Friday :)

(I'm hoping to buy myself an Xmas present in the form of a 30" 2560x1600 flat panel, at which point I fear the GTX470 may no longer be sufficient.)
 
Some of you guys clearly aren't reading the mini-article that Kyle wrote and are just commenting based on the one-liner at the beginning of this thread.

Kyle, thanks for looking out for us gamers. This crap really needs to stop and Ubisoft just dipped even lower in my view. They were already 10 floors below ground level after that crap DRM they were doing.

Also, shame on Nvidia. I used to be a hardcore Nvidia fan, as they were the video cards I upgraded to every year once 3dfx died. I switched over to ATI once their TWIMTBP crap started.

Some of you act like Nvidia is injecting malicious code to damage performance on AMD cards and bullying the developers into allowing this. This is not what happens. Nvidia offers up help to make sure that games run well for their customers; it is up to the developers to analyze and implement the code. Developers and publishers are responsible for the game, not NVIDIA. Also, it is very likely that some of the changes benefit AMD as well, since this is all written for DirectX. Certain TWIMTBP games are notably faster on AMD cards as well, which undercuts the conspiracy theory. Meanwhile, some people are still happy when AMD seemingly spends no money or effort working with devs to make sure games run efficiently on their hardware.
 
As usual, most people here don't even listen to all sides of the story before passing judgment. AMD said NVIDIA is cheating, so it must be true... How about waiting for what Ubisoft has to say?
How about contacting Ubisoft for a response, Kyle, and then publishing everything?

People like to say "innocent until proven guilty", but that's a load of BS here, considering that most people have already held the trial, convicted one side, and are readying the axe to cut some heads off...

Yeah, but given the track record of both, it's not a stretch to see NV, or AMD for that matter, being less than honest. I mean, this is not the first time this has happened...


Besides, this is another reason I gave up on PC gaming. Too much drama and BS. Much easier to relax on the couch after a long day and play some games on the console (agreed, giving up lots of eye candy, but at the end of the day it's about how one enjoys the experience vs. how "good" each pixel looks).
 
As usual, most people here don't even listen to all sides of the story before passing judgment. AMD said NVIDIA is cheating, so it must be true... How about waiting for what Ubisoft has to say?

Indeed, how could they jump to conclusions, given the fine, upstanding history Ubisoft has with the DX10.1 Assassin's Creed issue, the Batman AA issue, etc., etc. There's an earlier post mentioning about six other games in Ubisoft's past where performance was suspiciously inhibited and/or patches were removed because they made the games run faster on ATI's cards, etc.

Your sentence is more akin to: "As usual, most people here don't even listen to all sides of the story before passing judgment. The police said this man robbed a store, so it must be true. What about what the man has to say? He may be a convicted fellow with a string of robberies in his history, but why are people jumping to the conclusion that he would do it again?!"
 
This really makes me wonder about F@H!
ATI's newest cards are getting HALF the PPD of nVidia's last generation cards.
A 5770 gets 4k PPD and a GTX 260 Core 216 gets 8k PPD... and the ATI card has 4 times more stream processors.
So is this going on with F@H too Kyle?

It's a mixed bag of issues. While I'm pretty sure Nvidia has their hand in Pande Group's back pocket, the secondary problem is that ATI/AMD decided to ditch Brook++, which is inefficient as hell for GPGPU, and moved to OpenCL, which means they have to re-write the client from the ground up. They currently have an OpenCL client being tested internally that will hit the closed beta testers within the next 2 months; from there it will be tested by them for another 2 months and then eventually released to the public. But this also lies on Pande Group for not hiring coders that actually know what they are doing. There are 2 BOINC projects that have already proven that the 5870 eats Nvidia cards for breakfast when it comes to GPGPU. For example, in Collatz it takes an overclocked GTX 480 about 8 minutes to finish a WU, compared to a stock 5870 that takes roughly 5 1/2 minutes to complete the same WU; an overclocked 5870 takes roughly 4 1/2 minutes. MilkyWay@home is the other BOINC project that shows the same difference, with a 5870 taking less than half the time to complete the same WU as a GTX 480.
 
Some of you act like Nvidia is injecting malicious code to damage performance on AMD cards and bullying the developers into allowing this. This is not what happens. Nvidia offers up help to make sure that games run well for their customers; it is up to the developers to analyze and implement the code. Developers and publishers are responsible for the game, not NVIDIA. Also, it is very likely that some of the changes benefit AMD as well, since this is all written for DirectX. Certain TWIMTBP games are notably faster on AMD cards as well, which undercuts the conspiracy theory. Meanwhile, some people are still happy when AMD seemingly spends no money or effort working with devs to make sure games run efficiently on their hardware.

Devs looking at code that is given to them for free? You expect too much from them!
http://forums.steampowered.com/forums/showthread.php?t=1263556
http://torrentfreak.com/ubisofts-no-cd-answer-to-drm-080718/
 
ATI cards should work with this demo if it's coded within the DX11 spec (which we can be pretty sure it is, or ATI would have complained about that). This isn't like PhysX; it's not an Nvidia-only feature, it's just using the DX "open standard", one of those things ATI is so fond of telling us it supports. If it doesn't work, it's either because:
a) ATI drivers suck.
b) ATI hardware is slow (at tessellation).

What with Richard Huddy's rant about *too much* tessellation and now this, it seems that tessellation, the thing their marketing team couldn't stop talking about a year ago, has now become the pariah.

I suppose all this really confirms is that the 6 series still can't tessellate as well as the Nvidia 4xx series. Not that I really care that much: I haven't seen one game yet that really looks noticeably better with tessellation on outside of a screenshot. Until I do, it's not going to overly influence my graphics card purchases.

I agree somewhat with Huddy about "too much" tessellation. It's true that high levels of tessellation can be very difficult to notice, especially on surfaces that are facing the viewer. It's also true that current GPU architecture is built for "large triangles" of 8+ pixels. When triangles get to pixel size or sub-pixel size, the hardware can't take advantage of parallelism, the shaders suffer from over-shading, and performance plummets. Clearly, this level of tessellation should be avoided, or perhaps only spent on areas of the image that would show image quality improvements, such as edges, silhouettes, and surfaces at shallow angles. An effective implementation of tessellation doesn't need to require enormous triangle throughput on the hardware. But I don't think Barts has "enough" tessellation power for AMD to sit back comfortably and say that the rest of the problem is in the hands of the developers. Although developers certainly are part of the "problem"...

<off topic>

We have yet to see many games adopt tessellation for anything more than a checkbox feature aside from Civ 5. Part of the problem lies in the difficulty in creating art assets that can be tessellated since "standard" polygonal meshes don't always work well. It's usually hard to "add tessellation" in a meaningful way to an in-progress game without recreating a lot of content. Terrain meshes are an obvious exception that are fairly easily tessellatable, so games like Civ 5 can adopt that fairly quickly but we won't see anything like Heaven until games are being made with their entire art pipeline having tessellation in mind, which will probably be aided along once tools get better so the artists can quickly preview how their work tessellates.

Another part of the problem is the lack of guides and examples for writing tessellation code. The DX11 SDK has a few examples but they miss out on useful things, such as reducing tessellation factor for distant objects and increasing tessellation factor on silhouettes and edges where smoothness is noticeable. It may take a little while yet before programmers are familiar and comfortable with tessellation.
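
As a rough illustration of the kind of heuristic those samples skip, here is a plain C sketch: drop the factor with distance and boost it near silhouettes. The constants and ranges are arbitrary, and a real implementation would live in the hull shader rather than on the CPU.

/* Sketch of an adaptive tessellation-factor heuristic: spend detail only
 * where it is visible (close up and on silhouettes). Plain C stand-in for
 * per-patch hull-shader logic; constants are arbitrary. */
#include <math.h>

#define MIN_TESS 1.0f
#define MAX_TESS 16.0f

/* n_dot_v: dot of patch normal and view direction
 * (~0 means silhouette, ~1 means facing the camera) */
float patch_tess_factor(float distance, float n_dot_v)
{
    /* fade detail out between 10 and 200 world units (arbitrary range) */
    float d = (distance - 10.0f) / 190.0f;
    if (d < 0.0f) d = 0.0f;
    if (d > 1.0f) d = 1.0f;
    float dist_term = 1.0f - d;

    /* silhouettes get a boost, since coarse geometry is most visible there */
    float edge_term = 1.0f - fabsf(n_dot_v);

    /* a real implementation would also clamp so triangles stay above ~8 pixels */
    return MIN_TESS + (MAX_TESS - MIN_TESS) * dist_term * (0.5f + 0.5f * edge_term);
}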

</off topic>

I'm genuinely curious what exactly HAWX2 does that performs so slowly on AMD hardware aside from simply going overboard on tessellation. Hopefully someone does some before/after analysis once AMD gets its driver in shape for it.
 
I'm sure Kyle's article's sole intention was to call out all the parties involved in this fiasco. Sometimes this is the only way. Otherwise you'd get a lot of "no comments" when asking a party directly.

Lay everything out in the open and let them have at it.


Agree. What did it take, 2 days for ATI and Nvidia to release an SLI profile update after Kyle's article? And that was after almost 2 months of Nvidia not releasing an SLI profile update. I have a feeling we will get the same response out of this as well, except this time it would be Ubisoft fixing the game; but being that it's Ubisoft, I highly doubt it.

What this really sounds like to me is that AMD suggested a better way of doing the DX11 work that would benefit both Nvidia and AMD. AMD expected Ubisoft to use the code, but Ubisoft rejected it, while Nvidia was already working on their drivers to tweak performance for the half-assed code Ubisoft decided to use, leaving AMD behind because Ubisoft didn't use the improved code. So now AMD is scrambling to tweak their drivers for Ubisoft's half-assed code, which is why they don't want people running the benchmark in reviews.
 
I'm genuinely curious what exactly HAWX2 does that performs so slowly on AMD hardware aside from simply going overboard on tessellation. Hopefully someone does some before/after analysis once AMD gets its driver in shape for it.

Good point. This really could just be a matter of HAWX 2 being particularly tessellation-heavy, and AMD just trying to cover up the fact that their parts are weaker than Fermi when it comes to tessellation.
 
Zarathustra[H];1036318080 said:
This is the only comprehensive Civ 5 review I have seen to date with benchmarks across different hardware.

I linked the 1920x1200 page as I believe that is a pretty common res around here. The differences become more pronounced as you go down in resolutions. Even the GTS450 beats the 5830...
The biggest thing out of that review is we are still seeing gains even at 4Ghz on an i7. Bring on the Sandy Bridge!


See: PhysX for some irony.

I was thinking of that when I was writing my post.
 
So AMD cards have poor tessellation performance. An upcoming benchmark includes lots of tessellation, and AMD asks the developers to tune it down so the benchmark runs better on their products. The developers refuse, and AMD decides to degrade the tessellation quality via their drivers.

And all you people blame NVIDIA? WTF?
 