Nvidia caught cheating; latest drivers 169.xx cheat in Crysis demo

Man, simply rename the .exe and this "optimization" is gone. It's so obvious it's not even funny.

Sometimes a driver will fix one thing and break ten others. App detection is nothing new; it is done to provide app-specific optimizations and bug fixes, and it does not inherently mean "cheating." Both ATI and NVIDIA use app detection for various things.
 
It's still a beta driver, and no card company is clean on this issue; both sides have done "optimizations" in the past. The only thing to wonder is: do they simply inflate a benchmark, or do they actually help the experience?

I think it's a mix of both at the moment. AMD needs to get their act together and deliver some real performance instead of crying "buy our stuff because they are cheating."

Most of us have been there, done that with these stories, and what it comes down to is that whoever gives the best performance deserves the cash. Even without the optimizations, NVIDIA's cards were still trouncing AMD's.

Makes you wonder if AMD has anything left to give for their drivers, or if this is it and you have no choice but to wait on yet another generation of ATI cards which will be late and beaten by nVidia's random drops of high end offerings...
 
Actually, there are about four important ones, and they show a most definite 7% drop. Are you doubting the integrity of the tests, or did you simply not do the math properly, Kyle? ;)

Ok, let's look at the data gleaned from the EB graphs. Which ones are the important ones? The ones that prop up your statement? So we need to ignore all the other data, right?

These figures show the actual frames-per-second differences in score. And a couple of the tests do not support the conclusion at all, but that is another discussion.

0.01
0.04
0.1
0.03
1.58
2.05
1.58
2.05
1.72
2.21
3.82
0.83

I feel pretty good about saying that a couple of frames per second is no big deal.
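
For what it's worth, a quick back-of-the-envelope average of just the deltas listed above backs that up (nothing more than the arithmetic on those numbers):

Code:
# Per-test FPS deltas as listed above
deltas = [0.01, 0.04, 0.1, 0.03, 1.58, 2.05, 1.58, 2.05, 1.72, 2.21, 3.82, 0.83]

print(f"mean difference:    {sum(deltas) / len(deltas):.2f} fps")  # ~1.34 fps
print(f"largest difference: {max(deltas):.2f} fps")                # 3.82 fps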
 
Um, the linked screenshots comparing the Radeon 2900 XT and the 8800 GT... the GT looks, well, a lot better. The rock on the upper left can be seen in the reflection on the GT, while on the XT it's more blurry, and the edges of the water are smoother on the GT. I dunno, looks better to me. :p
 
Ok, so what I have done is run the benchmark fps results through two statistical tests.
First, I used the Mann-Whitney U test. It's a relatively relaxed test as far as requirements go, and simply tells you whether there is a significant difference between two pools of data. (For the stat-heads out there: I used a two-tailed test, since going in I had no reasonable expectation of a difference in either direction for these fps data points.)
My target P value is P=.05, meaning that if P>.05 (an accepted scientific standard) there is not a significant difference, and if P<.05 there is.
After running it through, my P value was .795. If P=1, the two would be exactly the same, so there isn't even a trend visible.

The second test is a little more appropriate, but requires the data to be matched: the Wilcoxon signed-rank, matched-pairs test (again, two-tailed). Thankfully, EB matched the data for us, matching for hardware, software settings, and all other variables except driver or .exe name (the same variable, in essence). This test can detect finer differences and can even say in WHICH direction the difference lies.
In this case the P value was .062, towards the better frame rates as discussed.
Now this is a bit closer, but still not significant. It does indicate a trend towards lower fps in the beta driver. Though this trend exists, the lack of significance means it could be attributed to the drivers, sure, but it could equally be caused by variables in the test setup, observer error, etc.


So, I would say that the supposed 7% difference is nothing to write home about, and really could be attributed to many things, including the drivers.
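
If anyone wants to poke at the numbers themselves, both tests can be reproduced with SciPy along these lines (the fps lists below are made-up placeholder values, not EB's data; substitute the real paired results from the graphs):

Code:
# Rough sketch: Mann-Whitney U (unpaired) and Wilcoxon signed-rank (paired)
# on two pools of fps results. The numbers here are hypothetical placeholders.
from scipy.stats import mannwhitneyu, wilcoxon

fps_original = [31.2, 28.7, 25.4, 40.1, 35.6, 22.3]  # stock crysis.exe (placeholder values)
fps_renamed  = [30.1, 27.9, 24.8, 38.6, 34.2, 21.9]  # renamed .exe (placeholder values)

# Mann-Whitney U: is there a significant difference between the two pools?
u_stat, p_u = mannwhitneyu(fps_original, fps_renamed, alternative='two-sided')

# Wilcoxon signed-rank: paired comparison, since each pair shares hardware and settings.
w_stat, p_w = wilcoxon(fps_original, fps_renamed)

print(f"Mann-Whitney U:       p = {p_u:.3f}")
print(f"Wilcoxon signed-rank: p = {p_w:.3f}")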

BTW, EB's forums have to be the most god awful garish thing ever. Hurts to look at.
 
I would call it a "let the 8800 GT be as impressive as possible" move during the initial benchmarks. Each frame counts when there are so few, as in Crysis. "Later on, once people have seen the benches, we will 'fix' this bug." ;)
Possibly, but it's a bit pessimistic. I'd venture to guess that NVIDIA is just pushing too hard and not catching these issues. The water shader update issue does need to be resolved, but it's also a bit overblown. I've seen nothing to actually suggest that it's intended to update every third frame either (can anyone fill me in on this?), and we have no R600 comparisons on this specific issue, so I don't know where EB got the info that every three frames is "correct".

If you study the 150 mph benchmark flythrough, it's an issue. If you play the game, it's a non-issue, but the benchmark does need to be comparable across all hardware.

Man, simply rename the .exe and this "optimization" is gone. It's so obvious it's not even funny.
If you renamed all of your game executables, performance would drop as well. This shouldn't be surprising to you -- this is typically just how game-specific optimization works.
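
To illustrate the general idea of per-executable profiles, here's a made-up toy sketch in Python; it is emphatically not how NVIDIA's or ATI's driver is actually implemented, just the concept of keying optimizations off the executable name:

Code:
import ntpath

# Hypothetical per-application profile table -- purely illustrative.
APP_PROFILES = {
    "crysis.exe":   {"water_reflection_updates": "reduced", "shader_replacements": True},
    "bioshock.exe": {"shader_replacements": True},
}
DEFAULT_PROFILE = {"water_reflection_updates": "full", "shader_replacements": False}

def pick_profile(exe_path):
    """Select optimizations by executable name. Rename the .exe and you
    silently fall back to the generic (correct, but often slower) path."""
    return APP_PROFILES.get(ntpath.basename(exe_path).lower(), DEFAULT_PROFILE)

print(pick_profile(r"C:\Games\Crysis\Bin32\Crysis.exe"))         # game-specific profile
print(pick_profile(r"C:\Games\Crysis\Bin32\crysis_renamed.exe")) # generic fallback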

Given the source of this at-best highly questionable BS, I have no faith in its integrity at all. AMD/ATI brought it up, so right there I am already thinking sour grapes and 'woe is me' syndrome on overdrive. My only question is?
They both do the same thing. Remember NVIDIA on the Call of Juarez benchmark? NVIDIA cried foul on AMD for so-called NVIDIA-crippling optimizations, yet it's likely that's not what AMD was intending. If NVIDIA had something to bring to the table there, they would have come down on it fairly hard. That issue just sort of disappeared without much fanfare, but obviously Crysis is a much bigger fish.

I'll admit, DAAMIT seems to do this sort of thing more often, but they're the ones in a more desperate position at this point, so it isn't surprising. I'd imagine a lot of you recall AMD slamming NVIDIA for its video playback quality only to get totally owned by NVIDIA's response :)
 
While I agree that it's a little premature to call it a cheat, considering it's just a demo and not the full version, you have to bear in mind that it's not just the 169.04s. The same problem has been in the last four versions of the drivers, and one of them is a WHQL-certified driver. If it's just a simple bug, why hasn't NVIDIA squashed it?

When the full version comes out (and I know ATI/NVIDIA will have yet another driver version out by then), if it still occurs then you have to at least question what's going on, considering NVIDIA is telling their customers to use this driver for Crysis. It's in their release notes, isn't it?
 
"or rename the crysis.exe in your game folder to something else (not the name of another game) to remove the offending broken optimisation"

I don't have my computer up and running yet so I can't test this, but does he say not to use another game's exe because it will pick up specific optimizations not meant for Crysis, or because if you name it after another game the reflections will update just as slowly? If it's the latter, then how is NVIDIA specifically targeting the Crysis benchmark when it affects all game optimizations? That would be more of a "broken driver" than "cheating".
 
No, it was cheating and it was by nVidia. It's the reason why decent reviewers haven't relied on 3DMark ever since.



http://hardocp.com/article.html?art=NTQ5


http://www.techspot.com/news/5468-nvidia-caught-cheating.html

Right, but what you described was not what was in the "cheat". What you described is something your video card does (both ATI and NVIDIA) on every frame of every game: everything outside of your "viewable area" (aka FoV) is "culled" so that you don't waste precious cycles processing things that won't end up being displayed on screen. I'm pointing out that what you described is not this "cheat", but a common thing that every game engine does.
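
To make that concrete, here's a toy Python sketch of that kind of field-of-view test; it's a crude cone check, not the full view-frustum (plus occlusion) culling a real engine or driver does:

Code:
import math

def in_view(camera_pos, camera_forward, fov_degrees, object_pos):
    """Keep an object only if it lies inside the camera's viewing cone.
    camera_forward is assumed to be a unit vector. Real engines cull against
    the full view frustum (and do occlusion culling on top), but the principle
    is the same: don't spend cycles on what can't be seen."""
    dx, dy, dz = (object_pos[i] - camera_pos[i] for i in range(3))
    dist = math.sqrt(dx*dx + dy*dy + dz*dz)
    if dist == 0:
        return True
    # Cosine of the angle between the camera's forward vector and the object direction
    cos_angle = (dx*camera_forward[0] + dy*camera_forward[1] + dz*camera_forward[2]) / dist
    return math.degrees(math.acos(max(-1.0, min(1.0, cos_angle)))) <= fov_degrees / 2

# Anything behind the camera or far off to the side never reaches the expensive stages.
print(in_view((0, 0, 0), (0, 0, 1), 90, (0, 0, 10)))   # True  -- straight ahead
print(in_view((0, 0, 0), (0, 0, 1), 90, (0, 0, -10)))  # False -- behind the camera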
 
Anyone that calls this cheating (here we go again) is just plain stupid and childish.
 
Um, the linked screenshots comparing the Radeon 2900 XT and the 8800 GT... the GT looks, well, a lot better. The rock on the upper left can be seen in the reflection on the GT, while on the XT it's more blurry, and the edges of the water are smoother on the GT. I dunno, looks better to me. :p

In a game where I'm being shot at and having rockets leaving craters all around me, I really don't think I have time to stop and admire the beautiful water edges and reflection, or bitch about why they are so blurry.
 
Anyone that calls this cheating (here we go again) is just plain stupid and childish.




And too easily swayed by AMD FUD/marketing? But again, given the originating party/site that is reporting this 'news', it's not at all surprising.


In a game where I'm being shot at and having rockets leaving craters all around me, I really don't think I have time to stop and admire the beautiful water edges and reflection, or bitch about why they are so blurry.


Well, you do have the time; the only thing is, it'll likely get you killed in a hurry. :D
 
I noticed that effect when I was doing some benchmarks and assumed it was being done by the game itself rather than the driver, although my Quadro doesn't do it so I suppose that makes sense in hindsight. I don't know how much of a hit the water reflections are, but since I hadn't even noticed the infrequent reflection updates during normal play I thought it was a pretty sensible optimisation. Of course I notice it now, especially on the boats, but I don't find it distracting.
That said, if we're going to be flying aircraft around in the full game as the options screen suggests, then they'll never get away with it.
 
Of course ATI feels cheated. The 2X00's AA is shamefully slow and bothersome, so Nvidia asks all reviewers of their cards to run 4xAA in all tests, which gives those cards an artificially skewed higher score, since AA performance scaling is no indicator of how strong or well-performing a card is...

What are you on? AA is most definitely an indicator of how strong a card is. For example, if card A can play (insert random game) at 1680x1050 with 4xAA at the same or higher frame rate as card B at 1680x1050 with no AA, then card A is definitely faster (at least in that game).
 
While I agree it's only a bug in Nvidia's drivers, there are certain people on this forum who are being a bit ridiculous by claiming that EB is just a puppet for AMD. One just needs to look at Hanners' conclusion:

Hanners said:
We can only imagine that the reduction in image quality we're seeing is due to a Crysis-specific optimisation that isn't working exactly as it should, thus increasing performance at the cost of rendering the game exactly as the developer intends it to look.

He's not claiming Nvidia is cheating in any way; he acknowledges it to be a minor driver bug. I think it's fair for a hardware review site to raise driver issues about the products it reviews. I'm failing to see what Hanners did wrong in this situation. It's a bit surprising and rather upsetting to see negative remarks toward a decent website.
 
Yeah if they were going to risk a cheat I'd think they would go for more than a 7% return. It's not like ATi has anything close to their cards now anyway. IMO ATi needs to quit whining about what nVidia is doing and release something worth owning.
 
Lol, I don't know what they are using in the demo machines at LAN parties. It runs great on those machines from what I've seen. They must be running a server with dual quad-cores and quad-SLI 8800 Ultras. I don't know how they do it, but whatever is under that hood is fast. :)
 
Of course ATI feels cheated. The 2X00's AA is shamefully slow and bothersome, so Nvidia asks all reviewers of their cards to run 4xAA in all tests, which gives those cards an artificially skewed higher score, since AA performance scaling is no indicator of how strong or well-performing a card is.

What happened to the days when it was truly scandalous to do per-executable optimizations? How does the pre-Crysis Nvidia driver perform? How will the Crysis ATI driver perform? The way the ATI performance in Crysis surprised us made me wonder what is going on; Crytek has been with Nvidia for a very long time, after all. What in the world have their demo machines been running if people are having performance issues like this in the singleplayer demo? Do these optimizations come from Crytek themselves, as part of Nvidia's TWIMTBP program?


(e: I still like game-specific optimizations, but give players that choice to make. As an example, I'd like to have faster performance at the cost of visual quality in multiplayer Crysis. Could this explain why the shadows in the HL2:EP2 version of Source look hard and sharp on 8X00s but (too) soft on 2X00s? Time to investigate further :eek: )


*Artificially* skewed? How is it artificial, when most people choose to run 4xAA if they possibly can afford it? It's quite likely NVIDIA is doing it because it works out in their favor.. but who cares? They've earned the ability to recommend running with AA on by creating a faster card, right? And reviewers still have a mind of their own to test what they choose, not just what NVIDIA says, right? There's nothing artificial about 4xAA.

AMD/ATI don't need any pity here. Just more R&D.
 
For those people who said that a desperate AMD/ATi should concentrate more on their performance rather than spreading word of nVidia's "cheat", let's not forget that the demo runs better on the 2900 XT than on the 8800 GTS. So I don't think this is a desperate move by ATi; it looks more like a desperate move by nVidia to catch up to R600 performance. Btw, I'm not against the optimization.
 
1-3 fps to me seems like it is within the margin of error you get when running timedemos, or even doing manual run-throughs. If I saw a 1-3 FPS difference in my graphs, I wouldn't say any card had a clear advantage.
 
While I agree it's only a bug in Nvidia's drivers, there are certain people on this forum who are being a bit ridiculous by claiming that EB is just a puppet for AMD. One just needs to look at Hanners' conclusion.

He's not claiming Nvidia is cheating in any way; he acknowledges it to be a minor driver bug. I think it's fair for a hardware review site to raise driver issues about the products it reviews. I'm failing to see what Hanners did wrong in this situation. It's a bit surprising and rather upsetting to see negative remarks toward a decent website.

Indeed, it's more than a little hurtful to see such a respected and major web site owner disparaging my name, and that of my site, in public with no evidence to the contrary. Very disappointing. :( Hopefully at some point he'll sit down and read the article in its entirety, and make a similarly public apology for jumping the gun.

However, just to make things clear, at no point was either ATI or cheating brought into my discussion of this issue; any 'accusations' are coming purely from those who are using my article to suggest otherwise. As for being 'fed' this information by ATI, here's the timeline of how things happened:

- We released a GeForce 8800 GT review on launch day, using a pre-ForceWare 169.0x driver release
- Late last week, I received a GeForce 8800 GT sample from another NVIDIA AIB partner, and started benchmarking it, this time with the newer WHQL ForceWare 169.02 release.
- During this testing, I spotted the water issues in Crysis, which I knew weren't present in an earlier driver, so I investigated further - trying different drivers, driver settings, and changing the executable name.
- I felt that my findings were interesting enough to give a wider hearing to, given that Crysis is a hugely popular game that a lot of gamers are enjoying in demo form right now, as well as being heavily used in reviews. And here we are.
 
1-3 fps to me seems like it is within the margin of error you get when running timedemos, or even doing manual run-throughs. If I saw a 1-3 FPS difference in my graphs, I wouldn't say any card had a clear advantage.

Personally (and I know it's kind of different for you guys because of the different ways you benchmark), I'd say that very much depends on the overall level of performance. One or two frames per second at an average of 100 FPS or so really isn't much at all and, as you say, is within a normal margin of error. Drop down into the 30 FPS average though, and a couple of frames per second is actually a pretty big percentage of your frame rate.

Not enough to make the GeForce 8800 GT, or indeed any of NVIDIA's high-end parts, any less awesome, but in my opinion it's at least worthy of note.
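
As a quick illustration of why the same absolute drop matters more at lower frame rates (just the arithmetic, nothing specific to any card):

Code:
# Same 2 fps difference, very different relative impact.
delta = 2
for average_fps in (100, 30):
    print(f"{delta} fps out of {average_fps} fps = {delta / average_fps:.1%}")
# -> 2 fps out of 100 fps = 2.0%
# -> 2 fps out of 30 fps = 6.7%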
 
Of course ATI feels cheated. The 2X00's AA is shamefully slow and bothersome, so Nvidia asks all reviewers of their cards to run 4xAA in all tests, which gives those cards an artificially skewed higher score, since AA performance scaling is no indicator of how strong or well-performing a card is.

Wait, what?
I don't spend 400 bucks on a video card to NOT use AA.
I think it's a pretty good indicator.
If a card can't run AA with good frame rates I don't want it, and I want to know whether it can BEFORE I buy it.
 
I may be CPU-limited in Crysis, but I get a minimal frame rate hit from 4xAA in the demo.
 
The business of business is Business, and in the Big Book of Business Wars this is a couple of three-year-olds playing with rubber knives. Now, when someone's car explodes, I will be impressed and/or care enough to change my buying habits.


I neither condone nor recommend destroying cars as a valid marketing tool, and no cars were harmed in the making of this post.
 
Big deal, I can't see much difference in the screenshots. The beta drivers are just that, 'beta' - not certified. What do you expect? Cheating? Don't make me laugh.

Seems amusing to me that AMD/ATi would sink this low by alerting HardOCP to this graphical issue with reflections. These are desperate times for AMD/ATi; their top cards are miles behind Nvidia's, much like AMD is way behind Intel.
 
Big deal, I can't see much difference in the screenshots. The beta drivers are just that, 'beta' - not certified. What do you expect? Cheating? Don't make me laugh.

Seems amusing to me that AMD/ATi would sink this low by alerting HardOCP to this graphical issue with reflections. These are desperate times for AMD/ATi; their top cards are miles behind Nvidia's, much like AMD is way behind Intel.

On what do you base the statement that ATI contacted HardOCP about this issue with reflections?
 
In a game where I'm being shot at and having rockets leaving craters all around me, I really don't think I have time to stop and admire the beautiful water edges and reflection, or bitch about why they are so blurry.

Well, hang on there, sport. Not everyone plays the game like you do. And Crysis (like Far Cry) is not your typical shooter. When I played Far Cry, yeah, I had to deal with all the enemies, but there was lots of time to stop, look around, and take in what was then a ground-breaking game. In the current Crysis demo I have spent a lot of time doing the same thing, where things like IQ are important. Hey, if I have to spend 50 or so bucks, I want the full IQ effects so I can absorb all that Crysis has to offer. :)

Sorry, I am just tired of hearing this as an excuse time and time again. There are plenty of times to stop and "smell the roses", so to speak...


The fact that AMD/ATI is the one who is 'reporting' this 'news' to all of the major enthusiast sites out there should tell you guys something.....

So what? Who cares who reports it or tips us off? We all knew it was nVidia that tipped everyone off about ATI and the whole Quake/Quack thing back then. Both do it to each other any time they can.


I think this is a minor bug that will be fixed very soon. Bugs happen all the time, even in the optimization paths of both IHVs. Nothing to see here, folks; move along.
 
The sad part is that many people will buy the card that's 7 percent faster regardless of how it got there...
 
Ok, so what I have done is run the benchmark fps results through two statistical tests.
First, I used the Mann-Whitney U test. It's a relatively relaxed test as far as requirements go, and simply tells you whether there is a significant difference between two pools of data. (For the stat-heads out there: I used a two-tailed test, since going in I had no reasonable expectation of a difference in either direction for these fps data points.)
My target P value is P=.05, meaning that if P>.05 (an accepted scientific standard) there is not a significant difference, and if P<.05 there is.
After running it through, my P value was .795. If P=1, the two would be exactly the same, so there isn't even a trend visible.

The second test is a little more appropriate, but requires the data to be matched: the Wilcoxon signed-rank, matched-pairs test (again, two-tailed). Thankfully, EB matched the data for us, matching for hardware, software settings, and all other variables except driver or .exe name (the same variable, in essence). This test can detect finer differences and can even say in WHICH direction the difference lies.
In this case the P value was .062, towards the better frame rates as discussed.
Now this is a bit closer, but still not significant. It does indicate a trend towards lower fps in the beta driver. Though this trend exists, the lack of significance means it could be attributed to the drivers, sure, but it could equally be caused by variables in the test setup, observer error, etc.


So, I would say that the supposed 7% difference is nothing to write home about, and really could be attributed to many things, including the drivers.

BTW, EB's forums have to be the most god awful garish thing ever. Hurts to look at.

Man, you've just brought all that stat stuff back from uni. I actually remember it all now. Ugh! :p

But yeah, a stats test isn't going to tell you whether the image loss is too big a price to pay for the performance gain. As Kyle noted, that's all that really counts, isn't it?


And if it is too high a price to pay, then you can simply rename the .exe and get all the 'goodness' back. Everyone's a winner! :cool:
 
Wait, what?
I don't spend 400 bucks on a video card to NOT use AA.
I think it's a pretty good indicator.
If a card can't run AA with good frame rates I don't want it, and I want to know whether it can BEFORE I buy it.

I spend $400 on cards to get as high a frame rate as possible, like "pro" gamers do (I played Quake 3 on an R8500 for FPS's sake), and I have an LCD with jaggies sharper than Nicole Richie's elbows.
 
If nvidia was cheating... why, when I reduce my shaders to medium to get a few precious FPS, does everything in the distance shimmer like cheap costume jewelry? It's god awful...

And do I really care? My GTXs are a year old and are going to be IT for me for a while... I just want to play the game. :D
 
If nvidia was cheating... why, when I reduce my shaders to medium to get a few precious FPS, does everything in the distance shimmer like cheap costume jewelry? It's god awful...

And do I really care? My GTXs are a year old and are going to be IT for me for a while... I just want to play the game. :D

Hmm, I noticed this too. Far-off scenery shimmers, like corruption or poor textures. It seems to be mostly trees or mountainsides that shimmer, and it does it on all drivers. Both my GTS and GT do this, so it ain't my card!
 
While I agree it's only a bug in Nvidia's drivers, there are certain people on this forum who are being a bit ridiculous by claiming that EB is just a puppet for AMD.
I think LawGiver kind of gave the impression that EB thought it was cheating, when in reality, it was just his assumption (more like a declaration, actually) that it was cheating. Rather than just post the link, he had to charge it, which gave people the wrong idea.

I think the big question is why the only person who seems to be experiencing the skewing issue, as far as I'm aware, is Hanners. I haven't seen anyone actually come out and say "Yeah, I get that too" yet. Makes me wonder a bit, really.

It's quite likely NVIDIA is doing it because it works out in their favor.. but who cares?
And AMD does the same thing, except that they recommend not enabling AA, and many reviewers, unfortunately, comply. Like you said, if a reviewer is following either company's stipulations to the letter, it's time to find a new place to get your reviews.

...let's not forget that the demo runs better on the 2900 XT than on the 8800 GTS
Presumably.
 
I think the big question is why the only person who seems to be experiencing the skewing issue, as far as I'm aware, is Hanners. I haven't seen anyone actually come out and say "Yeah, I get that too" yet. Makes me wonder a bit, really.

I've heard from numerous people who have confirmed this issue with various ForceWare 169.0x versions, although admittedly it's far more apparent in the fly-by benchmark than during normal gameplay. It isn't just me though, that's one of the first things I checked. ;)
 
I spend $400 on cards to get as high a frame rate as possible, like "pro" gamers do (I played Quake 3 on an R8500 for FPS's sake), and I have an LCD with jaggies sharper than Nicole Richie's elbows.

Lol, pro gamers just think they're cool because they can play games for a living, or at least try to. And last I checked, "pro" gamers make up about 0.1% of the PC gaming market. It's midrange cards that make the money, and most gamers want the game to look better. Oh, and you don't need a 400-buck card to be a "pro" gamer; IIRC most of them use midrange stuff and play at 800x600.

I want better IQ with playable frame rates, and I don't play online much. If you want to play CS 1.6 for the rest of your life that's fine with me, but don't bitch because card A can do AA faster than card B and reviews like card A better.
 