Nvidia caught cheating; latest drivers 169.xx cheat in Crysis demo

I guess that's why one of Elite Bastards' current front-page stories is a link to your (rather excellent) Gears of War performance article. And who posted it? Oh, it was me. :eek:

You're right though, my talents would be better employed elsewhere, but a McDonald's uniform just doesn't suit me these days. :(


Hehe, I would have direct linked yours on our front page yesterday, but I have a rule against posting dead links. :p No really.
 
I posted this earlier, but it was before EB was online. I want to know what they think about taking a statistical look at the data:

Ok, so what I have done is run the benchmark fps results through two statistical tests.
First, I used the Mann-Whitney U test. It's relatively relaxed as far as requirements go, and simply tells you whether there is a significant difference between two pools of data. (For the stat-heads out there: I used a two-tailed test, since going in I had no reasonable expectation of a difference in either particular direction.)
My significance threshold is P=.05, meaning that if P>.05 (an accepted scientific convention) there is not a significant difference, and if P<.05 there is.
After running it through, my P value was .795. A P value near 1 means the two samples look as alike as chance alone would predict, so there isn't even a trend visible.

The second test is a little more appropriate, but requires the data to be matched: the Wilcoxon signed-rank, matched-pairs test (again, two-tailed). Thankfully, EB matched the data for us, controlling hardware, software settings, and every other variable except the driver or .exe name (the same variable, in essence). This test can detect finer differences and can even say in WHICH direction the difference lies.
In this case the P value was .062, in the direction of the frame rate difference discussed in the article.
Now, this is a bit closer, but still not significant. There is a visible trend, but the lack of significance means it could be attributed to the drivers, sure, or equally to variables in the test setup, observer error, etc.


So, I would say that the supposed 7% difference is nothing to write home about, and really could be attributed to many things, the drivers included.
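For anyone who wants to poke at this themselves, here's a minimal sketch of the two tests in SciPy. The fps numbers below are made-up placeholders, not EB's actual data; substitute the matched benchmark results from the article.
[code]
# Minimal sketch of the two tests described above (Python + SciPy).
# The fps arrays are placeholders -- use the matched benchmark runs
# (same hardware/settings, only driver or .exe name differing).
from scipy.stats import mannwhitneyu, wilcoxon

fps_default = [31.2, 29.8, 30.5, 28.9, 31.0]  # Crysis.exe, 169.04
fps_renamed = [29.5, 29.9, 28.7, 28.4, 30.1]  # renamed .exe

# Mann-Whitney U: unpaired, two-tailed; only asks whether the two
# pools of data differ at all.
u_stat, p_u = mannwhitneyu(fps_default, fps_renamed, alternative="two-sided")

# Wilcoxon signed-rank: paired, two-tailed; exploits the matching
# between runs and can indicate the direction of any difference.
w_stat, p_w = wilcoxon(fps_default, fps_renamed, alternative="two-sided")

print(f"Mann-Whitney U:       p = {p_u:.3f}")
print(f"Wilcoxon signed-rank: p = {p_w:.3f}")
# Judge each p against the chosen threshold (P=.05 above):
# p < .05 -> significant difference; p >= .05 -> not significant.
[/code]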
 
...not significant.

While I am in no way, shape, or form going to even try to imply that I have been as thorough as you have, "not significant" is the overall feeling I got about the data in the EB article after reviewing it.
 
I'm glad I'm wearing my tin foil hat, or something this small in the scope of things might have me seeing shadows...
 
Oh, really?

8800 Ultra, 169.04, all settings High, default EXE name (Crysis.exe), only FOV modified and motion blur set to 0, NVCP IQ set to High Quality. First, during gameplay:
ingamezm0.jpg

Looks nothing like what EB is experiencing.

Then, when running the benchmark, enlarged from original crop:
benchka7.jpg

Again, looks nothing like what's reflected in EB's screenshots.

Now, maybe this is a GTS-specific issue, but that's a bit doubtful. Thanks for jumping the gun there, chief.

How do you modify the FOV in Crysis?
 
No thanks... I let this crap influence me the last time it was talked about. I don't even care if Nvidia is cheating; someone will catch them if they are and make them set it right. It doesn't change the fact that I now definitively prefer them over ATI.
 
Oh, really?

8800 Ultra, 169.04, all settings High, default EXE name (Crysis.exe), only FOV modified and motion blur set to 0, NVCP IQ set to High Quality. First, during gameplay:
ingamezm0.jpg

Looks nothing like what EB is experiencing.

Then, when running the benchmark, enlarged from original crop:
benchka7.jpg

Again, looks nothing like what's reflected in EB's screenshots.

Now, maybe this is a GTS-specific issue, but that's a bit doubtful. Thanks for jumping the gun there, chief.

Are you on DX9?
 
The screenshots were DX10, yes, and I don't see the issue in XP DX9 either.

Please stop quoting the post along with the images. It's rather annoying.
 
I'm not sure if that changes anything. All he said was that the optimization might not be working as Nvidia intended.

That's just the problem: who said the "optimization" (the perf difference) is attributable to the corruption at all? They may not have any effect on one another, but that was the theory laid out by the article.
 
Which I did refer to in the article, and it's a fair point to some extent, although it's equally fair to at least suggest that the change in rendering workload this bug introduces is an obvious culprit for the performance change.

Fallacy. Correlation != Causation.

If the processing of reflections isn't the bottleneck in the card's current workload, who says fixing the bug would carry any performance hit? See, you're assuming something you don't know. Even calling it an "optimization" in the first place is overly assumptive. It's just plain bad logic, Hanners.
 
My significance threshold is P=.05, meaning that if P>.05 (an accepted scientific convention) there is not a significant difference, and if P<.05 there is.

Setting an alpha value of 0.05 is completely arbitrary, regardless of whether it happens to be the convention.
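To see just how arbitrary, take the Wilcoxon P value of .062 reported earlier in the thread; the verdict flips depending on which conventional threshold you happen to pick:
[code]
# The same Wilcoxon result judged at two conventional thresholds.
# p = 0.062 is the value reported earlier in the thread.
p = 0.062
for alpha in (0.05, 0.10):
    verdict = "significant" if p < alpha else "not significant"
    print(f"alpha = {alpha}: {verdict}")
# alpha = 0.05: not significant
# alpha = 0.1: significant
[/code]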
 
That's just the problem: who said the "optimization" (the perf difference) is attributable to the corruption at all? They may not have any effect on one another, but that was the theory laid out by the article.

Because when you changed the .exe (or used an older driver) the problem went away. It obviously had to do with some form of game-specific optimization; it's not rocket science.
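For what it's worth, per-game driver optimizations are typically keyed off the executable name, which is why renaming Crysis.exe switches them off wholesale. A purely illustrative toy sketch of that mechanism (not actual driver code, obviously; the profile contents are invented):
[code]
# Toy illustration of exe-name-keyed driver profiles. NOT real driver
# code; the profile contents here are hypothetical.
GAME_PROFILES = {
    "crysis.exe": {"shader_replacements": True, "sli_mode": "AFR2"},
    "bioshock.exe": {"shader_replacements": False, "sli_mode": "AFR"},
}

def profile_for(process_name):
    # Renaming the executable means no key matches, so every
    # game-specific tweak silently falls back to generic defaults.
    return GAME_PROFILES.get(process_name.lower(), {})

print(profile_for("Crysis.exe"))  # game-specific tweaks applied
print(profile_for("Crysi.exe"))   # {} -- all of them disabled at once
[/code]
Which is also why renaming the .exe is such a blunt instrument: it turns off every per-game tweak at once, not just the one suspected of causing the corruption.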
 
Because when you changed the .exe (or used an older driver) the problem went away. It obviously had to do with some form of game-specific optimization; it's not rocket science.

Problem is, this is with "Crysis" drivers off Nvidia's website, for possibly the single biggest release of 2007. It doesn't hurt that it lands at the same time as the 8800GT launch, either. I wouldn't put it past either IHV to pull something like this :(

95 percent of readers will see the "7 percent improvement" headline rather than the 1-2 fps it actually amounts to, and make their buying decision based on that.
 
Why is everyone getting their panties in a bunch? This is a DEMO! Not the full release, something YOU HAVE TO PAY FOR! So isn't this all just a tad pointless?

If it were based on the full-blown release of Crysis, then I could see some stink about it. This article is getting out of hand. If the retail version comes out and everything is fine and dandy, then this article and all this chat are pointless...

We all know that ATI/NVIDIA will be releasing drivers for this game. Can't we just fricken wait until the game is released in, what, a week and a half at most? Sheeesh!

+5356376

Everyone needs to STOP WORRYING about this trivial BS, because it's just that: TRIVIAL.
 
Cheating?

This is a great example of the idiocy of the general gaming public.

To accuse Nvidia of optimizing for the game as a cheat has to be the epitome of ignorance.
It's a shame [H] is sinking to generating drama where there is none, because it just spreads bad press via idiotic parrots.

Hey, I've got an idea: Nvidia should stop optimizing drivers for game performance; then you can't complain about that performance increase being "cheating".
 
My character's shadow tends to lag my movements. Is that related to the Nvidia driver bug/optimization, or a game bug?
 
Problem is, this is with "Crysis" drivers off Nvidia's website, for possibly the single biggest release of 2007. It doesn't hurt that it lands at the same time as the 8800GT launch, either. I wouldn't put it past either IHV to pull something like this :(

I'm not saying it was done on purpose (a.k.a. a cheat). Since Crysis is a fairly big game, Nvidia probably rushed the driver through their QA team and missed a bug. It's not a big deal, but it's clearly the result of an optimization not functioning correctly.
 
Hey, I've got an idea: Nvidia should stop optimizing drivers for game performance; then you can't complain about that performance increase being "cheating".

I thought ATI was already doing that for their 2900 series? (rim shot, please!) ;) And just for the record, I am an ATI fan, an AMD fan, an Intel fan, and yes, even an NVIDIA fan. It just so happens that NVIDIA and Intel have the best upper-end tech out there right now.
 
I'm not saying it was done on purpose (a.k.a. a cheat). Since Crysis is a fairly big game, Nvidia probably rushed the driver through their QA team and missed a bug. It's not a big deal, but it's clearly the result of an optimization not functioning correctly.


I think I saw Hanners post here yesterday that NVIDIA said the problem is already fixed and that there was no performance hit. So doesn't that invalidate his entire article, which pointed to the one bug he happened to see as THE optimization that gave the boost? I'm sure a whole slew of optimizations were turned off via the rename. Anyway, just a thought.
 
Beta drivers, people. That's why beta drivers are beta drivers and not officially released drivers: because they're not finished yet.

On the subject of optimising: in my opinion these sorts of changes shouldn't be made by the video card vendors. They should be in-game optimisations made by the developers, or in-game options, like the update rate for reflections you find in Need for Speed: Most Wanted (a rough sketch of that idea follows below).
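Something like this, roughly. This is a hypothetical sketch of a throttled reflection update, not any engine's actual code; every name in it is invented:
[code]
# Hypothetical sketch of a throttled reflection update -- the kind of
# in-game option mentioned above (NFS: Most Wanted exposes one).

def render_reflection(scene):
    # Stand-in for the expensive reflection render pass.
    return f"reflection({scene})"

def compose(scene, reflection):
    # Stand-in for compositing the cached reflection into the frame.
    return f"{scene} + {reflection}"

class ReflectionRenderer:
    def __init__(self, update_every_n_frames=2):
        self.update_every = update_every_n_frames
        self.frame = 0
        self.cached = None

    def render_frame(self, scene):
        # Re-render the reflection texture only every N frames and
        # reuse the cached copy otherwise. A fast flyby cam makes the
        # stale cache visibly lag and stretch; a foot soldier moving
        # at normal speed rarely notices.
        if self.frame % self.update_every == 0:
            self.cached = render_reflection(scene)
        self.frame += 1
        return compose(scene, self.cached)
[/code]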

The circumstances where these issues show up obviously aren't natural to the gaming environment. A flyby cam can pan and move fast enough to display a visible stretch in the reflections, whereas on the ground, limited by the movement speed of a foot soldier or vehicle, it's much harder to notice. In all honesty, if it's only noticeable by flying around the map in free-cam mode, then I'd say the trade-off should have been made anyhow, just like the tops of buildings aren't rendered in Counter-Strike: Source but are visible in ghost-cam mode. These visual trade-offs are usually sensible because they're only noticeable outside normal gaming operation.

Still, Nvidia taking it into their own hands is a bit out of line, assuming of course that it was intentional. Any number of explanations exist for Nvidia legitimately making these changes; we'd need a statement from them and the Crytek team to put this to bed. Otherwise we're just making assumptions and speculating.
 
Beta drivers, people. That's why beta drivers are beta drivers and not officially released drivers: because they're not finished yet.

Still, Nvidia taking it into their own hands is a bit out of line, assuming of course that it was intentional. Any number of explanations exist for Nvidia legitimately making these changes; we'd need a statement from them and the Crytek team to put this to bed. Otherwise we're just making assumptions and speculating.

Why should Nvidia and Crytek make a statement about beta drivers for an unreleased game?
 
So there are some bugs with the 169.04 drivers; why don't they compare them against other NV driver versions?
 
Because when you changed the .exe (or used an older driver) the problem went away. It obviously had to do with some form of game-specific optimization; it's not rocket science.

Wow... talk about a logical leap.

What you (and the authors of the article in question) just made is the logical equivalent of a short circuit.

The reflections are not the cause of the performance difference. Just because the reflections are changed and the perf changes when you change the .exe name, it does not mean one causes the other. Do you get that? They are 2 DIFFERENT things. It's possible they're linked, but in this case they are not, and assuming that they are is downright bad logic. For all you know there are other optimizations, with nothing to do with the reflections, that you lose by renaming the .exe. Or, for all you know, maybe an SLI optimization was incorrectly kicking in with a single GPU (SLI profiles being another thing that is .exe-dependent).

Fixing the reflection bug will not change performance unless reflections were the thing bottlenecking the chip in the first place. It's like putting a bigger muffler on a car that has an intake port the size of a pea: the intake is the bottleneck, so switching the muffler (fixing the corruption) may not do anything.

The fact that they've fixed it without it affecting performance is a testament to this.
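Put the bottleneck argument in toy numbers (all made up) and it's obvious:
[code]
# Toy model of the bottleneck argument: in an idealised, fully
# pipelined GPU, throughput is limited by the slowest stage (the
# pea-sized intake in the car analogy). Stage times are invented.
def fps(stage_times_ms):
    return 1000.0 / max(stage_times_ms.values())

stages = {"geometry": 8.0, "shading": 20.0, "reflections": 5.0}
print(f"before fix: {fps(stages):.1f} fps")  # shading dominates: 50.0

stages["reflections"] = 9.0  # fixing the bug makes reflections costlier
print(f"after fix:  {fps(stages):.1f} fps")  # still 50.0 -- no change
[/code]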
 

Any updates on that? I applaud ATI for trying to bring HardOCP's attention to this issue. Only that German site and EB have done anything about Nvidia's 169 "recommended" beta drivers, which are being used to market the 8800GT to people lusting after all of Crysis's graphical prowess, somehow cheating them in the process.

What do YOU think? Are you OK with Nvidia launching new cards on beta drivers that contain subtle yet unethical optimizations just to win us over? How does that make you feel as a customer? As Kyle said, it should be your own opinion, not a reviewer's or ATI's (even if they are trying to tell us something sincere).

Heck, I run Crysis on the 93.71 drivers and SLI still works fine with my dual 7900GTXs! I've confirmed this by running in single-card mode: it actually runs at half the frame rate when facing the ground or looking straight into a large rock. So what's up with Nvidia "withholding" SLI capability from Crysis until the official release date? Sounds fishy...
 
Any updates on that? I applaud ATI for trying to bring HardOCP's attention to this issue. Only that German site and EB have done anything about Nvidia's 169 "recommended" beta drivers, which are being used to market the 8800GT to people lusting after all of Crysis's graphical prowess, somehow cheating them in the process.

What do YOU think? Are you OK with Nvidia launching new cards on beta drivers that contain subtle yet unethical optimizations just to win us over? How does that make you feel as a customer? As Kyle said, it should be your own opinion, not a reviewer's or ATI's (even if they are trying to tell us something sincere).

Heck, I run Crysis on the 93.71 drivers and SLI still works fine with my dual 7900GTXs! I've confirmed this by running in single-card mode: it actually runs at half the frame rate when facing the ground or looking straight into a large rock. So what's up with Nvidia "withholding" SLI capability from Crysis until the official release date? Sounds fishy...

You're a true moron. It's a BUG R-TARD.
 
You're a true moron. It's a BUG R-TARD.

Your respectable perspective, noted. Anybody else? Crysis is the most anticipated game in maybe three years, since Half-Life 2 for sure. Who knows how many 8800GTs it has sold? Maybe 95% of them were down to Crysis. Also, no name-calling, please.
 
Your respectable perspective, noted. Anybody else? Crysis is the most anticipated game in maybe three years, since Half-Life 2 for sure. Who knows how many 8800GTs it has sold? Maybe 95% of them were down to Crysis. Also, no name-calling, please.

Not a perspective; it's fact. It's not an optimization when it's obvious like that. It's a BUG. Yes, that's a period, denoting a *fact*.
 
Sr7 said:
The reflections are not the cause of the performance difference. Just because the reflections are changed and the perf changes when you change the .exe name, it does not mean one causes the other. Do you get that?

Moofasa~ said:
I'm using the term "optimization" in a very general way. To me, an optimization is a set of algorithms that gets the required work done more efficiently (everyone has their own definition of "more efficiently"). I think what you are trying to say is: what if algorithm X caused the bug but algorithm Y caused the performance increase? That could very well be true, but it doesn't change the fact that they all work together (to some degree) to form the optimization. Only Nvidia knows which part of the optimization triggered the performance increase and which part triggered the bug (and how closely they are related). But I don't think that takes away from Hanners' claim that the optimization contains both a performance increase and a new bug (it's not as if he said the bug is located on line 532 of Nvidia's code).

If you're going to make an attempt to insult me, at least give me the respect of reading my posts fully (and try to be more creative next time; I think I heard crickets after I read your post). :rolleyes:

As you can see, I never said they were related. All I said was that the article gave me the impression that the 169.04 drivers included a performance enhancement and an image quality bug, and that the bug was related to a game-specific optimization (which doesn't mean the performance increase was directly related to the image quality loss).
 
If you're going to make an attempt to insult me, at least give me the respect of reading my posts fully (and try to be more creative next time; I think I heard crickets after I read your post). :rolleyes:

As you can see, I never said they were related. All I said was that the article gave me the impression that the 169.04 drivers included a performance enhancement and an image quality bug, and that the bug was related to a game-specific optimization (which doesn't mean the performance increase was directly related to the image quality loss).

It isn't an optimization either... in any way you could possibly imagine. Accept the facts, guy.

BTW, your post makes no sense. You say the bug isn't related to a performance increase per se, yet calling it an "optimization" implies a performance increase. Nobody "optimizes" without changing anything; something has to be "optimized" (read: changed) for it to be an optimization at all.
 
@Sr7: relax, Brent has now said that he's using the 169.05 drivers and the extremely-slow-water-update "bug" is fixed. We should be seeing newer drivers on the day that Crysis officially hits the stores (very, very few days... patience... patience... hold that fuss, boy!) :p
 
Bug or optimization, I do not care; it's just good that these things are found out fast and get "fixed", be they intentional or not.

What bothers me a bit is the way Nvidia conducts its driver development. How often do we actually get official drivers from them? I always see beta drivers coming out, and when these kinds of things arise they put on the innocent cat look and say "oh geez, it's the betas, you know".

Not good enough for me. Nvidia promised us a monthly driver program ages ago; actually, it was back when the 8800 was having problems with Vista. I wonder when they are really going to make it happen.
 
Yeah, kudos to ATI for that. The picture from that German site shows more differences than just the water: the shadows reflecting off that jeep window, the trees... let's wait and see.
 
New food:
[IMG]http://www.pcgameshardware.de/screenshots/original/2007/11/1.jpg[/IMG]
Maybe it's just me, but looking at those I like the Nvidia render better, especially on the windshield. Either flash out the entire thing (it's flat) or don't.

This thread is annoyingly long. I'm glad I skipped 60% of it. :rolleyes:
Also: I trust Kyle enough to agree with him that if AMD specifically calls to draw attention to a non-issue in their opponent's BETA drivers, then they are in deep shit and they know it.
 
Bug or optimization, I do not care; it's just good that these things are found out fast and get "fixed", be they intentional or not.

What bothers me a bit is the way Nvidia conducts its driver development. How often do we actually get official drivers from them? I always see beta drivers coming out, and when these kinds of things arise they put on the innocent cat look and say "oh geez, it's the betas, you know".

Not good enough for me. Nvidia promised us a monthly driver program ages ago; actually, it was back when the 8800 was having problems with Vista. I wonder when they are really going to make it happen.

Uhh, actually they post WHQL drivers even more often than once a month. Did you even bother to look?
 