Nvidia caught cheating; latest 169.xx drivers cheat in Crysis demo

How is that cheating? ATI does the same thing for COH DX9. Rename your exe and watch the framerate drop
 
Oh, really?

8800 Ultra, 169.04, all settings High, default EXE name (Crysis.exe), only FOV modified and motion blur set to 0, NVCP IQ set to High Quality. First, during gameplay:
[image: ingamezm0.jpg]

Looks nothing like what EB is experiencing.

Then, when running the benchmark, enlarged from original crop:
[image: benchka7.jpg]

Again, looks nothing like what's reflected in EB's screenshots.

Now, maybe this is a GTS-specific issue, but that's a bit doubtful. Thanks for jumping the gun there, chief.
 
I noticed what they're talking about on that site too. But you're usually not moving fast enough in the game to notice it. It's only in the timedemo where it's really bad.
 
For which cards? It looks better on my 1950xt than it did on my 7900gs.

What?? This made no sense.

I know that what I said is true for the 2900XT, and since it's a driver issue, it's obviously going to affect their other cards too.
 
ATI came to us a couple of days ago and mentioned this. Certainly ATI wants this story spun in their favor. We will be looking into this water rendering issue, but we are unsure if it is one of those points that really makes a difference in gameplay. Remember that HardOCP actually uses gameplay to evaluate games, so our editors have logged many hours playing Crysis, not just running the canned demos included. We have not noticed any obvious impact yet. Obviously driver optimizations are nothing new to the industry; you just have to ask yourself, "At what price?" Not the first time this has come up...

http://enthusiast.hardocp.com/article.html?art=MTExLDEsLGhlbnRodXNpYXN0



Our original Crysis comparisons are noted here in our 8800 GT article. And there are some water screen shots in there as well.

http://enthusiast.hardocp.com/article.html?art=MTQxMCwzLCxoZW50aHVzaWFzdA==

Also I think it is worth noting that we have championed gameplay quality as a backbone to video card testing for a while now. Is NVIDIA cheating or are they simply making an acceptable IQ variation to deliver better performance? I would say that YOU would need to be the judge of that and how it impacts your gaming experience should be the basis for your opinion. Cheating the cheaters anyone? ;)

I will be interested in seeing if the Elite Bastards article is a regurgitation of what ATI reported to them....

I think it is worth noting that the 169.0X drivers used in all this testing are clearly marked as "Beta." And lastly, is it worth ATI's time now to start slinging mud when the 2900 series card is doing better in actual gameplay than it ever has before? Maybe ATI should have thought about that before it seeded the issue with EliteBastards?

Let's all get the WHQL driver and see what happens then?
 
Oh, really?

8800 Ultra, 169.04, all settings High, default EXE name (Crysis.exe), only FOV modified and motion blur set to 0, NVCP IQ set to High Quality. First, during gameplay:
[image: ingamezm0.jpg]

Looks nothing like what EB is experiencing.

Then, when running the benchmark, enlarged from original crop:
[image: benchka7.jpg]

Again, looks nothing like what's reflected in EB's screenshots.

Now, maybe this is a GTS-specific issue, but that's a bit doubtful. Thanks for jumping the gun there, chief.

I hope they fix the bug where distant trees are pixelated, which is clearly visible in your pics.
 
I noticed when playing the demo that the world reflected in the water would jump because it wasn't being updated frequently. Seems like a pretty janky way to increase performance.
 
I noticed when playing the demo that the world reflected in the water would jump because it wasn't being updated frequently. Seems like a pretty janky way to increase performance.


Yes, it would seem like a terrible way to increase performance. I say this is a broken driver rather than an outright cheat.
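
FWIW, the "jumping" people describe is exactly what you would expect if the water's reflection texture were only being refreshed every few frames instead of every frame. Nobody outside NVIDIA/Crytek knows how it is actually implemented, but the general idea would look something like this (the names and the update interval here are made up purely for illustration):

Code:
# Rough sketch of a throttled reflection update (illustrative only; this is
# not NVIDIA's or Crytek's actual code). The idea: re-render the water's
# planar reflection every N frames and reuse the stale texture in between,
# trading reflection freshness for frame rate.

REFLECTION_UPDATE_INTERVAL = 4   # hypothetical: refresh every 4th frame

def render_reflection(frame):
    """Stand-in for the expensive second scene pass (mirrored about the
    water plane) that produces the reflection texture."""
    return f"reflection rendered at frame {frame}"

def run_frames(num_frames):
    cached_reflection = None
    for frame in range(num_frames):
        if cached_reflection is None or frame % REFLECTION_UPDATE_INTERVAL == 0:
            cached_reflection = render_reflection(frame)   # full-cost update
        # Between updates the reflection lags the camera, which is why it
        # appears to "jump" during fast motion, e.g. the flyby timedemo.
        print(f"frame {frame}: water drawn with [{cached_reflection}]")

run_frames(8)

Whether that trade-off counts as a cheat or just an aggressive optimization is basically the whole argument in this thread.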

The bottom line is this: NVIDIA has no reason to cheat; their products remain king of the hill. Diminishing image quality in a game that carries their logo is not a move to make, or at least not a smart one.
 
A weird reflection in a timedemo in a “Pre-Release Demo” with “Beta” drivers = cheat, yeah, ok, and I'm Santa Claus. JMO
 
DX10 games are still relatively new. Crysis is the first game released that makes use of a number of DX10 capabilities. NVIDIA has been working closely with Crytek to develop this game. I would put any criticism on pause for a game that is not yet released and that is using a beta form of the driver. As Kyle said, wait and see before you judge; neither is the final product.

As for ATI, the constant criticism of NVIDIA "cheating" simply demeans their position. They always come off sounding like a spoiled child. AMD is doing it to Intel by suing them for billions (they will need it to break even, at best) and now AMD/ATI is trying the same game with NVIDIA. They better watch out, especially now that Intel is coming into the discrete gaming video card market in the near future. They may find themselves quickly becoming irrelevant if they continue to complain instead of releasing a competitive product. Not that the 2900 XT is a bad card; it was just late. Too late.
 
The fact that AMD/ATI is the one who is 'reporting' this 'news' to all of the major enthusiast sites out there should tell you guys something.....



Someone in canuckistan sure is busy sending out emails this weekend.... :rolleyes:
 
A weird reflection in a timedemo in a “Pre-Release Demo” with “Beta” drivers = cheat, yeah, ok, and I'm Santa Claus. JMO

For Christmas I want the [H]ardest gaming system there is, and I want it now, and I don't want to wait for Christmas to have it.


I actually don't find it surprising that ATI feels cheated. But I don't think it's Nvidia's fault; it's ATI's fault for having such crappy hardware/drivers that they feel like a 1 FPS gain is going to make them look worse. I've been using ATI for a while now, but the HD2900 swayed me over to Nvidia, that is if I were to buy a new card right now. But I can't afford to get anything for a while, which is why I am hoping Brent really is Santa Claus, but I won't be holding my breath.
 
This issue was immediately apparent to me as soon as I started seeing fast action near the water. I figured it was a game bug.
 
Of course ATI feels cheated. The 2X00 series' AA is shamefully slow and bothersome, so Nvidia asks all reviewers of its cards to run 4xAA in every test, which gives its cards an artificially skewed higher score, since AA performance scaling is no indicator of how strong or well-performing a card is.

What happened to the days when it was truly scandalous to do per-executable optimizations? How does the pre-Crysis Nvidia driver perform? How is the Crysis ATI driver going to perform? The way the ATI performance in Crysis surprised us made me wonder what is going on; Crytek has been with Nvidia for a very long time after all. What in the world have their demo machines been running if people are having performance issues like this in the singleplayer demo? Do these optimizations come from Crytek themselves, under the Nvidia TWIMTBP program?


(e: I still like specific optimizations for games, but give that choice to players to make. I'd like to have faster performance at the cost of visual quality in multiplayer Crysis as an example. Could this explain why the shadows in the HL2:EP2 version of Source look hard and sharp on 8X00s but (too) soft on 2X00s? Time to investigate further :eek: )
 
Anyone remember the Quake 3 benchmarks? :) Same thing.

I was on hand for that one, and I would have to say, "Not even close." Both in terms of IQ impact and framerate impact.
 
I noticed this in the game as well. I thought that's how the game is supposed to be...
 
I don't even see what you guys are talking about in the pic.

Pre-release game + beta drivers = quit crying ATi and go make a decent video card.
 
Crytek has been with Nvidia for a very long time after all. What in the world have their demo machines been running if people are having performance issues like this in the singleplayer demo?

My thinking is code optimizations won't be the answer in the end; sheer animal card horsepower will.

I don't think we have seen nVidia's 'Crysis card' yet. I am guessing it will be out with Crysis, or real close.

Like it or not, our GPU speeds are going to be taking a decent jump forward in the near future. ;)
 
Here is Brent's full summation of Crysis IQ after hours of playing the game. Quoted from the bottom of this page.

ATI vs. NVIDIA Quality


Finally we can look at ATI versus NVIDIA image quality to see if there are any differences. We are going to run the 2900 XT and 8800 GT at “Very High” setting in DX10 to see if there are any differences with all the features enabled.

Out of all the screenshots the only one we noticed anything different on is the fourth picture comparing depth of field. If you look at the rock in the upper left of the image it has a shadow reflecting off of the water. There is a difference in this shadow’s quality, on the Radeon HD 2900 XT it appears like depth of field is working on it to make it appear out of focus, but on the 8800 GT this shadow is much sharper in detail. We are unsure which is “correct,” but it is a small image quality difference that is rather hard to notice when you are actually playing the game.

Other than that we saw no differences in DX9 or DX10 in this game between the 2900 XT and GeForce 8 series in terms of visual quality while playing. Motion blur, shadows, HDR lighting and all effects seemed to be working well on all video cards.
 
A few fps might not seem like much; however, I think that anyone would do anything to get a few more fps in Crysis. I don't consider this cheating because it is clearly a clever way to save processing power in such a demanding game: why waste processing power on the reflection when you can use it on something else? However, I do think that it is unfair to compare the performance of an ATi card with an nVidia card when one of the cards is doing more rendering than the other. I think that ATi should also do this; until that happens, just change the exe name before doing any comparison :D
 
I hope they fix the bug where distant trees are pixelated, which is clearly visible in your pics.
Not a bug. It's the way the game 'fades' far objects into view. The game uses a stippling effect. A bunch of games have used the same technique. It's really annoying and I HATE it. It's supposed to be less demanding on hardware and less noticeable than alpha blending objects into view, but that's bollocks. It looks like shite and the performance hit means feck all on today's systems that can barely run the fecking game. So in 3 years when an average system can run the game at full speed we'll have something better and still be thinking "The game uses a stippling effect. A bunch of games have used the same technique. It's really annoying and I HATE it. It's supposed to be less demanding on hardware and less noticeable than alpha blending objects into view, but that's bollocks. It looks like shite and the performance hit means feck all on today's systems that can barely run the fecking game."
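
For anyone wondering what the stippling actually is: instead of alpha-blending a distant object in, the engine draws it solid but discards a growing share of its pixels using a fixed screen-space dither pattern, which avoids blending and sorting cost but gives you that dotty look. A toy version of the technique (not CryEngine's actual code, just the general idea):

Code:
# Toy illustration of "stipple" (screen-door) fade-in, as opposed to alpha
# blending. A pixel of the fading object is kept or discarded based on a
# fixed ordered-dither pattern, so no blending or depth sorting is needed,
# at the cost of a visible dot pattern while the object fades in.

# 4x4 ordered-dither (Bayer) thresholds.
BAYER_4X4 = [
    [ 0,  8,  2, 10],
    [12,  4, 14,  6],
    [ 3, 11,  1,  9],
    [15,  7, 13,  5],
]

def keep_pixel(x, y, fade):
    """Return True if the pixel at screen coords (x, y) should be drawn
    for an object whose fade factor is `fade` (0 = invisible, 1 = solid)."""
    threshold = (BAYER_4X4[y % 4][x % 4] + 0.5) / 16.0
    return fade > threshold

# Print an 8x8 patch of pixels at 50% fade: roughly half are drawn ('#'),
# half discarded ('.'), in the tell-tale dither pattern.
for y in range(8):
    print("".join("#" if keep_pixel(x, y, 0.5) else "." for x in range(8)))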

On topic. If this were Formula One, then Ferrari (ATI) would be saying that McLaren (Nvidia) were cheating by using Lewis Hamilton because he drives a different way to Kimi Raikkonen. Does Pagani bitch at Bugatti because the Veyron sacrifices aerodynamics for pure horsepower? Do they feck.

GFX wars have become so fierce that they have become pointless. Who really gives a crap about 1 or 2 fps? Does 1 or 2 fps really make the difference between playable and unplayable? Am I going to notice that the undulating waves that form the outer edges of the map only reflect the landscape and not the fauna? Am I feck. People are gonna bitch either way. The number of people who actually care about this minuscule discrepancy is so small that it surprises me that either ATI (well, AMD, but ATI is what I was brought up on) or Nvidia really give a shit.

Both companies need to ignore the hardcore segment and embrace the mainstream segment where the real money is. Make more fast, affordable cards. Nvidia have pulled a HUGE one out of their ass with the 8800GT, shitting all over ATI's share. They have basically released a hardcore part for the mainstream. ATI has nothing to compete with it. Unless the next refresh part comes in at a lower price than their current flagship part, I don't see them catching up anymore.
 
I'm curious; is this level of "optimization" similar to what 3rd-party drivers do to improve performance (e.g. Omega, NGO)?
 
I liked my original title better.

Forum dictatorship aside, I guess it's not as bad as I thought. Though optimizing should not come at the expense of image quality, let alone outright corruption, as seems to be the case here.
 
LOL, ok. I didn't notice you put the two together, sorry. By CORRUPTION I mean IMAGE CORRUPTION. LOL.
 
I am not sure when a difference of less than 1 to 2 frames per second became "cheating," but you can call it what you will.
I keep hearing "I don't see what the big deal is with a frame or two a second?" being bandied about, but what does a 7% difference mean to you?

It's a 7% difference, not "a frame or two". :rolleyes:
 
I keep hearing "I don't see what the big deal is with a frame or two a second?" being bandied about, but what does a 7% difference mean to you?

It's a 7% difference, not "a frame or two". :rolleyes:

Hmmm, looking at the graphs, out of 12 sets of instances benchmarked, only one had a difference of more than a couple frames per second, and that was at a staggering framerate of less than 10 frames per second. The rest are dead even or within a frame or two of each other. Now, how you get an overall 7% out of this is simply a lie. Am I not seeing all the data, or not understanding it? Let's not skew the results to prop up a sensational headline that was fed to EB by ATI.

Yes, I know.
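
Just to put the "frame or two" and "7%" claims next to each other: at Crysis-level framerates they can describe the exact same gap. Purely illustrative numbers below, not EB's measured data:

Code:
# Purely illustrative numbers (not EB's measured data): the same absolute
# gap of 2 fps reads very differently as a percentage depending on the
# baseline framerate.

def percent_gain(baseline_fps, optimized_fps):
    return (optimized_fps - baseline_fps) / baseline_fps * 100.0

for baseline in (14, 28, 60):
    optimized = baseline + 2   # hypothetical 2 fps from the optimization
    print(f"{baseline} -> {optimized} fps: {percent_gain(baseline, optimized):.1f}% gain")

# 14 -> 16 fps: 14.3% gain
# 28 -> 30 fps: 7.1% gain
# 60 -> 62 fps: 3.3% gain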
 
One more thing; the 169.02 set, which are WHQL, exhibit the same behavior. ;)


Fair deal, I can fully accept that driver having the same "issue." Again this seems to be much more of a canned benchmark issue than anything else. I guess if you rely on the canned benchmarks fed to you for video card comparison, you will have to deal with this issue, but I do not think it is a "cheat." I think the data EB collected show it is not a cheat. Certainly the trade off for a couple frames a second is just not worth bad press.

What did NVIDIA have to say about this? You guys talked to ATI about it, but not NVIDIA?
 
I would call it a bug.
Nv puts out beta drivers with an annoying frequency. Often the beta drivers that come out fix one issue while causing ten more.
They were experimenting with some Crysis optimizations; that is all, imho. Nv/Ati both do it, and usually we don't notice the IQ difference. This one seems to be a little rougher on the image quality than some optimizations are, but not as bad as the Quack example.

Imho, if you don't notice the IQ difference without really careful scrutiny, and it nets you a few extra frames here or there, then it is a good optimization in my book and not a cheat. If the option exists to turn off the optimization, then it is an even better optimization. The optimization in question seems to fail this test. Hopefully and probably Nv will fix this optimization or remove it in their next beta release.
 
You can't be caught cheating when there's no competition going on. They aren't both running cards that are neck and neck. ATI is getting their asses handed to them, and instead of taking the moral high road and simply pointing out that Nvidia is getting a couple extra FPS from a driver bug, they've resorted to schoolyard insult-throwing and are screaming "cheater!" at Nvidia because they are getting beat and don't want to lose.

We all know what driver cheating looks like, we've seen it for real in past years. There's no evidence of that going on here.

Bad form ATI. I give them a yellow card. Nvidia's ball.
 
I remember the 3DMark thing where either Nvidia or ATI clipped the rendering of anything not in the FOV, thereby jacking performance through the roof. THAT'S cheating. This... eh, beta demo, beta drivers, who cares.
 
I remember the 3DMark thing where either Nvidia or ATI clipped the rendering of anything not in the FOV, thereby jacking performance through the roof. THAT'S cheating. This... eh, beta demo, beta drivers, who cares.

What you just described is done in every game. It's called frustum culling. I suspect that you are not accurately depicting the "cheat" you're referring to. Otherwise, what you just said isn't cheating.
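
For anyone who hasn't run into the term: frustum culling just means skipping the draw calls for anything the camera can't currently see, worked out fresh every frame from the live view. A bare-bones sketch of the test (simplified to bounding spheres against six frustum planes; any real engine does something fancier):

Code:
# Bare-bones view-frustum culling: every frame, test each object's bounding
# sphere against the six planes of the camera's frustum and skip drawing
# anything fully outside. Computed per frame from the live camera --
# nothing is assumed about where the camera will be next.

def sphere_outside_plane(center, radius, plane):
    # plane = (a, b, c, d) for ax + by + cz + d = 0, normal pointing inward
    a, b, c, d = plane
    x, y, z = center
    return a * x + b * y + c * z + d < -radius

def visible(center, radius, frustum_planes):
    return not any(sphere_outside_plane(center, radius, p) for p in frustum_planes)

# Toy frustum: an axis-aligned box from -10..10 on each axis, written as
# six inward-facing planes.
frustum = [
    ( 1, 0, 0, 10), (-1, 0, 0, 10),   # left, right
    ( 0, 1, 0, 10), ( 0,-1, 0, 10),   # bottom, top
    ( 0, 0, 1, 10), ( 0, 0,-1, 10),   # near, far
]

objects = {"tree": ((0, 0, 5), 1.0), "island": ((50, 0, 0), 5.0)}
for name, (center, radius) in objects.items():
    print(name, "draw" if visible(center, radius, frustum) else "cull")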
 
I would call it a bug.

I would call it a "let the 8800GT be as impressive as possible" during the initial benchmarks. Each frame counts when there are so few as in Crysis. "Later on, when people
have seen the benches, we will "fix" this bug." ;)

Man, simply renaming the .exe and gone is this "optimization". It's só evidential it's not even funny.
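
If anyone wants to try the rename test themselves, it's exactly what it sounds like: driver app-detection keys off the executable's file name, so the same binary run under a different name won't pick up any Crysis-specific profile. A rough sketch (the install path and alternate name here are made up; you still run the built-in benchmark by hand from each copy and compare):

Code:
# Rough sketch of the rename test (the paths and the alternate name are
# made up for illustration). Driver app-detection keys off the exe's file
# name, so running the identical binary under an unrecognized name
# disables any Crysis-specific profile; compare the two benchmark runs
# and the water reflections afterwards.

import shutil
from pathlib import Path

game_dir = Path(r"C:\Program Files\Crysis SP Demo\Bin32")   # hypothetical install path
original = game_dir / "Crysis.exe"
renamed  = game_dir / "Crysis_renamed.exe"                  # any unrecognized name works

shutil.copy2(original, renamed)
print(f"Copied {original.name} -> {renamed.name}.")
print("Now run the built-in GPU benchmark once from each exe and compare")
print("the average fps and the water reflections by eye.")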
 
No, he described it properly, just not completely.

What he meant was the culling was done with the foreknowledge of exactly what part of the scene would and would not be visible during the fixed 3dmark run. Since the path through the benchmark is known, they were able to cull without having to spend cycles "figuring out" what could be culled. The result was some seriously inflated frame rate numbers and was absolutely hardcore cheating.
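
Put another way, the difference between the legitimate technique and that cheat is where the visibility information comes from: computed live from the camera every frame, versus looked up from a table baked against the benchmark's fixed camera path. A caricature of the latter (nobody outside the driver team has seen the real code, obviously):

Code:
# Caricature of the alleged 3DMark-style cheat (not actual driver code).
# Instead of testing visibility against the live camera every frame, a
# visibility list precomputed against the benchmark's known, fixed camera
# path is simply indexed by frame number.

# Hypothetical baked table: frame number -> objects to draw that frame.
PREBAKED_VISIBILITY = {
    0: ["terrain", "sky"],
    1: ["terrain", "sky", "tank"],
    2: ["terrain", "sky"],
}

def cull_prebaked(frame_index):
    # Zero per-object work -- only possible because what the camera will see
    # each frame is known in advance. Move the camera off the scripted path
    # (or change the demo) and this falls apart, which is what made it a
    # cheat rather than an optimization.
    return PREBAKED_VISIBILITY.get(frame_index, [])

for frame in range(3):
    print(f"frame {frame}: draw {cull_prebaked(frame)}")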
 