Nvidia caught cheating; latest drivers 169.xx cheat in Crysis demo

Hmm, I noticed this too. Far-off scenery shimmers, like corruption or poor textures. It seems to be mostly trees or mountainsides that shimmer. It does it on all drivers. Both my GTS and GT do this, so it ain't my card!

I notice this too, both on my 8800 cards and my 2900 XT. It almost makes it unplayable for me.
 
phide said:
I think the big question is why the only person who seems to be experiencing the skewing issue, as far as I'm aware, is Hanners.

I have heard from enough people to know that this issue isn't isolated to Hanners. I have faith that Hanners "does his homework" before posting any article (especially a high profile one), something certain websites have been failing to do as of late. It just upsets me when a good man has his name unnecessarily dragged through the dirt.
 
Right, but what you described was not what was in the "cheat". What you described is something your video card does (both ATI and NVIDIA) on every frame of every game. Everything outside of your "viewable area" (i.e. the view frustum defined by your FoV) is "culled" so that you don't waste precious cycles processing things that won't be displayed on screen at the end. I'm pointing out that what you described is not this "cheat", but a common thing that every game engine does.
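To illustrate the point about culling, here's a minimal sketch of view-frustum culling in Python. This is not any engine's actual code; all names here (`Plane`, `is_visible`, the toy frustum) are illustrative, and a real engine tests against six planes derived from the camera matrices:

```python
# Minimal sketch of view-frustum culling: objects whose bounding spheres
# lie entirely outside any frustum plane are skipped before rendering.
from dataclasses import dataclass

@dataclass
class Plane:
    # Plane equation: nx*x + ny*y + nz*z + d = 0, normal pointing inward.
    nx: float
    ny: float
    nz: float
    d: float

    def signed_distance(self, x, y, z):
        return self.nx * x + self.ny * y + self.nz * z + self.d

def is_visible(center, radius, frustum_planes):
    """Cull an object if its bounding sphere is fully behind any plane."""
    x, y, z = center
    for plane in frustum_planes:
        if plane.signed_distance(x, y, z) < -radius:
            return False  # entirely outside this plane -> culled
    return True

# A toy "frustum": just a near plane (z >= 1) and a far plane (z <= 100).
frustum = [Plane(0, 0, 1, -1), Plane(0, 0, -1, 100)]

print(is_visible((0, 0, 50), 1.0, frustum))   # between the planes -> True
print(is_visible((0, 0, -10), 1.0, frustum))  # behind the camera -> False
```

The point of the post above stands: skipping work for geometry outside the frustum is standard practice on all hardware, not a driver-specific trick.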


I didn't describe anything, you have me confused with another poster.
 
I didn't describe anything, you have me confused with another poster.

Ya, so it wasn't you who originally posted it, but you attempted to correct me, and you were wrong. I was never contesting this cheating; I was simply correcting the guy who stated that the cheat was some FoV cheat, which simply isn't true, as all hardware does this. Your posting links to the articles was semi-pointless, since that's not what was being debated.
 
I've heard from numerous people who have confirmed this issue with various ForceWare 169.0x versions, although admittedly it's far more apparent in the fly-by benchmark than during normal gameplay. It isn't just me though, that's one of the first things I checked. ;)

But Hanners, what you did was really irresponsible... you logically linked the "definitive performance difference" (which is contestable and within a margin of error anyway) to this corruption you saw. Just because you see a fix to the corruption and a slight framerate drop when you change the .exe name doesn't mean the corruption is what is buying the performance.

It's a logical fallacy to assume correlation = causation. But your article made that logical link, incorrectly and assumptively.

There are TONS of titles that come out with massive corruption on ATI hardware. Does anyone cry foul and claim they're cheating? No... it's just clear that they don't have every bug nailed down before the game/demo is released.
 
I have heard from enough people to know that this issue isn't isolated to Hanners. I have faith that Hanners "does his homework" before posting any article (especially a high profile one), something certain websites have been failing to do as of late. It just upsets me when a good man has his name unnecessarily dragged through the dirt.

I think doing homework would have included talking to NVIDIA about this since it is their driver, their part, and a TWIWMTBP game. Instead he only talked to AMD about it. He talks about treating NVIDIA fairly in the conclusion, but I don't think that is what happened.
 
Where does Hanners claim Nvidia is cheating?

OK, let's mince words. Does anyone assume AMD is hacking up the driver for some "optimization" and getting too aggressive with it when they have corruption in a game?

Happy?
 
I spend $400 on cards to get as high a framerate as possible, like "pro" gamers do (I did Quake 3 on an R8500 for FPS's sake), and I have an LCD with jaggies sharper than Nicole Richie's elbows.

What... again... quit spitting out nonsense.
 
I want to know at which end of the FPS spectrum the 1-2 FPS change is seen. Is it at min FPS, max FPS, or both? Because if it's at the min FPS, then it does help in making the game that tad bit smoother...
 
He made a connection that this "bug" was what was causing the performance difference. Changing the EXE name might be disabling other optimizations, and this could be the reason for a 1-3 FPS perf difference (or it could just be in the margin of error.)
 
He made a connection that this "bug" was what was causing the performance difference. Changing the EXE name might be disabling other optimizations, and this could be the reason for a 1-3 FPS perf difference

Which I did refer to in the article, and it's a fair point to some extent, although it's equally fair to at least suggest that the change in rendering workload this bug introduces is an obvious culprit for the performance change.

More importantly, it's useful for gamers to be aware of issues which may negatively impact their gaming experience, and how to work around them. Personally, I see it as an important part of our review work, and I always try to point out driver issues in reviews whenever I spot them. Simply brushing them under the proverbial carpet helps neither the graphics IHVs nor consumers.

(or it could just be in the margin of error.)

If I were running benchmarks on a graphics board with a 7% margin of error between runs, I'd start getting worried, in all honesty. We should also remember here that the in-built Crysis benchmark performs three runs of the timedemo (four if you're being picky, but the first run is discarded as it's used to 'cache' all the necessary data), producing a result for each one. There was not a 7% margin of error between test runs at the same settings, I can say that for certain.
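The methodology described above (a warm-up pass whose result is thrown away, then the remaining runs compared for spread) can be sketched roughly like this. The function name and the FPS numbers are made up for illustration; `summarize_runs` is not part of any real benchmark tool:

```python
# Sketch of the timedemo methodology: run the benchmark several times,
# discard the first ("cache warm-up") run, then report the mean of the
# rest and the run-to-run spread as a percentage of the mean.

def summarize_runs(fps_results):
    """fps_results: list of per-run average FPS; the first run is warm-up."""
    timed = fps_results[1:]                       # discard the caching pass
    mean_fps = sum(timed) / len(timed)
    spread_pct = (max(timed) - min(timed)) / mean_fps * 100
    return mean_fps, spread_pct

# Hypothetical numbers: a slow first pass while data streams in, then
# three tightly clustered timed runs.
runs = [31.2, 36.4, 36.1, 36.5]
mean_fps, spread = summarize_runs(runs)
print(f"{mean_fps:.1f} fps, run-to-run spread {spread:.1f}%")  # 36.3 fps, run-to-run spread 1.1%
```

A spread of around 1% between timed runs is what you'd hope to see; a 7% swing between runs at identical settings would indeed suggest something is wrong with the test setup.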

Also, I'll mention again that I received no e-mail communication regarding this issue from ATI/AMD whatsoever. Hopefully this will clear up Kyle's obvious confusion about this point, as I'm sure such a respected web site owner wouldn't want to damage the reputation of a fellow site erroneously.
 
As a final point, I've also just heard that NVIDIA already have a fixed driver in the pipeline for this issue, which should remove the water reflection bug while leaving performance largely intact.

Good news all around, I would say. :cool:
 
Also, I'll mention again that I received no e-mail communication regarding this issue from ATI/AMD whatsoever. Hopefully this will clear up Kyle's obvious confusion about this point, as I'm sure such a respected web site owner wouldn't want to damage the reputation of a fellow site erroneously.

*cough*
 
As a final point, I've also just heard that NVIDIA already have a fixed driver in the pipeline for this issue
Well, I guess that pretty much squares things away. Out of curiosity though, who did you hear about this from?
 
Well, for me, the important thing they fixed with 169.04 was they made my 8800 GT work. When I tried running it with the 169.01, or even the drivers they submitted to WHQL, it would freeze at the "Don't copy this demo or we'll be pissed" screen. I grabbed the 169.04 drivers, no problem, and I'm not complaining about water update rates. There's enough other beauty in the game to look at, I say don't whine about little shit like this. Just be frickin happy the game runs and is finally coming out after all the delays. :)
 
Well, I guess that pretty much squares things away. Out of curiosity though, who did you hear about this from?

ChrisRay confirmed it's fixed in the next driver; this is a bug, not a cheat, no matter how much ATI wants it to be.
 
Those are pretty unfair comments to make to Hanners and EB, especially considering they are one of the few stand up guys in this industry.

So Hanners is "fair" to NVIDIA when he does not discuss the issue with them before publication? And I am unfair for pointing out that this story was delivered to him from AMD? AMD came to HardOCP with this story as well last week....

As far as EB being "stand up guys," I really don't see it that way especially with this story.

Hmmm, maybe Hanners will post here since he is active in this thread and state that he has not worked with AMD on this particular issue? I would like to see that happen.
 
Sorry I missed this post....

Also, I'll mention again that I received no e-mail communication regarding this issue from ATI/AMD whatsoever. Hopefully this will clear up Kyle's obvious confusion about this point, as I'm sure such a respected web site owner wouldn't want to damage the reputation of a fellow site erroneously.

AMD called me; they did not communicate with me via e-mail on this. I find it very interesting that AMD wanted me to do a story on this, and you just so happened to have the exact same story up a couple of days later. And I have heard specifically that the story was delivered to you. Maybe my source is wrong. Maybe he could clear it up?

But I am not "confused" about what was related to me on this issue. I am not making guesses about it, or simply coming up with some crazy ideas. The story is sloppy at best and I think you know that. That is my take on it along with my opinion. I figure you would not be posting in our forum if you did not want to hear my thoughts.
 
Ya, so it wasnt you who originally posted it, but you attempted to correct me, and you were wrong. I was never contesting this cheating, I was simply correcting the guy stating that the cheat was some FoV cheat which simply isn't true, as all hardware does this. You posting links to the articles was semi-pointless, since that's not what was being debated.

Oh, so the post right above mine, in which Kyle beat me to linking the same article, was pointless as well? It's what the poster was referring to, whether or not he used the exact correct terminology.

We all knew what he meant, and that is what the linked article discussed.
 
As long as it's simply your opinion, rather than a baseless accusation, that's fine. You're entitled to it, wrong though it is, and I'll happily leave it at that.

It was an accusation, but it was far from baseless. Anyway, glad to see you directly reply.
 
A weird reflection in a timedemo in a “Pre-Release Demo” with “Beta” drivers = cheat, yeah, ok, and I'm Santa Claus. JMO

Quoted for Justice... Brent Justice! (I've been waiting a while for that setup)

Seriously, though... the "story" was leaked by the competition, and is retarded considering both the drivers and game are pre-release. Kudos to the [H] for reserving judgment.
 
Quoted for Justice... Brent Justice! (I've been waiting a while for that setup)

Seriously, though... the "story" was leaked by the competition, and is retarded considering both the drivers and game are pre-release. Kudos to the [H] for reserving judgment.

Are you saying that demos shouldn't be used for benchmarking? This particular one is used quite often. Maybe [H] should remove the Crysis results from its 8800 GT review, as the game and the drivers are both pre-release.
 
Why is everyone getting their panties in a bunch? This is a DEMO! Not the full release, something YOU HAVE TO PAY FOR! So isn't this all a tad pointless?

If it was based on the full blown release of Crysis, then I can see some stink about it. This article is getting out of hand. If the retail version comes out and everything is fine and dandy, then this article and all this chat is pointless...

We all know that ATI/NVIDIA will be releasing drivers for this game. Can't we just fricken wait until the game is released in what? a week and a half the most? Sheeesh!
 
Are you saying that demos shouldn't be used for benchmarking? This particular one is used quite often.
I think what Seal's trying to reaffirm is that some might have the feeling that this story was premature. The demo is pre-release; the drivers are beta. Ergo, bad shit is destined to happen. That being said, the article itself isn't charged either way, as Moofasa's been hammering, and it's really just a performance report/bug report, though some of the conclusions are a little iffy, and the circumstances are a bit... coincidental.

I don't think there's anything wrong with benchmarking with the demo per se, though it makes more sense to wait until the retail version is out.

I wonder if anyone would have gotten their knickers in a twist had the thread title not been so unbelievably misleading and, well, just plain stupid. The article says something akin to "the Earth is 1°F hotter", yet the thread title is akin to "MAN-MADE GLOBAL WARMING CONFIRMED -- SEE SUCH-AND-SUCH ARTICLE".
 
^^^ Look, the same bandwagon that was SURE the demo would play better than the beta, which turned out to be false. There is no reason to think the full game will be any different from the demo, except for being longer, of course.
 
Are you saying that demos shouldn't be used for benchmarking? This particular one is used quite often. Kyle jumped all over ATI when the 2900 XT was released. No reservations about his judgement at that time. And yet he gets a "Kudos to the [H] for reserving judgment" for this one. Funny.

And Kyle jumped all over the 5800 when it was released too, and the 9800 Pro still stomped all over it... If I recall correctly, at release time the 2900 XT under-performed compared to its competition, cost more, used more power, yadda yadda... the review seemed justified to me.

But, this is all secondary - the fact is the card was an official release. In contrast, these drivers and this game are not.

And for the record, I think canned timedemos should NOT be used for benchmarking... actual in-game experience should be the standard (and has been at the [H] for a long time).
 
Why is everyone getting their panties in a bunch? This is a DEMO! Not the full release, something YOU HAVE TO PAY FOR! So isn't this all a tad pointless?

If it was based on the full blown release of Crysis, then I can see some stink about it. This article is getting out of hand. If the retail version comes out and everything is fine and dandy, then this article and all this chat is pointless...

We all know that ATI/NVIDIA will be releasing drivers for this game. Can't we just fricken wait until the game is released in what? a week and a half the most? Sheeesh!

THIS article isn't getting out of hand. It is this thread and the pointless mudslinging that is over the top. The article is pretty neutral, lest you believe the 2 guys that don't think so in this thread.
 
I think what Seal's trying to reaffirm is that some might have the feeling that this story was premature. The demo is pre-release; the drivers are beta.
...
I don't think there's anything wrong with benchmarking with the demo per se, though it makes more sense to wait until the retail version is out.

Indeed.
 
THIS article isn't getting out of hand. It is this thread and the pointless mudslinging that is over the top. The article is pretty neutral, lest you believe the 2 guys that don't think so in this thread.

Absolutely agreed. I have nothing more to say here, I'd like to think that everybody here knows my thoughts on the entire situation, as well as the series of events which led to it.
 
I think what Seal's trying to reaffirm is that some might have the feeling that this story was premature. The demo is pre-release; the drivers are beta. Ergo, bad shit is destined to happen. That being said, the article itself isn't charged either way, as Moofasa's been hammering, and it's really just a performance report/bug report, though some of the conclusions are a little iffy, and the circumstances are a bit... coincidental.

I don't think there's anything wrong with benchmarking with the demo per se, though it makes more sense to wait until the retail version is out.

And we have done stories on pre-release / beta games and drivers in the past, but we did not do 3 page articles with tons of benchmarks that basically pointed out Team Green or Team Red doing something underhanded. It is a waste of time. We decided to spend our time seeing how Gears of War for the PC was doing. :)

That said, we used Crysis and UT3 to evaluate the 8800 GT overall because I think that is what our readers care about. But spending our time and focus on something so minuscule that implies cheating is not a good way for us to spend our resources. (It would have been great to see them try to pull off something of the magnitude of Quake/Quack; then we could have unloaded both barrels!) Taking 3 hours and throwing together some benchmark numbers does not prove much of anything nowadays, at least that is what I think. Playing around with the Crysis internal benchmarks has so far left me unimpressed as well, but we will keep using them on the CPU side to see if we see processor scaling with their physics engine, though I am not sure we will.
 
Duly noted. It's a shame you think that, but oh well...

I can't tell you how many times over the years I have seen your comments about our content and thought the same thing... guess it's karma.

Bottom line is your talents could have been at a much better use elsewhere IMO.
 
THIS article isn't getting out of hand. It is this thread and the pointless mudslinging that is over the top. The article is pretty neutral, lest you believe the 2 guys that don't think so in this thread.
I never stated the article isn't neutral; I said it's a little premature to raise the question when the release of the game is days away. It's not like we are waiting months for the game to arrive. The demo appeared what, two weeks before the launch of the game?

As a gamer, I'm smart enough to understand that this is a demo and not the full version. If there are glitches, I can accept that and wait to see if the glitches are in the final version. When the game is available and the drivers that are available are producing these image screwups, then I'll be up in arms. This article just tells me that the demo and the drivers out now for the demo are not a match made in heaven. I can deal with that because it's a demo. If the problem occurs in the retail version, then get back to me...
 
Kyle and Hanners are gonna start making out if this type of argument keeps going between them.

Solid Gears of War article though, I hadn't read it until it was mentioned here.
 
ChrisRay confirmed it's fixed in the next driver; this is a bug, not a cheat, no matter how much ATI wants it to be.


It's a "bug" now that it has been exposed and rectified, isn't it? If no one noticed and no one complained, would it be a cheat?

It doesn't really matter. Those of us with top hardware still aren't going to get high framerates with this game.
 
Kyle and Hanners are gonna start making out if this type of argument keeps going between them.

We've both gotta bring in the hits to our respective sites on Valentine's Day somehow...

Okay, that really is my closing post before things get way too weird. :D
 
We've both gotta bring in the hits to our respective sites on Valentine's Day somehow...

Okay, that really is my closing post before things get way too weird. :D

I am bringing the tongue with my sloppy kisses.
 