Atomic investigates NVIDIA’s long-ignored accusation: ATi's FP16 demotion usage.

ATI *IS* doing this demotion from higher precision to lower precision. That is a fact. They are changing the image without the user's or the developer's permission. The developer asked for one format and AMD gave them something lesser to get higher FPS.

That is why they are only doing it with DX9 games - devs *CAN'T* use the format that *both* ATI and Nvidia recommend for HDR because it isn't in DX9 at all.

That's AMD's fault for not implementing the hacks so they can be turned off individually, or better yet, defaulting them to off. Why are they lumping a bunch of legitimate optimizations in with a bunch of IQ hacks? That's their fault. They shouldn't be hacking IQ in the first place.

And you can't turn off *any* of the IQ hacks Nvidia is using (and yes, they *are* using IQ hacks). So do you want the option to turn them on and off and see if there is a difference (AMD), or for a company to make the choice for you with no alternative (Nvidia)?
 
In reference to Batman: AA, if you buy the game right off the shelf and install it, there is absolutely no support for ATi AA; you have to patch the game first.

blue storm,

I take it you did not actually read the article in question, did you? Demotion is not available in DX9 to be used, and it is up to Nvidia AND ATi to rework the code to make it happen.
 
Take off your red glasses and see for yourself.
http://forum.beyond3d.com/showthread.php?t=57870

They have subpar image quality.

ATI lowers image quality to try and gain performance (although judging from recent benchmarks they are not getting enough).

And I could just link you to Nvidia cheating in Crysis, specifically the benchmark, so what's your point?

They both optimize; from where I'm sitting, with ATI you need a special tool to figure out it's happening, while with Nvidia you just need an eyeball or two.

Also, if you read the rest of that thread, you'll see many people disputing whether or not what ATI is doing is even cheating, as there isn't a defined "right" way. You'll also see plenty of people debating the results, trying to figure out what is actually going on.

Subpar? Not really.
 
Prime1 just likes to hang off of Nvidia's sac.....as he obviously overlooks Nvidia doing the exact same thing while they are calling ATi out for it...
 
Take off your red glasses and see for yourself.
http://forum.beyond3d.com/showthread.php?t=57870

They have subpar image quality.

ATI lowers image quality to try and gain performance (although judging from recent benchmarks they are not getting enough).

Please, you act as if AF had too much of an impact on performance. AF has been essentially free since the Radeon 9700 PRO era, which was the very first to offer it with basically no reduction in performance. :rolleyes:

http://www.anandtech.com/show/3868/quick-look-powercolors-radeon-hd-5770-pcs-vortex-edition/2

From 21 tests, the HD 5870 won 11, tied 3 and lost 7 compared to the GTX 470. From 21 tests, the HD 5850 won 15, tied 2 and lost 4 compared to the GTX 460 1GB. Don't even mention the HD 5970, which smokes the GTX 480. Are you gonna claim that such gains are thanks to its AF performance? Please :rolleyes:

So which is the better buy??
 
Please, you act as if AF had too much of an impact on performance. AF has been essentially free since the Radeon 9700 PRO era, which was the very first to offer it with basically no reduction in performance. :rolleyes:

Sure it's free, because it's awful. Lots of awful stuff can be had for free. Dog poo, for example.

Personally I just don't see why anyone would want to sacrifice image quality & buy a slower card. Just my 2 euros.
 
Funny, when I first heard about nVIDIA's accusation of AMD, it reminded me of the GeForce FX days when nVIDIA was caught "cheating" in 3DMark 03 and several other games that were out at that time in 2002/2003. :rolleyes:

Back then, the GeForce FX lineup was only capable of FP16 & FP32, but not FP24 (which is what the DX9 spec called for). The GeForce FX could run FP32 when full precision was called for, but this carried a huge performance penalty for the FX cards, so nVIDIA optimized the FX drivers to run in FP16 whenever possible so that they would remain competitive with the DX9 Radeon 9xxx series (which were capable of FP24). An old AnandTech review of the GeForce FX 5900 series describes more about this here: http://www.anandtech.com/show/1104/9
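For anyone who has forgotten what the FP16-vs-FP32 gap actually means in numbers: FP32 carries a 23-bit mantissa, FP16 only a 10-bit one. Here's a deliberately simplified C++ sketch (my own toy function, not anything from a driver - it skips rounding, NaNs and denormals) showing how much gets thrown away in the conversion:

```cpp
#include <cstdint>
#include <cstdio>
#include <cstring>

// Deliberately simplified float -> half conversion: no rounding, denormals
// flush to zero, overflow (and NaN) clamps to infinity. FP32 is 1 sign /
// 8 exponent / 23 mantissa bits; FP16 is 1 / 5 / 10, which is why it has far
// less precision and range.
static uint16_t float_to_half(float f)
{
    uint32_t bits;
    std::memcpy(&bits, &f, sizeof(bits));

    uint32_t sign     = (bits >> 16) & 0x8000u;                    // sign moves to bit 15
    int32_t  exponent = int32_t((bits >> 23) & 0xFFu) - 127 + 15;  // re-bias 127 -> 15
    uint32_t mantissa = (bits >> 13) & 0x3FFu;                     // keep the top 10 of 23 bits

    if (exponent <= 0)  return uint16_t(sign);            // too small: signed zero
    if (exponent >= 31) return uint16_t(sign | 0x7C00u);  // too large: infinity

    return uint16_t(sign | (uint32_t(exponent) << 10) | mantissa);
}

int main()
{
    std::printf("1.0f    -> 0x%04X\n", unsigned(float_to_half(1.0f)));    // 0x3C00
    std::printf("2049.0f -> 0x%04X\n", unsigned(float_to_half(2049.0f))); // same half as 2048.0f
}
```

Whether that lost precision is ever visible depends on what the values are used for, which is basically the whole argument in this thread.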

Frankly, I'm surprised nVIDIA is still concerned with "cheating" in old DX9 games that really aren't benched by most (or any?) reviewers anymore; it's as if nVIDIA never got over the driver optimization scandal from 2003.
 
That is why they are only doing it with DX9 games - devs *CAN'T* use the format that *both* ATI and Nvidia recommend for HDR because it isn't in DX9 at all.



And you can't turn off *any* of the IQ hacks Nvidia is using (and yes, they *are* using IQ hacks). So do you want the option to turn them on and off and see if there is a difference (AMD), or for a company to make the choice for you with no alternative (Nvidia)?

You're dead wrong. NVIDIA does not put any IQ hacks in the driver.. it's a matter of principle. After the whole cheating debacle both companies went through with older hardware, many years ago, they have avoided this shit. Only an idiot would expose themselves to this twice.

Every optimization NVIDIA puts in its driver now is verified not to affect IQ, both theoretically and in reality (looking at the image).

And you should do your homework... this has nothing to do with HDR... the point is a developer asked for a format and ATI is choosing to swap it without the developer's permission... who is the IHV to do the thinking for the developer? "Don't mind us while we degrade the IQ of the game you spent years working on.."
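For reference, this is roughly what "the format the developer asked for" looks like in DX9 code: a plain FP16 render-target request (the helper below is my own illustration, not code from any particular game). Demotion would mean the driver quietly backing this request with something narrower:

```cpp
#include <windows.h>
#include <d3d9.h>

// A DX9 game that wants an HDR render target typically asks for
// D3DFMT_A16B16G16R16F: 16-bit floating point per channel, 8 bytes per pixel.
// The argument here is about the driver substituting a cheaper format behind
// this call without the game knowing.
IDirect3DTexture9* CreateHdrTarget(IDirect3DDevice9* device, UINT width, UINT height)
{
    IDirect3DTexture9* target = nullptr;
    HRESULT hr = device->CreateTexture(
        width, height,
        1,                        // a single mip level for a render target
        D3DUSAGE_RENDERTARGET,
        D3DFMT_A16B16G16R16F,     // the FP16 format the developer explicitly requested
        D3DPOOL_DEFAULT,
        &target,
        nullptr);                 // no shared handle
    return SUCCEEDED(hr) ? target : nullptr;
}
```

Whether a driver should ever second-guess that format behind the developer's back is the actual argument.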
 
AMD is desperate. They are still running on the old R600 core. It looks like the 6xxx series is going to be a major disappointment. So they need to try every trick in the book to try and maintain customers. Certainly some uninformed individuals will not know of these image quality problems.
 
Sure it's free, because it's awful. Lots of awful stuff can be had for free. Dog poo, for example.

Personally I just don't see why anyone would want to sacrifice image quality & buy a slower card. Just my 2 euros.

Because, oddly enough, Nvidia's IQ isn't any better.

http://www.hardocp.com/article/2009/09/22/amds_ati_radeon_hd_5870_video_card_review/6
What will catch your eyes immediately is the fact that the Radeon HD 5870 exhibits a perfect circle on all filtering levels as we look down the tunnel. The GeForce GTX 285 seems to be better than the Radeon HD 4890, so the HD 5870 is a big improvement over AMD’s previous generation.

In the second screenshot we have enabled 16X AF. Once again we find the Radeon HD 5870 produces the best 16X AF out of the bunch. It is a perfect circle, with no dependencies on angles like the Radeon HD 4890. I would say ATI has succeeded in providing the absolute best filtering quality in a gaming graphics card.

Likewise, I hope you enjoy picking 8x AA in game and getting something that isn't 8x MSAA
 

NV's AF is still better than ATI's - see this. The next page of that article also has actual game screenshots that show NV's AF to be slightly superior.

Likewise, I hope you enjoy picking 8x AA in game and getting something that isn't 8x MSAA

Generally, selecting 8x AA in a game gets you 8x MSAA. Some games offer the CSAA modes too (Source games, BC2 are the ones I'm aware of) but those are offered in addition to the MSAA modes.
 
What can you say about it, PRIME1? Trolling? A successful troll is a successful troll.

I can see it being a troll post but he's more accurate than 80% of the posts in this thread spewing biased opinions towards both brands with no evidence to back any of it up. It's just one big ol' flame war now.
 
Pretty pointless topic that has turned into a flame war? Meh......

I can see it being a troll post but he's more accurate than 80% of the posts in this thread spewing biased opinions towards both brands with no evidence to back any of it up

Hmmm........I'll just leave now.
 
Generally, selecting 8x AA in a game gets you 8x MSAA. Some games offer the CSAA modes too (Source games, BC2 are the ones I'm aware of) but those are offered in addition to the MSAA modes.

No, just selecting 8xAA in game gets you 8xCSAA by default. Games have to be aware of Nvidia's extra AA modes to get real 8x MSAA (also called 8xQ). The default is that if a game requests an MSAA surface, it gets a CSAA one back (what was that about not giving developers what they ask for? Yeah, like I originally said - the pot calling the kettle black).
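To be clear about the mechanism everyone is arguing over: a D3D9 game only hands the runtime a sample count plus a quality level, and anything beyond that is up to the driver. A rough C++ sketch, with Request8xAA being an illustrative name of my own:

```cpp
#include <windows.h>
#include <d3d9.h>

// In D3D9 a game only names a sample count plus a quality level; the meaning
// of non-zero quality levels (and the actual sample/coverage pattern used) is
// vendor-defined. That hand-off is the hook the MSAA-vs-CSAA argument is about.
bool Request8xAA(IDirect3D9* d3d, D3DPRESENT_PARAMETERS& pp)
{
    DWORD qualityLevels = 0;
    HRESULT hr = d3d->CheckDeviceMultiSampleType(
        D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
        pp.BackBufferFormat, pp.Windowed,
        D3DMULTISAMPLE_8_SAMPLES, &qualityLevels);

    if (FAILED(hr) || qualityLevels == 0)
        return false;                      // 8 samples not exposed for this mode

    pp.MultiSampleType    = D3DMULTISAMPLE_8_SAMPLES;
    pp.MultiSampleQuality = 0;             // non-zero quality levels are vendor-defined
    return true;
}
```

Either way, the game is trusting the driver to hand back something sensible for that request.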

You're dead wrong. NVIDIA does not put any IQ hacks in the driver.. it's a matter of principle. After the whole cheating debacle both companies went through with older hardware, many years ago, they have avoided this shit. Only an idiot would expose themselves to this twice.

Every optimization NVIDIA puts in its driver now is verified not to affect IQ, both theoretically and in reality (looking at the image).

And you should do your homework... this has nothing to do with HDR... the point is a developer asked for a format and ATI is choosing to swap it without the developer's permission... who is the IHV to do the thinking for the developer? "Don't mind us while we degrade the IQ of the game you spent years working on.."

Care to prove that? Nvidia got busted putting an IQ hack into Crysis, so yes, they *DO* put IQ hacks in the driver.

You could always post some optimization on/off comparisons.... oh, wait, no you can't, Nvidia doesn't let you do that.

And you should do your homework, this is absolutely about HDR. That is the whole point of the format: HDR. And both Nvidia and ATI do things behind developers' backs all the bloody time. Usually people call it "game support", aka Nvidia and AMD fixing the devs' broken shit by implementing workarounds in the driver. Likewise, claiming it "degrades IQ" is a bit of a stretch, since in the affected games it turns out they look identical. As in, AMD made the right call and improved performance without sacrificing IQ.
 
I can see it being a troll post but he's more inaccurate than 80% of the posts in this thread spewing biased opinions towards both brands with no evidence to back any of it up. It's just one big ol' flame war now.

Fixed :)

How can you call him accurate when he's spreading lies and misinformation? Both GPU vendors offer similar optimizations to boost performance without degrading image quality; the only difference is that ATi offers you an option to disable them and nVidia does not. The image quality topic was beaten to death eons ago; both are too close to call. AMD has the closest-to-reference DirectX AF implementation, and a higher-quality anti-aliasing mode which uses a sophisticated edge-detection algorithm running in shaders for the resolve.
 
NV's AF is still better than ATI's - see this. The next page of that article also has actual game screenshots that show NV's AF to be slightly superior.

Your link says the opposite of what you are saying..

Generally, selecting 8x AA in a game gets you 8x MSAA. Some games offer the CSAA modes too (Source games, BC2 are the ones I'm aware of) but those are offered in addition to the MSAA modes.

Some games, like Crysis, do not tell you that 8xAA is CSAA; it just says 8xAA in the menu..
:rolleyes:

-----------------------------------

Why are you people feeding the troll? It is obviously someone just trying to start a flame war here and you guys just fall for it.
You can see that every other post from him is exactly the same thing all over again.
 
How can you call him accurate when he's spreading facts that make me cry? Both GPU vendors offer similar optimizations to boost performance, except AMD degrades image quality

Fixed.

Read the Atomic PC article, read the alienbabeltech article, read the beyond3d thread. AMD has subpar image quality and it may be an attempt to boost performance. The 5870 has fallen behind 470 and could slip even further. It's not looking like the HD6000 is going to be impressive. So they need every trick in the book. I personally find it unacceptable.
 
Fixed.

Read the Atomic PC article, read the alienbabeltech article, read the beyond3d thread. AMD has subpar image quality and it may be an attempt to boost performance. The 5870 has fallen behind 470 and could slip even further. It's not looking like the HD6000 is going to be impressive. So they need every trick in the book. I personally find it unacceptable.

I may be misinterpreting some things, but the Atomic article states this:

From our testing it's clear that ATI's implementation of FP16 Demotion does affect image quality in games as challenged by NVIDIA. However, the extent and prevalence of it is not universal - from four games, only one showed any real visible influence. On the other side of the coin are performance improvements, which are plentiful in two games from four: boosting performance by 17 per cent at high resolutions, and a still-impressive 10 per cent at lower resolutions.

Ultimately the decision to run FP16 Demotion has always been up to the end-user, as the use of Catalyst A.I has been optional since its first inclusion in Catalyst drivers - now many years ago - and as it appears predominantly beneficial or benign in the majority of cases, we'd suggest you leave it enabled.

We also can't help but wonder why NVIDIA attempted to make such an issue over this rendering technique; one that they wholeheartedly support and simultaneously condemn. In the two titles that their hack functioned with we saw no image quality nor performance loss, and rather the opposite: a performance boost. Why haven't NVIDIA included support for FP16 Demotion in their drivers until now for specific titles? Why choose now to kick up a fuss?

The answer, as always, will never really be known.

I'm not entirely sure what the big fuss is.
 
Which also brings me back to my point - how can the one game that shows a difference be verifiably chalked up to FP16 Demotion and not some other optimization or bug? Turning off Cat A.I. as they did in that article turns off ALL optimizations, not just FP16 Demotion.
 
Fixed.

Read the Atomic PC article, read the alienbabeltech article, read the beyond3d thread.

I did - they all say the same thing. Damn near identical image quality across the board with neither being better. Nvidia doing it differently does *not* automatically mean Nvidia's is better. And again, Nvidia does the same shit. They degrade IQ with their optimizations all the time, that's how most optimizations work. There is almost always a tradeoff of some sort.

The 5870 has fallen behind 470 and could slip even further.

Bwahahaha. Only if you're wearing green glasses maybe.

It's not looking like the HD6000 is going to be impressive.

All the rumors are saying the exact opposite.
 
PRIME1 is just trolling; he looks ridiculous with such FUD claims and misinformation. With his mentality, we could say that AF is the new Super-Sampling Anti-Aliasing. :rolleyes:
 
PRIME1 is just trolling; he looks ridiculous with such FUD claims and misinformation. With his mentality, we could say that AF is the new Super-Sampling Anti-Aliasing. :rolleyes:

I already posted 3 links. I won't even get into all the issues AMD has with AA. cough Batman cough.
 
PRIME1 is just trolling; he looks ridiculous with such FUD claims and misinformation. With his mentality, we could say that AF is the new Super-Sampling Anti-Aliasing. :rolleyes:

lol, he's been the Nvidia troll on these boards for a while now.

Maybe 3-4 years now?... It happens every time a new product is about to come out; he comes on here and spews PRO Nvidia, BAD ATI..

Makes me wonder if he is a part of the Tea Party....hmmm
 
I already posted 3 links. I won't even get into all the issues AMD has with AA. cough Batman cough.

You mean the part where Nvidia got the devs to add a vendor ID lockout to prevent AMD cards from running the standard DX9 AA code? How is that AMD's fault? Sounds like Nvidia knew it couldn't compete so it had to sabotage the competition.
 
You mean the part where Nvidia got the devs to add a vendor ID lockout to prevent AMD cards from running the standard DX9 AA code? How is that AMD's fault? Sounds like Nvidia knew it couldn't compete so it had to sabotage the competition.

And the funny thing is that Rocksteady added native Anti-Aliasing support for AMD hardware in the GOTY edition of Batman: AA!!! Rocksteady knew that adding the vendor ID lockout was a terrible mistake, especially when AMD owns 88% of the DX11 market according to the Steam Survey.
 
And the funny thing is that Rocksteady added native Anti-Aliasing support for AMD hardware in the GOTY edition of Batman: AA!!! Rocksteady knew that adding the vendor ID lockout was a terrible mistake, especially when AMD owns 88% of the DX11 market according to the Steam Survey.

Batman is a DX11 game? Seems you are the troll in this thread.
 
Batman is a DX11 game? Seems you are the troll in this thread.

He didn't say B:AA was a DirectX 11 game; he only said that it added AA support. What he's saying is that AMD controls most of the market for newer cards, which is why Rocksteady made an effort to appease the gamers who now use those cards.
 
Batman is a DX11 game? Seems you are the troll in this thread.

Way to go, idiot, your reading comprehension is down the toilet. Thanks heflys20 for the clarification. The best thing everyone in this thread can do is put this f****d troll on the ignore list and keep moving; we have derailed a little too much from the thread.

Back on topic, I can see that there are some scenarios where using lower-precision HDR will not affect image quality because the extra precision isn't needed. I also remember a 32-bit integer HDR mode introduced on the X1k architecture which used 10 bits per RGB component and 2 bits for the alpha channel; it showed great image quality with very low impact on performance. But I think that's history now that DX10+ hardware is around: developers are free to choose whatever they feel is the best option, plus current hardware is fast enough for HDR effects.
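Rough numbers on why demoting an FP16 target buys anything at all (format names are the D3D9 ones, the resolution is just an example I picked):

```cpp
#include <cstdio>

// Back-of-the-envelope cost of the HDR formats being discussed, using D3D9
// format names. Halving the bytes per pixel of a render target that is written
// and read every frame is where a demotion-style optimization gets its speed.
int main()
{
    struct Fmt { const char* name; int bytesPerPixel; };
    const Fmt formats[] = {
        { "A16B16G16R16F (FP16 HDR)",   8 },
        { "A2R10G10B10 (10:10:10:2)",   4 },
        { "A8R8G8B8 (plain 8-bit)",     4 },
    };

    const int width = 1920, height = 1200;  // example resolution only
    for (const Fmt& f : formats) {
        double mib = double(width) * height * f.bytesPerPixel / (1024.0 * 1024.0);
        std::printf("%-28s %d bytes/pixel, %.1f MiB at %dx%d\n",
                    f.name, f.bytesPerPixel, mib, width, height);
    }
}
```

That halving of render-target size and bandwidth is plausibly where the 10-17 per cent gains quoted from the Atomic article come from.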
 
Way to go, idiot, your reading comprehension is down the toilet. Thanks heflys20 for the clarification. The best thing everyone in this thread can do is put this f****d troll on the ignore list and keep moving; we have derailed a little too much from the thread.

What took you so long? It only took me a few posts of his to see how much FUD he spreads...

Batman, really? Haha, yeah... good argument there :eek:
 
Just add PRIME1 to your ignore list; he's probably the most blatant Nvidia fanboy in these forums.
 