AMD degraded AF quality with 68xx launch!

There are several reasons for trolling most of the time; Prime is a rare case. Either way, I get frustrated by fanboyism on either side, where even when facts are presented they try everything in their power to shill or dupe people into believing the opposite. People are such followers, lending their loyalty to one company over another and defending it at all costs.

My loyalty is to my wallet, and if either company is pulling crap I'll be happy to bash them. Facts are being presented here and there is something fishy going on. All this back and forth is frustrating and is enough to draw every troll in. I guess what I'm saying is that there is such a thing as troll bait; I've seen this many times before.
 
OK, so none of this is FUD :( It's time this be looked into further. I call upon Kyle/Brent/Mark to shine some light on this issue; one US reviewer has to step up and take a no-bullshit approach to it. How much performance is AMD gaining by conveniently having such a poor algorithm for aniso? Honestly it looks like shit, as I'm sensitive to texture shimmer. I noticed this on my wife's setup but thought it was because she had a low-end 5000 series card. It looks like it's worse on the 6000 series in some instances.

It is definitely worse than on the 5000 series, and the 5000 series now also gets worse AF quality than before with the newest Catalyst 10.10.

Here is what 3DCenter has to say (posted that link before):
http://www.3dcenter.org/artikel/amd...n-bei-der-filterqualitaet/amds-radeon-hd-5800

Default Quality settings from best to worst
1. GeForce 200/400 series
2. HD 5400/5500/5600/5700
3. HD 5800/5900
4. HD 6800

Highest possible settings from best to worst
1. GeForce 200/400 series "High Quality" Setting
2. GeForce 200/400 series "Quality Setting"
3. HD 5400/5500/5600/5700 "AI OFF"
4. HD 5800/5900 "AI OFF"
5. HD 6800 "High Quality"

I left out the theoretical 100% reference from their figure, so don't get confused when looking at the above-mentioned ranking.
Their final verdict clearly states that Nvidia's HQ and Q settings are superior to anything AMD currently offers.
But what annoys them, and I think most of us, even more is that AMD lowered the quality on the new products.
While it could be somewhat understandable to sacrifice a little bit of image quality for better performance on the lowest-end cards, it is absolutely unacceptable on mid-range/performance and high-end cards.
It is a really bold move from AMD to claim improved image quality in their launch presentations while in reality the Barts-based HD 6850 and HD 6870 products have inferior filtering quality compared to Cypress- and Juniper-based products. Not to mention that they lose quite a bit of FPS if you bench them in HQ mode against the direct competition while still having the worse filtering quality. In that case the GTX 460 1GB > HD 6850 and GTX 460 OC models >= HD 6870, and the GTX 470 clearly wins.
 
BTW, 3DCenter.de is one of the most trustworthy sites. They don't receive any review hardware from the IHVs because they very often point the finger at the existing problems of GPU generations, no matter whether it's Nvidia or AMD.

http://www.3dcenter.org/artikel/amd...00-mit-rueckschritten-bei-der-filterqualitaet

http://www.3dcenter.org/artikel/amd...n-bei-der-filterqualitaet/amds-radeon-hd-5800

How does AF on a 6800 look 5% better than on a 5800? Why not 7%? Or 3%? Do they have any margin for error? And why has nobody noticed this 5% difference in games? How did they calibrate their eyeballs and brains to produce these numbers?
 
They assign a percentage value to subjective image quality. How is a 6800 5% better at AF than a 5800? Why not 7%? Or 3%? What's the margin for error? And why has nobody noticed this 5% difference in games?

Yeah, I didn't even bother to look at that link. For one, I don't know what they're saying (aside from what's indicated in the chart), and second, I see no pictures or visual comparisons to prove their point.
 
Almost every reviewer, including [H], benches at default driver settings (which is never HQ, but Q). Also, we're talking about the AF for the 6xxx series, not the 5xxx. Did you even read any of the posts in this thread before flying off the handle?

How can you own a video card and not even notice that every time you do a true clean install your settings are put back to Q?


Good job referencing an article from last year and missing the point. The settings people are talking about are the driver-controlled AF settings, not in-game settings. AMD's default AF quality has gone *down* since the article you mentioned, and it does have an impact on performance, otherwise AMD would default to HQ.

Flying off the handle? I wasn't, but I am thinking of flying off one now!! :) You are both wrong, about both the HardOCP review and the default settings for AMD cards.

Vagrant_zero, how can you own a video card and not know that the default setting for AF is "use application settings"? Now if you look, and actually really look, at the HardOCP review, you will see that they use 16xAF in every game for both cards.

Ocellaris, read above: the default for AMD cards for AF is "use application settings". The review here is using 16xAF, so what performance advantage does either card have? Oh wait, that's right, none.

I posted a link to that article to show that AF used to be really good on AMD cards. It's only on the new cards that there is the faintest shimmering. This could simply be something that was overlooked in the rush to get new cards and drivers out the door in time for the release date.

And if this was really a serious issue, don't you think Nvidia would be all over it, like the time a month or so back when they complained about the FP16 demotion trick that AMD was supposedly using?

But I think this is just a minor problem that most people probably won't even notice once they turn up AA and other graphics settings. I am with Fattypants and Heflys on this one!! 5% difference, :D
 
Check out the texture shimmering with HQ here, which doesn't happen when you use HQ or even only Q on a GTX 460:
http://www.tweakpc.de/hardware/tests/grafikkarten/amd_radeon_hd_6870_hd_6850/s09.php

Yeah, this is being investigated in my thread about 5000 series filtering on B3D. Preliminary tests show that altering the LOD bias setting has interesting effects:

If you bump the LOD bias to +0.65, the shimmer disappears. On NV cards, nudge the LOD bias to -0.65 and it appears.

You can test this for yourself using the links in my first post in this thread.

A negative LOD bias gives a sharper picture, but a bigger performance hit and shimmering. A positive LOD bias gives a less sharp image, but more performance and no shimmering. Minor adjustments to the LOD bias don't change picture quality much. In some of these old games, ATI and Nvidia had different LOD settings.

If the default LOD settings are on the negative side for AMD (sharper picture, less performance, shimmering) while on the positive side for Nvidia (less sharp picture, less shimmering, more performance), then there is no cheat or lower quality in AMD's AF settings. If anyone were to point a cheat finger, it would be at Nvidia for choosing the more performance-oriented option, while the case might be that nobody is cheating anyone.
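
If anyone wants to see what the LOD bias knob actually does to mip selection, here's a minimal sketch of the textbook formula (just my own illustration, not either vendor's actual hardware path; the 3 texels-per-pixel figure is a made-up example):

Code:
import math

def mip_level(texels_per_pixel, lod_bias=0.0, num_mips=11):
    # Textbook mip selection: lambda = log2(footprint) + bias, clamped to
    # the available mip range. texels_per_pixel is how many texels the
    # pixel footprint covers along its longer axis.
    lam = math.log2(max(texels_per_pixel, 1e-6)) + lod_bias
    return min(max(lam, 0.0), num_mips - 1)

# Example: a surface patch covering about 3 texels per pixel.
print(mip_level(3.0))         # ~1.58 -> blends mips 1 and 2
print(mip_level(3.0, -0.65))  # ~0.93 -> sharper mip, more prone to shimmer
print(mip_level(3.0, +0.65))  # ~2.23 -> blurrier mip, shimmer goes away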

For AMD cards, if people want to test out different LOD, RadeonPro is an excellent tool. :)
 
Positive LOD makes the image blurry; NVIDIA HQ stays sharp ;)

With minor adjustments to LOD bias you will be hard pressed to notice any difference in sharpness. If you have an Nvidia card, download the 3DCenter Filter Tester, load the Ground2 texture, and set AF to 16x. Move the LOD bias slider to -0.65 and try to pinpoint any difference between -0.65 and 0. At -0.65, shimmering will appear if you set Y=0.3. On AMD cards, if you do the same but increase the LOD bias to +0.65, the shimmering will disappear.
 
How does AF on a 6800 look 5% better than on a 5800? Why not 7%? Or 3%? Do they have any margin for error? And why has nobody noticed this 5% difference in games? How did they calibrate their eyeballs and brains to produce these numbers?

It does not look better, it looks worse! You mixed something up. The percentage figures are a new thing, meant to present IQ differences with some kind of number, because the average reader will otherwise ignore it and just look at the FPS figures. It is easier to understand. Of course it is hard to measure, but if you try to be as fair as possible you can do it. They will have a complete article on this in the near term.

You have to keep in mind that they have to buy their hardware unless an AIB sends one to them! Neither Nvidia nor AMD will send any hardware for review to 3DCenter. Why? Because they do not do standard launch reviews that stick to the reviewer's guide. They always analyze the products very deeply and favor neither IHV. The 3D community there is the biggest in Germany, with a lot of very experienced people, and some of them work for free in their spare time for the website, with no support from the IHVs. That's why it is safe to say that they are not in the pocket of either Nvidia or AMD. They were also one of the first websites to dig deep into the G7X filtering issue and bash Nvidia very hard for it. This time it's AMD, and to be honest they deserve it. It's a joke to offer this kind of filtering quality on 2010 products while telling everyone how it has been improved.
 
It does not look better, it looks worse! You mixed something up. The percentage figures are a new thing, meant to present IQ differences with some kind of number, because the average reader will otherwise ignore it and just look at the FPS figures. It is easier to understand. Of course it is hard to measure, but if you try to be as fair as possible you can do it. They will have a complete article on this in the near term.

So, basically, they pulled those numbers out of their butts? Regardless, I'm supposed to take their word with no charts or anything, because they're "experienced"? I have no experience with any of these foreign sites, so I'll have to take your word on their validity. I do, however, expect comparisons and evidence when someone makes such a bold claim. That said, one of the sites I do understand (and which, BTW, enables HQ in all their AMD reviews), Xbitlabs, reached this conclusion on the issue:

"Anisotropic filtering doesn’t have a serious effect on the performance of modern GPUs, so algorithms can be used whose filtering quality doesn't depend on the angle of inclination of the plane. AMD and Nvidia have both transitioned to high-quality anisotropic filtering already and the Radeon HD 6800 just improves the algorithm further to smooth out transitions between mipmaps so that they would be less conspicuous in textures with lots of small details."

Here's another quote:
"As opposed to MLAA, the benefits of the new AF algorithm are easy to see. They won't be so clear in the process of actual gaming, yet a sharp-eyed gamer will surely spot the difference, especially as modern games offer a lot of scenes suitable for that.:

http://www.xbitlabs.com/articles/video/display/radeon-hd6870-hd6850_5.html#sect0
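
As a side note on the "angle of inclination" part of that quote, here is a rough sketch of the reference anisotropic filtering math, loosely following the old OpenGL EXT_texture_filter_anisotropic spec. It's a simplification, not either vendor's actual implementation, but it shows why the sample count depends on the footprint's aspect ratio rather than on the surface angle as such:

Code:
import math

def aniso_samples(px, py, max_aniso=16):
    # px, py: how far the pixel footprint stretches in texel space along
    # the screen x and y directions. The number of trilinear probes comes
    # from the footprint's aspect ratio; the mip level has the probe count
    # divided out, so steeply inclined surfaces keep their detail.
    p_max, p_min = max(px, py), min(px, py)
    n = min(math.ceil(p_max / max(p_min, 1e-6)), max_aniso)
    lod = math.log2(p_max / n)   # mip level used for each probe
    return n, lod

# A floor seen at a grazing angle: footprint 8 texels long, 1 texel wide.
print(aniso_samples(8.0, 1.0))   # (8, 0.0) -> 8 probes at the base mip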
 
Coming from someone who's only owned NVidia cards: who cares? If it really takes theoretical test patterns to make the issue visible, then clearly the optimizations have done their job. Like with WAV -> 320kbps MP3, it's all about keeping 99% of the quality so that the change is nearly unnoticeable. If you didn't notice it in a game, or if you couldn't pinpoint the cheater in a double blind test, then it's not worth talking about. This is the whole audio cables debate again.
 
WAV is quite a bit better than 320kbps MP3s; you just need a high-end setup to notice it.
 
So, basically, they pulled those numbers out of their butts? Regardless, I'm supposed to take their word with no charts or anything, because they're "experienced"? I have no experience with any of these foreign sites, so I'll have to take your word on their validity. I do, however, expect comparisons and evidence when someone makes such a bold claim. That said, one of the sites I do understand (and which, BTW, enables HQ in all their AMD reviews), Xbitlabs, reached this conclusion on the issue:



Here's another quote:
"Anisotropic filtering doesn’t have a serious effect on the performance of modern GPUs, so algorithms can be used whose filtering quality doesn't depend on the angle of inclination of the plane. AMD and Nvidia have both transitioned to high-quality anisotropic filtering already and the Radeon HD 6800 just improves the algorithm further to smooth out transitions between mipmaps so that they would be less conspicuous in textures with lots of small details."


http://www.xbitlabs.com/articles/video/display/radeon-hd6870-hd6850_5.html#sect0

It's not like there isn't a performance impact when going from High Quality to Quality, or vice versa, so this quote means little. I don't need a quote to tell me that AMD's High Quality impacts performance more than Quality does, and that they should add a Very High Quality option so the choices are High Quality and Very High Quality, to match their competitor.
 
It's not like there isn't a performance impact when going from High Quality to Quality, or vice versa, so this quote means little. I don't need a quote to tell me that AMD's High Quality impacts performance more than Quality does, and that they should add a Very High Quality option so the choices are High Quality and Very High Quality, to match their competitor.

But this is about AF, which has nearly no impact on performance. Am I missing something here? Because none of the arguments make any sense to me. Nvidia fans are all complaining that this gives AMD an advantage performance-wise. Yet the reviews here have always used the highest settings for AF, which means it's the same for both sides.

It's been a few years since AF has had any real impact on the performance of cards. Heck, even budget cards can all run 16xAF in games now.
 
But this is about AF, which has nearly no impact on performance. Am I missing something here? Because none of the arguments make any sense to me. Nvidia fans are all complaining that this gives AMD an advantage performance-wise. Yet the reviews here have always used the highest settings for AF, which means it's the same for both sides.

It's been a few years since AF has had any real impact on the performance of cards. Heck, even budget cards can all run 16xAF in games now.

It still has a performance impact though. In extreme cases it could be something like this: http://www.gpu-tech.org/content.php/...f-optimization

On average it's about 5 to 6 percent, while still having the inferior AF quality. In the midrange/performance segment this is quite a lot, as the battle was already close against the direct competition (460, 460 OC and 470). So if you want to do apples to apples you have to bench ATI High Quality vs. Nvidia Quality, which is what the major sites in Germany will do from now on until either AMD improves AF quality or Nvidia lowers it. I hope the latter does not happen, but that AMD is instead forced to improve substantially here in real gaming cases and not just on marketing PowerPoint slides.
 
But this is about AF, which has nearly no impact on performance. Am I missing something here? Because none of the arguments make any sense to me. Nvidia fans are all complaining that this gives AMD an advantage performance-wise. Yet the reviews here have always used the highest settings for AF, which means it's the same for both sides.

It's been a few years since AF has had any real impact on the performance of cards. Heck, even budget cards can all run 16xAF in games now.

Piggybacking on what ^^^ he said: when AMD's highest (HQ) quality setting impacts your performance about as much as Nvidia's Quality setting, and a review uses High Quality for Nvidia, it puts Nvidia's GPUs at a performance disadvantage. AMD needs to get their stuff up to par, as it seems like every generation gets worse at IQ than the last. They are moving in the opposite direction on IQ and the right direction on performance, whereas Nvidia is moving in the right direction for both IQ and performance.

That is the point you keep missing.
 
My god they did degrade AF!!! Because Prime1, Goldentiger, and Dualown told me so!
 
My god they did degrade AF!!! Because Prime1, Goldentiger, and Dualown told me so!

I don't listen to trolls or fanboys, but when a few review sites all say the same thing and I see video and picture evidence, the lack of fanboyism in me allows me to be receptive to the truth. Hopefully you're not deluded enough to sweep this under the rug too ;)
 
It still has a performance impact though. In extreme cases it could be something like this: http://www.gpu-tech.org/content.php/...f-optimization

On average it's about 5 to 6 percent, while still having the inferior AF quality. In the midrange/performance segment this is quite a lot, as the battle was already close against the direct competition (460, 460 OC and 470). So if you want to do apples to apples you have to bench ATI High Quality vs. Nvidia Quality, which is what the major sites in Germany will do from now on until either AMD improves AF quality or Nvidia lowers it. I hope the latter does not happen, but that AMD is instead forced to improve substantially here in real gaming cases and not just on marketing PowerPoint slides.

Between Quality and High Quality there is barely a difference in performance. Where did you get this average 5 or 6 percent figure from? Because I just ran a few quick checks, and the difference between Quality AF and High Quality AF was less than 1 FPS.

There are also complaints here that the AF is getting worse with every generation of card from AMD. Well, I posted a link to the HardOCP review of the 5870, where they state that the AF is much better than in previous generations. They actually state, "I would say ATI has succeeded in providing the absolute best filtering quality in a gaming graphics card."

I would be interested to hear what [H]ard have to say about this, rather than reading some German sites and a site that only started in March.

Personally I think this is a lot of fuss over nothing.
 
Personally I think this is a lot of fuss over nothing.

You may be right. My own tests show that there is a difference in performance depending on the game or app; it could be as much as 5% in some games. I just think this is worth investigating, and the AMD-loyal support that showed up so quickly to sweep this under the rug only piqued my curiosity even more.

I would love HardOCP to check on this, if only to debunk this stuff, or to bring it to the surface if it is happening, so that neither company gets the idea that they can pull a fast one on us hardcore gamers. We're too l337 for that shit. ;)

Just the idea that either company would have the audacity to pull some shit like this makes me grind my teeth :mad:
 
Between Quality and High Quality there is barely a difference in performance. Where did you get this average 5 or 6 percent figure from? Because I just ran a few quick checks, and the difference between Quality AF and High Quality AF was less than 1 FPS.

There are also complaints here that the AF is getting worse with every generation of card from AMD. Well, I posted a link to the HardOCP review of the 5870, where they state that the AF is much better than in previous generations. They actually state, "I would say ATI has succeeded in providing the absolute best filtering quality in a gaming graphics card."

I would be interested to hear what [H]ard have to say about this, rather than reading some German sites and a site that only started in March.

Personally I think this is a lot of fuss over nothing.

^^ This
 
I don't listen to trolls or fanboys, but when a few review sites all say the same thing and I see video and picture evidence, the lack of fanboyism in me allows me to be receptive to the truth. Hopefully you're not deluded enough to sweep this under the rug too ;)

IMO this is just people nitpicking and trying to put down AMD's new video cards any way they can.

Both companies use different techniques to make the image look the same with no performance loss.

I think it's just the diehard people who want to nitpick about 1% performance because some AF doesn't look right, blah blah.

Bunch of malarkey.
 
I have an HD 6870 and, coming from an 8800GT, I've noticed improvements in picture quality. Can I put a percentage on that? No. Do I know if it is worse than the HD 5800 series? No.

Does it matter? Not to me and my situation.
 
IMO this is just people nitpicking and trying to put down AMD's new video cards any way they can.

Both companies use different techniques to make the image look the same with no performance loss.

I think it's just the diehard people who want to nitpick about 1% performance because some AF doesn't look right, blah blah.

Bunch of malarkey.

I hate to admit it... but I'm starting to agree with this, particularly after reading a post where the TC believes the 68xx series essentially sucks, and that the rumored 560 and 570 will be Nvidia's "trump" card.

I don't listen to trolls or fanboys, but when a few review sites all say the same thing and I see video and picture evidence, the lack of fanboyism in me allows me to be receptive to the truth

Which review sites? All the ones that have been posted in English have stated the same thing: You can barely (if at all) tell the difference in AF quality during game play. The only possible issue is the shimmering........BTW, how do we discern who's a fanboy here?

Even the mod over at AnandTech stated this in regard to the claim that Nvidia's standard quality is higher than AMD's:
"No it’s not. nVidia’s standard quality mode has visible mip transitions and also filters less angles than ATi. ATi OTOH has more texture aliasing with their stock settings.

The fact is, it’s not possible to get equal AF with the two IHVs, and it’s never been possible given they’ve always implemented AF differently to each other. All you can do is run the highest possible quality levels for both and then comment on the observable differences."
 
Which review sites? All the ones that have been posted in English have stated the same thing: You can barely (if at all) tell the difference in AF quality during game play. The only possible issue is the shimmering

Just because a review isn't printed in English doesn't make it less credible. Other countries can analyze technology as well as we can; in fact, some can do it better, like China, for example, where most of this stuff is engineered. Google Translate is there for just those occasions.

........BTW, how do we discern who's a fanboy here?

Fanboys are not hard to find if you frequent both the Nvidia and AMD threads as much as I do. There are those who have a slight slant toward one company, which is understandable. Then there are others, who I'd rather not single out and name, who turn every thread into a fucking stock analysis comparing two companies that are losing money, to prove that their brand has a bigger schlong than the other, and who jump at every opportunity to defend theirs and flame the other. Then there are the obvious trolls, who we all know have posted in this thread earlier.

If you frequent both threads, which I'm sure you do because I see you around, then you know the people I'm talking about. If you're curious, I do not consider you a fanboy; personally, I consider you to have a slant toward AMD much like I have a slant toward Nvidia. I think bang/buck would turn either of us over to the other company, as our true loyalties are to our wallets.
 
Let me make this simple: AMD currently has a better product than Nvidia in just about every meaningful way. You can use benchmark X or test Y to try to justify buying an Nvidia product, but in the end, when it comes to gaming, which is why we all bought these cards in the first place, AMD wins.
 
Let me make this simple: AMD currently has a better product than Nvidia in just about every meaningful way. You can use benchmark X or test Y to try to justify buying an Nvidia product, but in the end, when it comes to gaming, which is why we all bought these cards in the first place, AMD wins.

:D Wannabe Troll
 
Let me make this simple: AMD currently has a better product than Nvidia in just about every meaningful way. You can use benchmark X or test Y to try to justify buying an Nvidia product, but in the end, when it comes to gaming, which is why we all bought these cards in the first place, AMD wins.

A short while ago the NV fans could kid themselves into thinking Fermi had "superior SLI scaling"; that has now been proven false. And if the 6800s aren't hurting enough, the 6900s are coming within a few weeks.
The NV boys are getting desperate and grasping at straws; spare some forgiveness for them :(
 
A short while ago the NV fans could kid themselves into thinking Fermi had "superior SLI scaling"; that has now been proven false. And if the 6800s aren't hurting enough, the 6900s are coming within a few weeks.
The NV boys are getting desperate and grasping at straws; spare some forgiveness for them :(

mafa = real tool err troll, or wait a minute maybe both. :D
 
Let me make this simple: AMD currently has a better product than Nvidia in just about every meaningful way. You can use benchmark X or test Y to try to justify buying an Nvidia product, but in the end, when it comes to gaming, which is why we all bought these cards in the first place, AMD wins.

Speak for yourself. I see a whole sub-forum of people soaking up Nvidia cards because AMD can't do what they want in their area of interest, which involves very little, if any, gaming.
 
Glad to see my prediction that this would devolve into a fruitless troll topic was met with success... (sigh).
 
Glad to see my prediction that this would devolve into a fruitless troll topic was met with success... (sigh).
I am as unsurprised as you. The timing of these stories was extremely suspect, and their content was not very original, still beating that "ATI has bad drivers" drum, yet a bunch of people in this thread who stand staunchly behind Nvidia can confirm that there is a problem. Meanwhile...
These are some shots from AoC using the HD 6850's default quality settings. 4xAA|16xAF was set in game. I chose that title because I've also played it on GTX 470 SLi and HD 5870 Crossfire. I saw no differences in IQ running about between any of the cards I've named.

I suspect if something is truly up Kyle and [H]ard will let us know.
...Actual owners of the cards aren't seeing these problems in games, or aren't running the obscure benchmarks required to reveal them.

This thread has done its job, which is to get a negative search result on Google about AMD drivers, so it's a victory for Nvidia's FUD machine.
 

Damn the servers in AOC have truly become empty :D (kidding)
 
Damn the servers in AOC have truly become empty :D (kidding)

Those shots are from early morning and you're right too. FC have killed their game.

BTW if anyone is wondering why I am messing with a 6850, I am tuning that card in my old Ph II 940 box before I pass it on to my nephew. My main gaming rig is in my sig (Intel i7 920/GTX 470 SLi), and my alt comp has an OCed 1055T with HD 5870 Crossfire.
 
I'm withholding my final opinion until I see this investigated deeper.

http://blogs.nvidia.com/ntersect/2010/11/testing-nvidia-vs-amd-image-quality.html

Interesting quotes

NVIDIA’s own driver team has verified specific behaviors in AMD’s drivers that tend to affect certain anisotropic testing tools. Specifically, AMD drivers appear to disable texture filtering optimizations when smaller window sizes are detected, like the AF Tester tool uses, and they enable their optimizations for larger window sizes. The definition of “larger” and “smaller” varies depending on the API and hardware used. For example with DX10 and 68xx boards, it seems they disable optimizations with window sizes smaller than 500 pixels on a side. For DX9 apps like the AF Tester, the limit is higher, on the order of 1000 pixels per side. Our driver team also noticed that the optimizations are more aggressive on RV840/940 than RV870, with optimizations performed across a larger range of LODs for the RV840/940.
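
Just to make that first claim concrete, this is roughly what the alleged behavior would amount to. To be clear, this is purely a hypothetical illustration of what the blog describes; nobody outside AMD has seen the driver code, and the 500/1000-pixel thresholds are simply the figures quoted above:

Code:
# Hypothetical illustration only: not actual driver code, and the
# thresholds below are just the numbers quoted from the blog post.
def af_optimizations_enabled(api, width, height):
    threshold = 500 if api == "DX10" else 1000   # DX9 tools like AF Tester
    # Small windows (typical of filter-testing tools) allegedly get the
    # optimizations switched off, so the tool shows better filtering than
    # what games at normal resolutions would get.
    return min(width, height) >= threshold

print(af_optimizations_enabled("DX9", 512, 512))     # False -> full quality shown to the tool
print(af_optimizations_enabled("DX10", 1920, 1080))  # True  -> optimized path in games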


In addition to the above recent findings, for months AMD had been performing a background optimization for certain DX9 applications where FP16 render targets are demoted to R11G11B10 render targets, which are half the size and less accurate. When recently exposed publically, AMD finally provided a user visible control panel setting to enable/disable, but the demotion is enabled by default. Reviewers and users testing DX9 applications such as Need for Speed Shift or Dawn of War 2, should uncheck the “Enable Surface Format Optimization” checkbox in the Catalyst AI settings area of the AMD control panel to turn off FP16 demotion when conducting comparative performance testing.
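
For reference on the FP16 demotion part, the gap between those two render target formats is easy to put numbers on. These are the standard DirectX format definitions, nothing vendor-specific, and the 1080p surface is just an example:

Code:
# Per-pixel cost of the two render target formats mentioned above
# (standard DirectX formats, nothing vendor-specific).
FP16_RGBA_BPP = 64   # 4 x half-float: 1 sign, 5 exponent, 10 mantissa bits each
R11G11B10_BPP = 32   # no sign bit, 5 exponent, 6/6/5 mantissa bits, no alpha

pixels_1080p = 1920 * 1080
print(pixels_1080p * FP16_RGBA_BPP // 8 // 1024, "KiB")  # 16200 KiB per surface
print(pixels_1080p * R11G11B10_BPP // 8 // 1024, "KiB")  # 8100 KiB, half the size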



For those with long memories, NVIDIA learned some hard lessons with some GeForce FX and 3DMark03 optimization gone bad, and vowed to never again perform any optimizations that could compromise image quality. During that time, the industry agreed that any optimization that improved performance, but did not alter IQ, was in fact a valid “optimization”, and any optimization that improved performance but lowered IQ, without letting the user know, was a “cheat”. Special-casing of testing tools should also be considered a “cheat”.
 
I'm withholding my final opinion until I see this investigated deeper.

Wise choice. :)
I wouldn't put too much weight on the blog of an Nvidia marketing director putting a spin on things he found on the internet. Nvidia is nowhere near perfect in their filtering or AA methods either:

So what? If they put NV under the magnifying glass, you would be surprised to find they aren't any better, if not even more backstabbing.

Like this transparency (Tr) MSAA: instead of traditional multisampling on older GPUs (prior to GeForce 400), it's now some kind of halfway TrMSAA with a worse image. It started with the CUDA 3.x.x / 256+ drivers; when the driver detects a non-Fermi GPU it switches to this uglier technique, but Fermi uses normal multisampling. If you test older 190+ drivers with CUDA 2.x.x, they use normal transparency multisampling AA.


Normal TrMSAA is the first mode, and that's the one that's disabled; with CUDA 3.x.x it's now the second one, ALL_MODE_REPLAY_MODE_ALPHA_TEST, which has a worse image. Normal Tr supersampling is untouched.


But you wouldn't know about it if you didn't check with NV Inspector, and NV was all quiet about it. All they claimed was that 256+ would be faster (I'm talking about the whole driver in general), and that was it.
http://forums.guru3d.com/showpost.php?p=3779426&postcount=2

I bought a GT 430 for my wife's computer (because I want a dual boot with Linux there, and I prefer Nvidia when it comes to Linux). I will test its filtering a bit against the 5000 series as a continuation of my investigation here:
http://forum.beyond3d.com/showthread.php?t=57870
 
Wise choice. :)
I wouldn't put too much weight on the blog of an Nvidia marketing director putting a spin on things he found on the internet. Nvidia is nowhere near perfect in their filtering or AA methods either:

Not perfect, but for sure better than AMD's. And if AMD does not improve AF quality on Cayman, they will really get bad press, because it is completely unacceptable for products in this price range.
 
I'm of the mindset that as long as HQ is still available, then who cares? Lossy optimizations are usually done to get some kind of performance benefit, and that's fine as long as it's optional for users. If you want better AF, then set the quality slider higher and stop bitching :)

How can someone think like this? You *do* realize that this invalidates benchmarks going forward if IQ is not roughly equivalent, right? This is not about "omg, there's no way to make it look good on AMD." It's about their intentions and the outcome with the default settings, which is where 99% of sites benchmark and then tell you what to buy based on the results. Unless you can think of another reason it would be like this?
 
The post below shows Nvidia is hardly perfect or innocent either.

http://hardforum.com/showpost.php?p=1036459341&postcount=117

AMD claims to be the best out there in video playback, and there are a few review sites that seem to support that claim. A heavy multimedia user who is a medium-to-light gamer might be best served by something like an HD 6850.

Thread on it @ Anand with links.

http://forums.anandtech.com/showthread.php?t=2121487

Bottom line though: if people aren't satisfied with the performance or IQ of their 6800 card in gaming, they should return it. Starting with the 8800 GTX, I do feel like Nvidia has had the edge with AF when all optimizations are defeated. On the other hand, AMD (ATi) has often been regarded as having better general AA.
 