AMD degraded AF quality with 68xx launch!

First, if the 68xx replaced the 57xx, they would have called it the 67xx. Why did NVIDIA catch a ton of shit for their naming schemes while AMD gets a pat on the back?



Actually only people who care about image quality and getting what they paid for are concerned about this issue.

Stop freaking trolling, seriously...

Besides, AMD has already said their target is the mainstream segment. It's not like they're trying to fool anyone, holy moly...

This thread is fucking about AF, not your childish trolling about how naming schemes work.

PS: The mods really need to ban this kid; in every thread about ATI/AMD, he's there to talk shit...
 
Guys, you are feeding the troll.
The guy has been banned under the name of wreckage on other sites.
He is probably not a kid. Rather a shill if I may guess.
He wants you to respond so he can derail the thread.
 
Guys, you are feeding the troll.
The guy has been banned under the name of wreckage on other sites.
He is probably not a kid. Rather a shill if I may guess.
He wants you to respond so he can derail the thread.

Not saying it's not true but how do you know he is wreckage?
 
Not saying it's not true but how do you know he is wreckage?

I think many know, for several reasons. He's used the same homepage in his sig before, both on AT and on [H], for his Digital Wreckage projects. On AT, for a while he posted links to "cool stuff" using his Amazon handle prime1 for clicks (check his post history), etc.
Prime1 has never tried to hide or be deceitful about being wreckage. :)

I find his posts amusing though. :D
 
I find his posts amusing though. :D

They're fun in a "what will he come up with next" way. The fishing for arguments to derail the thread I find slightly annoying.

For some reason I get more irritated at the people who fall into the trap easily or repeatedly.
 
For some reason I get more irritated at the people who fall into the trap easily or repeatedly.

I realize his purpose; I just enjoy exploiting his flawed logic, particularly when it becomes unbearable. I knew he was going to respond to this topic at some point. It was inevitable.
 
I think image quality has gotten to the point where people simply don't worry as much about pixel-peeping on filtering; they just want to enable more features at high FPS. Sure, (maybe) AMD duped everyone here, but if they can get extra speed with pretty much no one noticing so far, props to them.

I think you will also find a lot of sites won't address this because it's minor. Also, no one wants to be like 'oops, sorry about that 27-page 6850 review we did, we never noticed this AF issue' :p

The real concern here is that the 'final results' review wrap-ups showing the new AMD cards beating Nvidia may have looked a lot different if AMD did not have the ~5% extra performance from gimping visual quality.
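To put that in perspective, here is a back-of-the-envelope sketch with made-up frame rates (the real deltas vary per game and per review); it only illustrates how a roughly 5% driver-side gain can flip a close verdict:

```python
# Hypothetical numbers purely for illustration; not taken from any actual review.
amd_fps_as_reviewed = 63.0        # measured at the default (reduced-quality) driver setting
nvidia_fps = 61.0                 # competing card at its default setting

# Roughly what the AMD card would score without a ~5% filtering optimization.
amd_fps_comparable_quality = amd_fps_as_reviewed / 1.05

print(f"As reviewed:           AMD {amd_fps_as_reviewed:.1f} fps vs NVIDIA {nvidia_fps:.1f} fps")
print(f"At comparable quality: AMD {amd_fps_comparable_quality:.1f} fps vs NVIDIA {nvidia_fps:.1f} fps")
# As reviewed the AMD card 'wins'; at comparable quality the ordering flips.
```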
 
First, if the 68xx replaced the 57xx, they would have called it the 67xx. Why did NVIDIA catch a ton of shit for their naming schemes while AMD gets a pat on the back?



Actually only people who care about image quality and getting what they paid for are concerned about this issue.


Maybe because the 68xx is not a replacement for the 57xx; it's in between the 57xx and the 69xx.

The 5700 series will still sell at its current price, and the 6800 will fill the gap left by the 5830.
 
And the usual army of Nvidia fanboys finally makes a return. The reviewer's guide specifically spelled out the changes made to the control panel settings and the difference in results between the HD 5xxx and HD 6xxx series at default settings. This article reeks of Nvidia PR spin.

Edit: No wonder they are making a stink about this. Nvidia wants websites to test their shitty "Stop sign" filtering:

[Image: GF104 AF test pattern (gf104-af3.png)]


Against AMD's arguably superior filtering:
[Image: Barts AF test pattern (barts-af3.png)]
AMD isn't even doing anything that resembles trilinear filtering these days (the lower half, the ALU AF reference, is how it's supposed to look):
[Image: 3DCenter AF comparison (3dcenteraf.png)]
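For anyone wondering what "doesn't resemble trilinear" means in those tester shots: true trilinear filtering blends the two nearest mipmap levels continuously, while the common driver optimization (often called "brilinear") only blends in a narrow band around each mip transition and falls back to plain bilinear elsewhere, which shows up as visible bands in the filter tester and as shimmering in motion. A minimal sketch of the idea, purely illustrative and not any vendor's actual hardware path (the band width and mip values here are made up):

```python
# Illustrative comparison of true trilinear filtering vs a "brilinear"-style
# optimization. This is a generic sketch, not AMD's or NVIDIA's real filtering.

def true_trilinear(sample_mip, lod):
    """Blend the two nearest mip levels by the fractional LOD."""
    base = int(lod)                 # more detailed mip level
    frac = lod - base               # how far we are toward the next, blurrier mip
    return (1.0 - frac) * sample_mip(base) + frac * sample_mip(base + 1)

def brilinear(sample_mip, lod, band=0.2):
    """Blend only inside a narrow band around the mip transition;
    elsewhere use a single (bilinear-only) mip level."""
    base = int(lod)
    frac = lod - base
    lo_edge, hi_edge = 0.5 - band / 2, 0.5 + band / 2
    if frac < lo_edge:
        return sample_mip(base)          # cheaper single-mip sample
    if frac > hi_edge:
        return sample_mip(base + 1)
    weight = (frac - lo_edge) / band     # remap the narrow band to a 0..1 blend weight
    return (1.0 - weight) * sample_mip(base) + weight * sample_mip(base + 1)

if __name__ == "__main__":
    # Pretend each mip level is a flat grey value equal to its index; the optimized
    # path produces plateaus and abrupt steps, which read as banding/shimmer in motion.
    mip_value = lambda level: float(level)
    for lod in (1.0, 1.2, 1.4, 1.5, 1.6, 1.8, 2.0):
        print(f"lod={lod:.1f}  trilinear={true_trilinear(mip_value, lod):.2f}  "
              f"brilinear={brilinear(mip_value, lod):.2f}")
```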
 
First, if the 68xx replaced the 57xx, they would have called it the 67xx. Why did NVIDIA catch a ton of shit for their naming schemes while AMD gets a pat on the back?

Two simple words - new chips. Say what you will about the 6870, it's not a rebranded 5770 or something similar. Period.
 
And here's another German site with a short article on the matter.

Btw, 3DCenter is one of the most trustworthy sites. They don't receive any review hardware from the IHVs because they frequently point the finger at the existing problems of GPU generations, no matter whether it's Nvidia or AMD.

http://www.3dcenter.org/artikel/amd...00-mit-rueckschritten-bei-der-filterqualitaet

http://www.3dcenter.org/artikel/amd...n-bei-der-filterqualitaet/amds-radeon-hd-5800

The article is in German, but look at the image quality figures. Barts-based products are even worse than Cypress and Juniper ones. The banding bug has not been fixed completely, but it has at least improved. However, shimmering is a huge mess.
 
I'm of the mindset that as long as HQ is still available, who cares? Lossy optimisations are usually done to get some kind of performance benefit, and that's fine as long as it's an option for users. If you want better AF, set the quality slider higher and stop bitching :)
 
I'm of the mindset that as long as HQ is still available, who cares? Lossy optimisations are usually done to get some kind of performance benefit, and that's fine as long as it's an option for users. If you want better AF, set the quality slider higher and stop bitching :)

Well, the reviewer should care, and the consumer as well, because the products are pretending to a level of performance that is not there. You cannot compare Q vs. Q between the two anymore. In the end, if you compare the GTX 460 and GTX 470, taking current pricing into account, there is actually no benefit in buying a Barts-based product, with the exception of multi-monitor support. I don't think that is a major point for the majority of customers. In a lot of other respects the Nvidia products are currently the better alternative.
 
I'm of the mindset that as long as HQ is still available, who cares? Lossy optimisations are usually done to get some kind of performance benefit, and that's fine as long as it's an option for users. If you want better AF, set the quality slider higher and stop bitching :)

Well, the problem is that almost every review site uses the default driver quality setting. AMD cards then get benched at lower quality settings and look more favorable compared to Nvidia cards, since sites care about FPS. If Nvidia put out drivers that lowered quality by default just before a release, there would be a category 5 shitstorm, since there are double standards for the two companies.
 
Well, actually, power consumption is another reason to buy a Barts-based part. They also match on performance/price. You can save a couple bucks on a 460 over a 6850, but you pay more out of pocket up front and have to wait 8+ weeks to get your rebate money back.
 
Well, the problem is that almost every review site uses the default driver quality setting. AMD cards then get benched at lower quality settings and look more favorable compared to Nvidia cards, since sites care about FPS. If Nvidia put out drivers that lowered quality by default just before a release, there would be a category 5 shitstorm, since there are double standards for the two companies.

Spot on... it would be humorous except it's true :(.
 
I'm of the mindset that as long as HQ is still available, who cares? Lossy optimisations are usually done to get some kind of performance benefit, and that's fine as long as it's an option for users. If you want better AF, set the quality slider higher and stop bitching :)

Check out the texture shimmering with HQ here, which doesn't happen when you use HQ or even just Q on a GTX 460:
http://www.tweakpc.de/hardware/tests/grafikkarten/amd_radeon_hd_6870_hd_6850/s09.php
 
Well, the problem is that almost every review site uses the default driver quality setting. AMD cards then get benched at lower quality settings and look more favorable compared to Nvidia cards, since sites care about FPS. If Nvidia put out drivers that lowered quality by default just before a release, there would be a category 5 shitstorm, since there are double standards for the two companies.

Spot on... it would be humorous except it's true :(.

LOL, are you guys serious? Look at the reviews done here: they review at the highest playable settings on every card and then compare them at the same settings. So where are the lower-quality settings you're talking about?

What about this: HardOCP themselves say the AF is spot on in the 5870.

http://www.hardocp.com/article/2009/09/22/amds_ati_radeon_hd_5870_video_card_review/6

And is there anyone here who uses AF at anything other than the highest setting? And it has nearly zero impact on performance.
 
LOL, are you guys serious? Look at the reviews done here: they review at the highest playable settings on every card and then compare them at the same settings. So where are the lower-quality settings you're talking about?

What about this: HardOCP themselves say the AF is spot on in the 5870.

http://www.hardocp.com/article/2009/09/22/amds_ati_radeon_hd_5870_video_card_review/6

And is there anyone here who uses AF at anything other than the highest setting? And it has nearly zero impact on performance.

Almost every reviewer, including [H], benches at default driver settings (which is never HQ, but Q). Also, we're talking about the AF on the 6xxx series, not the 5xxx. Did you even read any of the posts in this thread before flying off the handle?

How can you own a video card and not even notice that every time you do a true clean install your settings are put back to Q?
 
LOL, are you guys serious? Look at the reviews done here: they review at the highest playable settings on every card and then compare them at the same settings. So where are the lower-quality settings you're talking about?

What about this: HardOCP themselves say the AF is spot on in the 5870.

http://www.hardocp.com/article/2009/09/22/amds_ati_radeon_hd_5870_video_card_review/6

And is there anyone here who uses AF at anything other than the highest setting? And it has nearly zero impact on performance.

Good job referencing an article from last year and missing the point. The settings people are talking about are the driver-controlled AF settings, not in-game settings. AMD's default AF quality has gone *down* since the article you mentioned, and it does have an impact on performance; otherwise AMD would default to HQ.
 

OK, so none of this is FUD :( It's time this was looked into further. I call upon Kyle/Brent/Mark to shine some light on this issue. One US reviewer has to step up and take a no-bullshit approach to it. How much performance is AMD gaining by conveniently using such a poor algorithm for aniso? It looks like shit, honestly, and I'm sensitive to texture shimmer. I noticed this on my wife's setup but thought it was because she had a low-end 5000 series card. Looks like it's worse on the 6000 series in some instances.
 
OK, so none of this is FUD :( It's time this was looked into further. I call upon Kyle/Brent/Mark to shine some light on this issue. One US reviewer has to step up and take a no-bullshit approach to it. How much performance is AMD gaining by conveniently using such a poor algorithm for aniso? It looks like shit, honestly, and I'm sensitive to texture shimmer. I noticed this on my wife's setup but thought it was because she had a low-end 5000 series card. Looks like it's worse on the 6000 series in some instances.

Well, according to those links, the AF issue is hard to discern/barely noticeable... The only thing slightly noticeable is the texture shimmering.
 
Well, according to those links, the AF issue is hard to discern/barely noticeable... The only thing slightly noticeable is the texture shimmering.

They mention it's there and also clearly state that it's more visible in motion, especially the shimmering, which I'm more sensitive to. I just PM'd Kyle, Brent, and Mark Warner; let's see if this gets the attention it deserves, since many AMD loyalists want this thread to die.
 
They mention it's there and also clearly state that it's more visible in motion, especially the shimmering, which I'm more sensitive to. I just PM'd Kyle, Brent, and Mark Warner; let's see if this gets the attention it deserves, since many AMD loyalists want this thread to die.

Well... I prefer that topics like these die because they go nowhere. Things like AF vary from title to title, and most users won't be able to discern the difference without significant testing. Most of these topics devolve into fruitless bashing, attacks, and generally biased behavior.

But, meh... Perhaps you'll get a response, though I doubt it, since, from what I gather, most people don't care enough.
 
Well... I prefer that topics like these die because they go nowhere. Things like AF vary from title to title, and most users won't be able to discern the difference without significant testing. Most of these topics devolve into fruitless bashing, attacks, and generally biased behavior.

But, meh... Perhaps you'll get a response, though I doubt it, since, from what I gather, most people don't care enough.

which makes it odd that you would bring this up in a topic containing almost none of what you are worried about :p
 
which makes it odd that you would bring this up in a topic containing almost none of what you are worried about :p

Really?

Go troll some other place, because if the bigger filtering difference between the 5870 and the 480 wasn't perceived until the very specific case of Trackmania's track textures, then you are bordering on "audiophile praising the danceable $7k audio cable" level.

http://www.anandtech.com/show/3987/...enewing-competition-in-the-midrange-market/5#

They are slipping further and further towards being a bargain-basement chip company. Launching a new series that is slower than the old one and skimping on image quality are just a few recent examples. It's a shame really. :mad:

You even stated:
If Nvidia put out drivers that lowered quality by default just before a release, there would be a category 5 shitstorm, since there are double standards for the two companies.

You were saying?
 
Really?

You were saying?

prime1 posts that stuff all over the place; if you wanted to kill every topic he posts in, the forum might look pretty empty.
 
Just so I am not being quoted out of context:

The one who was trolling was the person trying to say that Nvidia's IQ was lower based on a single static test where it seemed to fail vs. the 6000s.

Now that there are vids, I admit I find it at the very least amusing that it's ATI/AMD who are lacking IQ in actual in-game movement, especially after that troll's post about Nvidia's "traffic light" pattern :-P :-P
 