AMD Definitely Degraded IQ on 6000 & now on 5000 with Cat 10.11

http://www.guru3d.com/article/exploring-ati-image-quality-optimizations/

Just a few days after the last thread was closed, Guru3D looked into the matter and reported. They hesitated to call it a cheat, but they have nonetheless concluded that AMD is indeed degrading IQ to increase performance.

So if it's not a cheat, I ask: what should Nvidia, as their competition, do?

a) Add the same optimizations and degrade IQ as a default setting, because it's hardly noticeable.
b) Continue their campaign for public awareness so that AMD stops and plays nice.
c) Suck it up, keep the high road, and let their competition optimize.

What should AMD do?

IMO

They should make these optimizations optional and let reviewers know not to use them in reviews. That would be the ethical way to handle this. I don't think they harm the experience that badly, but in today's games these optimizations are not welcome, at least not on my gaming rig.

I will ask the mods to erase any posts that are not respectful and to remove any flaming that occurs from either side of the discussion.
 
In all fairness, they chose a game that doesn't natively support AA to show the difference...? And if they are comparing the performance hit on a 6850 to a 580... well, we know why. A 580 is far superior to a 6850, fanboyism aside.

AMD shouldn't be pulling any fast ones either way.
 
Are there any screenshots where the optimization is actually visible?
They posted an ME1 screenshot and said, "Yeah, you can't see it." Then what's the point of using that game?
 
Didn't Nvidia do the same thing back in the day with the FX 5200, 5500, and 5800 series? They were getting their asses kicked by the 9500 and 9800, so they turned down IQ to increase their cards' performance (eventually scrapping the series and bringing out the 5900/5950).
 
Didn't Nvidia do the same thing back in the day with the FX 5200, 5500, and 5800 series? They were getting their asses kicked by the 9500 and 9800, so they turned down IQ to increase their cards' performance (eventually scrapping the series and bringing out the 5900/5950).

Yes, they did. Supposedly they vowed not to do it anymore after that. They absolutely cheated then; I hated them at the time because their cards were terrible compared to AMD's. AMD also cheated in Quake 3 with the whole "Quack" fiasco and got a lot of grief for it. The main thing is that, these days, putting this stuff in without saying anything is a bit on the unethical side, and doing it in a way that makes it into most review comparisons is the real issue. You can barely notice the optimizations, and up to a 10% boost is good from a performance perspective. So: an optimization you can barely notice, and up to a 10% boost. I think they should keep the optimizations but leave them optional, and reviewers could disable them until Nvidia does something similar so future comparisons are fair. Not all of this is bad; it's the way they did it that I (and many others) have, or should have, a problem with.
 
Guru3D said:
The above screenshot is courtesy of 3DCenter.org and their excellent article; it shows the complexity and the optimizations really well. We seriously had a hard time finding an application where the optimizations show well. So mind you, the above example is catered to show the image quality anomaly.
[...]
So the question here is: is it considered a cheat to apply the optimizations, or is this something you can live with?
[...]
But let's take a more real-world example then: please download all three images and load them up in, say, Photoshop. Make sure you zoom in to 100% on all screenshots and then start to compare the three images (each a lossless 12 MB BMP file).

We have as hard a time spotting differences as you do, and while making the screenshots we increased gamma settings to 50% and applied a resolution of 2560x1600 to try to make it more visible.

Do you spot the difference? Probably not. That is the rule we live by here at Guru3D: if you cannot see it without blowing up the image or altering gamma settings and whatnot, it's not a cheat.

Guru3D didn't think much of it, because you have to jump through a lot of hoops to see it.

Let's say my computer renders Mass Effect at 60 frames per second, and let's say I pull each frame into Photoshop, zoom in, and crank the gamma. At 60 fps, even half a minute of scrutiny per frame works out to about half an hour of pixel-peeping for every second of game time before these optimizations could bother me. Not much of a "cheat" if you ask me (or Guru3D).

If a tree falls in a forest...
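For anyone who would rather not do the Photoshop routine by hand, the comparison Guru3D describes can be scripted. This is only a minimal sketch, assuming Python with Pillow installed; the filenames are placeholders for two of the downloaded BMPs, not names from the article:

```python
from PIL import Image, ImageChops, ImageStat

# Placeholder filenames: substitute two of the BMPs downloaded from the article.
# Both captures are assumed to be the same resolution (e.g. 2560x1600).
hq = Image.open("mass_effect_high_quality.bmp").convert("RGB")
q = Image.open("mass_effect_quality.bmp").convert("RGB")

# Per-pixel absolute difference: pixels that match come out black.
diff = ImageChops.difference(hq, q)

# One number to argue about: the mean per-channel difference on a 0-255 scale.
print("mean per-channel difference:", ImageStat.Stat(diff).mean)

# Amplify the (tiny) differences 16x so they are visible without zooming to 100%
# and cranking the gamma, which is roughly what the Photoshop routine accomplishes.
diff.point(lambda v: min(255, v * 16)).save("diff_amplified.png")
```

Anything that isn't near-black in diff_amplified.png is a spot where the two settings actually rendered different pixels.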
 
Yes, they did. Supposedly they vowed not to do it anymore after that. They absolutely cheated then; I hated them at the time because their cards were terrible compared to AMD's. AMD also cheated in Quake 3 with the whole "Quack" fiasco and got a lot of grief for it. The main thing is that, these days, putting this stuff in without saying anything is a bit on the unethical side, and doing it in a way that makes it into most review comparisons is the real issue. You can barely notice the optimizations, and up to a 10% boost is good from a performance perspective. So: an optimization you can barely notice, and up to a 10% boost. I think they should keep the optimizations but leave them optional, and reviewers could disable them until Nvidia does something similar so future comparisons are fair. Not all of this is bad; it's the way they did it that I (and many others) have, or should have, a problem with.
The problem is the default setting.
"Quality" is default, "High Quality" eliminates the optimization. Though that doesn't mean High Quality is equivalent to the Nvidia default.

Their point is, both ATI and Nvidia should look the same at default.
Using that optimization feature in some form is fine, it's kind of nice I guess, but the problem is how it's interfering with their benchmarking at stock settings...
 
The problem is the default setting.
"Quality" is default, "High Quality" eliminates the optimization. Though that doesn't mean High Quality is equivalent to the Nvidia default.

Their point is, both ATI and Nvidia should look the same at default.
Using that optimization feature in some form is fine, it's kind of nice I guess, but the problem is how it's interfering with their benchmarking at stock settings...

Agreed. The beef Nvidia has seems to be that AMD did this without reviewers knowing or having a way around it. I think this would be a cool option, since you can barely notice it while still enjoying up to a 10% boost. They should make it optional rather than their standard, and perhaps Nvidia should do something similar.
 
The problem is the default setting.
"Quality" is default, "High Quality" eliminates the optimization. Though that doesn't mean High Quality is equivalent to the Nvidia default.

Their point is, both ATI and Nvidia should look the same at default.
Using that optimization feature in some form is fine, it's kind of nice I guess, but the problem is how it's interfering with their benchmarking at stock settings...

That was my understanding of the article as well. According to Guru3d:
Guru3D said:
So the morally right thing to do for AMD/ATI is to make the High Quality setting the standard default. But again, we have to acknowledge that it remains a hard-to-recognize and hard-to-detect series of optimizations; still, it is there and it can be detected.

Since Nvidia uses HQ as default and AMD is using Quality as default, AMD is cheating according to this.

The question is: is the default made for users or for reviewers? Did AMD make Quality the default because it gives the best balance between performance and picture quality, or because they believe reviewers are too stupid to know their way around drivers and will therefore use the defaults in apples-to-apples benchmarks?

All reviewers have to do is set HQ on both AMD and Nvidia (as one would in an apples-to-apples review), and this would be a non-issue. I would like to see a list of reviewers who don't already do this, since I wouldn't trust them with a review.

3DCenter's picture quality analysis of AF and AA at HQ settings is more interesting, IMO.
 
The problem is the default setting.
"Quality" is default, "High Quality" eliminates the optimization. Though that doesn't mean High Quality is equivalent to the Nvidia default.

Their point is, both ATI and Nvidia should look the same at default.
Using that optimization feature in some form is fine, it's kind of nice I guess, but the problem is how it's interfering with their benchmarking at stock settings...

I don't know. In my mind, Quality implies that there are optimizations being made - High Quality is the setting you would run if you want no optimizations. So as long as you can set High Quality and get no optimizations I'd say it's a lot of ado about nothing. Of course, making it an option is even better.

You could debate whether the proper default is Quality or High Quality, but again I don't see much of an issue with making Quality the default. Isn't the whole point of optimizations to improve performance by degrading things I can't see anyway? To my mind, this would only be a big deal if the optimizations were added to High Quality mode; since the real issue is just that AMD chooses Quality as the default while Nvidia chooses High Quality, big whoop.

If all you care about is that they both look the same at default you could easily make the counter-argument that Nvidia should be using Quality as their default instead of HQ.
 
I don't know. In my mind, Quality implies that there are optimizations being made - High Quality is the setting you would run if you want no optimizations. So as long as you can set High Quality and get no optimizations I'd say it's a lot of ado about nothing. Of course, making it an option is even better.

I agree with that type of thinking; the only thing I don't agree with is saying Quality is for optimizations. I would say Performance is for optimizations, and Quality and High Quality should both be optimization-free. IMO.

To be honest, I'd say there should be a checkbox that turns this on/off in Quality; it should always be on in Performance and never on in High Quality. The driver's default should be Quality (with the optimization off), and Nvidia's default should be Quality as well. Then, on [H]ard sites like this one, they should test at High Quality on both cards unless they are low-end cards.
 
I agree with that type of thinking; the only thing I don't agree with is saying Quality is for optimizations. I would say Performance is for optimizations, and Quality and High Quality should both be optimization-free. IMO.

To be honest, I'd say there should be a checkbox that turns this on/off in Quality; it should always be on in Performance and never on in High Quality. The driver's default should be Quality (with the optimization off), and Nvidia's default should be Quality as well. Then, on [H]ard sites like this one, they should test at High Quality on both cards unless they are low-end cards.

Default should be whatever the GFX card maker thinks is the best balance for the consumer, while HQ should be an all-optimizations-off setting, IMO. Reviewers should be able to turn on HQ and have every optimization off. If reviewers don't know their way around drivers well enough to do this, they shouldn't review cards for others.
 
Default should be whatever the GFX card maker thinks is the best balance for the consumer, while HQ should be an all-optimizations-off setting, IMO. Reviewers should be able to turn on HQ and have every optimization off. If reviewers don't know their way around drivers well enough to do this, they shouldn't review cards for others.
Yeah, but is ATI High Quality equivalent to... Nvidia default?
Now the benchmarkers have to go figure that out. Probably yes, but it's hard to tell.

Either way, all this graphical optimization stuff ("low" qualities) should be off by default. Both should be max, out of the box. It just makes sense.

If someone wants to turn down graphics to boost performance, then they can make their own decision to do that.
 
Default should be whatever the GFX card maker thinks is the best balance for the consumer, while HQ should be an all-optimizations-off setting, IMO. Reviewers should be able to turn on HQ and have every optimization off. If reviewers don't know their way around drivers well enough to do this, they shouldn't review cards for others.

This is where our opinions obviously differ. I don't think optimizations should be enabled in two of the three settings by default. It's going to be 2011 soon and the hardware can handle games at HQ and Quality no problem; the only reason to enable it by default in Quality is to inflate review results. At least that's what I think. I don't think it's a bad idea to make the optimizations available in Quality via a checkbox that reviewers and customers can disable if they don't want it on (reviewers should absolutely not leave it on), as that wouldn't be fair to both companies.
 
Yeah, but is ATI High Quality equivalent to... Nvidia default?
Now the benchmarkers have to go figure that out. Probably yes, but it's hard to tell.

Either way, all this graphical optimization stuff ("low" qualities) should be off by default. Both should be max, out of the box. It just makes sense.

If someone wants to turn down graphics to boost performance, then they can make their own decision to do that.

We agree 100%. I'm sure the forum may be divided on that, but I expect ours to be the popular opinion. If this were 10 years ago, who knows; perhaps the other opinion would be more popular, since cards struggled with games more than they do now.
 
Yeah, but is ATI High Quality equivalent to... Nvidia default?
Now the benchmarkers have to go figure that out. Probably yes, but it's hard to tell.

Either way, all this graphical optimization stuff ("low" qualities) should be off by default. Both should be max, out of the box. It just makes sense.

If someone wants to turn down graphics to boost performance, then they can make their own decision to do that.

Whether ATI's HQ is equivalent to Nvidia's default doesn't matter. What matters is whether ATI and Nvidia use optimizations in HQ. If they do, then that is a cheat. If Nvidia manages to put optimizations in their Quality setting that make it look like HQ, then that is good. Reviewers, however, should only compare HQ to HQ for apples to apples.

I don't see max being the default anywhere else, so I have a hard time considering a default setting a cheat. When I install a game, I usually have to max the settings myself. There is no standard saying that max should be the default anywhere. Default is normally something I consider a vanilla setting, something that suits most people. HQ is more for those with beefier setups.

However, HQ should be without optimizations.
 
AMD should stop skimping on image quality. I personally can't tell the difference, but they shouldn't skimp on image quality at the consumer's expense. In my eyes, since I can't tell the difference, I will still consider getting AMD for my next GPU.
 
All reviewers have to do is set HQ on both AMD and Nvidia (as one would in an apples-to-apples review), and this would be a non-issue.
This!

As long as there is an option to make them look alike, nobody should really care. If Guru3D has a problem with the defaults because they always run their tests at default, their policy is getting in the way of their accuracy.
 
This is where our opinions obviously differ. I don't think optimizations should be enabled in two of the three settings by default. It's going to be 2011 soon and the hardware can handle games at HQ and Quality no problem; the only reason to enable it by default in Quality is to inflate review results. At least that's what I think. I don't think it's a bad idea to make the optimizations available in Quality via a checkbox that reviewers and customers can disable if they don't want it on (reviewers should absolutely not leave it on), as that wouldn't be fair to both companies.

Hardware can't handle HQ on everything with max settings everywhere. Most hardware is mainstream cards, not high-end cards; what high-end cards can do and what mainstream cards can do are different things.

Default is a vanilla setting and always has been. If you start a game, any game, you will find that you don't get the highest quality settings at default. You need to crank that shit up yourself.

There is nothing wrong with having a default setting chosen by the game developer or hardware developer. However, not having the possibility to crank it up would be bad.
 
Personally I think this is a total non-issue. I'm happy to let the manufacturers decide what the best balance of performance/quality is when it comes to such minor differences, and if you feel differently it is simple enough to tweak.
 
Personally I think this is a total non-issue. I'm happy to let the manufacturers decide what the best balance of performance/quality is when it comes to such minor differences, and if you feel differently it is simple enough to tweak.

It is an issue: AMD is cheating the people who review cards and the people who buy them by using the same default settings as everybody else. They should apply a handicap to themselves, or simply rename the default settings, so everyone stops making a big deal out of something that needs photo manipulation to even spot.
 
Hardware can't handle HQ on everything with max settings everywhere. Most hardware is mainstream cards, not high-end cards; what high-end cards can do and what mainstream cards can do are different things.

Default is a vanilla setting and always has been. If you start a game, any game, you will find that you don't get the highest quality settings at default. You need to crank that shit up yourself.

There is nothing wrong with having a default setting chosen by the game developer or hardware developer. However, not having the possibility to crank it up would be bad.

We each have good reasons for our opposite opinions and I doubt anything will change our minds ;)

Personally I think this is a total non-issue. I'm happy to let the manufacturers decide what the best balance of performance/quality is when it comes to such minor differences, and if you feel differently it is simple enough to tweak.

Imagine if that were the case: Nvidia might do the same and make the optimizations default, then AMD would degrade even more, then Nvidia again, and we'd all end up with piss-poor quality at default. In a very competitive market, it's us who will fall prey to this aggressive competition if it happens.
 
I don't understand why it's such a big deal. I could just turn on High Quality and forget about it...

Guru3D is all upset that this AMD optimization is applied by default. They preempt the question of "why not just use high-quality?" by saying that they have a policy of only using default settings. But why does that policy exist?

Essentially everyone with any interest in high-performance graphics knows that IQ and performance can be tweaked in the drivers. I'm pretty sure most people would notice the setting in CCC sooner than they'd notice any IQ compromises. If someone can't find their way to CCC, I think there's a fat chance they're attentive enough to notice reduced texture filtering.

I'm somewhat surprised nobody is looking at this in a more positive light; the new Quality setting nets about 8% better performance for an unnoticeable quality loss. I'll probably be using that setting 90% of the time.
 
Personally I think this is a total non-issue. I'm happy to let the manufacturers decide what the best balance of performance/quality is when it comes to such minor differences, and if you feel differently it is simple enough to tweak.

Exactly. The labeling is easy: Quality vs. High Quality. Hard to miss that there is a difference.

The worst part is that for a minor reduction in PQ you gain 8-9% performance, according to Guru3D:

The overall performance difference is roughly 8%, maybe 9%, and that is an ample difference in the graphics arena alright.

8-9%? That's about halfway from a GTX 480 to a GTX 580. That's a good reason to have Quality as the default and HQ as an option for reviewers and high-end users.

What did Guru3D say?

Guru3D said:
So what does that optimization look like in real-world gaming? Well, it's really hard to find, actually. In environments with a lot of depth, say a road with a textured grid at the end of it, you would be able to spot some anomalies, but only if you look really carefully and the scene is properly lit.

So: HQ as default with an 8-9% performance hit, or Quality as default with 8-9% better performance and an impact on PQ that they consider hard to find?
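To put that 8-9% in concrete frame-rate terms, here's a trivial sketch (the 60 fps baseline is my own illustrative assumption, not a number from the article):

```python
# Illustrative only: what an 8-9% driver-level uplift looks like in frame-rate terms.
baseline_fps = 60.0  # assumed baseline, not a figure from the article
for uplift in (0.08, 0.09):
    print(f"{uplift:.0%} uplift: {baseline_fps:.0f} fps -> {baseline_fps * (1 + uplift):.1f} fps")
```

Roughly five extra frames per second at a 60 fps baseline, for an IQ difference Guru3D says they had to hunt for.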
 
Imagine if that were the case: Nvidia might do the same and make the optimizations default, then AMD would degrade even more, then Nvidia again, and we'd all end up with piss-poor quality at default. In a very competitive market, it's us who will fall prey to this aggressive competition if it happens.

This won't happen if reviewers learn to use HQ settings in reviews, as they always should regardless of what the default is.

Default is vanilla. This is why you never see games shipping with 8x AA, 16x AF, and all the eye candy on by default.

If you believe that default should be the highest possible settings, then please show me where default means max settings in any other context.

Take a look at the default power profile in Windows... it defaults to Balanced. Is this wrong of Microsoft? Should we write a huge article about how Microsoft cheats by not cranking the power up by default? I mean, MS shouldn't cheat and should crank the power up as default, right? :D

We each have good reasons for our opposite opinions and I doubt anything will change our minds ;)

I'm always open to arguments, but I really have problems understanding the issues here. I would argue that most gamers benefit more from an 8-9% performance increase than from an increase in picture quality they might not even see.

We need to see that there are two separate issues here:

The consumer:
Is AMD cheating the consumer by selecting Quality as default, giving 8-9% better performance with a "hard to spot" reduction in PQ?

The reviewer:
Is AMD cheating the reviewer by having Quality as default, on the assumption that the reviewer isn't competent enough to select HQ on both cards before making the review? Wouldn't the reviewer, by the same assumption, also be cheating us by pretending he's competent enough to make the review?
 
Looked at the Mass Effect comparison screenshots... I tried my hardest to spot differences but just couldn't. I don't understand what all the fuss is about.
 
Guru3D didn't think much of it, because you have to jump through a lot of hoops to see it.

Let's say my computer renders Mass Effect at 60 frames per second, and let's say I pull each frame into Photoshop, zoom in, and crank the gamma. At 60 fps, even half a minute of scrutiny per frame works out to about half an hour of pixel-peeping for every second of game time before these optimizations could bother me. Not much of a "cheat" if you ask me (or Guru3D).

If a tree falls in a forest...

I'm with you. If it's that subtle... honestly, I say go for it if we get improved performance without noticeable quality differences. I mean, I don't pay attention to half the shit on the screen when I'm playing games... no time to gawk at the graphics, I'm playing here. And I don't do screenshots. So yeah, I say why not; I don't care who does it or who doesn't. If I don't notice it while gaming, go for it.
 
Imagine if that were the case: Nvidia might do the same and make the optimizations default, then AMD would degrade even more, then Nvidia again, and we'd all end up with piss-poor quality at default. In a very competitive market, it's us who will fall prey to this aggressive competition if it happens.

I'm waiting patiently for your thread calling Nvidia cheats for disabling settings in their games specifically to hinder ATI cards.

You need to boost the image in Photoshop to see it.

Shortened instructions are linked above.

So it's not even worth the time. This is an asinine and idiotic thing to even be arguing about.
 
We agree 100%. I'm sure the forum may be divided on that, but I expect ours to be the popular opinion.

You might want to think again; I'm with Tamlin on this. I think it's great that they provide a more enjoyable experience for 99% of their consumers by default.

More enjoyable being: a loss in graphics quality that they can't perceive, in exchange for quite a dramatic increase in FPS that they can indeed notice.

As long as they provide options, who gives a damn? [H]ard already does IQ comparisons in its major reviews; if the default setting is dramatic enough to affect the outcome of those results, then we'll see what Ryan and Kyle have to say about it. Right now it's fucking fine.
 
It doesn't say they cheated, just that HQ IQ should be the default... Whether it needs to be or not is subjective. Honestly, I still can't see a damn difference, and I'm not sure Guru3D could either, but meh... The only thing I know is that AMD gains more FPS at lower-quality IQ.
 
This won't happen if reviewers learn to use HQ settings in reviews, as they always should regardless of what the default is.

Default is vanilla. This is why you never see games shipping with 8x AA, 16x AF, and all the eye candy on by default.

If you believe that default should be the highest possible settings, then please show me where default means max settings in any other context.

I believe it shouldn't be the highest, but it should be high enough to actually compare to the competition's. I also believe that today's graphics cards are very powerful and these optimizations are a bit unnecessary. The message they are sending can be interpreted as trying to win benchmarks against the competition at the cost of IQ, even though their FPS without these optimizations is lower but still good enough. So then why..?

Take a look at the default power profile in Windows... it defaults to Balanced. Is this wrong of Microsoft? Should we write a huge article about how Microsoft cheats by not cranking the power up by default? I mean, MS shouldn't cheat and should crank the power up as default, right? :D

That is 100% correct. I would say high-cost graphics cards are a rare exception; if this were done on cards that actually need the help, then hey, it is what it is.

I'm always open to arguments, but I really have problems understanding the issues here. I would argue that most gamers benefit more from an 8-9% performance increase than from an increase in picture quality they might not even see.

We need to see that there are two separate issues here:

The consumer:
Is AMD cheating the consumer by selecting Quality as default, giving 8-9% better performance with a "hard to spot" reduction in PQ?

The reviewer:
Is AMD cheating the reviewer by having Quality as default, on the assumption that the reviewer isn't competent enough to select HQ on both cards before making the review? Wouldn't the reviewer, by the same assumption, also be cheating us by pretending he's competent enough to make the review?

The issues are what I mentioned above:

1. AMD did this without telling anyone, including reviewers.
2. They don't need these optimizations.
3. They degrade IQ by default, which IMO should be the exception. Quality should not include degraded IQ; Performance should, perhaps with a checkbox for those who want the extra performance, but not sneaking it in and hoping it goes unnoticed.

I'm waiting patiently for your thread calling Nvidia cheats for disabling settings in their games specifically to hinder ATI cards.

Climber, if and when this is proven 100%, and I happen to stumble upon the proof before it's out there, I publicly promise you that I'll create such a thread.

You might want to think again; I'm with Tamlin on this. I think it's great that they provide a more enjoyable experience for 99% of their consumers by default.

More enjoyable being: a loss in graphics quality that they can't perceive, in exchange for quite a dramatic increase in FPS that they can indeed notice.

As long as they provide options, who gives a damn? [H]ard already does IQ comparisons in its major reviews; if the default setting is dramatic enough to affect the outcome of those results, then we'll see what Ryan and Kyle have to say about it. Right now it's fucking fine.

Many consumers do not know how to change the quality settings on their card. Trust me on this; I did support for years and regularly walked clueless people through it. Most reviewers test at the Quality setting because it's the default and they assume most consumers will run it that way too. It's also important to note that they snuck this by everyone until it was brought to the surface just now. They need to make this optional, and they need to be told they can't get away with things like this or they will do something worse next time. I understand you can barely notice it, but either way it's unnecessary, and it's questionable why they did it in the first place, and why they added it to the Quality setting, given that Quality is the setting most commonly used in reviews.
 
All this talk of degrading IQ, and yet look at how the images have to be captured, analyzed, and manipulated to even notice the difference...

And I don't accept the "give them an inch, they'll take a mile" argument, because if the image looks like crap, or I want it to look better than it does, I have the option to turn it up at will. But if I launch it and it looks good to me and I get better performance by not seeing things I won't miss anyway... great!
 
I understand the division here; I didn't expect everyone to agree with each other. So why doesn't everyone take a stab at answering the questions in my original post? I'll post them here.

So if it's not a cheat, I ask: what should Nvidia, as their competition, do?

a) Add the same optimizations and degrade IQ as a default setting, because it's hardly noticeable.
b) Continue their campaign for public awareness so that AMD stops and plays nice.
c) Suck it up, keep the high road, and let their competition optimize.


What should AMD do?



Be realistic with these answers, and as you think about them, imagine what will happen to the industry if the things you suggest come to fruition.
 
I don't understand...the Catalyst AI setting has always defaulted to 'Standard'...it was never turned off in any previous driver...isn't setting the new AI setting to 'Quality' the same as the previous default setting of 'Standard'?
 
All this talk of degrading IQ, and yet look at how the images have to be captured, analyzed, and manipulated to even notice the difference...

And I don't accept the "give them an inch, they'll take a mile" argument, because if the image looks like crap, or I want it to look better than it does, I have the option to turn it up at will. But if I launch it and it looks good to me and I get better performance by not seeing things I won't miss anyway... great!

Pretty much....If Nvidia did the same thing, and it wasn't noticeable, I wouldn't care in the least either. Saying they have to match Nvidia's standards, no matter if they're right or wrong, makes little sense to me. People are going to go for the higher FPS, particularly if they can't discern any difference.
 
Pretty much....If Nvidia did the same thing, and it wasn't noticeable, I wouldn't care in the least either. Saying they have to match Nvidia's standards, no matter if they're right or wrong, makes little sense to me. People are going to go for the higher FPS, particularly if they can't discern any difference.

But isn't it Nvidia who needs the market share right now? They don't appear to be doing it. I bet if it were Nvidia doing this, it would be a whole different story. But hey, I guess we live in the world we do. I'll say this: I don't think this is such a bad thing; the bad thing was the way AMD went about it.

My concern is the bigger picture. Say Nvidia does the same. Now AMD is losing again, and even worse than before. So then what do they do?

Degrade more? Accept it? Take all the optimizations off and create public awareness so that Nvidia stops and plays nice? See, this is a vicious circle.
 