Fury X vs. Titan X benchmark

Huh, that's shocking. Both AMD and Nvidia lower IQ if you don't change the default driver options.

Very shocked you guys don't change them.

It is up to AMD and NVIDIA to provide the kind of experience they want out of their cards to gamers. It is our job to evaluate that experience. We take that, at its default, and evaluate the gameplay experience. If we were to change anything in the driver we would be artificially changing that experience and creating an unfair comparison.

To keep it fair, we leave the driver settings alone, at their defaults, on both AMD and NVIDIA. If we detect any degraded image quality, we bring it to the attention of AMD and NVIDIA.
 

Well, when was the last time you checked that? I know for a fact that if I leave my Nvidia drivers on Quality and don't switch to High Quality, I can notice a difference in IQ and shimmering.

I can't remember the AMD drivers off the top of my head, but I always turn off the AMD optimization. It brings the IQ back to where it should be.

Maybe in the next review you do, you could check that out? It sucks, but I know both do it, and I fucking hate it with a passion.

I want top-of-the-line IQ, not an optimization that's noticeable.
 
The first thing I do is change that setting in NVCP, and I agree with the IQ and shimmering comment; it's especially noticeable on trees for me. Would love to see an [H] comparison on it as well.
 
This just proves it:

From Greg @ OCUK

Nvidia control panel default settings:
http://3.1m.yt/bvOSUNPr1.jpg


Set to highest quality:
http://1.1m.yt/6voZccp13.jpg


Screenshots taken from the game ;)

I'd say that is a pretty big difference. Where did the AF go in the first screenshot?

Plus there seems to be around a 10% performance drop after setting the quality to highest.
 

EXACTLY TO A TEEEEEEE.

This is what I was talking about. So basically you are telling me [H] never knew about this? I thought this was common knowledge?
 
Of course we know the differences control panel settings can make, that isn't the point. To test fairly, we must leave the control panel alone between AMD and NV and compare at their default settings. As I said, it is up to NVIDIA and AMD to provide the kind of gameplay experience they want out of their cards, we take that, and evaluate it as is, fairly, with no outside or artificial manipulation, else it would not be a fair comparison.
 

Huh, OK, so you guys know that both companies lower IQ by leaving the drivers at default?

I understand being fair between the companies, but what about a disclaimer for people who read the reviews, letting them know the settings you use might lower IQ in games if left alone?

It's pretty noticeable.
 
We have never manipulated driver settings; from day one we have always tested at default driver settings, and this has never changed. If a disclaimer will help you, I will add one to the test setup page again (I think we used to have that in there at some point in the past). I assumed this was a given.
 

Well, I do not think everyone knows that leaving the driver settings at stock lowers IQ in games.

I mean, I guess I don't care; I just assumed people knew. I guess not everyone did, LOL.

Makes me wonder if those threads where people say AMD or Nvidia has better IQ are just down to the driver settings being left alone. Makes ya wonder now...
 
Huh, OK, so you guys know that both companies lower IQ by leaving the drivers at default?

I understand being fair between the companies, but what about a disclaimer for people who read the reviews, letting them know the settings you use might lower IQ in games if left alone?

It's pretty noticeable.

As a reviewer, they can not. That would alter the test and leave huge openings for even more criticism than they get now. However, changing the AMD and Nvidia driver settings to make sure they are maxed is a good idea for a supplemental review. But again, they must leave those at default for most reviews, kind of like leaving cards at stock and not OCing them, or setting both to the same MHz.
 
Well, I do not think everyone knows that leaving the driver settings at stock lowers IQ in games.

I mean, I guess I don't care; I just assumed people knew. I guess not everyone did, LOL.

Makes me wonder if those threads where people say AMD or Nvidia has better IQ are just down to the driver settings being left alone. Makes ya wonder now...

I have not experienced any difference between NVIDIA and AMD in terms of IQ since the Quake 3/Unreal anisotropic filtering pattern difference days.
 

Ahhh, well, change your NVIDIA driver settings to High Quality. You will see the difference.

I mean, look at those BF4 pics a couple of posts up. That's what I see if I don't change it (especially after a new driver install).
 
Well, I do not think everyone knows that leaving the driver settings at stock lowers IQ in games.

I mean, I guess I don't care; I just assumed people knew. I guess not everyone did, LOL.

Makes me wonder if those threads where people say AMD or Nvidia has better IQ are just down to the driver settings being left alone. Makes ya wonder now...

Well, that was the case in the posts I linked earlier. At default there was hanky-panky to get better bench scores, but when manually set there was no difference, or at least none discernible without being overly nitpicky. And that is the case in this thread. The original tester was running at driver defaults when he noticed this discrepancy. The consensus is that adjusting the settings would alleviate any issue. But the way it looks for now, the default driver settings are not giving max or equal IQ. Again, this is not confirmed or proven 100%; so far it seems only one guy is able or willing to post pics as proof. So for now it is just a theory until we get more substantiated proof from more users or review sites.

I did look at CCC on mine, and the only setting that could be suspect is Tessellation, which is set to AMD Optimized. So I'm not sure exactly how that reacts to each game, i.e. 64x versus, say, the 16x auto-set in games like TW3. Now that I look more at CCC, I see that under Texture Filtering, after the first two settings (Anisotropic Filtering Mode and Samples) are set to Use Software Settings, there are further options like Surface Format Optimization set to on. So I'm not sure if the top two override the ones under them or work in conjunction with them.
 

Hmm, could have sworn there was another option I changed when I ran AMD... it's been too damn long, lol.

Either way, it's been known for a while that Nvidia's stock quality setting lowers IQ but benchmarks higher. Like I said, I thought most reviewers knew that.

Anyway, yeah, good luck with the cards, people!
 
Of course we know the differences control panel settings can make, that isn't the point. To test fairly, we must leave the control panel alone between AMD and NV and compare at their default settings. As I said, it is up to NVIDIA and AMD to provide the kind of gameplay experience they want out of their cards, we take that, and evaluate it as is, fairly, with no outside or artificial manipulation, else it would not be a fair comparison.

There are differences in color & gamma; that's subjective.

What isn't subjective is everything else that matters: texture quality, view distance.

Of course leaving it on default driver settings is expected to be "fair", because the setting specifies "let the application decide", so we gamers assume our in-game settings on Ultra are Ultra.

But what's happening on NV is that the default setting defaults to ass quality on filtering. Unacceptable!

There is a performance penalty in going from the default to forced high quality (to match AMD's rendering at default): it's 10%. Not an insignificant figure by any stretch of the imagination.

You guys claim you can notice subtle microstutter, and hence you don't do FCAT. So you are saying you don't notice these obvious IQ differences? How are the readers supposed to trust your judgment on something so blatant?
 
You are putting words in my mouth now. Both AMD and NV use optimizations; this is a rather old argument. I have stated why we use default driver settings.
 
To test fairly, we must leave the control panel alone between AMD and NV and compare at their default settings.

To me that sounds like poor methodology. The default settings are likely to have differing optimizations which will affect image quality uniquely for one side or the other. What company A thinks is a good level for "default" quality could be very different from company B. The most fair and scientific approach would be to test with the highest settings in both control panels, as those presets are unlikely to include optimizations that noticeably affect IQ at all. In addition, you should ideally put IQ under a microscope every once in a while to make sure both sides are playing on even ground.

Subjectively you may not feel any difference between vendors, but the same is true listening to a 128 kbps MP3 versus a lossless format -- putting both under scrutiny will reveal major differences.
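One way to make that scrutiny objective rather than a matter of taste is a straight pixel comparison of paired captures. Below is a minimal sketch of the idea (my own illustration, not [H]'s workflow; the file names are placeholders, and it assumes both screenshots were captured at the same spot and resolution, one at default driver settings and one with filtering quality forced to maximum):

```python
import numpy as np
from PIL import Image

# Two captures of the same scene: one at default driver settings,
# one with texture filtering forced to the highest quality.
default_shot = np.asarray(Image.open("default_settings.png").convert("RGB"), dtype=np.int16)
forced_shot = np.asarray(Image.open("high_quality.png").convert("RGB"), dtype=np.int16)

diff = np.abs(default_shot - forced_shot)          # per-pixel, per-channel delta

print("mean absolute difference:", round(float(diff.mean()), 2))   # 0 means identical
changed = (diff.max(axis=2) > 8).mean() * 100                       # ignore tiny deltas
print("pixels with a visible change:", round(float(changed), 1), "%")

# Write an exaggerated difference map so filtering/AF changes are easy to eyeball.
Image.fromarray(np.clip(diff * 8, 0, 255).astype(np.uint8)).save("difference_map.png")
```

If the two settings really render the same image, those numbers come out near zero; if a default quietly drops AF, the difference map lights up on exactly the kind of ground and distant textures being described in this thread.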
 
There are differences in color & gamma; that's subjective.

What isn't subjective is everything else that matters: texture quality, view distance.

Of course leaving it on default driver settings is expected to be "fair", because the setting specifies "let the application decide", so we gamers assume our in-game settings on Ultra are Ultra.

But what's happening on NV is that the default setting defaults to ass quality on filtering. Unacceptable!

There is a performance penalty in going from the default to forced high quality (to match AMD's rendering at default): it's 10%. Not an insignificant figure by any stretch of the imagination.

You guys claim you can notice subtle microstutter, and hence you don't do FCAT. So you are saying you don't notice these obvious IQ differences? How are the readers supposed to trust your judgment on something so blatant?

Not sure how much I would trust FCAT, just my opinion, but Brent makes a good point. To be fair, leave it all stock.

Install the drivers and bench. That's the only way to be fair, and that's the way it should be.

The only thing I would like to see is more IQ pics comparing Quality vs. High Quality in the NVCP and CCC, with a disclaimer stating that.
 
This is a very old argument we've had many times in the past; when AMD or NV optimized, people called it "cheating." It is a very old and tired argument.

We use default driver settings. We stated our stance then, and we state it now. Nothing more needs to be said: we have used default driver settings, and we will continue to use default driver settings.

Thank you for the feedback.
 
You are putting words in my mouth now. Both AMD and NV use optimizations; this is a rather old argument. I have stated why we use default driver settings.

There are optimizations, and then there is image degradation to chase performance.

When gamers set the in-game settings to maxed, we expect maxed. But with the NVCP bug, at default, it isn't maxed. You have to force it to High Quality to get maxed.

As a tech reviewer, aren't you even curious about the potential impact it has on IQ & performance?

On AMD, default = higher quality, but it hurts performance.

On NV, default = worse quality, but it runs faster (10% according to BF4 tests; see the rough arithmetic after this post).

Do a detailed analysis in recent games: image quality comparisons at default driver settings (since that's what you test on), same settings in-game, same location. Then do it with quality forced in CCC/NVCP.

You will get a ton of site hits from everyone interested in this issue. If it turns out to be true, then basically NV has "optimized" their way to extra performance for years at the expense of IQ.

As noted on AT's forum, this isn't a new problem for NV:

example4i.jpg
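To make the stakes of that 10% figure concrete, here is some rough hypothetical arithmetic (the frame rates are invented for illustration; only the ~10% filtering delta comes from the BF4 claim above):

```python
# Invented numbers: how a ~10% gain from cheaper default filtering
# could flip a default-settings head-to-head.
amd_full_quality = 50.0                     # fps, assuming full-quality filtering at default
nv_full_quality = 48.0                      # fps, if NV were forced to matching quality
nv_default = nv_full_quality * 1.10         # fps, with the cheaper default filtering

print(f"Default-settings comparison: NV ahead by {(nv_default / amd_full_quality - 1) * 100:+.1f}%")
print(f"Matched-quality comparison:  NV behind by {(1 - nv_full_quality / amd_full_quality) * 100:.1f}%")
```

In other words, a card that would trail slightly at matched image quality can show up a few percent ahead in a default-settings benchmark, which is exactly why the IQ question matters for these reviews.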
 
I think the key takeaway would be to compare the image quality between both cards (at default settings) and then make a judgment call (or at least provide comparative images). I know that [H] has done this in the past - for a long time it didn't matter, as both companies' default settings were equivalent. Sounds like another round of comparisons might be warranted, as NVidia seems to be getting the "itch" again...
 
Gonna post this here as well:
So the old saying that AMD has better image quality still holds true to this day:

-In the past, ATI had better VGA output quality because they used better RAMDACs.
-They started using a 10-bit internal LUT from the 5xxx series onwards, which gives you better picture quality with calibrated monitors because you get less banding. Nvidia still uses an 8-bit LUT on their consumer cards (see the sketch after this list).
-In 2014 they upped the internal LUT to 12-bit, meaning even less banding with calibrated monitors. I can now calibrate my 10-bit direct-drive HP ZR30w monitor (no internal LUT / scaler board) and get no banding at all.
-And now it seems they even have better texture quality at default settings.
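On the LUT point above, here is a small hypothetical sketch (my own illustration; the correction curve is just an example, not a real calibration) of why LUT precision matters: it stores a correction curve at different bit depths and counts how many of the 256 grey levels an 8-bit source can request still map to distinct LUT entries. Levels that collapse together are what show up as banding or missing codes on a calibrated display.

```python
import numpy as np

# Example correction curve only (retarget gamma 2.2 -> 1.8); real calibration
# curves differ, but the effect is the same in kind.
calibrate = lambda x: x ** (2.2 / 1.8)

greys = np.arange(256) / 255.0              # the 256 levels an 8-bit source can request

for lut_bits in (8, 10, 12):
    steps = 2 ** lut_bits - 1
    lut = np.round(calibrate(greys) * steps).astype(int)   # curve stored at this precision
    print(f"{lut_bits:>2}-bit LUT: {len(np.unique(lut))} of 256 grey levels stay distinct")
```

With this example curve the 8-bit LUT merges a bunch of near-black levels, while the 10-bit and 12-bit LUTs keep all 256 distinct and leave the hardware room to dither the result down to the panel; the exact counts depend on the curve, but only the 8-bit case loses levels for a typical correction.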

No wonder why all my nvidia cards have looked like shit ;)
Did I just say that out loud :eek:

Let the flame war begin :p
Hopefully not.
 
This is a very old argument we've had many times in the past; when AMD or NV optimized, people called it "cheating." It is a very old and tired argument.

We use default driver settings. We stated our stance then, and we state it now. Nothing more needs to be said: we have used default driver settings, and we will continue to use default driver settings.

Thank you for the feedback.

I agree with staying with driver defaults. I believe it is a must to keep the accusations of cheating at bay. But playing devil's advocate for a moment: does it seem reasonable that one company may be setting levels lower for better scores, knowing that your stance is not to touch driver defaults? I get it. It is a razor's edge, and like I stated at first, you SHOULDN'T change driver settings.

BUT, yeah, here it comes: this IQ-at-driver-default question may be due for a look. It seems that NVIDIA does indeed have lower IQ settings in their own driver defaults. AMD's CCC has me confused as to whether they do as well; guess I could do some of my own testing. Now isn't the time, late at night, contacts drying out, tired... But I am definitely curious about this on both sides. I think both should be set to the max level, at least for the top-tier cards. I get lowering things for the lower tiers, though I think that should be left to game settings rather than drivers.
 
Gonna post this here as well:


Let the flame war begin :p
Hopefully not.

BEGIN? Where have you been, lol. Granted, what he says is interesting, but it's not entirely helpful the way he put it, with the flaming and all.
 
We have never manipulated driver settings; from day one we have always tested at default driver settings, and this has never changed. If a disclaimer will help you, I will add one to the test setup page again (I think we used to have that in there at some point in the past). I assumed this was a given.

What I would say is that you guys are enthusiasts, AND you run your tests with a lot of subjectivity, so why wouldn't the best tests involve the best experience you can create, even if that means messing with drivers/control panels? Wouldn't fair be showing the richest capabilities of both companies' products and comparing the performance of both?

I could be wrong, but I would guess that your audience would appreciate seeing this addressed in your reviews, especially because you give playable settings. Those should be playable AND the best experience visually, no?
 

So, double our workload and artificially manipulate the experience to our own liking.

Yeah, no.

I'll stick to letting AMD and NVIDIA provide the gameplay experience they want for their own cards, and we will evaluate that and see which one is better and how they compare.

The reason for using default driver settings has been stated; please reference the posts above.
 
Damn, the differences are very noticeable and nobody can deny that... I can only imagine if AMD did that; it would have been hell on earth :)) But it's NV, so people will just pass on it... no problems whatsoever.
 
This just proves it:

From Greg @ OCUK

Nvidia control panel default settings:
http://3.1m.yt/bvOSUNPr1.jpg

Set to highest quality:
http://1.1m.yt/6voZccp13.jpg

Screenshots taken from the game ;)

I'd say that is a pretty big difference. Where did the AF go in the first screenshot?

Plus there seems to be around a 10% performance drop after setting the quality to highest.
So that is "quality" compared to "high quality"?
 
Looks like reviewers need to go back to doing image quality comparisons and call out the vendors that are chasing the highest FPS possible.

BF4 on my 2x 290s did look better than on my Zotac AMP 980 Ti at default settings, and I mean it was noticeable. Noticed the same on my SLI 970s some time back and wondered why it looked all washed out.

I buy 970 SLI and get the 3.5 GB bug; now I have a 980 Ti and get image quality reduction. I can't seem to win. I hope the default setting gets corrected. I am not too worried about it, but it does concern me a little.
 

You know, that is such a common thing from owners of both brands; the same description, a "washed out" look, keeps coming up for NV. But that's subjective taste.

What isn't subjective, and what [H] has no interest in examining, are actual objective metrics such as texture quality, filtering quality, and view distance quality.

Those impact game performance massively, yet it's left to AMD & NV to "optimize" all they want to chase higher FPS. Without unbiased and ethical sites to keep the bastards honest, this situation with NV's default = crap is encouraged.

Somehow, I think if the situation were reversed and AMD's default produced worse image quality, these same sites would jump all over it and sensationalize it to the max! The double standards are blatant.
 
In those BF screens I think it may be a driver/game bug; either AF is not working, or we're looking at the lowest filter quality vs. the highest.

The difference between "Quality" and "High Quality" should be noise in the texture at oblique camera angles (which may add sparkle/shimmer at the lower setting). Even examining screenshots it's hard to tell them apart outside of brightly lit scenes.

Personally, it's not something I can pick up on during the run of play, but I use DSR.
 
I, too, think things are being exaggerated, but everyone wants to find a cause for contention between AMD and NVIDIA.
 

Did you look at the Sleeping Dogs screenshot? It's missing tons of ground texturing + lighting.

Why not add image comparisons to the reviews? You guys have done reviews solely on IQ before.

I mean, you are playing the game during these tests; what's wrong with taking a screenshot every so often and attaching them? Heck, even let other people look through them for issues if you don't have the time.

You guys used to compare differences all the time.

http://www.hardocp.com/article/2013...deo_card_performance_iq_review/6#.VZeYKvlVhBc

It doesn't matter which side has better quality; if there are differences, those should be out in the open.
 

It takes a lot of guts to call out Nvidia on their shenanigans. The US tech press was completely meek on the GTX 970 fiasco. Only PCPer and HWC did reviews on the performance degradation, and furthermore only PCPer did 970 SLI reviews which pushed the cards to show what happens when VRAM usage goes above 3.5 GB.

Right now I would not trust a single US tech reviewer. Most are pretty much shilling for Nvidia. Do you think any reviewer is going to go through a whole lot of work and unearth something to the detriment of Nvidia? Nah, I don't see that happening. When it comes to finding Nvidia's shenanigans, it's left to the poor consumer to figure it out and raise their voices on various forums on the internet, as they did with the GTX 970.

If the same IQ issue were with AMD, you would see the so-called tech sites aggressively examining the issue at hand. It's just the way the system works. People don't want to mess with the dominant company in an industry. Right now Nvidia's dominance of the GPU market is slowly reaching Intel's levels in the CPU market. Anyway, I have always felt, from the ATI days, that their IQ was better than Nvidia's due to reasons like better RAMDAC quality. ATI/AMD also started supporting 10-bit colour from the HD 5800 series, which maintained the IQ lead. The case at hand now is definitely worth investigating, because if the IQ is so glaringly different then making any performance comparison is completely invalid. But we have to wait and see which US tech reviewer / tech website has the guts to go ahead and do an objective investigation. :D
 
So nvidia is cheating again...


It's not cheating; it's default driver settings. It has been like that since the 6800 series. They changed the names of things in their control panel, though: Performance is now called Quality, and Quality is now called High Quality.

The drivers are set to let the application decide by default.

Changing the quality slider also adds AA.
 
What is all this back and forth about image quality? I thought that was the point of the apples-to-apples comparison: having ALL of the settings the same.
 
Man up and do what is right. If you set a game to ULTRA settings, your video card driver had better be delivering that experience.

BTW

AMD defaults to "let the application decide" for most of their driver settings in CCC; you can of course override those settings if you so choose.

NVidia has a long history of doing shady stuff, and people blindly support them:

1. Drivers catching cards on fire
2. IQ reduction via the driver overriding the game by default
3. Manufacturing problems that they hid from their customers (8400GSM ring a bell?)
4. Forced renegotiation of the chip supply contract with MS (talk about shooting yourself in the foot)
5. Bankrupted 3DFX after stealing their tech (they must be following Creative Labs)
6. The 3DMark "on rails" driver issues
7. The recent "can't overclock mobile cards" issue
...and the list goes on and on and on.

Now, ATi/AMD have done some things in the past that were really not acceptable, and I am not defending them for any of it, but they are nowhere near NVidia in terms of sleaziness...

Edit: that Sleeping Dogs screenshot above is pretty messed up... and the quality difference between NVidia and AMD is STAGGERING.
 
What is all this back and forth about image quality? I thought that was the point of the apples-to-apples comparison: having ALL of the settings the same.


I think that's why reviewers tend to use the in-game settings vs. the control panel settings.

So keep the drivers at default (which is application-controlled for both IHVs), and whatever the application does is what is done.
 
Sad to see the Fury X get beaten to such a degree, but the smallness of the chip's form factor combined with the HBM setup might allow for better optimization in the future. As is, the TDP is too high for the performance it delivers; I'd imagine future revisions lowering the TDP quite a bit more, with the form factors becoming small enough for average mini-ITX towers. A generation or two down the line, this branding may hit a fairly decent high point. The most exciting thing with HBM is the bandwidth, because it seems as though the heavily increased bandwidth may lead to dual-GPU cards that finally share RAM as opposed to copying to both. The benefits would be a significantly reduced price, higher RAM amounts, fewer delays due to RAM copying, and lower TDP.
 