Fury X vs. Titan X benchmark

No dude, I'm trying to diminish the huge amount of BS that some AMD fanboys produce.
I've made it clear what the BS is and asked them to prove it.
I'm not the troll, I'm taking the trolls on.

Sorry, but you are a huge troll and quite ignorant of most facts and reasonable community consensus. Also quite hostile. Even in the face of plentiful info you still spout the same fanboy crap over and over. You have no idea about the basics of forums and the art of respectful debate. Troll, you have a name: Nenu.
 
I am asking for the truth, not claiming to be the bringer of it. When I get home I will look and see about the consensus of users on both sides saying AMD has better initial IQ levels at driver default. Again though, when someone says others are saying something, it doesn't require proof of the original point, just proof that others are saying it.

Go to overclockers.net and look for the image quality thread and see for yourself what they say.



When making a statement, you'd better provide proof when asked; otherwise it's seen as nothing. At least give some reasoning behind what you are saying. Not other people's posts on another forum, unless there is some kind of methodology behind those posts, which I don't think there will be.
 
No dude, I'm trying to diminish the huge amount of BS that some AMD fanboys produce.
I've made it clear what the BS is and asked them to prove it.
I'm not the troll, I'm taking the trolls on.

Until yesterday that is what I thought, but watching you troll the non-native English speaker yesterday because he couldn't get his points across the language barrier was a bit much in my mind. This forum seems to be full of trolling lately, from both sides of the fence. He doesn't have the card to try and prove what you are asking, and even if he did you wouldn't believe him because he is an AMD fanboy in your eyes.
 
This forum seems to be full of trolling lately, from both sides of the fence. He doesn't have the card to try and prove what you are asking, and even if he did you wouldn't believe him because he is an AMD fanboy in your eyes.


There is a problem there: why is he posting about it as if he has seen the difference, taking others' word for it without looking into how the testing was done and asking for the proof himself? I might be missing something here, so clarifying what is going on would be good instead of instigating a posting war.
 
I am asking for the truth, not claiming to be the bringer of it. When I get home I will look and see about the consensus of users on both sides saying AMD has better initial IQ levels at driver default. Again though, when someone says others are saying something, it doesn't require proof of the original point, just proof that others are saying it.

Go to overclockers.net and look for the image quality thread and see for yourself what they say.

You have now demonstrated that you stated all the above without any proof, yet stated it as fact.
Proof is required for you to demonstrate that you are not trolling.
But it has now been established that you said these things when there was no proof.

I didn't make the statements.
You did, in defence of an image quality failure which you brought up here first, and then made things up in support of it.

Sorry, but you are a huge troll and quite ignorant of most facts and reasonable community consensus. Also quite hostile. Even in the face of plentiful info you still spout the same fanboy crap over and over. You have no idea about the basics of forums and the art of respectful debate. Troll, you have a name: Nenu.

Whatever you say now matters little.
You can't delete what has been quoted.
It has been demonstrated that you make things up to try and prove a very tenuous case.

If you disagree, provide the proof.
 
AMD Fury X Crossfire 4K: http://www.tweaktown.com/articles/7226/amd-radeon-r9-fury-video-cards-crossfire/index.html

[Benchmark charts from the TweakTown Radeon R9 Fury CrossFire review: images 40–45, 47, and 48.]
 
Until yesterday that is what I thought, but watching you troll the non-native English speaker yesterday because he couldn't get his points across the language barrier was a bit much in my mind. This forum seems to be full of trolling lately, from both sides of the fence. He doesn't have the card to try and prove what you are asking, and even if he did you wouldn't believe him because he is an AMD fanboy in your eyes.

It went beyond the language barrier.
He stated something as fact and was quite denigrating, with the emoticon and saying it was obvious, when his starting word was "maybe".
Then he admitted he is an AMD fanboy, which spelled out why he did it.
And I proved what he said was incorrect anyway.

I am not against AMD, I am against the BS that AMD fanboys keep spouting.
Of my last 6 gfx cards, 3 have been AMD.
I want AMD to recover but their image is always going to suffer when there are so many people posting rubbish.
They are doing AMD no favours because they contribute to the bad image and keep discussions of AMD's problems alive.
 
You have now demonstrated that you stated all the above without any proof, yet stated it as fact.
Proof is required for you to demonstrate that you are not trolling.
But it has now been established that you said these things when there was no proof.

I didn't make the statements.
You did, in defence of an image quality failure which you brought up here first, and then made things up in support of it.



Whatever you say now matters little.
You can't delete what has been quoted.
It has been demonstrated that you make things up to try and prove a very tenuous case.

If you disagree, provide the proof.

You have yet to prove I said there was. This is why you are a troll: mixing words and being vague. I spelled it out in bold letters that there is no proof but that it warrants looking into. My god, you are so FAIL at debate.

Worry not: tonight or tomorrow, when I am at home, I will post a lot of proof and documentation leading up to this debate, and promptly look for your apology.
 
You have yet to prove I said there was. This is why you are a troll: mixing words and being vague. I spelled it out in bold letters that there is no proof but that it warrants looking into. My god, you are so FAIL at debate.

Worry not: tonight or tomorrow, when I am at home, I will post a lot of proof and documentation leading up to this debate, and promptly look for your apology.

I quoted you and labelled the posts where you said it.
 
Go to overclockers.net, not UK; the guy posts there about settings. Everything is at default, except on the Fury he changes color or something. He made no tweaks in NVCP or CCC to alter any in-game setting.

I asked earlier for the OP to look and see if he saw any difference in image quality. I did not tell him what to look for or which vendor. Unfortunately that science experiment is now ruined. One must use default settings to prove any hanky-panky with default drivers. Adjusting settings in CCC or NVCP will invalidate the issue.

Apparently it is well known AMD has better IQ in games. I assume that means at default, whereas adjusting can make them equal.
He starts by saying go to this site because the guy is posting there. He relayed what the guy says in his posts and tried to get a user here to check it out without prior knowledge. He said "apparently"; not sure how he deserved to be trolled for this. There is no proof either way yet, so why is Nenu coming at him like it is a war?
 
He starts by saying go to this site because the guy is posting there. He relayed what the guy says in his posts and tried to get a user here to check it out without prior knowledge. He said "apparently"; not sure how he deserved to be trolled for this. There is no proof either way yet, so why is Nenu coming at him like it is a war?
OMG you get it and in a concise manner. Thank you.
 
No, you chose a single line out of context of the rest. Typical tactic of someone with ulterior motives.

"2" direct statements that you made that have no proof.
And so far you have provided none despite being asked, just as you asked me to.
 
"2" direct statements that you made that have no proof.
And so far you have provided none despite being asked, just as you asked me to.

Ok so first question: CAN YOU READ?

2ND question: CAN YOU COMPREHEND?

Looks like no. The only statement I made which might have proof, other than a multitude of posters agreeing with it, is AMD having higher IQ at driver default. This Fury/Maxwell IQ issue is under debate, and I am looking to see if it is valid.

If you need further explanation of the above, ask.

Also, I said I will post all the proof you require tonight or tomorrow, and I EXPECT an apology.
 
All of this to avoid posting proof.
And now you are saying there is proof.
Can't wait.
 
All of this to avoid posting proof.
And now you are saying there is proof.
Can't wait.

You are certainly quite the jerk. Why can I not link proof right now? What possible reason could there be? Maybe because I am on my phone and later I will be on my computer where linking is easy.

Lol you know you are boned and trying to get your punches in while you can.
 
I await your proof :)

Well, I think it's quite clear that you have been fisted by Nvidia and know it. What other possible reason would there be than you trying to justify that hard fisting by shitting all over these threads? I mean, come on, if you're so happy with your Nvidia card, wtf are you doing here? Are you trying to educate the "stupid AMD buyers" and feel better about yourself knowing that you saved so many souls from AMD's evil claws? I think you're doing your supreme overlords Nvidia no favors by trolling. In fact, people like you just make me even happier with my purchase, and no way in hell would I ever buy Nvidia's shit again after reading posts like yours or the likes of you. lol

Now, go home kid, you're drunk. :D
 
Well, I think it's quite clear that you have been fisted by Nvidia and know it. What other possible reason would there be than you trying to justify that hard fisting by shitting all over these threads? I mean, come on, if you're so happy with your Nvidia card, wtf are you doing here? Are you trying to educate the "stupid AMD buyers" and feel better about yourself knowing that you saved so many souls from AMD's evil claws? I think you're doing your supreme overlords Nvidia no favors by trolling. In fact, people like you just make me even happier with my purchase, and no way in hell would I ever buy Nvidia's shit again after reading posts like yours or the likes of you. lol

Now, go home kid, you're drunk. :D


Ok guys, just stop it. All of you are just bantering without showing anything. Just wait for Durquavian's "proof" and go from there; I am intrigued as well and want to see this "proof".

And Durquavian, if you are asking someone to do screenshots, videos, or testing for you, make sure they use some kind of good reasoning, like scientific methodology; otherwise it's not really valid.

So: step A to step B fully explained with reasoning, similar systems (if you are going to talk about frame rates), same resolutions, all the settings listed out at default, etc. If there is a noticeable difference and the default settings are different, it would also be nice to see the results once the settings are set the same.
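One way to make that kind of comparison reproducible, rather than an eyeball argument, is to diff lossless screenshots numerically. Here is a minimal Python sketch, assuming two PNG captures taken at the same resolution and identical in-game settings; the filenames are hypothetical examples, not anything posted in this thread.

```python
# Minimal sketch: quantify the difference between two lossless screenshots
# captured at identical in-game settings and resolution.
import numpy as np
from PIL import Image

def load_rgb(path: str) -> np.ndarray:
    """Load an image as a float RGB array."""
    return np.asarray(Image.open(path).convert("RGB"), dtype=np.float64)

def psnr(a: np.ndarray, b: np.ndarray) -> float:
    """Peak signal-to-noise ratio in dB; higher means more similar."""
    mse = np.mean((a - b) ** 2)
    if mse == 0:
        return float("inf")  # pixel-identical images
    return 20 * np.log10(255.0) - 10 * np.log10(mse)

# Hypothetical captures, both taken at driver defaults.
amd = load_rgb("fury_x_default.png")
nv = load_rgb("titan_x_default.png")
assert amd.shape == nv.shape, "screenshots must match in resolution"

print(f"PSNR: {psnr(amd, nv):.2f} dB")

# Save a per-pixel difference map so any divergence (e.g. from AF or AO
# differences) can be inspected visually instead of argued about.
diff = np.abs(amd - nv).max(axis=2).astype(np.uint8)
Image.fromarray(diff).save("diff_map.png")
```

Uncompressed captures matter here: YouTube or JPEG compression would swamp exactly the kind of subtle filtering differences being debated.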
 
Well, I think it's quite clear that you have been fisted by Nvidia and know it. What other possible reason would there be than you trying to justify that hard fisting by shitting all over these threads? I mean, come on, if you're so happy with your Nvidia card, wtf are you doing here? Are you trying to educate the "stupid AMD buyers" and feel better about yourself knowing that you saved so many souls from AMD's evil claws? I think you're doing your supreme overlords Nvidia no favors by trolling. In fact, people like you just make me even happier with my purchase, and no way in hell would I ever buy Nvidia's shit again after reading posts like yours or the likes of you. lol

Now, go home kid, you're drunk. :D

See posts 39 and 47.

Do what you want, I don't care.
People can make up their own minds what they want as long as the information presented to them is correct.
I'm making sure the information is correct / not blatantly wrong.

Thanks Razor, I agree.
 
It went beyond the language barrier.
He stated something as fact and was quite denigrating, with the emoticon and saying it was obvious, when his starting word was "maybe".
Then he admitted he is an AMD fanboy, which spelled out why he did it.
And I proved what he said was incorrect anyway.

I am not against AMD, I am against the BS that AMD fanboys keep spouting.
Of my last 6 gfx cards, 3 have been AMD.
I want AMD to recover but their image is always going to suffer when there are so many people posting rubbish.
They are doing AMD no favours because they contribute to the bad image and keep discussions of AMD's problems alive.
Ok, I must have missed that, and I apologize for my troll comments. I favor neither side myself and really only care about performance.
Edit: I do think we need to let this topic be looked into and hopefully get proof either way.
 
Ok guys, just stop it. All of you are just bantering without showing anything. Just wait for Durquavian's "proof" and go from there; I am intrigued as well and want to see this "proof".

And Durquavian, if you are asking someone to do screenshots, videos, or testing for you, make sure they use some kind of good reasoning, like scientific methodology; otherwise it's not really valid.

So: step A to step B fully explained with reasoning, similar systems (if you are going to talk about frame rates), same resolutions, all the settings listed out at default, etc. If there is a noticeable difference and the default settings are different, it would also be nice to see the results once the settings are set the same.

Did you see each post of mine, or are you just basing this off the ignorant ramblings of another poster? Because I did try for a more hands-off approach, hoping not to influence the OP and to get his initial opinion without tainting it. But other posters posted pics and that went all to hell. Then I directed others to the original poster on another thread, while mentioning the opinions of other posters in that thread, even giving the consensus I have seen reiterated over the last few years that AMD has higher IQ at driver default. I will search tonight for the proof that led to this consensus. As far as the current IQ issue goes, I have not stated it one way or the other. Those are the facts to this point.
 
Did you see each post of mine, or are you just basing this off the ignorant ramblings of another poster? Because I did try for a more hands-off approach, hoping not to influence the OP and to get his initial opinion without tainting it. But other posters posted pics and that went all to hell. Then I directed others to the original poster on another thread, while mentioning the opinions of other posters in that thread, even giving the consensus I have seen reiterated over the last few years that AMD has higher IQ at driver default. I will search tonight for the proof that led to this consensus. As far as the current IQ issue goes, I have not stated it one way or the other. Those are the facts to this point.


Yes I did, and I did not see anything even remotely reasonable as proof. All the stuff on OC's is just rambling and picking apart videos. Those changes could be attributed to many things: different methods of doing motion blur, possibly AF being off, different AO methods also change things. Keep in mind they might run the same shader, but the way the hardware renders it could be different. LOD settings... there are so many variables. Also, the guy was playing the game, if I'm not mistaken, which will also change things.

It's the same thing as taking the AMD press benchmarks as gospel; there were too many issues with those benchmarks, which is why many people called them out.
 
Yes I did, and I did not see anything even remotely reasonable as proof. All the stuff on OC's is just rambling and picking apart videos. Those changes could be attributed to many things: different methods of doing motion blur, possibly AF being off, different AO methods also change things. Keep in mind they might run the same shader, but the way the hardware renders it could be different. LOD settings... there are so many variables. Also, the guy was playing the game, if I'm not mistaken, which will also change things.

It's the same thing as taking the AMD press benchmarks as gospel; there were too many issues with those benchmarks, which is why many people called them out.

You seem to have missed the part where I was interested in proof, and the fact that I never asserted it one way or the other. I tried my best to stay neutral and assert that we need this looked into, at the least. That's all.
 
NVidia has been known to cheat aggressively in benchmarks for a very long time:
http://www.tomshardware.com/forum/78088-33-nvidia-caught-cheating
http://www.tomshardware.com/forum/191912-33-nvidia-cheating-again-precision
http://www.geek.com/games/futuremark-confirms-nvidia-is-cheating-in-benchmark-553361/
http://forums.anandtech.com/showthread.php?t=2106925&goto=nextoldest

To be fair, as NVidia continued to ramp up this behavior, AMD eventually caved and wound up essentially doing the same thing. So neither one is exactly a saint. On the other hand, unlike NVidia, AMD actually gave end users a choice to either enable “optimizations” or disable them.

All of this dates back pretty far – instead of “cheating”, NVidia has replaced this same type of behavior with “GameWorks”. Instead of dropping image quality, they crush the ever-living $hit out of your tessellation engine for no particular reason at all… (No, 64x tessellation on pretty much anything does not improve image quality.)

Going back even further, there was some consideration around overall 2D quality as well:
http://arstechnica.com/civis/viewtopic.php?f=6&t=993608

Enough on the image quality stuff though – I believe the concern was, “is there any data to suggest that NVidia has ever done something to impair image quality for the sake of performance?” The answer is yes; in the past this has happened (both for 3D gaming as well as 2D desktop performance). Is it still relevant today? Probably not.

Regarding generalized discussion in here – fanboys, trolls, etc. I find that NVidia trolling substantially outweighs AMD fanboyism. In fact, it is nearly impossible to have a rational discussion with anyone regarding AMD products, for reasons that I just can’t fathom. For example, Fury X is not a bad card. It is an expensive card for the current performance. It was released with unoptimized drivers and arguably rushed to market. But it plays games and plays them well.

I’ve been gaming for a long time, and I’ve always been fond of AMD/ATI. But I’ve run NVidia cards as well, with minimal complaints. After a while I’ve just decided that I’d rather support a company with marginally better corporate ethics regarding pushing gaming forward, and NVidia is not such a company in my eyes. I’d rather pay AMD a premium, even with 10% less performance, than reward what I perceive as bad behavior by NVidia. But that’s just a personal preference – I won’t lose my mind if NVidia knocks it out of the park with Pascal – I’ll just have to weigh how much it means to support a company whose motives I find suspect.
 
Ok, I must have missed that, and I apologize for my troll comments. I favor neither side myself and really only care about performance.
Edit: I do think we need to let this topic be looked into and hopefully get proof either way.

No problem, it's tricky to pick apart what happened.
Agreed, let's see what he comes up with.
 
Ok, so let's start this long drawn-out info-thon.

I will start with my posts.

http://hardforum.com/showpost.php?p=1041707383&postcount=14

Here I asked the OP to run some benches paying attention to IQ. I did not state which vendor was said to have an issue. I also acknowledged that this could all be nothing.

A few posts later, pics of the issue got posted, so there went my hope for an uninfluenced opinion. A number of posters asked what settings he used, which he never posted on Overclockers.uk, so I posted this in its entirety:

http://hardforum.com/showpost.php?p=1041708065&postcount=24

Go to overclockers.net, not UK; the guy posts there about settings. Everything is at default, except on the Fury he changes color or something. He made no tweaks in NVCP or CCC to alter any in-game setting.

I asked earlier for the OP to look and see if he saw any difference in image quality. I did not tell him what to look for or which vendor. Unfortunately that science experiment is now ruined. One must use default settings to prove any hanky-panky with default drivers. Adjusting settings in CCC or NVCP will invalidate the issue.

Apparently it is well known AMD has better IQ in games. I assume that means at default, whereas adjusting can make them equal.


Ok, so you see I instructed those questioning the settings to go to Overclockers.NET, not .uk; that is where he was posting settings. I even mention the testing I wanted the OP to do and how it was ruined by the pics. In the last paragraph I mention IQ and AMD APPARENTLY having the better of it. I give my opinion that the statement refers to driver defaults and that ADJUSTING settings probably makes them equal.

But seeing how I must prove every statement, and in this case prove that it is a common belief and how it became one, here we go:

http://techreport.com/news/14032/amd-nvidia-image-quality-compared

This is an article stating the commonly held belief that ATI had better IQ than Nvidia, presumably again at driver default, and how forthcoming new standards in Nvidia features had closed that gap. But the interesting part is this:

All in all, users' preferences were mixed. AMD had a slight lead overall, with eight votes to six for gaming, seven to five for HD video watching, and six to four for photo viewing. Nonetheless, some users said they could see no differences between the two systems, and others mistakenly stated a preference for one particular system when they were shown two SLI rigs or two CrossFire rigs.

Again this speaks to the notion some people have about IQ levels between the manufacturers. Of course this goes both ways at times, here with Nvidia claiming AMD was doing it:

http://hexus.net/tech/news/graphics/27717-amd-responds-nvidias-image-quality-complaints/

Posters there linked this site for a review; they compared AMD/ATI and Nvidia, found little issue, and when they did find something they admitted they were nitpicking to the Nth degree.

http://www.guru3d.com/articles-pages/radeon-hd-6850-6870-review,9.html

Of course you can go back to 2003, when Nvidia was turning down IQ for better benchmark scores. Not gonna link those, easy enough to find. At any rate, it is easy to see why "Apparently it is well known AMD has better IQ in games", and at least my assertion that "I assume that means at default, whereas adjusting can make them equal" was in fact pretty close to how it works when there is an issue. But just in case I need more:

ATI has had the very slight IQ superiority, for the most part of like... a decade now

I did not state it as fact, and even commented that it may be related to only a specific situation. Therefore, on that particular comment I am in the right and well within forum principles governing conduct. So let's move on.

So after that post, where I explicitly pointed out, as I did above, DRIVER DEFAULT, I get the usual "you can just adjust settings". Again, the issue was driver defaults, like all the other times ATI/AMD or Nvidia tweaked their default drivers for a performance increase in benchmarks. So tweaking driver defaults invalidates the argument for users, but does not change what happens in reviews, which was the original concern. Now keep in mind that even up to this point I have not stated that there is an actual problem, only that I am interested to see if it is possible/true, as you see here:

http://hardforum.com/showpost.php?p=1041708110&postcount=29

Hence why I originally didn't give any info in this thread, so we could see what the OP could find without influence. I am not saying it is definitely true; it warrants a lot more looking into. If it is the case, then you can see the issue when we get reviews; they generally only work with game settings, not driver-level settings.

From here, the next few posts were to hopefully make others understand the simple point, and that the assertions they were making were not my intended point. I was accused of lying through my teeth and being evasive. So I posted an abridged version:

http://hardforum.com/showpost.php?p=1041708157&postcount=34

Figures: you make the claim about me but don't link any of my posts as proof, and still act as if I asserted facts. So here you go, since you can't be bothered with honor and respect.

1ST post here was agnostic and gave no info about what was being purported. FACT

2ND post here was to direct others to the forum where the original tester posted his settings and what he did. I even reported what he said he did, and my disdain for previous posters here ruining a good chance at getting some proof of the possibility by influencing the OP with what to look for.

NEXT few posts were trying to speak to what others have said, neither agreeing nor disagreeing with them. Calling for a more in-depth, honest look into the situation. Also explaining that if there is an issue, testing must be done with driver settings at default, using only in-game settings. This is about as fair as it gets.

Arguing that no one runs default driver settings doesn't change whether the original issue is real or not. And seeing how nearly every review runs default driver settings, it becomes a bigger issue. Given the time constraints on most reviews, I can see how they might not notice it.

Again ALL THIS IS SPECULATIVE AND IN NO WAY AT THIS TIME INCONTROVERTIBLE PROOF OF DISHONEST ACTIVITY BUT RATHER A CALL FOR MORE INFORMATION AND HONEST EVALUATION.

Does that help?


Not sure how much clearer I could have made it. Well, it keeps going on from there.

Be forewarned: those that troll, or try to keep whatever truth you have posted from being taken seriously, always do a couple of things:

1: Post only parts of your post, generally out of context.

2: Claim you must prove every statement, even observations, yet never pony up proof themselves. In this case: prove that neither company ever tweaked driver defaults to bolster their standing; I already posted proof/accusations against both sides.

3: Make claims of victory when they never proved anything themselves. Absence of truth is not truth.

Each of these three is seen here in this thread. But just in case it is a case of "FAILURE TO COMMUNICATE", I advise: when unsure, ASK QUESTIONS. It makes for a much less hostile environment than assuming, followed by the whole circle jerk to get to any real point.
 
You had only 2 points to address.
I asked them in post 35; this is now post 68!!
To help you I'll post the same here again.

Post 24
Apparently it is well known AMD has better IQ in games.
Provide proof of this.

Post 27
You stressed that it must be at default settings and are doing it again in the above post. [post 34]
Provide proof that default settings are needed to expose the problems.
 
Apparently it is well known AMD has better IQ in games.

This definitely used to be true for ATI. I recall back in the 9700/9500 days, I could definitely see a difference, and I always preferred ATI. Both companies were fooling around with optimizations, and still are of course. I preferred what ATI was doing over whatever the hell nVidia was doing. I doubt this is still true.

I say that having owned many nVidia cards.
 
You had only 2 points to address.
I asked them in post 35; this is now post 68!!
To help you I'll post the same here again.

Post 24
Apparently it is well known AMD has better IQ in games.
Provide proof of this.

Post 27
You stressed that it must be at default settings and are doing it again in the above post. [post 34]
Provide proof that default settings are needed to expose the problems.

Man, you are either trying to be a pain or are just thick and incapable of understanding.

Default is necessary because that is what reviewers use to review cards, especially when comparing cards. Adjusting anything in the drivers invalidates the claims here, but not for reviewers, since they generally don't adjust driver settings. As in the links I posted above, this is what Nvidia did in their drivers to win benchmarks in 2003 (fact), and what was alleged against AMD in 2010. The current assertion is akin to that, though not as extreme.

Any more explanation needed?
 
Gregster did a new video where he left everything at default values in CCC and NVCP and didn't use any color corrections etc.:

https://youtu.be/zC-VpCnexkU

I'd say there's a pretty big difference, meaning AMD default settings have better IQ compared to Nvidia, at least in BF4. He also made a vid where he compares the "Let the application decide" and "Use my preference emphasising quality" settings in the NVCP.

https://youtu.be/a2IIM9fncqc

It seems like AF doesn't work as intended when using default NVCP settings.
 
I've always forced AF, as some games don't even let you adjust it using the in-game settings. It has a pretty dramatic impact on IQ and comes at almost no perf hit, so I don't know why 16x isn't the default by now.
 
It seems like AF doesn't work as intended when using default NVCP settings.

It's exactly how Nvidia intended :( The default setting gives 10% more performance at the expense of texture quality. Underhanded tactics to boost fps.
 
What about perception - did it feel as comparable as the graphs show? It'll be interesting to see how they perform in crossfire as drivers improve.

Tweaktown has a good crossfire review. Somehow the Fury X dominates. A single Fury X usually loses to a 980 Ti, but Fury X in CrossFire is neck and neck with Titan X in SLI!

Yeah, I know, it doesn't make sense. Seems there is much more to VRAM capacity than meets the eye. At 4K, the Fury X was DOMINATING. HBM magic for you.
 
Tweaktown has a good crossfire review. Somehow the Fury X dominates. A single Fury X usually loses to a 980 Ti, but Fury X in CrossFire is neck and neck with Titan X in SLI!

Yeah, I know, it doesn't make sense. Seems there is much more to VRAM capacity than meets the eye. At 4K, the Fury X was DOMINATING. HBM magic for you.


It's not HBM, it's the way CrossFire and SLI work: assets are duplicated across the boards, but the buffers used for AA and AF aren't, and render buffers aren't doubled either.

Now, the minimum frame rates of Fury X CrossFire suck when we look at games that we know use more than 4GB of VRAM. That isn't good; it could be attributed to the 4GB of VRAM, but we can't know without more investigation than what was provided.

So many times it seems like people don't actually know how their graphics cards work in different modes, settings, etc. I just don't understand how a person who pays top dollar for high-end cards doesn't know the basics.
 
If you just base how a card performs in a game off an average framerate, your method is flawed to begin with. Average framerate does not take into account framerate consistency or smoothness; it doesn't reflect stutters, pauses, or frametimes. These things cannot be shown in bar graphs.

When looking at results, you also need to take the game settings into consideration. You can easily lessen the dependency on VRAM capacity artificially by using lower settings in a game at 4K versus high settings, or turning on MSAA to stress VRAM. You can also do the opposite and kill a video card's framebuffer by overusing settings, like 8X MSAA at 4K.

On IQ, both NV and AMD look similar. You cannot judge image quality from YouTube videos; YouTube compresses videos poorly, and videos will look different depending on many things. Only uncompressed source screenshots can be compared. In my experience, AMD and NV look the same when they are at the same settings; neither one stands out over the other.

/thread
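To put numbers on that: one way to see the stutter a bar graph hides is to look at frametime percentiles instead of average FPS. Below is a minimal sketch, assuming a plain-text log with one frame time in milliseconds per line (the kind of data FRAPS-style tools export); the filename is a hypothetical example.

```python
# Minimal sketch: why average FPS hides stutter.
# Assumes a plain-text log with one frame time (ms) per line;
# "frametimes.csv" is a hypothetical example file.
import numpy as np

frametimes_ms = np.loadtxt("frametimes.csv")

avg_fps = 1000.0 / frametimes_ms.mean()    # what bar graphs report
p99_ms = np.percentile(frametimes_ms, 99)  # the worst 1% of frames
p99_fps = 1000.0 / p99_ms                  # effective FPS during those frames

print(f"average FPS:           {avg_fps:.1f}")
print(f"99th percentile frame: {p99_ms:.1f} ms ({p99_fps:.1f} FPS)")

# Two runs can share the same average FPS while one hitches badly;
# a spiky 99th percentile is what you actually feel as stutter.
```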
 
Driver defaults on all cards, every brand, all the time, we do not change or touch anything in the driver control panel prior to testing.

Huh, that's shocking... so both AMD and Nvidia run lower IQ if you don't change the options from their defaults.

Very shocked you guys don't change them.
 