AMD Radeon R9 290X CrossFire Video Card Review @ [H]

I am working on getting a set of 290X reference cards sent to me so I can set them up in CrossFire in my personal system and see how they compare to TITAN SLI in terms of heat and noise. Hopefully I'll get those in next week. We will see what happens.
 
Looks like you don't go CrossFire on those w/o WC.
I don't know about you guys but I can't concentrate on gaming if it gets too hot.
Damn CA ;/
 
Kyle, I just sold one of my GTX 780 cards and was thinking of dumping my other two and going back to AMD TriFire. Do you think the heat will make it impossible unless I'm cooling them with water? I had all three of my 780s at 1150/1700 on stock voltage with zero heat issues. I wonder how TriFire scaling is?

I would not suggest TriFire with the reference coolers. I think ASUS, GBT, and MSI are going to have some solid solutions though.

Looks like you don't go CrossFire on those w/o WC.
I don't know about you guys but I can't concentrate on gaming if it gets too hot.
Damn CA ;/

Well, water cooling does not change the heat produced.....just how quickly you can remove the heat from the die.
 
All I can say is I just got two of these cards this week and I am anxious to Crossfire them.

I have several concerns.

One, the system this will go into currently has an AX 850 PSU. I'm not sure that will be enough to carry the load.

Two.....these cards are red hot on the bench, I can't imagine what the heat will be like once placed in a case. I'm considering doing nothing with Crossfire until Heatkiller comes out with their blocks for these.

Last, I've had one at a time on an open bench, looping the Heaven benchmark.....the fan isn't that loud. The 6900 series was loud, leafblower loud.......these don't qualify.

Honestly. It's the heat and power requirements that bug me. These cards are GTX 480 hot.......maybe hotter.
 
290X CrossFire with a [email protected] will probably push that 850 watt PSU relatively hard if you run something that puts the system under full load. OC the 290X cards and you are done for sure.
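For a rough sanity check on that, here's a back-of-the-envelope power budget. This is only a sketch; the per-component wattages are assumptions for illustration, not measured figures from this thread or the review.

```python
# Rough PSU headroom estimate for 290X CrossFire on an 850W unit.
# Every wattage below is an assumed/typical figure, not a measurement.

def psu_headroom(psu_watts, loads):
    """Return remaining headroom in watts after summing the estimated loads."""
    total = sum(loads.values())
    print(f"Estimated draw: {total:.0f}W of {psu_watts:.0f}W "
          f"({100 * total / psu_watts:.0f}% load)")
    return psu_watts - total

loads = {
    "R9 290X #1, Uber mode (assumed)": 300,
    "R9 290X #2, Uber mode (assumed)": 300,
    "heavily overclocked CPU (assumed)": 150,
    "board, RAM, drives, fans (assumed)": 75,
}

headroom = psu_headroom(850, loads)
# ~825W estimated: already near the 850W rating, so overclocking both GPUs
# (easily another 50-100W) would push the PSU past its continuous rating.
```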
 
290X CrossFire with a [email protected] will probably push that 850 watt PSU relatively hard if you run something that puts the system under full load. OC the 290X cards and you are done for sure.

Yes.

Also, in my brief time with the 290s, they don't overclock for shit.

The best I can squeeze out of my MSI card is 1100, my XFX won't even do 1075.

There's no voltage control on these things. All you can do is push the fan and power sliders in the CCC and up the clocks using Afterburner.
It's not heat, because the fans scale and the temp never went over 93C.

Maybe with water and an overall lower temp I can squeeze a bit more.:D
 
I have two reference 290X cards in a CM 690 Advanced case and they run just fine - temps under 70 degrees under load...granted it sounds like a super leaf blower, but I game with headphones. It's loud...but it's not that bad. Everyone is over-blowing it, IMO.

If you want quieter and cooler, you know where to go.
 
I understand that, but it still means a 95C space heater in my room... I'd prefer to see something closer to 85C.
GPU temperature has nothing to do with how much heat gets dumped into your room/house.

It's TDP that matters for that, and the R9 290X will dump about 60W more into your room than a 780, roughly the equivalent of a 60W light bulb's worth of extra heat.

A practical example: a GPU with a 40W TDP could get to 95C due to a crappy HSF + poor ventilation, but that GPU will still only dump 40W worth of heat into your room/house.

If you want to get a better cooler or switch to water cooling you can drop GPU temps but total heat being dumped into your room/house will not change. If you want to keep your room/house cooler you have to move the heat outside of your room/house which isn't easy or always practical to do but isn't impossible with water cooling.
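To put numbers on that: the heat dumped into the room is simply the power the card draws, independent of die temperature. A minimal sketch of the arithmetic; the board-power figures are assumptions for illustration, not values taken from the review.

```python
# Heat released into the room equals electrical power drawn, regardless of GPU temp.
# Board-power numbers below are assumptions for illustration only.

WATTS_TO_BTU_PER_HR = 3.412

def room_heat_btu_per_hr(board_power_watts):
    """Everything a GPU draws ends up as heat in the room, here in BTU/hr."""
    return board_power_watts * WATTS_TO_BTU_PER_HR

r9_290x = 290   # assumed board power under load, watts
gtx_780 = 230   # assumed board power under load, watts

extra = r9_290x - gtx_780
print(f"290X adds roughly {extra}W over a 780 "
      f"(~{room_heat_btu_per_hr(extra):.0f} BTU/hr), about one 60W bulb's worth.")
# A 40W-TDP GPU at 95C and a 40W-TDP GPU at 60C both dump 40W into the room;
# better cooling changes the die temperature, not the total heat released.
```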
 
It's not just about the silicon.
The high heat affects other components and will cause more wear to solder joints.
Heat/cooling cycling is what affects solder joints/BGA balls, not total temp, so long as that temp is well below the melting point of the solder, which at 95C the R9 290X will be.

Most likely AMD has set a minimum temp with these cards, just like nV has done with some of theirs, to prevent the solder from developing cracks and wearing out the card prematurely.
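For anyone curious how reliability engineers usually quantify this, the common Norris-Landzberg/Coffin-Manson style models put most of the weight on the temperature swing per cycle, with the peak temperature entering only through a weaker Arrhenius term. A sketch follows; the constants are typical published values for SnPb solder and are assumptions here, since the actual solder alloy and package parameters of the 290X are not public.

```python
# Norris-Landzberg style acceleration-factor sketch for solder joint fatigue.
# Constants are typical published SnPb values, used purely for illustration;
# the 290X's real solder alloy and package parameters are unknown here.
import math

def norris_landzberg_af(dT_ref, dT_use, Tmax_ref_c, Tmax_use_c,
                        n=1.9, Ea_over_k=1414.0):
    """How much faster joints fatigue under the 'use' cycle vs the 'ref' cycle.
    Dominated by the swing term (dT**n); peak temp only adds an Arrhenius factor."""
    swing = (dT_use / dT_ref) ** n
    arrhenius = math.exp(Ea_over_k * (1.0 / (Tmax_ref_c + 273.15)
                                      - 1.0 / (Tmax_use_c + 273.15)))
    return swing * arrhenius

# Example: a 35C-to-95C idle/load swing versus a 35C-to-80C swing.
print(norris_landzberg_af(dT_ref=45, dT_use=60, Tmax_ref_c=80, Tmax_use_c=95))
# ~2x: the bigger swing does most of the damage; the hotter peak adds a smaller factor.
```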

Also, in my brief time with the 290s, they don't overclock for shit. The best I can squeeze out of my MSI card is 1100, my XFX won't even do 1075.
Supposedly this is due to the reference voltage regulation + lack of available overvolting on reference cards.

Hopefully we'll know for sure either way soon.
 
Wow, I'm impressed, I didn't see this coming!

If anyone has a 200x card, is Skyrim smooth now?
I read that it was never smooth on earlier AMD cards.
My brother is in the market for a new card and has just started Skyrim.

Hmm, I had no problems with Skyrim and my 6950.
The stutter problems in Skyrim were supposed to be fixed with a patch. I have never had to use the stutter remover mod for either Skyrim or Fallout 3.

I bought Fallout NV on the Halloween Steam sale and I have had no problems with that either.
I see there is a stutter remover available, but you need the Fallout Script Extender for that. I am doing a zero-mod playthrough to decide what mods I may want or need. So far, the only mod I can see myself wanting is the "No bendy RPG bullets" mod. I hate V.A.T.S. with a passion.

I think in Skyrim, I used the bolts and arrows tweaks mod that fixed the aim point and I also used the deadly headshots mod. I am more reluctant to use the deadly headshots mod for Fallout for various reasons. Most notably the much larger number of enemy projectiles flying in my direction.:p
 
Heat/cooling cycling is what affects solder joints/BGA balls, not total temp, so long as that temp is well below the melting point of the solder, which at 95C the R9 290X will be.
See post #118
 
Already did. You seem to be contradicting yourself a bit in both posts so I thought I'd add in my 2c.
 
In post 108 you say high heat will wear out the solder joints; in post 118 you talk about the heat/cooling cycle causing expansion/contraction which eventually cracks the solder.

I'd agree with post 118 in terms of technical correctness, though not as an issue the R9 290/x will suffer from, but not with 108 WRT solder life on R9 290/x video cards.

It might seem pedantic at first but I assure you "high heat" and "heat/cooling cycle" are 2 very different issues. There is also no reason to believe either of those issues will be a problem for R9 290/x cards since they seem to be doing a good job of not exceeding 95C
 
In post 108 you say high heat will wear out the solder joints
If there wasn't any heat cycling you would have a point.
Higher load temp will cause fractures in the solder joints to appear faster.

in post 118 you talk about the heat/cooling cycle causing expansion/contraction which eventually cracks the solder.
They go hand in hand.
One doesn't happen without the other.

I'd agree with post 118 in terms of technical correctness, though not as an issue the R9 290/x will suffer from, but not with 108 WRT solder life on R9 290/x video cards.
Then why call it out?
If you didn't see it, then just say so, don't try and big yourself up.

It might seem pedantic at first but I assure you "high heat" and "heat/cooling cycle" are 2 very different issues. There is also no reason to believe either of those issues will be a problem for R9 290/x cards since they seem to be doing a good job of not exceeding 95C
You are being pedantic for the sake of it.
They could be very different issues for something that remains at a near-constant temperature.
But graphics cards are a great example of something that doesn't remain at a constant temperature.
The subject at hand is a better thing to discuss :p

There is reason to believe it could be a problem, they are running at very high temp under load.
 
You're contradicting yourself ("If there wasn't any heat cycling...Higher load temp will cause fractures"), shifting goal posts ("They go hand in hand...Then why call it out?"), and clearly cynically reading into my posts with your "big yourself up" remark, so thx for showing us all how dishonest you can be I guess.

nenu said:
There is reason to believe it could be a problem, they are running at very high temp under load.
The GTX 480 ran at similar temps under load (93-95C) since that card/GPU was designed to run at those temps. "High temp" is a relative term that means nothing in and of itself and must be applied on a case-by-case basis for it to have any meaning.

IOW a temp of 95C that might be far too high for one video card might be perfectly fine for the R9 290X if AMD designed the card to run at those temps.

So far as anyone knows they did. For all we know 110C+ might be "high temps" for the R9 290/x. I'm not aware of any official statement regarding max temp before the card or GPU starts to get damaged. All we know for sure is AMD made it so the GPU tries to stay at or under 95C.

Unless you've got a leak to some secret insider docs that show AMD/other AIB vendors know there is a reliability issue with the R9 290/x due to heat you're pretty much just doing some scare mongering while also being caught out at being misleading with technical issues.
 
You're contradicting yourself ("If there wasn't any heat cycling...Higher load temp will cause fractures"), shifting goal posts ("They go hand in hand...Then why call it out?"), and clearly cynically reading into my posts with your "big yourself up" remark, so thx for showing us all how dishonest you can be I guess.
I'll spell it out for you.
If there wasn't any heat cycling, you would have a point.
ie if the temperature was high and remained constant, then my statement would not have been correct.
However, temps don't remain constant, they cycle.
So your initial comment wasn't warranted, especially as you claimed to have already seen my second post, which you acknowledged was correct.

The GTX 480 ran at similar temps under load (93-95C) since that card/GPU was designed to run at those temps. "High temp" is a relative term that means nothing in and of itself and must be applied on a case-by-case basis for it to have any meaning.
A quick search on google reveals it is not insignificant.
https://www.google.co.uk/search?q=gtx480+oven+trick&btnG=Search&gbv=1
And not just for the highest temp cards.
The success rate of oven baking is quite high.
It is fair to say that many card failures are down to fractured solder joints.

IOW a temp of 95C that might be far too high for one video card might be perfectly fine for the R9 290X if AMD designed the card to run at those temps.

So far as anyone knows they did. For all we know 110C+ might be "high temps" for the R9 290/x. I'm not aware of any official statement regarding max temp before the card or GPU starts to get damaged. All we know for sure is AMD made it so the GPU tries to stay at or under 95C.

Unless you've got a leak to some secret insider docs that show AMD/other AIB vendors know there is a reliability issue with the R9 290/x due to heat you're pretty much just doing some scare mongering while also being caught out at being misleading with technical issues.

You will notice that I didn't say that it would definitely happen.
I'll quote the post you replied to:
There is reason to believe it could be a problem, they are running at very high temp under load.
It should be considered.
 
So your initial comment wasn't warranted, especially as you claimed to have already seen my second post, which you acknowledged was correct.
Apparently it was needed since you've now shifted goal posts from claiming high heat was going to cause solder reliability issues to now claiming it's heat cycling that is going to be the problem....due to its high temp. Which doesn't make any sense. Also I said you were technically correct WRT heat cycling being the cause of solder fractures, not that it was going to be an issue with the R9 290/x.

To concern troll successfully you have to learn how to at least be consistent and learn to strawman people a few pages down the line so as to confuse everyone. You're using both tactics far too soon, it makes you easy to pick out.

A quick search on google reveals it is not insignificant..It is fair to say that many card failures are down to fractured solder joints..
Uh no it isn't. Your Google search also shows 5850s, 8800s, and various other video cards being baked to try and fix them. You'd need a study or paper showing that 95C will cause [x] amount of failure rate with the same or similar solder, using the same or similar packaging that is used for the GPU in the R9 290/x, to have a point. I don't think anything like that publicly exists, unfortunately. The same thing goes for GTX 480 failure rates, though if you'll accept googling forum posts, there is plenty of anecdotal evidence that they were low for that card.

That some cards can get fixed with baking in an oven is certainly true, but no one has shown that heat is the initial problem with them. For all we know it's poor soldering or QC from the factory which doesn't manifest itself as a problem until after a few dozen or more heat cycles.

It should be considered.
Based on what though? Because it runs at 95C? That is only about 10C higher than my 4890s used to run at, which for solder that usually has a flow point in the hundreds of degrees matters not at all, it's nearly a rounding error!

edit: Yea but we don't know which type AMD/AIB's are using and he wants to assume a "worst case" of sorts so I'm just going with it to show how even in a "worst case" he is being overly dramatic at the very best. Even the really low temp stuff doesn't flow until you get to 150C.\/\/\/\/\/
 
Apparently it was needed since you've now shifted goal posts from claiming high heat was going to cause solder reliability issues to now claiming it's heat cycling that is going to be the problem....due to its high temp. Which doesn't make any sense. Also I said you were technically correct WRT heat cycling being the cause of solder fractures, not that it was going to be an issue with the R9 290/x.
My goal posts haven't changed at all.
Your lack of understanding is the issue, I already explained, yet it still goes over the top of your head.
You either lied about reading my second post already, or you are argumentative beyond reason.

To concern troll successfully you have to learn how to at least be consistent and learn to strawman people a few pages down the line so as to confuse everyone. You're using both tactics far too soon, it makes you easy to pick out.
I haven't trolled you once, only defended myself from you.
It's not me doing the trolling.

Uh no it isn't. Your Google search also shows 5850s, 8800s, and various other video cards being baked to try and fix them. You'd need a study or paper showing that 95C will cause [x] amount of failure rate with the same or similar solder, using the same or similar packaging that is used for the GPU in the R9 290/x, to have a point. I don't think anything like that publicly exists, unfortunately. The same thing goes for GTX 480 failure rates, though if you'll accept googling forum posts, there is plenty of anecdotal evidence that they were low for that card.
I said it needs to be considered, not that it will happen for definite.
I have shown a lot of cases where heat cycling causes fractured solder joints and that cards "designed" for higher load temps also exhibit it.

That some cards can get fixed with baking in an oven is certainly true, but no one has shown that heat is the initial problem with them. For all we know it's poor soldering or QC from the factory which doesn't manifest itself as a problem until after a few dozen or more heat cycles.
Those that use the oven trick no longer have a warranty so the chance the failure is down to a badly soldered joint or bad QC is vastly reduced.
I would have expected you to know this when coming in as the big I am.

Based on what though? Because it runs at 95C? That is only about 10C higher than my 4890s used to run at, which for solder that usually has a flow point in the hundreds of degrees matters not at all, it's nearly a rounding error!

edit: Yea but we don't know which type AMD/AIB's are using and he wants to assume a "worst case" of sorts so I'm just going with it to show how even in a "worst case" he is being overly dramatic at the very best. Even the really low temp stuff doesn't flow until you get to 150C.\/\/\/\/\/
Yes, based on heat cycling causing solder joint fracturing, and hotter temps don't improve the situation, they make it worse.
If 4890 cards didn't suffer heat-related joint fractures you would have a point.
Yet people have successfully used the oven trick on their 4890s.

You can use a solder that is less mobile at room temp, but there is only so far you can go before damage to the actual components is too great during the soldering process.
And the higher the melting point of the solder, the more stress the joint is under at room/operating temps, so while a higher-melting-point solder is more rigid at room temp, it may have to endure more stress.

It's not me being overly dramatic, I raised a point that is worth consideration.
It's you having a fit about it :)
 
95C is nothing for solder joints. The way AMD is controlling temperature makes cycling due to temperature changes even smaller, since it holds a steady temp via fan speed and throttling. It looks like those solder joints will last longer than ever if the same solder, methods, etc. are used as previously.
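As a rough illustration of that steady-temp behavior, here is a simplified control loop of the kind AMD's PowerTune-style management is described as using: ramp the fan toward its cap first, then shed clocks if the target is still exceeded. It is purely a sketch; the thresholds and step sizes are made-up values, not AMD's actual firmware logic.

```python
# Simplified "hold the die at a target temp" loop, sketching the behavior described
# for the 290X (fan ramps first, then clocks throttle). All constants are invented.

TARGET_C = 95
FAN_CAP_PCT = 55          # Uber-mode-style fan ceiling (assumed)
BASE_MHZ, BOOST_MHZ = 727, 1000

def control_step(temp_c, fan_pct, clock_mhz):
    """One iteration: spin the fan up toward its cap, then throttle clocks."""
    if temp_c > TARGET_C:
        if fan_pct < FAN_CAP_PCT:
            fan_pct = min(FAN_CAP_PCT, fan_pct + 5)       # fan first
        else:
            clock_mhz = max(BASE_MHZ, clock_mhz - 13)     # then clocks
    elif temp_c < TARGET_C - 2:
        clock_mhz = min(BOOST_MHZ, clock_mhz + 13)        # reclaim clocks when cool
    return fan_pct, clock_mhz

# Net effect: under sustained load the die sits pinned near 95C instead of swinging,
# so the big thermal cycle is idle-to-load, not moment-to-moment during gaming.
```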

Great read and once again fun to read. HardOCP is about the only site whose reviews I read every word of, while with others I end up just looking at the charts and glancing through the wording.
 
95C is fine if there isn't a large temperature difference on a regular basis.
The major temp cycling is between idle (not in use) and load (in use).
 
<snip denial of goal post shifting, trolling, "big i am",etc.>
Hahah denial and "NO YUO BAD" won't convince anyone of anything. You've gone from "heat causes GPU failure" to "heat and heat cycling go hand in hand" to "no its heat cycling" to "well it only needs to be considered, heat/heat cycling might not be a problem...maybe" in the space of a few posts.

I have shown a lot of cases where heat cycling causes fractured solder joints and that cards "designed" for higher load temps also exhibit it....Those that use the oven trick no longer have a warranty so the chance the failure is down to a badly soldered joint or bad QC is vastly reduced....Yet people have successfully used the oven trick on their 4890s.
You linked a Google search for GTX 480 oven baking, which isn't the same thing as proof of "fractured solder due to heat cycling", and many cards from different generations popped up in that search too, some of which were not known for being particularly "high temp" cards. We have no clue exactly what is going on here. My personal WAG is these issues are due to the lead-free solder that has to be used now, since years ago, before that switch occurred, the failures we're seeing today were uncommon. However that is just a WAG. No one really has a good answer. Certainly no AIBs or IHVs have bothered to give a reason publicly why this keeps happening. Oven baking works...sometimes. Sometimes not. This issue is likely lots more complex than "just" thermal stress cracking solder on the BGAs of the GPU.

You can use a solder that is less mobile at room temp, but there is only so far you can go before damage to the actual components is too great during the soldering process....95C is fine if there isn't a large temperature difference on a regular basis
Too much of a blanket statement to be useful + it assumes this is an issue that can't be engineered around. Which is silly, since there are devices out there which work fine at temps higher than 95C and are soldered, things like satellites and such. Those things have temp swings of hundreds of degrees daily and yet work for years on end. For all we know AMD had a higher-temp brazing solder used or had a custom solder paste that they know will hold up to the thermal stress. None of that is impossible or unlikely.
 
lol I love how most people just look at the performance charts and ignore the headache inducing noise comment in the article.

Umm, it just goes without saying that most people on [H] buying this card already know and couldn't care less, or have been in the hobby long enough to be used to it. In short, this is nothing new with top-end cards (GTX 295, 4870X2 anyone?). Myself and many others will either watercool or game with headphones (or both). An excellent case like the top-end Corsair cases or the FT02 would also make this moot.

GPU temperature has nothing to do with how much heat gets dumped into your room/house.

It's TDP that matters for that, and the R9 290X will dump about 60W more into your room than a 780, roughly the equivalent of a 60W light bulb's worth of extra heat.

A practical example: a GPU with a 40W TDP could get to 95C due to a crappy HSF + poor ventilation, but that GPU will still only dump 40W worth of heat into your room/house.

Excellent explanation.

This card rules this round!

Side note, funny posts this round: Mantle will cause a 150% GPU utilization increase causing heat failure, lower build quality than Nvidia, BF4 is insignificant, and now solder joints. Hahahaha, bashers are really reaching.
 
Wow, I'm impressed, I didn't see this coming!

If anyone has a 200x card, is Skyrim smooth now?
I read that it was never smooth on earlier AMD cards.
My brother is in the market for a new card and has just started Skyrim.

Skyrim is very smooth with CFX 6870s, especially once it is patched up so it actually uses multithreading properly. When it first came out it was horrid no matter what you were running.
 
Hahah denial and "NO YUO BAD" won't convince anyone of anything. You've gone from "heat causes GPU failure" to "heat and heat cycling go hand in hand" to "no its heat cycling" to "well it only needs to be considered, heat/heat cycling might not be a problem...maybe" in the space of a few posts.
It's clear reading isn't your strong point.
Add comprehension to that.

You linked a Google search for GTX 480 oven baking, which isn't the same thing as proof of "fractured solder due to heat cycling", and many cards from different generations popped up in that search too, some of which were not known for being particularly "high temp" cards.
I trust the statements of many of them over you.
It's a known issue and is widely accepted, except by you.
When lower-temp cards have the issue, it's a surefire warning that higher-temp cards should be monitored.

We have no clue exactly what is going on here. My personal WAG is these issues are due to the lead-free solder that has to be used now, since years ago, before that switch occurred, the failures we're seeing today were uncommon. However that is just a WAG. No one really has a good answer. Certainly no AIBs or IHVs have bothered to give a reason publicly why this keeps happening. Oven baking works...sometimes. Sometimes not. This issue is likely lots more complex than "just" thermal stress cracking solder on the BGAs of the GPU.
Now you admit that it is an issue :rolleyes:
Cards in times past that failed were put in the bin.
Someone decided to reflow the solder in the oven and since then we can see that a fair number of failures are due to solder joint failure.
Most of those that fail are outside of warranty, so manufacturers aren't going to care to look for or publish anything on the topic.
It benefits them that cards last the warranty and have a limited life beyond that.

Too much of a blanket statement to be useful + it assumes this is an issue that can't be engineered around. Which is silly, since there are devices out there which work fine at temps higher than 95C and are soldered, things like satellites and such. Those things have temp swings of hundreds of degrees daily and yet work for years on end. For all we know AMD had a higher-temp brazing solder used or had a custom solder paste that they know will hold up to the thermal stress. None of that is impossible or unlikely.
You profess to know what you are talking about and then equate the manufacture of graphics cards to satellites.
Satellites use more expensive components, techniques, and processes,
i.e. larger through-hole components when there is a need to avoid issues with SMD boards and components.
An example:
http://www.satnews.com/story.php?number=1048584852&menu=1
"SMD resistors run hotter than through-hole parts due to their power density, and this excess heat degrades long-term stability when operating at higher temperatures. In addition, board flex stresses may cause SMD chips to crack or delaminate from the board. In high-precision applications, the best choice is often a through-hole resistor specifically designed to provide higher resistance values and power, tighter tolerances, and better long-term stability."
Note the use of specialised components.
The cost and QA of satellite manufacture is in a different league.
 
Don't wanna be a dick about this, but the frames on nVidia look much more consistent to me.

Dude doesn't know the difference between a framerate graph and a frametime graph.

Performance on the R9 290X seems monstrous. I am convinced the R9 290X should be in my next system. Just waiting for better coolers, I can't take my room being any hotter lol, I might even consider water cooling them. I am still waiting on the 780 Ti though before making a final decision.

Lousy cooler, good cooler, or water, your room will be the same temperature. Look on the bright side, it's winter.

Kyle, I just sold one of my GTX 780 cards and was thinking of dumping my other two and going back to AMD TriFire. Do you think the heat will make it impossible unless I'm cooling them with water? I had all three of my 780s at 1150/1700 on stock voltage with zero heat issues. I wonder how TriFire scaling is?

No dude, don't tri-fire these. The first and second card will throttle so badly you'll lose any benefit from adding the third. Unless you want to go balls out and use risers. From my experience building mining rigs, there's about a 20 degree difference between running cards with a one slot gap, and using a riser with a gap of three inches. If you have a motherboard where the last PCIe slot runs at PCIe 2.0 x8 or higher, definitely use that for Crossfire instead of the middle slot. As for tri-fire, you're going to have to be on water or something ghetto with risers. Aftermarket coolers aren't going to help enough unless they come out with something that's never been seen before.
 
WOW so many noobs in this thread, rofl.

1. If these cards are low quality, how come the voltages are not locked, and the GTX 780/Titan's are? HAHAHA!

2. Have you seen a Titan/GTX 780 (reference) running at 1.25V+ on air?

Here are the parts AMD used for the reference "low quality" R9 290X :rolleyes:


PWM: The PWM controller is the IR3567B, a 6+2 phase PWM control chip,
the same one used on ASUS ROG series mobos, MSI XPower, and Gigabyte Ultra Durable boards.

IR6894 DirectFET MOSFET

www.irf.com/product-info/datasheets/data/irf6894mpbf.pdf
http://www.irf.com/whats-new/nr110201.html

 
Yes.

Also, in my brief time with the 290s, they don't overclock for shit.

The best I can squeeze out of my MSI card is 1100, my XFX won't even do 1075.

There's no voltage control on these things. All you can do is push the fan and power sliders in the CCC and up the clocks using Afterburner.
It's not heat, because the fans scale and the temp never went over 93C.

Maybe with water and an overall lower temp I can squeeze a bit more.:D

What software were you using to unlock the voltage? Afterburner doesn't have support for it yet, but the Asus GPU software seems to work just fine, even on other brands. There are guys getting well over ~1100-1175 @ stock voltage under WC over on overclock.net...

These are some badass cards, and AMD really, REALLY needs some serious kudos for the insane amount of work they have put into CrossFire...I just wish there was a way to see these kinds of gains with the 79XX series in X-Fire. I was going to pick up another two 7950s, but now I think I am going to sell my golden one (1.3GHz core) and pick up a 290 and then add a 2nd later down the road.
 
I have two reference 290X cards in a CM 690 Advanced case and they run just fine - temps under 70 degrees under load...granted it sounds like a super leaf blower,

Given this and

HardOCP said:
What happens if you push the fan beyond 55%? Then it does become very loud. At 65%-70% the two fans combined create a lot of sound in your room. Our opinion is that at "Quiet Mode" 40% the fans are quiet, at "Uber Mode" 55% the fans are audible and noticeable, and at 65-70% and higher the fans are very loud and headache inducing.

this, the noise factor is a total dealbreaker for me. Hopefully quieter solutions will emerge.


but I game with headphones. It's loud...but it's not that bad. Everyone is over-blowing it, IMO.

I see what you did there. :)
 
2. Have you seen a Titan/GTX 780 (reference) running at 1.25V+ on air?

Certainly so, just go to overclock.net and look at the unlocked voltage thread for the Titan/780 in the Nvidia section. Some are brave enough to go higher than that even, but only those with EK blocks will likely push the 1.35V+ range due to the VRMs being the weak link in the chain.
 
crossfire indeed

Wow, heat is a huge issue with this card. Sucks if you live somewhere that gets hot in the summer.
 
:eek: This is very embarrassing for NVidia. Metro: LL is TWIMTBP and they didn't get SLI to work properly?
First of all....that's 4K. I think Brent mentioned that he thought Nvidia had some driver issues at that resolution.
Lack of SLI scaling is of course a driver problem, what else should it be? (Well it could be that the game saturates the SLI link @4K, remember why AMD ditched the CF connector? But that's purely speculation on my part.) Which would be forgivable if Metro: LL was an AMD GE title that came out yesterday. But it has been out for a while, and NVidia had an extra couple of months to optimize the game and their driver for each other through the TWIMTBP program.
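For what that speculation is worth, a quick back-of-the-envelope on the bridge-bandwidth question. The frame sizes are computed from the resolution; the ~1 GB/s capacity often quoted for the legacy bridge connectors is an assumption here, not a figure from the article.

```python
# Rough estimate of the bandwidth needed to ship finished frames between GPUs in
# 2-way AFR. The legacy-bridge capacity is a commonly quoted ballpark, not a spec.

def afr_frame_traffic_gbs(width, height, fps, bytes_per_pixel=4):
    """GB/s needed to copy the slave GPU's frames (half of them) to the display GPU."""
    frame_bytes = width * height * bytes_per_pixel
    return frame_bytes * (fps / 2) / 1e9

traffic_4k60 = afr_frame_traffic_gbs(3840, 2160, 60)
print(f"4K60 AFR frame traffic: ~{traffic_4k60:.1f} GB/s")   # ~1.0 GB/s

LEGACY_BRIDGE_GBS = 1.0   # assumed ballpark for the old SLI/CF connectors
print(f"vs ~{LEGACY_BRIDGE_GBS} GB/s of assumed bridge bandwidth")
# Frame copies alone would eat essentially the whole link at 4K60, before any
# overhead, which lines up with why Hawaii's XDMA CrossFire moved frame transfers
# onto the PCIe bus instead of the CF connector.
```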
To me looking at 4K data is like talking about Ferrari. They are wicked fast but about 12 people have one.
Look, this is [H]. A site for high end gaming and enthusiasts. This site benchmarked 5760x1200 triple screens when other, lesser sites still thought that 1024x768 fps numbers were somehow relevant for the performance evaluation of a graphics card. I think that the 4K numbers are very relevant to the readers of this site.
 
Look, this is [H]. A site for high end gaming and enthusiasts. This site benchmarked 5760x1200 triple screens when other, lesser sites still thought that 1024x768 fps numbers were somehow relevant for the performance evaluation of a graphics card. I think that the 4K numbers are very relevant to the readers of this site.

I would think that anyone watercooling does it for sound and power.
Eyefinity is common, and even if 4K is a few years off, the sad truth is that at lower resolutions we won't see much difference between the 7970 and upcoming cards.
As a user buying a card, I at least would like to see a noticeable difference, not use a benchmark to say - oh shiny, faster :)
 
Do you really think that 4K is still a few years off? Netflix and YouTube are already running 4K trials, something which they wouldn't do if nobody was there to watch it.

I expect that at CES 2014 we will see an explosion of 4K offerings and prices to come into a range which PC enthusiasts can afford.
 
4K results in favor of the 290X mean extreme mods and crazy AA (SSAA) settings in games, with great performance even at lower resolutions, so Nvidia fanboys can whine as long as they want; IQ enthusiasts welcome this, even on single 1080/1200p screens.

It's overkill and you're an idiot for 1080p and similar?

OK, let's look at a 5-year-old game with SSAA.

[attached image: 53386.png]
 