AMD Radeon R9 295X2 Video Card Review @ [H]

The most interesting thing to note about this card is that AMD is putting more effort into the build quality and cooling solutions of their GPUs.
I hope this trend continues!
 
Nice:
R9295X2-2-88.jpg



That said, I'm pretty much forced to stay with Nvidia due to their grossly superior performance in World of Warcraft and better thermals.

Seriously?
 
The most interesting thing to note about this card is that AMD is putting more effort into the build quality and cooling solutions of their GPUs.
I hope this trend continues!

At a premium of hundreds of dollars, of course.
 
What's the exact source on this?
I see it's from HardwareCanucks.

At a premium of hundreds of dollars, of course.
I would pay significantly more for a video card that came stock with an AIO watercooler, considering the costs/effort of installing one myself.
Maybe up to $100. Probably not much more.
 
I think what surprised me the most is that its highest temp is 70C. I have triple R9s watercooled and they don't go past 60C.

For me, I'm just debating whether I should do a single 4k monitor vs triple 144hz monitors.
 
I think what surprised me the most is that its highest temp is 70C. I have triple R9s watercooled and they don't go past 60C.

For me, I'm just debating whether I should do a single 4k monitor vs triple 144hz monitors.

Why would that surprise you? I'm betting you have more than one fan and more than 120mm worth of rad space, not to mention full cover waterblocks.
 
Are the tubes long enough to cope with the cooler being at the top of the PC? I'm wondering if only 380mm might be a significant flaw.

Imagine you want a small, stylish, and powerful PC, so you go for a MATX system based around the Aerocool Dead Silence case and 2x 295x2. You use an AIO cooler for the CPU, connected to the rear fan mounting. At 380mm, are the tubes long enough to have two of these cards with their coolers mounted in the top? Without stretching, that is; I don't think they're long enough to be able to be tidied.
 
Really? And put them under water? I doubt it, lol. 500 bucks apiece even second hand, considering the lack of 550 dollar ones.

Buying AIO coolers and getting brackets wouldn't cost too much more. I'm just saying you could buy 4 reference 290X GPUs for $1500 without a huge effort (I'm not suggesting that someone should buy 4 of these and stick them into the same case, lol)... this is an AIO solution trying to keep two cards cool. Dual 290X with full-cover blocks would be far better than an AIO solution and way cheaper.
 
Buying AIO coolers and getting brackets wouldn't cost too much more. I'm just saying you could buy 4 reference 290X GPUs for $1500 without a huge effort (I'm not suggesting that someone should buy 4 of these and stick them into the same case, lol)... this is an AIO solution trying to keep two cards cool. Dual 290X with full-cover blocks would be far better than an AIO solution and way cheaper.
Two reference R9 290X boards: $550 each, or $1100.

Two full cover waterblocks: $250

So you are at $1350, and that's if you already have pumps, res, radiators, etc in your system.

I think the board is a little overpriced at 1500, but I don't think it's entirely unreasonable for what they are delivering. Certainly if you have the technical expertise the DIY approach is cheaper, but I get the feeling they are targeting users who don't want to do that with this product.
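For what it's worth, the arithmetic above checks out; here's a quick tally as a sketch, using only the prices quoted in this thread (pump, reservoir, and radiators excluded, as noted):

```python
# DIY cost of two R9 290X cards under full-cover waterblocks,
# using the prices quoted in the post above.
cards = 2 * 550        # two reference R9 290X boards at $550 each
blocks = 250           # two full-cover waterblocks, total
diy_total = cards + blocks
print(diy_total)       # 1350

# Difference versus the $1500 295X2 (before pump/res/radiators).
savings_vs_295x2 = 1500 - diy_total
print(savings_vs_295x2)  # 150
```

So the DIY route only saves about $150 over the 295X2 unless you already own the rest of the loop, which supports the point that the board's price isn't entirely unreasonable.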
 
I'm interested in what difference these monster cards make in ARMA 2, DayZ, and ARMA 3.

Anyone tested it on any of these?
 
I have an ancient, passive Scythe Ninja rev 1 CPU cooler.

I would upgrade to this in a heartbeat if I was in the market for it - costs me less than installing a full watercooling rig on standard cards and shuts up the only thing that makes noise in my rig.

Less hassle, a warranty, and reasonable cooling performance. Worst comes to worst (out of warranty), I'd chuck a 2nd radiator and a reservoir on the loop.
 
Buying AIO coolers and getting brackets wouldn't cost too much more. I'm just saying you could buy 4 reference 290X GPUs for $1500 without a huge effort (I'm not suggesting that someone should buy 4 of these and stick them into the same case, lol)... this is an AIO solution trying to keep two cards cool. Dual 290X with full-cover blocks would be far better than an AIO solution and way cheaper.

Please tell me where I can get 290xs for ~350 dollars.
 
200W more than 780ti SLI @ full load? Jesus, AMD is sure going to make you pay for that extra performance, aren't they?

Absolutely unimpressed by their performance with this part. Get those power consumption figures under control and we'll talk. Until they can compete on a performance-per-watt basis they simply aren't competing.

YAWN.
Are you installing your video cards in a tablet? No? Then who gives a shit?

SoftOCP is that way >>>>>
 
^Yeah!
What's the point of having a 1200 watt PSU if you can't use it all?
 
This test should have been performed with a 1440p or 1600p surround setup, as well as with Titans, which have an equal amount of RAM.

I.e., surround 1600p = 4800x2560, or 1440p surround; this is better than a 1200p surround setup and more likely what someone spending this much on a GPU is running.

Hopefully HardOCP can get their benches in order, since this was a very unrealistic setup for someone who is spending this much on a card.

EDIT: Funny, since here I am complaining that they were not high enough and people are complaining they are not low enough. Good job on the 4K, Kyle; you guys do great work. Guess I am in the minority, since I don't feel surround 1080p/1200p shows any difference.
 
This test should have been performed with a 1440p or 1600p surround setup, as well as with Titans, which have an equal amount of RAM.

I.e., surround 1600p = 4800x2560, or 1440p surround; this is better than a 1200p surround setup and more likely what someone spending this much on a GPU is running.

Hopefully HardOCP can get their benches in order, since this was a very unrealistic setup for someone who is spending this much on a card.

EDIT: Funny, since here I am complaining that they were not high enough and people are complaining they are not low enough. Good job on the 4K, Kyle; you guys do great work. Guess I am in the minority, since I don't feel surround 1080p/1200p shows any difference.
You do realize the monitor they used in testing currently costs around $3,000 US, right?
 
Are you installing your video cards in a tablet? No? Then who gives a shit?

SoftOCP is that way >>>>>

Some people will find anything to argue about. Even if these cards did match the TDP of SLI'd GTX 780 Tis, the same people would then be shouting how this card doesn't have PhysX, doesn't have G-Sync... And let's not forget about AMD's shitty drivers!
 
Some people will find anything to argue about. Even if these cards did match the TDP of SLI'd GTX 780 Tis, the same people would then be shouting how this card doesn't have PhysX, doesn't have G-Sync... And let's not forget about AMD's shitty drivers!

No Adaptive VSync either, which I find huge. Not that I'm in the market for these cards, but on my GTX 660 Ti, I love it.
 
Please tell me where I can get 290xs for ~350 dollars.

They were available for ~$375 on the forums and from eBay; here are a couple of examples:

http://www.ebay.com/itm/Used-Radeon-r9-290x-4gb-Great-Condition-Not-Over-Clocked-/400691488447?pt=PCC_Video_TV_Cards&hash=item5d4b12e6bf#ht_82wt_1063

http://www.ebay.com/itm/SAPPHIRE-Radeon-R9-290X-4GB-GDDR5-Video-Card-/181376460465?pt=PCC_Video_TV_Cards&hash=item2a3ae126b1#ht_371wt_1063

There were a few on the for sale/trade forum for $380 but looks like they all sold. I bought one on the forums for $380 last week.
 
As an eBay Associate, HardForum may earn from qualifying purchases.
They were available for ~$375 on the forums and from eBay; here are a couple of examples:

http://www.ebay.com/itm/Used-Radeon-r9-290x-4gb-Great-Condition-Not-Over-Clocked-/400691488447?pt=PCC_Video_TV_Cards&hash=item5d4b12e6bf#ht_82wt_1063

http://www.ebay.com/itm/SAPPHIRE-Radeon-R9-290X-4GB-GDDR5-Video-Card-/181376460465?pt=PCC_Video_TV_Cards&hash=item2a3ae126b1#ht_371wt_1063

There were a few on the for sale/trade forum for $380 but looks like they all sold. I bought one on the forums for $380 last week.

I'm surprised. Is the mining craze coming to an end? I mean even if mining had never happened 200 bucks off the retail price after 5 months sounds like a lot to me (400 bucks maybe, and that's presuming the mining craze never happened).

Either way, putting 4 (second hand) 290xs under water is still a costlier endeavour (480 bucks in blocks, not to mention rads, fans, pumps etc) than this 1500 dollar 295x2.
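Rough math behind that comparison, using the ~$375 used-card price from the links above (the $120-per-block figure is an assumption chosen to match the quoted $480 total for blocks):

```python
# Four second-hand 290X cards under full-cover blocks vs one $1500 295X2.
# $375/card is from the eBay links above; $120/block is an assumed figure
# matching the ~$480 total for blocks mentioned in the post.
used_cards = 4 * 375       # 1500
blocks = 4 * 120           # 480
diy_total = used_cards + blocks
print(diy_total)           # 1980, before rads, fans, and pumps

extra_cost = diy_total - 1500
print(extra_cost)          # 480 more than the 295X2's asking price
```

Which is the point: even at used prices, four blocked 290Xs come in well above the 295X2 before you've bought any of the rest of the loop.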
 
Scrypt mining is coming down as cooling costs increase through the summer, and for various other reasons, yes.
 
You do realize the monitor they used in testing currently costs around $3,000 US, right?

Yes, of course I do, but doing Eyefinity with 1080p monitors makes no sense.

If you are going to do Eyefinity or Nvidia Surround, put it on a 1440p or 1600p surround setup.

Baasha on OCN is benching with 3 Dell 4K monitors right now with 2- vs 3- vs 4-way SLI Titans.

I would find it interesting to see how everything compares, since I am on surround 1600p with tri-SLI Titans. In games like Titanfall (yes, even with the latest patch) it does not even take advantage of the other cards. I personally am contemplating moving to 4K, but the GPUs are not there yet.
 
Looks amazing! I can't justify buying it unfortunately. It's catered to a tiny niche and selling this on the used market is going to be difficult. Buying separate cards (less powerful) in SLI/Crossfire mode would be better for me.
 
Whoever put the DVI port on this card should be shot. It means you can't do 5-way 4K Eyefinity. :)
 
With the GeForce GTX 780 Ti we found the peak consistent clock speed on both GPUs went up to 1019MHz while gaming. This is higher than the boost clock on a GTX 780 Ti which is 928MHz. As we posted on the previous page, this seems slightly higher than we've tested in the past. Normally we've seen the GPU hit 1006MHz while gaming, but now it is at 1019MHz with this newest driver. We also noticed the temperature of the GPU was higher, at 87c, versus 84c on previous drivers. This higher temperature threshold has allowed the frequency to go higher, hence the 1019MHz.

Where is the outcry over this? (Imagine if this were AMD.) Will Nvidia reduce the clocks with the next driver and raise them again when it feels it needs to compete? And why are more sites not testing this?
The frame times have also gotten worse because of this overclocking, and so have the temps, noise, and power. Nvidia seems to be moving the goalposts again by increasing clock speeds, and nobody has questioned this new tactic.

Well done HardOCP, you guys rock for catching what no other site has. The only site that says it like it is without worrying about what AMD and Nvidia will say.
 
This is legit.
Nvidia can only test in so many scenarios, and remember, they don't make boards of their own besides some reference boards for in-house use. So they do some testing, set a good boost number for launch, and wait.
As the drivers mature and get more efficient, the amount of heat produced by the boards decreases, allowing additional thermal overhead that they can make use of. The data they gain from the millions of sold GPUs lets them make these kinds of tweaks, which would be impossible with the small sample set of in-house boards.

But at the end of the day, what difference is 13MHz (1.3%) going to make in terms of real performance?
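As a quick sanity check on that percentage, using the clock figures from the quoted review text:

```python
# How big is the bump from the previously observed 1006MHz
# in-game clock to the 1019MHz seen on the newest driver?
old_clock = 1006  # MHz, observed on previous drivers
new_clock = 1019  # MHz, observed with the newest driver
pct = (new_clock - old_clock) / old_clock * 100
print(round(pct, 2))  # 1.29
```

So yes, roughly the 1.3% figure, which is small enough that it's hard to see it mattering outside of close benchmark comparisons.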
 
This is legit.
Nvidia can only test in so many scenarios, and remember, they don't make boards of their own besides some reference boards for in-house use. So they do some testing, set a good boost number for launch, and wait.
As the drivers mature and get more efficient, the amount of heat produced by the boards decreases, allowing additional thermal overhead that they can make use of. The data they gain from the millions of sold GPUs lets them make these kinds of tweaks, which would be impossible with the small sample set of in-house boards.

But at the end of the day, what difference is 13MHz (1.3%) going to make in terms of real performance?

Enough to be better than AMD counterparts in benches in reviews.
 
This is the release driver for the 295X2, which has been reviewed in many places on the web. This review is notable for being the first to feature Mantle in BF4 along with Eyefinity and Crossfire, a combination that has previously been unplayable on my system.

Does anyone know if there are specific fixes for this configuration contained in these drivers?

There must be an NDA in force, because there have been no clarifying responses to my queries on this thus far...
 
This is legit.
Nvidia can only test in so many scenarios, and remember, they don't make boards of their own besides some reference boards for in-house use. So they do some testing, set a good boost number for launch, and wait.
As the drivers mature and get more efficient, the amount of heat produced by the boards decreases, allowing additional thermal overhead that they can make use of. The data they gain from the millions of sold GPUs lets them make these kinds of tweaks, which would be impossible with the small sample set of in-house boards.

But at the end of the day, what difference is 13MHz (1.3%) going to make in terms of real performance?

As drivers get more efficient at using the GPU, they push the GPU harder. Look at how many people had to reduce their OCs a bit when Mantle came along.

That small difference is likely what it takes to stop/reduce throttling during long sessions. If you look at reviews you can tell the ones who don't run their tests long enough for temps to stabilize.
 