AMD Radeon R9 295X2 Video Card Review @ [H]

AMD has done a great job with this 295X2 video card. This is by far one of the best engineering builds AMD has done on a gamer-level consumer video card. This is AMD's greatest achievement thus far for a gamer card in terms of build quality, thermal performance, and sound profile. That said, the price does seem slightly high; we would rather have seen this around $1300. Granted, you pay for the best, and this certainly is the best from AMD.

What a nice review. It must have been somewhat of a surprise, since you give AMD lots of praise.
Also, nice work on the in-depth 4K and Eyefinity setup.
 
I doubt Nvidia even has a response. What reason do they have? They have the Titan Z and 780 Ti SLI, which are competitive and cheaper. I doubt they do anything until Maxwell.

Taken altogether this puts the 295X2 in a very interesting spot. The performance offered by the 295X2 is the same performance offered by the 290X in Crossfire, no more and no less. This means that depending on whether we’re looking at 2K or 4K resolutions the 295X2 either trails a cheaper set of GTX 780 Tis in SLI by 5%, or at the kinds of resolutions that most require this much performance it can now exceed those very same GeForce cards by 5%.
 
I doubt Nvidia even has a response. What reason do they have? They have the Titan Z and 780 Ti SLI, which are competitive and cheaper. I doubt they do anything until Maxwell.

I think the point is both companies are at the point of diminishing returns on 28nm, so they probably couldn't respond even if they wanted to with anything other than a small percentage boost.
 
Did I miss the card dimensions? I'm hoping this card might fit in a BitFenix Prodigy M case.

It's on the first page of the review: 12", or 304mm. The distance in that case between the PSU and the GPU is very tight, so you probably won't be able to fit it. With a regular PSU the case has up to 320mm of GPU clearance, but if you use a PSU longer than 160mm, GPU clearance drops to 230mm, making it impossible. Higher-wattage PSUs are big; for example, a Seasonic 860W PSU is 190mm long and a Corsair AX1200i is 200mm long. So no, I don't think you can fit a bigger PSU and a bigger GPU in there.
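If you want a rough sanity check of that clearance math, here's a hypothetical little sketch; the 320mm/230mm/160mm figures are just the ones quoted above, not official BitFenix specs:

```python
# Hypothetical fit check using the clearance numbers quoted above
# (approximate, not official BitFenix Prodigy M specifications).

def gpu_fits(gpu_len_mm: int, psu_len_mm: int) -> bool:
    # ~320mm of GPU clearance with a standard-length (<=160mm) PSU,
    # dropping to ~230mm once the PSU runs longer than that.
    clearance = 320 if psu_len_mm <= 160 else 230
    return gpu_len_mm <= clearance

print(gpu_fits(304, 160))  # standard-length PSU: True, the 295X2 just clears
print(gpu_fits(304, 190))  # Seasonic 860W-length PSU: False, no room
print(gpu_fits(304, 200))  # Corsair AX1200i-length PSU: False
```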
 
Awesome review, [H]. Thank you again for always putting in the time and effort.

One beast of a GPU that AMD has unleashed. One beast of a price tag, as well.
 
Ok, that was one incredible video card. Let's be honest, how many of you guys popped a geek stiffy just reading about this beautiful piece of art? I am almost waiting on another cheesy video from ATI with that one dude who goes around smashing computers with NVidia hardware in them. I cannot even think of the name.
 
It's a beast of a card for sure. I just wish GPU prices would drop as a whole. At $1200 this card would be a killer buy; $1500 just seems a bit too high. Nvidia is apparently high on crack wanting $3k for its Titan Z. I think the companies are just pushing prices as high as they can to see if people will still buy them.
 
The last BF4 update added an FCAT command list, if that helps for your next review.

Great article, keep it up!
 
$1500? I'd rather take two R9 290s and Crossfire them for $700 less. Of course that will require water cooling, but hell, it would still cost less to run two 290s with water cooling than to pay $1500 for one card.

Gotta admit, that thing is a beast though.
 
It's a beast of a card for sure. I just wish GPU prices would drop as a whole. At $1200 this card would be a killer buy; $1500 just seems a bit too high. Nvidia is apparently high on crack wanting $3k for its Titan Z. I think the companies are just pushing prices as high as they can to see if people will still buy them.

Same as the Titan: it sold out fast. Why? The Titan Z is cheap for CUDA computing, Maya, etc. When you look at workstations for mid- to entry-level computing that spend hundreds of dollars on GPUs, the Titan Z is an unbeatable price. Of course, as a gaming card it's a big waste of money; I highly doubt anyone can picture a gamer buying a Titan Z for gaming.

This card isn't bad at $1500, but I think even at that price it will be hard to find, and who knows what kind of price retailers will sell it at. :confused: I can just see prices of $1800-2000. :confused:
 
I don't care for dual-GPU setups, but I've always liked the idea of dual-GPU cards. It's always neat to see graphics vendors go balls-out on things.

AMD could've tried to be a bit more original in their cooler design, though. The "metal" construction and illuminated top-side logo absolutely reek of me-tooism.

Is there a list of mITX cases that could fit this thing?
I can tell you it's too long to fit in a Silverstone FT-03 mini, but you might be able to mod the case to accommodate it. The case could handle the cooling solution without modification, but not the card length.

Is there anyone here who believes this card will be available at anything less than $2000? :rolleyes:
Given the cooling requirements and price, this doesn't exactly look like a miner's dream. I'd expect them to sell at or around MSRP unless production numbers are just shit.
 
I can tell you it's too long to fit in a Silverstone FT-03 mini, but you might be able to mod the case to accommodate it. The case could handle the cooling solution without modification, but not the card length.

There's a [H] "that's what she said" in there somewhere...:p
 
Same as the Titan: it sold out fast. Why? The Titan Z is cheap for CUDA computing, Maya, etc. When you look at workstations for mid- to entry-level computing that spend hundreds of dollars on GPUs, the Titan Z is an unbeatable price. Of course, as a gaming card it's a big waste of money; I highly doubt anyone can picture a gamer buying a Titan Z for gaming.

This card isn't bad at $1500, but I think even at that price it will be hard to find, and who knows what kind of price retailers will sell it at. :confused: I can just see prices of $1800-2000. :confused:

I kind of understood the Titan's price; it seemed to make more sense then, and even the Titan Black isn't too bad. But the Titan Z price just seems way off to me. I can see gamers buying a Titan/Black, but never a Titan Z. So do you think the Titan Z will fly off the shelves as well? Maybe for a compact workstation, but I'd assume most have room for two Titan Blacks.

I'm sure AMD won't have any issues moving these 295's off the shelves. People will eat them up quickly. It's a killer card.
 
Nice review. Sweet looking card with the built in water cooling.

Sorta sucks they had to water cool it, since it adds to the price, but it looks to have been worth it, especially for 4K gamers.

Mantle added another ~5% as well, plus the smoother feel you have been reporting. Good news for AMD users.

Since this card is water cooled, any chance of doing a comparison with two water-cooled 780 Tis in SLI? Seems like Nvidia would want to answer this challenge... :)
 
Beast! I'd take this solution over two 290Xs in CF because you don't have the issue of the top card running hotter and throttling, the PCI-E bus load is lower, and the water cooling is already engineered to work out of the box, so there's no expensive trial and error with DIY. The only thing it needs is an early-adopter promo price of $1300.
 
We've seen in another thread benchmarks for 3x 4K with quad Titans, so I'm hoping [H] will follow up this article with 3x 4K with two of these cards.
 
Would be nice to see oclhashcat, mining, etc. benchmarks to cater to a wider audience.
 
Beast of a card. AMD has a winner. I don't even see this as making sense for cryptocurrency mining, because #1 it's dying, and #2 a pair of R9 290Xs or three R9 290s is cheaper and will get you the same or better results.

So this one is for gamers from what I can tell.

Hopefully this brings Nvidia's pricing down a little bit, but since this is a limited edition card I doubt it will.

Even if Nvidia releases a dual 780Ti it may fall short if they don't overclock and water cool it and add more ram for 4k gaming. Competition is great. :)
 
Would've liked to have seen BF4 D3D vs. D3D alongside the D3D (Nvidia) vs. Mantle (AMD) comparison. I'm not sure I agree with AMD cards being reviewed with Mantle while Nvidia uses D3D. Maybe I am wrong here, but I get the feeling that if Mantle were Nvidia tech, people would be screaming that Nvidia created a new "API" for cheating in benchmarks.

If one owns an AMD card and is playing BF4, the gamer is going to choose Mantle since it is the better and faster API for said graphics card. This is a real-world scenario. We can use Mantle because it's an officially supported API from EA and AMD and publicly available for all gamers to choose. What you want us to do is artificially gimp performance, which makes no sense. Gamers are going to choose the fastest API for the game, and for BF4 it is Mantle on AMD cards. If NVIDIA had its own API, we'd use it. There is absolutely no reason not to use Mantle if it is there.

Maybe you aren't used to our methods, but unlike some other sites, we use the features AMD and NVIDIA put in place to better the gameplay experience. Mantle is definitely one that betters the gameplay experience.
 
If one owns an AMD card and is playing BF4, the gamer is going to choose Mantle since it is the better and faster API for said graphics card. This is a real-world scenario. We can use Mantle because it's an officially supported API from EA and AMD and publicly available for all gamers to choose. What you want us to do is artificially gimp performance, which makes no sense. Gamers are going to choose the fastest API for the game, and for BF4 it is Mantle on AMD cards. If NVIDIA had its own API, we'd use it. There is absolutely no reason not to use Mantle if it is there.

Maybe you aren't used to our methods, but unlike some other sites, we use the features AMD and NVIDIA put in place to better the gameplay experience. Mantle is definitely one that betters the gameplay experience.

+1 I agree 100% and I don't even own AMD cards right now.
 
Great review, great looking card, glad to see AMD stepping it up! No Metro: Last Light though?
 
Even if Nvidia releases a dual 780Ti it may fall short if they don't overclock and water cool it and add more ram for 4k gaming. Competition is great. :)
I actually wonder how much price/performance matters (to buyers) when you get into these price points, though. I'd wager brand affiliation generally scales pretty linearly with price, and when you're talking about $1k+ cards, I expect the typical user's affiliation with particular brands plays a pretty major role when it comes to purchasing decisions.

It may be good enough, on NVIDIA's end, to release whatever they can at a similar price point, regardless of performance competitiveness.
 
Isn't it funny how AMD has "Project Hydra" while nVidia has "Project Shield"? Two completely unrelated products, I know...it's just funny.
 
Isn't it funny how AMD has "Project Hydra" while nVidia has "Project Shield"? Two completely unrelated products, I know...it's just funny.

I can only think of near-obvious references to Captain America and The Avengers here. Hydra being the antagonist and enemy of the first Captain America film, while Captain America becomes a part of SHIELD in the first Avengers film.

Coincidence?
 
Would've liked to have seen BF4 D3D vs. D3D alongside the D3D (Nvidia) vs. Mantle (AMD) comparison. I'm not sure I agree with AMD cards being reviewed with Mantle while Nvidia uses D3D. Maybe I am wrong here, but I get the feeling that if Mantle were Nvidia tech, people would be screaming that Nvidia created a new "API" for cheating in benchmarks.

You nVidia guys are like Can-Am guys, just go buy a Polaris and you and your wallet will be much happier :D

That card is a beast!
 
Well, single-screen 1440p 780 Ti SLI is competitive, so maybe that is what he was bitching about? I agree it's stupid though; single-screen gaming doesn't really require more than one card.
 
They didn't re-review the 290X CrossFire setup; they just took what they have in their database and mixed it with the newly tested 295X2.

This is not true. This review used completely new drivers on all cards; all gameplay experiences and data you see in this review are brand-new data obtained this past week, testing under the new drivers.
 
I wanted to add, what a concise and well-written review. I appreciate the effort that Brent (and likely Kyle, too) make in having a precise, methodological approach to this kind of premium-product review. A lot of thought went into the wording of this review, and it is noted and appreciated. It's professionally done--but it's never artificially dispassionate and condescending, as is the case with some other sites.

Thanks for that, you all are very welcome

*hugs* :p
 
Tom's Hardware is reporting a 470W peak for the card: http://www.tomshardware.com/reviews/radeon-r9-295x2-review-benchmark-performance,3799-14.html

And to be honest, I just glanced over both reviews; how do the [H] guys measure power consumption compared to Tom's?

I don't know what Tom does, but I can tell you our method.

I use a Kill A Watt power meter connected to the system, which shows us the total system wattage; keep in mind that this is total system wattage. We note the power without a video card installed at the top of the page. Then we look at power with the video card installed at full load and at idle, and you can compare those numbers.

Our method is real-world, and simple. We play games, and as we are playing them, testing at different resolutions, settings, etc., we always keep an eye on our power meter in every game. We keep a mental note of the highest wattage observed and mark it down. We also keep an eye on what the average wattage is doing; it doesn't always stay near the peak and will sometimes sit 20W lower, for example. Then we take a few games, Crysis 3, Far Cry 3, and Tomb Raider, perform the same path, go to the same areas in each game where we know GPU usage is high, and leave the card there for 30 minutes, watching the power in places in those games that we know produce high GPU usage and power demands. We know this from a lot of time spent in the games figuring this stuff out. We then report the highest peak system wattage. We do it exactly the same way on all cards, so it is a fair comparison.
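If you want to see the arithmetic side of that, here's a hypothetical sketch (not [H]'s actual tooling; the summarize_power helper and the sample readings are made up for illustration) of turning a log of total-system wattage readings into the peak, the average, and a rough card-only estimate by subtracting the no-card baseline noted at the top of the page:

```python
# Hypothetical sketch only: summarize logged total-system wattage readings.

def summarize_power(samples_watts, baseline_no_card_watts):
    """samples_watts: total-system wattage readings taken while gaming."""
    peak = max(samples_watts)
    average = sum(samples_watts) / len(samples_watts)
    return {
        "peak_system_w": peak,
        "avg_system_w": round(average, 1),
        # Rough estimate only: peak system draw minus the system's draw
        # measured with no video card installed.
        "est_card_peak_w": peak - baseline_no_card_watts,
    }

# Example with made-up numbers, just to show the shape of the calculation.
readings = [612, 655, 671, 668, 649, 670]
print(summarize_power(readings, baseline_no_card_watts=180))
```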
 
Great review, great looking card, glad to see AMD stepping it up! No Metro: Last Light though?

That game has kinda played itself out; it doesn't really show us any benefits or comparisons anymore, only that NVIDIA can run PhysX and AMD can't, and that will always be the case. I'm waiting on bended knee to find something to replace it. I think maybe Watch Dogs might be that game.
 
The air coming off the radiator is 60C... I wonder if that high temp will kill pump life, since the fluid itself should be > 60C.

I am happy with my 780 Ti; not sure if I could ever do Crossfire again... I like my 780 Ti more than 3x 7970s (280X, I believe). That was after the frame pacing was "fixed", but I could still feel hitching. Some games like Far Cry 3 played like complete garbage. I only play at 1200p, so I'll just keep up to date with the flagship GPUs ;)
 