AMD Radeon R9 290X Video Card Review @ [H]

This bodes well for everyone: fastest top-end single GPU to date at a significant cost reduction. It means (hopefully) lower prices on the green side as well. Anyone looking to upgrade around Xmas time should have a good variety of top-end GPUs, both Red and Green, at much better prices than we have seen in some time.

nVidia fanboys should really just stop grasping for reasons why this release is somehow a bad thing.
 
Wanted some clarification on a part from the article:

By "any ports," does this actually mean that I can have two monitors on the DVI ports and the 3rd monitor on an HDMI port? That's still one of my biggest gripes with AMD: I have to use a DisplayPort-capable monitor, or a DisplayPort adapter that may or may not work, if I want to use three monitors on one card.

One of the features being showcased with the new lineup from AMD, even the rebrands (so the 280X/270X is not quite the same as a 7970/7870), is Eyefinity without needing native DP or an active adapter.

You can see it explained more here - http://www.anandtech.com/show/7400/the-radeon-r9-280x-review-feat-asus-xfx/3
 
I want to see performance with BF4 and Mantle on an AMD CPU/GPU setup. Perhaps throw in a 9590 to get the [H]ardest performance possible. Not that I am giving up my Titans any time soon...just really curious from a hardware performance/geek perspective.
 
This bodes well for everyone: fastest top-end single GPU to date at a significant cost reduction. It means (hopefully) lower prices on the green side as well. Anyone looking to upgrade around Xmas time should have a good variety of top-end GPUs, both Red and Green, at much better prices than we have seen in some time.

nVidia fanboys should really just stop grasping for reasons why this release is somehow a bad thing.

I am an nVidia owner and I think it's great that companies do this to reduce prices.
If they didn't, nVidia would be a monopoly and AMD would be out of business.
I am sure that if nVidia wanted to right now, they could release a card that would destroy anything AMD could muster.
 
I guess you didn't actually read what they said about the overclocking.

My question is, can you understand graphs? Please, tell me what you can extrapolate from this data.

voltagetuning.jpg
 
That's because mommy and daddy pay the bills, right?

You've been here 10 years, really; what kind of stupid reply gimmick are you going with tonight? Recently learned about logical fallacy arguments in poli-sci or something?

We're not talking about starving kids in Africa; someone dropping $1100-2000 on a multi-GPU setup is not going to worry about saving/paying an extra $3 a month in power costs in the worst possible use case.
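For what it's worth, the math roughly supports that figure. A quick sketch (the ~100 W delta, daily hours, and $0.12/kWh rate are all assumptions for illustration, not measured numbers):

```python
# Rough power-cost estimate for an extra ~100 W of GPU draw.
# All inputs are assumptions for illustration, not measurements.
extra_watts = 100          # assumed extra draw vs. the competing setup
hours_per_day = 8          # assumed worst-case daily gaming time
rate_per_kwh = 0.12        # assumed electricity rate, USD/kWh

kwh_per_month = extra_watts / 1000 * hours_per_day * 30
cost_per_month = kwh_per_month * rate_per_kwh
print(f"{kwh_per_month:.0f} kWh extra -> ${cost_per_month:.2f}/month")
```

Even at eight hours of gaming a day, that lands right around the $3/month mark.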
 
Awesome card for the money. Yes, I was a bit disappointed with the power and heat output, but not surprised considering AMD was still stuck on 28nm and had a much smaller die to compete with. Overall I'm very happy, because this will bring the high end down to more reasonable prices. Nvidia was really gouging us with their pricing.
 
My question is, can you understand graphs? Please, tell me what you can extrapolate from this data.

voltagetuning.jpg

It looks to me like Corsair needs to release a liquid cooler for this video card ASAP, and people need to ensure they have plenty of ventilation in their case.
Forgot to mention a new power supply too.
Remember, power supplies degrade every year.
 
You've been here 10 years, really; what kind of stupid reply gimmick are you going with tonight? Recently learned about logical fallacy arguments in poli-sci or something?

We're not talking about starving kids in Africa; someone dropping $1100-2000 on a multi-GPU setup is not going to worry about saving/paying an extra $3 a month in power costs in the worst possible use case.

It's all about ROI

You sound mad
 
From page 4 of the review:
Ultra HD resolutions can be passed over the HDMI 1.4b port and the DisplayPort with a single cable at 60Hz.

Am I reading this correctly? Are you saying that 4K 60Hz is supported over the HDMI 1.4b connection? Or are you saying 4K is supported over HDMI, and 4K 60Hz over DisplayPort? Everything I read about HDMI 1.4b (I'm no expert but I can has google) indicates it added support for 1080p 120Hz, but does not support 4K 60Hz.
 
Yep, I'm sure of that as well :p ... We all know that the all-powerful Nvidia always has a plan B just in case. :rolleyes:

LOL. This card's announcement/release has brought me many laughs.

Kudos to AMD for the entertainment.
 
From page 4 of the review:

Ultra HD resolutions can be passed over the HDMI 1.4b port and the DisplayPort with a single cable at 60Hz.
Am I reading this correctly? Are you saying that 4K 60Hz is supported over the HDMI 1.4b connection? Or are you saying 4K is supported over HDMI, and 4K 60Hz over DisplayPort? Everything I read about HDMI 1.4b (I'm no expert but I can has google) indicates it added support for 1080p 120Hz, but does not support 4K 60Hz.

Whatever the new version of HDMI is, it's on this card, and it supports 4k 60Hz :).
 
One of the features being showcased with the new lineup from AMD, even the rebrands (so the 280X/270X is not quite the same as a 7970/7870), is Eyefinity without needing native DP or an active adapter.

You can see it explained more here - http://www.anandtech.com/show/7400/the-radeon-r9-280x-review-feat-asus-xfx/3

Hmmm.....still not happy with AMD's implementation of that. None of the three monitors I have are the same resolution, let alone have identical timings.
 
Interesting 4K results versus 1080p. This is great if you have a 4K display, nothing really special if you don't (yet).

And I'm skeptical that third parties are going to drastically reduce temps and power load.

I think AMD took a step forward while taking a step back.

Bring on Maxwell. Let's see some head to head. Q1 2014 will be fun with better Hawaii drivers, Maxwell, and G-Sync.

Third party coolers will lower temps, and binning might lower power load.
 
Third party coolers will lower temps, and binning might lower power load.

The Ti will be out by then.

Sounds like I need to upgrade my NEC 2690s soon to 4K.

I need to upgrade to LED anyway to save power; these old NEC CCFL backlights create too much heat and use 50 watts more than I want now.
 
It's a point to consider. It is also interesting that TITAN achieves the same performance, but with less power and cooler temperatures. Efficiency, IMO, is always important. AMD used to rule this (6000 series, 5000 series), but ever since the 7000 series the tables have turned; AMD isn't the king of efficiency anymore. But AMD is certainly the king of pricing.

True, and also don't forget the G92 GPUs, when nvidia had a short stint of being more efficient until they either fired the guy that developed the G92 or threw him off a bridge and released the GTX 200 series, lol.


You forgot to mention it uses less wattage and is a cooler card.
At 95C, good luck maintaining that 4.8GHz overclock.

Remember, heat kills components.

I think some people are going a little nuts about the temps. I honestly think once we start seeing custom cooling on these cards, the "omg the card runs at 95C" bullcrap will be completely forgotten. AMD's reference coolers have always been trash, and anyone spending this kind of money is not going to save a couple dollars buying a reference card over a custom PCB/cooled card.
 
Third party coolers will lower temps, and binning might lower power load.

Also, why do some people seem to think that Maxwell will be the only alternative? AMD has been working on its Maxwell counter as well.

The 290X (28nm) is only AMD's counter to the 780 (28nm). The fact that it competes well with a $1000 Titan makes the 290X even better. For the 99% of us who care about value, this card is a godsend. Sure, there are some negatives (heat/noise); every review has them. But then again, Titan performance for half the price? In my book that's just FANTASTIC.
 
Whatever the new version of HDMI is, it's on this card, and it supports 4k 60Hz :).

Well, the newest version of HDMI is 2.0, and part of its feature set is 4K @ 60Hz. But the article states this card is only HDMI 1.4b. That's why I was trying to clarify what the card can and cannot do in terms of 4K connectivity.
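The raw numbers make the distinction clear. A quick sketch (the timing figures are from the public CEA/HDMI specs as I understand them, so treat this as a sanity check rather than a reference):

```python
# Can HDMI 1.4 carry 4K at 60 Hz? Compare the required pixel clock
# against each spec's maximum TMDS clock.
# CEA-861 timing for 3840x2160@60: the total raster is 4400x2250.
h_total, v_total, refresh_hz = 4400, 2250, 60
pixel_clock_mhz = h_total * v_total * refresh_hz / 1e6   # 594.0

HDMI_14_MAX_MHZ = 340   # HDMI 1.4 tops out at a 340 MHz TMDS clock
HDMI_20_MAX_MHZ = 600   # HDMI 2.0 raises the ceiling to 600 MHz

print(f"4K60 needs a ~{pixel_clock_mhz:.0f} MHz pixel clock")
print("fits HDMI 1.4:", pixel_clock_mhz <= HDMI_14_MAX_MHZ)
print("fits HDMI 2.0:", pixel_clock_mhz <= HDMI_20_MAX_MHZ)
```

So 4K60 only fits once the link can clock past 340 MHz, which is exactly what HDMI 2.0 added.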
 
For the love of god I hope PowerColor makes a Vortex version of this card. I would buy it in a heartbeat. I can't help but love the design of those cards. Great review. Don't give a shit how much power it uses. This thing is a beast!
 
Adki0MO.png


It's a video toaster!

I don't see custom coolers helping much; that heat is still going to need to be blown out of your system, so you'll now have loud case fans churning instead of a loud blower fan on the GPU.

PowerTune 2 now takes heat into consideration before throttling, and it seems very easy to hit that 95-degree mark. After gaming on this card for an hour I would expect to see a performance drop unless you can keep those temperatures down. It might be as easy as opening a window if you live someplace cold, but in the summer, or if you live where it's warm in the winter, you'll throttle for sure.

Hopefully the 290 non-X doesn't run so freak'n hot.
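For anyone curious what that throttling behavior looks like mechanically, here's a minimal sketch of a temperature-target governor in the spirit of what PowerTune 2 is described as doing. The thresholds and step size are invented for illustration, not AMD's actual values:

```python
# Toy temperature-target clock governor. Constants are illustrative only.
TEMP_TARGET_C = 95
CLOCK_MAX_MHZ = 1000    # the 290X's advertised "up to" clock
CLOCK_MIN_MHZ = 727
STEP_MHZ = 13

def adjust_clock(clock_mhz, temp_c):
    """Return the next core clock for the current temperature reading."""
    if temp_c >= TEMP_TARGET_C:
        # at or past the target: shed clock to pull temperature down
        return max(CLOCK_MIN_MHZ, clock_mhz - STEP_MHZ)
    if temp_c < TEMP_TARGET_C - 2:
        # comfortably below target: claw headroom back
        return min(CLOCK_MAX_MHZ, clock_mhz + STEP_MHZ)
    return clock_mhz  # hovering near target: hold steady

print(adjust_clock(1000, 95))  # at the 95C target the card backs off
```

The net effect is what the post above describes: under sustained load the card hovers just below its target instead of holding peak clock.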
 
Adki0MO.png


It's a video toaster!

I don't see custom coolers helping much; that heat is still going to need to be blown out of your system, so you'll now have loud case fans churning instead of a loud blower fan on the GPU.

PowerTune 2 now takes heat into consideration before throttling, and it seems very easy to hit that 95-degree mark. After gaming on this card for an hour I would expect to see a performance drop unless you can keep those temperatures down. It might be as easy as opening a window if you live someplace cold, but in the summer, or if you live where it's warm in the winter, you'll throttle for sure.

Hopefully the 290 non-X doesn't run so freak'n hot.

Thanks for providing this factual data. I see that the hot spot is nowhere on the GPU.

What if you stick this video card outside this winter? Would that help?

I wonder if AMD should be providing warning labels to place outside your PC. Small children or adults could potentially need a skin graft after touching this 290X.
 
I'm glad it exists as nvidia will need to drop prices.


Also, a couple of questions:

Since it downclocks due to temperature, were the tests done in a case? How do clock speeds behave after a few-hours-long gaming session?
 
How people cannot see this release as a good thing for both camps is beyond me.

If you're whining about fan noise or heat, [H]²O or bust.

Yeah, the thread was great until the trolls showed up in force to once again justify their purchase. Anyway, one week until the 290 NDA and a month until third-party cooling solutions.
 
yge0Xdb.png

"silent mode"

Epxj8np.png

"uber mode" note the higher RPM on the cooling fan.

15 minutes of Crysis 3, and you can see once the card hits 94 degrees it throttles the core clock down slightly to try to lower the temperature.

I wonder just how loud 100% fan is on this cooler.
 
Yeah, the thread was great until the trolls showed up in force to once again justify their purchase.

It was bound to happen. It always does. Expect to see the same arguing points over and over again for a few days.
 
yge0Xdb.png


15 minutes of Crysis 3, and you can see once the card hits 94 degrees it throttles the core clock down slightly to try to lower the temperature.

It's still a great card compared to a thousand-dollar Titan. I like the fact that AMD has made it possible to space the cards far apart in CrossFire by not needing a bridge cable. But is it really worth running over 100 watts more than an Nvidia SLI solution?
Idk.

http://www.guru3d.com/articles_pages/radeon_r9_290x_crossfire_vs_sli_review_benchmarks,4.html

Personally, I think a GTX 780 with non-reference cooling is a better way to go right now.
 
Barely or not, it's overall faster than the 780 and $100 cheaper. Not sure what the problem is?

AMD just invalidated every Titan/780 purchase and the new 780 Ti buy.
That's the problem. ;)

Amazing card.
 
Slightly bipolar, are we? $200 makes no difference, but waterblock prices are important :eek:
You didn't get the point. Waterblocks are required to achieve on the R9 290X what you can with stock coolers on 780s in SLI. With the Ti BIOS, people are running 1200+ MHz clocks on 780s in SLI configuration. That implies almost a ~40% increase in core clock, which usually translates into a ~25%+ performance increase compared to 780 stock speeds.
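The arithmetic roughly checks out. A quick sketch (the 863 MHz figure is the reference GTX 780 base clock; the ~65% scaling factor is an assumption about how sublinearly games scale with core clock, not a measured value):

```python
# Sanity-check the clock-gain vs. performance-gain claim for a 780 at 1200 MHz.
base_mhz = 863            # reference GTX 780 base clock
oc_mhz = 1200             # overclock cited in the post

clock_gain = oc_mhz / base_mhz - 1
fps_scaling = 0.65        # assumed fraction of clock gain that shows up as fps

print(f"core clock gain: {clock_gain:.0%}")                   # ~39%
print(f"estimated fps gain: {clock_gain * fps_scaling:.0%}")  # ~25%
```

Which lines up with the ~40% clock / ~25% performance numbers above.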
 
You didn't get the point. Waterblocks are required to achieve on the R9 290X what you can with stock coolers on 780s in SLI. With the Ti BIOS, people are running 1200+ MHz clocks on 780s in SLI configuration. That implies almost a ~40% increase in core clock, which usually translates into a ~25%+ performance increase compared to 780 stock speeds.

Makes sense. It still seems AMD might have an upper hand on drivers the next few months by squeezing more out of this card. I also wonder how many complaints there will be on the Newegg and Amazon review forums, etc., about this card being so hot it causes system crashes and component failures.
http://en.wikipedia.org/wiki/Failure_modes_of_electronics
 
First off, after weeks of anticipation I'd like to thank Brent and Kyle for delivering such a detailed review.

The 290X certainly is a powerhouse given that it can compete with the best that nVidia has to offer, but I can't help but think that once the dust settles, prices drop, and the 780 Ti is available, those who wish to water cool and overclock their cards (like myself) will likely opt to spend the additional money on the 780 Ti. The overclocking headroom and efficiency ultimately make more sense to me.

Props to AMD though. The 290X is a bit late, but it's an absolute monster in the price/performance department. And I hope this isn't taken as me pooping on AMD... my current card is a 6970, so not only am I a happy AMD owner, but I'm also in the market for a new card before 2014.
 
Awesome card, would love to get two of them, but until AMD adds custom resolution creation to allow for downsampling, AMD is a no-go for me.

Please Kyle, you need to investigate and promote the benefits of downsampling, as this is a game-changer for many people still gaming in 1080p (big TVs, projectors). We need AMD to step up their game in this regard and at least offer the same functionality as NVIDIA, to have some competition (even better would be variable resolutions and downsampling/upsampling on a per-frame basis, which would potentially be even better than the G-Sync solution from NVIDIA, which is awesome on paper).
 
Makes sense. It still seems AMD might have an upper hand on drivers the next few months by squeezing more out of this card. I also wonder how many complaints there will be on the Newegg and Amazon review forums, etc., about this card being so hot it causes system crashes and component failures.
http://en.wikipedia.org/wiki/Failure_modes_of_electronics

Has this forum really had to put up with 10 years of your trolling, or is it a recent thing for you?
 
Well, the newest version of HDMI is 2.0, and part of its feature set is 4K @ 60Hz. But the article states this card is only HDMI 1.4b. That's why I was trying to clarify what the card can and cannot do in terms of 4K connectivity.

Yeah, I'm not sure either then. Will have to look it up later.
 