AMD Radeon R9 290X Video Card Review @ [H]

Yea in "quiet mode" this can happen easily. "Uber mode" it appears to be less of an issue. Neither mode runs the fan at higher than 55% of top speed though.

It appears to be possible to force the fan to run higher if you like, but it's pretty noisy if you run anywhere near 100%. Another review had the noise at something like 62dB IIRC at top speed. Right up there with those old Delta screamer 10k rpm fans.

That is unacceptable for it not to be able to hold its advertised speeds.

I'll gladly take the louder fan speed to be able to sustain the performance. I'm sure we will see the aftermarket coolers quickly fix it though.

It appears AMD is running this baby deep into the redline.
 
Apparently the frame pacing issue is still not completely fixed in DX9:
Ouch, a return of the runt/dropped frames! Skyrim is our only DX9 title remaining in the test suite and it shows us that AMD has only improved frame pacing on the R9 290X for Eyefinity and 4K for DX11 and DX10 titles. Notice the difference in the orange line between the FRAPS FPS and Observed FPS images - that is the problem we saw across the board with CrossFire that is slowly being fixed, product by product.
http://www.pcper.com/reviews/Graphi...-290X-CrossFire-and-4K-Preview-Testing/Skyrim
 
That's fine the card was designed to last through the warranty period up to 95C.

But...the majority of users do not watercool. I can see why people with air cooling don't want a GPU exhausting 85C+ into their rooms.

How is that any different from the card running at a higher fan speed and exhausting a higher volume of 75C air into the room?
 
Wow - I want one! The question is whether to hold out for a custom card with some better cooling. I have had really good experiences with my 2 previous reference cards (5870 and GTX 680), though it sounds like this card could be even faster with a better cooler as there would be less throttling. Also wonder if the 780s will come down any to match the price. If so, it would be a tougher decision.

Thanks for a great review.

Hold out for a better cooling solution in my opinion. I did not get the feeling that a lot of R&D went into this reference solution. NVIDIA refused to answer any questions about 780Ti yesterday, so I have a feeling that NVIDIA is a bit lost at the moment. For those of you looking for huge price drops, it will be interesting to see what happens considering how huge the die size is on the NV cards.
 
Kyle, would you upgrade/swap out/whatever your two Titans for two 290x's? Just curious.

(sorry if you answered this already!)
 
Kyle, would you upgrade/swap out/whatever your two Titans for two 290x's? Just curious.

(sorry if you answered this already!)

With the information I have right now, no. I am interested to see what real world gameplay shows on 290X CF vs 780 SLI. If I was running a 4K monitor I might be a lot closer to making the jump.
 
Apparently not:
They're talking about at 4K resolution. The R9 290/x is supposed to have "fixed" CF for that resolution. Not sure if it's fixed with Eyefinity and the R9 290/x though.

Frame pacing for other GPUs is working for resolutions of 1600p and less without Eyefinity. Edit: Yea, in non-DX9 games too.

Apparently the frame pacing issue is still not completely fixed in DX9:
That has been known about for a long time and doesn't mean frame pacing "doesn't work", just that it doesn't work for some games/situations.

This is not pedantry as those 2 phrases ("doesn't work" vs "doesn't work for some games/situations") mean 2 very different things.
 
They're talking about at 4K resolution. The R9 290/x is supposed to have "fixed" CF for that resolution. Not sure if it's fixed with Eyefinity and the R9 290/x though.

Frame pacing for other GPUs is working for resolutions of 1600p and less without Eyefinity.

Current 4K monitor support is through MST: 2 x 1920x2160 streams made into one via Eyefinity, so technically if 4K is fixed, Eyefinity is fixed.
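For anyone curious about the tiling described above, here is a rough sketch of the arithmetic behind MST 4K: the panel is driven as two 1920x2160 tiles stitched into one surface with Eyefinity. The figures are illustrative (24-bit color, 60 Hz, blanking intervals ignored), not from the review.

```python
def pixel_rate_gbps(width, height, refresh_hz, bits_per_pixel=24):
    """Raw pixel data rate in Gbit/s for one display stream."""
    return width * height * refresh_hz * bits_per_pixel / 1e9

tile = pixel_rate_gbps(1920, 2160, 60)   # one MST tile
full = pixel_rate_gbps(3840, 2160, 60)   # the whole 4K surface

print(f"per-tile: {tile:.2f} Gbit/s, total: {full:.2f} Gbit/s")
```

Each tile is a full-height half of the screen, which is why a driver that paces frames correctly across the two streams effectively has Eyefinity pacing working for this kind of monitor.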
 
That is unacceptable for it not to be able to hold its advertised speeds. It appears AMD is running this baby deep into the redline.
It can hold its speeds...if the fan speed is up and airflow is good. Yea, they're definitely pushing it, but part of the problem here is that the foundries can no longer improve their processes at the speeds they used to.

If you still want NV or AMD to maintain the performance increases they've been able to deliver in the past for new GPUs, then you have to expect the IHVs to push the envelope.
 
Good review, thanks for the hard work guys. I may need to re-read the review, but I'm interested in how the dynamic clocks work under water, as it appears they are tied to fan speed. Looking forward to putting a pair of these under soon.
 
I just can't believe they went with that crap cooler setup.
What the F were they thinking. 95C temps are insane.
Did Kyle try setting it to 100% or did he not hear us
must have...LOL

Kyle

KYLE

TURN IT DOWN

KYLE
 
I'm at work, can't see the video, summary plox :)
780 fully overclocked DESTROYS overclocked 290x when stock coolers are used.
This is probably gonna change with custom coolers later, but for now an overclocked 780 is better than an overclocked 290x, performance-wise.
 
I just can't believe they went with that crap cooler setup.
What the F were they thinking. 95C temps are insane.
Did Kyle try setting it to 100% or did he not hear us
must have...LOL

Kyle

KYLE

TURN IT DOWN

KYLE

Ha. Stock coolers are pretty much never as good as aftermarket coolers.

Look at the stock Intel coolers. They have been using the same design since the P4. Talk about insanely lazy and worthless.

I am guessing that the voltage could be lowered for the same clock speeds if the temps were a decent amount lower. This would reduce power draw and lead to even lower temps.
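The "lower temps would allow lower voltage at the same clocks" idea follows from dynamic power scaling roughly with V² × f. A minimal sketch, with made-up voltages for illustration (not measured 290X values):

```python
def relative_dynamic_power(v_new, v_old, f_new=1.0, f_old=1.0):
    """Dynamic power at (v_new, f_new) relative to (v_old, f_old),
    using the rough P ~ V^2 * f scaling rule."""
    return (v_new / v_old) ** 2 * (f_new / f_old)

# Hypothetical: dropping core voltage from 1.20 V to 1.15 V at the
# same clock speed.
ratio = relative_dynamic_power(1.15, 1.20)
print(f"~{(1 - ratio) * 100:.1f}% less dynamic power")
```

Even a small voltage drop pays off quadratically, which is why the power draw and temps can feed back on each other in both directions.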

That being said, if I had the money, I would be replacing my 2x 6870s with a 290x.

I still only play at 1080p though, so the 6870s handle pretty much everything just fine.

I can't wait to see the CFX reviews for this card.
 
AMD is still catching up.
I've been gaming on my Titans for close to a year now and AMD finally comes out with this crap?

AMD burnt us once with the MSRP of the 7970 @ $599, don't everyone forget too easily.
 
780 fully overclocked DESTROYS overclocked 290x when stock coolers are used.
This is probably gonna change with custom coolers later, but for now an overclocked 780 is better than an overclocked 290x, performance-wise.

I would not doubt this at all. NVIDIA put a lot of time and effort into that reference cooler used on Titan and 780. I am not getting that feeling from AMD on the 290X. As I stated above, if I was going to buy a 290X, I would wait for custom cooling from the likes of ASUS, GBT, or MSI.
 
Thanks as always, Brent and the boss. The heat wrinkle's the one thing that apparently stops it from being a home run, but that aside...this would probably be the best straight replacement for my 6990 at 1600p, unless I run into a crackhead who sells me a 780 for $100...
 
780 fully overclocked DESTROYS overclocked 290x when stock coolers are used.
This is like the exact opposite conclusion of the review. Go to 8:43 in the video for numbers.

They did a bunch of tests at 1440p and lower resolutions, and mention that the R9 290X really shines at 4K, but they didn't have a monitor with that res.

At stock clocks they said the 780GTX and R9 290X pretty much switch places for top spot.

When both were overclocked the 780GTX had an approx. 10% lead, but they said the price difference more than made up for it...at 1080p.

Raising the resolution to 1440p the difference was 5% or less when both were overclocked.

I can only assume kache is using hyperbole with hydrochloric levels of sarcasm in his previous comments, everything makes sense now!
 
So whatever happened to the talk doing the rounds that external companies were participating in doing their own version of a cooler for this card and submitting them to AMD, with the best version to become the stock cooler? Seems we got the same old AMD cooler that's been recycled numerous times over the last few years. :confused:
 
I want to build a mini-itx rig with an R9-290x, would need to have fans that exhaust inside the case.....I would put an aluminum rack in it and use it as a make-shift easy bake oven! Think it will work? :D

Props to AMD for not pricing this card through the roof. They really have a great video card here.

I wonder if the 4K performance of the Titan isn't on par due to lack of driver development for that resolution? It seems AMD is pushing 4K harder than NVIDIA has been. But that's just my assumption.

Excellent [H] GPU review as always Kyle/Brent.
 
So whatever happened to the talk doing the rounds that external companies were participating in doing their own version of a cooler for this card and submitting them to AMD, with the best version to become the stock cooler? Seems we got the same old AMD cooler that's been recycled numerous times over the last few years. :confused:


I don't know who fed you that bullshit, but that is not going to happen.
 
Those video reviews on page 17 are fail.

The 290X is a 4k+ gaming beast.

Don't have a 4k monitor (or more)? It's obvious that 290X is not your best option.
 
1% faster than Titan on average at "all resolutions" for $450 less is pretty solid IMO. (The TechPowerUp review always shows relative performance at different resolutions to other cards. Great for seeing % differences at your resolution as well as the average across all resolutions from generation to generation.)

http://tpucdn.com/reviews/AMD/R9_290X/images/perfrel.gif

I would definitely wait for custom cooler solutions though. Not sure I will be getting one of these cards, but it is very tempting over my 680.
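For anyone unfamiliar with how a "relative performance" summary like TPU's is built: each entry is one card's FPS as a fraction of a reference card's at one resolution, and the headline number is the average across resolutions. The FPS values below are invented for illustration, not TPU's data:

```python
def relative_perf(card_fps, reference_fps):
    """Per-resolution performance of a card relative to a reference."""
    return [c / r for c, r in zip(card_fps, reference_fps)]

# Hypothetical FPS at 1080p, 1440p, 1600p, 4K:
r290x = [95, 70, 60, 38]
titan = [96, 70, 59, 36]

rel = relative_perf(r290x, titan)
avg = sum(rel) / len(rel)
print(f"average relative performance: {avg:.3f}")
```

Notice how a small win at 4K pulls the overall average up even when the 1080p numbers are a wash, which is exactly the pattern the reviews describe for this card.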
 
Attack on Titan!!! (been waiting weeks to say that) Great result, great price.

Looks like AMD is looking to the future. Looks like a design that would do even better once it's shrunk to 20nm. Or maybe I'm just going Mantle :D
 
I wonder if the 4K performance of the Titan isn't on par due to lack of driver development for that resolution? It seems AMD is pushing 4K harder than NVIDIA has been. But that's just my assumption.

This is a valid concern, but NVIDIA has been hitting the PR horn pretty hard these last few weeks about 4K. I would suggest, given that it has been doing this, NV is "ready" for 4K. If NV's drivers are lacking on 4K, well, they are just a bunch of knuckleheads then.
 
Mmm, a couple things.

First, as someone mentioned before, it's going to be interesting how the R9 290 will be on thermal output and power usage. It has slightly fewer compute units than the 290X and should come in at around $499, hopefully. Performance-wise, I would assume it'd still be higher than a 7970 GHz edition, and probably on par with a GTX 780, not 780 Ti, and no faster. This is just by looking at the reviews so far of the 290X versus the GTX 780.

Second, I'm still wondering if there is a way to measure how much data is pushed through XDMA (Crossfire DMA) between the PCI-E slots. Does it really saturate a PCI-E x16 3.0 slot? Those that want to do Crossfire will probably need boards with PCI-E 3.0 slots at dual x16 or x16/x8. Or, it might make people go up to X79 boards.
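A rough sanity check on the saturation question: assuming 32-bit color and that each rendered frame crosses the bus once (real XDMA traffic patterns aren't public, so treat these as illustrative assumptions), the frame traffic is a small fraction of a PCIe 3.0 x16 link:

```python
# Usable PCIe 3.0 x16 bandwidth: 16 lanes x 8 GT/s with 128b/130b
# encoding, converted from bits to bytes.
PCIE3_X16_GBPS = 16 * 8e9 * 128 / 130 / 8 / 1e9  # ~15.75 GB/s

def frame_traffic_gbps(width, height, fps, bytes_per_pixel=4):
    """GB/s needed to move every rendered frame across the bus."""
    return width * height * bytes_per_pixel * fps / 1e9

traffic_4k60 = frame_traffic_gbps(3840, 2160, 60)
print(f"4K@60: {traffic_4k60:.2f} GB/s of {PCIE3_X16_GBPS:.2f} GB/s")
```

By this estimate, even 4K@60 frame transfers are around 2 GB/s, well under the link's capacity, though that says nothing about latency or whatever other traffic XDMA generates.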

Thirdly, is there any reason why frame pacing is harder to fix with DirectX 9? I've always been wondering why it'd be more easily fixed for DX10/DX11 games than DX9.
 
Don't have a 4k monitor? It's obvious that 290X is not your best option.
At 1440p at stock clocks it'll beat a 780GTX for $100 less. That situation only improves if you go to 1600p.

If you OC both the 780GTX gets a negligible lead at 1440p and probably gets tied at 1600p. For $100 less still.

It's not a perfect card by any means, but to say you need a 4K monitor to make the R9 290X worthwhile is silly.
 
I wonder if the 4K performance of the Titan isn't on par due to lack of driver development for that resolution? It seems AMD is pushing 4K harder than NVIDIA has been. But that's just my assumption.

You may be right, but 64 ROPs, larger cache, and the 512-bit memory bus (although the bandwidth increase is smaller than it could be due to clock speed) could also be important at 4K. Seems the memory is rated 1500MHz (6000MHz effective), so assuming the controller can handle it (TPU could only get past 1500MHz using the 'uber' BIOS), we should see some big bandwidth numbers. TPU hit 400GB/s.

From Anand. "But AMD’s focus on 4K resolution workloads also plays a significant part, as 4K represents a significant increase in the ROP workload, and hence the need for more ROPs to pick up the work. Consequently while we can’t easily compare ROP performance across vendors, increasing the number of ROPs is one of the ways AMD will extend their high resolution performance advantage over NVIDIA, by being sure they have plenty of capacity to chew through 4K scenes."
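The bandwidth figures quoted in this thread follow directly from bus width times effective data rate. GDDR5 transfers four times per base command clock, which is where "6000MHz effective" for a 1500MHz part comes from. The 1565MHz value below is simply picked to land near TPU's ~400GB/s overclocked result, not a quoted spec:

```python
def gddr5_bandwidth_gbps(bus_bits, base_clock_mhz):
    """Peak bandwidth in GB/s for a GDDR5 bus (4 transfers/clock)."""
    return bus_bits / 8 * base_clock_mhz * 4 * 1e6 / 1e9

stock = gddr5_bandwidth_gbps(512, 1250)   # 290X stock: 320 GB/s
rated = gddr5_bandwidth_gbps(512, 1500)   # at the chips' rating
oced  = gddr5_bandwidth_gbps(512, 1565)   # near TPU's OC result
print(stock, rated, oced)
```

The 512-bit bus is doing the heavy lifting here: even at a modest base clock, halving the per-pin rate relative to competitors still yields more total bandwidth.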
 
My Korean 1440p is going to be happy! Now 4K at +-$500 would be fantastic.
 
Don't have a 4k monitor (or more)? It's obvious that 290X is not your best option.
I don't really see how you are coming to that conclusion. It's still $100 cheaper for the time being and performs better than a 780, stock for stock, at most games. The biggest discrepancy comes at 1080p, but I don't think most people buying 290Xs are gaming on 1080p monitors (aside from those on 120Hz panels) and even then, it's a relatively small gap.

Personally I am excited to see what aftermarket PCBs and water cooling can do with the Hawaii GPU.
 
This is like the exact opposite conclusion of the review. Go to 8:43 in the video for numbers.

They did a bunch of tests at 1440p and less resolution, and mention that the R9 290X really shines at 4K but didn't have a monitor with that res.

At stock clocks they said the 780GTX and R9 290X pretty much switch places for top spot.

When both were overclocked the 780GTX had an approx. 10% lead, but they said the price difference more than made up for it...at 1080p.

Raising the resolution to 1440p the difference was 5% or less when both were overclocked.

I can only assume kache is using hyperbole with hydrochloric levels of sarcasm in his previous comments, everything makes sense now!

If he wants to consider 10% destroyed, who are you to tell him he can't?
 
If he wants to consider 10% destroyed, who are you to tell him he can't?
Because 10% is right at the limit where you can actually start to perceive a difference in performance just by looking at the screen.

Bear in mind that is for 1080p resolutions where both cards are getting well over 70fps and over 100 wasn't uncommon. So you'd need great perception & vision + a 120Hz monitor to have even a chance at seeing a visible difference.

If that is his definition of "destroyed" then fine, but it's a virtually useless one that devalues the word, causing confusion instead of informing anyone but himself when used in that context.
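Putting the "10% at triple-digit FPS" point in frame-time terms shows why it's hard to perceive: the per-frame gap is under a millisecond. The FPS values are the ballpark figures mentioned above, not exact benchmark results:

```python
def frame_time_ms(fps):
    """Time to render one frame, in milliseconds."""
    return 1000.0 / fps

base = 100.0          # ballpark FPS for the slower card
faster = base * 1.10  # a 10% lead

delta = frame_time_ms(base) - frame_time_ms(faster)
print(f"{delta:.2f} ms shorter frame time per frame")
```

Less than a millisecond per frame is below a single refresh interval even on a 120Hz panel (~8.3ms), which is the crux of the "you'd need great perception plus a 120Hz monitor" argument.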
 