AMD Radeon R9 290X Video Card Review @ [H]

Does a card having hardware designed for good performance at 4K resolutions imply that it should also perform well at Eyefinity resolutions such as 5760x1080?

I'm running 3x27" 1080p LCDs and am really interested in Eyefinity performance. Seems like there should be some relation.
 
I mean, the 290X is the reason for the potential price cuts next month, the 780 Ti, and the holiday bundle. Yet people bitch, bitch, bitch about the 290X noise. Whatever. Like I say, I think everyone on both sides should be happy.

+1!
I was skipping the 7xx series because of the ridiculous prices, but the 290X's performance could mean good news for everyone, even if you don't want to switch to the red team :)
 
Does a card having hardware designed for good performance at 4K resolutions imply that it should also perform well at Eyefinity resolutions such as 5760x1080?

I'm running 3x27" 1080p LCDs and am really interested in Eyefinity performance. Seems like there should be some relation.

I would say yes (high ROP count, high memory bandwidth, larger cache, and more VRAM would all help).

5760x1080 is about halfway between 1600p and 4K, so the delta between the 780 and the 290X might be a bit smaller than at 4K.
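For what it's worth, the raw pixel counts back this up. A quick sketch (assuming 1600p means 2560x1600, triple-1080p Eyefinity means 5760x1080, and 4K means 3840x2160 UHD):

```python
# Quick pixel-count check, assuming 1600p = 2560x1600, triple-1080p
# Eyefinity = 5760x1080, and 4K = 3840x2160 (UHD).
resolutions = {
    "1600p (2560x1600)": 2560 * 1600,
    "Eyefinity (5760x1080)": 5760 * 1080,
    "4K UHD (3840x2160)": 3840 * 2160,
}
for name, pixels in resolutions.items():
    print(f"{name}: {pixels:,} pixels")
# Eyefinity's ~6.2M pixels sit between 1600p's ~4.1M and 4K's ~8.3M.
```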
 
This! Even if you are an Nvidia fanboy (and being a fanboy of anything is pretty stupid), you have to be a moron not to appreciate the benefits this card will bring to everyone.

There will always be fanboys, à la Ford vs. Chevy. I just wish they wouldn't shit all over a perfectly good review thread.
 
No, he got it right:

OMG! Stop. Leave, please. You have no idea what you're talking about or how temperature and heat differ. It's even more annoying since your OC'd 780 uses only a tiny bit less power, if not more, and heats up the PC accordingly.

OMG! Stop trolling me.
 
Holy LOLs at this thread. I check back and it's 27 pages. All in all, I am not pleased with the noise of the card and don't care about temps, although I guess both go hand in hand. That said, for $549? It's acceptable. Beating the Titan by quite a bit at the higher resolutions while mostly matching it at lower resolutions? Not too shabby. A good showing performance-wise, not a good showing noise-wise. That can be fixed easily enough, though.

I have a GTX 780 already, but if I were in the market for a new card I would definitely give this one long consideration. Beast of a card. I'd want an aftermarket version, though. I do think it's kind of messed up that extremist NV fans are bitching about this card. I mean, the 290X is the reason for the potential price cuts next month, the 780 Ti, and the holiday bundle. Yet people bitch, bitch, bitch about the 290X noise. Whatever. Like I say, I think everyone on both sides should be happy.

I agree with everything you say. We are finally going to see a price cut for the GTX 780, and Nvidia is countering the 290X release with the Ti version, which will surely be better.
 
True, the 780 Ti is definitely welcome. OTOH, leaks so far seem to indicate it will be a 2496-CUDA-core SKU, which would make it ever so slightly slower than the Titan. That makes sense for Nvidia; I can't see them overlapping the Titan with a higher-performing and less expensive part. I do think the 780 Ti will basically match the Titan or come within 1%, which is funny because the gap between the 780 and the Titan is already very slim, lol.

Essentially, the 780 Ti will bring Titan performance to around the $650-700 price range as opposed to $1,000. The Ti will obviously also be quiet, which isn't the case for the 290X (again, that's acceptable given the 290X's price point). Anyway, you know the best part? The GTX 780 price cuts. If the GTX 780 is $550-600, with aftermarket 780s in the same price range, that could put some hurt on the 290X, ESPECIALLY considering the 780 will come with three free games.

The competition next month will be great; it will be very interesting to see how things turn out. :) I hope it turns into an all-out, nasty and ugly tit-for-tat price war between AMD and Nvidia, although I kind of doubt that will happen.
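Napkin math on those shader counts (the 2496-core 780 Ti figure is just the leak mentioned above, not a confirmed spec; the Titan's 2688 and the GTX 780's 2304 CUDA cores are published specs, and this ignores clocks, memory, and drivers entirely):

```python
# Back-of-envelope shader throughput comparison. The 2496-core 780 Ti
# number is only a rumor; Titan (2688) and GTX 780 (2304) are published.
# Ignores clock speeds, memory bandwidth, and driver differences.
titan_cores = 2688
gtx780_cores = 2304
gtx780ti_rumored_cores = 2496

print(f"rumored 780 Ti vs Titan: {gtx780ti_rumored_cores / titan_cores:.1%}")  # ~92.9%
print(f"GTX 780 vs Titan:        {gtx780_cores / titan_cores:.1%}")            # ~85.7%
```

On raw core count alone, the rumored Ti lands within about 7% of the Titan, which squares with the "match it or come within a few percent" guess above.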
 
This card is impressive; however, the noise of the fans is something I would rather live without. With G-Sync coming out soon, and, I would guess, a price drop on Nvidia cards, I will wait a little before making up my mind.

It is great to see AMD coming back. This will just mean better prices for everyone :)

Also, from what I have been reading, the Titan is not really a gaming graphics card; it's a professional card that can also run games.
 
Lol, the only time Nvidia fanboys don't bitch about AMD is when it leads to lower prices. Now if only the CPU space were competitive again. Nice-looking review, guys; I think I might get one to play with.
 

Yeah, things are going to be pretty interesting in the next 2-3 months, both from AMD's side, with the Mantle BF4 tests, and from Nvidia's side, with the 780 Ti and, well, G-Sync too, even if I'm not really eager about a proprietary tech that requires a €150 chip installed in the monitor.
 
I would say yes (high ROP count, high memory bandwidth, larger cache, and more VRAM would all help).

5760x1080 is about halfway between 1600p and 4K, so the delta between the 780 and the 290X might be a bit smaller than at 4K.

Good response, thanks!
 
Just ordered the MSI version of this card along with an i7 4770K and an Asus Sabertooth Z87 board. The awesomeness of this card for the price finally convinced me to upgrade from the Core 2 Quad Q6600-based system I've been running since early 2007. AMD basically made a card that's like a Titan, but not a retarded ripoff like the Titan.
 

Wow, I upgraded from a Q9300 back in 2011 to a Core i5; my processor back then was holding back the GTX 570.

You will see huge gains now.

I like the fact that you called the Titan a retarded ripoff. Are you insinuating that people who bought a Titan are now retards?
 

No. I'm saying that the Titan is a nice card that never made any sense in terms of its pricing.
 

Oh, I see. I understand where you are coming from.

Technology moves fast; by the time you equip your desktop with that new 290X and processor, it will already have depreciated a few hundred bucks. BTW, the Titan was released 8 months ago, and AMD is finally catching up.
http://www.nvidia.com/titan-graphics-card

Nvidia gave consumers the opportunity to purchase the world's fastest GPU almost 8 months ago. These types of video cards were in supercomputers for scientific research. I am curious to know whether corporations will switch to the 290X now to put into supercomputers; I would think not, due to its high heat output.
http://nvidianews.nvidia.com/Releas...omputer-For-Open-Scientific-Research-8a0.aspx
 
Seems the Titan STOMPS the 290X in folding.
Not surprised.
THAT is what "stomps" looks like.
 

This is not surprising to me either, since, as I posted above, the Titan belongs in a supercomputer for folding, not the 290X.

I wonder what Nvidia is going to do when Mantle programming starts being used in games; some developers seem to like the idea while others do not.
 
This is not surprising to me either, since, as I posted above, the Titan belongs in a supercomputer, not the 290X.

Not aimed at you, just at all the people using the word "stomped" when the 290X in most cases is at best 8% better and most of the time within the margin of error of the 780 and Titan.
 
Seems the Titan STOMPS the 290X in folding.
Not surprised.
THAT is what "stomps" looks like.

So are you going to post the rest of the charts where the 290X stomps the Titan in compute?
 
So are you going to post the rest of the charts where the 290X stomps the Titan in compute?

You mean the ones that don't matter? Do you see a whole subforum here for Sony software?
Oh, and all the others were canned benchmarks.
 
So are you going to post the rest of the charts where the 290X stomps the Titan in compute?

Yeah, the results besides folding seem quite positive for the 290X. I'm not in the know enough to say which benchmark best represents a realistic commercial/scientific workload. It seems to have the raw horsepower, though.

Neither the Titan nor the 290X supports ECC (unlike Tesla and FirePro), which I assume is important in certain compute scenarios.
 
You mean the ones that don't matter? Do you see a whole subforum here for Sony software?
Oh, and all the others were canned benchmarks.

What "matters" is subjective to the user. Folding matters to some, not others; other compute programs matter to some, not others; gaming matters to most. Most of the readership here is buying these cards for gaming primarily, and compute or folding secondarily.

Some compute apps are going to work better on certain cards than others, so it all depends on which app you are going to use the most or have the most need for.
 
Yeah, the results besides folding seem quite positive for the 290X. I'm not in the know enough to say which benchmark best represents a realistic commercial/scientific workload. It seems to have the raw horsepower, though.

Neither the Titan nor the 290X supports ECC (unlike Tesla and FirePro), which I assume is important in certain compute scenarios.

A very good point. ECC is important in the really high-end scientific stuff, and for that those users go with the Tesla or FirePro lines.
 
Happy, upset & confused!

Happy for AMD and the launch of the R9 290X!!
Upset because my cards are maybe 7 weeks old!
Confused: thinking about dumping these bad boys and hopping back over to team RED. Not sure.

Awesome review, for an awesome card!
 
Yeah, the results besides folding seem quite positive for the 290X. I'm not in the know enough to say which benchmark best represents a realistic commercial/scientific workload. It seems to have the raw horsepower, though.

Neither the Titan nor the 290X supports ECC (unlike Tesla and FirePro), which I assume is important in certain compute scenarios.

From what these benches look like, AMD and Mantle programming are going to cater to producing faster console-to-PC ports for next-gen consoles. I wish this R9 290X were available with a non-reference cooler, or I would be switching over right now. But it seems we are in the early, early stages of Mantle and really won't see much benefit until the programming matures, which could take another 6 months. Obviously EA has been in bed with AMD, because BF4 has Mantle support this early on and is about to be released. Once next-gen consoles mature and programmers actively use this technology, I can see this being a good thing. The question I have is what Nvidia is supposed to be doing to counter this new type of programming for PC games. I am wondering if AMD users will have a better experience graphically and in playability in the future. Or does this not matter?
 
Yeah, the results besides folding seem quite positive for the 290X. I'm not in the know enough to say which benchmark best represents a realistic commercial/scientific workload. It seems to have the raw horsepower, though.

Neither the Titan nor the 290X supports ECC (unlike Tesla and FirePro), which I assume is important in certain compute scenarios.

Cinebench used to be the most widely accepted compute benchmark; they don't appear to have run one.

I couldn't give a rat's ass about folding/Bitcoin.
 
Seems the Titan STOMPS the 290X in folding.
In CUDA apps, nV will have the advantage. In OpenCL apps, which are becoming the norm, the R9 290X will stomp any nV card.

You should see the performance of nV and AMD cards on MilkyWay@home for a real-world example.
 
Thanks for the awesome review. I end up doing my best to buy only stuff that gets Gold awards here, when I can afford it. In for a 290X if I don't have to replace the PSU in my sig, which seems iffy.
 
faster console ports to PC for next-gen consoles ... The question I have is what Nvidia is supposed to be doing to counter this new type of programming for PC games. I am wondering if AMD users will have a better experience graphically and in playability in the future. Or does this not matter?
Not just faster ports between PC and consoles, but also better ones, since the underlying hardware is in many ways very similar, so many optimizations will work across all platforms.

nV is doing some interesting things with Maxwell. I believe they're going to be putting an ARM CPU on die or on board with the GPU. The idea is that it'll help with latency for GPU-to-CPU and CPU-to-GPU communication (this is a big problem, though not talked about much for some reason). I think it'll need developer support to work, though, and I don't think that'll play out well for them.

They don't have the console advantage that AMD has this time around, so anything GPU-specific for nV will have a real tough row to hoe in gaining popularity. Other than that, Maxwell should have the normal improvements in bandwidth, flexibility, ALUs, etc., and should be faster than the R9 290X without any GPU-specific programming on the developers' part. It will probably end up quite expensive, though.

Graphically, Mantle games will probably look identical to their DX11.x ports; you'll "just" get more fps. Improving things like graphics effects or in-game assets for Mantle is probably possible, but it probably won't happen, since that means creating GPU-specific art assets/effects, which tends to be more work than it's worth.
 

I think your PSU will be more than enough. [H] was seeing just under 450 W (uber mode) at full load, and that's with an overclocked CPU included. Guru3D recommended a 550-600 W unit for a single card, and I think that's pretty conservative. I can't find any official recommendations.

Sweet audio setup btw ;)
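Rough napkin math on the headroom (assuming [H]'s ~450 W wall-draw figure above and a ballpark 90% PSU efficiency at that load; the efficiency number is an assumption and varies by unit):

```python
# PSU headroom sketch. Assumes ~450 W measured at the wall (uber mode,
# OC'd CPU, per [H]) and an assumed ~90% PSU efficiency at that load.
wall_draw_w = 450
efficiency = 0.90
dc_load_w = wall_draw_w * efficiency  # power the PSU actually delivers, ~405 W

for psu_rating_w in (550, 600, 650):
    headroom_w = psu_rating_w - dc_load_w
    print(f"{psu_rating_w} W PSU leaves ~{headroom_w:.0f} W of headroom")
```

Even a 550 W unit leaves well over 100 W of slack on these numbers, which is why the 550-600 W recommendation looks conservative.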
 
With all of this talk about Mantle, did anyone stop to think that if they run the R9 290X any hotter, it might explode?

Mantle will help if it actually makes the work the GPU is doing more efficient, but it can't save this hot-rodded card if it only makes the CPU side more efficient so that the GPU can do even more work.
 
To everyone complaining about the thermals, might I just add: this was a reference-design card. I think most of us here are more interested in the custom cards, which will have custom cooling, custom fan profiles, and perhaps custom power tuning. I look to those cards to hopefully solve the thermal issues.

Second, 95°C is a completely fine temperature according to AMD. I asked AMD extensively about this, shared my concerns, and told AMD how I knew all of you would react to this high temperature. AMD assured us that 95°C is well below the thermal threshold of the GPU and is a safe temperature for it. Believe me, I know the shock value of 95°C; it is hot, no question about it. I'm just relaying what AMD told me; AMD was confident in the temperature.

Keep in mind also that the temperature is so high because the fan is capped at 40% or 55%. Unlock the fan to go up to 100% and it isn't so bad; the capped fan speed is what keeps the GPU so warm. Again, this is why I think custom cards will solve this problem.

If I were going to buy a 290X, I would be looking at what custom cards are offered.

What is the actual maximum safe thermal threshold for this GPU?
 
That was great info, thanks for sharing. I wish we could get some game developers' or programmers' opinions on here. I will have to look out for this info; I recall seeing a YouTube video where some game developers were excited and a few were not. I will have to read more into this.
 