AMD Radeon R9 290X Video Card Review @ [H]

But if the 290X is faster, isn't the extra power justified :)? If not, why did you not stay with your 680? It uses less power than the 780 you run, doesn't it?

It seems that you still don't understand heat generation: just because your card runs cooler doesn't mean it generates less heat.

Funny, because the GTX 780 actually uses 10 watts less than my old GTX 680 at idle. Since I am not a big-time gamer, I think this makes up for the full-load wattage, which is slightly higher, since my computer sits at idle longer than at full load.

I understand heat generation, and the 290X needs help: that much heat output for a 1-3% performance difference over a stock Titan should be lower. AMD took way too long to release a card to match Titan-level performance, and they could at least have done it with a more efficient card. If Nvidia released a card with the same TDP requirements as AMD, they would surely be hurting.
 
But if the 290X is faster, isn't the extra power justified :)? If not, why did you not stay with your 680? It uses less power than the 780 you run, doesn't it?

It seems that you still don't understand heat generation: just because your card runs cooler doesn't mean it generates less heat.

You seem to be constantly knocking the 95-degree thing, or the noise thing, and don't seem to understand that the 95-degree temp target is set there for a reason: to keep noise down with the reference cooler.

The only REAL efficiency I see Titan having over AMD is that it has a more efficient reference cooler.

You mean 95°C, which equals 203°F? I can almost guarantee that if your GPU is sitting at 95°C, that card will need a constant high-speed fan to exhaust that heat. From the video I saw on YouTube, the 290X was loud compared to a Titan. I would not care about 6-8 FPS if I had to put up with that noise. I experienced this with a 7970 GHz Edition reference cooler and was happy to see that thing go back to the vendor; I thought it was defective because of how loud it got at full load vs. my WindForce GTX 680 back in the day.

The 290X will be a great card when the WindForce version or another non-reference cooler is out for it. Even then, I am curious how it will perform dBA-wise compared to stock, since the heat won't be exhausted but recirculated inside the case. At 95°C, I can only imagine components will not overclock as well due to the heat. Also, not sure if you read Linus Tech Tips, but he mentioned that the 290X could possibly be a poor overclocker from what is out there. Remember, a lot of the reviews showed the card being tested outside of a case; a motherboard sitting on a table isn't the same as the card inside of a case, where you have to factor in the heat being properly exhausted.
 
Funny, because the GTX 780 actually uses 10 watts less than my old GTX 680 at idle. Since I am not a big-time gamer, I think this makes up for the full-load wattage, which is slightly higher, since my computer sits at idle longer than at full load.

I understand heat generation, and the 290X needs help: that much heat output for a 1-3% performance difference over a stock Titan should be lower. AMD took way too long to release a card to match Titan-level performance, and they could at least have done it with a more efficient card. If Nvidia released a card with the same TDP requirements as AMD, they would surely be hurting.

Did you not read the review? The performance difference @ 4K is much higher than 1-3%. Additionally, I still don't think you understand "heat generation".



http://www.anandtech.com/show/6973/nvidia-geforce-gtx-780-review/19

(Anand pegs the 680 @ 2W higher than the 780 at idle)
The 680 uses 50W less than the 780 @ load. If you game for 1 hour a day and idle for 23 hours a day, how much of a difference does that 2W idle saving really make? (Which, by the way, is within the margin of error in testing IMO.)

Let's do some math: 2W × 23 hours ≈ 46 Wh (about 165,000 joules)
50W × 1 hour = 50 Wh (180,000 joules)

So you are spending roughly 4 Wh (about 15,000 joules) more per day by owning the 780! (I'd just shut my gaming box down when I'm not using it, btw.)
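
If you want to plug in your own numbers, here's a minimal Python sketch of that arithmetic (the hours, the idle/load deltas from the link above, and the electricity rate are assumptions; swap in your own):

```python
IDLE_DELTA_W = 2    # the 680 idles ~2 W higher than the 780 (AnandTech figure above)
LOAD_DELTA_W = 50   # the 680 draws ~50 W less than the 780 at load
IDLE_HOURS = 23     # assumed idle hours per day
LOAD_HOURS = 1      # assumed gaming hours per day

# Net extra energy per day from owning the 780 instead of the 680 (watt-hours)
extra_wh_per_day = LOAD_DELTA_W * LOAD_HOURS - IDLE_DELTA_W * IDLE_HOURS
extra_kwh_per_year = extra_wh_per_day * 365 / 1000

RATE_USD_PER_KWH = 0.12  # hypothetical electricity price; adjust for your utility
print(f"Extra energy: {extra_wh_per_day} Wh/day, {extra_kwh_per_year:.2f} kWh/year")
print(f"Extra cost:   ${extra_kwh_per_year * RATE_USD_PER_KWH:.2f}/year")
```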

There is less of a delta between the Titan and the 290X than between the 680 and 780... just saying!
 
Did you not read the review? The performance difference @ 4K is much higher than 1-3%. Additionally, I still don't think you understand "heat generation".



http://www.anandtech.com/show/6973/nvidia-geforce-gtx-780-review/19

(Anand pegs the 680 @ 2W higher than the 780 at idle)
The 680 uses 50W less than the 780 @ load. If you game for 1 hour a day and idle for 23 hours a day, how much of a difference does that 2W idle saving really make? (Which, by the way, is within the margin of error in testing IMO.)

Let's do some math: 2W × 23 hours ≈ 46 Wh (about 165,000 joules)
50W × 1 hour = 50 Wh (180,000 joules)

So you are spending roughly 4 Wh (about 15,000 joules) more per day by owning the 780! (I'd just shut my gaming box down when I'm not using it, btw.)

There is less of a delta between the Titan and the 290X than between the 680 and 780... just saying!


My kW meter and UPS show 10 watts less at idle vs. my previous card, so I believe it since I checked myself. I see a 1-3 percent difference vs. a Titan, which is like 4 FPS higher. For the wattage/noise that 290X puts out in Uber mode, is it really worth it?

Not in my opinion.
For you this card is perfect, since you don't care about energy consumption and heat.
 
My kW meter and UPS show 10 watts less at idle vs. my previous card, so I believe it since I checked myself.

Did you test the difference @ load speeds then?

BTW, I do care about energy consumption; I just don't care that much about 30W @ load. It seems you are making a big deal of it, even though you consciously chose to get a 780, which uses 50W more at load... weird IMO!
 
What 'temperature' the GPU runs at is completely irrelevant when comparing different GPUs. The R9 290X is designed to run at 95°C, and that's not a bad thing. It has nothing to do with how much heat the card puts out.

How much power a card draws has *everything* to do with how much heat it puts out- if you want to look at efficiency, the general approach is defined as 'units of work done per units of energy consumed', which with video cards can be compared through measuring and calculating FPS/watt.
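
As a toy illustration of that FPS-per-watt idea, here's a minimal sketch (the FPS and wattage figures below are placeholders, not numbers from the review):

```python
# Placeholder numbers for illustration only; substitute measured average FPS and
# board power from a review you trust.
cards = {
    "R9 290X (Uber)": {"fps": 60.0, "watts": 290.0},
    "GTX 780":        {"fps": 58.0, "watts": 250.0},
    "GTX Titan":      {"fps": 61.0, "watts": 265.0},
}

for name, c in cards.items():
    print(f"{name:15s}: {c['fps'] / c['watts']:.3f} FPS per watt")
```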

So, while I don't want to throw gasoline on the fire, UCPU is right- the 290X is less efficient than the 780/Titan, because it uses more energy to produce about the same amount of FPS.

And then there's noise, which is also related to power draw. To produce about the same amount of FPS, the 290X is twice as loud or louder than the Nvidia cards using the stock cooler- that's not just bad, it's insane.

But what about heat? Very simply, having nothing to do with the core temperature of each GPU, the 290X consumes quite a bit more power, and thus must produce quite a bit more heat. And that heat goes somewhere. If you're not diligent about getting said heat out of your computer, the 290X will throttle performance. If you are diligent about keeping it cool, that heat is going into your room :).
 
Yeah, I knew about the consoles using AMD parts. I am sure they need all the help they can get, since AMD stock has dropped from $11.00 to $3.38 over the last 5 years on a constant downward trend.
Lies, misinformation, or both. Click on the 5y graph in this link and note that it is not a "constant downward trend from this point 5 years ago." It is definitely low, and it may not be what it once was, but it's definitely not constantly downward from where it was 5 years ago. In fact, it's HIGHER than it was 5 years ago. In any case, since you seem to know a thing or two about finances, instead of celebrating AMD's fall from grace, you could have done better for yourself if you had put in a buy order last night for AMD stock, as it's up 4.02 percent today. Then you wouldn't feel the need to sit in an AMD Flavor forum arguing about how your decision to buy a 780 however-many-months ago makes you better than those who decided to buy a part from another vendor, and instead you'd be a much happier person, because you'd have made 4 percent on your money in a few hours' time.

Nvidia and Intel are hoping they don't go bankrupt, and then what? Do some research on how monopolies work and how the government can intervene.

Are you presuming that I don't understand how monopolies work? In either case, let's follow your logic for a moment and assume that Nvidia is just lying awake at night praying AMD doesn't go under.
  1. Even if AMD did go under, that would still leave Intel as a viable competitor in the graphics market, as the trend is towards integrated graphics and CPU. Both AMD and Intel, but especially Intel, have gotten much better in graphics performance recently.
  2. If AMD went under, then Nvidia wouldn't have to worry about lowering their prices on their Titan, 780, and forthcoming 780 Ti line, to say nothing of the lesser parts. Please explain to me how Nvidia benefits by having to lower their prices.

As far as Intel not wanting AMD to go under, Intel is far more worried about ARM than they are about AMD at the moment. Hopefully AMD can make a game of things again, but right now it's looking grim.
 
If AMD went under, I'd bet that Nvidia, or someone else, might make a play for the company. Even better, Nvidia might go for the CPU and server businesses, and Intel might go for the graphics, so as to keep the feds as far from the industry as possible. Of course, Nvidia would need to negotiate with Intel for the patents, but I don't suppose that would be much of a stumbling block, given that Intel needs an x86 CPU competitor or risks being broken up.
 
You know what they say about arguing with idiots, right? Him/her/it admitting to being incredibly biased would multiply my respect by infinity, but what's infinity multiplied by ZERO?

45 posts in the last 10 pages about the same thing over and over and over :( He promised he would stop, but alas...
 
What 'temperature' the GPU runs at is completely irrelevant when comparing different GPUs. The R9 290X is designed to run at 95°C, and that's not a bad thing. It has nothing to do with how much heat the card puts out.

How much power a card draws has *everything* to do with how much heat it puts out- if you want to look at efficiency, the general approach is defined as 'units of work done per units of energy consumed', which with video cards can be compared through measuring and calculating FPS/watt.

So, while I don't want to throw gasoline on the fire, UCPU is right- the 290X is less efficient than the 780/Titan, because it uses more energy to produce about the same amount of FPS.

And then there's noise, which is also related to power draw. To produce about the same amount of FPS, the 290X is twice as loud or louder than the Nvidia cards using the stock cooler- that's not just bad, it's insane.

But what about heat? Very simply, having nothing to do with the core temperature of each GPU, the 290X consumes quite a bit more power, and thus must produce quite a bit more heat. And that heat goes somewhere. If you're not diligent about getting said heat out of your computer, the 290X will throttle performance. If you are diligent about keeping it cool, that heat is going into your room :).

It really does not consume that much more power, definitely not enough to qualify as "quite a bit": it consumes roughly 10% more power to produce roughly the same FPS @ 1080p, and consumes 15% more power to produce roughly 20% (sorry, I don't know the actual average, someone want to grab it?) more FPS @ 4K (and presumably Eyefinity) resolutions.

I don't understand, because the 780 uses 17% more power to produce 22% more FPS and people are happy with that, but when the same numbers apply to the AMD card it's a big deal :)
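
For what it's worth, here's a quick sketch of how those quoted percentages translate into relative performance per watt (taking the percentages above at face value, not my own measurements):

```python
def rel_perf_per_watt(extra_fps_pct: float, extra_power_pct: float) -> float:
    """Performance per watt relative to the baseline card (1.0 = identical)."""
    return (1 + extra_fps_pct / 100) / (1 + extra_power_pct / 100)

# Percentages quoted in the post above, all relative to the same baseline card
print(f"+22% FPS for +17% power -> {rel_perf_per_watt(22, 17):.2f}x perf/W")
print(f"+20% FPS for +15% power -> {rel_perf_per_watt(20, 15):.2f}x perf/W")
print(f" +0% FPS for +10% power -> {rel_perf_per_watt(0, 10):.2f}x perf/W")
```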

The funny part is, if they had managed to produce a heatsink design that was on par with Titan/780/770, most people wouldn't complain because:

A) The temp would be lower
B) The noise would be lower
C) The performance increase would be more substantial.

With that being said, with some of the custom cards that will hit the market we could see higher performance and lower sound output from a 95-degree card (but with higher power draw). Watercooled cards should hit the theoretical maximum performance for a stock-clocked card; I'd be interested in what those numbers are like.

And BTW, if you want to talk about efficiency, another way to look at it is that AMD is producing the same FPS with ~30% less die space.

Either way, the 290X launch is a good thing for us, since now 780s will be cheaper, and eventually the 290 will be out, pushing the 770 down as well.

(And yes, I do understand that the 290X uses more "power" for the same amount of FPS, and to me 30-50W @ load really doesn't move me enough at the current price of a kWh of electricity, considering the price of the card. Though @ the same price, I might go for a 780.)
 
More power usage doesn't really matter- like you say, it's not much in a single-card configuration, and it's not like electricity prices have gone the way of gas prices yet. I was really just trying to keep the record straight, because the power draw figures are important for other parts of the discussion.

And you're quite right, if AMD had not fallen off the deep end with their cooler design, we wouldn't be worrying about core temps or power draw. But I've already said as much :).

When considering efficiency per die area, well, you probably shouldn't, because that only actually matters to AMD and Nvidia. They pay for the wafers, and it's the cost per wafer divided into the number of viable dies per bin (etc.) that they have to worry about. If you follow mobile processor discussions, you can relate to various vendors' decision to make larger parts rather than higher-clocked parts to keep the power draw (and resulting heat) down while keeping performance up. More than one way to skin a cat and all, since the higher-performance part, analogous to the Hawaii GPU, might cost more to make and have a different yield than the competition's, yet both can result in about the same cost and performance per die in the end.

Finally, I'm really excited about the implications of AMD stepping back up to the 'large GPU, top-end performance' game that they exited so many years ago. It almost got humiliating to have Nvidia passing off their mid-range die as a 'high-end' part, with AMD barely able to compete using their largest- but still small- die.
 
More power usage doesn't really matter- like you say, it's not much in a single-card configuration, and it's not like electricity prices have gone the way of gas prices yet. I was really just trying to keep the record straight, because the power draw figures are important for other parts of the discussion.

And you're quite right, if AMD had not fallen off the deep end with their cooler design, we wouldn't be worrying about core temps or power draw. But I've already said as much :).

When considering efficiency per die area, well, you probably shouldn't, because that only actually matters to AMD and Nvidia. They pay for the wafers, and it's the cost per wafer divided into the number of viable dies per bin (etc.) that they have to worry about. If you follow mobile processor discussions, you can relate to various vendors' decision to make larger parts rather than higher-clocked parts to keep the power draw (and resulting heat) down while keeping performance up. More than one way to skin a cat and all, since the higher-performance part, analogous to the Hawaii GPU, might cost more to make and have a different yield than the competition's, yet both can result in about the same cost and performance per die in the end.

Finally, I'm really excited about the implications of AMD stepping back up to the 'large GPU, top-end performance' game that they exited so many years ago. It almost got humiliating to have Nvidia passing off their mid-range die as a 'high-end' part, with AMD barely able to compete using their largest- but still small- die.

I disagree on die size, as that directly impacts price =p

Anyway, I'll still be keeping my 7970 for the foreseeable future!
 
I disagree on die size, as that directly impacts price =p

Anyway, I'll still be keeping my 7970 for the foreseeable future!

All other things being equal, a larger die is more expensive- but they are very rarely equal.

And yeah, I'll be keeping my GTX670's for a while longer too- there's just nothing out there that provides a compelling upgrade yet. The non-IGZO 4k monitors aren't out yet, and G-Sync hasn't even been announced for decent 4k monitors or for other GPU brands yet either :).
 
You really think that if the 290X die were 100mm^2 smaller (with all performance, temp, and power the same as current), the card would cost any less? That's funny.

No, but I think if it were 150mm^2 larger it would cost more :p Let's face it, both AMD and NV want to make money.
 
Did you not read the review? The performance difference @ 4K is much higher than 1-3%. Additionally, I still don't think you understand "heat generation".



http://www.anandtech.com/show/6973/nvidia-geforce-gtx-780-review/19

(Anand pegs the 680 @ 2W higher than the 780 at idle)
The 680 uses 50W less than the 780 @ load. If you game for 1 hour a day and idle for 23 hours a day, how much of a difference does that 2W idle saving really make? (Which, by the way, is within the margin of error in testing IMO.)

Let's do some math: 2W × 23 hours ≈ 46 Wh (about 165,000 joules)
50W × 1 hour = 50 Wh (180,000 joules)

So you are spending roughly 4 Wh (about 15,000 joules) more per day by owning the 780! (I'd just shut my gaming box down when I'm not using it, btw.)

There is less of a delta between the Titan and the 290X than between the 680 and 780... just saying!


You're wasting your breath. LOL.
 
You really think that if the 290X die were 100mm^2 smaller (with all performance, temp, and power the same as current), the card would cost any less? That's funny.

What are you talking about? If you're talking manufacturing costs, you get more usable chips per wafer with a smaller die size, so YES, it is cheaper to produce, and that affects the cost of the final product. This is practically the entire reason NV made a killing with the GK104: it came in far above Nvidia's initial performance estimates for a 294mm^2 die (IIRC). The GK110 is substantially more expensive to produce as a DIRECT result of its die size and fewer usable chips per wafer; Hawaii is far less expensive to produce than GK110, and AMD can price it more aggressively if they want to, as a result. GK110 also had fairly large R&D costs, and that is partially why it was initially positioned for the HPC market.
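
A back-of-the-envelope sketch of the chips-per-wafer point, using the common gross-die-per-wafer approximation and rough die areas (~561mm^2 for GK110, ~438mm^2 for Hawaii) on a 300mm wafer; this ignores defect density and yield, which matter a lot in practice:

```python
import math

def gross_dies_per_wafer(die_area_mm2: float, wafer_diameter_mm: float = 300.0) -> int:
    """Classic approximation: usable wafer area / die area, minus an edge-loss term."""
    r = wafer_diameter_mm / 2
    return int(math.pi * r ** 2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

for name, area in [("GK110 (~561 mm^2)", 561), ("Hawaii (~438 mm^2)", 438)]:
    print(f"{name}: ~{gross_dies_per_wafer(area)} gross dies per 300 mm wafer")
```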

Anyway, the manufacturing costs have no bearing on the final temperature and all of that nonsense; I'm not sure why you even bring that up. From a cost-to-produce standpoint, AMD's Hawaii is much cheaper than the GK110. What this doesn't take into account is R&D costs, which AMD surely wants to recoup, as Hawaii has been in the works for some time, I imagine. It also doesn't account for the other differences on the PCB itself (512-bit bus and 4GB of GDDR5), but from a "chip cost alone" perspective, the answer to your question is all too obvious.

Because of this, AMD is in a much better position than Nvidia if it boils down to any sort of "price war". AMD can price Hawaii more aggressively than NV can for the aforementioned reasons. That said, I still think the GTX 780 is a better "balanced" card if you ignore product cost; noise matters to me, as it does to many people. But for $549? It's acceptable. And I'm sure AIBs are working on third-party coolers to address the noise issue as well. Personally, I think it's a good card for the cost. No existing Titan or 780 OC owner will want to switch, but it's a pretty darn good card for new GPU purchasers. As I've said before, thanks to the 290X, NV is adding a 3-game bundle, the 780 Ti, among other things. Everyone wins in this case regardless of which you prefer, honestly.
 
Is it possible to run a VGA monitor on the 290X at 1920x1200 >100 Hz?
 
Is it possible to run a VGA monitor on the 290X at 1920x1200 >100 Hz?

From TPU: "Please note that the DVI outputs no longer support analog monitors." That's a little surprising, but I guess gaming CRTs are probably becoming really, really rare with 144Hz LCDs available. (Though I have seen LCD screens with only an analog cable, fail!) Edit: Yeah, you might want to double-check that. I don't know.
 
The Anandtech article mentions a DP->VGA adapter, but I am not sure if that is supported on the 290X, and if so, up to what resolution?
 
The Anandtech article mentions a DP->VGA adapter, but I am not sure if that is supported on the 290X, and if so, up to what resolution?

That will be specific to the adapter; you can buy them now.

Also, if you need VGA, you might try using Intel or AMD on-board video. I use it in addition to my 670s for extra screens.
 
So, now that all CrossFire communication happens straight over the PCI-E bus, would two of these cards in CF be severely bottlenecked on PCI-E 2.0 x8?

I might pick up two R9 290s when aftermarket cards hit, but if there's a big performance deficit on PCI-E 2.0, I'm going to pass.
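
For reference, a rough sketch of the theoretical per-direction link bandwidth involved; whether XDMA CrossFire actually needs more than PCI-E 2.0 x8 provides is exactly the open question:

```python
# Effective bandwidth per lane, per direction, after encoding overhead
MB_PER_LANE = {"2.0": 500, "3.0": 985}  # 2.0: 5 GT/s w/ 8b/10b; 3.0: 8 GT/s w/ 128b/130b

for gen, lanes in [("2.0", 8), ("2.0", 16), ("3.0", 16)]:
    print(f"PCI-E {gen} x{lanes}: ~{MB_PER_LANE[gen] * lanes / 1000:.1f} GB/s per direction")
```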
 
That will be specific to the adapter; you can buy them now.

Also, if you need VGA, you might try using Intel or AMD on-board video. I use it in addition to my 670s for extra screens.

That would not work for FW900 gaming.
 
That would not work for FW900 gaming.

Wow, lucky dude. To be honest, you could try the DP adapter solution, but I expect that it'll add some input lag, likely not what you're looking for. It's really worth investigating if you're serious about it.
 
And it doesn't overclock for shit.

TechReport only squeezed 90MHz out of the core, but the memory did overclock well.

I think "bad" overclocks aren't always a letdown. Perhaps this card is good enough at running near its peak that you don't have to overclock. I wish I could flip a switch on all my stuff and have it just run @ max settings without all the tedious work of overclocking manually.
 
Awesome review, Brent, thanks a lot :) This is indeed a mighty graphics processor, a bit old-school though, IMO. Along with the upcoming R9 290 (anticipating another great review!), this is going to be a very disruptive force! Needless to say, enthusiasts will gladly welcome this shake-up, much like what happened back in the days of RV770.
 
Digital Viper-X- said:
No, but I think if it were 150mm^2 larger it would cost more :p Let's face it, both AMD and NV want to make money.

So, die size only sometimes directly impacts price ;). Particularly when you have the larger die...

What are you talking about?
...

You've missed the point... See above.
 
Yeah, I see what you're saying, but AMD is in the better long-term position in the event of a price war. I think Nvidia will definitely lower GTX 700 prices, but they probably won't get ugly with it; their GK110 chips definitely cost a sizable chunk more than Hawaii.

I guess the main point is that, since AMD has less discrete market share, I think they priced it aggressively on purpose to gain some of it back. It's too early to tell, but so far it seems to be working, as the 290X is basically sold out everywhere (to be fair, this can easily change when NV makes their move next month). Worst comes to worst, AMD can go even lower IMHO, because they get more chips per wafer at a lower cost. Which is good for all of us, whether you buy red or green, you know? But I do get your point, namely that 100% of the price savings isn't necessarily passed on to the consumer. I don't disagree with you there.

All in all, I think this holiday season will be pretty exciting: the Nvidia price cuts, the 780 Ti, the 3-game bundle, and seeing where the price wars end up.
 
So, die size only sometimes directly impacts price ;). Particularly when you have the larger die...



You've missed the point... See above.

Well, smaller die = more competitive price.

If AMD could make the card for $200, you don't think they would drop the price, knowing that their competition can't follow suit? You know what I meant from the start, dammit, stop splitting hairs =)
 
Plenty of reviews on this card out now show that it won't throttle if you just bump the fan cap a bit or leave it on "Uber", so you can agree with the OP but still be factually wrong. Also, even if you watercool the card, you'll still have to deal with the same amount of heat in your room if you have poor ventilation; the GPU will just operate at a lower temp. Temp and heat are not the same thing.


Dude, I had a watercooling setup for 3 years, so I know the difference between temps and heat, and it doesn't change the fact that it's the same heat output. However, with water cooling you will maintain higher clock speeds without a stock air cooling solution running at up to 100% fan speed just to hold an overclock. I guess you're OK with the Uber-mode fan speed of 55%, which is 50 dB under load and WAY louder than I want in my PC, and I don't even want to discuss going above 55% on AMD's stock air blower, which is garbage.

Stop treating me or anyone else like we don't have a clue; you couldn't be more wrong. If you're OK with AMD's new product, that's fine; I'm not, and we'll agree to disagree. Facts are facts, though.
 
But if the 290X is faster, isn't the extra power justified :)? If not, why did you not stay with your 680? It uses less power than the 780 you run, doesn't it?

It seems that you still don't understand heat generation: just because your card runs cooler doesn't mean it generates less heat.

You seem to be constantly knocking the 95-degree thing, or the noise thing, and don't seem to understand that the 95-degree temp target is set there for a reason: to keep noise down with the reference cooler.

The only REAL efficiency I see Titan having over AMD is that it has a more efficient reference cooler.

Digital Viper is correct.

The inefficiency in the 290X is the reference cooler and its limited ability to remove heat from the GPU. The 780/TITAN have a more refined, more efficient reference cooler that is better at removing the heat generated by the GPU.

This is why I look forward to custom cards fixing this inefficiency in heat removal. That said, the cards still operate just fine with the "inefficient" reference cooler and the GPU at 95°C. That to me shows great robustness and high quality, being able to run well that warm without any problems on a less efficient cooler. Good job engineering high-quality components to withstand that.
 
Awesome review, Brent, thanks a lot :) This is indeed a mighty graphics processor, a bit old-school though, IMO. Along with the upcoming R9 290 (anticipating another great review!), this is going to be a very disruptive force! Needless to say, enthusiasts will gladly welcome this shake-up, much like what happened back in the days of RV770.

I love shakeups; they keep things interesting, and the consumer is always the winner. Competition like this is so good. I'm glad there are at least AMD and NVIDIA to push each other; having just one would be bad for the consumer. I wish there were still a third player. I remember the potential of Kyro; that went nowhere :p
 
I love shakeups; they keep things interesting, and the consumer is always the winner. Competition like this is so good. I'm glad there are at least AMD and NVIDIA to push each other; having just one would be bad for the consumer. I wish there were still a third player. I remember the potential of Kyro; that went nowhere :p

Didn't they just add tiled rendering to DX11.2? It's kind of like Aureal's ideas winding up in AMD's GPUs :).
 