Is 290X as hot and power consuming as GTX480?

http://www.pcper.com/reviews/Graphi...-Taking-TITANs/PowerTune-and-Variable-Clock-R

They cover it in pretty good detail here: you'll drop about 8 to 10% after gaming on the card for a few minutes due to the need to keep below that 95C temp. I assume we'll see the same thing with the 290 non-X version, just slightly adjusted given the lower "advertised" top clock speed of 947 MHz.
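Rough arithmetic on what that drop means for the 290's advertised 947 MHz top clock (the 8-10% range is from the linked PCPer testing; the math below is purely illustrative, not a measurement):

```python
# Illustrative only: sustained clock after the throttle drop described above.
advertised_mhz = 947              # R9 290 (non-X) advertised top clock
for drop in (0.08, 0.10):         # ~8-10% drop reported after warm-up
    sustained = advertised_mhz * (1 - drop)
    print(f"{drop:.0%} drop -> ~{sustained:.0f} MHz sustained")
```

So at the bottom of that range you'd be looking at roughly 870 MHz sustained, and around 850 MHz at the top of it.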

If you're running something in the hundreds of frames per second you most likely won't notice the drop; in something like Crysis 3, where it's struggling to hit 30 fps at, say, 1440/1600 resolution, you most likely would.

You do have to be alert to benchmarks that are only run in short 30-second intervals and don't let the card reach its apparent 95C load operating temperature before recording results.

Was that in quiet or uber mode?

Anandtech shows no throttling in uber: http://www.anandtech.com/show/7457/the-radeon-r9-290x-review/19

Don't know how long they tested though.

Besides, I doubt it will be an issue with non reference coolers.
 
I don't understand why everyone is worried about a GPU that is designed to run at those temps but is fine with CrossFire and SLI setups..

I'm debating adding another 7950 for CrossFire that will draw more power and make just as much heat, but that's normal and OK... the 290X is not, to most.
 
I don't understand why everyone is worried about a GPU that is designed to run at those temps but is fine with CrossFire and SLI setups..

I'm debating adding another 7950 for CrossFire that will draw more power and make just as much heat, but that's normal and OK... the 290X is not, to most.

True that... LOL...

Perception I guess.

Anyone who's had multi-card systems knows heat. I mean, people buy huge oversized power supplies, and if you do use 80% of a 1-kilowatt PSU, guess what: something will be creating heat.
 
I don't understand why everyone is worried about a GPU that is designed to run at those temps but is fine with CrossFire and SLI setups..

I'm debating adding another 7950 for CrossFire that will draw more power and make just as much heat, but that's normal and OK... the 290X is not, to most.

I think you're overlapping a lot of people here. There are people who are concerned about temps and those who aren't. Neither camp exclusively crossfires or SLIs.

Honestly, I don't like my card being super loud or heating up my room too much. Those are my main concerns. Just a quality-of-life thing. I had 5870's a few years ago. My brother (who was my roommate at the time) had 480's. His room was insanely hot compared to mine. He eventually put them on water, but that heat goes somewhere... It didn't really help cool down his room.

I know I can turn on a fan in my room etc. (and I do), but I'm a bit picky about my room temperature. Also, if it's summer time and the thing throttles more - I'm effectively losing performance because of the season...
 
The R9 290X only uses about 50-60w more power than a 780, roughly the output of a 60w bulb. If a 780 produces tolerable heat for you, so will the R9 290X.

If the R9 290X is putting out too much heat for you, then so will the 780, or for that matter any other high-end video card.

Like Yakk says this is mostly a perception issue. Some more food for thought:
[image: power consumption chart]


Those are power measurements taken at the wall, BTW, with the following specs, which is a fairly high-end system:
CPU: Intel Core i7-4960X @ 4.2GHz
Motherboard: ASRock Fatal1ty X79 Professional
Power Supply: Corsair AX1200i
Hard Disk: Samsung SSD 840 EVO (750GB)
Memory: G.Skill RipjawZ DDR3-1866 4 x 8GB (9-10-9-26)
 
and AMD continues to tell everyone that 95C is fine and won't damage the GPU... I'd still feel more comfortable with aftermarket cooling... makes waiting for the 20nm version tempting as well

Because they know best and have likely tested it extensively...

Same issue with everyone thinking they need to keep their i7 below 50C for it to operate best: leftover info from back in the Pentium 4 days that people need to just let go!

Fact: tech gets smaller, and sometimes heat and power get lower, but you now want more power to drive larger displays and more detail = more power = more heat...
 
Heat and noise never stopped the NV fanboys from gobbling up the GTX 480 when it became available. Either buy one or don't, if you think it's such an issue.
You realize the GTX 480, the card you're berating for its heat output, has a TDP that is a full 50w lower than the 290X's... right? :rolleyes:

If the 480 was bad about heat, the 290X is quite a bit worse...

Overclocked GTX780 dumps more heat and I don't see people complaining.
Two reasons for that:
1. The GTX 780 starts out with a 50w lower TDP than the 290X, so you have some reasonable headroom before you even begin to reach the 300w marker.
2. The stock blower cooler on the GTX 780 is substantially quieter than the stock blower cooler on the 290X, so there's less to complain about.
 
I don't understand why everyone is worried about a gpu that is designed to run at those temps but are fine with Crossfire and Sli setups..

I'm debating on adding another 7950 for CX that will draw more power load and make just as much heat but thats normal and ok...the 290x is not to most.

Sorry, but I don't even hit 75C with my 7970 Dual-X CrossFire. It would be really hard keeping the 290X under 100C in CrossFire, which to me is just too hot. It would be like running two reference GTX 480s in SLI; all I have to say is have fun with that. I'm good with my cards getting to around 85C; anything hotter scares me.
 
You realize the GTX 480, the card you're berating for its heat output, has a TDP that is a full 50w lower than the 290X's... right? :rolleyes:

If the 480 was bad about heat, the 290X is quite a bit worse...


Two reasons for that:
1. The GTX 780 starts out with a 50w lower TDP than the 290X, so you have some reasonable headroom before you even begin to reach the 300w marker.
2. The stock blower cooler on the GTX 780 is substantially quieter than the stock blower cooler on the 290X, so there's less to complain about.
The problem with the GTX 480 wasn't so much that it was hot and loud, it was that it was hot and loud and you got very little in return. It was about 10% faster than a 5870 while consuming about 100W (~80%) more power (http://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_480_Fermi/33.html). This time around, sure the 290X is hotter and louder than the 780, but it's also proportionally faster, so its efficiency is actually rather close (113% vs. 119% at 1600p): http://www.techpowerup.com/reviews/AMD/R9_290X/28.html
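That efficiency point is just division on the numbers quoted from the linked TechPowerUp reviews; a quick sketch of the arithmetic (the input figures are TPU's, nothing new is being measured here):

```python
# Perf-per-watt ratios built from the review figures cited above.

# GTX 480 vs HD 5870: ~10% faster while drawing ~80% more power.
gtx480_vs_5870 = 1.10 / 1.80
print(f"GTX 480: ~{gtx480_vs_5870:.0%} of the 5870's perf/W")

# R9 290X vs GTX 780: TPU's 1600p performance-per-watt ratings (113% vs 119%).
r290x_vs_780 = 113 / 119
print(f"R9 290X: ~{r290x_vs_780:.0%} of the 780's perf/W")
```

The 480 landed around 61% of its rival's efficiency; the 290X lands around 95% of the 780's, which is why "rather close" is a fair summary.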
 
This time around, sure the 290X is hotter and louder than the 780, but it's also proportionally faster
That's not the sense I got from [H]'s review of the 290x... it sure as heck didn't look 20% faster than the 780/Titan.
 
If the 480 was bad about heat, the 290X is quite a bit worse...
The 480 got crapped on not only because it was hot and loud but also because it cost more than the 5870 while only being about as fast. It ran at 93C under load when [H] tested it, too.

1. The GTX 780 starts out with a 50w lower TDP 2. The stock blower cooler on the GTX 780 is substantially quieter than the stock blower cooler on the 290X
What the 780 has going for it is the ACX cooler nV uses as their reference HSF. It is very good. AMD put a mediocre HSF on the R9 290X that is probably "good enough" in most situations, to keep the price down.

If AMD had put a better HSF on their reference card people probably wouldn't be complaining much if at all.

That's not the sense I got from [H]'s review of the 290x... it sure as heck didn't look 20% faster than the 780/Titan.
It's about 10% or so faster than the 780 when both are at stock clocks at 1600p/1440p resolution. If you go up to 4K, that lead opens up to 10-20% faster than the Titan, depending on the game.
 
What the 780 has going for it is the ACX cooler nV uses as their reference HSF.
The ACX cooler is not the Nvidia reference cooler... it's EVGA's custom dual-fan cooler.

The GTX 770, GTX 780, and GTX Titan all use the same Nvidia reference blower cooler:
[image: Nvidia reference blower cooler]


But I agree, it's the best blower I've ever had on a graphics card. Seriously impressed.

If AMD had put a better HSF on their reference card people probably wouldn't be complaining much if at all.
I hinted at that being preferable. I don't think people would have minded the base price being $50 higher if it meant a much better blower cooler being pre-installed.
 
The ACX cooler is not the Nvidia reference cooler...
My bad.

I hinted at that being preferable. I don't think people would have minded the base price being $50 higher if it meant a much better blower cooler being pre-installed.
Gotta hint harder, man; internet + real life has bent my sarcasm/hint detector. Use smilies or something. :D

I think a lot of people would be happier with a better HSF on the R9 290X but I also think AMD did some market research on that before they decided what would go into the end product. Given many people's finances right now, which are crap due to the economy/lack of recovery, a more budget focused product launch might end up getting them more sales. Given the way they sold out I'd say so far it seems to be a sensible decision on AMD's part.

edit: "depending on the game" Despite the heat and noise, the R9 290X offers great performance for a significantly lower price than the competition at the moment. Ultimately that is what seems to matter to most.
 
It's about 10% or so faster than the 780 when both are at stock clocks at 1600p/1440p resolution. If you go up to 4K, that lead opens up to 10-20% faster than the Titan, depending on the game.
Like I said, [H]'s review still doesn't make it look like it delivers 20% more performance for the additional 20% increase in TDP... even at 4k

http://hardocp.com/article/2013/10/23/amd_radeon_r9_290x_video_card_review/14#.UmqWluJKbi4

10% in Crysis 3, 12% in Battlefield 3, 12% in Metro Last Light, 13% in Tomb Raider.
There was a 23% gain in FarCry 3, which is a major outlier. Possible optimization issue on Nvidia's part?
 
Sorry, but I don't even hit 75C with my 7970 Dual-X CrossFire. It would be really hard keeping the 290X under 100C in CrossFire, which to me is just too hot. It would be like running two reference GTX 480s in SLI; all I have to say is have fun with that. I'm good with my cards getting to around 85C; anything hotter scares me.

From what I understand, it will never actually hit 100C. In fact, it's designed to basically stay at 95C. If it ever gets above that, it throttles down to hold 95C. So what you're getting is basically a card that always runs at 95C and gives you variable performance to maintain it.
 
From what I understand, it will never actually hit 100C. In fact, it's designed to basically stay at 95C. If it ever gets above that, it throttles down to hold 95C. So what you're getting is basically a card that always runs at 95C and gives you variable performance to maintain it.
It's worth noting that keeping a consistent temperature (even a high one) is actually better for component longevity than continual thermal cycling.

It's no coincidence that Nvidia does something similar on the GTX 780 and Titan (they will throttle down from Boost Clock to Base Clock in order to maintain a temperature of exactly 80C, if necessary)
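What both vendors are doing boils down to a temperature-target control loop: back off the clock when the die crosses the target, creep back up when there's headroom. A minimal sketch of the idea (the clocks, step size, and sensor readings here are invented for illustration; the real PowerTune and GPU Boost controllers are far more sophisticated):

```python
def throttle_step(temp_c, clock_mhz, target_c=95,
                  base_mhz=727, boost_mhz=1000, step_mhz=13):
    """Return the next clock: shed speed above the target, recover below it."""
    if temp_c > target_c:
        return max(base_mhz, clock_mhz - step_mhz)   # too hot: back off
    return min(boost_mhz, clock_mhz + step_mhz)      # headroom: speed up

clock = 1000
for temp in (80, 90, 96, 97, 96, 94, 90):            # fake sensor readings
    clock = throttle_step(temp, clock)
print(clock)  # settles below the boost clock after the hot stretch
```

The visible effect is exactly what the posts describe: the reported temperature pins at the target while the clock (and therefore performance) floats.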
 
Not to be a dick, but where did anyone come up with this card using more juice than the GTX 480?

R9 290X: http://www.techpowerup.com/reviews/AMD/R9_290X/25.html
GTX 480: http://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_480_Fermi/30.html

It draws CLOSE to the same amount, but not more. Even in UBER mode.
Sorry, I suppose "using" power isn't the appropriate term here.

The 290x has a TDP of 300w, which is 50w higher than a GTX 480... so it might be more accurate to say that the 290x wastes 20% more of the power fed into it than a GTX 480, turning it into heat.

So to answer the OP's question(s):
- Is the 290x as hot as a GTX 480? Hotter
- Is the 290x as power-consuming as a GTX 480? Nearly.
 
Sorry, I suppose "using" power isn't the appropriate term here.

The 290x has a TDP of 300w, which is 50w higher than a GTX 480... so it might be more accurate to say that the 290x wastes 20% more of the power fed into it than a GTX 480, turning it into heat.

So to answer the OP's question(s):
- Is the 290x as hot as a GTX 480? Hotter
- Is the 290x as power-consuming as a GTX 480? Nearly.

The TDP on the GTX 480 was horseshit from day one. Everyone knew it; the card constantly went over its TDP limit.

I also wouldn't say that power is wasted, as the card easily outperforms the GTX 480 by WAY more than 20%.

The GTX 480 in real-world situations would range between 90C and 105C at max load. Mine tends to load full bore at about 97C. Mine does have a factory OC with a reference cooler (750MHz rather than 700MHz), so there is room for error there.

I think you guys might be being a little harsh on this new card. Yes, it runs hot, but that's with the fan locked down. You'll find that if you crank the fan up to GTX 480 levels, it runs cooler than 95C and also never throttles. I think they made a wise move on the fan caps; it gives you the option of having quiet power, or ripping the muffler off and letting the ponies really scream.

Does it use more power than the Titan? Yes. Is it louder than the Titan? Yes. Yet it bats in the same league as the Titan for 450 dollars less. I'll trade 40w of power consumption for 450 dollars any day.

And this is coming from a person with a Nvidia card in every machine. :p
 
The TDP on the GTX 480 was horseshit from day one. Everyone knew it; the card constantly went over its TDP limit.
I've never heard of a reference-model GTX 480 having to dissipate more than 250w worth of heat.

Note, TDP is NOT absolute power consumption. Unless your card is 100% efficient, absolute power consumption will be higher than TDP.

I also wouldn't say that power is wasted, as the card easily outperforms the GTX 480 by WAY more than 20%.
Absolute performance of the card is irrelevant to the figures being discussed.

Going by advertised TDP, the 290x wastes 20% more of the power you feed it as heat. Pretty simple to understand.

Does it use more power than the Titan? Yes. Is it louder than the Titan? Yes. Yet it bats in the same league as the Titan for 450 dollars less. I'll trade 40w of power consumption for 450 dollars any day.

And this is coming from a person with a Nvidia card in every machine. :p
Still not sure why people keep comparing it to the Titan. The GTX 780 (and now the 780 Ti) already offer Titan-league performance for $370 less than a Titan (and the GTX 780 has been offering that price advantage for months).

So, the 290x offers an $80-better cost-break... but an $80-better cost-break on a card that pretty much requires an $80 aftermarket heatsink isn't horribly compelling.

Is there some GPU-compute advantage the 290x has that warrants comparing it to the Titan? Or are we just going after it so the cost break looks bigger? :p
 
and AMD continues to tell everyone that 95C is fine and won't damage the GPU...I'd still feel more comfortable with after market cooling...makes waiting for the 20nm version also tempting

I run my GTX 470 between 90C and 95C nearly 24/7 and have done so for years. :rolleyes:
 
How is it a waste if you get something useful in return?
Uh... it's "waste" because the power in question is being converted into heat rather than into computation. Hence "waste heat."

Unless you specifically WANT heat for some reason, it's a waste byproduct.
 
I loved the GTX 480, extremely stable overclockers, can take a beating.
Two of them OC'ed on air made me strip down to my boxers because of the heat, but boy did they run.
 
I loved the GTX 480, extremely stable overclockers, can take a beating.
Two of them OC'ed on air made me strip down to my boxers because of the heat, but boy did they run.

I still have a system running a pair of 800mhz GTX 480's and it has been rock solid for years. If the 290x is anywhere near as rock solid as the 480 is then I don't think we'll have too much to worry about on the longevity side of things.


It would be nice if that guy that was making custom brackets to attach an Antec Kuhler 620 to Fermi cards would make some for the 290x.
 
AMD needs to update their reference cooler design. The vapor chamber thing is a nice improvement, but it is crippled by using it in the same shrouded design the reference 4800's had years ago. And those 4800s were nearly identical to the coolers on the reference 3800s. That little centrifugal fan doesn't move much air and the position of the heatsink under the housing makes it nigh impossible to clean without disassembling the whole damn thing. All AMD's gotta do is take a cue from their board partners. Pretty much any non-reference cooler is going to do better in a reasonably well ventilated case.

Also, funny to see [H]ers argue about 'waste' heat- I bet there are a few guys out there drooling at the opportunity to build a custom loop to handle four of these firecrackers. More heat = more [H].

I actually commend AMD for stabbing at the high end like this - Historically, AMD/ATI has been shy about pushing die sizes and power use because they've typically targeted the midrange as their main market. Now they have a design that scales very well and they're testing some higher limits - fun to see considering the slowdown in new process generations.
 
Heat has to go somewhere, and that's into your room. It's one thing keeping an already-cool chip cool so you can overclock it; it's another when it's a hot chip to start with.

And I'm not so sure it scales. In fact, the heat this thing puts out, and the fact they want to run it that hot, tells me that it may be near the limit without a major change... which would break their new API. Seems to me Mantle is more to try and keep a design relevant for the next 5 years that otherwise wouldn't be. If they can't get more than a 15% speedup out of Mantle, Maxwell will murder it, and that's without even thinking about G-Sync.
 
You could argue that the heat output is simply due to the limits of this silicon generation and AMD has figured out how to push it as far as they feel comfortable. nVidia has tread that territory more than once. AMD might be pushing the limits of the current GCN architecture, but given how it plays into their long-term plans for HSA I doubt there will be any major paradigm changes for the next shrink.
 
In fact, the heat this thing puts out, and the fact they want to run it that hot, tells me that it may be near the limit without a major change... which would break their new API. Seems to me Mantle is more to try and keep a design relevant for the next 5 years that otherwise wouldn't be. If they can't get more than a 15% speedup out of Mantle, Maxwell will murder it, and that's without even thinking about G-Sync.

Have you ever programmed a thing? You assume that for Mantle to work nothing may change, while Mantle will work on the 7950/7970, 7990, R9 280X, R9 290X, and R9 290.

There are some changes between the hardware, yet Mantle will run on all those cards. But if you want to keep fooling yourself that Mantle will stop working if any hardware changes, go ahead and post another few messages on the board.
 
Have you ever programmed a thing? You assume that for Mantle to work nothing may change, while Mantle will work on the 7950/7970, 7990, R9 280X, R9 290X, and R9 290.

There are some changes between the hardware, yet Mantle will run on all those cards. But if you want to keep fooling yourself that Mantle will stop working if any hardware changes, go ahead and post another few messages on the board.

I think most programmers have a decent grasp of the English language. So...obvious conclusion is that Elios is DEFINITELY a programmer.
 
They're all GCN-based. I think GCN is a dead end and Mantle is a Hail Mary to keep it going. Why doesn't it work on non-GCN cards then?
 
Why do people keep comparing the GTX 480/470 with the 290X... like it's some sort of badge of honor? The 400 series was a terrible line from Nvidia, which was replaced by the real 'Fermi' cards that drew much less power and made much less noise... the 400 series is a bad memory
 
Might want to check this out too; it was made by a programmer:

[image: Graham's Hierarchy of Disagreement]
 
Absolute performance of the card is irrelevant to the figures being discussed.

Going by advertised TDP, the 290x wastes 20% more of the power you feed it as heat. Pretty simple to understand.
Which is ironic, because you don't understand it, or are purposefully confusing it to make Nvidia look better; which is it? TDP is rated by the manufacturer for the HSF; it isn't standardized and has little correlation to actual power consumption numbers. None of what you said above is anywhere close to being correct.
Still not sure why people keep comparing it to the Titan. The GTX 780 (and now the 780 Ti) already offer Titan-league performance for $370 less than a Titan (and the GTX 780 has been offering that price advantage for months).

So, the 290x offers an $80-better cost-break... but an $80-better cost-break on a card that pretty much requires an $80 aftermarket heatsink isn't horribly compelling.

Is there some GPU-compute advantage the 290x has that warrants comparing it to the Titan? Or are we just going after it so the cost break looks bigger? :p
Because the 290X is delivering Titan performance or faster, something the 780 couldn't touch without a healthy overclock. The 290X is also doing this for $100 less than the 780 and almost half the price of the Titan. I'm not sure how you're missing this; Nvidia is simply getting slaughtered in the price/performance metric.
Like I said, [H]'s review still doesn't make it look like it delivers 20% more performance for the additional 20% increase in TDP... even at 4k

http://hardocp.com/article/2013/10/23/amd_radeon_r9_290x_video_card_review/14#.UmqWluJKbi4

10% in Crysis 3, 12% in Battlefield 3, 12% in Metro Last Light, 13% in Tomb Raider.
There was a 23% gain in FarCry 3, which is a major outlier. Possible optimization issue on Nvidia's part?
So the 290X is slightly less efficient than a $1,000 card...
 
The 780 has the same TDP as the Titan, and the 780 Ti will likely be around a 275w TDP, still lower than the 290X; it will be faster than the Titan for gaming but slower for compute, and it costs the same now.
The Titan is a prosumer-level card; the cost is high to keep it from killing Quadro sales.
 
They're all GCN-based. I think GCN is a dead end and Mantle is a Hail Mary to keep it going. Why doesn't it work on non-GCN cards then?

What the fuck are you talking about? Mantle works on every GCN card from the 7000 series and R series.
 
I still run a Bloomfield i7 and I use the Antec Kuhler 620 to cool it; I added an extra fan to it and it can run over 4GHz with that cheap setup.

I know some people over at overclock.net were using this same Antec setup on HD 7950s for massive overclocks, using zip ties to hold it down; they have a build log over there, and some of that know-how could be used on the 290X until we get aftermarket cooling.
 