AMD Radeon R9 290X Video Card Review @ [H]

Yeah, that's right, that is the total power consumption of my computer, with monitors.
You're kinda inflating your numbers by including your monitors. Also take a look at that chart Phuncz posted and look at the power difference between the R9 290X and Titan.

I'm going to repost it because top of new page:


edit: If the PSU is a quality one, running within 10W of its max rated output would be OK. Many quality PSUs can even be run out of spec, though that isn't good to do. The point is you're being a lil' hyperbolic saying a 250W margin of error is "required". Anyway, AMD recommends a 650W PSU, so this is doubly silly.
 
Apparently you guys didn't get the memo:


CPU: Intel Core i7-4960X @ 4.2GHz
Motherboard: ASRock Fatal1ty X79 Professional
Power Supply: Corsair AX1200i
Hard Disk: Samsung SSD 840 EVO (750GB)
Memory: G.Skill RipjawZ DDR3-1866 4 x 8GB (9-10-9-26)

And that's from the WALL. A high-quality 450W PSU will do fine as long as you don't overclock like mad, aren't using old power-hungry hardware (Core 2 Extreme), and don't have 6+ HDDs in your PC.
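
For anyone doing the math on the "from the WALL" point, here is a quick Python sketch of the wall-to-PSU conversion. The ~90% efficiency and the 400 W example reading are assumptions for illustration, not figures from the review.

[code]
# Rough wall-draw-to-PSU-load conversion: the PSU only delivers the DC side of the
# wall reading; the efficiency loss is dissipated inside the PSU itself.

def psu_load(wall_watts, psu_rating, efficiency=0.90):
    dc_load = wall_watts * efficiency   # what the components actually pull from the PSU
    print(f"{wall_watts:.0f} W at the wall -> ~{dc_load:.0f} W DC load, "
          f"{dc_load / psu_rating:.0%} of a {psu_rating:.0f} W unit")

psu_load(400, 450)  # 400 W at the wall is an assumed round figure, not a measurement
[/code]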

I would not recommend or even risk running Uber mode with a 460 watt PSU. He is going to risk damaging all his components if it blows. I think this is funny when UPS are so cheap anyway. You should always provide your desktop with at least 250 watts of overhead.
 
Wanted some clarification on a part from the article:

By "any ports," does this actually mean that I can have two monitors on the DVI ports and the third monitor on an HDMI port? That's still one of my biggest gripes with AMD: I have to use a DisplayPort-capable monitor or a DisplayPort adapter that may or may not work if I want to use three monitors on one card.

I'm curious about this as well, but my guess would be that you can use both DVI ports with Eyefinity.

As for the DisplayPort adapter business, I too am running a three-monitor setup (2560x1440 * 3), and with my 7970 CF setup I use two mini-DP/USB adapters with the third monitor plugged into DVI, and it works. It can be glitchy at times, but it mostly works.
 
Apparently you guys didn't get the memo:

[attached: total system power draw chart]


CPU: Intel Core i7-4960X @ 4.2GHz
Motherboard: ASRock Fatal1ty X79 Professional
Power Supply: Corsair AX1200i
Hard Disk: Samsung SSD 840 EVO (750GB)
Memory: G.Skill RipjawZ DDR3-1866 4 x 8GB (9-10-9-26)

And that's from the WALL. A high-quality 450W PSU will do fine as long as you don't overclock like mad, aren't using old power-hungry hardware (Core 2 Extreme), and don't have 6+ HDDs in your PC.
And you actually think that's as high as it will go? Even my dinky system pulled over 300 watts at the wall in Crysis 3. At stock clocks he will be fine, but it will still be pushing the PSU fairly hard at times. With overclocking he is done.
 
You will not save power or reduce heat in your house if you water cool. Water cooling reduces the temp of the part being water cooled.

The heat is then dumped into your room/house via the heat exchanger aka radiator.

Power usage on the video card itself will change little if at all, plus you have to add in the draw of all the fans and the pump you use... so your power bill will stay the same or go up a bit.

It's a little depressing how you guys don't understand the difference between temperature and heat.

edit: The only way to reduce heat in your house with water cooling is to put the heat exchanger outside the house, so the heat is dumped there instead of being trapped inside with you. There are people who have done this, but it's not easy for most homes and is fairly rare. There is also geothermal PC cooling, which also dumps the heat outside or into the ground under your house, but that's even harder to do. By all means post pics if you do either of these things; I always get a kick out of it.
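
Since the temperature-vs-heat point keeps getting missed, here is the conservation argument as a minimal Python sketch: every watt the card draws ends up as heat in whichever room the radiator sits in, air-cooled or water-cooled. The 300 W card-level figure is just an illustration, not a measurement.

[code]
# Cooling changes where and how fast heat leaves the components,
# not how much heat ends up in the room.

WATTS_TO_BTU_PER_HR = 3.412   # 1 W dissipated continuously = 3.412 BTU/hr

def room_heat(card_watts):
    btu_hr = card_watts * WATTS_TO_BTU_PER_HR
    print(f"A card dissipating {card_watts:.0f} W dumps ~{btu_hr:.0f} BTU/hr into the room, "
          f"regardless of the cooler, unless the radiator sits outside the house.")

room_heat(300)  # illustrative card-level draw
[/code]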


So is AMD, but top-end GPUs from either vendor have been lousy to terrible in terms of energy usage at load for years, so this really doesn't mean much at all.


It does when you take into consideration SLI configurations and heat/energy usage, and I don't think you understand the concept I was trying to get across.
 
Attn: Nvidia fanboys. Keep buying Nvidia!! Honestly, the rest of us (well, most of us, anyway) don't care. The best way to view these cards positively and for you to remain impassioned, loyal, devoted subjects of the sect of Jen-Hsun Huang is to recognize that competition is good for you. Nvidia will likely drop the price of the 780 to $549 (or $499) and then introduce the 780ti at the previous 780 price. Enjoy your previous (or upcoming) purchases, and revel in your self-perceptions. Chances are in either case, your cyber phallus shall remain unchanged in length or girth.
 
It does when you take into consideration SLI configurations and heat/energy usage, and I don't think you understand the concept I was trying to get across.
SLI/CF doesn't violate the laws of thermodynamics. You can change the GPU temps and change where the heat goes, but you will not make the heat vanish; that is magical thinking.

Also, we weren't talking about CrossFiring R9 290Xs. His argument, which you and another poster agreed with, was that "water cooling will be the R9 290X's only saving grace," plus that water cooling will somehow reduce the heat the GPU dumps into your room/house, plus you added that it will save on power usage.

edit: This:
If you take into account everything in your house to save energy, you can benefit from lower energy consumption, which will in turn make your house easier to cool down.
is not something I'm disagreeing with BTW.
 
And you actually think that's as high as it will go? Even my dinky system pulled over 300 watts at the wall in Crysis 3. At stock clocks he will be fine, but it will still be pushing the PSU fairly hard at times. With overclocking he is done.

HardOCP got to about 395 W in Uber with i7 [email protected] GHz. A quality 450 W PSU should be able to cope with that, though I would feel more comfortable with a 550 W. But that is a high CPU OC.

With overvolting and overclocking the 290X you would need a stronger PSU.
 
So my 290X just arrived but the driver isn't on AMD's website yet.

Also the BF4 code didn't work, so that kinda sucked too.

But I'm guessing they're not awake yet. Guess I have to wait a couple more hours.
 
You should care about the heat since this video card will come in handy as a space heater:D

Winter is almost here

Why oh why you did not use the opportunity to inject a Game of Thrones reference (Winter is Coming!) is beyond me. Just six more months until April and season 4.
 
I agree with this statement. Even though I am a power junkie myself, I find myself trying to save energy throughout my house. My computer office is upstairs, in the hottest part of my house. After investing in a kW meter and determining what was generating vampire power on my desktop, I was surprised to find out that my old UPS, AC66 adapter, NEC monitors, subwoofer, and other accessories generated heat even when off. Upgrading to a green CyberPower UPS and green surge protectors, which cut vampire power to my entire computer and peripherals, has reduced the temps by at least 3 degrees, which means the room stays cooler and is faster to cool down. I can't imagine what the 290X would be like in my type of environment in the summer. In the winter I would not care. Nvidia is big on ISO standards and works to be an environmentally friendly organization. That is why I am sure most companies would rather pay the up-charge to put Titans into their supercomputers to save on energy costs vs. current AMD cards.
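
The vampire-power point is easy to sanity-check with the same kind of meter: multiply the standby watts by hours per year. The 25 W combined standby figure and the $0.12/kWh rate in this sketch are assumptions for illustration, not the measurements from the post above.

[code]
# Standby ("vampire") draw over a year; note that every one of those watts
# is also released into the room as heat.

def standby_cost(standby_watts, rate_per_kwh=0.12):
    kwh_per_year = standby_watts * 24 * 365 / 1000
    dollars = kwh_per_year * rate_per_kwh
    print(f"{standby_watts:.0f} W standby ~ {kwh_per_year:.0f} kWh/year "
          f"(~${dollars:.0f}/year at ${rate_per_kwh}/kWh)")

standby_cost(25)  # assumed combined standby draw for monitors, UPS, speakers, etc.
[/code]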

I have to ask, because it seems that either I am reading this wrong or you are confused.

Just because the 780 chip is running at a cooler temp doesn't mean there is less heat being dumped into your room. The more efficient the heatsink on your GPU is, the more heat it will remove from the GPU and dump into your room. If the 780 uses 30 fewer watts than a 290X, I don't think you will feel any difference in your room temp.
 
I would not recommend or even risk running Uber mode with a 460 watt PSU. He is going to risk damaging all his components if it blows. I think this is funny when UPS are so cheap anyway.
That's why you buy good PSUs, not just the ones with "more watts". The good ones have over-current protection and short-circuit protection; the first one would protect you from that.
UPS = PSU?

You should always provide your desktop with at least 250 watts of overhead.
What scientific explanation do you have for your 250W+ overhead requirement?

And you actually think that's as high as it will go? Even my dinky system pulled over 300 watts at the wall in Crysis 3. At stock clocks he will be fine, but it will still be pushing the PSU fairly hard at times. With overclocking he is done.
I'd suggest you read my post again, covering this portion with more care:

CPU: Intel Core i7-4960X @ 4.2GHz
Motherboard: ASRock Fatal1ty X79 Professional


If you'd read Quartz-1's post thoroughly, you'd see that he isn't overclocking but is using an Intel i7-3770S.
 
You're kinda inflating your numbers by including your monitors. Also take a look at that chart Phuncz posted and look at the power difference between the R9 290X and Titan.

I'm going to repost it because top of new page:


edit: If the PSU is a quality one, running within 10W of its max rated output would be OK. Many quality PSUs can even be run out of spec, though that isn't good to do. The point is you're being a lil' hyperbolic saying a 250W margin of error is "required". Anyway, AMD recommends a 650W PSU, so this is doubly silly.

I can understand where you are coming from. But if you are saying AMD recommends 650 watts, I would not be gambling my components with a lower-rated PSU than what is recommended. Also, this can cause warranty repair issues, as the video card manufacturer will always ask what your power supply is rated at during an RMA. It's kinda like saying I put 100 percent ethanol in my gas tank instead of regular gas in a Honda Accord and am now taking it to the dealer expecting warranty repairs free of charge.
 
I will have to agree with this statement, since the average person like yourself does not know how to effectively monitor or capture the deals as they are released. You have to be on top of it, but nonetheless they exist.

Monitoring a website 8 hours a day < Enjoying life and getting paid for work
 
I would not be gambling my components with a lower-rated PSU than what is recommended. Also, this can cause warranty repair issues, as the video card manufacturer will always ask what your power supply is rated at during an RMA.
They still probably won't void your warranty though, unless they're being dicks, since if the GPU has too little power it won't blow up, it just won't power up. Faulty PSUs blow up your hardware when they break; the actual wattage won't matter.

edit: So you actually believe water cooling will make heat vanish? If you don't, then where does the heat go?
 
I have to ask, because it seems that either I am reading this wrong or you are confused.

Just because the 780 chip is running at a cooler temp doesn't mean there is less heat being dumped into your room. The more efficient the heatsink on your GPU is, the more heat it will remove from the GPU and dump into your room. If the 780 uses 30 fewer watts than a 290X, I don't think you will feel any difference in your room temp.

When combined with all the other components in your system, every bit helps, especially going CrossFire/SLI. When you are gaming at full load, 50-65 watts is a lot of heat. I guess the only way to test this theory is to stick your face up to a 60 watt incandescent light bulb. I installed all energy-efficient LED/CFL bulbs in the house, so I have no way to go about testing this.
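
For scale on the 50-65 watt point, here is what a ~60 W load-power difference works out to over a long gaming session; the 4-hour session length is an assumption, and the 60 W delta is just the number being debated above.

[code]
# Extra energy (and therefore extra room heat) from a ~60 W load-power difference.

WH_TO_BTU = 3.412   # 1 Wh = 3.412 BTU

def session_delta(delta_watts, hours=4.0):
    wh = delta_watts * hours            # extra energy drawn, all released as heat
    print(f"{delta_watts:.0f} W extra over {hours:g} h = {wh:.0f} Wh "
          f"(~{wh * WH_TO_BTU:.0f} BTU) of additional heat in the room")

session_delta(60)
[/code]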
 
But if you are saying AMD recommends 650 watts, I would not be gambling my components with a lower-rated PSU than what is recommended.
This is because of those cheap PSUs that can't sustain a high load, which are often included in cheap cases and OEM computers. Those cheap PSUs won't be able to sustain 80% load for long. It's not a requirement; it's a way for the manufacturer to protect itself when your low-end Acer/Gateway computer blows smoke after you install a high-end GPU.
 
Attn: Nvidia fanboys. Keep buying Nvidia!! Honestly, the rest of us (well, most of us, anyway) don't care. The best way to view these cards positively and for you to remain impassioned, loyal, devoted subjects of the sect of Jen-Hsun Huang is to recognize that competition is good for you. Nvidia will likely drop the price of the 780 to $549 (or $499) and then introduce the 780ti at the previous 780 price. Enjoy your previous (or upcoming) purchases, and revel in your self-perceptions. Chances are in either case, your cyber phallus shall remain unchanged in length or girth.


Do you feel better you mentioned this now? Because I think this has been talked about over and over. If you understood how monopolies work, you'd see there is a reason why Nvidia and AMD play with each other like this, just like Intel and AMD. I personally think that is why Intel is delaying Broadwell chips: because AMD is far behind and barely able to surpass Nvidia. Look how long it took to get the 7990 right when the Titan was released 8-9 months ago.
I guess companies can say anything they want to avoid releasing new technology so their main competitor, AMD, does not go out of business.

http://arstechnica.com/gadgets/2013...broadwell-cpus-delayed-due-to-yield-problems/
 
HardOCP got to about 395 W in Uber with i7 [email protected] GHz. A quality 450 W PSU should be able to cope with that, though I would feel more comfortable with a 550 W. But that is a high CPU OC.

With overvolting and overclocking the 290X you would need a stronger PSU.


I am confused, because you say the total system draw was 395 watts in Uber, but in the review it says total system draw at full load in Uber was 440 watts. The system itself without the video card was 90 watts.


http://www.hardocp.com/article/2013/10/23/amd_radeon_r9_290x_video_card_review/15#.UmqMyflON8E
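
A minimal arithmetic pass over the numbers quoted in this post, just to show what the card-only share works out to; it doesn't settle which figure the earlier reply was citing, and the 90% PSU efficiency is my assumption, not a review number.

[code]
# Card-only share of the quoted wall figures.
total_wall  = 440.0   # W at the wall, full load in Uber mode (quoted above)
system_wall = 90.0    # W at the wall, system without the video card (quoted above)
efficiency  = 0.90    # assumed PSU efficiency

card_wall = total_wall - system_wall       # wall draw attributable to the card
card_dc   = card_wall * efficiency         # rough DC-side draw of the card itself
print(f"Card share: ~{card_wall:.0f} W at the wall, ~{card_dc:.0f} W DC-side")
[/code]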
 
How the fuck is it that a certain Nvidia fanboi is still basically crapping all over this thread by repeating the same stupid fucking drivel over and over? It looks like a good 1 in 4 posts is by the same person making excuses for why Nvidia, as of yesterday, is being beaten by AMD.

AMD has released the best card for gaming out of the box, period, and the cards can only get better. I won't be buying a 290, but I am sure as hell looking forward to mid-range cards being released based on the new architecture.

I think they are afraid of what AMD can become. I've never seen such a strong response from the other half on a GPU release before. It's like something just snapped and they have all lashed out at full force.
 
Why oh why you did not use the opportunity to inject a Game of Thrones reference (Winter is Coming!) is beyond me. Just six more months until April and season 4.

I have no idea what you are talking about; I don't watch that show.
 
I think they are afraid of what AMD can become. I've never seen such a strong response from the other half on a GPU release before. It's like something just snapped and they have all lashed out at full force.

The green side calling the red side hot, loud, and power hungry is quite hypocritical, don't you think? Short memories... who cares, buy what you like.
 
They still probably won't void your warranty though, unless they're being dicks, since if the GPU has too little power it won't blow up, it just won't power up. Faulty PSUs blow up your hardware when they break; the actual wattage won't matter.

edit: So you actually believe water cooling will make heat vanish? If you don't, then where does the heat go?

The manufacturer won't RMA your card if you are not running the recommended PSU.

I have no idea what you are talking about when it comes to water cooling making heat vanish.
Heat doesn't vanish unless it has somewhere to escape.
 
The green side calling the red side hot, loud, and power hungry is quite hypocritical, don't you think? Short memories... who cares, buy what you like.

It's a bit damn myopic, even if it's been true for a couple of years and reinforced by this release. It was the other way around for quite a while before that.

Sad part is this: this card really is just too damn loud, and that cooler AMD used is worthless. It was built to provide these benchmarks, and nothing more.

So while we can all bask in the performance that this card brings to the table (and it is awesome), literally every review, forum, and comment section is full of one specific piece of advice:

"Wait for the custom coolers"
 
The manufacturer won't RMA your card if you are not running the recommended PSU.
This is false for at least some companies, and I'm pretty sure for most if not nearly all.

I have no idea what you are talking about when it comes to water cooling making heat vanish. Heat doesn't vanish unless it has somewhere to escape.
This is the exact opposite of what you were saying on the other page. This is not me misreading your posts or misunderstanding, either; see Digital Viper-X's post.
 
It's a bit damn myopic, even if it's been true for a couple of years and reinforced by this release. It was the other way around for quite a while before that.

Sad part is this: this card really is just too damn loud, and that cooler AMD used is worthless. It was built to provide these benchmarks, and nothing more.

So while we can all bask in the performance that this card brings to the table (and it is awesome), literally every review, forum, and comment section is full of one specific piece of advice:

"Wait for the custom coolers"



It's all just a game to make it look like one company is ahead of another. Although AMD has been way behind Intel in the CPU industry for the last 2 years, I think that Nvidia will strike back and take the lead by releasing a new Titan within the next 3-6 months. Mantle is a joke right now, and so is the onboard DSP sound chip that no one has used or will see much benefit from for at least another 6 months to a year. By then new technologies will be implemented by Nvidia to make games faster. I think AMD has a lot of work to do engineering-wise to really push ahead like Nvidia did with the Titan when it was released 8-9 months ago. All those people who bought the Titan at the price they did got a good deal. I think during that time I went through a GTX 670, a 680, and now a 780, and probably took a 200-300 dollar depreciation hit.
 
It's a bit damn myopic, even if it's been true for a couple of years and reinforced by this release. It was the other way around for quite a while before that.

Sad part is this: this card really is just too damn loud, and that cooler AMD used is worthless. It was built to provide these benchmarks, and nothing more.

So while we can all bask in the performance that this card brings to the table (and it is awesome), literally every review, forum, and comment section is full of one specific piece of advice:

"Wait for the custom coolers"

Apparently not many are taking that advice as they are sold out almost everywhere...it never stopped anybody snapping up 480's. Buy what you want when you want and don't worry about other opinions.
 
This is false for at least some companies, and I'm pretty sure for most if not nearly all.

I will have to disagree with your statement here. The fact of the matter is the manufacturer will ask you, "What is your PSU rated at?" If you come back with "yeah, I have a 450 watt PSU and I am running a 290X right now," they are going to come back with "go get a 650 watt PSU and call us back," because they will not diagnose your card any further, period.

This is the exact opposite of what you were saying on the other page. This is not me misreading your posts or misunderstanding, either; see Digital Viper-X's post.

Ok, I guess you should keep rereading it if you don't understand what I am saying.
 
Apparently not many are taking that advice as they are sold out almost everywhere...it never stopped anybody snapping up 480's. Buy what you want when you want and don't worry about other opinions.

It's called supply and demand.
Obviously there was a shortage of supply, and a lot of people fall for new technology without knowing how to interpret benchmarks and energy consumption vs. other graphics cards out there. Plus the price is good for the performance value on the 290X. What they did not realize is that a GeForce GTX 780 Ti is going to be released here shortly and perform better. Also, it could be that since winter is coming they can finally get rid of that space heater in the office.
 
When combined with all the other components in your system, every bit helps, especially going CrossFire/SLI. When you are gaming at full load, 50-65 watts is a lot of heat. I guess the only way to test this theory is to stick your face up to a 60 watt incandescent light bulb. I installed all energy-efficient LED/CFL bulbs in the house, so I have no way to go about testing this.

The 50-65 W is in Uber mode, in which case you are getting better performance. If you overclock your 780, do you think you won't be increasing the power consumption and heat generated?

The 30 W more it uses in Quiet mode will not be noticeable when your computer is already spitting out 400-450 W. Unless you are sitting in a closet.
 
Ok, I guess you should keep rereading it if you don't understand what I am saying.
Wooooah buddy, "not diagnose your card further" is very different from "won't RMA your card"!

You've engaged full goalpost-shifting mode; attempts to spin your arguments further will cause you to crash and burn, there is no turning back now!!

And that is without even trying to back up anything you were saying earlier either.
 
It's called supply and demand.
Obviously there was a shortage of supply, and a lot of people fall for new technology without knowing how to interpret benchmarks and energy consumption vs. other graphics cards out there. Plus the price is good for the performance value on the 290X.

nm......
 
Apparently not many are taking that advice as they are sold out almost everywhere...it never stopped anybody snapping up 480's. Buy what you want when you want and don't worry about other opinions.

If we could ever verify what that actually meant, then we could discuss it :).

All we can say about the availability is that demand has outstripped production- but we don't know what the production numbers are. The Titan sold out at twice this price too, remember!


Objectively, it would make sense for production to be low, assuming GPU yields are high. We can only hope that the majority of these cards are being fitted with custom coolers, and that their arrival at retailers will represent actual supply.

Only the ignorant and those putting these cards under water would buy one right now :cool:.
 
Do you feel better you mentioned this now? Because I think this has been talked about over and over.

Are you seriously criticizing me for mentioning something that's been pointed out before? My goodness, in your own posts alone there are likely more exoplanets discovered than there are times you've reiterated the same points over and over. We get it; you are an Nvidia shill (or might as well be). Guess what? We don't care. You spend your money where you'd like; I've ordered my R9 290X.
 
The 50-65 W is in Uber mode, in which case you are getting better performance. If you overclock your 780, do you think you won't be increasing the power consumption and heat generated?

The 30 W more it uses in Quiet mode will not be noticeable when your computer is already spitting out 400-450 W. Unless you are sitting in a closet.


I don't know, maybe it is because I am so fond of my GTX 780 Windforce's dBA levels and cooling capacity. I bought a reference 7970 GHz Edition and that thing was hot and loud. I cannot imagine exhausting 95°C heat into my room in the summer.
 
If we could ever verify what that actually meant, then we could discuss it :).

All we can say about the availability is that demand has outstripped production- but we don't know what the production numbers are. The Titan sold out at twice this price too, remember!


Objectively, it would make sense for production to be low, assuming GPU yields are high. We can only hope that the majority of these cards are being fitted with custom coolers, and that their arrival at retailers will represent actual supply.

Only the ignorant and those putting these cards under water would buy one right now :cool:.

Love that last part...nice troll. :rolleyes:
 