Why did AMD introduce a botched solution in the name of the 290X

Status
Not open for further replies.

madgun

[H]ard|Gawd
Joined
May 11, 2006
Messages
1,783
The chip is great, as we all know, but the combined product is not. Why would AMD handicap its GPU line with the worst-performing cooler in the past 10 years? (The GeForce FX 5800 Ultra comes to mind.)

And why even have a stupid BIOS switch? If I were paying top dollar for a card, I would expect it to perform decently from the outset, without having to rely on some crazy fan profile. Most of the reviewers out there are right to criticize AMD for the solution they put together. Nvidia is able to take perfect advantage of this situation by pricing their card higher than the 290X. Of course they have a superior product that runs cooler, overclocks like a beast, and simply works out of the box without any throttling issues. But the price difference is still too much, and apparently they are getting away with it.

And why even base an argument on the claim that the R9 290X will perform more decently, with less throttling, once it has a better cooler? After all, the competition will also get non-reference cooling.

AMD should have taken more time to get the product out to market. Their place at the bottom of the performance charts has made them sloppy, and it shows across all their product lines.

I loved the company called ATI. True revolutionaries - the 9700 Pro, anyone?

I hope AMD can learn from how ATI put hard work into their products. Better coolers, AA and AF technology at product launches. Ah the good old days :(
 
For what it's worth, I agree. I think the 290X Hawaii chip is incredible. The cooler holds it back so much that it's fucking bullshit - throttling, BIOS switch, what the fuck is this shit? AMD should have made a cooler similar to the Titan cooler to make a more complete product.

That is a product that I probably would have bought. As things are with the quiet mode throttling shit, no sale. The chip has potential and AMD fucked their chance over.
 

I know, right... OP must need a serious hug. Did you have a bad day at work, OP? I mean, don't hold back, tell us how you really feel... :p:p

I am personally stoked to own a card that is as fast as a Titan outta the box, and that will then crush it, run neck and neck with a 780 Ti in some games, and outrun it in others, for a whopping $388 brand new!:eek::eek:..

Even if you hate AMD with a passion and feel this is the worst product launch in history, you should be shouting their PRAISE (you know, the opposite of your rant) for forcing Nvidia to slash prices to the bottom of the barrel (by their metric) instead of continuing to rob their loyal customers with insane prices (based on price/performance)..

Oh, let's not forget the game bundle they are offering (after seeing how successfully AMD used bundles to steal much-deserved market share) and a $100 discount on their OMG AWESOME handheld console (powered by the lackluster Tegra 4 SoCs they couldn't give away)...:eek:

I hope I gave you a full belly of delicious trollbait casserole ;)..

For what it's worth, I agree. I think the 290X Hawaii chip is incredible. The cooler holds it back so much that it's fucking bullshit - throttling, BIOS switch, what the fuck is this shit? AMD should have made a cooler similar to the Titan cooler to make a more complete product.

That is a product that I probably would have bought. As things are with the quiet mode throttling shit, no sale. The chip has potential and AMD fucked their chance over.

Since you managed to squeeze in a post while I was typing this one, I will address your concerns here as well... I personally think the stock cooler isn't the greatest, but it doesn't "hold the card back" in any way. The cooler's fan profile was set to allow the amount of performance that AMD wanted. It's as simple as that.

The 7950/7970 were the same way. They had insane overclocking headroom (perhaps the best GPU overclocking ever possible, IMO). AMD could have launched the cards at the max possible clock speed (even the worst chips would do ~1.1 GHz on the core) but chose not to. The card was already the fastest on the market at its launch-day clocks.

When Nvidia got their contender (the 680) out 3-5 months later, it was a bit faster. So AMD took advantage and retook the crown with the launch of the 7970 GHz Edition cards. Had AMD launched the 79XX cards at ~1.1 GHz and Nvidia then released a faster card, AMD would have had no way to counter and retake the performance lead*.

I fully believe that AMD knew Nvidia was going to attempt to rain on their parade with a fully enabled GK110 part once they realized that the 290X smoked the 780/Titan... And what happened? Nvidia announced the 780 Ti, just as AMD expected. AMD has now seen the best Nvidia has to offer for the next 6-8 months (or longer), until both move to 20 nm process chips.

AMD can now give their AIBs permission to open the floodgates with custom versions of the 290/290X, and then launch a 290X Extreme/Ultra/PE (insert marketing title here) a few months from now if their AIBs aren't happy with the sales results of the custom cooler/PCB designs. We already know the chips have PLENTY of headroom, which is just a nice free bonus, especially for those of us who water-cool. We can get ~20-25% more performance with 50-90% less noise than the best air-cooled GPU design.

*Performance varies greatly from title to title, based on which company the developer chose to optimize for.
 
What a fascinating post. Beating a dead horse is popular 'round here. :rolleyes:
 
I think the more important question is why the fuck people are paying $550 for a 290X when they can get a card for $400 and achieve similar or even faster performance.
 
I think the more important question is why the fuck people are paying $550 for a 290X when they can get a card for $400 and achieve similar or even faster performance.

Doh, Ninja'd again! It sucks to have to literally spell things out line by line, but I personally feel the post above from CC does an excellent job of doing so!:p

And $400? Pffft...I paid $388 for mine from NE with a 1 day shipping time;)..
 
"Their place at the bottom of the performance charts"... LOL, that is just too much.

Nvidia did do something right: they optimized to reduce power draw and slapped on one hell of a reference cooler to keep it cool. But they also charge what, $250+ extra for that cooler? For that same $250 you can get your own aftermarket cooler for the AMD cards, never have to worry about throttling, AND have a faster card. Hmmmmmmmm.

Granted, the 290X could have been more tuned or whatever, but it is a beast, whereas the 780 Ti, for how much "extra" it has, truly is not a step up over Titan, considering it has a fully enabled GK110 core AND doesn't have to worry about the extra DP bits and such.

AMD does what they can with what they have. Nvidia constantly does the minimum possible and charges as much as humanly possible; on the CPU side, Intel does this every launch.

All these fanboys who constantly throw FUD and crap around do not realize one key factor: if AMD did not have competitive parts to counter Nvidia/Intel, you would be paying an arm and a leg, maybe even a kidney, and NOTHING would be getting better. AMD is the one who makes them push prices down; AMD is the one who gets the new stuff out the door - tessellation, GDDR, better soldering methods, etc., etc. So yeah, keep pounding your drum; maybe one day you'll hit yourself with that stick and grow up to learn something :)

Great post, btw, CC. The 290X/290 and the rest of the Rx 200 series were not lackluster; they bumped performance and reduced pricing like the 6000 series did. Nothing wrong with that: a couple of new features and some tinkering and it's all good. Mantle, I think, will be the huge game-changer, and this bodes VERY well for the Rx 300 series, as they will have had time to really get the most out of it, and we'll probably see their full-blown redesigned cooler come out as well. (The Radeon 9k/Rx 200 cards were supposed to use a much-redesigned full vapor-chamber cooler, but they did not bother; less heat also means less power required, so it's very odd.)

Hey, if I had the case, I would have an R9 290, but I do not :p
 
I agree that AMD has kept Intel and Nvidia honest, but at the same time they need to bring their game up a level. My qualm is with AMD not taking the time to polish their product lines. I myself would have gone with the R9 290 had I not gotten a great deal on a 780 Ti Super Clocked.

If you think the R9 290 could be called efficient, take a look at the competition and see the meaning of efficiency. The competition is efficient RIGHT NOW.
 
If you think the R9 290 could be called efficient, take a look at the competition and see the meaning of efficiency. The competition is efficient RIGHT NOW.
WTH are you even talking about? You sound like you're badly quoting nvidia PR material at this point.

Hawaii has a much smaller die than GK110 but has similar performance. If efficiency is what you're after, then you'd have to go with AMD.
 
At the expense of more heat, power consumption and throttling.

Your understanding of efficiency amazes me.
 
Regarding the cooler discussion that keeps popping up: something people need to take into account is that the primary sales will be custom cards through board partners, which is why there isn't a strong emphasis on the cooler. The reference design is just that, a reference; it isn't really the end product that most people will buy.

Nvidia's high-end reference cooler came about because it was developed specifically for a line of cards that was reference-only; partners were not allowed to make customs, so the reference design was the end product. This meant the reference solution had to be very appealing from an end-user commercial standpoint. If you look at Nvidia's reference coolers for cards outside of that situation, they are rather basic as well, like AMD's reference cooler.

If you are wondering why it is done this way, the answer is likely to move up the launch window while keeping it a hard launch. In order to launch with custom boards, you would need to grant the board partners more lead time for design, as well as more time to build up chips to ensure they have a proper supply. Reference-card-only launches address this issue, since the cards are all manufactured from one source with the same design. You'll notice that both companies' out-of-the-gate high-end parts are typically reference-only launches.
 
After a product launch, has anyone ever advocated ignoring the reference design and going for the upcoming AIB partners' coolers? I am not sure I agree with the "wait for the non-reference card" argument.

The product is here and it is throttling at a setting AMD provided. Reviewers have no choice but to use a stupid BIOS switch to make the card competitive. If the card was so nice and competitive, why even include the BIOS switch in the first place?
 
The 290 is very appealing to me; in fact, I can smell 290 non-X CrossFire in my future, with the new DMA engine and all... I would never buy them with reference coolers, though, and I love reference coolers.
 
After a product launch, has anyone ever advocated ignoring the reference design and going for the upcoming AIB partners' coolers? I am not sure I agree with the "wait for the non-reference card" argument.

The product is here and it is throttling at a setting AMD provided. Reviewers have no choice but to use a stupid BIOS switch to make the card competitive. If the card was so nice and competitive, why even include the BIOS switch in the first place?

Honestly, this would depend on your perspective and what you are looking for. "GPU" and "video card" are used interchangeably, but they really refer to different things.

Are you interested in the 290X reference video card, and do you plan to run it with the stock cooler? Then yes, you of course need to take into account the actual characteristics of that video card, which include the cooler.

Or, alternatively, are you interested in the 290X GPU (fully enabled Hawaii)? In that case, the actual characteristics of the reference board used at launch are not as important to you, since that may not be the actual video card you are considering. The 290X GPU itself can be (and, like most GPUs, will be) used in video cards other than the reference design.
 
I just picked up two 290s for $760... and they run quieter than the two 7970s they replaced. Not nearly as loud as some of the green shills made it seem they would be.

Fantastic cards: more than $600 in savings over two 780 Tis, and performing almost the same....

this thread is a major fail. You nVidia trolls are really butthurt.
 
At the expense of more heat, power consumption and throttling.

Your understanding of efficiency amazes me.


Well, no one is putting a gun to your head and making you pay for "botched" hardware, and by that definition everyone else is crazy enough to spend their money on it. I'm sure that after reading your post the [H] crowd will flock to Nvidia and pay more money for a good cooler; I'm sure the people at Nvidia are really grateful for your keen observation skills.
 
I'm no Nvidia troll, my friend, but I do not want two GPUs in my case running at 95C. I 100% agree the cards are the balls and you did good, Nuke, but I will wait for some MSIs with nice coolers.
 
I'm no Nvidia troll, my friend, but I do not want two GPUs in my case running at 95C. I 100% agree the cards are the balls and you did good, Nuke, but I will wait for some MSIs with nice coolers.

When you say that, is your concern about the effects it will have on the rest of your system, or about the GPU itself?
 
If AMD engineers say the cards were designed to run at 95C... I believe them! However, CCC is now set up so you can set a target temp and a max fan speed. If you don't want them running at 95C, then they won't. I personally set mine at 75C, and the fans only get loud during gaming sessions. As a headphone gamer, this does not bother me.

And not that cost is the only thing that matters, but all that money I didn't spend for the same performance from nVidia can buy a nasty custom water loop if/when I decide I want these silent and cool.
 
Well, they said over the lifetime of the card. Technically, the warranty is only for 2 years, I believe :p

Kidding aside, though, did you actually look at the performance impact of setting such a low temperature target? The GPU will downclock to reach that temperature target if needed.
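To picture what that downclocking looks like, here's a toy Python sketch of a temperature-target control loop. This is NOT AMD's actual PowerTune algorithm, and every number in it (target temp, fan cap, clock speeds, step sizes) is made up purely for illustration:

```python
# Toy model of a temperature-target throttle loop (not AMD's real
# PowerTune algorithm; all numbers are invented for illustration).
# The idea: the controller ramps the fan first, and only starts
# dropping the core clock once the fan is already at its cap.

def throttle_step(temp_c, fan_pct, clock_mhz,
                  target_c=75, fan_cap_pct=55,
                  base_clock=727, boost_clock=1000):
    """Return the (fan %, clock MHz) the controller picks next."""
    if temp_c <= target_c:
        # Under target: restore boost clock, relax the fan.
        return max(fan_pct - 5, 20), min(clock_mhz + 25, boost_clock)
    if fan_pct < fan_cap_pct:
        # Over target but fan has headroom: spin the fan up first.
        return min(fan_pct + 5, fan_cap_pct), clock_mhz
    # Fan is capped: the only lever left is dropping the clock.
    return fan_pct, max(clock_mhz - 25, base_clock)

# Hot GPU, fan already at the 55% cap -> the clock has to come down.
fan, clock = throttle_step(temp_c=94, fan_pct=55, clock_mhz=1000)
print(fan, clock)  # fan stays at the cap, clock drops below boost
```

This also matches the "only throttles after max fan speed is reached" behavior discussed later in the thread: with a lower temperature target, the fan cap is hit sooner, so the clock drops sooner.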
 
After a product launch, has anyone ever advocated ignoring the reference design and going for the upcoming AIB partners' coolers? I am not sure I agree with the "wait for the non-reference card" argument.

The product is here and it is throttling at a setting AMD provided. Reviewers have no choice but to use a stupid BIOS switch to make the card competitive. If the card was so nice and competitive, why even include the BIOS switch in the first place?

Actually, there is absolutely nothing stopping the reviewers from changing the fan speed, just like there is nothing stopping the consumer who ends up buying the reference card from changing it, so that argument has no weight. Whether the reviewer actually changes the fan speed is at their discretion. Personally, if it can be changed manually within the drivers, and not just through a third-party app, then it should also be shown in the review, which is what Brent did.
 
I will believe them as long as the card doesn't throttle. Until then it's bull crap.

If AMD engineers say the cards were designed to run at 95C... I believe them! However, CCC is now set up so you can set a target temp and a max fan speed. If you don't want them running at 95C, then they won't. I personally set mine at 75C, and the fans only get loud during gaming sessions. As a headphone gamer, this does not bother me.

And not that cost is the only thing that matters, but all that money I didn't spend for the same performance from nVidia can buy a nasty custom water loop if/when I decide I want these silent and cool.
 
Well, they said over the lifetime of the card. Technically, the warranty is only for 2 years, I believe :p

Kidding aside, though, did you actually look at the performance impact of setting such a low temperature target? The GPU will downclock to reach that temperature target if needed.

I was under the impression that the cards only throttle after max fan speed has been reached. Is this not correct? I set my max fan speed to 100% and they definitely do not ramp up that high. I may just set the max temp to 95C anyway. The only thing I do not want is that heat dumping into my case, and the reference coolers exhaust out the back. Honestly, I don't care how hot the GPUs get as long as the other components are safe.
 
Actually, there is absolutely nothing stopping the reviewers from changing the fan speed, just like there is nothing stopping the consumer who ends up buying the reference card from changing it, so that argument has no weight. Whether the reviewer actually changes the fan speed is at their discretion. Personally, if it can be changed manually within the drivers, and not just through a third-party app, then it should also be shown in the review, which is what Brent did.

Brent did the right thing, otherwise the card isn't competitive at all.
 
At the expense of more heat, power consumption and throttling.
Heat and throttling have nothing to do with efficiency; that is down to their choice of HSF, most likely for cost-control reasons. Power consumption is high for all high-end GPUs.

You don't know what you're talking about at all.
 
Heat and throttling have nothing to do with efficiency; that is down to their choice of HSF, most likely for cost-control reasons. Power consumption is high for all high-end GPUs.

You don't know what you're talking about at all.

Dude, AMD has a simpler memory architecture, which allows them to keep a smaller die. When the die size decreases, there is less surface area available to dissipate the heat. Hence, with the amount of power these chips draw, they cannot shed that heat as quickly as they need to.

Why are you guys dragging the whole efficiency crap into all of this? Efficiency on a GPU is quantified in terms of power used per task done. Haven't you looked at the power consumption charts, or has AMD shown you a different set of benchmarks? :rolleyes:

Don't argue with me if you don't know crap about physics and how stuff works. Smaller die size has nothing to do with efficiency. For your reference, here's the definition of efficiency:

http://www.merriam-webster.com/dictionary/efficiency

(1) : effective operation as measured by a comparison of production with cost (as in energy, time, and money)
 
If AMD engineers say the cards were designed to run at 95c... I believe them! However CCC now is set up where you can set a target temp and a max fan speed. If you don't want them running at 95c, then they won't. I personally set mine at 75c and the fans only get loud during gaming sessions. As a headphone gamer this does not bother me.

And not that cost is the only thing that matters, but all that money I didn't spend for the same performance from nVidia can buy a nasty custom water loop if/when I decide I want these silent and cool.

This is normally what I do, shoot for 75C, so when gaming, what temps are they hitting?
Are you able to sustain 75C, and at what fan %?
 
When you say that, is your concern about the effects it will have on the rest of your system, or about the GPU itself?

The GPU itself. I mean, these things only have a 1-3 year warranty at best, right? Can they handle 1,000 hours of BF4 at 95C? What happens if Mantle really pegs the card and makes it work even hotter and harder?
 
Dude AMD has a simpler memory architecture which allows them to keep a smaller die.
They decreased clocks and widened the bus, which resulted in less die area used for the memory controller, but that doesn't mean it's simpler at all. You don't know what you're talking about.

When the die size decreases, there is less surface area available to dissipate the heat.
That is power density, which still has nothing to do with efficiency. You don't know what you're talking about.

Efficiency on a GPU is quantified in terms of power usage per task done.
There are many ways to measure efficiency: die size, power usage, IPC, etc. A significantly smaller die for similar performance is a pretty big win for AMD, and much more important than spending an extra 40-60W of power, which is perfectly acceptable for a high-end GPU. It's probably the key reason they're able to offer such performance at such a low price point. If we were talking about something for use in a mobile or power-constrained platform you'd have a point, but we're not, so you don't. It'd be funny if it weren't sad that you quoted the definition of efficiency but still don't understand it.
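To make the "many ways to measure efficiency" point concrete, here's a back-of-the-envelope Python sketch. The die sizes (~438 mm² Hawaii vs ~561 mm² GK110) are approximate launch-era figures, the board-power numbers are rough, and the fps value is a pure placeholder, so treat this as an illustration of the metrics rather than a benchmark:

```python
# Back-of-the-envelope comparison of three different "efficiency"
# metrics. Die sizes and board-power figures are approximate, and
# the fps value is a placeholder -- everything here is illustrative.

def metrics(name, fps, watts, die_mm2):
    return {
        "card": name,
        "fps_per_watt": fps / watts,       # energy efficiency
        "fps_per_mm2": fps / die_mm2,      # area (cost) efficiency
        "watts_per_mm2": watts / die_mm2,  # power density (heat flux)
    }

# ~438 mm^2 Hawaii vs ~561 mm^2 GK110, similar placeholder performance.
r9_290x = metrics("R9 290X", fps=60, watts=290, die_mm2=438)
gtx_780ti = metrics("GTX 780 Ti", fps=60, watts=250, die_mm2=561)

for m in (r9_290x, gtx_780ti):
    print(m["card"],
          round(m["fps_per_watt"], 3),
          round(m["fps_per_mm2"], 3),
          round(m["watts_per_mm2"], 3))
```

On these toy numbers the smaller die wins on performance per mm² (which is what lets the price come down) while losing on performance per watt and running at a higher power density - which is exactly why the cooler choice matters more on Hawaii.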
 
After a product launch, has anyone ever advocated ignoring the reference design and going for the upcoming AIB partners' coolers? I am not sure I agree with the "wait for the non-reference card" argument.

The product is here and it is throttling at a setting AMD provided. Reviewers have no choice but to use a stupid BIOS switch to make the card competitive. If the card was so nice and competitive, why even include the BIOS switch in the first place?

Yeah, it's retarded. The fact of the matter is, this isn't something that anyone should defend. Blower coolers are better for small form factors and multi GPU so not everyone can use an open air type of card.

AMD fucked up. Period. The chip has potential but it throttles at insane levels with quiet fan profiles. This should not be the case. Hell, Anandtech and PCPer had their quiet mode 290 at 40% fan throttling so much that it basically became as fast as a 280X at that point - yet AMD fans defend this shit? How can you say the blower cooler shouldn't be better than it is? I don't want to run insanely loud (and *I* think 55% fan is loud on the reference shroud) to get good performance. So you can get a "quiet" card that performs like shit or a loud card that performs well. Talk about a really fucking stupid compromise.....
 
The 290X performs better clock for clock, and the reason it gets so hot is not the cooler but the performance output. It works harder; it's just physics. The world's greatest overclocking champs are winning with the AMD 290X right now. NVIDIA can't compete on performance or on price.

Quit your fantasy already.
 
Yeah, except an overclocked GTX 780 matches/passes the 290X fairly easily, and without using 200W more power in the process... Sorry, but the GTX 780 has way more TDP headroom for overclocking. The 290X? How can you realistically overclock the 290X without using a jet-mode 80% fan to prevent throttling, when your card is already drawing insane amounts of power for that overclock?

An overclocked GTX 780 is also $50 cheaper. Or you can get a Classified 780 for the same price as the 290X.
 
I'm just going to wait 2 weeks for the non-reference designs to come out, and then we'll all be happy. I don't think I have ever in my life bought a GPU with a reference cooler design. That's the penalty of early adoption, I guess. :D
 
Yeah, except an overclocked GTX 780 matches/passes the 290X fairly easily, and without using 200W more power in the process... Sorry, but the GTX 780 has way more TDP headroom for overclocking. The 290X? How can you realistically overclock the 290X without using a jet-mode 80% fan to prevent throttling, when your card is already drawing insane amounts of power for that overclock?

An overclocked GTX 780 is also $50 cheaper. Or you can get a Classified 780 for the same price as the 290X.


Dude, if the GTX 780 were the best-performing card, it would be the world overclocking champion, but it is not. The AMD R9 290X is the champ. With driver updates it's only going to get better.
 
When you say that, is your concern about the effects it will have on the rest of your system, or about the GPU itself?

This guy has worthwhile points all the time. I'm not sure how big of a deal the chip core temp is if the coolers are doing an adequate job of blowing almost all the hot air directly out of the case. I do believe my initial reaction was the same--I was like I don't want anything that hot in my room. But it isn't necessarily that bad. Like 85 and 95 are both hot air. How big of a difference is that in the real world? I don't know so I shouldn't try and speculate.

This thread wasn't as bad as it could have been. I went 780s this round, but AMD made it possible by forcing prices down. There is no question that the 200 series made a real impact in the GPU world, in a way that history will remember as quirky but effective. When the partners' coolers come out, I think we will see some feathers fly, but I think they will be pretty solid.
 
I'm just going to wait 2 weeks for the non-reference designs to come out, and then we'll all be happy. I don't think I have ever in my life bought a GPU with a reference cooler design. That's the penalty of early adoption, I guess. :D

With AMD, the last generation of HD 7900 series reference cards proved to work out great for water cooling, if that is a route you're interested in pursuing.

The 290X/290 reference design has good water block support, so it's a good choice if you plan on water cooling.
 
I have the 290 and I have never heard the thing, and I believe it is designed as such because AMD knows what the hell they're doing better than you or the review sites, as they have been cooling a run of 4870X2/5870X2/6990/7990 cards with their design work.

They have designed a card that does not need a lot of cooling, and to me, the fact that it can run Heaven 4.0 at 95C means it's bulletproof. Try running your GTX 780 at 95C.
 