Tech Report on the Difference Between Review Samples and Store-Bought R9 290X Cards

What a mess. This is like the 5th website that found substantial performance variances between press and retail cards....

Pretty sad that AMD doesn't have the quality control or engineering foresight to prevent stuff like this from happening.
 
I think they were just using quiet mode, not uber. Correct me if I'm wrong, though.
 
I've not seen any throttling at all with my 290, even when flashed to a PowerColor 290X OC BIOS (overclocked to 1030MHz). I played BF4 the other day for two hours, and about an hour in I loaded up GPU-Z to monitor clock speed; the only time it dropped below 1030MHz was during a map change. I still have the reference cooler on it, in a well-ventilated case: an Antec 900 with a side fan. All the fans are stock and on low speed, except the side fan, since the case doesn't come with one there; the side fan is a three-speed Antec, and it's on low speed as well.


The only time the cooler actually got noticeably loud was when I was playing single-player BF4 and sitting in the pause menu. I stepped outside for a cigarette and it was pretty loud; once I got out of the pause menu it went back to being pretty silent. I loaded up GPU-Z then as well, and it was still at 1030MHz. I'm seriously debating sending back the Gelid Icy Cool I bought with it (due to the reviews of the loud blower), as I don't really see the need for it in my application.

I am putting together a new computer in a few days, though: an i7-3770K with a Cooler Master Hyper 12 EVO, using the same case and power supply I have now. I'll hang on to the Gelid at least until after that, to see if that brings up my ambient temps, since the new cooler won't be venting heat directly out the back the way my Antec Kuhler 620 does now.
 
You know, we have a LOT of 290/290X owners here in [H]-Ville and I have not seen cherry picked samples as being an issue.

And I just have to say... nah, NVIDIA has no ties to Newegg or the ability to alter any process there. I have known Scott for a long time and I would say he has the utmost integrity, but NVIDIA being the one pushing this makes it sort of a stinky situation.

We have done samplings direct out of stock for years for testing, and rest assured we NEVER let anyone know what we were doing or had one of the tech companies set it up.
 
I really don't get why sites are trying to make this into some conspiracy instead of just telling people to turn up the fan speed. They seem to think flipping to uber mode is not an option.

It's also pretty hilarious that Nvidia bought cards for them and I am supposed to trust they are unbiased.
 
Thanks for the input, Kyle. I haven't seen anyone at [H] complain about their R9s either, and a lot of people are getting even better performance with custom coolers (still waiting for the AIB solutions).

The fact that Nvidia reached out and offered to buy some cards is... interesting.

Stealthy, they were sealed and direct-shipped, so I'm thinking they were unbiased. Read Tech Report's piece on Mantle too; it's pretty interesting and goes into a good amount of detail.
http://techreport.com/review/25683/delving-deeper-into-amd-mantle-api

While reading the R9 cherry-picking article, I did entertain the possibility of poor airflow in home PCs compared to completely open test rigs. For the most part, unless your PC already sounds like a jet engine, you probably have worse airflow than if the card were out in the open. Ambient temperature also matters, of course. Anyway, those were the only factors I could come up with.
 
Um, I'm pretty sure there have been a few complaints about their cards throttling.
 
It's the nature of the new cards, I think. If you kept the stock cooler, have poor airflow, or are in a warmer environment, then yes, it will throttle more; you can't rule those factors out. Better cooling will keep the cards from throttling as much. Plus, I'm pretty sure all the reviews are done on an open test bench (not inside a case), so by their nature they get better airflow.
 
The cards were sent directly from Newegg without Nvidia having touched them. But, as per AMD fashion, AMD is the victim here, I guess. /s

Personally, I think the entire situation is AMD's fault, and it is bullshit. It's not Nvidia's fault that AMD gave them the ammunition for this. AMD should have spent more than 30 cents on their goddamn cooler if they knew it would cause such substantial performance differences at factory defaults. Fact of the matter is, if the 290X had a Titan-quality shroud, none of this would be an issue and it would never have been brought up. It's AMD's fault, and it isn't a conspiracy. Not everyone wants 55% fan speed on their 290X, and 55% fan on the 7970 was quite loud to me.

I do fully expect aftermarket cards to fix this situation completely and I also fully expect the 290 to become a gem once those cards are released. But I can't believe anyone would defend AMD for creating this situation - the entire thing is 100% their fault.
 
Why spend all that time developing a monster GPU only to stick bottom-of-the-barrel TIM and an insufficient cooler on there?
 
No, it's people's fault for not having a well-vented case.
 
Lol, clearly Nvidia is pissed that AMD showed them up, and will do anything to try to save face and justify their pricing.

Anyone who spends 500+ on a video card and then refuses to flip a switch on it to get optimum performance probably shouldn't have bought one.

It's a non-issue for the rest of us.

Oh, and the stock cooler on the 290s weighs a friggin' ton; there's no way it was cheap to make, it's just a shit design.
 
Nvidia has around 65% discrete GPU market share, excluding iGPUs and APUs. AMD showing them up... alrighty then. If you say so.

Anyway, you have an interesting way of justifying 15% performance variance at factory defaults as being okay. That's essentially what you're telling me: anyone wanting to use quiet mode is fucked if they want consistent performance, and that's okay. Whatever you say, man.

Hey, at least there's Bitcoin mining. Right?
 
Just looks like PR mudslinging... not impressed.
 
You have a really negative opinion of the cards; I have a realistic opinion. It's a switch on the card, not some pain-in-the-ass hoop to jump through.

And yeah, if I were the market leader I would be pissed if the little guy showed me up, and make no mistake, AMD sucker-punched Nvidia. The butthurt is palpable.

And for someone like you, who doesn't own the card, to be this condemning of it is a little odd.

Here's the situation: AMD got wins with both new consoles, people are talking about Mantle, and, oh by the way, they released a card that is cheaper and potentially faster than what Nvidia has to offer. Of course Nvidia is going to go out of their way to save face.
 
I don't think AMD showed Nvidia up at all. The 290X is a hot and loud card, just like the GTX 480 was, and despite what you think, a sizeable portion of the consumer market does not like hot and loud video cards. The GTX 480 was panned for that reason, just like the 290X is being panned now for being hot, loud, and having performance variances.

I'll end on this note: Nvidia would have reason to be worried if AMD had delivered a great card by every metric, meaning acoustics, software, and user experience. The situation right now is that the card doesn't even have a WHQL driver, people are reporting the latest beta to be unstable (see the thread in this forum), the black-screen issue still exists, and the card throttles, causing 15% performance variance at quiet-mode settings. Does AMD advertise "you lose 15% performance for choosing this option" on their cards? Nah. Do you think AMD showed Nvidia up with all of these issues in mind? Again, Nvidia would have a reason to worry if AMD had designed a great card by every metric. Instead, the card is hot and loud, with inconsistent performance. It boggles my mind, man. If AMD had spent 20 bucks on a better cooler, none of this would be an issue: the noise would never be brought up and the throttling wouldn't happen. I just can't understand why AMD let things happen this way. AMD being AMD, I guess? This card could have been fucking great. Instead, AMD took a great GPU and, thanks to PowerTune and the cooler, dragged it down into the mud. Like I said, it boggles my mind. It could have been GPU of the year.

*I'll revisit this opinion when the aftermarket cards hit; I expect those to fix these issues. In fact, I expect them to be fucking great cards. But IMO, AMD failed customers with their shoddy, cheap reference design. Nothing you can say justifies the fact that performance at default settings can vary so widely; that situation should never happen.
 
Wait, you think your average Joe buys £500 GPUs? You seriously think this is a card that the average person will buy? Whoa, that's just cray.
 
There is a case to answer here.

PS: there is a 13.11 WHQL driver.
 
Don't get what's so hard. Not every card will be the same, and I'm sure that if you lined up all the reviewer samples, they would differ as well. The whole notion behind quiet mode is pretty cool, but it really has people scratching their heads when it shouldn't.
 
No mention of what mode they tested in? Rerun all the tests with the GPUs set to uber mode. Results, please.

Nvidia is buying AMD cards so reviewers will run them in their slower mode and show how clock speed varies?
 
R9 290 owner here... count me in as not having seen throttling or fan noise issues. I even overclocked mine to 1060MHz core and 1350MHz memory; the fan stays relatively quiet, temps are normal, I monitor with the RTSS OSD in games, and everything's normal. Reference cooler, same TIM. Then again, I have a 600T with upgraded fans, but I kept the side window in place. It's not silent by any means, but it's reasonably quiet and it runs very cool.

It really is a hot card, I know that much is true. My 560 Tis in SLI never got above 65C in gaming. So if you take a naturally hot card and put it in the poorly ventilated case of some poor schmuck... well, yeah, I can see where that would be a problem.

All that being said, this thing really is exploding all over, and the review/retail performance variance really is a thing. But AMD addressed that in the uber/quiet mode announcement. Maybe it's just me.
 
Flip a switch that makes it loud as heck. Yeah... and it still overclocks poorly and is at best around a 780, OC vs. OC, while being noisy and lacking ShadowPlay and other important features. These sites aren't pointing out issues for funsies; they are real, just like AMD's stuttering, which still isn't fixed for CrossFire or for DX9, which the vast majority of games use. The 290X costs more, runs louder and hotter, and lacks the intangibles compared to a 780. 500 with three good games vs. 550 with no games... bad deal.
 
Funny that none of the people who own it in a well-vented case have any issue with the noise. And what important features?
 
AMD is selling out of virtually all their high-end cards; Nvidia must be whipping their marketing department like there's no tomorrow.
 
You don't read the news, do you? Hint: Litecoin mining is why, not gaming.
 
I really wish people would stop parroting reviews that echo their anti-AMD bias while ignoring the reviews (like the one here) that don't reinforce the negativity.

But I am a dreamer.
 
Numbers don't lie; this thing is louder than a 480 by a large margin. Important features like ShadowPlay, stable drivers, proper multi-card support, PhysX, TXAA, transparency SSAA, etc., not running like a furnace, and not being objectively roaring loud.
 
I'm a 290 owner with an aftermarket cooler (Xtreme III), running the XFX 290X BIOS.
I see some clock throttling despite temps being well below 60C, with the power target at +50%.

After watching CPU + GPU use a lot, it seems that GPU performance drops off a bit earlier than on my last card (GTX 580, Xtreme Plus cooler).
On the 580, it sometimes happened with any CPU core the game uses maxed at 85%, more often at 90%, and quite often with any core maxed at 95%.
Running my 290 as a 290X, I see some drop in GPU use at around 80% load on any CPU core.

Not wanting to tempt fate, I haven't tried any card BIOS other than XFX's, but it looks worth a try to see if there are any differences.
I wanted to get hold of the Asus 290X BIOS, as one user reported GPU use is higher with it, but I can't find it available for download.
If anyone can assist, it would be appreciated.
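If you want to actually quantify that kind of throttling instead of eyeballing an overlay, a log crunch like the sketch below works on a sensor log exported to CSV. The file name and column headers here are placeholders, not what any particular tool writes; adjust them to match whatever your logger (GPU-Z, Afterburner, etc.) puts in its header row.

```python
# Minimal sketch: summarize throttling from a sensor log exported as CSV.
# The column names ("GPU Core Clock [MHz]", "GPU Temperature [C]") are
# placeholders - rename them to match the header row your logging tool writes.
import csv

RATED_MHZ = 1000             # assumed rated clock for a 290 running a 290X BIOS
LOG_PATH = "sensor_log.csv"  # hypothetical log file path

clocks, temps = [], []
with open(LOG_PATH, newline="") as f:
    for row in csv.DictReader(f):
        try:
            clock = float(row["GPU Core Clock [MHz]"])
            temp = float(row["GPU Temperature [C]"])
        except (KeyError, ValueError):
            continue  # skip malformed or partial rows
        clocks.append(clock)
        temps.append(temp)

if clocks:
    below = sum(1 for c in clocks if c < RATED_MHZ * 0.97)  # more than 3% under rated
    print(f"samples: {len(clocks)}")
    print(f"min/avg/max clock: {min(clocks):.0f}/{sum(clocks)/len(clocks):.0f}/{max(clocks):.0f} MHz")
    print(f"share of samples >3% below rated: {below / len(clocks):.1%}")
    print(f"max GPU temperature: {max(temps):.0f} C")
```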
 
GT, do you even own a 290(X)? I have two installed in my system, with ZERO issues. Every game I play on my 30" monitor runs smooth as silk. In fact, I had a 780 before this, and I had driver issues with that card.

Also, the fan noise thing is over-exaggerated. Before I installed the water blocks, I gave the fans a go to see what all the fuss was about. The fans never went over 50% and were barely noticeable. :D
 
- How many people use ShadowPlay?
- The drivers are stable.
- CrossFire now scales and runs better than SLI.
- PhysX is in how many games?
- Proprietary AA modes are an important feature? Interesting, given that [H] has stated over and over that the AMD and Nvidia cards are identical in visuals.
- Stop with the heat bullshit already; the people complaining about it are Nvidia fanbois like you.
- Stop with the noise bullshit already; the people complaining about it are Nvidia fanbois like you.

You and Xoleras need to get over the fact that AMD released a better product than your beloved Nvidia. Spouting the same old bullshit that has been proven wrong and pointing to websites that purposefully hobble the 290X is flat-out fucking stupid.
 
My card is pegged at 100 percent load mining as I type this, and the clocks haven't budged.
Temps are near 80C, and I am on an XFX card with the same cooler.

What are your VRM temps?
 
Nice
 
From my read of the article, the tests were performed in quiet mode. To me, it was already known that quiet mode artificially limits what the 290X is capable of for the benefit of acoustics. I would wager that rerunning the tests with the cards in uber mode would produce a different, far less newsworthy outcome.

I have a pair of retail 290Xs headed my way for my personal rig, so I'll certainly tinker with them when they arrive (yes, Brent, if you're reading, after I finish my current review) and see if they differ from the engineering-sample 290X that I've got in my bunker. I will be running them in uber mode in my daily driver, just as we have decided to do for our evaluations on a go-forward basis.
 
PC Per's podcast touched on this subject as well.

They mentioned that someone flashed the review card's BIOS onto a retail one and got within a 2-3% difference, small enough that you could call it within the margin of error.

One thing that I have yet to see mentioned is the fact that a few of the reviews were based on Sapphire cards. I wonder if it's something Sapphire did to the BIOS themselves. Maybe AMD sent them an early BIOS, then did more tweaking and shipped the cards, while Sapphire still used a slightly older BIOS with the performance delta that some are pointing out.
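On the press-versus-retail BIOS question: before flashing anything, it's easy to check whether two dumped ROMs even differ. A minimal sketch, assuming you've already saved both BIOS images to files with your usual vendor tool (the file names here are hypothetical):

```python
# Compare two saved video BIOS dumps byte-for-byte via their hashes.
# File names are hypothetical; dump the ROMs with your usual vendor tool first.
import hashlib

def sha256_of(path):
    """Return the SHA-256 hex digest of a file."""
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()

press_rom = "press_sample_290x.rom"
retail_rom = "retail_290x.rom"

h_press, h_retail = sha256_of(press_rom), sha256_of(retail_rom)
print(press_rom, h_press)
print(retail_rom, h_retail)
print("ROMs are identical" if h_press == h_retail
      else "ROMs differ - check version strings and clock tables before flashing")
```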
 
I stopped reading after he said "Nvidia offered to purchase...". I just find that pathetic, whether it's justified or not.
 
Some things hardly throttle the clocks; others do it more.
There is also the question of GPU usage, as that varies substantially in some games and seems to increase with higher clocks, but I won't cover that here.

I just did these tests:
Heaven maxes at 1000MHz nearly all the time.
Crysis Warhead throttles to 970/980MHz, vsync off, power limit 0%.
Crysis Warhead throttles to 970/980MHz, vsync off, power limit 50% <- made no difference.
Metro LL benchmark: just under 1000MHz.
COD Ghosts: 970MHz with vsync on, 1000MHz with vsync off.
BF4: 870MHz with vsync on, 1000MHz with vsync off.
X3:TC benchmark was mostly around 980-998MHz, but in one whole section the core dropped to near 700MHz. See the Afterburner capture at the end of this post.
Edit: Hmm, I missed the first part of it; the clocks dropped to around 500MHz there!

When using vsync there is a lot of throttling, which is fair enough.
When not using vsync there is generally a small amount of throttling, but sometimes a lot more.

One VRM gets hotter than the other; it maxes out at practically the same temp as the GPU, 50 to 55C without any overclock.

X3:TC benchmark capture (Afterburner screenshot)
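To put those readings in perspective, here's a quick back-of-the-envelope calculation of the clock deficit in each case. It assumes a 1000MHz target clock for the 290X BIOS, and since frame rate scales less than linearly with core clock, treat the percentages as upper bounds on the performance cost:

```python
# Rough clock-deficit figures for the readings listed above.
# Assumes a 1000 MHz target clock; real performance scales less than
# linearly with core clock, so these are upper bounds on the perf cost.
RATED_MHZ = 1000

readings = {
    "Heaven": 1000,
    "Crysis Warhead (vsync off)": 975,          # midpoint of the reported 970-980 MHz
    "Metro LL benchmark": 995,                  # "just under 1000MHz"
    "COD Ghosts (vsync on)": 970,
    "BF4 (vsync on)": 870,
    "X3:TC benchmark (worst section)": 700,
}

for name, mhz in readings.items():
    deficit = (RATED_MHZ - mhz) / RATED_MHZ * 100
    print(f"{name:32s} {mhz:5d} MHz  ({deficit:4.1f}% below target)")
```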
 
Geez, this stuff again?

People, clocks throttle based on usage.

If the card doesn't need 100%, it doesn't use it...

It's part of the new power scheme.

I can guarantee that if your clocks are throttling with the fan uncapped, it's because your CPU is pegged.
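For illustration, here is a toy model of what a usage- and temperature-driven power scheme does. This is a deliberate simplification, not AMD's actual PowerTune algorithm; the point is just that clocks drop either because the card isn't fully loaded (CPU-bound, vsync-capped) or because it has hit its thermal target, and ramp back up when there is load and headroom:

```python
# Toy model of a usage/temperature-driven clock governor.
# NOT AMD's actual PowerTune algorithm - just an illustration of the idea
# that clocks fall when the GPU is under-utilized or sitting at its
# thermal target, and ramp back up when there is load and headroom.
def next_clock(clock, busy_pct, temp_c,
               max_clock=1000, min_clock=300, temp_target=95, step=25):
    if busy_pct < 90:                # not fully loaded (CPU-bound or vsync-capped)
        return max(min_clock, clock - step)
    if temp_c >= temp_target:        # at the thermal target: shed clock to hold temperature
        return max(min_clock, clock - step)
    return min(max_clock, clock + step)  # loaded and cool enough: ramp back up

clock = 1000
for busy, temp in [(99, 80), (99, 95), (99, 96), (60, 90), (99, 85)]:
    clock = next_clock(clock, busy, temp)
    print(f"busy={busy}%  temp={temp}C  ->  {clock} MHz")
```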

Xoleras and GT are known Nvidia trolls, just ignore them..
 