AMD Radeon R9 290X Video Card Review @ [H]

It's not a word that's being used correctly.

You are wrong.

yap (verb, informal): talk at length in an irritating manner.
So he isn't using the word correctly.

It's supposed to be "yup".

yep (exclamation & noun; also yup): nonstandard spelling of "yes", representing informal pronunciation.
^^Please tell me that this forum hasn't devolved into ludicrous debates such as this.^^
 
...and so, the Red Rooster laughed heartily at the Green Goblin who now had egg all over his face and they all lived happily ever after.

Or at least until the next generation. LOL

The next generation is next month....


Well, not really- but Nvidia has already announced the 780Ti, so we can expect a whole 'nuther round of price/performance/features/other qualities discussions then, along with the R9 290 release.
 
The next generation is next month....


Well, not really- but Nvidia has already announced the 780Ti, so we can expect a whole 'nuther round of price/performance/features/other qualities discussions then, along with the R9 290 release.

How do you think the non-reference boards and aftermarket coolers will perform on the 290X when it's dumping 95°C of heat back onto your mobo, CPU, and other components?
 
So Nvidia is going to overclock the 780 and call it a 780 TI, then charge $650 to keep the suckers buying their expensive parts when you can just OC a regular 780. Word to the wise, buy a regular 780 or 290X. And I can already see the Nvidia fan boys gloating about it in threads. That's all they can come up with...LOL
 
How do you think the non-reference boards and aftermarket coolers will perform on the 290X when it's dumping 95°C of heat back onto your mobo, CPU, and other components?

Well, they won't be dumping 95°C of anything - they'll be dumping 300+ watts of heat, which is fine if the enclosure is set up properly. Now, if people are just using some cheapo case that might come with two fans and don't bother to increase airflow, well, they'll be in trouble.
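Rough back-of-the-envelope numbers, for illustration only (the 300 W figure and the 10°C case-air rise are assumptions on my part, not anything from the review): treat the card as a heat source and estimate how much case airflow it takes to carry that heat back out.

    # Rough estimate of the case airflow needed to exhaust a ~300 W card.
    # Assumed values: sea-level air and a 10 C allowable rise in case-air
    # temperature over ambient; neither number comes from the review.
    power_w = 300.0      # heat dumped into the case, watts
    delta_t = 10.0       # allowed air temperature rise, kelvin
    cp_air = 1005.0      # specific heat of air, J/(kg*K)
    rho_air = 1.2        # density of air, kg/m^3

    mass_flow = power_w / (cp_air * delta_t)   # kg/s of air needed
    volume_flow = mass_flow / rho_air          # m^3/s
    cfm = volume_flow * 2118.88                # convert to cubic feet per minute
    print(f"~{cfm:.0f} CFM of through-case airflow for a {delta_t:.0f} C rise")
    # ~53 CFM, which a couple of decent case fans can manage - the problem
    # is the watts, not the GPU's 95 C temperature target.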
 
So Nvidia is going to overclock the 780 and call it a 780 TI, then charge $650 to keep the suckers buying their expensive parts when you can just OC a regular 780. Word to the wise, buy a regular 780 or 290X. And I can already see the Nvidia fan boys gloating about it in threads. That's all they can come up with...LOL

Hold on to your panties there kiddo, because this might hurt a bit-

Nvidia has yet to release a fully-spec'd GK110, which is what sits at the heart of the 780, Titan, and multiple Quadro and Tesla parts.

That means that a hypothetical 780Ti could be as much as 20%-30% faster than the 780, and if the 780 were dropped to be price competitive with the 290 and the 780Ti with the 290X, which is the most reasonable solution (but don't expect reason out of this industry), then we'd see the market settle back to the norm.

If you've been watching video cards much, you'll have noticed that Nvidia has usually had the outright fastest and most expensive GPU, with AMD following with a less expensive part that performs almost as well. You can apply the same pattern most of the way down the market segments; these companies typically try to match each other at every market segment, resulting in pairs of cards that represent slightly different takes on the price/performance/features spectrum.

"Two by two, hands of blue, two by two..."
 
I'd actually like to see how well this card deals with mixing dual-link and DP connections for Eyefinity, given the problems stated above.

Though I will admit that I'm not really interested in assembling that setup, as I'd much rather have a G-Sync capable 30"-40" 4k setup going... :)

That's what I used to think too; then I tried the iRacing promo and found out they have what is probably the best triple-screen support in the business. Now I'd feel lost without the FOV that triples provide.

/until they get the oculus done.

iRacing Skip Barber Helmet Cam with Triple Screen at Zolder,
http://www.youtube.com/watch?v=hv5RdYKZpfI

Sim-Racing Monitor Placement and Field-of-View (demonstrated using iRacing.com),
http://www.youtube.com/watch?v=rri19PqJ3qU

Triples let you run realistic FOVs and still have peripheral vision.

It was awesome in the BF4 beta; I was able to run a close FOV where everything was the proper size and still have good peripheral vision. /still have to load up some flight sims when I have time.

Sim Racing 101 : Field of View - Seeing is Believing,
http://www.youtube.com/watch?v=4yYeiAHsdr0
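The geometry behind that is simple enough to sketch; the panel size and viewing distance below are just example numbers of mine, not anything taken from the videos above.

    # How much field of view triples cover compared with a single screen.
    # Example numbers: 23" 16:9 panels (~50.9 cm visible width) at 60 cm,
    # with the side screens angled in to face the viewer.
    import math

    screen_width_m = 0.509   # visible width of one 23" 16:9 panel
    view_dist_m = 0.60       # eye-to-screen distance

    single_fov = 2 * math.degrees(math.atan(screen_width_m / (2 * view_dist_m)))
    triple_fov = 3 * single_fov   # each angled side screen adds the same angle
    print(f"Single screen: ~{single_fov:.0f} deg, triples: ~{triple_fov:.0f} deg")
    # Roughly 46 deg vs 138 deg: the in-game FOV can match what the screens
    # physically cover, so objects stay the right size and you still get
    # peripheral vision on the side panels.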
 
Hold on to your panties there kiddo, because this might hurt a bit-

Nvidia has yet to release a fully-spec'd GK110, which is what sits at the heart of the 780, Titan, and multiple Quadro and Tesla parts.

That means that a hypothetical 780Ti could be as much as 20%-30% faster than the 780, and if the 780 were dropped to be price competitive with the 290 and the 780Ti with the 290X, which is the most reasonable solution (but don't expect reason out of this industry), then we'd see the market settle back to the norm.

If you've been watching video cards much, you'll have noticed that Nvidia has usually had the outright fastest and most expensive GPU, with AMD following with a less expensive part that performs almost as well. You can apply the same pattern most of the way down the market segments; these companies typically try to match each other at every market segment, resulting in pairs of cards that represent slightly different takes on the price/performance/features spectrum.

"Two by two, hands of blue, two by two..."

LOL...20-30% faster than the 780? And people say the 290X is loud, hot and power hungry! I think you are stretching the 28nm process with current tech just a bit much.

If only "ifs and buts were candy and nuts"...
 
That's what I used to think too; then I tried the iRacing promo and found out they have what is probably the best triple-screen support in the business. Now I'd feel lost without the FOV that triples provide.

/until they get the oculus done.

iRacing Skip Barber Helmet Cam with Triple Screen at Zolder,
http://www.youtube.com/watch?v=hv5RdYKZpfI

Sim-Racing Monitor Placement and Field-of-View (demonstrated using iRacing.com),
http://www.youtube.com/watch?v=rri19PqJ3qU

Triples let you run realistic FOVs and still have peripheral vision.

It was awesome in the BF4 beta; I was able to run a close FOV where everything was the proper size and still have good peripheral vision. /still have to load up some flight sims when I have time.

Sim Racing 101 : Field of View - Seeing is Believing,
http://www.youtube.com/watch?v=4yYeiAHsdr0

I own the Forza Motorsport steering wheel for PC, Xbox and I think PS3. Thing is awesome.
Since I am not much of a gamer I've only played with it twice. Lol. Now imagine if you could use a G-Sync monitor with triple-screen racing.
 
That might have been true in the past. However, since the 7970 and the 7970 GHz Edition, that has not been true.

Hold on to your panties there kiddo, because this might hurt a bit-

Nvidia has yet to release a fully-spec'd GK110, which is what sits at the heart of the 780, Titan, and multiple Quadro and Tesla parts.

That means that a hypothetical 780Ti could be as much as 20%-30% faster than the 780, and if the 780 were dropped to be price competitive with the 290 and the 780Ti with the 290X, which is the most reasonable solution (but don't expect reason out of this industry), then we'd see the market settle back to the norm.

If you've been watching video cards much, you'll have noticed that Nvidia has usually had the outright fastest and most expensive GPU, with AMD following with a less expensive part that performs almost as well. You can apply the same pattern most of the way down the market segments; these companies typically try to match each other at every market segment, resulting in pairs of cards that represent slightly different takes on the price/performance/features spectrum.

"Two by two, hands of blue, two by two..."
 
Hold on to your panties there kiddo, because this might hurt a bit-

Nvidia has yet to release a fully-spec'd GK110, which is what sits at the heart of the 780, Titan, and multiple Quadro and Tesla parts.

That means that a hypothetical 780Ti could be as much as 20%-30% faster than the 780, and if the 780 were dropped to be price competitive with the 290 and the 780Ti with the 290X, which is the most reasonable solution (but don't expect reason out of this industry), then we'd see the market settle back to the norm.

If you've been watching video cards much, you'll have noticed that Nvidia has usually had the outright fastest and most expensive GPU, with AMD following with a less expensive part that performs almost as well. You can apply the same pattern most of the way down the market segments; these companies typically try to match each other at every market segment, resulting in pairs of cards that represent slightly different takes on the price/performance/features spectrum.

"Two by two, hands of blue, two by two..."

Ah yes, the "Nvidia has/can have X which can completely destroy everything else" bullshit. Nvidia has no reason to hold back on releasing the fastest cards possible; they would completely dominate the market and be making a lot more money if what you and other nvidia fanbois keep spewing were true.

The 290x is the fastest card one can buy, and there is absolutely ZERO evidence that Nvidia has anything to actually beat it other than an overclocked 780, which may or may not surpass the 290x.
 
Ah yes, the "Nvidia has/can have X which can completely destroy everything else" bullshit. Nvidia has no reason to hold back on releasing the fastest cards possible; they would completely dominate the market and be making a lot more money if what you and other nvidia fanbois keep spewing were true.

The 290x is the fastest card one can buy, and there is absolutely ZERO evidence that Nvidia has anything to actually beat it other than an overclocked 780, which may or may not surpass the 290x.

The latest Quadro would beat it, but that's a $2K card; a fully unlocked GK110 would have no issues beating it.
http://www.nvidia.com/object/quadro-desktop-gpus-specs.html
 
Ah yes, the "Nvidia has/can have X which can completely destroy everything else" bullshit. Nvidia has no reason to hold back on releasing the fastest cards possible; they would completely dominate the market and be making a lot more money if what you and other nvidia fanbois keep spewing were true.

The 290x is the fastest card one can buy, and there is absolutely ZERO evidence that Nvidia has anything to actually beat it other than an overclocked 780, which may or may not surpass the 290x.

How's that sand taste? Do you keep your head in it all day?

I'm not the kid down the block that holds up a green team sign while you're having a pow wow to worship your red team flag. I could give a rat's about teams.

Nvidia has, themselves, admitted that they haven't shipped a fully-enabled consumer part; they're committed to making GPUs quiet; everyone that's owned a GK110-based card has commented that they have massive overclocking potential; and they've already announced a card that will slot in above the GTX 780.

Did you expect the world to stop just for you? This stuff goes on forever. These companies are two product releases ahead in their R&D departments. So yeah, Nvidia doesn't just magically 'have' something in the pipe, they've announced it, and we have very good reason to believe that it will be good.

Better than the R9 290X? Hell if I know. But even an overclocked Titan will outrun an 'Uber' 290X while still being quieter, so the potential to make something significantly faster than the Titan is there.

And remember that the Titan is a prosumer card, not a consumer card - its price can't be compared directly. The GTX 780 and GTX 780Ti will be the immediate competition for AMD's cards, and the pricing is entirely up to Nvidia. They've sold cards with GPUs as big as GK110 for $200 before.
 
AMD beats NVIDIA by getting hot and loud - NVIDIA will do the same now to beat AMD.
 
Wow, I left yesterday after using the word "yap", and now I come back to see people arguing over it.

Thank god for the internet!
 
Wow, I left yesterday after using the word "yap", and now I come back to see people arguing over it.

Thank god for the internet!

When the arguing about the topic starts to get boring, everyone starts to argue about proper grammar usage instead. It's actually quite funny.
 
I don't understand why people think that enabling less than 10% more CUDA cores on GK110 is going to yield some magical results.

mind = boggled
 
Its an entertaining place to be

It's*

 
What kind of video card are you running on the 19" and the 23"ers?

I'm actually using two 23"ers now, but I only game on one. HD 7950; I just took the cooler off my second 7950 and don't have anything to clean it off with :( , plus it doesn't even fit properly without scraping the second card. (Fucking Gene-V)
 
I'm actually using two 23"ers now, but I only game on one. HD 7950; I just took the cooler off my second 7950 and don't have anything to clean it off with :( , plus it doesn't even fit properly without scraping the second card. (Fucking Gene-V)

Yeah, the one thing I learned is to never build micro ATX again if you plan to expand to SLI/Crossfire.

I like using a dedicated sound card, and I could easily add another video card, but I'd have to lose the sound card.
Great overclocking mobo, though.
 
Yeah, the one thing I learned is to never build micro ATX again if you plan to expand to SLI/Crossfire.

I like using a dedicated sound card, and I could easily add another video card, but I'd have to lose the sound card.
Great overclocking mobo, though.

Yeah. I don't actually know what to do now; I don't feel like watercooling them since that is going to cost $250+ and require quite a bit of work on the case. I might just try to squeeze a small piece of wood in between the cards and maybe ghetto-mod a fan into the small space.
 
I don't understand why people think that enabling less than 10% more CUDA cores on GK110 is going to yield some magical results.

mind = boggled

Awesome, and so true!

Anyway, the 290X = the best thing to happen for video card consumers since the GTX 680 swooped in with its price-correcting performance.

Now we can all look forward to more bang for our buck regardless of which team we go with. I'm waiting for a) an Nvidia price correction and b) Mantle, to see what it's capable of. Then I'll decide what to do, and that might just be to stick with my 7970s. Frankly I'm still pretty happy with them, which just seems weird to me because I've had them since launch hehe...
 
So, tearing is fixed between displays on different display connectors with Eyefinity enabled.

Previously, if you had displays on different types of display connectors and Eyefinity enabled, moving a window between the two displays would produce visual tearing, just like what you see with VSYNC off in games.

Apparently this is now fixed: I can move windows between displays on different connectors and there is no tearing. Very cool.
 
Great review!
4k results are awesome. I guess that's where the 512-bit bus really starts to pay off.
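Quick bandwidth math to back that up, using the published memory specs (the arithmetic here is mine, not taken from the review):

    # Peak memory bandwidth from the published specs:
    # bandwidth (GB/s) = bus width in bytes * effective GDDR5 data rate (Gbps)
    r9_290x_gbs = (512 / 8) * 5.0   # 512-bit bus, 5.0 Gbps -> 320 GB/s
    gtx_780_gbs = (384 / 8) * 6.0   # 384-bit bus, 6.0 Gbps -> 288 GB/s
    print(f"R9 290X: {r9_290x_gbs:.0f} GB/s, GTX 780: {gtx_780_gbs:.0f} GB/s")
    # At 4k the framebuffer and AA traffic grow quickly, so the extra ~11%
    # of raw bandwidth (and the 4 GB of VRAM) starts to show up in results.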
 