AMD Radeon R9 290X Video Card Review @ [H]

I really hope that [H] does an R9 290X CFX review with the following configurations:

R9 290X CFX OC
GTX 780 SLI OC (possibly with custom-cooled cards that already run factory overclocks faster than a Titan)
GTX Titan SLI OC
GTX 780 Ti SLI OC (when it's released)

I really want to see these benchmarks before making any meaningful decision about potentially changing my cards.

Comparing against a stock GTX 780 is like driving a Ferrari at 60 MPH and saying the performance isn't good. :rolleyes:

Just like the 7970 was. Now you know what it was like to be an AMD user after Kepler's release :D
 
Didn't they just add tiled rendering to DX11.2? It's kind of like Aureal's ideas winding up in AMD's GPUs :).
Software tiled rendering has been around for years; DX support isn't needed to do it, though it does make it easier to pull off.

TBDR (tile-based deferred rendering) requires specialized hardware to pull off, which is what the Kyro was. Very cool idea that never took off in the PC space but has done well in the smartphone market. Aureal had nothing to do with it.
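For anyone who hasn't run into the concept before, the core trick is binning geometry into small screen tiles so each tile can be shaded out of fast on-chip memory. Here's a toy Python sketch of just the binning step; the tile size and names are made up for illustration, and this is not how the Kyro or any real GPU implements it (they do it in fixed-function hardware):

```python
# Toy illustration of tile binning for a tile-based renderer.
# The 32x32 tile size and all names are hypothetical, purely for illustration.
TILE = 32  # tile edge length in pixels

def triangle_bounds(tri):
    """Axis-aligned screen-space bounding box of a triangle given as (x, y) vertices."""
    xs = [v[0] for v in tri]
    ys = [v[1] for v in tri]
    return min(xs), min(ys), max(xs), max(ys)

def bin_triangles(triangles, width, height):
    """Assign each triangle index to every screen tile its bounding box overlaps."""
    tiles_x = (width + TILE - 1) // TILE
    tiles_y = (height + TILE - 1) // TILE
    bins = {(tx, ty): [] for ty in range(tiles_y) for tx in range(tiles_x)}
    for i, tri in enumerate(triangles):
        x0, y0, x1, y1 = triangle_bounds(tri)
        for ty in range(max(0, int(y0) // TILE), min(tiles_y, int(y1) // TILE + 1)):
            for tx in range(max(0, int(x0) // TILE), min(tiles_x, int(x1) // TILE + 1)):
                bins[(tx, ty)].append(i)
    return bins

# Each tile's list can then be rasterized (and, in a deferred TBDR, hidden-surface
# removed) entirely within on-chip tile memory before the final pixels are written out.
tris = [((10, 10), (60, 12), (30, 50)), ((100, 100), (140, 110), (120, 150))]
print(bin_triangles(tris, 256, 256)[(0, 0)])  # -> [0]
```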
 
Dude, I had a watercooling setup for 3 years, so I know the difference between temps and heat, and it doesn't change the fact that it's the same heat output. However, with water cooling you can maintain higher clock speeds without a stock air cooling solution running at up to 100% fan speed just to hold any overclock... I guess you're OK with the Uber mode fan speed of 55%, which is 50 dB under load.
This is not what you were saying at all in the post I was replying to. Not even close:
Blkout said:
I would agree with the OP that water cooling or superb air cooling is the only saving grace for this card. Even at $100 cheaper, it runs VERY hot, and even if it's safe for your card, you're going to notice that heat output in the room that you're using it in. I can deal with the increased power draw since I'm a power junkie, but the heat output is ridiculous when a 780 is running 15-25C cooler depending on the fan profile. The 7950 and 7970 were both hot-running cards and a primary reason I sold mine and went with a GTX 670 and now a 780; both run much cooler than AMD's last generation and this generation of cards, and it is a BIG deal to me.

The whole heat issue has been run into the ground over the last 8-10 pages, so go back and read all about it. If you have a --new-- question or comment about it, by all means post it; otherwise no one cares.

Stop treating me or anyone else like we don't have a clue. If you're OK with AMD's new product, that's fine; I'm not, and we'll agree to disagree. Facts are facts, though.
You're not discussing facts though when you say stuff like, "water cooling is the only saving grace for this card" or "the heat output is ridiculous <insert comments vs 780 which only uses about 60w less power>" and "you're going to notice that heat output in the room that you're using it in". You actually have to ignore facts to believe these things you're saying are true.
 
Question: will this card work with those A- Qnix monitors? On their Amazon and eBay pages it says they won't work with DVI-I, only DVI-D. I want to buy that monitor and use it with a 290X, but if I can't even connect it, then it'll be pointless.
 
Software tiled rendering has been around for years; DX support isn't needed to do it, though it does make it easier to pull off.

TBDR (tile-based deferred rendering) requires specialized hardware to pull off, which is what the Kyro was. Very cool idea that never took off in the PC space but has done well in the smartphone market. Aureal had nothing to do with it.

You're right, it was meant to be humorous, not factually accurate; though we've seen very little 'tiling' functionality in desktop GPUs since then :).

And I was likening the 'Aureal' vertex-based audio modeling to TrueAudio, as it's the first solution that seems to do what Aureal did way back when.

Just chuckling about the prospect of two technologies that seemed to have been long-lost that are now shipping in hardware.
 
OK my bad for missing the humor ;(

Yeah, it's both funny and cool to see some of the old ideas coming back. Various legal crap, market inertia, and money prevented them from coming to proper fruition for years. Now that the IHVs can see the writing on the wall with slowing process advancements, they're starting to look at different ways to differentiate their products and offer value.
 
There's an article on Hexus which purports to look at gaming performance at 4K. Unlike the [H] article, it's completely bl**dy useless: he just tries to run at max settings and the card fails miserably, whereas [H] gives us the usable settings.
 
This is not what you were saying at all in the post I was replying to. Not even close:


The whole heat issue has been run into the ground over the last 8-10 pages, so go back and read all about it. If you have a --new-- question or comment about it, by all means post it; otherwise no one cares.


You're not discussing facts though when you say stuff like, "water cooling is the only saving grace for this card" or "the heat output is ridiculous <insert comments vs 780 which only uses about 60w less power>" and "you're going to notice that heat output in the room that you're using it in". You actually have to ignore facts to believe these things you're saying are true.

Sorry you misunderstood. Sorry, I didn't realize I couldn't post anything else related to this topic. I wasn't aware you owned the internet. I'll make sure to check with you first before posting anything else on the internet.
 
2xDVI instead of 2xDP, wtf is this ghetto shit?
DVI is still the norm for many new high-res (i.e. 1440p+) monitors. Supposedly that will start to change in 2014. Really it all depends on what the monitor OEMs are doing, and AMD/nV have no control over that.
 
DVI is still the norm for many new high-res (i.e. 1440p+) monitors. Supposedly that will start to change in 2014. Really it all depends on what the monitor OEMs are doing, and AMD/nV have no control over that.

I'd bet that if Nvidia opened up G-Sync for licensing to AMD/Intel, DP would gain some traction, and I hope 4K vendors put DP on their sets, whether they're TVs or computer monitors.
 
I bet it would but fat chance of nV opening up G-Sync.

I think we'll see DP more on monitors in 2014, but on TVs it's going to be rare, I think, since not much consumer or high-end home theater hardware supports it, plus HDMI has been revised up to "good enough" status.

Anyway, food for thought: VGA, DVI Display Interfaces to Bow out in Five Years
AMD said:
Advanced Micro Devices has announced it would phase out chipset support for DVI by 2015.
 
Simple reason DVI will go away: mainstream adoption of 4K resolution. Yes, this will take time, but this is why DVI will eventually disappear. Let me explain:

DVI is not a standard that is maintained any longer; the governing body disbanded in 2001, IIRC. Because of that, DVI-DL is limited to 2560x1600 and cannot go higher, and never will, since the standard can't change with the governing body gone. DisplayPort, on the other hand, is created and maintained by VESA and is an ever-evolving standard, which is why Apple has gone 100% with DisplayPort and Thunderbolt (pin-compatible with DisplayPort for displays). DisplayPort fully supports 4K over one cable with the newest spec; DVI-D obviously doesn't, because the standard has been done and gone since 2001.

Basically, DVI-D will never happen with 4K. That's why it will be phased out at some point in favor of HDMI 2.0 and DisplayPort. I think there are hacks to make DVI work with two cables, but I don't know of any monitors doing this. Obviously mainstream adoption of 4K is going to take time, but the only options for 4K are HDMI and DP. HDMI at this time requires two cables and MST to work with a 4K display; DisplayPort works with one cable and has enough bandwidth to handle 4K at 60 Hz, while HDMI in its current implementation doesn't. Essentially, I think both NV and AMD will remove DVI at some point, although they will probably include adapters for existing DVI monitors. I'm sure they will, actually.

Also, keep in mind that professional IPS monitors generally use DisplayPort and nothing but, and there's a reason for that: DisplayPort is technically far better than DVI-D. It has higher bandwidth and carries audio as well, which DVI-D cannot. It's just a better standard all around, which is why Apple had the forethought to switch to it years ago. Conversely, on the PC end of things it's always legacy, legacy, legacy; that's why we still have people with years-old VGA screens and DVI still in use.
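To put rough, back-of-the-envelope numbers on the bandwidth argument (these link rates are the commonly quoted maximums, not figures from this thread, so treat them as approximate):

```python
# Rough bandwidth comparison; both DVI and DP 1.2 use 8b/10b encoding on the wire.
def payload_gbps(lanes, gbps_per_lane, encoding=8 / 10):
    """Usable data rate after encoding overhead."""
    return lanes * gbps_per_lane * encoding

dvi_dual_link = payload_gbps(6, 1.65)  # 2 links x 3 TMDS channels @ 1.65 Gbps -> ~7.9 Gbps
dp_1_2_hbr2 = payload_gbps(4, 5.40)    # 4 lanes @ 5.4 Gbps (HBR2)             -> ~17.3 Gbps

# 3840x2160 @ 60 Hz, 24-bit color, ignoring blanking (real timings need a bit more)
uhd_60_gbps = 3840 * 2160 * 60 * 24 / 1e9  # ~11.9 Gbps

print(f"DVI-DL ~{dvi_dual_link:.1f} Gbps, DP 1.2 ~{dp_1_2_hbr2:.1f} Gbps, "
      f"4K60 needs ~{uhd_60_gbps:.1f} Gbps")
```

So 4K60 simply doesn't fit down dual-link DVI, while DP 1.2 has headroom to spare.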

In fact, many newer 4K UHDTVs will have DisplayPort as well. I'm aware of a new Panasonic model that has both HDMI and DisplayPort connectivity options.
 
I haven't seen the differences between HDMI 1.4b and HDMI 2.0 - is the difference enough to preclude the use of passive DVI adapters?
 
I haven't seen the differences between HDMI 1.4b and HDMI 2.0 - is the difference enough to preclude the use of passive DVI adapters?

Yeah. From what I've seen, HDMI 2.0 is basically identical to DisplayPort in terms of bandwidth, so it can do 4K at 60 Hz on a single cable, from what I understand. The main drawback is that HDMI 2.0 has licensing fees while DP doesn't, but HDMI 2.0 will be the standard (for UHDTVs) at some point in the future.

Right now, though, I don't think anything has HDMI 2.0 capability. No existing GPUs have hardware for HDMI 2.0, so it's going to be a mess once HDMI 2.0 does arrive. PCs will generally need to use DisplayPort for 4K while TVs will opt for HDMI.
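For a rough sense of the 1.4b vs 2.0 gap (same caveat as the earlier sketch: commonly quoted maximums, approximate, not figures from this thread):

```python
# HDMI carries pixel data on 3 TMDS channels at 10 bits per clock with 8b/10b coding.
def hdmi_payload_gbps(tmds_clock_mhz):
    return 3 * tmds_clock_mhz * 10 * (8 / 10) / 1000  # usable Gbps of pixel data

hdmi_1_4 = hdmi_payload_gbps(340)  # ~8.2 Gbps  -> 4K tops out around 30 Hz
hdmi_2_0 = hdmi_payload_gbps(600)  # ~14.4 Gbps -> enough for 4K60 at 8 bits per channel

uhd_60_gbps = 3840 * 2160 * 60 * 24 / 1e9  # ~11.9 Gbps of pixel data, excluding blanking
print(f"HDMI 1.4: ~{hdmi_1_4:.1f} Gbps, HDMI 2.0: ~{hdmi_2_0:.1f} Gbps, 4K60: ~{uhd_60_gbps:.1f} Gbps")
```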
 
Well, as I said, I decided to purchase the 2 x Gigabyte 7950 cards with the Windforce coolers. I received them today and they are even better than I thought. They both have 1 x 8-pin and 1 x 6-pin PCIe power connectors, which should help overclocking a lot.

The graphics score in 3DMark 11 went from 9800 with 2 x 6950s to 17250 with these 2 cards. :D Also, games like Tomb Raider and Crysis 3 can now be maxed out at 1080p at 60 Hz. The only thing I found is that I am absolutely terrible at gaming now compared to what I used to be. :D

The 290X would have been a good buy too, except I do not think I could have handled the OEM cooler noise. These cards are even quieter than the Powercolor 6950s I was using. I love it when competition heats up and prices drop. :D
 
Well, as I said, I decided to purchase the 2 x Gigabyte 7950 cards with the Windforce coolers. I received them today and they are even better than I thought. They both have 1 x 8-pin and 1 x 6-pin PCIe power connectors, which should help overclocking a lot.

The graphics score in 3DMark 11 went from 9800 with 2 x 6950s to 17250 with these 2 cards. :D Also, games like Tomb Raider and Crysis 3 can now be maxed out at 1080p at 60 Hz. The only thing I found is that I am absolutely terrible at gaming now compared to what I used to be. :D

The 290X would have been a good buy too, except I do not think I could have handled the OEM cooler noise. These cards are even quieter than the Powercolor 6950s I was using. I love it when competition heats up and prices drop. :D


Whoop woop
Windforce my only choice too
 
Is there still tearing with DVI, DVI, DP Eyefinity? I'm looking to replace my 7970s, which have 4 miniDP outputs, so I don't have to worry about that with my 3x Dell 3007WFP-HCs.
 
What does tearing have to do with any of those interfaces?

If you have mismatched interfaces on 6000, 7000, and some 5000 series cards in Eyefinity, you get tearing on the odd man out. Here's a video I made about it:
Preferred Monitor VS Tearing Comparison

*Note: this is outdated; the issue where non-similar interfaces exhibited tearing has since been addressed. As of the latest drivers, this is fixed.
 
What does tearing have to do with any of those interfaces?

If you used 2 DVI and 1 DisplayPort for Eyefinity, the one monitor on DisplayPort would always have some kind of tearing/line on it.

The only way to fix the problem was to use 3 DisplayPort connections.

Since you can now use 2 DVI and 1 HDMI with the 290X, it might fix that tearing on the one monitor.
 
He's still 'screwed', since one of the outputs is HDMI, and that isn't going to run a 3007 at full res :/.
 
If you have mismatched interfaces on 6000, 7000, and some 5000 series cards in Eyefinity, you get tearing on the odd man out. Here's a video I made about it:
Preferred Monitor VS Tearing Comparison

*Note: this is outdated; the issue where non-similar interfaces exhibited tearing has since been addressed. As of the latest drivers, this is fixed.

If you used 2 DVI and 1 DisplayPort for Eyefinity, the one monitor on DisplayPort would always have some kind of tearing/line on it.

The only way to fix the problem was to use 3 DisplayPort connections.

Since you can now use 2 DVI and 1 HDMI with the 290X, it might fix that tearing on the one monitor.
Wow, I knew nothing about that, but I've never really paid attention to anything about Eyefinity.
 
Is there still tearing with DVI, DVI, DP Eyefinity? I'm looking to replace my 7970s, which have 4 miniDP outputs, so I don't have to worry about that with my 3x Dell 3007WFP-HCs.

No, that type of tearing appears to be a non-issue with my single 290X.

The Eyefinity cursor corruption bug appears to be fixed as well.

I was running a 7970 with 3 x 27" panels (LG 27EA63V-Ps) on 1 x DVI and 2 x MiniDP-to-DVI active adapters.

Now I'm running those same panels on a 290X, on 2 x DVI and 1 x HDMI, with absolutely no port adapters.
 
No, that type of tearing appears to be a non-issue with my single 290X.

The Eyefinity cursor corruption bug appears to be fixed as well.

I was running a 7970 with 3 x 27" panels (LG 27EA63V-Ps) on 1 x DVI and 2 x MiniDP-to-DVI active adapters.

Now I'm running those same panels on a 290X, on 2 x DVI and 1 x HDMI, with absolutely no port adapters.

Ah, this is good to know. My 290Xs haven't arrived yet, but I was hoping this would be the case.
 
I'd actually like to see how well this card deals with mixing dual-link DVI and DP connections for Eyefinity, given the problems stated above.

Though I will admit that I'm not really interested in assembling that setup, as I'd much rather have a G-Sync-capable 30"-40" 4K setup going... :)
 
...and so, the Red Rooster laughed heartily at the Green Goblin who now had egg all over his face and they all lived happily ever after.

Or at least until the next generation. LOL
 
Informal rendering of "yep", which is, in turn, an informal way of saying "yes".

It's not a word that is correctly used

You are wrong

Verb, informal:
talk at length in an irritating manner.
So he isn't using the word correctly.

It's supposed to be Yup

yup
exclamation & noun
exclamation: yup; noun: yup
1. nonstandard spelling of "yes", representing informal pronunciation.
 