GTX 780 OC 6GB vs. R9 290X 4GB Overclocked at 4K @ [H]

FrgMstr

Just Plain Mean
Staff member
GTX 780 OC 6GB vs. R9 290X 4GB Overclocked at 4K - We take the ASUS STRIX GTX 780 OC 6GB video card and run two in SLI and overclock both of these at 4K resolutions to find the ultimate gameplay performance with 6GB of VRAM. We will also compare these to two overclocked ASUS Radeon R9 290X DirectCU II CrossFire video cards for the ultimate VRAM performance showdown.
 
Top notch review, folks. And particularly interesting at this price point.

While I don't have a particular allegiance to one side or the other, would it be fair to say that the price difference between the two setups would likely be negated by the increased PSU required? Maybe not entirely, but at least in part. The almost 200W difference OCd at load between the two is nothing to sneeze at, and will cost you.

All that said, mad props to AMD for clearly putting a priority on smoothness. That was a sticky issue, and nailing it so thoroughly that it still applies at 4K is a commendable feat.
 
AMD deserves a pat on the back for the 290x performance in this review. Also, what is ASUS thinking with that price?

Kind of unsurprising given the ROP disparity, but still a disappointing showing from team green. :(
 
Lovely article, thank you.

Just a suggestion but you might want to change the headline title to reflect that it's SLI vs CF. When I clicked on it, I was expecting a single-card comparison.
 
Excellent review as usual. I had a pair of EVGA Titans running in SLI at 4K. It def felt like it was held back. When I upgraded to my 295x2 and a 290x in Tri-fire...gameplay just feels so much smoother. Too bad EA/DICE has once again crippled performance in BF4 with the latest patch.
 
While I don't have a particular allegiance to one side or the other, would it be fair to say that the price difference between the two setups would likely be negated by the increased PSU required? Maybe not entirely, but at least in part. The almost 200W difference OCd at load between the two is nothing to sneeze at, and will cost you.
I'd venture a guess that if someone is going to plop down this much green for a pair of video cards, the price difference between power supplies that can handle them is not going to be a deal breaker, or at least not as big of a deal. I'd be buying a power supply that could handle either pair -- that way I don't have to worry about waffling over which pair to get or whether I got enough power.
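
For anyone actually running that math, here's a rough sizing sketch (my own illustrative numbers, not figures from the review) that backs the DC load out of a measured wall draw and adds headroom for overclocking spikes:

```python
# Rough PSU sizing from measured wall-socket draw (illustrative only).
def recommended_psu_watts(wall_draw_w, efficiency=0.90, headroom=1.25):
    dc_load_w = wall_draw_w * efficiency   # power the PSU actually delivers to the PC
    return dc_load_w * headroom            # keep ~25% headroom for OC/transient spikes

# Hypothetical overclocked full-system wall draws, roughly in line with the
# 700-900 W range mentioned later in this thread:
for setup, wall_w in {"GTX 780 6GB SLI": 700, "R9 290X CF": 900}.items():
    print(f"{setup}: shop for a ~{recommended_psu_watts(wall_w):.0f} W PSU")
```

By that math, a single unit in the 1000 W class covers either pair, which is the point above.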
 
Lovely article, thank you.

Just a suggestion but you might want to change the headline title to reflect that it's SLI vs CF. When I clicked on it, I was expecting a single-card comparison.


A title line that long breaks the formatting... hence the sentence explaining the article below the title.
 
Really fine article.:D

Would you please list the monitor brand and size that you use for the testing? Or did I miss it?

I'm impressed that CrossFire has been so improved.
 
AMD deserves a pat on the back for the 290x performance in this review.

Hear, hear! Congratulations to AMD for getting their Crossfire performance taken care of. Very nice to hear that the tables have indeed turned. Gotta love competition.
 
What witchcraft is this?! A card comparison at 4K resolutions. I say FINALLY. It is good to see you are forward-looking, Kyle. Great article, and keep it up, sir.
 
Really fine article.:D

Would you please list the monitor brand and size that you use for the testing? Or did I miss it?

I'm impressed that CrossFire has been so improved.

That would be on page 2, where we list all system specifications in a nice big graphic so you can't miss it. :D
 
Good job AMD with those 290x.

If we were to compare GeForce 780 Ti SLI to Radeon 290X CrossFire, I am pretty sure the 780 Ti would lead the way. But the cost is too high.
 
Good job AMD with those 290x.

If we were to compare GeForce 780 Ti SLI to Radeon 290X CrossFire, I am pretty sure the 780 Ti would lead the way. But the cost is too high.

Not really. It's well known that R9 290X CF is the fastest solution for 4K gaming; there are other reviews to back it up. Also, when you look at liquid cooling, the R9 290X chips can be pushed further than 1.2 GHz, so if anything the R9 290X CF strengthens its lead at the top.
 
Kyle and company,

I just want to give a personal thank you for the hard work that you put in delivering quality reviews that hit on the issues your readers are interested in!

This series of reviews on the 780 oc 6gb has been extremely exciting and you have done a fantastic job of answering the very questions I have found myself asking! Kudos to all of you for watching forum feedback and going that extra step!

Thank you! :cool:
 
Excellent review, as always. Makes me wonder about NVIDIA's new Maxwell cards for late this year/early next year ...
 
I hope the enthusiast-line thrashing of NVIDIA by AMD continues....we need the competition.
(And I say this as someone who currently owns all green ;))
 
I guess its fair to say then that AMD > NVIDIA in the context of this review :p
 
Excellent review, as always.
In all of our gaming we have shown you today, in every single game AMD CrossFire feels smoother to us than NVIDIA SLI. That's right, the tables have turned on this issue.
AMD has come a long way in this department.
 
This is where someone needs to point out that whoever made the nVidia cards that were in whichever review (ASUS) is a terrible manufacturer, and that nVidia cards made by the superior manufacturer they are a fan of are at least four hundred times as powerful as AMD's cards.


Yes, but 28" 4K displays are coming down in price and are much more affordable now, under a thousand dollars.

http://www.amazon.com/s/ref=nb_sb_n...57?url=search-alias=aps&field-keywords=28" 4K

If I'm going for 4K I want a 40" screen.
 
Good review, just wish you showed other resolutions; the 1-2% of people playing @ 4K don't care what the cards cost, they just buy the best.
I would really like to see if that 6GB really makes any difference over the 3GB in any game other than Watch Dogs. (I'm talking about stock games, not a souped-up modded Skyrim.)
 
This is where someone needs to point out that whoever made the nVidia cards that were in whichever review (ASUS) is a terrible manufacturer, and that nVidia cards made by the superior manufacturer they are a fan of are at least four hundred times as powerful as AMD's cards.

3283932-picard_wtf_riker_i_know.jpg
 
Good review, just wish you showed other resolutions; the 1-2% of people playing @ 4K don't care what the cards cost, they just buy the best.
I would really like to see if that 6GB really makes any difference over the 3GB in any game other than Watch Dogs. (I'm talking about stock games, not a souped-up modded Skyrim.)

This review answers that question: the demand on the cards is going to decrease across the board when you decrease resolution, generally, including the demand on the VRAM due to AA/HBAO, texture resolution, etc. Therefore, if the extra VRAM on the STRIX doesn't make it better at 4K, it can't make it better at 1080p, unless there is some kind of engine/driver bug.
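
To put rough numbers on that intuition, here's a back-of-envelope sketch (my own illustration, not from the review) of how render-target memory alone scales with resolution and MSAA; texture and geometry memory are ignored for simplicity:

```python
# Back-of-envelope render-target memory vs. resolution (illustrative only).
# Assumes 8 bytes per MSAA sample (color + depth/stencil) plus a resolved
# front/back buffer pair at 4 bytes/pixel; real engines vary widely.
def render_target_mb(width, height, msaa_samples=4, bytes_per_sample=8):
    pixels = width * height
    sampled = pixels * msaa_samples * bytes_per_sample   # MSAA color + depth
    resolved = pixels * 4 * 2                            # front + back buffer
    return (sampled + resolved) / (1024 ** 2)

for name, (w, h) in {"1080p": (1920, 1080), "1440p": (2560, 1440),
                     "4K": (3840, 2160)}.items():
    print(f"{name}: ~{render_target_mb(w, h):.0f} MB of render targets")
```

The render targets at 4K come out four times the size of the 1080p ones, so a card that isn't VRAM-bound at 4K certainly won't be at 1080p.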
 
Good comprehensive review!

On the subject of smoothness... looking at the minimum FPS and the graphing of the lower framerates, the 290X CF setup really keeps the FPS at much more enjoyable levels on average. The new CrossFire system AMD is working with is also working very nicely.

4GB and, just as important for high-resolution gaming, a 512-bit bus seem to really hit the sweet spot for performance.
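
For context, peak memory bandwidth is just bus width times effective memory data rate. A quick sketch using what I believe are the reference memory clocks (the overclocked cards in the review will run higher):

```python
# Peak memory bandwidth = (bus width in bytes) * effective data rate per pin.
def bandwidth_gb_s(bus_bits, data_rate_gbps):
    return (bus_bits / 8) * data_rate_gbps   # GB/s

# Assumed reference memory clocks; overclocked cards exceed these.
print(f"R9 290X: {bandwidth_gb_s(512, 5.0):.0f} GB/s")   # 512-bit @ 5 Gbps
print(f"GTX 780: {bandwidth_gb_s(384, 6.0):.0f} GB/s")   # 384-bit @ 6 Gbps
```

That wider pipe is exactly the kind of thing that only starts to matter once resolution pushes frame buffer traffic this high.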
 
Good review, just wish you showed other resolutions; the 1-2% of people playing @ 4K don't care what the cards cost, they just buy the best.
I would really like to see if that 6GB really makes any difference over the 3GB in any game other than Watch Dogs. (I'm talking about stock games, not a souped-up modded Skyrim.)

That's why we tested at 4K. If you don't see the benefit at 4K, you won't at lower resolutions.

BTW, we do have a single-card review of this card at 1600p as well; it was the initial review linked on the first page.
 
Good review, just wish you showed other resolutions; the 1-2% of people playing @ 4K don't care what the cards cost, they just buy the best.

Sorry, but as the owner of a 4K display, I can tell you that you are wrong. My funds are not unlimited and I read the [H] reviews very carefully. Yes, I have a 780 Ti, but that was chosen after reading many articles and accepting that I will not be able to play many games at ultra quality. Note that I have one graphics card, not more.

Speaking of more graphics cards, I wonder if [H] will compare 3x 780 with 3x 290?
 
I figured the 290X in CrossFire would be faster than the 780s in SLI.

I think you'd get around the same performance with two 780 Tis, as long as the game doesn't use the entire 3GB of frame buffer allowed. Titan Blacks would be ideal, but again, not even close to comparing them on price.

Good review guys :D
 
Thanks for keeping it [H]. I see all this whining from children about how "no one games at 4K yet!". The article needs to come out before you're in the market for a 4K screen, not afterward.
 

lol, lost it at "at least four hundred times as powerful as AMD's cards"

I'm an AMD fanboy, and even I wouldn't say that AMD cards are at least 400 times as powerful as NVIDIA cards, especially not with reasons like "superior manufacturers".
 
Very comprehensive article, very good read.

It would have been great to have a 780 Ti 3GB SLI setup in the mix as well, so we can truly see whether the VRAM is actually needed in real-world scenarios, even at 4K.

:cool:
 
Great write-up guys, thanks for addressing this question. I can now sleep well with my 290Xs driving my sammy 4K display. The other part of the equation... used 290Xs are still going for dirt cheap; $700 for two 290Xs is amazing. Sure glad a bunch of people are dumping their mining rigs!
 
700-900 watts for the whole machine is pretty crazy.

I guess that's only for people with AC in the house, as without it, even using a Phenom X6 and GTX 470 in summer was very uncomfortable.
 
700-900 watts for the whole machine is pretty crazy.

I guess that's only for people with AC in the house, as without it, even using a Phenom X6 and GTX 470 in summer was very uncomfortable.


Not really, when you're talking about using 2 high-end GPUs. You don't need to have A/C as long as the room is vented well and it's not 90+ degrees F outside.
 
I just have a warm-weather and a cool-weather profile for my cards. I do wonder, if ASUS had done a 6GB 780 Ti, how it would fare against the 290X. I'm guessing the wide-ass 512-bit bus is finally of use at the UHD+ resolutions, and that's where we see the gains. I do agree power is higher than it should be, though; next-gen cards will address this. This was AMD's jump back into the big-die game, glad they did it, now just make it more efficient.
 
I'm going to wait a couple of weeks and see what the GTX 880 looks like.

I hate the fact that NVIDIA won't let you use a separate GeForce card for PhysX when using AMD for video. :mad:
 