GeForce GTX 780 Ti vs. Radeon R9 290X 4K Gaming @ [H]

G-Sync is never coming to AMD video cards; it's proprietary nV tech.

Quote me. Soon AMD will announce their own version, which will be exclusive to Samsung's next batch of 144Hz TN panels.
 
Quote me. Soon AMD will announce their own version, which will be exclusive to Samsung's next batch of 144Hz TN panels.

I hope it doesn't turn out like AMD's HD3D. They basically released it and said, "fuck it" and completely forgot about supporting it - and from what I remember, Samsung was one of the partners on board for HD3D.

I never liked 3D vision or HD3D, but nvidia at least maintained support for it. I would hope if AMD releases something similar that they will actually, you know, maintain support...
 
I hope it doesn't turn out like AMD's HD3D. They basically released it and said, "fuck it" and completely forgot about supporting it - and from what I remember, Samsung was one of the partners on board for HD3D.

I never liked 3D vision or HD3D, but nvidia at least maintained support for it. I would hope if AMD releases something similar that they will actually, you know, maintain support...

Paying $100+ for dat 3D vision was nice.
 
Quote me. Soon AMD will announce their own version, which will be exclusive to Samsung's next batch of 144Hz TN panels.
"Their own version" of the same tech going into G-Sync isn't G-Sync though, G-Sync is proprietary as TruForm or TrueAudio. And 2 competing vendor specific monitor standards is dumb as hell.

Only way it could maybe make sense is if AMD makes their version open (and has it working with nV cards soon after or at launch) + cheap.

Otherwise it's HD3D all over again. Same thing goes for G-Sync too.
 
I read some custom-cooled 780 Ti reviews and those GPUs are monsters. A $150 premium over the 290X doesn't look like much with a Windforce 780 Ti. Don't know what AMD is waiting for with their crappy reference cooler. Let the beasts come and cool that thing. An overclocked Gigabyte 780 Ti GHz Edition is 20°C cooler, much quieter, and almost 30% faster. I know the 290X can clock too, but with the reference cooler it must be a pain for the ears and not clock that high for sure.
 
Good 3rd-party air-cooled HSFs don't cost $150 for the R9 290/X, more like $50-70.

AMD is waiting on the AIB vendors, who will release cards with better HSFs whenever they get around to it; AMD has little to no say in that process.

The R9 290 OCs quite well even with the stock HSF. Noise is high, but if you feel like paying $100+ for a 1 fps advantage with an OC'd GTX 780 @ 1440p, then you might have a poor sense of value. Just remember that when you OC that GTX 780, power and noise shoot up dramatically as well, so you're not going to get a card that is 20°C cooler than an R9 290/X.
 
I think the vendors are waiting for AMD to give them the go-ahead, not the other way around. The $150 premium over the 290X is for the 780 Ti, not a custom 290X. I've got kids in the next room and a noisy GPU is not an option. They couldn't sleep when I used to OC my 5870, no kidding, and the 290 and even the 290X are no better. I would love a 290 with a Twin Frozr or the new Windforce cooler, but nothing so far, and that is a shame in my opinion. And I still love the GTX 780, and in reality I don't need anything more. A nice, cool Lightning or something like that is still my No. 1 choice for 1200p. Perfect GPU in my opinion, maybe $50 too expensive.
 
I think the vendors are waiting for AMD to give them the go-ahead, not the other way around.
Based on what? Gonna need a citation.

The $150 premium over the 290X is for the 780 Ti, not a custom 290X. I've got kids in the next room and a noisy GPU is not an option.... A nice, cool Lightning or something like that is still my No. 1 choice for 1200p.
Goalpost shifting has never convinced anyone of anything and makes you look bad. Also, an OC'd GTX 780 or 780 Ti will still make quite a bit of noise. Quiet stock air-cooled HSFs for heavily OC'd top-end cards don't exist. None of them are going to run at cool temps either, if that is your concern. Not that it should be, so long as temps are in the manufacturer's range. They're all going to push 90°C+ easily on air when heavily OC'd with stock coolers.

And I still love the GTX 780
If you already have a current high-end GPU then yes, buying an R9 290/X is a bad idea. Of course, so would buying a 780 Ti, since it's only 5-10% faster than a GTX 780 for quite a bit more...unless you're going to run SLI, while OC'd of course, right? But then you can expect lots of heat to get dumped into your house, and the noise will probably be deafening, driving you and your kids insane with sleep deprivation + heat stroke*.

So that is out too I guess.
 
LOL, Nvidia fanboys out in full force, defending Nvidia pride and trying to downplay uber mode as much as possible; they must do everything to try to make the 780 Ti look a lot better, even though it isn't.
 
LOL, Nvidia fanboys out in full force, defending Nvidia pride and trying to downplay uber mode as much as possible; they must do everything to try to make the 780 Ti look a lot better, even though it isn't.

I've always liked AMD GPUs, but I do not like the fact that the 290X throttles in quiet fan mode. I think it is bullshit. With that said, I personally find the overclocked GTX 780 to be the best-value card in that price range - it has 20% OC headroom above stock with a good cooler and costs $500 with 3 free games. Can the 290 overclock? Maybe, but it's pretty safe to say Kepler has more reasonable TDP headroom to overclock.

Until aftermarket 290 cards are available to be assessed, that's how I feel. AMD made too many trade-offs with the reference 290 design that make it look undesirable to me despite the low price - and I say that as someone who REALLY liked the 7970. I kinda feel like AMD could have made the 290/290X a killer, as it IS a great chip, but that cooler design really drags the entire package down. IMO. I mean, in uber mode the chip performs amazingly. Just imagine if AMD had actually spent money on their cooler and made it like the Titan shroud. It would be a much better product.
 
Never. 16:10 is gone. Accept the bigger sizes and resolutions.

If this is true, it is a terrible shame. 16:9 is great for movies and games, but the 16:10 aspect ratio is FAR superior for getting work done, which is mostly what I do on my rig.

I wouldn't even consider getting any 16:9 monitor.
 
Zarathustra[H] said:
I wouldn't even consider getting any 16:9 monitor.

I agree because fuck logic and value.

$1000 buys 1x 30" 2560x1600 monitor
$1000 buys 2-3x 27" 2560x1440 monitors
$1000 buys 2x Crossover 2755AMGs, which support 3840x2160 over HDMI @ 30Hz.

Which solution offers more screen real estate? The same pricing scheme applies to budget monitors: 2x 16:9 1080p = 1x 16:10 1200p monitor.
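
If it helps make the real estate point concrete, here's a rough pixel-count comparison of those options (resolutions as quoted above; note the 2755AMG is a native 1440p panel that accepts a 2160p input, so treat its figure as addressable input pixels rather than panel real estate):

```python
# Back-of-the-envelope screen real-estate comparison for the ~$1000 options above.
options = {
    '1x 30" 2560x1600':                   1 * 2560 * 1600,
    '2x 27" 2560x1440':                   2 * 2560 * 1440,
    '3x 27" 2560x1440':                   3 * 2560 * 1440,
    '2x Crossover 2755AMG (2160p input)': 2 * 3840 * 2160,
}
for name, pixels in options.items():
    print(f"{name}: {pixels / 1e6:.1f} megapixels")
# ~4.1 MP vs ~7.4 MP vs ~11.1 MP vs ~16.6 MP
```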
 
Shouldn't you guys be using SSAO in Far Cry 3? I've always been told that HDAO and HBAO are vendor specific.
 
Shouldn't you guys be using SSAO in Far Cry 3? I've always been told that HDAO and HBAO are vendor specific.

HDAO and HBAO are two differing technologies, but both are fully capable of working on NVIDIA or AMD hardware; there are no compatibility issues, and they are not vendor-specific. These are open DX rendering effects and are capable of running on any hardware. In FC3 the hierarchy is HDAO > HBAO > SSAO > Off. HDAO, naturally, takes the largest performance hit.

I went back into the FC3 reviewer's guide and pulled this info out on HDAO:

It shows that HDAO is a standard DirectX DirectCompute accelerated algorithm, which naturally works on the DX11 capable NVIDIA GPUs and AMD GPUs.

Far Cry 3™ implements a new and improved version of HDAO that uses full 3D camera space position data to detect valleys in the scene that should be shaded darker, and attenuates the lighting based on valley angle. This effect can be computed at full resolution, or at half resolution, depending upon the hardware resources available. In order for a half resolution HDAO buffer to be re-matched with the main color scene, a DirectCompute accelerated bilateral dilate and blur is performed to ensure that AO properly meets objects from the full screen resolution scene. In Far Cry 3™ this technique has been significantly improved in both performance and quality.

The default setting of the AO on high-end machines in FC3 is HDAO. That's the default setting, no matter the hardware.

I think what most people who say it is "vendor specific" are confused about is the fact that it does perform better on one vendor versus the other. It's like TressFX in Tomb Raider: it works on both NV and AMD hardware, but it is definitely faster on AMD hardware.

In the same way, HDAO runs with less of a performance hit on AMD hardware than it does on NV hardware, but it works just fine on both, and it is technically the better-quality AO option for FC3; it is more precise in how the shadows fall, most noticeably between the grass.

So since HDAO is the better-quality option, if the performance is there we opt to have it enabled; if we need to pull back on performance, we drop to HBAO or SSAO, just like with any other graphics setting.
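
For anyone wondering what that "half resolution HDAO buffer... bilateral dilate and blur" step from the reviewer's guide amounts to, it is essentially a depth-aware (joint bilateral) upsample: the low-res AO is blended back up to full resolution while respecting depth edges so shadows don't bleed across geometry. Below is a minimal CPU sketch of the idea; it is purely illustrative, not Ubisoft's or AMD's actual DirectCompute code, and the function and parameter names are made up.

```python
import numpy as np

def joint_bilateral_upsample(ao_half, depth_half, depth_full, sigma_d=0.05):
    """Upsample a half-resolution AO buffer to full resolution.

    Each full-res pixel blends the 2x2 nearest half-res AO samples, weighting
    them by how closely their half-res depth matches the full-res depth, so
    AO stays aligned with geometry edges instead of smearing across them."""
    H, W = depth_full.shape
    hH, hW = depth_half.shape
    out = np.zeros((H, W), dtype=np.float32)
    for y in range(H):
        for x in range(W):
            hy, hx = y // 2, x // 2
            acc = wsum = 0.0
            for dy in (0, 1):
                for dx in (0, 1):
                    sy = min(hy + dy, hH - 1)
                    sx = min(hx + dx, hW - 1)
                    # Depth-similarity weight: samples close in depth count more.
                    w = np.exp(-abs(depth_half[sy, sx] - depth_full[y, x]) / sigma_d)
                    acc += w * ao_half[sy, sx]
                    wsum += w
            out[y, x] = acc / max(wsum, 1e-6)
    return out

# Tiny usage example with random data standing in for real AO/depth buffers.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    depth_full = rng.random((8, 8)).astype(np.float32)
    depth_half = depth_full[::2, ::2]   # pretend half-res depth buffer
    ao_half = rng.random((4, 4)).astype(np.float32)
    print(joint_bilateral_upsample(ao_half, depth_half, depth_full).shape)  # (8, 8)
```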
 
Have you guys tried overclocking the memory on the 780 and Titan to see if that helps? I'm guessing the 290X's previous advantage at 4K is probably down to Nvidia's earlier cards being bandwidth-limited.
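
For what it's worth, the reference memory specs make that theory easy to sanity-check; these are my own approximate numbers, not figures from the review:

```python
# Peak memory bandwidth = effective memory clock (Gbps per pin) * bus width (bits) / 8.
# Approximate reference specs for the cards being discussed.
cards = {
    "GTX 780":    (6.0, 384),
    "GTX Titan":  (6.0, 384),
    "GTX 780 Ti": (7.0, 384),
    "R9 290X":    (5.0, 512),
}
for name, (gbps, bus_bits) in cards.items():
    print(f"{name}: {gbps * bus_bits / 8:.0f} GB/s")
# GTX 780/Titan ~288 GB/s, 780 Ti ~336 GB/s, 290X ~320 GB/s; a ~10% memory OC on
# the 780/Titan (~317 GB/s) roughly closes the gap to the 290X.
```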
 
I agree because fuck logic and value.

$1000 buys 1x 30" 2560x1600 monitor
$1000 buys 2-3x 27" 2560x1440 monitors
$1000 buys 2x Crossover 2755AMGs, which support 3840x2160 over HDMI @ 30Hz.

Which solution offers more screen real estate? The same pricing scheme applies to budget monitors: 2x 16:9 1080p = 1x 16:10 1200p monitor.


Does the Crossover 2755AMG support 3840x2160?

I have not seen a Seiki in person to comment, but is it really so awful at 2160p with gaming? For a monitor that is roughly a fifth of the price of its competition, how bad is it?

A fine review Brent, thank you. I suppose equal fan speed/dB levels might be required in future reviews to assuage the nitpicking. (kidding)
 
LOL, Nvidia fanboys out in full force, defending Nvidia pride and trying to downplay uber mode as much as possible; they must do everything to try to make the 780 Ti look a lot better, even though it isn't.

The Ti is a lot better, full stop!

Who in their right mind would even purchase a 290X over a 290? It defies logic completely. A shit card vs. a decent one.
 
1.) Yes
2.) The 2755AMG is an excellent monitor (nearly finished review) aside from the Plasma Deposition Coating (subjective; I'm not a fan of the blacks), the lack of a real warranty, the stand, and the inability to easily exchange or return it.



Agreed.



According to what I have gleaned from eBay and the review, it is a 1440p monitor.

I would not say that it is ideal, but 30Hz is not a death blow for a monitor (photo editing or multi-window viewing). Phosphor persistence would be a sufficient irritant to rule it out as a video or gaming screen. Without seeing one, I couldn't pass judgement based on refresh... Sorry, this probably belongs in a different thread.
 
It is a native 1440p monitor, but it supports 3840x2160 over HDMI @ 30Hz.

While it might be an excellent monitor, it does not have native "4K" resolution. I assumed that even the Seiki has that.
 
I'd like to ask the person/people who ran this test, since they'd have as much 4K experience as anyone by now (other people with 4K displays feel free to come in as well):

Did you have issues on AMD hardware (perhaps on both sides?) with games launching at 1920x2160 first and having to change the resolution afterwards? I'm assuming this is a general Eyefinity limitation with presenting the native resolution to start with, which wouldn't normally be an issue as you'd just get one display come up, but many games can't handle an obscure aspect ratio like 1920x2160. Some just display one half of a 3840x2160 image, and you have to hope only the left-hand side of the game will be good enough (e.g. CS:GO); with others, the game simply refuses to launch with a Direct3D error unless you disable tiling and go back to 30Hz (Supreme Commander).

I'm not going to make a decision on my upgrade strategy until I know where the source of this problem lies. If it's a defect inherent to Eyefinity that GeForces do not suffer from, and that I'll always have to put up with on AMD hardware, I'd have a serious think about getting a GTX 780 Ti, or possibly eventually two.
 
Both cards are great pieces of kit...but it's becoming more and more apparent that raw speed numbers (in reviews or benchmarks) should not be the only factor on which a person bases their decision to purchase.
One only has to look at ISI's rFactor2 forums for that.
All the AMD cards (290 series included) are struggling to run that less-than-graphically-demanding simulation...and have been for a while.
You'd think a driver release would have cured the issue ages ago...but no.
My point is: You need to closely monitor the performance of whatever games you're going to be playing most with whatever card you buy...as well as what support the card is getting to fix potential issues.
It means absolutely nothing if a card is 'screaming fast' but doesn't work for you.
 
Both cards are great pieces of kit...but it's becoming more and more apparent that raw speed numbers (in reviews or benchmarks) should not be the only factor on which a person bases their decision to purchase.

All the AMD cards (290 series included) are struggling to run that less-than-graphically-demanding simulation...and have been for a while.


Did you even read the review?
 
Because they cover the smoothness in it, which clearly makes your post useless, sorry.
 
You should be sorry....
Did you even read my post? I clearly stated that the guys running rFactor2 are struggling with it.
Maybe...just maybe, you should read some more before commenting.
 
It's a fair comment, but unfortunately smaller titles very commonly have huge gulfs in performance between GPU manufacturers. If it's not big enough to warrant code in the drivers to optimise it, it'll probably run terribly on whichever brand the developers didn't have in their testing machines.
 
It's a fair comment, but unfortunately smaller titles very commonly have huge gulfs in performance between GPU manufacturers. If it's not big enough to warrant code in the drivers to optimise it, it'll probably run terribly on whichever brand the developers didn't have in their testing machines.

Which is why my post was about monitoring the game performance of the titles a person plays most...before buying a specific card based on benchmarks.
 
Yeah agreed, but that's been true since the very beginning, I wouldn't say 4K changes that any.
 
The 780 Ti is awesome, but the price savings on the 290X for similar performance is hard to beat... not that it matters to me now, since I can't afford them and won't be getting/needing a 4K monitor anytime soon anyway. Thanks for posting.
 
I'm hoping that once the aftermarket R9 290Xs come out, and the bit/litecoin craze calms down, the prices of the GTX 780 Ti are driven down so I can step up to one and only pay a minimal difference.
 
I'm hoping that once the aftermarket R9 290Xs come out, and the bit/litecoin craze calms down, the prices of the GTX 780 Ti are driven down so I can step up to one and only pay a minimal difference.

That's been the basic prediction: Nvidia has continued to maintain their premium pricing due to AMD's follies, particularly their less-than-stellar reference cooler and failure to meet the market demand for crypto-currency processors.

Essentially, as expected upon these cards' release, once R9 290(X) cards with decent aftermarket coolers are widely available, Nvidia will have to adjust their pricing in order to maintain sales volume.

Hopefully it won't be too long :).
 