GeForce GTX 780 Ti vs. Radeon R9 290X 4K Gaming @ [H]

In all honesty, I think Nvidia just keeps their prices high, full stop. It's only when they've actually been outcompeted for 'fastest card' that the prices have really come down, such as when the 290X first came out and toppled the GTX 780. I don't really think AMD's prices coming down will reduce the price of Nvidia's cards, especially not the GTX 780 Ti; there is simply no reason to lower it while it's 'the best', never mind best value. Looking back, the only thing I recall cutting the prices of GeForces is the release of more GeForces - e.g. the GTX 285 superseding the GTX 280, the GTX 580 superseding the GTX 480, and the GTX 780 superseding the GTX 680. Even then, they don't always drop the price, and just let the old model phase out through being uncompetitive on value.
 
It's a business case: if Nvidia sells their cards at a perceived 'premium', it's because people are willing to pay that premium. They won't lower prices unless their cards stop selling badly enough for them to care, which means that AMD actually needs to out-compete them across the board, not just on raw performance.

And as it stands, limited availability of AMD cards keeps the prices high. The *coin craze will die down, good R9 290(x) cards will hit full availability, stuff like Mantle and G-Sync will begin to be realized, and then we'll be able to judge Nvidia's 'over-pricing' :).
 
Fact is, if they were too overpriced, people wouldn't buy them. The prices stay high because the features AMD doesn't offer, combined with Nvidia's fairly sizeable loyal fan base, mean Nvidia will still sell cards even when they're clearly worse value than the opposition. If it didn't work, they wouldn't do it.
 
In Q3 2013 Nvidia had 64.8% discrete GPU market share per Jon Peddie Research. So people are still buying based on the larger feature set (adaptive V-sync, ShadowPlay, etc.) and other intangible factors. Nvidia also has much better software, with regular feature additions every 2-3 months, while AMD is still working on bug fixes and hasn't added a major new feature since MLAA back in 2010 or so. In the past it was easy for me to discount Nvidia's software, but having used it for a while it has really grown on me - enough to make me think twice about going back to AMD, and I used AMD/ATI for many years in the past, most recently with 7970s. Comparing the software situation of the 7970s to the GTX 780 has been night and day in terms of quick fixes and feature additions, but I won't get into that.

Nvidia would adjust their prices if people basically stopped buying. But gamers are still buying, so we are where we are with pricing. Not a lot of PC gamers actually buy AMD over Nvidia at the same price/performance; AMD needs the lower pricing to be a viable competitor. Then again, the situation right now is pretty anomalous given the whole mining thing. Miners are buying these cards by the truckload, though I can't see that being sustainable for much longer. Supply seems to be leveling off now that LTC/BTC prices have taken a nosedive.
 
What do you mean? AMD has added SSAA directly through their control panel since then. Nvidia doesn't even have that feature, although I do prefer Nvidia's current SGSSAA support.

They've also added transparency SSAA in DX10/11 titles since then.

While they may not have added much in the control panel itself, they have helped kickstart quite a few open standards that we see in games.

I'm not trying to play the part of the AMD defense squad, but your post isn't exactly accurate.

Don't get me wrong, I am willing to pay a premium for what Nvidia offers for my use compared to AMD, mostly due to SGSSAA and 3D Vision.
 
It rarely is.
It has a few well-known facts that are then smothered with opinion passed off as fact.
Most of the reasons he gave are moot, because NV was charging a premium, and was perceived as premium, even before such features were introduced - thanks to good marketing and the 'The Way It's Meant to Be Played' campaign.

When I was building gaming PCs for PC-illiterate users who requested an NV card, I would ask them why, since they would get less hardware overall within their budget. They said because NV is better, and when I asked why again, not a single feature was mentioned - only that they saw NV marketed more and the 'The Way It's Meant to Be Played' splash pop up in games.
I said to try the ATI/AMD card, and if they didn't like it I would replace it with an NV card and foot the difference. Not once did that ever happen.
 
You can't go wrong with either of these cards. Some people will prefer one over the other due to features or performance, and others will go with price. Either way, you can't go wrong.

Regarding high card prices, there certainly are market forces in play, a la altcoin mining, but the real point is that the market now lets manufacturers build and sell enthusiast-level cards (which, admittedly, is a subjective category) at a premium price. This is the same basic principle that led Toyota, Honda, and Nissan to make the leap to luxury brands. You can rail on Toyota all you want for pricing a Lexus at 50K when you can only afford half that, but it doesn't change supply and demand when people are more than willing to fork out $600-700 for a card.

At the end of the day, it's your money, and if you don't buy into the high prices of either manufacturer, then vote with your wallet and get something you feel is of value. Even the value cards are good performers. If, on the other hand, you don't mind a markup or paying a premium, then do so; it's your money, spend it how you like, because it's no one's business how you spend it.
 
Most people don't even use the features that do differentiate the two. For that sort of user, I really don't see the point in paying a premium for an Nvidia card.
 
Quad 4k? I don't think there is anything that would drive that with any reasonable level of performance from a gaming standpoint.

Really? I thought the review was showing a single card running 4K, so couldn't you, allowing for some scaling losses, expect a quad 290X setup to run quad 4K, with next-gen cards offering something closer to the performance seen in this review on a one-card-per-4K-display basis? I hope to at least go triple 4K this year.
 
Performance doesn't scale in a linear fashion with multiple cards. Unless I'm mistaken, anything beyond 2 cards and the returns diminish significantly. However, I may be completely off-base. NV was showing off 3x 4K Surround; I have no idea what performance was like.

*edit: Looks like 3x 4K is doable with triple 7970s... The requirements are still pretty bonkers, though! :eek:
 
I thought you lost about 30% performance on each card, so four gives you roughly 2.8x the performance of one, thereabouts.
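For what it's worth, here's a back-of-the-envelope sketch of that scaling math, assuming a flat 70% per-card efficiency (the 30% figure above is a rule of thumb, not a measurement, and real SLI/CrossFire scaling varies per game and driver):

```python
# Back-of-the-envelope multi-GPU scaling, assuming each card in an
# SLI/CrossFire setup contributes a flat 70% of one card's throughput
# (the 30%-loss rule of thumb above; real scaling varies per game).
SINGLE_CARD_FPS = 30.0  # hypothetical single-card 4K framerate
EFFICIENCY = 0.70       # assumed per-card efficiency beyond one card

for cards in range(1, 5):
    scaling = 1.0 if cards == 1 else cards * EFFICIENCY
    print(f"{cards} card(s): ~{scaling:.1f}x -> ~{SINGLE_CARD_FPS * scaling:.0f} fps")
```

Under those assumptions, four cards land at roughly 2.8x a single card, which matches the figure quoted above.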
 
I just got the GTX 780 and it is amazing. I can tell a difference in color even on a normal web page. I play BF4 as well, and this makes that game even better!
 
Their nTune software makes my computer bluescreen every time; I even did a clean install of Windows 7. Does anyone else have that problem?
 
Nvidia nTune? What is it, 2006 again? Also, digital output basically rendered any 2D image quality differences obsolete; digital output is consistent. This was a real issue with analog VGA output many, many years ago, where the RAMDAC could have a real effect on 2D image quality - that's where Matrox in particular excelled. But back to the point: digital output is consistent across everything. This isn't 1999 with analog output, give me a break.
 
Very nice review
Good to know there is still parity between the two brands. It will keep competition flowing and prices more reasonable.
 
Would like to see how the extra VRAM in the GTX TITAN compares in terms of performance to these 3GB and 4GB cards. Sure it's $300 more expensive, but it would also illustrate one way or the other how VRAM usage affects performance in high-pixel density situations. Maybe toss that 8GB Sapphire R9 290X in there for comparison, too ;).
 
VRAM doesn't directly affect performance; the sooner people realize this, the better. VRAM merely lets you use higher anti-aliasing settings or higher-quality assets in game. For instance, if you want to run 8x SGSSAA or 8x resolution scaling (same as OGSSAA), more VRAM lets you do that. Game assets and textures can benefit as well, but current games use jack shit for VRAM in terms of assets unless you mod them (e.g. Skyrim), so anti-aliasing at Surround or 4K resolutions is the single biggest beneficiary of having more VRAM. Beyond 2GB it is not beneficial at 2560x1600 or below, unless you specifically want to go nuts with high SSAA levels in your games - and those VRAM requirements increase with resolution. At 1080p, 1440p, or 1600p it doesn't matter unless you're using some type of professional software suite. Gaming? It makes no difference unless you want absurd levels of OGSSAA (resolution scaling) or SSAA. You can use 2x MSAA at nearly all resolutions up to 5760x1200 with no issues, even with 2GB. The problems happen at 4K / Surround resolutions with 8x MSAA, 8x resolution scaling (OGSSAA), SGSSAA, and settings along those lines. FXAA and MLAA use essentially zero VRAM.

But it isn't a situation where having 2 more GB of VRAM increases your framerate; people have this misconception. You can test a 2GB and a 4GB GTX 680 or 270X at 1280x1024 with the same VRAM timings, and the performance is 100% identical. VRAM is about assets (which barely applies to modern games) and, more importantly, high AA levels at Surround or 4K resolutions.
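To put rough numbers behind that, here's a minimal sketch of framebuffer memory as a function of resolution and MSAA sample count (color plus depth only; it ignores textures, extra render targets, compression, and driver overhead, so the absolute figures are illustrative):

```python
# Rough framebuffer VRAM estimate: 4 bytes RGBA8 color plus 4 bytes
# depth/stencil per sample. Ignores textures, extra render targets,
# compression, and driver overhead -- illustrative numbers only.
def framebuffer_mib(width: int, height: int, samples: int) -> float:
    bytes_per_sample = 4 + 4  # RGBA8 color + D24S8 depth/stencil
    return width * height * samples * bytes_per_sample / 1024**2

for width, height in [(1920, 1080), (2560, 1600), (3840, 2160)]:
    for samples in (1, 4, 8):
        print(f"{width}x{height} @ {samples}x: "
              f"~{framebuffer_mib(width, height, samples):.0f} MiB")
```

The footprint scales with resolution times sample count, which is why 4K plus high MSAA/SSAA is where capacity, not framerate, becomes the limit.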
 
Well, it's funny how people still claim that anti-aliasing is not required for great image quality at 4K. People said the same thing about 1600x1200, then 1920x1200, then 2560x1600, etc. In 2-4 years, when 4K becomes ubiquitous, people are going to start clamoring for AA performance just like in the past, at which point GPUs with 6-8GB of memory will probably be widespread as well.
 
I don't care one way or another; I'm simply correcting your implication that more VRAM increases framerate. That is not the case. As far as AA goes, it's there if you want it. Personally I like FXAA or 2x MSAA and use that in everything; it's the best performance/VRAM trade-off for me. 8x SSAA is far too taxing on even the best GPUs, so I don't use it except in extremely old games, and FXAA and MLAA don't use VRAM, by the way. More VRAM is mostly beneficial for absurd levels of SSAA at Surround and 4K resolutions, or maybe 8x MSAA. With SSAA you quickly run out of GPU horsepower before VRAM becomes an issue: it costs a ton of GPU power for little visual gain, which is why most people use 2-4x MSAA for 60-80% more performance instead of 8x SSAA. The kind of setting that eats that much VRAM at high resolutions isn't tenable on any GPU anyway, so extra VRAM for that type of SSAA is worthless when the GPU can't push it.

Basically, let's go back to your earlier post. You suggested that the Titan would perform better at 4K with its 6GB of VRAM, or that the 8GB 290X would perform better at 4K at the same settings. Neither will perform differently, assuming similar GPU/memory clocks. VRAM is a pass/fail situation: if you use something silly like 8x SSAA resolution scaling (OGSSAA), most cards, even 4GB cards, will fail at 4K. But take a setting that works on a 3GB card at 4K and run it on a 6GB card at 4K, and both will perform the same. I.e., the GTX 780 Ti and GTX Titan Black will perform the same, given the same GPU/memory clockspeeds and all that. It isn't a situation where the Titan Black is faster just because it has more VRAM; the extra VRAM *will* make it more desirable for professional applications and/or allow absurd levels of SSAA at 4K resolution.
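A toy model of that pass/fail argument, with made-up numbers: two cards with identical GPUs but different VRAM post the same framerate whenever the working set fits, and the extra capacity only matters once it doesn't:

```python
# Toy model of "VRAM is pass/fail": identical GPUs, different VRAM.
# FPS is the same whenever the working set fits; when it doesn't,
# the card fails (swapping/stutter) rather than scaling down
# gracefully. All figures are hypothetical.
GPU_FPS = 45.0  # what the GPU itself can push at these settings

def outcome(workload_gib: float, vram_gib: float) -> str:
    if workload_gib <= vram_gib:
        return f"~{GPU_FPS:.0f} fps"
    return "unplayable (spills out of VRAM)"

for workload in (2.5, 5.5):   # e.g. 4K + 2x MSAA vs. 4K + 8x SSAA
    for vram in (3, 6):       # e.g. 780 Ti (3GB) vs. Titan (6GB)
        print(f"{workload} GiB working set on a {vram}GB card: "
              f"{outcome(workload, vram)}")
```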
 
Wow, very impressive showing by AMD's 290X.
It equals Nvidia's flagship in performance and sells for $150 LESS!

Finally, a solid reason to switch brands. I've been running Nvidia cards for the past 3 years or so simply because they were the fastest and had the best drivers.

Now AMD has given me a reason to take notice and consider the other side.
Nvidia needs to respond by releasing the next generation within the next 6 months or by cutting some of their prices, IMHO. Those price premiums are no longer justified when you no longer sell the fastest GPU on the planet.
 
Hasn't this been the case for a while (disregarding the mining price spike)? I have always just thought you saved money going AMD, but had better drivers with Nvidia?
 
Actually, previously Nvidia held a noticeable performance edge to justify the price premium.

Remember how AMD used to go on the model of "our single chips will be slower but cheaper; we'll catch up with our dual-chip premium model"? Well, that didn't work so well because - and I think they finally realized this - not a lot of people buy dual-chip GPUs!

The 290X works because it's a single-chip solution that matches the performance of Nvidia's BEST GPU at a significantly lower price point.
 
That's definitely true. The 7970 never held a candle to the 780, but before prices went up, weren't 7970s going for $230-ish while the 770, which went for $325+, outperformed them by only 15-20%?

I mean, yeah, 15-20% is a lot, but the price difference is 30% or more.

The 780 held the "top dog" position for single GPUs, so it carried a (deserved?) premium, but I always thought the 770 should have dropped to the $275 mark to be more competitive.
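A quick sanity check on that price/performance math, using the figures quoted above (the 7970 as a 1.0x performance baseline and the 770 at roughly 1.15-1.20x; these are the poster's numbers, not benchmark data):

```python
# Perf-per-dollar sanity check using the prices and performance
# deltas quoted above (the poster's figures, not benchmark data).
cards = {
    "7970": {"price": 230, "perf": 1.000},
    "770":  {"price": 325, "perf": 1.175},  # midpoint of "15-20% faster"
}

for name, c in cards.items():
    print(f"{name}: {1000 * c['perf'] / c['price']:.2f} perf per $1000")

premium = cards["770"]["price"] / cards["7970"]["price"] - 1
print(f"770 price premium over the 7970: {premium:.0%}")  # ~41%
```

At those prices the 770 costs about 41% more for roughly 17% more performance, which is consistent with the "30% or more" gap described above.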
 
Damn awesome review, I love articles like this. You cannot really find many high end vs high end reviews that are this well done!
 