780 Ti or 290X... or just wait for the 800 series?

There are tons of reviews of the 780 Ti running out of VRAM on 4K monitors in games like Far Cry, while the 290/290X did not; search for it. Now we're talking Far Cry; picture what will happen when Star Citizen comes out, and all the new games in 2014. If you own a 4K monitor you will have no choice but to lower your settings, and doing that defeats the purpose of owning a high-end card.

Show me these "tons of reviews" proving the 780 Ti is running out of VRAM at 4K and I may just believe you. LOL
 
I haven't seen any GTX 780 cards failing at 4K resolution in Far Cry 3 to date. Do you have a link to such a review? Also, just to point this out, Star Citizen won't have all its modules complete until 2015, so the full game is a ways off. Not even sure why that's being brought up...

But, if we just go with your argument for a moment: 4K resolution? How many people use 4K? Heck, 4K is barely usable right now with most screens limited to 30 Hz.

Don't get me wrong, I use high-resolution IPS panels myself, but you're really talking about a niche of a niche of a niche at this point; I wouldn't use 4K as a sole argument for more VRAM. The fact of the matter is that very few people use 4K, far fewer than half of 1%. Maybe 4K adoption will improve over the next year, and I *hope* it does, but it's not as if 100% of users will suddenly have 4K monitors next year. I love high resolution, but until costs come way down, I don't see 4K being all that important just yet.
 
I am not at my computer right now, so I can't find too many. Here is one: http://www.legitreviews.com/nvidia-geforce-gtx-780-ti-vs-amd-radeon-r9-290x-at-4k-ultra-hd_130055
I see your 4K point, not many are using it yet, BUT this is the current best offering from Nvidia and it's not pulling its weight. Don't get me wrong, I am not defending AMD here; they should have done a better job with their cooling solutions from the start. But when you compare a "normally priced" 290X to a 780 Ti, not only does the 780 Ti cost $180+ more than a 290X, it's also underperforming at the 4K settings the market is moving towards.
 
So Far Cry 3 starts to stutter during a bench run at super-high image settings in 4K. We're talking about frame rates that are already unplayable on both cards. LOL, what a joke. The 780 Ti took the lead more times in that review.
 
It's really pointless talking to a fanboy such as yourself. If you can't distinguish the difference between the two cards in that review in terms of price/performance, then there is not much more I can say.
 
I would like to mention that the R9 290X is faster than the 780 Ti when you use an aftermarket cooler or water block.

The R9 290 matches the 780. I would go with a 780 over a 290 (non-X), mainly because they are a better value atm due to inflated prices.

I would go with a 290X over the 780 Ti, though. It's cheaper, and with good cooling it outperforms it.
 
I am not at my computer right now, so I can't find too many. Here is one: http://www.legitreviews.com/nvidia-geforce-gtx-780-ti-vs-amd-radeon-r9-290x-at-4k-ultra-hd_130055
I see your 4K point, not many are using it yet, BUT this is the current best offering from Nvidia and it's not pulling its weight. Don't get me wrong, I am not defending AMD here; they should have done a better job with their cooling solutions from the start. But when you compare a "normally priced" 290X to a 780 Ti, not only does the 780 Ti cost $180+ more than a 290X, it's also underperforming at the 4K settings the market is moving towards.

Looks like 8x MSAA is the culprit there; lowering it a notch would more than likely solve any such issues.

Don't get me wrong, though, I would always advocate more VRAM for surround, and as I understand it, 4K uses a higher pixel count than 5760x1080. So I guess the same would apply: more VRAM would be desirable for both 4K and surround. Yet by the same token, it isn't as if the game is unusable. Generally speaking, when VRAM runs out, anti-aliasing is the first culprit if it's multi-sample or super-sample. That takes a great deal of VRAM, especially at higher resolutions, and 8x MSAA would probably be overkill; I personally never use that much AA these days. That's just me, though.

I'll concede your example, though; thanks for the link. For those settings, more VRAM would be an asset, I guess.
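For a rough sense of scale, the pixel-count claim above checks out. Here's a quick back-of-the-envelope sketch in Python; the bytes-per-pixel and buffer-layout figures are simplified assumptions, not how any actual driver allocates VRAM:

```python
# Back-of-the-envelope pixel and framebuffer math.
# Assumptions (not real driver behavior): 4 bytes/pixel color,
# 4 bytes/pixel depth, MSAA storage scaling linearly with sample count.

def pixels(width, height):
    return width * height

res_1080p    = pixels(1920, 1080)  # 2,073,600
res_surround = pixels(5760, 1080)  # 6,220,800 (5760x1080 surround)
res_4k       = pixels(3840, 2160)  # 8,294,400

# 4K pushes ~33% more pixels than 5760x1080 surround:
print(res_4k / res_surround)                  # 1.333...

# One plain 32-bit color buffer at 4K:
print(res_4k * 4 / 1024**2, "MiB")            # ~31.6 MiB

# With 8x MSAA, color + depth sample storage balloons, which is why
# dropping MSAA a notch frees a lot of VRAM at high resolutions:
print(res_4k * (4 + 4) * 8 / 1024**2, "MiB")  # ~506 MiB
```

Textures, shadow maps, and extra render targets sit on top of that, so a single 8x MSAA 4K target eating around half a gigabyte goes some way towards explaining why a 3 GB card can run dry before a 4 GB one does.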
 
I would like to mention that the R9 290X is faster than the 780 Ti when you use an aftermarket cooler or water block.

By the same token, once you use an aftermarket 780 Ti or OC it yourself, it once again retains the lead by a good margin. There are factory-OC'ed 780 Tis that have already been benchmarked as faster than both the aftermarket 290X (Asus DC II) and the GTX 690. Now I'm not saying the aftermarket 290X is bad; clearly the performance is very good. The prices are just stupid in the US right now and don't make sense over the 780 Ti. And the 780 Ti is still faster if you get a factory-OC'ed model.

If you compare a reference 290X to a reference 780 Ti, the 780 Ti is faster. If you compare an aftermarket 290X to an aftermarket 780 Ti, the aftermarket 780 Ti is faster. Comparing reference to aftermarket isn't an apples-to-apples comparison. By the way, I can link benchmarks of an overclocked DC II 290X against the Gigabyte WF3 780 Ti with the latter winning by a good margin; I just don't feel like wasting space on this page. Don't get me wrong, the aftermarket 290X cards are great in terms of performance, but they're not the fastest in an apples-to-apples, aftermarket-to-aftermarket comparison. The prices are also a bit silly right now, but we all know that. Hopefully the entire mining situation corrects itself so that these cards become more viable for non-miners.

The way I see it, once the pricing adjusts for the aftermarket 290 and 290X cards, they will be great cards and great purchases. And at those prices ($100-150 less than the Nvidia counterparts) you can more easily ignore the software differences between AMD and Nvidia. I can say that if I saw a DC II 290 (non-X) card around the $420 mark, I'd buy it instantly. You just can't beat that kind of price/performance, with a whisper-quiet cooler to boot. It's just hard to do that when the aftermarket 290/290X and the 780/780 Ti are essentially the same price, as they are now in the States; I do hope that situation corrects itself.
 
If you own a 4K monitor you will have no choice but to lower your settings, and doing that defeats the purpose of owning a high-end card.

If you own a 4K monitor, you'll be buying 2-4 of the top-end Maxwell cards on the day they release, because neither the 290X (even if it were realistically purchasable) nor the 780 Ti offers acceptable performance at 4K with max settings right now anyway.

It's pointless trying to prepare for 4K with the current generation of cards; they're not up to the task. I say this as a person who actually owns a 4K monitor, btw, and has for many months.
 
It's really pointless talking to a fanboy such as yourself. If you can't distinguish the difference between the two cards in that review in terms of price/performance, then there is not much more I can say.

Oh noes, I got called a FANBOY. Nice one, troll. I was making the point that the 780 Ti isn't lagging because of a lack of VRAM. Read the damn thread before you call people fanboys. LOL
 
If you own a 4K monitor, you'll be buying 2-4 of the top-end Maxwell cards on the day they release, because neither the 290X (even if it were realistically purchasable) nor the 780 Ti offers acceptable performance at 4K with max settings right now anyway.

It's pointless trying to prepare for 4K with the current generation of cards; they're not up to the task. I say this as a person who actually owns a 4K monitor, btw, and has for many months.

Correct me if I'm wrong, but isn't 1920x1080 at 4x SSAA equal to 4K?

edit: if 4x SSAA = 1080p "upscaled" (downsampled) then rendered? Too drunk to think.
 
4x SSAA is equal to 4K with no anti-aliasing, assuming all else is equal (texture resolutions, etc.), yes. I don't understand the purpose of this comparison, though.

Simply my drunken (mis)understanding that 1920x1080 with 4x SSAA would be equivalent to 4K resolution (3840x2160), i.e. 4K results would give approximately the same result as 1080p with 4x SSAA.
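The sample-count arithmetic behind that equivalence is easy to check; here is a tiny sketch (pure arithmetic, nothing card-specific):

```python
# 4x SSAA renders the scene at a higher internal sample count and then
# downsamples to the output resolution.

w, h, samples = 1920, 1080, 4
ssaa_samples = w * h * samples   # samples shaded at 1080p with 4x SSAA
native_4k    = 3840 * 2160       # pixels shaded at native 4K

print(ssaa_samples == native_4k)  # True: same shading work, roughly

# The difference is the output: 4x SSAA downsamples back to 1080p
# (smoother edges on fewer pixels), while native 4K keeps all the
# pixels on screen.
```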
 
Would there be a VRAM limitation in BF4 multiplayer at 1440p @ 120 Hz on Ultra with a GTX 780 Ti?
 