780 Ti or 290X... or just wait for the 800 series?

I play with a 780 Ti on a QNIX and hit about 2.6-2.7GB allocated in Precision with max textures, 120% resolution scaling, and AA/FXAA off in BF4 multiplayer.
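If you want to sanity-check those Precision readings from outside the game, here is a minimal polling sketch. It assumes nvidia-smi ships with your driver and reports memory figures for your card; the one-second interval and plain console output are arbitrary choices.
[CODE]
# Poll nvidia-smi once a second and log VRAM usage while the game runs.
# Assumes nvidia-smi is on PATH and exposes memory.used/memory.total for
# the card; the interval and output format are arbitrary choices.
import subprocess
import time

def vram_mb():
    out = subprocess.check_output([
        "nvidia-smi",
        "--query-gpu=memory.used,memory.total",
        "--format=csv,noheader,nounits",
    ]).decode()
    used, total = (int(x) for x in out.strip().splitlines()[0].split(", "))
    return used, total

while True:
    used, total = vram_mb()
    print(f"{time.strftime('%H:%M:%S')}  {used} / {total} MiB")
    time.sleep(1)
[/CODE]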
 
The XFX R9 290 would be my vote... then unlock it to a 290X. It can be had for around $499 from Amazon, and if you have decent credit you can get an Amazon card with $50 instant credit, putting you down to only $450 for what should be a 290X (after unlocking). :) And this is coming from an NVIDIA man my entire life, excluding my most recent card. :)
Hard to beat an R9 290X for $450, IMO.

Got mine for $395 with BF4, sold BF4 for $25, and unlocked it... a 290X for $370. :D :D Beaten!
 
A nice, cool, and silent GTX 780 and nothing more. With a little healthy overclock, 290X performance is there. A bit more and it's very close to the 780 Ti.

780 Ti GHz or OC models are faster GPUs than the current 290X; they are much faster even than a stock 780 Ti... For a single-GPU rig it's the best GPU in my opinion. 3GB of VRAM is enough for 1440p with no problems. The Gigabyte 780 Ti GHz is more than 20% faster than the 290X in BF4 and still has some room left for OC. If money is not the problem, the 780 Ti GHz.
 
20nm may be a while away. TSMC fucked both AMD and NVIDIA over; NVIDIA even put out a public statement saying how unhappy they were with TSMC.

TSMC screwed up its 28nm fabs (which is why the original 7970 chips ran at high voltage and underperformed).

And they have had constant delays on their 20nm fabs; AMD and NVIDIA have both expressed interest in changing their business relationship with TSMC.
 
A nice, cool, and silent GTX 780 and nothing more. With a little healthy overclock, 290X performance is there. A bit more and it's very close to the 780 Ti.

780 Ti GHz or OC models are faster GPUs than the current 290X; they are much faster even than a stock 780 Ti... For a single-GPU rig it's the best GPU in my opinion. 3GB of VRAM is enough for 1440p with no problems. The Gigabyte 780 Ti GHz is more than 20% faster than the 290X in BF4 and still has some room left for OC. If money is not the problem, the 780 Ti GHz.

IMHO the 280X (7970 GHz) and the 780 are great buys on sale (pre-mining hysteria). The latest 290Xs are dead even with the Tis on price/performance, but the real winners are the used 780s from lemmings jumping to the latest and greatest (Tis): 80% of the performance at almost HALF the cost, very similar to the 280X/7970 vs the 780 at launch (80% of the perf for 50% less).

More often than not, it is the early adopters who are winning out.
 
I upgraded from a highly overclocked GTX 780 to a highly overclocked GTX 780 Ti and there's a quantifiable difference in performance that I feel was worth the extra money spent.
 
I upgraded from a highly overclocked GTX 780 to a highly overclocked GTX 780 Ti and there's a quantifiable difference in performance that I feel was worth the extra money spent.
You're justifying your purchase, nothing more to it. Still the same 3GB of memory. You purchased another overclocked GTX 780.
 
Ugh, another random misinterpreting and inserting his own nonsense. I can assure you I am not justifying my purchase. Out of pocket it cost me $90 to upgrade, if you want to talk money. 3GB of memory, what is your point? (I don't actually care.)
 
You won't see $300 290s for a while, lol.
If mining stays profitable for the foreseeable future, that won't ever happen until the next line of AMD cards comes out... and that's assuming the new cards are better at mining.
Retail prices will probably drop before eBay gets flooded.
 
You're justifying your purchase, nothing more to it. Still the same 3GB of memory. You purchased another overclocked GTX 780.


Umm, 3 gigs of VRAM is enough even for 4K resolutions and multi-monitor resolutions. Also, the 780 Ti has very fast VRAM speeds, which make up for the wider memory bus the 290(X) has. What, you need more than 3 gigs of memory right now? LOL
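For what it's worth, the raw numbers behind that claim are easy to work out: peak bandwidth is the bus width in bytes times the effective GDDR5 data rate. A quick sketch using the reference memory clocks (factory-OC cards will differ):
[CODE]
# Peak memory bandwidth = (bus width in bytes) * (effective GDDR5 data rate).
# Reference memory clocks assumed; factory-overclocked cards differ.
def peak_bandwidth_gbs(bus_width_bits, data_rate_gtps):
    return bus_width_bits / 8 * data_rate_gtps

print("GTX 780 Ti:", peak_bandwidth_gbs(384, 7.0), "GB/s")  # 336.0 GB/s
print("R9 290X:  ", peak_bandwidth_gbs(512, 5.0), "GB/s")  # 320.0 GB/s
[/CODE]
So despite the narrower bus, the 780 Ti's faster memory gives it slightly more raw bandwidth on paper.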
 
You won't see $300 290s for a while, lol.

Depends on when the mining bubble pops. When BTC mining on GPUs became unprofitable, AMD cards flooded eBay at low prices. The guys who are exclusively miners (and aren't gamers), a large percentage of them will unload their wares on eBay with race-to-the-bottom pricing.

But who knows how long mining will remain as it is. Right now it is profitable if you have a farm; if difficulty increases 10%, it won't be. If mining goes tits up, that will actually be a good thing for PC gamers, because they will then be able to buy 290/290X cards at reasonable prices in the States, which isn't really possible right now unless you get lucky.
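The profitability math being gestured at here is simple to sketch. The formula below is the standard expected-reward calculation; every input number (hashrate, difficulty, reward, coin price, power draw, electricity rate) is a made-up placeholder, not a real quote:
[CODE]
# Expected coins/day = hashrate * 86400 * block_reward / (difficulty * 2^32)
# All numeric inputs below are hypothetical placeholders for illustration.
def daily_profit(hashrate_khs, difficulty, block_reward, coin_price_usd,
                 power_watts, usd_per_kwh=0.10):
    coins = hashrate_khs * 1e3 * 86400 * block_reward / (difficulty * 2**32)
    revenue = coins * coin_price_usd
    power_cost = power_watts / 1000 * 24 * usd_per_kwh
    return revenue - power_cost

now = daily_profit(900, 3500, 50, 20, 300)            # one card, made-up inputs
later = daily_profit(900, 3500 * 1.10, 50, 20, 300)   # same card, +10% difficulty
print(f"${now:.2f}/day now vs ${later:.2f}/day after a 10% difficulty bump")
[/CODE]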
 
Bad times for 290 buyers in any case. US prices for the 290 are at European levels now. In most of Europe the mining madness didn't affect prices much. It's probably the first time I've seen the same prices for a high-end GPU in my country and in the US; $1 = €1 + 10% is the usual formula that's close to the truth.
 
Yeah, because VRAM is more important than how fast a card actually is and more important for future-proofing... yeah, we got it. More BS. The 780 Ti wins hands down even at 4K resolutions at ultra settings with high SSAA enabled. Stop your foolish VRAM snobbery.


Nice noob post, thanks.

[attached benchmark charts comparing the GTX 780 Ti and R9 290X]
 
Umm, 3 gigs of VRAM is enough even for 4K resolutions and multi-monitor resolutions. Also, the 780 Ti has very fast VRAM speeds, which make up for the wider memory bus the 290(X) has. What, you need more than 3 gigs of memory right now? LOL
Keep drinking that Kool-Aid, son.
 
Nice noob post, thanks.

[attached benchmark charts comparing the GTX 780 Ti and R9 290X]

What does that have to do with VRAM not being enough to run at 4K resolutions, though? So you showed me a few games where the 290X is slightly faster. Nothing about the 780 Ti losing to the 290X because it's out of VRAM at ultra-high resolutions. LOL. And I was being sarcastic in my post to the person who posted about NVIDIA being stingy on VRAM. Like VRAM means everything about performance. Damn VRAM trolls.
 
What does that have to do with VRAM not being enough to run at 4K resolutions, though? So you showed me a few games where the 290X is slightly faster. Nothing about the 780 Ti losing to the 290X because it's out of VRAM at ultra-high resolutions. LOL. And I was being sarcastic in my post to the person who posted about NVIDIA being stingy on VRAM. Like VRAM means everything about performance. Damn VRAM trolls.

Tell that to people running 2GB 680 SLI, lol...
 
BF4 doesn't use more than 2GB of VRAM at 2560x1600 unless you use downsampling. Downsampling (resolution scaling) is OGSSAA, which is an extraneous feature that kills performance anyway. OGSSAA balloons VRAM usage in any game, by the way. You could take the most innocuous game in terms of VRAM on the planet, such as Black Ops 2 (a Quake 3 engine derivative, BTW), add downsampling, and run out of VRAM.

Anti-aliasing eats VRAM. SSAA eats more VRAM. OGSSAA absolutely devours VRAM. You could easily go past 4GB of VRAM at 1080p by going nuts with downsampling/OGSSAA, but there's really no point, as it kills performance anyway. You'll run out of GPU power well before VRAM becomes an issue.

There are valid reasons for more VRAM, but it doesn't come into play until you're using super-high surround resolutions, and even then it's not a performance thing, it's more of a "higher image quality" deal. AA is by far the biggest culprit when it comes to VRAM, and having more allows you to pile on more anti-aliasing, especially at super-high surround resolutions. If I were using 3x 1440p panels, I could see 4GB-6GB being useful there. For 1600p, though? Nah. 3GB is fine. 2GB is fine, although you can't use super-high levels of SGSSAA or downsampling. It certainly isn't a big deal to have 2-4GB of VRAM at 1080p or even 1600p.
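To put rough numbers on how downsampling inflates memory use: a render target costs roughly width x height x bytes per pixel, and resolution scaling multiplies both dimensions. The sketch below is back-of-envelope only; the count of three full-size buffers is an arbitrary stand-in, and real deferred renderers keep far more targets plus all their textures.
[CODE]
# Rough render-target cost of resolution scaling / OGSSAA.
# Ignores textures, geometry, compression and driver overhead; the
# buffer count of 3 is an arbitrary illustration, real engines use more.
def render_target_mb(width, height, scale=1.0, bytes_per_pixel=4, buffers=3):
    pixels = (width * scale) * (height * scale)   # scale applies per axis
    return pixels * bytes_per_pixel * buffers / 1024**2

for scale in (1.0, 1.2, 1.5, 2.0):
    mb = render_target_mb(2560, 1600, scale)
    print(f"2560x1600 at {int(scale * 100)}% scale: ~{mb:.0f} MB in render targets")
[/CODE]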
 
I wonder if it's possible to run 150% RS in BF4 with a 40 fps minimum at ultra settings (MP). Has anyone tested this with an overclocked 290/GTX 780?
 
People keep bringing up BF4 running out of VRAM and stuttering like crazy at ultra-high settings, though. But isn't BF4 full of bugs and memory leaks, and aren't there lawsuits against the game's publisher because of how badly the game is optimized? I just heard something about that today. And isn't that why Mantle keeps getting pushed back for it? LOL

Sorry, but if that's people's excuse for saying we need more VRAM rather than more GPU power, then that's a poor example. I run a 1600p monitor and still don't experience half the stuff people post on these forums about running out of VRAM. LOL
 
People keep bringing up BF4 running out of VRAM and stuttering like crazy at ultra-high settings, though. But isn't BF4 full of bugs and memory leaks, and aren't there lawsuits against the game's publisher because of how badly the game is optimized? I just heard something about that today. And isn't that why Mantle keeps getting pushed back for it? LOL

Sorry, but if that's people's excuse for saying we need more VRAM rather than more GPU power, then that's a poor example. I run a 1600p monitor and still don't experience half the stuff people post on these forums about running out of VRAM. LOL

I never said anything about VRAM. I asked if anyone has run 150% resolution scale so I know what settings I could use on my card.
 
I never said anything about VRAM. I asked if anyone has run 150% resolution scale so I know what settings I could use on my card.

Sorry for the confusion, but I didn't quote you because that wasn't directed at you. :)
 
I think the point being made is that you can create conditions that will make any card run out of VRAM. That's really the point. If you want a 290 with 4GB of VRAM to run dry at 1080p, you can do it with OGSSAA. If you want to run out of VRAM at 1600p with mods, you can do that. Whether you have 2GB, 3GB, or 4GB of VRAM, you can go overboard with settings to make your VRAM disappear. As it turns out, OGSSAA/resolution scaling uses more VRAM than anything else. Mods are pretty close, depending on what type of mods you want to use. But the normal gaming experience isn't going to require 3-4GB of VRAM at 1080p or even 1600p.

That's the only point being made. If you want to do that sort of thing, you have choices on the market for more VRAM, and that's fine. But it's not like any of these games will ever use more than 2GB at 1600p under normal usage scenarios - you have to go above and beyond with mods and OGSSAA to do so. And even in those situations, by the time you apply so much extraneous stuff for more image quality, GPU power quickly becomes an issue. Let's say a 290 runs at 200 fps in game "X" at 1600p. Well, if you use OGSSAA at 4K, now you're at 30 fps at 1600p output: 150+ fps lost to OGSSAA. GPU horsepower becomes an issue well before VRAM does, generally speaking. Know what I mean?
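A crude way to see why GPU power runs out first: assume fps scales inversely with pixel count. That assumption is optimistic (bandwidth and memory pressure usually make the real loss worse, as the 200 to 30 fps example above suggests), but it shows the direction of the hit:
[CODE]
# Crude fill-rate-style estimate: assume fps scales inversely with pixel
# count. Optimistic by design; real losses from OGSSAA are often worse.
def scaled_fps(base_fps, base_res, render_res):
    bw, bh = base_res
    rw, rh = render_res
    return base_fps * (bw * bh) / (rw * rh)

# 1600p output rendered internally at 4K, roughly what heavy downsampling does
print(round(scaled_fps(200, (2560, 1600), (3840, 2160)), 1), "fps by pixel count alone")
[/CODE]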

On the other hand, if I were using 3x 1440p panels I'd want more VRAM for sure. More than 2GB anyway.

As far as future games go, by my observation VRAM usage has increased only minimally over the past 3 years. VRAM use will no doubt go up over the next couple of years, but it's not like VRAM requirements are going to double overnight, or even in 6 months, or a year. Even if VRAM becomes an issue, you can lower settings and be just fine. I just don't think that will happen in the next year with 2GB cards, especially at 1080p.
 
I think the point most are trying to make when crying foul over the 780 Ti's VRAM is "why not?" Hell, it can't be THAT expensive, and it's not like the 780 Ti would cannibalize the Titan any (aren't Titans still superior in compute?). I think the 780 Ti should have 6GB of VRAM simply because it's a flagship consumer card.
 
So you double the cost of goods for memory (which already adds significantly to the base price of a unit facing extremely strong price pressure from a competitor) for a very marginal consumer use case, when you already have a product that fills another market niche (the Titan) and can do the job at about the same price point you'd have to sell a 6GB Ti for? GDDR5 ain't cheap at all, especially not when it's binned at 7GHz.

That would not be a smart call on NVIDIA's part, which is why they didn't make it.

There's just not enough demand. It'll be at least half a decade before 4K becomes anything close to ubiquitous: consumers don't replace monitors very often, it'll be a while before there's a critical mass of 4K content that makes consumers want to upgrade, and it'll be a while before panel yields bring prices down to the magic sub-$300 level for consumer desktop monitors.

Hell, look how long it took 1080p to become the "standard" monitor resolution, and that's with Blu-ray, HD television, and high-speed connections helping to drive sales. Keep in mind those of us using resolutions above 1080p are a minority, even here on [H], which is nothing but enthusiasts.

Now, if Maxwell parts have a 256-bit or 512-bit memory bus, you might see 4GB mass-market units this year just to keep up with AMD in the spec-sheet wars (and the 6GB they'd need for GK110 parts would cost 50% more in memory, obviously).
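The bus-width point is really about chip counts: each GDDR5 chip has a 32-bit interface, so the bus width fixes how many chips the board carries, and the per-chip density (commonly 2Gb or 4Gb at the time) sets the capacities that are natural to build. A sketch of that arithmetic, ignoring clamshell configurations that double the chip count:
[CODE]
# VRAM capacity follows from bus width: one 32-bit GDDR5 chip per channel,
# times per-chip density (2Gb or 4Gb were the common options of the era).
# Clamshell mode, which doubles the chip count, is ignored here.
def vram_options_gb(bus_width_bits, densities_gbit=(2, 4)):
    chips = bus_width_bits // 32
    return {d: chips * d / 8 for d in densities_gbit}

for bus in (256, 384, 512):
    print(f"{bus}-bit bus ({bus // 32} chips): {vram_options_gb(bus)} GB options")
# 256-bit -> 2 or 4 GB, 384-bit -> 3 or 6 GB, 512-bit -> 4 or 8 GB
[/CODE]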
 
People keep bringing up BF4 running out of VRAM and stuttering like crazy at ultra-high settings, though. But isn't BF4 full of bugs and memory leaks, and aren't there lawsuits against the game's publisher because of how badly the game is optimized? I just heard something about that today. And isn't that why Mantle keeps getting pushed back for it? LOL

Sorry, but if that's people's excuse for saying we need more VRAM rather than more GPU power, then that's a poor example. I run a 1600p monitor and still don't experience half the stuff people post on these forums about running out of VRAM. LOL
There are tons of reviews of the 780 Ti running out of VRAM on 4K monitors in games like Far Cry while the 290/290X did not; search for it. That's Far Cry we're talking about; picture what will happen when Star Citizen comes out, along with all the new games of 2014. If you own a 4K monitor you will have no choice but to lower your settings, and doing that defeats the purpose of owning a high-end card.
 