NVIDIA to launch GeForce GTX TITAN Black Edition and GeForce GTX 790

The original Titan worked at the time because it looked like the most powerful GPU by a long shot,
and like it would hold the performance crown for at least a year, so for some people, me included, the price was "somehow" justified.
But we all know how that unfolded.
Bringing out another $1000 Titan now, this late in the game and after everything NVIDIA did in the last year, is just ridiculous.
NVIDIA must be on crack if they think they will sell the new Titans even remotely close to how the original Titan sold.

People got burned once, and that's enough for most, I hope!
They think they can milk it forever...
 
It's Nvidia, what did you expect? This is just another Titan -> 780 Ti moneygrab.
How was that a money-grab, exactly? There were still a good few reasons to pick a Titan over a 780 Ti.

The Titan has double the RAM and much, MUCH faster compute performance. If you want a Quadro-like feature set, you pay Quadro-like prices.

The price premium also allowed you to enjoy that level of performance months in advance of the 780 Ti even being a rumor. Early access has always been at a premium.
 
Agreed. The 780 (non-Ti) made the Titan and Titan Black irrelevant a long time ago anyway. I view the Titan and Titan Black as glorified CUDA development cards, not gaming-centric cards. I don't like the price of those cards, but you have to consider their little niche. It isn't exactly only for gaming, although I -could- see it being useful for something nuts such as 1600p x 3 in surround.

The value card for gaming right now is the 780, especially the factory-overclocked ones. ;)
 
I made a fancy XLS of all the NV cards. There's a huge price and core-count gap between the 770 ($289 AR @ NE) and the 780 ($499 @ Amazon). They need a 1920-core card with 3GB for about $399.
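
For what it's worth, here's the rough per-core math on those street prices. The 1536 and 2304 core counts are the stock 770/780 configs; the 1920-core entry is the hypothetical gap-filler, so treat this as back-of-the-envelope only:

```python
# Price per CUDA core at the street prices quoted above.
# The 1920-core entry is a hypothetical gap-filler, not a real product.
cards = {
    "GTX 770 (1536 cores)": (1536, 289),
    "GTX 780 (2304 cores)": (2304, 499),
    "hypothetical 1920-core card": (1920, 399),
}
for name, (cores, price) in cards.items():
    print(f"{name}: ${price / cores * 1000:.0f} per 1000 cores")
# -> roughly $188, $217, and $208 per 1000 cores respectively,
#    so $399 for 1920 cores would slot neatly between the two.
```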
 
I made a fancy XLS of all the NV cards. There's a huge price and core-count gap between the 770 ($289 AR @ NE) and the 780 ($499 @ Amazon). They need a 1920-core card with 3GB for about $399.

Is there any chance you could post this information for the more curious among us?
 
The original Titan worked at the time because it looked like the most powerful GPU by a long shot,
and like it would hold the performance crown for at least a year, so for some people, me included, the price was "somehow" justified.
But we all know how that unfolded.
Bringing out another $1000 Titan now, this late in the game and after everything NVIDIA did in the last year, is just ridiculous.
NVIDIA must be on crack if they think they will sell the new Titans even remotely close to how the original Titan sold.

People got burned once, and that's enough for most, I hope!
They think they can milk it forever...

I'm afraid you're confusing how it should work with how it will work. While I agree that they aren't a good value, I would be willing to bet that they have no problem selling them. There are enough people willing to drop (what we would consider) crazy money on their computers. However, even blowing $1,000 on one video card is not a bad value for someone who uses it as his main form of entertainment.

Lots of people will blow $1,000 on a weekend of skiing. If you head to Las Vegas you can blow a lot more than that in a weekend and get nothing in return. Although there may not be a noticeable difference between a $500 video setup and a $1,000 video setup to me, that doesn't mean an aficionado doesn't notice the difference and happily pay the inflated price.

It seems there are a lot of people out there who make enough money that throwing a few thousand bucks at their primary form of entertainment is no big deal. So Intel Extreme Edition CPUs, Titans, 500GB SSDs, $300 headphones, etc. are not a huge expense. They may be sitting at home playing Battlefield 4 instead of sailing a yacht or parasailing in Brazil. And it seems like more and more there are enough people like this that specialty computer components are doing quite well.
 
I'm afraid you're confusing how it should work with how it will work. While I agree that they aren't a good value, I would be willing to bet that they have no problem selling them. There are enough people willing to drop (what we would consider) crazy money on their computers. However, even blowing $1,000 on one video card is not a bad value for someone who uses it as his main form of entertainment.

Lots of people will blow $1,000 on a weekend of skiing. If you head to Las Vegas you can blow a lot more than that in a weekend and get nothing in return. Although there may not be a noticeable difference between a $500 video setup and a $1,000 video setup to me, that doesn't mean an aficionado doesn't notice the difference and happily pay the inflated price.

It seems there are a lot of people out there who make enough money that throwing a few thousand bucks at their primary form of entertainment is no big deal. So Intel Extreme Edition CPUs, Titans, 500GB SSDs, $300 headphones, etc. are not a huge expense. They may be sitting at home playing Battlefield 4 instead of sailing a yacht or parasailing in Brazil. And it seems like more and more there are enough people like this that specialty computer components are doing quite well.


I'm not saying there aren't people who will spend that money; I did, and it cost me $2,200 for 2x Titans. But after what NVIDIA did last year, I'm just saying a lot more people will steer clear of Titans this time.
Since we know there will be a "780 Ti Ultra Gold Edition" next week with faster performance for half the price :)
 
I wouldn't so much blame Nvidia for this as thank AMD. The 780 price drops and the 780 Ti as a product likely wouldn't have happened if the 290 hadn't been so competitive.

Rolling out the Ti pretty much mandated that they roll out a newer Titan as well, or just stop production of Titans entirely, since gamers (other than a handful of "moar Geebees!" types) no longer had any real reason to buy it once it was no longer the performance champ.

And let's face it, cost-is-no-object gamers, not people building "budget" compute platforms, were the ones keeping the Titan a marketable, profitable product.
 
If AMD prices don't right themselves, nVidia is going to follow the smell of money ...
 
I wonder if the 790 would SLI with a 780 Ti?

Then I could use SLI when playing games that don't use more than 3GB of VRAM and simply disconnect the 780 Ti when I need more VRAM.
 
If Galaxy / KFA produce a 790 with multiple DisplayPort outputs at a £1K price point, then I am game.
 
I wonder if the 790 would SLI with a 780 Ti?

Then I could use SLI when playing games that don't use more than 3GB of VRAM and simply disconnect the 780 Ti when I need more VRAM.

Nope. Although Radeons can often work across different cards (e.g., a 6990 with a 6970 in TriFire), nVidia cards aren't that lenient. You can make the same card with different clocks work together, like a 670 with a 670 Superclock, but you can't SLI a 670 with a 680.
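
If you ever want to sanity-check what the driver will pair, one rough proxy is whether every board reports the same GPU name. A minimal sketch, assuming an NVIDIA driver plus the pynvml bindings (pip install nvidia-ml-py):

```python
# Rough SLI sanity check: the driver only pairs boards built around the
# same GPU, so mismatched device names are a red flag. Clock differences
# (e.g. a stock 670 next to a Superclock 670) don't change the name.
import pynvml

pynvml.nvmlInit()
names = [pynvml.nvmlDeviceGetName(pynvml.nvmlDeviceGetHandleByIndex(i))
         for i in range(pynvml.nvmlDeviceGetCount())]
pynvml.nvmlShutdown()

print(names)
print("same GPU on every board:", len(set(names)) == 1)
```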
 
I wonder if the 790 would sli with a 780 Ti?

Then I could use SLI when playing games that dont use more than 3gb vrsm and simply disconnect the 780ti when I need more vram
If the Russian listing is correct, then it's a 6GB card, and I'm assuming that, like the GTX 690, that means 3GB per GPU.
 
And let's face it, cost-is-no-object gamers, not people building "budget" compute platforms, were the ones keeping the Titan a marketable, profitable product.
Gamers tend to buy one or two cards...

People who need serious compute performance buy 10 or 20 and cluster them. 10 or 20 Titans is a HELL of a lot cheaper than a real rack of Tesla hardware. I imagine that fact made (and continues to make) the Titan marketable.
 
Damn, I was only interested because I thought it was 5GB per GPU. No way will I consider it if it's 3.

I agree COMPLETELY. 3GB per GPU is not enough for that card. That is built-in obsolescence right there. I would go with 4GB per GPU, maybe. 5 would be sweet.

Hitting the VRAM wall on an expensive SLI setup SUCKS. And with a whole new console generation and no telling how much VRAM developers will use, it's just risky. I can conceive of a scenario where, say, a next-gen Grand Theft Auto type game is ported over to PC with additional features that, with anti-aliasing generously applied, would exceed 3GB of usage even at 1080p. If I am going SLI, I want supersampling AA on ALL games, etc.
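
To put rough numbers on why AA chews through VRAM so fast, here's a back-of-the-envelope calculation for a single 4x MSAA render target at 2560x1600. The format assumptions (RGBA8 color, 32-bit depth/stencil) are illustrative only, and a real engine keeps many such targets:

```python
# Approximate memory for one 4x MSAA render target at 2560x1600.
width, height = 2560, 1600
samples = 4                  # 4x MSAA multiplies the sample buffers
color_bpp = 4                # RGBA8 color, 4 bytes per sample
depth_bpp = 4                # D24S8 depth/stencil, 4 bytes per sample

color = width * height * color_bpp * samples
depth = width * height * depth_bpp * samples
resolve = width * height * color_bpp      # resolved single-sample target

print(f"{(color + depth + resolve) / 2**20:.0f} MiB")   # ~141 MiB
```

Multiply that by however many buffers a deferred renderer keeps around and it's easy to see how AA pushes a 2-3GB card toward the wall.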
 
3GB should be more than enough for most monitor setups for quite a while. I'd probably be wary of it if I actually gamed at 4K or 3x1440p, but even at those resolutions there aren't many current games that will run into VRAM limits. You'll almost always be limited by the computing power of the GPU itself and not the VRAM, aside from certain cases (like Skyrim with 4k textures).
 
It will be interesting to see how NVIDIA configures the memory on Maxwell. They've been offering less VRAM capacity than AMD for a while now; I'm wondering if they will meet AMD at 4GB or offer more.

Not really interested in yet another GK110 variant since I went from a 680 to a 290X, but I'm hearing good things about G-Sync, so I might consider Maxwell if there's a big improvement.
 
3GB should be more than enough for most monitor setups for quite a while. I'd probably be wary of it if I actually gamed at 4K or 3x1440p, but even at those resolutions there aren't many current games that will run into VRAM limits. You'll almost always be limited by the computing power of the GPU itself and not the VRAM, aside from certain cases (like Skyrim with 4k textures).
I agree, it should be plenty for at least 2 to 3 years.
 
Oh noes, not more VRAM snobs and "OMG nVidia gimped us on our VRAM, we hit VRAM walls in all our games now because consoles have like 8 gigs of RAM..." LOL. Geez, this will never end.
 
Well, the only thing that hits 5GB of VRAM on my Titans is Gran Turismo 4 on PCSX2 at 6x native resolution. It's nice to play at 60fps compared to 2fps on my previous 680 setup.

But other than the PS2 emulator, I don't know of a single game that used more than 3GB of VRAM, at least at my native 1600p res.
 
I'm playing on my 2560x1600 monitor again and have yet to see any problems with my 660 Ti hitting VRAM walls at 2 gigs. The only thing holding me back is GPU horsepower. And this is even in Battlefield 4. Believe it or not, I can play at ultra settings, just with no AA and resolution scale at 100%. It looks wonderful and plays smooth as butter. I tried turning up the settings and PQ didn't look any better, IMO.
 
I'm playing on my 2560x1600 monitor again and have yet to see any problems with my 660 Ti hitting VRAM walls at 2 gigs. The only thing holding me back is GPU horsepower. And this is even in Battlefield 4. Believe it or not, I can play at ultra settings, just with no AA and resolution scale at 100%. It looks wonderful and plays smooth as butter. I tried turning up the settings and PQ didn't look any better, IMO.

So because you don't use AA, that means your 660 Ti won't hit VRAM walls?

Turn up the eye candy and watch it fail.
 
Yeah, because PQ is sooooo much better with AA at 100x, so I need more VRAM because of this, right? Try playing at 1600p and tell me you need more AA in Battlefield 4. There is almost NO difference in PQ at all.
 
Try it sometime... There is, but you definitely can't see it with an underpowered GPU because of frame lag. Running 2560 myself.
 
So what if I told you I have two 660 Tis? I'll plug the other one in tonight and let you know what I think.
 
You'll still be VRAM limited. BF4 uses 2.6GB for me @ 2560x1440 on Ultra with 4x MSAA. My point was, 2GB is not enough.
 
You'll still be VRAM limited. BF4 uses 2.6GB for me @ 2560x1440 on Ultra with 4x MSAA. My point was, 2GB is not enough.

Are you using resolution scaling, i.e., downsampling/OGSSAA? 2GB cards don't seem to really have an issue with BF4 even at 1600p unless you use OGSSAA, which is what resolution scale is: a super-expensive form of SSAA.

I haven't seen much in the way of VRAM issues unless you go nuts with AA, OGSSAA, or modding. Just my opinion. One may argue that VRAM requirements will increase due to the next-gen consoles, but I'm not sure I buy into that; the next-gen consoles are fairly weak machines, and BF4 on those systems is a joke compared to a high-end PC. Mainly I was curious whether you're using resolution scaling, which would 100% explain the VRAM use in your case, since it's essentially OGSSAA and uses an assload of VRAM.

I have also seen cases of VRAM "caching" in various games that isn't indicative of real-world VRAM use; COD: Ghosts is a good example. That game tells me I'm right at 3GB on 780s, and 2GB on a 770 in another system, yet it never hitches on either machine and uses the Q3 engine. I suspect it isn't using remotely close to the reported amount of VRAM, and that it is caching.

I don't know. I'd like to hear more opinions on this. Has anyone else seen absurdly high VRAM use in games that obviously can't use the reported amount? (COD: Ghosts did this for me...)
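
For anyone curious just how fast resolution scale blows up the memory budget, the pixel count grows with the square of the scale factor. A quick illustration at 2560x1440, counting only a single RGBA8 color target:

```python
# Resolution scale renders at scale x native, then downsamples (OGSSAA),
# so render-target memory grows with the square of the scale factor.
base_w, base_h = 2560, 1440
for scale in (1.0, 1.25, 1.5, 2.0):
    w, h = int(base_w * scale), int(base_h * scale)
    mib = w * h * 4 / 2**20          # one RGBA8 color target
    print(f"{scale:.2f}x scale: {w}x{h}, {mib:5.1f} MiB per target")
```

At 200% scale every render target costs four times what it does at native, which is why cranking resolution scale is the quickest way to blow past 2GB.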
 
Yeah, because PQ is sooooo much better with AA at 100x, so I need more VRAM because of this, right? Try playing at 1600p and tell me you need more AA in Battlefield 4. There is almost NO difference in PQ at all.

Try 4x MSAA, it makes a big difference, "hd"gamer.
 
The ASUS 760 Mars is a dual-GPU card with 2x2GB = 4GB of VRAM. Where'd you get 2GB from?

lol, that's not how it works. You can't stack VRAM across multiple GPUs; it's mirrored on both cards. My tri-SLI 780s don't have 9GB of usable memory, only 3GB.
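
You can see this for yourself: each GPU reports its own physical pool, and AFR copies the same assets into every one of them, so the totals never add up across cards. A minimal sketch, assuming the pynvml bindings (pip install nvidia-ml-py):

```python
# Each GPU has its own physical VRAM pool; SLI/AFR mirrors assets into
# every pool, so three 3GB cards still give you ~3GB of usable capacity.
import pynvml

pynvml.nvmlInit()
for i in range(pynvml.nvmlDeviceGetCount()):
    handle = pynvml.nvmlDeviceGetHandleByIndex(i)
    mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
    print(f"GPU {i}: {mem.total / 2**20:.0f} MiB physical, "
          f"{mem.used / 2**20:.0f} MiB in use")
pynvml.nvmlShutdown()
```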
 
Try 4x MSAA, it makes a big difference, "hd"gamer.

Sorry, but NO it does not!! And going by those charts, the 690 is hanging with all those high-VRAM cards. You guys' silly excuse for why you need more VRAM is just WRONG!!!
 