are titans done?

Status
Not open for further replies.

Seems like they are always sold out.

Are they still that popular, or is their run done now that the 780s are out?
 
Titans are cold as ice since the launch of the GTX 780 and are manufactured in very little quantity nowadays.
While I recommend the 780 over the Titan, the latter can be found for $800-$850 in the secondhand market.
 
I suspect games will pass the 3gb vram mark soon (if you're at 1440p+), so a Titan isn't a bad idea if you have the money to COMFORTABLY spend.
 
Crysis 3 at 8xMSAA running at 2560x1600 does not require more than 3GB.
Interestingly, one Titan is not powerful enough at these settings and SLI is required at a minimum.

In other words, a single Titan is unlikely to be powerful enough to push games that will require more than 3GB. As a matter of fact, >3GB of memory really only becomes useful under extreme setups such as tri-SLI and quad-SLI.

Except under extreme scenarios, the Titan has become pointless since the launch of the 780.
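The Crysis 3 figure above is plausible with some back-of-envelope math. Here is a rough sketch (Python; the function name and the 4-bytes-per-sample assumption for both color and depth/stencil are mine, not from any benchmark — textures and geometry come on top and usually dominate):

```python
# Rough VRAM estimate for the multisampled render targets alone.
# Assumes 4 bytes/sample for color and 4 for depth/stencil; real games
# add textures, geometry, and driver overhead on top of this.
def msaa_targets_mb(width, height, samples, bytes_per_sample=4):
    color = width * height * samples * bytes_per_sample
    depth = width * height * samples * bytes_per_sample
    return (color + depth) / 1024**2

# 2560x1600 with 8xMSAA: the multisampled buffers alone are ~250 MB,
# which leaves most of a 3GB card free for textures.
print(f"{msaa_targets_mb(2560, 1600, 8):.0f} MB")
```

So even heavy MSAA at 1600p doesn't come close to filling 3GB by itself, which is consistent with the observation above.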
 

I find this hard to believe as I witnessed my friend's 690 choke out at 1600p due to vram, and I've seen my own Dragon Age 2 exceed 2gb at moments at 1440p.
 
Yes, but exceeding 2gb does not mean it will exceed 3gb. I've been running 5760x1200 for some time now, and most games seem to peak around 2.5 gb.
 
But if games right now are peaking at 2.5GB, how much longer do you think it will be before they pass that barrier? ESPECIALLY with the new consoles coming out in 6 months.

Titan > 780, always.
 

Only if price is not factored in, and for most of us, it is. Mark my words, bookmark this post if you need to: the new consoles will have little to no effect on the amount of VRAM a GPU needs to run PC games. The vast, vast majority of PC gamers are still running 1GB cards, or less.
 
There's a difference here between your video card and cpu having big pools of memory and Heterogeneous Unified Memory Access, "...Which allows both the CPU and GPU to share the same memory pool instead of having to copy data from one before the other can use it. " --

http://www.theverge.com/2013/6/21/4452488/amd-sparks-x86-transition-for-next-gen-game-consoles

I'm not the most tech-savvy person there is, but I don't think we should worry about how much video card memory our current cards have when this technology seeks to remove the differentiation between system memory and video card memory.

Going to read this article when I have a chance: http://arstechnica.com/information-...orm-memory-access-coming-this-year-in-kaveri/
 
Titan > 780, always.

For a less-than-10% increase in performance, I would say that most people find the $400 price difference hard to swallow.

I run Titan SLI at 2560x1600 and I have not yet seen a game that requires more than 2.5GB with everything maxed out.
The sweet spot is currently 3GB of VRAM unless you go tri/quad SLI at extreme resolutions.
 
I think people who purchased a Titan knew what they were getting into.
Well, I hope so, unless they wipe their ass with $1k.
 
Only thing the Titan succeeded in doing is making the 780 price look good.

But if games right now are peaking at 2.5GB, how much longer do you think it will be before they pass that barrier? ESPECIALLY with the new consoles coming out in 6 months.

Titan > 780, always.

Developers are not going to release games that require a $1k graphics card at bare minimum to max out. We'll see much cheaper offerings with 6GB within 6 months...
 
I got lucky and found my Reference Titan here for $775. I am keeping an eye out for another at the same price or lower, but that probably won't be till new architecture. At that point maybe $400-$700? Hard to say though since I can't tell the future.

Amazon or eBay would be the best bet atm.
 
But if games right now are peaking at 2.5GB, how much longer do you think it will be before they pass that barrier? ESPECIALLY with the new consoles coming out in 6 months.

Titan > 780, always.

Not really, you can simply turn down AA, since you'll lack the horsepower framerate-wise to run high AA anyway on the future titles that would exceed the 3GB barrier. Besides, for now you could buy a $600-650 780 and still pay for the bulk of a next-gen card with the change compared to paying $1k for a Titan, enjoying a 0-5% performance difference today OC-to-OC and having cash saved for your next card:

http://gamegpu.ru/test-video-cards/geforce-gtx-770-gtx-780-test-gpu.html
GTX 680 @ (Core: 1150 MHz Memory: 7.0 GHz)
GTX 770 @ (Core: 1200~1300 MHz Memory: 7.5 GHz)
GTX 780 @ (Core: 1150 MHz Memory: 7.0 GHz)
TITAN @ (Core: 1080~1180 MHz Memory: 6.7 GHz)
HD 7970 GE @ (Core: 1160 MHz Memory: 7.16 GHz)

Pretty much every test looks like this:
[benchmark chart: 638w.jpg]
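Those overclocked memory speeds also make the bandwidth story concrete. A quick sketch (Python), using the stock bus widths for these cards (384-bit for the GK110 and Tahiti parts, 256-bit for GK104 — those widths are my addition, not from the table above):

```python
# Effective memory bandwidth implied by the overclocked memory speeds
# listed above: bandwidth (GB/s) = effective clock (GHz) * bus width / 8.
def bandwidth_gbs(effective_ghz, bus_bits):
    return effective_ghz * bus_bits / 8

# At these overclocks the 780 actually edges out the Titan on bandwidth,
# which fits the 0-5% OC-to-OC performance gap claimed above.
print(f"GTX 780 @ 7.0 GHz, 384-bit: {bandwidth_gbs(7.0, 384):.0f} GB/s")
print(f"TITAN   @ 6.7 GHz, 384-bit: {bandwidth_gbs(6.7, 384):.0f} GB/s")
```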
 
I got lucky and found my Reference Titan here for $775. I am keeping an eye out for another at the same price or lower, but that probably won't be till new architecture.

I am happy with mine, which I paid $750 each for, but I would not have paid a penny more.
 
I have 2x Titans in my box and I only game at 2560x1440 - I won't be swapping them out for years. If you plan on keeping them for a while, the extra VRAM can only help over the 780.

If you planned on swapping your Titans or 780s for something newer within a year, then yeah, the Titan was pointless for you in that case.

Agreed.
 
Honestly, there are very few scenarios where the Titan is a good value. Even used ones are still being sold at $900+ when you can get one of the OC'd 780s that beats them right out of the box at $650. You would need an extreme setup to need anything near 6GB of VRAM and the 780 matches or beats the Titan in all other aspects. :)
 
If you plan on keeping them for a while, the extra VRAM can only help over the 780

The reality is that your Titans might not be powerful enough to push the extra VRAM by the time games finally use it.
My Titan SLI is currently struggling with Crysis 3 and Metro LL at max settings, and both of these games use less than 2.5GB of VRAM.
 
Also, Titans were not meant to be a full production card. They were designed to be the cream of the crop; people who have the cash lying around to pay for them will (and did). The 780s came out because Nvidia saw the sweet spot with them.
 
Titans are also not just for gaming, they are massive compute cards as well. :)

I don't understand why this is always brought up in Titan threads as if it justifies the price. The 7970 is a monster of a compute card as well; price/perf, it blows the Titan away in many compute scenarios (and no, I'm not talking about Bitcoin). The Titan has its use cases too, but those who need it probably already have Quadro cards.
 
At 7680x1600, many games go over 3gb of vram.

If you are using a single monitor, there is no reason whatsoever to buy a Titan. Even at 5760x1200, you don't need Titans.
 

I agree.
Furthermore, only a tri-SLI setup or higher can provide the performance required to push games at 7680x1600, which demands more than 3GB of VRAM.
 
Also, Titans were not meant to be a full production card. They were designed to be the cream of the crop; people who have the cash lying around to pay for them will (and did). The 780s came out because Nvidia saw the sweet spot with them.

Is everyone forgetting that Titan is designed for surround? Both the 780 and Titan have a place in Nvidia's product lineup - the average user who wants the best (since both are similar in performance) will likely get the 780, while someone who opts for a surround setup will likely choose the Titan. Nvidia designed the Titan for those who want crazy surround setups; that is why it has an excessive amount of VRAM - it is largely wasted on single-screen systems, but for triple screen it will provide a better experience.

Again, both the Titan and 780 have their place. For some reason people enter these arguments completely ignoring the benefits and uses of VRAM for surround setups. The Titan can and will provide a better experience with higher visual quality (AA levels, textures, etc) than the 780 - but for someone who is using a single screen resolution, the 780 is probably a better choice. Even with 3GB of VRAM, there are situations where the best visual quality settings are not possible in the highest surround resolutions, that is what the Titan's 6GB of VRAM is for. Better visual quality, higher AA, more mods, and better quality textures while in surround resolutions.
 
"If you plan on keeping them for a while, the extra VRAM can only help over the 780." - me

So it can hurt? I'm done with your semantics argument. You have observed data from the future of the two cards versus each other? It doesn't matter. I'm not in the mood to argue with someone who uses the words reality, might, and observed data regarding a benchmark from 2014.

So it can hurt? At $400 more apiece, I think we can say it actually hurts the wallet pretty badly. :rolleyes:

I think it is reasonable to infer from actual games that Titan SLI will not be powerful enough to push future games using more than 3GB of VRAM. There may be exceptions, but for $400 that seems like an irrational bet to make.
 
I think no one has a clue how much VRAM games will be able to use once they start taking advantage of the 8GB memory pool on the new consoles and are then ported to the PC.

Give the developers all that RAM, and they'll use it- take Dragon Age II for example with the HQ textures installed. Not a game that requires a lot of rendering power unless you go stupid with the AA settings, but it can use a lot of VRAM, and that game is old.

Those consoles are rendering to 1080p with 7850's- don't expect them to push the geometry or pixel shaders through the roof. Expect them to push the textures to quickly eat up that slack VRAM.

If you're not running 6GB cards now, you're going to want to upgrade down the road just to turn the details up, especially when we start migrating to 4k displays on the desktop (which will likely happen faster than any of us think).
 
Is this how the internet works? Take someone else's statement, then argue a point they were not trying to make? I never mentioned cost.

When I said, "If you plan on keeping them for a while, the extra VRAM can only help over the 780," I didn't see the concept of cost come up at all. Yet another fool on the internet to add to the ignore list.

[screenshot: EXsZMJf.png]


Much better.

Just because YOU neglected to mention cost doesn't mean it's not a factor, even if you want to ignore it - I and others did mention it. The picture only hammers this home... :rolleyes: .
 
You know, if you ignore everyone, will there be any [H] left for you to read?

I don't agree with his 'tone' either, as when talking about things like the Titan you're really disregarding cost- a pair of 4GB GTX670's would do quite well, as would the 6GB HD7970's on water assuming AMD effectively un-borks the Crossfire drivers. For the price of the Titans, you could put the others under water, and upgrade the CPU in the process.

But blocking people and then making a spectacle of it? It's not that I don't understand, it's just that I prefer a little 'reasonable' dissent in my discussions. It's hard to hammer out what's right when you don't have people like RadXge and GoldenTiger to provide a beacon for everything that's wrong :cool:.
 
Noise has been my fascination with them as well, though the GTX680 did a pretty good job. The GTX770's with 4GB and the Titan cooler are currently the most interesting card- it's what I'd buy two of now, if I had to choose.

But knowing that I'll want to jump to 4k from 1600p as soon as that becomes realistic, I think I'm better off staying put for now :).

I just hope that Nvidia continues to innovate in the cooler and multi-GPU space; these are two areas that I'm really interested in. And Adobe putting some real GPU acceleration into Lightroom would be fricken' awesome :D.
 

Yes, but if developers actually use up the maximum allotment of memory on the new consoles, then the 6GB of VRAM the Titan has isn't going to cut it at 4K, or even 2560x1600. My point being: unless you need the VRAM now (i.e. for a 3x30" surround setup), there's no point getting a Titan to "future-proof" your setup.

And as far as gaming goes, I expect 4K penetration to be held back by GPU tech until there is a big breakthrough. The incremental upgrades we are getting every year aren't going to be enough once developers really start pushing the new console hardware in a year or so. It's going to remain extremely niche, just like 2560x1600 did for many years, IMO.
 
I'll liken this to my experience using a 1GB HD4870 while waiting for my first 2GB HD6950, playing BF:BC2 at 1600p. I adjusted the settings to the left, and everything was fine. The system still had a 4.6GHz 2500K with 16GB of RAM backing it up, so the video card was the limiting factor, of course.

Now, the reality is that these games aren't going to be using this extra memory for stuff that will tax the shader processors- there isn't enough power there for that, these things are slower than mid-range desktops today. It's much more likely that they'll use the memory for stuff that doesn't require extra processing, like textures, as I mentioned above.

Note that a frame buffer for 4K is only about twice what we need for 1600p, and I was able to run a modern (at the time) game just fine with 1GB of VRAM. Extrapolate that a little, and 3GB will be plenty if you don't go batty with the settings.
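The scaling claim checks out arithmetically. A quick sketch (Python, comparing raw pixel counts only — texture budgets, which the surrounding posts call the real variable, are a separate matter):

```python
# Pixel-count ratio between 4K UHD (3840x2160) and 2560x1600.
# The frame buffers roughly double, so buffers that fit in 1GB at 1600p
# fit comfortably in 3GB at 4K; textures are what actually grows.
ratio = (3840 * 2160) / (2560 * 1600)
print(f"4K has {ratio:.2f}x the pixels of 2560x1600")
```

The ratio comes out just over 2x, matching the "only twice" estimate above.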

But more memory will likely be put to immediate use. Meaning, if you have one of the cards with more than the standard 2GB of memory, you'll be able to turn on more of the detail settings, but if you only have 2GB, you'll still be fine. Developers will have to make these games scalable all the way down to integrated video (as envisioned in a year or two) if they want to get their money's worth out of the development time.
 