GeForce GTX Titan - First Pics & True Specs

Are you freakin' kidding me? Aren't you playing on a 27" 1440p monitor, per your sig? If so, you'd just be wasting your money on more VRAM. I can guarantee you that. I play at 1600p and I'm fine with 2GB of VRAM. I've never exceeded my VRAM, and even if I did, I couldn't tell.

I have exceeded 2GB of VRAM at 1440p. But yeah, you certainly would not need more than 4GB, even for Surround. Since 3GB might be cutting it close for really demanding Surround setups and the next step up was 6GB, I'm glad NVIDIA went that route. I have exceeded 3GB before on my multi-screen setups with my 7970s.
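(For anyone who wants to measure this instead of eyeballing it: a minimal sketch using NVIDIA's NVML Python bindings. It assumes the pynvml package is installed, and it only reports total VRAM in use per GPU, not per-game usage.)

```python
# Minimal VRAM-usage check via NVML (assumes the pynvml package is installed).
from pynvml import (nvmlInit, nvmlShutdown, nvmlDeviceGetCount,
                    nvmlDeviceGetHandleByIndex, nvmlDeviceGetMemoryInfo)

nvmlInit()
try:
    for i in range(nvmlDeviceGetCount()):
        handle = nvmlDeviceGetHandleByIndex(i)
        mem = nvmlDeviceGetMemoryInfo(handle)  # .total / .used / .free are in bytes
        print(f"GPU {i}: {mem.used / 2**20:.0f} MiB used of {mem.total / 2**20:.0f} MiB")
finally:
    nvmlShutdown()
```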
 
[Attached image: leaked GTX Titan card/PCB photo]

Having the VRMs so close to the memory and GPU is not very good for cooling. Some distance between them, with a proper heat sink/block, would help with cooling when overclocking.

Based on this design, it doesn't look like it will be a strong overclocker.

This PCB looks kind of low-end for a card that is rumored to cost as much as $900. I guess the magic lies in the GPU, but a nice, well-designed PCB would benefit it. Based on the PCB and the rumored performance, $500 seems like a fair price, but I would wait until the price is around $400.
 
I have a source at an online retailer who says that as of Feb 15 they are still awaiting their initial allotted stock of the Titan. They are on the way.
 
Can someone host these pics on a random server that is not blocked, for those of us on 24-hour shifts?
 
Having the VRMs so close to the memory and GPU is not very good for cooling. Some distance between them, with a proper heat sink/block, would help with cooling when overclocking.

Based on this design, it doesn't look like it will be a strong overclocker.

This PCB looks kind of low-end for a card that is rumored to cost as much as $900. I guess the magic lies in the GPU, but a nice, well-designed PCB would benefit it. Based on the PCB and the rumored performance, $500 seems like a fair price, but I would wait until the price is around $400.

I expect this card to run hot and loud, indeed.
 
Having the VRMs so close to the memory and GPU is not very good for cooling. Some distance between them, with a proper heat sink/block, would help with cooling when overclocking.

Based on this design, it doesn't look like it will be a strong overclocker.

This PCB looks kind of low-end for a card that is rumored to cost as much as $900. I guess the magic lies in the GPU, but a nice, well-designed PCB would benefit it. Based on the PCB and the rumored performance, $500 seems like a fair price, but I would wait until the price is around $400.

Yeah... wishful thinking, man. They aren't going to sell a card that's faster than a 680 for the same price. I don't think the VRM placement is much of an issue; they will be the first components to receive a fresh dose of ambient air passing over them. The engineers knew what they were doing.

-------------

This card looks nothing like the 690: different type of fan, different fan placement, different heatsink design. The only thing they have in common is the colors. I'd also like a source for these 40% performance figures. Seems like guesswork (just like all the other figures in the past have been).
 
There will be no non-reference Titans. Only 10,000 are being made, and like the 690, it is reference-only.
Ah, f that then. My 680 is running at 1400MHz, so really that makes the Titan maybe only 20% faster. Definitely not worth upgrading for me! I'd get more performance out of getting another Lightning.
 
This card looks nothing like the 690: different type of fan, different fan placement, different heatsink design. The only thing they have in common is the colors. I'd also like a source for these 40% performance figures. Seems like guesswork (just like all the other figures in the past have been).

Agreed, it is closer to the 680. And to answer your other question about the 40-45% in the other thread: that is just going off the math of the posted specs.
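For anyone curious, here's a rough sketch of that spec math. Every number is from the rumors in this thread (2688 CUDA cores at 837MHz for the Titan, the 680's 1536 cores at 1006MHz), so treat it as an estimate, not fact; it lands right around the mid-40s percent being quoted.

```python
# Back-of-the-envelope single-precision throughput from the rumored specs.
# Kepler cores do 2 SP FLOPs (one FMA) per clock.
def peak_sp_gflops(cuda_cores, clock_mhz):
    return 2 * cuda_cores * clock_mhz / 1000.0

titan  = peak_sp_gflops(2688, 837)    # rumored Titan: 2688 cores @ ~837 MHz base
gtx680 = peak_sp_gflops(1536, 1006)   # GTX 680: 1536 cores @ 1006 MHz base

print(f"Titan   ~{titan:.0f} GFLOPS")                          # ~4500
print(f"GTX 680 ~{gtx680:.0f} GFLOPS")                         # ~3090
print(f"Theoretical gap ~{(titan / gtx680 - 1) * 100:.0f}%")   # ~46%
```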
 
If this card is priced at $899 and is only 40-45% faster than a GTX 680, wouldn't people just wait until the end of the year for the GTX 7xx series? It would probably be a pretty similar performance boost for a lot less cost. I'm sitting on a GTX 660 Ti right now and would rather spend $280 to SLI than sell it and get an $899 graphics card that will do the same thing.

nVidia isn't going to release a card that is only 40-50% faster than a GTX 680 and then price it at $899, because it won't sell and it will only get them negative attention in the media, which is the last thing nVidia wants 48 hours prior to the announcement of the PS4.
 
So guesswork, like all the other (higher) estimates have been as well.
 
So guesswork, like all the other (higher) estimates have been as well.

It's all guesswork until the official launch, but you have to admit it's a lot more realistic. And the source is better. The initial clocks were all from those early retailers that also said it was 512-bit. :D
 
the engineers knew what they were doing.

To me it looks like they wasted some space on the PCB, and squeezing the VRMs in like that is going to affect the heat in that whole area. If they are going to use a heat sink with direct contact to the memory, VRMs, and GPU, the whole heat sink is going to sponge up heat and overheat quickly.

To me it looks like they designed this card not to overclock well, and that may have been a goal of the engineers. Hopefully NVIDIA will give the aftermarket PCB builders some freedom to design a PCB for the Titan that offers better cooling for overclocking.
 
It's all guesswork until the official launch, but you have to admit it's a lot more realistic. And the source is better. The initial clocks were all from those early retailers that also said it was 512-bit. :D

But you created this thread for facts... :confused:
 
But you created this thread for facts... :confused:

The facts are correct as far as the math based on those clocks goes. And those clocks have been officially uploaded to a benchmark site. I think we are 99 percent safe, my friend. I'll trust that before I trust unknown Australian retailers. :)

The only thing that could be wrong is the boost clocks, as people are saying boost isn't working right. Which could be the case, because the listed boost clocks don't seem enough higher to make much of a difference.
 
Well, I don't know how loud these cards will be; that depends on the cooling solution. But I am guessing (logically, BTW) that they will run hot. It's a huge and complex chip that uses 235W of power even at the low clock of 732MHz. Of course they are going to generate a lot of heat.
 
The facts are correct as far as the math based on those clocks goes. And those clocks have been officially uploaded to a benchmark site. I think we are 99 percent safe, my friend. I'll trust that before I trust unknown Australian retailers. :)

The only thing that could be wrong is the boost clocks, as people are saying boost isn't working right. Which could be the case, because the listed boost clocks don't seem enough higher to make much of a difference.

So, not based on facts is what you're saying... It's cool, I don't mind guesswork, but let's call it what it is: guesswork, not fact.
 
Well, I don't know how loud these cards will be; that depends on the cooling solution. But I am guessing (logically, BTW) that they will run hot. It's a huge and complex chip that uses 235W of power even at the low clock of 732MHz. Of course they are going to generate a lot of heat.

I just know that with my reference 680 SuperClock, I can't push the fan past 60 percent without it driving me nuts! And the cooling solution doesn't look much different here.
 
It's 68% faster in texture fill and 68% faster in SP; that gives you a good idea of where it will be compared to the 680 at stock clocks.
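Just to show where numbers like that come from, and why the quoted percentages bounce around: a quick sketch assuming the rumored 14-SMX GK110 layout (224 texture units) against the 680's 128, with the clocks from this thread. Compare at the rumored 837MHz and you get mid-40s percent; compare clock-for-clock and you get mid-70s. The real answer depends entirely on which clocks you believe.

```python
# Texture fill rate = texture units * clock. Unit counts and clocks below are
# the rumored figures from this thread, not confirmed specs.
def texture_fill_gtexels(tmus, clock_mhz):
    return tmus * clock_mhz / 1000.0

gtx680     = texture_fill_gtexels(128, 1006)  # stock GTX 680
titan_base = texture_fill_gtexels(224, 837)   # Titan at the rumored 837 MHz base
titan_eq   = texture_fill_gtexels(224, 1006)  # hypothetical: Titan clock-for-clock with a 680

for name, rate in [("Titan @ 837 MHz", titan_base), ("Titan @ 680 clocks", titan_eq)]:
    print(f"{name}: ~{rate:.0f} GTexel/s, {(rate / gtx680 - 1) * 100:.0f}% over a stock 680")
```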
 
Well, the source is here:

http://videocardz.com/39618/nvidia-geforce-gtx-titan-pictured

I guess people who have managed to actually get real pictures of the card might know the specs too.

And so far, for me, these have been the most believable specs that I have seen.

Yes, and these guys are all getting the info from that uploaded benchmark and probably the sources of the leaked photos.

The early "leaks" were just retailers making up what they wanted, based mostly on 690 specs.
 
As for the boost clocks not working, Nvidia has released revised drivers today to address that. That article seems to be on the up and up.
 
Now the only thing left is to wait and see what the actual performance is!! How has Nvidia changed the chip to turn massive compute power into gaming power? All the specs seem to be exactly the same as the K20X. I was expecting something to be disabled to allow higher clock speeds, but that doesn't seem to be the case. Double precision has been reduced, but games don't use that anyway.

I know people have done the math and come up with 40% faster than the 680 based on the 837MHz core clock. This seems right, but the gap might increase with resolution. It should also be a beast at synthetic benchmarks because of its strong compute abilities, which is maybe why we have seen it do really well in some of the more believable leaked 3DMark scores.
 
If I were nVidia, I'd price this card at $699. That number, in my mind, makes sense for a 40-45% increase over a 680. At $899 or higher, I just don't see the justification. But the problem, from what I can tell, is that it is going to be limited, so a high price doesn't matter, unless people are wise and don't buy it. At the same time, if they priced it "reasonably," they'd be unable to meet the demand. You can't win.
 
If I were nVidia, I'd price this card at $699. That number, in my mind, makes sense for a 40-45% increase over a 680. At $899 or higher, I just don't see the justification. But the problem, from what I can tell, is that it is going to be limited, so a high price doesn't matter, unless people are wise and don't buy it. At the same time, if they priced it "reasonably," they'd be unable to meet the demand. You can't win.

Except we don't know if it's only going to be 40% faster. All you and I are doing is estimating the performance from the clock speeds and working from there. The reality might be far different.

Also, because there are only going to be 10,000 of these, and because a lot of people interested in compute applications are also mad keen to get their hands on one, I think even if it is only 40% faster they will still charge $900 for it.
 
Where did this 10,000 number come from? Sounds like Charlie is back at his old stuff again and people are believing it.
 
Where did this 10,000 number come from? Sounds like Charlie is back at his old stuff again and people are believing it.

Well, I never believed the 10,000 number, but I do believe they will be limited.
 
Well, I never believed the 10,000 number, but I do believe they will be limited.

The 10,000 number being thrown around is, I believe, the number nVidia has ready to go at launch, not a number they've quoted as the total that will be produced.
 
Where did this 10,000 number come from? Sounds like Charlie is back at his old stuff again and people are believing it.

It might be less than that. The cards are being built from binned parts from the K20X and K20 Tesla cards, so there won't/can't be that many of them.
 
It might be less than that. The cards are being built from binned parts from the K20X and K20 Tesla cards, so there won't/can't be that many of them.

Not to mention that Nvidia must know these are not going to sell well even among the hardcore enthusiast market... do you think that if Nvidia believed they could sell 75,000 cards, they would limit it to 10,000? ...No.
 
Not to mention that Nvidia must know these are not going to sell well even among the hardcore enthusiast market... do you think that if Nvidia believed they could sell 75,000 cards, they would limit it to 10,000? ...No.


It's all marketing. They build the highest-profit-yielding card and make just enough to meet the anticipated number of buyers.

If they know they can sell 10,000 of these cards for $1,000 and make a killing in profit, then that is the number of cards they will build.

To me, this card is worth no more than $500, and only if it performs 25% better than an equally clocked HD 7970.
 
Not to mention that Nvidia must know these are not going to sell well even among the hardcore enthusiast market... do you think that if Nvidia believed they could sell 75,000 cards, they would limit it to 10,000? ...No.

LOL, you are still applying normal logic to this card. These will sell out fast. They aren't just of interest to gamers; people who do CUDA work will be interested in this card as well, very interested!! Especially if they can find a hack to unlock all the DP power. Even if they don't, it's still going to be a beast at compute. A $3,000 workstation card for $900!!

You bet this will sell out.
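To put some rough numbers on the compute angle: a sketch assuming the rumored K20X-like configuration (2688 cores at ~837MHz) and GK110's 1:3 DP-to-SP ratio if the full double-precision rate is exposed, versus the 1/24 rate consumer Kepler cards like the 680 are limited to. Whether the Titan actually ships with full-rate DP is exactly the open question here.

```python
# Rough peak throughput under two double-precision scenarios (rumored specs only).
def peak_gflops(cores, clock_mhz, flops_per_core_per_clock):
    return cores * clock_mhz * flops_per_core_per_clock / 1000.0

clock_mhz = 837                              # rumored Titan base clock
sp        = peak_gflops(2688, clock_mhz, 2)  # single precision: 2 FLOPs (FMA) per core per clock
dp_full   = sp / 3                           # if GK110's full 1:3 DP rate is exposed
dp_capped = sp / 24                          # if capped like consumer Kepler (e.g. GTX 680)

print(f"SP:              ~{sp / 1000:.1f} TFLOPS")        # ~4.5
print(f"DP (full rate):  ~{dp_full / 1000:.2f} TFLOPS")   # ~1.50
print(f"DP (1/24 cap):   ~{dp_capped / 1000:.2f} TFLOPS") # ~0.19
```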
 
Did Kyle and company get a review card? A lot of review sites apparently didn't get one.
 