NVIDIA Introduces The GeForce GTX TITAN Z

Why couldn't they have made a consumer-level GTX 790 instead of another compute-oriented card? Whatever, I suppose. I personally would have found a GTX 790 to be a fairly interesting product if priced right. Titan Z? Another compute card? Uhhhhh. This card is going to be great for CUDA development, yet it's priced out of the range of nearly all PC gamers. I guess I wanted a PC gaming-oriented card, not another CUDA-oriented card.

I guess NV knows what they're doing, but this is not what I expected. Not what I was hoping for. 'Cause I'm a PC gamer, like I suspect most here are. Eh. Oh well.
 
You can only use two of these in SLI because they are dual-GPU. One card and you already have two Titans in SLI.

On that note: I will never SLI or Xfire. Not worth the trouble, and I am a stickler for smoothness.
 
Curious, when's the last time you tried SLI? It has improved tremendously starting with Kepler. I bought 680 SLI near launch expecting issues, after using 7970 CF. I won't get into the 7970 CF stuff with Eyefinity, but much to my surprise 680 SLI was very good in terms of experience. Not perfect, but really damn good. I was kinda leery like yourself prior to trying it, but I was surprised at how good it was. Certainly an improvement over the GTX 200-era type of SLI experience, which was iffy at best.

ANYWAY. I wish they had made a consumer-level card instead of the compute-oriented Titan Z. Or why not make both or something? I mean, they'd still have different segments, since the Titan Z would be significantly better for CUDA dev. What do I know; clearly NVIDIA is doing something different.
 
Two things. One, the Titan is certified by Autodesk for CAD and DCC as a workstation card.

Two, the current Titans can draw 75 watts from the PCIe slot, 75 watts from the six-pin, and 150 watts from the eight-pin, making a single-chip Titan a 300-watt card. How much of that last 75 watts it actually needs, we don't know, but if they could have made it two six-pins, they would have. So the original Titan must use between 225 and 300 watts.

The new card is going to need three 8-pins, or three 8-pins plus a six-pin. That puts it in the range of double what one card with half the GPUs and half the RAM uses; it needs 450 to 600 watts available. I'm guessing it will be three plugs just to meet 525 watts, or four plugs to meet 600 watts, though they could make that last plug an 8-pin to give the card more breathing room. But four 8-pins is more than most power supplies can provide to the whole system. Not that we would try that kind of PSU on a rig like this, but if they require four 8-pins it will eliminate a lot of crappy PSUs.
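
Just to put rough numbers on that reasoning, here's a quick back-of-the-envelope sketch. It assumes the standard per-connector limits (75 W from the slot, 75 W per six-pin, 150 W per eight-pin); the Titan Z connector counts are this thread's guesses, not confirmed specs.

```python
# Back-of-the-envelope PCIe power-budget math for the layouts speculated above.
# Per-connector limits are the standard PCIe values; the Titan Z connector
# counts here are guesses from this thread, not confirmed specs.

SLOT_W = 75        # PCIe x16 slot
SIX_PIN_W = 75     # 6-pin PEG connector
EIGHT_PIN_W = 150  # 8-pin PEG connector

def board_power_ceiling(six_pins: int, eight_pins: int) -> int:
    """Maximum in-spec board power for a given connector layout."""
    return SLOT_W + six_pins * SIX_PIN_W + eight_pins * EIGHT_PIN_W

layouts = {
    "original Titan (1x 6-pin + 1x 8-pin)": (1, 1),
    "3x 8-pin":                             (0, 3),
    "3x 8-pin + 1x 6-pin":                  (1, 3),
    "4x 8-pin":                             (0, 4),
}

for name, (six, eight) in layouts.items():
    print(f"{name}: up to {board_power_ceiling(six, eight)} W")

# Prints 300 W, 525 W, 600 W and 675 W respectively, which is where the
# 450-600 W guesswork above comes from.
```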
 
Did they mean 4K? WTF is 5K gaming?


If you want to get super cereal, technically there is no 4K either. It's all marketing to capitalize off the whole 1080p thing. It doesn't sound nearly as great if you call it 2160p, so now they're flip-flopping like Mitt Romney by using horizontal pixels vs. vertical in order to sell it.

By that logic we're all 2K gaming right now.
 
Yeah making money is for losers. :rolleyes:

At least NVIDIA is taking their profits and creating new tech that's incoming. More than can be said for Apple.

Huh?

You mean like Mini-DP standard? Adoption of Thunderbolt? Retina screen Adoption? First Smart Phone OS that doesn't suck? ... A lot more can be named.

Lots of the standards on the things we use today are the result of Apple's lead on technology adoption and invention. Not even Google can deny that.
 
If you want to get super cereal, technically there is no 4K either. It's all marketing to capitalize off the whole 1080p thing. It doesn't sound nearly as great if you call it 2160p, so now they're flip-flopping like Mitt Romney by using horizontal pixels vs. vertical in order to sell it.

By that logic we're all 2K gaming right now.
There is another marketing angle. If we start calling "1080p" gaming 2K, then the fact that 4K is quadruple the pixels will be lost on the not-so-tech-savvy, and the monitor/TV manufacturers wouldn't be able to justify the exorbitant cost of 4K displays to their customers. At first glance, "4K," or roughly 4,000, looks like quadruple of 1080p's roughly 1,000, which conveniently matches the real 4x pixel count. Comparing 2K and 4K would make it appear only double, even though the pixel count is still quadrupled.
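
For what it's worth, the raw pixel math backs that up; a quick sketch (the labels are just the usual consumer names):

```python
# Pixel counts behind the "2K vs 4K" naming argument.
resolutions = {
    "1920x1080 ('1080p' / '2K')": (1920, 1080),
    "3840x2160 (UHD '4K')":       (3840, 2160),
}

base_pixels = 1920 * 1080  # 2,073,600
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels:,} pixels ({pixels / base_pixels:.0f}x 1080p)")

# 4K UHD has four times the pixels of 1080p, even though the horizontal
# resolution (the number the marketing leans on) is only doubled.
```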

Marketing is all about those little mind tricks. It's why the second Xbox was called "360": Microsoft thought that calling it "2" would make people think it was inferior to the PlayStation "3".
 
You mean like Mini-DP standard?
A different connector for an existing standard that served no purpose on Apple hardware besides forcing users to buy adapters to connect displays that used the original connector.

Adoption of Thunderbolt?
Tech developed mostly by Intel for Light Peak, watered down and run across copper instead of fiber-optic cable, with slower speeds and fewer capabilities than Intel had originally intended.

Adoption hasn't been swift, either. USB 3.0 hit first, can handle most of the same tasks, and is implemented on more boards...

Retina screen Adoption?
What is that even supposed to mean? Not even Apple clearly defines what a "retina display" is.

If you just mean screens with absurdly-high resolution, that's not really innovation, that's just a progressively better version of existing display tech. A fairly obvious progression to make, at that.

First Smart Phone OS that doesn't suck?
Uuuuuh, what? I personally LOVED my PalmOS and Windows Mobile-based phones, both of which had FAR more functionality than iOS did on launch day.

iOS looked like a tinker toy compared to the entrenched options, and effectively launched as a feature-phone OS (no app store, no 3rd party apps, no customization options, no 3G connectivity, etc).

... A lot more can be named.
Still waiting to see your first example :confused:
 
Marketing is all about those little mind tricks. It's why the second Xbox was called "360": Microsoft thought that calling it "2" would make people think it was inferior to the PlayStation "3".

I thought it was called the Xbox 360 because the early high failure rates caused people to do a 360 and walk on by.
 
Did you say $3000? Would it not be vastly cheaper to get 2 x Titan Black? And if something fails it's only a $999 replacement not $2999?
 
Two, the current Titans can draw 75 watts from the PCIe slot, 75 watts from the six-pin, and 150 watts from the eight-pin, making a single-chip Titan a 300-watt card. How much of that last 75 watts it actually needs, we don't know, but if they could have made it two six-pins, they would have. So the original Titan must use between 225 and 300 watts.

The new card is going to need three 8-pins, or three 8-pins plus a six-pin. That puts it in the range of double what one card with half the GPUs and half the RAM uses; it needs 450 to 600 watts available. I'm guessing it will be three plugs just to meet 525 watts, or four plugs to meet 600 watts, though they could make that last plug an 8-pin to give the card more breathing room. But four 8-pins is more than most power supplies can provide to the whole system. Not that we would try that kind of PSU on a rig like this, but if they require four 8-pins it will eliminate a lot of crappy PSUs.

Titan has a TDP of 250W. Multi-GPU cards have always been made from chips binned to run at lower voltages to keep TDP down. I wouldn't be surprised if they launch it with an "official" TDP of 375W with a seriously gimped power ceiling.
 
A $1,000 premium over what you'd get by just buying two Titan Blacks.

Meh.
 
$3000 and it still can't do 1 MH/s on Litecoin.

Even at $2,000 nobody would care, since scrypt mining profitability went to shit. :eek: That's why 290 pricing is returning to normal.

But uhm, yeah. A consumer-focused 790 would have been interesting. This... Titan Z. Why? I mean, I get it; apparently NV gets more profit from the professional sector. That doesn't mean that I, a PC gamer, have to like it. After all the leaks yesterday, I can't help but be disappointed, mainly because of my expectations. I had expected a 790 and maybe some Maxwell news. I wanted PC gaming products; that's what I'm into. The Titan Z news seems like it's from left field, which kinda sucks since I generally really like NV products. But this? Ehmm... gimme a gaming product, I don't care about the CUDA stuff. Oh well. I definitely had inflated expectations; I guess that's the big issue here.
 
Yeah making money is for losers. :rolleyes:

At least NVIDIA is taking their profits and creating new tech that's incoming. More than can be said for Apple.

No, Apple is for losers...;) Especially that horribly overpriced stock! Reminds me of the heady days of Rambus stock--until it crashed...! Morons...:D (Rather, the ones buying at these prices--not the ones lucky enough to unload it on the suckers and make a bundle.)

Don't worry, though, because spending $3k on something nVidia tells you is "a supercomputer on a stick" only makes you slightly retarded (not a "loser", sort of)...;) I'm thinking nVidia feels left out of the soon-to-be-defunct "Bitcoins for Buttheads" business...:D (At least AMD just stumbled upon it, don't ya' know.)
 
While I'm sure I could elaborate more, it pretty much comes down to "DO NOT WANT."
Was hoping for a totally different kind of announcement. This reminds me of whenever Dana White claims he has big news.
 
Holy Nvidia hate ITT. But once Vega or someone rolls these up into a nice rig people will forget all that.
If the past tells us anything, since NV released this card we might yet see a GTX 790 before Maxwell, and if we don't, then Maxwell will still easily beat out the Titan Z. So... everyone wins, but everyone cries.

Why bitch and moan about something you're not going to buy anyways? Nvidia still has a great lineup at the moment with lots of options for everyone and is very competitive with AMD 'cause of the current mining prices.

Hmm, new video card or a 2005 Honda Civic? Let me mull that one over.
Yes, I'm sure people will cross shop a 3k video card and a '05 Civic. Nice logic there.
 
Holy Nvidia hate ITT. But once Vega or someone rolls these up into a nice rig people will forget all that.

I don't think it's hate. Most of us here really like NV products. But we expected consumer-level products, while the Titan Z is aimed mostly at professional use for CUDA development and has a price tag to match. Sure, you can use it for gaming, but that isn't the reason for the premium. I mean, nobody is going to get this over two 6GB GTX 780 Tis from EVGA. No reason to.

It's also an expectations issue; based on all of the rumors from yesterday, a lot of us expected more information on second-generation Maxwell. But of course that didn't happen.

Regardless, it's not hate. Just disappointed. I wanted PC gaming products, not this.
 
I think that's a market that doesn't drive anything. Look at the new consoles; technically they suck. The new engines are even more optimized. Unreal Engine 4 flies on a 780, as per the several demos out there, so nothing is driving NVIDIA to push.

Also, 20nm fabrication is a mess right now; I'm pretty sure it won't be ready until the end of 2014, if at all this year. They probably don't even have a working sample of the high-end card right now.

We'll see next winter, I guess. I feel that I won't need to upgrade for a while anyway.
 
Regardless, it's not hate. Just disappointed. I wanted PC gaming products, not this.

Well xoleras, you have some pretty solid logic. I very much understand getting hyped up over rumors, and I am looking forward to the next Maxwells as much as anyone. How many products do people want NVIDIA to announce? The market can only bear so many without getting even more diluted. Maxwell will be here soon enough, and it's not like games are starving for more power, even though we @ [H] always want more :p.

Some of the gems, with more sure to come: "Nvidia is the new Apple," "$3000 and it still can't do 1 MH/s on Litecoin." REALLY? I didn't know the Titan Z was a new crypto mining ASIC.
 
To me this represents more than just a high-end graphics card. It's a product that represents NVIDIA's best. It has the best-binned GPUs (that are FULLY enabled), the best quality components, and a full 12GB of VRAM, which, combined with its compute abilities, makes it pretty much the undisputed king of the hill.

A product marketed toward professionals and the ultra high-end enthusiast....finally.

Hell yes, I'm down for a couple of these in my gaming system...and maybe three...my NCASE M1 build would be trebly powered with one of these mean SOBs in it.. :D
 
I don't think it's hate. Most of us here really like NV products. But we expected consumer-level products, while the Titan Z is aimed mostly at professional use for CUDA development and has a price tag to match. Sure, you can use it for gaming, but that isn't the reason for the premium. I mean, nobody is going to get this over two 6GB GTX 780 Tis from EVGA. No reason to.

It's also an expectations issue; based on all of the rumors from yesterday, a lot of us expected more information on second-generation Maxwell. But of course that didn't happen.

Regardless, it's not hate. Just disappointed. I wanted PC gaming products, not this.

With a $3k price tag, it is a bargain for CUDA devs.
 
How do they keep it cool? Isn't that going to be 600 watts of heat? That's a large hair dryer...
 
With a $3k price tag, it is a bargain for CUDA devs.

Believe me, I understand the intent behind the product, and you are indeed correct. For the intended audience the card will be great. I'm just not a part of that intended developer audience, though.

As I mentioned, I just had inflated expectations for some PC gaming products; on that front the news was somewhat disappointing. Is that NVIDIA's fault? Nah. My fault for falling into the hype trap after yesterday's rumors. No true PC gaming consumer products to note, ya know? This was compounded by the fact that the leaks indicated we'd get some goodies, but that obviously wasn't the case. Clearly the leaks were wrong, and so a lot of people's expectations for consumer gaming products were inflated.
 
With a $3k price tag, it is a bargain for CUDA devs.

EDIT: I guess what I meant to say was that for CUDA developers looking to get more powerful hardware, the price is right. But still, I would bet that most serious developers already have Tesla units. I don't know. It's a cool card, but whatever.
 
Can someone help me with this:

Two Titan Blacks have 12GB of VRAM in total, but in SLI they can only use 6GB, not 12.

Can the Titan Z use 12GB at once, as stated by NVIDIA, or can it only use 6GB, like two Titan Blacks in SLI do?

Thanks

It's 6GB per GPU. AnandTech has it listed as 2x6GB, so yeah: like any SLI setup, each GPU keeps its own copy of the frame data, so effectively 6GB, same as two Titan Blacks.
 