Leaked GeForce GTX TITAN X Benchmarks?

and the $600 295x2 in their benches still beats it :D Still way too much speculation to really draw conclusions at this point. Hopefully, as it looks, 4K and up is the goal for the next performance step. Lots to look forward to over the next couple of years.
 
If those performance numbers are close to legit, I'm going to be VERY curious to see how the R9-390X compares.
 
and the $600 295x2 in their benches still beats it :D Still way too much speculation to really draw conclusions at this point. Hopefully, as it looks, 4K and up is the goal for the next performance step. Lots to look forward to over the next couple of years.

The legit comparison with the 295X2 is the G1 SLI or the Titan X SLI not the single chip Titan(s).
 
With 12GB it had better do great at 4K.

With "only" 4GB the R9-390X could end up being slower.
 
and the $600 295x2 in their benches still beats it :D Still way too much speculation to really draw conclusions at this point. Hopefully, as it looks, 4K and up is the goal for the next performance step. Lots to look forward to over the next couple of years.

A dual-GPU card vs. a card with only a single GPU. 500-600W draw vs. ~250W. A card that needs an external water cooler vs. one that can be AIR cooled. If Nvidia wants to take a shot at the 295x2, they could price the Titan X at, say, $700. Yes, it's still more expensive, but it has 3x the usable frame buffer. Before people say the 295x2 has 8GB, remember that's not how CrossFire works: each of the two GPUs only has access to the 4GB dedicated to it, not all 8GB.
 
From the leaked performance numbers it looks like the Titan X will be about on par with the X380, except the Titan X will be a lot more expensive. That's assuming any of the numbers for the X380 or Titan X are correct.
 
Curious, going forward has there been any word of graphic engines incorporating the memory pooling on multiple graphic cards?

With memory pooling the actual vram per card in multi-card configurations could be less problematic and even moot to some degree. Especially with features like HBM memory speeds it could work very well.
 
From the leaked performance numbers it looks like the Titan X will be about on par with the X380, except the Titan X will be a lot more expensive. That's assuming any of the numbers for the X380 or Titan X are correct.

Um, the X380 is pretty much a rebadged 290X, so figuring it will be on par with that is pretty dumb. Its score is near where two of those 290Xs in CrossFire are, a tad slower, which is normal when it's one GPU vs. two.
 
The legit comparison with the 295X2 is the G1 SLI or the Titan X SLI not the single chip Titan(s).

I am interested in single-card solutions, since I like micro ATX machines. Not to mention money talks, bullshit walks.

A dual-GPU card vs. a card with only a single GPU. 500-600W draw vs. ~250W. A card that needs an external water cooler vs. one that can be AIR cooled. If Nvidia wants to take a shot at the 295x2, they could price the Titan X at, say, $700. Yes, it's still more expensive, but it has 3x the usable frame buffer. Before people say the 295x2 has 8GB, remember that's not how CrossFire works: each of the two GPUs only has access to the 4GB dedicated to it, not all 8GB.

I would like to see it priced for its performance too. The cards seem geared toward UHD gaming, so I would base it off that. So far I am just waiting for a good single-card solution for an mATX machine for some good 4K+ goodness. Power draw and heat are low on the list when I'm looking for the most horses under the hood; they're only bonuses in the end when you plan to put the whole machine on a water loop.
 
The legit comparison with the 295X2 is the G1 SLI or the Titan X SLI not the single chip Titan(s).

A comparison is legit if it compares things that people care about. In the case of many games it is perfectly legitimate to use price as the anchor and ignore the downsides of SLI, higher power, etc. The problem the Titans have always had is the ridiculous price points that force them to compete with SLI configurations when someone is looking to get the most performance out of the dollars they invest.
 
The legit comparison with the 295X2 is the G1 SLI or the Titan X SLI not the single chip Titan(s).

No it isn't. Yes, the 295 is a dual-chip card, but it's a single card. A lot of people don't have room for two cards. Also, you completely forgot price. If you REALLY wanted to be legit, then comparing a $650 295 to a $2000+ Titan SLI config is rather stupid. It doesn't matter if it's dual-chip on a single PCB or not.
 
Patiently awaiting the Acer 34" curved 1440p G-Sync monitor and the cut-down 6GB version of these cards.
 
I took a peek at the GDC Unreal Engine booth, where there were a handful of Titan Xs running UE4 demos; I managed to alt-tab out and snag a look at the Nvidia control panel.

These leaks are pretty accurate.
 
No it isn't. Yes, the 295 is a dual-chip card, but it's a single card. A lot of people don't have room for two cards. Also, you completely forgot price. If you REALLY wanted to be legit, then comparing a $650 295 to a $2000+ Titan SLI config is rather stupid. It doesn't matter if it's dual-chip on a single PCB or not.

The 295x2 still runs as CrossFire, so it's really not a "single card" from a performance perspective in a lot of games.
 
Curious, going forward has there been any word of graphic engines incorporating the memory pooling on multiple graphic cards?

With memory pooling the actual vram per card in multi-card configurations could be less problematic and even moot to some degree. Especially with features like HBM memory speeds it could work very well.

Still snake oil that hasn't been verified or demonstrated. A post appeared on Tom's Hardware and every blog just ran with it as if MS had somehow confirmed it.
 
A dual-GPU card vs. a card with only a single GPU. 500-600W draw vs. ~250W. A card that needs an external water cooler vs. one that can be AIR cooled. If Nvidia wants to take a shot at the 295x2, they could price the Titan X at, say, $700. Yes, it's still more expensive, but it has 3x the usable frame buffer. Before people say the 295x2 has 8GB, remember that's not how CrossFire works: each of the two GPUs only has access to the 4GB dedicated to it, not all 8GB.


All that, for a grand total of over $400 more. Someone do the math on how long until you'll see an ROI on that $400 in utility costs because it uses less power than a 295x2.
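Doing that math as a rough sketch: assuming a ~350W load-power difference (600W vs. 250W) and a typical ~$0.12/kWh electricity rate (both figures are assumptions, not from the thread), the payback period looks like this:

```python
# Back-of-envelope payback estimate for the ~$400 price gap.
# POWER_DELTA_KW and RATE_USD_PER_KWH are assumed figures.
PRICE_GAP_USD = 400
POWER_DELTA_KW = 0.600 - 0.250      # ~350W difference under load (assumed)
RATE_USD_PER_KWH = 0.12             # assumed US residential electricity rate

hours_to_break_even = PRICE_GAP_USD / (POWER_DELTA_KW * RATE_USD_PER_KWH)
years_at_3h_per_day = hours_to_break_even / (3 * 365)

print(f"{hours_to_break_even:.0f} hours of gaming to break even")
print(f"~{years_at_3h_per_day:.1f} years at 3 hours/day")
```

Under those assumptions it's roughly 9,500 hours of load, so for a typical gamer the power savings alone never come close to recouping the price difference.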
 
This is going to be a curious round for the high end. The way I see it panning out (with a good amount of interpolation):

  • 390x will have near performance parity with the Titan X. However, it will be limited to 4GB VRAM (at least for the foreseeable future), therefore not optimal for those running 4k or over. It will be priced in the realm of an enthusiast card, likely around $500-600.
  • Titan will run somewhat more efficiently than the 390x's 300W TDP, but will perform about the same, and will have the VRAM to support >4K resolutions. However, it will be priced for the ultra high-end, around $800-1000.

The interesting thing will be, if the 390x actually turns out to outperform the Titan, the win will be somewhat nullified by the fact that it will be gimped at high resolutions due to its limited VRAM -- a bit of a reversal of what we see now with 290x excelling at high resolutions.
 
Seems like just another bogus set of benchmarks to me... I mean who has access to four Titan X GPUs for benchmarking purposes right now? One, maybe two, but four? I just don't see it.
 
Seems like just another bogus set of benchmarks to me... I mean who has access to four Titan X GPUs for benchmarking purposes right now? One, maybe two, but four? I just don't see it.

OCUK have stated they have ~1000 in their warehouse. They're out there.
 
So get a DP to DVI or HDMI to DVI converter cable.


It's not that simple. An HDMI adapter won't work for an overclock, and for DP you need an active adapter, which runs upwards of $100. And even then I haven't seen an adapter that supports a high enough pixel clock to allow 110-120Hz @ 1440p. I don't know why you have it out for DVI, but it's very necessary for a lot of us. ;)
 
Um, the x380 is pretty much a rebadged 290x so how you figure it will be on par with that is pretty dumb. Its score is near where 2 of those 290x's in CF are, tad slower which is normal when its 1gpu vs 2

380, 390, they keep changing the names. There are still a few months to go, so it may change again.
 
Another Titan that delivers stunning performance but with one of the most ridiculous dollars/FPS ratios in the industry. I still see no reason to buy this as a gamer. Wake me when the price comes in line with performance.
 
All that, for a grand total of over $400 more. Someone do the math on how long until you'll see an ROI on that $400 in utility costs because it uses less power than a 295x2.

To be fair, a 500/600W GPU will need a much beefier PSU than a 250W GPU, so plan on at least $50-75 in immediate cost difference (assuming you're building from the ground up or upgrading).

But I don't have a dog in this fight. I'll never consider either.
 
These specs seem way off to me... When it says 12GB of memory, I think it should mean 3.5GB, with 8.5GB of L3 Cache.
 
Lol. Their marketing/lying is going to be haunting them for a long time.

Since a majority of people rarely have something useful to add... they poke fun at things till the end of time. Hell, I still hear about the IBM "Deathstars" from time to time.

As for this card... it is, again, not targeted exclusively at gamers. It is also targeted at people who utilize the GPU for other processing needs. In this case, if you have FARMS of these cards, the efficiency will matter A LOT. If you had 4 of these per 3U box across a 42U rack, the delta power is 14kW compared to the AMD offering, AND more computation per unit of space. To a facilities manager, this is a BIG deal, especially for cooling!
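That 14kW figure roughly checks out. Here is a sketch of the arithmetic, assuming four cards per 3U box, fourteen 3U boxes filling a 42U rack, and a ~250W per-card power difference between the two offerings (the per-card delta is an assumption):

```python
# Sanity check on the quoted rack-level 14kW power delta.
# DELTA_PER_CARD_KW is an assumed per-card difference, not a measured value.
CARDS_PER_BOX = 4
BOXES_PER_RACK = 42 // 3            # fourteen 3U boxes in a 42U rack
DELTA_PER_CARD_KW = 0.250           # assumed per-card power difference

rack_delta_kw = CARDS_PER_BOX * BOXES_PER_RACK * DELTA_PER_CARD_KW
print(f"Rack-level power delta: {rack_delta_kw:.0f} kW")  # prints 14 kW
```

That's 56 cards per rack, so even a modest per-card difference adds up quickly at the facility level.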
 
Since a majority of people rarely have something useful to add... they poke fun at things till the end of time. Hell, I still hear about the IBM "Deathstars" from time to time.

ha, u mad bro?
 
Since a majority of people rarely have something useful to add... they poke fun at things till the end of time. Hell, I still hear about the IBM "Deathstars" from time to time.

Well, I for one like to try to hold people to their word, and when they break it, I feel entitled to at least make them aware of the fact that they were, or are, dishonest. But maybe you and your kind are the majority, and that's why companies like Nvidia can afford to be dishonest and underhanded. But if my flippant comment makes just one person reconsider their purchase, then that makes me feel better.
 