NVIDIA Announces The New Titan X

I have a 3930K watercooled as well, and that sounds a little high at stock clock/voltage. I have my CPU at a 4.7GHz OC (can't remember the voltage), and my max heat at full load is about 75C, usually after 40-60 minutes of Prime95. On occasion it spikes a little higher, but never over 80C that I can remember. It has been a while since I ran Prime95 on my CPU.

Glad to see your results with water. I received my block the other day, but I also ended up getting a second Titan XP and ordered a second block with the backplates. Hopefully MSI will release that Afterburner beta that Guru3D was using to add voltage. With water, I am sure we can push them even further, and it's great to hear your results.
 
Now I need to find someone else with an i7-3930K and a Supremacy EVO block to compare temps with, because I'm not sure if I got a good mount of the block to the CPU, or if I need to redo it.

I'm getting stock (no overclock) Prime95 max load temps of 54C at 74F ambient.

This sounds a little high to me, compared to how awesome my GPU numbers were, but at the same time, I know that these old Sandy Bridge-E chips give off a fair bit more heat than the latest CPUs, so maybe this is normal? I'm going to have to overclock it and see what it looks like then.

Mid-50s C in Prime95 is the temp I get at a 4.4-4.5GHz overclock, and that was with an Alphacool XP3. A Supremacy block should have a slight edge. I would check your mount. Keep redoing it till ya get even (or as close to even) core temps, then max temp. As you start scaling that overclock to 5.0-5.2GHz, the temps start taking off like a bird in flight.
 
To my knowledge there is no voltage control available yet.

For the overclocked tests I did set power and temp limits to max.

My mistake. I was watching an overclocking video on YouTube and it looked like EVGA Precision X had the ability to add a very small offset for voltage, but maybe that's just for the 1080 (he said it was for all Pascal cards). When folks were saying no voltage control, I thought they meant no "significant" control, as that was a max of 30mV. Kinda stinks we can't touch it at all (until we get custom BIOS of course :p )
 
TL;DR: FPS numbers down slightly going from 980ti SLI setup to a single Titan XP, but play experience just as good, if not (circumstantially) better. I could easily make a case for sticking with the older setup, however, if money were tight.

This is long, sorry:

Got my Titan X today. As sad as it seems, it was sitting in the box, ready to be installed, and I was thinking about sending it back (or selling it). The reason is that I was running "before" benchmarks most of the afternoon with my 980ti SLI setup, and quite frankly, it had never run so well.

Thanks to what I had read in one of the threads on Overclock.net, I found that most of my issues with Witcher 3 and other games were due to a weird problem rendering in-game AA. It was giving me horrible GPU utilization rates (80-90% on one, 50-60% on the other in TW3) and therefore a terrible experience. When I disabled it, all of the games I've been playing worked flawlessly.

The Witcher 3 ran at slightly better than 60 fps, and that was with almost every single bell and whistle enabled in 4K (including maxed out HairWorks and 4x HairWorks AA), OTHER than standard AA, which was disabled. I was blown away, quite frankly. Same experience when I loaded up Dragon Age Inquisition (but the frames were slightly better there). The Emerald Graves area was giving me better than 70 fps, and everywhere else was routinely hitting >90. Even Crysis 3 (fully maxed) ran anywhere from 56-61 fps in the very unscientific "benchmark" I tried to run by just playing the game for a bit from a certain save. The Heaven benchmark gave me an average of 64 fps.

The single game that still gave me issues was Black Desert Online. In this, with "high end mode" enabled, I was still getting anywhere from 33-40 fps in Calpheon (pretty heavy graphically), and my GPU utilization was pretty poor. I place the blame for this on the game itself, which is likely a crappy port of the original Korean developer's work.

It's blasphemy, I realize, but I started thinking maybe I should just stick with my current setup for now. My primary motivation with this move was to get to a single-card solution that would take care of my needs and give me enough rendering power to deliver a solid 4K experience.

But I reasoned that, even if the 980ti's are giving me solid numbers, I'd really love to get rid of the headache and inconvenience of a dual card setup, so I cracked open the Titan and fired it up for a comparison. Here were my results:

I didn't do a whole lot of bench testing in games with the Titan at stock clocks. The only reason I didn't install the EK block on it before even putting it in the PC was to make sure the card wasn't defective (and not a complete OCing dud). After some tweaking, I was able to get +180-185MHz stable in the Heaven benchmark, but later had to nudge that down to about +175 to be stable in all games (The Witcher 3 crashed after about 20 minutes at +180). I ran at 100% fan to enable this OC, and even at that it was clearly begging for more power, cooler temps, and more voltage. All things that a water block and custom BIOS will give me.

So here were my results, juxtaposed with the 980ti SLI numbers. Keep in mind these tests weren't very scientific, and I didn't exactly follow prescribed courses through the game while benching. Vsync always off for tests.

980ti SLI (1477MHz each) | Titan XP (+175MHz offset)

Heaven 4.0 Bench (UHD, Quality Ultra, Tessellation Normal, AA x2): 64 fps | 55 fps
The Witcher 3 (UHD, all settings maxed, 4x Hairworks AA, no standard AA, low sharpening, no blur, no motion blur, no vignette): 61-73 fps | 54-60 fps
Black Desert Online (All settings maxed): 33-40 fps | 39-44 fps
Crysis 3 (All settings maxed, FXAA): 56-61 fps | 44-55 fps
Dragon Age Inquisition (All settings maxed, AA off): 68-72 fps | 63-65 fps
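For what it's worth, the deltas in that table can be reduced to rough percentages (a quick sketch; the FPS ranges are collapsed to their midpoints, which is crude but good enough for eyeballing):

```python
# Rough SLI-vs-Titan deltas from the table above.
# Each FPS range is collapsed to its midpoint before comparing.
results = {
    "Heaven 4.0":             ((64, 64), (55, 55)),
    "The Witcher 3":          ((61, 73), (54, 60)),
    "Black Desert Online":    ((33, 40), (39, 44)),
    "Crysis 3":               ((56, 61), (44, 55)),
    "Dragon Age Inquisition": ((68, 72), (63, 65)),
}

def midpoint(rng):
    """Midpoint of a (low, high) FPS range."""
    return sum(rng) / 2

for game, (sli, titan) in results.items():
    delta = (midpoint(titan) - midpoint(sli)) / midpoint(sli) * 100
    print(f"{game}: {delta:+.1f}%")
```

Everything except Black Desert Online comes out roughly 9-15% in favor of the SLI setup; BDO flips the other way, which lines up with the "noticeably smoother" impression described below.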

So, looking at the numbers alone, this is a little depressing. Not that I didn't realize that I was going to be taking a slight FPS hit across the board with this move (for games that support SLI). What really blew me away, however, was that the actual gameplay experience seemed just as good, if not better, on the Titan.

You may be saying that this is simply confirmation bias, and the scientist in me would affirm that you are probably correct. However, if I were to show gameplay to someone who knows nothing of video games or framerates, I would wager they would be very hard pressed to tell the difference between the two experiences. Frankly, in a lot of situations, though the framerate was worse with the Titan, motion seemed smoother in general. Maybe this is microstutter at work...or maybe it's just the aforementioned confirmation bias. I've been using dual GPU setups since around 2010, so it's been a minute since I really had a single GPU play experience.

The long and the short of this is that there really isn't a massive difference between the two. With the exception of Black Desert Online (which was noticeably smoother with the Titan), most of the games really played pretty much the same on the two setups.

So the question is, if I knew what I know now before I ordered the Titan, would I do it again? That's a tough one to answer. After selling my 980ti's used, assuming I can get at least $800 with the water blocks included, it's still a $470 "upgrade" (with tax on the Titan). The thought that a 1080ti could come along in 3 months and give me 99% of the Titan X performance for 75% of the price makes me absolutely cringe. But in the end, I'm very happy with the Titan, and though the numbers may be slightly lower than I was hoping (I really wanted to be able to hit a constant 60 fps in TW3 with the OC), the Titan is butter smooth, and I'm keeping it.
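Quick sanity check on that $470 figure, assuming roughly 6% sales tax on the card (the tax rate is my assumption, not stated in the thread):

```python
# Back-of-envelope net upgrade cost. The 6% tax rate is an assumption;
# the thread only says "with tax on the Titan".
titan_price = 1200.00
tax_rate = 0.06              # assumed, not stated
resale_980ti_pair = 800.00   # hoped-for sale price, water blocks included

net_cost = titan_price * (1 + tax_rate) - resale_980ti_pair
print(f"Net upgrade cost: ${net_cost:.0f}")
```

That lands around $472, in line with the ~$470 quoted.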
 
Frankly, in a lot of situations, though the framerate was worse with the Titan, motion seemed smoother in general. Maybe this is microstutter at work...or maybe it's just the aforementioned confirmation bias. I've been using dual GPU setups since around 2010, so it's been a minute since I really had a single GPU play experience.
It has been said many times that you need higher FPS in SLI to get the same gameplay experience on a single card. That is due to microstutter.
 
So the question is, if I knew what I know now before I ordered the Titan, would I do it again? That's a tough one to answer. After selling my 980ti's used, assuming I can get at least $800 with the water blocks included, it's still a $470 "upgrade" (with tax on the Titan). The thought that a 1080ti could come along in 3 months and give me 99% of the Titan X performance for 75% of the price makes me absolutely cringe. But in the end, I'm very happy with the Titan, and though the numbers may be slightly lower than I was hoping (I really wanted to be able to hit a constant 60 fps in TW3 with the OC), the Titan is butter smooth, and I'm keeping it.

This is the reason I did not opt for the Titan X this time around. I know all too well now that NVIDIA will without a doubt release a 1080 Ti in the very near future rendering the Titan X completely and utterly pointless. I guess the only consolation prize is that as a Titan X owner you get to preview the 1080 Ti performance a few months before everyone else. If that is worth $400+ to some, then more power to them. It was worth it to me at some point in time but not anymore.
 
This is the reason I did not opt for the Titan X this time around. I know all too well now that NVIDIA will without a doubt release a 1080 Ti in the very near future rendering the Titan X completely and utterly pointless. I guess the only consolation prize is that as a Titan X owner you get to preview the 1080 Ti performance a few months before everyone else. If that is worth $400+ to some, then more power to them. It was worth it to me at some point in time but not anymore.
We will also be 10-15% faster ;)
 
This is the reason I did not opt for the Titan X this time around. I know all too well now that NVIDIA will without a doubt release a 1080 Ti in the very near future rendering the Titan X completely and utterly pointless. I guess the only consolation prize is that as a Titan X owner you get to preview the 1080 Ti performance a few months before everyone else. If that is worth $400+ to some, then more power to them. It was worth it to me at some point in time but not anymore.

I agree, but I guess it comes down to what the actual difference will be between the 1080ti and the Titan. If it really were almost identical (99% as capable), then yeah, I'd feel it's money wasted. But if the Titan even has a 10% edge, I'd probably be ok with that, given I'm only doing a single card this time around and I'm going to need every scrap of power I can get. The $1200 price point itself doesn't bother me - hell, I spent more than that on my 980ti setup, especially when you factor in the water blocks.

I'm not sure what possessed Nvidia to do what they did with the OG Titan X and 980ti, but the competition from AMD was different then as well. There's really not much reason at all for them to bring Titan XP performance to the masses at the <$1k price point given how they can't keep the Titan in stock.
 

Your numbers don't match up to the known metrics.

Witcher 3 vs 980ti SLI


Witcher 3 vs 1080 SLI
 
I'm curious about those results, but why did they have to be videos? So annoying. Can't people just write goddamned articles anymore?

It's not an article, it's a framerate comparison. You need someone to wax poetic instead of watching the actual framerates?
 
It's not an article, it's a framerate comparison. You need someone to wax poetic instead of watching the actual framerates?

Don't need a full article in that case, no. Just a table. Videos are one of the absolute worst ways to disseminate information, unless motion is important to the information you are disseminating.
 
Your numbers don't match up to the known metrics.

Witcher 3 vs 980ti SLI

Witcher 3 vs 1080 SLI

I'm not tracking. The first video isn't relevant because it's shot at 1440p; everything I reported was at UHD. The second video only supports my numbers (for the Titan XP OCed). If anything, I seem to have gotten slightly better numbers than they did there (could be the maps they ran on).
 
A quick question for you US people.

How long, generally, did it take for your GPU to arrive after you ordered it from NVIDIA? A week? Two weeks?

Assume it shipped to a US address, with the fastest available shipping.
 
A quick question for you US people.

How long, generally, did it take for your GPU to arrive after you ordered it from NVIDIA? A week? Two weeks?

Assume it shipped to a US address, with the fastest available shipping.

I ordered on launch day (Tuesday) and got it on Saturday with free standard shipping, so four days for me.

Many people on here had theirs the next day after ordering, because they had them overnighted.
 
I ordered on launch day (Tuesday) and got it on Saturday with free standard shipping, so four days for me.

Many people on here had theirs the next day after ordering, because they had them overnighted.

Yep. They seem to ship out of MN, so it was 3 business days of shipping time for me. Luckily they use FedEx Home, so Saturdays are a delivery day. I ordered Wednesday and received it Saturday (I'm near Baltimore, MD).
 
I ordered on launch day (Tuesday) and got it on Saturday with free standard shipping, so four days for me.

Many people on here had theirs the next day after ordering, because they had them overnighted.
Yep. They seem to ship out of MN, so it was 3 business days of shipping time for me. Luckily they use FedEx Home, so Saturdays are a delivery day. I ordered Wednesday and received it Saturday (I'm near Baltimore, MD).

Thanks, so if things go well, it SHOULDN'T take more than a week?

I will be in CA sometime towards the end of the year; if the Titan X doesn't get over here by then, that's what I may have to resort to, assuming hotels allow FedEx packages to be delivered there.
 
Thanks, so if things go well, it SHOULDN'T take more than a week?

I will be in CA sometime towards the end of the year; if the Titan X doesn't get over here by then, that's what I may have to resort to, assuming hotels allow FedEx packages to be delivered there.

You need to worry more about your method of payment (credit card, PayPal) allowing delivery to some random hotel address (versus your own billing address) than about the hotels allowing it. Hotels will take just about any package.
 
I did order stuff on Amazon from home to have it delivered to a US address, so that is on the seller, I think. The amount MIGHT get refused by the card initially, and I'll have to phone in to authorise it. But the difference is that I will be ordering the Titan X while I am in the US, rather than doing it from home, since I have to change hotels in the middle of my trip (I plan to have it delivered to the longer stay, so I can only do it after I check in).

Anyway, it might not even be needed, just exploring options.
 
I'm wondering if there is enough space under the EK block to tap into the fan header for the PWM signal.

I wish I had looked at that before I installed mine. Now it's a major pain to get to.
 
I'm wondering if there is enough space under the EK block to tap into the fan header for the PWM signal.

I wish I had looked at that before I installed mine. Now it's a major pain to get to.

What for, out of curiosity? LEDs? I plan to devote every last milliamp of on-card power towards the GPU. :D
 
What for, out of curiosity? LEDs? I plan to devote every last milliamp of on-card power towards the GPU. :D

I don't need or want the power. I just want to tap the PWM signal pin (and maybe the return RPM signal pin) so I can use it as one of my inputs to fan control. I wouldn't even touch the power and ground pins.
 
Gotcha. Another option would be to dip a temp sensor in your reservoir and have the fans ramp up when a set temp flux is detected. In my case I eliminated the need altogether by building way more reserve cooling capacity into my loop than necessary.
 
What for, out of curiosity? LEDs? I plan to devote every last milliamp of on-card power towards the GPU. :D

To add further detail, I currently control my fans based on water temperature using a cheap $8 dual-channel PWM temperature controller I got on eBay.

Each radiator has an inline thermal sensor, which the controller uses to control the fan speeds for that radiator.

The problem with this setup, however, is that while it works well in a cool room, I want to keep my water at about 30C under load, and as the room gets warmer this gets more and more difficult. Usually when I'm in my office working, I have the AC on, cooling it down to reasonable temps, but if I'm in and out of there, it can get hot in the office, sometimes over 80F, which means more and more fan speed is needed to maintain the 30C water temperature. So if the computer is on and the room gets hot, those fans rev up to high, loud speeds to maintain the water temperature when it really isn't needed, as the system is idle anyway.

My plan is to pick up a DP3T switch, so I can switch between three PWM signal inputs:



Setting 1 will take fan input from the water temp controller like it is currently set up.

Setting 2 will take PWM signal from the motherboard for the radiator closest to the CPU and from the GPU fan header for the radiator closest to the GPU.

Setting 3 will take PWM signal from a 556-based PWM generator, based on this guide.


I picture dremeling one of my drive bay covers and attaching the switch to it, as well as a twisty knob for the pot for the PWM generator circuit.


So, switch all the way to the left = manual control by twisting knob.

Switch in middle = PWM signals from GPU and motherboard control fan speeds

Switch all the way to the right = fan speed governed by loop water temp
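In software terms, those three positions boil down to a simple source selector. Purely illustrative sketch: the real build is a physical DP3T switch routing PWM wires, and the position names here are made up.

```python
# Illustrative model of the planned DP3T PWM-source selector. In the actual
# build this is a physical switch; nothing runs in software.
def select_pwm_duty(position, knob_duty, gpu_mobo_duty, water_temp_duty):
    """Return the PWM duty cycle (0-100) the fans would see in each position."""
    sources = {
        "left":   knob_duty,        # manual: knob/pot on the 556 PWM generator
        "middle": gpu_mobo_duty,    # GPU fan header / motherboard PWM signal
        "right":  water_temp_duty,  # water-temp controller output
    }
    return sources[position]

print(select_pwm_duty("right", knob_duty=40, gpu_mobo_duty=65, water_temp_duty=80))
```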
 
Gotcha. Another option would be to dip a temp sensor in your reservoir and have the fans ramp up when a set temp flux is detected. In my case I eliminated the need altogether by building way more reserve cooling capacity into my loop than necessary.

Yeah, the problem is I don't have central air, and regardless of radiator capacity, I can't go sub-ambient :p

I actually have fairly decent radiator capacity with one 45mm thick 420mm rad up top, and one 86mm thick 280mm rad in the front.
 
Yeah, the problem is I don't have central air, and regardless of radiator capacity, I can't go sub-ambient :p

I actually have fairly decent radiator capacity with one 45mm thick 420mm rad up top, and one 86mm thick 280mm rad in the front.

Sounds like you just need the ability to override everything... a la one of those "turbo" buttons 486-based system cases used to have back in the '80s. :D
 
UPS always drops stuff off at the start of my work day... nooooo.

Nice!

When you go to install yours, if you don't mind, I would greatly appreciate a picture from the bottom of the card in the area where the fan header is, to see what kind of clearance there might be between the fan header and the block.

That way I don't have to drain my loop and take mine out just to check if my plans are feasible :p
 
I'm planning on mounting my block late tonight after the kids are tucked in. Zarathustra, I'll hook you up with some close-up shots if Rizen doesn't beat me to it.
 
Time for some EK Fullcover temp results:

I ran Heaven in a loop at 1920x1080 with everything maxed for these tests.

Stock cooler baseline: (Ambient 73F)

All settings stock, including fan profile and power limit: 1650MHz, 51% fan, 84C
Same as above, but with 100% fan manually set: 1790MHz, 62C
100% fan + overclock: +160 max stable offset, resulting in 1987MHz and 65C

Testing with EK Fullcover block: (Ambient 74F)

All settings stock: 1886MHz, 32-33C
Same overclock as baseline (+160): 2050MHz, 33-34C
New max overclock (+170MHz): 2063MHz, 33-34C


So it looks like just dropping the temps was enough to lift me from 1987MHz to 2050MHz. Now my top stable OC is 2063MHz.
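Put as percentages, a quick sketch from the numbers above:

```python
# Clock gains from moving to the water block, using the results above.
air_max = 1987     # MHz, best stable on the stock cooler at 100% fan
water_same = 2050  # MHz, same +160 offset under water
water_max = 2063   # MHz, new max stable (+170) under water

gain_same = (water_same - air_max) / air_max * 100
gain_max = (water_max - air_max) / air_max * 100
print(f"Same offset: +{gain_same:.1f}%, new max: +{gain_max:.1f}%")
```

A ~3-4% clock lift from the ~30C temperature drop alone.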

Going to need more voltage and a higher power limit to go higher. Hoping for the best from BIOS mods! I certainly have enough temperature headroom! :p

Some final pics to go along with this:


View attachment 6898 View attachment 6899

Now I need to find someone else with an i7-3930K and a Supremacy EVO block to compare temps with, because I'm not sure if I got a good mount of the block to the CPU, or if I need to redo it.

I'm getting stock (no overclock) Prime95 max load temps of 54C at 74F ambient.

This sounds a little high to me, compared to how awesome my GPU numbers were, but at the same time, I know that these old Sandy Bridge-E chips give off a fair bit more heat than the latest CPUs, so maybe this is normal? I'm going to have to overclock it and see what it looks like then.


I have my fans set to stay at minimum speed (25%, or ~500rpm) until the water hits 28C, then gradually ramp up to 100% at 32C water temp.

Thus far, in both CPU and GPU load tests, the water temp has maxed out at about 29C with a fan speed of ~1100-1150rpm.

Considering 74F is 23.3C, I guess that means I'm at a ΔT of 5.7C, which isn't terrible. I guess I could improve on that by ramping fan speeds up higher, but I'm not sure I want to.
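That curve and the ΔT arithmetic, sketched out (the linear ramp between 28C and 32C is my assumption about how the controller interpolates):

```python
# Sketch of the fan curve described above: hold 25% duty below 28C water temp,
# then ramp (assumed linear) to 100% at 32C and clamp there.
def fan_duty(water_c, low_t=28.0, high_t=32.0, min_duty=25.0):
    if water_c <= low_t:
        return min_duty
    if water_c >= high_t:
        return 100.0
    frac = (water_c - low_t) / (high_t - low_t)
    return min_duty + frac * (100.0 - min_duty)

def delta_t(water_c, ambient_f):
    """Water-over-ambient delta, converting the ambient from Fahrenheit."""
    return water_c - (ambient_f - 32) * 5 / 9

print(fan_duty(29.0))             # 43.75% duty at the observed 29C water temp
print(round(delta_t(29, 74), 1))  # 5.7C over a 74F room
```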


Oh,

I forgot to mention that I didn't use the paste EK included with the block.

I went with some Thermal Grizzly Kryonaut instead.

Not sure how much of a difference this may have made.
 
Sure thing, I can take some photos of the PCB. I am hoping my meetings at work are quick today so I can start disassembling my loop and card and get it going.

I have some Gelid GC Extreme paste here. Should I use that or the included EK paste?
 
Funny side note.

I was playing Fallout New Vegas last night. First game I've really played since getting the loop up and running.

I play it at 4K with vsync at 60hz and Ultra settings but no mods (yet).

I wasn't expecting it to load up my system at all (it is an older title, after all), so I didn't even set OC settings, but it was really amusing to see that the GPU temp didn't budge AT ALL over idle temps, even after playing the game for an hour :p
 
Sure thing, I can take some photos of the PCB. I am hoping my meetings at work are quick today so I can start disassembling my loop and card and get it going.

I have some Gelid GC Extreme paste here. Should I use that or the included EK paste?

Not sure. I haven't seen any tests of the EK stuff and I don't know what it is for comparison.

Some suggest it performs similar to MX-2, a little bit better than MX-4, but I think a lot of this is subjective and more dependent on block mount quality than on the paste, honestly.

As long as it is non-conductive, I think most paste will be fine under there. I went with the Thermal Grizzly Kryonaut because in every test I've seen it performs the best, and because I wanted to get the absolute most out of my GPU: I'm downsizing to a single card after previously running SLI'd 980Tis, and for 4K I need as much performance as possible.

In retrospect, knowing that I see 34C overclocked load temps, I probably would not have worried about it, would have just used the included EK paste, and saved a few bucks.
 