GeForce GTX 1080 Ti Discussion Thread

I don't understand. That link contains no SLI benchmarks. Or are you just saying that a Single 1080Ti isn't going to yield sufficient frames for you?


That's what it looks like.. why I'm not itching to upgrade to a 4k screen just yet.

My 2 1080 Classifieds, water cooled and heavily overclocked, serve me just fine at 2560x1440 on my 165hz g-sync monitor.. just wish sli was better supported currently, which it's not. To do it all over again I'd pass on the sli.

So maybe next fall the next big card will do 4k well enough.
 
Yes, according to the real-world benchmark sections (the actual frame rate graphs) of some of the more demanding games, and some upcoming games - even in the 2560 x 1440 graphs with everything maxed and crazy AA.


More specifically I meant that I was set to do 1080ti sli by year end anyway since I want to upgrade to one of those asus/acer dp 1.4 4k 144hz FALD HDR g-sync monitors, and that I might just get both gpus now considering. If 1440p scales well enough on such a high ppi 4k screen (163ppi), that could be another option for some games to break 100fps-hz average too.

------------------------------------------------------------------------
1080ti (single) hardwarecanuck's review
------------------------------------------------------------------------
In GTA V we take a simple approach to benchmarking: the in-game benchmark tool is used. However, due to the randomness within the game itself, only the last sequence is actually used since it best represents gameplay mechanics.
GTA V 2560x1440 Ultra Settings 4xMSAA = 79 fps-hz average / 54min

The Witcher 3 also happens to be one of the most visually stunning as well. This benchmark sequence has us riding through a town and running through the woods; two elements that will likely take up the vast majority of in-game time.
Witcher 3 2560x1440 Ultra, HairWorks OFF, AA On = 89.94 fps-hz average/75 min

For this benchmark we complete a run-through from within a town, shoot up a vehicle to test performance when in combat and finally end atop a hill overlooking the town. Note that VSync has been forced off within the game's .ini file. We are now also using the High Resolution Texture Pack add-on.
Fallout 4 2560 x1440 , Ultra, High res textures, HBAO+, TAA = 88.36 fps-hz average/56 min

----------------------------------------------------
That's still not extremely far off from 100fps-hz average at 80 or 90 avg. You could dial down/off the AA and run on very high or very high+/"ultra minus" (custom) settings as necessary.

Simpler games like overwatch get 185 avg / 164 min :)
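To put those averages in frame-time terms: a 100fps-hz average means each frame has a 10ms budget, and the shortfall from the benchmark numbers above is easy to quantify. A quick sketch (the function names are just illustrative):

```python
# Frame-time budget: averaging N fps means each frame must render in 1000/N ms.
def frame_budget_ms(target_fps):
    return 1000.0 / target_fps

# How much more throughput (in %) is needed to lift an average to the target.
def shortfall(avg_fps, target_fps=100):
    return (target_fps / avg_fps - 1) * 100

print(frame_budget_ms(100))      # 10.0 ms per frame at 100fps
print(round(shortfall(79), 1))   # GTA V's 79fps avg needs ~26.6% more throughput
```

That ~27% gap is roughly what the post suggests recovering by dialing back AA and a couple of "ultra" settings.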
 
elvn, yeah it's tricky.

The last time I built was Christmas 2013 (system in sig though only running at 4.2GHz - shitty clocker). I went all out and went 780Ti-SLI. Expensive but the cards have lasted me well...still running them today.

However, for the cost of the 2 x 780Ti, I could have just got one when I built and then picked up a single 980Ti 18 months later or whatever it was. Then a year and a bit after that, upgraded to a single 1080Ti.

I asked this in another thread, but it seems to me that, in general, the "new" Ti model beats out the "old" Ti model in SLI (or is at least sufficiently close in performance for it to be considered a sideways move).

Then there's the issue of SLI support. Now, in my experience I've not really suffered too many issues; sure, the odd game like Arkham Knight can be problematic. However, it seems there's a lot of anti-multi-GPU sentiment around, and a lot of people are saying SLI support is only going to get worse with the advent of DX12.

So maybe I should ditch SLI, go single 1080Ti and then see what happens in the next gen.
 
I ordered 2 GTX 1080Tis but ended up selling the second one. They haven't come yet, but judging from the benchmarks I think a single card will be good for 4K.

SLI still has some support, but there are lots of games where there is no boost (and some where performance gets worse). I have 1080 SLI on another rig, and most of the time I leave it disabled rather than going through the trouble of enabling/disabling every time a game has shoddy support.

However, DX12 and Vulkan can use explicit mGPU, which allows developers themselves to optimize for multiple cards. In Rise of the Tomb Raider (for example) this new mGPU results in a massive 92% gain when using 2 GTX 1080Tis. But, I don't know how many developers will spend the time for these optimizations.
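For context on that Rise of the Tomb Raider number: a 92% gain over one card works out to 96% of ideal two-way scaling. A small sketch of the arithmetic (the function name is just illustrative):

```python
# Fraction of ideal linear scaling achieved by a multi-GPU setup.
# A 100% gain with 2 GPUs would be perfect 2x scaling.
def scaling_efficiency(gain_pct, n_gpus=2):
    speedup = 1 + gain_pct / 100
    return speedup / n_gpus

print(round(scaling_efficiency(92), 2))  # 0.96 -> 96% of ideal for 2x 1080Ti
```

Typical driver-side SLI profiles land well below that, which is why explicit mGPU results like this one stand out.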
 
Indeed, those Tomb Raider results are incredible.

I've no need for Hi-Hz but I do want solid 4k@60Hz (give or take) at High or Very High settings.
 
In addition to other games in 1080 sli benchmark reviews from the past year, I'm inspired by unreal engine adding sli support and the fact that final fantasy XV pc development showcased 1080's in sli. I'm pretty sure the shadow of mordor sequel I'm greatly looking forward to, "Middle Earth: Shadow of War", will support sli as well. I still have a backlog, including GTAV which I got on sale, for that matter too ;) .

If you had a race car with an engine that could open up to much higher speed tiers, but it was relegated to using its boost on only a particular 80% of the most popular roads, and at 30% to 70% boost or more, would it still be worth it? I hate using car analogies but there ya go. :wtf:
Really it's about achieving 100fps-hz average or better for high hz monitors.

I think a single 1080ti at 1440p is strong, but to get a 100fps-hz average on some of the most demanding games you might have to dial down to very high+/"ultra minus". Even on a high ppi 4k you could run 2560x1440 upscaled, which generally upscales very nicely on a 4k screen from what I've heard. There is no hope of hitting a 100fps-hz average with any kind of decent graphics settings at native 4k resolution, so to get any benefit out of the high hz on the upcoming dp 1.4 144hz 4k monitors you'll have to do sli or upscale 1440p.
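For a sense of the savings from upscaling: 2560x1440 is exactly 4/9 the pixel count of 3840x2160, though note the scale factor is a non-integer 1.5x per axis. Quick arithmetic:

```python
# Pixel-count comparison of rendering at 1440p vs native 4k.
qhd = 2560 * 1440    # 3,686,400 pixels
uhd = 3840 * 2160    # 8,294,400 pixels

print(qhd / uhd)     # ~0.444 -> 1440p renders only ~44% of the pixels of 4k
print(3840 / 2560)   # 1.5 -> each axis is scaled by a non-integer 1.5x
```

Less than half the shading work per frame, at the cost of the scaler interpolating across that 1.5x factor.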
 
Pardon me for being lazy, but does the FE 1080ti throttle at all? I'm holding out for an AIB card, but if the FEs don't really throttle it's somewhat pointless.

Found the answer to my own question, it throttles, lame..
 
elvn, yeah it's tricky.

The last time I built was Christmas 2013 (system in sig though only running at 4.2GHz - shitty clocker). I went all out and went 780Ti-SLI. Expensive but the cards have lasted me well...still running them today.

However, for the cost of the 2 x 780Ti, I could have just got one when I built and then picked up a single 980Ti 18 months later or whatever it was. Then a year and a bit after that, upgraded to a single 1080Ti.

I asked this in another thread, but it seems to me that, in general, the "new" Ti model beats out the "old" Ti model in SLI (or is at least sufficiently close in performance for it to be considered a sideways move).

Then there's the issue of SLI support. Now, in my experience I've not really suffered too many issues; sure, the odd game like Arkham Knight can be problematic. However, it seems there's a lot of anti-multi-GPU sentiment around, and a lot of people are saying SLI support is only going to get worse with the advent of DX12.

So maybe I should ditch SLI, go single 1080Ti and then see what happens in the next gen.

DirectX 12 puts SLI optimization and implementation on the game developers rather than NVIDIA or AMD. That's what DX12 changes. Vulkan now has multi-GPU support, and as Vulkan gains popularity we may see multi-GPU support improve. Also, NVIDIA and AMD will do what they can to urge game developers to implement multi-GPU technologies; it benefits them to do so.

People who buy the fastest GPUs available are few and far between, and only a subset of those buy two. Still, it benefits them as they can definitely increase their sales of high end GPUs. Some people will buy lower or mid-range SLI configurations if the cost / performance ratio is right, so that's even more sales on the table. I think we are in a transitional period where the game developers are dropping the ball, but I see it as a temporary issue.

Again, I've run SLI since the first days of it and I've had very few issues with it over the years. Even now I see more benefit to having it than not. Hell, it's kept my system cruising along nicely at 7680x1600 and now 3840x2160 for almost two years. I skipped the GTX 1080 and Titan X (Pascal), and I'm finally making the upgrade now. I paid about $2200 total for my Titan Xs, and if I had been upgrading more frequently on a single card I'd never have gotten the performance I was after. Guys like me may be in the minority, but here's what NVIDIA leaves on the table by not making sure SLI continues to succeed:

My upgrade path looked like this:

GTX 780Ti 3-Way SLI > Titan X SLI > GTX 1080Ti SLI = $2,100 + $2,200 + $1,400 = $5,700

Upgrading from the GTX 780Ti 3-Way SLI and dropping to a single GPU would have looked like this:

GTX 780Ti 3-Way SLI > Titan X (Maxwell) > GTX 1080 > Titan X (Pascal) = $4,750

Even with SLI'ed Titan Xs holding me over for two years and skipping the GTX 1080 and Pascal Titan X, NVIDIA gets more money from me in the long run. If I hadn't bought into SLI and had stopped after the GTX 780Ti cards, I'd have purchased a high end card every single time one was released. That includes Titans. I'd skip cards like the GTX 980Ti or the GTX 1080Ti because I'd have bought a Titan first and not felt the "Ti" card was worth buying after already having close to the same performance for months by that point. If I didn't skip each card and in turn chased every ounce of performance whenever a "Ti" card was faster than an older Titan (factoring in overclocking headroom), it would be about as costly as what I've been doing.

GTX 780Ti 3-Way SLI > Titan X (Maxwell) > GTX 980Ti > GTX 1080 > Titan X (Pascal) > GTX 1080Ti = $5,400

Obviously, I'm not counting selling older GPUs and recouping some of my costs. That doesn't even matter to NVIDIA, considering they get the same amount of cash whether I do that or not. The point is that SLI generally brings tomorrow's performance today, with a few hiccups here and there. It benefits AMD and NVIDIA to make sure that multi-GPU is a viable option for both mid-range and high end card buyers. Frankly, I couldn't have experienced the performance I demand for gaming without going with SLI, so there is that. While the GTX 1080Ti may indeed be pretty capable by itself at 4K, that may not be true a few months from now.
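The upgrade-path totals above can be sanity-checked directly. The per-step breakdown is only given for the SLI path, so the other two totals are taken as stated in the post (variable names are just illustrative):

```python
# Tallying the upgrade paths described above (all dollar figures from the post).
sli_path = [2100, 2200, 1400]   # 780Ti 3-way SLI > Titan X SLI > 1080Ti SLI
sli_total = sum(sli_path)       # 5700
single_card_total = 4750        # 780Ti SLI > Titan X (M) > 1080 > Titan X (P), as stated
every_release_total = 5400      # chasing every high-end release, as stated

print(sli_total)                      # 5700
print(sli_total - single_card_total)  # 950 -> extra revenue from the SLI path
```

So the SLI buyer spends $950 more than the single-card skipper, and nearly as much as someone buying every release.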
 
That's what it looks like.. why I'm not itching to upgrade to a 4k screen just yet.

My 2 1080 Classifieds, water cooled and heavily overclocked, serve me just fine at 2560x1440 on my 165hz g-sync monitor.. just wish sli was better supported currently, which it's not. To do it all over again I'd pass on the sli.

So maybe next fall the next big card will do 4k well enough.


Wanna know something hilarious??


I just bought my first ever 4k monitor like an hour after posting this!!! Hahaha

Hope my 1080s do ok, or man, will my wife kill me if I end up spending $1400+ on 1080ti's sometime down the road.
 
You should be fine, I had two 1080s at one point and they did 4k very well. You'll just have to turn down the more demanding settings like AA and such.
 
Pardon me for being lazy, but does the FE 1080ti throttle at all? I'm holding out for an AIB card, but if the FEs don't really throttle it's somewhat pointless.

Found the answer to my own question, it throttles, lame..
I have not seen any review showing the card dropping below the base clock. Do you have a source?
 
The 1070/1080 will start a stair-step downclock from boost at 52 C. Does the 1080 Ti variant differ in this regard?
 
The 1070/1080 will start a stair-step downclock from boost at 52 C. Does the 1080 Ti variant differ in this regard?

There's not a single review showing the 1080 Ti below its boost clock (1582MHz).

Also, I never saw my regular 1080 drop below its boost clock either (1733MHz).
 
Okay, I was a bit imprecise with my language. Thanks to Boost 3.0, the 'boost' clock on a Pascal card is generally well above its claimed boost. For instance, my 1070 will automatically boost above 2GHz when it's nice and cool (i.e. below 50 C). Once it hits 52 C it starts to downclock. Once the fans kick in (which is at 60 C) it's running closer to 1980. Its advertised boost is 1860, which it has never, to my knowledge, gone below (as you correctly noted).
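The stair-step behavior described here can be sketched as a toy model. The 13MHz bin size and 5 C step interval below are illustrative assumptions for the sketch, not NVIDIA-published figures; only the ~52 C onset and the rated-boost floor come from the posts above.

```python
# Toy model of GPU Boost 3.0 temperature stepping: full boost while cool,
# then one clock bin shed per temperature interval past the onset point.
# step_mhz and step_interval_c are illustrative assumptions, not spec values.
def boosted_clock(temp_c, peak_mhz=2050, rated_boost_mhz=1860,
                  step_start_c=52, step_mhz=13, step_interval_c=5):
    if temp_c < step_start_c:
        return peak_mhz
    steps = 1 + (temp_c - step_start_c) // step_interval_c
    # Reviews don't show cards dropping below the rated boost clock.
    return max(rated_boost_mhz, peak_mhz - steps * step_mhz)

print(boosted_clock(45))   # 2050 -> full overclock while under 52 C
print(boosted_clock(60))   # 2024 -> a couple of bins down once the fans kick in
```

The point of the model: the "throttling" argued about in this thread is the sawtooth between peak and rated boost, not drops below the rated boost clock.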
 
You should be fine, I had two 1080s at one point and they did 4k very well. You'll just have to turn down the more demanding settings like AA and such.

Games still run pretty well at 4K on my Maxwell Titan X's. I do have to turn AA down and occasionally one or two other settings but I'm generally able to keep eye candy all the way up.
 
I was contemplating a hybrid kit for my FE, but after playing some games on it, I'm happy as is. It's quieter than my 2 980s, and it's crushing 1440p. Wildlands ultra was the only game I tested that was pretty brutal.

It'd be interesting to see an SLI vs single GPU IQ comparison on these games shipping with TAA. In Deus Ex, image quality was notably improved; in Fallout 4 I felt like there was less shimmer.
 
I run 4K with my 1080OC and rig in sig, and honestly I have no issues maintaining playable frames at 4K. The 1080Ti will just take the few games that aren't hitting 60 and get them way closer to 60.
 
There's not a single review showing the 1080 Ti below its boost clock (1582MHz).

Also, I never saw my regular 1080 drop below its boost clock either (1733MHz).

You're defending the card in the wrong manner; it does throttle with "Boost fail.0" like every other Pascal card, thanks to the voltage stepping. I couldn't care less what the base clock is, I want my OC to stick, period. Hence I guess I will be waiting for a decent AIB card.
 
You're defending the card in the wrong manner; it does throttle with "Boost fail.0" like every other Pascal card, thanks to the voltage stepping. I couldn't care less what the base clock is, I want my OC to stick, period. Hence I guess I will be waiting for a decent AIB card.
I have a RF 1080 and have it OC'd to 2050MHz; it sits right around that and rarely ever drops below 2GHz. Now, I used EVGA Precision XOC to overclock, and I run the program that tests the card out at each OC interval.
 
I have a RF 1080 and have it OC'd to 2050MHz; it sits right around that and rarely ever drops below 2GHz. Now, I used EVGA Precision XOC to overclock, and I run the program that tests the card out at each OC interval.

You're still proving my point. AB lets you micromanage the voltage stepping better than Precision X, but it's still going to dip, especially in a demanding situation over an extended period of time. Plus, keeping my room from getting sweltering hot is also a good thing.
 
You're still proving my point. AB lets you micromanage the voltage stepping better than Precision X, but it's still going to dip, especially in a demanding situation over an extended period of time. Plus, keeping my room from getting sweltering hot is also a good thing.
I don't see how I proved your point at all.
 
Got my 1080ti in today. Here are some quick comparisons vs my OC'ed 980ti. The 1080ti is currently at +105Mhz on the core (for max of 1990) but thermal throttling is keeping it in the mid-1700s. I'll probably put a hybrid cooler on it sooner rather than later. Even with the thermal throttling, the 1080ti is a BEAST.

Firestrike Ultra
[benchmark screenshots]

Gears 4
[benchmark screenshots]

Rise of the Tomb Raider
All settings maxed, SMAAx1, DX12

980ti
[benchmark screenshot]

1080ti
[benchmark screenshot]
 
I don't see how I proved your point at all.

My point was your max OC isn't going to stick, and as you said in your post, it "sits right around that and rarely ever drops below 2ghz". You're not pegged at 2050; I don't really know what more to do other than draw a map in crayon.
 
My point was your max OC isn't going to stick, and as you said in your post, it "sits right around that and rarely ever drops below 2ghz". You're not pegged at 2050; I don't really know what more to do other than draw a map in crayon.
No overclock on a GPU is going to hold perfectly steady; we are literally talking about a fluctuation of 50MHz.
 
Well, the rumors were true about AIB 1080Ti's featuring dual HDMI (good for us VR headset owners), dual DP, and DVI for all you slackers with Korean 1440p's that haven't upgraded to 4K yet. "Leaked" STRIX 1080Ti info:

[leaked STRIX 1080Ti photos]
 
How is the sound on these FE suckers?

If you leave them at the stock clocks they're pretty damn quiet. If you get aggressive with the OC and fan curve they are very audible. I have mine set to max out at 70% speed. With my headphones on I can't hear them while gaming, but the noise is there. If you're really worried about noise I would wait for a card with aftermarket cooling, or just plan on replacing the FE cooler with an AIO water cooler at some point (EVGA Hybrid, etc.)
 
Every time you post, I further prove my point. I guess I can put the crayons away. It was fun.
Seriously, you claimed the FE was a lousy overclocker; it isn't, but keep thinking somehow it is. Not a single card is going to lock its OC into place and never see minor fluctuations.
 
What tool is required to remove the stock FE cooler on 1080 Ti?

From the looks of the gamersnexus video, a tiny 6pt socket?

Edit: noticed you commented on OC.net. When my kit comes in, I'll pull my card apart. If by then no one responds, I'll shoot some photos.
 
As an Amazon Associate, HardForum may earn from qualifying purchases.