My fellow 1080ti owners, have we achieved good-enough performance for 4k gaming?

zamardii12
2[H]4U
Joined: Jun 6, 2014
Messages: 3,409
I was kind of hoping to get some feedback from single 1080 Ti owners with 4K monitors as to whether it's worth it yet, or whether I should wait for the next generation of cards before upgrading to a 4K monitor.

The story here is that I had a 1080 FE before I upgraded to the card in my sig, and I bought a 4K monitor when I first put the system together. While some games, like Doom, were fantastic in 4K in terms of performance and graphics quality, others such as Crysis 3 fell to their knees. I also didn't like how anything but the native 4K resolution made the image blurry, so I sold that monitor and bought a 1440p display.

I have a 35" 3440x1440 ultrawide coming in the mail on Tuesday, but now that we all have our 1080 Tis I was wondering how 4K performs on your systems. Yes, I have already looked at benchmarks everywhere, but I was more curious about real-world performance. I know that even the best 4K G-Sync monitors are limited to 60 Hz at 4K, but that notwithstanding, I was curious how performance is for everyone now. If anyone upgraded from a 1080 to a Ti, then that would be an even better comparison of the improvement.
 
This is probably going to blow your mind, but you can always turn down in-game settings to get a smooth experience at 4K. I've played a lot of games at 4K using a lowly 1070, and I didn't turn into a pumpkin or get a brain disease from it. At this point I'd much rather keep playing at 4K than run higher in-game settings at a lower resolution.
 
On the flip side... being able to play 4K games with max settings is pretty sick. The 1080 Ti has wicked performance, there is no doubt about it.
 
I was kind of hoping to get some feedback from single 1080 Ti owners with 4K monitors as to whether it's worth it yet, or whether I should wait for the next generation of cards before upgrading to a 4K monitor.

The story here is that I had a 1080 FE before I upgraded to the card in my sig, and I bought a 4K monitor when I first put the system together. While some games, like Doom, were fantastic in 4K in terms of performance and graphics quality, others such as Crysis 3 fell to their knees. I also didn't like how anything but the native 4K resolution made the image blurry, so I sold that monitor and bought a 1440p display.

I have a 35" 3440x1440 ultrawide coming in the mail on Tuesday, but now that we all have our 1080 Tis I was wondering how 4K performs on your systems. Yes, I have already looked at benchmarks everywhere, but I was more curious about real-world performance. I know that even the best 4K G-Sync monitors are limited to 60 Hz at 4K, but that notwithstanding, I was curious how performance is for everyone now. If anyone upgraded from a 1080 to a Ti, then that would be an even better comparison of the improvement.

The answer is no. A single GeForce GTX 1080 Ti cannot offer you maximum visual quality at a steady 60 FPS in all current games. You can get playable performance right now out of almost everything, if not everything, at 4K, but you might have to turn down a few options to help. Of course, some options have more visual impact than others, while some options with minor visual impact can have a big impact on frame rates. And as new games come out, the card may end up feeling dated really fast. That depends on what's on the horizon and what those newer games look like. Some of you guys keep up with upcoming games better than I do. I don't get on the hype train until things are closer to release and the developer actually has something to show me.

Running games at non-native resolution on any LCD will make the image blurry. There is no getting around this.

This is probably going to blow your mind, but you can always turn down in-game settings to get a smooth experience at 4K.

No. I have never considered reducing visual quality an acceptable solution to achieve greater performance in games. If technology exists to allow the game to run better, I will get it. If I can't, I'll shelve the game until such time as I can experience it properly. Crysis was probably the only game where I really had to reduce my settings or tolerate a shit frame rate. I won't do it again. Outside of Crysis, I've always been able to get playable performance on day one at my monitor's resolution, and I've been on 2560x1600 since 2006 or 2007, so I haven't ever had to deal with 1080p. I've always had to buy dual GPUs or more to get good performance out of the most modern titles. I've also had to upgrade virtually every generation, with rare exceptions. Obviously, throwing more money at the performance problem isn't for everyone. I realize that, and as such, my statement is my own opinion. I suspect the OP shares a similar opinion on the matter.

At this point I'd much rather keep playing at 4K than run higher in-game settings at a lower resolution.

We will have to disagree on this point.
 
No. I have never considered reducing visual quality an acceptable solution to achieve greater performance in games. If technology exists to allow the game to run better, I will get it. If I can't, I'll shelve the game until such time as I can experience it properly. Crysis was probably the only game where I really had to reduce my settings or tolerate a shit frame rate. I won't do it again. Outside of Crysis, I've always been able to get playable performance on day one at my monitor's resolution, and I've been on 2560x1600 since 2006 or 2007, so I haven't ever had to deal with 1080p. I've always had to buy dual GPUs or more to get good performance out of the most modern titles. I've also had to upgrade virtually every generation, with rare exceptions. Obviously, throwing more money at the performance problem isn't for everyone. I realize that, and as such, my statement is my own opinion. I suspect the OP shares a similar opinion on the matter.

I mostly agree with this, but there are some exceptions. There are a lot of "gimmicky" visual effects that will tank performance and not really add anything to the game. Take "Ultra God Rays" in FO4 and "HairWorks" in The Witcher 3. Those kill performance but add almost nothing to the visual experience of the game.
 
I agree with that. There are a few visual options that you can turn off in some games that add nothing, or almost nothing, to the experience. I'm fine with disabling or adjusting those values for increased performance.
 
The answer is no. A single GeForce GTX 1080 Ti cannot offer you maximum visual quality at a steady 60 FPS in all current games. You can get playable performance right now out of almost everything, if not everything, at 4K, but you might have to turn down a few options to help. Of course, some options have more visual impact than others, while some options with minor visual impact can have a big impact on frame rates. And as new games come out, the card may end up feeling dated really fast. That depends on what's on the horizon and what those newer games look like. Some of you guys keep up with upcoming games better than I do. I don't get on the hype train until things are closer to release and the developer actually has something to show me.

Running games at non-native resolution on any LCD will make the image blurry. There is no getting around this.



No. I have never considered reducing visual quality an acceptable solution to achieve greater performance in games. If technology exists to allow the game to run better, I will get it. If I can't, I'll shelve the game until such time as I can experience it properly. Crysis was probably the only game where I really had to reduce my settings or tolerate a shit frame rate. I won't do it again. Outside of Crysis, I've always been able to get playable performance on day one at my monitor's resolution, and I've been on 2560x1600 since 2006 or 2007, so I haven't ever had to deal with 1080p. I've always had to buy dual GPUs or more to get good performance out of the most modern titles. I've also had to upgrade virtually every generation, with rare exceptions. Obviously, throwing more money at the performance problem isn't for everyone. I realize that, and as such, my statement is my own opinion. I suspect the OP shares a similar opinion on the matter.



We will have to disagree on this point.

I appreciate the response, Dan, and that was the point Ocellaris missed: I don't WANT to lower quality settings in games. That's the reason we spend $700 on video cards, among other things. Since I am in the market for a monitor right now, I wanted to know how much better the 1080 Ti is for 4K, because I sold my previous 4K monitor after realizing my 1080 FE just wasn't cutting it. I have a 35" 3440x1440 ultrawide 100 Hz G-Sync monitor on order now, but I was also considering one of the Asus 4K G-Sync monitors (which only top out at 60 Hz at 4K anyway) to better future-proof myself for the inevitable updated GPUs that will run 4K better. Honestly, though, I think the time of maxed-out 4K graphics at 120 Hz is still a ways off, maybe even a couple of years away, so I think I'll be happy with my 3440x1440 ultrawide for the time being.
 
You'd probably be happy with a single 1080 Ti at 4K most of the time, but there are some games where it isn't enough to get the job done. While fine most of the time today, I suspect that's going to change rapidly over the next couple of years. I'd also wager that single-GPU guys will probably be happier going with Volta and skipping the 1080 Ti if they can.
 
This is probably going to blow your mind, but you can always turn down in-game settings to get a smooth experience at 4K. I've played a lot of games at 4K using a lowly 1070, and I didn't turn into a pumpkin or get a brain disease from it. At this point I'd much rather keep playing at 4K than run higher in-game settings at a lower resolution.

The reason I go for 4K resolution is that I want the best visual quality. I don't want a smooth experience if that means lowering the settings.

I'd say the 1080 Ti is close. Maxed-out Wildlands is still at about 30-35 fps :/
But I noticed much better performance with my 1080 Ti compared to 980 Ti SLI. Average FPS is the same, but there's A LOT less stuttering.
Besides Wildlands, though, I don't think there's any game out there that's not smoothly playable at max details. I'd say the 2080 Ti/next Titan will be perfect for 4K@60Hz gaming; however, that's also when the new HDMI/DP standards are coming out and we will want 4K@120Hz... so yeah, gotta wait for that 5080 Ti/6080 Ti :D
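To put rough numbers on the cable side of that (a quick Python sketch, assuming plain 8-bit RGB and ignoring blanking overhead and compression):

# Uncompressed video bandwidth estimate: width * height * refresh * bits per pixel.
def gbps(w, h, hz, bpp=24):
    return w * h * hz * bpp / 1e9

print(f"4K @  60 Hz: {gbps(3840, 2160, 60):.1f} Gbit/s")   # ~11.9
print(f"4K @ 120 Hz: {gbps(3840, 2160, 120):.1f} Gbit/s")  # ~23.9
# HDMI 2.0 carries roughly 14.4 Gbit/s of video data and DisplayPort 1.4
# roughly 25.9 Gbit/s, so 4K@120 with full 8-bit RGB needs DP 1.4 or a
# newer HDMI revision.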
 
The reason I go for 4K resolution is that I want the best visual quality. I don't want a smooth experience if that means lowering the settings.

I'd say the 1080 Ti is close. Maxed-out Wildlands is still at about 30-35 fps :/
But I noticed much better performance with my 1080 Ti compared to 980 Ti SLI. Average FPS is the same, but there's A LOT less stuttering.
Besides Wildlands, though, I don't think there's any game out there that's not smoothly playable at max details. I'd say the 2080 Ti/next Titan will be perfect for 4K@60Hz gaming; however, that's also when the new HDMI/DP standards are coming out and we will want 4K@120Hz... so yeah, gotta wait for that 5080 Ti/6080 Ti :D

Is your monitor G-Sync capable? SLI does not support G-Sync. 30-35 FPS on SLI is unplayable.

To answer the OP's question: I cannot even max out graphics details at 1600p with a 1080 Ti in certain games and hold 60 FPS (e.g. Assassin's Creed Syndicate, or Watch Dogs 2 at certain geometry detail percentages), let alone 4K.
 
4K monitors aren't there yet. Until we see a 32" or bigger 144 Hz 4K monitor, I think it is a bit pointless to play in 4K. At 60 Hz, my eyes would probably spontaneously combust due to the low refresh rate.

Right now, the two options are 34" 3440x1440 @ 100 Hz with G-Sync or 27" WQHD @ 165 Hz. Anything other than that for gaming is pretty much nonsensical, imo.
 
4K monitors aren't there yet. Until we see a 32" or bigger 144 Hz 4K monitor, I think it is a bit pointless to play in 4K. At 60 Hz, my eyes would probably spontaneously combust due to the low refresh rate.

Right now, the two options are 34" 3440x1440 @ 100 Hz with G-Sync or 27" WQHD @ 165 Hz. Anything other than that for gaming is pretty much nonsensical, imo.

Weird, I have been gaming at 4K@60Hz for three years now and there's nothing wrong with my eyes.
You "MUST BE OVER 100HZ OR NOT WORTH IT" people are sometimes ridiculous. There's absolutely nothing wrong with gaming at 60 Hz and 4K if you don't play kids' games (CS and other shooters).
 
My basic 1080 handles almost all the games I play at 4K with max settings pretty well. If it doesn't, I turn some of the settings down. For me, there isn't much of a difference between ultra and high settings in most cases. Even in VR land, a single 1080 is great for most of today's games.

If you are dead set on running 4K with maxed-out game settings, why not go with the new Titan instead of a Ti?
 
My basic 1080 handles almost all the games I play at 4K with max settings pretty well. If it doesn't, I turn some of the settings down. For me, there isn't much of a difference between ultra and high settings in most cases. Even in VR land, a single 1080 is great for most of today's games.

If you are dead set on running 4K with maxed-out game settings, why not go with the new Titan instead of a Ti?

It's 2-4% faster for twice the price... That's probably why.
 
I'm reading 10 to 11% faster on various sites, but we all know the only review site that matters is this one. ;)

But even if it's 10 to 11% faster, you are not totally off; it's about 40% more expensive, depending on the 1080 Ti you are looking at. Either way, I agree with you. Even if it's 11% faster, it may not be worth the 40-50% price jump for most folks. But he did have a basic 1080 FE, and a 1080 FE to a Titan Xp is a pretty big jump and may put him in a spot that he's happy with, assuming the budget is there.
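Rough perf-per-dollar math on that, using the ~11% figure above and assumed prices ($699 for a founders 1080 Ti, $850 for a premium partner card, $1200 for the Titan Xp):

# Relative performance per dollar; prices are assumptions for illustration.
cards = {
    "1080 Ti FE ($699)":      (1.00, 699),
    "1080 Ti partner ($850)": (1.00, 850),
    "Titan Xp ($1200)":       (1.11, 1200),
}
for name, (perf, price) in cards.items():
    print(f"{name:24s} perf per $1000 = {perf / price * 1000:.2f}")
# The Titan lands roughly 40-70% more expensive for ~11% more performance,
# depending on which 1080 Ti you compare it against.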
 
I think so. As stated earlier, all settings on MAX doesn't always translate into better visuals. I recently went with a 4K KU7500 and an Aorus 1080 Ti and won't look back. Very much playable @ 60 fps. Go for it!
 
There is no real point in using the added resolution to better define low-quality details. If the extra pixels just reveal blurry textures or low-resolution effects, what's the point?
 
There is no real point in using the added resolution to better define low-quality details. If the extra pixels just reveal blurry textures or low-resolution effects, what's the point?

Overall visual clarity. I'd much rather see more object detail when moving than more texture detail. I generally don't stare at wall textures when playing games.
 
4K @ 60 Hz with a single Ti here too. While I can't max every game, I can play them with most of the important settings high enough for the games to look awesome.

Can't beat the added resolution imo.

No eye issues here either. Haha.
 
It works, and most games are good enough. My 4K screen is a work one though (I find a 32" better than dual monitors), and even though my PC is basically as fast as you can get (5.1 GHz 7700 and an OC'd 1080 Ti), I'd have gotten G-Sync if I could have, just for smoothness. I regret it, but I'm holding out for the 144 Hz screens later in the year and can't justify a 60 Hz one for six months. Generally I'll be a lot happier to get back to 90 Hz+ with adaptive sync.

So yeah, it'll be up there or thereabouts. I haven't found anything where I can't get 60 fps *most* of the time with a few settings changes, but I'm only a few days in.
 
I would say it is the bare minimum for 60 fps. If you are looking for constant high settings at 4K, you are probably best off waiting to see what Volta has to offer. The added bandwidth of faster GDDR6 should really help in the 4K realm.
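Rough numbers on that, assuming a card with the 1080 Ti's 352-bit bus and the 14-16 Gbps GDDR6 parts that have been announced (what Volta actually ships with is anyone's guess):

# Memory bandwidth = (bus width in bits / 8) * per-pin data rate in Gbps.
def bandwidth_gb_s(bus_bits, gbps_per_pin):
    return bus_bits / 8 * gbps_per_pin

print(f"352-bit GDDR5X @ 11 Gbps (1080 Ti): {bandwidth_gb_s(352, 11):.0f} GB/s")  # 484
print(f"352-bit GDDR6  @ 14 Gbps          : {bandwidth_gb_s(352, 14):.0f} GB/s")  # 616
print(f"352-bit GDDR6  @ 16 Gbps          : {bandwidth_gb_s(352, 16):.0f} GB/s")  # 704

That's a 27-45% jump in raw bandwidth on the same bus width, which is the kind of headroom 4K tends to eat up.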
 