Going from 2560x1080 to 2560x1440 has had a bigger impact on performance than anticipated.

viivo
Last month I bid farewell to my ultrawides and went back to 16:9. What I did not consider was how performance would be affected, and it certainly never occurred to me that some games would become unplayable.

For example, in Kingdom Come at 2560x1080 I was able to maintain a constant 100fps with most settings on medium or high (except shaders). Now at 1440 I can't even get a consistent framerate over ~65 (with frequent dips into the 40s) with everything on low. Same for Andromeda and Far Cry 5. My hardware is getting older (1070, 8700k, 16GB C15 3000), but such a big hit for a relatively small increase in vertical resolution alone is shocking and discouraging.

Has anyone experienced something similar?
 
That "small" increase is actually 33% more pixels you are pushing so I don't think the performance drop you are seeing is out of place. The performance hit pushing more pixels is not linear in most cases...
 
A bit disingenuous. The 33% is on the vertical axis, not the total, hence the modifier "relatively."
 
I've noticed the same thing, but in the other direction. I went from 2560x1440 to 2560x1080, and performance is much better.
 
That "small" increase is actually 33% more pixels you are pushing so I don't think the performance drop you are seeing is out of place. The performance hit pushing more pixels is not linear in most cases...
If it were the consistent bottleneck, the scaling would be close to linear. What I think might have happened is that it only just started bottlenecking (on fill rate, memory bandwidth, or shader throughput) somewhere between the old and new resolution, and the resolution/performance slope past that point is a lot steeper than he expected.
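To put rough numbers on that idea, here's a toy model where one resource is nearly free below a threshold and expensive past it. Every constant is invented and only tuned to roughly mimic the 100fps to ~65fps drop described above; real GPUs are obviously far more complicated:

```python
# Toy model: a single resource (fill rate, bandwidth, whatever) that "only
# just" becomes the bottleneck. Below the threshold extra pixels are nearly
# free; above it every extra megapixel costs a lot. All numbers are invented.
BASE_MS = 10.0               # hypothetical frame time while under the threshold
CAPACITY_MPIX = 3.0          # hypothetical saturation point, in megapixels
OVERFLOW_MS_PER_MPIX = 8.0   # hypothetical cost per megapixel past saturation

def fps(width, height):
    mpix = width * height / 1e6
    frame_ms = BASE_MS + max(0.0, mpix - CAPACITY_MPIX) * OVERFLOW_MS_PER_MPIX
    return 1000.0 / frame_ms

for w, h in [(2560, 1080), (2560, 1440)]:
    print(f"{w}x{h}: {fps(w, h):.1f} fps")
# 2560x1080: 100.0 fps  (still under the threshold)
# 2560x1440: 64.6 fps   (33% more pixels, ~35% fewer fps)
```

The exact shape depends on which resource saturates, but the takeaway is the same: once you cross that line, a "small" resolution bump can cost a lot more than the pixel count alone suggests.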

My hardware is getting older (1070, 8700k, 16GB C15 3000)
That's not old at all. Your video card may not be able to keep up with your expectations, but it's modern, and 1070 is still considered the price/performance sweet spot. The processor is nearly top-of-the-line and you can't do much better.
 
Yeah that's kinda why I've stayed at 1080 so long...that and my monitor is still working perfectly. Every once in a while I can see some jaggies that I wouldn't at 1440 but it's rare.

I do think visually 1440 is the perfect resolution for a 27" gaming screen but I'd rather have 120+ fps in all my games.
 
Send that display back and buy one with gsync and you won’t care a lick about 45-65 FPS because it’ll feel very nearly as smooth as 100fps.
G-Sync helps, but it's not going to magically make 45 fps look like 100 fps. It just removes the V-Sync related stutter. 45 fps is still 45 fps.
 
Yeah. 2560x1440 can run at close to half the framerate of 1080p, and 4K roughly halves it again.
 
G-Sync helps, but it's not going to magically make 45 fps look like 100 fps. It just removes the V-Sync related stutter. 45 fps is still 45 fps.
I contend it does, and I'm certainly not alone. Go back and look at G-Sync reviews. Nearly all of them mention that, apart from removing screen tearing, it just makes gameplay feel smoother, especially as the frame rate climbs above the absolute minimum of the technology. With FreeSync or G-Sync it's smoother at any particular frame rate (within the spec of the tech) than without FreeSync or G-Sync. I've used both, and gaming without it has become very off-putting. I even sold off my perfectly adequate gaming laptop and bought one with G-Sync so I could keep the smoothness I'd become accustomed to with my desktop. When I play on a monitor without it anymore, I can tell right away.

I still keep a frame counter up, and I never even notice things feeling any different until the low 40s. Without FreeSync or G-Sync I used to be able to tell whenever things deviated from 60fps.

I play Hunt Showdown often. It runs at an average of about 60FPS on my system with all graphics options maxed, occasionally dropping into the 40s. With G-Sync it feels buttery smooth down to about 42FPS IMO. When it doesn't work right in a game, like in Path of Exile, I can immediately tell. :(
 
Chances are pretty good that, depending on the game(s) you play, there are quality settings you won't notice if you drop them for more FPS. For example, since you are playing at a higher resolution you can drop anti-aliasing from 8x to 4x, or whatever your AA settings are. There are also settings like draw distance, anisotropic filtering (texture sharpness at a distance), and shadow quality that are harder to really notice unless you are looking for them.

All your hardware seems pretty good for 2560x1440 gaming. You might want to double-check chipset drivers and video drivers, and potentially overclock your graphics card.

I also reinstalled WoW recently because even at the absolutely lowest settings I was getting bad FPS. There must have been something weird in an addon or who knows what, but after a fresh reinstall of the game everything was fast and smooth with the same addons also freshly installed. So reinstalling your game(s) might also help.
 
I really need to check out 1440p and g-sync monitors to see this tech.

Last year I bought a Samsung C27FG70: 1080p, 27", VA, Q-dot, 144Hz. For text it looks fine to me, though most complain you need at least 1440p.

A few weeks ago I picked up a 1070 Ti and it smashes the 1080p games that I play. The exception: I tried DSR 4x with no AA (4K res downsampled) and love it, except for the performance hit. Maybe 1440p looks as good without AA and with less of a performance hit than DSR 4x.
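(For reference, the DSR factor multiplies the total pixel count, so 4x is 2x on each axis. A quick calc with the resolutions from this thread, my own arithmetic:)

```python
# DSR factor scales the pixel count, so each axis scales by sqrt(factor).
from math import sqrt

def dsr_resolution(width, height, factor):
    scale = sqrt(factor)
    return round(width * scale), round(height * scale)

print(dsr_resolution(1920, 1080, 4.0))  # (3840, 2160), i.e. the "4K downsampled" above
print(dsr_resolution(2560, 1080, 4.0))  # (5120, 2160), the 5K2K mentioned further down
```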
 
Archaea Everyone is a little different. I'm highly sensitive to motion, and in fact can notice a difference going from 144Hz to 166Hz on my new monitor (both G-Sync).

I honestly don't want to play below 90Hz if I can help it, though I still game on a 4K TV at 60Hz; it's a trade-off.
 
Last year I bought a Samsung C27FG70: 1080p, 27", VA, Q-dot, 144Hz. For text it looks fine to me, though most complain you need at least 1440p.

Yeah, I recently got a 1080p ultrawide. Aside from the desktop feeling cramped up in the real estate department, the resolution still looks good. Obviously 1440p is better, but 1080p is certainly still HD and looks nice.

A few weeks ago I picked up a 1070 Ti and it smashes the 1080p games that I play. The exception: I tried DSR 4x with no AA (4K res downsampled) and love it, except for the performance hit. Maybe 1440p looks as good without AA and with less of a performance hit than DSR 4x.

I tried DSR when it first came out on a 1080p monitor. It definitely looked nice, but the performance hit was too much for me. I also ended up going Surround 7680x1440, so there was no need for DSR as performance was already a huge struggle and it looked great. I wouldn't mind testing it out again on my 1080p ultrawide just to see. I can report back if you want some comparisons.
 
So I just tried DSR again at 5K2K resolution. It looks amazing. While not quite the detail of native 4K, it honestly looked pretty close and better than I was expecting.

I tried Far Cry 5, Shadow of the Tomb Raider, Dirt Rally, Half-Life 2, Left4Dead. It was a nice improvement for sure. I had it on 4x so 5120x2160 w/ 10% smoothing.

Granted, there was a big performance hit, but I was still able to stay above 90 fps in SotTR and FC5. Dirt Rally and the older games were getting 164 fps locked (my frame limit).

I think native 1440p with decent AA would still be better overall, but with DSR you could get nicer image quality with your current monitor and it's a great option to have.

And since running below native res usually looks bad, having a lower-res native monitor can actually be more flexible once DSR is in the equation.

Will have to check this out more, but so far I'm really impressed, particularly with old games like HL2 and L4D, where running at 5K2K at 166Hz is viable.
 