Are display resolutions outpacing video cards?

Ihaveworms

Ukfay Ancerkay
Joined
Jul 25, 2006
Messages
4,630
It seems that in the past year or so there has been a big push for high-resolution displays, to 4K and beyond. Now that people are using 4K, 5K, or even higher resolutions, how are games going to be able to play at acceptable frame rates on these things? I feel like video card performance has fallen behind what these new resolutions demand, and the workaround is to throw more video cards at the problem. I can't see myself upgrading past 1440p in the near future until there has been a big increase in high-resolution performance.
 

The framerates in general are decent, just not what you are used to at lower resolutions. With the new APIs (Mantle/DX12/Vulkan) we're getting another boost: video cards no longer have to be used in Alternate Frame Rendering mode, which should make the transition to 4K somewhat easier as well.

At 4K resolution you don't need as many of the framebuffer options (like AA) that you use now.
 

To answer your question simply, yes.

Here are some reviews that specifically address 4K resolution gaming. All our editors have 4K displays now to test with.

NVIDIA GeForce GTX 970 SLI 4K and NV Surround Review
http://www.hardocp.com/article/2014/11/19/nvidia_geforce_gtx_970_sli_4k_nv_surround_review/

NVIDIA GeForce GTX 980 SLI 4K Video Card Review
http://www.hardocp.com/article/2014/10/27/nvidia_geforce_gtx_980_sli_4k_video_card_review/

GTX 780 OC 6GB vs. R9 290X 4GB Overclocked at 4K
http://www.hardocp.com/article/2014/07/23/gtx_780_oc_6gb_vs_r9_290x_4gb_overclocked_at_4k/

AMD Radeon R9 295X2 and XFX R9 290X DD TriFire Review
http://www.hardocp.com/article/2014/05/13/amd_radeon_r9_295x2_xfx_290x_dd_trifire_review/

XFX R9 290X Double Dissipation Edition CrossFire Review
http://www.hardocp.com/article/2014/01/26/xfx_r9_290x_double_dissipation_edition_crossfire_review/
 

To be expected.
Gfx cards don't magically change their power to cope with all scenarios.
Best thing for you is to stick at your current res until graphics cards are capable.

It seems like you aren't happy with the pace of progress.
 
It's true, current GPUs simply don't have the grunt to run higher settings at the native resolutions these displays are running. However, one advantage these displays have is that both 4K and 5K are a full pixel doubling of more historically typical resolutions (1080p and 1440p respectively). As a result you can drop down to 1/4 of the native resolution and not have to deal with interpolation or an aggressively ugly low resolution.
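For what it's worth, here's a quick back-of-the-envelope sketch in plain Python (purely illustrative, nothing vendor-specific) of that quarter-resolution fallback; halving each axis of the panel lands exactly on the older resolution:

# Quarter-resolution fallback for pixel-doubled panels: halving each axis
# of 4K/5K lands exactly on 1080p/1440p, so no fractional scaling is needed.
panels = {"4K (3840x2160)": (3840, 2160), "5K (5120x2880)": (5120, 2880)}

for name, (w, h) in panels.items():
    qw, qh = w // 2, h // 2   # half of each dimension = a quarter of the pixels
    print(f"{name} -> quarter-res render target: {qw}x{qh}")

# 4K (3840x2160) -> quarter-res render target: 1920x1080
# 5K (5120x2880) -> quarter-res render target: 2560x1440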
 
Only enthusiasts are at 4k+. 99% of people are using 1080p or lower.

And the 8k iMac that may be coming out later this year? My first thought after reading the LG leak was what the gfx chipset was going to be...
 
Back when I read the reviews of SLI 970s and 980s at 4k, and after my 'test' of rendering some games at 4k with DSR, I came to the conclusion that, at the time (half a year ago), 4k wasn't ready: too many kinks needed to be worked out, and 4k support in games (particularly UI and text scaling) was erratic (e.g. ME2 scaled perfectly at 4k, ME3 didn't, and ME3 was the newer game).

I had briefly considered lowering the resolution to get around this, then realised that it defeated the entire purpose of getting a higher-resolution monitor. Another thing was that if I wanted to avoid scaling blur, I would have to set the resolution to a whole-number ratio of the native resolution, which, for 4k, is 1080p. DSR smoothing can effectively remove any kind of blur induced by downsampling from non-whole-number-ratio resolutions.

Practical aspects of higher resolution aside, I do feel that monitors have been progressing at an abnormally fast pace in the last few years, and maybe the rise of smartphones has something to do with it. The PPI of small handheld devices keeps getting higher (e.g. the Nexus 6's 1440p on a 6" display), so we can already make pixels that small; it was only a matter of time before that kind of PPI was built into a much larger panel, leading to higher-resolution monitors. We are stuck at 4k/60Hz at the moment probably because of HDMI 2.0/DP 1.2 bandwidth limitations. Perhaps we can expect 4k 144Hz/5k 75Hz monitors when DP 1.3 hits?
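As a rough sanity check on that bandwidth point, here's a back-of-the-envelope sketch in plain Python. It counts raw uncompressed 24-bit RGB pixel data only and ignores blanking intervals and protocol overhead (so real requirements are somewhat higher); the link rates are the commonly quoted effective post-line-coding figures:

# Raw pixel payload vs. effective link bandwidth (very rough; ignores
# blanking/overhead, assumes uncompressed 24-bit RGB).
links = {"HDMI 2.0": 14.4, "DP 1.2": 17.28, "DP 1.3": 25.92}   # effective Gbit/s

def payload_gbps(w, h, hz, bpp=24):
    return w * h * hz * bpp / 1e9

modes = [("4K @ 60Hz", 3840, 2160, 60),
         ("4K @ 120Hz", 3840, 2160, 120),
         ("5K @ 75Hz", 5120, 2880, 75)]

for name, w, h, hz in modes:
    need = payload_gbps(w, h, hz)
    fits = [link for link, cap in links.items() if cap >= need] or ["none of these"]
    print(f"{name}: ~{need:.1f} Gbit/s of pixels -> fits on {', '.join(fits)}")

Even before blanking overhead, the higher-refresh modes sit right at (or past) the edge at full 24-bit color, which lines up with being stuck at 4k/60 for now.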

EDIT: I wouldn't use Apple as a guide for trends in resolution because, frankly, their monitors may have higher resolutions, but some of their models can't even render the desktop that high (e.g. the latest MacBook: a 1440p screen rendering a 720p desktop), so having such a high-resolution display is often moot. I feel that Apple is squandering a lot of great potential with their displays. If they sold their Retina displays as standalone products, they might very well make a killing. The hardware behind these display resolutions leaves something to be desired.

E.g. the 5K iMac: a 5K display, yet it uses a mobile GPU to render it...
 
I do not think it is only display resolution that is outpacing graphics performance; games are also being (badly?) coded with ever-increasing graphics demands. Back when Eyefinity was a novelty, it was possible to render to triple 1080p screens with a sub-$1000 system. Now we spend $1200 just on graphics for a single 4k display and still don't reach 60fps at max settings.

Hopefully the trend is reversing: 4k is most likely not viable for mainstream acceptance in the PC market. Either the pixels are too small for reading, as on 24-32" 4k screens, or the display is so big your neck hurts, like the 40" models.

I can't wait for a 24" 1440p panel capable of 144Hz for triple portrait setups. 4k has too few pixels ;)
 
One huge letdown so far has been display scaling. 4K was supposed to work well with 1080P gaming because 4K is literally 2x the pixel count of 1080P both horizontally and vertically. That means 1080P should be able to be rendered on a 4K monitor with every square of 4 pixels representing a single 1080P pixel, and thus the image should be just as crisp as viewing 1080P on a 1080P monitor, without the typical blurring you get from running a non-native resolution.
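A minimal sketch (numpy, purely illustrative) of what that integer "pixel doubling" amounts to: each source pixel is copied into a 2x2 block, with no interpolation anywhere:

import numpy as np

# Integer 2x upscale: every 1080P pixel becomes a 2x2 block of identical
# 4K pixels, so no interpolation (and therefore no blur) is involved.
def integer_upscale_2x(frame):                    # frame: (H, W, 3) uint8 array
    return np.repeat(np.repeat(frame, 2, axis=0), 2, axis=1)

src = np.zeros((1080, 1920, 3), dtype=np.uint8)   # a 1080P frame
dst = integer_upscale_2x(src)
print(dst.shape)                                  # (2160, 3840, 3) -- exactly 4K

In other words, this case needs no scaler logic at all, just duplication.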

Unfortunately, in practice this almost never happens. 1080P on a 4K monitor looks like crap just like any other non-native resolution, and most 4K monitor owners agree 1440P looks better on it than 1080P does. :rolleyes:

I love the idea of a 4K monitor but I don't really care about being able to (nor do I really want to be forced to) play all my games in 4K. My current monitor is a 27" 1080P monitor that does 120hz. Honestly I'd rather stick with a 120hz monitor than deal with 4K at only 60hz. I'm not going to bite until they at least get 1080P scaling figured out and you start seeing 120hz+ 4K monitors pop up on the scene.
 
There is not a single 4k monitor out there that is even playable in fast-paced games. They are all 60Hz. Granted, I am a bit spoiled by my screen's extremely high refresh rates and zero motion blur (up to 344Hz), but IMO 60FPS is horrible and unplayable. If these things were even 85Hz, things would be a lot better. I can't stand anything below 100FPS, and anything under 120 looks funny to me. All 4K monitors also have atrocious sample-and-hold blur.
 
Well, yes and no. Yes, in that video cards can't drive all the newest games at max rez, max detail, max FPS. No, in that you don't have to operate them at full resolution. With higher-rez displays with small pixels, it is very much an option to run at a lower rez than native. Run games at 2.5k or 1080 and they'll work great, and still look good.

Remember we used to do that back in the CRT days, we'd run our desktop at 1280x960 or even 1600x1200 but drop down to 1024x768 for games since video cards couldn't handle the really high resolutions.
 
CRTs had the advantage that they did not have distinct pixels (at least not in the LCD sense), so changing resolution often had no IQ impact, as there was only minimal blur.

However, I'd much rather keep a lower-resolution, high-refresh-rate monitor and downsample to it from a higher rendered resolution than take a 4k monitor and run it below native. I am sure you can do the latter to great effect, especially if your PPI is high enough, but I don't know, it feels odd to buy a high-resolution monitor and then game at a lower resolution...
 
Run games at 2.5k or 1080 and they'll work great, and still look good.

I wish that were true, but it's not, unless you have really low standards. I don't expect 1080P on a 4K monitor to look as good as 4K, but 1080P on a 4K monitor should look as good as 1080P on a 1080P monitor. That is because 1080P on a 4k monitor shouldn't involve any blurring or pixel approximation. Unfortunately due to shitty engineering and bad product design it almost never works out that way.
 
It's about time we pushed for higher resolutions; 1080p and 1600p have been around for far too long.
 
I wish that were true, but it's not, unless you have really low standards. I don't expect 1080P on a 4K monitor to look as good as 4K, but 1080P on a 4K monitor should look as good as 1080P on a 1080P monitor. That is because 1080P on a 4k monitor shouldn't involve any blurring or pixel approximation. Unfortunately due to shitty engineering and bad product design it almost never works out that way.

I dunno, I hear complaining about this sort of thing, but in my experience it isn't an issue. I have to run some things at lower resolutions on my 2560x1600 monitor and it works out fine. Yes, things are blurrier/more pixelated since they are lower rez, but the monitor's scaler handles it like a champ and the image is quite good in the end. I'm sure there are monitors with bad scalers out there, but if you look at reviews on a good site like TFTCentral they test that. Also, you can have your nVidia card handle the scaling and it does a good job.
 
As a TV-based PC gamer, I say absolutely. 4K TVs have all but replaced the 1080p models from the major manufacturers. If you're buying a new TV for whatever reason, odds are you have to look hard (or have an older model in mind) to avoid 4K.
Granted, 1080p does still look pretty decent on a 4K TV, or at least better than 480p and even 720p look on a 1080p set, but I wish I could play more games at 4K.
I'm not on a low-end setup either. I have (more or less) the best single card rig you can build at the moment. Being forced to have a high-end SLI setup for a supposedly standard resolution is no fun.
 
The GPU industry has been the last hardware market to face the growing silicon slowdown. It's been happening for quite a few years, and when we moved from 6-9 month product cycles to 12-18 months, that should have been a sign.

It only makes sense that display resolutions would outpace GPUs when, back in 2008, we already knew 4K/8K was going to arrive within 10 years. Back then 1080p was still stressful, and 2560x1440 has only recently become feasible on a single GPU. I think things will slow down heavily at 4K though. I honestly can't see a logical reason for mass adoption of 5K/8K panels; they'll just be niche, the way the 2560 resolutions were.

We're going to need about 3.5x the Titan X's performance to make 4K more bearable on a single GPU. Also, to go with that 99% 1080p stat mentioned earlier: almost the same percentage don't bother with SLI/CrossFire.
 
I dunno, I hear complaining about this sort of thing, but in my experience it isn't an issue. I have to run some things at lower resolutions on my 2560x1600 monitor and it works out fine. Yes, things are blurrier/more pixelated since they are lower rez, but the monitor's scaler handles it like a champ and the image is quite good in the end. I'm sure there are monitors with bad scalers out there, but if you look at reviews on a good site like TFTCentral they test that. Also, you can have your nVidia card handle the scaling and it does a good job.

A good scaler is certainly a great thing when it comes to fudging a lower resolution on a higher resolution monitor. Thing is, 1080P on a 4K monitor shouldn't even require a scaler at all... It literally should be every square of 4 pixels representing a single 1080P pixel - that is NOT scaling.
 

It's fucking obscene that this isn't the case for absolutely every single 4K display. (discounting ones that use the different "cinema" 4K resolution)

Like seriously, what kind of shithead says it's okay for 1920x1080 to do anything other than fit evenly into 3840x2160?

I don't have anything 4K and this still seriously pisses me off.
 
Get used to it. You want to be a cutting-edge gamer, you get to deal with the limitations of technology.

Back around 1999 we had 20" (and later 22") CRTs with 1600x1200, 1920x1440 or higher native resolutions, but we had to settle for 1280x960 gaming because we only had 4-ROP video cards (GeForce 256 up to GeForce 3) until the release of the Radeon 9700 Pro. And even then, you weren't really cooking with gas until the X800 / GeForce 6800 series were released with 16 ROPs and GDDR3 to keep them fed.

Early LCD displays at that time, by comparison, were well behind CRT resolutions: 15" 1024x768, 17"/19" 1280x1024. A few years later, 20" 1600x1200 LCDs aimed at the professional market and 20" 1680x1050 panels aimed at widescreen movies/gaming appeared. We didn't see the gargantuan 30" 2560x1600 LCDs appear until 2004, and at that time they were the price of a used beater car; a few years later we transitioned to affordable 1080p. We didn't see 1440p at reasonable prices until around 2010.

By the time the LCD displays in the mainstream transitioned to 1080p/1200p/1440p, it was relatively painless to power because we already had 16/32 ROP cards in the midrange!

So, let's put this in perspective: we went from being able to power 1280x960 (~1.2 Mpixels) in 1999 with a top-end card to being able to power 2560x1440 (3x the pixels) with a single top-end card in 2010, a span of 11 years, disregarding the increase in game complexity along the way (which is going to happen, so ignore it). The jump from 1440p to 4k is another ~2.25x increase in pixels, so I would expect it to remain unaffordable to drive for at least the next 5 years or so.
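For reference, the pixel-count arithmetic above in plain Python (purely illustrative):

def mpix(w, h):
    return w * h / 1e6

print(f"1280x960  = {mpix(1280, 960):.2f} Mpixels")                      # ~1.23
print(f"2560x1440 = {mpix(2560, 1440):.2f} Mpixels, "
      f"{2560*1440 / (1280*960):.2f}x the pixels of 1280x960")           # 3.00x
print(f"3840x2160 = {mpix(3840, 2160):.2f} Mpixels, "
      f"{3840*2160 / (2560*1440):.2f}x the pixels of 2560x1440")         # 2.25x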
 

Very thorough post.

1080p/1200p/1440p has been mainstream and sustainable for so long that people have almost completely forgotten how long it took us to get to that point. Personally, I'm glad to see this push, as it forces hardware manufacturers to put out better hardware. If we were to stick with 1080p, we would see the same stagnation that we see with CPUs.
 
I look at it two ways.

The first way is that I am happy that a standard mid/high-range GPU can now finally be a hefty competitor at 1080P. I feel like if you pick up a nice card now (GTX 970 or 290X and up), you are investing in a card that will last you for some years to come, granted you are aiming to stay at 1080P. (Of course, running multiple 1080P monitors is a bit of a different story.)

The other side, coming back to my first point, is that the industry needs to move forward. Without continuing innovation, or at least a gap to fill, GPU and TV sales are going to drop, because there is no real need to buy a new one when the one you have likely does the same as any other. Same with GPUs: the one you have now will probably last at 1080p for quite some time, so it's not a sustainable model. There is no choice, from a business standpoint, but to move ahead and create a higher standard to work toward. I think gaming at 4k will (one day) be awesome, but I agree we are not there yet; we only just got to a fluid single-card 1080p experience. Until then, I will dabble up to 1440p, but am staying at 1080 at least in the short term.
 
I paid $500 for my GTX 680 about 3 years ago and was gaming on a 27" 1920x1200 display that I paid $500 for in 2007. I now have a G-Sync 4k monitor that was $800, and I paid $640 for SLI 970s. Honestly, my frame rates in games are very similar between the two setups. SLI has come a long way; the only game I have where it flat-out doesn't work is Watch Dogs, which honestly is still a huge pile of a PC game (and I finished it).
 
... how are games going to be able to play at acceptable frame rates on these things?

They aren't, unless you are prepared to go video card crazy - and then depending on the games you play, that may not do any good. So, simply avoid them. They are not for you. It's bleeding edge. There isn't shit for 4K content out there anyways as far as a TV is concerned. But for a PC - stick to the resolution your budget will allow.
 