
4K monitors came out more than a decade ago, so why is gaming at this resolution still such a big deal?

maverick786us

Way back in 2008, when I was a PC gamer, I gamed at 1680 x 1050 on my 22-inch monitor. I had a GTX 460, which played games well at that resolution on medium settings. I enjoyed high-end, graphically intensive games like Crysis, Crysis Warhead, Crysis 2 and Battlefield: Bad Company 2 on that rig.

Then came 2560 x 1600, which was considered high end at the time, and any graphics card that could play these games at that resolution on medium to high settings cost more than $400. A year later, 4K appeared on the horizon. It was a niche market, so gaming at that resolution on medium to high settings meant spending a fortune on two or three top-end cards and running them in CrossFire or SLI.

It's been over 15 years now. I was reading the review of this card, and it mentioned "high fps scores in 1080p and even 1440p." With such a high-end card, the author still said, "In 4K, the 5060 Ti hit 90 fps in Cyberpunk with 4X frame generation. It's not a card you'd be buying for consistent 4K gameplay, but it's still interesting to see it hit beyond 60 fps at such a high resolution." What is the reason? Are game developers not serious about 4K PC gaming?
 
I mean, it's an ever-moving goalpost. Any modern card can play older games at 4K+ smoothly, but more recent demanding games, especially with ray tracing enabled, push cards to their limits. The linked GPU is also a mid-range card.
 
My perspective was that, with tech improving, game developers would optimize their games instead of making them more and more graphically intensive.

I remember that back in 2008, most GPU reviews used Crysis as the benchmark to show performance. It was an award-winning game, and the only criticism was that the programming wasn't optimized. When Crysis Warhead was released, it ran smoothly, and it looked like the developers had optimized the code to make it a bit smoother.
 
There's a colloquial 'law' about this.

Though tbh various AAA games really are pushing it in terms of graphics, at least on the highest settings (which are mostly what get benchmarked in reviews). One can just knock down the resolution/settings and get better perf for the most part, same as ever.
 
Compared to what I remember, modern games look really good on low. Back in the day, low settings would cut things like draw distance: you couldn't see anything until it was 10 feet away. Well, maybe not that bad, but hope you were playing a melee build, and good luck navigating. That didn't stop me from buying a stupidly expensive video card because I like the bling, but being stuck on low wouldn't stop me from playing a game these days. Back in the day it could have. Draw distance had a massive effect on gameplay; now it's all just looks.
 
It makes a lot of sense. Raising resolution past, say, 720p is diminishing returns for something like a game. It's a nice thing to do if you have extra power (the kind of PC rare enough that people don't build games around it), but otherwise the compute budget should tend to go elsewhere (a Blu-ray of an actual movie at 720p will still tend to look better than a 4K-native video game...).

Same reason 1080p took a long while to become the norm on consoles versus when the first 1080p sets came out.

360 fps gaming would be a big deal for some types of games and not for others that are built around it, because there the diminishing return is worth the compromise (same for 4K: for the type of game where it's worth the compromise, it's not a big deal, you just compromise for it).
 
4K is the common standard for TVs and movie-making these days, so we would expect cards to play games at this resolution at medium settings with AA disabled. Crysis is one game that brought even the best card to its knees when it launched, but I enjoyed playing it at 1680 x 1050 with an ATI Radeon 4850 and later a GTX 460 at medium settings with AA disabled back in 2008-09. It's been 17 years now; is there a mid-range card that can play this game at 4K at medium settings with AA disabled?
 
This is important. While everyone obsesses over ultra settings, modern games actually look incredible on low/medium... with jaw-dropping performance too.

Old games on low were so compromised it was often not worth playing until you could afford to upgrade the hardware. You lost draw distance and shadows, and some textures would often become a single blob of colour, etc.
 
Considering that low to medium in modern games looks even better than the original Crysis, the requirements just went way up for both gameplay and graphics. I've found that modern cards over $600 are now good 4K pushers; maybe not with all the bells and whistles, but most of them, even in the lower price tiers above $600.
 
The death of SLI. GPU production slowed to less than half the pace it used to be. Costs. Lack of substantial gains, and we're about to hit a brick wall at 1nm. We now rely on upscaling and fake frames just to push games like Cyberpunk from three years ago on $5,000 GPUs. Substantial increases in graphics quality require 10x the power to run smoothly. All of that trashed 4K gaming. I remember when I had two 1080 Tis and I could play every single game at 4K until they stopped supporting it. RDR2 was the first title I couldn't play at 4K because it didn't support my setup. Since then I've had to use upscaling and fake frames, and even that can only be done on insanely priced hardware that is hard to find.

Hate to say it, but it's going to be a slow and boring hobby for the foreseeable future, especially with AMD dropping out of the high-end GPU race and Nvidia deciding to move on to AI. Also, the destruction of the games industry by corrupt shareholders, bad hiring practices within game development, no accountability, and the consolization that started when Microsoft bought up all the PC devs to work on the original Xbox, etc. Mainstream appeal also dumbed everything down and killed pretty much everything but copy-paste trash. Every once in a while we get something nice; maybe one game every year or two worth actually playing, from my perspective. Those one-in-a-million games earn the right to be bought up by trash producers that kill them off every time, and then everyone has to be replaced, and it takes years for another talented team of actual game developers to form a new studio with the skill to bring something to the table.
 
The PS5 has what, 40-50 times the power of a PS3, yet some PS5 games run at the same resolution PS3 games did. GPUs could be 10 times more powerful today, and if game devs are good enough they should still be able to make games good enough that people choose not to run them at 4K (at least until we reach photorealism, that should be the case for a lot of games: choosing not to be run at 4K on common hardware).

It's almost purely a choice; we could make 8K-native games tomorrow if that's what we wanted.
 

I feel 4K gaming on a good monitor like a 32-inch Dell UltraSharp is the saturation point. Any resolution beyond 4K will need 55-inch monitors or more, which can be a bit impractical for gaming.
 

??

There aren't any Ultrasharp models that compete with actual high-end gaming-centric monitors. All their better (i.e. faster) gaming displays are on the Alienware branding.

Also, 4k+ res requiring 50"+ displays to be worth it is objectively wrong. The smaller the screen for a given resolution, the tighter the pixel pitch, which means less reliance on aggressive anti-aliasing to get a crisp image, and therefore lower relative performance requirements, assuming you're trying to play at native res without upscaling.

If you can't perceive the difference between a 4k/5k/8k display in the 27"-32" range, consider yourself lucky that you've gotten old enough to where you no longer have to care about the minutiae.
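To put rough numbers on the pixel-pitch point, here's a minimal sketch (standard geometry, not anything from this thread; the example panel sizes are arbitrary) that computes PPI and pixel pitch for a few 16:9 panels:

```c
/* ppi.c -- pixels per inch and pixel pitch for a 16:9 panel.
 * Illustrative only; the formula is basic geometry, the panels are examples.
 * Build: cc -o ppi ppi.c -lm
 */
#include <math.h>
#include <stdio.h>

static void report(double diag_in, int w, int h)
{
    double diag_px = sqrt((double)w * w + (double)h * h);
    double ppi     = diag_px / diag_in;   /* pixels per inch   */
    double pitch   = 25.4 / ppi;          /* pixel pitch in mm */
    printf("%4.1f\" %dx%d: %6.1f PPI, %.3f mm pitch\n", diag_in, w, h, ppi, pitch);
}

int main(void)
{
    report(27.0, 2560, 1440);   /* ~109 PPI */
    report(27.0, 3840, 2160);   /* ~163 PPI -- same size, much tighter pitch */
    report(32.0, 3840, 2160);   /* ~138 PPI */
    report(55.0, 3840, 2160);   /* ~80 PPI  -- why 4K on a TV needs distance */
    return 0;
}
```

The tighter the pitch at a given size, the less of your vision each pixel covers, which is exactly why the same resolution needs less anti-aliasing on a smaller panel.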
 
50+, so I should target 2560 x 1600 resolution for gaming on a display size between 27 and 30 inches?
 
Just whatever the highest is after which you can no longer see a noticeable improvement without getting inches away from the panel. One thing to keep in mind is that different high performance panels might have non-standard subpixel layouts (e.g. BGR instead of RGB, RGBW, etc), so those might affect your perception of text clarity depending on how sensitive you are to that.
 
Memory bandwidth hasn't really improved that much over the last decade compared to how much faster compute has gotten. Think of a 2017 flagship GPU like the Titan V with nearly 700 GB/s; the 5090 today is around 1.8 TB/s, which is only about 2.5x as much. In addition, a lot of popular engines are full of bloat, and the games barely look any better than they did 10 years ago but require a ton more compute to run.
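Just to show where the ~2.5x comes from, a throwaway sketch using only the approximate figures quoted above (not official spec-sheet numbers):

```c
/* bw_ratio.c -- rough bandwidth growth between the two GPUs named above,
 * using the approximate figures from the post, not official specs. */
#include <stdio.h>

int main(void)
{
    double titan_v_gbs = 700.0;    /* ~2017 flagship, GB/s (approx.)  */
    double rtx5090_gbs = 1800.0;   /* current flagship, GB/s (approx.) */
    printf("bandwidth growth: %.2fx\n", rtx5090_gbs / titan_v_gbs);  /* ~2.57x */
    return 0;
}
```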
 
I feel 4K gaming on a good monitor like a 32-inch Dell UltraSharp is the saturation point. Any resolution beyond 4K will need 55-inch monitors or more, which can be a bit impractical for gaming. ...
50+, so I should target 2560 x 1600 resolution for gaming on a display size between 27 and 30 inches?
It's a ratio of eyesight, sitting distance and monitor size (and content, to a point). If you sit 10 feet from a 55-inch monitor, it will be a "smaller" one than a 27-inch you sit really close to.

Blind tests on giant movie theater screens show that going over 1080p tends to be useless for most people with 20/20 vision pretty soon after the first couple of rows of seats. For something like text, you can see the difference between 4K and 1440p on a 27-inch; for games it will depend on the type of assets, etc. A text-heavy GUI game can use 4K in a more obvious way.

Maybe less so than framerate, but anything as subjective as resolution will tend to vary from person to person (or with whether they have their glasses on or not).
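A rough way to quantify the "eyesight, distance and size" ratio is pixels per degree of visual angle. The sketch below assumes 16:9 panels and uses the common ~60 px/deg rule of thumb for 20/20 vision; both the formula and the threshold are standard approximations, not anything from this thread.

```c
/* ppd.c -- pixels per degree for a 16:9 panel at a given viewing distance.
 * Rule of thumb: ~60 px/deg is roughly the limit of 20/20 vision.
 * Build: cc -o ppd ppd.c -lm
 */
#include <math.h>
#include <stdio.h>

#define PI 3.14159265358979323846

static double pixels_per_degree(double diag_in, int hpix, double dist_in)
{
    /* horizontal width of a 16:9 panel from its diagonal */
    double width_in = diag_in * 16.0 / sqrt(16.0 * 16.0 + 9.0 * 9.0);
    /* horizontal field of view the panel covers, in degrees */
    double fov_deg = 2.0 * atan((width_in / 2.0) / dist_in) * 180.0 / PI;
    return hpix / fov_deg;
}

int main(void)
{
    /* 27" panels at a typical ~24" desk distance */
    printf("27\" 1440p @ 24\":  %5.1f px/deg\n", pixels_per_degree(27, 2560, 24));
    printf("27\" 4K    @ 24\":  %5.1f px/deg\n", pixels_per_degree(27, 3840, 24));
    /* 55" 4K TV from 10 feet: angularly "smaller" than the desk monitor */
    printf("55\" 4K    @ 120\": %5.1f px/deg\n", pixels_per_degree(55, 3840, 120));
    return 0;
}
```

At a desk, 27-inch 1440p comes in under that ~60 px/deg mark while 4K clears it, which is why the difference is visible up close but disappears once the screen sits far enough away.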
 
My perspective was that, with tech improving, game developers would optimize their games instead of making them more and more graphically intensive.

I remember that back in 2008, most GPU reviews used Crysis as the benchmark to show performance. It was an award-winning game, and the only criticism was that the programming wasn't optimized. When Crysis Warhead was released, it ran smoothly, and it looked like the developers had optimized the code to make it a bit smoother.

Optimization


Ahh yes, the old "optimization fallacy." First off, why would studios choose to "optimize" the game to run smoother rather than improve the graphics? Until we can render photorealistic images at high speed, there will always be room for improvement. Graphics quality is one aspect of game design that can easily draw people in and separate one game from another. It makes sense that games improve visually as time goes on. Let's also not forget that target frame rate is part of a game's design. Engine selection and visual style / fidelity come into play where that's concerned as well. You can hit any frame rate you want if you sacrifice visual quality. That's part of why games like Overwatch, Valorant, TF2 and even Marathon are so stylized. If you make the graphics cartoony and overly stylized like that, then they are less demanding. They are less detailed and less intense to render.

This is intentional so you can play these heavily monetized games on a potato. That allows them to reach more potential customers as they aren't gate keeping by catering to people who run top end hardware all the time.

Next, there is the optimization fallacy itself. The idea that you can have a game look like Cyberpunk 2077 and the only reason it runs badly is due to optimization. This is nonsense. Yes, there are often things that can be done behind the scenes or in the engine to allow it to run better or improve responsiveness but there are limits to what can be achieved in this way. Some of these problems come down to time constraints or even technical ability. In a lot of cases most of the optimization is done for the console versions and PC ports are handled by third parties which aren't necessarily good enough to accomplish the task. This comes down to business decisions which impact the amount of available time to optimize a game and ultimately who they get to do the work. At some point, you achieve the target frame rate of the design and that's all they care about.

But the true fallacy is that this can be done in such a way as to make any game run perfectly smoothly on nearly anything and that every game that runs badly is due to optimization. In some cases it comes down to the way certain things were done design wise. Some games like Crysis have certain odd choices in its design which preclude further optimization from doing any good. (There is a lot of information out there about this, but these decisions ultimately led to a game that looked great and aged well but ultimately runs like shit even on modern hardware.)

The part of it that people truly don't seem to grasp is that sometimes optimization simply means the developers adjust draw distance or visual fidelity of textures and other rendering features to improve speed at the cost of graphics quality. Ultimately, that's what a lot of that optimization is. Cyberpunk 2077 once had a patch that improved performance by 20% and they ended up putting it back like it was when people bitched about how much that optimization hurt visual quality. The amount of optimization and the methods for it are finite meaning that there are limitations to what you can optimize and what impact that will have. Often huge leaps in optimization lead to big losses in graphics quality. It's the same when fine tuning a game's visual settings in the options menu. Some changes massively impact frame rates while taking very little away visually. Others may make things look much worse without giving you that much in terms of frame rate.

There is no free lunch with optimization. It's not the solution to every problem. It's also not why we are stuck on 4K as the maximum for playability for games on even high end hardware.

The Hardware Problem

In terms of CPUs, there has been a shift away from higher clock speeds, and even IPC gains to some degree, in favor of efficiency and multithreading. Games simply cannot gain performance just by being made to run on more cores. A lot of what a game requires is very specific and can only benefit from parallel processing so much. You can't throw one thread at a CPU for audio processing, another for enemy AI, and two for graphics and get 4x the performance. That's not how it works. Audio processing requires very little CPU-wise, and adding more cores to it is pointless. CPU frequencies have stagnated to a large degree over the last decade and a half. Crysis once again proves its value beyond being a game by showing that adding threads didn't help it. The only thing that did was raw clock speed. Processors made ten years apart run the game the same.
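The "more cores doesn't mean more fps" point is essentially Amdahl's law (my framing, not the post's): if only a fraction p of the frame work can run in parallel, n cores can never give more than 1 / ((1 - p) + p/n) speedup. A tiny sketch with an invented fraction:

```c
/* amdahl.c -- upper bound on speedup when only part of the work parallelizes.
 * The parallel fraction below is invented for illustration, not measured. */
#include <stdio.h>

static double amdahl(double p, int n)   /* p = parallel fraction, n = cores */
{
    return 1.0 / ((1.0 - p) + p / n);
}

int main(void)
{
    int cores[] = { 2, 4, 8, 16 };
    double p = 0.60;   /* pretend 60% of a frame's CPU work parallelizes */
    for (int i = 0; i < 4; i++)
        printf("%2d cores -> %.2fx speedup (p = %.0f%%)\n",
               cores[i], amdahl(p, cores[i]), p * 100.0);
    /* even with infinite cores the ceiling here is 1/(1-p) = 2.5x */
    return 0;
}
```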

The focus shifted away from pure performance to efficiency because that's what the mobile and server markets demand. Rarely is there a need for low-core-count CPUs with high clocks outside of things like gaming. Unfortunately, gamers are an afterthought here. We know from over the years that raw memory bandwidth only goes so far in games as well. It's the same for SSDs to an extent; mostly what storage impacts is streaming of assets and level / zone load times.

Lastly, you have to understand that GPUs used to advance at a much greater rate than they do today. For as powerful as modern GPUs are, they come out about half as often. In years past we got a brand new architecture every year and a refresh cycle in between each architectural release. Minus a couple of times where this wasn't entirely true for various reasons, that generally held. The jumps weren't always as big as we've seen in modern cards, but we got them much more often. Even the refreshes sometimes gave us 10-12%. These days we get a new architecture every two years, and if we do get a refresh it's usually only a partial refresh that makes one card or part of the stack more appealing than the outgoing models but rarely provides any significant uplift in performance. Those only happen once a year at most.

At the end of the day, games keep improving visually, not just in terms of raw visual quality but also in scope. Increasing scope adds its own demands to a game; a big open-world game isn't going to run as well as a tightly focused corridor shooter. Beyond that, we have had slowdowns in hardware advancement and big hits to performance improvements such as the death of SLI, etc. If games never improved, we'd all be running 8K displays or something, but so long as the goalposts keep getting moved we are going to be stuck at 4K for a while longer.
 

Honestly I don't know if I want all my games to look photo realistic. Some things should be stylized, otherwise shooter games and such might not be as much fun. I mean I play games to escape reality at times :)
 
I think it depends on the type of game. I didn't mean to imply that overly stylized games are always a bad thing. I'm simply pointing out that even the most powerful computers can only do so much and the more the graphics envelope gets pushed, the harder it is to just increase resolution. Optimization can also only take you so far. It's pretty fair to say that Cyberpunk 2077 is far more optimized than it was on launch and yet, it still punishes systems. Even a 5090 has to rely on DLSS 4 and Frame generation once you turn on ray tracing, path tracing etc.

If modern games looked the way they did when 4K monitors first hit the market, we could easily be running 8K displays today but games have continued to evolve visually along with the hardware.
 
I've been gaming at 4K since back when I had the 3080. I had a Samsung 4K TV and now an LG C2 42-inch. Some people are just late to the party or don't care about upgrading.

What I find to be a treat is integer scaling. I play RollerCoaster Tycoon 2 and set the in-game resolution to 1280x720 (16:9), then use integer scaling so it displays the lower res full screen on my 42-inch without any image processing (it preserves the detail of the lower res stretched to full screen).

It's an awesome feature to use with 4K monitors/displays. I keep my 5070 Ti at no scaling on the GPU otherwise. 4K is nice and all, but the same thing can be said about 8K: they have been out for a while, but no one is really gaming at that resolution. Can you imagine the stuff that will be out in 30 years? I am 50 and still a kid at heart and always will be. But the shit to come out in the next 3 or 4 decades is going to be amazing. I hope I live a long life and stay youthful. 4K ain't shit today when in 10 years 16K will be normal!
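For anyone curious what integer scaling actually does under the hood, here's a minimal sketch of the idea: each source pixel is duplicated into an N x N block with no filtering, which is why detail is preserved. This is a toy CPU-side illustration, not how the driver feature is implemented; 1280x720 tripled is exactly 3840x2160, which is why it maps so cleanly onto a 4K panel.

```c
/* intscale.c -- nearest-neighbor integer upscaling: every source pixel
 * becomes an N x N block of identical pixels, so no blur is introduced.
 * Toy example; real integer scaling happens in the GPU/driver, not here. */
#include <stdint.h>
#include <stdlib.h>

/* dst must hold (sw * n) * (sh * n) pixels */
static void integer_scale(const uint32_t *src, int sw, int sh,
                          uint32_t *dst, int n)
{
    int dw = sw * n;
    for (int y = 0; y < sh * n; y++)
        for (int x = 0; x < dw; x++)
            dst[y * dw + x] = src[(y / n) * sw + (x / n)];
}

int main(void)
{
    /* 1280x720 tripled -> 3840x2160: fills a 4K panel exactly */
    int sw = 1280, sh = 720, n = 3;
    uint32_t *src = calloc((size_t)sw * sh, sizeof *src);
    uint32_t *dst = calloc((size_t)sw * n * sh * n, sizeof *dst);
    integer_scale(src, sw, sh, dst, n);
    free(src);
    free(dst);
    return 0;
}
```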
 
The higher the resolution and FPS, the better, smoother and more realistic the gaming experience will be. But I feel 4K at 27-32 inches, which is like a retina display at that size, is the saturation point. Anything beyond that, especially 8K, will not be noticeable unless I buy an 80-inch display, which will cost more than a fortune and need a bigger room. 8K is beyond realistic; it's like Dune, something too big for your TV, it's what theaters are made for.
 
I agree, it's all about pixel size. At some point it's so small you're not going to notice much, if any, difference.

Although plenty will swear up and down that 4K is a "huge improvement" over 1440p on a 27" monitor... placebo effect?
 
I still game at 1080p. Sure, 4k looks better, but I get virtually the same gaming experience at 4x the framerate with better eye candy all on a lower budget.

Same, but that's because I'm still on an RTX 2060, lol.

I do plan on buying a 2K OLED monitor later this year after my new build with an RTX 5080 & 9800X3D is completed next month; I want to research some more and see what new monitors become available soon.
 
Although plenty will swear up and down that 4K is a "huge improvement" over 1440p on a 27" monitor... placebo effect?
In many games, increasing the resolution also increases the level of detail (LoD); sometimes it's actually the only way to do so unless there are mods, so in that sense it's a graphical improvement in more than just the resolution sense.

What I'm interested in are 5K displays with good integer scaling, since you'd be able to nearest-neighbor 200% scale 1440p for demanding games, then switch back to 5K for anything else. You currently can't cleanly scale 1440p on 4K displays. 5K is an awkward resolution, though, in that it isn't part of a popular standard, so I doubt many regular-sized gaming displays will be released with it.
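The "can't cleanly scale 1440p on a 4K panel" point is just divisibility: the panel resolution has to be an exact integer multiple of the render resolution on both axes. A quick sketch to check (the panel list is only illustrative):

```c
/* cleanscale.c -- does a render resolution integer-scale onto a panel? */
#include <stdio.h>

static void check(int rw, int rh, int pw, int ph)
{
    if (pw % rw == 0 && ph % rh == 0 && pw / rw == ph / rh)
        printf("%dx%d -> %dx%d: clean %dx integer scale\n",
               rw, rh, pw, ph, pw / rw);
    else
        printf("%dx%d -> %dx%d: no clean integer scale (%.2fx horizontal)\n",
               rw, rh, pw, ph, (double)pw / rw);
}

int main(void)
{
    check(2560, 1440, 5120, 2880);  /* 5K panel: clean 2x          */
    check(2560, 1440, 3840, 2160);  /* 4K panel: 1.5x, not clean   */
    check(1920, 1080, 3840, 2160);  /* 1080p on 4K: clean 2x       */
    return 0;
}
```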
 
You can do integer scaling in GPU drivers nowadays. To be honest I just end up using DLSS performance instead, it looks a billion times better. Integer scaling with modern 3D games... isn't so hot in practice.

I do hate that those dual mode monitors don't provide a clean scaling option, since on those models that is the only way to unlock the peak refresh rate.
 
My perspective was that, with tech improving, game developers would optimize their games instead of making them more and more graphically intensive.

I remember that back in 2008, most GPU reviews used Crysis as the benchmark to show performance. It was an award-winning game, and the only criticism was that the programming wasn't optimized. When Crysis Warhead was released, it ran smoothly, and it looked like the developers had optimized the code to make it a bit smoother.
Developers optimize to the best of their ability, but when nobody wants to spend the resources to develop bespoke game engines anymore you end up with the UE5 problem.
 
My perspective was that, with tech improving, game developers would optimize their games instead of making them more and more graphically intensive.
I suspect AI-based frame generation is going to get better and better. In 10 years we might very well be talking about smaller GPUs with bigger NPUs/TPUs that are capable of real-time resolution upscaling and more.
 
You can do integer scaling in GPU drivers nowadays. To be honest I just end up using DLSS performance instead, it looks a billion times better. Integer scaling with modern 3D games... isn't so hot in practice.

I do hate that those dual mode monitors don't provide a clean scaling option, since on those models that is the only way to unlock the peak refresh rate.
Yeah, I only use integer scaling on games from the '90s and early 2000s. I mean, a lot of it is also up to the dev. If a game runs like shit on a 5090, then what's the point of even upgrading hardware? It's useless if even the most powerful hardware out there runs a game at 20 fps. Just spend time developing it better to take advantage of the hardware.
 
Looking at all these factors, what resolution should I target where I don't have to spend more than $450 on a GPU?
2K. I have 6-7 months to gather money for a gaming rig. Is there a new card in the pipeline that can comfortably play games at 2K without hurting my wallet... much?
 
Many people can't even see the difference between 4K and 1080p on smaller screens, and many others use the big-screen living room TV to play. There are also those who do not care about resolution as long as it runs smoothly. That's why 4K, as a standard, still struggles. TV signals haven't all been brought to 4K; some of them still come in lower resolutions, usually 2K or 1080p, so 4K seems useless in this scenario. Only the standardization of 4K for all screens on the market will force brands to match hardware optimizations for it. Meanwhile, there are plenty of 1080p 144Hz FreeSync/G-Sync screens for sale. One's gonna want to spend more on better hardware than on a high-res screen that will only be fully taken advantage of in years to come. Just my two cents.
 
Because you need high-end cards to push that many pixels, and high-end cards don't exist anymore... unless you want to pay about 4 grand.
 
I don't care about 4k gaming, but 4k is much better for productivity and WFH.

I run my Samsung 32" 4K Oled at 1440P for the main game I play (Apex Legends) - I get a solid 240fps vs ~170fps at 4K with no noticeable loss in image quality (I sometimes have to double check the in game resolution) You lose VRR/Gsync at non standard resolutions but the 3090 is able to easily maintain max fps so it's smooth as butter.

For slower games, 4K is fine though...

I thought the 3090 would be a drag at 4K but so far, it's been fine - my main issue is that I really prefer an Ultrawide but 3440x1440 is not great for work. I'm looking forward to getting probably the next gen 5120x2160 ultrawide when I can gain reasonable access to a better GPU. Sad they never made an OLED version of the more or less perfect 38" UW format...
 
Most devs would love to spend the time needed, and most project managers are trying to hold to a budget, and we end up with what we end up with.

I wouldn't blame devs... that bunch would love to have a highly optimized code base full of little fast-inverse-square-root fuckery functions.
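The "fast inverse root fuckery" being referenced is presumably the famous Quake III fast inverse square root. Here's that classic trick, lightly modernized with uint32_t and memcpy instead of the original's pointer cast (same idea, just defined behavior):

```c
/* rsqrt.c -- the classic fast inverse square root trick (Quake III style),
 * written with memcpy instead of the original pointer cast.
 * Approximates 1/sqrt(x) for positive floats. */
#include <stdint.h>
#include <stdio.h>
#include <string.h>

static float q_rsqrt(float x)
{
    float half = 0.5f * x;
    uint32_t i;
    memcpy(&i, &x, sizeof i);          /* reinterpret the float's bits */
    i = 0x5f3759df - (i >> 1);         /* magic-constant initial guess */
    memcpy(&x, &i, sizeof x);
    x = x * (1.5f - half * x * x);     /* one Newton-Raphson refinement */
    return x;
}

int main(void)
{
    for (float v = 1.0f; v <= 16.0f; v *= 2.0f)
        printf("q_rsqrt(%5.1f) = %f\n", v, q_rsqrt(v));
    return 0;
}
```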
 
The idea that game dev is not one of the places in the software world with the most optimization work being done tends to be overblown, imo.

Start a 3D engine by hand, don't optimize or use the full bag of tricks, and watch your frame rate do 1/20 of what studio-made game engines do...

And I am not sure how much of a link there is with 4K? A well-optimized game should be able to let you fully use a 9800X3D + 5090 system at 1440p if you want to... One that did not take the time to do it, and is not able to, and leaves extra performance unused on your hands... sure.

Does anyone think GTA 6 will run at 900-1200p because they did not work a lot on optimization? Rather than a giant amount of optimization that concluded the game looked better at 1200p@30fps than at 4k@fps (with the visual compromises needed to fit under the millisecond budget at 4K)?

4K is a lot like ray tracing on some hardware: a lot of cost for very little return for a cinematic-like game.
 