
4K monitors came out more than a decade ago, so why is gaming at this resolution still such a big deal?

maverick786us

Way back in 2008, when I was a PC gamer, I gamed at 1680 x 1050 on my 22-inch monitor. I had a GTX 460, which played games well at that resolution at medium settings. I enjoyed high-end, graphics-intensive games like Crysis, Crysis Warhead, Crysis 2 and Battlefield: Bad Company 2 on that rig.

Then came 2560 x 1600, which was considered high end at the time, and any graphics card that could play games at that resolution at medium to high settings cost more than $400. A year later, 4K appeared on the horizon. It was a niche market, so gaming at that resolution at medium to high settings meant spending a fortune on two or three top-end cards and running them in CrossFire or SLI.

It's been 15 years now. I was reading the review of this card, and the review talked about "high fps scores in 1080p and even 1440p." Of such a high-end card, the author said: "In 4K, the 5060 Ti hit 90 fps in Cyberpunk with 4X frame generation. It’s not a card you’d be buying for consistent 4K gameplay, but it’s still interesting to see it hit beyond 60 fps at such a high resolution." What is the reason? Are the game developers not serious about 4K PC gaming?
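(A quick aside on that quoted figure: "4X frame generation" means roughly one natively rendered frame for every four displayed, so a back-of-the-envelope sketch of the underlying render rate, assuming the review's 90 fps number, would be:)

# Rough sketch only: assumes the review's quoted 90 fps and ideal 4X frame generation,
# i.e. about one natively rendered frame per four displayed frames.
displayed_fps = 90        # figure quoted in the review
frame_gen_factor = 4      # "4X" frame generation
rendered_fps = displayed_fps / frame_gen_factor
print(rendered_fps)       # ~22.5 frames per second actually rendered by the GPU

In other words, on that assumption the card is natively rendering well under 30 fps at 4K, which is why the reviewer wouldn't recommend it for consistent 4K gameplay.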
 
It's not really the game developers but the hardware developers.
It takes quite a bit of horsepower to render high-res 4K images 60-120 times per second.
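To put rough numbers on that (a minimal sketch counting raw pixel counts only, ignoring per-pixel shading cost, ray tracing and upscaling):

# Back-of-the-envelope pixel throughput at common resolutions and frame rates.
resolutions = {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K": (3840, 2160)}
for name, (w, h) in resolutions.items():
    pixels = w * h
    for fps in (60, 120):
        print(f"{name} @ {fps} fps: {pixels * fps / 1e6:.0f} million pixels/second")
# 4K is ~4x the pixels of 1080p and ~2.25x the pixels of 1440p,
# so each frame costs roughly that much more to shade.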
 
I mean, it's an ever-moving goalpost. Any modern card can play older games at 4K+ smoothly, but more recent, demanding games, especially with ray tracing enabled, push cards to their limits. The linked GPU is also a mid-range card.
 
My perspective was that, with tech improving, game developers would optimize their games instead of making them more and more graphics-intensive.

I remember back in 2008, most GPU reviews used Crysis as a benchmark to show performance. It was an award-winning game; the only criticism was that the programming wasn't optimized. When Crysis Warhead was released, it ran smoothly, and it looked like the developers had optimized the code to make it a bit smoother.
 
There's a colloquial 'law' about this.

Though tbh various AAA games really are pushing it in terms of graphics, at least on the highest settings (which are mostly what get benchmarked in reviews). One can just knock down the resolution/settings and get better perf for the most part, same as ever.
 
Compared to what I remember, modern games look really good on low. Back in the day, low settings would cut things like draw distance. Can't see anything until it's 10 feet away. Well, maybe not that bad, but hope you were playing a melee build, and good luck navigating. That didn't stop me from buying a stupidly expensive video card because I like the bling, but being stuck on low wouldn't stop me from playing a game these days. Back in the day it could have. Draw distance had a massive effect on gameplay. Now it's all just looks.
 
It makes a lot of sense. Pushing resolution past, say, 720p is diminishing returns for something like a game. It's a nice-to-have if you have extra power (the kind of PC rare enough that people don't make games for it), but otherwise the compute budget should tend to go elsewhere (a Blu-ray of an actual movie at 720p will still tend to look better than a 4K-native video game...).

Same reason it took a long while for 1080p to become the norm on consoles after the first 1080p displays came out.

360 fps gaming would be a big deal for some types of games and not for others; for the games made for it, the diminishing returns are worth the compromise (same for 4K: for the types of games where it's worth the compromise, it's not a big deal, you just compromise for it).
 
4K is the common standard for TVs and movie-making these days, so we would expect cards to play games at this resolution at medium settings with AA disabled. Crysis is one game that brought even the best cards to their knees when it launched, but I enjoyed playing it at 1680 x 1050 with an ATI Radeon 4850, and later a GTX 460, at medium settings with AA disabled back in 2008-9. It's been 17 years now; is there a mid-range card that can play this game at 4K at medium settings with AA disabled?
 