When is resolution going to stop mattering?

piscian18, [H]F Junkie (joined Jul 26, 2005; 11,021 messages)
I've been blowing through backlog stuff and hit an Unreal Engine 3 game. It occurred to me after a while that the game looked good. It's not incredibly old (September 2010, Alien Breed, but that's not important).


I know I have personal requirements for the way games and movies look, though not necessarily at a set point. As I recall, even when Perfect Dark was released back in the N64 days I literally could not play it; blurry textures that big in my face are something I can't look at for too long. I watched Kyle's video updates for Quake and realized that even if it had a killer story, it's unlikely I could bring myself to play it. Stuff like Unreal Tournament 2004 still looks right on the limit of fine to me, and so does AVP2. I have no issue with "2D" games even as old as Mario for the NES, or stuff like Baldur's Gate for PC. If it's nice, crisp, painted stuff, I'm fine with it.

For movies and TV, 720p is my absolute floor, but at the same time, anything beyond 1080p I honestly just don't care about and would struggle to identify the difference in quality. I've watched 4K movies and just didn't notice any big difference.

Right now, for all our advances, consoles are realistically at 720p. The highest-detail game I can think of would be Battlefront at around 1440p on PC. I'm not talking about ultra-widescreen Surround/Eyefinity; that's just more view space. I don't think even Battlefront has assets that scale to properly support vertical Surround (someone would have to comment on what resolution that realistically is with 3x 1440p monitors). For my own vision I'm fairly confident 1080p is about my limit; beyond that I just won't be able to identify any further detail.

I realize I'm being a little vague. Yes, you can argue the semantics of what 1080p resolution means versus 240p and so on, but I think anyone here intelligent enough to know what I'm talking about understands I'm just using these as general detail/scale identifiers.

So what's your take on this? Is there a limit to the resolution at which you're able to discern small detail? Considering the lack of progress on console resolution, those manufacturers seem to think so. Is there a limit to how low-resolution a polygon-based game can look before you can't bring yourself to play it?

(Full disclosure: I have 20/15 vision.)
 
Most people won't notice much difference with 4K because most people have relatively small televisions and view them from relatively long distances. Watch a 133" projector from 9' or so and you will notice a massive difference.

Same goes for PC screens, more or less. 1080p is okay at 24", horrible at 40". Some people are fine with 1080p on a 24"; others (including myself) want a massive screen (40"+), and for that 1080p is nowhere near sufficient, but 4K is probably plenty (i.e. going to 8K or 12K wouldn't give much benefit, for me anyway).
 
I noticed a big difference going from 1080p to 1440p; I would go 4K but don't have the SLI setup for it.
If you ask me, games from 2008 look better than today's games just because you could see the craft the developer put into them.
Same thing with 8-bit games: you could see the pixels, which made it fun. Which is why Minecraft is popular.
 
I don't pay attention to resolution; I pay attention to PPI. The larger the screen, the higher the resolution required to avoid pixelated artifacts and provide a smooth, clear image. I prefer 105-110 PPI for PC usage myself, which would break down as follows (quick math check after the list):
  • 1920x1080 = 21"
  • 2560x1440 = 27"
  • 3440x1440 = 34"
  • 3840x2160 = 40"
  • 5120x2880 = 54"
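For anyone who wants to verify those pairings, PPI is just the diagonal pixel count divided by the diagonal size. A rough Python sketch (function name is mine; square pixels assumed):

Code:
import math

def ppi(width_px, height_px, diagonal_in):
    # pixels per inch, assuming square pixels
    return math.hypot(width_px, height_px) / diagonal_in

for w, h, d in [(1920, 1080, 21), (2560, 1440, 27), (3440, 1440, 34),
                (3840, 2160, 40), (5120, 2880, 54)]:
    print(f'{w}x{h} @ {d}": {ppi(w, h, d):.0f} PPI')  # all land in the 105-110 range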
Since we're talking about eyesight, mine is about 20/25 while wearing corrective lenses, 20/60 without (left eye has myopic astigmatism).
 
When is resolution going to stop mattering?
Depends on the screen size and how close you are to it, but I think we're there with 4K for the vast majority.

For TV/movies, 4K is enough.
To get any benefit from 8K you need a 200" screen from 4m away, which is just too big for many reasons, so it becomes pointless.
(An approximation, but you get the point.)
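A rough back-of-the-envelope version of that approximation, using the usual ~1 arcminute figure for 20/20 visual acuity (my own numbers and function names, Python):

Code:
import math

ARCMIN = math.radians(1 / 60)  # ~1 arcminute, the usual 20/20 acuity figure

def max_useful_distance_m(diagonal_in, width_px, height_px):
    # distance beyond which a 20/20 eye can no longer resolve single pixels
    pitch_m = (diagonal_in * 0.0254) / math.hypot(width_px, height_px)
    return pitch_m / math.tan(ARCMIN)

print(max_useful_distance_m(200, 3840, 2160))  # ~4.0 m: beyond this a 200" 4K screen already looks solid
print(max_useful_distance_m(200, 7680, 4320))  # ~2.0 m: you would have to sit this close for 8K to add anything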

For gaming sat close to the screen, and for desktop space, there may be some mileage in going above 4K with a physically larger screen.
But the vast majority won't appreciate any higher res than 4K.
The only way 8K will get traction is if the bandwidth to transport the video is already available, the cost of that bandwidth doesn't matter, the price of movies doesn't increase, and it replaces 4K TVs directly without a price increase.
Either that, or tech like HDR comes only with 8K, but even then it will be a hard sell.
 
I noticed a big difference going from 1080p to 1440p; I would go 4K but don't have the SLI setup for it.
If you ask me, games from 2008 look better than today's games just because you could see the craft the developer put into them.
Same thing with 8-bit games: you could see the pixels, which made it fun. Which is why Minecraft is popular.

That isn't why Minecraft is popular. The graphics have nothing to do with Minecraft's popularity. I think the reason some people like older games better is that current games are starting to get good enough to fall into the uncanny valley.
 
I always preface this with "here, I'm handing in my [H] card..." but it doesn't matter all that much to me. I still play DOS games, for eff's sake, at 320x200/240; I LOVE pixel art, and jagged edges have never really bothered me.

Sure, they can be a little distracting if they're distorting textures or transparencies, but most reasonable settings at 1080p look pretty good to me. I don't use heavy AA, just moderate AF, and then the rest of my resources go into real eye candy like shaders, shadows, lighting, etc.

I admire the people building insane machines that run things well at 4K, but it's just totally unnecessary for me. If I can play at 1080p at high settings, with AA dropped down a bit, at 60 FPS, that's great for me. I still keep buying new video cards each iteration just so I can keep cranking up the OTHER features while still playing at 1080p/60. Might be crazy, but it works for me. A steady frame rate and cool effects are much more important to me on the visual end of things than how non-jagged a line is. :D
 
I don't pay attention to resolution; I pay attention to PPI. The larger the screen, the higher the resolution required to avoid pixelated artifacts and provide a smooth, clear image. I prefer 105-110 PPI for PC usage myself, which would break down as follows:
  • 1920x1080 = 21"
  • 2560x1440 = 27"
  • 3440x1440 = 34"
  • 3840x2160 = 40"
  • 5120x2880 = 54"

This is pretty much the answer, but with a little note: you're not going to be sitting as close to that 54" as to that 21" (hopefully). :D

So that does change things a little: the further away you are, the lower the PPI can be.
 
You shouldn't mix movies and games in this regard; they're very different animals. Movies are "rendered" at virtually infinite resolution (minus the visual effects), while games are rendered at the resolution of your screen.

So we must separate rendering resolution from viewing resolution. Movies are downsampled from effectively infinite resolution to 4K or 1080p, while games were displayed at their rendering resolution until very recently, when DSR or VSR (depending on which camp you ask) was introduced.

For display resolution there is no need to go beyond 4K unless you have a screen so large it wouldn't fit through your front door anyway. But for the rendering resolution at which the actual image is generated, I think we'd need at least 64K to stop seeing a difference, possibly even 128K.
 
After what I'd consider a pretty huge jump from standard definition (480i/p) to 720p, I think 1080p wasn't as big a jump, and 4K is even less so, and I have a pretty large TV. Maybe that's just because 480i/p covered 25+ years of my life. That's mainly for movies, though; you can really only see so much detail in most films, you know?
For gaming, I personally don't see as huge a difference going from 1080p to 4K as from 720p to 1080p, and I'm on a 65" TV. I've heard the opposite from people with huge projection screens over 100 inches, but those are still pretty niche.
 
I have a 27" 1080p monitor, but that is the absolute largest I would go with 1080p. Anything over 27" would need to be bumped up to at least 1440p, and I wouldn't get 4K for anything less than 34".

So basically for me in terms of monitors:
21"-27" = 1080p
27"-34" = 1440p
34"+ = 4k

When it comes to TVs, I have a 55" 4K. I sit about 8' away from it, and I can tell the difference between 1080p and 4K content at that distance and screen size. But personally, I think 4K would be pointless for most TV seating arrangements with screens smaller than 50".

Of course, this is all a matter of preference.
 
We are still a long way away, especially for people with better than 20/20 vision.

I have 20/10 vision and can easily discern individual pixels on a 27" 1440p monitor from 2 feet away, and I can do the same on my 5" 1440p phone at a normal viewing distance.
I think a screen would need at least 10x the PPI for individual pixels to be indiscernible.
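As a lower bound from straight angular-resolution math (aliasing on high-contrast edges and hyperacuity can demand quite a bit more, which is presumably where a 10x figure would come from), here's a rough Python sketch with my own numbers:

Code:
import math

def required_ppi(distance_in, acuity_arcmin=1.0):
    # PPI at which one pixel subtends the given acuity angle at that viewing distance
    return 1.0 / (distance_in * math.tan(math.radians(acuity_arcmin / 60)))

print(required_ppi(24, acuity_arcmin=0.5))  # 20/10 vision at 2 ft -> ~286 PPI
print(math.hypot(2560, 1440) / 27)          # a 27" 1440p panel is only ~109 PPI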
 
This is pretty much the answer, but with a little note: you're not going to be sitting as close to that 54" as to that 21" (hopefully). :D

So that does change things a little: the further away you are, the lower the PPI can be.
For console gaming and movie watching I agree with your last point. But even from four feet away the pixelation bothers me when using my PC on my 40" 1080p TV.
 
I've got 20/15, and if I want to I can see the pixels on my TV, but honestly, at 65" from around 9-10 ft away, with a small amount of AA, a game in motion doesn't bother me in the least. I use a 24" 1080p monitor as well, and yes, I can discern pixels, but in games it doesn't bother me at all as long as the frame rate is fluid and there's plenty to occupy my vision.

When I'm laying out PCBs or schematics, I'm working with diagrams displayed with some transparency and a touch of AA, and really, schematic symbols aren't going to sear my eyes out just because I can discern the pixels on their edges.

Totally a matter of my own personal taste. There are other things some people don't notice that drive me absolutely crazy (mostly on the audio side of things), so I completely understand the different points of view. I'm just pickier about sound and less so about certain aspects of visuals.
 
A slight update on this: I'm ordering 3x ASUS PB278Q 27" WQHD 2560x1440 monitors today to replace my 3x 27" Acer G27 setup. The catalyst: my roommate wanted to buy mine, I broke an HDMI adapter this weekend, and STALKER: Call of Pripyat got escalated to the top of my backlog this week. I will not play STALKER on a single monitor if it runs in surround. I am a spoiled brat.

I wish there were more surround reviews on HardOCP; it looks like Kyle's moved on to a single 4K setup. Initially I had to ask myself whether I should be plunging into a surround 4K setup instead. If a game is capable of surround, that's how I will play it, with Counter-Strike being the lone exception. For me the immersion is beyond important.

However, even if a game supported surround 4K, and even if I built a monster PC to drive it, I have trouble believing I'd really benefit from it visually. I'm hoping I made the right decision in giving this 1440p thing a shot and that I'm not wasting my money.

I suspect that while Oculus the product might not be the future, I'll be using VR before I bother with 4K monitors.
 
This is pretty much the answer, but with a little note: you're not going to be sitting as close to that 54" as to that 21" (hopefully). :D

So that does change things a little: the further away you are, the lower the PPI can be.

This is how I feel. I have no desire for 4K really, but I do want a 27" 1440p monitor.
 
For phones/tablets: ~250-300 PPI
Computers: ~100-150 PPI
TVs: 60+ PPI

Anything much above that is just e-peen.
 
You can take any movie in 480p and it will look more realistic than anything rendered in-game by a computer at any resolution.

Making a scene more realistic would require either an exponentially larger number of polygons or a completely different method of drawing a 3D environment.

So instead, manufacturers and software designers have pushed resolution. It requires significantly fewer resources, but it does make a very noticeable difference: just draw bigger polygons (or add more, though not so many more as to be impractical) and add higher-resolution textures.

Hardware companies and software companies have to make better products; that's what makes you buy them. But they have to choose a path that is practical.

Just consider foliage. We are still using sprites to draw grass and leaves. What if each blade could be an actual 3D object? Little things like that would add lots of realism, but think of the massive amount of computation it would require. Resolution will never fix this; only a completely new outlook on how scenes are drawn will.
 
1440p on a 27" is so much crisper than 1080p. I've ordered a 28" 4K; I'd love to see how much better it will look.
 
I read on a science blog somewhere that human vision is roughly 576 megapixels (or something like that). If you are talking about photo-realism, then I would guess somewhere around there. Right now 4K UHD is only 8 megapixels, so... we have a ways to go.
 
Resolution will stop mattering once we can't easily see the individual pixels. We still need to get VR headsets to 4K before it just won't matter, period.
 
I read on a science blog somewhere that human vision is roughly 576 megapixels (or something like that). If you are talking about photo-realism, then I would guess somewhere around there. Right now 4K UHD is only 8 megapixels, so... we have a ways to go.
Well, if one wants to approximate an equivalent. Just as we don't see motion in frames, we don't see color in pixels. Resolution will technically stop mattering when the color we see on a display can blend with its neighbors while producing no visual artifacts.

I found your source, by the way.
http://www.clarkvision.com/articles/eye-resolution.html

For funsies and perspective, the 16:9 aspect resolution for 576,000,000 pixels would be 32000 x 18000.
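That 32000 x 18000 figure checks out. A quick sanity check (Python, helper name is mine):

Code:
import math

def res_16_9(total_pixels):
    # width and height of a 16:9 grid holding the given pixel count
    width = math.sqrt(total_pixels * 16 / 9)
    return round(width), round(width * 9 / 16)

print(res_16_9(576_000_000))       # (32000, 18000), as quoted
print(32000 / 3840, 18000 / 2160)  # ~8.3x the linear resolution of 4K UHD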
 
Well, if one wants to approximate an equivalent. Just as we don't see motion in frames, we don't see color in pixels. Resolution will technically stop mattering when the color we see on a display can blend with its neighbors while producing no visual artifacts.

I found your source, by the way.
http://www.clarkvision.com/articles/eye-resolution.html

For funsies and perspective, the 16:9 aspect resolution for 576,000,000 pixels would be 32000 x 18000.

That is around 8x 4K in each dimension, roughly 70x the pixel count. That seems about right to me.
 