Pixel Density and Your Delusions

How does scaling up negate anything? The purpose of higher resolution isn't "to get more stuff on the screen".

Even scaled, you still get the benefit of the higher resolution screen.

Getting more stuff on the screen is the only reason I would get a higher-res monitor. My current monitor is 1080p, too wide for one window and too narrow for two. I have used a 1440p and it is perfect for two windows side by side. I stuck with 1080p because the main purpose of my PC is gaming, and 1440p would require a more expensive graphics card in addition to a more expensive monitor. Other people have different uses though, so having options is a good thing, I guess.
 
My work gave me a Lenovo Yoga 460 with a 13" WQHD screen, and it is just too tiny to be usable. I had to set the resolution to 1600x900 to be able to use it. (Remote Desktop, which I use for servers, does not scale with the desktop scaling on a laptop.) I simply don't understand why people would want something that tiny.
I have a 4K laptop and Remote Desktop does scale on newer server OSes. If you're stuck maintaining old-ass servers... yeah, your life is going to suck.
 
Nah, because icons, window decorations and fonts are plenty sharp at a standard desktop DPI of 90 - 110. It doesn't add anything to the experience.
Except that it does, because I can fit more of everything else on the screen, the things that don't scale. Obvious example: scaling my desktop icons doesn't make something like Lightroom suddenly fit fewer image pixels on the screen when it's mapping actual pixels 1:1 to screen pixels, and the image looks much closer to what a standard 300 ppi print will look like. And if my desktop icons are SVG, they'll be less jaggy as well; circles will look more like circles, etc.
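As a quick illustration of the 1:1 point, here is a small sketch of how much of a photo fits on screen at 100% zoom; the 24 MP (6000x4000) image is an assumed example, not something from this thread.

```python
# How much of a photo is visible at 1:1 (100%) zoom on common monitors.
# The 24 MP (6000x4000) source image is an assumed example.

photo_w, photo_h = 6000, 4000
photo_px = photo_w * photo_h

displays = {
    "1080p (1920x1080)": (1920, 1080),
    "1440p (2560x1440)": (2560, 1440),
    "4K (3840x2160)":    (3840, 2160),
}

for name, (w, h) in displays.items():
    visible = w * h                      # photo pixels shown at 1:1
    print(f"{name}: {visible / 1e6:.1f} MP visible, "
          f"about {100 * visible / photo_px:.0f}% of the frame")
```

At 1:1 a 4K panel shows roughly four times as much of the image as a 1080p panel, which is the "fit more pixels on the screen" benefit regardless of how the desktop icons are scaled.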
 
To me there is a very simple formula for DPI: at the point where humans with the best vision cannot tell the difference between text or objects rendered with anti-aliasing and those rendered without it, at any distance they sit from the display, you have saturated DPI. That is when you can ditch AA and all the CPU and GPU computing power associated with it, and that will be a good day. I don't claim to be an expert in GPU architecture, but it seems to me that if we reach a future where AA is simply not needed, we could drop some ROPs and replace that die space with other, more important computational units, or simply use it to increase the GPU's ability to handle other effects. From a simple standpoint, AA is a massive burden on the graphics industry that was conceived as a workaround for low-DPI displays. If we get rid of low-DPI displays, we no longer need the workaround.

I have used 1080p and 1440p displays and neither achieves this on any computer monitor. On a phone, 1440p might be close. So maybe 4K is the end of the line for phones.
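For a rough sense of where that saturation point might sit, here is a small sketch using the common 20/20 benchmark of about one arcminute per pixel; the viewing distances are assumed typical values, and people with better-than-20/20 vision would push the numbers higher.

```python
import math

# "Saturated DPI" sketch: the PPI at which one pixel subtends about one
# arcminute, the rough resolving limit of 20/20 vision. Distances are
# assumed typical values; sharper eyes need more.

ARCMIN = math.radians(1 / 60)

def threshold_ppi(distance_in):
    """PPI where a single pixel spans 1 arcminute at the given distance."""
    pixel_pitch_in = distance_in * math.tan(ARCMIN)   # inches per pixel
    return 1 / pixel_pitch_in

for label, d in [("phone at ~12 in", 12),
                 ("desktop at ~24 in", 24),
                 ("desktop at ~30 in", 30),
                 ("TV at ~10 ft", 120)]:
    print(f"{label}: ~{threshold_ppi(d):.0f} ppi before pixels blur together")
```

Past those densities, jaggies on high-contrast edges can still be visible in motion, so this is a floor rather than a hard proof that AA becomes unnecessary.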
 
I look at my Dad's old Nexus 4 with (iirc) a 720p screen and my LG G4 with a 1440p screen, and they both look pretty much the same.

It isn't that important. I'll take colour accuracy/contrast/brightness efficiency over pixel density any day.

It's like digital cameras. I take snaps that mainly get shown on the web or a phone screen. I'd be perfectly happy with a really superb 3.2MP camera with a great lens.

My main TV at home is a 32" 540p screen. I stream 1080p material to it and it looks fantastic. I sit about 10 feet from it. Friends of mine who have 4K screens at home often remark, "That's a nice picture for an old TV!" and they go bug-eyed when I tell them it's only 540p. However, throwing 1080p-bitrate material at it and having it properly colour calibrated makes all the difference. Properly 50%-downscaled 1080p material doesn't have that edgy look and looks more film-like, to me anyway.
 
What no link to a 55 gallon drum of lube for our pixel grinding brothers and sisters?
 
To me there is a very simple formula for DPI: at the point where humans with the best vision cannot tell the difference between text or objects rendered with anti-aliasing and those rendered without it, at any distance they sit from the display, you have saturated DPI. That is when you can ditch AA and all the CPU and GPU computing power associated with it, and that will be a good day. I don't claim to be an expert in GPU architecture, but it seems to me that if we reach a future where AA is simply not needed, we could drop some ROPs and replace that die space with other, more important computational units, or simply use it to increase the GPU's ability to handle other effects. From a simple standpoint, AA is a massive burden on the graphics industry that was conceived as a workaround for low-DPI displays. If we get rid of low-DPI displays, we no longer need the workaround.

I have used 1080p and 1440p displays and neither achieves this on any computer monitor. On a phone, 1440p might be close. So maybe 4K is the end of the line for phones.


You are wrong here.

Brute forcing it by upping the resolution until you can't see any aliasing is many times more computationally expensive than well implemented AA.

I think many people don't realize just how difficult it is to drive higher resolutions.

For the same framerate, 4K takes roughly four times the GPU power and VRAM of 1080p, and 8K takes roughly 16 times.

Even when we do have that kind of GPU power, there will be better ways to spend that computational power (more detailed models and better effects) than throwing it at excessive resolution.

AA is here to stay. In a world where we do not have limitless computational power, it simply is the better solution.
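For reference, the 4x/16x figures above follow directly from pixel counts; a minimal sketch of the arithmetic (shading work and framebuffer memory scale roughly with pixels, so this is an approximation, not a benchmark):

```python
# Pixel-count arithmetic behind the 4x / 16x claim. Real GPU cost also
# depends on bandwidth, geometry, and effects, so treat this as a rough
# lower bound on how much harder higher resolutions are to drive.

resolutions = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
    "8K":    (7680, 4320),
}

base = resolutions["1080p"][0] * resolutions["1080p"][1]
for name, (w, h) in resolutions.items():
    px = w * h
    print(f"{name}: {px / 1e6:5.2f} MP, ~{px / base:.1f}x the pixels of 1080p")
```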
 
I told you guys I would pass on info on my new Dell gaming monitor, the S2417DG. At sub-$400 (price plus tax), and as a companion to my Acer X34, I like it. Now when I grab a browser window and drag it across, the window stays almost exactly a 1:1 replica and no resizing is needed; if it looks like it's going to fit, it probably will.

I ran up Mechwarrior Online and it ran beautifully: G-Sync running from 50 to over 100 FPS and extremely smooth. My X34 does about the same thing, but it's ultrawide, so the smaller monitor feels cramped. I also ran up StarCraft Remastered on the Dell, and that was sweet looking; it looks better there than on the X34. The Dell is doing exactly what I wanted it to do: it's a perfect companion to the X34 and handles the content the X34 doesn't really do well. I'm happy with it.
 
I told you guys I would pass on info on my new Dell gaming monitor, the S2417DG. At sub-$400 (price plus tax), and as a companion to my Acer X34, I like it. Now when I grab a browser window and drag it across, the window stays almost exactly a 1:1 replica and no resizing is needed; if it looks like it's going to fit, it probably will.

I ran up Mechwarrior Online and it ran beautifully: G-Sync running from 50 to over 100 FPS and extremely smooth. My X34 does about the same thing, but it's ultrawide, so the smaller monitor feels cramped. I also ran up StarCraft Remastered on the Dell, and that was sweet looking; it looks better there than on the X34. The Dell is doing exactly what I wanted it to do: it's a perfect companion to the X34 and handles the content the X34 doesn't really do well. I'm happy with it.


Seems tiny for that high of a resolution.

Even a 27" 1440p screen is a little small at that restaurant IMHO.

My old 2560x1600 screen was damned near perfect at 30" though.
 
Getting more stuff on the screen is the only reason I would get a higher-res monitor. My current monitor is 1080p, too wide for one window and too narrow for two. I have used a 1440p and it is perfect for two windows side by side. I stuck with 1080p because the main purpose of my PC is gaming, and 1440p would require a more expensive graphics card in addition to a more expensive monitor. Other people have different uses though, so having options is a good thing, I guess.

Well, if you think 1440p is perfect, then you need to try out a 3440x1440 ultrawide. It's the same damned display, but it's Ultra, lol.

Just for your information, I did the exact same thing a while back: I made a conscious decision to limit myself to 1080p gaming and save a bunch of money. Two years later I changed my tune and bought the Acer X34 and a 1070, and now a perfect companion monitor, a Dell S2417DG (some will want the 27" but I am happy with the 24"). This setup is really nice for what I do, and the 1070 drives it just fine with G-Sync all the way around and frame rates typically running between 50 and 120 FPS, no chugging and no tearing.
 
Seems tiny for that high of a resolution.

Even a 27" 1440p screen is a little small at that restaurant IMHO.

My old 2560x1600 screen was damned near perfect at 30" though.


But I had a 1080p monitor that this one replaced, and if I grabbed a window from the 34" and dragged it over, it blew up in size and I had to drag it back, resize it while guessing whether it would fit, then drag it over again. With this one it comes across almost 1:1; as long as it looks like it will physically fit, it fits. The 27" would do this a little better just because the physical heights of the two displays would be closer. But besides serving as a companion to the X34 for non-game apps like Teamspeak or browser windows, it has a secondary purpose: some game content doesn't support 21:9, and a title like StarCraft Remastered, which looks bad on the X34, looks terrific on the 24" Dell. So the pixel density might be tighter than it needs to be, no great benefit there, but the mechanics of moving windows between the two displays work flawlessly. Just a different angle to consider.
 
A lot of you are equating higher PPI with more UI real estate. They're not the same thing.

The discussion should be in two categories:
1) Why are all desktop OSes shit at higher PPI in one way or another (especially Windows, and yes, even OSX)?
2) At what point is higher PPI simply not going to affect viewing experience regardless of how good the software becomes (this is easy if we make 20/20 eyesight the benchmark).

For instance: a LOT of people keep complaining that high PPI is bad because it makes Windows fonts and icons either too hard to read or just not any better. THAT'S NOT A PPI PROBLEM. THAT'S A WINDOWS PROBLEM. That's like complaining your racing tires are bad because your stock 1990 Honda Civic still performs like ass. The icon doesn't look any better because it was made at a lower PPI, for a lower PPI. The fonts are too small because Windows is absolutely garbage at assigning the correct font size for a given screen size/resolution/PPI. Simple scaling is also not a very good solution, as anyone experienced with typesetting will tell you (hence why each size of a good font is crafted pixel by pixel, not simply scaled up).
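To make the font-size complaint concrete, here is a minimal sketch of the underlying arithmetic; the 9 pt font, the 96 dpi layout baseline, and the panel PPI figures are assumed illustrative values, not numbers from this thread.

```python
import math

# Rough sketch: how physically small a UI font drawn for 96 dpi becomes on
# higher-PPI panels when nothing rescales it. All numbers are illustrative.

BASE_DPI = 96                    # classic Windows layout assumption
FONT_PT = 9                      # a typical small UI font size
px = FONT_PT * BASE_DPI / 72     # points are 1/72 inch -> 12 px tall

panels = [
    ("24\" 1080p desktop", math.hypot(1920, 1080) / 24),   # ~92 ppi
    ("13\" WQHD laptop",   math.hypot(2560, 1440) / 13),    # ~226 ppi
    ("5\" 1440p phone",    math.hypot(2560, 1440) / 5),     # ~587 ppi
]

for name, ppi in panels:
    physical_pt = px / ppi * 72  # how big those 12 px actually are on glass
    print(f"{name}: 12 px of text is ~{physical_pt:.1f} pt physically "
          f"(intended size was {FONT_PT} pt)")
```

Without scaling, the same 12 px of text shrinks from roughly 9 pt on a 24" 1080p desktop to under 4 pt on a 13" WQHD laptop, which is exactly the "fonts are too small" complaint.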

Now if you were to complain that a 4K 5" phone screen is bad because your eye literally cannot distinguish between images or text rendered at that size and viewing distance at 4K versus 1440p (or even 1080p), then that is a totally valid complaint.
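As a rough sanity check on that phone claim, here is a small sketch comparing pixels per degree against the roughly 60 pixels per degree that 20/20 vision resolves; the 5" 16:9 panel and the 12" viewing distance are assumed values for illustration.

```python
import math

# Pixels per degree for an assumed 5" 16:9 phone viewed from an assumed
# 12 inches, versus the ~60 px/deg that 20/20 vision can resolve
# (1 pixel per arcminute). Purely illustrative numbers.

DIAGONAL_IN = 5.0
VIEW_DIST_IN = 12.0
ACUITY_PPD = 60

for name, (w, h) in {"1080p": (1920, 1080),
                     "1440p": (2560, 1440),
                     "4K":    (3840, 2160)}.items():
    ppi = math.hypot(w, h) / DIAGONAL_IN
    ppd = ppi * VIEW_DIST_IN * math.tan(math.radians(1))  # px per degree of view
    flag = "beyond" if ppd > ACUITY_PPD else "below"
    print(f"{name}: ~{ppi:.0f} ppi, ~{ppd:.0f} px/deg ({flag} the ~{ACUITY_PPD} px/deg limit)")
```

On those assumptions even 1080p lands above the acuity limit, which is consistent with not being able to see a difference between 1080p, 1440p, and 4K at that size and distance.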
 
Couldn't agree more.

1080p is more than plenty on a 5" class phone. My old Droid Turbo had a 1440p screen and I literally could not tell the difference in resolution.

Same for the desktop. At normal desktop viewing distances of about 2.5 ft there is no need for anything higher than ~110 ppi.

[Attachment 41699: chart relating resolution to screen size and viewing distance]

I disagree even with this chart and feel 4k should be reserved for 40"+ screens only at desktop viewing distances.

Television and desktop are far from being the same thing. In a picture recorded with a camera you get no jagged edges or hard contrasts, but with artificially rendered images you either need to perform anti-aliasing on edges for them to look smooth, or you need vastly more pixel density before the jagged edges stop being noticeable.

Also, the biggest picture-quality problem with television is motion blur; on TV and in movies everything is blurry as hell, so 4K doesn't make as much sense as it should. That further reduces the ability to differentiate on resolution alone. That's why comparing phone screens to televisions is not relevant at all.

Pixel density is important, but of course you shouldn't sacrifice viewing angles and brightness for ppi; no one ever suggested that. But putting out a clickbait article claiming ppi is not important is raw bullshit.
 
Television and desktop are far from being the same thing. In a picture recorded with a camera you get no jagged edges or hard contrasts, but with artificially rendered images you either need to perform anti-aliasing on edges for them to look smooth, or you need vastly more pixel density before the jagged edges stop being noticeable.

Very true, and - as we've been over repeatedly in this thread - extreme pixel density will NEVER be the solution to this problem, as it is orders of magnitude more computationally expensive than even the most intense anti-aliasing techniques. Even in some potential future state when we have GPU capacity much greater than we do today, that GPU capacity is better spent elsewhere (higher polygon count, better lighting effects, etc.) combined with good anti-aliasing techniques than it is on extreme resolutions.
 