Understanding 4K monitors and OS scaling. Best size for a sharp image.

sblantipodi

Hi,
most people who buy a 4K monitor expect it to produce a sharper image than a Full HD monitor.

This is not always true, at least not in every situation.

Suppose you buy a 24 inch 4K monitor. At native resolution, text, icons, and desktop apps in general will be far too small to read.
To solve this problem we can use OS scaling; Windows offers this function by "zooming the desktop" to 125%, 150%, and so on.

Zooming the desktop produces blurred images because it effectively stretches the desktop, like enlarging a small image to a bigger size.

The result is a 4K monitor that is less sharp than a Full HD monitor.

I would like to buy a 4K monitor, but 32 inch is too big for me. I use it on a desk, and on a 32 inch panel I can see the glow simply by moving my head to look from one corner to the other; 32 inch is simply a no-go for me.

I can tolerate a 28 inch but I would prefer a 27 inch.
This would be better for gaming too, because gaming on a 27 inch 4K monitor does not require antialiasing, while gaming on a 32 inch monitor requires some sort of AA (more or less, depending on taste).

That said:
Is there anyone here with a 27 inch 4K monitor who uses OS scaling?
If so, what is the right scaling setting to read comfortably without a blurred image?
Is it better to buy a 27 or a 28 in this regard?

With a 24 inch 4K monitor at 200% scaling you can do integer scaling and get a desktop as big as a Full HD monitor's...

What about 27 or 28 inch?
 
No no no. OS scaling does not just take the lower-resolution pixels and scale them up. That only happens in legacy apps.

What it does is report a higher DPI to the application, and rasterize vector graphics at a higher resolution. More detailed bitmap assets will be used as well. The result is a sharper image than is achievable on a lower resolution monitor, while UI elements maintain the same real-life size. There are some caveats, but I won't go into those.
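To make that concrete, here is a rough numeric sketch (Python, illustrative values only; the real Win32 DPI APIs are not modeled) of what a 150% setting means on a 4K panel:

```python
# Rough numeric sketch of what Windows "150%" scaling means on a 4K
# panel. Numbers only; the real Win32 DPI APIs are not modeled here.
native = (3840, 2160)
scale = 1.5                       # the "150%" setting
reported_dpi = round(96 * scale)  # DPI-aware apps are told 144 DPI

# A DPI-aware app keeps the full native canvas and simply draws every
# element 1.5x larger, rasterizing text and vectors at true 4K detail.
logical = tuple(round(px / scale) for px in native)

print(reported_dpi)  # 144
print(logical)       # (2560, 1440) of logical "room", drawn at 4K sharpness
```

Only a legacy, DPI-unaware app still renders at 96 DPI and has its finished bitmap stretched by that 1.5x factor, which is where the blur comes from.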

And no, integer scaling is not yet available on any monitor, as far as I know.
 
Well, Windows scaling is hit or miss, as I just found out with my new 4K TV. Most text in Windows or web browsers does display at higher quality. But there are some apps, Steam being a good example, where it just scales the low-res images up, and they actually look worse than they would on a 1080p or 1440p monitor with no scaling. I've also seen issues where some text or images scale and others don't, or the developers hard-coded spacing in the UI so half the text is cut off, etc. I mean, enough stuff works that it's OK to use, but it could definitely be better. I'd say that 27" 1440p is just about the perfect size and resolution to use without scaling.
 
OP, your best bet is to avoid scaling and make compromises on viewing distance and screen size. Personally, I would stay in the 100-150 PPI range, where you may get away with minimal scaling. 32" is IMO the minimum for a 4K monitor.

Just saw that the previous post mentioned 27" 1440p, which is what I am on for a few more days. My 27" 1440p display has ~109 PPI, which, as the last poster said, is workable without scaling.

OP, I get the impression you have yet to see a 4K monitor in person; apologies if I'm wrong, but don't underestimate the math and the PPI factor. A 27" 4K display has 160+ PPI; you would need serious scaling to make text readable.

Your remark about a 27" screen not needing AA and a 32" needing it is subjective. A 32" 4K monitor still offers a ~30% pixel density increase over my 27" 1440p display, despite sizing up. 130 PPI already begs for scaling; at 160 PPI you will have problems even navigating to the scaling settings to enable it in the first place.
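All the PPI figures in this thread come from the standard diagonal formula, PPI = sqrt(w^2 + h^2) / diagonal. A quick Python sketch, assuming 16:9 panels and the quoted marketing diagonals:

```python
# PPI for the sizes discussed in this thread, via the standard diagonal
# formula: PPI = sqrt(width_px^2 + height_px^2) / diagonal_inches.
from math import hypot

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    return hypot(width_px, height_px) / diagonal_in

for name, w, h, d in [
    ('27" 1440p', 2560, 1440, 27),
    ('27" 4K',    3840, 2160, 27),
    ('28" 4K',    3840, 2160, 28),
    ('32" 4K',    3840, 2160, 32),
]:
    print(f"{name}: {ppi(w, h, d):.1f} PPI")
# 27" 1440p: 108.8 PPI
# 27" 4K:    163.2 PPI
# 28" 4K:    157.4 PPI
# 32" 4K:    137.7 PPI
```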

The general idea with the resolution race is to let you go bigger while hanging onto pixel density. Windows will eventually find an elegant scaling solution; the problem is having third-party programs survive that scaling unharmed.
The lack of good system-deep scaling, at least on the Windows side, forces users to pick bigger screens to mitigate the need for scaling. If scaling worked as it should, picking a display size would be a choice, not a necessity.
 
Windows is built around 96 DPI at 100% scale.
So if you want to use 1.00x scale the ideal size is 46" and you get the full 3840x2160 workspace.
37" at 1.25x for a 3072x1728 workspace.
30.5" at 1.50x for a 2560x1440 workspace.
23" at 2.00x for a 1920x1080 workspace.

The problem is that anything other than full integer scales will blur legacy apps.
At integer scales you get unfiltered nearest neighbor scaling for legacy apps. They're still low resolution, but they aren't blurred.
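You can see why in one dimension: at exactly 2x, nearest neighbor just duplicates every source pixel, so edges stay hard, while a fractional scale has no whole-pixel mapping and the scaler has to blend. A small numpy sketch, standing in for what the compositor does to a legacy app's bitmap:

```python
import numpy as np

row = np.array([0, 255, 0, 255], dtype=float)  # 1-px black/white stripes

# 2.0x nearest neighbor: every pixel duplicated, edges stay intact
print(np.repeat(row, 2))  # [0. 0. 255. 255. 0. 0. 255. 255.]

# 1.5x with linear interpolation: in-between samples come out grey
x_new = np.linspace(0, 3, 6)  # resample 4 px onto 6 px
print(np.interp(x_new, np.arange(4), row).round())  # [0. 153. 204. 51. 102. 255.]
```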

If you choose to stick with 1x scaling on displays which are smaller than 46" then everything will start to get very small on-screen.
If you use 2x scaling with displays larger than 23" then everything will start to get very large on-screen.

Scaling is not just a Windows problem. OSX has its own set of issues with scaling.
What OSX does is, for any scale other than 100%, it renders the entire OS at 2x scale, and then scales the output from that as an image to fit your screen resolution.
So instead of only legacy applications being blurred as they are on Windows, everything is blurred on OSX. But apparently Mac users are fine with that. It does produce more consistent results I suppose.
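Numerically, the macOS approach looks something like this (Python sketch; the "looks like" values are typical options offered on a 4K panel, used here for illustration):

```python
# macOS scaled modes in numbers: render the whole OS at 2x the chosen
# "looks like" resolution, then resample that image to fit the panel.
native_w = 3840  # 4K panel width

for looks_like in [(1920, 1080), (2560, 1440), (3008, 1692)]:
    backing = (looks_like[0] * 2, looks_like[1] * 2)  # everything drawn at 2x
    ratio = native_w / backing[0]                     # final resample factor
    tag = "no resample, sharp" if ratio == 1.0 else "non-integer resample, blur"
    print(f"looks like {looks_like}: render {backing}, x{ratio:.3f} ({tag})")
# looks like (1920, 1080): render (3840, 2160), x1.000 (no resample, sharp)
# looks like (2560, 1440): render (5120, 2880), x0.750 (non-integer resample, blur)
# looks like (3008, 1692): render (6016, 3384), x0.638 (non-integer resample, blur)
```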
I get a headache from using the scaled modes on OSX because my eyes are straining to "focus" the scaled image.
And the only apps on OSX that render in HighDPI mode are new apps specifically written to support it. Windows has a legacy scaling mode that sometimes works and sometimes breaks applications. So people get the impression that OSX handles scaling better, because there's no way to even try rendering old apps in HighDPI mode.
Though I will say that Mac developers generally do a much better job supporting HighDPI. Most Windows developers seem stuck in the mindset of "why would I ever buy a high-res monitor and not use it all as workspace", and just don't build HighDPI support into their applications.

This would be better for gaming too, because gaming on a 27 inch 4K monitor does not require antialiasing, while gaming on a 32 inch monitor requires some sort of AA (more or less, depending on taste).
Not even close. You may be thinking of how UHD downsampled to 1080p looks, rather than native UHD.

OP, your best bet is to avoid scaling and make compromises on viewing distance and screen size. Personally, I would stay in the 100-150 PPI range, where you may get away with minimal scaling. 32" is IMO the minimum for a 4K monitor.
Just saw that the previous post mentioned 27" 1440p, which is what I am on for a few more days. My 27" 1440p display has ~109 PPI, which, as the last poster said, is workable without scaling.
1440p works better on the desktop at 27-32" for sure, since you don't require scaling.
Though there are many drawbacks to them, the main one being that they're only standard-DPI displays, which look pretty bad if you're used to phones/tablets/notebooks with HighDPI screens. And 1440p is very demanding for gaming.
With UHD you have the option of rendering at native UHD (really demanding but fine for old games) or at 1080p for new games.

That's why I'm really excited for the upcoming UHD 144Hz monitors. Not because I expect to play new games at 144 FPS at that resolution, but because it gives you the flexibility of being able to also display 1080p at 144 FPS, rather than being limited to 60Hz like the current UHD monitors.

The general idea with the resolution race is to let you go bigger while hanging onto pixel density. Windows will eventually find an elegant scaling solution; the problem is having third-party programs survive that scaling unharmed.
The lack of good system-deep scaling, at least on the Windows side, forces users to pick bigger screens to mitigate the need for scaling. If scaling worked as it should, picking a display size would be a choice, not a necessity.
Windows already has good support for scaling; the entire problem is third-party and legacy applications.
 