So, Where are the 5K monitors?

KazeoHin

I don't know if this has been asked, but where exactly are all the 5K monitors? We only have two 2880p panels IN THE WORLD at the moment. You'd think we'd see all kinds of 1%'er products with 5K resolution on 150" screens or some other nonsense. I would probably kill, or at least become violent, for a 50" 2880p monitor with decent contrast, and I'm probably not the only one.


Your thoughts, ladies and gentlemen.
 
I'm sure we'll see more 27” 5Ks after DP1.3 becomes standard. Not sure if large 5K displays will appear since the TV people will go to 8K, if anything.
 
I honestly think we've reached a point where 4K is enough resolution for just about anyone. To someone who grew up with CRT televisions and 17" 1024x768 monitors, 1080p was more than enough. But 4K... 4K is reaching the point where making the pixels any smaller would be difficult for the human eye to notice unless you made a really huge display and sat right in front of it.

On top of that, a lot of existing hardware can barely push 4K, even stuff that's fairly high-end. 4K monitors also suck at scaling 1080p in a way that doesn't make it look blurry, so you pretty much need native content.

I think 8K will be the next major resolution, and I don't think we'll get there soon. After 8K... I doubt people will want to go any higher.

Right now, more people want 2K or ultrawide monitors because they're a step up from 1080p that their systems can actually handle.
 
There was a time when 1440p was simply too demanding for gaming at the high end, but now it's becoming a pretty standard resolution.

I use a 40 inch 4k screen, and I could easily see upgrading to a 50" 5k screen. More desktop space, more immersive games...

I'm just wondering why there aren't any 5K screens AT ALL outside of the 27" Mac, and the 27" Mac with an HP logo on it. You would imagine at least ONE panel manufacturer would put out some halo 50+ inch product.
 
There was a time when 1440p was simply too demanding for gaming at the high end, but now it's becoming a pretty standard resolution.

Well, yeah. Everyone wanted more vertical real-estate and lamented the decline of 1920x1200 monitors. It was a natural progression, and gaming was the only thing that slowed it down.
I use a 40 inch 4k screen, and I could easily see upgrading to a 50" 5k screen. More desktop space, more immersive games...

Now, that might be what you're missing here. You have a 40" computer screen. The television in my living room is 42", and the largest computer screen I own is 23". I live in a cramped space and don't have room for a bigger version of either one. I suspect a lot of people are in the same boat.

I'm just wondering why there aren't any 5K screens AT ALL outside of the 27" Mac, and the 27" Mac with an HP logo on it. You would imagine at least ONE panel manufacturer would put out some halo 50+ inch product.

You know, I would think so too... sure, maybe it wouldn't be common, but someone would do it just to show they could. But it seems like these days, companies don't take chances and won't produce anything that the majority of people won't buy. The market tends to shut down choice, and whatever the majority wants becomes the only thing they make very, very quickly. For instance, I don't like typing on a touchscreen and would prefer a phone with a keyboard. I also find finger-based touch imprecise compared with a stylus. But Apple released the iPhone, and everyone loved it. Then it was copied by almost everyone. Now I can't find a phone that isn't designed like an iPhone.

The market is increasingly such that if you don't like what everyone else likes, you're out of luck. It's always sort of been that way, but it's getting more extreme and unusual/niche products are being killed off faster than ever before.

People who want displays with higher resolution than 4K are generally people who just want a ridiculous number of pixels so that they "don't have to see a pixel." They usually rely on scaling everything up so the image looks "less blocky," which leaves effectively the same amount of usable space as a lower resolution. Hence the 27" monitors.
 
The resolution race is pretty much over. Now we want less blur, higher refresh rates, and better contrast, and monitor prices need to be kept in line with VR headsets. 5K displays have been on the market for some time, and they did not grab much market share in that period. High-quality 4K video is already better than real life; going to 5K will not improve it unless it comes with higher contrast and refresh rates. What I miss in the market are not 5K displays, but 4K displays in the optimal size: ~35". 32" is too small to use at native resolution, and 40" is big enough to make ergonomics an issue.
 
I think they're going to skip 5k and go right to 8k. This will never stagnate, because they have to keep convincing us all to go waste more money on new tech we don't need.
 
Hisense 65" 8K TV is coming out this year with lots of bells and whistles.
 
Part of the problem is the standards, as Luke M mentioned above. If you look at the Dell 5k, it requires two DP inputs to push the 5k. If I recall, the Dell is also basically a dual monitor in one with no bezel in the middle. I just don't see 5k catching on honestly. I think we will all see 4k be the next big thing in monitors for a while, and after that it will be 8k. Once the new DP and HDMI standards come out, they will support 5k and maybe you will see more monitors supporting it.

Now, that might be what you're missing here. You have a 40" computer screen. The television in my living room is 42", and the largest computer screen I own is 23". I live in a cramped space and don't have room for a bigger version of either one. I suspect a lot of people are in the same boat.

As athenian200 said, size is often a limitation too. I think 50" is too big for me personally. Depending on your own DPI preference with 4k, somewhere between 35" and 45" should be enough.

Scaling is another problem even though it is getting better. I still prefer native resolutions since it is always supported and displayed as intended. 5k would require a much larger display to give me a usable DPI.

If they fix many of these issues, 5k could become more popular, but by that time I'm guessing 8k will already be here.

Personally, a really nice OLED 4k 40"~ monitor with up to 120hz is more appealing than anything.
 
a 30"-32" 5K OLED monitor with 100% Adobe RGB, TB3/DP1.3 and g-sync... drool
 
30-32" is too small for me at 4k, couldn't even imagine 5k. Can't see or read anything in windows like that. Windows doesn't scale well at all so I can't run anything other than 100%. If Windows fixed that some how or you could run at say, 200% to not effect scaling that might work? Seems to me that 8k would be the best candidate for something like that though.

What is the fascination with 5k exactly? Just Higher PPI?
 
I'm just wondering why there aren't any 5K screens AT ALL outside of the 27" Mac, and the 27" Mac with an HP logo on it. You would imagine at least ONE panel manufacturer would put out some halo 50+ inch product.
Wasn't Dell the first one to release a 5K monitor?

I think they're going to skip 5k and go right to 8k. This will never stagnate, because they have to keep convincing us all to go waste more money on new tech we don't need.
Where/when do you think the 7K ultrawide at 6880x2880 will slot into that timeline? ;)

30-32" is too small for me at 4k, couldn't even imagine 5k. Can't see or read anything in windows like that. Windows doesn't scale well at all so I can't run anything other than 100%. If Windows fixed that some how or you could run at say, 200% to not effect scaling that might work? Seems to me that 8k would be the best candidate for something like that though.

What is the fascination with 5k exactly? Just Higher PPI?
Maybe not the only reason, but it's double 1440p, in the same way "8K" is double UHD which is double 1080p, thus allowing even scaling down to 1440p. I would imagine many people find 1440p on a 27" screen to be a happy medium, at least a lot more so than 27" 4K or 27" 1080p. I just wish they'd make a reasonably priced 27" 2560x1600.
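Quick sketch of that doubling math in Python, if anyone wants to check it (the resolutions are just the usual marketing names):

pairs = {
    "5K -> 1440p": ((5120, 2880), (2560, 1440)),
    "8K -> UHD/4K": ((7680, 4320), (3840, 2160)),
    "UHD/4K -> 1080p": ((3840, 2160), (1920, 1080)),
}

for name, (hi, lo) in pairs.items():
    fx, fy = hi[0] / lo[0], hi[1] / lo[1]
    print(f"{name}: {fx:g}x per axis, clean integer scaling: {fx == fy and fx.is_integer()}")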
 
Maybe not the only reason, but it's double 1440p, in the same way "8K" is double UHD which is double 1080p, thus allowing even scaling down to 1440p. I would imagine many people find 1440p on a 27" screen to be a happy medium, at least a lot more so than 27" 4K or 27" 1080p.

I can see that... sort of. 1440p at 27" is about 108 PPI, where the equivalent 5k at 108 PPI is 54". I mentioned this before, but that is way too large for me and, I assume, most people's desks.
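For anyone who wants to run that kind of math themselves, here's a rough Python sketch (standard diagonal-PPI formula; the numbers are just the ones from above):

import math

def ppi(width_px, height_px, diagonal_in):
    # pixels per inch along the diagonal
    return math.hypot(width_px, height_px) / diagonal_in

def size_for_ppi(width_px, height_px, target_ppi):
    # diagonal (inches) needed for a resolution to land at a target PPI
    return math.hypot(width_px, height_px) / target_ppi

print(round(ppi(2560, 1440, 27), 1))              # ~108.8 PPI for 27" 1440p
print(round(size_for_ppi(5120, 2880, 108.8), 1))  # ~54.0" for 5K at that same PPI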

Moving to something like 4k right now is mostly about getting more desktop real estate while keeping Windows at 100% scaling. Games do look nice though with the extra lines of resolution, don't get me wrong.

40"~ 4k screens also open up the ability to run 21:9 for the extra FOV in games that support it. This is one of the things I do right now, but also get a better desktop space when being productive than the ultra-wide format monitors. You get stuck with 21:9 at all times and that doesn't work for me.

On the other hand, 5k at 27" is okay if you do 200% scaling, but you don't gain any desktop real estate. Until graphics cards can keep up, I don't see ~220 PPI being useful in any meaningful way. Personally, I don't see see pixels at 100-110 PPI, do you? Maybe you have amazing eye sight?

Again, eventually I see a possible market for really high pixel density but I'm comfortable with what we have now as I assume most other people are too. My concerns are focused on usable desktop space, input lag, refresh rates, and new panel types. R&D should be focused on those things first, then maybe graphics card manufacturers will have the power to go for uber dense monitors.

Just my opinions obviously.
 
I can see that... sort of. 1440p at 27" is about 108 PPI, where the equivalent 5k at 108 PPI is 54". I mentioned this before, but that is way too large for me and, I assume, most people's desks.

Moving to something like 4k right now is mostly about getting more desktop real estate while keeping Windows at 100% scaling. Games do look nice though with the extra lines of resolution, don't get me wrong.

40"~ 4k screens also open up the ability to run 21:9 for the extra FOV in games that support it. This is one of the things I do right now, and I still get better desktop space for productivity than the ultra-wide format monitors, which keep you stuck with 21:9 at all times; that doesn't work for me.

On the other hand, 5k at 27" is okay if you do 200% scaling, but you don't gain any desktop real estate. Until graphics cards can keep up, I don't see ~220 PPI being useful in any meaningful way. Personally, I don't see pixels at 100-110 PPI, do you? Maybe you have amazing eyesight?

Again, eventually I see a possible market for really high pixel density but I'm comfortable with what we have now as I assume most other people are too. My concerns are focused on usable desktop space, input lag, refresh rates, and new panel types. R&D should be focused on those things first, then maybe graphics card manufacturers will have the power to go for uber dense monitors.

Just my opinions obviously.


The issue is that modern UIs are STILL designed around 100 PPI. Even scaled to 200%, things still look blurry and crappy. I currently use 5K worth of desktop space (3x 1080p + 1 4K) and even then I can see wanting more for productivity. I'm happy with 110 PPI, and things look fine to me, I just want more desktop space.
 
I can see that... sort of. 1440p at 27" is about 108 PPI, where the equivalent 5k at 108 PPI is 54". I mentioned this before, but that is way too large for me and, I assume, most people's desks.

Moving to something like 4k right now is mostly about getting more desktop real estate while keeping Windows at 100% scaling. Games do look nice though with the extra lines of resolution, don't get me wrong.

40"~ 4k screens also open up the ability to run 21:9 for the extra FOV in games that support it. This is one of the things I do right now, and I still get better desktop space for productivity than the ultra-wide format monitors, which keep you stuck with 21:9 at all times; that doesn't work for me.

On the other hand, 5k at 27" is okay if you do 200% scaling, but you don't gain any desktop real estate. Until graphics cards can keep up, I don't see ~220 PPI being useful in any meaningful way. Personally, I don't see pixels at 100-110 PPI, do you? Maybe you have amazing eyesight?

Again, eventually I see a possible market for really high pixel density but I'm comfortable with what we have now as I assume most other people are too. My concerns are focused on usable desktop space, input lag, refresh rates, and new panel types. R&D should be focused on those things first, then maybe graphics card manufacturers will have the power to go for uber dense monitors.

Just my opinions obviously.

It's about the 200% scaling for me. The difference between 27" 1440p at 100% and 27" 5k at 200% is ridiculous to me. Much more pleasing to the eye. Hell, even my 25" 1440p monitor looks significantly better than my 27" 1440p monitor just due to the ppi. I don't think we need 220 ppi on desktop monitors though which is why I said 30-32" 5K.

I guess it just depends on eyesight. I've heard both arguments and it usually ends in an eyesight discussion.
 
It's about the 200% scaling for me. The difference between 27" 1440p at 100% and 27" 5k at 200% is ridiculous to me. Much more pleasing to the eye. Hell, even my 25" 1440p monitor looks significantly better than my 27" 1440p monitor just due to the ppi. I don't think we need 220 ppi on desktop monitors though which is why I said 30-32" 5K.

I guess it just depends on eyesight. I've heard both arguments and it usually ends in an eyesight discussion.


COMPLETELY agree with this.
I'd already have a 5k monitor, but 27" is just too small for perfect 2x scaling in terms of effective size of onscreen elements.
But a 5k 30-32" would be basically perfect
 
I expect that we'll see more 5K monitors once DisplayPort 1.4 gets here and it can be done over a single cable.

30-32" is too small for me at 4k, couldn't even imagine 5k. Can't see or read anything in windows like that. Windows doesn't scale well at all so I can't run anything other than 100%. If Windows fixed that some how or you could run at say, 200% to not effect scaling that might work? Seems to me that 8k would be the best candidate for something like that though.
What is the fascination with 5k exactly? Just Higher PPI?
Windows does 2x scaling just fine. The problem is that many legacy applications don't support scaling well.
Many people also have a bad impression of Windows' scaling from trying to use non-integer scaling, or using scaling on Windows 7 or earlier.
Windows 8.1/10 handle scaling much better, and integer scaling is handled differently from non-integer scaling. Non-integer scaling has problems which aren't really possible to solve.

Apple doesn't have this problem because they don't care about supporting legacy applications.
They also render non-integer scaling in a very different way from Windows. Personally I really dislike how Apple does it.
OSX renders everything at 2x and then scales that as an image to fit your display. This means that non-integer scaling results in everything being blurry.

Windows will render UI elements and text natively at your non-integer resolution so that everything is nice and sharp. The issue here is that some elements may not render cleanly, and legacy applications will be blurred. But only legacy applications are affected, rather than all applications on OSX.
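Rough sketch of the arithmetic behind the macOS approach described above, as I understand it (illustrative numbers only, not Apple's actual implementation):

# macOS-style non-integer scaling, roughly: render everything at 2x the
# "looks like" resolution, then resample that whole image down to the panel.
panel = (3840, 2160)        # example: a physical 4K panel
looks_like = (2560, 1440)   # the user-selected effective resolution

backing = (looks_like[0] * 2, looks_like[1] * 2)  # 5120x2880 render target
downscale = panel[0] / backing[0]                 # 0.75 -> fractional resample

print(f"render at {backing}, then scale the image by {downscale:.2f} to fit {panel}")
# that fractional resample is why non-integer modes end up looking soft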

The issue is that modern UIs are STILL designed around 100 PPI. Even scaled to 200%, things still look blurry and crappy. I currently use 5K worth of desktop space (3x 1080p + 1 4K) and even then I can see wanting more for productivity. I'm happy with 110 PPI, and things look fine to me, I just want more desktop space.
If you want more workspace, you need to buy a bigger display.
People need to stop confusing resolution with workspace.
If you've bought/used a phone, tablet, or notebook in the past five years, I'd be surprised if you were still happy with how text looks on a standard 110 PPI monitor. (especially with the way that Windows renders fonts)

COMPLETELY agree with this.
I'd already have a 5k monitor, but 27" is just too small for perfect 2x scaling in terms of effective size of onscreen elements.
But a 5k 30-32" would be basically perfect
That's just personal preference really. A 27" 5K monitor at 2x is exactly the same as a 1440p monitor at 1x, which is the standard 110 PPI used for most monitors today.
Personally I preferred it when monitors were 16:10 and 100 PPI rather than 16:9 and 110 PPI, so 29" would be ideal. 32" is too big for that resolution in my opinion. But I'd rather 32 than 27 I suppose.
 
If you've bought/used a phone, tablet, or notebook in the past five years, I'd be surprised if you were still happy with how text looks on a standard 110 PPI monitor. (especially with the way that Windows renders fonts)
Most people don't view their phones/tablets at the same distance as a desktop monitor, so DPI isn't the only factor. A 10" 1080p tablet viewed at 1ft is the "same" as a 20" 1080p monitor viewed at 2ft, despite the doubling of DPI/PPI, right?
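The angular-size math behind that comparison, roughly (same sizes and distances as the example above):

import math

def visual_angle_deg(size_in, distance_in):
    # angle subtended by a screen dimension at a given viewing distance
    return 2 * math.degrees(math.atan((size_in / 2) / distance_in))

print(round(visual_angle_deg(10, 12), 1))  # 10" tablet at 1ft: ~45.2 degrees
print(round(visual_angle_deg(20, 24), 1))  # 20" monitor at 2ft: ~45.2 degrees
# same apparent size, but the tablet packs twice the PPI into that angle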
 
I just hope that OS and program scaling won't still be horrible in Windows by the time 5K and higher resolutions become ubiquitous. Otherwise they're going to need to make 5K monitors that are at least 55"...
 
If you want more workspace, you need to buy a bigger display.
People need to stop confusing resolution with workspace.
If you've bought/used a phone, tablet, or notebook in the past five years, I'd be surprised if you were still happy with how text looks on a standard 110 PPI monitor. (especially with the way that Windows renders fonts)

A bigger display with the same resolution gives you the SAME workspace. I don't understand what you mean by saying resolution doesn't relate to workspace (assuming the same scaling).

Also, phones and tablets are much different due to their size, different operating systems (other than Surface), and viewing distance.
 
Most people don't view their phones/tablets at the same distance as a desktop monitor, so DPI isn't the only factor. A 10" 1080p tablet viewed at 1ft is the "same" as a 20" 1080p monitor viewed at 2ft, despite the doubling of DPI/PPI, right?
No, text will be the same physical size, but have twice the resolution on the smaller display as a result of the higher pixel density.
You need to stop equating resolution with size.

18pt text should be displayed as being 1/4" tall on all displays for example, because points are a physical measurement. (1/72 of an inch)
The higher the pixel density a display has, the more resolution that text will be displayed with.

Workspace should be determined by display size.
Resolution should be determined by pixel density.

Until we have much higher resolution displays where non-integer scaling is no longer a problem, the ideal situation would be for display manufacturers to stick with sizes where it makes sense to use integer scaling. (1x, 2x, 3x etc.)
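The point-to-pixel arithmetic behind that, as a quick sketch (72 points per inch; the DPI values are just examples):

def point_size_in_pixels(points, dpi):
    # points are a physical unit: 1/72 of an inch
    return (points / 72.0) * dpi

for dpi in (96, 110, 163, 218):  # ~1x Windows, 27" 1440p, 27" 4K, 27" 5K
    print(f"18pt at {dpi} DPI -> {point_size_in_pixels(18, dpi):.0f}px tall (always 0.25 inches)")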

I just hope that OS and program scaling won't still be horrible in Windows by the time 5K and higher resolutions become ubiquitous. Otherwise they're going to need to make 5K monitors that are at least 55"...
OS scaling is good right now. The problem is that people want to run programs built before scaling was commonplace.
There isn't really anything more that the OS can do to fix this, beyond what it's already doing.
 
No, text will be the same physical size, but have twice the resolution on the smaller display as a result of the higher pixel density.
You need to stop equating resolution with size.

18pt text should be displayed as being 1/4" tall on all displays for example, because points are a physical measurement. (1/72 of an inch)
The higher the pixel density a display has, the more resolution that text will be displayed with.

Workspace should be determined by display size.
Resolution should be determined by pixel density.
I'm not sure I fully understand your argument.

If I have a 24" 2560x1440 screen, and I set it at 2 feet away, how is that any different in terms of workspace if I get a 27" 2560x1440 screen at set it at 3 inches farther away? They should look effectively no different, and I have the same amount of workspace. Sure, if I say that text has to be displayed the same measured height of 1/4 inch, regardless of viewing distance, I need a physically larger screen to display the same amount, but if I have the smaller screen positioned closer to me, I don't need the text to be set to the same point size to still be effectively readable.
 
No, text will be the same physical size, but have twice the resolution on the smaller display as a result of the higher pixel density.
You need to stop equating resolution with size.

18pt text should be displayed as being 1/4" tall on all displays for example, because points are a physical measurement. (1/72 of an inch)
The higher the pixel density a display has, the more resolution that text will be displayed with.

Workspace should be determined by display size.
Resolution should be determined by pixel density.

I disagree with your statements also.

You say that 18pt SHOULD be displayed as a fixed size, but that isn't how it actually works. Do you get out a ruler each time you open a web browser or a Word document to make sure 18pt font is being displayed at 1/4"? I'm thinking you, like most people, do not. Windows and applications don't do this either. They will not change the zoom level or scaling so that things in the software's output match any real-life measurement on your chosen display size/resolution.

Moreover, PPI has a much bigger impact than anything else with current software, since the standard being designed around is 100PPI. 18pt font isn't programmed with a physical size of 1/4" but by how many pixels it takes up. With a fixed number of pixels in mind, you can see how workspace can only be determined by display size if resolution also increases or you make changes to your settings. You are thinking in an ideal sense, which is not how it happens in the world of computing.

For example, increasing the size of the display without changing the resolution does nothing but decrease the overall PPI. Same pixels spread out over a larger space. If you look at 18pt font displayed at 100% in Word with Windows at 100% scaling on a 24" display at 1080, then increase your display size to 27" at 1080, assuming you keep the same settings, the text will now increase in size, since it takes the same number of pixels but you have those pixels spread out wider on your screen.

If you decide to go up in display size and increase the resolution, then you can keep the font at the same size it was. This is why when moving into 4k displays we are moving into 40-45". This allows the PPI to stay the same and text to look as it did on a smaller lesser resolution screen. It also has the added benefit of increasing your available workspace by the same reasoning. You can now fill more pixels with stuff since you have more of them to fill.

Now if you want to increase the PPI beyond the 100PPI that is currently standard on the PC, you can go with a higher-resolution small screen. If you manipulate the settings in your OS of choice to make it appear to be the same size, but with a higher resolution, the text will look the same size to your eye, but much clearer due to increased resolution (at least in theory, if programmed correctly). This also assumes that the application and operating system both appropriately handle this. Often this is not the case, which is why many, including myself, stick to the ~100PPI mark. Windows and applications in Windows do not as a whole have this dialed in yet.

Windows has come a long way and does support much better scaling, but not all applications do. Android and other phone OSes have made much more progress with their implementations, as have their applications, so this isn't as big of a problem there. If they hadn't, you couldn't see or use a 2-4k screen on a 5" phone. Mobile OS programmers have had to design beyond the 100PPI standard of desktop computing to circumvent this issue. The PC, on the other hand, has not fully embraced this kind of thing yet. Comparing a tablet/phone OS to the PC in this regard is not equal.

A combination of things is at play here, so it can't be covered by those blanket statements you made about workspace and resolution. Display size AND resolution determine pixel density. Workspace is determined by display size, resolution, and application programming. So when you say, "The higher the pixel density a display has, the more resolution that text will be displayed with," this isn't necessarily the case. It has the POTENTIAL to be displayed with more pixels, but only if the OS and software are programmed to do so and you have chosen those options. Applications in Windows are not all programmed like this and thus do not react this way in practice. Maybe someday both OS and applications on the desktop will, and then I can see 5k-8k displays becoming more commonplace, which is more on topic for this thread.

Hope this helps clarify the subject just a little. I didn't go into everything obviously but just tried to give a brief overview.
 
Like I said, equating the number of pixels your display has to workspace is old thinking based on everything being rendered at a fixed scale, and very soon that has to change.

I believe Windows still uses 96 DPI as the base for 1x scale.
So a 1080p monitor should be about 23" in size. That's fine.
A 1440p monitor should be about 30.5" in size. Hmm, a bit larger than most current displays.
2160p should now be 45" in size. Pretty big, but some people are buying TVs to use as monitors.
5K should be... 61" in size.
8K should be... 92" in size.

Yeah, people aren't going to be buying 92" monitors.
Display scaling has to happen.
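The arithmetic behind those sizes, if anyone wants to check it (assuming the 96 DPI base mentioned above):

import math

BASE_DPI = 96  # Windows' 1x scale

for name, (w, h) in {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "2160p/4K": (3840, 2160),
    "5K": (5120, 2880),
    "8K": (7680, 4320),
}.items():
    diagonal = math.hypot(w, h) / BASE_DPI  # size needed for 1:1 pixels at 96 DPI
    print(f'{name}: ~{diagonal:.0f}" at {BASE_DPI} DPI')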

With a modern operating system, you set the display DPI (or it's set automatically) and display scaling will ensure that our 18pt text (and everything else) is displayed at the same physical size (1/4") on all displays regardless of their size or how many pixels they have.

The higher your pixel density, the more resolution (detail) everything is rendered with.
The larger your display, the bigger your workspace will be.

So a 23" monitor would have the same amount of workspace regardless of whether it has a 1080p, 4K, 8K, or 16K resolution. What changes is the amount of detail it is rendered with.
And that makes sense, since we cannot continue to make text smaller - because we won't be able to read it.

If you want more workspace, you use a bigger display, rather than buying a display with more pixels, shrinking text down to illegible (and wrong) sizes.

We aren't there yet, because our displays don't have enough resolution where non-integer scaling will produce sharp results - though Apple seems to disagree. So that means we need to stick to multiples of 96 DPI if we want this to be exact. Or close to it if we don't care about scale being accurate.

With legacy applications, Windows 8.1/10 switched over from attempting to scale things up, which broke lots of applications, to now rendering at 1x and then scaling that up as an image. So legacy applications shouldn't break any more, and they are displayed with the same resolution that they would if you stuck to a low DPI monitor instead. And as long as you stick to integer scales, you get nearest neighbor upscaling instead of filtered scaling. (filtering is necessary for non-integer scales)

So as long as you stick to integer scales, legacy applications shouldn't be a problem.

I'm not sure I fully understand your argument.

If I have a 24" 2560x1440 screen, and I set it at 2 feet away, how is that any different in terms of workspace if I get a 27" 2560x1440 screen at set it at 3 inches farther away? They should look effectively no different, and I have the same amount of workspace. Sure, if I say that text has to be displayed the same measured height of 1/4 inch, regardless of viewing distance, I need a physically larger screen to display the same amount, but if I have the smaller screen positioned closer to me, I don't need the text to be set to the same point size to still be effectively readable.
This breaks down when you start looking at 4K and 5K displays.
The 27" and 32" 4K monitors available right now make text look tiny. You will have to bring your monitor 70% closer to your eyes with a 27" 4K display compared to a 96 DPI monitor if you aren't using scaling - so you're now having to sit 19" from the screen instead of 3ft back.

I'm sure there will be people who say that working at 1x scale is absolutely fine on those screens, but they'll be in the minority.
 
So, you're basically saying that the only "correct" size for text is based on the point system, and therefore, if you're viewing content at slightly smaller than this "correct" size at 100% scaling (ignoring 4K and 5K displays for the moment), you're doing it wrong and should be telling your OS to scale so it reaches this "correct" size? And therefore the only way to increase usable workspace is to use a larger screen at this "correct" pixel density? That's how I understand your explanation.

1440p on a 24" screen at 100% scaling isn't unreadable, especially if it's positioned slightly closer to me than a 24" 1080p screen. I can have more things visible on my screen, and therefore, I consider that I have increased my workspace by moving from 24" 1440p to 24" 1080p, despite not having changed my screen size.
 
Like I said, equating the number of pixels your display has to workspace is old thinking based on everything being rendered at a fixed scale, and very soon that has to change.

This is essentially what it comes down to. You say it's "old thinking," although it is how things currently are. You can't escape it with the many legacy applications that a lot of us are forced to use. I do want that to change in the future too, but that is NOT how it is yet. So when you say it "should" be that way - maybe it should, but it isn't here yet. That is the key issue.

Some day it won't matter and we can crank the resolution up, scale things properly, and you will be correct that monitor size itself will equate to workspace. 40-43" of future "workspace" is enough for me, as long as everything displays as it does today at 1x 4k. I don't care if that eventually goes up to 5k, 8k, 16k... or whatever, as long as things display as you suggest. Problem is, that day is not today in any functional way, so it rarely gets discussed in that fashion.
 
You say its "old thinking" although it is how things are currently.
If you set the scale correctly based on your screen's DPI it works that way today.

A 27" 5K monitor should be using ~2.27x scale for everything to be displayed at the correct size. (assuming 1x is still 96 DPI)
Realistically, using 2x scale is going to work better on these displays for most people due to the differences between integer and non-integer scaling, even if that's going to display things a bit smaller than intended. (~13.3%)
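Here's that scale-factor math spelled out, as a quick sketch (27" 5K, 96 DPI base):

import math

panel_ppi = math.hypot(5120, 2880) / 27  # ~217.6 PPI for a 27" 5K
exact_scale = panel_ppi / 96             # ~2.27x to match 96 DPI sizing
effective_at_2x = panel_ppi / 2          # ~108.8 PPI if you settle for 2x

print(f"exact scale: {exact_scale:.2f}x")
print(f"at 2x you get {effective_at_2x:.1f} effective PPI, "
      f"about {100 * (effective_at_2x / 96 - 1):.0f}% denser than the 96 DPI baseline")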
 
I expect that we'll see more 5K monitors once DisplayPort 1.4 gets here and it can be done over a single cable.

That's just personal preference really. A 27" 5K monitor at 2x is exactly the same as a 1440p monitor at 1x, which is the standard 110 PPI used for most monitors today.
Personally I preferred it when monitors were 16:10 and 100 PPI rather than 16:9 and 110 PPI, so 29" would be ideal. 32" is too big for that resolution in my opinion. But I'd rather 32 than 27 I suppose.

Sorry - my mistake for not being more clear - I sort of assumed everyone would gather that it was my personal preference, as it was my comment.
It's all subjective, and to your point, I also would rather have a 30-32" than a 27" at 2x 5k.

The main reason 1440p at 27" doesn't work for me (native or 5k 2x) is that I use a standing desk and the monitor is wall mounted, as I like to have it a touch further away than most for my eyesight - it just works for me.
 
Just for comparison at the same ppi.
http://www.web-cyb.org/images/lcds/4k_vs_27in_vs_30in_2560_same-ppi.jpg

The 4k field would be around 40.8" in that schematic.
Nice real-estate for desktop/apps.

That kind of size is silly to me for 1st/3rd perspective games at a desk though imo, unless FoV was way more flexible to make the greater part of the monitor's extents extra FoV rather than just making the same game scene "JUMBO" and out of range, so to speak.

As far as movies go, 1080p Blu-rays are still superior to 4k streams even on 4k TVs, so the content situation for movies is questionable.

In regard to gaming, 4k will still be inferior going forward into DP 1.3 GPUs and DP 1.3 monitors once we get into HDR, since HDR will drop 4k back to 60hz again.


The graphics ceiling on games is really arbitrary and will go up with the release of new gpu generations.
The challenge for devs is to whittle things down to "fit" real time, not the other way around.. so it would be easy to blow the ceiling up.
Even now, extreme high rez screenshot forums use massive downsampling of 8k or 16k, rendering mods, etc to get extreme still shots.
So you can push the graphics way up to ultra or even "beyond" ultra. However, you don't play a screenshot.

4k will always be lower frame rate, which means it will essentially be lower hz, especially considering variable-hz monitors.
Whether variable hz or not, your frame-rate + hz (not just Hz) must be high to reduce blur appreciably, and to increase motion definition and motion articulation, particularly in 1st and 3rd person perspective games where you are moving your FoV constantly (which in relation to you is moving the entire game world around constantly with movement-keying and mouse looking).

100fps-hz/120fps-hz/144fps-hz:
~40/50/60% blur reduction (more of a "soften" blur, rather than the smearing blur of 60fps-hz)
5:3/2:1/2.4:1 increase in motion definition and path articulation (often unmentioned, huge difference)
g-sync rides the fps graph +/- without screen aberrations.

So, if you can dial in around 100fps average or so graphics-settings wise, you can range from 75fps-hz to high 130's or more dynamically on the wildly varying frame-rate graphs of games on a variable-hz monitor.

Currently a high-end GPU, two-card SLI setup can barely do 100fps-hz average on very high+ custom settings in most games at 2560x1440, and as I said, the graphics ceiling on games is really arbitrary and will go up with the release of new gpu generations, so that isn't likely to change. It will always be a balancing act or huge trade-off (depending on gpus and what settings you choose) between still graphics detail and motion excellence.

Even on a DP 1.3 GPU and a DP 1.3 4k monitor with g-sync/free-sync you would be limited to 120hz SDR and 60hz HDR, and you would always have a much more demanding resolution to render in real time for games, which wouldn't be able to feed the hz enough, even at high to very high+ settings in most games most likely.
5k would be limited to 60hz SDR, and be even more demanding.
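A rough bandwidth sanity check of those limits (DP 1.3/1.4 HBR3 carries about 25.9 Gbit/s of payload after 8b/10b encoding; this counts raw pixel data only, and blanking overhead eats some of the remaining margin):

HBR3_PAYLOAD_GBPS = 25.92  # DP 1.3/1.4, 4 lanes, after 8b/10b encoding

def pixel_rate_gbps(w, h, hz, bits_per_channel):
    return w * h * hz * bits_per_channel * 3 / 1e9  # RGB, no blanking

for name, w, h, hz, bpc in [
    ("4k 120hz 8-bit SDR",  3840, 2160, 120, 8),
    ("4k  60hz 10-bit HDR", 3840, 2160,  60, 10),
    ("4k 120hz 10-bit HDR", 3840, 2160, 120, 10),
    ("5k  60hz 8-bit SDR",  5120, 2880,  60, 8),
]:
    need = pixel_rate_gbps(w, h, hz, bpc)
    print(f"{name}: ~{need:.1f} Gbit/s -> {'fits' if need <= HBR3_PAYLOAD_GBPS else 'over budget'}")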
 
All us [h] nerds with too much disposable income want something different.

I'm primarily a gamer (though fewer and fewer games do it for me as I get older) and internet-browsing guy. Haven't watched a movie on my PC in years. I still BitTorrent the hell out of any movie or TV show I want to watch, but I do it on my MacBook and either watch it lying in bed or plug it into the living room TV. I can't "sit" in my PC chair and feel good about watching a 2-hour-plus film.

For gaming, I'm pretty happy with my 34" 21:9 X34, the big selling point being the size and the 100 hz refresh rate. Going from 60 hz to 100 hz was definitely noticeable the first time I played Far Cry 4 and WarThunder on the X34 vs my previous 60hz LG 34".

The 1080's coming out this month look like they have DP 1.4 outputs. Not sure what this spec is capable of but man would I love a "comfortably" sized (i.e. no scaling needed) 40-44" 4K that packed g-sync and 120 hz refresh rate.

2 x 1080s in SLI isn't priced out of the question for a lot of us the way 2 x Titan X's were and would probably drive 4k games at 100+ FPS, depending on the settings.

Another gamer dream would be a 34" 3440 x 1440 like we have now, but at 200 hz instead of the X34's 100 hz. Not sure at what point the average human eye craps out and adding any more FPS becomes unnoticeable. But sure would be cool to see an FPS game at 200 FPS on a big screen.
 
I do understand that a 27" 5k running 200% scaling 1400p would "look great". But it would not increase the desktop space and would run at a considerably lower refresh rate.
So on the best situation, 5k is a gimmick, or a slow blurry mess.

110PPI is the new standard. I do believe that desktop real estate and refresh rate are more relevant for users than insanely high PPI.

There is a gap in the PPI offerings: from the 112 PPI of a 39" 4k, it goes to the 117 PPI of 25" 1440p, and then 137 PPI for 32" 4k. We should have the best of both worlds and try to run a monitor at the highest possible refresh rate, with the largest desktop real estate, at the smallest pixel size that users can still resolve with healthy eyes.
Currently this technical trifecta resides in the 3440x1440 75-100Hz 34" monitors. They have 109 PPI. If they were 32", the resulting 116 PPI would be a little better than the 25" 1440p.

4k is really ergonomically challenging. Even at 122 PPI - the often hard-to-read text density of a 24" 1440p - a 4k display would have to be 36", which is quite big for something you must sit far enough from to avoid moving your head from side to side, yet with pixels too small to detect at that distance. And 4k is still a lower-refresh-rate solution.

In the end, I believe the best desktop solution in the near future will be high-refresh-rate 25" 1440p panels running at 120Hz. It is a modular solution, and it looks good in both PPI and refresh rate. In the long run, monitors will be replaced by VR goggles, which bypass the ergonomic limitations.
 
So in the best case, 5k is a gimmick; otherwise, it's a slow, blurry mess.
>200 PPI is far from a gimmick. It's a significant improvement in text and image quality.
I think the only people who would disagree either haven't spent any time with one of these displays, or they need to get their vision checked if they don't see the difference.

The ideal would be a display which does nearest neighbor scaling while supporting high refresh rates.
Then you could have a 5K panel which could display 5K60, 1440p120, or 960p180.
Or an 8K panel which could display 8K60, 4K120, 1440p180, or 1080p240
So long as nearest neighbor scaling is used, these lower resolutions actually look better than on a natively lower resolution panel.

110PPI is the new standard. I do believe that desktop real estate and refresh rate are more relevant for users than insanely high PPI.
Most people use notebooks instead of desktops now, so even moving to 1080p provides more workspace than most people have.
I think you're seriously over-estimating how many people need a larger workspace than 1440p provides.

In the end, I believe the best desktop solution in the near future will be high-refresh-rate 25" 1440p panels running at 120Hz. It is a modular solution, and it looks good in both PPI and refresh rate. In the long run, monitors will be replaced by VR goggles, which bypass the ergonomic limitations.
That's far too small for 1440p in my opinion. It's 23% smaller than intended, and 1.25x scaling has so many problems.
There's very little benefit to increasing pixel density from ~96 PPI to anything less than double that.
A 25" 1440p display would be fine if you're just using it for gaming, but uncomfortable to use if you're working on the monitor all day.

Another gamer dream would be a 34" 3440 x 1440 like we have now, but at 200 hz instead of the X34's 100 hz. Not sure at what point the average human eye craps out and adding any more FPS becomes unnoticeable. But sure would be cool to see an FPS game at 200 FPS on a big screen.
If you are using a full-persistence display you need at least 2000 FPS/Hz to eliminate motion blur. But this lets you use variable refresh rates to minimize latency and judder.

With low-persistence displays (ULMB/Blur Reduction) you can achieve 0.5ms persistence at any refresh rate. The problem is that this will significantly reduce your display's brightness, and your framerate needs to be constant or else you will get severe judder.
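The arithmetic connecting those two numbers, roughly (persistence on a full-persistence / sample-and-hold display is just the frame time):

def persistence_ms(fps_hz):
    # how long each frame stays on screen on a sample-and-hold display
    return 1000.0 / fps_hz

for hz in (60, 120, 144, 2000):
    print(f"{hz} fps/Hz -> {persistence_ms(hz):.2f} ms per frame")
# a strobed backlight (ULMB) holds the image for ~0.5 ms at any refresh rate,
# which is why ~2000 fps/Hz is the full-persistence equivalent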
 
I usually say fps-hz, but he is saying the same thing. High hz is practically meaningless without the frame rate to show newer frames, unless you are using backlight strobing or screen blanking to otherwise change the state of the displayed frame through the subsequent hz that the frame rate is frozen through.


This graphic just shows the rates in relation to each other, more like a ratio. We measure frames in milliseconds - 1000 milliseconds per second. I also made the little graph before 144hz, 165hz, and 200hz monitors were out.
[image: 60hz-120hz-30-15_onesecondframe.jpg]


<snip>

The graphics ceiling on games is really arbitrary and will go up with the release of new gpu generations.
The challenge for devs is to whittle things down to "fit" real time, not the other way around.. so it would be easy to blow the ceiling up.
Even now, extreme high rez screenshot forums use massive downsampling of 8k or 16k, rendering mods, etc to get extreme still shots.
So you can push the graphics way up to ultra or even "beyond" ultra. However, you don't play a screenshot.

4k will always be lower frame rate, which means it will essentially be lower hz, especially considering variable-hz monitors.
Whether variable hz or not, your frame-rate + hz (not just Hz) must be high to reduce blur appreciably, and to increase motion definition and motion articulation, particularly in 1st and 3rd person perspective games where you are moving your FoV constantly (which in relation to you is moving the entire game world around constantly with movement-keying and mouse looking).

100fps-hz/120fps-hz/144fps-hz:
~40/50/60% blur reduction (more of a "soften" blur, rather than the smearing blur of 60fps-hz)
5:3/2:1/2.4:1 increase in motion definition and path articulation (often unmentioned, huge difference)
g-sync rides the fps graph +/- without screen aberrations.

So, if you can dial in around 100fps average or so graphics-settings wise, you can range from 75fps-hz to high 130's or more dynamically on the wildly varying frame-rate graphs of games on a variable-hz monitor.

Currently a high-end GPU, two-card SLI setup can barely do 100fps-hz average on very high+ custom settings in most games at 2560x1440, and as I said, the graphics ceiling on games is really arbitrary and will go up with the release of new gpu generations, so that isn't likely to change. It will always be a balancing act or huge trade-off (depending on gpus and what settings you choose) between still graphics detail and motion excellence.

Even on a DP 1.3 GPU and a DP 1.3 4k monitor with g-sync/free-sync you would be limited to 120hz SDR and 60hz HDR, and you would always have a much more demanding resolution to render in real time for games, which wouldn't be able to feed the hz enough, even at high to very high+ settings in most games most likely.
5k would be limited to 60hz SDR, and be even more demanding.

I dial in my graphics settings so that I get a little more than 100 frames of screen updates per second on average (100fps-hz). This allows the wildly varying frame-rate graph of games (which looks a lot like a sound-file graph) to sync to my refresh rate cleanly. The resulting range of 75 to near 144 frames-per-second-and-hz gives around 30 to 60% blur reduction, and anywhere from negligible to double, or even 2.4:1, higher motion definition and motion articulation of the objects in the scene and of the entire viewport being moved around in relation to you when movement-keying and mouse-looking.

This is a good alternative to using ULMB mode, since I find that mode mutes the screen badly because the backlights aren't bright enough to compensate. Perhaps HDR monitors might someday be bright enough to use it outside of HDR mode. I'd still be missing out on the higher motion definition and motion articulation that the higher frame-rate and hz ranges g-sync allows in its +/- [framerate&HZ] seismograph, though, so I'd probably choose the same option of variable frame rate in most games.
 
I do understand that a 27" 5k running 200% scaling 1400p would "look great". But it would not increase the desktop space and would run at a considerably lower refresh rate.
So on the best situation, 5k is a gimmick, or a slow blurry mess.

110PPI is the new standard. I do believe that desktop real estate and refresh rate are more relevant for users than insanely high PPI.

Everything I use (phone, tablet, laptop) is 200dpi+ and looks awesome, except for my desktop monitors. IMO, it was mostly cost that limited desktop monitors to low dpi. Now that 4K/5K are becoming affordable, that will become the new standard.

If you need more real estate, use 150% scaling instead of 200%, or add more monitors. 100-110 dpi will never look good, and it's more noticeable now that everyone's cheap phone has a 300dpi screen.
 
If you need more real estate, use 150% scaling instead of 200%, or add more monitors. 100-110 dpi will never look good, and it's more noticeable now that everyone's cheap phone has a 300dpi screen.

150% scaling of a 5k resolution leaves you with roughly the effective density of a 4k panel at 100%, which is useless even at 32" and totally unreadable at 27". If you want high resolution, you need a large panel, but if you are using a large panel, you need to sit further away from the screen, which in turn makes it harder to use high DPI. I suffered zero ergonomic issues with a 27" 1440p, but a 39" 4k, which is about the same DPI, would either feel too awkward to use, moving my neck from side to side, or force me to sit far enough back that it becomes too small to read. The limit lies somewhere between the 118 DPI of 25" 1440p and the 122 DPI of 24" 1440p, and even those are only usable at native resolution because they are small enough to allow one to sit closer.

I will repeat to make it clear: heavy use of 200% scaling / super-resolution rendering will provide improved image quality, but that quality is a gimmick compared to the benefits of higher refresh rates and bigger working space associated with running the panel at its native resolution. The nirvana of monitors right now is the 34" 100Hz 3440x1440. Anyone trying to argue that other size/resolution combinations on offer are better will be hard pressed to prove their point: either there is less working space or more motion blur.
 
Everything I use (phone, tablet, laptop) is 200dpi+ and looks awesome, except for my desktop monitors. IMO, it was mostly cost that limited desktop monitors to low dpi. Now that 4K/5K are becoming affordable, that will become the new standard.

If you need more real estate, use 150% scaling instead of 200%, or add more monitors. 100-110 dpi will never look good, and it's more noticeable now that everyone's cheap phone has a 300dpi screen.
That would be fine in a perfect world where all applications respected the global scaling, but in reality not all do. The point is you're still giving up potential screen real estate by making things bigger. You may as well just buy a 4K screen if you're going to use 150% scaling on a 5K screen.

100-110 DPI will never look good? The reason DPI is so high on phones and the like is that the screens are small and people look at them close to their eyes. In desktop use you're most likely going to be sitting at least 2 feet away. At that distance, 110 DPI is just about the right density for the angular resolution of the average person's vision before individual pixels can be made out.
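For what it's worth, here's the usual back-of-the-envelope acuity math (assumes 20/20 vision resolves roughly 1 arcminute, i.e. ~60 pixels per degree; the distances are just examples, and where the cutoff lands depends heavily on how far back you actually sit):

import math

def ppi_at_acuity_limit(distance_in, pixels_per_degree=60):
    # PPI at which individual pixels shrink below ~1 arcminute at this distance
    inches_per_degree = 2 * distance_in * math.tan(math.radians(0.5))
    return pixels_per_degree / inches_per_degree

for d in (24, 30, 36):  # viewing distance in inches
    print(f'at {d}" away, pixels blend together above ~{ppi_at_acuity_limit(d):.0f} PPI')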
 