Your thoughts on 4K vs. 1440p/1600p

gan7114

So we have all this excitement about 4K monitors being on the way. Manufacturing aside, the price of such monitors is going to be very high, especially at first, but even for some years after being available. Some seem to think we'll be seeing such monitors show up next year. I highly doubt this. We've had prototypes and CES demos, yes, but what manufacturers say on the demo floor vs. what actually ships (or doesn't) is another story. I don't expect we'll be seeing any 4K panels introduced to the 21.5"-24" range for at least 5 years, most likely more. Seeing 27" and 30" panels at 4K will most likely happen within 5 years.

Why? The first reason is that, despite the manufacturing tech existing to create 4K and high-PPI panels, the market for such monitors just isn't there in the 21.5"-24" range. It comes down to consumer type: this size range is typically bought by non-professionals, businesses, and gamers. All three groups generally look for the best price-to-performance ratio, which usually means cheaper monitors (with specs and performance appropriate for the price). This is the opposite of the premium price-to-performance ratio sought by professionals, which the 27" and 30" monitor market caters to.

If we look at 1080p/1200p as an example and how long that resolution has reigned in the 21.5"-24" category, it's going to be a good long while before those monitors receive the 4K treatment. Furthermore, manufacturers like LG and Samsung need to earn back their R&D and plant-development costs, which means they need early adopters (i.e. people whose pockets are well-lined). They are certainly not going to introduce 4K panels to the 21.5"-24" market first, despite its much larger size relative to the 27"-30" market, because the cost recuperation and profits are not there. As such, we can expect the premium 27" and 30" panels to be first in line for a 4K makeover, despite their smaller market relative to the 21.5"-24" market.

But surely, 4K and high PPI are all the rage nowadays. I think 1080p/1200p will soon have outstayed its welcome. Average consumers, despite not being able to afford monitors with 4K panels, are going to start demanding higher PPI on their desktop monitors. And who could blame them? Relatively speaking, a 23" or 24" monitor at ~95ppi seems lacking these days.

Caveat: 21.5" panels have 102ppi, which is the 2nd highest PPI available among 21.5"-30" panels. For the sake of this post, I will treat 21.5" as the same as 23" and 24" panels, as 21.5" monitors are largely marketed to the same consumer type as 23" and 24" monitors are, as opposed to 27" and 30" monitors.

Of course, the 'retina' of a screen is all relative to the distance a person is positioned from their panel. Mobile and tablet devices necessarily require 300+ppi due to the eyes being very close to the screen. For laptops, 200+ppi seems to be an adequate amount (as judged by Apple, we'll see). As a screen is positioned further and further away, the need for high PPI becomes less and less due to the way our eyes resolve images. A 55" 4K TV would have just 80ppi, but it will sure look incredible, right? That's because you're sitting 10ft or 3m away. An 85" 4K TV will have merely 50ppi, but again distance will compensate for that.

This brings us back to 21.5"-24" monitors and a simple question: "Is ~95ppi acceptable at this range?" I think the answer, increasingly, is no. However, at 4K, you have to ask yourself whether you'd really need 205ppi at 21.5", 192ppi at 23", or 188ppi at 24". Considering most people sit anywhere from 2ft to 2.5ft (~0.6-0.75m) from their monitor, it seems like overkill. So what would be the solution?

Personally, I believe we will see a phasing out of 1080p/1200p in the 21.5"-24" monitor market within the next 5 years, and an introduction of 1440p/1600p to this size range within 2-3 years. It's a nice trade-off compared to 4K, which will most assuredly be available on 27" and 30" monitors first (for reference, a 4K 27" panel will have 163ppi; a 4K 30" panel will have 151ppi). With 1440p/1600p at the smaller sizes, 21.5" will sport 137ppi, 23" will have 127ppi, and 24" will come in at 125ppi. Even at the lower end, it's a sizable ~30ppi bump for common-use monitors, especially when professionals currently top out at 108ppi, and that's on 27" monitors.
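For anyone who wants to check or extend these figures, here's a minimal sketch of the pixel-density math behind them. It assumes 3840x2160 (16:9) and 3840x2400 (16:10) for "4K" and treats the marketing diagonals as exact, which is where the one-pixel rounding differences come from.

```python
import math

def ppi(h_pixels, v_pixels, diagonal_inches):
    """Pixels per inch given a panel's resolution and diagonal size."""
    return math.hypot(h_pixels, v_pixels) / diagonal_inches

# 1440p (2560x1440, 16:9) and 1600p (2560x1600, 16:10) at the sizes above
print(round(ppi(2560, 1440, 21.5)))  # ~137
print(round(ppi(2560, 1440, 23)))    # ~128 (listed above as 127)
print(round(ppi(2560, 1600, 24)))    # ~126 (listed above as 125)

# Today's 27" 1440p panels, and 4K at 27"/30"
print(round(ppi(2560, 1440, 27)))    # ~109
print(round(ppi(3840, 2160, 27)))    # ~163
print(round(ppi(3840, 2400, 30)))    # ~151
```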

So what say you all? Do you think we'll see the implementation of 1440p/1600p to the 21.5"-24" range before 4K? Or will the manufacturers just jump right into 4K? I'm interested to hear what you think.


Notes: All PPI calculations are rounded to the nearest pixel. 29" panels were not considered because their market is extremely niche; the high-end 24" market was excluded for the same reason.
 
This is an interesting read: http://www.clarkvision.com/articles/eye-resolution.html . For a 24" screen (which is almost exactly 20"x13.3") at normal viewing distance, 4K is not nearly enough for people with normal vision. 8K would be more like it.

Visual Acuity and Resolving Detail on Prints
How many pixels are needed to match the resolution of the human eye? Each pixel must appear no larger than 0.3 arc-minute. Consider a 20 x 13.3-inch print viewed at 20 inches. The Print subtends an angle of 53 x 35.3 degrees, thus requiring 53*60/.3 = 10600 x 35*60/.3 = 7000 pixels, for a total of ~74 megapixels to show detail at the limits of human visual acuity.
The 10600 pixels over 20 inches corresponds to 530 pixels per inch, which would indeed appear very sharp. Note in a recent printer test I showed a 600 ppi print had more detail than a 300 ppi print on an HP1220C printer (1200x2400 print dots). I've conducted some blind tests where a viewer had to sort 4 photos (150, 300, 600 and 600 ppi prints). The two 600 ppi were printed at 1200x1200 and 1200x2400 dpi. So far all have gotten the correct order of highest to lowest ppi (includes people up to age 50). See: http://www.clarkvision.com/articles/printer-ppi
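To make the quoted arithmetic easier to follow, here is the same calculation spelled out, simply reproducing the article's own figures (53 x 35.3 degrees at 0.3 arc-minute per pixel), nothing new added:

```python
# The article's criterion: each pixel no larger than 0.3 arc-minute,
# over a print subtending 53 x 35.3 degrees when viewed from 20 inches.
px_wide = 53 * 60 / 0.3        # = 10,600 pixels
px_tall = 35.3 * 60 / 0.3      # = 7,060 pixels (rounded to 7,000 in the article)
print(px_wide * px_tall / 1e6) # ~74.8 megapixels
```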
 
The upper limit the human eye can possibly resolve at 12" is around 480ppi. That's with better-than-20/20 vision; literally, your eyes would have to be perfect to be that sensitive. Drop that to 240ppi for a monitor positioned 2ft from you. So you'd technically be correct: 8K monitors in the 21.5" to 30" range would all land at 300+ppi, within the upper limits of what people with normal vision can resolve. As such:

21.5" = 409.84 ppi
23" = 383.11 ppi
24" = 377.36 ppi
27" = 326.36 ppi
30" = 301.89 ppi

There is, however, the matter of feasibility, power consumption, and whether it would be practical to drive that many pixels to achieve those densities at 8K:

16:9 panels = 33,177,600 pixels
16:10 panels = 36,864,000 pixels

Considering that at 4K, we're (only?) at 8 to 9 million pixels, we've got a long way to go.
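To put some numbers behind the 480ppi/240ppi claim and the 8K figures, here's a rough sketch that works out the angle one pixel subtends at a given viewing distance, using the small-angle approximation. Interestingly, 480ppi at 12" and 240ppi at 2ft both work out to roughly 0.6 arc-minute per pixel, a bit coarser than the 0.3 arc-minute criterion in the Clark article.

```python
import math

def arcmin_per_pixel(ppi, viewing_distance_in):
    """Angle one pixel subtends at the eye, in arc-minutes (small-angle approx.)."""
    return math.degrees(1 / (ppi * viewing_distance_in)) * 60

print(arcmin_per_pixel(480, 12))     # ~0.60 arcmin (the "perfect vision at 12 inches" figure)
print(arcmin_per_pixel(240, 24))     # ~0.60 arcmin (same criterion moved out to 2 ft)
print(arcmin_per_pixel(95, 24))      # ~1.51 arcmin (today's ~95ppi panel at 2 ft)
print(arcmin_per_pixel(409.84, 24))  # ~0.35 arcmin (8K at 21.5", viewed from 2 ft)

# The raw pixel counts quoted above
print(7680 * 4320)  # 33,177,600 (16:9 8K)
print(7680 * 4800)  # 36,864,000 (16:10 8K)
```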
 
I disagree completely about 21-inch 2560 monitors. As you noted in your original post, high PPI is important for handheld devices because you hold them inches from your face and they're small screens. That is far different from a desktop screen - getting 250 PPI on a desktop monitor is neither needed nor feasible. I won't go into this further since you covered it in the OP.

The demand for mobile devices is for higher PPI, for the reasons outlined above. The demand for HDTVs and desktop screens is the exact opposite: with non-mobile devices, the demand is for larger screens, period. Desktop monitor sizes have continually grown over the past two decades and that will not change, and attempting to get 300PPI on a desktop monitor is ridiculous and not feasible. There is also the issue of DPI scaling not working well; those resolutions on a small screen would not be great in Windows, since not all applications support super-high DPI. That means 50% or more of applications would be broken at the DPI scaling required to make 4K on a 21-inch workable.

The market will be 30-inch or larger 4K monitors; indeed, if you look at the product pipeline, nobody is making small 4K PC monitors. Getting 250 PPI on a 27-inch requires 8K resolution. That will not happen. And I'm pretty confident that people don't want 21-inch monitors for a non-mobile device; size is the first consideration most purchasers take into account.

Again. Mobile device demand = smaller, thinner, higher PPI. This makes sense because you hold these devices 2-3 inches from the face.

HDTV and desktop monitor demand = larger screens, period. PPI will increase on desktop screens, but it will be in baby steps. Again, PPI matters much less because you view these devices from several feet away.

PPI alone is not the end-all, be-all metric for screens. It is one consideration, and it is the most important one for mobile devices, but it is not the be-all metric for desktops or HDTVs. Maybe I could be wrong, but I just don't see it happening... we'll see, though. I'm just happy that 1080p will finally be phased out in the coming years.
 
First person that has a 4K monitor that is 30" or under has my $$$. And imho it isn't a waste at all. The 108ppi ACD27" on my desk does not have the same clarity from where I sit a few feet away that I get from an iPhone 5.
 
First person that has a 4K monitor that is 30" or under has my $$$. And imho it isn't a waste at all. The 108ppi ACD27" on my desk does not have the same clarity from where I sit a few feet away that I get from an iPhone 5.

Agreed, I can't wait for a 4k 30 inch monitor to reach a reasonable price. I definitely would love to grab one.
 
The biggest improvement high resolution displays made to tablets and mobile phones was that you needed to zoom less on websites because you were able to read even very small text thanks to tiny pixels offering more resolution at small font sizes.

I just hope that we will start seeing a move away from 16:9 formats back to 16:10, just at higher resolution.
 
140-150 ppi on a desktop display would be great. No way you'll see any pixels if you're using it at a reasonable distance...

I'm hoping for 30" 4k to become affordable in the next couple of years.
 
Agreed, I can't wait for a 4k 30 inch monitor to reach a reasonable price. I definitely would love to grab one.

Good luck with that. 1440p/1600p monitors have been out for years and years and HP, Dell, ASUS, etc (excluding the Ebay Koreans) still feel the need to charge $600 to $1000+ for them. Don't expect a 4k monitor to ever have a "reasonable price" in your lifetime.
 
Good luck with that. 1440p/1600p monitors have been out for years and years and HP, Dell, ASUS, etc (excluding the Ebay Koreans) still feel the need to charge $600 to $1000+ for them. Don't expect a 4k monitor to ever have a "reasonable price" in your lifetime.

It won't be overnight, and I didn't say it would be. I'm using a 1600p monitor, so I'm aware of the cost - but keep in mind that high-end technology depreciates in value rapidly once it becomes more widely adopted. 1600p/1440p are low in demand currently, but I'd expect prices to drop over the next year or so as more people pick those monitors up.

17-inch CRTs cost over $600 in 1995; five years later they had dropped rapidly in price. Prices are not static -- higher-than-1080p resolutions will be in greater demand this year. I'm not expecting 4K to be affordable this year; in fact, I'm certain it will be really expensive... but eventually it will be within reach.
 
One thing that is hugely overlooked, and that really dampens my enthusiasm for 4K monitors, is the refresh rate. We will be forced back to 60Hz awfulness for years. I am really sensitive to refresh rate, and 60Hz is the suck. Refresh rate and motion clarity are just as important as resolution IMO, unless all you do is CAD or look at still images or something.

Undoubtedly 4K panels will all be IPS or variants, which means no 120Hz; heck, there isn't even a display connection that could run 4K at 120Hz. We would need DP 1.5 or something, and dual-input ghetto rigging is never an optimal solution.
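For what it's worth, a rough bandwidth sketch backs this up. The figures below count uncompressed pixel data only and ignore blanking overhead, so the real link requirement is even higher; DisplayPort 1.2's usable video bandwidth is roughly 17.3 Gbit/s after 8b/10b encoding.

```python
# Back-of-the-envelope link bandwidth for uncompressed video (pixel data only).
def data_rate_gbps(width, height, refresh_hz, bits_per_pixel=24):
    return width * height * refresh_hz * bits_per_pixel / 1e9

print(data_rate_gbps(2560, 1600, 60))   # ~5.9 Gbit/s  -> fits dual-link DVI or DP 1.2 today
print(data_rate_gbps(3840, 2160, 60))   # ~11.9 Gbit/s -> fits within DP 1.2 (~17.3 Gbit/s usable)
print(data_rate_gbps(3840, 2160, 120))  # ~23.9 Gbit/s -> exceeds DP 1.2, hence no single-cable 4K@120Hz
```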
 
I agree that, for the most part, consumer 4k is just silly. In my opinion it's little more than an excuse for manufacturers to "manufacture" a false need for even more resolution.

Everyone is going to start thinking that if you have a TV over 60" you need 4K or else HD starts to look soft.

BS!

I have a 105" screen, being driven by a 720p video projector.

That's right, 720p. Now, to be fair, it's a $10,000 projector that's generally considered to be the best 720 projector ever made (the lens alone is $3,000 and no I'm not rich, but the guy who sold it to me used was) but it's a testimony to the fallacy of the resolution game.

When people come over to watch a movie the #1 thing I hear is "that's only 720??" Everyone is very surprised.

It looks sharp. Is it as sharp as 1080 on a 40" monitor? Of course not. Then again, when I look at 1080 on smaller screens I think it looks TOO sharp, unnaturally so.

But I'm a movie fan and I want my picture to look like a movie. And even at "only" 720, on my 105" screen it essentially has the same sharpness as a movie theater.

Let's face it, when you see a movie in the theater it DOESN'T look like a 50-foot flat panel. It doesn't have that razor sharpness we get at home. But that's OK with me; I want movies at home to look like movies, and since 1080p on anything under 100" already looks super sharp, I just don't see any need for 4K.

I mean sure, if you want computer monitor sharpness on a massive, 100" scale, then 4k is the only way to fly. But if you just want your home theater to look just as good as a movie theater, trust me - 1080p looks perfect.

Try it out before you decide! Just don't believe the hype. The scariest thing is that they are actually trying to get people to buy 4k screens when there is no 4k content. Talk about jumping the gun! To me that's proof they are just looking to grab your money.
 
Apples and oranges. There is a huge market for 4K computer displays, which can be driven at that resolution just fine on any modern GPU. Watching TV and movies is completely different: unless you are upscaling 1080p, there really isn't any way to get 4K content yet. Redray is a neat idea, but it will take forever to get off the ground.

720p may be "ok" for movies but is abysmal as a computer image. 4k will be an instant boon as a computer monitor, it will take some time for it to be worth it for tv/movies.
 
What exactly is the "huge" market for 4k computer displays? I work as a graphics professional in TV & movies and even in my business I can't think of too many situations in which a 4k display would be necessary or even helpful.

CGI modeling - and only modeling - could benefit, as being able to resolve small details while building a model could be of use, but as already discussed in this thread, unless you've got perfect vision you're unlikely to truly see that much extra resolution.

A frame of 35mm film is somewhere around 4k of resolution, so in theory you could have a monitor that would show you every last drop of information, but film is being used less and less; however, before very long, movies will be routinely shot at 4k, so having monitors that can offer a 1:1 pixel display of all the data you've acquired will certainly be considered a requirement in some post production situations.

I could see medical displays benefiting from more resolution...

But we're still talking about a small slice of public use, and in industries that were always more or less destined to have 4k displays (I mean if you're shooting a movie at 4k it's essentially a no-brainer that monitors capable of displaying all the data would eventually be introduced).

In any case, I think the point of this thread is CONSUMER use of 4K displays, and other than as a frivolous toy for the rich, I still don't see a need or use for 4K.

On the other hand, what an industry eager to sell 4k displays can CONVINCE people of is a different matter ;-)
 
What exactly is the "huge" market for 4k computer displays? I work as a graphics professional in TV & movies and even in my business I can't think of too many situations in which a 4k display would be necessary or even helpful.

CGI modeling - and only modeling - could benefit, as being able to resolve small details while building a model could be of use, but as already discussed in this thread, unless you've got perfect vision you're unlikely to truly see that much extra resolution.

A frame of 35mm film is somewhere around 4k of resolution, so in theory you could have a monitor that would show you every last drop of information, but film is being used less and less; however, before very long, movies will be routinely shot at 4k, so having monitors that can offer a 1:1 pixel display of all the data you've acquired will certainly be considered a requirement in some post production situations.

I could see medical displays benefiting from more resolution...

But we're still talking about a small slice of public use, and in industries that were always more or less destined to have 4k displays (I mean if you're shooting a movie at 4k it's essentially a no-brainer that monitors capable of displaying all the data would eventually be introduced).

In any case, I think the point of this thread is CONSUMER use of 4K displays, and other than as a frivolous toy for the rich, I still don't see a need or use for 4K.

On the other hand, what an industry eager to sell 4k displays can CONVINCE people of is a different matter ;-)

Once again, I am not talking about your home TV set for watching movies. I agree that until we have accessible 4K content it is rather pointless there. But we are really talking about 4K finally raising the bar from 1440p and 1600p, which have been the computer monitor ceiling for many, many years. Besides the medical imaging you mentioned, graphics professionals, CAD users, and gamers could also benefit greatly from 4K. And let's face it, the vast majority of powerful home computers are for gaming. :)

The thing that will really hurt 4K uptake for computer monitors is if they arrive with the dreaded combination of 60Hz, input lag, and slow pixel response. You will also really need two or more GPUs to run the display properly IMO.
 
One thing that is hugely overlooked, and that really dampens my enthusiasm for 4K monitors, is the refresh rate. We will be forced back to 60Hz awfulness for years. I am really sensitive to refresh rate, and 60Hz is the suck. Refresh rate and motion clarity are just as important as resolution IMO, unless all you do is CAD or look at still images or something.

Undoubtedly 4K panels will all be IPS or variants, which means no 120Hz; heck, there isn't even a display connection that could run 4K at 120Hz. We would need DP 1.5 or something, and dual-input ghetto rigging is never an optimal solution.

This is pretty much how I feel. But if they come out with 120hz 4k I would get it. Even if I needed 2 cards and 4 dual link DVI connections I would do it, but I would prefer if there was just 1 super cable that could handle it. I guess we'll have to wait for new cards and new connections before 4k 120hz is practical.
 
2560x1440 is good enough for me. Anything smaller and I'd need a magnifying glass :)
 
That's interesting. Here's an article on why 4K for the home is completely ridiculous.

http://reviews.cnet.com/8301-33199_7-57366319-221/why-4k-tvs-are-stupid/

The "completely ridiculous" or not depends on what you assume is the resolving ability of the human eye. The theoretical ideal is 20/8 according to Wikipedia, and apparently some people are getting quite close to that limit:

http://www.psychologytoday.com/blog/the-good-life/201108/example-super-health-beyond-20-20-vision said:
Imagine my pleasant surprise the other day when reading a sports magazine that recounted studies of the visual acuity of elite baseball and softball players, the vast majority of whom have "better" than 20-20 vision, with some testing out at 20-8.9, as close to the absolute limit of 20-8 as is possible (e.g., Laby et al., 1996). To say that a good batter "sees" the ball well is apparently more than a metaphor**. Super vision exists and has obvious benefits, at least for baseball and softball players.

20/20 = 1 arc-minute resolving ability.
20/8 means 0.4 arc-minute. This is where the article I linked originally got the 0.3 arc-minute figure from - slightly finer than the limit of human eye resolution.
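A small sketch of that conversion, plus what it implies for pixel density at a desktop viewing distance (the 27-inch distance from my anecdote below). The Snellen-to-arc-minute mapping is the standard one, and the small-angle approximation is assumed:

```python
import math

def snellen_to_arcmin(numerator, denominator):
    """20/20 resolves ~1 arc-minute; 20/8 resolves 8/20 = 0.4 arc-minute."""
    return denominator / numerator

def resolvable_ppi(arcmin, viewing_distance_in):
    """PPI at which one pixel subtends the given angle at the given distance."""
    return 1 / (viewing_distance_in * math.radians(arcmin / 60))

print(snellen_to_arcmin(20, 20))        # 1.0 arc-minute
print(snellen_to_arcmin(20, 8))         # 0.4 arc-minute
print(round(resolvable_ppi(1.0, 27)))   # ~127 ppi: what 20/20 vision can resolve at 27 inches
print(round(resolvable_ppi(0.4, 27)))   # ~318 ppi: what near-perfect (20/8) vision can resolve
```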

The article you linked makes reasonable points as far as TVs go, but since the technology is there, why not just make something beyond the limit of our resolving ability and stop worrying about that aspect? Economies of scale will take care of the slightly higher cost, plus the manufacturers need to create a new market to move units - the 1080p market is basically saturated from what I can tell, hence the "4K explosion" at CES. And as soon as that is done with, they can concentrate on smoother motion, better colors, &c.

Personal anecdote:
I am looking at a 24" screen from 27" distance (just measured) and I am seeing pixelated text and "staircase" slanted lines as soon as I start paying attention -- e.g. the slash between 1440p/1600p in the thread title. I would say I have average vision. I just performed a non-scientific experiment - took my Nexus 4 (~320 ppi screen), zoomed into a black-on-white text till it was physically the same size as the text in this reply form, and held it at the same distance as my main screen. It's night and day difference. So clearly, 95ppi is not enough for me, and I would jump at the opportunity to upgrade.
 
Nearly every time someone mentions "what we have is enough and will never need better" in regards to technology, they are incorrect.

The same thing was said about 1080p. DVDs. Etc.

"640k RAM is more than anyone will ever need".

4k will happen - it won't be this year, but it will happen.

Progress in meaningful directions would be preferable. I don't expect you to understand that.
 
Nearly every time someone mentions "what we have is enough and will never need better" in regards to technology, they are incorrect.

The same thing was said about 1080p. DVDs. Etc.

"640k RAM is more than anyone will ever need".

4k will happen - it won't be this year, but it will happen.

That isn't what the article said at all. It's talking about how it's physically impossible to actually notice resolution beyond 1080p on a standard-sized (30-55 inch) television at a normal viewing distance. Many people claim it's almost impossible for most viewers to discern between 1080p and 720p until you hit the very large screen sizes, and that it's mostly psychological otherwise.

4K makes plenty of sense for movie theatres, but it doesn't for a home TV, and certainly not for a PC monitor.

Here's an even more recent article on the subject by the same author which is now an OLED vs 4k argument.

http://reviews.cnet.com/8301-33199_7-57514352-221/4k-tv-vs-oled-tv/

But you don't have to take my word for it. I asked the top TV reviewers from around the Web. Here's what they had to say:

"OLED by a country mile, it's not even close. By my guess the only 2D picture quality improvement 4K will bring is a slight increase in sharpness, and then it'll only be visible with the very best program material (native 4K content being ideal) at a screen size/seating distance ratio that's basically theatrical in scale. OLED should improve picture quality across the board, offering absolute blacks for effectively infinite contrast. I expect more consistent light output in bright scenes than plasma is capable of, without the viewing angle or uniformity issues of LED/LCD.
 
High refresh rates spoil you. I can see myself moving to a 60Hz display, be it affordable 4K or 6K, only if my current Catleap dies and there is no substitute.
 
That isn't what the article said at all. It's talking about how it's physically impossible to actually notice resolution beyond 1080p on a standard-sized (30-55 inch) television at a normal viewing distance. Many people claim it's almost impossible for most viewers to discern between 1080p and 720p until you hit the very large screen sizes, and that it's mostly psychological otherwise.

4K makes plenty of sense for movie theatres, but it doesn't for a home TV, and certainly not for a PC monitor.

Here's an even more recent article on the subject by the same author which is now an OLED vs 4k argument.

http://reviews.cnet.com/8301-33199_7-57514352-221/4k-tv-vs-oled-tv/

You have got to be kidding me. If anything would benefit from 4K, it would be a computer monitor. 1080p looks like utter dog shit as a computer image from a typical two feet away. I seriously think some people out there are blind.

And all of these silly articles from CNET and whatnot are referring to 4K TV sets only, which doesn't apply to this topic. You sit very close to a computer monitor, not so much with a TV.
 
What exactly is the "huge" market for 4k computer displays? I work as a graphics professional in TV & movies and even in my business I can't think of too many situations in which a 4k display would be necessary or even helpful.

Artists. Those working in print media. Architects. Engineers. Stock traders - more resolution means fewer screens and lower space costs. Etc.

Basically, plenty of people. There just needs to be a business case for it.
 
By the time 4k LCDs are affordable they will be pretty much irrelevant.

Are you guys really willing to drop $2,500 or something on a 4k 30-40" LCD display while there's basically no content available at that resolution even in 4-5 years?

Look how old 1080p HD is and how poorly HD channels have been adopted on national TV. Everything but the sports channels has horrible-quality HD; it's not even close to 1080p. You can forget about streaming 4K online too, unless the internet drastically changes. A lot of areas in the world have restricted internet access because of insane bandwidth caps, and 4K streaming would eat through people's monthly caps in no time at all. That's only half the problem, though, because the speeds required to properly stream 4K would be massive and out of reach for almost everyone.
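A quick, hedged illustration of that last point. The 25 Mbit/s stream bitrate and the 250 GB monthly cap below are purely assumptions for the sake of the math; nobody is actually streaming 4K to homes yet, so real numbers could differ a lot:

```python
# How quickly an assumed 4K stream would exhaust an assumed monthly cap.
def gb_per_hour(mbit_per_s):
    return mbit_per_s * 3600 / 8 / 1000  # Mbit/s -> gigabytes per hour

assumed_4k_bitrate_mbps = 25   # assumption, not any real service's figure
assumed_monthly_cap_gb = 250   # assumption, a plausible-sounding cap

hourly = gb_per_hour(assumed_4k_bitrate_mbps)
print(hourly)                            # ~11.25 GB per hour
print(assumed_monthly_cap_gb / hourly)   # ~22 hours of viewing to hit the cap
```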

It will be like 15-20 years minimum and by then something will surely change technologically to make a 4k LCD obsolete. I'm with the reviewer in that I hope oleds take off.

I'd be a million times more happy with a 2560x1440 oled display somewhere in the 27-30"ish range as a computer monitor. I'd also rather see them spend money on making holographic stuff, not improve 2D if it came down to making choices.
 
By the time 4k LCDs are affordable they will be pretty much irrelevant.

Are you guys really willing to drop $2,500 or something on a 4k 30-40" LCD display while there's basically no content available at that resolution even in 4-5 years?

Look how old 1080p HD is and how poorly HD channels have been adopted on national TV. Everything but the sports channels has horrible-quality HD; it's not even close to 1080p. You can forget about streaming 4K online too, unless the internet drastically changes. A lot of areas in the world have restricted internet access because of insane bandwidth caps, and 4K streaming would eat through people's monthly caps in no time at all. That's only half the problem, though, because the speeds required to properly stream 4K would be massive and out of reach for almost everyone.


It will be like 15-20 years minimum and by then something will surely change technologically to make a 4k LCD obsolete. I'm with the reviewer in that I hope oleds take off.

I'd be a million times more happy with a 2560x1440 oled display somewhere in the 27-30"ish range as a computer monitor. I'd also rather see them spend money on making holographic stuff, not improve 2D if it came down to making choices.

The OP is talking about (computer) monitors, not TVs. Your points are irrelevant.
 
The OP is talking about (computer) monitors, not TVs. Your points are irrelevant.

I'm talking about monitors too. Heck I even mentioned it in my last sentence. Nice try though.

I only brought up the HDTV thing to show how horrible 1080p support still is after all these years. 4K is a much different animal, too, because it's far more bandwidth-intensive than the SD-to-1080p step was.

Also people use their monitors to stream Netflix and even cable TV stations, especially as monitors grow in size.

I program for a living and value screen real estate greatly. I would still rather have an oled 1440p monitor over some 4k LCD today or even 5 years from now.
 
I'm talking about monitors too. Heck I even mentioned it in my last sentence. Nice try though.

I only brought up the HDTV thing to show how horrible 1080p support still is after all these years. 4K is a much different animal, too, because it's far more bandwidth-intensive than the SD-to-1080p step was.

Also people use their monitors to stream Netflix and even cable TV stations, especially as monitors grow in size.

I program for a living and value screen real estate greatly. I would still rather have an oled 1440p monitor over some 4k LCD today or even 5 years from now.

Nice try? OK...

You're talking about monitors too? Oh you mean that one little phrase at the end of your massive off-topic RANT?

Right. Well, good for you I guess.
 
I'd lump all that under the "content" catch-all, which is still a major sticking point for 4K in general IMO, but that's me. That said...a 30" 2560x1600 monitor at typical viewing distance--let's say about one meter--is readable for me as it is, minus the usual complications such as font size. A 4K monitor at the same size, at the same distance, would be substantially harder to deal with without going all old-person and jacking up font sizes and so on; a 24" 1600p monitor might be an interesting tech hand-me-down, however...
 
Nice try? OK...

You're talking about monitors too? Oh you mean that one little phrase at the end of your massive off-topic RANT?

Right. Well, good for you I guess.

What makes it off topic? The title says "your thoughts on 4k vs 1440p/1600p". My thoughts are I think a 4k LCD is worse than an oled 1440p display today and even in the future.

Believe it or not a lot of people use their monitors for more than browsing web pages.

I seriously doubt we'll see a 120hz 4k LCD that's affordable any time soon and for gaming I'd way prefer a really tight/sharp 1440p image on an oled with absolutely no input delay.
 