Has the gaming monitor market become stagnant?

I have no confidence in TV manufacturers to put out displays that don't have tons of input lag. They can't even release OLED monitors without embarrassing levels of input lag. It's a joke.

Some 2017 TVs have 10ms of lag. That was tested using the Leo Bodnar device, though, so who knows - maybe it's actually more, or it could be less. If 10ms is the true value then it really isn't that bad. Of course, a lot of high-end models STILL have 20+ ms of lag though...
 
You have things backwards.
The problem is not that DPI scaling is bad, it's that screens are too big, and resolution is too low for the sizes they're being sold at.

The main issue with DPI scaling right now is that 4K is not enough resolution.
Using integer scaling (2x), 4K only leaves you with a 1080p workspace, which means a 4K monitor should be about 23" in size.
At 27" or 32", a 4K panel requires 1.75x or 1.50x scaling, which means legacy applications are blurred.
Technically it should be 1.70x / 1.43x, but custom DPI scaling does not work well.
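
If you want to sanity-check those numbers, here's a minimal Python sketch (my own illustration, nothing official): PPI = diagonal pixels / diagonal inches, and the scale factor is just PPI / 96.

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch from a panel's resolution and diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

# Scale factor relative to the 96 PPI Windows baseline
for size in (23, 27, 32):
    density = ppi(3840, 2160, size)
    print(f'4K @ {size}": {density:.0f} PPI = {density / 96:.2f}x of 96 PPI')

# 4K @ 23": 192 PPI = 2.00x of 96 PPI
# 4K @ 27": 163 PPI = 1.70x of 96 PPI
# 4K @ 32": 138 PPI = 1.43x of 96 PPI
```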

Integer scaling is required to keep legacy applications sharp.
Windows will use nearest-neighbor scaling (sharp) rather than bilinear scaling (blurry) when you stick to integer scales, so legacy applications look as though they were running on a display whose native resolution matches the scale, i.e. 1080p for 4K@2x.
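
As a toy illustration of why integer ratios stay sharp (a sketch, not Windows' actual scaler): at 2x, every source pixel maps to an exact 2x2 block, so no colors ever get blended.

```python
def upscale_nearest(image, factor):
    """Nearest-neighbor upscale: each pixel becomes a factor-by-factor block."""
    return [[row[x // factor] for x in range(len(row) * factor)]
            for row in image for _ in range(factor)]

src = [[1, 2],
       [3, 4]]
for row in upscale_nearest(src, 2):
    print(row)
# [1, 1, 2, 2]
# [1, 1, 2, 2]
# [3, 3, 4, 4]
# [3, 3, 4, 4]

# At a fractional factor like 1.5x there is no such exact mapping, so the
# scaler has to either blend pixels (blurry) or distort them (uneven).
```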

To fix this, these screens should be higher resolution.
Personally I dislike this recent trend of monitors targeting 110 PPI. Windows uses 96 PPI as its base, and so should displays.
It's a way for manufacturers to cheap out and sell displays that are 15% smaller.

This means 5K panels at 30.6" for a 2560x1440 workspace with 2x scaling, rather than 4K at 32".
But 5K never went mainstream; the panels that were made were 27", and no one seems to sell them anymore except Apple.

That leaves us waiting for 8K at either 30.6" for 2560x1440@3x or 46" for 3840x2160@2x.
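
Those sizes fall straight out of the 96 PPI baseline - just solve for the diagonal where a panel lands at exactly 96 PPI times the scale. A minimal sketch (my arithmetic, same assumptions as above):

```python
import math

def size_for_scale(width_px, height_px, scale, base_ppi=96):
    """Diagonal (inches) at which a panel lands exactly at base_ppi * scale."""
    return math.hypot(width_px, height_px) / (base_ppi * scale)

print(f'5K @ 2x: {size_for_scale(5120, 2880, 2):.1f}"')  # 30.6"
print(f'8K @ 3x: {size_for_scale(7680, 4320, 3):.1f}"')  # 30.6"
print(f'8K @ 2x: {size_for_scale(7680, 4320, 2):.1f}"')  # 45.9"
```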

Fortunately 2018 or 2019 TVs with HDMI 2.1 should support 4K120 VRR. That means OLEDs.
The downside is that OLED is not ideal for use as a monitor, with its WRGB subpixel layout and sizes starting at 55". 8K OLEDs though…
These will be expensive, but they should also last a very long time.
Note that when I said I wanted to use a 4K display at 100% scaling, it was from the original PC paradigm of "higher resolution = more room for windows". For that, a 4K display should indeed be four times the size of your typical 1080p monitor for comfortable usage, as it would be more akin to having a quad-1080p monitor setup without distracting bezels (and without tying up three more of your video outputs).

200% scaling looks like it should, but what I don't know is how it's going to handle cursor input in certain cases - specifically old graphics programs like Paint Tool SAI that predate the push for DPI awareness in Windows and tap into drawing tablets instead of using typical mouse input. Artists still favor using 100% scaling for that reason.

Alas, I don't have a monitor DPI-dense enough to test 200% zoom with.

The HDTV market's come a long way, but the input lag still seems rather high by PC monitor standards going by what RTINGS.com has listed. I'm hoping they can bring it down a bit more, at least under the 16ms threshold.
 
I feel like the PC monitor market has hit a point of diminishing returns lately. The last 3-4 years were insane, with monitor tech advancing so fast. Over the last year things have become stale and the tech has slowed. Everybody and their mom made an ultrawide, and that's about it. Hopefully we'll see advances with DP 1.5? ;\

I have felt the same about TVs and displays for several years now. Stagnation, slow releases, etc.

But then I think back and compare with other parts of the tech industry, and I find that it's not really that displays lag behind other hardware; the core problem is that displays and TVs are the one segment of the tech industry that suffers from ridiculously bad marketing compared to the others. Display manufacturers don't know how to distinguish between roadmap projections and products.

Check out these two links: https://www.geforce.com/whats-new/articles/nvidia-g-sync-hdr-announced-at-ces-2017 and https://www.geforce.com/whats-new/articles/nvidia-g-sync-hdr-35-inch-curved-monitor

In January at CES, ASUS and Acer announced upcoming 144Hz 4K G-SYNC HDR displays. Then, ~5 months later, a second announcement of the same products came in: "Today, at Computex 2017, we unveiled two new 35” curved G-SYNC HDR monitors from Acer and ASUS.", and the same article also says "The Acer Predator X35 (left) and ASUS ROG Swift PG35VQ (right) are targeted for a Q4 2017 release.".

Notice how the first article carefully avoids mentioning any release dates but still gives the impression that the technology is mature and here now and wow, it's gonna be cool! Anyone will feel the industry moves slowly if tech announced in January only reaches customers in October at the earliest.

Which other tech industry makes this type of announcement close to a year before actually being able to ship? Imagine if Apple had showcased a new iPhone in September 2016 with "taking orders in 10 months!", or if NVIDIA's GTX 1080, which shipped in May 2016, had gotten a CES-like "hands-on" product announcement back in July 2015. No other segment of the tech industry does this, because it would be perceived as completely ridiculous. If TV and display marketing departments worked on, say, memory, then the HBM3 news from https://arstechnica.com/gadgets/2016/08/hbm3-details-price-bandwidth/ would be saying "HBM3 products unveiled now" or "Samsung HBM3 memory chips announced, first in the world", instead of giving the more honest-sounding roadmap of "HBM Gen3, 2019/2020". This is why it constantly feels to me that the display market is lagging behind: new technologies are being announced as products years before they are actually feasible. This gives tech-aware enthusiasts a false impression that these products from leading manufacturers are right at the door, coming real soon.

I love that the industry openly showcases its R&D to the public, but instead of falsely dressing up R&D as products, they should provide realistic roadmaps, like other manufacturers do, about when they expect things to become feasible. Did you ever see a display manufacturer show a roadmap slide in a presentation with its expectations of when 5K@120Hz displays will become feasible products? Yeah, me neither.

Because of the completely confusing advertising implying that every upcoming technical feature in the display space is already in a product, I'm already looking for that dual-cable DP 1.5, HDMI 2.1, USB 3.1 Gen 2, 240Hz, 8K, HDR, 100% Rec.2020, G-Sync, 42", 1000-nit OLED PC monitor. And boy does time go slowly while waiting...
 
The gaming monitor market is in a wild west sort of state right now, where anything goes even if it's complete garbage.

Gaudy enclosures, bad QC, still using DP 1.2, gimmicky advertising for 'HDR' (when it's hardly HDR), and sizes all over the place even for the same resolutions and aspect ratios.

Anything below 96 PPI these days should be automatically written off. First, it's the default PPI/scale that Windows is based on, and it's essentially the 23" 1080p desktop-environment scale. That should be the minimum floor for PPI, yet we're being charged premium prices for things like 21:9 monitors with resolutions that give us a lousy 80 PPI or lower unscaled? Complete garbage.

I'd like to see some effort from both the panel and monitor manufacturers to rally around a standard size for multiple resolutions (to bring down costs / increase economies of scale) while being reasonable both in physical size and in the desktop environment (using intuitive scaling for higher-resolution panels).

Turns out there happens to be one for all major 16:9 resolutions beyond 1080p: 30.6".
 
I have felt the same about TVs and displays for several years now. Stagnation, slow releases, etc.

But then I think back and compare with other parts of the tech industry, and I find that it's not really that displays lag behind other hardware; the core problem is that displays and TVs are the one segment of the tech industry that suffers from ridiculously bad marketing compared to the others. Display manufacturers don't know how to distinguish between roadmap projections and products.

Check out these two links: https://www.geforce.com/whats-new/articles/nvidia-g-sync-hdr-announced-at-ces-2017 and https://www.geforce.com/whats-new/articles/nvidia-g-sync-hdr-35-inch-curved-monitor

In January at CES, ASUS and Acer announced upcoming 144Hz 4K G-SYNC HDR displays. Then, ~5 months later, a second announcement of the same products came in: "Today, at Computex 2017, we unveiled two new 35” curved G-SYNC HDR monitors from Acer and ASUS.", and the same article also says "The Acer Predator X35 (left) and ASUS ROG Swift PG35VQ (right) are targeted for a Q4 2017 release.".

Notice how the first article carefully avoids mentioning any release dates but still gives the impression that the technology is mature and here now and wow, it's gonna be cool! Anyone will feel the industry moves slowly if tech announced in January only reaches customers in October at the earliest.

Which other tech industry makes this type of announcement close to a year before actually being able to ship? Imagine if Apple had showcased a new iPhone in September 2016 with "taking orders in 10 months!", or if NVIDIA's GTX 1080, which shipped in May 2016, had gotten a CES-like "hands-on" product announcement back in July 2015. No other segment of the tech industry does this, because it would be perceived as completely ridiculous. If TV and display marketing departments worked on, say, memory, then the HBM3 news from https://arstechnica.com/gadgets/2016/08/hbm3-details-price-bandwidth/ would be saying "HBM3 products unveiled now" or "Samsung HBM3 memory chips announced, first in the world", instead of giving the more honest-sounding roadmap of "HBM Gen3, 2019/2020". This is why it constantly feels to me that the display market is lagging behind: new technologies are being announced as products years before they are actually feasible. This gives tech-aware enthusiasts a false impression that these products from leading manufacturers are right at the door, coming real soon.

I love that the industry openly showcases its R&D to the public, but instead of falsely dressing up R&D as products, they should provide realistic roadmaps, like other manufacturers do, about when they expect things to become feasible. Did you ever see a display manufacturer show a roadmap slide in a presentation with its expectations of when 5K@120Hz displays will become feasible products? Yeah, me neither.

Because of the completely confusing advertising implying that every upcoming technical feature in the display space is already in a product, I'm already looking for that dual-cable DP 1.5, HDMI 2.1, USB 3.1 Gen 2, 240Hz, 8K, HDR, 100% Rec.2020, G-Sync, 42", 1000-nit OLED PC monitor. And boy does time go slowly while waiting...

The auto industry?
 
Note that when I said I wanted to use a 4K display at 100% scaling, it was from the original PC paradigm of "higher resolution = more room for windows". For that, a 4K display should indeed be four times the size of your typical 1080p monitor for comfortable usage, as it would be more akin to having a quad-1080p monitor setup without distracting bezels (and without tying up three more of your video outputs).
The problem is that 40-46" is really the absolute limit for that line of thinking - at least for the majority of people who want a single display which doesn't require them to move their head around to use it.

It is arguably past the point where it is useful for most people. Most people want to sit further back from a 40" display than from a 27" monitor, which means you have to increase the DPI scaling anyway.
Depending on how far you have to push the display back to make it comfortable, you may end up with the same workspace as a 27" monitor anyway. 1.25x scaling reduces your workspace to 3072x1728, while 1.50x reduces it to 2560x1440.
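
To put numbers on that (a quick sketch of my own; the scale factors are just the standard Windows steps):

```python
def workspace(width_px, height_px, scale):
    """Effective desktop workspace of a panel at a given DPI scale."""
    return round(width_px / scale), round(height_px / scale)

for scale in (1.00, 1.25, 1.50, 2.00):
    w, h = workspace(3840, 2160, scale)
    print(f"4K @ {scale:.2f}x -> {w}x{h} workspace")

# 4K @ 1.00x -> 3840x2160 workspace
# 4K @ 1.25x -> 3072x1728 workspace
# 4K @ 1.50x -> 2560x1440 workspace
# 4K @ 2.00x -> 1920x1080 workspace
```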

With my 34" ultrawide, the distance it's positioned from me compared to the 46" TV it replaced makes the 16:9 portion of the screen almost exactly the same size in my vision. Even though the resolution would be higher with a 4K TV, I'd have to use 1.50x display scaling to keep text at a comfortable size anyway.

200% scaling looks like it should, but what I don't know is how it's going to handle cursor input in certain cases - specifically old graphics programs like Paint Tool SAI that predate the push for DPI awareness in Windows and tap into drawing tablets instead of using typical mouse input. Artists still favor using 100% scaling for that reason.
I'm not sure why that would be an issue at all - at least if you have the compatibility mode set correctly for the application.

The HDTV market's come a long way, but the input lag still seems rather high by PC monitor standards going by what RTINGS.com has listed. I'm hoping they can bring it down a bit more, at least under the 16ms threshold.
Variable Refresh Rate support instead of having to use V-Sync is going to make a far bigger difference than knocking off an extra 4ms latency.
Once HDMI 2.1 VRR is here and supported - hopefully by NVIDIA and not just AMD - it is largely going to solve most of the latency problems that televisions have.
Sure, getting that 20ms many TVs currently have closer to 0 will help, but the switch from V-Sync to VRR is a bigger change at this point.
 
I have hope that the PG27UQ will be the first monitor in a LONG time worth upgrading to. Now it just remains to be seen if it is vaporware at this point.....

I'd bet that that one is coming. If they put one out with the same specs but 32", I'd already have one on order...
 
I'm waiting for either the PG27UQ or the PG35VQ/X35. I've gotten into FPS games and my LG34UC97S isn't cutting it anymore. I'll need to upgrade my 980 to a 1080Ti so it's going to be an extremely expensive upgrade :/
 
I'd bet that that one is coming. If they put one out with the same specs but 32", I'd already have one on order...

It's from Asus, so it'll stop working within the first year. The QC on the first ROG Swifts was an absolute joke.

And like everyone's said a hundred times, 27" 16:9 monitors are way too fucking small. 30" or don't even get out of bed. With a 16:10 aspect ratio you could maybe get away with 28", but a 27" 16:9 monitor is just so short. They're tiny.
 
And like everyone's said a hundred times, 27" 16:9 monitors are way too fucking small. 30" or don't even get out of bed. With a 16:10 aspect ratio you could maybe get away with 28", but a 27" 16:9 monitor is just so short. They're tiny.

I agree; pretty much anywhere between 30" and 32" is ideal for monitors in my opinion. They should let scaling handle the desktop real-estate issue rather than putting out ridiculously small or enormous monitors.

For example, with 16:9 resolutions:

1440p 30.6" = 96ppi @ 100% scaling
2160p (4K) 30.6" = 144ppi, 96ppi equivalent @ 150% scaling
2880p (5K) 30.6" = 192ppi, 96ppi equivalent @ 200% scaling
4320p (8K) 30.6" = 288ppi, 96ppi equivalent @ 300% scaling

As you can see, this particular size, while neither too small nor too big, also scales cleanly all the way up through the higher resolutions, at the scale Windows is intended to be used at (UI, fonts, etc.).

The same can be done with 16:10 while not being much larger (a quick arithmetic check of both tables follows after the list):

1600p 31.5" = 96ppi @ 100% scaling
2400p (4K) 31.5" = 144ppi, 96ppi equivalent @ 150% scaling
3200p (5K) 31.5" = 192ppi, 96ppi equivalent @ 200% scaling
4800p (8K) 31.5" = 288ppi, 96ppi equivalent @ 300% scaling
 
I can't imagine that they are making much money at this point. Computer monitor tech hasn't changed much in the last 5-10 years, especially from the perspective of the average consumer. What is the reason to upgrade? Should I upgrade my high-refresh-rate TN monitor with terrible viewing angles to an IPS panel with better viewing angles and massive IPS glow? Should I upgrade to this year's latest monitors, which are nothing more than rebrands of panels from 5 years ago?

The incentive to upgrade just isn't there. It's not like, for example, the cell phone market where each year brings about new features, CPUs, more RAM, etc.

IMO the only people upgrading monitors even semi-regularly are the ones falling for the blatantly false marketing or bogus spec sheets. The sad fact is, every consumer level monitor out today is basically garbage. They all have serious flaws. Buying a new one only trades out flaws for different ones.

I have hope that the PG27UQ will be the first monitor in a LONG time worth upgrading to. Now it just remains to be seen if it is vaporware at this point.....

Every LCD monitor ever bought is flawed compared to my Trinitron CRT. Doesn't mean we don't upgrade and trade flaws for other flaws.
 
They have all hit a ceiling on low input lag and fast response times as selling points. The only way to differentiate is by focusing on picture quality - and most are too incompetent in that area compared to TV manufacturers.
 
from above:

"The HP ZR30W came out in like 2009 and it's still better than 99% of the monitors available today. What a goddamned farce."

hahaha... drivel... there are several reviews out there; from one:

"The HP ZR30w offers very accurate grayscale performance and excellent viewing angles, but it comes up short in other areas. Its oversaturated reds are hard to overlook, especially since you can't go in and make the necessary adjustments to tone them down, and its input options are severely limited. Moreover, I'd expect a $1,259 monitor to utilize the latest USB technology. The ZR30w's shortcomings make it hard to recommend over the Dell UltraSharp U3014, which delivers much more accurate colors and gives you a slew of picture settings as well as USB 3.0 connectivity and a variety of video ports."
 
from above:

"The HP ZR30W came out in like 2009 and it's still better than 99% of the monitors available today. What a goddamned farce."

hahaha... drivel... there are several reviews out there; from one:

"The HP ZR30w offers very accurate grayscale performance and excellent viewing angles, but it comes up short in other areas. Its oversaturated reds are hard to overlook, especially since you can't go in and make the necessary adjustments to tone them down, and its input options are severely limited. Moreover, I'd expect a $1,259 monitor to utilize the latest USB technology. The ZR30w's shortcomings make it hard to recommend over the Dell UltraSharp U3014, which delivers much more accurate colors and gives you a slew of picture settings as well as USB 3.0 connectivity and a variety of video ports."

If you're going to make the Dell comparison, you should probably compare to the HP's Dell contemporary, the U3011. The U3011 had significantly more input lag, and you can calibrate the colors on the HP. If your purpose was both gaming and color accuracy (mine was), the HP was the better buy.
 
"better than 99% of what is available today"...he didn't say at the time it came out...is the Dell among the 1%?...are there no other monitors available today that are better?... and would they total more than 1% of available monitors?...hmmm , interesting
 
"better than 99% of what is available today"...he didn't say at the time it came out...is the Dell among the 1%?...are there no other monitors available today that are better?... and would they total more than 1% of available monitors?...hmmm , interesting

Good catch - I wouldn't put any of the old 30" 1600p monitors in the "better than 99% of what is available today" category, and I own and use the HP every day. Probably the biggest issue with them is that the LG panels they used are just too slow in terms of pixel response times.
 
True, Asus and Acer selling these panels is very concerning from a quality assurance perspective.

I think, however, that perhaps these monitors will be of a better quality than in the past. I say that because Nvidia has been marketing these as THEIR new "Nvidia Gsync 4k HDR" monitors. In other words, they have hitched the Nvidia brand name to the Asus/Acer wagon. I cannot imagine the Nvidia JDs, MBAs, and BBAs who negotiated and signed off on this joint venture with them would be so stupid as to fail to include in the contract between them some kind of quality assurance stipulations. I mean, that is business school 101 material. You never tie your company name to another company without strong guarantees they aren't going to tarnish your brand. If these monitors turn out to be a quality assurance nightmare, Nvidia and their new Gsync HDR brand will suffer accordingly.

That didn't stop the first batch of G-Sync monitors from having the same issues though (PG278Q and XB270HU).

But NVIDIA and Asus/Acer could have learned their lessons by now; I haven't heard nearly as many complaints about Acer's XB271HU (though it could be that the model has faded into obscurity; the monitor is definitely a step above Asus's PG279Q).
 
It definitely seems like a lot of the early teething problems have been sorted- I picked up my Acer Predator at Fry's of all god-awful places, and aside from having to get a different DP cable, it's been well behaved.

That's not to say that there may not be new teething problems though. I just think that these companies might've learned a thing or two about releasing monitors without a good handle on Q/C. These last rounds of people buying and returning or RMA'ing en masse have to have stung.
 
It's from Asus, so it'll stop working within the first year. The QC on the first ROG Swifts was an absolute joke.

And like everyone's said a hundred times, 27" 16:9 monitors are way too fucking small. 30" or don't even get out of bed. With a 16:10 aspect ratio you could maybe get away with 28", but a 27" 16:9 monitor is just so short. They're tiny.
If I want a TV, I buy a TV. At least in my case my display is quite close to me, so 27" would be the absolute maximum.
 
It's from Asus, so it'll stop working within the first year. The QC on the first ROG Swifts was an absolute joke.

27" 16:9 monitors are way too fucking small. 30" or don't even get out of bed. With 16:10 aspect ratio, you could get away with 28" maybe, but a 27" 16:9 monitor is just so short. They're tiny.

This is kinda funny because when I built my first gaming PC (coming from a 15" laptop) I used a 21.5" 1080p monitor for 6 years and was perfectly fine with it. I upgraded to a 27" 1440p and it was HUGE, but I got used to it and it was perfect; the 21.5" looks so tiny now. I also got a 32" HP Spectre 4K, which was a shock - it's too big. The only reason I got the 32" is that it's the only glossy 4K. Having a larger monitor forces you to sit farther back. If it were just a little smaller, like 30", it would be perfect. 32" is too big considering how close I sit, but I learned to live with it and made it work. The glossy screen is worth it.
 
Like you are saying and others have suggested, I feel that it is a matter of viewing perspective vs. distance....
[image: ZxaJuvd.jpg]


....so that you aren't eye-bending (and even micro neck bending) to the periphery.

[image: AjhPs2o.gif]



In my opinion, a larger TV as a monitor is doable with the right layout, more or less in line with the viewpoint in the schematic posted above. To me, most would be too big for 1st/3rd-person games while actually sitting at a regular desk layout.


This kind of setup below (credit: the LawrenceCanDraw YouTube channel) with a good Ergotron arm gives a lot of options for how large the monitor appears in your perspective by allowing you to change the distance. Consider how large a tiny VR screen looks up close. You don't need a 27" monitor as close as VR screens, and not necessarily quite as close as Lawrence's closest frame next to his face of course, but varying anywhere between 1' and 3' is good depending on what you are doing. Closer makes a 16:9 wide monitor take up a lot of your view. A 35" ultrawide's screen would be slightly taller, and much wider, perhaps closer to the width of the large black monitor frame on the one shown.



Personally I am looking forward to the one below, which is a 35" diagonal curved ultrawide rather than a 34" ultrawide. A little larger would definitely work, but not too much. The perspective in the picture does a good job of showing how large these monitors can appear up close at normal desk distances for gaming. Don't forget it is curved too, so the picture might be showing it slightly narrower than it would be flat.
[image: 4nEhOP7.png]


Some other pics of the Asus 35" curved ultrawide, just as links here, which give an idea of its size:
https://rog.asus.com/media/1496495502335.jpg
https://keddr.com/wp-content/uploads/2017/06/rog-monitor-2.jpg
https://keddr.com/wp-content/uploads/2017/06/rog-monitor-1-728x546.jpg


-------------------------------------------------------------------

In response to the actual question of this thread: there was a long stagnant period in LCD gaming monitors until 120Hz 1080p gaming monitors came out (with a stretch where some people suffered 60Hz and poor response times on affordable B-grade knock-off 2560x1440 IPS screens too). Then, even if it's proprietary tech, NVIDIA forced variable refresh to market, and in the best real-world solution available, regardless of what FreeSync's best-case scenario is on paper. High-Hz 2560x1440 hit with G-Sync. Then came some IPS monitors of this type with relatively low response times, and some 60Hz ultrawides, a few of which eventually overclocked to 100Hz.

4K was in the mix but only at 60Hz (baseline smear blur, low motion definition, and crippling frame rates on the GPUs at the time... molasses) - actually 4K at 30Hz initially, if you can believe it. There was also the release of some first-gen OLED VR kits. There are some people again suffering 60Hz molasses on oversized OLED TVs with several other niggling issues.

Now by the start of 2018 we will have 1000-nit, quantum-dot-enhanced-color HDR, G-Sync, 120Hz+, low-response-time monitors in IPS and VA at 4K and 3440x1440 ultrawide. I don't see that as stagnant, but it is very expensive bleeding edge for early adopters. In later years, perhaps LG's 120Hz native interpolated video tech (HFR) will translate to a true full-featured OLED gaming monitor (high Hz, variable Hz, HDR, low response time), but I'm not sure that is possible given image retention and burn-in, OLED fade (even with white OLEDs), and color calibration issues on a computer monitor. The VR segment could eventually get higher resolution at higher Hz and become more untethered, free-roaming AR/VR too. (Spielberg "Ready Player One" trailer) :droid:
 
This kind of setup below (credit: the LawrenceCanDraw YouTube channel) with a good Ergotron arm gives a lot of options for how large the monitor appears in your perspective by allowing you to change the distance. Consider how large a tiny VR screen looks up close. You don't need a 27" monitor as close as VR screens, and not necessarily quite as close as Lawrence's closest frame next to his face of course, but varying anywhere between 1' and 3' is good depending on what you are doing. Closer makes a 16:9 wide monitor take up a lot of your view. A 35" ultrawide's screen would be slightly taller, and much wider, perhaps closer to the width of the large black monitor frame on the one shown.

Personally I am looking forward to the one below, which is a 35" diagonal curved ultrawide rather than a 34" ultrawide. A little larger would definitely work, but not too much. The perspective in the picture does a good job of showing how large these monitors can appear up close at normal desk distances for gaming. Don't forget it is curved too, so the picture might be showing it slightly narrower than it would be flat.
Here's a comparison I shot when I got my PG348Q. The display in the back is a 46" TV mounted on the wall.
The PG348Q is closer to the front of the desk and was tilted back for the comparison. When vertical, it appears larger.
[image: ultrawide5qagm.jpg]


As I've said many times before on here, it ended up that the distance I found comfortable to use the monitor at resulted in the 16:9 portion of the screen being very similar in size in my vision despite the display being smaller.
I don't think that a larger 16:9 display would really benefit me, because all it would mean is that I have to sit further back from it for the image to be a comfortable size.

Since ultrawide displays have a fixed image height regardless of the content being displayed on them, you don't have to change viewing distance like you do with 16:9 displays. (letterboxed movie = sit closer on 16:9)

Now I'm not saying there is no reason for larger displays, as viewing from a greater distance may be more comfortable for people even if the resulting image size is the same.
I just don't see it as a benefit to workspace. If I'm sitting further, I'm going to need DPI scaling to keep text legible anyway, which would keep the usable workspace around 1440p.
 
Honestly, for a computer monitor 32" is the absolute maximum regardless of how high the resolution is, even if it is 8K. Higher DPI means sharper, print-like text instead of that "blocky" look. Everything is smoother and easier on the eyes when you turn scaling on. 4K at 27-32" with scaling looks better than 40-50" without scaling. You can only sit so far back before it looks blurry, even with glasses. Still, the 140 PPI on the 32" 4K is a higher density than any monitor I have ever owned. I would still like something a little higher, like 160 PPI, and I'll be set. Those 400 PPI smartphone screens spoiled me and now I want my monitor to look the same. With 32" 4K having roughly the same PPI as 21.5" 1440p or 15" 1080p, it's finally at a point where it is "good enough" for 99% of people. 15" 1080p density on a massive 32" screen - this is why 4K is the future. Now we just need a glossy 120Hz 4K and I'll be set (the HP Spectre is only 60Hz). I still use my Qnix for gaming as it has zero input lag and 96Hz.
 
Everything is overpriced, and monitors are some of the worst. I remember when everyone had a 13" CRT monitor; then the 15" came out and everyone had to have one; the next year 17", then 19", 21", on and on... a rip-off every year.
Along came LCD monitors, and did they start out big... heck no... it was 17", 19", 21", the same bullshit.

Same now: they don't have anything new, just more bling (BS) and a higher price.

I remember when I went from a 24" to a 28" monitor... it wasn't but a few days before it was pushed as far back on my desk as it could go.
 
Everything is overpriced, and monitors are some of the worst. I remember when everyone had a 13" CRT monitor; then the 15" came out and everyone had to have one; the next year 17", then 19", 21", on and on... a rip-off every year.
Along came LCD monitors, and did they start out big... heck no... it was 17", 19", 21", the same bullshit.

Same now: they don't have anything new, just more bling (BS) and a higher price.

I remember when I went from a 24" to a 28" monitor... it wasn't but a few days before it was pushed as far back on my desk as it could go.
My first LCD was a 15" with a 50ms response time. Crazy stuff nowadays, haha.
 
Honestly, for a computer monitor 32" is the absolute maximum regardless of how high the resolution is, even if it is 8K. Higher DPI means sharper, print-like text instead of that "blocky" look. Everything is smoother and easier on the eyes when you turn scaling on. 4K at 27-32" with scaling looks better than 40-50" without scaling. You can only sit so far back before it looks blurry, even with glasses. Still, the 140 PPI on the 32" 4K is a higher density than any monitor I have ever owned. I would still like something a little higher, like 160 PPI, and I'll be set. Those 400 PPI smartphone screens spoiled me and now I want my monitor to look the same. With 32" 4K having roughly the same PPI as 21.5" 1440p or 15" 1080p, it's finally at a point where it is "good enough" for 99% of people. 15" 1080p density on a massive 32" screen - this is why 4K is the future. Now we just need a glossy 120Hz 4K and I'll be set (the HP Spectre is only 60Hz). I still use my Qnix for gaming as it has zero input lag and 96Hz.

PPI is highly related to viewing distance though. Phones and tablets are viewed very close, so they need a high resolution-to-size ratio. My 15" MacBook Pro at 2880x1800 is about 220 PPI, roughly half that of an iPhone 7 Plus, and it looks great. So for desktop monitor sizes, those 27" 4K displays will probably look pretty good at 163 PPI. That said, I would personally prefer something a bit larger, like 30-32", but no bigger than that.

The posts saying we need to be at 8K to support legacy apps are ridiculous to me. Legacy apps are exactly that and should not factor in. It's up to their developers to keep up with the times. That said, it would be a good gesture from MS to offer more options for scaling those, but it can be a losing battle when they only have low-res icons and whatnot.
 
PPI is highly related to viewing distance though. Phones and tablets are viewed very close, so they need a high resolution-to-size ratio. My 15" MacBook Pro at 2880x1800 is about 220 PPI, roughly half that of an iPhone 7 Plus, and it looks great. So for desktop monitor sizes, those 27" 4K displays will probably look pretty good at 163 PPI. That said, I would personally prefer something a bit larger, like 30-32", but no bigger than that.

The posts saying we need to be at 8K to support legacy apps are ridiculous to me. Legacy apps are exactly that and should not factor in. It's up to their developers to keep up with the times. That said, it would be a good gesture from MS to offer more options for scaling those, but it can be a losing battle when they only have low-res icons and whatnot.

I'd say 140-200 PPI is the sweet spot for desktop monitors. The idea is to not be able to see any pixels. A 27" 4K is 163 PPI; you can't see the pixels at a normal viewing distance. I ended up with a 32" 4K because it's the only glossy 4K. I would prefer the smaller 27" size; 32" is massive at a 2-3 ft viewing distance. I wouldn't use anything with a PPI lower than 140. 27" 1440p looks pixelated after using 4K, and that's 109 PPI.
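
How visible pixels are depends on distance as much as PPI; a common rule of thumb puts the limit of 20/20 vision at roughly 60 pixels per degree. A quick sketch of that calculation (the viewing distances are just example values I picked):

```python
import math

def pixels_per_degree(ppi, distance_in):
    """Approximate pixels per degree of visual angle at a viewing distance (inches)."""
    return ppi * distance_in * math.tan(math.radians(1))

# ~60 PPD is roughly the 20/20-vision threshold
for label, ppi in (('27" 4K', 163), ('32" 4K', 138), ('27" 1440p', 109)):
    for dist in (24, 36):  # inches
        print(f'{label} at {dist}": {pixels_per_degree(ppi, dist):.0f} PPD')

# 109 PPI at 24" lands around 46 PPD, well under 60 - consistent with
# 27" 1440p looking pixelated up close after using 4K.
```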
 
I'd say 140-200 PPI is the sweet spot for desktop monitors. The idea is to not be able to see any pixels. A 27" 4K is 163 PPI; you can't see the pixels at a normal viewing distance. I ended up with a 32" 4K because it's the only glossy 4K. I would prefer the smaller 27" size; 32" is massive at a 2-3 ft viewing distance. I wouldn't use anything with a PPI lower than 140. 27" 1440p looks pixelated after using 4K, and that's 109 PPI.

I had a 30" 16:10 2560x1600 display that I felt was a bit too big but would probably appreciate the same size in 4K 16:9 as my 27" 1440p feels like a good size for desktop but could be bigger when gaming.
 
I am using a Sony 43" 4K HDR TV as my temporary monitor until I can find a 37-40" G-Sync display.
 
We're close enough to "Q4 2017", when some of these new monitors are supposedly coming out, that I guess I'll just keep on waiting, barring a sudden, catastrophic failure of my NEC 2490WUXi.
 
Yes, I'm hoping that TFT Central will get their hands on both the 4K 1000-nit QD FALD HDR G-Sync IPS and the 3440x1440 curved 21:9 ultrawide 1000-nit QD FALD HDR VA and do in-depth reviews - comparing black levels, detail in blacks, overall contrast, color volume, and brightness/darkness uniformity, and how tightly localized each can be in and out of FALD HDR content. That along with all of the other measurements they do, especially pursuit-camera runs at 100fps-Hz, 120fps-Hz, and 144fps-Hz (and additionally 200fps-Hz on the VA, though it will probably only be tight at 100-120, maybe 144).
 
It's inexcusable and complete BS that NVIDIA refuses to support the VESA Adaptive-Sync standard. I'm positive NVIDIA has membership and representation in VESA. This is a complete failure. NVIDIA should fix it today with a simple driver update, but they don't, because they have their heads too far up their own asses, as this guy put it:

https://forums.geforce.com/default/...rt-vesa-adaptive-sync-freesync-yet-/?offset=4

Display advancement is pathetic: there are way too many models with lag, PWM, and 60Hz. Most manufacturers still don't even spec lag! You have to wait for reviews and then be at the mercy of measurement technique/error, which is an idiotic system. And by the way, I only care about 4:4:4, so don't bother with non-4:4:4 "game modes" with slightly lower (but still laggy) numbers. It needs to be under 16ms.
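
For context on that 16ms figure (my reading: it's roughly one frame at 60Hz, so anything under it costs less than a frame of delay; a trivial check):

```python
def frame_time_ms(refresh_hz):
    """Length of one refresh cycle in milliseconds."""
    return 1000 / refresh_hz

for hz in (60, 120, 144):
    print(f"{hz}Hz -> {frame_time_ms(hz):.1f}ms per frame")

# 60Hz -> 16.7ms per frame
# 120Hz -> 8.3ms per frame
# 144Hz -> 6.9ms per frame
```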
 
Yes, I'm hoping that TFT Central will get their hands on both the 4K 1000-nit QD FALD HDR G-Sync IPS and the 3440x1440 curved 21:9 ultrawide 1000-nit QD FALD HDR VA and do in-depth reviews - comparing black levels, detail in blacks, overall contrast, color volume, and brightness/darkness uniformity, and how tightly localized each can be in and out of FALD HDR content. That along with all of the other measurements they do, especially pursuit-camera runs at 100fps-Hz, 120fps-Hz, and 144fps-Hz (and additionally 200fps-Hz on the VA, though it will probably only be tight at 100-120, maybe 144).

We should be getting a taste of 384-zone FALD, 1000-nit HDR10 pretty soon, as Simon said this week that the Dell UP2718Q is on its way for review. It's not a gaming monitor, but it has impressive specs nonetheless, being a professional-grade 4K monitor.
 
http://www.straitstimes.com/tech/pcs/a-dream-monitor-for-creative-professionals

It is a mini-review of a new HDR Dell monitor, the UP2718Q.

In the article it says this:

I watched a couple of Ultra HD Blu-ray videos on the Dell monitor using an LG Ultra HD Blu-ray player. While the videos looked impressive, thanks to their 4K resolution and HDR effect, I felt that the UP2718Q's anti-glare coating dampened some of the brighter HDR visuals, as well as making the images look grainy.



Another expensive monitor that loses out because of the matte coating...

Thanks for the heads-up. I'm looking forward to the review, especially in relation to the FALD, since it might give some idea of how it could perform on the gaming monitors, as you said.
 
Really disappointing that they don't offer the UP2718Q with AR-treated glass - basically a type of anti-glare coating that retains the clarity of a glossy screen while reducing reflections as well as matte does. Why have such a good display and then obstruct the pixels with that hazy layer of plastic? It's 2017 and there are still no glossy 2K/4K monitors - nothing except for the HP Spectre 32, which isn't 100% glossy but 98%... We need displays with actual AR glass to reduce glare, not hazy plastic. If someone is spending $800+ on a monitor, I'm sure they wouldn't mind spending an extra $200 for AR glass.
 
Really disappointing that they don't offer the UP2718Q with AR-treated glass - basically a type of anti-glare coating that retains the clarity of a glossy screen while reducing reflections as well as matte does. Why have such a good display and then obstruct the pixels with that hazy layer of plastic? It's 2017 and there are still no glossy 2K/4K monitors - nothing except for the HP Spectre 32, which isn't 100% glossy but 98%... We need displays with actual AR glass to reduce glare, not hazy plastic. If someone is spending $800+ on a monitor, I'm sure they wouldn't mind spending an extra $200 for AR glass.

Yeah man, the lack of glossy displays is probably what upsets me more than anything. It doesn't matter what boxes are checked when it comes to a new gaming monitor; nothing EVER manages to check the glossy box.
 
I'd say 140-200 PPI is the sweet spot for desktop monitors. The idea is to not be able to see any pixels. A 27" 4K is 163 PPI; you can't see the pixels at a normal viewing distance. I ended up with a 32" 4K because it's the only glossy 4K. I would prefer the smaller 27" size; 32" is massive at a 2-3 ft viewing distance. I wouldn't use anything with a PPI lower than 140. 27" 1440p looks pixelated after using 4K, and that's 109 PPI.

crazy

I had a 28" 4k and the PPI was so high it was a complete waste of computing power. Yes the native anti-aliasing from such high PPI was amazing in games, but I went to a 165hz 1440p (109ppi) and they look too similar to warrant the massive fps drop by using 4k.

Now I'm looking into a 32" 4K, because 32" has been my dream size ever since I gamed on a 32" 1080p TV back in the day.
 
crazy

I had a 28" 4k and the PPI was so high it was a complete waste of computing power. Yes the native anti-aliasing from such high PPI was amazing in games, but I went to a 165hz 1440p (109ppi) and they look too similar to warrant the massive fps drop by using 4k.

Now I'm looking into a 32" 4K, because 32" has been my dream size ever since I gamed on a 32" 1080p TV back in the day.

32" 4K is 138 PPI what is the same as 21.5 1440p so it'll be a decent upgrade from 27" 1440p with both sharpness and immersion. FPS drop is the only downside. Need at least a 1080 Ti to have max settings in games.
 
32" 4K is 138 PPI what is the same as 21.5 1440p so it'll be a decent upgrade from 27" 1440p with both sharpness and immersion. FPS drop is the only downside. Need at least a 1080 Ti to have max settings in games.

Max settings is a gimmick in games.

e.g. 4096-resolution shadows cost a 5fps drop yet look exactly the same as medium's 2048-resolution shadows. I used to play at 4K on an R9 290 before going to a 980 Ti and it was suuuuuuper acceptable :p
 