OLED monitors out yet? Best "black is black" monitor for gaming AND photo editing?

Despotes
Gawd · Joined: Aug 27, 2005 · Messages: 821
I'm a few years behind on monitor tech, but I'm still waiting for a monitor that has black levels that are actually black, excellent colors, at least 144 Hz, and no edge light bleed or streaks of varying shades. Need good uniformity.
Does such a 32"-ish monitor exist?
 
LG 32GK850G. It's a VA panel, comes in either a G-Sync or FreeSync version, runs at 144 Hz, and has fast response times. VA will give you much better blacks than IPS or TN, with colors in between IPS and TN. https://www.tftcentral.co.uk/reviews/lg_32gk850g.htm

Basically all IPS panels will have pretty bad bleed. It would drive me crazy, which is why I'm going either VA or TN for my next monitor.

LG's also coming out with some Nano IPS screens in the next few weeks, so it might be worth checking those out.

ViewSonic has a really good IPS monitor, the XG2703-HS. It still has IPS glow, but in reviews it seems to be a lot less than on other screens. It's $1,000 right now, though.

Might just have to decide what your main priority is. If it's photo editing, get a good 1440p, 27" monitor aimed at photo work. It'll likely be IPS, but it will probably have less bleed, since manufacturers put a little more TLC into that aspect when picture quality is the priority. You'll be stuck gaming at 60 Hz, though, and that could be rough. It's why I'm wanting to give up my gorgeous HP 32Q that I got for a great price. Once you game at 144+, there is no going back. So much so that I'm actually considering a smaller, 1080p, 240 Hz screen! I need moar framez! Haha!
 
The closest you can get is an LG 2019 55" OLED TV.

You could do 4K 120 Hz with variable refresh rate if you had an HDMI 2.1 video card, but those video cards don't exist yet.

You can do 1080p at 120 Hz, or 4K at 60 Hz, and maybe FreeSync on top of that if you use an AMD card. Nvidia doesn't support VRR over HDMI yet.
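For anyone who wants to sanity-check the bandwidth side of this, here's a rough back-of-the-envelope in Python (active pixels at 8-bit RGB only; real links add blanking overhead, so treat the numbers as approximate):

Code:
MODES = [("1080p @ 120 Hz", 1920, 1080, 120),
         ("4K @ 60 Hz",     3840, 2160, 60),
         ("4K @ 120 Hz",    3840, 2160, 120)]

BITS_PER_PIXEL = 24    # 8-bit RGB, 4:4:4
HDMI_2_0_GBPS = 14.4   # usable data rate after 8b/10b encoding (HDMI 2.1 FRL: ~42.7)

for name, w, h, hz in MODES:
    gbps = w * h * hz * BITS_PER_PIXEL / 1e9
    fits = "fits HDMI 2.0" if gbps <= HDMI_2_0_GBPS else "needs HDMI 2.1"
    print(f"{name}: ~{gbps:.1f} Gbit/s, {fits}")

1080p120 (~6 Gbit/s) and 4k60 (~12 Gbit/s) squeeze under HDMI 2.0's ~14.4 Gbit/s; 4k120 (~24 Gbit/s) doesn't, hence the wait for HDMI 2.1 hardware.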
 
I'm a few years behind on monitor tech, but I'm still waiting for a monitor that has black levels that are actually black, excellent colors, at least 144 Hz, and no edge light bleed or streaks of varying shades. Need good uniformity.
Does such a 32"-ish monitor exist?

No.

Not even close.

Best bet is to get separate monitors- something good for gaming, and something good for photo editing.
 
It's not available in screens that size yet, but we're finally seeing a second effort to get OLEDs into laptops this year, so we're making progress.

https://www.digitaltrends.com/computing/oled-laptops-have-returned-but-are-they-worth-it/

Most of the models are already shipping and have been reviewed, so have a look around. We now have more than a trickle of OLED offerings in laptops, and that's good progress for PC monitors! Maybe we'll see something larger in a few more years, or you can compromise and pick up one of these 15" models now.

At least the brightness levels are way up versus the previous generation, which was kind of necessary. They're now comparable with IPS/VA, so now they just have to grow the screens.
 
The state of the display market is depressing. You can technically buy OLED monitors now, but they're intended for the professional market and are prohibitively expensive. There are some things to look forward to, however:
  • LG is supposedly going to release 48" OLED TVs next year
  • JOLED is working towards getting a full OLED production line going next year, and among other things they've shown a 27" prototype
  • Most interesting for desktop use, IMO, are the recent developments of dual-cell LCD, with Innolux apparently having a 31.5" panel coming
So nothing that's available right now, but things may finally be looking up in the next year or two.
 
The closest you can get is an LG 2019 55" OLED TV.

You could do 4K 120 Hz with variable refresh rate if you had an HDMI 2.1 video card, but those video cards don't exist yet.

You can do 1080p at 120 Hz, or 4K at 60 Hz, and maybe FreeSync on top of that if you use an AMD card. Nvidia doesn't support VRR over HDMI yet.
On the plus side, once we get past November, OLEDs will fall in price dramatically, meaning you'll get a top-end experience for mid-range prices.
 
LG 32GK850G vs Samsung C32HG70. How does the LG compare to the Samsung? Both VA panels cost roughly the same.
Is there something better in the same price range?
 
LTT recently reviewed the 2019 Razer Blade with a 15" Samsung AMOLED display and spent considerable effort analyzing the OLED panel itself, but ultimately came up with a "no" recommendation.

Some of the marks against it, such as power draw, wouldn't apply in a desktop setting anyway. Valid marks against it (to me) were the 60 Hz refresh and the overkill pixel density. The display panel itself, while beautiful, just isn't quite checking all the boxes for Linus.

Linus also mentioned something I totally agree with: within the industry there is a disconnect between display marketing and engineering. 4k'ing everything is not going to capture the interest of the desktop, much less laptop, markets given the sizes they are targeting. 4k on a 55" TV is perfectly fine, but makes no sense on something significantly smaller, say a 27" panel. And on a 15" panel, 4k is absolutely absurd. That's not the fault of OLED, but of the industry as a whole.

Basically, if they (OEMs) want to gain any ground with OLED in the desktop or laptop space, they need to stop pushing 4k so hard. What works for TV markets does not apply in alternate markets. OLED is an expensive new tech that needs all the help it can get penetrating new markets.

Also, once you've got a taste of high refresh, it's hard to justify going back. So the 60 Hz complaint is a valid one right now, especially when it's dollars at stake.

That said, I'd seriously consider a 60 Hz OLED desktop offering, knowing I'd sacrifice high refresh, as long as it offers proper pixel density (110-ish PPI) and isn't something so small that the dollars-per-inch ratio leaves a bad taste in my mouth.
 
4k on a 55" TV is perfectly fine, but makes no sense on something significantly smaller, say a 27" panel. And on a 15" panel, 4k is absolutely absurd.

What kind of nonsense is this? The difference between text on a 4K 27" display and a 1440p one is easily noticeable. High pixel densities have value on phones, and they have value on laptops and desktop displays as well. Even higher pixel densities than 4K @ 27" have easily demonstrable value for some applications, such as photo editing.

I agree 60 Hz is too low a refresh rate. But there is no reason that OLED displays can't be 4K120. In fact, most if not all OLED panels already are. The choice not to do higher refresh rates stems primarily from a (mistaken) belief that only "gamers" benefit. However, the noticeable benefits of 90 and 120 Hz on phones and tablets are becoming more mainstream, so it's only a matter of time before desktop users demand this as well.
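For concrete numbers, the density gap being argued over is easy to compute. A quick Python sketch (the sizes are just the common panels mentioned in this thread):

Code:
from math import hypot

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch: diagonal resolution over diagonal size."""
    return hypot(width_px, height_px) / diagonal_in

for name, w, h, d in [('27" 1440p', 2560, 1440, 27.0),
                      ('27" 4K',    3840, 2160, 27.0),
                      ('15.6" 4K',  3840, 2160, 15.6)]:
    print(name, round(ppi(w, h, d)))  # ~109, ~163, ~282 PPI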
 
What kind of nonsense is this? The difference between text on a 4K 27" display and a 1440p one is easily noticeable. High pixel densities have value on phones, and they have value on laptops and desktop displays as well. Even higher pixel densities than 4K @ 27" have easily demonstrable value for some applications, such as photo editing.

Scaling on the desktop is ass. Phones are great! And totally different from a UI and software ecosystem perspective. And making that kind of versatility happen on the desktop in the near future is highly unlikely. I don't want 4k at < 35". I have a 4k panel at 31.5", and good vision, and it's just too high DPI. 27"? No.

And on a 15" panel, 4k is absolutely absurd.

...and now I'm going to appear to contradict myself a bit. On laptops, you have two things going for you that you don't on the desktop: first, you can use 2:1 scaling, which is how Apple did it on everything (and it's still ass when it doesn't work); second, laptops have narrower use cases for most people.

Now, I'm not most people- I want 1080p on a laptop, be it 13" or 17" (own both, both work great), and there is an advantage to higher DPI for some workflows so I can see that being a decision point too.

Some of the marks against it, such as power draw

This is the only valid complaint. You can get low-latency, high-refresh OLED panels with VRR, just apparently not in the laptop space, or Razer of all companies would have used them. But battery life is king in the mobile space, and a 15" laptop is still on the portable side of things.
 
What kind of nonsense is this? The difference between text on a 4K 27" display and a 1440p one is easily noticeable. High pixel densities have value on phones, and they have value on laptops and desktop displays as well. Even higher pixel densities than 4K @ 27" have easily demonstrable value for some applications, such as photo editing.
They have value to some, yes, but it's diminishing returns for most. And it depends on the content in question. I was speaking in a general context. Content creation is niche, probably more niche than gaming. If a gamer buys a 27" 4k OLED but inevitably runs practically everything at 1/4 native res, then all that high density has ultimately gone to waste. And if it would have cost the consumer less to have a reduced-cost version with less pixel density, well, that supports the anti-4k case even more.

The 4k marketing worked in the TV space because scaling non-4k video is trivial today, and most TV consumers don't care about the end result. The "waste" there isn't nearly as important as the price. Once consumers could buy a 4k panel at the price of its predecessors, it was a no-brainer, because there was nothing to lose.

In the desktop space, there isn't that luxury. You can't just slap on infinite DPI and expect everything to work like it magically did in the TV space. Desktop content is dynamic, has history, and most software just doesn't want to scale properly. You can't even compare desktop apps to phones; that's pretty much apples and oranges. Phone apps target a completely different audience and are built from the ground up to be DPI-aware. That luxury simply doesn't exist in the desktop space.

Maybe, just maybe, in 10 years things in the desktop world will catch up. Perhaps in 10 years, 4k 240Hz OLED desktop panels for a few hundred will be the norm. Right now, that's a fantasy.
 
On laptops, you have two things going for you that you don't on the desktop: first, you can use 2:1 scaling, which is how Apple did it on everything (and it's still ass when it doesn't work); second, laptops have narrower use cases for most people.

Now, I'm not most people- I want 1080p on a laptop, be it 13" or 17" (own both, both work great), and there is an advantage to higher DPI for some workflows so I can see that being a decision point too.
Just to stress the point that I'm not just talking out of my ass: I own a Razer Blade 13" with an IGZO panel that does 3200x1800. For me, that panel was practically a waste, and I wish I had gotten the HD version instead. I had no problems with the quality of the panel, mind you. But to actually have the desktop usable, I had to crank the Windows 10 scaling to 300%. As you well know, not all of my apps, some of which I developed myself, handled that kind of scaling well. Then for games, the GTX 970M wasn't going to cut it at native res, so I compromised and ran at quarter res. Well, even on a 13" that looked like ass, but at least my framerates were reasonable.

Basically, my Windows 10 scaling experience with a 200+ PPI 13" laptop was horrible. Whatever compromise I came up with ended up ruining whatever glorious glossy IPS expectation I had when I added it to the shopping cart. As a consumer, I will never ignore PPI again, whether it's in the laptop or desktop space. It matters. It really does.

And for those jabbing at me with "well, you should have gotten a Mac"? A Mac is not a reasonable solution. And I'm no stranger to a Mac. It's just another compromise.
 
I own a Razer Blade 13" with an IGZO panel that does 3200x1800. For me, that panel was practically a waste, and I wish I had gotten the HD version instead.

Have an XPS13 with something similar- it's set to 1600x900. Girlfriend is using it and it's perfect for her, but I prefer the 1080p panel on my ASUS for sure, as it doesn't need scaling.

And you still can't get a near top-spec XPS13 without 4k. Downright stupid.
 
Have an XPS13 with something similar- it's set to 1600x900. Girlfriend is using it and it's perfect for her, but I prefer the 1080p panel on my ASUS for sure, as it doesn't need scaling.

And you still can't get a near top-spec XPS13 without 4k. Downright stupid.

Regardless, you're not gaming with Intel HD graphics... (pretty sure). At which point, an i5 browses the web just as well as an i7 :)
 
Regardless, you're not gaming with Intel HD graphics... (pretty sure). At which point, an i5 browses the web just as well as an i7 :)

To be clear, I absolutely do. The latest AAA games? Nope.

On my ultrabook it's League of Legends and similar. I actually do want my laptop to have a 120Hz panel and VRR.

But I digress.
 
If a gamer buys a 27" 4k OLED but inevitably runs practically everything at 1/4 native res, then all that high density has ultimately gone to waste.

That's not how pixel density works, and 4K is getting easier to drive.

If your personal experience comes down to equating a 13" laptop screen with a 27" desktop display, I suggest rethinking that opinion.
 
I happen to like high pixel density - higher resolutions on smaller screens.

Totally subjective opinion there, but I don't consider 4K at 27" wasted at all; I can definitely notice and appreciate the difference, especially in the clarity of text and graphics, though not so much in gaming. Sitting at my desk, about 2.5' away from the monitors, 27" is about as large as I'd like to go right now. I used to use 24" monitors, and honestly I wouldn't have a hard time going back down to that size. 32" begins to get a bit too big, as where I sit I have to move my head side to side to see the entire screen, and that bothers me while working/gaming.

My 55" OLED I love to death, but if I'm totally honest about it, sitting back at the couch --- now on that I can often not tell a significant difference between 1080 and 4K while watching movies. I can definitely tell between SDR and HDR though.

So my personal experience and subjective opinion are contrary to what a lot of folks here state, that you need a bigger screen to enjoy higher resolution. I find the opposite to be true: the closer I am to the screen, the smaller the display and the higher the pixel density I'd like.

I won't call those folks that like bigger screens or lower resolutions wrong - you like what you like.
 
That's not how pixel density works
Then by all means, please enlighten us.
If your personal experience comes down to equating a 13" laptop screen with a 27" desktop display, I suggest rethinking that opinion.
Yeah, well, my personal experience extends beyond just a laptop. The laptop just happens to represent a 270 PPI extreme case. Look at my sig: I have a 34" UW (110 PPI) and a 27" QHD (110 PPI), either of which I consider ideal.

I've already test driven an Asus PG27UQ. Nope, do not want.

And as you pointed out, it's an opinion, man.
 
Then by all means, please enlighten us.

Most modern game engines (due to consoles, primarily) support render scaling, whereby they render a scene at lower resolution and then upscale it to 4K. By using this technique you can render the things that strongly benefit from higher pixel density, like the UI, at full resolution, and you can reduce the pixel density of the base rendering when it would be visually indistinguishable anyway. In fact, you can even do this dynamically to maintain a targeted framerate, and it also allows all sorts of other tricks that aren't possible when you're hard-restricted to rendering at a certain native resolution only.
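In rough Python, the dynamic version of that idea looks something like this (the function and thresholds are made up for illustration, not from any particular engine):

Code:
def update_render_scale(scale, frame_ms, target_ms=8.33, lo=0.5, hi=1.0):
    """Nudge the 3D render scale toward a target frame time.
    The UI still draws at native res; only the scene buffer is
    rendered at scale * native and upscaled afterwards."""
    step = 0.05
    if frame_ms > target_ms:            # too slow: render fewer pixels
        scale -= step
    elif frame_ms < 0.9 * target_ms:    # headroom: claw back sharpness
        scale += step
    return max(lo, min(hi, scale))

print(update_render_scale(1.0, 11.0))  # frame took 11 ms vs. a 120 Hz
                                       # (8.33 ms) budget -> 0.95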

Also, in general, *even if* you naively upscale a 1080P image to 4K using nearest neighbour, and display it on a 4K screen, the 4K screen will look better due to lack of screen door effect alone, ignoring the possibility of doing better anti-aliasing and other such things in the process of the upscale.
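And the 1080p-to-4K case is the friendly one, since it's an exact 2x integer ratio; naive nearest neighbour is literally just pixel duplication. A minimal NumPy sketch:

Code:
import numpy as np

def nearest_neighbour_2x(img):
    """Duplicate each pixel into a 2x2 block: 1080p -> 4K exactly."""
    return img.repeat(2, axis=0).repeat(2, axis=1)

frame = np.zeros((1080, 1920, 3), dtype=np.uint8)  # a 1080p RGB frame
print(nearest_neighbour_2x(frame).shape)           # (2160, 3840, 3)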

The only real downside to higher-resolution displays is the increased bandwidth required on the cable connection to the device, which restricts refresh rate. But this is mostly due to the unreasonably slow process of updating connection standards.

E: On the desktop, I should say. On laptops, higher-resolution screens also increase power consumption, due to the brighter backlight required to reach the same screen brightness. Higher refresh rates also increase power consumption.
 
Most modern game engines (due to consoles, primarily) support render scaling, whereby they render a scene at lower resolution and then upscale it to 4K. By using this technique you can render the things that strongly benefit from higher pixel density, like the UI, at full resolution, and you can reduce the pixel density of the base rendering when it would be visually indistinguishable anyway. In fact, you can even do this dynamically to maintain a targeted framerate, and it also allows all sorts of other tricks that aren't possible when you're hard-restricted to rendering at a certain native resolution only.

Most panels today have built-in scalers that do that anyway. And if they don't, most GPU drivers have options that fill that role, some even taking aspect ratio into consideration, something a poor scaler implementation might overlook. While I'll agree that having an in-game scaling option is nice to have, with bonus points if it's done dynamically based on content, it's not nearly as exploited as you're making it out to be. Very few PC games, modern or otherwise, offer even the most basic scaling option, often completely ignoring (or hiding) any game engine offerings. But that doesn't matter, because we, the PC Master Race, have other ways of achieving the same thing. There are shader injector tools that give us all sorts of ways to apply filtering to the rendered output, whether it be sharpening, 2xSaI, or whatever flavor-of-the-month effect one prefers. It is nice to see that more developers are catching on to this need, and even AMD's FidelityFX is a push in the right direction, but we are years away from such options becoming standard.

And I wouldn't go so far as to say the scaling is "visually indistinguishable", given that something like nearest-neighbor will, by design, make contrasting object edges look much softer than the native equivalent. Even worse, if the scaling ratio is non-integer, it will look far worse than integer scaling. It's no coincidence that nearest-neighbor is often dubbed one of the "crudest" forms of filtering. Scaling filters... actually, filters in general are going to be subjective, and each one really depends on whether the person notices any issue with the unfiltered image at all. In fact, that's why some prefer integer scaling over anything else.

It's also not a one-way street; "visually indistinguishable" works both ways. Unless a person is sitting really close (like less than a foot away) or using a magnifying glass, they won't notice any of the so-called benefits of upscaling, sharpening, font scaling, etc. Not everyone sits 1-2 feet from their monitor or has eagle-eye vision to pick up on these microscopic details. They'll inevitably find "value" elsewhere.
Also, in general, *even if* you naively upscale a 1080P image to 4K using nearest neighbour, and display it on a 4K screen, the 4K screen will look better due to lack of screen door effect alone, ignoring the possibility of doing better anti-aliasing and other such things in the process of the upscale.

This is a really weak argument.

No one except the rare handful of freaks with telescopic vision would notice any screen door effect on a 27" 1440p panel. I have 20/20 vision, and even *I* have to use a magnifying glass to find it. Once desktop LCDs started offering higher pixel densities and more efficient pixel arrangements, the whole screen door critique went out the window. I don't even recall the last time I read a desktop monitor review where the screen door issue was brought up. And if such a review exists, it was probably a panel I already knew was going to be garbage.

You'd literally need to be sitting with your face right up against the panel to even notice any of the gaps. Of course, the pixel geometry and pixel spacing, and thus the quality, of the panel dictate that, so there are always exceptions. Those 27" panels offering a 1080p resolution? I'm certain someone out there finds them "visually indistinguishable" from the higher-density versions. I'm also sure that if I looked hard enough, I'd find a screen door effect on some arbitrary (i.e. cheaply made) low-PPI panel. But if I have to resort to using a magnifying glass, then the point is moot. The reality is that in 2019 the screen door effect is a non-issue, and you don't need to limit your options to a 4k screen just to combat it. Just get a quality panel that you are comfortable with and call it a day.

You'll have an easier time convincing someone who uses a 55" TV as a desktop monitor and sits less than 5 feet away that screen door is a "thing". Outside of the VR and TV-as-a-desktop crowds, I don't see the "4k hides the screen door effect" argument carrying much weight here.
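If you want to put a number on "sitting really close", the usual rule of thumb is that 20/20 vision resolves about 1 arcminute. Rough Python math (rule of thumb only; real-world perception varies):

Code:
from math import radians, tan

def resolve_distance_in(ppi):
    """Viewing distance (inches) inside which individual pixels start
    to be resolvable, per the ~1 arcminute rule for 20/20 vision."""
    return (1.0 / ppi) / tan(radians(1.0 / 60.0))

for name, ppi in [('27" 1080p', 82), ('27" 1440p', 109), ('27" 4K', 163)]:
    print(name, round(resolve_distance_in(ppi)))  # ~42, ~32, ~21 inches

At a typical 2.5-3 foot desk distance, 27" 1440p is already right at that limit, which is exactly the point.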
 
Most panels today have built-in scalers that do that anyway.

No monitor panel is capable of distinguishing your game UI from the viewport, so comparable scaling is impossible without engine support.


This is a really weak argument.

Doesn't need to be a strong argument; I'm just pointing out that there's really no downside in terms of visual fidelity in any case. The onus is on people opposed to pixel density to make strong arguments against it anyway, and there really isn't one except performance in old games without render scaling. Higher pixel density is only going to accelerate as we move from 4K to 8K over the next 10 years.
 
No monitor panel is capable of distinguishing your game UI from the viewport, so comparable scaling is impossible without engine support.
In a dynamic context, you are right. Scaling up until this point has mostly been "dumb".
Doesn't need to be a strong argument; I'm just pointing out that there's really no downside in terms of visual fidelity in any case. The onus is on people opposed to pixel density to make strong arguments against it anyway, and there really isn't one except performance in old games without render scaling. Higher pixel density is only going to accelerate as we move from 4K to 8K over the next 10 years.
Fair enough. I will agree that higher pixel density is inevitable; it will give GPU designers something to chase after. My only "fear" is that overall quality will be sacrificed for the sake of marketing. I'm one of the original naysayers who said they'd take a 1080p OLED TV over any 4k LCD competitor. But now we have 4k OLEDs...
 
I'm a few years behind on monitor tech, but I'm still waiting for a monitor that has black levels that are actually black, excellent colors, at least 144 Hz, and no edge light bleed or streaks of varying shades. Need good uniformity.
Does such a 32"-ish monitor exist?

If there's one thing I've learned about waiting for that "dream" monitor, it's that by the time it comes out, you'll want something that can do 8k, 240hz, SHDR, etc. etc. :oldman:
 
The LG C9 series of OLED TVs can do what you want, but it requires a GPU that can output HDMI 2.1 (none currently do) if you want 120 Hz @ 4k. Older models (C7/C8 series) support up to 1080p @ 120 Hz as well, though not at 4k due to HDMI 2.0 bandwidth limitations. And yes, these are all native 120 Hz panels, not interpolated. Even older models (C6) have 120 Hz panels but can't accept a 120 Hz signal.

And again, being high-end TVs, you can't get sizes smaller than 55".

EDIT

Full disclosure: I own an LG B6P and will probably upgrade to the B9 once NVIDIA puts out an HDMI 2.1-capable GPU, for both the 4k @ 120 Hz and the HDMI Forum VRR.
 
The Samsung Q6/Q7/Q8/Q9 do 1440p @ 120 Hz at 55" and above. I'm considering the Q6; it has contrast ratios far above the best PC monitor with a VA panel, plus low input lag, and the 55" isn't too expensive. Just bigger than I prefer.

The LG C9 series of OLED TVs can do what you want, but it requires a GPU that can output HDMI 2.1 (none currently do) if you want 120 Hz @ 4k. Older models (C7/C8 series) support up to 1080p @ 120 Hz as well, though not at 4k due to HDMI 2.0 bandwidth limitations. And yes, these are all native 120 Hz panels, not interpolated. Even older models (C6) have 120 Hz panels but can't accept a 120 Hz signal.

And again, being high-end TVs, you can't get sizes smaller than 55".

EDIT

Full disclosure: I own an LG B6P and will probably upgrade to the B9 once NVIDIA puts out an HDMI 2.1-capable GPU, for both the 4k @ 120 Hz and the HDMI Forum VRR.
 
LG C9 55" does 120hz @ 1440p as well not just at 4K, so it can be used the same way. Previous LG TVs didn't support 120hz @ 1440p.
 
OLED is crap. There will be MicroLED monitors out before OLEDs have resolved all the roadblocks for use as a computer monitor. OLED is just this generation's plasma. It's doomed to die.
 
LG 32GK850G. It's a VA panel, comes in either a G-Sync or FreeSync version, runs at 144 Hz, and has fast response times. VA will give you much better blacks than IPS or TN, with colors in between IPS and TN. https://www.tftcentral.co.uk/reviews/lg_32gk850g.htm

Basically all IPS panels will have pretty bad bleed. It would drive me crazy, which is why I'm going either VA or TN for my next monitor.

LG's also coming out with some Nano IPS screens in the next few weeks, so it might be worth checking those out.

ViewSonic has a really good IPS monitor, the XG2703-HS. It still has IPS glow, but in reviews it seems to be a lot less than on other screens. It's $1,000 right now, though.

Might just have to decide what your main priority is. If it's photo editing, get a good 1440p, 27" monitor aimed at photo work. It'll likely be IPS, but it will probably have less bleed, since manufacturers put a little more TLC into that aspect when picture quality is the priority. You'll be stuck gaming at 60 Hz, though, and that could be rough. It's why I'm wanting to give up my gorgeous HP 32Q that I got for a great price. Once you game at 144+, there is no going back. So much so that I'm actually considering a smaller, 1080p, 240 Hz screen! I need moar framez! Haha!

VA sucks, especially for larger monitors. The viewing angles ruin it. People tend to sink into their chairs, shift around, look at monitors at different relative positions vertically... nope.

IPS is still the only real option for computer monitors, and it will be until MicroLED.
 
VA sucks, especially for larger monitors. The viewing angles ruin it. People tend to sink into their chairs, shift around, look at monitors at different relative positions vertically... nope.

IPS is still the only real option for computer monitors, and it will be until MicroLED.

I think you're thinking of TN. VA has viewing angles almost as good as IPS but with 2-3 times higher contrast.

Viewing angles were never an issue for me anyway. I have never once gamed sitting 160 degrees off center.

IPS isn't an option for me because of IPS glow lighting up all four corners. I game in a fairly dim room, and usually in darkly lit games. Plus, the higher contrast of VA just makes the image look so much better to my eyes.
 
Viewing angles were never an issue for me anyway. I have never once gamed sitting 160 degrees off center.

Depending on the VA panel used- just like TN, the better ones are pretty good, but the ones with compromises fall a bit short- it takes quite a bit less than 160 degrees off axis for the image to start transforming. I find with my two 31.5" VAs- one a legitimately cheap monitor and one a gaming monitor- that I have to keep my viewing position closer to the primary axis than I would for my old 30" IPS, or even the cheap 24" IPS Acers I have as side monitors.
 
OLED is crap. There will be MicroLED monitors out before OLEDs have resolved all the roadblocks for use as a computer monitor. OLED is just this generation's plasma. It's doomed to die.

If you think MicroLED monitors are right around the corner, I have a bag of hurt to sell you real cheap.

LTT recently reviewed the 2019 Razer Blade with a 15" Samsung AMOLED display and spent considerable effort analyzing the OLED panel itself, but ultimately came up with a "no" recommendation.

Some of the marks against it, such as power draw, wouldn't apply in a desktop setting anyway. Valid marks against it (to me) were the 60 Hz refresh and the overkill pixel density. The display panel itself, while beautiful, just isn't quite checking all the boxes for Linus.

Linus also mentioned something I totally agree with: within the industry there is a disconnect between display marketing and engineering. 4k'ing everything is not going to capture the interest of the desktop, much less laptop, markets given the sizes they are targeting. 4k on a 55" TV is perfectly fine, but makes no sense on something significantly smaller, say a 27" panel. And on a 15" panel, 4k is absolutely absurd. That's not the fault of OLED, but of the industry as a whole.

Basically, if they (OEMs) want to gain any ground with OLED in the desktop or laptop space, they need to stop pushing 4k so hard. What works for TV markets does not apply in alternate markets. OLED is an expensive new tech that needs all the help it can get penetrating new markets.

Also, once you've got a taste of high refresh, it's hard to justify going back. So the 60 Hz complaint is a valid one right now, especially when it's dollars at stake.

That said, I'd seriously consider a 60 Hz OLED desktop offering, knowing I'd sacrifice high refresh, as long as it offers proper pixel density (110-ish PPI) and isn't something so small that the dollars-per-inch ratio leaves a bad taste in my mouth.

There’s a whole roster of reasons why OLED is taking so long to arrive in the 27”-48” range, but chief among them:

- LG and Samsung are the only major OLED manufacturers. There are some smaller players like JOLED, but for right now they're inconsequential. Worse yet, Samsung produces small-panel OLED (mobile phones and laptops) while LG produces large-panel OLED (TVs). The two are produced differently, and each company holds manufacturing patents for its respective size range.

- For better or worse, the 27”-43” range has become the economy class of panel sizes. They come in relatively cheap TVs and monitors, but they’re also the last size manufacturers like LG and Samsung will dedicate production lines to. And right now, those production lines are busy churning out panels for mobile phones, laptops, and 55”-65” TVs. It’s all about ROI, and that’s where the money is, especially when major players like Apple, Dell, HP, etc commit to bulk orders for millions of units. Plus LG and Samsung’s own products.

- There also needs to be production capacity. In order for LG to make 40” range OLED panels, they need to start up a production line that’s dedicated to >96” mother substrates, that are then cut four ways. And it can’t share existing production lines that 55” and 65” panels are cut from. LG is doing just that, but it takes time to build new production lines while maintaining existing production capacity.

- If all this weren’t bad enough, HDMI 2.1 is taking forever to come out and realistically, PC users would instead want DP 2.0 now that it’s been announced. And it needs to be on both monitor and graphics cards. Otherwise, PC users can’t obtain the desired 4K 120Hz+ HDR 4:4:4 environment that is the holy grail right now.

TL;DR: It sucks for us desktop people. It's a bad time to be looking for a great monitor, with everything in limbo.
 
Depending on the VA panel used- just like TN, the better ones are pretty good, but the ones with compromises fall a bit short- it takes quite a bit less than 160 degrees off axis for the image to start transforming. I find with my two 31.5" VAs- one a legitimately cheap monitor and one a gaming monitor- that I have to keep my viewing position closer to the primary axis than I would for my old 30" IPS, or even the cheap 24" IPS Acers I have as side monitors.

I must not be susceptible to that, then. I've used TN panels for years and never had a problem with viewing angles unless I was looking at them from the side as I first walked into my room.

My 32" VA, I see virtually no image shift even from 160-170 degrees. I just know in reviews, the image isn't reported to shift until pretty far off to the side. Granted they're not as good as IPS in this regard but I would definitely not consider that a problem with VA panels especially when compared to TN.

VA is my favorite panel type. Every once in a while I think I need my old 27" back for 144 Hz (my HP 32Q only does 70), but every time I switch them, it doesn't take 5 minutes before I'm lugging my 32" back out. That ~3000:1 contrast ratio just makes everything look SO much better than the 900:1 of my old BenQ TN. Yes, IPS has better colors, but VA isn't far behind, and contrast in the high 2000s compared to IPS's barely 1000:1 makes the games I play look great. The only real drawback to VA is its kinda sluggish response time, but since I play only single-player games, I can live with it.
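That contrast gap translates directly into black level. Quick Python arithmetic (round-number ratios, not measurements of any specific monitor):

Code:
def black_level_nits(white_nits, contrast_ratio):
    """Static contrast = white / black, so black = white / contrast."""
    return white_nits / contrast_ratio

for name, cr in [("TN ~900:1", 900), ("IPS ~1000:1", 1000), ("VA ~3000:1", 3000)]:
    print(name, round(black_level_nits(250.0, cr), 3))

That's roughly 0.278, 0.25, and 0.083 nits at a 250-nit white: the VA black is about 3x darker, which is exactly what you see in a dim room.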
 
The only real drawback to VA is its kinda sluggish response time, but since I play only single-player games, I can live with it.

This is what kills me- I see it everywhere. Of course it's worse than IPS, but I don't consider IPS to be 'good' either; just in general, IPS panels are nicer to look at than VA or TN (I have all three in my office). Even cheap IPS panels look nicer than decent VAs, on or off axis.

Still, while I'm noting a difference here, I do run all three panel types for their respective strengths: IPS for colors (an old HP, plus one gaming monitor on a second system) and viewing angles (side monitors), VA mostly because it's most of an IPS but far cheaper and in some cases the only thing available, and TN because it's dirt cheap. I actually have a decent TN on my old gaming laptop that's almost VA-grade, and good TN panels are quite alright for general usage and can certainly be calibrated for color work too. The VA on my old TV is also decent, and it gets used as a monitor.

In all cases, none are perfect, and selection comes down to using the right tool for the job, as with your VAs and likely slower-paced games. They're perfect for that!
 
This is what kills me- I see it everywhere. Of course it's worse than IPS, but I don't consider IPS to be 'good' either; just in general, IPS panels are nicer to look at than VA or TN (I have all three in my office). Even cheap IPS panels look nicer than decent VAs, on or off axis.

Still, while I'm noting a difference here, I do run all three panel types for their respective strengths: IPS for colors (an old HP, plus one gaming monitor on a second system) and viewing angles (side monitors), VA mostly because it's most of an IPS but far cheaper and in some cases the only thing available, and TN because it's dirt cheap. I actually have a decent TN on my old gaming laptop that's almost VA-grade, and good TN panels are quite alright for general usage and can certainly be calibrated for color work too. The VA on my old TV is also decent, and it gets used as a monitor.

In all cases, none are perfect, and selection comes down to using the right tool for the job, as with your VAs and likely slower-paced games. They're perfect for that!

Totally agree. I found out a sad truth when I started researching and reading reviews of monitors... they all suck, lol!

I think TNs could be a lot better if they used glossy screens. I had an HP 25" with a glossy screen a few years ago that had gorgeous picture quality. I think the matte coating hurts TN panels a lot more than the other types. The Asus PG278QR is reviewed as having superb colors but too heavy a coating, one that can actually be seen in the colors. The new Dell S2719DGF isn't glossy, but it has a very light matte coating and is reviewed as having close-to-IPS color quality, which they attribute to the light matte coating adding pop and depth.
 
Look at this guy coming in here all hopeful and then he's kicked in the face by the state of the display industry.

Lmfao

 