Why is the display/monitor market such shite?

Kreon

Limp Gawd
Joined
Jul 4, 2014
Messages
137
I've been building my own PC systems for about 20 years now. Every time you need to find the right parts to suit your budget and needs, you just put in some research: check the available options, read reviews, read community feedback, and eventually a few days (or weeks, depending on your free time) of work always yields results you're pretty satisfied with. This applies to 99% of PC components and peripherals EXCEPT goddamn displays. I've never had a non-CRT display that felt like a good choice. It's been 15 years and I'm still searching for something I could be moderately happy with. The fuck is wrong with this market? Why has it been stagnating for the last decade?
 
Monitor selection, like audio gear (i.e. speakers/headphones), is very user-specific. Even when you find a model that is popular with good reviews, it gets bashed by a group of users because they don't like that type of monitor. After using TN panels for years, I finally moved on to IPS panels and wished I had done it sooner. All monitors have "issues". It's simply a matter of choosing what works best for you. CRT monitors could be nice, but they had their drawbacks as well. So in that regard, nothing much has changed.
 
Monitor selection, like audio gear (i.e. speakers/headphones), is very user-specific. Even when you find a model that is popular with good reviews, it gets bashed by a group of users because they don't like that type of monitor. After using TN panels for years, I finally moved on to IPS panels and wished I had done it sooner. All monitors have "issues". It's simply a matter of choosing what works best for you. CRT monitors could be nice, but they had their drawbacks as well. So in that regard, nothing much has changed.
And I love IPS color but can't stand the washed out blacks. So I am a VA guy. The real answer is to get microLED or OLED PC monitors - but that's not happening any time soon.
 
Because dumb people care more about super high refresh rates than quality displays. There's no reason 1920x1080 monitors should still be produced, but morons who saw some idiots playing CS at 640x480 and 2000FPS said "me too!" and manufacturers were more than happy to start overcharging for obsolete technology.
 
I remember buying and returning several CRT monitors because they had issues with focus, geometry, refresh rates, etc. that could not be fixed with settings. Displays have always been of varying quality, and CRT is no exception.

At the same time the majority of people would buy the cheapest, shittiest displays because they did not know any better despite spending a lot on good PC components. Now people just buy big, cheap TVs instead.

The higher-end displays have been made with graphics/video professionals in mind for a long time, and before high refresh rates you were usually fine just buying such a display, because input lag wasn't reviewed at all. We've come a long way from that: even TVs finally have low input lag and are starting to do higher refresh rates too. The big issue is that we were stuck on 60 Hz for so long that getting anything higher has been an expensive endeavor.

I’ve been happily on a 27" 1440p 144 Hz Gsync 8-bit TN panel for 5 years now because it has worked fine for my uses. Any better versions have not seemed worth buying and ultrawides haven’t been worth it to me either. I want something bigger and higher refresh while keeping the good things I’ve enjoyed all this time.

It’s like enthusiasts are all saying "make me a 32+ inch 4K display with 120+ Hz and VRR" and display manufacturers just say "nah, have another 27" 1440p or 34" ultrawide or 25" 1080p TN". While there is nothing wrong with those and they fit the use cases of a lot of people, there is a gap between monitors and TVs that manufacturers just refuse to fill.
 
Because dumb people care more about super high refresh rates than quality displays. There's no reason 1920x1080 monitors should still be produced, but morons who saw some idiots playing CS at 640x480 and 2000FPS said "me too!" and manufacturers were more than happy to start overcharging for obsolete technology.
Yup. When everything was 60hz, 1080p TN was considered only suitable for budget monitors. Then people started paying double for a 120hz 1080p TN with no real improvements in image quality. The same thing is now happening with 240hz. When TV manufacturers upgraded their sets to 120hz they did so with very little fanfare, no increase in price, and continued to improve performance in other areas at the same time. Monitor manufacturers can get people to upgrade on the basis of refresh rate alone, which is relatively easy to implement, so why would they invest in any further improvements?
 
IMHO, the problem is "I want" a 144Hz monitor with PIP that's 27" in size, 4K with gobs of inputs, OLED, and zero-width bezels... "and keep it under $100".

The world produces crappy monitors because we really don't want anything else. Sigh...
 
Yup. When everything was 60hz, 1080p TN was considered only suitable for budget monitors. Then people started paying double for a 120hz 1080p TN with no real improvements in image quality. The same thing is now happening with 240hz. When TV manufacturers upgraded their sets to 120hz they did so with very little fanfare, no increase in price, and continued to improve performance in other areas at the same time. Monitor manufacturers can get people to upgrade on the basis of refresh rate alone, which is relatively easy to implement, so why would they invest in any further improvements?

I haven’t looked into 1080p TN panels in ages, but are the gaming ones 8-bit or 6-bit? Because I’ve been pretty happy with the image quality of my 8-bit 1440p PG278Q, which came as a surprise, as all TN panels I had seen before it were horrible, especially with viewing angles. While the issue exists on mine, it isn’t bothersome in real use because it doesn’t shift noticeably just from changing your position in the chair a bit.
 
I've been building my own PC systems for about 20 years now. Every time you need to find the right parts to suit your budget and needs, you just put in some research: check the available options, read reviews, read community feedback, and eventually a few days (or weeks, depending on your free time) of work always yields results you're pretty satisfied with. This applies to 99% of PC components and peripherals EXCEPT goddamn displays. I've never had a non-CRT display that felt like a good choice. It's been 15 years and I'm still searching for something I could be moderately happy with. The fuck is wrong with this market? Why has it been stagnating for the last decade?

It looks more like you just have impossible standards.

Just about every person I know, myself included, never wanted anything to do with CRTs again after getting their first LCDs.

Would I like practical, affordable OLED monitors? Sure, but wishing won't make it so. I make the best of what I can get.
 
The problem is that there isn't a way to check every single possible box on a feature list and please everyone, especially not when cost is a factor. As a result, different monitors are made to appeal to a broad range of people, price points, and market segments. The people looking to buy an Alienware AW3418DW or Acer Predator X34 aren't the same crowd that wants to buy an AOC 24E1Q display. Similarly, the crowd interested in a display like that last one probably couldn't care less about ASUS' ROG XG438Q. Essentially, everyone buying a monitor wants to find the display that ticks the most checkboxes for the money they have to spend.
 
It’s like enthusiasts are all saying "make me a 32+ inch 4K display with 120+ Hz and VRR" and display manufacturers just say "nah, have another 27" 1440p or 34" ultrawide or 25" 1080p TN". While there is nothing wrong with those and they fit the use cases of a lot of people, there is a gap between monitors and TVs that manufacturers just refuse to fill.

That's because displays are a cartel. There are only a few big players, and they have a wink-wink, nod-nod agreement not to roll out any big new products. It's all two-steps-forward, one-step-back stuff rolled out at a snail's pace to keep people upgrading. If you had a 4K 32" display that could do 120 Hz, had HDMI 2.1 and DP 2.0 ports, low input lag, and a quality screen, people wouldn't need to replace it for 10+ years. No company is interested in designing a product they will only sell you one of every decade.
 
The display market is doing better than ever. Nothing is stagnating at all. CRT was good, but you have to remember that it was 100 years of progress that culminated in something like the FW900. You can get something a lot bigger and brighter for way less now. You don't get some odd features most people wouldn't miss, but that's no barrier to the proliferation of displays themselves. You can get a decent screen for every room in your house for only a few thousand dollars today.
 
Displays stagnated because the emissive technology that was supposed to replace CRT (e.g., SED/FED, OLED) didn't happen - not until OLED, in a very limited way, more recently.

Not even LCD's answer to emissive, FALD, has made the impact it should have. I remember having Samsung's original FALD 40", but not keeping it, reasoning that next year's would be even better. Instead, they did edge-lit, and for years went, let's say, a different way.

That we can now get 4K, bigger panels, multiple panels more easily, etc., is great. That we have panels whose version of "black" can light up a room, and that fail other things that used to be considered basic, is still awful...
 
Because dumb people care more about super high refresh rates than quality displays. There's no reason 1920x1080 monitors should still be produced, but morons who saw some idiots playing CS at 640x480 and 2000FPS said "me too!" and manufacturers were more than happy to start overcharging for obsolete technology.

Uh, some of us still like 120+ fps 99th-percentile frame rates in competitive shooters @ 1080p without breaking the bank on CPU/GPU; doing that at 1440p is not that easy or cheap.
 
Yup. When everything was 60hz, 1080p TN was considered only suitable for budget monitors. Then people started paying double for a 120hz 1080p TN with no real improvements in image quality. The same thing is now happening with 240hz. When TV manufacturers upgraded their sets to 120hz they did so with very little fanfare, no increase in price, and continued to improve performance in other areas at the same time. Monitor manufacturers can get people to upgrade on the basis of refresh rate alone, which is relatively easy to implement, so why would they invest in any further improvements?

As a gamer, finally giving up the CRT and having to live with 60hz LCDs was miserable. The arrival of the 120hz TN panels was a godsend. They could have looked like smeared dog shit and still commanded "double" the price, because they were THAT good to use compared to a 60hz IPS or whatever fancy panel with a bunch of input lag.

Refresh rate isn't a big deal that TV makers can charge for, because who cares? Who's specifically looking for a high-refresh-rate TV outside of a small subset of the already-niche group of "people who use a TV as their computer monitor"?

I'm not sure why you think refresh rate is relatively easy to implement. If it's so easy, where are all the 120hz+ 4k monitors at??? Panel quality aside, we have always had to pick between high resolution OR high refresh rate. You could either have potato res that feels great, or high res that feels like dragging your mouse cursor through mud.
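The resolution-OR-refresh tradeoff has a concrete cause: link bandwidth. Here's a back-of-envelope sketch (assuming 8-bit RGB and ignoring blanking overhead, so real requirements run a bit higher; the link capacities are the published effective data rates after line coding):

```python
# Back-of-envelope check: why high resolution AND high refresh rate
# strained display links for so long.

def video_gbps(width, height, hz, bits_per_channel=8, channels=3):
    """Raw pixel data rate in Gbit/s (blanking overhead ignored)."""
    return width * height * hz * bits_per_channel * channels / 1e9

# Effective data rates in Gbit/s after line-coding overhead.
links = {
    "DP 1.2 (HBR2)": 17.28,   # 21.6 raw, 8b/10b coding
    "DP 1.4 (HBR3)": 25.92,   # 32.4 raw, 8b/10b coding
    "HDMI 2.0":      14.40,   # 18.0 raw, 8b/10b coding
    "HDMI 2.1":      42.67,   # 48.0 raw, 16b/18b coding
}

need = video_gbps(3840, 2160, 120)   # 4K @ 120 Hz, 8-bit
print(f"4K120 needs ~{need:.1f} Gbit/s raw")   # ~23.9 Gbit/s
for name, cap in links.items():
    print(f"{name}: {'fits' if cap >= need else 'does not fit'}")
```

So 4K120 only barely squeezes into DP 1.4, and 1080p was the only resolution that left huge refresh headroom on older links, which is part of why high-refresh panels clustered there.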

The LCD market has always been a big shitty compromise, and I'm not going to hold my breath waiting for that to change. By the time these damn manufacturers put out a 32" 4k high refresh rate panel with G-Sync and HDR, 4k is going to be old news and I'll be wondering when one of the damn manufacturers will make a 32" 5k or 8k or whatever resolution comes next high refresh rate panel...
 
As a gamer, finally giving up the CRT and having to live with 60hz LCDs was miserable. The arrival of the 120hz TN panels was a godsend. They could have looked like smeared dog shit and still commanded "double" the price, because they were THAT good to use compared to a 60hz IPS or whatever fancy panel with a bunch of input lag.

Refresh rate isn't a big deal that TV makers can charge for, because who cares? Who's specifically looking for a high-refresh-rate TV outside of a small subset of the already-niche group of "people who use a TV as their computer monitor"?

I'm not sure why you think refresh rate is relatively easy to implement. If it's so easy, where are all the 120hz+ 4k monitors at???

The LCD market has always been a big shitty compromise, and I'm not going to hold my breath waiting for that to change. By the time these damn manufacturers put out a 32" 4k high refresh rate panel with G-Sync and HDR, 4k is going to be old news and I'll be wondering when one of the damn manufacturers will make a 32" 5k or 8k or whatever resolution comes next high refresh rate panel...

5k and 8k panels already exist. 32" wouldn't be enough for me. Part of the problem with the market is everyone's wish list is different.
 
On the one hand, I feel like monitors keep getting better, especially for gamers. I kept my 21" Diamondtron for a long time, because it could run >100hz at 1280x1024 without getting fuzzy. I won't get into issues with CRTs, but it's easy to look at the past with rose-colored glasses. I would say that the recent 27" G-Sync IPS FALD monitors are an example of a very good compromise - not perfect, but very good... at a price.
On the other hand, it is very frustrating that there aren't more "close to perfect" monitors at a reasonable price today. It seems like there's more to be had and good stuff just over the horizon, but that's been the case forever.
My "modest dream" monitor would be a 37" 4K micro LED IPS 144hz monitor at a reasonable price (sub-$1K), with G-Sync and HDMI 2.1 VRR, Dolby Vision and HDR10+, well calibrated out of the box, and without major pimples.
I do recognize that the above is not everyone's dream monitor - some want an ultrawide, some a smaller monitor, some a larger one, etc. And at $1K, it's going to sell in small quantities anyway, so there's not enough volume to really defray costs. And that's the major issue - the gaming desktops that drive sales of monitors like this are a limited market, and it's already very fragmented. Most people want a sub-$250 monitor, and there's only so much that's possible at that price point today.
Someday, though, a top-emission OLED or full-LED monitor (no burn-in concerns) could hit 1000hz without issues (or use the extra frames to fight static-image retention without a brightness impact) and be inkjet-printed for lower production costs. I think we're far, far away from that though. I'll take that $1K micro LED in the meanwhile.
 
If they can get the black levels and glow reduced or eliminated on IPS, the rest of the tech is already there. The problem is people are paying high prices for flawed tech/products. If they were to avoid overpriced garbage displays with silly marketing, we might see a commitment on the panel manufacturers' part to develop superior tech at those price levels. We don't need more pixels when the ones we have perform poorly - don't pay for them. 8K is now the buzz term; it's stupid.
 
I've been building my own PC systems for about 20 years now. Every time you need to find the right parts to suit your budget and needs, you just put in some research: check the available options, read reviews, read community feedback, and eventually a few days (or weeks, depending on your free time) of work always yields results you're pretty satisfied with. This applies to 99% of PC components and peripherals EXCEPT goddamn displays. I've never had a non-CRT display that felt like a good choice. It's been 15 years and I'm still searching for something I could be moderately happy with. The fuck is wrong with this market? Why has it been stagnating for the last decade?

I bought a Samsung CFG73-24 that I've been very happy with. https://www.samsung.com/us/computin...ming-monitor-with-quantum-dot-lc24fg73fqnxza/ The only issue I have is that its ultra-low motion blur mode won't allow you to adjust the brightness of the monitor. It's too bright for my light-controlled room. Otherwise, it's got everything that I want in a screen. Decent color accuracy (its sRGB mode is pretty spot-on; only magenta has a delta E > 3, and even that is only 4.5), decent contrast (real-world is a little north of 2000:1 - try doing that on an IPS or TN), and its pixels are fast enough to be usable at 120hz. Panel quality is great - no dead pixels. I've had it for a year now and cannot complain.
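For anyone unfamiliar with the delta E numbers reviewers quote: they're color differences measured in CIELAB space. A minimal sketch of the simplest variant (CIE76, plain Euclidean distance; reviews often use the newer CIEDE2000, which weights the terms differently, and the L*a*b* values below are purely hypothetical illustrations, not measurements of this monitor):

```python
import math

# CIE76 delta E: Euclidean distance between two colors in CIELAB space.
# Rule of thumb: values under ~3 are hard to spot in normal viewing.

def delta_e_76(lab1, lab2):
    """Color difference between two (L*, a*, b*) triples."""
    return math.dist(lab1, lab2)

target   = (60.3, 98.2, -60.8)   # hypothetical magenta reference
measured = (58.9, 95.1, -58.0)   # hypothetical panel measurement
print(f"delta E: {delta_e_76(target, measured):.1f}")
```

So a worst-case delta E of 4.5 on a single magenta patch, with everything else under 3, really is a solid showing for a gaming panel.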

I agree that CRT was a great display technology. The only reason it was truly canned was cost and ergonomics. But do remember that there were CRT monitors with shit quality too. Sony's venerable GDM-F520, which some claim to be one of the best displays ever produced, had a nasty quality control issue/design flaw in which the monitor would display ringing artifacts. It was internal to the video circuitry. So CRTs weren't immune to shit quality control either.
 
Part of the problem with the market is everyone's wish list is different.

That's true, but I find that to be a red herring of an argument. If our wishes are so different, why does every manufacturer under the sun release the same darn 1080p 144hz, 1440p 144hz, and 4K 60/120hz monitors year after year? They're the same thing in different colors. They've been able to do those resolutions at most of those speeds for years now. Where are the DCI-P3 panels? Where's the better contrast? Where are the better pixels?

That has nothing to do with different user wishes. It has to do with stagnation. I don't need another 1440p or 4K panel; it's a useless purchase if I already own those options. That's why many of us complain. I want QLED colors if they can't make OLEDs. I want FALD that doesn't cost freaking $2K.

I bet many would be appeased for a few years if we had a 27-32" 1440p FALD panel. DisplayHDR 600. A 120hz version, or 60hz with DCI-P3 if speed is an issue for color reproduction. That's EASY to make now. Make it $500-700. Where is that panel???
 
That's true, but I find that to be a red herring of an argument. If our wishes are so different, why does every manufacturer under the sun release the same darn 1080p 144hz, 1440p 144hz, and 4K 60/120hz monitors year after year? They're the same thing in different colors. They've been able to do those resolutions at most of those speeds for years now. Where are the DCI-P3 panels? Where's the better contrast? Where are the better pixels?

That has nothing to do with different user wishes. It has to do with stagnation. I don't need another 1440p or 4K panel; it's a useless purchase if I already own those options. That's why many of us complain. I want QLED colors if they can't make OLEDs. I want FALD that doesn't cost freaking $2K.

I bet many would be appeased for a few years if we had a 27-32" 1440p FALD panel. DisplayHDR 600. A 120hz version, or 60hz with DCI-P3 if speed is an issue for color reproduction. That's EASY to make now. Make it $500-700. Where is that panel???
That panel doesn't exist because the economy of scale isn't there to make it. HDR on PCs is in such a terrible state anyway, I doubt it would be worth it even if it existed. Can't imagine the QC sacrifices that would have to be made to meet that price point lmao.
 
That panel doesn't exist because the economy of scale isn't there to make it. HDR on PCs is in such a terrible state anyway, I doubt it would be worth it even if it existed. Can't imagine the QC sacrifices that would have to be made to meet that price point lmao.

And then today, in the news:

https://pcmonitors.info/asus/asus-pa27ucx-4k-ips-model-with-576-zone-mini-led-backlight/

I don't care about HDR per se. It's that DisplayHDR 600 guarantees color, contrast, and luminance improvements, so it's a great way to ensure you get what you want. Understanding an HDR signal on top of that is fine, but it's the other things I care about.

Of course, that ProArt will be like $1500. Hear me well: there is no manufacturing reason why this can't be made for $700-800. None at all. The high-zone FALD is a potential reason, but there's no reason it can't be done cheaper with fewer zones. These are not new, bleeding-edge technologies. I suspect heat is the issue, and manufacturers refuse to make panels that are deeper/fatter for adequate cooling.
 
FALD is a decent substitute but nothing compares to a display that can natively display high contrast. It's similar to projectors with dynamic irises. Sure, they can technically pump out high contrast numbers, but the minute you see something that can natively do high contrast without resorting to tricks, it's hard to accept the substitute. :)
 
FALD is a decent substitute but nothing compares to a display that can natively display high contrast. It's similar to projectors with dynamic irises. Sure, they can technically pump out high contrast numbers, but the minute you see something that can natively do high contrast without resorting to tricks, it's hard to accept the substitute. :)

Well, yeah. MicroLED is the goal, but that's 5-10 years out, easily. FALD and possibly MiniLED could be affordable in the next 3 years (but probably won't be).
 
And then today, in the news:

https://pcmonitors.info/asus/asus-pa27ucx-4k-ips-model-with-576-zone-mini-led-backlight/

I don't care about HDR per se. It's that DisplayHDR 600 guarantees color, contrast, and luminance improvements, so it's a great way to ensure you get what you want. Understanding an HDR signal on top of that is fine, but it's the other things I care about.

Of course, that ProArt will be like $1500. Hear me well: there is no manufacturing reason why this can't be made for $700-800. None at all. The high-zone FALD is a potential reason, but there's no reason it can't be done cheaper with fewer zones. These are not new, bleeding-edge technologies. I suspect heat is the issue, and manufacturers refuse to make panels that are deeper/fatter for adequate cooling.

Using FALD in desktop mode is pointless. The zones are always going to be too large and cause halo/dimming where you don't want it. FALD is mainly useful in games/entertainment, and even then it is still subject to artifacts without a large number of zones and a very good algorithm.

OLED monitors will be the real leap forward when they arrive for Desktops. Already available in Laptops.

MicroLED (AKA one LED per subpixel) is a pipe dream for monitors. It's not happening even in a decade. It is only showing up potentially in watches or other similarly sized display units, where you can fit the whole display on an active substrate (built with wafer fabrication), or on VERY large and expensive displays where the pitch is large enough, and the cost high enough, to place the individual LEDs.
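The "one LED per subpixel" framing makes the scale of the problem easy to compute - a minimal sketch, assuming a standard RGB stripe layout (3 subpixels per pixel):

```python
# Rough emitter count for a true MicroLED monitor: one LED per subpixel,
# assuming an RGB stripe layout (3 subpixels per pixel).

def emitter_count(width, height, subpixels_per_pixel=3):
    """Number of individual LEDs needed for a given pixel grid."""
    return width * height * subpixels_per_pixel

resolutions = {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K": (3840, 2160)}
for name, (w, h) in resolutions.items():
    print(f"{name}: {emitter_count(w, h):,} LEDs to place and drive")
# 4K works out to roughly 25 million individual emitters.
```

Tens of millions of discrete emitters per monitor is why this only pencils out today on wafer-scale micro-displays or on huge, coarse-pitch walls.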
 
MicroLED (AKA one LED per subpixel) is a pipe dream for monitors. It's not happening even in a decade. It is only showing up potentially in watches or other similarly sized display units, where you can fit the whole display on an active substrate (built with wafer fabrication), or on VERY large and expensive displays where the pitch is large enough, and the cost high enough, to place the individual LEDs.

I can see that, but if MicroLED is going to be that unfeasible (which I don't take for granted), then you'd better hope for MiniLED. A 1000-zone panel wouldn't be that bad at 27-32" sizes - that'd be a 20 x 50 LED array, not perfect but not bad - roughly a .64" gap - workable for games, horrible for moving your mouse around (maybe FALD could be disabled for desktop use?). I agree that FALD on the desktop is pointless, but a monitor is a monitor, so you'll use it for regular desktop AND games. OLED would be fantastic... but while burn-in remains an issue, forget it. That taskbar will be burnt in within 3 months.
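The ~.64" figure above checks out as a rough average of the horizontal and vertical pitches - a quick sketch, assuming a 32" 16:9 panel and a 50-wide by 20-tall zone grid (a 27" panel just scales down proportionally):

```python
import math

# Zone-pitch estimate for a hypothetical 1000-zone FALD backlight
# on a 32" 16:9 panel, arranged as a 50 x 20 grid (50 * 20 = 1000).

def panel_dimensions(diag_in, aspect_w=16, aspect_h=9):
    """Panel width and height in inches from diagonal and aspect ratio."""
    d = math.hypot(aspect_w, aspect_h)          # diagonal in aspect units
    return diag_in * aspect_w / d, diag_in * aspect_h / d

w, h = panel_dimensions(32)
cols, rows = 50, 20
print(f"panel: {w:.1f} x {h:.1f} in")                    # ~27.9 x 15.7 in
print(f"zone pitch: {w/cols:.2f} x {h/rows:.2f} in")     # ~0.56 x 0.78 in
```

A backlight zone roughly half an inch to three-quarters of an inch across is why a 1000-zone grid is fine for game content but still halos visibly around a mouse cursor on a dark desktop.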
 
Well, yeah. MicroLED is the goal, but that's 5-10 years out, easily. FALD and possibly MiniLED could be affordable in the next 3 years (but probably won't be).

That's the biggest issue I have with PC monitors. Ever since the demise of CRT, monitors have looked like ass, and the true replacements are always just a few years out. :) Anyways, I'm hoping MicroLED actually becomes a thing. I have a feeling it will be.
 
I can see that, but if MicroLED is going to be that unfeasible (which I don't take for granted), then you'd better hope for MiniLED. A 1000-zone panel wouldn't be that bad at 27-32" sizes - that'd be a 20 x 50 LED array, not perfect but not bad - roughly a .64" gap - workable for games, horrible for moving your mouse around (maybe FALD could be disabled for desktop use?). I agree that FALD on the desktop is pointless, but a monitor is a monitor, so you'll use it for regular desktop AND games. OLED would be fantastic... but while burn-in remains an issue, forget it. That taskbar will be burnt in within 3 months.

OLED is the answer. People with rose-colored glasses about CRT also seem to forget that CRTs had burn-in as well, and people coped. CRT is why they invented the "screen saver".

You just need to return to the coping strategies of the CRT days. Auto-hide the taskbar. Use a screen saver. Clean your desktop and use a black background (or rotate a lot of different pictures). Use "dark mode" in common applications, etc...

It will involve some adjustments, but I think it would be worth it.
 
OLED is the answer. People with rose-colored glasses about CRT also seem to forget that CRTs had burn-in as well, and people coped. CRT is why they invented the "screen saver".

You just need to return to the coping strategies of the CRT days. Auto-hide the taskbar. Use a screen saver. Clean your desktop and use a black background (or rotate a lot of different pictures). Use "dark mode" in common applications, etc...

It will involve some adjustments, but I think it would be worth it.

Not to drag this into a debate, but most people who hang on to their CRTs aren't delusional people with rose-colored glasses. It has taken a long time to get to where LCD monitors finally match or exceed CRT (mostly...). The top-end CRT monitors still possess enough visual prowess to warrant holding onto them. Believe me, if OLED monitors were more available, you'd see more CRT users jumping ship. But in the current state of affairs, LCD still has too many compromises that render it unsuitable as a CRT replacement.

Most CRT diehards surrender them for other reasons (their monitors die, or, if you're me, your living situation makes owning one untenable).
 
Top-emission OLEDs would make burn-in a non-issue even for PC productivity use. I’ve been using my Samsung OLED Windows tablet for almost 3 years with not even the slightest hint of image retention. I’ve had LCD monitors with much worse IR than this display. If someone wants to run 1000 nits displaying a static logo 24x7, that’s still going to be an issue, but again it will be a minimal issue with JOLED panels, for example, or the 15” 4K panels that Samsung (I believe) has been supplying to Dell and others for laptop use.
 
Using FALD in desktop mode is pointless. The zones are always going to be too large and cause halo/dimming where you don't want it. FALD is mainly useful in games/entertainment, and even then it is still subject to artifacts without a large number of zones and a very good algorithm.

OLED monitors will be the real leap forward when they arrive for Desktops. Already available in Laptops.

MicroLED (AKA one LED per subpixel) is a pipe dream for monitors. It's not happening even in a decade. It is only showing up potentially in watches or other similarly sized display units, where you can fit the whole display on an active substrate (built with wafer fabrication), or on VERY large and expensive displays where the pitch is large enough, and the cost high enough, to place the individual LEDs.


Why do you think OLED will "arrive for desktops"?? There is no indication from a single manufacturer that this is in the pipeline, and we already know how slow the monitor industry works even when a monitor is actually announced. No smaller and more practical OLED panels are even slated for production, only LG's 48" which is a TV and barely more suitable than a 55". The best you can hope for is some sort of R&D going on behind closed doors, but that's wishful thinking at best (unless you know something we don't). Furthermore, the issue with burn-in is very much real, albeit on a panel lottery basis, and you can bet manufacturers are well aware of this... they'd have a warranty return nightmare on their hands unless they can somehow improve this issue and be more certain panels won't suffer from it (probably unlikely given the way many people use monitors). At the end of the day, there's just very little incentive to do it at present, despite what enthusiasts want.

We know MicroLED has millions being channeled into R&D from the likes of Apple and Samsung... correctly, as you say, for smaller devices, but that's to be expected. Samsung envisions it as a far bigger product eventually, as they've demonstrated with 'The Wall', so there's every reason to think that will eventually trickle down to more practically sized screens given MicroLED's potential upsides over OLED. Yes, this is many years away (a decade could be right), but I don't know why you'd think OLED monitors would appear much sooner given the complete lack of any indication they will. It would be nice if they did (with some assurance burn-in would be covered under warranty), but I don't see it happening in a time frame that anyone will be happy with, if ever.

In the relative short term, when GPUs feature HDMI 2.1, and if/when Nvidia support VRR with it (and AMD can compete with them at the top end at 4K), people will at least be able to enjoy some of the PC gaming benefits on bigger OLED TV's. That's as much as we can look forward to for now.
 
Why do you think OLED will "arrive for desktops"?? There is no indication from a single manufacturer that this is in the pipeline, and we already know how slow the monitor industry works even when a monitor is actually announced. No smaller and more practical OLED panels are even slated for production, only LG's 48" which is a TV and barely more suitable than a 55". The best you can hope for is some sort of R&D going on behind closed doors, but that's wishful thinking at best (unless you know something we don't). Furthermore, the issue with burn-in is very much real, albeit on a panel lottery basis, and you can bet manufacturers are well aware of this... they'd have a warranty return nightmare on their hands unless they can somehow improve this issue and be more certain panels won't suffer from it (probably unlikely given the way many people use monitors). At the end of the day, there's just very little incentive to do it at present, despite what enthusiasts want.

We know MicroLED has millions being channeled into R&D from the likes of Apple and Samsung... correctly, as you say, for smaller devices, but that's to be expected. Samsung envisions it as a far bigger product eventually, as they've demonstrated with 'The Wall', so there's every reason to think that will eventually trickle down to more practically sized screens given MicroLED's potential upsides over OLED. Yes, this is many years away (a decade could be right), but I don't know why you'd think OLED monitors would appear much sooner given the complete lack of any indication they will. It would be nice if they did (with some assurance burn-in would be covered under warranty), but I don't see it happening in a time frame that anyone will be happy with, if ever.

OLED is already here in laptops, it's already here in TVs, and it's already here in professional monitors. There is no technological limit preventing its use in consumer monitors. Burn-in will not be warrantied. The real issues are coming up with a business case for an OLED monitor that can sell enough volume at the required price, and consumer education on burn-in.

MicroLED is not here in ANY of those applications. It's nothing more than a pipe dream. "The Wall" is not really MicroLED; the pixel pitch is so large it's more like Mini-LED. It's NOT going to trickle down. You need a whole different type of technology for a monitor.

Dual-layer LCD at least exists in the relevant size ranges, so that theoretically has a chance of trickling down. MicroLED? Nada.
 

Where is OLED in professional monitors? I don't mean small ones for ridiculously niche use in medical scenarios etc. (where their exorbitant price can be endured), but rather something that has practical use for the PC enthusiast community. And that's not laptops either; that's a completely different market. Until you have a MAJOR monitor/panel manufacturer talking about OLED in a meaningful way for actual PC use (not TVs etc.), saying it will "arrive for desktops" is merely wishful thinking, because there is ZERO indication it ever will. You wanting it to, or making your own arguments as to why it should, even if you're correct, sadly doesn't make it any closer to a reality.

If you think those conversations about a business case haven't gone on already, you'd be mistaken... their conclusion: there wasn't one, which is why, in the 10 years since OLED has been around, no such product has even come close to existing.

MicroLED is the pure definition of nascent tech... it wouldn't even be correct to say we're in the early days... the 'days' haven't even begun. But to dismiss it now as a pipe dream would be daft and quite short-sighted. It likely is a solid decade away, though.
 

OLED started appearing in devices in the early 2000s, though at negligible volume until smartphones took off around 2010.

We have only had OLED TVs since 2013. We have only had OLED laptops since 2016, and this is exactly the kind of sensible progression you see in technology, rolling out where it makes most sense first.

It only makes sense that it would come last to monitors. I think I pretty much predicted that back in 2008, when people were already saying "wait for OLED".

We are finally getting close to getting OLED monitors. A little more experience with OLED laptops and someone will make that leap.

MicroLED is not even where OLED was in the early 2000s. It's nothing more than a pipe dream.

There are really only 3 contenders for very-high-contrast desktop monitors in the next decade: FALD LCD, dual-layer LCD, and OLED.

Edit: here is a pro OLED. It's not ridiculously tiny. It's 4K, 30":
 


Again, there's nothing being said or developed (that we're aware of), no industry talk, no rumours, not a single shred of info that suggests OLED is making its way to monitors anytime soon. If they announced an OLED monitor tomorrow, it would be two years away minimum and cost silly money (just looking at the price of LCDs tells you that much). A £30k 30" 4K OLED, two years ago... I think it speaks volumes that we've heard nothing about it since.

I share your enthusiasm for the technology, but it's obvious that it isn't arriving in practical desktop monitors at an affordable price anytime soon. I'm not saying it never will, but I'll be surprised if there's an affordable (and it HAS to be that) circa-30" desktop OLED monitor (with all the features we want) within the next 5 years.
 
Phones have OLED screens, people use 55" OLED TVs as desktop monitors, laptops have OLED displays, watches have OLED, even motherboards have OLED displays on them... I think OLED gaming monitors will eventually come. It's just a matter of time, technology, and cartel consensus. OLED monitors will kill TN, IPS, and VA monitors the same month they hit the shelves. And those panel technologies are not just ancient; whole industries are built around them. The owners and other participants can't just switch everything over to making awesome OLED monitors - who would wait months for a shitty IPS screen with piss-yellow IPS glow in the corners stretching almost to the center of an HDR 400 compatible panel? No one. Everyone would just go and buy an OLED monitor. Even a 75 Hz FreeSync or G-Sync compatible OLED would suit many people better than the shitload of 144 Hz+ IPS and VA panels that can't keep up with their stated refresh rates anyway because of artifacts.
 

Yup, they would kill all the competition... at the right price, and that's a big hold-up for one. They can't come in at LESS than LCD prices, as that would kill an entire industry with millions (if not billions) tied up in LCD production... and pricing them higher than top-end LCD monitors would put them out of reach for most.

Question is, when will this change? Many years for sure, so for now we must endure LCD, and I don't see it going anywhere for a long time. Monitors are getting more expensive and people keep buying them. Where's the incentive for manufacturers to develop OLED monitors? They're not doing it yet, and as previously mentioned, they aren't even talking about it or seemingly excited at the prospect. This needs to change, and from that point you can expect the first product on shelves perhaps 3-4 years later, allowing for R&D, promotion, scaling down of LCD production to make room for OLED at the right price, business restructuring, etc. It's a BIG shift, because as you say, it would pretty much kill LCD. These companies are just far too invested in LCD to change direction right now... something needs to make them do it, or the business case needs to be incredibly persuasive... and it just isn't yet. A few people exclaiming their love for OLED on forums such as this won't do the trick, unfortunately.
 
OLED still has low yields and its own technical issues - look at the DSE problems with LG panels and the other mura effects that have plagued OLED panels since the beginning. For PC monitors I assume the standard is going to be even higher, since there is simply no viable lower-end market for these panels. Your typical enthusiast who simply cannot unsee even minor quality issues is not worth catering to at this point; in fact, manufacturers are actively avoiding that market segment for good reason. Still, a good sign that development is making progress is how low prices are getting for OLED TVs nowadays - they are often on sale for much less than MSRP and are approaching the three-figure mark at the lower end. It's better that they don't waste time and money catering to the whims of the unprofitable monitor market and instead garner as much as they can from the big driver of adoption: televisions and other mass-market items.
 

BTW, Amazon UK shows stock of the 22" Asus ProArt OLED:
https://www.amazon.co.uk/ASUS-PQ22U...7d214fa624a9c8094b5ea61c8b47a6&language=en_GB

I am neither optimistic nor pessimistic about upcoming technology. I am just assessing the trend lines.

When we were talking about OLED monitors 5 and even 10 years ago, and I said "not in the foreseeable future", I was called a pessimist. But back then there was no trend in that direction. It's obvious the most difficult challenge is desktop monitors, and they would only show up there AFTER TVs and laptops, which we didn't have back when these debates first started.

Now we do: we have multiple OLED laptop models this year using a 4K 15.6" panel that is reasonably priced and kills every other display, and LG is reportedly making a slightly smaller 48" OLED TV, so we have OLED consumer products converging from both sides, along with various pro models in between.

Now it is looking inevitable that there will be a consumer OLED monitor within 5 years at the latest. That isn't optimism; it's just extrapolating the trends.

This is also about people jumping onto the next pipe dream: MicroLED. MicroLED is nowhere near where OLED was when I said "not in the foreseeable future" about a decade ago.

FALD, OLED, and dual-layer LCD are the only contenders in the foreseeable future.
 
Meh, I don't know how you guys can drop $3-4k on displays these days. I can't imagine spending that much and then dealing with backlight bleed and dead/stuck pixels, which are a problem regardless of price. I remember dropping $3k+ on high-end monitors in the mid 2000s and being deeply unsatisfied with having to deal with dead pixels, backlight bleed, and other defects. That, and the fact that modern electronics just don't last very long in general, so you're really paying up the wazoo for a few standout niche features, and you'll have to start the search all over in a few years when your display starts croaking and you're out of warranty. These days I just give myself an internal $700-800 price cap and search for the features I want; I'm less bothered by the manufacturing defects at that price.
 