LG 55" Oled to Gaming Monitor?

There isn't just one FALD monitor; there's also the ASUS PG35VQ, which has 512 dimming zones.

But have you seen the reviews? You get halo effects around a white pointer from the dimming zones.

Not that noticeable in games, but very noticeable on the desktop.

OLED's the way to go! lol

I was talking about IPS FALD; the PG35VQ is a VA panel. A high-zone-count FALD is the only way I would want to use an IPS panel. 384 or 512 zones is definitely too few; we need a 10,000+ zone FALD to really cut down on that blooming. I'm not waiting on that, though. I'll just pick up a C9/C10 LG OLED once an HDMI 2.1 GPU comes out.
 
After you use one for a while, you don't notice the blooming in actual HDR content unless you're in a static menu. There's always the option of turning the variable backlight off for SDR content, so you don't even have to worry about it all the time. The revision of the PG27UQ coming later this year is going to have 1,024 zones. OLED's per-pixel brightness control is obviously superior, but there's a limit on how bright OLEDs can get if you want the panel to last an appreciable amount of time. An LED LCD with FALD is a good compromise in the meantime, until MicroLED reaches the mainstream and comes down to a reasonable price.
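To make the zone-count point concrete, here's a rough simulation of why more zones mean less blooming. This is my own illustrative model, not something from a review: each zone's backlight is driven to the brightest pixel it covers, and the LCD layer can only block a finite fraction of that light (the 1000:1 `contrast` figure below is an assumed native panel contrast, and the zone counts are picked to divide 1080p evenly).

```python
import numpy as np

def simulate_fald(target: np.ndarray, zones_y: int, zones_x: int) -> np.ndarray:
    """Crude FALD model: per-zone backlight set to the zone's max luminance,
    then per-pixel LCD attenuation with finite native contrast."""
    h, w = target.shape
    zh, zw = h // zones_y, w // zones_x  # zone counts must divide the resolution
    # Each backlight zone lights up to the brightest pixel it covers.
    zones = target.reshape(zones_y, zh, zones_x, zw).max(axis=(1, 3))
    backlight = np.repeat(np.repeat(zones, zh, axis=0), zw, axis=1)
    contrast = 1000.0  # assumed native LCD contrast (1000:1, IPS-like)
    # The LCD can't block more than 1/contrast of whatever the backlight emits.
    lcd = np.clip(target / np.maximum(backlight, 1e-9), 1.0 / contrast, 1.0)
    return backlight * lcd

# White cursor on a black 1080p desktop; "bloom" = light where target is 0.
img = np.zeros((1080, 1920))
img[530:550, 950:970] = 1.0
for zy, zx in [(8, 16), (24, 32), (120, 160)]:  # 128, 768, 19200 zones
    out = simulate_fald(img, zy, zx)
    print(f"{zy * zx:>6} zones -> leaked light: {out[img == 0].sum():.2f}")
```

Running it, the light leaked around the cursor drops by more than an order of magnitude going from a few hundred zones to ~20,000, which is the intuition behind wanting 10,000+ zones.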
 
until MicroLED reaches the mainstream and comes down to a reasonable price.

Don't hold your breath on that one; it's not in the foreseeable future. Unless we get something like nanotech replicators, building a TV/monitor-sized MicroLED display will cost many times what OLED does.

OLED uses a modified inkjet printing technology for the cells, which is relatively inexpensive.

OTOH, MicroLED is a wafer fabrication process similar to CPU/GPU chip production, except using something like gallium arsenide wafers, which are MUCH more expensive than silicon wafers. Even worse, different-color LEDs may need different substrates. Then each individual LED needs to be individually placed. It's an expensive nightmare of production costs. Just imagine covering an entire screen in fabricated chips, and that gives you some idea of the prohibitive cost.

IMO this will be like SED for consumers. IOW, it will never get the price low enough to be practical. The exception may be something like a watch screen, if they manage to create all the colors on one substrate, so the screen can just be one big LED chip. But obviously that doesn't scale much larger, and even then it would probably be price-prohibitive versus OLED.
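Taking the "cover the screen in chips" framing at face value, here's my own back-of-the-envelope arithmetic (illustrative assumptions only, not industry data) for how much wafer area a 55" 16:9 screen would consume:

```python
import math

# Rough wafer-area arithmetic for one 55" 16:9 screen; all figures are
# illustrative assumptions, not industry data.
diag_m = 55 * 0.0254                                  # 55" diagonal in meters
w = diag_m * 16 / math.hypot(16, 9)
h = diag_m * 9 / math.hypot(16, 9)
screen_area = w * h                                   # ~0.83 m^2
wafer_area = math.pi * 0.15 ** 2                      # 300mm wafer: ~0.071 m^2
print(f"~{screen_area / wafer_area:.0f} wafers' worth of area per TV")
```

In practice the individual micro-LEDs are far smaller than the pixel pitch, so the real die usage is much lower than this, but the mass-transfer and placement step is where the cost piles up, as the post says.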
 
Material costs aside, I would never discount the ingenuity of manufacturing engineers or how desperate company CEOs are to get people to buy new TVs these days.
 

SED and Plasma are perfect counterexamples.

Plasma had a superior picture compared to LCD at a moderate price, and it was still pushed out of the market.

SED was demoed multiple times with an even better picture than Plasma, and they gave up before it even hit the market because it was going to cost too much more than LCD.

SED is dirt cheap (similar to Plasma construction) compared to what it would cost to build a MicroLED set.

Heck, we are lucky that OLED is even going forward.

I would expect dual-layer LCD to have a chance of showing up. Since it is just stacked LCDs, it might only double the price of a good LCD, and they are actually selling pro models:
https://www.eizoglobal.com/press/releases/2019/NR19_006.html

With MicroLED, they aren't even talking about pro models in monitor/TV sizes. It's either giant corporate video walls (for hundreds of thousands of dollars) or watches where the real action is.
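As a back-of-the-envelope illustration of why stacking panels is attractive (my own arithmetic, not from the Eizo release): the second LCD acts as a per-pixel backlight modulator, so the two layers' native contrasts roughly multiply, and every pixel of the modulation layer is effectively its own dimming zone.

```python
# Illustrative dual-layer LCD arithmetic; the 3000:1 figure is an
# assumed native contrast for each layer, not a measured spec.
layer_contrast = 3000
stacked_contrast = layer_contrast ** 2   # layers multiply: 9,000,000:1
zones_1080p_layer = 1920 * 1080          # ~2.07M per-pixel "zones"
zones_4k_layer = 3840 * 2160             # ~8.29M per-pixel "zones"
print(f"{stacked_contrast:,}:1 contrast, "
      f"{zones_1080p_layer:,} or {zones_4k_layer:,} zones")
```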
 

I saw a place making ~12" screens at some Chinese expo in a YouTube video that sounded pretty promising. I think TV and monitor sizes will happen, probably 10 years from now.
 
Exactly. I'm not implying that it will happen quickly at all, but some of the big players in the TV industry seem to be on board with it at the moment. It was 20 years between when the first "practical" OLED panel was produced and when the first television made it to market. It would be another 5 years before LG started producing their own and prices started to come down.
 
Anyway, bit the bullet and pulled the trigger on an ASUS XG438Q. Should be here within the next week.
 

I take it you're in Europe or Asia, where new products get flown in from China? One thing I hate about new products in the USA: they usually ship them by slooooooow boat from China.
 

Looking forward to impressions. It's still not for sale anywhere here, and I'd rather not order a display from abroad, as any warranty or return issues would be costly.
 
Yes, I am in Australia. And don't assume we get everything first; in fact, we usually get most stuff last, and only certain things come here first.

For example, we are not getting the LG 38GL950 at all. You guys will have it next month.

There were several monitors I had a few years ago that I had to order from Newegg (US) because they didn't come to Australia until a few months after the US.
 
Plasma was mainly pushed out of the market because building a 4K plasma that could be sold commercially was extremely difficult. It would also have had absolutely insane power consumption even if it was possible to build.

So really, it lost because it was inferior technology - for 4K.

Plasma died before 4K was a thing. The main problem was that it wasn't cost-efficient to make small plasma panels, and yes, those smaller sizes do sell. And given that LCD/LED got *good enough*, you have to question why you'd keep two separate production lines for two techs, one that is universal and one that is more of a fringe case.
 
Hint: 4K isn't mentioned once. It's a 2013 article, and Panasonic was one of the last plasma makers standing, rumored to exit the market (made official not long after).
Why Plasma TVs Are Dying

Plasma was dying long before 4K became an issue.
The model I posted came out in 2010 and had a resolution of 4096x2160, also known as DCI 4K. Granted, it cost £600k/$930k at time of release, but they were real and can be found on the used market today.
 

So? They made some kind of pro 4K monitor in 2010.

That doesn't mean 4K had any kind of serious impact on Plasma's decline, which started years before you could buy consumer 4K TVs.

Plasma left the market because sales were already in decline before 4K.

LCD took off, but plasma never did.

Technology needs critical mass to survive. Plasma peaked quite low in 2010 and quickly fell off afterward. LCD's domination was clear in the early 2000s:

[Chart: forecast for LCD TV, Plasma TV, and CRT TV unit shipments]


4K really only got going after the market had already decided against Plasma:

[Chart: global television production forecast, total flat-panel TVs vs. 4K TVs]
 
My issue with running 1440p 120Hz on the OLED is that you either run it 1:1 and end up with a small 1440p image in the middle of the screen, or let the TV upscale 1440p to fill the 4K 55-inch panel, and neither solution is optimal for me. If you have been using an OLED TV for over 3 years then it's probably a C6, so I wouldn't waste any money on a crappy monitor; I'd instead put that saved money toward an HDMI 2.1 Nvidia 7nm GPU and a C9/C10 OLED for 4K 120Hz.
Tell the GPU to upscale.
It reduces lag and should look a little better.
If you have a 20xx-series card, the latest beta driver lets you use integer upscaling.
There is also an app on Steam called Lossless Scaling that lets you integer-upscale on any card, on Windows 8 and up.
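For anyone unfamiliar with integer upscaling: it's plain nearest-neighbor scaling by a whole-number factor, so each source pixel becomes a sharp N×N block instead of being blurred by bilinear filtering. A minimal sketch of the idea (my own illustration, not how the driver or the app actually implements it):

```python
import numpy as np

def integer_upscale(frame: np.ndarray, factor: int) -> np.ndarray:
    """Nearest-neighbor integer upscale: every source pixel becomes a
    factor x factor block, so edges stay pixel-sharp with no blur."""
    # Repeat rows, then columns; works for (H, W) or (H, W, C) frames.
    return frame.repeat(factor, axis=0).repeat(factor, axis=1)

frame = np.zeros((1080, 1920, 3), dtype=np.uint8)  # dummy 1080p frame
print(integer_upscale(frame, 2).shape)             # (2160, 3840, 3): 1080p -> 4K

# The catch for this thread's case: 1080p -> 2160p is a clean 2x, but
# 1440p -> 2160p is 1.5x, which integer scaling can't do; 1440p only
# integer-scales to 5K (2x), so on a 4K set it has to be filtered instead.
```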
 
It's amazing how people will warp an argument just to try to prove themselves right. I didn't say, and nobody said at the beginning, that Plasma was top of the market before 4K came around. However, both Samsung and Panasonic were making premium and budget plasma lines quite happily up until 2014. I know, because I own one.

They likely would have KEPT making those lines into the future if not for 4K, because no additional investment was needed. However, when 4K happened, a massive additional investment was needed to improve the technology so that it could be used to produce 4K televisions, because the production capabilities at the time could NOT produce cost-effective 4K plasmas that consumed an acceptable amount of power. It likely wasn't _impossible_, but yes, Plasma wasn't the top seller at that time, so no billion-dollar investment was warranted.

4K is what killed Plasma. All the rest is what caused its decline to the point that it was very vulnerable to a major technology change killing it.
 

Cool story, bro.

But the graphs linked above clearly tell a different story. You can look at the peak year for Plasma, 2010, and see that it was already over.

You don't look at the day the last Plasma factory closed, note that 4K was going by then, and conclude that 4K killed Plasma.

Plasma failed by the mid-2000s, when the market took off with LCD and Plasma didn't, and companies started getting out of Plasma. This happened in 2008:

Plasma TV is Dead - Pioneer Exits

Hitachi also bailed in 2008.
Fujitsu in 2007.
Sony in 2006.
 
It's amazing how people will warp an argument just to try to prove themselves right. I didn't say, and nobody said at the beginning, that Plasma was top of the market before 4K came around. However, both Samsung and Panasonic were making premium and budget plasma lines quite happily up until 2014. I know, because I own one.

They likely would have KEPT making those lines into the future if not for 4K, because no additional investment was needed. However, when 4K happened, a massive additional investment was needed to improve the technology so that it could be used to produce 4K televisions, because the production capabilities at the time could NOT produce cost-effective 4K plasmas that consumed an acceptable amount of power. It likely wasn't _impossible_, but yes, Plasma wasn't the top seller at that time, so no billion-dollar investment was warranted.

4K is what killed Plasma. All the rest is what caused its decline to the point that it was very vulnerable to a major technology change killing it.

Panasonic canned the Plasma division in early 2014 for the explicit reason that it was losing money, and with small margins there was no real hope of the division ever turning a profit. Remember, the company was in a real bad way in the early 2010s, and their CEO cut most of the company's underperforming divisions, Plasma being one of the casualties.

And I'm speaking as someone who owns a VT60 (got it at a PC Richards clearance sale for $750; everyone was dumping them on the market by late 2014).
 
Tell the GPU to upscale.
It reduces lag and should look a little better.
If you have a 20xx-series card, the latest beta driver lets you use integer upscaling.
There is also an app on Steam called Lossless Scaling that lets you integer-upscale on any card, on Windows 8 and up.

I've tried setting the scaling to GPU in the NVCP, but honestly, I think it's just the fact that we want to scale 1440p up to a pretty large 55-inch screen; it's never going to look good enough to me compared to native 4K. I've run 1440p on my Acer X27, but because it's a measly 27 inches, 1440p on that screen looks acceptable.
 
Nothing more came out of this? It seems like the best compromise: 2 million dimming zones! Or even 8 million if the modulation layer is 4K, which I assume that beast uses.

There are places working on it. A major problem is power consumption. The Sony mastering monitor looks amazing as long as you view it straight on. It's also small and uses a shitton of power. It isn't as easy as just slapping two panels together; there's a lot they still need to work on.
 

Cost/volume issues are likely keeping it in the pro space for now. Right now they are $30,000, but even at $3,000, I doubt enthusiasts would buy many.

I suppose if they sell enough pro screens, they might reach good enough economies of scale to drop down a pricing notch, possibly with a less bright backlight (the current one requires active cooling).

I had read something about a supposed consumer TV model in China, but that seems dubious.
 
That monitor was supposed to be high-end, and if that is the quality of today's high-end monitors, then I am seriously inclined to stay with OLEDs and just upgrade to a C9.

I have a Dell P2715Q, a pretty well-regarded 4K monitor with good color accuracy. It's not edge-lit, either. As far as monitors go, I really like it.

It can't hold a candle to my LG OLED. Just not even in the same league.
 
No surprise at all, since he was just running it as a TV.

It's really only when using it as a PC monitor that fixed UI elements are at risk of burning in.

It's really common sense: if you don't have fixed elements on screen nearly all the time, then there isn't much to worry about.
 
His reasoning for how display models get burn-in makes sense. You have a TV running at full brightness for 12+ hours a day, playing the same content, and never allowed to run its pixel refresh because they just completely cut power to everything at the end of the day. I didn't have any reservations about buying one other than the price. Once a 65" C9 drops under $2k, I am in.
 
20 hours a day isn't even close to a realistic use-case scenario, either. I don't clock even close to 30 hours a week on my own OLED (that's under 1,600 hours a year, so 5 years is roughly 8,000 hours), so if his unit has gone well past 3,000 hours with zero signs of burn-in, I'm pretty sure my own unit will easily last me 5 years. Can't wait for HDMI 2.1 cards to come out so I can pick up a C9/C10.
 

I'd still feel better if, somewhere in the OS / drivers / whatever, there were some awareness of OLED tech.

I didn't pay much for my B7 55", getting it at the end of the model year, but if I shelled out for the 75" like I (unrealistically) want?

Yeah ;)
 
20 hours a day isn't even close to a realistic use-case scenario, either. I don't clock even close to 30 hours a week on my own OLED (that's under 1,600 hours a year, so 5 years is roughly 8,000 hours), so if his unit has gone well past 3,000 hours with zero signs of burn-in, I'm pretty sure my own unit will easily last me 5 years. Can't wait for HDMI 2.1 cards to come out so I can pick up a C9/C10.

I use my B6 more than you do, but we're more or less thinking the same thing: wait for HDMI 2.1 cards, then upgrade to a C9/C10 series.
 