Gsync 4k 144hz HDR monitor prices 'released'

I wonder what ever happened to Panasonic's layered LCD panel tech. It always seemed like a better design than FALD to me: essentially two LCD panels stacked on top of each other, so instead of 1000:1 contrast you had 1,000,000:1. I had thought they released an (expensive) professional monitor with this tech, but I can't find the model number.

It's obviously got issues, including requiring a super-bright backlight to compensate for the light loss through two panels, potential problems syncing the two panels, and so on. But it still seemed like a more scalable solution than just endlessly increasing the density of the backlight. Maybe it proved impractical for some reason.
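To put rough numbers on that (my own back-of-envelope figures, not Panasonic's specs):

```python
# Back-of-envelope for a dual-layer LCD: contrast ratios multiply because
# light leaking through the first panel's "black" is attenuated again by
# the second panel. All figures here are illustrative assumptions.

panel_contrast = 1000               # assumed per-panel static contrast (IPS-ish)
stacked_contrast = panel_contrast * panel_contrast
print(f"{stacked_contrast:,}:1")    # 1,000,000:1

# The downside: each panel also eats light when fully open. If a single
# panel transmits ~5% of the backlight, two in series pass only ~0.25%,
# so the backlight needs to be roughly 20x brighter for the same peak.
transmittance = 0.05                # assumed per-panel transmittance
stacked_transmittance = transmittance ** 2
backlight_boost = 1 / transmittance
print(round(stacked_transmittance, 4))   # 0.0025
print(backlight_boost)                   # 20.0
```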

E: AH, it did make it into the flagship Eizo HDR Reference Monitor that just came out for $31K. Seems reasonable. :D
 
I wonder what ever happened to Panasonic's layered LCD panel tech. It always seemed like a better design than FALD to me: essentially two LCD panels stacked on top of each other, so instead of 1000:1 contrast you had 1,000,000:1. I had thought they released an (expensive) professional monitor with this tech, but I can't find the model number.

It's obviously got issues, including requiring a super-bright backlight to compensate for the light loss through two panels, potential problems syncing the two panels, and so on. But it still seemed like a more scalable solution than just endlessly increasing the density of the backlight. Maybe it proved impractical for some reason.

E: AH, it did make it into the flagship Eizo HDR Reference Monitor that just came out for $31K. Seems reasonable. :D

A price like that makes it clear why everyone else is puttering around with segmented backlights.

Even within Eizo's core customer base, they're only going to find a handful of customers willing to buy even one or two of them.
 
Or simply not worry about all of this backlight bandaid nonsense and go OLED. These IPS FALD monitors are going to look terrible in dark scenes.
 
Or simply not worry about all of this backlight bandaid nonsense and go OLED. These IPS FALD monitors are going to look terrible in dark scenes.
I agree about the IPS issues, but we can't ignore the issues OLEDs face either.
 
I'd be curious if this could be somewhat mitigated via software, say if the cursor were ignored by the FALD.
 
I'd be curious if this could be somewhat mitigated via software, say if the cursor were ignored by the FALD.
Impossible unless you want to add back in all the lag they got rid of. FALD lag can be a full second or worse on some panels.
 
Or simply not worry about all of this backlight bandaid nonsense and go OLED. These IPS FALD monitors are going to look terrible in dark scenes.

OLED feels like "Plasma 2.0" to me with both the pros and hopefully diminishing cons. I feel like it needs another year or two to really settle out.

Obviously FALD and some of this other stuff isn't going to be the magic cure some of us had hoped it would be. At least for now.




If you're referring to burn-in, that's an issue you can actually avoid. With IPS FALD blooming, though, there's no way to avoid it, at least not with 384 dimming zones.

I'm just amazed we're even discussing burn-in and image retention a hop and skip from the year 2020 for cripes sake.
 
This is why I got my 1440p 144Hz 27" for $300; 144Hz 4K won't be touchable for years unless your pockets are deep.
 
OLED feels like "Plasma 2.0" to me with both the pros and hopefully diminishing cons. I feel like it needs another year or two to really settle out.

Obviously FALD and some of this other stuff isn't going to be the magic cure some of us had hoped it would be. At least for now.
Micro LED looks promising. I think OLEDs are in a good place right now, and with others entering the market, prices should fall a little quicker. I just hope they'll be able to get peak brightness higher on OLED panels while continuing to improve on image burn-in and retention.
 
Looks like a generation of Fast Sync will have to do; they won't sell too many of those tiny monitors at that price.

Call me when a 34" 21:9 3440 G-Sync hits $1000.
 
I agree about the IPS issues, but we can't ignore the issues OLEDs face either.

What issues? If you are referring to "burn in", that only happens under extreme cases like watching CNN 8 hours a day. I play multiple hours of PUBG on my OLED every day with zero issues.

Micro LED looks promising. I think OLEDs are in a good place right now, and with others entering the market, prices should fall a little quicker. I just hope they'll be able to get peak brightness higher on OLED panels while continuing to improve on image burn-in and retention.

Higher peak brightness? My C8 already reaches 950 nit highlights in HDR and I almost go blind in my light controlled room. Crazy high peak brightness is totally over-rated (especially when it's next to absolute black and infinite contrast ratio).
 
MicroLED is the only tech that doesn't have a glaring flaw, but it'll be 15 years before it's in a sub-$1000 PC gaming display at the rate this shit industry's going.
 
Yeah I don't really see any worrying issues with oled. Burn in is a potential issue, but easily avoidable.
The only thing stopping me from using one on my main pc is lack of gsync.
 
What issues? If you are referring to "burn in", that only happens under extreme cases like watching CNN 8 hours a day. I play multiple hours of PUBG on my OLED every day with zero issues.



Higher peak brightness? My C8 already reaches 950 nit highlights in HDR and I almost go blind in my light controlled room. Crazy high peak brightness is totally over-rated (especially when it's next to absolute black and infinite contrast ratio).
The 8-series is out? I haven't seen the reviews, yet. The 7-series was showing an average around 700 nits in reviews.
 
I just saw on tftcentral that AUO is going to make a version of this monitor without the HDR shit, and I think it'll be a much better buy.

HDR is completely overrated. It really isn't a big deal.
 
The 8-series is out? I haven't seen the reviews, yet. The 7-series was showing an average around 700 nits in reviews.

Yes, I've had mine for about a month. Up to 944 nit HDR highlights:

https://www.rtings.com/tv/reviews/lg/c8

I just saw on tftcentral that AUO is going to make a version of this monitor without the HDR shit, and I think it'll be a much better buy.

HDR is completely overrated. It really isn't a big deal.

I'll definitely disagree there. HDR on OLED looks amaze balls.
 
OLED: Does Sony give you something over LG? (Video processing for example?) I know they use the same panels.

Don't take my comments from my previous post wrong. I'm eager for OLED's success. Hell, I'm eager for anything's success. I'm just a pretty damned good fence sitter. ;)
 
Burn-in is an issue for me as I would not use the panel solely for gaming. For productivity, there are days that I stare at a single interface with a separate spreadsheet/word doc open for 8+ hours. Then there are things like the taskbar that I would rather not have to hide. I want a panel that just works, not one that I have to consciously make decisions to not ruin. Again, just my opinion.
 
People posting here need to update themselves before spreading inaccurate FUD, especially on OLEDs.

LG's 2018 OLEDs are almost pushing 1000nits in the 2-10% windows. They've been widely reviewed. They're now shipping.

I'm so sick of hearing about image retention and burn in for OLED. Can it happen? Sure. But please stop using Rtings' torture test as your basis for such comments. That test isn't grounded in reality, or usage that's even close to being typical. If you don't own an OLED, then just don't comment on it.

People's use cases also differ. There are people here like Vega who've been using OLED for PC purposes for months; this has been known. His experiences should assuage any concerns, at least for gaming.

There's people like myself who own a 55" C7, and I watch cable news on it sometimes for hours each night. No burn in. No image retention. I don't have the set up to use it as my PC monitor, but I wouldn't have any qualms about it were I to do so. If I don't end up getting some sort of 4K VA monitor, I'll be going for an OLED in 2019.
 
Burn-in is an issue for me as I would not use the panel solely for gaming. For productivity, there are days that I stare at a single interface with a separate spreadsheet/word doc open for 8+ hours. Then there are things like the taskbar that I would rather not have to hide. I want a panel that just works, not one that I have to consciously make decisions to not ruin. Again, just my opinion.

True, I would not buy an OLED TV for eight hours of spreadsheets per day. Any $100 monitor could do that. If you can afford $2500 for an OLED for gaming and movies, you can afford $100 more for a spreadsheet LCD.
 
True, I would not buy an OLED TV for eight hours of spreadsheets per day. Any $100 monitor could do that. If you can afford $2500 for an OLED for gaming and movies, you can afford $100 more for a spreadsheet LCD.
Fair point, but I would like to, one day, be able to use an OLED-type screen for daily PC work without having to change my workflow. Why can't we point out the flaws of OLEDs if we're also pointing out the flaws of FALD LCDs? Whether or not you're willing to work around a flaw, it is still a flaw. That said, I would have a hard time not buying a 40" 4K OLED if LG put one out tomorrow.
 
Fair point, but I would like to, one day, be able to use an OLED-type screen for daily PC work without having to change my workflow. Why can't we point out the flaws of OLEDs if we're also pointing out the flaws of FALD LCDs? Whether or not you're willing to work around a flaw, it is still a flaw. That said, I would have a hard time not buying a 40" 4K OLED if LG put one out tomorrow.

Because you can at least work around the OLED flaw. Can you work around the blooming problem on this FALD LCD? Nope. Not unless you turn off FALD but then what's the point?
 
Because you can at least work around the OLED flaw. Can you work around the blooming problem on this FALD LCD? Nope. Not unless you turn off FALD but then what's the point?
I agree that FALD is pointless on monitors if they can’t create enough zones that blooming is not an issue. The other issue is the ridiculous price they want to charge for this flawed tech. I imagine that it was difficult to engineer the product, hence the delays, but it doesn’t appear to be a good idea to release this at the rumored price point unless they feel they can considerably reduce manufacturing costs relatively quickly; and thus reduce the purchase price.
 
I really want HDR, but they are making it so fucking pricey for PC users to get a good set of future-proof features in one monitor.
I am going to have to wait until this comes down a bit. Really can't justify the splurge when I only got my 4K about a year and a half ago.
 
Impossible unless you want to add back in all the lag they got rid of. FALD lag can be a full second or worse on some panels.

Can you explain what you mean here?

At least to my understanding the LEDs in the array behind adjust depending on the content to achieve a higher range of brightness levels for specific spots. If the array in the back were to just ignore the cursor then wouldn't the LEDs not adjust in brightness where it was?

Because you can at least work around the OLED flaw. Can you work around the blooming problem on this FALD LCD? Nope. Not unless you turn off FALD but then what's the point?

Not everyone is invested in one technology or the other. For me, anyway, neither seems ready.

In terms of OLEDs, I'm like the other poster somewhat: I want one general-usage main display for everything that I can forget about. The higher the price of the display, the longer I'd want it to last without issues.
 
I bought a 65" E6 LG OLED a couple of years ago now. It is without doubt the best TV I have ever seen. I'm still blown away by it when watching 4K movies or playing games like The Witcher 3. In fact, I was so impressed that I went out and bought the 65" C7 for my second lounge. I can't see myself getting anything else until some new tech comes along. OLED rules at this point in time, no question about it.
 
If the array in the back were to just ignore the cursor then wouldn't the LEDs not adjust in brightness where it was?

If the array ignores the cursor then you wouldn't be able to see the mouse pointer on a black field, because the FALD would be completely off and no backlighting means no visible pixels.

If you're asking if it's possible in software to reduce brightness of the mouse pointer so that the FALD doesn't go 'omgwtf 1000 nits to display a mouse pointer', and instead display a lower brightness mouse pointer, then sure.... But haloing isn't about the mouse pointer, that's just a symptom. It wouldn't help you with halos around any other bright objects or images, which is the real issue... there aren't even mouse pointers in most first and third person games and certainly none in movies or tv shows. The mouse pointer itself is just a minor example of the fundamental problem.
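To make the "symptom vs. fundamental problem" point concrete, here's a toy model (entirely hypothetical zone sizes and values, nothing measured from a real panel) of why any small bright object lights up its whole zone:

```python
# Toy FALD model: each backlight zone is driven by the brightest pixel it
# covers, so a single bright pixel (cursor, star, specular highlight)
# forces its entire zone on -- that raised zone is the halo.
# Hypothetical 8x8 pixel frame, 4x4-pixel zones; values are target nits.

ZONE = 4  # assumed zone size in pixels

def zone_levels(frame):
    """Backlight level per zone = max target luminance inside the zone."""
    h, w = len(frame), len(frame[0])
    levels = {}
    for zy in range(0, h, ZONE):
        for zx in range(0, w, ZONE):
            levels[zy // ZONE, zx // ZONE] = max(
                frame[y][x]
                for y in range(zy, zy + ZONE)
                for x in range(zx, zx + ZONE)
            )
    return levels

# All-black frame except one 1000-nit "cursor" pixel in the top-left zone:
frame = [[0.0] * 8 for _ in range(8)]
frame[1][2] = 1000.0

print(zone_levels(frame))
# {(0, 0): 1000.0, (0, 1): 0.0, (1, 0): 0.0, (1, 1): 0.0}
# The cursor's zone goes to full brightness, and every black pixel in it
# now has 1000 nits of backlight to block; the other zones stay dark.
```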
 
If the array ignores the cursor then you wouldn't be able to see the mouse pointer on a black field, because the FALD would be completely off and no backlighting means no visible pixels.

If you're asking if it's possible in software to reduce brightness of the mouse pointer so that the FALD doesn't go 'omgwtf 1000 nits to display a mouse pointer', and instead display a lower brightness mouse pointer, then sure.... But haloing isn't about the mouse pointer, that's just a symptom. It wouldn't help you with halos around any other bright objects or images, which is the real issue... there aren't even mouse pointers in most first and third person games and certainly none in movies or tv shows. The mouse pointer itself is just a minor example of the fundamental problem.

Is it actually being turned completely off even for a completely black image? I thought even for these it's just being dimmed with black being shown still via the LCD layer just filtering out remaining light. Doing a quick search on this for FALD TVs at least does not seem to indicate they achieve complete blackness like OLEDs which do completely turn off.

I'm just discussing more mitigation in terms of the effect. I don't expect a software solution to solve things just alleviate the issue. At least going by the video shown in the previous page the person noted that it was most noticeable in certain circumstances such as in that case with the cursor over the dark background but not noticeable in regular game play.
 
FALD seems like a terrible fit for anything with bright, well-defined objects on screen. I wonder if it's possible to just have an even backlight setting so it acts like a regular backlight; you'd get the brightness needed for HDR but not the anomalies of FALD. It's a compromise, but IMO black levels are hugely overrated. I don't care if things are exactly black as long as there is good contrast overall. I like how the HDR on my Samsung TV looks.

I get that OLED would be better but it's not very useful constantly bringing it up when no manufacturer makes OLED screens in anything but phone and large TV sizes afaik. I'd be interested if I could get an OLED in well under 40" size.
 
Can you explain what you mean here?

At least to my understanding the LEDs in the array behind adjust depending on the content to achieve a higher range of brightness levels for specific spots. If the array in the back were to just ignore the cursor then wouldn't the LEDs not adjust in brightness where it was?
How will the monitor know the white spot on the screen is the mouse cursor? How will the video card communicate back to the monitor where the mouse cursor is?
 
How will the monitor know the white spot on the screen is the mouse cursor? How will the video card communicate back to the monitor where the mouse cursor is?

I did say software solution as in possibly requiring the software itself to be designed with FALDs in mind. Hypothetically the game itself would have coding in it that classifies UI elements differently.
 
Is it actually being turned completely off even for a completely black image? I thought even for these it's just being dimmed with black being shown still via the LCD layer just filtering out remaining light. Doing a quick search on this for FALD TVs at least does not seem to indicate they achieve complete blackness like OLEDs which do completely turn off.

I'm just discussing more mitigation in terms of the effect. I don't expect a software solution to solve things just alleviate the issue. At least going by the video shown in the previous page the person noted that it was most noticeable in certain circumstances such as in that case with the cursor over the dark background but not noticeable in regular game play.

FALDs do turn off completely if the black area is big enough relative to the zone size; see the black bars on the Z9D or Q9FN. Light does bleed into them sometimes depending on the content, but generally they are black. Regardless, people want their mouse pointer to be reasonably bright and easily visible. You can obviously set the pixel brightness of the mouse pointer to whatever you want, but there's no free lunch: if you make the pointer really dim, you'll preserve the black levels, but then the pointer will be hard to see. Haloing is an issue on every single FALD TV despite them not having mouse pointers, and it's especially obvious in scenes like star fields, where you either get halos or you do what the Samsung Q9FN does and just straight up delete stars to achieve better blacks... lol.

The youtuber on the last page was testing the display in a brightly lit room at some kind of event; he wasn't allowed to take it home and do a proper review, and youtubers in general simply can't be trusted on the same level as genuine critical reviewers like TFTCentral and prad.de (there are some excellent critical YouTube reviewers, like Vincent Teoh at HDTVTest, but I'm not aware of any for PC hardware). The fact that haloing is noticeable at all in a brightly lit room tells you it's going to be really bad in a room with controlled lighting, whether it's a mouse pointer, a star, or some other bright object on a dark background.
 
FALD seems like a terrible fit for anything with bright, well-defined objects on screen. I wonder if it's possible to just have an even backlight setting so it acts like a regular backlight; you'd get the brightness needed for HDR but not the anomalies of FALD. It's a compromise, but IMO black levels are hugely overrated. I don't care if things are exactly black as long as there is good contrast overall. I like how the HDR on my Samsung TV looks.

I get that OLED would be better but it's not very useful constantly bringing it up when no manufacturer makes OLED screens in anything but phone and large TV sizes afaik. I'd be interested if I could get an OLED in well under 40" size.

That's not how HDR brightness works.

This is a common misconception about extreme brightness since HDR is still a pretty new concept to most people.




SDR is a limited (narrow) band. Having a 1000-nit HDR display doesn't push that same rendered band 2x to 3x higher, like ramping up the brightness on an SDR screen would. SDR brightness is relative; HDR brightness uses a different system built on absolute values. It opens up the spectrum so a piece of content's highlights, shadows, etc. can occupy a much broader band (1000-nit peaks down to 0.05-nit blacks).

When SDR goes "outside" of its narrow band, it crushes colors to white and muddies dark detail to black. HDR will show the actual colors of high-brightness highlights (gleaming reflections, edges) without crushing them to white. HDR shows the same content at the same brightness when that content falls within a calibrated SDR range; it does not scale up the brightness of the whole scene the way turning up the brightness of an SDR screen would.


HDR video increases the range of brightness in an image to boost the contrast between the lightest lights and the darkest darks. If you’re having difficulty grasping how that translates into a more realistic image on your screen, think of the subtle tonal gradations a fine artist creates in a charcoal drawing to build the illusion of volume, mass, and texture, and you should begin to get the picture. But HDR doesn’t just improve grayscale; its greater luminance range opens up a video’s color palette as well. “Basically, it’s blacker blacks, whiter whites, and higher brightness and contrast levels for colors across the spectrum,” says Glenn Hower, a research analyst at Parks Associates.


The result is richer, more lifelike video images. Rather than washing out to white, as it would in conventional video, a ray of sunlight reflecting off a lake in HDR will gleam, and a bright cloud will appear soft and cottony. Basically any image your current TV would render shadowed, dull, muddy, or bleached out will look nuanced, vibrant, and strikingly realistic in HDR."


--------------------------------------------------------


"Realistically, most content will be mastered at 1,200-2,000 nits.

But that is the absolute peak brightness that highlights will reach. Maximum scene brightness is to remain below 400 nits."


--- This is what the uninformed think happens:

"If you viewed SDR content on a 1,000 nit display, everything would be displayed 10x brighter than intended.

If you viewed that same SDR content on a 500 nit display, everything would be half the brightness of the 1,000 nit display."


---This is what really happens:


If you viewed HDR content on a 1000 nit display or a 500 nit display, any scene where the peaks are lower than 500 nits should look exactly the same on both displays, due to the way that brightness is now coded. It represents an exact value, not a relative value.

For scenes which have highlights exceeding 500 nits, you will have highlight clipping on the lower brightness display (everything above 500 nits turns white) which would not be present on the higher brightness display. <---- Which will show full color gradations across the spectrum in bright highlights instead of crushing to white after hitting the sdr ceiling.


HDR enables far more vivid, saturated, and realistic images than SDR ever could.

High-brightness SDR displays are not at all the same thing as HDR displays."
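A minimal sketch of the absolute-vs-relative distinction that quote is making (toy code; a real pipeline would apply the PQ curve and proper tone mapping, this just models the clipping behavior):

```python
# HDR code values represent absolute luminance, so a display clips
# highlights above its own peak rather than rescaling the whole image
# the way an SDR brightness control would.

def render_hdr(scene_nits, display_peak):
    """Clip each absolute-luminance sample to the display's peak."""
    return [min(nits, display_peak) for nits in scene_nits]

scene = [0.05, 120, 400, 800]     # absolute nits encoded in the content

print(render_hdr(scene, 1000))    # [0.05, 120, 400, 800] -> nothing clips
print(render_hdr(scene, 500))     # [0.05, 120, 400, 500] -> the 800-nit highlight clips

# Everything at or below 500 nits renders identically on both displays;
# only the highlight beyond the weaker display's peak is lost.
```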


------------------------------------------------------


http://www.creativeplanetnetwork.com/news/shoot/understanding-gamma-and-high-dynamic-range/615702


"To achieve HDR, it is necessary to use bit-depth more efficiently while keeping average brightness about the same as an SDR image. This means more bit-levels for low light levels where the eye is more sensitive and fewer bit-levels for the high brightness areas where the eye cannot see the contouring. In others words, we need a Perceptual Quantizer or PQ that does a better job than the PQ of the current BT.709 gamma."


"Modern cameras are capable of capturing a wide dynamic range. But unfortunately SDR displays will either clip or blow out highlights in images."


"All it takes is one look at a true HDR video on an HDR-capable screen, and consumers are convinced it's the way to go. But having that impact requires a good understanding of gamma and its role in capturing and monitoring HDR content."
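For reference, the Perceptual Quantizer mentioned in that article is standardized as SMPTE ST 2084; the encode (inverse EOTF) side can be transcribed directly from the published constants (my transcription, worth double-checking against the standard):

```python
# SMPTE ST 2084 (PQ) inverse EOTF: absolute luminance in nits -> 0..1 signal.
M1 = 2610 / 16384          # 0.1593017578125
M2 = 2523 / 4096 * 128     # 78.84375
C1 = 3424 / 4096           # 0.8359375
C2 = 2413 / 4096 * 32      # 18.8515625
C3 = 2392 / 4096 * 32      # 18.6875

def pq_encode(nits):
    """Encode absolute luminance (0..10000 nits) as a normalized PQ value."""
    y = max(nits, 0.0) / 10000.0
    ym = y ** M1
    return ((C1 + C2 * ym) / (1.0 + C3 * ym)) ** M2

# Note how the curve spends most of its code range on the dark end,
# which is the "use bit-depth more efficiently" point from the article:
print(round(pq_encode(100), 2))     # 0.51 -- 100 nits already uses half the range
print(round(pq_encode(1000), 2))    # 0.75
print(pq_encode(10000))             # 1.0
```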

This is a mockup showing a 384-zone grid to give some idea. It's not OLED per-pixel, but compared to a lot of other FALD (and obviously compared to edge-lit flares) it's a lot of zones. The zones probably play off each other too, not just on/off but blending.

[image: mockup of a 384-zone FALD grid]
 
The statement "I don't care if things are exactly black if there is good contrast overall." makes no sense anyway. The definition of display contrast is the difference between the darkest pixel and the brightest pixel. If you don't have FALD you can't display HDR properly in the first place because with a backlight permanently set to be able to do 1000 nits, your black levels would be solid grey all the time. Without FALD, an IPS display maxes out at about 1000-1500:1 which is nowhere near what is required to display HDR.
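Rough arithmetic behind that (illustrative numbers, not measurements of any particular panel):

```python
# With a static backlight pinned at the level needed for 1000-nit peaks,
# the panel's native contrast sets the black floor. Figures are assumed.

peak_nits = 1000
native_contrast = 1000                  # assumed IPS-class static contrast
black_floor = peak_nits / native_contrast
print(black_floor)                      # 1.0 nit "black" -- grey in a dark room

# To reach, say, a 0.05-nit HDR black level, a dark zone's backlight has
# to drop far below the global level, which is exactly what FALD does:
target_black = 0.05
dim_factor = black_floor / target_black
print(dim_factor)                       # 20.0 -> the zone must dim to 1/20th
```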
 
Higher peak brightness? My C8 already reaches 950 nit highlights in HDR and I almost go blind in my light controlled room. Crazy high peak brightness is totally over-rated (especially when it's next to absolute black and infinite contrast ratio).
Totally seconding that bolded part. A good CRT, or even a VA panel like the one in my Eizo FG2421, can make SDR content feel like HDR just because the black levels don't suck. Contrast is the key.

That said, I have a decent idea of what modern HDR content is supposed to look like, thanks to my Galaxy Note 8 and YouTube. It's actually pretty impressive, but I ironically try to avoid using it for fear of burn-in on the OLED panel when it has that 1000+ nit peak brightness going in HDR mode. That, and the highlights could leave me with retina burn for a while...
 