Future 38" IPS panel 98% DCI P3 144Hz HDR 600

I'd prefer USB-C in the mix, but if it hits DP 1.4 and HDMI 2.0 and does HDR 600 (or even 400) plus G-Sync and 10-bit color (which is what HDR would call for), then I'm totally in. I don't need super-bright HDR so close to my eyes :) but everyone is different. On a TV, yep, nits matter, but not as much with a device so close. But in a couple of years 10-bit color will be everywhere and the pro/bleeding edge will move to 12-bit, since cameras are working on 14/16 now. I want to get something that, while it may be pricey, is going to satisfy for a few years.

For now I'll have to "live" with just 4K :) And if the specs don't meet everyone's needs, remember the panel can be put in other monitors too.
 
lol, because the experts at RTINGS get so angry when describing how bad 'fake news' HDR is with just 8-bit+FRC panels.

Yeah, 12-bit is asking a lot when these lazy fucks can’t be bothered to do true 10-bit despite it being common in professional monitors 10+ years ago.
 
Instead of bickering about bit depth, I'd be more concerned about contrast. HDR on IPS, with its low contrast ratio, is just moronic; HDR is all about a wide range of contrast and colors. Even a VA panel needs some form of local dimming to show it right.
 
Instead of bickering about bit depth, I'd be more concerned about contrast. HDR on IPS, with its low contrast ratio, is just moronic; HDR is all about a wide range of contrast and colors. Even a VA panel needs some form of local dimming to show it right.
TFT Central measured peak contrast on the PG27UQ to range from 6,000:1 to 62,000:1 in their white window testing. HDR10 calls for a contrast ratio of 20,000:1. You're really only going to get better contrast from self-emitting diodes like OLED or MicroLED, the latter of which seems more promising in the PC space. We're already getting a wide range of options with near 100% DCI-P3 coverage, and Rec.2020 is getting progressively closer to that mark.
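If you want to sanity-check those figures, a contrast ratio is simply peak white luminance divided by black luminance. A rough Python sketch (the nit values are illustrative assumptions, not TFT Central's actual measurements):

```python
# Contrast ratio = peak luminance / black luminance (values below are illustrative).
def contrast_ratio(peak_nits, black_nits):
    return peak_nits / black_nits

print(contrast_ratio(1000, 0.9))    # ~1,111:1  -- IPS with the backlight fully on, no dimming
print(contrast_ratio(1000, 0.05))   # 20,000:1  -- zone-dimmed black, meets the HDR10 target
print(contrast_ratio(1000, 0.016))  # ~62,500:1 -- zones nearly off, close to the best-case figure
```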
 
This may be 'the one until MicroLED' but it probably won't be available until 2020...
 
Instead of bickering about bit depth, I'd be more concerned about contrast. HDR on IPS, with its low contrast ratio, is just moronic; HDR is all about a wide range of contrast and colors. Even a VA panel needs some form of local dimming to show it right.

I'd just like them to show what they can correctly. Proper tone mapping regardless of output capabilities: if it's too dark, it's crushed to black (fine), and if it's too bright, it's blown out to white (also fine). But get the gamma shifts under control, etc., so SDR and HDR content can be displayed at the same time.
 
Hold out for HDMI 2.1. I think that might be a game changer for variable refresh rate compatibility and bandwidth.
 
HDMI 2.1 and HDCP 2.3 need each other. Sadly, 2.1 is here but HDCP 2.3 is farther away yet. So agreed, this is a game changer, but there is a reason it is taking a while to get here. If LG and others can put it into 2019 models, then yes, late-2019 monitors should also have it, because the bandwidth increase is huge and it impacts high-Hz, high-resolution monitors in a big way.
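To put rough numbers on that bandwidth point, here's a back-of-the-envelope Python sketch. The flat 5% blanking overhead and the usable-rate figures (HDMI 2.0 ~14.4 Gbps, DP 1.4 HBR3 ~25.9 Gbps, HDMI 2.1 FRL ~42.7 Gbps) are approximations, not exact timings:

```python
# Uncompressed video bandwidth, assuming RGB/4:4:4 and ~5% blanking overhead.
def gbps(width, height, refresh_hz, bits_per_channel, blanking=1.05):
    bits_per_pixel = bits_per_channel * 3
    return width * height * refresh_hz * bits_per_pixel * blanking / 1e9

print(round(gbps(3840, 1600, 144, 10), 1))  # ~27.9 -> too much even for DP 1.4 uncompressed
print(round(gbps(3840, 1600, 175,  8), 1))  # ~27.1 -> same story; DSC or chroma subsampling needed
print(round(gbps(3840, 1600, 175, 10), 1))  # ~33.9 -> comfortably inside HDMI 2.1's ~42.7 Gbps
```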
 
Yeah, 12-bit is asking a lot when these lazy fucks can’t be bothered to do true 10-bit despite it being common in professional monitors 10+ years ago.
I think you missed my point.

8-bit+FRC is virtually indistinguishable from 10-bit.

If you're gonna die in a ditch over HDR, do it over local dimming and max brightness instead.
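For anyone unfamiliar with what FRC actually does, here's a toy Python sketch of the idea. Real controllers use spatio-temporal dither patterns rather than pure random flicker, so treat this as a simplification:

```python
import random

def frc_8bit_stream(level_10bit, frames=10000):
    """Approximate a 10-bit level on an 8-bit panel by flickering between the
    two nearest 8-bit codes so the time-average matches the target."""
    base, frac = divmod(level_10bit, 4)        # four 10-bit steps per 8-bit step
    shown = [min(base + (1 if random.random() < frac / 4 else 0), 255)
             for _ in range(frames)]
    return sum(shown) / frames                 # perceived (time-averaged) level

# 10-bit code 513 sits a quarter of the way between 8-bit codes 128 and 129.
print(513 / 4, frc_8bit_stream(513))           # 128.25 vs. a time-average of roughly 128.25
```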
 
Is there anything that actually does real pixel doubling? Whenever I've tried it, it just blurs the image.
I wonder if this is due to individual mapping of pixels to address offset and linearity issues in either the backlight or the LCD elements.
i.e. each pixel needs a different current/brightness to look the same, because they are not a perfect match.
When four or more adjacent pixels are asked to become the exact same colour/brightness and are expected to look exactly the same, disparities might become obvious.
By scaling the image non-linearly, the differences blend into each other.
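For reference, true pixel doubling is just nearest-neighbour integer scaling: every source pixel becomes a 2x2 block, so edges stay perfectly sharp. The blur you're seeing means the scaler is interpolating instead. A minimal sketch:

```python
# Nearest-neighbour 2x integer scaling: each pixel is repeated into a 2x2 block.
def pixel_double(image):
    doubled = []
    for row in image:
        wide = [px for px in row for _ in range(2)]  # repeat each pixel horizontally
        doubled.extend([wide, list(wide)])           # then repeat the whole row vertically
    return doubled

src = [[0, 255],
       [255, 0]]
for row in pixel_double(src):
    print(row)
# [0, 0, 255, 255]
# [0, 0, 255, 255]
# [255, 255, 0, 0]
# [255, 255, 0, 0]   <- a hard checkerboard edge stays a hard edge, no blur
```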
 
fyi
To be sure of HDR support over HDMI, the display must support HDMI 2.0a. That is what HDMI 2.0a is for.
For HLG HDR (i.e. TV transmission) it must be HDMI 2.0b.
A display listed as plain HDMI 2.0 will not do HDR unless the listing is a misprint or their documentation department isn't up to date with the standards.
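Condensed into a quick lookup, based on my own reading of the spec revisions (so double-check against the official docs):

```python
# Which HDR signalling each HDMI revision added (summary, not an official table).
HDR_OVER_HDMI = {
    "2.0":  [],                                            # no HDR metadata signalling
    "2.0a": ["HDR10 (static metadata)"],
    "2.0b": ["HDR10 (static metadata)", "HLG (broadcast HDR)"],
    "2.1":  ["HDR10 (static metadata)", "HLG (broadcast HDR)", "dynamic HDR metadata"],
}

def supports_hdr(hdmi_version, flavour="HDR10"):
    return any(flavour in f for f in HDR_OVER_HDMI.get(hdmi_version, []))

print(supports_hdr("2.0"))          # False -- plain 2.0 can't signal HDR at all
print(supports_hdr("2.0b", "HLG"))  # True  -- 2.0b is the one that adds HLG
```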
 
The LG monitor using this panel is the 38GL950G, just FYI.

https://www.144hzmonitors.com/monitors/lg-38gl950g-review/ (Not an actual review, just dark SEO shenanigans)

144Hz and G-Sync, as per my previous comment, though I'm not expecting it this calendar year.

That said, when it's out I'll probably replace my ViewSonic XG2703 and old Dells with it. I've been getting irritated because I don't have space in my new house for a work desk and a play desk, and it's impossible to get a setup I'm happy with, as I work from my Mac and have browsers/reference up on my other screen, then obviously game on my G-Sync screen. This means I can consolidate my two 27" monitors. It'll be nice to have a bit more vertical space to play with.

Macs are irritating as fuck to dual-head these days without $300 docking stations. I'd rather put the money towards a screen than towards KVMs, docking stations, etc.
 
The screen size, resolution, refresh rate and format work for me, but the panel type (IPS) and G-Sync do not. Who knows, when a cheaper FreeSync variant is released it might still be on my radar, if the 'nano' means this IPS gets near VA black levels and contrast.
 
The screen size, resolution, refresh rate and format work for me, but the panel type (IPS) and G-Sync do not. Who knows, when a cheaper FreeSync variant is released it might still be on my radar, if the 'nano' means this IPS gets near VA black levels and contrast.

Nano IPS does nothing to improve contrast. There are Nano IPS panels out there and they still only manage 1000-1200:1 contrast ratios. Only local dimming would improve contrast.
 
I see 12-bit as an option on the OLED.

Although you'll be stuck at less than 4:4:4.
 
HDMI 2.1 is possible. No mention of HDR of any kind, so would they bother with DP 1.4 and a 10-bit panel?

Feels like a 38" copy of the 34GK950 but that's not entirely a bad thing. I just know what 10 bit color looks like compared to 8. But again, wide gamut but just 8 bit mapping might be ok. Might have to wait for a 60Hz 10 bit version - have to see.
 
175 Hz 3840x1600, definitely has my attention!
Those are likely max specs.
i.e. at 1080p, and maybe a little higher resolution, it will do 175Hz.
If it could do 3840x1600 at 175Hz, it would likely do an even higher refresh rate at lower resolutions, and that higher number is what would be promoted.
 
Those are likely max specs.
i.e. at 1080p, and maybe a little higher resolution, it will do 175Hz.
If it could do 3840x1600 at 175Hz, it would likely do an even higher refresh rate at lower resolutions, and that higher number is what would be promoted.

I'm sure it can do 3840x1600 at 175Hz, because the 3440x1440 HDR monitors with DP 1.4 are overclockable to 200Hz, so 175Hz shouldn't be out of the question for this monitor. But edge-lit IPS panels are a no-go for me. Yuck!
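A quick pixel-rate comparison behind that reasoning (active pixels only, blanking ignored):

```python
def mpix_per_s(width, height, refresh_hz):
    return width * height * refresh_hz / 1e6

print(mpix_per_s(3440, 1440, 200))  # ~990.7 Mpix/s -- existing DP 1.4 panels, overclocked
print(mpix_per_s(3840, 1600, 175))  # ~1075.2 Mpix/s -- about 9% more, so the same ballpark
```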
 
I'm sure it can do 3840x1600 at 175Hz, because the 3440x1440 HDR monitors with DP 1.4 are overclockable to 200Hz, so 175Hz shouldn't be out of the question for this monitor. But edge-lit IPS panels are a no-go for me. Yuck!
FALD is just ridiculously expensive though... and the blooming is annoying AF for the expense. This is as good as it's going to get on the value front unless they figure out how to produce 1,000-zone FALD monitors for under $2,000, IMO.
 
FALD is just ridiculously expensive though... and the blooming is annoying AF for the expense. This is as good as it's going to get on the value front unless they figure out how to produce 1,000-zone FALD monitors for under $2,000, IMO.

Eh, it kind of depends. Blooming is definitely a problem in dark scenes in HDR, because the display is trying to pump out 1000 nits of brightness for those bright highlights against a dark background. But in normal SDR mode, where I'm set to a max luminance of 100 nits, blooming is practically non-existent, plus I get far superior contrast to any edge-lit VA monitor and none of that nasty-ass glow.
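Some toy numbers for why blooming is so much more obvious in HDR than at a ~100-nit SDR setting; the 2% leakage figure is purely an assumption for illustration:

```python
# Light spilling from a lit zone into an adjacent dark zone, in absolute nits.
def leaked_nits(highlight_nits, leak_fraction=0.02):
    return highlight_nits * leak_fraction

print(leaked_nits(1000))  # 20.0 nits around an HDR highlight -- an obvious grey halo in a dark scene
print(leaked_nits(100))   # 2.0 nits at a 100-nit SDR setting -- barely above the panel's own glow
```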
 
You don't like high framerates?

I do like high frame rates, but the benefit of adaptive sync for me is being able to run a game at 40fps if needed and have it still look smooth, like 60fps. There is a certain degree of extra mileage sync technologies give you. On less demanding titles (90% of new popular releases) the extra Hz is welcome, no matter whether it's G-Sync or FreeSync.
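A simplified sketch of why ~40fps still feels even with adaptive sync: the display holds each frame for as long as it took to render, instead of snapping to fixed 60Hz scanout slots (this ignores triple buffering and other real-world details):

```python
import math

def presented_ms(frame_time_ms, vrr, refresh_hz=60):
    """How long a frame that took frame_time_ms to render stays on screen."""
    if vrr:
        return frame_time_ms                         # adaptive sync: refresh tracks the frame
    scan = 1000 / refresh_hz                         # fixed 60 Hz: 16.7 ms scanout slots
    return math.ceil(frame_time_ms / scan) * scan    # frame is held until the next slot boundary

print(round(presented_ms(25.0, vrr=True), 1))   # 25.0 -> every 40fps frame paced identically
print(round(presented_ms(25.0, vrr=False), 1))  # 33.3 -> held over an extra refresh, uneven pacing
```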
 
FALD is also likely to have lifespan issues; just go have a look at the TVs that end up with burned-out spots. It happens more often than you'd like to read about.

Given the cost and the imperfection of the tech, edge lighting is good enough. No monitor lasts forever, and really we need OLED or MicroLED to make a more effective zone or per-pixel system. Those are years away. Plus FALD comes with fans; no thanks.

But the specs of this "potential" monitor are very impressive. And I strongly suspect you'll see variants of the panel in other designs, so if the specs aren't quite what you want but the size and resolution are, you'll probably find a variant design by late 2019.
 
Those are likely max specs.
i.e. at 1080p, and maybe a little higher resolution, it will do 175Hz.
If it could do 3840x1600 at 175Hz, it would likely do an even higher refresh rate at lower resolutions, and that higher number is what would be promoted.

175 Hz at 3840x1600 is the spec. No one advertises refresh rates for non-native resolutions as the primary spec.
 
175 Hz at 3840x1600 is the spec. No one advertises refresh rates for non-native resolutions as the primary spec.
They do, because it is possible with standard methods and therefore must form part of the primary spec.
My Samsung UHD TV can do 120Hz, but not at UHD resolution.
My 1080p projector can do 120Hz at 720p or lower.
...

edit
You have a point.
Some displays state the higher refresh rate possible at lower resolutions in their specs, others don't.
It's not a case of none, though.
 
I hope this monitor has an sRGB mode. The recent 34" ultrawide models by LG don't have one, so sRGB content is oversaturated.
Agreed. The lack of an sRGB mode kept me away from buying the 34GK950G. For the record, the FreeSync version did have it, though. It had higher latency from what I read, and the FreeSync range wasn't as good as the G-Sync range. I'm really hating Nvidia and their proprietary BS right now.
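For what it's worth, the oversaturation comes from sRGB code values driving the panel's native (wider) primaries when there's no sRGB emulation mode. The chromaticities below are the published sRGB and DCI-P3 primaries; the distance-from-white figure is only a crude saturation proxy, not a perceptual metric:

```python
import math

# CIE xy chromaticities of the primaries, plus the D65 white point.
SRGB   = {"R": (0.640, 0.330), "G": (0.300, 0.600), "B": (0.150, 0.060)}
DCI_P3 = {"R": (0.680, 0.320), "G": (0.265, 0.690), "B": (0.150, 0.060)}
D65    = (0.3127, 0.3290)

def dist_from_white(xy):
    return math.hypot(xy[0] - D65[0], xy[1] - D65[1])

for ch in "RGB":
    s, p = dist_from_white(SRGB[ch]), dist_from_white(DCI_P3[ch])
    print(f"{ch}: sRGB {s:.3f}  P3 {p:.3f}  (~{(p / s - 1) * 100:.0f}% further from white)")
# Red ends up ~12% and green ~34% further from white than the content intended -- the
# oversaturated look.  A proper sRGB mode applies a gamut-mapping matrix to pull them back.
```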
 
This monitor sounds amazing. Not sure I'm ready to upgrade yet (only had my current monitor for like 2 months) but I like where this is going.
 
Agreed. The lack of an sRGB mode kept me away from buying the 34GK950G. For the record, the FreeSync version did have it, though. It had higher latency from what I read, and the FreeSync range wasn't as good as the G-Sync range. I'm really hating Nvidia and their proprietary BS right now.

Should really be hating AMD for their half-assed response to G-Sync and well... everything else.
 
Should really be hating AMD for their half-assed response to G-Sync and well... everything else.
I do, but there is no reason why the G-Sync version of that monitor is more expensive while lacking a ton of features that the FreeSync version has (100Hz vs 144Hz, 8-bit vs 8-bit+FRC, sRGB mode, DisplayPort 1.2 vs 1.4, etc.). That expensive but old G-Sync module was a limiting factor for that monitor.
 
I do, but there is no reason why the G-Sync version of that monitor is more expensive while lacking a ton of features that the FreeSync version has (100Hz vs 144Hz, 8-bit vs 8-bit+FRC, sRGB mode, DisplayPort 1.2 vs 1.4, etc.). That expensive but old G-Sync module was a limiting factor for that monitor.

That's on LG...
 
The sRGB mode, most likely. Everything else is on Nvidia. Yes, Nvidia has a newer module but speculation says that it adds $500 to a monitor... that’s ridiculous.

Speculation is speculation, but custom FPGAs are expensive. Again, it's on LG to put the appropriate module in.
 
Speculation is speculation, but custom FPGAs are expensive. Again, it's on LG to put the appropriate module in.
Yes, but now the FreeSync version goes for $1,200 while the 'proper' G-Sync version would go for ~$1,700? G-Sync isn't that much better than FreeSync. The biggest problem with FreeSync is that AMD hasn't competed with Nvidia's top-tier cards. Nvidia is capitalizing by charging ridiculous prices. It's a business, so I get it, but I don't think it's a smart move, as AMD, or possibly Intel, could quickly shake things up (quickly being a relative term in the graphics/monitor game).
 