Future 38" IPS panel 98% DCI P3 144Hz HDR 600

Discussion in 'Displays' started by Anemone, Dec 19, 2018.

  1. Anemone

    Anemone Gawd

    Messages:
    892
    Joined:
    Apr 5, 2004
    I'd prefer USB-C in the mix, but if it hits DP 1.4, HDMI 2.0, and does HDR 600 (or even 400) PLUS G-Sync and 10-bit color (which is what HDR calls for), then I'm totally in. I don't need super-bright HDR so close to my eyes :) but everyone is different. On a TV, yep, nits matter, but not as much with a device so close. In a couple of years 10-bit color will be everywhere, and the pro/bleeding edge will move to 12-bit since cameras are working on 14/16 now. I want to get something that, while it may be pricey, is going to satisfy for a few years.
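    A quick back-of-envelope sketch (my own illustration, not anything official) of why the bit depth matters: each extra bit doubles the shades per channel, which is the gap between 8-, 10-, and 12-bit panels.

    ```python
    # Illustration only: shades per channel and total colors at each bit depth.
    for bits in (8, 10, 12):
        levels = 2 ** bits          # distinct shades per color channel
        colors = levels ** 3        # R x G x B combinations
        print(f"{bits}-bit: {levels:>5} shades/channel, {colors:,} colors")
    ```

    10-bit gives 1,024 shades per channel versus 256 for 8-bit, which is why banding in gradients is the usual tell.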

    For now I'll have to "live" with just 4K :) And if the specs don't meet everyone's needs, remember the panel can be put in other monitors too.
     
  2. Jedibeeftrix

    Jedibeeftrix [H]Lite

    Messages:
    103
    Joined:
    Dec 1, 2016
    lol, because the experts at rtings get so angry when describing how bad 'fake news' HDR is with just 8bit+FRC panels.
     
    Desert Fish and Armenius like this.
  3. chris7191

    chris7191 n00b

    Messages:
    62
    Joined:
    Sep 25, 2018
    Yeah, 12-bit is asking a lot when these lazy fucks can’t be bothered to do true 10-bit despite it being common in professional monitors 10+ years ago.
     
  4. MaZa

    MaZa 2[H]4U

    Messages:
    2,707
    Joined:
    Sep 21, 2008
    Instead of bickering about bit depth, I would be more concerned about contrast. HDR on IPS, with its low contrast ratio, is just moronic; HDR is all about a wide range of contrast and colors. Even a VA panel needs some form of local dimming to show it properly.
     
  5. Armenius

    Armenius I Drive Myself to the [H]ospital

    Messages:
    17,902
    Joined:
    Jan 28, 2014
    TFT Central measured peak contrast on the PG27UQ to range from 6,000:1 to 62,000:1 in their white window testing. HDR10 calls for a contrast ratio of 20,000:1. You're really only going to get better contrast from self-emitting diodes like OLED or MicroLED, the latter of which seems more promising in the PC space. We're already getting a wide range of options with near 100% DCI-P3 coverage, and Rec.2020 is getting progressively closer to that mark.
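    For intuition, those contrast figures can be converted to photographic stops (each stop is a doubling of luminance); a small sketch using the numbers above:

    ```python
    import math

    # Contrast ratios mentioned above, expressed as stops of dynamic range.
    for ratio in (1000, 6000, 20000, 62000):
        stops = math.log2(ratio)    # each stop doubles luminance
        print(f"{ratio:>6}:1  ~ {stops:.1f} stops")
    ```

    By this measure the HDR10 target of 20,000:1 is roughly 14.3 stops, versus about 10 stops for a typical 1000:1 IPS panel.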
     
  6. thelead

    thelead 2[H]4U

    Messages:
    2,082
    Joined:
    May 28, 2005
    This may be 'the one until MicroLED' but it probably won't be available until 2020...
     
  7. IdiotInCharge

    IdiotInCharge [H]ardForum Junkie

    Messages:
    11,291
    Joined:
    Jun 13, 2003
    I'd just like them to correctly show what they can. Proper tone mapping regardless of output capabilities: if it's too dark, it's crushed to black (fine), and if it's too bright, it's blown out to white (also fine). But get the gamma shifts under control, etc., so SDR and HDR content can be displayed at the same time.
     
  8. Archaea

    Archaea [H]ardForum Junkie

    Messages:
    9,272
    Joined:
    Oct 19, 2004
    Hold out for HDMI 2.1. I think that might be a game changer for variable refresh rate compatibility and bandwidth.
     
    Anemone likes this.
  9. Anemone

    Anemone Gawd

    Messages:
    892
    Joined:
    Apr 5, 2004
    HDMI 2.1 and HDCP 2.3 need each other. Sadly, 2.1 is here but HDCP 2.3 is farther away yet. So agreed, this is a game changer, but there is a reason it is taking a while to arrive. If LG and others can put it into 2019 models, then late-2019 monitors should also have it, because the bandwidth increase is huge, which impacts high-Hz, high-resolution monitors in a big way.
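    To put rough numbers on that bandwidth point (my own arithmetic, ignoring blanking overhead): compare the uncompressed data rate of this panel's mode against the approximate usable payload of each HDMI generation.

    ```python
    # Back-of-envelope: uncompressed video data rate vs. approximate usable
    # HDMI payload (HDMI 2.0 ~14.4 Gbps, HDMI 2.1 FRL ~42.7 Gbps).
    def data_rate_gbps(width, height, hz, bits_per_channel):
        return width * height * hz * bits_per_channel * 3 / 1e9

    needed = data_rate_gbps(3840, 1600, 175, 10)   # this panel at 10-bit RGB
    print(f"needed: {needed:.1f} Gbps (before blanking overhead)")
    for name, payload in [("HDMI 2.0", 14.4), ("HDMI 2.1", 42.7)]:
        verdict = "fits" if payload >= needed else "too slow"
        print(f"{name} (~{payload} Gbps payload): {verdict}")
    ```

    Roughly 32 Gbps is needed before overhead, which is why only HDMI 2.1 (or DP 1.4 with compression) can carry this mode.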
     
  10. Jedibeeftrix

    Jedibeeftrix [H]Lite

    Messages:
    103
    Joined:
    Dec 1, 2016
    I think you missed my point.

    8-bit+FRC is virtually indistinguishable from 10-bit.

    If you're gonna die in a ditch over HDR, do it over local dimming and max brightness instead.
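    For anyone curious how 8-bit+FRC approximates 10-bit, here's a toy sketch (my own illustration, not how any specific panel controller works): the panel flickers a pixel between two adjacent 8-bit levels so the temporal average lands on the intended 10-bit level.

    ```python
    # Toy FRC (frame rate control): emulate a 10-bit level on an 8-bit panel
    # by alternating between two adjacent 8-bit levels over a 4-frame cycle.
    def frc_frames(level_10bit, cycle=4):
        base, frac = divmod(level_10bit, 4)   # 8-bit base + 2 fractional bits
        # show `frac` of the `cycle` frames one step brighter than `base`
        return [base + 1 if i < frac else base for i in range(cycle)]

    frames = frc_frames(514)   # 10-bit 514 sits between 8-bit 128 and 129
    print(frames, "average:", sum(frames) / len(frames))   # 128.5 == 514/4
    ```

    The eye integrates the flicker, so the average reads as the in-between shade; real controllers spread the pattern spatially too, to avoid visible shimmer.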
     
    Desert Fish and Armenius like this.
  11. Nenu

    Nenu [H]ardened

    Messages:
    18,789
    Joined:
    Apr 28, 2007
    I wonder if this is due to individual mapping of pixels that address offset and linearity issues in either the backlight or LCD elements.
    ie each pixel needs a different current/brightness to look the same but they are not a perfect match.
    When 4 or more adjacent pixels are asked to become the exact same colour/brightness and are expected to look exactly the same, disparities might become obvious.
    By scaling the image non-linearly, the differences blend into each other.
     
  12. Nenu

    Nenu [H]ardened

    Messages:
    18,789
    Joined:
    Apr 28, 2007
    fyi
    To be sure of HDR support over HDMI, a display must support HDMI 2.0a; that is what HDMI 2.0a is for.
    For HLG HDR (i.e. TV transmission) it must be HDMI 2.0b.
    An HDMI 2.0 display will not do HDR unless it is a misprint or their documentation dept isn't up on its standards.
     
  13. Iratus

    Iratus [H]ard|Gawd

    Messages:
    1,226
    Joined:
    Jan 16, 2003
    The LG monitor using this panel is the 38GL950G, just FYI.

    https://www.144hzmonitors.com/monitors/lg-38gl950g-review/ (Not a review, dark SEO shenanigans)

    144hz and GSync, as my previous comment though I’m not going to expect it this calendar year.

    That said, when it's out I'll probably replace my ViewSonic XG2703 and old Dells with it. I've been getting irritated because I don't have space in my new house for a work desk and a play desk, and it's impossible to get a setup I'm happy with since I work from my Mac and have browsers/reference up on my other screen, then obviously game on my G-Sync screen. This means I can consolidate my two 27s. It'll be nice to have a bit more vertical space to play with.

    Macs are irritating as fuck to dual-head these days without $300 docking stations. I'd rather put the money towards a screen than KVMs, docking stations, etc.
     
  14. linuxdude9

    linuxdude9 Gawd

    Messages:
    597
    Joined:
    Dec 25, 2004
    cybereality likes this.
  15. GNUse_the_force

    GNUse_the_force Limp Gawd

    Messages:
    420
    Joined:
    Oct 27, 2014
    The screen size, resolution, refresh rate, and format work for me, but the panel type (IPS) and G-Sync do not. Who knows, when a cheaper FreeSync variant is released it might still be on my radar, if the 'nano' means this IPS gets near-VA black levels and contrast.
     
  16. thelead

    thelead 2[H]4U

    Messages:
    2,082
    Joined:
    May 28, 2005
    100% agree... but it had better come through with a decent (better than 1000:1) contrast ratio. The last 34" Alienware that I tried had a horrible contrast ratio compared to my other IPS displays.
     
  17. MistaSparkul

    MistaSparkul Gawd

    Messages:
    940
    Joined:
    Jul 5, 2012
    Nano IPS does nothing to improve contrast. There are Nano IPS panels out there and they still only manage 1000-1200:1 contrast ratios. Only local dimming would improve contrast.
     
    Baenwort likes this.
  18. DF-1

    DF-1 2[H]4U

    Messages:
    2,563
    Joined:
    Jun 17, 2011
    I see 12-bit as an option on the OLED.

    Although you'll be stuck at less than 4:4:4.
     
  19. IdiotInCharge

    IdiotInCharge [H]ardForum Junkie

    Messages:
    11,291
    Joined:
    Jun 13, 2003
    You don't like high framerates?
     
  20. thelead

    thelead 2[H]4U

    Messages:
    2,082
    Joined:
    May 28, 2005
    I'm still drooling...
     
  21. Anemone

    Anemone Gawd

    Messages:
    892
    Joined:
    Apr 5, 2004
    HDMI 2.1 is possible. No mention of HDR of any kind, though, so would they bother with DP 1.4 and a 10-bit panel?

    Feels like a 38" copy of the 34GK950, but that's not entirely a bad thing. I just know what 10-bit color looks like compared to 8. But again, wide gamut with just 8-bit mapping might be OK. Might have to wait for a 60Hz 10-bit version; have to see.
     
  22. Vega

    Vega [H]ardness Supreme

    Messages:
    6,178
    Joined:
    Oct 12, 2004
    175 Hz 3840x1600, definitely has my attention!
     
  23. Nenu

    Nenu [H]ardened

    Messages:
    18,789
    Joined:
    Apr 28, 2007
    Those are likely max specs,
    i.e. at 1080p and maybe a little higher res it will do 175Hz.
    If it can do 175Hz at native res, it will likely manage an even higher refresh rate at lower res, and then that higher refresh would be the one promoted.
     
  24. MistaSparkul

    MistaSparkul Gawd

    Messages:
    940
    Joined:
    Jul 5, 2012
    I'm sure it can do 3840x1600 at 175Hz because the 3440x1440 HDR monitors with DP 1.4 are overclockable to 200Hz so 175Hz shouldn't be out of the question for this monitor. But edge lit IPS panels are a no go for me. Yuck!
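    A rough feasibility check (my own numbers, blanking ignored) of why that works on DP 1.4: HBR3's payload is about 25.92 Gbps after 8b/10b encoding, so 8-bit barely fits uncompressed while 10-bit HDR has to lean on DSC or chroma subsampling.

    ```python
    # Rough DP 1.4 budget check: HBR3 payload ~25.92 Gbps after 8b/10b encoding.
    HBR3_GBPS = 25.92

    def raw_gbps(width, height, hz, bits_per_channel):
        return width * height * hz * bits_per_channel * 3 / 1e9

    for bpc in (8, 10):
        need = raw_gbps(3840, 1600, 175, bpc)
        fits = need <= HBR3_GBPS
        print(f"{bpc} bpc: {need:.2f} Gbps -> "
              f"{'fits uncompressed (just)' if fits else 'needs DSC or 4:2:2'}")
    ```

    In practice blanking overhead pushes even the 8-bit case over the line, which is why these high-refresh ultrawides ship with DSC or reduced-blanking timings.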
     
  25. thelead

    thelead 2[H]4U

    Messages:
    2,082
    Joined:
    May 28, 2005
    FALD is just ridiculously expensive, though... and the blooming is annoying AF for the expense. This is as good as it's going to get on the value front unless they figure out how to produce 1,000-zone FALD monitors for under $2,000, IMO.
     
    Anemone likes this.
  26. MistaSparkul

    MistaSparkul Gawd

    Messages:
    940
    Joined:
    Jul 5, 2012
    Eh, it kind of depends. Blooming is definitely a problem in dark HDR scenes because the display is trying to pump out 1000 nits of brightness for bright highlights against a dark background. But in normal SDR mode, where I'm set to a max luminance of 100 nits, blooming is practically nonexistent, plus I get far superior contrast to any edge-lit VA monitor and none of that nasty-ass glow.
     
  27. GNUse_the_force

    GNUse_the_force Limp Gawd

    Messages:
    420
    Joined:
    Oct 27, 2014
    I do like high frame rates, but the benefit of adaptive sync for me is being able to run a game at 40fps if needed and have it still look smooth like 60fps. Sync technologies give you a certain degree of extra mileage. On less demanding titles (90% of new popular releases) the extra Hz is welcome whether it's G-Sync or FreeSync.
     
    Baenwort and xp3nd4bl3 like this.
  28. Anemone

    Anemone Gawd

    Messages:
    892
    Joined:
    Apr 5, 2004
    FALD is also likely to have lifespan issues; just go look at TVs that end up with burned-out spots. It happens more often than you'd like to read about.

    Given the cost and the imperfection of the tech, edge lighting is good enough. No monitor lasts truly forever, and really we need OLED or MicroLED to make a more effective zone or per-pixel system. Those are years away. Plus FALD comes with fans; no thanks.

    But the specs of this "potential" monitor are very impressive. And I'd heavily suspect you'll see variants of the panel in other designs, so if the specs aren't just what you want but the size and resolution are, you'll probably find a variant design by late 2019.
     
  29. Vega

    Vega [H]ardness Supreme

    Messages:
    6,178
    Joined:
    Oct 12, 2004
    175 Hz at 3840x1600 is the spec. No one advertises refresh rates for non-native resolutions as the primary spec.
     
  30. Nenu

    Nenu [H]ardened

    Messages:
    18,789
    Joined:
    Apr 28, 2007
    They do because it is possible with standard methods and therefore must form part of the primary spec.
    My Samsung UHD TV can do 120Hz but not at UHD res.
    My 1080p projector can do 120Hz at 720p or lower.
    ...

    edit
    You have a point.
    Some displays state the higher refresh possible at lower res in their specs, others don't.
    It's not a case of none, though.
     
    Last edited: Jan 4, 2019
  31. Vega

    Vega [H]ardness Supreme

    Messages:
    6,178
    Joined:
    Oct 12, 2004
    https://www.lg.com/us/monitors/lg-38GL950G-B-gaming-monitor

    Official web page. Show me one manufacturer's monitor web page that shows a higher refresh rate than the monitor can do at native resolution. At the very minimum, it would require a HUGE asterisk if that 175 Hz was at a lower resolution.
     
    cybereality likes this.
  32. linuxdude9

    linuxdude9 Gawd

    Messages:
    597
    Joined:
    Dec 25, 2004
    I hope this monitor has an sRGB mode. The recent 34" ultrawide models from LG don't have one, so sRGB content is oversaturated.
     
    Baenwort and thelead like this.
  33. thelead

    thelead 2[H]4U

    Messages:
    2,082
    Joined:
    May 28, 2005
    Agreed. The lack of an sRGB mode kept me from buying the 34GK950G. For the record, the FreeSync version did have it, though it had higher latency from what I read, and the FreeSync range wasn't as good as the G-Sync range. I'm really hating Nvidia and their proprietary BS right now.
     
  34. cybereality

    cybereality [H]ardness Supreme

    Messages:
    4,522
    Joined:
    Mar 22, 2008
    This monitor sounds amazing. Not sure I'm ready to upgrade yet (only had my current monitor for like 2 months) but I like where this is going.
     
  35. IdiotInCharge

    IdiotInCharge [H]ardForum Junkie

    Messages:
    11,291
    Joined:
    Jun 13, 2003
    Should really be hating AMD for their half-assed response to G-Sync and well... everything else.
     
  36. thelead

    thelead 2[H]4U

    Messages:
    2,082
    Joined:
    May 28, 2005
    I do, but there is no reason the G-Sync version of that monitor should be more expensive while lacking a ton of features the FreeSync version has (100Hz vs 144Hz, 8-bit vs 8-bit+FRC, sRGB mode, DisplayPort 1.2 vs 1.4, etc.). That expensive but old G-Sync module was a limiting factor for that monitor.
     
    Baenwort likes this.
  37. IdiotInCharge

    IdiotInCharge [H]ardForum Junkie

    Messages:
    11,291
    Joined:
    Jun 13, 2003
    That's on LG...
     
  38. thelead

    thelead 2[H]4U

    Messages:
    2,082
    Joined:
    May 28, 2005
    The sRGB mode, most likely. Everything else is on Nvidia. Yes, Nvidia has a newer module, but speculation says it adds $500 to a monitor... that's ridiculous.
     
  39. IdiotInCharge

    IdiotInCharge [H]ardForum Junkie

    Messages:
    11,291
    Joined:
    Jun 13, 2003
    Speculation is speculation, but custom FPGAs are expensive. Again, it's on LG to put the appropriate module in.
     
  40. thelead

    thelead 2[H]4U

    Messages:
    2,082
    Joined:
    May 28, 2005
    Yes, but now the FreeSync version goes for $1200 while the 'proper' G-Sync version would go for ~$1700? G-Sync isn't that much better than FreeSync. The biggest problem with FreeSync is that AMD hasn't competed with Nvidia's top-tier cards. Nvidia is capitalizing by charging ridiculous prices. It's a business, so I get it, but I don't think it's a smart move, as AMD, or possibly Intel, could quickly shake things up (quickly being a relative term in the graphics/monitor game).
     
    Baenwort likes this.