Future 38" IPS panel 98% DCI P3 144Hz HDR 600

Discussion in 'Displays' started by Anemone, Dec 19, 2018.

  1. IdiotInCharge

    IdiotInCharge [H]ardForum Junkie

    Messages:
    11,626
    Joined:
    Jun 13, 2003
Based on uncited 'speculation'. Also remember that Freesync isn't 'free' from a hardware standpoint; Freesync 2, which seeks to close the gap with G-Sync somewhat, is even more expensive.

And so long as AMD maintains their steady two-to-three-generation disadvantage, G-Sync + a top-end Nvidia GPU is that much better.
     
  2. thelead

    thelead 2[H]4U

    Messages:
    2,088
    Joined:
    May 28, 2005
    I don’t disagree with your assessment. I would pay it but I doubt it would be much of a commercial success with the price difference.
     
  3. Archaea

    Archaea [H]ardForum Junkie

    Messages:
    9,433
    Joined:
    Oct 19, 2004
  4. Nenu

    Nenu [H]ardened

    Messages:
    18,893
    Joined:
    Apr 28, 2007
    Anemone likes this.
  5. cybereality

    cybereality [H]ardness Supreme

    Messages:
    4,690
    Joined:
    Mar 22, 2008
  6. thelead

    thelead 2[H]4U

    Messages:
    2,088
    Joined:
    May 28, 2005
    I was hoping to hear about the 38GL950G during CES... oh well...
     
    Anemone likes this.
  7. Anemone

    Anemone Gawd

    Messages:
    892
    Joined:
    Apr 5, 2004
G-Sync compatibility might well explain the need for DP 1.4 to keep up with HDR/10-bit bandwidth and the 144/175 Hz refresh rate. Wonder what the range would be if it did have variable refresh..
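Rough back-of-envelope numbers on why DP 1.4 gets tight here (my own assumptions: ~5% blanking overhead, RGB, and DP 1.4 HBR3 at roughly 25.92 Gbit/s effective after 8b/10b encoding; real CVT timings will differ a bit):

```python
# Back-of-envelope DisplayPort bandwidth check for a 3840x1600 panel.
# The ~5% blanking overhead is an assumption; actual CVT-R2 timings differ.

def data_rate_gbps(width, height, refresh_hz, bits_per_channel, blanking=1.05):
    pixel_clock = width * height * refresh_hz * blanking  # pixels/second
    return pixel_clock * bits_per_channel * 3 / 1e9       # RGB, Gbit/s

DP14_EFFECTIVE_GBPS = 25.92  # HBR3 x4 lanes after 8b/10b encoding

for bpc, hz in [(10, 144), (10, 175), (8, 144)]:
    rate = data_rate_gbps(3840, 1600, hz, bpc)
    verdict = "fits" if rate <= DP14_EFFECTIVE_GBPS else "needs DSC or subsampling"
    print(f"{bpc}-bit @ {hz} Hz: {rate:.1f} Gbit/s -> {verdict}")
```

So 10-bit at 144/175 Hz is already past plain DP 1.4 by this math, while 8-bit 144 Hz squeaks under.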
     
  8. Gatecrasher3000

    Gatecrasher3000 Limp Gawd

    Messages:
    295
    Joined:
    Mar 18, 2013


    Damn dude, I think I might have just found my next screen.

    *Skip to 1:30*
     
  9. thelead

    thelead 2[H]4U

    Messages:
    2,088
    Joined:
    May 28, 2005
    Yup... a few videos finally popped up today. I really hope these come out soon (2019).
     
  10. Iratus

    Iratus [H]ard|Gawd

    Messages:
    1,226
    Joined:
    Jan 16, 2003
Hopefully they just had the brightness cranked to max for that video; the black levels were terrible
     
  11. Anemone

    Anemone Gawd

    Messages:
    892
    Joined:
    Apr 5, 2004
    Hakaba, KazeoHin and IdiotInCharge like this.
  12. Vega

    Vega [H]ardness Supreme

    Messages:
    6,258
    Joined:
    Oct 12, 2004
It's said that for a while now. TFTCentral is now saying Q4.

    EDIT: Dan from LG over on OCUK forums said July/Aug, but we all know in the display business that means Q4.
     
    Last edited: Apr 8, 2019
  13. Aluminum

    Aluminum Gawd

    Messages:
    672
    Joined:
    Sep 18, 2015
Hope the delay also means they do a Freesync-only version (or just ditch the G-Sync version completely and sell more at the lower price bracket, hint hint); the 34" out right now is great, the best UW option so far.
     
  14. Dan_D

    Dan_D [H]ard as it Gets

    Messages:
    54,642
    Joined:
    Feb 9, 2002
If that had HDR, I'd actually be interested in that one. It's pretty close to 40" and it's got a vertical resolution of 1600, which is my bare minimum. That 1440p stuff just doesn't work for me.
     
    Hakaba likes this.
  15. IdiotInCharge

    IdiotInCharge [H]ardForum Junkie

    Messages:
    11,626
    Joined:
    Jun 13, 2003
    Only worry here is that 'Freesync version' is not specific enough to ensure that the included Freesync capability is as close to G-Sync capability as it can get. That's one of the benefits with G-Sync: you get the best VRR support without question. Sometimes worth paying for.
     
  16. thelead

    thelead 2[H]4U

    Messages:
    2,088
    Joined:
    May 28, 2005
As long as the G-Sync module doesn't need a fan, I would rather have it than a Freesync version.
     
    xp3nd4bl3 likes this.
  17. IdiotInCharge

    IdiotInCharge [H]ardForum Junkie

    Messages:
    11,626
    Joined:
    Jun 13, 2003
Yeah, that's a goofy one. It's clear that a lack of economy of scale is pushing Nvidia toward expensive and power-hungry FPGAs as opposed to ASICs for G-Sync. Can't argue with the results though.
     
  18. Archaea

    Archaea [H]ardForum Junkie

    Messages:
    9,433
    Joined:
    Oct 19, 2004
    Could you explain more? There are multiple types of Gsync chips?
     
  19. IdiotInCharge

    IdiotInCharge [H]ardForum Junkie

    Messages:
    11,626
    Joined:
    Jun 13, 2003
At least two or three. There might have been a revision of the first version, between the modules that users could retrofit a monitor with and the ones that monitors then shipped with, and then there are the ones that have the HDR FALD capability. The first two might be the same. Could have been more, but functionality-wise there have only been two.
     
  20. Aluminum

    Aluminum Gawd

    Messages:
    672
    Joined:
    Sep 18, 2015
    Giant company designing multiple N-billion transistor chips continues to peddle custom FPGAs for consumer products many years later. Sums up how much they believe in their own bullshit while trying to pawn it off to monitor OEMs.

Also, on the 34GK950, the F version is now objectively & technically superior to the G version, so at least there is precedent for LG doing it right.
     
  21. thelead

    thelead 2[H]4U

    Messages:
    2,088
    Joined:
    May 28, 2005
That LG monitor has had many complaints as far as quality control goes. I had some flickering and had to return mine. There's a reason why it's not in stock at multiple vendors.
     
  22. IdiotInCharge

    IdiotInCharge [H]ardForum Junkie

    Messages:
    11,626
    Joined:
    Jun 13, 2003
    It's custom. Custom isn't cheap, and you'd bet if they could reduce costs and increase margins (or just sales period), they'd do it.

    I'm not familiar with the models, do you have references that explain the disparity?
     
  23. Aluminum

    Aluminum Gawd

    Messages:
    672
    Joined:
    Sep 18, 2015
Custom is bullshit. An ASIC company using FPGAs for years in consumer products is laughable. They never believed in their own custom smoke and always expected to give in to the actual standard when it was convenient. They are throwing billions at smart-car chips that are still shipping in nothing; they believe in that.

    Same glorious LG DCI-P3 98% panel, not the older LG panel that people still fawn over because it has an Alienware™ stamp:
    http://www.tftcentral.co.uk/reviews/lg_34gk950f.htm 144hz, 10bit input, no FPGA, no RGB frag harder disco light.
http://www.tftcentral.co.uk/reviews/lg_34gk950g.htm 120hz, 8bit input, FPGA, RGB frag harder disco light.

    Custom FPGA literally cripples it.
     
    Jedibeeftrix and xp3nd4bl3 like this.
  24. IdiotInCharge

    IdiotInCharge [H]ardForum Junkie

    Messages:
    11,626
    Joined:
    Jun 13, 2003
    Nvidia makes money. No reason to spend on an ASIC when an FPGA is cheaper at a given volume. Expecting them to spend unnecessarily is 'laughable'.

    Citation needed. They're still shipping G-Sync solutions.

    This is on LG- the current G-Sync modules support everything that the panel is capable of.
     
  25. xp3nd4bl3

    xp3nd4bl3 2[H]4U

    Messages:
    2,259
    Joined:
    Sep 25, 2004
    I don't think that's correct. I think that you're assuming that the "current G-sync modules" are all the G-Sync Ultimate modules. I believe that the LG is using the standard G-Sync module (the OG one, still "current") because the panel doesn't support everything necessary for the "G-Sync Ultimate" experience, like HDR1000.
     
  26. IdiotInCharge

    IdiotInCharge [H]ardForum Junkie

    Messages:
    11,626
    Joined:
    Jun 13, 2003
    They could use that module if they chose to fully support the panel, at the very least, and that assumes that there's not another option. Still LG's choice here.
     
  27. xp3nd4bl3

    xp3nd4bl3 2[H]4U

    Messages:
    2,259
    Joined:
    Sep 25, 2004
    Which G-Sync module could LG use that supports 3440x1440 at 144 and HDR400 or HDR600?
     
  28. IdiotInCharge

    IdiotInCharge [H]ardForum Junkie

    Messages:
    11,626
    Joined:
    Jun 13, 2003
    The one that supports 4k144Hz HDR?

    I'm not understanding why this is hard...?
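For scale (my math, not from any spec sheet), a module driving 4K at 144 Hz is already pushing considerably more raw pixels than 3440x1440 at 144 Hz needs:

```python
# Rough raw pixel-throughput comparison (ignores blanking; just pixels/second).
uhd_144 = 3840 * 2160 * 144    # 4K @ 144 Hz
uwqhd_144 = 3440 * 1440 * 144  # 3440x1440 @ 144 Hz

print(f"4K @ 144 Hz:        {uhd_144 / 1e6:.0f} Mpx/s")
print(f"3440x1440 @ 144 Hz: {uwqhd_144 / 1e6:.0f} Mpx/s")
print(f"ratio: {uhd_144 / uwqhd_144:.2f}x")
```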
     
    thelead likes this.
  29. xp3nd4bl3

    xp3nd4bl3 2[H]4U

    Messages:
    2,259
    Joined:
    Sep 25, 2004
Like I said, that's called the G-Sync Ultimate module, and Nvidia requires HDR1000.
     
  30. IdiotInCharge

    IdiotInCharge [H]ardForum Junkie

    Messages:
    11,626
    Joined:
    Jun 13, 2003
    ...if it's going to be a G-Sync Ultimate display.
     
  31. xp3nd4bl3

    xp3nd4bl3 2[H]4U

    Messages:
    2,259
    Joined:
    Sep 25, 2004
    It's not an HDR1000 display...
    Therefore it does not meet G-Sync Ultimate requirements from Nvidia...
    Therefore it cannot use the G-Sync Ultimate module...
    Therefore it must employ FreeSync, FreeSync 2, or G-Sync standard...

    G-Sync standard module does not support HDR (or 3440x1440 above 120Hz, IIRC)

    Therefore this is a limitation of Nvidia and their ASIC, not of LG's implementation of G-Sync for this panel.

Not trying to argue with you; these are Nvidia's rules as I understand them.
     
  32. IdiotInCharge

    IdiotInCharge [H]ardForum Junkie

    Messages:
    11,626
    Joined:
    Jun 13, 2003
    This logic doesn't follow. The module presents a set of capabilities. G-Sync Ultimate is a product labeling / branding thing. LG could easily use the module for its expanded capabilities beyond the standard module without labeling and marketing the monitor as 'G-Sync Ultimate'. And again that assumes that an 'in between' module doesn't exist.
     
  33. xp3nd4bl3

    xp3nd4bl3 2[H]4U

    Messages:
    2,259
    Joined:
    Sep 25, 2004
    I don't believe that is correct. From what I recall and understand (and I could totally be wrong here) Nvidia won't sell the G-Sync Ultimate module for a panel that doesn't meet the G-Sync ultimate requirements (like HDR1000). This module is very expensive, rumored around $500 and wouldn't make sense in a mid-range monitor anyway. There is no in-between module. (Though specs of unreleased G-Sync "standard" monitors with HDMI 2.0 ports suggest that maybe a refresh is coming - but these don't show HDR support)

    When LG releases two monitors using the same panel and the G-Sync version doesn't support HDR and > 120Hz while the FreeSync version does, you just assume that it's LG's mistake in implementation of G-Sync? There are many, many FreeSync 2 HDR monitors. How many G-Sync HDR monitors are there that aren't "G-Sync Ultimate" certified at HDR1000? None that I'm aware of. I have a hard time concluding that it's the monitor manufacturer's shortcoming and not an Nvidia/ASIC limitation as you have done.
     
  34. IdiotInCharge

    IdiotInCharge [H]ardForum Junkie

    Messages:
    11,626
    Joined:
    Jun 13, 2003
    Because of this assumption:

    And I don't see a 34" 144Hz 10bit panel as 'mid-range'; that's definitely high-end.

    I'm not really trying to be argumentative- it seems that LG could have used a better part to fully support their panel. At the same time any of the reasons you've brought up (cost, branding) could have been involved, but there's no technical reason for the lack of support.

    What's really hurting G-Sync is the economy of scale involved. By implementing fewer monitor-side features, Freesync takes advantage of the economy of scale by being an 'affordable' value add for monitor ASIC producers.
     
  35. xp3nd4bl3

    xp3nd4bl3 2[H]4U

    Messages:
    2,259
    Joined:
    Sep 25, 2004
    Here, this is where I read it:

    https://www.anandtech.com/show/13060/asus-pg27uq-gsync-hdr-review/2

    That link references the Nvidia G-Sync HDR whitepaper.

    This is where I concluded that Nvidia won't let somebody release a G-Sync display with HDR support that isn't "Ultimate". And the fact that none exists, and that LG was forced to nerf this display in the G-Sync version seems to support that conclusion.
     
    IdiotInCharge likes this.
  36. IdiotInCharge

    IdiotInCharge [H]ardForum Junkie

    Messages:
    11,626
    Joined:
    Jun 13, 2003
    It's a guess, may be right- but I don't see anything to support that they wouldn't sell the module for use in a non-Ultimate display. I get that earning the certification is a hard requirement, but the part is just a part.
     
  37. xp3nd4bl3

    xp3nd4bl3 2[H]4U

    Messages:
    2,259
    Joined:
    Sep 25, 2004
    I sort of thought the same so I'm not sure why we haven't seen any. With Freesync support on Nvidia GPUs it may not matter anymore.
     
    IdiotInCharge likes this.
  38. IdiotInCharge

    IdiotInCharge [H]ardForum Junkie

    Messages:
    11,626
    Joined:
    Jun 13, 2003
G-Sync is going to be a hard sell going forward. I have two G-Sync displays (and one FreeSync) and I can't complain, but I realize that Freesync 2 tightening the standard, and Nvidia enabling it over DisplayPort and maybe HDMI on future generations, means G-Sync will have to be price-competitive to sell - and that's not going to happen even if G-Sync hardware is the same price as Freesync hardware. Economy of scale and all. We'd need to see something radical like G-Sync support from typical ASIC makers that would likely also support Freesync, or a lot of work on Nvidia's side to lessen the burden on display manufacturers.
     
  39. Vega

    Vega [H]ardness Supreme

    Messages:
    6,258
    Joined:
    Oct 12, 2004
    There is only one DP 1.4 G-Sync module, it is using the Intel Altera Arria 10 GX480 FPGA. A monitor doesn't need to have HDR, let alone HDR1000 to use this G-Sync module. NVIDIA simply doesn't use the FALD back-light channel that the FPGA has on a monitor like the LG 38GL950G.

Not all DP 1.4 G-Sync monitors, now and in the future, will be HDR1000 "Ultimate". The new 27" 4K 144 Hz monitors that are the FALD-less (and basically HDR-less) versions of the PG27UQ and X27 use the same G-Sync DP 1.4 module/FPGA, fans to cool it, etc.
     
    Last edited: Apr 11, 2019
    thelead likes this.