OLED HDMI 2.1 VRR LG 2019 models!!!!

Discussion in 'Displays' started by sharknice, Jan 2, 2019.

  1. sharknice

    sharknice [H]ard|Gawd

    Messages:
    1,437
    Joined:
    Nov 12, 2012


    BFGD DEAD

    True HDMI 2.1 with every 2.1 feature. 48 Gbps bandwidth, variable refresh rate, low-latency game mode, etc. etc.

    Time to get an AMD card with FreeSync and do the power-saving trick with a 2080 Ti
     
    Baenwort and N4CR like this.
  2. Sancus

    Sancus Gawd

    Messages:
    791
    Joined:
    Jun 1, 2013
    fucking sweet RIP BFGD
     
  3. Anemone

    Anemone Gawd

    Messages:
    884
    Joined:
    Apr 5, 2004
    HDMI 2.1 - worth waiting for.
    VRR is something every vendor should support, because it will move a wealth of products, not just TVs.
    NOW, why on earth shouldn't monitors also have HDMI 2.1 in 2019?
     
    Solhokuten and N4CR like this.
  4. MistaSparkul

    MistaSparkul Gawd

    Messages:
    906
    Joined:
    Jul 5, 2012
    Dude, no AMD card has HDMI 2.1... so it doesn't matter if the TV has it; the GPU will need to have it too. But yes, still EXTREMELY exciting news.
     
    Armenius likes this.
  5. bizzmeister

    bizzmeister [H]ard|Gawd

    Messages:
    1,442
    Joined:
    Apr 26, 2010
    Can one of you pros explain why this is so amazing compared to the TVs out now?
     
  6. sharknice

    sharknice [H]ard|Gawd

    Messages:
    1,437
    Joined:
    Nov 12, 2012
    Enough bandwidth for 4K at 120 Hz. Variable refresh rate, which allows tear-free and lag-free gaming.
    It's OLED, so the picture quality blows LCD away.
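    A rough back-of-the-envelope (active pixels only; the real link rate is higher once blanking and encoding overhead are added) shows why HDMI 2.0's 18 Gbps can't do it:

    ```python
    # Active-pixel data rate for 4K at 120 Hz. Real HDMI link bandwidth
    # is higher still once blanking intervals and encoding overhead are
    # included.
    width, height, refresh = 3840, 2160, 120
    for bits_per_pixel in (24, 30):  # 8-bit and 10-bit RGB / 4:4:4
        gbps = width * height * refresh * bits_per_pixel / 1e9
        print(f"{bits_per_pixel} bpp: {gbps:.1f} Gbit/s")
    # 24 bpp: 23.9 Gbit/s -> already past HDMI 2.0's 18 Gbps
    # 30 bpp: 29.9 Gbit/s
    ```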
     
    Baenwort and Brian_B like this.
  7. pippenainteasy

    pippenainteasy Gawd

    Messages:
    516
    Joined:
    May 20, 2016
    Navi will almost certainly have HDMI 2.1; the chipset has been out for the better part of a year.

    2019 I might go all AMD with Zen 2 and dump the 2080 Ti. Tired of no gsync on big 4K displays. I'll get a 4K TV with VRR and game on AMD Navi. Screw Nvidia and their proprietary crap. Who wants to spend $5k on a display just for 4K gsync?
     
    Baenwort likes this.
  8. HiCZoK

    HiCZoK Gawd

    Messages:
    816
    Joined:
    Sep 18, 2006
    I would say it's worth buying a FreeSync monitor over G-Sync now. If Intel and TVs all start using FreeSync, Nvidia will break one day and do it too.
     
  9. MistaSparkul

    MistaSparkul Gawd

    Messages:
    906
    Joined:
    Jul 5, 2012
    Navi is rumored to only be around the performance of an RTX 2070, not really enough for some 4K games. But can't we still do the trick of using a low-end AMD GPU for FreeSync while keeping the 2080 Ti as the rendering GPU, as long as the AMD card has HDMI 2.1? Surely even the low-end Navi would have it.
     
  10. pippenainteasy

    pippenainteasy Gawd

    Messages:
    516
    Joined:
    May 20, 2016
    I don't play many shooters. When mainly playing RTS or RPG games, I tested my 1440p G-Sync monitor's lower range, and I found I couldn't tell the difference between 42 fps and 60 fps. I had to run 200% resolution scaling with AA jacked up on my 2080 Ti to force the fps that low in a lot of the games I tested (like Witcher 3).

    Also, I personally find that simply raising texture detail to max and reducing lighting to medium with AA off lets even a GTX 1080-class card hit 60 fps in most games at 4K. If Navi is in that performance range it might be good enough for me; it would certainly allow me a lot more flexibility in displays than going Nvidia, and it shouldn't be too hard to target 40+ fps with lower settings.
     
    Baenwort likes this.
  11. bigbluefe

    bigbluefe Gawd

    Messages:
    550
    Joined:
    Aug 23, 2014
    Good enough for emulation, which is all that really matters at this point anyway.
     
  12. N4CR

    N4CR 2[H]4U

    Messages:
    3,429
    Joined:
    Oct 17, 2011
    Waiting paid off.
    Because the chipset to do it (just HDMI, none of the other functions) will probably be a million-dollar minimum order, and for niche gaming screens that isn't feasible.
     
  13. bigbluefe

    bigbluefe Gawd

    Messages:
    550
    Joined:
    Aug 23, 2014
    KazeoHin and Baenwort like this.
  14. HiCZoK

    HiCZoK Gawd

    Messages:
    816
    Joined:
    Sep 18, 2006
    Whatever. The next consoles will be 4K30 anyway, maybe 60 fps for some games.
    And on PC, the best hardware now gives you 4K60... but I wouldn't want a 55" display on my desk.
     
  15. Vega

    Vega [H]ardness Supreme

    Messages:
    5,951
    Joined:
    Oct 12, 2004
    I am not convinced this is the real-deal 48 Gbps chipset. You can do all sorts of stupid stuff to 4K/120 Hz, like reducing chroma to 4:2:0 8-bit, dropping HDR, up-sampling/down-sampling, or compressing, to fit it into 18 Gbps HDMI 2.0 chips.

    I smell something that stinks. I hope I am wrong.
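    For reference, the arithmetic shows how that squeeze works: 4K/120 Hz does slip under 18 Gbps once you drop to 4:2:0 8-bit. A rough sketch, assuming the standard 4400x2250 4K timing and 8b/10b TMDS overhead:

    ```python
    # How 4K 120 Hz can be squeezed onto an 18 Gbps HDMI 2.0 link:
    # in HDMI, 4:2:0 carries two pixels per TMDS clock, halving the
    # required pixel clock. Standard 4400 x 2250 total timing assumed.
    pixel_clock = 4400 * 2250 * 120 / 2        # 594 MHz with 4:2:0
    gbps = pixel_clock * 24 * (10 / 8) / 1e9   # 8-bit, 8b/10b overhead
    print(f"{gbps:.2f} Gbit/s")                # 17.82 -> just under 18
    ```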
     
    Last edited: Jan 3, 2019
  16. ND40oz

    ND40oz [H]ardForum Junkie

    Messages:
    11,149
    Joined:
    Jul 31, 2005
    eARC is what I've been waiting for, time to upgrade my B6.
     
    Armenius likes this.
  17. Armenius

    Armenius I Drive Myself to the [H]ospital

    Messages:
    16,182
    Joined:
    Jan 28, 2014
    HDMI VRR isn't FreeSync. The whole point of HDMI VRR is to eliminate the need for a software layer to make VRR work. Meaning, so long as a video card has an HDMI 2.1 output, no matter who manufactured it, you will get VRR when it is connected to a display with an HDMI 2.1 input.
    I tend to agree. The Xbox One X is HDMI 2.1 in that it supports VRR and ALLM, but it does not support the full 48 Gbps bandwidth.
     
    Last edited: Jan 3, 2019
    Baenwort, N4CR and defaultluser like this.
  18. Lastan010

    Lastan010 [H]Lite

    Messages:
    123
    Joined:
    Mar 2, 2017
    mind as well just get a cell phone to play those games.
     
  19. GameLifter

    GameLifter Limp Gawd

    Messages:
    337
    Joined:
    Sep 4, 2014
    I'll be keeping my eye on these, but I still plan to keep my OLED C6 for a bit longer. I'd love it if there were a breakthrough in blur reduction on these. The 2018 models had black frame insertion, but I heard it could have been implemented better.
     
  20. bigbluefe

    bigbluefe Gawd

    Messages:
    550
    Joined:
    Aug 23, 2014
    We really need to get to the point where we can just seamlessly use HDMI with a receiver with a computer monitor and not have to have phantom monitors and other horse shit. The whole situation we're in right now is a disgrace.
     
  21. Aluminum

    Aluminum Gawd

    Messages:
    572
    Joined:
    Sep 18, 2015
    I can push 200 Gbps over a $20 passive QSFP DAC cable up to 3 m or so, but we had to wait how long to get shitty, crippled HDMI above 18 Gbps? (Which many retail stores will try to trick you into buying a $100 branded cable for.)
    And even then a lot of TVs do dumb chroma shit, default to compressed black levels, etc. etc.

    Endless bottom-dollar mediocrity and Hollywood ball-sucking crypto is annoying.
     
    Baenwort and Supercharged_Z06 like this.
  22. gan7114

    gan7114 Limp Gawd

    Messages:
    185
    Joined:
    Dec 14, 2012
    I would temper our excitement until we get the actual details next week. It wouldn't surprise me to see limited implementation of HDMI 2.1 with the first round of products from this year's CES.

    It doesn't help that the HDMI Org has allowed HDMI 2.1 features to be implemented piecemeal, with some able to be implemented on HDMI 2.0. We know that these 2019 models will come with VRR, ALLM, and eARC, but AFAIK all of those can be implemented on 18 Gbps chipsets.

    In other words, we want the "full fat" 48 Gbps chipsets.

    The gold standard we're aiming for (especially for PC use) continues to be 4K @ 120 Hz @ 4:4:4 10-bit HDR. That kind of bandwidth clocks in at 45 Gbps. Moreover, these 2019 models are meaningless if we don't also have graphics cards that support HDMI 2.1 and can push all these features.
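    If you want to check that figure, the arithmetic looks roughly like this, assuming the standard CTA-861 4K timing (4400x2250 total, including blanking) and the 8b/10b-style overhead most bandwidth calculators apply (HDMI 2.1's actual FRL signaling uses a more efficient 16b/18b encoding, so its on-wire number is a bit lower):

    ```python
    # Sanity check of the ~45 Gbps figure for 4K 120 Hz 4:4:4 10-bit.
    # Assumes standard CTA-861 4K timing (blanking included) and
    # 8b/10b-style encoding overhead.
    h_total, v_total, refresh = 4400, 2250, 120  # active 3840x2160 + blanking
    bits_per_pixel = 30                          # 10-bit RGB / 4:4:4
    pixel_clock = h_total * v_total * refresh    # 1188 MHz
    raw_gbps = pixel_clock * bits_per_pixel / 1e9
    encoded_gbps = raw_gbps * 10 / 8             # 8b/10b overhead
    print(f"raw: {raw_gbps:.1f}, encoded: {encoded_gbps:.1f} Gbit/s")
    # raw: 35.6, encoded: 44.5 -> the ~45 Gbps figure
    ```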
     
    Supercharged_Z06 and Armenius like this.
  23. Brian_B

    Brian_B 2[H]4U

    Messages:
    2,248
    Joined:
    Mar 23, 2012
    This is good news.

    I will upgrade my C7 as soon as I get some devices that can actually use 2.1 output.
     
  24. Keller1

    Keller1 n00b

    Messages:
    39
    Joined:
    Dec 10, 2013
    Seems like your math's off.
    30 bpp × 4K × 120 Hz is 29.8 Gbps.
    These chipsets are theoretically capable of ~4K @ 190(ish) Hz @ 10-bit, or 5K (5120x2160) ultrawide @ 144(ish) Hz @ 10 bpc. Both of those are ~48 Gbps.

    Unless HDR is additional and not incorporated within the 4K 10 bpc, I don't see where the extra overhead fits.


    Unfortunately for us, LG probably doesn't care enough to support custom display modes.
     
  25. sharknice

    sharknice [H]ard|Gawd

    Messages:
    1,437
    Joined:
    Nov 12, 2012
    They're specifically saying it supports everything in the press release, unlike other TVs' and the Xbox's partial support.

    But yes, now we need a graphics card that does too.
     
  26. gan7114

    gan7114 Limp Gawd

    Messages:
    185
    Joined:
    Dec 14, 2012
    All I'm saying is I didn't see 48 Gbps mentioned. Other sites seem to be inferring that, which makes me raise an eyebrow until we know for sure.
     
  27. MistaSparkul

    MistaSparkul Gawd

    Messages:
    906
    Joined:
    Jul 5, 2012
    Well hey, we don't have to wait very long to find out. All will be revealed next week at CES.
     
    Armenius likes this.
  28. defaultluser

    defaultluser I B Smart

    Messages:
    12,129
    Joined:
    Jan 14, 2006
    I'm excited to see this tech finally hit the streets. In another couple years, it might actually be affordable :D

    I'm still going to enjoy my C7 for another 5 years, while you early adopters have fun!
     
  29. gan7114

    gan7114 Limp Gawd

    Messages:
    185
    Joined:
    Dec 14, 2012
    Yes, HDR adds significantly more bandwidth to the requirement. The math is correct.

    If you want to play around with the numbers, just go here: https://www.extron.com/product/videotools.aspx
     
    Baenwort, Armenius and defaultluser like this.
  30. pippenainteasy

    pippenainteasy Gawd

    Messages:
    516
    Joined:
    May 20, 2016
    The "mind" is unable to find a cell phone that runs witcher 3 at 4k.
     
  31. N4CR

    N4CR 2[H]4U

    Messages:
    3,429
    Joined:
    Oct 17, 2011
  32. sharknice

    sharknice [H]ard|Gawd

    Messages:
    1,437
    Joined:
    Nov 12, 2012
    I think they'll keep the price the same as last year's models despite the upgrades. I'm glad I waited and didn't buy a 2018 model on Black Friday.
     
  33. Enhanced Interrogator

    Enhanced Interrogator Gawd

    Messages:
    848
    Joined:
    Mar 23, 2013
    I don't know if this is true. You could use Custom Resolution Utility to overclock your HDMI connection and set custom FreeSync ranges. So even if 120 Hz isn't possible, 90 Hz or 75 Hz might be. Still very cool.
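    For a rough sense of where the ceiling sits on today's 18 Gbps hardware, here's a sketch; the blanking values are assumed CVT-RB-style numbers, not anything LG publishes, and an actual overclock may push somewhat past the nominal 600 MHz limit:

    ```python
    # Rough ceiling for a CRU-style 4K overclock on an HDMI 2.0 link.
    # HDMI 2.0 nominally tops out at a 600 MHz TMDS pixel clock
    # (18 Gbps after 8b/10b encoding).
    max_pixel_clock = 600e6                  # Hz, HDMI 2.0 TMDS limit
    h_total = 3840 + 160                     # assumed reduced h-blanking
    v_total = 2160 + 62                      # assumed reduced v-blanking
    print(f"~{max_pixel_clock / (h_total * v_total):.0f} Hz")  # ~68 Hz
    # at 4K 8-bit 4:4:4; dropping to 4:2:0 halves the required clock,
    # which is how higher refresh rates fit at reduced chroma.
    ```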
     
  34. Lepardi

    Lepardi Limp Gawd

    Messages:
    142
    Joined:
    Nov 8, 2017
    So that means €2500 for the 55" until the 2018 models have sold out; gonna be a long wait.
     
    Armenius likes this.
  35. IdiotInCharge

    IdiotInCharge [H]ardForum Junkie

    Messages:
    8,389
    Joined:
    Jun 13, 2003
    Don't use an appropriate receiver?
     
  36. Keller1

    Keller1 n00b

    Messages:
    39
    Joined:
    Dec 10, 2013
    Yes, but AMD also has an announcement in 5 days @ CES. The hope for a match made in heaven in Q2 2019 still lives.
     
  37. Wade88

    Wade88 Limp Gawd

    Messages:
    160
    Joined:
    Jun 21, 2015
    You can use ARC with an appropriate receiver to play smart TV sources or F-connector sources through your receiver's speakers instead of the TV's. Say your receiver does not have enough HDMI ports for all of your equipment: you can plug devices into your TV and, with eARC, not pay the penalty you did with ARC, which was limited to 2-channel TOSLINK quality instead of eARC's 37-40 Mbps, plus you get Ethernet in addition to the rest of the HDMI 2.1 specs.
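    That 37-40 Mbps figure lines up with uncompressed 8-channel PCM, which eARC can carry and plain ARC/TOSLINK cannot; a quick check:

    ```python
    # The ~37 Mbps eARC figure matches uncompressed 7.1 PCM at
    # 192 kHz / 24-bit; TOSLINK-era ARC is limited to 2-channel
    # uncompressed PCM (or lossy 5.1 bitstreams).
    channels, sample_rate, bit_depth = 8, 192_000, 24
    print(f"{channels * sample_rate * bit_depth / 1e6:.1f} Mbit/s")  # 36.9
    ```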
     
    Armenius likes this.
  38. ND40oz

    ND40oz [H]ardForum Junkie

    Messages:
    11,149
    Joined:
    Jul 31, 2005
    Depends on the setup. For the living room, less is more and if I can use the TV for source switching, it's better than having to use a receiver to do it. Like Wade88 said, if you're using the TV's app as a source, having eARC will get you full audio support as well.
     
    Baenwort and Armenius like this.
  39. Wade88

    Wade88 Limp Gawd

    Messages:
    160
    Joined:
    Jun 21, 2015
    I prefer to have the AVR do the source switching so there can be minimal wires in the wall to the TVs. I have big cats that like to leap around, so the TVs get mounted to studs in the wall and have polycarbonate anti-cat armor. One power and one HDMI cable go into the wall in a conduit, next to the conduit for the satellite speakers that aren't L/C/R and the RCA to the sub, and come out behind the TV so the cats can't unplug anything. If you switch all your sources with the AVR, it's just the one ARC cable doing TV sources and traditional AVR duty for your other sources. If you mount on a regular piece of furniture this might not be a consideration. It's also nice to keep my nieces from destroying them when they're not as closely attended.
     
    Corvette and Armenius like this.
  40. Snowknight26

    Snowknight26 [H]ardness Supreme

    Messages:
    4,160
    Joined:
    May 8, 2005
    It doesn't. HDR10 metadata can be encoded in the video stream's SEI (which already exists) and adds negligible overhead to the file, since SEIs are only present once per GOP. That doesn't translate to any additional link bandwidth, since you've already established the display parameters when changing display modes.
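    Putting the thread's two numbers side by side: the gap between the ~29.8 Gbps active-pixel figure and the ~45 Gbps figure is blanking intervals plus link-encoding overhead, not the HDR metadata itself. A minimal sketch:

    ```python
    # 4K 120 Hz 10-bit: active pixels only vs. full timing with
    # blanking plus 8b/10b-style encoding overhead. HDR10 static
    # metadata adds essentially nothing, per the SEI point above.
    bpp, hz = 30, 120
    active = 3840 * 2160 * hz * bpp / 1e9           # 29.9 Gbit/s
    full = 4400 * 2250 * hz * bpp * (10 / 8) / 1e9  # 44.6 Gbit/s
    print(f"active only: {active:.1f} Gbps, full link: {full:.1f} Gbps")
    ```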