OLED HDMI 2.1 VRR LG 2019 models!!!!

Discussion in 'Displays' started by sharknice, Jan 2, 2019.

  1. bigbluefe

    bigbluefe Limp Gawd

    Messages:
    507
    Joined:
    Aug 23, 2014
    Really, it's time for the Jedi special purpose monitors to end. It's bullshit. All monitors should have accurate colors, low latency, variable refresh, etc. It doesn't have to be and shouldn't be a tradeoff. Anything less is a scam.
     
  2. IdiotInCharge

    IdiotInCharge Not the Idiot YOU are Looking for

    Messages:
    7,923
    Joined:
    Jun 13, 2003
    Talk about tripling or quadrupling the average price of a monitor...

    I get the sentiment, but most computer users (well, nearly all) just need 'close enough' at best.
     
  3. bigbluefe

    bigbluefe Limp Gawd

    Messages:
    507
    Joined:
    Aug 23, 2014
    Honestly it's all economies of scale. If you just make more of the same product you can make it much cheaper. Consumers are just getting fucked hard here. The industry has blown and stagnated for a very long time.
     
  4. IdiotInCharge

    IdiotInCharge Not the Idiot YOU are Looking for

    Messages:
    7,923
    Joined:
    Jun 13, 2003
    The only part that would be difficult to scale is accurate color; perhaps they could devise a production process that yields accurate monitors without calibration, but generally speaking, to guarantee calibration each unit must be tested and adjusted where necessary.
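    To illustrate what that per-unit step amounts to: measure the panel's actual output at each gray level, then build a lookup table that remaps inputs so the corrected response follows the target curve. A minimal sketch, with made-up measurements standing in for a real colorimeter reading:

    ```python
    # Sketch of per-unit grayscale calibration: measure the panel's actual
    # response at each gray level, then build a LUT that remaps input levels
    # so the corrected output follows a target gamma-2.2 curve.
    # The "measured" values below are hypothetical, not from any real unit.

    def build_correction_lut(measured, target_gamma=2.2, levels=256):
        # measured[j] = normalized luminance the panel emits for input j/(n-1)
        n = len(measured)
        lut = []
        for i in range(levels):
            target = (i / (levels - 1)) ** target_gamma  # desired luminance
            # pick the input level whose measured luminance is closest to target
            best = min(range(n), key=lambda j: abs(measured[j] - target))
            lut.append(round(best * (levels - 1) / (n - 1)))
        return lut

    # Hypothetical unit that tracks too dark (native gamma ~2.4)
    measured = [(j / 255) ** 2.4 for j in range(256)]
    lut = build_correction_lut(measured)
    print(lut[128])  # mid-gray gets pushed up to compensate
    ```

    Real calibration works in 3D (per-channel, often a 3D LUT) and against measured XYZ values, but the remap-to-target idea is the same, which is why it's hard to skip the per-unit measurement.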
     
  5. Lepardi

    Lepardi Limp Gawd

    Messages:
    137
    Joined:
    Nov 8, 2017
    Calibration AI
     
  6. kasakka

    kasakka Gawd

    Messages:
    966
    Joined:
    Aug 25, 2008
    This won't happen as long as most computer buyers keep buying the cheapest displays available.

    The whole monitor and TV industry is fucked up. TV manufacturers barely bother to support their products; my 2016 Samsung TV isn't getting things like FreeSync support even though it's most likely 90% the same hardware as the models made a year later. Meanwhile computer monitor manufacturers just want to peddle the same 27" panels over and over with extremely slow development.
     
  7. Enhanced Interrogator

    Enhanced Interrogator Gawd

    Messages:
    842
    Joined:
    Mar 23, 2013
    That's definitely a big part of it. It seems like a majority of PC users have always focused more on the FRAPS counter in the corner of the screen than on the quality of the visuals actually on the screen.
     
  8. elvn

    elvn 2[H]4U

    Messages:
    2,997
    Joined:
    May 5, 2006
    Interesting that someone took the idea of white OLEDs being sort of per-pixel FALD and made a dual-layer VA LCD with the rear LCD as the backlight. It's capable of .003 black depth, but it isn't very bright, definitely not HDR-color bright. Around 25ms or less input lag, so that has room for improvement. They claim 4K HDMI 2.1 is possible. Wonder why no one thought of it before. Idk why they used a 1080p panel as the backlight display either.

     
    Last edited: Jan 14, 2019 at 1:47 AM
  9. sharknice

    sharknice [H]ard|Gawd

    Messages:
    1,347
    Joined:
    Nov 12, 2012
    I hear about progress on this every year, and they always have problems: it isn't bright enough, or it has terrible viewing angles, or strange artifacts, or something.

    It seems like they've finally got it working decently here, but you can tell it's far from perfect even through the YouTube video. You can see pretty significant haloing in the video, and who knows how the colors and everything else are. I think it's a long way from actually competing with OLED.
     
  10. Necere

    Necere 2[H]4U

    Messages:
    2,651
    Joined:
    Jan 3, 2003
    Panasonic demoed a very similar dual-layer LCD tech a couple of years ago, and there are now a couple of pro monitors you can buy that use it (e.g., Eizo CG3145). Very expensive though.

    I can see what looks like some haloing in the video, but that's on the single grayscale module that's there to show how the tech works. The full display (second half of the video) looks pretty good as far as I can tell. It's a VA panel, of course, so it's going to have more limited viewing angles than IPS or OLED, but if they can actually deliver at a low-ish price this might be a great alternative to OLED or conventional FALD LCD.
     
    Last edited: Jan 14, 2019 at 3:25 AM
  11. Sancus

    Sancus Gawd

    Messages:
    767
    Joined:
    Jun 1, 2013
    They used 1080p to reduce backlight attenuation, which increases as pixel density increases.

    Stacking LCD panels on top of each other isn't novel technology; that's how the Eizo CG3145 works, and its specs are indeed impressive. But common sense tells you why this isn't a viable consumer product: two LCDs cost twice as much AND vastly attenuate the backlight, making the display much more expensive and power-hungry. Not to mention that LCDs have slow pixel response times, and calibrating the two panels to respond identically and consistently is probably very challenging. I'm sure there are a ton of other issues. Hell, even getting a FALD backlight to play nice is incredibly challenging and R&D-intensive; that's why Nvidia had to sink so much effort into getting G-Sync HDR backlights to respond fast enough to even look correct at gaming framerates and viewport movement speeds. Now you're talking about coordinating two LCDs AND an extra-powerful, extra-dense FALD backlight.

    It really doesn't make much sense; microLED is more practical even though it's not ready yet.

    E: Yeah viewing angles are another issue, since you have two panels with their associated structure around the subpixels, any color variation or brightness reduction off-axis will be at least doubled.
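    The attenuation and contrast points are just multiplied per-panel figures, so a quick back-of-the-envelope shows both sides of the trade. The numbers below are illustrative ballparks, not measurements of any specific panel:

    ```python
    # Back-of-the-envelope for stacked LCDs. A single LCD transmits only a few
    # percent of its backlight; stacking two multiplies transmittances, so the
    # backlight must be far brighter (and hungrier) for the same output.
    # On the flip side, native contrast ratios multiply too.
    # All numbers are illustrative, not specs of any real panel.

    transmittance = 0.05        # ~5% light throughput per panel (ballpark)
    contrast_per_panel = 3000   # native contrast of one VA panel

    stacked_transmittance = transmittance * transmittance  # throughput of the stack
    backlight_penalty = 1 / transmittance                  # extra brightness vs one panel
    stacked_contrast = contrast_per_panel ** 2             # theoretical stacked contrast

    print(f"stacked throughput: {stacked_transmittance:.2%}")         # 0.25%
    print(f"backlight must be {backlight_penalty:.0f}x brighter")     # 20x
    print(f"theoretical stacked contrast: {stacked_contrast:,}:1")    # 9,000,000:1
    ```

    That 20x backlight penalty is why these started as mastering monitors, where power draw and cost matter far less than black level.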
     
    Last edited: Jan 14, 2019 at 3:25 AM
  12. elvn

    elvn 2[H]4U

    Messages:
    2,997
    Joined:
    May 5, 2006
    The video did say the idea wasn't new, that Dolby did some kind of demo years ago but had a lot of artifacts. It sounds difficult, but it looked decent at a glance in the vid, and they said they had made advancements past some of the previous hurdles, especially artifacts (going by what Linus said in the video). Effectively 2 million zones of local dimming, multiple layers of light blocking, etc.

    The .0003 black depth, 100% DCI-P3 color, and (according to Linus) "cheaper than OLED to make", plus 120Hz if HDMI 2.1 models arrive, all sounded good. The brightness ("just shy of 300 nits") is severely lacking, and I think he said input lag is 25ms due to the panel-matching chip, which, while OK, isn't great compared to modern 120Hz TVs. Two LCDs' worth of power wouldn't bother me at all though; I already run three LCDs at my desk plus a surround receiver and speakers :b If all else were good, the brightness would still be a dealbreaker. As it is, it's like an even dimmer peak-color pseudo-OLED-black-depth display with no burn-in risk, instead of the high HDR color nits of a high-end FALD LCD or future microLED tech. Now if it had no artifacting or screw-ups with VRR and HDR, and had 2000-nit or higher HDR color with 10ms input lag and good response times, it would look more like a microLED alternative.

    It'll still be interesting to see if it goes anywhere or improves.
     
  13. EniGmA1987

    EniGmA1987 [H]Lite

    Messages:
    97
    Joined:
    May 2, 2017
    Just something I saw in an article somewhere or other back towards the beginning/middle of last year. Nvidia had been building the tech into their servers to do properly accurate simulations for autonomous driving for both Tesla and Uber, and then both those deals for hardware and software fell through after Tesla went a different direction and Uber slowed things way down after the Arizona crash. With the hardware developed and extra servers around, and no customers wanting to be associated with autonomous driving failures and fatal accidents, Nvidia instead put the hardware to other use.
     
  14. elvn

    elvn 2[H]4U

    Messages:
    2,997
    Joined:
    May 5, 2006
    Personally I think we should make the infrastructure space age..

    ... That is, by making factory-prefabricated smart road sections and making the entire road infrastructure smart roads, starting with the highways, instead of rock-bed concrete and asphalt-frosting driveways. With both the roads and the cars being smart, as well as every car equipped with smart emitters/transmitters/receivers by law, the margin of error would be extremely small, and it would also help vs. weather conditions. Phones or some other smart devices could also help pedestrians be "seen". You could also lay massive stretches of fiber in the roads at the same time, to reach everywhere roads go in the long run. This would also go a long way toward making AR more doable, and sooner, on roadways to enhance things even more. Of course it would also put a lot of (taxable and spending) people to work for decades too. We've had automated assembly lines for products that fly down conveyor belts and rollers and auto-sort and divert etc. for decades; why can't we do something similar and make the roads into a smart machine? You could alternatively retrofit existing roads with smart technology, which would be a lot cheaper of course, but probably not as grand in vision and results.
    /gets off soapbox.

    TLDR: Rather than trying to make mainly optics-based individual cars smart, make the roads smart, all crossings smart, all vehicles smart, and pedestrians smart entities via smart devices as part of the system framework too. Make jobs doing it.
     
    Last edited: Jan 14, 2019 at 11:55 AM
  15. sharknice

    sharknice [H]ard|Gawd

    Messages:
    1,347
    Joined:
    Nov 12, 2012
    Yeah it seems like it could be good for mid range screens.

    But they still have a long way to go for the high end. And by the time they improve it enough to compete with OLED, microLED will be here, if they ever even get to that point.
     
  16. tybert7

    tybert7 2[H]4U

    Messages:
    2,628
    Joined:
    Aug 23, 2007

    Brightness seems to be one of the biggest sticking points for this display's color. As brightness rises, you start to lose color volume; conversely, it's probably easier to hit 100% of DCI-P3 at lower brightness levels.
     
  17. Keller1

    Keller1 n00bie

    Messages:
    11
    Joined:
    Dec 10, 2013
    Linus is being a goof. It's marketed as 2,900 nits, not 300 nits as the video suggests. The input lag is because their chipset is mediocre.

    If Samsung/LG made one of these using their own processors, I'd go for it.

    Really, the sacrifices versus OLED here are the perfect viewing angles and the panel response times.

    I could live with that trade-off for high brightness, guaranteed no burn-in, and a lower price, and hopefully either 8K (for upscaled 4K120) or a smaller panel size. Then again, we'd need to see the blacks in real life to judge the capabilities of the tech.