LG 38GL950G - 37.5" 3840x1600/G-Sync/175Hz

Discussion in 'Displays' started by Vega, Apr 26, 2019.

  1. bigbluefe

    bigbluefe Gawd

    Messages:
    588
    Joined:
    Aug 23, 2014
    This monitor has the widest application envelope of anything I've seen in years. The market is CRYING OUT for a larger monitor. Ever since 16:9 took off, monitors have all been too fucking short. We've either been stuck with shitty-ass PPI (31.5" monitors @ 1440p) or 60Hz (ZR30W-derivative 2560x1600 monitors or 31.5" 4K 60Hz monitors).

    Getting decent PPI, 30"+, high refresh rates, and variable refresh is basically what many people have been waiting for.

    27" 16:9 monitors were always too small. The fuckers are like the height of 20" 4:3 monitors. They've ALWAYS been too small.

    This monitor is basically the first objective upgrade over the ZR30W, which was released back in 2010. The GLACIAL pace of monitor advancement is easily the most frustrating aspect of computing. Hardware is simply not keeping up.
     
  2. IdiotInCharge

    IdiotInCharge [H]ardForum Junkie

    Messages:
    9,619
    Joined:
    Jun 13, 2003
    I'm including price in there- this will be a low-volume product as it is. I'm in the target upgrade market, and I have a ZR30W doing server duty right now ;).
     
  3. thelead

    thelead 2[H]4U

    Messages:
    2,065
    Joined:
    May 28, 2005
  4. elvn

    elvn 2[H]4U

    Messages:
    3,099
    Joined:
    May 5, 2006

    For reference, ordered by height (roughly, based on raw sizes):

    ----------------------------------------------------------------
    20.5" diagonal 16:10 .. 17.4" w x 10.9" h (1920x1200 ~ 110.45 ppi) FW900 CRT
    27.0" diagonal 16:9  .. 23.5" w x 13.2" h (2560x1440 ~ 108.79 ppi)
    34.0" diagonal 21:9  .. 31.4" w x 13.1" h (3440x1440 ~ 109.68 ppi)
    37.5" diagonal 21:10 .. 34.6" w x 14.4" h (3840x1600 ~ 110.93 ppi)
    31.5" diagonal 16:9  .. 27.5" w x 15.4" h (2560x1440 ~ 93.24 ppi) .. (3840x2160 ~ 139.87 ppi)
    40.0" diagonal 16:9  .. 34.9" w x 19.6" h (3840x2160 ~ 110.15 ppi)
    43.0" diagonal 16:9  .. 37.5" w x 21.1" h (3840x2160 ~ 102.46 ppi)
    48.0" diagonal 16:9  .. 41.8" w x 23.5" h (3840x2160 ~ 91.79 ppi)
    55.0" diagonal 16:9  .. 47.9" w x 27.0" h (3840x2160 ~ 80.11 ppi)
    ----------------------------------------------------------------
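    The widths, heights, and ppi figures in that list all fall out of the diagonal and the aspect ratio; a quick Python sketch of the geometry (function names are my own):

```python
import math

def panel_dimensions(diagonal_in, aspect_w, aspect_h):
    """Width and height in inches for a given diagonal and aspect ratio."""
    diag_units = math.hypot(aspect_w, aspect_h)
    scale = diagonal_in / diag_units
    return aspect_w * scale, aspect_h * scale

def ppi(diagonal_in, res_w, res_h):
    """Pixels per inch: pixel diagonal divided by physical diagonal."""
    return math.hypot(res_w, res_h) / diagonal_in

w, h = panel_dimensions(37.5, 24, 10)    # 3840x1600 is a 24:10 shape
print(round(w, 1), round(h, 1))          # ~34.6 x 14.4
print(round(ppi(37.5, 3840, 1600), 2))   # ~110.93
```

    The same two functions reproduce every row above, e.g. ppi(27, 2560, 1440) gives ~108.79.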
     
    Last edited: Jun 12, 2019 at 9:08 AM
    tungt88 and IdiotInCharge like this.
  5. xSneak

    xSneak Limp Gawd

    Messages:
    375
    Joined:
    Dec 30, 2013
    Is there any way these monitors will be better than the acer xb271hu in terms of contrast or pixel response?
     
  6. myaccountbroke

    myaccountbroke n00b

    Messages:
    36
    Joined:
    Jul 19, 2016
    I wonder how well content will scale with this, and whether you end up sacrificing a bit of visual quality in media because of said scaling.
     
  7. bananadude

    bananadude [H]Lite

    Messages:
    98
    Joined:
    Dec 29, 2006

    The PPI of this 38GL950G and a 27" 1440p monitor is the same (as with several other ultrawides), and 21:9 is a pretty standard ratio now, so there shouldn't be any issue with scaling. It's only some games and HUDs that have issues, for the most part.
     
  8. coynatha

    coynatha Limp Gawd

    Messages:
    241
    Joined:
    Jun 9, 2004
    What do you mean you can't turn it off? It's in the OSD menu I thought. I have it on when I play in the dark. I think it looks great. It's totally worthless in the daytime/lights on though. Just not bright enough.
     
  9. thelead

    thelead 2[H]4U

    Messages:
    2,065
    Joined:
    May 28, 2005
    I would love to be proven wrong but I have 2 sitting in front of me with updated FW and I don’t see an option to turn it off.
     
  10. coynatha

    coynatha Limp Gawd

    Messages:
    241
    Joined:
    Jun 9, 2004
    I see what you mean. When I first set mine up, I had to turn on HDR in Windows, and then go into the OSD Gamer Profiles and select the "HDR Effect" radio button to get it to kick on. If I do that, in the menu "HDR Effect" changes to "Standard". Hitting that again does nothing.

    The toggle in Windows 10 apparently has complete control over turning HDR on and off. The OSD menu doesn't appear to do anything except change that text in the menu from "HDR Effect" to "Standard" depending on whether it's on or off... and even when I pick "HDR Effect" and toggle it ON in Windows, I can swap back to "Gamer 1", where it's retained my other settings (brightness/contrast/refresh rate, etc.).
     
  11. x3sphere

    x3sphere 2[H]4U

    Messages:
    2,622
    Joined:
    Dec 8, 2007
    There's no option in the OSD, but HDR can be disabled in Windows and in games as far as I know. I've never had any problem turning it off through the OS or in individual games.
     
  12. coynatha

    coynatha Limp Gawd

    Messages:
    241
    Joined:
    Jun 9, 2004
    Oooo, look at pages 28 and 29 of the manual.

    It looks as if it changes from "HDR Effect" to "Standard" and vice versa because the monitor has two sets of profiles: one for regular SDR, which the manual just calls "Game Mode", and one for HDR, which it calls "HDR Game Mode".

    But it sure looks like the Windows HDR toggle is king. You don't change that, it doesn't matter what you pick in the OSD.
     
  13. Blade-Runner

    Blade-Runner 2[H]4U

    Messages:
    3,016
    Joined:
    Feb 25, 2013
    Still waiting on a 4K-equivalent ultrawide. I love my X34; I can never go back to standard aspect ratios.
     
  14. mda

    mda [H]ard|Gawd

    Messages:
    1,485
    Joined:
    Mar 23, 2011
    Want a 24" 1080p one, since I don't feel like upgrading my GPU so often and don't have space for a 27".
     
  15. IdiotInCharge

    IdiotInCharge [H]ardForum Junkie

    Messages:
    9,619
    Joined:
    Jun 13, 2003
    You're looking at a 12MP display...

    The video cards have not yet been made :eek:
     
  16. coynatha

    coynatha Limp Gawd

    Messages:
    241
    Joined:
    Jun 9, 2004
    Gonna be waiting on that AMDidia GeForce RTXT 6090 Ti Super Mega Ultraaaaaaa Ultraaa Ultra for a while.

    In all seriousness...I bet we get a 4K Ultrawide sooner rather than later. They already have the 5k2k ultrawide, and it's been around a couple years now I think.
     
    IdiotInCharge likes this.
  17. IdiotInCharge

    IdiotInCharge [H]ardForum Junkie

    Messages:
    9,619
    Joined:
    Jun 13, 2003
    So, just as a thought experiment on this:

    4K is either 4096 pixels wide or 3840 pixels wide, and 2160 pixels tall. 4K ultrawide, then, would just be something wider than 4096 pixels while still being 2160 pixels tall. If our target ultrawide ratio is 21 : 9, then what we're looking for is a 5040 x 2160 display, i.e. your ultrawide 5K or thereabouts (the LG you linked is 21.33 : 9).

    Going to 32 : 9, that's 7680 x 2160, which I believe is the widest available consumer ratio, and basically double-width consumer 4k, or 3840 x 2160 x 2.

    So, for the 5K you mention, that's about 10.9MP, and for the 32 : 9 panel, about 16.6MP: respectively ~131% and 200% of 4K's pixels, and thus roughly 31% more and 100% more rendering performance required :).
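    A quick sanity check on those numbers in Python (variable names are my own):

```python
# Pixel counts for the hypothetical ultrawide formats discussed above.
def megapixels(w, h):
    """Total pixels, in millions."""
    return w * h / 1e6

uhd = megapixels(3840, 2160)      # consumer 4K: ~8.3MP
uw_21_9 = megapixels(5040, 2160)  # 21:9 at 2160 tall: ~10.9MP
uw_32_9 = megapixels(7680, 2160)  # 32:9, double-width 4K: ~16.6MP

# Rendering load relative to 4K (same height, so it's just the width ratio).
print(uw_21_9 / uhd)  # ~1.31, i.e. ~31% more pixels
print(uw_32_9 / uhd)  # exactly double
```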
     
    elvn likes this.
  18. bigbluefe

    bigbluefe Gawd

    Messages:
    588
    Joined:
    Aug 23, 2014
    I've realized that what I actually want is a 6400x1600 screen that's the width of three 4:3 monitors side by side (a 4:3 image 1600 px tall is ~2133 px wide, so three of those come to 6400).

    This would basically handle all the game scenarios I'm aware of (triple-screen 4:3 arcade games, and so on).
     
  19. thelead

    thelead 2[H]4U

    Messages:
    2,065
    Joined:
    May 28, 2005
    A 38" 5k2k 200+Hz micro-led display with fanless gsync is the dream. Maybe in 10 years...
     
  20. Blade-Runner

    Blade-Runner 2[H]4U

    Messages:
    3,016
    Joined:
    Feb 25, 2013
    Would it actually be that taxing though? With G-Sync I don't feel the need for my games to maintain a constant 60 fps.

    I am aware of that monitor; someone just needs to release an IPS 1ms 144Hz G-Sync model and I would seriously consider picking it up.
     
  21. elvn

    elvn 2[H]4U

    Messages:
    3,099
    Joined:
    May 5, 2006
    You aren't getting anything out of higher Hz without filling it with new frames of action. Even 60fps could be a 30 - 60 - 90 graph that spends 2/3 of the "blend" in the mud. 30fps is a slideshow. I can play Bloodborne etc. on PS4, but it's definitely page-y, and going back to PC is like going from swimming in mud under a strobe light to skating on wet ice. :watching:

    Unless you are getting at least 100fps average, for something like a 70 - 100 - 130 fps graph, you aren't getting much out of a 120Hz+ monitor's high-Hz capability imo. That is: blur reduction during viewport movement at speed (cutting the smearing blur down to a softer blur within the lines), plus more motion definition and smoothness (more dots per dotted-line path shape, more unique pages in an animation flip book flipping faster).

    Why demand a high-Hz monitor if you aren't going to fill those Hz with frames? "Wow, it's a 240Hz monitor that I'm running at 40fps average. Can't wait for a 500Hz monitor." Marketing, I guess. The only way you'd get motion-clarity benefits at low fps would be some kind of motion interpolation built into the monitor, and even that would just be duplicating the low number of frames, so movement would still look muddy and clunky, or like some kind of floating cutout. (VR headsets do use "Time Warp", "Space Warp", and "motion smoothing" types of interpolation, which supposedly work pretty well, but no monitors I'm aware of have that tech.)
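    One way to put numbers on the "fill the Hz with frames" point: on a sample-and-hold panel, each unique frame is held on screen for its whole frame time, so smear width during eye-tracked motion scales with frame time, not with the panel's maximum Hz. A rough sketch in Python (this is a simplified blur model, not a measurement):

```python
def frame_time_ms(fps):
    """How long each unique frame is held on a sample-and-hold display."""
    return 1000.0 / fps

def smear_width_px(fps, speed_px_per_s):
    """Approximate blur width for an object the eye is tracking:
    the eye keeps moving while the frame stands still, smearing it."""
    return speed_px_per_s / fps

# A pan at 1920 px/s, at various framerates on a 175Hz-capable panel:
for fps in (40, 60, 100, 175):
    print(fps, round(frame_time_ms(fps), 1), round(smear_width_px(fps, 1920), 1))
```

    The panel's 175Hz ceiling only pays off on the bottom rows, i.e. when the GPU actually delivers that many unique frames.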
     
    Last edited: Jun 14, 2019 at 10:32 PM
  22. IdiotInCharge

    IdiotInCharge [H]ardForum Junkie

    Messages:
    9,619
    Joined:
    Jun 13, 2003
    G-Sync and VRR let you sync frames without incurring the input lag and stuttering of V-Sync; however, you still get input lag and slower response from lower framerates. Whether that matters is obviously up to you and the game you're playing, but in general, faster is better. VRR can't hide uneven frame delivery either; it just makes sure that those frames are whole when they arrive :).
     
  23. bananadude

    bananadude [H]Lite

    Messages:
    98
    Joined:
    Dec 29, 2006

    And the game you play is a MASSIVE component in this of course, so someone needs to take that into account... big frame drops are going to be far more obvious in something like Forza or PUBG than in Civilization VI for example.
     
  24. Skott

    Skott 2[H]4U

    Messages:
    3,933
    Joined:
    Aug 12, 2006
    Well they come out July 1st supposedly so we'll find out how good they are fairly soon.
     
  25. Dan_D

    Dan_D [H]ard as it Gets

    Messages:
    53,449
    Joined:
    Feb 9, 2002
    Just to make a point here: I used two cards (or more) to drive 3x30" 2560x1600 displays for several years. That's 7680x1600. Going to this 49" Samsung KS8500 was a big reduction in total pixels, so I got a performance increase going this route.
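    For reference, the pixel budgets in that comparison (a trivial Python check):

```python
triple_30 = 7680 * 1600  # three 30" 2560x1600 panels side by side
ks8500 = 3840 * 2160     # 49" 4K Samsung KS8500

print(triple_30, ks8500)             # 12,288,000 vs 8,294,400 pixels
print(round(ks8500 / triple_30, 2))  # the 4K set renders roughly 2/3 the pixels
```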
     
    IdiotInCharge likes this.
  26. IdiotInCharge

    IdiotInCharge [H]ardForum Junkie

    Messages:
    9,619
    Joined:
    Jun 13, 2003
    Hell I used two to drive one!

    The challenge I'm seeing though, not really in rebuttal but to advance the discussion, is the balance between the resolution, the demands of modern AAA-games, and of course, the slump of support for multi-GPU today.

    Also, I can never go back to the pixel response of my ZR30w. I still have it and it's still a great monitor, perfect size and pixel density for use without scaling, great colors, but man is that panel slow.
     
  27. Dan_D

    Dan_D [H]ard as it Gets

    Messages:
    53,449
    Joined:
    Feb 9, 2002
    Well yeah, I did too. I had the first 30" Dell monitor about a year or two before the others. At one time, 2560x1600 was extremely demanding by itself, and driving three of them was something no single-GPU card did well on its own. There were some dual-GPU cards that did OK at the time, but I always felt like I needed more power. At 4K, that's still true.

    My current setup is still 60Hz. I had 3x27" ROG Swifts, but they sucked. A single one is great, but for multi-monitor gaming they are flat-out terrible: the TN panel viewing angles were horrendous and didn't work for multi-monitor use of any kind, which is why I ditched them. I miss G-Sync and the refresh rates, but I like the image quality and the panel size of my KS8500 a lot more. At 4K I wouldn't be pushing maximum image quality at 100+ FPS anyway, which is why I've been fine with what I've got.
     
    IdiotInCharge likes this.
  28. bigbluefe

    bigbluefe Gawd

    Messages:
    588
    Joined:
    Aug 23, 2014
    The 2080 Ti Supers should be able to do 3840x1600 at over 100fps.