LG 38GL950G - 37.5" 3840x1600/G-Sync/175Hz

It isn't the backlight I'd be concerned about - it's the panel. Without FALD, a brighter backlight isn't of much use. This monitor has a pretty narrow application envelope, and like most, it's a stopgap for better panel and backlight technologies.

This monitor has the widest application envelope of anything I've seen in years. The market is CRYING OUT for a larger monitor. Ever since 16:9 took off, monitors have all been too fucking short. We've either been stuck with shitty-ass PPI (31.5" monitors @ 1440p) or 60Hz (ZR30W-derivative 2560x1600 monitors, or 31.5" 4K 60Hz monitors).

Getting decent PPI, 30"+, high refresh rates, and variable refresh is basically what many people have been waiting for.

27" 16:9 monitors were always too small. The fuckers are like the height of 20" 4:3 monitors. They've ALWAYS been too small.

This monitor is basically the first objective upgrade over the ZR30W, which was released in like 2008 or 2009. The GLACIAL pace of monitor advancement is easily the most frustrating aspect of computing. Hardware is simply not keeping up.
 
This monitor has the widest application envelope of anything I've seen in years.

I'm including price in there - this will be a low-volume product as it is. I'm in the target upgrade market, and I have a ZR30W doing server duty right now ;).
 
This monitor has the widest application envelope of anything I've seen in years. The market is CRYING OUT for a larger monitor. Ever since 16:9 took off, monitors have all been too fucking short. We've either been stuck with shitty-ass PPI (31.5" monitors @ 1440p) or 60Hz (ZR30W-derivative 2560x1600 monitors, or 31.5" 4K 60Hz monitors).

Getting decent PPI, 30"+, high refresh rates, and variable refresh is basically what many people have been waiting for.

27" 16:9 monitors were always too small. The fuckers are like the height of 20" 4:3 monitors. They've ALWAYS been too small.

This monitor is basically the first objective upgrade over the ZR30W, which was released in like 2008 or 2009. The GLACIAL pace of monitor advancement is easily the most frustrating aspect of computing. Hardware is simply not keeping up.


For reference, ordered by height (roughly, based on raw panel sizes):

----------------------------------------------------------------

22.5" diagonal 16:10 .. 19.1" w x 11.9" h (1920x1200 ~ 100.6 ppi) FW900 CRT

34.0" diagonal 21:9 .... 31.4" w x 13.1" h (3440x1440 ~ 109.68 ppi)
27.0" diagonal 16:9 .... 23.5" w x 13.2" h (2560x1440 ~ 108.79 ppi)

37.5" diagonal 24:10 .. 34.6" w x 14.4" h (3840x1600 ~ 110.93 ppi)

31.5" diagonal 16:9 .... 27.5" w x 15.4" h (2560x1440 ~ 93.24 ppi) .. (3840x2160 ~ 139.87 ppi)

40.0" diagonal 16:9 .... 34.9" w x 19.6" h (3840x2160 ~ 110.15 ppi)

43.0" diagonal 16:9 .... 37.5" w x 21.1" h (3840x2160 ~ 102.46 ppi)

48.0" diagonal 16:9 .... 41.8" w x 23.5" h (3840x2160 ~ 91.79 ppi)

55.0" diagonal 16:9 .... 47.9" w x 27.0" h (3840x2160 ~ 80.11 ppi)

----------------------------------------------------------------
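For anyone who wants to check or extend those numbers, here's a minimal sketch of the math behind them (Python; the entries are just a few rows from the list above, and square pixels are assumed):

```python
import math

def panel_dims(diagonal_in, res_w, res_h):
    """Width, height (inches) and PPI from a panel's diagonal and native resolution.
    Assumes square pixels, so the physical aspect ratio equals res_w:res_h."""
    diag_px = math.hypot(res_w, res_h)   # diagonal length in pixels
    ppi = diag_px / diagonal_in          # pixels per inch
    return res_w / ppi, res_h / ppi, ppi

panels = [
    ("FW900 CRT 16:10", 22.5, 1920, 1200),
    ('27" 16:9',        27.0, 2560, 1440),
    ('34" ultrawide',   34.0, 3440, 1440),
    ("38GL950G",        37.5, 3840, 1600),
    ('31.5" 4K',        31.5, 3840, 2160),
]
for label, diag, w, h in panels:
    width, height, ppi = panel_dims(diag, w, h)
    print(f"{label:16s} {width:5.1f} x {height:4.1f} in  ~{ppi:6.2f} ppi")
```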
 
Is there any way these monitors will be better than the Acer XB271HU in terms of contrast or pixel response?
 
I wonder how well content will scale with this & if you end up sacrificing a bit of visual quality in media because of said scaling.
 
I wonder how well content will scale with this & if you end up sacrificing a bit of visual quality in media because of said scaling.


The PPI of this 38GL950G is about the same as a 27" 1440p panel (and several other ultrawides), and 21:9 is a pretty standard ratio now, so there shouldn't be any issue with scaling. For the most part it's only some games and HUDs that have issues.
 
I don't agree, based on what I've seen on the 34GK950F. Unless you can turn it off - which you couldn't on the 34GK950F - HDR is awful and a worse experience than SDR IMHO.

What do you mean you can't turn it off? It's in the OSD menu I thought. I have it on when I play in the dark. I think it looks great. It's totally worthless in the daytime/lights on though. Just not bright enough.
 
What do you mean you can't turn it off? It's in the OSD menu I thought. I have it on when I play in the dark. I think it looks great. It's totally worthless in the daytime/lights on though. Just not bright enough.
I would love to be proven wrong but I have 2 sitting in front of me with updated FW and I don’t see an option to turn it off.
 
I would love to be proven wrong but I have 2 sitting in front of me with updated FW and I don’t see an option to turn it off.

I see what you mean. When I first set mine up, I had to turn on HDR in Windows, and then go into the OSD Gamer Profiles and select the "HDR Effect" radio button to get it to kick on. If I do that, in the menu "HDR Effect" changes to "Standard". Hitting that again does nothing.

The toggle in Windows 10 completely controls turning HDR on and off, apparently. The OSD menu doesn't appear to do anything except change that text in the menu from "HDR Effect" to "Standard" depending on whether it's on or off... and even when I pick "HDR Effect" and toggle it ON in Windows, I swap back to "Gamer 1", where it's retained my other settings (Brightness/Contrast/Refresh Rate, etc.).
 
I would love to be proven wrong but I have 2 sitting in front of me with updated FW and I don’t see an option to turn it off.

There's no option in the OSD, but HDR can be disabled in Windows / in games as far as I know. I've never had any problem turning it off through the OS or in games.
 
Oooo, look at pages 28 and 29 of the manual.

It looks as if the reason I see it changing from "HDR Effect" to "Standard" and vice versa is that it has two sets of profiles: one for regular SDR, which they call just "Game Mode", and one for HDR, referred to as "HDR Game Mode" in the manual.

But it sure looks like the Windows HDR toggle is king. If you don't change that, it doesn't matter what you pick in the OSD.
 
Still waiting on a 4K-equivalent ultrawide. I love my X34; I can never go back to standard aspect ratios.
 
Want a 24" 1080p one, since I don't feel like upgrading my GPU so often and don't have space for a 27".
 
Still waiting on a 4K-equivalent ultrawide. I love my X34; I can never go back to standard aspect ratios.

You're looking at a 12MP display...

The video cards have not yet been made :eek:

Gonna be waiting on that AMDidia GeForce RTXT 6090 Ti Super Mega Ultraaaaaaa Ultraaa Ultra for a while.

In all seriousness...I bet we get a 4K Ultrawide sooner rather than later. They already have the 5k2k ultrawide, and it's been around a couple years now I think.
 
4K Ultrawide
So, just as a thought experiment on this:

4K is either 4096 pixels wide or 3840 pixels wide, and 2160 pixels tall. 4K ultrawide, then, would just be something wider than 4096 pixels while still being 2160 pixels tall. If our target ultrawide ratio is 21 : 9, then what we're looking for is a 5040 x 2160 display, i.e. your ultrawide 5K or thereabouts (the LG you linked is 21.33 : 9).

Going to 32 : 9, that's 7680 x 2160, which I believe is the widest available consumer ratio, and basically double-width consumer 4k, or 3840 x 2160 x 2.

So, for the 5K you mention, it'll be 11.0MP, and for the 32 : 9 panel, it'll be 16.6MP - respectively 133% and 200% of 4K's pixel count, and thus 33% and 100% more rendering work required :).
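Putting that pixel math in one place, as a quick sketch (Python; this treats rendering cost as simply proportional to pixel count, which is only a rough first-order approximation):

```python
def vs_consumer_4k(res_w, res_h, base=(3840, 2160)):
    """Megapixels, and raw pixel count relative to consumer 4K (3840x2160)."""
    mp = res_w * res_h / 1e6
    scale = (res_w * res_h) / (base[0] * base[1])
    return mp, scale

for label, w, h in [("5K2K ultrawide (5120x2160)",   5120, 2160),
                    ("32:9 'double 4K' (7680x2160)", 7680, 2160)]:
    mp, scale = vs_consumer_4k(w, h)
    print(f"{label:30s} {mp:4.1f} MP, {scale:.0%} of 4K's pixels")
```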
 
I've realized that what I actually want is a 6400x1600 screen that's the width of three 4:3 monitors side by side.

This would basically handle all game scenarios I'm aware of (triple screen 4:3 arcade games, and so on).
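That width checks out: three 4:3 panels at 1600 px tall come to exactly 6400 px across. A trivial sketch, assuming square pixels:

```python
# Three 4:3 panels side by side form a 12:3 (i.e. 4:1) canvas, so at 1600 px tall
# the combined width is 1600 * 4 = 6400 px.
height = 1600
total_width = 3 * (4 * height) // 3   # three panels, each 4/3 as wide as it is tall
print(f"{total_width}x{height}")      # -> 6400x1600
```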
 
A 38" 5k2k 200+Hz micro-LED display with fanless G-Sync is the dream. Maybe in 10 years...
 
You're looking at a 12MP display...

The video cards have not yet been made :eek:

Would it actually be that taxing though? With G-Sync I don't feel the need for my games to maintain a constant 60 fps.

In all seriousness...I bet we get a 4K Ultrawide sooner rather than later. They already have the 5k2k ultrawide, and it's been around a couple years now I think.

I am aware of that monitor; someone just needs to release an IPS 1ms 144Hz G-Sync model and I would seriously consider picking it up.
 
You aren't getting anything out of higher Hz without filling it with new frames of action. Even a 60fps average could really be a 30 - 60 - 90 graph, in the mud for 2/3 of the "blend". 30fps is a slideshow. I can play Bloodborne etc. on PS4, but it's definitely page-y, and going back to PC is like going from swimming in mud under a strobe light to skating on wet ice. :watching:

Unless you are getting at least 100fps average, for something like a 70 - 100 - 130 fps graph, you aren't getting much out of a 120Hz+ monitor's high-Hz capability imo. That is, blur reduction during viewport movement at speed (cutting the smearing blur down to a "soften" blur within the lines), and increased motion definition and smoothness (more dots per dotted-line path shape, more unique pages in the animation flip book, flipping faster).

Why demand a high-Hz monitor if you aren't going to fill those Hz with frames? WOW, it's a 240Hz monitor that I'm running at 40fps average. Can't wait for a 500Hz monitor. Marketing, I guess. The only way you'd get motion-clarity benefits at low fps would be some kind of motion interpolation built into the monitor, and even that would just be duplicating the low number of frames, so movement would still appear muddy and clunky, or have some kind of floating cut-out look. (VR headsets do use "Time Warp", "Space Warp" and "motion smoothing" types of interpolation, which supposedly work pretty well, but no monitors that I'm aware of have that tech.)
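To put rough numbers on the "filling the Hz with frames" point - a minimal sketch, assuming a plain sample-and-hold panel with no strobing/BFI and no interpolation:

```python
def persistence_ms(fps):
    """On a sample-and-hold panel each unique frame stays on screen for its whole
    frame time (no strobing/BFI, no interpolation assumed), so the smear you see
    while panning tracks the frame rate, not the panel's maximum refresh rate."""
    return 1000.0 / fps

for fps in (30, 40, 60, 100, 144, 240):
    print(f"{fps:3d} fps -> ~{persistence_ms(fps):5.1f} ms per frame "
          f"({fps} unique 'pages' of the flip book per second)")
```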
 
Would it actually be that taxing though? With G-Sync I don't feel the need for my games to maintain a constant 60 fps.

G-Sync and VRR allow you to sync frames without incurring the input lag and stuttering of V-Sync; however, you still get input lag / slow response from lower framerates. Whether that matters is obviously up to you and the game you're playing, but in general, faster is better. VRR can't hide uneven frame delivery either; it just makes sure that those frames are whole when they arrive :).
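A toy model of that difference, using made-up frame times (it ignores the render queue, scan-out and input sampling entirely, and just illustrates the scheduling):

```python
def present_times(frame_times_ms, refresh_hz=None):
    """When each frame appears on screen. With a fixed refresh (V-Sync), a finished
    frame waits for the next refresh tick; with VRR (refresh_hz=None) it is scanned
    out as soon as it's done. Uneven delivery stays uneven either way."""
    shown, t = [], 0.0
    for ft in frame_times_ms:
        t += ft                                     # frame finishes rendering at t
        if refresh_hz is None:
            shown.append(t)                         # VRR: refresh on demand
        else:
            period = 1000.0 / refresh_hz
            shown.append(-(-t // period) * period)  # ceil: wait for next fixed tick
    return shown

frames = [14, 22, 16, 30, 15]                       # uneven frame delivery, in ms
print("V-Sync 60 Hz:", [round(x, 1) for x in present_times(frames, 60)])
print("VRR         :", [round(x, 1) for x in present_times(frames)])
```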
 
G-Sync and VRR allow you to sync frames without incurring the input lag and stuttering of V-Sync; however, you still get input lag / slow response from lower framerates. Whether that matters is obviously up to you and the game you're playing, but in general, faster is better. VRR can't hide uneven frame delivery either; it just makes sure that those frames are whole when they arrive :).


And the game you play is a MASSIVE component in this of course, so someone needs to take that into account... big frame drops are going to be far more obvious in something like Forza or PUBG than in Civilization VI for example.
 
Well they come out July 1st supposedly so we'll find out how good they are fairly soon.
 
So, just as a thought experiment on this:

4K is either 4096 pixels wide or 3840 pixels wide, and 2160 pixels tall. 4K ultrawide, then, would just be something wider than 4096 pixels while still being 2160 pixels tall. If our target ultrawide ratio is 21 : 9, then what we're looking for is a 5040 x 2160 display, i.e. your ultrawide 5K or thereabouts (the LG you linked is 21.33 : 9).

Going to 32 : 9, that's 7680 x 2160, which I believe is the widest available consumer ratio, and basically double-width consumer 4k, or 3840 x 2160 x 2.

So, for the 5K you mention, it'll be 11.0MP, and for the 32 : 9 panel, it'll be 16.6MP - respectively 133% and 200% of 4K's pixel count, and thus 33% and 100% more rendering work required :).

Just to make a point here, I used two cards (or more) to drive 3x30" 2560x1600 displays for several years. That's 7680x1600. Going to this 49" Samsung KS8500 was a reduction in pixel density. I got a performance increase going this route.
 
Just to make a point here, I used two cards (or more) to drive 3x30" 2560x1600 displays for several years. That's 7680x1600. Going to this 49" Samsung KS8500 was a reduction in pixel density. I got a performance increase going this route.

Hell I used two to drive one!

The challenge I'm seeing though, not really in rebuttal but to advance the discussion, is the balance between the resolution, the demands of modern AAA-games, and of course, the slump of support for multi-GPU today.

Also, I can never go back to the pixel response of my ZR30w. I still have it and it's still a great monitor, perfect size and pixel density for use without scaling, great colors, but man is that panel slow.
 
Hell I used two to drive one!

The challenge I'm seeing though, not really in rebuttal but to advance the discussion, is the balance between the resolution, the demands of modern AAA-games, and of course, the slump of support for multi-GPU today.

Also, I can never go back to the pixel response of my ZR30w. I still have it and it's still a great monitor, perfect size and pixel density for use without scaling, great colors, but man is that panel slow.

Well yeah, I did too. I had the first 30" Dell monitor about a year or two before the others. At one time 2560x1600 was extremely demanding by itself. Driving three of them was something no single-GPU card did well on its own. There were some dual-GPU cards that did OK at the time, but I always felt like I needed more power. At 4K, that's still true. My current setup is still 60Hz. I had 3x27" ROG Swifts but they sucked. A single one is great, but for multi-monitor gaming they are flat out terrible. The TN panel viewing angles were horrendous and didn't work well for multi-monitor use of any kind, which is why I ditched them. I miss G-Sync and the refresh rates, but I like the image quality and the panel size of my KS8500 a lot more. At 4K, I wouldn't be pushing maximum image quality at 100+ FPS, which is why I've been fine with what I've got.
 
Well yeah, I did too. I had the first 30" Dell monitor about a year or two before the others. At one time 2560x1600 was extremely demanding by itself. Driving three of them was something no single-GPU card did well on its own. There were some dual-GPU cards that did OK at the time, but I always felt like I needed more power. At 4K, that's still true. My current setup is still 60Hz. I had 3x27" ROG Swifts but they sucked. A single one is great, but for multi-monitor gaming they are flat out terrible. The TN panel viewing angles were horrendous and didn't work well for multi-monitor use of any kind, which is why I ditched them. I miss G-Sync and the refresh rates, but I like the image quality and the panel size of my KS8500 a lot more. At 4K, I wouldn't be pushing maximum image quality at 100+ FPS, which is why I've been fine with what I've got.

The 2080 Ti Supers should be able to do 3840x1600 at over 100fps.
 
The 2080 Ti Supers should be able to do 3840x1600 at over 100fps.


The Super 2080Ti isn't going to be VASTLY more powerful than a current 2080Ti... people are setting themselves up for a big disappointment if they think otherwise. The Super range is just a refresh of the product stack, not an entirely new GPU that's going to blow the existing cards out of the water. A current 2080Ti will still be a beast of a card for 3840x1600, and once the Super is out, we should see it drop in price.
 
^For sure, guys: if you're looking at this monitor for gaming, make sure you have a GTX 1080 or better first. I actually just turned a lot of settings down to Medium in Black Ops 4 to eke out some more FPS on a GTX 1070 with the 34GK950F (3440x1440 @ 144Hz)... anything less, I imagine, won't be enough horsepower, and I can't wait for the 5700 XT or 20x0 Supers to get here so I can upgrade.
 
The Super 2080Ti isn't going to be VASTLY more powerful than a current 2080Ti... people are setting themselves up for a big disappointment if they think otherwise. The Super range is just a refresh of the product stack, not an entirely new GPU that's going to blow the existing cards out of the water. A current 2080Ti will still be a beast of a card for 3840x1600, and once the Super is out, we should see it drop in price.

To be honest, I expect another 8800GTX vs. Ultra type scenario. I'm thinking the increase in performance will be something like we've seen in the past, where the new reference card matches the old factory-overclocked AIB cards. Of course new non-reference cards from AIBs will have the potential to be even better, but they'll be significantly more expensive, as they are now. You would probably have to compare a non-reference, factory-overclocked card to an original RTX 2080 Ti reference card to see a significant performance gap.
 
To be honest, I expect another 8800GTX vs. Ultra type scenario. I'm thinking the increase in performance will be something like we've seen in the past, where the new reference card matches the old factory-overclocked AIB cards. Of course new non-reference cards from AIBs will have the potential to be even better, but they'll be significantly more expensive, as they are now. You would probably have to compare a non-reference, factory-overclocked card to an original RTX 2080 Ti reference card to see a significant performance gap.

But will it play Crysis better like the Ultra did? I doubled the virtual make-believe framerate in Crysis when I went from an 8800GTX to the 8800 Ultra because the GPU said ULTRA on it....ULTRA man.....think about it....that's like TURBO.....but ULTRA......some amps go up to 10 and others 11, but that GPU went to ULTRA!!!! And it was such a bargain at only 70% more MSRP vs the 8800GTX!
 
But will it play Crysis better like the Ultra did? I doubled the virtual make-believe framerate in Crysis when I went from an 8800GTX to the 8800 Ultra because the GPU said ULTRA on it....ULTRA man.....think about it....that's like TURBO.....but ULTRA......some amps go up to 10 and others 11, but that GPU went to ULTRA!!!! And it was such a bargain at only 70% more MSRP vs the 8800GTX!

I'm incapable of reading the word "ultra" without reading it in the voice of the announcers from Unreal Tournament and Killer Instinct. Anyone else? lol

I'm all for getting ultra uber turbo releases. Some of us are still on old crap, and it's nice to get the performance/price bumps. Or to just make myself feel better about what I already bought. I kick myself all the time for being out of the game and not having a 1080Ti at launch. What a value it was in hindsight.
 
You aren't getting anything out of higher Hz without filling it with new frames of action. Even a 60fps average could really be a 30 - 60 - 90 graph, in the mud for 2/3 of the "blend". 30fps is a slideshow. I can play Bloodborne etc. on PS4, but it's definitely page-y, and going back to PC is like going from swimming in mud under a strobe light to skating on wet ice. :watching:

Unless you are getting at least 100fps average, for something like a 70 - 100 - 130 fps graph, you aren't getting much out of a 120Hz+ monitor's high-Hz capability imo. That is, blur reduction during viewport movement at speed (cutting the smearing blur down to a "soften" blur within the lines), and increased motion definition and smoothness (more dots per dotted-line path shape, more unique pages in the animation flip book, flipping faster).

Why demand a high-Hz monitor if you aren't going to fill those Hz with frames? WOW, it's a 240Hz monitor that I'm running at 40fps average. Can't wait for a 500Hz monitor. Marketing, I guess. The only way you'd get motion-clarity benefits at low fps would be some kind of motion interpolation built into the monitor, and even that would just be duplicating the low number of frames, so movement would still appear muddy and clunky, or have some kind of floating cut-out look. (VR headsets do use "Time Warp", "Space Warp" and "motion smoothing" types of interpolation, which supposedly work pretty well, but no monitors that I'm aware of have that tech.)
Solid points. Just a note that space warp and the ability to drive high frame rates have nothing to do with the display; they are the game's or application's responsibility. Checkerboarding + temporal AA also allow for higher fps at a given output resolution (as does Nvidia's DLSS) - bottom line is it's quite possible to drive games at higher frame rates / quality without completely brute-forcing it, and thankfully, due to consoles, I've seen more and more of these options in modern games. Ideally the GPU control panel or the game would have a target fps and suggest settings to let you hit that.
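A naive sketch of that "target fps" idea (hypothetical controller, not how any real engine, driver or DLSS actually does it):

```python
def adjust_render_scale(scale, last_frame_ms, target_fps=100,
                        lo=0.5, hi=1.0, step=0.05):
    """Nudge the render scale down when frames miss the fps target and back up
    when there's headroom. Hypothetical and deliberately crude - checkerboarding,
    temporal reconstruction and DLSS are far more sophisticated."""
    target_ms = 1000.0 / target_fps
    if last_frame_ms > target_ms * 1.05:      # missing the target: render fewer pixels
        scale = max(lo, scale - step)
    elif last_frame_ms < target_ms * 0.90:    # plenty of headroom: claw quality back
        scale = min(hi, scale + step)
    return scale

# Toy usage: frame times spike during a heavy scene, then recover.
scale = 1.0
for ms in (9.5, 12.0, 13.5, 13.0, 11.0, 9.0, 8.5):
    scale = adjust_render_scale(scale, ms)
    print(f"frame took {ms:4.1f} ms -> render scale {scale:.2f}")
```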
 
It's funny you say that, because I made an ugly graphic of a video settings panel years ago that showed a similar idea with an fps slider. The intent was to show that an fps target should be the focus, and that unless you are getting higher fps you aren't really playing "ultra" graphics and gameplay across the board.
[attached image: mock-up of a video settings panel with an fps target slider]

--------------------

In reply to the console and VR "tricks" (getting higher frame rates in VR - a solid 90fps - or at least maintaining solid low ones on consoles): personally, I think an extremely high-quality interpolation built into the display would be a better way to go than dynamically downsampling/upscaling and checkerboarding to hit an fps target. An improved version of the Q9FN's interpolation that works in game mode at the display's same 10ms of input lag, for example - but some future, much more advanced interpolation tech without artifacts that still keeps the lag down to 10ms or less when it's enabled.

I think Time Warp / ASW 2.0 has a 20ms threshold/goal for VR, and those ASW technologies actually cut your frame rate down to 45fps and then double it, so you are actually losing motion definition if you are capable of a higher frame rate range using VRR/G-Sync/FreeSync.

However, they can do the 45fps cutoff and doubling dynamically when your frame rate dips, I guess, so it would be more of a support for the lower end of your fps graph, which could work well on PC - but it sounds like it results in 20ms of input lag.

Your fps graph doesn't just switch stages cleanly either; it's usually more like an audio graph or seismograph. I suppose if you were in an intense battle with FX going off you'd be well into a sustained low period, however. So it could depend on how itchy the trigger finger is on the 45fps dynamic cutoff, and also whether it's always a 20ms hit when the tech is enabled, even when the cutoff isn't triggered, which I'm assuming it is.
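Rough numbers on that 45fps cutoff, as a sketch of the behaviour described above (not Oculus' actual heuristics):

```python
def asw_delivery(native_fps, hmd_hz=90):
    """If the app can't hold the headset's refresh, it gets locked to half rate and
    every real frame is followed by one synthesized frame.
    Returns (real fps, displayed fps)."""
    if native_fps < hmd_hz:
        return hmd_hz / 2, hmd_hz      # e.g. 45 real + 45 synthesized on a 90 Hz HMD
    return native_fps, native_fps

for fps in (60, 75, 90):
    real, shown = asw_delivery(fps)
    print(f"native {fps} fps -> {real:.0f} real + {shown - real:.0f} synthesized "
          f"= {shown:.0f} displayed")
```

So someone who could natively hold 60-75fps with VRR would actually end up with less real motion definition under that scheme, which is the trade-off being described.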
 
The Super 2080Ti

Is there any evidence of a Super 2080 Ti? All the leaks I've seen are 2060/2070/2080 only. It's entirely possible they won't do a Super 2080 Ti and will instead do a new Titan at a later date based on Ampere, or just save an updated 2080 Ti for later, depending on what their 2020 launch schedule looks like.

The 2080 Ti is far from being threatened by any AMD card, so there's really no reason to update it anytime soon, honestly.
 
The 2080 Ti is far from being threatened by any AMD card, so there's really no reason to update it anytime soon, honestly.

They released it when the 1080Ti was already faster than the current AMD part by at least one market segment, and they've typically done so as part of their larger product release strategy - i.e., they've stopped making the dies used in the 1080Ti.

As for a 'Super', since this looks like a mid-cycle refresh, maybe not, but there is precedent. With respect to ray tracing, they could release something that's actually faster (however they pull that off), and that might justify a higher SKU too.
 