LG 38GL950G - 37.5" 3840x1600/G-Sync/175Hz

I tried to capture it but I was struggling, I'll set my tripod and proper camera up tomorrow to show it. 160hz is fine though btw. Just can't have 10 bit color. Which for 98% of stuff doesn't really matter (I say that owning a business that does a lot of visualisations)
 
175hz uses 4:2:2 chroma subsampling, which has purple fringing around text. It's fine in a game but for text it's unusable.
This makes so much more sense. I know about chroma subsampling and its importance to text clarity, I just didn't realize you needed to drop it for 175Hz (DP 2.0 and HDMI 2.1 cannot come soon enough)
can you not drop to 8-bit colour depth and maintain 4:4:4 chroma?
Same question I’ve got. I’d easily trade 10-bit for more frames.
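The trade-offs in this thread come down to simple bandwidth arithmetic. Here's a rough sketch; the blanking totals are an assumed CVT-RB-style estimate, not this panel's actual timings, and DP 1.4's usable payload is roughly 25.92 Gbps after 8b/10b encoding overhead:

```python
# Rough bandwidth check for 3840x1600 over DP 1.4 (HBR3).
# H_TOTAL/V_TOTAL are a CVT-RB-style guess at active + blanking,
# not the monitor's real timings.
DP14_PAYLOAD_GBPS = 25.92          # 32.4 Gbps raw minus 8b/10b overhead
H_TOTAL, V_TOTAL = 3920, 1664      # assumed totals including blanking

def needed_gbps(refresh_hz, bits_per_pixel):
    """Data rate required for one mode, in Gbps."""
    return H_TOTAL * V_TOTAL * refresh_hz * bits_per_pixel / 1e9

modes = [
    ("120hz 10-bit 4:4:4", 120, 30),   # RGB, 10 bits x 3 channels
    ("160hz  8-bit 4:4:4", 160, 24),   # RGB, 8 bits x 3 channels
    ("160hz 10-bit 4:4:4", 160, 30),
    ("175hz  8-bit 4:4:4", 175, 24),
    ("175hz  8-bit 4:2:2", 175, 16),   # chroma shared between pixel pairs
]
for name, hz, bpp in modes:
    gbps = needed_gbps(hz, bpp)
    verdict = "fits" if gbps <= DP14_PAYLOAD_GBPS else "exceeds DP 1.4"
    print(f"{name}: {gbps:5.1f} Gbps -> {verdict}")
```

Under those assumed timings the numbers line up with what's reported here: 10-bit 4:4:4 tops out around 120hz, 8-bit 4:4:4 just squeezes in at 160hz, and 175hz only fits by dropping to 4:2:2.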
 
Well I've got it set up. A few frustrations, but overall pretty happy. I think the form factor is pretty much perfect size-wise: perfect width without having to move my head.


First impressions: the stand is terrible. It doesn't go high enough, doesn't pivot, and has no cable management. I'll replace it with a Humanscale M8 over the holidays.

Picture quality is excellent. I'll bring a colorimeter home to set it up properly.

IPS glow is there, quite bad in the corners, but as always it only really shows on a fully black screen. This side of OLED it'll be fine.

Biggest frustration is probably the limitations of DisplayPort 1.4. Turns out only 1 of my 6-or-so DisplayPort cables worked in the higher modes. The best I can get with 10 bit color and HDR enabled is 120hz, a bit lame, but I knew that going in. I'm hoping to replace it with an HDR 1000 screen that can actually run at 175hz+ at some point in the future, but let's be real, that's 18+ months away.

175hz is horrific for desktop use btw, the chromatic aberration around text is awful.

Anyway, roll on ampere, need a gpu upgrade now for sure :)
Congrats. What settings can you change in sRGB mode? Does it have the RGB sliders available, and maybe even CMY sliders (cyan, magenta and yellow), in sRGB mode?
 
Real shame DP holding things back. Can’t wait for DP2.0.

Everything just needs to go HDMI and stay there. I'm sick of all the stupid arbitrary standards. AVRs are already HDMI so you might as well just go with that. It's too bad that even HDMI 2.1 is a little underpowered.
 
160 Hz is the highest for that afaik. Which is honestly fine because I doubt the panel even keeps up with 175 Hz.

Exactly that.

Not really sure what the right combo is. I did laugh at Windows' HDR awfulness (PDFs go dark), plus HDR is crap on this screen anyway, plus I'm not gonna watch movies on it. 8 bit is probably fine given I don't do visual stuff any more.

Realistically, for games, I've got a 1080Ti and 160hz is a pipe dream anyway, so I'll probably leave it on 10 bit and 120hz for now.

It's annoying as fuck that the manual doesn't explain the best combo of settings though.
 
No one can even see the difference between 144hz and 175hz anyway. It's too subtle.

You can notice 120hz -> 240hz, but 144 to 175 is almost nothing.
 
Can HDR be disabled on the monitor? The 34gk950f did not have the ability to disable HDR.
 
Can HDR be disabled on the monitor? The 34gk950f did not have the ability to disable HDR.

The OS defines it afaik: if the monitor doesn't receive an HDR signal then HDR shows as inactive, and if you turn it on in Windows then it shows as active. There's no "disable" option I can see, but it might be badly labeled, the manual is fuckin terrible.

I don’t really do anything video wise but if I get some time I’ll play around with it. Unfortunately I have shitty broadband so I’m not sure how but I’ll work it out.
 
So what's the highest refresh we can go on this monitor without losing colour for SDR games?

160Hz 4:4:4?
 
What about 8bit color?
Is it 160hz 4:4:4?

And 10bit colour is only needed for HDR, is that correct?

Or does 10bit colour add something to SDR as well?
 
Some graphics and visualisation things use 10 bit, but it’s one of those “don’t need it if you don’t know you do” things.

So it’s pretty much HDR, which is shit for this brightness. So it’s no great loss. If you wanted it in a game I’d just lower the refresh for that game.
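For context on what 10 bit actually buys: it's just finer quantisation per channel, which mostly shows up as less banding in smooth gradients (and gives HDR's wider brightness range more headroom). A quick illustration of the step sizes:

```python
# Tonal resolution per channel at each bit depth. 10-bit has 4x the
# levels of 8-bit, so each gradient band is a quarter the size.
for bits in (8, 10):
    levels = 2 ** bits
    step = 1 / (levels - 1)   # fraction of full range per quantisation step
    print(f"{bits}-bit: {levels} levels/channel, step = {step:.5f} of full range")
```

In SDR desktop use those extra steps are rarely visible, which is why trading 10 bit for refresh rate is usually the right call here.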

I think we're a couple of years away from having a good combination of image quality, resolution, HDR and refresh rate, but I guess we'll see after CES. Kinda hope I'm wrong, but I also don't want to have to explain another purchase this time next year.

This is almost there; the PG35VQ is almost there with different compromises. It'll keep evolving until we get 10K-nit, 8K, VRR, emissive screens or micro-LED. It's just finding your sweet spot.

The prices can fuck off though. It's madness, even when you can afford it. Then again, I had an LCD back when they were thousands, so it's not the first time. Hopefully there's a downturn at some point so manufacturers stop cranking the price handle on consumer electronics for nominal improvements.
 
Everything just needs to go HDMI and stay there. I'm sick of all the stupid arbitrary standards. AVRs are already HDMI so you might as well just go with that. It's too bad that even HDMI 2.1 is a little underpowered.

HDMI is royalty-ridden, so there will always be a place for DisplayPort. DP is also more flexible, with Thunderbolt / USB-C integration.

HDMI exists separately because of the MPAA and their need for control. And HDMI technology has always been a ripoff of DVI and now DP :/
 
HDMI is royalty-ridden, so there will always be a place for DisplayPort. DP is also more flexible, with Thunderbolt / USB-C integration.

HDMI exists separately because of the MPAA and their need for control. And HDMI technology has always been a ripoff of DVI and now DP :/

I agree with that BUT the bottom line is that AVRs don't support display port, so it's ultimately useless.

We need seamless computer -> AVR integration.
 
I agree with that BUT the bottom line is that AVRs don't support display port, so it's ultimately useless.

Eh, I'd more support a hard delineation between the two. HDMI support on the desktop / computer space in general is a bit of a necessary evil. DisplayPort may not always be superior in lock-step, but it's likely to be the better transport most of the time. It's certainly more flexible today if you're pushing the limits, like needing to bond outputs for a single higher data rate display like 8k60 or some such.

We need seamless computer -> AVR integration.

As much as I agree with the sentiment, and I do believe we'll get there eventually, external interests (MPAA) are going to force the industry to take the long way around.

Fortunately, since HDMI moved from DVI-style TMDS signalling in version one to DP-like packet streams with HDMI 2.1's FRL, it's likely everything will devolve into USB4, i.e. Thunderbolt. If we get 'one connector to rule them all' in the next decade, that will be the one.
 
Eh, I'd more support a hard delineation between the two. HDMI support on the desktop / computer space in general is a bit of a necessary evil. DisplayPort may not always be superior in lock-step, but it's likely to be the better transport most of the time. It's certainly more flexible today if you're pushing the limits, like needing to bond outputs for a single higher data rate display like 8k60 or some such.



As much as I agree with the sentiment, and I do believe we'll get there eventually, external interests (MPAA) are going to force the industry to take the long way around.

Fortunately, since HDMI moved from DVI-style TMDS signalling in version one to DP-like packet streams with HDMI 2.1's FRL, it's likely everything will devolve into USB4, i.e. Thunderbolt. If we get 'one connector to rule them all' in the next decade, that will be the one.

I'm okay with anything as long as it lets people do what they want to do. The HDMI-audio/DP-video Frankenstein mess is getting really old.
 
I'm okay with anything as long as it lets people do what they want to do. The HDMI-audio/DP-video Frankenstein mess is getting really old.

Well, they're both holding the industry back at this moment. Neither has the bandwidth for what display panels can actually do.
 
Is anyone using the variable backlight feature? This is the first monitor I've had with it, and I'm wondering what the best setting is, or if it's even worth using. So far very happy with the monitor, just waiting to install the Ergotron HX stand.
Thanks
 
Is anyone using the variable backlight feature? This is the first monitor I've had with it, and I'm wondering what the best setting is, or if it's even worth using. So far very happy with the monitor, just waiting to install the Ergotron HX stand.
Thanks
Hey man, can you check if RGB controls are available in sRGB mode?
 
From TFTcentral:

Very little info about these two at the moment but LG are also showcasing at CES their new 34" 34GN850 (HDR 400 only) and 37.5" 38GN950 (HDR 600). Both have 1ms Nano IPS panels and 160Hz overclocked refresh rate, but that's about all we have. Let us know if you see any more info!
I’d guess 38GN950 might be the FreeSync equivalent of this.

 
Now we can only hope it doesn't take another year to get those out, and that the cost is lower.
Indeed. This is exactly the sort of (reasonable) thing I’ve been waiting for, and it’s finally within sight. I think I should be returning the LG 5K I bought a few days ago after growing sick of waiting.
 
The thing is, the G-Sync version is going to be better because of variable overdrive. Once you're spending over a grand for a monitor, I'm not quibbling over a couple hundred bucks here and there.
 
The thing about CES is that anything they show or talk about, we won't actually be able to buy until Fall at the earliest, and often that's being generous on the timeline, meaning a whole year or two after. All we can do is hope and wait.
 
g-sync is much more effective than freesync. Once you play on a 120hz g-sync display it is very hard to go back to a normal display; the dedicated hardware module synchronizes frames with the gpu very effectively. They say oleds don't need g-sync because the pixel response is too fast, but I highly doubt that statement.
 
The thing about CES is that anything they show or talk about, we won't actually be able to buy until Fall at the earliest, and often that's being generous on the timeline, meaning a whole year or two after. All we can do is hope and wait.

Yeah, likely Q3 or Q4 the monitors will come in. Maybe there's a chance we get these earlier since the 950G has already been released, but I'm not betting on it.

g-sync is much more effective than freesync. Once you play on a 120hz g-sync display it is very hard to go back to a normal display; the dedicated hardware module synchronizes frames with the gpu very effectively. They say oleds don't need g-sync because the pixel response is too fast, but I highly doubt that statement.

I've used both G-Sync and FreeSync monitors; I won't deny G-Sync is superior, but I also don't see much of a difference on good FreeSync ones. It's good enough for me.
 
gsync will always be superior. That's what makes this monitor so valuable.

Dunno about 'always', but likely for the foreseeable future. Essentially, for Freesync (etc.) to equal or exceed G-Sync, a similar level of hardware support must be added.
 
Dunno about 'always', but likely for the foreseeable future. Essentially, for Freesync (etc.) to equal or exceed G-Sync, a similar level of hardware support must be added.
Of course. "Always" - until a better technology shows up. But before that happens gsync will be the best option for the smoothest framerates. VRR may be good and even abundant for most people, but for the ultimate smoothness you're gonna have to use gsync. Nothing on the horizon can rival hardware gsync synchronization.
 
Not really true, depends on the panel. The LG27GL850 wouldn't be meaningfully improved by gsync.

Any monitor is meaningfully improved by GSync because you need variable overdrive to minimize overshoot when running games that run at different speeds.

Have you tried playing a 30hz game on a 144hz+ Freesync monitor? It's a fucking joke. You can see literal silhouettes around moving objects, the overshoot is so bad.
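The variable-overdrive argument can be sketched with a toy model. This is purely illustrative, with made-up numbers: the pixel time constant, refresh rates and gains are assumptions, not measurements of any real panel. The point is that an overdrive gain tuned for one frame time overshoots when frames get longer, unless the gain is recomputed per frame:

```python
import math

# Toy first-order pixel: dL/dt = (V - L)/tau. Overdrive drives
# V = target + k*(target - start) for one frame, with k tuned so the
# pixel lands exactly on target after one frame at the tuned refresh.
# tau and refresh rates are illustrative assumptions only; real panels
# also clamp the drive voltage, which this sketch omits.
TAU = 0.010  # assumed pixel time constant, 10 ms

def od_gain(frame_time):
    """Overdrive gain that lands on target in exactly one frame."""
    e = math.exp(-frame_time / TAU)
    return e / (1 - e)

def level_after(start, target, k, frame_time):
    """Pixel level at the end of one frame driven with gain k."""
    v = target + k * (target - start)
    return v + (start - v) * math.exp(-frame_time / TAU)

start, target = 0.2, 0.8
k_fixed = od_gain(1 / 144)  # fixed overdrive, tuned once for 144hz

on_target = level_after(start, target, k_fixed, 1 / 144)    # lands on target
overshoot = level_after(start, target, k_fixed, 1 / 30)     # shoots well past it
variable = level_after(start, target, od_gain(1 / 30), 1 / 30)  # retuned, lands again
print(f"144hz fixed: {on_target:.3f}, 30hz fixed: {overshoot:.3f}, "
      f"30hz variable: {variable:.3f}")
```

In this model the 30hz frame with fixed overdrive sails far past the target level, which is the "silhouette" artifact; recomputing the gain per frame (what a G-Sync module's variable overdrive does) removes it.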
 
Any monitor is meaningfully improved by GSync because you need variable overdrive to minimize overshoot when running games that run at different speeds.

Have you tried playing a 30hz game on a 144hz+ Freesync monitor? It's a fucking joke. You can see literal silhouettes around moving objects, the overshoot is so bad.

This will depend on the individual monitor a lot and how Freesync is implemented. I haven't noticed any real difference between my 1440p 144 Hz G-Sync vs 5120x1440 120 Hz Freesync 2 displays in VRR performance but to be fair I never hit 30 fps in anything. G-Sync may be better but if they can offer Freesync 2 with better inputs, PbP mode, better HDR etc at lower cost then it's fine.
 
This will depend on the individual monitor a lot and how Freesync is implemented. I haven't noticed any real difference between my 1440p 144 Hz G-Sync vs 5120x1440 120 Hz Freesync 2 displays in VRR performance but to be fair I never hit 30 fps in anything. G-Sync may be better but if they can offer Freesync 2 with better inputs, PbP mode, better HDR etc at lower cost then it's fine.

In my experience it's universal across all hardware and what it comes down to is the software you run. If you're like me and primarily play games that run in emulators that run at weird lower refresh rates, the difference is night and day. To be fair, this probably isn't the most common use case, but then again, if it wasn't for emulation, I'm not sure I'd really give a shit about variable refresh at all.
 
Any monitor is meaningfully improved by GSync because you need variable overdrive to minimize overshoot when running games that run at different speeds.

Have you tried playing a 30hz game on a 144hz+ Freesync monitor? It's a fucking joke. You can see literal silhouettes around moving objects, the overshoot is so bad.

You didn't read my post, and just mashed your fingers on the keyboard. It depends on the panel. The LG27GL850 is special because on "normal" overdrive mode it has a consistent ~6ms response time with no overshoot across its entire range of refresh rates. At "Fast", it drops down to 5ms with only mild overshoot that TFTCentral reported to be nearly invisible.
 
You didn't read my post, and just mashed your fingers on the keyboard. It depends on the panel. The LG27GL850 is special because on "normal" overdrive mode it has a consistent ~6ms response time with no overshoot across its entire range of refresh rates. At "Fast", it drops down to 5ms with only mild overshoot that TFTCentral reported to be nearly invisible.

Yeah actually it's universal across all panels. Thanks for playing.
 