LG 38GL950G - 37.5" 3840x1600/G-Sync/175Hz

Is there any evidence of a Super 2080TI? All the leaks I've seen are 2060/2070/2080 only. It's entirely possible they won't do a Super 2080TI and will instead do a new Titan at a later date based on Ampere, or just save an updated 2080TI for later, depending on what their 2020 launch schedule looks like.

The 2080TI is far from being threatened by any AMD card, so there's really no reason to update it anytime soon, honestly.

The only real reason they would need to is if the new RTX 2080 Super got too close to its performance.
 
The only real reason they would need to is if the new RTX 2080 Super got too close to its performance.

Well, rumor is the 2080 Super will have 3072 CUDA cores; the 2080TI has 4352. That's about 4% more cores than the regular 2080's 2944, and maybe they boost clocks a little bit too, so the 2080 Super might be 5-8% better than the 2080, which still leaves at least 25% more performance for the 2080TI. 25% is a LOT if you're trying to get acceptable framerates at 4K.

So yeah, I don't think the 2080 Super will threaten the 2080TI at all.
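
To sanity-check those percentages, here's a quick back-of-the-envelope script. The 3072 figure is still just the rumor; 2944 (2080) and 4352 (2080TI) are the shipping core counts:

```python
# Core-count comparison; 3072 for the 2080 Super is the rumored figure,
# 2944 (2080) and 4352 (2080TI) are the shipping specs.
cores = {"2080": 2944, "2080 Super (rumored)": 3072, "2080TI": 4352}

super_vs_2080 = cores["2080 Super (rumored)"] / cores["2080"] - 1
ti_vs_super = cores["2080TI"] / cores["2080 Super (rumored)"] - 1

print(f"2080 Super vs 2080:   {super_vs_2080:+.1%} cores")  # about +4.3%
print(f"2080TI vs 2080 Super: {ti_vs_super:+.1%} cores")    # about +41.7%
```

Even if higher clocks close some of that gap, ~42% more cores leaves plenty of room for the TI to keep a 25%+ real-world lead.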
 
I bought 780Ti's in January 2014, and 1080Ti's in May 2017. Personally I skipped this whole gen and am waiting for 7nm and HDMI 2.1, and will probably again buy later in the cycle when the Ti's come out.
 
Is there any evidence of a Super 2080TI? All the leaks I've seen are 2060/2070/2080 only. It's entirely possible they won't do a Super 2080TI and will instead do a new Titan at a later date based on Ampere, or just save an updated 2080TI for later, depending on what their 2020 launch schedule looks like.

The 2080TI is far from being threatened by any AMD card, so there's really no reason to update it anytime soon, honestly.

So far there has been no mention of a Super 2080Ti from Nvidia. Some people believe Nvidia won't make a Super version of the 2080Ti and some people are guessing/hoping they will make one. Right now it doesn't exist nor is it planned.
 
So far there has been no mention of a Super 2080Ti from Nvidia. Some people believe Nvidia won't make a Super version of the 2080Ti and some people are guessing/hoping they will make one. Right now it doesn't exist nor is it planned.

It doesn't exist, but you cannot know one isn't planned. It's entirely logical that Nvidia will create such a card, and I strongly suspect they will, just a bit later than the others.
 
It doesn't exist, but you cannot know one isn't planned. It's entirely logical that Nvidia will create such a card, and I strongly suspect they will, just a bit later than the others.

I'm just saying there is no word of one yet from Nvidia. Could it happen? Sure. It's all speculative at this point though. That's all I'm saying.
 
Boom, sooner than I expected. Day 1 purchase for me.

Sadly, until I move house next year and have more space, I can’t go to my monitor endgame of dual ultrawides rather than separate work and play setups.

Says it all about the price escalation over the last couple of years that the pricing didn’t make me blink.
 
Here is a quick preview of this monitor starting around the 5 min mark. I do not like this guy, but I was interested in this crazy setup from Origin; to my surprise, while watching the video, I saw they were using the 38GL950G. Looks nice!!

 
Can’t wait to get my 38GL950G for testing.
Can’t wait to find out about the fan. I’ve read that the PG35VQ had a much quieter fan than the original G-Sync Ultimate monitors. Hopefully this monitor will have the same improvement.

This is definitely my next long-term monitor if all things check out.
 
Taking a look at the back of the monitor in the Unbox Therapy video (5:57 mark), I’m not seeing vents for the fan unless it’s using the RGB ring area for ventilation.
 
Can’t wait to find out about the fan. I’ve read that the PG35VQ had a much quieter fan than the original G-Sync Ultimate monitors. Hopefully this monitor will have the same improvement.

This is definitely my next long-term monitor if all things check out.

I thought they were just going G-Sync "Compatible", meaning no module and no need for a fan.
 
I thought they were just going G-Sync "Compatible", meaning no module and no need for a fan.
The 27” model got downgraded to ‘G-Sync Compatible’ while this one is still listed as WITH G-Sync. Plus it only has two inputs (1 DP and 1 HDMI) like models with a G-Sync module.
 
The 27” model got downgraded to ‘G-Sync Compatible’ while this one is still listed as WITH G-Sync. Plus it only has two inputs (1 DP and 1 HDMI) like models with a G-Sync module.

Interesting. Price will probably be very high then - much higher than the CRG9. I'm guessing $2200-2500 - I hope I'm wrong.

Maybe there will be other companies making gaming monitors in this resolution - a really good monitor in 3840x1600 or 5120x1440 might get me to abandon NV Surround...
 
Interesting. Price will probably be very high then - much higher than the CRG9. I'm guessing $2200-2500 - I hope I'm wrong.

Maybe there will be other companies making gaming monitors in this resolution - a really good monitor in 3840x1600 or 5120x1440 might get me to abandon NV Surround...

If they sell enough and it's profitable, some others may try to get in on it too. I'm going to guess, though, that it won't be a huge market, but I could be wrong.
 
Can’t wait to find out about the fan. I’ve read that the PG35VQ had a much quieter fan than the original G-Sync Ultimate monitors. Hopefully this monitor will have the same improvement.

This is definitely my next long-term monitor if all things check out.
I don't think this model will have an internal fan since it doesn't support local dimming.
 
No mention of a fan, but that wasn’t much of a ‘review’. Although it was probably something one could figure out with math, I didn’t realize that it would only be able to do 120Hz with 10-bit color and 4:4:4 enabled.
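
For anyone who wants to check the math: DisplayPort 1.4 carries about 25.92 Gbit/s of effective payload (HBR3, 8.1 Gbit/s per lane, 4 lanes, after 8b/10b encoding). Here's a rough sketch that ignores blanking overhead, so the real limits are slightly tighter:

```python
# Rough DP 1.4 bandwidth check for 3840x1600 at 4:4:4 (blanking overhead ignored).
DP14_GBPS = 25.92  # HBR3: 8.1 Gbit/s x 4 lanes x 0.8 (8b/10b encoding)

def needed_gbps(width, height, hz, bits_per_channel):
    # RGB 4:4:4 = 3 full-resolution channels per pixel
    return width * height * hz * bits_per_channel * 3 / 1e9

for hz, bpc in [(175, 8), (175, 10), (144, 10), (120, 10)]:
    need = needed_gbps(3840, 1600, hz, bpc)
    verdict = "fits" if need <= DP14_GBPS else "does NOT fit"
    print(f"{hz}Hz @ {bpc}-bit: {need:.1f} Gbit/s -> {verdict}")
```

10-bit at 144Hz already needs ~26.5 Gbit/s, over the link budget, so 120Hz is the highest standard rate that fits with 10-bit 4:4:4.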
 
No mention of a fan, but that wasn’t much of a ‘review’. Although it was probably something one could figure out with math, I didn’t realize that it would only be able to do 120Hz with 10-bit color and 4:4:4 enabled.

Using 10-bit on such a limited-capability "HDR" panel seems like a waste of bandwidth anyway.
 
Wouldn’t it be useful to smooth out gradients? Especially with a panel that has such a high color gamut?

In theory yes, though I have no idea how many games even support 10-bit output. I assume it's probably only HDR games... and even some of those have pretty broken HDR implementations that you wouldn't want to use. Personally I would just leave it in 8-bit unless I'm running an HDR game and see banding that I'm unhappy with. I'm honestly not convinced it's worth turning on HDR at all on DisplayHDR 400 monitors.

If I buy this monitor my plan is to play Cyberpunk and other upcoming HDR games on an LG OLED, not on this thing.
 
In theory yes, though I have no idea how many games even support 10-bit output. I assume it's probably only HDR games... and even some of those have pretty broken HDR implementations that you wouldn't want to use. Personally I would just leave it in 8-bit unless I'm running an HDR game and see banding that I'm unhappy with. I'm honestly not convinced it's worth turning on HDR at all on DisplayHDR 400 monitors.

If I buy this monitor my plan is to play Cyberpunk and other upcoming HDR games on an LG OLED, not on this thing.
Yeah, but I would imagine people do things on their computers other than just gaming lol

I agree about the OLED thing though. Hopefully Nvidia supports VRR over HDMI on their next refresh.
 
In theory yes, though I have no idea how many games even support 10-bit output. I assume it's probably only HDR games... and even some of those have pretty broken HDR implementations that you wouldn't want to use. Personally I would just leave it in 8-bit unless I'm running an HDR game and see banding that I'm unhappy with. I'm honestly not convinced it's worth turning on HDR at all on DisplayHDR 400 monitors.

If I buy this monitor my plan is to play Cyberpunk and other upcoming HDR games on an LG OLED, not on this thing.
I agree that HDR gaming on PC is a mess. The other issue is having to enable and disable HDR in the display settings when you're not gaming, since leaving it on washes out the desktop.
 
Wouldn’t it be useful to smooth out gradients? Especially with a panel that has such a high color gamut?
It definitely would, I suppose. Smooth out gradients and deepen the colors. The HDR color gamut is fully supported by the panel; it's just the luminance that is crippled, so the bulbs displayed will not be blindingly bright as shown in the Linus video.
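
As a toy illustration of the gradient point (just a generic quantization sketch, nothing specific to this panel):

```python
# Quantize an ideal smooth ramp at 8-bit vs 10-bit and count the distinct
# steps -- fewer, coarser steps are what show up as visible banding.
ramp = [i / 99_999 for i in range(100_000)]  # an ideal 0..1 gradient

for bits in (8, 10):
    levels = 2 ** bits
    steps = {round(v * (levels - 1)) for v in ramp}
    print(f"{bits}-bit: {len(steps)} distinct steps")  # 256 vs 1024
```

10-bit gives 4x finer steps across the same ramp, so each band in a slow gradient is a quarter as wide; whether that's actually visible depends on the content.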
 
The Linus video confirms that they didn't fuck this one up in any obvious way at least. Sounds like a winner.

VA blows, and Danny DeVito monitors blow. You want the height that you get with a 37.5". Anything smaller is simply too short.
 
Yeah. Cannot wait for this display. If the IPS glow is not too severe and it does not exhibit other flaws, like the sorts of banding I've seen on other overclocked IPS panels, then it will be a keeper for me for many fucking years. Something extremely advanced will have to come out to knock it off my desk. Probably a 120Hz 4K OLED 38"+ with hardware G-Sync, but even then I will not be in a hurry to upgrade from this LG. Years go by, gaming monitors become less important (for me at least), and still they suck shit. I see no advancement in performance since the CRT era. For now, an IPS without apparent flaws, with G-Sync, is good enough.
 
Yeah, but I would imagine people do things on their computers other than just gaming lol

Sure, but it's my understanding that consumer GPUs do NOT allow proper 10-bit output on the desktop, only in DirectX. On top of that, to make any use of 10-bit capability, the software you're running itself has to support it. So if you're running a Quadro GPU and want 10-bit for pro Photoshop work, you can get that... but you cannot get it with a GeForce card, and this is intentional.

Basically 10-bit is full of caveats and special cases; it's not just a matter of flipping it on in display settings and then everything is better.
 
Holy f@q. I discovered this site over 12 years ago while researching cases and completely forgot about it... looking for a new monitor, I discovered this gem again.

It looks like monitors have essentially stood still since the launch of the iPhone. Yes, we have faster refresh and more resolution, but overall, as a package, they still haven't kept up with advancements elsewhere.

At reasonable desktop sizes:

Color Accuracy/Grading for film: $30K+ FSI or Sony OLED
Color Accuracy/Uniformity/Print: $6K Eizo CG319X
Gaming Oriented/Quasi HDR: $1-3K, pick your poison

The new Asus ProArt PA32UCX looks promising but likely has high lag, and I'm not sure about dropping $4K on an Asus. Same goes for the new Apple Pro Display, which won't work with a PC anyways. If you have the real estate and funds, then multiple displays are about the only option if you care about mixing workflow with gaming.

Overall, though, this LG is enticing, especially with the additional vertical res.

2nd post in 12 years, gotta be a record... be back in about 10+ years, maybe we'll have a respectable display by then.
 
Well, say it costs an added $20 on every TV set to do that. Considering they sell millions, and less than, what, 1% would ever use that input, they do a cost analysis.

OK, so make a special model for computer users, charge $200 more for that DP input, and call it a day!
(The special model can be identical to the retail one, just with DP inside...)
 