LG 38GL950G - 37.5" 3840x1600/G-Sync/175Hz

If you're going to bet, bet on Nvidia.


Hah, yeah. I've always been a default "underdog" supporter. I've tried to favor AMD and ATI, separately and together, for decades (since the mid-90s in the case of AMD CPUs). Yet time and time again it's been proven to me that, regardless of their staying power as underdogs, they've never been able to consistently deliver. In fact, in what I remember of their history, whenever they've managed to take the crown, they've never been able to hold it through the next iteration. Couple that, in the case of ATI (or the GPU side of AMD), with consistently troublesome drivers and support, and I've pretty much finally capitulated to both Intel and Nvidia, especially for gaming.

I'd love to see that change, but it's such a long-running and consistent trend at this point that I'm kind of amazed it's even a question anymore. Yes, AMD does upset the order every so often, and regularly delivers the "bang-for-buck" in the upper-middle tier, but it rarely competes at the highest performance level two iterations in a row, and on the CPU end it's been just as guilty as Intel of offering a near-zero upgrade path from one generation to the next. On top of which, I remember, owned, and suffered through numerous "on paper" top-performer AMD CPUs that ultimately had major issues with thermal runaway, constant crashes on chips run within spec, etc., and similar with ATI GPUs. All the while, I'd watch friends' Intel/Nvidia rigs just chug along.

I say this acknowledging the irony that I think I just killed a 1080 Ti recently, but in fairness, it was an MSI Armor OC that I watercooled to compensate for the poor stock cooling (MSI's fault, and a whole fiasco unto itself), and I'm probably responsible regardless (physical abuse). I'll most likely be getting another 1080 Ti to replace it. The Radeon VII sounds nice, but... why would I go for that versus a used 1080 Ti for $200 or so cheaper?
 
I have a Radeon 390 in one computer, and another system has been through several Nvidia cards: two 970s, a 980 Ti, and now a 2080 Ti. I haven't really noticed AMD drivers being any worse than Nvidia's. When I bought the 390, it seemed like the most sensible midrange card in its price range.

In terms of CPUs, AMD has had far better support with its AM4 platform. It supports Zen 1/Zen 1+/Zen 2 and looks likely to support Zen 2+. So if you upgraded every time, that's four processors without changing the motherboard. Granted, the road to that support has been a rocky one, and Zen 2 still seems to be in a bit of flux, as many manufacturers still don't have the latest AGESA version out, etc.

I really want AMD to be able to challenge Nvidia properly, not just in the midrange market, which at this point isn't relevant to me as I move towards 4K resolutions. They keep promising a 2080 Ti killer, but if it comes out when Nvidia is already a generation and a half ahead, it's no good.

I'm glad that Nvidia finally caved and added FreeSync support, as it will lead to more GPU choices for your chosen display. It looks like in the future they'll aim G-Sync more towards high-end models, as more and more manufacturers would rather make FreeSync 2 displays, which are probably cheaper to build with no reliance on Nvidia's chips.
 
I agree with you wholeheartedly. I guess the only reason for me to even make a post about it at this stage is hope that at some point they'll realize there *is* still plenty of room to offer value to customers, even if they aren't always the top performer. If they could add value in terms of a real upgrade path, without replacing everything every iteration, and at least maintain a consistent focus and ethos, then when they did manage to steal the performance crown and beat Intel on price, being loyal would be very rewarding. Motherboard architecture could always have been designed to be significantly more modular, but the industry is generally against it: iterative hardware cycles offer less perceptible performance increase and less turnover of components, so forcing a new chipset locked to a marginally different platform is about the only thing that drives sales of a big component like a motherboard. Still, if anybody is in a position to upset this, it's the underdog, but they're just as invested in that cycle as Intel is.

I've always disliked the exclusionary tactics of Nvidia and Intel, but when the product was so consistently refined, and usually on top in the price/performance tier I was interested in, it became pretty hard not to choose them, especially when it was the "easy choice" in terms of knowing what I was getting into.

I know AMD drivers have gotten better, and they really do seem to be the better option under Linux these days, which is kind of amazing considering the history.

Admittedly, the last ATI cards I had were a 7870 and, later, an R9 390 8GB that served me well and gave a great ROI when I sold it to get a 1080 Ti. My buddy is actually still using the 7870 on a recent 7600K build, and can honestly still run most games at 1080p with decent FPS. However, I consistently had issues with Catalyst during all those years, and he's had some hiccups with it too. I'm not saying Nvidia doesn't have the same problems; just that, for whatever reason, Nvidia and Intel have both managed to maintain the "perception" of general refinement that "just works" better than AMD has, at least for me.

It's also that I'm interested in higher refresh rates and resolutions, at increasing price tags, and if I'm going to drop that kind of money, I want the highest level of certainty that I'll spend my limited personal time enjoying that expense in game, not tweaking drivers (ironic, since I did make the switch to ultrawide, and that probably ends up requiring more tweaking, from lack of support, than any amount of video card drivers ever did, I know). I was planning to stick with ATI GPUs, but at the time, coming off a big cycle, especially with everything going on with the miners, it was a no-brainer when I managed to snag a 1080 Ti at release. It had its caveats, but those were manufacturer problems combined with the mining circumstances. I would definitely have preferred to stick with what I knew; I'd been so long invested in the ATI ethos that I hadn't had an Nvidia card in at least the 10 years previous.

Now I'm not going to switch again unless there's a compelling and consistent reason to, and that'll have to be in performance (either the top, or very close with much greater value), price, and added value, not just fluff. That's going to be difficult, especially now that I've bitten the bullet on a high-end G-Sync display. However, I totally agree with you about the FreeSync issue; it should never have been a question, frankly.

I'm certainly very glad that Intel and Nvidia have competition in AMD. Honestly, how bad (in terms of business ethics and anti-consumer choices) would they be without it? They've both shown their colors in the past when they thought they had the competition wiped out. So I'm in no way decrying AMD, just tossing my opinion out on the internet in hopes that maybe some day someone at AMD will listen, get a clue, and realize how great they could be with the right focus, especially now, when being pro-consumer can so effectively and permanently increase brand loyalty and overall value. Although I guess we're really just waxing philosophical about general corporatism in the current meta of mainstream capitalism at this point, and you could apply the same to any number of competitors in any market. ;)
 
Has anyone reviewed this monitor yet? It's been listed on a local Australian website.

I just bought the Asus XG43, though, and although I haven't properly tested it yet, I'm thinking of selling it and ordering this LG.
 
It's not available yet for most of the world (not sure about Oz), and I don't believe anyone has it; there are no reviews I'm aware of... but I've heard it could be available around September. It's nearly twice the price of the XG43, here in the UK anyway... and it sure isn't twice as good! I'd be wary about entering the IPS panel lottery again... always a major headache in my experience!
 
Not really a review, but it's something...

Posted several pages ago, and pretty useless tbh, because Linus doesn't properly test/evaluate response times, contrast, or anything else.

I'm also not a fan of putting manufacturer lies ("1ms!") front and center. It's deliberately misleading.
 
Well, I'm in Australia and can order it now; apparently I'd get it on the 25th of September.

$3,000 AUD ($2,011 USD equivalent, but we have 10% tax, plus shit is expensive here). Man, the exchange rate sucks. I'm really torn; I've been waiting for an ultrawide with these sorts of specs for a very long time, but oof, that feels like a lot of money.

Shame I can't business-expense it. Well, shouldn't rather than can't, I guess: I just don't fancy a conversation with the tax man about why I need a multi-thousand-dollar gaming screen for work. Obviously if it didn't say "Gaming" and I bought, say, the new Apple display, they'd be fine with it :cautious:

Gonna look into how much I can get for my XG2703-GS.
 
You could tell him something along the lines of how you find this specific screen format and resolution (DPI) optimal for your kind of work, but the killer feature is the 1ms, because you feel discomfort from all the blur during text scrolling and window dragging on a typical slow office screen, which reduces your work efficiency.
 
So can anyone explain what LG is thinking with the pricing of this thing? Did they just look at the 35" FALD ASUS and Acer displays and say "yeah it’s the same"?

The screen sells for 2,500 euros in my country, which I think is bonkers for an HDR400 screen.
 
One of the reasons for the higher price is the real-deal G-Sync chip. Plus, it has the 35" FALD monitors beat on size, pixel speed, and resolution.
 
The G-SYNC Ultimate FPGA is only like $300-$400 from what has been spread around, so you're still looking at a price north of €2,000. And lest we forget, 34" "gaming" 21:9 monitors were all around the $1,500 mark when they first hit the market.
 
The price is fine as long as we see sales around the $1,400 mark. True G-Sync is worth the money, from my testing.
 
For that price I would buy one for sure, even converted to euros. It's pretty much the only game in town right now for a 38" high-refresh panel with more than 1440 lines of vertical resolution. I don't know why manufacturers seem to really hate vertical resolution nowadays, as every larger ultrawide would be better at 1600 vertical.
 
Yeah, but that's retail price. The 34GK950 is listed at $1,500, but 6 months later could be found for ~$800-$900.

Good luck with that lol... it may get a slight reduction, but not that much lol! Not unless it ends up being a big pile of poop.
 
Maybe but it wouldn’t be unprecedented considering I gave you a valid example of their last generation panels dipping equally in price.
 
Maybe but it wouldn’t be unprecedented considering I gave you a valid example of their last generation panels dipping equally in price.

They weren't exactly unique though... there are more than a few 34" monitors out there, so they didn't exactly have that market sector to themselves. The price just came down in line with the other options, which were very similar in spec. The 38GL950G is quite different... there's nothing else out there like it, so I can see LG and retailers keeping the price high on these for a long time yet.
 
That would require selling them at that price. It's so high that I would expect only enthusiasts to bite, and not having proper HDR at such a high price is going to drive a lot of people away. It's a good size, refresh rate, and resolution, but it really needs to be top-tier everything else at current pricing.
 
I'd like to think you'd be right... this monitor simply isn't worth such a high price tag, but expecting such a large drop within 6 months of release is very much wishful thinking and not particularly realistic, unless it isn't very good... in which case it won't really matter. Maybe a year down the line. I think BLB and glow on an IPS screen this size are going to make it a no-go anyway... LG have a bad track record here as it is, and I've seen no evidence that they (or anyone else) care or intend to do anything about it going forwards. It just seems to be an accepted 'feature' of IPS now, and you gotta deal with it.
 
I'm agonizing over waiting for this monitor and paying the premium for it vs. just buying a 34GK950F right now.

The pros of the 34" are immediate availability, greater local stock so i can be picky with regards to panel lottery (if neccessary) and of course saving about $700-900. The only real con I can think of is that, somehow, the 34" just feels slightly too small vertically. I worry that I would regret it in 6 months time.

The 38" just ticks soooo many boxes. But the main con here is its not even available yet, and likely wont be widely available for 3-8 months if the launch experience of the 34" is any indicator (LG seems to be horrible at product launches). Who knows how long it will be before there is enough stock to have the luxury about being choosy on panel quality. I know that only in the last month as my local dealer finally had real amounts of stock on hand of the 34" and that's, what, almost a year after launch? Frankly the price and HDR dont even matter to me (much).
 
All the manufacturers announce monitors way, way in advance of actual release. A 1+ year lead time is very common; 2 years is not unheard of either. I don't like it any more than you do. Just saying.
 
Just wait. The 34" is too short and we both know it. Don't accept Danny DeVito monitors.
 
They've already said the price of this is going to be 2300 Euros though.
The €2,300 price includes VAT, which averages out to 20% across the EU; without VAT it's €1,916. The EU also generally has higher import duties. Combined with higher shipping rates, due to the difficulty of shipping from southeast Asia to the EU compared to the US, I would expect the US retail price of this monitor to be around $1,800 USD. Average sales tax across the US is 5%, so that would only add $90 to the price at time of sale.
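
For anyone double-checking that arithmetic, here's a minimal Python sketch (the ~20% VAT average, the ~5% US sales tax, and the $1,800 retail guess are this post's rough figures, not official rates):

```python
# Sanity-checking the price arithmetic above (rates are this post's
# rough averages, not official figures).
price_inc_vat = 2300.0   # euros, VAT included
vat_rate = 0.20          # ~20% average across the EU

price_ex_vat = price_inc_vat / (1 + vat_rate)
print(f"Ex-VAT: €{price_ex_vat:,.2f}")   # €1,916.67

us_retail_guess = 1800.0  # hypothetical US retail price, per the guess above
print(f"US sales tax adds: ${us_retail_guess * 0.05:,.0f}")  # $90
```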
 
I was all into this monitor until I heard the C9 OLED is getting VRR. ;)
 
Xbox and AMD have been able to do VRR on Samsung 4K TVs for a while now. AMD GPUs can even run 1440p 120Hz, since the panels in the more recent TVs are 120Hz. Nvidia supporting VRR and FreeSync, and still calling it G-Sync in their drivers, is nice, but without a DisplayPort input on the TVs you aren't getting 4K 100Hz 4:4:4 10-bit, or 120Hz 8-bit 4:4:4... Nvidia GPUs still lack HDMI 2.1, and most TVs do too so far. What good is VRR from 48Hz to 60Hz with all of that sample-and-hold blur? Unless you're running non-native resolution, perhaps, but then the quality is muddied.

Sure, someday we'll have HDMI 2.1 120Hz 4K VRR + QFT OLED TVs, and Samsung high-density, high-nit HDR FALD TVs, as well as the next gen of Xbox and the PS5 most likely doing some hacked form of 4K 120Hz over HDMI 2.1... and someday Nvidia will release an HDMI 2.1 GPU that does VRR. Until then, these monitor manufacturers are going to profit off of 100Hz+ 4K (and ultrawide) DisplayPort monitors.
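
To put rough numbers on the bandwidth point, a back-of-the-envelope Python sketch (the ~5% blanking overhead is an assumption based on reduced-blanking timings; real timings vary):

```python
# Approximate link-rate check for the 4:4:4 claims above.
# The ~5% blanking overhead assumes CVT-RB2-style reduced blanking; real
# timings vary, which is why 4K 10-bit monitors often top out around 98 Hz.

def required_gbps(h, v, hz, bits_per_channel, blanking=0.05):
    bits_per_pixel = 3 * bits_per_channel  # RGB, i.e. 4:4:4
    return h * v * hz * bits_per_pixel * (1 + blanking) / 1e9

HDMI_2_0 = 14.4      # Gbps usable (18 Gbps line rate minus 8b/10b coding)
DP_1_4_HBR3 = 25.92  # Gbps usable

for label, mode in [("4K 120Hz 8-bit 4:4:4",  (3840, 2160, 120, 8)),
                    ("4K 100Hz 10-bit 4:4:4", (3840, 2160, 100, 10))]:
    need = required_gbps(*mode)
    print(f"{label}: ~{need:.1f} Gbps needed "
          f"(HDMI 2.0 carries {HDMI_2_0}, DP 1.4 carries {DP_1_4_HBR3})")

# Both modes come out around 25-26 Gbps: far beyond HDMI 2.0, and only
# just within (or just past) DP 1.4's budget.
```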
 
The RTX 2080 Ti can really only do 60fps in the most demanding titles at 4K regardless, so 60Hz is a non-issue, and VRR will help smooth out those dips to 56-57fps. Once a GPU powerful enough to push closer to 100fps at 4K comes out, it will also have HDMI 2.1 to enable that full 120Hz anyway. I only play slow single-player games on my OLED and leave anything fast-paced to my 240Hz monitor.
 
Yeah, 60Hz is not cool for many games, especially when you can experience the same games at 120Hz with real G-Sync on another PC at the same time.
 
I would not go back to 60 Hz even on the desktop. Yesterday my display had for some reason dropped to 60 Hz and I quickly felt something was off. 120+ Hz just feels so much more responsive.

3840x1600 is a pretty good resolution, 38" is a good size for it, and a 2080 Ti would run it quite nicely. LG just really messed up by making the pricing unappealing without at least HDR600. Even without FALD, just having decent HDR capability would make the high price palatable. I'm still probably going for the 43" 4K models, because I like the form factor better and can still use ultrawide resolutions if I want.

It's a tough situation where this monitor is too expensive while the 43" models are still coming, and everything else is either excessively wide (5120x1440 49") or just 3440x1440, which I don't consider enough of an upgrade from 1440p.
 
Another preview. It does look a bit narrow though.


Also, if I understood him correctly, he said "November" at the end.
 
Very different monitors though. If you have any serious professional/work considerations, the 43" is basically no good: too much gamma/contrast shift, poor colour reproduction, and the BGR subpixel issue... none of which will remotely be a problem on the 38GL950G (although typical IPS glow/bleed might be). The 43" may be preferable for gaming for some people, given the punchier colours you get with VA, and the bigger size of course, but the LG is more of an all-rounder for those that need a work AND gaming screen, with perhaps greater emphasis on the work aspect.

That said, 4K is the same horizontal res as the 38GL950G but has an extra 35% of pixels in height... and height makes a huge difference when it comes to productivity. Personally, I would find the specs of the LG on a 43" 16:9 monitor FAR better suited to my needs.
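
That 35% figure is just the ratio of the vertical resolutions; a quick check in Python:

```python
# The "extra 35%" above: 4K's 2160 lines vs the 38GL950G's 1600.
print(f"{2160 / 1600 - 1:.0%}")  # 35%
```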
 
I think we'll need to see how the XG43UQ and the Acer CG437K P turn out. As for work, I've edited images on an 8-bit TN panel and it has been fine, but I don't do color-critical print work etc., and to be honest most gaming displays are not suited for that anyway. If the colors are reasonably accurate head-on, TN or VA will do just fine; you can always double-check on a better display. The 38" LG has its own issues for this kind of stuff due to being curved.

I considered the dual QHD Samsung CRG90 but seeing it in person it just seemed awkward, having more vertical space is often just more useful.
 
For reference, on vertical space at about the same PPI:
- the largest 4K field would be about 41" diagonal (or far enough away to appear that size to your perspective)
- I never added a 3840x1600 ultrawide to the graphic, but you can extrapolate.

[image: screen size / PPI comparison graphic]



What you are paying for is 120Hz 4K with DisplayPort (squeezed into too small a pipe) and gaming overdrive, while the companies can milk you for it, since HDMI 2.1 displays and the HDMI 2.1 GPU roadmap are still far off. We'll probably have HDMI 2.1 120Hz 4K VRR + QFT TVs, and HDMI 2.1 consoles with some hacked/pseudo 4K 120Hz like the PS5, before we get an Nvidia HDMI 2.1 / DP 2.0 GPU. So no, you can't just get a 120Hz 4K 600-nit ABL-capped OLED to run off a PC at 120Hz 4K for $2,000, or a Samsung 512-zone QLED FALD with over 1,000 nits of color brightness in HDR. Anything over 43", even at around 3' to 3.5' away at a big desk setup, will have bad PPI and text unless it's much further away. A 48" at 3' to 3.5' away is down to the PPI of a 1440p 32" 16:9 viewed nearer... even a bit worse. A 55" is 80.11 PPI and way too massive for anything other than a couch or a separate, more distant monitor station.

The Dell gaming OLED that they're supposed to produce is said to be SDR only, kept at a ~400-nit SDR limit, most likely to avoid burn-in... and it's rumored to be $4,000. The Asus ProArt 32" 16:9 HDR1600 monitor, with 1152 FALD zones and pro color etc., won't be out for up to a year and is rumored to be $5,000.
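
If anyone wants to redo the PPI math in this post, here's a small Python sketch (flat-panel diagonal PPI, ignoring curvature; the sizes are the ones discussed above):

```python
import math

def ppi(h_px, v_px, diag_in):
    """Pixels per inch along the diagonal of a flat panel."""
    return math.hypot(h_px, v_px) / diag_in

print(f'38GL950G 37.5" 3840x1600: {ppi(3840, 1600, 37.5):.1f} ppi')  # ~110.9
print(f'43" 4K:   {ppi(3840, 2160, 43):.1f} ppi')  # ~102.5
print(f'48" 4K:   {ppi(3840, 2160, 48):.1f} ppi')  # ~91.8
print(f'32" 1440p: {ppi(2560, 1440, 32):.1f} ppi')  # ~91.8 (the 48" match above)
print(f'55" 4K:   {ppi(3840, 2160, 55):.1f} ppi')  # ~80.1

# At roughly 106-111 ppi, a full 4K field works out to about 40-41"
# diagonal, in line with the "about 41 inch" figure earlier in the thread.
```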
 
I've done graphics work on a curved monitor; it's not a problem at all. Architectural stuff, where you do have to be absolutely accurate, would be problematic, however. Otherwise, it shouldn't pose any issues. But of course, you aren't buying the 38GL950G if you're a professional graphics artist or photographer... there are far better-suited monitors for that. This is more for the relatively well-off hobbyist who wants to be able to do that and game too, but doesn't want two monitors.

As for the XG43UQ and CG437K, I expect them to be identical to the XG438Q, just a bit brighter. You can see from the panel specs on panelook that there is nothing between them at all... unless they do orient it the other way round... but if they do, it begs the question of why they didn't in the first place. All the other issues, gamma shift, response times, g2g etc., I bet money will be the same... save for the usual panel-to-panel discrepancies. This is VA we're talking about... it's obvious now that we have to accept its limitations, as they've clearly hit a wall with how far they can push the technology. I'd like to be wrong and see both these monitors offer something better in terms of the actual user experience, but I think that's mostly wishful thinking.
 