LG 38GL950G - 37.5" 3840x1600/G-Sync/175Hz

I too did photo editing on a curved 27" display. It might take some getting used to, but it is doable. Your images appear curved, and your brain needs time to compensate for the curvature. A flat monitor is definitely the better choice for photo editing and graphics.
 
Something I'd never say again after 2010 on a pc gaming display.

Well, you know what I would never say again? No OLED is a non-issue. And 60Hz is the price to pay atm if you want to game on an OLED. It just destroys any current LCD tech in terms of picture quality until 1 million+ zone FALD, MicroLED, or dual-layer LCDs arrive. And I know you are talking about purely PC gaming, but just know some of the best games out there are unfortunately console exclusives and will not run above 60fps. Are you telling me you absolutely refuse to play those entirely due to the 60Hz limitation? And besides, if 120Hz is a must... the LG C9 supports 1440p 120Hz, still making it a superior display to this overpriced ultrawide.
 
Well, you know what I would never say again? No OLED is a non-issue. And 60Hz is the price to pay atm if you want to game on an OLED. It just destroys any current LCD tech in terms of picture quality until 1 million+ zone FALD, MicroLED, or dual-layer LCDs arrive. And I know you are talking about purely PC gaming, but just know some of the best games out there are unfortunately console exclusives and will not run above 60fps. Are you telling me you absolutely refuse to play those entirely due to the 60Hz limitation? And besides, if 120Hz is a must... the LG C9 supports 1440p 120Hz, still making it a superior display to this overpriced ultrawide.

Yes, but unless you ONLY game on your PC, and spend lots of hours doing other things on it, burn-in is a serious risk... and if you're unfortunate enough to suffer from it, you're not even covered under warranty. Casual gaming poses little issue for OLED, even with on-screen HUDs, unless you're playing 5-6 hours a day. That's excessive for most people, but it's nothing for general PC use... I'm at mine for 10+ hours a day with fixed windows that can't be hidden, so OLED is just not suitable... I laugh when I see people mention auto-hiding taskbars and changing desktop wallpapers... LOL!! Do they think people who work on their PCs at home sit staring at the desktop all day?? :D

55" is obviously impractical anyway at present, but even if OLED were available at 40", a lot of people would be taking a crazy risk with it, not to mention it would be price-prohibitive anyway. This is what's going to prevent OLED from coming to the PC market... manufacturers know this. Just look at Dell and how their 30" OLED was discontinued. OLED will never become mainstream for PC unless they can resolve this, and I'm not sure they ever will given how inherent it is to the tech. At best it will be a niche and overpriced option, because average gamers will never pay thousands for it. By the time you can get a 27" OLED for $500, we'll all be dead. And I suspect MicroLED will be here before then anyway.

This LG is silly money for what it is, but it doesn't surprise me in the least... has been the way of the monitor industry for some years now, and I see no sign of it stopping. Always enough people willing to pay these prices, sadly.
 
Yes, but unless you ONLY game on your PC, and spend lots of hours doing other things on it, burn-in is a serious risk... and if you're unfortunate enough to suffer from it, you're not even covered under warranty. Casual gaming poses little issue for OLED, even with on-screen HUDs, unless you're playing 5-6 hours a day. That's excessive for most people, but it's nothing for general PC use... I'm at mine for 10+ hours a day with fixed windows that can't be hidden, so OLED is just not suitable...

I actually think it would be fine, as long as you are willing to pay some attention and make some minor changes. I agree it is not suitable for the average user on a work/general-use machine. Red/yellow icons are the biggest risks, and even then, people act like burn-in breaks your display... you need REALLY severe burn-in before it's noticeable without setting your screen to a single color and inspecting it closely. Most of my non-gaming computer use is in a web browser, so honestly the orange Firefox icon in the top left is probably the biggest risk factor. I'd have to see what I can do about it (maybe do custom builds with the icon changed, or find some way to hide the title bar). Tabs get changed often enough that the viewport won't be an issue, and my text editor and chat programs have no red or yellow static UI elements, so they are low risk, similar to the FIFA UI that took over 5,000 hours to produce even a smidge of barely detectable burn-in.

Personally, I would accept that minor burn-in after 2-3 years of 8h/day usage as the simple cost of doing business for having a 4K 120Hz OLED... I mean, in the end they're relatively cheap displays at around $1500 for the image quality you get. I paid $3500 for the first 32" 4K 60Hz monitor. What's stopping me from going OLED on the desktop is the fact that 55" is just too damn big for comfortable desktop use, no matter how you compensate for it by moving farther away etc.
 
I know the RTINGS torture tests are extreme scenarios, but be aware that they were conducted entirely in SDR mode at under 400 nits, most at 200 nits... and that some of them had only a logo as the static part of the screen.

Burn-in is probably why I heard the Dell gaming OLED will be 400 nits, which is in SDR range, with no HDR. A normal LG OLED TV's ABL keeps scene color brightness down to 600 nits, but you could say TVs are made for more dynamic content than PCs and consoles (even so, sports and news have static logos and readouts).

Speaking of maximum color nits being kept down as a burn-in risk-reducing safety factor: as far as I know, all of the RTINGS OLED "torture" tests were run in SDR, not with HDR color brightness hitting the screens. In fact, I just looked it up again and, as their test details show, most were done at 200 nits, with their max TV set to 380 nits.


"


  • The total duration of static content. LG has told us that they expect it to be cumulative, so static content which is present for 30 minutes twice a day is equivalent to one hour of static content once per day.
  • The brightness of the static content. Our maximum brightness CNN TV has more severe burn-in than our 200 nits brightness CNN TV.
  • The colors of the static areas. We found that in our 20/7 Burn-in Test the red sub-pixel is the fastest to degrade, followed by blue and then green.
"
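The cumulative rule RTINGS quotes above is purely additive, which is easy to sanity-check. A minimal sketch (the helper name is mine, just for illustration):

```python
# Sketch of the cumulative static-content rule quoted above:
# exposure adds up regardless of how it's split across sessions.
def cumulative_static_hours(sessions_per_day, minutes_per_session, days):
    """Total hours of static content shown over a period (purely additive)."""
    return sessions_per_day * minutes_per_session * days / 60.0

# 30 minutes twice a day is equivalent to one hour once per day:
assert cumulative_static_hours(2, 30, 365) == cumulative_static_hours(1, 60, 365)
```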

The CNN "MAXIMUM" brightness test they did is only 380 nits:
"As above, live CNN is played on the TV through a cable feed. However, for this TV, the 'OLED Light' is set to maximum, which corresponds to a brightness of 380 nits on our checkerboard pattern. This is to show the relationship between burn-in rate and 'OLED Light' with the exact same content and over the same time period."


-------------
Consoles don't run at 120Hz. You'd be surprised how many people on TV forums use LG OLED TVs as monitors.

Well, even now the Xbox One supports VRR on some Samsung TVs (the TVs themselves already support 1080p 120Hz and 1440p 120Hz; they just lack HDMI 2.1 for 4K 120Hz). The next gen of consoles is supposed to support 4K 120Hz.


Microsoft's next Xbox and Sony's PlayStation 5 are going to play games super smoothly, but you might need a new TV to make the most of them
The new game consoles we're expecting from Microsoft and Sony in 2020 will be capable of running games at 4K resolution at 120 frames per second.

https://www.businessinsider.com/best-tvs-for-next-xbox-project-scarlett-playstation-5-120hz-2019-6

------------------------

We are in a middle ground right now due to hdmi 2.1 rollout time constraints and product roadmaps.

----- [ END QUOTES ] ---------
 
I know the RTINGS torture tests are extreme scenarios, but be aware that they were conducted entirely in SDR mode at under 400 nits, most at 200 nits... and that some of them had only a logo as the static part of the screen.

I don't consider this particularly relevant, though, and I find it strange that people think HDR brightness levels would ever matter. Windows HDR mode on the desktop is still unusable afaik, at least with OLED TVs, and in general there's not much purpose to it. HDR games are still very uncommon, and even then, I can't think of a single HDR game that I would put more than 50-100 hours into. Plus, HDR games do not use high brightness levels for UI -- in fact, generally the opposite.

I cannot imagine any scenario in which you would see brightness peaks on a desktop PC that you would not also see in a console gaming or movie-watching scenario. Nobody is running software that makes their mouse pointer or taskbar 800 nits or something.

And contrary to popular belief, HDR scenes in general are NOT brighter than SDR scenes. After all, full-screen average brightness on OLEDs is limited to 150-180 nits or whatever, depending on model year. MOST HDR scenes are deliberately graded to be around 100 nits average, same as SDR.
 
Yes, but unless you ONLY game on your PC, and spend lots of hours doing other things on it, burn-in is a serious risk... and if you're unfortunate enough to suffer from it, you're not even covered under warranty. Casual gaming poses little issue for OLED, even with on-screen HUDs, unless you're playing 5-6 hours a day. That's excessive for most people, but it's nothing for general PC use... I'm at mine for 10+ hours a day with fixed windows that can't be hidden, so OLED is just not suitable... I laugh when I see people mention auto-hiding taskbars and changing desktop wallpapers... LOL!! Do they think people who work on their PCs at home sit staring at the desktop all day?? :D

55" is obviously impractical anyway at present, but even if OLED were available at 40", a lot of people would be taking a crazy risk with it, not to mention it would be price-prohibitive anyway. This is what's going to prevent OLED from coming to the PC market... manufacturers know this. Just look at Dell and how their 30" OLED was discontinued. OLED will never become mainstream for PC unless they can resolve this, and I'm not sure they ever will given how inherent it is to the tech. At best it will be a niche and overpriced option, because average gamers will never pay thousands for it. By the time you can get a 27" OLED for $500, we'll all be dead. And I suspect MicroLED will be here before then anyway.

This LG is silly money for what it is, but it doesn't surprise me in the least... has been the way of the monitor industry for some years now, and I see no sign of it stopping. Always enough people willing to pay these prices, sadly.

Yes, I do only game on my OLED, because 55" is unusable to me as a desktop monitor. I don't think many people are recommending you do your desktop work on a 55" OLED anyway, but as a gaming display you are really not doing yourself any favors by skipping out on OLED. Sure, its HDR performance isn't PERFECT, but neither is any LCD tech atm.
 
Well, it would seem to make sense that peak nits are a concern, and it would be quite a coincidence that those limits were in the torture tests and that the Dell gaming OLED is reportedly at that same ~400 nit SDR cap.
"The brightness of the static content. Our maximum brightness CNN TV has more severe burn-in than our 200 nits brightness CNN TV."
"...live CNN is played on the TV through a cable feed. However, for this TV, the 'OLED Light' is set to maximum, which corresponds to a brightness of 380 nits on our checkerboard pattern."

-----------------------------

I also posted those quotes in reply to the 60Hz console comments here, because the next gen of consoles, Xbox and PS5, is supposed to do some quasi-4K 120Hz gaming, which I'd assume would be over HDMI 2.1.
"The new game consoles we're expecting from Microsoft and Sony in 2020 will be capable of running games at 4K resolution at 120 frames per second."
- business insider article

-----------------------------

So eventually all the better TVs will have HDMI 2.1; it's just that we'll then have to wait for PC GPUs and display sizes of 43" or less that have it as well. It's starting to sound to me like both consoles and TVs may have HDMI 2.1 well before PC monitors and Nvidia GPUs.


I agree that the 55" or larger size is also a big hurdle for the cheaper TVs as well as flagship LG OLEDs and flagship Samsung FALD TVs, which I think start at 65". 55" is way too big imo for even 3' away. Even a 48" at 3' to 3.5' is the same or worse ppi than a 1440p 32", too. You could always have a PC or mini-ITX PC in your living room with a couch master lapdesk-type setup, or adopt a far-away command-center desk setup if you have the room, I guess.

-----------------------------

For now, in this middle ground without HDMI 2.1 (and with some poorly implemented and less available HDR), we are being charged a premium for a narrow DisplayPort pipe to do 4K 120Hz and similar resolutions off PC GPUs, because HDMI 2.0 is insufficient.
- and that's using GPUs that aren't really powerful enough yet for full 4K rez at high enough fps ranges to benefit from 120Hz on the more demanding games, even dialing the settings down a bit. That's where the ultrawide rez in this thread could help a bit, or running a 21:9 rez on a larger 4K gaming monitor for some games. Of course, with more powerful GPUs the arbitrarily set graphics ceiling always goes up, so it's a game of leapfrog, and high-Hz 4K could really benefit from a much larger leap in GPU power.
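As a back-of-the-envelope check on why HDMI 2.0 can't carry these modes, a sketch (the helper name is mine; it ignores blanking intervals and link-encoding overhead, so real requirements are somewhat higher):

```python
# Rough uncompressed video data rate. Ignores blanking and the link's
# TMDS/FRL encoding overhead, so actual link requirements are higher.
def data_rate_gbps(width, height, hz, bits_per_channel=8):
    return width * height * hz * bits_per_channel * 3 / 1e9  # 3 channels (RGB)

rate = data_rate_gbps(3840, 2160, 120, 10)  # 4K 120Hz, 10-bit color
print(f"{rate:.1f} Gbps")
# ~29.9 Gbps of raw pixel data: beyond HDMI 2.0's 18 Gbps, tight even for
# DisplayPort 1.4's 32.4 Gbps without DSC, comfortable for HDMI 2.1's 48 Gbps.
```

By the same math, 4K 60Hz 8-bit is only about 12 Gbps of raw pixel data, which is why HDMI 2.0 handles it.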

I like to think that in the future HDR will be ubiquitous, and that photos, wallpapers, games, video streaming services, and show/movie libraries, full screen or in window panes, Twitch/YouTube etc., will all be HDR-enabled content. Like moving to 1080p and then 4K, but in the 3D colorspace. That's obviously some time off yet.

I'm hoping some dual-layer LCDs make it to market in consumer TVs in the next few years, and either that gets perfected and trickles down into larger 32"-43" monitors, or micro-LED comes around after that. The ASUS ProArt PA32UCG, a 32" mini-LED 4K 120Hz VRR monitor with 1152-zone FALD and HDR 1600, sounds great for the time being, but it's $5000 and up to a year away. That's a professional content-creation monitor and would be crazy money for a gaming monitor imo. The Dell gaming OLED is rumored to be $4000 also, for comparison.
 
All I'm trying to say is, do you really wanna throw $2000 at this ultrawide NOW? Hell, I don't think you can even buy this thing yet... and by the time you can buy it, that new driver from Nvidia for HDMI VRR on the C9 OLED will be out. Buying this thing (or any other craptastic LCD gaming monitor, for that matter) seems like a huge waste of money, as a C9 is cheaper and far more future-proof with full-bandwidth HDMI 2.1 ports. Yes, I get it, you can't take advantage of 4K 120Hz on a 2080 Ti, but 1440p 120Hz is natively supported on the C9, and besides, the 2080 Ti is really only enough for 1440p 120fps, not 4K 120fps, anyway. You can get a C9 over this ultrawide, enjoy 1440p 120Hz VRR OLED right off the bat, then upgrade to a 3080 Ti to fully unlock 4K 120Hz VRR once it comes out. That just seems like a much better investment IMO.
 
All I'm trying to say is, do you really wanna throw $2000 at this ultrawide NOW? Hell, I don't think you can even buy this thing yet... and by the time you can buy it, that new driver from Nvidia for HDMI VRR on the C9 OLED will be out. Buying this thing (or any other craptastic LCD gaming monitor, for that matter) seems like a huge waste of money, as a C9 is cheaper and far more future-proof with full-bandwidth HDMI 2.1 ports. Yes, I get it, you can't take advantage of 4K 120Hz on a 2080 Ti, but 1440p 120Hz is natively supported on the C9, and besides, the 2080 Ti is really only enough for 1440p 120fps, not 4K 120fps, anyway. You can get a C9 over this ultrawide, enjoy 1440p 120Hz VRR OLED right off the bat, then upgrade to a 3080 Ti to fully unlock 4K 120Hz VRR once it comes out. That just seems like a much better investment IMO.

Sure, but that's ignoring the biggest problem, which is size. You need a pretty deep desk for even the smallest LG OLEDs so it isn't too close. Hell, I would have to reconfigure the whole damn room to accommodate it. OLED TVs become feasible for me when they start selling them at around 43".

I agree the LG ultrawide is just overpriced for what it is. Shave it down to, say, 1500-1800 euros and I'd probably buy one, but at 2000+ it needs to have HDR1000 FALD, which it doesn't. Even if the rest is great.
 
Sure, but that's ignoring the biggest problem, which is size. You need a pretty deep desk for even the smallest LG OLEDs so it isn't too close. Hell, I would have to reconfigure the whole damn room to accommodate it. OLED TVs become feasible for me when they start selling them at around 43".

I agree the LG ultrawide is just overpriced for what it is. Shave it down to, say, 1500-1800 euros and I'd probably buy one, but at 2000+ it needs to have HDR1000 FALD, which it doesn't. Even if the rest is great.


We won't be seeing 43" OLEDs for a good couple of years at least, and I suspect much longer. Even the LG 48" doesn't have a release date yet, so it could be late next year, and it was announced a while back now. There aren't even rumours of anything smaller, and I strongly suspect LG will see how well the 48" sells before even considering that.

I wonder how long the price will stick, though? The Predator X35 is up for pre-order now in the UK for £2200... I'm struggling to see an argument for the LG at the same price. This should be a £1500 monitor at most, and even that's pushing it. Even the XG438Q, with all its faults, would be a more attractive proposition for most people.
 
We won't be seeing 43" OLEDs for a good couple of years at least, and I suspect much longer. Even the LG 48" doesn't have a release date yet, so it could be late next year, and it was announced a while back now. There aren't even rumours of anything smaller, and I strongly suspect LG will see how well the 48" sells before even considering that.

I wonder how long the price will stick, though? The Predator X35 is up for pre-order now in the UK for £2200... I'm struggling to see an argument for the LG at the same price. This should be a £1500 monitor at most, and even that's pushing it. Even the XG438Q, with all its faults, would be a more attractive proposition for most people.

I would not be surprised if the price dropped in a few months after release. It doesn’t seem like LG has a better HDR version of this panel coming to compete with it. I doubt they manage to sell many in Europe with the current pricing ranging from 2000-2500 euros.
 
Pushed back to Oct 16th on B&H. Total failure in the monitor industry!
 
You can basically get one of the 34" high refresh monitors and a C9 for the same price as this. I've seen the 34GK950f on sale for $800 and the OLED $1300 (sometimes less).

That's what I would do personally. Use the 34" for desktop use and the OLED for gaming only. That said, this will probably drop significantly a few months after release; it'll be a better buy then.
 
I'm being conservative when I say that HDR won't matter for another 10 years.

It just isn't there yet. Going to take a long time.
 
I wouldn't say it isn't going to matter for that long, but the higher-tier real implementations of HDR 1000, HDR 1600, and eventually HDR 4000 and HDR 10,000 in games and movies, in hardware via FALD, mini-LED FALD, perhaps dual-layer LCD, and later micro-LED, are going to be relegated to enthusiasts and very high price tags (on monitors) for quite some time.

Unfortunately, HDMI 2.1 quasi-4K 120Hz next-gen consoles, as well as 4K HDR movies on flagship TVs, will probably have better HDR "out of the box" well before PC monitors and PC games get HDMI 2.1 GPUs, stronger HDR implementations, more ubiquitous true HDR hardware at normalized pricing, and better HDR standards available in more games by default.

------------------------------------------------------------------

To your point though, HDMI 2.0 for 4K 60Hz came out in 2013, and it's now 2019 and only a few 4K 120Hz screens are coming out, still only DisplayPort-capable screens with too-narrow bandwidth, so sort of jury-rigged. So probably 7 years between HDMI 2.0 and HDMI 2.1, maybe 8 for PC GPUs.

Going from 1080p 120Hz to 4K 120Hz has taken a long time overall and still isn't quite here yet.

The availability of 4K material is pretty ubiquitous now, but it took a while.
Per wiki:
"The 4K television market share increased as prices fell dramatically during 2014[2] and 2015. By 2020, more than half of U.S. households are expected to have 4K-capable TVs,[3] a much faster adoption rate than that of Full HD (1080p).[4]"


HDR is in 75+ PC games, with some performing better than others.
https://pcgamingwiki.com/wiki/List_of_games_that_support_high_dynamic_range_display_(HDR)

Xbox one
https://en.wikipedia.org/wiki/List_of_Xbox_One_X_enhanced_games

PS4, about 117 titles (though some are iterations of the same game title through different years' versions)
https://www.resetera.com/threads/all-games-with-ps4-pro-enhancements.3101/

https://abload.de/img/newbonuslistsmfk8o.png


When I hear that people have to fuss around with backlight or white point in PC HDR games, it makes me think HDR has a ways to go yet, since it is supposed to use absolute color values, including all of the color brightness values that make up a 3D color gamut. You aren't supposed to have to adjust any kind of brightness the way you cheat brightness/dimness up and down within SDR's narrow range.
 
The main problem with HDR on PC is how it's done in Windows. Microsoft really needs to sort their shit out, because just enabling HDR messes up the colors on the desktop. There needs to be a "use SDR on the desktop" option to go with it so that it can work seamlessly in games that support it.
 
When I hear that people have to fuss around with backlight or white point in PC HDR games, it makes me think HDR has a ways to go yet, since it is supposed to use absolute color values, including all of the color brightness values that make up a 3D color gamut. You aren't supposed to have to adjust any kind of brightness the way you cheat brightness/dimness up and down within SDR's narrow range.

That's funny, because in another thread I was told (not by you) that Gears 5 has "too bright" HDR and that I'm not supposed to expect things to just work 100% with HDR, so it's my fault for not understanding how to adjust settings to get a proper image. Not everyone is an expert in HDR and knows what to look for in order to calibrate every single game they play. I believe HDR should indeed "just work" at the flip of a switch.
 
HLG is still relative, which isn't true HDR; true HDR should use absolute values.

PQ (Perceptual quantization) gamma curve is based on the characteristics of human visual perception.

HLG (Hybrid Log-Gamma) is sort of a sloppy way to make quasi-HDR work with inadequate SDR-based systems, live un-encoded video feeds, and lower-quality quasi-HDR hardware. There's no reason games should be using the makeshift relative version, other than that most of the displays people are using for HDR material are inadequate in the first place.


https://www.eizo.com/library/management/ins-and-outs-of-hdr/index2.html/

 
All I'm trying to say is, do you really wanna throw $2000 at this ultrawide NOW? Hell I don't think you can even buy this thing yet....and by the time you can buy it, that new driver from nvidia will be out for HDMI VRR on the C9 OLED. Buying this thing (OR any other craptastic LCD gaming monitor for that matter) seems like a huge waste of money as a C9 is cheaper and far more future proof with full bandwidth HDMI 2.1 ports. Yes I get it you can't take advantage of 4k120Hz on a 2080 Ti but 1440p120Hz is natively supported on the C9 and besides the 2080 Ti is really only enough for 1440p120fps, not 4k120fps anyways. You can get a C9 over this ultrawide, enjoy 1440p120Hz VRR OLED right off the bat, then upgrade to a 3080 Ti to fully unlock 4k120Hz VRR once it comes out. That just seems like a much better investment IMO.

Yeah, no thanks. I've had an OLED as a monitor and it's WAY too big. Not everyone wants a 50"+ screen on their desk. I have an OLED that we use for movies and such. But people in this thread saying "go BUY AN OLED" are crazy. Guys, an LG C9 OLED is NOT a monitor. IMO 32"-38" is the max monitor size for usability and field of view (and trust me, I've owned a lot of monitors and TVs-as-monitors :p )
 
Yeah, no thanks. I've had an OLED as a monitor and it's WAY too big. Not everyone wants a 50"+ screen on their desk. I have an OLED that we use for movies and such. But people in this thread saying "go BUY AN OLED" are crazy. Guys, an LG C9 OLED is NOT a monitor. IMO 32"-38" is the max monitor size for usability and field of view (and trust me, I've owned a lot of monitors and TVs-as-monitors :p )

Bah, I never said to use the dang thing as a monitor. As I've said many times before, just get a secondary 32" 4K monitor for your desktop needs and reserve the OLED for gaming. On top of being too big, you also run the risk of burn-in by using it for desktop stuff. For $2000 you can easily pick up a C9 PLUS a 32" 4K monitor.
 
Bah, I never said to use the dang thing as a monitor. As I've said many times before, just get a secondary 32" 4K monitor for your desktop needs and reserve the OLED for gaming. On top of being too big, you also run the risk of burn-in by using it for desktop stuff. For $2000 you can easily pick up a C9 PLUS a 32" 4K monitor.

Not following. Are you suggesting having both a 55" OLED for gaming and a separate smaller monitor on your desk for other stuff? That would be awkward at best.

The 55" C9 or the 2020 version is on my wish list for my living room setup. Too bad there's no Pascal support for VRR.
 
Not following. Are you suggesting having both a 55" OLED for gaming and a separate smaller monitor on your desk for other stuff? That would be awkward at best.

The 55" C9 or the 2020 version is on my wish list for my living room setup. Too bad there's no Pascal support for VRR.

Yes. It's what I do atm and it's not awkward at all. In fact, I actually bounce between 3 displays: Acer X27, OLED TV, and Asus XG248Q.
 
Yeah, I don't know how you accommodate a 55" plus another display without it looking awkward and turning the room into a server/flight-control room. You can have several separate PC desks in one room, especially if you have many rooms, but I don't think that's cool either. lol
 
Meh, I've thrown in my 2 cents. I would do whatever I could for that OLED picture quality, but if one would rather skip it all entirely because they don't wanna make a proper setup for it, that's their loss.
 
I'd say about 4' minimum for a 55" is the closest I'd go, from screen surface to my eyeballs. That's really not extremely far away, but it's definitely not a regular desk distance, and perhaps farther away than that would be more comfortable than the nearest reasonable distance I'm proposing.

It seems like if you subtract about 1/6th of the monitor's diagonal size, you get a rough estimate of a reasonable "nearest" viewing distance to work from (a rough "nearest" estimate, not necessarily the "best" distance).

Monitor size divided by 6, times 5 = viewing distance
------------------------------------------------------------------------
15" = 12.5" (around 1')
27" = 22.5" (a bit under 2')
32" = 26.6" (a few inch past 2')
43" = 35.8" (about 3')
55" = 45.8" (3.8' - 4')
65" = 54.16" (4.5')
70" = 58.33" (4.86' - 5')
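The rule of thumb above as a quick sketch (the function name is mine, just for illustration):

```python
# Nearest comfortable viewing distance, per the rule of thumb above:
# about 5/6 of the diagonal (i.e. the diagonal minus 1/6th of it).
def nearest_viewing_distance_in(diagonal_in):
    return diagonal_in * 5 / 6

for size in (15, 27, 32, 43, 55, 65, 70):
    print(f'{size}" screen -> about {nearest_viewing_distance_in(size):.1f}" away')
# e.g. a 55" works out to ~45.8", matching the table above
```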




Once I can get 4K 120Hz HDR + VRR over HDMI 2.1 from a PC GPU to the same specs on an OLED, or on a Samsung flagship FALD TV, an 1152-zone HDR 1600 ProArt-like mini-LED display, a dual-layer LCD, etc., I'll reconsider a farther-away command-room desk setup for a 55"+ sized screen, but that sounds like it will be a while yet from Nvidia, and even then I usually wait for the Ti tier of GPU, which could take even a bit longer.

Otherwise I'll probably keep OLED or a flagship Samsung FALD as a living room purchase on a giant 70"+ TV in the next year or two, and I'll be running a PS5 at 120Hz VRR with quasi-4K rez games way before Nvidia gets on board with HDMI 2.1.
 
The main problem with HDR on PC is how it's done in Windows. Microsoft really needs to sort their shit out, because just enabling HDR messes up the colors on the desktop. There needs to be a "use SDR on the desktop" option to go with it so that it can work seamlessly in games that support it.
HDR on the desktop seems to be working a lot better in 1903 so far than it did in previous versions, but I agree that games should automatically detect if the connected display is HDR-capable and expose the option without needing to change it globally.
 
HDR on the desktop seems to be working a lot better in 1903 so far than it did in previous versions, but I agree that games should automatically detect if the connected display is HDR-capable and expose the option without needing to change it globally.

My Samsung CRG9 begs to differ. All SDR content looks washed out; HDR looks good. I don't know quite what the solution is here, because MS supports HDR in windowed apps too, so you could for example have an HDR video playing in a tiny window on an otherwise-SDR desktop. I wish they at least gave us an "only use HDR in fullscreen" option, as that would work for most games.
 
Not following. Are you suggesting having both a 55" OLED for gaming and a separate smaller monitor on your desk for other stuff? That would be awkward at best.
Yeah, I don't know how you accommodate a 55" plus another display without it looking awkward and turning the room into a server/flight-control room. You can have several separate PC desks in one room, especially if you have many rooms, but I don't think that's cool either. lol
I’d say, why not just have a regular monitor on the desk, and have the OLED TV in the living room? Lapboards are excellent these days, as are wireless mice and headphones. Would it really be such a poor gaming experience as some people make it out to be? Perhaps, but I personally doubt it.
 
I’d say, why not just have a regular monitor on the desk, and have the OLED TV in the living room? Lapboards are excellent these days, as are wireless mice and headphones. Would it really be such a poor gaming experience as some people make it out to be? Perhaps, but I personally doubt it.

That's what I failed to mention. I don't suggest trying to squeeze everything into a single desk space, but rather have your desktop monitor setup with your PC on your desk, have the OLED separately off to the side somewhere, and use wireless peripherals when gaming on it. That's been working well for me for years.
 
HDR on the desktop seems to be working a lot better in 1903 so far than it did in previous versions, but I agree that games should automatically detect if the connected display is HDR-capable and expose the option without needing to change it globally.

Need to give you a reason to upgrade to Windows 11.
 
I assume nobody has managed to get their hands on one of these yet? I put my pre-order in within 5 minutes of it going live on B&H on Sept. 10th but watching their website the release date just seems to get pushed back every week...
 
LG support in Europe said the end of November. I wouldn't be holding my breath if I were you.
 
Your field of view will be less when you game at 16:9... 21:9 immerses you and you see more horizontally and whatnot.

Most monitor discussions devolve into "too big / too small / not enough pixels" when the aspect ratio is often the defining factor. When the super wide monitors came out, people said "if it was taller and less wide, it would be great!" - as if such a thing doesn't already exist or the 32:9 was some sort of accident.
 
The 21:9 and 32:9 monitors that are currently available are very short physically.

You can play practically every ultrawide-capable game you'd want at an ultrawide resolution on a big 4K 16:9 monitor. In the case of OLED, its per-pixel emissive display means the letterbox border would be completely turned-off pixels, with no backlight like an LCD has. Even a high-density FALD display can turn the letterboxed parts to full black. If you wanted to go farther in a dedicated PC room, you could put up dark black soundproofing, other black panels or freestanding black divider(s), or paint the wall black behind the big screen.

An ultrawide resolution on a large display results in a much larger ultrawide than is currently available (using a 16:9 at 40" or 43"... 55" is huge unless set a bit farther away, at around 4' viewing distance, but would work if you had a command-center island desk separate from a typical desk setup).

It also allows you to get slightly higher frame rates instead of the crippling full 4K resolution, which will be especially important once HDMI 2.1 GPUs come out, since high Hz requires high fps fed to it. So running an ultrawide rez on a large 16:9 would be useful for the most demanding PC games, while still having the ability to go full screen for easier-to-render games, older games, dialed-in/down games, and massive desktop/app space outside of gaming... and of course media playback at different aspect ratios. Most ultrawides are quite short in height, and if you ever added more resolution to the sides of a 4K panel to make a larger ultrawide, the frame rates would be even more crippled.
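A quick way to see the frame-rate headroom from the letterboxed ultrawide approach (a sketch; fps rarely scales perfectly linearly with pixel count):

```python
# Pixel-count math behind running an ultrawide rez letterboxed on a
# 16:9 4K panel: fewer pixels to render means frame-rate headroom.
full_4k = 3840 * 2160
uw      = 3840 * 1600   # 24:10 ultrawide crop of the same panel

savings = 1 - uw / full_4k
print(f"{savings:.0%} fewer pixels to render")  # ~26% fewer
```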

Other than requiring a greater viewing distance, the main problem with the 55" TVs for now is that even with HDMI 2.1 inputs, there are no HDMI 2.1 output GPUs, so you can't do 120Hz PC gameplay on them unless you drop to a non-native 1440p resolution. In large 16:9 LCD monitors, unfortunately the one 43" 4K 120Hz gaming monitor that has come out so far, the XG438Q, is "ok" but definitely not up to the bar set by some other VA gaming monitors like the GK850G. It's better in a few facets, worse in others, and it's not FALD.


Considering the 38" ultrawides are about 14.8" high, they are at least somewhat taller than the typical ~13" tall slim ultrawides that are essentially a wider 27" 1440p. The 38" ultrawides are about 0.7" shorter physically than a 32" 16:9's 15.5" height. That makes them the only physical ultrawide size worth considering imo, but they're still a bit shorter than I'd prefer, and you'd be losing desktop/app and 2D real estate.
--------------------------------------------------------------------------

Comparing apples to apples at uw resolution:

34" diagonal 21:9 resolution (109.68 ppi).............................= 13.1" high
49" diagonal 32:9 resolution (81.49 ppi)...............................= 13.3" high
37.5" diagonal 16:10 ultrawide (110 ppi)...............................= 14.8" high
43" diagonal 16:9 at 16:10 uw (102.46ppi x 3' distance).......= 17.9" high
55" diagonal 16:9 at 16:10 uw (81 ppi x 4' distance).............= 22.8" high (a bit smaller perceptually at a comfortable 4' or more viewing distance)
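The heights above follow from basic Pythagoras on the diagonal and aspect ratio. A sketch (the function name is mine; results land close to, but not always exactly on, the figures quoted above, since those mix nominal marketing ratios with rounded specs):

```python
import math

# Panel height from diagonal and aspect ratio (basic Pythagoras).
# Pass the real pixel counts as the aspect, since marketing ratios
# like "21:9" are nominal (real panels are ~2.39:1 or 2.4:1).
def panel_height_in(diagonal_in, aspect_w, aspect_h):
    return diagonal_in * aspect_h / math.hypot(aspect_w, aspect_h)

print(round(panel_height_in(34, 3440, 1440), 1))    # 13.1 -- a 34" 3440x1440
print(round(panel_height_in(37.5, 3840, 1600), 1))  # 14.4 -- a 37.5" 3840x1600
```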
 
According to a Finnish retailer the release date for this monitor has moved to end of November, with 21.10. quoted as the date.
 
21.10 is either a joke or marketing for a 21:10 aspect ratio
 