LG UltraGear 27" OLED 240Hz 1440p 27GR95QE-B

According to RTINGS, they still couldn't rein in the color gamut as much as the sRGB mode does. I think the color space was still 112% of sRGB after calibration.

Really annoying. I don't want to drop $1K on a monitor only to have fewer calibration options than my $219 monitor from 2019.

Hmmm...I haven't seen RTINGS review this one yet - curious where you saw that?

I just did the LG Calibration Studio calibration targeting sRGB, and it seemed to have a dE of well under 2 for everything. Visually, there's not too much difference switching between sRGB and Calibration - the main difference is I could set target brightness and gamma.
 
https://www.rtings.com/monitor/reviews/lg/27gr95qe-b

Scroll down to Color Accuracy (Post-Calibration). This is the first time in memory that I've seen a monitor score worse post-calibration than pre-calibration. Are they doing something wrong? Also - the LG Calibration Studio calibration that you did - is it a true hardware calibration? Are the settings applied to the screen stored in the monitor's hardware, and thus hold for whatever device you connect?
 

Oops, thanks for the link - for whatever reason I couldn't find the review on the site (but I was browsing on my phone).

Yeah, those are odd results, but they're clearly calibrating with their own method, since I know the LG Calibration Studio only calibrates to the Calibration 1 and Calibration 2 presets. I think the Gamer modes default to a wider color space, so without knowing how they calibrated, it's hard to say.

LG Calibration Studio is streamlined hardware calibration - I used an X-Rite colorimeter (I think the i1Display Plus; it was borrowed). It's a very simple calibration, but it worked well. (My only other experience with calibration was calibrating my TV with CalMan for Sony, and that was a similar but more complicated process that took quite a bit longer, though it also calibrated more presets, etc.)

For LG's software, you basically hook up the colorimeter, pick the color space (sRGB and P3 are available...I think maybe another, but I don't recall...I went with sRGB for Calibration 1 and P3 for Calibration 2), pick the color temperature (6500K in my case), gamma (I went with 2.2), and brightness target (I went with 160 nits, which equated to 83 on Calibration 1). Then it runs through the color testing process and saves the calibration to the monitor. The whole thing takes under 15 minutes.

Yes, the calibration settings are saved to the presets in the monitor, so you could hook it up to any device and still have them. When you select a calibrated preset, it also reminds you of the calibrated brightness (in both nits and the recommended setting for that nit level), color temperature, and gamma (though not color space). You can also adjust brightness after calibration, but it'll still remind you of what you initially calibrated to. For example, I bumped it up one notch to 84 brightness since I wanted just over 160 nits and the results were just under, but it still shows calibrated to 163 nits at 83. I'm very happy with the calibration; I think it has to be a lot closer than whatever they got in the Gamer modes, but it is very close to the sRGB preset.

The only thing I wish it did have is the ability to turn on the less aggressive auto brightness limiting or enable DAS for the calibrated presets. As it stands, the Gamer modes seem to have less ABL than other modes and also enable DAS, and you can't do either in the calibrated modes. THAT SAID, in actual gaming I haven't noticed ABL to be an issue, and I don't really notice a performance hit or lag from not having DAS either, so real-world, these aren't so much problems as missing niceties.
 
Also, I've been playing a game without DLSS [on my video card] and thought the performance was good. Playing around, I enabled it the other day (I always thought it was unnecessary/could introduce artifacts if your system was beefy enough to run without it) and got a HUGE performance boost, to the point I can't even believe how responsive this monitor can actually get. (It was decent before, but this is a whole other level as far as how good/smooth it feels.) Pretty insane! I didn't use the frame counter, but I'm quite sure I was hitting 240 with DLSS on. It was like butter - definitely not going back to DLSS off.

OLED scales very linearly in motion blur: quadrupling the frame rate gives you 1/4th the display motion blur.

That's why DLSS makes a very good strobeless motion blur reduction technology for OLED -- future variants that use reprojection-based frame generation will have even less input lag and will make tomorrow's 1000fps 1000Hz displays practical. Increasing frame rates by 10x on an OLED reduces motion blur by 90% without requiring strobing.
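To put quick numbers on that scaling, here's a tiny illustrative Python sketch of idealized sample-and-hold persistence (zero-GtG math, framerate matching Hz):

```python
# Sample-and-hold persistence: each frame is displayed for ~1/framerate,
# and that hold time is what eye-tracked motion smears into blur (MPRT).
def persistence_ms(fps: float) -> float:
    return 1000.0 / fps

for fps in (60, 240, 1000):
    p = persistence_ms(fps)
    print(f"{fps:>5} fps -> {p:5.2f} ms persistence "
          f"({100 * (1 - p / persistence_ms(60)):.0f}% less blur than 60 fps)")

# 60 fps: 16.67 ms (0% less), 240 fps: 4.17 ms (75% less), 1000 fps: 1.00 ms (94% less)
```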

BFI will still be important for retro material and low framerate material, but the more games that can make BFI obsolete, the better. I hope game engine makers implement reprojection-based frame generation technologies.
 
BFI for modern titles too? If you want to turn on Ray Tracing...
 
Yep.

But raytraced titles are modern titles, and raytraced applications can access reprojection-based optimizations, where 100fps is converted to 1000fps in a lagless manner (assuming UE5 integration, for example).

By the end of the decade, I envision that one can simply use lagless reprojection techniques (where frame generation APIs have access to mouse/keyboard/6DOF motion input) to convert raytraced titles to 10x framerate.

Instead of today's very laggy 2:1, 3:1 and 4:1 frame generation, we'd have tomorrow's lagless 10:1 frame generation.

Let's conceptualize 100fps being reprojected to 1000fps: you have 100fps of original frames and 900fps of reprojected frames, but every single frame has input-read awareness.

Reprojection is not interpolation; reprojection can access input reads (mouse input and keyboard strafes) for the freshest low-lag 6DOF coordinates. That's why it was invented for virtual reality, but it should be brought to PC gaming.

Retroactive reprojection can even reduce frame rendering latency (10ms frametimes) to reprojection latency (1ms frametimes), if you're using a dual rendering pipeline (an original-frames pipeline and a reprojection pipeline).

Reprojection works bidirectionally along the time dimension, so the 10 input reads that occur after a frame is generated can be used to fast-forward (reproject) it into the future while waiting for the next fully rendered original frame to finish. Initially, the laglessness would be client-side (e.g. mouselook, strafes, turns, pans, etc.), at least until per-object reprojection becomes available (e.g. movement of enemies). Technically, it should be feasible to reproject to keep gametime:photontime continually relative.
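As a toy sketch of that scheduling idea (hypothetical function names, not any real engine API; the render/warp bodies are just stand-in stubs):

```python
import time

RENDER_HZ, DISPLAY_HZ = 100, 1000   # original frames vs reprojected output frames

def read_input():
    # Hypothetical stand-in: fresh mouse/keyboard/6DOF pose, sampled per OUTPUT frame.
    return time.perf_counter()

def render_frame(pose):
    # Stand-in for an expensive fully rendered frame (~10 ms frametime at 100 fps).
    return {"image": "rendered scene", "pose": pose}

def reproject(frame, new_pose):
    # Stand-in for a cheap warp of the latest full frame to the newest pose (<1 ms).
    return {"image": frame["image"], "pose": new_pose}

def present(frame):
    pass  # hand the frame to the display, once per 1/DISPLAY_HZ refresh

frame, next_render = None, 0.0
for _ in range(DISPLAY_HZ):          # one second's worth of output frames
    pose = read_input()              # every output frame gets input-read awareness
    now = time.perf_counter()
    if frame is None or now >= next_render:
        frame = render_frame(pose)   # 100 fps of originally rendered frames...
        next_render = now + 1.0 / RENDER_HZ
    present(reproject(frame, pose))  # ...fanned out to 1000 fps of warped frames
    # (waiting out the remainder of each 1/1000 sec refresh omitted for brevity)
```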
Input-read-aware frame generation technology can reduce latency; Linus Tech Tips confirmed it:

[embedded video]
This video is kind of an early-bird sneak preview of the technology making the 4K 1000fps UE5 raytraced future possible on approximately RTX-4090-class performance (or only barely more than that).

That demo is even downloadable!

The detail level of the underlying graphics doesn't need to be low -- reprojection has absurdly low GPU overhead. I reprojected 30fps to 360fps on a mere Razer laptop RTX 2080; a whopping 12:1 reprojection ratio. But I also discovered that going 100fps -> 360fps, reprojection artifacts drop dramatically. If you're starting from a low triple-digit framerate (e.g. 100fps) and going to much higher frame rates, the artifacts are even less than DLSS 3 motion interpolation. So, experiment with reprojecting 100fps to 360fps on your 360Hz monitor, and see if you can easily tell it apart from native 360fps 360Hz -- it's very, very hard to do so.
 
BFI would be cool if they made it work with VRR - which shouldn't be that hard to do in theory.
However, with issues like fluctuating brightness even without BFI, we might not see BFI and VRR at the same time for some time to come...

The re-projection idea is cool, but it's much easier to make a shader-less demo with a few primitives than a modern game with it. That said, this tech already exists in the VR world; I saw it years ago on the Oculus Rift, along with frame-rate interpolation. It made games running at 45 fps feel roughly like 90 fps. Even input latency was not an issue. What was an issue, however, were the artifacts: sometimes subtle, and in places somewhat jarring. Still, I could see this tech adopted for games on normal displays. Especially with BFI, it could make a real difference.
 
While I am doing a fair number of strobe/BFI contracts behind the scenes, I am pressing hard for brute framerate-based motion blur reduction to gain much more traction by the end of the decade. Feeding off a small percentage of my contract work, I'm actually spending a significant amount of time/funds towards the future 1000fps 1000Hz ecosystem, since I've developed a new Blur Busters Master Plan all around it. So I plan to release some white papers/articles sometime this year about it. Reprojection is actually easier to deploy than one thinks -- the biggest problem is integrating input reads into the reprojection API, so it would ideally be done as a sort of UE plugin.

What I found was:
- ASW 2.0 had far fewer artifacts than ASW 1.0 because it used the depth buffer, so the jarring artifacts mostly disappeared; and
- Sample-and-hold reprojection has no double-image artifacts; it's simply extra motion blur on lower-framerate (unreprojected) objects.

I ran tests with the downloadable demo (which is more akin to ASW 1.0 quality than ASW 2.0 quality). Using 100fps as the base frame rate, which is above the flicker fusion threshold, also eliminates a lot of reprojection artifacts. To understand the stutter-to-blur continuum (sample-and-hold stutter and blur are the same thing), see www.testufo.com/eyetracking#speed=-1 and look at the 2nd UFO for at least 20 seconds. Low frame rates vibrate like a slow-vibrating music string, and high frame rates blur like a fast-vibrating music string.

Once reprojection starts from a base frame rate above the flicker fusion threshold (e.g. 100fps), the vibrating stutter becomes blur instead, and the sample-and-hold effect ensures there's no double-image effect. Now apply the ASW 2.0 algorithm instead of ASW 1.0, and the artifacts are fewer than DLSS 3.0's.

Some white papers will come out later in 2023 to raise awareness among researchers, developers, GPU vendors, etc. OLED has visible mainstream benefits even in non-game use (240Hz and 1000Hz is not just for esports anymore) if it can be done cheaply, with near-zero-GtG displays. People won't pay for it if it costs unobtainium, but reprojection is a cheap upgrade to GPUs. It will help sell more GPUs in the GPU glut, and also expand the market for high-Hz displays outside esports.

120Hz vs 240Hz is more visible to the mainstream on OLED than LCD, and the mainstreaming of 120Hz phones, tablets and consoles is raising awareness of the benefits of high refresh rates. One of the white papers I am aiming to write is about the geometric upgrade requirement (60 -> 144 -> 360 -> 1000) combined with the simultaneous need to keep GtG as close to zero as possible. Non-flicker-based motion blur reduction is a hell of a lot more ergonomic; it's achieved via brute framerate-based motion blur reduction. Reprojection ratios of 10:1 reduce display motion blur by 90% without strobing, making 1ms MPRT possible without strobing -- and I've seen it with my own eyes. It's the future, but I need to play the Blur Busters advocacy role to educate the industry (slowly) over the years -- like I did with LightBoost in 2012 to, um, spark the explosion of official strobe backlights.

The weak links of GtG-bottlenecked, jitter-bottlenecked refresh rate incrementalism prevent Hz-vs-Hz visibility: 240Hz-vs-360Hz, only a 1.5x difference on paper, is effectively throttled to only about 1.1x because of various other factors (slow GtG, jitter). Average users don't care about that. Even high-frequency jitter (70 jitters/sec at 360Hz) vibrates so fast that it's just extra blur, and that's on top of the GtG blur mess on top of the pure, nice linear-blur MPRT.

Now, in a blind test, more average users (>90%) can tell 240Hz-vs-1000Hz apart on an experimental 0ms-GtG display (a monochrome DLP projector) in a random test (e.g. www.testufo.com/map becoming 4x clearer) much better than they can tell apart 144Hz-vs-240Hz LCD. You do have to zero out GtG and go ultra-dramatically up the curve, but just because many can't tell 240Hz-vs-360Hz or 144Hz-vs-240Hz apart doesn't mean 240Hz-vs-1000Hz isn't a bigger difference to the mainstream! (Blind test instructions were posted in the Area 51 section of the Blur Busters Discussion Forum.)

On a zero-GtG display (where all motion blur is MPRT-only), for 240fps 240Hz versus 1000fps 1000Hz, the blur difference is akin to a 1/240sec photograph versus a 1/1000sec photograph (4x motion clarity difference):

[Image: display persistence blur equivalence to camera shutter]
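Putting illustrative numbers on the shutter analogy (assuming a 960 pixels/sec pan, a common TestUFO speed, and idealized zero-GtG math):

```python
# On a zero-GtG sample-and-hold display, eye-tracked motion smears each frame
# across (speed * persistence) pixels, exactly like camera shutter blur.
speed_px_per_sec = 960                 # illustrative panning speed (TestUFO-style)
for hz in (240, 1000):
    blur_px = speed_px_per_sec / hz    # persistence = 1/Hz when framerate == Hz
    print(f"{hz} Hz -> {blur_px:.1f} px of motion blur")
# 240 Hz -> 4.0 px, 1000 Hz -> ~1.0 px: the ~4x motion clarity difference
```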


As an indie who plays the refresh rate advocacy role, I try to find ways to convince companies -- they move if they can see a market in pushing frame rates AND refresh rates up geometrically (and catching an even bigger % of the niche). The high-Hz market is slowly becoming larger and larger.

I'm really focused on firing all the cylinders of the weak-links motor. 240Hz OLED just solved the GtG bottleneck, reprojection will solve the GPU bottleneck, and 1000Hz OLED is being targeted before the end of the decade.

But yes, BFI is going to be perpetually important for games that can't use reprojection, and for legacy material (I love saying "60 years of legacy 60fps 60Hz material"), so I'm pushing REALLY hard with multiple parties to add BFI to OLED. It's a challenge, since some don't think BFI is needed.

I'm doing both routes.
 
[attached photo]

Diablo 4 looks amazing on this, but I'm not really sure about the darker areas in the game. I'm running 50 percent brightness on Gamer 1, 60 contrast, with gamma slightly lowered in the NV control panel. But guess what - there is an in-game gamma slider.
 
Please tell the manufacturers that the customers want it. Lots of us enthusiasts would love being able to have our cake and eat it too with OLED and BFI. I would jump on it in an instant. No cap. I'm a CRT tube head that's been waiting for this moment.
 
Loving this monitor more and more.

Some tips for this monitor: turn off energy savings, lower the blue color to 44 in the monitor settings, and if you're on Nvidia, go to "Adjust desktop color settings" in the control panel and under "Color accuracy" check the box to override to reference mode.

FFXIV in SDR:

[FFXIV screenshot]
 
Loving this monitor more and more.
That's the content that OLED really shines on. I've noticed even 10,000-LED-count FALD LCDs still struggle on this specific type of material. FALD is superior for larger amounts of bright pixels than that, but if you're a lover of horror/space/dungeon/etc...

Please tell the manufacturers that the customers want it. Lots of us enthusiasts would love being able to have our cake and eat it too with OLED and BFI. I would jump on it in an instant. No cap. I’m a crt tube head that’s been waiting for this moment.
Still doing my best. You've seen what I've done to successfully bring 60Hz single-strobe to a few models.

That being said, 240Hz OLED BFI (when it finally arrives) will initially be limited (by backplane limitations) to no less than 1/240sec of motion blur (~4.2ms). So you can have 60fps material with up to 75% of the motion blur removed, like www.testufo.com/blackframes#count=4&bonusufo=1 -- this TestUFO animation works fantastically on a 240Hz OLED, producing adjustable persistence for 60fps (4.2ms, 8.3ms, and 12.5ms persistence) via 1:3, 2:2 and 3:1 black frame insertion options. This would not be too terribly difficult for a 240Hz OLED nameplate to add to their firmware. I do hope that I will finally be able to convince 1 or 2 of them to add this by the end of the year, but no guarantees. To big vendors, it is kind of a niche feature, but it is popular among the Blur Busters audience. Let's see what happens this year.
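To make those figures concrete, here's the simple persistence arithmetic behind the BFI ratios (pure math, no monitor-specific assumptions):

```python
# 60 fps on a 240 Hz OLED: each 60 Hz frame spans four 240 Hz refresh cycles.
# Showing the frame for N cycles and black for (4 - N) gives persistence N/240 s.
REFRESH_HZ, CONTENT_FPS = 240, 60
cycles = REFRESH_HZ // CONTENT_FPS          # 4 refresh cycles per content frame
for visible in (1, 2, 3, 4):
    persistence_ms = 1000 * visible / REFRESH_HZ
    print(f"{visible}:{cycles - visible} BFI -> {persistence_ms:.1f} ms persistence")
# 1:3 -> 4.2 ms, 2:2 -> 8.3 ms, 3:1 -> 12.5 ms, 4:0 -> 16.7 ms (no BFI)
```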

In the best-case scenario, 48Hz-120Hz would gain optional adjustable-persistence BFI. For simplicity of expectations, the first BFI would be for 60Hz and 120Hz only.
 
For simplicity of expectations, the first BFI would be for 60Hz and 120Hz only.

Noooooooo I already have a 120Hz BFI OLED (LG CX) and was really hoping for that next leap to 240Hz BFI OLED. I'm sure the motion clarity will be a sight to behold! Hopefully my CX will last until that day arrives.
 
Based on what I now know about large-size OLED panel backplane behavior, I suspect we're hurtling faster towards 1000Hz capability (~2027-2030ish) than towards subrefresh rolling-scan BFI. By the time 1000Hz OLEDs arrive, subrefresh BFI will no longer be critically important for large direct-view OLEDs.

Even with the refresh-cycle-granularity restriction on BFI, by 1000Hz days you will still be able to do 1ms MPRT on 60Hz material, by the sheer 1/1000sec minimum of max-Hz refresh cycles.

If you want 1ms MPRT on OLED today, you will need to use the small OLED and micro-OLED displays typically used in VR headsets such as the PSVR2 and other units. They are more likely to be PWM-driven, and customized PWM (framerate=Hz) is used as the motion blur reduction method instead of discrete full-refresh-cycle black frame insertion. PWM (good and evil) requires few transistors per pixel, allows dimmer brightnesses and colors, and allows higher pixel densities in tiny OLEDs (near 1000 pixels per inch), which just happen to be convenient for VR-based motion blur reduction.
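To illustrate the PWM persistence arithmetic (the refresh rates and duty cycles below are illustrative assumptions, not measured headset specs):

```python
# PWM-driven persistence: the panel flashes once per frame (framerate == Hz),
# and MPRT is simply the ON fraction (duty cycle) of each refresh period.
def pwm_persistence_ms(refresh_hz: float, duty: float) -> float:
    return 1000.0 * duty / refresh_hz

print(pwm_persistence_ms(90, 0.2))   # 90 Hz at 20% duty -> ~2.2 ms MPRT
print(pwm_persistence_ms(120, 0.1))  # 120 Hz at 10% duty -> ~0.8 ms MPRT
```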
 
Monitor life hack: it should be possible to change the white temperature in other modes like FPS, RTS, Vivid, etc. by tweaking RGB values in the service menu.
I did this for the LG 48GQ900, and from what I saw on the internet, they use a very similar service menu.
To enter it, one can use an LG service remote or any programmable IR transmitter. I used an LG V20 smartphone with an app that let me send 'pronto' codes, using codes for the LG service remote found on the internet.

Unfortunately there is no way to adjust anything for HDR.
BTW, if anyone goes into the service menu, please avoid changing the panel type. Apparently it can break some monitors.
 
I just downloaded COD MW2 for the first time, and for some reason, while playing the campaign, I was blinking like crazy. I tried a few settings and nothing worked, so I turned off Adaptive Sync within the monitor - that was the problem. Not sure if that was the issue I had with other games, but for some reason COD was bad with it on. When you have it off, it disappears from the NV control panel as well, but still uses 240Hz as default.
 
I received my monitor yesterday - does anyone want to share their current settings? I have tried the TFT Central video's settings, and the monitor still feels dull and the colors feel off in SDR on the Gamer 1 or Gamer 2 modes. I switched on HDR and Auto HDR within Win 11 and it does feel brighter, but the darks feel a little too dark for FPS shooters. I tried messing with the HDR/SDR slider Win 11 offers, but it made everything washed out. I did try the Vivid mode other videos mention, and in SDR it seems to be brighter than the rest of the preset modes, but again the colors seemed off. At this point I don't know if it's something I just need to spend more time with to get used to the wider color gamut, or if OLED might just not be for me. FYI, I had a 32" 1440p 165Hz LG VA panel that's almost six years old, and my 2nd monitor is an LG 32" 1440p IPS 60Hz panel used for browsing/content. The new LG OLED is used for gaming only.
 
If it seems too dark, check your Nvidia control panel Desktop sliders and the Windows 11 desktop gamma - I had both turned on.

This is my notepad for settings. Keep in mind I got a different stand for the monitor (VESA mount) so it can go all the way down.


LG MONITOR SETTINGS

Brightness 50
Contrast 60
Sharpness 50
Black Equalizer 50
Gamma Mode 3
Color Temp: Cool

Gamer 1 mode for better black levels (not grey)

Don't enable desktop gamma in Windows 11 with the built-in slider, otherwise the screen will be too dark for games.

NVIDIA CONTROL PANEL SETTINGS

Brightness 50
Contrast 60
Gamma 0.81 (important to lower for better contrast)
 
If you experience any eyestrain, you might have to turn off Adaptive Sync with the controller - I just did that the other day and it helps a ton. I know it's an add-on feature, but I don't think they fully tested it. My eyes feel less strained without it. I also turned off G-Sync in the Nvidia settings, but I think disabling Adaptive Sync makes the G-Sync option vanish from Nvidia's Control Panel anyway - I just disabled it all. Don't be afraid to adjust settings in the game with the controller, or the in-game gamma; I do it all the time because no two games are the same.
 
[screenshot: Adaptive Sync turned off]


I have it turned off due to the eyestrain I got from COD MW2 using it - I had it on for 3 weeks before I turned it off.
It's not really needed if you have a good PC to drive the monitor. It still runs at 240Hz, just without frame syncing.
 
That is true. I just use my LG OLED for gaming, not really for desktop use, so it still works out. I lower the settings for tracking stuff on the screen instead of taking in lots of light.
It might take two weeks for you to get used to it - it took me a good two weeks, plus the VESA monitor stand I had to buy off Amazon to lower the screen 3 inches.
 
Gamma Mode 3

(...)

Gamma 0.81 (important to lower for better contrast)
This would give you a display gamma of around 3.0.
According to the TFT Central review, brightness 50 is more like 30% of maximum brightness and results in 60 nits.

You must have pretty sensitive eyes if you consider this usable 🤯

-----
I just noticed that on my LG 48GQ900 I have DAS enabled in the Vivid, sRGB and calibration modes, whereas when I checked before, DAS was disabled in these modes, and switching between a mode with DAS and one without it produced a noticeable latency difference (read: one frame of input lag with DAS off).

It might be something to do with the settings used. I also have DAS ON for all modes on the LG 27GP950, and was surprised and disappointed it was not enabled everywhere on the 48GQ900. Now it seems I can play with proper colors without needing to calibrate a Gamer mode using the six-axis color controls, like I planned to do for games where the wide gamut wouldn't look right.

Can anyone with the 27GR95QE confirm these monitors can also do DAS ON in sRGB/calibration modes?
 
I received my monitor yesterday - does anyone want to share their current settings? I have tried the TFT Central video's settings, and the monitor still feels dull and the colors feel off in SDR on the Gamer 1 or Gamer 2 modes. I switched on HDR and Auto HDR within Win 11 and it does feel brighter, but the darks feel a little too dark for FPS shooters. I tried messing with the HDR/SDR slider Win 11 offers, but it made everything washed out. I did try the Vivid mode other videos mention, and in SDR it seems to be brighter than the rest of the preset modes, but again the colors seemed off. At this point I don't know if it's something I just need to spend more time with to get used to the wider color gamut, or if OLED might just not be for me. FYI, I had a 32" 1440p 165Hz LG VA panel that's almost six years old, and my 2nd monitor is an LG 32" 1440p IPS 60Hz panel used for browsing/content. The new LG OLED is used for gaming only.

This is going to vary from user to user, as Comixbooks is running quite a bit differently than me, so your mileage may vary and you may need to adjust according to your preferences, but here's my take:

For SDR games (and desktop use if you do any of that at all), I'd very much recommend the sRGB mode with HDR off in Windows. Its calibration was pretty good out of the box for me (I did end up calibrating Calibration 1 with a colorimeter, but it's honestly quite close to that sRGB preset). Brightness can be set to your preference. On sRGB I would have probably stuck to 100% brightness. When I calibrated to 160 nits (my personal preference) on Calibration 1, Brightness ended up at 84.

For HDR (toggleable when you want to play an HDR game with Windows key + Alt + B, if that key combo is enabled in the Xbox Game Bar app; then just set up HDR in the game's menus), I'd recommend sticking to Gamer 1. Personally, I don't use Auto HDR and just play in HDR if it's a native HDR game (Dead Space, Forza Horizon 5, and Cyberpunk 2077 are examples), though some people like Auto HDR for non-native games. I haven't personally adjusted any of the colors, etc. in HDR. That said, on Windows 11 you definitely want to run through the "Windows HDR Calibration" app - it will make HDR look a whole lot better IMO. It's not installed by default - you'll need to grab it from the app store (but it's free, easy to run through without any equipment, and just helps Windows know the lower and upper limits of your monitor).
 

DAS does not appear to be available in any modes except the Gamer modes on this monitor, and there doesn't seem to be a setting to enable it. It's perplexing why so much is locked out in the non-Gamer modes.

That said, I haven't personally noticed any appreciable lag/difference. I'm not a competitive player, but it seems just as responsive to me as long as it's set to 240Hz and VRR is enabled (G-Sync is not).
 
I noticed that whether DAS is ON or OFF in sRGB or calibration mode depends on the video mode on the 48GQ900:
at 60, 96 and 120Hz it's OFF, and at the overclocked 138Hz it's ON.

What you can try is creating a custom video mode with higher bandwidth, because on my monitor it seems bandwidth is what enables DAS in sRGB. When there are enough blanking lines to increase the bandwidth to around or past what the monitor uses in its factory overclock mode, DAS is enabled.
[screenshot: custom video mode timings]


So it might be worth checking on the 27GR95QE if e.g. creating a video mode with a smaller refresh rate but higher overall bandwidth helps. It might also be possible to increase the refresh rate, but the monitor will probably throw an out-of-range message. It should not throw such messages when the refresh rate is smaller than 240Hz. Just beware that the image might not appear if the bandwidth is too high, so it's best to always set the video mode back to default before restarting the GPU, otherwise you might have issues - especially if that is the only monitor connected to your PC. Otherwise it should be pretty safe to do.
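To put rough numbers on that bandwidth trick, a small Python sketch (the horizontal/vertical totals below are hypothetical placeholders, not the 48GQ900's real timings; read your monitor's actual totals out of CRU):

```python
# Pixel clock = horizontal total x vertical total x refresh rate.
# The idea: pad the vertical blanking of a custom mode until its pixel clock
# reaches (or passes) the factory overclock mode's, then check if DAS kicks in.
def pixel_clock_mhz(h_total: int, v_total: int, hz: float) -> float:
    return h_total * v_total * hz / 1e6

factory = pixel_clock_mhz(4400, 2250, 138)   # hypothetical factory OC totals
custom  = pixel_clock_mhz(4400, 2588, 120)   # 120 Hz with extra blanking lines
print(factory, custom)  # keep padding v_total until custom >= factory
```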
 

I've used CRU to remove the 4K TV presets that can sometimes cause an error message (for some reason, some fullscreen games make the display think it's in 4K on some LG monitors, including this one, until you remove some TV-related 4K upscaling entries), but this is a little beyond what I'm comfy messing with too, since it is just the one monitor and I'm not 100% sure what to try as far as settings to see if DAS triggers. I did try 90Hz, which it was able to do fine, but DAS did not turn on.
 
How does one use CRU? I'm not seeing all the resolutions that are available to me in the NV control panel:

[screenshot: NV control panel resolution list]
 
Figured out my problem. Have to go into "detailed resolutions" then "TV resolutions". Can delete from there. That also deleted the 3840x2160 PC resolution too.

Edit: This fixed my DLDSR resolutions being messed up too, nice!

[screenshot: CRU detailed resolutions]
 
I just switched back to the stock LG stand - the one I bought off Amazon was OK, but the stock one is better =) I think the monitor warped the back a bit, because there is a little play on the back with the stock stand. I just needed a 3rd-party stand as training wheels to get used to the monitor, I guess. What I really could use is an office chair that goes a bit higher, or a desk that goes a tad lower, but I already took a hacksaw to my desk a few years back.
 
So it might be worth checking on the 27GR95QE if e.g. creating a video mode with a smaller refresh rate but higher overall bandwidth helps. It might also be possible to increase the refresh rate, but the monitor will probably throw an out-of-range message. It should not throw such messages when the refresh rate is smaller than 240Hz. Just beware that the image might not appear if the bandwidth is too high, so it's best to always set the video mode back to default before restarting the GPU, otherwise you might have issues - especially if that is the only monitor connected to your PC. Otherwise it should be pretty safe to do.
That's an interesting approach -- using QFT to enable DAS.

The bigger the VBI, the lower the lag these OLEDs have, because the refresh cycle is transmitted faster (higher horizontal scan rate = more scanlines per second transmitted over the video cable). Since many OLEDs fully buffer the refresh cycle (for ABL processing and similar) instead of doing rolling-window scanout sync between cable and panel, large VBIs will generally reduce input lag on OLEDs. A large VBI maximizes the Quick Frame Transport latency-reducing effect. The OLED processing loop usually executes immediately upon receiving the last scanline off the video cable.

By making the blanking intervals bigger, the visible refresh cycle can be transmitted faster over the video cable (more time is spent in the vertical blanking interval (sync or porch), less time transmitting the visible refresh cycle). A vertical total of twice the visible resolution would halve the monitor-side framebuffering latency. This is useful if you're doing low-Hz modes on a 240Hz OLED (e.g. a 60Hz fixed-Hz mode transmitting each refresh over the video cable in 1/240sec).

For other readers unaware of how those "Custom Resolution Utility" numbers map out, it's visualized as a total signal resolution bigger than the visible resolution:

[Image: video signal structure]
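Rough numbers for the QFT effect described above (assuming 1440p-class timings; actual vertical totals vary per monitor):

```python
# With a huge vertical blanking interval, a 60 Hz signal's visible scanlines are
# delivered in a fraction of the 16.7 ms frame period, so a monitor that buffers
# the whole refresh cycle finishes receiving it (and can process it) sooner.
V_ACTIVE = 1440

def scanout_ms(v_total: int, hz: float) -> float:
    frame_ms = 1000.0 / hz
    return frame_ms * V_ACTIVE / v_total   # time spent on the visible scanlines

print(scanout_ms(1500, 60))    # normal-ish VBI: ~16.0 ms to transmit the picture
print(scanout_ms(6000, 60))    # QFT-style large VBI: ~4.0 ms, a 1/240sec scanout
```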
 
I just switched back to the stock LG stand - the one I bought off Amazon was OK, but the stock one is better =) I think the monitor warped the back a bit, because there is a little play on the back with the stock stand. I just needed a 3rd-party stand as training wheels to get used to the monitor, I guess. What I really could use is an office chair that goes a bit higher, or a desk that goes a tad lower, but I already took a hacksaw to my desk a few years back.

The trend of stands with these long pointed legs that take up 9000% more desk space than necessary is something I literally cannot wrap my brain around. It looks dumb, it feels dumb, it's not very stable, and it takes up too much room. Why the f do they do this? What was wrong with a solid rectangle that DOESN'T go where you want to put your mouse and keyboard? The average desk is 24" deep, and you need 19 inches of that for these dumbass stands the manufacturers seem to think look very cool.
 
I switched back to the Amazon stand; I can't stand the stock stand. I like it, but it doesn't go down low enough. There is a Corsair version of this monitor coming out; not sure if that stand is any better, or if the Asus stand is.
 
I ended up returning mine. I could not find the right settings to suit me, and it was driving me crazy. IMO, if I spend $1K on a monitor, it needs to be near perfect out of the box with minimal fiddling on my end. I did nothing but change settings on this thing nonstop the entire time I had it - very frustrating. I will probably go the mini-LED route and look back into OLED a couple more years down the road. I appreciate you guys sending me some settings to test!
 
Sorry to hear it didn't work out. I had gone through two monitors I wasn't happy with before finding this one, so I know how that feels, and it is frustrating! It's a very personal choice with a lot of factors to consider! At any rate, good luck with your search and I hope you find something you're happy with!
 
The cable for the firmware update is inside the LG accessories box; mine was up to date. But I'll try the April firmware to see if it improves this beta experiment.
 
Interesting! I see there's a new version of the OnScreen Control app, but it showed no firmware update quite yet. I just hope it doesn't mess up my calibration, as I no longer have a colorimeter here to redo it =oP (Though I'll probably need to invest in one eventually anyway.)
 