Hardware Gsync vs. Gsync Compatible -- notably different feel? (IPS vs. OLED too)

Archaea

I've used hardware G-Sync since around late 2017 with an Alienware AW3418DW ultrawide.

It has been phenomenal. I'm someone who can always tell whether G-Sync or FreeSync is on or off - it benefits my gaming experience so much that playing games without it feels terrible, to the point I don't want to play the game. Occasionally, if G-Sync got turned off, I could tell immediately, and when my friend blind-tested me with G-Sync on and off I was 100% accurate - even at high frame rates.

With hardware G-Sync on the Alienware monitor I don't feel any issues in gaming down to the low 40FPS range - it feels buttery smooth.

I recently swapped to the LG C2 OLED with G-Sync compatibility. It feels WAY less smooth in the 40-50 FPS range than the Alienware does with Cyberpunk 2077.

Why? Is this a known issue? Is this the difference between traditional IPS LCD motion and OLED motion?
I'm confident this isn't in my head. I played a lot of Cyberpunk at 40-50 FPS on the Alienware monitor and found it perfectly acceptable - over 100 hours back at game launch. With DLSS the game was perfectly playable on my 1080 Ti (with settings reduced) and on my 3080 with settings maxed.

I recently swapped to LG C2 42" OLED, have new CPU hardware (PC upgraded from I7-6950X to I7-12700K) - and the same 3080 video card, but the increased resolution makes my gaming experience still run about the same 40-50FPS at max settings. It doesn't feel smooth at all at these frame rates. It feels like G-Sync is NOT on. (even though it's set to on in the Windows 11 settings and appears to be on with the OLED Game Optimizer config showing 119FPS max referesh rate and VRR).

Granted, there are nearly two years of driver updates, Cyberpunk 2077 software updates, the Windows 10 vs. Windows 11 difference, HDR turned on/off, etc. between my two experiences to complicate a direct A/B comparison from my previous playthrough to this current playtime. But it just feels very different now than it did before, and not in a good way. Visually the game looks SO much better on OLED with perfect blacks and HDR. But for feel, the Alienware with hardware G-Sync crushes the LG OLED's smoothness. I guess I'll have to hook the Alienware back up, try the game again with settings that land at that same 40-50 FPS, and see if my memory serves me correctly. But I am very particular about this, it's easily observed subjectively, I do game a lot, and this OLED just feels quite different at 40-50 FPS.

Is a difference between hardware G-Sync and G-Sync Compatible expected in the 40-50 FPS range?

https://www.nvidia.com/en-us/geforce/products/g-sync-monitors/specs/
[screenshot of NVIDIA's G-SYNC Compatible display specs page]

I notice that Nvidia's site says "upcoming driver." Is it possible that G-Sync isn't actually working correctly on the LG C2 yet?
I do have the most recent NVIDIA driver as of this week.
Are there any settings I might be missing that I need to enable with the LG OLED?


It says it’s working:
[screenshot showing G-SYNC reported as active]
 
It's a limitation of the design. That Alienware monitor has a GSYNC hardware module in it, while the LG C2 is software only.

Also of note, IPS has inherent pixel blur in motion, while the LG OLED tech doesn't. It's possible that what you're seeing has always been there and the IPS monitor just masked it well. LG's OLED tech exposes every stutter due to the fast response of the pixels. You also changed a lot of hardware, so it's possible that something other than the C2 is causing the issue.
 
Download Nvidia Gsync demo
https://www.nvidia.com.tr/coolstuff/demos#!/g-sync
On my Sony X90J 50" TV, which supports VRR (for consoles), G-Sync does not work. Neither did FreeSync. Yes, it's very noticeable, but I thought the C2 supported both.

As a note, one can use this demo to see if freesync works by turning on and off freesync in the drivers.

Now, while the C2 may not have a dedicated Nvidia G-Sync module, it does have hardware for syncing its refresh rate with the device's output. So it is not a purely software solution per se.
 
G-SYNC Compatible doesn't support LFC like FreeSync does, as far as I know. The VRR range on the C2 should be 20-120 Hz. It will obviously feel worse the lower in the range you are absent LFC. I'm sure it will improve once the driver officially supports the C2 line. It might be falling back to the VRR range on the older models absent official support (40-120 Hz).
 
For what it's worth, I notice a marked decrease in smoothness when I'm close to 40 FPS with my 3090 Ti and LG CX. At 45 FPS and higher it feels much smoother, probably because it isn't briefly dipping below 40 FPS (where G-Sync turns off) too quickly for my FPS counter to show.
 
G-SYNC Compatible doesn't support LFC like FreeSync does, as far as I know. The VRR range on the C2 should be 20-120 Hz. It will obviously feel worse the lower in the range you are absent LFC. I'm sure it will improve once the driver officially supports the C2 line. It might be falling back to the VRR range on the older models absent official support (40-120 Hz).
Did not know this. My 144 Hz monitor did support LFC with G-Sync even though the drivers said it was not G-Sync Compatible - when forced on, it worked. So LFC may or may not be supported, depending on the device.
 
Did not know this. My 144 Hz monitor did support LFC with G-Sync even though the drivers said it was not G-Sync Compatible - when forced on, it worked. So LFC may or may not be supported, depending on the device.
Apparently LFC is required to be certified G-SYNC Compatible. It may just be a driver issue for this particular display at the moment.
 
My Samsung C32HG70 (32" 1440p 144Hz) supports Freesync 2, but the monitor has two "modes". These modes are "Standard" and "Ultimate". "Standard" supports a more limited FPS range (72-144Hz I believe, with no LFC), which makes Freesync useless unless you are already getting a pretty decent FPS. "Ultimate" supports 48-144Hz and also has LFC for when FPS dips below 48. "Ultimate" works as intended, including LFC, even when using my RTX 2080, and despite the fact that my monitor is not "Validated" as being "GSync Compatible".

Is it possible that your monitor also has multiple different Freesync "modes"? Mine came set with "Standard" as default. I had to manually change it to "Ultimate". This would explain why you are seeing poor results especially at lower FPS.

Also, I don't believe that "GSync Compatible" comes with any inherent limitations compared to Freesync. It's simply a result of Nvidia relenting and being forced to acknowledge that the market prefers monitors that aren't more expensive and brand exclusive due to having a hardware GSync module. "GSync Compatible" is merely Freesync support without a willingness to acknowledge the "Freesync" AMD brand name.

One last thing - I noticed that you have it set to enable GSync / GSync Compatible only in Fullscreen mode, not Windowed. Many games operate in Fullscreen as basically a maximized borderless Window, which for the purposes of GSync still counts as Windowed. You might try enabling support for Windowed mode. There is zero downside.
 
Also, I don't believe that "GSync Compatible" comes with any inherent limitations compared to Freesync. It's simply a result of Nvidia relenting and being forced to acknowledge that the market prefers monitors that aren't more expensive and brand exclusive due to having a hardware GSync module. "GSync Compatible" is merely Freesync support without a willingness to acknowledge the "Freesync" AMD brand name.
G-SYNC Compatible is NVIDIA's driver layer to work with Adaptive Sync. It has nothing to do with FreeSync.
 
It's probably a combination of virtually no sample-and-hold blur on the OLED and hardware G-Sync vs. software that makes it not look as good, Archaea. This sounds like a question for Chief Blur Buster to me... Try either going to the Blur Busters forum or see if he shows up here since I tagged him :).
 
It's probably a combination of virtually no sample-and-hold blur on the OLED and hardware G-Sync vs. software that makes it not look as good, Archaea. This sounds like a question for Chief Blur Buster to me... Try either going to the Blur Busters forum or see if he shows up here since I tagged him :).
I need to unpack my reply into two technical sections to clear up confusion.

The topic of this thread actually distracts from an additional cause: fast pixel response makes stutters more visible too.

Firstly, gametime:photontime sync in G-SYNC Compatible and G-SYNC Native can sometimes be different (e.g. Compatible usually stutters slightly more than Native). That can be a factor too.

However, since an OLED is involved, I would like to tell everyone that we have found an additional factor in why low frame rates on OLED seem to stutter more, framerate-for-framerate (e.g. a perfect 60fps cap, non-VRR).

TL;DR version
1. Don't confuse GtG and MPRT. They are different pixel response measurements. Both can add blur that hides stutter.
2. OLED is sample and hold (MPRT is not zeroed out)
3. OLED simply keeps visibly stuttering up to a higher frame rate because of its faster pixel transitions (GtG is near zero). Slow GtG helps mask stutters.

That's why you need higher frame rates to compensate for the increased stutter visibility created by pixel response being so fast. That's why OLED stutters more than LCD at the same frame rate, especially in 40fps-70fps territory. Above 100fps the effect is not an issue, but it is a consideration for people who hate stutter and have to play at low frame rates. One method is to upgrade your GPU or lower game settings. Another method is to add GPU motion blur to compensate, if you get headaches from stutter in low-framerate games (e.g. Cyberpunk 2077).

(Note: For more reading, see the "Research" button on the Blur Busters website, which has my favourite well-vetted explainers. Also, I'm cited in over 25 research papers now.)

1. OLED has no GtG blur, but it has MPRT blur (MPRT = sample and hold)

OLED always has sample-and-hold blur.
You're talking about lack of GtG blur, not MPRT blur.

There are two different pixel responses, GtG and MPRT.
  • GtG is linked to how fast a pixel changes color (transitions).
  • MPRT is linked to how long a pixel stays visible for (static).
  • MPRT = sample and hold = persistence
  • Both GtG and MPRT add display motion blur
On your LG OLED, view www.testufo.com/eyetracking -- the top and bottom UFOs still look different.

That's because of sample and hold. Your eyes are moving (analog) to track moving objects. As you track the moving UFO, your eyes are in different positions at the beginning of a refresh cycle and at the end of a refresh cycle. If a refresh cycle is on the screen continuously for a finite time period, you see display motion blur.

Pixel Visibility Time = MPRT(100%) = persistence:
= frametime on sample and hold (framerate=Hz or divisor or VRR)
= pulse width time on strobed (framerate=Hz)

Mathematically, at GtG=0 and framerate=Hz
Sample-and-hold 120fps 120Hz has half the motion blur of 60fps 60Hz.
Sample-and-hold 240fps 240Hz has half the motion blur of 120fps 120Hz.
Sample-and-hold 480fps 480Hz has half the motion blur of 240fps 240Hz.
Eliminating motion blur without BFI (strobeless motion blur reduction) requires sheer frame rate at sheer Hz, while keeping GtG as low as possible (OLED FTW!)

MPRT100% is generally equal to frametime on sample-and-hold (when excluding GtG), which is why you need to geometrically upgrade displays, e.g. 60Hz -> 144Hz -> 360Hz -> 1000Hz.

Remember, for (display=nonstrobed AND GtG=0 AND framerate=Hz), eye tracked display motion blur perfectly matches a camera shutter:

[image: eye-tracked display motion blur compared to a camera shutter]


OLED follows Blur Buster Law (MPRT100%) much more closely because GtG=0.
This is why 240Hz-vs-360Hz LCD is barely visible (1.1x), because GtG is a large fraction of a refresh cycle.
This is why 120Hz-vs-240Hz OLED is much more visible (2.0x) because GtG is invisibly near zero.
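To make the persistence math concrete, here is a minimal sketch (my own illustration, not an official Blur Busters tool; the function name and the 4000 px/s figure are just assumptions) that applies MPRT = frametime for a non-strobed display with GtG treated as zero:

```python
# Minimal sketch (illustrative only): sample-and-hold motion blur
# when GtG = 0 and framerate = Hz, so persistence (MPRT100%) equals the frametime.

def sample_and_hold_blur_px(tracking_speed_px_per_sec: float, framerate_hz: float) -> float:
    """Blur width in pixels for an eye-tracked object on a non-strobed display."""
    persistence_sec = 1.0 / framerate_hz          # MPRT100% = frametime when GtG = 0
    return tracking_speed_px_per_sec * persistence_sec

if __name__ == "__main__":
    speed = 4000  # pixels/sec of eye-tracked motion (a fast pan)
    for hz in (60, 120, 240, 480, 1000):
        print(f"{hz:>4} fps @ {hz} Hz -> {sample_and_hold_blur_px(speed, hz):.1f} px of blur")
    # Each doubling of framerate + Hz halves the blur: 66.7 -> 33.3 -> 16.7 -> 8.3 -> 4.0 px
```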

Note: With brand new test variables, in an experimental laboratory (Viewpixx sells 1440Hz projectors commercially), we have found over 90% of people (even grandmas) can tell apart 240Hz vs 1000Hz in scrolling-text readability tests, but it's hard to tell apart 240Hz vs 360Hz. You have to upgrade frame rates and refresh rates geometrically for framerate-based motion blur reduction on sample-and-hold displays. We now recommend 1.5x-2x framerate+refreshrate upgrades for power users, and 2x-4x framerate+refreshrate upgrades for mainstream audiences. Just like some family members couldn't instantly tell apart DVD vs. HDTV, most can tell apart VHS vs. 4K. The same is true of geometric refresh rate and frame rate upgrades.

Increased resolution also raises the retina refresh rate (aka the refresh rate of no further human benefit). The retina refresh rate is actually linked to human angular resolving resolution versus the pixel step per frame, because that's the guaranteed minimum persistence blur (MPRT) -- 8000 pixels/sec on an 8K 1000Hz 1000fps sample-and-hold display still generates 8 pixels of visible motion blur -- and we found the retina refresh rate is projected to be beyond 20,000 Hz on 180-degree 16K VR displays if you want to 100% completely avoid strobing. Strobing eliminates the retina refresh rate problem, but real life doesn't strobe, so a perfect strobeless display that is also blurless is by necessity an ultra high frame rate at an ultra high refresh rate. Hard to believe? Remember, I'm cited in several peer-reviewed research papers now. Click the Research button at Blur Busters.

2. Stutter is Persistence Too! There's a Stutter-to-Blur Continuum (DEMO)

Did you know stutter and persistence motion blur are exactly the same thing?
Stutter is caused by persistence too.

*Here I'm talking about perfectly frame-paced stutter (e.g. the regular stutter of low frame rates like testufo.com#count=6). I'm ignoring judder, harmonic/cyclic stutter, and erratic frame rates for the purpose of this explanation.

Persistence = object staying in same position, regardless of low frame rate (stutter) or high frame rate (blur)
Stutter = slow vibration = like a slow vibrating music string (string looks shaky)
Blur = fast vibration = like a fast vibrating music string (string looks blurry)

Stutter-to-Blur Continuum Demo: See For Yourself!
A second demo: The same effect can be seen in framerate-ramping animations in VRR like the demo at testufo.com/vrr

Now, if you're someone who has both an LCD and an OLED:
1. Try this on LCD. Stutter blends to blur earlier (at a lower frame rate).
2. Try this on OLED. Stutter blends to blur later (at a higher frame rate).
Why? Faster pixel response (OLED) raises the flicker fusion threshold of the stutter-to-blur continuum.
  • LCD motion usually stops visibly stuttering at ~50-55fps* for most human eyes, which is why 60fps looks smooth (but still blurry)
  • OLED motion usually stops visibly stuttering at ~65-75fps* for most human eyes, which is why 60fps looks stuttery (it hasn't fully blended to blur yet)
Different humans have different flicker fusion thresholds, so the ranges may be different for you.
But no matter what, slower and faster GtG moves the threshold of the stutter-to-blur continuum.

Notice that LCD stops stuttering just a tad below 60fps, and OLED stops stuttering just a tad above 60fps? Bingo.
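A rough sketch of those ballpark numbers (my own toy restatement of the thresholds above; the function name and exact ranges are illustrative, and actual thresholds vary per person and per content):

```python
# Toy sketch of the ballpark stutter-to-blur thresholds quoted above.
# Purely illustrative: the ranges vary per person and per content.

LCD_FUSION_RANGE_FPS = (50, 55)    # slow GtG masks stutter; it blends to blur around here
OLED_FUSION_RANGE_FPS = (65, 75)   # near-zero GtG needs a higher framerate to blend

def likely_perception(fps: float, display: str) -> str:
    low, high = LCD_FUSION_RANGE_FPS if display == "LCD" else OLED_FUSION_RANGE_FPS
    if fps < low:
        return "visible stutter"
    if fps < high:
        return "borderline (depends on the viewer)"
    return "persistence blur only"

for display in ("LCD", "OLED"):
    for fps in (45, 60, 90):
        print(f"{display:>4} @ {fps} fps: {likely_perception(fps, display)}")
# 60 fps has already blended to blur on LCD, but still reads as stutter on OLED --
# matching the framerate-for-framerate difference described above.
```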

Now, go to 120fps instead. That framerate is usually above most humans' flicker fusion threshold, so both displays show just blur from persistence.

Regardless, 120fps on both LCD and OLED still has too much motion blur to read the street-name labels in the TestUFO Moving Map Test. Try it - you can't read them, because of the sample-and-hold effect (aka persistence, aka MPRT above zero).

Solution To Stutters Caused By (Fast Pixel Response AND Low Frame Rate)

Now you know why faster pixel response makes stutters more visible.

If you hate stutters more than blur:
1. Raise your frame rate and refresh rate.
2. Get a more powerful GPU.
3. Adjust game settings.
4. Failing that, just add GPU motion blur (yes, it's evil, but it's a de facto assistive feature for some people).

GPU motion blur is a pick-your-poison. It helps mask stutters if you're playing at low "Hollywood movie" style frame rates (e.g. 24fps-60fps) on OLED. So you're adding GPU motion blur to compensate for pixel response being too fast for low frame rates.

Now, that being said, once you're usually above 100fps, I advise most people to disable GPU motion blur effects. But it's a wonderful band-aid for Hollywood frame rates when you hate amplified stroboscopics and amplified stutters (some people get more headaches from stutters than from blur).

Choose whatever ergonomic setting helps you the most, sometimes "GPU Motion Blur Is An Assistive Feature" (for easily-stutter-headached people), even though it's evil to us HFR folks...

Personally, I'd rather be armed with an RTX 4090 and call it a day, to spray brute framerate if you want strobeless motion blur reduction...
 
Note: This is potentially also applicable (in addition to OLED clarity) if the OP has more erratic stutter.
G-SYNC Compatible doesn't support LFC like FreeSync does, as far as I know. The VRR range on the C2 should be 20-120 Hz. It will obviously feel worse the lower in the range you are absent LFC. I'm sure it will improve once the driver officially supports the C2 line. It might be falling back to the VRR range on the older models absent official support (40-120 Hz).
Actually, NVIDIA does LFC on FreeSync.

G-SYNC Native has monitor-side "LFC".

Implementations based on generic VESA Adaptive-Sync (FreeSync / G-SYNC Compatible / HDMI VRR) have driver-side "LFC".

That works as long as the graphics driver has implemented LFC, and NVIDIA's graphics drivers most certainly implement LFC. I can confirm that some of my FreeSync monitors show "66fps" when my NVIDIA game is outputting 33fps -- solid, definitive proof that NVIDIA does LFC for FreeSync monitors (using G-SYNC Compatible mode). On the signal, 66 refresh cycles are being transmitted every time a game is at 33fps, and the monitor is a FreeSync panel with no G-SYNC advertising.
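Conceptually, driver-side LFC just repeats frames so the panel never has to refresh below its VRR minimum. A minimal sketch of that idea (illustrative only, not NVIDIA's or AMD's actual algorithm; the function name and the 48-120 Hz range are assumptions):

```python
# Illustrative sketch only: driver-side LFC repeats frames so the refresh rate
# sent to the panel stays inside the VRR range.

def lfc_refresh(game_fps: float, vrr_min_hz: float, vrr_max_hz: float) -> tuple[int, float]:
    """Return (frame repeat count, resulting refresh rate) for a given game framerate."""
    repeats = 1
    while game_fps * repeats < vrr_min_hz:
        repeats += 1                     # send each rendered frame one more time
    refresh_hz = min(game_fps * repeats, vrr_max_hz)
    return repeats, refresh_hz

# The example from the post: a 33 fps game on a 48-120 Hz FreeSync panel
print(lfc_refresh(33, 48, 120))   # -> (2, 66.0): each frame is sent twice, the OSD reads "66"
```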

The quality of the non-native "LFC" in NVIDIA's drivers does differ somewhat from AMD's drivers, but both definitely have variants of "LFC" algorithms. AMD generally has slightly superior software-based LFC in its drivers compared to NVIDIA, so here they are in descending order of VRR fluidity:

For the same max Hz (e.g. 240Hz) on the same panel tech (e.g. LCD):

(less erratic jitter)
1. G-SYNC Native with NVIDIA Card
2. FreeSync with AMD Card
3. FreeSync via G-SYNC Compatible on NVIDIA Card
(more erratic jitter)

That's the general pecking order of VRR fluidity, but there can be exceptions where a high-performance NVIDIA card out-smooths a lower-performance AMD card, or a better CPU makes the difference.

Regardless, LFC implementations differ slightly in quality and stickiness (e.g. NVIDIA's LFC algorithm for Compatible will often stay sticky well above 60Hz).

- There is a stiction behavior in LFC that is totally different on NVIDIA cards than on AMD cards
- Even with NVIDIA GPU, LFC behaves differently for Native vs Compatible.

The stiction effect is when LFC stays enabled after the framerate bounces back up. Often you see situations where LFC kicks in below 48Hz but doesn't disable itself until well above 60Hz. The stiction ranges can vary a lot between AMD and NVIDIA, and some driver workarounds were done to prevent monitor blackouts when VRR falls below the minimum Hz (e.g. 47fps) due to software underperformance, so safety margins and LFC stiction were optimized to keep VRR bugs from appearing... The LFC stiction effect is good because you don't have problems when framerates rapidly fluctuate across the LFC threshold (e.g. 47->48->47->48), and higher LFC thresholds help make overdrive artifacts and inversion artifacts more invisible, but it can have other weird side effects like double images suddenly appearing/disappearing during strobed VRR.

There's even an LFC-resetting app for NVIDIA's Compatible implementation that was posted to the Programming subforum at Blur Busters Forums, because LFC interfered with a hacked VRR strobing implementation (hacked emulator strobed VRR 60fps had 120Hz double-strobing on one model of monitor). The person was trying to do low-latency emulator strobing via VRR strobing with a 60fps cap, but LFC interfered by adding duplicate images (similar to CRT 30fps at 60Hz). The LFC stiction effect caused the monitor to stay permanently at 120Hz despite a 60fps emulator. LFC stayed stuck on at 60Hz (=fps) because a brief dip below 48Hz (=fps) enabled LFC, and the bounce back to 60Hz (=fps) was not a high enough frame rate for the NVIDIA drivers to re-disable it. AMD cards seem to do a better job of managing the LFC stiction effect.
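The stiction behaviour is essentially hysteresis. A tiny sketch of the idea (the class name is hypothetical, and the 48/60 engage/release thresholds are just the illustrative numbers from the description above, not the real driver values):

```python
# Hysteresis sketch of the LFC "stiction" behaviour described above.
# Thresholds are illustrative only, not real driver values.

class LfcStiction:
    def __init__(self, engage_below_fps: float = 48.0, release_above_fps: float = 60.0):
        self.engage_below = engage_below_fps
        self.release_above = release_above_fps
        self.active = False

    def update(self, fps: float) -> bool:
        if not self.active and fps < self.engage_below:
            self.active = True           # framerate fell below the VRR minimum: engage LFC
        elif self.active and fps > self.release_above:
            self.active = False          # only release once framerate climbs well above it
        return self.active

lfc = LfcStiction()
for fps in (50, 47, 48, 60, 61):
    print(fps, "LFC on" if lfc.update(fps) else "LFC off")
# A dip to 47 engages LFC; bouncing back to 48-60 keeps it "stuck" on (the 60 fps
# emulator case above), and it only releases once the framerate exceeds 60.
```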

Software-based VRR-related algorithms (i.e. not Native) are more prone to gametime:photontime jitter:
- Windows compositing effects (always use fullscreen exclusive for smoothest VRR)
- Terrible >1ms DPC latency events on bad configurations / inferior setups / poor motherboards (Software-based LFC, especially during collision events, is sensitive to terrible DPC performance)
- Power management adds jitters especially when GPU and/or CPU loading is low.

Less precise gametime:photontime sync is how erratic stutters become visible during VRR. Even a sudden 2ms desync = an 8 pixel erratic stutter at 4000 pixels/sec. It could be the game's fault or the driver's performance, or a combination of both, but smooth VRR is simply a math game of perfect gametime:photontime relativity, and any variance from that is erratic stutter.
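As a worked example of that figure (same numbers as the sentence above):

```latex
\text{stutter amplitude} = \text{desync} \times \text{motion speed}
                         = 0.002~\text{s} \times 4000~\text{px/s} = 8~\text{px}
```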

(Note: gametime is the exact time of the game world rendering, and photontime is when that image hits the human eyeball. As long as those stay relative, even fast-erratic frame rates can look smooth, since the rapid fluctuations of framerate are in sync with the rapid fluctuations of refresh rate -- de facto like seeing the motions of a room through an erratically flickering xenon strobe light: real-world movements don't look like erratic stuttering despite the erratic flicker. That's what ideal, perfect gametime:photontime during VRR should do: smoothness despite erratic frame rate, like the TestUFO VRR Random Fluctuation Demo.)

Keep your CPU and GPU load below 100% to maximize driver/mouse/game timing precision, cap your frame rate (usually ~2-4% below max Hz), turn off power management as much as you can, use fullscreen exclusive mode, and tune your game settings (use VSYNC ON in NVCP but VSYNC OFF in game menus). Then you can minimize VRR jitter from software-performance issues as much as possible.
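For the frame cap specifically, a trivial helper (the 3% margin is my own pick within the ~2-4% guideline above; the function name is hypothetical):

```python
# Cap the framerate slightly below the panel's max refresh so VRR never collides with the ceiling.

def vrr_fps_cap(max_hz: float, margin: float = 0.03) -> int:
    return int(max_hz * (1.0 - margin))

for hz in (100, 120, 144, 240):
    print(f"{hz} Hz panel -> cap around {vrr_fps_cap(hz)} fps")
# e.g. a 120 Hz LG C2 -> cap around 116 fps; a 144 Hz monitor -> around 139 fps
```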
 
Chief Blur Buster
Incredible posts! I learned a lot! Thank you for what you do for the community!

Have you studied much in the way of PWM brightness control affecting eye fatigue?


It's hard to find good info on that, but in the last year I've learned quite a bit about it, as I learned I'm affected by it on JVC and Sony projectors. At first I wondered if there was something wrong with my eyes, but I've since learned it's PWM related. Some of the information you relayed above could easily tie into PWM sensitivity, because of pupil refresh and dilation rates far exceeding display-tech PWM rates. It may also explain how each of us sees motion a little differently.

Linked here is my discovery thread on it, after I bought an $11K MSRP JVC projector and my eyes felt hot and tired after each use. The effect is cumulative, and it becomes more painful (and quicker to become painful) with each regular use. This obviously doesn't affect everyone (JVC is perhaps the most recommended projector on AVSForum), but it does affect a small percentage of people negatively: headaches, dizziness, eye fatigue, after-effect vision that makes you see phantom banding on light sources the next day or even days later, and just general eye tiredness that makes your eyes feel better closed than open. It got so bad my eyes were sore for a couple of weeks after I stopped using the JVC.

https://www.avsforum.com/threads/pw...eye-fatigue-questions-and-discussion.3237865/

Here’s what it looks like in 240Hz slow motion capture on an iPhone video. I have quite a few more videos showing this on various JVC projectors: RS540, NX5, RS2100. All PWM flash. It’s a property of the LCOS panels they use.
 
Chief Blur Buster
Incredible posts! I learned a lot! Thank you for what you do for the community!

Have you studied much in the way of PWM brightness control affecting eye fatigue?
Yes. But increasingly often (in the booming variety of display technologies) it turns out to be a wild goose chase to a red herring. PWM is indeed a common cause, but it is also commonly an overused accidental scapegoat when the real ergonomic causes were somewhere else for a specific person.

Being Blur Busters, and people who have long had motion-related / motion blur headaches (this was a big problem for VR, but also, to a lesser extent, for big-screen displays), I have probably heard zillions of anecdotes of display discomfort that spin off into this territory. It's one of the world's giant Pandora's boxes.

These posts have become so common at Blur Busters Forums that I probably need to create a Display Ergonomics Forum sometime.

But needless to say, once you're past the basics (eye doctor, optometrist), you have to do a hell of a lot of troubleshooting. Sometimes the shotgun approach (major display tech change, switching between OLED <-> DLP <-> LCD <-> LCoS while also concurrently switching colorspaces and display sizes) is needed to binary-search your way through the 100+ causes of display ergonomic problems. Nobody has time to try 100 displays, so people who attempt display roulette to solve their display-ergonomic issue have to shotgun multiple causes concurrently.

Yes, PWM is the common problem. It is not necessarily PWM, but it could be. There are over 100 different ergonomic problems with displays, not just PWM or flicker or blue light. I have posted a large list of eyestrain causes on Blur Busters Forums, which I will crosspost here:

Remember that PWM eyestrain is not necessarily the same as flicker eyestrain. PWM effects (multistrobing the same frame in any manner) create duplicate-image effects that can be very tiring on the eyes. Are you sure you've narrowed your cause specifically to PWM yet? Some people get headaches from PWM dimming, but no headache from framerate=Hz single-strobe PWM. In those cases it was confirmed to be the PWM motion artifacts that created the discomfort, not the flicker itself. Sometimes it is the direct flicker, but there are other ways PWM can create discomfort. Some people never got eyestrain from CRTs, but suddenly got eyestrain from seeing serrated-edge motion artifacts -- it's like staring at a serrated knife:

[image: serrated-edge duplicate-image motion artifact from PWM]


But PWM motion-artifact discomfort (as opposed to flicker) is only one possible cause out of more than 100 unexpected possibilities.

Displays are essentially imperfect simulations of real life.
Small portions of the human population can be affected by niche line items.

- antiglare filter texture
- polarization
- pixel structure
- direct flicker
- PWM dimming effects
- brightness
- contrast
- color gamut
- motion blur eyestrain
- color separation artifacts (rainbow effect / DLP colorwheel)
- stutter eyestrain
- screen too bright relative to environment
- viewing distance
- excess blue light
- temporal dithering harshness/noise artifacts (DLP does PWM per-pixel)
- motionsickness from high frame rates (with various causes)
- motionsickness from low frame rates (with various causes)
- motionsickness from smooth motionblurred high frame rates (e.g. early interpolation algorithms or odd blurry HFR material)
- motionsickness from vertigo effects (motion on big screen)
- color primaries too spread apart (HDR eyestrain)
- large ultrabright image in dark room (need bias light)
- etc., etc.

And more that are too niche to list. I've seen people who've gotten display discomfort from any one (or several) of the above line items.

Often, it is a domino-effect situation: the issues are stacked on top of each other concurrently (size + brightness + color primaries + the digital way the image is composed) in a way that stresses a specific human's visual comfort.

For example, low frame rates on ultrafast-response displays can create more visible stutter, which can consequently create more stutter eyestrain, which makes the person a possible candidate for an LCD / LCoS projector. Other times, the ultrawide gamut created by ultra-sharp color primaries (only three narrowband wavelengths) is the cause of the eyestrain / nausea. Another domino effect is a weird interaction between the DLP colorwheel and frame rate: some people got discomfort from 1-chip DLP only at 24Hz, never at 60Hz, because color-separation artifacts are more amplified at 24Hz than 60Hz -- for some people, the comfort threshold for rainbow artifacts is unexpectedly linked to frame rate.

Yet another domino effect is sitting close enough to start seeing temporal dithering artifacts, which can be very pronounced with ultra-HDR (near Rec.2020) laser projection implementations, even with 3-chip. I wish LCD/LCoS projectors had stayed common and high quality, as I loved my old Panasonic PT-AE2000U's ergonomic comfort more than DLP's.

While your eyes are different than mine, I have a personal anecdote on this -- sometimes sharp temporals (colorwheel or temporal dither) + sharp color primaries (narrow peaks at R, G, and B) + sharp laser speckle + ultrawide FOV seem to have a stacking effect on comfort, where it eventually has a somewhat harsh effect on my eyes (not as harsh as yours) and seems to eyestrain me slightly from long exposure. So I prefer slightly softened temporals, like fully-composed refresh cycles (OLED) rather than temporally-composed refresh cycles (DLP), especially when it comes to ultrawide gamuts. Then again, it does not always happen -- I've been fine with ultra-gamut projectors. It's very hit-and-miss, but I can tell that eyestrain is a bigger problem in my line of work than it used to be. Either way, my eyes seem fine with an HID-lightbulb 6x 4K DLP projector, since its color primaries are more broadband. Then again, my eyes are also fine with ultra-HDR OLED displays. It's when you throw in lasers + speckle + ultra-gamut + ultra-narrowband color primaries + extreme brightness + extreme FOV + etc. that it starts to overload ergonomic comfort a bit for some people. Eyes and brains can be weirdly different in how they respond.

I wish displays were more tunable (narrowband vs. broadband color primaries), because there seems to be an ergonomic use case for more broadband color primaries when watching SDR content.

No two humans see perfectly identically. There are all those varying levels of eyeglasses prescriptions and colorblindness (12% of the population has enough aberration to be considered colorblind), but even among the non-colorblind, different humans see colors ever so slightly differently from each other (even if it's just a 1% different color appearance to you than to me). But we've invented standard gamuts that try to be one-size-fits-all, which is why displays almost never perfectly match the colors of real life for everyone -- you try to calibrate it that way for yourself, and it sometimes looks off to a different person. The real world doesn't emit only three narrowband wavelengths at your eyes, and there are no infinite-spectrum displays (infinite color primaries); we've standardized displays around a human average that almost nobody perfectly matches. Even you might see a green light peak 1nm differently from me, or you might be one of the 12% whose aberration is big enough to be considered colorblindness. That's a whole other dimension of round-peg-in-square-hole, in addition to well-known compensations like different eyeglasses prescriptions. At the end of the day, displays are imperfect windows to the real world.

There are people who are just plain sensitive to a specific projector tech (can never use DLP but can use LCD/LCoS, versus can never use LCD/LCoS but is comfortable with DLP), and these cases are diametrically opposed, so there's no universal advice.

Either way:

Everybody sees differently / responds differently.

e.g.
- Try lower frame rates (only 24fps). Check if it fixes nausea / eyestrain / motionsickness? (for some people, it does)
- Try higher frame rates (120fps 120Hz). Check if it fixes nausea / eyestrain / motionsickness? (for some people, it does)
- Try brighter/dimmer
- Try turning on motion blur reduction (add strobing / BFI / flicker mode), best for framerate=Hz on gaming material
- Try turning off motion blur reduction
- Try changing sync technologies if gaming (VSYNC ON, VSYNC OFF, and VRR can have different stutter ergonomics)
- Try changes in viewing distance
- Try windowboxing (shrink size of image with more black borders around)
- Try adding ambient light
- Try unpolarized display
- Try different color generation methods (phosphor, laser, HID, LED, CCFL)
- Try computer glasses (orange tinted), this is more reliable than digital blue-light lowering filters (which can't fix high blue leakages in grey blacks)
- etc (dozens of suggestions exist)

This is only generic advice, which may not apply to you; it's simply some above-and-beyond troubleshooting you have to do once you've tried your best with your eye doctor (who will typically not be familiar with the 100+ display-discomfort causes).

Since you spent $11K on the projector and a custom home theater room, this might require a fairly custom troubleshooting sequence to narrow down the causes, which may also require temporarily borrowing other projectors to use in the same room for A/B ergonomics tests (even cheap conference-room projectors are useful datapoints). Sometimes it's traced to laser, sometimes to temporal dither, sometimes to sheer brightness, sometimes to bias lighting.

There are over 100+ ergonomic-discomfort possibilities -- other than "PWM" and "blue light".

I would suggest a totally brand new thread if you want to attempt a display-ergonomics troubleshoot run, and send me a PM link to follow up. I'll move the discussion there, whether a new thread on [H] or one of the existing display-ergonomic threads over at Blur Busters Forums, or elsewhere like AVSFORUM. [either public collaboration venue is fine, I prefer public collaboration since many read my posts]
 
With hardware G-Sync on the Alienware monitor I don't feel any issues in gaming down to the low 40FPS range - it feels buttery smooth.

I recently swapped to the LG C2 OLED with G-Sync compatibility. It feels WAY less smooth in the 40-50 FPS range than the Alienware does with Cyberpunk 2077...

I've had the same experience. Cyberpunk 2077 on my 2080Ti at 35Hz felt perfect. No skips, stutters, or anything. An excellent gameplay experience. My LCD is an ASUS PG348 100Hz with G-Sync module.

It might be that a future driver will help, but I am sold on the G-Sync module (called G-Sync Ultimate now) and plan to stick with that for my next display, which I am currently shopping for. I've been eyeing the Alienware AW3423DW, which came out generally ahead in Hardware Unboxed's comparison against the other displays they tested. A few possible downsides: the RGB elements (subpixel layout) are physically arranged in a triangle on the AW3423DW, so text in Windows isn't perfect, since ClearType is tuned for side-by-side subpixels and doesn't yet support the triangular arrangement. There is also the possibility of burn-in. But I've been waiting 4+ years for high-speed HDR/high-refresh screens and they have been too damn slow coming, so I will probably get the 3423 once the price is back down to $1299. Amazon hikes the price whenever the stock gets low...
See https://www.reddit.com/r/ultrawidem...3dw_and_other_oled_text_rendering_is_bad_are/ for a discussion of the sub-pixel layout issues. There are some fixes that work OK, but the world is waiting on Microsoft to fix it in the ClearType app that's built into Windows.
 
Yes. But increasingly often (in the booming variety of display technologies) it turns out to be a wild goose chase to a red herring. PWM is indeed a common cause, but it is also commonly an overused accidental scapegoat when the real ergonomic causes were somewhere else for a specific person.

[...]

There are over 100+ ergonomic-discomfort possibilities -- other than "PWM" and "blue light".
I don't know about the ins and outs of all the variables you mentioned, and it might be a domino stack of several things you listed, but I've already concluded that my eyes are just not compatible with JVC projectors, nor Sony projectors -- and I think it has to do with the LCoS panel tech and its flickering. My brain can't "see" the flicker on the JVC, but every time I use one my eyes feel burnt and tired, and it's cumulative with use. It takes a while to build and it takes a while to dissipate. The new Sony laser projectors flicker so badly (to my eyes) that I can see it without slow-motion video, and it's terribly obnoxious. I demoed the new $25,000 Sony 7000 projector at a friend's house with a handful of guys and I asked, "Is this flickering terrible to anyone else?" and everyone said, "No, it looks fine to me."
I said to my eyes it's flashing like a strobe light and is completely unacceptable. They didn't see it, so I pulled out my phone and took a slow motion video and showed them it had PWM - but I didn't need to use the camera to know - because it was visible to my unaided vision. Here is what the $26,000 JVC RS4100 (NZ9) (top) looks like compared to the $25,000 Sony 7000 laser projector (bottom) from that comparison in 240Hz slow motion. That Sony 7000 flashes like a jackhammer! (240Hz slow motion capture)


I've run into some manner of PWM with a monitor before that hurt my eyes, but at the time I don't think I knew it was PWM. I took it back to the store, bought a different monitor, and never ran into it again until the JVC RS2100 I purchased in December 2021. Back in the day, CRT monitors did bother my eyes regardless of refresh rate (I had an 85Hz CRT and it still bothered my eyes, as did 60Hz). My first LCD screen was a godsend.

I've used projectors as my primary home display in my home theater since 2002, so that's 20 years. In that time I've owned all LCD projectors. My first unit was some business-class Sharp model I purchased off eBay (not a CRT projector). Then a Panasonic 500U, then a Panasonic 700U, then an Epson 8350, then a Panasonic AE8000U, then an Epson 5040UB, then this JVC RS2100 (NZ7), and now an Epson LS12000 is my current unit (though I still have the RS2100). Thankfully the Epson LS12000 doesn't bother my eyes at all and shows no symptoms of PWM under 240Hz slow motion.

The JVC RS2100 (NZ7) was the first projector I've owned that hurt my eyes. I tried all manner of settings, motion control, low laser, high laser, eshift on/off -- it just bothers my eyes, and low laser more than high laser. (presumably because there is more PWM flashing).

I borrowed a good friend's JVC RS1000 (NX5) for a month to see if it hurt my eyes too after realizing the RS2100 did (to find out whether it was the laser or the LCoS panels causing my fatigue) - yes, the bulb-based JVC model did bother my eyes. (240Hz slow motion capture)


Then I bought a JVC RS540 to see if it hurt my eyes since I couldn't find much information on other people stating JVC hurt their eyes and wondered if it was a newer model issue. Yes the RS540 did bother my eyes too. (240Hz slow motion capture)


Then I saw that Sony laser projector in the last year and it was nearly torturous. So it's clear that people do see differently - because I could barely stand to look at that Sony.

I saw BFI on an OLED at a friend's house for the first time recently and it was straight up like a strobe light to me. TERRIBLE. And again my friends didn't see it. Another validation that people see differently was some blind testing of G-Sync with a friend. He couldn't tell when it was on or off, and I could every single time. Since he couldn't see it, he always thought it was bunk, and challenged me that I couldn't tell either. So he set up a blind test and tested me with various scenes and games until he got bored, and I was accurate 100% of the time about whether G-Sync was on or off. To me, G-Sync on my Alienware AW3418DW is the best gaming experience upgrade since the original 3D cards, and it's so buttery smooth with it on that I don't even want to game with it off. To him, he can't tell if it's on or off. That's why I was curious about the OLED's implementation of it - because it's so vastly different from the IPS Alienware I've used since 2017 (inferior to the Alienware's implementation, to my eyes).

Another anecdotal note: I have used high-refresh displays, and even though I seem to be sensitive to PWM flashing, I'm not bothered by lower frame rates the way other people are. One of my LAN party friends says anything below 165Hz feels too slow to him, and he always lowers IQ settings to keep his frame rate capped. He says G-Sync doesn't help him - only high frame rate. I can't even see anything different when it's that fast. Anything over about 75Hz is perfectly smooth to me in gaming - especially with G-Sync - and I can go much lower. In my testing with the Alienware IPS and hardware G-Sync, it feels very smooth to me down to about 42Hz. I don't like motion blur and almost always turn it off.

This is all individual "who cares" information, I'm sure. But I find it interesting, and thanks for the feedback you've given - I've learned a lot from this discussion. It's interesting that panels work the way they do. I tried the same game that looked super herky-jerky on my OLED in the 40 FPS range with DLSS, which put it in the 80 FPS range, and it felt much, much better. I'd gladly trade the small IQ drop for a smoother frame rate.

One more thing. In the last few months I watched a video about the refresh rate, or flicker fusion rate, of various animals and how fast their eyes and brains process motion. Dogs see about 85Hz, humans about 60Hz (on average) -- and of course there is variance within that general range that is specific to an individual animal. The video was saying that for animals that see at a faster refresh rate, time passes a bit more slowly, and on average it can give them better reflexes, as they have more cycles to process fast-moving objects. It may help with their hunting or escape functions.

So that's interesting too. I'll see if I can dig up that video and share it.
 
Here's that video I was referring to in my last post that talks about the Critical Flicker Fusion rate for various animals and how they perceive time. I found it super interesting.
 
Have you tried the Sony SXRD LCoS?
Here's that video I was referring to in my last post that talks about the Critical Flicker Fusion rate for various animals and how they perceive time. I found it super interesting.

Flicker is only one part of the motion equation.

There are multiple orders of magnitude involved (with a healthy error margin, but this demonstrates the massively different orders of magnitude of the detection thresholds involved):

~10 Hz -- humans stop seeing a slideshow effect and begin to perceive motion
~100 Hz -- humans stop seeing flicker, and stop seeing stutter at perfect framerate=Hz
~1,000 Hz -- humans stop really noticing display motion blur (persistence)
~10,000 Hz -- humans stop really noticing stroboscopic/temporal effects (gapped motion effects, temporal dither effects, etc.)

Some humans are only sensitive to 1 of the above, others are to all. There are people who get eyestrain from DLP temporal dithering effects (1440-2880Hz per-pixel PWM).

(Note: These are crudely simplified orders of magnitude and can be off quite a bit, but they illustrate that there is a whole Pandora's box of artifacts far beyond the flicker fusion threshold.)
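A rough restatement of those thresholds as a lookup (very approximate, purely illustrative of the list above; the function name and exact cutoffs are my own, and individual sensitivity varies):

```python
# Approximate order-of-magnitude thresholds from the list above (not a scientific model).
THRESHOLDS_HZ = [
    (10,     "slideshow effect (motion not yet perceived as continuous)"),
    (100,    "flicker and framerate=Hz stutter"),
    (1000,   "display motion blur (persistence)"),
    (10000,  "stroboscopic / temporal-dither artifacts"),
]

def artifacts_still_visible(refresh_hz: float) -> list[str]:
    """Artifact classes a framerate=Hz display at this rate has NOT yet pushed past."""
    return [name for threshold, name in THRESHOLDS_HZ if refresh_hz < threshold]

for hz in (60, 240, 1000, 20000):
    print(hz, "Hz ->", artifacts_still_visible(hz) or "approaching 'retina refresh rate'")
```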

One example is the famous stroboscopic effect; it is also visible in games, far beyond the flicker fusion threshold:

[image: stroboscopic effect example]


[animation: stroboscopic stepping of a moving crosshair]


And when you add PWM to that, these things become more pronounced and more bothersome to some people. They can be even more pronounced with wide-FOV, very bright, sharp-primary, wide-gamut light sources (i.e. PWM laser-illuminated projectors).

This still happens even at 1000fps 1000Hz -- the finiteness creates differences versus real life. So in many cases, even 1000Hz is not the retina refresh rate. That being said, 4000 pixels/sec at 1000 stroboscopics/sec still creates steppings every 4 pixels. Even 1000fps 1000Hz can still end up doing that if you don't use a GPU motion blur effect intentionally to hide stroboscopics (that's why I said the dreaded GPU motion blur setting is sometimes another person's "ergonomic feature", when they get more eyestrain from stutters or stroboscopic effects). Although Blur Busters is all about busting motion blur, we also believe in adding motion blur when it's the lesser evil -- Right Tool For The Right Job.

For a perfect Holodeck (~16K 180-degree VR), the retina refresh rate can be well over 20,000fps at 20,000Hz to cover all of the above, if you don't use eyetracker-compensated motion blurring. In the diminishing-curve-of-returns stratosphere, larger geometric jumps are needed (e.g. 240Hz -> 1000Hz -> 4000Hz) to be human-noticeable (like the blur differences between a 1/240sec SLR photo and a 1/1000sec SLR photo), assuming the blur/stroboscopics at sufficient motion speeds in pixels per second create artifacts bigger than the human angular resolving resolution. For example, 1ms MPRT (which requires 1000fps 1000Hz to achieve strobelessly) still creates 10 pixels of motion blur on an 8K display at 10,000 pixels/sec scrolling motion, so even 1000fps 1000Hz is not "retina refresh rate" for that resolution. Higher resolutions amplify Hz limitations. A lot of motion artifacts are a result of the finiteness of refresh rates and frame rates, and can never perfectly match real life.

Strobing, in fact, is a motion blur reduction band-aid, and that's why we're big fans of future brute framerate-based motion blur reduction (sheer fps at sheer Hz), since real life doesn't strobe. Since real life is analog (aka infinite frame rate), the only universal whack-all-moles method for finite temporal behaviors is to get as close to that as possible (such as via brute framerate and Hz).

The LCoS flicker only seems to happen with laser-based LCoS. I don't see any flicker temporals on HID/LED-based 3-chip LCoS, since LCoS itself doesn't flicker noticeably, but the illumination light source can. Also, fortunately, you don't seem to be motion-blur-sensitive (some of us get headaches from display motion blur) even though you seem exceptionally sensitive to flicker, so you can try testing out non-laser-lit LCoS technology -- like an HID/LED-based 4K Sony SXRD projector, or similar. Those have practically zero PWM, no per-pixel PWM, no backlight PWM, and are pretty eye-pleasant, comfortable for viewing 8 hours on end to my eyes. Just not as good blacks or HDR. You do give up the color gamut of lasers, which can sometimes be problematic for the comfort of some human eyes (it's amazing wide gamut if you're comfortable with the image, though). Another option is LCD projectors, but those have certainly stagnated -- I used to have a Panasonic PT-AE2000U projector and it was super easy on my eyes -- but I don't see any new worthy LCD projectors at the moment. Laser is the big thing going on, and it looks amazing, but it isn't easy on everybody's eyes.

Displays are certainly vessels of disappointing compromises, and there is no Holy Grail to be found. But some compromises can get really good (high framerates on a VRR OLED are really lovely to many eyes).

You should check out my article titled "The Stroboscopic Effect of Finite Frame Rates" -- some motion artifacts are human-visible to well over 10,000 Hz in some conditions!
 