27" 240hz OLED Monitor!!! (but there's a catch...)

I won't argue the motion clarity, though I'd say that OLED is there, if not nearly there.
Definitely not -- you still have refreshtime (frametime) persistence, so you can't get less than 1/240 sec of display motion blur on a 240Hz sample-and-hold OLED.

During fast scrolls of 4000 pixels/sec on a 4K OLED, 4ms of persistence translates to 16 pixels of display motion blur. That's enough to fuzz up tiny details like fine text on walls or ultrafine-detail graphics.

Blur Busters Law says 1ms of pixel visibility time translates to 1 pixel of motion blur per 1000 pixels/sec. That is exact when GtG=0 (i.e. squarewave strobing or squarewave refresh-cycle transitions), which is why OLEDs follow Blur Busters Law much more closely than LCDs do. The formula is still an excellent approximation when GtG is somewhat under half a refresh cycle, so it works reasonably well (though more as an estimate) on LCDs, while the motion blur math becomes virtually perfect on sample-and-hold OLEDs.

At framerate=Hz:

Motion blur = pixel visibility time.
Motion blur = pulsetime on strobed.
Motion blur = refreshtime on nonstrobed.
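
To make the arithmetic concrete, here's a tiny Python sketch of the math above (my own illustration -- function names are just placeholders, and it assumes GtG=0 so persistence alone sets the blur):

```python
# Minimal sketch of the Blur Busters Law math above, assuming GtG ~ 0.

def persistence_ms(refresh_hz, frame_hz=None, strobe_pulse_ms=None):
    """Pixel visibility time per frame, in milliseconds."""
    if strobe_pulse_ms is not None:              # strobed: blur tracks the pulse width
        return strobe_pulse_ms
    frame_hz = frame_hz or refresh_hz            # sample-and-hold: MAX(frametime, refreshtime)
    return 1000.0 * max(1.0 / refresh_hz, 1.0 / frame_hz)

def motion_blur_px(speed_px_per_sec, persistence_in_ms):
    """1ms of persistence = 1 pixel of blur per 1000 pixels/sec of motion."""
    return speed_px_per_sec * persistence_in_ms / 1000.0

print(motion_blur_px(4000, persistence_ms(240)))                        # ~16.7px: the ~16px scroll example above
print(motion_blur_px(4000, persistence_ms(1000)))                       # 4px: 1000fps at 1000Hz sample-and-hold
print(motion_blur_px(4000, persistence_ms(240, strobe_pulse_ms=1.0)))   # 4px: 1ms strobe pulse at 240Hz
```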

For VRR, refreshtime = frametime, which is why you see increasing/decreasing amounts of motion blur in VRR framerate-ramping animations (e.g. www.testufo.com/vrr). That is also a good demo of the stutter-to-blur continuum, where stutter blends into blur at the moment the stutter vibration goes beyond your flicker fusion threshold. (Regular low-framerate stutter is a sample-and-hold effect, just like motion blur.)

Also, 1ms MPRT on a CRT is easy, but to achieve 1ms MPRT on sample and hold, you need (1 second / MPRT) = 1000fps at 1000Hz at GtG=0 in order to match the display motion blur of a CRT tube.

Remember MPRT is not the same thing as LCD GtG.

Even GtG=0 can still have display motion blur, since MPRT(0%->100%) is throttled by MAX(frametime,refreshtime) on sample and hold.

You can view www.testufo.com/eyetracking on any non-BFI OLED at any refresh rate, to confirm this -- see for yourself -- because the optical illusion is ONLY possible due to display motion blur being throttled by frametime (which is then limited by the minimum refreshtime).

At high resolutions and high angular resolution (high PPD/ppi), motion blur is far easier to see than it was on an SXGA CRT tube, as the difference between static resolution and motion resolution. Sharper static resolution (more pixels) competing against unchanged motion resolution means motion blur is more easily seen when resolutions and game detail are higher, for the same physical motion speed (in screen widths per second on a similarly sized display, for example).

Lots of variables matter: screen resolution, human eye resolution, maximum eye-tracking speed, and how long a moving object stays on screen (enough time to identify motion artifacts such as motion blur or stroboscopic effects). In an extreme case (e.g. 16K at 180-degree FOV on a theoretical sample-and-hold VR headset, since real life does not flicker/strobe), the retina refresh rate is not reached until >20,000fps at >20,000Hz, due to the huge number of pixels and plenty of time for eyes to compare static resolution against motion resolution.

For desktop displays, a ballpark of 1000-4000Hz at framerate=Hz can easily be retina-refresh-rate territory at currently-used computer resolutions, since the FOV is much smaller, which means ultrafast-moving objects disappear off the edge of the screen before you can tell whether they have motion blur. So retina Hz is tied to the human-resolvable resolution over the width of the display, to your maximum eye-tracking speed, and to how long the moving object stays on screen -- long enough for you to identify whether it loses resolution versus a stationary object.
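
As a very rough back-of-napkin sketch (my own, simply inverting the same Blur Busters Law math -- not a measurement), the retina-Hz ballpark scales with the fastest motion your eyes can actually track across the panel:

```python
# Rough ballpark only: Hz needed so sample-and-hold persistence blur stays under a budget.

def retina_hz_estimate(max_tracked_speed_px_per_sec, blur_budget_px=1.0):
    # blur_px = speed * persistence, and persistence = 1/Hz at framerate=Hz
    return max_tracked_speed_px_per_sec / blur_budget_px

print(retina_hz_estimate(4000))     # ~4000 Hz ballpark for fast trackable pans at desktop resolutions
print(retina_hz_estimate(20000))    # >20,000 Hz territory once resolution/FOV allow ~20,000 px/sec eye-tracking
```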

So, today we're not even matching a CRT with a sample and hold OLED.... yet.
So:
(A) Ultrashort-flash BFI blur reduction needs to be added, OR
(B) Enough brute framerate-based blur reduction needs to be added.

For more information, click the Research tab of the Blur Busters website. I'm already cited in 25+ peer-reviewed papers, so I've become an authoritative source on display motion blur.
 
Yeah...I'm still using high end Sony CRTs as daily drivers. Nothing's been destroyed. Though I eagerly await such destruction...

(Kind of doubting that 240Hz with Ray Tracing and such is going to be a thing, but I'd love to be wrong about that.)
 
Yeah...I'm still using high end Sony CRTs as daily drivers. Nothing's been destroyed. Though I eagerly await such destruction...

(Kind of doubting that 240Hz and Ray Tracing and such is going to be a thing, but I'd love to be wrong about that.)
Don’t destroy them! Give them to me! Lol
 
It's as if digital filming had ~50% motion blur when wheeling the camera around at high speed, or during periods of high viewpoint/viewport motion in GoPro action video capture, and was on the road to less -- but film motion cameras and photography could capture zero-blur motion shots (after tweaking your camera inside and out periodically, and after warming it up for half an hour on site every single time).
It is not correct that digital vs. film had camera motion blur differences at the same shutter speed.

(...Notwithstanding older motion-blur-worsening compression algorithms AND older slow-shutter digital cinema cameras -- let's assume proper film-vs-digital cinema cameras at the same shutter speed...)

I will add a correction to this:

The extra motion blur of modern digital cinema, despite the same shutter speed, was purely because:
(A) Modern digital films are often displayed on sample and hold displays.
(B) Old 35mm film was displayed on 35mm projectors that did double strobing, which had a side effect of reducing display motion blur.

I was able to confirm that digital cinema can duplicate 35mm cinema by displaying digital film on a 96Hz sample-and-hold display with software-based double-strobe BFI (black frame insertion: VISIBLE, BLACK, VISIBLE, BLACK), and it looked identical to a 35mm film projector displaying footage shot at the same camera shutter speed. (It becomes exactly the same blur mathematics too!)

Also, with a digital cinema projector, you can film sports at 24fps at 1/1000sec shutter, then play it back on a simulated film projector strobe (96Hz sample and hold display with double-strobe BFI algorithm), and get exactly the same look as 35mm film.

Remember, the reason digital cinema has more motion blur is that digital cinema is displayed on sample-and-hold displays (LCD TVs, OLED TVs, digital cinema projectors).
  • Remember, camera blur and display blur are additive.
  • Remember, display motion blur is frametime on sample-and-hold.
(1/24sec) camera persistence + (1/24sec) display persistence = (1/12sec) human-seen motion blur. YUCK.

Now, even if you did a fast shutter:

(1/1000sec) camera persistence + (1/24sec) display persistence = ~42.7ms human-seen motion blur. STILL YUCK.
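
Quick arithmetic check of the two examples above (a sketch that simply adds the two persistences, as this post does):

```python
# Camera blur + display blur treated as additive, per the examples above.

def seen_blur_ms(camera_shutter_s, display_persistence_s):
    return (camera_shutter_s + display_persistence_s) * 1000.0

print(seen_blur_ms(1/24, 1/24))     # ~83.3ms  (1/12 sec)
print(seen_blur_ms(1/1000, 1/24))   # ~42.7ms  (fast shutter, still 24Hz sample-and-hold)
```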

I do enjoy Hollywood Filmmaker Mode, but add 96Hz + a software double-strobe to it, and it becomes True 35mm Projector Simulator Mode. The easiest way is to use a computer display, create a custom 96Hz mode to permit a 48Hz software strobe, and use a software BFI strober (e.g. DesktopBFI or another app, but make sure you're using borderless fullscreen). Then you're reproducing 35mm film projector flicker and its blur-reduction effect.
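
For illustration, here's a tiny Python sketch (my own, not the actual DesktopBFI code) of that double-strobe cadence -- each 24fps film frame gets VISIBLE, BLACK, VISIBLE, BLACK across four refresh cycles of the 96Hz mode:

```python
# Frame-sequencing sketch emulating a 35mm projector's double shutter on a 96Hz display.
from itertools import islice

def double_strobe_sequence(film_fps=24, refresh_hz=96):
    assert refresh_hz == 4 * film_fps, "sketch assumes a 4:1 refresh-to-frame ratio"
    frame = 0
    while True:
        for visible in (True, False, True, False):   # VISIBLE, BLACK, VISIBLE, BLACK
            yield frame, visible
        frame += 1

# First 8 refresh cycles: film frame 0 flashed twice, then film frame 1 flashed twice.
print(list(islice(double_strobe_sequence(), 8)))
```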

For more information see the UltraHFR FAQ on the Blur Busters website (click the purple "Research" menu). We have already tested 240fps recordings on a 240Hz display, and even experimented with blurless 360-degree shutter (1000fps at 1000Hz) using sped-up Phantom Flex footage on a vision-science 1000Hz DLP projector (the kind similar to the ViewPixx projector).

Interestingly, 360-degree shutter footage has no stroboscopic stepping effect on sample-and-hold displays at all, and if you have enough framerate, a 360-degree shutter becomes low camera persistence (a 1ms shutter at 1000fps). And with 1ms refreshtimes, sample-and-hold concurrently becomes low persistence without strobing (1ms MPRT without strobing). Low-persistence 360-shutter AND low-persistence sample-and-hold combined.

Digital cinema with no camera blur, no display blur, no stroboscopics, no phantom arrays, no brightness loss from strobing, and motion resolution superior even to 35mm and noticeably superior to classical HFR (48fps-120fps).

So 360-degree 1000fps+ 1000Hz+ sample-hold is the Ultimate HFR of the 2030s.
Perfect for motion simulators when you need to simulate framerateless / strobeless / blurless reality.
Real life has no flicker, real life has no frame rate, and real life has no extra camera blur / display blur above what your eyes/brains generate.

I've seen the promised land.

Mark my words.

EDIT: Added corrections/adjustments.
 
(Kind of doubting that 240Hz with Ray Tracing and such is going to be a thing, but I'd love to be wrong about that.)
Psst.... 4K 1000fps 1000Hz UE5 detail is now achievable with current 4000-series GPUs if we implement reprojection [YouTube]



I actually downloaded that demo and I was downright impressed, but I needed a 100fps original framerate to stop being distracted by reprojection artifacts. But when I did 100fps->(360Hz+) on sample-and-hold, I eliminated the old double-image effect caused by reprojection on the Oculus Rift.

It only used 10-20% of my Razer laptop GPU to convert 100fps to 1440p 280fps, and I hadn't yet tested on a desktop GPU at the time. For a reprojection algorithm, it takes under 10% of a 4000-series GPU to convert 100fps to 1,000fps at current memory bandwidth (terabyte/sec league). Reprojection is GPU-detail independent. 90% of the GPU can be focused on doing 100fps at RTX ON. (Heck, use DLSS 3.0 as a pre-framerate-amplifier stage, if need be...) And then reprojection can finish the rest of the way.

By using blurless sample-and-hold, reprojection artifacts cease to be objectionable, especially when 6dof is done at a 100fps+ minimum original pre-reprojection frame rate, to keep reprojection artifacts beyond my personal flicker-fusion threshold. There's a relationship between strobing and sample-and-hold when it comes to reprojection artifacts, especially when trying to reproject original frame rates lower than a flicker fusion threshold (~70-85Hz). By keeping the stutter-to-blur continuum completely in the blur section for both the original AND reprojected frame rates, 90%-99%+ of reprojection artifacts seem to magically disappear.

There are some ultra-minor oddities, but far fewer than DLSS 2.0 and DLSS 3.0 oddities. So, as long as a game can do 4K 100fps, a modern VR-derived reprojection algorithm can generate 9 extra frames per frame at very low GPU workload with current 4000-series horsepower and memory bandwidth, multiplying frame rates 10:1 and reducing motion blur by 90% without using strobing. This is brute framerate-based motion blur reduction -- strobeless ULMB, ergonomic flicker-free low-persistence sample-and-hold.

Game developers need to implement reprojection technology, stat. There should be a new Vulkan/DX API that accepts 6dof coordinates during frame generation, to reduce reprojection latency to near zero. It would be a much lower workload than what DLSS 3.0 is currently doing.
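
As a heavily simplified illustration of the idea (my own hypothetical sketch -- every function name here is a placeholder, not a real engine or driver API):

```python
# Hypothetical sketch of framerate amplification by reprojection: render ~100fps,
# then warp the newest rendered frame with the freshest camera pose to fill in
# frames up to the display rate. All names are stand-ins for illustration only.

RENDER_FPS = 100
DISPLAY_HZ = 1000
AMPLIFY = DISPLAY_HZ // RENDER_FPS               # 10:1 => 9 generated frames per rendered frame

def sample_camera_pose():
    return 0.0                                   # stand-in for a real 6dof pose read

def render_scene(pose):
    return {"pose": pose, "pixels": None}        # stand-in for the expensive RTX/UE5 render

def warp_by_pose_delta(frame, old_pose, new_pose):
    return frame                                 # stand-in for the cheap reprojection warp

def present(frame):
    pass                                         # stand-in for the swapchain present

def one_render_interval():
    base_pose = sample_camera_pose()
    base_frame = render_scene(base_pose)         # heavy work: ~90% of the GPU
    present(base_frame)
    for _ in range(AMPLIFY - 1):                 # cheap work: reprojection fills the other 9 frames
        present(warp_by_pose_delta(base_frame, base_pose, sample_camera_pose()))
```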

This ain't your grandpa's Sony Motionflow interpolation -- especially since Netflix is effectively 1fps with 23 H.264/HEVC "interpolated" frames in between anyway -- and keeping framerate-amplification artifacts below the human-visibility threshold can be more useful than lowering game detail so aggressively that UE5 turns into Pong.

Strategic decisions on how to improve frame rate are currently made poorly by the industry, but some of us have already roadmapped ways to do 4K 1000fps UE5 raytraced -- using only today's off-the-shelf technologies. Even 4K 1000fps can be done by refresh-rate-combining 8 different 4K 120Hz strobed projectors stacked onto the same projection screen; projector stacking is an easy way to combine the refresh rates of multiple projectors. Even when one GPU is not enough, 8 separate genlocked computers can take turns generating frames at 1/8th-refresh offsets to spread the horsepower (e.g. Quadro genlock bus), at least in high-budget contexts (enterprise, simulators, etc.). However, the multi-GPU approach is no longer essential when doing RTX 4090 reprojection (yay), given reprojection is more memory-bandwidth limited. It may not be cheap at first, but at least it's now confirmed possible to do strobeless blur reduction at 10:1 ratios on current GPUs.
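
Back-of-napkin sketch of the projector-stacking arithmetic (my own illustration):

```python
# N genlocked 120Hz strobed projectors, each phase-offset by 1/N of a refresh
# cycle, stack onto one screen for an N x 120Hz effective presentation rate.

N_PROJECTORS = 8
PER_PROJECTOR_HZ = 120
refresh_ms = 1000.0 / PER_PROJECTOR_HZ

offsets_ms = [i * refresh_ms / N_PROJECTORS for i in range(N_PROJECTORS)]
print(offsets_ms)                           # ~[0.0, 1.04, 2.08, ... 7.29] ms phase offsets
print(N_PROJECTORS * PER_PROJECTOR_HZ)      # 960 Hz combined
```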

So we've solved the GPU-side and display-side of a 4K 1000fps UE5-detail realtime raytraced ecosystem completely with currently available off-the-shelf technology.

Public white paper coming 2023.


I'm not the only one who has conceptualized this. Dozens apparently have. But they don't have our brand-name cachet, so I'll take the white-paper baton. A peer-reviewed researcher waitlist is ongoing -- contact me if any lurker is a researcher in this realm and wishes to participate.

Mark my words.
 
It is not correct that digital vs. film had camera motion blur differences at the same shutter speed.

What I posted was an "as if" comparison... it wasn't based on actual film vs. digital video/image recording physics and wasn't meant to compare their literal features and specifications. I still appreciate the info in your response, but I was just making a comparison as if you were in an alternate dimension, so to speak, where (FW900) CRT gains and limitations were somehow what you'd get with film cameras, and digital HDR displays' gains and limitations were what you'd get with digital photography/recording. It was a metaphor of sorts. "A metaphor is often poetically saying something is something else." Apologies if there was any misunderstanding in how I presented it.
 
Oh, I was referring to my S95B QD-OLED, not the 34" QD-OLED UW gaming panel. The S95B on older firmware gets insanely bright for an OLED. Here is a real pic of me playing The Callisto Protocol in the dark; it's like a 55" PG32UQX but without all of the LCD/FALD drawbacks:
Epic!

I picked up one of these S95Bs based on your recommendation and it is MIND blowing. If the Eye of God were a display, it would be the S95B.
[Image: 20221208_175251.jpg]
 
55" light torch. How on Earth could you view that in person and not hurt your eyes?! It's also way too big for a normal gaming setup.
 
IT'S GLORYASS!!!

"IT'S TOO BIG" - WHAT SHE SAID

"LIGHT TORCH" - HELL YES!!!

Oprah Winfrey once said if you like your trashy 42" dim WOLED you can keep it, as you're not worthy to handle the power of the Ditka!

"QD-OLED is not your grandma's wimpy gimpy OLED... it's a ball-busting nutcracker!"
~ Abraham Lincoln, 1869
 
Uh oh. Vega and l88bastard combined giving glowing kudos to hardware is never good news potentially for our wallets around here. ;) I do have some space issues but I'm definitely keeping that Samsung in mind.
 
Nothing touches the S95B in gaming/HDR, and I've had about everything. It's like a PG32 and an LG OLED had a baby that dropped all of the negatives and kept all of the positives of both. Just get a big desk, push the S95B back, and rock n' roll. Absolute STEAL at $1,449.
 
The S95B image quality has to be seen to be understood. It's made my C242, C177 and C955 take walks of shame.

But but but "it won't fit on my desk"... that's weak Betamax speak... make it fit!!!
[Image: 20221208_173411.jpg]
 
"Normal" / stereotypical "drafting table" or office cubicle like setups for media and gaming are overrated imo. I get that some people are confined by circumstances and space but otherwise consider breaking out of the 90's "up against the wall like a bookshelf" / "upright piano keyboard with sheet music screen" pc scenario.

A slim rail-spined TV stand (flat-footed or caster-wheeled) with full wall-mount-style VESA hardware is around $90 - $150 USD and is very modular.

(tray installs are optional on these types of stands)

[Images: swll7VU.png, sHneoux.png]

That, or a more dedicated wall mount, or a secondary surface for the screen to sit on. Uncouple your screen from your desk. You can also get a peripherals desk on caster wheels, or install casters on a capable desk -- then roll your desk up to the screen (or vice versa) to save space when not in use. It can also give you enough flexibility in your room layout to keep your screen from being a giant catcher's mitt for every direct light source in the room (light pollution is bad whether glossy or coated). People set up racing wheels, dance pads or dance-tracking camera game areas, small-scale VR gaming areas (Beat Saber, etc.), consoles from a couch, or even small exercise areas similarly. And it doesn't even have to go that far -- you can still have your full desk. Space and finances permitting, it's not really an outrageous concept to get a floor TV stand and increase the distance to a larger screen by a few feet.



A 55" is a decent stretch to sit at optimal 50 to 60 degree viewing angle distances though (or at least a minimum seating distance to get 60 PPD). Somewhere around 3.5 feet to 4 1/4 feet screen to eyeballs.

Human central/binocular viewpoint 50 - 60 deg:

[Diagrams: 774484_3kU3adt.png, tJWvzHy.png]


50 deg viewpoint 4k = 77 PPD
60 deg viewpoint 4k = 64 PPD

55" 16:9 at 50 deg viewing angle = 51" view distance
55" 16:9 at 60 deg viewing angle = 42" view distance
55" 16:9 at 64 deg viewing angle = 38" view distance (and ~60 PPD at 4k)

48" 16:9 at 50 deg viewing angle = 45" view distance
48" 16:9 at 60 deg viewing angle = 36" view distance
48" 16:9 at 64 deg viewing angle = 33.5" view distance (and ~60 PPD at 4k)

42" 16:9 at 50 deg viewing angle = 39" view distance
42" 16:9 at 60 deg viewing angle = 32" view distance
42" 16:9 at 64 deg viewing angle = 29" view distance (and ~60 PPD at 4k)
 
Nothing touches the S95B in gaming/HDR, and I've had about everything. It's like a PG32 and an LG OLED had a baby that dropped all of the negatives and kept all of the positives of both. Just get a big desk, push the S95B back, and rock n' roll. Absolute STEAL at $1,449.
I’m glad you mentioned price. Remember back in the day when new plasmas cost the price of a car… That we could have a display that trounces them in every sense for only a minuscule fraction of the price (WITH inflation factored in!) is miraculous.
 
Uh oh. Vega and l88bastard combined giving glowing kudos to hardware is never good news potentially for our wallets around here. ;) I do have some space issues but I'm definitely keeping that Samsung in mind.
And me.

OLED technologies are my primary screens nowadays.

55" light torch. How on Earth could you view that in person and not hurt your eyes?! It's also way too big for a normal gaming setup.
It's just a matter of adjusting viewing distance. Some of us replace multimonitor with a single large screen, and a 55" panel is just like a 2x2 matrix of 27" monitors. When playing games, sit back a bit -- use a deep desk, etc. A view distance of roughly 1 meter is recommended, give or take, for the 42"-55" class of TV-as-monitors.

As for brightness, there are a lot of variables. Many OLEDs are generally pretty easy on the eyes compared to some harsh high-blue LED backlights that still have blue-tinted blacks even with a low-blue-light mode (which only helps colors above near-black) -- it depends on the model and what your eyes are sensitive to.

Citation: I currently do Visual Studio software development on an OLED -- my Visual Studio windows are small. Post-2020 OLEDs at correct settings are burn-in-resistant enough now for that fun stuff. Guess that answers your question? ;)

Windows 11 makes "multimonitor simulator on a single screen" easy and instant:

[Image: 1670710206451.png]


While multimonitor is still useful, for common use cases like mine who the [BLEEP] needs multimonitor anymore when we have that multimonitor-simulator convenience? Multimonitor adds lag and stutter to gaming. I don't want that. My desktop is black and devoid of icons, so a 27" window looks like a 27" monitor with perfect pitch black surrounding it.

When I shrink a window on a pitch-black-desktop OLED (with the taskbar hidden), it just looks like a giant 90s-era laptop bezel surrounding the window. And I can go bigger or smaller. Essentially, I can add/remove monitors with a single click.

It's just like a black wall with a dynamically resizeable computer monitor and dynamically resizeable bezels.

What is abnormal about using a 42"-55" as a computer monitor (replacing a multimonitor setup of the same screen surface), I ask? SMH if you disagree. It's like criticizing the concept of a smartphone in 2006 before they existed -- try it first (with all the above pluses).

If your desk is deep enough to allow a variable viewing distance between 2 feet and 4 feet, it actually can be heaven. My eyes have LESS eyestrain! Turn the brightness down low and enjoy Dark Mode during Visual Studio, then switch to your bright glowy HDR profile during gaming. Etc.

I've now been developing on OLEDs long enough to be convinced that burn-in worries are a thing of the past (at least at my settings on my OLEDs). Yes, there's the usual temporary image retention, but nothing permanent after 10 minutes of blank screen or video content. Too many people confuse the two.

Not all of us need 23-27" to maximize peripheral-vision performance in order to earn money in esports. The gaming-monitor market is extremely huge and covers a lot more casual gaming and non-gaming use cases, including very nice ghost-free 120Hz scrolling that is easy on the eyes.

I still love LCDs (they still have less motion blur when using strobing, and they are still much more lag-optimized for esports) -- but if you are a fan of general purpose computing, fantastic all-around performance, and strobeless blur reduction (brute framerate-based motion blur reduction), thanks to the upcoming 2023 boom of OLED, I can confirm OLED is the way to go due to their superlative best-possible sample-and-hold performance.
 
Anyone who actually plays competitive games won't like this monitor.
It's only about as bright as HDR 100, and it flickers.
It cannot display games with more than 10 times its dynamic range.

There is no way this monitor can showcase Battlefield 4 from a decade ago.
 
Anyone who actually plays competitive games won't like this monitor.
It's only about as bright as HDR 100, and it flickers.
It cannot display games with more than 10 times its dynamic range.

There is no way this monitor can showcase Battlefield 4 from a decade ago.

Due to their still higher-than-LCD latency, not all OLEDs are ready for all forms of competitive gaming.

However, it's a strawman to exaggerate the HDR that low, or to exaggerate the flicker. Sure, FALD LCD can get brighter. But HDR 100? You're like a stock shorter exaggerating a disadvantage -- it's time to mythbust these exaggerations. HARD. With brute force.

First salvo.

For example, TN LCDs flicker way more than current OLEDs do. The inversion-artifact flicker and the 6-bit FRC flicker bother some people much more than the very faint once-a-refresh-cycle ultrashort (dimmed for <0.5ms) OLED flicker, and it's only a rolling-scan flicker:

OLED high speed video:

(960fps high speed video of OLED, with that tiny dimmed rolling-bar -- this is the OLED that kramnelis is wildly exaggerating)

Meanwhile, that ignores some of the worst-flickering culprits among LCDs (i.e. the TN LCD, the panel famous in esports, even with strobing OFF).

Fast-shutter photo of a TN refresh cycle:
[Image: qLIF1UJ.png]

(Fast-shutter photograph of a typical BenQ TN LCD -- you need 1/1000sec and a few attempts -- freezing the pixel flicker and making it visible in a photo. It's more visible with strobing, but it is still perpetually there in sample-and-hold mode; it just requires a 1/1000sec photo or a 1/1000sec strobe backlight flash -- one or the other -- to make visible.)

There are even people who get eyestrain from the flickering chessboard effect of TN LCD inversion artifacts.

I admonish that kramnelis is currently completely unaware that newer OLEDs can flicker less than certain types of LCDs at the same refresh rate.

(except for lag -- then kramnelis is still correct for most OLEDs ...for now).

There is an element of user preference, but you can't make blanket statements. Right Tool for Right Job. Some people get strain from stutter, or from VSYNC OFF tearing, and so on. No one size fits all.
 
I admonish that kramnelis is currently completely unaware that newer OLEDs can flicker less than certain types of LCDs at the same refresh rate.
I have talked about flicker before.
https://hardforum.com/threads/the-3...s-started-wait-for-it.2002618/post-1045498268

Unlike the ad showcase, newer OLEDs like QD-OLED still flicker as hard as a sub-300Hz refresh rate, enough to cause eye strain. That's why LG doesn't even want to increase the brightness to 200 nits and magnify the eye strain effect. It behaves similarly to the below.


Also, the in-game contrast is much lower.
Unless you want to use the old trick of lifting the black level (sacrificing image quality), it will be harder to spot enemies on a 100-nit monitor compared to a 1000-nit monitor.
[Image: Contrast_2.png]
 

Maybe these arguments are a fad this year for you, but it is a dead horse on my side. I still reiterate what I said.

I will repeat that there are pros/cons to OLED and LCD, and you have to be aware that not all OLEDs flicker hard. On several OLED panels there is only an ultra-brief flicker.

The 200 nits you speak of is full-screen white brightness, similar to the ABL behavior of a CRT/plasma. ABL can be a very desirable behavior, preventing a display from becoming too bright in a dark room, since many computer users back off settings and we only want the high-nit pixels as highlights in mostly dark scenes. Real-world OLED HDR actually still outperforms the typical average 300-nit LCD. That being said, OLEDs go low-nit when displaying a 100% white field, but movies/games aren't 100% white fields. Just because one OLED only hits 200 nits on certain content doesn't mean it has only 200 nits at a 5% window size. Also remember ANSI contrast is a 50% window and is not the right test pattern to measure HDR peak brightness; you need HDR-specific test patterns. You also need to choose the right HDR profile, as many profiles are disgustingly dim or low-saturation (unoptimized) and need to be recalibrated. I remember the days when the first LCDs had bad out-of-box calibration.

Also, remember, a panel capable of going from a 0-nit 100% black field suddenly to a 10,000-nit 100% white field is potentially damaging to vision without mandatory (configurable) ABL behavior. The HDR peaking is supposed to be done on tiny window sizes (e.g. <1% of pixels, like crazy-bright sunlight reflections off a 1957 Chevy). Sure, OLEDs have more conservative ABL behavior than the typical FALD LCD, but painting ABL as evil is not The Right Way of going about things, considering that ABL is also a safeguard both for the human side (vision risk) and the device side (power, voltages, etc). That being said, ABL should be configurable by the user, to various kinds of standards (e.g. 1% window, 5% window, 10% window, etc).

There are over 100 causes of monitor eyestrain other than flicker and blue light, the flicker of an OLED is far outweighed by the many ergonomic line-items it eliminates, and cherry-picking a greatly-reduced flicker (versus those tests) is like arguing about a paper cut when your leg is already broken and needs attention. If you've ever watched the right forums, you've seen so many people complain about various display issues, and you'd realize that flicker is not the be-all-end-all when its discomfort threshold falls well below the discomfort of other ergonomic issues.

So, having explained this:

We are not discrediting existing tests -- but you're cherrypicking from a big buffet, full stop, mic drop. So you got a cherrypicked reply as a counterbalance.

I will remind:

1. You can find a specific LCD that flickers way less than a specific OLED
2. You can find specific OLED that flickers way less than a specific LCD

Flicker behaviours can be very different between different-size OLEDs, different brands, etc. One OLED can flicker 10x less than the next OLED.

Just because you had a bad experience with one model doesn't mean you should paint all OLEDs with the same brush. By cherrypicking (correct) images of one flawed OLED, you attempt to spread that gospel to all other OLEDs. There are bad LCDs and there are good LCDs.

The image quality of LCDs and OLEDs both have pro/con line items, and you're merely having a field day focusing on the bad ones. In many use cases OLEDs can achieve better image quality than LCDs can, and in other use cases LCDs can achieve better image quality than OLEDs can. No display is a jack of all trades, but OLED will be quite a popular technology in the 2020s; things have improved from the smeary LCDs of 1992 to the excellent LCDs of 2022, and OLEDs are clearly following a very interesting improvement path of their own.

If you don't mind strobing, then it is of course true that no OLED yet reaches the motion clarity of the pinnacle of LCD strobe tuning -- say a good VR LCD such as the Valve Index LCD or an Oculus Rift LCD, or a custom-DIY-recalibrated (QFT mode) ViewSonic XG2431 (able to achieve true real-world 0.3ms MPRT with no strobe crosstalk). And we've known LCD can achieve 10,000-nit brightness since Sony's 2018 FALD LCD prototype. As you're already aware, I like both technologies for very different reasons.

Anybody who narrowscopes to just CRT, just LCD, or just OLED, without an open mind, is just risking self-sabotaging their reputation.

'Nuff said.

Have a great weekend. Cheers!
 
Maybe these arguments are a fad this year for you, but it is a dead horse on my side. I still reiterate what I said.

I will repeat that there are pros/cons to OLED and LCD, and you have to be aware that not all OLEDs flicker hard. On several OLED panels there is only an ultra-brief flicker.

The 200 nits you speak of is full-screen white brightness, similar to the ABL behavior of a CRT/plasma.
Just FYI, this guy goes on and on and on about "200nits" and "OLED flicker". He invaded the Alienware OLED thread and just kept going on about it and hating on anyone who dared to like the monitor. You aren't going to change his mind; he just has some weird hang-up on this and feels the need to repeat it over and over and get mad at everyone who doesn't agree that OLED sucks.
 
Real-world OLED HDR actually still outperforms the typical average 300-nit LCD.
99.99% of monitors are average LCDs.

I haven't seen one OLED that doesn't flicker.

Compared to a good LCD, all of these OLEDs don't even look as good in SDR. In HDR, there is ABL all along.

In the end, OLED is just an SDR-tier product, like this 2K 240Hz 150-nit monitor. A competitive player like Spoit won't even consider this monitor at all.
 
Just FYI, this guy goes on and on and on about "200nits" and "OLED flicker". He invaded the Alienware OLED thread and just kept going on about it and hating on anyone who dared to like the monitor. You aren't going to change his mind; he just has some weird hang-up on this and feels the need to repeat it over and over and get mad at everyone who doesn't agree that OLED sucks.
I don't give up. It doesn't stop me from counter-filling the threads in proportion to him, anyway.

You're talking to a person who has been mythbusting the "humans can't tell apart 30fps vs 60fps" argument since 1993.

[Image: 1670724138358.png]


Can I tell that some OLEDs flicker? Yes. Certain models of smartphones, with brightness lowered WAY down.

But for some of the best lowest-flicker models? No. I can't even tell.

...Unless I am trying to do a maximum-stroboscopic-amplification test for indirect detection of flicker -- e.g. a synthetic ultrafast-moving www.testufo.com/persistence (configured at 1 pixel thickness + motion of 16 pixels/frame), where you see ultrathin brightness-reduced lines as a telltale sign of the sub-1ms brightness drop between OLED pixel refresh cycles. On some OLED panels, that brightness-drop flicker is under 0.05ms, never drops to 0, and occurs only once per refresh cycle; at high refresh rates it becomes even harder to notice -- harder to notice than other attributes.

Flicker on a specific OLED bothers you? Fine. Get the panel that your eyes prefer. But cherrypicking 1 of >100 ergonomic issues, especially when the flicker is ultra-subtle -- even less visible than the sinewave AC soft-flicker of an incandescent light bulb? Not all human eyes see the same.

You can be more bothered by one or more lineitems such as by flicker, by stutter, by antiglare film, by too-low brightness, by too-high brightness, by motion blur, by blue light, by display size, by viewing distance, by eye focus issues (not all eyes have same prescription), by jitter, by tearing, by excess eye movements, by motion sickness, by colorblindness issues (12% population), or any other line items -- many far worse than subtle invisible flicker.

The edge-flicker effect of stutter (even 60fps stutter) is a far worse eye-searing flicker than the sub-1ms between-refresh-cycle OLED flicker, so fixating on the latter misses a worse flicker caused by a finite frame rate. When you plot a figurative napkin graph of the discomfort of a display's own flicker versus the discomfort of stutter edge-flicker, the curves cross over, and OLED is far beyond that crossover point -- other discomforts occur well before the OLED flicker itself nowadays (at least for the best OLEDs). Talk about titanic missing-the-forest-for-the-trees.

Stutter-To-Blur Continuum Edge-Flicker Animation Demo

(A more common cause of eyestrain, that is never written about... yet)

Animation Demo Of This --> SEE FOR YOURSELF --> www.testufo.com/eyetracking#speed=-1 (Varying Frame Rate Edition, look at 2nd UFO for 30 seconds).
1. Test this on an LCD at same Hz
2. Test this on an OLED at same Hz

Remember:
  • The stutter-to-blur continuum flickers until around ~45-50fps on LCDs
  • The stutter-to-blur continuum flickers until around ~70-85fps on OLEDs.
Things stutter on OLEDs until a higher frame rate simply because of OLED's ultrafast GtG pixel response (instantaneous to human eyes). Regular stutters (not erratic/judder) have edge-flicker that corresponds to your flicker fusion threshold. To understand the stutter-to-blur continuum, watch the 2nd UFO for 30 seconds at www.testufo.com/eyetracking#speed=-1 where low frame rates stutter and high frame rates blur.
  • It's akin to how a slowly vibrating music string visibly vibrates
  • It's akin to how a fast-vibrating music string blends into a blur
Observe that things stutter until a higher frame rate on fast-GtG sample and hold displays (old LCD vs newer LCD, or LCD versus OLED, or MicroLED).

You may observe that the stutter-to-blur crossover lands on opposite sides of 60fps for many people. This varies with your personal flicker-fusion threshold and can fall outside these ranges, but the range-shifting principle is the same. That's why 60fps looks smooth on LCDs but does not look smooth on OLEDs to some people -- you need to push the frame rate higher to keep it above your flicker-fusion threshold and avoid the stutter-edge-flicker effect. So many people chicken-little about the wrong OLED ergonomics when we actually know bigger truths beyond that.
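
A trivially small illustration of that point (the thresholds below are placeholders drawn from the ballpark ranges quoted in this post, not measurements):

```python
# Stutter-to-blur continuum: regular low-framerate motion reads as edge-flicker
# (stutter) below the viewer's flicker fusion threshold, and blends into blur above it.

def stutter_or_blur(frame_rate_hz, flicker_fusion_hz):
    return "stutter (edge-flicker)" if frame_rate_hz < flicker_fusion_hz else "motion blur"

print(stutter_or_blur(60, flicker_fusion_hz=50))   # 'motion blur'            -- slower-GtG LCD ballpark
print(stutter_or_blur(60, flicker_fusion_hz=80))   # 'stutter (edge-flicker)' -- 0ms-GtG OLED ballpark
```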

We are one of the first people in the world to discover this "shifted threshold" of the stutter to blur continuum; we will write about it in 2023.

But the speed of pixel response (GtG) shifts the flicker fusion threshold of the stutter visibility. Test this at sample-and-hold (non-strobed LCD, non-strobed OLED).

OLED is faster (0ms GtG), which raises the ergonomic frame-rate requirement somewhat. Some people are not bothered, but if you get stutter eyestrain at 20fps on an LCD, you could still get stutter eyestrain at 35fps on an OLED -- you need a slightly higher frame rate to compensate for the 0ms-GtG fastness (which amplifies stutter edge-flicker visibility).

Avoid playing 30fps on OLED at close distances if you get stutter eyestrain -- the 30 Hz flicker of edge-stutter is AWFUL at 0ms GtG.

This will be a problem even if OLED flicker is eliminated. It is by far the bigger cause of eyestrain on OLEDs, and it is why 60fps looks smooth on LCD and sometimes harsh and stuttery on OLED. You need a higher frame rate at a higher refresh rate to push your frame rate beyond your flicker fusion threshold. It's funny how people see the OLED flicker test and scapegoat that instead of the bigger cause of eyestrain (the flickery edge-vibration of stutter on 0ms-GtG displays while eye-tracking a moving object). Slow GtG turns the sawtooth flicker of stutter edge-vibration into a softer slanted-sinewave flicker -- and thus makes it harder to see and less likely to cause eyestrain.

The bottom line is that giant numbers of people scapegoat the wrong eyestrain cause, and over the years I've become an expert at diagnosing eyestrain causes -- it forks into over 100 causes. We even discovered that for some people, strobing reduces eyestrain (because they had more eyestrain from display motion blur). Everybody sees differently, after all. Do you know how many people have come to us because of motion blur eyestrain problems? Thousands. And we discovered supplementary causes -- like stutter eyestrain, and why low frame rates on OLEDs sometimes create eyestrain. Read the above.

We've long known that strobe backlights amplify stutter (which is why, for some people, framerate=Hz VSYNC ON or its low-lag clones like RTSS Scanline Sync or Special-K Latent Sync reduces eyestrain by zeroing out strobe-amplified stutter).

But we also found out that faster GtG pixel response also amplifies stutters too!

If you're buying a 240Hz OLED, you're definitely not going to undersize your GPU to only 30fps on that OLED, and you can configure more brute framerate. You can RTX Cyberpunk 2077 at 100+fps with some DLSS and be done with it -- and not worry about OLED stutter-edge-flicker eyestrain (a TOTALLY DIFFERENT THING from that YouTube video, and a much more common cause of OLED eyestrain).

The prescription is simply to use high frame rates on OLEDs, and the problem is solved. Middle frame rates on OLEDs can create eyestrain for some people SIMPLY because of the harsher stutter visibility caused by 0ms GtG.

I will call people out if they crap up a discussion thread I'm in by bible-thumping something, much like some people do with the RTINGS burn-in tests (great tests, correctly done, but often abused as an excuse not to buy newer, more burn-in-resistant OLEDs, etc).

At 140wpm, I can out-reply. I can tell when people don't read my posts.

After all, I never give up. I've been doing this since 1993, long before Blur Busters existed.
 
lol, most "competitive players" have a terrible understanding of computer and display tech, so that is hardly any indicator of where value lies.
I don't see many people who understand displays, either.

These competitive players are trained to win games with the lowest graphics settings and the lowest rendering latency.

You think they don't understand the display?
 
I don't see many people who understand displays, either.

These competitive players are trained to win games with the lowest graphics settings and the lowest rendering latency.

You think they don't understand the display?
They understand how to configure the settings, but they don't understand the Present()-to-photons black box that Blur Busters does.

Like the subtle nuances of different latency gradients created by:

VSYNC OFF + nonstrobed creates top == center == bottom in input lag
VSYNC OFF + strobed creates top > center > bottom in input lag
VSYNC ON + nonstrobed creates top < center < bottom in input lag
VSYNC ON + strobed creates top == center == bottom in input lag

The first and the last are superior in different contexts (e.g. CS:GO versus VR).

The global strobe versus scanout creates the latency-gradient change, because not all pixels on an LCD refresh at the same time, and an LCD strobe is global (unlike a CRT rolling strobe or a similar OLED rolling-strobe algorithm).
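
Here's my own simplified model of why those four combinations behave that way (idealized: zero render/processing time, scanout takes one full refresh, and a global strobe flashes at the end of scanout):

```python
# Latency gradient (top/center/bottom of screen) for the four sync/strobe combinations.

def lag_ms(y, refresh_hz, vsync_on, strobed):
    """Input-to-photon lag for a pixel at vertical position y (0 = top, 1 = bottom)."""
    T = 1000.0 / refresh_hz
    input_time = 0.0 if vsync_on else y * T      # VSYNC OFF renders slices just-in-time
    photon_time = T if strobed else y * T        # a global strobe lights every pixel at once
    return photon_time - input_time

for vsync_on in (False, True):
    for strobed in (False, True):
        grad = [round(lag_ms(y, 240, vsync_on, strobed), 2) for y in (0.0, 0.5, 1.0)]
        print(f"VSYNC {'ON ' if vsync_on else 'OFF'} + {'strobed' if strobed else 'nonstrobed'}: "
              f"top/center/bottom = {grad} ms")
```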

Also, an OLED rolling strobe can preserve the same latency gradient for strobed as for non-strobed, assuming OLED manufacturers implement it correctly. Then strobing becomes vastly superior (at the same lag) compared to LCD, but only if done correctly and if they eventually equalize pixel-for-pixel lag (GPU output to photons). At the end of the day, brute framerate-based motion blur reduction is superior to strobing. However, we understand why strobing is sometimes suboptimal for esports unless you're using RTSS Scanline Sync (or Special K Latent Sync) as a tearingless VSYNC OFF system (which simulates VSYNC ON via VSYNC OFF by steering tearlines off the edge of the screen). That's why LCD strobing + scanline sync is a match made in heaven for motion-critical play like panning / scrolling / turning, especially in crosshairless games where your eyes are always tracking moving objects -- unlike strobed VSYNC OFF, whose non-equalized latency gradient (top edge of screen through bottom edge) is also another contributory cause of strobe-amplified jitter (in addition to the lack of motion blur), and that jitter is another major cause of eyestrain for some people who use strobing.

We are experts at the different sections of the input lag chain, and at why different configurations are superior for different use cases.

Also, the human is part of the input lag chain:

[Image: 1670726159209.png]


If, for a specific game (e.g. strobing during Rainbow Six), a display technology improves human reaction time for typical game tactics by more than the input lag it adds, then you still win.

That's why a not-too-long-ago Rainbow Six world champion (I think it was 2019 or 2020) uses strobing, but CS:GO champions rarely use strobing. Staring at a crosshair all the time often means strobing has little benefit, but eye-tracking during strobing gets major motion blur reductions (identifying camouflaged enemies faster, or identifying targets during fast turns, etc).

Configuring a display enhancing technology can reduce human reaction time in some games:
--> Strobing tech (reduce blur)
--> VRR tech (reduce stutter)
--> Better blacks (OLED/FALD)
--> Faster pixel response
--> Latency stabilizing tech.
--> etc.

Using the Right Tool for Right Job can mean that intentionally increasing display lag reduces overall human reaction time, if you do it strategically for specific games where a display-enhancing technology creates reaction-time gains that exceed the lag increase. Some smart esports players do this in some leagues.

The human is not measured as part of a display test (e.g. photodiode oscilloscope), but the lag advantages of reducing human reaction time exist. That's why some champions use strobing or VRR in certain games.

Example: fighting-game esports a la Street Fighter / Tekken and their recently developed clones/descendants -- a big community that most of the CS:GO world is completely unaware of. Some fighting-game players use VRR as a low-lag VSYNC technology, because fighting games feel more consistent when gametime-vs-photontime is consistent, without the strange effects VSYNC OFF has specifically in Tekken / Street Fighter style games, where the action usually sits on the same horizontal level (the enemy is at the same elevation, near the screen's horizontal center, and whole pixel rows have the same lag, unlike top versus bottom). Latency often feels strange with VSYNC OFF when the game's framerate cap is not perfectly synchronized with the refresh rate, causing a sawtooth-cyclic latency effect.

Switching to VRR completely solves this by allowing the monitor to sync perfectly to the game's self-capped framerate. You'd prefer a solid, fixed 7ms latency to a slowly sawtoothing [3ms...11ms] latency lottery from timing VSYNC OFF while the monitor runs at 60.03Hz and the game tries to self-cap at 59.974Hz. You can see at www.testufo.com/refreshrate that your refresh rate is never exact, since GPU and CPU clocks slew against each other slightly -- even computer heating can cause one clock to tick slightly faster or slower -- causing slews between CPU framerate caps and GPU-driven display clocks.

Many love strobing (we are Blur Busters), but we know that some prefer the best brute framerate-based motion blur reduction, and OLEDs excel at that, especially when refresh rates are high.

Certainly in the old 60Hz era, OLED lag may have been too disadvantageous to be overcome by faster human reaction times, but in the 240Hz era there may very well be specific esports games it's good for. There are no lag measurements yet for all the 240Hz OLEDs, but I'm assuming about one refresh cycle of lag. Good enough for some esports, but it may very well not be good enough for CS:GO (yet) if the panel is not yet capable of sub-refresh processing (that's being worked on industry-wide).

Few esports players use VRR because of the VRR latency problem, but a little-known thing is that G-SYNC becomes esports quality when your uncapped framerates stay completely within the VRR range; e.g. 360-500Hz lets CS:GO framerates sit entirely within VRR range, with ultrafast scanout minimizing the latency-gradient problem (top vs center vs bottom) regardless of fluctuating frame rate. So if you want to do esports with VRR, buy the highest VRRmax you can afford, even if your game runs at only 100fps, so that your VRR range is beyond the framerate range of all the competitive games you play, while also having ultralow scanout latency (a common lag bottleneck of VRR in the past). You want to blast those 100fps frames out in 1/500sec each; no matter what framerate you're running, the lag delta of (top, bottom) never exceeds 2ms on a 500Hz display. Sync technologies (VSYNC ON, VSYNC OFF, VRR) gradually converge to identicalness (blurless, stutterless, lagless) the higher in Hz you go (...500, 1000, 2000Hz...), given that sync technologies exist merely because frame rates are finite and their side effects are visible at current refresh rates and frame rates. Then the pros of VRR start greatly outweighing non-VRR in some esports contexts, when used in a Right Tool For Right Job manner.

Anyway:

Smarter esports players know the Right Tool For Right Job.
Some champs in some leagues intentionally increased display lag (for a display enhancement tech) to reduce human reaction lag.

Not all esports is CS:GO, after all.

Next debate. Ready, Player One.
 
After all, I never give up. I've been doing this since 1993, long before Blur Busters existed.
Well, good on you man. I get tired of it after a while. I try to help people out, but when they are just stuck on something and won't listen, I move on. I'm with you on the OLED flicker issue -- I have seen it in some cases, but it certainly isn't noticeable on my TV at all, or my phone. I'm also with you on the higher frame rates. One of the things I like about the new Samsung phones is their 120Hz displays. It just makes scrolling text so much nicer. I mean, even more FPS would be nice, but it is a big improvement over the 60 we had for years.

As an aside, I've noticed that some people think they are really sensitive to flicker and see it in places where it isn't. Had that back in the day with a coworker's wife. When she came into the office she claimed she couldn't stay long because the flicker of the fluorescent lights caused her issues. Now, while it is true that fluorescents with a magnetic ballast will flicker 120 times per second as the AC voltage passes the null, these lights were powered with electronic ballasts cycling at 30kHz. Not only is that beyond human perception, it is so fast the phosphor coating hardly has time to change brightness at all. She didn't believe it when told, though; she claimed that they visibly flickered.
 
As an aside, I've noticed that some people think they are really sensitive to flicker and see it in places where it isn't. Had that back in the day with a coworker's wife. When she came into the office she claimed she couldn't stay long because the flicker of the fluorescent lights caused her issues. Now, while it is true that fluorescents with a magnetic ballast will flicker 120 times per second as the AC voltage passes the null, these lights were powered with electronic ballasts cycling at 30kHz. Not only is that beyond human perception, it is so fast the phosphor coating hardly has time to change brightness at all. She didn't believe it when told, though; she claimed that they visibly flickered.
Yep, I wrote about that in my 1000Hz-Journey article.

From a lighting industry research paper:

[Image: 1670728247421.png]


I don't think that's your coworker's wife's complaint, but don't forget about malfunctioning electronic ballasts, as well as cyclic arcing inside new or partially worn fluorescent tubes. Sometimes it's such a borderline defect, such a subtle flicker (much fainter than video), that some see it and others do not -- e.g. a faint arcing effect.

Anything (a malfunctioning ballast, arcing, an LED driver's PWM supply, or a bad interaction between an LED driver and an old electronic ballast) can flicker at a frequency independent of the ballast, creating situations where the light flickers at under 20,000 Hz.

In addition, some fluorescent wavelengths are very harsh and amplify a pulsating placebo effect (much like how a pounding headache or fast heartbeat can make the whole world seem to flicker if you're very tired, have a headache, or blood briefly leaves your head when suddenly waking up, standing up, and turning on fluorescent lights -- not fainting-league, but enough to cause pulsating vision), especially with bright lights, and especially if you just moved from a dark room to outdoors or a harshly lit room. So you can get perceptions of flicker from causes other than the actual flicker itself.

Whether the cause is biological or stutter, the world definitely over-scapegoats PWM and flicker -- overlapping flicker at multiple frequencies (i.e. different flickering stimuli) can be the actual cause, whether it's the low-frequency component of arcing rather than the high-frequency component of the ballast, or, as it pertains to displays, the stutter-edge-flicker component rather than the actual tiny between-refresh OLED flicker. We look at forests and see how the aggregate of trees may be stressing our vision, not narrowscoping to a single flicker tree.

But then again, PWM can definitely be the culprit. New LED tubes can be problematic if they are the cheap kind! LED "fluorescent tube replacements" have independent power supplies inside them (sometimes with a lower PWM frequency, alas -- like 1,000 or 2,000 Hz -- UGH), and they sometimes have pretty bad PWM behavior with electronic ballasts that weren't removed before replacing the mercury tubes.

A dimmed-down 20,000 Hz short-duty-cycle squarewave on LED tubes is sometimes (in very, very specific cases) noticeable, unlike the soft-wave flicker of fluorescent-phosphor tubes. This would not occur at 100% brightness, but the LEDs are often PWM'd to increase LED lifetime in an enclosed fixture and/or to allow dimming. Or it's caused by a bad interaction with an electronic ballast that should have been removed when the LED tubes were installed. LED phosphor is much faster than fluorescent phosphor, especially in cheap low-CRI tubes, so the flicker is visible at higher thresholds. A tiny pinpoint of light moving at 20,000 centimeters per second can still stroboscopically phantom-array at 1cm intervals, given sufficient brightness of the individual pulses. But this is probably not the cause.

There are many situations where this can be a placebo -- and where it may not be.
 
They understand how to configure the settings, but they don't understand the Present()-to-photons black box that Blur Busters does.

Like the subtle nuances of different latency gradients created by:

VSYNC OFF + nonstrobed creates top == center == bottom in input lag
VSYNC OFF + strobed creates top > center > bottom in input lag
VSYNC ON + nonstrobed creates top < center < bottom in input lag
VSYNC ON + strobed creates top == center == bottom in input lag

The first and the last are superior in different contexts (e.g. CS:GO versus VR).

The global strobe versus the scanout is what changes the latency gradient, because not all pixels on an LCD refresh at the same time, and an LCD strobe backlight flashes globally (unlike a CRT's rolling strobe, or a similar OLED rolling-strobe algorithm).
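To see where those four gradients come from, here's a toy model I sketched (not Blur Busters code): a 240 Hz panel whose scanout sweeps top to bottom over one full refresh, with the strobe modeled as a single global backlight flash at the end of the scanout.

Code:
HZ = 240
T = 1000 / HZ  # refresh period in milliseconds (~4.17 ms); scanout assumed to take the full period

def lag_ms(row_frac: float, vsync_on: bool, strobed: bool) -> float:
    """Present()-to-photons lag for a pixel at row_frac (0 = top, 1 = bottom)."""
    present_time = 0.0 if vsync_on else row_frac * T   # when the pixel's frame data arrived
    photon_time = T if strobed else row_frac * T       # when the pixel actually emits light
    return photon_time - present_time

for vsync_on in (False, True):
    for strobed in (False, True):
        top, center, bottom = (lag_ms(r, vsync_on, strobed) for r in (0.0, 0.5, 1.0))
        label = f"VSYNC {'ON' if vsync_on else 'OFF'} + {'strobed' if strobed else 'nonstrobed'}"
        print(f"{label}: top={top:.2f} ms, center={center:.2f} ms, bottom={bottom:.2f} ms")

Running it reproduces the four cases above: equal lag for VSYNC OFF + nonstrobed and VSYNC ON + strobed, top-heavy lag for VSYNC OFF + strobed, bottom-heavy lag for VSYNC ON + nonstrobed.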

Also, an OLED rolling strobe can preserve the same latency gradient for strobed vs nonstrobed, assuming OLED manufacturers implement it correctly. Then strobing becomes vastly superior (at the same lag) compared to LCD, but only if done correctly, and if they eventually equalize pixel-for-pixel lag (GPU output to photons). At the end of the day, brute framerate-based motion blur reduction is still superior to strobing. However, we understand why strobing is sometimes suboptimal for esports unless they're using RTSS Scanline Sync (or Special K Latent Sync) as a tearingless VSYNC OFF system (one that simulates VSYNC ON via VSYNC OFF by steering tearlines off the edge of the screen). That's why LCD strobing + scanline sync is a match made in heaven for certain motion-critical games heavy on panning / scrolling / turning, especially crosshairless games where your eyes are always moving around tracking objects. This is unlike strobed VSYNC OFF, where the non-equalized latency gradient (top edge of screen through bottom edge of screen) is also a contributory cause of strobe-amplified jitters (in addition to the lack of motion blur), and those jitters are another major cause of eyestrain for some people who use strobing.
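RTSS Scanline Sync and Special K Latent Sync are closed implementations, but the basic tearline-steering idea can be sketched like this. This is my own illustrative Python, not their code: get_scanline() stands in for a platform raster-position query (on Windows, something like D3DKMTGetScanLine), and the scanline numbers are made-up 1080p-style timing.

Code:
import time

# Hypothetical raster query -- on Windows this would wrap a call such as
# D3DKMTGetScanLine(); the platform-specific plumbing is omitted here.
def get_scanline() -> int:
    raise NotImplementedError("platform-specific raster position query")

TOTAL_LINES = 1125   # assumed total scanlines per refresh, incl. blanking
TEAR_TARGET = 1085   # aim the tearline inside the vertical blanking interval

def present_with_scanline_sync(present_frame) -> None:
    """Tearingless VSYNC OFF: wait until the raster is just inside the
    blanking interval, then Present(), so the tearline lands off-screen."""
    while get_scanline() < TEAR_TARGET:
        time.sleep(0)     # yield; a real implementation spin-waits more precisely
    present_frame()       # VSYNC OFF flip happens while the raster is off-screen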

Anyway:

Smarter esports players know the Right Tool For Right Job.
Some champs in some leagues intentionally increased display lag (for a display enhancement tech) to reduce human reaction lag.

Not all esports is CS:GO, after all.

Next debate. Ready, Player One.
The Right Tool for Right Job.
That's why these competitive players still choose a TN like the XL2566K, despite the XL series being built cheaper with each generation. You also forgot to mention that GPU rendering latency outweighs most of these differences.
Flicker on a specific OLED bothers you? Fine. Get the panel that your eyes prefer. But cherrypicking 1 of >100 ergonomic issues, especially when the flicker is ultra subtle, even less visible than the sinewave AC soft-flicker of an incandescent light bulb? Not all human eyes see the same.
The problem is that these flickers are not visible. You cannot see it, but you can still get eye strain even when the frequency is 100 kHz.
That's the reason the actually good LEDs run at over 300 kHz, while OLED sits at only a few hundred Hz, the same as the refresh rate. You cannot get away with that.
Did you participate in any way in the development of this 240Hz OLED? It is out soon. I will test it out.
 
Guys, I was joking about going blind from the 55" bright oled :p. I didn't think anyone would take that seriously! :)
You are more right than you think. The early firmware S95B OLED can get crazy bright. I've literally had to partly close my eyes/look away during some HDR scenes in a dim room.
 
Windows 11 makes "multimonitor simulator on a single screen" easy and instant:

View attachment 533387

While multimonitor is still useful, for common use cases like mine, who the [BLEEP] needs multimonitor anymore now that we have that multimonitor-simulator convenience? Multimonitor adds lag and stutter to gaming. I don't want that. My desktop is black and devoid of icons, so a 27" window looks like a 27" monitor with perfect pitch black surrounding it.
I had no idea this feature existed. Pretty useful alternative to using Powertoys. Are there shortcut commands to activate these?

Using a single monitor does have a few caveats though:
  • Fullscreen apps. Games, fullscreen YouTube, etc. will fill the whole screen, which can be an issue especially on ultrawides or super-ultrawides, where it would be more useful to spend that extra space on other windows, but fullscreen mode gives you a 16:9 image in the center with black bars. Using windowed mode for games means you have title bars etc. (unless you use some hacks to remove them) and sometimes reduced performance. Borderless window sometimes works with a lower-than-native res and other times acts just like fullscreen. Moving a smaller-than-native borderless window is often difficult too.
  • Virtual desktops. Each monitor gets its own virtual desktop whereas on a single monitor you get only one set. Having multiple sets of virtual desktops is very useful for quickly switching between sets of programs. I wish you could "zone" virtual desktops to portions of the screen.
I currently use 3 displays for work. A 28" 4K IPS panel, a 16" Macbook Pro and a 12.9" iPad Pro stacked above the MBP. Each with their own virtual desktops. Works well, is not too big, is quite high res. The 2019 MBP is just a pile of garbage though and struggles with the 4K display. Hoping to swap it for a M2 model next year as this is a laptop provided by my workplace.

I hope to change the 28" to a 32-42" display at some point if the right product comes along (better 4K mini-LED 32", 4K OLED 32", curved 42" OLED/mini-LED etc).
 
What competitive gamers do tends to be a "this famous, talented player does thing X, so I will also do X" type of arms race, often also company-sponsored. In reality these gamers are just plain good at the game, and whether they use a 120 Hz or 360 Hz display would not reduce their performance to the point where they would stop dominating. Will it matter at the very highest levels? Maybe, but most players do not play at that level or anywhere even close to it.

Flickering or any other eye strain issue is hugely personal, and which ones affect you varies a ton. So there's no single "because tech X does this, it's an issue for everyone." I have never gotten eye strain from OLEDs, but that doesn't mean you don't. Instead I tend to have issues playing games that have a particular camera movement and no crosshair on screen. For example, Gears of War 5 made me nauseous until I used my display's crosshair feature to draw a focal point on screen at all times. Others would not have this problem at all.
 
No doubt people can win a game on a 60 Hz, 20 ms monitor.
But these esports monitors are specifically built for competition. That's why players use them, not just because they are well trained.
You think you can win a game on your TV?
Too bad nobody here actually plays games.
 
A huge number of people play competitive games on an Xbox or PS5 using a TV of varying quality. I'm just saying that many players are not tech enthusiasts and use what is recommended, popular, what they can afford, what is on sale, or what they already have. Which is very different from a nitpicky enthusiast community like this one.

I can't stand toxic multiplayer gaming communities anymore so I don't play multiplayer games unless it's with friends. Mostly I play single player games. Currently playing Ghost of Tsushima on my PS5 and 4K OLED TV, next up planning to replay Witcher 3 with the next gen upgrade on PC, Cyberpunk 2077, Plague Tale: Requiem, Return to Monkey Island, Disco Elysium, Resident Evil 3, God of War Ragnarok...more games than I have time to play out there!
 
Reading it all. While I understand that some of the stances are being presented in a point/counterpoint fashion, if not downright argument at times in these threads, I really appreciate the time everyone takes to post the information, as one can learn (and even re-learn, as the case may be) a lot from these kinds of discussions as more and more information is delivered.

Going back several replies and focusing for a moment on one facet that mark R. and kasakka just mentioned: the Windows popup window-arrangement thing is very cool and brings that kind of functionality to masses of people who probably didn't use any 3rd-party apps to do similar. Personally I use a Stream Deck's window-management plugins with some DisplayFusion functions thrown in, and have been for years. It's good to see that kind of thing become more standardized and be available on a default Windows install or pre-built rig/laptop, even if simplified in function, since it works without any configuration effort.

However, the Stream Deck and/or DisplayFusion method allows you a lot more control, and you can do a lot more with it beyond Windows' templates. DisplayFusion also incidentally allows you to remove the borders from windowed-mode apps/games (using a downloadable custom function), and swap back and forth with a hotkey/button toggle multi-press. It can also do virtual screens with dimensions you set up, and you can swap between different virtual screen setups on the fly -- or you can just make sets of saved window positions on a regular full screen/array that you can activate on the fly, which will shuffle your apps' window positions out or back after you've moved something. You can set buttons up to do similar on a per-app basis too, plus launch apps, etc. Best used via a Stream Deck's buttons imo.

With the Stream Deck window-management method you can set up your own app window sizes and locations, so you can tile your desktop windows however you want, and deal them out or teleport them back with the press of a button.


I've been doing big verticals with 43" 4K 60 Hz Samsung VAs (NU6900, 6100:1 contrast) bookending each side of my main screen for several years now. I call them "the two towers".

It's neat to see more people getting exposed to large vertical screens as usage scenarios nowadays, with ultrawides, that Ark screen's portrait mode and built-in window management, Windows 11's easy snap-to window management system, etc.

That's pretty much my reply but in case anyone might be interested or benefit from it, here is some info about how I do window management on my tall screens below.

.........................................................................

Easier Stream Deck + handful of stream deck addons method w/o displayfusion:

.........................................................................

You can do most of what is in the DisplayFusion section below more easily with some of the available plug-ins for a Stream Deck, without having to use DisplayFusion:

https://altarofgaming.com/stream-deck-guide-faq

Navigate to "More Actions…" and install the following plugins to your Stream Deck: Advanced Launcher & Windows Mover & Resizer. Advanced Launcher, will not only let you pick the application you want to run, but you can also choose to Run as Administrator, Limit the number of running instances, Kill the existing instances – as well as the best one of all – set your unique arguments on the launcher!

Windows Mover & Resizer on the other hand, takes productivity to a WHOLE new level! The macro will apply to either your currently selected window, or a specific application you define, and it will let you choose the exact monitor, position & size you want the window to take on the click of your button! This. Is. Sick!

And what's the last piece of this glorious puzzle? Stream Deck Multi Actions! Simply combine Advanced Launcher & Windows Mover into a Multi Action, where you first launch the application with the exact settings you need, then it gets automatically positioned in the exact coordinates and size you need!

Bringing us to the last huge step! Creating a HUGE multi action that will instantly launch a "Workspace" for you! Launch your game, audio output & input settings, stream starting soon scene, face camera, lights and whatever else you can dream of, in the press of a button! Oh yes, Stream Deck can get THAT good!
*note by me: you can also set up different sets of those "saved window position profiles" in that last step (or via DisplayFusion Pro + the DisplayFusion hotkey mapped to a Stream Deck button). That way, you can hit different buttons, or multi-press/toggle a single button, to shuffle between different window layouts for your apps.


....................................................................

Stream Deck + Displayfusion Methods

....................................................................

I use a Stream Deck's plugins combined with DisplayFusion functions to open and set all of my apps' "home" locations across three screens. Cobbling together a few existing functions in the DisplayFusion library, I can hit a single app icon key several times to: check whether the app is open and, if not, launch it; check whether the app is minimized and, if not, minimize it, or if it is, restore it to its home position. That way I can hit the button once to launch an app, or hit it a few times to shuffle the app minimized/restored to its home position. I also have a bunch of Stream Deck buttons that will move whatever window is active to a bunch of pre-set locations.
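For anyone who wants to script the same launch / restore-to-home / minimize toggle without DisplayFusion, here's a rough Windows-only sketch using standard Win32 calls through ctypes. The app path, window title, and "home" geometry below are just placeholders, not any particular setup.

Code:
import ctypes
import subprocess

user32 = ctypes.windll.user32
SW_MINIMIZE, SW_RESTORE = 6, 9

# Placeholder app: the exe path, window title and "home" position/size
# are illustrative assumptions only.
APP_EXE = r"C:\Program Files\Notepad++\notepad++.exe"
WINDOW_TITLE = "new 1 - Notepad++"
HOME = (100, 100, 1200, 800)  # x, y, width, height

def launch_or_toggle() -> None:
    hwnd = user32.FindWindowW(None, WINDOW_TITLE)
    if not hwnd:
        subprocess.Popen([APP_EXE])              # not running yet -> launch it
    elif user32.IsIconic(hwnd):
        user32.ShowWindow(hwnd, SW_RESTORE)      # minimized -> restore...
        x, y, w, h = HOME
        user32.MoveWindow(hwnd, x, y, w, h, True)  # ...and snap to its home spot
    else:
        user32.ShowWindow(hwnd, SW_MINIMIZE)     # visible -> tuck it away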



I also have a button set to an overall saved window position profile, so that once all of my apps are launched, or any time I move them from their home positions, I can hit one button and they'll all shuffle back to where I saved them in the window position profile. (I can also save more than one position profile.)

I keep the main taskbar dragged off of my primary screens to the top of one of my side portrait-mode screens. I use Translucent Taskbar to make it transparent, and I set up the TaskbarHider app to show/hide the taskbar as a toggle via hotkey/Stream Deck button. That locks the taskbar away, or shows it, via hotkey/button rather than relying on mouse-over. I can still hit Win+S, type two letters, and hit Enter for anything I have yet to map to a page of my Stream Deck's buttons. I can use Win+Tab to page between tiles of all apps/windows (DisplayFusion can optionally limit which app thumbnails are shown in that popup overlay to the apps open on the currently active screen), but I can usually just do that with each app's button as I outlined above, so I rarely need to. The Start menu button is always available too, obviously, but again I have little need for it with Win+S.
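The show/hide toggle itself boils down to a couple of Win32 calls. A minimal sketch, assuming the primary taskbar's standard "Shell_TrayWnd" window class and ignoring secondary-monitor taskbars, bound to whatever hotkey or Stream Deck button you like:

Code:
import ctypes

user32 = ctypes.windll.user32
SW_HIDE, SW_SHOW = 0, 5

def toggle_taskbar() -> None:
    # The primary Windows taskbar lives in a window of class "Shell_TrayWnd".
    hwnd = user32.FindWindowW("Shell_TrayWnd", None)
    if hwnd:
        visible = user32.IsWindowVisible(hwnd)
        user32.ShowWindow(hwnd, SW_HIDE if visible else SW_SHOW)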

.................................

I highly recommend a Stream Deck to break away from the whole Windows kiosk interface even more. It'll change your life.



. . . . . .

While I agree that a single OLED screen in place of a multi-monitor array is possible even now, personally I feel it won't be optimal for me until we get something like an 8K 55" (1000R curved, preferably) screen. Then I'd be able to get four zones / quads each worth a 4K desktop's real estate out of it, and at high PPD, rather than quads worth of 1080p. So it would really be able to replace multiple 4K screens 1:1, and running a 32:10 ultrawide resolution on it, rather than 21:10, would be viable for me since 32:10 wouldn't be limited to 1200 px tall anymore either. The 4K Samsung Ark was a step in that direction in some facets, but it missed the mark on a lot of technical points and is much overpriced. It gets around ~62 PPD at the focal point of the curve, which is OK with aggressive AA in games and massaged or alternate text sub-sampling methods, but it's not great imo. (There is no AA for 2D desktop graphics and imagery outside of text sub-sampling either, so it goes uncompensated in most apps and images.)

As it is now, a lot of people sit around 24" or so from a 42" screen (or the equivalent for larger screens). A ~24" view of a ~42" 4K ends up effectively dropping the PPD from what you'd normally associate with fine 4K pixel density down to something more like what 1500p would look like at traditional, nearer desk distances. If you then use Windows scaling to compensate for the more pixelated, lower-PPD text, you are also dropping the effective desktop real estate down to roughly what you'd get with 1500p. It's certainly usable like that, as many of us used 27" 1080p and 27" to 31.5" 1440p screens for years, but it's not great in an era where we can get fine pixels via 4K+ screens viewed at optimal distances. Aggressive AA and massaged text sub-sampling really don't start to compensate enough imo until around 60 PPD, and on the uncompensated 2D desktop even higher is desirable.
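If you want to check these PPD ballparks yourself, here is a quick flat-screen approximation (it ignores curvature, so treat the outputs as rough averages across the screen width rather than exact figures):

Code:
import math

def ppd(h_pixels: int, diagonal_in: float, distance_in: float,
        aspect: float = 16 / 9) -> float:
    """Approximate average pixels-per-degree across the screen width."""
    width_in = diagonal_in * aspect / math.hypot(aspect, 1)  # horizontal screen size
    fov_deg = math.degrees(2 * math.atan(width_in / (2 * distance_in)))
    return h_pixels / fov_deg

print(round(ppd(3840, 55, 39.4)))  # ~61-62: Ark-style 55" 4K at ~1000 mm
print(round(ppd(3840, 42, 24)))    # ~51:    42" 4K at a ~24" desk distance
print(round(ppd(3840, 48, 40)))    # ~69-70: 48" 4K viewed from ~40"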



. . . . .

I'm looking forward to, or at least hopeful for, a big (55"?) 8K gaming / "gaming TV" screen as a multi-monitor-array-replacing screen some year in the future. For now a 48" curved 4K would be nice, even if not as capable of replacing an array as an 8K would be to my sensibilities. A 1000R 48" 4K would get 69-70 PPD at 1000R / 1000 mm / ~40" view distance. Not holding my breath or anything, but those kinds of formats would be something to look forward to.
 