27" 240hz OLED Monitor!!! (but there's a catch...)

Going back several replies and focusing for a moment on one facet kasakka just mentioned: the Windows popup window-arrangement thing is very cool and brings that kind of functionality to masses of people who probably never used any 3rd-party apps to do anything similar. Personally I've used a Stream Deck's window-management plugins plus some DisplayFusion functions for years. It's good to see that kind of thing become more standardized and available on a default Windows install or pre-built rig/laptop, even if it's simplified in function, since it requires no configuration effort.

However, the Stream Deck and/or DisplayFusion method gives you a lot more control, and you can do a lot more with it beyond the Windows templates. DisplayFusion also incidentally allows you to remove the borders from windowed-mode apps/games (using a downloadable custom function) and swap back and forth with a hotkey or button-toggle multi-press. It can also create virtual screens with dimensions you set up, and you can swap between different virtual-screen setups on the fly -- or you can just make sets of saved window positions on a regular full screen/array that you can activate on the fly, which will shuffle your apps' window positions around, or back again after you've moved something. You can set buttons up to do similar things on a per-app basis too, plus launching apps, etc. Best case use is via a Stream Deck's buttons imo.
Might be worth its own thread, but could you maybe post some info on how you have your (from what I remember) wild multi-monitor battlestation and window management stuff set up and how you typically use it? I could probably pick up a few things.

I have a super programmable keyboard (the horribly named Ultimate Hacking Keyboard) but have considered adding the new Stream Deck+ to my setup just to have those easy little display buttons so I don't have to remember where everything is.
 
A huge ton of people play competitive games on an Xbox or PS5 using a TV of varying quality. I'm just saying that many players are not tech enthusiasts and use what is recommended, popular, what they can afford, what is on sale, or what they already have. Which is very different from a nit-picky enthusiast community like this one.

I can't stand toxic multiplayer gaming communities anymore so I don't play multiplayer games unless it's with friends. Mostly I play single player games. Currently playing Ghost of Tsushima on my PS5 and 4K OLED TV, next up planning to replay Witcher 3 with the next gen upgrade on PC, Cyberpunk 2077, Plague Tale: Requiem, Return to Monkey Island, Disco Elysium, Resident Evil 3, God of War Ragnarok...more games than I have time to play out there!
Is this a console player moment? How competitive can you be in a single-player game? All you do is compete with yourself repeatedly, after all. As I said, a 60Hz 20ms monitor would be enough.

There are certain console players that are moderately good -- good at precise aim and strategy. In the end, they are not that good compared to the frag PC players with a keyboard and mouse plus an esports monitor built to compete. I don't think you even realize that they have multiple monitors built for different purposes, unlike just using a cheap TV for everything. You say you cannot stand toxic communities. Then there is no pressure you can take in ranked play when your competitors sandbag you 100 times a day.

So much for the Right tool for the Right Job.
 
Real competitive gaming and tournaments happen on a LAN imo, not online.

You can "bench test" system specs locally -- tiny differences in already-low input lag, much higher fpsHz than 120fpsHz, etc. -- but playing online, due to its limitations, more than muddies the results, making what can be very appreciable local gains far less meaningful in online gaming compared to LAN play, single-player games, or matches vs. local bots. It's just the nature of online gaming and server code. These specs are marketed in advertising as if there is a 1:1 relationship when playing competitive games online. That is false.

You have two sides going on for every player's machine: the local simulation, and what the server manufactures as the result at its own rate. This inequivalence is happening all the time, but it's only in the worst, most obnoxious occurrences -- when the man behind the curtain is shown more obviously, so to speak -- that people call it out as "rubberbanding", "peeker's advantage", or things like high-fire-rate weapons landing across fewer ticks and producing "super bullets" (like time slips in movies), among other issues. The thing is, online gaming is always rubberbanding and throwing manufactured results back at you; it's just a matter of how long or short that rubberband is, and what kind of guesswork biases are coded in one way or another.

That's not even going into the rampant cheating in online "competitive" games. Cheaters are smart enough to dial the cheats down to low-key, somewhat-believable advantages in many cases, and also use them to carry or ladder up their bros/teammates. There are a lot of articles on famously high numbers of cheaters logged by gaming companies in popular games, as well as highly ranked, popular competitive individuals getting busted by the game companies or, stupidly, on stream. :LOL:

(2021) There are plenty more articles than these available, and for other games, but it's a telling landscape. There are also the paid cheat-service companies' numbers, which I didn't post here.
------------
Fortnite

I think Fortnite's tick rate is 30, and interpolation (the "interp 2" setting) adds roughly double your ping time on top of that in normal gameplay, with interpolation dialed back by the server (unless you want to suffer a huge ~250ms penalty on any latency spike).
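
To put rough numbers on that (a back-of-napkin sketch; the tick rate is the one cited above, while the function name, ping value, and 2-tick interpolation buffer are just illustrative assumptions):

```python
# Back-of-napkin sketch with assumed values, not measured game internals:
# the server only sends world snapshots every 1/tickrate seconds, and the client
# renders the world interpolated between the last couple of snapshots, so what you
# aim at is always somewhat old on top of network ping.

def estimated_view_delay_ms(tickrate_hz: float, ping_ms: float, interp_ticks: float = 2.0) -> float:
    """Approximate age of the world state you see, in milliseconds."""
    tick_interval_ms = 1000.0 / tickrate_hz              # time between server snapshots
    interpolation_ms = interp_ticks * tick_interval_ms   # client interpolation buffer
    return ping_ms / 2.0 + interpolation_ms              # one-way trip + interp delay

# Example: 30-tick server, 30 ms round-trip ping -> ~82 ms of "staleness"
print(round(estimated_view_delay_ms(30, 30), 1), "ms")
```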

Fortnite players cheat the most in any online multiplayer game, with over 26,822,000 searches for hacks

https://us.blastingnews.com/gaming/...fter-fncs-cheating-controversy-003096277.html

https://fortnitetracker.com/article/1236/fortnite-cheating-crisis-reaches-new-highs

-----------------------
Call of Duty Warzone

Looks like Call of Duty Warzone's tick rate is even lower at around 20.

https://www.forbes.com/sites/paulta...rzone-has-now-banned-half-a-million-cheaters/


------------------------------

Apex Legends, surprisingly, is also 20 tick! (For reference again, Valorant is 128 tick.) That makes it the same scenario as COD:WZ above in regard to how many action-state "dots" your machine is connecting along the dotted line, so to speak, as far as the server is concerned.
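
Quick illustrative arithmetic on what that means for how many locally rendered frames pass between those server "dots" (tick rates as cited above; the fps values are just examples):

```python
# Rendered frames per server snapshot at the tick rates cited above (illustrative).
for game, tickrate in [("Apex Legends / Warzone (~20 tick)", 20),
                       ("Fortnite (~30 tick)", 30),
                       ("Valorant (128 tick)", 128)]:
    for fps in (120, 240, 360):
        print(f"{game}: at {fps} fps, ~{fps / tickrate:.0f} rendered frames per server update")
```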

March 2021
Hundreds of high ranked Apex Legends players just got banned for cheating

April 2021
Apex Legends Devs Looking Into Compensation For Losing To Cheaters

Feb 2021
https://charlieintel.com/apex-legends-cheaters-hit-by-huge-ban-wave-with-more-to-come/85678/


-----------------------------------------------
 
Might be worth its own thread, but could you maybe post some info on how you have your (from what I remember) wild multi-monitor battlestation and window management stuff set up and how you typically use it? I could probably pick up a few things.

I have a super programmable keyboard (the horribly named Ultimate Hacking Keyboard) but have considered adding the new Stream Deck+ to my setup just to have those easy little display buttons so I don't have to remember where everything is.

A Stream Deck, with a little hobbyist configuration time, can change your PC life. Honestly. Especially in multi-monitor setups.
 
I hope this screen dethrones the "TV" displays but I highly doubt it. I'm also on a calibrated S95B and it's just... different, in a good way, especially for the price. Monitor prices are still whacked, and some company is just going to have to eat it and cannibalize their own products to make it make sense against TVs. It's like they want you to pay more for less screen.
 

It's nice to see vigorous discussion here.

As always, Right Tool For Right Job. Sometimes it's definitely an XL2566K, for sure. That's the best TN LCD I've ever seen.

OLED and related technologies will probably take much larger segments of the market towards the end of the 2030s. OLED's lightweight design can even make ultrawides lighter than a 27" FALD LCD, which lowers the cost of shipping, so once the brightness windows get bigger and OLED gains the capability of subrefresh latency, price-per-performance will be incredibly tempting, meaning a bigger share of the gaming-monitor pie (likely double-digit % of the market) by the end of the 2020s decade.

At the end of the day, competitive esports is only a minority (aka under 50%) of the gaming market. If you're immersed in it, it feels like 90%-99% of the market. There's mobile, VR, solo games, casual games. Say, you may see your extended family playing on their consoles, a friend playing solo games, yet more playing a powerful RTS game on an iPad, and others playing on Quest 2 VR headsets. While esports is the most high-dollar of the communities, the whole 240Hz market is now far more than just esports, with great options that have great quality for general-purpose use. Many use high refresh rates for non-gaming reasons now too. 240Hz is not just for esports after all -- it reduces web-browser scrolling motion blur by 75% (vs. 60Hz) -- and that's why 120Hz phones/tablets are going mainstream. But Blur Busters is most well known in the esports communities. I get it. But when one says "Too bad nobody actually play games here", it is simply an inability to see outside one's own specific community.

Meanwhile, I will begin to write a big post explaining esports-latency displays (next post).
 
All the reason why the actual good LEDs are over 300KHz [referring to flicker]
This is completely false 😉

You are falsely attempting to be authoritative by conflating a pixel clock (300KHz+) with other, low-frequency flicker sources like the inversion algorithm (the LCD voltage-balancing system, normally invisible, but which sometimes has artifacts).

LCD Inversion flicker is only half refresh rate (72Hz flicker at 144Hz).

Common YouTuber single-pixel photodiodes don't see this clearly because it is alternate-pixel flicker (like an alternating-chessboard flicker), so the average picture brightness stays stable through it all.
But it is much more visible in macro-focus high-speed camera footage of a TN LCD.

Also, it's worth knowing that I work with manufacturers (see the Services section of Blur Busters). I understand the internal behaviours of LCDs and OLEDs much better than you do.

There are MANY different frequencies involved in a display, including the common ones (pixel clock, horizontal scan rate, vertical refresh rate) and the less-known ones (inversion algorithms, VRR gamma-balancing algorithms, VRR variable overdrive, strobe backlight duty cycles, etc.). Many of these numbers don't flicker, while other numbers do, depending on the display.

I can't stand toxic multiplayer gaming communities
I can understand the sentiment, even though I am more diplomatic when I mingle in the esports communities.

Too bad nobody actually play games here.
Such nonsense.
So I circle back to:

Deep breaths gentlemen.... :)

I am well known in the esports community because they use things like TestUFO to check their monitors -- and I certainly respect VSYNC OFF and LCD technologies. I'm also a big fan of the XL2566K, if you've seen my glowing comments in the other forums.

It's important to know competitive games are only a section of the whole gaming market -- the gaming communities are relatively siloed -- and it is not impossible to create an OLED that has less lag than an LCD (realtime scanout + fast pixel response beating LCD GtG) -- remember the first LCDs had really high lag because they had to buffer the incoming refresh cycle first before scanning-out to the panel.

For those who don't understand the latency chain at the video-cable level and scanout level:

[Diagram: the latency chain from video-cable transmission to panel scanout]


1. On the cable, it takes a finite time to transmit from first pixel through last pixel.
2. On the panel, it takes a finite time to refresh from the first pixel through last pixel.

Pixel Clock (>300KHz) is the number of pixels per second over a video cable. Many people mis-quote this number, but it is completely unrelated to flicker; it's simply a transmission speed -- like Internet speed (3 megabits/sec, 100 megabits/sec, gigabit). The latest DisplayPort can go up to 77 gigabits per second now.

Pixels are transmitted over the video cable from left-to-right, top-to-bottom, and the porches/sync intervals (seen in a Custom Resolution Utility), formerly used to control the CRT electron beam, are essentially de facto indicators to start the next pixel row or start a new refresh cycle. Variable refresh rate piggybacks on this by varying the size of the vertical blanking interval (VBI) -- which is the sum of (Vertical Front Porch + Vertical Sync + Vertical Back Porch).
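
If you want to see how those Custom Resolution Utility numbers hang together, here's a minimal sketch; the timing values and pixel clock below are made-up placeholders, not any real monitor's EDID:

```python
# Illustrative CRU-style timing math (placeholder numbers, not a real EDID).

h_active, h_front, h_sync, h_back = 1920, 48, 32, 80   # horizontal timings (pixels)
v_active, v_front, v_sync, v_back = 1080, 3, 5, 23     # vertical timings (lines)
pixel_clock_hz = 333_000_000                            # assumed ~333 MHz pixel clock

h_total = h_active + h_front + h_sync + h_back          # pixels per scanline
v_total = v_active + v_front + v_sync + v_back          # lines per refresh cycle
vbi_lines = v_front + v_sync + v_back                   # vertical blanking interval

scanline_time = h_total / pixel_clock_hz                # seconds per scanline
refresh_rate = pixel_clock_hz / (h_total * v_total)     # Hz
print(f"{refresh_rate:.1f} Hz, VBI = {vbi_lines} lines = {vbi_lines * scanline_time * 1e6:.0f} us")

# VRR piggybacks on this: keeping the pixel clock constant and stretching the VBI
# (adding blank lines before the next refresh) lowers the instantaneous refresh rate.
extra_vbi_lines = 1000
vrr_rate = pixel_clock_hz / (h_total * (v_total + extra_vbi_lines))
print(f"With {extra_vbi_lines} extra VBI lines: {vrr_rate:.1f} Hz")
```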

On a 60Hz panel, it usually takes 1/60sec to transmit the first pixel through the last pixel. Some displays can refresh the panel in realtime (stream the cable straight to the panel), while other displays need to buffer the refresh cycle.

Early LCD displays fully buffered the refresh cycle from the cable, to process the refresh cycle first (picture brightness, overdrive, inversion, etc). Later LCD displays, especially in esports, figured out how to stream pixels directly from cable to panel using only rolling-window processing algorithms, lowering latency to mainly port transceiver latency and pixel response latency. The same thing is happening again with early OLED displays, which will progressively improve in latency. Local dimming backlights also add lots of latency because of the processing required. So LCD and OLED have processing tradeoffs. Some algorithms (e.g. find the brightest pixel for HDR processing) require full-framebuffer processing, while scaling an image can be achieved with rolling-window (2 or 3 pixel rows) processing. So you have tradeoffs of picture quality vs latency on both LCD and OLED, especially with certain kinds of FALD backlights that require full-refresh-cycle buffering (not esports). There are many latency variables in a screen, but there's nothing stopping OLEDs from eventually matching LCD latency (at least with HDR=OFF). OLED is capable of getting lower-lag perfect-blacks than LCD can, once properly optimized.

This won't be true at first, but technologically, one has to understand why some displays buffer the incoming signal before beginning to refresh.

The ideal situation is when a display can stream a signal (top-to-bottom scanout) to the panel (top-to-bottom scanout) with a synchronized scanout, achieving subrefresh latency, which works excellently with VSYNC OFF:

[Diagram: synchronized cable-to-panel scanout achieving subrefresh latency]


Nothing stops OLED from being able to do this (eventually) at least in the equivalent reduced-processing mode (LCD and OLED reduced-processing modes). The bonus is that the reduced processing mode on OLED still has perfect blacks, which is still fantastic. Mind you, picture gamut may be reduced (e.g. non-HDR) on both LCD and OLED when turning on an esports-latency mode, but time will tell what latency-vs-picture tradeoffs occur. LCDs have been around much longer, and the world has had time to optimize them for latency, including inventing sub-refresh processing algorithms.

Sometimes an algorithm (on both LCD and OLED, such as OLED find-brightest-pixel, or various LCD FALD processing, etc.) requires buffering the full refresh cycle before displaying it, while other algorithms can simply stream the signal scanout straight to the panel, beginning to display the refresh cycle even BEFORE the bottom of the refresh cycle has finished transmitting over the cable!

That's what esports displays use -- subrefresh processing, where lag from pixel-on-GPU-output to photons-emitted-from-pixel is less than one refresh cycle. This is despite the fact that it takes one refresh time (e.g. 1/144sec or 1/240sec) to transmit the first pixel through the last pixel on the video cable, with normal VESA video signal timing formulas.
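
A toy model of that difference, with all numbers assumed (the 0.1ms transceiver overhead and 1ms GtG are illustrative, not any specific monitor):

```python
# Pixel-to-photon lag: streamed cable->panel scanout vs. full-frame buffering.
# All numbers are assumptions for illustration only.

refresh_hz = 240
frame_time_ms = 1000 / refresh_hz   # ~4.17 ms to transmit first->last pixel on the cable
transceiver_ms = 0.1                # assumed port/processing overhead
gtg_ms = 1.0                        # assumed pixel response time

def lag_streamed(y_fraction: float) -> float:
    # Synchronized scanout: each row lights up a small, constant time after that row
    # arrives on the cable, regardless of screen position = subrefresh lag.
    return transceiver_ms + gtg_ms

def lag_buffered(y_fraction: float) -> float:
    # Full-frame buffering: a row waits for the rest of the frame to arrive, then the
    # panel re-scans down to reach it again -> roughly one extra refresh cycle of lag.
    wait_for_rest = (1 - y_fraction) * frame_time_ms
    rescan_to_row = y_fraction * frame_time_ms
    return wait_for_rest + rescan_to_row + transceiver_ms + gtg_ms

for y in (0.0, 0.5, 1.0):
    print(f"row at {y:.0%}: streamed ~{lag_streamed(y):.2f} ms, buffered ~{lag_buffered(y):.2f} ms")
```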

See, I do understand the processing innards of esports displays better than you do...
 
Is this a console player moment? How competitive can you be in a single-player game? All you do is compete with yourself repeatedly, after all. As I said, a 60Hz 20ms monitor would be enough.

There are certain console players that are moderately good -- good at precise aim and strategy. In the end, they are not that good compared to the frag PC players with a keyboard and mouse plus an esports monitor built to compete. I don't think you even realize that they have multiple monitors built for different purposes, unlike just using a cheap TV for everything. You say you cannot stand toxic communities. Then there is no pressure you can take in ranked play when your competitors sandbag you 100 times a day.

So much for the Right tool for the Right Job.
I said I don't play multiplayer games. That was a separate thing. Many others do, on 4K TVs with consoles.
 
What competitive gamers do tends to be a "this famous, talented player does thing X, so I will also do X" type of arms race. Often also company-sponsored. In reality these gamers are just plain good at the game, and whether they use a 120 Hz or 360 Hz display would not reduce their performance to the degree that they would stop dominating. Will it matter at the very highest levels? Maybe, but most players do not play at this level or anywhere even close to it.

Flickering or any other eye strain issues are hugely personal, and which ones affect you varies a ton. So there's no single "because tech X does this, it's an issue for everyone." I have never gotten eye strain from OLEDs, but that doesn't mean you don't. Instead I tend to have issues playing games that have a particular camera movement and no crosshair on screen. For example, Gears of War 5 made me nauseous until I used my display's crosshair feature to draw a focal point on screen at all times. Others would not have this problem at all.

I personally get real tired of all the worrying/bitching/worshiping about high end competitive gamers with regards to displays (and other hardware). Unless you play competitive games AND you are performing at such a level where tiny difference might matter, just don't worry about it. So long as your display doesn't have some absurdly high amount of lag, it isn't going to be an issue when you are playing in silver.

Likewise there are many, MANY people who DON'T play competitive games, and thus it doesn't matter at all. For us, getting the display that makes us happiest in terms of appearance, features, etc, etc is all that matters. Doesn't matter that my monitor is something none of the pros use, I like it, and I don't play competitive games so who cares?

I think there's too much falling back, in arguments, on what professional gamers do, or what is best for competitive gaming. That is just not relevant to many people. They either don't game competitively, or don't do so at a level where small performance differences matter. Thus, focusing on that is not only silly, but also detrimental to what would actually give people the best experience. Same is true for graphics settings. Yes, super competitive gamers may turn down most or all the settings to minimum to not only maximize framerates but also to minimize visual occlusion... but that doesn't mean all gamers want that. Eye candy is nice; some of us would like to crank things up and enjoy the experience even if it means lower fps.

It also amuses me how much stressing there is over having the right display for competitive gameplay, yet I never see ANYTHING about audio for competitive play, despite the fact that we have faster reaction times to sound than to visuals on average. None of the "You have to have a 360Hz TN panel or you suck" people can tell me how many buffers their soundcard has, or how big they are. They can't tell me what kind of oversampling filter the DAC uses and how much latency that adds, but god forbid you have an extra ms of latency on your display! That might permanently doom you to Silver 3, keeping that elusive, prestigious Silver 4 out of reach!

Finally, I think that focusing on what pros do can be detrimental, even if you do the same thing, as people don't always make the provably most optimal decisions; they'll do things for superstitious reasons or other such things. One example from the pro audio world, since I play in that space, is that a lot of pros swear by Yamaha NS-10 speakers. They are ugly little white-coned speakers that you see sitting on many a meter bridge. Thing is, they are objectively not very good. Do proper spinorama measurements on them and they are easily outclassed in all areas -- distortion, frequency response, bass extension, etc. -- by newer speakers. Cheaper ones, even, since they are no longer made and thus cost a premium. Doesn't matter, you get more than a few pros that just swear by them and have to have them. But that doesn't mean YOU should try to get them if you get into audio.
 
Finally, I think that focusing on what pros do can be detrimental, even if you do the same thing, as people don't always make the provably most optimal decisions; they'll do things for superstitious reasons or other such things. One example from the pro audio world, since I play in that space, is that a lot of pros swear by Yamaha NS-10 speakers. They are ugly little white-coned speakers that you see sitting on many a meter bridge. Thing is, they are objectively not very good. Do proper spinorama measurements on them and they are easily outclassed in all areas -- distortion, frequency response, bass extension, etc. -- by newer speakers. Cheaper ones, even, since they are no longer made and thus cost a premium. Doesn't matter, you get more than a few pros that just swear by them and have to have them. But that doesn't mean YOU should try to get them if you get into audio.
The Yamaha NS-10 is actually a good example of "know your gear." People like them as they tend to reveal issues in a mix because they tend to emphasize those problem areas. Often in a studio you would have multiple sets of speakers - you have your nice and flat response Genelecs or Focals or whatever, then you might have some less than stellar ones so you can make sure your perfect mix also sounds good on what most people will be using: less than perfect speakers and headphones.

This might lead to people picking the NS-10 for the wrong reasons, just like people might pick a monitor just because it happens to work for a particular eSports player, which is not a guarantee it's going to work for you. Apply this to something even more personal like keyboards or mice and you are likely to have even more unhappy users.
 
Personally, I play slow-paced singleplayer games or "drunk fun" co-op games.

And I still think low-latency-high-refresh displays feel AMAZING. it doesn't matter how it improves my skills: It improves my experience.

ESports is not my thing, and I honestly couldn't care less about it for my needs: but I still want a 480Hz OLED with sub 1ms latency. I can imagine how amazing that would look and feel.
 
This is completely false 😉
I'm afraid you are the one who's out of the loop. It's obvious your authority has limits.

You are falsely attempting to be authoritative by conflating a pixel clock (300KHz+) with other, low-frequency flicker sources like the inversion algorithm (the LCD voltage-balancing system, normally invisible, but which sometimes has artifacts).

LCD Inversion flicker is only half refresh rate (72Hz flicker at 144Hz).

Common YouTuber single-pixel photodiodes don't see this clearly because it is alternate-pixel flicker (like an alternating-chessboard flicker), so the average picture brightness stays stable through it all.
But it is much more visible in macro-focus high-speed camera footage of a TN LCD.

Also, it's worth knowing that I work with manufacturers (see the Services section of Blur Busters). I understand the internal behaviours of LCDs and OLEDs much better than you do.

There are MANY different frequencies involved in a display, including the common ones (pixel clock, horizontal scan rate, vertical refresh rate) and the less-known ones (inversion algorithms, VRR gamma-balancing algorithms, VRR variable overdrive, strobe backlight duty cycles, etc.). Many of these numbers don't flicker, while other numbers do, depending on the display.
Have you ever seen the AUO MiniLED 7.0 panel in the ViewSonic VX2781-4K-PRO? It does an easy 40KHz PWM. The PA32UCG has 270KHz PWM.
You might be the blur buster of fast monitors, but not of HDR monitors.
The PWM frequency will only go higher.
 
Personally, I play slow-paced singleplayer games or "drunk fun" co-op games.

And I still think low-latency-high-refresh displays feel AMAZING. it doesn't matter how it improves my skills: It improves my experience.

ESports is not my thing, and I honestly couldn't care less about it for my needs: but I still want a 480Hz OLED with sub 1ms latency. I can imagine how amazing that would look and feel.
For sure they do. I am not, at all, hating on that. Low latency makes things feel more responsive, and high refresh rate not only helps with that, but just makes things more fluid. I mean I notice the fluidity difference in Hearthstone between my desktop (144hz) and laptop (240hz) and that's a silly card game. All things being equal I will always tell people to go for a monitor with higher refresh and lower latency.

However, all things are not equal, and the other features have to be considered. I could have a 240hz monitor for my desktop, or more if I wanted... but not in this size. Likewise, I want to play on a TV in my living room, which means I am basically stuck with 120Hz (technically you can overclock the S95B to 144Hz but no more) and, while still low latency, it's more than a monitor -- but that's what I have to accept if I want a 65" display.

The argument isn't that nobody should want better technologies, but that when you are choosing what is out NOW your choices should be based on what you do, not what is the meta for professional gaming. Display tech has tradeoffs and the tradeoff of high refresh rate is not always worth it for everyone if it means going to a display with lesser image quality.
 
Have you ever seen the AUO MiniLED 7.0 panel in the ViewSonic VX2781-4K-PRO? It does an easy 40KHz PWM. The PA32UCG has 270KHz PWM.
You might be the blur buster of fast monitors, but not of HDR monitors.
The PWM frequency will only go higher.
I have seen both.

But this misses the point of tech progress and the dozens of displays I've seen. As Chief Blur Busters, I get to see thousands of displays per year. Production, prototype, NDA, etc.

Flicker is often about the weakest link, too.
There are many overlapping causes of flicker.
Sometimes the weak flicker link isn't the PWM frequency.
It can be, but it is not always the strongest source of flicker.
There are many variables.

The duty cycle of flicker, e.g. a PWM of 50%:50% is far more painful than >99%:<1% duty cycle, for the same PWM frequency. The flicker pattern matters too -- some people are more sensitive to spatial flickering (adjacent-pixel flicker) like DLP temporal dithering or LCD inversion artifacts -- or color-flicker (e.g. DLP rainbow effect eyestrains) -- while others are more sensitive to just purely temporal flicker, of shorter duty cycle.

Just because one person eyestrains on a specific kind of flicker, doesn't mean the next person has exactly the same flicker eyestrain -- whereupon one gets eyestrain from stroboscopics from 5,000 Hz flicker, but the same person does not get eyestrain from 300 Hz flicker -- because of a totally different flicker pattern. Yet others have no flicker eyestrain, but are eyestrained by stroboscopic stepping patterns (duplicate-images from multi-strobing is also a problem with old low-frequency PWM-dimming backlights -- sometimes the motion artifacts are the bigger cause of the eyestrain for some individuals). It depends on what you are sensitive to. That's why a lot of people didn't mind CRT, but others hated CRT, due to the flicker.

For example, 1000Hz flicker at 1%:99% ON:OFF squarewave duty cycle can be more eyestrain to many people than 100Hz flicker at 99%:1% ON:OFF duty cycle, at equal photons/sec brightness compensated as per Talbot-Plateau Theorem. The 1:99 needs to strobe almost 100x more brightly than the 99:1 strobe, to match equal brightness too. So 100,000 nits strobe to achieve 1,000 nits at 1%:99% pulsing. This can be more harsh than 1,010 nits pulsing at 99%:1%. Both are same average brightness, as per Talbot-Plateau (read it up in a university textbook or see Wikipedia). The effects of very harsh squarewave versus sinewave or shallow-wave (1% differences in brightness), as a part of flicker pattern, is a major factor on whether you get eyestrain or not, and whether other factors of eyestrain (e.g. flickering edges of stutter) far out-dominates the display's own (now-softened) flicker behaviors.
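
The Talbot-Plateau arithmetic from that example, as a tiny calculator:

```python
# Average perceived brightness of a fast flicker equals its time-averaged luminance
# (Talbot-Plateau), so the peak needed during the ON pulse scales with 1/duty.

def peak_nits_needed(target_avg_nits: float, duty_cycle_on: float) -> float:
    """Peak luminance required during the ON portion to hit a target average."""
    return target_avg_nits / duty_cycle_on

print(peak_nits_needed(1000, 0.01))   # 1%:99%  -> 100,000 nits during the pulse
print(peak_nits_needed(1000, 0.99))   # 99%:1%  -> ~1,010 nits during the pulse
```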

So I see you refer to backlight frequency. While backlights can PWM at 300 KHz, that is not a flicker source to human eyes, because the phosphor of a white LED (a blue LED containing phosphor to turn it into a white LED) has more persistence than high PWM frequencies -- turning it into pretty much steady state (PWM-free). The next-weakest flicker link then comes into play, whether that's stutter-edge flicker or the TN inversion algorithm. Both create flicker with a way larger duty cycle than OLED flicker, and are a more common cause of headaches than either LED ultrahigh-frequency PWM (essentially flattened out by LED phosphor) or those OLED flicker behaviors.

You know glow-in-the-dark toys? Phosphors and quantum dots are the same thing, just ultrafast phosphors, literally at the microseconds timescale (0.5ms or 0.1ms or 0.03ms etc.), as they continue to glow after the PWM cliff too. Oftentimes, the phosphor decay ends up becoming the major part of OLED GtG, if the active-matrix gate transistor switches faster than the phosphor decay. If you design the PWM frequency high enough to let phosphorescent behavior smooth things over, you can really kill a lot of flicker. At that point, 30KHz and 300KHz have no visible flicker difference if the phosphor decays long enough. Also, OLED phosphor behaviors sometimes will or will not soften the flicker; phosphors are often very fast (e.g. 0.1ms), so the PWM needs to be fast enough to let a well-designed OLED phosphor or quantum dot continue its phosphorescent behavior long enough to sufficiently anti-flicker the said PWM frequency.
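
For the curious, here's a toy first-order model of that "phosphor smooths the PWM" effect -- the 0.1ms decay constant is an assumption and real phosphor/QD behavior is more complex, but it shows why 30KHz vs 300KHz stops mattering once the decay is long relative to the PWM period:

```python
# Toy model: treat the phosphor as an exponential low-pass filter acting on a
# square-wave PWM backlight drive, then measure the residual ripple.

def residual_ripple(pwm_hz: float, duty: float, phosphor_decay_ms: float, steps_per_cycle: int = 1000) -> float:
    """Peak-to-peak ripple (fraction of mean light output) after phosphor filtering."""
    tau = phosphor_decay_ms / 1000.0
    dt = 1.0 / (pwm_hz * steps_per_cycle)
    level, samples = 0.0, []
    for cycle in range(200):                        # run long enough to settle
        for i in range(steps_per_cycle):
            drive = 1.0 if i < duty * steps_per_cycle else 0.0
            level += (drive - level) * (dt / tau)   # first-order (RC-like) response
            if cycle > 100:
                samples.append(level)
    mean = sum(samples) / len(samples)
    return (max(samples) - min(samples)) / mean

for f in (30_000, 300_000):
    print(f"{f/1000:.0f} kHz PWM with 0.1 ms phosphor: ripple ~{residual_ripple(f, 0.5, 0.1):.1%}")
```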

This depends hugely on OLED, and also Samsung QD-OLED versus LG WOLED, and what OLED-brightness algorithm they use (some are very PWM-visibility-resistant) etc. DC, unfiltered PWM, capacitor-flattened PWM, phosphor-flattened PWM, etc, etc, etc. In one of the OLEDs I use here, the dominant flicker (that's not from stutter-edge flicker) is the sub-0.1ms between-refresh-cycle soft-flicker (literally a >99%:<1% duty cycle ON:OFF for many good OLED specimens), and all other flicker subharmonics is below my human perceptual noisefloor, and I have no eyestrain/headaches unlike with, say, some harsh LCD backlights. The venn diagram of display comfort is all over the place!

One OLED can have a flicker harshness two or three orders of magnitude less than the next OLED.
The same is true for LCD (e.g. Modern IPS LCD is mostly immune to inversion-artifact flicker, unlike TN LCD).

While PWM-free dimming is uber important, I am the messenger here to say that >90% of display eyestrain causes are misdiagnosed, even when the person is sure/confident of the cause (even if it's still flicker, it can be the wrong sub-line-item of flicker among the multiple concurrent flicker harmonics found, including framerate-derived stutter-edge flicker). There are times where I traced specific people's eyestrain to a reason completely different from PWM and blue light. There are 100+ ergonomic issues with a display.

Hey, I *do* get hired to visit manufacturers and *teach* them in person...
I've already flown over both the Atlantic Ocean and the Pacific Ocean (flight & hotel paid too, on top of consulting fees) to conduct a Blur Busters Masterclass...
I'm ready to continue the debate until the heat death of the universe anyway...

 
I have seen both.

But this misses the point of tech progress and the dozens of displays I've seen. As Chief Blur Busters, I get to see thousands of displays per year. Production, prototype, NDA, etc.


mate, do you have a blog or site (aside from blur busters) where I can just read your writings? I am enraptured by every post you write and I want to see more.
 
I notice the fluidity difference in Hearthstone between my desktop (144hz) and laptop (240hz) and that's a silly card game.

I get that on the surface you would think card games are simple and silly but:

- card games are probably so easy to render that they get the max fps to match the peak Hz of the display, where some games might be using VRR and be running beneath that peak Hz (in some cases way beneath).

- card games have a lot of text and symbol recognition that probably benefit more from, or at least more obviously from, blur reduction than area and character graphics and textures would - at least when the cards/text/symbols are moving.

I think there's too much falling back, in arguments, on what professional gamers do, or what is best for competitive gaming. That is just not relevant to many people. They either don't game competitively, or don't do so at a level where small performance differences matter. Thus, focusing on that is not only silly, but also detrimental to what would actually give people the best experience.

Competitive gaming specs aren't 1:1 to online gaming dynamics either. Online has much different results than LAN gameplay, testing vs. bots, or playing local games. A lot of the extreme competitive edge stuff is just marketing as it relates to playing online on remote gaming servers.

 
mate, do you have a blog or site (aside from blur busters) where I can just read your writings? I am enraptured by every post you write and I want to see more.
Also see Vega.

I don't post direct links to the ad-filled Blur Busters website here, as a promise I made to the HardForum owner when I was reinstated.

(I do, however, post links to the ad-free TestUFO website as see-for-yourself explainers -- they are highly educational)

That being said, if you're interested in my advanced writings, click the purple "Research" button at the top of Blur Busters for my favourite articles -- it also includes a link to the Area51 discussion forum where I write some of my visionary posts on random topics such as framerateless video or low-persistence sample-and-hold. Some of it is early stuff -- I'm sort of the temporal version of a 1986 Japan MUSE HDTV researcher, long before HDTV became mainstream. I just hired a new writer and will resume writing big articles for the main site soon.

Most advanced [H] members will not really be interested in the General forum, but rather in the "Laboratory" section of Blur Busters Forums (a screenful scroll down), which talks about programming, TestUFO, display engineering, etc.
 
You are more right than you think. The early firmware S95B OLED can get crazy bright. I've literally had to partly close my eyes/look away during some HDR scenes in a dim room.
S95B is the best "everything combination" I have ever seen on a display.
Incredible brightness, colors, contrast and 4k144!

My eyes when I play HDR games in the evening on the S95B
 
I feel like you should bring a CRT with you wherever you go, so you can remind them what good motion resolution looks like.

Because I imagine, like most of the population, these guys haven't seen a CRT monitor in person in 15 years.
Hard to carry on a plane, sadly, and I don't usually travel with checked luggage if I can help it. Too risky to lose luggage in this era, so I try to bring all equipment in carry-ons, such as 240Hz projectors, high-Hz laptops, or small high-Hz monitors.

If I have to bring a demo "CRT simulation" display, a 23.8" panel specimen such as the ViewSonic XG2431, with its stand removed, can (just barely) fit in carry-on luggage if I am careful. With a custom resolution mode of VT4500 QFT (hiding more LCD GtG via a faster scanout and a long VBI), combined with the ViewSonic Strobe Utility (which I created) and a custom user-adjustable overdrive gain, it can be as clear as a CRT even for top/center/bottom with zero strobe crosstalk -- like an Oculus Quest 2 VR LCD -- in the territory of ~85-100Hz, since refresh rate headroom greatly helps a strobe backlight achieve CRT motion clarity.
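
Rough numbers behind that VT4500 QFT trick (illustrative only; actual XG2431 timing limits may differ):

```python
# A huge Vertical Total at a fixed refresh rate means the 1080 active lines are
# scanned out fast, leaving a long dark VBI for LCD GtG to settle before the strobe.

def qft_numbers(refresh_hz: float, v_active: int, v_total: int):
    frame_ms = 1000.0 / refresh_hz
    scanout_ms = frame_ms * v_active / v_total   # time spent drawing visible lines
    vbi_ms = frame_ms - scanout_ms               # idle time available to hide GtG
    return scanout_ms, vbi_ms

for label, vt in (("standard VT1125", 1125), ("QFT VT4500", 4500)):
    scan, vbi = qft_numbers(120, 1080, vt)
    print(f"120 Hz, {label}: scanout ~{scan:.1f} ms, blanking ~{vbi:.1f} ms for GtG to settle")
```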

Most manufacturers don't understand this well, due to frequent communication difficulties between monitor brands and the Asian scaler vendors, plus long-established standards (e.g. VESA doesn't have a QFT standard, which would have made it easier to help strobing with more manufacturers).

Also, software based black frame insertion demos work impressively well on OLEDs, far better than LCDs, so that is also a substitute.
 
Uh oh. Vega and l88bastard combined giving glowing kudos to hardware is never good news potentially for our wallets around here. ;) I do have some space issues but I'm definitely keeping that Samsung in mind.
Don't worry, they will be giving glowing kudos to something else very soon. They change their monitors as often as most people change their underwear.
 
I have seen both.

But this misses the point of tech progress and the dozens of displays I've seen. As Chief Blur Busters, I get to see thousands of displays per year. Production, prototype, NDA, etc.

The duty cycle of flicker, e.g. a PWM of 50%:50% is far more painful than >99%:<1% duty cycle, for the same PWM frequency. The flicker pattern matters too -- some people are more sensitive to spatial flickering (adjacent-pixel flicker) like DLP temporal dithering or LCD inversion artifacts -- or color-flicker (e.g. DLP rainbow effect eyestrains) -- while others are more sensitive to just purely temporal flicker, of shorter duty cycle.

Just because one person eyestrains on a specific kind of flicker, doesn't mean the next person has exactly the same flicker eyestrain -- whereupon one gets eyestrain from stroboscopics from 5,000 Hz flicker, but the same person does not get eyestrain from 300 Hz flicker -- because of a totally different flicker pattern. Yet others have no flicker eyestrain, but are eyestrained by stroboscopic stepping patterns (duplicate-images from multi-strobing is also a problem with old low-frequency PWM-dimming backlights -- sometimes the motion artifacts are the bigger cause of the eyestrain for some individuals). It depends on what you are sensitive to. That's why a lot of people didn't mind CRT, but others hated CRT, due to the flicker.
I haven't even mentioned DC dimming backlights.

There is your 100Hz-240Hz flickering OLED. There are cheap sub-1000Hz flickering LCD backlights like the LG 32GP950. Then there are high-end 300KHz backlights and DC dimming backlights.
It is obvious which one is superior. They are not even on the same level.

Everybody can see thousands of displays at CES. I have been there too. A CES hotel suite is never a good place to actually use and test a monitor, or you could have busted the QD-OLED flickering issues.

Like I said, the 240Hz model is out soon. And this is a monitor on which LG doesn't even put an HDR400 sticker. It is going to be dim and flickery.

So there is a very high chance that I will bust this monitor instead.
 
If I have to bring a demo "CRT simulation" display, a 23.8" panel specimen such as the ViewSonic XG2431, with its stand removed, can (just barely) fit in carry-on luggage if I am careful. With a custom resolution mode of VT4500 QFT (hiding more LCD GtG via a faster scanout and a long VBI), combined with the ViewSonic Strobe Utility (which I created) and a custom user-adjustable overdrive gain, it can be as clear as a CRT even for top/center/bottom with zero strobe crosstalk
Can confirm that the higher refresh rates on the XG deliver CRT clarity (yes, I still have an old CRT around here that gets occasional use, so I assure you this isn't hyperbole). Honestly, if future OLED monitors can feature the same BFI clarity that the XG2431 has across its scan range, then as far as I'm concerned, we've arrived.

EDIT - Again though, to keep it somewhat on topic. $999 for 27-inch OLED. Yes.

Chief - is it possible to have software BFI at a driver level? Like let's say I want to implement 60hz BFI for my console ports. How easy is it to have a feature that "tricks" Windows or whatever to implement it and treat it as a native 60hz display? Or am I asking the impossible here?
 
Don't worry, they will be giving glowing kudos to something else very soon. They change their monitors as often as most people change their underwear.
I like to live vicariously through them. It's like watching those Sony BVM CRT videos. Would I love to have one? Oh yeah. But the only one that works for my family would be the 32-incher. There are five kids, and the ones who are old enough love playing video games with me on the weekends. That split screen can get mighty tiny even on a 32-inch screen. And that model is near impossible to find nowadays. If you DID find one, be prepared to pay in the five-digit range.
 
I think I'm going to order the 45" Corsair and sit a mile away. I need that 240hz motion clarity but this monitor is just too small.
 
Can confirm that the higher refresh rates on the XG deliver CRT clarity (yes, I still have an old CRT around here that gets occasional use, so I assure you this isn't hyperbole). Honestly, if future OLED monitors can feature the same BFI clarity that the XG2431 has across its scan range, then as far as I'm concerned, we've arrived.

Short answer: No -- for any LCD's Hz(max) or its maximum scanout velocity

240Hz strobing at 240Hz(max) will always have crosstalk on any 240Hz(max) LCD. LCD GtG is not fast enough to hide in the vertical blanking interval of VESA standard 240Hz scanout. Interestingly, the Valve Index and the Oculus Quest LCDs have extremely fast scanout velocities and are essentially Hz-capped (underclocked) to eliminate VR strobe crosstalk. So that's why I say "maximum scanout velocity", which normally governs a LCD panels' maximum possible Hz, but sometimes that's intentionally capped (e.g. conceptually like selling a ViewSonic XG2431 capable of only 120Hz, despite it using a 240Hz panel) to keep it CRT motion clarity.

Long answer: Well, to answer that you need to define "CRT motion clarity".

It is best to define it as visually top/center/bottom equal clarity, crosstalk-free, while also using an effective persistence that eliminates all visible trailing effects (ghosting effects, phosphor effects, crosstalk effects). The best strobed LCDs let you read the street name labels at 3000 pixels/sec+ at www.testufo.com/map#pps=3000, and you can ultra-fine-tune (with an ultra-wide PWM range / QFT range / brightness range / etc.) to get better brightness:clarity tradeoffs than you could in the past.

All the CRT-motion-clarity LCD modes (Quest 2, Valve Index, QFT low-Hz XG2431, etc.) use a fast scanout followed by a long blanking interval, to allow time for LCD GtG to settle unseen in total darkness between refresh cycles. Make the VBI long enough, and GtG completes enough that the crosstalk disappears. That's why you kind of need 3:1 refresh rate headroom if you want a crosstalk-free LCD capable of unbounded motion clarity (motion clarity limited only by strobe pulse width).

One reasonable not-too-tight not-too-loose definition would be a sub-1% strobe crosstalk on any gaming monitor. With QFT + retune + Hz headroom, I can do that on XG2431 while achieving >100 nits. For that, it is essential to have a long time delay between the last pixel refreshed of previous refresh cycle and THEN the first pixel refreshed on next refresh cycle. Not all pixels on an LCD refresh at the same time:



[High-speed video of a BenQ XL2720Z refreshing -- all LCDs and most OLEDs refresh in this top-to-bottom manner.]

If your VBI is too short, scanout starts again at the top before LCD GtG finishes at the bottom edge, leaving no room to flash the strobe with zero crosstalk. A long VBI means scanout can finish the bottom edge, then a long time passes (with the backlight turned off) to let LCD GtG settle more fully before the backlight is flashed and scanout starts again at the top = zero top/center/bottom crosstalk from some QFT modes.

Crosstalk-free (top/center/bottom equally clear) CRT-motion-clarity strobing on an LCD requires an LCD GtG(0%->100%) faster than the vertical blanking interval.
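
As a sketch of that condition (the GtG and scanout numbers below are assumed, not measured):

```python
# The backlight can only be flashed crosstalk-free if the panel's GtG has time to
# finish in the dark gap between the end of one scanout and the start of the next.

def crosstalk_free(refresh_hz: float, scanout_ms: float, gtg_ms: float) -> bool:
    dark_gap_ms = 1000.0 / refresh_hz - scanout_ms   # long VBI = big dark gap
    return gtg_ms <= dark_gap_ms

# 240 Hz panel driven at its max Hz: ~4.2 ms frame, ~4 ms scanout, ~3 ms real-world GtG
print(crosstalk_free(240, scanout_ms=4.0, gtg_ms=3.0))   # False -> visible crosstalk
# Same panel underclocked to 100 Hz while keeping the fast (QFT-style) 4 ms scanout
print(crosstalk_free(100, scanout_ms=4.0, gtg_ms=3.0))   # True  -> room for a clean strobe
```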

100Hz on an XL2566K was also reported to be crosstalk-free (but unfortunately 100Hz is also the minimum strobed refresh rate the XL2566K has), while other panels are more flexible (unlocked pulse-width brightness adjustment range, unlocked Hz adjustment range, overdrive gain, etc.) -- specifically the XG2431 and its strobe utility, which is more powerful than the BenQ strobe utility.

Incidentally, for a sample-and-hold OLED, you also need refresh rate headroom to simulate a CRT tube. An electron beam simulator (a la the RetroArch BFIv3 suggestion) can use a 960fps 960Hz panel and use 8 digital refresh cycles to simulate 1/120sec of electron beam, or 16 digital refresh cycles to simulate 1/60sec of electron beam. So brute sample-and-hold Hz can allow a software-based CRT electron beam simulator that becomes increasingly accurate the more output Hz you have available to subdivide a CRT Hz.
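
A minimal sketch of that rolling-scan idea (my own illustration, not RetroArch's actual BFIv3 code):

```python
# Simulate one 60 Hz CRT field on a 960 Hz sample-and-hold panel by lighting a
# different horizontal band on each of the 16 sub-refreshes, with a crude one-refresh
# "phosphor fade" behind the freshly scanned band. Everything else stays black.

PANEL_HZ, CRT_HZ, LINES = 960, 60, 1080
SUBREFRESHES = PANEL_HZ // CRT_HZ             # 16 slices per simulated CRT field
BAND = LINES // SUBREFRESHES                  # lines "drawn" per sub-refresh

def band_brightness(subrefresh: int):
    """Return (start_line, end_line, brightness) segments lit on this sub-refresh."""
    lit_start = subrefresh * BAND
    segments = [(lit_start, lit_start + BAND, 1.0)]        # fresh band, full brightness
    if subrefresh > 0:                                      # previous band still glowing
        segments.append(((subrefresh - 1) * BAND, lit_start, 0.3))
    return segments

for s in range(SUBREFRESHES):
    print(s, band_brightness(s))
```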

But theoretically, why flicker at all? If the frame rate is available, you can simply use brute framerate-based motion blur reduction instead, to reduce motion blur strobelessly:

[Diagram: motion blur of strobed vs. non-strobed displays]


So to match a 1ms MPRT strobe (1ms flicker), you can achieve 1ms MPRT via a future 1000fps 1000Hz OLED (1ms frametime) -- quick numbers in the sketch below:
- Motion blur is pulse time on impulsed displays (strobed/flicker/BFI)
- Motion blur is frame time on sample-and-hold displays (LCD and OLED)
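
Quick numbers for both bullets (3000 pixels/sec is just an example panning speed, as in the TestUFO map test):

```python
# Perceived motion blur (in pixels) is roughly persistence multiplied by on-screen
# motion speed, whether that persistence comes from a strobe pulse or a frame time.

def blur_px(persistence_ms: float, speed_px_per_sec: float) -> float:
    return speed_px_per_sec * persistence_ms / 1000.0

speed = 3000                           # example: 3000 pixels/sec panning motion
print(blur_px(1000 / 60, speed))       # 60 Hz sample-and-hold   -> ~50 px of blur
print(blur_px(1000 / 240, speed))      # 240 Hz sample-and-hold  -> ~12.5 px
print(blur_px(1.0, speed))             # 1 ms strobe, or 1000fps 1000Hz -> ~3 px
```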

Strobeless blur reduction is more the holy grail for PC gaming content -- real life does not flicker.

But, still we have retro content (e.g. emulators, classic games) and 60+ years of legacy 60fps 60Hz recordings, so to avoid interpolation, we still need to flicker 60 cycles a second to reproduce the retro content faithfully like it was originally seen on a CRT tube. (And you will need to be tolerant of the flicker too)

So as you can see -- to convert sample and hold into an accurate CRT simulation -- requires brute Hz for a sample and hold display.

You need multiple refresh cycles to simulate a CRT electron beam in a sub-refresh way (the blackness, the flicker, the rolling bar, the phosphor decay). Regardless of LCD or OLED, to simulate a CRT tube in software, you need brute Hz to be able to simulate a CRT electron beam at a low Hz (even 10:1 to 20:1+ refresh rate headroom can be advantageous in simulating a CRT tube, regardless of LCD or OLED, because you need to "roll" the simulated CRT scanout in consecutive digital sample-and-hold refresh cycles).

Chief - is it possible to have software BFI at a driver level? Like let's say I want to implement 60hz BFI for my console ports. How easy is it to have a feature that "tricks" Windows or whatever to implement it and treat it as a native 60hz display? Or am I asking the impossible here?
Yup, at least partially. Software already exists (all of it directly inspired by Blur Busters & my encouragement), and it is very useful for OLED displays that don't have BFI. It works as well as hardware BFI, because OLED BFI can be perfectly identical whether hardware-based or software-based on some panels, as long as you've got enough refresh rate headroom. I am hoping 240Hz OLEDs will be fantastic BFI'ers (except for effective nits).
  • Already possible with SpecialK BFI feature. You can even use it with Cyberpunk 2077 to add BFI to Alienware OLED.
  • Already possible with DesktopBFI (run as Administrator, and set its process priority to REALTIME to prevent excess flicker).
  • There's an open source BFI framework you can build into your software (in Programming subforum at Blur Busters Forums)
Be warned, minimum persistence is the OLED refreshtime for an OLED that has neither hardware BFI nor sub-refresh BFI. So a 240Hz OLED means you can get 4.2ms persistence for 60fps (25%:75% ON:OFF) and for 120fps (50%:50% ON:OFF). And some of this software, such as DesktopBFI, is great for simulating the look-and-feel of a 35mm cinema projector with a custom 96Hz mode (25%:25%:25%:25% ON:OFF:ON:OFF for a 48Hz double strobe) for watching 24fps YouTube videos.
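Putting that arithmetic in one place (a minimal sketch; the function name and interface are hypothetical, purely for illustration):

```python
# Full-refresh software-BFI math: show each content frame for `on_refreshes`
# refresh cycles and black frames for the rest.
# Persistence = on_refreshes * (1000 / panel_hz) milliseconds.

def bfi_pattern(panel_hz: int, content_fps: int, on_refreshes: int = 1):
    assert panel_hz % content_fps == 0, "panel Hz should be a multiple of content fps"
    cycles = panel_hz // content_fps              # refresh cycles per content frame
    persistence_ms = on_refreshes * 1000.0 / panel_hz
    return persistence_ms, ["ON"] * on_refreshes + ["OFF"] * (cycles - on_refreshes)

print(bfi_pattern(240, 60))   # ~4.17ms persistence, ['ON','OFF','OFF','OFF'] -> 25%:75%
print(bfi_pattern(240, 120))  # ~4.17ms persistence, ['ON','OFF']             -> 50%:50%
# The 35mm-projector feel at 96Hz interleaves a 48Hz double strobe instead:
# 24fps content -> ON:OFF:ON:OFF per film frame (25%:25%:25%:25%).
```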

Not everyone cares, but some of my Blur Busters fans are all over these kinds of things. It's not well publicized that BFI software like this already exists, but there you go.

But for full support (a true driver-level solution that virtualizes a low-Hz display and BFIs it onto a higher-Hz display):

For system-wide BFI that even works with PC-based full screen exclusive applications, I used to have a commercial BFI driver, but I long ago deleted it (both binaries and source) from this system due to license encumbrances -- it is no good for Blur Busters goals. Only dusty backups remain (e.g. in case its license is ever compatible with Blur Busters needs in the future).

Fortunately, open source BFI drivers are in cleanroom development by indies at various stages (and the SpecialK author is evaluating doing it too and jumping into all this fun -- SpecialK now has software BFI built in, but SpecialK is not a Windows display driver...yet).

The concept: run a GPU shader every refresh cycle, independent of the underlying frame rate.

Conceptually, they're simply marriages between things like SweetFX/ReShade and a Windows Indirect Display Driver (you can get a sample from the Windows DDK for a USB monitor). Just remember that the years-old advice about multi-monitor + fullscreen exclusive also applies here: if you make the virtualized display your primary monitor, you can only successfully virtualize a higher Hz than your existing monitors while using full screen exclusive mode.
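Purely as a control-flow sketch (hypothetical names throughout; this is not the actual Indirect Display Driver interface, just the refresh-driven loop such a harness would run):

```python
# The virtual display ticks at its own output Hz and applies a per-refresh
# transform to whatever the most recent source frame is, regardless of how
# fast the game is actually producing frames. All names here are made up.

import time

def run_virtual_display(get_latest_frame, per_refresh_transform, present,
                        output_hz, refreshes):
    period = 1.0 / output_hz
    deadline = time.perf_counter()
    for cycle in range(refreshes):
        frame = get_latest_frame()                    # may be a repeat of the last frame
        present(per_refresh_transform(frame, cycle))  # e.g. BFI, beam-sim, overdrive
        deadline += period
        time.sleep(max(0.0, deadline - time.perf_counter()))

# Toy usage: 50%:50% software BFI at 120Hz output, independent of input frame rate.
bfi = lambda frame, cycle: frame if cycle % 2 == 0 else "<black frame>"
run_virtual_display(lambda: "<latest game frame>", bfi, print,
                    output_hz=120, refreshes=8)
```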

Eventually once such a harness becomes available to the public, shader plugins could be created to do many things like:
- Software-based BFI for everything you run on PC
- Software-based CRT beam simulators (rolling BFI with simulated phosphor fadebehind)
- VRR simulation on non-VRR displays, identical to www.testufo.com/vrr
- Superior LCD overdrive algorithms (Rebirth of old ATI Radeon Overdrive, but with more advanced math)
- Virtualize a higher Hz display than what you actually own (good for developer debugging or for frame-blending algorithms)
- Unlimited refresh rate combining, including using 8 different 120Hz strobed projectors to stack 960Hz onto the same projector screen, or using 2 strobed monitors (one with a horizontally flipped image) plus a diagonal beamsplitter mirror to create a double-Hz screen (all of these are my ideas). Note: a Blur Busters whitepaper on refresh rate combining is coming out later in 2023
- Many other tasks.

To do it with a console, you would have to treat the computer like a video processor: install a video capture card, run the capture through this driver, and display the output in an FSE window (and do it with the lowest possible lag -- a big challenge for a video processor box). Basically your PC becomes the "video processor", a man-in-the-middle box. In theory you could even create a shader/algorithm that converts a stuttery fixed-Hz input framerate into a smooth VRR output framerate (via a configurable frame-dejitter buffer) -- at least for games with predictable stutter mechanics or easily detected motion vectors.
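A hypothetical sketch of that frame-dejitter idea (toy numbers and a toy averaging policy of my own; a real implementation would bound the added latency, clamp against arrival times, and handle dropped frames):

```python
# Toy frame-dejitter buffer: hold a few jittery incoming frames, estimate the
# average frame interval, and re-release frames on an evenly-paced timeline.
# The buffer depth is the configurable latency-vs-smoothness tradeoff.

from collections import deque

def dejitter(arrival_times_ms, depth=3):
    """Map jittery arrival timestamps to smoothed presentation timestamps."""
    buf = deque(maxlen=depth)
    schedule = []
    for t in arrival_times_ms:
        buf.append(t)
        if len(buf) < depth:
            continue                                   # still filling the buffer
        avg_interval = (buf[-1] - buf[0]) / (len(buf) - 1)
        prev = schedule[-1] if schedule else buf[0]
        schedule.append(prev + avg_interval)           # evenly-paced output time
    return schedule

jittery = [0.0, 18.1, 31.0, 52.3, 66.0, 84.9, 99.7]    # ms; nominally ~16.7 ms apart
print([round(t, 1) for t in dejitter(jittery)])        # much steadier spacing
```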

The concept of video processor boxes is old hat. It's noteworthy that I'm very familiar with deinterlacers, scalers, and video processor boxes -- twenty years ago I used to work for RUNCO, Key Digital, HOLO3DGRAPH (Faroudja chip), and the dScaler app -- so I've known my video processors for two decades. I created the world's first open source 3:2 pulldown deinterlacer algorithm (Internet Archive, Year 2000). It turned a Hauppauge TV card into something that beat a $5000 device, helped make HTPCs very popular, and may have kind of accidentally, slowly put Faroudja out of business...

Basically, it's a Hz-virtualizer equivalent of a resolution-virtualizer (scrollable virtual desktop), for various mundane temporal-processing tasks that need to be done at refresh-cycle granularity, independently of underlying framerates. Most processing would use 5% of a modern RTX GPU.

If nobody creates it by 2024, I'll probably eventually create a (fairly large) BountySource for a public Apache/MIT open source Windows IDD driver with open source shader plugins. Feel free to contact/PM me.
 
I imagine that on something like a G-Sync OLED at 240Hz+, a GPU-driven rolling black bar (to reduce sample-and-hold blur) that is frame-rate-invariant in speed and compensates for perceptual brightness variation with frame rate would look close to perfect. Are you listening, Nvidia and AMD? Chance to differentiate your offerings.
 
This needs to be implemented in the scaler/TCON at the panel level. Sub-refresh rolling BFI requires heavy scaler/TCON modifications, and right now most current OLED backplanes are unable to do BFI at less than one refresh cycle.

Software-based BFI only operates at the monolithic refresh cycle level, not the sub-refresh level. Simulating subrefreshes requires multiple output refresh cycles per simulated refresh cycle, if rolling BFI is done in a software-based manner.

So you'd need 960-1920Hz for a good 100%-GPU-based, 100%-software-based rolling 240Hz BFI:
- For 960Hz + a software BFI pattern at 240 cycles a second, that achieves 240/960ths of the original motion blur (3/4 reduction).
- For 1920Hz + a software BFI pattern at 240 cycles a second, that achieves 240/1920ths of the original motion blur (7/8 reduction).
This is because software-based BFI can only be done at the full-refresh / full-framebuffer level.

This is a viable plan (it's a future RetroArch BFIv3 suggestion to simulate a CRT electron beam), but it requires brute excess refresh rate.

The laws of physics for sample and hold mean that, for panels with no hardware-based sub-refresh BFI capability, the best possible blur reduction still leaves (target Hz / available Hz) of the original motion blur.
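The same limit as a tiny calculation, matching the 960Hz/1920Hz figures above (function name is mine, purely illustrative):

```python
# Residual blur after full-refresh software BFI:
# residual fraction = target Hz / available panel Hz.

def residual_blur_fraction(target_hz, available_hz):
    return target_hz / available_hz

for panel_hz in (960, 1920):
    frac = residual_blur_fraction(240, panel_hz)
    print(f"240Hz BFI on a {panel_hz}Hz panel: {frac:.3f}x the blur "
          f"({1 - frac:.1%} reduction)")
# -> 0.250x the blur (75.0% reduction) on 960Hz; 0.125x (87.5% reduction) on 1920Hz
```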

To avoid the need for more Hz to do BFI, it needs hardware-based BFI/rolling scan. Sub-refresh rolling scan needs a custom OLED backplane and panel electronics capable of doing 2 refresh scans simultaneously on the same panel (the same refresh cycle being modified in two places at once) -- then you only need 240Hz, provided a dual-scan mechanism for a rolling scan is built into the OLED backplane.

BBOLED-60hz.png


So panels with the supporting refresh logic have to become available first, before NVIDIA can tweak this.
 
Good info. I was hoping you'd relate that, as I knew from reading some of your articles that it wasn't as simple as he suggested.

My problem with BFI and strobing was always that the refresh rate was too low, so I'd feel eye fatigue. That, and the fact that those methods work best with very high framerate minimums, where I'd rather push the graphics settings harder on the GPU and balance it with VRR, hopefully at a 90fpsHz average (something like 60/75fpsHz <<< 90fpsHz >>> 115fps/117capped) to a 120fpsHz average (90/105fpsHz <<< 117fpsHz capped >>> 117fpsHz capped) or higher. A bigger problem for me now is that current methods still dim the screen and mute the display overall, which makes it unusable for me with HDR. For me, HDR is a bigger deal, tradeoff-wise, than the current BFI implementations. For gaming, HDR is a bigger deal than 4k even (vs. x1440 for example), but luckily I can have both 4k and HDR and a fairly "high" refresh rate now.

Until/unless we ever get a very high rate BFI that still allows 1000nit or more on OLED, I can't see using it myself. If it had to be one or the other, I'd rather the focus be on "extreme" Hz (500Hz - 1000Hz OLED) plus frame amplification tech to fill those Hz, since the resulting blur reduction is more compatible with HDR.
 
Noticed the same. My LG CX 48" is one of the few things that supports both 120 Hz BFI and HDR with sub-1ms response times, so I tried Doom Eternal at 4K 120 Hz with HDR, then tried turning on BFI from the TV. It basically made the game look like it was running in SDR. Of course it's nice to have options, but for me, honestly, 4K 120 Hz with OLED response times is already a very good experience.

I do kinda want a 4K 240 Hz screen now that my RTX 4090 can push those kinds of framerates in something other than competitive shooters, which I do not play at all.
 
OLED doesn't get bright enough to do both, unfortunately.
 