Interest in a brand new 8K 80Hz CRT for $8000?

The best way to accurately emulate a CRT in the future will be to use a 1000Hz+ OLED or discrete-MicroLED display and simply simulate a CRT electron beam in software -- using a GPU shader (or FPGA) that spends many digital refresh cycles emulating one CRT refresh cycle.

It is also a much easier engineering path to 'enhance' existing MicroLED modules for a higher refresh rate. Many already PWM-refresh at 1920Hz, but only by repeating refresh cycles; an FPGA per module can commandeer them into showing unique refresh cycles, using currently off-the-shelf technology, at prices far cheaper than fabbing a SED/FED factory.

It's like playing back a high-speed video of a CRT at real-time speed on an ultra-high-Hz display. It temporally matches the zero-blur and phosphor-decay behaviours of a CRT!

It's possible to shift CRT electron beam control over to software, and simply use many digital refresh cycles to emulate one CRT Hz. I actually personally added a BountySource bounty on a RetroArch emulator item (BFIv3) that incorporates such an algorithm, since display refresh rates are now getting almost high enough to emulate the electron beam of a CRT to within human vision error margins.

Also, I am crossposting what I posted in the Blur Busters Display Research Forum (Area 51); as I am now cited in over 25 peer-reviewed research papers [Google Scholar], I'm a pretty authoritative voice in this territory. Early tests of CRT beam simulation have shown promising results, with the realism of CRT electron beam simulation improving as refresh rates keep going up -- 1000 Hz displays are no longer far away, and will arrive sooner than a FED/SED panel.

BFI is like the pre-HLSL era of MAME. CRT electron beam simulation in software is like "perfect CRT-like BFI on steroids", with a fade-behind rolling-scan effect that can technically be beam-raced with the raster of the GPU output (for sub-refresh-lag CRT simulation).

A CRT electron beam simulator is the temporal equivalent of the spatial MAME HLSL CRT filter.
More resolution = better spatials.
More refresh = better temporals.
Enough refresh rate = perfect low-Hz CRT simulation to the human eye.

I fear that FED/SED will be stillborn, since it makes more sense to use already-manufactured sample parts for a 1000Hz display -- e.g. retrofitting ultra-high-resolution MicroLED modules obtained in sample quantities, then modifying them with one FPGA chip per module to enable a higher refresh rate -- rather than waiting out pandemic-slowed factory supply chains to try to manufacture a FED/SED or a new CRT tube. Computer parallelism can be utilized if different high-Hz modules need to be driven at lower resolutions, so one computer may have to output to a few computers that each drive FPGA-modified MicroLED modules, forming a 1000Hz display out of already-manufactured computer parts waiting in warehouses -- at least for a prototype demonstration display. If you are willing to go lower resolution, it gets cheaper (e.g. low-resolution 32x32 RGB LED matrixes off Alibaba for $10 each already refresh at 1920Hz via the repeat-refresh technique, and can be FPGA-modded for unique refresh cycles too), but since we're trying to reach high resolution, you have to buy high-resolution MicroLED modules instead. There are engineering paths of lower resistance.

At 1000Hz-2000Hz+, with an accurate software-based temporal CRT electron beam simulator (in a GPU shader), we will be able to pass the CRT Turing test (can't tell apart a masked flat Trinitron and a fake-CRT-glassboxed 1000Hz OLED/MicroLED). The challenge is getting good spatials (4K resolution) and good temporals (1000Hz), so 4K 1000Hz should be a baseline goal for a CRT Turing A/B blind test versus a flat tube, though you can get by with lower resolutions (1080p 1000Hz) if you're simulating very tiny Sony PVM CRTs -- ideally you need high DPI to get those phosphor dots resolving individually. But you can prioritize temporals first at a much lower budget (modding existing MicroLED blocks with an FPGA apiece), and use a large viewing distance (wall-sized CRT) to hide the LED-bead effect.

However, the bottom line is that the technology is purchasable today for a buildable prototype -- given sufficient programming skill (FPGA/Xilinx/shaders/etc) for the processing speed necessary to simulate an accurate CRT beam in realtime (including gamma-correct conversion of all the overlapping CRT fades, so you don't get flicker-band effects).

--- Begin unlinked crosspost from Blur Busters Area 51 Forum ---

Still waiting for one to be released... ;)
High-Hz Micro LED, combined with a rolling-scan CRT emulator mode, would now probably produce a vastly superior CRT look to anything any SED/FED ever could.

The problem with SED/FED is that they had to be refreshed differently from a CRT, behaving kinda like plasmas (the prototype SED/FEDs had worse artifacts than a Pioneer Kuro plasma). Instead, I'd rather shift the CRT refresh-pattern control over to software (ASIC, FPGA, GPU shader, or other reprogrammable logic) to get exactly the same perfect CRT look-and-feel.

The only thing is we need brute overkill Hz, to allow CRT refresh cycle emulation over several digital refresh cycles.

Think of this as a temporal version of the spatial CRT filters. We already know how great spatial CRT filters look on a 4K OLED display -- uncannily accurate for a low-rez CRT, you can even see individual simulated phosphor dots! But that alone isn't blur-free. It's possible to do it temporally too, not just spatially -- it just requires many refresh cycles per CRT refresh cycle.

Direct-view discrete micro LED displays (and possibly future OLED) can do all the following:
- Color gamut of CRT
- Perfect blacks
- Bright whites
- High enough resolution for spatial CRT filter texture emulation (see realistic individual phosphor dots!)
- High enough refresh rates for temporal CRT scanning emulation (see realistic zero-blur motion and realistic phosphor decay!)
- HDR capability can also give the brightness-surge headroom to prevent things from being too dark

So you just need an ultra-high-rez, ultra-high-refresh-rate HDR display (such as a 1000Hz OLED or MicroLED), and then the rest of the CRT behaviors can be successfully emulated spatially AND temporally in software.

You can use a 960Hz display to emulate a 60Hz electron-scanned CRT at 16 digital refresh cycles per simulated CRT refresh cycle (each digital frame would be like a frame of a high-speed video of a CRT, but generated in realtime instead). This simulates the electron beam, the phosphor, and the zero motion blur, as long as there is enough brute Hz to allow multiple refresh cycles per simulated CRT Hz.
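To make the idea concrete, here is a minimal sketch in Python of how such a rolling-scan simulation could be structured (my own toy illustration, not Blur Busters' or RetroArch's actual algorithm). It assumes a simple exponential phosphor decay and computes, for each digital sub-refresh, how bright each simulated scanline should be; a real implementation would do this per pixel in a GPU shader, in linear light:

```python
import math

def beam_weights(native_hz=960, crt_hz=60, lines=480, decay_ms=1.5):
    """Per-scanline brightness weights for each digital sub-refresh.

    Models one simulated CRT refresh as a beam sweeping all scanlines
    top-to-bottom over 1/crt_hz seconds, with exponential phosphor decay
    (decay_ms is a hypothetical phosphor time constant).
    Returns weights[k][y] = relative brightness of scanline y during
    digital sub-refresh k.
    """
    subframes = native_hz // crt_hz            # e.g. 960 / 60 = 16
    crt_period_ms = 1000.0 / crt_hz            # e.g. 16.7 ms
    sub_ms = crt_period_ms / subframes         # e.g. ~1.04 ms per sub-refresh
    weights = []
    for k in range(subframes):
        now = (k + 1) * sub_ms                 # end of this sub-refresh
        row = []
        for y in range(lines):
            hit = (y / lines) * crt_period_ms  # when the beam reaches line y
            age = now - hit
            if age < 0:
                row.append(0.0)                # beam hasn't reached this line yet
            elif age <= sub_ms:
                row.append(1.0)                # freshly excited "phosphor"
            else:
                row.append(math.exp(-(age - sub_ms) / decay_ms))  # decaying trail
        weights.append(row)
    return weights

# 960Hz panel simulating a 60Hz CRT = 16 sub-refreshes per CRT frame.
w = beam_weights()
print(len(w), "sub-refreshes x", len(w[0]), "scanlines")
```

The same sketch with native_hz=360 gives only 6 much coarser sub-refreshes per 60Hz CRT frame (the BFIv3-style granularity mentioned elsewhere in this post), and at 240Hz only 4, which is roughly where the effect starts to look usable.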

Incidentally, a 1000fps high speed video of a CRT, played back in real time to a 1000Hz display, accurately preserves a lot of the CRT temporals (zero blur, phosphor decay, same flicker, etc).

Instead, you’d use a software algorithm in the firmware or the GPU, to artificially emulate a CRT tube via piggybacking on the ultra high refresh rate capability of future displays.

Direct-view micro LED panels (without LCD) can exceed the contrast range, color gamut, and resolution of a CRT tube -- a great Venn-diagram superset of CRT capabilities -- so they're a perfect candidate for this type of logic to emulate a CRT tube, with a CRT emulator mode.

That’s the better engineering path to go down, in my opinion.

Although full-fabbed 1000Hz+ panels don't yet exist, DIY 1000Hz+ is possible: if you begin with modified jumbotron LED modules from Alibaba (they already refresh at 1920Hz, they're just repeating 60Hz frames -- they need to be commandeered, with some modifications, into showing 1920 unique images per second), then a prototype can probably be built at less than 1/100th the cost of a SED/FED prototype. There'd be a bit more software development needed, but it could be done in plain shader programming instead of complex FPGA/assembly. It would be low resolution (low DPI) and maybe need to be wall sized.

Alternatively, an early version of the temporal CRT algorithm can be programmed on a 360Hz display (6 refresh cycles of rolling-scan BFI emulation for one 60Hz CRT refresh cycle). It would be something like the algorithm in the proposed RetroArch BFIv3 suggestion. It could either be implemented in the display firmware, or as a GPU shader (using a Windows Indirect Display Driver), or programmed into a specific app (like an emulator such as RetroArch).
 
Last edited:
The best way to emulate a CRT would be a 1000Hz+ OLED or discrete-MicroLED display.

Hey there, Mr. Wet Rag. We don't need no stinkin OLED's in this thread! :)
 
160 Hz on CRT would still have better motion clarity than any currently available LCD.
Outdated info.

As of 2022, there are now crosstalkless LCDs with less motion blur than a CRT. For example, a specially tweaked mode on the XG2431 is much better than the XG270, which ApertureGrille commented can beat a CRT in certain circumstances in his review. We've already improved past that point!

A few perfect-crosstalkless strobed LCDs now exist -- for example, the Oculus Quest 2 LCD, the Valve Index LCD, and the Blur Busters Approved ViewSonic XG2431.

With VT2700 at 100Hz, re-calibrated with Strobe Utility, the XG2431 goes perfectly crosstalkless for top/center/bottom, with less motion blur than a FW900 CRT.


[Image: strobe crosstalk example photo]


The cherrypicked LCDs I am talking about have less crosstalk than the "1%" in this photo -- the crosstalk fell below the human visibility noisefloor thanks to GtG100% being hidden completely in the dark, in an unusually large VBI between refresh cycles.

They succeed because they hide real-world LCD GtG100% in ultra-large blanking intervals (multiple milliseconds). For example, a 100Hz refresh rate with a 1/360sec scanout can have a VBI of (1/100 - 1/360) = 7.2 milliseconds of total darkness, to hide LCD GtG more perfectly in the total-darkness phase of a strobe backlight.

Large VBIs of 5-14ms each can hide the real-world GtG(100%) of all possible color combinations on a 1ms-GtG(90%) panel -- blanking intervals several times taller than the visible resolution are the trick to zeroing out strobe crosstalk, given generous refresh rate headroom.
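As a back-of-envelope check of that arithmetic (a hedged sketch; the 1/360sec scanout figure is taken from the paragraph above, not from any panel datasheet):

```python
def vbi_ms(refresh_hz, scanout_hz):
    """Dark time per refresh cycle when a panel scans out at scanout_hz speed
    but is only refreshed refresh_hz times per second."""
    return (1.0 / refresh_hz - 1.0 / scanout_hz) * 1000.0

# 100Hz refresh with a 1/360sec scanout leaves ~7.2 ms of total darkness per
# refresh cycle, in which LCD GtG can finish before the strobe backlight flash.
print(round(vbi_ms(100, 360), 1))   # 7.2
print(round(vbi_ms(120, 360), 1))   # 5.6
print(round(vbi_ms(60, 360), 1))    # 13.9
```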

Both the Valve Index LCD and the Oculus (er, Meta) Quest 2 utilize this trick, as revealed in their DisplayWeek 2022 presentation. It matches the technique that Blur Busters Approved 2.0 monitors use to get better-than-ULMB, better-than-DyAc, better-than-LightBoost zero-crosstalk operation with custom QFT modes that also simultaneously reduce strobe lag. (It requires recalibration with the ViewSonic Strobe Utility, which is superior to the BenQ Strobe Utility.)

Bottom line: Perfect strobing was recently achieved on LCDs for top/center/bottom (zero ghosts, zero crosstalk, zero coronas, zero blur), but only very few cherrypicked LCDs can do it, and only with the "massive refresh rate headroom" trick. Fortunately, you can't do 240Hz on a Sony FW900 CRT anyway, and it's easy to use a large vertical total with the max-Hz 1080p signal you'd usually use with a FW900 CRT, so you just burn the Hz headroom on crosstalk reduction until you hit the zero-crosstalk refresh rate.

It is also important to retune with an available 100-level or 128-level custom overdrive adjustment (like the one in the XG2431 Strobe Utility I programmed), which matters for temperature compensation (an 18C LCD can ghost slightly more than a 22C LCD, so adjusting overdrive by 1/100th or 2/100th can make crosstalk disappear again for a specific temperature).

Most reviewers don't bother testing the Strobe Utility, so they never test the new zero-crosstalk modes of the ViewSonic XG2431 and most don't know about them -- however, they're there now, better than DyAc / ULMB / LightBoost / any LCD strobing in the past.
 
Last edited:
Hey there, Mr. Wet Rag. We don't need no stinkin OLED's in this thread! :)
Doesn't even have to be OLED!

The thing is, it doesn't matter what panel it is, as long as it has perfect blacks and can do enough Hz to simulate a CRT electron beam, and has near 0ms pixel response time.

- MicroLED modules (modified for high native Hz)
- OLED (give one a Zisworks-style treatment)
- Dual-layer LCD (per-pixel local dimming at 1M:1 contrast), though pixel response may be a dealbreaker, and the needed Hz is not available yet.
- 24 monochrome 1-bit DLP projectors stacked onto the same screen
(doing 1-bit 1920 Hz each, for each bit of the 24-bit color depth, for zero-temporal-dither zero-noise zero-rainbow 1920Hz DLP)
- Modified Jumbotron modules for high native refresh (low DPI notwithstanding)
- Heck, a 1000Hz SED (with no temporal dithering), would be very meta here. ;)
- MicroLED-dimmed LCD (50,000 zones or more preferred), though LCD GtG still degrades beam simulator tests somewhat.
- Dream up your own ultra-Hz workaround
- Etc.

The important variables needed to exceed the CRT venn diagram:

1. Perfect blacks (at least darker than the internal light scatter of a CRT tube)
2. Contrast/color gamut that exceeds the CRT you wish to simulate.
3. Enough spatial resolution to hide screendoor and achieve individually-resolvable simulated phosphor dots (if a spatial CRT filter is used).
4. Enough HDR brightness-surge headroom for the simulated CRT electron beam (to compensate for the darkness of temporal CRT simulation).
5. Enough refresh rate to accurately simulate a CRT electron beam (many digital Hz per CRT Hz) to human vision requirements.
6. Ideally, a size identical to the CRT you wish to simulate (if you want to do an A/B test with a CRT).

Once 1/2/3/4/5/6 are accomplished simultaneously, the rest can be completed in software-based CRT electron beam simulation within the performance capabilities of a shader on today's RTX cards. I did the math -- the only missing tech piece is that the required ultra-high-Hz wide-gamut wide-contrast great-blacks displays don't exist yet.

Put this panel behind an optional grey-tinted tempered glass (to simulate the thick glass of a CRT tube) and then you're ready to do a CRT Turing test -- a blind A/B test between this display and a real flat-CRT tube. It's now technically possible to begin passing, given the spatial (resolution) and temporal (refresh rate) fidelity required to simulate a CRT to retina levels in both dimensions.

The brute overkill spatials and the brute overkill refresh allow simultaneously accurate spatial and temporal simulation of a CRT.

Temporally, one can match a low-Hz CRT via brute ultra-Hz with an electron beam simulator shader:
- same zero blur motion
- same phosphor decay trails that feel correctly analog
- same rolling flicker effect (even when you eyeroll)
- same green trailing for bright objects on black
- same temporal analog feel
- same flicker ergonomics (identical eyestrain as the CRT it's emulating)

And technically, since it's a simulated CRT beam in software (piggybacking on the brute Hz), you can modify the electron beam simulator parameters (much like MAME HLSL settings) as needed. If you want more or less of the above, you can speed up or slow down the phosphor decay to get more or less flicker and motion blur, with custom ergonomics tailored to you -- or turn off the green tint of the phosphor trails if you hated that part -- and you could even have saveable profiles/shaders for the different CRT tubes being temporally emulated. Let's call these "temporal CRT filters".

Heck, in the future, you could even mix-and-match existing spatial CRT filters (MAME HLSL quality) with any temporal CRT filter (the concept I'm talking about) to create whatever custom CRT look-and-feel you need.

Many already know MAME HLSL looks like crap on a 1080p LCD, but looks CRT-spatial-accurate on a small 4K OLED (like those ASUS ProArt displays, or the Razer 15" OLED laptops) -- the individual phosphor dots are separately resolvable, at least if you're simulating an arcade CRT on a 4K OLED anyway. So we already know we can get there spatially. What I'm talking about is the missing temporal piece of the puzzle.

Prototyping with today's commercially available parts may miss one or two of 1/2/3/4/5/6 (e.g. a giant screen built from 24 DLP projectors is not the same size as a CRT) but would be sufficient to prove out the concept before progressing to mass-manufacture. And it's certainly a cheaper engineering path than rebuilding a SED/FED/CRT factory for mass-production -- which can translate to more profit per unit sold -- and lower NRE costs for more certain results.

Also, it's more display-technology independent, so one human generation from now, when retina-resolution four-digit-Hz display options are more widespread, the same open-source CRT electron beam simulator can accurately match the temporal look of a CRT too. Get a better display, get better CRT emulation -- a 4000Hz display can more accurately simulate 0.25ms phosphor decay curves. For now, 1ms decay granularity (1000Hz) is excellent enough for most 320x240 emulator content, even for fast-scrolling Nintendo (NES) games.

In other words, CRT simulation doesn't have to be tied to display technology, once the display's spatial (rez) & temporal (Hz) & contrast/gamut is sufficient to simulate the target CRT.

Today's shader programmers can still prototype a beam simulator before the tech arrives -- at lower granularity (4-6 refresh cycles for a 60Hz CRT on a 240-360Hz monitor), since the programming scales easily to higher refresh rates for even more accurate CRT electron beam simulation at finer granularities. It's very adaptable and very scalable in the refresh rate race of the current era.

You could target 1600 Hz to do 20 refresh cycles of 80 Hz CRT simulation.

Spatial CRT filter is optional: Keep the 4K or 8K emulated electron beam instead: Native 4K or 8K CRT
And you can disable the spatial CRT filter for apps where you aren't interested in simulating the spatial textures of a retro CRT. Just use the proposed temporal CRT filter only, not an existing spatial CRT filter -- and use the native 4K-ness or 8K-ness instead (only temporally simulating a CRT tube). Basically it'd feel like a 4K or 8K CRT tube. And you've got your 8K 80Hz CRT for $8000 (or less) via a totally different engineering path than a FED/SED/CRT tube -- one that is cheaper to fab up with factory manufacturing capabilities currently found on this planet. Some modding is required for the Hz needed, but it's not as expensive as fabbing up a SED/FED/CRT manufacturing line.

The fact is that left-field engineering paths to accurate CRT simulation are possible. There are many options to simulate a CRT accurately that many aren't intuitively thinking about yet.
 
Last edited:
Doesn't even have to be OLED!

All of that is great info as always… but you do realize I wasn’t being serious right? :D. Anyways. Love that you’re in here posting again!

Edit - when you say "VT2700", do you mean vertical totals? I may poke you privately about this. I've tried to extend the vertical total on my XG2431 and have thus far been unsuccessful. I still have crosstalk on the top and bottom of my screen at 60Hz.
 
Last edited:
Hey there, Mr. Wet Rag. We don't need no stinkin OLED's in this thread! :)
I know you were kinda kidding with me, but don't forget we're the Hz mythbusters. Blur Busters famously replies with big walls of text to these kinds of things. ;)

Skeptical people read these forums, and I am here to educate on the many legitimate alternate engineering paths to CRT preservation that have recently become technologically available -- such as realtime CRT electron beam simulators written in GPU shaders or other high-performance processing (whether on a PC, in a video processor, or in display firmware), given a destination display refresh rate sufficient to simulate a flying-raster electron beam in real time to within human vision error margins.

Given the velocity of the refresh rate race -- it is Good To Know information before someone spends millions tooling-up a retro-manufacturing factory in this supply-crisis constrained economy.

After reading those Chief Blur Buster comments (very interesting!) I'm just going to pat my Sony BVM CRT monitor for doing such a good job for retro gaming.
They're pretty good beasts. Baby it!

______

But, long-term is my worry.

What about CRT in 10 years? 20 years? 100 years?

For a display-independent solution (just supply the needed color gamut, resolution and hertz), we need an open-source CRT electron beam simulator to protect this legacy for the day when a working 100-year-old MultiSync CRT is as rare as a working 100-year-old car.

That being said, we still need a long-term plan for the next 10-20 years as more CRTs die, and PVM / BVM / etc get more impossible to find. That's where our CRT electron beam emulation initiative comes in, since it is a perfect fit with the ongoing progress of the refresh rate race that's already engineering towards 1000Hz+ display refresh rates.

I can vouch for the viability of this engineering path, at least over the long term (aka "this decade"). We internally worked on a crude prototype CRT beam simulator running on a prototype 240Hz OLED. Although it had to step phosphor decay in 4ms increments (due to the long refresh times of 1/240sec), it showed vastly superior results to plain old BFI, and 60 Hz GPU-based electron beam simulation had much less eye-strain than 60 Hz BFI or 60 Hz strobing.

Beam simulation is much less accurate on a 240 Hz LCD because of LCD GtG limitations, but with newly announced 480 Hz LCDs and 240 Hz OLEDs, we are reaching territory where it's worthwhile to begin programming open-source beam simulators that already improve on the quality of current crude 60-85Hz BFI strobing implementations. These refresh rates are still too low to pass the CRT Turing A/B blind test, but they still look vastly superior to plain old BFI. Minimally, these deserve to be emulator features today, and long-term a video-processor-box feature (like an enhanced HDFury etc) or a firmware feature, so that it works on all video sources.

CRT Emulator TestUFO demo is viable -- working on it as a spare-time project
P.S. I am determining the viability of porting this CRT simulator to a TestUFO test. There is actually enough performance in a web browser on modern platforms to do it, given sufficient optimization and tricks -- including correctly gamma-correcting the rasters of overlapping fades between adjacent refresh cycles. RGB(64,64,64) followed by RGB(64,64,64) over two refresh cycles does not match a single refresh cycle of RGB(128,128,128), so a lot of real-time gamma correction needs to be built into the electron beam simulator. Correctly temporally-blending adjacent refresh cycles is important in any CRT beam simulator algorithm, and the CRT beam simulator needs to know the current gamma of the display (e.g. 2.2) to do the math correctly.
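For illustration, here is a tiny sketch of that gamma math, assuming a simple power-law display gamma of 2.2 (real sRGB-like transfer functions differ slightly, and this is my own toy example rather than the TestUFO code). It shows why two refresh cycles of RGB(64) emit far less light than one refresh cycle of RGB(128), and how a target brightness is split correctly across sub-refreshes in linear light:

```python
GAMMA = 2.2

def to_linear(v8):        # 8-bit encoded value -> linear light (0..1)
    return (v8 / 255.0) ** GAMMA

def to_encoded(linear):   # linear light -> 8-bit encoded value
    return round(255.0 * linear ** (1.0 / GAMMA))

# Naive split: two refresh cycles of 64 emit far less light than one cycle of 128.
print(2 * to_linear(64), "vs", to_linear(128))    # ~0.10 vs ~0.22

# Gamma-correct split: divide the *linear* energy of 128 across 2 sub-refreshes.
per_subframe = to_linear(128) / 2
print(to_encoded(per_subframe))                   # ~93 per sub-refresh, not 64
```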

I have come up with some clever algorithms and lookup-table tricks to make it viable within an average GPU-accelerated web browser's performance budget, precomputed upon loading TestUFO and detecting the current refresh rate. This may turn a powerful-GPU requirement into something that runs on lesser hardware.

My goal is a CRT simulator TestUFO demo by end of 2022 or early 2023 -- one that looks amazing on the upcoming 240Hz OLEDs coming to market. This will be the first Joe Q Public introduction to a simple CRT electron beam simulator. This specific TestUFO demo will probably refuse to work on any display below 240Hz, since a minimum of 4 digital refresh cycles is required to make the phosphor simulation look reasonably correct, and it will look odd on LCDs with bad overdrive -- so the demo will be much better on well-tuned LCDs (360Hz monitors with no corona artifacts) or any 240Hz+ OLED.

I may add a slow-motion mode to this upcoming CRT emulator TestUFO. It will look like frame-stepping a high-speed video of a CRT -- far more "analog" than any past BFI or rolling-BFI implementations. The colors will look very dim/dull on most LCDs but great on OLED, though HDR may need to be added to compensate for the brightness loss. I plan to use multiple precomputed PNGHDR/RGBE + alpha bitmaps to accelerate the CRT beam simulator down to something as simple as bitmap math (using destinationHz/emulateHz as the number of bitmap masks) instead of per-pixel operations.
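A hedged toy sketch of that precompute idea (my own illustration, not the actual TestUFO implementation): for each distinct beam/decay weight, bake a 256-entry lookup table that maps an encoded source value straight to an encoded output value with the gamma conversion folded in, so the per-pixel runtime cost becomes a single table lookup (or, equivalently, a premultiplied mask):

```python
GAMMA = 2.2  # assumed display gamma

def bake_lut(weight):
    """256-entry table: encoded source value -> encoded output value,
    with linear-light scaling by `weight` and the gamma round-trip baked in."""
    lut = []
    for v in range(256):
        linear = (v / 255.0) ** GAMMA * weight
        lut.append(round(255.0 * linear ** (1.0 / GAMMA)))
    return lut

# One LUT per distinct beam weight; the weights below are hypothetical decay
# steps for a 240Hz panel (fresh line, then 1, 2, 3 sub-refreshes old).
weights = [1.00, 0.55, 0.30, 0.17]
LUTS = [bake_lut(w) for w in weights]   # precomputed once per mode change

# Runtime cost per pixel is now just an indexed lookup:
print(LUTS[0][200], LUTS[1][200], LUTS[3][200])
```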

I'm finding many ways to optimize the performance of a CRT beam simulator to fit into the TestUFO performance budget, with some promising results internally. This will allow more show-and-tell ("look at how viable a CRT beam simulator can look, and look at how it improves with more target refresh rate!"). Few will have 240Hz OLEDs, but at least the show-and-tell can get started on a Razer 15" 240Hz OLED laptop that I plan to get this year to spread the (simulated) CRT gospel.

All of that is great info as always… but you do realize I wasn’t being serious right? :D. Anyways. Love that you’re in here posting again!

Edit - when you say "VT2700", do you mean vertical totals? I may poke you privately about this. I've tried to extend the vertical total on my XG2431 and have thus far been unsuccessful. I still have crosstalk on the top and bottom of my screen at 60Hz.
Correct. Large Vertical Totals.
Like this signal diagram, except the offscreen parts can be super-tall (even taller than the visible resolution).

[Image: video signal timing diagram]


It's also known as Quick Frame Transport, since it speeds up scanout. With these tricks, it is possible to have a 60Hz or 120Hz refresh cycle scanning out in 1/240sec, with a blanking interval 3x bigger than the visible resolution. This can reduce the input lag of many (but not all) sync technologies. The secondary benefit is strobe crosstalk reduction.

The HDMI Forum coined the term, but it works on all display cables with a custom large-VT resolution -- in both the analog and digital domains -- whether it be a VGA cable or a DisplayPort cable.



So you're killing two birds with one stone -- reducing crosstalk and reducing strobe lag.

[Animated image: strobe crosstalk decreasing as the Vertical Total is enlarged]


Given sufficiently large VTs (about 2-3x bigger than the active resolution), the strobe crosstalk can disappear well below 1%, and below the human-visibility noisefloor (less than the difference between two adjacent RGB values).

You should reply to the "Quick Frame Transport HOWTO" thread in the Blur Busters Forums if you need help calculating the correct ToastyX CRU numbers for a zero-crosstalk mode. But to summarize: you lock the horizontal scan rate and pixel clock, and lower the refresh rate (down from a working 240Hz mode) only by enlarging the Vertical Total. This is much easier to do in ToastyX CRU than in NVIDIA Control Panel, because CRU has radio buttons for locking the horizontal scan rate and pixel clock of the max-Hz mode. When creating QFT modes, you don't edit the refresh rate directly -- you lower it by enlarging the VT of a max-Hz mode.
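Here is a small hedged sketch of the arithmetic behind that CRU procedure, using illustrative numbers rather than the XG2431's exact factory timings: hold the pixel clock and Horizontal Total at their max-Hz values, then solve for the Vertical Total that produces the lower target refresh rate; all the extra rows become an oversized blanking interval.

```python
def qft_mode(pixel_clock_hz, h_total, v_active, target_hz):
    """Large-Vertical-Total (QFT-style) mode: keep the pixel clock and
    horizontal total of the max-Hz mode, lower refresh only by growing VT."""
    v_total = round(pixel_clock_hz / (h_total * target_hz))
    line_time = h_total / pixel_clock_hz
    vbi_ms = (v_total - v_active) * line_time * 1000.0
    return v_total, vbi_ms

# Hypothetical 1920x1080 panel whose max-Hz mode is 240Hz at HT=2200, VT=1125
# (pixel clock = 2200 * 1125 * 240 = 594 MHz). Re-timed to 100Hz:
vt, vbi = qft_mode(594_000_000, 2200, 1080, 100)
print(vt, round(vbi, 1))   # VT = 2700, ~6.0 ms of dark blanking per refresh
```

With these assumed numbers the math happens to land on VT2700 at 100Hz, the same ballpark as the VT2700 mode mentioned earlier in this thread.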

Eventually new software will make QFT plug-and-play, and theoretically the EDID can be updated with QFT modes -- but manufacturers need to opt in to doing that.

I'm able to exceed VT4500 on my ViewSonic XG2431, but it takes really specific configuration and numbers to pull it off, and sometimes you have to do it via a DisplayID-compatible extension block instead of a normal EDID, because of some math limitations -- the old EDID timing specs sometimes glitch with really weird large-VT modes.

In general, most CRT enthusiasts don't know how to optimize an LCD for zero-crosstalk operation, especially the refresh rate headroom trick (buying the highest-Hz strobe-tuned monitor you can afford, and using the generous Hz margin to reduce crosstalk at your lower target CRT Hz). But there are now options that blow away a Sony FW900 in motion clarity.

We are lucky that the manufacturer (ViewSonic) let us do DIY strobe tuning on XG2431 as part of the Blur Busters Approved 2.0 certification, in the era of reluctance to add plug-n-play non-VESA "QFT" modes in EDID. So end users can get better-than-factory strobe tuning, much as artists can get better-than-factory color with a colorimeter. Basically the Strobe Utility is metaphorically the strobe equivalent (temporal dimension) of a colorimeter (gamut dimension).

So we do have to jump through more hoops for the perfect zero-crosstalk top/center/bottom mode, as it is not available out of the box on the XG2431, but it's certainly doable if you use gigantic refresh rate headroom, a large vertical total, and a Strobe Utility re-calibration.
 
Last edited:
The CRT doesn't currently seem to be an option, but I might be able to do 3840x2400@200Hz OLED soon
That would emulate 50Hz PAL CRT usably with a CRT electron beam simulator. (The 4K-ness will allow accurate spatial CRT simulation).

Better begin staffing up if you're serious about it. It's a lot of logistics beyond just talk.

But this is still, to Blur Busters, a serious conversation to be had, because we have indie display hackers like display overclockers and strobe modders (like an indie display-hacker in our forums who successfully added VRR-DyAc to a BenQ XL2546 by attaching an Arduino to its strobe backlight controller). And we've got a few forum members who overclocked a 144Hz BenQ XL2411T to 268Hz+ through the "out of range dismiss" hack one of our forum members discovered. So, being a community of modders, there's nothing stopping a DIY 1000Hz display, much like the indie 240Hz and 480Hz displays built by the one-man ZisWorks outfit! (I would not recommend going that shoestring, though.) Or perhaps a DIY 240Hz OLED, if you figure out how to modify the TCON of an OLED, etc. And many forum members know the crowdfunded monitors of Eve Spectrum (despite the controversies, let's consider the relevant "indie" line items and lessons learned / mistakes to avoid).

So there's precedent for DIY/crowdsourced monitor technology improvements. A new ZisWorks might be lurking within you if you have the skills, who knows? But you have to deliver, like photos of an incremental prototype, ugly as it may be, maybe learn to become a YouTuber to get a louder bullhorn about it (or ally with someone), etc. Get the views / funds / buzz / etc necessary to build a prototype.

1000Hz ~1080p-ish prototypes (if you're willing to compromise on certain quality line-items) can be built for well under $50K via a lot of (real, literal) duct tape and FPGA "glue", and a little bit of blurbusteresque left-field-think.

Some steps are needed, like the little-known Windows Insider Build registry tweak to break the 500Hz Windows restriction.

One of those inventor masterclasses ("how to become a Kickstarter manufacturer" type of seminars with high reviews) gives lots of delicious information about contacting supply chains, even if you don't plan to crowdfund. If you want some "inventor inspiration" unrelated to displays for now, even Mark Rober's masterclass can be a good starter, or maybe more appropriately an Arduino masterclass of some sort that operates a 32x32 RGB matrix. That might be a perfect starter toy! Then move up to more advanced masterclasses that teach you how to communicate with Chinese factories to get things done (warning: pandemic lockdowns there have made things complicated at the moment, but you can twiddle your thumbs for now while studying masterclasses until you're ready to try the leap). This might do you good, to get the skills and logistics necessary to track down the contacts needed to fab up a prototype while navigating supply chain hassles, etc. Go at it, and deliver. ;)

That being said, to avoid compromising on resolution and color gamut, a 240Hz OLED is really a wonderful starting base, since the near-0ms GtG of OLED makes an OLED at a given Hz better than an LCD at ~1.5x-2.0x that refresh rate. A lot more LCD refresh rate differential is needed to compensate for LCD's GtG slowness. It's easier to tell apart a 240Hz-vs-360Hz OLED than a 240Hz-vs-480Hz LCD, because GtG is not an error margin in the blur comparisons.

So, I'd love to see 240Hz OLED for "usable" 60Hz CRT beam simulation. 4 refresh cycles per CRT Hz is kind of the bare minimum in my internal CRT electron beam simulator experiments, assuming GtG is near zero (as it is on OLED). This is not a classical rectangular rolling scan of simple rolling BFI -- but an actual gamma-corrected phosphor-fadebehind CRT electron beam simulated in real time for the timeslice of each output digital refresh cycle.

If the 200Hz is due to a cable limitation, there's a workaround -- bump 200 to 240 by using manual custom timings. You can use reduced porches instead of CVT-R to fit 3840x2400 at 240Hz on a cable that can only do 3840x2400 at 200Hz. The Horizontal Total of industry-standard 4K is a very wasteful 4400 pixels, which means more than 10% of cable bandwidth is wasted on HBI signal (4400/3840, about 14% dummy bandwidth). This is the same as 1980s Japanese analog MUSE HD, which used almost exactly 14% of a scanline for HBI (HSYNC + horizontal porches); the 4K standard simply horizontally and vertically doubled the ATSC standard. Assuming 3840x2400 is derived from the bandwidth-hog 4K signal timings, there's almost a 20% increase in visible pixel dotclock available by reducing blankings to the bare minimums. That can be redirected to a higher refresh rate by using a custom EDID instead of a DMT / CVT / CVT-R computed EDID. Or at least delete the "OUT OF RANGE" cop in the firmware so the panel can be overclocked with tight-porch settings. A near-zero-porch 3840x2400 signal can turn 200Hz into roughly ~240Hz (e.g. HT3843, VT2403 for a 3840x2400 signal is doable in custom resolution utilities on modern cables, with a 1-pixel front porch, 1-pixel sync, and 1-pixel back porch) -- some displays' HDMI/DP controllers tolerate such ultratiny blanking intervals, which helps add more refresh rate headroom.
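A hedged sketch of that porch-reduction arithmetic (the original 200Hz mode's Vertical Total isn't stated above, so the starting VT below is a guess): at a fixed pixel clock, the achievable refresh rate scales inversely with Horizontal Total x Vertical Total, so shrinking porches to near zero converts wasted blanking bandwidth into extra Hz.

```python
def max_hz(pixel_clock_hz, h_total, v_total):
    """Refresh rate achievable at a given pixel clock with given total timings."""
    return pixel_clock_hz / (h_total * v_total)

# Assumed starting point: 3840x2400 @ 200Hz using wasteful 4K-style blanking
# (HT = 4400 from the post; VT = 2500 is a guess). The cable's pixel clock:
pixel_clock = 4400 * 2500 * 200    # = 2.2 GHz, held fixed

# Tight porches from the post: HT = 3843, VT = 2403 (1-pixel porches and sync).
print(round(max_hz(pixel_clock, 3843, 2403), 1))   # ~238 Hz on the same cable
```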

More Hz is better for more temporally accurate simulation of a CRT electron beam -- but OLEDs running at 200Hz (PAL) and 240Hz (NTSC) are where CRT beam simulators start to look usably superior to all past software BFI algorithms (4 refresh cycles per CRT Hz, assuming GtG is 0 or as close as possible).

For future CRT beam simulator algorithms for matching the retro look of CRT scanning and its phosphor physics -- LCDs need to be higher Hz (360Hz), preferably locally dimmed + HDR -- to begin looking usably good. But all of this is coming eventually (240Hz OLEDs and locally-dimmed 360Hz LCDs) in less than two years from now.

That being said, a 4K 200-240Hz OLED would really jump ahead. Sufficient Hz, gamut, contrast, and brightness are needed to push the "meh" to the "wow", even for accurately temporally simulating the look and feel of a low-Hz CRT via a software-based CRT scanning electron beam simulator.
 
Last edited:
My 480p capable CRT which was used to play Gamecube, Xbox (and a few other consoles) died a few months ago and finding someone to fix it is proving to be a hassle so I’ve been tempted to pull the trigger on one of the 280Hz 1440p low haze (glossy) EVE Spectrum monitors.

At the moment I’ve hooked my consoles up to my 24 inch Sony OLED PVM (with rolling scan) but it’s a tad smaller than I’d like.

My question to Chief Blur Buster:

Would the EVE spectrum running in 60Hz single strobe (with 280Hz headroom) be better than my OLED PVM with rolling scan if my objective is to get closer to CRT like motion resolution?

Can you foresee any other possible upgrade paths for retro console gamers who want to keep using original consoles with 60Hz content but don’t have access to CRTs?
 
Last edited:
160 Hz on CRT would still have better motion clarity than any currently available LCD.

The inconvenient truth: a CRT even at 60Hz still has better motion clarity than any currently available LCD, OLED, IPS, etc. modern monitor or TV -- especially without massively losing brightness, and without the more aggressive flicker (worse than any CRT monitor) that even those "better than CRT motion quality" ViewSonic modern monitors suffer from.

Outdated info.

As of 2022, there are crosstalkless LCDs with less motion blur than a CRT now. For example, a specially tweaked mode on the XG2431 is much better than the XG270 that ApertureGrille commented can beat a CRT in certain circumstances (in his review). We've already improved past that point!

A few perfect-crosstalkless strobed LCDs now exist. For Oculus Quest 2 LCD, Valve Index LCD, and the Blur Busters Approved ViewSonic XG2431.

With VT2700 at 100Hz, re-calibrated with Strobe Utility, the XG2431 goes perfectly crosstalkless for top/center/bottom, with less motion blur than a FW900 CRT.

And what about the massive brightness loss (much lower than a CRT) and the more aggressive flicker than a CRT at 60Hz? Do those tweaked modes improve that? The XG2431 suffers from massive brightness loss, like the XG270, in its only CRT-clarity-matching mode (PureXP Ultra), and it has more aggressive flicker than CRTs in its 60Hz single-strobe mode. It would not make sense to claim the XG2431 has better motion clarity than the FW900 or any other CRT if those flaws are still present even with those tweaks, leaving the XG2431's motion-quality strobe mode poorly enjoyable compared to a CRT monitor.
 
Blow away an FW900 in a meaningful way for motion?

FWIW, if anything, when I trace the UFO test by eye it just looks totally clear on this screen. (FW900 November 2003. Low hours until WFH.)
 
And what about the massive brightness loss (much lower than a CRT) and the more aggressive flicker than a CRT at 60Hz? Do those tweaked modes improve that? [...]
It is true that the squarewave strobing is harsh.

That's exactly why I'm recommending a more proper electron beam emulator instead as a superior method of 60 Hz strobing in the longer term.

I was replying to "160 Hz on CRT would still have better motion clarity than any currently available LCD."
That statement was surgical -- it didn't mention blacks / colors / brightness / strobe comfort.

CRT is better if you want the color, the blacks, and the strobed brightness, or if you are more sensitive to flicker.
Thankfully, the XG2431 does provide far more flexibility than XG270, which allows many more "better motion resolution than CRT" modes.

However (ignoring colors, brightness, imperfect blacks, etc.), the point is that motion recently became clearer on a few cherrypicked strobed LCDs than on common flagship computer CRTs like the FW900 -- more motion resolution. This is because of the higher resolving resolution of LCDs, the highly tunable strobe pulse width, and the lack of strobe crosstalk (versus extant phosphor ghosting). This is borne out in ApertureGrille's great review of the XG270, where he said one of the modes (the 119Hz mode) was clearer than a CRT. The XG2431 is much more flexible, since it is fully strobe re-tunable by the end user.

Notice, I did not say LCD is superior in all aspects -- but on motion clarity and motion resolution, CRT is already beaten (ignoring color / brightness / grey blacks), which is what I was addressing. That much is true. If you wanted all the benefits of a CRT, then CRT is definitely better. However, if your priority was maximum motion resolution (and you didn't mind the disadvantages), a well-tuned strobed LCD can now win that line item.

This is no new info, especially if you have actually seen it happen on any device -- such as a Valve Index or Quest 2 instead of an XG2431.

Strobing, yes, is a humankind band-aid for now. The ultimate is retina refresh rates. And for simulating a CRT, develop an open source CRT beam simulator on a retina refresh rate display.

Also, it is true you typically need 100Hz+ strobe to be as comfortable as a 75-85Hz CRT -- the squarewave strobe requires a little more excess refresh rate to compensate.

Notice, I said, surgically, specifically "motion clarity" and "motion resolution". CRTs are still better for blacks and color and flicker-per-Hz.

I'm a big fan of CRTs. But we have to make technological progress with the technology we have today. It's not perfect, but it's my job to educate people that strobing has improved massively over the years. You can choose compromises like 100Hz or 120Hz strobe modes (to avoid flicker eyestrain) and re-tune with Strobe Utility to settings that produce better results at all refresh rates than ApertureGrille's XG270 review -- and even choose intermediate settings (e.g. 40 different levels of PureXP Custom via Strobe Utility instead of just 4 in the OSD) that tip the fence towards LCD rather than CRT for more refresh rates than the XG270 did. Definitely not all modes, certainly, but there are more modes available on the XG2431 than the XG270 that can beat the motion resolution of a Sony FW900 (ignoring all other metrics).

Most agree, though, that the complete disappearance of strobe crosstalk is a huge achievement for LCDs, regardless of other LCD problems. Many other attributes need to improve, though.

Blow away an FW900 in a meaningful way for motion?

FWIW, if anything, when I trace the UFO test by eye it just looks totally clear on this screen. (FW900 November 2003. Low hours until WFH.)
Motion resolution.

Many CRTs have the attribute of phosphor decay (bright objects on black -- for example, full-screen content at night) which can slightly limit motion resolution at ultra-fast motion speeds (e.g. 3000 pixels/sec) -- see the example photos in ApertureGrille's XG270 test. The CRT performs better, until the strobe settings are adjusted (albeit dark).

Also, many of us lower CRT brightness anyway, so adjusting strobe brightness to match is not a problem -- it depends on your needs for color, gamut, brightness, etc.

A great example of LCD beating CRT motion clarity is opening TestUFO in the Quest 2 VR browser and maximizing it to a bigger full-screen browser window; it produces results with better motion resolution than a CRT tube. (The Quest 2 strobes at 0.3ms MPRT fairly brightly.) Certainly, CRT still produces better color gamut and blacks, but in the motion resolution department, that line item is beaten already.

Other line items are also concurrently improving (colors, strobe brightness, blacks via eventual 5-digit-zone-count MiniLED local-dimming backlights with enough LED density to have less bloom than phosphor bloom -- or simply high-Hz OLED). Now that the science of crosstalkless LCDs has been achieved, we can keep improving upon it -- while also concurrently advocating, let's say, an electron beam simulator, which is a noble long-term display-independent goal.
 
Last edited:
EDIT: I'm just responding directly to the OP. I'm not commenting on what anyone else is talking about in here.

No. The LG C2 42" OLED can be had on sale today for less than $1400.
8k gets me precisely nothing. And the C2 does 4k 120Hz with either GSync Ultimate or Freesync Premium through HDMI 2.1.
It has no field curvature, it's "pixel perfect" in terms of geometry.
The C2 is capable of displaying HDR - no CRT can. Calibration for all ray guns is basically designed to be around 100 cd/m2, which is supposed to pair with theater projection only being able to display 48 IRE (which for both is a technical limitation). As a result, the C2 can display much larger gamuts than any CRT is capable of.
OLED also is capable of much higher dynamic range.
I know for certain I'll be able to calibrate the C2 more accurately.
Size and weight is also unfavorable for CRT.
So is cost. ($8000 vs $1400 is a bit obvious...)

Basically there is nothing that any CRT display can offer me at this point that OLED cannot do better, other than perhaps being smaller than 42" (I would prefer around 32" 4k). However considering the depth of CRT, I would take larger display area as "being a problem" over 1' of depth being a problem. I know which would fit better on my desk, that's for sure.

I also care far less about gaming and far more about pure picture quality. It would be nice to color grade, or at least use an OLED for reference and start dabbling with HDR grades which I can't do now.

EDIT: Just placing this here for my own reference.
 
Last edited:
EDIT: I'm just responding directly to the OP. I'm not commenting on what anyone else is talking about in here.

No. The LG C2 42" OLED can be had on sale today for less than $1400.
8k gets me precisely nothing. And the C2 does 4k 120Hz with either GSync Ultimate or Freesync Premium through HDMI 2.1.
It has no field curvature, it's "pixel perfect" in terms of geometry.
The C2 is capable of displaying HDR - no CRT can. Calibration for all ray guns is basically designed to be around 100cdm2, which is supposed to pair with theater projection of only being able to display 48 IRE (which for both is a technical limitation). The C2 as a result can display, much larger gamuts than any CRT is capable of.
OLED also is capable of much higher dynamic range.
I know for certain I'll be able to calibrate the C2 more accurately.
Size and weight is also unfavorable for CRT.
So is cost. ($8000 vs $1400 is a bit obvious...)

Basically there is nothing that any CRT display can offer me at this point that OLED cannot do better, other than perhaps being smaller than 42" (I would prefer around 32" 4k). However considering the depth of CRT, I would take larger display area as "being a problem" over 1' of depth being a problem. I know which would fit better on my desk, that's for sure.

I also care far less about gaming and far more about pure picture quality. It would be nice to color grade, or at least use an OLED for reference and start dabbling with HDR grades which I can't do now.

EDIT: Just placing this here for my own reference.

Nobody in this thread gives a fuck about the C2. 120hz OLED is SOOOOOOOOOOO 2019 grandpahhhhh!
LG needs to step it up with 175hz+ for you to even begin breathing its name in the same sentence of CRT motion goodness.
 
............

An interesting XG2431 YouTube video review I found, in which they mention its strobing modes and its best "CRT motion quality or better than CRT" mode's brightness of just 53 nits!! (Funny that you come here to advertise the XG2431's "greatness" against CRT motion but don't mention anything about this video review, only ApertureGrille's XG270 review...... OK, I know your answer: Blur Busters is not a review site... got it!)

Linked at exactly 1:28.


A quote from the reviewer at that timestamp:

"53 nits is pretty dark, I doubt most people, if anybody, are going to use this, and if you do, you're only going to be able to use this in a really dark room with blackout curtains and no light in it, because 53 nits is barely visible in daytime."

I highly doubt any serious CRT gamer will lower his CRT luminance to such a low level, so claiming "many of us lower CRT brightness anyway, so adjusting strobe brightness to match is not a problem" doesn't hold.


Also, come on! Phosphor decay trails on CRTs are only noticeable in certain circumstances, especially on black backgrounds, and they vanish fast enough that they do not affect CRT motion resolution; the moving object stays perfectly clear and the phosphor trail is only visible to one side of it. If we are going to compare the CRT's few flaws against the XG2431's many strobing flaws: the very low brightness of the XG2431 in its matching-or-"better" motion-quality strobing mode lowers the brightness badly and permanently whenever that mode is active, not just in certain circumstances, along with its worse flicker perception than a CRT in its 60Hz single-strobe mode, and all the other flaws that come with its best strobing mode.

A quote from you, from the comments on that XG2431 YouTube review video:

[screenshot of the quoted YouTube comment]


So one will only be able to get "perfect" crosstalk-less quality at a few limited refresh rates, around 100-120Hz at the cost of flicker, or 180Hz ideally with VT tweaks just to compromise between crosstalk and flicker, also introducing a lag tradeoff (and hence requiring a GPU capable of running a constant 100/120/180 fps to achieve that). And this is the monitor you come to these forums pretending to make people, especially CRT enthusiasts like me, believe has better motion quality than the FW900 we need? Plagued with all those tradeoffs, flaws, and limited refresh rates? Come on!!

In conclusion: CRTs are crosstalk-free at any refresh rate, have less noticeable flicker even at 60Hz, have lifelike clear motion clarity at any refresh rate and any resolution, have a barely noticeable and fast-vanishing phosphor trail behind the moving object that doesn't affect its motion resolution and is only noticeable in certain circumstances (especially on black backgrounds), don't need such low brightness levels to achieve lifelike motion clarity, and have excellent latency at any refresh rate and resolution -- and you come here to try to psychologically convince people the XG2431 has better motion quality than an FW900? Yes, I guess your answer: you are talking about just "motion quality", not all the other aspects. It is simply pointless to use the "better than CRT" motion mode if it forces you into a boringly dim display, more noticeable flicker than CRTs for 60Hz content, a crosstalk-"free" screen only usable at high refresh rates of 100+ Hz with a GPU capable of a constant 100 fps, and compromised latency..... Honestly, those are a lot of tradeoffs and flaws just to "enjoy" (if one can call it "enjoyment") "CRT motion clarity", or to pretend to replace a good-working-condition CRT, regardless of whether it's an FW900 or another CRT monitor, with something like the XG2431.

Let's be honest here: this is obviously done by you in order to improve XG2431 sales and ensure your commission from ViewSonic, rather than to "educate". I'm not saying that is something bad in itself, but as you did with the XG270, pretending to psychologically convince CRT users to "upgrade" to one of these ViewSonic monitors at the cost of their many flaws that CRTs don't suffer -- flaws which you don't mention in a clear, direct manner (if I didn't mention those flaws you would never have mentioned them) -- respectfully, I have to say, is really disappointing, and has sadly been making Blur Busters look like another self-business-biased site rather than a transparently "educative" one.
 

Nobu

Unlike most console games, you can disable motion blur in PC games; at least in all of the ones I have played there is an option for that. Bloom is another subject (which can also be disabled in most games I have seen, by the way).
 

Nobu

Unlike most console games, you can disable motion blur in PC games; at least in all of the ones I have played there is an option for that. Bloom is another subject (which can also be disabled in most games I have seen, by the way).
Oh I know, still hate it.
 
Let's be honest here: this is obviously done by you in order to improve XG2431 sales and ensure your commission from ViewSonic, rather than to "educate". I'm not saying that is something bad in itself, but as you did with the XG270, pretending to psychologically convince CRT users to "upgrade" to one of these ViewSonic monitors at the cost of their many flaws that CRTs don't suffer -- flaws which you don't mention in a clear, direct manner (if I didn't mention those flaws you would never have mentioned them) -- respectfully, I have to say, is really disappointing, and has sadly been making Blur Busters look like another self-business-biased site rather than a transparently "educative" one.
Disclosure to other readers:

Honest Financial Disclosure: The Blur Busters Approved 1.0 and 2.0 programmes are offered on a fixed-price, per-model basis (with no guarantee your monitor gets approved -- Blur Busters rejected quite a few after they failed to pass certain thresholds during the most advanced calibration possible). Between the monitor manufacturers and Blur Busters there are zero commission-based arrangements, except for Amazon affiliate revenues (e.g. affiliate links on Blur Busters-owned websites). The Amazon commission, however, is only 2.5% for monitors (Amazon USA), which is only $7.50 on a $300 Amazon purchase. Therefore, this definitely isn't the reason I am posting here. I'm posting due to my hobby interest in seeing CRTs simulated more accurately, and to spread advocacy about this.

Also, because I have been deaf since birth, I can't easily be a YouTuber, so my advocacy path is via ad-supported banners on the Blur Busters and Forums websites (and I have stubbornly refused to put ad banners on TestUFO, a big financial loss for me, but I have my boundaries), rather than Google paying me for YouTube views and their ads. If you hate that, fine, but that's the path I've gotta go.

You can use an adblocker and change all Amazon affiliate tags to "?tag=hardfocom-20" to support this site instead of Blur Busters. I have no problem with that.

But hating on tech progress? C'mon.

Many ask that Blur Busters should manufacture their own monitors. We don't ever plan to. It's certainly possible given the Zisworks and Eves, but I don't wish to become a Failed Kickstarter. And I'd rather manufacturers copy my ideas (without paying me) than to manufacture my own monitors. It's such a nuclear-loaded Pandora Box for a small business to manufacture monitors. I wish. I'd rather stay more of an indie with my services open impartially to any manufacturer. Blur Busters do sometimes help build jerry rigged prototypes, or help others do, but that's about it -- more of my influence is on the science, advocacy, and getting manufacturers to implement them. That's not a bashworthy approach.

Yes, 60Hz strobing is definitely a band-aid. But it's a stepping stone to even better things -- including open source software-based CRT simulators.

Also, users on the Blur Busters Forums have been posting greatly improved pursuit-camera photography of better-calibrated XG270’s and XG2431’s.

Despite Blur Busters’ detractors (everybody has their detractors, including LinusTechTips, RTINGS, you name it), most people here know Blur Busters is a hobby turned business, and I’m a huge fan of this stuff — I do a lot of "worthless" spare-time projects, such as Tearline Jedi, among other things. I worked on raster interrupts back in the Commodore 64 days. I’ll let my reputation speak for itself.

Remember, many of us are people like you and me — and we have our creative passions!

Also, you were skeptical of my ability to convince manufacturers:

3dfan previously said:
man, with respect, but I believe you are wasting your time, and this is sadly a dead request. I have been waiting for this as well for years, being a current CRT user who has used 60Hz for desktop and for many games for over 20 years, and this has been one of the main reasons, among many others, that I have been refusing to "upgrade" to modern monitors.

but after a lot of begging to manufacturers about this, it is clear to me that those companies simply do not care about making their customers happy by adding 60Hz single strobe; they just care about making their money, fooling users about their monitors' "greatness" while cheating, hiding, and underestimating their many flaws against CRTs in order to get their money / commissions, and the lack of 60Hz single strobe will not prevent that for them.

That's what you said: "I believe you are wasting your time, and this is sadly a dead request."

Now 3dfan, after seeing me successfully convince a few, you choose to bash accomplishments by attempting to move the goalposts and call me out. Sour grapes, no?

Especially after already seeing those crosstalk-free pursuit photos posted by Blur Busters Forums members, which are superior to some reviewers' -- most reviewers test the out-of-box experience instead of using the advanced strobe tuning utilities. Most users don't bother, and that's fine.

What's important is that it's possible to tune certain LCDs to have superior motion clarity to CRT, if you're willing to do extra work as well as live with tradeoffs (colors, brightness). It's massively better than it was in the LightBoost days, though! You are correct that not everyone accepts the tradeoffs, but you cannot deny that the XG2431 is far more tweakable and flexible than the XG270 was (you may not have seen the "XG2431 Strobe Utility HOWTO"). That YouTuber did an excellent job of off-the-shelf analysis, but if you look closely at the best pursuit images elsewhere, you'll see that others have already beaten that YouTuber in calibrating the XG2431 -- by following the Quick Frame Transport HOWTO. Yes, yes, it's true I was not able to convince the manufacturers to build plug-and-play QFT EDIDs, but at least they let users DIY QFT in ToastyX. That's a win in my book.

Also, keep in mind there are 40 brightness levels for PureXP+ strobe on XG2431 (via Strobe Utility), unlike for XG270. The brightest levels exceed 200 nits, so you simply choose a sweet spot. The great news is you can get a crosstalk-free 100-150 nits on the XG2431 with large vertical totals to hide a longer strobe flash at the end of an ultra-large VBI.

There are millions more strobe tuning combinations possible on the XG2431 that you could not do on the XG270 (from the near-infinite number of supported refresh rates that are fully tunable, times the number of variations (large VTs) for any of those Hz, times 40 brightness levels available to all those timings, times 100 strobe phases available to all those timings, times 100 overdrive levels available to all those timings). So you have a vastly larger amount of tunability on the XG2431 than you did on the XG270.
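To put a rough number on that multiplication, here is a tiny Python sketch; the 40/100/100 step counts are the ones stated above, while the refresh-rate and large-VT counts are placeholder assumptions of mine, not manufacturer figures:

```python
# Rough illustration of the combinatorics described above.
refresh_rates   = 90   # assumed: ~1 Hz steps across a usable 60-150 Hz band (far more are actually tunable)
vt_variants     = 5    # assumed: a handful of large-vertical-total (QFT) variants per refresh rate
brightness_lvls = 40   # PureXP+ brightness steps (per the post above)
strobe_phases   = 100  # strobe phase steps (per the post above)
overdrive_lvls  = 100  # overdrive gain steps (per the post above)

combinations = refresh_rates * vt_variants * brightness_lvls * strobe_phases * overdrive_lvls
print(f"{combinations:,}")   # 180,000,000 combinations under these placeholder assumptions
```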

There are some modes where you can get more than 100 nits on the XG2431 -- better than that 53-nit figure -- if you cherrypick a favourite Hz (say, ~100Hz-ish or ~85Hz-ish) combined with the largest vertical total you can get (2700 or larger) for that specific refresh rate, using the Quick Frame Transport HOWTO.

TL;DR: Technology is improving, and you're bashing on technology progress?

Moreover, there are some people who adjust their CRTs to 50 nits to make them last longer, or because they're working in a cave. For some people, 50 nits is fine at night when all the lights are out and your roommate is sleeping in the bed behind you. For others, we love our 1000-nit HDR (I loved the 10,000-nit Sony prototype I saw at CES 2020 before the pandemic), and I see the benefit of all the nit extremes. The bottom line is that the venn diagram of user brightness preferences is not the same as yours.

The bottom line is that, over time, it is getting easier and easier to cherrypick an LCD into clearer motion than certain CRTs such as the Sony FW900. More dominoes will fall, e.g. gamut / blacks / brighter strobe / switch to rolling scan / etc. -- until the Holy Grail of a software-based CRT beam simulator.

Now, I want to close out to say that I'm an everyday geek like y'all -- there is no need to attack me.

Many of us are fans within our industry, and are true end users too, not just business.

https://twitter.com/BlurBusters/status/950954077441089536

[embedded tweet screenshot]

So many of us are just everyday people who went into business because we loved what we are doing.

CRTs forever ❤️
... but in parallel I want to simulate them more and more accurately too, especially as they get rare over time. If you want to hate on me because I'm doing the right thing, by all means, be my guest. It just looks silly to hate on tech progress.

‘Nuff said.
 
Most games have bloom and motion blur anyway. Ugh.
Unlike most console games, you can disable motion blur in PC games; at least in all of the ones I have played there is an option for that. Bloom is another subject (which can also be disabled in most games I have seen, by the way).
Right on!

They still do it even to this day to help tolerate 24fps-30fps (so choppy that you almost want blur to hide the choppiness). If you're stuck on a laptop with an Intel GPU, or playing at 30fps on a console, they're useful to hide the choppy-mania when you have no other choice.

But they are not useful when the game is able to run 200fps+. Blur busting all the way!

Nobody in this thread gives a fuck about the C2. 120hz OLED is SOOOOOOOOOOO 2019 grandpahhhhh!
LG needs to step it up with 175hz+ for you to even begin breathing its name in the same sentence of CRT motion goodness.
I'm waiting for 1000Hz+ OLED before they match CRT in things like TestUFO Panning Map Test (175Hz OLED fails that, BTW).

That being said, ~480Hz sample-and-hold will make a 960 pixels/sec map marginally readable (for sample-and-hold, you need a refresh rate of roughly half the motion speed in pixels/sec to keep the blur fog low enough to read things like tiny 6-point text on maps, or nametags above players in a fast-panning DOTA2 map). Still, 480Hz sample-and-hold (LCD, OLED) has a guaranteed minimum of 2 pixels of motion blur at 960 pixels/sec, and that's only in the extreme circumstance of a perfect GtG=0ms. So that's still not the endpoint.
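As a concrete illustration of that rule of thumb (blur width ≈ motion speed ÷ refresh rate, under the idealised GtG=0, full-persistence MPRT100% assumptions stated above), a minimal Python sketch:

```python
def sample_and_hold_blur_px(speed_px_per_sec: float, refresh_hz: float) -> float:
    """Motion blur trail width, in pixels, on an ideal (GtG=0) sample-and-hold
    display showing frame rate = refresh rate at full persistence (MPRT100%)."""
    return speed_px_per_sec / refresh_hz

print(sample_and_hold_blur_px(960, 480))            # 2.0 px of blur at 960 px/sec on 480 Hz
print(round(sample_and_hold_blur_px(960, 175), 1))  # ~5.5 px on a 175 Hz sample-and-hold OLED
```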

However, the geometric requirement for refresh rate upgrades makes jumps like 240Hz-vs-360Hz very underwhelming: a theoretical blur difference of 1.5x gets muddied down to only ~1.1x (due to slow GtG and high-frequency mouse jitter vibrating into extra blur), and GPU limitations can make 360 Hz worthless to average users. Mind you, we know. That being said, OLED sharpens the differences between refresh rates (120Hz vs 240Hz is a much clearer difference on OLED than on LCD, by removing GtG fog from the MPRT persistence blur).

Even 175Hz OLED can never have less than 1/175sec of motion blur (5.7ms MPRT100%) since it is fundamentally a sample-and-hold technology. Much better than LCD though (240Hz OLED looks clearer than 360Hz LCD because of faster GtG). CRTs flash their phosphors for a millisecond or less (from the rise to the GtG90% fade point), so you need 1000 consecutive 1ms frametimes to fill the black period and get identical blur.

175Hz has more motion blur than LightBoost from a decade ago, which was 2ms MPRT100%, and current strobe backlights now do better. However, native frame rate is superior to strobing, since strobing is just a humankind band-aid.

Refresh rates need to be upgraded geometrically (60 -> 120 -> 240 -> 480 -> 1000), or even by 2.5x to 3x factors (60 -> 144 -> 360 -> 1000), in order to see human-visible benefit -- the diminishing curve of returns. If you want equal numeric decreases in MPRT, you'd do 60 -> 120 -> 1000, since (60, 120) is an 8.3ms difference in MPRT, and (120, 1000) is a 7.3ms difference in MPRT. Assuming, of course, MPRT100% and GtG=0.
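A quick worked version of that MPRT arithmetic, under the same idealised GtG=0, MPRT100% assumptions:

```python
def mprt_ms(refresh_hz: float) -> float:
    """Full-persistence MPRT in milliseconds for an ideal sample-and-hold display (GtG = 0)."""
    return 1000.0 / refresh_hz

print(round(mprt_ms(60) - mprt_ms(120), 1))    # 8.3 ms less persistence going 60 -> 120 Hz
print(round(mprt_ms(120) - mprt_ms(1000), 1))  # 7.3 ms less going 120 -> 1000 Hz
print(round(mprt_ms(240) - mprt_ms(360), 1))   # only 1.4 ms going 240 -> 360 Hz (diminishing returns)
```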

However, equivalence to CRT blur requires frametimes of 1ms, and thus requires 1000fps at 1000Hz to avoid BFI or flicker or impulsing.

For improved motion clarity, you can see the need for geometric upgrades in the temporal domain, much as they were needed in the spatial domain (VGA -> XGA -> 1080p -> 4K -> 8K), or whatever upgrade factor you choose. Wait longer if you want to be more environmentally friendly and upgrade less often, as refresh rate incrementalism is usually a waste for most people (e.g. 144 -> 165) unless there are other concurrent benefits (e.g. a resolution upgrade, or LCD -> OLED, etc). But bottom line, we've got a fair bit to go until we get a display whose venn diagram is a complete superset of CRT in all attributes (blacks, colors, Hz, etc) -- enough to shift CRT simulation to software-based electron beam simulators.

Fortunately displays are (eventually) headed there over the long-term of humankind. Which means electron beam simulators are the probable Holy Grail for mimicking a retro CRT.
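For readers curious what a software electron-beam simulator roughly means in code, here's a deliberately simplified toy sketch (not any shipping implementation; the band size and decay constant are placeholder assumptions) of splitting one low-Hz CRT refresh across many fast subframes with a fading rolling scan:

```python
import numpy as np

def crt_rolling_scan_subframes(frame: np.ndarray, subframes: int, decay: float = 0.5) -> list[np.ndarray]:
    """Toy sketch of a software rolling-scan 'CRT beam' simulator.

    One source frame (H x W, values 0..1) is split across `subframes` output
    refreshes on a fast display: each subframe lights only the band of rows the
    simulated beam is currently scanning, while previously-lit rows fade by
    `decay` per subframe (a crude stand-in for phosphor decay). These numbers
    are illustrative assumptions, not a tuned phosphor model.
    """
    h, _ = frame.shape
    band = max(1, h // subframes)
    persistence = np.zeros_like(frame)
    out = []
    for i in range(subframes):
        persistence *= decay                      # fade previously scanned rows
        rows = slice(i * band, min((i + 1) * band, h))
        persistence[rows] = frame[rows]           # beam energises the current band
        out.append(persistence.copy())
    return out

# Example: emulate one 60 Hz refresh with 16 subframes (i.e. a 960 Hz output display).
subs = crt_rolling_scan_subframes(np.random.rand(480, 640), subframes=16)
```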
 
3dfan I think you're being a little overly harsh here. I get what you're talking about though and I agree with you for the most part, in terms of how far we still have to go to actually replace CRT.

But right now, there's nothing else like the XG2431. And even with all of its flaws I decided to keep it. I also have a Samsung CFG-73 (24 inch). At the time I thought it was pretty good, although it had some astonishingly stupid shortcomings that were all artificially smacked on by Samsung. Other than the fact that the Samsung is a VA monitor and its contrast is clearly superior to the ViewSonic, the ViewSonic spanks it in every way. And that was over a gap of three years. I'm kind of excited to see what the next three years of development can bring.

Chief Blur Buster - How can we join the advocacy? One of the things I like about Blur Busters is that you have brought motion clarity to the forefront. I think every TV manufacturer now offers BFI with their TVs, but it can totally be better. Most of them offer just a mode that isn't even remotely tuned to giving a clear picture. I thought of posting to RTINGS and asking them to start evaluating and scoring TVs' BFI function on QUALITY. Right now I think they just base their score on whether or not a TV even has it to begin with, and which refresh rates it offers with that mode. If a TV offers 60hz then it's an automatic high score, regardless of whether or not it's plagued with crosstalk. If a TV has crosstalk issues, they'll just mention it, but it doesn't affect the score. I would love to see them do an in-depth evaluation of BFI and start scoring this category accordingly. I.e., instead of Samsung getting a 9.0 for BFI simply because it does 60hz, slap it with a 2.0 because the mode is practically unusable.

3dfan, regarding the low brightness of the XG. I'm wondering if the new Eve Spectrum monitors can overcome this. One of those monitors can get as bright as 750 nits. Assuming the same light loss as the XG (400 nits peak brightness gets cut to 52 nits for CRT quality), you would have it peaking out at 97.5 nits for CRT motion clarity. That would be competitive with CRT but of course would have other downsides like crosstalk and low contrast.
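For what it's worth, the arithmetic behind that 97.5-nit estimate is just proportional scaling of the figures quoted above; a tiny sketch (the proportional-light-loss assumption is the simple model implied here, not a measurement):

```python
# Scale strobed brightness in proportion to the quoted peak brightness.
xg_peak_nits, xg_strobed_nits = 400, 52    # XG figures quoted above
light_loss_ratio = xg_strobed_nits / xg_peak_nits   # 0.13

eve_peak_nits = 750                        # Eve Spectrum peak quoted above
print(eve_peak_nits * light_loss_ratio)    # 97.5 nits, matching the estimate above
```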

Chief Blur Buster - how hard is it to simulate a CRT beam scan with a local dimming array setup? Surely any implementation here could reduce crosstalk?

Anyways. Can we also take the time to admire the fact that we do have displays whose quality does surpass CRT? We just don't have one that does it in every way. But man - we're SO close.

EDIT: And I'm a huge diehard CRT fan. Prior to having to move out of my house I had a GDM-FW900, GDM-F520, and GDM-C520K Artisan. All of which were pristine (F520 was from Vito), all of which I calibrated very well. I even wrote the geometry and convergence calibration guide to the FW900 using WinDAS. I don't typically like to pull rank - I'm just saying that I'm intimately familiar with what these things can do. So surely it's to Viewsonic's credit that I decided to KEEP the XG2431? :D And yes - I've tried a bunch and returned a bunch of other strobing monitors. For this price, and for its features and tweaking ability - you can't come close. At least not now. Like I said - we'll see in the next 3 years.
 
Chief Blur Buster - how hard is it to simulate a CRT beam scan with a local dimming array setup? Surely any implementation here could reduce crosstalk?
It's not very reliable *yet* to get low crosstalk with a scanning backlight, because of internal light diffusion inside the backlight. There have been many attempts, but never a successfully zero-crosstalk one in a consumer LCD.

This creates strobe crosstalk problems -- and I have not yet seen a scanning backlight go as zero-crosstalk as an Oculus Quest 2 or a cherrypicked-mode, cherrypick-tuned XG2431.
This was a major problem for many years for scanning backlights.

But with an ultra-high-LED-count local dimming (e.g. 50,000+ LEDs and more), it is possible to get the LEDs very close to the LCD, and dramatically decrease internal light diffusion from the on-segments to the off-segments.

The big problem is that the timing controllers of the MiniLED / MicroLED backlights are more optimized for HDR and refresh cycle granularity, rather than microsecond-precise single-row control necessary for a good low-crosstalk scanning backlight. So it is not yet easy to program this. However, I got asked by a manufacturer (not ViewSonic) to give them timing specifications for a potential scanning-MicroLED backlight. Longshot, and might be several years, but who knows.

1500 LEDs is probably not enough to go zero-crosstalk with a scanning backlight yet, but it could hopefully get pretty low. We shall find out how far we can get with improved scanning backlights. It will make 60 Hz strobing a LOT more comfortable, especially if phosphor decay is simulatable (via fading previous rows of LEDs). The good news is scanning MiniLED backlights can surge to really high nit headroom (e.g. 2000 nits) to compensate for the darkness of strobing. So, assuming a Chinese manufacturer /can/ redesign the timing controller of a MiniLED array, we can pull off much more comfortable 60 Hz strobing sooner. At least for a few models, anyway.
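To illustrate what per-row scanning control means conceptually, here's a hypothetical timing sketch in Python (the row count, flash width and phase numbers are made-up placeholders, not from any real backlight or white paper):

```python
def scanning_backlight_schedule(rows: int, refresh_hz: float, flash_ms: float, phase_ms: float):
    """Hypothetical timing sketch for a row-scanned MiniLED backlight.

    Each backlight row flashes for `flash_ms`, delayed by `phase_ms` after the
    matching LCD rows have scanned out, sweeping top-to-bottom once per refresh.
    Returns (row, on_time_ms, off_time_ms) tuples; real hardware would need
    microsecond-precise per-row control in the timing controller.
    """
    period_ms = 1000.0 / refresh_hz
    row_step = period_ms / rows                     # scanout time per backlight row
    return [(r, r * row_step + phase_ms, r * row_step + phase_ms + flash_ms)
            for r in range(rows)]

# e.g. 64 backlight rows at 60 Hz, 1 ms flash per row, 8 ms settle phase:
for row, on, off in scanning_backlight_schedule(64, 60, 1.0, 8.0)[:3]:
    print(f"row {row}: on at {on:.2f} ms, off at {off:.2f} ms")
```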

Don't wait though -- this is still super-early research stuff.

However, I now have the basic math/specs needed for the timings of a scanning MiniLED backlight that's highly tunable (phase, pulse width, optional fadebehind, etc) for free for any @manufacturer.com willing to contact me directly for my (currently internal) white paper.
(*I may upsell other Blur Busters services, though. Razor and blades approach, baby!)

Scanning MiniLED do have these advantages:
- Sub-refresh latency for all pixels on the screen, which can make it more esports-friendly than strobe backlights
- Much less flickerfeel at lower Hz, making 60 Hz more comfortable
- Much better behaviours with VSYNC OFF because of latency symmetry between backlight scan and panel scan. So the VSYNC OFF frameslices stream realtime into the scanout (without weird latency offsets caused by LCD scanout versus global flash.)
- For "low" LED counts like 1500 LED, probably much less crosstalk than "average" strobe backlights like NVIDIA ULMB, even if not as zero-crosstalk as Quest 2 VR LCD

It's definitely something I'm working on behind the scenes, but it's still currently a long shot because the timing controllers of current MicroLED arrays are not flexible enough for subrefresh control yet. However, I am trying to change that over the longer term, because I've now developed a specifications white paper on this.

Yes, I said it. 1500 LEDs is "low". Try an absolute minimum of 50,000 LEDs & zones (preferably more), manufacturers. Even a $8 LED matrix (32x32 RGB) from Alibaba has 32x32x3 = 3072 LED chips in them. Theoretically it should eventually become possible to manufacture sheets of 50,000 monochrome LEDs for less than $100 per flexible sheet, through automated means. Still cheaper to do 50,000+ LED local dimming backlight, than to do a 24-million-LED discrete direct-view MicroLED display or OLED display. This would produce much more kickass HDR with less local-dimming blooming than the average bloom around a bright phosphor. However, if OLED manufacturing becomes brighter and higher Hz for cheaper, it's possible for OLED to pull ahead in this horse race. But right now, I see both options competing head-to-head for a pieslice of the market for the next ten years with separate pros/cons. The benchmark is enough zone-count of a local dimming backlight to have less bloom than the bloom around a phosphor dot.

I'm doing my best to internally (convince | shame | goad | coax | etc) certain manufacturers to doing better, but it's a delicate smart Blur Busters labyrinth navigation...


Anyways. Can we also take the time to admire the fact that we do have displays whose quality does surpass CRT? We just don't have one that does it in every way. But man - we're SO close.
Thanks
And you're a user who actually has both XG2431 and CRT, and spent more time than an average reviewer tuning the XG2431.

Congrats for witnessing the clarity even without a bigger QFT tweak yet -- you can do even better than what you've already done!

If you haven't seen the new "Quick Frame Transport HOWTO", the trick is to NEVER edit the refresh rate directly, but to start at 240Hz and lower the refresh rate indirectly ONLY by editing the vertical total, the sync, and/or the back porch in ToastyX (radio buttons locked on Total + locked on Pixel Clock). NVIDIA Custom Resolution doesn't have these radio buttons to assist with easier QFT creation. This keeps the scanout at 1/240sec even as the refresh rate goes down via the increasingly larger VBIs you're experimenting with. Occasionally you get stuck trying to lower the Hz (e.g. 64 Hz), at which point you can start editing the Hz down to 60; this still results in VTs large enough to zero-out crosstalk. Once you succeed with VT2700-VT4600+ modes (depending on Hz), remember to test large strobe phases like 85 or 90 or 95, because you want the strobe flash to start a little bit earlier in an ultra-large-VBI situation, and 0 phase actually loops around to 100 phase (it's a complete 360-degree adjustment of strobe phase on the XG2431 -- the flash can occur at any phase of the refresh cycle, even partially overlapping the start or end of the VBI, to compensate for LCD GtG lag-behind effects, for the sweet zero-crosstalk strobe spot).
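For anyone who wants to sanity-check their numbers, the arithmetic behind that recipe boils down to refresh rate = pixel clock ÷ (horizontal total × vertical total), with the pixel clock and horizontal total left locked at their 240Hz values. A small sketch using hypothetical example timings (not taken from any real EDID):

```python
def qft_vertical_total(pixel_clock_hz: float, h_total: int, target_hz: float) -> int:
    """Vertical total needed to hit target_hz while the pixel clock and horizontal
    total stay locked at their max-refresh values (the QFT recipe above: the
    refresh rate only drops because the blanking interval grows)."""
    return round(pixel_clock_hz / (h_total * target_hz))

# Hypothetical 1080p-class example: 610 MHz pixel clock, 2200-pixel horizontal total (~240 Hz native).
pclk, h_total = 610_000_000, 2200
for hz in (120, 100, 85, 60):
    print(hz, "Hz ->", qft_vertical_total(pclk, h_total, hz), "vertical total")
# -> roughly VT2311 at 120 Hz, VT2773 at 100 Hz, VT3262 at 85 Hz, VT4621 at 60 Hz
```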

Plug-and-play QFT would certainly make low-latency, low-crosstalk strobing a lot easier, without end users being forced to manually create QFT modes to get those coveted zero-crosstalk modes.

For even easier QFT, I heavily wish a monitor manufacturer would dare to try plug-and-play QFT EDIDs with a "Quick Frame Transport ON|OFF" setting in OSD. If this was done, strobe fans would not need to touch ToastyX or a custom resolution ever again.

...Because of the lack of that, I am considering creating a possible QFT Wizard built into a future version of Strobe Utility (loads max Hz timings, then autocompute QFT timing after entering a target Hz, for you to manually enter into ToastyX, or perhaps use a ToastyX API to create the mode -- I sent a suggestion to ToastyX about an API to autocreate QFT modes, much like his old LightBoost utility automatically created LightBoost modes).


EDIT: And I'm a huge diehard CRT fan. Prior to having to move out of my house I had a GDM-FW900, GDM-F520, and GDM-C520K Artisan. All of which were pristine (F520 was from Vito), all of which I calibrated very well. I even wrote the geometry and convergence calibration guide to the FW900 using WinDAS. I don't typically like to pull rank - I'm just saying that I'm intimately familiar with what these things can do. So surely it's to Viewsonic's credit that I decided to KEEP the XG2431? :D And yes - I've tried a bunch and returned a bunch of other strobing monitors. For this price, and for its features and tweaking ability - you can't come close. At least not now. Like I said - we'll see in the next 3 years.
Thanks for jumping to my defense on the point that certain cherrypicked and carefully strobe-tuned LCDs have finally surpassed CRT in one attribute (motion clarity).

I still have to mythbust the disbelievers who surgically talk about motion clarity (ignoring colors, ignoring brightness, ignoring blacks) -- which is why I leaped.

Everyone expects Blur Busters to leap on mythbusting these disbelievers, every chance I come across such a post.
 
I would like to crosspost ApertureGrille's CRT image from this thread, as an example of phosphor ghosting I'm talking about.

Comes from the other HardForum thread:
https://hardforum.com/threads/dell-...z-3440-x-1440.2016696/page-41#post-1045360300

[ApertureGrille CRT pursuit-camera photo showing phosphor ghosting]

Don't get me wrong, CRT beats LCD if you're factoring in all attributes (blacks, color gamut, motion clarity, etc).

But there are multiple LCDs with perfect zero strobe crosstalk now, complete top/center/bottom without any ghost nor duplicate image of any visible faintness. And this actually produces motion quite noticeably clearer than this CRT pictured above. In sequence of better-and-better, basically XG270 < XG2431 < Quest2 in quality package deal (best brightness available for best cherrypicked zero-crosstalk mode)... The Valve Index and Pimax is also similar (less testing) so you can test those too. I simply say Quest 2 as it's more common/cheaper and more likely to be found at a friend's household (At 14 million headsets sold, it outsold the XBox over this time period), so it's an easier test-drive, especially since they all have a preinstalled in-VR Chromium-engined web browser that pops as a floating IMAX-sized window that is capable of running TestUFO -- think about this: Literally an IMAX sized "CRT tube" (motion clarity wise) easily on tap for your testing's pleasure to help micdrop CRT-vs-LCD technology-progress debates faster, sooner, and earlier.

To boot, the standalone Quest 2 is also easier to pack than a CRT tube, for takealong CRT motion clarity content consumption -- you can even Virtual Desktop a 2D PC video game to an IMAX-sized screen too, and reap CRT motion clarity benefits -- a few blur busters fans use VR as a CRT tube workaround since desktop monitors can't strobe 0.3ms MPRT brightly (yet). The Quest 2 LCD supports 60Hz, 72Hz, 75Hz, 90Hz and 120Hz, with modes forceable via SideQuest for the in-VR browser (or specific apps like video players if you needed framerate=Hz performance, like better 24p playback via 72Hz, or better 60fps via 60Hz/120Hz, etc) -- for software such as BigScreen that simulates sitting in a large movie theater, except the theatre screen just so happens to be CRT motion clarity.

It's a left-field alternative to a CRT tube that's not an LCD desktop monitor -- wearing a VR headset just to simulate a giant CRT tube sitting on a virtual desk, and then playing your PC-based video game on that virtual CRT tube, which can be sized as big & distant as an IMAX screen if you wish, or as close to your face as a desktop monitor -- BigScreen and Virtual Desktop let you resize a virtual 2D in-VR screen. Make sure you switch the PC refresh rate to match the VR refresh rate, and use a USB Link cable for jitter-free framepacing (WiFi streaming can stutter/jitter a bit, but if you're seated at a desk as if playing a 2D PC game, then cabling your Quest 2 is better -- and besides, it's ~300 Mbps HEVC streamed over the USB-C cable to avoid compression artifacts when streaming the PC desktop).

But you can try any of the newer custom VR LCDs, as they are now using ultrafast-switching LCD panels custom-manufactured for VR purposes, where they manage to cram the entire GtG100% heatmap in the dark cycle of a strobe backlight, shifting the only leftover human-visible GtG to literally the speed of the backlight on/off transistor or the nearly zero phosphor decay of the white LED phosphor (<0.1ms) -- far below the error margin of human detectability, at least at current VR resolutions, FOV and motionspeeds.

If you want bright strobing at better motion resolution than CRT with no calibration needed out of the box (unlike the XG2431), you have to get one of those better VR LCDs. They use more voltage-boosting and over-engineered LEDs, because these VR-specific LCD panels were custom built to be brightly strobed, rather than being a strobe backlight retrofitted to a plain non-VR LCD.

Not all LCDs can strobe brightly but some headsets such as Quest 2 does a very bright job of 0.3ms MPRT, with any human-visible portion of pixel response heatmap that's fully flat for both GtG and MPRT -- and fully bidirectionally symmetric, and equal for all color combinations (no LCD ghosting, no phosphor ghosting, no crosstalk, perfectly symmetric motion at leading and trailing edge, etc -- just darn nigh perfect near-zero GtG & MPRT for all possible color transition combinations). So you can skip the XG2431 and go straight to one of the better VR LCDs. Unfortunately it's Facebookened, but it is worth pointing out as a CRT-motion-resolution-beating LCD that does it fairly brightly, and is already pre-calibrated out of box. Most have not GtG/MPRT measured the headset because it is hard to do so, but any longtime greybeard engineer who's tried is massively impressed at how flat-to-near-zero the entire 3D barchart of all the GtG/MPRT values are for all src/dst color pairs (source color along one axis, destination along other axis) -- some sites use color coded heat maps, others use 3D barcharts for mapping the GtG's and MPRT's of different color pairs -- but to see such utter glass-floor flatness in an LCD is quite the jawdrop.

Note -- review websites generally don't GtG-heatmap / MPRT-heatmap a strobed mode -- but if you tried to response-heatmap a Quest 2 VR LCD, the GtG heatmap is flat zero for all 65,280 color pairs (256*255, when excluding the no-change pairs). And the MPRT heatmap is glassfloored at 0.3 -- it is possible MPRT can vary too for different colors on certain display technologies, like the longer-sustained-ontime of brighter colors (especially if per-pixel PWM is used as pixel brightness control -- e.g. certain discrete LED displays -- and it's possible to do it at one PWM pulse per Hz, to combine motion clarity benefits of shorter PWM of darker colors, while retaining brightness benefits of longer PWM pulses).
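Just to show where the 65,280 figure comes from, a two-line sketch:

```python
levels = 256
pairs = levels * levels - levels   # every source level to every *different* destination level
print(pairs)                       # 65280 transition pairs; a GtG/MPRT "heatmap" is simply a
                                   # levels x levels grid of measured times for each pair
```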

It has the neat attribute of its ability to have better motion clarity without brightness loss than any available desktop display (including XG2431 which can't get as bright as Quest 2 when it's tuned to 0.3ms MPRT). A few dozen Blur Busters fans / CRT fans are doing this technique, some of them even sideloaded the Android .apk for MAME (Android apps run as floating rectangles approximately 100 virtual inches in diameter, placed at roughly TV viewing distance in virtual space), to get standalone emulation without needing to connect a PC. They did need to use SideQuest to force 120Hz or 60Hz to prevent stutters in MAME though; 120Hz flickers less but has a double-image motion artifact, while 60Hz flickers a lot more -- some don't feel the strain from 60Hz flicker, while others do, at least SideQuest force-refresh-rate gives you the choice. The way Quest 2 strobes slightly differently seems to have slightly less eyestrain than XG2431, but more eyestrain than CRT. YMMV, though.

Not the most comfortable use case (using a VR headset as a computer monitor) -- although (with a 3rd-party strap) it's comfortable for long periods, with less eyestrain than NVIDIA 3D Vision / RealD 3D / Disney 3D cinema glasses (unless your IPD is beyond Quest 2 spec, or you require a vision prescription beyond Quest 2's retrofittability).

I simply mention this because it's a left-field alternative to adding another desktop monitor -- using VR to virtualize a desktop computer monitor to use Windows and play 2D non-VR PC games -- to gain a specific benefit such as better-than-CRT motion clarity.

Being Blur Busters I get to feast my eyes on thousands of displays, so I can call out those cherrypicked LCDs that have more motion resolution than CRT.

P.S. I earn absolutely zilch from Quest 2, not even commissions. I'm talking purely as a CRT fan.
 
Chief Blur Buster did you see my question about rtings? I would love to see them pick apart the BFI implementations and actually score them accordingly, rather than score it based on whether or not the tv has it. Is there any way we can talk to them or ask them? I was going to post on their suggestions forum but I’m not sure how far that would get.
 
Chief Blur Buster did you see my question about rtings? I would love to see them pick apart the BFI implementations and actually score them accordingly, rather than score it based on whether or not the tv has it. Is there any way we can talk to them or ask them? I was going to post on their suggestions forum but I’m not sure how far that would get.
I agree.
The BFI feature on my Samsung Q9FN TV can be gotten used to, but it has a lot of drawbacks; I prefer not to enable it.

Chief Blur Buster
Thanks for all you have posted and are doing, most informative.
 
Chief Blur Buster did you see my question about rtings? I would love to see them pick apart the BFI implementations and actually score them accordingly, rather than score it based on whether or not the tv has it. Is there any way we can talk to them or ask them? I was going to post on their suggestions forum but I’m not sure how far that would get.
Good timing for this question:

RTINGS just upgraded their BFI scoring criteria!

They are also now adding more lower-Hz pursuits, because they have finally realized that refresh rate headroom improves strobing quality (e.g. using 120Hz strobe on a 240Hz panel is a very common move to decrease strobe crosstalk).

https://www.rtings.com/monitor/reviews/acer/predator-xb273u-gxbmiipruzx
[RTINGS pursuit-photo screenshot from the linked review]


And clearly, they are now scoring more on the existence of DIY strobe tuning adjustments. There are Strobe Utilities for 3 different brands of monitors now, and there will soon be 4 or 5 brands in the future.

They’ve added pulse adjustments to their scoring criteria (without me even asking RTINGS) because they realized it can help users improve strobe quality — such adjustments are a bit more common now thanks to Blur Busters advocacy to manufacturers!

Not perfect scoring criteria, and doesn’t yet include top/center/bottom pursuits, but much better!
 
For those who missed it:
Being Blur Busters I get to feast my eyes on thousands of displays, so I can call out those cherrypicked LCDs that have more motion resolution than CRT.
I've decided to back this up with example evidence (as a pre-emptive protection of cred, given the naysayers that exist in these forums).

Here is my Twitter thread where I feasted my eyes on many displays at DisplayWeek 2022, a convention devoted to people in the display industry (I used to go to conventions often before the pandemic, so this was nice to be at DisplayWeek 2022).

DisplayWeek has many hundreds of displays alone, and I got to see most of them over a 3 day period. That's the perk of an industry insider nuts-and-bolts convention like this... From multiple high-hz OLEDs (Samsung had one on show) to BOE to more obscure companies like panel manufacturers that don't sell directly to consumers.

[photo of my DisplayWeek 2022 badge]

I had the "PIONEER" badge, as I'm recognized by the employees of the display industry.

Now the highlight of the convention was definitely BOE's booth because of the unexpected treat:

[photo of the unexpected treat at the BOE booth]

Beyond that, a few employees at the BOE booth suddenly surrounded me for selfies once they realized who the heck I am -- BOE engineers unexpectedly treated me like a celebrity... BOE is a company that's been pushing the refresh rate race really hard.
(Note: BOE and I have not done any paid business before as of today, May 22nd, 2022. But who knows, no guarantees about the future, conventions are often about networking, y'know...).

They didn't mind that I couldn't easily talk clearly (I have been deaf since birth) -- they often used translators to talk in Mandarin anyway. I whipped out my LTE iPad running a translator app to communicate with these employees, typing back and forth, nobody caring about my deafness. Other times I had to translate from Korean, and yet other times I just typed English on the Bluetooth keyboard of my iPad, as my assistive device at a hearing convention. The ones in marketing spoke fluent English, but the visiting engineering PMs were a lot less fluent, so that's where using a translator app makes my disability invisible.

While the Blur Busters cover page is stagnant (until I find a BlurBusters-educated writer to take over), I'm super-busy behind the scenes on much, much, much, much, much, much more important things than updating the cover page of a website...

Because there were so many manufacturers showing displays on the convention floor, one of the public booths was even showing off a prototype 240Hz OLED (the panel vendor that Razer is going to use for their 240Hz OLED laptop). Thus, I even had the opportunity to see that 120Hz-vs-240Hz is much clearer OLED-vs-OLED than LCD-vs-LCD, so I am excited about better display technologies making the refresh rate race even more visible tomorrow than yesterday, too! This is what I predicted, and it lines up exactly with Blur Busters knowledge, as OLED follows Blur Busters Law much more accurately than LCD does (no GtG error margin to fuzzy up the math).

Anyway, that being so, I get to see many displays not yet on the market at industry conventions such as these.

This is why so many people trust me when I talk about cherrypicked displays I've seen. I'm such a huge display geek & fan!
 
@3dfan I think you're being a little overly harsh here.

Let me apologize, but it really upsets me -- his intention to "educate", better said "manipulate", his followers: not only understating and not mentioning anything about the flaws and trade-offs already discussed (because, again, if I hadn't mentioned them, he would never have bothered to bring up some of them), obviously in order to make the XG2431 look better and more superior to CRT than it really is, but also manipulating with wrong information and even with lies when comparing to CRT. Since CRTs have become very popular -- especially the FW900, currently mentioned by many famous YouTubers and sites -- he overpraises the XG2431 against the FW900 to psychologically convince people to buy those ViewSonic monitors and ensure his earnings.

I have not been posting about all this with the intention of defending CRT against any LCD, or out of plain "haterism", or to convince anyone to buy or not to buy any brand or type of monitor. No, this is rather with the intention of defending the customer's right to be transparently informed about a product.


How would one feel if one saw a product being advertised with the wonderful features one has been seeking, and the seller figured out your interest and came to your location -- like him coming to these forums to advertise his business-earning monitors -- to manipulate you psychologically into buying his commissioned product, talking up its wonderfulness, but not mentioning its many flaws, tradeoffs, and requirements? So you get convinced, go and buy the product without being informed about those important things, and only after you have spent your money do you discover those unmentioned flaws that make the features you were interested in practically useless. The seller didn't care to ethically inform you about those flaws, which would have let you use your own criteria to accept or reject the product knowing the pros and cons you have a right to know about. This is about marketing ethics; instead the seller tries to maximize selling chances, and also tells you lies and uses false advertising tactics to convince you, as with the following facts:

Proof of him even using lies against CRTs to make them look inferior and lift up the XG2431:

https://blurbusters.com/black-friday-blur-busters-approved-viewsonic-xg2431-in-stock-at-amazon/

That's a link to the advertising page for the XG2431 on the Blur Busters main site.

A quote from Mark Rejhon, from one of the comments on that page:


For those who wish to use 60Hz single-strobe, make sure:

– Adjust to a dimmer picture (50 nits or less);

– Add extra viewing distance;

– Increase amount of room lighting (never use 60Hz PureXP in dark room);

– Enable only during retro content (e.g. when YouTube 60fps or console 60fps is already playing).



This is because:

Yesterday’s CRTs were small (17 inch)

– Today’s monitors are HUGE, so flicker is worse.

Yesterday’s CRTs were dim (50 nits)

– Today’s monitors are BRIGHT, so flicker is worse.



so "yesteray monitors were 17 inch"?? really??? specialy fw900, GDM-520, hitachi, LaCie, Iiyama, etc that many people as today still use.

It's totally false, and he knows it, that "yesterday's" CRTs "were"...... (ARE) as dim as 50 nits; CRTs can easily go twice or even more than twice that value, just from the OSD menu, without special tweaking, hacking, or science. This falsity is evidently done with the goal of making CRTs look as dim as the ViewSonic in its equivalent CRT-quality mode, so he keeps trying to mask CRT advantages against those monitors.

A quote from the Aperture Grille XG270 video review: "Ultra, which we'll also see in a moment, is pretty amazing but it's too dark at 70 nits; my Dell CRT is nearing 18 years old and it can hit 100 nits." Read that right? 100 nits! Not 50!!
Also, it can be seen in that video that the XG270's best motion clarity mode, "PureXP Ultra", has a max brightness of 72 nits (at 119 Hz, hence requiring a GPU capable of a constant 119 fps), which is even higher (while still being too dark) than the 53 nits reported for the XG2431's "better than CRT motion quality" PureXP Ultra mode -- so it means the XG2431 is even dimmer than the already very dim XG270 in its "superior to CRT motion" Ultra mode.


I can see Mr. Rejhon replying again, "there are many brighter PureXP modes instead of Ultra"...... yeah!! But what's the point of those modes being brighter if they also degrade the motion clarity below the "superior to CRT" motion quality level?





Another proof of lies, regarding CRT flicker:



https://forums.blurbusters.com/viewtopic.php?f=4&t=7694&p=59172&hilit=flicker+xg2431+60+crt#p59172




A quote from Chief Blur Buster:

"I've managed to move the industry needle a little bit. People have long been whining, "Please Add 60 Hz Single-Strobe". It flickers badly like a CRT but some of you still love the nostagilia........"

You know that's a lie: 60Hz single-strobe flicker is actually more noticeable, more intense, than a 60Hz CRT, not "badly like a CRT". No offense, but surely most of those followers who blindly believe everything he says (I was one of those until the XG270 release) won't be able to grab a CRT and witness the veracity of these words for themselves, so it's easy to manipulate their ingenuousness.

Ironically, he even admitted recently that the flicker is less comfortable than a CRT at the same frequency, quoting me from a few posts above (and of course, he would not have mentioned it if I hadn't mentioned those flaws):

Also, it is true you typically need 100Hz+ strobe to be as comfortable as a 75-85Hz CRT -- the squarewave strobe requires a little more excess refresh rate to compensate.


More proof of false visual advertising manipulation:

[the "Blur Busters Approved" before/after comparison image from the XG2431 page]


In that picture from their XG2431 advertising front page, https://blurbusters.com/black-friday-blur-busters-approved-viewsonic-xg2431-in-stock-at-amazon/
this image is clearly selling you the idea that his certified monitors will have perfectly clear, bright, artifact- and crosstalk-free motion quality: it can easily be observed that the left picture is darker and has motion artifacts such as crosstalk. But it's already known what the reality is: both certified monitors suffer from massive brightness loss at their best motion quality, so in the real world you will not get that brighter image on the right with the image being that clear during motion on those ViewSonic monitors. If you use the PureXP Ultra mode to achieve that best-motion-quality image, you will get a rather darker image like the one on the left, or you will have to raise PureXP levels to get a brighter image like the one on the right -- but then you won't get a moving image as clear as the right image shows, since PureXP levels higher than Ultra also degrade motion quality.
Also, not forgetting the high GPU requirements to achieve a crosstalk-"free" image: as he wrote in the YouTube comment quoted in my reply above, you will need something from 100Hz up to 180Hz to find a compromise between flicker and crosstalk, so you will require a GPU capable of running from 100 to 180 fps constantly to achieve that (a tradeoff requirement I also don't see him caring to inform people about).



I've decided to back this up with example evidence (as a pre-emptive protection of cred, given the naysayers that exist in these forums).

I have also decided to back up with evidence how you are abusing and manipulating the faith your followers have in you, to make them buy those "certified" monitors at the cost of their many flaws and tradeoffs that you do not care to disclose in an honest, direct, transparent manner -- again, if it weren't because I mentioned those flaws, you would not have bothered to, for your own convenience.

It's a shame, because I really liked Blur Busters; it really was a different and interesting site, and indeed it felt "educative" in the past, different from those YouTubers and sites that nowadays pretend to make money by feeding on "followers", "influencers" and "likes", with a lot of misinformation based on sensationalism, in order to feed their business. I see Blur Busters going in that direction, sadly, since you sold your soul to ViewSonic. Only certifying flaw-infested ViewSonic-brand monitors, when there are plenty of other brands of tunable strobing monitors out there, doesn't make any sense other than as pure business strategy, rather than being educational about motion clarity science and its evolution in an unbiased manner. Normally that business strategy would not be a bad thing, if it weren't for those unethical, manipulative, distorting, lying tactics used to sell a product that does not have the advertised quality -- whose overpraised "superior to CRT" strobing features, accompanied by all those flaws, tradeoffs, and high GPU requirements, become practically unusable in real-world usage.
 
3dfan, with due respect...

I think that you just end up looking silly with that post, because of Blur Busters' overall honesty here --

How would one feel if one saw a product being advertised with the wonderful features one has been seeking, and the seller figured out your interest and came to your location -- like him coming to these forums to advertise his business-earning monitors -- to manipulate you psychologically into buying his commissioned product, talking up its wonderfulness, but not mentioning its many flaws, tradeoffs, and requirements? So you get convinced, go and buy the product without being informed about those important things, and only after you have spent your money do you discover those unmentioned flaws that make the features you were interested in practically useless. The seller didn't care to ethically inform you about those flaws, which would have let you use your own criteria to accept or reject the product knowing the pros and cons you have a right to know about. This is about marketing ethics; instead the seller tries to maximize selling chances, and also tells you lies and uses false advertising tactics to convince you, as with the following facts:

Proof of him even using lies against CRTs to make them look inferior and lift up the XG2431:

https://blurbusters.com/black-friday-blur-busters-approved-viewsonic-xg2431-in-stock-at-amazon
I haven't lied here on that page.
You do not have an XG2431 in your hands, as far as I know as of this date.

It's still true that well-tuned LCDs can beat a CRT in the category of motion clarity, as I've described earlier, and others have confirmed it too.
That's still a fact, even if it didn't used to be true in the past.
It's becoming truer and truer as more time passes, and as more modes become easily achievable (better Hz availability, better brightnesses, etc).

For those who wish to use 60Hz single-strobe, make sure:
– Adjust to a dimmer picture (50 nits or less);
– Add extra viewing distance;
– Increase amount of room lighting (never use 60Hz PureXP in dark room);
– Enable only during retro content (e.g. when YouTube 60fps or console 60fps is already playing).
These are correct honest best-practices for a strobe backlight.

I even correctly recommend 50 nits to make the flicker of 60 Hz single strobe less painful.
There's a massive difference in 60 Hz single-strobe comfort when you follow these best practices.

This is because:
Yesterday’s CRTs were small (17 inch)
– Today’s monitors are HUGE, so flicker is worse.
Yesterday’s CRTs were dim (50 nits)
– Today’s monitors are BRIGHT, so flicker is worse.
so "yesteray monitors were 17 inch"?? really??? specialy fw900, GDM-520, hitachi, LaCie, Iiyama, etc that many people as today still use.
You're still quoting honesty.

"Yesterday's" refers to "the olden days", not 24 hours ago.
In year 1994, you could only buy a 15" or 17" CRT. Getting a bigger PC CRT of 17" was luxury in the days when most IBM PC Compatibles / Compaq Deskpros / etc came with 14" or 15"

I purchased a Samsung Syncmaster 17GLsi in year 1995 for a 4-figure price, when 17" was still luxury.
Perhaps in some countries/cultures "yesterday" is misunderstood. Maybe "yesteryear" is more appropriate. But turning a grain of sand into a mountain, really?

Also, not all CRTs have the same brightness. Some are dimmed to 50 nits due to wear and tear, as they continue to be used today and start to wear out.

It's definitely true that ApertureGrille prefers brighter, but that's not true for everybody.
There is a huge number of people where 100 and 200 nits is way too bright, and 50 nits is their sweet spot.
And many well-worn CRTs (not ApertureGrille's) don't even reach 100 nits. On top of that, many CRT fans intentionally dim their CRTs a little to help them last longer. Besides, it reduces electron beam bloom too, and increases spatial resolution that way.
Not everyone has the same brightness desire.

I was more than honest in actually mentioning 50 nits as a recommendation for 60 Hz single-strobe.
If people think 50 nits is too dim, they can avoid 60 Hz single strobe and use a higher Hz instead for brighter.

Sure -- you can get other CRT attributes that are much brighter (purchase a NOS or babied tube), but it's still lovely to get perfect LCD geometry (1080p square-pixel goodness for maximum sharpness in PC games) while simultaneously getting CRT motion clarity. Some people prioritize "great geometry + bigger screen than a CRT + motion clarity competing with CRT" over "brightness + colors + CRT motion clarity". Many different people have different checklists for what they want in a screen.

I don't see any dishonesty here.
I want to profusely thank you for posting the link, because I pre-emptively promised the HardForum moderators that I would never post a link to blurbusters directly again (but that I would link TestUFO / other sites).
You've just strengthened my honesty case -- thank you for posting the link.

Re-quoting the link again for others to click:
The most useful part of this is the "step by step guide", which you quoted parts from.

And in your continued attempt to put untrue words in my mouth:
it's totally false and he knows it
What falsehood? It's true. I did publicly recommend 50 nits for 60Hz single strobe. I think you're the one embarrassing yourself here.

That doesn't mean that ApertureGrille and I aren't both correct. CRTs vary quite a bit in brightness, whether you're visiting a barcade (like "Miniboss" in San Jose) or a big room of CRTs. They wear down at various rates, and many have dimmed quite considerably. ApertureGrille is correct, but it's only 1 cherrypicked CRT sample.

It's definitely possible to have a cherrypicked CRT blow away a cherrypicked XG2431 mode, but that doesn't make the XG2431 page a lie. You can also tune an XG2431 to beat CRT motion clarity more easily than you can with an XG270.


(at 119hz hence requiring a gpu capable of 119 constant fps)
The XG270 had precalibrated presets that could not be manually recalibrated. The XG2431 can be strobe-recalibrated at any Hz.

Because of that, any mode from 60Hz through probably approximately 120-150Hz (depending on panel and temperature) -- all refresh rates in 0.001 Hz increments -- if calibrated with all the tips including QFT -- can be superior to the XG270's best 119Hz or 120Hz mode. That's hundreds of different modes superior to the XG270's 119 Hz mode*. That gives you WAY more refresh rates to find compromises with -- e.g. a refresh rate higher than 60 Hz that is still zero-crosstalk (for 90%-100% of the screen surface), finding a sweet spot where you can get brighter.

Also, unlike the XG270, the XG2431 has many modes where 100 and 150 nits is great at refresh rates higher than 60 Hz -- the 50 nits recommendation is specifically for single-strobe 60 Hz. See, I honestly mentioned 50 nits there -- people can decide for themselves whether usable 60 Hz single strobe is too dim or not. I erred on the side of honesty there, as you can see.

Because of this flexibility, you can reduce the refresh rate to something much closer to what you would use with a CRT, and have GPU requirements more similar to a CRT's than, say, needing to hit a sweet-spot mode like 119 Hz on the XG270 (whereas *all* Hz on the XG2431 between 60 and 120-150 blow away the XG270 with a retuned QFT mode). This puts your "more GPU power than CRT" argument dead in the water.

*119 Hz was certainly a very interesting happenstance, a side effect of the 120 Hz calibration being slightly off for different panel manufacturing batches (I used 119 Hz successfully as a "look at what happened" argument to convince ViewSonic to let end users re-calibrate strobe). Originally the 120 Hz preset was better than 119 Hz, but eventually this wasn't true anymore as different panel batches had different strobe sweet spots that required 1/100ths-increment overdrive gain readjustments. XG2431 Strobe Utility permits this. It also meant that the quality of the XG270 unfortunately varied slightly from specimen to specimen (just as for all hardcoded strobes, including ULMB) because of things like panel batches (panel lottery) and temperatures. Once I discovered all of these issues, I published all the best practices honestly on the XG2431 page you thankfully conveniently linked to.

"I've managed to move the industry needle a little bit. People have long been whining, "Please Add 60 Hz Single-Strobe". It flickers badly like a CRT but some of you still love the nostagilia........"

You know that's a lie. 60hz single strobe flicker is actually more noticeable, more intense than a 60hz CRT, not "badly like a CRT". No offense, but surely most of those followers that blindly believe everything he says (I was one of those until the xg270 release) won't be able to grab a CRT and witness the veracity of these words for themselves, so it is easy to manipulate their ingenuousness.
I don't see it as a lie at all -- "like" isn't necessarily "exactly like" -- there is certainly a minor biasing where you need a little more strobe Hz for a squarewave-strobe versus a CRT fade-wave. But it's not a big difference like 60Hz-vs-500Hz. For many people, bumping the Hz to 75Hz makes it flicker less than a 60Hz CRT tube. Others need more of a Hz bump. Flicker sensitivity varies a lot. Squarewave strobe is a technology limitation of many LCD backlights and edgelights, and that's why I'm continuing to strive to improve beyond that.

You are simply finding tiny straws/nits and turning them into mountains to attack Blur Busters, I feel...

There are many past comments (youtubes, forums, elsewhere) of people enjoying the 60 Hz strobe. I could screenshot them for you. They profess it's just like a 60 Hz CRT for them.

Many people often compare OLEDs to CRT and plasmas to CRT, so comparing strobe backlights to CRT is fair game. Good strobe backlights already outperform plasmas and OLEDs in motion resolution too; that's no longer as difficult a technological benchmark checkbox as it used to be.

That's why I published the recommendation of 50 nits to make the flicker look closer to a 60Hz CRT tube -- another example of honesty, for flicker equivalence. If you want brighter than about 50 nits and still insist on using a static Windows desktop (which was eye-searing on 60Hz CRTs too), then 60 Hz single-strobe is not for you. But if you're playing Super Mario 3 in a Nintendo emulator, with appropriate tuning and adjustments, the 60 Hz single strobe is amazing -- and some have already mentioned that the reduction of motion-blur eyestrain exceeds the flicker eyestrain when it comes to fast-scrolling platformers, once brightness is sufficiently lowered. Not everyone, but there are people who have confirmed as much. This is not dishonesty.


more proof of false visual advertising manipulation:

View attachment 476267

In that picture from their XG2431 advertising front page, https://blurbusters.com/black-friday-blur-busters-approved-viewsonic-xg2431-in-stock-at-amazon/
this image is clearly selling you the idea that his certified monitors will have perfectly clear, bright, artifact/crosstalk-free motion quality. It can easily be observed that the left picture is darker and has motion artifacts such as crosstalk, but it's already known what the reality is: both certified monitors suffer from massive brightness loss at their best motion quality, so in the real world you will not get that brighter image on the right with the image being that clear during motion on those ViewSonic monitors. If you use PureXP Ultra mode to achieve that best-motion-quality image, you will get a rather darker image like the one on the left, or you will have to raise the PureXP level to get a brighter image like the one on the right, but then you won't get a moving image as clear as shown on the right, since PureXP levels higher than Ultra also degrade motion quality.
This is strobe-vs-strobe, not strobe-vs-CRT.

It's an honest representation when it comes to the facts:
- You can get that perfect zero crosstalk nowadays if you retune the strobe.
- Many LCD strobe backlights are indeed darker than XG270 and XG2431.

Certainly XG2431 does not get as bright as DyAc, but it's far from being the dimmest strobe backlight. There are strobe backlights that struggle to even exceed 100 nits at their brightest setting, and have much worse crosstalk.
It is true there's cherrypicking going on, but you can definitely get that zero-crosstalk (especially for screen middle) on several LCD specimens.

But the fact is that CRTs vary a lot, and LCDs vary a lot. Many aren't aware that the best strobed LCDs can exceed the motion clarity of a CRT.

I feel that you're finding small nits to make major attacks on Blur Busters...

Also not forgetting the high GPU requirements to achieve a crosstalk-"free" image: as he wrote in a quote from a YouTube comment in the reply above, you will need something from 100hz up to 180hz to find a compromise between flicker and crosstalk, so you will require a GPU capable of running from 100 to 180 fps constantly to achieve that (a tradeoff requirement I also don't see him caring to inform about).
Remember, I did say that technology is improving.
Unlike the XG270, the XG2431 now lets you calibrate any Hz from 60 to 240, so you can choose a Hz that doesn't flicker too much (e.g. 85Hz or 100Hz) and that dramatically lowers the GPU requirements. This puts your argument dead in the water.

I have also decided to back up with evidence how you are abusing and manipulating the faith your followers have in you, to make them buy those "certified" monitors despite the many flaws and tradeoffs you do not care to inform them about in an honest, direct, transparent manner. Again, if it wasn't because I mentioned those flaws, you would not have bothered to, just for your own convenience.
Be my guest.
I have worked hard to be far more honest than most big companies when it comes to many things. (witness the difficulties of getting VRR working perfectly, as one famous example -- in all areas such as game compatibility to overdrive quality, etc)

It's a shame, because I really liked Blur Busters. It really was a different and interesting site and indeed it felt "educative" in the past, unlike those YouTubers and sites that nowadays try to make money by feeding "followers", "influencers" and "likes" with a lot of misinformation based on sensationalism in order to feed their business. I see Blur Busters going in that direction, sadly, since you sold your soul to ViewSonic. Only certifying flaw-fested ViewSonic-brand monitors, when there are many more brands of tunable strobing monitors out there, doesn't make any sense other than purely as a business strategy, rather than being educational about motion clarity science and its evolution in an unbiased manner. Normally that business strategy would not be a bad thing, if it weren't for those unethical, manipulative, distorting, lying tactics used to sell a product that does not have the advertised quality: the overpraised "superior to CRT" strobing features, accompanied by all those flaws, tradeoffs and high GPU requirements, become practically unusable in real-world usage.
Just based on how silly this conclusion paragraph is....

3dfan, it's my understanding that you do not have an XG2431...
I'll let other readers judge for themselves -- given technology improvements I strive for.

Everyone has different sensitivities. Sensitivity to color (pickiness on color), tearing (VSYNC OFF), latency, flicker, motion blur, stutter, brightness, etc. The fact is that XG2431 has satisfied quite a large number of people. There are also a huge number of CRT fans that adjust their tubes to 50 nits to make their tubes last longer too, especially in a windowless basement room. It's not that particularly dim for them if their eyes are already adjusted to a darker room. I actually recommended the number -- a move of honesty.

Those who actually have an XG2431, despite the flaws of LCD strobing technology, confirm it's one of the best 60Hz-strobe implementations they've seen compared to other LCDs. Some hate it, some love it -- it's a polarizing feature -- but almost no other monitor offers this feature.

Blur Busters certainly isn't perfect and many things are often misinterpreted. I've messed up terminology before (e.g. "yesterday" vs "yesteryear"). But it's plainly clear (to the majority) how honest I've worked to be. Compare me to an average big company or even a big-time youtuber (e.g. LinusTechTips) and I've been (relatively) sterling in honesty.

Communicating technology in a way that does not confuse people is astoundingly hard. I have been deaf since birth, and English was hard for me to learn, but I was heavily tutored well into high school, so I was able to become a fairly good English writer, and became really good (relatively speaking, compared to most) at communicating unusual technology such as the generic benefits of high Hz.

It's also clear that I chimed into this thread with a superior strobe technology (CRT electron beam simulator), and thus my more noble intents were clear at the beginning.

Remember, a different hobbyist-turned-business started this topic (SED/FED)...
Certainly the discussion did go off on many tangents (while still being on-topic in the spirit of another budding hobbyist-turned-business offering a FED/SED), but it's not worth making an attack based on that. Since a different hobbyist-turned-business posted the original topic offering a potential SED/FED, I'm another hobbyist-turned-business chiming into this thread, and you never attacked rabidz7's claims. So I'm still within the spirit of the topic. FED/SED has some similarities in refresh-drive requirements to a plasma (single-pixel digital drive of an inherently analog pixel, whether via the equivalent of single-pixel electron guns or otherwise), requiring the addition of some temporal artifacts that were worse than CRT on many of the early prototypes, even if vastly better than all early LCDs at the time. They were working on improving those, but the fact that this was left out, and you never attacked that... Curious.

Sure, I may not be Gandhi because I am paid from the revenues of the Blur Busters business, but people accept that. So are schoolteachers, as well as YouTubers who teach. Some are self-employed (some schoolteachers too) and others are hired by a corporation (YouTubers too; LinusTechTips hires salaried employees). How people get paid runs all over the gamut. And even schoolteachers are influenced (sadly, due to the politics of their states/cities) -- I really hate that stuff too. When you think about what is happening in certain countries, it's pretty clear I'm far more honest than an average schoolteacher in certain countries, yet you attack me... Curious.

But, at the end of the day, many do their jobs because they love their job. Many also say things that 90% of people consider honest while 10% consider misleading. Pick your %, but this is the gist. There is nobody whom 100% of the population unanimously agrees is honest, even when they say "2+2=4". That's the sad reality. I am not immune to that either. Any smart person sees that I'm being reasonably honest while understanding the forum fine lines involved here on HardForum (that's why I thanked you for posting a link).

I'm not one of those YouTubers who "don't read the comment sections because they're too weak to handle the criticism". My role as a Hz mythbuster means I've also got to address people like you who try to deceptively spin relative honesty into strong dishonesty. Honestly, I've been mythbusting since the 1990s. I've been mythbusting "Humans can't tell 30fps-vs-60fps" since the FidoNet BBS days [screenshot] -- I programmed that demo on a 386DX (though I upgraded to a 486DX2-66 at almost the same time) on the very 17" Samsung Syncmaster GLsi that was almost the biggest CRT money could buy at the time. It was so heavily used for years that the tube even wore down to less than 100 nits by the time I sold it.

I rest my case...

[Edited to add more info]
 
Now....

Full wholehearted admission time! :)

One of my biggest problems today is I don't have as much time to post my flagship Area 51 articles -- but I am going to have to do that out of necessity soon.

It does increase the emergence of naysayers, even things like reviewers who've never tried QFT custom resolution timings, and manufacturers that haven't yet added the features that make things easier (e.g. QFT / Large VTs in the monitor menus). These make massive differences to strobe quality, and also reduce strobe lag further (to less than NVIDIA ULMB). And publishing more such articles may also finally convince a manufacturer to add QFT EDIDs. Writing articles is often unpaid time that may only pay off two years into the future (reputation, successfully convincing a manufacturer, etc). There are just so many Blur Busters battles.

- Need to publish new white papers on motion clarity accomplishments from the technological improvements that have been happening...
- Need to publish HOWTOs like the "Zero Strobe Crosstalk HOWTO", etc...
- Need to publish easier software (e.g. autocomputing QFT timings wizard for easier Large Vertical Totals)...
- Need to keep goading/convincing more manufacturers to keep pushing the envelope....
- Need to spend more time recruiting to hire a writer for the Blur Busters cover page....
- Would like to polish off more incomplete code bases of unreleased Blur Busteresque stuff....
- Etc, etc, etc...

Guilty as charged -- there are too many Blur Busters initiatives and not enough time / resources / funding / etc.

Do uncompleted projects make me a liar? Nah... No. Well-intentioned efforts fail to check off as many checkboxes as Blur Busters would've liked.

Some simply because of sheer time crunch, others because of external problems (e.g. supply/pandemic disruptions in China), etc. But most confirm that Blur Busters has already done an amazing amount, all things considered.

Still a lot of Blur Busters initiatives to keep doing. Blur Busters can only pick a few battles at a time to tip more refresh rate race dominoes.

Anyway, this is a major topic sidetrack. The bottom line is that blur busting is a noble niche to pursue. This includes trying to beat CRT in more and more attributes over the long term, and it is becoming technologically easier (witness XG2431 > XG270), with increasingly similar requirements to CRT (e.g. GPU power, since you can use QFT + Strobe Utility to tune things like a 98.750Hz or 113.357Hz mode (and 144Hz too, at least for the middle vertical 80%+) on the XG2431 much better than the 119Hz mode of an XG270), as an example of the increasing flexibility that is now happening. All checkboxes are being hit on a best-effort basis, as I fight the technology limitations and the politics at manufacturers. But this is not the ultimate goal. The ultimate goal is a CRT electron beam simulator that matches CRT more perfectly (flickerfeel, phosphor decay, eyestrain identicalness, look and feel, etc) while also giving users a full continuum of adjustment flexibility between CRT and sample-and-hold (in varying blended amounts). That's why I originally replied to this thread about a CRT electron beam simulator holy grail, before these interesting topic tangents appeared.

Many are cheerleading me along on this fun but difficult journey.
 
Mark, I always enjoy your writings.

Are there any breakthroughs or news on the front of VRR with simultaneous strobing/scanning/BFI? Every attempt I've seen so far has been inadequate.
Short Answer / Near term: High-Hz OLED, even if without VRR yet. Also working on strobed VRR LCD implementations with some vendors too, but no new news at the moment, nor anything I'd consider nearly as big a breakthrough as good fast-GtG 1000Hz technologies would be.

Long Answer / Long term: ultra-high refresh rates actually behave like per-pixel VRR, making VRR unnecessary.

This is because you can have concurrent 23.976fps, 24fps, 25fps, 50fps, 59.94fps and 60fps windows side by side without any human-visible pulldown judder, because ultra-high refresh rates push the refresh-roundoff stutters/judders far below the human-visibility noise floor created by the sheer persistence of the low frame rates.

(Simplified example for 2000Hz sample-and-hold: the 1/2000sec = 0.5ms literal nanostutter of a refresh-cycle roundoff is lost in the fog of the ~42ms persistence of 24fps, at least for current desktop display sizes, FOVs, resolutions, and many content framerates. That's roughly an 80:1 persistence:stutter ratio for the error margin -- not visible without massive amounts of persistence reduction at sufficiently ultra-high resolutions (to get MPRT in ms below the stutter in ms), which you would never do with 24fps material anyway. Even double-strobe film projectors still showed each frame for roughly ~12ms per strobe of each double strobe if it was a 180-degree rolling shutter. Heck, you could even adjust the degree-ness of a software-based rolling shutter for a custom motion look. At the very least you can faithfully emulate the double strobe of a 35mm film projector accurately, without visible Hz-roundoff issues. It's slightly less motion blur than sample-and-hold 24p, while not being too flickery, for those 35mm purists.)
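To make that napkin math concrete, here is a minimal Python sketch (my own illustration, not code from any Blur Busters tool) that compares the one-refresh-cycle roundoff error against the sample-and-hold persistence of each content frame rate:

```python
# Napkin math: refresh-roundoff "nanostutter" vs. sample-and-hold persistence.
# Illustrative only -- mirrors the 2000 Hz / 24 fps example above.

def stutter_vs_persistence(display_hz: float, content_fps: float) -> None:
    stutter_ms = 1000.0 / display_hz       # worst-case roundoff: one refresh cycle
    persistence_ms = 1000.0 / content_fps  # how long each content frame stays on screen
    ratio = persistence_ms / stutter_ms
    print(f"{display_hz:.0f} Hz display, {content_fps:7.3f} fps content: "
          f"stutter {stutter_ms:.2f} ms, persistence {persistence_ms:5.1f} ms, "
          f"ratio {ratio:.0f}:1")

for fps in (23.976, 24, 25, 50, 59.94, 60):
    stutter_vs_persistence(2000, fps)
```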

Heck, let's fancy up this napkin exercise even further. You could software-emulate a retro display in some of those video windows simultaneously (one window emulates a CRT electron beam, another window emulates a DLP, another window emulates a plasma, and another window emulates LCD GtG). The important thing is you need a sufficient "retina" refresh rate (for the given resolution and FOV of the display), plus preferably a little oversample margin; then you can pull such feats off. Each window can have its own "refresh rate" (e.g. a simulated 72Hz CRT playing the 24p video for a triple-strobe effect). You can have multiple framerates independent of multiple virtualized refresh rates independent of the main display's ultra-high-Hz refresh rate. This is a fairy tale today since such a display does not exist, but as display Hz goes up more and more, we are able to simulate more retro refresh behaviors.

Now, let's talk about a 2000Hz display, as an example of per-pixel-VRR-like behavior.
On a non-VRR 2000Hz display, the timing error from running 1999fps instead of 2000fps is at most one refresh cycle (1/2000sec) -- so tiny at current resolutions and motion speeds that, for practical purposes, you don't see* a single framedrop, unlike the very visible framedrop of 60fps-vs-59fps on a 60Hz display. So you can have multiple low or middle framerates jittering by 1/2000sec against the ultra-high refresh rate -- and it ends up visually looking like per-pixel VRR.

(*Exception: You might see the 1/2000sec stutter in super-extreme cases, like a theoretical ultra-low-MPRT 8K VR headset at ultrafast head-turn motion speeds, since 1/2000th of an 8000 pixels/sec head turn is still a 4-pixel stutter. This assumes that MPRT (motion blur in milliseconds) is well under the stutter error (1/2000sec = 0.5 millisecond stutter), so that motion blur doesn't hide that ultra-tiny microstutter (I guess I should call it "nanostutter"!). This would still be very subtle compared to today's refresh rates. I only mention this extreme case because retina refresh rate is different depending on the display's angular resolution versus human vision's angular resolution, the FOV over which those pixels are spread relative to human angular resolving resolution, etc. -- which is why 2K/4K/8K in VR is much blurrier than the same resolution on a desktop-size monitor, because wide-FOV VR spans a bigger FOV than an IMAX movie theater screen -- amplifying the limitations of 8K far beyond non-VR 8K, and turning 8K into a non-retina resolution in many VR use cases.)
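Here is the same extreme case as a hypothetical worked example (the MPRT values are my own illustrative assumptions; the 8000 px/sec head-turn figure is from the paragraph above):

```python
# Hypothetical extreme case: is a one-refresh-cycle "nanostutter" hidden by the
# display's own motion blur (MPRT), or potentially visible?

def nanostutter_check(display_hz: float, motion_px_per_sec: float, mprt_ms: float) -> None:
    stutter_px = motion_px_per_sec / display_hz        # one refresh cycle of roundoff error
    blur_px = motion_px_per_sec * (mprt_ms / 1000.0)   # motion blur trail from persistence
    verdict = "potentially visible" if blur_px < stutter_px else "hidden by motion blur"
    print(f"{motion_px_per_sec:.0f} px/s, {display_hz:.0f} Hz, MPRT {mprt_ms} ms: "
          f"stutter {stutter_px:.1f} px vs blur {blur_px:.1f} px -> {verdict}")

# 8K VR head turn at 8000 pixels/sec on a 2000 Hz display (= 4 px nanostutter):
nanostutter_check(2000, 8000, mprt_ms=0.1)  # ultra-low-persistence strobed case
nanostutter_check(2000, 8000, mprt_ms=0.5)  # plain sample-and-hold at 2000 Hz
```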

Eventually, the ultra-high quadruple-digit refresh rates of the future behave as de facto per-pixel VRR, and can make VRR obsolete in the long term too. 23.976p and 24p will look correct since mis-framepacing will be only a 1/2000sec stutter, completely lost in the blur of 1/24sec sample-and-hold (of frametime).

High Hz doesn't have to be used for just high Hz -- it has other low-framerate benefits such as this neat de facto per-pixel-VRR behavior, with the attendant near-zero latency of an ultrafast 1/2000sec scanout (for a 2000Hz display), even if you're just doing 24fps or 60fps.

With such sheer Hz, one can create algorithms to control the display motion blur of any frame rate in multiple ways: either by simply using ultra-high framerates, or by inserting blackness/fade/simulated phosphor/etc (even in combination with simulated VRR, if you wished, as long as you had gamma-corrected alphablends across adjacent refresh cycles in your software algorithm). Infinite numbers of custom strobed-VRR implementations (in software) are possible via virtualization on a quadruple-digit Hz, too -- not just one hardware algorithm.
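As one hedged illustration of the gamma-corrected alphablend idea (my own sketch, not an actual Blur Busters algorithm): when a simulated flash isn't an integer number of digital refresh cycles long, the leftover fraction can be shown as a dimmer, gamma-corrected partial cycle instead of being rounded off.

```python
# Sketch: split a simulated strobe flash across adjacent refresh cycles, weighting
# the partial cycle in linear light and encoding it back through gamma so the
# average brightness stays correct. The 2.2 gamma and names are illustrative assumptions.

GAMMA = 2.2

def flash_weights(flash_ms: float, refresh_ms: float) -> list[float]:
    """Per-refresh-cycle output levels (0..1) for one simulated flash."""
    cycles_left = flash_ms / refresh_ms
    weights = []
    while cycles_left > 0:
        linear = min(cycles_left, 1.0)                     # fraction of this cycle covered by the flash
        weights.append(round(linear ** (1.0 / GAMMA), 3))  # linear light -> gamma-space level
        cycles_left -= 1.0
    return weights

# A 1.3 ms flash on a 0.5 ms refresh cycle (2000 Hz): two full cycles plus a dimmed partial cycle.
print(flash_weights(1.3, 0.5))  # -> [1.0, 1.0, 0.793]
```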

Or, yeah, you can slow down the scanout by virtualizing the scanout (like a CRT electron beam simulator) to get an identical scanout velocity to an old CRT. Virtualization is quite a neat way to piggyback on sheer Hz and create your own custom temporal modes in software, far fancier than monolithic black frame insertion. You can do any custom scanout speed simply by virtualization (software algorithms), in the same vein as a CRT electron beam simulator, but you could instead implement shader code that does a slow-LCD-scanout / slow-LCD-GtG simulator, or any oddball simulation of a retro display -- basically emulating an older LCD in 1/1000sec or 1/2000sec increments. Or a DLP. Or a plasma. Or VRR. Or strobed VRR. Etc.
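To make the virtualized-scanout idea concrete, here is a heavily simplified Python toy model (my own sketch under assumed parameters, not the actual BFIv3 / CRT beam simulator shader) that turns one 60 Hz source frame into several high-Hz subframes, with a bright beam band rolling down the screen and a phosphor-style decay behind it:

```python
import numpy as np

# Toy CRT-electron-beam simulator: one 60 Hz source frame -> N high-Hz subframes.
# A bright "beam" band rolls down the screen each subframe, while rows scanned in
# earlier subframes fade with a phosphor-like decay. All parameters are illustrative.

def crt_subframes(frame: np.ndarray, subframes: int,
                  decay_per_subframe: float = 0.35,
                  beam_height_frac: float = 0.12) -> list[np.ndarray]:
    height = frame.shape[0]
    beam_rows = max(1, int(height * beam_height_frac))
    persistence = np.zeros_like(frame, dtype=np.float32)
    out = []
    for i in range(subframes):
        persistence *= decay_per_subframe                   # decay of previously excited "phosphor"
        top = (i * height) // subframes                     # beam position for this subframe
        persistence[top:top + beam_rows] = frame[top:top + beam_rows]  # freshly excited rows
        out.append(persistence.copy())                      # present as one high-Hz refresh cycle
    return out

# Example: a flat grey 60 Hz frame, shown as 8 subframes on a hypothetical 480 Hz display.
grey_frame = np.full((240, 320), 0.8, dtype=np.float32)
subs = crt_subframes(grey_frame, subframes=8)
print(len(subs), subs[0].shape)  # -> 8 (240, 320)
```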

This is, however, an ultra-longterm view. Shorter term, we'll see more incremental display improvements.

"1000Hz is not the final frontier" Note:
Mind you, I simply use 1000fps 1000Hz (at near-0ms GtG) as a good round number that's adequate for electron-beam-simulating an average CRT. For more flexibility in temporally simulating all kinds of algorithms (e.g. adding software-based strobed VRR to a non-VRR ultra-Hz panel), I'd prefer this be done at 5000Hz+ rather than just 1000Hz+, for more algorithmic flexibility, since 1000Hz is not a retina refresh rate given sufficient resolution (the Vicious Cycle Effect, where higher resolutions and wider FOV amplify Hz limitations). Extra Hz also gives more room for temporally antialiasing between adjacent refresh cycles, like simulating non-refresh-divisible flashes via alphablended adjacent refresh cycles.

Extra Hz also simply gives more margin for eliminating visible Hz limitations such as motion blur, especially with insanely high resolutions: 8000 pixels/sec produces 8 pixels of motion blur on an 8K display with 1/1000sec sample-and-hold refresh cycles, while still being an easily eye-trackable one-screenwidth-per-second motion, making 1000fps 1000Hz sample-and-hold still too motion-blurry to be considered a "retina refresh rate" for a sample-and-hold display. This happens when you've got incredible resolutions such as 8K, far beyond CRTs back in the day -- extreme resolutions amplify Hz limitations dramatically, as long as the angular pixel resolution is still within human perception, as it still is on an 8K VR headset: it doesn't hit human static-image resolving resolution yet, and you still see the difference between static resolution and moving resolution if you don't strobe the VR screen, at least until quintuple-digit Hz for a sample-and-hold display.

At that stage, you also need more dramatic jumps up the refresh rate curve (e.g. 1000Hz -> 4000Hz) to easily see a human-visible difference for fast-motion content such as VR head turns or a fast-panning map, etc. -- nearer the vanishing point of diminishing returns. For direct-view high-resolution desktop monitors that are not 180-degree FOV, the retina refresh rate of no further humankind benefit is lower, but it is still no less than very deep into the higher end of quadruple digits, especially at 4K+ instead of 1080p.
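The sample-and-hold blur arithmetic behind that 8K figure is easy to sanity-check (again, just an illustrative calculation, not a measurement):

```python
# Sample-and-hold motion blur (in pixels) = eye-tracked motion speed x frame visibility time.
def sample_and_hold_blur_px(motion_px_per_sec: float, display_hz: float) -> float:
    return motion_px_per_sec / display_hz   # each frame persists for 1/Hz seconds

# One screen-width-per-second pan on an 8K-wide display (7680 px, rounded to 8000 px/s):
for hz in (120, 240, 1000, 4000):
    print(f"{hz:>5} Hz: {sample_and_hold_blur_px(8000, hz):.1f} px of motion blur")
# 1000 Hz still leaves ~8 px of blur at 8000 px/s, as described above.
```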
 
Hmmm
https://www.guru3d.com/news_story/a...nc_compatible_gaming_lcd_is_in_the_works.html

Unfortunately it's 24" 1080p with an esports TN panel (E-TN).
A step in the right direction for the speed of LCD pixels, though.
I can't wait to try it out -- LCD GtG needs to keep getting faster. GtG was trailing too far behind at 360Hz, erasing most of the 240Hz-vs-360Hz difference, as lovely as the emergence of 360 Hz is.

I still have a very old ~15" NEC XGA LCD here from year 2001 that has approximately a 50ms GtG ....to 90%. I actually probably saw it do 1 second GtG100% in a cold room (10C) in the middle of winter.

....On the topic of simulating a CRT electron beam (in software / GPU shader / etc), 500 Hz should enable about 8 refresh cycles per 60Hz frame. My CRT electron beam simulator experiments will probably look very 60Hz-comfortable on it (albeit too dark and washed out, due to the lack of HDR and because of inky-grey blacks; 60 Hz single strobe is still superior despite its flickeriness). I wish they'd combine ultra Hz, HDR and inky blacks more quickly to allow software-based electron beam simulators to blow away 60 Hz single strobe in overall quality -- but I guess we'll have to wait a few more years!

While we need more Hz, 240Hz-480Hz+ is the territory where CRT electron beam simulators start to look superior to hardware 60Hz single strobe, assuming enough nit headroom to keep it bright, and assuming reasonably decent blacks. Newer OLED (preferably with HDR) will make that easier.

A 400-nit 240Hz OLED will probably still do ~100+ nits with a CRT electron beam simulator. Most of the light in a 4-frame cycle of 240Hz-emulating-60Hz will likely use two consecutive refresh cycles that are shingle-overlapped, where gamma-corrected rolling phosphor fadezones overlap each other in consecutive refresh cycles (temporally shingled as a way of eliminating seams between digital refresh cycles) -- this isn't as algorithmically straightforward as a rectangular rolling scan. It's also much lower granularity than needed, as 8 or 16 refresh cycles to simulate one 60 Hz CRT refresh cycle would be vastly superior.
 
I still have a very old ~15" NEC XGA LCD here from year 2001 that has approximately a 50ms GtG ....to 90%. I actually probably saw it do 1 second GtG100% in a cold room (10C) in the middle of winter.

I had a win2k era Panasonic Toughbook with a screen that got that slow in sub-freezing conditions. It was one of the reasons I eventually gave up on what I wanted to use it for (computer starcharts while at the telescope) and let it go to final retirement (I bought it used around '06).

I'd managed to suppress the horror until today. 🤢
 
View attachment 476665

(I wonder what Hz & monitor Bart was using in that episode!...)
Probably my old Envision XGA 15" LCD from when they first came out; I got it on a smoking deal posted on FatWallet. Anyone else remember that site? It was the Slickdeals of '99 and the early 2000s. Like your LCD, it had a 50ms response time, and yes, I played Counter-Strike on it :ROFLMAO: plus EverQuest.
 