24" Widescreen CRT (FW900) From Ebay arrived,Comments.

You shouldn't be using those resolutions anyway. They're double refresh rate, meaning you get a double scan effect on 60fps games. Go to TestUFO.com at 120hz then 60hz and notice how clear the 60fps UFO is at 60hz.
Personally, i prefer the downsides that come with running at double refresh rate over the downsides that come with line doubling (or even worse, having 3D elements rendering at twice the resolution of prerendered 2D/background elements)
 
Personally, i prefer the downsides that come with running at double refresh rate over the downsides that come with line doubling (or even worse, having 3D elements rendering at twice the resolution of prerendered 2D/background elements)
Run an integer multiple, like 480p or 1200p, and add artificial scanlines. That way it doesn't look like line doubling, and you can run at 60hz.

Or better yet, why not get a small CRT TV with s-video or component?
 
1) Is the Nvidia dithering tweak different from the Windows registry dithering tweak?
Where can I find info on Windows registry dithering tweak?

2) If you do an Nvidia or Windows registry dithering tweak, does it still need to be reset (enabled and disabled and reboot) when the PC hibernates etc.?
I guess it is the same as with EDID changes. You need to either restart the PC for those tweaks to apply, reset the driver, or disconnect the display and connect it again.
I did not do extensive testing on this though. What happened before is that hibernation made this tweak go away, making it kinda useless.

Not sure if it "survives" sleep or if there are any other issues still but so far it is working fine for me.

3) Do you know if Nvidia's new 3000 series GPUs support dithering by default? Does AMD's GPU support dithering by default?
AMD has dithering by default.
I do not think Ampere would differ in such features. There was nothing stopping Nvidia from adding such a feature to their GPUs for many years now, and it has in fact been there for years. Though like I said, it did not work very well. At least the stability of the setting did not; by itself the dithering works very well.

BTW, for 8-bit displays you need to use:
ditherState – Enabled; ditherBits – 8 bit; ditherMode – Temporal
"DitherRegistryKey"=hex:db,01,00,00,10,00,00,00,01,01,01,04,f3,00,00,00

For 6bit panels you need:
ditherState – Enabled; ditherBits – 6 bit; ditherMode – Temporal
"DitherRegistryKey"=hex:db,01,00,00,10,00,00,00,01,01,00,04,f2,00,00,00
This should give better results than using 8-bit dithering on top of 6-bit+A-FRC, and far better results if it is an old monitor with FRC (like old TN panels and such).
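For anyone who wants to script this, here is a minimal Python sketch (my own illustration, not any official tool) that rebuilds the three DitherRegistryKey values given in this post (6-, 8-, and 10-bit temporal) from the bit depth. The byte layout is inferred purely from those three values (byte 10 looks like the depth code and byte 12 shifts with it), so treat it as an assumption:

# Illustrative sketch only: rebuilds the DitherRegistryKey byte strings for
# 6/8/10-bit temporal dithering. The layout (byte 10 = depth code,
# byte 12 = 0xF2 + depth code) is inferred from the quoted values, not from
# any Nvidia documentation.
DEPTH_CODES = {6: 0x00, 8: 0x01, 10: 0x02}

def dither_registry_value(bits: int) -> str:
    code = DEPTH_CODES[bits]
    value = bytes([0xDB, 0x01, 0x00, 0x00, 0x10, 0x00, 0x00, 0x00,
                   0x01, 0x01, code, 0x04, 0xF2 + code, 0x00, 0x00, 0x00])
    return '"DitherRegistryKey"=hex:' + ','.join(f'{b:02x}' for b in value)

for bits in (6, 8, 10):
    print(f'{bits}-bit temporal -> {dither_registry_value(bits)}')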

4) Is there a noticeable difference between dithering 8 bit and deep color 10/12 bit? I would imagine that 10/12 bit colors would still look better.
Most (if not all!) 10/12-bit LCD displays actually have 8-bit panels, even if in the spec sheets they are rated as 10-bit.

It is already hard to see the difference between good temporal dithering and native 8-bit, but it is possible. For something like 10-bit dithered to 8-bit you simply won't be able to see the difference versus a native 10-bit panel. Maybe with some specialized equipment... that no reviewer has, it would be possible.

BUT you might still see an improvement if you use this dithering tweak on a 10-bit panel (this setting: ditherState – Enabled; ditherBits – 10 bit; ditherMode – Temporal, "DitherRegistryKey"=hex:db,01,00,00,10,00,00,00,01,01,02,04,f4,00,00,00) versus not using it, because with it you get higher internal precision. What I am saying is that banding is much more visible than dithering noise. The internal precision used for dithering is probably 12-bit, but it might also be 16-bit.

Overall I would say dithering on 8-bit LCD panels should give the same visual fidelity as native 10/12-bit input.
For CRTs it should give almost the same fidelity as a 10-bit analog output. Given that CRTs do have some noise (from the DAC, cables, electronics, etc.), I would say that no one would be able to differentiate 8-bit+A-FRC (which this "temporal dithering" really is) from native 10-bit in ABX tests, and for all intents and purposes (like correcting gamma, calibrating the display, etc.) it solves the most important issue: banding. CRTs always need some gamma ramp changes, and most DACs are 8-bit, so in this case it is very important.
 
You shouldn't be using those resolutions anyway. They're double refresh rate, meaning you get a double scan effect on 60fps games. Go to TestUFO.com at 120hz then 60hz and notice how clear the 60fps UFO is at 60hz.

Personally, i prefer the downsides that come with running at double refresh rate over the downsides that come with line doubling (or even worse, having 3D elements rendering at twice the resolution of prerendered 2D/background elements)

The upside of running double/triple/etc. refresh is fewer issues with synchronization in emulators. If the refresh rate is not exactly matched to how fast the emulator spits out frames, the sync error is less visible. Also, input lag will be improved when using v-sync.

That said, I still prefer to run emulators at something like 640x480@~50/60Hz with scanlines, because I love the motion clarity that running the proper refresh rate provides.
Also 640x480+scanlines because it works out of the box. Configuring 240/288p resolutions on a PC is a pain, especially on modern operating systems.

480p with scanlines vs native 240p should not differ that much in how it looks, because the beam has some size and running different resolutions doesn't change it. On old monitors 640x480 has no scanlines, while on the FW900 even 720p has some and 480p definitely has them. I have not tested how 240p@120Hz looks on it, but I guess it should look pretty much the same as 480p@120Hz, except for lower maximum brightness and no contrast loss (which do happen on LCDs when using scanlines).
Also I prefer something like 50% scanlines most of the time, especially for something like SNES, NeoGeo, etc.
 
PM me about your unit when you get a chance...

Take care and keep safe!

Unkle Vito!
Hey Unkle Vito! I found someone around 180km from where I live who works with CRT monitors and TVs. Also, visually, the only thing wrong is the fuse, so maybe everything else was saved. I may be sending my monitor for repair today. I'll let you know how it goes.
 
Hey Unkle Vito! I found someone around 180km from where I live who works with CRT monitors and TVs. Also, visually, the only thing wrong is the fuse, so maybe everything else was saved. I may be sending my monitor for repair today. I'll let you know how it goes.
When a fuse breaks, 99% of the time (save the 1% where the fuse itself is defective), it's because something else is defective and caused an abnormal current consumption. Don't count too much on a simple fuse change fixing everything. ;)
 
Run an integer multiple, like 480p or 1200p, and add artificial scanlines. That way it doesn't look like line doubling, and you can run at 60hz.

Or better yet, why not get a small CRT TV with s-video or component?
Artificial scanlines don't have quite the same look imo.

A CRT TV *would* be better, but the decent ones are very expensive, so cost is a factor. I also absolutely hate unnecessary overscan. (And if i can't output 256x224@120hz to the FW900, i *definitely* wouldn't be able to output 256x224@60hz to a TV...)
480p with scanlines vs native 240p should not differ that much in how it looks, because the beam has some size and running different resolutions doesn't change it. On old monitors 640x480 has no scanlines, while on the FW900 even 720p has some and 480p definitely has them. I have not tested how 240p@120Hz looks on it, but I guess it should look pretty much the same as 480p@120Hz, except for lower maximum brightness and no contrast loss (which do happen on LCDs when using scanlines).
Also I prefer something like 50% scanlines most of the time, especially for something like SNES, NeoGeo, etc.
320x240 and 640x480 look radically different on an FW900.
 

Attachments

  • 240p.jpg
  • 480p.jpg
Artificial scanlines don't have quite the same look imo.

A CRT TV *would* be better, but the decent ones are very expensive, so cost is a factor. I also absolutely hate unnecessary overscan. (And if i can't output 256x224@120hz to the FW900, i *definitely* wouldn't be able to output 256x224@60hz to a TV...)

320x240 and 640x480 look radically different on an FW900.

I think he means 480p *with* scanline filter. Should look very similar to straight 240p. You're not using a scanline filter, which would (should) make every other horizontal line a black scanline.

Otherwise you're correct, scanlines would be way thicker on 240p than 480p without filters.
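To make the filter idea concrete, here is a rough numpy sketch (my own illustration, not Reshade or any specific filter) of what a basic scanline filter does: the 240-line frame is line-doubled to 480 lines and every other line is darkened, with the strength parameter covering anything from the 50% scanlines mentioned earlier up to fully black lines:

# Rough sketch of a basic scanline filter: line-double a 240-line frame to
# 480 lines and darken every other line. strength=1.0 gives fully black
# scanlines, 0.5 gives "50%" scanlines. Not any particular filter's code.
import numpy as np

def scanline_upscale(frame_240p: np.ndarray, strength: float = 1.0) -> np.ndarray:
    doubled = np.repeat(frame_240p, 2, axis=0)               # 240 -> 480 lines
    doubled[1::2] = (doubled[1::2] * (1.0 - strength)).astype(doubled.dtype)
    return doubled

frame = np.random.randint(0, 256, size=(240, 320, 3), dtype=np.uint8)
print(scanline_upscale(frame, strength=0.5).shape)           # (480, 320, 3)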
 
A CRT TV *would* be better, but the decent ones are very expensive, so cost is a factor. I also absolutely hate unnecessary overscan. (And if i can't output 256x224@120hz to the FW900, i *definitely* wouldn't be able to output 256x224@60hz to a TV...)
Expensive? Where do you live where CRT TV's are expensive? I see Trinitrons, JVC D-series, Toshiba AF, Panasonics, Sharps, all going for free or close to free every week.

And the NES, SNES, Genesis all output 256x224, so I'm not sure why you think that wouldn't look good on a CRT TV. These systems were literally designed for CRT TVs.
 
I think he means 480p *with* scanline filter. Should look very similar to straight 240p. You're not using a scanline filter, which would (should) make every other horizontal line a black scanline.

Otherwise you're correct, scanlines would be way thicker on 240p than 480p without filters.
As far as i know, there is no way to add a scanline filter to native Windows games if they aren't compatible with Reshade.

When scanline filters are possible, the results do look passingly similar, but the image is dimmer, and colors pop less. (The difference doesn't really come across in photos sadly.)

(And the pixel clock for 512x448@60hz is also too low, so there isn't even that alternate upside for Famicom/Super Famicom games)

Expensive? Where do you live where CRT TV's are expensive? I see Trinitrons, JVC D-series, Toshiba AF, Panasonics, Sharps, all going for free or close to free every week.

And the NES, SNES, Genesis all output 256x224, so I'm not sure why you think that wouldn't look good on a CRT TV. These systems were literally designed for CRT TVs.
I was more thinking PVMs, but i know at least Trinitrons also run expensive here if the seller knows what they have.

And i didn't say they wouldn't look good? I just remember the overscan on consumer sets being wildly variable even within the same model. It was a major factor in me favoring emulation, even. (I adjust all my resolutions to have at least 1mm of windowboxing on all sides of the image.)

Also a CRT TV still wouldn't help without the ability to output 256x224@60hz/320x240@60hz/etc., which isn't possible if i can't even output 256x224@120hz/320x240@120hz. (Unless i set up an old PC specifically for that purpose, which was already my intention if there isn't an alternate DP to VGA solution with better low res support than the DP2VGAHD20. I just prefer to do it all using a single PC if possible.)
 
And i didn't say they wouldn't look good? I just remember the overscan on consumer sets being wildly variable even within the same model. It was a major factor in me favoring emulation, even. (I adjust all my resolutions to have at least 1mm of windowboxing on all sides of the image.)

Also a CRT TV still wouldn't help without the ability to output 256x224@60hz/320x240@60hz/etc., which isn't possible if i can't even output 256x224@120hz/320x240@120hz. (Unless i set up an old PC specifically for that purpose, which was already my intention if there isn't an alternate DP to VGA solution with better low res support than the DP2VGAHD20. I just prefer to do it all using a single PC if possible.)
Variable overscan was always part of using these old consoles. They were designed to overscan by a handful of pixels. It's ridiculous to make that a deal breaker. Especially when your tradeoff is 120hz which is way more destructive to the image.

And hardly anybody "knows what they have". Even if you do see one person charging $50 for a Trinitron or JVC, there will be 10 others that are giving them away for free. Stop grasping at straws here.

And you can put in an old $10 AMD card in a secondary pci-e slot and run CRT Emudriver to get native output of 224p, 240p, 480i, etc.
 
Variable overscan was always part of using these old consoles. They were designed to overscan by a handful of pixels. It's ridiculous to make that a deal breaker. Especially when your tradeoff is 120hz which is way more destructive to the image.
I remember a lot of consumer sets eating into UI and text boxes, even JVCs and Panasonics and the like. *shrug*
And you can put in an old $10 AMD card in a secondary pci-e slot and run CRT Emudriver to get native output of 224p, 240p, 480i, etc.
Wait, that doesn't cause any issues with the main Nvidia card wrt PhysX and the like? That is a much simpler solution if so. I already have an old AMD card or two around, even.

I will look into it further, ty.
 
I remember a lot of consumer sets eating into UI and text boxes, even JVCs and Panasonics and the like. *shrug*
You can adjust overscan in the service menu super easily. Just find the best middle ground between the various consoles and arcade games you'll be playing

Wait, that doesn't cause any issues with the main Nvidia card wrt PhysX and the like? That is a much simpler solution if so. I already have an old AMD card or two around, even.

I will look into it further, ty.

Just make sure you install Nvidia's drivers again after you install CRT Emudriver, so any common DLL's get overwritten by the newer DLL's in the Nvidia driver and don't break compatibility with new games.
 
Just make sure you install Nvidia's drivers again after you install CRT Emudriver, so any common DLL's get overwritten by the newer DLL's in the Nvidia driver and don't break compatibility with new games.
Nifty. I won't be shocked if it has obscure issues specific to my sort of setup, since i'm one of the handful of weirdos still using 3D Vision glasses and that is one hell of an edge case, but it's definitely worth trying (especially if i can find a decent CRT TV to add to my setup)
 
Nifty. I won't be shocked if it has obscure issues specific to my sort of setup, since i'm one of the handful of weirdos still using 3D Vision glasses and that is one hell of an edge case, but it's definitely worth trying (especially if i can find a decent CRT TV to add to my setup)

Just so you know, you have at least 10 CRT's available to pick up today, I guarantee it. Unless you live in rural Ukraine or something
 
Just so you know, you have at least 10 CRT's available to pick up today, I guarantee it. Unless you live in rural Ukraine or something
We had historic snowfall last night, so probably not today :p

I'll definitely look into it more in general tho. I feel like i have heard about more "finds" on the CRT subreddits lately due to people clearing out their old stuff, so it's probably the best chance to find something good in some time.
 
1608239511030.jpg


I think this is related to my adapter, but has anyone seen something like this? On the left and right side there is this bit that looks like it's swapped with the other side?
This only seems to happen at max resolution, and it does not happen without my current adapter.
I wanted to get a USB-C to VGA adapter, but the ones recommended here all seem to be sold out??
 
I think this is related to my adapter,
Yeah, if you have a Synaptics chipset adapter like the Sunix DPU3000, then that's what's going on. I'm not sure what the 16:10 cutoff is, but at 2048x1536 and above on 4:3 monitors, this will happen. The only way to get around it is to go much higher, in my case 2304x1728, to get it to happen less frequently.

What you could do is get a cheaper Nvidia card, like a GT 930 or something, something that's still supported on current drivers, and use it in a second slot for analog output. Alternatively, you could also try this by using a cheap analog AMD card in a second slot. Just hook both GPU's up to your FW900 and mirror the displays in Windows display settings in the event you can't force rendering on the primary GPU
 
Yeah, if you have a Synaptics chipset adapter like the Sunix DPU3000, then that's what's going on. I'm not sure what the 16:10 cutoff is, but at 2048x1536 and above on 4:3 monitors, this will happen. The only way to get around it is to go much higher, in my case 2304x1728, to get it to happen less frequently.

What you could do is get a cheaper Nvidia card, like a GT 930 or something, something that's still supported on current drivers, and use it in a second slot for analog output. Alternatively, you could also try this by using a cheap analog AMD card in a second slot. Just hook both GPU's up to your FW900 and mirror the displays in Windows display settings in the event you can't force rendering on the primary GPU

Are you saying it is possible to use an older GPU's (GT 930) analog output with a new GPU (RTX 3080) to connect to the FW900? Thereby not needing a VGA adapter? I didn't think that was even possible.
 
Are you saying it is possible somehow to use a new GPU (RTX 3080) that doesn't have an analog output with an older GPU (GT 930) that does have an analog output to connect to the FW900? Thereby not needing a VGA adapter? I didn't think that was even possible.

Yes, but I don't know if it will work with exclusive fullscreen in all games. Nvidia has an option to select the rendering GPU, but I don't know if all games will respect the setting.

Or you could also try screen mirroring on Windows settings. That should still work with exclusive fullscreen I think.

But borderless fullscreen will be a workaround for any games that don't. Though that's not recommended for competitive games where an extra frame of lag would be a big deal.

Actually, it might be a better idea to use an old AMD card if you need to go over a 400 MHz pixel clock, because I don't think you can override Nvidia's 400 MHz limit like you can for AMD
 
woah, that's interesting, never knew that was an option!
Unfortunately, I don't use my Nvidia card in my AMD 5700xt system very often.

AMD software doesn't have the "primary gpu" option like Nvidia CP has, so borderless is my only option for most games. So I mostly stick with the Sunix, unless it's a game where I really don't mind an extra frame of lag, and the frame time delivery is still good.

And Nvidia cards are a pain in the ass to get interlaced working on, and my Fermi GPU seems to top out at 1920x1440 interlaced before it bugs out.

So I think I may try to get something still supported on the AMD side, like an RX 250, to see if it's a smoother experience.
 
Yes, but I don't know if it will work with exclusive fullscreen in all games. Nvidia has an option to select the rendering GPU, but I don't know if all games will respect the setting.

Or you could also try screen mirroring on Windows settings. That should still work with exclusive fullscreen I think.

But borderless fullscreen will be a workaround for any games that don't. Though that's not recommended for competitive games where an extra frame of lag would be a big deal.

Actually, it might be a better idea to use an old AMD card if you need to go over a 400 MHz pixel clock, because I don't think you can override Nvidia's 400 MHz limit like you can for AMD

Very interesting. Also, where in the Nvidia control panel is there an option to select the rendering GPU? The only available option I've seen is to select the PhysX rendering GPU/CPU (PhysX is pretty much obsolete).
 
Very interesting. Also, where in the Nvidia control panel is there an option to select the rendering GPU? The only available option I've seen is to select the physics rendering GPU/CPU.
It's in 3D settings with all the other stuff.

It might be available in Global settings, though you might have to select it on a per-game basis.
 
Where can I find info on Windows registry dithering tweak?


I guess it is the same as with EDID changes. You need to either restart the PC for those tweaks to apply, reset the driver, or disconnect the display and connect it again.
I did not do extensive testing on this though. What happened before is that hibernation made this tweak go away, making it kinda useless.

Not sure if it "survives" sleep or if there are any other issues still but so far it is working fine for me.


AMD has dithering by default.
I do not think Ampere would differ in such features. There was nothing stopping Nvidia from adding such a feature to their GPUs for many years now, and it has in fact been there for years. Though like I said, it did not work very well. At least the stability of the setting did not; by itself the dithering works very well.

BTW, for 8-bit displays you need to use:
ditherState – Enabled; ditherBits – 8 bit; ditherMode – Temporal
"DitherRegistryKey"=hex:db,01,00,00,10,00,00,00,01,01,01,04,f3,00,00,00

For 6bit panels you need:
ditherState – Enabled; ditherBits – 6 bit; ditherMode – Temporal
"DitherRegistryKey"=hex:db,01,00,00,10,00,00,00,01,01,00,04,f2,00,00,00
This should give better results than using 8-bit dithering on top of 6-bit+A-FRC, and far better results if it is an old monitor with FRC (like old TN panels and such).


Most (if not all!) 10/12-bit LCD displays actually have 8-bit panels, even if in the spec sheets they are rated as 10-bit.

It is already hard to see the difference between good temporal dithering and native 8-bit, but it is possible. For something like 10-bit dithered to 8-bit you simply won't be able to see the difference versus a native 10-bit panel. Maybe with some specialized equipment... that no reviewer has, it would be possible.

BUT you might still see an improvement if you use this dithering tweak on a 10-bit panel (this setting: ditherState – Enabled; ditherBits – 10 bit; ditherMode – Temporal, "DitherRegistryKey"=hex:db,01,00,00,10,00,00,00,01,01,02,04,f4,00,00,00) versus not using it, because with it you get higher internal precision. What I am saying is that banding is much more visible than dithering noise. The internal precision used for dithering is probably 12-bit, but it might also be 16-bit.

Overall I would say dithering on 8-bit LCD panels should give the same visual fidelity as native 10/12-bit input.
For CRTs it should give almost the same fidelity as a 10-bit analog output. Given that CRTs do have some noise (from the DAC, cables, electronics, etc.), I would say that no one would be able to differentiate 8-bit+A-FRC (which this "temporal dithering" really is) from native 10-bit in ABX tests, and for all intents and purposes (like correcting gamma, calibrating the display, etc.) it solves the most important issue: banding. CRTs always need some gamma ramp changes, and most DACs are 8-bit, so in this case it is very important.

The Windows Registry dithering hack fixes the color banding on Nvidia GPUs. They are one and the same thing. I misunderstood what you originally wrote.

Thanks for adding the info!

I tested my HDFury X3 hdmi to vga adapter (advertised to support 10 bit) and selected the 10 bit option in the Nvidia control panel but I'm still seeing banding in games. I'm guessing I'm not getting a real 10 bit image. I'll try the Windows Registry dithering hack and report back.
 
What is the recommended way to get high bandwidth interlaced VGA output rendered from a modern GPU?

I have an old iiyama Vision Master Pro 510 (130 kHz/160 Hz) that I've had in storage. I used to run it at 1600x1200p@100, but I'm hoping it's possible to push it to 2048x1536i@160.
My current GPU is a RX 5700XT, and if I upgrade in the Spring, I'd be more inclined to stay with AMD than get an Ampere.
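As a rough back-of-the-envelope check (my own numbers, assuming roughly GTF-like blanking of about 30% horizontal and 5% vertical, and treating the stated interlaced refresh as the field rate), a few lines of Python suggest 2048x1536i@160 would land just under the 510's 130 kHz spec at roughly a 340 MHz pixel clock:

# Rough CRT timing estimate. Blanking overheads (~30% horizontal, ~5% vertical)
# are assumptions; real modelines will differ somewhat.
def estimate_mode(width, height, refresh_hz, interlaced=False,
                  h_blank=0.30, v_blank=0.05):
    h_total = width * (1 + h_blank)                      # pixels per line incl. blanking
    lines_per_field = height * (1 + v_blank) / (2 if interlaced else 1)
    h_freq_khz = refresh_hz * lines_per_field / 1e3      # refresh_hz = field rate if interlaced
    pixel_clock_mhz = h_freq_khz * h_total / 1e3
    return h_freq_khz, pixel_clock_mhz

for mode in [(1600, 1200, 100, False), (2048, 1536, 160, True)]:
    h_khz, pclk = estimate_mode(*mode)
    print(mode, f"~{h_khz:.0f} kHz horizontal, ~{pclk:.0f} MHz pixel clock")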

Does v-sync break completely if mirroring the screen in Windows?
 
What is the recommended way to get high bandwidth interlaced VGA output rendered from a modern GPU?

I have an old iiyama Vision Master Pro 510 (130 kHz/160 Hz) that I've had in storage. I used to run it at 1600x1200p@100, but I'm hoping it's possible to push it to 2048x1536i@160.
My current GPU is a RX 5700XT, and if I upgrade in the Spring, I'd be more inclined to stay with AMD than get an Ampere.

Does v-sync break completely if mirroring the screen in Windows?
I don't imagine vsync would break.

You could try using a Radeon r5 250 or some other card that is still included in the drivers. But if you go with Nvidia, use an architecture newer than Fermi. In my experience so far (2 different cards) they won't go above 1920x1440 interlaced.
 
Yes, but I don't know if it will work with exclusive fullscreen in all games. Nvidia has a an option to select the rendering GPU, but I don't know if all games will respect the setting.

Or you could also try screen mirroring on Windows settings. That should still work with exclusive fullscreen I think.

But borderless fullscreen will be workaround for any games that don't. Though that's not recommended for competitive games where an extra frame of lag would be a big deal.

Actually, it might be a better idea to use an old AMD card if you need to go over a 400mHz pixel clock, because I don't think you can override Nvidia's 400mHz limit like you can for AMD

Interesting, I tested a while ago with a GeForce GTX 760 doing the analog output, paired with a 1080 Ti. Both cards were listed fine in Device Manager, but even swapping them between PCIe slots, it did not work, on an Asus Z87 Pro motherboard.
However, I don't remember if I did the steps suggested by you, and I don't have the 760 anymore to re-test, unfortunately.
It would be interesting if someone tests and confirms this.


What worked for me was outputting from the motherboard's integrated graphics through its analog output, paired with the 1080 Ti, but with some limitations:
https://hardforum.com/threads/24-wi...-ebay-arrived-comments.952788/post-1044743613
 
Personally, i prefer the downsides that come with running at double refresh rate over the downsides that come with line doubling (or even worse, having 3D elements rendering at twice the resolution of prerendered 2D/background elements)
You can also use 120Hz + software BFI to get the best of both worlds. Perfect "NTSC 240p emulation" 60 Hz with no line doubling!

Several emulators support this.

Demo can be done at www.testufo.com/blackframes on your 120Hz CRT -- observe how the 60fps+BFI UFO looks like perfect 60Hz without duplicate images.
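For anyone curious what the software side of this looks like, here is a minimal sketch of the idea in Python/pygame (my own illustration, not code from any particular emulator, and it assumes pygame 2 with working vsync): at 120 Hz the game frame is shown on one refresh and a pure black frame on the next, so the content still updates at 60 fps but each image is only lit for one refresh.

# Minimal software-BFI sketch: on a 120 Hz display, alternate each 60 fps
# game frame with a black frame instead of showing the same frame twice.
# Assumes pygame 2 with vsync support; not taken from any specific emulator.
import pygame

pygame.init()
screen = pygame.display.set_mode((640, 480), pygame.SCALED | pygame.FULLSCREEN, vsync=1)
game_frame = pygame.Surface((640, 480))

running, show_black = True, False
while running:
    for event in pygame.event.get():
        if event.type == pygame.QUIT or event.type == pygame.KEYDOWN:
            running = False
    if show_black:
        screen.fill((0, 0, 0))               # the inserted black refresh
    else:
        game_frame.fill((40, 80, 160))        # stand-in for the emulated 60 fps frame
        screen.blit(game_frame, (0, 0))
    pygame.display.flip()                     # one flip per 120 Hz refresh (vsync paces this)
    show_black = not show_black
pygame.quit()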
 
You can also use 120Hz + software BFI to get the best of both worlds. Perfect "NTSC 240p emulation" 60 Hz with no line doubling!

Several emulators support this.

Demo can be done at www.testufo.com/blackframes on your 120Hz CRT -- observe how the 60fps+BFI UFO looks like perfect 60Hz without duplicate images.
Wouldn't this cause ungodly flicker on CRT?
I think it would. Will test it but not today.

BFI should cause the same level of contrast reduction as normal scanlines, which right there makes it not a perfect solution on either CRT or LCD.
Also, to keep brightness equal, this trick would mean that any synchronization issues between the monitor and the emulated refresh rate would look exactly like they do at 60Hz.

On the good side, motion clarity might be marginally better because each frame is drawn quicker and there will be less of a rolling shutter effect. Neither is important in this case, as motion clarity on a CRT is already good and BFI won't fix phosphor trails, so... DO NOT RECOMMEND
 
Every now and then there comes a display that makes me think I could enjoy it as much as I enjoy CRTs. This year it is the LG CX and ZX line. OLED, 120Hz, FreeSync/G-SYNC. It is a TV but supposedly works well as a PC monitor. They have 48, 55, and 65" at 4K, and 8K in 77 and 88".

https://www.rtings.com/tv/reviews/lg/cx-oled

100% Response Time = 1.7 ms

INPUT LAG for 4k @ 120 Hz = 6.7 ms

I think I have seen it mentioned here already, so I thought I would ask if there is perhaps someone who has one of these LG TVs and also a CRT and would be willing to write up a quick comparison for us.

 
Though that's not recommended for competitive games where an extra frame of lag would be a big deal.
This trick might be good for movies, but with its input lag, reduced performance in games, bothersome settings, and compatibility issues, I would say this is the worst solution out there.
 
This trick might be good for movies, but with its input lag, reduced performance in games, bothersome settings, and compatibility issues, I would say this is the worst solution out there.
It's a solution for situations where your DAC isn't performing correctly (2048x1536-2304x1728 on the Sunix) or you need interlaced output.

1 extra frame of lag isn't game breaking for most types of games. And lots of games nowadays only use borderless window fullscreen; exclusive fullscreen isn't even an option. So I imagine those games would basically perform identically.
 
Wouldn't this cause ungodly flicker on CRT?
I think it would. Will test it but not today.

BFI should cause the same level of contrast reduction as normal scanlines, which right there makes it not a perfect solution on either CRT or LCD.
Also, to keep brightness equal, this trick would mean that any synchronization issues between the monitor and the emulated refresh rate would look exactly like they do at 60Hz.

On the good side, motion clarity might be marginally better because each frame is drawn quicker and there will be less of a rolling shutter effect. Neither is important in this case, as motion clarity on a CRT is already good and BFI won't fix phosphor trails, so... DO NOT RECOMMEND
It does help Sonic Hedgehog (Sega Genesis version) and fast motion games -- It's a user preference -- some people get more nausea/headaches/eyestrain from the double-image effect than from the flicker.

Remember our namesake, "Blur Busters": we know people who can't live with motion blur or double-image artifacts (like CRT 30fps at 60Hz)! For SOME people it's the lesser poison in a "pick your headache poison" choice between motion blur headaches and flicker headaches. Try 60Hz single-strobe with Sega Genesis Sonic Hedgehog, and you'll see what I mean. Sonic Hedgehog and other superfast-scrollers look so much better with 60Hz single-strobe. Not everybody is as flicker-sensitive as you are. 60Hz single-strobe is not for an eye-searing bright Windows desktop, but it's great for when you need very clear ultrafast motion.

Your mileage may vary, I respect user preferences.

Some hate tearing, some hate stutters, some hate motion blur, some hate flicker more, etc.

Double-strobe 60Hz at 120Hz is great when you prefer it, but it is not everybody's preference.

Everybody sees differently! Different levels of eyeglasses prescription, different levels of color blindness (12% of the population is colorblind). Some are bothered by excess blue light, others are not. Flicker sensitivity varies too; some of us can see flicker at any Hz and not be bothered or get eyestrain. So remember, not everyone is bothered by flicker.
 
Yeah, I could never use a CRT at 60Hz, it would always bother me. Always had to be 75Hz or higher.
 
Yeah, I could never use a CRT at 60Hz, it would always bother me. Always had to be 75Hz or higher.
Except you did, because all TV's were 60hz.

I think 60hz just gets a bad rap from people that remember running Excel or a web browser on a 60hz screen. Like yeah, if you're looking at a full white screen at 60hz, you'll see flicker.

But for content like video or a video game, 60hz is fine.
 
Except you did, because all TV's were 60hz.

I think 60hz just gets a bad rap from people that remember running Excel or a web browser on a 60hz screen. Like yeah, if you're looking at a full white screen at 60hz, you'll see flicker.

But for content like video or a video game, 60hz is fine.

CRT TVs also do bother me; if I did not sit far enough away from them, with enough light in the room, I would usually get headaches after 30 minutes to an hour or so.
 
It does help Sonic Hedgehog (Sega Genesis version) and fast motion games -- It's a user preference -- some people get more nausea/headaches/eyestrain from the double-image effect than from the flicker.

Remember our namesake, "Blur Busters": we know people who can't live with motion blur or double-image artifacts (like CRT 30fps at 60Hz)! For SOME people it's the lesser poison in a "pick your headache poison" choice between motion blur headaches and flicker headaches. Try 60Hz single-strobe with Sega Genesis Sonic Hedgehog, and you'll see what I mean. Sonic Hedgehog and other superfast-scrollers look so much better with 60Hz single-strobe. Not everybody is as flicker-sensitive as you are. 60Hz single-strobe is not for an eye-searing bright Windows desktop, but it's great for when you need very clear ultrafast motion.

Your mileage may vary, I respect user preferences.

Some hate tearing, some hate stutters, some hate motion blur, some hate flicker more, etc.

Double-strobe 60Hz at 120Hz is great when you prefer it, but it is not everybody's preference.

Everybody sees differently! Different levels of eyeglasses prescription, different levels of color blindness (12% of the population is colorblind). Some are bothered by excess blue light, others are not. Flicker sensitivity varies too; some of us can see flicker at any Hz and not be bothered or get eyestrain. So remember, not everyone is bothered by flicker.
Hey Mark,

May be a little off topic but I figured it's relevant for the people in this thread. For the blur busters certification program, are you going to mandate 60hz single strobe (or 50hz even, for our European friends) so that we can play all of the older content that was meant for 60hz smoothly? Right now, other than the LG OLED TV, we only have one option and it's a TN LCD monitor.

Also, are we potentially seeing any VA monitors getting approved for certification? I would love to have my cake and eat it too, so to speak. We can talk in PM's if you're okay with that (I know you're probably busy). Thanks for dropping by!
 
May be a little off topic but I figured it's relevant for the people in this thread. For the blur busters certification program, are you going to mandate 60hz single strobe (or 50hz even, for our European friends) so that we can play all of the older content that was meant for 60hz smoothly? Right now, other than the LG OLED TV, we only have one option and it's a TN LCD monitor.

Also, are we potentially seeing any VA monitors getting approved for certification? I would love to have my cake and eat it too, so to speak. We can talk in PM's if you're okay with that (I know you're probably busy). Thanks for dropping by!
Yes!

60 Hz single-strobe is now mandatory for Blur Busters Approved 2.0!

This will allow LCDs to more properly emulate a 60 Hz CRT tube. About 4 monitors later in 2021 will have single-strobe capability.

The good news is that Blur Busters Approved 2.0 is now mandating 60 Hz single strobe. It's a mandatory requirement for future logo certification now. It's only a one-line firmware change to allow 60 Hz strobing, yet because of such an arbitrary cap most monitors don't strobe at 60 Hz at all. I've been doing things like Zoom meetings telling manufacturers about reduced sales and reduced review ratings, with images such as these:

gs-gives-bad-scores-for-lack-of-60hz-single-strobe.png

So I successfully convinced two manufacturers to now include 60 Hz single strobe in some upcoming 2021 models. Those who worry about flicker can just slap on a disclaimer, "Warning: 60 Hz will flicker. Discontinue use if you have discomfort.", and be done with it to keep the lawyers happy. Besides, Sony / Samsung / LG televisions already do 60 Hz single-strobe (which I also repeat like a broken record to the monitor manufacturers!)

No VA panels have successfully been approved because their slow GtG pixel responses fail the Blur Busters Approved thresholds. Three VA panels were submitted in 2020 and they flunked Blur Busters Approved; I don't want to give out the logo like candy (even if it means I lose money in 2020...), and that was combined with COVID delays. Thankfully, the program will pick up in 2021, as Blur Busters now has 4 monitors in the Blur Busters Approved queue that have high odds of being approved.

About time, I'm tired of seeing only one monitor in the list because I rejected a few models in 2020 for being unable to become good enough (at 120Hz). Red phosphor, *and* COVID *and* bad monitor submissions kinda sabotaged 2020 into promoting only a single monitor, which does not help the cause (sorry) -- but yeah, 2021 will amend that. :)

<Technical>
VA panels are a ***** to strobe-tune! It's ****ing hard. Two orders of magnitude harder sometimes. Temperature, slow dark GtG, problematic overdrive, GtG that varies throughout the year on the same panel, etc. This doesn't mean VA will never be approved, but it will require highly advanced overdrive tuning, preferably 3D overdrive lookup tables (2-trailing-refresh-cycle depth) with 256x256 non-interpolated LUTs, instead of wimpy 17x17 LUTs that are interpolated. In addition, some temperature compensation may be required for VA panels because of how strobe crosstalk appears/disappears in cold rooms in winter versus hot rooms in summer. Bonus if there's some Y-axis-compensated Overdrive Gain too, since the bottom edge needs stronger overdrive gain to account for the smaller time differential between GtG curve initiation and the strobe flash time. In addition, creative "time-flash-as-GtG-shoots-past-destination-color" techniques may also be required on VA panels, to hide ghosting/undershoot/overshoot artifacts when strobing. In other words, VA panels are sometimes 1-to-2 orders of magnitude more complex to strobe-tune reliably than TN or new fast-IPS panels, which can sometimes beat NVIDIA ULMB with just Large Vertical Total behavior + simple global OD Gain tuning. XG270 users saw how the 119Hz-120Hz XG270 looks better than NVIDIA ULMB! If dark VA GtGs improve as rapidly as pre-2019 IPS versus post-2020 IPS did (a shocking strobe quality improvement), then VA tuning may become much easier. But barring that, manufacturers aren't willing to spend the extra NRE costs on the above-and-beyond overdrive-tuning work needed to Blur Busters Approve a VA panel. Monitors aren't usually multimillion-sellers like televisions are, so monitors have lower engineering budgets per unit than television panels, which limits how much a turd can be polished (i.e., how well a VA panel can be strobe-tuned). If only VA was as fast as microsecond-GtG blue-phase LCDs... Ha.
</Technical>
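To make the LUT point concrete, here is a generic illustration of my own (not how any particular scaler firmware does it): an overdrive LUT maps (previous frame level, target level) to the drive value actually sent to the panel, a sparse 17x17 table has to be interpolated between grid points, and the "3D" variant mentioned above would add the frame before the previous one as a third axis. The toy table values below are made up purely to show the lookup.

# Generic overdrive-LUT illustration, not any vendor's firmware. A 17x17 LUT
# indexed by (previous level, target level) is bilinearly interpolated; a
# 256x256 LUT would need no interpolation at all.
import numpy as np

GRID = np.linspace(0, 255, 17)            # coarse grid: 0, 15.9, 31.9, ..., 255

def overdrive_17x17(lut: np.ndarray, prev: float, target: float) -> float:
    i = min(int(np.searchsorted(GRID, prev, side="right")) - 1, 15)
    j = min(int(np.searchsorted(GRID, target, side="right")) - 1, 15)
    tx = (prev - GRID[i]) / (GRID[i + 1] - GRID[i])
    ty = (target - GRID[j]) / (GRID[j + 1] - GRID[j])
    return ((1 - tx) * (1 - ty) * lut[i, j] + tx * (1 - ty) * lut[i + 1, j]
            + (1 - tx) * ty * lut[i, j + 1] + tx * ty * lut[i + 1, j + 1])

# Toy LUT: overshoot the target in proportion to the size of the transition.
toy_lut = np.clip(GRID[None, :] + 0.35 * (GRID[None, :] - GRID[:, None]), 0, 255)
print(overdrive_17x17(toy_lut, prev=48, target=200))   # drive value > 200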

I just can't sacrifice the ethics of Blur Busters Approved for a badly-tuned VA panel, one that might also have red-phosphor ghosting too from KSF phosphor (on top of slow GtG). Sure, it makes me look like I am promoting only one model, but alas, that's 2020. AMD certifies FreeSync, NVIDIA certifies G-SYNC, and Blur Busters certifies blur reduction modes -- we're not a "review site" (even though we do special tests & invent tests that other sites use).

Four panels that have 60 Hz single strobe will come out within a year -- thanks to my behind-the-scenes work (Twitter social media, Blur Busters Forums, etc).

Given the common one-year monitor development lifecycles, it may not be till beyond mid-2021 though. Long wait alas, I know.

Sure, you can email me in the About section of Blur Busters, or contact me on the Blur Busters Forums.
 