Hmmm. There's quite a bit of difference between 2 ms and 10 ms. With only 2 ms of persistence, the difference in flicker between a CRT and an OLED should be close to negligible, following my line of reasoning. (Since 60 Hz gives you nearly 17 ms between strobes, a CRT whose phosphors decay in 2 ms has plenty of time to sit in its "dark state" between two strobes, producing plenty of flicker.)

However, my years of CRT use were during the younger years of my life, when I couldn't really afford very high-end CRT monitors. So perhaps the CRT displays I'm likely to have experienced were closer to that 10 ms decay time. At 60 Hz, that would mean the phosphors stayed lit for roughly 60% of the period between two scans/strobes (10 ms out of ~16.7 ms), which is quite a high "on" ratio. That ratio might be much lower for OLEDs. (Not sure.)

On the other hand, it's coming back to me now that the norm for a "good" CRT monitor refresh rate was actually 85 Hz during the last years I used them, not 60 Hz. I do believe some of the CRTs I owned (at least in my post-college years) were 85 Hz models. That would mean just under 12 ms between each scan of the phosphors. With a persistence time of 10 ms or a little less, that translates to the phosphors being lit nearly all the time between successive refreshes thanks to persistence.
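
Just to make the arithmetic I'm waving at above concrete, here's a quick back-of-the-envelope sketch of the "on time" fraction (persistence divided by refresh period). The persistence values (2 ms, 10 ms) and refresh rates (60 Hz, 85 Hz) are only the assumptions from this discussion, not measurements of any particular monitor:

```python
# On-time fraction per refresh: persistence / refresh period, capped at 100%.
# Values are the assumptions from the discussion above, not measured figures.

def on_time_fraction(persistence_ms: float, refresh_hz: float) -> float:
    """Fraction of each refresh period the phosphor/pixel stays lit."""
    period_ms = 1000.0 / refresh_hz
    return min(persistence_ms / period_ms, 1.0)

for refresh_hz in (60.0, 85.0):
    for persistence_ms in (2.0, 10.0):
        frac = on_time_fraction(persistence_ms, refresh_hz)
        print(f"{refresh_hz:>2.0f} Hz, {persistence_ms:>4.1f} ms persistence: "
              f"lit ~{frac:.0%} of each {1000.0 / refresh_hz:.1f} ms period")
```

Running that gives roughly 12% and 60% on-time at 60 Hz for the 2 ms and 10 ms cases, and roughly 17% and 85% at 85 Hz, which is the "lit nearly all the time" situation I was describing.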