What new OLED gaming monitors are coming in 2023?

It's pretty crazy that 48" LG CX owners who bought new at launch have realistically had nothing to upgrade to for going on three years now, unless they want to downsize to 42".

48 CX to 48 C3 will basically perform identically.
It just shows it was a good investment and they got in at the right time. A display like that could last a decade.
 
Agreed, the LG OLED C1s were breakthrough tech, especially for the price. And even now, the upcoming C3 has not improved much over the C1.

Would be cool if they released a smaller 36" C3, still 4k, with DisplayPort and 144Hz.
 
LG has plans to release a 32" OLED, but we don't yet know if it will be 4K or 1440p.
 
Remember the Westinghouse LVM-37W3 1080p? People were using that thing for 10-15 years. It outlasted my first marriage.

The right monitor purchase can leave you set for a long time.

It has not been intentional, but for the last couple of decades since I first started buying "premium" monitors, I seem to be averaging roughly a 5 year monitor cycle.

2001: 21" Iiyama VisionMaster Pro 510 CRT
2006: Dell 2405FPW 24" 1920x1200
2010 (late): Dell U3011 30", 2560x1600
2015: Samsung JS9000 TV 48", 3840x2160
2019 (late): Asus XG438Q 43" 3840x2160

I'm not "due" until 2024, but a 42" LG C3 at 4k is really calling my name, if VRR works with G-Sync Compatible...
 
I'd argue there is no point to anything above 120hz, and 32" is WAY too small for 4k. :p
Not at all. It's actually too big for 4k. I prefer 27", but I'm sure 32" will happen first with those specs and I'll cave regardless. :) The more Hz, the better... We have a long way to go still, and I bet both Nvidia and AMD double down on frame gen tech.
 
Not at all. It's actually too big for 4k. I prefer 27", but I am sure 32 will happen first with those specs and I'll cave regardless. :)

4k is perfect in the 40-44" range. Smaller than that, and you are just wasting pixels, creating more heat and noise, and lowering performance for no reason.

The more hz, the better... We have a long way to go still and I bet both Nvidia and amd double down on frame gen tech.

In an ideal world with no tradeoffs, sure. But there are always tradeoffs.

All else being equal, higher framerate creates more heat, uses more power, requires more expensive hardware, is noisier, requires reduction of graphics settings, etc. etc. etc.

Everything is a tradeoff, and if you are going to pump up the frame rates you had better be getting something tangible for that tradeoff. I'd argue that once you eliminate the placebo effect, you can increase your framerate and refresh rate infinitely high and there is still no difference.

I'd put 120hz up against a billion hz and expect a properly blinded comparison to show no difference in performance, even among pro players, with no one able to identify which is which more than 50% of the time (in other words, chance).

Super high refresh rates are just there so marketing departments can put big numbers on the box, and people will buy it out of FOMO.

They are just about as useless as the CPU reviews 20 years ago. "Mine gets 350 fps in Q3A." "Well haha, mine is better, it gets 412 fps!" These are completely pointless distinctions.

Damn, gaming has become as moronic when it comes to this stuff as the audiophile community has. Two pieces of equipment can measure IDENTICALLY on high end scientific instrumentation, but Mr. Golden Ears over here can hear a difference. It's ridiculous. It's just there to extract money from suckers.
 
I'd argue there is no point to anything above 120hz, and 32" is WAY too small for 4k. :p
I'd have probably said that until this year. The RTX 4090 is such a beast that it can push 200+ fps in a number of games, and that's before even involving DLSS, which can get it close to 240 fps. DLSS 3 can also do interesting things, so that extra refresh rate is now relevant for far more than competitive shooters and motion clarity.

I think 32" is fine for 4K. Ideally I'd go a bit higher like say 36" but nobody wants to make that so we deal with what we can get. I'd take a curved 4K, 240 Hz, 42" OLED too. I'd probably be ok buying e.g the LG Flex if it had been that instead of just a curvable C2 at 3x the price.
 
I waited too long for a CX replacement. I'll just grab the 55" S95C with an extended warranty and make damn sure it'll burn in within 2-3 years XD.


I've considered a 55" before since it would still be good near/at the 40" to 48" view distance I view my 48CX.

A 42" to 52" view distance on a 55" 4k
- - - gets the same viewing angle and PPD range as - - -
a 36" to 45" view distance on a 48" diagonal 4k screen.

So there is some overlap in the optimal viewing angles + PPD.

A distance 42", 48" and 55" 4k screens all have in common (within the 50 deg to 60 deg human viewing angle more or less and at 60 PPD to 77PPD) would be around 40" view distance.


50deg angle 77 PPD
= 55" 4k at 52 inch view
= 48" 4k at 45 inch view
= 42" 4k at 39 inch view <<<<<

55deg angle 70 PPD
= 55" 4k at 46 inch view
= 48" 4k at 40 inch view <<<<<<
= 42" 4k at 35 inch view


60deg angle 64PPD
= 55" 4k at 42 inch view
= 48" 4k at 36 inch view
= 42" 4k at 32 inch view

64 deg angle 60 PPD
= 55" 4k at 39 inch view <<<<<
= 48" 4k at 34 inch view
= 42" 4k at 29 inch view


That ~ 40" view distance is also coincidentally the radius or focal point of a 1000R curve so works nicely if any were 1000R curved screens too. Sitting at the focal point of the curve, all points on the screen surface would be equidistant from you and each pixel would be pointed directly at you on axis.

Especially with WOLED subpixels in regard to text rendering, and considering that 2D desktop graphics and imagery have no AA/compensation for pixelization, I'd personally aim for higher than 60 PPD.

. . . . . .

A good low profile floor tv stand would do the distance well enough.

Otherwise, with near/on-the-desk setups you are doing this kind of thing:

[attached image: piano clip art]


Except the screen is the size of the whole back of the piano behind the sheet music. :ROFLMAO:
 
I am curious, how does something like a 77 inch OLED, at 4K, at couch distance, say 10-12 feet, shake out in terms of PPD? Is there a way to calculate it?

Edit: I calculate 146 ish.

I view my 77" C1 from around 8' or so. It hits 100 PPD at 8'. (9' = 111 PPD, 10' = 123PPD, 12' = 146 PPD)

That's only a ~38 degree viewing angle even at 8' though, so not very immersive compared to what you'd normally use for pc gaming - though I game on it occasionally off of my laptop.

. . . . . . .

Human viewing angle for binocular/color view (without rotation) is 50 to 60 degrees.




For a 77" 4k tv:

50 deg viewing angle = 72 inch ( ~ 6') view distance = 77 PPD

60 deg viewing angle = 58 inch (~ 5') view distance = 64 PPD
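Same math run the other way - solving for the distance that gives a target viewing angle (again just a rough flat-panel sketch, helper name hypothetical):

import math

def distance_for_angle(diagonal_in, target_angle_deg, h_pixels=3840, aspect=(16, 9)):
    aw, ah = aspect
    width_in = diagonal_in * aw / math.hypot(aw, ah)
    # distance at which the screen width subtends the target horizontal angle
    distance_in = width_in / (2 * math.tan(math.radians(target_angle_deg / 2)))
    return distance_in, h_pixels / target_angle_deg

for angle in (50, 60):
    d, ppd = distance_for_angle(77, angle)
    print(f"{angle} deg -> {d:.0f} inch view distance, {ppd:.0f} PPD")
# 50 deg -> ~72", ~77 PPD and 60 deg -> ~58", ~64 PPD for a 77" 4k, as above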

 
I'd argue there is no point to anything above 120hz, and 32" is WAY too small for 4k. :p

We probably don't benefit much from going far beyond 120fpsHz - 165fpsHz for motion definition increases. 200 might be a good spot.

We do need way higher than 120fpsHz in order to remove sample-and-hold blur during viewport movement at speed - more like 1000fpsHz. And since we can be moving the screen at 5000px/second at times, even 1000fpsHz isn't "too much".

. . . . . . . .

There are two benefits to increased fpsHz. Judging from some of the comments I see, I think a lot of people are only aware of the first one.
.
.

1. Increased motion definition. More dots per dotted line curve so to speak, so more motion resolution. More unique pages in an animation flip book flipping faster. This will have diminishing returns but we can benefit from fairly high rates here still.

.
.
2. Sample and Hold Blur Reduction aka motion clarity gains
https://blurbusters.com/faq/oled-motion-blur/

Your eyes are always moving when you track moving objects on a screen. Sample-and-hold means frames are statically displayed until the next refresh. Your eyes are in a different position at the beginning of a refresh than at the end of a refresh; this causes the frame to be blurred across your retinas.

We would benefit from even more than 1000fpsHz, but that is a good point to aim for as frame amplification tech and higher-Hz screens advance over the coming years.
[attached image: sample-and-hold blur vs fpsHz comparison]


We could really use more than that, since you could move your viewport at 5000 pixels per second or faster on a 4k or 8k screen, but 1000fpsHz is a good goal to shoot for, and it would drastically reduce the sample-and-hold blur of the entire viewport during mouse-looking, movement-keying, and controller character/camera panning.

It would also have benefits for sports and fast action media if the frame rates of the material were high enough or frame amplification tech were applied there as well.
Outside of black frame insertion, which still has major tradeoffs (esp. in regard to HDR color volume), display tech has been a blurry mess when moving at speed since the high end CRT era ended. Most people are so used to bad sample-and-hold blur, esp. during viewport movement, that they don't know the difference/what they are missing b/c they've never seen better.
. . .

All of that said, online gaming has its own (low) tick rates and server interpolation~ping compensation biases, so it's not a 1:1 benefit to have much higher fpsHz as a scoring advantage. That is marketing b.s. Most testing that shows any benefit at all is done locally on a LAN or vs bots, which is a completely different animal than online gaming.

However, blur reduction - reducing image persistence to the point of being negligible - would be a huge aesthetic benefit. It would also keep in-game/on-screen moving text readable, as well as keeping high detail textures and depth via bump mapping "readable" at speed.
 

1000 hz?

I'm sorry but that is complete and total nonsense.

If that is what blurbusters is claiming, then blurbusters are completely useless and should not be used as a source for anything whatsoever.

We are so far into placebo and post-FOMO-purchase justification land here that it is crazy.

There may be a theoretical difference, but a practical one? No way.

We are humans playing these games, not scientific instruments.

As far as I am concerned, a 120hz 42" 3840x2160 OLED screen is end stage shit. Pure perfection. No need to ever upgrade to anything newer or better ever again because it does absolutely everything anyone should ever need. (Except maybe for improved burn-in resistance)

Of course the industry will try to push increasing numbers. They have to, or they won't sell any new monitors after this. They will rely on FOMO to sell increasingly high numbers of things no one needs, using human psychology against their customers/victims.
 

I agree but just give me 240Hz instead of 120Hz. The whole 1000Hz thing seems like a neat idea but it's only going to be achieved through fake frames whereas 240fps can be realistically obtained without any interpolation/amplification trickery. I'm still interested to see how it looks in person but I would be totally happy settling for 240Hz.
 
I don't necessarily have any problems with 240hz. It can even help reduce motion blur in some areas, for reasons that are separate from actually rendering at 240fps (which I still maintain is utterly pointless)

Monitors with higher refresh rate capability need pixel response times fast enough to support those refresh rates, or the results would be awful.

This can result in less ghosting and/or motion blur even when rendering at lower frame rates. Some LCD screens even blank and reinsert the same frame twice or more when the panel is capable of higher refresh rates than the source, in order to improve the perceived motion.

To be clear, you don't need a 240hz panel to benefit from faster pixel response - even a 60hz screen could benefit from good response times - but manufacturers are forced to make sure the pixel response is good in a monitor rated for very high refresh rates, where they aren't in lower refresh rate panels, so to save money they usually don't.

In other words, even if you plan on running at 60fps or 120fps, you could still benefit from a 240hz panel just because the monitor is faster at updating your pixels and minimizing blur, despite the true framerate being lower.

Anyone who was around 20-30 years ago will remember the shitty old 90's LCD panels which were really slow. They were sharp and great for desktop work (as long as you didn't use a VGA input causing the signal to be converted using a DAC from digital to analog, and then converted right back using an ADC in the monitor to become digital again, which resulted in significant blur)

If you tried to use them in games, however, the experience was pretty terrible, with awful ghosting due to the slow panels. We are talking about the same thing today.

They could make faster panels for 60 and 120hz displays, but often they don't because they aren't forced to. Buying a 240hz panel is one way of making sure you get a faster panel even if you don't need the 240hz, because the manufacturer is forced to use that faster panel in order to make it 240hz capable.

So, they are technically separate things, but they are often (not always) tied together because of business realities.
 

Right. When it comes to OLEDs though, they are fast enough to take advantage of any extra Hz, probably all the way up to that 1000Hz mark, unlike LCDs, which will have response time problems as you go to higher and higher Hz. 120Hz OLED is great, but when I also had a pretty quick 240Hz TN panel at the same time, the motion clarity of that monitor was definitely better when running games at 200+ fps. A 240Hz OLED would for sure be the end game for me.
 
1000 hz?

I'm sorry but that is complete and total nonsense.

If that is what blurbusters is claiming, then blurbusters are completely useless and should not be used as a source for anything what so ever.

We are so far into placebo and post-FOMO-purchase justification land here that it is crazy.

There may be a theoretical difference, but a practical one? No way.



Frame amplification tech will advance to 500fpsHz and beyond. It will have huge gains in blur reduction. VR's frame amplification utilizes the hardware/drivers, the OS, and the way the games are coded, so it is far better informed about the motion vectors of everything that is moving (incl. the character/virtual camera or VR headset). Nvidia's young version of frame amplification tech just takes the next frame and guesses what the vectors are, so it is an immature tech with worse results currently. VR is way ahead in frame amp tech, but it will go farther in the future for both pc displays and VR/MR.

You won't need to hit 500fps or 1000fps natively to hit those numbers, a base healthy frame rate will be amplified as the tech matures.

You are wrong about placebo as far as blur reduction is concerned. OLEDs suffer sample-and-hold blur during movement at speed no matter what their response time is. It's the way our eyes work. https://blurbusters.com/faq/oled-motion-blur/
Refer back to the two different aspects of gains from high fpsHz.

1. Increased motion definition. More dots per dotted line curve so to speak, so more motion resolution. More unique pages in an animation flip book flipping faster. This will have diminishing returns but we can benefit from fairly high rates here still.
.
.
2. Sample and Hold Blur Reduction aka motion clarity gains

While motion definition has diminishing returns, sample and hold blur is measurable. Some of the better monitor review sites like TFTCentral have modernized to include pursuit camera testing for this reason. Unlike the simple bitmap ufo tests though, sample-and-hold blur affects the whole viewport (blurring the whole viewport full of objects, landscapes, architecture, high detail textures and depth-via-bump-mapping, in-game text, etc.) during viewport movement at speed. Some of us had fw900 crt's with essentially zero blur during fast viewport movement, and some of us have experimented with black frame insertion for similar results (though it dims the screen using that method). The difference is very visible and is no placebo. It's more like you don't know what you are missing if you don't know better. Allegory of the cave, sort of.

[attached image: sample-and-hold blur at various fpsHz, 1000 px/sec motion]


*note that the blur increases more than this at higher than 1000 px/sec movement (incl. viewport movement).

*unlike the simple bitmap ufo tests, sample-and-hold blur affects the whole viewport (blurring the whole viewport full of objects, landscapes, architectures, high detail textures and depth-via bump-mapping, in-game text, etc) during viewport movement at speed.
 
So you are saying sample and hold blur issues are mitigated by higher frame rates? Can sample and hold be overcome?
 
Yes (higher FPS + Hz, not just one or the other). Mark R. of blurbusters.com is an expert on monitor tech though, so take his word for it if you don't believe me.

You can reduce and eventually virtually eliminate (depending on the movement speed) sample and hold blur by:

- adding black periods between frames. This has some major tradeoffs in visible flicker and possible eye fatigue, even when the flicker isn't consciously visible at somewhat higher rates. It also dims the screen in effect to your eyes, which makes it a poor method for HDR, where you want to get the highest color volume highlights possible (esp. in regard to oleds). It is also typically incompatible with VRR, even if it is theoretically possible for them to work together. So its best use scenario is high framerate *minimums*, which means much lower graphics settings than you'd typically get using VRR - that, or playing very old or very simple to render games (and with no HDR or autoHDR enabled). This more or less rules it out in its current form as far as I'm concerned, as I prioritize HDR content.

- adding more frames and higher hz. More frames drawn, each shown for a shorter period. The last image I posted pretty much explains it. The faster the movement (especially viewport movement/controller panning/camera panning), the higher the blur - e.g. 1000 pixels/second movement of the viewport vs 4000 pixels/second of viewport movement, etc. The higher the fps+Hz rates, the more the sample and hold blur is reduced.

1ms of persistence = 1 pixel of motion blur during 1000 pixels/second motion

. . . . . .

You can divide 1000 by your fpsHz to get a baseline for the ms / pixel amount of blur at 1000 pixels/second speed.

E.g. 1 second is 1000ms. 1000ms divided by 120fpsHz = an 8.3ms frame = 8.3 pixels of blur at 1000px/sec movement speed (e.g. mouse-looking/movement-keying/controller panning of the entire viewport at speed).

Moving up to 240fpsHz would cut that blur amount down to 50% compared to 120fpsHz (about 4.2 px of blur at 1000px/second movement speed)

Moving up to 480fpsHz would cut that blur amount down to 25% compared to 120fpsHz. (2.1 px of blur at 1000px/second movement speed)
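The same arithmetic as a tiny sketch (nothing more than frame time multiplied by motion speed; the helper is hypothetical):

def persistence_blur_px(fps_hz, speed_px_per_s=1000):
    # sample-and-hold blur width in pixels = frame time (s) * motion speed (px/s)
    return speed_px_per_s / fps_hz

for hz in (60, 120, 240, 480, 1000):
    print(hz, "fpsHz ->", round(persistence_blur_px(hz), 1), "px of blur at 1000 px/sec")
# 60 -> 16.7, 120 -> 8.3, 240 -> 4.2, 480 -> 2.1, 1000 -> 1.0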


[attached image: sample-and-hold blur at various fpsHz, 1000 px/sec motion]
 
There are definite benefits to motion clarity at 1000+ Hz when it comes to sample and hold. Since OLED pixels are so fast, using sample and hold you'd need 1000 Hz to get 1ms persistence. Just to put that into perspective, that's the same motion clarity as a strobed backlight display at 100 Hz doing a 1ms strobe duration. We already know what a 1000 Hz OLED would look like when using sample and hold. It would be even better though as there would be no strobe-crosstalk. Fun fact: an OLED using sample and hold would have to go over something like 2000 Hz to get the same motion clarity as a CRT (a phosphor rise and fall of ~0.5ms).

Persistence is the main cause of motion blur on today's fast displays, not pixel response time. An OLED running at 120 Hz has a metric ton of persistence, 8.33 ms to be precise. That's why 120 Hz sample-and-hold OLED motion is NOT clear.
 
We already know what a 1000 Hz OLED would look like when using sample and hold. It would be even better though as there would be no strobe-crosstalk.

Persistence is the main cause of motion blur on todays fast displays, not pixel response time. An OLED running at 120 Hz has a metric ton of persistence, 8.33 ms to be precise. That's why 120 Hz sample and hold OLED motion is NOT clear.

Sample and hold is just the name for the way images persist on our retinas; it's not a setting you enable or a tech designed to make it happen purposefully. It is a side effect. I know that you know that, but just to be clear for anyone else. I'm guessing that by "when using sample and hold" you meant when not using BFI, or just "when using a display that suffers from sample and hold" - i.e. when not using a crt or plasma or reel film projector, though those are all very dated now so are outliers to say the least. Sample and hold as a side effect on modern displays - computer monitors, laptops, tablets, phones, watches, car displays, etc. - is pretty much the default (though some TVs still use some bfi or strobing with interpolation for media, this is becoming less common with HDR). BFI is a workaround hack like backlight strobing was, both trying to address the same problem. Sort of like using shuttering 3d glasses as a 3d implementation (though BFI runs at a higher hz ~ "shutter" rate in front of both eyes at the same time, so it is less punishing to your eyeballs and brain).

. . .

Crosstalk is a big issue too, yeah.

Very high fpsHz would also be better because:

- it wouldn't dim/mute the screen like BFI, which cuts the HDR color volume so drastically that it wouldn't be worth using vs what you'd lose in HDR capability, esp. on OLED - but any display benefits from higher HDR color volume and from not cutting its peak capability in half.

- wouldn't suffer other BFI tradeoffs like high minimum frame rate req for best use case. (and thus usually lower graphics settings on demanding games at 4k compared to v-high+ to ultra settings with VRR, or playing low graphics/older games). Once we get those kind of fps+Hz via frame amplification tech, frame rate minimum probably wouldn't really be an issue anyway though.

- would be fully compatible with VRR. . . where practically no display mfgs bother designing an implementation that allows both VRR and BFI, and the only one that tried (afaik) didn't do it well enough as I recall. Some don't even bother adding BFI support at this point as the focus is more on HDR. Once we get those kind of fps+Hz via frame amplification tech, VRR probably wouldn't really be needed anyway though.

. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .


We already know what a 1000 Hz OLED would look like when using sample and hold.

For sure 👍 like I said

Some of us had fw900 crt's with essentially zero blur during fast viewport movement, and some of us have experimented with black frame insertion for similar results (though it dims the screen using that method). The difference is very visible and is no placebo. It's more like you don't know what you are missing if you don't know better. Allegory of the cave sort of.



There are definite benefits to motion clarity at 1000+ Hz when it comes to sample and hold

Yes, as opposed to the diminishing returns on motion definition/articulation increases.
You could say definite benefits when it comes to pretty much every modern display, especially every HDR display.
We could even benefit from way over 1000 fps + Hz, or fpsHz, if possible, but 1000 is a good goal to shoot for and would give very appreciable gains in blur reduction. OLED response times should also be able to support it.
 
Thoughts on pixels per second speed and resulting blur:

Pixels per second vs resolution
==========================

We could easily rotate the viewport one 4k screen width away in a second, so we'd be moving the viewport at around 4000 pixels per second when doing that. Two 4k screens worth of pixels in a second would be 7680 to 8000 pixels per second. That would increase the testufo example's blur by 4x to 8x compared to the 1000px/sec baseline used in that image. The ratios between different fpsHz still scale (240fpsHz blur at 50% that of 120fpsHz, 480fpsHz blur at 25% that of 120fpsHz), but the faster you move the viewport, the blurrier it gets. However, we aren't always moving the viewport quite that fast, and not through an entire arc at top speed at all times either. Plus, at times you might move the mouse so fast that it gets more into "flick" territory, where you might not even register the viewport movement at the fastest part of the motion (more like a teleport in effect) - but you would still get appreciable blur reduction during relatively fast, more ordinary "panning" that would otherwise be a blurry mess.



Pixels per second vs. pixel size
=========================

Another factor is that just because you are at a higher rez in your viewport, and therefore moving at more pixels/second over the same degrees of viewport rotation, it doesn't mean the actual blur/smear size to your eyes will increase in real-world dimensions. The 4k scene is moving the same number of degrees as it would at, say, 1080p, but at more pixels/second on a 4k screen. The increased pixel-width of blur would be made of much smaller pixels, so it might not be quite as big of a difference.

1080p~2k at 2000px/second vs 4k at 4000px/second could each rotate the viewport around one screen width in a second. 2000px/sec at 480fpsHz would be about 4.2 pixels of blur, and 4000px/sec at 480fpsHz about 8.3 pixels of blur (the persistence itself stays ~2.1ms at 480fpsHz either way). But the 4k screen's pixels would be half the width (a quarter of the area). You might be using higher detail 4k textures too though, where the detail loss might be more noticeable, so there are a lot of factors. The real-world dimensions of all the on-screen elements in the scene remain the same between both resolutions, so they will be the same size but at different detail levels. Just saying that, for example, you might get double the number of pixels worth of blur moving the viewport the same number of degrees over time at 4k compared to 2k, but each of those pixels of blur would be half the size.
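A rough back-of-the-envelope on that point, assuming hypothetical 48-inch-class screens and a pan of exactly one screen width per second (the helper and numbers are just for illustration):

import math

def smear(h_pixels, screen_width_mm, fps_hz, pan_px_per_s):
    # blur in source pixels (persistence * speed) and in physical mm (times pixel pitch)
    blur_px = pan_px_per_s / fps_hz
    return blur_px, blur_px * screen_width_mm / h_pixels

width_mm = 48 * 25.4 * 16 / math.hypot(16, 9)   # ~1062 mm wide for a 48" 16:9 panel
for h in (1920, 3840):                          # 1080p-class vs 4k, panning one screen width per second
    px, mm = smear(h, width_mm, 480, pan_px_per_s=h)
    print(f"{h} wide: {px:.1f} px of blur, {mm:.1f} mm of physical smear at 480fpsHz")
# both land around 2.2 mm of physical smear; the 4k frame just has finer detail inside it to lose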



Virtual Objects moving at different rates
===============================

Further complicating this is the fact that virtual objects and FX are moving at their own rates within the scene, in addition to your viewport movement rate for the entire scene. However, this could potentially be exploited. In a Q&A/discussion thread on the blurbusters forum, Mark R. indicated that it would theoretically be possible to run a mix of different-Hz objects or planes composited within the same scene to save on system demand - for example, running objects in the background at a lower hz/update rate than those in the foreground, in a graduated fashion. That would include their movement + update rate relative to how the viewport/virtual camera is moving as well, no matter how they are moving (even a static boulder in the background moves in relation to your viewport because the viewport is moving, for example). That ties into various other frame amplification technologies that may emerge and mature in the future and could result in much higher frame rates.

. . . . . . . . . . . . . . . .

500 - 1000 FPS + HZ, frame amplification tech
=======================================

1000fpsHz in a consumer gaming display is theoretical/wishful at this point, but frame amplification tech should advance in the following years, so I think we should see 500fpsHz or higher - depending on mfgs and game devs graduating to it, of course - and hopefully up to 1000fpsHz in the long run. VR is advancing frame amplification tech a lot farther than the PC front, but its Hz is very limited so far (and its PPD is terribly low). It will probably take a big adoption leap in development on pc to get really good frame amplification tech where the game engine itself, OS + drivers, etc. are all informing the overall motion vectors. The current nvidia implementation is uninformed by comparison; it just takes a look ahead at the next frame and guesses the vectors. It ends up failing at things like an orbiting camera in a 3rd person game, which keeps parts of the character or scene static while the player is actually moving in relation to the action and environment. We need better frame amp tech (like that used in VR) that has peripherals/drivers and the virtual objects and cameras in the engines informing their vectors to the system, rather than only guessing from the difference between two frames.

. . . . . .

Now and then
============

For now I prioritize HDR but I still appreciate the motion definition/motion articulation gains and the little motion blur reduction during viewport movement at speed I can get at ~ 120fpsHz compared to 60.

Hopefully we'll get 240hz and 480Hz 4k oleds in the next few yrs and hopefully nvidia's frame amplification tech will mature somewhat hand-in-hand (along with more powerful gpus) so that we can fill those Hz with frame rates even with very demanding games.

Also, as much as I have disdain for facebook/zuck, it might be interesting to see what a 10 billion dollar, SpaceX-like budget dumped into VR advancements can come up with years from now (and how some of that frame amplification tech might translate to pc gaming gpus + monitors + peripheral inputs + game dev/coding, at least hopefully).
 
Interesting. I gamed on CRT and in some ways we just never got back there. I'm on plasma now, for this reason. OLED is certainly closer, but I remember what the first LCDs were like (all of them, really) and it was awful.
So subjectively, is OLED at 120 fps more or less blurry than the best LED-backlit LCD displays? I know it is going to be more blurry than my plasma, but I knew that already.

If the blur can be mitigated by fps then I may wait for 240 refresh displays or higher just to have the capability. I am talking about desktop displays. I will likely grab a Sony 77 inch for the living room when they launch their QD-OLED 77 incher, whenever that is.
 

120Hz OLEDs won't be competing with the fastest 240Hz/360Hz/500Hz LCD monitors in terms of motion clarity unless you use BFI at 120Hz which is the other way to mitigate blur besides increasing the Hz. Problem with that is only the LG CX and C1 have 120Hz BFI, the C2 removed it and all OLED monitors have zero BFI option. Here's the LG CX with BFI against some other fast monitors.

[attached images: pursuit-camera motion blur comparison photos]
 
what website provides those images to compare different monitors?
 
Thanks. The C2 42" has BFI at only 60hz. I noticed it on the rtings review. You probably already know.

"The LG C2 42 has an optional black frame insertion feature to reduce persistence blur. Unlike the LG 48 C1 OLED, it only flickers at 60Hz, and you can't use it at all with 120 fps content. For it to work, you need to enable OLED Motion, and you also need to make sure you disable VRR and have Prevent Input Delay set to 'Standard', which significantly increases the input lag."

I compared some blur images of other monitors like mine, which is a 144hz TN, to the C2. I don't think I'll be missing much; many of the 170hz non-oled monitors' blur images look about the same as the C2 42"'s. I'm curious if there's a benefit to getting over 120 fps though.
 

Yeah, I'd say the general rule of thumb that LCDs need roughly 1.5x the Hz to match OLED clarity is about right. Here's the Samsung Neo G7 at 165Hz vs the LG C2 at 120Hz, and although the G7 does win out, I would say it isn't by much and certainly not a night and day difference. Slower LCDs would look even closer to the C2.

[attached image: pursuit-camera comparison, Samsung Neo G7 at 165Hz vs LG C2 at 120Hz]
 
Interesting. I gamed on CRT and in some ways we just never got back there. On plasma now, for this reason. OLED is certainly closer but I remember what the first LCDs were like (all really) and it was awful.
So subjectively, is OLED 120 fps more or less blurry than then the best LED displays? I know it is going to be more blurry than my plasma but i knew that already.

If the blur can be mitigated by fps then i may wait for 240 refresh displays or higher just to have the capability. I am talking about desktop displays. I am will likely grab a Sony 77 inch for the living room when they launch their QD-OLED 77 incher, whenever that is.

I'd definitely keep an eye on 240Hz displays (and hopefully future 480hz ones), the maturation of frame amplification technology, and also, importantly, HDR capability. HDR is a huge leap in display quality - color volume, depth/contrast (especially side by side down to the fine pixel level with OLED), and impact - and a huge difference from dated displays including plasma, but really from all SDR displays. All displays have their tradeoffs, but *quality* HDR is glorious in games and shows/movies, so it is a priority going forward, for me at least. Even the next gens of VR should start having HDR capability, where that was a glaring omission before.



I compared some blur images of other monitors like mine which is 144hz TN to the C2. I don't think I'll be missing much. many of the 170hz non oled monitors blur images look about the same as the C2 42". I'm curious if theres a benefit to getting over 120 fps though.

There are people that could give better details than this (or the blurbusters.com pages could), but this is from memory, so anyone should feel free to chime in on any of it.

BFI = no VRR for all practical purposes
===============================
While theoretically a perfectly operating BFI+VRR implementation could be developed and manufactured, BFI doesn't work well with VRR for all practical purposes in anything released so far afaik. So BFI/strobing works best with very high frame rate *minimums* to begin with, since you can't use VRR against judder and tearing, and v-sync causes input lag. Without those compensations available, the further you run your fps above the peak Hz of the monitor, the fewer artifacts and the less tearing you'll get. There might be a fast-sync option or something, though idk if that is compatible or has side effects.


BFI typically works best at very high strobe/blanking rates
==========================================================
Another reason BFI/strobing works best at high framerate *minimums* is that you typically match the strobe rate to the frame rate, to avoid the crosstalk that can happen when strobing on top of frame draws. Beyond that, you want the highest strobe rate you can get in order to minimize flicker and potential eye fatigue, even when the flicker/pulsing is borderline and not consciously "visible". There are rolling scan and partial scan BFI techniques that can be done on OLEDs (see link below), so there are other ways of doing it than full-frame 1:1.


BFI works maximally operating on the lowest blur the screen is capable of to start with
=========================================================================
The other reason BFI/strobing works best at very high frame rate *minimums* is that it will then be operating from a starting point of the lowest blur the monitor is otherwise capable of. At 120fpsHz (solid/minimum), you've already cut the blur down 50% compared to 60fpsHz before BFI is applied. So say the BFI/strobing rate or strength you chose cuts the blur (and probably the screen's brightness) by 50%: the effective blur reduction is greater when operating on an fpsHz minimum at or above the peak Hz of the monitor than if the fpsHz were lower. That's the way to get the maximum BFI blur reduction possible for any given screen, and it's likely what all of the BFI test images being posted are doing - running at the max fpsHz of the screen. (Testing done with 2d desktop testing apps is usually at the max fps/hz anyway, as there is little gpu demand compared to a demanding game.)

Therefore, as I understand it, strobing/BFI typically requires very high frame rate *minimums* (usually meaning much lower graphics settings vs. being able to use very-high-to-ultra at 4k, or playing older or easy to render games).
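To put rough numbers on that reasoning, treating BFI as a simple on/off duty cycle (a sketch only - it glosses over rolling-scan implementations and isn't how any particular TV necessarily does it):

def effective_persistence_ms(fps_hz, lit_fraction=1.0):
    # visible persistence per frame: frame time scaled by how much of the frame the image is actually lit
    return (1000.0 / fps_hz) * lit_fraction

print(round(effective_persistence_ms(60), 1))        # 16.7 ms: plain 60fpsHz sample-and-hold
print(round(effective_persistence_ms(120), 1))       # 8.3 ms: plain 120fpsHz
print(round(effective_persistence_ms(60, 0.5), 1))   # 8.3 ms: 60fpsHz + 50% BFI (and roughly half the brightness)
print(round(effective_persistence_ms(120, 0.5), 1))  # 4.2 ms: 120fpsHz + 50% BFI - the best case for a 120Hz screen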


BFI vs HDR
============

The rule of thumb used to be that the % by which strobing reduces blur is similar to the % by which it dims the screen: at a 25% reduction you'd dim the screen 25%, at a 50% reduction you'd dim the screen around 50%. A recent reply in the display threads said that a 1500 to 1600nit ~ $800 ucg or ucx, I forget which, can run BFI + HDR at 800nit, so the roughly 50% loss seems about right there considering. It's not really compatible with achieving good HDR color volumes on today's OLEDs, and you'd usually want the full color volume possible on any HDR screen, on the road to HDR 4000 and HDR 10,000 for more impact and realism - but it's definitely interesting that a very high peak color volume/brightness FALD LCD with BFI enabled can still get HDR color volumes similar to an LG CX OLED (~800nit).

Also worth noting that HDR screens only hold their various %-window brightness levels for a period of time, and for most, but especially OLEDs, this isn't sustained for all that long currently. The brightness/sustained figures for SDR and HDR are listed on the RTings review pages for various larger gaming TVs. There is even a trend on RTings of rating screens higher that have SDR levels above the ABL trigger point, so depending on the screen you'll be triggering ABL even in SDR, depending on your brightness levels in whatever named setting you are using.

BFI dims the way the screen looks to your eyes; it's not dimming the screen itself as far as the screen's OS/firmware and the backlight or emitters are concerned. So if you were at 50% reduced brightness to your eyes due to BFI, the screen would still be at 100% output and would apply its brightness reductions accordingly, operating on its sustained-brightness-period limitations at the 100% brightness level. Therefore any reductions would end up a lot darker than otherwise - e.g. ~50% darker ABL and ASBL end points to your eyeballs/brain on drops from the sustained brightness periods.


..............................................................................

Rolling scan:
==============

https://blurbusters.com/faq/oled-motion-blur
This technique can actually be done laglessly, unlike LCD-monitor-based motion blur reduction (e.g. ULMB on LCDs). Since it’s a rolling scan, the panel scanout can still stay virtually laglessly in sync with the cable scanout (HDMI, DisplayPort). Whereas existing global-strobe backlights (ULMB, LightBoost, etc) in modern gaming monitors requires scanning out the LCD panel in the dark before flashing the backlight. So there will be no input lag disadvantage for OLED-based rolling-scan motion blur reduction!

One big problem to overcome first, is that impulse-driving require a lot of brightness to compensate for extra black period between refreshes. OLED has historically had brightness problems, however, researchers are continually improving this.

Emphasis mine below in the extended quote from the same blurbusters article. It will be interesting to see how the next few gens of Meta headsets (and things like PSVR, I think) will be able to implement HDR along with a pulsed rolling scan. With the near distance to your eyes, VR is able to get a lot brighter in effect to start with, so it likely has an edge there vs desktop/pillar~wall-mount displays.

Today, Oculus Rift and HTC Vive virtual realty headsets now already achieve CRT motion clarity via their pulsed rolling-scan technique. Most VR headsets need to eliminate display motion blur, to avoid nausea in virtual reality.

Some 2018 and 2019 OLEDs such as LG HDTVs have a black frame insertion mode. Newer OLEDs are having longer black frame intervals with shorter refresh cycle visibilities, to keep reducing OLED motion blur, especially as OLEDs get brighter with enough headroom for motion blur reduction.

We need an optional low-persistence mode even for ultra-high-Hz OLEDs too such as JOLED’s upcoming eSports gaming OLED monitor. Although doubling the refresh rate will halve motion blur, it is still necessary to use impulse techniques to reduce motion blur further.

For more reading on display motion blur behaviors and example animations, see Blur Busters Law: The Amazing Journey To Future 1000Hz Displays.

. . . . . . . . ..

VR frame amplification
====================

Worth noting again that VR has made advancements in frame amplification tech that PC does not have available, e.g. time warp and space warp - tech that could eventually be applied to amplify a good base fpsHz much higher. For now, since VR has been limited to 90Hz for the most part, it's only been used to double 36 to 72 or 45 to 90. (Nvidia's current AI frame amplification / "frame generation" tech of a "tween" frame in between two rendered frames is, by comparison, less informed of movement vectors and so it has to guess entirely.)

emphasis mine - current PC tech and game dev doesn't do this, but it could eventually someday with a big adoption shift:
Application SpaceWarp allows an application to render at half rate from the actual display refresh rate, for example 36 compared to 72 FPS. The application must render a motion vector buffer and a depth buffer, in addition to the standard eye buffer, which our systems will then use to synthesize new frames and still output 72 FPS to the display.
..
Application SpaceWarp (AppSW) is a developer-driven optimization technology that can unlock extra computation power for suitable content. In our initial testing, Application SpaceWarp gave applications up to 70 percent additional compute, potentially with little to no perceptible artifacts.

..
We could eventually advance frame amplification tech on pc to have hardware/drivers/os and game development informing the overall scheme of the movement vectors of virtual objects and hardware/peripherals/the character's virtual camera. (VR does this with the headset).
Positional TimeWarp (PTW): Asynchronous TimeWarp can correct HMD rotational error right before displaying, which is enabled in every Quest application. PTW can use depth buffer data to further correct HMD translation latency, which will be automatically enabled with AppSW applications. From a pure HMD latency point of view, we can even say AppSW apps have better latency than full FPS applications without AppSW when using PTW.

. . . . . . . . . . . . . . . . . . . .


So if we eventually get more advanced frame amplification tech (and faster gpus) to go along with higher Hz screens, and we eventually get screens so bright that there is enough headroom to drop the brightness ~50% and still hit good HDR color volumes, we could be in a much better and clearer place than we are now.
 
I just want a quality 32" OLED screen. I'm happy with VRR and frame rates/Hz under 100, but there's still nothing to buy if I'm going to be editing photos and video sometimes.

There was a rumor of a 32" OLED last year but it turned out to be curved or hasn't come out yet.
 
Best use case imo is to use more than one monitor. Different monitor tech is better at different things.

I've used 2 (or more) screens at my desk since I still had a fw900 crt next to an LCD there in 2005 - because there are always tradeoffs. Later, a glossy 1440p 60hz for desktop/app use next to one of the first 1080p 120hz (VA I think) screens. Then a glossy 1440p 60hz for desktop/apps next to an AG-coated 1440p 144hz g-sync TN for gaming, etc. I still use other screens for static desktop/app use beside my media and gaming oled.

As we move more into the HDR era we'll probably be using HDR photos more, and you could find yourself editing HDR photos and HDR videos depending on what you are doing. The ABL on OLEDs works best for dynamic media and games, where the camera cinematography in media - or the virtual camera plus mouse-looking, movement-keying and controller panning in games - is moving the viewport around or switching scenes/camera angles regularly, rather than for static framed work. I don't think the shorter sustained brightness periods and aggressive ABL of gaming OLED screens make for a good editing screen unless everything you do is 180nit or less. If you sit a larger gaming tv like a 42", 45", 48" or 55" on a desk rather than farther away on some kind of mount, you'd also be sitting closer than the 50 to 60 degree human viewing angle, so you'd increase the size of the areas on the sides that are non-uniform in color (on OLED and VA screens), and your pixel density will look lower - more like 1500p at a desk (or worse) - which makes the pixels so large that things like text and high contrast edges in games (and especially the uncompensated aliasing of the 2d desktop's graphics and imagery) look fringed. Also, static desktop/app window frames etc. will "burn down" through your 25% reserved-brightness wear-evening buffer faster, especially if you can't use a dark theme in the app(s).
 
I just want a quality 32" OLED screen. I'm happy with VRR and frame Hz <100 still nothing to buy if I'm going to be editing photos and video sometimes.

There was a rumor of a 32" OLED last year but it turned out to be curved or hasn't come out yet.

If you're talking about the LG 32" WOLED then that should already be in production according to this article: https://www.oled-info.com/lg-display-start-producing-mid-size-woled-panels-demand-tvs-declines

BUT, I'm 99% sure this will not be 4K but will instead be 1440p. The reason being that WOLED panels seem to have issues going to this high a PPI, and 4K at 32", which is around 140 PPI, just seems too far out from what LGD is currently capable of. The high-PPI OLED displays are either Samsung AMOLEDs made for smartphones or JOLED panels that are 60Hz and cost thousands of dollars. The 32" LG WOLED is almost certainly going to just be a larger version of their 27" 1440p 240Hz panel.
 
For HDR desktop/app use and editing, the asus ProArt ucg is probably one of the best consumer-priced screens, but it can still be up to $3k to $5k usd. The ucx is similar but more gaming oriented (144hz, g-sync), so it probably has shorter sustained brightness periods. I think you can find either for around $3k now. Those are both still much cheaper than an up-to-$30k eizo or sony 32" reference monitor, at about 10% of the price.

https://www.asus.com/us/displays-desktops/monitors/proart/proart-display-pa32uc/

https://www.techradar.com/reviews/asus-proart-pa32ucg-k-display

https://www.rockpapershotgun.com/asus-rog-swift-pg32uqx-review

Not being per-pixel emissive, they will have blooming, glow, and dimming zones that lose fine detail in color - losing things like stars in starfields in the blackness of space scenes, etc. They probably have a little smearing/trailing too, response-time wise, going by those reviews. There are trade-offs either way vs OLED's per-pixel emissive display, but the ProArt displays won't trigger ABL like an oled, and the ucx has a very long sustained full-field/screen brightness I think.

I'd only consider one if I were forced to use only a single monitor for some reason (or if I had to edit HDR images or videos, I guess, with their longer sustained HDR brightnesses) - but otherwise I'd rather have a different screen for static desktop/apps and an OLED for gaming + media.
 
Been on an LG C1 48 for the last 2 years and not tempted by anything coming out yet. Thought it was too big at the start, but I love the size now as it's much easier on my aging eyes, allowing me to sit a bit further back. The height real estate could be a tad shorter while the width is just right.

Something like this would totally have me salivating and instantly upgrade:

48-52" Flex 240hz OLED (5760x2160 or 5120x2160)

The Corsair XENEON FLEX 45WQHD240 OLED is tempting but the 3440 x 1440 res is totally a turn off.
 
It's about the same PPI as a 55" 4K display or a 27" 1920x1080 monitor. That's seriously bad.

For desktop use yeah that's quite a bit on the low side but for gaming I think it's perfectly fine. There are people who game on much larger 4K TVs you know. It's not like people only game on small 4K displays like 48 inch or below.
 