42" OLED MASTER THREAD

We told you to buy the LG C2 for $899 instead.

That's what I did, although at a slightly higher price. I'm absolutely loving it so far.

I heard the PG42UQ can do more "monitory" stuff, but I don't really care about auto turn-on. I turned ASBL off through the service menu and I don't notice any dimming in desktop use. It's more than bright enough at 80%. It has a great glossy coating. The ~$950 price was also crazy awesome. I also like the TV features and options, since I have quite a few movies and shows to catch up on.

At first I thought the 42" size was stupidly large, but after playing Halo Infinite on it for the past two days, I have to say it's great, and so cool.

I'm still trying to get used to the 42" size while trying it out as my main monitor. I am getting a bit of neck strain/pain, but I'm only on a 25"-deep desk; my 30"-deep desk is coming tomorrow. I would love it if my body adjusted to this so I can keep it as my main display rather than a side display. If I do adjust to it, it will be a wonderful holdover until a 32" 144 Hz OLED arrives.

Out of curiosity, looking at your pics, what depth is your desk? Where do your eyes line up against the monitor, height-wise?
 

My desk is 28-1/2" deep. I have the monitor pushed back as far as it will go, right at the edge. For gaming it's very immersive; pushing it back a few more inches would be even better, but not by much more than that.

My eyes line up about a third of the way down from the top.
 

It's a tradeoff between immersion and pushing the extents of the screen outside of your natural viewpoint.

At 60 PPD on a 42" 4K screen, which works out to ~29" from the surface of the screen to your eyes, the extents of the screen are a little out of range, which might not be optimal for HUDs, notifications, pointers, chat interfaces, etc. without some eye bending to the periphery. 60 PPD is enough that text subsampling and aggressive AA can compensate for fringing and aliasing, though. You are pretty close to 29", maybe right at it depending on how you sit relative to the desk or when using a controller. Like you said, a few more inches would probably be better.

I think it would be good from that point all the way out to a ~41" view distance at 80 PPD. At that point your view distance forms more or less an equilateral triangle (or pyramid/cone in 3D) with the screen, with the screen as the base. That viewing angle lets you take in the whole screen. On my 48CX I sit almost 48" away, but I'm using screens on the sides too. I could split the difference between 60 PPD and 80 PPD at around 70 or so. For me, 60 PPD on 4K is sitting too close, but at least the PPD isn't poor.


Like I've said before, most people who buy large screens don't do the math or look at the perspective realistically, and so sit way too close. They try to make larger screens work with a traditional "up against the wall like a bookshelf" or "upright piano + sheet music" desk and room layout. Large screens demand a lot more space; best case, you separate the screen's mounting from the constraints of the desk you sit at with your peripherals (e.g. a rail-spine TV stand with a flat foot or caster wheels, a wall or pole mount, or a separate desk/bench surface just for the screen - even a smaller adjustable standing desk).
That's most of the pictures of larger 4K screen setups I see online - "up against the wall like a bookshelf" or "upright piano + sheet music" desk and room layouts - with a few exceptions. Then they often follow up with complaints about the PPI and text quality. 😝 🙄
..............................................................................
4k PPD
....................................
60PPD 64 degree viewing angle
============================
.. on flat screens, technically a bit too close of a viewing angle (the periphery of the screen gets pushed out too far), but the pixel granularity will at least be low enough that subsampling and AA can compensate for the most part - at a performance hit
98" 4k screen at ~ 68.5" away has the same PPD and viewing angle and looks the same as:
80" 4k screen at ~ 56" away
77" 4k screen at ~ 54" away (60PPD, 64deg viewing angle)
65" 4k screen at ~ 45" away
55" 4k screen at ~ 38.5" away
48" 4k screen at ~ 33.5" away
43" 4k screen at ~ 30" away
42" 4k screen at ~ 29" away
31.5" 4k screen at ~ 22" away
27" 4k screen at ~ 19" away
..
..
80 PPD 48 deg viewing angle (optimal viewing angle is typically 45 - 55 deg)
===============================================================
..reduced pixel granularity, so you can probably get away with more moderate AA, and text (with tweaked subsampling) will look a little better.
..until we get to something like 150 PPD+, the pixels won't appear fine enough to stop relying on AA and subsampling. However, the GPU demand of that resolution gain (8K+) would counteract it anyway, losing motion clarity and motion definition, so we're probably better off using an optimal PPD on a 4K screen along with AA and text subsampling for the next few years (though using an 8K screen on the side for desktop/apps would be good). We may also benefit from 4K + DLSS AI upscaling and frame insertion to 8K at that point.
98" 4k screen at ~ 96" away has the same PPD and viewing angle and looks the same as:
80" 4k screen at ~ 78" away
77" 4k screen at ~ 75.5" away (80PPD, 48deg viewing angle)
65" 4k screen at ~ 64" away
55" 4k screen at ~ 54" away
48" 4k screen at ~ 47" away
43" 4k screen at ~ 42" away
42" 4k screen at ~ 41" away
31.5" 4k screen at ~ 31" away
27" 4k screen at ~ 26.5" away
You can see the 80 PPD point (on a flat 4K screen) is where the screen's diagonal measurement and the viewing distance make what is more or less an equilateral triangle (or pyramid/cone) with your viewing angle, with the screen as the base. A view distance approaching the screen's diagonal is in the neighborhood of the optimal viewing angle for anything with HUDs, notifications, pointers, text windows, etc., in my opinion, regardless of the PPD. Coincidentally, a 48" 4K screen at ~47"-48" away is a 48 degree viewing angle: 48" diagonal at ~48" view distance is ~48 deg.
...
Beneath 60 PPD
================
It's not that screens are unusable at sub-60 PPD or anything, it's just that the pixels/pixel grid will appear much more granular and aggressive. Interfaces, bars, menus, HUDs, etc. will all be larger by default on lower resolution screens as well (less desktop "real estate"). Text will also look much poorer in general at low PPD and you won't be able to use as small a font/text size or interface size without it looking bad (you can't get more desktop real estate by just scaling things down further - there won't be enough pixels and sub-pixels to do it with a clean result). Closer than around 60 PPD, AA in games and text subsampling on the desktop (where AA is not available) won't be able to compensate enough anymore.
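
To make the geometry behind those tables concrete, here's a minimal sketch (my own illustration, assuming a flat 16:9 3840x2160 panel, distance measured from the eyes to the screen surface, and PPD taken as an average across the width - not anything official from this thread):

Code:
import math

def viewing_geometry(diagonal_in, distance_in, h_pixels=3840, aspect=(16, 9)):
    # screen width from the diagonal and aspect ratio
    aw, ah = aspect
    width_in = diagonal_in * aw / math.hypot(aw, ah)
    # total horizontal viewing angle, then average pixels per degree across it
    angle_deg = 2 * math.degrees(math.atan((width_in / 2) / distance_in))
    return angle_deg, h_pixels / angle_deg

# 42" 4K: ~29" away -> ~64.5 deg / ~60 PPD, ~41" away -> ~48 deg / ~80 PPD
for dist in (29, 41):
    angle, ppd = viewing_geometry(42, dist)
    print(f'42" at {dist}": {angle:.1f} deg, {ppd:.0f} PPD')

Plugging the other sizes from the tables above into the same function gives essentially the same pairs of numbers, since the distance scales with the diagonal at a fixed PPD.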
 
Height-wise I try to keep my head/eyes in line with the middle of the screen, or as near to that as I can get. Sitting over 3' away makes this easier to do, especially if the screen is on its own mount separate from the desk. If your chair has full head, neck, and arm support you might tilt it back slightly into a slight "sniffing" position, which across 3'+ changes the viewing angle a bit - the imaginary dotted line ends up higher. You could optionally tilt the screen downward a few degrees on a separate mount too (or do both: sniffing position a few degrees + downward screen angle a few degrees).

Speaking of orientation, the color/screen uniformity issue on OLEDs and VA screens will be worse if you sit closer as well, which is another reason not to sit too close to the screen (in addition to it lowering PPD and pushing the extents outside of your viewpoint). I believe there is a sweet spot where the sides and top/bottom stop being as much of an off-angle view. The closer you sit, the more you end up viewing the edges/periphery of the screen at an angle, almost as if you were standing off to the side of the screen looking across it.

This can happen to the top border too when your view is at the bottom with the screen high above your head/eyeballs (at least on VA screens, the top band becomes a darker gradient especially noticeable on bright fields of solid color).
 

42″ 4K OLED Monitor 42M2N8900

New 42" option for PC monitors
Have you guys seen this new option from Philips? It even has a height-adjustable stand. Should we wait for this instead of getting the Asus? Primarily for PC gaming and PC usage.

That is a good-looking monitor, but pretty much the same exact panel as the Asus ROG. I do like the Philips design with the white finish and adjustable stand, but it will probably be the highest priced of the 3. The LG C2 42 can be had for $999, the Asus ROG 42 OLED can be had for around $1299/$1399, and this new Philips will be $1599+.
 
The UK price for this model will be £1,569.99, which is a considerably lower price point than originally listed in the press release from the event.
 

It's good to have more competition. I actually really like the feature set of this; having USB-C too is useful for us Mac users. The design looks much better than most gaming monitors.

I just wish these manufacturers weren't putting these out right when next-gen OLEDs are around the corner and equivalent LG OLEDs are already heavily discounted.
 
Yeah you hit some solid points there.

Monitor versions of gaming displays typically:

- have computer port types like DisplayPort and USB-C (but haven't always had HDMI 2.1)

- might support computer idle/sleep/wake behavior

- come in smaller size options, depending (OLED is limited there as of now)

- may have somewhat higher Hz than their gaming TV counterparts, at least sooner

- depending on display type and manufacturer, may have higher color accuracy or better factory calibration... but in the HDR / %-of-P3 era there is probably less of a divide here, especially on gaming displays

- RGB vs. BGR pixel layout, depending. I've never had a problem using screens in portrait though, so at distance this may be a somewhat overblown issue

- may have a little less input lag, depending


Cons:
- AG (matte) coatings rather than glossy, and their tradeoffs (this is a big one IMO, especially on OLED)

- typically an exceptionally large price hike for what ends up mainly being an extra port and a few Hz

- can have less robust OSD options and media functionality (and hardware), depending

- lack of smart apps in the OSD for HDR versions of some streaming content, while Windows is limited in some of those things (certified-device restrictions as well)

- may have much less frequent and reliable firmware updates from their manufacturers (especially compared to LG TVs' track record)

- may have more limited brightness, more aggressive ABL, or weaker HDR performance due to being marketed as a desktop display (some OLED computer screens didn't have HDR at all)


======================

The Philips 42-inch OLED is 138 Hz, with two HDMI 2.1 ports, one DisplayPort 1.4 (not 2.0/2.1), and a USB-C with DP alt mode (I think the USB-C is 40 Gbps bandwidth), and it is said to retail for $2000, launching January 2023.
 
It's good to have more competition. I actually really like the feature set of this; having USB-C too is useful for us Mac users. The design looks much better than most gaming monitors.

I just wish these manufacturers weren't putting these out right when next-gen OLEDs are around the corner and equivalent LG OLEDs are already heavily discounted.
Didn't you read what we said about the Philips BDM4035UC and 4065UC, that they were a total train wreck? After burning over $1K and having the 43" 4K monitor die in 4 years, no more Philips for me.
 
I can understand why you would not want to go with Philips anymore, but some displays will have failures, whether due to manufacturing flaws or just bad luck, regardless of the brand. I've had good luck with Samsung and LG so far, but other people not so much.

I'm not going to buy the Philips unless they release it in a curved design. The $2K price is also too high.
 
It's good to have more competition. I actually really like the feature set of this; having USB-C too is useful for us Mac users. The design looks much better than most gaming monitors.

I just wish these manufacturers weren't putting these out right when next-gen OLEDs are around the corner and equivalent LG OLEDs are already heavily discounted.
If they follow latest trends, new OLED panels from LG are released around June every year.

On another topic, reading the Philips specs:
0.1ms G2G response time, 1 million:1 contrast ratio, 178/178 viewing angles, 10-bit colour depth and a wide colour gamut covering ~98% DCI-P3 and ~126% sRGB.
I mean, how do you compete with that? I have an LG C2 but I never really stopped to think about how absurd these specs are. If they really manage to shrink the panels to something like 32" there will be very little reason to still buy IPS or TN other than price.

Is 2023 the year OLED cannibalizes the market?
 
I have to say I am satisfied with my Asus ROG 42" OLED - not second-guessing my decision or wanting something else. Yes, it's very large on my 30"-deep desk, but the immersion is out of this world, and 4K resolution is nice for desktop use too; you can have so much on the screen.
 
Ahhh yeah, exactly. It's been coming for a while now. OLED has long been heralded as the tech that will finally replace subpar LCD displays. It has taken some time, just like it took several years for large LCD TVs to become affordable to the masses and to replace the hundreds of millions of CRT TVs that were in use.

Plasma TVs got a brief foothold in the higher-end TV market before LCD emerged as the standard/mainstream display tech, but I believe plasma's traditionally higher prices and lack of smaller sizes (along with a combination of other factors; weight, etc.) might be what led LCD to emerge as the new "king" for so long. You have to wonder if OLED is the new plasma that will merely serve as a high-end transitory step between LCD and MicroLED, or if MicroLED is so far off that OLED will be the new mainstream standard in displays for years to come as more and more LCDs fade out of use.

They've gotta get them smaller, though. The TV market is one thing, but think of all of the 27"/32"/etc. LCD monitors in use. What we have now are very few, large, actual OLED TVs and monitors vs. a billion cheap, janky LCD monitors in offices and cubicles worldwide. I could be wrong of course, but somehow I'm just not seeing the majority of office LCD monitors being replaced with OLEDs of a similar size. It could happen if MicroLED is super far off from being mainstream and practical, but otherwise I can see OLED meeting the needs of enthusiasts for several years while LCDs hang around long enough to eventually be replaced by Mini- or MicroLED. I don't keep up with this stuff much outside of the forums though, and certainly don't work in the industry, so maybe someone with more knowledge can chime in on how we expect things to really play out.
 
My parents are still using my old Panasonic ST50 plasma TV. It still looks really good for SDR content and has no issues.

For desktop monitor sizes gaming LCDs are at a spot where they are pretty good - for SDR. I have no complaints about gaming on my Samsung G70A at 28" 4K 144 Hz. It's good enough for that. What I don't get is how it's somehow so damn hard to do that same performance for HDR with a mini-LED backlight. Pixel response times tend to become far more of an issue or price shoots through the roof.

Micro-LED will still take years to become a thing in consumer TVs and even longer for it to end up in desktop displays. I am not surprised Samsung is pushing QD-OLED as an interim option because they can't seem to do Micro-LED at the necessary scale.
 
Plasmas were good for their time, for movies especially - but like CRTs they ran hot, used a lot of juice, and were heavy with all that glass. They also often had an annoying hum, and a blinky-pixel thing that bothered some people. Response times on older TVs (other than something like an XBR960 CRT) were usually horrible. And yes, the price. LCDs were cheap; even if people got a really crappy one spec-wise, they still had a thin, medium-to-large LCD where they might never have sprung for a plasma's cost. Plasmas were also 1080p and never made it to the 4K era, let alone the HDR era.

Micro-LED will still take years to become a thing in consumer TVs and even longer for it to end up in desktop displays.


Micro LED vs High PPD MR roadmap/time-frame
--------------------------------------------------------------------------

Yep, it'll probably take years. By the time micro-LED is anywhere near common, we might finally be getting to the point where lightweight, fairly slim, very high PPD MR glasses are around or upcoming. Then you could just have a fairly high PPD screen, like a hologram within the PPD of the glasses, in real space. You'd probably also have interactive virtual objects and text without it even having to necessarily be a "screen" per se.

At some point we'll most likely drop a lot of our through-the-looking-glass-to-the-digital-world monocles, hand mirrors, desk mirrors, and wall mirrors (phones, tablets, desktop monitors, TVs) and we'll step through and be in an MR space. VR has a lot going for it right now but it still has a very long way to go PPD-wise (and HDR on VR hardware is only just starting development). Once it gets there - light enough and with extreme enough PPD - I think people will start replacing their physical screens with "holographic" MR work spaces and gaming spaces. We'll have to see how many generations of VR/MR progress happen before micro-LED is actually prevalent in enthusiast hardware, let alone common user devices/setups.

VR incidentally is way ahead on blur reduction tech which annoyingly hasn't made it into desktop gaming monitors and TVs.


Room Space
-----------------------

Speaking of VR (rather than MR), especially in regard to real-space "room scale" gaming: like large OLED screens, I think a lot of people just don't have the space to use it optimally. People might dedicate a room as a workout room if they're lucky enough to have the space, or more commonly a living room as a TV room - but those don't require you to remove the workout equipment or the furniture from the floor plan, lol. Most people don't have a large space for a VR room (plus height concerns, even in basements), unless maybe they convert a garage or something. For desktops, people are often restricted to the stereotypical "up against the wall like a bookshelf" or "upright piano with sheet music" type setup due to space constraints. For most people the PC has remained more like a drafting-table bench than a media/gaming center, so large screens just don't fit that layout or room design.

It takes a lot less space to set up a good view distance on a large 42" to 48" OLED screen (at 29" to ~41" and 33.5" to 47", respectively) than to use room-scale VR, but it's still not digestible to a lot of people. So they instead sit up near a wall of screen, pushing the extents out of their viewpoint to the sides, which is bad for HUDs and other info. In the case of OLED and VA screens it also exacerbates the screen/color uniformity issue at the sides of the screen, making it a larger gradient band the nearer you sit, as you make the viewing angle to the sides worse and worse. This also results in some eye bending to the periphery, if not a little neck bending/tennis-match head turning. The perceived pixel size is also too large when sitting too close on a large (4K) screen, so massaged text subsampling and aggressive anti-aliasing can't compensate enough anymore against text fringing and graphics aliasing.
 
Not to turn this into a versus thread, but your descriptions of plasma are exaggerated. OLED is better in almost every way (the only exception being the models that don't do BFI, leaving them worse than plasma in motion clarity), but plasma is still a very fine display, and most folks wouldn't tell the difference in any non-critical viewing or with content that doesn't take advantage of OLED's strengths - like HDR. Unless your plasma is less than high end, or it's dead, or you're after something specific (HDR), I would say that for most people upgrading is a waste of money, as SDR would only be marginally better on OLED.

Edit - that said, I'm glad there are emissive options for when my plasma bites the dust. I would love to see a C1 in action with 120 Hz BFI.
 
HDR > BFI in my opinion. HDR is the future of displays and color in everything, down to streaming, photos, and even static desktop imagery in the long run, just like higher resolutions and wider color spaces in general (it's a huge increase in color volume and detail in colors, not just turning the brightness of the screen up relatively, the way SDR brightness does).

- but everyone has their personal tastes, especially with the landscape still as varied as it is for now (though most high-budget shows and movie releases are HDR now, and Auto HDR adds a ton of gaming titles that benefit from HDR), and with burn-in avoidance tech, limited peak brightness durations, etc. But ultimately, BFI is not the way to go, since it is for all practical purposes incompatible with HDR (especially on an OLED). Full stop. That alone is enough. HDR is amazing. BFI also causes eye fatigue for some (if not all) of us over a game session, even if you can't "consciously see" the flicker, and it requires very high frame rate minimums to be used most effectively, which means a big hit on graphically demanding games.

VR has better blur reduction using reprojection but that tech and expense hasn't made it to desktop displays and tvs.

Mark R. of Blurbusters.com recently responded to my questions:


elvn wrote:
19 Oct 2022, 10:37
....Could some threshold for difference between redrawn refreshes enable LFC to lower persistence blur? If the next - now near duplicate rather than exact frame redraw happens-, what % difference between redraws or scene "cells" would be required at 4k 10 bit 250fps x4 (via FC +3 per frame) at 1000Hz?

elvn wrote:
19 Oct 2022, 10:37
...Could some sort of a pixel shift type technology (even perhaps even a smaller % of the screen pixels) be enough to make our eyes see "High" Frame Compensation redraws operating on '250fps-to-1000fpsHz' as unique, wiping the previous frame's persistence, without having to resort to full AI hardware interpolated frame insertion mapped more logically between frame states (based on vectors) or vr's own vector/head movement (and even eye tracking) time warp directional prediction? Or would it result in bad artifacts, even at a small shift and at very high speed?


Mark R: Pixel shift techniques is called reprojection. It's already used by most VR headsets. It works wonderfully.

ASW 2.0 added Z-buffer awareness, to pixel-shift foreground objects at different steps than background objects.

I wish reprojection/warping (pixel-shifting) was more common in non-VR use cases, to allow pans/scrolls/turns to stay smooth even during major framerate drops. I'd love to see 125fps "reprojected" to 500fps on a 500Hz display.

I'm sure RTX 4090 has enough bandwidth for it, if the game will co-operate with the reprojector (e.g. high-pollrate mouse synchronized to reprojection logic during panning / turning / scrolling), this requires native APIs in the operating system or drivers. The DLSS 3.0 method can also be used for the parallax-reveal pixels, to reduce reprojection artifacts along edges of objects. So reprojection would be used for most pixels, and DLSS 3.0 be used only for parallax-reveal pixels -- in theory.

But that may be a hugely variable GPU load, due to the varying amounts of parallax-reveal pixels.

Now, also, it's best not to suddenly enable/disable reprojection during LFC because it means sudden changes to display motion blur. Like motion blur suddenly halving everytime the reprojector decided to add new frames. It's best to use the reprojector to try to keep the framerate constant, e.g. converting variable frame rates to permanent consistent frame rates such as 1000fps 1000Hz.

TL;DR: Pixel shifting is called "reprojection" and is already being done in virtual reality.

Reprojection is done to keep framerate=Hz strobing in VR.

Have you ever used a modern VR headset? The newest ones now have less motion blur than the best strobed LCDs I've seen, because of the superlative big-money optimization of display motion blur at bigger budgets than for an average gaming monitor. It's shocking how much clearer VR displays are than even a 500Hz esports monitor.

Both Index and Quest 2 have 0.3ms MPRT, which would require 1000/0.3 = 3333fps 3333Hz flickerfree sample-and-hold to match motion clarity without requiring strobing/flicker based motion blur reduction methods.

They're even sharper motion clarity than a CRT now.
 

BFI on an OLED is amazing. The best part is that you don't need to fiddle around with a bunch of nonsense like you do on LCDs with vertical blanks or vertical totals or strobe width or strobe pulse or whatever. Just flip it on and it simply WORKS, period. And unlike LCDs, you'll still have pretty great picture quality (just not for HDR, of course) and an immersive 48-inch display. Once I finally replace my Acer X27 with something better, I'll probably make my CX a permanent BFI display while I leave the HDR gaming to the LCD capable of higher peak brightness and color volume.
 
Rtings early access review for the PG42UQ is up. HDR measurements look really nice compared to the C2 but don't at all line up with what I saw in person with both side by side. The firmware that released after I sent my defective one back may have pumped brightness so at least it did improve in some way.
 
BFI looks literally like a strobe light to me on an OLED. It’s horrid. Makes me recoil.

I can’t imagine using it. But I guess if you can’t see it and don’t get eye fatigue from it — have at it.
 

BFI at 60Hz, for sure - I'm not sure how people stomach 60Hz strobing. But BFI at 120Hz is perfectly usable to me.
 
Rtings early access review for the PG42UQ is up. HDR measurements look really nice compared to the C2 but don't at all line up with what I saw in person with both side by side. The firmware that released after I sent my defective one back may have pumped brightness so at least it did improve in some way.
It's the firmware; I was able to "glitch" it to roughly the reviewers' numbers. It's quite frustrating lol
 
With 60Hz OLED strobing I only really notice flicker on very bright backgrounds. For video games it's fine since the content is varied. Though 60Hz on a CRT is actually less flickery somehow - probably phosphor persistence on bright colors.

So OLED makers could actually make 60Hz BFI better by keeping brighter colors lit longer. Like a 255 white pixel would trail off over 4ms, whereas 100 grey would be 1ms. Making this user-adjustable would be the ultimate pro-consumer move.
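
Just to illustrate that idea (purely hypothetical - I'm not aware of any display that does this today), a user-adjustable, luminance-weighted persistence curve could be as simple as the sketch below; a gamma-style weighting is what gets you the "255 -> ~4ms, 100 -> ~1ms" behavior described above rather than a straight linear ramp:

Code:
def persistence_ms(code_value, min_ms=1.0, max_ms=4.0, gamma=2.2):
    # map an 8-bit pixel value to an on-time per 60Hz refresh (~16.7 ms window)
    frac = (max(0, min(code_value, 255)) / 255) ** gamma
    return min_ms + frac * (max_ms - min_ms)

print(persistence_ms(255))  # 4.0 ms for full white
print(persistence_ms(100))  # ~1.4 ms for a dim grey, in the ballpark of the idea above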
VR has better blur reduction using reprojection but that tech and expense hasn't made it to desktop displays and tvs.

You know basically all VR headsets use BFI, right? Motion reprojection is just for low-framerate scenarios
 

Right? Lol, if BFI was so bad with its "headache inducing flickering" then VR would've been dead in the water by now, as I don't think there is a single consumer-grade sample-and-hold VR headset out there.
 

The convo was focused on reprojection/pixel shifting, not BFI. We were talking about using LFC on the whole range but remapping/pixel-shifting enough of the pixels to wipe your retinas, so that the compensated frames would decrease blur instead of acting as exact copies. E.g. 125 fps x 4 reprojected.

ASW 2.0 added Z-buffer awareness, to pixel-shift foreground objects at different steps than background objects.

I wish reprojection/warping (pixel-shifting) was more common in non-VR use cases, to allow pans/scrolls/turns to stay smooth even during major framerate drops. I'd love to see 125fps "reprojected" to 500fps on a 500Hz display.

It's best not to suddenly enable/disable reprojection during LFC because it means sudden changes to display motion blur. Like motion blur suddenly halving everytime the reprojector decided to add new frames. It's best to use the reprojector to try to keep the framerate constant, e.g. converting variable frame rates to permanent consistent frame rates such as 1000fps 1000Hz.
 
Sure yeah. If they can get multiple AI frames in between two rendered frames that would be great. I just wish the conversation didn't keep going in an "either, or" direction and was more of a "both, and" for interpolation and BFI
 

Reprojection and other frame amplification tech do not dim/mute the screen's luminosity, so they would work with HDR brightness/color volumes (they're also not flicker-based, obviously, which is what dims the screen).

Losing appreciable HDR color volume (or HDR entirely) is a dealbreaker at this point, for me anyway. So that alone rules out BFI. In the long run practically everything will be HDR (including VR, PC desktop imagery, photos, streams, etc.).
 
Right, HDR.

I'm convinced OLED makers could enable an "overdrive" feature when in BFI mode, since the pixels will be off 75-90% of the time, reducing heat.

Like with my current OLED, I can match the sample-and-hold SDR brightness of ~27 (on my FO48U) in BFI by setting brightness to 50. So you really just have to double the brightness of the pixels during that strobing period, whether they're lit for 1ms, 2ms, 4ms, whatever.
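
As a rough sanity check on that (a simplified model of my own - and OSD brightness numbers aren't necessarily linear in nits, so the 27-vs-50 comparison only loosely maps onto this): if perceived brightness tracks the time-averaged light output, the boost needed scales with the inverse of the lit fraction of each refresh rather than being a fixed 2x.

Code:
def bfi_boost(duty_cycle):
    # multiplier on instantaneous brightness needed to match sample-and-hold average output
    return 1.0 / duty_cycle

print(bfi_boost(0.50))  # 2x - roughly the doubling described above (~half of each refresh lit)
print(bfi_boost(0.25))  # 4x - a refresh that is blanked 75% of the time needs ~4x the output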

Keep in mind Sony will be releasing the PSVR2 soon, which will be 90/120Hz and HDR, and will by necessity be using BFI. Yeah, it's a small screen, but it's a good test bed for where OLED can go.
 

Meta is working on HDR VR too.

I think frame amplification tech would be a better way to do it (blur reduction) ultimately, especially since OLED can theoretically do 1000Hz response-time-wise. We are trying to get to HDR1000 and higher, and strobing by nature lowers the brightness, which is especially problematic on OLEDs.

It will still be interesting to see what HDR nit levels the PSVR2 and future Meta headsets reach, and how that compares vs. blur reduction techs.
 

Two issues I can see with interpolating more fake frames:

1. Artifacts might become more visible. Right now DLSS 3 is showing you a real frame, then a fake frame, followed by a real frame, so according to DF the artifacts in the fake frames are hard to notice, since alternating between real and fake frames back and forth masks them pretty well. But what happens when you are trying to interpolate 100fps into 1000fps and 9/10 frames you are seeing are faked? Seems like artifacts will become more obvious to spot in that case.

2. How do we know that interpolating more fake frames won't mean more input lag? It would make sense that it's going to need more processing time if you want to generate 900 fake frames vs. 100, and high amounts of input lag seem like an especially bad thing for VR.

I'm sure they can work it out, but that's going to take a long, long time to get there, so BFI has a good while to go before it's actually obsoleted by frame interpolation.
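
On the input lag point, here's a back-of-the-envelope model (my own simplification, not measured data): interpolation has to hold back the newest rendered frame until the in-between frames have been shown, so it adds roughly one source frame-time plus generation time, largely independent of how many in-between frames are generated. Extrapolation/reprojection avoids that wait, which is part of its appeal.

Code:
def interpolation_latency_ms(source_fps, generation_ms=0.5):
    # extra delay from waiting for the next real frame before the in-betweens can be shown
    return 1000.0 / source_fps + generation_ms

print(interpolation_latency_ms(100))  # ~10.5 ms added on a 100 fps base
print(interpolation_latency_ms(250))  # ~4.5 ms added on a 250 fps base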
 
For those of you using an LG C2 OLED as your Windows 11 display - how did you configure the Windows HDR Calibration tool? I ask because when I run it with Game Optimizer engaged (pretty much defaults except the white balance change everyone recommends) and follow the tuning advice it offers, I set the minimum luminance to 0, and I have to go all the way up to 2,220 before the white boxes disappear on both the partial-screen and full-screen luminance tests. That doesn't seem right, since we know the C2 42" OLED is more in the 750-nit range -- however -- at 750 those box grids are SUPER visible. I assume this has to do with the built-in LG tone mapping being used - but what's the proper setting here?

https://apps.microsoft.com/store/detail/windows-hdr-calibration/9N7F2SM5D1LR?hl=en-us&gl=us

 

You probably have dynamic tone mapping set to ON. Try setting it to HGiG and the white squares should start disappearing around 750 nits.
 
Two issues I can see with interpolating more fake frames:

1. Artifacts might become more visible. Right now DLSS 3 is showing you a real frame, then a fake frame, followed by a real frame, so according to DF the artifacts in the fake frames are hard to notice, since alternating between real and fake frames back and forth masks them pretty well. But what happens when you are trying to interpolate 100fps into 1000fps and 9/10 frames you are seeing are faked? Seems like artifacts will become more obvious to spot in that case.

Mark R:
I'm sure RTX 4090 has enough bandwidth for it, if the game will co-operate with the reprojector (e.g. high-pollrate mouse synchronized to reprojection logic during panning / turning / scrolling), this requires native APIs in the operating system or drivers. The DLSS 3.0 method can also be used for the parallax-reveal pixels, to reduce reprojection artifacts along edges of objects. So reprojection would be used for most pixels, and DLSS 3.0 be used only for parallax-reveal pixels -- in theory.

But that may be a hugely variable GPU load, due to the varying amounts of parallax-reveal pixels.

Now, also, it's best not to suddenly enable/disable reprojection during LFC because it means sudden changes to display motion blur. Like motion blur suddenly halving everytime the reprojector decided to add new frames. It's best to use the reprojector to try to keep the framerate constant, e.g. converting variable frame rates to permanent consistent frame rates such as 1000fps 1000Hz.

So theoretically, it could avoid artifacts.

2. How do we know that interpolating more fake frames won't mean more input lag? It would make sense that it's going to need more processing time if you want to generate 900 fake frames vs. 100, and high amounts of input lag seem like an especially bad thing for VR.

I'm sure they can work it out, but that's going to take a long, long time to get there, so BFI has a good while to go before it's actually obsoleted by frame interpolation.

Well, like he said, it would have to be native APIs in the operating system or drivers, working hand in hand with the game (which might have to broadcast z-axis/vector info on an optimized level). The tech is shifting pixels - space warping - but doing a different shift for nearer objects than for farther ones. That's more like "moving" existing frame-element pixels slightly, pixel-shift style, and less like completely "manufacturing" frames, so to speak. And it wouldn't be 9/10 frames; it would run at very high frame rates, operating on a base of 125fps or more (e.g. 125fps x4 -> 500fpsHz, 200fps x5 -> 1000fpsHz, 250fps x4 -> 1000fpsHz). But you are right, it's not here yet - some of the tech is available on VR headsets on some level, but not to that degree, and even as the tech stands currently (reprojection / Asynchronous Space Warp) it's not available on flat "pancake" gaming screens, probably due to hardware cost and having to develop games to feed vectors to the tech. DLSS itself is still young, but I think they realize the direction they need to go in, with baby steps so far.

For now, for me, I prioritize HDR games (and Auto HDR games), as well as HDR movies and shows, so BFI to me was made obsolete in a way by its incompatibility with HDR. So I'm hopeful that frame multiplication technologies continue to advance. It seems like the logical way forward to hit 500fpsHz - 1000fpsHz (and still be able to use ray tracing, even).
 

Yeah, for now it's all just "in theory" kind of wishful thinking. Until I see it actually done I think it's better to temper expectations rather than fully expect it to become reality. And since we're doing wishful thinking, I could also say that BFI could one day be compatible with HDR as well. Let's see how well the PSVR2's HDR is.
 
Yeah for now it's all just "In theory" kind of wishful thinking. Until I see it actually done

ASW/reprojection is already being done on VR headsets, but not to the degree we were talking about, so that sentiment is understandable. It's definitely not a tech that has been exploited on PC gaming monitors/TVs as of yet - perhaps due to the cost to implement hardware-wise and the dev cost/work involved. So reprojection, which works on immediate screen redraws and pixel-shifts different planes (foreground vs. background), does exist and can be seen in VR headsets. At the very least a framerate x2 (1 normally rendered frame + 1 pixel-shifted redraw per frame) exists there, and I think even that could be great on PC.

Nvidia's frame insertion, on the other hand, seems to take two frames and compare them, manufacturing a "tween" frame without any z-axis information - instead relying on AI analysis of the difference between the two rendered frames to figure out what the objects and vectors are.

As I understand it, reprojection tech needs x,y as well as z-axis and vector info from the drivers/games in order to pixel-shift the frame redraws - forecasting where to reproject the pixels. In VR it's using your head and hand movement, and I think it's also using broadcast axes/vectors of virtual objects in the scene, but a PC could do it with input from the peripherals/drivers plus the game engine broadcasting vectors. Nvidia's method instead seems to wait for another actual frame to be delivered and then go back in time, sort of, with a manufactured frame.
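
For what it's worth, here's a toy sketch of that kind of depth-aware reprojection (my own illustration, not ASW or any vendor's actual code): each pixel of the last rendered frame gets shifted by a parallax amount that depends on its depth, so near objects move more than far ones for a small sideways camera move, and the uncovered "parallax-reveal" holes are exactly the part Mark R. suggests a DLSS-style generator could fill in.

Code:
import numpy as np

def reproject(frame, depth, cam_dx, focal_px):
    """frame: HxWx3 image, depth: HxW (same units as cam_dx), cam_dx: sideways camera move."""
    h, w = depth.shape
    out = np.zeros_like(frame)
    filled = np.zeros((h, w), dtype=bool)
    xs = np.arange(w)
    for y in range(h):
        # horizontal parallax in pixels: nearer pixels (smaller depth) shift more
        shift = np.round(focal_px * cam_dx / depth[y]).astype(int)
        new_x = xs + shift
        ok = (new_x >= 0) & (new_x < w)
        out[y, new_x[ok]] = frame[y, ok]
        filled[y, new_x[ok]] = True
    # unfilled positions are the parallax-reveal regions left over for in-painting
    return out, ~filled

A real implementation would resolve collisions by depth order, handle camera rotation (which shifts everything roughly uniformly), and run per-eye at headset rate, but even this crude version shows why a reprojected redraw costs closer to a copy than to a full render.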

. . . .

VR has big budgets and has surpassed PC OS + screen tech in some facets. Development for VR likely goes in expecting vector-based (including z-axis) coding working hand in hand with the hardware environment like that. PC gaming hasn't made that leap in hardware/OS/drivers/game engines and development, at least not yet. Frame amplification technologies seem to be the way to go going forward though, with Nvidia now jumping on board, even if with immature tech so far.

. . . . .

BFI could one day be compatible with HDR as well. Let's see how well the PSVR2's HDR is.

Would be great if it didn't infringe on what the display could otherwise achieve in HDR color volumes without BFI being active. The rule of thumb in screen blanking used to be that for each % blur reduction, the peak brightness would be cut by around that much (e.g. 25% blur reduction ~> 25% brightness reduction, 50% blur reduction ~> roughly 50% brightness reduction). I'm not sure how that equates if they used rolling scans or partial scans on OLEDs in modern hardware. HDR needs all the range it can get. Hopefully it won't be like some of the "HDR 400" screens.

BFI typically needs very high frame rate minimums to work optimally too, which doesn't jibe with VRR's usage scenario very well either (squeezing higher graphics settings in and riding a VRR roller coaster of frame rates smoothly). So far I don't know of any VRR+BFI displays that worked properly, though it's technically possible. For BFI it was usually recommended that the minimum fps be higher than the peak Hz of the display, massively exceeding it in the best case. However, BFI could also benefit from DLSS AI upscaling and frame amplification technologies in that regard. (But at that point I'd probably just stick with the blur reduction gained by the amplified fpsHz, if it got to 500fpsHz on a 500Hz screen or 1000fpsHz on a 1000Hz screen; we'll have to see.)

I'm definitely intrigued by the HDR VR aspect, just from curiosity about the tech at the very least. Meta is also developing HDR-capable headset tech in-house already, for future gens, but I have no idea what specs and tech combo any VR headsets will use to attempt HDR color volumes + blur reduction, or what the end results would be in their first HDR generation. (However, for all of the advancements in VR, the PPD is still incredibly poor, unfortunately.)

As of now, BFI on desktop PCs is essentially incompatible with HDR, and with VRR in practice AFAIK (not sure about ray tracing either) - but we'll see how the tech progresses and on which fronts.
 
I can understand why you would not want to go with Philips anymore, but some displays will have failures, whether due to manufacturing flaws or just bad luck, regardless of the brand. I've had good luck with Samsung and LG so far, but other people not so much.

I'm not going to buy the Philips unless they release it in a curved design. The $2K price is also too high.
I bought the 2015 Samsung 4k TV for $2000 and it died after 2 years, 1 month after the warranty expired. Shit happens.
 