LG 48CX

I was working from quotes saying that was how they got the white array as a base for WRGB color filtration. Many sites detail the yellow and blue emitters. To be clear, they are using a yellow emitter and a blue emitter; they aren't converting yellow to blue.


-------------------------------------------------------------------
https://www.cnet.com/news/what-is-oled-and-what-can-it-do-for-your-tv/ November 2019

Yellow plus blue makes green (and red and cyan and magenta)
Currently, all OLED TVs are made by LG, and how they've made them is rather unusual. All TVs, to create the images you watch, use red, green, and blue mixed together to create all the colors of the rainbow (well, not quite all colors, but most). To create the colored light, LCDs use RGB color filters, while plasmas used RGB phosphors, and Samsung's short-lived OLED TV (and all their OLED-screened phones) use red, green and blue OLED elements.

LG's OLEDs use only two colors: a sandwich of blue and yellow OLED materials. Then, using color filters, the yellow and blue light is filtered to create red, green and blue. To add a bit more brightness, there's also a clear "white" element, too. It's a lot easier if I show you:


The steps to create an image with LG's OLED.

A yellow OLED material creates yellow (i.e. red and green) light. When combined with blue (1), this creates "white" light (2). Using color filters (3) the desired sub-pixel color (including clear/white) is created (4).
Geoffrey Morrison/CNET

Though this seems odd and convoluted, it obviously works since LG is the only company that has successfully marketed large-screen OLED TVs in any numbers. This is because it's more cost-effective to make ("more" being the key word there).

The apparent downsides, such as light output and color accuracy, don't seem to be issues. Sure, they're not as bright as the brightest LCDs, but they are still very bright, and the current models have the same color saturation as the best LCDs.


-----------------------------------------------------------------
https://www.oled-info.com/reports-say-lgd-aims-change-its-woled-tv-structure-yb-rgb Dated article but details the tech.

Reports from China suggest that LG Display is considering changing the basic structure of its white OLED panels (WOLED) used in LGD's OLED TVs. LGD is currently using yellow and blue OLED materials to create a white OLED, but now LGD may switch to an RGB based mix.

It's not clear from the Chinese reports (which are as yet unverified, of course) - but it's likely that LGD will not switch to a direct-emission RGB structure, but rather use the RGB materials to create a white OLED and remain with a color-filter based design. Switching from Y/B to R/G/B may enable LGD to achieve higher color purity - and so a larger color gamut, and may also be more efficient.

LGD's WRGB architecture - which creates 4 sub pixels using color filters (red, green, blue and non-filtered) to create a colored image from a single white OLED pixel - is less efficient and less color-pure compared to a real RGB sub-pixel architecture, but WOLED displays are much easier to produce as there's less need for subpixel patterning.


------------------------------------------------------------
https://www.oled-info.com/qa-cynoras-ceo-discuss-companys-new-blue-5-emitter March 5, 2020

-------------------------------------------------------------

The main point being they are all made into a white array with a clear, unfiltered (white) subpixel and r, g, b subpixel filters above. I wasn't listing it as a negative, outside of perhaps the highest HDR brightness levels. Rather, I was showing that they likely utilize the additional large clear spot in the filter as a white subpixel in order to boost effective brightness to our eyes without having to boost the output of the OLED emitters as much. Cutting down on the actual drive levels and heat of the OLEDs while still hitting effective color brightness is probably one of the reasons that burn-in isn't as much of an issue on LG OLEDs, especially with the brightness levels ~ %windows they are able to hit in HDR now. Before LG developed this method, I believe a non-WRGB OLED's blue subpixel emitter would wear out unevenly, much earlier than the others, and as I said, the overall output per color probably had to be higher without the added large white subpixel.

As the reviews I quoted said, LG's multi-layer WRGB tech is one of the reasons LG OLEDs are reliable enough and why LG OLED became dominant, so I wouldn't consider it a negative overall. It's a neat ~ "hack" workaround technology as the main tool against burn-in danger levels and OLED wear, and it works. (Further protections using ABL, %window restrictions, pixel shifting, logo detection ~ dimming, OLED wear-evening cycles, etc. too, of course.) The white subpixel structure was, however, brought up by others in the thread as a potential negative in regard to text rendering for regular desktop use, which is why it's been added to the conversation currently in that respect.
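To illustrate the white-subpixel point, here's a rough toy model in Python (the filter transmission numbers are made up for illustration, not LG's actual panel data) of how an unfiltered W subpixel lets a pixel hit a target white luminance at a much lower emitter drive level than the RGB-filtered subpixels alone:

```python
# Toy model of a WRGB pixel: illustrative numbers only, not LG panel data.
# Each subpixel sits over the same white (yellow+blue) OLED stack; the R/G/B
# color filters throw away most of the white light, while the W subpixel
# passes it nearly unfiltered.

FILTER_TRANSMISSION = {"R": 0.25, "G": 0.30, "B": 0.20, "W": 0.95}  # assumed values

def pixel_luminance(drive_level: float, use_white: bool) -> float:
    """Relative luminance of one pixel at a given emitter drive level (0-1)."""
    subs = ["R", "G", "B", "W"] if use_white else ["R", "G", "B"]
    return sum(drive_level * FILTER_TRANSMISSION[s] for s in subs)

target = 1.0  # arbitrary target luminance for a white patch

# Drive level needed to reach the target with and without the W subpixel:
drive_rgb_only = target / pixel_luminance(1.0, use_white=False)
drive_wrgb = target / pixel_luminance(1.0, use_white=True)

print(f"RGB-only drive: {drive_rgb_only:.2f}")  # ~1.33 -> can't even reach the target
print(f"WRGB drive:     {drive_wrgb:.2f}")      # ~0.59 -> much less emitter stress/heat
```

Same idea as the posts above: the numbers are invented, but the direction of the effect is why the white subpixel takes so much load off the emitters.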

Thanks for clarifying; it was arranged in a way that made it look like yellow was being converted to blue.
Blue + yellow does make white; it's been done with lasers as well. It's the same principle as how a white LED works, but they are doing it with separate emitters, since each white pixel is a yellow and a blue OLED source mixed. They probably have too-high losses/thermal dissipation for Stokes conversion done the traditional way (around 30% with higher-CRI solutions), but this blue/yellow way will unfortunately be less effective for red. That said, with the right filters they could have some converted emission.

Another reason the CFA (colour filter array) approach will be easiest is the existing infrastructure and know-how from camera CMOS sensors. It is less efficient than direct-emission LEDs, though, except possibly for green, which has rather poor emitter efficiency; the fact that green can be 3-5x the lumens/W of blue and red at the same power level helps mitigate this. But as you said, it is the winning play at this point and likely will continue to be until further improvements are made in direct-emission OLED RGBW and in desktop software addressing text rendering on different subpixel layouts. This has been an issue with some AUO panels and others before, so I think the time for it to be addressed is rapidly approaching, especially if owners make more noise...
Also I wonder if the Y+B sub-subpixels for white are individually addressable, as that would enable some interesting warm/cold colour shifts and blue-boosting. But you can see why they are moving to RGB sub-subpixels for white; that way you can also work around the subpixel/text issue a little more easily.

It looks like direct RGB (or more) emission from narrow-linewidth sources (the superior method after seeing them all) will mostly be the realm of VR/AR headsets and giant LED video walls at this point, but those semiconductor microLED solutions blow everything else out of the water: 1k fps and 2 million nits of brightness, and there isn't really much of a thermal limit with them compared to a flexible printed medium... they are closer to CPUs. And there are no text compatibility issues since you can retain RGB if necessary.


I'd say after a CX48, one of these microLED headsets will be the next future-looking investment. The only limit for them soon will be FPGAs and interfaces (and power on your head lol).
 
How would you guys use this with an adjustable-height standing desk? Just on the desk, or some kind of wall mount or pole mount behind the desk?

I'm thinking I would need a secondary LED or LCD monitor for document production for work and maybe wall mount this above? Not sure I would be comfortable using this for drafting documents because of burn-in.
I feel like desks are not deep enough for this 48" display; I would want it farther away and lowered so that the bottom of the screen is below the desk surface but still in view. I haven't decided on a solution to this yet.
 
For those talking about waiting until the holidays to pick up a CX48... you are cursing yourselves, because you know the rumors will be a C11 that is 43" or has full-bandwidth HDMI or whatever, and you will end up waiting a year from right now lol ;)
 
How would you guys use this with an adjustable-height standing desk? Just on the desk, or some kind of wall mount or pole mount behind the desk?

I'm thinking I would need a secondary LED or LCD monitor for document production for work and maybe wall mount this above? Not sure I would be comfortable using this for drafting documents because of burn-in.
That would be tricky, since normally you'd need the monitor to move up and down every time you move the desk between standing and sitting positions. Personally I'd probably put it on one of the articulating pillar stands I mentioned and try to set the viewing distance, height, and angle so I could still see over the top of the standing desk when it's in the lifted position. I'm not interested in using this monitor as a static desktop/app monitor myself either.

I feel like desks are not deep enough for this 48" display; I would want it farther away and lowered so that the bottom of the screen is below the desk surface but still in view. I haven't decided on a solution to this yet.
If you page back through the thread, you'd see myself and a few others championing the use of the OLED as a media and gaming "stage", with the desk as its own separate island, instead of the stereotype of putting the desk up against a wall like a bookshelf with the monitor on the desk you are sitting at.

For me, I try to keep my eyes' viewing angle pointed within a middle-third band across the monitor/TV. I already sit over 3' away from my 43" monitors in my current setup, using a separate half-circle ~ kidney-bean-shaped desk on caster wheels for my peripherals and chair. The desk is height-adjustable using screw-in knobs on "flute holes" like some workout equipment, but it's not a standing desk.

For those talking about waiting until the holidays to pick up a CX48... you are cursing yourselves, because you know the rumors will be a C11 that is 43" or has full-bandwidth HDMI or whatever, and you will end up waiting a year from right now lol ;)

I know there is always something around the corner, but to be fair a C9, with already-proven 48Gbps in its EDID and already-proven uncompressed audio pass-through via eARC per RTings, is here now, and no one is getting HDMI 2.1 output until at least near year end, if Nvidia's 7nm GPUs have it and are available. I'm not going to pay $1630 out of pocket for a 48CX without confirmation that it will have an uncompressed video pipeline (10-bit native 4k 444 120Hz HDMI pipeline ~ Nvidia/LG) and uncompressed audio format pass-through support (LG). Were that support confirmed, I would have the option of buying the CX "early" rather than "on time" for the 3000 series, when it should hopefully be able to do 120Hz 4k natively, but that's not the case. Also, the price might drop by $230 or more, and the C9 will be a lot cheaper then too once all the cards are actually on the table ~ hands shown.
 
There is no guarantee you will be able to find C9s in stock by the end of this year. Yeah, with the whole virus thing going on, maybe you will... or maybe you won't.
 
We'll see... there are still 55" C8s available for order from B&H and a few from Walmart, and it's mid-May 2020, two years later.

The C8s were released around May 2018, and though there are few left for sale online, we are well past the last Black Friday (2019). If I recall correctly there were plenty of 2018 C8s last Black Friday (Nov 2019) when the C9 was the current 2019 model. That's the normal cycle. We'll see what happens with the global health and economic fallout this year, but I don't expect it to fall on the short end.
 
We'll see... there are still 55" C8s available for order from B&H and a few from Walmart, and it's mid-May 2020, two years later.

The C8s were released around May 2018, and though there are few left for sale online, we are well past the last Black Friday (2019). If I recall correctly there were plenty of 2018 C8s last Black Friday (Nov 2019) when the C9 was the current 2019 model. That's the normal cycle. We'll see what happens with the global health and economic fallout this year, but I don't expect it to fall on the short end.

Thing about the C8 is that there is practically zero reason for anyone owning a C7, or maybe even a C6, to go out and upgrade to the C8. The C9 however has HDMI 2.1, so that in itself is a worthy upgrade for anyone who does not own one, even C8 owners.
 
That's something to consider, but most people in the main buying segment just buy for smart TV and movie watching and aren't hooking up a PC to any of these. That could change once HDMI 2.1 consoles come out, but I don't think the numbers are as big for PC outside of these forums (including videophile/audiophile forums like AVS for the uncompressed audio formats). The economic downturn could also affect sales of higher-end TVs when there are much more affordable options at 55".

Generally there are Black Friday deals that cut the current year's line down by around 22%, and also deals on the prior year's models to get rid of stock.

I'm not expecting it to be less; if anything, they are currently trying to give stimulus money, rent freezes, extra unemployment money, etc., while people are and will be losing jobs and businesses are closing - so they could have trouble moving as many newer TVs by that point if the well dries up. I'm hoping things level off, but I'm expecting that the economy, as it relates to regular workers and the job market, is going to be slammed for quite a while once this "freeze" thaws out.
 
Am I the only one who has noticed that, in general, when connecting an LG OLED to an Nvidia GPU to show a normal Windows desktop, it seems to look much better than doing the same thing with an Intel GPU (laptop)? Even though I have verified that I am running 4:4:4 etc., it just looks worse for some reason. Any ideas what can cause this? Unfortunately, the control panel for the Intel display driver is really bad and does not seem to offer many useful settings.
 
Ok - I just got my LG 48CX connected to my 2080 Ti and here are the options to run the display. But there are limitations....

1 - 4:2:0 - 8 bit 3840 x 2160 @ 60hz max / with Gsync / with HDR
2 - 4:4:4/RGB - 8 bit 3840 x 2160 @ 60hz max / with Gsync / with HDR
3 - 4:2:0 - 8 bit 3840 x 2160 @ 120hz max / without Gsync / without HDR
4 - 4:2:0 - 8 bit 2560 x 1440 @ 120hz max / with Gsync / with HDR

I've also tried custom resolutions at 3840x1600 or 3840x1440. The test works fine (it stretches, but I'm guessing that's a TV setting issue), but enabling it doesn't work.

Am I missing something? I'd like to run option 3 but with G-Sync and HDR... the difference between 60Hz and 120Hz on the desktop is VERY noticeable.

Is this a G-Sync limitation of the display? Or a limitation of the bandwidth? The latter, I'd assume, would be resolved when upgrading to a 3080 Ti IF that supports HDMI 2.1.
 
Ok - I just got my LG 48CX connected to my 2080 Ti and here are the options to run the display. But there are limitations....

1 - 4:2:0 - 8 bit 3840 x 2160 @ 60hz max / with Gsync / with HDR
2 - 4:4:4/RGB - 8 bit 3840 x 2160 @ 60hz max / with Gsync / with HDR
3 - 4:2:0 - 8 bit 3840 x 2160 @ 120hz max / without Gsync / without HDR
4 - 4:2:0 - 8 bit 2560 x 1440 @ 120hz max / with Gsync / with HDR

I've also tried custom resolutions at 3840x1600 or 3840x1440. The test works fine (it stretches, but I'm guessing that's a TV setting issue), but enabling it doesn't work.

Am I missing something? I'd like to run option 3 but with G-Sync and HDR... the difference between 60Hz and 120Hz on the desktop is VERY noticeable.

Is this a G-Sync limitation of the display? Or a limitation of the bandwidth? The latter, I'd assume, would be resolved when upgrading to a 3080 Ti IF that supports HDMI 2.1.

I would refer you to the previous pages of the thread (pgs 24 & 25) for posts discussing the limitations of 4:4:4 output on NVIDIA gaming cards, and also the limitations of HDMI 2.0b.

Basically: you’ll have to wait for Ampere.
 
For desktop use it is best to use 60Hz with either RGB or YCbCr 4:4:4. In that case even 12-bit might be available, but 8-bit should suffice. A fluid mouse cursor on the desktop is the least important thing really :p
At 1440p for games you should be able to use better chroma resolution and/or bit depth. For HDR you need 10-bit (so 12-bit).
 
For desktop use it is best to use 60Hz with either RGB or YCbCr 4:4:4. In that case even 12-bit might be available, but 8-bit should suffice. A fluid mouse cursor on the desktop is the least important thing really :p
At 1440p for games you should be able to use better chroma resolution and/or bit depth. For HDR you need 10-bit (so 12-bit).

I don't have the 12-bit option when selecting those settings, just 8-bit. But the screen still switches to HDR. Are you saying that is not HDR, even though the display states that it is? 12-bit is only visible when I switch to 4:2:2.
 
No.

No.

I want 500Hz on the desktop damnit. Quite serious. I want it so fast that it appears to be 'refresh-rate-less'. At the very least, I want it as responsive as possible at any technology level.

It's too bad these TVs don't go over 120Hz. 4K@120Hz is the same bandwidth as 1080p@480Hz, and OLED response times could fully take advantage of that refresh rate, whereas even the fastest LCDs don't always fully transition at 120Hz.

For certain games you could switch to 1080p@480Hz and others you could run at 4K@120Hz; it would be the best gaming monitor in every single way.
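Just to show the bandwidth claim is straight arithmetic (active pixels only, ignoring blanking), a quick sketch:

```python
# Raw pixel throughput (pixels per second) for the two modes mentioned above.
modes = {
    "3840x2160 @ 120 Hz": 3840 * 2160 * 120,
    "1920x1080 @ 480 Hz": 1920 * 1080 * 480,
}
for name, px_per_s in modes.items():
    print(f"{name}: {px_per_s / 1e6:.0f} Mpx/s")
# Both come out to ~995 Mpx/s, so the link bandwidth is the same; whether the
# panel's driving electronics could actually scan 480 times a second is a
# separate question.
```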
 
Ok - i just got my LG48CX connected to my 2080TI and here are the options to run the display. But there are limitations....

1 - 4:2:0 - 8 bit 3840 x 2160 @ 60hz max / with Gsync / with HDR
2 - 4:4:4/RGB - 8 bit 3840 x 2160 @ 60hz max / with Gsync / with HDR
3 - 4:2:0 - 8 bit 3840 x 2160 @ 120hz max / without Gsync / without HDR
4 - 4:2:0 - 8 bit 2560 x 1440 @ 120hz max / with Gsync / with HDR

I've also tried custom resolutions at 3840x1600 or 3840x1440. Test works fine (stretching but guessing thats a tv setting issue), but enabling it doesn't work.

Am i missing something? I'd like to run 3 but with GSync and HDR... the difference between 60hz and 120hz on the desktop is VERY noticable.

Is this a G-Sync limitation of the display? Or a limitation to the bandwith? The latter i'd assume would be resolved when upgrading to a 3080TI IF that supports HDMI 2.1.

Can you please post a few pictures of the LG 48CX box?
And where did you buy the 48CX?
 
That's something to consider, but most people in the main buying segment just buy for smart TV and movie watching and aren't hooking up a PC to any of these. That could change once HDMI 2.1 consoles come out, but I don't think the numbers are as big for PC outside of these forums (including videophile/audiophile forums like AVS for the uncompressed audio formats). The economic downturn could also affect sales of higher-end TVs when there are much more affordable options at 55".

Generally there are Black Friday deals that cut the current year's line down by around 22%, and also deals on the prior year's models to get rid of stock.

I'm not expecting it to be less; if anything, they are currently trying to give stimulus money, rent freezes, extra unemployment money, etc., while people are and will be losing jobs and businesses are closing - so they could have trouble moving as many newer TVs by that point if the well dries up. I'm hoping things level off, but I'm expecting that the economy, as it relates to regular workers and the job market, is going to be slammed for quite a while once this "freeze" thaws out.
Black Friday has no effect on the price of high-end TVs. The best time to buy is around March/April when they start clearing them out for the new models. You can find deals on them from unauthorized dealers that sell on eBay and such. That's where I got my 65" C9 for $1800 around September while stores were still charging $3000. $1800 was the best price until after the holidays; the best price I saw at a store was $2600 for the 65" at Best Buy over the Christmas season.
 
Ok - I just got my LG 48CX connected to my 2080 Ti and here are the options to run the display. But there are limitations....

1 - 4:2:0 - 8 bit 3840 x 2160 @ 60hz max / with Gsync / with HDR
2 - 4:4:4/RGB - 8 bit 3840 x 2160 @ 60hz max / with Gsync / with HDR
3 - 4:2:0 - 8 bit 3840 x 2160 @ 120hz max / without Gsync / without HDR
4 - 4:2:0 - 8 bit 2560 x 1440 @ 120hz max / with Gsync / with HDR

I've also tried custom resolutions at 3840x1600 or 3840x1440. The test works fine (it stretches, but I'm guessing that's a TV setting issue), but enabling it doesn't work.

Am I missing something? I'd like to run option 3 but with G-Sync and HDR... the difference between 60Hz and 120Hz on the desktop is VERY noticeable.

Is this a G-Sync limitation of the display? Or a limitation of the bandwidth? The latter, I'd assume, would be resolved when upgrading to a 3080 Ti IF that supports HDMI 2.1.

I don't have the 12-bit option when selecting those settings, just 8-bit. But the screen still switches to HDR. Are you saying that is not HDR, even though the display states that it is? 12-bit is only visible when I switch to 4:2:2.

HDMI 2.0 is limited to 4K 4:2:2 60 Hz with HDR and 12-bit color. In PC use it's best to turn HDR off when you are not using it; there's no sense running the display at full brightness, which will just result in the automatic brightness limiter kicking in. If you play an HDR video or game, turn HDR on for that.

If you want 120 Hz you can do it at 1080p or 1440p with HDR, or you can use 4K, 8-bit, no HDR, 4:2:0 for that.

With custom resolutions you need to enable GPU scaling from the Nvidia Control Panel; the max refresh rate is 60 Hz.

These limitations are in place until you get an HDMI 2.1 GPU, which will be available later in the year.
 
Ok - I just got my LG 48CX connected to my 2080 Ti and here are the options to run the display. But there are limitations....

1 - 4:2:0 - 8 bit 3840 x 2160 @ 60hz max / with Gsync / with HDR
2 - 4:4:4/RGB - 8 bit 3840 x 2160 @ 60hz max / with Gsync / with HDR
3 - 4:2:0 - 8 bit 3840 x 2160 @ 120hz max / without Gsync / without HDR
4 - 4:2:0 - 8 bit 2560 x 1440 @ 120hz max / with Gsync / with HDR

I've also tried custom resolutions at 3840x1600 or 3840x1440. The test works fine (it stretches, but I'm guessing that's a TV setting issue), but enabling it doesn't work.

Am I missing something? I'd like to run option 3 but with G-Sync and HDR... the difference between 60Hz and 120Hz on the desktop is VERY noticeable.

Is this a G-Sync limitation of the display? Or a limitation of the bandwidth? The latter, I'd assume, would be resolved when upgrading to a 3080 Ti IF that supports HDMI 2.1.
It's a bandwidth limitation. Your output source (video card, in this case) also needs HDMI 2.1. Since the 2080 Ti is only HDMI 2.0b you're limited to the resolutions of HDMI 2.0b. Why your custom resolutions are not enabling, I do not know.
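A quick back-of-the-envelope sketch of why those modes shake out the way they do against HDMI 2.0's roughly 14.4 Gbit/s of effective payload (active pixels only; real HDMI timings add blanking, so actual rates run somewhat higher):

```python
# Rough data rates for the modes discussed above, active pixels only.
HDMI20_EFFECTIVE_GBPS = 14.4   # ~18 Gbit/s TMDS minus 8b/10b encoding overhead

def data_rate_gbps(w, h, hz, bits_per_channel, chroma="444"):
    # 4:4:4/RGB carries 3 full channels per pixel; 4:2:0 averages 1.5.
    channels = 3.0 if chroma == "444" else 1.5
    return w * h * hz * bits_per_channel * channels / 1e9

modes = [
    ("4K 60 Hz 4:4:4 8-bit",   data_rate_gbps(3840, 2160, 60, 8, "444")),
    ("4K 120 Hz 4:2:0 8-bit",  data_rate_gbps(3840, 2160, 120, 8, "420")),
    ("4K 120 Hz 4:4:4 10-bit", data_rate_gbps(3840, 2160, 120, 10, "444")),
]
for name, rate in modes:
    fits = "fits HDMI 2.0" if rate < HDMI20_EFFECTIVE_GBPS else "needs HDMI 2.1"
    print(f"{name}: {rate:5.1f} Gbit/s -> {fits}")
```

The first two come out around 12 Gbit/s and squeak under the limit, while full 4K 120Hz 4:4:4 10-bit lands near 30 Gbit/s, which is the mode everyone is waiting on Ampere's HDMI 2.1 for.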
 
Black Friday has no effect on the price of high-end TVs. The best time to buy is around March/April when they start clearing them out for the new models. You can find deals on them from unauthorized dealers that sell on eBay and such. That's where I got my 65" C9 for $1800 around September while stores were still charging $3000. $1800 was the best price until after the holidays; the best price I saw at a store was $2600 for the 65" at Best Buy over the Christmas season.

The best price was the $1699 deals for the C9 OLED around Black Friday. I don't know what March/April have to do with anything, as prices do not drop significantly after the holidays. There have been a couple of refurbished-type deals at around $1500, but that's about it. The best time to buy is Oct/Nov for most people, and while it's true there were already deals for $1800 around Sept and Oct, there's a much larger quantity of deals around late November, so it's easier to find for someone who isn't obsessively watching.
 
In the trash? Okay!!
Strange.

Pics or it didn't happen ;)

Then unfortunately I can't believe you have the 48CX at home.
[attached photo of the LG 48CX box]
 
I want 500Hz on the desktop damnit. Quite serious. I want it so fast that it appears to be 'refresh-rate-less'. At the very least, I want it as responsive as possible at any technology level.
500Hz? Only 500Hz? This is not the direction I wish our technology would take... at least not for too long.
Actually it would be more efficient to draw only those parts of the screen that changed, and do this as fast as possible.
Bandwidth limitations would not disappear, and actually sending position information would reduce the useful data transmission bandwidth by quite a lot. This could however be solved by adding an eye tracker; then it would be known which part of the screen you are looking at, and parts closer to your focal point would get higher priority and bandwidth.
That way the mouse cursor could easily be in the MHz-refreshes-per-second range, you would never have any frame drawing synchronization issues, and you would not even be limited to drawing images in frames, as you could then draw them in pixels. For rasterization it probably doesn't make that much sense, but with ray tracing you could calculate rays depending on where you look, and those rays would be sent to the monitor and then to your eyes immediately.

Input lag would still be caused by processing time, and if it was done in frames, then by frame rate. But imagine a game engine that worked solely by calculating physics - all objects would need to physically do what they would do in reality to perform their functions, with some simplifications of course, though any law of the universe which is added would need to be "local" and its effects propagate at the "speed of causality" (the speed of light, pretty much) set in the game engine. Then you could calculate the places where you actually shoot rays (and by extension any physical activity, like moving an object, because it would be made from "particles" shooting rays) as fast as your computer hardware allows (actually, the computing node allocated to that ray).

To achieve coherence of the whole game world, nodes would synchronize world data through negotiation over a blockchain, with influence based on distance, time, and the speed of causality (speed of light). The local computer could calculate the part of the world you have immediate influence on, and the rest could reside in the cloud. To avoid cheating, synchronization would probably require calculating outcomes multiple times, with less precision the further you are from the event (e.g. what your computer calculates when shooting a ray); errors and cheating could be corrected via the blockchain. In that case it would be hard to break the laws of physics (e.g. by cheating, or merely through hardware/network issues, or even slow, inadequate hardware performance) and generate game-breaking issues, because in case of an invalid calculation on one node (e.g. your computer) it would synchronize with the network. On the local machine you could have multiple computing nodes (like we already have multiple cores) which would process data the same way and synchronize with the other nodes.

Then you would have a more or less physically accurate game engine (with, of course, the assumption that we could have a physical reality/universe with our defined laws and were comparing our game engine's accuracy to that alternate reality/universe - we might not want to emulate our own universe, which is kinda boring) with true real-time calculations and life-like presentation. No input lag nonsense, no frame pacing issues, no micro stuttering, no tearing and no bullshit. You could then call such a game engine the "Real Engine", and the only step left would be to connect your brain to it for even less hand and eye processing input lag, and to simulate the whole body and all senses... and then zap your brain to forget you are in the game 😉

So... 500Hz... pathetic 🤣

But seriously, 60Hz is enough for the desktop, and if you have to choose between 4:2:0 chroma resolution at 120Hz and 4:4:4 chroma resolution at 60Hz, then it should be obvious the latter option is better, and you should pre-order an HDMI 2.1 GPU as soon as they are announced.
 
Electroworld.nl ?

I think you are a Lucky One ;)
In Germany the 48CX is delayed from 15.5.2020 to early June.
Was there a firmware update out of the box? Just curious about the kind of support it's getting.

Looks good. I also have a 2080 Ti and I'm up in the air about getting it.
Yes, there was.
 
Actually it would be more efficient to draw only those parts of the screen that changed, and do this as fast as possible.
Honestly I expect this to be the direction the technology takes next. It's not terribly useful for full-screen motion, since that necessarily pushes you back to full-frame transmission times, but it would be useful for situations where only limited changes are occurring, where parts of the screen could be updated as fast as the chain allows. It could be 10,000Hz with OLED when only 1/100th of the viewing area changes, for example.
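As a rough sketch of how lopsided that could be, here's a toy estimate assuming a hypothetical protocol that sends a small header plus only the changed pixels (the header size and rectangle sizes are just assumptions for illustration, not any real display link spec):

```python
# Toy comparison: full frames vs. a hypothetical "send only what changed" link.
BYTES_PER_PIXEL = 3          # 8-bit RGB
RECT_HEADER_BYTES = 16       # assumed: x, y, w, h plus a tiny descriptor

def full_frame_bytes(w=3840, h=2160):
    return w * h * BYTES_PER_PIXEL

def partial_update_bytes(changed_rects):
    """changed_rects: list of (w, h) rectangles that actually changed."""
    return sum(RECT_HEADER_BYTES + w * h * BYTES_PER_PIXEL for w, h in changed_rects)

# Example: a moving mouse cursor (say 32x32) plus a blinking text caret (2x24).
update = partial_update_bytes([(32, 32), (2, 24)])
print(f"full 4K frame : {full_frame_bytes() / 1e6:.1f} MB")
print(f"tiny update   : {update} bytes")
print(f"ratio         : {full_frame_bytes() / update:,.0f}x less data")
```

With only small regions changing, each update is thousands of times smaller than a full frame, which is why the update rate could in principle go far beyond the full-frame refresh rate.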
 
They could eventually use "AI machine learning" upscaling ~ DLSS etc. to reach, say, a solid 100fps, and then develop much better low-latency, non-artifacting frame duplication/interpolation to multiply that 100fps x 10 to get 1000fps. If they had 1000Hz monitors fed with 1000fps, it would be 1ms of sample-and-hold blur, like an FW900 graphics-professional CRT, which is essentially "zero" blur.

https://blurbusters.com/blur-buster...000hz-displays-with-blurfree-sample-and-hold/

[Blur Busters chart: motion blur vs. frame rate and refresh rate]


Foveated rendering ~ eye tracking is useful, but I think the future there is more advanced form-factor VR and AR headsets and glasses (even seated, with virtual~mixed reality screens/viewports usage scenarios), haptics, etc., where those kinds of frame-saving techniques are probably more useful given the huge fields of view.

That higher FPS+Hz doesn't just reduce sample-and-hold blur, though. It also adds motion definition, including pathing articulation, potentially smoother animation cycle frames, and overall smoothness... more unique pages in an animated flip book flipping much faster. That applies to the whole game world moving relative to you when you are viewport panning, mouse-looking, movement-keying, gamepad panning, etc. in 1st/3rd person games, while it also affects every virtual object's movement on the screen. There would be diminishing returns on the motion definition aspect as the FPS and Hz got very high, but it's a big difference between a solid 60fps and a solid 120fps/Hz and higher.
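For the blur side of it, the usual sample-and-hold rule of thumb (blur width ≈ object speed × frame persistence, assuming frame rate matches refresh rate and no strobing) works out like this:

```python
# Sample-and-hold motion blur rule of thumb, as popularized by Blur Busters:
# perceived blur width ~= object speed (px/s) * frame persistence (s).
speed_px_per_s = 1000   # e.g. panning at 1000 pixels per second

for hz in (60, 120, 240, 480, 1000):
    persistence_ms = 1000 / hz
    blur_px = speed_px_per_s * persistence_ms / 1000
    print(f"{hz:4d} Hz: {persistence_ms:5.2f} ms persistence -> ~{blur_px:5.1f} px of blur")
# 1000 fps @ 1000 Hz gives ~1 px of blur at this speed -- the "essentially zero",
# CRT-like figure mentioned above.
```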
 
It's too bad these TVs don't go over 120Hz. 4K@120Hz is the same bandwidth as 1080p@480Hz, and OLED response times could fully take advantage of that refresh rate, whereas even the fastest LCDs don't always fully transition at 120Hz.

For certain games you could switch to 1080p@480Hz and others you could run at 4K@120Hz; it would be the best gaming monitor in every single way.
You forget that implementing the simplest upscaling algorithm in existence, "integer scaling", is beyond the capabilities of any monitor manufacturer, so even if you had a high-refresh-rate gaming OLED monitor and it could run faster at a lower resolution, you would get a blurry mess and not perfectly defined pixels.

For your PC and the connection it is the same bandwidth for the two modes you mentioned, but the panel itself needs to refresh the state of all its pixels in each frame, so some parts of the panel would need to work four times as hard, and it might just not be possible to overclock a panel to four times its rated refresh rate. Issues to consider would be signal characteristics not being good enough to even allow such high speeds, TDP increase, EMI increase, etc. All of these can be solved, but monitor manufacturers can't fix them and the OLED panel manufacturer won't spend money on these things either.

On a positive note:
When we talk about gaming OLED monitors, we won't be using the HDMI 2.1 nonsense with them but DisplayPort 2.0, and with it a stupendous amount of bandwidth, 80Gbit/s, plus bandwidth-saving techniques such as DSC (Display Stream Compression), which achieves 3:1 compression with little to no visual quality impact. With a 2160p gaming monitor you would not really need to drop resolution to save bandwidth. You might possibly want to do it to get better frame rates from your GPU, but with all these new fancy AI upscaling techniques that we already have and that are being actively developed, it might not even be needed. If however you really wanted integer-scaled 1080p (or e.g. 1440p on a 2880p monitor), then you could easily use the GPU's own integer scaling without even remotely worrying about connection bandwidth.

I also think that gaming OLEDs won't be 120Hz but more like 240Hz.
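For a rough sense of the headroom (same back-of-the-envelope style as earlier: active pixels only, blanking ignored), a hypothetical 4K 240Hz 10-bit 4:4:4 OLED monitor over DP 2.0 would look like this:

```python
# Active-pixel data rate for a hypothetical 4K 240 Hz 10-bit 4:4:4 monitor.
DP20_MAX_GBPS = 80.0   # DisplayPort 2.0 UHBR20 raw link rate
DSC_RATIO = 3.0        # Display Stream Compression, ~3:1 "visually lossless"

rate = 3840 * 2160 * 240 * 10 * 3 / 1e9
print(f"uncompressed: {rate:5.1f} Gbit/s")             # ~59.7 Gbit/s
print(f"with DSC 3:1: {rate / DSC_RATIO:5.1f} Gbit/s") # ~19.9 Gbit/s
print(f"DP 2.0 limit: {DP20_MAX_GBPS} Gbit/s")
# Plenty of headroom either way, which is the point being made above.
```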
 
You forget that implementing the simplest upscaling algorithm in existence, "integer scaling", is beyond the capabilities of any monitor manufacturer, so even if you had a high-refresh-rate gaming OLED monitor and it could run faster at a lower resolution, you would get a blurry mess and not perfectly defined pixels.

You should look at what Nvidia is doing with "AI / machine learning" DLSS 2.0. It's not leaving things at a lower, muddy upscaled resolution. It's also worth noting that at first DLSS could only work on "pre-baked" titles, but now it can learn any title, according to Nvidia. AMD and Oculus are also working on similar tech. Oculus' standalone headset functionality, with weaker phone hardware when untethered, could benefit a lot as well.

Nvidia Keynote Part 2 (Youtube Link)
"NVIDIA CEO Jensen Huang describes how computer graphics is entering a new era with #NVIDIARTX, which combines ray tracing and #AI to create dazzling visuals. See how NVIDIA DLSS 2.0 - deep learning super sampling - uses AI to upscale a low-resolution image to better than native "

-------------------------------

NVIDIA DLSS 2.0 Tested - Too Good to be True!? | The Tech Chap

-------------------------------

It might sound too good to be true, but if it works I'm hoping they could potentially develop better interpolation to multiply raw frame rates on top of that. So for example, they use DLSS 2.0 on 1080p to produce a 4k display signal with learned fidelity and it gets a solid 100fps - if a better form of interpolation were developed, they could then multiply that 100fps many times over to feed higher-Hz monitors. If DLSS is as good as they say it is, regular upscaling will be obsolete. Anything higher than 120Hz is really outside the scope of this thread though, so that extreme-Hz talk is probably getting off topic.

It appears to me that they are promoting the use of DLSS to counteract the massive frame rate hit that RTX ~ ray tracing can cause, but if DLSS allows more people to feed the 120Hz 4k this display is capable of, with ample frame rate to get over 100fps even without RTX, it could be a great thing. I'll reserve judgement until I see it in action on titles myself, but the videos look promising.
 
You forget that implementing the simplest upscaling algorithm in existence, "integer scaling", is beyond the capabilities of any monitor manufacturer, so even if you had a high-refresh-rate gaming OLED monitor and it could run faster at a lower resolution, you would get a blurry mess and not perfectly defined pixels.

Nvidia's GPU scaling supports integer scaling, so you can just use that if you want to play at 1080p. To me 1080p looks far worse than regular scaled 1440p paired with image sharpening. I find that 1440p to 4K is a much smaller difference than 1080p to 1440p. I tried games like RDR2 with lots of fine small detail, and a lot of it is lost running at 1080p.
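For anyone curious, integer scaling really is just pixel replication; here's a minimal numpy sketch of what the GPU is doing when it integer-scales 1080p onto a 4K panel:

```python
import numpy as np

def integer_scale(frame: np.ndarray, factor: int) -> np.ndarray:
    """Replicate each pixel into a factor x factor block (nearest-neighbor
    scaling by a whole-number factor, i.e. 'integer scaling')."""
    return np.kron(frame, np.ones((factor, factor, 1), dtype=frame.dtype))

# A 1080p RGB frame scaled 2x in each axis -> exactly 3840x2160, no blur.
frame_1080p = np.random.randint(0, 256, (1080, 1920, 3), dtype=np.uint8)
frame_4k = integer_scale(frame_1080p, 2)
print(frame_4k.shape)  # (2160, 3840, 3)
```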

DLSS 2.0 is the real deal though. I have a hard time telling Control running at 2560x720 from native 5120x1440 when DLSS 2.0 is enabled. With all ray tracing options enabled the game both runs and looks fantastic like this, whereas running at native 5120x1440 (which is still about 1M pixels less than 3840x2160, btw!) is about 20-30 fps on a 2080 Ti. I originally played through the game at 3840x1080 with image sharpening because the wider aspect ratio made the game better for me and I needed that performance boost. DLSS "1.5" or whatever the previous version was called did not even work at this uncommon aspect ratio.
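The arithmetic behind those resolutions, for reference:

```python
# Pixel counts behind the resolutions mentioned above.
def mpx(w, h):
    return w * h / 1e6

print(f"2560 x 720  (DLSS internal render): {mpx(2560, 720):.2f} Mpx")   # ~1.84
print(f"5120 x 1440 (native ultrawide)    : {mpx(5120, 1440):.2f} Mpx")  # ~7.37
print(f"3840 x 2160 (4K for comparison)   : {mpx(3840, 2160):.2f} Mpx")  # ~8.29
# The DLSS render resolution here is only ~25% of native, which is where the
# big frame-rate headroom for ray tracing comes from.
```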
 
How would this TV compare with something like a fast gaming monitor for FPS games and similar, when the TV is at the best possible settings for lowest input lag? Are we now at the level where a "fast TV" could actually be used as an alternative to a fast gaming monitor, even for somewhat competitive gaming? It's a bit tricky to actually compare, since there isn't a well-defined way of measuring this, and there also seems to be a lot of "marketing BS", especially for gaming monitors, with sometimes laughable numbers specified.
 
How would this TV compare with something like a fast gaming monitor for FPS games and similar, when the TV is at the best possible settings for lowest input lag? Are we now at the level where a "fast TV" could actually be used as an alternative to a fast gaming monitor, even for somewhat competitive gaming? It's a bit tricky to actually compare, since there isn't a well-defined way of measuring this, and there also seems to be a lot of "marketing BS", especially for gaming monitors, with sometimes laughable numbers specified.
Better image response time, marginally more input lag. Overall the only issue is size which is not ideal for FPS games where you might prefer something smaller.
 
Better image response time, marginally more input lag. Overall the only issue is size which is not ideal for FPS games where you might prefer something smaller.

Size is something that can be fixed with a custom resolution though; I use that today already. It is mainly details about the actual difference in input lag I am looking for here, as I have seen both things claimed in this thread: both that the difference now is so small that not even a pro would notice it (and I am far from being a pro :D), and that there is still a noticeable difference for fast-paced games.
 