Where are the 8K Monitors?

From Mikejl over on avsforum's 900D owner's thread:

Rtings QN900D Review is now up. 8.3 overall score.

Years ago I used to swear by Rtings. For the last few years now, I just take their reviews with a grain of salt.
 
But who does it better?

I'd say maybe TFTCentral, if/when they decide to review something, though they have their own biases too, don't review many gaming TVs, and even lack reviews for some monitor models.

From my reply about RTings on AVS:

. . . . .

They ding screens for stuff like "ergonomics", when anyone buying a $2,000 to $6,000 screen would likely spring for a quality 3rd-party stand or wall mount. In fact, they probably already have one from their previous screen. They also ding for speakers, and really, I don't know who would spend money on a quality screen and then use the integrated speakers. One other thing they ding for is SDR brightness that isn't in the overkill range needed by people who view their TVs in very bright rooms. Even during the day I at least dim my room environment some.

So really, you have to read their reviews in a line-item fashion, ignore the headings that have zero or inconsequential impact for you, then average out the ones that do yourself. Even then, some things matter more to one person than another - static desktop/app use on a FALD LCD vs an OLED, for example.

I just checked the main itemized headings on RTings:

. . They repeatedly complain about very low-rez content scaling, when anyone looking at the 900D should know it's the first 8k to be able to do 1080p well. For anything less than that, you should know better. You can use an external player to upscale anyway, which might help in that regard. Regardless, a 900D isn't a 480p showcase lol.

. . They also mention aggressive ABL, which is valid as long as they also ding OLEDs and all 1600-nit+ FALD LCDs, because practically all of them have aggressive ABL. I do wish manufacturers would make their housings boxier, with vented grilles and active fan-cooling profiles, so I tend to agree there - as long as it's balanced fairly against screens that go as bright (usually triggering ABL), measured fairly against screens that don't go as bright and so never get the chance to show that range in the first place, and against OLEDs, which all have pretty aggressive ABL.

. . They repeatedly complain that there is no Dolby Vision support. You can look at that from either side of the fence. Dolby charges TV manufacturers and content creators for Dolby Vision. Samsung and others are trying to make free versions of such tech, and Samsung is also releasing a free surround format to compete with Dolby Atmos soon. If RTings were to champion open standards, they would take away points for DV and award points for HDR10+. Also, the brighter the peak range of an HDR TV, the more it can map HDR to absolute values rather than tone map (that is, such screens compress the top end of the range less), so the less it should have to rely on DV altering and shaping a scene's levels. Besides that, you can use a DuneHD player and/or an HDFury to render DV on a Samsung TV if you really want Dolby Vision on a display that doesn't pay the Dolby Vision tax (though you'd have to buy one of those devices, and they aren't cheap either).

. . . .

androidauthority.com article, Jan 2024:


Samsung confirms its Dolby Atmos competitor will arrive in 2024
The format will support previous-generation TVs and legacy sound systems after a firmware update.


Immersive Audio Model and Formats (IAMF)

. . . . .
 
RTings does for sure have the problem of starting to believe that their subjective opinions are objective. They are quite good about gathering objective data - hence they are one of the ones I look at (TFTCentral and Monitors Unboxed being the other two) - but they do sniff their own farts a little too much and get into "this objectively matters" or "this objectively doesn't" territory when it is in fact a subjective call.

Still a good resource, but I ignore the scores entirely and most of the commentary. I'm interested in the numbers.
 
So, do we still not have confirmation about the actual panel in the QN900D being 240 hz? It seems like it would in fact still be a 120 hz panel, probably even the same panel as last year, but with added "CPU power" to allow it to accept a 240 hz input.

At least the Rtings reviews usually confirm that most of the initial reviews for every new high-end Samsung are basically BS - remember how the early ones for the QN900D claimed no blooming/haloing, more dimming zones, etc.
 



They seem to say repeatedly that the 900D does 4k 240Hz via DSC. There is no mention of cutting the vertical resolution below 4k, using flicker, or any other tricks besides DSC. I don't think they've updated their template to show 4k 240Hz in every one of the bold-lettered scroll tables yet, but the paragraphs below each table say it does 4k 240Hz via DSC even where it's not listed in the tables.





. . . . . . .


For some reason RTings has a 1440p fetish (and a 480p/720p media fetish). I don't know why you'd get an 8k screen if you need to run 1440p for gaming, outside of, I guess, consoles doing 1440p 120Hz or ~1536p downscaling during dynamic resolution. You'd probably be better off using a 4k screen for consoles. Consoles can barely do 4k and don't go over 120fpsHz, so RTings must be talking about 1440p PC gaming for some reason.
. . Consoles checkerboard and consoles dynamically downscale from 4k to ~ 1536p.
. . PCs capable of it upscale to 4k with DLSS (and optionally increase frame rates with frame gen).

They also mention "those with a powerful PC wanting to game at 8k". Sure, that's an option, I guess. To my thinking, a powerful rig is what's required to run 4k 240Hz (via DLSS + frame gen as necessary) in order to get over a 200fpsHz average where possible in some games - 240fpsHz for the motion/clarity aesthetics, not running 8k at 60fpsHz with smeary FoV movement and molasses-like motion definition/pathing.

It would be nice if running 4k, 5k, 6k, or ultrawide resolutions windowed weren't limited to 60Hz (you'd essentially still be running 8k 60). If you could do custom screen resolutions 1:1, letterboxed with a black frame all around, at a 240fpsHz peak, it would open up some more good 240Hz options in "window-like" letterboxed sizes. 32:10 and 21:10 240Hz options would be nice in addition to 4k, 5k, and 6k letterboxed options.
 
Rtings Review quote: Samsung QN900D 8k QLED Review (QN65QN900DFXZA, QN75QN900DFXZA, QN85QN900DFXZA)

The TV has decent lighting zone transitions. Unfortunately, the leading edge of bright highlights when they quickly move across the screen is visibly dimmer, and there's very noticeable haloing.

Also typical of Samsung FALD gaming TVs, apparently:

When the TV is set to Game Mode, its local dimming performance is slightly worse overall, with more noticeable blooming and zone transitions.

.

I guess that is sort of expected of FALD LCD gaming tvs though. Pros and cons.

.
 
It only has 1,344 dimming zones, which is a big problem for a TV that size. It needs at least triple the number of zones.

Yes, for sure, more zones would be a lot better. However, HDTVTest and RTings have both said in the past, of other Samsung FALD TVs, that Samsung's game mode makes the FALD transitions slower and spreads them across a wider number of zones than in media modes. That was true at least of previous models, so it wasn't that surprising to hear it's true on this one as well.

However, as far as size goes, that is relative to viewing distance, just like PPD is. A smaller screen at the same viewing angle - nearer, with the same number of zones - would look the same.

e.g. a 65" 4k screen (4k for the sake of example) viewed at 4 feet from your eyeballs, around a 60-degree horizontal viewing angle, will look the same perceptually - in resolution/perceived pixel size, screen size, and perceived lighting-zone size ~ "backlight density" - as a 32" 4k with the same FALD count at 24 inches away (also a 60-degree viewing angle).

The thing that would change is the screen-surface brightness (and the peaks as measured by testing hardware right up against the screen), which would be perceived as somewhat less bright the farther away from the screen you are.

So a 32" 4k or 8k at the same viewing angle would benefit from a greater number of FALD zones just as much as a larger screen would - unless for some reason you sit relatively much closer to the larger screen than you would to a 32", near enough to push the larger screen way out into your periphery at a much wider viewing angle, with much larger fields of pixels at the sides of the screen off-axis from you (which exacerbates other problems as well).
 
It only has 1,344 dimming zones, which is a big problem for a TV that size. It needs at least triple the number of zones.
Ironically, the reason I switched from using an OLED as a PC monitor to an 8K LCD was also the fact that the OLED had too few dimming zones despite having a few million of them.
 
How's the sub 50fps 8k gaming treating you fellas?

I don't have one of these but I'm keeping tabs on the performance and specs.

Personally, I have zero interest in 60fpsHz gaming at any resolution. These screens are supposedly capable of 240fpsHz 4k scaled onto the 8k screen, while still giving you a quad of 30" 4k screen spaces for desktop/apps without middle bezels. That, and good HDR color ranges and peaks (even if suffering ABL, like many bright screens in slim form factors and without active cooling do).

Besides, in more demanding games you might be lucky to get near 240fpsHz at 4k at all, and possibly only via DLSS and frame gen, depending on the title.

.
 
Personally, I have zero interest in 60fpsHz gaming at any resolution.

Not at 8k, but at 4k I go back and forth on this so hard. I love high frame rates... but I love eye candy too. I've been playing Hitman lately, and it has some ray tracing in it, but they didn't really spend much time optimizing it; it was an add-on. Many games will do things like only raytrace certain reflections, or use lower-rez samples (blurrier but faster), and things like that. Hitman, though, basically replaces SSR and does so at full rez. The net effect is that it can be REALLY heavy on FPS. In small scenes without lots of reflective stuff it isn't bad, but in some areas it tanks. Without RT the game runs at the monitor's refresh rate most of the time and is buttery smooth. With it on, it drops a lot, sometimes well below 60.

Yet I find myself toggling it on and off. I like the smooth FPS... but man, I really like the way the better reflections make materials look. Usually I put up with the lower FPS unless it gets REALLY bad in an area, and then I turn it off.

In Alan Wake 2, my GF and I decided to just leave it cranked all the way. It looks SO NICE with the high-end RT setting. It really punishes FPS but we've decided it is just worth it.

I like smooth motion, but man the eye candy...
 
I'm not very familiar with that game, but if you are playing a story game with slow pans, narrow corridors or flashlight beams, maybe orbiting virtual cameras, cutscenes throughout, etc., that might help avoid FoV movement at speed, where the screen blurs a lot more. High fpsHz is less meaningful when watching movies/media too (though sports and high-action content could benefit, provided a high-fps source). When you have complete freedom of FoV movement in 1st/3rd-person games, sample-and-hold blur is a worse scenario than that. Higher fpsHz also makes motion pathing more articulated and smoother, which is a big gain; blur reduction is the other big gain.

. . . . . .

Huge reduction in sample-and-hold blur:

This is based on 1000px/second, which can be faster or slower depending on how you are moving your FoV at any given time. It applies to the entire screen of high detail when moving your FoV, not just a simple cel-shaded cartoonish UFO object. To me, 60fpsHz is sluggish molasses movement and smearing blur. It's playable, but not great, and that's compared to 120fpsHz. 240fpsHz would be even better: 4x improved vs 60fpsHz in both blur reduction and motion definition.

[Blur Busters chart: pixels of motion-blur smear per frame at various fpsHz, 1000px/sec panning]
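The chart's arithmetic is just one division - a quick sketch of the same rule of thumb, using the same 1000px/sec pan speed:

```python
# Blur Busters rule of thumb for sample-and-hold displays:
# blur width (px) ~= panning speed (px/s) / frame rate (fps = Hz)
speed_px_s = 1000  # the 1000 px/sec FoV pan used in the chart above

for fps in (60, 120, 240, 360):
    print(f"{fps:>3} fpsHz -> ~{speed_px_s / fps:.1f} px of smear")
```

So 60fpsHz smears ~16.7px per frame, 240fpsHz ~4.2px: the 4x improvement mentioned above.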


. . .

Motion definition, path articulation, more detailed animation cycles/action states:

Blur Busters' testing shows the mouse moving in a straight line, but the gaps show how fast a mouse or any object can move and change direction (e.g. "dots on a dotted curved line"), or have its info/state updated on screen (changes in action, how fluidly and in how much detail it is animated, etc.).

[Blur Busters photo: mouse-cursor stepping at 60 vs 120 vs 240 Hz]


[Blur Busters project480 photo: mouse-arrow stepping]


. .

I think this one is 360fpsHz:

[Photo: 8000Hz mouse vs 360Hz display, stroboscopic stepping]
 
Not at 8k, but at 4k I go back and forth on this so hard. I love high frame rates... but I love eye candy too. I've been playing Hitman lately, and it has some ray tracing in it, but they didn't really spend much time optimizing it; it was an add-on. Many games will do things like only raytrace certain reflections, or use lower-rez samples (blurrier but faster), and things like that. Hitman, though, basically replaces SSR and does so at full rez. The net effect is that it can be REALLY heavy on FPS. In small scenes without lots of reflective stuff it isn't bad, but in some areas it tanks. Without RT the game runs at the monitor's refresh rate most of the time and is buttery smooth. With it on, it drops a lot, sometimes well below 60.

Yet I find myself toggling it on and off. I like the smooth FPS... but man, I really like the way the better reflections make materials look. Usually I put up with the lower FPS unless it gets REALLY bad in an area, and then I turn it off.

In Alan Wake 2, my GF and I decided to just leave it cranked all the way. It looks SO NICE with the high-end RT setting. It really punishes FPS but we've decided it is just worth it.

I like smooth motion, but man the eye candy...
I think it depends a lot on the game. Fast-paced games tend to be better with a high framerate, while for slower-paced eye-candy games I'm fine with 60 fps. I even played the PS5 version of God of War Ragnarok at the 40 fps setting, because it didn't look janky like the 30 fps mode or blurry-vaseline like the 60 fps mode.

In desktop use it's hard to give up 120+ Hz though; your mouse just feels so much more responsive. I'd love for 8K @ 120 Hz to become a thing, and then those screens at, say, 50-55" sizes...
 
From RTings. The 900D obviously has a much slower response than an OLED, as you'd expect.

. . .

80% Response Time 3.9 ms

100% Response Time 7.8 ms

"The Samsung QN900D has an excellent response time for minimal blur behind fast-moving objects. Unfortunately, the response time is slower when coming out of dark states, so there's some noticeable black smearing in dark transitions."

. .

as opposed to their info on the LG C1 4k OLED:

80% Response Time 0.2 ms

100% Response Time 2.3 ms

"The LG C1 has a near-instantaneous response time, but you may still notice motion blur caused by persistence—that is, the way our eyes track movement."

. . .

You are still going to get sample-and-hold blur on any screen (unless you are using BFI, which has its own major tradeoffs, especially in regard to HDR and VRR) . . but the FALD LCD will have more black smearing on top of that factor.

The higher your fpsHz in a game - using DLSS AI/machine-learning upscaling + frame gen, most likely, for any kind of demanding 4k game - the lower your sample-and-hold blur will be. That blur is ugliest in 1st/3rd-person games where you have freedom of FoV movement: mouse-looking, movement-keying, controller panning, etc. at speed moves the entire game world in your viewport, and it becomes a smeary mess the lower your fpsHz graph (and the ceiling of your monitor) is. Higher fpsHz also gives more motion definition - pathing and animation-cycle-defined movement/action.

. . .

1st- and 3rd-person action/adventure games with gorgeous game worlds definitely benefit a lot from high fpsHz, imo. I've been playing at over a 100fpsHz average whenever I can since at least when The Witcher 3 came out, though back then I had to use two 1080 Tis in SLI (with its own tradeoffs) to get that at "very high+"/"ultra minus" settings. GTA V was another that got a lot of performance out of SLI, and Vermintide 2, which I put a lot of hours into, along with Shadow of Mordor, the Nioh games, Tomb Raider, the Dishonored series, and a few other things.

I'm looking forward to getting a 200fpsHz average or more on a 240fpsHz-capable 4k (or 4k -> 8k) screen at some point. 120fpsHz has 2x the motion clarity (blur reduction) and motion definition/articulation of 60fpsHz; 240fpsHz has 2x that of 120 and 4x that of 60fpsHz. It would be great if frame gen advanced to the point where games and peripherals broadcast motion vectors to the AI system someday, so frame gen could generate and insert 2 or more frames instead of 1, far more accurately, informed by the vector information of every entity and force in the game as well as the peripherals and the player's FoV movement. Maybe then we could get 480fpsHz 4k, but the cables and ports are limiters. Perhaps a GPU module on the screen might bypass that, doing all of the AI DLSS and frame-gen work on the screen itself after receiving a lower-bandwidth signal (e.g. 1440p or 4k 120Hz) from the PC - but they'd have to figure out a way around anything that might increase input lag. That, or higher-bandwidth ports and cables like 80Gbps DP 2.1 plus good compression, depending on limits.

Edit - I looked up the LTT calculator, and at a ~2:1 DSC ratio, 10-bit 480fpsHz fits within DP 2.1 - if it's the DP 2.1 flavor that is 80Gbps capable (on the GPU, the cables, and the display). The LTT chart lists real-world DP 2.1 80Gbps as 77.37 Gbit/s. A rough sanity check of the arithmetic is sketched below the format details.

Data Rate Required: 73.05 Gbit/s

Video Format:
3840 × 2160 (16∶9 ratio) at 480 Hz
10 bpc (30 bit/px) RGB color
Display Stream Compression (14 bit/px, 2.143∶1 ratio)
CVT-R2 timing format
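Here's that sanity check - not the calculator's exact algorithm; the CVT-RB2 blanking figures (80-px horizontal blank, ~460 µs minimum vertical blank) are my assumptions about what it uses:

```python
# Rough re-derivation of the LTT calculator result above: 3840x2160 @ 480 Hz,
# 10 bpc RGB compressed by DSC to 14 bit/px, CVT-R2 timing.
H_ACTIVE, V_ACTIVE, HZ = 3840, 2160, 480
BPP_DSC = 14                       # 30 bit/px at a ~2.143:1 DSC ratio

H_TOTAL = H_ACTIVE + 80            # assumed CVT-RB2 fixed horizontal blank
frame_s = 1 / HZ
# stretch the line count so ~460 us of each frame period is vertical blanking
v_total = V_ACTIVE * frame_s / (frame_s - 460e-6)
pixel_clock = H_TOTAL * v_total * HZ               # pixels per second on the link
rate_gbps = pixel_clock * BPP_DSC / 1e9
print(f"~{rate_gbps:.2f} Gbit/s required vs 77.37 Gbit/s usable on 80 Gbps DP 2.1")
# -> ~73.0 Gbit/s, in line with the calculator's 73.05 figure, so it fits
```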
 
Rtings just answered my question about the actual refresh rate of the QN900D, saying that the TV has a native refresh rate of 120 hz.

"You can game at 240fps in 4k on this TV if you have a powerful enough computer to handle that. The TV has a native 120Hz refresh rate, so it achieves this by using Display Stream Compression (DSC) technology."

https://www.rtings.com/discussions/...ew-updates-samsung-qn900d-8k-qled?sort=newest

That said, the answer itself feels a bit weird to me, as the person responding seems to mix up the processing of the input signal with the actual panel displaying the content, so I've already posted a follow-up question. If the answer is correct, it would seem that my "suspicions" about the 240 hz panel were warranted - even though I still hope to be proven wrong here, as I really want the QN900D to actually have a 240 hz panel.

elvn I think we might be better off continuing the more and more PC-focused discussion here :)
 
I'd really like a larger 240hz screen, and higher than 4k. 32" 4k isn't going to cut it for me.

I guess there is still hope that things like the 57" 4k+4k super-ultrawides and Samsung Arks will be released someday in larger sizes and higher resolutions, maybe with DP 2.1 at 80 Gbps in the long run.

I think in the longer run XR glasses should get 4k per eye, and eventually higher - hopefully 8k per eye (stereoscopic "3d", so it would only be a singular 4k or 8k space in the glasses). Currently they only do 1080p per eye at 120Hz and aren't as polished and functional overall as they could be down the road.
 
Agreed. I had also hoped that the QN900D was really 240 hz, but I always had a suspicion it might not be. Unless Rtings change their current statement, I guess we can conclude that it is still only 120 hz, i.e. probably the same panel as before.

As I understand it, there would still be marginal gains from feeding it an input signal higher than the native refresh rate, but compared to an actual 240 hz panel (with pixel response times to match), those are marginal at best. With a 240 hz input signal, I guess it might in fact turn into some kind of frame doubling even if the panel is only 120 hz - at the least, the image available to draw once the panel is ready would be fresher than at 120 hz. But whether that is something a normal person would actually be able to notice... not so sure. It would seem reasonable to assume that the QN900C, said to be 144 hz, is then also only actually 120 hz with regard to panel refresh rate.

So the Neo G9 57" is probably the one to get for mainly fast paced 4K+ gaming at 240 hz, as could be expected.
 

I believe that frame doubling just repeats the frame twice - rather than interpolation, which adds input lag to manufacture an in-between frame (interpolating "tween" quasi-frames, kind of like frame gen but not as good). Redrawing the same frame twice (or using flicker) will still cut the sample-and-hold blur in half, but it won't add any motion definition or motion smoothness, as it's just flipping to a page in the animation book that has the exact same image on it.

So in the case of frame doubling, you would gain 2x the motion clarity/blur reduction but no increase in motion definition, pathing articulation, or animation-cycle detail/fluidity. If it's instead doing some kind of flicker, that would probably lower the perceived brightness like BFI does - which is a bad thing for HDR gaming. Flicker could also potentially cause some crosstalk and artifacting with VRR, depending on how it was done.
 
These are pretty much what I'm waiting for. The Samsung G95NC super-ultrawide does not seem to go low enough in price here in Finland to be worth it, and at this point I could just wait for these so I can avoid Samsung's quirks.

I'll probably go for the 45" if it comes out early next year, as waiting another year for a 40" version is not nice. I'm currently using a single 28" 4K plus my MacBook Pro 16" as displays, and it's less than ideal.

To bring this back to 8K... I would still love to see an 8K version of the Samsung Ark 55". Having seen the G95NC and the Ark side by side at a store, the Ark is just not high-res enough, but the curvature is nice.
 
Redrawing the same frame twice (or using flicker) will still cut the sample-and-hold blur in half, but it won't add any motion definition or motion smoothness, as it's just flipping to a page in the animation book that has the exact same image on it.

So in the case of frame doubling, you would gain 2x the motion clarity/blur reduction but no increase in motion definition, pathing articulation, or animation-cycle detail/fluidity. If it's instead doing some kind of flicker, that would probably lower the perceived brightness like BFI does - which is a bad thing for HDR gaming. Flicker could also potentially cause some crosstalk and artifacting with VRR, depending on how it was done.
Frame doubling on a sample & hold display won't improve motion clarity.
It can at most help LCD panels with their overdrive trickery. An LCD panel can also react positively to double scanning even without overdrive, but that is an LCD-specific effect; something close to perfect sample & hold (OLED? not quite, but much closer to perfect than any LCD...) should show no difference.

It really doesn't make that much difference to motion clarity on my 360Hz OLED whether I run the monitor at 360Hz or 60Hz when a game runs at 60fps.
On my IPS panel it makes a difference. With its default "high" overdrive it wasn't even designed to run at 60Hz, so there is a big difference in how motion looks between 120Hz and 60Hz even when displaying the same 60fps content. Likewise, there is a big difference when playing a game with VRR at the lower end of the VRR range - the moment the GPU needs to do double scanning, there is a big change in how the overdrive artifacts look. With overdrive disabled this difference in overdrive artifacts disappears - in which case the same changing elements in the picture are simply blurred.

----
There are, however, some differences.
For one, if the source sends frames every 16.6ms (so at 60Hz) and the panel runs at twice that, we will see:
1. Horizontally moving vertical lines won't bend by the same amount as on a native 60Hz panel
2. There will, by necessity, be higher input lag

re. 2.
In an ideal implementation: 8.33ms more lag at the top line, ~4.16ms at the middle line, and ~0ms at the bottom. In a less ideal and more likely implementation, the engineers will first buffer the whole frame and only then start sending it to the panel, so input lag would be ~16.66ms at the top, ~12.5ms in the middle, and ~8.33ms at the bottom. Likely more, because memory is cheap, so why not add more input lag with it? 🙃

This is the reverse of running the monitor at 120Hz through the GPU when the game runs at 60fps, in which case lag is reduced because frames are streamed out of the port to the display faster.
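A small sketch of that "re. 2." arithmetic, modeling both implementations described above; the linear top-to-bottom falloff is the assumption here:

```python
# A 60 Hz source redrawn by a panel scanning twice as fast. pos runs from
# 0.0 (top line) to 1.0 (bottom line); results are lag added versus a native
# 60 Hz panel that draws each line as it arrives.
SRC = 1 / 60     # ~16.67 ms source frame time
SCAN = SRC / 2   # ~8.33 ms panel scan-out at double rate

def added_lag_ideal(pos):
    # start the scan as late as possible so it catches the incoming frame
    # at the bottom line: ~8.33 ms at the top, ~4.17 ms mid, ~0 at the bottom
    return (SRC - SCAN) * (1 - pos)

def added_lag_buffered(pos):
    # buffer the whole frame first, then scan at 2x: ~16.67 ms at the top,
    # ~12.5 ms mid, ~8.33 ms at the bottom
    return SRC * (1 - pos) + SCAN * pos

for pos, label in [(0.0, "top"), (0.5, "middle"), (1.0, "bottom")]:
    print(f"{label:>6}: ideal +{added_lag_ideal(pos) * 1e3:.2f} ms, "
          f"buffered +{added_lag_buffered(pos) * 1e3:.2f} ms")
```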
 
We are making guesses about this "240Hz 4k" on what is looking like it might be a 120Hz panel.

It really doesn't make that much difference to motion clarity on my 360Hz OLED whether I run the monitor at 360Hz or 60Hz when a game runs at 60fps.

I wasn't looking at it from an input lag perspective. While that info is valuable, I was trying to determine what kind of "240fpsHz" end result is being displayed - in regard to motion clarity and motion definition - when running "240fpsHz" 4k through to 8k on this screen (where a 4k 200fpsHz average or better is achievable in some games/settings).

. . If it's using frame doubling, we aren't getting any better motion definition, because no new, unique frames of action states/positions are being shown.


Frame doubling on a sample & hold display won't improve motion clarity. It can at most help LCD panels with their overdrive trickery. An LCD panel can also react positively to double scanning even without overdrive, but that is an LCD-specific effect; something close to perfect sample & hold (OLED? not quite, but much closer to perfect than any LCD...) should show no difference.

. . If they are using some sort of frame doubling, but that frame-doubling method isn't cutting the sample-and-hold blur down (i.e. increasing motion clarity), then "240fpsHz" sent to a 120Hz 8k panel would not be valuable to me. OLEDs still get sample-and-hold blur (as most of us know, and as you said), so even if Samsung were using doubling that brought response times nearer to an OLED's, running an actual 4k 120Hz OLED would be fine compared to this - and this panel would still be nowhere near closing the gap to a modern 240fpsHz-capable OLED, or to a true 240fpsHz 4k+-capable FALD like the G95NC 57" 4k+4k super-ultrawide.

Also, the previous-gen 900C 8k FALD screen from Samsung can do 4k 120Hz "natively" on its 120Hz panel (though it upscales it to 8k), and it can be had for much cheaper if you don't need the 900D's AI/machine-learning upscaling upgrade for movies and shows.

900C input lag (per RTings):

4k @ 60Hz @ 4:4:4 = 10.7 ms

4k @ 120Hz = 6.0 ms

8k @ 60Hz = 10.5 ms


900C response time (per RTings):

80% Response Time = 3.5 ms

100% Response Time = 7.5 ms
 
Unfortunately, I feel like the G95NC would be too short for my taste, considering the distance I would sit at to find it more usable. I agree a 120Hz-240Hz 8k version of the 55" 1000R Samsung Odyssey Ark could be great - like the G95NC x 2 but without the middle bezel. Even better if it could do non-native resolutions 1:1, letterboxed, at 240Hz when desired. Some of that is what I was hoping the 900D would bring in touting "240Hz 4k", but at this point it seems that may be smoke and mirrors. :dead:

"All Your Questions Answered 57" Samsung Odyssey G9Neo" (#PETEMATHESON)

View: https://www.youtube.com/watch?v=lDVmVZtxvrE

dotted line added by me:

[Image: 57" super-ultrawide mock-up with dotted line added]
 
Posting what Improwise said on avsforum... Make up my mind lol...

Plot thickens, as Rtings have now updated their answer, in a way that I interpret as the panel actually being capable of doing 240 hz at lower resolutions, not just accepting it as an input signal. Fingers crossed that they don't change their mind again :)





edit: RTings gave him this article link to reference - https://www.hdtvtest.co.uk/news/samsung-s-qnd-900-mini-led-supports-4-k-at-240-hz

According to that, DLG (Dual Line Gate) works by halving the vertical resolution, which up until now has meant a 1080p limit pushed on a 4k native screen. With 8k, they are suggesting the 900D can do the same thing with a 4k 240hz signal on an 8k 120hz panel.

I don't think it's 100% confirmed by anyone yet - just textual comments in articles and the Q&A at the bottom of the RTings 900D review page, where improwise has been doing all of the legwork (thanks btw).
 

This seems to be a never-ending story, going back and forth :)

As you've probably seen, my latest comment to the Rtings team asked them to confirm that they themselves have actually tested and verified the panel's 240 hz output. They say that they think the panel is the same as in the QN900C (which seems reasonable, since all the metrics are the same). That should mean the only thing that has actually changed is "processing power", which should then somehow make the panel 240 hz capable - not just let the TV accept and process a 4K@240hz input signal. I have my doubts, to be honest...

Especially considering Samsung's past promises about HDR4000, 4000-nit brightness, etc., I would really like someone to confirm they have actually verified this themselves, not just refer to someone else's vague statements.

It is also interesting how Samsung previously made the jump from 4K@120 hz to 4K@144 hz, still using the same panel, considering the Dual Line Gate tech explanation.
 
As I understand it, this DLG tech actually only processes half the supported resolution (at least half of the lines) to achieve its "fake" refresh rate. But then that fake 4K@240hz is upscaled to 8K on a 120 hz panel, as you can't disable scaling on the Samsung 8K models. Or am I missing something here?
 
It's not scaled up; it scans out one line twice (at once), then the next line twice, all the way down. This effectively retains the full image height while halving the scan time (and the vertical resolution). IOW, the panel's 4320 (or so) lines turn into 2160 fat lines.

Assuming that article on DLG is accurate, anyway.
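Assuming the article's description is right, the mechanism is easy to picture - a toy model (the pairing logic is an illustration based on the article, not a confirmed Samsung implementation):

```python
# Each 4k source line drives a pair of adjacent 8k panel rows at once, so the
# panel keeps full image height while halving vertical detail and scan time.
def dlg_scanout(source_lines):
    panel_rows = []
    for line in source_lines:            # one gate pulse latches two rows together
        panel_rows.extend([line, line])  # the "fat line": same data on both rows
    return panel_rows

frame_4k = [f"line {i}" for i in range(2160)]  # 4k source frame
frame_8k = dlg_scanout(frame_4k)
assert len(frame_8k) == 4320                   # full 8k panel height, half the detail
```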
 
I think you misunderstood. There is no 1:1 pixel mapping on Samsung's 8K models; they will always scale every 16:9 resolution to full screen / native resolution. That is on top of the DLG thing, I would imagine, as it seems to happen for all 16:9 resolutions regardless of refresh rate, etc.

That is probably my main frustration with them, including my QN900C. If 1:1 mapping were there, you could basically have a virtual 32" 4K monitor in the center of the screen.
 
Yeah, I was thinking 8k240 for some reason
 
Some manufacturers have halved the vertical resolution on some screens in the past, which we discussed as a possibility in this thread on several occasions . . . but that's not necessarily a bad thing, because on an 8k screen that is still a ton of detail, even cut to 4k horizontally + vertically.

. . The 900D's 240Hz mode, when fed high enough fps, is supposedly getting 240fpsHz gains in motion clarity (blur reduction, especially for FoV movement blurring the entire screen) - so arguably somewhat less detail is lost in dynamic gaming like FoV movement at speed: 4x less smeary than a 60fpsHz average, 2x less blur than 120fpsHz, when pushing a 200fps average or more.

. . Plus supposedly 240fpsHz gains in motion definition (more dots per dotted-line curve ~ pathing articulation; more unique action/animation-state pages in an animation flip book that is flipping faster).

. . Appreciably low input lag, and more frames letting you see action updates twice as often as at 120fpsHz (*not really so in online game server mechanics*) and giving you twice-as-frequent windows to act, redirect, etc. (*not necessarily how it pans out in online game server mechanics either*).


I don't know that we'd know for sure the 900D's 240Hz 4k mode can deliver that until someone like RTings, HDTVTest, TFTCentral, etc. records it with a very high-shutter-speed camera to determine whether, for all practical purposes, every one of the 240 frames is being drawn - and drawn with a new, unique action state/movement position/travel coordinate for whatever is moving on the screen.


Some earlier mentions I made in this thread of halving the vertical rez :

.
Samsung Motion Rate 120, Sony MotionFlow 960, LG TruMotion 240, etc.


Now Samsung "Motion Xcelerator 240Hz" (Xcelerator was not a typo lol).

From Samsung's 900D product page: "⁴240Hz is limited to 4K resolution and requires compatible content connection from compatible PCs. Motion Xcelerator 240Hz is sometimes called Motion Xcelerator Turbo 8K Pro."

Though it's named that for the 144Hz mode on the 900C too, and that is supposedly native 120Hz/144Hz. The same kind of naming convention was used for 1080p/1440p 120Hz capability on 4k 60Hz TVs in the past, also.

https://www.tomsguide.com/reviews/t...24-hands-on-review-brighter-bigger-and-better
.
The TCL M8 screens are native 120Hz. They use a game-accelerator mode that cuts the vertical rez in half to hit 240Hz.

.
From someone's reddit reply, which I had in the previous quote:
Apparently it doesn't cut the resolution in half, it just cuts the *vertical* resolution in half. So 3840*2160 becomes a very weird 3840*1080.

TCL have implemented this "motion accelerator" on a few of their native 120hz/144hz panels, too (specifically, I'm looking at the TCL 65C745K, which might be EU/UK-exclusive - I know I had trouble finding any retailers carrying 120hz TCL models widely available in America over here when I was making notes of what was available some time last year. This one does 120hz, 144hz and this weird 3840*1080@240hz).

.
 
My main concern is that the 240 hz effort seems mostly focused on "yet another sticker" rather than actual improvements. That said, it is probably hard to blame Samsung, considering that 8K is such a small market to begin with, and most buyers probably would not even notice a difference between real 120 hz and 240 hz (and honestly, neither might we in many cases). And then of course there is the matter of few LCDs really being able to keep up with 240 hz anyway. I would imagine most gaming done on these TVs is console-based, as normal people don't tend to put them on desks, and AFAIK 240 hz gaming is not even a thing on consoles.

I am kind of glad in some way, as now at least I won't have the urge to upgrade from my QN900C to the QN900D :)
 
Even if there is some trickery involved in driving the panel in the QN900D's 4K@240Hz mode, it should not matter much as long as there is an input lag reduction and better-than-120fps motion clarity. It is not like a 4K panel faking 240Hz with 120Hz timings; it's more a "we don't know how Samsung achieved it, but there are ways to achieve a higher refresh rate without actually driving the panel faster" kind of situation, where tricks that reduce resolution are used on a panel with a higher resolution. Faking 4K@240Hz on a 4K panel would certainly need some compromises, but on 8K? Still compromises, but probably less visible - also less visible because it is already a quite blurry VA panel and not an OLED. And it is not certain they had to use any tricks at all. Displays are an LCD matrix + electronics on the panel itself + electronics driving those electronics. We have no idea if the panels in the QN900C could support 240Hz or not, and we don't know if Samsung changed something about the panel's internal electronics. Maybe they didn't feel the need to change/improve the LCD tech itself, but used better electronics across the board? For now that is as likely as the accusations of interlacing trickery or whatever this DLG tech is.
 
There are no input lag improvements according to the Rtings review, though. Some values are a bit better, others a bit worse compared to the QN900C - all within the margin of error, I would believe. As Rtings wrote themselves, "We’re not totally sure why the input lag is a bit higher than the QN900C, but it’s very possible that Samsung added some extra processing to the TV that drives those numbers up a bit."

So it seems to come down to whether there is an actual improvement in motion clarity, which there would of course be with a true 240 hz panel (whether people could notice it is another question, though).
 
I was specifically referring to 4K@240Hz vs 4K@120Hz, which is 5.1ms vs 8.2ms.
Otherwise yes, you are right that the QN900D is a step down when it comes to input lag. 6ms for 4K@120Hz is quite a bit better than 8.2ms - maybe not terrible, but still a tiny bit worse on the new model, which is a shame.

----
For reference, my 48GQ900 measures 4.9ms for 4K@120Hz - kinda why I got a gaming monitor over a TV - even though it's a minuscule difference compared to e.g. 5.9ms for the LG C2 at the time, or 5.5ms for the C3. I get the same 4.9ms input lag on the LG27GP900, and I guess that's about as low as you can get with the way these figures are measured. I'm not even sure that is the correct way to measure input lag. RTINGS would need to measure a CRT with their methods: if the result is anything other than close to 0ms, then they apparently do not measure the time between the actual signal and when it is displayed, but rather some kind of average time between when a frame is generated and when it is displayed - "average" here being the middle of the screen.

I assume these monitors I use actually have zero lag, not ~5ms, and that these measurements just capture the time it takes the virtual electron beam to reach the middle of the screen.
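A quick back-of-the-envelope for that hunch: if lag is measured to the middle of the screen, even a zero-processing-lag display shows half a refresh period of scan-out delay.

```python
# inherent mid-screen scan-out "lag" for a display with zero processing delay
for hz in (60, 120, 240):
    print(f"{hz:>3} Hz: ~{1000 / hz / 2:.2f} ms at mid-screen")
# 120 Hz -> ~4.17 ms, in the same ballpark as the 4.9 ms figures quoted above
```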

----
BTW, with that in mind, IMHO a 240Hz monitor with say 10ms input lag vs a 120Hz monitor that also measures 10ms should still give you LESS input lag in 240Hz mode in an actual game. This is due to how rendering works - queues on the GPU and other things related to pushing frames. Less so if you don't quite hit 120fps and run VRR, but the moment you hit 120fps and run v-synced, you are much worse off overall latency-wise than if you ran the same game on a 240Hz screen, even with a frame rate limiter at 120fps - let alone allowing yourself to hit >120fps.

That said, with the measurements at hand - even if Samsung implemented this 240Hz without any tricks - given the increase in latency in the 60 and 120Hz modes and seemingly zero improvement in the panel tech otherwise, I would hardly call it an upgrade, and I would hardly say it makes sense to get the newer model over yesteryear's bargain-priced model. If the price were the same, then... for PC gaming specifically I would say the newer model with 240Hz should be an improvement. For console gaming, of course, it is not.
 
I was specifically referring to 4K@240Hz vs 4K@120Hz, which is 5.1ms vs 8.2ms.
This is actually in itself quite weird, as the 900C at 4K@120hz has much lower input lag than the 900D. In fact, the 900C at 4K@120hz is almost as fast as the 900D at 4K@240hz, which is kind of what got me started on this quest to find out what's really going on.

Otherwise yes, you are right that the QN900D is a step down when it comes to input lag. 6ms for 4K@120Hz is quite a bit better than 8.2ms - maybe not terrible, but still a tiny bit worse on the new model, which is a shame.

I would imagine there is a margin of error in these numbers, so I'm not sure we can trust them exactly, but I find it really weird that input lag overall seems to be slightly worse, or at least not better.

That said, with the measurements at hand - even if Samsung implemented this 240Hz without any tricks - given the increase in latency in the 60 and 120Hz modes and seemingly zero improvement in the panel tech otherwise, I would hardly call it an upgrade, and I would hardly say it makes sense to get the newer model over yesteryear's bargain-priced model. If the price were the same, then... for PC gaming specifically I would say the newer model with 240Hz should be an improvement. For console gaming, of course, it is not.
This is my takeaway as well: marginal improvements according to the numbers, it seems - especially at 120 hz, which seems to be worse and would then be a downgrade for console users.
 
BTW, with that in mind, IMHO a 240Hz monitor with say 10ms input lag vs a 120Hz monitor that also measures 10ms should still give you LESS input lag in 240Hz mode in an actual game. This is due to how rendering works - queues on the GPU and other things related to pushing frames. Less so if you don't quite hit 120fps and run VRR.


According to RTings:

Input Lag 900C at 60fpsHz = 10.1ms
Input Lag 900C at 120fpsHz = 6ms

Input Lag 900D at 60fpsHz = 11.7ms
Input Lag 900D at 120fpsHz = 8.3ms
Input Lag 900D at 240fpsHz = 5.1 ms

Yes, if you aren't running 200fpsHz+ then it's not a big upgrade, and it actually has 2ms or so more input lag. Still well under 10ms though, plus 8k desktop/apps, 4k-upscaled-to-8k and native 8k media, and 240Hz 4k gaming capability. At 200fpsHz+ it would probably be 6 to 5ms, depending on how high your fps was. So overall there are tradeoffs, but I would consider it an upgrade. Personally, I wouldn't now buy any screen I was gaming on that didn't do 240Hz, for the increased motion clarity/blur reduction (even considering VA/FALD models' slower response) and, importantly, the increased motion definition (showing more unique frames of action states; more defined animation; more fluidity). 120Hz isn't an upgrade to me.

BTW, with that in mind, IMHO a 240Hz monitor with say 10ms input lag vs a 120Hz monitor that also measures 10ms should still give you LESS input lag in 240Hz mode in an actual game. This is due to how rendering works - queues on the GPU and other things related to pushing frames. Less so if you don't quite hit 120fps and run VRR, but the moment you hit 120fps and run v-synced, you are much worse off overall latency-wise than if you ran the same game on a 240Hz screen, even with a frame rate limiter at 120fps - let alone allowing yourself to hit >120fps.

You can cap your frame rate a few fps below the peak rate of your monitor so you don't trigger v-sync, like you said - so v-sync shouldn't be an issue, since you can usually set caps on a per-game basis.

Your overall response to game action, from an outside observer's standpoint, should be faster when you are actually seeing newer action states of the game rather than seeing half that frame rate run at a higher Hz - for example, 240fps vs 120fps on a 240Hz-capable screen. You should be seeing new action states 2x sooner when running 240fpsHz on said screen. (Mental forecasting, intuition, team dynamics, and communication set aside for the moment:) you can't react to what you haven't seen yet, so you are already ahead time-wise versus yourself in the same scenario seeing half the frame rate. I do understand that a higher redraw rate means 2x as many windows for your actions to register, though. That's why the low-latency game modes on some TVs show lower input lag when they double 60fps (e.g. consoles) to 120fpsHz on 120Hz gaming TVs. They do not show any new, more recent motion states to react to, though.

If you are using VRR, the screen's Hz is lower when your fps is lower. Most people aren't running without VRR just to get somewhat lower input lag (with the screen always running at max Hz), since that causes stutter/judder and visible de-sync issues. I'm sure some few do run their screen at max Hz without VRR, but I don't see why you would bother unless your fps lows and minimums exceed the Hz of the screen by a considerable margin. If you are running CS:GO or L4D2 or Quake or something like those, the frame rate would probably massively exceed the Hz of a 120Hz or 240Hz screen; in that case running without VRR would make sense, but you wouldn't be at 120fps on a 240Hz screen in that kind of scenario.


online gaming, input lag
----------------------------------

Keep in mind that, with the way online gaming servers work, you aren't actually seeing things on your screen where the server determines they "actually" are. Online gaming buffers frames and queues them, for starters. Your local game is also predicting frames for you when it has to wait on server states (from the slower server + online machinery), and the server is delivering "corrected", biased, interpolated action states from its end - at its own, lower, tick rate (they are all low, but some are especially low). So you are responding to predicted and interpolated visual stimuli on your screen from both ends, at a lower tick rate of action, with a gamer's human response time of 150ms to 180ms, aiming at something that isn't exactly where your local game predicted it or where the server determines it was when you saw it. Plus, if you use (DLSS +) frame gen, frame gen is also buffering frames, as well as showing you more generated/predicted in-between frames that aren't 1:1 with the server - on top of the server's interpolated result frames and your local game's predicted frames.

Splitting hairs over already reasonably low input lag is more meaningful in local and LAN gaming. Screens are marketed as if input lag has a 1:1 relationship with online gaming, but that is false. I find it kind of ironic that the most zealous high-Hz, low-input-lag people - from a "gaming advantage" perspective - are probably playing online games with sloppy locality and predicted, interpolated, revised actions and results, rather than in tight LAN competitions or local games where it would actually matter. Most of the input lag testing you see online that shows any appreciable difference in accuracy or scoring is done locally and/or vs bots. In online games, unless the input lag were appreciably bad, I think a few ms would be lost in the shuffle.

Online gaming also uses buffered frames and speculative prediction (around 2 frames on the server and 3 frames on the client in the case of Valorant), has queuing and tick rates in its simulation of "real time", and delivers biased results based on the flavor of the netcode decisions made by the developer.

The highest-tick servers are 128 tick (128Hz, 7.8ms per tick), but -

"Frames of movement data are buffered at tick-granularity. Moves may arrive mid-frame and need to wait up to a full tick to be queued or processed."

"Processed moves may take an additional frame to render on the client."

If you are running higher fpsHz minimums than the tick rate of the server - e.g. well over 128fpsHz in Valorant (I'm guessing probably something like a 180 or 200fpsHz average to be safe) - you will lower how far out of sync you are from the server, but it's still a **minimum of 72ms of "peeker's advantage" on 128-tick servers** according to the Valorant networking article quoted above. The size of the rubber band/gap, and thus the "peeker's advantage", for 60fpsHz players on Valorant's 128-tick servers is ~100 ms. Lower-tick servers, like 60 tick, would be even worse; it's hard to believe some servers are still running ticks as low as the 20s. And some games' netcode might not be as optimized on top of that.
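To put those quoted netcode figures next to display-side milliseconds, a small sketch (the tick rate and buffer counts are from the quotes above; the 200fps client frame rate is an assumption):

```python
# 128-tick interval plus the quoted buffering (2 server ticks, 3 client frames)
TICK_HZ = 128
CLIENT_FPS = 200                          # assumed client frame rate

tick_ms = 1000 / TICK_HZ                  # ~7.8 ms per server simulation step
server_buf_ms = 2 * tick_ms               # ~2 ticks buffered server-side
client_buf_ms = 3 * 1000 / CLIENT_FPS     # ~3 frames buffered client-side

print(f"tick interval: {tick_ms:.1f} ms")
print(f"buffering alone: ~{server_buf_ms + client_buf_ms:.0f} ms, "
      "before network RTT and the ~72 ms minimum peeker's advantage")
```

That's roughly 30ms of buffering before the network even gets involved, which is why a couple of ms of display lag gets lost in the shuffle online.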


There is a lot more to it, but your local input lag has to go through a lot more machinery. What you see is not what you get in online gaming - so while low input lag is nice, it's not processed 1:1, and what you think you are seeing at any given time to act on isn't even what the server considers the actual state, as opposed to local gaming and LAN gaming/competition.




. . . . .
 
You can cap your frame rate a few fps below the peak rate of your monitor so you don't trigger v-sync, like you said - so v-sync shouldn't be an issue, since you can usually set caps on a per-game basis.
I did some tests, and for me 240Hz is where I stop being able to tell the difference between V-Sync and staying within the VRR range. Maybe if I spent more time training myself in that art of mental illness that is being bothered by input lag... but for now I find 200+ frames per second a sweet spot. Higher is better, but I might just as well focus on more details, because my lazy eyes don't want to track objects that are too fast.

With a 360Hz monitor I am pretty sure I shouldn't have to worry about such things anymore. That said, I remember how big a difference limiting fps below v-sync made to input lag versus how little motion blur it added, so I still do it even if I can't tell the difference. And my position is that I'd still recommend doing it even on the new fancy 480Hz monitors.

Your overall response to game action, from an outside observer's standpoint, should be faster when you are actually seeing newer action states of the game rather than seeing half that frame rate run at a higher Hz - for example, 240fps vs 120fps on a 240Hz-capable screen. You should be seeing new action states 2x sooner when running 240fpsHz on said screen. (Mental forecasting, intuition, team dynamics, and communication set aside for the moment:) you can't react to what you haven't seen yet, so you are already ahead time-wise versus yourself in the same scenario seeing half the frame rate. I do understand that a higher redraw rate means 2x as many windows for your actions to register, though. That's why the low-latency game modes on some TVs show lower input lag when they double 60fps (e.g. consoles) to 120fpsHz on 120Hz gaming TVs. They do not show any new, more recent motion states to react to, though.
Ideally, displays just refresh pixels as they receive them - just like CRTs did, and how most monitors actually do it.
If you take a 120Hz input and draw it at 240Hz, you get on average 2ms more input lag.
If you did the same with a panel running at 60000Hz, you would need to start drawing the frame after at least 16.5ms of the 16.66ms frame time. In that case the average added input lag would be in the 8ms range - even bigger than with just 240Hz. At 120Hz you get zero additional input lag no matter how you calculate/measure it.

Looking at the QN900C and QN900D RTINGS reviews, it looks like there is 2ms more input lag - maybe Samsung really does run the panels at 240Hz in the new model?
Not really sure, given it's 2ms on average across all refresh rates.

To confirm this theory we would need to do one of the following:
1. Use a high-frame-rate camera
2. Have all the data behind the input lag measurements and draw conclusions from that
3. On both displays, move a vertical line horizontally and assess how much it bends - or just drag a window around; if it is much less bendy on the new model, it draws at a faster rate.

Fun thing: this last method quickly lets you assess how a display draws its image to the screen. E.g. the Pioneer Kuro draws its image from the bottom to the top, and does it at a much faster rate than 16ms. A Panasonic plasma I had showed a strange bend from the middle out to the sides, suggesting it didn't draw from top to bottom or even bottom to top, but from the middle outward - which, for input lag in games, was actually the best way they could possibly do it!

Almost all of my other panels draw the image just like a CRT, with an obvious bend at 60Hz.
All except the MSI MAG 271QPX, which has a suspiciously non-60Hz bend at 60fps. It is hard to estimate given how small the bend is, but I would say it looks the same at 60Hz and 120Hz, and slightly better (less bend) at 240Hz. Then there is the elevated input lag at 60Hz, which suggests the same thing: the observed input lag is what running the panel at 120Hz would show. Maybe this is some kind of VRR flicker prevention - this panel does show some VRR flicker when switching between very different frame rates, though in games I have yet to see such an issue outside loading screens. So who knows, maybe MSI figured they'd just draw frames twice at lower refresh rates to improve VRR flicker.

All in all, for the lowest input lag we need zero buffering, and the only way to avoid memory buffers is to consume the video stream directly by throwing it straight at the pixels.

If you are using VRR, the screen's Hz is lower when your fps is lower. Most people aren't running without VRR just to get somewhat lower input lag (with the screen always running at max Hz), since that causes stutter/judder and visible de-sync issues. I'm sure some few do run their screen at max Hz without VRR, but I don't see why you would bother unless your fps lows and minimums exceed the Hz of the screen by a considerable margin. If you are running CS:GO or L4D2 or Quake or something like those, the frame rate would probably massively exceed the Hz of a 120Hz or 240Hz screen; in that case running without VRR would make sense, but you wouldn't be at 120fps on a 240Hz screen in that kind of scenario.
I would say just get a fast monitor, like 360Hz or even 480Hz (there is a 1440p 480Hz WOLED monitor), and use VRR.
Disabling V-Sync means input lag can be lower, but it isn't consistent in the middle of the screen.
Then again, when I played FPS games on fixed-refresh-rate screens, I did disable V-Sync. By the same logic, at 360/480Hz it should still give a slightly better experience without V-Sync... then again, the differences are minuscule and well past the point where latency is irritating, and stuttering and tearing, even reduced, are more noticeable than the difference in input lag between VRR and V-Sync off. In fact, scanline sync made running an unlimited frame rate useless for me.

And at the time I did test exactly this: running a game at 60Hz/60fps with scanline sync versus overclocking the monitor to 64.5Hz and running it at 60fps with VRR. Other than slight tearing with scanline sync - which could move into the visible area as frame times got longer, which wasn't an issue with VRR - I didn't see much difference in input lag between the two cases. Of course VRR is superior and should be used when available. Running >60fps with tearing didn't feel as much faster as it felt more inconsistent; with frame rates far in excess, like 200+ fps, it would be faster, and in that case it made some sense. Then again, would I recommend bothering? Nope - just get a 360/480/etc. Hz OLED or something like that and get both buttery-smooth motion and good input lag.

online gaming, input lag

Yes and no. I am not that experienced in online gaming, but from what I have played, the algorithm games seem to have settled on is: if you see yourself shooting someone down, then no matter where they are, they will die - unless the server decides you died first.
This can lead to unrealistic situations where you see that you managed to take cover but still die, and also situations where you kill a laggy player who obviously isn't where they appear to be, yet they still get fragged where you see them. For this reason it is still important not to have any additional lag in your rendering and display pipeline.

It was somewhat different in late-90s and some early-00s games, where you could, for example, see a visible delay on your own shots, and a lower ping meant less delay. It was still best to have less lag then, so you didn't have to compensate for two kinds of lag, but you still had to compensate for your ping yourself. Ah, good late-90s FPS games - you really needed LAN for a good experience there, and without a dedicated server we always made the worst player host the game, since everyone else then played with a slight handicap :)
 