166Hz Has Ruined 60 fps Forever

I never used to think high refresh rates made much of a difference, but in more recent years I have struggled with eye fatigue even with good glasses. I noticed that after switching to 144Hz and 120Hz monitors I no longer have the issue. I am a productivity user, so not really gaming at all.

I find that it makes a difference on the desktop; I wouldn't mind pushing to 1000Hz for everything, supposing the panel can keep up.
 
Mouse feel is one thing, but visually 60fps and 144fps are about the same with good motion blur... I just had the AD27QD for testing and was surprised how well it played on my friend's 60Hz monitor with motion blur.
Motion blur really is the answer here... at least for looks. Feel is a different topic.
 
I disagree on both sample-and-hold blur reduction and motion definition. The higher your framerate + Hz, the better. We won't get to 1ms CRT-level "zero" blur until we get high-end motion interpolation, something like 100fps x 10 interpolated on a 1000Hz monitor. Unless you just want to say that adding motion blur in a game's graphics settings, so that all fps/Hz combinations smear equally, is just as good? Or maybe you meant good overdrive? That only makes up for a monitor's poor response times; you are still limited by sample-and-hold blur and lower motion definition, as outlined below.

-------------------------------------------
https://www.blurbusters.com/blur-bu...000hz-displays-with-blurfree-sample-and-hold/

-------------------------------------------

240Hz means 4.2ms frames, so the monitor's response time would have to be really fast.
Some IPS panels probably aren't fast enough overall, but they could be close.
Modern gaming VA with the best overdrive (LG's 32" models) is tight to about 120fps/Hz, which is 8.3ms.

That doesn't mean you can't use monitors beyond their response-time limitations; it just means you'll get some ghosting and/or black smearing/trailing, since high-contrast transitions are the slowest and the quoted response times are averages, not extremes.
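As a rough sanity check (the GtG figures below are made-up placeholders, not real panel specs), the per-refresh budget is just 1000 / Hz, which you can compare against a panel's quoted average response time:

```python
# Per-refresh time budget vs. a panel's quoted average response time.
# The GtG figures used here are invented placeholders, not real specs.

def frame_time_ms(refresh_hz):
    """Time budget for one refresh cycle, in milliseconds."""
    return 1000.0 / refresh_hz

def keeps_up(refresh_hz, avg_gtg_ms):
    """Rough test: the average pixel transition should finish within one
    refresh. Even then, the slowest high-contrast transitions may still ghost."""
    return avg_gtg_ms <= frame_time_ms(refresh_hz)

for hz, gtg in [(120, 8.0), (240, 5.0), (240, 3.5)]:
    verdict = "OK" if keeps_up(hz, gtg) else "will smear on slow transitions"
    print(f"{hz}Hz = {frame_time_ms(hz):.2f}ms budget, {gtg}ms avg GtG: {verdict}")
```

120Hz gives an 8.33ms budget, which is why an ~8ms VA panel is "tight" at 120fps/Hz as described above.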

OLED has very fast response times but is just starting to get to the point where 120Hz 4K with variable refresh rate will be possible, and that is still only on one yet-to-be-released monitor with DisplayPort, with 4:4:4 chroma limitations due to bandwidth until we actually get HDMI 2.1 GPUs to feed them over HDMI at those rates. Any other HDMI 2.1 120Hz 4K displays released will have no HDMI 2.1 GPUs to drive them for a while yet.

Also keep in mind that you could have a 1000Hz monitor and still get essentially the same results as on a lower-Hz monitor unless your actual frame rates exceed that lower monitor's ceiling. Running 120fps average (usually roughly a 90 - 120 - 150 graph) on a 240Hz monitor isn't really getting anything out of more than 144Hz of capability. There are easier-to-render games that can average over 240fps, of course. How demanding the resolution is also matters.

So, the appreciable gains of your refresh rate (Hz) are only as good as your frame rate range.

However, I'm still looking forward to higher Hz ceilings on the road to 1000Hz, but at some point we'll almost certainly need some very high-quality interpolation tech to fill them (100fps x 10 interpolated, for example).


 
It amazes me that people think they can perceive so many fps. You can't; it is biologically impossible.
 
You no doubt have evidence?
In large part due to Bloch's Law

Different parts of the eye have different response speeds. The corner of your eye doesn't see color, but is fast; the center sees color, and is slower. This means that when you look at a 60 Hz monitor straight-on, the image is perfectly steady; but when you look at it from the corner of your eye, it is flickering. As you go to even higher refresh frequencies, even the rods don't respond fast enough.

This makes sense from an evolutionary perspective. When the saber-toothed tiger jumps at you, you need to know about it - quickly. You don't need to know its color. So using the faster rods (sensitive, fast, no color sense) in the edge of the field of view is a good survival strategy. But since we can't move very far in 1/100th of a second, there is no need for sensors that respond at that speed.

The difference is real, and can be perceived. In the corner of your eye, for most people.

Incidentally, the rendering of fast motion is helped by the higher frame rate; if you show a bright object against a dark background moving left-to-right across the screen in 1/30th of a second, the brain will notice the difference between "two images comprise the full motion" and "four images comprise the full motion", even if you don't really perceive the individual frames. You will see a smoother action when more frames make up the motion: after all, in real life you really see "infinitely many frames" even though they blur together.
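To put rough numbers on that (screen width and crossing time are my own illustration, not from the post above): an object crossing a 1920px-wide screen in 1/30th of a second moves at about 57,600 px/s, and each doubling of the frame rate halves the jump between successive images:

```python
# Jump distance between successive frames of a fast-moving object.
# Screen width and crossing time are illustrative assumptions.

SCREEN_PX = 1920
CROSS_TIME_S = 1 / 30                  # crosses the screen in 1/30th of a second
SPEED_PX_S = SCREEN_PX / CROSS_TIME_S  # ~57,600 px/s

for fps in (60, 120, 240):
    frames_shown = round(CROSS_TIME_S * fps)  # images comprising the motion
    step_px = SPEED_PX_S / fps                # gap between consecutive images
    print(f"{fps} fps: {frames_shown} images, {step_px:.0f} px apart")
```

The brain reads the smaller steps as smoother motion even when the individual frames aren't consciously perceived.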

I don't understand why people here are so rude and childish.
 
I thought this was a decent article: https://www.pcgamer.com/how-many-frames-per-second-can-the-human-eye-really-see/

But I still disagree with some of the findings. I can very clearly, night and day, vanilla and chocolate, tell the difference (and feel a significant improvement in gameplay) going from 60Hz to 120Hz.

And I've also seen smaller differences when I upgraded from 120Hz to 144Hz and, most recently, to 166Hz. Anyone who says otherwise has clearly never played on a high-end high-refresh monitor on a proper system.
 
https://www.blurbusters.com/article-on-why-some-oleds-have-motion-blur/

"Even instant pixel response (0 ms) can have lots of motion blur because of sample-and-hold. Frames continuously shine for a whole refresh. When tracking moving objects, your eyes are in different positions throughout a whole refresh; this causes the frame to be blurred across your retinas."


------------------------------------


https://www.testufo.com/persistence
"
Did you know? Blur Busters is the world's first site to test a genuine 480 Hz display. This animation demonstrates the limitations of even 120 Hz and 240 Hz displays. Using this test, it is easy to tell apart 120Hz, 240Hz and 480Hz. You need 480Hz+ to easily read the street name labels in the Panning Map Through Slits animation at 960 pixels per second. Laboratory displays (1000Hz+) have already confirmed it is still beneficial to keep increasing Hertz to quadruple digits.

This effect also applies to random holes instead of vertical lines, so this also applies to gaming in dense foliage such as jungles & forests, moving sideways behind lattice fences, and tilting your head to scan behind door cracks. The faster the pixels modulate (120Hz, 240Hz, 480Hz), the more detailed persistence of vision becomes. Good full readability (zero stroboscopic effect & no motion blur) with rapid occlusion effects can require quadruple-digit display/VR refresh rates to match real life in this test.

Spinning LED clocks, LED bike wheel effects, and old mechanical TVs (Nipkow wheels) use the same persistence-of-vision technique (very high Hz for individual flickering light sources). Higher Hz increases the resolution of persistence-of-vision effects."

https://www.blurbusters.com/blur-bu...000hz-displays-with-blurfree-sample-and-hold/

"1ms MPRT requires frame visibility times of 1ms, which can only be achieved via:

(A) 1000 frames per second on a 1000 Hertz display

OR

(B) A motion blur reduction mode (e.g. ULMB) that flashes the frame for only 1ms.

So, yes, for today's monitors, 1ms MPRT = motion blur reduction strobe backlight.

Source: Blur Busters Law And The Amazing Journey To Future 1000Hz Monitors

In current modern writings now -- milliseconds "MPRT" and milliseconds "persistence" are the same thing. So when you see "1ms persistence", that's also the same thing as "1ms MPRT" too."

also...

"There is a small latency (reduced at higher Hz) from the time period refreshing the LCD in darkness until the backlight is flashed, typically averaging half a refresh cycle of latency. Faster response + higher Hz panels can reduce the strobe lag penalty."
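Taking that quoted "half a refresh cycle" at face value (an average, not a measured figure for any specific monitor), the strobe-lag penalty shrinks in direct proportion to refresh rate:

```python
def avg_strobe_lag_ms(refresh_hz):
    """Average strobe lag, assuming the backlight flash lands about half a
    refresh cycle after the panel refreshes in darkness (per the quote)."""
    return 0.5 * 1000.0 / refresh_hz

for hz in (60, 120, 240):
    print(f"{hz}Hz strobed: ~{avg_strobe_lag_ms(hz):.1f}ms average added lag")
```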


-------------------

So there are huge benefits to blur reduction until we get up to 1000fps x 1000Hz's 1 pixel of persistence, due to the nature of sample-and-hold blur / MPRT; most likely that will require very high-quality motion interpolation, something like 100fps x 10 interpolated = 1000fps at 1000Hz, for 1-pixel-MPRT, CRT-level "zero blur".
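The MPRT arithmetic behind that works out as follows (a sketch of the sample-and-hold relationship; the 960 px/s tracking speed matches the Blur Busters test above, and the sketch assumes perfect eye tracking and instant pixel response):

```python
# Sample-and-hold blur: with perfect eye tracking and 0ms pixel response,
# the blur trail width equals tracking speed times frame visibility time.

def persistence_ms(refresh_hz, strobe_ms=None):
    """Frame visibility time: a full refresh on sample-and-hold, or the
    flash length when a strobe/ULMB-style mode is active."""
    return strobe_ms if strobe_ms is not None else 1000.0 / refresh_hz

def blur_px(speed_px_per_s, refresh_hz, strobe_ms=None):
    """Perceived blur width in pixels for an eye-tracked moving object."""
    return speed_px_per_s * persistence_ms(refresh_hz, strobe_ms) / 1000.0

for hz in (60, 120, 240, 1000):
    print(f"{hz:>4}Hz sample-and-hold: {blur_px(960, hz):5.1f}px blur at 960px/s")
print(f"  60Hz + 1ms strobe:  {blur_px(960, 60, strobe_ms=1.0):5.1f}px blur")
```

This is why a 1ms strobe at 60Hz and 1000fps on a 1000Hz display both land at roughly the same ~1px of persistence blur.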

In addition to blur reduction, there is also a huge increase in motion definition, most obvious with a desktop's mouse and window movement, since the desktop puts little demand on the video card and so feeds the maximum possible fps to a high-Hz monitor consistently. That kind of motion definition and pathing also increases in games when high frame rates are fed to a high-Hz monitor. In 1st- and 3rd-person games you are continually moving the viewport at speed, so not only are individual virtual objects moving on the screen, but you are moving the whole game world relative to you with mouse-looking and movement keying.

Also, keep in mind that when people say they are getting X fps, they are talking about average fps, which usually graphs as a variable range of roughly plus/minus 30fps in either direction. So in order to see a true 144Hz, 240Hz, or 480Hz, you have to fill every Hz with a new frame, and that means 120fps average, 240fps average, or 480fps average is not enough - and obviously neither is running 60fps or less on a high-Hz monitor.
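As a quick sketch of that point (the frame-rate samples are invented for illustration): whether every refresh gets a new frame depends on the minimum of the fps graph, not the average:

```python
# Whether a fluctuating frame rate actually fills every refresh cycle.
# The sample frame rates below are invented for illustration.

fps_samples = [90, 105, 120, 135, 150]  # a "120fps average" style range

avg_fps = sum(fps_samples) / len(fps_samples)
min_fps = min(fps_samples)
print(f"average: {avg_fps:.0f}fps, minimum: {min_fps}fps")

# Every Hz gets a new frame only if the minimum stays at or above it:
for hz in (120, 144, 240):
    print(f"{hz}Hz fully fed: {min_fps >= hz}")
```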
 
It kinda seems like you're arguing against yourself for this entire post. While you claim initially that "such high framerates" can't be perceived due to biological limitations, in this post you go on to say that higher framerates are translated more naturally by the brain as 'motion' instead of 'distinct images shown quickly in sequence' - and that's kinda the whole point of high refresh rate monitors.

As a side note, strolling into a thread about high refresh rate monitors and implying that anyone who owns one is a rube with more money than sense isn't exactly a civil or mature "hello", either - I'm sticking with my laughter. It was actually among the more polite ways I felt inclined to respond. ;)
 

I'm sorry you feel that way even though no one said nor implied anything of the sort, and technically I'm not wrong. And you trying to elicit a response from me over nothing is indeed immature, but I do thank you for your "civility".
 
So I ended up getting a 240Hz monitor for a spare rig, just mostly cause I wanted to see what it would look like.

Well, it is smoother, but not anything amazing after using 166Hz on my main rig for a few months. It's nice, and I can definitely notice the difference, but I don't think it's at all needed (even for a high-refresh fan like me).

My opinion is that 144Hz is still great, and you have many more options in panel tech, resolution, aspect ratio, and many other important things besides the highest refresh rate.

Maybe if you are a pro Overwatch player or something, then 240Hz is useful. For me, I like it, but the other sacrifices make it not a good call (especially if you want to use this as a main monitor).
 
I have issues seeing much difference above 85hz. But I don't play any shooters or MP twitch games so maybe that's why I'm not seeing the benefit.

I have a 144Hz G-Sync screen, but I actually turn it down to 85Hz, mainly because the hitches when the framerate jumps between ~100-140 look terrible. I would actually rather game at a constant 60fps than look at wild fps swings; it's a far worse experience IMO.

At 85fps there aren't any fps swings, and since for me it mostly looks about the same as 100Hz+, 85Hz ends up being my preferred setting.
 
I thought this was a decent article: https://www.pcgamer.com/how-many-frames-per-second-can-the-human-eye-really-see/

But I still disagree with some of the findings. I can very clearly, night and day, vanilla and chocolate, tell the difference (and feel a significant improvement in gameplay) going from 60Hz to 120Hz.

The guys in that article are really focused on tracking moving objects. I tend to agree that doesn't require that high of a framerate. But they only mentioned smoothness as an aside, and didn't mention motion clarity at all. I mean, it's not an FPS, but it's SUPER easy to see how much easier it is to read scrolling text on an LCD at higher refresh rates, like in the Blur Busters scrolling map test. In fact, trying to read that text at 60Hz makes me a little nauseous, whereas at 165Hz it's a lot easier, and of course strobing makes it even easier.

I think if you just laser focus on fps required to track moving objects, their conclusions make sense, but there's a lot more to the benefits of high refresh than that, especially on sample and hold displays. If we switch to scanned backlight or strobed displays, then yeah I would tend to agree there probably isn't much benefit past the minimum refresh rate required to make flickering completely imperceptible, which tends to be 100-120hz for most people. But even then there are people who are bothered by flickering up to 240hz...

It's easy to dismiss high refresh rates if you only focus on one small part of the benefit they provide.
 