Why does my 60hz monitor look smoother than my 90hz overclocked monitor?

When watching 60FPS youtube videos for example, my 60hz work computer screen looks smoother, as if it were running at a higher hz than my Korean monitor that has been overclocked to 90hz. My 60hz monitor is flicker free, could this have something to do with it? It's difficult to explain; both monitors are 2560x1440 IPS, but the 60hz monitor just looks better when watching/playing videos or games…

60hz - Benq GW2765HT
90hz - Qnix qx2710
 
Your overclocked display is most likely dropping frames, which means it is not a successful overclock. Check out this page and follow the instructions to confirm.

Blur Busters UFO Motion Tests

You'll want to back down the overclock until your photo shows a solid line of blocks.
 
90hz isn't an integer multiple of 60hz, so frames have to be fit to the time scale of your monitor.

A 60fps video on a 90hz monitor will show frames for 2 refreshes, then 1 refresh, 2, 1, and so on. Whereas a 120hz monitor will use a constant 2 monitor refreshes for each frame, and a regular 60hz monitor obviously has a 1:1 relationship.

It's the same phenomenon as when you hear about 3:2 pulldown on TVs.
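
To make that cadence concrete, here is a minimal sketch (my own illustration, not from anyone in the thread) that counts how many consecutive refreshes each source frame stays on screen, assuming the display simply shows the newest frame available at each refresh:

from itertools import groupby

def cadence(fps, hz, refreshes=12):
    # Frame index shown at each refresh, assuming the newest available frame is displayed.
    shown = [r * fps // hz for r in range(refreshes)]
    # Length of each run of refreshes showing the same frame.
    return [len(list(run)) for _, run in groupby(shown)]

print(cadence(60, 90))   # [2, 1, 2, 1, 2, 1, 2, 1] -> uneven cadence, judder
print(cadence(60, 120))  # [2, 2, 2, 2, 2, 2]       -> every frame held for 2 refreshes
print(cadence(60, 60))   # [1, 1, 1, ...]           -> 1:1, perfectly even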
 
I would say that overclocked panels beyond 60Hz are more of a marketing gimmick than anything else. I mean, all it really does is give people a reason to buy video cards that drive a higher FPS than 60.

But the reality is, the average person is fine with less than 60Hz... and often even half that rate at 30FPS is fine. That's why a lot of people who game on consoles think it's fine and can't tell why the PC is better. They probably aren't as sensitive to the difference between 30 and 60 as PC gamers are. By the time you're talking about the difference between 120 and 60... you're really, really pushing the limits of what the average person can see. The only remotely practical use for it, IMO, is for active-shutter glasses used for 3D monitors. Those monitors display a different frame to each eye using the extra refresh, meaning you get an effective 60Hz in each eye.

The people who can tell a difference between 60 and 120 are a very vocal minority, though. Usually the sort of people that love twitch games because they're naturally good at them and have split-second reaction times. But honestly, there's a very good chance you're part of the larger portion of the population that can't perceive much of a difference beyond 60Hz.

Another thing worth noting is that having a higher Hz monitor is actually detrimental if you can't get the FPS to match... it makes things look choppier than if the monitor were running at a lower Hz that the graphics card could actually keep up with. The importance of matching FPS to Hz in obtaining smooth motion is often neglected by people who just want to see the frames as fast as possible even if it looks jerky or weird. Again, generally twitch gamers who want to see things as fast as possible.
 
^^That is bullshit. Even the jump from 60 Hz to 75 Hz is a massive difference, even in desktop use. I bet that everyone can see the difference between 120 Hz and 60 Hz; those who can't are likely in the minority.
 
90hz isn't an integer multiple of 60hz, so frames have to be fit to the time scale of your monitor.

A 60fps video on a 90hz monitor will show frames for 2 refreshes, then 1 refresh, 2, 1, and so on. Whereas a 120hz monitor will use a constant 2 monitor refreshes for each frame, and a regular 60hz monitor obviously has a 1:1 relationship.

It's the same phenomenon as when you hear about 3:2 pulldown on TVs.

Since the thread is beginning to go off topic, I wanted to highlight the above post by BearOso. He's correct. Even if you have a successful overclock to 90hz, 60fps content will display smoothly on a 60 or 120hz display, but will judder on a 90hz display. He specifically mentions 3:2 pulldown, which isn't really noticeable to most people. Well, what you're doing is effectively 2:1 pulldown, which is VERY noticeable.
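
If it helps to see that judder in numbers, here's a rough sketch (my own back-of-the-envelope timing assumption, not a measurement) of how long each source frame sits on screen with 60fps content:

def frame_times_ms(fps, hz, frames=6):
    # On-screen time per source frame, assuming each refresh shows the newest available frame.
    refresh_ms = 1000 / hz
    times, r = [], 0
    for i in range(frames):
        count = 0
        while r * fps // hz == i:  # refreshes during which frame i is still the newest frame
            count += 1
            r += 1
        times.append(round(count * refresh_ms, 1))
    return times

print(frame_times_ms(60, 90))   # [22.2, 11.1, 22.2, 11.1, 22.2, 11.1] -> frame times bounce around
print(frame_times_ms(60, 60))   # [16.7, 16.7, 16.7, 16.7, 16.7, 16.7] -> steady
print(frame_times_ms(60, 120))  # [16.7, 16.7, 16.7, 16.7, 16.7, 16.7] -> steady (2 x 8.3 ms)

The eye is very good at picking up that alternating 22ms/11ms rhythm, which is why the 90hz panel can feel worse for 60fps content even though it refreshes more often.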
 
^^That is bullshit. Even the jump from 60 Hz to 75 Hz is a massive difference, even in desktop use. I bet that everyone can see the difference between 120 Hz and 60 Hz; those who can't are likely in the minority.

I have sources to back it up. The best one is page 52 of this PDF:

http://realtimerendering.com/Principles_of_Digital_Image_Synthesis_v1.0.1.pdf

"Under the best conditions, the CFF for a human is around 60 Hz."

And also:

LCD Motion Blur: Fact and Fiction | ExtremeTech

Flicker fusion threshold - Wikipedia, the free encyclopedia

Real-Time Rendering · 60 Hz, 120 Hz, 240 Hz…

If you're sensitive to it, sure, it's a massive difference. Even if just 20% of people can tell a difference, that's a huge number of people who will buy this stuff. People who are attuned to input lag, and Air Force pilots with unusually fast reflexes, can get faster reaction times out of it, sure, but that's outside the range of what an average person would notice.
 
Oh, not this garbage again.

Yes, it's true that the human eye has the equivalent of an internal refresh rate. Some prior studies have said 24, 28, 30, etc. You just cited one with "around 60." We can all cite references to back up whatever claim we can come up with. But here's the kicker. Regardless of whatever that internal refresh rate is, it won't be perfectly in sync with any monitor. Real motion looks more fluid than anything on a monitor. Higher refresh rates look more fluid than lower refresh rates as they get closer to real motion and have more motion data for our eyes and brain to process. Even if our eyes only refreshed internally at the equivalent of 15fps, higher refresh rates would still look more fluid to our eyes due to them being natively out of sync with our own eyesight and due to the additional motion data provided.

Side by side, 120fps looks more fluid than 60fps. That has been proven numerous times.
 
 
althenian200 has proven exceptionally good at being wrong on a variety of topics since he went active recently.
 
Higher refresh rates look more fluid than lower refresh rates as they get closer to real motion and have more motion data for our eyes and brain to process. Even if our eyes only refreshed internally at the equivalent of 15fps, higher refresh rates would still look more fluid to our eyes due to them being natively out of sync with our own eyesight and due to the additional motion data provided.

So, you think the cutoff beyond which people wouldn't notice a difference is higher than 60Hz. Fair enough, maybe you've got information I don't have. Do you have a specific number? Because if you're claiming that they could raise this number indefinitely and the higher number would always look noticeably better... then this amounts to an alternative version of Zeno's Paradox.
Side by side, 120fps looks more fluid than 60fps. That has been proven numerous times.

I haven't actually seen it proven. I could believe it if someone has proven it, I just haven't seen that data.

If there really are conflicting studies on this issue, then it may not even be worth debating because we aren't going to come to a consensus if the scientific community can't agree on it.
 
Just to back it up once more, althenian200 is totally in the wrong here. 120hz+ is absolutely wonderful and very, very noticeable (versus 60hz) to everyone I've ever demo'd for by simply dragging a window around a screen in 60hz vs 120hz mode. Psh.
 
Just to back it up once more, althenian200 is totally in the wrong here. 120hz+ is absolutely wonderful and very, very noticeable (versus 60hz) to everyone I've ever demo'd for by simply dragging a window around a screen in 60hz vs 120hz mode. Psh.

I'm definitely starting to consider the possibility that there's something wrong with my eyesight or perceptual speed. I had people coming down on me like this in real life when I insisted I couldn't tell a difference and the monitors were right in front of me.
 
So, you think the cutoff beyond which people wouldn't notice a difference is higher than 60Hz. Fair enough, maybe you've got information I don't have. Do you have a specific number? Because if you're claiming that they could raise this number indefinitely and the higher number would always look noticeably better... then this amounts to an alternative version of Zeno's Paradox.

I don't think there's a real cutoff. A refresh rate is not equal to true motion (IE, a person physically moving). A refresh rate is simply a way to trick your eyes/brain into perceiving motion. The higher the refresh rate, the more convincing the effect. And each person's individual cutoff will be different.

We hear gamers all the time say that they won't play a game below XX fps. I'm guilty of that myself sometimes. I got so sick of screen tearing that I finally set a cutoff of 60fps w/VSYNC (part of why I want GSYNC/FreeSync so badly in my next monitor). But then I got NFS Rivals, which has a locked 30fps. Tried the 60fps "hack" and it messed with the game's physics. And honestly, a locked 30fps was fine. Now, fluctuating from 30-60fps would drive me nuts because it ruins the trick on your mind (where refresh rate tries to simulate motion).

Can I see a difference between 60fps and 120 fps when side by side? Absolutely, and I bet that you could as well if you sat down in front of an actual comparison. Do I care about it enough to buy a monitor over 60hz? Nope. I don't care one bit about gaming above 60hz, but I do acknowledge that there is a difference.
 
I don't think there's a real cutoff. A refresh rate is not equal to true motion (IE, a person physically moving). A refresh rate is simply a way to trick your eyes/brain into perceiving motion. The higher the refresh rate, the more convincing the effect. And each person's individual cutoff will be different.

That sounds right to me. Even when people are looking right at the monitors, they will disagree. Apparently people are still debating 120Hz vs 240Hz.
We hear gamers all the time say that they won't play a game below XX fps. I'm guilty of that myself sometimes. I got so sick of screen tearing that I finally set a cutoff of 60fps w/VSYNC (part of why I want GSYNC/FreeSync so badly in my next monitor). But then I got NFS Rivals, which has a locked 30fps. Tried the 60fps "hack" and it messed with the game's physics. And honestly, a locked 30fps was fine. Now, fluctuating from 30-60fps would drive me nuts because it ruins the trick on your mind (where refresh rate tries to simulate motion).

I'm interested in GSync/FreeSync too. That actually does seem a lot smoother to me than having a higher Hz by itself, but that's probably completely subjective. I'm one of those people that likes to enable VSync in games rather than having an unlocked framerate because I think it looks better to have the framerate match up with the display refresh.

Can I see a difference between 60fps and 120 fps when side by side? Absolutely, and I bet that you could as well if you sat down in front of an actual comparison. Do I care about it enough to buy a monitor over 60hz? Nope. I don't care one bit about gaming above 60hz, but I do acknowledge that there is a difference.

Well, I'd certainly acknowledge that there's a difference. Just because I didn't see it the one time I actually saw a 120Hz display doesn't mean that it's not real. Apparently a lot more people can see it than I thought. It's really hard to believe in something you can't see for yourself just because other people are seeing it.
 
Personally I run a balance between still-shot quality and motion excellence. Dialing in graphics settings to achieve around a 100 to 110 fps-hz average, you ride a frame rate graph that typically swings from 75-90 through the 100-110 average up to the 130s or more, dynamically and smoothly with g-sync.

At 100fps-hz / 120fps-hz / 144fps-hz (vs. 60):
~40% / 50% / 60% blur reduction (a "softening" blur rather than the smearing blur of 60fps-hz and below)
5:3 / 2:1 / 2.4:1 increase in motion definition and path articulation (often unmentioned, a huge difference)
g-sync rides the fps graph up and down without screen aberrations.

Regardless of the monitor's hz, lower frame rates will be blurrier (outside of using strobe mode). That is why I list my rates as fps-hz, not fps or hz alone. Without the frame rates, the hz is practically meaningless.
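
For what it's worth, the ~40%/50%/60% and 5:3/2:1/2.4:1 figures above fall out of simple arithmetic if you assume sample-and-hold blur scales with how long each frame persists on screen (a common rule of thumb, not something specific to any particular panel):

BASELINE = 60  # fps-hz reference point

for target in (100, 120, 144):
    persistence_ms = 1000 / target                 # how long each frame sits on screen
    blur_reduction = 1 - BASELINE / target         # persistence reduction vs. 60fps-hz
    motion_definition = target / BASELINE          # unique frames drawn per second, relative to 60
    print(f"{target} fps-hz: {persistence_ms:.1f} ms persistence, "
          f"~{blur_reduction:.0%} blur reduction, {motion_definition:.1f}x motion definition")

# 100 fps-hz: 10.0 ms persistence, ~40% blur reduction, 1.7x motion definition
# 120 fps-hz: 8.3 ms persistence, ~50% blur reduction, 2.0x motion definition
# 144 fps-hz: 6.9 ms persistence, ~58% blur reduction, 2.4x motion definition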

People are infatuated with graphics detail in still shots, but you don't play screen shots. If you are using a 60hz monitor, or a high-hz monitor running low fps (sub 75 to 90 fps-hz most of the time in game, when it really should be at least 100 imo), you are essentially running a low-hz, low-motion-definition, low-articulation, smearing-blur monitor and missing out on most of the advancements modern gaming monitors provide, outside of judder/tearing/stutter avoidance.

Also note that VR headsets are 90hz currently (and would be higher if it were possible) because lower hz and lower motion definition are even more obvious in VR, to the point of being nausea inducing.

120hz-fps-compared

An example of a frame rate graph running around 100fps average.

Preview of NVIDIA G-SYNC, Part #1 (Fluidity) | Blur Busters

displayport 1.3 Hz limits per resolution (incl HDR monitors)
 
Like others have said, when running videos you aren't running a clean multiple at 90hz, and you aren't using a TV with anti-judder tech. Blu-rays and rips are usually 24fps too, by the way.
A g-sync monitor would match the hz perfectly to the frame rate, or a 120hz monitor could show the same frame through two monitor refreshes (what I like to call a freeze-frame) for 60fps content, or five refreshes per frame (5 x 24 = 120) for blu-ray content.
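
As a rough illustration of that 5 x 24 point (same back-of-the-envelope assumption as the earlier cadence sketch, i.e. each refresh shows the newest available frame):

from itertools import groupby

def cadence(fps, hz, refreshes=15):
    shown = [r * fps // hz for r in range(refreshes)]   # frame on screen at each refresh
    return [len(list(run)) for _, run in groupby(shown)]

print(cadence(24, 60))   # [3, 2, 3, 2, 3, 2] -> classic 3:2 pulldown
print(cadence(24, 90))   # [4, 4, 4, 3]       -> uneven again, so 24fps content also judders at 90hz
print(cadence(24, 120))  # [5, 5, 5]          -> 5 refreshes per frame, perfectly even

A g-sync/freesync display sidesteps all of this by simply refreshing whenever a new frame arrives.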

I recommend reading and viewing the blur busters g-sync fluidity page examples to give you a clear idea of the screen aberrations resulting from unmatched frame rates and hz.

I suspect you could be dropping frames from overclocking the monitor (as someone already said).
I also wonder if the 120hz monitor you saw once upon a time was being fed enough frames over 60 to show you a difference. A mouse cursor on a higher-hz desktop has an overabundance of frame rate, though, and the increase in motion definition and motion articulation is beyond obvious even for a simple non-animated image moving along a path. An entire game world in 1st/3rd person games, being moved constantly relative to your viewpoint when mouse-looking and movement-keying, is a huge difference, and an aesthetic one: roughly 50% blur reduction and roughly 2x motion definition and articulation at a 100fps-hz average.
 