166Hz Has Ruined 60 fps Forever

cybereality

[H]F Junkie
Joined
Mar 22, 2008
Messages
8,789
So, I have 2 setups. One with a 55" 4K TV and another with a recently bought 166Hz Ultrawide.

I've enjoyed 4K 60Hz gaming for the last year or so, but decided to try 1080p ultrawide for a change.

OMG, it is so much better. I had a 144Hz screen before, but 166Hz is a noticeable improvement for me.

Been gaming on that for the last few weeks, and went to try the TV again to test something. I can't go back.

Played a few minutes at 4K60 and I honestly got a headache. It was physically painful to watch, it was so choppy.

Not sure I can ever go back to 60Hz if I have a choice.
 
Nah. It took me one day to adjust from 240Hz G-Sync back to 30fps Bloodborne or 60fps PS4 games. Not a problem at all. Picture quality is what matters most to me.
144 vs 166 is just placebo. There is a difference when going from 144 to 240.
 
Well that's good to hear. I still have a bunch of PS4 games waiting for me to play.
 
For me the jump from 120Hz strobing to 144Hz strobing was significant. I've used a 240Hz BenQ monitor for a while now; only some of the BenQ monitors and the 240Hz G-SYNC monitors are capable of strobing at 144Hz and above (on the BenQ you can even go past 144Hz, but at some point you run into side effects, around 180Hz or thereabouts depending on the strobing settings).

Then again, I've always been a motion smoothness fanatic. It started when I was a small kid wondering why movement looked smoother on some monitors than others (probably the different refresh rates of the CRTs I saw).

The jump from 144Hz to 240Hz without strobing is noticeable but not night and day. The motion smoothness of 144Hz with strobing feels like it must be equal to something like 500Hz without strobing (a bad comparison, since sample-and-hold is the bottleneck), but it's really nice for smooth motion. At 120Hz strobing vs. 240Hz without strobing it starts becoming a question of how much input lag is worth to you; there's a great deal less at 240Hz, but at 144Hz to 180Hz I prefer the big motion clarity advantage over the slight loss in input lag.
 
Having done 120Hz back in the day on CRT, which is still arguably superior to the (relative) blurfest of LCD even today, I find the biggest change depends more on the game.
If you're an e-sports world champion playing at 240Hz, yeah, sure. If you're casual and playing Skyrim or something like that... the engine probably can't even do it?
I'm more of an image quality guy these days but if I get another screen it will be at least 120Hz for FPS gaming, 60Hz is fine for all other content and uses I have though.
And I don't get 'sick' going between high Hz and 60Hz... that's a bit crazy.
 
Nah. It took me one day to adjust from 240Hz G-Sync back to 30fps Bloodborne or 60fps PS4 games. Not a problem at all. Picture quality is what matters most to me.
144 vs 166 is just placebo. There is a difference when going from 144 to 240.

likewise.

Been a huge advocate for 120Hz+ displays for years, and recently went to a 75Hz ultrawide and the downgrade was negligible. 30Hz is still pretty rough lol, but 60 is absolutely fine, especially for a TV (particularly combined with a low latency signal).
 
likewise.

Been a huge advocate for 120Hz+ displays for years, and recently went to a 75Hz ultrawide and the downgrade was negligible. 30Hz is still pretty rough lol, but 60 is absolutely fine, especially for a TV (particularly combined with a low latency signal).

I've never used anything above 60 Hz, but my whole life I haven't liked 30 Hz. And I notice 60 Hz very much on cinematic games that I'd like to play, to the point that I've set some games aside until I upgrade. But yeah 60 is fine for most games I like.

I'd say I notice lower Hz on TVs less; maybe it's from being further from the light source?
 
The other problem that could be stemming from the TV is input lag. It might look stuttery but you add in the input lag and it is just plain unplayable.
 
120hz is good because it's neatly divisible by 24, 30, and 60 fps.

That’s why I don’t get why it’s 166Hz. Shouldn’t it be 168Hz? Sounds like whoever the manufacturer is doesn’t understand why the industry has coalesced around certain Hz ranges.

166Hz sounds like judder central when watching 24fps or 60fps content.
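Quick sketch of the arithmetic, for anyone curious (Python; the content and refresh rates listed are just common examples, not tied to any particular monitor):

Code:
# Rough check: does a refresh rate divide evenly by common content frame rates?
# A non-integer frames-per-refresh ratio means some frames are held longer
# than others, which is what shows up as judder.
content_fps = [24, 25, 30, 60]
refresh_rates = [60, 83, 120, 144, 166, 168, 240]

for hz in refresh_rates:
    parts = []
    for fps in content_fps:
        ratio = hz / fps
        label = "even" if ratio == int(ratio) else "uneven (%.2fx)" % ratio
        parts.append("%dfps %s" % (fps, label))
    print("%dHz: %s" % (hz, ", ".join(parts)))

168Hz comes out clean for 24fps; 166Hz doesn't come out clean for anything, which is the whole complaint.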
 
The other problem that could be stemming from the TV is input lag. It might look stuttery but you add in the input lag and it is just plain unplayable.
Yeah, that may be part of it. I can always run the TV at 1440p/1080p@120Hz, so it's not a complete loss.

And I'll mess with the settings more, I only tried for a few minutes so it's possible something was wrong.
 
Sorry, but without proper ABX testing this kind of information is just useless IMHO. Placebo is underrated.
Now if you could pass an ABX test on that, THAT would be interesting.
No, I didn't do an AB test, but I'm pretty sure there is a difference. So much so that even 120Hz feels choppy to me. 144Hz is great, and 166Hz even better.

Not a night and day difference, but it is there and noticeable to me. I know people like to think that high refresh is placebo, but it's not. Different people are sensitive to different things.
 
It's true. Once you've played Final Fight at 144hz, the original arcade version feels like a choppy piece of crap. It really is gaming ruining.
 
No, I didn't do an AB test, but I'm pretty sure there is a difference. So much so that even 120Hz feels choppy to me. 144Hz is great, and 166Hz even better.

Not a night and day difference, but it is there and noticeable to me. I know people like to think that high refresh is placebo, but it's not. Different people are sensitive to different things.

I'm not saying you are wrong; if I had to guess, you are probably right, so please know that first.

But I have met so many people claiming they could clearly feel something even when it bent the very logic of what they claimed, so by nature I'm very skeptical of any claim that has even a pinch of a chance of being placebo.

Second part: if you do an A/B test, you are doing it wrong. That's a placebo-infected test, which is why you see it so often in audio magazines: they don't like science. ABX (the extra X makes the difference), on the other hand, is what is needed to objectively test for a difference while ruling out placebo.

But of course that takes a lot of effort and time, and most people won't do it, which is totally understandable.

This is not an "attack" on your claim. I'm simply explaining why these kinds of claims seem hollow to me as a technical/science guy. I also tend to hang around a lot of science/tech related sites where the TOS dictates that you use proper scientific proof for your claims or they get deleted, so I'm more used to that approach. There, a claim like "166Hz makes 144Hz look terrible" would simply be a rule violation without an ABX test to back it up. Again, just explaining where my view is coming from.

But in the end, if you are happy with what you got, you made the right purchase.
 
I skipped 1080P horseshit. I was on 30" 2560x1600 monitors from 2007 up to about four years ago. I've tried 1080 and 1440 monitors and I just can't deal with that bullshit. I need more vertical real estate. 1080P is like looking through one of these:

[image: medieval helmet with a narrow eye slit]


I don't give a shit what the refresh rate is if I have to look at the game world through the slit of a medieval helmet.
 
My wife uses a 60Hz monitor, and although the resolution looks nice, the gameplay experience pales in comparison to 100Hz+. I experience a difference even when just desktop browsing, so I think refresh > resolution.
 
My wife uses a 60Hz monitor, and although the resolution looks nice, the gameplay experience pales in comparison to 100Hz+. I experience a difference even when just desktop browsing, so I think refresh > resolution.

For me it isn't as much about resolution as it is pixel height. When you get used to productivity and gaming on something with 1600 pixels of vertical height for so long, 1080P is just too restrictive. I've also gotten used to the size of my KS8500, and anything less than 40" is a no go for me. I do prize image quality and resolution above refresh rate, and at 4K, with maximum image quality I can't reach 144FPS or 166FPS in most games anyway.
 
Even 1920x1200 vs 1920x1080 is a world of difference when talking productivity. 1080 pixels of height is just not enough to get shit done. On the other hand, over 1600 was too much for me: having to tilt my head up and down to cover it all, no joy.

That said, I've just went from 144Hz to 75hz, and it took a while to come down. Now after about 6 weeks I'm starting to get used to it. And the low fps.
 
Nah. It took me one day to adjust from 240Hz G-Sync back to 30fps Bloodborne or 60fps PS4 games. Not a problem at all. Picture quality is what matters most to me.
144 vs 166 is just placebo. There is a difference when going from 144 to 240.
If you play G-synced, the input lag difference between 144 and 165 brings a really nice connected feel bonus for those millisecond twitch shooters.

When switching between 120 and 144, the increased slideshow effect can be clearly seen, no question about that. 120 is a bit choppy.
 
For me it isn't as much about resolution as it is pixel height. When you get used to productivity and gaming on something with 1600 pixels of vertical height for so long, 1080P is just too restrictive. I've also gotten used to the size of my KS8500, and anything less than 40" is a no go for me. I do prize image quality and resolution above refresh rate, and at 4K, with maximum image quality I can't reach 144FPS or 166FPS in most games anyway.

You should look into the 43" 4k120hz Mango. I got one sitting on my desk next to the tiny X27 and the Mango is pretty much my go to display for everything these days. I like a big, fast display with low lag and the mango delivers.
 
For me it isn't as much about resolution as it is pixel height. When you get used to productivity and gaming on something with 1600 pixels of vertical height for so long, 1080P is just too restrictive. I've also gotten used to the size of my KS8500, and anything less than 40" is a no go for me. I do prize image quality and resolution above refresh rate, and at 4K, with maximum image quality I can't reach 144FPS or 166FPS in most games anyway.

I should have specified a little more on what I meant. To be honest, 1080 isn't an option in my house for a main display. My wife and I use multiple monitors for productivity to ensure enough desktop space. Her main is a 30" 1600p @ 60Hz and I use a 27" 1440p @ 120Hz. Hands down I prefer my setup to hers, especially when it's game time.
 
I couldn't tell the difference between 60hz and 144hz :/

Maybe I'm blind
Blind, the game was running at 60fps, or you left "prefer maximum refresh rate" enabled in the drivers, so you were actually seeing 144Hz even when you had 60Hz selected in the game.
 
I remember going from 85Hz on CRT to 60Hz on my first LCD and being disappointed. It's a shame it's taken so long for LCD to get to this point.
 
I remember going from 85Hz on CRT to 60Hz on my first LCD and being disappointed. It's a shame it's taken so long for LCD to get to this point.

In a lot of ways, LCDs still suck. You're bound to native resolution for the best results. Pixel density comes into play instead of bigger just being better. You have different panel technologies with their own drawbacks and advantages. The viewing angles aren't quite there either.
 
Nah. It took me one day to adjust from 240Hz G-Sync back to 30fps Bloodborne or 60fps PS4 games. Not a problem at all. Picture quality is what matters most to me.
144 vs 166 is just placebo. There is a difference when going from 144 to 240.

not true. I and many others can reliably and consistently discern the difference in a blind test.
 
I couldn't tell the difference between 60hz and 144hz :/
When I went from 60Hz to 120Hz, there was a huge and easily noticeable difference right away.

Even just moving the mouse cursor around or dragging windows on the desktop showed massive improvement.

I guess everyone is different, but I still find it hard to believe that people can't notice some difference.
 
not true. I and many others can reliably and consistently discern the difference in a blind test.
Yeah, for real. I haven't done a blind test, but I can see with my own eyes that 166Hz is visually smoother. I believe I know what I'm seeing.
 
That’s why I don’t get why it’s 166Hz. Shouldn’t it be 168Hz? Sounds like whoever the manufacturer is doesn’t understand why the industry has coalesced around certain Hz ranges.

166Hz sounds like judder central when watching 24fps or 60fps content.

My thoughts exactly. It will be judder central. 166Hz sounds like a random number someone pulled out of a hat. It's not a multiple of anything remotely useful. 83Hz isn't even remotely near anything useful either.

If the point of owning a monitor that does 166Hz is to not run it at its native refresh, then why bother at all? Some engineer should have been fired for this. Stick to multiples of 24, 25 or 30, please.
 
The monitor I have is 144Hz but it overclocks to 166Hz.

In games, this is no issue whatsoever and everything is smooth.

For movies, I'm using PowerDVD and it has a motion interpolation mode that upscales the refresh rate so it remains smooth.
 
The monitor I have is 144Hz but it overclocks to 166Hz.

In games, this is no issue whatsoever and everything is smooth.

For movies, I'm using PowerDVD and it has a motion interpolation mode that upscales the refresh rate so it remains smooth.

It's doing more than just interpolating each frame. It's also stretching the audio.

So you end up with blended frames and aliased audio samples. Nice.
 
It's doing more than just interpolating each frame. It's also stretching the audio.

So you end up with blended frames and aliased audio samples. Nice.


WAT. Why would it need to stretch audio? "Frame 200,001 of the DVD starts the gunshot sound." Why would it need to change that, even if PowerDVD "made up" 5 frames for every DVD frame?
 
WAT. Why would it need to stretch audio? "Frame 200,001 of the DVD starts the gunshot sound." Why would it need to change that, even if PowerDVD "made up" 5 frames for every DVD frame?

You don't as long as you maintain the original framerate (or a multiple thereof). (You can also drop frames, repeat frames, draw partial frames, etc., which is how judder is introduced.) If you deviate from that at all, the audio will become pitch shifted. You compensate by stretching (or compressing) the audio.

Anyway, I was leaning heavily on hyperbole, as a form of sarcasm. Without knowing exactly how PowerDVD works internally, I can't possibly know. The point is, they could have gone with either solution (or a hybrid).

The point I'm really trying to make is you end up with some morphed output either way.

A real world example of using audio stretching: say I have a 24fps source. The closest of my available monitor refresh rates to a multiple of 24 is 95Hz (96Hz being the ideal). Being 1Hz off is a big deal, as it would introduce motion judder. So I have two options: 1) use frame interpolation to hit a target framerate of 95/4 = 23.75 fps, or 2) just display the video as-is by slowing it down to 23.75 fps, which also requires stretching the audio. As a personal preference, I'm going with option #2 so as to maintain the original picture, as my eyes are far more sensitive than my ears. You can use ReClock with MPC to achieve that result.
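For anyone who wants the actual numbers behind option #2, here's a rough back-of-the-envelope sketch (Python; the 24fps source and 95Hz refresh are the same figures as in my example above, everything else is purely illustrative):

Code:
import math

source_fps = 24.0        # film source
refresh_hz = 95.0        # closest available refresh rate (96 would be ideal)
frames_per_refresh = 4   # show each film frame for 4 refreshes

target_fps = refresh_hz / frames_per_refresh   # 23.75 fps
speed = target_fps / source_fps                # ~0.9896, i.e. slowed by about 1%

# Just slowing the audio down with the video shifts its pitch by this much:
pitch_shift_cents = 1200 * math.log2(speed)    # roughly -18 cents

# A time-stretch (the ReClock approach mentioned above) keeps pitch constant
# and only changes the duration by the same factor:
print("playback speed: %.4fx" % speed)
print("uncorrected pitch shift: %.1f cents" % pitch_shift_cents)
print("a 120 minute movie runs %.1f minutes" % (120 / speed))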

TL;DR: Unless you game 24/7, there is nothing "glorious" about a new and fancy monitor that doesn't support a refresh rate that is a multiple of 24.
 
You don't as long as you maintain the original framerate (or a multiple thereof). If you deviate from that at all, the audio will become pitch shifted. You compensate by stretching (or compressing) the audio.

Anyway, I was being hyperbolic, as a form of sarcasm. Without knowing exactly how PowerDVD works internally, I can't possibly know. The point is, they could have gone with either solution (or a hybrid).

Meh, I'm more worried about converting 16-235 video levels to 0-255 PC levels. Those are always the fun problems, because you have to match the program to the video card to the monitor. The program will be like, oh, you have a PC, let's stretch it to 0-255; then the video card will be like, oh, you are using HDMI 2.0, let's convert it back to 16-235; then the monitor will go, oh, you are hooking up a PC, better not switch to "limited".
 
Meh, I'm more worried about converting 16-235 video levels to 0-255 PC levels. Those are always the fun problems, because you have to match the program to the video card to the monitor. The program will be like, oh, you have a PC, let's stretch it to 0-255; then the video card will be like, oh, you are using HDMI 2.0, let's convert it back to 16-235; then the monitor will go, oh, you are hooking up a PC, better not switch to "limited".

What you are describing is conversion hell.

Would it make you sleep better knowing that same type of scenario is going on behind the scenes well before a decompressed image even hits the video card's framebuffer? "Hello there little pixel. Oh you were encoded as YUV? But that simply will not do for the moment. Let's see, you're using an Nvidia driver, so we'll convert you to NV12 for now, ok?" Then the driver gets a hold of it. "..but we need an RGB or 4:2:2 output, so make sure that NV12 input gets converted or else."

Most of the time the chain is upconverting (i.e. more colors than the original source), which makes the point moot, especially in the case of RGB, where most of the time it's a lossless chain. That's why 4:2:2 output (something you find in TVs) is, in general, perfectly fine for watching compressed video content (it's already colorspace reduced), but not so much for games, since they are RGB sourced.
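For reference, the levels stretch everyone in that chain keeps doing (and undoing) is just a linear remap; a minimal sketch of the 8-bit luma case (Python; chroma actually uses 16-240, which I'm ignoring here):

Code:
def limited_to_full(y):
    # Stretch 8-bit limited-range video levels (16-235) to PC full range (0-255)
    return round((y - 16) * 255 / 219)

def full_to_limited(y):
    # Squeeze full range (0-255) back down to limited range (16-235)
    return round(y * 219 / 255 + 16)

# The "conversion hell" round trip: full -> limited -> full collapses levels,
# since 220 steps can't carry 256 distinct values. That's where banding creeps in.
survivors = {limited_to_full(full_to_limited(v)) for v in range(256)}
print(len(survivors), "of 256 levels survive the round trip")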
 
I'm in the same boat....

fps > eye candy

Correct me if I'm wrong, but isn't motion resolution greatly reduced on LCD screens compared with a CRT or a DLP projector, for instance? Seeing as most games are in motion, and fast motion at that, won't you be perceiving a similar resolution running a high refresh, lowish response time 1440p 144Hz+ screen vs. a slowish response time IPS 4K 60Hz screen in actual gameplay? Meaning the vast majority of the benefit of 4K or higher comes from having a nice desktop reading / image editing experience, not from gaming.

Again, I could be wrong here. I'm sure there will be slower / static moments in gameplay, and perhaps the perception of detail is higher in any case at 4K. The last 4K I had was a TN model with a low response time, and in motion it didn't look better than the 1080p TN I replaced it with, only in much slower games.
 
Honestly I am fine with 60hz (using a 75hz monitor currently). The biggest problem at lower refresh is tearing or the input delay associated with fixing it. Luckily Freesync / G-Sync and Fast Sync / Enhanced Sync for high fps games mean that this is less of a problem these days.
This is sadly less of an option in laptops, but with AMD slowly becoming relevant in that segment, as well as Intel planning on supporting adaptive sync, it will be a thing soon.

As well as using a Freesync display, I've recently also started using motion blur in games, after a lifetime of shunning it like the plague. Doom with its famous per-object motion blur converted me.
It's not implemented well in every game, but a good version can definitely make a game feel smoother. I now understand why people are able to stomach console gaming at 30fps.
 
In a lot of ways, LCDs still suck. You're bound to native resolution for the best results. Pixel density comes into play instead of bigger just being better. You have different panel technologies with their own drawbacks and advantages. The viewing angles aren't quite there either.

The only reason this is a problem is that manufacturers and devs are generally lazy fucks. Display manufacturers are lazy in that running at half resolution tends to be blurry for some reason (wtf), and devs are lazy in that games still don't seem to work at any resolution or refresh rate because they're poorly cobbled together sacks of shit.
 