240Hz is best, quantitatively

tangoseal

[H]F Junkie
Joined
Dec 18, 2010
Messages
9,743
I get tired of the age-old argument that no one benefits from 240Hz.

That there is no perceivable difference with 240Hz.

As an AVID enthusiast of my Acer 240Hz panel, I have had vastly higher and more consistent success in FPS gaming than ever before.

Even in titles that are fps-limited to 144, like Titanfall 2.

Since everyone thinks with emotion and not logic and wants to constantly argue, I'll let LTT and a quantitative experiment do the talking.

TLDR: they find that yes, 240Hz is indeed superior.

One could argue: what about 144/165/200Hz? Yes! They are each an incremental improvement over the tier below.

 
*Shrug*

When I used to play at 90Hz and 100Hz back in the day, I found it to be slightly better than 60fps. It was such an incredibly subtle difference - however - that I'd mostly consider it marginal. Project that out through 240Hz (if you even have the hardware to render anything of interest that fast) and even then, I just don't see it being significantly different. I'm still able to be highly competitive even in twitchy titles like CS despite being almost 40 and playing at 60fps.

Having some nobody on youtube* claim otherwise is not going to convince me that the benefit of high refresh rates is anything more than placebo.

That said, I do want to upgrade to a slightly higher refresh monitor. I am hoping one of the two announced (LG and Asus) 43" 4k 120hz panels will wind up being good. I'm interested in this more for the adaptive refresh than I am for the higher refresh rates though. I don't expect a GPU to exist that will allow me to run anything even remotely modern at above 90fps for years to come, and there is no way in hell I'm willing to drop down from 4k resolution. High resolution and large screens are such an immersive experience I could never go back.

*Everyone on youtube is a nobody who can't be trusted, no matter who they are
 
I hate the tearing effect when you drag windows around and gaps/flicker moving the mouse cursor just sitting at the desktop. I don't understand how people can't see the difference in motion as the refresh increases.
 
I get tired of the age-old argument that no one benefits from 240Hz.

That there is no perceivable difference with 240Hz.

As an AVID enthusiast of my Acer 240Hz panel, I have had vastly higher and more consistent success in FPS gaming than ever before.

Even in titles that are fps-limited to 144, like Titanfall 2.

Since everyone thinks with emotion and not logic and wants to constantly argue, I'll let LTT and a quantitative experiment do the talking.

TLDR: they find that yes, 240Hz is indeed superior.

One could argue: what about 144/165/200Hz? Yes! They are each an incremental improvement over the tier below.


Sorry man, does nothing for me. I am all about having the newest tech. But some things just don't make a difference to some people. I have owned a 240Hz panel, and am back using a 144Hz panel.

To each their own on this topic.

And using LTT to prove a point? Come on, man. Enough with this youtube celeb era.
 
Sure 240Hz is better. But why stop there? Why not just argue for 1000Hz displays (4x faster than 240Hz bro!)? At what point does it cease to matter to you? More to the point, when "should" it stop mattering to anyone/everyone else?
Simply put: opinions.
 
I had a 144Hz and went back to 60Hz. It didn't change my success rate, K/D ratio or anything in games. The key is that whatever you have is consistent. You get used to whatever you have so long as it's predictable. That's been my experience anyway.

That said, if I could have the size and resolution of my Samsung KS8500 and 144Hz or better, I would in a second. But I'm not dropping down to some nasty-ass 1080P garbage just to hit those FPS (240FPS) numbers.
 
So far, all the 240hz panels are ugly TN panels.

A 144hz IPS is objectively more enjoyable to look at and if you disagree, you hate beauty.
 
So far, all the 240hz panels are ugly TN panels.

A 144hz IPS is objectively more enjoyable to look at and if you disagree, you hate beauty.

Millions are made on 240 at competitions.

I never said jack squat about beauty. We're talking raw FPS gaming performance.

I find it funny how all of you commenting can't actually argue that 240 is NOT better for competitive gaming.

Still not one argument that blows the fact out of the water. Beauty doesn't count.
 
Millions are made on 240 at competitions.

That only proves they are a specialized tool; that, combined with their diminutive size, means they aren't for everyone. Claiming the rest of us aren't enthusiasts because we haven't bowed down to the superiority of 240Hz is ridiculous. The majority of us get more benefit out of a larger screen and 4K.
 
Millions are made on 240 at competitions.

I never said jack squat about beauty. We're talking raw FPS gaming performance.

I find it funny how all of you commenting can't actually argue that 240 is NOT better for competitive gaming.

Still not one argument that blows the fact out of the water. Beauty doesn't count.

I will not go quietly into your beauty free competitive gaming hellscape sir.

I WILL NOT!
 
Millions are made on 240 at competitions.

I never said jack squat about beauty. We're talking raw FPS gaming performance.

I find it funny how all of you commenting can't actually argue that 240 is NOT better for competitive gaming.

Still not one argument that blows the fact out of the water. Beauty doesn't count.

I never said that it wasn't what people were using competitively. I couldn't care less what competitive gamers do. When I play games I'm looking for an immersive and enjoyable experience. I want a smooth experience, but I'm also looking for visual quality and a large immersive display. I don't get that with a 240Hz and 1080P gaming. 1080P on some 24" monitor is going backwards to me. I don't care how smooth it is when it looks like shit.
 
I never said that it wasn't what people were using competitively. I couldn't care less what competitive gamers do. When I play games I'm looking for an immersive and enjoyable experience. I want a smooth experience, but I'm also looking for visual quality and a large immersive display. I don't get that with a 240Hz and 1080P gaming. 1080P on some 24" monitor is going backwards to me. I don't care how smooth it is when it looks like shit.

Yeah, your points are well taken. But nonetheless they do not support what so many people on this very forum have said over the years: that 240 makes no gaming difference at all. I agree about the immersive gaming experience. A 240Hz display actually sucks for RPG/Strat/Open World etc... but in raw FPS frag fests there is no replacement, and that was the point of the thing Linus did, as well as others, as well as this thread.

Again, while agreeing that the immersive gaming experience is hyper important: I do have an Acer Predator X34 3440x1440 IPS 100Hz G-Sync and I absolutely LOVE IT!

But I was referring to the baseless claims that so many make that 240 makes no difference.

Anyways, since this thread turned into an unforeseen crippling argument about the subject, let me submit that with the release of DisplayPort 2.0, hopefully next-gen GPUs will allow 240Hz IPS panels at up to 4K.
 
I never said it didn't make any difference; I'm sure it does. However, I think the difference probably isn't Earth-shattering. As I said, I feel consistency is the key. In Destiny 2, I do well in the Crucible. I'm sure there are plenty of guys running 144Hz and faster monitors that get smoked by me. I'm chugging along at 60 FPS or less sometimes.

Would I do better with 144Hz of consistent FPS? Absolutely. Would it radically alter my K/D ratio? I doubt it. Now if I was playing for money I wouldn't leave it to chance.
 
Of course 240Hz is better; it will give you less motion persistence. But the displays that support it are not displays that I want, because they are small in size and low in resolution. In terms of visual smoothness I already have a hard time telling the difference between 120Hz and 144Hz. At the same time, most games won't be able to reach 200+ fps at high details and resolutions higher than 1080p, and that is unlikely to change anytime soon.

We will eventually get 4K displays with 240 Hz as DP 2.0 becomes more common but we are probably 4+ years away from that considering the DP 2.0 spec was just finished. In the meantime I feel 1440p+ resolution screens at 144 Hz offer a better experience as that added resolution will help you see smaller details better.
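The diminishing returns above show up clearly if you compare frame times instead of refresh rates; a quick back-of-the-envelope sketch (plain arithmetic, no display-specific assumptions):

```python
# Frame time (ms) at each refresh rate, and the time saved per step up.
rates = [60, 120, 144, 240]

for hz in rates:
    print(f"{hz}Hz -> {1000.0 / hz:.2f} ms per frame")

for lo, hi in zip(rates, rates[1:]):
    saved = 1000.0 / lo - 1000.0 / hi
    print(f"{lo}Hz -> {hi}Hz saves {saved:.2f} ms per frame")

# 60Hz -> 120Hz saves 8.33 ms per frame
# 120Hz -> 144Hz saves 1.39 ms per frame
# 144Hz -> 240Hz saves 2.78 ms per frame
```

The first step up from 60Hz buys over 8ms per frame; every step past 120Hz is fighting over a millisecond or two, which is a plausible reason the differences get so subtle.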
 
But I was referring to the baseless claims that so many make that 240 makes no difference.
In terms of competitive advantage over 120/144, though, that's probably still going to be the case.

In terms of perceived smoothness, obviously 240 looks better.
 
I forget if it was Valve or Oculus who had done research into the measured limits of benefit for VR as Hz increased, and pegged it at 1000Hz, or a 1ms frame time. Obviously this has nothing to do with what's practical and feasible today, but it's worth taking this kind of investigation and measuring at all points from 60fps to 1000+.
Also, OLEDs are capable of 1000Hz, but no one is building such panel drivers for the commercial space.
 
... that was the point of the thing Linus did, as well as others, as well as this thread.

Linus didn't really do anything to prove that 240Hz is any kind of perceivable threshold. All he really showed is that the threshold is above 60Hz.

Until someone objectively tests 240Hz against 100Hz, 120Hz, 144Hz, then we really don't have any idea where that threshold might be.
 
Linus didn't really do anything to prove that 240Hz is any kind of perceivable threshold. All he really showed is that the threshold is above 60Hz.

Until someone objectively tests 240Hz against 100Hz, 120Hz, 144Hz, then we really don't have any idea where that threshold might be.

Linus doesn't know shit about shit.

He is a goofy clown who prances around and does stupid shit in front of a camera, and I wouldn't trust anything that comes out of his mouth from a technical perspective. I don't trust him, or Jayz2Cents, or any of the idiots on Youtube. Youtube is just a swamp of bad information from people talking out of their asses. There is nothing trustworthy on that entire platform.
 
240Hz is great if you can actually push 240 frames a second. Hard to do in most games. FWIW, the cheap Seiki 4K TV from a few years back has been capable of 240Hz with 3000:1 contrast, albeit at 720p, ever since someone here modded the firmware back in 2013 or whenever. It still looks great with a ton of AA even at low res, because the higher contrast, better colors and the tight pixel density of the native 4K VA panel do a lot to improve perceived resolution.

Some guy also made a board for the 39" version that does 4K120, and 240Hz at 1080p I believe.
 
Linus doesn't know shit about shit.

He is a goofy clown who prances around and does stupid shit in front of a camera, and I wouldn't trust anything that comes out of his mouth from a technical perspective. I don't trust him, or Jayz2Cents, or any of the idiots on Youtube. Youtube is just a swamp of bad information from people talking out of their asses. There is nothing trustworthy on that entire platform.
Geez, grandpa, you're a little cranky when you don't take your afternoon nap.

Here you go, have a...

(Snickers image)
 
240Hz > 160Hz > 144Hz > 120Hz > 100Hz > 90Hz > 85Hz > 75Hz > 72Hz > 60Hz > 50Hz > 48Hz > 30Hz > 25Hz > 24Hz

I think there is a pattern here ...
 
Boy I miss the old days on this site.

It used to be about high end enthusiasm for bleeding edge performance.

Everything good dies eventually.


No, that's because after all these years, most people have seen how overkill going higher than 120Hz really is.

I tried back in the day to tell the difference between 85Hz and 120Hz with vsync off on my fairly responsive CRT, and couldn't, no matter how many hours of UT 2003 or CS I played. I can barely tell the difference between 75Hz and 85Hz (aside from the flicker it added to my CRT).

The reason there is such a movement to "shoot for the sky on refresh rate" is that we got stuck on 60Hz LCDs with 20ms response times for years, and the only way to sell people back on higher refresh rates was 3D glasses at 120Hz. Now people are discovering it again for themselves, and are surprised how hard it is to tell the difference between 120Hz and 240Hz.

While I will acknowledge that there are competitive gamers out there who can tell the difference, they're very few in number. I am not one of them, and I compromised on a 2ms TN 1080p panel 3 years back, overclocked to 75Hz. It's not noticeably slower than the dead CRT it replaced, because this old man doesn't play many online games anymore. But if I did, the most I'd pay for is 120Hz.

Really, the only reason people went for 150Hz was to hack games like Quake 3. There is no other perceivable improvement.
 
240Hz > 160Hz > 144Hz > 120Hz > 100Hz > 90Hz > 85Hz > 75Hz > 72Hz > 60Hz > 50Hz > 48Hz > 30Hz > 25Hz > 24Hz

I think there is a pattern here ...


Clearly, all else being equal, the higher the refresh rate the better. The problem is, all things are rarely equal.

In order to reach the high refresh rates, and the high framerates to fill those high refresh rates properly we have to make many sacrifices. Panel image quality, higher powered noisier components, lower resolution, lower video quality settings, cost of higher end components.

Because of the many tradeoffs, it is impractical to consider the sky the limit and just go for gold. This is why we try to figure out how much is "good enough".

I find 60fps is fine for me for most things. If I still played competitive twitchy online games, I might opt for a little higher, but 90-100 somewhere is probably as high as I'd go. IMHO diminishing returns set in rather quickly above 60fps anyway, and the tradeoffs to get up to those higher framerates just aren't ones I'm willing to put up with.

But if all else were equal, and I could just crank up the refresh/framerate? Sure, I'd turn it up to a bazillion just to be on the safe side and make sure I am getting the best possible experience :p
 
Boy I miss the old days on this site.

It used to be about high end enthusiasm for bleeding edge performance.

Everything good dies eventually.

I'm into this hobby and this community for the best possible experience from my hardware. Above 75hz or so, I get much more out of raising video quality and resolution than I get out of higher refresh/framerates.

I'm still into bleeding edge performance. I have a custom water loop, and read up on the latest fastest GPU's and CPU's and how to best cool and overclock them. I've even planned out a water chiller system in my head for some day if I have the spare cash to get the most out of my hardware.

I am very much pushing the envelope of what is possible performance wise on PC's. I'd just rather spend that added performance on 4k resolution and ultra-high settings than I would on adding more framerate I'd barely notice.
 
240Hz > 160Hz > 144Hz > 120Hz > 100Hz > 90Hz > 85Hz > 75Hz > 72Hz > 60Hz > 50Hz > 48Hz > 30Hz > 25Hz > 24Hz

Tell that to film snobs. They like their 24 FPS judder. ;)

At the other end of the range, there is some point where the difference is imperceptible, so it could be more like:

240Hz = 160Hz = 144Hz = 120Hz > 100Hz > 85Hz > 60Hz
 
At the other end of the range, there is some point where the difference is imperceptible, so it could be more like:

240Hz = 160Hz = 144Hz = 120Hz > 100Hz > 85Hz > 60Hz

Exactly.

I'd like to see a large sample size double blind clinical study performed to find just where that limit lies.

My personal experience is that 60fps gets you most of the way there, but that there are still small gains to be had above 60. I am hugely skeptical that anyone can tell the difference between 120hz and anything above 120hz.
 
Sure 240Hz is better. But why stop there? Why not just argue for 1000Hz displays (4x faster than 240Hz bro!)? At what point does it cease to matter to you? More to the point, when "should" it stop mattering to anyone/everyone else?
Simply put: opinions.

There are scholarly articles claiming that many humans can perceive the smoothness up to 1000 Hz.
 
At the other end of the range, there is some point where the difference is imperceptible, so it could be more like:

240Hz = 160Hz = 144Hz = 120Hz > 100Hz > 85Hz > 60Hz
Whether it is perceptible or not depends on many factors. The most important is pixel switch time, usually called "response time" (a stupid name imho), which can limit the benefit of faster refresh... but at the same time, higher refresh rates tend to improve this time, especially when paired with very strong overdrive, which tends to produce fewer artifacts the higher the refresh rate used. High refresh rates also improve temporal dithering (kinda important for 6-bit TN panels... :p). Another thing is that when strobing is used there is less visible flickering (very important for CRTs).

Input lag is always reduced as you go to higher and higher refresh rates. Even if we are already using VRR and the framerate cannot go higher (e.g. let's say 90fps), input lag will be higher in 120Hz mode than in 240Hz mode. That is because the image still needs to be drawn on the screen, and the faster the refresh rate, the quicker the whole screen is refreshed.

Also, if you move the cursor around and see doubles, these doubles increase in quantity at higher refresh rates. This is easily visible with the cursor, but it works the same way in games. So it proves that refresh rates higher than 120Hz are clearly visible!

No matter how you slice it, higher refresh rate is visible and helps. It might not be as important or as big of a difference, and in a situation where going crazy high like 240Hz doesn't leave you many display options it won't be worth it, but it is not like it won't be perceptible at all.

I can understand why some people would rather choose a 240Hz TN (which is best for speed anyway) over a 144Hz IPS.
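The scan-out point above can be put in numbers; a minimal sketch, assuming a simple top-to-bottom scan-out model that ignores pixel response and processing lag:

```python
# With VRR at a fixed 90fps, a new frame still has to be scanned out
# at the panel's refresh rate; the worst-case extra latency for the
# last line of the screen is one full scan-out period.
def scanout_ms(refresh_hz: float) -> float:
    """Time to scan the whole screen once, in milliseconds."""
    return 1000.0 / refresh_hz

for hz in (120, 240):
    print(f"{hz}Hz panel: full-screen scan-out takes {scanout_ms(hz):.2f} ms")

# A 240Hz panel delivers the same 90fps frame to the bottom of the
# screen roughly 4.2 ms sooner than a 120Hz panel.
```

So even at a capped framerate, the faster panel plausibly shaves a few milliseconds off worst-case display latency, which is the claim being made here.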
 
There are scholarly articles claiming that many humans can perceive the smoothness up to 1000 Hz.
In an ideal world we would be using eye trackers, and applications would run at very high frame rates and calculate pixels more densely and more frequently in the screen area the human is actually looking at. If we did not have to do whole-screen transfers constantly, it would not require as much transfer speed or as many pixel calculations as today, where the whole screen is refreshed on the assumption that the human is looking everywhere at once. Then 1000Hz (or rather, 1ms latency) would be possible.

OLED displays could easily do it, and using ray-tracing instead of rasterization would make it easy (not to mention this is the solution to making path tracing work in the long run anyway... especially with 8K, 16K, etc.).
Current processors could not hit 1000fps in games, especially if we up the physics... but who knows. AMD could always make a bazillion cores for us =)
 
There are scholarly articles claiming that many humans can perceive the smoothness up to 1000 Hz.

Link some. Because the studies I've seen misused to make this claim were not really making that claim.

What they typically did was put someone in a dark room and flash a light for something less than 1ms, and note that it was detected. Say a bright 0.9ms pulse of light is detected; then internet debaters leap in and go, see: 1/0.9ms => 1000+ fps.

The reality is our eyes are continuous, not discrete, and you need a certain number of photons for something to be detected. So you could keep dropping the time if you increase the brightness.

But this is a one-shot flash out of nothing, and is not indicative of being able to see the difference in motion video.
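The arithmetic being misused, and its flaw, are easy to reproduce; a hedged sketch, where the detection threshold and intensity units are purely illustrative (the real behavior, roughly per Bloch's law of temporal summation, is that brief-flash detection tracks intensity x duration, not duration alone):

```python
# The internet-debate leap: a 0.9 ms flash was detected, therefore
# humans supposedly "see" at more than 1000 fps.
flash_ms = 0.9
implied_fps = 1000.0 / flash_ms
print(f"implied rate: {implied_fps:.0f} fps")  # ~1111

# The flaw: detection of a brief flash depends (roughly) on the total
# light delivered, so a shorter flash is just as detectable if it is
# proportionally brighter. Threshold value here is illustrative only.
def detectable(intensity: float, duration_ms: float, threshold: float = 1.0) -> bool:
    return intensity * duration_ms >= threshold

print(detectable(intensity=1.2, duration_ms=0.9))   # True
print(detectable(intensity=2.4, duration_ms=0.45))  # True: half the time, double the brightness
```

Under that model you can push the "detected" duration arbitrarily low by raising brightness, so the flash experiment says nothing about perceiving motion at 1000 fps.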
 
Whether it is perceptible or not depends on many factors. The most important is pixel switch time, usually called "response time" (a stupid name imho), which can limit the benefit of faster refresh... but at the same time, higher refresh rates tend to improve this time, especially when paired with very strong overdrive, which tends to produce fewer artifacts the higher the refresh rate used. High refresh rates also improve temporal dithering (kinda important for 6-bit TN panels... :p). Another thing is that when strobing is used there is less visible flickering (very important for CRTs).

Input lag is always reduced as you go to higher and higher refresh rates. Even if we are already using VRR and the framerate cannot go higher (e.g. let's say 90fps), input lag will be higher in 120Hz mode than in 240Hz mode. That is because the image still needs to be drawn on the screen, and the faster the refresh rate, the quicker the whole screen is refreshed.

Also, if you move the cursor around and see doubles, these doubles increase in quantity at higher refresh rates. This is easily visible with the cursor, but it works the same way in games. So it proves that refresh rates higher than 120Hz are clearly visible!

No matter how you slice it, higher refresh rate is visible and helps. It might not be as important or as big of a difference, and in a situation where going crazy high like 240Hz doesn't leave you many display options it won't be worth it, but it is not like it won't be perceptible at all.

I can understand why some people would rather choose a 240Hz TN (which is best for speed anyway) over a 144Hz IPS.

From CRT days, most people stopped seeing strobing in the 72-85Hz range. As for tricks like flicking a mouse around at high speed and looking for more ghost images: are you sure you could even tell the difference in a double blind test between 144Hz and 240Hz doing that?

Even if you could, the situation where people claim to need this most is in FPS games, where you don't have a mouse cursor to flick around.

While I have no use for the test in this thread (60Hz vs 240Hz), I would love to see some tests at higher rates.

I am betting that 144Hz vs 240Hz double blind test in games wouldn't yield much in the way of detectable differences.
 