240 Hz is best, quantitatively

Linus doesn't know shit about shit.

He's a goof who prances around and does stupid shit in front of a camera, and I wouldn't trust anything that comes out of his mouth from a technical perspective. I don't trust him, or Jayz2cents, or any of the idiots on YouTube. YouTube is just a swamp of bad information from people talking out of their asses. There is nothing trustworthy on that entire platform.

Most of them (especially Linus) have no technical background; they're just good salesmen who make YouTube videos that the Gen Z kids eat up. Jayz at least knows a decent amount about watercooling, but Linus is a complete idiot.
 
Most of them (especially Linus) have no technical background; they're just good salesmen who make YouTube videos that the Gen Z kids eat up. Jayz at least knows a decent amount about watercooling, but Linus is a complete idiot.
Exactly. The YouTube commentators killed this hobby, at least compared to the golden era of the tech boom.
 
Mark Rejhon's work at Blur Busters (this includes published research) shows benefits for sample-and-hold displays up to 1000 Hz, where persistence is effectively reduced to 1 ms (if you can push 1000 frames per second worth of differentiable data). The cheaper option for motion clarity is scanning, strobing, etc., which is what they are doing with VR displays. Past 1000 Hz I haven't read anything, but I doubt the benefits come to a dead stop at 1001 Hz; it's only a matter of much-diminished returns. Simply owing to the power relation inherent to the phenomenon, it's tough to shave those last few ms of persistence off without the trick of not showing anything at all. For flat gaming, 60 Hz/fps with a little blur or extra input lag is a good medium if you have to compromise between resolution, color reproduction, contrast, and all else with addressable display technology. Blur definitely doesn't seem to bother people that much, since I don't remember anybody but some hardcore sports watchers or film buffs complaining at the time about how blurry their newfangled LCDs were compared to the CRT tech they dumped in the trash.
 
This is a decent article on the discussion:

https://www.blurbusters.com/blur-bu...000hz-displays-with-blurfree-sample-and-hold/

It's a multi-faceted issue. While human eyes cannot detect high-frequency flicker directly, humans can still see stroboscopic effects and persistence-based motion blur from Hertz limitations.

Real life has no frame rate. Frame rates and refresh rate are an artificial digital image-stepping invention of humankind (since the first zoetropes and first movie projectors) that can never perfectly match analog-motion reality.

However, ultra-high frame rates at ultra-high refresh rates (>1000fps at >1000Hz) manage to come very close. This is currently the best way to achieve blurless sample-and-hold with no flicker, no motion blur, and no stroboscopic effects.

Also, real life has no flicker, no strobing and no BFI. Today’s strobe backlight technologies (e.g. ULMB) are a good interim workaround for display motion blur. However, the ultimate displays of the distant future will fully eliminate motion blur without strobing. The only way to do that is ultra-high frame rates & refresh rates.
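
To put rough numbers on the persistence point, here is a quick back-of-the-envelope sketch (mine, not from the article), assuming a full-persistence sample-and-hold display, frame rate equal to refresh rate, instant pixel response, and a hypothetical panning speed of 1920 px/s:

```python
# Rough sketch: sample-and-hold motion blur vs. refresh rate.
# Assumptions (not from the article): frame rate equals refresh rate,
# pixel response is instantaneous, persistence spans the full frame time.
# Rule of thumb: blur width (px) ~= panning speed (px/s) * persistence (s).

PANNING_SPEED_PX_PER_S = 1920  # hypothetical: one 1080p screen width per second

for hz in (60, 120, 240, 480, 1000):
    persistence_ms = 1000 / hz
    blur_px = PANNING_SPEED_PX_PER_S * persistence_ms / 1000
    print(f"{hz:5d} Hz -> {persistence_ms:5.2f} ms persistence "
          f"-> ~{blur_px:5.1f} px of eye-tracking motion blur")
```

That works out to roughly 32 px of blur at 60 Hz versus about 2 px at 1000 Hz, which is the same 1 ms of persistence ≈ 1 px of blur per 1000 px/s of motion rule of thumb Blur Busters uses.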


If all things were equal (cost, GPU demand, picture quality, etc.), I think anyone would take a 240 Hz display over a 60 Hz display. All I know, since I've tested it myself, is that it takes about a second of me moving a cursor on a desktop to tell the difference between 240 and 144 Hz displays. But the diminishing returns do start to kick in pretty hard. Once you introduce BFI/strobing, though, the need for super high refresh rates decreases drastically.
 
For flat gaming, 60 Hz/fps with a little blur or extra input lag is a good medium if you have to compromise between resolution, color reproduction, contrast, and all else with addressable display technology.
60 fps motion is, for all intents and purposes, already pretty much fluid.
Most console games of the PS1, PS3, and PS4 generations are locked to 30 fps and still pretty much playable.

My own quarrel with 60 Hz monitors is more about input lag, and that is itself caused more by terrible V-Sync ON implementations than anything else. Thankfully we have RTSS and VRR monitors, or scanline sync on PC for fixed-refresh monitors.

Still, a higher refresh rate improves input lag and general motion clarity, so that is good.
Of course, on CRTs we already had high refresh rates: most often 85 Hz, and 100+ Hz on better monitors. Even in the DOS era, games ran at 72 Hz...
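
For reference, the frame-time math behind the V-Sync complaint above, under the simplification that a finished frame simply waits for the next vblank (real pipelines can queue extra frames and add more):

```python
# With V-Sync ON on a fixed-refresh display, a finished frame waits for the
# next vblank, so it can sit for up to one full refresh period before being
# shown (more if additional frames are queued ahead of it).

for hz in (60, 85, 100, 120, 144, 240):
    period_ms = 1000 / hz  # time between vblanks
    print(f"{hz:4d} Hz: refresh period {period_ms:6.2f} ms "
          f"-> up to ~{period_ms:.1f} ms extra wait for the next vblank")
```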

Blur definitely doesn't seem to bother people that much, since I don't remember anybody but some hardcore sports watchers or film buffs complaining at the time about how blurry their newfangled LCDs were compared to the CRT tech they dumped in the trash.
Most people actively refuse to see anything, on top of having pretty terrible eyesight to begin with. All they cared about was size, even if the image quality was terrible.
Bad eyesight is not only the ability to resolve contrasty images but also the way the image is processed. It is possible to see badly even with relatively good eyes.

Remember that most people have absolutely terrible settings on their TVs: all image enhancers enabled and set to maximum, wrong contrast, wrong saturation, wrong black point (if such a setting was available), wrong aspect ratio (with explanations like "I like the image to take up the whole screen", apparently because they could not let any cent spent on a large screen go to waste XD), and of course all the 100/120/200/240/600/99999999 Hz modes enabled to get nice motion artifacts. This is far worse than what their TVs could actually display...

Of course their old CRTs usually had wrong settings too: overblown colors, brightness that killed any black level, and in NTSC regions also wrong hue (most PAL sets could not even adjust hue, so it was always proper). On CRTs, thankfully, the available settings could not degrade image quality that much. Then LCD came, people watched a broken image, and if you asked them whether their new TV was better than the old one, they would strongly agree and start talking about size... while pointing to the squashed faces on their improperly set TV...

Same goes for games. The X360/PS3 looked far better on old CRTs than on most HD TVs, but almost no one would say that. 1280x720 upscaled to whatever resolution the panel had supposedly looked so much better, after all the sharpening that made pixels hurt your eyes, because the TV was bigger... and don't get me started on the people who always claim they cannot see input lag and that it does not bother them, even on sets with something like 100 ms of input lag, where I could tell their TV had input lag just by watching them play. Yup, it is impossible not to notice, but they actively choose not to, for whatever reason...
 
Who the hell cares about the science behind it? Buy and use whatever makes you happy and fuck the opinions or "objectivity" of everybody else.
 
60 fps motion is, for all intents and purposes, already pretty much fluid.
Most console games of the PS1, PS3, and PS4 generations are locked to 30 fps and still pretty much playable.

You are either joking, have bad eyes, or your definition of playable is very wide.
Example:
FFXI is capped at 29.9 fps in its native form.
You can get a plugin that unlocks it to twice the framerate.
It's clearly smoother running at near 60 fps than at 30 fps.

ABX tested it: 16/16.
 
For me, 90+ fps feels like the sweet spot where I can't really tell anymore, though my screen goes all the way to 165 Hz. Any reduction in input lag will make you react faster, I guess. I'm not exactly a pro FPS gamer, but I'm still capable for the most part.
 
You are either joking, have bad eyes, or your definition of playable is very wide.
Example:
FFXI is capped at 29.9 fps in its native form.
You can get a plugin that unlocks it to twice the framerate.
It's clearly smoother running at near 60 fps than at 30 fps.

ABX tested it: 16/16.
30 fps is playable, all right,
and no one here is saying there is no difference or that anyone should play at locked framerates, be it 30 fps or even 60 fps.

It is, however, not the end of the world if a game runs at 30 fps, and especially at 60 fps (which, by the way, is still a rarity on consoles).

PC gamers like yourself are just so spoilt that they do not even know what kind of cruel world the majority of gamers have to endure.
 
30 fps is playable, all right,
and no one here is saying there is no difference or that anyone should play at locked framerates, be it 30 fps or even 60 fps.

It is, however, not the end of the world if a game runs at 30 fps, and especially at 60 fps (which, by the way, is still a rarity on consoles).

PC gamers like yourself are just so spoilt that they do not even know what kind of cruel world the majority of gamers have to endure.


I appreciate the sentiment, but I'd argue that you are wrong here. 30fps LOOKS mostly ok, at least in some titles, but the moment I grab the mouse I can feel that something is horribly wrong. The amount of mouse lag is just unbearable. The only reason it isn't that noticeable on consoles is that playing on a controller is such a terrible and disconnected experience that you can't really tell. If mice were in more widespread use on consoles, console players would have the same distaste for anything under 60fps.

I've spent some time playing vsynced to 50hz on a TV with European 50hz display modes, and it was tolerable on single player games, but for anything multiplayer I'd consider 60fps to be the absolute bare minimum I don't want to ever drop below, not even for a second.
 
I think 240 Hz is better, but I'm not so sure about the current implementations.

What happened to the 0.5 ms panels, lol? It's been more than a year, no dice. LCD industry dead.
 
30 fps is playable, all right,
and no one here is saying there is no difference or that anyone should play at locked framerates, be it 30 fps or even 60 fps.

It is, however, not the end of the world if a game runs at 30 fps, and especially at 60 fps (which, by the way, is still a rarity on consoles).

PC gamers like yourself are just so spoilt that they do not even know what kind of cruel world the majority of gamers have to endure.



These are subjective statements; there is no point in debating them, as they are, well, subjective.
It might be OK to you, and yeah, it was "playable", but definitely not a good experience.

Objectively speaking, there is a difference between 30 fps and 60 fps that can be measured and proven, whether you accept it or not.
Anything else is totally individual opinion, and if you decide to debate on subjective terms, you have to accept that you are not "universally right"; you just have a preference.


In other words:
"Playable for you might not be playable for others," just as some of your favorite food does not taste good to others.
If you can't accept that, you shouldn't really be debating subjectively to begin with.
 
What happened to the 0.5 ms panels, lol? It's been more than a year, no dice. LCD industry dead.

Read through this thread, and this is my complaint: I've stopped caring about refresh rates because the panels cannot keep up. I'd settle for a panel that could actually completely refresh every pixel at 120Hz.

Monitors are still retarded.
 
I think a 144/165 Hz 1440p panel is far superior in everything else. If you are literally a twitch gamer who competes at a really high level, 240 Hz might have merit. But as the middle ground of hardware diminishing returns, 1440p is the best, I feel; it doesn't tax hardware like 4K does. If you want the 4K experience, then it's about individual choice and what you prefer. At 240 Hz the differences must be minuscule. If I could choose the monitor that fits me best regardless of cost, I would pick my 1440p 144/165 Hz every time; it is the best of all worlds. My monitor did cost me a hefty 1000 USD though, and it is IPS with low input lag.
 
My son just gave me a 240 Hz monitor. I have never used FreeSync or G-Sync. G-Sync popped up on my screen; does it actually help with anything?
 
My son just gave me a 240 Hz monitor. I have never used FreeSync or G-Sync. G-Sync popped up on my screen; does it actually help with anything?

Yes.

I mean, you could Google it and you'd have your answer in a second, but real brief: if G-Sync is supported by your screen and your GPU, the screen syncs its refresh rate to the framerate being produced by the GPU, so you never wind up with tearing, and you get less input lag.

Same with FreeSync.

Traditionally G-Sync monitors worked with Nvidia GPUs and FreeSync monitors worked with AMD GPUs, but Nvidia has recently enabled FreeSync support in their drivers. You just have to manually enable it.
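
If it helps to picture what the syncing actually changes, here is a purely conceptual sketch (not the real DisplayPort/HDMI signaling, and ignoring low-framerate compensation; the frame times and the 144 Hz cap are made-up examples): a fixed-refresh screen repaints on its own clock, while a VRR screen refreshes when the GPU finishes a frame, within the panel's limits.

```python
# Conceptual sketch only: when frames become visible on a fixed-refresh
# display vs. a VRR (G-Sync/FreeSync) display. Timings are hypothetical.

def fixed_refresh(present_ms, hz=60):
    """With V-Sync on a fixed clock, each frame appears at the next vblank tick."""
    period = 1000 / hz
    return [round(((t // period) + 1) * period, 1) for t in present_ms]

def vrr(present_ms, max_hz=144):
    """The panel refreshes when a frame arrives, but no sooner than 1/max_hz
    after the previous refresh. (Real panels also repeat frames below their
    minimum rate -- low-framerate compensation -- which this sketch ignores.)"""
    min_gap = 1000 / max_hz
    shown, last = [], float("-inf")
    for t in present_ms:
        last = max(t, last + min_gap)
        shown.append(round(last, 1))
    return shown

frames = [3.0, 19.0, 41.0, 55.0]      # hypothetical GPU frame-completion times (ms)
print(fixed_refresh(frames))          # quantized to 16.7 ms ticks -> judder and lag
print(vrr(frames))                    # shown (almost) as soon as they are ready
```

The gap between when the GPU finishes a frame and when it actually reaches the screen is the extra lag (and judder) that VRR removes.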
 
Yes.

I mean, you could Google it and you'd have your answer in a second, but real brief: if G-Sync is supported by your screen and your GPU, the screen syncs its refresh rate to the framerate being produced by the GPU, so you never wind up with tearing, and you get less input lag.

Same with FreeSync.

Traditionally G-Sync monitors worked with Nvidia GPUs and FreeSync monitors worked with AMD GPUs, but Nvidia has recently enabled FreeSync support in their drivers. You just have to manually enable it.
Yeah, I googled it, but I was mostly wondering if it's worth it. I never even bothered looking them up before since I didn't have a monitor that supported it.
 
Yeah, I googled it, but I was mostly wondering if it's worth it. I never even bothered looking them up before since I didn't have a monitor that supported it.

I'll fully admit I've never used it, because I've always been a fan of large, high-resolution screens, and G-Sync/FreeSync has been more common in smaller "gaming" monitors.

That said, I am really looking forward to the upcoming generation of 43" 120 Hz 4K monitors with FreeSync. It's going to be a relief to no longer have to worry about fixed refresh rates.

I currently do most of my gaming with adaptive vsync on. So, capped to 60 Hz unless the GPU can't keep up, at which point vsync is disabled and tearing and other nonsense starts. I would very much like to not have to deal with this. So I'd say, all else being equal, it's worth it and it helps. I wouldn't, however, get a worse-looking or lower-resolution monitor just so I could get G-Sync. Everything is always a tradeoff.
 
I'll fully admit I've never used it, because I've always been a fan of large, high-resolution screens, and G-Sync/FreeSync has been more common in smaller "gaming" monitors.

That said, I am really looking forward to the upcoming generation of 43" 120 Hz 4K monitors with FreeSync. It's going to be a relief to no longer have to worry about fixed refresh rates.

I currently do most of my gaming with adaptive vsync on. So, capped to 60 Hz unless the GPU can't keep up, at which point vsync is disabled and tearing and other nonsense starts. I would very much like to not have to deal with this. So I'd say, all else being equal, it's worth it and it helps. I wouldn't, however, get a worse-looking or lower-resolution monitor just so I could get G-Sync. Everything is always a tradeoff.
Now that would be nice! I have a 4K 27" and I can get a lot of things big enough to see, but some are still too small. Damn old people....

I mostly play VR games, but sometimes do a little 2D gaming. My son is the one who gives me these kinds of things. I do like the RGB mechanical keyboard, so I can see the keys better. :) I will put the other monitor next to this one and use the 4K for the garage comp.
 
Hopefully next-gen GPUs will allow 240 Hz IPS panels at up to 4K.

Yes, I need a 240 Hz 4K display that is 27" and either IPS or VA. Potentially VA would give better text clarity; that's a top priority. But I want the 240 Hz :)
 
240 Hz VA? Unless things are really progressing, I think the blur would make this academic. VAs have relatively poor pixel response times, so I imagine they will always be a step behind the SOTA in refresh rates. Maybe an IGZO VA panel will be faster..
 
Read through this thread, and this is my complaint: I've stopped caring about refresh rates because the panels cannot keep up. I'd settle for a panel that could actually completely refresh every pixel at 120Hz.

Monitors are still retarded.

Should be possible with a <15 ms response time (B/W, and "lower" for G2G),
and a monitor pulling this off could still look absolutely horribly blurred with high-contrast changes.

It appears you are basing your logic on an incorrect understanding of how the numbers work in the physical world.
The refresh rate is NOT a seek time, as many think of it.
 
What they typically did was have someone in a dark room, flash a light for something less than 1 ms, and note that it was detected. Say a bright 0.9 ms pulse of light. Detected, and then the internet debaters leap in and go, see: 1/0.9 ms => over 1000 fps.

The reality is our eyes are continuous, not discrete,
The optic nerve and everything it connects to are "analog devices," yes
BUT
and you need a certain amount of photons for something to be detected. So you could keep dropping the time if you increase the brightness.
No. Retinal cells are perfectly capable of registering a single photon, and animals (like us) can perceive it. No device of any description could do better. Your argument is better situated in neurology, not physics as such. The advantage of a photomultiplier over a rod cell has to do with what it connects to, not with the quantum nature of light. A single photon's impact is a Planck-duration event, and "brightness" is only ever an aggregate statistical measure physically.

For photons considered in isolation, there is no upper limit on perceptible frame rate, because brightness is defined by the speed of their succession. A screen would be too bright to look at before increasing frames/second became imperceptible.
 
4K at 240 Hz?
You can barely run games at 60 fps at 4K...

Everyone thinks you HAVE to run 240 fps to use a 240 Hz panel.

But it's about response time, input lag, and a generally smooth experience.

Hey, 240 Hz is definitely not for everyone, but if you're going to play FPS hardcore, even from the confines of your basement gaming complex, then 240 Hz is a very bleeding-edge tool.
 
Should be possible with a <15 ms response time (B/W, and "lower" for G2G),
and a monitor pulling this off could still look absolutely horribly blurred with high-contrast changes.

Looking at it mathematically, we're just saying that every pixel must refresh completely within 1/120 of a second, or about 8.3 ms. If the panel can refresh that fast, then there will be no blur.

It appears you are basing your logic on an incorrect understanding of how the numbers work in the physical world.
The refresh rate is NOT a seek time, as many think of it.

I think you are making a lot of assumptions and posting them in such a way as to be offensive.
 
Everyone's priorities are different. Mine fall in the following order.

1.) Do whatever it takes to get a framerate that NEVER drops below 60 fps.

2.) Get the best resolution and image quality I can by turning up settings.

3.) Enjoy any additional framerate possible above 60 fps once all quality settings are maxed.


So it's not that I wouldn't like framerates above 60. It's just that it's not my top priority. I prefer spending my GPU cycles on image quality and immersion.
 
Everyone thinks you HAVE to run 240 fps to use a 240 Hz panel.

But it's about response time, input lag, and a generally smooth experience.

Hey, 240 Hz is definitely not for everyone, but if you're going to play FPS hardcore, even from the confines of your basement gaming complex, then 240 Hz is a very bleeding-edge tool.
Compared to already-available 120 Hz G-Sync monitors, upping this would yield an input lag reduction of up to 4 ms at the bottom of the screen and about 2 ms at the center when running below 120 fps. This means 240 Hz would not be totally pointless even in this case.
It would, however, be very little of an improvement and not something even remotely worth wanting at this time. If there were no VRR, it would be a totally different story...

Personally, I would want current 120 Hz tech to become much cheaper, and/or HDMI 2.1 with VRR support on the GPU side, and OLED monitors, and world peace, and CO2 emission reductions, so a 240 Hz 4K monitor somehow doesn't register in my mind as anything I think is needed ;)
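
As a quick sanity check on those numbers, assuming the panel scans each frame out top to bottom over one refresh period (which is roughly how these panels behave):

```python
# Scanout time per frame at the panel's native rate; the bottom of the screen
# is updated a full scanout after the top, the center about half that.

for hz in (120, 240):
    scanout_ms = 1000 / hz
    print(f"{hz} Hz: full scanout ~{scanout_ms:.2f} ms, "
          f"screen center reached after ~{scanout_ms / 2:.2f} ms")

# Going from 120 Hz to 240 Hz: the bottom of the screen arrives ~8.33 - 4.17
# = ~4.2 ms earlier and the center ~2.1 ms earlier, i.e. the "up to 4 ms /
# 2 ms" figures quoted above.
```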
 
Looking at it mathematically, we're just saying that every pixel must refresh completely within 1/120 of a second, or about 8.3 ms. If the panel can refresh that fast, then there will be no blur.

I think you are making a lot of assumptions and posting them in such a way as to be offensive.

So any monitor with below ~16 ms B/W response time, just as I said?

And you are absolutely wrong in your assumption that that is fast enough not to produce blur. Again, just because the numbers "fit" does not mean they are what you think they are.
You are quoting numbers without understanding what they represent.

Take it as an insult or not; it does not change the fact that your statement is incorrect, based on not understanding the terms you are referring to or how they work.

I'll let you answer my question above first, to make sure I am getting you right, and then I'll explain why your logic is incorrect.


-- edit --
Seems like my typing is just extraordinarily bad today.
 
So any monitor with below ~16 ms B/W response time, just as I said?

No.

And you are absolutely wrong in your assumption that that is fast enough not to produce blur. Again, just because the numbers "fit" does not mean they are what you think they are.
You are quoting numbers without understanding what they represent.

Well, I'm saying one thing and you're saying that I'm saying another, so you can't really judge.

Take it as an insult or not; it does not change the fact that your statement is incorrect, based on not understanding the terms you are referring to or how they work.

I'm using the terms that I'm using specifically. I'm also not using the terms you are using, specifically.
 
No.

Well, I'm saying one thing and you're saying that I'm saying another, so you can't really judge.

I'm using the terms that I'm using specifically. I'm also not using the terms you are using, specifically.

Correct; you are just not understanding what the terms mean.

According to you, a 16 ms pixel response time is not enough to update every pixel on a 120 Hz display.
That is incorrect, and it's based on you thinking a pixel's response time is the time it takes a pixel to change from one color to another... it is not.
The pixel response time covers both the rise AND the fall of the pixel's color, i.e. going from one color... AND back again.
So a 16 ms pixel response time (assuming, for this example, equal rise and fall) would be 8 ms each way: exactly faster than what you said yourself was needed to put the right color on the pixel of a 120 Hz display.

OK, this was really just a side note showing that you did not understand how pixel response time is measured, or what the numbers on a display's spec sheet actually mean.



Your main point was that if the pixel could change color faster than 8.3 ms, there would be no blur on a 120 Hz display.
AGAIN, this is incorrect, because you believe this is a waiting delay before an instant operation kicks in.
In fact, that is not the case:
the 8.3 ms is the duration of the process of changing from one color to another.

So in your example, an 8.3 ms transition on a 120 Hz display (an ~8.33 ms refresh interval) would mean your pixels are showing an incorrect, in-between color
8.3 / 8.33 * 100 ≈ 99.6% of the time.

A screen showing in-between colors essentially all of the time, instead of the correct one, does not in fact become blur-free; it will in fact have SEVERE blur.


This is why it's important to understand what the numbers mean, beyond just checking whether they "fit mathematically".

Let's even double the speed of the pixel response time and put it at 4 ms.
At that speed the pixel would still be showing in-between colors about 48% of the time.

Of course, the blurring would be perceived differently depending on the contrast of the new/old pixel colors,
but your argument that it would become blur-free is incorrect, and based on a missing understanding of how things work.



In short: things do not work the way you think they work.


People in general really should start to read up on stuff before they make statements that <insert object> is retarded.
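
The same arithmetic as a tiny sketch, assuming the idealized linear transition used in the argument above, where a pixel takes `response_ms` to settle after each new frame (real LCD transitions are nonlinear, and spec-sheet G2G figures are measured differently):

```python
# Fraction of each refresh interval a pixel spends mid-transition, i.e.
# showing an in-between color, under the idealized model described above.

REFRESH_HZ = 120
interval_ms = 1000 / REFRESH_HZ                    # ~8.33 ms per refresh

for response_ms in (8.3, 4.0, 1.0):
    frac = min(response_ms, interval_ms) / interval_ms
    print(f"{response_ms:4.1f} ms transition -> mid-transition "
          f"~{frac:.0%} of every refresh interval")
```

Which reproduces the roughly 100% and 48% figures above, plus a small but nonzero fraction even at 1 ms.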
 
Read through this thread, and this is my complaint: I've stopped caring about refresh rates because the panels cannot keep up. I'd settle for a panel that could actually completely refresh every pixel at 120Hz.

Monitors are still retarded.

I didn't use the term 'pixel response rate' on purpose


But you did say it was not enough when I asked you about response time. I asked on purpose, because I knew you would try to weasel out and move the goalposts.
And how convenient that you leave out the entire response to your main argument because... well, I assume you have realized by now that you were incorrect, but pride stops you from admitting it.


Just to illustrate it more graphically, so it makes more sense to people why you were incorrect:

[Image: pixelschange.png]



As we can see, after each refresh cycle delivers a new color instruction to the pixel, the pixel still spends most of its time in an incorrect color.
This is what causes the trailing blur in high-contrast areas when the picture moves.


Again, understand that this is not a delay followed by an instant pixel change. It is an ongoing change of the color, and that is how the blur comes into existence.

Even a 1 ms transition would cause some blur (whether it's perceivable is another debate).


Bottom line: the argument that if the pixel is just fast enough to reach its color before the next refresh cycle there will be no blur is, as shown, totally incorrect.
 
But you did say it was not enough when I asked you about response time.

Because response time is nebulous and prone to being abused by manufacturers.

You can stop now. I've quoted only small parts of your posts because you're reading books out of my sentences and then forming arguments against things that I did not say.

Any further posts will be interpreted as personal attacks. Hit unsubscribe, ignore, whatever you have to do.
 
Eh, I think the benefit of 240 Hz depends on the person. Some people may see the benefit but then end up not performing any better than they were on a 60 Hz display. I would say 240 Hz has personally improved my gameplay. Prior to getting a 240 Hz monitor, my K/D in FPS games was usually no higher than 3. Apex Legends was the first FPS title that I started off playing with a 240 Hz monitor right off the bat, and with season 2 out they finally released the stat tracker; I was pretty surprised to see that my K/D had surpassed 4.0 when previously I could never get above 3.0 in any other FPS I played, from CoD to BF to PUBG. I definitely wouldn't go back to 144 Hz for competitive online shooters.
 

[Attachment: Apex KDR.png]
The optic nerve and everything it connects to are "analog devices," yes
BUT

No. Retinal cells are perfectly capable of registering a single photon, and animals (like us) can perceive it. No device of any description could do better. Your argument is better situated in neurology, not physics as such. The advantage of a photomultiplier over a rod cell has to do with what it connects to, not with the quantum nature of light. A single photon's impact is a Planck-duration event, and "brightness" is only ever an aggregate statistical measure physically.

For photons considered in isolation, there is no upper limit on perceptible frame rate, because brightness is defined by the speed of their succession. A screen would be too bright to look at before increasing frames/second became imperceptible.

If you want to get pedantic, we can't perceive single photons.
Can a Human See a Single Photon? http://www.math.ucr.edu/home/baez/physics/Quantum/see_a_photon.html

The human eye is very sensitive but can we see a single photon? The answer is that the sensors in the retina can respond to a single photon. But neural filters only allow a signal to pass to the brain to trigger a conscious response when at least about five to nine arrive within less than 100 ms. If we could consciously see single photons we would experience too much visual "noise" in very low light, so this filter is a necessary adaptation, not a weakness.


As much as the eyes can do great things, they are often quite limited in real-time usage: for example, people quote massive dynamic ranges but fail to point out that it takes many minutes to dark-adapt, and 30+ minutes to fully dark-adapt and see the faintest light emissions.

Also, the "frame rate" of our eyes is closer to the flicker-fusion threshold, and this tends to be in the 60-100 Hz range for humans. We can still detect artifacts beyond this, but not because we can perceive higher frame rates. It's because if you move an object under a strobing light, it will have one flashed position for each strobe, and we can see the strobed positions. You could create massively visible strobe-rate artifacts like this, but that has nothing to do with the frame rate of our eyes. A still camera with a long exposure would capture all the strobe positions in one image; that isn't a high frame rate, it's a frame rate of even 1 Hz capturing the artifact.

But in nature, the light sources don't strobe.

I find it hilarious that people want to introduce strobing backlights (to fix the sample-and-hold "artifact"), thereby introducing a strobing artifact, and then complain that they need higher frame rates because of the strobing artifact.

Sample-and-hold blurring is NOT an artifact; it's how natural vision works in the natural world. Spin a bicycle wheel, or wave your hand back and forth in natural light: they blur immediately, because our vision is actually very poor at perceiving fast changes.
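
To make the strobed-positions point concrete, a small sketch (the 2000 px/s tracking speed is a made-up example): with a strobed or low-persistence image, a moving object appears as discrete copies spaced by speed divided by strobe rate, regardless of any "frame rate" of the eye itself.

```python
# Spacing of the discrete flashed positions ("phantom array") a moving object
# leaves under a strobed display, for a hypothetical eye-tracking speed.

SPEED_PX_PER_S = 2000  # hypothetical fast pan / eye-tracking motion

for strobe_hz in (60, 120, 240, 1000):
    step_px = SPEED_PX_PER_S / strobe_hz
    print(f"{strobe_hz:5d} Hz strobe -> flashed copies ~{step_px:5.1f} px apart")
```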
 