There are scholarly articles claiming that many humans can perceive differences in smoothness up to 1000 Hz.
Put Up, or Shut Up.
There are scholarly articles claiming that many humans can perceive differences in smoothness up to 1000 Hz.
Linus doesn't know shit about shit.
He is a goofy retard who prances around and does stupid shit in front of a camera, but I wouldn't trust anything that comes out of his mouth from a technical perspective. I don't trust him, or Jayz2cents, or any of the idiots on YouTube. YouTube is just a swamp of bad information from people talking out of their asses. There is nothing trustworthy on that entire platform.
Exactly. The YouTube commentators killed this hobby, at least compared to the golden era of the tech boom. Most of them (especially Linus) have no technical background; they're just good salesmen who make YouTube videos that the Gen Z kids eat up. Jayz at least knows a decent amount about watercooling, but Linus is a complete idiot.
We will eventually get 4K displays with 240 Hz
Not with the attitude from this thread.
60fps motion is, for all intents and purposes, already pretty much fluid. For flat gaming, 60 Hz/fps with a little blur or extra input lag is a good medium if you have to compromise between resolution, color reproduction, contrast, and everything else with addressable display technology.
Most people actively refuse to see anything, on top of having pretty terrible eyesight to begin with. All they cared about was size, even if image quality was terrible. Blur definitely doesn't bother people that much, it seems, since I don't remember anybody but some hardcore sports watchers or film buffs complaining at the time about how blurry their newfangled LCDs were compared to the CRT tech they dumped in the trash.
60fps motion is, for all intents and purposes, already pretty much fluid.
Most console games in the PS1, PS3, and PS4 generations are locked at 30fps and still pretty much playable.
30fps is playable alright
You are either joking, have bad eyes, or your definition of playable is very wide.
Example:
FFXI is capped at 29.9fps in its native form.
You can get a plugin that unlocks it to twice the framerate.
It's clearly smoother running at near 60fps than at 30fps.
ABX tested it 16/16.
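For what it's worth, 16/16 is a strong ABX result: under blind guessing each trial is a coin flip, so a perfect run of 16 has about a 1-in-65,000 chance. A quick sketch of that arithmetic (plain binomial math, nothing assumed beyond the trial count):

```python
# Chance of going 16-for-16 in an ABX test by blind guessing.
# Each trial is an independent 50/50 pick under the null hypothesis.
p_guess = 0.5
trials = 16
p_perfect_by_luck = p_guess ** trials
print(f"P(16/16 by chance) = {p_perfect_by_luck:.2e}")  # ~1.53e-05, about 1 in 65,536
```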
30fps is playable alright
And no one here is saying there is no difference, or that anyone should play at locked framerates, be it 30fps or even 60fps.
It is, however, not the end of the world if a game runs at 30fps, and especially at 60fps (which, btw, is still a rarity on consoles).
PC gamers like yourself are just so spoilt that they do not even know what kind of cruel world the majority of gamers have to endure.
What happened to the 0.5ms panels, lol? It's been over a year, no dice. LCD industry ded.
My son just gave me a 240Hz monitor. I have never used FreeSync or G-Sync. G-Sync popped up on my screen; does it actually help with anything?
Yes.
I mean, you could google it and you'd have your answer in a second, but real brief: if G-Sync is supported by your screen and your GPU, the screen syncs its refresh rate to the framerate being produced by the GPU, so you never wind up with tearing, and you get less input lag.
Same with FreeSync.
Traditionally, G-Sync monitors worked with Nvidia GPUs and FreeSync monitors worked with AMD GPUs, but Nvidia has recently enabled FreeSync support in their drivers. You just have to manually enable it.
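To make the mechanism concrete, here's a toy model of what a variable-refresh display does; a minimal sketch assuming a hypothetical 48-240Hz VRR window, not any vendor's actual algorithm:

```python
# Toy sketch of variable refresh (G-Sync/FreeSync): the panel stretches its
# refresh interval to match each frame's render time, clamped to the panel's
# supported VRR window. Illustrative only; real drivers add tricks like
# low-framerate compensation.

PANEL_MIN_HZ, PANEL_MAX_HZ = 48, 240          # hypothetical VRR window
MIN_INTERVAL_MS = 1000 / PANEL_MAX_HZ         # ~4.2 ms
MAX_INTERVAL_MS = 1000 / PANEL_MIN_HZ         # ~20.8 ms

def vrr_interval(frame_time_ms: float) -> float:
    """The display starts a refresh when the frame is ready, within limits."""
    return min(max(frame_time_ms, MIN_INTERVAL_MS), MAX_INTERVAL_MS)

for ft in (6.9, 14.0, 19.5):                  # hypothetical render times
    iv = vrr_interval(ft)
    print(f"frame {ft:5.1f} ms -> refresh every {iv:5.1f} ms ({1000/iv:5.1f} Hz)")
```

Because a refresh always begins on a completed frame, there is never a mid-scan buffer swap, which is why the tearing disappears.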
Yeah, I googled it, but I was mostly wondering if it's worth it. Never even bothered looking them up before, since I did not have a monitor that supported it.
Now that would be nice! I have a 4K 27" and I can get a lot of the things big enough to see, but some are still too small. Damn old people...
I'll fully admit, I've never used it, because I've always been a fan of large high-resolution screens, and G-Sync/FreeSync has been more common in smaller "gaming" monitors.
That said, I am really looking forward to the upcoming generation of 43" 120Hz 4K monitors with FreeSync. It's going to be a relief to no longer have to worry about fixed refresh rates.
I currently do most of my gaming with adaptive vsync on. So, capped to 60Hz unless the GPU can't keep up, and then vsync is disabled, and tearing and other nonsense starts. I would very much like to not have to deal with this. So I'd say all else being equal, it's worth it, and it helps. I wouldn't, however, get a worse-looking or lower-resolution monitor just so I could get G-Sync. Everything is always a tradeoff.
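The adaptive vsync behavior described there boils down to a simple per-frame policy; a rough sketch of the idea, not NVIDIA's actual driver logic:

```python
# Rough sketch of adaptive vsync (illustrative, not the real driver code):
# keep vsync on while the GPU holds the refresh rate, drop it when it can't.

REFRESH_MS = 1000 / 60   # 60 Hz cap

def present_mode(frame_time_ms: float) -> str:
    if frame_time_ms <= REFRESH_MS:
        return "vsync ON: wait for refresh, capped at 60fps, no tearing"
    return "vsync OFF: present immediately, tearing possible"

for ft in (12.0, 16.7, 21.0):   # hypothetical frame render times
    print(f"{ft:4.1f} ms -> {present_mode(ft)}")
```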
Hopefully next-gen GPUs will allow 240Hz IPS panels at up to 4K.
4K at 240Hz
Yes, I need a 240Hz 4K display that is 27" and either IPS or VA. Potentially VA would give better text clarity; that's a top priority. But I want the 240Hz.
Read through this thread, and this is my complaint: I've stopped caring about refresh rates because the panels cannot keep up. I'd settle for a panel that could actually completely refresh every pixel at 120Hz.
Monitors are still retarded.
The optic nerve and everything it connects to are "analog devices," yes
What they typically did was have someone in a dark room, flash a light for something less than 1ms, and note that it was detected. Say a bright 0.9ms pulse of light is detected; internet debaters will then leap in and go: 1/0.9ms, therefore more than 1000 fps.
The reality is our eyes are continuous, not discrete, and you need a certain amount of photons for something to be detected. So you could keep dropping the time if you increase the brightness.
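That intensity-for-time trade is easy to put numbers on: if detection needs some fixed photon count, what matters is intensity times duration, not duration alone. A minimal sketch with a made-up threshold:

```python
# Illustrative photon-budget arithmetic (the threshold is a made-up number):
# if the visual system needs N photons to register a flash, then intensity
# and duration trade off directly, and a short flash just needs to be bright.

THRESHOLD_PHOTONS = 90   # hypothetical detection threshold

for duration_ms in (10.0, 1.0, 0.1):
    rate = THRESHOLD_PHOTONS / duration_ms   # photons/ms needed to be seen
    print(f"{duration_ms:5.1f} ms flash -> needs {rate:7.0f} photons/ms")

# A 10x shorter flash needs a 10x brighter source. Detecting a 0.9 ms flash
# therefore says nothing about resolving 1000+ distinct frames per second.
```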
4K at 240Hz
You can barely run games at 60fps at 4K...
Should be possible with a <15ms (B/W, and "lower" for G2G) response time?
And a monitor pulling that off would still be able to look absolutely horribly blurred with high-contrast changes.
It appears you are basing your logic on an incorrect understanding of how the numbers work in the physical world.
The refresh rate is NOT a seek time, as many think of it.
4K at 240Hz
You can barely run games at 60fps at 4K...
Compared to already-available 120Hz G-Sync monitors, upping this to 240Hz would yield up to 4ms of input lag reduction at the bottom of the screen and 2ms at the center when running below 120fps. This means 240Hz would not be totally pointless even in this case.
Everyone thinks you HAVE to run 240fps to use a 240Hz panel.
But it's about response time, input lag, and general smooth experience.
Hey, 240Hz is definitely not for everyone, but if you're gonna play FPS hardcore, even from the confines of your basement gaming complex, then 240Hz is a very bleeding-edge tool.
If looking at it mathematically, we're just saying that every pixel must refresh completely within 1000/120 milliseconds, or 8.3ms. If the panel can refresh that fast, then there will be no blur.
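Spelling out the arithmetic both of those posts lean on (straightforward math; the mid-screen figure assumes a conventional top-to-bottom scanout):

```python
# Refresh-interval and scanout-lag arithmetic. With a top-to-bottom scanout,
# a pixel at the bottom of the screen is drawn one full interval after the top.

for hz in (60, 120, 240):
    interval_ms = 1000 / hz   # time to refresh every pixel once
    print(f"{hz:3d} Hz -> {interval_ms:5.2f} ms per refresh, "
          f"~{interval_ms / 2:5.2f} ms to reach mid-screen")

# 120 Hz -> 240 Hz halves the interval from 8.33 ms to 4.17 ms: up to ~4 ms
# less lag at the bottom of the screen and ~2 ms at the center, matching the
# figures quoted above.
```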
I think you are making a lot of assumptions and posting them in such a way as to be offensive.
So any monitor with below ~16ms B/W response time, just as I said?
And you are absolutely wrong in your assumption that it is fast enough to not cause blur. Again, just because the numbers "fit" does not mean they are what you think they are.
You are quoting numbers without understanding what they represent.
Take it as an insult or not; it does not change the fact that your statement is incorrect, based on not understanding the terms you are referring to or how they work.
No.
Well, I'm saying one thing and you're saying that I'm saying another, so you can't really judge.
I'm using the terms that I'm using specifically. I'm also not using the terms you are using, specifically.
According to you, having a 16ms pixel response rate:
Read through this thread, and this is my complaint: I've stopped caring about refresh rates because the panels cannot keep up. I'd settle for a panel that could actually completely refresh every pixel at 120Hz.
Monitors are still retarded.
I didn't use the term 'pixel response rate' on purpose
But you did say it was not enough when I asked you about response time.
The optic nerve and everything it connects to are "analog devices," yes
BUT
No. Retinal cells are perfectly capable of registering a single photon, and animals (like us) can perceive it. No device of any description could do better. Your argument is better situated in neurology, not physics as such. The advantage of a photomultiplier over a rod cell has to do with what it connects to, not with the quantum nature of light. A single photon's impact is a Planck-duration event, and "brightness" is only ever an aggregate statistical measure, physically.
For photons considered in isolation, there is no upper limit on perceptible frame rate, because brightness is defined by the speed of their succession. A screen would be too bright to look at before increasing frames/second became imperceptible.
The human eye is very sensitive but can we see a single photon? The answer is that the sensors in the retina can respond to a single photon. But neural filters only allow a signal to pass to the brain to trigger a conscious response when at least about five to nine arrive within less than 100 ms. If we could consciously see single photons we would experience too much visual "noise" in very low light, so this filter is a necessary adaptation, not a weakness.
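That "five to nine photons within ~100 ms" filter is just a counting threshold on a random arrival process; a minimal sketch, with the threshold taken from the paragraph above and the arrival rates made up:

```python
# Minimal model of the neural filter described above: photons arrive at
# random (Poisson), and a conscious detection requires enough of them inside
# a ~100 ms integration window. Threshold from the text; rates illustrative.
import math

WINDOW_MS = 100.0
THRESHOLD = 7   # middle of the "five to nine photons" range

def p_detect(rate_per_ms: float) -> float:
    """P(at least THRESHOLD photon arrivals in one window)."""
    mean = rate_per_ms * WINDOW_MS
    p_below = sum(math.exp(-mean) * mean**k / math.factorial(k)
                  for k in range(THRESHOLD))
    return 1.0 - p_below

for rate in (0.02, 0.07, 0.20):   # hypothetical photons/ms reaching the eye
    print(f"{rate:.2f} photons/ms -> detected with p = {p_detect(rate):.3f}")
# Below the threshold rate, flashes are filtered out as noise; well above it,
# detection is near-certain, which is the adaptation the paragraph describes.
```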