Uncovering The Truth Behind 30 vs. 60 FPS

HardOCP News

[H] News
The fellas over at TechnologyX decided to rehash the ol' 30 vs. 60 FPS debate. Click the comments link below and pass the popcorn. :D

Many gamers will say that there is no difference at all between 30 and 60 FPS, or any other frame-rate above 30 for that matter: that as long as the frame-rate is a constant 30 FPS, or close to it, it will be ‘buttery smooth’ and provide an enjoyable experience.
 
*cough*BULLSHIT*cough*

As someone who regularly films in HD (1080/60), the framerate drop to 30 is ABSOLUTELY NOTICEABLE even if you keep it perfectly constant.
 
*cough*BULLSHIT*cough*

As someone who regularly films in HD (1080/60), the framerate drop to 30 is ABSOLUTELY NOTICEABLE even if you keep it perfectly constant.

You obviously didn't read the article, as it doesn't at all advocate 30 FPS as good enough. :p
 
*cough*BULLSHIT*cough*

werd


I don't understand how anyone could look at 30/60/120/etc and NOT notice a difference. Perhaps it's similar to hearing high-pitched frequencies, where some people (usually older) simply can't hear anything past a certain point?
 
There is no doubt that 60 fps looks smoother and crisper. It also looks 'different'. And that is a difference people will just have to become accustomed to over time. All that said, 60fps is a serious performance drag on hardware. A lot of times I would rather play at 30 fps if it meant better care was taken to other graphical considerations. In other words, don't give me 60fps if the rest of your game looks like shit.
 
The quote in the OP would make you think the article is about how there is no difference, but it's pretty much the opposite. Just a heads up to anyone else who will just declare bullshit without reading. Personally I'd sacrifice some graphical effects to stay above 60 FPS, but then again I've spoiled myself after getting used to playing on a 144Hz panel.
 
gsync is really neat, it makes 45fps seem like 60fps.

because of this, i genuinely think there is more to the debate than just "more frames is automatically better." 60 frames delivered as 54 in the first 1/10th of a second and 6 spread over the remaining 9/10ths still averages 60fps for that second, but we can all agree it would fucking suck.
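to put rough numbers on that, here's a quick Python sketch (mine, purely illustrative, not from the article):

# Toy example: average fps can hide terrible frame pacing.
# 60 frames delivered as 54 in the first 0.1 s and 6 spread over
# the remaining 0.9 s still "averages" 60 fps for that second.
frame_times = [0.1 / 54] * 54 + [0.9 / 6] * 6   # seconds per frame

total_time = sum(frame_times)                    # ~1.0 s
avg_fps = len(frame_times) / total_time          # ~60 fps on paper
worst_fps = 1.0 / max(frame_times)               # ~6.7 fps during the tail

print(f"average: {avg_fps:.1f} fps, worst stretch: {worst_fps:.1f} fps")

the average looks fine; the frame-time list is what tells you it would suck.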

but they are fucking console games. the gameplay is really fucking slow, even compared to 20 year old quake3. 30fps is fine for grandma's tv.
 
Article is blocked here. I could add it to the Barracuda, but I'm fairly certain I get the finer points short of the slant the author puts on it. There is absolutely a difference. Not just with raw frame rates, but interpolated rates too. Also, with CRTs, a difference between frames and fields when interlaced. I'm sensitive to this sort of thing, but I would hope most people would be. The 60hz flicker of fluorescent lights bothers me.

60fps should be the absolute standard for games. When I play something like Bayonetta 2, Mario Kart 8, Wolfenstein TNO, Borderlands, etc. the games just feel right. As soon as I'm playing something either locked at 30, or that bounces around a lot, it pulls me right out of the experience. The only game running lower than 60 that I've been able to put up with recently is The Evil Within. It's slow paced, and has enough weird effects overlaid that it's a bit less noticeable. I still think it would be much better at a true 60 with the same effects applied. I'm indifferent to 120 or something interpolated to be higher. It looks great, but it's unnecessary for me. I do notice the difference, but at that point I think the return is somewhat diminished compared to the difference between 30 and 60.
 
I do feel 30fps is more immersive; 60fps feels too smooth and it's sort of like fast-forward.

With that said, I think it depends on the game. 60fps is probably good for sports and driving games, but third-person RPGs and the like I think are better at 30fps.
 
I've never heard anyone say "There's no difference between 30 and 60 FPS" as this article says some claim.

There is a clear difference. Especially when you see them side by side.
 
First company to bring back CRTs wins. Huge. Between increasing resolutions, pixel speeds, etc., CRT is the answer to a lot of self-created problems. If Sony wishes to fix their ailing recent financials, this one is certainly available to them.
 
Just tried playing Diablo with a max frame rate of 30 as opposed to 60 (on a non-G-Sync monitor).

Let's just say, movies are one thing, but I no longer find 30 fps remotely tolerable; I can certainly see the game taking frame 'steps', if you will.

I believe 60 fps is the sort of thing where if you get used to it, 30 fps is no longer tolerable, probably the same at higher fps.
 
60fps is a serious performance drag on hardware. A lot of times I would rather play at 30 fps if it meant better care was taken to other graphical considerations. In other words, don't give me 60fps if the rest of your game looks like shit.

That is a very odd comment... how is 60fps a "drag" on hardware compared to 30fps? I mean, obviously it takes more GPU power to run at 60fps, but your comment implies that you benefit somehow by simply limiting it to 30. Even if 60fps is a "drag" on hardware, what's the worst that would result? FPS occasionally dropping to 30, or maybe 45 with vsync and triple buffering? That still wouldn't be any worse than simply limiting yourself to 30 from the outset.

I have a 120hz monitor and even the difference between 60 and 120 is immediately obvious. I show my friends, and all I have to do is switch from 60hz to 120hz once and their reaction is usually "whoa".
 
Article is blocked here. I could add it to the Barracuda, but I'm fairly certain I get the finer points short of the slant the author puts on it. There is absolutely a difference. Not just with raw frame rates, but interpolated rates too. Also, with CRTs, a difference between frames and fields when interlaced. I'm sensitive to this sort of thing, but I would hope most people would be. The 60hz flicker of fluorescent lights bothers me.

60fps should be the absolute minimum for games. When I play something like Bayonetta 2, Mario Kart 8, Wolfenstein TNO, Borderlands, etc. the games just feel right. As soon as I'm playing something either locked at 30, or that bounces around a lot, it pulls me right out of the experience. The only game running lower than 60 that I've been able to put up with recently is The Evil Within. It's slow paced, and has enough weird effects overlaid that it's a bit less noticeable. I still think it would be much better at a true 60 with the same effects applied. I'm indifferent to 120 or something interpolated to be higher. It looks great, but it's unnecessary for me. I do notice the difference, but at that point I think the return is somewhat diminished compared to the difference between 30 and 60.
FTFY.

I'm also one of those people who have been spoiled by 144 Hz. Frankly, my eyes hurt while playing anything at less than 60 FPS after a while. I could play PC games all day, but I have to do consoles in 1-2 hour chunks because of the low framerate.

So I say, the higher the better!
 
As the article says, the reduced input lag from higher frames alone justifies 60fps, even if the human eye couldn't perceive the difference.

Also, you cannot frame this the same as the 24fps vs 48fps debate for film. Film has natural motion blur interpolating between frames. Games do not have this and therefore require higher frame rates to look smooth. A low framerate in a game will look choppy rather than "cinematic" like people argue for low frame-rate film. If you try to add motion blur to a game as a post-process to compensate, you end up increasing the input latency yet again.
 
As the article says, the reduced input lag from higher frames alone justifies 60fps, even if the human eye couldn't perceive the difference.

Very true! 30fps feels sloppy (most of the time). I can still play things like Ocarina of Time and other older games, because, well, it must be that I'm used to them that way from way back. (That said, they'd certainly be better at 60. :cool: )

When using a mouse especially, I can REALLY feel the slop. The Evil Within definitely demonstrates this. The reason I can still play this game like this though, and not just toss it aside like I would with most games that felt like that, is that the whole game is a messy experience, from the aesthetic, to the hallucinations, to the film-grain-Japanese-horror look, etc. The dissociative feel from the input lag/slop actually seems to cement the mental twisting going on for some reason. I was able to allow that into my suspension of disbelief where with most games it would totally kill it. This makes me think it CAN be part of the game to a degree, but in most cases it just makes me want to retch on the devs. :D
 
Many gamers will say that there is no difference at all between 30 and 60 FPS, or any other frame-rate above 30 for that matter.
Excuse me?

There is a significantly noticeable difference.
 
Don't know what it is, it just feels weird in some games; it's sort of like watching a movie at 60fps, it just looks weird.

I know what you're talking about. A lot of British TV is like this, as are things like soap operas, and normal TV when they try to simulate a video camera. It looks "too smooth" in those cases, and to me it's a little weird sometimes.

For me though, this is TOTALLY NOT THE CASE with games. The smoother the better with games.
 
As the article says, the reduced input lag from higher frames alone justifies 60fps, even if the human eye couldn't perceive the difference.

Also, you cannot frame this the same as the 24fps vs 48fps debate for film. Film has natural motion blur interpolating between frames. Games do not have this and therefore require higher frame rates to look smooth. A low framerate in a game will look choppy rather than "cinematic" like people argue for low frame-rate film. If you try to add motion blur to a game as a post-process to compensate, you end up increasing the input latency yet again.

Input latency? Isn't the human brain kind of slow in responsiveness though? It seems like blaming the computer for accepting input slowly is sort of splitting hairs when you factor in network laggies and brain laggies. I'm sure though, geezer 35 year old somewhere is blaming input lag and motion blur post processing effects for losing to a little kid that's got better reflexes and spends more time playing something.
 
Input latency? Isn't the human brain kind of slow in responsiveness though? It seems like blaming the computer for accepting input slowly is sort of splitting hairs when you factor in network laggies and brain laggies. I'm sure though, geezer 35 year old somewhere is blaming input lag and motion blur post processing effects for losing to a little kid that's got better reflexes and spends more time playing something.

Input lag is very noticeable if you're playing a skill-based twitch game. In fighting games like Street Fighter and shooters like Quake/Quake III/Unreal Tournament, it's VERY easy to notice when something's not right between your input and what's happening on screen. The brain actually processes this information faster than some people give it credit for. It may depend heavily on training and practice though, so it may not be apparent to everyone. This is based on my own observations though. I haven't been hooked up to scanning equipment for this. :D

People who race cars, play certain sports, play a lot of fast paced games, fly jets, etc. would all notice a few milliseconds of lag.

I design analog synthesizers, and can easily notice a few milliseconds of lag when interfacing to external controllers, like MIDI-to-CV conversion via a MIDI keyboard. (I'm also a composer, and play the piano.) If I hit a key and don't hear a sound for 10-20 milliseconds (which is actually widely considered acceptable), it tugs on my brain a bit, and doesn't feel right.
 
Excuse me?

There is a significantly noticeable difference.
Read the article in the link...

The actual article is taking the widespread myths and misinformation about framerate and debunking them, proving that higher framerate is better.

This is a great section, attacking the misinformation that the human eye can only see 30 FPS:
We do know that the human eye is designed to detect motion and changes in light, the world being an infinite place, our eyes are constantly being streamed information without pause. If your eyes could only see a max of 30 FPS, you’d likely miss a lot of things just because they happened too fast. If that isn’t enough here is a quote from an article from way back in 2001 by Dustin D Brand that was tackling this exact same question:
“The USAF (United States Air Force), in testing their pilots for visual response time, used a simple test to see if the pilots could distinguish small changes in light. In their experiment a picture of an aircraft was flashed on a screen in a dark room at 1/220th of a second. Pilots were consistently able to “see” the after image as well as identify the aircraft. This simple and specific situation not only proves the ability to perceive 1 image within 1/220 of a second, but the ability to interpret higher FPS. “
So you see, if this example is any evidence, then the human eye can not only perceive images at greater than 30 frames per second, but well above 200 as well.
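Just to put the quoted numbers side by side, a quick sketch (my arithmetic, not the article's):

flash_ms = 1000.0 / 220     # the USAF test image: ~4.5 ms on screen
frame_30_ms = 1000.0 / 30   # one frame at 30 fps: ~33.3 ms
frame_60_ms = 1000.0 / 60   # one frame at 60 fps: ~16.7 ms
print(f"flash: {flash_ms:.1f} ms vs. per-frame {frame_30_ms:.1f} ms (30 fps) "
      f"and {frame_60_ms:.1f} ms (60 fps)")

The flashed aircraft was on screen for a fraction of a single 30 fps frame, and the pilots still caught and identified it.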
 
Read the article in the link...

The actual article is taking the widespread myths and misinformation about framerate and debunking them, proving that higher framerate is better.

This is a great section, attacking the misinformation that the human eye can only see 30 FPS:

This is true. It's interpolated, but one of my TVs has pseudo-240Hz mode, and it is very apparent when it's turned on.
 
Input lag is very noticeable if you're playing a skill-based twitch game. In fighting games like Street Fighter and shooters like Quake/Quake III/Unreal Tournament, it's VERY easy to notice when something's not right between your input and what's happening on screen. The brain actually processes this information faster than some people give it credit for. It may depend heavily on training and practice though, so it may not be apparent to everyone. This is based on my own observations though. I haven't been hooked up to scanning equipment for this. :D

People who race cars, play certain sports, play a lot of fast paced games, fly jets, etc. would all notice a few milliseconds of lag.

I design analog synthesizers, and can easily notice a few milliseconds of lag when interfacing to external controllers, like MIDI-to-CV conversion via a MIDI keyboard. (I'm also a composer, and play the piano.) If I hit a key and don't hear a sound for 10-20 milliseconds (which is actually widely considered acceptable), it tugs on my brain a bit, and doesn't feel right.

I dunno, I guess maybe that's true, but I really haven't personally taken any video game seriously enough to really think about it. I guess there were times when the Sims 3 would take a little while to respond to mouse clicks or whatever on my netbook, but it was a sorta underpowered computer for that anyway.

I'd think in multiplayer stuff over the Internet, there'd be a lot of other factors to worry about moreso than whether or not motion blur is slowing down the computer responding to things. The brain is only one of those other factors.
 
If there was no difference, why do 120Hz+ TVs look so goddamn fluid and fake vs. all those before them? :D
 
30fps is "cinematic", and makes lazy console ports to PC easier. duh
 
I dunno, I guess maybe that's true, but I really haven't personally taken any video game seriously enough to really think about it. I guess there were times when the Sims 3 would take a little while to respond to mouse clicks or whatever on my netbook, but it was a sorta underpowered computer for that anyway.

I'd think in multiplayer stuff over the Internet, there'd be a lot of other factors to worry about moreso than whether or not motion blur is slowing down the computer responding to things. The brain is only one of those other factors.

It really depends on the person as well imho. Some can perceive more, others less. Same goes for motion sickness etc.

I think stuff like G-Sync/FreeSync might change the formula though. High FPS is, in a sense, trying to compensate for monitors refreshing at a set rate while video cards generate frames at variable rates; the more the two go out of sync, the jerkier the motion feels, and higher FPS attempts to reduce that problem.

Personally, I imagine G-Sync/FreeSync at 45fps will feel like 75fps, so it might affect my purchasing habits in the future: instead of spending $300 on a video card I might take $100 off that and throw it at a FreeSync display instead.
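Here's a toy model of that sync mismatch (my own sketch, assuming a steady 45 fps render rate and an idealized panel, nothing measured):

import math

render_interval = 1.0 / 45    # ~22.2 ms between finished frames
refresh_interval = 1.0 / 60   # a fixed 60 Hz panel flips every ~16.7 ms

finish_times = [i * render_interval for i in range(1, 10)]

# Fixed 60 Hz: each finished frame has to wait for the next refresh.
fixed_shown = [math.ceil(t / refresh_interval) * refresh_interval
               for t in finish_times]
fixed_gaps = [round((b - a) * 1000, 1)
              for a, b in zip(fixed_shown, fixed_shown[1:])]

# Variable refresh (G-Sync/FreeSync style): the panel flips when the frame is ready.
vrr_gaps = [round(render_interval * 1000, 1)] * (len(finish_times) - 1)

print("fixed 60 Hz gaps (ms):", fixed_gaps)   # uneven mix of ~16.7 and ~33.3
print("variable refresh (ms):", vrr_gaps)     # a steady ~22.2 throughout

Same 45 fps either way, but on the fixed panel the on-screen gaps alternate (that's the judder you feel), while on the variable-refresh panel they're even, which is why 45 can feel closer to 60.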
 
I'd think in multiplayer stuff over the Internet, there'd be a lot of other factors to worry about moreso than whether or not motion blur is slowing down the computer responding to things. The brain is only one of those other factors.

I'm sure there are plenty of network latency issues over the internet. I typically don't play much on the internet. I played a lot of the first Borderlands with friends and never had any issues, but that game isn't quite on the same level as something like Quake III. Typically your movement locally would still be decent, but you'd have other factors (like other player locations) that might lag. This is pretty frustrating, but not quite the same thing.

When I've played things like Quake III or UT to the point where input lag would matter, it's always been on a LAN.
 
I am glad a site finally tried to debunk it, but there will still be the twits who believe 30FPS is enough and all we can see!
 
BF4 at 30fps is good enough...at least thats what my Q6700 and GTX260 keep showing me. :(
 
Input latency? Isn't the human brain kind of slow in responsiveness though? It seems like blaming the computer for accepting input slowly is sort of splitting hairs when you factor in network laggies and brain laggies. I'm sure though, geezer 35 year old somewhere is blaming input lag and motion blur post processing effects for losing to a little kid that's got better reflexes and spends more time playing something.

Input latency would be cumulative with brain latency. So if your brain response time is around 100ms-200ms (I managed to hit around a 150ms average in this test after a few tries: http://www.humanbenchmark.com/tests/reactiontime) and the difference in frame delay between 60fps and 30fps is ~16ms as the article claims, then that's around a 10% difference between 60fps delay + brain delay and 30fps delay + brain delay. 10% is not a huge number, but it's not so minuscule as to be completely insignificant.
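Same arithmetic as a quick sketch, using my ~150ms humanbenchmark result from above (just my number, not a universal constant):

reaction_ms = 150.0             # rough personal reaction time from the test above
frame_delay_30 = 1000.0 / 30    # ~33.3 ms between frames at 30 fps
frame_delay_60 = 1000.0 / 60    # ~16.7 ms between frames at 60 fps

total_30 = reaction_ms + frame_delay_30
total_60 = reaction_ms + frame_delay_60
diff_pct = (total_30 - total_60) / total_60 * 100

print(f"30 fps: ~{total_30:.0f} ms, 60 fps: ~{total_60:.0f} ms, "
      f"difference: ~{diff_pct:.0f}%")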
 
I'm just glad that YouTube now supports 60FPS video so that more people can be exposed to it and see the difference. I believe that the people that think that 30FPS is adequate just haven't had the luxury of experiencing 60FPS but YouTube will allow more people to see how smooth it is and what they're missing out on.
 
It's all about the motion blur. 24fps works in theaters because of it. A white mouse on a black background at 30FPS, though? YUCK!
 
This seems to be a very old argument and a matter of opinion. But OP, I do have to disagree with you: there is a DEFINITE difference between those two frame rates.
 
Well the fact that there is a perceivable difference is not a matter of opinion. However, whether it bothers someone or not definitely is.

I'm firmly in the 60FPS+VSync+Triple buffering camp. (or 120Hz) However, I won't infringe on anyone's right to enjoy 30FPS, or those weirdos that play without sync. :D :p
 
This is true. It's interpolated, but one of my TVs has pseudo-240Hz mode, and it is very apparent when it's turned on.
Did you quote the wrong post o_0? I didn't say or quote anything about televisions...
 
This is true. It's interpolated, but one of my TVs has pseudo-240Hz mode, and it is very apparent when it's turned on.
There's a bunch of things going on with 'motion flow', as some makers call it, that don't apply to games.

For games, the reason there's a debate is that some people can't tell the difference and others can. It's like severe colorblindness: some people are impaired and others aren't. How do you explain a color to someone who can't see it?
 