Uncovering The Truth Behind 30 vs. 60 FPS

First company to bring back CRTs wins. Huge. Between increasing resolutions, pixel speeds, etc., CRT is the answer to a lot of self-created problems. If Sony wishes to fix their ailing recent financials, this one is certainly available to them.

Well, these won't be back, but I don't miss them any longer. Agreed about them being far superior to LCD. We'll just have to hope OLED makes its way into the mainstream. CRTs are so heavy and use so much power. :(

Having said that, it took me a very, very long time to give up my 21" ViewSonic P225. :)
 
I don't think any gamers actually say that. I think some shitty developers or other charlatans/pseudo-scientists have tried to sell that from time to time, although it's been a while (that I've seen; I also ignore anything to do with consoles).
 
It's all about the motion blur. 24fps works in theaters because of it. A white mouse on a black background at 30fps, though? YUCK!

There's also judder, which is most likely what you're seeing with that high of a contrast.

Between judder (or the lack of it) and interpolation, that's a large part of why we see the hyper-unreal "soap opera effect" in smoother, higher-framerate video.

It's hard to actually demonstrate video artifacts with a video, but this one does a good job:

https://www.youtube.com/watch?v=B_dE6HPIAJM

One of the easiest ways to see low-framerate judder is a pan across a fence. It's hard to find the video that Ebert used as a great example of the drawbacks of 24fps film rates, but this one exploits the weakness in a similar way:

https://www.youtube.com/watch?v=zWAh5TGX03E
 
Mind you, the above examples are a different situation than filming directly at 60fps. For anything with motion, it makes all the difference in the world.
 
So to summarize this whole argument: console manufacturers made a cost-benefit analysis and opted to shoot for 30FPS because 60FPS would have been too expensive to produce for the price point they wanted to sell their consoles at. Instead of owning up to this, they made deals with game development studios to artificially cap their games and "optimize" gameplay across all platforms. When they got caught doing this, they claimed 30FPS is all the human eye can perceive, knowing full well that the statement is false. Console fanboys, like all fanboys, rallied to this false argument, and here we are.

Side note, that is f'n cool that pilots can identify plane types at 220FPS.
 
First company to bring back CRTs wins. Huge. Between increasing resolutions, pixel speeds, etc., CRT is the answer to a lot of self-created problems. If Sony wishes to fix their ailing recent financials, this one is certainly available to them.

Not sure if serious?

Only a select few people want CRTs back. There is no market and demand is next to non-existent. That's why they aren't made anymore; if there were a market, companies would still make them. Technological progress has the capability to make displays that have all the positives of CRTs and none of the negatives. Granted, that tech has been slow to evolve the past 10 years, but it is happening. The negatives such as size and weight quite literally outweighed the positives, which is why the market disappeared in the first place in favor of other tech. Sony used to make some of the finest CRTs back in the day, but their financial woes could never be solved by re-introducing CRTs, for these very reasons.
 
Not sure if serious?

Only a select few people want CRTs back. There is no market and demand is next to non-existent. That's why they aren't made anymore; if there were a market, companies would still make them. Technological progress has the capability to make displays that have all the positives of CRTs and none of the negatives. Granted, that tech has been slow to evolve the past 10 years, but it is happening. The negatives such as size and weight quite literally outweighed the positives, which is why the market disappeared in the first place in favor of other tech. Sony used to make some of the finest CRTs back in the day, but their financial woes could never be solved by re-introducing CRTs, for these very reasons.

Given that my 17" CRT back in the day weighed about 70 pounds, my 30" monitor today would weigh about 200 pounds. Desks would be slightly more expensive from all the reinforcing, and can you imagine the shipping?!
 
My take on this is as follows:

If I am just watching a game running on screen, I have a very difficult time telling the difference between 30fps and 60fps. Maybe my eyes are just bad, but it is what it is.

When I actually run a game though, I find the difference to be tremendous.

Presumably this is because of the difference in input lag.

For a single player FPS I have no problem playing at 30fps, but for something multiplayer, ideally I want the frame rate to never drop below 60fps, even for a moment.

This "better feel" - if you will - only gets better the higher the framerate goes. Back in the day when I had a Geforce 3 which was total overkill for the original Counter-Strike, I used to run it frame rate synced at 100hz to my 20" 1600x1200 CRT, and once I was used to it I felt like I could barely play on computers with lower frame rates.

Given how the human brain works, it is - of course - possible that this is all placebo effect, but I really don't think so...
 
Given that my 17" CRT back in the day weighed about 70 pounds, my 30" monitor today would weigh about 200 pounds. Desks would be slightly more expensive from all the reinforcing, and can you imagine the shipping?!

Thus the need for a $10k Desk :p
 
The whole approach to this FPS thing is bullshit. It has nothing to do with the brain's perception of x frames per second. It's all about the brain's ability to detect when the frame rate of a game is in sync with the frame rate of the display.

When they are out of sync, you WILL notice it. Basically because it's skipping frames.
 
Not sure if serious?

Only a select few people want CRTs back. There is no market and demand is next to non-existent. That's why they aren't made anymore; if there were a market, companies would still make them. Technological progress has the capability to make displays that have all the positives of CRTs and none of the negatives. Granted, that tech has been slow to evolve the past 10 years, but it is happening. The negatives such as size and weight quite literally outweighed the positives, which is why the market disappeared in the first place in favor of other tech. Sony used to make some of the finest CRTs back in the day, but their financial woes could never be solved by re-introducing CRTs, for these very reasons.

Back when LCD panels were first starting to be shipped with computers (and they totally stunk for gaming), there was talk that in the future we'd see "flat CRT technology" that would provide the best of both worlds.

If that had become a reality, I would have jumped on it. I wouldn't be willing to sacrifice the convenience and space savings of a good modern IPS panel if it meant going back to a gargantuan, heavy CRT behemoth though...

I mean, my 22" Mitsubishi Diamondtron-based Iiyama Visionmaster Pro 510 was a great monitor back in the day, but even at 22" (huge by the standards of the day, but kind of tiny today) it weighed a bloody ton and made my desk curve under the weight.

I would dread my current setup of a 30" 16:10 plus two 20" 4:3 monitors if they were CRTs.
 
I think it depends on what's being shown. If the graphics were truly "photo realistic" then a constant 24 frames per second would be good, just as it is with movies. However, that's not the way video games work. I need me some constant 85fps (minimum) lovin' at 4k.
 
That is a very odd comment... how is 60fps a "drag" on hardware compared to 30fps? I mean, obviously it takes more GPU power to run at 60fps, but your comment implies that you benefit somehow by simply limiting it to 30. Even if 60fps is a "drag" on hardware, what's the worst that would result? FPS occasionally dropping to 30, or maybe 45 with vsync and triple buffering? That still wouldn't be any worse than simply limiting yourself to 30 from the outset.

I have a 120hz monitor and even the difference between 60 and 120 is immediately obvious. I show my friends, and all I have to do is switch from 60hz to 120hz once and their reaction is usually "whoa".

You think some game developers lock their games at 30 fps because they just flipped a coin? They did it because they had to: they couldn't maintain a minimum framerate of 60 fps, so they locked it at 30 so that it wasn't jittery.

We're in a bit of a transitional period, because 60 fps is becoming more and more popular, and yet there are certainly still computers that can't handle some specific game at 60 fps. And then you have console games that ALSO can't handle 60 fps. I'm a CG artist; I like to make nice-looking things to look at in video games. Would I love it if everybody could experience 60fps and not have to sacrifice other things to get it? Sure, but it's not always happening.
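
To put rough numbers on the "locked at 30 so it isn't jittery" part, here's a minimal sketch of the frame-pacing arithmetic. It assumes plain double-buffered vsync on a 60Hz display (triple buffering or adaptive sync change this picture); the point is just that a finished frame has to wait for the next refresh, so render times get rounded up to whole refresh intervals.

import math

def displayed_interval_ms(render_ms, refresh_hz=60.0):
    # With plain double-buffered vsync, a finished frame waits for the next
    # refresh, so the on-screen interval is the render time rounded up to a
    # whole number of refresh intervals.
    refresh_ms = 1000.0 / refresh_hz
    return math.ceil(render_ms / refresh_ms) * refresh_ms

# A frame that renders in ~15 ms holds 60 fps pacing, but one that takes
# ~17 ms just misses the refresh and is shown for two of them instead:
print(displayed_interval_ms(15))   # ~16.7 ms -> 60 fps pacing
print(displayed_interval_ms(17))   # ~33.3 ms -> 30 fps pacing
print(displayed_interval_ms(34))   # ~50.0 ms -> 20 fps pacing

So a game hovering just over 16.7 ms per frame ends up bouncing between 60 and 30 on screen, which is the jitter a hard 30 cap avoids.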
 
If the graphics were truly "photo realistic" then a constant 24 frames per second would be good, just as it is with movies.

24 fps being good enough for movies has nothing to do with anything being "photo realistic"; it has to do with the camera capturing motion blur information, which is then present in every frame. 24 discrete frames is a LOT different than 24 frames full of motion blur, where the motion blur essentially fills in for the missing frames. There is no flicker because each frame smoothly blends into the next frame.
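
Just to illustrate the "blur fills in for the missing frames" idea, here's a toy sketch of the accumulation trick offline renderers use to approximate a camera shutter: average many sub-frames inside each 1/24s exposure. The render() function is a made-up stand-in for whatever draws the scene at a given time, and the 180-degree shutter value is just an assumption.

import numpy as np

def render(t, size=(64, 64)):
    # Hypothetical renderer: one bright pixel sweeping across a black frame.
    img = np.zeros(size)
    x = int(t * 200) % size[1]            # position driven by time
    img[size[0] // 2, x] = 1.0
    return img

def blurred_frame(frame_index, fps=24, shutter=0.5, subframes=16):
    # Average sub-frames across the shutter interval (a 180-degree shutter
    # is open for half the frame time), approximating camera motion blur.
    t0 = frame_index / fps
    open_time = shutter / fps
    times = t0 + np.linspace(0.0, open_time, subframes, endpoint=False)
    return np.mean([render(t) for t in times], axis=0)

# A single 24 fps frame now holds a streak instead of one sharp dot, which is
# why 24 blurred frames read far more smoothly than 24 discrete ones.
print(np.count_nonzero(blurred_frame(0)))   # several pixels lit along the path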
 
You think some game developers lock their games at 30 fps because they just flipped a coin? They did it because they had to: they couldn't maintain a minimum framerate of 60 fps, so they locked it at 30 so that it wasn't jittery.

We're in a bit of a transitional period, because 60 fps is becoming more and more popular, and yet there are certainly still computers that can't handle some specific game at 60 fps. And then you have console games that ALSO can't handle 60 fps. I'm a CG artist; I like to make nice-looking things to look at in video games. Would I love it if everybody could experience 60fps and not have to sacrifice other things to get it? Sure, but it's not always happening.


You feel that 30 fps gives the player a better experience than 60 fps with vsync that occasionally dips below 60? If anything, their decision is probably a combination of the fact that many TVs only do 30fps, and sheer laziness on their part.

I get that consoles, and even computers in some cases, can't maintain 60fps. That isn't exactly a new or special problem. Games having fps dips here and there certainly isn't the end of the world, and I'd go as far as to say that it's even expected. What you've really not explained is how locking a game at 30fps is beneficial, even if the hardware can't maintain 60. VSync is the solution to this problem that people have used with tremendous success for decades now; why not in this case? :confused: Please explain what leads you to believe that locking the FPS to 30 would present even a single benefit in any context?
 
Did you quote the wrong post o_0? I didn't say or quote anything about televisions...

Yes, I should have quoted the source of your quote. Wasn't thinking. :D I was referring to the comments about 200+ changes per second.
 
Bought Dead Rising 3, which had a frame lock at 30 fps.
Sadly, it just looked jerky to me; it totally put me off, so I played it twice and really have no interest in playing it again.
Tried to run it uncapped, and it looks just as bad, because the animations were designed for 30 fps, so it winds up just as jerky.

You can tell a difference. DR3 looks like crap in comparison to other PC games.
 
You think some game developers lock their games at 30 fps because they just flipped a coin? They did it because they had to: they couldn't maintain a minimum framerate of 60 fps, so they locked it at 30 so that it wasn't jittery.

We're in a bit of a transitional period, because 60 fps is becoming more and more popular, and yet there are certainly still computers that can't handle some specific game at 60 fps. And then you have console games that ALSO can't handle 60 fps. I'm a CG artist; I like to make nice-looking things to look at in video games. Would I love it if everybody could experience 60fps and not have to sacrifice other things to get it? Sure, but it's not always happening.

I don't care about console games. They are locking PC games when the hardware can run them at much higher rates. This is nothing more than a marketing strategy. Most of these shit ports I could run at 144, but they have started to lock them down. FFS, that horrible game The Evil Within was locked to 30 with a letterbox before there was a revolt and they patched it... Why don't we force the publishers (who are more than likely making these types of decisions) to stop f'ing over the PC community?
 
24 fps being good enough for movies has nothing to do with anything being "photo realistic"; it has to do with the camera capturing motion blur information, which is then present in every frame. 24 discrete frames is a LOT different than 24 frames full of motion blur, where the motion blur essentially fills in for the missing frames. There is no flicker because each frame smoothly blends into the next frame.

I don't think it's the same. Plenty of games have a motion blur effect and it isn't the same as a movie.
 
Pixar and other CGI movie-makers seemed to do an ok job recreating a realistic motion blur effect for 24fps, right? It's not rocket science, it just takes a shit-load of power and they don't have to worry about variable sampling rates like games do.
 
I don't think it's the same. Plenty of games have a motion blur effect and it isn't the same as a movie.

Who is saying anything is the same? Motion blur in games looks like shit because it is shit. Motion blur as captured in real life by a camera is really not something that can yet be recreated adequately in games on the fly.
 
You only need 30fps and 128MB ram.

let's just shoot for the lowest possible standards
 
Zarathustra[H] said:
My take on this is as follows:

If I am just watching a game running on screen, I have a very difficult time telling the difference between 30fps and 60fps. Maybe my eyes are just bad, but it is what it is.

When I actually run a game though, I find the difference to be tremendous.

Presumably this is because of the difference in input lag.

For a single player FPS I have no problem playing at 30fps, but for something multiplayer, ideally I want the frame rate to never drop below 60fps, even for a moment.

This "better feel" - if you will - only gets better the higher the framerate goes. Back in the day when I had a Geforce 3 which was total overkill for the original Counter-Strike, I used to run it frame rate synced at 100hz to my 20" 1600x1200 CRT, and once I was used to it I felt like I could barely play on computers with lower frame rates.

Given how the human brain works, it is - of course - possible that this is all placebo effect, but I really don't think so...

I am just curious, but do you understand the relationship between your frame rate and the refresh rate of your monitor?

Your monitor's refresh rate, say 75Hz, means that your video card will output frames in sync with that refresh rate, so 75 frames per second will be sent from the video card to the monitor no matter how many frames the card says it's producing.

The card will either send full frames, maybe sending some more than once, or it will just send whatever is in the buffer, but the refresh rate and the resolution determine the actual demand. The only thing that doesn't work this way in modern computing is DisplayPort, which works completely differently.
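
As a rough illustration of that point (a toy simulation, assuming a fixed 75Hz refresh and a card that only finishes 50 frames per second): the display grabs whatever frame is done at each refresh, so some frames get shown more than once, and anything rendered faster than the refresh rate never reaches the screen at all.

def displayed_frames(duration_s=1.0, refresh_hz=75, render_fps=50):
    # For each refresh, show the newest frame that had finished rendering
    # by the time the refresh happened.
    shown = []
    for r in range(int(duration_s * refresh_hz)):
        refresh_time = r / refresh_hz
        shown.append(int(refresh_time * render_fps))
    return shown

frames = displayed_frames()
print(len(frames), "refreshes,", len(set(frames)), "unique frames,",
      len(frames) - len(set(frames)), "repeats")      # 75, 50, 25

fast = displayed_frames(render_fps=150)               # card outruns the monitor
print(len(set(fast)), "of 150 rendered frames ever hit the screen")   # 75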
 
Cannot lol this comment hard enough.
Going on year 15 for my Mitsubishi 2040u, and it STILL outgamuts any flat panel under $2K made today.

Trust me on this one, as unlikely as it sounds. CRTs are still in production today but they've been priced out of reach for everyone except a few. Probably not you.
 
The whole approach to this FPS thing is bullshit. It has nothing to do with the brain's perception of x frames per second. It's all about the brain's ability to detect when the frame rate of a game is in sync with the frame rate of the display.

When they are out of sync, you WILL notice it. Basically because it's skipping frames.

QFT man.

Once frames start skipping, or the math in our head says "that's not supposed to be there! There's no path that object could have taken without missing information!", the illusion of smoothness is broken.
 
If there were no difference, why do 120Hz+ TVs look so goddamn fluid and fake vs. all those before them? :D
Older TVs that run at 60Hz require a 3:2 pulldown in order to display 24 FPS content. This results in juddery video.

A 120Hz TV can display 24FPS content without any framerate interpolation. It can simply display each frame for 5 consecutive refreshes. This results in MUCH smoother video.

THEN you get into TVs that try to do motion interpolation, where instead of displaying the same frame 5 times, they try to generate their own "tween" frames. This results in 120 (or more) unique frames per second, but can cause visual artifacts if the vectorization algorithm is less than perfect. This type of video processing also tends to add a bunch of input delay.
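
A quick sketch of the cadence arithmetic behind that (just an illustration of the counting, nothing more): spreading 24 source frames over 60 refreshes forces the uneven 3:2 pattern, while 120 refreshes divide evenly into 5 repeats per frame.

from collections import Counter

def repeats_per_frame(content_fps=24, refresh_hz=60):
    # For each refresh in one second, pick the source frame whose timestamp
    # has arrived, then count how many refreshes each frame occupies.
    shown = [int(r * content_fps / refresh_hz) for r in range(refresh_hz)]
    counts = Counter(shown)
    return [counts[f] for f in sorted(counts)]

print(repeats_per_frame(24, 60))    # [3, 2, 3, 2, ...] -> uneven hold times, judder
print(repeats_per_frame(24, 120))   # [5, 5, 5, 5, ...] -> even pacing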
 
Actually, that's not 100% true. Of course, the people saying the brain can't perceive things over 30, 60, or whatever are totally wrong. Your eyes and brain DO play a part, though, beyond just collecting the information. There are people who are sensitive to higher or lower frame rates and those who aren't, those who can respond quickly and those who can't, etc.

I agree though that when the adapter and display aren't in sync, that's when the biggest issues arise.
 
Tearing is the worst for me. I will ALWAYS use VSync, and triple buffering when available. The more uniformly the data is displayed over time, without unnecessary artifacts, the better.
 
The quote in the OP would make you think the article is about how there is no difference, but it's pretty much the opposite. Just a heads-up to anyone else who will declare bullshit without reading. Personally, I'd sacrifice some graphical effects to stay above 60 FPS, but then again I've spoiled myself after getting used to playing on a 144Hz panel.

Yes, when I wrote this I never intended to advocate for 30 FPS. Thanks for clearing that up.

Also, OP thanks for posting this here!
 
Tearing is the worst for me. I will ALWAYS use VSync, and triple buffering when available. The more uniformly the data is displayed over time, without unnecessary artifacts, the better.
When I was younger, I never used vsync. Nowadays, I can't stand games without vsync. Must be something that comes with age :\
 
This thread is really awesome! I'm learning lots about brains and eyeballs I didn't know before. I still can't tell the difference between 30 and 60 or whatever, and I don't care if my computer runs stuff at slideshow speeds, but it's neat to learn what refresh rates and vsync do.
 
You can see the difference in YouTube videos captured at 60 frames per second. I didn't even realize that feature had come out until I checked the requested resolution.
 
Actually, that's not 100% true. Of course, the people saying the brain can't perceive things over 30, 60, or whatever are totally wrong. Your eyes and brain DO play a part, though, beyond just collecting the information. There are people who are sensitive to higher or lower frame rates and those who aren't, those who can respond quickly and those who can't, etc.

I agree though that when the adapter and display aren't in sync, that's when the biggest issues arise.

The issue here, I think, is similar to the whole myth that people use 10% of their brain. To the best of my knowledge, that came from the claim that humans can survive with 10% of their brain (which 10%, I'm not sure). With 30 fps, I was always told it had to do with around 30 fps being where your brain starts to perceive motion rather than a series of stills.

It's like a game of telephone. People pick up on one or two words in the sentence and fill in the rest with who knows what...
 
I agree, though I think the number is a bit lower, at least the level where you can make yourself perceive motion (maybe it doesn't come naturally, though). The reason I say this is that if you've ever seen a zoetrope (sp?), it's just a little wheel with animation steps in it. As it rotates, you see a moving object (the classic one is a running horse, I think). I don't think the wheel typically turns fast enough to do the equivalent of 30 frames per second, but it still does a decent job of conveying motion. If I had never seen something that ran at 60FPS, I think it would be easier to accept less.

Even the Commodore 64, though, ran at the vertical sync rate of a standard composite CRT of the time period (60FPS NTSC / 50FPS PAL). Same with the NES and other game systems. It wasn't really until PCs, PlayStations, and N64s that we started seeing less than 60, due to the fact that displaying 3D graphics required a lot more horsepower than 2D sprites. I think they started running at slower frame rates as a compromise.

There were some 2D games that did run at half frame rate. Things like Gods on the Amiga ran at half; I think that was because of the sprite counts. Some of the weapons were pretty insane. You'd get things like sprite flicker and slowdowns if you displayed too much for the hardware to keep up at sync.

Anyway, I think the goal has always been to display at the refresh rate of the monitor. A lot of earlier machines timed everything by that, including the music, sprite interrupts, blitter operations, etc. That's how you used to sync everything in the game (of course, they coded in assembly or pure ML back then too). The point is, anything under the monitor sync has classically been a compromise. I know there have been some legitimate attempts at being "cinematic" in the past, but I think that's largely not the case anymore. If you see it now, typically it's to hide a deficiency.
 
I wish I could edit. I remember it being a big deal when Tobal No. 1 (a Square fighting game on PS1) ran at 60FPS. It used less texture detail and more flat-shaded surfaces, but ran at twice the speed of most PS1 3D games of that time. There were other games that ran at 60, but I think that was the first major 3D game on the system that did.
 