The New Myth about Frames Per Second

NExUS1g

We all know about the myth that the human eye can't see more than 60 FPS that has been running rampant around the Internet. But there's a new myth that I've been seeing more and more of: Having more FPS than your monitor's refresh rate is pointless. Perhaps it's not as new as I think, but it seems this is a myth that is becoming even more prevalent.

We know there is a difference between 60 FPS and 100 FPS on an LCD monitor operating at 60Hz, so it's clearly not true. Has anyone else been noticing this myth popping up, or is it just me?
 
That myth is really only pertinent to CRTs, which actually had screen refreshes where the FPS optimally should match the frequency the screen refreshed at. With LCDs a higher FPS is better in my opinion... I'm sure someone will argue with that.
 
Not sure what you mean, but on my monitor switching from 60Hz to 120Hz makes a huge difference.
Not so sure that having FPS over 60 while running at 60Hz is going to do much, though. I mean, yeah, that means your minimum FPS is going to be higher too, so that WILL make a difference, but assuming we could have a game at a perfect 100 FPS and one at a perfect 60 FPS on a 60Hz monitor, I do not see how there could be a difference.
 
Do I need to fetch the LOL WUT picture again?

How do you figure that a higher FPS than the refresh rate does anything? You realize that the refresh rate is the rate at which the panel displays frames. As such I fail to see how having a higher rate does anything: if your card puts up a new frame for display, and then changes to yet another before the monitor is ready, well, that intermediate frame is lost, never displayed.

So what is it you think having a higher FPS than refresh gets you?
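To put a number on it, here's a toy sketch in plain Python (no real graphics API involved, and it's deliberately simplified: each refresh is treated as grabbing one whole finished frame, so tearing is ignored):

Code:
# Toy model: the card replaces the frame buffer render_fps times a second,
# but a 60 Hz monitor only samples that buffer 60 times a second.
def frames_shown(render_fps, refresh_hz=60, seconds=1.0):
    shown = set()
    for n in range(int(refresh_hz * seconds)):   # one pass per monitor refresh
        # The most recently completed frame at refresh n (time n / refresh_hz)
        # is what gets sampled; any frame replaced before that instant was
        # simply never displayed.
        shown.add(render_fps * n // refresh_hz)
    return len(shown)

print(frames_shown(100))  # 60: only 60 of the ~100 rendered frames ever get sampled
print(frames_shown(60))   # 60: every rendered frame gets sampled

Whole-frame sampling is of course the idealised case; with vsync off, parts of those intermediate frames can still show up mid-refresh as tearing, but no complete extra frame ever makes it to the screen.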
 
We all know about the myth that the human eye can't see more than 60 FPS that has been running rampant around the Internet. But there's a new myth that I've been seeing more and more of: Having more FPS than your monitor's refresh rate is pointless. Perhaps it's not as new as I think, but it seems this is a myth that is becoming even more prevalent.

We know there is a difference between 60 FPS and 100 FPS on an LCD monitor operating at 60Hz, so it's clearly not true. Has anyone else been noticing this myth popping up, or is it just me?

It is pointless with a few exceptions such as all GoldSource (HL1) games.
 
Yeah, well, that is the problem... people don't understand what refresh rate means. In CRT terms it meant that the scanline could refresh the entire screen 60 times in one second. In that sense an LCD does NOT have a refresh rate, simply because it doesn't need a line moving over reactive phosphors to light the screen. The closest thing to a refresh that an LCD has is response time, but it's not compared in the same manner. So something running at 120 FPS will in fact be noticeably better even if the screen says 60Hz. Now, this being said, there is still the problem of driver-controlled refresh rate.
 
What are you talking about? While LCDs don't have a scanning electron beam, refresh rate is still relevant, if for no other reason than that their connections work the same way. Data is transmitted such that the whole screen is refreshed a number of times per second. It isn't a differential setup where only the things that change are sent.

Now, one of the upshots of this is that the image is only sent from the card to the screen so often. As such you can have the card render as fast as you like; the extra frames won't get displayed. The card will only send 60 frames per second to the monitor.
 
Of course there is an image refresh time on LCDs. However that doesn't really change the fact that things are smoother at higher frame rates even when limited by supposed refresh rates.
 
But is it smoother ONLY because a card that delivers a 120 FPS average is probably able to keep the minimum over 60fps? That is the question...
 
Of course there is an image refresh time on LCDs. However that doesn't really change the fact that things are smoother at higher frame rates even when limited by supposed refresh rates.

I'll have to go Wikipedia on you and say "citation needed." Any graphics card I've seen won't go above the refresh rate when you turn on vsync, and turning on vsync gives you the smoothest image. You don't get tearing when it's on, which is more noticeable on LCDs than it was on CRTs. I have not seen a case where 60fps solid with vsync on was smoother than > 60fps with vsync off.
 
Here's the thing: with vertical sync on, yes, frames are lost and you're capped to the refresh rate of the monitor. However, with vertical sync off the screen is updated whether it's in the middle of a refresh or not (this is what causes the tearing effect). This means that you don't lose frames being sent from the buffer.
 
No, not really. Cards don't render to the display buffer, they render to a back buffer then swap when they are ready to display a frame. All you'll achieve with vsync off is, well, tearing. Doesn't matter how fast your card is going, you still get only 60fps on the monitor and you'll just have image tearing, which is quite visible to many people.

Better to just lock it to the refresh and have a very solid display.
 
If you think vsync doesn't impair the gameplay, then you haven't tried Counterstrike1.6 or other HL1-based games. And no, it's not just limited to those games, there are many games where it can make a ton of difference.

Obviously some games are perfectly playable even with some input lag, but there are definitely games in which vsync is a killer.
 
If you think vsync doesn't impair the gameplay, then you haven't tried Counterstrike1.6 or other HL1-based games. And no, it's not just limited to those games, there are many games where it can make a ton of difference.

Obviously some games are perfectly playable even with some input lag, but there are definitely games in which vsync is a killer.

I'm a noob, so I won't be getting into the argument, but I'll be reading it of course!

Would you say the same for UT based games? Sometimes I can't tell if I'm lagging in the server, or if my vsync option IS creating the lag...
 
When I first read this thread, I too was of the opinion that "if your LCD is refreshing its image 60 times per second, when your video card renders more frames per second than that, it's nothing more than 'mental exercise'". Looking around the Net a little bit, that still seems to be the generally understood fact, but I must say that I came across more confusing language in reference sources than I initially expected.

Especially misleading (at least as it seemed to me) was what Wikipedia said about refresh rate on LCDs:

Much of the discussion of refresh rate does not apply to the liquid crystal portion of an LCD monitor. [...] The closest thing liquid crystal shutters have to a refresh rate is their response time, [...]

That's essentially what flak-spammer wrote in post #8, above.

Although, at the end of that section, the confusion is clarified a little bit by this last sentence:

However they also have a refresh rate that governs how often a new image is received from the video card (often at 60 Hz).

The page at this URL seems to do a better job of explaining things from the start. Here's a copy of the relevant part of that for the lazy ones among us:

[...] I hear a lot of people claiming "LCD monitors don't have refresh rates", and they're confusing two important different points.

Remember, LCDs don't have electrons and phosphors, they just have crystals and a backlight. Once the crystals for the subpixels corresponding to red, green, and blue have twisted to let through the correct color for a specific pixel, the backlight shining through gives a constant, even light to the viewer. This means that the previous problem of flicker in CRTs is no longer a problem for LCDs, because the pixels don't dim over time, they just change to the new color once the signal changes. This does not mean however that the LCD doesn't have a refresh rate at all. Indeed, the same refresh rate for the video signal itself can be sent from the computer to a CRT monitor OR LCD monitor, and as long as the monitor is capable of handling the signal it will display it. What LCD manufacturers found out, though, is a new problem with the crystals - response time.

The crystals in LCDs can only change so quickly, limiting the usefulness of sending higher refresh rates to the whole display. If a refresh rate of 1hz is sent but the crystals can only change once every 5 seconds, obviously most of that data will be lost and the end result will be a blurred combination of 5 frames' worth of data. Similarly, if the LCD has a response time of 10ms for all color transitions, the maximum refresh rate one should send to that monitor is 100hz (10ms = 1/100th of a second). Even so, this would mean that the new frame would never fully be 'developed' by the crystals by the time the next frame is sent, so a response time significantly faster than that would be preferred. The problem with this is that not all color changes can happen at the same rate.

It was soon discovered in early LCD technology that the changes from very dark to very light pixels couldn't happen quickly enough to use a refresh rate higher than 60hz, a relatively standard refresh rate back from the older days of CRTs. Since flicker wasn't a problem anymore, this was deemed acceptable and people 'got by' with their lovely new LCDs. That is, anyone who didn't care for a higher refresh rate.

Gamers, however, had long discovered the benefits of a CRT monitor capable of displaying more frames per second. The basic rule of FPS is: if your monitor isn't displaying them, you aren't going to see them. However, if you are able to turn your video card's output to 100hz, your monitor is capable of supporting it, and your video card is capable of rendering all those frames, you'll benefit from smoother action on the screen and more precise timing of movement and aiming shots in a first person shooter.

Now that the LCD monitors weren't flickering, most people forgot about refresh rates and started to discuss response times alone. But when gamers realized their new monitors could only display 60 frames per second at any resolution, they were naturally upset. Only very recently (~2007) have LCD manufacturers caught on and started making displays designed for higher refresh rates, even though the response times have been low enough for years.

In summary, keep in mind these key definitions:

Refresh Rate: The rate at which your video card is sending complete screens from its frame buffer memory to your monitor, and the corresponding rate at which the monitor refreshes the whole image. 60hz = 60 complete refreshes per second.

[...]

(Emphases are mine.)

So, it seems like it's well established that an LCD's refresh rate is the number of times per second that the video card updates the image on the screen. Whether that's a rate imposed by the video card's driver or by the OS, and whether it's a reasonable rate given the LCD's response time or an unnecessarily conservative one, it's still the fixed number of times your video card updates your screen. And if your video card only updates your screen X times per second, any frame rate generated by your GPU above X per second amounts to nothing more than wasted GPU cycles.
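For what it's worth, the arithmetic the quoted article does (10ms = 1/100th of a second, hence roughly 100hz) is just this; a throwaway Python sketch using the article's example numbers, not measurements of any particular panel:

Code:
# Back-of-the-envelope from the quoted article: if every pixel transition
# took response_ms milliseconds, refreshes arriving faster than that could
# never be fully "developed" before the next frame is sent.
def max_useful_refresh_hz(response_ms):
    return 1000.0 / response_ms

def frame_interval_ms(refresh_hz):
    return 1000.0 / refresh_hz

print(max_useful_refresh_hz(10))   # 100.0: a 10 ms response supports at most ~100 Hz
print(frame_interval_ms(60))       # ~16.7: a 60 Hz signal gives each frame about 16.7 ms on screen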
 
//[T.0.P]// said:
I'm a noob, so I won't be getting into the argument, but I'll be reading it of course!

Would you say the same for UT based games? Sometimes I can't tell if I'm lagging in the server, or if my vsync option IS creating the lag...

Sorry, been a long time since I played UT games, and I'm not sure if I ever tried them with vsync on.

Anyway, it basically feels as if when you move the mouse the image on the screen has a tiny delay before it reacts, even though your framerate is good (i.e. 60fps). It also sort of feels like the view is 'drifting' on its own slightly. With vsync off the response is crisp and immediate, and the image follows your mouse movement precisely.
 
Vsync on doesn't cause input lag; using a triple buffer is the culprit. And generally you'll only notice the lag if your FPS is fairly low (around 30, though how noticeable the lag is varies from user to user).

Triple buffering is used with vsync to allow a fuller range of FPS values, generally giving higher frame rates than vsync alone. A triple buffer also uses more memory, since you are using three (instead of the normal two) buffers for your frames. This can cause a slowdown in certain configurations, or for certain games, or certain resolutions (or any combination of these factors).

If you turn triple buffering off while using vsync, the input lag will go away. However, turning triple buffering off will lock you into an FPS of your refresh rate divided by a whole number (60 divided by 1, 2, 3, 4: 60, 30, 20, 15, and so forth). If your system isn't able to run the game at 60 FPS but could run it at 45 FPS, then with vsync on and triple buffering off it will render the game at 30 FPS. That is why a triple buffer is helpful; it would let you run your game at 45 FPS (barring memory constraints, which may end up slowing your game down even more).
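If it helps, here's a rough sketch of that divide-by-a-whole-number behaviour in Python. It assumes the idealised case where every frame takes exactly the same time to render, so treat it as an illustration of the rule rather than how any particular driver behaves:

Code:
import math

def fps_with_vsync(render_fps, refresh_hz=60, triple_buffered=False):
    if not triple_buffered:
        # Double buffering + vsync: a finished frame has to wait for the next
        # vblank before it can be swapped in, so the displayed rate snaps down
        # to refresh/1, refresh/2, refresh/3, and so on.
        return refresh_hz / math.ceil(refresh_hz / render_fps)
    # Triple buffering: the card keeps rendering into a third buffer while a
    # finished frame waits for vblank, so you keep your own rate, capped at
    # the refresh (memory permitting).
    return min(render_fps, refresh_hz)

print(fps_with_vsync(45))                        # 30.0, the example above
print(fps_with_vsync(45, triple_buffered=True))  # 45
print(fps_with_vsync(100))                       # 60.0, can't exceed the refresh

So a card that can only manage a steady 59 FPS gets knocked all the way down to 30 FPS without triple buffering, which is exactly why the difference is so noticeable.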

The bottom line is, use vsync if you want to avoid tearing. But if your FPS is dropping fairly low, and input lag is an issue, you are probably better off just disabling vsync. If Vsync is always on (some games you can't turn it off - Company of Heroes, for example), then you can turn triple buffering off to avoid input lag.

The nHancer site has a great explanation of triple buffering and vsync, and why this all works the way it does.
 
Vsync on doesn't cause input lag; using a triple buffer is the culprit. And generally you'll only notice the lag if your FPS is fairly low (around 30, though how noticeable the lag is varies from user to user).

Well, I actually went and tested that with CS1.6 just now, restarting the game after each change in CCC. Vsync and TB on = input lag. Vsync on, TB off = input lag. Vsync & TB off = completely different experience. And my system is *definitely* able to run CS1.6 at 60fps steady.
 
Hmmm... Is it possible that the rendering method or "type" can adversely affect the benefits of TB?

UT-based games are primarily Direct3D rendered, and when you own an Nvidia card, forcing OpenGL rendering in the game arguably runs better in the end. However, with certain titles, OpenGL support or options are limited. So, when you've got a game running with vsync and TB on at 1024x768 with a refresh of 144Hz in Direct3D rendering, is it possible that TB is still a burden in the long run?
 
Hmm... I can't live without vsync + triple buffering. It seems like a lot of data is lost when the screen tears so much.
 
60Hz is 60 ons-and-offs per second. It is therefore logically impossible for a monitor refreshing 60 times in a second to show more than 60 frames in a second. I mean, unless rendering frames in between the ones your monitor shows makes a difference to you. I'm sure SOMEBODY in these forums has passed 8th grade physics class...

Really, this is very basic stuff. What guney posted was right on. Can the confusion over this stop now? It's almost as bad as the microstutter junk out there, and this somehow got turned into an argument about response time.

For the record, response time junkies should first look at their drivers and make sure frames are not pre-rendered. If you're doing all these ridiculous tweaks to game settings but you're still pre-rendering 3 frames (or really even 1) then guess what, your input lag is still there.

These threads are ridiculous.
 
You make it sound like a physics class in the 8th grade is optional...

I haven't been a nerd my entire life. I was merely turned.
 
I didn't think Physics was taught until at least grade 10. Grade 8 Physics in 'general science class'?

"I dropped the ball, the ball bounce back up. What makes the ball bounce up?"
"I dropped a pen, the pen didn't bounce back up. Why doesn't a pen bounce?"

I don't see the corrolation between this and an understanding of Hertz ;).

Thanks everyone for the good articles on V-Sync, TB and LCD refresh rates etc provided in this thread. They were all really nice/good reads.
 
not_this_shit_again.jpg
 
We know there is a difference between 60 FPS and 100 FPS on an LCD monitor operating at 60Hz, so it's clearly not true.
There is no visual difference, except for frame tearing at 100fps. You're still seeing 60 frames every second. It may feel more responsive, as user input is being processed more than once per displayed frame, but outside of competitive twitch shooters this is pretty irrelevant.
 
There is no visual difference, except for frame tearing at 100fps. You're still seeing 60 frames every second. It may feel more responsive, as user input is being processed more than once per displayed frame, but outside of competitive twitch shooters this is pretty irrelevant.

For most games user input won't be tied to FPS either; the OS handles that. I'm sure there are some games where user input and other things are tied to FPS, and extra FPS will help there, but most of the time I suspect any perceived difference in responsiveness is the result of the good ol' fashioned placebo effect.
 
Ugh, so many inaccuracies here.

First of all, LCDs don't work like CRTs in their basic design; however, they do refresh like CRTs in so much as they have a refresh rate. This is the rate at which the monitor reads the data being presented to it (essentially, the data on the monitor cable is whatever is in the video card's frame buffer at the time). It then starts the refresh process for the screen; it takes a set amount of time to refresh the entire screen, and once done the screen remains in its updated state until the next refresh comes around.

This refresh is triggered at equally timed intervals and, like on a CRT, the whole refresh process actually takes time to occur. The whole screen doesn't just update all at once, and from what I've seen it actually refreshes in the same pattern as an old CRT: it starts at the top and refreshes down the screen to the bottom.

We know the whole screen doesn't refresh at once because we still get tearing, an effect caused when the frame being presented to the monitor changes (the frame buffer in the video card swaps out to the next rendered frame) while the monitor is in the middle of refreshing. If the whole LCD simply refreshed at once, taking only one sample from the card's frame buffer, there would be no possibility of tearing.

So LCDs have a set refresh rate, a rate at which a new refresh is triggered, and a total time it takes to refresh the screen. Moving on...

Now, the matter of more than 60fps on a 60Hz screen not being visible: this is simply false, and now that you know LCDs tear you'll probably be able to work out why.

Simply put, the image you see on your monitor once it has finished a refresh is not just one image from one frame (unless vsync is applied), but rather a collage of different frames all stitched together into one overall image.

Looking at the top of the screen you will see part of the oldest rendered frame; at some point further down the monitor it tears together with the next frame, which is slightly younger: the game has rendered a new frame with all the objects in the scene updated to their new positions (assuming there is a difference between the frames, i.e. movement). The difference between the frames causes misalignment where one frame stops and the next starts, and we perceive this change between frames as a tear line across the screen. This may happen several times within one refresh of your monitor. How many frames the final image is made up of comes down to the frame rate you get in the game and the speed at which your monitor completes the refresh process; the faster the frame rate, the more tears you get on average per refresh.

So yes, you can see more than 60fps on a 60Hz monitor; you're just not seeing entire frames, you're seeing parts of frames stitched together.

If we were to study just a single refresh of a monitor displaying a fast-moving object, we could infer that there is movement even though we're only looking at one final image: that one image contains information from multiple frames and multiple points in time, and where the image tears we can see that the object is moving, even though we're studying just one refresh. The information further down the final image is newer, so we can even infer the direction of movement.

Of course, turn vsync on and you get one entire monitor refresh synced with one exact frame, and in that circumstance anything above 60fps is pretty much pointless.
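If you want to see the collage effect in numbers, here's a toy Python model. It assumes an idealised top-to-bottom scanout that takes the whole refresh interval and a card that finishes new frames at perfectly even intervals with vsync off, and it just reports which rendered frame each band of scanlines would come from:

Code:
# Each scanline shows whichever frame was current at the instant that line
# was scanned, so one refresh can contain pieces of several frames.
def frames_in_one_refresh(render_fps, refresh_hz=60, lines=1080):
    pieces = []                                   # (frame index, first scanline)
    for line in range(lines):
        # A fraction line/lines of the refresh interval has elapsed by the time
        # this line is scanned; the current frame is the newest one finished.
        current_frame = render_fps * line // (refresh_hz * lines)
        if not pieces or pieces[-1][0] != current_frame:
            pieces.append((current_frame, line))
    return pieces

print(frames_in_one_refresh(150))  # [(0, 0), (1, 432), (2, 864)]: three frames in one refresh
print(frames_in_one_refresh(60))   # [(0, 0)]: in this idealised model one frame fills the refresh

Crank the frame rate higher and the list gets longer, i.e. more tear lines per refresh, which matches what you see in practice.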



*EDIT*

To help prove the point, this is my Dell 30" 3007 WFP-HC rendering the L4D2 demo at 2560x1600 with all settings at minimum to boost the frame rate and cause more tearing; Fraps is running. I moved the mouse from side to side quickly to cause a big change between frames and make the tearing easy to see, then snapped a shot with my digital camera, a Canon IXUS 901S. I edited it in paint.net and saved at 90% quality to reduce the size to 1MB; this is a thumbnail, click for a full-sized one.

 
For me FPS rules. The more the better. I don't use vsync or triple buffering; I always disable vsync. I'd prefer as many FPS as possible because games are smoother to me with more FPS. The "you can only see 30 or 60 fps" stuff is BS, because 150 FPS feels smoother to me than 60 fps. You're only locked at 60 FPS (synced with the LCD refresh) when you enable vsync. So the idiots that claim >60 FPS on an LCD is a waste don't know what they are doing.

Myth, rumor or fact, I don't give a shit - I prefer more FPS over vsync. To each his own, but that's my preference.
 
Of course, turn vsync on and you get one entire monitor refresh synced with one exact frame, and in that circumstance anything above 60fps is pretty much pointless.

You must know why Carmack was thinking about making DOOM4 a 30fps only game...?

What about older games? There must be some logic to having FPS at 90, or even 200. For Quake 3, I believe the MaxFPS cvar (or whatever it's called) is at 85 by default.
 
//[T.0.P]// said:
You must know why Carmack was thinking about making DOOM4 a 30fps only game...?

What about older games? There must be some logic to having FPS at 90, or even 200. For Quake 3, I believe the MaxFPS cvar (or whatever it's called) is at 85 by default.

The id engines are a bit different from most other game engines. Most engines simply recalculate the state of the game world at the rate you're rendering at: as soon as one frame is finished, the engine recalculates the world state (the positions of entities like enemies and in-flight rockets, etc.) and then draws the next frame.

The id engines have a tic rate, a fixed rate at which the game state is calculated; internally this is 60Hz for games like Doom 3. The frame rate is capped at 60fps because trying to render more frames would be pointless: none of the objects in the scene would have updated their positions, so you'd just end up rendering an identical frame, which is a waste of time.

If they're considering a tic rate of 30, that's probably due to technical limitations of the platforms they're going to work with, or, put another way, consoles are crap and run slower :)
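A bare-bones Python sketch of the fixed tick-rate idea (just the general pattern, not id's actual code; the counters are stand-ins for real update and render work):

Code:
import time

TICK_HZ = 60                 # fixed rate at which the world state advances
TICK_DT = 1.0 / TICK_HZ

ticks = 0
frames = 0

def update_world(dt):        # stand-in: count how often the state changes
    global ticks
    ticks += 1

def render_frame():          # stand-in: count how often we draw
    global frames
    frames += 1

def game_loop(run_seconds=0.25):
    start = time.perf_counter()
    last_tick = start
    while time.perf_counter() - start < run_seconds:
        now = time.perf_counter()
        # Advance the simulation in fixed steps, independent of render speed.
        while now - last_tick >= TICK_DT:
            update_world(TICK_DT)
            last_tick += TICK_DT
        # Render as fast as the hardware allows; between ticks nothing in the
        # world has moved, so the extra frames are identical, which is why
        # capping the frame rate at the tick rate loses nothing.
        render_frame()

game_loop()
print(ticks, frames)   # ticks is ~15 (60 Hz * 0.25 s); frames is however fast the loop spun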
 
I guess it's about that time. Every 2 months or so, somebody opens a thread saying the same thing. It all ends in a clusterfuck of arguing about this or that and gets locked.

I already see this one going the same direction.
 
Well, obviously you won't be getting more than 60fps, but the game will be smoother because you have at least 60fps more often than if you just averaged 60fps.
 
Well, obviously you won't be getting more than 60fps, but the game will be smoother because you have at least 60fps more often than if you just averaged 60fps.

This is a good point worth making. Almost all frame rate counters measure average frame rate; frame rates are never steady, so an average frame rate of 60fps may mean a minimum frame rate of, say, 30-40fps.

90fps is a good safe render speed to ensure the minimum is always 60 or above.
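A quick illustration with made-up numbers of how the average hides the dips:

Code:
# Invented per-frame render times (seconds) for a one-second slice of gameplay:
# mostly quick frames with a handful of slow ones mixed in. Illustration only,
# not a benchmark.
frame_times = [1 / 70] * 55 + [1 / 25] * 5

avg_fps = len(frame_times) / sum(frame_times)   # what a typical counter reports
min_fps = 1.0 / max(frame_times)                # the worst single frame

print(round(avg_fps), round(min_fps))   # roughly 61 average, but it dips to 25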
 