Do all types of AA create input lag? What about FXAA?

When it comes to FPS games, I'm extremely picky about input lag, and I've been curious about this for a while. Whenever I search Google, I just find loads of threads of people arguing "Yes it does! No it doesn't!" It's confusing lol. AA in general is post-processing; however, I've read that FXAA is different from other AA such as MSAA and does not introduce input lag. Some games also seem to be more sensitive to AA-induced input lag. Skyrim, for example, gets hit hard by any form of AA.

Can anyone confirm this?
 
The lag when using FSAA is more a function of your FPS and the monitor's refresh rate. It's not input lag specifically, but overall system lag. FXAA is a good alternative to MSAA or SSAA if you don't have the graphics performance needed to keep FPS above the refresh rate.
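
To put rough numbers on that, here's a quick sketch (the per-mode millisecond costs are made-up examples, not benchmarks for any particular game or GPU):

Code:
# Rough sketch: does a given AA mode keep the frame rate above the
# monitor refresh? The millisecond costs below are invented examples.
REFRESH_HZ = 120
refresh_interval_ms = 1000.0 / REFRESH_HZ    # ~8.33 ms per refresh

base_frame_ms = 6.0                          # hypothetical frame time with AA off
aa_cost_ms = {"none": 0.0, "FXAA": 0.3, "4x MSAA": 2.0, "2x SSAA": 6.0}

for mode, cost in aa_cost_ms.items():
    frame_ms = base_frame_ms + cost
    fps = 1000.0 / frame_ms
    verdict = "stays above" if fps >= REFRESH_HZ else "drops below"
    print(f"{mode:8s}: {frame_ms:5.2f} ms/frame ({fps:6.1f} fps) - {verdict} {REFRESH_HZ} Hz")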

BTW: if you have fast enough video performance and a newer NVIDIA GPU, NVIDIA's LightBoost enabled on a 100Hz or 120Hz LCD monitor works wonders, as does using an old-school CRT monitor at 120Hz.
(read about that here: http://www.blurbusters.com/zero-motion-blur/lightboost/)
 
AA doesn't usually introduce input lag. Generally, it's vsync that does.
 
Yeah, this is the kind of input lag I'm talking about. I never play FPS games with vsync; I only enable it in non-shooters like Skyrim, where vsync is practically mandatory to prevent debilitating tearing.

AA-related lag, though, is less conspicuous and varies from game to game. Again, Skyrim gets noticeable input lag with AA enabled, at least to me, while games like Quake 3/Quake Live don't seem bothered by it.

As far as I've read, any form of post-processing introduces input lag. But some people say certain types of AA, like FXAA and MLAA, are not post-processing. If they're not post-processing, then how do they correct jaggies, and how do they avoid the input lag that extra rendering work would add?
 
There's nothing special about post-processing, and no innate reason for it to cause input lag. It's just another step in rendering the frame. You will never see an incomplete frame; the post-processing is guaranteed to be done before the frame is ever presented.

You just pay whatever it costs to render, like anything else... and the post-process AA methods tend to be extremely cheap.
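
A toy way to picture it (the pass costs below are invented purely for illustration):

Code:
# Toy frame: the image isn't presented until every pass has run, so a
# post-process AA pass just adds its own cost to the frame time like
# any other pass. All costs here are made up.
passes_ms = {
    "geometry":     3.0,
    "lighting":     2.5,
    "transparency": 0.8,
    "FXAA (post)":  0.3,   # cheap post-process pass
}

frame_time_ms = sum(passes_ms.values())
print(f"frame time: {frame_time_ms:.1f} ms -> {1000.0 / frame_time_ms:.0f} fps")
# No extra buffered frames involved; you simply pay the pass cost.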
 
In terms of 'time-to-emission', anything you pile on that increases frame times can affect how you perceive latency. AA usually adds to frame times, but not always; in some cases AA is effectively free (which is sometimes true of FXAA, depending on how it's implemented). When vsync'd, as long as you don't miss any swaps, the frame-time component of input latency is effectively held constant.

There's no AA method I'm aware of that buffers frames, so you aren't getting any full frames of latency added to the pile.
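
Here's a tiny simulation of the 'held constant under vsync' point (assumed 120 Hz panel and assumed frame times; it just shows that a frame finishing anywhere under the 8.33 ms swap interval is displayed at the same moment, so a small AA cost vanishes as long as you never miss a swap):

Code:
import math

# With vsync on, a finished frame waits for the next refresh boundary.
# As long as the frame time stays under the refresh interval, a small
# added AA cost doesn't change when the frame actually gets displayed.
REFRESH_MS = 1000.0 / 120.0              # 8.33 ms swap interval

for frame_ms in (6.0, 7.0, 8.0, 9.0):    # 9.0 ms misses the swap
    displayed_at = math.ceil(frame_ms / REFRESH_MS) * REFRESH_MS
    print(f"render {frame_ms:4.1f} ms -> shown at {displayed_at:5.2f} ms")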
 
SMAA >> FXAA when it comes to image quality vs. performance, so if you're already set on using shader-based antialiasing, I'd strongly recommend the SweetFX shader suite. You can either use the configurator from:
http://sweetfx.thelazy.net/

Or get the original flavor from:
http://www.guru3d.com/files_details/sweetfx_shader_suite_download.html

In any case, I'd recommend looking at the second link and scrolling down to the comments section; if there are compatibility issues, they'll be reported there.


edit to add:

Oh yeah, forgot to add: shader-based antialiasing nowadays is light enough performance-wise that it doesn't add noticeable input lag. You can see how negligible the performance hit is in this Crysis 3 test:
http://www.hardocp.com/article/2013/03/12/crysis_3_video_card_performance_iq_review/6#.UkAjj4ash8E
And quality comparisons:
http://www.hardocp.com/article/2013/03/12/crysis_3_video_card_performance_iq_review/8#.UkAj0Yash8E
 
AA does produce input lag. On the Source engine I always have everything on low quality with no AA or bloom so I don't get hit by the enormous input lag. I played competitively and could easily tell the difference even though I was playing at a constant 120 fps. It sort of feels like you're playing at 70 fps instead of 120 and your mouse aim isn't as precise (this is the Source engine, though). Other games have it as well, but it's harder to tell because most of them aren't "micro-precision" games like CS/CSS.

I recently bought a GTX 780 and couldn't play BF3 on high/ultra without feeling a little impaired by the settings; it felt sluggish even though the fps was good.
 
Framerate limiters solved this issue for me. I despise playing without vsync, so a limiter is borderline necessary for me.
 
I really think it depends on the type of AA.

Traditional AA essentially renders the game at a higher resolution, then downscales it. AA like FXAA, which is post-processing, can add lag and hitching. FXAA in Skyrim, for example, looks good but causes noticeable hitching.
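
Just to illustrate the 'render big, then downscale' idea, here's a toy 2x2 box-filter resolve in NumPy (real SSAA/MSAA resolves happen on the GPU, and MSAA only super-samples coverage/depth, so treat this as a sketch of the supersampling idea, not the actual driver path):

Code:
import numpy as np

# Toy supersampling resolve: "render" at 2x the target resolution
# (random values stand in for shaded pixels), then average each 2x2
# block down to one output pixel -- the downscale step.
H, W = 270, 480                                   # small target resolution for the demo
hi = np.random.rand(H * 2, W * 2, 3)              # pretend 2x-supersampled frame

lo = hi.reshape(H, 2, W, 2, 3).mean(axis=(1, 3))  # 2x2 box-filter resolve
print(hi.shape, "->", lo.shape)                   # (540, 960, 3) -> (270, 480, 3)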

And BTW, a lot of the newer "features" added to Source games suck. The HL2 "multi-core optimizer", or whatever it's called, has never done anything but cause performance issues for me.
 
Oh wow, I just confirmed this for myself searching Google. All this time I had it backwards and thought FXAA and SMAA were the ones that weren't post-process while traditional AA was. Maybe I'll try an injector for Skyrim and Borderlands 2 instead of the in-game FXAA option. Thanks.

I also read that the added latency from FXAA in most games is so minimal, often less than 1ms, that it's hardly noticeable, especially on a 120Hz screen. Skyrim seems to be an entirely different beast, though, and I feel more input lag with it activated. Go go console ports!

The multi-core support works for me in most Source games. The original HL2, the main game, doesn't utilize multi-core processing very well, if at all, so you'll see frame drops there. Episodes 1 and 2 will utilize all four cores of an Intel CPU, and you'll see higher loads, higher temps, and fewer frame dips. So maybe it depends on the Source game. Fire, on the other hand, always kills frames in Source games; it's like the engine doesn't know how to handle fire on a multi-core CPU.
 
I played Team Fortress 2 throughout the implementation of multi-core rendering in the Source engine and it seemed hit or miss. It gave me a performance improvement on my old E8400 CPU, but on my i7 there doesn't seem to be a difference with it on or off except in Linux. With it enabled in Linux I was getting major hitching; in Windows, either way seems to work fine.

So basically, if you have stutter or hitching in those games, toggle the multi-core rendering option on or off and see which works better.
 

Coming back to the FXAA latency: that's just what it costs to render the effect, and whether it's a post-processing effect is irrelevant. It's not like what your television might do, where whatever's connected has already rendered the frame and output it, and then the TV wastes time doing extra processing on top until you turn on game mode or whatever.

You're not getting a frame until everything's done. If that means FXAA costs another 1ms to render, then that's what you paid. That could be thousands of FPS lost or not even a fraction of one.
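
To put numbers on that last sentence (the baseline frame times below are arbitrary examples): the same fixed 1ms pass costs a huge amount of FPS when frames are already tiny, and almost nothing when they're long, even though the added latency is 1ms either way.

Code:
# The same fixed post-process cost (say 1 ms) looks enormous in FPS
# terms at very high frame rates and negligible at low ones, because
# FPS is the reciprocal of frame time. Baselines below are examples.
AA_COST_MS = 1.0

for base_ms in (1.0, 4.0, 16.7, 33.3):
    before = 1000.0 / base_ms
    after = 1000.0 / (base_ms + AA_COST_MS)
    print(f"{base_ms:5.1f} ms/frame: {before:7.1f} -> {after:6.1f} fps "
          f"(lost {before - after:6.1f} fps, still only +{AA_COST_MS} ms latency)")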
 