Which is better: FXAA or MSAA?

Should I ditch MSAA completely in favor of FXAA, or is MSAA still better in terms of image quality?
 
It's too subjective to answer - it depends on what bothers you. FXAA can cause texture blurring, but MSAA can miss some edges - so it depends which limitation bothers you more. You really need to test it yourself and see which you prefer. FXAA is much better for performance though, so that is the deciding factor for a lot of people.
 
Typically, moderate FXAA and at least 2xMSAA combine for great results. FXAA is usually almost free, so use it when you can.

FXAA will also vary significantly between games, as it's a developer-implemented shader program, so as Forceman said, it's very subjective and you should decide for yourself.

If it helps, both AMD and Nvidia are working on newer forms of AA that minimize the performance impact while maximizing the AA effect. We see new versions with each generation of hardware.
 
SMAA. Both FXAA and MLAA are terrible solutions, with MLAA being slightly worse.

MSAA is categorically superior to FXAA in terms of image quality and this fact is beyond dispute. SMAA very nearly approaches MSAA image quality with a dramatic increase in performance.
 
Beyond dispute? I think it's disputed quite often. MSAA can't touch shader-based aliasing or transparency aliasing, whereas MLAA (which FXAA is a subcategory of) can nearly eliminate it. I enable 2x MSAA and I can't tell the difference from it disabled. I enable 4x MSAA and geometry edges are smooth, but transparent edges are jagged as a serrated blade and my frame rate is suddenly down by 20%. I enable MLAA or FXAA and now all edges are clean yet my frames are still high.

Let the disputing begin.
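For anyone wondering why MSAA whiffs on transparency: MSAA computes coverage from triangle edges, but alpha-tested cutouts (foliage, fences) come from the pixel shader discarding pixels, so the rasterizer never sees a geometric edge there. Here's a toy 1-D sketch in Python (made-up numbers, not real rasterizer code) of why a hard alpha test stays jagged while per-sample transparency supersampling grades the edge:

```python
import numpy as np

PIXELS, SPP = 16, 4  # 16 pixels across a leaf edge, 4 samples per pixel

def alpha(x):
    # Toy alpha ramp across the edge (0 = fully cut out, 1 = solid)
    return (x + 0.5) / PIXELS

# Plain alpha test: one test at each pixel center -> hard 0/1 step.
# MSAA can't smooth this, because the cutout isn't a geometric edge;
# it comes from the shader's per-pixel discard.
centers = np.arange(PIXELS) + 0.5
alpha_tested = (alpha(centers) > 0.5).astype(float)

# Transparency supersampling (TrSSAA): alpha-test each of 4 sample
# positions inside every pixel and average -> fractional coverage.
offsets = (np.arange(SPP) + 0.5) / SPP
positions = np.arange(PIXELS)[:, None] + offsets[None, :]
trssaa = (alpha(positions) > 0.5).mean(axis=1)

print(alpha_tested)          # hard step: 0, 0, ..., 1, 1
print(np.round(trssaa, 2))   # same step, but graded at the edge pixel
```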
 
FXAA has awful image quality: it blurs the whole picture and looks worse the higher the resolution you use. It's a quick fix from the console arena that NVIDIA is pushing because it's easy to implement and has little performance hit. MSAA + transparency AA still yields the best IQ. SMAA might get there if it's developed further.
 
I have to wonder about this. If I can get 2xMSAA running at 2560x1600 in BF3, I'll give it a try. Setting FXAA to medium in that game yields a nice balance, though.
 
Depends on how you define awful. It does blur the image, but the implementation matters a lot, and if you don't have the performance to run higher levels of MSAA (or something better) then it's kind of a moot point. And considering that some game engines don't support MSAA at all, it's certainly better than nothing.
 
I like that FXAA is free. It does blur the textures, though. In BF3, if I set it too high the blurring is noticeable, but medium doesn't blur too much and does a good job anti-aliasing the whole screen. I like that I can maintain 60fps 99% of the time.

I love where they're going with these new post-process AA methods; hopefully they keep getting better.
 
It's something you need to test yourself, really. Screenshots don't convey what they're really like; you need to see them in motion.

For me, I can't stand FXAA. I was really looking forward to it after hearing about its low impact on performance, but then I tried it and everything just looks a blurry mess. I don't understand why anyone likes it, especially those who buy high-end cards and then choose to make their screen look like it's got a thin layer of vaseline smeared over it :eek:.

I'd personally rather have no anti-aliasing at all than use FXAA.
 
Beyond dispute? I think it's disputed quite often. MSAA can't touch shader-based aliasing or transparency aliasing, whereas MLAA (which FXAA is a subcategory of) can nearly eliminate it. I enable 2x MSAA and I can't tell the difference from it disabled. I enable 4x MSAA and geometry edges are smooth, but transparent edges are jagged as a serrated blade and my frame rate is suddenly down by 20%. I enable MLAA or FXAA and now all edges are clean yet my frames are still high.

Let the disputing begin.

No. FXAA and MLAA cannot sufficiently eliminate aliasing. They work well in stills; in motion they fail more often than not. MSAA can be upgraded to (costly) SGSSAA, which is beyond any dispute in quality.
 
As much as I love FXAA, I still think MSAA has better image quality.

But you take a performance hit when you use MSAA. With FXAA there's hardly a performance hit at all.

Personally, I try to use them both if possible.
 
Anyone who tells you that FXAA always offers better performance or is nearly free is sadly misinformed. FXAA taxes your shader units, whereas MSAA taxes video memory and bandwidth. Whichever resource is less loaded before you enable AA will give you the better results. Examples:

-Older games on my laptop's GeForce GT 130M 1GB tax its 32 shader units pretty hard. Enabling FXAA is near crippling, while MSAA is nearly free up to 16xCSAA.
-On my GTX 560, they come out about even in tests run on WoW and Batman: Arkham Asylum. However, I was comparing 16xCSAA + 8xTrSSAA vs. FXAA. It's pretty bad that those two methods combined carry a performance hit similar to FXAA alone.
-On newer cards like the 670/680 series, we didn't see a massive jump in available memory or bandwidth, but there is a ton of extra shading power. This makes FXAA nearly free whereas MSAA has a larger performance impact.

So it comes down to this. On older cards with too much RAM (I'm looking at you, GT430 people with 2GB), MSAA > FXAA in performance. On mid-range cards and some older high-end cards, they can come out about even, but it's totally situation dependent. On the newer 670/680 cards, FXAA > MSAA in performance. Going forward, FXAA should continue to have the performance advantage over MSAA.

In terms of looks, 16xCSAA + 8xTrSSAA is far superior to FXAA. If it's just MSAA vs. FXAA, you get better sharpness with MSAA, but no anti-aliasing on shaders or alpha-textures.
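Just to put rough numbers behind the "MSAA taxes memory" point, here's a back-of-the-envelope sketch in Python. The per-sample sizes are assumptions (4 bytes color + 4 bytes depth/stencil), and real drivers compress color and Z, so treat these as loose upper bounds:

```python
# Rough framebuffer cost of MSAA at 1920x1080, assuming 4 bytes of
# color + 4 bytes of depth/stencil per sample (no compression).
width, height = 1920, 1080
bytes_per_sample = 4 + 4

for samples in (1, 2, 4, 8):
    mb = width * height * samples * bytes_per_sample / 2**20
    print(f"{samples}x: ~{mb:.0f} MB")

# Prints roughly: 1x: ~16 MB, 2x: ~32 MB, 4x: ~63 MB, 8x: ~127 MB.
# FXAA needs no extra samples: it's one full-screen shader pass over
# the resolved 1x image, so its cost scales with shader throughput
# rather than with memory size and bandwidth.
```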
 
Neither is "better" than the other, they're just different and they can even be complimentary as some have pointed out running both types of AA at the same time. You can't label something as better without having defined criteria and even then its going to be subjective depending on the user. Its like judging a beauty pagent, how can one woman look "better" than another when the decision is so subjective?
 
Neither is "better" than the other, they're just different and they can even be complimentary as some have pointed out running both types of AA at the same time. You can't label something as better without having defined criteria and even then its going to be subjective depending on the user. Its like judging a beauty pagent, how can one woman look "better" than another when the decision is so subjective?

I partially agree. The problem is that it's not so much subjective as it is a trade-off. MSAA only hits the edges of geometry, whereas FXAA literally anti-aliases the entire screen using a shader-based method. The problem is that if you force it in a game not designed for it, it blurs the UI/HUD and text. It also blurs textures, causing you to lose fine details. FXAA is basically going thermonuclear in the war against jaggies.

When combined with Transparency Anti-Aliasing, MSAA can offer edge-smoothing comparable to FXAA (minus some shaders), but without losing the finer details. Thus, MSAA + TrAA is generally better in terms of image quality. The problem is that MSAA + TrAA is a huge video memory and bandwidth hog, whereas FXAA taxes your shading power. And where are modern gamers taxed the most? Memory bandwidth and available video memory. On older titles or older GPUs with too much RAM, MSAA is nearly free whereas FXAA is a performance killer. On newer GPUs and in newer games it is the exact opposite with FXAA being nearly free and MSAA being the killer.

Which one is better in terms of performance is highly dependent on your system and the game you're playing. Which one is better in terms of looks is not so dependent. MSAA + TrAA is superior in almost all cases. The cases where it's not superior are the cases where it cannot be forced (original Halo, for example).
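If it helps to see why FXAA trades sharpness for coverage, here's a heavily simplified FXAA-style pass sketched in Python/NumPy. This is not the real FXAA algorithm (the real thing does directional edge searches and sub-pixel blending), and the threshold and weights here are made up, but the structure shows the point: it runs on the final image, so anything high-contrast (texture grain, text, HUD) trips the same edge test as a geometry edge:

```python
import numpy as np

def luma(rgb):
    # Perceived luminance; FXAA-family filters key off luma contrast
    return rgb @ np.array([0.299, 0.587, 0.114])

def fxaa_like(img, threshold=0.1):
    """Toy post-process AA over an HxWx3 float image in [0, 1]."""
    l = luma(img)
    # 4-neighbor lumas (np.roll wraps at borders; fine for a demo)
    n, s = np.roll(l, 1, 0), np.roll(l, -1, 0)
    w, e = np.roll(l, 1, 1), np.roll(l, -1, 1)
    contrast = (np.maximum.reduce([l, n, s, w, e])
                - np.minimum.reduce([l, n, s, w, e]))
    edge = (contrast > threshold)[..., None]
    # Candidate "smoothed" pixel: average of the cross neighborhood
    blurred = (img + np.roll(img, 1, 0) + np.roll(img, -1, 0)
                   + np.roll(img, 1, 1) + np.roll(img, -1, 1)) / 5.0
    # Blend only where contrast is high. This is the whole trade-off:
    # the filter can't tell a polygon edge from texture detail or HUD
    # text, so all of them get softened.
    return np.where(edge, blurred, img)
```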
 
MSAA looks ugly in most games with trees... just look at Crysis. Unplayable with MSAA.

That's where transparency AA comes in. You have to mix MSAA + TrAA (or Adaptive AA on ATI/AMD cards) to get an effect similar to FXAA without the blurriness. It's up to you to determine if the performance hit is worth it. Just keep in mind that FXAA is not always less of a hit than MSAA.

http://hardforum.com/showthread.php?t=1699111

On a GTX 560, I got the following results: (16xCSAA + 8xTrSSAA) / FXAA
World of Warcraft - 53.417fps / 58.067fps
Batman: Arkham Asylum - 56fps / 56fps

In my case, because I don't have a 670/680, my memory wasn't much more of a bottleneck than my shading power. In the case of WoW, there was roughly a 10% framerate difference between 4xMSAA with 12 coverage samples plus 8x super-sampled transparency AA, and forced FXAA. The visual quality difference was huge, though! In Batman, the average framerate was the same. Once again, FXAA was the lesser in quality, though not by a huge margin this time.

In older titles I use 16x/32x CSAA + 8xTrSSAA. In newer titles where my card is pushed to the limit, I find FXAA to offer the same or similar performance hit to the above combo. If I had a 670/680, FXAA would offer a lesser hit, but that's not the case. On my GT 130M, though, FXAA is a horrid performance killer.
 
FXAA is good for games like Crysis, and its implementation is getting much, much better. It also natively works with a fully deferred renderer, like in Crysis 2 (no, they didn't just do it for "best performance lulz").
TXAA will be interesting.

I do see MSAA being phased out.
It doesn't bother me, as long as shader-based AA continues to get better and better.
Also, SMAA can be pretty variable in quality.
 
It depends on the game and which implementation of FXAA they use. Some look like shit with high FXAA (Diablo 3, Battlefield 3) and some games look very good (Max Payne 3).

MSAA is "crisper" in terms of edge image quality but misses transparent textures so some combination of light FXAA + 2-4x MSAA is what I prefer, when I have the performance to do so.
 
MSAA is "crisper" in terms of edge image quality but misses transparent textures so some combination of light FXAA + 2-4x MSAA is what I prefer, when I have the performance to do so.

I would advise that you try another combination just to explore your horizons:
-Go into your NV CP and look at the setting for transparency AA
-If you have more video memory than you need, use the super-sample options (newer cards allow 2x/4x/8x whereas older cards only have on/off)
-If you have memory bandwidth to spare, use the multi-sampling option
-Mix the above with either 4xMSAA or 16xCSAA, and turn off FXAA

Given that you're on 680 SLI, the performance hit on this WILL be worse than your current combo, but it may still be within tolerances. IMO, it will look much better than 2xMSAA + FXAA. Please give it a shot and report back.
 
Some people don't realize FXAA doesn't apply well to certain games; the major "blurring" they assume applies to all games doesn't. In LOTRO, if I force FXAA on, it's like I took off my glasses: the entire scene is a blur. In Skyrim, where it's supported, it's fine; Borderlands is somewhere in the middle. It really varies. I assume that if a game utilizes it in-game it should be a boon in performance. And I agree with Kyle: it softens or anti-aliases the glare of specular-lit surfaces where most AA methods don't. I could be wrong on that.

FXAA is far from perfect, but it's a step in the right direction for good IQ with less performance loss. I'm hoping we end up with just one AA mode instead of the MS, SM, SS, HT, or whatever varieties... I mean, besides gaming enthusiasts, most general gamers have no clue what AA is and probably don't have it on. Like my roommate: playing games on PC for years without realizing you can pump up the eye candy. He's scared to touch video settings and plays his games at defaults, often at low default resolutions...
 
MSAA is better. FXAA makes the screen so blurry it feels like my eyes aren't in focus. I can't use it in any game.
 
Some people don't realize FXAA doesn't apply well to certain games; the major "blurring" they assume applies to all games doesn't. In LOTRO, if I force FXAA on, it's like I took off my glasses: the entire scene is a blur. In Skyrim, where it's supported, it's fine; Borderlands is somewhere in the middle. It really varies. I assume that if a game utilizes it in-game it should be a boon in performance. And I agree with Kyle: it softens or anti-aliases the glare of specular-lit surfaces where most AA methods don't. I could be wrong on that.

FXAA is far from perfect, but it's a step in the right direction for good IQ with less performance loss. I'm hoping we end up with just one AA mode instead of the MS, SM, SS, HT, or whatever varieties... I mean, besides gaming enthusiasts, most general gamers have no clue what AA is and probably don't have it on. Like my roommate: playing games on PC for years without realizing you can pump up the eye candy. He's scared to touch video settings and plays his games at defaults, often at low default resolutions...

Absolutely correct. If implemented in game, the developer can control what area(s) are affected and to what degree, including ignoring the HUD/UI/text completely. When forced via CP or injector, it hits the whole screen. Native FXAA is a thing of beauty, but forced is truly hit or miss. Native also has much lower performance penalties, whereas forced can be (as I've demonstrated) more demanding than MSAA.

MSAA is better. FXAA makes the screen so blurry it feels like my eyes aren't in focus. I can't use it in any game.

Try FXAA in a supported game or three and your opinion might change. When it's forced at the driver level or injected, the results aren't going to be as good.
 
So in the Nvidia control panel, should I set my FXAA and MSAA to auto, since they're different for every game?
 
I would leave them both at default settings and then change them on a per-game basis. Here's a rudimentary guide:

-If a game has in-game AA settings, use them. If a game supports only MSAA, try 4xMSAA (or just 4x, as it's sometimes listed), then in the NVCP change it to 16xCSAA (enhance the in-game setting). That gives you 4x multi-sampling with 12 coverage samples, which is much better AA quality than plain 4x for a negligible additional performance penalty.
-If a game supports FXAA in-game, give it a try. It's typically better than driver-forced.
-If a game does not offer in-game AA settings and it's DX9 or older, you can force MSAA in the NV CP without issues. If it's DX11, forced AA generally won't work, but FXAA will. DX10 is hit or miss.
-When using MSAA as opposed to FXAA, it's a good idea to try transparency anti-aliasing. This will smooth out those edges that MSAA doesn't touch (it's called adaptive AA on AMD/ATI cards).
 
Cool, thank you sir. I just updated my sig... I have the 670 now.
 
I've played with all of that before. The problem is that not every game engine plays nice with SGSSAA or SSAA. I had problems with Mass Effect 3 and Diablo 3; for those games I used standard MSAA. MSAA + TrAA is a great option, but it's not always practical at 6080x1200 (my multi-monitor resolution), although at single-screen resolutions I can crank pretty much any AA setting I want.

When playing at multi-monitor resolutions, I would rather play with all the visual settings enabled and less AA than have to turn off graphics options to enable MSAA. In those circumstances FXAA is quite nice because of its smaller performance hit and memory usage compared to MSAA.
 
What problems did you have in ME3? I'm playing at 1920x1200 with in-game settings maxed (in-game AA off) and NV Inspector forcing 4x MSAA + 4x SGSSAA (the guide suggested matching the sample counts at 4 for each), and it looks ridiculously good. The only issues I've had are occasional weird shadow effects on faces. Still using the 304.48 betas.
 
I understand that, and agree that with a multi-monitor setup, FXAA will be king. As for Diablo 3, I currently run it with 16xCSAA and 8xTrSSAA. No problems, runs beautifully.
 
Mainly playing BF3 lately, and I notice what crap FXAA is in that game. 2x MSAA plus FXAA on high isn't that bad, though.
 
Wait, FXAA is crap in BF3, but MSAA + FXAA isn't bad?

High is too high in BF3. Try medium instead; it works just as well on the aliasing but with nowhere near as much blurring.
 
Doesn't it depend on the implementation? Look at Max Payne... terrible MSAA, great FXAA.

BF3: terrible FXAA, amazing MSAA.
 
From my experience, FXAA is a blurry mess in BF3; MSAA is not. I used FXAA on high with MSAA at 2x and the blurriness wasn't as bad, enough for me not to care anymore. I haven't paid much attention lately; multiplayer isn't for looking at the scenery lol. I'm going to check it out again though and compare.
 
Your trust doesn't matter.

I've got a 2600K @ 4.8 and a GTX 680 @ 1280 plus an SSD, and I still get dips to 40fps and stutter with only 4x SGSSAA + 4x MSAA and AO at very high. It's not possible to play Diablo III without fps dips on those settings, hence "no problems and running beautifully" is lying.

Maybe it is, if you don't force AF/AO and high-quality filtering, which I doubt, since you use SSAA + 16xCSAA.
 
I had the same problem and posted in another thread. There were some varying results and the main culprit seemed to be Vsync. I found that using anything other than application preference in the NV CP (on, off, Adaptive) would cause a chaotic framerate. Using application preference and then turning Vsync on within the game smoothed things out for me. There were still lag hitches, so I gave the game priority from my router. This further smoothed things out. Is it perfect? No, as I'll still have a network-related hitch here and there, but even going to the minimum settings won't fix this.

I'm not lying, I simply took the time to troubleshoot my issues, and the point of my post was to explain that switching from FXAA to 16xCSAA + 8xTrSSAA did not have an adverse effect on my framerates. I have no problem with your experience, but please don't call someone a liar just because your experience wasn't exactly the same. That was completely unnecessary.
 
Yes, I admit I went too far, sorry dude, but many users, same as me, have claimed that D3 is not playable if you go over 4xSSAA even with a high-end rig + GTX 680; the fps drops too much at those settings.

Here is what I have tried so far:

Turned on vsync in-game.
Turned on prefer maximum performance in the NVCP.
Turned off max foreground and background fps.
Turned on disable desktop composition for the Diablo III exe.

Nothing seems to fix the stutter....

Do you have ambient occlusion turned on, and what settings do you have for it?

This is my Inspector profile:

[screenshot: Nvidia Inspector profile settings]
 