Which is better: FXAA or MSAA?

I've tried to use injectors with no results. Is there some trick to it? I'd like to see SMAA first-hand, instead of looking at screenshots.
 
FXAA has awful image quality: it blurs the whole picture and looks worse the higher the resolution you use. It's a quick fix from the console arena that NVIDIA is pushing because it's easy to implement and has little performance hit.
Two points:

1) FXAA doesn't blur the entire frame. It's an edge-detect algorithm that compares each pixel to its adjacent pixels, determines whether aliasing is present, then anti-aliases those pixels. MSAA works similarly on a fundamental level, but doesn't use post-process edge detection. FXAA catches more aliasing, like shader aliasing, and attempts to anti-alias it. (A rough sketch of the idea follows after point 2.)

An FXAA'ed frame will always appear softer than an MSAA'd frame, since FXAA, by design, attacks more aliasing. The benefit to that is mostly temporal, but it's fairly minor. You need to decide for yourself whether you favor static quality over temporal quality, but the latter is easily much more of an issue these days than the former. We're at a wall in terms of the effect quantization has on shaded pixels, and as scene complexity increases, it only gets worse.

2) The theory that NVIDIA is pushing FXAA because it has a lower performance hit doesn't hold water. NVIDIA is in the business of selling you faster cards. It's not in their best interest to improve rendering performance on non-NVIDIA hardware, which is what FXAA does.
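To make point 1 concrete, here's a toy sketch of the idea in Python/NumPy. It is not the real FXAA shader (the actual algorithm searches along edge directions and handles luma in a much more careful way); the function name, the threshold value and the 3x3 blur are just illustrative assumptions. The point is only that pixels get blended based on local luma contrast, which is why the whole frame isn't uniformly blurred:

import numpy as np

def toy_edge_aa(img, threshold=0.1):
    # img: float RGB array in [0, 1], shape (H, W, 3).
    # Approximate perceived luma per pixel.
    luma = 0.299 * img[..., 0] + 0.587 * img[..., 1] + 0.114 * img[..., 2]

    # Compare each pixel's luma to its four neighbours (edges padded).
    pad = np.pad(luma, 1, mode='edge')
    n, s = pad[:-2, 1:-1], pad[2:, 1:-1]
    w, e = pad[1:-1, :-2], pad[1:-1, 2:]
    contrast = np.maximum.reduce([n, s, w, e, luma]) - np.minimum.reduce([n, s, w, e, luma])

    # A simple 3x3 box blur stands in for the "anti-aliased" colour.
    padc = np.pad(img, ((1, 1), (1, 1), (0, 0)), mode='edge')
    blur = sum(padc[dy:dy + img.shape[0], dx:dx + img.shape[1]]
               for dy in range(3) for dx in range(3)) / 9.0

    # Only pixels whose local contrast exceeds the threshold get blended.
    mask = (contrast > threshold)[..., None]
    return np.where(mask, blur, img)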
 
Man, you slap some FXAA on GTA4 and the game looks so much better. Other games, I don't really notice much of a difference, so 4x MSAA goes on. I personally don't need much for 2560x1440 resolution.
 
From my experience, FXAA is a blurry mess in BF3, MSAA is not. I used FXAA high with MSAA at 2x and the blurriness wasn't as bad, enough for me to not care anymore. I haven't paid much attention lately, multiplayer isn't for looking at the scenery lol. I'm going to check it out again though and compare.

I did this last night, had been playing at 4X MSAA + FXAA high since getting my 670, dropped FXAA down to low and can't really tell much of a difference in image quality, I guess trees (the leaves) are slightly worse now and the entire scene might be a bit sharper, but it's not a massive difference in motion to me. Performance feels about the same.

psyside: what is your connection and router like? Are you using wifi? Network issues cause FPS stutter in Diablo 3. I've also noticed FPS drops in towns (Hidden Camp mostly). My settings are 1920x1200, in game max except AA off/vsync off, then NV Inspector 4X MSAA + 4X SGSSAA + Adaptive vsync. Overall it's butter smooth though at 60fps 99.5% of the time, just weird drops in town sometimes. Connection is 25/2.5 cable with wired gigabit router.
 
Just download and use the SMAA injector; you can modify the SMAA preset (low/medium/high/ultra). It looks about 10x better than FXAA and has roughly the same "performance hit" as FXAA.

Just don't forget to uncheck FXAA in-game.
 
I partially agree. The problem is that it's not so much subjective as it is a trade-off. MSAA only hits the edges of geometry, whereas FXAA literally anti-aliases the entire screen using a shader-based method. The problem is that if you force it in a game not designed for it, it blurs the UI/HUD and text. It also blurs textures, causing you to lose fine detail. FXAA is basically going thermonuclear in the war against jaggies.

When combined with Transparency Anti-Aliasing, MSAA can offer edge-smoothing comparable to FXAA (minus some shaders), but without losing the finer details. Thus, MSAA + TrAA is generally better in terms of image quality. The problem is that MSAA + TrAA is a huge video memory and bandwidth hog, whereas FXAA taxes your shading power. And where are modern gamers taxed the most? Memory bandwidth and available video memory. On older titles or older GPUs with too much RAM, MSAA is nearly free whereas FXAA is a performance killer. On newer GPUs and in newer games it is the exact opposite with FXAA being nearly free and MSAA being the killer.
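As a rough, hypothetical back-of-the-envelope (my own numbers, not anything measured; it ignores framebuffer compression, the resolve target and whatever else the driver allocates), here's why MSAA leans so hard on video memory: it stores a colour and a depth/stencil value for every MSAA sample of every pixel.

def msaa_framebuffer_mb(width, height, samples, color_bytes=4, depth_bytes=4):
    # One colour sample + one depth/stencil sample per MSAA sample per pixel.
    return width * height * samples * (color_bytes + depth_bytes) / (1024 ** 2)

print(msaa_framebuffer_mb(1920, 1080, 1))  # ~16 MB with no MSAA
print(msaa_framebuffer_mb(1920, 1080, 4))  # ~63 MB at 4xMSAA
print(msaa_framebuffer_mb(2560, 1440, 8))  # ~225 MB at 8xMSAA

FXAA, by contrast, reads and writes the already-resolved frame once, so its cost scales with shader throughput rather than with that multiplied sample storage.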

Which one is better in terms of performance is highly dependent on your system and the game you're playing. Which one is better in terms of looks is not so dependent. MSAA + TrAA is superior in almost all cases. The cases where it's not superior are the cases where it cannot be forced (original Halo, for example).


I understand that fundamentally there are trade-offs involved with each type of AA; however, the final image is still a subjective opinion as to which is better. It's no different than saying Ms North Carolina has great boobs but Ms California has a great butt. Both involve trade-offs, yet which one looks better is still a subjective opinion.
 
It's both subjective and qualitative. For the anti-aliasing of polygonal edges, MSAA can deliver greater quality with fewer temporal artifacts (depending on sample pattern). FXAA anti-aliases polygonal edges reasonably well, but also anti-aliases shader aliasing and objects with transparency, so it's generally best used as a complement to MSAA.

That said, you could probably tune FXAA such that it rivals 8xMSAA quality, so qualitative analysis has to be fairly in-depth to get a good feel as to which is actually better for a given scene. I'm still of the opinion that you really do want some form of MSAA or supersampling for maximum quality, though, with post-process AA used in tandem with either.
 

I agree that 4xMSAA plus a small amount of shader AA looks good in most cases.
 
The SMAA injector in BF3 looks nice. I use all high with low shadows and effects at 1080p. With no other AA enabled at all it looks very smooth. Runs well with some drops, but that's rare.
 
FXAA, IMO, is better: it hits all edges and takes such a small performance hit. The only downside is slight blurring, which really isn't even that bad.
 
Anyone know how to force SSAA in Crysis 2?

Use NVIDIA Inspector.

Just download and use the SMAA injector; you can modify the SMAA preset (low/medium/high/ultra). It looks about 10x better than FXAA and has roughly the same "performance hit" as FXAA.

It also has a tendency to create annoying artifacts due to oversharpening.

1) FXAA doesn't blur the entire frame. It's an edge-detect algorithm that compares each pixel to its adjacent pixels, determines whether aliasing is present, then anti-aliases those pixels. MSAA works similarly on a fundamental level, but doesn't use post-process edge detection. FXAA catches more aliasing, like shader aliasing, and attempts to anti-alias it.

Depends on the algorithm. They do have a tendency to blur most if not all of the scene due to the per-pixel nature of the shader. Though obviously the areas with the most aliasing receive the highest levels of blurring.

The benefit to that is mostly temporal, but it's fairly minor. You need to decide for yourself whether you favor static quality over temporal quality, but the latter is easily much more of an issue these days than the former. We're at a wall in terms of the effect quantization has on shaded pixels, and as scene complexity increases, it only gets worse.

That's completely wrong. FXAA is not effective at reducing temporal aliasing. Timothy Lottes even wrote a blog on the subject. Both MSAA and FXAA are designed to address static aliasing and are ineffective at reducing temporal aliasing. FXAA is actually worse than MSAA at reducing temporal aliasing in most circumstances.

2) The theory that NVIDIA is pushing FXAA because it has a lower performance hit doesn't hold water. NVIDIA is in the business of selling you faster cards. It's not in their best interest to improve rendering performance on non-NVIDIA hardware, which is what FXAA does.

Since when is NVIDIA pushing FXAA?

Developers are pushing FXAA because it gives them what they need. A fast, reasonably effective AA solution that works with deferred rendering on all major platforms and APIs.

I want to try the SMAA thing for BF3, but I'm super afraid of it getting me banned.

Some applications don't like you using injectors for obvious reasons.

Anyone who tells you that FXAA offers better performance or is nearly free is sadly misinformed.

MSAA will always have a higher performance hit than FXAA. I dare you to prove otherwise. We've measured the performance hit of both quite extensively.

FXAA uses shader units whereas MSAA uses video memory and bandwidth. Whichever is taxed less prior to selecting your AA method will give you the better results.

Shaders also use video memory bandwidth and capacity just so you know.

So it comes down to this. On older cards with too much RAM (I'm looking at the GT 430 people with 2GB of RAM), MSAA > FXAA in performance. On mid-range cards and some older high-end cards, they can come out about even, but it's totally situation dependent. On the newer 670/680 cards, FXAA > MSAA in performance. Going forward, FXAA should continue to have the performance advantage over MSAA.

Prove it. MSAA should take at least an order of magnitude longer to process on ANY system. This is just common sense if you look at how the two are handled by the GPU.

In terms of looks, 16xCSAA + 8xTrSSAA is far superior to FXAA. If it's just MSAA vs. FXAA, you get better sharpness with MSAA, but no anti-aliasing on shaders or alpha-textures.

It also fails to address any aliasing other than geometry and transparent texture aliasing. You are still going to have a ton of shader aliasing and subpixel aliasing on modern games.
 
That's completely wrong. FXAA is not effective at reducing temporal aliasing. Timothy Lottes even wrote a blog on the subject.
I never claimed that FXAA uses any temporal methods. It doesn't. Only TXAA does. Reducing aliasing does improve the appearance of moving objects, however, particularly in high-contrast scenes. FXAA can be effective at reducing the appearance of temporal artifacts like crawling, including some that traditional multisampling does not address.

Since when is NVIDIA pushing FXAA?
I didn't claim they were.

Prove it. MSAA should take at least an order of magnitude longer to process on ANY system. This is just common sense if you look at how the two are handled by the GPU.
Not necessarily. Download NVIDIA's FXAA demo executable. On certain systems, FXAA performance can be lower than that of a quality-equivalent level of MSAA.

Post-processing always comes with an overhead penalty. In some cases, the penalty is greater than the performance delta between post-process AA and traditional MSAA.
 
Any form of AA is shit and reduces the picture quality. Here, someone had to say it.

What we need is better displays, more pixels and displays with a better dot pitch.
 
It blurs at least the edges so to me that's reducing "quality" because blur =/= quality. Just my opinion of course, I'm a blur hater.

(doesn't mean I don't use AA btw, in the real world it's quite useful with the monitors we have)
 
Any form of AA is shit and reduces the picture quality. Here, someone had to say it.

What we need is better displays, more pixels and displays with a better dot pitch.

You're right, but you're missing the point of this thread :).

I'd certainly rather have my 2560x1600 30" panel over a 1080p 30" HDTV, even if I could run 4x AA on the TV and nothing on the 4MP monitor.

Higher resolution displays are coming, with Apple of all gawd-awful companies seemingly leading the charge in the consumer space. But it will happen, and our GPUs will need to get ever more powerful to compensate.

When all of our screens are at 'retina' pixel densities, we'll only need well tuned FXAA.
 
It blurs at least the edges so to me that's reducing "quality" because blur =/= quality. Just my opinion of course, I'm a blur hater.

(doesn't mean I don't use AA btw, in the real world it's quite useful with the monitors we have)


Multi-sample anti-aliasing != blurring. It actually renders each pixel more accurately, as each pixel takes between 4 and 256 depth samples. You'll find that even the most amazing 15-megapixel DSLR cameras are not sharp down to the individual pixel. The aperture averages several million photons per digitized 'pixel' when the camera rolls its shutter. Our eyes operate in a similar way; this is why an anti-aliased image always looks more life-like.
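If it helps, here's a toy illustration of that "average many samples per pixel" idea in Python/NumPy. Strictly speaking it shows ordered-grid supersampling (render big, then box-filter down), not true MSAA, which only takes its extra samples along polygon edges, but the averaging step is the same in spirit. The function name and the 2x factor are just made up for the example:

import numpy as np

def downsample_box(img_hi, factor):
    # img_hi: frame rendered at factor * target resolution,
    # float array of shape (H * factor, W * factor, 3).
    h, w, c = img_hi.shape
    assert h % factor == 0 and w % factor == 0
    # Average each factor x factor block into one output pixel.
    blocks = img_hi.reshape(h // factor, factor, w // factor, factor, c)
    return blocks.mean(axis=(1, 3))

# e.g. render at 3840x2160, then average 2x2 blocks down to 1920x1080:
# final_frame = downsample_box(rendered_4k, 2)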
 
You're right, but you're missing the point of this thread :).

I know :p but I just feel it's pointless to argue between FXAA and MSAA when both have their drawbacks and still fail to make the picture look truly aliasing-free.
 
Should I ditch MSAA completely in favor of FXAA or is MSAA still better in terms of image quality?

From my personal experience, FXAA adds a movie-like blur effect, to give that "cinematic feel". MSAA gives a much sharper, clearer image. Remember, it's all personal preference as to what you like in your AA filter. Some people love FXAA while others don't.

There was a thread about this on the main video card forum. Someone showed some SMAA screenshots/videos and I was amazed how much better it was.
 
In truth there should be no argument of MSAA versus FXAA, but rather whether it is worth it to SUPPLEMENT MSAA with FXAA or not.

The answer is yes, you should.
 
I never claimed that FXAA uses any temporal methods. It doesn't. Only TXAA does. Reducing aliasing does improve the appearance of moving objects, however, particularly in high-contrast scenes. FXAA can be effective at reducing the appearance of temporal artifacts like crawling, including some that traditional multisampling does not address.

FXAA is very bad at reducing temporal aliasing. MSAA does a MUCH better job. Even Timothy Lottes, who is often considered the father of FXAA, admits this.

Any form of AA is shit and reduces the picture quality. Here, someone had to say it.

What we need is better displays, more pixels and displays with a better dot pitch.

Most forms of AA raise image quality. I dare you to say otherwise to a film producer.

While better display technology is indeed a better solution, you will never reach perfect or even unnoticeable levels of aliasing with that alone. You will always need some level of AA.

Not necessarily. Download NVIDIA's FXAA demo executable. On certain systems, FXAA performance can be lower than that of a quality-equivalent level of MSAA.

Post-processing always comes with an overhead penalty. In some cases, the penalty is greater than the performance delta between post-process AA and traditional MSAA.

Unless you've invented some magical GPU that has an enormous pixel throughput and almost no shader throughput or you've invented an FXAA shader that is dozens of times slower than the ones being used by current developers I fail to see how this is possible.

Please show me some data that suggests otherwise.

Doesn't work, already tried it.

(the above quote is in regard to forcing SSAA in Crysis 2)
Well then you did something wrong. Were you running the game in d3d9 mode for starters?
 
FXAA is very bad at reducing temporal aliasing. MSAA does a MUCH better job. Even Timothy Lottes, who is often considered the father of FXAA, admits this.
I never claimed that MSAA wasn't more effective at reducing temporal artifacts, did I?

Unless you've invented some magical GPU that has an enormous pixel throughput and almost no shader throughput or you've invented an FXAA shader that is dozens of times slower than the ones being used by current developers I fail to see how this is possible.
You're thinking in terms of shader execution speed and hardware performance. It's slightly more complicated than that. Knowledge of hardware with respect to how many flops they can push does not tell the whole story: APIs have overhead. State switching is costly. You can't jump into a shader program, do something with it and jump back out of it without a cost. Your failure to see this is not my concern.

All I can tell you is that I ran NVIDIA's own FXAA demo included in SDK 11 on a GT 430 and, in that demo, in that scene, FXAA yielded a greater performance hit than 4xMSAA did at the time I ran it. I suspect NVIDIA knows how to author a demo program properly, unless you have reason to suspect otherwise. I'm not downloading it, swapping my GT 430 back in and running benchmarks for your edification. I'm not knocking FXAA either, so don't get so defensive.
 
Two points:

1) FXAA doesn't blur the entire frame. It's an edge-detect algorithm that compares each pixel to its adjacent pixels, determines whether aliasing is present, then anti-aliases those pixels. MSAA works similarly on a fundamental level, but doesn't use post-process edge detection. FXAA catches more aliasing, like shader aliasing, and attempts to anti-alias it.

An FXAA'ed frame will always appear softer than an MSAA'd frame, since FXAA, by design, attacks more aliasing. The benefit to that is mostly temporal, but it's fairly minor. You need to decide for yourself whether you favor static quality over temporal quality, but the latter is easily much more of an issue these days than the former. We're at a wall in terms of the effect quantization has on shaded pixels, and as scene complexity increases, it only gets worse.
Who cares, if the end result is that the entire image is still blurred much more with FXAA than with MSAA? The simple fact is that FXAA is profoundly inferior in its current implementations, especially as you move to higher resolutions and better image quality settings.
2) The theory that NVIDIA is pushing FXAA because it has a lower performance hit doesn't hold water. NVIDIA is in the business of selling you faster cards. It's not in their best interest to improve rendering performance on non-NVIDIA hardware, which is what FXAA does.
Wrong. NVIDIA is in the business of selling you video cards, period, be they faster or not. Poor man's AA is a great way to cheaply "maintain" IQ while making your product look faster compared to the competition. You should read up on NVIDIA's FXAA white papers, especially Timothy Lottes' stuff.
 
I never claimed that MSAA wasn't more effective at reducing temporal artifacts, did I?
Actually you did:
An FXAA'ed frame will always appear softer than an MSAA'd frame, since FXAA, by design, attacks more aliasing. The benefit to that is mostly temporal, but it's fairly minor.
You're thinking in terms of shader execution speed and hardware performance. It's slightly more complicated than that. Knowledge of hardware with respect to how many flops they can push does not tell the whole story: APIs have overhead. State switching is costly. You can't jump into a shader program, do something with it and jump back out of it without a cost. Your failure to see this is not my concern.

All I can tell you is that I ran NVIDIA's own FXAA demo included in SDK 11 on a GT 430 and, in that demo, in that scene, FXAA yielded a greater performance hit than 4xMSAA did at the time I ran it. I suspect NVIDIA knows how to author a demo program properly, unless you have reason to suspect otherwise. I'm not downloading it, swapping my GT 430 back in and running benchmarks for your edification. I'm not knocking FXAA either, so don't get so defensive.
Wow, really? Do you think it's more likely that all the articles and research written on FXAA are somehow wrong or misunderstood by everyone but you, or that you missed something that one time you ran the demo (and can't even confirm)? :rolleyes:
 
The simple fact is that FXAA is profoundly inferior in its current implementations, especially as you move to higher resolutions and better image quality settings.
That is not "simple fact". Your stating that it is does not make it so.

Wrong. NVIDIA is in the business of selling you video cards, period, be they faster or not. Poor man's AA is a great way to cheaply "maintain" IQ while making your product look faster compared to the competition. You should read up on NVIDIA's FXAA white papers, especially Timothy Lottes' stuff.
FXAA is not exclusive to NVIDIA hardware. I don't understand from where your confusion stems. Someone who's actually read Lottes' blog would understand this.

Actually you did
That would be a misinterpretation. I was stating that a frame 'softened' as a result of a post-process step will appear to have fewer temporal artifacts. I wasn't stating that MSAA is inferior to FXAA with respect to reducing temporal aliasing.

Wow, really? Do you think it's more likely that all the articles and research written on FXAA are somehow wrong or misunderstood by everyone but you...
To what articles and research would you be referring? I'm aware of no such articles that authoritatively state that FXAA will always be faster than MSAA regardless of how it's implemented, what preset is used, what type of scene the anti-aliasing is being applied to and what hardware such tests are performed on. If you know of such an article, please show it to me.
 
It's impossible to prove that it would never be possible, but all of the FXAA shaders that are currently widely used are dozens or hundreds of times faster than MSAA on any hardware, depending on the FXAA preset, MSAA level, and scene.

I have never ever seen any comparison data between the two that hasn't shown FXAA being massively faster at any resolution and with any preset. The difference in performance between the two is so great that I am skeptical of the accuracy of your results.

You're thinking in terms of shader execution speed and hardware performance. It's slightly more complicated than that. Knowledge of hardware with respect to how many flops they can push does not tell the whole story: APIs have overhead. State switching is costly. You can't jump into a shader program, do something with it and jump back out of it without a cost. Your failure to see this is not my concern.

I am aware of that but the framebuffer would have to be extremely small for that overhead to actually cost more performance than you save. So small that you can basically ignore the possibility with modern games.

The simple fact is that FXAA is profoundly inferior in its current implementations, especially as you move to higher resolutions and better image quality settings.

This doesn't make sense. FXAA is better at higher resolutions. It causes less blurring and has a higher impact on aliasing as the internal resolution increases.
 
I have never ever seen any comparison data between the two that hasn't shown FXAA being massively faster at any resolution and with any preset. The difference in performance between the two is so great that I am skeptical of the accuracy of your results.

http://hardforum.com/showthread.php?t=1699111

I have already shown that FXAA is not faster than MSAA in every case. In fact, I've found 16xCSAA + 8xTrSSAA to be on par with or faster than FXAA in some cases, and within striking distance in others.
 
I can't stand FXAA as it reduces texture quality.
 
Y'all are going off on edge cases and have forgotten the point altogether. FXAA is lightweight and targetable (BF3 with FXAA Medium), while MSAA and TrAA work to clean the rest up.

The real answer to this thread is that in two or three years from now we won't have AA settings, or if we do have them we won't need them, because developers will be able to match custom mixes of various forms of AA to their game content, providing the best image quality possible with a wide range of rendering resources.

And to that end, BF3 being the example, consider 2xMSAA + FXAA Medium to be the best compromise; turn down whatever else you have to in order to get 60+ FPS in MP.
 
SMAA seems to have gained traction over the past few months and replaced FXAA as the best AA implementation outside of MSAA.
 
SMAA seems to have gained traction over the past few months and replaced FXAA as the best AA implementation outside of MSAA.

Because SMAA doesn't blur the whole screen. SMAA retains the original image's sharpness.

I'd take MLAA over FXAA even. MLAA isn't as good at reducing aliasing, but it doesn't blur the whole screen.

MSAA + SMAA are the best options.
 
I'm alpha testing the game Project Cars, and we recently got downsampling as an alternative to MSAA. We can choose between 2X or 4X. The game runs so much nicer with downsampling! And for those of us that just can't let well enough alone, we can slap FXAA or SMAA on top of the downsampling. Looks sweet!
 
Because SMAA doesn't blur the whole screen. SMAA retains the original image's sharpness.

I'd take MLAA over FXAA even. MLAA isn't as good at reducing aliasing, but it doesn't blur the whole screen.

MSAA + SMAA are the best options.

MLAA does some weird things to my text that keep me from enabling it in-game. I just can't take the blurred text in most titles.
 
Because SMAA doesn't blur the whole screen. SMAA retains the original image's sharpness

The in-game FXAA implementation does a much better job than forcing FXAA through the NVIDIA CP and doesn't blur the entire screen.
 
Anyone have any experience with TXAA? I tried it out in Crysis 3 and was pretty impressed with its quality and performance. It seemed to best MSAA and SMAA. However, I still really like SMAA all around for demanding games and MSAA for less demanding games where I have the performance to spare.
 
Anyone have any experience with TXAA? I tried it out in Crysis 3 and was pretty impressed with its quality and performance. It seemed to best MSAA and SMAA. However, I still really like SMAA all around for demanding games and MSAA for less demanding games where I have the performance to spare.

TXAA does a good job reducing jaggies, but again, it adds a blur and it also messes with the color palette for some reason.
 