16x CSAA vs 16xq CSAA vs 32x CSAA

  • Thread starter Deleted whining member 223597
So I just popped in my GTX 460 1GB and started playing some BFBC2 campaign (just the first level). I was using 16xQ CSAA and there were no noticeable stutters, but which offers the best balance of performance and image quality: 16x CSAA, 16xQ CSAA, or 32x CSAA?
 
Someone really needs to go through all the AA modes Nvidia offers and list them all in order from worst image quality to best.
 
It's already been done, plus it's common sense: higher number = better, Q = better. Do you want me to list the additional AA modes provided by Nvidia Inspector or just the ones officially supported by Nvidia's control panel?

In order of worst quality to best quality:
Off (none)
2x MSAA
4x MSAA
8x CSAA
8xQ CSAA/16x CSAA (which is better depends on the game)
16xQ CSAA
32x CSAA

In order of best performance to worst performance:
Off (none)
2x MSAA
4x MSAA
8x CSAA
16x CSAA (in extremely rare cases 8xQ can be faster)
8xQ CSAA
16xQ CSAA
32x CSAA

An easy way to think of it is Q = quality. And higher quality tends to lead to less performance.

By the way they are already listed in the correct order both in the nvidia control panel and in every single game I have ever played that has in-game AA options.
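Those two orderings can be reproduced from the sample counts alone. A minimal sketch (my own, using the per-pixel colour/coverage figures discussed later in this thread): cost is driven mainly by colour samples, quality mainly by total coverage samples.

```python
# Each mode's (colour samples, total coverage samples) per pixel,
# as described in this thread.
modes = {
    "2x MSAA":   (2, 2),
    "4x MSAA":   (4, 4),
    "8x CSAA":   (4, 8),
    "8xQ CSAA":  (8, 8),
    "16x CSAA":  (4, 16),
    "16xQ CSAA": (8, 16),
    "32x CSAA":  (8, 32),
}

# Cost is dominated by colour samples, then coverage samples:
# sorting on (colour, coverage) reproduces the performance list above.
by_cost = sorted(modes, key=lambda m: modes[m])

# Quality is dominated by total coverage samples, then colour samples:
# sorting on (coverage, colour) reproduces the quality list above
# (8xQ vs 16x is the game-dependent near-tie).
by_quality = sorted(modes, key=lambda m: (modes[m][1], modes[m][0]))

print("fastest -> slowest:", " < ".join(by_cost))
print("worst -> best:     ", " < ".join(by_quality))
```

The two sort keys differing only in which count comes first is exactly why 16x CSAA sits above 8xQ in the performance list but below it (or tied) in the quality list.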
 
Just to throw a spanner in the works...

I've consistently found in all games I play that 8X MSAA is the best AA setting for nvidia.

Yes, it looks better than 32X (which only has 8 colour samples anyway). Why?

The extra coverage samples don't actually improve the image, imho; in fact they make it worse, because they further exaggerate/highlight any and all artifacts from deferred rendering techniques.

8X MSAA just offers the cleanest image. CSAA doesn't practically net you a better image in modern games. IMHO.
 
Yes, it looks better than 32X (which only has 8 colour sample anyway). Why?

Because the hardware and drivers have only ever been designed for 8 samples. That's the maximum count supported by D3D9, OpenGL, D3D10/10.1/11, etc.

So then you just have 4 coverage samples per stored color/depth/stencil sample to give you 32. Currently 4 coverage samples per stored color sample is the most Nvidia's hardware is designed to do (MSAA/CSAA actually have fixed-function hardware on the GPU specifically designed for that function).

Just make sure to point out to people that 8xMSAA is 8xQ CSAA so that they don't get confused.

Regardless of whether it can further emphasize other artifacts in some situations, having additional coverage samples does improve its effectiveness at eliminating geometry aliasing.
 
Just make sure to point out to people that 8xMSAA is 8xQ CSAA so that they don't get confused.

What does that mean, though? 8xMSAA has 8 color samples, right? (Still learning.) And 8xQ CSAA has 8 color samples but... more coverage samples? How does that compute? Or do both have 8 color samples but no extra coverage samples? Oh god, Nvidia, my mind it hurts.
 
It's the same exact thing. 8xQ CSAA is the same as 8xMSAA. Technically MSAA always has coverage samples, CSAA is EXTRA coverage samples. With regular MSAA you have 1 coverage sample per stored color sample. With CSAA you normally have multiple coverage samples per stored color sample to further improve efficiency.

2xMSAA is 2xMSAA
4xMSAA is 4xMSAA
8x CSAA is 4xMSAA + 4 additional coverage samples (8 coverage samples total)
8xQ CSAA is 8xMSAA (so it still has 8 coverage samples but it has more color samples)
16x CSAA is 4xMSAA + 12 additional coverage samples (16 coverage samples total)
16xQ CSAA is 8xMSAA + 8 additional coverage samples (still has 16 coverage samples per pixel total but with more color samples)

The Q stands for quality, which implies more color samples in the mix and therefore better quality/less performance. The number at the beginning refers to the total number of coverage samples per pixel. MSAA implies 1 color sample for each coverage sample; CSAA implies fewer color samples than coverage samples.

Here, got to love charts: http://developer.nvidia.com/csaa-coverage-sampling-antialiasing
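The breakdown above can be written out as a small table and sanity-checked. A sketch of mine, using the figures from this thread (32x CSAA's split of 8 colour samples with 4 coverage samples each comes from the earlier post):

```python
# Per-pixel breakdown: (colour samples, extra coverage-only samples).
# Plain MSAA modes have zero extra samples; CSAA modes add coverage-only
# samples on top of a 4- or 8-colour-sample base.
breakdown = {
    "2x MSAA":   (2, 0),
    "4x MSAA":   (4, 0),
    "8x CSAA":   (4, 4),
    "8xQ CSAA":  (8, 0),   # identical to 8xMSAA
    "16x CSAA":  (4, 12),
    "16xQ CSAA": (8, 8),
    "32x CSAA":  (8, 24),
}

for mode, (color, extra) in breakdown.items():
    total = color + extra          # the number in the mode's name
    ratio = total / color          # coverage samples per colour sample
    # Per the thread, the hardware tops out at 4 coverage samples
    # per stored colour sample (hit only by 32x CSAA).
    assert ratio <= 4
    print(f"{mode:>9}: {color} colour + {extra} extra coverage = {total} total")
```

Note how 8xQ CSAA comes out as 8 colour + 0 extra, which is why it's literally the same thing as 8xMSAA.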

Oh god, Nvidia, my mind it hurts.
Trust me when I say this: ATI's system is MORE confusing. They have different sample coordinate grid patterns and resolve/blend filters that can be mixed with MSAA/EQAA/SSAA. But in this case, confusing is good. Confusing here implies more options, which is never a bad thing.
 
Thanks!

And is it just me or does BFBC2 not look that great? I mean I went from Medium without AA, without HBAO, and DX10 to High everything with 16x CSAA (maybe I will go back to 16xQ CSAA since that was awesome) and HBAO, and it doesn't really look better. :confused:
 
It's really the colour samples that count though.

I find that geometry edges in games smooth really well anyway and so don't need the extra coverage samples from high CSAA modes to improve them; it just draws attention to the now relatively lower AA being applied to everything else. This is why I reckon it makes a given scene, as a whole, look perceivably worse.
 
Thanks!

And is it just me or does BFBC2 not look that great? I mean I went from Medium without AA, without HBAO, and DX10 to High everything with 16x CSAA (maybe I will go back to 16xQ CSAA since that was awesome) and HBAO, and it doesn't really look better.

Yup. Welcome to cross-platform graphics engines. The biggest difference between medium and high settings is the draw distances, which aren't that noticeable. HBAO does offer an improvement but you won't really notice it unless you're actively looking for the ambient shadows.

And the in-game anti-aliasing is terrible. No matter how high your sample count is it simply doesn't affect aliasing on most objects at all, especially on bright maps. You can blame deferred rendering for that one.

This is why I reckon it makes a given scene, as a whole, look perceivably worse.

Well I guess we have a difference of opinion then.
 
OK - Would most people agree that 32x CSAA followed by 16xQ CSAA are the highest quality anti-aliasing options?

I've just set up GTX 480 SLI and I'm finding that I can absolutely max out the settings for certain games (e.g. Left 4 Dead, Team Fortress 2, etc.), but I was also confused about the "highest quality" anti-aliasing mode available.
 
32x and 16xQ are just 8x multisampling with additional coverage samples. I often find it difficult to see a difference except perhaps when they're combined with transparency antialiasing.
 
Past 8xQ/8xMSAA I notice zero difference at 32x. And I'm on a 25" 1080p monitor. I'd like to get myself either a 30" Dell or a giant 1080p TV for games; maybe I could see it on the TV. But even between 4x and 8x the edges have to be damn contrasted for me to notice during play.
 
This is completely normal. 8 samples is already enough to eliminate most geometry aliasing, and the returns diminish: each time you double the number of samples, the increase in effectiveness is much smaller than the previous doubling.
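The diminishing returns fall out of a simple model (my own simplification, not from the thread): with n samples, an edge pixel's coverage is quantised into steps of 1/n, so the worst-case error is 1/(2n). Each doubling halves the remaining error, so the absolute gain of every further doubling keeps shrinking.

```python
# Worst-case edge-coverage quantisation error for each sample count,
# under the simplified model that n samples give n+1 coverage levels.
errors = {n: 1 / (2 * n) for n in [2, 4, 8, 16, 32]}

for n, err in errors.items():
    print(f"{n:>2} samples: worst-case coverage error {err:.4f}")
```

Going from 2x to 4x removes 0.125 of error; going from 16x to 32x removes only about 0.016, which is why the jump past 8 samples is so hard to see.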
 