NVIDIA Multi-Frame Sampled AA Video Card Review @ [H]

AA was originally made for low resolutions like 640x480 or 800x600, where jaggies made everything look like meh. But once you get to 1080p+ resolutions, AA is a pointless waste of resources.
That has to be one of the dumbest things I have ever heard. Some games have horrible jaggies with lots of obvious crawling on a typical-size 1080p screen. Hell, even at 4K this is still visible in some cases.
 
You always use SMAA anyway if you use AA. You could also just inject SMAA into your games if they don't come with it.

That has to be one of the dumbest things I have ever heard. Some games have horrible jaggies with lots of obvious crawling on a typical-size 1080p screen. Hell, even at 4K this is still visible in some cases.
Show me a comparison then. Photos or something.
 
Actually, the theoretical best image you can get comes from some combination of sparse and ordered grid supersampling, but that's rarely going to be something you can do in real time. FXAA uses an entirely different mechanism, and helps knock out any last bits of aliasing in addition to softening the frame, which helps combat temporal artifacts.

It's also almost always free or near-free.
Maybe. I figure with OGSSAA or SGSSAA you're already getting SOME smoothing on just about everything, so I think it would just add more blur rather than help much with the straggling areas. FXAA shines the most on rough aliased areas that can form a clear edge. That's why it's often not bad with downsampling, since it can be applied to the high-res image BEFORE the downsample, giving a smoother result than just the raw downsample.
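For anyone who hasn't thought about what that downsample step actually buys you, here's a rough sketch (plain Python/numpy, toy numbers, not any game or driver) of ordered-grid supersampling: "render" a hard edge at 4x the target resolution, box-filter it down, and the stair-step turns into fractional coverage.

```python
# A minimal sketch of ordered-grid supersampling in plain numpy: "render" a
# hard diagonal edge at 4x the target resolution, then box-filter it back down.
# Toy numbers only; no real renderer or driver involved.
import numpy as np

TARGET = 32               # target framebuffer: 32x32 "pixels"
FACTOR = 4                # 4x4 ordered grid = 16 samples per output pixel
HI = TARGET * FACTOR

# High-res "render": 1.0 on one side of a diagonal edge, 0.0 on the other.
ys, xs = np.mgrid[0:HI, 0:HI]
hi_res = (ys > 0.7 * xs).astype(np.float32)

# Box-filter downsample: each output pixel is the mean of its 4x4 sample
# block, so the stair-step turns into fractional coverage (grey levels).
ssaa = hi_res.reshape(TARGET, FACTOR, TARGET, FACTOR).mean(axis=(1, 3))

# Baseline: one centered sample per output pixel (no AA) -- pure 0/1 steps.
no_aa = hi_res[FACTOR // 2::FACTOR, FACTOR // 2::FACTOR]

print("pixel values, no AA:     ", np.unique(no_aa))
print("pixel values, 4x4 OGSSAA:", np.unique(ssaa)[:8])
```

Sparse and rotated grids just pick smarter sample positions inside each pixel; the averaging step is the same idea.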
 
You always use SMAA anyway if you use AA. You could also just inject SMAA into your games if they don't come with it.


Show me a comparison then. Photos or something.

I don't think you understand. AA wasn't created because of a resolution issue. AA was created to deal with the limitations of digitally rendering something that is inherently analog from our eyes' point of view. I might not need AA if I had a 0.5"x0.5" screen that was ~640 pixels by 640 pixels. However, even if you managed to get your pixel density to "retina levels" or better at your viewing distance, there are still, as I said above, inherent limitations of renders which can make things look "jaggy" or "blocky" at that point. What AA does is attempt to make things more correct to how our eyes perceive reality so we don't see them as disjointed.
 
I just enabled it in Crysis 3 and my lil 970 really likes it. Went from mixed frame rates that varied from the mid-30s to mid-50s to a nearly solid 60. Game still looks great IMHO.
 
I don't think you understand. AA wasn't created because of a resolution issue. AA was created to deal with the limitations of digitally rendering something that is inherently analog from our eyes' point of view. I might not need AA if I had a 0.5"x0.5" screen that was ~640 pixels by 640 pixels. However, even if you managed to get your pixel density to "retina levels" or better at your viewing distance, there are still, as I said above, inherent limitations of renders which can make things look "jaggy" or "blocky" at that point. What AA does is attempt to make things more correct to how our eyes perceive reality so we don't see them as disjointed.
It was originally created to smooth the edges of an image on a low-resolution screen. Whether you need AA or not depends on two things. One is resolution: the higher the resolution, the less need for AA. The second is the size of the screen. At 20, 21, and 22 inches, a 1080p display barely needs any AA. At 28 inches, a 4K display doesn't need any AA.

But AA trades so much performance for a small, if any, image quality increase. Take a look at Tetris42's image. I honestly can't see a difference, except that FXAA makes the image blurry. It seems AA only really affects the top right of the fence, while the rest of the fence is still rather jaggy.

[attached screenshot]
 
It was originally created to smooth the edges of an image on a low-resolution screen. Whether you need AA or not depends on two things. One is resolution: the higher the resolution, the less need for AA. The second is the size of the screen. At 20, 21, and 22 inches, a 1080p display barely needs any AA. At 28 inches, a 4K display doesn't need any AA.

But AA trades so much performance for a small, if any, image quality increase. Take a look at Tetris42's image. I honestly can't see a difference, except that FXAA makes the image blurry. It seems AA only really affects the top right of the fence, while the rest of the fence is still rather jaggy.


That's the worst possible scenario for an AA comparison; anyway, I'll do a better job for you...

[three attached comparison screenshots]


Do you need more?... Sawtooth edges are just unacceptable for me.
 
I am very curious how DSR factors into all of this. On many games, simply disabling all available in-game/control panel AA and using DSR instead often results in a superior gameplay experience. Although DSR can still be used in conjunction with, and is not necessarily a replacement for other forms of AA, I'm surprised to see it not even mentioned here. DSR is a tech that obviously comes into play when you're looking at your AA options and is really a game changer in many cases.
 
I think there is some forward thinking with this technology, for guys like me and others doing 4K right now. It's all very niche, but for those experiencing this type of high-resolution gaming, AA is gravy. If you can even afford the performance hit, then it's a bonus. Most of the time we deal with the jaggies. They're harder to notice the higher up you go in resolution, but someone who has been gaming at high res long enough can still notice them.

This method gives you image quality similar to 8x at a 4x performance cost, which will really benefit guys doing triple screen. Also, the 4K crowd can now utilize 4x AA quality at 2x cost, which may improve their experience quite a bit.
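If it helps to picture how alternating sample patterns can add up to more effective samples, here's a toy Python sketch of the basic idea as I read it from the article; the sample positions and the simple 50/50 blend are my own made-up numbers, not NVIDIA's actual pattern or filter:

```python
# Toy sketch of the alternating-pattern idea: two different 2-sample patterns
# on consecutive frames, temporally blended, reproduce what a single 4-sample
# pattern would have measured on a static edge -- 4x-ish quality for 2-sample
# cost per frame. Sample positions and blend weights are invented for the demo.
import numpy as np

def coverage(edge_x, samples):
    """Fraction of sub-pixel sample positions left of a vertical edge."""
    return np.mean([1.0 if sx < edge_x else 0.0 for sx, _ in samples])

pattern_a = [(0.125, 0.25), (0.625, 0.75)]   # 2 samples used on frame N
pattern_b = [(0.375, 0.25), (0.875, 0.75)]   # 2 different samples on frame N+1

edge_x = 0.7                                  # edge crosses 70% of the pixel

frame_n  = coverage(edge_x, pattern_a)        # 1.0 -- too coarse on its own
frame_n1 = coverage(edge_x, pattern_b)        # 0.5
blended  = 0.5 * (frame_n + frame_n1)         # 0.75 after the temporal combine
full_4x  = coverage(edge_x, pattern_a + pattern_b)

print(blended, full_4x)                       # both 0.75 for this static edge
```

The obvious catch, and presumably part of why it needs per-game validation, is that the two frames only line up like this when the scene isn't moving much.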

The caveat Nvidia needs to find a way around is the per-app thing. They need to get this method working quickly on any and all of the big titles from the last 3 years, and all the upcoming titles in the next year, to even make an impact. It's here where they are lacking.

Also they need to get this working on Multi GPU setups ASAP.
 
I don't think you understand. AA wasn't created because of a resolution issue. AA was created to deal with the limitations of digitally rendering something that is inherently analog from our eyes' point of view.
Quantization limits are exactly why we need anti-aliasing. It has nothing to do with how our eyes perceive things — it has to do, quite simply, with quantization limits.
 
You always use SMAA anyway if you use AA. You could also just inject SMAA into your games if they don't come with it.


Show me a comparison then. Photos or something.

I have tried a number of times to inject SMAA using RadeonPro on Nvidia cards (SweetFX) and have yet to get it to work; either something in the Nvidia control panel wasn't playing nicely with it, or it was not compatible with Win 8.1 64-bit (SweetFX).

I have checked out a number of tutorials/guides on Nvidia RadeonPro SMAA injection and none have worked. It worked great with my AMD 7950/7970s though. Is there someone here at [H] that is using it in current DX11 titles on Win 8.1 with an Nvidia card?
 
I am very curious how DSR factors into all of this. On many games, simply disabling all available in-game/control panel AA and using DSR instead often results in a superior gameplay experience. Although DSR can still be used in conjunction with, and is not necessarily a replacement for other forms of AA, I'm surprised to see it not even mentioned here. DSR is a tech that obviously comes into play when you're looking at your AA options and is really a game changer in many cases.

Yeah, do we even need this new technology when we have DSR?
 
Yeah, do we even need this new technology when we have DSR?

I found blending a low DSR factor with AA is better, for IQ and performance, than maxing out either AA or DSR. In FC4 I use SMAA and a 1.25x DSR multiplier and it looks gorgeous. My base resolution is 1920x1200.

Played with basically the same thing in Battlefield 4 for a long time: some resolution scaling + 2x MSAA to match 4x MSAA performance, and it looks great. Just doing 4x MSAA or a higher scaling factor alone didn't look as good.
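For anyone wondering what those multipliers actually cost, a quick bit of back-of-the-envelope Python, assuming the DSR factor multiplies total pixel count so each axis scales by its square root (which is how the standard factors like 4.00x turn 1080p into 4K):

```python
# Back-of-the-envelope cost of a DSR factor: the factor is applied to total
# pixel count, so each axis scales by sqrt(factor). Illustrative numbers only,
# using the 1920x1200 base mentioned above.
import math

def dsr_render_resolution(width, height, factor):
    scale = math.sqrt(factor)
    return round(width * scale), round(height * scale)

base_w, base_h = 1920, 1200
for factor in (1.25, 2.0, 4.0):
    w, h = dsr_render_resolution(base_w, base_h, factor)
    cost = (w * h) / (base_w * base_h)
    print(f"{factor}x DSR: render at {w}x{h} (~{cost:.2f}x the pixels to shade)")
```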
 
that's the worse possible scenario for AA comparison, anyway i'll do a better job for you..
That comparison was specifically to show the LIMITATIONS of shader-based AA. Some people talk about SMAA like it's the second coming of Christ and solves all your problems. That image is to show, no, it really doesn't. You need SOME flavor of SSAA or downsampling to clean up the image in all situations.

Both SMAA and FXAA do a great job on SOME parts of the scene. On others, they may as well be doing nothing at all.
 
Unreal's Temporal AA is pretty rad.

If you're standing still and look very carefully, you can see it continue to converge. It's sub pixel and very high quality.
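Roughly what that convergence is doing, as I understand it: jitter the sample position by a sub-pixel amount every frame and blend each frame into a history buffer, so a static pixel keeps accumulating effective samples. A toy one-scanline Python sketch, ignoring the reprojection and history clamping a real implementation like Unreal's needs for motion:

```python
# Toy temporal accumulation on a single scanline: each "frame" takes one
# jittered sample per pixel of a vertical edge, and an exponential moving
# average plays the role of the TAA history buffer.
import numpy as np

def render_scanline(jitter, edge_x=0.3, pixels=8):
    """One 'frame': a single jittered sample per pixel of a vertical edge."""
    centers = (np.arange(pixels) + 0.5 + jitter) / pixels
    return (centers < edge_x).astype(float)

rng = np.random.default_rng(1)
history = render_scanline(0.0)
for _ in range(64):
    current = render_scanline(rng.random() - 0.5)   # +/- half-pixel jitter
    history = 0.9 * history + 0.1 * current         # exponential accumulation

# The pixel straddling the edge settles near its true fractional coverage.
print(history.round(2))
```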
 
But AA trades so much performance for a small, if any, image quality increase. Take a look at Tetris42's image. I honestly can't see a difference, except that FXAA makes the image blurry. It seems AA only really affects the top right of the fence, while the rest of the fence is still rather jaggy.
That's because it's using SMAA/FXAA. If you used actual SSAA, the fence would look MUCH better, especially in motion:

[animated comparison: MSAA-vs-SSAA-transparency.gif]


This is a problem we had solved years ago, but we've since broken it.
 
That's because it's using SMAA/FXAA. If you used actual SSAA, the fence would look MUCH better, especially in motion:

[animated comparison: MSAA-vs-SSAA-transparency.gif]


This is a problem we had solved years ago, but we've since broken it.
We've broken it because now developers are back to square one of the problem. They're trying to render more polygons on the screen at once, which requires a lot of processing power. Power that cannot go to a traditional supersampling process.

I wouldn't mind having these new shader AA methods in addition to the traditional hardware ones. Unfortunately they're taking away options on the driver side and only including sparse options on the game side. Some games are not even including MSAA as an option anymore...

I miss CSAA :(.
 
We've broken it because now developers are back to square one of the problem. They're trying to render more polygons on the screen at once, which requires a lot of processing power. Power that cannot go to a traditional supersampling process.

I wouldn't mind having these new shader AA methods in addition to the traditional hardware ones. Unfortunately they're taking away options on the driver side and only including sparse options on the game side. Some games are not even including MSAA as an option anymore...

I miss CSAA :(.
Well it's more like we're doing deferred rendering so that traditional AA techniques can't even get access to the geometry to do their calculations, so all you're left with is shader-based guessing or downsampling.

Again, we kind of had the best of both worlds with TrSSAA from Nvidia and Adaptive AA from ATI. It used MSAA on polygons, and SSAA on alpha textures. It really was the holy grail in terms of graphics quality vs. speed until everything got broken.
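To illustrate why plain MSAA never helped those fences: the alpha test runs in the pixel shader, once per pixel, so the extra coverage samples never see it. Supersampling the alpha evaluation itself, which is roughly what TrSSAA/Adaptive AA did, is what brings back soft coverage. A toy Python sketch with a made-up radial alpha mask standing in for a fence texture:

```python
# Toy illustration of alpha-test aliasing: one alpha test per pixel gives a
# hard 0/1 cutout (what MSAA alone effectively leaves you with), while
# averaging several jittered alpha taps per pixel gives a soft coverage edge.
# The alpha mask here is invented purely for the demo.
import numpy as np

SIZE = 16
rng = np.random.default_rng(0)

def alpha_opaque(u, v):
    """Fake alpha channel: opaque inside a disc, fully transparent outside."""
    return ((u - 0.5) ** 2 + (v - 0.5) ** 2) < 0.16

# One alpha test per pixel center.
centers = (np.arange(SIZE) + 0.5) / SIZE
one_tap = alpha_opaque(centers[None, :], centers[:, None]).astype(float)

# Supersampled alpha test: average 8 jittered taps per pixel.
taps = np.zeros((SIZE, SIZE))
for _ in range(8):
    ju = (np.arange(SIZE) + rng.random(SIZE)) / SIZE
    jv = (np.arange(SIZE) + rng.random(SIZE)) / SIZE
    taps += alpha_opaque(ju[None, :], jv[:, None])
taps /= 8

print("coverage values, 1 tap :", np.unique(one_tap))               # only 0 or 1
print("coverage values, 8 taps:", np.unique(np.round(taps, 2))[:8])  # gradient
```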
 
1) We do need AA even on higher-density displays. If you want to understand why, I would invite you guys to look up the concept of hyperacuity, or how our "software" seems to have an aliasing-finding machine that goes beyond our eyes' hardware accuracy, to the tune of 5-10x higher acuity when dealing with things that "don't quite match": http://www.michaelbach.de/ot/lum_hyperacuity/


2) Supersampling anti-aliasing isn't really a solution beyond saying that we need even higher-density displays, because what SSAA does is just render the image at that even higher resolution; it ends up blended down to whatever resolution your display can actually show, but this is grossly power-inefficient.

3) This is indeed pretty much the same as AMD's old Temporal AA solution, http://www.tomshardware.com/reviews/anti-aliasing-nvidia-geforce-amd-radeon,2868-6.html ; it uses more varied patterns, but the problems should be pretty much the same going forward. Not a fan.


All in all, I would agree that the best thing going forward would be more generalized solutions rather than these specific ones. Here's hoping we get some new surprises this coming year like we had in 2011 with the jump from MLAA to SMAA.
 
1) We do need AA even on higher-density displays. If you want to understand why, I would invite you guys to look up the concept of hyperacuity, or how our "software" seems to have an aliasing-finding machine that goes beyond our eyes' hardware accuracy, to the tune of 5-10x higher acuity when dealing with things that "don't quite match": http://www.michaelbach.de/ot/lum_hyperacuity/
Hadn't seen that before. On my screen I get to 0.13 pixels on the lines before I can't tell the difference anymore. Would be interesting to see what people score who think we don't need AA.

3) This is indeed pretty much the same as AMD's old Temporal AA solution, http://www.tomshardware.com/reviews/anti-aliasing-nvidia-geforce-amd-radeon,2868-6.html ; it uses more varied patterns, but the problems should be pretty much the same going forward. Not a fan.
Well, the biggest difference is that ATI's solution worked on Direct3D games in general, while Nvidia's needs to be implemented on a case-by-case basis.
 
Quantization limits are exactly why we need anti-aliasing. It has nothing to do with how our eyes perceive things — it has to do, quite simply, with quantization limits.

Six of one, half a dozen of the other. The fact is there are many rendered images that have jaggies that shouldn't be there. That is a render problem, simply stated. You can attach whatever terminology you would like, but how we render stuff, regardless of resolution, is why we still need and still develop AA. It is only recently that they are trying to come up with more clever (i.e., cheap but still good) ways to deal with the limits of current render methods.
 
MFAA hits alpha textures/transparency too, according to some articles I'm seeing. SLI support for it can't come soon enough, since I play BF4 basically every day and it's one of the whitelisted titles. There's also speculation they may open it up at some point, since it was originally intended to work on everything but had visual glitches in a decent number of games, if they can resolve that over time. Regardless, I am quite sure we'll see that whitelist of titles expand pretty rapidly, and 20+ is a lot more than Mantle works on right now, which people laud as an amazing innovation :p.

Hoping this gets well-supported and keeps expanding the list of games for now, though... it's a great tech and a big performance advantage, since it basically gives you just shy of 4x MSAA quality (and better if you factor in the transparency anti-aliasing it does additionally, since you then don't need FXAA, which degrades the whole image) at a far higher performance level. For those of us running 4K 60 fps and trying to keep our minimums above 55-60, that can be the difference between being able to do so or not in some cases!

Also, as I said, the development industry is already trending away from deferred rendering, with forward+ becoming the preferred tech going forward (pardon the pun). It was a temporary solution that won't be common a few years down the line, and a lot of major games already allow for MSAA in-engine even when they're deferred (see Battlefield 4 for a prime example).

Tech like MFAA that provides better performance at almost the same quality, while also providing transparency AA and not blurring the picture as a whole, is absolutely superior to any shader-based AA. Even at 4K resolutions it's a pretty noticeable change in texture quality just going from, for example, no FXAA to "low" FXAA in BF4, let alone anything higher, and it obviously doesn't really take care of much in any case. True MSAA/similar techniques have stuck around so long because they're the only accurate solution that provides a performance compromise while maintaining the bulk of the image quality vs. outright supersampling.
 
Regardless, I am quite sure we'll see that whitelist of titles expand pretty rapidly, and 20+ is a lot more than Mantle works on right now, which people laud as an amazing innovation :p.
That's an apples-to-oranges comparison. Mantle is an API; you may as well be comparing MSAA to OpenGL. The point is that EVERY single AA technology up until this point (except for TXAA, which supports what, 10 titles?) has never been DESIGNED to be game-specific. As someone who plays a lot of non-mainstream games, I think it's an awful precedent and I wish Nvidia would stop crap like this and instead try to find more all-purpose solutions.

Also, as I said, the development industry is already trending away from deferred rendering, with forward+ becoming the preferred tech going forward (pardon the pun). It was a temporary solution that won't be common a few years down the line, and a lot of major games already allow for MSAA in-engine even when they're deferred
Do you have any proof of this as an industry trend, or are you just basing it on a few games? I mean, I hope you're right, but as of right now I'm sure I could name over 100 games that use deferred rendering and break support for MSAA.
 
I'm referring to the excitement levels in the comparison. MFAA wasn't designed as game-specific, as I said. PC Perspective has an article about MFAA that talks about this; as I remarked, it was meant as a global method but had visual issues with a number of games, so for now they're using a whitelist approach for it. Also, MFAA doesn't have DX12 coming to make it irrelevant soon ;) like Mantle does.

Re: deferred vs. forward+, the industry is moving towards it rapidly, and while many haven't publicized it a ton like they did during the transition to deferred, Unreal Engine 4 and Unity, for example, are major middlewares implementing it ASAP due to demand from developers. I'll see if I can find some specific game articles about it sometime.
 
MFAA still requires a minimum frame rate, just as AMD's original temporal AA did. To tackle the transparencies you would just supersample them specifically, which can be done, and I would still consider it a terribly clunky solution, but that is a personal opinion about the principle in general.

SMAA doesn't really blur the picture, at least not beyond what any other AA does, though the general functionality could be improved (and should... hopefully... come onnnn, Jimenez & co!).

BTW, Forward+ was praised when AMD used it in the Leo demo way back in 2012; it is indeed quite nice. Here is the part where AnandTech wrote about it:
http://www.anandtech.com/show/5458/the-radeon-hd-7970-reprise-pcie-bandwidth-overclocking-and-msaa/3

Mainly it allows games to have way better material handling while supporting complex lighting and MSAA/SSAA natively without taking huge performance hits; a pure win for us as gamers and for the game developers too.
 