Is 8x antialiasing overkill?

biggles

I like to run games at my native res of 1680x1050. 8xAA is the major advantage I see in the ATI 4870s; they seem to take less of a performance hit than the 260 GTX. Can you actually see the difference during gameplay between 8xAA (or higher) and 4xAA? If not, I'm leaning toward a 260 GTX for my next upgrade.
 
Well, that's up to you, as your monitor and the actual game itself will produce different results. Some games need little to no AA while others still look jaggy even with 4x AA, so just take it on a game-by-game basis.
 
I've never seen the need for it but if I had enough power, it might become more noticeable :)
 
I've always thought that 8x AA is more noticeable in motion than it is in still screenshots.
 
I play at 1680x1050 and I really can't see the difference between 4X and 8X in most games. I keep the screen at a normal distance from me and I don't make a habit of counting the hairs on my character's head. Same with 8X and 16X AF. Most games end up at 4X AA and 8X AF as a result.
 
It bothers some more than others, but at 1280x1024 I find the difference nothing less than stunning. It's like going from "kinda smooth" to near perfect, and when I can, turning on edge detect seems to kick it into "there's no way to make this look any better without getting a better monitor" territory. And by that I don't necessarily mean higher resolution, but smaller pixel pitch. The smaller the pixels, the less noticeable any aliasing becomes. Put 3840x2400 pixels onto a 22 inch widescreen and 2x AA would suddenly become overkill :)
 
I would say yes. Overall it is smoother, but like an earlier poster said, you can't see it in screenshots; it's something you have to feel in motion.
 
I have the same res and I have always felt the more AA the better. But as almost everyone here has said it is up to the game, monitor size, and personal preference. ATI's new CF AA is really nice and I find it does a much better job at smoothing out everything.
 
I have the same res and I have always felt the more AA the better. But as almost everyone here has said it is up to the game, monitor size, and personal preference. ATI's new CF AA is really nice and I find it does a much better job at smoothing out everything.

I would like to know how much of a difference 8xAA vs 4xAA makes at 1680x1050 on a 22" monitor in games like Fallout 3, Oblivion, or GRID.
 
I thought the general rule of thumb is that the higher res you go, the less AA you need.
 
Yep, the higher the res, the less need for AA passes. I wonder what resolution would yield an image with no perceivable aliasing.
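Just as a back-of-the-envelope sketch (the ~1 arcminute acuity figure and the two-foot viewing distance below are my assumptions, nothing official), you can estimate where pixels get small enough that you stop resolving individual stair-steps:

```python
import math

# Rough estimate: resolution at which one pixel spans less than ~1 arcminute
# (a commonly quoted figure for normal visual acuity - assumed here, not
# measured), so individual stair-steps stop being resolvable.
def pixels_needed(diag_in, aspect=(16, 10), view_dist_in=24.0,
                  acuity_arcmin=1.0):
    ax, ay = aspect
    width_in = diag_in * ax / math.hypot(ax, ay)    # panel width from diagonal
    height_in = diag_in * ay / math.hypot(ax, ay)   # panel height from diagonal
    # largest pixel that still subtends <= acuity_arcmin at this distance
    max_pixel_in = view_dist_in * math.tan(math.radians(acuity_arcmin / 60.0))
    return math.ceil(width_in / max_pixel_in), math.ceil(height_in / max_pixel_in)

# A 22" 16:10 panel viewed from about two feet away:
print(pixels_needed(22))   # comes out around 2700x1700 by this crude model
```

That's only the point where single pixels blur together, though; crawling on high-contrast edges in motion can still stand out past it, which is why AA probably wouldn't disappear entirely.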
 
I can see the difference, but 4x AA looks good enough to me. I mean, there are so many other things in current games that make them look artificial, like flat, blurry textures, (still) not enough polys, crappy shaders that make rocks look like they're made of plastic, 2D billboard vegetation, etc. etc. Those things need to be improved before it makes sense to spend so much of the available resources on removing every last jaggie from the game.

That said, if the framerate is good enough, I max out all the features like AA, AF etc. I prefer to max out the Adaptive AA setting before I try to use anything above 4x AA, because in most games, 50% of the surfaces don't get AA'ed at all with just standard MSAA, so it doesn't matter whether you use 2x AA or 16X.
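To illustrate that MSAA point with a toy model (my own sketch, not any real graphics API): MSAA tracks geometric coverage per sample but runs the shader - and the alpha test - once per pixel, so a cutout texture like billboard vegetation never gets a blended edge.

```python
# Toy model (not a real graphics API) of why plain MSAA smooths triangle
# edges but leaves alpha-tested foliage jagged: coverage is per sample,
# while the alpha test is a single per-pixel pass/fail.

def msaa_resolve_edge(samples_covered, tri_color, bg_color):
    """Geometric edge: per-sample coverage gets averaged in the resolve."""
    covered, total = sum(samples_covered), len(samples_covered)
    return tuple((covered * t + (total - covered) * b) / total
                 for t, b in zip(tri_color, bg_color))

def msaa_resolve_alpha_test(alpha, threshold, leaf_color, bg_color):
    """Alpha-tested cutout: one pass/fail decision applies to every sample,
    so the pixel snaps fully on or fully off - nothing to blend."""
    return leaf_color if alpha >= threshold else bg_color

# A triangle edge half-covering a pixel resolves to a smooth 50% blend...
print(msaa_resolve_edge([1, 1, 0, 0], (1.0, 1.0, 1.0), (0.0, 0.0, 0.0)))
# ...but a leaf texel just under the alpha cutoff drops out completely.
print(msaa_resolve_alpha_test(0.49, 0.5, (0.2, 0.6, 0.2), (0.0, 0.0, 0.0)))
```

Adaptive AA (or transparency AA on the green side) super-samples just those alpha-tested surfaces, which is why maxing that setting tends to buy more than piling on extra MSAA samples.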

Make sure you get the 1GB 4870 if you plan on running 8x AA or more - the 512MB version might struggle in future games at those settings.
 
If YOU can see a positive difference, then it ain't overkill. I usually can, depending on the game.
 
When I play GRiD I set it to 8xCSAA because even at 2560x1600 I can see some jaggies.

This might be because race tracks have white lines that are parallel and merge in the distance.

And like elleana said, this is the [H], so there's no such thing as overkill.
 
If the frame rate holds up, I'll definitely go with 8xAA. Definitely a big difference when you actually see it in the game while playing (in motion).
 
That's a general misconception. It's the dpi that matters.

And the shape of the pixels used by the display.
If the edges of the pixels are rounded, it won't look as aliased.

Also, in real life you see forms of aliasing due to the frequency of movement of regularly shaped objects, the frequencies we see things at, and, at night, the mains frequency of the lighting.

For example, when a car wheel spins, in real life it can start to look like it has stopped or is spinning backwards.
Or when passing two sets of railings where you can see one through the other, you can see interference patterns.
This is even more pronounced/easier to see under electric lighting with an AC source.
These forms of interference are present in games and also get complained about as aliasing because they can be lessened by using AA.
However, it is actually realistic to leave those in.

It's a funny thing :)
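A quick numbers-only sketch of that wagon-wheel effect (the 24 fps sampling rate and the wheel speeds below are made-up values, purely to show the folding):

```python
# Temporal aliasing in a nutshell: a rotation sampled at a fixed rate folds
# into the band [-fs/2, +fs/2], so fast wheels can look stopped or reversed.
# (Treating the wheel as having a single marked point; a real spoke pattern
# repeats several times per turn and folds even sooner.)

def apparent_rotation(true_hz, sample_hz):
    folded = true_hz % sample_hz
    if folded > sample_hz / 2:
        folded -= sample_hz          # shows up as spinning backwards
    return folded

fs = 24.0                            # e.g. a camera or game running at 24 fps
for wheel_hz in (5.0, 12.0, 23.0, 24.0, 30.0):
    print(f"{wheel_hz:5.1f} Hz wheel looks like {apparent_rotation(wheel_hz, fs):+6.1f} Hz")
```

The 23 Hz wheel comes out at -1 Hz (a slow reverse spin) and the 24 Hz wheel at 0 Hz (appears stopped), which is exactly the effect described above.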
 
And the shape of the pixels used by the display.
If the edges of the pixels are rounded, it won't look as aliased.

This is true. I tried a number of 20 and 22 inch monitors before I settled on my HP w2207h. Most of the 20 inch models looked very sharp and clear thanks to their tight pixel pitch, but a couple of the 22 inch models looked just as good thanks to the way the pixels were shaped and the default sharpness (which oftentimes can't be adjusted).

The HP I have doesn't have a harsh, grainy look at all. It's just as good looking as an HP w2007 (20 inch at the same resolution). The ASUS 22 inch I sent back looked horrible - a very noticeable "screendoor" effect.
 
That's a general misconception. It's the dpi that matters.

To some degree, yes, it is a misconception, but the statement he made wasn't totally false either. 2560x1600 has nearly double the pixel count of 1920x1200, so a 30" monitor at 2560x1600 won't need as much AA as a 24" monitor at 1920x1200. Now, when you are talking about a 20" LCD at 1680x1050 vs. a 22" LCD at 1680x1050, yes, your statement absolutely makes more sense. Both statements are accurate in the right context.
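For what it's worth, here's the quick pixel-density arithmetic behind that comparison (standard diagonal-to-PPI math; the diagonals are nominal sizes):

```python
import math

# Pixel density for the monitors being compared (nominal diagonals;
# real panels differ by a fraction of an inch).
def ppi(h_px, v_px, diag_in):
    return math.hypot(h_px, v_px) / diag_in

for w, h, diag in [(2560, 1600, 30), (1920, 1200, 24),
                   (1680, 1050, 22), (1680, 1050, 20)]:
    print(f'{w}x{h} on a {diag}" panel: {ppi(w, h, diag):.0f} PPI')
```

By that math the 30" works out to roughly 101 PPI vs about 94 for the 24", while 1680x1050 lands around 90 PPI on a 22" and 99 PPI on a 20" - which lines up with the "right context" point above.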

This is true. I tried a number of 20 and 22 inch monitors before I settled on my HP w2207h. Most of the 20 inch models looked very sharp and clear thanks to their tight pixel pitch, but a couple of the 22 inch models looked just as good thanks to the way the pixels were shaped and the default sharpness (which oftentimes can't be adjusted).

The HP I have doesn't have a harsh, grainy look at all. It's just as good looking as an HP w2007 (20 inch at the same resolution). The ASUS 22 inch I sent back looked horrible - a very noticeable "screendoor" effect.

Yes, exactly my point. This is why I never jumped on the 37" LCD TV/Monitors for gaming. At 1920x1080 I didn't feel like the image was clear enough for web browsing, photo editing, etc. It sucks for text compared to the sharpness of a 30" LCD at 2560x1600. With that said I do use a 37" TV as a secondary monitor. All I use it for is video playback though while I'm on my computer.
 
Absolutely not. I definitely notice a difference between 4x and 8x, especially on particular games. It depends highly on your monitor and the resolution you're running however.
 
Rule of Thumb: If it doesn't drag the framerate down, the setting can always go higher. Regardless of whether I notice anything or not. :cool:
 
And the shape of the pixels used by the display.
If the edges of the pixels are rounded, it won't look as aliased.

I still use a Sony FD CRT, and I find I don't need a lot of AA at any point. A lot of times I simply can't tell the difference in the time it takes me to go through all of the game menus and restart to see the results.

I think the little bit of "blur" from the CRT compared to 1:1 pixel LCDs makes the difference. Even though my CRT is "the best ever made" according to some (Sony GDM-F520), it still isn't tied pixel-to-pixel like DVI signaling + LCD.

I'm often playing with 0xAA and I'm quite happy. Generally I'll never go farther than 4xMSAA because of diminishing returns. On some titles where I have found jaggies, I still see them at 16xAA, so something else is going on there in how the game is coded. No, I can't remember specific titles.

(not that it's related to the topic, but I always max AF - this makes a huge difference).
 
Yes, exactly my point. This is why I never jumped on the 37" LCD TV/Monitors for gaming. At 1920x1080 I didn't feel like the image was clear enough for web browsing, photo editing, etc. It sucks for text compared to the sharpness of a 30" LCD at 2560x1600. With that said I do use a 37" TV as a secondary monitor. All I use it for is video playback though while I'm on my computer.

Did the 37" set you were looking at have a 1:1 mode that guaranteed no scaling was taking place? This is usually what makes all the difference with what you're describing. Even with digital inputs, "most" 1080 HD sets will still overscan a couple of percent unless you tell them not to. When you're dealing with a PC input this ruins the entire experience.
 
I run 1920x1080 at 42", 8xAA is definetly not overkill and in some cases it could use more!

But if you were running 24" at 1920x1200, I'd say yes, its over kill, but here at the [H] we don't care to think about "overkill" but if you can add 8xAA over 4xAA and not notice anything you will still do it anyways
 
I like to run games at my native res of 1680x1050. 8xAA is the major advantage I see in the ATI 4870s; they seem to take less of a performance hit than the 260 GTX. Can you actually see the difference during gameplay between 8xAA (or higher) and 4xAA? If not, I'm leaning toward a 260 GTX for my next upgrade.

There is no such thing as "overkill"

that word is a myth


http://9xmedia.com/Images/xtop/9-PAN-screens-3-3_3.jpg
 
Let's forget the resolutions we have access to for a second.

Let's also assume the technology doesn't improve (in regards to how pixels are displayed).

Won't aliasing slowly disappear as the resolution continues to increase?

5120x3200

or

10240x6400

Would you guys hypothesize that aliasing would still exist?
 
I run an 8800GTX at 1680x1050. I did some testing with Far Cry 2's benchmark tool.

Using the in-game 8xAA, the grass and the tops of barrels were a shimmering mess of aliasing. It ran really fast - I could set almost everything to Very High or Ultra High - but the aliasing was just unacceptable. You can see this clearly in the Ranch Short benchmark.

Using the nVidia control panel to enhance the AA to 8x (which gives you 8xCSAA), the grass looked great with not a single bit of aliasing, and only minimal, acceptable aliasing on barrels and other things. But the performance hit was pretty high, and I'm having to choose which graphics settings to drop to medium - which is leading me to shop for a new card.

A short, general answer - the lower your resolution, the more AA matters. At 1680x1050, 8xAA isn't overkill - it's necessary. I can't speak for 1920x1200 or 2560x1600 resolutions.
 
Let's forget the resolutions we have access to for a second.

Let's also assume the technology doesn't improve (in regards to how pixels are displayed).

Won't aliasing slowly disappear as the resolution continues to increase?

5120x3200

or

10240x6400

Would you guys hypothesize that aliasing would still exist?

Yes :)
I mentioned a few posts back that we perceive some interference patterns as aliasing because using AA can make those patterns diminish.
It's not like real life if those patterns are lost, but then again, is it worse to lose them?
I think it's preferable to lose the patterns, so for this reason I think AA will be around for a while.
 
Let's forget the resolutions we have access to for a second.

Let's also assume the technology doesn't improve (in regards to how pixels are displayed).

Won't aliasing slowly disappear as the resolution continues to increase?

5120x3200

or

10240x6400

Would you guys hypothesize that aliasing would still exist?

It depends on how visibly big each of those pixels is :p. If I'm standing 3 feet away from a 150" 4K plasma screen, you can bet your ass that aliasing artifacts would still be painfully noticeable.

However if the pixels are visibly small, aliasing will become less noticeable, but not completely gone. For example, edge steps will still appear to crawl, which may be perceived as 'wavering' with such small pixels.

Anyways, on my 27" screen, 4xAA just doesn't cut it, especially on high contrast edges. 8XMSAA or better is almost a minimum requirement now :S
 
Sorry, I love AA and I would pay more for it (I did, in fact). I'm thinking of upgrading just so I can add some.
 
Nah, it's not overkill, but I find that in many games I can't really see any difference between 4x and 8x while actually playing; it sort of depends on how the game art is designed.

If you can run at 8x AA and are happy with the fps and other settings though, go for it.
 
More noticeable difference at 1680x1050... above that, not so much. Definitely refines the image more, though.
 
10240x6400 - when they can cram that many pixels into a 30" monitor, there probably won't be any aliasing issues.
 