Who uses AA and FSAA? Really?

16xAF is virtually free on modern video cards, so I really can't think of any situation in which I wouldn't use it. 99% of the time you get about a 4% hit from 16xAF. This means even if you're normally only getting 30fps, you'd merely drop to about 29fps with 16xAF on, and considering how much better things look with it, I just don't see why anyone wouldn't use it.
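For what it's worth, the arithmetic behind that claim checks out. A quick sketch, treating the ~4% figure as the poster's own estimate rather than a measured benchmark:

```python
# Sanity-checking the numbers in the post above: the ~4% figure is the post's
# own estimate for the cost of 16xAF, not a measured benchmark.
base_fps = 30.0
af_cost = 0.04  # assumed ~4% performance hit from 16xAF

fps_with_af = base_fps * (1 - af_cost)
print(round(fps_with_af, 1))  # 28.8 -- i.e. roughly the 29fps the post mentions
```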

AA is another matter because of how large a hit it can be on your performance, but there is a DEFINITE improvement in visual quality with AA on, even at 1920x1200. Is it worth low FPS? Depends on the game. For fast-twitch games (FPS, simulation, some RTS and RPG), NO, it's not worth it. You generally want to average 40fps or higher, and anything that puts you lower in tough situations isn't worth turning on. For slower-twitch games (turn-based, MMORPGs, most regular RPGs), as long as you don't dip below 20fps very often, YES, it is worth it. You will see an improvement in visuals, and I personally really do enjoy games more when they look good.

In a lot of ways, though, it's really a matter of what you're used to. People who only had an Xbox before think the 360 looks fantastic and don't even notice all the aliasing. Meanwhile, people who play on PC with top-of-the-line video cards and monitors think the 360 has serious aliasing issues. It's all relative, and until you are used to near-perfection you don't have any idea how bad what you're used to really is. (But you also don't care... which will save your pocketbook a lot of dough!)
 
I don't see any difference between 4x AA on and off at 1600x1200 with a 22" CRT.

Zero. Zilch. Nada.

Oh, except framerates are a lot higher with AA turned off. :p
 
Since I'm limited to 1280x1024 on my LCD, I tend to raise the AA/AF as much as possible. It's true that at higher resolutions, the jagged lines don't appear as prominently, but don't kid yourself. They're there. Also, AF is priceless for keeping textures sharp regardless of their relative size.
 
Currently, I do not play without at least 8xAA and 4x+ AF.

I agree with the OP when I take my glasses off !!

AA4EVR
 
I use it whenever I can play the game smoothly, even at 1600x1200 I still use it.
 
I can't run any games today without a minimum of 8x to 16x AF. Now that I have my motherboard & 7800 GT coming, I'll use FSAA as well.
 
I run full AF and AA with my 7800GT. Definitely a drop in FPS, but well worth it IMHO.
 
I hate aliasing, HATE IT WITH A PASSION

When I look out into the real world I see no aliasing. Edges are smooth, no stair stepping, I see no texture aliasing, I see no shader aliasing, I see no aliasing period.

I want the same from my games. I want no aliasing on polygon edges, I want no texture aliasing, and I don't want shaders aliasing.

I also want smooth textures, via anisotropic filtering. When I look out into the real world I do not see the ground as being blurry. I see it very smooth up to a certain distance (which is pretty far) before it blurs out.

I want the same from my games, I want detailed textures in perspective on all angles.

These can be had in games using antialiasing and anisotropic filtering techniques; I demand it. It should be a given.

With today's video cards I wish AA and AF were a given, I wish we didn't have to worry about them, I wish they were always on at their highest levels and didn't hurt performance allowing us to concentrate on new things like HDR and soft shadows.

But I guess we aren't quite there yet.
 
I notice aliasing very clearly at 1600x1200, so I can run a lower rez *with* AA and it looks nice.

Those who say they do not notice aliasing at 1600x1200 must be blind or on a very bad CRT.
 
i need aa/af... even when playing cs: source or hl2 at 1600x1200 i need 4xAA/16xAF, and with fear i need 1280x1024 with 4xAA/16xAF. it makes a hugeee difference... you should try without aa/af for a week, then with aa/af for a week, and then see the difference
 
I don't use either AA or AF.

I'm too busy trying to stay alive (e.g. F.E.A.R., COD2, etc.) to notice.
When I got the 6600GT when it first came out, I said I would use AA/AF, but it hasn't happened. Maybe if I had a faster CPU/GPU I would.
 
Here's a solution: Run AF, since it's basically 'free' anyway, and if you want some anti-aliasing, just drink a few beers till your vision gets slightly blurry. BAM! You've got 'free' anti-aliasing as well! Your vid card will thank you, but probably not your liver.
 
1280x960 with 0xAA for me, and always high AF, with my system below. I've found that I get horrible mouse lag with AA on. I play on a 19" CRT.

My system:

Athlon 64 3700+
Evga SLI motherboard
7800 GT Overclocked (480/1.2)
2GB Corsair Valueram
120gb WD 7200 rpm special edition


Jeff
 
I can't imagine playing a game without full AF and at least some AA.
What's the point of playing a game like FEAR without all the IQ the game has to offer? I would rather have my FPS drop below 30 a few times than play a game that looks like shit without full AF and AA.
 
Majin said:
I can't imagine playing a game without full AF and at least some AA.
What's the point of playing a game like FEAR without all the IQ the game has to offer? I would rather have my FPS drop below 30 a few times than play a game that looks like shit without full AF and AA.

Well for me it's all about competitive multiplayer. I need a solid 60fps to aim and move around swiftly. If you ever watch Fatal1ty play, check out his graphical settings... It's all about FPS and smooth movements.

Jeff
 
playrh8r said:
Well for me it's all about competitive multiplayer. I need a solid 60fps to aim and move around swiftly. If you ever watch Fatal1ty play, check out his graphical settings... It's all about FPS and smooth movements.

Jeff

I agree that in multiplayer, FPS counts.
I have no problem dropping the IQ a bit for MP. But if I can get high FPS and still have AF and AA on for MP, I for sure will.

I didn't buy a next-gen game to only enjoy it with last gen's look.
 
Yashu said:
those who say they do not notice aliasing at 1600x1200 must be blind or on a very bad CRT.

Agreed. Either that or they have some low-end card that can't pull off AA with any acceptable performance, so they argue that it isn't necessary.
 
PSYKOMANTIS said:
All I care about is pure frame rate.
I knew you'd own an nVidia....:p:D
Seriously, though, I haven't recently set AF any lower than 16x, and with the X1900, High Quality. AA is the same; I always try to have at least some measure of it on if I can. I run at two resolutions, 1920x1080 or 1280x720, and it's usually Adaptive AA at the latter and regular AA at the former. Things are just far clearer and less obviously fake.
It's all about immersion!
 
PSYKOMANTIS said:
My monitor does 85Hz at 1600 X 1200.
My monitor pwnz your POS


Your monitor is a POS

I have a real Sony Trinitron 22" that does 2048 and above...so there.

But my 16x12 LCD completely blows the CRT away for picture quality, and with AA/AF it's da'shizzz
 
I use as much AA & AF as I can, regardless of the rez and performance permitting. In older games you get that image quality for almost no FPS hit at all. Newer games, well, not so much.
 
I use it as much as possible (19-inch LCD, 1280x1024) without degrading smoothness (keeping FPS above 30), and I notice right away when it's turned off.
 
Frank DC said:
I don't see any difference between 4x AA on and off at 1600x1200 with a 22" CRT.

Zero. Zilch. Nada.

Oh, except framerates are a lot higher with AA turned off. :p

What games are you playing?

Here are some shots of BF2, with 4xAA and 0xAA. Both at 1600x1200.

NoAA
4xAA

The differences are obvious to me, and very easily seen. Even easier to see when moving. The whole game is butt ugly with no AA to me. Everything has jaggies, and it's like nails on a chalkboard to me.
 
fallguy said:
What games are you playing?

Here are some shots of BF2, with 4xAA and 0xAA. Both at 1600x1200.

NoAA
4xAA

The differences are obvious to me, and very easily seen. Even easier to see when moving. The whole game is butt ugly with no AA to me. Everything has jaggies, and it's like nails on a chalkboard to me.
Very good way to kill this [needing to be killed] thread. R.I.P. poor thread.
 
sac_tagg said:
I totally agree that AA is pointless after 1280x1024. Reviews should focus on higher resolutions, more AF, and things like soft shadows and HDR. I'll take SS/HDR over AA any day of the week.
QFT!

Also, personally, if I use AA I'm either going to have it at maximum or off, because as long as there are ANY jaggies I might as well be running no AA and getting higher FPS. I won't take half-assed AA for an answer.
 
fallguy said:
What games are you playing?

Here are some shots of BF2, with 4xAA and 0xAA. Both at 1600x1200.

NoAA
4xAA

The differences are obvious to me, and very easily seen. Even easier to see when moving. The whole game is butt ugly with no AA to me. Everything has jaggies, and it's like nails on a chalkboard to me.
AAAHHHH the jagggies, IM MELTING, IM MELTING.

i crank the AA and AF all the time. i mean, why buy a $700 vid card and not use AA? if i didn't use them i would have bought a Voodoo 2 instead
 
fallguy said:
What games are you playing?

Here are some shots of BF2, with 4xAA and 0xAA. Both at 1600x1200.

NoAA
4xAA

The differences are obvious to me, and very easily seen. Even easier to see when moving. The whole game is butt ugly with no AA to me. Everything has jaggies, and it's like nails on a chalkboard to me.

A HUGE thank you for taking the time to take screenshots for this thread, BTW.

As for comparing shots back and forth, I do see some jaggies. Yet mind you, do you really LOOK at jaggies when you play the game at full res?
Example: Do you stop to smell the roses when you are faced with an opponent trying to FRAG you????

Does it really matter when your video card is rendering the game the way it was programmed to be rendered, RAW jaggies or not???
Although I am reading from various users that the adjustable effects are visible and make your games "look better,"
I'm finding no evidence that they help gameplay; it seems this whole AA, FSAA, and filtering business is just a luxury if anything. Kinda like ladling copious amounts of gravy over what is a rancid pork chop.
I really hate to play devil's advocate, but I think this topic needs to be addressed specifically when video cards are reviewed. It seems that today's video card reviews rely on quality effects settings at maximum: 60+ FPS with the highest effects. What happened to reviews that show VIDEO CARD RAW power? Not this supposed "apples to apples comparison" that shows two cards with effects turned on instead of TRUE no-effects power?

As for those saying my monitor sucks... My monitor does a max res of 2048x1536 at 75Hz and beyond (at the highest common-denominator resolution, at least).
I highly doubt I'm missing anything on a high-end CRT monitor.

Call me an idiot for asking, but I'm making a startling discovery: there are more people out there like me who don't care a BIT about FSAA, filtering, or AA whatsoever, yet care about extreme resolution and 60+ FPS at high refresh rates.

Not to fuel the fire, but I'm intrigued that other users are wondering the same things I'm stating above.

Thank you for your time and for sharing your experiences.

Psyko M.
 
AA and AF do help gameplay for me

Some examples: in racing games like NFS Underground or Most Wanted, it helps if you can see into the distance clearly. AF makes the road and its textures more detailed, so you can make out objects farther down the road. AA helps clean up the jaggy 'mess' that results when distant objects are drawn at such low detail (due to LOD) that all the jaggies are exaggerated; a high level of AA clears up that mess and makes distant objects distinguishable.

Obviously these things are on a game-by-game basis. Some games don't benefit visually much from AA, like DOOM 3 since it is so dark. Every game is different; you can't just make a general assumption.

Also, there are different kinds of aliasing, which AA, SSAA, and TR SSAA handle differently: there is edge aliasing (polygon edges), there is shader aliasing, and there is texture aliasing. Turning on MSAA only anti-aliases polygon edges, while TR SSAA does that plus alpha-tested textures; but to cure texture and shader aliasing you'd have to supersample the entire frame, which is a huge perf hit.

It isn't that you stop and smell the roses; it's that turning on AA and AF can clear up a lot of the 'crawling' or 'mess' created by aliasing as you move through the game. That's a huge bother to some, including me.

I'll again allude to real life: in real life there are no jaggies and no blurry textures.
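The edge-aliasing point above can be illustrated with a toy sketch. This is not any real GPU pipeline, just shading a hard diagonal edge at one sample per pixel versus a 4x4 grid of sub-samples; the grid sizes and the half-plane edge are arbitrary choices for illustration:

```python
# Toy illustration of why supersampling smooths edges: shade pixels against the
# half-plane y >= x (a hard diagonal edge), first with one sample per pixel,
# then with a 4x4 ordered grid of sub-samples per pixel.

def coverage(px, py, n):
    """Fraction of the n*n sub-samples of pixel (px, py) inside the half-plane y >= x."""
    inside = 0
    for sy in range(n):
        for sx in range(n):
            x = px + (sx + 0.5) / n  # sub-sample positions within the pixel
            y = py + (sy + 0.5) / n
            if y >= x:
                inside += 1
    return inside / (n * n)

row = 2
no_aa = [coverage(px, row, 1) for px in range(5)]  # every pixel fully on or off
ssaa = [coverage(px, row, 4) for px in range(5)]   # the edge pixel gets a partial shade

print(no_aa)  # [1.0, 1.0, 1.0, 0.0, 0.0] -- a hard stair step
print(ssaa)   # [1.0, 1.0, 0.625, 0.0, 0.0] -- the step is blended
```

The same idea is why supersampling the whole frame also tames texture and shader aliasing: every shaded value, not just edge coverage, gets averaged over the sub-samples, which is exactly what makes it so expensive.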
 
PSYKOMANTIS said:
A HUGE thank you for taking the time to take screenshots for this thread, BTW.

As for comparing shots back and forth, I do see some jaggies. Yet mind you, do you really LOOK at jaggies when you play the game at full res?
Example: Do you stop to smell the roses when you are faced with an opponent trying to FRAG you????

Does it really matter when your video card is rendering the game the way it was programmed to be rendered, RAW jaggies or not???
Although I am reading from various users that the adjustable effects are visible and make your games "look better,"
I'm finding no evidence that they help gameplay; it seems this whole AA, FSAA, and filtering business is just a luxury if anything. Kinda like ladling copious amounts of gravy over what is a rancid pork chop.
I really hate to play devil's advocate, but I think this topic needs to be addressed specifically when video cards are reviewed. It seems that today's video card reviews rely on quality effects settings at maximum: 60+ FPS with the highest effects. What happened to reviews that show VIDEO CARD RAW power? Not this supposed "apples to apples comparison" that shows two cards with effects turned on instead of TRUE no-effects power?

As for those saying my monitor sucks... My monitor does a max res of 2048x1536 at 75Hz and beyond (at the highest common-denominator resolution, at least).
I highly doubt I'm missing anything on a high-end CRT monitor.

Call me an idiot for asking, but I'm making a startling discovery: there are more people out there like me who don't care a BIT about FSAA, filtering, or AA whatsoever, yet care about extreme resolution and 60+ FPS at high refresh rates.

Not to fuel the fire, but I'm intrigued that other users are wondering the same things I'm stating above.

Thank you for your time and for sharing your experiences.

Psyko M.

BF2 is a horrible example for showing AA differences, IMO; it has never graphically impressed me anyway. CS:S or HL2 and many other games show a bigger difference, even at higher resolutions.

I know it's overkill at 1600x1200 or 1280x960, but if your card can do 4xAA with 8xAF or 16xAF at those resolutions and still play perfectly, why not?

We don't buy $400-$600 video cards to play at 1024x768 (well, some who don't know what they are doing and limit themselves to a 1024x768 native LCD resolution will), AA or not; we use them to play at 1280x960 or better. If you can play CS:S like I can, maxed out detail-wise at 1600x1200 with 4xAA, 16xAF, supersampling (TSAA), gamma-correct AA, etc., then do it. That's what the horsepower is there for.

If I wanted to just get by on framerates alone at 1280x960, I could do that with a $125 6600GT...
 
And also, do you think we're all competing for the grand MEGA prize! :p

I know when I'm playing NFSMW I love how it looks, and I doubt a couple of frames is going to do anything to my times. It would be hard for me to flick off the eye candy for MP shooters afterwards. Being generous, I doubt it would cost most of us more than a kill or two over a long gaming session. We're not all trying to be the next Fatal1ty.

I do buy the high-end hardware to avoid shitty framerates. I don't stand to gain much gaming prowess from it, and I've accidentally trained myself to expect certain things, like lines that aren't jagged, from my most expensive toy.
 
This whole thread is actually a good example of what we are trying to do here at [H]. We are trying to find the highest IQ we can set in-game and still maintain a playable and enjoyable level of performance. Turn everything on and crank everything up until it becomes unplayable, and then back down the IQ till the perf is right where you want it. That's the whole idea! That way you get the highest level of IQ that is playable with a certain card in a game. Some cards allow you to have better IQ than others with the same or better perf, that's what we show.
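That tuning loop can be sketched in a few lines. Everything here is a made-up placeholder (the setting ladder, the 40fps target, and the "measured" framerates); in practice the benchmark step is just playing the game and watching the framerate:

```python
# Sketch of the "crank it up, then back off" approach described above. The
# setting ladder, target, and per-setting FPS numbers are all hypothetical.

SETTINGS = ["4xAA/16xAF", "2xAA/8xAF", "0xAA/0xAF"]  # highest IQ first
TARGET_FPS = 40

def benchmark(setting):
    # Hypothetical average FPS per setting for one card in one game.
    return {"4xAA/16xAF": 28, "2xAA/8xAF": 37, "0xAA/0xAF": 55}[setting]

def highest_playable(settings, target):
    """Return the first (highest-IQ) setting that meets the FPS target."""
    for s in settings:
        if benchmark(s) >= target:
            return s
    return settings[-1]  # nothing meets the target; settle for the fastest

print(highest_playable(SETTINGS, TARGET_FPS))
```

With these invented numbers the loop settles on 0xAA/0xAF; a faster card would shift the same ladder toward higher IQ, which is the comparison the post describes.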
 
Brent_Justice said:
This whole thread is actually a good example of what we are trying to do here at [H]. We are trying to find the highest IQ we can set in-game and still maintain a playable and enjoyable level of performance. Turn everything on and crank everything up until it becomes unplayable, and then back down the IQ till the perf is right where you want it. That's the whole idea! That way you get the highest level of IQ that is playable with a certain card in a game. Some cards allow you to have better IQ than others with the same or better perf, that's what we show.

This is by far the best response I've seen all day. Keep 'em coming; I want to hear more of the community's responses regarding video card IQ and effects.

P.S. I do agree that this issue is pretty much on a PER-GAME basis.

BF2 vs. CS:S in quality and framerate are two completely different animals.
I'm trying to focus on the full picture when you are considering a $400-500 video card.
That's a console, a damn nice monitor, rent, a mortgage payment, tons of HDD space, among other things. Quite an investment.
 
Lol... my "nice monitor" has made me one of the guys buying the $500 video cards. Never before. Bad alternative. :D
 
PSYKOMANTIS said:
This is by far the best response I've seen all day. Keep 'em coming; I want to hear more of the community's responses regarding video card IQ and effects.

P.S. I do agree that this issue is pretty much on a PER-GAME basis.

BF2 vs. CS:S in quality and framerate are two completely different animals.
I'm trying to focus on the full picture when you are considering a $400-500 video card.
That's a console, a damn nice monitor, rent, a mortgage payment, tons of HDD space, among other things. Quite an investment.

i dunno where you are buying houses, but $450 only pays for a quarter of my mortgage and truck payments combined, and the chances of finding a place to rent around Atlanta (suburbs, not in town) for $450 are slim to none unless you want to rent a room in the ghetto
 
PSYKOMANTIS said:
This is by far the best response I've seen all day. Keep em all coming, I want to hear more from the community what their responses are considering video card quality IQ and effects.

Your arguments are not very consistent. I think the big picture is that you cannot afford a better rig, so you are simply lashing out at such luxuries, as if they exist to spite you. The fact is that enthusiasts can see a level of detail that you clearly cannot, but there is nothing wrong with that. What's wrong is suggesting such enthusiasts are wasting their time and money reviewing these luxuries, or purchasing them for gameplay. I, for one, spit on such folly, not because I dislike it, or the accuser, but because it is nothing more than anger without a proper vent.

I invite you to build (or buy) a high-end PC and turn on high levels of AA, FSAA, AF, etc. You will not, I repeat, will not, think twice before changing your mind.
 
ShuttleLuv said:
There are differences. Maybe you can't notice them....but they're there. Sucks to be you I guess... :(
what are you talking about? i think it would be great if i couldn't tell the difference between a game with and without AA. i'd sure as hell have a lot more money. :D
 
cyks said:
Your arguments are not very consistent. I think the big picture is that you cannot afford a better rig, so you are simply lashing out at such luxuries, as if they exist to spite you. The fact is that enthusiasts can see a level of detail that you clearly cannot, but there is nothing wrong with that. What's wrong is suggesting such enthusiasts are wasting their time and money reviewing these luxuries, or purchasing them for gameplay. I, for one, spit on such folly, not because I dislike it, or the accuser, but because it is nothing more than anger without a proper vent.

I invite you to build (or buy) a high-end PC and turn on high levels of AA, FSAA, AF, etc. You will not, I repeat, will not, think twice before changing your mind.

agreed, people hate what they can't have

i have to say, i would never want to go backwards from being able to run 1600x1200 with AA/AF maxed out, transparency AA, etc. it's just as nice as it gets; if you can, then why not?
 
I don't understand the argument about 'pushing' the video cards. When you crank up AA and now HDR, you ARE stressing the video cards. I would think most video card reviewers only review the most common resolutions used by consumers. If most people's monitors only do 1280x1024 (this is VERY popular because it's the max resolution for the majority of LCDs out there), who cares if the video card will do 1920x1200 with crap settings at 40 fps? Hell, until recently FEAR totally made video cards CRY with AA etc. turned on. As far as I'm aware, most review sites test with and without the eye candy turned on anyway. As a consumer I want to know what it is capable of doing with 100% eye candy turned on at the resolutions I play at. Why? Because I know if I turn it down (which is what you're doing) I'd get more FPS at said resolution anyway. Most reviews will scale up the resolution once it is determined that there is no video card bottleneck, to see whether the CPU is the bottleneck.

Having said that, FPS isn't everything. I've owned NVIDIA cards all the way back to the original TNT days. The X1800 series is THE first ATI 3D card I've owned, and I honestly must say the IQ is amazingly better compared to what I've previously owned. It's a shame that most reviewers don't factor IQ into their score. Playing old games with the X1800 is a treat. FPS is definitely important, no one wants to play a slideshow, but if you don't dip below 50-60fps when playing with the eye candy turned on, why not? I really can't imagine someone spending 300-500 on a new card just so they can play their game on the lowest setting possible. Then again, this would explain why so many people are saying how great the Xbox 360 looks compared to PC games in general.


IQ doesn't affect your gameplay in the sense that it makes you better, but your argument is vague. It's almost like watching a movie in the theater and then coming home and watching it on my DLP setup: you get the same story, but you pick up more detail. Why play Q4 if it's like Q3? Why play CS:S when CS 1.6 was similar? Each of these upgrades made your FPS go down, and they played EXACTLY the same.
 