Who uses AA and FSAA? Really?

PSYKOMANTIS

[H]ard|Gawd
Joined
Sep 20, 2002
Messages
1,128
You know, I've never really understood the point of FSAA or AA.
Sure, it smooths out the infamous "jaggies" and makes games look pretty at low resolution. Great... FINE! FSAA and AA have a place in LOW RESOLUTION gaming.

What I'm asking is: why do people still care so much about FSAA and AA whatsoever?
Suppose you have a good enough monitor that supports 2048x1536 @ 75+ Hz...
The point of buying a good video card is high frame rates and stability in high-resolution gaming; that's pretty much a given.

What I don't get is that once you push your PC to 1600x1200 and beyond, you really don't notice the jaggies anymore, because you pretty much compress everything onto your screen.
What I'm getting at is: why stop and smell the roses and use FSAA and AA at high resolutions and hamper your FPS? I don't see FSAA or AA improving my CS:S scores or my K/D ratio in BF2...

I personally own a 21" Trinitron SGI RGBHV monitor that I use for photo editing.
When I crank up the FSAA and AA, I really don't notice anything but a slowdown in my FPS and more system chugging.

What's really pissing me off is when hardware websites (including HardOCP) do reviews with all the FSAA and AA on. What about purists like me who don't turn that shit on? Whatever happened to HARD FPS facts without all the magic BS?
It's too fucking confusing to see cards pitted against each other on how well they perform at quality settings.

Honestly, am I missing something?
 
Because... ahh..... it looks better?
Sure, at higher resolutions AA means less, but it certainly still improves the image by a wide margin. And AF? Jesus, without AF everything looks like complete crap.
 
I've never turned on AF, FSAA, or AA in my life.
All I care about is pure frame rate.

Like any stereo system: you can have the finest amplifier in the world and still have shit speakers.
That's kinda how I see the video card world, except instead of speakers and an amplifier, you have a monitor.

My gaming doesn't look any different on my machine compared to anybody else's at LAN parties. So can anybody answer why this is so critical?

I can go over pictures upon pictures of ZOOMED-in edges of guns and numbers.
But none of it matters when you're using an LCD monitor with visible pixels or a shitty consumer-grade CRT. That's what I'm getting at.

Like I said before, am I missing something that I'm just NOT seeing?
 
Psyko, I have an assignment for you, should you choose to accept it...
Turn on at least 8x AF or above, and play like that for two weeks. Then turn it off and see if you can stand it. I'd say do the same with AA, but dunno if your rig can handle it (mine couldn't, at those resolutions).
/edit: looking at your sig, your rig might, I dunno.
 
It makes a huge difference on a sharp monitor. I have to have at least 2x AA no matter what.
 
I totally agree that AA is pointless after 1280x1024. Reviews should focus on higher resolutions, more AF, and things like soft shadows and HDR. I'll take SS/HDR over AA any day of the week.
 
If you can state that all you care about is pure frame rate, it should be obvious that others may not be as extreme. First, a lot of people with LCDs are going to use their native resolution. That's it and nothing else. And native resolution still has jaggies, even 1920x1200 on a 24" screen.

That brings me to CRTs. My FW900 may not have a native resolution, but likewise, even 1920x1200 is still a tiny bit jagged. It's a 24" screen just the same, and only a truly INSANE resolution would get rid of jaggies. So in the end, unlike you, I DO care if everything is all jagged; I just take a more modest approach. I need a balance of AA, resolution, and performance. As such, I usually shoot for 1920x1200 and a meager 2x AA, because that looks good to me.

Really, it's quite obvious as soon as you realize other people care about jaggies more than you do. Consider my situation above. I like high resolution because it sometimes makes the textures look even better to me as well. But going above 1920x1200 is just not possible. My monitor may max out at 2000 x whatever, but good luck with that one. Even with displays that can handle whatever resolution you throw at them, like my own, you still need something absolutely insane to eliminate jaggies. So no matter what you do, AA can be your friend. I'm a res whore myself over AA (I only use 2x), but the IQ I like cannot be maintained with resolution alone. The best I've ever seen jaggies eliminated by res alone is on some smaller CRT, like a 19", that can still do 1600x1200 at a reasonable refresh rate. Lol... try a big monitor. The res required would be frightening.
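The "big monitor needs a frightening resolution" point is really just pixel-density arithmetic. Here's a quick sketch of it; the `ppi` helper and the specific monitor sizes are illustrative numbers, not anything from the thread:

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch for a display with the given pixel dimensions
    and diagonal size in inches."""
    diag_px = math.hypot(width_px, height_px)  # diagonal length in pixels
    return diag_px / diagonal_in

# A 19" CRT at 1600x1200 vs. a 24" widescreen at 1920x1200
small = ppi(1600, 1200, 19)   # ~105 PPI: jaggies start to vanish
big   = ppi(1920, 1200, 24)   # ~94 PPI: noticeably coarser pixels

# Resolution the 24" screen would need just to MATCH the 19"'s density:
scale = small / big
print(round(1920 * scale), round(1200 * scale))  # roughly 2142 x 1339
```

So even matching a 19" CRT's density on a 24" panel already pushes past 1920x1200, which is why resolution alone can't finish the job on big screens.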
 
Pure FPS is great, but if you're already hitting 100+ FPS, what's the point of not using AA and FSAA?

Your analogy is flawed. I'd compare it more to having a good stereo system and then listening to 64kbps MP3s just because you can store more of them.

I used to play a lot of FPSs, and honestly, going back and playing them with all the settings turned up makes me feel like I'm playing a totally different game =) Sure, it doesn't make my K:D ratio higher... but it sure looks nice when I get my headshots.

And the higher the res, the less you'll need. There's always a balance. I think FSAA and AA are meant more for people who run at lower resolutions, either because they're stuck there (max LCD native) or because they like running at lower resolutions (like me, for instance). My monitor does 1920x1200, and it's just ridiculous how much of an FPS hit I would take running it like that, even on crap settings. Also, I like to run games in windowed mode =)
 
Er... first it seemed there was a proposal of some sort to use res instead, but now it seems to be even more about all-out frame rate. You don't really need us to explain why [H] benches cards like X1900 XTs with AA on, do you?

There are always sites that have what each user wants. This one has its highest-playable thing going on, so time for what you want is more limited. In fact, what you want seems to fly in the face of their point. However, there are plenty of sites like Anand's that just start with pure speed, bring each setting up a notch at a time, and bench again. I see value in both, so I read 5 or 6 articles each time out.
 
My flat panel's native res is 1280x1024, and games look much better when AA is enabled than when it isn't.
 
This is by far the most retarded thread ever.

It should be the exact opposite of what the original author stated. All benches should be done with 4x AA and 16x AF. Period!

I did not spend $500+ so I could see jaggies all over the place. If I wanted that, I would buy an Xbox.
 
There are differences. Maybe you can't notice them....but they're there. Sucks to be you I guess... :(
 
I play FEAR at 16x12 because it's the native resolution of my monitor. The jaggies in that game are annoying as hell, but I can't turn on AA because at that resolution my performance drops too low. Especially in the office levels, where there are a lot of desks and doorways: lots of straight lines. The jaggedness of everything is pretty noticeable.
 
I always use AA and AF in my games. I'm not really a framerate junkie like a lot of people seem to be. As long as I can play a game at 30+ FPS, I'm happy. Oh, and nothing less than 1600x1200 with 4x AA / 8x AF for me. Games look sweet.
 
This thread is pretty irrelevant, because as individuals, the degree of difference that AA and AF make is directly proportional to our ability to discern detail. If you're the kind of person who eyeballs everything, be it in Photoshop or even cleaning your car, then you're more apt to be sensitive to the effects of AA and AF. If you're the opposite, it's going to take grosser effects such as HDR and soft shadows to grab your attention. I think at any resolution AA and AF make a huge difference for me, because I'm a near-perfectionist. So saying that AA and AF don't make a difference is irrelevant, because we all have a different perspective on reality, even if you quantify the net effects of AA and AF.
 
I've used some level of AA since my 9700 Pro. Even today at 1920x1200, I use 4x AA whenever I can. In some games I can't tell much difference (Far Cry), and in some games it's still easy to see a difference at that res (BF2).

Telling me I don't notice jaggies is pretty short-sighted. You don't know what I, or others, notice or not.
 
Wow, people who don't like AA or AF just can't afford video cards fast enough to play with these settings. AA and AF make a huge difference in games, even @ 1600x1200.
 
Just about everyone uses AA. It looks a lot better, even at a high resolution. I didn't used to notice jaggies, till I got a card where AA was a playable setting. Now I lower the resolution to one where at least 2x AA is playable. And then there are older games, like hi-res Duke 3D, that look great with 16x AF and 4x AA :p
 
RavenD said:
Just about everyone uses AA. It looks a lot better, even at a high resolution. I didn't used to notice jaggies, till I got a card where AA was a playable setting. Now I lower the resolution to one where at least 2x AA is playable. And then there are older games, like hi-res Duke 3D, that look great with 16x AF and 4x AA :p
Agreed, I also lower res for at least 2x AA. It does depend on the game sometimes, but in most games I require at least 1024x768, 2x AA, 4x AF. Ideal settings are 1280x1024, 4x AA, 8x AF. 1600x1200 is better, but only if it can play with AA and AF.
 
Shimmering, shimmering, shimmering...

That's why I use it. I don't give a rat's ass about jaggies, but the filtering is so much better with FSAA... at least with multisampling (or was that supersampling?). There is a certain mode of FSAA that eliminates a ton of shimmering... whatever the Voodoo 5 did...
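For what it's worth, the supersampling mode being described works roughly like this: render the scene at a multiple of the target resolution, then average each block of samples down to one output pixel. This is only a toy sketch (the `supersample` helper and the tiny 4x4 "framebuffer" are made up for illustration, and real GPUs do this in hardware), but the averaging is why it softens texture shimmer as well as edge jaggies:

```python
def supersample(hi_res, factor):
    """Box-filter a high-res framebuffer down by `factor` in each axis:
    each output pixel is the average of a factor x factor block of samples."""
    h, w = len(hi_res), len(hi_res[0])
    out = []
    for y in range(0, h, factor):
        row = []
        for x in range(0, w, factor):
            block = [hi_res[y + dy][x + dx]
                     for dy in range(factor)
                     for dx in range(factor)]
            row.append(sum(block) / len(block))  # average the sub-samples
        out.append(row)
    return out

# A hard black/white edge rendered at 2x2 supersampling: the jagged step
# in the lower-left becomes an intermediate grey instead of a hard stair.
frame = [[0, 0, 1, 1],
         [0, 0, 1, 1],
         [0, 1, 1, 1],
         [0, 1, 1, 1]]
print(supersample(frame, 2))  # [[0.0, 1.0], [0.5, 1.0]]
```

Because every texture sample gets averaged too, high-frequency texture noise (the shimmer) is filtered out; multisampling, by contrast, only takes extra samples at polygon edges, which is cheaper but leaves textures alone.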
 
To answer your question: yes, I use AA & AF all the time. When I first purchased my rig back in July of '05, I stuck with one 7800 GTX. When I purchased F.E.A.R., CoD 2 & Quake 4, my framerates were around 30-35 FPS with all the goodies turned off while playing @ 1920x1200. In December I decided to add another 7800 GTX to my machine. It increased my framerates by a factor of 2.5. Now I enjoy all the eye candy along with the maximum resolution. The difference is night and day.

Anthony
 
I use high AF in every game that supports it and use AA whenever possible. The native res of my LCD is 1600x1200, so I prefer running at that resolution. If enabling AA makes the game too slow, 1600x1200 with no AA is better than at a lower resolution with AA IMO. Every game I have runs acceptably at 1600x1200, most run well at 1600x1200 w/4xAA and 16xAF.

When *playing* most games (not admiring a still screenshot or studying a cut scene) the jaggies are hardly noticeable at 1600x1200 even without AA. AA is better looking, but it doesn't make the game less enjoyable.
 
PSYKOMANTIS said:
What's really pissing me off is when hardware websites (including HardOCP) do reviews with all the FSAA and AA on. What about purists like me who don't turn that shit on? Whatever happened to HARD FPS facts without all the magic BS?
It's too fucking confusing to see cards pitted against each other on how well they perform at quality settings.

Honestly, am I missing something?

You're missing a LOT.

I play everything at 1600x1200 (max res on my LCD monitor) with at least 4-6x FSAA and at least 2x AA. Why? Because there is a HUGE difference. I can't even stand looking at a game without FSAA. The only game that gives me problems is FEAR; I can only run that at 16x12 with 4x FSAA and 2x AA, but the difference between no FSAA and 4x FSAA makes the occasional slowdown worth it.

It's not BS.

If you have never tried it, you're missing out.
 
A lot depends on the type and size of your monitor. Those who're stuck with LCDs, especially smaller LCDs with a single "native" resolution, usually see major benefits from AA. But I'm still using a 22" CRT and see virtually no difference at 1600x1200 between 4x AA and no AA. The benefit of having it turned off is crazy frame-rate averages at that res (e.g. 107 FPS in HL2), even with a relatively ancient 9700 Pro.

AF is another matter. Major improvement in visual quality.

IMO reviews should continue to post results the way they always have been, with these options both off and on.
 
Due to my Dell 2405FPW, I have to run games at either 1600x1200 or, usually, 1920x1200, so I can't turn on AA or else my card just dies (X800 XT). After I turn up high-quality textures and whatnot, I don't mind not having AA. In games like CoD 2, I'm more occupied with not getting killed than with "OH MY! THERE'S A JAGGY!"
 
Why does it matter to run your games at a non-native resolution? It's not like it hurts your monitor. I would much rather drop the resolution from 1600x1200 down to 1280x1024 and use 4x AA/16x AF than run none at all. Also, with a lower resolution you get a higher refresh rate, so it even looks smoother. Only NOOBS run no AA/AF :p

Having jaggies = pwned all day!!!
 
sac_tagg said:
I totally agree that AA is pointless after 1280x1024. Reviews should focus on higher resolutions, more AF, and things like soft shadows and HDR. I'll take SS/HDR over AA any day of the week.
Disagreed. I run my stuff at 1280x1024 and you do notice a huge difference at that resolution. Don't believe me? I will take some screenshots with and without AA in NBA Live 06; you will see a difference.

Nirad9er said:
Why does it matter to run your games at a non-native resolution? It's not like it hurts your monitor. I would much rather drop the resolution from 1600x1200 down to 1280x1024 and use 4x AA/16x AF than run none at all. Also, with a lower resolution you get a higher refresh rate, so it even looks smoother. Only NOOBS run no AA/AF :p

Having jaggies = pwned all day!!!
On my ViewSonic, a non-native resolution means it looks like you're running at 640x480 :(
 
Nirad9er said:
Why does it matter to run your games at a non-native resolution? It's not like it hurts your monitor. I would much rather drop the resolution from 1600x1200 down to 1280x1024 and use 4x AA/16x AF than run none at all. Also, with a lower resolution you get a higher refresh rate, so it even looks smoother. Only NOOBS run no AA/AF :p

Having jaggies = pwned all day!!!

My monitor does 85 Hz at 1600x1200.
My monitor pwnz your POS
 
I do use AA whenever possible, but it's definitely the first thing I'll turn off if I need a framerate boost.

AF, on the other hand, is very important to me. Textures staying crisp even at far distances is amazing :p

But shit, honestly, if your card can handle AA and AF, why the hell WOULDN'T you use them???
 
I disagree.
I ran Half-Life 2 one step below my native 1920x1200 resolution and I absolutely couldn't stand it. Resolution > eye candy features. All day, every day.
 
AA is sweet, but AF is even sweeter.

The texture clarity you clearly get with AF makes it worth the performance hit; honestly, your games LOOK better.

I personally don't use AA because the performance hit is bigger than AF's, and I can live with minor jaggies (1920x1080 res).
 
This thread is just to get attention, and it worked.

A better analogy than the stereo one is sex. No kidding. Say you're getting some all the time, night and day, but they're all mingers and numpties; no real satisfaction there. Now, if you were getting less, but they were all pretty and had some content and intelligence, wouldn't you be happier? :p Anything over 35 a second is a waste... I'm talking frames! Don't be an FPS slut!
 
Yeah, anisotropic filtering is practically free on any card since the 9700 Pro, so there is very little reason not to use it.

As for antialiasing: if you're using an LCD and the choice is between enabling AA and running at native res, I'd always go for native resolution. Although I have to say I find it hard to run anything without at least 2x AA on all the time. It is somewhat game-dependent, though; Quake 4 benefits only slightly from AA, where Battlefield 2 and other games with 'busy' environments show a huge improvement. Even at 2560x1600 on the Dell 30", you'll still want to use AA.
 
It's not that I wanted this thread to get attention; I was just looking for the general consensus on resolution and FPS vs. lower resolution plus filtering and AA.

There's some really useful info here, at the least.
 
Yeah, I have a Samsung 17-inch LCD, and the only game that really seems to have a lot of jaggies is flippin' BF2. So either I run it at 1280x1024 @ 60 Hz with no AA, which is really bright and crisp looking but has some jaggies, or I run it at 1024x768 @ 75 Hz with 2x AA. With BF2 I can run everything on medium, except dynamic lighting and shadows, at 1024x768 with 2x AA, and I get really good frames (30-50) until there are more than, say, 56 people in a server. Once it hits 56 people my frames drop into the teens, and it pisses me off so much. My comp is an Athlon XP 3200+, 1 gig of PC2700 RAM, and a vanilla 6800. Shouldn't I be able to run medium settings on a 64-man server?
 
Try playing on a 65" Widescreen LCD without AA on. It's not pretty, I know from experience.
 