Who has better Image Quality? Ignore FPS

Image Quality...not frames per second :)

  • I have seen high end ATI & Nvidia in similar setups, and there is no observable IQ difference

    Votes: 47 26.7%
  • I have seen high end ATI & Nvidia in similar setups, and Nvidia IQ is better by a little bit

    Votes: 13 7.4%
  • I have seen high end ATI & Nvidia in similar setups, and ATI IQ is better by a little bit

    Votes: 52 29.5%
  • I have seen high end ATI & Nvidia in similar setups, and Nvidia IQ is remarkably better

    Votes: 10 5.7%
  • I have seen high end ATI & Nvidia in similar setups, and ATI IQ is remarkably better

    Votes: 36 20.5%
  • 3dfx FTW!

    Votes: 18 10.2%

  • Total voters
    176
  • Poll closed.

Battle_Rattle

Limp Gawd
Joined
Nov 6, 2005
Messages
327
IQ = Image quality … and is not to be confused with how many frames per second a card can produce.

Please do NOT VOTE if you haven't seen these cards in action.

For this poll, Nvidia's high end starts at the 6800 GT.

ATI's high end starts at the X800.

There ARE flaws in this poll, but just answer honestly.
 
There isn't really any difference when you're running around shooting people. Face it.

High-end ATI is X1800+ now.
 
I can tell no difference between my 7800GT and my friend's X850 XT PE, but there was a large jump in IQ going to my X700 Pro; my old RIVA TNT2 can't touch it.
 
Oh, you'll get the fanboys on one side or the other here.

Run any game at 1600x1200 or better and just play it; don't run around looking for video imperfections.

You'll probably never notice the difference.
 
Now that I've thought about it, I find it strange that there are options for "remarkably better".
This would imply that someone viewing demos of identical setups with identical monitors (each color calibrated, of course) would look at the ATi powered rig and say, "Wow, that's remarkably better than the nVidia powered rig!"

Sorry, I just don't buy it. I voted no noticeable difference, though I could certainly lean toward ATi's IQ being a "little bit" better. Maybe it would be beneficial for those voting for the "remarkable" options to try to elaborate on why they did so.
 
Seeing as I recently went from a 7800 GT to an X1900XT, I think I can weigh in.

Before getting technical, I'll say that by far I prefer the ATI card for IQ. Having FSAA with HDR is amazing, and HQ AF makes a fairly noticeable difference.

When you get down to it:

NVIDIA:
Transparent Supersampled AA +1
Poor AA-enabled performance -1
Angle-dependent AF -1
No FSAA with HDR -1

ATI:
Adaptive AA +1
Great AA-enabled performance +1
HQ Angle-Independent AF +1
FSAA with HDR +1
Maxes out at 6xAA vs Nvidia's 8xAA -1

I also had a flickering issue with my GT; however, that may have been driver-related, so I'll disregard it for comparison's sake.

I voted option 3.
 
I went from a 7800GT to an X1800XL and IMO Ati IQ is much better, buuut I never tried digital vibrance so bleeh.
 
Stellar said:
NVIDIA:
Poor AA-enabled performance -1


ATI:
Great AA-enabled performance +1
I would agree that ATI may have slightly better IQ overall, but not in AA, in my opinion. I find Nvidia's AA to be higher quality, which perhaps explains the drop in AA performance. So -1 for more demanding AA on Nvidia? +1 for less demanding AA on ATI? But the OP has ruled performance (FPS) out of scope, so yea...
 
cyks said:
I would agree that ATI may have slightly better IQ overall, but not in AA, in my opinion. I find Nvidia's AA to be higher quality, which perhaps explains the drop in AA performance. So -1 for more demanding AA on Nvidia? +1 for less demanding AA on ATI? But the OP has ruled performance (FPS) out of scope, so yea...

I suppose that should be stricken from the record.

Transparency Supersampled AA does provide better IQ than Adaptive AA, I agree. The difference is very negligible, however.
 
After going from an X850 XT PE to a 6800U OC, I give a tiny edge to ATI in 3D apps with identical settings, and I found Nvidia's 2D to be a bit blurry in comparison to ATI's.
 
I doubt many people have seen both a high-end ATi and a high-end NV card. I have, in my PC. ATi has two key advantages over NV: HQ AF, and HDR+AA, which NV cannot do. There is no denying that. Thus, I voted ATi better IQ, because the better AF is easily seen, and HDR with the addition of AA is much better than without.
 
Worldhammer said:
After going from an X850 XT PE to a 6800U OC, I give a tiny edge to ATI in 3D apps with identical settings, and I found Nvidia's 2D to be a bit blurry in comparison to ATI's.

I noticed this as well; my Windows desktop seemed a little sharper with both my X800XL (prior to the 7800) and now my X1900XT.
 
Well, I have a BFG 7800 GTX 512 OC and an ATI X1900XTX. The ATI seems a little crisper in games, but it is not night and day. The nod goes to ATI in BF2, COD2, Far Cry, or any game where you want AA/AF turned up.
 
X1800XT on a Sony GDM-FW900 and 7800GT on a Dell 2005FP... I really had a hard time telling the difference between them at all when playing FPS games. In 2D, things seemed slightly less crisp with the 7800, but that could have been the monitor. I voted no difference.
 
I give the nod towards ATi. After my main rig in my sig was taken apart to be sold, I started using my Shuttle more often, which has a 6800NU in it. Granted, I realize that an X800XT is faster than the 6800NU, but AA and AF seem much better on the X800XT. As for shimmering: while it existed on my X800XT, the 6800NU has the issue much more noticeably. Of course this can be attributed partly to the performance difference. 2D-wise, I don't really see a difference.
 
Bought an X1800XL two days after launch from Newegg; now running a 7800GT. What's better quality-wise? Couldn't tell you. If anything the ATI has a slightly more defined picture, but what it was missing for me was color. The 7800GT has better color for me on my ancient 19" CRT. Digital vibrance made the colors pop, and the driver options are a little easier for me to navigate, like autoloading your presets when a game launches. I also like not having to run .NET Framework yet; I hate when you leave the machine idle and it bumps you to the login screen.

Either way it's old tech. The X1900s are here, and in two weeks the 7900s will launch.
 
In most reviews I've read, it's hard to tell the difference in image quality between the two. However, most reviews, and the overall consensus, say that ATI is able to kick resolutions up higher, with better degrees of AA, in a greater number of games. There isn't a great enough disparity between the two card makers to base a buying decision on image quality alone.
 
I went from a 7800GTX to a 1900XT and the IQ is a little better in ATi land. If you can't tell the difference, it doesn't mean it isn't there. 6xAA is really nice too; I never could enable 8xAA on the GTX and expect playable framerates, despite what I read to the contrary in some reviews. But as many people are pointing out, a lot of the time you won't notice it because your attention is fully focused on the action.

Digital vibrance for the bin! It totally ruins the colour balance. I tried using it to brighten up an old CRT and found that calibration software produced a *much* better result. But it doesn't hurt to give users the option, I suppose.
 
I noticed, going from a 9800 Pro to a 7800GT, that ignoring the obvious increase in graphics settings, ATI just had a subtle leg up on Nvidia. Also, when will Nvidia fix the texture flickering?
 
Texture flickering on nV is easy to reduce. Just set High Quality in your control panel and lock the LOD bias to clamp.
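For anyone wondering what that clamp actually does under the hood: my understanding (this is a sketch of the behavior, not Nvidia's actual driver code, and the helper name is just for illustration) is that games can request a negative mipmap LOD bias to sharpen textures, and the clamp simply refuses anything below zero. Roughly:

Code:
#include <algorithm>
#include <cstdio>

// Sketch of the "negative LOD bias: clamp" option, assuming a
// D3D9-style per-sampler mipmap LOD bias. Not real driver code.
float ApplyLodBiasClamp(float appRequestedBias, bool clampEnabled)
{
    // A negative bias picks sharper mip levels but under-filters the
    // texture, which is what shows up as shimmering in motion.
    return clampEnabled ? std::max(appRequestedBias, 0.0f)
                        : appRequestedBias;
}

int main()
{
    std::printf("app asks for -1.5, clamped -> %.1f\n",
                ApplyLodBiasClamp(-1.5f, true)); // prints 0.0
}

The trade-off: less shimmering, at the cost of losing whatever extra sharpness the game was asking for.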
 
Yes, you can reduce shimmering, but you cannot get rid of it. Sadly, it's still a major issue for some people, myself included.
 
HDR+FSAA is what pushes ATi over for me. 2D-wise, Nvidia is kicking ATi's butt, though. I tried out my neighbor's X1800XT in my machine with my monitor (GDM-FW900) and the desktop looked like garbage at 1920x1080. On my 7800GT it looks like pure heaven. Once I am in a game and running around, I couldn't care less about FSAA at the res I run. I don't use HDR in CSS anyway because the glare is annoying.
 
Now...out of the 29 people who voted no difference...post what your video card is. (I'll bet it's not a majority of ATi)
 
Obi_Kwiet said:
I noticed, going from a 9800 Pro to a 7800GT, that ignoring the obvious increase in graphics settings, ATI just had a subtle leg up on Nvidia. Also, when will Nvidia fix the texture flickering?


Nvidia's fix for flickering was to increase the default LOD bias, which is why the Nvidia image is less detailed than ATi's, and it's also why people with Nvidia cards swear by high AF settings. (ATi people usually don't need the high AF settings; they already have detailed textures.)

If you want Nvidia cards to have the same image as ATi, just unclamp LOD and set the LOD bias to -2. But beware: it becomes shimmering hell if you enable FSAA with those settings. A good compromise is LOD at -1; texture detail is almost as good, and shimmering is less noticeable.
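To put rough numbers on that -1 vs -2: textbook mipmapping (nothing vendor-specific here) picks the mip level as log2 of the texel-to-pixel ratio plus the bias, so every -1 of bias samples one level sharper, i.e. twice the texel density per pixel. That's exactly why -2 looks so detailed and shimmers so badly. Quick illustration:

Code:
#include <cmath>
#include <cstdio>

// Textbook mip selection: level = log2(texel-to-pixel ratio) + bias.
// Each -1 of bias is one mip level sharper (2x texel density), which
// raises detail and aliasing (shimmering) together.
int main()
{
    const float ratio = 8.0f; // texture minified 8:1 -> unbiased level 3
    const float biases[] = { 0.0f, -1.0f, -2.0f };
    for (float bias : biases)
        std::printf("bias %+.0f -> mip level %.0f\n",
                    bias, std::log2(ratio) + bias);
}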
 
My only personal experience in image quality is going from a Geforce2 to a 6800 card and the difference was great...
 
It's definitely ATI. It's either "a little" or "a lot" depending on what you consider a lot or a little to be.
 
Ok guys... Thanks for each and every one of your posts. I have been keeping tabs, and the voting and comments have been extremely constructive.
 
xtasyindecay said:
My only personal experience in image quality is going from a Geforce2 to a 6800 card and the difference was great...
I have used a GeForce2, GeForce4, 5700, X700, X1300 Pro, 6600, and 6600GT. I am very picky, and I really can't tell a difference except in Doom 3, where the ATI cards looked a little blurry. Is there really a difference in 2D, though? If so, then how the hell do people test monitors? :confused: If there is a difference, then why has it never been mentioned in a monitor review? If a reviewer says a monitor doesn't look good or have crisp text, they always blame the monitor, not the video card.
 
At identical settings, just playing games and not looking at screenshots, I could not tell the difference between the following cards: 6800GT, X850XT, 7800GTX512.
 
Having built systems from both in our lab, I can clearly give the nod to ATi, mainly for better AF. But you really have to nitpick a scene to see it...
I tried in CSS, and you have to really walk around and look for the difference...
To all those comparing the 9XXX generation to today's: wrong. ATI's filtering is not the same; it's actually a bit worse, but still better than Nvidia's.
In my own rig I went for a 7800GT; nothing like price/performance to get me going. BTW, most mid-to-high-end GPUs requested out of the shop/lab I work in are Nvidia, so in terms of sales, prior to the X1900, ATI was getting slapped... (you can check quarterly financial statements; they also reflect this).
 
I voted no noticeable difference, because when you're ACTUALLY playing the game there needs to be a big difference in IQ before it becomes noticeable (at least for me).

I can, however, tell the difference between two high-quality screen grabs, or even in-game if I'm specifically looking for it. And I must say that with Nvidia's Transparency AA set to Supersampled at 4x (what I run in BF2, along with 1600x1200 res), nothing ATI has can really compare.

BF2 really benefits from TSAA, though, due to the abundance of vegetation in most of its maps; normal AA just doesn't cut it for me anymore. It simply HAS to handle textures using transparency, otherwise it just looks rubbish :)
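Side note on why plain MSAA does nothing for that vegetation, as far as I understand it: leaf cutouts come from an alpha test, a per-pixel pass/fail decision inside the triangle, so there is no geometric edge for edge-based MSAA to smooth. Transparency supersampling re-runs that test per sub-sample instead. A rough sketch of the idea (my own illustration, not engine or driver code):

Code:
#include <cstdio>

// Alpha-tested foliage: a hard per-pixel cutoff, invisible to
// edge-based MSAA because it is not a triangle edge.
bool AlphaTestPass(float alpha, float threshold = 0.5f)
{
    return alpha >= threshold; // on/off -> jagged leaf edges
}

// Transparency supersampling evaluates the test per sub-sample,
// turning the cutout edge into fractional coverage.
float SupersampledCoverage(const float* sampleAlphas, int n,
                           float threshold = 0.5f)
{
    int pass = 0;
    for (int i = 0; i < n; ++i)
        if (AlphaTestPass(sampleAlphas[i], threshold)) ++pass;
    return static_cast<float>(pass) / n; // smooth blend, not on/off
}

int main()
{
    const float samples[4] = { 0.8f, 0.6f, 0.4f, 0.2f }; // 4x sub-samples
    std::printf("edge coverage: %.2f\n",
                SupersampledCoverage(samples, 4)); // prints 0.50
}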
 
When I recently went from AGP with an X800 XT PE to PCI-E with a 7800GT, I saw a considerable downgrade in image quality.

The problem is that the MMORPG I play most (I've played it for 3 years now) can only run at 1024x768, and on my 24" widescreen Dell it has to be scaled up, so AA and AF are very important.

The ATI X800 XT PE just did it so much better than my new 7800GT; AA and AF look much better with ATI in situations like this.
 