Does ATI really have better image quality?

I switched from an 8800GT to a 4850, and I've found that the 8800 was slightly better in motion. The ATI card tends to have noticeable flickering of edges and textures when I am moving toward or away from them. That isn't to say it's bad, or that other cards don't suffer from the same thing; it's just more noticeable on my 4850 than it was on my 8800GT.

This is running 4x AA and 8x AF in most games at 1600x1200.
 
The ATI vs nVidia image quality question was put to bed with the 7 series GeForce cards. You can't tell em apart.
 
The ATI vs nVidia image quality question was put to bed with the 7 series GeForce cards. You can't tell em apart.

Then why are the guys over on the ATI side claiming that the ATI cards have better image quality?:)
 
I've never really cared about the subtle differences, but then I was reading an article
http://www.anandtech.com/video/showdoc.aspx?i=3341&p=11
and I was shocked at how much I liked the ATI AA.

But I don't know if there are differences in frame rate, or whether it's worth it.

Then again, I've always preferred no AA and a higher res.
 
The ATI vs nVidia image quality question was put to bed with the 7 series GeForce cards. You can't tell em apart.

Then why are the guys over on the ATI side claiming that the ATI cards have better image quality?:)

Nvidia's G8x/G9x/G200 filtering is better than ATI's.

Honestly, it's close enough that it doesn't matter anymore, but just like everything else people have to take up sides, red vs. green. It's pointless. Buy what best fits your budget, intended purpose, and monitor/resolution. Ignore who manufactured or designed it. Move on.
 
Honestly, I think ATI cards render at slightly better quality, but the difference is so minor that it's not a selling point to me.
 
I've had both company's cards.

I have had very excellent results with nvidia and I will no doubt buy them again and again.

Right now I'm using 4870s in Crossfire and I really do think there is a subtle difference in Image. I like the ATI cards better.

That said, if nvidia makes a card in the next round that is as cheap/comparable at the top end, I'll be back.
 
To be able to notice a difference you would have to take still shots, blow them up, and examine them slowly

But if you do, I have felt that since G80 NVIDIA has drawn more objects on screen and has had a longer draw distance in certain games (HL2), as well as rendering things like plants more fully than ATI. IMO NVIDIA also gives better filtering quality.

However, ATI has almost always had better AA quality, and IMO nothing has changed with this generation.
 
I always thought ATI produced a slightly sharper picture in general, while Nvidia was better at optimization via drivers and whatnot. But these days it varies so much from game to game, who knows.
 
Depends on what you mean. If you mean AnIso filtering, then no. nVidia previously had some problems at various angles; you could see it in tests because it'd make a flower-shaped pattern. In games it meant some textures were clear, others not as much. That all changed with the 8 series: their AnIso filtering is extremely good now and, more importantly, consistent in all directions.
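For the curious, here's a rough toy sketch of why that test pattern comes out flower-shaped. The formula is my own invention for illustration, not anything either vendor actually does; the only idea it encodes is that older hardware only applied full anisotropy near certain surface angles, while G80-class hardware applies it at every angle.

```python
# Toy model (not real hardware behaviour): angle-dependent vs angle-invariant AF.
# In a tunnel-style AF tester, the radius at which textures stay sharp scales
# with the anisotropy the hardware applies at that surface angle, so plotting
# that limit against angle reproduces the "flower" shape.
import numpy as np
import matplotlib.pyplot as plt

theta = np.linspace(0, 2 * np.pi, 720)

# Angle-invariant AF: the full 16x limit at every angle -> circular mip bands.
aniso_consistent = np.full_like(theta, 16.0)

# Hypothetical angle-dependent AF: full quality only near multiples of 45 degrees,
# dropping toward 4x in between -> petals at 0/45/90 and pinches near 22.5 degrees.
aniso_angle_dependent = 16.0 - 12.0 * np.abs(np.sin(4 * theta))

ax = plt.subplot(projection="polar")
ax.plot(theta, aniso_consistent, label="consistent (8-series style)")
ax.plot(theta, aniso_angle_dependent, label="angle-dependent (older style)")
ax.legend(loc="lower left")
plt.show()
```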

The other thing is ATi had a rep for using better DACs. I don't know how true this is, I never noticed a problem with nVidia cards, but I heard it in a number of places. However, that's only relevant if you use a VGA connection, and really only relevant if you use a VGA connection to a high quality CRT. If you are using a VGA connection to a low quality CRT or an LCD, well then the DACs aren't really going to be the quality problem there excepting maybe an exceptionally good LCD (in which case you are an idiot for using VGA since it surely has DVI). If you are using a DVI connection it isn't relevant since there is no analogue signal conversion and the data is passed unmodified to the display.

There's also a possibility for better AA on the ATi end this generation, what with their custom AA filters. How much use that will be I'm not sure, as it seems to be too slow in most real-world games (since it uses up shader power), but it is an area where ATi could theoretically have an advantage.
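As a back-of-the-envelope on why those custom filters eat performance (the footprint and sample counts below are illustrative guesses, not ATi's actual CFAA parameters): a wide tent resolve has to fetch and weight samples from neighbouring pixels as well, whereas a plain box resolve only touches a pixel's own samples.

```python
# Rough sketch of the extra resolve work a wide custom filter implies.
# Footprints and counts are illustrative, not ATI's real CFAA parameters.
samples_per_pixel = 4                      # e.g. 4x MSAA

box_fetches = samples_per_pixel * 1 * 1    # box resolve: 1x1 pixel footprint
tent_fetches = samples_per_pixel * 3 * 3   # wide tent: 3x3 pixel footprint, plus weighting math

print(f"box resolve:  {box_fetches} sample fetches per pixel")
print(f"wide tent:    {tent_fetches} sample fetches per pixel")
```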

In general I wouldn't worry much about it. DVI has eliminated any artifacts in conversion and both companies are doing quite an excellent job with filtering and such. We are, thankfully, beyond the days when cards make major errors in rendering. There are differences, but it is pretty minor stuff these days.
 
Then why are the guys over on the ATI side claiming that the ATI cards have better image quality?:)

Nothing better to do I guess? I can't tell the difference between my 3850 in my HTPC, 8600GT, 9800GTX or my 7100 something. They will all look damn near the same when playing 1080p video smoothly and sharply, save the 7100 something. There was a time when ATi WAS the leader in IQ and everybody used their chips. That was back in the Rage era. No difference now.
 
My HD 3870 has distinctly sharper/crisper image quality than my 7900GS did.

I've used both ATI and Nvidia cards over the years.. and ATI always had better image quality.

And when 3dfx was still around, they had way better image quality and richer colors than Nvidia did at the time.

It also depends on your monitor.. and if you are running an LCD monitor, you need to run at the native resolution.
 
To me, ATI's colors are much richer. The Nvidia colors seem brighter and washed out.
Oddly, the NVIDIA shots appear "richer" to me, but that's likely because of a slight difference in gamma, though I'd also venture to guess that the 9800 is displaying slightly more dynamic range than the 4850 (hunch). Neither output looks 'better' or 'worse' than the other to me, though. Obviously the most notable difference between the two cards is how AA affects transparent sprites (leaves), and the 4850 looks a bit more pleasing in that regard.

Now, I'm not sure about this "much richer" business...
 
Much richer colors? Seriously people.

Monitors have to be calibrated differently when used with different sources because not all sources output the exact same curve. Review sites /never/ recalibrate between shots and don't take shots with cameras, so yes, different cards will produce shots with different curves. Get over it. So long as you, on your personal setup, calibrate things properly there will be no appreciable difference in color.
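To put a number on that, here's a trivial sketch; the two output curves are made up purely for illustration, not measured from any card. The same rendered value comes out of two setups with slightly different curves at different brightness, and calibrating both to the same target removes the gap.

```python
# Illustrative only: the 1.0 vs 1.1 exponents are invented, not measured.
def through_lut(value, gamma):
    """Apply a simple power-curve output LUT to a 0..1 framebuffer value."""
    return value ** gamma

mid_grey = 0.5                        # identical value produced by the renderer

setup_a = through_lut(mid_grey, 1.0)  # setup A: identity curve
setup_b = through_lut(mid_grey, 1.1)  # setup B: slightly darker midtones -> looks "richer"

print(f"setup A shows {setup_a:.3f}, setup B shows {setup_b:.3f}")
# Calibration means adjusting each setup's curve until mid-grey (and the rest
# of the scale) hits the same target, after which the "richness" gap is gone.
```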
 
To me, ATI's colors are much richer. The Nvidia colors seem brighter and washed out. This may be due to a lighting difference in the scene.

Switching between the ATI / Nvidia images, there is no difference in gamma, lighting or colour.
The main difference is down to the shadows.
NVidia is not drawing the darker shadows, which unfortunately makes some of the leaves disappear in the trees and removes some of the ambience where dark places are still well lit.
All parts of the images are identical apart from the dark shadows, except...

ATI's anti-aliasing loses a whole side of the chain fence!
 
Switching between the ATI / Nvidia images, there is no difference in gamma, lighting or colour.
The main difference is down to the shadows.
NVidia is not drawing the darker shadows, which unfortunately makes some of the leaves disappear in the trees and removes some of the ambience where dark places are still well lit.
All parts of the images are identical apart from the dark shadows, except...

ATI's anti-aliasing loses a whole side of the chain fence!

Gotta pick your poison IMO; the colour on NV was horrible. The apartment block in the background shows up as a complete white blob. ATI seems to render the lighting more properly and realistically.

The AA on the fence does look pretty bad for ATI, though: it seems that while softening the edges made the fence look less pixelated than NV's implementation, it ended up eating away finer lines at an angle, resulting in just some dark pixels floating in the air.
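That's about what you'd expect when a resolve filter wider than a pixel meets a feature thinner than a pixel. A toy example with made-up numbers:

```python
# Toy numbers, not taken from the screenshots: a one-pixel-wide bright "wire"
# against a dark background, resolved with a per-pixel box vs a wider tent filter.
import numpy as np

row = np.zeros(15)
row[7] = 1.0                            # the fence wire: one bright pixel

box = np.array([1.0])                   # box resolve: each pixel keeps its own value
tent = np.array([0.25, 0.5, 0.25])      # simple tent spanning the neighbouring pixels

print(np.convolve(row, box, mode="same").max())   # 1.0 -> wire stays at full brightness
print(np.convolve(row, tent, mode="same").max())  # 0.5 -> wire is half as bright, easier to lose
```

The edges look smoother because the energy is spread out, but that same spreading is what lets a thin wire fade toward the background until only a few stray dark pixels survive.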
 
I have modern cards from both and I tend to prefer nvidia's gaming IQ slightly more, but there's nothing wrong with either one.
 
The apartment block in the background shows up as a complete white blob.

Viewing on a plasma, all of the apartment building is present, just in much lighter shades of colour.
Your monitor may be saturating if the contrast or brightness is set too high?
 
I say the difference is rather clear:
[tree1ph1.png]

Background contrast for NV seems to shit itself:
[apwd6.png]

SS 245B, brightness: 50, contrast: 75
 
I would say my 8800GT is better than the 4850 I had, considering the image tearing that I could not get rid of on the 4850 made 3d apps unbearable.
 
I have seen that kinda stuff a lot, from both sides, over the past few years.
 
Much richer colors? Seriously people.
Monitors have to be calibrated differently when used with different sources because not all sources output the exact same curve. Review sites /never/ recalibrate between shots and don't take shots with cameras, so yes, different cards will produce shots with different curves. Get over it. So long as you, on your personal setup, calibrate things properly there will be no appreciable difference in color.

Agreed. However, I don't see how you could possibly calibrate your monitor enough to make those buildings in the background appear like they do in the ATI shot. I also feel you'd have a hard time trying to get rid of that lighting on the brick wall behind Alyx.

Like I said before, this may just be a bad example, perhaps the game had some kind of dynamic lighting that caused the scene to change so much.

Look at the HL2 screenshots. The chain-link on the right side is missing entirely on ATI cards. That's not cool.

Agreed. I originally found that link and posted it to another forum talking about how ATI completely optimized out the fence behind Alyx. I was posting saying how I was torn between whether to get an ATI or Nvidia card. With the ATI card, I'd get the better color, but lose information due to AA. With the Nvidia card I'd get better AA, but lose objects due to washed out color.
 
Like I said before, this may just be a bad example, perhaps the game had some kind of dynamic lighting that caused the scene to change so much.
This looks like a potential Source engine fog issue to me. What would the D3D9 reference rasterizer have to say, I wonder?
 
To be able to notice a difference you would have to take still shots, blow them up, and examine them slowly

But if you do, I have felt that since G80 NVIDIA has drawn more objects on screen and has had a longer draw distance in certain games (HL2), as well as rendering things like plants more fully than ATI. IMO NVIDIA also gives better filtering quality.

However, ATI has almost always had better AA quality, and IMO nothing has changed with this generation.

I disagree. Still shots only tell part of the image quality story; the other half is in how they render things that change. Flickering textures and edges, smoke and fog effects, and other miscellaneous issues can be noticed (although not documented) while in motion.
 
AA comparison of ATI and NV in games

I find that ATI's AA is better than NV's, though while enjoying a game you may not notice the difference.
Generally speaking, above a 4x AA setting it is hard to spot the difference in-game, but if you have some patience to take a look, you will find they are not always the same!

I used two computers to test the AA quality of ATI and NV, but with the same monitor (BenQ FP737S).
Computer 1:
E6300, 2GB RAM, HD4850
Computer 2:
E8400, 2GB RAM, MSI 8800GT OC

The games I tested were TES4: Oblivion and Two Worlds.
AF was set to 8x in the driver, and AA was set to 2x, 4x and 8x, so I could compare the difference.
Although the ATI and NV drivers both have advanced AA settings, I ignored them; I only tested the general AA settings that everybody is likely to use in-game, which is also what many benchmarks on the web use.

Oblivion
2x AA: [Oblivion2AA.jpg]
4x AA: [Oblivion4AA.jpg]
8x AA: [Oblivion8AA.jpg]

Two Worlds
2x AA: [TwoWorlds2AA.jpg]
4x AA: [TwoWorlds4AA.jpg]
8x AA: [TwoWorlds8AA.jpg]

---------------------------------------------------------------

Oblivion: [Oblivion.jpg]
4x AA: NV [ob-nv-4aa.jpg] / ATI [ob-ati-4aa.jpg]
8x AA: NV [ob-nv-8aa.jpg] / ATI [ob-ati-8aa.jpg]

Two Worlds: [TwoWorlds.jpg]
4x AA: NV [tw-nv-4aa.jpg] / ATI [tw-ati-4aa.jpg]
8x AA: NV [tw-nv-8aa.jpg] / ATI [tw-ati-8aa.jpg]

-------------------------------------------------------------------------
The following is the AA test of Crysis, under Windows XP SP2 with the video quality all set to High.
(It's a pity I can't use 8xQ and 16xQ in Crysis with my 8800GT; the AA setting drops back to 0x AA when applied.)
ATI's Tent and Edge-detect AA modes have to be switched on in the driver.
The following is the comparison of 4x AA and 8x AA; you should take a look at the frame of the window.

Original (no AA): [noaa-1.jpg]
4x AA: [4aa.jpg]
8x AA: [8aa.jpg]

-------------------------------------------------------------------------------------------

AA test facing the light source
Original (no AA): [noaa-2.jpg]
ATI: [ATI-aa-1.jpg] [ATI-aa-2.jpg]
NV: [nv-8aa16aa.jpg]

You can see how the FPS varies with the different AA settings.
ATI's Box mode is the most efficient, while its Tent modes cause a big drop in FPS,
but they still smooth the edges of the light source and other textures perfectly.
NV's 16x AA is better than its 8x AA, but the difference is not so easy to make out.

(P.S. While playing Oblivion I found the screen flickered with my 8800GT but was stable with the 4850, so I suppose that is down to something else.)
 
In Source-based games, yes; in anything else it's neck and neck in my opinion. I just got a 4870 to replace a dead SLI setup I had. This is the first ATi card I have had since the 9700/9800 Pro, and I noticed a huge difference in Source-based games. For example, I can now see across a map without the fog bug the nVidia cards have. For some reason, after the 6800s they don't render the fog right in Source games. Just my experience, though.
 
Looks like Gamma Correction AA isn't working on NVIDIA cards on those pics.
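For anyone wondering what gamma correction actually changes in those shots, here's a minimal sketch. It assumes a plain 2.2 power curve rather than the exact sRGB transfer function; the point is just that a 50%-covered edge pixel resolves to a very different brightness depending on whether the samples are averaged before or after gamma encoding.

```python
# Minimal sketch, assuming a simple 2.2 power curve (not the piecewise sRGB formula).
GAMMA = 2.2

def to_linear(c):
    return c ** GAMMA          # encoded framebuffer value -> linear light

def to_encoded(c):
    return c ** (1.0 / GAMMA)  # linear light -> encoded framebuffer value

black, white = 0.0, 1.0        # the two colours meeting at an edge, 50% coverage each

# Gamma correction off: average the encoded values directly.
naive = (black + white) / 2                                        # 0.5 encoded
# Gamma correction on: average in linear light, then re-encode.
corrected = to_encoded((to_linear(black) + to_linear(white)) / 2)  # ~0.73 encoded

print(f"naive resolve:         {naive:.3f} encoded (~{to_linear(naive) * 100:.0f}% of white's light)")
print(f"gamma-correct resolve: {corrected:.3f} encoded (50% of white's light)")
```

With gamma correction off, blended edge pixels come out too dark, which is one reason edges against bright backgrounds can look different between the two cards' shots.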
 
Nv IQ has greatly improved from the crap that was the over-optimized 5xxx series; it started to improve with the 6xxx series. Really, IMHO, since the 8xxx series there has been little difference in IQ until you turn AF or AA on. Both companies do things a little differently, so one's AA/AF level may not look exactly the same as the other's.
Both companies also have their optimizations for certain games. Some of these optimizations affect IQ a little more than others.

In most games I play, my opinion is that Ati's 2xAA looks better than Nv's. But once you hit 4xAA or higher, I have a very hard time noticing a difference between the two, without zooming in on screen shots.
 
AA comparison of ATI and NV in games

I find that ATI's AA is better than NV's, though while enjoying a game you may not notice the difference.
Generally speaking, above a 4x AA setting it is hard to spot the difference in-game, but if you have some patience to take a look, you will find they are not always the same!

I used two computers to test the AA quality of ATI and NV, but with the same monitor (BenQ FP737S).
Computer 1:
E6300, 2GB RAM, HD4850
Computer 2:
E8400, 2GB RAM, MSI 8800GT OC

The games I tested were TES4: Oblivion and Two Worlds.
AF was set to 8x in the driver, and AA was set to 2x, 4x and 8x, so I could compare the difference.
Although the ATI and NV drivers both have advanced AA settings, I ignored them; I only tested the general AA settings that everybody is likely to use in-game, which is also what many benchmarks on the web use.

Oblivion
8x AA: [Oblivion8AA.jpg]

Oblivion: [Oblivion.jpg]

Now I don't know what resolution you ran at, but your pictures were @ 1280x1024, so that is what I set mine to, and I also enabled 8x AF and 8xQ AA in the control panel:
[screenshot]

Now using 8x CSAA, which seems closer to yours but still better:
[screenshot]

Pop quiz:

What is dumber than comparing images enlarged 800% for a slight difference in jaggedness?

Answer: Nothing
 
I'm not sure what those screens were; I scrolled past them all. But yeah, I hear nvidia has smaller jaggies.
 
From those screens, ATI has better AA in the first few.

Isn't ATI's new Edge Detect supposed to be kicking some arse? [H] did a review on it a while back.

But as said, unless you're stopping to smell the roses in your games, I doubt 99% of you would know which card was which if you were blind tested: sat down at a rig, told to play, and asked to pick what card was in it.

Heck, I would love to see some tests where someone is sat down in front of:

Nvidia first,
then ATI;

an NVIDIA system,
then NVIDIA again (this one would be amusing);

ATi and ATI... you get the drift.
 