Why does it seem like Nvidia looks better than ATi?

USMC2Hard4U

I have owned ATi cards since the original Radeon back in the day, and have been happy with their 3D performance, etc.

Before that I owned a GF2 GTS and it was sweet.... whatever...

Well, now I am at Best Buy playing with all these computers that have GeForce FX 5900s in them, and their 2D looks so much better... it seems clearer, and like Windows is using more colors and shadows for its icons. WTF? I thought ATi was supposed to have superior image quality in 2D?

And I even tried a bunch of different computers, on different monitors... all with the same result.

Anyone know?

Thanks
 
What ati cards are you comparing to?

Are you comparing the two cards on the same monitor?

Are you comparing integrated vs non?

Etc, etc.

The ONLY way to do a comparison is to compare off the SAME computer on the SAME monitor (preferably a very expensive one). Otherwise it's all speculative hogwash. :D


Cheers,

Mr. Pain
 
Here we go, another "my BFG 5900 does 50000 core and 80000 mem so it looks better than ATI" thread, lmfao :rolleyes:
Two tips for ya: first, wash your eyes, then try a Radeon 9800 on a 19in CRT, then try a 5900 NU ;)
Secondly, turn both of these cards up to 1600x1200, then turn on AA and AF. Then report back to us with the real scientific findings :D
 
The colors are probably nVidia's digital vibrance control (the same can be accomplished with ATi, manually) or a different monitor. The rest (the shadows) is either your imagination or simply different Windows settings.
 
It could be digital vibrance, but the monitor can make a huge difference. If you have an old monitor, the picture can get a little distorted and fuzzy. On top of that, the refresh rate can sometimes make a difference, as can desktop settings (i.e. running in 16-bit vs. 32-bit color).
 
I'll have to put a big no on that.

This is pretty subjective, but I've had the 5900 U in my system.
Then I went to a 9800 Pro.

The ATi looks much better, 2D and 3D (ohh... especially 3D).

But that's really subjective.
 
If you're really adamant about 2D image quality and want to get the clearest picture possible, I suppose you could remove the video RF filter on the card (actually, bypass it). I did this on my GF2 GTS and it had phenomenal results. Before removing the filter, I could do 1280x1024 but the time in the system tray was a little fuzzy. Now I can crank the res to 1600x1200 with complete clearness. Colors also seem brighter.
 
Pfft.. there's really no noticeable difference in 2D anymore between the two.

You can see the difference in comparing older generations of cards, but not really anymore.
 
Aside from killing the RF filters, which makes all graphics cards pretty freakin' sharp, newer cards aren't really noticeably different in image quality. RENDER quality, maybe, but signal noise, no.

Well, Matrox is still better than everybody, but not by a whole lot.
 
Originally posted by USMC2Hard4U

"it seems more clearer and like windows is using more colors and shadows"

It's a subjective matter, but only to the extent of what the viewer knows about how a properly rendered image should look. A lot of people like ATI's image because it is brighter, in an excessive contrast/brightness kind of way. nVidia uses lighting and shadows better, rendering images in a more realistic way and displaying colors in a more sensible manner.

It takes quite a bit of effort for someone to change what they are used to seeing (the bright "wow effect"), and thus nVidia's and ATI's image quality will likely remain subjective.
 
You need to go see your optometrist. He'll fix the problem with the ATi cards.
 
Well, nVidia has gotten a lot better at 2D than in the past. I don't think there is much difference in 2D quality between the 5900 and 9800, for example. Digital vibrance can explain the extra colours, and if you like it, sure.

On my old Ti 4600 I had to turn digital vibrance to low or off when I got a Trinitron monitor, since the colours got exaggerated too much for my taste. It looked very cartoonish and unrealistic to me. And going from my old GeForce 4 to my current 9700 PRO, I can't say I have ever noticed a real difference in 2D quality.

But then, my computer was down for 1.5 months before I got my Ti 4600 exchanged for a 9700 PRO for free :D
 
Originally posted by Digital Viper-X-
He's comparing it to the original Radeon he had in his system


Nooo, I am just saying: from the original Radeon I had, to my 8500, to the 9700 Pro I have now...

it all seems the same to me...

I have a Samsung 17-inch LCD now, and had a Sony 17-inch CRT before.

Now, I went to Best Buy and looked at a couple of Sony VAIOs and HPs with nVidia cards on all kinds of monitors, and I can notice so much difference in icon color quality and desktop menu shadow quality compared to all the ATi cards I have ever had... It looks like there is just more color saturation or something, if that makes sense.
 
You do have to be a little careful when ripping off the filters; they are there for a reason - to comply with FCC radiation emission standards.

By ripping off the filters, you run the risk of ultraviolet colour (just outside of the purple end) being emitted from your video card and out the monitor. Infrared (just outside of red) is not a big deal 'cause it doesn't harm your eyes (it's just heat).

ATi is usually better quality than nVidia in 2D. You should calibrate your monitor for each video card... If you've only used nVidia cards, then you may be used to the digital vibrance thing.
 
Originally posted by @trapine
Here we go, another "my BFG 5900 does 50000 core and 80000 mem so it looks better than ATI" thread, lmfao :rolleyes:

He didn't say anything about hardware, he just said what it looks like to him. You people are way too hostile.

Anyhow, it could just be the lighting at the Best Buy and their monitors compared to yours.

It could also be the Windows settings used, maybe?

Anyhow, I doubt there's any real difference. I say use what you like; unless you benchmark, I doubt you can tell the difference between any two similarly priced cards.
 
Originally posted by Badger_sly
A lot of people like ATI's image because it is brighter, in an excessive contrast/brightness kind of way. nVidia uses lighting and shadows better, rendering images in a more realistic way and displaying colors in a more sensible manner.


Did you accidentally mix up ATI and Nvidia? I find that Nvidia's colours look more exaggerated and cartoonish, while ATI's are more subdued and realistic.
 
Originally posted by ZenOps
You do have to be a little careful when ripping off the filters; they are there for a reason - to comply with FCC radiation emission standards.

By ripping off the filters, you run the risk of ultraviolet colour (just outside of the purple end) being emitted from your video card and out the monitor. Infrared (just outside of red) is not a big deal 'cause it doesn't harm your eyes (it's just heat).

ATi is usually better quality than nVidia in 2D. You should calibrate your monitor for each video card... If you've only used nVidia cards, then you may be used to the digital vibrance thing.

Part of the reason I bypassed mine was that I lost the color red (or was it yellow?)... either way, my screen was all hosed and bypassing the filters was suggested to me. I did it and it fixed the missing color. Apparently part of the filter on that color channel had failed, and bypassing it gave me my colors back. It also had the nice side effect of a clearer picture.
 
If you think that a monitor can display UV or IR, I've got a bridge in the Sahara to sell to you.

:::EDIT:::
Monitors do emit in those wavelengths; "display" is the word I was looking for. A monitor can't trick your eyes into thinking you're seeing stuff above 650nm, because that's what the red phosphors emit when struck by electrons.
:::EDIT:::

On the other hand, if you HAVE a monitor that can do it, I'll gladly buy it from you for a lot of money. (Monitors' gamuts are usually 350nm~650nm, because of the phosphors they use)

As for FCC EMI compliance, bah humbug. :p The only thing that should have issues, seeing as you're encasing the graphics card INSIDE a big metal box, is a sound card. (This is why you put the sound card in the LAST PCI slot, and why you try to put other cards between the sound card and the graphics card.)
 
Originally posted by Anarchist4000
It could be digital vibrance, but the monitor can make a huge difference. If you have an old monitor, the picture can get a little distorted and fuzzy. On top of that, the refresh rate can sometimes make a difference, as can desktop settings (i.e. running in 16-bit vs. 32-bit color).

Digital vibrance is total crap. It's just oversaturation; it looks awful.
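For what it's worth, here's a rough sketch of what an oversaturation knob like that amounts to. This is only an illustration, not nVidia's actual pipeline: the boost_vibrance helper, the HSV round-trip, and the 1.4 factor are all my own guesses at what a "vibrance" control roughly does.

import colorsys

def boost_vibrance(r, g, b, factor=1.4):
    # Push saturation up and clamp it; hue and brightness stay put.
    h, s, v = colorsys.rgb_to_hsv(r, g, b)
    return colorsys.hsv_to_rgb(h, min(s * factor, 1.0), v)

# A muted desktop blue comes out noticeably more "vivid":
print(boost_vibrance(0.45, 0.55, 0.70))  # roughly (0.35, 0.49, 0.70)

In other words, it just multiplies how saturated every color already is, which is why subdued desktop colors suddenly "pop" and why some people find it cartoonish.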
 
I've tested 3 cards on the system you see in my sig: an ATI Radeon 7200, an MSI GeForce 4 Ti4200, and an ATI Radeon 9600 Pro. (Both of the Radeons are built by ATI.)

To me, the Radeon 7200 and the GF4 were almost on the same level when it came to 2D quality. However, the GF4 started displaying really fuzzy text when it came to really small fonts. Also, for some reason the convergence on the GF4 wasn't as good as on the 7200. This could have been due to the monitor, I'm not sure; I think my monitor is fairly good in terms of quality.

As for the 9600 Pro, the 2D quality was much better than the GF4 and the 7200. There are almost no convergence issues (especially around the corners), and small fonts are fairly crisp.

Also, someone mentioned that the 'digital vibrance' color modes look almost cartoonish. I have to agree with that. I found it unrealistic when it came to displaying colors, so 90% of the time I had it turned off.
 
Like, oh my god... I totally tested all the latest cards against my Diamond Multimedia Stealth III S540 32MB PCI card and it owned like totally all the other cards because it is the greatest ever and you all are wrong and retarded. Like I looked at Windows 95 and Windows 98 and the latest Windows XP and then like ten versions of Longhorn Beta and I know for sure that the Stealth is tons better than anything from ATI or Nvidia because you all don't know what you're talking about and neither does any hardware review site because Bill Gates pays for those sites with money he steals from Apple and Unix. What I'm saying is that it looks so much better and doesn't hurt my eyes or make me puke so don't you forget it.

Oh yeah... and your butt is really big.
 
Most of this thread gets a :rolleyes:


1/2 the people here have no idea wtf they are talking about :p

Cheers,

Mr. Pain

-- Like I said. Take all the cards, place them in the SAME system with the SAME settings on the SAME monitor. Then do a comparo.

Otherwise, you're just wasting everyone's time.
 
Originally posted by Stanley Pain
Take all the cards, place them in the SAME system with the SAME settings on the SAME monitor. Then do a comparo. Otherwise, you're just wasting everyone's time.
Agreed. I have yet to see one unbiased review with screenshots that agrees with what 90% of the people in this thread claim. "Blah blah blah, this one is better, I saw it." "No, blah blah blah, this one is better, I saw it." That sums up this thread pretty well.
 
I haven't seen any of the post-GF4 nVidia cards in action, but I prefer the image quality of my 9800 Pro vs. that of my Ti 4400 (god I loved that card. I love the 9800 Pro too. :p).
 
OK... just so you people DON'T think I'm retarded...
Sorry for the noobish thread or whatever... it was just my opinion...

However, I found what is causing it...

In the NVIDIA options, they have this Digital Vibrance setting, which makes all the colors more vivid and sharper and whatnot, and that's what I was seeing as the difference...
 
Digital Vibrance :)

Hehe..

Makes colours look WAY off. But I guess if it floats your boat, then why not :)


Cheers,

Mr. Pain
 
Originally posted by USMC2Hard4U
In the NVIDIA options, they have this Digital Vibrance setting, which makes all the colors more vivid and sharper and whatnot, and that's what I was seeing as the difference...

Your thread is not noobish. I did find digital vibrance useful on my GeForce 4, whereas my 9700 PRO was fine without it.
 