Weird graphics quality after ATI -> Nvidia upgrade

eljeffe

Just upgraded my Vista system from an X1950 Pro to an Nvidia 9600 GT. I'm thrilled with the speed, but the text quality in applications seems to have completely fallen apart. I can't tell exactly what's happening, but instead of nice smooth text, everything is "blotchy." It's almost like the weight of the fonts changes randomly. Anyone else ever notice this? I'd almost guess that the system thinks the color depth is 8 bits instead of 32.
 
A few specs, please: do you have an LCD display?
Are you using a DVI-I to VGA (DB15) adapter?
 
I'm using dual Westinghouse L2410NM 24" LCD displays with DVI-I to HDMI adapters. Everything is running at native resolution with ClearType enabled. Nothing has changed except the video card and drivers.
 
Larryjoe, that isn't helping him at all, and it isn't true one bit. Otherwise I would be getting it also.

eljeffe, did you uninstall the ATI drivers before you installed the ones for your Nvidia card? I have an 8800 GT and I don't have your problem at all, and I have the same monitor as you.

Guru3D's Driver Sweeper is a good tool to use: uninstall your drivers, boot into Safe Mode, remove both the ATI and Nvidia driver leftovers, then reboot. Then install the 174.74 drivers, which I'm using, and see if that fixes your problem.
 
I noticed the same thing when I upgraded from an X1800 XT to an 8800 GTS. Also, while I was waiting for my EVGA Step-Up card to arrive, I temporarily used an X700 Pro and again noticed sharper text (desktop, Word, Internet Explorer, etc.). Based on this, I think ATI cards display text better than Nvidia. All of this was on Vista 32-bit with a Gateway FPD2185W 21" LCD and the latest drivers. Just my personal experience, so take it for what it's worth.
 
If your monitor has an auto-adjust (which it should, at least for analog input), be sure to use it with a bunch of text on screen. That aligns the panel's pixel sampling with the incoming signal, making everything sharper. There are a few options you can play around with in the Nvidia control panel as well.
 
I upgraded from a 7900 GS to an HD 3870. Text is way clearer, as is all other 2D stuff.

Even 3D clearness/sharpness is better.

Nvidia has never been great in the sharpness/clearness arena. It's been like that since 3dfx was still around.
 
Agreed with some of the above: ATI > Nvidia for some things, like sharpness, clarity, and video playback. But again, this is personal opinion, so take it for what it is.

For games, though (it's a 3D video card), Nvidia > ATI right now.

I went from a 1900 XT to an 8800 GTS and immediately noticed that some .avi files had more noise and were grainier.

I went into the Nvidia Control Panel, and one of the bottom two sections has options to reduce signal noise and something else. I turned those up to the max and the image became very similar to ATI's.
 
ATI's drivers set stock color saturation differently from Nvidia's; it's not a hardware thing. You can go into the GeForce drivers and tweak until you're satisfied with the colors, but if you don't like fiddling with that kind of stuff, then buy ATI cards.
 
Well, since no one has mentioned it yet, check whether ClearType was somehow turned off in the display settings.
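
If you'd rather check from a script than dig through the dialogs, here's a rough Python sketch (my own illustration, not from any post here) that reads the font-smoothing state straight from the registry; as far as I know these values live under HKCU\Control Panel\Desktop, with FontSmoothing = "2" meaning smoothing is on and FontSmoothingType = 2 meaning ClearType specifically:

# Check Windows font-smoothing / ClearType state from the registry.
# FontSmoothing (REG_SZ): "2" = smoothing enabled, "0" = disabled.
# FontSmoothingType (REG_DWORD): 2 = ClearType, 1 = standard smoothing.
import winreg

with winreg.OpenKey(winreg.HKEY_CURRENT_USER, r"Control Panel\Desktop") as key:
    smoothing, _ = winreg.QueryValueEx(key, "FontSmoothing")
    try:
        smoothing_type, _ = winreg.QueryValueEx(key, "FontSmoothingType")
    except FileNotFoundError:
        smoothing_type = None  # value absent on some installs

print("Font smoothing enabled:", smoothing == "2")
print("ClearType in use:", smoothing_type == 2)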
 
Sorry, but I'm afraid that for gaming, Nvidia cards are a hell of a lot better than ATI cards; only the 3870 X2 is even slightly comparable to the high-end Nvidia cards...

Really, you'd expect the 3870 to compete with the 8800 GT or GTS, but does it? No...

In my opinion, ATI cards are fine for 2D work, or typing or whatever, maybe even an HTPC to help with 1080p playback, or if you're really desperate to get the really high 3DMark06 scores, but if you want high-resolution, high-frame-rate gaming, you'd go for an Nvidia card currently.
 
This thread is full of idiocy. Image quality hasn't been different between ATI and Nvidia for years, and most definitely not over a digital connection like DVI or HDMI. If there's a problem, it's a driver issue or your imagination. Try using different drivers.
 
If the OP is saying what I think he's saying, it's most likely a resolution/refresh rate issue. OP, are you using the same resolution you were using with your ATI card? Are you using the native resolution of your monitor? Sometimes playing with the refresh rate in the Windows Display properties will clear things like that up... sometimes. I don't think this is an Nvidia-only thing. And yes, going from a 7900 to a 3870, you would see image quality differences, just as you would going from the 7900 to a G80- or G92-based card, as they are completely different generations of cards.
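
If anyone wants to verify what Windows actually thinks the current mode is, including the 8-bit vs. 32-bit color-depth guess from the first post, a rough ctypes sketch along these lines should do it (my own illustration; the DEVMODE struct is truncated after dmDisplayFrequency, which the API tolerates because it honors dmSize):

# Ask Windows for the current display mode via EnumDisplaySettingsW,
# to confirm resolution, refresh rate, and color depth.
import ctypes
from ctypes import wintypes

ENUM_CURRENT_SETTINGS = -1

class DEVMODE(ctypes.Structure):
    _fields_ = [
        ("dmDeviceName",         wintypes.WCHAR * 32),
        ("dmSpecVersion",        wintypes.WORD),
        ("dmDriverVersion",      wintypes.WORD),
        ("dmSize",               wintypes.WORD),
        ("dmDriverExtra",        wintypes.WORD),
        ("dmFields",             wintypes.DWORD),
        # display variant of the DEVMODE union:
        ("dmPositionX",          ctypes.c_long),
        ("dmPositionY",          ctypes.c_long),
        ("dmDisplayOrientation", wintypes.DWORD),
        ("dmDisplayFixedOutput", wintypes.DWORD),
        ("dmColor",              ctypes.c_short),
        ("dmDuplex",             ctypes.c_short),
        ("dmYResolution",        ctypes.c_short),
        ("dmTTOption",           ctypes.c_short),
        ("dmCollate",            ctypes.c_short),
        ("dmFormName",           wintypes.WCHAR * 32),
        ("dmLogPixels",          wintypes.WORD),
        ("dmBitsPerPel",         wintypes.DWORD),
        ("dmPelsWidth",          wintypes.DWORD),
        ("dmPelsHeight",         wintypes.DWORD),
        ("dmDisplayFlags",       wintypes.DWORD),
        ("dmDisplayFrequency",   wintypes.DWORD),
    ]

dm = DEVMODE()
dm.dmSize = ctypes.sizeof(DEVMODE)
if ctypes.windll.user32.EnumDisplaySettingsW(None, ENUM_CURRENT_SETTINGS,
                                             ctypes.byref(dm)):
    print(f"{dm.dmPelsWidth}x{dm.dmPelsHeight} @ {dm.dmDisplayFrequency} Hz, "
          f"{dm.dmBitsPerPel}-bit color")

It should print something like "1920x1200 @ 60 Hz, 32-bit color"; if that doesn't match the panel's native mode, that's the problem right there.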
 
ATI's drivers set stock color saturation differently from Nvidia's; it's not a hardware thing. You can go into the GeForce drivers and tweak until you're satisfied with the colors, but if you don't like fiddling with that kind of stuff, then buy ATI cards.

I was referring to the Nvidia Control Panel:
Adjust Video Image Settings -> Edge Enhancement -> set to +100
and
-> Noise Reduction -> set to +100

Then the video files I was watching became close to the quality I was used to. In fact, at that point it became nit-pickingly close, but before it was glaringly obvious; the amount of noise was horrendous.

There might be more settings. Out of the box, ATI's 2D video processing (I watch anime, for example) is better, without the need for any modification. That is my observation.

Those of us who went from one to the other hold this opinion, so I hope that those who are criticizing us have used both recently; otherwise your comments are just one-sided. Without having both an ATI card and an Nvidia card installed in your PC, you are not really qualified to post a one-sided opinion here.

But again, this is the opinion of a few of us who went from one to the other.

Disclaimer: I do not disagree that Nvidia > ATI in gaming FPS and quality settings.
 
Sorry, but I'm afraid that for gaming, Nvidia cards are a hell of a lot better than ATI cards; only the 3870 X2 is even slightly comparable to the high-end Nvidia cards...

Really, you'd expect the 3870 to compete with the 8800 GT or GTS, but does it? No...

In my opinion, ATI cards are fine for 2D work, or typing or whatever, maybe even an HTPC to help with 1080p playback, or if you're really desperate to get the really high 3DMark06 scores, but if you want high-resolution, high-frame-rate gaming, you'd go for an Nvidia card currently.

Come on now, don't turn this thread into the usual green-vs.-red shitfest. Take these kinds of comments elsewhere.

This thread is full of idiocy. Image quality hasn't been different between ATI and Nvidia for years, and most definitely not over a digital connection like DVI or HDMI. If there's a problem, it's a driver issue or your imagination. Try using different drivers.

I agree that ATI and Nvidia image quality are equivalent on the 3D front, but I've seen the difference in text quality with my own eyes. You can point the idiocy finger all you want, but you're wrong. And are you suggesting that all image quality is equal if it's transmitted via a digital signal (HDMI, DVI)? If I'm understanding you correctly, then you're completely ignoring the processing being done at the GPU level.
 
Come on now, don't turn this thread into the usual green-vs.-red shitfest. Take these kinds of comments elsewhere.



I agree that ATI and Nvidia image quality are equivalent on the 3D front, but I've seen the difference in text quality with my own eyes. You can point the idiocy finger all you want, but you're wrong. And are you suggesting that all image quality is equal if it's transmitted via a digital signal (HDMI, DVI)? If I'm understanding you correctly, then you're completely ignoring the processing being done at the GPU level.

I would be truly surprised if there were any "processing" going on with regard to 2D text. Of course, with Vista + Aero there could be some 3D aspects to some text; I'm not sure of the mechanics of how Aero works and what exactly it affects. The only thing I can imagine affecting text quality is not anything inherent to the card, but rather a driver bug or something of that nature.

The reason for degraded image quality in Windows in the past was always a company using crap-ass components for its analog output. This is why Matrox was so big with non-gamers for so long: they had visibly superior 2D image quality. But eventually it got to the point where every company's 2D output was the same, especially once the quality of the components on the board became trivial or irrelevant (in the case of digital output).

Since the output to an LCD is digital and pixel-addressable (i.e., pixel 1 is told to be color X, while pixel 7 is told to be color Y), any degradation in 2D image quality would have to be intentionally caused by something. It is unlikely that it would be at the hardware level, since the 2D portion of the hardware is trivial and simplistic. There would be literally no reason for it to be better or worse than the 2D on any other card, since it should just be displaying what it is told to display. It's not filtering anything, performing any sort of anti-aliasing, etc.

Now, I could buy a driver bug or a Windows bug causing the problem, especially in Vista, where the video card hardware is used a lot more by the UI. But I have never seen any visible difference between ATI and Nvidia cards when using DVI.

Here's the thing, though: if there is a problem with image quality over DVI, it should show up in a screenshot. After all, anything the GPU could have done to the image would already have been done by the time the frame you are capturing got into the framebuffer and out to disk. And beyond that, it's all digital and isn't mangled at all. (With the exception of older 3dfx cards, which used a post-filter to bring 16-bit color up to ~22-bit color.)

So unlike problems with analog output, anything that makes it into the digital output should be visible in a screenshot.
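
To put that test into practice, take a screenshot of the same static screen on each card and diff the two files. A minimal Python sketch using Pillow (a third-party library; the filenames here are just placeholders):

# Diff two screenshots: if the DVI output really differed between cards,
# the difference would already be in the framebuffer, so two captures of
# the same static screen should not match pixel-for-pixel.
from PIL import Image, ImageChops

a = Image.open("ati_desktop.png").convert("RGB")
b = Image.open("nvidia_desktop.png").convert("RGB")

diff = ImageChops.difference(a, b)   # per-pixel absolute difference
bbox = diff.getbbox()                # None if the images are identical

if bbox is None:
    print("Screenshots are pixel-identical; the problem isn't in the framebuffer.")
else:
    print("Images differ inside region", bbox)
    diff.save("difference.png")      # inspect where the pixels changed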
 
I was of the opinion that NV had cleared up their less-than-stellar 2D by the time the 7xxx series launched (up from the abysmal 5xxx series and prior, and the only slightly better 6xxx series, IMHO). Since then I have been hard-pressed to notice any glaring 2D differences over VGA or DVI, CRT or LCD, between ATI and NV.
Going from a 1900 XT to an 8800 GTS, I did not notice a loss in 2D image quality, either on a CRT connected by VGA or on a VA-paneled Dell over DVI.
Perhaps NV's driver defaults are just not optimal for some people's hardware/software configs in 2D.

Over VGA on a bottom-rung card, where every corner gets cut to keep the price low, I can see the possibility of 2D IQ differences, especially now that more and more people connect via DVI: the AIBs could be more easily tempted to cut corners on VGA output quality than they might have been otherwise. Cheaper, lower-quality DACs, for instance.
 
I've noticed a few times as well, when switching from an ATI card to an Nvidia card, that 2D text and graphics seemed somewhat different and maybe not as "high quality" as on the ATI card. I remember noticing it right away, even on the BIOS screen at POST. It was like the Nvidia card was using a slightly different font or something. I would suspect a driver issue more than anything, but at the BIOS level graphics drivers aren't loaded. That's not to say Nvidia's 2D image quality is really that bad, but I have noticed a slight difference when switching between the two. I can't say how true that is for the latest series of cards from Nvidia, though, since I've been on a laptop the past year and am just now putting together a desktop.

Not everyone notices slight changes as much as other people do. I'm the kind of person who's very particular about certain things, because I'm a bit of a perfectionist, so if there's been a little change to something I'm used to, I notice it right away. For some of you it may take more of a drastic change before you pick up on it. You might, for example, have a wife you see every day, and if she went out and got her hair or nails done, you might not notice at all. It might take a full-blown DD boob job to even get your attention.
 
Yeah, with the default install config of NV drivers, 2D and text is usually not as clear as with an ATI card, but all it takes is a bit of tweaking (10 seconds) in the NV CP and it's identical: just disable and re-enable ClearType (for LCDs), and raise gamma and image sharpening in the CP, plus digital vibrance (go easy on that one, not too much, maybe 10%).
 
OK, I'm seeing a lot of utter nonsense being thrown around in this thread, and very little of it does anything to help the OP solve his problem.

Let's go point by point here:
1. Nvidia and ATI use different default color settings. I've had an ATI and an Nvidia card in the same system, and you can adjust the drivers so they're identical.
2. Image clarity over a digital connection should be identical. Any difference is due to either faulty hardware or misconfigured drivers (analog is another story; that depends on more than just the card itself).
3. Someone mentioned the clarity of motion video (in this case, anime); that is a different matter entirely, and it sounds like you had PureVideo acceleration turned off on the Nvidia card (a misconfiguration).
4. DO NOT turn on image sharpening over a digital connection if you care about true image reproduction. If you're doing any kind of graphics work, turning on image sharpening instantly makes the display useless. As mentioned before, if the clarity isn't the same, something else is wrong; don't wreck image quality by turning on sharpening.


To the OP: download the ClearType Tuner from Microsoft and try adjusting fonts that way. If that fails, run Driver Cleaner, reinstall your video drivers, check the settings on your monitor, and (if possible) use a digital input.
 
The difference I noticed when switching between the ATI and Nvidia cards was not due to the default color profile, and if it was faulty hardware, it was on the Nvidia card, which was connected to the LCD through the same digital DVI connection as the ATI card. There was a difference in the on-screen text that I noticed immediately upon install: it seemed more blurred, almost like a different, skinnier font altogether compared to what I was used to seeing with the ATI card. And I don't believe it was related to the drivers so much either, because it was something I saw even on the POST screen. The text was just slightly different with the Nvidia card and didn't seem as crisp as with the ATI card.
 
The difference I noticed when switching between the ATI and Nvidia cards was not due to the default color profile, and if it was faulty hardware, it was on the Nvidia card, which was connected to the LCD through the same digital DVI connection as the ATI card. There was a difference in the on-screen text that I noticed immediately upon install: it seemed more blurred, almost like a different, skinnier font altogether compared to what I was used to seeing with the ATI card. And I don't believe it was related to the drivers so much either, because it was something I saw even on the POST screen. The text was just slightly different with the Nvidia card and didn't seem as crisp as with the ATI card.


I have tested (and owned) ATI 9800 Pro, X800 XL, and X1950 cards right next to 6000-, 7000-, and 8000-class NV cards, and usually the NV cards (driver depending) need a wee bit of CP tweaking to output text and 2D with the same clarity as the default driver settings of an ATI card. In each and every case I have been able to get NV cards to look absolutely identical to ATI cards over a VGA or digital connection in 2D and text. In fact, I'm typing/reading on my secondary rig right now with a 7950 GT, and it's every bit as crisp in 2D and text as any ATI card I have installed for a customer or for myself, but it did take some NV CP tweaking.
 
I noticed a drop in 2D text quality when I swapped my X1950 Pro for my brother's 8800 Ultra. It just didn't seem as crisp...
 
Matrox used to have the best 2D quality and some pretty decent 3D quality, although they were slower than the competition in 3D. It's unfortunate it's a two-horse race for gaming cards these days; for the short time they were in it, Matrox cards couldn't be beat for IQ. It's been my experience that ATI is slightly better these days, although I can't tell much difference without looking for it. The last Nvidia card I used was the 6600; currently I've got an ATI 3850, and I honestly didn't notice a big difference in the sharpness of my text. So unless the newer Nvidia cards have worse IQ than the 6000 series, I don't think the brand should be a big enough issue to be so noticeable. Now if I could just get the 2D image quality of the G200 on a card with modern 3D...
 
Well, it's not just that the default driver install has better 2D/text IQ on ATI than on NV (it does); NV driver IQ also varies from driver to driver. For instance, when I went from the 163.75s to the 169.21s (NV) there was a massive difference in gamma, so I had to raise gamma from 20% to 45% in the NV CP, whereas ATI Cats' default settings pretty much carry over from driver to driver.
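
For the curious: those gamma sliders ultimately just load a per-channel lookup table into the card's output stage, which you can inspect yourself. Here's a rough read-only Python sketch of the underlying Win32 call, GetDeviceGammaRamp (my own illustration; some drivers don't expose a ramp at all):

# Read the current hardware gamma ramp via Win32 GetDeviceGammaRamp.
# Driver gamma sliders (ATI CCC, NV CP) work by loading a 256-entry
# lookup table per channel into the card's output stage; a perfectly
# linear ramp (entry i maps to i * 257) means no correction is applied.
import ctypes

user32 = ctypes.windll.user32
gdi32 = ctypes.windll.gdi32

hdc = user32.GetDC(None)                 # device context for the whole screen
ramp = (ctypes.c_ushort * (3 * 256))()   # R, G, B tables, 256 WORDs each
ok = gdi32.GetDeviceGammaRamp(hdc, ctypes.byref(ramp))
user32.ReleaseDC(None, hdc)
if not ok:
    raise OSError("GetDeviceGammaRamp failed (driver may not expose a ramp)")

linear = all(ramp[c * 256 + i] == i * 257 for c in range(3) for i in range(256))
print("Ramp is linear (no driver gamma correction):", linear)
print("Red-channel midpoint:", ramp[128], "- linear would be", 128 * 257)

Comparing the dump from two different driver versions would show exactly the kind of gamma shift described above.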
 
Those of us who went from one to the other hold this opinion, so I hope that those who are criticizing us have used both recently; otherwise your comments are just one-sided. Without having both an ATI card and an Nvidia card installed in your PC, you are not really qualified to post a one-sided opinion here.

But again, this is the opinion of a few of us who went from one to the other.

Disclaimer: I do not disagree that Nvidia > ATI in gaming FPS and quality settings.

How about those of us who use ATI- and Nvidia-powered systems concurrently; do you think we would then be qualified to respond to the OP's question? It seems there are posters trying to hijack the OP's thread into another red-vs.-green thing, and there are others trying to make clear that there is a logical and reasonable explanation for everything.
 
3. Someone mentioned the clarity of motion video (in this case, anime); that is a different matter entirely, and it sounds like you had PureVideo acceleration turned off on the Nvidia card (a misconfiguration).

Err, wait, so after a fresh XP install and brand-new drivers (the 174.74s), something as simple as PureVideo acceleration is not turned on by default?
Is that right? I mean, if it is, isn't that weird? Should it be like that?

How about those of us who use ATI- and Nvidia-powered systems concurrently; do you think we would then be qualified to respond to the OP's question?

I would say you're covered by the below, since a side-by-side setup has both:

Without having both an ATI card and an Nvidia card installed in your PC

It seems there are posters trying to hijack the OP's thread into another red-vs.-green thing, and there are others trying to make clear that there is a logical and reasonable explanation for everything.

There are, I agree; that's why I said the above.

Thus far there seem to be two answers:
1) ATI > Nvidia in 2D.
2) Nvidia is the same as ATI in 2D after a few minutes of tweaking.

If you agree with this (if not, re-read the thread, because these are the only two answers out there), then it follows that ATI is better out of the box. It's like comparing a card you can just use with one you need to OC to get the same results; i.e., they are not the same. Most [H] users, I would like to think, know a thing or two, or are willing to learn a thing or two, about how to get the most out of their hardware, so it's not a big deal. But there is a difference between plug-and-play and plug-and-configure-for-however-long-and-play.

Disclaimer: same as before. Currently running an 8800 GTS 512; games run better than they did on my 1900 XT, and current-gen Nvidia > ATI.
 
Err, wait, so after a fresh XP install and brand-new drivers (the 174.74s), something as simple as PureVideo acceleration is not turned on by default?
Is that right? I mean, if it is, isn't that weird? Should it be like that?
The card is always ready to use PureVideo; you need to configure your media player to take advantage of it (it's a simple checkbox as far as Windows Media Player is concerned).

2) Nvidia is the same as ATI in 2D after a few minutes of tweaking.
Not quite right; the only thing that differs is the color profile (and maybe gamma slightly). It's not better or worse, just different. If you want your Nvidia card to match your ATI card, here's a tip: bump gamma up a tick and boost saturation.

My personal preference is to tone down the ATI card to match Nvidia's defaults; ATI's defaults just look oversaturated on my monitors.
 