GoodBoy
2[H]4U
- Joined
- Nov 29, 2004
- Messages
- 2,768
Oh, I've stated many times precisely what looks different. The sheer clarity of essentially everything on screen, especially text, is my primary concern. I noted the 'vaseline screen' look of Nvidia and I wasn't exaggerating. People will of course endlessly tell me it's some ClearType setting I've overlooked, or that it's just placebo and I'm fooling myself, but there are other people just like me who notice it immediately.
There are plenty of people on this very forum who will sit and tell you that LCD televisions are perfectly acceptable for watching TV, when they are in fact utter crap. I can't help that they are less discerning, but the fact remains they are. A lot of people never even notice things like reduced motion resolution, horrible black/gray levels, flashlighting, etc. These kinds of things simply reinforce my belief that a great number of people lack something in the way they process the world visually.
I think what you might be seeing is the shoddiness of VGA. I recommend going DVI for sure (and use a DVI-D cable, which omits the analog signal path entirely; with DVI-I, the card/LCD might default to analog).
We use HP PCs at work, with built-in AMD graphics and HP LCDs. Most of the building's 800 or so users run dual displays. The HP PCs come with one DVI port and one VGA port (and more recently the DVI was replaced with DisplayPort on last year's desktop model), so we use both for the LCDs on each user's desk. The VGA-connected LCDs have weird issues: auto-adjust helps a bit, but it's still there. Most of the display is in focus, but there will be vertical sections that look just a tad out of focus. Annoys the hell out of me. The DVI-connected side looks perfect.
You are definitely right that there can be differences between cards, maybe subtle to some people. But in my experience it only appears on analog (VGA) connected displays.
The original poster's mention of the age-old argument: I think it originally began as fact in the early '90s, when differences in DACs, etc., between video cards (very simple non-3D devices back then) were quite real. I remember some Viewsonic VGA cards, ISA slot, with 1 MB of RAM, that went for $200 or more, priced that high mainly for the quality of the output. This was back in the day when Monster Cable actually could make a difference.
Today, you can eliminate all those issues as the OP did: use the digital connections.
OP, you might try your original tests again, this time purposely using the VGA ports.
The problem with analog connections affects both brands, as I can still see today on PCs I support at work. It's possible there are differences between brands of cards here. In this day and age, it seems more likely any such difference would be attributable to the board manufacturer (ASUS, Gigabyte, etc.) alongside, or instead of, the GPU manufacturer. Some more testing might reveal it!