My GeForce 8800 GTS 512 finally kicked the bucket, and while I'm in the process of RMA'ing it, it seems like a good time to upgrade; I already upgraded the rest of my system to Sandy Bridge.
My last 3 video cards were all NVIDIA (GeForce2 MX, GeForce 6600 GT, GeForce 8800 GTS 512), but this time I decided to try the ATI chipset. I bought a PowerColor AX6950 2GB (Radeon HD 6950). I had some nice surprises, but also some unpleasant ones.
Pros of the 6950:
- It seems to be able to drive my 24-bit color DVI Dual Link LCD with 30-bit color. My guess is that it's using "temporal dithering", flickering quickly between neighboring 24-bit values (there's a toy sketch of the idea after this list). This is something I really, really wanted, so that I could use gamma correction without posterizing smooth gradients. I thought I'd have to wait until I eventually got a DisplayPort monitor to take advantage of 30-bit color. (Incidentally, my GeForce 8800 GTS 512 did output 30-bit color to my CRT quite well.)
- It gives a lot of control over video deinterlacing and other things related to video playback. Since I watch TV on my computer monitor using a Blackmagic Intensity Pro, this is doubly nice. (Or so it would seem... see below.)
- Latency of the Enhanced Video Renderer seems to be less than it was with my GeForce 8800 GTS 512.
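To make the temporal dithering idea concrete, here's a toy Python sketch of how I imagine it working (this is just my guess at the principle, not anything AMD documents): a 10-bit channel value gets approximated on an 8-bit link by alternating between the two nearest 8-bit values across frames, so the time-average lands on the 10-bit value.

```python
def dither_10bit_to_8bit(value_10bit, frame_index):
    """Approximate a 10-bit channel value (0-1023) on an 8-bit link by
    alternating between the two nearest 8-bit values across frames.
    Purely illustrative -- real hardware presumably uses a fancier
    spatio-temporal pattern."""
    base, fraction = divmod(value_10bit, 4)  # 4 = 2^(10-8)
    # Emit the higher neighbor on 'fraction' out of every 4 frames, so the
    # average over 4 frames is value_10bit / 4.
    return min(base + (1 if (frame_index % 4) < fraction else 0), 255)

# Example: the 10-bit value 513 averages out to 128.25 over 4 frames.
print([dither_10bit_to_8bit(513, f) for f in range(4)])  # [129, 128, 128, 128]
```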
Cons of the 6950:
- The video deinterlacing doesn't actually seem to work. Going from Weave to Bob has exactly the effect it's supposed to; but going from Bob to the higher, "adaptive" settings results in brain-dead deinterlacing that pretty much always looks worse than Bob. The pulldown detection doesn't seem to work either. (I'm using "Enhanced Video Renderer" for output.) My GeForce 8800 GTS 512 was at least able to dynamically choose between Weave and Bob appropriately most of the time. (If the mode names are unfamiliar, there's a quick sketch of Weave vs. Bob after this list.)
- The card resamples low-resolution modes (e.g. during boot-up) to 1280x800 instead of 2560x1600. Apparently it looks at the list of modes my LCD supports and goes with the lowest one. So I get the worst of both worlds: the display looks fuzzy and pixelated at the same time. My GeForce 8800 GTS 512 resampled low-resolution modes up to 2560x1600. (What I'd actually like is for low-resolution modes to be pixel-upsampled by the highest integer multiplier that still fits everything on the screen; I don't mind black borders. There's a sketch of that kind of scaling after this list. But given a choice between the two alternatives, I want it to upsample to 2560x1600.)
- When I drag windows (with "Show window contents while dragging"), they often leave "droppings", parts where the window beneath doesn't redraw itself. I tried playing with overclocking and underclocking the 6950, and nothing got rid of the droppings. This could of course be a flaw in my particular sample rather than in all 6950s.
- Although it has two DVI ports, only one of them is DVI Dual Link, and only one of them is DVI-I, allowing an analog CRT to be plugged in. Unfortunately the same port does both of those things, so I can't plug my 2560x1600 LCD into one port and my Sony GDM-FW900 CRT into the other one. I didn't know this when I bought the card; all I knew was that only one of its DVI ports was Dual Link, but I thought surely a CRT could be plugged into either of them.
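In case the mode names mean nothing to you, here's roughly what Weave and Bob are, as a toy numpy sketch (the textbook idea only, not AMD's or NVIDIA's actual implementation); an "adaptive" deinterlacer is supposed to weave where the two fields agree and bob where they differ.

```python
import numpy as np

def weave(top_field, bottom_field):
    """Weave: interleave the two fields into one full frame.
    Perfect for static content, but combs on anything that moves."""
    height, width = top_field.shape
    frame = np.empty((height * 2, width), dtype=top_field.dtype)
    frame[0::2] = top_field      # even lines from the top field
    frame[1::2] = bottom_field   # odd lines from the bottom field
    return frame

def bob(field):
    """Bob: line-double a single field into a full frame.
    No combing, but only half the vertical resolution."""
    return np.repeat(field, 2, axis=0)

# 1080i arrives as two 540-line fields per frame.
top = np.random.randint(0, 256, (540, 1920), dtype=np.uint8)
bottom = np.random.randint(0, 256, (540, 1920), dtype=np.uint8)
print(weave(top, bottom).shape, bob(top).shape)  # (1080, 1920) (1080, 1920)
```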
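And here's a sketch of the integer scaling I wish low-resolution modes got (entirely hypothetical; I know of no driver setting that does this): pick the largest integer multiplier that still fits the source mode on the panel, and center the result with black borders.

```python
def integer_scale(src_w, src_h, panel_w, panel_h):
    """Return (scale, x_offset, y_offset) for pixel-doubling-style upscaling:
    the largest integer multiplier that keeps the scaled image on the panel,
    centered with black borders. Hypothetical -- not an actual driver option."""
    scale = max(1, min(panel_w // src_w, panel_h // src_h))
    x_off = (panel_w - src_w * scale) // 2
    y_off = (panel_h - src_h * scale) // 2
    return scale, x_off, y_off

# A 640x480 boot mode on a 2560x1600 panel would be drawn at 3x (1920x1440),
# with 320-pixel side borders and 80-pixel top/bottom borders.
print(integer_scale(640, 480, 2560, 1600))  # (3, 320, 80)
```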
Some questions for those of you with GeForce GTX 500 series cards:

1) Is the 30-bit-to-24-bit temporal dithering an exclusive feature of the Radeon HD 6000 series, or does the GeForce GTX 500 series do it too? To find out (on your DVI-connected monitor), just display a smooth gradient (preferably one with all 256 grayscale values), then play with the gamma correction settings in your video driver's control panel and see if that creates visible discontinuities in the gradient. (There's a sketch at the end of this post for generating a suitable gradient.)
2) Do all Radeon HD cards do the ridiculous thing of resampling low-resolution modes up to 1280x800 on a 2560x1600 monitor? Does the GeForce GTX 500 series at least do the sensible thing and resample to 2560x1600 (like my GeForce 8800 GTS 512 did)?
3) My research beforehand suggested that the Radeon HD series was better at playing video (deinterlacing, etc.) than the GeForce GTX series. However, when I actually tried the card, the adaptive deinterlacing didn't seem to be working at all at 1920x1080i. Does the GTX series' adaptive deinterlacing actually work? I don't care if it doesn't even let me choose between deinterlacing options (my 8800 GTS 512 didn't), as long as its adaptive deinterlacing works well enough that I'd never want to play with the setting.
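For anyone who wants to try the gradient test from question 1, here's a quick Pillow sketch that generates the kind of image I mean, with every 8-bit grey level (the output file name is just an example). Display it full-screen, nudge the gamma sliders in the driver control panel, and look for banding.

```python
from PIL import Image  # pip install pillow

# A 1024x256 greyscale ramp: every 8-bit value, each as a 4-pixel-wide band.
width, height = 1024, 256
ramp = [(x * 256) // width for y in range(height) for x in range(width)]
img = Image.new("L", (width, height))
img.putdata(ramp)
img.save("grayscale_gradient.png")  # example file name
```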