Dissatisfied with HD 6950 quirks... should I go with GTX 570 for 2560x1600?

MetaGenie

My GeForce 8800 GTS 512 finally kicked the bucket, and while I'm in the process of RMA'ing it, it's a good time to upgrade the video card as well; I've already upgraded the rest of my system to Sandy Bridge.

My last 3 video cards were all NVIDIA (GeForce2 MX, GeForce 6600 GT, GeForce 8800 GTS 512), but this time I decided to try the ATI chipset. I bought a PowerColor AX6950 2GB (Radeon HD 6950). I had some nice surprises, but also some unpleasant ones.

Pros of the 6950:
  • It seems to be able to drive my 24-bit color DVI Dual Link LCD with 30-bit color. My guess is that it's using "temporal dithering", flickering quickly between neighboring 24-bit values (see the sketch after this list). This is something I really, really wanted, so that I could use gamma correction without posterizing smooth gradients. I thought I'd have to wait until I eventually got a DisplayPort monitor to take advantage of 30-bit color. (Incidentally, my GeForce 8800 GTS 512 did output 30-bit color to my CRT quite well.)
  • It gives a lot of control over video deinterlacing and other things related to video playback. Since I watch TV on my computer monitor using a Blackmagic Intensity Pro, this is doubly nice. (Or so it would seem... see below.)
  • Latency of the Enhanced Video Renderer seems to be less than it was with my GeForce 8800 GTS 512.
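
To make the temporal-dithering point in the first bullet concrete, here's a toy sketch of the idea (entirely my own illustration, not anything from AMD's or NVIDIA's drivers): a 10-bit value is approximated on an 8-bit link by alternating between the two nearest 8-bit codes so that the time average matches the 10-bit target.

```python
def dithered_frame_values(value_10bit: int, frames: int):
    """Yield the 8-bit code to send on each frame for a 10-bit target value."""
    base = value_10bit >> 2          # nearest lower 8-bit code
    frac = value_10bit & 0b11        # remaining 2 bits: 0..3 quarters
    acc = 0
    for _ in range(frames):
        acc += frac
        if acc >= 4:                 # carry: bump up one code on this frame
            acc -= 4
            yield min(base + 1, 255)
        else:
            yield base

# Example: 10-bit value 513 (128.25 in 8-bit terms) over 8 frames:
# [128, 128, 128, 129, 128, 128, 128, 129] -> time-average 128.25
print(list(dithered_frame_values(513, 8)))
```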

Cons of the 6950:
  • The video deinterlacing doesn't seem to actually work. Going from Weave to Bob has the exact effect it's supposed to; but then going from Bob to the higher, "adaptive" settings results in brain-dead deinterlacing that looks worse than Bob, pretty much always. The pulldown detection doesn't seem to work either. (I'm using "Enhanced Video Renderer" for output.) My GeForce 8800 GTS 512 at least was able to dynamically choose between Weave and Bob appropriately most of the time.
  • The card resamples low-resolution modes (e.g. during boot-up) to 1280x800 instead of 2560x1600. Apparently it looks at the list of modes my LCD supports and goes with the lowest one. So I get the worst of both worlds: the display looks both fuzzy and pixelated at the same time. My GeForce 8800 GTS 512 resampled low-resolution modes up to 2560x1600. (What I'd actually like is for low-resolution modes to be pixel-upsampled by the highest integer multiplier that still fits everything on the screen; I don't mind if there are black borders. See the sketch after this list. But given a choice between the two alternatives, I want it to upsample to 2560x1600.)
  • When I drag windows (with "Show window contents while dragging"), they often leave "droppings": parts where the window beneath doesn't redraw itself. I tried overclocking and underclocking the 6950, and nothing got rid of the droppings. This could of course be a flaw in my particular sample rather than in all 6950s.
  • Although it has two DVI ports, only one of them is DVI Dual Link, and only one of them is DVI-I allowing an analog CRT to be plugged in. Unfortunately the same port does both of those things, so I can't plug my 2560x1600 LCD into one port and my Sony GDM-FW900 CRT into the other one. I didn't know this when I bought the card; all I knew was that only one of its DVI ports was Dual Link, but I thought surely both could be plugged into CRTs.
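
Here's the integer-multiple upscaling I mean, sketched out (the function name and numbers are just mine for illustration; as far as I know no driver exposes anything like this):

```python
# Pick the largest whole-number scale factor that still fits the source mode
# inside the panel, and centre the result with black borders.
def integer_scale(src_w, src_h, panel_w=2560, panel_h=1600):
    """Return (scale, x_offset, y_offset) for centred integer upscaling."""
    scale = max(1, min(panel_w // src_w, panel_h // src_h))
    out_w, out_h = src_w * scale, src_h * scale
    return scale, (panel_w - out_w) // 2, (panel_h - out_h) // 2

# Example: a 720x400 boot/BIOS text mode on a 2560x1600 panel
print(integer_scale(720, 400))   # -> (3, 200, 200): a 2160x1200 image, centred
```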
For me, the cons outweigh the pros, and this card is going back to Newegg. However, before I buy another card, can anybody please answer the following questions?

1) Is the 30-bit-to-24-bit temporal dithering an exclusive feature of the Radeon HD 6000 series, or does the GeForce GTX 500 series do it too? To find out (on your DVI-connected monitor), just display a smooth gradient (preferably one with all 256 grayscale values), and then play with the gamma correction settings in your video driver's control panel; see if that creates visible discontinuities in the gradient.
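
If anyone wants to run this test, here's a minimal Python sketch that generates a gradient covering all 256 grayscale values (it assumes the Pillow library is installed; the file name is just an example):

```python
from PIL import Image

WIDTH, HEIGHT = 2560, 256        # one horizontal gradient strip

img = Image.new("RGB", (WIDTH, HEIGHT))
px = img.load()
for x in range(WIDTH):
    v = x * 256 // WIDTH         # 0..255; each grey level is 10 px wide at 2560
    for y in range(HEIGHT):
        px[x, y] = (v, v, v)
img.save("gradient_test.png")
```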

2) Do all Radeon HD cards do the ridiculous thing of resampling low-resolution modes up to 1280x800 on a 2560x1600 monitor? Does the GeForce GTX 500 series at least do the sensible thing and resample to 2560x1600 (like my GeForce 8800 GTS 512 did)?

3) My research beforehand suggested that the Radeon HD series was better at playing video (deinterlacing, etc.) than the GeForce GTX series. However, upon actually trying the card, I could see that the adaptive deinterlacing seemingly wasn't working at all at 1920x1080i. Does the GTX series' adaptive deinterlacing actually work? I don't care if it doesn't even let me choose between deinterlacing options (my 8800 GTS 512 didn't), as long as its adaptive deinterlacing works well enough that I'd never want to play with the setting.
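
For anyone unfamiliar with the modes being discussed, here's a toy sketch of the two basic ones (my own illustration, unrelated to either vendor's implementation): "weave" interleaves the two 540-line fields of a 1080i frame back into one 1080-line frame (perfect for static content, combing artifacts on motion), while "bob" line-doubles each field on its own (no combing, but half the vertical detail at double the frame rate). The "adaptive" modes are supposed to pick between the two per pixel or per block.

```python
def weave(top_field, bottom_field):
    """Interleave two fields back into one full-height frame."""
    frame = []
    for top_line, bottom_line in zip(top_field, bottom_field):
        frame.append(top_line)
        frame.append(bottom_line)
    return frame

def bob(field):
    """Line-double a single field (real bob interpolates instead of duplicating)."""
    frame = []
    for line in field:
        frame.append(line)
        frame.append(line)
    return frame

top = ["T0", "T1", "T2"]        # lines 0, 2, 4 of the frame
bottom = ["B0", "B1", "B2"]     # lines 1, 3, 5 of the frame
print(weave(top, bottom))       # ['T0', 'B0', 'T1', 'B1', 'T2', 'B2']
print(bob(top))                 # ['T0', 'T0', 'T1', 'T1', 'T2', 'T2']
```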
 
What low-resolution modes are you encountering other than boot-up? And is the resolution during the few short moments of boot-up really a serious consideration? I also use a monitor at 2560x1600, but can't say I've encountered anything like what you're referring to.

Have you tried different drivers in regards to your "window droppings" issue?
 
OP, just so you know for the future: what you want to see regarding DVI is either DVI-I or DVI-A.

DVI-D = digital only; it won't transmit VGA signals (it physically lacks the analog pins).


You can check that out on the website of the manufacturer of the card of your choice.


The deinterlacing issue is most surely a problem with what you configured, so here I will share this from a different forum that worked for me:

"I consider 7 things in ATI CCC are a must for best PQ that re-creates displayed image as close as possible to the source image, as the creators intended them to be.

IN ATI CCC
1. disable Edge-enhancement
2. disable De-noise
3. disable 'Enforce Smooth Video Playback'
4. disable automatic de-interlacing, and set the slider bar to vector adaptive, confirm vector adaptive de-interlacing using 1080i cheese slices stress test.
5. set underscan to 0%
6. drive your panel/projector at native resolution
7. match output refresh rate to input source material frame rate ( as long as panel/projector supports the refresh rate )"
 
^^slider bar to vector adaptive

This


Having had both an HD 6950 2GB and now a GTX 580, I really cannot say I have had any issues with one or the other... (I'm an idiot, so I bought a GTX 570 and an unlocked HD 6950 2GB, lol, then stepped up the GTX 570 to a 580 :rolleyes:)

I have a U2711 at 2560x1440, and:

1) I had an ASUS OEM AVerMedia ATSC OTA tuner card previously (it was PCI and now I only have PCIe slots). No issues with deinterlacing on either GPU. I never watched live, however; mostly DVR'd news (the only stuff that seemed to be in HD back then) and watched that using MPC-HC.

2) Now I have a Hauppauge Colossus for recording my Xbox 360, laptop (long story), and other sources. No issues here either, but I record in 720p because the 1080i source just isn't good for me... :(

3) AMD GPUs seem to do funny things with the login screen for me, but I've never cared about it, lol. Once I got to the point where I had to input a password, all was fine, so...?

4) I can see the gradients (color banding) on my U2711 in the Windows 7 stock wallpaper (upper left quadrant, so you don't have to search all 30" for it :p) on my GTX580. I never noticed it before, but one day I was using my laptop, noticed the banding, then looked at my desktop. I didn't notice it at first, but gradually it became more obvious :( I dunno if MS's stock wallpaper is just low color quality or something. So YMMV.

5) I dunno if it matters, but I can help with any requests. I have no clue how to go about the 30-bit vs. 24-bit thing, since I remember my monitor is only 10-bit (!? or is it per color?), and only certain programs can even do 30-bit output (dear me, I'm now even more confused about this monitor stuff....)
 
What low-resolution modes are you encountering other than boot-up? And is the resolution during the few short moments of boot-up really a serious consideration? I also use a monitor at 2560x1600, but can't say I've encountered anything like what you're referring to.
Low-resolution modes after boot-up are not a problem, because I can disable GPU scaling. If I leave it enabled, low-resolution modes (in old or retro games) get the same ugly scaling: first smoothly up to 1280x800, then as blocky pixels up to 2560x1600.

I'm annoyed at having my bootup and BIOS look ugly, and since I don't run my computer 24/7, I see the bootup screens enough that it's a constant reminder of my video card's incompetence. This issue alone wouldn't kill the card for me, but combined with the other issues, it does.

Have you tried different drivers in regards to your "window droppings" issue?
No, only the latest AMD driver (8.881.0.0, 2011-07-28). Are there any particular alternatives you'd recommend?

The window droppings might happen more often in hot weather. Lately the weather's been cold and the droppings don't happen often (I have to keep dragging and dragging until finally some droppings appear, making it hard to test different drivers). It was warmer when I first got the card and I had lots of droppings.

OP, just so you know for the future: what you want to see regarding DVI is either DVI-I or DVI-A.

DVI-D = digital only; it won't transmit VGA signals (it physically lacks the analog pins).


You can check that out on the website of the manufacturer of the card of your choice.
Thanks, but I did figure that out before posting. The specifications page I looked at for the PowerColor AX6950 only said that one output was Dual Link and the other was Single Link, and it just didn't occur to me to dig deeper; I assumed the manufacturer wouldn't be silly enough to put Dual Link and analog output on the same plug. It seems to me that the electronics would be just as simple with DVI-D Dual Link on one output and DVI-I Single Link on the other, and that would lose no flexibility compared to what they in fact did (DVI-I Dual Link on one output and DVI-D Single Link on the other).

The deinterlacing issue is most surely a problem with what you configured, so here I will share this from a different forum that worked for me:

"I consider 7 things in ATI CCC are a must for best PQ that re-creates displayed image as close as possible to the source image, as the creators intended them to be.

IN ATI CCC
1. disable Edge-enhancement
2. disable De-noise
3. disable 'Enforce Smooth Video Playback'
4. disable automatic de-interlacing, and set the slider bar to vector adaptive, confirm vector adaptive de-interlacing using 1080i cheese slices stress test.
5. set underscan to 0%
6. drive your panel/projector at native resolution
7. match output refresh rate to input source material frame rate ( as long as panel/projector supports the refresh rate )"

Thanks. I've actually already tried all of that, except that disabling "Enforce Smooth Video Playback" actually results in tearing, so that's not an option. (Also I see no option for underscan, and there is no underscan or overscan.) I don't notice any difference between "Adaptive", "Motion adaptive" and "Vector adaptive"; all of them appear to be equally bad (resulting in an unsmoothed, "line-doubled" look) and are only successful in Weaving completely still images; anything moving seems to confuse them.

BTW, I use a self-modified version of GraphStudio to watch TV through the Blackmagic Intensity Pro. (My modification is to let the Enhanced Video Renderer be maximized to full-screen.) I tried using Media Player Classic's "Open Device" feature and I can't get it to do the double-framerate deinterlacing properly. (I probably tried other players as well, but it's been a while and I don't remember. MPC has been my player of choice for a while.)

1) I have an ASUS OEM Avermedia ATSC OTA tuner card previously (it was a PCI and now I only have PCIe slots). No issues with deinterlacing in either one. I never watched live, however. Mostly DVR'd news (the only stuff that seemed to be in HD back then) and watched that using MPC-HC.
I have no choice but to watch live, because I'm watching the output from my cable box, and the signal it receives on most channels is encrypted. It also means that any latency in getting that signal onto my monitor gives me lag in controlling my cable box.

4) I can see the gradients (color banding) on my U2711 in the Windows 7 stock wallpaper (upper left quadrant, so you don't have to search all 30" for it :p) on my GTX580. I never noticed it before, but one day I was using my laptop, noticed the banding, then looked at my desktop. I didn't notice it at first, but gradually it became more obvious :( I dunno if MS's stock wallpaper is just low color quality or something. So YMMV.
Well that may just be a flaw in that wallpaper. Try this gradient.

5) I dunno if it matters, but I can help with any requests. I have no clue how to go about the 30-bit vs. 24-bit thing, since I remember my monitor is only 10-bit (!? or is it per color?), and only certain programs can even do 30-bit output (dear me, I'm now even more confused about this monitor stuff....)
If you're using the DisplayPort input, then yes, you'd be getting 30bit (10bit per channel) color.
 
Only the DP port? Does the DL-DVI port do the same? Because while I have a DP cable, my GTX580 doesn't have a DP port... :(

In that test image, my GTX580 + U2711 combo shows a fair amount of banding in IE9 and the Windows Photo Viewer (which, as I understand it, are color-managed, as MS applications tend to be). Strangely, my Sony SC with a mediocre 13.3" display looks better than that on the Intel GMA HD3000 IGP (i.e. less obvious banding).

And I'm sure I know what color banding is, too :(

Is there any other way to test this? I don't have an AMD GPU right now with a DP port. I'm sorry.

EDIT: to be clear, the bands are *smaller* on the Sony SC :eek:



EDIT2: After a bit of looking around, it seems my U2711 is an 8-bit + FRC panel for 10-bit (per color), and it only works over DP. So I'm sorry, I'm out of this test. I'll let you know in the future if you are ever interested.
 
jeremyshaw, could you try disabling color management, or using a photo viewer that does not do color management? You can also test whether color management is being done: take a screenshot of the gradient being viewed, and compare the screenshot against the original gradient. If they're different, color management is definitely being done.
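
A quick way to do that comparison, assuming Pillow is available and the screenshot has been cropped to exactly the same size as the original (the file names are just examples):

```python
from PIL import Image, ImageChops

original = Image.open("gradient_test.png").convert("RGB")
screenshot = Image.open("gradient_screenshot.png").convert("RGB")  # crop to match first

diff = ImageChops.difference(original, screenshot)
bbox = diff.getbbox()   # None means the images are pixel-identical
if bbox is None:
    print("Identical: no color management (or an identity transform) is applied.")
else:
    print("Images differ inside region", bbox, "- something is altering the pixels.")
```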
 
In addition to the two edits above, I did try again (over DL-DVI, sorry!) in Opera, FF, Chrome, and Apple QuickTime. I also moved between the different picture modes (sRGB, Adobe RGB, "Standard") to no avail.

I now know DP is required for deep color (30bit) support on my panel, and I don't have any DP outputting devices anymore :(

I'm sorry!
 
Bump. I don't know why the main Video Card forum doesn't allow you to look more than 1 page into the past. Makes no sense.

I'll be receiving my RMA-replacement GeForce 8800 GTS 512 tomorrow, and that will free me up to get rid of my Radeon HD 6950. I'd really appreciate it if someone would answer any of these questions:

1) Do any GeForce cards output simulated 30-bit color over DVI, like the Radeon card I'm about to return? Do all recent Radeon cards do this?

2) Do any GeForce or Radeon cards have adaptive deinterlacing that actually works?
 
Bump. I don't know why the main Video Card forum doesn't allow you to look more than 1 page into the past. Makes no sense.

It's at the bottom: view pages from 'last day' is selected by default, but you can change it to go much further back.

I can't comment on the interlacing issue, but as far as I'm aware the dual-link/analog-only on a single DVI port is limited to the HD 6900 series. I don't think the 6800s do it, but I can't confirm that. AMD are really pushing for DisplayPort, and they don't anticipate anyone using an expensive top-end 30" monitor to also be using the archaic VGA interface for something else. As infuriating as it is, it makes sense. I think you might be able to use an HDMI-to-VGA adapter, though?
 
You can change the date range at the bottom, ;)
Oooops :eek: Now that you mention it, I remember making that adjustment a long time ago, but I'd forgotten that it was specific to each individual forum.

as far as I'm aware the dual-link/analog-only on a single DVI port is limited to the HD 6900 series. I don't think the 6800s do it, but I can't confirm that.
Hmmm, interesting.

AMD are really pushing for DisplayPort, and they don't anticipate anyone using an expensive top-end 30" monitor to also be using the archaic VGA interface for something else. As infuriating as it is, it makes sense. I think you might be able to use an HDMI-to-VGA adapter, though?
I might be preaching to the choir, but I like to use a CRT side-by-side with an LCD because the CRT can do things no LCD monitor can, like perfect black levels, zero motion blur, and huge flexibility in refresh rates. And of course an LCD can do things no CRT can, like perfect sharpness and geometry.

I'm concerned that an HDMI-to-VGA adapter (or DVI-to-VGA, or DP-to-VGA) might give inferior quality, and might not have a 400 MHz DAC like all modern video cards do. Even if it does, it might not have a 10-bit DAC to allow smooth gamma ramps.
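
Here's a small sketch of why the 10-bit part matters (my own illustration, with an arbitrary gamma value): applying gamma correction to 256 input levels and then re-quantizing to 8 bits merges adjacent levels, which shows up as banding, while a 10-bit output stage has enough headroom to keep all 256 levels distinct.

```python
# Toy illustration: how many distinct output codes survive a gamma-corrected
# ramp when quantized to 8 bits vs. 10 bits. GAMMA = 2.2 is just an example.
GAMMA = 2.2

def gamma_ramp(output_bits):
    max_out = (1 << output_bits) - 1
    return [round(((i / 255) ** (1 / GAMMA)) * max_out) for i in range(256)]

for bits in (8, 10):
    distinct = len(set(gamma_ramp(bits)))
    print(f"{bits}-bit output: {distinct} distinct levels out of 256")
# Expect the 8-bit ramp to merge a noticeable number of adjacent levels
# (visible banding in a smooth gradient), while the 10-bit ramp keeps all 256.
```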

I also have no idea if I can do custom modes over HDMI, such as 2736x1710@60Hz and 1920x1200@96Hz.
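
For those custom modes, here's the rough back-of-the-envelope check I use (my own sketch; the blanking figures are approximate CVT reduced-blanking values of 160 pixels of horizontal blanking and about 460 µs minimum vertical blanking) to see whether a mode's pixel clock would fit under a given link or DAC limit:

```python
# Rough pixel-clock estimate using CVT reduced-blanking-style timings
# (an approximation; real modelines differ slightly).
H_BLANK_PX = 160        # CVT-RB horizontal blanking, pixels
V_BLANK_US = 460        # CVT-RB minimum vertical blanking time, microseconds

def approx_pixel_clock_mhz(h_active, v_active, refresh_hz):
    frame_period_us = 1_000_000 / refresh_hz
    active_period_us = frame_period_us - V_BLANK_US
    v_total = round(v_active * frame_period_us / active_period_us)
    h_total = h_active + H_BLANK_PX
    return h_total * v_total * refresh_hz / 1_000_000

for mode in ((2736, 1710, 60), (1920, 1200, 96), (2560, 1600, 60)):
    print(mode, f"~{approx_pixel_clock_mhz(*mode):.0f} MHz")
# 2560x1600@60 should come out around the familiar ~268 MHz dual-link figure;
# anything much over ~400 MHz would exceed a typical VGA DAC.
```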

And even if all these conditions are met, it'd still add considerable expense — well over $100 for a good one, I would imagine. I'd rather use that money towards getting a better video card.

P.S. It's also important to me to have a DisplayPort 1.2 output, for when an affordable successor to today's 2560x1600@60Hz monitors (higher resolution and/or higher refresh rate) finally arrives.

Edit: Received my RMA replacement from EVGA. It's a GeForce GTS 450. :) And it upsamples low-res modes properly (to 2560x1600), its deinterlacing actually works (most of the time), and it lets me use Dual-Link DVI and VGA at the same time through its two DVI-I outputs. Of course it's nowhere near as fast as the Radeon HD 6950 at 3D. Oh, and it does not output 30-bit color over DVI; I see banding in a gradient when using gamma correction (on the 3007WFP-HC, not the GDM-FW900).
 