Actually, ATI seems to have buggy drivers for DX10 display modes in general. I can't run games in DX10 mode at all at interlaced resolutions since my set doesn't do 1080p. I'm outputting to a 55" CRT HDTV via component video, and it can only display 1080i, not 1080p. So what happens is, when I go into a game in DX10 mode, any resolution that gets put into a 1920x1080i timing and interlaced is one I can't see at all; that includes 1024x768, 1280x960, 1280x1024 and up. The only selectable resolutions are 1280x720p and below, and since my HDTV downscales 720p to 480p it really looks incredibly shitty.
Reading around, it also seems like people outputting to 1080p sets in DX10 games have to try multiple 1080 resolutions: one will be 1080i 30, one 1080i 24, and one 1080p 60. Even when they select the one that's labeled 1080p, it puts the display into a 1080i mode.
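For anyone wondering why DX10 games list several near-identical 1080 entries in the first place, here's a rough sketch (not pulled from any actual game, just my guess at what they're doing) of how a DX10 title would enumerate modes through DXGI. Interlaced timings only show up if the game asks for them with DXGI_ENUM_MODES_INTERLACED, and the 1080 entries then differ only in refresh rate and scanline ordering, which is probably why picking the "wrong" one lands you in 1080i:

[code]
// Sketch: list the display modes DXGI reports for the primary output,
// including interlaced ones, and print their scanline ordering.
// Assumes Windows with dxgi.lib linked; error handling kept minimal.
#include <dxgi.h>
#include <cstdio>
#include <vector>
#pragma comment(lib, "dxgi.lib")

int main()
{
    IDXGIFactory* factory = nullptr;
    if (FAILED(CreateDXGIFactory(__uuidof(IDXGIFactory), (void**)&factory)))
        return 1;

    IDXGIAdapter* adapter = nullptr;
    IDXGIOutput*  output  = nullptr;
    if (FAILED(factory->EnumAdapters(0, &adapter)) ||
        FAILED(adapter->EnumOutputs(0, &output)))
        return 1;

    // Without DXGI_ENUM_MODES_INTERLACED, only progressive modes come back.
    const DXGI_FORMAT fmt = DXGI_FORMAT_R8G8B8A8_UNORM;
    UINT count = 0;
    output->GetDisplayModeList(fmt, DXGI_ENUM_MODES_INTERLACED, &count, nullptr);
    std::vector<DXGI_MODE_DESC> modes(count);
    output->GetDisplayModeList(fmt, DXGI_ENUM_MODES_INTERLACED, &count, modes.data());

    for (const DXGI_MODE_DESC& m : modes)
    {
        const char* order =
            m.ScanlineOrdering == DXGI_MODE_SCANLINE_ORDER_PROGRESSIVE       ? "progressive" :
            m.ScanlineOrdering == DXGI_MODE_SCANLINE_ORDER_UPPER_FIELD_FIRST ? "interlaced (upper field first)" :
            m.ScanlineOrdering == DXGI_MODE_SCANLINE_ORDER_LOWER_FIELD_FIRST ? "interlaced (lower field first)" :
            "unspecified";
        printf("%ux%u @ %u/%u Hz  %s\n",
               m.Width, m.Height,
               m.RefreshRate.Numerator, m.RefreshRate.Denominator, order);
    }

    output->Release();
    adapter->Release();
    factory->Release();
    return 0;
}
[/code]

If a game just shows the Width x Height and refresh number from each entry without the scanline ordering, you'd get three "1920x1080" choices that all look the same in the menu, which matches what people are reporting.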
Here is one of the many threads I've posted about my side of the issue:
http://www.hardforum.com/showthread.php?t=1331756
Honestly, I've been tempted to get a 1080p LCD or plasma set. My CRT has great color and no motion blur, but with its age I know the finer details aren't as good as they used to be, so I'm really torn lol