Sotiri
Limp Gawd
- Joined: Feb 5, 2011
- Messages: 178
There is a 15% difference between 1920x1200 and 1920x1080.
That is about the difference between a 560 Ti and a 570, or a 6950 and a 6970.
If 1920x1080 is the most common resolution, that is the one that should be used.
Well, I only want to show one or the other, and I'd rather aim high. What I mean by a big difference is how it's represented in the chart. 1920x1080 (2.07 MP) vs. 1920x1200 (2.30 MP) is an 11% difference. The jump from 1920x1200 to 2560x1600 is 78% and really emphasizes the differences; otherwise the chart would look like a big blob of green.
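The pixel-count percentages being debated here can be checked with a quick sketch (the `pixel_diff` helper is my own name, not anything from the thread; the resolutions are the ones quoted above):

```python
def pixel_diff(res_a, res_b):
    """Percent increase in total pixel count going from res_a to res_b."""
    pixels_a = res_a[0] * res_a[1]
    pixels_b = res_b[0] * res_b[1]
    return round(100 * (pixels_b - pixels_a) / pixels_a)

# 1920x1080 -> 1920x1200: ~11%, not 15%
print(pixel_diff((1920, 1080), (1920, 1200)))  # 11

# 1920x1200 -> 2560x1600: ~78%
print(pixel_diff((1920, 1200), (2560, 1600)))  # 78
```

This confirms the 11% and 78% figures: 1920x1080 is 2,073,600 pixels, 1920x1200 is 2,304,000, and 2560x1600 is 4,096,000.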
...In that case, a single 2GB 6950 or 6970 card should be listed as a good performer at the 5040x and 5760x resolutions.
They are perfectly capable of good triple monitor performance in games like: Aliens vs. Predator, Bad Company 2, Borderlands, Call of Duty Black Ops, Dirt 2, F1 2010, Killing Floor, Left 4 Dead 2, Medal of Honor...
Games like Crysis and Metro 2033 may require turning off AA and HDR, but they still look good...
OK, I will certainly add a few green bars at 5040x1050 for the 6970. But what about the order of the cards? Should the sort order stay the same, bumping up the ranking of all the other cards below it? Or maybe I should put the 6970 alongside the 580?
A pair of 560 Ti's or 6870's will add up to 2GB. So do you really think a single 6970 beats a pair of those cards in SLI/CF?