5870: low resolutions and interpolation?

mashakos

OK, hi everyone.

Just got a 5870 after three years with NVIDIA. EDIT: come to think of it, I fled from ATI back in 2003; now I'm reminded why...

Using Catalyst 10.2 and a Dell 3007WFP.

I am facing a very peculiar issue.


Performance is great compared to my old GTX 280, BUT...
If I run a game or the desktop at any resolution other than my screen's native 2560x1600, I get massive aliasing/pixelation.

Here's how Crysis Warhead looks on my system at 1680x1050:
big pic linked

Here's how it should look:
big pic no 2, the way it's meant to be played :(

Any ideas? The same issue is evident during BIOS POST or in DOS: huge pixelation everywhere.
I've enabled GPU scaling and checked "maintain aspect ratio". The issue is that the card does not interpolate lower resolutions on my monitor, something NVIDIA cards did out of the box, which is why I'd never run into this problem before.
 
Huh...

You want the game to run in the center of the screen with black bars around it?

If that's what you mean, then choose one of the options that don't scale...
 
I don't like 1:1 scaling. I actually want lower resolutions to scale with interpolation. I got the drivers to scale lower resolutions to fill the screen with the correct aspect ratio, but they don't get interpolated. So 1680x1050 now looks like 1280x720 on my screen. On a huge screen that's pretty bad :(
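
(For reference, "maintain aspect ratio" just scales the source up until one axis hits the panel edge and centers it with black bars on the other axis. A rough sketch of the arithmetic in Python; the function name and numbers are only illustrative:)

[code]
def fit_with_aspect(src_w, src_h, panel_w, panel_h):
    """Scale a source mode onto a panel while preserving aspect ratio.
    Returns the scaled size and the black-bar thickness per side on each axis."""
    scale = min(panel_w / src_w, panel_h / src_h)
    out_w, out_h = round(src_w * scale), round(src_h * scale)
    bar_x = (panel_w - out_w) // 2   # pillarbox width, per side
    bar_y = (panel_h - out_h) // 2   # letterbox height, per side
    return out_w, out_h, bar_x, bar_y

# 1680x1050 is already 16:10, so it fills the 3007WFP completely:
print(fit_with_aspect(1680, 1050, 2560, 1600))   # (2560, 1600, 0, 0)
# 800x600 is 4:3, so it gets pillarboxed:
print(fit_with_aspect(800, 600, 2560, 1600))     # (2133, 1600, 213, 0)
[/code]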
 
Does the 5870 have a built-in hardware scaler? Asking because the Dell 3007WFP doesn't have any scaling hardware, so it would really suck if ATI didn't put a scaler in the 5870.
 
I don't like 1:1 scaling. I actually want lower resolutions to scale with interpolation. I got the drivers to scale lower resolutions to fill the screen with the correct aspect ratio, but they don't get interpolated. So 1680x1050 now looks like 1280x720 on my screen. On a huge screen that's pretty bad :(

Huh...

I still don't quite get it.

Are you expecting it to look pretty while running the desktop at a high native resolution and the application at a low resolution?
 
Huh...

I still don't quite get it.

Are you expecting it to look pretty while running the desktop at a high native resolution and the application at a low resolution?
Check the screenshots I linked in the OP; they're better than an explanation. The top one shows ATI scaling, the bottom one shows NVIDIA scaling.
I refuse to believe this isn't standard on ATI, considering I bought the card yesterday! It's like my worst fears about switching came true on day one!
 
It's really worrying that the few threads discussing this topic end with no solution.

Would the 5870 Eyefinity edition support full GPU scaling?
 
Scaling outside native resolution never looks good. I have the HC variant of that monitor and happen to know it effectively has two native resolutions: at 1280x800 it bunches pixels together into 2x2 blocks to form one larger pixel, so it doesn't need to stretch or scale at all. If you shoot for that option and apply a load of AA/AF, I'm guessing it's going to look better.

I can't really say one pic is significantly worse than the other. There is a slight difference, but they're both so awful to begin with that I wouldn't say the delta between the two is worth worrying about. If anything, it seems like the NVIDIA one is more blurred... what sort of AA are you running? It might be worth switching between in-game AA and control panel AA to see if there's any difference.

It's never going to look good, though. Honestly, I think 1280x800 with more AA is a better bet; the pixel density on the 30" panels is really high, so a quad of four pixels isn't too bad. Again, AA will help, and you're going to get a much crisper picture.
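
(If it helps to picture that 2x2 bunching, here's a minimal sketch of integer pixel doubling, assuming numpy is available; it's only an illustration of the idea, not what the monitor's electronics literally run.)

[code]
import numpy as np

def pixel_double(frame: np.ndarray, factor: int = 2) -> np.ndarray:
    """Nearest-neighbour integer scaling: every source pixel becomes a
    factor x factor block of identical pixels. No blending, so edges stay
    perfectly sharp (and perfectly blocky)."""
    return np.repeat(np.repeat(frame, factor, axis=0), factor, axis=1)

# A 1280x800 frame (height x width x RGB) becomes exactly 2560x1600.
frame = np.zeros((800, 1280, 3), dtype=np.uint8)
print(pixel_double(frame).shape)   # (1600, 2560, 3)
[/code]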
 
There's no AA in either pic. The GTX 280 automatically blurs the screen; obviously I can't show that from a screen grab, so I used Photoshop to recreate it. The actual effect looks a lot better, at no cost to framerate, on the NVIDIA card. 1280x800 looks horrible on the 3007WFP, but I haven't tried that resolution with AA enabled.
The thing is, the lack of interpolation on the 5870 looks bad in other areas too: the BIOS, the Linux command line, emergency disk software, etc.

I don't understand why there is no interpolation on a seemingly high-end card.
 
There's no AA in either pic. The GTX 280 automatically blurs the screen; obviously I can't show that from a screen grab, so I used Photoshop to recreate it. The actual effect looks a lot better, at no cost to framerate, on the NVIDIA card. 1280x800 looks horrible on the 3007WFP, but I haven't tried that resolution with AA enabled.
The thing is, the lack of interpolation on the 5870 looks bad in other areas too: the BIOS, the Linux command line, emergency disk software, etc.

I don't understand why there is no interpolation on a seemingly high-end card.

If possible, I would like a screenshot showing the GTX 280 actually blurring the screen.

I have never seen that before... :confused:
 
Y'know, I have the same issue with my 4870 now that I've tried it... During POST it looks like all my fonts are old system-style fonts. I'd never tried gaming outside my native resolution before, and it looks awful.
 
I don't think the blurring on the NVIDIA screenshot makes it look better, to be honest. It might be that whatever method NVIDIA is using trades a crisper image for masking the upscaling... it kind of looks like old AA methods such as Quincunx. There isn't any truly good way to do it anyway; it's always going to be inaccurate and just look bad.

I'd concentrate on shooting for 1280 with AA or 2560 with lower settings.
 
Looks like something is running at half resolution.

Double check the settings inside of Crysis video settings and ensure that there is no option set as such.
 
Looks like something is running at half resolution.

Double check the settings inside of Crysis video settings and ensure that there is no option set as such.

resolution is set at 1280x800, FSAA at 8x
 
There is obviously interpolation going on here, though with less bicubic 'strength' than NVIDIA uses (which is apparently smoother). There are various levels of resampling 'strength' as the algorithm is tweaked, similar to the options for bicubic resampling in Photoshop.

In Crysis, turn up edge-detect AA.
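
(If you want to eyeball the difference between blocky and interpolated upscaling on one of your own screenshots, here's a quick sketch using Pillow; the filename is a placeholder, and NEAREST/BICUBIC only approximate what the two cards appear to be doing.)

[code]
from PIL import Image

# Take a low-resolution screenshot and blow it up to the panel size two ways.
src = Image.open("warhead_1280x800.png")    # placeholder filename
panel = (2560, 1600)

blocky = src.resize(panel, Image.NEAREST)   # hard pixel edges, like the un-interpolated result
smooth = src.resize(panel, Image.BICUBIC)   # interpolated, closer to the blurred look

blocky.save("scaled_nearest.png")
smooth.save("scaled_bicubic.png")
[/code]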
 
There is obviously interpolation going on here, though with less bicubic 'strength' than NVIDIA uses (which is apparently smoother). There are various levels of resampling 'strength' as the algorithm is tweaked, similar to the options for bicubic resampling in Photoshop.

In Crysis, turn up edge-detect AA.

There are tons of legacy apps and games that were not designed to run at 2560x1600; that's the real problem for me.
 
Honestly, both pictures in the OP look equally bad. That said, you should just game at native res. A 5870 can drive 2560x1600 just fine. And when it no longer can, get a new video card. You spent $1,000+ on a monitor, why would you want to run it at anything other than native resolution? If you are going to game at low resolutions but still want a big display, use a TV.
 
Honestly, both pictures in the OP look equally bad. That said, you should just game at native res. A 5870 can drive 2560x1600 just fine. And when it no longer can, get a new video card. You spent $1,000+ on a monitor, why would you want to run it at anything other than native resolution? If you are going to game at low resolutions but still want a big display, use a TV.

Why isn't this more widely known? Oh right, there are probably twenty PC users with 30" monitors who are interested in gaming and other things at the same time...

I can't believe I have to get a GTX 480 over this little issue. If ANYONE knows of a hack or firmware patch that can modify ATI's GPU scaling, please post!
 
Ok, did a bit of testing:

1920x1200 to 1600x1200: no pixelation
1680x1050 to 800x600: pixelation

custom resolution:
1792x1120: no pixelation BUT performance is exactly like 1920x1200 in games
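
(Out of curiosity, here's a quick way to see what scale factor each of those modes implies on a 2560x1600 panel. It's just back-of-the-envelope Python, nothing from ATI, and I can't say it fully explains where the pixelation cutoff is.)

[code]
from fractions import Fraction

PANEL_W, PANEL_H = 2560, 1600

def scale_factors(w, h):
    """Exact horizontal and vertical scale factors from a mode up to the panel."""
    return Fraction(PANEL_W, w), Fraction(PANEL_H, h)

for mode in [(1920, 1200), (1600, 1200), (1792, 1120), (1680, 1050), (1280, 800), (800, 600)]:
    print(mode, scale_factors(*mode))
# 1920x1200 scales by 4/3 on both axes, 1792x1120 by 10/7, and 1280x800 by exactly 2;
# 1680x1050 works out to 32/21, 1600x1200 to 8/5 by 4/3, and 800x600 to 16/5 by 8/3.
[/code]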
 
I just did the testing on an NVIDIA card, but I don't have a high-resolution monitor with me right now.

So I can only do 1680x1050, and at 1280x768 it's all pixelated...

Exactly the same as your screenshot.

PS: how do you take a picture without taking it at the game's resolution?
 
Hmm... all I can think of is: why the heck would someone want to game at a resolution that isn't their LCD's native? Running any LCD panel at a non-native resolution makes it look like fuzzy crap.

Buy another 5870 and run CrossFire,
or
buy a lower-resolution LCD.


This thread makes my head hurt.
 
Why isn't this more widely known? Oh right, there are probably twenty PC users with 30" monitors who are interested in gaming and other things at the same time...

I can't believe I have to get a GTX 480 over this little issue. If ANYONE knows of a hack or firmware patch that can modify ATI's GPU scaling, please post!

And those people all game at or as close to native resolution as they can with powerful hardware. You have the hardware, you just don't seem to want to use it. God knows why.

Ok, did a bit of testing:

1920x1200 to 1600x1200: no pixelation
1680x1050 to 800x600: pixelation

custom resolution:
1792x1120: no pixelation BUT performance is exactly like 1920x1200 in games

So lower resolutions blown up to higher resolutions as fast as possible results in poor filtering applied to the scaled image and a pixelated display? Wow, I would never have guessed that one :rolleyes:

Seriously, it sounds like you're making mountains out of molehills. You bought a $400 GPU - USE IT. Game at 2560x1600 and enjoy the high IQ. My 5870 is driving 5760x1200 - yours can certainly drive 2560x1600 (and 1920x1200 is going to be a cakewalk for the 5870).
 
I just did the testing on an NVIDIA card, but I don't have a high-resolution monitor with me right now.

So I can only do 1680x1050, and at 1280x768 it's all pixelated...

Exactly the same as your screenshot.

PS: how do you take a picture without taking it at the game's resolution?

Your small monitor has its own scaling; you must enable NVIDIA scaling to notice the difference...

So if you're an ATI user, the best advice for any problem is "live with it".

BTW, I don't play recent games at low resolutions. I do run a lot of DOSBox, WinUAE, and a bunch of classic games as well, so excuse me if I find this important.

EDIT: Performance-wise, the 5870 is not a miracle. It only has a 10 fps advantage over the GTX 280. In Dirt 2 with DX11 enabled and everything on the highest settings, I'm getting 30 fps at 1920x1200. Hence I think lower resolutions will be necessary even in current games if the highest settings are to be enabled.
 
The advice to ALL users should be to stick to native resolution on LCDs for the best image quality possible.

There is no one perfect way to scale, since it's all best guess. I don't personally think NVIDIA's looks better; maybe it masks the upscaling better, but it looks blurred because of it. The exact method used to upscale is always going to differ between brands simply because there is no one best way to do it; there is always going to be some kind of trade-off.

It's no different from AA and AF between brands: they all use their own methods, none of which is usually better than the others; each has its own benefits and drawbacks.

If it's so important to you, shouldn't you have researched it before buying the card?
 
I made the mistake of assuming that ATI covers as many features as NVIDIA does...
I'm not talking about gaming performance; that, of course, is where ATI is concentrating.
 
Sigh... :rolleyes:

You're not paying attention: they both support upscaling, they just do it in different ways.
 
I don't play recent games at low resolutions. I do run a lot of DOSBox, WinUAE, and a bunch of classic games as well, so excuse me if I find this important.
I'm also a big fan of "legacy" games.
If they don't support my native resolution (1680x1050), I just let them run at their maximum (lower) setting with no scaling. That way they'll still cover the same physical area as they would on a smaller screen with the same resolution, which is good enough for me.
(With this in mind I keep a keen eye on the pixels-per-centimetre value when I buy a new monitor; for me the ideal is about 28 PPC native.)
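
(If you want to work that figure out for a monitor you're considering, the arithmetic is simple. A rough sketch; the panel sizes below are just examples.)

[code]
import math

def pixels_per_cm(width_px, height_px, diagonal_inches):
    """Pixel density from the native resolution and the diagonal size."""
    diagonal_px = math.hypot(width_px, height_px)
    ppi = diagonal_px / diagonal_inches
    return ppi / 2.54                                # 2.54 cm per inch

print(round(pixels_per_cm(2560, 1600, 30), 1))       # Dell 3007WFP: ~39.6 px/cm
print(round(pixels_per_cm(1680, 1050, 22), 1))       # a typical 22" panel: ~35.5 px/cm
[/code]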

Cheers
Olle
 
Scaling at non-native resolutions is not a priority for a high-end video card, and almost nobody cares about it aside from a small minority of gamers. You will rarely see it talked about in reviews or forums. It's just not an important thing to most people, while it is important to you.

Unfortunately, your monitor doesn't have a built-in hardware scaler of its own, which exacerbates the issue. Is there a sharpness control on your monitor? I have often used the soften/sharpness control built into a monitor to achieve the same effect. Play with the built-in AA features of the 5870: try box, narrow-tent, wide-tent, and edge-detect at all the AA levels, and enable Temporal Antialiasing at 2x/3x (you won't find this in the Catalyst Control Center; get ATI Tray Tools to access Temporal AA). I remember I did this in Mass Effect and ended up with a much smoother, softer image overall. Also, remember that Crysis is designed not to use AA and AF. When AA is on, the EdgeAA that Crysis uses is disabled. When AF is on, the bump-mapping technique Crysis uses is disabled as well.

ATI's interpolation and rasterization engine for scaling non-native resolutions is obviously different from NVIDIA's, with ATI probably being more precise and sharper and NVIDIA going for more blur. If this incidental and non-priority difference in scaling methods is so important to you, then sell the 5870 and buy an NVIDIA card. I stayed away from ATI for a long time for similar reasons, mine being the lack of Digital Vibrance on ATI, which NVIDIA had. Most people dismissed my complaints, but ATI finally included Avivo color.
 
Regarding what I said earlier:
Performance-wise, the 5870 is not a miracle. It only has a 10 fps advantage over the GTX 280. In Dirt 2 with DX11 enabled and everything on the highest settings, I'm getting 30 fps at 1920x1200. Hence I think lower resolutions will be necessary even in current games if the highest settings are to be enabled.

After installing the 1.1 patch for Dirt 2 and setting Catalyst to Optimal Performance, the framerate shot up to 60 fps. The optimisations at Optimal Performance settings do degrade image quality in some games (most pronounced in Lost Planet), but I honestly could not notice any degradation in image quality in Dirt 2.

Now I can see what all the fuss is about! Playing on ultra settings at 1920x1200 with 4x AA and getting a smooth 60 fps was something.

Still, the scaling thing is a deal-breaker. I have managed to convince the shop I got the card from to let me swap it for a GTX 480 when it arrives here. Problem is, I heard a rumour that the GTX 480 might shoot up to $680 in my city due to stock shortages.

I would definitely keep the 5870 then. NVIDIA's GPU scaling is a big deal for me, but paying a $280 premium for it?? noThankYou!(tm)
 
I still don't get why you don't just play at native resolution...

You should try installing the 10.3b preview drivers from the ATI underground site. In Dirt 2 at 5760x1600 with everything maxed and 2x AA, my 5870 averages about 35 FPS. Oh, and that is with CCC set to "Optimal Quality" (which is really what you should set yours to as well; Optimal Performance is going to be ugly).

But, for what it's worth, older pixel-graphics games like StarCraft and Diablo will look *better* with ATI's upscaling method than NVIDIA's. So neither one is truly "better" than the other. NVIDIA will look better with some games, ATI will look better with others. Specifically, ATI will look better with the class of games that typically don't let you change the resolution.

I stayed away from ATI for a long time for similar reasons, mine being the lack of Digital Vibrance on ATI, which NVIDIA had. Most people dismissed my complaints, but ATI finally included Avivo color.

How long ago was that? ATI has let you adjust saturation (which is all DV is, a renamed saturation setting) since at least the 9700 Pro days, which was the first ATI card I used.
 
I still don't get why you don't just play at native resolution...

Just tried it: native (2560x1600), 2x AA and ultra settings, CCC set to Optimal Quality.
Not so good... got a 45 fps average. Not playable for me; my limit is 55 fps before I start noticing frame drops.
I heard the 10.3 drivers have problems with flickering in games or something...?

But, for what it's worth, older pixel-graphics games like StarCraft and Diablo will look *better* with ATI's upscaling method than NVIDIA's.

You mean all blocky and stuff, right? I lived through the PC's VGA days. Believe me, when it comes to upscaling, the blurrier the better; I had my fill of jaggies back in the '90s!
Anyway, with NVIDIA I can always turn off their scaling and revert to the jagged/blocky scaling that is the default on my monitor anyway. With ATI it's jaggies all the way.
 
Just tried it: native (2560x1600), 2x AA and ultra settings, CCC set to Optimal Quality.
Not so good... got a 45 fps average. Not playable for me; my limit is 55 fps before I start noticing frame drops.
I heard the 10.3 drivers have problems with flickering in games or something...?

I'm having no trouble with the 10.3b drivers. I haven't used the 10.3 drivers, just the 10.3b (10.3b is *newer* than 10.3; the 10.3 drivers don't have the performance improvements).
 
Scaling at non-native resolutions is not a priority for a high-end video card, and almost nobody cares about it aside from a small minority of gamers. You will rarely see it talked about in reviews or forums. It's just not an important thing to most people, while it is important to you.

Unfortunately, your monitor doesn't have a built-in hardware scaler of its own, which exacerbates the issue. Is there a sharpness control on your monitor? I have often used the soften/sharpness control built into a monitor to achieve the same effect. Play with the built-in AA features of the 5870: try box, narrow-tent, wide-tent, and edge-detect at all the AA levels, and enable Temporal Antialiasing at 2x/3x (you won't find this in the Catalyst Control Center; get ATI Tray Tools to access Temporal AA). I remember I did this in Mass Effect and ended up with a much smoother, softer image overall. Also, remember that Crysis is designed not to use AA and AF. When AA is on, the EdgeAA that Crysis uses is disabled. When AF is on, the bump-mapping technique Crysis uses is disabled as well.

ATI's interpolation and rasterization engine for scaling non-native resolutions is obviously different from NVIDIA's, with ATI probably being more precise and sharper and NVIDIA going for more blur. If this incidental and non-priority difference in scaling methods is so important to you, then sell the 5870 and buy an NVIDIA card. I stayed away from ATI for a long time for similar reasons, mine being the lack of Digital Vibrance on ATI, which NVIDIA had. Most people dismissed my complaints, but ATI finally included Avivo color.

NVIDIA's Digital Vibrance is just another word for contrast and brightness, nothing more.
ATI by default has a better color scheme. Very noticeable in some games...
 
NVIDIA's Digital Vibrance is just another word for contrast and brightness, nothing more.
ATI by default has a better color scheme. Very noticeable in some games...

No it's not; Digital Vibrance is a levels adjuster coupled with a saturation booster. I have argued this so many times I'm sick of it. That ATI has a better default color scheme has also been said hundreds of times, and I don't agree with that either. I need to crank up the saturation a lot.

How long ago was that? ATI has let you adjust saturation (which is all DV is, a renamed saturation setting) since at least the 9700 Pro days, which was the first ATI card I used.

I owned a 9800 Pro, X800 XT PE, X850 Pro, and X1950, and I don't recall any of them having saturation controls at all. They only had options for hue, gamma, brightness, and contrast.

This is all moot now, because I can almost achieve what I want in CCC by tweaking the brightness and contrast in the color options and then going into the individual display menus to access Avivo color.
 
I used to own a Dell 3007 too, and I know what you are referring to. But the ATI way of scaling is actually more correct. On the Dell 3007, ATI cards do pixel doubling at low resolutions: four panel pixels are used as one. I actually preferred this, as it produced much clearer text.

I hate interpolated, blurry scaling.

I've noticed that on other monitors, though (24" and 27" monitors), ATI cards blur when they scale too. I wish there were a way to force pixel doubling and other scaling methods, so users could choose the scaling method they prefer.
 