Integer scaling, anybody tried it?

I did a side-by-side comparison between a native 720p LCD TV and a native 1440p LCD PC monitor, both 32" diagonal. I tested a couple of games, both 2D and 3D.

For 2D, I tested Braid, an old 2D game with a fixed 720p resolution. It looked a bit better on the native 720p TV. With integer scaling on the 1440p monitor the image was sharp, but at the same time the pixels were jagged, most probably an effect of each pixel being doubled to fill the 2560x1440 resolution.
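To illustrate the doubling (my rough mental model, not necessarily how the driver implements it): integer scaling is plain nearest-neighbour replication by a whole-number factor, so every 720p pixel lands on an exact 2x2 block of the 2560x1440 grid, with no filtering between blocks. A minimal numpy sketch:

```python
# Minimal sketch of integer scaling: each source pixel is replicated into an
# N x N block, with no filtering between blocks (illustrative, not the driver code).
import numpy as np

def integer_scale(frame: np.ndarray, factor: int) -> np.ndarray:
    """Nearest-neighbour upscale by an integer factor: (H, W, C) -> (H*f, W*f, C)."""
    return frame.repeat(factor, axis=0).repeat(factor, axis=1)

# A 1280x720 frame doubled to 2560x1440: every pixel becomes a hard-edged 2x2 block,
# which is why edges stay sharp but also exactly as jagged as the source.
src = np.zeros((720, 1280, 3), dtype=np.uint8)
dst = integer_scale(src, 2)
print(dst.shape)  # (1440, 2560, 3)
```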

For 3D, I tested Quake Champions, and again 720p looks worse than on the native 720p screen. Not by much, but there is a difference: it's more jagged... sharp, not blurry, but no AA setting can alleviate the jaggies. Crucially, 1080p upscaled normally to 1440p looks way better, especially with some sharpening. I haven't had the chance to test a 4K panel, but I bet 1440p upscaled normally to 4K plus sharpening looks better than 1080p integer scaled to 4K. Kinda like consoles do it.

One thing I've noticed, for 2D games at least: you can achieve similar results with just normal upscaling and tweaked ReShade sharpening filters, depending on the native resolution of the panel (i.e. on a 1080p panel you'll need different ReShade settings than on a 1440p panel). But it's tricky to get right...

This makes sense. Integer scaling is way overrated. It's mainly useful for pixel art, and most pixel-art games and emulators already have integer scaling options built in.
 
Even for pixel art, integer scaling depends on the source. For example, in the case of the Genesis/Mega Drive a pixel-perfect image is actually wrong. Quite a few games were built around composite video, because it blurs dithering together, creating more colors than the Genesis is actually capable of. It creates transparency effects too (the waterfalls and the shield bubble in Sonic 1, for example). On RGB or pixel-perfect emulators, wherever there is a part that should be transparent or have a smooth color gradient, you see dithering patterns everywhere.
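A toy illustration of that blending (not a real NTSC/composite filter, just the idea): averaging each dithered column with its neighbour produces a colour the console never actually outputs, which is roughly what composite blur does to those checkerboard waterfalls.

```python
# Crude stand-in for composite blur: average each pixel with its horizontal neighbour.
# Real composite shaders are far more involved; this only shows the blending idea.
import numpy as np

def blend_horizontal(frame: np.ndarray) -> np.ndarray:
    """Average each pixel with its right-hand neighbour (edge column left alone)."""
    shifted = np.roll(frame, -1, axis=1)
    shifted[:, -1] = frame[:, -1]  # don't wrap around at the last column
    return ((frame.astype(np.uint16) + shifted.astype(np.uint16)) // 2).astype(np.uint8)

# A 1-pixel checkerboard of blue and white columns (the classic waterfall dither):
row = np.array([[0, 0, 255], [255, 255, 255]] * 4, dtype=np.uint8)[None, :, :]
print(blend_horizontal(row)[0, 0])  # [127 127 255]: a light blue the console never drew
```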
 
This is disappointing. I thought 720p (or 1080p for a 4k screen) was going to look as good as if it was native to the monitor. Is there a technical explanation for why this is the case?

I'm playing Stasis right now, a game that is fixed at 720p, and in order to get a clear image I have to rely on that integer scaling app from Steam. I thought integer scaling would save me the trouble, but I guess not?
 
This is disappointing. I thought 720p (or 1080p for a 4k screen) was going to look as good as if it was native to the monitor. Is there a technical explanation for why this is the case?
Lol, are you joking?

Just in case, it's down to the resolution and the screen size.
1080p will look like 1080p, 720p like 720p; that's the point of it. The TV's/graphics card's interpolation won't be used.
If you want it to look like it used to, get a screen the same size as the one you used to have.
Big screens need a higher res unless you sit so far away that it's pointless to use one.

Edit:
I understand what you were trying to say now: you weren't wondering why it doesn't look as good as the higher native res.
It's mainly down to the size of the screen vs. the resolution.
 
This is disappointing. I thought 720p (or 1080p for a 4k screen) was going to look as good as if it was native to the monitor. Is there a technical explanation for why this is the case?

I'm playing Stasis right now, a game that is fixed at 720p, and in order to get a clear image I have to rely on that integer scaling app from Steam. I thought integer scaling would save me the trouble, but I guess not?
That integer scaling app from Steam does the same thing as Nvidia's integer scaling, AFAIK. Same result anyway: not blurry, much better than regular scaling, but not identical to a native panel of the same diagonal (though the difference is not that big, as I've said). Integer scaling is a good thing to have, but it's most useful for games with a fixed resolution.

As for the technical explanation, I'm guessing it's because of adjacent pixels trying to emulate a single pixel from the original lower-res screen. With integer scaling there won't be half of a pixel trying to emulate a pixel (which is what causes blur), but the space between the pixels will still be there, so I think that's the cause of the extra jagginess compared with the original source. If that's the case, the higher the PPI of a panel, the less visible this effect will be. On my 32" QHD LCD the PPI isn't that high, at just 91, and I sit pretty close.
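For reference, a quick check of those numbers (plain geometry, nothing vendor-specific):

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch from resolution and diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

print(round(ppi(2560, 1440, 32), 1))  # ~91.8, the ~91 PPI figure mentioned above
print(round(ppi(3840, 2160, 27), 1))  # ~163.2 on a 27" 4K panel, where the pixel gaps matter far less
```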
 
Even for pixel art, integer scaling depends on the source. For example, in the case of the Genesis/Mega Drive a pixel-perfect image is actually wrong. Quite a few games were built around composite video, because it blurs dithering together, creating more colors than the Genesis is actually capable of. It creates transparency effects too (the waterfalls and the shield bubble in Sonic 1, for example). On RGB or pixel-perfect emulators, wherever there is a part that should be transparent or have a smooth color gradient, you see dithering patterns everywhere.
16-bit Sega dithering is not something you can hide/mask with bilinear scaling either. Emulators should have shaders to emulate the composite look. The aspect ratio also needs to be corrected for this console and for other old consoles, and that needs to be handled by the emulator.

Thus the whole argument for using integer scaling with emulators flies out the window.

Older PC games sometimes don't support higher resolutions or don't scale their HUDs, and that's one possible use for integer scaling. Another is RTX stuff and/or simply wanting to use a higher-resolution screen with an underpowered GPU. Good luck driving modern games on a 1660 on a 5K screen at native res, let alone at 8K...

For the 1440p crowd, especially those with reasonably powerful GPUs, this feature is a solution without a problem to solve. Still, better to have it available.
 
This is disappointing. I thought 720p (or 1080p for a 4k screen) was going to look as good as if it was native to the monitor. Is there a technical explanation for why this is the case
This is called 'nitpicking', my friend.
E.g. a 1080p 27" screen will look different from a 2160p 27" screen running at 1080p because of the pixel structure and the screen-door effect.

This will mostly affect desktop usage, as it interferes with subpixel font rendering, showing color fringing on fonts instead of the intended higher effective resolution. Some AA techniques might also use subpixels, but other than maybe one type of SMAA I'm not aware of that being the case.

Other than that, subpixels don't fill the whole pixel area and there are three of them per pixel, so a native low-res panel is obviously going to look different from an integer-scaled image on a higher-res panel, which has a more filled-in, better-defined pixel structure. In theory the second case (the integer-scaled image) should be better, yet in practice native res looks better (at least to some people; not everyone will agree) because of psychovisual effects. It's very similar to the effect that makes 'scanline' filters desirable in emulators. Beyond that, if there's no subpixel stuff going on in the game, there's nothing else to it.

And as I mentioned, it's not that native res really looks better; it's actually technically worse, and some people will see it that way and prefer the higher integer-scaled resolution. I personally like both presentations and don't care. Integer scaling makes my 4K monitor also a 1080p display, and maybe even a 720p display for all I care. The rest is subpixel nitpicking :)
 
Well, I finally got my new card, so I tried this thing.

720p on 27" is... pretty ugly. HOWEVER, being free of interpolation really does help. AAA games at this resolution look like pixel art games, instead of the blurry mess that interpolation creates. It's an improvement, but i will say that in effect 720p is just too far away from 1440p. 1080p on a 4k 27" monitor might be the only workable combo here, heh.

I wonder though, is there a way to upscale the image to 1080p? When I had my 1080p 27" monitor, I would use that super resolution thingy to upscale the image from 1080p to 1440p. I wonder if it's possible to do the same from 720p to 1080p. Obviously I can't just set the in-game resolution to 1080p, as that wouldn't upscale anything; it would just change the resolution and add interpolation again. Can the super resolution stuff be applied to resolutions that are below the monitor's native resolution?
 
720p on 27" is... pretty ugly.
As expected. It's integer scaling after all, not a miracle worker :) It has a very good use case though, but we need monitor makers to pay attention to this as well. It makes perfect sense on 23-24" 4K screens, 32" 5K screens, etc. It might also be acceptable on 27" 4K; 32" 4K would be stretching it already (probably still an improvement over blurry 1080p, though I'm not so sure it's better than 1440p with normal scaling).

What we would need now is a 32" IPS/VA 5K 60Hz screen that can also do 120Hz or more at 1440p, but I'm skeptical we will see it :) I would buy it day one for the right price though. HDR, FALD, etc. are still years away from being in a usable state...
 
What we would need now is a 32" IPS/VA 5K 60Hz screen that can also do 120Hz or more at 1440p, but I'm skeptical we will see it :) I would buy it day one for the right price though. HDR, FALD, etc. are still years away from being in a usable state...
When you do this type of GPU-side integer scaling, the GPU outputs the monitor's maximum resolution, so if the monitor were limited to 60Hz by the cable/connection bandwidth, it would still be limited to 60Hz at the lower resolution. We would need the monitor's internal scaler to support integer scaling for that to work.
Actually, there were some 4K TVs which could do 1080p at 120Hz and had integer scaling.
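A rough back-of-the-envelope check of that bandwidth point (uncompressed rate, ignoring blanking overhead and DSC, so treat the numbers as ballpark only):

```python
def gbps(width: int, height: int, hz: int, bits_per_pixel: int = 24) -> float:
    """Approximate uncompressed video data rate in Gbit/s (no blanking, no DSC)."""
    return width * height * hz * bits_per_pixel / 1e9

# GPU-side integer scaling always sends the native signal over the cable,
# so 1440p content integer-scaled to 5K still costs the full 5K@60 rate:
print(round(gbps(5120, 2880, 60), 1))   # ~21.2 Gbit/s at 8-bit colour, before blanking overhead
# A monitor-side scaler could instead accept the low-res signal and refresh faster:
print(round(gbps(2560, 1440, 120), 1))  # ~10.6 Gbit/s
```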

Integer scaling becomes usable for 2160p screens, important for 2880p screens, and an absolute necessity for 4320p screens :)
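That progression is really just about how many common render resolutions divide evenly into the panel height; a quick listing (assuming matching 16:9 widths):

```python
# Which common render resolutions map onto each panel with a whole-number factor
# (vertical check only, assuming matching 16:9 widths).
panels = {"2160p (4K)": 2160, "2880p (5K)": 2880, "4320p (8K)": 4320}
sources = [720, 1080, 1440, 2160]

for name, native in panels.items():
    exact = [f"{src}p x{native // src}" for src in sources
             if native % src == 0 and native // src > 1]
    print(f"{name}: {', '.join(exact)}")
# 2160p (4K): 720p x3, 1080p x2
# 2880p (5K): 720p x4, 1440p x2
# 4320p (8K): 720p x6, 1080p x4, 1440p x3, 2160p x2
```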
 
I wonder though, is there a way to upscale the image to 1080p? When I had my 1080p 27" monitor, I would use that super resolution thingy to upscale the image from 1080p to 1440p. I wonder if it's possible to do the same from 720p to 1080p. Obviously I can't just set the in-game resolution to 1080p, as that wouldn't upscale anything; it would just change the resolution and add interpolation again. Can the super resolution stuff be applied to resolutions that are below the monitor's native resolution?
All you can do is try to force SSAA.
 
What's the best way to force SSAA with AMD cards? The driver has an override option, but it doesn't always seem to work. On Nvidia I remember I had to rely on Nvidia Inspector on a couple of occasions to brute-force AA; I wonder if there's anything like that for AMD as well.

At any rate, I'm glad this function is here. I see they are making 27" 4K 144Hz screens now. I may buy one down the line, and it would probably be the best of both worlds: 1080p for demanding games that benefit from high frame rates, and 4K for those capped at 60Hz or for older ones.
 
Well, you wouldn't gain anything anyhow. If you run the game at 720p, you get better performance at the cost of image quality. If you run at 1080p (native), you get normal quality and performance. If you run at 1440p (via VSR), you trade some performance for better image quality.

I don't believe there is a way to do it, but even if there were, running the display at 720p and downscaling from 1080p would give the same performance as running 1080p native, but with worse picture quality. So it doesn't make sense.

The best thing you can do is enable Virtual Super Resolution in the AMD display settings, enable GPU scaling, enable Radeon Image Sharpening, and then run the game at the highest resolution your machine can handle.

RIS can make even a 720p image look okay. I mean, not stellar, but playable. More importantly, that plus GPU scaling will unlock basically any resolution you want, and the quality is really great.

I just ran Left4Dead at 4K maxed out on my 1080p monitor, and honestly I had to check that the monitor was actually 1080p because it looked so nice. You can do this at 1440p, which is probably the best route, or if you need extra performance try 1600x900. Cheers.
 
When you do this type of GPU-side integer scaling, the GPU outputs the monitor's maximum resolution, so if the monitor were limited to 60Hz by the cable/connection bandwidth, it would still be limited to 60Hz at the lower resolution. We would need the monitor's internal scaler to support integer scaling for that to work.
Yeah, you're right. I hadn't thought about the bandwidth limit, only about the performance needed to push that many pixels in games. Then again, bandwidth might not be a problem anymore, for 5K at least, once GPUs support the latest standards. A monitor-side integer scaler would still be best, but with widespread support from GPU makers we at least have a fallback when that's not the case.
 
If they play this right, this might make 8K viable for gaming. Amazing that they didn't think of it until now and that we had to pester them for years to implement such a simple feature.
 
Mmmh, something is off here.

I've tried 720p in three games now, Witcher 3, Kingdom Come and Arma 3, and... I'm not gaining a whole lot, which I find confusing. In Witcher 3 I gain about 20-30 FPS, which is substantial but not as much as I thought, whereas with both Kingdom Come and Arma 3 I gain about... 5-10 FPS each? How? In all the benchmarks I see, there's a consistent circa 30 FPS difference between 1080p and 1440p in most games, so I would have thought the gains from dropping to 720p would be MASSIVE, whereas I'm not gaining much at all.

Kingdom Come is the worst. At 1440p Ultra I get between 45-70 FPS. At 720p I get between 50-80. Like, what the heck? Is it the CPU that's holding me back? My concern was visual quality; I didn't expect the performance gain to be so small. I'm kind of confused here.
 
What CPU, what GPU?
Also, is the performance the same when you disable GPU scaling altogether?

I haven't tested integer scaling performance vs. no scaling on my RTX 2070 myself; I assumed it would be the same, but in reality who knows how they implemented it...
 
Mmmh, something is off here.

I've tried 720p in three games now, Witcher 3, Kingdom Come and Arma 3, and... I'm not gaining a whole lot, which I find confusing. In Witcher 3 I gain about 20-30 FPS, which is substantial but not as much as I thought, whereas with both Kingdom Come and Arma 3 I gain about... 5-10 FPS each? How? In all the benchmarks I see, there's a consistent circa 30 FPS difference between 1080p and 1440p in most games, so I would have thought the gains from dropping to 720p would be MASSIVE, whereas I'm not gaining much at all.

Kingdom Come is the worst. At 1440p Ultra I get between 45-70 FPS. At 720p I get between 50-80. Like, what the heck? Is it the CPU that's holding me back? My concern was visual quality; I didn't expect the performance gain to be so small. I'm kind of confused here.

I mean, 720p is CPU/RAM-bound territory. Do you have something like a 9900K at 5GHz and DDR4 at 3600+MHz? If not, the gains might not be what you expect. Especially with an AMD CPU you can expect very disappointing results if you have a strong GPU.

And Arma 3 is the worst example you could pick: it's been 99.99% CPU bound for most people since release, with current hardware even more so, because GPUs can handle the graphics so easily (it's the AI and the very detailed draw distance that kill performance). AFAIK Kingdom Come is pretty damn CPU-heavy too; it's one of those recent games where there's still a noticeable gap between Intel and AMD CPUs at 1080p.
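A toy frame-time model shows why (the millisecond figures are made up, purely illustrative): the frame rate is roughly set by whichever of the CPU or GPU takes longer per frame, and only the GPU side shrinks with resolution.

```python
# Toy model: frame rate is limited by whichever of the CPU or GPU takes longer
# per frame, and only the GPU part scales with pixel count. Numbers are hypothetical.
def fps(cpu_ms: float, gpu_ms_at_1440p: float, pixel_fraction: float) -> float:
    gpu_ms = gpu_ms_at_1440p * pixel_fraction   # assume GPU cost ~ pixels rendered
    return 1000.0 / max(cpu_ms, gpu_ms)

cpu_ms = 16.0   # hypothetical CPU-limited frame time (~62 FPS ceiling)
gpu_ms = 18.0   # hypothetical GPU frame time at 1440p

print(round(fps(cpu_ms, gpu_ms, 1.0)))     # 1440p: ~56 FPS (GPU-bound)
print(round(fps(cpu_ms, gpu_ms, 9 / 16)))  # 1080p: ~62 FPS (already at the CPU ceiling)
print(round(fps(cpu_ms, gpu_ms, 1 / 4)))   # 720p:  ~62 FPS (no further gain)
```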
 
Guess I'll upgrade my CPU next time I get the chance, sheesh. I wonder if the Ryzen 4000 series will be compatible with B450 boards, heh.
 
And Arma 3 is the worst example you could pick: it's been 99.99% CPU bound for most people since release

Wow, you ain't kidding. I just checked, and there's a scene in the prologue where the game drops below 40 FPS, and while it's doing that the GPU usage goes from 17% to 35%. Unreal; the only time I've seen something like that was in emulators.
 
Wow, you ain't kidding. I just checked, and there's a scene in the prologue where the game drops below 40 FPS, and while it's doing that the GPU usage goes from 17% to 35%. Unreal; the only time I've seen something like that was in emulators.

On the upside, you can run it with the most demanding AA techniques on virtually any GPU and have it look great :) VRR is pretty much a requirement though; it's an incredibly awful experience on a fixed-refresh display.

Although it could surely be coded better, you'll see very similar behavior in flight simulators and such - my understanding is that some complex calculations just don't lend themselves well to heavy multi-threading - things need to remain perfectly in sync and accurate to avoid small errors that eventually add up to game-breaking bugs. Arma is as much a simulator as it is a game, and they have improved greatly over the years - the previous Arma games actually run far worse than Arma 3 does, even on current hardware, despite looking like ass.

But yeah, use that game for benchmarking CPUs and RAM - never GPUs.
 