Integer scaling, anybody tried it?

So Nvidia has finally implemented this. I hope AMD will follow, but I'm wondering how it is now that it's here, at least for Nvidia users.

For instance, how does a game at 720p look on a 1440p screen? Likewise for a movie at 720p versus 1080p on a 1440p screen or TV.

Can't find any opinions on Google; surely, considering the fuss we users made about this, somebody would have experimented with it by now.
 
It works as expected. I've only tried it in a few games, but the main issue is that 720p is a really low res, so you might still prefer how a higher res upscaled (especially with the Nvidia sharpen filter) looks, because more detail can be resolved just by having more pixels. It's probably far more useful for emulators etc., where a pixel-perfect appearance is nearly always better and there are more resolutions that work with integer scaling.

In the future I think something like 2560x1440 on a 5120x2880 screen would be a pretty good option as I feel going above that resolution moves into the diminishing returns department for increased detail in modern games.
 
I see, so 1080p with DLSS or similar would still look better on a 1440p screen than 720p with integer scaling?

And what about a movie at 1080p versus one at 720p on that same 1440p screen?
 
I see, so 1080p with DLSS or similar would still look better on a 1440p screen than 720p with integer scaling?

And what about a movie at 1080p versus one at 720p on that same 1440p screen?

I would expect it to apply to movies as well: more pixels, more detail, even if upscaled. I haven't tried it, though.
 
Maybe somebody else will chime in. We can make this the first Google search result on this topic, since I haven't even found one, haha.
 
I thought integer scaling was supposed to be used for pixely games like CrossCode?

It's just a way to avoid interpolation when you can have a pixel-perfect subdivision. So 1080p on a 4K screen would look just as good as it would on a native 1080p screen.

At least that's the idea. You basically end up having multiple "native" resolutions as long as you can divide pixels exactly.
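
To put the "divide pixels exactly" part in concrete terms, here's a quick throwaway sketch (plain Python, nothing from the driver) that lists which resolutions scale cleanly to a given panel:

```python
# Any resolution that divides the panel's width and height by the same whole
# number can be shown pixel-perfect; every source pixel becomes an f x f block.
def exact_resolutions(native_w, native_h, max_factor=6):
    out = []
    for f in range(1, max_factor + 1):
        if native_w % f == 0 and native_h % f == 0:
            out.append((native_w // f, native_h // f, f))
    return out

for w, h in [(2560, 1440), (3840, 2160)]:
    print(f"{w}x{h}:")
    for sw, sh, f in exact_resolutions(w, h):
        print(f"  {sw}x{sh}  ({f}x{f} blocks)")
```

On 2560x1440 that gives 1280x720 and 640x360 among others; on 3840x2160 you get 1920x1080, 1280x720, 960x540 and so on.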
 
So Nvidia has finally implemented this. I hope AMD will follow, but I'm wondering how it is now that it's here, at least for Nvidia users.

For instance, how does a game at 720p look on a 1440p screen? Likewise for a movie at 720p versus 1080p on a 1440p screen or TV.

Can't find any opinions on Google; surely, considering the fuss we users made about this, somebody would have experimented with it by now.
Assume you have a 32" display.
Now imagine it is 720p.
You have a 32" 720p display!
 
Assume you have a 32" display.
Now imagine it is 720p.
You have a 32" 720p display!

Well, there are limits of course, but what if you have a 24" 4K screen? How would 720p (say, in a game) look on it? Or a 27" screen, which is a popular size for 1440p; how would a game look at 720p there? There are people who play online games at minimum graphics to get the best performance, so this would add another option for that, since you can get even more frames per second while maintaining the sharpness those games need (whereas interpolation would just make everything look like a blurry blob).

Another question would be: is a movie at 720p better than one at 1080p on a WQHD screen?

I mean, we are talking about limited uses, but I'm still curious, and Nvidia has essentially been strong-armed into implementing this since lots of people have been asking for it forever.
 
Well, I would say it comes down to the quality of upscaling. Most high-end TVs these days have very good upscaling capabilities, so using that might be better than going for integer scaling. My understanding was that integer scaling was mostly a big deal for how pixel games look.
 
My understanding was that integer scaling was mostly a big deal for how pixel games look.

It's because interpolation is particularly horrible there. In fact, anything that blurs the image is bad for pixelated games. I still play Quake 1 in software to this day because the filter OpenGL uses just makes everything look blurry:

https://www.quaddicted.com/engines/software_vs_glquake

Likewise, when I use DOSBox I avoid any scaler that blurs the picture, because the raw pixels always look better to me. Indie games, games on emulators, they all benefit from this.
 
Well, there are limits of course, but what if you have a 24" 4K screen? How would 720p (say, in a game) look on it? Or a 27" screen, which is a popular size for 1440p; how would a game look at 720p there? There are people who play online games at minimum graphics to get the best performance, so this would add another option for that, since you can get even more frames per second while maintaining the sharpness those games need (whereas interpolation would just make everything look like a blurry blob).
You can't integer scale a non-integer-ratio resolution.
It won't use integer scaling in that case.
The key is "integer", no fractions.

Either that, or you can integer scale the desktop, and then non-integer scaling will be applied on top.

Another question would be: is a movie at 720p better than one at 1080p on a WQHD screen?
Depends on the scaling algorithm, but generally no.
 
So I just tried it. It works.

It's definitely pixelated, but it looks higher quality than standard bilinear filtering.

Just keep in mind that your native resolution has to be a multiple of the lower resolution you set.

For example, 1080p on a 4K monitor would work. But 720p on a 1080p monitor does not.
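
To make the "multiple" requirement concrete, here's a quick check of the scale factors (throwaway Python, not anything the driver exposes):

```python
# The scale factor is the ratio between panel and render resolution; if it
# isn't the same whole number on both axes, the driver has to fall back to
# ordinary interpolated scaling instead of integer scaling.
def scale_factor(render, panel):
    fw, fh = panel[0] / render[0], panel[1] / render[1]
    return fw, fh, fw == fh and fw.is_integer()

print(scale_factor((1920, 1080), (3840, 2160)))  # (2.0, 2.0, True)  -> 1080p on 4K works
print(scale_factor((1280, 720),  (1920, 1080)))  # (1.5, 1.5, False) -> 720p on 1080p doesn't
print(scale_factor((1280, 720),  (2560, 1440)))  # (2.0, 2.0, True)  -> 720p on 1440p works
```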

So I'm using a 1080p monitor and tried 480p. It looked surprisingly okay for 480p. Usually 480p looks like a VHS tape, but it wasn't so bad.

Unfortunately I gave my 4K monitor to my buddy, and that is really what this is for (or maybe emulators as well).

It works, though, and I'm glad Nvidia finally did it (I think AMD is looking into it as well).

However, the time has passed. With DLSS / FreeStyle / ReShade / Radeon Image Sharpening, you don't need integer scaling anymore. It looks way better and is more flexible to non-integer resolutions.
 
The main use I'm looking for is extending the longevity of games. 1440p on a 27" is quite high, excessive I think; I have a friend who had to set Windows to 125% scaling because everything was too small. So say you run something like Doom 2016 at ultra but at 720p, with some sharpening and anti-aliasing slapped on it. How good would it look, and how many fps would you gain? Would it be worth it if you can hit that 144 Hz threshold when you can't at 1440p?

Likewise for 4K at 32", or even at 27", since I know some people go that way.

I'm currently trying to decide between a 1080p and a 1440p 27" screen, and I'm planning to upgrade my 480 4GB to a 5700 XT. I'm sure AMD will eventually include this, since there's so much pressure for it and both Intel and Nvidia have done it already. With a 5700 XT it seems I can't even get to 100 fps in something like Metro Exodus at Ultra, let alone Extreme. BUT, you might at 720p, and if it doesn't look too terrible, why not?

Hey, a man has to dream, right?
 
However, the time has passed. With DLSS / FreeStyle / ReShade / Radeon Image Sharpening, you don't need integer scaling anymore. It looks way better and is more flexible to non-integer resolutions.

Are those really so good that you can, say, make 1080p look good on a 1440p screen?
 
So I just tried it. It works.

Just keep in mind that your native resolution has to be a multiple of the lower resolution you set.

For example, 1080p on a 4K monitor would work. But 720p on a 1080p monitor does not.

So I'm using a 1080p monitor and tried 480p. It looked surprisingly okay for 480p. Usually 480p looks like a VHS tape, but it wasn't so bad.

Translation: the resizing factor has to be an integer.


Likewise, when I use DOSBox I avoid any scaler that blurs the picture, because the raw pixels always look better to me. Indie games, games on emulators, they all benefit from this.
Same. It's the only way to keep the original look of the games.
 
1080p to 1440p, maybe not so much.

With the image sharpening you want to be around 80% of the vertical resolution.

So 900p on a 1080p monitor or 1800p on 4K, etc. I guess 1200p might work on 1440p, I haven't tried it.

Integer scaling would work best in the case of a 4K screen and game at 1080p. So it should still be useful there.

The point was that you can't do 1080p -> 1440p via integer scaling (or 1440p -> 4K), so you'd be better off with sharpening.
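
For reference, the rough math behind that 80% guideline looks like this (a quick sketch assuming 16:9 panels; rounding to a nearby standard mode like 900p or 1800p is a judgment call, not something the sharpening filter requires):

```python
panels = {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K": (3840, 2160)}

for name, (w, h) in panels.items():
    print(f"{name}: 0.8x is roughly {round(w * 0.8)}x{round(h * 0.8)}")

# 1080p: 0.8x is roughly 1536x864   -> use 1600x900 ("900p")
# 1440p: 0.8x is roughly 2048x1152
# 4K:    0.8x is roughly 3072x1728  -> use 3200x1800 ("1800p")
```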
 
I have no way of trying it, but somehow I doubt any sharpening would be able to fight off the interpolation. Tried 1080p on this 1440p screen and it looks pretty damn ugly. Of course, who knows whether 720p would look any better even if scaled without interpolation.

I want my CRT back darn it.
 
You can try FreeStyle on Nvidia cards (or ReShade), it should look similar to AMD RIS.

At 1200p on a 1440p screen it should look alright. Obviously not as sharp as a real render, but playable quality.

It looks similar to console games, which usually render at a less-than-native resolution.
 
Integer scaling is not a good option for modern games. Save this for emulators.
 
It would be if we had higher-res displays. 1440p integer scaled on a 5K monitor might work nicely, for example, but I'm not so fond of 1080p on a 4K.
Yep, that would rock on a 32" screen. Best of both worlds: crisp text and images on the desktop and good performance in games. Too bad no such displays are out in the wild at affordable prices (if I remember correctly the Asus ProArt was the only one, and it had a ridiculous price). It would be even better if they coupled it with 120 Hz in 1440p mode. For desktop use I find 60 Hz to be enough.
 
Integer scaling seems like it could be pretty good for RTX games if you are playing on a 4K TV. Just render at 1080p and integer scale to 4K, and maybe add some sharpening for good measure.
 
I’m surprised there isn’t more activity on this thread.

I have a C9 on the way and was looking forward to having the option to scale 1080p to 4K. Is it really worse than letting the TV interpolate 1440p to 4K?
 
1080p looks glorious on my 4K screen. It is like a 27" 1080p display but without the visible screen door. The image is very sharp and well defined.
I actually use it daily when playing Worms Armageddon at 1080p because it is more convenient than switching to the 1200p screen.
For RTX-enabled games the ability to do proper 1080p is a blessing :)

Well, I would say it comes down to the quality of upscaling. Most high-end TVs these days have very good upscaling capabilities, so using that might be better than going for integer scaling. My understanding was that integer scaling was mostly a big deal for how pixel games look.
No
 
I’m surprised there isn’t more activity on this thread.

I have a C9 on the way and was looking forward to having the option to scale 1080p to 4K. Is it really worse than letting the TV interpolate 1440p to 4K?

The key issue is that 1080p has fewer pixels to resolve small details. You will notice this in games that have densely packed scenery with lots of fine detail, like RDR2. So even though integer-scaled 1080p is sharp, it doesn't look quite as good overall. I think the image sharpening feature now available on Nvidia GPUs does a pretty damn good job when running at lower-than-native, non-integer resolutions. For 2D games I think integer-scaled 1080p will work fine, but for newer 3D games I would prefer running, say, 1800p with image sharpening.
 
1080p looks glorious on my 4K screen. It is like a 27" 1080p display but without the visible screen door. The image is very sharp and well defined.
I actually use it daily when playing Worms Armageddon at 1080p because it is more convenient than switching to the 1200p screen.
For RTX-enabled games the ability to do proper 1080p is a blessing :)


No

No, eh? Sorry, but I have to disagree. Using in-game resolution scaling on my 55" LG TV makes games like RDR2 look much better, as opposed to using integer scaling to drop it all the way down to 1080p.
 
Integer scaling is really for pixel art and retro games. There are better options for the C9 with modern 3D games. Try it for yourself and see.
 
Integer scaling is really for pixel art and retro games. There are better options for the C9 with modern 3D games. Try it for yourself and see.

Agreed. As I said before, this is of very limited use, because these days many pixel art and retro games have been hacked to do things like integer scaling themselves, and more modern stuff has better options.
 
Well the scaling on a lot of TVs is not great. At least on the Samsung sets I've had, running 1080p on 4K looked really blurry and bad. 1440p was acceptable, but not ideal.

Now with image sharpening, I feel like lower res (like 1800p) looks almost as good as 4K native. So personally I would try that first.

The integer scaling is a great feature for older or retro games that don't do the right scaling themselves.

Also, if you really don't have the performance for anything higher than 1080p (but you have a 4K display), then integer scaling might be preferable to the default scaling.
 
Seems nobody is really interested in running games at 720p on a 27" screen, hahaha.

HOWEVER, let's consider this: putting the game at 720p would defeat the purpose of making the game look better with ray tracing and stuff like that. Still, if you can compensate with anti-aliasing to a degree, would it really be that bad?

I guess the best-case scenario for this is to use it on a 27" 4K screen. Given that many people game at 1080p on 27", it would make more sense then. But 720p, I don't know, would it truly be that terrible?

I suppose I'll know soon enough once I upgrade my graphics card, since both Nvidia and AMD have integer scaling now. I'll know then how sensible this idea is, heh.
 
I didn't try integer scaling, but I tried 720p -> 1080p with Radeon Image Sharpening and the result looked horrible. It was unplayable visual quality.

1080p -> 4K might be acceptable; I don't have the gear set up to test right now.
 
With 720p to 1080p you run into interpolation. The entire point here is that with integer scaling those resolutions, 720p on a 1440p screen and 1080p on 4K, look as sharp as if they were native to the monitor, at least in theory.

Now obviously nobody wants to run 720p if they bought a 1440p screen, but if the game is running really slowly, what would be better: 720p on high settings or 1440p on low settings? Those are the questions my silly brain has to contend with.
 
I guess the best-case scenario for this is to use it on a 27" 4K screen. Given that many people game at 1080p on 27", it would make more sense then. But 720p, I don't know, would it truly be that terrible?

I suppose I'll know soon enough once I upgrade my graphics card, since both Nvidia and AMD have integer scaling now. I'll know then how sensible this idea is, heh.

Going down to 1080p, with or without integer scaling, is a severe downgrade in my experience. Sure, you get high framerates, but you take a noticeable hit in image quality as there are fewer pixels to represent fine details.

For example Red Dead Redemption 2 is chock full of small details and last night I was trying out my new 65" LG C9 OLED with it. I felt that 1440p still looked great on it, but when going down to 1080p you could no longer make out all the details of Arthur's face for example and the overall look was interestingly a bit "last gen".

So for modern games, until we get even higher-res displays in the 5-8K area, I think the best solution is to rely on the image sharpening offered by both Nvidia and AMD for upscaling and run at 1440p or above. On my 4K TV I felt there isn't all that much difference running at, say, 3072x1728 (0.8x resolution scale) or 3360x1890 (0.875x resolution scale), but the performance boost is enough to be worth the minor drop in image quality.
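
For anyone checking those render resolutions against the scale factors, the numbers work out exactly (trivial Python, just to show the arithmetic on a 3840x2160 panel):

```python
for scale in (0.8, 0.875):
    print(scale, "->", int(3840 * scale), "x", int(2160 * scale))
# 0.8   -> 3072 x 1728
# 0.875 -> 3360 x 1890
```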
 
1080p on a 2160p screen is a very good thing to have, especially for ray tracing, which tends to run pretty terribly at 2160p, especially on anything below a 2080 Ti.

Image sharpening is a good option, I guess, but if we are talking about 1080p on 2160p it all comes down to preference. I will immediately forget I even have a 2160p screen the moment I run a game at 1080p with integer scaling, as it looks pretty much exactly like a 1080p screen minus the screen-door effect.

Integer scaling is much less useful for 1440p monitors, obviously...
 
It all depends on the screen size, I guess. I think it would work pretty well on a 27" 4K screen. I gamed on a 1080p 27" monitor and it wasn't bad at all. 720p on a 27" 1440p screen, though, may be less ideal, but can't that be compensated with anti-aliasing? The big thing is to avoid interpolation. Once you get past that, it's all a matter of reducing pixelation and weighing that against the trade-off. If you have a game that runs at 30 fps at 1440p, would it be worth it to run it at 720p to get above 60 fps?
 
Played with integer scaling a bit more and it really is nice depending on the game. In some cases 1080p scaled to 4K is cleaner than 1440p interpolated to 4K.

There are other cases though where the raw rendering resolution is more important especially for high frequency effects like smoke, trees, leaves and grass. The additional fine detail of 1440p vs 1080p provides much more clarity when both are scaled to 4K.
 
I've had a side-by-side comparison between a native 720p LCD TV and a native 1440p LCD PC monitor, both 32" diagonal. Tested a couple of games, both 2D and 3D.

For 2D, I've tested Braid, an old 2D game with a fixed 720p resolution. It looked a bit better on the native 720p TV panel. With integer scaling on the 1440p monitor, the image was sharp, but at the same time the pixels were jagged, most probably an effect of the pixel doubling to match the 2560x1440 resolution.

For 3D, I've tested Quake Champions, and again, 720p looks worse than on the native 720p screen. Not by much, but there is a difference; it's more jagged... Sharp, not blurry, but no AA setting can alleviate the jaggies. Crucially, 1080p upscaled to 1440p looks way better, especially with some sharpening. I haven't had the chance to test a 4K panel, but I bet 1440p normally upscaled to 4K plus sharpening looks better than 1080p integer scaled to 4K. Kind of like how consoles do it.

One thing I've noticed, for 2D games at least: you can achieve similar results with just normal upscaling and tweaked ReShade sharpening filters, depending on the native resolution of the panel (i.e. for a 1080p panel you'll have different ReShade settings than for a 1440p panel). But it's tricky to get right...
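
That sharp-but-jagged look is exactly what pixel duplication produces. Here's a toy illustration of 2x integer scaling with numpy, purely to show the mechanics; it's not how the GPU driver actually implements it:

```python
import numpy as np

# Fake 720p frame; in 2x integer scaling every source pixel becomes a 2x2 block,
# so edges stay razor sharp but also stair-stepped (no interpolation, no blur).
frame = np.random.randint(0, 256, size=(720, 1280, 3), dtype=np.uint8)
scaled = frame.repeat(2, axis=0).repeat(2, axis=1)

print(frame.shape, "->", scaled.shape)  # (720, 1280, 3) -> (1440, 2560, 3)
```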
 