Playing at 1080p on a 4k monitor?

MajGHOB

I'm considering buying a new monitor in the next few months when the next set of G-sync monitors come out. I currently have a 27" 1440p PLS panel and a Titan X, but I doubt the Titan X will be able to handle 4K with everything maxed out.

So I want to hear from anyone who actually OWNS a 4K panel how games look on it if run at 1080p. If I can run 1080p on a 4K panel without image quality loss then I will buy the Asus ROG PG27AQ 4K panel, but if there is image quality loss then I will have to go for the Acer Predator X34, which is an Ultra-wide panel.

By the way here are the specs for these two monitors:

Asus ROG Swift PG27AQ
27" IPS 3840 x 2160p 60Hz G-Sync

Acer Predator X34
34" IPS 3440 x 1440p 75 or 100Hz(?) G-Sync

Thanks in advance guys :)
 
This is not the answer you want to hear:

MT_ said:
AFAIK, there is currently no 4K monitor that has scaling-by-multiplying behavior implemented for divisible resolutions. So even 1080p is scaled to 2160p with non-disableable filtering that results in a somewhat blurry image anyway. For computers, this could theoretically be fixed at the GPU-driver level (though it isn't yet) for the GPU-scaling case, but this is not applicable to consoles like the PS4.

Maybe some UHD TVs (not monitors) have proper FHD→UHD scaling without blurring, but that isn't known with certainty.

http://hardforum.com/showthread.php?t=1855770

This was posted in March 2015. I don't know if this situation has changed since then.
 
What exactly do you mean by no loss of image quality? Playing at 1080p vs playing at native 4k will always result in a loss of image quality because it's 1080p and not 4k... :confused:

There are some who claim that if a 4k monitor (not a TV) is run at 1080p there is little to no distinguishable difference in image quality compared to native 4k when running games. The claim is plausible since each 1080p pixel maps exactly onto a 2x2 block of four physical pixels when a 4k panel runs at 1080p, so no interpolation should be required; it divides evenly. This is basically what Apple does on all their Retina stuff. The problem is that I don't have the hardware to actually confirm this myself on a computer monitor that runs at 4k. Keep in mind this is only relevant to monitors and not TVs.
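For what it's worth, the pixel math behind that claim is easy to sanity-check in software. Below is a minimal sketch (my own illustration, assuming Python with NumPy; the frame is just random data standing in for a 1080p image) showing that 2x pixel doubling is lossless: every source pixel becomes a 2x2 block of identical pixels, and the original image can be recovered exactly, with nothing blended away.

Code:
import numpy as np

# Hypothetical 1080p frame: random 8-bit RGB values standing in for a game frame.
frame_1080p = np.random.randint(0, 256, size=(1080, 1920, 3), dtype=np.uint8)

# Integer (nearest-neighbor) 2x upscale: every source pixel becomes a 2x2 block.
frame_2160p = frame_1080p.repeat(2, axis=0).repeat(2, axis=1)
assert frame_2160p.shape == (2160, 3840, 3)

# Sampling every second row/column recovers the original exactly --
# no information was blended away, unlike with bilinear or bicubic filtering.
assert np.array_equal(frame_2160p[::2, ::2], frame_1080p)

Whether any given monitor or GPU driver actually scales this way is, of course, exactly what's in question.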

Hopefully someone here with a 4k monitor can answer the question. BTW, if you have a 4k monitor, apparently you need to disable high-DPI scaling under Windows for that specific game for this to work smoothly.
 
Well, I can't speak for other displays, but on my Samsung 4k with no scaling applied, gaming at 1080p just looks mad ugly compared to native 4k. The amount of aliasing going on is ridiculous and overall things just don't look clear and sharp. Maybe other monitors do 1080p better.
 
The claim is plausible since each 1080p pixel maps exactly onto a 2x2 block of four physical pixels when a 4k panel runs at 1080p, so no interpolation should be required; it divides evenly. This is basically what Apple does on all their Retina stuff.


Only it hasn't turned out like that for pretty much any display.

Non-native on an LCD is almost always crappy, especially at closer seating distances. You can get away with it on a TV at TV viewing distances, which is why 4k on a 1080p TV will still look great even in comparison with 4k on a 4k TV. 4k can be a bit pointless for people (not for everyone, obviously) in a typical living room scenario, once you weigh eyesight against screen size and distance.

4k is best for monitors and VR, and as such it's better at native resolution due to seating distances.

(*I say seating distances because of the increased visual perception the eye has up close as opposed to far away.)
 
My 2 cents: 4k with 80 percent of everything maxed out (which is what your Titan X should do) looks better than 1080p with "all the eye candy turned up".

Have to echo what everyone else is saying. 1080p on a 4K monitor looks worse than on a native 1080p monitor.
 
I use a 27" Retina iMac with a native 5k resolution, and there is no difference between running a 1080p movie on this machine and on my native-1080p 17" Windows machine. The only difference is that the iMac looks WAY better, but no detail is lost (even though it is scaling 1080p content up to the 5k panel).
Even if your 4k monitor were to lose some detail, you wouldn't notice it, because I assume you don't watch a movie one inch away from it.
 
I use a 27" Retina iMac with a native 5k resolution, and there is no difference between running a 1080p movie on this machine and on my native-1080p 17" Windows machine. The only difference is that the iMac looks WAY better, but no detail is lost (even though it is scaling 1080p content up to the 5k panel).
Even if your 4k monitor were to lose some detail, you wouldn't notice it, because I assume you don't watch a movie one inch away from it.

Apples to oranges. OSX scaling is a completely different beast, and you are comparing running a screen at its native resolution (which a Mac always is under OSX) to simply playing a 1080p video on a screen.
 
Now that I'm rereading the thread's title I see you're completely right :)
 
I like the idea... never given it much thought but this would be nice.
I know it's a pain, but it would be good if you could register on their forums and mention your interest.
They're never going to consider adding the feature if it's just me posting in that topic once a week.
 
Well, I guess this means that I either have to buy another Titan X if I want 4k, or I'll just buy the Acer X34 ultrawide 1440p screen and be done with it for the next couple of years. BTW, thanks everyone for posting; it was helpful.
 
I would get the 34 inch Acer or the Asus equivalent of it. Comparing my 27 inch 1440p and 28 inch 4k displays, there really isn't much of a difference in overall image quality between the two; the main difference is less aliasing on the 4k, which can be somewhat mitigated by using more AA at 1440p. There just aren't a whole lot of games out right now that really have the textures to do 4k justice. 100Hz 3440x1440 seems like a much better gaming experience imo compared to 27 inch 4k.
 
Can somebody PLEASE give a visual comparison of this 'ScalerGate' 1080P scaling on 4K monitors? I have a hard time believing that playing 1080P on a 4K monitor (even with shitty scaling) would be worse than native 1080P. It may not be as good as it could be, but I don't see how a larger amount of higher-DPI pixels could make image quality WORSE.

Before any of y'all get up in my face: I know what you mean by 'blurry, upscaled mess', the whole per-pixel interpolation where non-native pixel increments blend between two native pixels; I get it. But I can't understand how that could be WORSE than 1080P at the same panel size.

Show me pictures: good-quality, tripod-captured, close-up shots of text, games, etc., comparing two like-sized monitors, one 1080p and one 4K running upscaled 1080p. Let's end this.
 
The Nvidia forums post linked above already includes an example from FTL showing how GPU scaling butchers pixel graphics by applying interpolation with no option to turn it off.

What it's supposed to look like (pixel doubling): http://abload.de/img/4k-nearestb1r06.png
What the scaling looks like: http://abload.de/img/4k-scaleddaqy3.jpg

Of course this is a GPU scaling example but many 4K monitors and TVs also apply similar scaling algorithms that can't be turned off when upscaling 1080P content.
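If anyone wants to reproduce that kind of comparison from their own screenshots, here's a rough sketch using Pillow (the filenames are placeholders; start from any native 1920x1080 screenshot). NEAREST approximates the pixel-doubled version, while BILINEAR roughly approximates the filtered upscale that GPU and monitor scalers apply.

Code:
from PIL import Image

# Placeholder filename -- substitute any native 1920x1080 screenshot.
src = Image.open("screenshot_1080p.png")
target = (src.width * 2, src.height * 2)

# Integer "pixel doubling": each pixel is copied into a 2x2 block, edges stay razor sharp.
src.resize(target, Image.NEAREST).save("upscaled_nearest.png")

# Filtered upscale, roughly what GPU scaling and monitor scalers do today.
src.resize(target, Image.BILINEAR).save("upscaled_bilinear.png")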
 
Yes, I get that, but that is a manufactured example: it's a pixel-art image scaled in Photoshop. That would not be the signal sent to the screen, nor would the screen output an image like that. I'm talking about pulling out the scientific method and comparing real-world examples, 1 for 1. Use a tripod-mounted camera and take pictures of the results. We can talk about simulations and theoreticals, but let's get down to brass tacks: does a 4K monitor receiving a 1080P signal look worse than a 1080P monitor receiving the same signal?
 
Right... I'll have to defer to someone who actually has a 4K panel that has bad scaling on it to take pictures of it.
 
Yes, I get that, but that is a manufactured example: it's a pixel-art image scaled in Photoshop. That would not be the signal sent to the screen, nor would the screen output an image like that. I'm talking about pulling out the scientific method and comparing real-world examples, 1 for 1. Use a tripod-mounted camera and take pictures of the results. We can talk about simulations and theoreticals, but let's get down to brass tacks: does a 4K monitor receiving a 1080P signal look worse than a 1080P monitor receiving the same signal?

There are a couple of problems with doing this experiment right. First off, what 4k monitor has this perfect scaling that makes 1080p look exactly like 1080p? Second, let's say we do find one; there is no guarantee that the exact same monitor exists in a 1080p version. You can't take some 27 inch 4k monitor and then compare it to a completely different 1080p monitor, because now things like the AG coating factor into how the images look. You need a 4k monitor that has this type of scaling AND a monitor of the same size with the same AG coating but a 1080p resolution.
 
I think that's getting a bit TOO scientific! Good on you for illustrating the issues here, though, because you make an excellent point.

I think the main issue can be broken down as follows:

Question 1: Do 4K monitors upscale lower-resolution signals (1080p, for example) in a way that detracts from the perceived image quality compared to the same signal displayed at its native resolution, with all other factors being similar or identical?

Question 2: Is the difference in image quality significant or obvious enough to be unquestionably attributed to the upscaling factor ONLY, and not the panel, backlight, or coating type?

The test: Acquire two different displays, one 2160p and one 1080p. Similar panel types are preferred, but not necessary. Then:

  1. Calibrate them to roughly the same brightness and colour curves.
  2. Sit the displays next to one another, in sight of a secured camera. The camera must have a high enough capture resolution to resolve individual pixels on the higher-resolution screen. Zooming in to accomplish this is acceptable, so long as the majority of a monitor can be photographed with only minor re-adjusting of the camera (rotating the camera to photograph one screen at a time may be required).
  3. If the displays are of different sizes, move the larger display further from the camera in order to maintain an equal FOV from the camera's location.
  4. Use a video card with two HDMI outputs, or two DVI outputs, or two DisplayPort outputs (the output types must be identical in order to eliminate possible causes of image degradation). Run the cabling to the monitors, and ensure the OS is set to duplicate the desktop displays instead of extending them.
  5. Set the desktop resolution to 1920x1080 and display full-screen 1080p images in a slide-show format.
  6. Take pictures with the secured camera, two pictures for each full-screen image, with each photo focusing on a different screen.

We can then look at the photos here and make up our minds with evidence, instead of anecdote.
 
Photographs as requested:

  1. Low-res panel
  2. High-res panel displaying the same image using 2x nearest-neighbor scaling
  3. High-res panel allowing the display to handle scaling (bilinear I assume)

lo-res43p30.jpg
hi-resnearest25rf6.jpg
hi-resdisplayw6o6h.jpg


Yes, that last one is properly focused.
Anything other than nearest-neighbor at integer scales is terrible.
 
How did you get photos depicting integer scaling if there is currently no driver support for this mode?
He hacked the driver and added support for point scaling, obviously ;)

But seriously, you really don't know? :eek:

Without using any external programs, you can use e.g. MS Paint to do a 2x zoom of a screenshot, then take a photo of that.
 
Kudos.
Those images say more than a thousand words.
For me, the 1080p monitor looks the best.
In person, I prefer the higher resolution screen.
Unfortunately there are some slight brightness/gamma differences between the screens that I couldn't quite correct for, and a photograph is not 100% representative of what we see.

In person, what you see is that the higher-resolution screen looks much "clearer" (with the nearest-neighbor image).
The low-res screen has a coarse "grid" over the image since there is one display pixel per image pixel.
On the high-res screen, the "grid" between the pixels is difficult to see, which makes smaller details stand out more.

If you draw something with a thin red line, on a low-resolution screen it tends to look thinner than a white line.
On the higher-resolution screen, where there are four display pixels for every image pixel, their widths are much closer in appearance.

There is noticeably more color fringing on the lower resolution screen as well, which does make the image "pop" a bit more in the photo. (I guess that's why many games are adding chromatic aberration effects now?)
What you see in the low resolution image is not due to the camera lens, it's due to the big subpixels.

But the shortcomings of the lower-resolution screen do have a "smoothing" effect.
It's a trick of the eyes, but we somehow don't perceive the coarser image to have such a jagged (aliased) appearance, yet at the same time it almost looks "sharper."

Basically, 1:1 mapping looks best.
But if we already have a high-resolution display, using nearest-neighbor upscaling looks a lot better than the blurry upscaling that most screens use - at least when displaying computer graphics.
For displaying photographs or movies, you might prefer the blurrier but smoother look of the scaling in #3.
Without using any external programs, you can use e.g. MS Paint to do a 2x zoom of a screenshot, then take a photo of that.
Yes, it's easy to do with a static image.
  1. Display a native-resolution image on the low-res screen
  2. Display an upscaled (200% nearest) image natively on the high-res screen
  3. Send the low-res output (1) to the high-res screen and let it upscale
What I'd like to see is an option in the drivers to have nearest-neighbor scaling applied any time a program/game outputs a lower-resolution signal.
Nearest-neighbor scaling is the most basic upscaling that can be done, so it should be very easy for NVIDIA (or AMD/Intel) to implement. They just have to be convinced that it's worthwhile to do so.
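To give a sense of how little work such a mode would be, here's a toy sketch of the whole algorithm in NumPy (my own illustration, not anything from an actual driver): every output pixel is just a straight copy of src[y // factor, x // factor], with no filtering anywhere.

Code:
import numpy as np

def integer_upscale(src: np.ndarray, factor: int) -> np.ndarray:
    """Nearest-neighbor upscale by an integer factor: output pixel (y, x)
    is a straight copy of src[y // factor, x // factor]."""
    h, w = src.shape[:2]
    ys = np.arange(h * factor) // factor
    xs = np.arange(w * factor) // factor
    return src[ys[:, None], xs[None, :]]

# e.g. a 1920x1080 frame doubled to 3840x2160
frame = np.zeros((1080, 1920, 3), dtype=np.uint8)
print(integer_upscale(frame, 2).shape)  # (2160, 3840, 3)

A real driver would also have to center or letterbox the image when the panel isn't an exact multiple of the source resolution, but the scaling itself really is this simple.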
 
Photographs as requested:

  1. Low-res panel
  2. High-res panel displaying the same image using 2x nearest-neighbor scaling
  3. High-res panel allowing the display to handle scaling (bilinear I assume)

lo-res43p30.jpg
hi-resnearest25rf6.jpg
hi-resdisplayw6o6h.jpg


Yes, that last one is properly focused.
Anything other than nearest-neighbor at integer scales is terrible.

That is definitely a revealing set of images! The final image does indeed show blurring; while it doesn't reduce detail or readability, it does show a loss of sharpness.

However...

That looks to be a screen grab of software/pixel art, definitely showing the worst-case scenario, and the one scenario people who ask these questions are least concerned about. MOST people who want to upscale via the monitor are doing so because their machine (be it a current console, or a PC without enough graphics power) cannot display 4K smoothly in game, so they set the application resolution lower.

Screen grabs of text-heavy software and screen grabs from within, say, TW3 or GTA5 are two very different usage examples. Gaming is more the common scenario I was looking at. Anyone who buys a 4k monitor for software/application productivity but DOESN'T use the native resolution is surely a small fraction of users, if any.
 
That is definitely a revealing set of images! The final image does indeed show blurring; while it doesn't reduce detail or readability, it does show a loss of sharpness.
However...
That looks to be a screen grab of software/pixel art, definitely showing the worst-case scenario, and the one scenario people who ask these questions are least concerned about. MOST people who want to upscale via the monitor are doing so because their machine (be it a current console, or a PC without enough graphics power) cannot display 4K smoothly in game, so they set the application resolution lower.
Screen grabs of text-heavy software and screen grabs from within, say, TW3 or GTA5 are two very different usage examples. Gaming is more the common scenario I was looking at. Anyone who buys a 4k monitor for software/application productivity but DOESN'T use the native resolution is surely a small fraction of users, if any.
I can try and get some photos of games running on a pair of displays next weekend.
But the results really aren't any different. Anything which is rendered by a computer tends to look best when scaled up using nearest-neighbor.
With video you might prefer to use "filtered" scaling, where a smoother appearance is preferable, but not with games.

But if we can convince the GPU manufacturers to include some kind of integer scaling mode, control will be in the user's hands. You'll be able to choose what you prefer.
And if they make that change, it might be nice to see scaling options moved into application profiles.
 
I wish this option already existed. It would certainly make going to a 4K monitor easier.
 
I can try and get some photos of games running on a pair of displays next weekend.
But the results really aren't any different. Anything which is rendered by a computer tends to look best when scaled up using nearest-neighbor.
With video you might prefer to use "filtered" scaling, where a smoother appearance is preferable, but not with games.

But if we can convince the GPU manufacturers to include some kind of integer scaling mode, control will be in the user's hands. You'll be able to choose what you prefer.
And if they make that change, it might be nice to see scaling options moved into application profiles.

Thanks for your help with this. I'm hoping that if we can get a few diverse examples, we can create something that will turn up in a Google search. This will help future buyers at least.
 
Well, I have good and bad news. I was really hoping to borrow a low-res panel with similar characteristics to the high-res one I wanted to compare it against, but I ended up having to do the comparison between a nice IPS panel and a low-quality TN panel.
We're only supposed to be comparing resolution, but that's the reason there are color and gamma differences.

And though the two panels are physically the same size, the rest of the displays' construction was not, which meant that I had to reposition the camera between shots - so I was unable to get things perfectly aligned, and I had to edit the TN shots to try and improve the alignment. This was exponentially more difficult than the previous comparison, where both displays were basically identical other than resolution.

I actually got the alignment pretty good on the full-frame shots, but not the close-up ones.
Unfortunately, due to the resolution of my camera, I had to shoot at a very high f-stop (small aperture) to try and avoid moire, which tends to soften the images a bit - making the full-frame comparisons less obvious than they are in person - so I've left most of those out.

So with those excuses out of the way, here are some images.

FTL:

MGSV: Ground Zeroes

Dark Souls 2

Half-Life 2 with a ton of anti-aliasing

Now I don't know if it comes across well in the photographs, but in person, things look much better using Nearest Neighbor (integer) scaling.
The image looks a lot crisper, and is a much closer match to the natively low resolution panel. I'd go as far as to say that I prefer it, because even though the image is still the same resolution, the higher resolution panel's finer pixel structure makes small details easier to see.

If it looks in any way like the bilinear-scaled image has better anti-aliasing, that's because it's a photo. In person it still has all the aliasing present; it just looks blurred on top of that.
 
Awesome! Thanks for the work on this one. The biggest difference favouring the filtering was the Dark Souls shot, but the Half-Life and FTL shots favour the NN scaling; yet MGSV looked better on the native TN for some reason. SMAA, maybe?

Either way, this has GOT to be useful for those interested in buying a new screen, and helpful for those looking to persuade video-card vendors to support NN scaling on the GPU. Well done!
 
Great comparison.
Kudos to you.

Now all we need to do is molest AMD and NV :D
 
Awesome! Thanks for the work on this one. The biggest difference favouring the filtering was the Dark Souls shot, but the Half-Life and FTL shots favour the NN scaling; yet MGSV looked better on the native TN for some reason. SMAA, maybe?
I think there are a few things going on.
Firstly, I had to resize my images by about 50% - because they were too big for the image host otherwise. (30MB+ JPEGs!)
And because I was trying to correct for differences in the angle, I think the TN panel shots ended up looking a bit softer than the originals - they did end up a different size, as I had to crop them a bit more than the IPS shots.

But I actually think there is something going on with the TN panel itself. Some kind of crosstalk or "bleeding" seems to be visible in the darker images, which explains the SMAA comment.
Part of it is that I don't have a macro lens for my camera and that I had to use a small aperture to avoid moire, but all the TN panel shots have this hazy look to them - it's the reason I was hoping to compare IPS against IPS.
In all of the images, the top half seems to be the better point of comparison. The viewing angles are so limited on the TN panel that I couldn't get a uniform image at any angle; the top ended up overly contrasted and the bottom washed out. It looked more uniform through the camera than in the resulting images.

HL2 was using 2x SSAA combined with 8x SGSSAA so it was basically perfectly anti-aliased at that resolution.
MGS and DS2 use FXAA, which I disabled as I can't stand how that blurs the image.

As I've said before, the differences are more stark in person, and none of them were in favor of the filtered scaling, in my opinion.
I spent some more time trying to correct the angle of the close-up shots, so here's another DS2 comparison:

I've also updated the previous MGS images.
 
Most 4K television sets can automatically upscale whatever content you play on them. Obviously, full HD will continue to look better than standard-definition content. However, only good scaling can make 1080p content look good on a 4K TV. A 4K TV with good upscaling technology will make a 1080p Blu-ray upconverted to the 4K screen look great, although not appreciably better than 1080p on a 1080p TV. Just because a TV is 4K doesn't mean it has a good scaler built in. On cheaper 4K TVs, for instance, part of what you're saving money on is the scaler, and in those cases 1080p content probably won't look any better (and may possibly look worse).
 
35 posts to reinvent the wheel: integer scaling looks better than the other scaling alternatives; 4k handles 1080p well but does not do a good job at the other resolutions in between.

I will add something new: there are 4k displays capable of 1080p at 120Hz. The cheapest is a 43" Sony 830C at ~$640.
 
What exactly do you mean by no loss of image quality? Playing at 1080p vs playing at native 4k will always result in a loss of image quality because it's 1080p and not 4k... :confused:
Exactly this.

Plus at 4k you really get no benefit from AA.

I've been on a 40" 4k for about a year on a 1070. The crazy thing is, I can play games just fine. But ssshh, the Internet doesn't want you to know that.
 
35 posts to reinvent the wheel: integer scaling looks better than the other scaling alternatives; 4k handles 1080p well but does not do a good job at the other resolutions in between.

I will add something new: there are 4k displays capable of 1080p at 120Hz. The cheapest is a 43" Sony 830C at ~$640.

My experiences tell a completely different story.

1440p on my 4k (BL3201PT) scales better than 1080p does on the same 4k screen, which is ironic, considering that I initially wanted the 4k monitor hoping it would do 1080p somewhat well; it doesn't.

Basically, in my honest opinion, 1080p on 4k screens is dead in the water; I have not seen any monitor or GPU setting that allows for integer scaling at all.

My TV scales 1080p better, but that's coming from sitting 3m+ away from a 49" panel, compared to sitting 1m away from a 32" panel.

If anyone wants to use a lower resolution for performance reasons, I now recommend actually running a lower-resolution native panel and using DSR/VSR/GeDoSaTo to raise the rendering resolution.
 
1440p should look better if the only option is filtered scaling, since the increased resolution looks better even when upscaled. Ideally, 1080p would look sharper with correct integer scaling, but even then you might prefer the look of the higher non-native resolution, as it can resolve more detail even if the end result is softened a bit. Most games running on the PS4 Pro, for example, use either checkerboard rendering or upscaling to 4K, and I really like the results, especially for checkerboard. Horizon: Zero Dawn looks fantastic and quite sharp.

There's a long thread on GeForce forums about this: https://forums.geforce.com/default/topic/844905/geforce-drivers/integer-scaling-mode/
Unfortunately, it seems this is either being ignored or is a very, very low priority for Nvidia. I don't think it would be that difficult to implement.
 