Your 1440p Monitor Could Be Using a 4K Panel

Megalith

The antics of the computer monitor industry continue with a new report claiming that certain panel manufacturers are using 4K panels for QHD displays because they are cheaper to produce. The native resolution is limited via firmware, and image quality is reduced due to scaling.

In order to scale a 2560x1440 image to 3840x2160, the scaling factor is 1.5x. This means that a single pixel in the lower-resolution original image gets mapped onto one and a half pixels, which increases blurriness. This is vastly different from a 4K display running with 1920x1080 input, where each pixel simply gets doubled in width and height, so a 1:1 mapping exists and everything stays sharp.
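To make that arithmetic concrete, here is a minimal Python sketch (purely illustrative, not from the report) that checks whether source-pixel boundaries line up with destination-pixel boundaries at each scale factor:

```python
# Rough sketch: do source-pixel edges land on whole destination pixels?
def boundary_check(src_width, dst_width, pixels_to_show=6):
    scale = dst_width / src_width        # 1.5 for 2560->3840, 2.0 for 1920->3840
    print(f"{src_width} -> {dst_width} (scale {scale}x)")
    for i in range(pixels_to_show):
        left, right = i * scale, (i + 1) * scale   # edges of source pixel i in destination space
        clean = left.is_integer() and right.is_integer()
        print(f"  source pixel {i} covers destination {left} .. {right}  clean: {clean}")

boundary_check(2560, 3840)   # edges fall on half-pixels -> the scaler has to blend neighbours
boundary_check(1920, 3840)   # every edge lands on a whole pixel -> pixels can simply be repeated
```

At 1.5x, two out of every three source pixels straddle a destination-pixel boundary, which is why the result has to be interpolated; at 2.0x every boundary lines up.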
 
Wow, that's a major dick move: buying a 1440p monitor and getting punished for it.
Too bad the article names no models or manufacturers pulling this crap; they should really name and shame on this one.
My guess would be manufacturers that offer both; otherwise it would make sense to undercut the competition at 4K.
 
Honestly, with the price of 4K monitors coming down, and the uptick in refresh rates for HDMI and DisplayPort, the days of the 1440p monitor are probably numbered.
 
Eventually it won't matter once pixel density gets high enough. In 10 years all panels will probably be the same for all resolutions, because the panel will be something like 20k x 20k pixels.
 
Eventually it won't matter once pixel density gets high enough. In 10 years all panels will probably be the same for all resolutions, because the panel will be something like 20k x 20k pixels.

Maybe by then we'll have the Geforce 1180 to run games at peak medium settings...
 
This is vastly different from a 4K display running with 1920x1080 input, where each pixel simply gets doubled in width and height, so a 1:1 mapping exists and everything stays sharp.
It only exists in THEORY. Modern GPUs will blur that with bilinear or bicubic scaling just the same. Pretty much anything that's non-native resolution is going to have a blur, unless the software is specifically instructed to use nearest-neighbor filtering. Ironically, this means you'll probably have less blur with a 1440 image on a 4K screen than with a 1080 image on one.
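A toy illustration of that point, using a made-up 1-D row of pixel values rather than any real scaler: even at an exact 2x factor, linear interpolation produces in-between grey values at every edge, while nearest-neighbor just repeats each pixel.

```python
# Toy example with made-up values: 2x upscale of a 1-D row of pixels.
row = [0, 255, 0, 255]               # hard black/white edges

def nearest_2x(px):
    # Each source pixel becomes two identical output pixels -> edges stay hard.
    return [v for v in px for _ in range(2)]

def linear_2x(px):
    # Insert a sample halfway between neighbours -> the new samples are averages,
    # which is the softening a bilinear-style scaler introduces.
    out = []
    for a, b in zip(px, px[1:] + px[-1:]):
        out += [a, (a + b) // 2]
    return out

print(nearest_2x(row))   # [0, 0, 255, 255, 0, 0, 255, 255]
print(linear_2x(row))    # [0, 127, 255, 127, 0, 127, 255, 255] -> grey at every transition
```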
 
What monitors do this? I doubt this is true. If any monitor did this, it would be obvious and people would be complaining.
 
I have the Asus PB278Q and am doubtful too. I do know they released two different panels for it; I think it was due to light bleed, but I haven't investigated further.

I fell victim to that.

The PB278Q started life with a Samsung panel and great reviews; I bought mine and got an AU Optronics panel.

The same panel didn't get such good reviews in other monitors.
 
I find that 1440p looks better on 4K monitors than 1080p does, because it gets upscaled either way. If a 1440p monitor lacks a screen-door effect close up, it's probably a 4K panel.
 
I don't quite know what all this means, but I can tell you that on my HP Omen 32 I have Nvidia DSR enabled and run my desktop at 2160p. Is it crystal clear? No. Do I have it at like 45% smoothing? Yes. Do I have any issues reading text or seeing small stuff? Not at all. Like, if I didn't KNOW, I don't think I could tell this wasn't a native 4K monitor.
 
Honestly, with the price of 4K monitors coming down, and the uptick in refresh rates for HDMI and DisplayPort, the days of the 1440p monitor are probably numbered.

This may be true, but if you are a gamer, the GPU demands are not coming down.

We still don't have a single GPU that you can game on at 4k without having to reduce settings.
 
Not the first time I've seen this happen. From 1080p to 4K, I've seen panels where Windows can see resolutions (without adding or changing settings) that are not in the specs. I had an early Philips 47" 1080p/60Hz that the NV control panel let me run at 120Hz. It was awesome. My current HiSense 55" had a similar resolution/refresh option until a firmware update took it away. Seems like some custom firmware might be on the horizon for some folks soon; just gotta be careful not to infect them with something. The only real difference here is that the manufacturers are using the firmware from the get-go to control things.
 
Do you even realize what website you're on?

https://m.hardocp.com/article/2018/06/06/state_decay_2_video_card_performance_review/4

Look at that, a game that can be played fully maxed out on a single GPU in 4K. And the review is from this month.
He probably means "can game reliably at 4K with max settings." Even with a 1080 Ti, 4K gaming at max settings isn't a reliable thing at all; it varies a lot depending on the game. State of Decay 2 is basically a game updating a visual style built for the Xbox 360, so it's not going to be the most demanding title out there. Try something like Crysis 3, Deus Ex: Mankind Divided, or GTA V, or plenty of other games that push the graphics envelope, and you're not going to be hitting 60 fps reliably. At 1440, you will.
 
I wouldn't put it past monitor manufacturers, but we don't have any hard evidence that this is happening as of yet.
 
What 4K displays actually offer a real 4:1 point-filtered upscale of 1080p to 4K? I don't know of any. Otherwise, despite the marketing speak, it doesn't matter whether it's an even number of pixels in versus out. The interpolation is still going to look bad no matter what, unless you integer scale it (point sampling: do nothing more than map 1 input pixel to 4 output pixels, without trying to interpolate in between).
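For reference, a short NumPy sketch (made-up pixel values, nothing vendor-specific) of what that integer point scale amounts to: each input pixel becomes an exact 2x2 block of identical output pixels, with no interpolation anywhere.

```python
import numpy as np

# Tiny made-up 2x2 "image"; integer point scaling just repeats each pixel.
src = np.array([[10, 20],
                [30, 40]], dtype=np.uint8)

dst = np.repeat(np.repeat(src, 2, axis=0), 2, axis=1)   # 1 input pixel -> 4 output pixels

print(dst)
# [[10 10 20 20]
#  [10 10 20 20]
#  [30 30 40 40]
#  [30 30 40 40]]
```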
 
This may be true, but if you are a gamer, the GPU demands are not coming down.

We still don't have a single GPU that you can game on at 4k without having to reduce settings.

New games are taking care of this by implementing screen scaling and dynamic resolution with 4K interfaces, so you may as well get the 4K screen.
 
Now the question is this: if you or someone else buys a 2560x1440 (sorry, 1440p is a fucking video format, it's not a damned display resolution) panel that turns out to be a 3840x2160 panel in reality, will someone be able to find a way to modify the panel's firmware so that it works as it was originally designed, meaning at the full native 3840x2160 resolution?

Let the hacking begin!!!
 
I'm guessing one of the panels either has extra pixels or is missing a few lines of pixels to make it divide evenly; then the firmware scales it and presents it to the GPU as the proper resolution.
 
New games are taking care of this by implementing screen scaling and dynamic resolution with 4K interfaces, so you may as well get the 4K screen.

But does any of that give you the visual quality of actually running at 4K?

Any time I hear "screen scaling" and "dynamic resolution," that usually means inferior quality compared to native.
 
But does any of that give you the visual quality of actually running at 4K?

Any time I hear "screen scaling" and "dynamic resolution," that usually means inferior quality compared to native.

If you are looking at static screenshots, sure, but in motion it is hard to tell, especially if the interface stays native. Dynamic is obviously better than screen scaling, but both technologies are getting better.
 
Makes sense; so many 4K panels are 27", as are many 1440p panels. The scaling sucks though, as mentioned in the article. 1440p looks best when scaling from 720p.
 