4K, 144Hz Monitors Reportedly Use Chroma Subsampling When Running at 144Hz

Megalith

24-bit/48kHz
Staff member
Joined
Aug 20, 2006
Messages
13,003
Asus’ and Acer’s new PG27UQ and X27 4K 144Hz monitors have been criticized for being expensive, loud, and power hungry. Now, r/hardware is warning prospective buyers of another deficiency: chroma subsampling (4:2:2) is reportedly in play when the monitors are running at 144Hz, which means a softer, blurrier image. Ideally, any display for desktop use should support full, non-subsampled resolution (RGB/4:4:4).

Chroma subsampling reduces image quality. Since chroma subsampling is, in effect, a partial reduction in resolution, its effects are in line with what you might expect from that. Most notably, fine text can be affected significantly, so chroma subsampling is generally considered unacceptable for desktop use. Hence, it is practically never used for computers; many monitors don't even support chroma subsampling.
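For anyone wondering what 4:2:2 actually costs you, here is a minimal sketch of the idea in Python/numpy; the toy values and the crude nearest-neighbor upsample are illustrative assumptions, not how any real monitor's scaler works:

[CODE]
import numpy as np

def subsample_422(ycbcr):
    """ycbcr: (H, W, 3) array. 4:2:2 keeps full-resolution luma (Y)
    but only every other column of chroma (Cb, Cr)."""
    y = ycbcr[:, :, 0]
    cb = ycbcr[:, ::2, 1]   # half the horizontal chroma resolution
    cr = ycbcr[:, ::2, 2]
    # Crude nearest-neighbor upsample back to full width
    cb_up = np.repeat(cb, 2, axis=1)[:, :y.shape[1]]
    cr_up = np.repeat(cr, 2, axis=1)[:, :y.shape[1]]
    return np.stack([y, cb_up, cr_up], axis=-1)

# Pixel-fine color detail (think colored text fringes) is destroyed:
row = np.zeros((1, 4, 3))
row[0, :, 1] = [100, 200, 100, 200]   # chroma alternating every column
print(subsample_422(row)[0, :, 1])    # -> [100. 100. 100. 100.]
[/CODE]

Luma-only detail survives, which is why photos and video mostly look fine while subpixel-rendered text does not.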
 

gxp500

Gawd
Joined
Mar 4, 2015
Messages
865
They don't even mention using Chroma Subsampling on their spec page.
Oops soooooorry we forgot...

 

AlphaAtlas

[H]ard|Gawd
Staff member
Joined
Mar 3, 2018
Messages
1,713
The best you can get with RGB or YCbCr 4:4:4 is 4K 120 Hz with 8 bpc color depth, or 4K 98 Hz with 10 bpc color depth (HDR).

98 Hz isn't too bad.

Still, we're talking about $2000+ monitors. If I were to drop that much money on a 27" monitor, I'd be pissed about having to compromise.

Holy crap, power consumption indeed. 180 watts?

That's what you get with an HDR backlight. In theory, with FALD, it should only peak at 180 watts, not eat that much all the time.
 

thecold

Gawd
Joined
Nov 12, 2017
Messages
849
98 Hz isn't too bad.

Still, we're talking about $2000+ monitors. If I were to drop that much money on a 27" monitor, I'd be pissed about having to compromise.



That's what you get with an HDR backlight. In theory, with FALD, it should only peak at 180 watts, not eat that much all the time.

If I purchased a 2000 dollar monitor, I'd be sure to know what the actual specs are before buying it...
 

AlphaAtlas

[H]ard|Gawd
Staff member
Joined
Mar 3, 2018
Messages
1,713
If I purchased a 2000 dollar monitor, I'd be sure to know what the actual specs are before buying it...

Thing is, Asus and Acer weren't clear about listing this little caveat. And I bet 98 Hz isn't a default frequency; you probably have to set a custom, technically unsupported resolution.

I'd research the hell out of it myself, but I'd say the vast majority of people (even those who drop $2k on a 27" monitor) don't meticulously crawl through posts on forums and Reddit before making their purchase.
 

Azphira

[H]ard|Gawd
Joined
Aug 18, 2003
Messages
1,854
The $2000 is for the carefully placed dirt between the LCD substrate and diffuser, or the exquisitely crafted defective subpixels.
 

Nunu

Limp Gawd
Joined
Jun 5, 2017
Messages
257
Wow, that's a shame. If it's a purely technical issue, then don't release the damn thing yet; work on it. If it's a cost-cutting measure, then make a top model and sell it at a higher price.
 
Joined
Jun 30, 2017
Messages
35
Holy crap, power consumption indeed. 180 watts?

These new 4K 144Hz monitors even have fans... yes, you will hear your monitor. I saw a video previously posted on [H] where the Acer was bad; the Asus (not posted here, just a reviewer's post on a forum) also has a fan, but supposedly a better, less noisy one.

And I'm not surprised about subsampling; there isn't any spec that can handle the necessary bandwidth. They should be open about it, but why would they be if they can get away with it? With only a post here and there, they almost are getting away with it; most people won't notice and will just be happy with "the best"...
 

Slade

2[H]4U
Joined
Jun 9, 2004
Messages
2,794
I'm so glad I pulled the trigger on the PG27 ROG monitor all those months ago instead of waiting for this disaster of a monitor.
 

Shadowed

Limp Gawd
Joined
Mar 21, 2018
Messages
506
I am gonna do my next build around a monitor that can do 4k144hz 4:4:4 HDR10.
I did the same thing with 4k60 4:4:4 several years ago.

Such a bummer about current 4k144hz monitors. If anything, I am gonna try to get a 21:9 5040x2160 144hz monitor, but that ain't happening for a long time.

I am gonna play my FPS twitch games with a low res high refresh monitor and stick with 4k60 for single player and RTS games.
 

Jim Kim

2[H]4U
Joined
May 24, 2012
Messages
3,826
With aging eyes (farsighted), a dash of astigmatism, and a sprinkling of color blindness, I am gonna save thousands on my next monitor; I'll probably get a big honkin' TV instead.
 

Upgrayedd

Limp Gawd
Joined
May 17, 2018
Messages
136
Anyone know the highest refresh rate the 200Hz 3440x1440 monitors could run at in 10-bit HDR?
 

phillyboy

[H]ard|Gawd
Joined
Jun 3, 2006
Messages
1,203
There isn’t enough bandwidth in DisplayPort 1.4 to allow for 4K 144Hz 10-bit 4:4:4.

Now, we're seeing it in these 4K 144 Hz monitors. With full RGB or YCbCr 4:4:4 color, DisplayPort 1.4 provides enough bandwidth for up to 120 Hz at 4K (3840 × 2160) with 8 bpc color depth, or up to around 100 Hz at 4K with 10 bpc color depth (exact limits depend on the timing format, which depends on the specific hardware; in these particular monitors, they apparently cap at 98 Hz at 4K 10 bpc). These monitors claim to support 4K 144 Hz with 10 bpc color depth, so some form of bandwidth reduction must be used, which in this case is YCbCr 4:2:2.

They could be more upfront about it. I would say in a game it won't matter, but for desktop use you'll notice.
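For reference, here is that math as a back-of-the-envelope Python sketch. DP 1.4's 25.92 Gbit/s effective data rate comes from the spec (32.4 Gbit/s raw, minus 8b/10b encoding overhead); the blanking sizes are assumptions in the spirit of CVT-R2 reduced blanking, since, as the quote says, exact limits depend on the timing format:

[CODE]
DP14_DATA_GBPS = 25.92  # HBR3 x4 lanes: 32.4 Gbit/s raw minus 8b/10b overhead

def required_gbps(width, height, hz, bits_per_pixel, h_blank=80, v_blank=62):
    """Link bandwidth a given timing needs, in Gbit/s.
    h_blank/v_blank are assumed CVT-R2-style values, not exact timings."""
    return (width + h_blank) * (height + v_blank) * hz * bits_per_pixel / 1e9

for label, bpp, hz in [
    ("RGB 8 bpc, 120 Hz",          24, 120),
    ("RGB 8 bpc, 144 Hz",          24, 144),
    ("RGB 10 bpc, 98 Hz",          30,  98),
    ("RGB 10 bpc, 144 Hz",         30, 144),
    ("YCbCr 4:2:2 10 bpc, 144 Hz", 20, 144),
]:
    need = required_gbps(3840, 2160, hz, bpp)
    verdict = "fits" if need <= DP14_DATA_GBPS else "does NOT fit"
    print(f"4K {label}: {need:5.1f} Gbit/s -> {verdict} in DP 1.4")
[/CODE]

4:2:2 at 10 bpc averages 20 bits per pixel because each pair of pixels shares one set of chroma samples, which is exactly the headroom these monitors need to reach 144 Hz.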
 

MrDeaf

Limp Gawd
Joined
Jun 9, 2017
Messages
428
So, is this a case of nVidia's G-sync being exclusive to DP 1.4 connections biting them in the ass?

Because I am reading here that HDMI 2.1 can support 4K 144Hz 10bit color depth with some video bandwidth to spare.

It also makes me wonder why nVidia didn't run their "DP" connection out of spec, since they provide the G-sync controller in the monitor as well, thereby having total control over the video connection.
 

noko

Supreme [H]ardness
Joined
Apr 14, 2010
Messages
6,465
So, is this a case of nVidia's G-sync being exclusive to DP 1.4 connections biting them in the ass?

Because I am reading here that HDMI 2.1 can support 4K 144Hz 10bit color depth with some video bandwidth to spare.

It also makes me wonder why nVidia didn't run their "DP" connection out of spec, since they provide the G-sync controller in the monitor as well, thereby having total control over the video connection.
I would think so too. Is anyone here sure that it is not the G-Sync controller that is forcing the subsampling at high refresh rates?
 

Meaker

Official representative of powernotebooks.com!
Joined
Jan 10, 2004
Messages
924
I would think so too. Is anyone here sure that it is not the G-Sync controller that is forcing the subsampling at high refresh rates?

The above calculations are based on the bandwidth of DP 1.4.
 

Sycraft

Supreme [H]ardness
Joined
Nov 9, 2006
Messages
4,962
So, is this a case of nVidia's G-sync being exclusive to DP 1.4 connections biting them in the ass?

Because I am reading here that HDMI 2.1 can support 4K 144Hz 10bit color depth with some video bandwidth to spare.

There are no HDMI 2.1 transmitters to be had though, nor cables. They've set a standard for 48 Gbps signaling, but that doesn't mean there are commercial products that can use it. Right now one company has demonstrated a 2.1 transceiver, though it isn't available in commercial quantities; 2019 is the earliest you'd see it integrated into a product. Likewise, nobody is making 48G cables, and as of yet there isn't guidance on how they'll be made, so manufacturers can't even start. They are likely to be pricey though; have a look at QSFP direct attach cables. Those do 40 Gbps, and they are a fair bit of cash.

So nVidia's choices are HDMI 2.0 or DP 1.4, the latest interface standards currently available on the market, both of which the latest GeForces have on chip. HDMI 2.0 caps out at 18 Gbps (14.4 effective data rate), while DP 1.4 goes to 32.4 Gbps (25.92 effective). So for 144Hz 4K it is DP or nothing right now. HDMI 2.0 can't do it even with 4:2:0 sampling. DP can do it with either 4:2:2 sampling or with DSC.

Something to keep in mind with these new high-framerate, high-resolution monitors is that transceiver and interconnect speeds are a real issue, and one not easily overcome. When you want a lot of pixels, that takes a lot of data, and transmitting that data is hard if you want to do it cheap. If you want to see some of the issues and prices, as I noted, look to Ethernet. Ethernet does 40 and 100 Gbps over 4 simultaneous data channels (same as HDMI/DP), but there is some expensive equipment involved. For consumer electronics they want to keep things cheap, which means keeping the signaling and transceivers simple, which is hard.
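To put a number on the HDMI 2.0 point: even 4:2:0 at 8 bpc averages only 12 bits per pixel, and a quick check (same assumed CVT-R2-style blanking as the sketch above) shows 4K 144 Hz still overflows HDMI 2.0's 14.4 Gbit/s effective rate:

[CODE]
# 4:2:0 at 8 bpc averages 12 bits/pixel; blanking sizes are assumptions.
need = (3840 + 80) * (2160 + 62) * 144 * 12 / 1e9
print(f"{need:.1f} Gbit/s needed vs 14.4 Gbit/s usable on HDMI 2.0")  # ~15.1
[/CODE]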
 

Armenius

Fully [H]
Joined
Jan 28, 2014
Messages
27,762
Granted, this should have been clearly advertised, but those of us who have been anticipating this monitor and browsing the ROG forum were well aware of this weeks before the preorders went live. If it really bothers anyone, then just run it at 98 Hz. It's not like you're going to get close to 144 FPS in any modern games at this point, anyway. You can still have full chroma in SDR at up to 120 Hz.
 

whatevs

Limp Gawd
Joined
Jun 23, 2017
Messages
199
Granted, this should have been clearly advertised, but those of us who have been anticipating this monitor and browsing the ROG forum were well aware of this weeks before the preorders went live. If it really bothers anyone, then just run it at 98 Hz. It's not like you're going to get close to 144 FPS in any modern games at this point, anyway. You can still have full chroma in SDR at up to 120 Hz.

Why would you pay several thousand dollars to build a high-refresh-capable PC and then the same for a monitor?
When the next-gen graphics cards come out, capable of pushing high/ultra 4K at high refresh rates, the monitor still won't be able to take advantage of it in a non-lossy way.
And, same deal, when new graphics cards with newer HDMI/DP specification ports come out, the monitor still won't be able to take advantage of that to offer non-lossy high refresh rates.

It's a cross between dishonest advertising by marketing and the user's own fault for being ill-informed. Standard super-hype, pump-and-dump scheme.

I'm still waiting for a Rec 2020 capable, super-high-refresh 1080p gaming monitor. Not that I mind 4K, but people keep forgetting: 4K's pixel count allows a brute-force/stupid way of scaling graphics quality a bit. Yet no game can match the pristine quality of real-world 1080p video.
 

Sycraft

Supreme [H]ardness
Joined
Nov 9, 2006
Messages
4,962
[QUOTE="whatevs, post: 1043685264, member: 303250"Im still waiting for a Rec 2020 capable super high herz 1080p gaming monitor. Not that i mind 4k, but people keep forgetting, 4k pixels allows brute strength/stupid way of scaling graphics quality a bit. Yet, no game can match the pristine quality of any real world 1080p video.[/QUOTE]

Rec 2020 specifies two resolutions, 4k and 8k.
 

whatevs

Limp Gawd
Joined
Jun 23, 2017
Messages
199
I'm still waiting for a Rec 2020 capable, super-high-refresh 1080p gaming monitor. Not that I mind 4K, but people keep forgetting: 4K's pixel count allows a brute-force/stupid way of scaling graphics quality a bit. Yet no game can match the pristine quality of real-world 1080p video.

Rec 2020 specifies two resolutions, 4k and 8k.
OK, Rec 2100 then... I meant the increased color space and friends...
 

MrDeaf

Limp Gawd
Joined
Jun 9, 2017
Messages
428
There are no HDMI 2.1 transmitters to be had though, nor cables. They've set a standard for 48 Gbps signaling, but that doesn't mean there are commercial products that can use it. Right now one company has demonstrated a 2.1 transceiver, though it isn't available in commercial quantities; 2019 is the earliest you'd see it integrated into a product. Likewise, nobody is making 48G cables, and as of yet there isn't guidance on how they'll be made, so manufacturers can't even start. They are likely to be pricey though; have a look at QSFP direct attach cables. Those do 40 Gbps, and they are a fair bit of cash.

So nVidia's choices are HDMI 2.0 or DP 1.4, the latest interface standards currently available on the market, both of which the latest GeForces have on chip. HDMI 2.0 caps out at 18 Gbps (14.4 effective data rate), while DP 1.4 goes to 32.4 Gbps (25.92 effective). So for 144Hz 4K it is DP or nothing right now. HDMI 2.0 can't do it even with 4:2:0 sampling. DP can do it with either 4:2:2 sampling or with DSC.

Something to keep in mind with these new high-framerate, high-resolution monitors is that transceiver and interconnect speeds are a real issue, and one not easily overcome. When you want a lot of pixels, that takes a lot of data, and transmitting that data is hard if you want to do it cheap. If you want to see some of the issues and prices, as I noted, look to Ethernet. Ethernet does 40 and 100 Gbps over 4 simultaneous data channels (same as HDMI/DP), but there is some expensive equipment involved. For consumer electronics they want to keep things cheap, which means keeping the signaling and transceivers simple, which is hard.

Thanks, I didn't know that HDMI 2.1 wasn't out yet

Hopefully these 4K 10-bit 144Hz monitors will only require a firmware update to move to the next DP standard when it comes out.
 

Sycraft

Supreme [H]ardness
Joined
Nov 9, 2006
Messages
4,962
Thanks, I didn't know that HDMI 2.1 wasn't out yet

Hopefully these 4K 10-bit 144Hz monitors will only require a firmware update to move to the next DP standard when it comes out.

Companies are great about bragging about new standards long before they are on the market. That's the wonderful state of marketing these days, so it's understandable you might think it was out, given how they hype shit.

As for a firmware update, almost certainly not. Faster data rates require new chips. While you can sometimes do a firmware update for a feature, higher data rates are going to require new silicon.
 

whatevs

Limp Gawd
Joined
Jun 23, 2017
Messages
199
Likely not. But HDMI 2.1 is now packet-based like DisplayPort, so the DP advantage seems somewhat gone. Maybe the delay of the new DP spec is manufacturers pushing back against maintaining two standards. Anyway, reading the details, HDMI had to take over another channel; it's not a pure clock-rate increase: four data channels instead of three, with the old clock channel repurposed as a data lane. If they want to keep the same port/cable (but better quality cables) like HDMI does, it's very unlikely that the old hardware was designed to run a high-rate data channel on that pin/path. Having older paths run faster, yes, but running data on a new path, unlikely.

Edit: had to add, DP has one super major advantage: the regular-sized DisplayPort connector is better than HDMI's. The HDMI connector is good for once-a-year fiddling; the full-size DP connector is just plain strong and reliable.
 

Lakados

Supreme [H]ardness
Joined
Feb 3, 2014
Messages
4,392
Thanks, I didn't know that HDMI 2.1 wasn't out yet

Hopefully these 4K 10-bit 144Hz monitors will only require a firmware update to move to the next DP standard when it comes out.
HDMI 2.1 was so much better than the next proposed DP spec that they had to cancel its launch and take it back to the drawing board; it isn't expected until either late 2018 or early 2019. I doubt they will issue a firmware patch for an old product that was itself delayed by almost a year.
 