The perfect 4K 43” monitor! Soon! Asus XG438Q ROG

Well, there's still a demarcation between the two, also separated by an expected price difference of around 500 EUR. The XG438Q has only 120Hz and HDR 600 (edge-lit local dimming) but is available imminently. The XG43UQ uses DSC, so it will support up to 144Hz for a slight refresh bump, plus HDR 1000 (still edge-lit local dimming), but it won't be available for a while. Apparently maybe in Q4, but I expect it won't be until 2020 given other delays in recent times.

DSC on the XG43UQ isn't supported by all graphics cards, and you'd need to consider whether your system can actually drive the extra refresh rate at 4K. And then consider whether it's worth the extra 500 EUR to you.

It's about a 300 euro price difference; the XG438Q seems to be settling around 1200€. Both the new Nvidia and AMD cards should support DSC afaik. 300 euros is nothing for a display that is noticeably better in a lot of key specs.
 
Well, I guess Asus has more of the market covered then, a mid-range and a high-end segment with the 8Q and UQ variants. I don't think they are trying to outdate/replace the 8Q with the later UQ.
 
To some people, yes. But not everyone has the latest graphics card or wants to spend extra for a few more Hz. It remains to be seen whether the UQ will offer any real HDR performance improvements. Also, the price and release date are not finalised and will likely change.
 
That's why my money's on the LG 37.5". It just has solid specs using proven technologies.
Which LG monitor are you talking about exactly? I haven’t been paying attention to these displays for years, so I don’t know much about the current options.
 
Is the UQ mainly adding DSC / 10bit@4k120 and higher peak brightness HDR1000 or is there something else like FALD to make that HDR mode also higher in contrast ?
 

I think it's still edge-lit. No mention of FALD. It's also 144 Hz instead of 120.
 
If these were FALD they'd probably be more like $2000 to $2500 USD instead of $1200 to $1500 USD. I still wish they offered it as an option on another model.
 
Haha wow man, I am jealous. Yeah, as a race sim fan I know about the Rseat. If I get a new condo I would love to have the room for an 80/20 rig, and I have been wanting to upgrade my wheel to the DD1 or any direct drive for a while now.
You using a ClubSport or direct drive?

I have a variable drive motor from an old robot that I am using....
I am in the process of moving, and it's been disassembled for over 4 months now, and I am going crazy... But I'll for sure have it back up and running before winter hits.

I don't really need all the fuss it took to get the DD motor working, but I got it for free... so I made it work. I think I'll invest in a new Fanatec DD once it's re-assembled.
 
The UQ seems to be essentially the same monitor... I don't see any mention of the backlight solution changing, but it is brighter at HDR 1000, and it has DSC... and you're paying a few hundred extra for this, it seems. Worth it? We shall see. It seems odd to me that they didn't just release the UQ as the only model... diluting the product range like this doesn't seem to make a great deal of sense.
 

I fully expected them to just roll the UQ features into the 438Q when they demoed the then-unnamed model earlier this year. I would expect most of these to be sold to enthusiasts who are knowledgeable about the displays, but I guess it doesn't cost them all that much extra to make two versions, so they can still sell to those who might balk at the 1500 euro price tag of the UQ.
 
From TFT Central:


So far Asus have announced three 43″ sized screens, but with differing specs and features. They are, in order they were announced:

  1. XG438Q – the first to be announced, with 3840 x 2160 resolution, 16:9 aspect ratio, 120Hz, FreeSync 2, 750 cd/m2 peak brightness and HDR 600 certification. Expected to be available late August (although now maybe Sept?) with a UK MSRP of £1099
  2. XG43UQ – the first screen announced that will support DSC, allowing high resolutions and high refresh rates at the same time without needing to use chroma sub-sampling. This model has the same 3840 x 2160 16:9 aspect ratio VA panel but can offer a slightly higher 144Hz refresh rate and HDR 1000 certification instead, the higher refresh rate being made possible by the DSC. It is expected to use edge-lit local dimming as opposed to any FALD backlight option. According to some information coming out of Gamescom, this is expected to be available at 1,499 EUR in Q4 2019.
  3. XG43VQ – the model discussed in this news piece but without the meaningful HDR experience of the other two models. This one is 120Hz as well but is a 32:10 aspect ratio with 3840 x 1200 resolution.
The newly announced XG43VQ is expected to be available for 899 EUR although release date is not yet known.
 
Is there any reliable info on what DSC actually brings to the table, other than the 144Hz? I'm very unclear as to whether it would genuinely be worth waiting for this on the Asus, and worth the extra cost. Is this the first monitor to use DSC?
 
It seems like they are purposely holding the UQ back to keep from impacting Q sales, but then why announce it one day before the Q goes on sale?
 
DSC allows the display to do 4K/144 Hz with full 10 bit color and no chroma sub-sampling. Supposedly "visually lossless".

How much input lag it adds and if it is truly visually lossless compression remains to be tested.
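To put numbers on why DSC is needed at all, here's a rough back-of-the-envelope DP 1.4 bandwidth check. The blanking figures are my assumption (modeled loosely on CVT reduced blanking), so treat the results as approximate, not a spec calculation:

```python
# Rough DP 1.4 bandwidth check. The blanking intervals below are my assumption
# (modeled on CVT reduced blanking); exact timings vary by mode.

HBR3_EFFECTIVE_GBPS = 25.92  # DP 1.4 payload rate after 8b/10b line coding

def uncompressed_gbps(width, height, hz, bits_per_channel, h_blank=80, v_blank=62):
    """Approximate uncompressed video bandwidth in Gbit/s, RGB (3 channels)."""
    pixel_clock = (width + h_blank) * (height + v_blank) * hz
    return pixel_clock * bits_per_channel * 3 / 1e9

for hz, bpc in [(98, 10), (120, 8), (120, 10), (144, 10)]:
    bw = uncompressed_gbps(3840, 2160, hz, bpc)
    verdict = "fits" if bw <= HBR3_EFFECTIVE_GBPS else "needs DSC or sub-sampling"
    print(f"4K {hz:3d} Hz {bpc:2d}-bit: {bw:5.1f} Gbit/s -> {verdict}")
```

Under these assumptions, 4K 10-bit squeaks through at around 98Hz and 8-bit fits at 120Hz, while 120Hz/144Hz at 10-bit blows past HBR3, which is exactly the gap DSC is meant to close.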
 


I highly doubt it's going to be visually lossless to some people on this board. If you read VESA's information on DSC, their testing was based on random people comparing still images and text using subjective criteria. People here will likely notice some artifacts, as most people here are not even close to your average consumer. Hopefully they will implement it in a way where DSC isn't engaged unless the mode actually exceeds standard DP 1.4 bandwidth.
 
I wonder if DSC is something that you can enable and disable at will in order to observe any impacts to image quality, input lag, etc. If so, then the UQ would unequivocally seem like the one to get.
 
https://www.synopsys.com/designware...visually-lossless-compression-uhd-2017q3.html

The test standards that they used for the "volunteers" aren't exactly rigorous. Hell, half the people in a Best Buy wouldn't even be able to tell the difference between a 1080p movie and a 4K movie, so I am taking this "visually lossless" claim with a grain of salt.


Well, to be fair, at typical viewing distances on typical screen sizes almost no one can tell the difference between 1080p and 4k. :p

4k is great when I am sitting 2.5 ft away from a 48" screen, but from several feet away we are easily outresolving the human eye.
 
They should show close-up fidelity photo comparisons between:

- full 10bit 4:4:4 (60Hz - 98Hz)
- 8bit 4:4:4 (120Hz)
- 10bit 4:2:2 (120Hz)
- 10bit 4:4:4 DSC (120Hz, 144Hz)

Just as there have been close-up comparisons shown between things like pixel shapes (e.g. pentile and not) and other specs. The comparisons should cover color fidelity (whether any percentage of the regular 2D gamut, or of the "3D" HDR P3 color volume, is lost), plus any change in saturation, black depth, contrast, motion blur, input lag, and image clarity, especially text edges... and by how much.
"9 out of 10 noobs" is not showing us anything.
 

Ya typical screen distance for people who typically have no clue what they are doing.

I always laugh when I see people buy a 55" 4K TV and their sofa is 20 feet away.
 
DSC allows the display to do 4K/144 Hz with full 10 bit color and no chroma sub-sampling. Supposedly "visually lossless".

How much input lag it adds and if it is truly visually lossless compression remains to be tested.
From what I understand, it compresses the signal using the DSC compression algorithm so that, to the average viewer, it looks very much (sort of?) like the original 4K 144Hz 10-bit output. It is also meant for mobile devices, where one definitely wouldn't tell the difference due to the insane ppi.
The signal is not only encoded but also decoded by a dedicated chip installed in the display, albeit a small one, which adds "one raster scan line" of delay to the whole image processing timeline. That is less than 8 µs at 4K @ 60Hz, per the FAQ on VESA DSC.
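That "one raster scan line" figure is easy to sanity-check. Assuming roughly 2222 total lines per frame (2160 active plus reduced blanking, my assumption), one line at 60Hz works out under 8 µs:

```python
# Sanity check of the "one raster scan line" latency figure from the VESA DSC
# FAQ. Total lines per frame is my assumption: 2160 active + ~62 of blanking.

def line_time_us(refresh_hz, total_lines=2160 + 62):
    """Duration of one scan line in microseconds."""
    return 1e6 / (refresh_hz * total_lines)

print(f"4K @  60 Hz: one scan line ~ {line_time_us(60):.1f} us")
print(f"4K @ 144 Hz: one scan line ~ {line_time_us(144):.1f} us")
```

So even if the display-side decode really does cost a scan line, it's microseconds, orders of magnitude below anything you could feel as input lag; the open question is whatever buffering the vendor implementations add on top.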
 
I wish they would give some examples showing the difference DSC makes, like the many side-by-side photo examples of different levels of JPEG compression, or the examples of different chroma settings on text.
 
The problem is that even knowing the performance of the VESA standard DSC circuit isn't helpful, because it's almost certain that AMD/NVIDIA and ASUS won't be using the reference implementation, which lacks a lot of complexity. So even if reference DSC implementations have low lag, the ASUS and AMD/NVIDIA implementations might have significantly more. Particularly on the GPU end, because prior to the refresh rate craze, the use case for DSC was graphics professionals looking to work in 12 and 24 bit color spaces, where input lag didn't matter.
 
They mention 4:1 in the video, which sounds like 1080p upscaled-ish. Sure, there are some good 1080p source Blu-ray discs, but they still aren't 4K quality.

The modern DisplayPort standard specs say 3:1 compression, though. Which would be like 2560 x 1440 upscaled 3x to 4K.

They are also saying it's compressed like video from a streaming service. That could mean better than 1080p, using a 1440-ish compressed base plus smart compression algorithms, but still not as good as native 4K. I'd also think that in order to get any extra smart compression you'd need a buffered frame to work with, so the tech can figure out which parts get downgraded and which get more detail on a per-frame basis, and that could add some input lag. I'm curious what the actual methodology is at 120Hz - 144Hz and, like burburbur said, what Nvidia's implementation would be.

I'm still interested in seeing SBS comparisons (and zoomed in).

I'd really be interested in a comparison of

- integer-scaled 1080p 10bit 4:4:4
- upscaled 1440p 10bit 4:4:4

vs

- 4K 120Hz - 144Hz 10bit 4:4:4 DSC

Unlike image editing, you'd get a ton more performance out of a lower resolution for gaming than stressing your gpu at a full 4k resolution and using display compression.
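For a rough sense of scale on those ratios (my own arithmetic, not from any spec sheet): a 3:1 DSC stream actually carries fewer bits per frame than even uncompressed 1440p, yet unlike downscaling it still encodes every native 4K pixel, so "like 1440p upscaled" somewhat undersells what's on the wire:

```python
# My arithmetic, not from any spec sheet: per-frame payload at 10-bit RGB
# (30 bits per pixel), comparing a 3:1 DSC stream against lower resolutions.

def frame_mbits(width, height, bits_per_pixel):
    """Payload of one video frame in megabits."""
    return width * height * bits_per_pixel / 1e6

uhd = frame_mbits(3840, 2160, 30)   # uncompressed 4K
dsc = uhd / 3                       # 3:1 DSC -> effectively 10 bits/pixel
qhd = frame_mbits(2560, 1440, 30)   # uncompressed 1440p

print(f"4K uncompressed   : {uhd:6.1f} Mbit/frame")
print(f"4K DSC 3:1        : {dsc:6.1f} Mbit/frame")
print(f"1440p uncompressed: {qhd:6.1f} Mbit/frame")
```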
 
hi,
The new XG438Q is sitting on my desk.

First opinion:
+ great contrast, good colors (I am coming from an IPS Acer XB270HU). VA is not that bad after all :)
+ FreeSync 2 works great on my Nvidia card, LFC works brilliantly. I can't see any difference between FreeSync and G-Sync
+ HDR is great

Now the big minus:
- I don't know how to set 10bit and 98Hz. 10bit (and so HDR too) works only at 60Hz.
It's strange, it's not a DisplayPort limitation..
 
Have you tried any of the few games (like Assassin's Creed Odyssey) that specifically have a FreeSync 2 setting? Can you turn it on in the game settings?
 
HDR works fine in 8-bit. You may see slightly more banding, but it's not that big of a deal in my A/B tests.
I can't set it correctly. When the monitor is in 8bit mode, the Windows HDR option stays grayed out. HDR options in games behave the same way.
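On the 8-bit vs 10-bit banding point above, the gap is just the number of gradient steps per channel, which a quick calculation shows:

```python
# 10-bit quadruples the gradient steps per channel vs 8-bit, which is why
# banding in smooth HDR gradients is milder (though often subtle either way).
for bits in (8, 10):
    steps = 2 ** bits
    print(f"{bits}-bit: {steps} steps per channel, {steps ** 3:,} total colors")
```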
 
I did not; I don't have this game. I just leave v-sync off in all games and FreeSync works. Everything is nicely fluid, and LFC kicks in when the framerate drops below about 50 fps.
 
Pretty sure AC Origins and Odyssey use their own implementation of HDR; that's why I asked. I think some of the other new Ubi games have a "FreeSync 2" setting in the menu, I'm just not sure which ones. It would be interesting to see if they work with an Nvidia card.
 

I don't believe you can set 98Hz. Perhaps there is some software out there that would do it, but the NVIDIA control panel doesn't support that sort of granular control on any panel I've ever used it with.
 

Any issues with ghosting on the VA panel?

Have you tried setting a custom resolution with 98 Hz? It's odd that it's not supported, considering ASUS made the PG27UQ.
 
I see a little ghosting, but it's acceptable for me.
I tried a custom resolution, but it always runs in 8 bit. I just don't know how to force 10 bit.. maybe it's an Nvidia limitation??
 
A little ghosting huh......GAWD I hate LCD!
 
Unless he's running in the neighborhood of a solid 120fps at 120Hz, cutting the sample-and-hold blur by 50% compared to 60fps@60Hz, the sample-and-hold blur at speed will trump VA ghosting and make it mostly a moot point at the lower fps ranges.


This chart lists the amount of pixel blur per solid frame rate (not average), on a 1000Hz monitor where your frame rate is the only limitation.




https://www.blurbusters.com/blur-bu...000hz-displays-with-blurfree-sample-and-hold/

https://www.blurbusters.com/faq/oled-motion-blur/


On monitors with maximums above 120Hz, VA again loses the sweet spot when ranging higher than 120fps on a 144Hz+ panel too.. so the best case is to stay at a 100fps common low or more and cap it at 117 - 120fps. More realistically, around a 100fps average with a 117 - 120fps cap. That said, VA overdrive implementation quality can vary a lot between manufacturers and models, so it's not cut and dried there either.
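The sample-and-hold relationship behind that chart is just motion speed times frame duration; a minimal sketch, assuming a 1000 px/s pan on a full-persistence display (the Blur Busters rule of thumb):

```python
# Eye-tracked blur trail width on a full-persistence sample-and-hold display:
# blur (px) ~ motion speed (px/s) * frame time (s). Doubling a solid frame
# rate halves the blur, which is why 120fps@120Hz matters so much here.

def blur_px(speed_px_per_s, fps):
    """Approximate blur trail width in pixels at a solid (not average) fps."""
    return speed_px_per_s / fps

for fps in (60, 120, 144):
    print(f"{fps:3d} fps solid: ~{blur_px(1000, fps):4.1f} px blur at 1000 px/s")
```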
 

Hmm that is odd. I've never had a monitor in which I couldn't turn on HDR in 8-bit color mode. I literally just did it now. Something else may be afoot.
 