The perfect 4K 43” monitor! Soon! Asus XG438Q ROG

Not many "AAA games" (been watching too much Jim Sterling on YT) support SLI anyways. I suppose it's still good if you're into overclocking (which I'm not).

That's why these displays are future investments for me. Yes, I want the adaptive sync features with my 2080Ti but it might pay for me to wait for a full 10bit HDR 120Hz display with those features.

That said, I really am not happy using my 2015 Samsung 48" TV as my everyday display anymore. Really want a decent PC-focused 4K display with adaptive sync.


Yeah, I've done multi-GPU three times. Once way back in the day with Voodoo2 cards. That actually worked fairly well.

Then I did it again in 2010 when I was an early adopter of the 30" 2560x1600 screens, and no single GPU on the market was fast enough to render that resolution well at 60fps. I ran dual triple-slot Asus DirectCU II Radeon 6970s. Sure, average frame rates went up, but the experience was miserable: buggy, games crashed or were unstable, stutter, frame skipping, and the all-important minimum frame rates pretty much stayed put, or only improved very little. So it essentially became a Toyota Supra dyno queen: lots of high performance when it wasn't really needed, and no additional performance when it really mattered during minimum-fps conditions.

I upgraded to a single 7970 as soon as it launched, and it was a clear improvement, despite only being 20% faster than each of the 6970's. I swore I would never do multi-GPU again.

Then 2015 happened. I was an early 4k adopter with my current Samsung TV as a screen. No single GPU was sufficiently fast, so I decided to try dual 980ti's. Maybe Nvidia did it better than AMD? Nope. All the same problems. As soon as the Pascal Titan X launched I jumped on it.

So my take is, multi-GPU has never been good in the modern era. It has always been a flawed solution. Today far fewer titles support it, and that's because it sucks. The experience is a total downgrade from any single-GPU solution and has been since SLI actually stood for scanline interleave.
 
Just ordered mine on impulse because it showed in stock at a local retailer here in Australia.

Looking forward to it, but I know I will take a massive sacrifice in visuals coming from my OLED E6.


I'm curious how many users on here actually run SLI 2080Ti's? Not many I'd imagine.

I was a big proponent of SLI'd GPUs but decided to skip it with my 1080Ti (now have a 2080Ti). There really isn't much point in it anymore.


Doesn't RTX use NVLink and not an SLI bridge?

I am considering getting a 2nd RTX 2080 Ti but I am a bit worried about the increased input lag from a 2nd GPU that you get with SLI. I have been informed by Nvidia that there is little to no extra input delay with NVLink multi-GPU. Not sure how true that is, anybody here know the answer to that?
 
Just ordered mine on impulse because it showed in stock at a local retailer here in Australia.

Looking forward to it, but I know I will take a massive sacrifice in visuals coming from my OLED E6.

Nice! Price if you don't mind me asking?

I'm hoping that one of these combined with my B7 will give me the best of both worlds. I simply do not think that I can go back to playing dark/horror titles on an LCD. So one for velvety smooth/fast 4K gaming and one where image quality and immersion is of utmost importance.

Hopefully the 2020 LG OLEDs with 120Hz @ 4K and VRR will allow me to have just one display that combines most of what the two will do individually, but given the fact that these appear to be releasing for much less than what we initially thought, I'm thinking that I'm going to have to try one out.
 
Doesn't RTX use NVLink and not an SLI bridge?

I am considering getting a 2nd RTX 2080 Ti but I am a bit worried about the increased input lag from a 2nd GPU that you get with SLI. I have been informed by Nvidia that there is little to no extra input delay with NVLink multi-GPU. Not sure how true that is, anybody here know the answer to that?


It does but I just call it SLI out of habit. It's just not worth it anymore.
 
Hopefully the 2020 LG OLEDs with 120Hz @ 4K and VRR will allow me to have just one display that combines most of what the two will do individually, but given the fact that these appear to be releasing for much less than what we initially thought, I'm thinking that I'm going to have to try one out.


The LG VRR solution isn't compatible with Freesync/G-Sync though... that's the problem.
 
I had 2080 Ti NVLink aka SLI and it was okay at best.

While it did work for some games, I found that after I removed one card, performance was still fine.

I guess it depends what you are doing or which games you play. Overall though, it's not worth the price.
 
The LG VRR solution isn't compatible with Freesync/G-Sync though... that's the problem.

Yeah, that's a bummer. Do you think that there will be any tangible benefit to be had from the VRR, though? Forgive my ignorance on this subject - I've been using large 4K displays since 2015 and haven't closely followed all of the VRR/FreeSync/G-Sync limitations and how they work together.

I know it won't be the absolute best gamer's monitor due to the lack of FreeSync etc. but between HDMI 2.1, high frame rate support at 4K, and the reduced input lag I see it as a definite upgrade from what I'm using now.
 
My reply wasn't only about SLI.
Another point was that even if you aren't getting 120fps solidly, using VRR/FreeSync with a high enough frame rate graph will get some benefit out of going into the 100fps/100Hz to 120fps/120Hz ranges if your frame rate average spans up into that. 95fps at 98Hz would definitely still get some benefits though.

There are games that can get a 100fps average with a single high-end GPU at 4K, depending on your settings and what game it is, but the most demanding games at the highest settings can't get near it. A frame rate graph running under a 100fps average (~70 <<<100>>> 120) isn't really getting much, if anything, out of the higher Hz capability of a high-Hz monitor. You don't get to 40-50% blur reduction and a 5:3 to 2:1 motion definition increase until 100fps to 120fps (rates, not averages).
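If anyone wants to check that math, here's a rough sketch of where those percentages come from (assuming a plain sample-and-hold panel with no strobing/BFI, and that your frame rate matches the refresh rate):

Code:
# Quick check of the blur reduction / motion definition figures above.
# On a sample-and-hold panel each frame persists for 1/fps seconds, so
# perceived motion blur scales with persistence; the reduction versus a
# 60fps / 60Hz baseline is simply 1 - 60/fps.
BASELINE_FPS = 60

for fps in (70, 98, 100, 120):
    persistence_ms = 1000 / fps
    blur_reduction = 1 - BASELINE_FPS / fps
    motion_definition = fps / BASELINE_FPS   # unique frames shown vs the 60fps baseline
    print(f"{fps:>3} fps: {persistence_ms:4.1f} ms persistence, "
          f"{blur_reduction:5.1%} less blur, {motion_definition:.2f}x motion definition")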


I skipped the 2000 series gpus as they are incremental and still not hdmi 2.1, and I have zero interest in running lower than 100fps on a 120hz+ monitor (getting little to no benefit out of the higher Hz).. so RTX is pretty much useless to me, let alone at 4k 120hz.

The price on these displays is tempting especially since the 43" size would fit my array perfectly, but I feel like it's planned obsolescence once hdmi 2.1 displays come out (in yr 2020?) and a die shrink + hdmi 2.1 output gpus finally come out eventually someday.

So I could buy one of these for xmas 2019 perhaps, but I'd be eager to get an hdmi 2.1 one (FALD HDR even better) in the same form factor if they make one someday. I usually don't drop over $1k a pop on monitors per year so that is a big factor :b I guess I could just push the real hdmi 2.1 display and hdmi 2.1 die shrink gpu upgrades back another year to like end of 2021 and keep this monitor for a few years, buying it if the reviews are good.

Breadcrumb road maps.. ugh. Really it's kind of putting the cart before the road and the horse at this point.. the monitor (the cart), before the road (hdmi 2.1), and the horse (die shrink + hdmi 2.1 gpu)... but very tempting.
 
If anyone was wondering, it cost me $1600 AUD.

But now the webpage shows the item is out of stock with an ETA of 23/08, and my order is still in processing. So if I missed out on this batch I am getting a refund and possibly waiting for something else.
 
Yeah, that's a bummer. Do you think that there will be any tangible benefit to be had from the VRR, though? Forgive my ignorance on this subject - I've been using large 4K displays since 2015 and haven't closely followed all of the VRR/FreeSync/G-Sync limitations and how they work together.

I know it won't be the absolute best gamer's monitor due to the lack of FreeSync etc. but between HDMI 2.1, high frame rate support at 4K, and the reduced input lag I see it as a definite upgrade from what I'm using now.


Potentially... obviously the deep OLED blacks will be nice, and if input lag and latency are low enough, I don't see why it would be bad... but no question it's imperfect until we get something better suited for PC.
 
So, it looks like both the ASUS and the Acer are being released "before prime time". The ASUS has FreeSync 2 but can't handle 4K@120 in 10-bit, while the Acer uses 2 DP cables to handle 4K@144 in 10-bit but does not support FreeSync 2. Maybe I should just get a Samsung Q60/70 and call it a day until next year.
 
Racing Simulator: Rseat N1 (scroll down to near bottom, to see all attachments... and transducers, etc.)
Haha wow man I am jealous, yeah as a race sim fan I know about the Rseat. If I get a new condo I would love the room for an 80/20 rig, and I have been wanting to upgrade my wheel to the DD1 or any direct drive for a while now.
You using a clubsport or direct drive?
 
So, it looks like both the ASUS and the Acer are being released "before prime time". The ASUS has FreeSync 2 but can't handle 4K@120 in 10-bit, while the Acer uses 2 DP cables to handle 4K@144 in 10-bit but does not support FreeSync 2. Maybe I should just get a Samsung Q60/70 and call it a day until next year.
Damn, we are always just one little step away from perfect.
But yeah, I really don't want a screen without 10-bit colour and 120Hz anymore, as seeing the halo in blacks with my 8-bit screen drives me nuts.
I assume the halo I see is because of the colour bit depth, at least that's what I read.
 
So, it looks like both the ASUS and the Acer are being released "before prime time". The ASUS has FreeSync 2 but can't handle 4K@120 in 10-bit, while the Acer uses 2 DP cables to handle 4K@144 in 10-bit but does not support FreeSync 2. Maybe I should just get a Samsung Q60/70 and call it a day until next year.

Asus is releasing another monitor, the XG43UQ, that will supposedly do 144 Hz at 4k with one DP1.4 connection using compression. However, I think the DSC standard they are using is only supported by AMD so it seems somewhat pointless given that by the time AMD has a GPU capable of driving it, HDMI 2.0 should be everywhere.
 
Ya I am trying to find out if NVIDIA will allow DSC but coming up short.
 
I believe NVIDIA's Quadro cards support DSC (it's pretty commonly used in professional applications where you want more than 10-bit color). It requires its own hardware decoder, so if the current GTX/RTX consumer cards don't have it, I doubt it will be added, given that the next generation cards should support HDMI 2.1.
 
Asus is releasing another monitor, the XG43UQ, that will supposedly do 144 Hz at 4k with one DP1.4 connection using compression. However, I think the DSC standard they are using is only supported by AMD so it seems somewhat pointless given that by the time AMD has a GPU capable of driving it, HDMI 2.0 should be everywhere.

Well, HDMI 2.0 does not solve this problem.

With HDMI 2.0 you are limited to 8bit 60hz at 4k.
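Rough numbers, assuming the standard CTA-861 4K60 timing (4400x2250 total including blanking, i.e. a 594 MHz pixel clock) and HDMI 2.0's ~14.4 Gbps of usable video data:

Code:
# Rough check of the HDMI 2.0 limit. Assumes the standard CTA-861 4K60
# timing (4400 x 2250 total pixels including blanking = 594 MHz pixel
# clock) and HDMI 2.0's ~14.4 Gbps of usable video data (18 Gbps raw,
# 8b/10b encoded).
PIXEL_CLOCK_4K60 = 4400 * 2250 * 60        # 594,000,000 pixels/s
HDMI20_USABLE_GBPS = 14.4

for bits in (8, 10, 12):
    gbps = PIXEL_CLOCK_4K60 * bits * 3 / 1e9   # RGB / 4:4:4 = 3 channels per pixel
    verdict = "fits" if gbps <= HDMI20_USABLE_GBPS else "needs chroma subsampling"
    print(f"4K60 {bits}-bit 4:4:4: {gbps:.2f} Gbps -> {verdict}")

So 8-bit RGB just squeaks in at 4K60, while 10-bit has to drop to 4:2:2/4:2:0.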
 
If that's the case, it might be worth waiting for the DSC version over this 120Hz version. But, I doubt they'd release it so close to such a similar product so we probably won't see it for another year.
 
DSC is part of the Displayport 1.4 spec.

I don't know, I might stick with my Samsung 48" 2015 curved display for a while to see how things pan out on the display front.

Really am tired of it- while it looks decent enough, I would like a dedicated PC display.


I'm in the same boat with my 2015 JS9000 Sammy. I REALLY want some modern features, like some form of VRR, but I also don't want to wind up regretting jumping in too early. I tend to keep monitors about 5 years on average. I don't want to spend until 2024 wondering if I made a mistake by not waiting for 10-bit at 120Hz....

I'm still thinking about it. I'm going to wait for well written reviews of this unit and then make up my mind.
 
Asus is releasing another monitor, the XG43UQ, that will supposedly do 144 Hz at 4k with one DP1.4 connection using compression. However, I think the DSC standard they are using is only supported by AMD so it seems somewhat pointless given that by the time AMD has a GPU capable of driving it, HDMI 2.0 should be everywhere.

Here's the product page for that display: https://www.asus.com/Monitors/ROG-Strix-XG43UQ/
It's pretty much just a placeholder, but it shows that it is coming and that it will be more expensive, as all ASUS UQ models are. How much more expensive remains a mystery.

Q4 might be a possibility as the smaller XG27UQ is expected to release around that time according to this. The XG43UQ is being demonstrated at Gamescom according to this.

Here's some info about DSC vs chroma subsampling vs 2 DP connections: https://www.trustedreviews.com/news/asus-4k-gaming-monitors-dsc-3931633

God damn it ASUS, I was ready to go for the XG438Q but if they are going to release the XG43UQ later this year or early next year it might be a better option to wait.
 
HDR/10bit color is so overrated. It just isn't going to matter in any substantial way for years. It's not a reason to buy or not buy a monitor. I think people should be more focused just on solid fundamentals.

That's why my money's on the LG 37.5". It just has solid specs using proven technologies.

It's just bizarre to me that so many people seem worried about HDR when almost every contemporary movie blows anyway and you'll probably end up playing WOW Classic more than anything else.
 
8-bit vs 10-bit -- or rather, you can't get 4:4:4 chroma at over 98Hz without hdmi 2.1, so it's not just 10-bit HDR, though on a monitor marketed as HDR it's not unreasonable for that to be a consideration. It's also marketed as 120Hz, and 98Hz isn't quite as good. Compared to a 60fps at 60Hz+ baseline, that's ~40% blur reduction at 100fps at 100Hz+, and 50% blur reduction plus a full 2:1 doubling in the number of frames of motion definition at 120fps at 120Hz+. Even if you are at around a 90 to 100fps average, you'll get increased benefit from your frame rate graph fluctuating another 20 to 30fps/Hz higher using VRR.

TFT Central's review of the PG27UQ goes into the tradeoffs pretty well, though some of us don't agree that "it's no big deal", "it might not make a real difference", "not THAT noticeable", "in some cases you may not see a real difference" and similar possibly/maybe takes are acceptable on a $1k+ monitor.

This is a compromise because as I said a few posts back...

"Really it's kind of putting the cart before the road and the horse at this point.. the monitor (the cart), before the road (hdmi 2.1), and the horse (die shrink + hdmi 2.1 gpu)... but very tempting."

We all know 4k 120Hz+ ultimately really needs a wider road (hdmi 2.1) and a more powerful horse with the tackle (hdmi 2.1 output) and the strength to use that road and pull 4k 120Hz.
Until then it's these tradeoffs: 98Hz, 8-bit, or 4:2:2, plus lower game settings to get higher-Hz benefits at 4k resolution on the most demanding games.

PG27UQ TFT Central Review Outtakes Regarding bandwidth limitation tradeoffs*
================================================================

https://www.tftcentral.co.uk/reviews/asus_rog_swift_pg27uq.htm#chroma

"
As we explained earlier, 10-bit colour depth support is only applicable for gaming on this display given the limitations of Geforce gaming graphics cards, and not for professional applications like Photoshop. This is useful for HDR gaming where 10-bit colour depth is more commonly used and a lot of other games may not even support 10-bit anyway. No colour compression is needed for refresh rates up to 98Hz anyway so 10-bit is there if you need it.

If you want to push the screen up to high refresh rates >98Hz then some kind of colour compression is required so that it can fit within the bandwidth capabilities of DP 1.4. There are two ways this can be achieved


1) Drop from 10-bit colour depth to 8-bit - this might not actually make any real difference for many games, especially if they are non-HDR games or just simply don't support 10-bit colour depth. In a game where 10-bit is supported, or in HDR gaming you may see some improvements in colour gradients when using 10-bit over 8-bit but then again in some cases you may not see much real difference. For 120Hz refresh rates you can drop the colour depth to 8-bit and then not worry about the chroma sub-sampling discussed below.



2) Use Chroma Sub-sampling - this is a method for compressing the colour information in a signal to save on bandwidth, without significantly impacting the picture quality in many cases. This avoids the need to reduce the luminance information (luma) in the signal which would have a more noticeable impact on picture quality, and this method can help reduce the file size by a significant amount. "

------------------

Does Chroma sub-sampling make a difference?


You will see that when you switch to YCbCr422 mode (4:2:2 chroma sub-sampling) that the 'output dynamic range' setting in the NVIDIA control panel also switches to 'limited'. We know from some previous screens that often when the graphics card gets accidentally set in limited range, it has a huge impact on the black depth and contrast ratio of the display. That is because a limited RGB range (16 - 235 instead of the normal 0 - 255) clips some dark shades and some bright shades. We were initially concerned about this on the PG27UQ but we need not have worried.


When running at 4:2:2 chroma mode there was no noticeable impact to the setup of the screen. Running a test with our i1 Pro 2 showed the same gamma curve, white point and low dE that we'd seen out of the box by default. We measured a static contrast ratio of >1000:1 as well, and carrying out visual tests of the darkest black/grey and brightest grey/white shade samples which showed no noticeable difference between 4:4:4 mode and 4:2:2 mode. There was no limiting of the contrast ratio here which was great news and it appears not to be operating in any limited dynamic range. This applied to SDR content as well as HDR.


Where there is an observable difference is when viewing text. For normal day to day PC use like office documents etc, fonts sometimes look more blurred and a little broken in places in 4:2:2 mode, particularly with text on solid coloured backgrounds. This is more noticeable the smaller the fonts get. The compressed colour data makes reading text a problem sometimes if running at 4:2:2 mode although to be honest it's quite slight, and only in certain conditions that you'd really see it. A lot of the time you'd have to go specifically looking for it. This text blurring and clarity issue, where visible, is a commonly observed and known side-effect of the lower chroma sub-sampling on displays.

================END TFT CENTRAL QUOTEs==================

*Note that the PG27UQ is a 384-zone FALD monitor, so its regular use of dynamic contrast mode/black depth via FALD, ordinarily even in SDR content, means its behaviour in 4:2:2 chroma mode won't be a direct parallel to how the tradeoffs may affect this 43" monitor. It also has a much higher perceived pixel density (depending on your viewing distance), which might make a difference in perceived chroma/text clarity at 4:2:2 compared to a 43" 4k.
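To put some rough numbers on the chroma sub-sampling tradeoff TFT Central describes (active pixel data only, no blanking, so the real link requirement is somewhat higher):

Code:
# Rough illustration of why chroma sub-sampling saves bandwidth: luma (Y)
# stays at full resolution while the two chroma channels (Cb/Cr) are shared
# across neighbouring pixels. These are active-pixel data rates only (no
# blanking), so the real link requirement is somewhat higher.
SAMPLES_PER_PIXEL = {"4:4:4": 3.0, "4:2:2": 2.0, "4:2:0": 1.5}

def active_gbps(width, height, hz, bits_per_channel, chroma):
    return width * height * hz * bits_per_channel * SAMPLES_PER_PIXEL[chroma] / 1e9

for chroma in ("4:4:4", "4:2:2", "4:2:0"):
    rate = active_gbps(3840, 2160, 120, 10, chroma)
    print(f"4K 120Hz 10-bit {chroma}: ~{rate:.1f} Gbps of active pixel data")

Which is basically why these DP 1.4 panels can't do 4:4:4 10-bit at 120Hz and have to drop to 8-bit or 4:2:2.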
 
DSC is part of the Displayport 1.4 spec.

I don't know, I might stick with my Samsung 48" 2015 curved display for a while to see how things pan out on the display front.

Really am tired of it- while it looks decent enough, I would like a dedicated PC display.

DSC on DP 1.4 is optional. Hence why there hasn't been a single DSC DP1.4 display to date.
 
If games supported custom resolutions letterboxed across the board, it might help things if you were willing to run a 21:9 or 21:10 resolution (for example, playing 21:9 at x1400 or x1600, or 3840x1645 or 3840x1828) letterboxed with bars on a considerably larger screen like this, for higher bit rate and wider FoV on more demanding games, HDR games, etc., while not being limited to a true 21:9 resolution monitor and a smaller screen size (and height) overall. On larger screen sizes like this, running letterboxed would still result in a considerably large gaming viewport/screen, and in some games the wider FoV could be appreciated. I don't know offhand what the hard limit is, custom-resolution-wise, for 10-bit 4:4:4 120Hz. (DisplayPort 1.3/1.4 (HBR3): RAW 32.4Gbps, effective 25.92Gbps.)

Close but not quite 3840 x 1645 10bit bandwidth according to https://k.kramerav.com/support/bwcalculator.asp


[image: Kramer bandwidth calculator result]

This result could be a little too strict though, since on the same calculator 3840x2160 10-bit 98Hz ends up being 29.26 Gbps (and we know 98Hz 10-bit works uncompressed over DP 1.4). There could be some leeway between the RAW and "effective" DisplayPort 1.4 rates which could explain it.

In that case, a full-width 21:9 custom rez of 3840x1645 10-bit 120Hz at 27.29 Gbps would fit
(a 21:10 rez of 3840x1828 at 30.21 Gbps would be a bit over the 4k 10-bit 98Hz baseline of 29.26, while still below the DP 1.4 RAW max of 32.4 Gbps).
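For anyone who wants to sanity-check those figures, here's a rough Python version of the calculation (using CVT-R2-style reduced blanking as a guess at the timings; the Kramer calculator seems to assume more generous blanking, so its numbers land a bit higher):

Code:
# Back-of-the-envelope bandwidth for the custom resolutions above, 10-bit
# 4:4:4. Prints a range: the low end counts active pixels only, the high
# end adds CVT-R2-style reduced blanking (80px horizontal blank, >= 460 us
# vertical blank). Real displays negotiate their own timings, so treat
# borderline results with suspicion.
import math

DP14_EFFECTIVE_GBPS = 25.92   # HBR3, 4 lanes, after 8b/10b encoding
DP14_RAW_GBPS = 32.40

def gbps(width, height, hz, bits_per_channel, blanking):
    bpp = 3 * bits_per_channel                        # RGB / 4:4:4
    if blanking:
        h_total = width + 80
        v_total = height + math.ceil(460e-6 * hz * height / (1 - 460e-6 * hz))
    else:
        h_total, v_total = width, height              # active pixels only
    return h_total * v_total * hz * bpp / 1e9

for w, h, hz in [(3840, 2160, 98), (3840, 2160, 120),
                 (3840, 1645, 120), (3840, 1828, 120)]:
    lo = gbps(w, h, hz, 10, blanking=False)
    hi = gbps(w, h, hz, 10, blanking=True)
    print(f"{w}x{h}@{hz}Hz 10-bit 4:4:4: ~{lo:.1f}-{hi:.1f} Gbps "
          f"(DP 1.4: {DP14_EFFECTIVE_GBPS} Gbps effective / {DP14_RAW_GBPS} raw)")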


Moot point if the game doesn't support custom widescreen rez on a 16:9 though anyway.


edit... mistakenly typed x2160 when I was writing about 21:9 rez the last few lines.. fixed.
 
10-bit colour depth support is only applicable for gaming on this display given the limitations of Geforce gaming graphics cards, and not for professional applications like Photoshop.
The newest GeForce driver added 10-bit color support outside of games.
 
If games supported custom resolutions letterboxed across the board, it might help things if you were willing to run a 21:9 or 21:10 resolution (for example, playing 21:9 at x1400 or x1600, or 3840x1645 or 3840x1828) letterboxed with bars on a considerably larger screen like this, for higher bit rate and wider FoV on more demanding games, HDR games, etc.

That is my first go-to when my old Titan is unable to keep up at 4K. I have created a 3840x1646 resolution that I use. I have yet to come across a game in which I've needed it where it didn't work.
 
That is my first go-to when my old Titan is unable to keep up at 4K. I have created a 3840x1646 resolution that I use. I have yet to come across a game in which I've needed it where it didn't work.

Cool, good to know on the game end, thanks for letting me know.

Now I'd like to find out if it will work at 3840x1645 or 1646 (and even attempt 21:10 3840x1828) at full 4:4:4 120Hz 10-bit.

A PG27UQ should be able to make the attempt(s) if anyone feels like trying. :D
 
I had good luck with setting custom 21:9 resolutions with Nvidia. I tried a bunch of games, I think maybe 1 didn't work right but I can't remember the name.

However, I could not get it to work on AMD at all. This was with the older Vega cards (haven't tested on Navi yet), but I was not able to get a custom resolution accepted.
 
I waited this long, I might as well wait for the UQ variants.

Why would ASUS announce the UQ right before the Q is going on sale? Aren’t they potentially killing most of their sales on such a big ticket item?
 
It doesn't end there: apparently there is an XG43VQ coming that is even cheaper than the XG438Q, with HDR400.
https://www.overclock3d.net/news/gp...s_lower_cost_xg43vq_4k_120hz_gaming_display/1

EDIT: It seems the XG43VQ is 3840x1200 43" ultrawide. Thanks ASUS for confusing model names.

XG43UQ seems to be 1500 euros and is expected to release sometime in Q4 2019.

I wonder if these will all just use different quality tiers of the same panel or different panels for the low and high end model.
 
I waited this long, I might as well wait for the UQ variants.

Why would ASUS announce the UQ right before the Q is going on sale? Aren’t they potentially killing most of their sales on such a big ticket item?

I'm with you, going to wait for the UQ.
 
I waited this long, I might as well wait for the UQ variants.

Why would ASUS announce the UQ right before the Q is going on sale? Aren’t they potentially killing most of their sales on such a big ticket item?

Well, there's still a demarcation between the two, also separated by an expected price difference of around 500 EUR. The XG438Q has only 120Hz and HDR 600 (edge-lit local dimming) but is available imminently. The XG43UQ uses DSC so it will support up to 144Hz, a slight refresh bump, and HDR 1000 (still edge-lit local dimming), but it will not be available for a while: apparently maybe in Q4, but I expect it won't be until 2020 given other delays in recent times.

DSC on the XG43UQ isn't supported by all graphics cards, and you'd need to consider whether your system can actually achieve the extra refresh rate at 4K as well. And then consider whether it's worth the extra 500 EUR to you.
 