PG32UQX - ASUS 32" 4K 144 Hz HDR1400 G-Sync Ultimate

"Most" HDR or "normal" HDR is made based on current average displays. You have to consider how many monitor can display the content.
While a lot of monitors can display low light, low APL scene, not many monitors can display HDR 1,000 to its upper end. That's where HDR shines the most.

I assume this is probably the brightest HDR most people have ever seen if their monitors can do it properly. And it is just HDR 1000.


You can download the HDR screenshots to check the images.
[Attachments: 11 HDR screenshots]

Hello, I was wondering if you made that video yourself or if it was ripped from the movie as-is. I downloaded a few different versions of the movie Belle and none of them have HDR that bright. Either that, or I have misconfigured something in MPV. If you use MPV, what settings do you use? Or do you use something else? Or did you just choose a few clips and grade them yourself? Thanks!
 
You need to buy the UHD disc and watch it on a Dolby Vision monitor to see the higher range, but you can also just turn up the contrast in the PG32UQX's menu to get similar images.
 
FWIW, I went back to using 10-bit at 144Hz instead of 12-bit. I don't think it matters much or makes any real difference at all, but after reading into it more, the 12-bit option is an NVIDIA GPU --> G-Sync dithering layer to "simulate" 12-bit. AFAIK, no PC games that use HDR output 12-bit; they are all 10-bit, and some are actually still 8-bit with increased brightness levels. I think only Dolby Vision content is 12-bit? In which case, using 12-bit might be useful to simulate that color range as closely as possible, or when creating 12-bit content, but outside of that, is there even a use?

I figured: if this monitor is true 10-bit and HDR gaming is 10-bit, why should I dither "guesstimated" color values that do not exist when I can just view the 10-bit color that was actually intended? The specifications were unclear on whether this monitor only dithers if the content is 12-bit, or whether it tries to "smooth" things out with dithering even when 10-bit content is asked to be displayed at 12-bit by the color setting in the NVIDIA control panel.

Anyone have any clarification on how this is actually operating in the background?
 
I would be on board with this if they had released a cheaper, regular-backlight FreeSync 2 model to go with it. The premium for mini-LED and G-Sync is just absurd.

While I generally say that OLED is not the way to go for desktop monitors due to size and the possibility of burn-in, when the only competition costs 2-3x more you might as well just buy the LG 48" OLED and then buy another one a few years later if you run into burn-in issues. The prices for these have gotten really out of hand. Acer revealed a similar 32" model at the same pricing, and also the X38, which is essentially their version of the LG 38GL950G, again at 2,000+ euro pricing.

High refresh rate has an incredible premium right now. 32" 4K 60 Hz screens start from about 400€, maybe 1000€ for a high quality model. 38" 3840x1600 60 Hz screens cost about 1000-1100€ vs 2000-2500€ for the same thing in 175 Hz. It's completely ridiculous.
Almost got this 120Hz OLED for PC use. https://www.microcenter.com/product...55-class-(546-diag)-4k-ultra-hd-smart-oled-tv
 

I'm not going to be able to explain how the 12-bit option operates, but here's something else to consider. Apparently, according to a German review site and maybe one other place, even the 10-bit mode on the PG32UQX only works up to 120 Hz (or maybe 100 Hz? I forget). Above that, it's 8-bit + FRC.

All I can say is that none of the reviews mentioned any downsides to setting it to 10 bit or 12 bit mode. It's probably not worth worrying about. I'll try to find the link for that review later.

So practically speaking, this isn't anything to really worry about. Unless you're just trying to satisfy your curiosity.
 
Yeah, I'm mostly curious what the 2-bit dithering stage in the G-Sync module's scaler actually does... like, is it only for 12-bit content and leaves 10-bit alone, or with a 10-bit HDR signal does it simply "dither" extra colors into the mix to make it look 12-bit (technically adding noise/color steps that do not need to exist)?

Also, you are correct: it is true 10-bit up to 120Hz, and at 144Hz it is 8-bit + FRC. Not sure why; likely a limitation of the panel itself, or of the G-Sync module handling 1,024 color steps at that speed? The spec claims it is for bandwidth reasons, but with DSC I don't see how. The 8-bit + FRC is not even noticeable to the human eye though, unlike the chroma subsampling the PG27UQ had at that frequency.

Mostly trying to satisfy my curiosity on the 10-Bit vs. 12-Bit thing on this panel as it is never really explained what it actually does or when it is active!
 
You don't need the G-Sync module to do 12-bit. FRC is dithering: as long as the native 10-bit panel can flip between adjacent colors, it can approximate 12-bit given enough bandwidth. The 12-bit dithering is accurate enough; otherwise you would notice artifacts.
Once you are in 12-bit HDR, what you see is already 12-bit. It's just that your content is not bright enough to show the difference of the expanded 12-bit range.
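To illustrate the "FRC is dithering" point, here is a minimal Python sketch (an illustration under simplified assumptions, not the panel's or NVIDIA's actual FRC pattern; the function name is made up) of how a native 10-bit panel can approximate a 12-bit code by alternating between the two nearest 10-bit codes across frames:

```python
# Illustrative only: not the actual FRC algorithm used by the panel/scaler.
# A native 10-bit panel approximates a 12-bit code by alternating between
# the two nearest 10-bit codes over successive frames.
def frc_sequence(target_12bit: int, frames: int = 8) -> list[int]:
    """Per-frame 10-bit codes whose time-average approximates the 12-bit target."""
    base = target_12bit // 4          # nearest lower 10-bit code (4 = 2**(12-10))
    frac = (target_12bit % 4) / 4     # fraction of a 10-bit step to emulate
    # Show the higher code on that fraction of frames; real FRC scrambles the
    # pattern spatially and temporally to hide any flicker.
    return [base + (1 if (i % 4) / 4 < frac else 0) for i in range(frames)]

seq = frc_sequence(2049)              # 12-bit code 2049 = 512.25 in 10-bit steps
print(seq)                            # [513, 512, 512, 512, 513, 512, 512, 512]
print(sum(seq) / len(seq) * 4)        # 2049.0 -> averages back to the 12-bit value
```

Averaged over a few frames, the displayed value lands on the intended 12-bit step, which is why well-implemented FRC is effectively invisible.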
 
So is the 12-bit mode (using FRC) on this monitor trying to show 12-bit color (through added color steps) in a game that is HDR, which is natively 10-bit (no 12-bit HDR games are out to my knowledge)? Or will it only kick in if I come across 12-bit content, and NOT add color steps to 10-bit content?
 
You need 12-bit content in the first place. It won't automatically add colors or steps to 10-bit footage; it will look the same.
But you wouldn't necessarily know whether a game such as Battlefield or Mass Effect: Andromeda uses 12-bit. Enabling 12-bit won't lose anything.
 
Ok, good to know, because the description of the feature reads as if the G-Sync module uses the extra 2 bits in the "scaler"... whatever the hell that means in this case. I'd have assumed 10-bit + FRC for 12-bit color... maybe it means the same thing said differently?

"*Using DP 1.4 (with DSC), Full Range RGB 10-bit can be selected in the graphics driver at any refresh rate, up to the native resolution. For bandwidth reasons 8-bit + FRC is employed by the monitor at 144Hz for the native resolution via DP, whereas for lower refresh rates or resolutions true 10-bit is supported. The difference between the two implementations is negligible in practice. 12-bit can also be selected, with an additional 2-bit dithering stage added by the scaler for all refresh rates (up to 144Hz)."
 
Without adding any extra timing (blanking) overhead:
4K 120Hz 8-bit needs 3840 × 2160 × 8 × 3 × 120 = 23.89 Gbit/s.
4K 120Hz 10-bit needs 3840 × 2160 × 10 × 3 × 120 = 29.86 Gbit/s.
4K 144Hz 10-bit needs 3840 × 2160 × 10 × 3 × 144 = 35.83 Gbit/s, which is already beyond the DP 1.4 bandwidth of 32.40 Gbit/s.

But with DSC 1.2a compression, you can compress at least 3x as much data into DP 1.4's 32.40 Gbit/s. So the data rate before compression can be 32.40 × 3 = 97.2 Gbit/s.

4K 120Hz 12-bit needs 3840 × 2160 × 12 × 3 × 120 = 35.83 Gbit/s, the same as 4K 144Hz 10-bit.
4K 144Hz 12-bit needs 3840 × 2160 × 12 × 3 × 144 = 43.00 Gbit/s.

There is still headroom to compress. I see no reason not to implement a proprietary profile that uses native 10-bit + FRC + DSC for 4K 120Hz 12-bit instead of only using 8-bit + FRC for 10-bit or 12-bit.
These reviewers are wrong; they've never shown the 8-bit + FRC mode in the factory menu. They only know that other displays show banding or scanline issues beyond 4K 120Hz 10-bit, which is non-existent on the PG32UQX.
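For anyone who wants to sanity-check those figures, here is a small Python sketch reproducing the same arithmetic (assumptions: RGB, three channels, no blanking/timing overhead; the 25.92 Gbit/s effective DP 1.4 data rate mentioned later in the thread is used as the uncompressed cutoff):

```python
# Raw (uncompressed) pixel data rates, ignoring blanking/timing overhead,
# checked against DP 1.4: 32.40 Gbit/s link rate, 25.92 Gbit/s data rate (8b/10b).
DP14_LINK_GBPS = 32.40
DP14_DATA_GBPS = 25.92
DSC_RATIO = 3.0  # DSC 1.2a is commonly cited as roughly 3:1 visually lossless

def raw_gbps(width: int, height: int, bpc: int, hz: int) -> float:
    """Uncompressed RGB pixel data rate in Gbit/s (no blanking)."""
    return width * height * bpc * 3 * hz / 1e9

for bpc, hz in [(8, 120), (10, 120), (10, 144), (12, 120), (12, 144)]:
    rate = raw_gbps(3840, 2160, bpc, hz)
    verdict = "fits without DSC" if rate <= DP14_DATA_GBPS else "needs DSC"
    print(f"4K {hz}Hz {bpc}-bit: {rate:5.2f} Gbit/s ({verdict})")

print(f"Max pre-compression rate at ~3:1 DSC: {DP14_LINK_GBPS * DSC_RATIO:.1f} Gbit/s")
```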
 
Yeah, I have 0 issues at 144Hz and just leave it there. 10-Bit/12-Bit at that range looks perfectly fine to me on desktop or gaming, even if it is only 8-Bit + FRC (which your eyes can't even tell anyway).

My question is: if bandwidth is NOT the issue, why is the screen 8-bit + FRC at 144Hz and NOT at 120Hz? Would this be a limitation of the AU Optronics panel itself not being able to transition such fine color steps at that refresh rate? Or is the G-Sync module in this monitor the limiting factor in terms of processing power?

This is more curiosity than anything else at this point, as I notice 0 difference between the true 10-bit at 120Hz vs. 144Hz.
 
There is no proof that it drops to 8-bit + FRC once the DP 1.4 bandwidth is maxed out beyond 4K 120Hz 10-bit.
The DSC encoder doesn't restrict the compression to only 8 bits per color either; the target bit depth can be set the same as the input bit depth.
With a custom profile, it can use native 10-bit + DSC for 4K 144Hz 10-bit.
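As a rough sanity check of that claim, the minimum compression ratio DSC would need for these modes is well under the roughly 3:1 usually quoted for DSC 1.2a (same simplifying assumptions as before: RGB, no blanking overhead, 25.92 Gbit/s effective DP 1.4 data rate):

```python
# Minimum DSC compression ratio needed per mode vs. the DP 1.4 data rate.
DP14_DATA_GBPS = 25.92

def min_dsc_ratio(bpc: int, hz: int, width: int = 3840, height: int = 2160) -> float:
    raw = width * height * bpc * 3 * hz / 1e9   # uncompressed Gbit/s, no blanking
    return raw / DP14_DATA_GBPS

for bpc, hz in [(10, 120), (10, 144), (12, 144)]:
    print(f"4K {hz}Hz {bpc}-bit needs at least {min_dsc_ratio(bpc, hz):.2f}:1")
# -> roughly 1.15:1, 1.38:1 and 1.66:1, all comfortably under ~3:1
```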
 
Well, isn't the proof from ASUS themselves, who market it as true 10-bit up to 120Hz? Or is that just sloppy wording on their part, and they actually mean DSC kicks in at 144Hz while it is still true 10-bit?

Also, what do you mean by a custom profile? Are you talking about something on the monitor side we don't have access to?
 
Only these reviewers claimed 8-bit + FRC, without any proof. They inferred it from the dithering, not from anything ASUS said.

If you check the VESA DSC encoder, you can set the parameters to use a custom profile with a higher bit rate. It can be 10-bit in, 10-bit out.
[Attachment: VESA DSC Source Guide.png]
 
Well, I'm guessing they might be making this assumption because of ASUS themselves?

ASUS's own website for the PG32UQX states:
  • True 10-bit color in 4K@120Hz
Why not just say true 10-bit color at 4K@144Hz then? Why would ASUS write 120Hz rather than 144Hz on that specific line item? They never advertised the PG27UQ as true 10-bit color at 98Hz because it was an 8-bit + FRC panel.

FWIW, I am not arguing here, I actually want to know, and you have been very helpful on this matter!
 
Because DSC is still a lossy codec that reconstructs the image. A reconstructed 10-bit color is not the original 10-bit. The color difference after DSC can be detected by test equipment but not by human eyes.
Even by the standard of human eyes, this is the only monitor that can do it properly without perceptible artifacts.
 
Ok, that would make sense. So DSC doesn't kick in until AFTER 120Hz then? If so, why did the PG27UQ not allow 10-bit color above 98Hz if 120Hz is the uncompressed max at 4K? I thought 98Hz was in fact the limit for non-DSC 10-bit color.
 
DSC already kicks in above 98Hz 10-bit. Though the total bandwidth of DP 1.4 is 32.40 Gbit/s, the data rate of DP 1.4 is limited to 25.92 Gbit/s.
The PG27UQ doesn't have DSC, so it cannot carry 4K 120Hz 10-bit (3840 × 2160 × 10 × 3 × 120 = 29.86 Gbit/s) or more, while the PG32UQX has one profile to do DSC at 4K 120Hz 10-bit and another profile that compresses further for 4K 144Hz 10-bit.
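A quick back-of-envelope check of why the non-DSC RGB 10-bit limit lands near 98Hz (assumption: this ignores blanking/timing overhead, which is what pulls the practical ceiling down from about 104Hz to the roughly 98Hz figure mentioned above):

```python
# Highest 4K RGB 10-bit refresh whose raw pixel data fits DP 1.4's data rate
# without DSC. Blanking overhead is ignored, so the real-world limit is lower.
DP14_DATA_GBPS = 25.92
bits_per_frame = 3840 * 2160 * 10 * 3          # RGB, 10 bits per channel
max_hz = DP14_DATA_GBPS * 1e9 / bits_per_frame
print(f"Pixel-data-only ceiling: ~{max_hz:.0f} Hz")   # ~104 Hz before blanking
```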
 
Correct, that's kind of my point. If DSC is the reason ASUS does not claim full 10-bit at 144Hz, then why would they claim it at 120Hz?
 
They claimed 4K 120Hz 10-bit along with 98% DCI-P3 and Delta E < 2.
Beyond 4K 120Hz 10-bit, the measured accuracy will probably drop. Though you can still see the same shades of 10-bit color, the tone will be different.
 
Ok, so this is where the confusion about 8-bit + FRC comes from then (on top of the document you provided earlier)? I could never understand why a panel with native 10-bit color stepping would magically not be native 10-bit at 144Hz and have to switch to 8-bit + FRC; why would it switch modes? If that were the case, I would think 12-bit would not work at 144Hz either, because how would 8-bit + FRC even display that? You would need 10-bit + FRC.
 
They are confused because 4K 144Hz 10-bit (3840 × 2160 × 10 × 3 × 144 = 35.83 Gbit/s) would max out the DP 1.4 bandwidth of 32.4 Gbit/s.
But there is already a DSC encoder that compresses the 10-bit images into a lower-rate bitstream that fits inside that bandwidth. The bitstream is decoded back to 10-bit at the display.
 
This might be mildly off topic, but can anyone recommend a tool for measuring the peak brightness of my screen? How is it typically done? I'm looking into getting a calibration tool as well.
 
A monitor luminance meter like the SM-208 or TES-137 is enough to measure peak brightness. They are easy to use: put the probe on the screen and it shows the brightness.

For a colorimeter, you need at least an X-Rite i1Display Pro Plus (ColorChecker Display Plus) to measure HDR1400-level brightness, and it needs software such as Calman to work with.
 
I'm still trying to decide what I want to spend on replacing my many-year-old displays. My question is: if I am working from home, are those 8 hours, on top of gaming, going to put undue wear on a display like this? Are they rated in hours?
 
They are, but the ratings are absurdly long. I've only had one LCD's backlight die from age, and it was... 13 years old at that point? And in use 12-14 hours a day. These aren't OLEDs.
 
Yeah, I can't even think of the last LCD I had that lost brightness... probably my very first one, which used the old CCFL (bulb) backlight.

FWIW, I have a PG27UQ that I have used for gaming and WFH for years now (12+ hours a day, easily) and it hasn't missed a beat. I'd expect the same kind of longevity from my PG32UQX as well.

Not sure if it helps the monitor (it helps my eyes), but I leave HDR on and turn the brightness down to 5% with the Windows slider when I am working and not gaming. I'm sure that also saves the backlight quite a bit over time.
 
I have a pair of LCDs that lost about two-thirds of their brightness. They're at least 15+ years old and have been heavily used since I bought them used around '07. (20" 1200x1600 panels are almost the perfect size to use as side displays with a 32" 4K, and nothing new has come out in that size/resolution in a long time.)
 
Is this monitor considered out of date at this point? Is there anything competing?
I had the same concern. It looks like ASUS is launching essentially a new version of this later this year (the PG32UQXR, announced at CES), but not all the details are known yet (AFAIK, anyway).
There is an older ProArt that uses similar tech, but it is a lot more expensive and there are a lot of bad panels out there (I'm returning one because I wasn't totally happy with it: a few dead pixels, and it takes forever to wake [and sometimes doesn't want to at all]).
 
There's also a newer ProArt that is better, but at nearly 5k, IIRC, it isn't in the same discussion.
 
Is this monitor considered out of date at this point? Is there anything competing?
Of course not.

This has been the top-tier HDR monitor through and through. It will stay ahead until AUO introduces a 2000-zone 144Hz 4K IPS.

What you can buy this year is all mid-tier: reduced zone counts, third-party PWM backlights, or reduced color accuracy here and there. None of it will look any better than this one.
 
Is this monitor considered out of date at this point? Is there anything competing?
In terms of HDR quality for 4K gaming... nothing out right now is as good, IMO, and nothing coming in 2023 will be either. OLED can't match the brightness and VA can't touch the color quality. Plus, this IPS backlight is DC controlled; no PWM bullshit.

After using this monitor for a month, it is absolutely amazing. I think some reviews got it wrong too; I have zero issues with response time or ghosting, and that's playing at 144 FPS and 144Hz in COD and BF2042, which is where it matters the most. Never had a problem getting 1st place, and the monitor feels extremely responsive to me.

It's pricey for sure, but it's aimed at gamers looking for the absolute best HDR quality and those who want to fully max settings in games at 4K. This monitor is not aimed at those looking for 240+ FPS or those who just lower settings for pure FPS.

This is 100% a graphics whore monitor... ;)
 
I also think the motion quality on this monitor is fine. Some response times don't measure well on it, but I can't say I noticed that in actual gaming use.
You nailed it. At one point I literally had the PG42UQ OLED next to the PG32UQX, and only by having them right next to one another did I notice any response-time ghosting. Even then, it was minuscule.
 
Even if it's basically the same monitor as the one in this topic, I think I could wait until at least April if I had to and hold out for that one. Ironically, the new DisplayPort features won't mean anything for the 4090, so I am not sure what they're for.

Edit: Never mind. The PG32UQXR appears not to have G-Sync.
 
I haven't read through this entire thread, but is there a consensus on the key differences versus the XG321UG, or are they more or less equal?
 