Why I've settled on the PG32UQX in 2024

Started going down the rabbit hole again because I was informed that the GR OLED refresh for the 27" LG (now the GS) has improved brightness and better tracking. These still don't feel like HDR displays due to the brightness limitations (even on the refresh). I like how it shows a notification at the top left telling you when HDR is on, but the colors and pop just aren't there. There's also vertical banding in some colors. For example, my one wallpaper that induces blooming from the mouse cursor also has vertical banding on the LG... lol. I'm also convinced that G-Sync should be disabled on OLED monitors, because turning it off significantly helps reduce eye strain and flickering in general.

I'll see if I can capture these talking points on camera for everyone to see.
 
I launched two sessions of Cyberpunk, one on each monitor, and I would not say that SDR looks better than HDR like it did in 2021:

View: https://www.youtube.com/watch?v=ms-qPkvpXrQ

I would even say that HDR on the M272V looks better than on the cheap 400-nit monitor (could be the better HDR, could be the 4K resolution, could be both) when looking at them side by side instead of switching screens in the menu and going by memory, but I would not call one image phenomenal versus the other, just a small improvement.

Maybe I am not using the right HDR settings. I tried a few:
Nits: 1000 (for the M272V, 400 for the 400-nit monitor)
Tone mapping: 2.0 (tried 1.0 and other values before)
 

That's just WOLED for you. QD-OLED fares much better in colors, but then you run into brightness problems in TB400 mode and poor EOTF tracking in P1000 mode that results in aggressive dimming. I think I've just come to accept that, for now, OLED monitors are just "HDR lite" displays; if you want a full-fat HDR experience from an OLED then you need to pony up for a high-end TV like an LG G4 or Samsung S90D, because these monitors just ain't it. I have to say though, after not using my mini LED for well over a month, it felt extremely jarring to go back from a 240Hz OLED to a 144Hz LCD. So much so that I think I'm just going to stick with a high-refresh OLED and upgrade as HDR brightness improves over time.
 
Yeah, there is no real way to tell when this monitor was revised, but based on what I've seen and heard from others, a 2023 manufacture date is a good bet.

I know this because the previous one I had was a late 2022 date; I RMA'd it and received one with a Jan 2023 date, and noted in one of these threads how much less the replacement blooms.
 
I'm interested to hear how you feel about the new OLED G8 considering we both come from the PG32UQX.

The motion clarity is nice, but I still feel like I want just a little more brightness on most occasions.

I tried the Asus PG32UCDM and it was OK. Honestly, I like the AW3225QF a bit more. That slight curve really helps with reflections and keeping the monitor in a decent field of view.

I would have to try one again with G-Sync disabled, as the QD-OLED caused the most eye strain, and I think the matte coating on the new G8 would help with that.

Thank goodness for return policies.
 
My friend who posted this video actually borrowed my PG32UQX to record it:


View: https://youtu.be/sRGwzbnuLJA

He liked it so much that he bought one for himself. My biggest takeaway after seeing them both in person, side by side, is that it really comes down to what you appreciate more. There are a lot of people who just can't tolerate the PG32UQX's slow panel, so it's immediately a no-go. I don't think any PG32UQX owner who primarily consumes content in HDR can "downgrade" to these OLED monitors. What we discussed is that anytime you game in HDR on these dim OLEDs, there is always the lingering thought of what you're missing out on in terms of HDR impact and how much better it would look on the PG32UQX. That basically sums up my thoughts between the two. No matter how good the motion clarity or smoothness of OLED is, it doesn't matter when the games I want to play look so flat. Both of us are in a demographic that just doesn't care for competitive/esports at this point in our lives, so of course someone who is a 50/50 split between graphics-heavy titles and Valorant may come to a different conclusion.

If I were a PG32UQX owner, I don't think now is the time to ditch it. RTX HDR gave it even longer legs. I dunno how much longer it will be in production, but I assume it is close to end of life, so if nothing tops it in the coming year or two I anticipate it will be sought after.
 
Ya I just ordered another new PG32UQX. I don't use the 240/480 Hz features of my OLED as much as I thought I would.
 
I still think the OLED looks better in HDR dark content, but color me shocked that the PG32UQX looks to perform nothing like mine did in those darker scenes.


View: https://youtu.be/2_XL7F8jC1c?feature=shared

The two that I owned looked more like this dude's in a night scene in Cyberpunk, with the non-stop flickering and fireworks-looking effect when transitioning zones... I absolutely hated it.

Is it a trick of the camera or does it really not do this anymore?
 
That video is 2 years old so it would be on the old dimming algorithm. More recent batches don't do this anymore while the early batches were indeed bloom fiestas.
 
Yeah I'm not entirely convinced that everyone who claims that isn't incentivized by Asus to try and trick my dumbass into buying one of these monitors again. :D

While I don't doubt Asus probably has better QC and has made some manufacturing improvements, I still find it hard to believe it's really significant. There's no real indication of when it got better, no new firmware version, no new model revision, and it's all just hearsay that there are sweeping improvements.

No offense to anyone making these claims, but this video is really the first time I've seen any actual evidence of improvements, so I'm still skeptical.
 
I mean, I don't own a newer batch myself, so I cannot personally confirm it for you. I will say, though, that I highly trust SoCali on this. That guy has found issues with monitors that not even well-respected reviewers managed to notice until much later, if ever. So if he's saying there's some obvious change under the hood, then there probably is.
 
Any chance you could upload a Smoothfrog or UFO shot? Very curious if the smearing is any better.
Here are a couple of pictures I took for the ufo test and the smooth frog test. I don't know that I took them correctly.
 
The way the FALD currently behaves is nothing like those old videos. That's how I remember mine being too. In the old locked PG32UQX thread I mentioned 20x that setting FALD to "3" was nuts because it just looked like fireworks going off. Most people with the original versions settled on 2 to avoid this.

On the current version, the best setting is actually 3, which behaves more like the previous 2 setting but doesn't cost you as much contrast ratio as before.
 
There are a lot of people who just can't tolerate the PG32UQX's slow panel, so it's immediately a no-go. I don't think any PG32UQX owner who primarily consumes content in HDR can "downgrade" to these OLED monitors. What we discussed is that anytime you game in HDR on these dim OLEDs, there is always the lingering thought of what you're missing out on in terms of HDR impact and how much better it would look on the PG32UQX. That basically sums up my thoughts between the two. No matter how good the motion clarity or smoothness of OLED is, it doesn't matter when the games I want to play look so flat. Both of us are in a demographic that just doesn't care for competitive/esports at this point in our lives, so of course someone who is a 50/50 split between graphics-heavy titles and Valorant may come to a different conclusion.
For me the issue is entirely about cost vs performance. The PG32UQX even today is still 3499 € here in Finland. The cheapest it has ever been is about 3200 €. It may be great in HDR, but for that kind of money, I'd want it to be great in pixel response times too, as far as LCDs go at least. Being outperformed by cheap edge-lit 4K 144 Hz displays in this area is not great. I wouldn't mind it so much if the PG32UQX was say 1500 €.

Similarly I find it easier to compromise on HDR performance on a 900-1500 € LG OLED TV, when in return you do get benefits in other areas. I don't play any multiplayer online games at all, but still appreciate the motion clarity of OLED.
 
Ya, that's how I'm feeling too.

That Smooth Frog shot is very close to what my 75Hz PB328Q looks like with the custom overdrive shader I wrote for it. It's 3300:1 at 140 nits, which is not terrible. If it were 4K, I'd try to keep it for another 50,000 hours.
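
For anyone curious what a software overdrive pass even does, here is a minimal sketch of the general idea, not the poster's actual shader: overshoot each pixel toward its target based on how far it moved since the previous frame. The `gain` constant and the NumPy framing are assumptions for illustration; a real implementation would typically use a per-panel lookup table indexed by (previous level, target level).

```python
import numpy as np

def overdrive_frame(prev: np.ndarray, target: np.ndarray, gain: float = 0.35) -> np.ndarray:
    """Toy software overdrive: overshoot each channel past its target value in
    proportion to how far it has to travel from the previous frame, so a slow
    LCD transition lands closer to the intended level within one refresh.
    'gain' is a made-up tuning constant; a real shader would typically use a
    per-panel lookup table indexed by (previous level, target level) instead."""
    driven = target + gain * (target - prev)   # overshoot in the direction of the change
    return np.clip(driven, 0.0, 1.0)           # keep the drive level displayable

# Example: a pixel rising from 20% to 60% grey gets driven to ~74% for one frame.
print(overdrive_frame(np.array([0.2]), np.array([0.6])))
```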
 
I think everyone agrees the cost should be adjusted accordingly. The problem is the lack of competition in the high-end mini LED market. The PG32UQX will drop to about $2300 on sale a few times during the year, which makes it more tolerable, but it's still a little too high. The PG32UCDM goes for $1900 CAD new, so the extra $400 difference makes the PG32UQX a slightly better buy for a consistent HDR experience.

We really need TCL to enter the market, or have Samsung release an updated panel for the Neo G8 line at the very least. Innocn and KTC are just too niche and inconsistent to put any sort of pressure on the NA market.
 
Ya, or you can just buy the MSI and replace it every two years for the same money.
 
At this point I don't know what's going to happen first: top-tier mini LED monitors being priced equally to OLEDs, or OLEDs getting their brightness figures up. The newest iPad Pro has an OLED display that can do 1600 nits peak and 1000 nits full field; when will we get such capability in a desktop monitor?
 
I got a Samsung Galaxy Tab S9+ 12.4" OLED the other day, and right before writing this I compared it to my M2 Max MacBook Pro 16" Mini-LED screen. Running several 4K HDR nature videos off YouTube, the differences between the two were pretty subtle. The MacBook was able to get more detail out of the very brightest areas, like the sun, clouds, snow-capped mountains and so on, but I pretty much had to pause the video and go looking for it most of the time. The Mac also had better color accuracy. At no point did I see any dimming happening on the OLED tablet.

It's hard to find actual measurements online for laptop and tablet displays, but both my Samsung Galaxy Fold 4 and the Tab S9+ feel like they do HDR better than my LG CX 48" does. The difference vs the MacBook Pro was easier to notice when comparing it to the TV.
 
I was playing some Hades last night and the OLED looked better than the PG32UQX, but the ABL was insane. The LG 27GS would noticeably get dimmer for a second or two at a time, completely ruining the experience, whereas it doesn't seem to have as much of an issue in games like Apex.

Honestly, the matte coating on the LG is top notch at keeping blacks looking black, whereas the PG32UQX can show the ugly dispersed lighting you get on bad matte displays. So I'll give LG a win on the coating, especially if it's supposed to be even better on the 32" 4K.
 
OLED displays found in portable/mobile devices just seem to be better, probably because they don't give a hoot about longevity in such devices. HDR content on my iPhone does indeed look far better than on my LG CX or MSI QD-OLED.
 
Yeah, I suppose they bank on phones typically being on for a limited time, anything from a minute to maybe 30 minutes, and tablets at most for maybe a couple of hours if you watch a movie. Plus, people swap phones a bit more often than monitors. Though I had already been using the iPad Pro I traded for the Samsung for 7 years...
 
I do enjoy the posts from people, whenever a burn-in thread gets made, stating how they've never encountered burn-in on their OLED phones.

They just happen to neglect to mention that they don't keep them for more than a year or two at a time. lol

I've got disgusting burn-in on my Google Pixel 2 XL, but I refuse to buy a new phone. I'm over this constant need to buy new disposable products.
 
So I'm not going crazy, right?? lol. I definitely thought the same and couldn't understand what all the blooming fuss was about. Whatever they did, it was excellent. It actually had me guessing whether the local dimming was enabled. On this refurbished one, I can definitely tell.
No, you aren't going crazy. I think some of it is improvements to the algorithms, but some of it is also people making a mountain out of a molehill. Like yes, blooming IS something you can see on this monitor, depending on the dimming level, the content, etc., but it isn't bad. Different people are going to be sensitive to different things, but some of it is just people finding a flaw, latching on to it, and then raging about it. It's why I tell people to never do a dead pixel test on a monitor. Look at the monitor normally and see if you notice any, but if you don't, then don't worry. Don't go hunting, because if you find one dead red subpixel off in the corner you may fixate on it as a problem, even though it doesn't bother you in real use.

So ya, I feel the same as you. I have an OLED TV, and yes I can see the difference in blooming between the monitor and the TV... but it is really minor and situational. The blooming really isn't a problem.

Started going down the rabbit hole again because I was informed that the GR OLED refresh for the 27" LG (now the GS) has improved brightness and better tracking. These still don't feel like HDR displays due to the brightness limitations (even on the refresh).
Welcome to the problems with small OLEDs :(. Unfortunately, they just can't get bright enough for good HDR impact. TVs are not such an issue: while their full-screen brightness usually isn't any higher than monitors', and is sometimes less, they get a lot brighter on a 10-20% window, which ends up being what matters for most HDR content. Monitors just can't dissipate the heat and/or deliver the power needed, so they get dim fast. I'm hopeful they'll come up with a solution someday, but for now they just can't push the brightness very high.

I do enjoy the posts from people, whenever a burn-in thread gets made, stating how they've never encountered burn-in on their OLED phones.

They just happen to neglect to mention that they don't keep them for more than a year or two at a time. lol
My friend has burn-in on his Samsung Note 9. It isn't bad, but it is noticeable. Another thing that keeps burn-in down on many phones is just usage time. Even people who use their phones a lot probably use them only a couple hours a day of total on time.


I ended up getting a PG32UQX for all the same reasons about 6 months ago. I was too worried about burn-in on an OLED, since I do lots of desktop stuff, and I also knew from experience and general knowledge about human perception that brightness matters to HDR impact (much as SPL matters to dynamic impact in sound). I'm not sorry I did. While there are things I would very much like about OLEDs, like the low motion blur, the super-precise per-pixel brightness and so on, I felt that the MiniLED was a better choice for now. I'm really happy with it too. I expected it to be more of a compromise, where I'd like it but still maybe prefer my TV for gaming. No, I actually tend to prefer the monitor. The extra brightness just gives it an impact the TV doesn't have (my TV is older, an S95B, so it only does about 900 nits in real HDR content), and the downsides, while noticeable, really aren't bad.

Personally, I find the bright-dark transition blue more noticeable than the blooming. If I could only improve one thing on the monitor, that would be the one I'd choose.
 
For me the issue is entirely about cost vs performance. The PG32UQX even today is still 3499 € here in Finland. The cheapest it has ever been is about 3200 €. It may be great in HDR, but for that kind of money, I'd want it to be great in pixel response times too, as far as LCDs go at least. Being outperformed by cheap edge-lit 4K 144 Hz displays in this area is not great. I wouldn't mind it so much if the PG32UQX was say 1500 €.

Similarly I find it easier to compromise on HDR performance on a 900-1500 € LG OLED TV, when in return you do get benefits in other areas. I don't play any multiplayer online games at all, but still appreciate the motion clarity of OLED.
Kind of the reason I got the QN900C instead: effectively four PG32UQXes' worth of screen plus a faster, glossy panel for less than 3000 euros (with discount, special offer, etc.) :)

Yes, I know there are of course other things to consider, but all in all the QN900C is the best LCD I have ever used, and the idea of going back to a single 4K monitor (or even a multi-monitor setup) seems like a big step down now. My Acer X27 still has better PPI, but having passed 50, I have to admit that using it without scaling is starting to become a challenge. Of course, this all assumes that you need more than 4K resolution, which most people probably don't.

For mostly gaming and entertainment, I consider OLED to be the only real choice today, even though the extra brightness of a good MiniLED goes a long way toward compensating for its shortcomings. But for someone who is more of a casual gamer like myself, the QN900C is surprisingly good, to be honest. I'm especially impressed with how it manages to control blooming/haloing even though it only has about 1300 local dimming zones spread out over 65", and it does not seem to do too much aggressive dimming like the X32 etc. Don't get me wrong, it is still there, especially if you compare it to an OLED, but that is true for all LCDs.
 
Well, the second PG32UQX was also a first-gen model, so I'll be waiting for a sale on a brand new one. Sad panda.
 
Is there anything to support there actually being a second gen of this monitor and not just FW updates? I.e., some changes/updates to the hardware in the last couple of years.
 
No, there is nothing official. There's no firmware update, so whatever occurred came from the factory. We have multiple firsthand accounts. I've experienced it firsthand.
 
The blooming difference between the early manufactured models and the later ones. It's a completely different experience. It's been discussed in this thread...
A few random people on the internet discussing things does not make it true, though, especially without some hard facts, sources, etc.

It has been a trend the last couple of years to "solve" the blooming problem by simply lowering brightness, like on the X32. Based on the little information I have, it seems to me more like Asus tuned the firmware, as I mentioned above. This is why I am asking for some actual sources etc. to make this more than just personal opinions and faint memories.

Please note that I am not claiming anyone is wrong, nor claiming there are no changes besides firmware. I've never even seen the monitor IRL and don't plan to buy one.
 
I picked up the Viewsonic version (XG321UG) last year and haven't bothered trying any of the 32" OLEDs after seeing the absolutely dismal brightness measurements on them. Sounds like I've made the right choice based on this thread. But my monitor does have two strange HDR behaviors that I was wondering if any PG32UQX owners have encountered:
  1. On the third screen of the Windows HDR Calibration app (full screen white test), the screen starts EXTREMELY bright, but then dims considerably after a couple seconds like some sort of ABL kicked in.
  2. The monitor seems to randomly enable some sort of brightness tone mapping. This is apparent in the Windows HDR Calibration app, or games like Forza Horizon 5 that offer proper in-game HDR calibration with a test image where you increase the brightness until the test text/image blends into the background. Sometimes the max brightness is achieved at ~1500 nits as I would expect. But other times I need to push the slider to ~3300 nits to achieve the same effect. Like the monitor gets stuck in a mode where it expects 3000+ nits as peak brightness and scales everything down accordingly to match. My PG35VQ did this a couple times, but it happens often with the XG321UG and I'm often left wondering if I'm actually getting peak brightness out of it in games most of the time.
I'm pretty happy with the monitor overall, but in the back of my mind I feel like my PG35VQ had better HDR performance with less bloom. The XG321UG seems to have a bit of input lag in comparison as well. I've been reluctant to compare them side by side since I don't want to go back to ultrawide or 3440x1440. Maybe it's all just in my head...
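
For what it's worth, the behavior described in point 2 above is consistent with something in the chain tone mapping as if the source peaked around 3300 nits instead of passing values straight through to the panel's roughly 1500-nit peak. Here is a toy model of that idea, using the post's numbers as assumptions; real displays roll off more gently near the top, but the effect on a calibration slider is the same.

```python
def displayed_nits(content_nits: float, assumed_source_peak: float, display_peak: float) -> float:
    """Toy static tone map: linearly compress 0..assumed_source_peak into
    0..display_peak. Real displays roll off more gently near the top, but the
    effect on an in-game calibration slider is the same."""
    return min(content_nits, assumed_source_peak) * display_peak / assumed_source_peak

# Pass-through-ish mode: the calibration pattern clips right at the panel's ~1500-nit peak.
print(displayed_nits(1500, assumed_source_peak=1500, display_peak=1500))  # 1500.0

# "Stuck" mode: 1500-nit content only reaches ~680 nits on screen, and the slider
# has to be pushed to ~3300 nits before the panel actually outputs its maximum.
print(displayed_nits(1500, assumed_source_peak=3300, display_peak=1500))  # ~682
print(displayed_nits(3300, assumed_source_peak=3300, display_peak=1500))  # 1500.0
```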
 
On the third screen of the Windows HDR Calibration app (full screen white test), the screen starts EXTREMELY bright, but then dims considerably after a couple seconds like some sort of ABL kicked in.
That's normal. It can't sustain its full 1600+ nits fullscreen, so it'll drop down to around 1200 nits. That's only an issue for very large areas of brightness, though; at a 50% window it shows full brightness with no ABL, so with any real-world content you ought to have no issues. It's pretty rare to see something with an average picture level over 1000 nits, which is where you'd need to be for any of this to even begin to kick in.

Pretty much all really bright MiniLEDs do this, as the power and heat requirements of going super bright full screen are just an issue, so they do kick in ABL. Some of the REALLY bright ones have an ABL curve more like OLED, where they start dropping after a 10% window or so, the difference being that they peak much higher (4000 nits or more in some cases) and are still around 900-1000 nits at full screen. This one does even better, sustaining its brightness to about a 50% window and only then dropping.
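
To make that shape concrete, here is a small sketch of an ABL curve as sustained luminance vs. window size. The breakpoints are rough illustrations based on the figures in this post (1600+ nits up to roughly a 50% window and ~1200 nits full screen for this monitor; ~4000 nits peak falling toward ~1000 nits full screen for the ultra-bright sets), not measurements.

```python
import numpy as np

# Approximate ABL curves: sustained luminance (nits) vs. white window size (% of screen).
# The breakpoints are rough illustrations of the behavior described above, not measurements.
pg32uqx_like = {"window_pct": [1, 10, 25, 50, 75, 100],
                "nits":       [1600, 1600, 1600, 1600, 1400, 1200]}
ultra_bright = {"window_pct": [1, 10, 25, 50, 75, 100],
                "nits":       [4000, 4000, 2500, 1600, 1200, 1000]}

def sustained_nits(curve: dict, window_pct: float) -> float:
    """Interpolate the sustained brightness for a given window size."""
    return float(np.interp(window_pct, curve["window_pct"], curve["nits"]))

for pct in (10, 50, 100):
    print(f"{pct:3d}% window: PG32UQX-like ~{sustained_nits(pg32uqx_like, pct):.0f} nits, "
          f"ultra-bright MiniLED ~{sustained_nits(ultra_bright, pct):.0f} nits")
```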

The monitor seems to randomly enable some sort of brightness tone mapping. This is apparent in the Windows HDR Calibration app, or games like Forza Horizon 5 that offer proper in-game HDR calibration with a test image where you increase the brightness until the test text/image blends into the background. Sometimes the max brightness is achieved at ~1500 nits as I would expect. But other times I need to push the slider to ~3300 nits to achieve the same effect.
I have not observed this in any of the games I've played. In all the games where the calibration displays useful information, or where I load up an analysis tool, it has shown that I have to get it in the 1600ish range for it to disappear.

If you want to see what the game is actually outputting, like whether the game is full of crap or not, you can get ReShade and use the Illium plugins; there's a great HDR analyzer, and you can tell if it is working right or not. The analyzer should show a 10,000-nit peak for the control area and then the correct value in the area under test.

Do note that some games don't pay attention to their own calibration well, or are capped. Hitman 3 is such a game: the calibration works right as near as I can tell (I have to get it above 1600 before it disappears, just like in Windows), but the game seems to have a hard cap of 1000 nits, probably because it uses some kind of auto HDR implementation. No matter what you set the calibration to, it won't go any brighter that I've seen. Or there's No Man's Sky, which has 3 levels you can choose from, all labeled incorrectly, and it goes pretty ham in its HDR400 mode (which is actually about 2000 nits peak), and you get scenes with like an 800+ nit APL.
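
For reference on the numbers being thrown around here: HDR10 signals encode absolute luminance with the SMPTE ST 2084 (PQ) curve, whose ceiling is 10,000 nits, which is why an analyzer's control patch reads 10,000 nits. Below is a small sketch of the standard PQ encode/decode; the constants come from the spec, and the printed values are only approximate.

```python
# SMPTE ST 2084 (PQ) constants
M1, M2 = 2610 / 16384, 2523 / 4096 * 128
C1, C2, C3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def nits_to_pq(nits: float) -> float:
    """Encode absolute luminance (0..10000 nits) as a normalized PQ code value (0..1)."""
    y = max(nits, 0.0) / 10000.0
    return ((C1 + C2 * y ** M1) / (1 + C3 * y ** M1)) ** M2

def pq_to_nits(code: float) -> float:
    """Decode a normalized PQ code value back to absolute luminance in nits."""
    e = max(code, 0.0) ** (1 / M2)
    return 10000.0 * (max(e - C1, 0.0) / (C2 - C3 * e)) ** (1 / M1)

for nits in (400, 1000, 1500, 3300, 10000):
    print(f"{nits:5d} nits -> PQ code {nits_to_pq(nits):.3f}")
# The top of the range is heavily compressed: roughly 0.80 for 1500 nits versus
# roughly 0.88 for 3300 nits, which is part of why a mis-set peak is easy to miss by eye.
```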
 
The PG35VQ was a beast at HDR for its time. The FALD algorithm was amazing on that thing; I never had bloom in darker games, especially with things like crosshairs or small objects, and it still got hella bright, which is crazy. Even with half the zones the PG32 has, it did a much better job in darker content. Some of that's due to the VA vs IPS panel, but it's still probably one of the better FALD performers I have seen.

Too bad it had its own laundry list of problems with the older VA panel tech, or it would have aged a little better.
 
Back in the PG32UQX camp and loving it! Got lucky and snagged a new-in-sealed-box unit on eBay for $1,200 with zero pixel defects and a recent (July 2023) build. I love how the HDR just "works" on this display. The motion clarity isn't as bad as I remember either.

I'll probably be rocking this for a long time.

EDIT: which variable backlight setting are you guys using and why?
 
I use '3' and the settings in this link:
https://hardforum.com/threads/pg32u...r1400-g-sync-ultimate.1991370/post-1045617320

In SDR mode for work, I barely notice a little blooming around the mouse cursor on dark screens.
 
EDIT: which variable backlight setting are you guys using and why?
Depends on the usage:

For desktop usage I leave it in SDR mode with dimming off. While it works fine to leave it on, and there are times when I will have HDR on the desktop, I notice the backlight particularly on neutral grey (like [H]), so I just leave it off and am happy.

For HDR games I use level 3. I've yet to find a game where I prefer anything lower. While I can see mild blooming artifacts sometimes, they are minor, and it is worth it for the very high contrast it gets.

For SDR games it depends: usually level 3, sometimes level 2. Octopath Traveler 2, for example, worked great on level 3; the game looked fantastic and I saw essentially zero blooming. Star Valor, however, didn't. Because it has a lot of "bright ship/UI on a completely black background," the blooming was too distracting, so I turned it down to level 2, which was a good tradeoff between getting a better black level and not having obnoxious blooming.

I've never used level 1 for anything. Maybe I'll try it on the desktop someday, but for desktop usage off is fine, and I actually find I DON'T want super-high contrast on the desktop; less contrast is easier on the eyes.

I'll probably be rocking this for a long time.
I want to replace it with something better, because I always want something better... but I imagine it'll be quite a while. It doesn't look like OLED is going to solve the brightness issue on desktop-sized displays any time soon, and I'm addicted to high brightness (plus there are burn-in worries). It'd have to be a better MiniLED, and I just don't know that any of those are in the pipeline. To replace it, they'd not only have to offer improvements (more zones and faster transition times) but also keep the extremely good tuning. One of the things I like about this display is that it has extremely good gamma and EOTF response; things just look "right." That seems to be very hard to find in a display, and I don't want to trade it off.

I'm sure that will come at some point, maybe with an OLED tech improvement, but I don't think it'll be soon.
 