PG32UQX - ASUS 32" 4K 144 Hz HDR1400 G-Sync Ultimate

Part of Dolby Vision's PR:

So, this is based on a 10,000-nit screen's curve, obviously (other screens, like 725-800-nit and HDR1400 panels, have an accuracy fall-off and compress the top end, so the values will be somewhat different, squeezed into the remaining range at the high end):

So:
50% of the code values cover 0 to 100 nits (SDR ranges, and down to infinite black depth on OLED). This foundation probably remains the same even on screens that compress after their fall-off point.

25% of the code values cover 100 to 1,000 nits (the bulk of which is typical mids to high mids that many HDR screens can display more or less, at least for a time if not long sustained).

The top 25% covers 1,000 to 10,000 nits (likely bright highlights, e.g. scintillation/glints/reflections of light sources, and very bright direct light sources).

The top end will be compressed on real screens, since practically all of them are below 10,000 nits, so those last two 25% bands won't map the same. For example, on an LG OLED the HDR 10,000 curve is accurate to around 400 nits, then falls off, compressing the rest into the remaining ~400 nits. I'm not sure where the fall-off is on the ASUS PG32UQX and PA32UCG.

More from the Dolby mastering page:

"I found it helpful to focus on segmenting an image and placing different portions of the shot at different nit values where they are best represented."
The algorithm I use is more than competitive with Dolby Vision's. 10,000-nit objects are rarer than 2,000-nit objects, so it's better to saturate the 2,000-nit objects first.

Again, the contrast is higher lol. It's entertaining to see what you've got.
 
You're just saying the same thing again lol. It's fine if you like it better that way, but you've taken objects and areas of certain brightness/darkness in the mids and raised them. You took darker crayons and substituted brighter ones in the mid areas. It doesn't matter if you raise the top end to 100,000 and make the crayon box 1,000 crayons tall, saying look, it's a larger contrast measurement now. You've lifted the mids so they don't look the same, and no matter how high you push the peaks toward nuclear blinding white for a taller contrast measurement, it will never be as contrasted and as impactful as if you hadn't raised the mids along with the highs.
Where does that lower contrast claim come from? Vincent? lol. Now you are saying the "mid tone" is lifted. Of course I'm going to lift the 80-160 nits of your "mid tone" with larger contrast.

And the funny thing is you are actually comparing SDR pictures. SDR pictures from high-APL HDR always look crushed lol.
 
Worth noting that while the PA32UCG has HDMI 2.1, the ASUS PG32UQX and the ViewSonic XG321UG do not.

TFT Central review of the 32UCX
People need to understand how important G-Sync is. The G-Sync module doesn't include HDMI 2.1 because it would cut into the module's processing power. G-Sync drives the backlight faster so the dimming zones can switch faster, reducing blooming in motion. It also has lower input lag. The XG321UG's input lag is a bit higher than the PG32UQX's because it has to display the additional Reflex information. You can see how stressed the current hardware is when dealing with native 144 Hz 10-bit data.

The PA32UCG's blooming is about a 4x larger square than the PG32UQX's when the image starts to move. In some cases the PG32UQX has better reference quality than the PA32UCG when displaying objects moving at 120 Hz.

Implementing HDMI 2.1 on current hardware introduces a lot of bugs as well.
 
Now you are saying the "mid tone" is lifted. Of course I'm going to lift

You are perverting the mid-tone levels, yes. Raising the peak range higher doesn't change the fact that you are lifting the values of the mid-level objects and areas. They could mostly remain where they were, with their details and areas intact, while you raise just the peaks (light sources, reflections of those, scintillation/glint, etc.).

You are lifting them. No matter how bright the rest of the screen is as a contrast distance, your "black isn't black anymore," so to speak; that is, your mids aren't where they were anymore. The top 25% of the signal can be super bright (within the artist's vision for the content), but you shouldn't be grossly lifting the mids like that in the Elden Ring shot, in my opinion. If you'd left more of the darker detail on the tree, and the darker area under the boughs of the Erdtree, etc., the bright end would look much more impactful. It's fine if you prefer it that way, though. It's your setup.


Where does that lower contrast claim come from?

No matter how much higher you push the brightness to get a taller contrast measurement, the contrast would still have been greater had you mostly left the mids alone and raised the top end right alongside them at those levels.


 
You don't even know how contrast works in the first place. Of course I'm going to raise the contrast between the 80 and 160 nits of your "mid-tone." Look how dim those mid-tones are in an open area.

Pick a point A and a point B in the Elden Ring image. I can show you exactly how many nits each point has. The brightness difference between A and B is always higher after the mapping algorithm. That is how contrast works.
Go pick them. I can't wait for you to tell me the contrast is lower.

Also, blacks are blacker below 10 nits; those have even larger contrast. Your definition of "blackness" that sits 10+ nits higher is very funny.

Your 80-160-nit "mid-tone" needs to be changed to look more natural and impressive. I know you want to keep it so your OLED can show it without so much ABL. I won't grade Elden Ring for OLED. I grade for what looks better.
 
Funny how people complain about HDR over 1,000 nits being so bright it strains their eyes... ever stepped outside for a minute? Or even turned a light bulb on? Seriously lol…

Wonder why companies like Dolby are targeting 4,000-10,000-nit mastered HDR media? I guess they're all idiots. The holy grail of display technology is to mimic reality, isn't it?

Most people still need to see it to believe it. They can't see these HDR images accurately on their own displays.
 
It was a figurative comparison, likening the departure from the reference mid levels to "black not being black anymore." I wasn't talking about actual blacks.

The contrast, i.e. the distance in nits from the peak areas, and the overall impact, would be higher: the distance between the scene's higher peak-brightness range and the original, more ordinary mids, if you never lifted those mids. It's as simple as that.

You can push the peaks higher and higher and fatten the curves, but the lower points of the original mids would still always be darker, and so provide a taller distance from the peak range. You could have extreme "high contrast" with a very bright high end, yet the mids might not actually be as dark as they were (or "should be," depending on the artist's intent) with the way you are doing it. Again, it's fine if you prefer it that way. There are a lot of ways people can tweak HDR games and screens to their liking. It's not really the absolute-values system HDR was originally supposed to be, all things considered.


Funny how people complain about HDR over 1,000 nits being so bright it strains their eyes... ever stepped outside for a minute? Or even turned a light bulb on? Seriously lol…

Wonder why companies like Dolby are targeting 4,000-10,000-nit mastered HDR media? I guess they're all idiots. The holy grail of display technology is to mimic reality, isn't it?

That's the top 25% of the signal. I'm all for brighter HDR peaks, but not for brightening all of the ordinary mids unnecessarily.

From the Dolby Vision site:

50% of the code values cover 0 to 100 nits

(~SDR ranges, and down to infinite black depth on OLED. This foundation probably remains the same even on screens that compress after their fall-off point.)

25% of the code values cover 100 to 1,000 nits.

"The remaining code values are 1000 to 10,000."

The top 25% (likely bright highlights, e.g. scintillation/glints/reflections of light sources, and very bright direct light sources).


. . . . . . . . . . . . .

Part of DolbyVision's PR:

"Catch nuances in every scene, from seeing the emotions change on a character's face in a dark night shot, to avoiding blown-out details under a bright sunlit scene."

"The code values are used in the most perceptually important parts of the PQ curve. This allows for exceptionally detailed shadows while allowing highlights up to 10,000 nit. "

So, this is based on a 10,000-nit screen's curve, obviously (other screens, like 725-800-nit and HDR1400 panels, have an accuracy fall-off and compress the top end, so the values will be somewhat different, squeezed into the remaining range at the high end):


"Over half of the code values are in the zero to 100 nit range.

About 25% of the code values are in the 100 to 1000nit range."

So:
50% of the code values cover 0 to 100 nits (SDR ranges, and down to infinite black depth on OLED). This foundation probably remains the same even on screens that compress after their fall-off point.

25% of the code values cover 100 to 1,000 nits (the bulk of which is typical mids to high mids that many HDR screens can display more or less, at least for a time if not long sustained).

"The remaining code values are 1000 to 10,000."
The top 25% covers that range (likely bright highlights, e.g. scintillation/glints/reflections of light sources, and very bright direct light sources).
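
Those three bands can be sanity-checked against the PQ curve itself. Here's a minimal sketch using the published SMPTE ST 2084 (PQ) EOTF constants, assuming a 10-bit signal:

```python
import numpy as np

# SMPTE ST 2084 (PQ) EOTF constants
m1, m2 = 2610 / 16384, 2523 / 4096 * 128
c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_eotf(code: np.ndarray) -> np.ndarray:
    """Map normalized PQ code values in [0, 1] to luminance in nits."""
    e = code ** (1 / m2)
    return 10000 * (np.maximum(e - c1, 0) / (c2 - c3 * e)) ** (1 / m1)

nits = pq_eotf(np.linspace(0, 1, 1024))           # a 10-bit signal's code values
print((nits <= 100).mean())                       # ~0.51: half the codes cover 0-100 nits
print(((nits > 100) & (nits <= 1000)).mean())     # ~0.24: a quarter cover 100-1,000 nits
print((nits > 1000).mean())                       # ~0.25: the rest cover 1,000-10,000 nits
```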


The top end will be compressed on real screens, since practically all of them are below 10,000 nits, so those last two 25% bands won't map the same. For example, on an LG OLED the HDR 10,000 curve is accurate to around 400 nits, then falls off, compressing the rest into the remaining ~400 nits. I'm not sure where the fall-off is on the ASUS PG32UQX and PA32UCG.
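
To make that fall-off concrete, here's a minimal sketch of a soft roll-off tone map with assumed numbers (a 400-nit knee and an 800-nit panel, roughly the LG OLED example above; the actual curves LG and ASUS use aren't public, so this is purely illustrative):

```python
# Toy soft roll-off: track the source 1:1 up to a knee, then compress
# everything above the knee into the panel's remaining headroom.
# The knee, peak, and compression shape are all assumptions for illustration.

def rolloff(nits_in: float, knee: float = 400.0, peak: float = 800.0) -> float:
    """Accurate below `knee`; squeeze [knee, 10000] nits into [knee, peak)."""
    if nits_in <= knee:
        return nits_in                              # tracked 1:1, the "accurate" region
    over = nits_in - knee                           # excess above the knee
    headroom = peak - knee                          # what the panel has left
    return knee + headroom * over / (over + 10 * headroom)  # asymptotic, no hard clip

for src in (100, 400, 1000, 4000, 10000):
    print(f"{src:>6} nits mastered -> {rolloff(src):6.1f} nits displayed")
```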

More from the Dolby mastering page:

"I found it helpful to focus on segmenting an image and placing different portions of the shot at different nit values where they are best represented."
 
It was a figurative comparison, likening the departure from the reference mid levels to "black not being black anymore." I wasn't talking about actual blacks.
Funny, it's you who hallucinates all the time. I do the mathematical mapping.

I tell you to check the RGB waveforms, or to pick any points and compare the contrast. You don't even dare to do it.

Higher contrast is simply higher contrast. Don't deny it. Everything at 10 nits and below is even lowered, so there's even more contrast lol.

Your claim of "reference" mids, or of "darkness" that sits 10+ nits higher, maybe works for OLED. It doesn't look better on HDR 1400 monitors. I make it look better. The mids on HDR 1400 are 500-700 nits or above, especially in a daylight scene. That's where the mids are supposed to be.
 
A lift is a lift. You turned the whole orange, yellow, and red heat-mapped Erdtree into ~1,000 nits and lifted the whole green area on the right out of a more meaningful depth. Those areas are no longer as dark, in absolute value or relative to whatever peak-brightness areas are displayed. Simple as that. I think it would look better with the "darker" mids maintained and the peak 25% of the signal raised alongside them. It's fine if you prefer it the way you did it. There are a lot of ways you can change the way HDR games and screens look.
 
LoL. Your attempt is hilarious. So you are actually comparing pixel A to pixel A+, instead of B/A vs. B+/A+. Is that how contrast works for you?
Higher contrast is higher contrast. Where did you learn your contrast formula?
 
The thing people don't understand is that they equate a brighter image with a better image in terms of "pop," but that is not the case. For me, a bright object looks better against a dark background than against a brighter background that has been lifted in APL or luminance by artificial means such as dynamic tone mapping. As the villain in The Incredibles said: "If everyone is special, no one is special."

This has become a circular argument, so we should probably just leave it at our difference of opinion. I appreciate all of the maps and info you've provided in this thread.
 
I'm gonna tell my kids I was here for the great HDR wars of 2022. I'd love the ViewSonic monitor if I could afford it; it sounds pretty much perfect. HDMI 2.1 would be nice if for some reason I bought a console (but not really necessary).
 
This has become a circular argument, so we should probably just leave it at our difference of opinion. I appreciate all of the maps and info you've provided in this thread.
Again, higher contrast is higher contrast. It pops a lot more lol. It looks more realistic as well. Why do you think I grade like that? It would be funnier for you to go back and check what your original claim was and how you calculate contrast.
Highlights and high APL don't mean lower contrast at all.
 
I can't wait for people to say the images look way more detailed on the left. The left looks so much deeper. The left must have way more contrast lol.
[Image: 52344115156_01f299c158_k_d.jpg]

[Image: 52344116466_b2b8fa64f7_k_d.jpg]
 
I'm gonna tell my kids I was here for the great HDR wars of 2022. I'd love the ViewSonic monitor if I could afford it; it sounds pretty much perfect. HDMI 2.1 would be nice if for some reason I bought a console (but not really necessary).
I can't even comment because it's so far out of my reach that I feel stupid.
 
That one is better, for sure. (y) There are a lot of ways to adjust HDR games and screens to your liking, so things can vary quite a bit.

This was the first one, which normalized and flattened the tree and blew out the darker tower area in the background under the right side of the tree boughs. The bumps and detail on his helmet etc. all got lifted and absorbed too:

[Image: znUdyS8.jpg]

This is your latest one, which preserves a lot more of the darker details:

[Image: nX6FgN6.jpg]
 
I can't even comment because it's so far out of my reach that I feel stupid.
Don't feel stupid; I think the one guy does this for a living, though I could be wrong. I'm learning a lot looking up the stuff they're talking about and getting a general sense of how this works. Everyone has their specialties; no one can know everything. And in the grand scheme of things, none of us really knows anything.
 
Funny that you still compare SDR pictures instead of using heatmaps and such. It only looks better in SDR. The contrast is higher in HDR, with even more depth. The HDR looks a lot better than the left one. There are a lot of ways to adjust HDR. I care about which one looks better lol.

As I always said, OLED shills can only see SDR. But whatever; their SDR images don't even look good when tone-mapped down, or they don't do proper tone mapping at all. It's very manipulative of you to do that while my SDR pictures are generated alongside the HDR images with proper tone mapping.

It's too obvious. Anyone with an actual HDR 1000 monitor or above knows which is better.

Let's see what more you can do.
 
Don't misunderstand. I'm not saying the left version is better. I was saying the new version (of the right-side image) is better looking, in both the heat map and the regular pic, than the original right-side image you posted earlier in the thread. It's a much better result than the first time imo. (y)

First version from earlier in the thread (focusing on the right-side panel):

[Image: RJHpg16.png]

2nd version (focusing on the right-side panel):

[Image: AvozkFM.jpg]
 
Funny they are the same pictures.
 
Perhaps something got lost in translation between the actual screenshotted images. 🤔 The brightness (of the screenshot image itself) differs between the first set of pics and the 2nd set. The 2nd set is darker overall than the 1st.

Both layers in the gifs below are "after" shots. There are no left-panel "before" shots in any of the images in this reply.

The bright/exposed layer is the 1st version. There's some major artifacting/loss/banding from the conversion to gif format, obviously, making them look extremely blocky, uglier, and more blown out than the original screenshots, but it still gives an idea of the difference in brightness between the 1st "after" and 2nd "after" shots. You can always refer back to the non-gif versions in the previous posts.

[Animated gif: eldenring_brightness-exposure_ss-not-heatmap.gif]

The first posted "after" pic of the heat maps is the flatter-looking layer: the one without the isolated darker hard red lines of detail in the tree (darker in the heat map's color gradient, not darker in the actual scene), and without things like the darker ribbed detail on the pillar on the right (which shows up if you look closely in the 2nd pic).

[Animated gif: eldenring_brightness-exposure_heatmap.gif]

If the screenshots were exactly the same, there would be no change in the gif between the two layers. Maybe the exposure was off in the first set of screenshots or something. I find the second versions look a lot better, more like what I'd want to see in game. More of the ("darker") mids are preserved, even if alongside brighter highs.
 
Funny that you're still comparing the 1st SDR picture, the one you generated yourself. Do you want to derail the topic even further, into how the website handles larger pictures and how you generated those SDR pictures yourself instead of using mine?
 

Those are for reference. You can refer to the originals like I said: the first set was a 24 MB download, I believe, and the 2nd set is in one of your more recent replies.

The difference between the two sets of "after" screenshots, even in your original posts, is there for whatever reason. If actual gameplay looks more like the 2nd pics and maps, that would look better to me. There are a lot of ways to modify how HDR looks in games and on screens after the fact anyway.
 
Funny that your super-overexposed SDR images, from whatever tone-mapping method you used, are supposedly more authentic than mine. You want to pick holes in them to fit your narrative, without understanding that it's meaningless to compare contrast on SDR images.
My SDR is generated directly from the actual original raw data at the same time the HDR is generated lol.
 
Don't feel stupid; I think the one guy does this for a living, though I could be wrong. I'm learning a lot looking up the stuff they're talking about and getting a general sense of how this works. Everyone has their specialties; no one can know everything. And in the grand scheme of things, none of us really knows anything.
People need to understand how basic contrast works. Contrast is the luminance ratio between different parts of a picture: for example, the luminance ratio between point A and point B.

The higher the ratio, the more contrast those two parts have. One aspect of HDR (High Dynamic Range) is contrast; another is the corresponding ratio in color volume.

For example:
In a low-contrast image-1, the ratio between A and B is B/A = 1.274.
In a high-contrast image-2, the ratio between the same locations A+ and B+ is B+/A+ = 1.581.

1.581 > 1.274, so (B+/A+)/(B/A) > 1. That means parts A+ and B+ have a larger luminance difference, i.e. higher contrast, than parts A and B in image-1.

And this applies to every random pair of points A and B in the high-contrast image vs. the low-contrast image. The ratio is always higher in image-2 if the contrast is increased properly.

If a monitor is capable of displaying a higher dynamic range, image-2 will look significantly better, because its contrast and color volume are closer to impactful real-life images.
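
If it helps, here is the same arithmetic as a minimal sketch (the nit values are hypothetical, just re-expressing the 1.274 and 1.581 ratios above with point A at 100 nits; they are not measurements from the actual screenshots):

```python
def contrast_ratio(luma_a: float, luma_b: float) -> float:
    """Point contrast as the luminance ratio of the brighter point over the darker."""
    lo, hi = sorted((luma_a, luma_b))
    return hi / lo

# The same two sample points before and after the regrade:
before = contrast_ratio(100.0, 127.4)   # B / A   = 1.274
after = contrast_ratio(100.0, 158.1)    # B+ / A+ = 1.581

print(after > before)    # True: the regrade raised the point contrast
print(after / before)    # ~1.24, i.e. (B+/A+) / (B/A) > 1
```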

[Image: 52345832775_b022f93b3c_o_d.png]

[Image: 52345406261_b5f6547f1d_o_d.png]

[Image: 52344449432_f797202ec1_o_d.png]
 
In terms of the capability of an HDR monitor, it's always better when the monitor can display a higher range, with both higher contrast and higher color volume.

From the RGB waveform, the higher dynamic range is obvious from how stretched the wave is.

1. Higher contrast, higher color volume
[Image: Wave-1.png]

2. Lower contrast, lower color volume
[Image: Wave-2.png]

But most monitors cannot even display the second image accurately. Sometimes certain displays, like OLED, just look like the 3rd or 4th image.

3. Even lower contrast, lower color volume
[Image: Wave-4.png]

4. Hard clip, blown out
[Image: Wave-3.png]
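
For anyone unfamiliar with reading these: an RGB waveform just plots, for each image column, the spread of channel values within that column, so a taller, more stretched wave means more of the range is in use. A minimal sketch of how one can be computed (assuming numpy/matplotlib and a hypothetical capture file frame.png; real scopes are more sophisticated):

```python
import numpy as np
import matplotlib.pyplot as plt

img = plt.imread("frame.png")[:, :, :3]     # H x W x RGB, values in [0, 1]
h, w, _ = img.shape

fig, ax = plt.subplots()
cols = np.repeat(np.arange(w), h)           # x: the column index of every pixel
for ch, color in enumerate(("red", "green", "blue")):
    vals = img[:, :, ch].T.reshape(-1)      # y: that channel's value for every pixel
    ax.scatter(cols, vals, s=0.1, c=color, alpha=0.05)

ax.set_xlabel("image column")
ax.set_ylabel("code value")                 # more vertical stretch = more range in use
plt.show()
```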
 
And a monitor like the PG32UQX is more impressive even though the original content needs tone mapping from 3,000 nits down to the PG32UQX's 1,800-nit peak brightness to display properly, which is configurable in RDR2's game options.

So you quite like the monitor? Do you think any PC display does HDR better?
 
I like what this monitor can do in terms of sustained brightness and color space, despite its shortcomings, such as only 1,152 dimming zones and slower response time. I use multiple monitors at the same time, with RGB waveforms often displayed alongside the content I watch. With monitors like this I can view content, or make it, on another level. Games look on another level as well.

For now, most content is still very limited in dynamic range. It doesn't look vivid or impactful. It doesn't look natural enough or saturated enough in either contrast or color volume. Most of it is more like 200-nit SDR plus a few highlights.
People think a high-APL scene doesn't have contrast. It has way more contrast than a low-APL scene, as shown in the previous post, considering how the grading algorithm stretches the RGB waveform. It's the display that limits people from seeing that contrast, due to either ABL or hard clipping. And the APL needs to be higher to look better; a realistic APL is a lot higher than current content's APL.

There are two reasons:
1. The grading techniques this content uses need to be improved. For example, DaVinci Resolve requires a lot of manual work to grade each luminance zone at a different level. It's precision work; only the biggest productions can afford the effort to make it look great.
2. Most displays can't even show this much range, so the final output of the content is toned down a lot. So it's better to master the content to at least show what the creators or distributors intend without being too compromised on the majority of displays.

But things are changing; some games can be fully extended to showcase the capability of this monitor. Companies like Microsoft have used a similar grading technique or algorithm in their games. Old movies from 20 years ago can be regraded to look as impactful as HDR 1000 or above if there is enough data for the grading. They won't even lose any original detail; they look more natural.

There isn't any other current panel that can do the same. A lot of panels can do low-APL scenes just fine, or look similar to one another (OLED, FALD IPS, FALD VA). But only this panel can do higher-APL scenes properly, with the full range of both contrast and color. That's where HDR has the most impact.
 
I see this monitor has dipped to around $2.3k again... debating having my local MC price match and picking one up... but I'm still not sure it's a worthy upgrade (for the price) over my PG27UQ for gaming. Need to see if any monitors are coming out soon that could compete on this front.
 
I do miss the HDR on this display. Nothing else gives such a "lightning bolt to the eyes" experience.
 
I just don't think any PC-related HDR content does this monitor justice to this day. 98% of games' HDR implementations are garbage.

Maybe in a few years.
 
HDR content comes directly from the PC and monitor themselves.
A lot of HDR games have sliders. People don't even bother to configure them, so they never see half of HDR's potential.
 
To those who are using this: how well does Auto HDR work with this monitor? Back when I had it, Auto HDR didn't exist, and neither did the SDR brightness slider, from what I remember. Can this be set to HDR permanently, or does the desktop look terrible?

The guy I sold mine to over a year ago wants to sell it and gave me a dirt-cheap offer. Debating grabbing it just for desktop use and high-APL HDR games.
 
"How good is Auto HDR?" "A lot of games just have terrible HDR implementation" These questions only exposes how underwhelming other monitors can display with little capability.

Windows Auto HDR corresponds to the dynamic range of a monitor. If a monitor is capable of HDR 1400, the SDR will be scaled to HDR 1400 with higher contrast.
This means if the monitor is good at HDR, Auto HDR will look good. If the monitor is crap at HDR, the Auto HDR looks like crap.

Here is an example of Arkham Knight Auto HDR captured from different monitors of HDR 400, HDR 1000, HDR 1400.
The high moon is around 500nits, 1000nits, 1600nits. It's obviously HDR 1400 has the most contrast and colors.
Auto HDR 400 vs 1000 vs 1400.png

The next thing Microsoft needs to do is to have more slider in different zones and boost more missing color to avoid posterization.

The funniest part is that a good HDR monitor is never bad at higher range of SDR. It already shows good HDR. There is no shy to limit the range on SDR. So the monitor can make SDR look like HDR the similar way Windows Auto HDR does. If people say their monitor's SDR cannot look like HDR, their monitor HDR sucks.
This means if you compare an Auto HDR image scaled from a HDR 400 monitor, the image will look like an image of SDR 400-500nits from another monitor that is capable of true HDR 1,000 or above.
So the contrast and highlight of the Auto HDR capture from a HDR 400 monitor are at the same level to the SDR capture on PG27UQ, PG32UQX, PA32UCG.
No doubt people say the two actual pictures look different to each other. It's because their monitors cannot display SDR at the level of HDR 400.
But the waveform and heatmaps say they look the same.
HDR 400 vs SDR 400
SDR 400 vs Auto HDR 400.png
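
As a rough model of that scaling behavior, here's a toy expansion. To be clear, this is not Microsoft's actual Auto HDR algorithm; the 200-nit SDR reference white and the boost shape are my own assumptions, purely for illustration:

```python
import numpy as np

def toy_auto_hdr(sdr: np.ndarray, peak_nits: float, sdr_white: float = 200.0) -> np.ndarray:
    """Map SDR values in [0, 1] to nits, pushing highlights toward the panel's peak."""
    luma = sdr_white * sdr ** 2.2                    # decode gamma 2.2 to display light
    boost = (peak_nits / sdr_white) ** (sdr ** 4)    # ~1 in shadows/mids, full boost at white
    return luma * boost

gradient = np.linspace(0.0, 1.0, 5)
print(toy_auto_hdr(gradient, 400.0).round(1))    # HDR 400 panel: highlights top out near 400 nits
print(toy_auto_hdr(gradient, 1400.0).round(1))   # HDR 1400 panel: same SDR, much brighter peaks
```

The point stands either way: the same SDR input ends up with brighter highlights, and therefore larger point-to-point ratios, on the more capable panel.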
 
Has anyone here with a PG32UQX tried to calibrate the screen with the Windows HDR Calibration app? The full-screen and peak-luminance calibration patterns seem to disappear around 750 nits when using the app. Seems very odd.
 
What are some games that really shine (pun intended) with HDR on this display? So far I've used RDR2, FH5, GoW, and Cyberpunk, and HDR looks fantastic. Even BF1, for such an old game, looks terrific in HDR. Any other games you guys recommend?

Also, I'm running HDR through Win 11, so games that don't support HDR natively use Auto HDR, which still makes them look way better than SDR on this display. Any specific HDR settings you guys recommend I tweak to get the best experience possible?
 
Ya, that doesn't sound right. My G8 calibrates to 1060 nits.

As for HDR game recommendations: any of the remastered Crysis games, A Plague Tale, Assassin's Creed.
 