Regular desktop use just doesn't have enough motion for it to make a practical difference, unless you are the type to jiggle your windows around just to be in awe of how smooth they are :p

Which is something you will do a couple of times when you get your first monitor that can do over 100 Hz. :)
 
Regular desktop use just doesn't have enough motion for it to make a practical difference, unless you are the type to jiggle your windows around just to be in awe of how smooth they are :p


...it's a habit...

But I really do notice it everywhere. Mouse cursors, dragging windows, and of course scrolling - I'm one of those people who could tell you what refresh rate your CRT was running at.
 
It's similar to the jump from 30 to 60 - try running at 30 fps for a bit in desktop use. Is it fine? Usable? Sure, but you can definitely feel the jump back to a higher Hz setting. That said, the jump to 120+ for desktop use is smaller than the 30 to 60 jump, but very welcome to me.
 
I like high Hz on desktop usage, but since I started using my Stream Deck array of hardware buttons in conjunction with DisplayFusion to combine hotkey assignment of different pre-defined window sizes and locations on different monitors on the fly, I hardly ever drag windows anymore. I just hit a physical button with my custom LED display graphics on them and the window is instantly where I want it, and it aligns perfectly with other windows I pop around to share the screen space.

I'd appreciate the higher Hz for scrolling perhaps, but with that much desk real estate I don't have to scroll quite as much either - I'd probably just hit the 60% right or 40% left screen position buttons, which make the browser window full height and narrow enough to read more comfortably.
The mouse movement would be nice, but overall it's not a big deal for my side monitors, considering my hardware hotkey button array setup and what I am reading and watching on them. The TCL, one of my two 43" 60 Hz screens, also has a strobing PWM backlight at a high rate of 120 Hz, which helps a little for clearer motion too. (Any lower a PWM rate would bother me of course; this implementation is good for the monitor's duties in this scenario.)

So I get 4K 43" screens for $230 - $270 each, for a ton of desktop/app/media playback space at very high contrast and black levels compared to non-FALD computer monitors (which is almost all computer monitors so far). The TCL is 4170:1 and the 43" Samsung is 6100:1. My center gaming monitor will always be high Hz + VRR (and overdrive if I can get it). I'm interested in replacing my 32" 1440p ~144 Hz VA with one of these 43" 4K 120 Hz VRR gaming monitors eventually, but I want to wait until the 1000-nit one is out and reviewed, so that might be a while.
 
I like high Hz on desktop usage, but since I started using my Stream Deck array of hardware buttons in conjunction with DisplayFusion to combine hotkey assignment of different pre-defined window sizes and locations on different monitors on the fly, I hardly ever drag windows anymore. I just hit a physical button with my custom LED display graphics on them and the window is instantly where I want it, and it aligns perfectly with other windows I pop around to share the screen space.

I'd like to see this in action!
 
DisplayFusion lets you set a shortcut/hotkey in the customization settings, where it has a targeting reticle in the hotkey dialog. (Right-click the tray icon -> Settings -> Functions...) Double-click the function to open an "Edit Function" dialog window...



You can drag that targeting reticle to whatever window you have sized and placed to your liking, and it will assign that placement to the shortcut automatically. The targeting icon fills in the Position X, Position Y, Width and Height values automatically from the window you dragged it to. You can also define functions in a way that places the window on whatever monitor the active window happens to be on, rather than targeting grid values based on the entire monitor array with the target icon, plus a lot of other options. There are also a lot of pre-made scripts to do common things, and you can edit them or make your own custom ones if you want to dig that far into it.

[Screenshot: DisplayFusion "Edit Function" dialog showing the position and size values]
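If you'd rather script the same idea without DisplayFusion, here's a minimal sketch of what a "move window to preset rectangle" hotkey action boils down to, using the raw Win32 API from Python via ctypes. This is not DisplayFusion's internals - the window title and coordinates are made-up examples:

```python
# Minimal sketch of a "snap window to preset rectangle" action using
# the Win32 API directly (not DisplayFusion's internals; the title and
# coordinates below are made-up examples).
import ctypes

user32 = ctypes.windll.user32

def place_window(title, x, y, width, height):
    """Find a top-level window by its exact title and move/resize it."""
    hwnd = user32.FindWindowW(None, title)  # exact title match
    if not hwnd:
        raise RuntimeError(f"No window titled {title!r}")
    # MoveWindow(hwnd, x, y, width, height, repaint)
    user32.MoveWindow(hwnd, x, y, width, height, True)

# Example preset: "60% right" on a 3840x2160 monitor, full height
place_window("Untitled - Notepad", x=1536, y=0, width=2304, height=2160)
```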


You don't need a Stream Deck to use the hotkeys, but I find it a lot more useful. I've settled on this setup for some time now, but I might adjust it more later.

[Screenshot: Stream Deck button layout configured for window management]


They sell a newer Stream Deck with a larger array of 32 buttons now too, but it's pretty expensive. The typical way to use one is reserving the upper-left corner for a "one level up" folder button, so it sort of wastes a button. I'd prefer they had a blank button above for that, even a smaller one, but that's the way it is, so a larger array would have more room. Even so, this works great for my uses. You can navigate to other button array "folders" for other apps and functions too if you take the time to build them. It works with OBS and some other apps directly via preconfigs as well, but that's not what I'm focusing on here.


The screenshot below is from an earlier button config that had f.lux +/- and volume on it, but I re-did all that for more window management as shown in the configuration screenshot above, since I have a volume knob and I wasn't really using f.lux much. It affected all three monitors at once anyway, so it wasn't that useful.

 
Full product page for ASUS XG438Q is up: https://www.asus.com/Monitors/ROG-Strix-XG438Q/

Interestingly, https://edgeup.asus.com/2019/say-hello-to-amds-radeon-rx-5700-xt-and-radeon-rx-5700/ mentions DisplayPort DSC support with the RX 5700 and XG438Q, but I don't know if that is just marketing weirdness, because I thought DSC would be tied to the version of this display that ASUS was showing off at AMD's Next Horizon event. I wonder what happened to that one, as it would be 144 Hz and DisplayHDR 1000. Or maybe they are just going to sell it as an even more upscale version of this display later in the year.
 
The interesting thing is the ASUS has FreeSync 2 support, so even though the HDR specs aren't as good as the Acer's, it will be able to use HDR with adaptive sync. I still have my doubts that the Acer will be a true HDR1000 display at that price point.

Now that more product details are coming out, I wanted to touch on this, especially since the ASUS page doesn't:

First, are we getting FreeSync over DisplayPort? I have a 4k monitor on my desk that only supports FreeSync over HDMI.
Second, if so, are we getting 4k120 4:4:4? That's been an issue in the past.

And yeah... will both be available across DisplayPort?
 
XG438Q manual also available: https://dlcdnets.asus.com/pub/ASUS/LCD Monitors/XG438Q/ASUS_XG438Q_English.pdf

It does seem rushed though, because the supported resolution list is wrong - it doesn't list 3840x2160 - so it was probably made on top of an existing manual for some other product.

Some interesting tidbits regarding the Picture In Picture and Picture By Picture features:
  • To turn on PIP/PBP function will disable Adaptive-Sync/ FreeSync, Dynamic Dimming and HDR function.
  • PIP/PBP function is supported below 60Hz.
  • 1920x2160@60Hz per input in horizontal split mode.
  • 3840x1080@60Hz in vertical mode.
  • 3840x1080@60Hz large window in vertical 3 way split mode (larger window on top or bottom).
  • 1920x2160@60Hz in horizontal 3 way split mode (larger window on either side).
  • 1920x1080@60Hz for small windows in any 3 way split modes.
  • 1920x1080@60Hz per input in 4 way split mode. I wonder how it does for 4 player, 4 computer couch co-op? :)
The "supported below 60 Hz" line is a bit odd but I guess it means it will drop down to 60 Hz per input when PIP/PBP is enabled. It's a bit of a shame but expected. For me this feature is important because it would allow me to use my desktop PC and work Macbook Pro connected to the same display simultaneously by splitting the massive screen in half.

It seems to allow a few favorite settings to be stored and that could be used to toggle between single input and split screen views easily.
 
Now that more product details are coming out, I wanted to touch on this, especially since the ASUS page doesn't:

First, are we getting FreeSync over DisplayPort? I have a 4k monitor on my desk that only supports FreeSync over HDMI.
Second, if so, are we getting 4k120 4:4:4? That's been an issue in the past.

And yeah... will both be available across DisplayPort?

From Wikipedia: DisplayPort 1.4 can support 4K UHD (3840 × 2160) at 120 Hz with 30 bit/px RGB color and HDR.

HDMI 2.0 ports on this will be bandwidth limited and can probably do 4K @ 60 Hz 4:2:2 with HDR or alternatively 4K @ 120 Hz 4:2:0 without HDR.

So to get the most out of this, connect with DisplayPort. It's too bad it only has one of them, so toggling between two computers will be a bit of a compromise, but probably not a real issue for most users.
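Rough back-of-the-envelope numbers for those claims (a minimal sketch; the ~8% blanking overhead is an approximation, real CVT-R2 timings vary per mode):

```python
# Back-of-the-envelope link bandwidth check. Effective rates are after
# 8b/10b line coding; blanking overhead is approximated at ~8%.
DP_1_4_GBPS = 25.92   # HBR3 x4 lanes, effective
HDMI_2_0_GBPS = 14.4  # 18 Gbit/s raw, effective

def needed_gbps(h, v, hz, bits_per_px, blanking=1.08):
    return h * v * hz * bits_per_px * blanking / 1e9

modes = [
    ("4K120 RGB 10-bit",  120, 30),  # needs DSC even on DP 1.4
    ("4K120 RGB 8-bit",   120, 24),  # just squeezes into DP 1.4
    ("4K120 4:2:0 8-bit", 120, 12),  # fits HDMI 2.0, no HDR
    ("4K60 4:2:2 10-bit",  60, 20),  # fits HDMI 2.0, with HDR
]
for desc, hz, bpp in modes:
    gbps = needed_gbps(3840, 2160, hz, bpp)
    print(f"{desc}: {gbps:5.1f} Gbit/s  "
          f"DP1.4={gbps <= DP_1_4_GBPS}  HDMI2.0={gbps <= HDMI_2_0_GBPS}")
```

Note the first line: 4K @ 120 Hz with 10-bit RGB exceeds DP 1.4's uncompressed rate, which is presumably where the DSC talk around this panel comes in.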

Whether it supports over 60 Hz or FreeSync over HDMI is still a mystery - neither the specs nor the manual says anything about that. Likewise, the product page does not have the kind of disclaimers the PG27UQ page has, where it states that screen's limitations. I don't know if Display Stream Compression comes into play on this display.
 
Whether it supports over 60 Hz or FreeSync over HDMI is still a mystery - neither the specs nor the manual says anything about that.

The manual shows 1440p120 as the max refresh and 4K60 as the max resolution for HDMI, and FreeSync over HDMI at 1440p120. So at least they're making use of the spec - and I was confused; I believe I was thinking of the next HDMI spec.
 
The manual shows 1440p120 as the max refresh and 4K60 as the max resolution for HDMI, and FreeSync over HDMI at 1440p120. So at least they're making use of the spec - and I was confused; I believe I was thinking of the next HDMI spec.

Hmm, maybe they have made a swift revision, because I could swear that info wasn't right in the manual before. A shame there's no 4K @ 120 Hz over HDMI, even with a limited color space. 2560x1440 is oddly missing from the DP resolution list, but it should be no trouble for the scaler.

"FreeSync support 48-120Hz under DP mode.Adaptive support FHD/QHD 48-120Hz under HDMI mode." according to the manual. While the minimum seems low, Freesync 2 requires Low Framerate Compensation and that should make sure you don't run into issues at lower framerates than that.

"HDR: High Dynamic Range. Contains three HDR modes (ASUS Cinema HDR, ASUS Gaming HDR and FreeSync2 HDR). ASUS Cinema HDR and ASUS Gaming HDR are selectable when input video is not AMD FreeSync2 HDR. FreeSync2 HDR is selectable when connected to AMD FreeSync2 HDR video. (FreeSync2 HDR non-support under HDMI mode)."

So FreeSync 2 + HDR doesn't seem to be possible over HDMI. This is a shame when considering using this with next-gen consoles, which will surely support it. Maybe it's another HDMI 2.0 limitation. I don't want to guess what the difference is between those other HDR modes; hopefully they aren't just another image quality crapifier like settings like these tend to be.

This seems to have its own set of compromises, but at least they are mostly on the HDMI inputs rather than issues with color spaces and refresh rates like on the PG27UQ.

EDIT: Added some more stuff.
 
Apparently FreeSync 2 won't work with Nvidia cards anyway, because it's an AMD exclusive. We have no idea how standard FreeSync will work with these displays and Nvidia cards (I'll let someone else go first). So the best us Nvidia people will get is HDR with no FreeSync. So the differences between these two displays are getting smaller and smaller.
 
Apparently FreeSync 2 won't work with Nvidia cards anyway, because it's an AMD exclusive. We have no idea how standard FreeSync will work with these displays and Nvidia cards (I'll let someone else go first). So the best us Nvidia people will get is HDR with no FreeSync. So the differences between these two displays are getting smaller and smaller.

Well, that's disappointing.

Without proper variable refresh on Nvidia GPUs these screens are completely useless.

It doesn't matter how awesome or not FreeSync 2 may be. AMD has no GPU that can handle modern titles at 4K 60 Hz, let alone 120 Hz.

If this is true, I don't understand why they are even being made.
 
Apparently FreeSync 2 won't work with Nvidia cards anyway, because it's an AMD exclusive.

Gonna need a link for that- Freesync is a collection of related but separate features, and Freesync 2 is really just a minimum feature level.
 
Gonna need a link for that- Freesync is a collection of related but separate features, and Freesync 2 is really just a minimum feature level.

We also haven’t tested any FreeSync 2 monitors with Nvidia GPUs. Because FreeSync 2 is an HDR pipeline exclusive to AMD that allows the game to talk directly to the monitor for lower latency HDR processing, we don’t expect this functionality to work with Nvidia GPUs. However, this won’t stop regular HDR from working in conjunction with adaptive sync on Nvidia GPUs, as we’ve already mentioned. So those that own or are thinking of buying a FreeSync 2 monitor will get HDR functionality, just not FreeSync 2 HDR functionality in the limited selection of games that support it.

https://www.techspot.com/article/1779-freesync-and-nvidia-geforce/

Don't know how accurate this is - I couldn't find anything else where people said they'd actually tested FreeSync 2 features with an Nvidia card.
 
The 'lower latency HDR' thing, if provably an issue, has to have more to do with the asstastic implementation of HDR on the desktop than anything else. So if it is a provable thing now, it's not going to be an issue going forward.
 
https://www.techspot.com/article/1779-freesync-and-nvidia-geforce/

Don't know how accurate this is - I couldn't find anything else where people said they'd actually tested FreeSync 2 features with an Nvidia card.

They had a "revisited" article where several Freesync 2 and HDR monitors worked just fine.

Freesync 2 HDR is yet another confusing term from AMD, as it means several things:

* Adaptive sync
* HDR support w/ adaptive sync
* An HDR latency-reducing feature that game developers can implement.

The last one is described here like this:

FreeSync 2 HDR is an open standard developed in part by AMD in order to provide game developers with low latency HDR displays and allow games to utilize their full displayable color and brightness range. The major latency problem that is seen in current displays is that games tone and gamut map their frames to be rendered in a standard HDR color space, and the monitor then also tone and gamut maps these frames to its native color space, resulting in double the amount of work and longer latencies as can be seen below.

[Diagram: FreeSync 2 HDR single-pass tone and gamut mapping vs. the conventional double-mapping HDR pipeline]


FreeSync 2 HDR removes this latency by having the game do the tone and gamut mapping directly to the monitor’s native color space, leaving the monitor to just display the image without modification and thus lower latency.

One other issue with current monitors is that regular monitors tend to have filters and post processing applied to the game’s frame before it is displayed, and that post processing is not controllable or able to be disabled by the developer. These filters generally over-brighten or over-saturate the frame, making the game look different when rendered on different monitors.

This issue is solved by using FreeSync 2 HDR’s display modes, which guarantee that the frame provided by the game will be displayed as-is, without any post-processed modification. The monitor also must pass an AMD certification process which verifies that colors are accurately displayed, resulting in a more consistent display of the game across different FreeSync 2 HDR monitors.

So that is the feature that will most likely be AMD GPU only as Nvidia most likely doesn't/can't implement this feature on their GPUs. There isn't even a comprehensive list of games that support it; according to this, Assassin's Creed: Odyssey, Far Cry 5, Resident Evil 2, The Division 2 and some others do.

Seriously, why couldn't they call this something like "Low Latency HDR" to separate it from the rest of the Freesync 2 HDR features?

To reiterate, the important parts of Freesync 2 HDR - adaptive sync, HDR and LFC - will work just fine on Nvidia GPUs.
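To make the pipeline difference concrete, here's a toy sketch of the idea - not AMD's actual API, and the Reinhard-style curve and 600-nit panel peak are stand-ins:

```python
# Toy illustration of the FreeSync 2 HDR latency argument. The point
# is WHO does the tone mapping, not the math; the curve is a stand-in.

def tone_map(nits, peak):
    """Compress scene luminance toward a display peak (toy Reinhard curve)."""
    return nits / (1.0 + nits / peak)

scene_nits = 3000.0  # a bright HDR highlight from the renderer
PANEL_PEAK = 600.0   # hypothetical native peak of the monitor

# Conventional path: the game maps to a generic HDR target, then the
# monitor tone maps AGAIN to its panel. Two passes, and the second one
# happens inside the monitor's processing = extra display latency.
intermediate = tone_map(scene_nits, 10000.0)   # game -> HDR10 ceiling
two_pass = tone_map(intermediate, PANEL_PEAK)  # monitor -> panel

# FreeSync 2 HDR path: the driver hands the game the panel's native
# peak/gamut up front, the game maps straight to it in one pass, and
# the monitor displays the result unmodified.
one_pass = tone_map(scene_nits, PANEL_PEAK)    # game -> panel

print(f"two-pass: {two_pass:.0f} nits, one-pass: {one_pass:.0f} nits")
```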
 
So that is the feature that will most likely be AMD GPU only as Nvidia most likely doesn't/can't implement this feature on their GPUs.

They likely have not yet implemented "low-latency HDR", but we don't have evidence that they can't implement it. Further, it mostly sounds like a problem with two solutions: speeding up HDR processing on monitors, which should be the priority, and passing the monitor's gamut to games, which shouldn't be an issue in the first place and should be part of fixing HDR on the desktop overall.
 
They likely have not yet implemented "low-latency HDR", but we don't have evidence that they can't implement it. Further, it mostly sounds like a problem with two solutions: speeding up HDR processing on monitors, which should be the priority, and passing the monitor's gamut to games, which shouldn't be an issue in the first place and should be part of fixing HDR on the desktop overall.

The only reason Nvidia would not be able to implement it is if some patents are involved, which would mean extra paperwork and possibly license fees to get the feature in. Otherwise they could definitely implement it, but I would not hold my breath, because Nvidia is notorious for not adding requested features.
 
Otherwise they could definitely implement it, but I would not hold my breath, because Nvidia is notorious for not adding requested features.

Agreed, though no doubt they're in no hurry with HDR, as Microsoft seems to be in no hurry with HDR either.
 
They had a "revisited" article where several Freesync 2 and HDR monitors worked just fine.

Freesync 2 HDR is yet another confusing term from AMD, as it means several things:

* Adaptive sync
* HDR support w/ adaptive sync
* An HDR latency-reducing feature that game developers can implement.

The last one is described here like this:



So that is the feature that will most likely be AMD GPU only as Nvidia most likely doesn't/can't implement this feature on their GPUs. There isn't even a comprehensive list of games that support it; according to this, Assassin's Creed: Odyssey, Far Cry 5, Resident Evil 2, The Division 2 and some others do.

Seriously, why couldn't they call this something like "Low Latency HDR" to separate it from the rest of the Freesync 2 HDR features?

To reiterate, the important parts of Freesync 2 HDR - adaptive sync, HDR and LFC - will work just fine on Nvidia GPUs.

That is quite interesting.

Has anyone seen this quantified? How much of an input lag difference when displaying HDR content are we talking about?
 
That is quite interesting.

Has anyone seen this quantified? How much of an input lag difference when displaying HDR content are we talking about?

No idea- haven't paid close attention to it since the software stack is still ratfucked.

It strikes me as similar to nicer monitors about a decade ago that had fancy scalers that added significant input lag. Picked up my HP ZR30w instead of Dell's U3011 for this reason. It has no scaler.

And if 'HDR' in monitors is adding input lag due to processing, then we're just a less halfassed iteration away from that being a non-issue.
 
And if 'HDR' in monitors is adding input lag due to processing, then we're just a less halfassed iteration away from that being a non-issue.

HDR without doubt adds more input lag. On my 2016 Samsung KS7005 (Nordic KS8000), SDR vs HDR is 22 vs 37 ms input lag. It is noticeable when playing too, unfortunately.

In newer TV models they have gotten HDR input lag down to around the same level as SDR, so it becomes a non-issue; I'm not quite sure how much of a difference the game-implemented FreeSync 2 HDR latency feature would make.

I expect that there won't be any noticeable difference in SDR vs HDR on the XG438Q regardless.
 
As long as I'm on NVIDIA hardware, I'll keep buying G-Sync displays. Speaking of which, anyone have any idea on pricing for these two displays? I'm betting they are north of 2K, and a G-Sync version would undoubtedly be worse.
 
HDR without doubt adds more input lag. On my 2016 Samsung KS7005 (Nordic KS8000), SDR vs HDR is 22 vs 37 ms input lag. It is noticeable when playing too, unfortunately.

In newer TV models they have gotten HDR input lag down to around the same level as SDR, so it becomes a non-issue; I'm not quite sure how much of a difference the game-implemented FreeSync 2 HDR latency feature would make.

I get that there are sets where there is a testable difference between modes - what I'm pointing out is that this seems to be a function of the ASICs used in the sets more so than a result of using HDR encoding.

I expect that there won't be any noticeable difference in SDR vs HDR on the XG438Q regardless.

That's what I'm hoping for...
 
As long as I'm on NVIDIA hardware, I'll keep buying G-Sync displays. Speaking of which, anyone have any idea on pricing for these two displays? I'm betting they are north of 2K, and a G-Sync version would undoubtedly be worse.

Somewhere between US$1,000 and US$1,500 it seems- no one is really talking specifics, but being Freesync and non-FALD does seem to put a limit on pricing.

Also, while I mostly agree on G-Sync- on some displays I've seen the delta drop to sub-US$80- well-implemented Freesync is more than acceptable, and if a display is pimping Freesync 2 HDR (and deserves the label), it's well-implemented.

Aside from the technological difference, my main beef with Freesync has been the shitshow of feature support along with exceedingly poor OEM labeling and independent verification. Freesync 2 HDR more or less fixes that, to the point that the differences should be imperceptible.
 
[Image: AMD Radeon FreeSync 2 HDR logo]


AMD Radeon FreeSync™ 2 HDR Technology

AMD Radeon FreeSync 2 HDR technology raises the bar to the next level for gaming displays, enabling an exceptional user experience when playing HDR games, movies and other content:

  • Meticulous monitor certification process to ensure exceptional visual experiences
  • Guaranteed support for Low Framerate Compensation (LFC)
  • Guaranteed Support for displaying HDR content
  • Low latency

 
I can't find any more info than the couple of blurbs I've seen out on the web, but is the Acer version still going to be able to use 2 DisplayPort inputs to get full 120 Hz 4:4:4 10-bit at 4K bandwidth? I read that that would be the case, but details on that monitor are much more sparse at the moment.
 
Woah, did not know that, but it makes sense. Crosses that monitor off the list for me. HDMI 2.1 / DP 1.5 can't come soon enough.

Thanks for the insight Vega :)
 
DP 2.0 is the next big thing, but ya displays constantly being hamstrung is getting old!
 
You can't use VRR on a display while using two inputs which sucks.

What does this mean exactly? Does it mean that dual DisplayPort inputs are required for full features (4K 4:4:4 10-bit), but VRR will not work when both inputs are used? Is this true for both the Asus and Acer displays?
 
Yes, you need to get the DSC version of the display if you want high color bit depth, no chroma sub-sampling and VRR all at the same time.
 
Yes, you need to get the DSC version of the display if you want high color bit depth, no chroma sub-sampling and VRR all at the same time.

Currently, only AMD supports DSC, which is a shame. DSC is also not a silver bullet, as it does a form of chroma compression to a certain extent (so there is some loss of detail). So if I'm using Nvidia, what's the max I can drive the XG43UQ at without DSC? Maybe 4K at 8-bit?
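Quick arithmetic on that guess, reusing the same rough ~8% blanking assumption as the earlier bandwidth sketch:

```python
# What fits through DP 1.4 without DSC at 3840x2160? Solve for the max
# refresh under the ~25.92 Gbit/s effective rate. Blanking overhead is
# a rough ~8% guess, so treat these as ballpark figures.
DP_1_4_GBPS = 25.92

def max_refresh_hz(bits_per_px, h=3840, v=2160, blanking=1.08):
    return DP_1_4_GBPS * 1e9 / (h * v * bits_per_px * blanking)

print("RGB 8-bit   :", round(max_refresh_hz(24)), "Hz")  # ~120 Hz
print("RGB 10-bit  :", round(max_refresh_hz(30)), "Hz")  # ~96 Hz
print("4:2:2 10-bit:", round(max_refresh_hz(20)), "Hz")  # ~145 Hz
```

So 4K120 at 8-bit RGB (or 10-bit with 4:2:2 subsampling) looks about right for the no-DSC ceiling.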
 
Currently, only AMD supports DSC, which is a shame. DSC is also not a silver bullet, as it does a form of chroma compression to a certain extent (so there is some loss of detail). So if I'm using Nvidia, what's the max I can drive the XG43UQ at without DSC? Maybe 4K at 8-bit?
One of the more understated changes comes with the display outputs, which thanks to Turing's new display controller now features DisplayPort 1.4 and DSC support, the latter of which is part of the DP1.4 spec. The eye-catching addition is the VR-centric USB-C VirtualLink port, which also carries an associated 30W not included in the overall TDP.
https://www.anandtech.com/show/1334...tx-2080-ti-and-2080-founders-edition-review/4

Lots of sites say different.
 