The perfect 4K 43” monitor! Soon! Asus XG438Q ROG

Well, the Acer 43" is supposed to have an MSRP of $1,299 in the US, so I don't expect the ASUS to cost more than that.
 
Well, the Acer 43" is supposed to have an MSRP of $1,299 in the US, so I don't expect the ASUS to cost more than that.

You DON'T expect Asus to charge more for their products? Have you been living in a remote Antarctic outpost with no contact with the outside world for the last few years? :D
 
If you go back to the product link in the first message, it now shows an RRP of £1,099, which usually means it will have a similar price in the US. If this info is correct, it's a very reasonable price.

No, they just updated it yesterday! (Asus ROG Strix XG43-Q)

As a matter of fact, they are now showing it as FreeSync 2 (and not as G-Sync Compatible). It also goes to 120 Hz instead of 144 Hz. (ed: Sorry, I was thinking of the LG 43" and confused the two.)

footnote: Hexus



I would have to think that, in the States, this would be at least $1,399 (similar to what my Acer X34 was four years ago). If these come in at just around a grand, I will buy two.
 
The Acer doesn't have FreeSync 2, though, which means no HDR with FreeSync.

Means nothing to me, as I'll be using an RTX 2080, so I'd rather have G-Sync. I'd also rather have HDR 1000 for when I want to watch 4K HDR content (although I'd mostly watch that on my 65" 4K TV at home, which hits over 1,500 cd/m², there are times I want to do that in my office).
 
Means nothing to me, as I'll be using an RTX 2080, so I'd rather have G-Sync. I'd also rather have HDR 1000 for when I want to watch 4K HDR content (although I'd mostly watch that on my 65" 4K TV at home, which hits over 1,500 cd/m², there are times I want to do that in my office).
Not sure what you mean by "so I'd rather have G-Sync"? These displays were never planned to have G-Sync, and if you'd rather have G-Sync with an RTX 2080, why are you getting this display?

I have a 2080 Ti and am hoping either of these displays works well with the FreeSync compatibility of Nvidia's drivers. We'll probably be seeing fewer "G-Sync"-equipped displays for just this reason.
 
Means nothing to me, as I'll be using an RTX 2080, so I'd rather have G-Sync. I'd also rather have HDR 1000 for when I want to watch 4K HDR content (although I'd mostly watch that on my 65" 4K TV at home, which hits over 1,500 cd/m², there are times I want to do that in my office).

Most people are taking this opportunity to get away from G-Sync.


I too have an RTX 2080 and a G-Sync X34 monitor. But both will be upgraded before this holiday shopping season, and I intend to buy a monitor that DOESN'T have G-Sync, because over the life of each monitor I usually end up going through several GPUs. That means I am stuck with Jensen's cards.

If I break my cycle, sell off my RTX 2080 once big Navi hits, and buy this new FreeSync 2 monitor, then I will be on the standard that the new Xbox, PlayStation, and manufacturers are all going towards.


You will see far more RDNA + FreeSync in the future than you will see G-Sync.
 
Most people are taking this opportunity to get away from G-Sync.

Citation

That means I am stuck with Jensen's cards.

Wouldn't it suck to have the fastest GPU options available?

then I will be on the standard that the new Xbox, PlayStation, and manufacturers are all going towards.

So you don't put your consoles in the living room with the big TV and the couch?

You will see far more RDNA + FreeSync in the future than you will see G-Sync.

Given AMD's lackluster and very late release of 'RDNA', the evidence supports the opposite.
 
IMO G-Sync still has the best fire-and-forget VRR experience.
 
IMO G-Sync still has the best fire-and-forget VRR experience.
That's great, but once again, these displays never were planned to have G-Sync, so why are we discussing it?

I am more interested in how the Freesync features in these displays work with Nvidia's cards.
 
Because G-Sync is the main competitor to the version of VRR that this display uses? It's well within the bounds of reasonable discussion that this display may not have the optimal form of VRR. I await the tests to see how it pans out.

Who speak'em Chine?



 
Because G-Sync is the main competitor to the version of VRR that this display uses? It's well within the bounds of reasonable discussion that this display may not have the optimal form of VRR. I await the tests to see how it pans out.

Who speak'em Chine?




I thought these displays were strictly Freesync only? You're talking G-Sync in Nvidia's drivers? I thought that was just adaptive sync compatibility?
 
I thought these displays were strictly Freesync only? You're talking G-Sync in Nvidia's drivers? I thought that was just adaptive sync compatibility?

I think it gets confusing because Nvidia brands Adaptive Sync screens that are tested and certified to work with their new drivers with some of the G-Sync branding, despite their having nothing to do with the G-Sync module technology.
 
I think it gets confusing because Nvidia brands Adaptive Sync screens that are tested and certified to work with their new drivers with some of the G-Sync branding, despite their having nothing to do with the G-Sync module technology.
Right. That's what I thought. So I get confused when people say these displays are G-Sync when they're marketed as Freesync.
 
Not sure what you mean by "so I'd rather have G-Sync"? These displays were never planned to have G-Sync, and if you'd rather have G-Sync with an RTX 2080, why are you getting this display?

I have a 2080 Ti and am hoping either of these displays works well with the FreeSync compatibility of Nvidia's drivers. We'll probably be seeing fewer "G-Sync"-equipped displays for just this reason.

Look at the 1:59 mark of this video from Computex in June.



It says "nVidia G-Sync Compatible for PC/Console".
 
I'm pretty happy with my 34" ultrawide. However, I think the 40" 4K TV I used to have was just about perfect.
 
It says "nVidia G-Sync Compatible for PC/Console".

G-Sync Compatible = Freesync.

Nvidia now has several tiers:

G-Sync Ultimate: 4K, high refresh rate, HDR, G-Sync
G-Sync: 1440p and ultrawides, high refresh rate, G-Sync
G-Sync Compatible: Freesync monitors that Nvidia has certified to work with their GPUs. Other displays may be G-Sync compatible but untested.

https://www.nvidia.com/en-us/geforce/products/g-sync-monitors/
 
Also note that G-Sync Compatible means it works, but you may not get the full functionality of either FreeSync with an AMD card or G-Sync with an Nvidia card.

For example, my 240Hz FreeSync monitor works with both AMD and Nvidia. However, on AMD the range seems better, especially with low framerates between 30 and 50. With Nvidia, everything is good above 50fps, but lower becomes a bit choppy.

But with AMD it is practically playable down to around 30 fps; it still feels okay. Obviously I would like higher framerates, but I was surprised that 30 fps was playable.
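For what it's worth, the low-framerate behavior described above usually comes down to Low Framerate Compensation (LFC): when the frame rate drops below the panel's minimum VRR rate, the driver repeats each frame enough times to land the refresh rate back inside the supported range. A rough sketch of the idea (the function name and the example 48-240 Hz range are illustrative assumptions, not any monitor's actual spec):

```python
# Illustrative sketch of Low Framerate Compensation (LFC), not vendor code.
# Names and the example VRR range (48-240 Hz) are assumptions.

def lfc_refresh(fps: float, vrr_min: float = 48.0, vrr_max: float = 240.0) -> float:
    """Return the refresh rate the panel would actually run at.

    Inside the VRR range, the refresh rate simply tracks the frame rate.
    Below the panel's minimum, each frame is repeated an integer number
    of times until the effective refresh rate is back in range.
    """
    if fps >= vrr_min:
        return min(fps, vrr_max)  # in range: refresh tracks frame rate
    multiplier = 2
    while fps * multiplier < vrr_min:
        multiplier += 1
    return fps * multiplier  # each frame is shown `multiplier` times

# 30 fps on a 48-240 Hz panel: frames are doubled, panel runs at 60 Hz
print(lfc_refresh(30.0))   # 60.0
print(lfc_refresh(100.0))  # 100.0
```

This also hints at why wide-range panels handle 30-50 fps more gracefully: AMD's LFC generally needs the panel's maximum refresh rate to be at least roughly twice its minimum before frame doubling has room to work.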
 
For example, my 240Hz FreeSync monitor works with both AMD and Nvidia. However, on AMD the range seems better, especially with low framerates between 30 and 50. With Nvidia, everything is good above 50fps, but lower becomes a bit choppy.

But with AMD it is practically playable down to around 30 fps; it still feels okay. Obviously I would like higher framerates, but I was surprised that 30 fps was playable.

This is unfortunately going to vary from display to display. Hopefully monitor companies commit to even performance across platforms.

G-Sync Compatible: Freesync monitors that Nvidia has certified to work with their GPUs. Other displays may be G-Sync compatible but untested.

I'd like to see Nvidia categorize Freesync 2 monitors. Need more divisions of their 'G-Sync Compatible' ratings.
 
Look at the 1:59 mark of this video from Computex in June.



It says "nVidia G-Sync Compatible for PC/Console".



Yeah, but it's not true G-Sync. It's driver compatibility with VRR. Nvidia is intentionally muddying the waters here a bit.
 
Yeah, but it's not true G-Sync. It's driver compatibility with VRR. Nvidia is intentionally muddying the waters here a bit.

That's one way to look at it; when Nvidia decided to enable compatibility with DP VRR, they started up a certification program to identify those DP VRR displays that were problem free as far as DP VRR can be. They are still far more stringent for displays that use G-Sync modules, of course, but in general if a display is listed as 'G-Sync Compatible' you know you're getting the next best thing to actual G-Sync.

And well, it's nice to know, even if their range of certified monitors isn't that great and perhaps arbitrarily excludes monitors that should be certified. One example is that Nvidia requires VRR to be enabled by default (like G-Sync) in order to gain the certification, and for most FreeSync monitors it seems that VRR isn't enabled by default.
 
Right. That's what I thought. So I get confused when people say these displays are G-Sync when they're marketed as Freesync.

I could have been a little clearer. I meant that displays that have a physical G-Sync chip in them typically have superior VRR/overdrive. I am currently using a FreeSync monitor in "G-Sync Compatible mode" and it's not nearly as good as a "real" G-Sync display.

Hopefully they make this FreeSync display's overdrive as good as they can.
 
I meant that displays that have a physical G-Sync chip in them typically have superior VRR/overdrive. I am currently using a FreeSync monitor in "G-Sync Compatible mode" and it's not nearly as good as a "real" G-Sync display.

I need to get my hands on a good Freesync display. The only one I have supports Freesync over HDMI, and that's not going to work with my 1080Ti.

In general, good FreeSync displays, whatever they're marketed as, should be nearly indistinguishable from the same display with a G-Sync module. Of course that's going to vary significantly with the viewer and what's played, which is why G-Sync is still 'the best' by default.
 
Considering there are many more Nvidia cards out there than AMD, you'd think display manufacturers would want to stringently test their newly released FreeSync displays for Nvidia compatibility.
 
Considering there are many more Nvidia cards out there than AMD, you'd think display manufacturers would want to stringently test their newly released FreeSync displays for Nvidia compatibility.

All evidence points to them doing just that, at the glacial speed of the display industry, at least. And it's a bit of a new thing. I had no idea that Nvidia was going to allow 'G-Sync Compatible' branding with stickers and labels and so on when they enabled support for DP VRR. They didn't skip a beat with the switch in narrative, as a company with their competitive nature should be expected to, and on balance at least they're following through with sorting stuff out and getting 'official' information on performance and compatibility visible to consumers.
 
All evidence points to them doing just that, at the glacial speed of the display industry, at least. And it's a bit of a new thing. I had no idea that Nvidia was going to allow 'G-Sync Compatible' branding with stickers and labels and so on when they enabled support for DP VRR. They didn't skip a beat with the switch in narrative, as a company with their competitive nature should be expected to, and on balance at least they're following through with sorting stuff out and getting 'official' information on performance and compatibility visible to consumers.
At least Asus will; I have no idea about Acer, though.
 
Also, I wonder when ASUS will put out an ELMB-Sync version with this panel.

Really interested in the results of that technology given how slow non-TN monitors are today.
 
Citation
Wouldn't it suck to have the fastest GPU options available?
So you don't put your consoles in the living room with the big TV and the couch?
Given AMD's lackluster and very late release of 'RDNA', the evidence supports the opposite.


Why do you need a citation when you know for a fact that more and more people are choosing FreeSync and FreeSync 2 over G-Sync due to cost? Seriously, are you so well paid that you constantly play the "I live under a rock, inform me" narrative of being purposely dumb?


Almost all Samsung TVs being sold now are FreeSync 2 compatible... because they know every Xbox and PlayStation owner is also going to have a FreeSync 2 capable console, powered by RDNA.
 
So you are pretending not to know that G-Sync monitors cost more and that people have been moving over to FreeSync for the last few years? So much so that Jensen had to relent and enable "FreeSync" compatibility?

You know none of this? And you have 10K posts?
 
So you are pretending not to know that G-Sync monitors cost more and that people have been moving over to FreeSync for the last few years? So much so that Jensen had to relent and enable "FreeSync" compatibility?

Pretending? I'm not pretending anything. Specifically, I'm not pretending to know 'for a fact' what sales numbers are nor what customer motivations are.

You know none of this? And you have 10K posts in 11 years?

Might want to head back to math class.
 
Pretending? I'm not pretending anything. Specifically, I'm not pretending to know 'for a fact' what sales numbers are nor what customer motivations are.

Might want to head back to math class.

Yes, you are pretending not to know that Nvidia's G-Sync monitors cost more. And then feigning ignorance about price/performance and pretending not to know that a higher-cost G-Sync chip affects sales... :ROFLMAO:


Or to not have known that Nvidia enabled "FreeSync" compatibility... even though in an earlier post you acknowledged knowing this. But you have zero clue why Nvidia finally relented. No clue, huh?





Putting you on ignore. I understand your purpose here.
 
Yes, you are pretending not to know that Nvidia's G-Sync monitors cost more.

I'm quite aware that they cost more. The delta is US$50 to US$80 where an equivalent Freesync monitor is available.

And then feigning ignorance about price/performance and pretending not to know that a higher-cost G-Sync chip affects sales... :ROFLMAO:

Quite aware of how pricing affects sales, but also of performance, and of why AMD has a minuscule share of the market today.

Or to not have known that Nvidia enabled "FreeSync" compatibility... even though in an earlier post you acknowledged knowing this.

?

But you have zero clue why Nvidia finally relented. No clue, huh?

Several. But I'm sure you're going to link an Nvidia press release as to exactly why Nvidia enabled Freesync. Right?

Putting you on ignore. I understand your purpose here.

To think I just took you off...
 