The perfect 4K 43” monitor! Soon! Asus XG438Q ROG

Sure, you can sit inside the longer focal length of a gently curved, wider-angle screen, but a curve works best when its focal length puts the focal point right where you're sitting. I've read reviews of curved screens where people wished they were even more curved for exactly that reason (especially with regard to VA shift). For farther viewing, a wider angle would be more beneficial. I'm not saying it's unusable in either scenario or at either curvature, it's just not optimal, and you can't change a static curvature, so that's a trade-off to me. It would be better if we had no bezels and some kind of flexible OLED tech, at least at the transitions.






I actually like the idea of one large curved screen, but to me all the ones they've made so far are extremely short in height, especially the 34" diagonal, ~13" tall ones, so they end up looking like a belt-height window slot rather than a full window, so to speak. I think a 19.6"-tall ultrawide (the height of a 40" 16:9) running an "8K"-height 4320-pixel mode at 60Hz, with integer scaling to a "4K"-height 2160-pixel mode at 120Hz, would be perfect, preferably in OLED, later in mini LED FALD, dual-layer LCD, and micro LED. Unfortunately that probably won't happen. Years later, hopefully VR and AR will have ultra-high resolution and be capable of making screens in virtual space with high detail at whatever proportions you'd want.
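To make the integer scaling part concrete, here's a rough Python sketch (my own toy example, nothing monitor-specific): each source pixel just becomes an N x N block, so a 2160-line image maps exactly onto a 4320-line panel with no interpolation blur.

```python
# Toy nearest-neighbour integer scaling: every source pixel becomes a
# factor x factor block, so nothing is interpolated or blurred.
def integer_scale(image, factor=2):
    """`image` is a list of rows, each row a list of pixel values."""
    scaled = []
    for row in image:
        stretched = [px for px in row for _ in range(factor)]   # widen each row
        scaled.extend(list(stretched) for _ in range(factor))   # repeat it vertically
    return scaled

src = [[1, 2],
       [3, 4]]
for row in integer_scale(src):
    print(row)   # the 2x2 source becomes a crisp 4x4 block pattern
```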


In the more realistic near future, I'll probably end up with a 55" HDMI 2.1 OLED (or 48", depending on release date), run a 21:9 resolution on it while gaming, and keep other screen(s) on the side for desktop/app use without burn-in risk.
 
There are some big flaws on this display, but not being curved is a non-issue imho.

bigger issues as i see them:

BGR layout
less than optimal overdrive
slow pixels

Why is BGR a problem? It is a common pixel format in VA panel televisions and I never saw it as a problem there. At least in the case of 4K screens the pixels are so fricking tiny that, to my eyes anyway, it really shouldn't matter what order the subpixels are in.
 
Why is BGR a problem? It is a common pixel format in VA panel televisions and I never saw it as a problem there. At least in the case of 4K screens the pixels are so fricking tiny that, to my eyes anyway, it really shouldn't matter what order the subpixels are in.

It doesn't work as well as RGB for subpixel smoothing of text. Even though you can change Windows ClearType to use BGR subpixel smoothing, it doesn't look as good as RGB. macOS doesn't support BGR at all, but you can use grayscale smoothing or turn smoothing off.

Honestly it's just a subpixel format that has no good reason to actually exist and I have no idea why any display would be made like that when they could just turn the panel upside down and design the rest accordingly.
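For what it's worth, on Windows the subpixel order ClearType assumes can be flipped programmatically as well as through the ClearType Tuner. Here's a small Python sketch using the Win32 SystemParametersInfo call; the constant values are my assumption from the WinUser.h definitions, so treat it as a sketch rather than gospel:

```python
# Sketch: tell Windows font smoothing that the panel's subpixels run B-G-R.
# Constant values below are assumed from the Win32 headers (WinUser.h).
import ctypes

SPI_SETFONTSMOOTHINGORIENTATION = 0x2013
FE_FONTSMOOTHINGORIENTATIONBGR = 0x0000   # blue subpixel is leftmost
FE_FONTSMOOTHINGORIENTATIONRGB = 0x0001   # red subpixel is leftmost
SPIF_UPDATEINIFILE = 0x01                 # persist the setting
SPIF_SENDCHANGE = 0x02                    # notify running apps

def set_cleartype_orientation(bgr: bool) -> bool:
    value = FE_FONTSMOOTHINGORIENTATIONBGR if bgr else FE_FONTSMOOTHINGORIENTATIONRGB
    ok = ctypes.windll.user32.SystemParametersInfoW(
        SPI_SETFONTSMOOTHINGORIENTATION, 0, ctypes.c_void_p(value),
        SPIF_UPDATEINIFILE | SPIF_SENDCHANGE)
    return bool(ok)

if __name__ == "__main__":
    print("Switched ClearType to BGR:", set_cleartype_orientation(bgr=True))
```

Even with the orientation flipped, text on a BGR panel still tends to look a bit worse than on a true RGB layout, which matches what people report.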
 
It doesn't work as well as RGB for subpixel smoothing of text. Even though you can change Windows ClearType to use BGR subpixel smoothing, it doesn't look as good as RGB. macOS doesn't support BGR at all, but you can use grayscale smoothing or turn smoothing off.

Honestly it's just a subpixel format that has no good reason to actually exist and I have no idea why any display would be made like that when they could just turn the panel upside down and design the rest accordingly.

Well, if I have to guess, BGR was not developed for any particular reason. The panels just ended up being assembled upside down on the assembly line, and TV manufacturers did not give a shit because it does not matter for media consumption. And they still use the same assembly line for all the panels they create. But for office work where you have to read a lot of text, I guess the subpixel order might be a problem, especially on a Mac if it does not have a ClearType equivalent. But then again this monitor is clearly not for office work; it is for games and movies first and foremost.
 
Spotted this video showing XG438Q motion blur. Looks pretty bad at anything except Level 5 OD, which just replaces the blur with inverse ghosting caused by overshoot.

 
@kasakka That looks... pretty horrible. I guess I either need to wait a year or settle for going back to a dual-screen setup (instead of one large 43" 4K display).
 
Well the Acer 43" CG7 4K @ 144Hz HDR1000 monitor has come in stock on the Acer website (Australia).

It costs $2499 AU, that's exactly $900 AU more than the Asus XG438Q, if anybody wants to know.

I am tempted to pull the trigger, but I want to know:

1. Does it have DSC?
2. Does it do 4K @ 144Hz 4:4:4?
3. Does G-Sync work at 4K @ 144Hz?
4. How many local dimming zones does it have?
etc.
 
Well the Acer 43" CG7 4K @ 144Hz HDR1000 monitor has come in stock on the Acer website (Australia).

It costs $2499 AU, that's exactly $900 AU more than the Asus XG438Q, if anybody wants to know.

I am tempted to pull the trigger, but I want to know:

1. Does it have DSC?
2. Does it do 4K @ 144Hz 4:4:4?
3. Does G-Sync work at 4K @ 144Hz?
4. How many local dimming zones does it have?
etc.

That would make the estimated U.S. price ~$1,699 if I had to guess.

I'm interested too...
 
Well the Acer 43" CG7 4K @ 144Hz HDR1000 monitor has come in stock on the Acer website (Australia).

It costs $2499 AU, that's exactly $900 AU more than the Asus XG438Q, if anybody wants to know.

I am tempted to pull the trigger, but I want to know:

1. Does it have DSC?
2. Does it do 4K @ 144Hz 4:4:4?
3. Does G-Sync work at 4K @ 144Hz?
4. How many local dimming zones does it have?
etc.

Specs on Acer store website say that it's 4K @ 144 Hz via 2 DisplayPort connectors, 120 Hz when using just one. Using two will most likely also disable Freesync. The XG43UQ will be the better display if the panel won't have the issues of the XG438Q.
 
Specs on Acer store website say that it's 4K @ 144 Hz via 2 DisplayPort connectors, 120 Hz when using just one. Using two will most likely also disable Freesync. The XG43UQ will be the better display if the panel won't have the issues of the XG438Q.


Ah damn, that really sucks! So using 2 DP cables means that it doesn't have DSC, is that correct?

And does that mean DSC will allow 4K/144 4:4:4 via 1 DP cable?

Anybody know if using 2x DP cables for 4K/144 disables G-Sync? I might have to ask Acer.
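For some context on why DSC (or a second cable) matters here, a rough back-of-the-envelope estimate (active pixels only, ignoring blanking overhead) against DP 1.4's roughly 25.92 Gbit/s of effective payload (HBR3, 4 lanes, after 8b/10b encoding). Treat the numbers as ballpark, not exact timing math:

```python
# Ballpark uncompressed video bandwidth vs. a single DP 1.4 link.
# Active pixels only; real signal timings add blanking on top of this.

DP14_EFFECTIVE_GBPS = 25.92   # HBR3, 4 lanes, after 8b/10b line coding

def video_gbps(width, height, refresh_hz, bits_per_component, components=3):
    """Raw bandwidth in Gbit/s for full-chroma (4:4:4 / RGB) video."""
    return width * height * refresh_hz * bits_per_component * components / 1e9

for hz, bpc in [(120, 8), (120, 10), (144, 8), (144, 10)]:
    need = video_gbps(3840, 2160, hz, bpc)
    verdict = "fits one cable" if need <= DP14_EFFECTIVE_GBPS else "needs DSC or a 2nd link"
    print(f"4K {hz} Hz {bpc}-bit 4:4:4: ~{need:.1f} Gbit/s -> {verdict}")
```

Once blanking is added on top, even 4K 120Hz 8-bit is close to the limit, which lines up with the spec sheet saying 120Hz on a single cable and 144Hz needing two.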
 
Specs on Acer store website say that it's 4K @ 144 Hz via 2 DisplayPort connectors, 120 Hz when using just one. Using two will most likely also disable Freesync. The XG43UQ will be the better display if the panel won't have the issues of the XG438Q.


Moot point to me, because:

a.) I don't believe anyone who says they can tell the difference between 120hz and 144hz; and

b.) Show me the GPU which will hit 144hz on an even remotely modern game


:p
 
Moot point to me, because:

a.) I don't believe anyone who says they can tell the difference between 120hz and 144hz; and

b.) Show me the GPU which will hit 144hz on an even remotely modern game


:p


I mean, I can tell the difference between 90, 120, 144... so it's not that big of a leap to say that someone will notice. However, I will say that the major thing is having FPS over 100 with adaptive sync; as long as the minimum of the adaptive range is below 48Hz, you're looking at an LFC-capable screen, so ANY frame rate is smooth.
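To spell out why that ~48Hz lower bound matters (my understanding of the usual rule of thumb, not anything vendor-specific): LFC only kicks in if the panel's maximum refresh is at least roughly double its minimum VRR rate, so dropped frames can be repeated back into the supported range.

```python
# Rule-of-thumb check for LFC capability (assumes the common ~2x requirement).
def lfc_capable(vrr_min_hz: float, vrr_max_hz: float) -> bool:
    return vrr_max_hz >= 2 * vrr_min_hz

print(lfc_capable(48, 120))   # True:  e.g. 30 fps can be shown as 60 Hz
print(lfc_capable(60, 100))   # False: 59 fps can't be doubled into 60-100 Hz
```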
 
I mean, I can tell the difference between 90, 120, 144... so it's not that big of a leap to say that someone will notice.

In actual use? I think it's mostly placebo.

It is possible to see the difference in that UFO test image, but that is not a real use scenario.

I'd argue anything over 90fps is mostly a waste, and the difference going from 60 to 90 is pretty small.

However, I will say that the major thing is having FPS over 100 with adaptive sync; as long as the minimum of the adaptive range is below 48Hz, you're looking at an LFC-capable screen, so ANY frame rate is smooth.

To me LFC is a complete waste as well, for two reasons.

1.) I always configure my games such that minimum framerate never drops below 60fps.

and

2.) Sure, they may be able to artificially smooth out low framerates, and they may APPEAR smooth when you look at them as a spectator, but you are adding so much lag at that point (because you have to wait for the subsequent frame in order to be able to render the in-between frame, and then add compute time to create that extra frame after that). The lag is so horrid it doesn't really matter how smooth it looks.


I mean, I'm never going to see a framerate down low enough to warrant frame interpolation, but even if I did, I would make damned sure that feature was disabled and never turned on.
 
Spotted this video showing XG438Q motion blur. Looks pretty bad at anything except Level 5 OD, which just replaces the blur with inverse ghosting caused by overshoot.




I do agree that looks pretty bad, but why is this a feature you would ever want?

I don't want my monitor messing with my frames. I just want it to display the frames exactly as it is handed them. This is why adaptive refresh is so good, because it can actually display your real frames as fast as your GPU is able to deliver them. I want absolutely nothing beyond that.

This and LFC are just kind of silly gimmick features I'd never use, that make things worse rather than better.
 
Moot point to me, because:

a.) I don't believe anyone who says they can tell the difference between 120hz and 144hz; and

b.) Show me the GPU which will hit 144hz on an even remotely modern game


:p

a) I can tell a little bit of difference between 100 and 120, so I guess I'd also see a little between 120 and 144. Not much though.

b) Definitely important for fast-paced FPS games. I have tuned all my settings in PUBG @ 4K so that I see between 150-200fps when I uncap the fps. Obviously I manually cap it at 120 because that's the best my XG438Q can do, so I already know I would see 144 with my setup if I had that monitor.
 
a) I can tell a little bit of difference between 100 and 120, so I guess I'd also see a little between 120 and 144. Not much though.

b) Definitely important for fast-paced FPS games. I have tuned all my settings in PUBG @ 4K so that I see between 150-200fps when I uncap the fps. Obviously I manually cap it at 120 because that's the best my XG438Q can do, so I already know I would see 144 with my setup if I had that monitor.


Fair enough. I forgot people sabotage their visuals for high framerates.

For me the #1 goal with any game is to make it look as good as possible, provided I can get at least 60fps minimum. I don't play anything multiplayer anymore though. Just don't have time, but even back when I did, we used to bemoan those who ruined the game by turning down their settings for an advantage.
 
In actual use? I think it's mostly placebo.

It is possible to see the difference in that UFO test image, but that is not a real use scenario.

I'd argue anything over 90fps is mostly a waste, and the difference going from 60 to 90 is pretty small.



To me LFC is a complete waste as well, for two reasons.

1.) I always configure my games such that minimum framerate never drops below 60fps.

and

2.) Sure, they may be able to artificially smooth out low framerates, and they may APPEAR smooth when you look at them as a spectator, but you are adding so much lag at that point (because you have to wait for the subsequent frame in order to be able to render the in-between frame, and then add compute time to create that extra frame after that). The lag is so horrid it doesn't really matter how smooth it looks.


I mean, I'm never going to see a framerate down low enough to warrant frame interpolation, but even if I did, I would make damned sure that feature was disabled and never turned on.

I remember one time while I was attending the Epic booth at the 2015 GDC, they had the most recent build of UT running on a handful of PCs connected to 120Hz monitors on the then-new GTX 980 video cards. One 'booth' was running a bit slow; it wasn't smooth. It was noticeably choppier than the others. So I used the console command STAT FPS to show the in-game FPS display and it showed the game was running between 90-95 FPS. I Alt-Tabbed out of the game to find that there were actually two instances of the game running. I closed the extra one, went back to the original game and, lo and behold, it was back up to 120FPS.

No matter what anyone says, never buy into the myth that the human eye can't discern the difference between high frame rates.
 
Fair enough. I forgot people sabotage their visuals for high framerates.

For me the #1 goal with any game is to make it look as good as possible, provided I can get at least 60fps minimum. I don't play anything multiplayer anymore though. Just don't have time, but even back when I did, we used to bemoan those who ruined the game by turning down their settings for an advantage.

Highest visuals make sense for SP games, but for competitive MP games visuals do not matter as long as they are not potato :)
 
Highest visuals make sense for SP games, but for competitive MP games visuals do not matter as long as they are not potato :)


I still struggle with that. Even when I play competitive MP games (though it has been a while since my favorite MP game, Red Orchestra 2, more or less died), my approach is to live in the setting and take a team-based approach to defeating the opposing team. Living in the setting is every bit as important to me.

To me, when kiddies lower quality settings to get a better frame rate, it's just lame. Firstly, because I believe it's all in their heads and that a frame rate above 60-90 or so is of marginal benefit, but secondly because it completely misses the point. What's the point in dominating if you have to disable shadows and shorten render distances to get an unfair advantage in order to do so? Might as well not play at all.
 
I remember one time while I was attending the Epic booth at the 2015 GDC, they had the most recent build of UT running on a handful of PCs connected to 120Hz monitors on the then-new GTX 980 video cards. One 'booth' was running a bit slow; it wasn't smooth. It was noticeably choppier than the others. So I used the console command STAT FPS to show the in-game FPS display and it showed the game was running between 90-95 FPS. I Alt-Tabbed out of the game to find that there were actually two instances of the game running. I closed the extra one, went back to the original game and, lo and behold, it was back up to 120FPS.

No matter what anyone says, never buy into the myth that the human eye can't discern the difference between high frame rates.


To be fair though, this does not seem like a realistic apples-to-apples comparison. Stutter caused by, say, swapping due to running out of RAM because two instances of a game are running, or pinning the CPU for the same reason, would inherently result in highly erratic, stuttery gaming performance. Not at all like just having a lower, but still fairly even, frame rate.

I buy the whole "might as well max it if I can, just in case" argument, but claiming a clear benefit from framerates that high seems closer to audiophile golden-eared nonsense than anything else.
 
I still struggle with that. Even when I play competitive MP games (though it has been a while since my favorite MP game, Red Orchestra 2, more or less died), my approach is to live in the setting and take a team-based approach to defeating the opposing team. Living in the setting is every bit as important to me.

To me, when kiddies lower quality settings to get a better frame rate, it's just lame. Firstly, because I believe it's all in their heads and that a frame rate above 60-90 or so is of marginal benefit, but secondly because it completely misses the point. What's the point in dominating if you have to disable shadows and shorten render distances to get an unfair advantage in order to do so? Might as well not play at all.

It isn't all in my head; I personally do play better if I get higher fps. BUT I will say this doesn't apply to everyone: some of my buddies will perform no better playing at 240fps vs playing at 60fps, so they prefer to just turn up the graphics settings instead, since >60fps isn't benefiting them. And to show that it's not "all in my head", here are my stats from Apex Season 2 after playing 100 matches on a 1080p 240Hz screen. Never in my life have I performed this well in ANY fps game, from COD to BF to PUBG; I had never had a K/D above 3.0. After playing on a 240Hz screen and going try-hard for the first 100 games of Season 2, I can say with confidence that >60fps benefits me.
 

Attachment: Desktop Screenshot 2019.07.21 - 23.14.05.23.png (Apex Season 2 stats screenshot)
I do agree that looks pretty bad, but why is this a feature you would ever want?

I don't want my monitor messing with my frames. I just want it to display the frames exactly as it is handed them. This is why adaptive refresh is so good, because it can actually display your real frames as fast as your GPU is able to deliver them. I want absolutely nothing beyond that.

This and LFC are just kind of silly gimmick features I'd never use, that make things worse rather than better.

It's not a feature, it's a problem. The panel cannot keep up with transitions out of black, so you get that trailing shadow. If overdrive is adjusted so that this does not happen, you instead get inverse ghosting, which results in a bright trail. Both are bad; ideally you would have no black smearing and no overshoot. Most displays have an ideal overdrive setting that gives better response times with the least amount of overshoot.
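As a purely illustrative toy model (a first-order response sketch I made up, not real panel physics), you can see how too little drive leaves the pixel lagging behind the target (dark smear) while too much drive pushes it past the target for a frame or two (the bright inverse-ghosting trail):

```python
# Toy first-order model of a pixel transition from black toward a grey target.
# "overdrive" is the level the panel actually drives toward on the first frame.

def pixel_response(target, overdrive, start=0.0, tau_frames=1.5, frames=6):
    level, out = start, []
    alpha = 1.0 / tau_frames                      # fraction of the gap closed per frame
    for n in range(frames):
        drive = overdrive if n == 0 else target   # boost only on the first frame
        level += alpha * (drive - level)
        out.append(round(level, 2))
    return out

target = 0.6
print("no overdrive:  ", pixel_response(target, overdrive=0.6))  # slow rise = dark smear
print("mild overdrive:", pixel_response(target, overdrive=0.7))  # reaches target sooner
print("too much:      ", pixel_response(target, overdrive=1.0))  # overshoots = bright trail
```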

LFC is great for extending the working range of Freesync. You really need to consider that people have different needs and just because you don't use something does not mean it's not a valid feature.
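And roughly how LFC keeps things in range, as I understand it (a simplified sketch; the real driver logic paces the repeated frames more cleverly to avoid judder): when the game drops below the panel's minimum VRR rate, each frame is simply shown two or more times so the physical refresh stays inside the supported window.

```python
# Simplified LFC behaviour: repeat frames so the panel refresh stays in range.

def lfc_refresh(fps: float, vrr_min: float = 48.0, vrr_max: float = 120.0):
    """Return (times_each_frame_is_shown, resulting_panel_refresh_hz)."""
    if fps >= vrr_min:
        return 1, min(fps, vrr_max)       # inside the VRR range: no repetition
    multiplier = 2
    while fps * multiplier < vrr_min:     # keep doubling/tripling until in range
        multiplier += 1
    return multiplier, fps * multiplier

for fps in (30, 40, 45, 60, 100):
    shown, hz = lfc_refresh(fps)
    print(f"{fps:>3} fps -> each frame shown {shown}x, panel runs at {hz:.0f} Hz")
```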
 
Specs on Acer store website say that it's 4K @ 144 Hz via 2 DisplayPort connectors, 120 Hz when using just one. Using two will most likely also disable Freesync. The XG43UQ will be the better display if the panel won't have the issues of the XG438Q.
It looks pretty interesting, and more promising than I previously expected. When are they bringing this to the US? It’s infuriating that it’s only on the Singapore Acer store.
 
It's not a feature, it's a problem. The panel cannot keep up with transitions out of black, so you get that trailing shadow. If overdrive is adjusted so that this does not happen, you instead get inverse ghosting, which results in a bright trail. Both are bad; ideally you would have no black smearing and no overshoot. Most displays have an ideal overdrive setting that gives better response times with the least amount of overshoot.

Thank you for explaining that, I appreciate it.


LFC is great for extending the working range of Freesync. You really need to consider that people have different needs and just because you don't use something does not mean it's not a valid feature.

I can't even imagine an application in which gaining smoothness at the expense of hugely increased input lag would be a positive.

Maybe statically rendered demos that don't require user input?

In every other circumstance I can imagine, this would be not just a small negative but a massive one.
 
It looks pretty interesting, and more promising than I previously expected. When are they bringing this to the US? It’s infuriating that it’s only on the Singapore Acer store.

Also in stock in Australia.

I am tempted, but I want to know if using 2x DP cables for 144hz @ 4K will disable G-Sync.
There's no point in me getting this monitor for 120hz 4K G-Sync because I already have the XG438Q, which I have been happy with. I am only going to spend the extra $900AU if I get 144hz + G-Sync + HDR1000, otherwise I won't upgrade.
 
Also in stock in Australia.

I am tempted, but I want to know if using 2x DP cables for 144hz @ 4K will disable G-Sync.
There's no point in me getting this monitor for 120hz 4K G-Sync because I already have the XG438Q, which I have been happy with. I am only going to spend the extra $900AU if I get 144hz + G-Sync + HDR1000, otherwise I won't upgrade.
Nice to know that they're present in Australia too; that's probably a good sign that a North American release is imminent. I'm still running a 27" 1440p monitor at 60Hz, so I'm not really as concerned about G-Sync being possible on 2 cables. As long as I can get 144Hz at 4K all the time with 10-bit color and HDR (which should at least be passable on a display this bright) with 2 cables, I'll be happy. I really do hope this thing doesn't have the BGR subpixel array though. That'd drive me crazy, especially on macOS, where there's no ClearType equivalent to combat it.
 
Fair enough. I forgot people sabotage their visuals for high framerates.

For me the #1 goal with any game is to make it look as good as possible, provided I can get at least 60fps minimum. I don't play anything multiplayer anymore though. Just don't have time, but even back when I did, we used to bemoan those who ruined the game by turning down their settings for an advantage.
I share this sentiment, but the sheer number of people who say that high refresh really does change the experience has made me want to try it out.
 
I share this sentiment, but the sheer number of people who say that high refresh really does change the experience has made me want to try it out.

It'll change the experience, but really it's a combination of high refresh, high framerates (low frametimes!), and VRR.

High motion clarity and low input lag, meaning responsive, crisp inputs.
 
If you're interested in an excellent 4K TV for monitor use, the Samsung Q60 43" is down to $498 on Amazon and Crutchfield.
 
Nice to know that they're present in Australia too; that's probably a good sign that a North American release is imminent. I'm still running a 27" 1440p monitor at 60Hz, so I'm not really as concerned about G-Sync being possible on 2 cables. As long as I can get 144Hz at 4K all the time with 10-bit color and HDR (which should at least be passable on a display this bright) with 2 cables, I'll be happy. I really do hope this thing doesn't have the BGR subpixel array though. That'd drive me crazy, especially on macOS, where there's no ClearType equivalent to combat it.

I am really sorry to bust your chops, but a member on another forum down here has confirmed that this monitor does have a BGR pixel layout; he was told by an Acer product manager.
 
I am really sorry to bust your chops, but a member on another forum down here has confirmed that this monitor does have a BGR pixel layout; he was told by an Acer product manager.
WHY?!?! I feel like I'm going to CRY. They just keep dangling hope in front of us and then pulling it away! It's unbearable!
 
Is BGR really that bad lol?

I have been using my XG438Q for over 6 weeks now, mainly for gaming and a little bit of video editing, and I just can't tell the difference lol.

I am so close to pulling the trigger on this Acer, or should I wait for the XG43UQ?
 
OK, so I got some answers to a few of the questions I asked Acer support.

1. This monitor does not have DSC.

2. I asked them:

'Will I need to use 2x DP 1.4 cables to achieve 4K @ 144Hz, and if so, will g-sync still be active on this monitor when using 4K @ 144Hz.'

Their response was:
'No support when 2DP mode'

Not entirely sure how to interpret that response, but it sounds like it does not support G-Sync when using 4K @ 144Hz?

3. There are 16 local dimming zones.

Sounds like I may have to wait for the ASUS XG43UQ.
 
OK, so I got some answers to a few of the questions I asked Acer support.

1. This monitor does not have DSC.

2. I asked them:

'Will I need to use 2x DP 1.4 cables to achieve 4K @ 144Hz, and if so, will g-sync still be active on this monitor when using 4K @ 144Hz.'

Their response was:
'No support when 2DP mode'

Not entirely sure how to interpret that response, but it sounds like it does not support G-Sync when using 4K @ 144Hz?

3. There are 16 local dimming zones.

Sounds like I may have to wait for the ASUS XG43UQ.
Based on this response, 6-bit 144Hz with G-Sync may still be possible with one cable. Only 16 zones isn't great, but it should still be alright outside of desktop usage.
 