The perfect 4K 43” monitor! Soon! Asus XG438Q ROG

PRO/CON...Resolution: The resolution gives a much better pixel density but drives your frame rate lower, which either gets nothing out of the high hz capability or requires you to turn your graphics settings down considerably on the more demanding games to get toward a 100fps average or more for 120hz+ benefits.

Well, seeing how you value 100fps so much, this monitor is completely pointless to buy. Not even a 2080 Ti will get you to that frame rate in many games, even WITH reduced settings. Only a select few like Wolfenstein will actually run at 100+fps at 4k. A 2080 Ti is really only good for targeting 60Hz in the vast majority of AAA titles; good luck with a 1080 Ti.
 
Eh?

1. You're restricting your focus to only the most recent AAA games - this may be true for you, but it's not likely to be true for most PC gamers
2. You're undervaluing the capability of having 4k120 at a decent PPI everywhere. This includes older games as well as general productivity tasks- even the experience of navigating around the OS is improved
3. You're assuming that a GPU upgrade is unlikely in the near future for potential / actual purchasers - GPUs are the most common part to upgrade, and given that the 2080 Ti doesn't make sense for many 1080 Ti owners, it's reasonable to pick up a new display while also waiting for the next round of GPU releases
 
1. Good point. I agree
2. Not talking about productivity here, just gaming which is why I only quoted that part of his post.
3. If you're gonna wait for a better gpu to handle 4k120hz, a better display will be out by then. The "UQ" version of this monitor? The HDMI 2.1 2020 Samsung TVs that everyone here loves so much? LG C10 OLED?
 
2. Not talking about productivity here, just gaming which is why I only quoted that part of his post.

I do understand that it's not relevant to gaming; however, I know many including myself find the overall desktop experience with higher refresh rate monitors to be quite desirable. The merits of the display on the whole make this a point of consideration.

3. If you're gonna wait for a better gpu to handle 4k120hz, a better display will be out by then. The "UQ" version of this monitor? The HDMI 2.1 2020 Samsung TVs that everyone here loves so much? LG C10 OLED?

From my perspective, GPUs will never be 'fast enough'. They never really have been, save for a few moments where games and / or displays were catching up. I'm used to sacrificing something in terms of visuals in order to get the desired experience, and I value fluidity over absolute image quality. This monitor gets us to having both fluidity and great gaming support.

Now, on balance, I wouldn't recommend picking one of these up unless it absolutely hits one's wants and needs. For me, the black smearing is likely to approach being unbearable, the text issue is the same I see on one of my current displays, and the picture quality is even worse. I'll be waiting with you for better options along these lines.

But for someone that isn't bothered by those things and can take advantage of the screen's feature set? Why not.
 
This XG438Q has been out in the UK and was in stock from a few vendors at one point in the USA - Newegg's online warehouse store and, I think, Micro Center brick and mortar. Newegg has it listed as out of stock at $1099.99.
----------------------------------------------------------

I've been actively participating in this thread and I've read a few good site reviews of this monitor and have come to some conclusions....

I wanted this monitor to be a full upgrade to my 32GK850G high hz 1440p VA in every facet.

PRO:.. Size upgrade without being way too large: great size (at ~ 3' viewing distance or more), which also matches my other 43" 4k 60hz monitors. Once you get a 32" 16:9 or a 40" to 43" 4k monitor, the ~13" tall 27" 16:9s and ultrawides look like a narrow box window or a belt. As a single monitor up close they are ok, and so is a laptop. Depends on your taste and setup I guess. Screen dimensions image including heights.

PRO/CON:... PPI: It has a much better PPI than the poor ppi of the GK850g but requires you to flip the monitor upside down to get RGB for the best text clarity.

PRO...Massive desktop/app space with the physical screen size combined with the ppi. It has so much more desktop and app real-estate than other monitors, even ultrawides. You can also run ultrawide resolutions on a 4k monitor in most games, which will give you the benefit of higher fps than full 4k on more demanding games, and at a larger uw physical viewport size. Desktop/app space comparison picture.

PRO/CON...Resolution: The resolution gives a much better pixel density but drives your frame rate lower, which either gets nothing out of the high hz capability or requires you to turn your graphics settings down considerably on the more demanding games to get toward a 100fps average or more for 120hz+ benefits.

PRO/Equivalent... High Hz: both screens have high hz capability.

PRO/CON: Freesync .. Good to have Nvidia-compatible VRR, but G-sync still has some benefit over Freesync. The GK850G uses a g-sync module.

PRO/CON: Some quasi-HDR capability using Hybrid Log Gamma style HDR that works with SDR and "SDR+" screens labeled HDR (HDR400, HDR600 screens that can't do PQ HDR with an HDR1000 white point using absolute values). The edge lighting isn't going to do real HDR side-by-side highlights by a long shot; however, the GK850G has no color brightness ability past SDR range at all.

PRO: Native Contrast: The native contrast of the XG438Q is around 900 to 1000 higher than the GK850G's, at almost 4000:1, with the accompanying deeper black depths even before adding local dimming to the equation.

PRO.. Local Dimming: Not FALD unfortunately but adds significant black depth and contrast in large areas of scenes. Variable and scene dependent.

CON ..Overdrive + Response time: Unfortunately its overdrive + response time combination is slightly slower, which means it can't eliminate as much of the black smearing on the worst transitions as the bar the GK850G has set.
*This really should only be noticeable in games and at settings that get 100fps or more on a high hz monitor, because that's when you start reducing sample-and-hold blur to a softer blur rather than a smearing blur. 100fps ranges on a 4k monitor won't always be the case. The desktop will always be max fps vs hz though, so there is that as well.

CON... Uniformity: The uniformity on the XG438Q is poorer, with much darker corners than the already poorer-than-my-4k-TVs GK850G, which has "ok" uniformity. My TCL S405 and Samsung NU6900 are great by comparison.

CON.. BGR pixel layout: This can necessitate tweaking ClearType more aggressively and adjusting your display settings and viewing distance to compensate if it bothers you. Or, if you really want RGB, it could require you to buy a VESA mount of some kind and flip the monitor. The OSD of the monitor and any device BIOS screens will not be flipped, but Windows display settings easily flips and remembers your settings with no problem.

CON... DisplayPort 1.4 with no HDMI 2.1: We are still trying to stuff a 4k signal down a too-narrow DP 1.4 pipe, and this has no future proofing since it has no HDMI 2.1 (rough bandwidth numbers in the sketch below, after this list).

CON... PRICE: Considering all of the caveats I don't think this monitor is worth $1100 + tax (~$1200 here). While it looks like a decent gaming monitor overall, much like my gk850g it has some weaknesses in some facets that result in tradeoffs. Had the XG438Q matched and/or surpassed my gk850g in every facet I'd be much less reluctant to buy one. As it is it would still be a nice upgrade from my 32" LG in some respects but not at that price tag all things considered.
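To put rough numbers on the DP 1.4 point above - a back-of-the-envelope sketch, assuming ~8% blanking overhead from reduced-blanking timings and the commonly quoted link payload figures, not values read out of this monitor's EDID:

```python
# Rough uncompressed-bandwidth check for 4K @ 120 Hz over DP 1.4 vs HDMI 2.1.
# Assumptions: ~8% blanking overhead (reduced-blanking timings), full RGB chroma.

def video_gbps(width, height, refresh_hz, bits_per_channel, blanking_overhead=0.08):
    pixels_per_second = width * height * refresh_hz * (1 + blanking_overhead)
    bits_per_pixel = bits_per_channel * 3  # R, G and B channels
    return pixels_per_second * bits_per_pixel / 1e9

# Approximate usable payload after line coding (8b/10b for DP, 16b/18b for HDMI FRL).
DP14_PAYLOAD_GBPS = 25.92    # 4 lanes x 8.1 Gbps HBR3 x 0.8
HDMI21_PAYLOAD_GBPS = 42.67  # 4 lanes x 12 Gbps FRL x (16/18)

for bpc in (8, 10):
    need = video_gbps(3840, 2160, 120, bpc)
    print(f"4K120 {bpc}-bit RGB needs ~{need:.1f} Gbps "
          f"(DP 1.4: {DP14_PAYLOAD_GBPS} Gbps, HDMI 2.1: {HDMI21_PAYLOAD_GBPS} Gbps)")
```

8-bit RGB lands roughly at the DP 1.4 ceiling and 10-bit goes over it, which is broadly why 4k 120hz HDR ends up leaning on chroma subsampling or DSC until HDMI 2.1 ports show up.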
You forgot that you can't disable the wide gamut mode, which makes this monitor basically unusable for any desktop operation outside of HDR content, including non-HDR sRGB games.
A PC monitor without a proper sRGB mode is so stupid. Like an iPhone without a touch screen.
 
Can you elaborate on this issue, why you believe that it affects this monitor, and why you believe that it makes the monitor 'unusable'?
 
Well, seeing how you value 100fps so much, this monitor is completely pointless to buy. Not even a 2080 Ti will get you to that frame rate in many games, even WITH reduced settings. Only a select few like Wolfenstein will actually run at 100+fps at 4k. A 2080 Ti is really only good for targeting 60Hz in the vast majority of AAA titles; good luck with a 1080 Ti.


It's not just that I value 100fps average (or better) so much.. it's that you aren't getting real gains out of a high Hz monitor at anything lower so you might as well not have (and pay for) high hz if you aren't feeding it into the ranges where it reduces blur appreciably and adds motion definition.

A 100fps average is something like a 70 - 100 - 130 fps graph, so you're well below 100 for part of the graph even then.
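To make that concrete, here's a quick sketch of sample-and-hold blur width - the 960 px/s panning speed is just an arbitrary example, not a measurement:

```python
# Sample-and-hold motion blur: while the eye tracks a moving object, each frame
# is held static for a full frame time, smearing the object across the retina.
# Blur width (px) ~= tracking speed (px/s) * frame persistence (s) = speed / fps.

def blur_width_px(panning_speed_px_s, fps):
    return panning_speed_px_s / fps

speed = 960  # example: an object crossing a 4K-wide screen in 4 seconds
for fps in (60, 80, 100, 120):
    print(f"{fps:>3} fps -> ~{blur_width_px(speed, fps):.1f} px of smear")
# 60 fps -> 16 px, 100 fps -> 9.6 px, 120 fps -> 8 px: the step from 60 up to
# the 100+ range is where the blur reduction starts to become obvious.
```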

-----------------------------------------------

I'm a patient gamer and I have a big steam library with a lot of different game genres, engines, and formats. There are a lot of games from the last several years that will play at fairly high frame rates, some dialed in (details down... fps, motion clarity and motion definition up) and others just simpler games graphically. Some games I still enjoy support sli too, believe it or not, but even then 4k is demanding. I mentioned that a lot of (most?) games support playing in a 21:9 or 21:10 resolution with bars. On a 43" monitor that results in a bigger (and taller) viewport than any physical ultrawide and can grant somewhat higher frame rates, while still having the full 4k screen size for easier-to-render/simpler games, games hard coded to a 60hz limit, as well as desktop/app use. The biggest uw overall, including height, is a 37.5" diagonal, which is around 34.6" wide and 14.4" tall. A 43" 16:9 is about 37.5" wide and at a 3840x1600 resolution would be 15.6" tall.

Borderlands 3 can apparently work in sli btw, but with some bells and whistles (high reflections, fog) turned off. I don't own BL3 but it works with BL2. Apparently some people are getting 100 fps average in uw resolutions on BL3 using 1080ti sli, but they turned g-sync off in BL3 since it supposedly drops performance by 20%, a hit on both cards. That's a huge tradeoff losing g-sync.. idk if it is the same loss for a freesync monitor without the g-sync module. I've posed a question about what settings would be used to get 100fps with 1080ti sli with g-sync enabled and I'll report back if I hear anything. I'm sure high+ to very high+ settings would work to do it, I'm just not sure exactly where between the two with this game.

I guess besides dialing settings in (details/fx down... fps, motion definition and motion clarity UP) and using a UW rez on more demanding games, there are also some games that use dynamic resolution in-game, something like what consoles do - if you are into that kind of thing - for a quasi-4k resolution that dynamically down-rezzes during the most demanding action.
 
Someone was selling a Wasabi Mango UHD430 for $700 here; missed out on that. Since that retails for, I believe, $1200, this being lower might be a better deal. This monitor has better specs too.
 
I assume that most people who purchase this display have a high end card like a 2080/2080Ti. Why wouldn't they wait for the XG43UQ at this point? For a few hundred dollars more you would be getting a more future proof display?

As I said I'll be waiting for this model, but I don't expect to be buying one unless, by some miracle, it's a totally different beast than the 43Q.
 
I have a 1080ti, going to build a rig around that. Sell my 2600k rig, keep case and psu tho
 
I assume that most people who purchase this display have a high end card like a 2080/2080Ti. Why wouldn't they wait for the XG43UQ at this point? For a few hundred dollars more you would be getting a more future proof display?

As I said I'll be waiting for this model, but I don't expect to be buying one unless, by some miracle, it's a totally different beast than the 43Q.

I'd say people are waiting for the UQ as well as the Acer CG437KP at this point. Those are probably the only larger 4K / 120+ Hz displays coming out this year.

I already went a different direction with the Samsung CRG9 super ultrawide. It's a little bit less GPU intensive than 4K so my 2080 Ti handles everything but games with raytracing quite well on it. I have to play Control at 3840x1080 to get good performance at max raytracing settings but I still prefer that to using a narrower ultrawide resolution. While I understand that in my use I often don't get full advantage of my 120 Hz screen, to me it's all about having headroom so that I am not limited to just 60 Hz. I'm so sold on ultrawide now that I wish consoles had the option and we had ultrawide TVs too. It's just a better format for games, movies as well as desktop use.
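For anyone wondering how much "a little bit less GPU intensive" is in raw pixel terms, a quick comparison of the resolutions mentioned (render cost doesn't scale perfectly linearly with pixel count, so treat it as a rough guide):

```python
# Relative pixel counts of the resolutions discussed above, versus full 4K.
resolutions = {
    "4K (3840x2160)":        3840 * 2160,
    "CRG9 (5120x1440)":      5120 * 1440,
    "Control (3840x1080)":   3840 * 1080,
    "UW on 4K (3840x1600)":  3840 * 1600,
}
four_k = resolutions["4K (3840x2160)"]
for name, px in resolutions.items():
    print(f"{name}: {px / 1e6:.2f} MP ({px / four_k:.0%} of 4K)")
# 5120x1440 is ~89% of 4K's pixels, 3840x1080 is ~50%, 3840x1600 is ~74%.
```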
 
I expect the UQ and CG437K to be the same really... unless they decide to flip the panel. I still can't get around the thought that Asus must have done this for a reason. One can hope for better, but I won't hold my breath.

You must have the flickering issue with the CRG9 and Nvidia GPU though, with Ultimate Engine enabled?
 
I thought this too but if you switch to OD5 prior to playing an HDR game, overshoot everywhere.

The reason you see so much more black smear using OD4 and HDR is because the brightness makes it much more pronounced/obvious. Play the same game in SDR and manually crank the brightness (450nits) and you'll see it looks identical.

I am sure it's somehow connected to HDR. I am not testing it in games (I don't see a big smearing problem in games), I am testing it on black text scrolling on a white background. In HDR the desktop is quite dark and lifeless, and yet smearing is visible. When I disable HDR I don't see this problem even at 90% brightness and 70% contrast.
 
You forgot that you can't disable the wide gamut mode, which makes this monitor basically unusable for any desktop operation outside of HDR content, including non-HDR sRGB games.
A PC monitor without a proper sRGB mode is so stupid. Like an iPhone without a touch screen.

SDR games and the desktop look great on this monitor, much better than on my previous XB270HU. Colours, contrast and brightness are so much better. Even photos look great (and I know what I'm talking about - I work only with RAW files and DxO PhotoLab) - from a distance, because of the clear contrast shifting, which in my opinion is the only real problem with the 438Q.
Oversaturated colours - of course, when saturation is set at 100% in the monitor menu :) Set it at 40-60% and everything is ok.
It's not a pro monitor after all.
 
I'm not totally against buying this display if they bring the price down a few hundred dollars. Possibly after the UQ arrives.
 
Yeah, this monitor seems OK but kind of parallel to my gk850g - better in some specs and actually worse in others. It's not worth it to me at the $1200 out of pocket I'd have to spend on it at this price, so I'll wait to see what the other two 43 inch screens look like in reviews and otherwise wait on a price drop. The gk850g dropped in price on sales fairly early on and then dropped considerably overall.

I'd say people are waiting for the UQ as well as the Acer CG437KP at this point. Those are probably the only larger 4K / 120+ Hz displays coming out this year.

I already went a different direction with the Samsung CRG9 super ultrawide. It's a little bit less GPU intensive than 4K so my 2080 Ti handles everything but games with raytracing quite well on it. I have to play Control at 3840x1080 to get good performance at max raytracing settings but I still prefer that to using a narrower ultrawide resolution. While I understand that in my use I often don't get full advantage of my 120 Hz screen, to me it's all about having headroom so that I am not limited to just 60 Hz. I'm so sold on ultrawide now that I wish consoles had the option and we had ultrawide TVs too. It's just a better format for games, movies as well as desktop use.

I'd rather a wall of high ppi monitor and set my own resolutions and aspect ratios in windows on it if I had a choice. Any large high ppi 16:9 can run most games in 21:9 or 21:10 ratios as far as I'm aware, and on a physically larger 21:9 or 21:10 viewport while still getting way more desktop and app space, media playback space outside of the games as well as gaming in full screen 16:9 larger when desired. I mean theoretically it could be a game of leapfrog but in reality the 21:x aspect monitors are always going to have less. Even though wider aspect ratios (like lenses) with virtual cinematography and HOR+ in games is always more game world/scene shown, the larger 16:9's will always be bigger and have more pixels on them so can do both. Ultrawide gaming displays were more interesting when they were larger than a 27" 16:9 13" tall gaming display, adding some to the sides. That isn't the case anymore with 43" 4k 120hz (or in the future a 55" 4k 120hz vrr off a hdmi 2.1 nvidia gpu if necessary to go that big to tick all the boxes), but I can understand the UW from personal preference point of view especially for viewing nearer like a 27" 16:9 or if a very large 3840x1600.


[Size comparison image - 4k in the image is equivalent to ~41" diagonal]
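The physical sizes being compared in this thread fall straight out of diagonal + aspect ratio; a small sketch that reproduces the figures quoted above (nominal panel diagonals, so real-world bezels will differ slightly):

```python
# Physical width/height of a panel from its nominal diagonal and aspect ratio.
import math

def panel_dimensions(diagonal_in, aspect_w, aspect_h):
    d = math.hypot(aspect_w, aspect_h)
    return diagonal_in * aspect_w / d, diagonal_in * aspect_h / d

w43, h43 = panel_dimensions(43.0, 16, 9)     # 43" 16:9 (4k)
w27, h27 = panel_dimensions(27.0, 16, 9)     # 27" 16:9 for reference
wuw, huw = panel_dimensions(37.5, 24, 10)    # 37.5" 3840x1600 ultrawide

print(f'43" 16:9: {w43:.1f}" wide x {h43:.1f}" tall')
print(f'3840x1600 letterboxed on the 43": {w43:.1f}" x {w43 * 1600 / 3840:.1f}"')
print(f'37.5" ultrawide: {wuw:.1f}" x {huw:.1f}"')
print(f'27" 16:9: {w27:.1f}" x {h27:.1f}"')
# ~37.5" x 21.1", ~37.5" x 15.6", ~34.6" x 14.4", ~23.5" x 13.2" - matching the
# figures quoted earlier in the thread.
```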
 
SDR games and the desktop look great on this monitor, much better than on my previous XB270HU. Colours, contrast and brightness are so much better. Even photos look great (and I know what I'm talking about - I work only with RAW files and DxO PhotoLab) - from a distance, because of the clear contrast shifting.
You may like it, but your desktop looks oversaturated, as does any SDR game you play, reproducing the wrong colors for the sRGB color gamut, within which the Windows OS and all non-HDR games are built, as well as anything you see on the Internet. DxO can control the color gamut and output the correct sRGB gamut, covering exactly 100% of it, as you don't want people to see your wide gamut images on their standard gamut display devices (like, all of them, except your gaming monitor).
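To illustrate what "reproducing the wrong colors" looks like in practice: an unmanaged wide-gamut panel interprets sRGB code values as coordinates in its own, wider gamut. A sketch of the effect, assuming roughly Display P3 primaries for the panel (the XG438Q's native gamut isn't exactly P3, so this is illustrative only):

```python
# What happens when sRGB content is shown on a wide-gamut panel with no sRGB clamp:
# the panel treats sRGB code values as if they were coordinates in its own (wider)
# gamut, which over-saturates everything. Matrices are the standard, rounded
# sRGB and Display-P3 RGB-to-XYZ (D65) conversions.
import numpy as np

SRGB_TO_XYZ = np.array([[0.4124, 0.3576, 0.1805],
                        [0.2126, 0.7152, 0.0722],
                        [0.0193, 0.1192, 0.9505]])
P3_TO_XYZ   = np.array([[0.4866, 0.2657, 0.1982],
                        [0.2290, 0.6917, 0.0793],
                        [0.0000, 0.0451, 1.0439]])

def srgb_shown_on_p3(rgb_linear):
    """Linear sRGB triplet -> the sRGB triplet that matches what an unmanaged
    P3-primaries panel actually displays for those code values."""
    xyz_displayed = P3_TO_XYZ @ rgb_linear           # panel decodes values with P3 primaries
    return np.linalg.inv(SRGB_TO_XYZ) @ xyz_displayed

skin_tone = np.array([0.60, 0.35, 0.25])             # rough linear sRGB skin tone
print(np.round(srgb_shown_on_p3(skin_tone), 3))
# Red climbs while green/blue drop relative to the intended value - the redder,
# over-saturated skin tones described above.
```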
 
Well I don't have this monitor so I can't say how the colors look to the eye, but on my 32" gk850g I had to bump the color vibrance up a bit (2%) on the desktop and in some games with Freestyle... which makes it no longer color accurate but very pleasing for gaming, and even on the desktop and in image galleries etc, rather than having muted color vibrance out of the box. It did take a lot of tweaking of the RGB, contrast and brightness in the OSD on that monitor to get it to where I wanted though, especially whites.
 
That's great and all that you hate wide gamut displays but I just can't go back to using a sRGB confined monitor after having all these WCG ones. I prefer the saturation.

My youtube logo being too red and my desktop "workflow" of Microsoft Word's blue theme being too blue are absolute tragedies I know.

I was under the impression that this only mattered if you were doing color-sensitive work (such as photo editing or creation work) where it was important to ensure strict color accuracy. I too prefer the more vivid, saturated look. It might not be as "accurate", but it sure makes games pop and I find it visually appealing elsewhere, too. Most comments I've seen regarding this say that if you're not doing color-sensitive work, it largely comes down to individual preference. I've certainly never felt that it impeded anything that I was doing.
 
I expect the UQ and CG437K to be the same really... unless they decide to flip the panel. I still can't get around the thought that Asus must have done this for a reason. One can hope for better, but I won't hold my breath.

You must have the flickering issue with the CRG9 and Nvidia GPU though, with Ultimate Engine enabled?

Haven't experienced any flickering issues. Googled this and seems people have it mostly on windowed/borderless mode and I try to avoid those if I can. I'll try it again this evening to see if I can notice it.
 
I'd rather a wall of high ppi monitor and set my own resolutions and aspect ratios in windows on it if I had a choice. Any large high ppi 16:9 can run most games in 21:9 or 21:10 ratios as far as I'm aware, and on a physically larger 21:9 or 21:10 viewport while still getting way more desktop and app space, media playback space outside of the games as well as gaming in full screen 16:9 larger when desired. I mean theoretically it could be a game of leapfrog but in reality the 21:x aspect monitors are always going to have less. Even though wider aspect ratios (like lenses) with virtual cinematography and HOR+ in games is always more game world/scene shown, the larger 16:9's will always be bigger and have more pixels on them so can do both. Ultrawide gaming displays were more interesting when they were larger than a 27" 16:9 13" tall gaming display, adding some to the sides. That isn't the case anymore with 43" 4k 120hz (or in the future a 55" 4k 120hz vrr off a hdmi 2.1 nvidia gpu if necessary to go that big to tick all the boxes), but I can understand the UW from personal preference point of view especially for viewing nearer like a 27" 16:9 or if a very large 3840x1600.

Yeah, big enough 4K 16:9 screens will do regular ultrawide just fine, but 32:9 is a different beast. The 49" Samsung CRG9 is as wide as a 55" TV but with higher horizontal resolution. A 55" 4K TV is essentially two 3840x1080 displays stacked on top of each other. You will most likely need to mess with a FOV slider if you want to make better use of the TV's space by having a higher vertical resolution than 1080p.

At this point I would probably be pretty happy with 5120x2160 as a resolution at something like 43-46" size as it would combine big size with high PPI while still not being so big that I have to look up and down a lot or have a very deep desk. I'll probably have to wait years for that!

43" 16:9 is the absolute maximum I'm willing to put up with on my desk but for desktop use the super ultrawide works better with a good PbP mode and more horizontal resolution which makes for more convenient window placement and sizing. Vertical space is useful to a point until it gets so tall that you have trouble seeing the corners if you are sitting closer to the screen. I could use a little bit more on my CRG9, 5120x1600 for example would be pretty great with a little bit more room on top and bottom. I don't think I could handle say a 48" 16:9 unless I can lower it down so that the bottom edge almost touches my desk.

I did a quick Photoshop comparing a few display sizes to the 49" super ultrawide. As Displaywars.com doesn't center them I Photoshopped them like this to make better sense of the size differences. The 49" 24:10 doesn't exist but would be a pretty good format at say 5120x2140.

 
Haven't experienced any flickering issues. Googled this and seems people have it mostly on windowed/borderless mode and I try to avoid those if I can. I'll try it again this evening to see if I can notice it.

I'm curious if you've tested this? With Ultimate Engine on, are you able to see flickering, but then eliminate it by turning off windowed/borderless mode? I noticed it myself in several games, including Ghost Recon Wildlands and Assassin's Creed Odyssey... I don't believe they were running in windowed mode, but within NVCP I did have G-Sync enabled for both.
 
Ok I am a fool. I had updated my drivers but forgot to check the extra "Enable settings for this panel" checkbox found in the NVCP G-Sync tab. Tried the Nvidia Pendulum demo and got flickering with Ultimate Engine in both fullscreen and windowed. How disappointing! Do you know if this is a known bug with Freesync, Nvidia drivers or this particular display? Might be a big enough problem to make me return it as I expect to occasionally hit those lows on my 2080 Ti if I run at high graphics settings.

EDIT: I tried playing Control which I know tanks my performance if I run full res + RT and I did not notice any flicker issues. I have to investigate more.

EDIT 2: Tried more games, GTA V and SotTR. Despite my best efforts to drop framerates to levels where it would switch between the two I can't get flicker to happen. I think it's just the Nvidia Pendulum demo where the issue lies because limiting that to 100-120 fps still got me flicker. As long as I don't get any issues in real games I'm good.
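For reference, the low-frame-rate flicker people report on FreeSync panels tends to show up around the point where Low Framerate Compensation starts multiplying refreshes. A hedged sketch of the idea - the 48-120 Hz window here is an assumed, typical range for this class of panel, not something pulled from this display's spec sheet:

```python
# Low Framerate Compensation (LFC) in a nutshell: when the game's frame rate
# falls below the panel's minimum VRR refresh, the driver repeats each frame
# an integer number of times so the panel keeps running inside its VRR window.

VRR_MIN_HZ, VRR_MAX_HZ = 48, 120   # assumed window, for illustration only

def lfc_refresh(fps):
    multiplier = 1
    while fps * multiplier < VRR_MIN_HZ:
        multiplier += 1
    return min(fps * multiplier, VRR_MAX_HZ), multiplier

for fps in (100, 60, 45, 30, 24):
    hz, m = lfc_refresh(fps)
    print(f"{fps} fps -> panel driven at {hz} Hz (each frame shown {m}x)")
# A VA panel's brightness/gamma shifts slightly with refresh rate, so rapid
# swings across this multiplier boundary are one commonly reported flicker source.
```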
 
Yeah, big enough 4K 16:9 screens will do regular ultrawide just fine, but 32:9 is a different beast. The 49" Samsung CRG9 is as wide as a 55" TV but with higher horizontal resolution. A 55" 4K TV is essentially two 3840x1080 displays stacked on top of each other. You will most likely need to mess with a FOV slider if you want to make better use of the TV's space by having a higher vertical resolution than 1080p.

At this point I would probably be pretty happy with 5120x2160 as a resolution at something like 43-46" size as it would combine big size with high PPI while still not being so big that I have to look up and down a lot or have a very deep desk. I'll probably have to wait years for that!

43" 16:9 is the absolute maximum I'm willing to put up with on my desk but for desktop use the super ultrawide works better with a good PbP mode and more horizontal resolution which makes for more convenient window placement and sizing. Vertical space is useful to a point until it gets so tall that you have trouble seeing the corners if you are sitting closer to the screen. I could use a little bit more on my CRG9, 5120x1600 for example would be pretty great with a little bit more room on top and bottom. I don't think I could handle say a 48" 16:9 unless I can lower it down so that the bottom edge almost touches my desk.

I did a quick Photoshop comparing a few display sizes to the 49" super ultrawide. As Displaywars.com doesn't center them I Photoshopped them like this to make better sense of the size differences. The 49" 24:10 doesn't exist but would be a pretty good format at say 5120x2140.


If you did the same thing with a 21:10 3840x1600 rez you'd see a much better letterboxed viewport imo. If an 8k or 16k horizontal uw existed at the height of my 43 inch and allowed 120hz in a windowed rez somehow I'd consider it, but it would be insanely, prohibitively priced and even hdmi 2.1 wouldn't work like that for gaming.

So the only realistic option outside of a larger, lower-ppi uw would be adding screen size to the sides of a 40 to 43 inch 4k, just like ultrawide basically added some to the sides of a 27 inch 16:9 display initially. If I could get a "fatter 43 inch 4k" like that which could still do 120hz I'd consider that too, but then the rez would be even more demanding than 4k, so worse off overall. By the time that kind of size and resolution screen at high hz, with a powerful enough gpu and/or futuristic interpolation, were out - an equivalent or larger perceived viewport and resolution at high hz will probably exist in vr headsets anyway. One other possible option would be the modular micro led building-block displays with little to no seams visible, I guess, but again price and resolutions would likely be very limiting factors for gaming.


The best scenario for now imo is a larger 4k screen giving the option of running uw rez when you want to, and getting slightly better fps, while still having the option of running full screen on easier to render games, highly optimized games, games after reasonably user dialing in graphics settings, or games locked at 60hz anyway, etc. as well as getting the full screen desktop/app space and media playback.
 
As a side note, ultimately I think advanced interpolation is going to have to be implemented in the long run no matter what, as resolutions and game demands grow. In fact VR is already using interpolation technologies to maintain 90fps vs screen blur and motion sickness. Consoles are doing their own checkerboarding and downrezzing tricks in an attempt to bridge the gap for now, and will have to on the next gen touting 120hz quasi/hybrid 4k resolution as well, even with more beefy hardware. And of course PCs use g-sync/freesync VRR to smooth out lower frame rate graphs and variability to make lower frame rate spans on a higher resolution screen more usable.

Tricks and maybe integer scaling and raw gpu power advances will help - but in the long run I believe nothing outside of interpolation will be enough.
 
I agree. I already have to play Control at 3840x1080 resolution because DLSS does not work for super ultrawide and my 2080 Ti is not beefy enough to run 5120x1440 + all raytracing fx. That's still less pixels than 4K. Adding the Nvidia sharpen filter helps reduce the blur induced from this lower resolution. Things like variable rate shading are going to improve performance without visual difference.

To me integer scaling becomes more viable when we start to have 5K+ screens. 1080p to 4K is still a bit too few pixels for my tastes but 1440p is fine. It's been this race where first GPUs were not powerful enough to handle 4K 60 fps and now they've thrown raytracing into the fray which again makes 4K 60 fps difficult to achieve.
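Since integer scaling keeps coming up: its appeal is that it's pure pixel replication, so 1080p on a 4K panel maps each source pixel to an exact 2x2 block with no interpolation blur. A minimal sketch of the operation (numpy used just for brevity):

```python
# Integer (nearest-neighbour) scaling: replicate each source pixel into an exact
# NxN block. 1080p -> 4K is a clean 2x integer factor; 1440p -> 4K would be 1.5x,
# which is why 1440p needs a 5K-class panel (exactly 2x of 1440p) to stay integer.
import numpy as np

def integer_scale(image, factor):
    # image: (height, width, channels) array; factor must be a whole number
    return np.repeat(np.repeat(image, factor, axis=0), factor, axis=1)

frame_1080p = np.zeros((1080, 1920, 3), dtype=np.uint8)
frame_4k = integer_scale(frame_1080p, 2)
print(frame_4k.shape)   # (2160, 3840, 3)
```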
 
If they'd make multi-core cpu usage and multi-gpu, and perhaps even a side processor gpu for offloading whatever fx possible, all seamlessly load-bearing on demand in games and not on the devs, it would go a long way. Many games are still single or 1 to 2 core cpu speed dependent and multi gpu support is lacking on titles and in easy seamless implementation for devs and gamer end users for that matter.

Imagine if you had multi core gpus like multi core cpus - and games seamlessly and automatically used all of them on both cpus and gpus regardless of the game engine. Almost like running the game in a VM of sorts, a virtual single core cpu and a virtual single core gpu as far as the game development and game playing was concerned.
 
If they'd make multi-core cpu usage and multi-gpu, and perhaps even a side processor gpu for offloading whatever fx possible, all seamlessly load-bearing on demand in games and not on the devs, it would go a long way.

Synchronizing the work done by a 'swarm' of compute units for real-time computing (as in gaming) where I/O latency is felt by the user is nearly impossible on a small scale and within the bus limitations imposed by consumer hardware.

Many games are still single or 1 to 2 core cpu speed dependent and multi gpu support is lacking on titles and in easy seamless implementation for devs and gamer end users for that matter.

There is no easy solution to the problem of extracting parallelism from inherently serial routines. Most likely this will be solved with 'smarter' compilers, and the knowledge to engineer such compilers will likely come from applying machine learning to the problem.

Imagine if you had multi core gpus like multi core cpus - and games seamlessly and automatically used all of them on both cpus and gpus regardless of the game engine.

This is essentially breaking the GPU into multiple compute dies, and is already well into development by all major parties. The current en vogue term for the idea is 'chiplets', but it's far from new, and there are very large challenges that need to be addressed at the hardware and software levels once hardware layout and manufacturing processes are sorted.

Almost like running the game in a VM of sorts, a virtual single core cpu and a virtual single core gpu as far as the game development and game playing was concerned.

This would largely be disastrous, at least with the current understanding of code within the industry. This level of abstraction is the opposite of what DX12 and Vulkan do, for example; it would be more like running a Flash game. Essentially, what you're suggesting sounds great for the pooling of resources, but by specifying hardware abstraction (one core - one GPU), any consideration that the code may have been able to make in order to optimize for the underlying hardware is made impossible by design.

More likely, while virtualization (now more through containerization) is a growing and needed trend, the ability of 'virtualized' processes to communicate more or less directly with hardware will expand, not shrink.


Now, as for 'multi-GPU' in general, the challenge is implementing it without also incurring input lag. While possible in theory, that is just not something that has been demonstrated, let alone developed for, even though Nvidia has exposed the tools necessary to get started, and we can assume that AMD and Intel would follow suit as they have for the last decade - if there were a push from the software side.
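On the input-lag point: classic multi-GPU (alternate frame rendering) raises throughput but not responsiveness, since each frame is still rendered end-to-end by one GPU and sits in a deeper in-flight queue. A rough sketch of the arithmetic, under idealised no-overhead assumptions:

```python
# Alternate Frame Rendering (AFR): N GPUs render frames round-robin. Frame *rate*
# scales (ideally) with N, but every individual frame still takes one GPU's full
# render time, and the deeper queue between GPUs tends to add latency on top.

def afr_estimate(single_gpu_fps, n_gpus):
    render_ms = 1000.0 / single_gpu_fps      # one GPU still renders each whole frame
    output_fps = single_gpu_fps * n_gpus     # ideal scaling, no sync overhead
    return output_fps, render_ms

for gpus in (1, 2):
    fps, lat = afr_estimate(60, gpus)
    print(f"{gpus} GPU(s): ~{fps:.0f} fps output, each frame still ~{lat:.1f} ms to render")
# Frame rate doubles, but a frame's render time does not shrink, and queuing /
# synchronization between GPUs usually adds latency - the lag problem above.
```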
 
I was under the impression that this only mattered if you were doing color-sensitive work (such as photo editing or creation work) where it was important to ensure strict color accuracy. I too prefer the more vivid, saturated look. It might not be as "accurate", but it sure makes games pop and I find it visually appealing elsewhere, too. Most comments I've seen regarding this say that if you're not doing color-sensitive work, it largely comes down to individual preference. I've certainly never felt that it impeded anything that I was doing.

Uh no. I want my images to look proper even if I'm not doing color critical work. I've found using wide gamut to display content that's supposed to be sRGB completely ruins skin tones when watching youtube videos and such. I always enable sRGB mode on my Acer X27 if I'm not viewing HDR content, otherwise things just don't look right. Sure, it is definitely personal preference as you said, so what about those of us who want a proper sRGB gamut to display sRGB images/videos? The Asus leaves you completely SOL in that regard.

Wide gamut vs sRGB.

 
Great replies thanks. "Chiplets" sounds interesting. I don't know that the bus limitations you mentioned would exist on the units themselves if they were multi units connected on the same board rather than using pcie slots, and again it would have to be developed along with motherboard and cpu technology that could handle the throughput so this is all futuristic hypotheticals of course. It's like we are still developing cars for the same 1970's infrastructure trying to squeeze another few miles per gallon out of carbon drum banging when we could be developing space age terrestrial designs and infrastructures.

This would largely be disastrous, at least with the current understanding of code within the industry. This level of abstraction is the opposite of what DX12 and Vulkan do, for example; it would be more like running a Flash game. Essentially, what you're suggesting sounds great for the pooling of resources, but by specifying hardware abstraction (one core - one GPU), any consideration that the code may have been able to make in order to optimize for the underlying hardware is made impossible by design.

Yes, I understand that current coding is about squeezing as much as possible out of such limited current hardware designs - but the overall gains from pooling in the long run, jumping far down the road to a hypothetical multi-cpu and multi-gpu seamless virtualization as 1 and 1, could vastly outweigh trying to optimize performance for your 1-core 500mhz cpu and Voodoo2 gpu, so to speak.
 
Edited adding the color gamut issue.


EDIT: ----------
CON...Wide Gamut Only: This is a wide gamut monitor that has no sRGB setting in the OSD for accurate pc color use in SDR content, including obvious things like skin tones in media. Most other gaming monitors are sRGB, and some wide gamut monitors have an sRGB setting as an option in the OSD. This can be a deal breaker for some.
 
Great replies thanks. "Chiplets" sounds interesting. I don't know that the bus limitations you mentioned would exist on the units themselves if they were multi units connected on the same board rather than using pcie slots, and again it would have to be developed along with motherboard and cpu technology that could handle the throughput so this is all futuristic hypotheticals of course. It's like we are still developing cars for the same 1970's infrastructure trying to squeeze another few miles per gallon out of carbon drum banging when we could be developing space age terrestrial designs and infrastructures.


The AMD Ryzen 3000 series CPUs are all chiplet designs. One of the key drawbacks is that communication speed between the chiplets sets limitations even when it's done within the confines of the processor. This is one area where Intel has done better, whereas AMD has overtaken them in pretty much everything else. Delegating between the various chiplets and their cores is no easy task either.
 
I don't know that the bus limitations you mentioned would exist on the units themselves if they were multi units connected on the same board rather than using pcie slots

They shouldn't, but the bus speeds you're looking at must be embarrassingly fast in order to attempt to synchronize all the work done; to facilitate this, GPU chiplet solutions will be working with drivers to 'prep' workloads so that individual outputs can then be reassembled into a coherent datastream. That's for 'on package' links. Across a PCB, the problem compounds, and across system buses? It's not happening.

it would have to be developed along with motherboard and cpu technology that could handle the throughput so this is all futuristic hypotheticals of course

Well, this all exists already in the enterprise / supercomputer space, and development is accelerating. The main problem is that a single application instance is still going to be slower than a moderately configured consumer solution, however, as seen with configurations like Google's Stadia, having other enterprise resources nearby does present some advantages - just not in terms of framerate :D.

It's like we are still developing cars for the same 1970's infrastructure trying to squeeze another few miles per gallon out of carbon drum banging when we could be developing space age terrestrial designs and infrastructures.

We're really doing both, and we cannot 'hard cut' from one to the other without civilization-crippling disruptions.

Yes, I understand that current coding is about squeezing as much as possible out of such limited current hardware designs

To note, this is going to become more necessary as simple hardware improvements become more difficult- what happens when it is no longer feasible to add more cores and increase core speeds for mass-produced processors?

but the overall gains from pooling in the long run, jumping far down the road to a hypothetical multi-cpu and multi-gpu seamless virtualization as 1 and 1, could vastly outweigh trying to optimize performance for your 1-core 500mhz cpu and Voodoo2 gpu, so to speak.

This pooling is already happening, and it works - just not for games, yet.
 
The AMD Ryzen 3000 series CPUs are all chiplet designs. One of the key drawbacks is that communication speed between the chiplets sets limitations even when it's done within the confines of the processor. This is one area where Intel has done better, whereas AMD has overtaken them in pretty much everything else. Delegating between the various chiplets and their cores is no easy task either.

AMD's move was a bit surprising at the outset, as the inherent drawbacks you mention are generally fatal for branching, out-of-order execution due to the latencies involved. This of course can be mitigated with more cache, which is usually quite expensive in terms of increasing die sizes and decreasing yields; however, AMD also shrank their average die size, so that cost doesn't appear to be affecting them.

Now, while AMD's CPU chiplets are interesting, they don't really take into account the GPU side. CPUs are inherently latency sensitive because the most important logic they perform is branching code. For GPUs, latency matters much less as the work has few branches, but bandwidth needs are on an entirely different scale. The work done by GPUs is also highly parallel, so as long as the work is properly segmented and the bandwidth between chiplets is available, it can work very well.
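As a rough illustration of that "entirely different scale": just handing finished 4K frames between chiplets every refresh is already a multi-GB/s stream, and shared intermediate render targets multiply it several times over. Back-of-the-envelope numbers, with illustrative buffer sizes rather than measurements:

```python
# Rough inter-chiplet traffic if render work for a 4K 120 Hz game were split across
# dies and intermediate buffers had to cross the package interconnect each frame.

WIDTH, HEIGHT, HZ = 3840, 2160, 120

def gbytes_per_second(bytes_per_pixel, buffers_per_frame=1):
    return WIDTH * HEIGHT * bytes_per_pixel * buffers_per_frame * HZ / 1e9

print(f"final framebuffer only (4 B/px):   ~{gbytes_per_second(4):.1f} GB/s")
print(f"plus a fat G-buffer (say 20 B/px): ~{gbytes_per_second(20):.1f} GB/s")
# ~4 GB/s just for completed frames, tens of GB/s once intermediate render targets
# are shared - versus the hundreds of GB/s to TB/s a GPU moves internally on-die.
```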
 
Uh no. I want my images to look proper even if I'm not doing color critical work. I've found using wide gamut to display content that's supposed to be sRGB completely ruins skin tones when watching youtube videos and such. I always enable sRGB mode on my Acer X27 if I'm not viewing HDR content, otherwise things just don't look right. Sure, it is definitely personal preference as you said, so what about those of us who want a proper sRGB gamut to display sRGB images/videos? The Asus leaves you completely SOL in that regard.

Wide gamut vs sRGB.

Of course it's going to be a glaring difference side by side (if you only had the wide gamut and used it exclusively for weeks your brain would likely adjust to the new "normal"). But you're right in that the Asus leaves you guys up a creek in that regard. I'm not arguing for this monitor or that anyone should or shouldn't buy it. I just chimed in to agree with AngryLobster's preference of a more saturated look, and to say that I've owned several normal and wide gamut monitors and never felt the need to run the wide gamut ones in sRGB mode when they had one because to me it made the colors look dull and I wasn't concerned with absolute accuracy. I've always liked the more vivid look, especially for games where there is no "proper" color. If you guys want sRGB that's fine and obviously it's another reason to give this Asus a pass.
 