The perfect 4K 43” monitor! Soon! Asus XG438Q ROG

According to the review it will be around $1,100 US. That's very tempting for me.
Yeah, I mean, even with its apparent flaws, getting this screen for less than $1,500 sounds like a tough deal to beat. A 42" 4K 120 Hz screen would be an amazing experience.
Think about all the details in our games that would come to life at that size and resolution, being able to see everything the developers wanted us to see. It would be like when I upgraded from crummy Logitech speakers to high-quality 6" bookshelf speakers: an "oh, so this is what I was missing" moment.
 

Yeah, this went from a "maybe wait and see" to pretty much an instant buy for me. I really did not want to overpay for this thing, even being as unique as it is, when I already have a killer display...but I have no issues with that price.
 

Ditto.

Can't watch the video right now, but is there any word on availability date?
 
"R-O-G"
"ASÜS"
"VÉSA"

Nice to see it in action, but these guys seem ignorant about display tech. I guess it's good to get a layman's take on it, at least.
Ya, it's nuts how little they covered. I know it's semantics, but it's much more of a "first impression" than a review.
 
Hope they come out with an updated DP 2.0 model next year so you don't have to choose between 10-bit and 120 Hz; then it would be perfect. I have a 4K 60 Hz display at the moment, but when it dies I'll look for a decent upgrade. A 43" could fit perfectly.
 
It's a pretty good monitor, but like everything else there are trade offs with it. I'd prefer G-Sync to FreeSync, but that would really add to the cost.
 

So what does the max refresh at 10bit wind up being?
 

Considering how long it has taken to get this out, and the lack of DP 2.0 hardware as is, late next year is probably the earliest we would get something like this. 32" high-refresh-rate 4K panels are in AUO's pipeline too, so I expect those might hit the market in displays next year as well.

I'm going to order this screen, as I can make good use of the large size for productivity. My main concern is whether it will feel good to play games when it's as far away from me as my desk allows, but I can at least return it if it doesn't work out. I'm used to an 8-bit TN panel and have no complaints about that, so the lack of 10-bit color at 120 Hz is not a big deal for me.

I don't know why people make a big deal about VA viewing angles. I have never noticed any problems on my 65" Samsung 4K TV, and even on my TN panel they are barely an issue when sitting in front of the screen. I have experienced viewing angle issues and black crush on a VA panel monitor I had in the past, as well as on cheap TN panels, but with anything modern it's really not a concern at all.
 
60 Hz with 10-bit color at 4:4:4 and 4K.

You can do 8-bit color OR 4:2:2 for 120 Hz at 4K.

That's not that bad.
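The bandwidth math behind that trade-off is simple enough to sketch. This is illustrative arithmetic only (assumed per-pixel component layouts, not any spec's exact accounting):

```python
# Rough per-pixel bit cost for different chroma subsampling modes.
# Luma (Y) is sampled for every pixel; Cb/Cr are shared between pixels
# in the subsampled modes, so their average cost per pixel drops.
def bits_per_pixel(bit_depth, subsampling):
    # Average chroma samples per pixel for each mode.
    chroma_samples = {"4:4:4": 2.0, "4:2:2": 1.0, "4:2:0": 0.5}[subsampling]
    return bit_depth * (1 + chroma_samples)

print(bits_per_pixel(10, "4:4:4"))  # 30.0 bits/px
print(bits_per_pixel(10, "4:2:2"))  # 20.0 bits/px
print(bits_per_pixel(8, "4:4:4"))   # 24.0 bits/px
```

So 10-bit 4:2:2 is actually a lighter load on the link than 8-bit 4:4:4, which is why both show up as alternatives for hitting 120 Hz.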

My current Sammy drops down to 4:2:2 when you set it to "game mode" for lower input lag. The difference is visible on the desktop, but I can't tell at all in most games.

Unless I am looking at colorful fonts it's really difficult to tell.
 
I thought Asus planned a full 10bit 120Hz model for next year?

WTF? They had an Nvidia card hooked up and they didn't bother testing FreeSync on it.
 
Lol, Hardwarecanucks monitor reviews have always been completely useless; it's more like they're just reading off the spec sheet and giving some initial impressions. No hard numbers or data shown at all. The only thing we got out of that video is that the price is much better than we thought, but a real review will have to come from someone else.
 
Not sure why everyone is always worried about 10-bit color. Really only useful for HDR and professional applications where colors are critical and you have a full 10-bit workflow. I don't think anyone is buying this gaming display for the latter.

DP 1.4 can do 98 Hz 10-bit Full RGB HDR (or 4:4:4). If you drop that down to 8-bit, you can get 120 Hz.
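That 98 Hz vs 120 Hz split falls straight out of the link budget. A rough sketch, using DP 1.4's ~25.92 Gbit/s of usable HBR3 bandwidth and ignoring blanking overhead (so real limits sit slightly lower):

```python
# Usable DP 1.4 HBR3 payload after 8b/10b encoding, in Gbit/s.
DP14_GBPS = 25.92

def needed_gbps(width, height, hz, bits_per_pixel):
    # Raw pixel data rate, ignoring blanking intervals.
    return width * height * hz * bits_per_pixel / 1e9

for hz, bpp, label in [(120, 30, "4K 120 Hz 10-bit RGB"),
                       (98, 30, "4K 98 Hz 10-bit RGB"),
                       (120, 24, "4K 120 Hz 8-bit RGB")]:
    need = needed_gbps(3840, 2160, hz, bpp)
    verdict = "fits" if need <= DP14_GBPS else "does not fit"
    print(f"{label}: {need:.1f} Gbit/s -> {verdict}")
```

Even with this optimistic no-blanking model, 4K 120 Hz at 10-bit needs ~29.9 Gbit/s and blows past the link, while 98 Hz 10-bit (~24.4) and 120 Hz 8-bit (~23.9) squeeze in.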
 
It will be good for now because there is no single GPU that can push 120 fps on a 4K display anyways.

That said, not only did they not test FreeSync with an Nvidia card, they didn't even bother watching a movie on it. lol
 

Movies are irrelevant. This is a monitor not a TV :p
 
Depends on which game you're talking about and when it came out.

I recently played the full Half-Life series in 5K ultrawide resolution at 166Hz. Don't forget there are lots of old games that are still good.
 

On the other hand, I felt they had some good insight on what it is to actually use a display like this. Most reviews focus on technical testing and ignore the practical side. That said, the video could have been cut in half without anything being lost. Unboxing is 99% pointless filler, where the only value is seeing what actually comes in the box if it's not clearly stated somewhere.
 

Sure there is. With my sig setup running 4K at 144 Hz in PUBG, I rarely go below that and am routinely in the ~200 FPS range.

Not every game is single-player Metro Exodus with graphics maxed and RTX on...
 
I'm talking newer games. I don't play multiplayer FPS anymore, mostly single player games so this display will be fine for me. I find myself using Gamestream on my Nvidia Shield a lot because I don't like sitting in front of the monitor for long periods of time anyways. I can lounge on the couch this way.

If you can find a GPU that can push 4K/120 FPS in AC: Odyssey, I'll buy that too. :)
 
I've seen you mention this before.

Did they significantly reduce PUBG's graphical quality since 2017? I haven't played it since that summer, and back then it was really hard on the GPU, every bit as much as Deus Ex: Mankind Divided. It was one of the few games I had to run at my alternate lower custom 21:9 resolution of 3840x1646 in order to get good enough framerates.
 

Did they improve PUBG performance to that level now? I used to play it at 1080p on a 1080 Ti and I would drop below 100fps if there were too many smoke grenades going off.
 
I turn down a few graphics settings, like foliage (which only puts you at a disadvantage) and shadows on low.
 
A monitor with a wide VRR range is good for LFC, so newer games benefit too. You don't need games to run at 120 fps to take advantage of a 120 Hz refresh rate at 4K.
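For anyone unfamiliar, LFC works by repeating frames when the game's frame rate falls below the panel's minimum VRR rate, so the refresh rate stays inside the supported range. A toy sketch of the idea (hypothetical simplified logic with assumed range limits, not any vendor's actual algorithm):

```python
# Low Framerate Compensation (LFC), simplified: multiply the frame rate
# by repeating frames until it lands at or above the panel's VRR floor.
def lfc_refresh(fps, vrr_min=48, vrr_max=120):
    multiplier = 1
    while fps * multiplier < vrr_min:
        multiplier += 1
    return fps * multiplier, multiplier

print(lfc_refresh(30))  # (60, 2): each frame shown twice, panel runs at 60 Hz
print(lfc_refresh(90))  # (90, 1): already inside the native VRR range
```

A wide native range (say 48-120 Hz) matters because the doubled rate has to land back inside it; a narrow range leaves gaps where LFC can't help.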
 
So,

Can anyone clear up the 8-bit vs 10-bit confusion for me?

I understand the technical distinction, but what I don't have a good grasp on is whether any titles are actually rendering in 10-bit today, or if it's strictly used for professional applications.
 
Almost all games still use the sRGB color space. Some games have only recently allowed the user to change the color space they use. If a game has an HDR mode, then it will use a 10-bit color space. I believe HDR10 is Rec. 2020, though most HDR computer monitors translate it to DCI-P3.
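The practical difference between 8-bit and 10-bit comes down to how many steps each channel can encode, which is why it mostly matters for HDR's wider brightness range:

```python
# Distinct values per color channel at a given bit depth, and the
# resulting total color count across three channels.
def channel_levels(bits):
    return 2 ** bits

for bits in (8, 10):
    levels = channel_levels(bits)
    print(f"{bits}-bit: {levels} steps/channel, {levels ** 3:,} total colors")
# 8-bit:  256 steps/channel,  16,777,216 total colors
# 10-bit: 1024 steps/channel, 1,073,741,824 total colors
```

Four times as many steps per channel is what keeps HDR's larger luminance range from showing visible banding in gradients.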
 
We don't all run single GPUs, and the step up to 4K is demanding enough to outpace a single GPU in a lot of configurations on the more demanding games, as far as high-Hz benefits go. However, there are isometric and easier-to-render games that average at least 100 fps at 4K on a single GPU (I think Overwatch always scores high on 4K benchmarks), and even some of the more demanding games can reach a 100 fps average at very high (to "ultra minus") custom settings, depending on the game.

That being an average, you can use VRR (G-Sync/FreeSync, and HDMI 2.1 VRR in the future) to straddle it. The frame rate you actually get usually graphs something like plus or minus 30 fps around that average. This makes for a better blend of trade-offs between:

..higher graphics settings + blur reduction (action/image clarity)
and
..motion definition (more unique pages in the action flip book, shown at more pages per second).

There's no reason to cap your fps at anything less than 120 fps / 120 Hz, even on a VA, if 4:4:4 chroma bandwidth weren't the issue.

120 fps at 120 Hz cuts sample-and-hold blur about 50% vs 60 fps at 60 Hz, and shows 2 unique frames for every 1, motion-definition-wise (6:3).

100 fps at 100 Hz is probably ~40% blur reduction, and 5 frames to every 3 vs 60 fps at 60 Hz... still a decent improvement.

Personally, I run my fps averages at 100 fps on more demanding games, so I end up with roughly 70 <<100>> 130 fps, usually capped at 120 fps/Hz on my VA. I'd rather run a 90 <<120>> 150 graph to keep the clarity tighter, but I'm balancing graphics settings vs frame rate even at 1440p, depending on the game.

I'd really like an HDMI 2.1, 4K 120 Hz VRR monitor and an HDMI 2.1-output GPU with a die shrink for some real performance gains, but they are all leaving a trail of breadcrumbs with their product roadmaps.
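Those blur percentages follow directly from sample-and-hold persistence, which you can check in a couple of lines (simple 1/refresh model, ignoring pixel response time):

```python
# On a sample-and-hold display, each frame is held for a full refresh
# period, so perceived motion blur scales with 1 / refresh_rate.
def persistence_ms(hz):
    return 1000 / hz

base = persistence_ms(60)  # ~16.7 ms held per frame at 60 Hz
for hz in (100, 120):
    reduction = 1 - persistence_ms(hz) / base
    print(f"{hz} Hz: {persistence_ms(hz):.1f} ms held, "
          f"{reduction:.0%} less blur than 60 Hz")
```

100 Hz holds each frame for 10 ms (40% less blur than 60 Hz) and 120 Hz for ~8.3 ms (50% less), matching the figures above.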
 

I'm curious how many users on here actually run SLI 2080 Tis? Not many, I'd imagine.

I was a big proponent of SLI'd GPUs but decided to skip it with my 1080 Ti (now have a 2080 Ti). There really isn't much point in it anymore.
 

I find the SLI experience to almost always be worse than with a single GPU. Just not worth it
 
Not many "AAA games" (been watching too much Jim Sterling on YT) support SLI anyways. I suppose it's still good if you're into overclocking (which I'm not).

That's why these displays are future investments for me. Yes, I want the adaptive sync features with my 2080Ti but it might pay for me to wait for a full 10bit HDR 120Hz display with those features.

That said I really am not happy using my 2015 Samsung 48" TV as my every day display anymore. Really want a decent PC focused 4k display with adaptive sync.
 