The perfect 4K 43” monitor! Soon! Asus XG438Q ROG

No, that it was an evolution of VA into more of an IPS-type panel.
Not sure I follow. It's not "more of" an IPS-type panel, it is an IPS-type panel. It is not, nor has it ever been, a VA-type panel technology.

I get what you're saying: it used to be worse, in ways similar to VA panels, but it was still IPS.
 
The contrast and black depth are around triple on a VA gaming display compared to non-FALD IPS and TN: around 3000:1 contrast on a gaming VA versus 860:1, 980:1, up to 1000:1 on TN and IPS, so there is no comparison there.


IPS and TN gaming displays have very PALE blacks and contrast. On a non-FALD VA TV it's way more, like 6,000:1 contrast or higher. So the 43" 4K TV formerly beside my PG278Q TN had over 4 times the contrast and black depth... I put a Samsung on the other side that is around a 6100:1 contrast ratio too. Even dragging image and video windows off my current VA gaming monitor's ~2800-3000:1 contrast ratio shows a huge difference, but games look way better on the 32" VA than they ever did on my other monitors too, especially dark-scene games like Dark Souls 3, Dishonored 2, Far Cry Primal, Vermintide 2, ESO, Remnant: From the Ashes, etc.

I'll never go back to such low black depths. So VA, or at least a FALD IPS or VA, is the least I'll settle for. I still have my concerns about OLED and its limitations, and I'm really not interested in spending much over $2k on a monitor; I'd just wait and get an HDMI 2.1 TV for the living room at that point.
 
Not sure I follow. It's not "more of" an IPS-type panel, it is an IPS-type panel. It is not, nor has it ever been, a VA-type panel technology.

I'll retract a little bit: my initial criticism was that PLS was trying to emulate IPS, and succeeded largely in terms of viewing angles and color, but still failed in the same way that VA panels failed at the time, i.e. input lag from overdrive, while not bringing more contrast than IPS. Basically useless relative to IPS for gaming, but a plausible alternative otherwise.

Now that Samsung has refined it, it's basically equivalent to IPS, to the point that you're worried about other variables more.
 
I'll never go back to such low black depths.

I wouldn't want to, but the LG 32" VA we share is noticeably slower in response time than my older 27" IPS Predator, and I don't notice the additional contrast on the desktop or in games.

I will also say that I haven't gotten around to playing with it yet to try and improve that, so I may be off base, but that's the default impression from having both.
 
If you tweak it and view it in relatively dim room conditions, the black levels should be much darker. I'd also recommend bumping the color vibrance up a few percent, not for accuracy but for better vibrancy in games.

I highly recommend using Nvidia Freestyle, which is part of GeForce Experience and shows up as a slide-out overlay of sliders in your game where you can edit a lot of things like contrast, gamma, saturation, etc. beyond (or even below, when desired) your base OSD settings, on a per-game basis which it remembers. It also has a setting that the monitor OSD unfortunately lacks: a sharpness filter. That means you can use Freestyle to add some sharpness in games, which is appreciated in some titles; it also works as an unsharpen filter if desired. Unfortunately you can't use any of the Freestyle filters on the desktop (notably the sharpen filter), only in games.
Freestyle was written with the help of the author of ReShade, so it's kind of an easy-mode ReShade. If you ever run into a game that isn't supported by Freestyle, you can always fall back to ReShade. When you exit whatever Freestyle-filtered game you are playing, your monitor goes back to its regular OSD setting levels. Outside of that, I keep the Nvidia desktop digital vibrance in the color settings at about 52%.

I think you'll notice quite a difference once it's tweaked and in the right room lighting. It's still no OLED by far; it's not even the 6100:1 of the 43" 4K Samsung 6900 TV I have next to it. Moving an HD or UHD movie window off the 32" gaming panel to the 43" Samsung VA TV next to it is a big difference. Still, it's way better than IPS and TN black levels for gaming on the 32" VA.

-----------------------

I'm hoping these 43" gaming VAs are at least as dark and contrast-rich, if not more so given their HDR labeling, and hoping their overdrive (and response time) is at least on par with the high standard the 32" LG VA has set.

Really I'd prefer a FALD model, even with a dim or glow halo on edges in places, since it would increase the contrast and black levels by magnitudes, to well over 10,000:1. Or OLED, if it had the full boat of other relevant gaming tech and wasn't astronomically priced (over $2500).

I'd rather save my money for a 70"+ HDMI 2.1 OLED TV, or a Q9FN-tier HDMI 2.1 FALD set, for the living room if I were going into $4000+ budgets, personally.
 
What about on the desktop? Is ghosting noticeable there in any way?

It's noticeable (for me) only on the desktop, and only while fast-scrolling text.

After some testing I think the biggest 438Q problem is typical VA contrast shifting. You just can't photo edit on this monitor; you never know which contrast is the true one. It's not noticeable in games and movies.
I think the Acer 43 will not be better.
 
It's noticeable (for me) only on the desktop, and only while fast-scrolling text.

After some testing I think the biggest 438Q problem is typical VA contrast shifting. You just can't photo edit on this monitor; you never know which contrast is the true one. It's not noticeable in games and movies.
I think the Acer 43 will not be better.

What distance are you sitting from the screen, exactly?
 
Shadow...
How is it..? Did you try any of the Blur Busters stuff..? Microcenter couldn't hold on to the few they had; they are getting more. But I am not going to pull the trigger until I see its blur. TFT Central is mum about theirs so far...
 
Does anyone know how edge-lit HDR 1000 is even possible, or how it compares to FALD? I'm assuming it's a cheaper/worse solution. I'm guessing you'd get an entire vertical or horizontal "blooming" line going bezel to bezel, versus just the splotches/circles of blooming that FALDs show when there's contrasting content?
 
Edge-lit LCD HDR1000 will be the worst option and will look similar to this:

[Image: Moon_Local_Dimming_Top_and_Bottom.jpg]

Hence why an edge-lit LCD like this monitor is so cheap.
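To put rough numbers on why that happens, here's a toy model (my own illustration, with made-up zone counts and leakage values, not measurements of any actual panel): with full-height edge-lit strips, one bright object forces an entire bezel-to-bezel band of backlight up, while a FALD grid only raises the few zones directly under it.

```python
# Toy model: each zone's backlight is driven to the brightest pixel it
# covers, and the LCD leaks ~0.1% of the backlight on black pixels
# (i.e. roughly a 1000:1 native panel). Illustrative numbers only.

def black_levels(zone_peaks_nits, leakage=0.001):
    """Black level (nits) each zone shows on black content."""
    return [peak * leakage for peak in zone_peaks_nits]

# A 1000-nit moon on an otherwise black frame.
# Edge-lit: 8 full-height column zones; the moon's whole column fires.
edge_lit = [0, 0, 0, 1000, 0, 0, 0, 0]
# FALD: 384 small zones; only the couple under the moon fire.
fald = [1000 if i in (180, 181) else 0 for i in range(384)]

print(black_levels(edge_lit))  # one whole bezel-to-bezel strip at 1 nit
print(sum(1 for b in black_levels(fald) if b > 0), "of 384 zones lit")
```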
 
Yep. I wouldn't get this expecting pinpoint HDR, especially over 400-nit color brightness (SDR is almost at 350-400 nits max already anyway). It sounds like the Dell OLED is limited to 400 nits max color brightness too, which is even below the ~600-nit color volume limit where an OLED HDR TV's ABL normally sits.

Here are a few OLED ABL 600 and FALD representations for comparison:

 
Yep. I wouldn't get this expecting pinpoint HDR, especially over 400-nit color brightness (SDR is almost at 350-400 nits max already anyway). It sounds like the Dell OLED is limited to 400 nits max color brightness too, which is even below the ~600-nit color volume limit where an OLED HDR TV's ABL normally sits.

Here are a few OLED ABL 600 and FALD representations for comparison:

Eh? Isn't that the dual-layer LCD (CG3145)? :)
 
Thanks for the comparisons. Blah... looks like this won't be the perfect monitor then. I may just need to settle for the PG35VQ or the X35 once I can confirm the flickering issues have been resolved. I've got 12+ games in my backlog library dating all the way back to 2016 which are confirmed PC HDR games, and I refuse to play them until I have an HDR monitor (Forza Horizon 4, HITMAN, Rise of the Tomb Raider, Shadow Warrior 2, Mass Effect: Andromeda, Final Fantasy XV, Assassin's Creed Odyssey/Origins, Far Cry New Dawn, etc.)
 
Eh? Isn't that the dual-layer LCD (CG3145)? :)

Yes, the advert was for the dual-layer LCD, but the reason I showed it is that each example gives a representation of the trade-offs of both OLED's ABL and FALD's dim/glow haloing on edges. I didn't feel like editing out the Eizo dual-layer info part of the image, but I may in the future after this, lol.

To be fair to OLED, in most of the luminance heat maps of HDR games and other HDR material I've seen on the internet, much of the HDR peak color brightness in a scene is at 600 nits (represented as red) or less, other than the sun or a bright moon, or either of them glinting off a gun barrel, water, a car, a mirror/reflection, etc. However, tone mapping could take the brightest point(s) as 600 and ratchet everything else down; depending on how the tone mapping works, the whole scene could potentially drop a few steps lower in color brightness and overall scene brightness.
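For illustration, here's a minimal sketch of that "ratchet the whole scene down" behaviour (my own simplification, not any TV's actual curve; real tone mappers roll off highlights with a curve rather than scaling the whole frame linearly):

```python
# Naive linear tone map: scale every pixel so the frame's peak fits the
# display's ~600-nit ABL ceiling, dimming midtones along with highlights.

def naive_tone_map(pixels_nits, display_peak=600.0):
    source_peak = max(pixels_nits)
    if source_peak <= display_peak:
        return list(pixels_nits)            # already fits, pass through
    scale = display_peak / source_peak      # 0.6 for a 1000-nit sun
    return [p * scale for p in pixels_nits]

scene = [1000, 600, 250, 50, 5]    # sun, sky, wall, shadow, near-black
print(naive_tone_map(scene))       # [600.0, 360.0, 150.0, 30.0, 3.0]
```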



I'd like the option of a FALD model, even with the "glowy" edges (especially at HDR 1000 brightness points)... which is something like a CRT glow, but most overt on the most contrasted parts (the edge of a bright spaceship in space, etc.). I could live with that until something better comes along. These are the perfect size for my array, though, and the price isn't horrible, so it's definitely in consideration. It would be an up-size from my 32" 1440p VA in size and rez at the very least.
 
Thanks for the comparisons. Blah... looks like this won't be the perfect monitor then. I may just need to settle for the PG35VQ or the X35 once I can confirm the flickering issues have been resolved. I've got 12+ games in my backlog library dating all the way back to 2016 which are confirmed PC HDR games, and I refuse to play them until I have an HDR monitor (Forza Horizon 4, HITMAN, Rise of the Tomb Raider, Shadow Warrior 2, Mass Effect: Andromeda, Final Fantasy XV, Assassin's Creed Odyssey/Origins, Far Cry New Dawn, etc.)

I have played every single game you listed except for ME:A and I can tell you that even on a FALD LCD, the HDR experience is really 50/50. It looks amazing during bright daytime scenery and absolutely HORRIBLE once nighttime kicks in, like in AC:O and Far Cry. Waaaaay too much blooming when you are asking for 1000-nit highlights against pitch black. Of course the VA panel that you're getting will probably do better in that regard, but I have found OLED to be a far superior HDR experience.
 
I have played every single game you listed except for ME:A and I can tell you that even on a FALD LCD, the HDR experience is really 50/50. It looks amazing during bright daytime scenery and absolutely HORRIBLE once nighttime kicks in, like in AC:O and Far Cry. Waaaaay too much blooming when you are asking for 1000-nit highlights against pitch black. Of course the VA panel that you're getting will probably do better in that regard, but I have found OLED to be a far superior HDR experience.

Thanks for the input. Geeze... I guess I'm going to have to backlog these games for yet another year and wait until at least 2020 to get the proper HDR experience. It seems like only OLED can pull this off properly, but there's no such thing as an HDMI 2.1 GPU yet. I don't think I can go back to 60Hz and no VRR after experiencing 120Hz & G-Sync for well over a year now.
 
I have played every single game you listed except for ME:A and I can tell you that even on a FALD LCD, the HDR experience is really 50/50. It looks amazing during bright daytime scenery and absolutely HORRIBLE once nighttime kicks in, like in AC:O and Far Cry. Waaaaay too much blooming when you are asking for 1000-nit highlights against pitch black. Of course the VA panel that you're getting will probably do better in that regard, but I have found OLED to be a far superior HDR experience.


The trade-off for the superior HDR experience (and that's pretty much irrefutable) being tearing or lag... which would obviously be alleviated by VRR, yet that isn't an option yet between PCs and TVs.
 
The trade-off for the superior HDR experience (and that's pretty much irrefutable) being tearing or lag... which would obviously be alleviated by VRR, yet that isn't an option yet between PCs and TVs.

The tearing you can eliminate as long as you have the GPU power to maintain a locked vsync fps. As for lag, well, yes, not much can be done about that since we are comparing a low-lag G-Sync monitor to a TV, but at least the input lag was bearable for me since I was playing with a controller and it was just single-player games anyway.
 
The tearing you can eliminate as long as you have the GPU power to maintain a locked vsync fps. As for lag, well, yes, not much can be done about that since we are comparing a low-lag G-Sync monitor to a TV, but at least the input lag was bearable for me since I was playing with a controller and it was just single-player games anyway.


Yes, but V-Sync adds lag on top of existing lag... it's going to be unbearable for some. IQ obviously wins out, but it comes at a compromise. It's all very much personal preference at the end of the day. The bigger issue for most is the complete impracticality of using a 55" as a PC monitor versus even a 43", which is doable for far more people.
 
You also aren't getting anything out of your higher-Hz capability if you are locking your fps lower with V-Sync.

You don't reduce viewport-movement smearing by 40% until you hold a solid 100fps, and you don't get 50% blur reduction (and double the motion definition) until a solid 120fps. Using VRR/G-Sync/FreeSync lets you straddle a relatively high rate, say 100fps in a 70 <<100fps>> 130fps graph, with some lower frame rates mixed in, while still playing in a high enough mix of frame rates to get more out of a high-Hz monitor without suffering jarring frame-rate dips, potholes, stutters, etc.
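For anyone wondering where those percentages come from, it's just frame-persistence arithmetic against a 60fps sample-and-hold baseline (my arithmetic, assuming a plain sample-and-hold panel with no strobing/BFI):

```python
# Eye-tracking smear scales with how long each frame is held, i.e. 1/fps.

def blur_reduction_vs_60(fps):
    return 1 - (1000 / fps) / (1000 / 60)   # shorter hold = less smear

for fps in (100, 120):
    print(f"{fps} fps: {blur_reduction_vs_60(fps):.0%} less smear, "
          f"{fps / 60:.2f}x the motion definition")
# 100 fps: 40% less smear, 1.67x the motion definition
# 120 fps: 50% less smear, 2.00x the motion definition
```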

Just run V-Sync on a "powerful enough GPU" and get into the 100Hz-120Hz range with your frame rate? What GPU does that at 4K rez in a demanding game at high-to-ultra settings? Some of the most demanding can't even do it at 1440p.

- So in order to get higher pixel density and HDR capability you'd turn the graphics settings down a lot and/or lose a ton of image quality, motion clarity, and motion definition, throwing your high-Hz capability ceilings away. Where is the overall fidelity and high-end graphics gain there?
 
You also aren't getting anything out of your higher-Hz capability if you are locking your fps lower with V-Sync.

You don't reduce viewport-movement smearing by 40% until you hold a solid 100fps, and you don't get 50% blur reduction (and double the motion definition) until a solid 120fps. Using VRR/G-Sync/FreeSync lets you straddle a relatively high rate, say 100fps in a 70 <<100fps>> 130fps graph, with some lower frame rates mixed in, while still playing in a high enough mix of frame rates to get more out of a high-Hz monitor without suffering jarring frame-rate dips, potholes, stutters, etc.

Just run V-Sync on a "powerful enough GPU" and get into the 100Hz-120Hz range with your frame rate? What GPU does that at 4K rez in a demanding game at high-to-ultra settings? Some of the most demanding can't even do it at 1440p.

- So in order to get higher pixel density and HDR capability you'd turn the graphics settings down a lot and/or lose a ton of image quality, motion clarity, and motion definition, throwing your high-Hz capability ceilings away. Where is the overall fidelity and high-end graphics gain there?

Ummm, my OLED is 60Hz? What need is there to run 100fps on a 60Hz TV? And my 2080 Ti has been enough to maintain a mostly locked 60fps. I'm talking purely about the IMAGE QUALITY IN HDR on my OLED vs. a FALD LCD. Obviously I want greater than 60fps, but I'm waiting on more powerful HDMI 2.1 GPUs and a 120Hz C9/C10 OLED for that. I literally made no mention of the obvious and desirable higher-frame-rate advantages, yet this somehow always comes into the conversation.
 
Ah, OK. There are 1000-nit HDR FALD 120Hz IPS computer displays that have been out a while, more of these non-FALD "fake" or quasi-HDR ones at 120Hz and 144Hz due out, and people have been keeping the very high-priced Dell 120Hz gaming OLED (non-HDR) in mind as a comparison in this thread too, so I was going off of that. People buying the monitor discussed in this thread are mostly interested in the higher-Hz capability at 4K that they've been waiting for; otherwise they'd just buy a 60Hz TV at this size or similar. I get that you were talking about your 60Hz TV, but to me image clarity (blur reduction) especially, as well as motion definition, is a big part of the overall image quality in first- and third-person games.

HDMI 2.1 displays and HDMI 2.1-output GPUs (and a die shrink) are definitely worth waiting for if you can hold out that long, I agree. These trail-of-breadcrumbs roadmaps and monitors are tempting a lot of us though, especially the ones that are closer to $1k than $2k+.
 
These trail-of-breadcrumbs roadmaps and monitors are tempting a lot of us though, especially the ones that are closer to $1k than $2k+.

Why would you blow $1-2k when you can just pick up the Alienware 55??? At four grand it's so cheap, I am surprised people are not buying three for Eyefinity!
 
People act like everything but OLED or FALD is just unusable and horrible, but really, edge-lit HDR TVs in my experience look perfectly fine. Sure, OLED and FALD are better, but you can't have everything.


It's noticeable (for me) only on the desktop, and only while fast-scrolling text.

After some testing I think the biggest 438Q problem is typical VA contrast shifting. You just can't photo edit on this monitor; you never know which contrast is the true one. It's not noticeable in games and movies.
I think the Acer 43 will not be better.

How does the VA contrast shift manifest itself? Like, do you see it if you shift position in your chair or something? Does viewing distance matter? The Acer 43" will use the same panel, so it won't make a difference.
 
Anyone heard when Micro Center is due to get more in? All I see is SOLD OUT and In Store Only for every store.
 
How does the VA contrast shift manifest itself? Like, do you see it if you shift position in your chair or something? Does viewing distance matter? The Acer 43" will use the same panel, so it won't make a difference.

Much like any LCD monitor issues/anomalies, they exist objectively, but subjectively: for some people they are show-stoppers, while others can't see them even if they try, making them a complete non-issue. If you already use VA panels without issue, I wouldn't worry about it.

Contrast shift on VA panels manifests as the contrast being strongest when your eye is perpendicular to the display. It is especially noticeable in the low-to-mid-tone region, which is darker when viewed perpendicular and fades or washes out at an angle.

The usual response is: "Who cares? I will only use my monitor straight on." But you are really only perpendicular to the very center of your monitor (if seated perfectly centered); the contrast shifts the further the image gets from the center of the screen. The bigger the screen and the closer you sit, the larger the viewing angle to the sides becomes, and the contrast shift gets progressively worse. On top of that, even small shifts in head position will cause shifts in the image presentation. The view for each eye is also slightly different, because they sit at different angles to the screen.
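To put rough numbers on that (my own geometry, assuming a flat 16:9 panel and a viewer seated dead centre):

```python
import math

# A 16:9 panel's width is diagonal * 16 / sqrt(16^2 + 9^2); the edge
# angle is how far off-perpendicular your eye is at the screen's edge.

def edge_angle_deg(diagonal_in, distance_in):
    width = diagonal_in * 16 / math.hypot(16, 9)
    return math.degrees(math.atan((width / 2) / distance_in))

for diag in (32, 43):
    for dist in (24, 36):
        print(f'{diag}" panel at {dist}": edges are '
              f'{edge_angle_deg(diag, dist):.0f} degrees off-axis')
# A 43" panel at 24" works out to roughly 38 degrees at the edges --
# well into the range where VA gamma/contrast visibly shifts.
```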

Subjectively, this goes unnoticed, or is ignored, by most people.

On the other side of the coin, some people are very sensitive, like me. Those shifts in image presentation, and the difference between each eye, tell my brain something is wrong, and I get headaches when using VA monitors, even much smaller 24" screens.

I have no issues using VA TVs from 10 feet away, because at that distance the shifts only manifest as fading if I sit off-axis, not weird image instability and differences between each eye.
 
People act like everything but OLED or FALD is just unusable and horrible, but really, edge-lit HDR TVs in my experience look perfectly fine. Sure, OLED and FALD are better, but you can't have everything.

How does the VA contrast shift manifest itself? Like, do you see it if you shift position in your chair or something? Does viewing distance matter? The Acer 43" will use the same panel, so it won't make a difference.


These are some examples of the VA shift on my 43" TVs

CLICK THE IMAGES .. the thumbnail is NOT an accurate representation. :geek:

---------------

https://www.rtings.com/tv/reviews/tcl/s-series-s405-4k-2018



https://www.rtings.com/tv/reviews/samsung/nu6900


In regular usage, at the side-angled viewing positions I use them at, it's mostly confined to the nearest low corner and only noticeable on solid bright fields of color.
-------

Example of a GK850G-B

https://pcmonitors.info/reviews/lg-32gk850g/#Viewing_angles


------

and the GK850G (non-B) uniformity map

https://www.tftcentral.co.uk/reviews/lg_32gk850g.htm#uniformity




The tolerance is very low on the maps, which always makes them look worse than they are, but you can see that either the top or bottom corners will have the most "shading gradient" of shift depending on what angle you view the monitor from.
-------

In gaming, movies, and regular usage it's really not noticeable head-on. If you are looking for it on solid full-screen content, or you have a bright full-screen field of color like a maximized document, it's noticeable in the corners (and a bit on the edge of my 32") as a slight shading gradient. From farther away it's less prominent: sitting 1.5-2' from my GK850G made it more obvious than sitting 3' away.

So used as a gaming and media monitor it's a non-issue imo. For full-screen document work, poster-board art, image editing and such, VAs are not great all-arounders, but they are definitely usable depending on what you are doing. That said, my 43" TVs-as-monitors have way less shift width/area, as evidenced in the photos I linked and from my personal experience throwing solid bright 4K color images up on each; it's very minimal on them. The 32" gaming panel is the worst of the bunch. So not all VAs show the same amount of shift, and distance also matters. I use the 43" TVs for pretty much everything, but I keep the 32" gaming one's duties down to mostly gaming, plus some extra space for a file-management window or another web browser instance.
 
People act like everything but OLED or FALD is just unusable and horrible, but really, edge-lit HDR TVs in my experience look perfectly fine. Sure, OLED and FALD are better, but you can't have everything.

It's been that way since technological advancements existed.

"Hard drives are unusable after experiencing SSDs!"
"60 Hz is unplayable after experiencing 120 Hz!"
"LCDs are unwatchable after you experience OLED!"

In reality, none of these are true but the differences are significant enough that I understand why people tend to spout the hyperbole. I've done it myself, and goodness knows I've seen plenty of it on this forum and others over the years.
 
They are more than fine for SDR material, but some of us take issue with slapping an HDR label on everything and charging more for it. Taking a source whose frames are coded for a 600-1000-nit HDR color ceiling and trying to flashlight that from the edge across darker-colored areas dynamically is not going to work out right at all.

Even for SDR, a full FALD array will have over a 10,000:1 contrast ratio, in some cases over 19,000:1, and even more so for OLED black depth side by side with bright colors. Lower contrast ratios are definitely usable and look OK, but they will be much paler and less stunning, and really aren't true HDR contrast and black depth (contrast and black depth also drop across flashlighted areas).
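The arithmetic behind those figures is simple: contrast ratio is just peak white divided by black level (the nit values below are illustrative round numbers, not measurements):

```python
# Back-of-envelope contrast math: dimmed FALD zones push the black
# level so low that the ratio dwarfs any static panel's native figure.

def contrast_ratio(peak_nits, black_nits):
    return peak_nits / black_nits

print(f"{contrast_ratio(350, 0.35):,.0f}:1")   # ~1,000:1  IPS/TN native
print(f"{contrast_ratio(350, 0.117):,.0f}:1")  # ~3,000:1  VA native
print(f"{contrast_ratio(1000, 0.05):,.0f}:1")  # 20,000:1  FALD, zones dimmed
```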

There's a difference between saying it's not a decent display or that it's unwatchable, and saying it's incapable of displaying HDR content properly.

Still, the size of these 43" displays, the fact that they do 4K at 120Hz with VRR/FreeSync compatible with Nvidia GPUs, and this price point are all good - and at up to triple the contrast and black depth of IPS/TN.
 
People act like everything but OLED or FALD is just unusable and horrible, but really, edge-lit HDR TVs in my experience look perfectly fine. Sure, OLED and FALD are better, but you can't have everything.
Well, why can't we have everything? o_O Hmm? :confused: I'll consider upgrading my monitor when they release a 4K 144+Hz HDR 1000 with at least a .5 to 1ms response time. (y)
Thanks. :)

:D
 
It's been that way since technological advancements existed.

"Hard drives are unusable after experiencing SSDs!"
"60 Hz is unplayable after experiencing 120 Hz!"
"LCDs are unwatchable after you experience OLED!"

In reality, none of these are true but the differences are significant enough that I understand why people tend to spout the hyperbole. I've done it myself, and goodness knows I've seen plenty of it on this forum and others over the years.
 
It's been that way since technological advancements existed.

"Hard drives are unusable after experiencing SSDs!"
"60 Hz is unplayable after experiencing 120 Hz!"
"LCDs are unwatchable after you experience OLED!"

In reality, none of these are true but the differences are significant enough that I understand why people tend to spout the hyperbole. I've done it myself, and goodness knows I've seen plenty of it on this forum and others over the years.

I just wish we were at the next echelon of monitors. I mean, there is no "oh, you NEED to have that" with monitor features the way using a spinning disk over an SSD would be foolish (yes, I know, for large storage...). With monitors, 4K or 144Hz seems nice but not a necessity; at this point, which monitor someone chooses basically comes down to preference of specs.
Don't get me wrong, I like choice, but there are not many options that have it all.
Fingers crossed for a 43" 4K 144Hz IPS or OLED in the near future.
 
I just wish we were at the next echelon of monitors. I mean, there is no "oh, you NEED to have that" with monitor features the way using a spinning disk over an SSD would be foolish (yes, I know, for large storage...). With monitors, 4K or 144Hz seems nice but not a necessity; at this point, which monitor someone chooses basically comes down to preference of specs.
Don't get me wrong, I like choice, but there are not many options that have it all.
Fingers crossed for a 43" 4K 144Hz IPS or OLED in the near future.

I don't think there is any monitor out there that has it all. The ASUS ROG XG438Q checks off a lot of boxes for me, but it hardly has it all. Every monitor is a compromise. That's just how it is right now.
 
I don't think there is any monitor out there that has it all. The ASUS ROG XG438Q checks off a lot of boxes for me, but it hardly has it all. Every monitor is a compromise. That's just how it is right now.

Exactly. There is no perfect monitor, and it's been that way for a long time now. You just have to choose the one that ticks the most of the boxes that matter to you. They all have some sort of compromise.
 

Painful or less than optimal? Yes. Unusable? No.

If you're used to an NVMe SSD and a 240Hz OLED (if it existed) and then you downgraded to a 5,400rpm hard drive and a 15" TN LCD, I promise that you would still be able to use your computer and even play games on it (gasp!). You just wouldn't want to. It wouldn't be enjoyable, much less the greatest experience you could be having. The point is, all the conjecture is just a figure of speech when you're [H] and used to the finer cuisine in the world of tech. It's hard to go backwards. :) But, not impossible.
 
Painful or less than optimal? Yes. Unusable? No.

If you're used to an NVMe SSD and a 240Hz OLED (if it existed) and then you downgraded to a 5,400rpm hard drive and a 15" TN LCD, I promise that you would still be able to use your computer and even play games on it (gasp!). You just wouldn't want to. It wouldn't be enjoyable, much less the greatest experience you could be having. The point is, all the conjecture is just a figure of speech when you're [H] and used to the finer cuisine in the world of tech. It's hard to go backwards. :) But, not impossible.

LCD is hot garbage and now that Dell / Alienware are making the 55 OLED so affordable there is no excuse for anybody to ever use LCD again!
 