The perfect 4K 43” monitor! Soon! Asus XG438Q ROG

Of course it's going to be a glaring difference side by side (if you only had the wide gamut and used it exclusively for weeks, your brain would likely adjust to the new "normal"). But you're right that the Asus leaves you guys up a creek in that regard. I'm not arguing for this monitor or that anyone should or shouldn't buy it. I just chimed in to agree with AngryLobster's preference for a more saturated look, and to say that I've owned several normal and wide gamut monitors and never felt the need to run the wide gamut ones in sRGB mode when they had one, because to me it made the colors look dull and I wasn't concerned with absolute accuracy. I've always liked the more vivid look, especially for games, where there is no "proper" color. If you guys want sRGB that's fine, and obviously it's another reason to give this Asus a pass.

You do not need to have the monitors side by side to notice that the colors are off. People are going to have tomato-red skin tones on a wide color gamut monitor, and I can notice that without a side-by-side comparison. Admittedly, yes, I do like to turn on wide color gamut mode for some games, and seeing as this monitor is marketed as a gaming display, I can see why some people are saying that the lack of sRGB and even the BGR layout aren't a problem. But if people ignore these things and just buy it anyway, that lets Asus know that this is totally acceptable and they don't need to make any improvements while charging over a grand for a monitor. We need to stop with this whole idea that because it's a gaming monitor, it's fine for it to be missing features.
 
Typical VA asstastic black smearing, it lacks the connections to actually do 120Hz properly, and the overshoot is wild. On top of that, it sucks for desktop work.

So, middling display with a high price, basically.

That's it in a nutshell. I'd consider it at half the price, as a second screen, but could never have it as a daily driver for work stuff, and that's what I'd want from a £1000+ monitor.

It bugs me somewhat when I see people actually defending an overpriced product that has clearly cut so many corners... of course people are free to think what they want, but it's those attitudes which go a long way toward explaining why the likes of Asus continue to make shoddy crap like this... because they know people will buy it. Where's the incentive to do better? I'd like to hold out hope for the UQ version, and Acer's CG437K, but I fear it's mere wishful thinking at this stage. I can't see them offering anything but minor improvements.
 
I pretty much only want IPS for work. My experience thus far with VA is that the price you pay for the extra static contrast is negated by black smearing, excessive overshoot, or both.
 

Yes, but then you have horrendous glow and bleed to deal with in even high-end IPS monitors these days. VA can at least be done better than this 43" example, but there's no avoiding some degree of smearing, and if you're sensitive to that, you'll always see it.
 

It's glow and bleed, or uniformity issues: really different things that have the same basic effect of unevenness. But the black smearing? Well, I see everything, so most of the time I just deal with the deficiencies of whatever I am using.

In this case it's more the lack of a real gain: trading more contrast for more smearing, which in effect reduces the detail that I wanted the contrast for.


The main thing is that this is 43", 4K, and 100Hz+. That's useful. But not enough to spend this much for this many deficiencies, unless it's absolutely what one is looking for :).
 
Black Smearing
------------------------------------

Black smearing and uniformity can be done better than this. The percentage of worst-case transitions that fall outside the ability of the response time and overdrive can be lower than what the XG438Q manages. The bar has been set high by the 32GK850G, which has a very good response time for a VA panel and an excellent overdrive implementation. The GK850G has its own trade-offs in PPI/resolution, out-of-the-box color vibrancy, and not the best shift uniformity. Whether these 43" monitors would be able to match the response time plus very high quality overdrive implementation of the GK850G was one of my main concerns.

Black smearing is pretty much a moot point in any game that isn't getting out of the 60fps/60Hz sample-and-hold smearing range and up toward 100fps+. Not only fast-moving virtual objects but the entire viewport movement in 1st/3rd person games will be smearing blur, overshadowing any black smear you'd otherwise see. The number of games stuck at those smeary, lower-frame-rate sample-and-hold blur levels will be a lot higher at 4K resolution. That still leaves games with dialed-in/down graphics settings that can achieve 100fps+, easier-to-render or older games, 2D games, and desktop use, all at more of a soft blur at 120fps/120Hz, where the blur is cut by ~50% compared to 60fps/60Hz. Without smearing levels of sample-and-hold blur to mask it, the black smearing becomes more of a standout.
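
To put rough numbers on that ~50% figure: on a full-persistence sample-and-hold display, perceived blur is roughly how far your eye tracks across the screen during one frame. A minimal sketch (the 960 px/s panning speed is just an illustrative value, not a measurement of any particular game):

```python
# Sample-and-hold motion blur: while the eye tracks a moving object, each
# frame stays lit for the full frame time, smearing the object across the
# retina by (tracking speed x frame time).
def blur_width_px(speed_px_per_s: float, refresh_hz: float) -> float:
    """Approximate perceived blur width on a full-persistence display."""
    return speed_px_per_s / refresh_hz

speed = 960.0  # assumed panning speed in pixels per second, for illustration
for hz in (60, 100, 120):
    print(f"{hz:>3} fps/Hz: ~{blur_width_px(speed, hz):.1f} px of smear")
# 60 -> ~16 px, 120 -> ~8 px: doubling the sustained frame rate halves the
# sample-and-hold blur, which is the ~50% cut mentioned above.
```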

Personally I will never go back to the 860:1 to 1000:1 contrast, and the accompanying black depths, of TN and IPS screens. That's about a third of the contrast of typical gaming VA screens and a quarter of the native contrast of this XG438Q (~4000:1), before considering the local dimming, which can put it up to 7000:1 or more across large parts of a frame. VA TVs have native contrast between 4100:1 and 6100:1 depending on the model, perhaps even over 7000:1 on some models, and any with local dimming or FALD will go over 11,000:1 and sometimes much higher.
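
For a sense of what those ratios mean at the desk: black level is just white luminance divided by contrast ratio. A quick sketch, assuming a 120-nit SDR white point (an arbitrary but typical desktop setting):

```python
# Black level follows directly from the white level and the contrast ratio.
def black_level_nits(white_nits: float, contrast: float) -> float:
    return white_nits / contrast

WHITE = 120.0  # assumed SDR white point in nits, for illustration
for label, contrast in [("TN/IPS ~1000:1", 1000),
                        ("typical gaming VA ~3000:1", 3000),
                        ("XG438Q native ~4000:1", 4000),
                        ("with local dimming ~7000:1", 7000)]:
    print(f"{label:28s} black = {black_level_nits(WHITE, contrast):.3f} nits")
# 0.120 vs 0.040 vs 0.030 vs 0.017 nits: at the same white level the VA
# blacks are 3-7x deeper, which is the whole appeal of the panel type.
```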

---------------------------------
Show me what you got
----------------------------------
I agree that the price is way too much for what this monitor is offering. If it were the same or better in every facet compared to my GK850G it would be more worth it, but compared to the GK850G:

-overdrive + response time isn't as effective at eliminating as large a % of black smearing on the worst transitions
-uniformity/shift at its extents (the corners) is much more darkly shaded
-text might not look as crisp due to BGR, so it's not a clear upgrade over the low-PPI "soft" text of the 32GK850G here either
-colors are inaccurate since it's wide gamut only, with no sRGB setting available in the OSD
-FreeSync.. G-Sync still has some benefits over FreeSync
-no HDMI 2.1, so it runs a resolution too high for DP 1.4 bandwidth at 4K 120Hz, and there's no future-proofing with an HDMI 2.1 input
-price is a lot higher. The GK850G was $850 when it came out but soon dropped to $650 and lower, with the FreeSync models hitting $450, where their value really lies.

Pros:
-screen size without being way too large (and personally, it fits my array perfectly)
-higher PPI/resolution
-massive desktop/app and isometric/2D space
-high Hz (but much harder to feed a high enough frame rate at 4K)
-some quasi-HDR or "SDR+" capability
-native contrast is almost 1000:1 higher at around 4000:1, going several thousand higher/darker with local dimming in some areas of the screen (I'd personally prefer spending a little more and getting FALD on a better monitor overall)
-VESA mount (you might need it to flip the monitor upside down if BGR bothers you enough)
 
Ty, that helps. I'll wait for it to drop; I'm already on the GK850G.
 
After lots of testing I think this monitor's HDR performance is basically on par with the 49" X900F I had. That TV only had 25 zones, so for most content it barely did anything, just like this monitor.

IMO it's basically always outputting 400+ nits in HDR content due to the limited zone count, relying on its native contrast to disguise blooming, which it does a great job of. Highlights and such are where the 650+ nit momentary flashes occur, and TBH I dunno if I'd be comfortable using this display @ 1000 nits.

Its real weakness is subtitles, due to how fast Asus made the local dimming react. It behaves like there's an explosion on screen and can briefly light up the scene depending on what's being displayed. It's much worse in dark content with subtitles, to the point where I'd turn off local dimming.

My PG27UQ causes a lot of eye strain in HDR due to the brightness.

I find my Acer X27 to be a little bit too bright for prolonged HDR gaming sessions as well, but then someone told me that it's not the display's fault but mine for not understanding how HDR works soooooo ¯\_(ツ)_/¯
 
I'm pretty sure most displays (and users) are not using HDR properly right now. In fact, with true PQ HDR, which uses absolute values, the common complaint is that the average image brightness is too dark for rooms that aren't home-theater dim or dark. HLG gamma is relative, which allows the white point to be all over the place.


https://www.lightillusion.com/uhdtv.html
"Although the nominal nits value for HLG diffuse white will vary with the peak brightness of the display, a 1000 nit display will have place diffuse white around 200 nits, similar to PQ based HDR, while a 5000 nits HLG display will have diffuse white around 550 nits, depending system gamma (see later for info on HLG system gamma).


So the reality is that HDR should mainly add to the existing brightness range of SDR displays, so more detail can be seen in the brighter areas of the image, where existing SDR images simply clip, or at least roll-off, the image detail."

--------------------------------------------

That means more detail and brighter colors throughout the heights of the 3D color gamut, instead of clipping or roll-off at SDR peaks. It also means that when HDR is done properly, the gamma and white point are absolute, with diffuse white around 200 nits. When you are using a "hybrid" HDR and changing the white point, you are changing everything. If you scale the white point up and/or change your gamma curve to compensate for a poor viewing environment ("It's too dim, I'll turn it up"), your highlights and peaks are going to be darker or much brighter than the HDR standards based on human eyesight intend.
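
That ~200 nit diffuse white isn't arbitrary; it falls out of the BT.2100 HLG math referenced in the quote above: reference white sits at 75% signal level, and the display applies a system gamma tied to its nominal peak luminance. A rough sketch (my own arithmetic, not from the quoted article):

```python
import math

# BT.2100 HLG inverse OETF: signal level in [0,1] -> normalized scene light.
A, B, C = 0.17883277, 0.28466892, 0.55991073
def hlg_inverse_oetf(v: float) -> float:
    return (v * v) / 3.0 if v <= 0.5 else (math.exp((v - C) / A) + B) / 12.0

def hlg_display_nits(signal: float, peak_nits: float) -> float:
    # BT.2100 system gamma for a display with nominal peak luminance Lw.
    gamma = 1.2 + 0.42 * math.log10(peak_nits / 1000.0)
    return peak_nits * hlg_inverse_oetf(signal) ** gamma

for peak in (1000, 5000):
    nits = hlg_display_nits(0.75, peak)  # 75% signal = HLG diffuse white
    print(f"{peak} nit HLG display: diffuse white = {nits:.0f} nits")
# ~203 nits at 1000 nits peak, matching the quoted "around 200 nits"; the
# 5000-nit case lands near ~690 with this system gamma, in the same
# ballpark as the quoted ~550 ("depending on system gamma").
```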

-----------------------------------------------
https://www.lightillusion.com/uhdtv.html


HDR (High Dynamic Range) via PQ (Perceptual Quantizer) or HLG (Hybrid Log-Gamma) EOTF (Gamma) and WCG (Wide Colour Gamut) imagery

Absolute vs. Relative - PQ vs. HLG
One of the things we have simply come to accept when watching TV at home is that we set the peak brightness of the TV to accommodate the existing viewing environment within the room that houses the TV - most commonly the lounge. This is obviously ignoring the videophiles that have environment controlled man-caves with true home cinema setups, but they are not the norm for home TV viewing.


Whilst we know and understand that the SDR grading display will have been calibrated to 100 nits, we also understand that it will have been housed in a controlled grading environment, with little ambient light. The beauty of SDR's relative approach to gamma is that the TV can simply be made brighter to overcome uncontrollable light contaminated environments, including the use of different gamma values.


One of the often overlooked potential issues with PQ based HDR for home viewing is that because the standard is 'absolute' there is no way to increase the display's light output to overcome surrounding room light levels - the peak brightness cannot be increased, and neither can the fixed gamma (EOTF) curve.



As mentioned above, with PQ based HDR the Average Picture Level (APL) will approximately match that of regular SDR (standard dynamic range) imagery. The result is that in less than ideal viewing environments, where the surrounding room brightness level is relatively high, the bulk of the PQ HDR image will appear very dark, with shadow detail potentially becoming very difficult to see. This is still true with a diffuse white target of 200 nits, rather than the original 100 nits diffuse white.



To be able to view PQ based 'absolute' HDR imagery environmental light levels have to be very carefully controlled. Far more so than for SDR viewing. This really does mean using a true home cinema environment.


Or, the PQ EOTF (gamma) has to be deliberately 'broken' to allow for brighter images - which many home TVs do.


To back this statement up, the average surround illumination level that is specified as being required for PQ based HDR viewing is 5 nits, while for SDR it has always been specified as 10% of the maximum brightness of the display. Unfortunately, the surround illumination specification for SDR has since been (incorrectly) changed to 5 nits as well...

--

PQ - An Absolute Standard

Referring to PQ as an 'absolute' standard means that for each input data level there is an absolute output luminance value, which has to be adhered to. There is no allowance for variation, such as changing the gamma curve (EOTF), or increasing the display's light output, as that is already maxed out.

With HLG based 'relative' HDR this can be less of an issue, as the HDR standard can be scaled in exactly the same way as traditional SDR TVs, and further, includes a system gamma variable based on surround illuminance, specifically aimed at overcoming environmental lighting issues.


But, having said that, HLG based HDR has its own issues if the peak luma of the display is below approx. 1000 nits, as the average picture level of the HDR image will appear dimmer than the equivalent SDR image. This is due to the nominal diffuse white point being lower than the actual peak luma of an SDR TV set to 200 to 250 nits, as is normal for home viewing environments.
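
To make the "absolute" part concrete: the SMPTE ST 2084 PQ EOTF maps each code value to a fixed luminance in nits, regardless of the display or the room. A minimal sketch of that curve:

```python
# SMPTE ST 2084 (PQ) EOTF: normalized signal E' in [0,1] -> absolute nits.
M1 = 2610 / 16384        # 0.1593017578125
M2 = 2523 / 4096 * 128   # 78.84375
C1 = 3424 / 4096         # 0.8359375
C2 = 2413 / 4096 * 32    # 18.8515625
C3 = 2392 / 4096 * 32    # 18.6875

def pq_eotf_nits(signal: float) -> float:
    e = signal ** (1.0 / M2)
    return 10000.0 * (max(e - C1, 0.0) / (C2 - C3 * e)) ** (1.0 / M1)

for s in (0.51, 0.58, 0.75, 1.00):
    print(f"signal {s:.2f} -> {pq_eotf_nits(s):8.1f} nits")
# ~0.51 signal is ~100 nits, ~0.58 is ~200 nits (diffuse white), ~0.75 is
# ~1000 nits, and 1.00 is 10000 nits. The same code value must produce the
# same luminance on every compliant display, so there is no legitimate
# "brightness" control in PQ mode; brightening means breaking the EOTF.
```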
 
I'm pretty sure most displays (and users) are not using HDR properly right now.

I guarantee that I am not ;)

I gave up on HDR on the desktop -- it's useless at the moment, and I don't see anything in the pipe that could improve it.

I'm looking forward to color, contrast, and brightness being factory calibrated for HDR.
 
Just wanted to update that I sold this monitor to my friend, who has zero standards when it comes to display quality.

I spent a lot of time messing around between OD4 and OD5 while also comparing against my Q80R, and came to the conclusion that the Asus is just way too slow. OD5 is comparable to a typical fast IPS (albeit with tons of overshoot), but the difference is profound compared to OD4.

The turning point was in AC: Odyssey, where I'd preemptively lament the sun going down because it meant tons of dark transitions and, in turn, a blurfest. I switched to my Q80R and realized that even though the Asus is achieving 60Hz+, I'm only benefiting in terms of camera smoothness. In terms of clarity it's no better than 60Hz.

The other thing is that I think my 83% DCI-P3 measurement is accurate, going by eyeballing between my Q80R and other wide-gamut displays. Lastly, the local dimming in dark games with HDR support (and sometimes in dark SDR games) is just way too aggressive. GoW5 is a perfect local dimming stress test to demonstrate this.

IMO the monitor is not even fast enough for 100hz and is just way too blurry.

I'm back to using my PG27UQ except it's 12 inches from my face to compensate for the size.


Jeez, Asus really did drag the bottom of the VA barrel for these panels! I hold no hope for the CG437K or XG43UQ after this hunk-a-junk.

As far as I can see, we're basically screwed when it comes to monitor selection in this sector for the foreseeable future... the LG 38GL950G is stupidly overpriced, and there's nothing else in the ultrawide or 40"-43" spectrum upcoming that I'm aware of. There is the Philips 328M1R, 32" 4K 120Hz with HDR-600 due next year, but they're oddly pitching that as a console monitor so I don't know how it'll turn out. Also there is some mention of 'unconventional' sub-pixel layout here, which is concerning. I believe the AUO 32" 4K high refresh IPS panel is still due at some point, so maybe that will surface in 2020, but entering the IPS glow lottery is never fun.

LG's 48" OLED next year might also be worth a look for those who can fit it on their desk, but it'll be too big for most... and you also run the risk of burn-in if it's your dedicated monitor of course, so that's far from ideal.

I'm sure we'll get more monitor announcements in the coming months, but we all know how that plays out and it'll be years before anything actually lands on shelves.

:inpain:
 

I agree. I really don't know how they managed to fuck it up so badly with the XG438Q, and the Acer and XG43UQ will most likely have the same issues but with a brighter panel. Which will still not be that great for HDR due to the edge-lit local dimming.

I have zero complaints about my VA panel Samsung CRG9 when it comes to responsiveness and black smear but of course it's a super ultrawide which has its own pros and cons. Mainly game support and being just two 27" 1440p screens without bezels. TFT Central's roadmaps mention a 31.5" curved 4K 120Hz Samsung VA panel with Q2 2019 mass production but so far it hasn't surfaced as a display. That might be a potential option for next year if they can make it perform as well as the CRG9 panel.

The LG 48" OLED might still be your best bet if you want a quality larger 4K screen, can put it further away from your eyeballs and are willing to work around the burn-in potential. I'll probably stick with the CRG9 on the desktop because it's fantastic for most things and instead upgrade my living room TV to a LG 65" C9 when its price drops or it gets replaced by a 2020 equivalent B-series model, provided it has HDMI 2.1, VRR support and all that goodness.
 

This is why I bit the bullet and got an AW5520QF. Yes it's expensive, yes it's big.....but HOLY FUCK does it AMAZE. High-refresh-rate OLED is an ABSOLUTE ANIMAL. Dance on tables, strip on poles, sell your ass on the side of the road....whatever it takes......get the AW5520QF, you will never be able to go back.

Also, my penis size & girth increased 300% afterwards....don't know if that's relevant or not, but it happened!
 
I'm not willing to pay those inflated prices for a 400-nit non-HDR OLED that, like all the other DP monitors, lacks HDMI 2.1 (4:2:2 8-bit at 4K, or 98Hz, and no future compatibility with 2.1), but I'm sure it's nice for right now. I'm not dropping $5k on a mini-LED FALD ProArt or $3k-$5k on a 65" FALD BFGD either. They dropped the ball on what could have tempted me to upgrade my monitor one more time before HDMI 2.1 in their "too narrow for true 4k" DisplayPort medium-to-large monitor lines too, so yes, I'll be interested in those 48" OLEDs.

The 48" LG OLEDS should have 48gbps hdmi 2.1 just like the c9 oled tvs so it's just up to whenever nvidia gets hdmi 2.1 , for me personally on a top tier Ti model. That could be awhile. I'll be playing ps5 with quasi 4k rez (checkerboarding, 1500p to 2160p dynamic rez, etc) in performance mode games at 120hz on a hdmi 2.1 TV by the end of 2020 regardless. I'll wait on the 48" OLEDS for at my desk in the same period if there is a hdmi 2.1 Ti gpu by then(I can move my desk back a bit farther from the monitor) and see what comes out in micro LED FALD for gaming and what word comes out on JOLEDs since they said they are going to make a 32" one which could work at the right distance.. and they might make a few other sizes eventually who knows but those are a longer wait.



-------------------------------------------------------------
48" OLED
--------------------------------------------------------------

https://www.flatpanelshd.com/news.php?subaction=showfull&id=1559553128

The remarks came during a press tour in China where LG Display is setting up a second 8.5G factory for mass production of OLED TV panels. The new factory will be located in Guangzhou and is expected to almost double production capacity from 70,000 to 130,000 substrates per month. The company is also setting up a third and more advanced 10.5G factory in Korea.

https://www.avforums.com/news/lg-confirms-48-inch-oled-for-2020.16197

Affirming plans that the smallest OLED TV yet will start coming off the production line next year, Oh Chang-ho, vice president of the TV business at LG Display, told China Business News: "It's a strategy to solidify our footing in the high-end TV market, while continuing to have a presence in the standard and premium segments."

-----------------------------------------------------------------------------------------
JOLED
-----------------------------------------------------------------------------------------

https://www.j-oled.com/eng/customar/20191002eizomonitor/

“We are confident that EIZO’s high-quality and high-performance product group and JOLED’s medium-sized, high-definition OLED display have a high affinity. We hope to continue working together with EIZO to strengthen product development, targeting high-end monitors,” said Taro Funamoto, Executive Officer, Deputy Head of Panel Business Division, JOLED Inc.

The OLED display market is expected to double from 2018 to US $48.5 billion in 2025, and steady growth is also expected for medium-sized products.*1 With its proprietary RGB printing technology for producing OLED displays, which enables efficient production and represents a major innovation within the OLED industry, JOLED will contribute to the market expansion by offering medium-sized, high-definition OLED displays.

JOLED will produce medium-sized (10- to 32-inch) printed OLED displays for use in areas such as high-end monitors and automotive displays. At the same time, JOLED will proceed strongly with R&D on flexible and foldable OLED for practical use.

------------------------------------------------------------------------------------
DUAL LAYER LCD TVs
------------------------------------------------------------------------------------

I'll also keep my ears open about dual-layer LCD in consumer TVs if that becomes a thing in the USA. There is one Hisense model released in China this year, apparently.

https://www.gizmochina.com/2019/07/08/hisense-u9e-launches-as-the-worlds-first-dual-image-tv/

https://www.avforums.com/news/hisense-releases-dual-layer-u9e-lcd-tv-in-china.16313

Despite their assertion that the dual-layer approach is cheaper than buying an OLED panel, the 65-inch U9e still comes in at 18,000 Yuan, (approx. USD2,600) making it relatively expensive for a domestically produced TV.

Hisense has indicated that they will launch 55 and 75-inch versions next year and could bring the technology to territories other than China in the fullness of time. However, given that the screen was manufactured in partnership with China’s own BOE, for now, it appears Hisense are happy to champion this as a homegrown technological solution to match or exceed OLED image quality.

Hopefully some other mfgs will make some consumer dual-layer LCD TVs instead of just the astronomically priced dual-layer LCD reference monitors by Panasonic, Sony, and Eizo that have replaced their OLED ones.
 

What the hell does "too narrow for true 4k" mean??? The Alienware 55 hits 4K120 full RGB 8-bit no problem. Also, 400 nits is plenty bright for a 55" monitor 5 feet from your face; hell, 90% of monitors out there don't even go over 300 nits.

The Alienware 55 is hella expensive, but it kicks all kinds of fucking ass over everything else out there, and you could be waiting 6-12 months before GPUs support HDMI 2.1. Frankly I don't want to live in a reality with BLB, IPS glow, VA smear, and shitty TN viewing angles any fucking more.

ROFL stop spreading misinformation and ignorance, I swear you're the Cliff Clavin of the forums....errrr Cliff Elvin!

 
The 48" LG OLEDS should have 48gbps hdmi 2.1 just like the c9 oled tvs so it's just up to whenever nvidia gets hdmi 2.1 , for me personally on a top tier Ti model. That could be awhile. I'll be playing ps5 with quasi 4k rez (checkerboarding, 1500p to 2160p dynamic rez, etc) in performance mode games at 120hz on a hdmi 2.1 TV by the end of 2020 regardless. I'll wait on the 48" OLEDS for at my desk in the same period if there is a hdmi 2.1 Ti gpu by then(I can move my desk back a bit farther from the monitor) and see what comes out in micro LED FALD for gaming and what word comes out on JOLEDs since they said they are going to make a 32" one which could work at the right distance.. and they might make a few other sizes eventually who knows but those are a longer wait.

Well Ampere itself is already looking to come out in 1H 2020:

https://www.techpowerup.com/259842/nvidia-could-launch-next-generation-ampere-gpus-in-1h-2020

Of course maaaaybe it won't be a 3080 Ti and will end up just being a 3080-class GPU, but hey, Nvidia jumped straight from a 1080 Ti to a 2080 Ti, so who says they won't jump straight to a 3080 Ti this time around? With their huge focus on ray tracing I'm willing to bet that the non-ray-tracing performance of the 3080 Ti will not be a big leap over a 2080 Ti; instead it's the ray tracing performance that will be. Probably going to be another 30-40% over a 2080 Ti in non-ray-traced games but 50%+ better when it comes to ray tracing.
 
I meant for HDR, dude.. 400 nits is SDR. HDR brightness is color brightness within a 3D color gamut, not the relative brightness band you shift +/- on SDR TVs. HDR color brightness is nothing like the SDR brightness setting on old TVs that cranks the whole display up relatively. Don't be ignorant yourself of how HDR works. And yes, DisplayPort 1.4 can't push full 4K 120Hz properly at 10-bit 4:4:4. Read the PG27UQ review on TFT Central; it spells it out for you. You spent like $4000, I'm guessing, on a 400-nit TV at the end of 2019 that can't even do HDR like a $1200-$1500 TV of the same size, and it lacks the full 48Gbps of bandwidth for 4:4:4 10-bit 4K 120Hz that the C9s now have.
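
The DisplayPort claim is easy to check with back-of-the-envelope numbers. The sketch below assumes an approximate CVT-R2 reduced-blanking pixel clock of ~1.067 GHz for 3840x2160 at 120Hz and no DSC; the link payloads follow from the published raw rates and line coding (DP 1.4: 32.4 Gbps with 8b/10b, HDMI 2.1 FRL: 48 Gbps with 16b/18b):

```python
# Does uncompressed 4K 120 Hz RGB fit in DP 1.4 vs HDMI 2.1?
PIXEL_CLOCK_HZ = 1.067e9              # approx. 3840x2160@120, CVT-R2 blanking
DP14_PAYLOAD_GBPS = 32.4 * 8 / 10     # ~25.9 Gbps after 8b/10b coding
HDMI21_PAYLOAD_GBPS = 48.0 * 16 / 18  # ~42.7 Gbps after 16b/18b coding

def needed_gbps(bits_per_component: int) -> float:
    # RGB / 4:4:4 carries three full-rate components per pixel.
    return PIXEL_CLOCK_HZ * bits_per_component * 3 / 1e9

for bpc in (8, 10):
    need = needed_gbps(bpc)
    print(f"4K120 RGB {bpc}-bit: ~{need:.1f} Gbps | "
          f"DP 1.4: {'fits' if need <= DP14_PAYLOAD_GBPS else 'does NOT fit'} | "
          f"HDMI 2.1: {'fits' if need <= HDMI21_PAYLOAD_GBPS else 'does NOT fit'}")
# 8-bit (~25.6 Gbps) just squeezes into DP 1.4's ~25.9 Gbps; 10-bit 4:4:4
# (~32.0 Gbps) does not, which is why DP 1.4 displays drop to 8-bit or
# chroma subsampling at 4K 120 Hz without DSC, while HDMI 2.1 has headroom.
```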


I do hope those GPUs come out in good time though, yes. Then I can get a full 48Gbps HDMI 2.1 TV for it.
 

Yea, 1H 2020 could be 3 months away or it could be 8, and they may screw us with no HDMI 2.1.

I don't see how they could possibly leave out HDMI 2.1 with the next GPU, but Nvidia has dropped the ball before, so only time will tell.
 

HDR literally only works properly in like three games, and unlike LCD tech, OLED makes SDR and 8-bit look fan-fucking-tastic, which is about where 99% of my backlog falls.

Also, 99.9% of the population could not tell you the difference between 8-bit and 10-bit even if you told them which was which.

Now if you wanna wait until GPUs support HDMI 2.1, then get a C9 or C10 and be totally future-proofed, that is cool. I got a C9 in the bedroom waiting for that day....but don't dis the AW5520QF, because I know for a fact you have not tried it or seen 4K120 VRR OLED in person. So basically, long story short, you have no idea what you are talking about....you're bitching about a product that you have never seen or tried and griping about "facts" that are largely irrelevant for 99.9% of gaming...anyway, back to putting you on ignore.
 

I really doubt they will stick to HDMI 2.0; there are too many TVs with HDMI 2.1 coming next year, as well as new consoles and AMD cards to compete with.
 
OLED looks great and I'll likely end up with one in the living room, if not also as my PC monitor, eventually. But $4000 for one that's incapable of HDR, ever, even for movies and pictures and YouTube (especially useful on such a large screen), and yes, some games do work fine with HDR, plus you can plug a console into your display too, at 120Hz next gen... at that price, lacking those features a few months from 2020, it's quite a price gouge. You're basically paying $2500 to $2800 more vs a 55" C9 for a DisplayPort that doesn't even have the full bandwidth for full 4K at 4:4:4, 10-bit, 120Hz, 48Gbps (yeah, I said DP 1.4 is "too narrow for true 4k 120hz"; full 4K 120Hz bandwidth capability, same difference). 10-bit comes into play more in HDR since it's a 3D color gamut, which again is moot with that SDR monitor/TV.
 

I hope they include it! Paired with LG C9s, that would bring affordable high-refresh-rate OLED gaming to a wider audience and really put a fire under display makers' asses to produce better monitors that consumers actually want (32" OLED 4K 120Hz VRR).
 
OLED is a great value, but the $4000 one lacking the whole modern feature set isn't, lol.

 

I already got four grand worth of entertainment out of it this past week alone! The price is steep, but there is nothing else out there that compares. Fucking cat-lady CNET reviewer did not do this baby justice!

It's tempting to compare the LG C9 plus vaporware HDMI 2.1; however, there is no guarantee that LG's TV "game mode" electronics will have input lag as good as this beast. This thing was engineered as a gaming monitor first, not a TV.

Plus look at what $2,500 currently gets you: a dogshit VA UW with shitty motion smear, haloing backlight, asstastic 1440pee vertical res, crippling AG coating, and broken firmware. Should this thing be priced lower? Sure....will Dell lower the price in a month or two....probably. Is it worth every penny....yes.

I have not enjoyed gaming this much in a long, long while. It's an incredible experience. This thing is so much fun!
4K120 VRR OLED + glossy screen + zero input lag from its streamlined electronics = gaming nirvana
 
If I drop $3200-$3600 it'll be on a 77" C9 OLED with HDR for movies and videos in the living room. With HDMI 2.1 at 48Gbps, that'll be ready for PS5 120Hz quasi-4K (perhaps even VRR in a few games) as well. The 77" C9s have been down to like $3850-$3900 already from a reputable warehouse authorized LG OLED dealer on eBay, and they haven't even been out that long, so sometime in 2020 is looking good if that trend continues. I have a 70" TV now, so I'm not interested in downsizing to a 65", let alone a 55", in the living room. If they made a 70" C9 at a price between the 65" and 77", it would have been better for me.

I'll keep an eye out for the 48" OLED in the next year for PC use too. They'll probably be more affordable than even the 55" C9s, but I'll also then drop $1k+ on an HDMI 2.1 Nvidia Ti GPU when those hit, so it'll be a double money drop. Those will probably be out well before the 32" JOLED, and to be honest, I'm liking a farther-back desk with my 43" monitor setup now, so 32" isn't as appealing as it used to be. I wouldn't expect consumer dual-layer LCD 55", 65", and 75" Hisense sets in the USA for a few years after that, if they ever become a thing here at all, even though they supposedly already released a 65" in China. Micro-LED is probably way out for consumer pricing; years.


I don't mind spending fair money on stuff, but I do pick my battles upgrade-wise when possible. All ball-busting aside, I'm glad you are enjoying your monitor. I just can't justify it at that price with my personal spending roadmap, and no HDR or HDMI 2.1 at that price is astounding to me. I know how tempting it can be to drop money on cool things, but for me it's a no-go. In a way I'm kind of glad the XG438Q didn't end up that great; it saved me the temptation of getting an overpriced 4K DP 1.4 43". I do wish there were a really excellent 43" 4K 120Hz HDMI 2.1 display coming out, but it is what it is, I guess.

I'm aiming for something like this, subject to change if new competitive hdmi 2.1 products come out in the same time period:
---------------------------------------------------------------------------------------------------------------------------------------------------------------------
- 77" C9 OLED or similar hdmi 2.1 tv to upgrade from my 70" FALD VA in living room .... (when around $3500 or less, wish there was a 70" model for a middle ground price betw 65" and 77")
- PS5 and misc PS5 stuff and a few games ($750+ all totaled??, might not be out till this time next yr/xmas 2020 though)

- 48" OLED TV as a monitor... ~ $850 to $1000 ? (the 55" C9 are $1200 - $1500) Not bad prices at all. AFAIK it's the only 35"+ sized display worthwhile - in 2020 hopefully-, w/o going 55" or 65" but I will rearrange my setup for a huge 4' away screen if I have to.
- 3080 Ti Hybrid (or whatever they call it) 7nm hdmi 2.1 48gbps...(guessing $1000 to $1200 at least) ... I've decided this is the missing piece of the puzzle for the pc display trigger after all considering what displays have finally come out lately.
 
48" is the most promising C10 for computer monitor use as long as it has all the HDMI 2.1 + HDR bells and whistles.

55" is good for me as a strictly gaming setup 4-5 feet away.

77" is way to large for 4k. If your gonna go over 65" you need to start looking at 8k.
You talk about wasting money on the AW55, but the price point between a c9 65" and 77" is a cool $2-$3k and for that extra
money you are getting way less PPI. You could have TWO 65" C9s for less than one 77"

Currently there is nothing else out there like the AW55. 4K120 VRR OLED is AMAZING, and there is only one place to get that right now. We think the LG OLEDs will deliver it once HDMI 2.1 GPUs come out; however, we don't know how long that will be, or even if Ampere is gonna have HDMI 2.1. It could be two months, it could be eight months, it could be another year!

As it is RIGHT NOW, and for the foreseeable future, if you want a high-refresh gaming monitor, your choices are:
-Super smooth TN, limited by size, resolution, shitty AG coating and screen uniformity
-Slower IPS with shitty contrast, or FALD IPS in overpriced tiny 27" form factors
-Trash VA with horrid motion issues
-Incredible AW55 with an enormous price tag and BIG ASS size.

The last time I truly had fun in the PC hobby was when I was building 3x1 and 5x1 portrait surround setups. Playing Battlefield 3 on my quadfire 5x1 setup was pretty dang epic, as you can see in my old setup photos below.
[photos: 5x1 portrait surround setup]


But there were so many headaches with 3x1 and 5x1 setups...not to mention bezels, inferior monitors, RGB being rotated, etc, etc.

The AW55 solves all of those issues we had with our surround setups in one nice clean package, with no bezels, amazing contrast, and beautiful OLED glory. You say the $4k price point is excessive, but it's a fuck-ton less money than five monitors, four GPUs, and the watercooling setups!

[photos: AW55 desk setup]
 
Yeah, I remember that setup. Looks cool. The 55" OLED is nice too.


For a living room TV I'm not downsizing to a 65" from 70", period.. so 77" is the only option, it seems, unless some other manufacturer makes one. That means, from my perspective, if I'm going to spend ~$3500 I'm going to get a full-featured HDMI 2.1 48Gbps 77" HDR TV for the living room rather than a 55" without HDR and HDMI 2.1, this late, as my computer monitor. The 55" is just too expensive for what I'd be getting, so for me it'd be much wiser to put that big money into an amazing TV for the living room. I'm not buying a new TV until prices drop to around $3500 or less on sale, though, so that figures in. They've been $3900 already, and they haven't been out long, so that is promising, but we'll see how it goes.

I agree with your assessment of what's available, but to me the missing features combined with the very high price put the 55" OLED gaming monitor in the same category as all of the other screens I had been waiting for, which all ended up having missing features or weak facets at very inflated prices for the luxury of a narrow DP 1.4 connection. My spending is multiple thousands of dollars across a purchasing roadmap, not just a right-now thing.

Yes, it could be a wait for HDMI 2.1 GPUs. I was interested in what was coming out to bridge the gap during this DP 1.4 period we're in, but now that those displays have been revealed for what they are, and at what prices, whatever display I buy going forward will have HDMI 2.1 and some quality HDR, especially anything with a very high price.
 
I played BF3 yesterday... on my [email protected] and Aorus Xtreme 2080 Ti... and a 55" OLED at 4K 60Hz. Stupid, I know, but the game still rocks today like it did when it launched. I can see how 55" OLED 4K @ 120Hz with synced refresh rate can be awesome. But for me... I want a normal PC monitor with hardware G-Sync.
 
(1) You're failing to comprehend the gap between 4K120 VRR OLED and 4K60 Vsync. It's vast.

(2) You're also failing to comprehend that the AW55 is a "gaming monitor" with streamlined electronics that have zero input lag, not a full-featured TV that may be weighed down with slight lag.

(3) You're also failing to appreciate that sometimes things get delayed for another year and then another one after that. The first 144Hz 1440pee gaming TN and 4K 120Hz IPS FALDs come to mind there.

Your $3900 77" will be half the price when the C10s come out and worthless by the time the C11s arrive in only two and a half years.
 
Meanwhile, it seems the Acer CG437KP has been delayed to mid-December, according to a local store.
 
It's worth it to me for the living room at 70"+ (forced to 77" since no 70" model is being made) with HDMI 2.1 and a good HDR implementation. It's not worth it to me for a non-HDR, non-HDMI-2.1-capable 55" monitor at $4000; again, to me and my personal spending roadmap. LG OLED TVs have like 6ms input lag; it isn't a big deal at all like it was in the past. The living room is where I pick my battles on a large expenditure, not a PC gaming monitor. The most gaming that set would see is a PS5, which probably requires HDMI 2.1 for its quasi-4K 120Hz capability, but it's primarily for home theater use. There is potential for overstocked OLEDs and ramped-up production, so I'm expecting some sales on the larger ones in 2020. I'm hopeful on the timing of the 48" OLED for the PC, which should be even a bit cheaper than a 55" C9, but if I have to I'll set up 4' away from a 55" C9 after Nvidia releases an HDMI 2.1 7nm Ti hybrid. HDMI 2.1 is FreeSync/VRR capable, and the C9s can do HDMI VRR, which Nvidia would have to support; Xbox already does. The C9s don't do FreeSync, unlike some other TVs, but they are one of the few (the only?) sets with full 48Gbps HDMI 2.1 ports currently. LG C9 OLED TVs have 6.6ms to 6.8ms input lag at 120Hz.
 