Asus ROG Swift PG35VQ Pricing is Insane!

So do you think the PG35VQ's backlight performance is the same as your X27's after watching that Linus vid?
If you crank up the SDR brightness to 500 nits, the blooming is visible because it needs to light up the white mouse cursor to 500 nits. I use 200 nits on the desktop and the blooming just brings up the black level from true OLED-like black to an IPS black. Overall the black levels (even with the blooming) are vastly superior to any other monitor I've used.
 
So Linus was using bad desktop/app settings. I'm curious if these monitors (and the X27) have named OSD settings profiles like a lot of TVs do (day, night, calibrated, dark, custom, etc.) that you can change on the fly without drilling down, either in software or using a remote.


So it can be worked around, but in the future I'm sure HDR desktop wallpaper and HDR photo/video editing and browsing are going to be a thing, so the blooming could remain an issue that has to be resolved with higher backlight densities or, better yet, dual-layer LCD. His RTS game in HDR mode also had issues, so some game menus and certain game designs could have problems too, unless the mouse pointer were somehow coded to be isolated to a lower nit level, or a darker mouse pointer were used, as someone suggested in the replies here.

I'd rather have the black depth and not spotlight the mouse personally, but even window frames and other onscreen elements were blooming badly. Dark themes, browser addons (NoSquint Plus), and minimalist window frames (WinAeroTweaker) could help there.
 
Proper HDR is the best technology I've experienced in a long time. I don't regret my OLED or X27 at all despite paying over $2500 and $2200 for them. Blooming is a non-issue in games on the X27. The OLED is totally unusable for PC games due to the horrendous input lag even in game mode. I'll take refresh rate, response time, and input lag over any other aspect of image quality.

You must have an older OLED. The 2019 sets have input lag rivaling gaming monitors, pixel response times far exceeding any LCD and now tick the refresh rate box with 120 Hz 1440p and 4K.
 
You must have an older OLED. The 2019 sets have input lag rivaling gaming monitors, pixel response times far exceeding any LCD and now tick the refresh rate box with 120 Hz 1440p and 4K.
I have the B7 which does 1080p 120 Hz with 20 ms lag. It's unusable. I'm not sure the 13 ms C8+ will be much better. Of course, I'm comparing this to a 240 Hz G-SYNC monitor which I use for twitch gaming... But even compared to my X27 it's pretty bad.
 
The 2019 LG OLED sets have 6.6 ms input lag, which is basically the same as a 120 Hz G-Sync monitor. Quite astounding for a TV set. But you can only run them up to 2560x1440 at 120 Hz until we get HDMI 2.1 GPUs.
 
So Linus was using bad desktop/app settings. I'm curious if these monitors (and the X27) have named OSD settings profiles like a lot of TVs do (day, night, calibrated, dark, custom, etc.) that you can change on the fly without drilling down, either in software or using a remote.


So it can be worked around, but in the future I'm sure HDR desktop wallpaper and HDR photo/video editing and browsing are going to be a thing, so the blooming could remain an issue that has to be resolved with higher backlight densities or, better yet, dual-layer LCD. His RTS game in HDR mode also had issues, so some game menus and certain game designs could have problems too, unless the mouse pointer were somehow coded to be isolated to a lower nit level, or a darker mouse pointer were used, as someone suggested in the replies here.

I'd rather have the black depth and not spotlight the mouse personally, but even window frames and other onscreen elements were blooming badly. Dark themes, browser addons (NoSquint Plus), and minimalist window frames (WinAeroTweaker) could help there.
The G-SYNC HDR monitors have only one HDR mode - BT.2020 / PQ / 1000 nits.
In SDR mode you can choose between sRGB / 2.2 gamma / custom brightness, and wide gamut (uncalibrated native gamut - not BT.2020 mapped) / 2.2 gamma / custom brightness. The only correct SDR setting on Windows is sRGB. On macOS you can use a calibrated ICC profile with wide gamut mode.
These monitors have a static tone mapping curve in HDR mode unlike most TVs, which means you can leave HDR on permanently in Windows. The desktop and SDR applications are correctly mapped to sRGB / 2.2 gamma / "SDR content appearance" brightness setting within the BT.2020 / PQ / 1000 nits signal. HDR aware applications seamlessly work alongside SDR applications. On YouTube, only the video area is displayed in 1000 nits for HDR videos and the rest of the page looks SDR. You can also view wide gamut images in the browser on regular pages. There is no drawback to leaving HDR on permanently.
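The "SDR content mapped into a PQ signal" behavior described above can be sketched with the SMPTE ST 2084 (PQ) inverse EOTF, which converts an absolute luminance in nits into the signal code value the monitor receives. This is only an illustration of the math, not Windows' actual compositing code; the constants come from the ST 2084 spec.

```python
# SMPTE ST 2084 (PQ) inverse EOTF: absolute luminance (nits) -> signal value in [0, 1].
# Shows how an SDR brightness level sits inside a BT.2020 / PQ / 1000-nit signal.

M1 = 2610 / 16384        # 0.1593017578125
M2 = 2523 / 4096 * 128   # 78.84375
C1 = 3424 / 4096         # 0.8359375
C2 = 2413 / 4096 * 32    # 18.8515625
C3 = 2392 / 4096 * 32    # 18.6875

def pq_encode(nits: float) -> float:
    """Map absolute luminance in cd/m2 to a PQ code value (PQ tops out at 10,000 nits)."""
    y = max(nits, 0.0) / 10000.0
    yp = y ** M1
    return ((C1 + C2 * yp) / (1 + C3 * yp)) ** M2

# An SDR desktop set to 200 nits uses only part of the signal range,
# leaving headroom up to the monitor's 1000-nit HDR ceiling.
print(f"100 nits  -> PQ {pq_encode(100):.3f}")   # ~0.508
print(f"200 nits  -> PQ {pq_encode(200):.3f}")
print(f"1000 nits -> PQ {pq_encode(1000):.3f}")  # ~0.752
```

So "SDR content appearance" in Windows effectively just picks where SDR white lands on this curve, while HDR-aware apps can use the rest of the range.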
 
From RTINGS on the 2019 LG OLEDs: "Update 5/17/2019: We've retested the input lag on the same firmware (03.50.31) and found the 4k @ 60Hz + HDR input lag is in the same ballpark as the other resolutions (around 13ms)."
.. 13.5 ms, but in 120 Hz modes at 1080p and 1440p they get 6.8 ms! Both quite low for a TV. The Q9FN 512-zone FALD VA TVs get ~11 ms in 120 Hz modes even with optional interpolation on, which is huge since interpolation usually adds considerable input lag.

They (specifically the upcoming HDMI 2.1 55" OLED TV version, the 65" Samsung FALD QLED flagship, and the Dell 55" OLED gaming monitor) are all, however, way too big for me, even though I run 43" monitors on my desk.
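To put those lag figures in perspective, it helps to express them in frames at the relevant refresh rate. This is just my own back-of-envelope arithmetic on the numbers quoted above, not anything from RTINGS:

```python
# Convert display input lag (ms) into frames at a given refresh rate,
# so lag numbers measured at different refresh rates are comparable.

def lag_in_frames(lag_ms: float, refresh_hz: float) -> float:
    frame_time_ms = 1000.0 / refresh_hz   # e.g. 8.33 ms per frame at 120 Hz
    return lag_ms / frame_time_ms

# 2019 LG OLED in 120 Hz modes: 6.8 ms is well under one frame of lag.
print(f"{lag_in_frames(6.8, 120):.2f} frames")   # ~0.82
# 4K60 + HDR: ~13 ms is still under one 60 Hz frame (16.7 ms).
print(f"{lag_in_frames(13.0, 60):.2f} frames")   # ~0.78
# Q9FN with interpolation at 120 Hz: ~11 ms is about 1.3 frames.
print(f"{lag_in_frames(11.0, 120):.2f} frames")  # ~1.32
```

Seen that way, sub-frame lag at 120 Hz really is gaming-monitor territory for a TV.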

-----------------

The G-SYNC HDR monitors have only one HDR mode - BT.2020 / PQ / 1000 nits.
In SDR mode you can choose between sRGB / 2.2 gamma / custom brightness, and wide gamut (uncalibrated native gamut - not BT.2020 mapped) / 2.2 gamma / custom brightness. The only correct SDR setting on Windows is sRGB. On macOS you can use a calibrated ICC profile with wide gamut mode.
These monitors have a static tone mapping curve in HDR mode unlike most TVs, which means you can leave HDR on permanently in Windows. The desktop and SDR applications are correctly mapped to sRGB / 2.2 gamma / "SDR content appearance" brightness setting within the BT.2020 / PQ / 1000 nits signal. HDR aware applications seamlessly work alongside SDR applications. On YouTube, only the video area is displayed in 1000 nits for HDR videos and the rest of the page looks SDR. You can also view wide gamut images in the browser on regular pages. There is no drawback to leaving HDR on permanently.

I find the side-by-side HDR capability very cool, but that Linus video is still pretty shocking looking. I wouldn't want my desktop SDR black levels raised to non-FALD IPS levels to avoid that. IPS panels outside of dynamic FALD are lucky to hit 1100:1 contrast, like the PG27UQ. The black depth is like 0.3 to 0.4 cd/m2 outside of FALD; with FALD on it's 0.02, so roughly 35 vs. 2 in hundredths of a nit. That's a big difference. I'd also want to be able to switch to or maintain a high contrast and deep black level in SDR for SDR games.

TFT Central's review stated:
"... and HDR content, and can significantly increase the active perceived contrast ratio of the screen. Dark areas are dimmed, and brighter areas are turned up. With a screen calibrated to around 120 cd/m2 and FALD active in SDR mode we measured a black depth of 0.02 cd/m2 and therefore an active contrast ratio of ~6000:1. This was where a small white sample on the screen was compared with a measurement of a black part of the screen furthest away."

Actually you can achieve the same black point of 0.02 cd/m2 even when you increase the brightness up to the maximum 100% setting (533 cd/m2 luminance peak), and therefore you can achieve active contrast ratios of around 26,650:1 even in SDR mode. The FALD is capable of producing some very strong active contrast ratios even in SDR content, and you are only really limited by the maximum luminance in terms of how high that contrast ratio will go.

Taking that as true, you should actually still be getting much better contrast and black depth than a typical IPS panel's ~1000:1 and 0.33 black depth when instead using FALD in SDR at 200 nits, considering 120 nits gets 6000:1 and a 0.02 black depth. And since the black point reportedly holds at 0.02 even at higher brightness settings, 200 nits would actually work out to around 10,000:1, which is a lot better than most gaming monitors and probably a worthwhile trade-off to avoid severe glow "haloing" in SDR content.
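A quick sanity check of the contrast arithmetic, since it's just peak white luminance divided by black point. This assumes TFT Central's measured 0.02 cd/m2 FALD black level really does hold at every brightness setting, as their review suggests:

```python
# Active contrast ratio = peak white luminance / black point.
# 0.02 cd/m2 is TFT Central's FALD black-point measurement for the PG27UQ.

BLACK_POINT = 0.02  # cd/m2, assumed constant across brightness settings

def contrast_ratio(peak_nits: float, black_nits: float = BLACK_POINT) -> float:
    return peak_nits / black_nits

print(f"{contrast_ratio(120):,.0f}:1")   # 6,000:1   (120-nit calibrated SDR)
print(f"{contrast_ratio(200):,.0f}:1")   # 10,000:1  (200-nit desktop SDR)
print(f"{contrast_ratio(533):,.0f}:1")   # 26,650:1  (100% SDR brightness)
print(f"{contrast_ratio(1237):,.0f}:1")  # 61,850:1  (HDR peak)
```

So as long as the black floor stays put, raising SDR brightness raises (not lowers) the active contrast ratio.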

TFT Central quote:
With a peak luminance of around 1237 cd/m2 possible we measured a black point on the same screen of 0.02 cd/m2. This gives rise to a maximum HDR contrast ratio of 61,850:1 which was excellent and a long way beyond the normal static contrast ratio of around 1071:1.

So 10,000:1 at 200 nits (6000:1 at 120 nits) in SDR is nowhere near the 26,650:1 contrast at 533 nits SDR, but it's still pretty good for a gaming monitor. It would be nice if there were quick profiles you could switch between depending on what you are doing, like on my TV.
 
So 10,000:1 at 200 nits (6000:1 at 120 nits) in SDR is nowhere near the 26,650:1 contrast at 533 nits SDR, but it's still pretty good for a gaming monitor. It would be nice if there were quick profiles you could switch between depending on what you are doing, like on my TV.
The X27 has 3 customizable profiles and assignable hotkeys where you can save your preferred brightness and other settings.
 
Blooming is a non-issue in desktop use because you shouldn't be using HDR outside of media consumption, anyway.
 
You can, and should, use dynamic FALD in SDR mode though, which can apparently still show bad effects like in the Linus video if your peak brightness is set too high in SDR mode. That's why I was asking Monstieur so many questions about hotkey/quick switching between an SDR brightness for desktop/apps that would avoid that ugly blooming, and an SDR brightness for viewing SDR media and playing SDR games that could be a more aggressive contrast ratio. Monstieur also informed me that you can run HDR content in different window panes even on an SDR desktop, alongside SDR apps and other SDR media and games.
 
Monstieur also informed me that you can run HDR content in different window panes even on an SDR desktop, alongside SDR apps and other SDR media and games.
Once you have adjusted "SDR content appearance" to match your SDR mode monitor brightness, the desktop and SDR apps look identical whether HDR is on or off in Windows. Switching HDR on just allows HDR aware apps to display HDR content. In HDR mode all monitor settings are locked and Windows outputs a BT.2020 / PQ / 1000 nits signal - all SDR content mapping is done at the source.

HDR is a "capability" that's meant to be left on permanently like G-SYNC. The only exception is for TVs with dynamic tone mapping that varies with content brightness - this screws up the desktop and SDR-mapped content when HDR is on. The G-SYNC HDR monitors do not alter SDR-mapped content in HDR mode.
 
I tested some black backgrounds at 200 and 500 nits. Even at 500 nits the cursor doesn't look anything like Linus' video - it's maybe 30% of what that looks like. Long regions of white next to black have terrible blooming at 500 nits, but regular IPS-like blooming at 200 nits. The cursor blooming is only noticeable at 200 nits if you actively look for it.

It's a non-issue for SDR videos and games, and even for HDR videos and games except for overly dark HDR content which results in terrible blooming.
 
Typical camera exposure levels will reveal more "blooming" than the naked eye.
 
Typical camera exposure levels will reveal more "blooming" than the naked eye.

I thought the same at first too, kind of like how you can’t judge an HDR review if the recording and your viewing device aren’t capable of HDR.

But then Linus actually made a negative comment about it during the vid, so I thought hmm, maybe it is really that bad...
 
It's there but it's not bad. It's better than 99% of HDR TVs except maybe the Z9 and OLEDs.
 
I haven't watched this yet. Title sounds a bit biased lol.

 
I haven't watched this yet. Title sounds a bit biased lol.

Sounds like he enjoys the monitor. The slow motion ghosting tests were appreciated, and although he says it's disappointing that the LG Ultrawide he tested against had slightly better ghosting, they look very comparable to me, and I'm not sure he understands that it's really impressive for a VA panel to keep up with a modern fast response IPS panel. Most can't keep up at all.

I can't believe he used a fucking iPhone review as an HDR test... there's so much good HDR content on YouTube! At least use a Digital Foundry video, god.
 
Takeaway from reviews... is it worth it? No. But there's obviously a niche of consumers for whom money is no object. Well duh... there's always people who want and need to be at the bleeding edge, no matter the cost. This monitor is for them.
 
You can, and should, use dynamic FALD in SDR mode though, which can apparently still show bad effects like in the Linus video if your peak brightness is set too high in SDR mode. That's why I was asking Monstieur so many questions about hotkey/quick switching between an SDR brightness for desktop/apps that would avoid that ugly blooming, and an SDR brightness for viewing SDR media and playing SDR games that could be a more aggressive contrast ratio. Monstieur also informed me that you can run HDR content in different window panes even on an SDR desktop, alongside SDR apps and other SDR media and games.
I turn the variable backlight off except when playing games or watching video. Blooming is an issue on the PG27UQ with it on when on the desktop. So I stand by what I said, which extends to backlight usage.
 
I turn the variable backlight off except when playing games or watching video. Blooming is an issue on the PG27UQ with it on when on the desktop. So I stand by what I said, which extends to backlight usage.
That makes the whole screen grey like a typical TN / IPS panel. Using a low SDR brightness with FALD is better except in a few instances.
 
And even that low SDR brightness is probably slightly less contrast than I'd personally prefer for SDR media, where the blooming isn't as obvious, especially compared to the gains.

That's why I inquired about using hotkeys for different OSD settings profiles with dynamic FALD enabled...

- in order to use a lower peak SDR brightness and contrast ratio when doing desktop apps, reading/browsing like Monstieur does.. and
- a slightly more aggressive peak SDR brightness and contrast ratio when playing SDR games and watching SDR media full screen.

You could potentially also use Nvidia Freestyle (an easy-mode ReShade with sliders on an overlay, built into GeForce Experience with the help of the ReShade author) to tweak SDR games further
.. and f.lux for brightness adjustment via hotkey on the fly..
- but those are all driver/app based, not OSD level, which is preferable, so it's good that monitors like the PG27UQ have OSD settings profiles, and likely the PG35VQ too, since it's similar Asus tech and generation.
 
Takeaway from reviews... is it worth it? No. But there's obviously a niche of consumers for whom money is no object. Well duh... there's always people who want and need to be at the bleeding edge, no matter the cost. This monitor is for them.

Even if money were no object, this is still not the display for them.

To quote former United States President Abraham Lincoln, "YO YO FO SCOO DIS DISPLAY BE FREAKIN TAAWDED YO YO"
 
Makes me concerned about how much they're going to want for their upcoming 43" 4k ROG.
 
It will be cheaper because it's edge lit and thus much, much worse at HDR.
It's not like there are a lot of PC HDR games and I'd rather have a full 4k display than an ultra-wide.
 
It will be cheaper because it's edge lit and thus much, much worse at HDR.

I wouldn't say "much, MUCH" worse, but it won't be as good. But the point is, is a four figure premium for FALD really worth that?? Really? Precious few games implement HDR, and fewer still do a really good job of it. Most people won't be using a monitor for HDR media content, although if they are doing this consistently then it's obviously more attractive, but for games alone... no way would I pay that much of a premium. First adopters who need to be on the bleeding edge are getting bent over and shafted dry here.
 
I wouldn't say "much, MUCH" worse, but it won't be as good. But the point is, is a four figure premium for FALD really worth that??

It's much worse to me. Elevating brightness to even 600 nits for HDR without any kind of backlight control produces black levels 2x as high as I see on existing LCDs, and to me they already look like garbage displaying black or darkness in general, so. Depends on your standards. The whole point of HDR is increased CONTRAST, not increased average panel brightness...

That said I wasn't making any argument about monetary value. Building and implementing the software for a FALD of the caliber that is in the G-sync Ultimate monitors is a lot of work, it's why they're so expensive. Any non-FALD monitor is going to be cheaper regardless of resolution, because just increasing panel resolution is a lot easier.

I wouldn't *recommend* to anyone else that they spend $2K+ on any LCD, but for the time being it will be the best experience you can get on PC.
 
If they had a FALD version of the upcoming 43" 4k 120hz-144hz VRR HDR1000 monitors for $2k-ish I'd prob get that one instead of the $1200 - $1500 one. I definitely am in the market for 43" so I'm really happy there are a few coming out that check a lot of the boxes.

I agree that ultra-wides are too short unless you are sitting fairly close. At least for my tastes now and my current setup.
----------------------------------------------------------------

22.5" diagonal 16:10 .. 19.1" w x 11.9" h (1920x1200 ~ 100.6 ppi) FW900 crt

27.0" diagonal 16:9 .. 23.5" w x 13.2" h (2560x1440 ~ 108.79 ppi)
34.0" diagonal 21:9 .. 31.4" w x 13.1" h (3440x1440 ~ 109.68 ppi)

37.5" diagonal 21:10 .. 34.6" w x 14.4" h (3840x1600 ~ 110.93 ppi)

31.5" diagonal 16:9 .. 27.5" w x 15.4" h (2560x1440 ~ 93.24 ppi) .. (3840x2160 ~ 137.68ppi)

40.0" diagonal 16:9 .. 34.9"w x 19.6" h (3840x2160 ~ 110.15ppi)

43.0" diagonal 16:9 .. 37.5" w x 21.1" h (3840x2160 ~ 102.46 ppi)

48.0" diagonal 16:9 .. 41.8"w x 23.5" h (3840x2160 ~ 91.79 ppi)

55.0" diagonal 16:9 .. 47.9"w x 27.0"h (3840x2160 ~ 80.11 ppi)

----------------------------------------------------------------
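The width/height/PPI figures in that list all follow from the diagonal and the pixel resolution (the aspect ratio is implied by the pixel counts, assuming square pixels). A sketch of the arithmetic:

```python
import math

def screen_stats(diagonal_in: float, px_w: int, px_h: int):
    """Return (width_in, height_in, ppi) for a display of the given
    diagonal size and pixel resolution, assuming square pixels."""
    diag_px = math.hypot(px_w, px_h)   # pixel count along the diagonal
    ppi = diag_px / diagonal_in
    return px_w / ppi, px_h / ppi, ppi

w, h, ppi = screen_stats(43.0, 3840, 2160)
print(f'43" 4K: {w:.1f}" x {h:.1f}" at {ppi:.2f} ppi')   # 37.5" x 21.1" at 102.46 ppi

w, h, ppi = screen_stats(34.0, 3440, 1440)
print(f'34" UW: {w:.1f}" x {h:.1f}" at {ppi:.2f} ppi')   # 31.4" x 13.1" at 109.68 ppi
```

Handy for checking whether a bigger diagonal actually costs you pixel density or just adds physical size.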
 
It's much worse to me. Elevating brightness to even 600 nits for HDR without any kind of backlight control produces black levels 2x as high as I see on existing LCDs, and to me they already look like garbage displaying black or darkness in general, so. Depends on your standards. The whole point of HDR is increased CONTRAST, not increased average panel brightness...

That said I wasn't making any argument about monetary value. Building and implementing the software for a FALD of the caliber that is in the G-sync Ultimate monitors is a lot of work, it's why they're so expensive. Any non-FALD monitor is going to be cheaper regardless of resolution, because just increasing panel resolution is a lot easier.

I wouldn't *recommend* to anyone else that they spend $2K+ on any LCD, but for the time being it will be the best experience you can get on PC.


Well, we don't know what solution the 43" monitors are using yet, but quite likely it will be the same as the Philips 436M6VBPAB, which used a 32-zone edge lit solution... and actually that did a pretty decent job. It wasn't dreadful anyway. The new 43" 4K monitors should have superior panels also, so overall they really should deliver a satisfactory HDR experience for most people, but certainly well short of what FALD can achieve.

If you demand a super high HDR standard though, this does require a top quality panel and FALD backlight solution, on that I very much agree... the problem is that this crazy priced monitor doesn't deliver the VERY best... it's just the best you can get in a high refresh PC monitor today... you have no other choice. Cheaper TV's are doing a better job delivering HDR... the only drawback there being the low refresh rates... and the 120Hz TV's we're starting to see now can only achieve that over HDMI 2.1, which of course no GPU has yet.

We're at this weird point in time where this monitor can be allowed to exist... in a year or two, when we have GPUs with HDMI 2.1 and loads of high refresh HDR TV's, there's just no way it will be able to get a foothold in the marketplace. Until then though, it's for those with money to burn and not a value-based care in the world.

I do think the 43" monitors are going to give this a serious run for its money though... the huge price difference alone is going to have a lot of consumers scratching their heads. Not to mention, we have the Acer X35 and AOC Agon versions which will come in cheaper, and that will necessitate Asus dropping the price on this one also. I can't see it staying at this price for long, to be honest.
 
We kinda do. 43" VA for only $1,299 isn't going to have FALD. I'm actually worried the price is too low and the panel will be sub-par.

Being edge-lit, with the slowest pixel type (VA), I'm leaning more towards the 38" LG with these new "fast" IPS pixels and real 175 Hz G-Sync. But you do sacrifice any meaningful HDR. But I doubt a 1000 nit edge lit HDR experience will be all that great anyway on the 43'ers.
 
We kinda do. 43" VA for only $1,299 isn't going to have FALD. I'm actually worried the price is too low and the panel will be sub-par.

Being edge-lit, with the slowest pixel type (VA), I'm leaning more towards the 38" LG with these new "fast" IPS pixels and real 175 Hz G-Sync. But you do sacrifice any meaningful HDR. But I doubt a 1000 nit edge lit HDR experience will be all that great anyway on the 43'ers.

I mean, I guess it depends on how much you care about size. I don't care much; to me 43" just means more awkward desk positioning compared to 27", and I care more about image quality. $2500 is almost twice as much as $1299, granted, but if the AOC or Acer versions push this below $2K, then it becomes a much closer choice for me at least. I mean, would I pay $700 more for the much, much better black levels of FALD? Yeah, probably. Is "real 4K" nice? I mean yeah, I guess, kinda, but it's also a lot harder to drive in games, and most of the advantage in pixel density is eliminated by the larger display size to start with, at least for the 43".

From a financial standpoint, the smartest choice is probably to just wait for LG's 48" OLED announcements. As much as I want to do that, my 27" XB270HU is getting a bit long in the tooth and I really have the upgrade itch for Cyberpunk 2077. But maybe I should just play that at 60hz on an OLED TV, tbh.
 
Practice has shown that FALD is a gimmick because of the bloom issue. At least on the VAs it is. The closest thing to the best gaming monitor now is the LG 38GL950G, but with no meaningful HDR. I don't know if I can stomach it after the amazing HDR gaming experience on my 55" LG OLED from 2017, but I really want a fast 35"+ gaming display with G-Sync.

For the price of the PG35VQ we shouldn't be talking about any bloom issues...
 
Ya I was referencing the LG 38GL950G. I have zero interest in the PG35VQ. 48" OLED may come in 2020, but HDMI 2.1 VRR support via NVIDIA is still a total unknown. I MUST have VRR.
 
Ya I was referencing the LG 38GL950G. I have zero interest in the PG35VQ. 48" OLED may come in 2020, but HDMI 2.1 VRR support via NVIDIA is still a total unknown. I MUST have VRR.

The slow adoption of HDMI 2.1 is really pissing me off.
 
The slow adoption of HDMI 2.1 is really pissing me off.
QA process for devices just started last fall, if I'm not mistaken. I don't even think the new cables are out on the market, yet. Has HDMI's QVL even been updated?
 
We kinda do. 43" VA for only $1,299 isn't going to have FALD. I'm actually worried the price is too low and the panel will be sub-par.

Being edge-lit, with the slowest pixel type (VA), I'm leaning more towards the 38" LG with these new "fast" IPS pixels and real 175 Hz G-Sync. But you do sacrifice any meaningful HDR. But I doubt a 1000 nit edge lit HDR experience will be all that great anyway on the 43'ers.


Obviously no FALD... as I said, it will almost certainly be using the same solution as the Philips 436M6VBPAB, with 32-zone edge lit. It did a good job... nothing spectacular, but fine. And these 43" models will have faster panels, hopefully better all round, so if the Philips is a base level, we can only hope the Acer/Asus models will be better. We'll just have to wait and see.
 