PCPer Reviews the ASUS ROG Swift PG27UQ 4k 144Hz G-SYNC Monitor

Don't kid yourself or talk yourself out of it. HDR looks outstanding for gaming. I turn my backlight up to 100% when it's bright in my living room. It's not blinding. It looks nice.

I see it like GameWorks in FF XV: it looks different, but hell if I could say whether A looks better than B in the games out there.
Granted, the game that gets props for good HDR is Far Cry, which I haven't tried, but I've tried numerous other titles on a good-quality display and I really, really cannot say HDR on PC is at all mature.
The Windows experience alone makes it... meh. For most people with a screen they're already comfortable with in terms of resolution and refresh rate, I'd hold out and not be sold on HDR.

HDR for movies is a completely different story; it completely transforms some of them.
HDR is great, but HDR for PC gaming, and even Xbox, seems a bit immature in my eyes.

Sometimes the panels are just better with black levels and you think it's because of the HDR, but really the panel is just that much better than what a person is used to. That's my two cents on it.
 
Two things. I've seen HDR, and while it's good, I don't think it's earth-shattering. It perhaps needs more time to mature. The second thing is that I wish manufacturers would quit wasting resources on 4K monitors that are 27" and under. It's just not a good 4K experience on this size of monitor.
 
Help me out here, guys. What does HDR have to do with ultra high nits? I thought HDR was just increased color space?
 
Help me out here, guys. What does HDR have to do with ultra high nits? I thought HDR was just increased color space?
HDR has everything to do with luminance, and not much to do with color space. HDR reproduces the range of luminance that we experience in real life, and in order to do that it requires a powerful backlight with very granular control, so it can have something as bright as the sun (or a reflection of similar intensity) and shadows as dark as night, without it washing out (grey blacks, or backlight bleed).

The color space is affected more by the encoding and the bits per channel. Most monitors use 8 bits per channel; Windows works internally at a higher precision (16 bits per channel, IIRC), but it's converted by the video card before reaching the monitor. The more bits per channel, the more colors can be represented within a given color space.
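
As a rough illustration of that last point (back-of-the-envelope arithmetic only, not tied to this monitor), here's how quickly the number of representable colors grows with bit depth:

```python
# How many distinct colors an N-bit-per-channel RGB signal can encode.
# This says nothing about the gamut itself, only how finely it is sliced.
def color_count(bits_per_channel: int) -> int:
    levels = 2 ** bits_per_channel   # steps per R/G/B channel
    return levels ** 3               # all R, G, B combinations

for bpc in (6, 8, 10, 12):
    print(f"{bpc} bpc -> {color_count(bpc):,} colors")

# 8 bpc  ->     16,777,216 ("16.7 million")
# 10 bpc ->  1,073,741,824 ("1.07 billion", the depth HDR10 signals at)
```

Same gamut either way; the extra bits just slice it more finely, which is what keeps gradients from banding when the brightness range gets stretched out.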
 
I'm pretty much totally converted to gaming from the couch on my TV. I'm on a 55" 4K 60Hz panel that will also do 1080p 120Hz, sitting about 5 feet away. Basically I have just enough room for the stand I built for my G27 wheel. I built a lap pad from spruce scraps, ran a USB 3.0 powered hub to the couch, and haven't looked back as far as gaming and media are concerned. If I work from home I have a desk, but that's essentially a hobby station otherwise. I'm still shopping for the perfect surround receiver, but for now I've got 120W/ch (stereo) at 0.06% THD through Klipsch R-15Ms, backed up by a Dayton Audio DA1200 sub.

I just can't go back to desk gaming. While I could go with a slightly smaller screen and achieve the same FOV saturation, it wouldn't be as comfortable for M/KB games, and using the wheel would be a pain in the ass since I could never achieve the orientation I can with the stand. Seriously, this setup rocks for racing games. It's also great for other game styles: RPGs are epic, RTSs feel truly strategic, FPSs are immensely engaging, Witcher 3/Shadow of Mordor are insane. Being able to just switch inputs to the Switch is a nice bonus. It's also really nice using real speakers instead of headphones. I can wear my helmet when I play F1 2017. :D

I've seen G-Sync and FreeSync in action, along with HDR in its various forms. VRR is borderline transcendent, and HDR pushes it over the edge, though it is really nice by itself with the right content. I'm waiting to see how HDMI 2.1 shakes out before buying another screen. Maybe I'll make the jump from this to VRR/HDR + Atmos all at once. (y)

I want all the cookies.

Precisely where I've gone. I ran the PB278Q and the ROG Swift 144Hz 24", moved to my 55" Samsung, and haven't looked back. I mean, I like the 27" at 100Hz, but there's just something missing. Gaming at 1080p or 4K on the TV is so much better; I feel more immersed and like I can see more. I have fake HDR+ unfortunately, but to me it looks really good and runs super fast in Game mode. No perceptible lag in Hunt.
 
Help me out here, guys. What does HDR have to do with ultra high nits? I thought HDR was just increased color space?

'HDR' by itself is meaningless without context; in the current context of monitors and TVs (i.e., HDR10), that context includes increased color space in addition to the increased absolute brightness and absolute darkness that HDR already implies.
 
Vega knows his display setups, that's for sure. No way I'm early adopting this myself, but it is nice to see the way forward.
Yep, and that's commentary from a knowledgeable owner, unlike the trolls bashing it here out of hand.
 
After reading this and a few other reviews, I spent some time looking for a 10-bit 1440p display. There's a thread here on [H] already for 'em. Surprisingly, there are a few out there at OK prices ($700-1,100), and a couple even had HDR10 support. The only real problems were either 60Hz or no G-Sync/FreeSync. Rather than fight the uphill battle for 4K, I'd be much happier dropping $1,000-ish on something like that.
 
4K 144Hz. I've read most of the posts here in the thread talking about the interconnect requiring 4:2:2 chroma subsampling, and that's great and all -- but -- will it play Crysis? Honestly, 144Hz at 4K seems like it would require a tremendously beefy GPU to push. Even in the HardOCP reviews, 60-70 FPS at 4K seems to be where the best graphics cards max out at maximum detail settings in modern games. So are we talking quad-SLI 1080 Tis for 144Hz at maximum settings?
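
On the 4:2:2 point, a rough bandwidth estimate shows why the interconnect forces it. This sketch ignores blanking overhead (so real requirements run a bit higher), and the bits-per-pixel figures are the usual assumptions for 10-bit RGB vs. 4:2:2:

```python
# Why 4K 144Hz 10-bit needs chroma subsampling over DisplayPort 1.4.
# DP 1.4 HBR3 payload: 4 lanes x 8.1 Gbps, minus 8b/10b encoding overhead.
DP14_PAYLOAD_GBPS = 4 * 8.1 * 0.8   # = 25.92 Gbps

def video_gbps(width, height, refresh_hz, bits_per_pixel):
    """Raw pixel data rate in Gbps, ignoring blanking intervals."""
    return width * height * refresh_hz * bits_per_pixel / 1e9

formats = {
    "10-bit RGB / 4:4:4": 30,   # 3 channels x 10 bits
    "10-bit 4:2:2":       20,   # chroma shared between pixel pairs
}
for name, bpp in formats.items():
    need = video_gbps(3840, 2160, 144, bpp)
    verdict = "fits" if need <= DP14_PAYLOAD_GBPS else "does NOT fit"
    print(f"{name}: ~{need:.1f} Gbps -> {verdict} in {DP14_PAYLOAD_GBPS:.2f} Gbps")
```

Full 10-bit RGB at 4K 144Hz needs roughly 36 Gbps of pixel data, which is why the monitor has to drop to 4:2:2 (about 24 Gbps) at that refresh rate.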

E.g., Far Cry 5: 58 FPS average (78 maximum) at 4K, ultra-high settings with TAA and 16x AF

https://www.hardocp.com/article/2018/04/17/far_cry_5_video_card_performance_review/3

It seems almost impossible; there's no way you're going to get this game running at 144Hz on ultra settings with today's cards. Even assuming a near-impossible 100% SLI scaling, two 1080 Tis would only manage about 116 FPS (still not 144).
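
Putting rough numbers on that (the 58 FPS baseline is the HardOCP Far Cry 5 average above; the scaling percentages are just assumptions for illustration):

```python
# Back-of-the-envelope multi-GPU projection: treat SLI scaling as a flat
# multiplier per extra card, which real games almost never achieve.
def projected_fps(single_gpu_fps, gpu_count, scaling):
    """scaling = fraction of a full card's FPS each additional GPU adds."""
    return single_gpu_fps * (1 + (gpu_count - 1) * scaling)

baseline = 58.0  # Far Cry 5, 4K ultra, single 1080 Ti (HardOCP average above)
for gpus, scaling in [(2, 1.0), (2, 0.7), (3, 0.7)]:
    fps = projected_fps(baseline, gpus, scaling)
    print(f"{gpus} GPUs @ {scaling:.0%} scaling -> ~{fps:.0f} FPS")
# 2 @ 100% -> ~116 FPS, 2 @ 70% -> ~99 FPS, 3 @ 70% -> ~139 FPS
```

Even the fantasy 100%-scaling case falls short of a locked 144.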

Even 4K at 120Hz would probably be beyond what 2x 1080 Tis can do in the latest titles at ultra quality.

I guess one could buy dual 1080 Tis and drop the settings to medium-to-high to hit 144Hz, but who would honestly do that?
 
I guess one could buy dual 1080 Tis and drop the settings to medium-to-high to hit 144Hz, but who would honestly do that?
In many games, High vs. Ultra is barely noticeable in screenshots for some or many settings. I saw a comparison thread for a bunch of games at OCN a while back and was surprised.
 
In many games, High vs. Ultra is barely noticeable in screenshots for some or many settings. I saw a comparison thread for a bunch of games at OCN a while back and was surprised.

To add:

One would expect the monitor to last through multiple GPU upgrades, and that would present a not unpleasant choice: to run at the highest image quality settings and take (more) advantage of G-Sync to smooth things out, or back off of the settings a bit to push out more frames.

You know, until faster GPUs come along ;)
 
Yep, and that's commentary from a knowledgeable owner, unlike the trolls bashing it here out of hand.
Does his vast knowledge make the fan disappear? Or change the inherent properties of IPS panels?
 
I honestly don't see the appeal of being blinded while gaming. I mean, sure, it's nice to have the dynamic range, but if we can't do it without blinding ourselves, then I think I'll pass... I'm already turning down the brightness on my non-HDR displays as it is (including my phone).

I prefer a high-contrast display with medium brightness (not blinding, but bright enough that I do have to turn it down a little to be comfortable).

I was at Microcenter today and they had one of these out on display. They had Unigine Heaven running on medium, no AA or tessellation, with a 1080 Ti. Now I want to go back and turn the brightness all the way up to see how blinding it really is; too bad it's limited by DP.

1000 nits on a 27" screen 3 feet from your face... yeah, that sounds awesome. I can't handle the 400 nits thrown out by my current monitor after heavy usage; the eyestrain is a killer.

HDR max brightness isn't quite the same thing as it is in SDR. In SDR, the max brightness is how bright the panel can make pure white, so the peak brightness gets blasted at you all of the time; for comfortable viewing that is usually around 100 nits. In properly authored HDR, your typical "pure white" is still around 100 nits. What the higher brightness is used for is highlights, things that in real life are "brighter than bright", like the sun glinting off a shiny surface. So while the panel could blast you with 1000 nits, well-done HDR content should rarely actually do so. The extra brightness is used to achieve more contrast and a more lifelike image, not just an overall brighter image.

I'm also not sure about this screen, but HDR TVs can't actually drive the whole panel at their peak brightness anyway; it takes too much power. Full-screen max brightness might be more in the 400-nit range even if the TV is rated for 1000 nits.
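
To put rough numbers on the "pure white is still ~100 nits" point: HDR10 encodes brightness with the SMPTE ST 2084 PQ curve, which maps signal values to absolute luminance. A minimal sketch of that transfer function (constants are from the spec; the sample signal levels are just illustrative):

```python
# SMPTE ST 2084 (PQ) EOTF: maps a normalized HDR10 signal value (0.0-1.0)
# to absolute luminance in nits (cd/m^2). Constants come from the spec.
M1 = 2610 / 16384          # 0.1593017578125
M2 = 2523 / 4096 * 128     # 78.84375
C1 = 3424 / 4096           # 0.8359375
C2 = 2413 / 4096 * 32      # 18.8515625
C3 = 2392 / 4096 * 32      # 18.6875

def pq_to_nits(signal: float) -> float:
    """Decode a normalized PQ code value into luminance in nits."""
    p = signal ** (1 / M2)
    return 10000 * (max(p - C1, 0) / (C2 - C3 * p)) ** (1 / M1)

for v in (0.25, 0.51, 0.75, 1.0):
    print(f"signal {v:.2f} -> {pq_to_nits(v):8.1f} nits")

# Roughly: 0.51 -> ~100 nits (diffuse "pure white"), 0.75 -> ~1000 nits,
# 1.0 -> 10,000 nits (the format ceiling; no consumer panel gets there).
```

Most of the signal range above ~0.5 is reserved for highlights, which is why a 1000-nit panel doesn't sit at 1000 nits all the time.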
 
I'm ignoring the facts by stating that facts > opinion. That makes sense.

So you're prepared to show that the historical IPS issues are just as pronounced on this monitor as they are on your average crappy IPS?
 
So you're prepared to show that the historical IPS issues are just as pronounced on this monitor as they are on your average crappy IPS?
It's not a historical thing; it's a technology limitation. It will always be crappy IPS when it comes to blacks, contrast, and glow. The last of those could possibly be mitigated with a polarizer, which I guess they couldn't fit in at only a $2,000 price.
 