ASUS announces Swift PG27UQ 4K IPS 144Hz G-Sync HDR monitor

2000€ is simply too much.
Their top-tier gaming monitor currently costs 800€; there is no reason to put another top-tier model at 2000€.

1200€ would be a reasonable price to me.

Not really. This monitor has a lot of firsts in it. First FALD monitor. That alone significantly increases the panel production costs versus edge lit. First DP 1.4 G-Sync T-Con monitor. First 4K monitor over 60 Hz. Add on HDR to all of that too.

$1200/EUR isn't realistic. Just to put that into perspective, a regular old 60 Hz G-Sync 4K 32" Acer costs $1300 today. So you want something that absolutely destroys that monitor in specs to cost less?

Another perspective: the Dell 5K display at launch cost $2500. That is just a 5K panel at 60 Hz. That's it. $2000/EUR isn't unreasonable for this new Asus.
 
Misleading, whether intentional or not. Even the dark parts of the image are clearly brighter on the HDR monitor, which is not how it's supposed to work.


You can't see HDR without a 1000nit HDR monitor, and any picture of two monitors with one brighter than the other is going to show one as much darker and less saturated than in real life due to camera bias. That picture is nothing like what it would look like in person to your eyes. I would completely disregard it.
 
I'm just really curious how they did FALD without increasing input lag a lot. Usually the display electronics have to "read" the incoming source, process it and then send it to the FALD back-light. Only thing I can think of is that this was designed explicitly into the new DP 1.4 G-Sync T-Con to do this. Usually simply turning on FALD on a modern TV spikes input lag 50+ms by itself.
 
There's no sense in having that large a number of backlight zones (~96 per quadrant if divided up equally) if they aren't going to be turned brighter and dimmer, and on and off, dynamically in games. The same goes for the HDR implementation keeping the backlight array in sync with HDR content.

Hopefully it won't have to do more than one frame of buffering and can keep up with shorter frame times, and if it were using a constant frame buffer (which G-Sync already does at 30fps or lower) it would have to keep up dynamically with wildly varying frame-rate input.

30fps-hz = 33.3ms, 40fps-hz = 25ms, 60fps-hz = 16.7ms

80fps-hz = 12.5ms, 100fps-hz = 10ms, 120fps-hz = 8.3ms, 144fps-hz = 6.94ms
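Those figures are just 1000 ms divided by the frame rate; a quick throwaway sketch (my own, not from the original post) reproduces them:

```python
# Frame time in milliseconds is simply 1000 / frames-per-second.
for fps in (30, 40, 60, 80, 100, 120, 144):
    print(f"{fps:>3} fps-hz = {1000 / fps:.2f} ms per frame")

# Output:
#  30 fps-hz = 33.33 ms per frame
#  40 fps-hz = 25.00 ms per frame
#  60 fps-hz = 16.67 ms per frame
#  80 fps-hz = 12.50 ms per frame
# 100 fps-hz = 10.00 ms per frame
# 120 fps-hz = 8.33 ms per frame
# 144 fps-hz = 6.94 ms per frame
```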

If you have to turn the dynamic backlight off in a "Game Mode" like on a TV in order to avoid 35ms - 45ms - 50+ms of input lag, this monitor probably wouldn't be worth it compared to the other Samsung and AUO high-Hz 2560x1440 and 21:9 3440x1440 VA panels of various sizes due out this year, especially if this FALD one is $2k instead of $1200. I can't imagine bad input lag being the case if it is touted as an HDR gaming display, since HDR varies the backlight so much.
 

Damn, the ProArt version looks sick too.
Noticed the specs on the table card @ 1:21 stating "led driving technology achieves 1 micro second operation."


https://www.asus.com/us/Monitors/PA329Q/ :woot:

Gonna have to put in some extra hours this year
 
I'm just really curious how they did FALD without increasing input lag a lot. Usually the display electronics have to "read" the incoming source, process it and then send it to the FALD back-light. Only thing I can think of is that this was designed explicitly into the new DP 1.4 G-Sync T-Con to do this. Usually simply turning on FALD on a modern TV spikes input lag 50+ms by itself.

Even if it's designed into the T-Con it will add a couple of ms since it needs to know 1-3 frames ahead when and where to dim the picture in time.
Or otherwise the local dimming adds no input lag at all but then it will react slightly delayed.
In an optimal case there would then be those two modes to choose from.
 
Honestly, the ultimate FALD implementation is OLED. Can't beat the pixels themselves being the light source, rather than a lower-resolution backlight array trying to approximate it!

But I don't think we'll be seeing OLED monitors any time soon, so the PG27UQ will have to do for those of us rich enough to get one at the implied $2,000 price point.

Speaking of which, that proposed price makes my wallet scream in anguish. But hey, look at it this way: the GDM-FW900 debuted with a $2,500 MSRP, was released some time around 2002, and people STILL use them. At least when the D boards with the flyback transformers don't blow on you like mine did. If the PG27UQ manages to hold up after 10 years, 15 years, maybe even longer than that and still look good doing it, the cost could be plenty justified in the long run over many years of happy gaming and viewing.
 
The price is fine considering the specs. The question is will this monitor actually deliver (input lag, calibration etc.) and what will quality control be like. Might pick one up on day one to do my own testing if it's available locally.
 
2000€ for a gaming monitor is simply a stupid price.
Waiting for the Acer counterpart.

All in all it's new, OK, but there's no need to bump the price up every time something newer and better comes out.
That's not how the market works: your 2017 car costs more or less the same as your 2007 car even though it's way better.
It's called progress...
 
2000€ for a gaming monitor is simply a stupid price.
Waiting for the Acer counterpart.

All in all it's new, OK, but there's no need to bump the price up every time something newer and better comes out.
That's not how the market works: your 2017 car costs more or less the same as your 2007 car even though it's way better.
It's called progress...

FALD, HDR, 4K, 144Hz, G-Sync, and the fact that it's all optimized to work together with presumably better-than-ever performance (thinking about input lag especially) suddenly makes the price not unreasonable for a first-of-its-kind product. On paper it simply looks like a fantastic monitor - the kind of thing some of us have been waiting for.

Acer counterparts are unlikely to affect the price in the short run. When the 1440p G-Sync Predator came out it didn't suddenly make the Swift obsolete (they both have pros and cons) or drop in price - it was in fact priced around the same or even a little higher. And those two haven't really dropped in price if you buy the current up-to-date version (small improvements vs the original, such as 165Hz instead of 144Hz, but nothing major).
 
I'll pass on it. After having a 34" monitor I just can't go smaller again. I'd rather have a good 34" 21:9 than a good 27" 4K monitor; I don't care how sweet a picture it is. That's just me though. I guess the manufacturers' research is telling them most gamers want a 27" 4K monitor. I'm not most gamers though. I'm into large screens now.
 
I'll pass on it. After having a 34" monitor I just can't go smaller again. I'd rather have a good 34" 21:9 than a good 27" 4K monitor; I don't care how sweet a picture it is. That's just me though. I guess the manufacturers' research is telling them most gamers want a 27" 4K monitor. I'm not most gamers though. I'm into large screens now.

Computers are generally used on a desk; you can't use a 34-inch monitor on a desk. You probably use your PC like a TV :D
 
FALD, HDR, 4K, 144Hz, G-Sync, and the fact that it's all optimized to work together with presumably better-than-ever performance (thinking about input lag especially) suddenly makes the price not unreasonable for a first-of-its-kind product. On paper it simply looks like a fantastic monitor - the kind of thing some of us have been waiting for.

Acer counterparts are unlikely to affect the price in the short run. When the 1440p G-Sync Predator came out it didn't suddenly make the Swift obsolete (they both have pros and cons) or drop in price - it was in fact priced around the same or even a little higher. And those two haven't really dropped in price if you buy the current up-to-date version (small improvements vs the original, such as 165Hz instead of 144Hz, but nothing major).

It's certainly more than before, but not enough to justify a 220% price increase over previous 4K gaming monitors.
 
Obviously it's a value judgment based on your individual needs, but for me the increased price is worth it from a productivity perspective alone. If your work is document- or multitasking-intensive then it's simply fantastic being able to display everything on screen without having to flip between apps or documents.

In terms of gaming, just about everything I play supports 21:9 natively except for a handful of games where the devs have a stubborn bee in their bonnet about ultrawide support (specifically Blizzard) or where it's from ass-backwards Japanese devs like Konami or Capcom. Looking back through what I have played since buying my X34, all of the following support 21:9:

GTA V
Shadow of Mordor
Doom
Arkham Knight
Metro Last Light
Homeworld Remastered
Titanfall 2
The Witcher 3
Steep
Assassins Creed Unity
Unreal Tournament 4
Shadow Warrior 2
Star Citizen
BF1
BF4

Edit: couple of extras

Alien Isolation
Darksiders Warmastered Edition
Shadow Warrior 2

Booting up a game like Witcher 3 or GTA V in ultrawide is just so incredibly immersive that everyone I demonstrate it to is left in absolute awe. Support is becoming more common than not, but honestly I am not all that bothered by black bars on the few games where it isn't supported.


What about titles from Bethesda?
 
2000€ for a gaming monitor is simply a stupid price.
Waiting for the Acer counterpart.

All in all it's new, OK, but there's no need to bump the price up every time something newer and better comes out.
That's not how the market works: your 2017 car costs more or less the same as your 2007 car even though it's way better.
It's called progress...

Except you are comparing a 2017 Ferrari to a 2007 Toyota. This new monitor completely blows everything out of the water. But remember there is also an Acer version in the works.

Honestly I would pay $2,500 to $3,000 if it were 32". But I know they stopped at 27" because even at $2,000 sales numbers were going to drop drastically as fewer and fewer people could afford it. Going larger probably wasn't an option as far as the bean counters were concerned.
 
Even if it's designed into the T-Con it will add a couple of ms since it needs to know 1-3 frames ahead when and where to dim the picture in time.
Or otherwise the local dimming adds no input lag at all but then it will react slightly delayed.
In an optimal case there would then be those two modes to choose from.

I put the frame times in my post in case something like what you outlined were the way it's going to work. If it's dependent on frame times, that's more reason to keep your fps-hz at least 100 (10ms) or higher (shorter ms). That is, if it's not a locked-in set buffer time regardless of the frame times, and of course your frame rate with g-sync would be oscillating all over the place constantly. The other scenario, where the dimming takes place after, would still work if it were extremely fast - fast enough to not be noticeably delayed. I look forward to tftcentral getting their hands on one of these and putting it through its paces.
 
..... and of course your frame rate with g-sync would be oscillating all over the place constantly....

I am not sure why you talk about this from this perspective. A video card, any video card, is only going to produce the number of full frames (G-Sync and V-Sync ON) that it can produce, and the monitor's refresh rate doesn't have anything to do with it. What does have something to do with it is the demand placed on the card; display settings and screen resolution set the demand. The difference is in the timing: with V-Sync ON, the timing is set by the monitor's refresh rate, while with G-Sync the video card controls the monitor's refresh rate by telling the monitor to refresh with each new frame being sent. It's like changing from a game where you try to make everyone do everything at the same time to a game where one follows the other.

Now I am pretty sure you actually understand all of this already. You are reading this going "yea yea I know". Which is why I don't understand why you are phrasing things the way you are. I mean, the entire adaptive sync technology thing really changes the entire game. Frame rates and refresh rates become almost immaterial as long as they remain within a useful range. But G-Sync doesn't have any control over this other than the Max refresh rate settings you can choose from. It's really all about your card and the demand you are placing on it.
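To illustrate that timing difference (a rough sketch of my own, not how the driver actually implements anything): with V-Sync a finished frame waits for the next fixed refresh tick, while with G-Sync the refresh simply happens whenever the frame is done.

```python
import math

# Times (ms) at which the GPU happens to finish rendering each frame.
frame_done = [5.0, 23.0, 36.0, 58.0]

def vsync_display_times(done, refresh_hz=60):
    """Fixed refresh: each finished frame waits for the next refresh tick."""
    period = 1000 / refresh_hz
    return [round(math.ceil(t / period) * period, 2) for t in done]

def gsync_display_times(done):
    """Variable refresh: the monitor refreshes as soon as the frame is ready.
    (Simplified - ignores scanout time and the panel's min/max refresh limits.)"""
    return done

print(vsync_display_times(frame_done))  # [16.67, 33.33, 50.0, 66.67]
print(gsync_display_times(frame_done))  # [5.0, 23.0, 36.0, 58.0]
```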
 
Except you are comparing a 2017 Ferrari to a 2007 Toyota. This new monitor completely blows everything out of the water. But remember there is also an Acer version in the works.

Honestly I would pay $2,500 to $3,000 if it were 32". But I know they stopped at 27" because even at $2,000 sales numbers were going to drop drastically as fewer and fewer people could afford it. Going larger probably wasn't an option as far as the bean counters were concerned.

Maybe they stopped at 27" because their panel manufacturers can't produce enough panels at larger sizes yet?

How many products, monitors and TVs, are 4K and larger than 27" but smaller than 40"?

On Newegg 4K TVs start at 40" and for monitors, if I leave out 27" and 28", there are just under 40 different monitors, some I am sure share the same panel, and none are faster than 60Hz.
 
G-SYNC simultaneously solves all the above problems by allowing the graphics card to drive the monitor’s timing of refreshes. The monitor’s refresh can now occur at arbitrary intervals, staying in sync with the GPU

Typical frame rate graphs at around 100fps-hz average and 60fps-hz average.



What I was pondering was that if the input lag from a dynamic FALD buffer were a function of the ms duration of each frame, keeping a higher average frame rate with G-Sync enabled could result in less input lag overall, at least compared to lower frame rates. For example, a 1-frame buffer at 10ms (100fps) vs a 1-frame buffer at 25ms (40fps). However, using variable Hz makes the duration of each frame variable by definition (10ms at 100fps-hz, +/-), which in this scenario would make the input lag vary along with frame-rate graphs like those above.

Variable input lag sounds like it might be worse, since people can compensate for ordinary input lag well enough once they get a feel for it. Therefore I think if it's buffering frame(s) it would probably be a set amount of input lag, or a slight overall delay like igluk said. Hopefully whatever it is, it's not horrible and doesn't make turning off FALD necessary. That would defeat the major benefit.

Even if it's designed into the T-Con it will add a couple of ms since it needs to know 1-3 frames ahead when and where to dim the picture in time.
Or otherwise the local dimming adds no input lag at all but then it will react slightly delayed.
In an optimal case there would then be those two modes to choose from.
 
I will buy this thing and be grateful if it ACTUALLY comes out (ahem dell OLED)

People have zero reason to complain about the price, there is plenty of time between now and October 2017 to raise funds by selling plasma and hooking the streets for spare cash!
 
I will buy this thing and be grateful if it ACTUALLY comes out (ahem dell OLED)

People have zero reason to complain about the price, there is plenty of time between now and October 2017 to raise funds by selling plasma and hooking the streets for spare cash!

I've already got my black dildo ready. Who's first?
 
The price is fine considering the specs. The question is will this monitor actually deliver (input lag, calibration etc.) and what will quality control be like. Might pick one up on day one to do my own testing if it's available locally.

No it isn't. 27"?



If you think I'm paying 2 grand for a pissant tiny monitor like that, I have a bridge to sell you. It's even more ridiculous given that it's 4k. It should have been a 32" for that price. Man the display manufacturing industry just BLOWS.

If the CPU manufacturing industry were like the display manufacturing industry, Intel would be releasing a new CPU that was the size of a postage stamp but slower than a Pentium II 450 MHz. Hi guys. It's 2017. Why aren't monitors better than they were IN 1998?
 
Except you are comparing a 2017 Ferrari to a 2007 Toyota. This new monitor completely blows everything out of the water. But remember there is also an Acer version in the works..

This monitor still doesn't even blow away budget CRTs from the 90s. It's still shit as far as motion blur/motion resolution goes. So no, don't shovel me shit and call it sunshine. The display manufacturing industry fucking sucks. 2017 monitors should be objectively superior to old CRTs in every quantifiable way, and they're NOT. Their technology blows. Fuck OLED, fuck LCD, fuck all of this shit with crippling, ridiculous problems. How about making a monitor that doesn't BLOW, and then you can consider charging 2 grand for it.
 
I'm just really curious how they did FALD without increasing input lag a lot. Usually the display electronics have to "read" the incoming source, process it and then send it to the FALD back-light. Only thing I can think of is that this was designed explicitly into the new DP 1.4 G-Sync T-Con to do this. Usually simply turning on FALD on a modern TV spikes input lag 50+ms by itself.
two ways:
1. use the previous frame's content to control the fald.
2. buffer just enough rows from the current frame to control the fald.
i assume 384 zones means 24x16. so 1/16 of a frame period would be the minimum possible input lag if the fald is controlled by the current frame's content. and that's a negligible amount (1ms at 60hz, <0.5ms at 144hz)
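A back-of-the-envelope sketch of option 2, just to illustrate the idea (the 24x16 grid and row-by-row zone updates are assumptions from this post, not confirmed hardware behavior):

```python
import numpy as np

ZONE_COLS, ZONE_ROWS = 24, 16          # assumed layout for 384 FALD zones

def fald_levels(frame):
    """Downsample a frame's luminance to one backlight level per zone."""
    h, w = frame.shape
    zones = frame.reshape(ZONE_ROWS, h // ZONE_ROWS, ZONE_COLS, w // ZONE_COLS)
    return zones.max(axis=(1, 3))      # drive each zone by its brightest pixel

def min_fald_lag_ms(refresh_hz):
    """If zones are updated row by row as the frame scans in, the first zone row
    can be set after only 1/ZONE_ROWS of the frame has arrived."""
    return (1000 / refresh_hz) / ZONE_ROWS

frame = np.random.rand(2160, 3840)     # fake 4K luminance frame
print(fald_levels(frame).shape)        # (16, 24) -> one level per zone
print(round(min_fald_lag_ms(60), 2))   # ~1.04 ms at 60 Hz
print(round(min_fald_lag_ms(144), 2))  # ~0.43 ms at 144 Hz
```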
 
CRTs... am I the only one that does not miss 19" 4:3 screens, half my desk being under my monitor, and my monitor weighing 50 pounds?
 
I don't remember this amazing picture quality either. Blacks were not black and everything just looked fuzzy.
 
CRTs... am I the only one that does not miss 19" 4:3 screens, half my desk being under my monitor, and my monitor weighing 50 pounds?
19"? Pfft, I'm on a 21" GDM-5410, and I'd be on my 24" widescreen GDM-FW900 if it wasn't dead. You have no idea how much I miss that thing and want to find a reasonably priced replacement D board for it.

I'm no fan of the depth and herniating weight of these top-tier CRTs, though, and 50 pounds is maybe HALF the weight of a FW900. I'm also not a fan of having to tune the geometry and convergence just right.

Thing is, I'm also not a fan of things like black levels that look like dark grey, non-native resolutions that look like crap, viewing angles that result in significant color or gamma shifts (even IPS-type panels are not perfect in this regard), higher input lag, lower refresh rates until fairly recently, high-refresh LCDs being TN garbage until very recently, backlight bleed, dead/stuck pixels, and generally having to trade off between gaming performance and image quality - especially if it means I have to pay a pretty penny to deal with some or all of that.

I mean, it takes $700-800 to get one of those 1440p 144 Hz G-SYNC AHVA monitors, and even then, you're putting up with the AU Optronics panel lottery and godawful QC, and this new PG27UQ might end up costing more than twice that!

That's why I'm still on a CRT for my primary monitor; I'm still waiting for a flat-panel that's worth buying at anything above dirt cheap thrift store prices. I just hope the PG27UQ is that display, even with the exorbitant proposed price tag.
 
What about titles from Bethesda?

I don't have any installed, but Google tells me that Bethesda falls into the same category as Blizzard, except that it seems pretty straightforward to mod in 21:9 resolutions (besides, it wouldn't be a Bethesda game if you didn't have to install 20 separate mods to get it into a half-acceptable state anyway!).
 
[image: side-by-side photo of two displays, presumably the SDR vs HDR comparison]


Is it just me or does HDR just make most images look oversaturated? It's also difficult to tell whether the left display has been intentionally set to a much lower brightness to accentuate the difference.
The left display looks like ULMB is on, which naturally decreases perceived brightness.
Yep, just saw that they removed the price from the web page. If it's $1,999 it will have to be near perfect for me to take the plunge and buy it.
For $2k I'd rather get a large OLED TV... For $1,199 USD this would have been a day one buy for me.
 
two ways:
1. use the previous frame's content to control the fald.
2. buffer just enough rows from the current frame to control the fald.
i assume 384 zones means 24x16. so 1/16 of a frame period would be the minimum possible input lag if the fald is controlled by the current frame's content. and that's a negligible amount (1ms at 60hz, <0.5ms at 144hz)

That sounds amazing, hopefully you're right about this.
 
This monitor still doesn't even blow away budget CRTs from the 90s. It's still shit as far as motion blur/motion resolution goes. So no, don't shovel me shit and call it sunshine. The display manufacturing industry fucking sucks. 2017 monitors should be objectively superior to old CRTs in every quantifiable way, and they're NOT. Their technology blows. Fuck OLED, fuck LCD, fuck all of this shit with crippling, ridiculous problems. How about making a monitor that doesn't BLOW, and then you can consider charging 2 grand for it.

I see someone has a penchant for hyperbole. A budget CRT from the '90s was a 15" 800x600 60 Hz flickering 4:3 display. A 27" 4K 144 Hz display with FALD and HDR can't compete with that? lol. OLED doesn't have any crippling, ridiculous problems. The only reason the motion quality isn't better is the way it's packaged into today's 60 Hz TVs, not the technology itself. OLED picture quality absolutely destroys CRT.

two ways:
1. use the previous frame's content to control the fald.
2. buffer just enough rows from the current frame to control the fald.
i assume 384 zones means 24x16. so 1/16 of a frame period would be the minimum possible input lag if the fald is controlled by the current frame's content. and that's a negligible amount (1ms at 60hz, <0.5ms at 144hz)

If you use the previous entire frame, there will be a lag with the FALD responding. There is a reason why in every TV made, turning off FALD is a requirement for any decent input lag. I'm just wondering what the geniuses at NVIDIA came up with to minimize the lag.
 
If you use the previous entire frame, there will be a lag with the FALD responding. There is a reason why in every TV made, turning off FALD is a requirement for any decent input lag. I'm just wondering what the geniuses at NVIDIA came up with to minimize the lag.

It seems like TVs always have bad lag no matter what. Interpolation also adds ridiculous lag on TVs, but doesn't the Eizo FG2421 do interpolation to 240Hz without an insane amount of lag?
 
If we go by the specs of the ProArt version that I posted already, which state "led driving technology achieves 1 micro second operation" - 1 MICROsecond - then, since the GPU has already calculated that color data when it pre-renders a frame, it should not make much more of a difference than it ever did with your settings at even 1 pre-rendered frame. The monitor only needs to accept it in a timely manner and the backlight is supposedly basically instant. And if what they mean by 1 microsecond backlight "operation" is actually from input to execution, then it is 0.001 of a millisecond - so effectively nothing.

If they didn't build this from the ground up with that in mind then I don't know what business they have making gaming monitors. It's not a TV; a TV does not work in tandem with your GPU the way this monitor certainly must, in a very precise way, along with G-Sync. So the pitfalls of FALD on a TV, and of trying to PC game on one when that is not what it was really designed for, should not relate to this at all. I suspect this will, as demanded by owning the title "gaming monitor", have a total input lag of one frame or less.
 
It seems like TVs always have bad lag no matter what. Interpolation also adds ridiculous lag on TVs, but doesn't the Eizo FG2421 do interpolation to 240Hz without an insane amount of lag?

No, that Eizo just pulsed the backlight twice per frame, one tiny pulse and one large one, so they called it 240 Hz lol. Silly gimmick calling it 240 Hz, but it was actually a pretty bad-ass display. Never seen an LCD with such good blacks and contrast.
 
Computers are generally used on a desk; you can't use a 34-inch monitor on a desk. You probably use your PC like a TV :D
My desk is actually a table that is 40" wide, or deep, or however you want to say it. I have a 34" 21:9 monitor sitting on it and it's just fine. For TV viewing I have a 39" TV in the room. So 34" is not too big for a gaming monitor. Heck, I'd love a good 40"-49" PC monitor. They just don't exist yet. Not good enough ones, IMO, anyway.
 
34" 21:9 diagonal is only as tall as a 27" 16:9 so while larger horizontally, that's not exactly an equal comparison to a larger 4k (larger than 27" 16:9) height wise at a desk viewed 1.5 - 2' away. a 21:9 34" just adds 440 pixels to each side.

That said, the 4K in this graphic equates to around a 40.8" diagonal, and something like that would allow you to run some variety of 21:9 across the middle at 1:1 pixels with bars, other windowed resolutions at 1:1, or 1440p scaled fullscreen on the most demanding games.
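For what it's worth, those numbers are easy to check (a quick sketch of my own; I'm assuming the same-PPI baseline is a 27" 2560x1440 panel):

```python
import math

def panel_dims(diagonal_in, px_w, px_h):
    """Return (width_in, height_in, ppi) for a panel of given diagonal and resolution."""
    ppi = math.hypot(px_w, px_h) / diagonal_in
    return px_w / ppi, px_h / ppi, ppi

# A 34" 21:9 is no taller than a 27" 16:9, it just adds 440 px to each side.
print(round(panel_dims(34, 3440, 1440)[1], 1))   # ~13.1" tall
print(round(panel_dims(27, 2560, 1440)[1], 1))   # ~13.2" tall
print((3440 - 2560) // 2)                        # 440 extra pixels per side

# 4K (3840x2160) at roughly the same pixel density as a 27" 1440p panel:
ppi_27 = panel_dims(27, 2560, 1440)[2]           # ~108.8 ppi
print(round(math.hypot(3840, 2160) / ppi_27, 1)) # ~40.5" diagonal
```

That lands around 40.5" for me; the graphic's ~40.8" figure presumably uses a slightly different baseline.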

[image: size comparison of 4K, 21:9 ultrawide, 2560x1440 at 27" and 30", and 1080p panels at the same PPI]
 