ASUS SWIFT PG27UQ

RogueTadhg

[H]ard|Gawd
Joined
Dec 14, 2011
Messages
1,527
Anyone know when this monitor is to be released? I'm looking for a release date but I'm unable to find one.

Mass Effect will be coming out soon. I need that sweet sweet HDR-4k eye candy.
 
Scheduled for Q3, but TFTCentral expects a delay until Q4.
There will also be a very similar Acer version, the XB272-HDR, so hopefully no significant delay since they'll want to beat each other to the punch ;)
 
Can't wait for either! I think I like the design of the Acer more; I'm not one for the flashy ROG designs of their monitors. For $1,200 I hope the latency isn't too bad either.
 
Better make that $2000, then you have a deal.
That 384 zone FALD HDR will add most of the cost but if it works as intended the image quality will be amazing.
 
$2000? Hell no to that. I could probably stomach around the ultrawide price range of $1200 USD... but not at $2000! Though the monitor's features look nice.
 
ASUS monitors always miss their release timeframes. I honestly wouldn't expect it until this time next year. Expect it to be around $1,800 to $2,000 USD.
 
Yeah, you'd think $1200-1300 would be the upper echelon for a gaming monitor but here we are. I'd be disappointed if it was any higher than that to be honest.
 
Day one purchase for me. I expect the Acer to release first, late Q3/early Q4 is my best guess.
 
I'm in depending on what the IPS FALD can do contrast- and black-depth-wise. I'd think VA FALD would be better at that, but 384 zones is a lot, and maybe they will be able to do some great things with a modern premium HDR panel. The HDR Premium spec is 0.05 nits black depth; I'm not sure what HDR10 requires.
 
Those will have to be around $1,200 for me to consider one. Even then it's a stretch. I mean, it took some convincing for my wife to let me spend $800 on a monitor.
 
But what I wanna know is... who's the sheriff :p

Joking.. what I wanna know about this monitor is whether it has backlight bleed or IPS glow?
Or boogers, dust, dead pixels, gum etc. stuck in the display??
 

TFTCentral‏ @TFTCentral
The 27” G-sync HDR 4K screens (Asus PG27UQ, Acer X27, AOC AG273UG) are not now expected until Q3. The 35” models (PG35VQ, X35) are due now in Q4

Yes, this is a rock solid source.

Another delay. These are basically vaporware. Let's get real.
 
Good thing I didn't wait on one of these. I needed a monitor sooner rather than later (my LCD backlight was starting to pink). Let's see if we can push these to 2019 :ROFLMAO:
 
Now I just have to start hoping my 980 Ti does not break down because there is literally no reason for me to upgrade from that card + ASUS PG278Q until the 4K 144 Hz displays are out.

I don't really get what the hold-up is on these displays in the first place. 4K 120 Hz panels are already in most TVs, so developing a desktop-size panel should not be that big a feat. If the FALD and HDR are what make it hard, then they could've just released another model without them to get something on the market.
 
Now I just have to start hoping my 980 Ti does not break down because there is literally no reason for me to upgrade from that card + ASUS PG278Q until the 4K 144 Hz displays are out.

I don't really get what the hold-up is on these displays in the first place. 4K 120 Hz panels are already in most TVs, so developing a desktop-size panel should not be that big a feat. If the FALD and HDR are what make it hard, then they could've just released another model without them to get something on the market.


The hold-up is producing the same post-processing as your TV does, while cutting the time that takes WAY down. Your TV doesn't care about a little input lag if it's just displaying a video. 144 Hz gaming monitors do, however.

That's why your TV turns off these post-processing effects and just runs at 60Hz when you're in console/PC mode.

That, and the DP 1.3 driving hardware in these monitors is still a tough nut to crack - your TV doesn't have to worry about high-speed inputs like that. And there are only two years of video cards with support for DP 1.3, so every year you wait gives you a larger market for your halo monitor.

Cutting-edge sometimes means waiting a little while for all the needed tech to catch up. And waiting for a market for your overpriced first-generation monitor. It took us several years to jump from 1080p 144Hz to 1440p 144Hz, because we needed to wait for the faster DisplayPort 1.2 connector, and the processing power to drive it at low input lag.
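For anyone curious about the bandwidth side of this, here's a quick back-of-the-envelope sketch (the link rates are the published effective payloads after 8b/10b encoding; the pixel math ignores blanking overhead, so real requirements run a bit higher):

```python
# Rough check of why 4K 144 Hz strains DisplayPort while 1440p 144 Hz
# fit into DP 1.2. No blanking overhead included.

def required_gbps(width, height, hz, bits_per_pixel=24):
    """Raw uncompressed video bandwidth in Gbit/s."""
    return width * height * hz * bits_per_pixel / 1e9

# Effective payload over 4 lanes after 8b/10b encoding:
DP_1_2_HBR2 = 17.28   # Gbit/s
DP_1_3_HBR3 = 25.92   # Gbit/s (same rate in DP 1.4)

need_4k = required_gbps(3840, 2160, 144)   # 8-bit RGB
print(f"4K 144 Hz 8-bit RGB needs ~{need_4k:.1f} Gbit/s")
print(f"fits in DP 1.3 HBR3? {need_4k <= DP_1_3_HBR3}")
print(f"1440p 144 Hz fits in DP 1.2? "
      f"{required_gbps(2560, 1440, 144) <= DP_1_2_HBR2}")
```

Even before blanking, 4K 144 Hz at 8-bit RGB comes out around 28.7 Gbit/s, over the ~25.9 Gbit/s that DP 1.3/1.4 can actually carry - which is presumably why these panels have to compromise somewhere at the very top refresh rates.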
 
Last edited:
It may also have to do with shrinking the FALD to fit. If it were so easy, each sub-pixel would have its own light source, similar to OLED.
 
At this rate, the first production displays with bandwidth for 4k 120hz will be HDMI 2.1 TVs.
 
It may also have to do with shrinking the FALD to fit. If it were so easy, each sub-pixel would have its own light source, similar to OLED.

Yeah, that's a lot of zones to fit into 27 inches. The smallest TV with a competent FALD implementation is 55 inches (Vizio P), and this baby has triple the zones.

I think the industry was hoping OLED would handle this issue for them, but it's not quite there yet on pixel density.
 
Last edited:
At this rate, the first production displays with bandwidth for 4k 120hz will be HDMI 2.1 TVs.

120 Hz 4K OLED TVs via HDMI 2.1 are widely expected in 2019. You'd have to be crazy to buy an LCD for gaming after that point unless you really have a space limit.
 
I'm interested in the BFGD... higher peak brightness and thus higher HDR color volume than OLED, none of the OLED fading/calibration concerns, G-Sync (which can be important for 4K gaming frame rates at higher graphics settings), and a larger display than the other HDR gaming ones. You can moderate the perceived size by adding distance and/or even running a 21:9 resolution letterboxed on it for games. The price will probably be ridiculous though. OLED has incredibly deep black levels, but a FALD VA is probably going to be around 3800:1 contrast ratio or more, which is pretty dark compared to 850-1000:1 IPS and TN panels.

I'm still interested in whether a true gaming-monitor OLED will come out someday, like the BFGD FALD VA ones. The lack of OLED panels marketed as computer monitors is concerning, and writing off G-Sync at an overly demanding resolution is asking a lot.
 
You're making a lot of positive assumptions for BFGD, which hasn't been tested by anyone yet since it's not a production product. Assuming a VA panel is not going to have irritating dark trails, and that a FALD won't have seriously evident haloing is pretty generous. Even more generous to say that peak brightness will exceed OLED, since only the absolute highest-end televisions do that, and not by much. And even then, it's still 65" which has a ridiculously low PPI.

And that's assuming that it's not vaporware too, which as we've learned is another generous assumption.
 
elvn, you'd take a BFGD LCD over a 120 Hz 4K OLED? wtf, lol.

The only thing IMO it has going for it is G-Sync. 200 nits more brightness for HDR highlights with crap contrast ratio of LCD with massive blooming is still going to look way worse than 800 nit infinite contrast HDR OLED. There is a reason why OLED had/has its own category for HDR since it completely wrecks LCD in every way besides peak eye-searing brightness. Watching HDR movies in a dark room on my OLED it's almost already too bright, making you squint. And that's not even going into the fact that OLED has virtually zero smearing/trailing.

When linus did his video of the BFGD in a lit room, you could clearly see the FALD bloom. 384 zones is nothing, especially on a 65".
 
Ya, I'm thinking about how I'm going to fit a 55" 4K into my desk setup. Probably going to get a freestanding adjustable stand to put it 1-2 feet behind my desk, and that should work. Funny thing is, the 2018 OLEDs do 4K 120hz; they showed it at CES for HFR video via video files and streaming. But since HDMI 2.1 was finalized so late, I guess it wasn't possible to get the silicon ready for 2018 release dates. So 4K 120hz displays are literally shipping now, and what's holding us back yet again is connector standards.
 
We will see I hope.

A modern gaming VA should be tight up to 120fps-hz or so before it starts to lose clarity again, in relation to its panel response time.

384 zones isn't a lot compared to an OLED's sort of "per pixel FALD" white OLED array with layered colored subpixels... but compared to other LCD FALD displays it is. My VA TV, for example, is 64 zones at 70" ;b FALD is a great thing for VA LCDs. I wouldn't even consider HDR with edge lit.

Personally I can't write off g-sync easily especially on a frame rate crushing 4k resolution and due to the nature of frame rates we shoot for typically being an average with a range (including going lower) rather than a solid minimum.

I definitely could never go back to 60hz gaming, which some of you did for 60hz OLEDs and others did initially to get 4k resolution, so anyone who was OK with 60hz gaming and is now being dismissive of losing g-sync isn't exactly in line with my modern gaming display values.

----------------------------------------------

Regarding squinting/brightness... yes, HDR highlights can be bright, but HDR peak brightness is not the same as raising the whole scene's brightness beyond SDR's narrow band. Most of an HDR scene is at standard brightness; what you'd otherwise be missing is brighter color volume on the highlights. The range (the R in HDR) opens up so you can see the bright colors and the darks at the same time, where typically on monitors you have a narrow band. OLED goes full dark but can't go as bright... once you hit a good black level you go "more black" on OLED compared to a decent-black-level display, but when you hit its peak brightness you are missing brighter HDR color volume. An OLED will clip HDR color volume beyond its peak nits to white.

From what I've heard, modern OLEDs cheat their brightness higher by adding an extra white subpixel, which can affect the display's color quality... I'll have to see if I can find a citation. Feel free to correct me if I'm wrong on that.

1000 nits is typically the peak (full color gradation) for highlights on real HDR-standards-capable LCDs... movies are mastered at up to 4000 nits, so even 1000 is the minimum acceptable for media. Digital photos, YouTube, desktop apps, content creation suites, wallpapers etc. will all eventually be HDR, I'm guessing.


-------------------------------------

The BFGDs and the other HDR FALD LCD monitors are designed as full-featured modern HDR gaming monitors, not re-purposed TVs. There is only one OLED monitor I'm aware of in development, and it's 22" and non-gaming. Where are all the OLED gaming monitors? Where are the color-professional OLED monitors?

https://www.digitaltrends.com/computing/oled-laptops-and-monitors-are-missing-at-ces-2018/

All of that said, I'll definitely be keeping an eye on the 120hz OLED progress all the same. I may end up with one in my home theater living room some year in the future, at least. I am leaning heavily toward the BFGD for my PC, but as always I'll wait for reviews and feedback about them, and I'll see what you guys think of your 120hz non-g-sync 4k OLEDs as well before dropping money.
 
Last edited:
OLED is overrated anyway. I have a phone with an OLED screen and it isn't that impressive at all. OLED has more dead-pixel problems and burn-in, and it's overpriced. IPS is still a very good and cheap technology. The only thing OLED does better is contrast; all other things are almost the same as IPS.
 
movies are mastered at 4000nit so even 1000 is the minimum acceptable for media.

The usability of this is super questionable. The Z9D is one of the brightest displays available with 10% windows exceeding 1600 nits. My Galaxy S8 goes bright enough that it forces me to squint, in pitch darkness, on some HDR videos I've watched, and it only goes up to about 1000 nits. Most likely extreme brightness of this kind is only going to be used for extremely small point light sources... which btw a 384 zone FALD cannot display properly! Brightness numbers are one thing, but if you can't display a point light source properly without severe haloing it doesn't matter how bright you get. In fact, with FALD haloing, the higher you push brightness the WORSE you make the scene look.

And that's all ignoring the fact that only part of HDR is brightness; the other part is shadow detail. True blacks produce far better dark scenes than any FALD LCD can ever do. They also provide far better SDR scenes, and let's be real here, the vast majority of games and content for the next 10 years is still going to be SDR, and true emissive displays are a HUGE upgrade for SDR that no FALD can compete with.
 
The UE4 engine supports HDR for those devs who want to utilize it, so there could be some games with it. HDR movies will become more common obviously, plus HDR YouTube vids, Netflix, digital art and photos...
It's still early though, like 4k.
-------------------------------------------------------------------------------------------

Bright light in pitch darkness always makes you squint... like flashlights not looking bright in daylight but very bright at night. But every scene isn't full of flashlights, and a big screen is not the same as looking at a tablet or phone. Most of a scene is typically at SDR brightness levels, even in scenes with bright highlights.
Similarly, with a decent surround sound system you'd hear the pirates whispering below deck, and when the cannon fire starts it might sound "too loud"... but cannons and muskets aren't constantly firing every second of the movie, either.
Obviously you don't want things so real that a nuclear blast makes you go blind or cannon fire makes you go immediately deaf or anything, but my point stands. :D

-------------------------------------

FALD isn't perfect, but done well it's way better than edge lit, especially for HDR. People experiencing edge-lit flashlighting, clouding and washed-out screens now would get a lot more direct-lit LEDs (96 per quadrant of the screen) which, rather than only being in an on/off state, should balance their brightness and dimness between each other dynamically.

I'm not saying there won't be any imperfections, but I think you may be exaggerating their effect if the FALD is done properly (on a 3000:1-contrast gaming VA before the FALD is even activated). I've seen edge lit bloom way worse than a good FALD, but a poorly implemented FALD solution could be just as bad. We don't know what the HDR FALD screens will be like at release.

The OLED will lose color volume at the top. This will be highlights and edges: sun and water drops and eyes and lasers and metal edges and other scintillating things that will instead crush to white, where a brighter LCD will continue to show full color volume at the top end.

There are other concerns with oleds and the lack of OLED monitors is concerning in itself.


A 16:9 image broken into 384 zones for reference: https://imgur.com/y5FmKGj
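For a rough sense of scale, here's a quick sketch of how coarse those zones are. The 24x16 grid is my assumption for how 384 zones would tile a 16:9 panel (the actual layout hasn't been published):

```python
# Rough look at how coarse 384 backlight zones are on a 27" 4K panel.
# The 24x16 grid is an assumption: 384 factors that way for 16:9;
# the real zone arrangement isn't documented.
import math

cols, rows = 24, 16                      # assumed grid, 24 * 16 = 384 zones
px_w, px_h = 3840 // cols, 2160 // rows  # panel pixels covered per zone

diag_in = 27.0
panel_w_in = diag_in * 16 / math.hypot(16, 9)  # 16:9 panel width in inches
zone_w_in = panel_w_in / cols

print(f"each zone lights {px_w}x{px_h} px (~{px_w * px_h:,} pixels)")
print(f"zone width ~{zone_w_in:.2f} in")
```

So each zone would still cover on the order of 20,000+ pixels and be roughly an inch wide - small next to my 64-zone 70" TV, but nowhere near per-pixel lighting.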


------------------------------------------------------------------------------------

People who were willing to drop modern 120hz gaming with a premium Variable Refresh Rate solution (g-sync) for 4k 30hz, 4k 60hz, and 60hz OLED really don't have the same priorities for modern gaming features, so they might dismiss losing g-sync on a 120hz OLED similarly, even at a frame rate crushing 4k resolution with ever more demanding graphics ceilings.

I'm glad to have some high-hz options, but throwing g-sync out the window is a huge trade-off, and there are still concerns about OLED organics when used as monitors over time.

Give me a 120hz 1000nit 4k HDR OLED gaming monitor with g-sync at 40" - 43" (with a long fade protection replacement warranty... )

But seriously... what happened to OLED coming to real full-featured gaming monitors and professional color monitors?


https://www.digitaltrends.com/computing/oled-laptops-and-monitors-are-missing-at-ces-2018/

------------------------------------------------------------------------------

I'm just waiting now until all the new monitor (and even 120hz oled TV) tech is out and reviewed but this is my perspective currently.
 
Last edited:
I'd love it if they put out a G-Sync OLED monitor; I'd love it even more if Nvidia would just support HDMI VRR so that it's not a huge issue. I'm not usually an Nvidia hater, but unfortunately they seem more interested in forcing everyone to buy their proprietary sync solution than in improving the experience for everybody. It's disappointing. There does remain the wild card of the JOLED panels, assuming they can get their 22-inch one out this year... which is yet another generous assumption haha. But they did claim they were starting production in December.
 
Gsync is great but if I was given a choice between a 4k 120Hz FALD VA LCD with gsync or a 4k 120Hz OLED without gsync, I would pick the OLED.
 
Yeah, tearing is something you can put up with; it really depends on the game whether it bothers you or not. Turn off vsync and you get the same low input lag as G-SYNC.

But the off-axis image quality AND HIGHLY variable response time of VA will be with you the whole time.
 
Ya, the only way I buy a VA FALD over an OLED is if the FALD is like 25% of the price. But the BFGD is looking like it will be more expensive than a comparable OLED TV, not less, which is pretty insane. Even the $2K+ rumored price tag for the PG27UQ is pushing it hard for a FALD LCD when 2019 55" OLEDs will be well under $1500. If they think they're going to launch these products 2 years late at the prices that were rumored ($2-$3K), I think they will have problems finding buyers.
 
I can understand your points of view and having different priorities about what is better feature wise in a modern pc gaming oriented display. However G-sync does more than eliminate tearing (and reduce input lag compared to v-sync).

Tearing usually happens when your frame rate exceeds the refresh rate (and the display is still busy displaying a different frame). G-sync eliminates tearing, but on a 4k 120hz+ display few games are going to go over 120 even at the top of an average's actual frame rate range, unless you are using high-tier SLI on a game that supports it. You could also cap your frame rate a few fps below the max hz so it won't hit it. There could still be some tearing due to frame rates out of phase with the refresh rate, though. Capping on the 200hz VA ones would be a good idea anyway, since VA starts to lose its clarity again much past 120hz due to its response times.

G-sync also keeps the screen completely smooth during abrupt slowdowns and stops/stutters, which will otherwise happen a lot. That is where it comes into play even at higher frame rates, but especially when your fps (which is an average, so actually a wider range +/-) is moderate and swings quite low. This is the scenario most people will have at very high to ultra settings on demanding games at 4k resolution.
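A toy simulation makes the stutter point concrete. On a fixed-refresh display every finished frame waits for the next scan-out tick, so varying render times turn into uneven multiples of the refresh interval; with VRR the display refreshes the moment a frame is done. The frame times below are made up for illustration:

```python
# Toy model: visible frame intervals on a fixed 60 Hz display vs. a
# variable-refresh (G-SYNC-style) display, given varying render times.
import math

REFRESH_MS = 1000 / 60               # fixed 60 Hz scan-out interval
frame_times = [14, 18, 25, 16, 33]   # hypothetical GPU render times (ms)

def vsync_display_times(times):
    """Each finished frame becomes visible at the next refresh tick."""
    shown, t = [], 0.0
    for ft in times:
        t += ft
        shown.append(math.ceil(t / REFRESH_MS) * REFRESH_MS)
    return shown

def vrr_display_times(times):
    """The display refreshes the moment each frame is done."""
    shown, t = [], 0.0
    for ft in times:
        t += ft
        shown.append(t)
    return shown

v = vsync_display_times(frame_times)
g = vrr_display_times(frame_times)
print("fixed 60 Hz intervals:", [round(b - a, 1) for a, b in zip(v, v[1:])])
print("VRR intervals:        ", [round(b - a, 1) for a, b in zip(g, g[1:])])
```

The fixed-refresh intervals snap between 16.7 and 33.3 ms (the visible judder), while the VRR intervals simply track the render times, which is the smoothness the blurbusters quotes below describe.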

G-SYNC benefits explained and shown with animated simulations here at blurbusters.com
https://www.blurbusters.com/gsync/preview/

.

"With G-SYNC, you can continuously turn from complex scenery (e.g. open space) to low detail scenery (e.g. wall or floor), without seeing a single erratic stutter."

"you can turn back and forth from complex scenery to simple scenery, experiencing massive changes in frame rate, in an ultra-smooth manner."

"You do get the “low frame-rate feel” if the frame rates go low (e.g. 30fps), however, now the monitor refresh is always synchronized to the refresh rate, eliminating stutters."

"Game engines can also be the source of stutters. However, the link between a variable-framerate source (GPU) and a fixed-refresh-rate display (status quo!) also creates stutters as well."

"It can also be caused by the display, since the graphics card has to wait for the display to finish refreshing (an old fashioned scan-out) before refreshing a new frame."

"Without stutters, it feels like a major upgrade .. " "in this situation, any gamer can be more comfortable in Ultra quality settings. " <--- I'd say 'much higher settings' personally
 
Last edited:
lol that is almost 2 years since reveal.
That is typical ASUS. Always excited to show off new products before they're even done testing them. I think there was a similar length of time between when the PG278Q was announced and when you could actually buy it.
 