ASUS Announces ROG SWIFT PG278Q Premium Gaming Monitor

He corrected himself in a YouTube comment and said that it's going to be $799 USD.

PRE-ORDER CANCELLED!!!!!!

This. Frikkin delays have cost me a lot of money because I'm so bored I started buying Korean monitors. I think I'm gonna stick with my Catleap and give the Swift the finger.

In my boredom I went through a Samsung U590D, Dell UP2414Q, Qnix EvoII and LG 34UM95.....but I am keeping the 34UM95 as it perfectly accommodates my penis.
 
Yeah, pretty useless; mostly just dialogue and watching a "reporter" personality rather than focusing the camera on the hardware itself. And why would you video-review a TN panel from an extreme sidelong angle like that :rolleyes:
 
He seems to be running those SLI Titans with G-Sync in the video just fine. Not to mention the FAQ on the GeForce website states it works seamlessly as well.

"Q: How does NVIDIA G-SYNC work with SLI?
A: The NVIDIA GPU connected to the display manages G-SYNC. SLI GPU setups work seamlessly with G-SYNC displays."

Source

They say that on the website, but the reality is it doesn't work properly. Games either crash, have strange stuttering, or get worse FPS than using a single GPU.

source: I have SLI 770s and a G-Synced 248QE, other people have the same problem
 
That would be a huge deal-breaker for G-Sync at this kind of resolution and high Hz, since you really need dual high-end GPUs to hope for 100fps at ultra, or even high, settings in many of the more modern games at 2560x1440.
http://forums.evga.com/m/tm.aspx?m=2069079&p=

I posted a question in the comments on that YouTube video about the ULMB/backlight-strobing mode not being mentioned, and it has been ignored so far.

At least if G-Sync needs driver/firmware updates later to work with SLI properly, you would hopefully be able to tweak your game settings down enough to get 100fps+ or 120fps+ (depending on the game) and use ULMB mode.

Backlight strobing mode was a major marketing point on the Eizo FG2421 "240Hz" monitor, so I am concerned that backlight strobing is not being touted on this monitor, even if they are trying to market it more as "smooth even at this resolution for modest GPU setups" (lower-fps users). I want to use ULMB mode in TF2, L4D2, and any number of easy-to-render games in my Steam library, and G-Sync when I need it for more demanding games. Also, even though I'm planning on dual 800-series cards later, games aren't going to get any less demanding, so I still want both options. Dual high-end cards can barely break 100fps in a lot of games on ultra, and in a few already can't (~60fps at extreme settings in a couple of titles) unless you turn things down.

If Watch Dogs scales like BF4, it should break 100fps by a bit with dual 780 Tis at 2560x, in company with Tomb Raider and some others. Metro: Last Light is around 63fps with two 780 Tis at 2560x, though (on ultra). I can live with not being able to use windowed/maximized-window mode, but dropping SLI in order to use G-Sync would be unacceptable.

BF4 [email protected] 780ti SC 2560x1600 ultra
2560x
1 gpu: 52 fps ave
2gpu: 99 fps ave
<snip>
Lots of other games besides BF4, though, of course. Personally I'm tired of that kind of game; looking forward to "Evolve" from the team that did L4D. Watch Dogs is pretty demanding from reports (whether or not poor optimization is partly to blame). The only 2560x bench of Watch Dogs I've seen quoted it at 57fps average on ultra (+5fps over that EVGA forum BF4 benchmark).
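As a rough sanity check on the "scales like BF4" reasoning, here's a minimal back-of-envelope sketch in Python, assuming the SLI scaling factor from that EVGA-forum BF4 benchmark carries straight over to Watch Dogs (a big assumption; the fps figures are just the ones quoted above):

```python
# Back-of-envelope estimate using the fps numbers quoted above.
# Assumption: Watch Dogs gets the same SLI scaling factor as BF4.

bf4_1gpu = 52.0   # fps avg, one 780 Ti at 2560x ultra (quoted above)
bf4_2gpu = 99.0   # fps avg, two 780 Tis in SLI

sli_scaling = bf4_2gpu / bf4_1gpu      # ~1.90x

watchdogs_1gpu = 57.0  # fps avg, the only 2560x ultra bench quoted

estimate = watchdogs_1gpu * sli_scaling
print(f"SLI scaling: {sli_scaling:.2f}x")
print(f"Watch Dogs SLI estimate: ~{estimate:.0f} fps")
# -> ~109 fps, i.e. it "breaks 100fps by a bit" only if scaling holds
```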
 
They say that on the website, but the reality is it doesn't work properly. Games either crash, have strange stuttering, or get worse FPS than using a single GPU.

source: I have SLI 770s and a G-Synced 248QE, other people have the same problem

I'm pretty sure any problems the G-Sync DIY kit had will be ironed out in the retail G-Sync monitors, so "IMO" the DIY kit doesn't count until I have an actual retail G-Sync monitor in my hands or one is reviewed in depth. I consider the DIY kits a beta, if you will.
 
I'm pretty sure any problems the G-Sync DIY kit had will be ironed out in the retail G-Sync monitors, so "IMO" the DIY kit doesn't count until I have an actual retail G-Sync monitor in my hands or one is reviewed in depth. I consider the DIY kits a beta, if you will.

I'm pretty sure it has nothing to do with the hardware and everything to do with the drivers.
 
In every video I've seen of this monitor displaying a game, it looks pasty and washed out, like the gamma is way too high and the contrast is way too low. The video showing it running Battlefield 4 is the worst. Granted, viewing a video that wasn't designed to show image quality in the first place, through my own monitor which distorts the image even further, isn't the best way to get an accurate idea of what it will look like in real life. But still, relative to other videos I've seen of other monitors, the image quality just doesn't look good. I hope I'm wrong, because otherwise this thing looks like it's going to be awesome.
 
That would be a huge deal-breaker for G-Sync at this kind of resolution and high Hz, since you really need dual high-end GPUs to hope for 100fps at ultra, or even high, settings in many of the more modern games at 2560x1440.
http://forums.evga.com/m/tm.aspx?m=2069079&p=

I posted a question in the comments on that YouTube video about the ULMB/backlight-strobing mode not being mentioned, and it has been ignored so far.

At least if G-Sync needs driver/firmware updates later to work with SLI properly, you would hopefully be able to tweak your game settings down enough to get 100fps+ or 120fps+ (depending on the game) and use ULMB mode.

Backlight strobing mode was a major marketing point on the Eizo FG2421 "240Hz" monitor, so I am concerned that backlight strobing is not being touted on this monitor, even if they are trying to market it more as "smooth even at this resolution for modest GPU setups" (lower-fps users). I want to use ULMB mode in TF2, L4D2, and any number of easy-to-render games in my Steam library, and G-Sync when I need it for more demanding games. Also, even though I'm planning on dual 800-series cards later, games aren't going to get any less demanding, so I still want both options. Dual high-end cards can barely break 100fps in a lot of games on ultra, and in a few already can't (~60fps at extreme settings in a couple of titles) unless you turn things down.

If Watch Dogs scales like BF4, it should break 100fps by a bit with dual 780 Tis at 2560x, in company with Tomb Raider and some others. Metro: Last Light is around 63fps with two 780 Tis at 2560x, though (on ultra). I can live with not being able to use windowed/maximized-window mode, but dropping SLI in order to use G-Sync would be unacceptable.

I wouldn't worry too much. They seem to be trying to drive home the point that people shouldn't be afraid of 1440p here, so they want to underscore that with G-Sync you could just use one GPU. Which is true; a single 780 Ti would be fantastic if you didn't get all crazy with the MSAA. I would think you're more interested in ULMB, and I'm pretty much positive it will be there. What I forgot about was that ULMB seems to be Nvidia-only... so I'm looking at having to use hacked LightBoost...
 
Incorrect. Overclocking such a panel does nothing to increase pixel transition speed. It's still sample and hold with really slow IPS pixels. I've owned 7+ "overclock" 1440p monitors, and my fastest 2B would do ~136 Hz.

It still could never do under 10ms in the MPRT test. The pixels operate at the same transition speed they do at 60Hz. Just as a comparison, LightBoost at 10% gives an MPRT of 1.4ms. That is over seven times clearer motion than an overclocked 120+Hz IPS monitor. The difference is massive. The increased refresh rate only makes the image feel smoother; it does virtually nothing for motion clarity, which is completely different.

By the way. I've been doing some testing and overclocking does decrease pixel persistence.

My catleap at 72Hz=12.6ms
My catleap at 120Hz=7.8ms

This is with MPRT test.
 
By the way. I've been doing some testing and overclocking does decrease pixel persistence.

My catleap at 72Hz=12.6ms
My catleap at 120Hz=7.8ms

This is with MPRT test.

You realise MPRT is heavily linked to refresh rate and that alone can account for the differences you're seeing? :confused:
 
You realise MPRT is heavily linked to refresh rate and that alone can account for the differences you're seeing? :confused:

Yes... I do realize that.
That is why I am disputing this comment:
Overclocking such a panel does nothing to increase pixel transition speed.

What do you think overclocking a monitor is? Increasing the refresh rate.
 
Going from 30hz to 60 and 60 to 120 will decrease it in practice. However, you're not going to be able to get ~1-2ms persistence without backlight strobing.
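The arithmetic behind this exchange is simple enough to sketch. Here's a minimal Python illustration of hold-type persistence (the figures are just 1000/Hz; the strobe number is the LightBoost value quoted earlier, and the Catleap readings are the ones posted above):

```python
# Persistence of a sample-and-hold LCD: each frame stays on screen
# for the full refresh interval, so persistence ~= 1000 / Hz (in ms).
# A strobed backlight decouples this from refresh rate: the frame
# is only visible while the backlight flashes.

def hold_persistence_ms(refresh_hz: float) -> float:
    return 1000.0 / refresh_hz

for hz in (60, 72, 120, 144):
    print(f"{hz:3d} Hz sample-and-hold: {hold_persistence_ms(hz):5.1f} ms")
# 60 -> 16.7 ms, 72 -> 13.9 ms, 120 -> 8.3 ms, 144 -> 6.9 ms
# (compare the ~12.6 ms / ~7.8 ms Catleap MPRT readings above)

strobe_ms = 1.4  # LightBoost at 10%, as quoted earlier in the thread
print(f"strobed: ~{strobe_ms} ms regardless of refresh rate")
# hence ~1-2 ms persistence takes strobing, not just more Hz
```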
 
Hey-

I plan to buy 3 of the ROG Swifts come release time. I have a Radeon 6990.

I run all 3 monitors at 2560x1440 - but when I am gaming, I only do it on the center monitor (I don't run in Eyefinity or anything) - the right and left monitors continue to show my desktop/browsers/etc.

Do you think the 6990 will have enough power to do the above, or will I need to upgrade to a SLI setup?
 
Yes... I do realize that.
That is why I am disputing this comment:

Overclocking such a panel does nothing to increase pixel transition speed.

What do you think overclocking a monitor is? Increasing the refresh rate.

Refresh rate has absolutely nothing to do with pixel speed.
 
Refresh rate has absolutely nothing to do with pixel speed.

Really? Then why is it when you increase the refresh rate the persistence drops? Not changing anything else... not switching out a different monitor...

Let's ask Mark, eh?
I read some of his explanations on this topic. Here's one that stuck out:
Mark Rejhon said:
All 1ms and 2ms flicker-free monitors have a persistence of 16.7ms in 60Hz mode, and 8.3ms in 120Hz mode
Hmmm... that seems like there must be a relationship between persistence and refresh rate no?

Maybe I just don't understand what you mean by "pixel speed". I thought persistence was pixel speed. I guess I just need to have the difference between "response time", persistence, and pixel speed explained real slow, because I'm not getting it. Is it that you can't change response time by overclocking, but you can make the persistence much lower? I mean, obviously not all LCD displays have the same persistence at a given refresh rate, right? But if the only variable you change is refresh rate and it lowers your persistence (not halved, just faster), how can you say they aren't related? What does it all mean?

And whoever said you can't get a 1-2ms-response display without strobing is not paying attention at all. Most TNs are 1-2ms response and don't strobe. Add in the strobing and even Mark agrees the total pixel time doesn't matter so much anymore, because the strobe effect hides it--which is why the slow-ass FG2421 works so well (it's not so bad even without the strobe). An 8ms IPS with strobing will work fine. These overclockers have fast enough persistence. Because there's the rub: the Catleap is fast for an IPS only because it doesn't do anything to the signal. Lots of other LCDs are much, much slower. Their persistence and their response time are both slower. So lots of things affect "pixel speed"--including scalers and LUTs, as well as refresh rate.
 
Really? Then why is it when you increase the refresh rate the persistence drops? Not changing anything else... not switching out a different monitor...

Let's ask Mark, eh?
I read some of his explanations on this topic. Here's one that stuck out:

Hmmm... that seems like there must be a relationship between persistence and refresh rate no?

Maybe I just don't understand what you mean by "pixel speed". I thought persistence was pixel speed. Response time is specific to a part of the overall refresh cycle--persistence is that total period of time. Can't change response time by overclocking, but you can make the persistence much lower (and again, I think that counts as "pixel speed"). So are we arguing about semantics again?

And whoever said you can't get a 1-2ms-response display without strobing is not paying attention at all. Most TNs are 1-2ms response and don't strobe. Add in the strobing and even Mark agrees the total pixel time doesn't matter so much anymore, because the strobe effect hides it--which is why the slow-ass FG2421 works so well (it's not so bad even without the strobe). An 8ms IPS with strobing will work fine. These overclockers have fast enough persistence. Because there's the rub: the Catleap is fast for an IPS only because it doesn't do anything to the signal. Lots of other LCDs are much, much slower. Their persistence and their response time are both slower. So lots of things affect "pixel speed"--including scalers and LUTs, as well as refresh rate.

I'm not an expert on this topic, I must confess. But my understanding is that MPRT (Moving Picture Response Time) is affected by refresh rate primarily because it reflects how you perceive the blur: at a higher refresh rate your eyes move less per frame, so there is less blur. It doesn't matter if the pixels are going at 2ms or 8ms; MPRT might still be 8ms. Mark wrote a great article on this with the pcmonitors author, which is where I learnt about this, as well as on Blurbusters.
 
I'm not an expert on this topic, I must confess. But my understanding is that MPRT (Moving Picture Response Time) is affected by refresh rate primarily because it reflects how you perceive the blur: at a higher refresh rate your eyes move less per frame, so there is less blur. It doesn't matter if the pixels are going at 2ms or 8ms; MPRT might still be 8ms. Mark wrote a great article on this with the pcmonitors author, which is where I learnt about this, as well as on Blurbusters.

Well, I don't know. I'm super confused. I try to sound like I have it all figured out but mostly I'm clueless and hoping someone will make it simple.

But let me ask you this:
Why is it that when I adjust my display scaling:
[screenshot: display scaling setting]


from that setting, which I normally use and which gives me this MPRT result (when I track the UFO I get a checker pattern):
[screenshot: MPRT result at my usual scaling]


to the smallest scaling (native 1440p) I get this MPRT result?
[screenshot: MPRT result at the smallest scaling]

Because 4.9ms is starting to get close to that half 120Hz frame which Mark says would be beneficial.

Edit: Then I do a bit more messing about with the MPRT test and go very large so I can see the squares very well, and then use that setting for the very small 8 pt test:
[screenshot: MPRT test at a very large size]

Now my eyes hurt. But this setting actually gives me a very evenly measured checker pattern; the squares of white and black are pretty evenly sized, but tiny, so who knows. What is all this shit about? I thought all 120Hz LCDs were 8.3ms... but that's curiously similar to exactly 1/120. So if persistence is exactly the same thing as sample-and-hold... then what is the point of differentiating between refresh rate and persistence? No, I must reject this logic. I see Mark often refer to persistence as sample-and-hold and it just doesn't sound right. Sample-and-hold is not a universal thing; there are a lot of uses for "sample and hold" circuits. When these guys talk about sample-and-hold displays they just mean "it don't strobe".

I'm beginning to be more and more confident that this "overclocking monitors have 8ms response time" assertion isn't quite right. That's the refresh interval--the total sample-and-hold or "frame" time. The pixels change color faster than that, or it would look like shit. Since I'm not blind and can see that a strobing backlight decreases motion blur even further (dramatically)... I think it's also fair to say I'm not blind and the Catleaps "work" at 120Hz--as in, they don't look like shit because they're too slow. I don't know what the total response time is, but it must be 8.3ms or faster, or they wouldn't make the next frame in time. Right?? If at 135Hz Vega's Catleap was still trying to change colors when the next frame came around, I don't think he would have bothered with 8 of them. So how do you actually measure response time, instead of just quoting the time the monitor's refresh rate takes to pull up a new frame?
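One way to make the response-time vs. persistence distinction concrete: treat the pixel's grey-to-grey transition as a simple first-order (exponential) settle and compare its 10%-90% response time with the frame hold time. This is only an illustrative Python model with a made-up time constant, not a measurement of any real panel:

```python
# Illustrative model only: pixel GtG transition as a first-order
# exponential with a hypothetical time constant tau. The standard
# 10%-90% rise time of such a transition is tau * ln(9).
import math

def gtg_response_ms(tau_ms: float) -> float:
    return tau_ms * math.log(9)

tau = 2.5  # ms, made-up pixel time constant for illustration
print(f"GtG response: {gtg_response_ms(tau):.1f} ms")   # ~5.5 ms

for hz in (60, 120, 135):
    print(f"frame time at {hz} Hz: {1000/hz:.1f} ms")
# 16.7 / 8.3 / 7.4 ms. The (unchanged) ~5.5 ms transition still
# settles within the 120+ Hz frame, which is why an overclocked
# panel can look fine even though overclocking didn't speed up the
# pixels themselves -- persistence dropped, response time didn't.
```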
 
Alright, just managed to read up on this entire topic.

Let's hope this screen gets some good reviews. The price in Europe will be even higher than in the U.S., seeing as the Asus PB287Q 4K monitor is $649 in the U.S. and €649 in Europe... almost $230 more.
That would explain why Asus releases certain items in Europe well before the U.S. :D

Anyway, my old HP W2408H isn't bad either, but I can't wait to see what this puppy can do.
I remember that when I bought this one, people were complaining it was a glossy screen and they all wanted matte. :rolleyes:
 
Mostly I continue to trash this thread and champion the Catleap/Qnix because I still can't order this fucking thing.

Vega knows the answers he just doesn't have time to explain it.
 
I'm at DreamHack right now, so I have the chance to try it out. I can say I think it looks good from different angles, and the G-Sync you have probably already heard about. I didn't manage to activate ULMB mode, though; maybe this sample just doesn't have the feature (it's grayed out in the OSD).
I'm no color connoisseur, but I'll see if I can run some color tests on it later.
 
I'm at DreamHack right now, so I have the chance to try it out. I can say I think it looks good from different angles, and the G-Sync you have probably already heard about. I didn't manage to activate ULMB mode, though; maybe this sample just doesn't have the feature (it's grayed out in the OSD).
I'm no color connoisseur, but I'll see if I can run some color tests on it later.

Awaiting more posts from you. Possibly ULMB will not be available at release, or for some time after, given that the review didn't mention it at all. Possibly it will be released separately as an official hack of the display for "advanced users". My speculation, though.
 
I'm at DreamHack right now, so I have the chance to try it out. I can say I think it looks good from different angles, and the G-Sync you have probably already heard about. I didn't manage to activate ULMB mode, though; maybe this sample just doesn't have the feature (it's grayed out in the OSD).
I'm no color connoisseur, but I'll see if I can run some color tests on it later.

Please give us more details! Do they let you sneak some pictures for us too?
 
Mostly I continue to trash this thread and champion the Catleap/Qnix because I still can't order this fucking thing.

Vega knows the answers he just doesn't have time to explain it.

I was like you, but then I cancelled my pre-order from GameStop and now I feel much, much better! Believe you me, it feels great announcing PRE-ORDER CANCELLED every 5 minutes for any midgets who will listen.
 
I have finally experienced 120hz (actually 110hz) on a QNIX. I don't know how people say they can't tell the difference... I can even tell the difference on the desktop.

Don't see how I'll ever go back to 60hz now.
 
I have finally experienced 120hz (actually 110hz) on a QNIX. I don't know how people say they can't tell the difference... I can even tell the difference on the desktop.

Don't see how I'll ever go back to 60hz now.

Nonsense, the left human eye can only see 12fps and the right eye can only see 12fps, which when combined totals the magic number of 24fps used in film. What you perceive as noticeable is nothing more than a psychedelic roller coaster of PWM induced dementia. You should seek medical attention soon.
 
Nonsense, the left human eye can only see 12fps and the right eye can only see 12fps, which when combined totals the magic number of 24fps used in film. What you perceive as noticeable is nothing more than a psychedelic roller coaster of PWM induced dementia. You should seek medical attention soon.

You're kidding, right? :O
 
I'm at DreamHack right now, so I have the chance to try it out. I can say I think it looks good from different angles, and the G-Sync you have probably already heard about. I didn't manage to activate ULMB mode, though; maybe this sample just doesn't have the feature (it's grayed out in the OSD).
I'm no color connoisseur, but I'll see if I can run some color tests on it later.

Looking forward to seeing more from you. :)

Any computer media outlets there?
 
ULMB mode was probably greyed out in the OSD because G-Sync was enabled.
This. It's either/or, not both at the same time. See if you can disable G-Sync to check whether ULMB is available. It has to be there if it's in the OSD, right? :p
 
Why would anyone buy this monitor over the ASUS 4K, which is actually cheaper? Yes, it's 60Hz, but surely a much lower resolution (1440p) is not better than 4K UHD?

This monitor does seem awesome with the G-Sync etc., but it sounds a bit gimmicky at this point. We need some real reviews of this damn thing.
 
Why would anyone buy this monitor over the ASUS 4K, which is actually cheaper? Yes, it's 60Hz, but surely a much lower resolution (1440p) is not better than 4K UHD?

This monitor does seem awesome with the G-Sync etc., but it sounds a bit gimmicky at this point. We need some real reviews of this damn thing.

I'm waiting for Pascal before I consider 4K. By then they should have 120+hz 4K.
 
Why would anyone buy this monitor over the ASUS 4K, which is actually cheaper? Yes, it's 60Hz, but surely a much lower resolution (1440p) is not better than 4K UHD?

This monitor does seem awesome with the G-Sync etc., but it sounds a bit gimmicky at this point. We need some real reviews of this damn thing.

Motion clarity is the reason. High resolution motion clarity at that. Gamers want high framerates and they want to see what's going on as well. At least, I'm speaking for myself on this one.
 
There are a lot of posts from me in this thread showing why I don't consider 4K a good choice outside of desktop real estate and desktop/app use for at least a few years.

The first and most obvious reason is that 60Hz panels have massive screen blur during the continual movement-keying + mouse-look flow of 1st- and 3rd-person games. 120Hz TN reduces this blur by 50%, so that it is more like a strong soften blur, making your entire viewport of high-detail geometry and textures (including depth via bump mapping, as well as other shader effects) "fuzz out".

120Hz-144Hz (even at 100fps or so) also lets many more game-world state/action slices show per second, which has several aesthetic advantages (and, depending on the game, potential performance advantages versus the game and/or other players):
increased motion definition, increased motion tracking (seeing more "dots per dotted-line path length"), and increased animation definition. These factors create more refined, articulated, and smooth motion of individual screen elements (their individual paths and animation cycles) and of the entire world during continual viewport/player-camera FoV movement.

You can barely get (at or just over) 100fps in many of the most popular modern games on ultra settings running dual 780 Tis in SLI on a single 2560x monitor currently.

The only way I could see using a 4K monitor as your only monitor is if you ran it at 1080p 120Hz while playing 1st/3rd-person games, which some 4K monitors can do. However, the desktop/app real-estate benefit of a 4K panel would be a big tradeoff if you were using a TN panel for the speed. Conversely, a 4K IPS panel would blur considerably more, even in 120Hz 1080p mode. Another big consideration is that most 4K panels so far don't have G-Sync. A few coming out soon will have it, but they won't be able to use ULMB mode at 60Hz without looking very flickery, if they can use ULMB mode at all. I'd be curious whether they could run ULMB mode in 1080p 120Hz mode, though. Beyond all that, 3840x2160 at 60Hz would suck for 1st/3rd-person gaming imo: very low fps on even robust enthusiast GPU budgets, and a low Hz ceiling yielding smearing FoV-movement blur and low motion tracking, motion definition, flow, and animation definition. And if the particular 4K model lacks G-Sync and ULMB mode, there is that too.

That said, 4K is a massive increase in desktop/app real estate. I would like an IPS 4K someday to upgrade the desktop/app side of my dual-monitor setup, but with a much more suitable gaming monitor next to it. Currently I run a 2560x1440 glossy IPS next to a notably lush/vibrant 1920x1080 120Hz TN, both 27". A 3840x2160 IPS next to a 27" 2560x1440 TN seems like a logical evolution. I'm also very interested in the 90-100Hz, low-persistence/blur-eliminating Oculus Rift when the consumer release comes out.
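To put the GPU-cost side of that in numbers, here's a crude Python sketch assuming fps scales inversely with pixel count. That's only roughly true when a game is shading/fill-rate bound, and the ~100fps 1440p baseline is a hypothetical stand-in for a dual high-end setup:

```python
# Crude heuristic: fps ~ 1 / pixel count when shading-bound.
# Hypothetical baseline: ~100 fps at 2560x1440 on dual high-end GPUs.

base_fps = 100.0
base_px = 2560 * 1440

for w, h in ((1920, 1080), (2560, 1440), (3840, 2160)):
    est = base_fps * base_px / (w * h)
    print(f"{w}x{h}: ~{est:.0f} fps")
# 1920x1080 -> ~178, 2560x1440 -> 100, 3840x2160 -> ~44
# i.e. the same rig that just breaks 100 fps at 1440p lands in the
# mid-40s at 4K, which is why 4K 60Hz gaming is already a stretch.
```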
 