ASUS Announces ROG SWIFT PG278Q Premium Gaming Monitor

I think $650 is an important price point for a lot of people. For me, it would still be more than I have ever spent on a monitor and definitely still a premium price compared to many other monitors. Even though I want this monitor, at $800 I just have a hard time seeing myself buying it. At $650 it would be a good enough deal that I would buy it.
 
$650? I'll believe it when I see it lol. I highly doubt it (I think it's just a mistake by the rep who posted it), and even then, gougezilla will show up in the form of the retailers anyway.
 
$650? I'll believe it when I see it lol. I highly doubt it (I think it's just a mistake by the rep who posted it), and even then, gougezilla will show up in the form of the retailers anyway.

That is true. If it really does come out at $650 it won't stay there long. It'll likely fly off the shelves and then retailers will jack it up to $900 or so in no time.
 
I even private messaged them on facebook for further clarification.

[attached screenshot: y5OTRmo.png]
 
Are there going to be new 24" models also, or is it just 27" for now?
Nothing else has been hinted at or announced beside the 27" at this time.

Anyone see any preorders up yet? I would like to secure one of these as early as possible :cool:.
 
I even private messaged them on facebook for further clarification.

[attached screenshot: y5OTRmo.png]

"Ready mid-July for $799!!!" but by ready we mean not at all, and by mid July we mean the 28th, if you live in Europe only. And then it was "US pre-orders will begin in late August blah blah blah boat blah blah asia." And now it's $650?

The launch of this monitor has been filled with so much "official" misinformation that it really irritates the hell out of me. I sincerely hope the price is that low, but considering the huge f-ing deal they made about it in June and how that ended up, I'm not ready to believe a damn thing they say until I see it on sale on Amazon.

The communication problems among departments at ASUS are so bad that I have to wonder how they even manage to produce such an amazing monitor in the first place. I wish there were someone else I could give my money to for a 27" 144Hz screen, but instead we have to lie down and take it.
 
Day one with the Swift. Compared to my old Dell 3008WFP:
  • Higher Hz is really nice. Makes a huge difference in games. Combined with ULMB or G-Sync, games are just so smooth.
  • ULMB brightness is ok. It seems that if the pulse width setting is lower than 90 it gets darker. The lowest settings are close to unusable.
  • Viewing angles are quite acceptable! This was my biggest worry and I'm glad it was an unnecessary one. Off to the sides there isn't too much color shifting; vertically it's bad, but when sitting normally I don't notice any shifts or big differences between top and bottom.
  • Colors are really good out of the box, no real need to adjust anything but brightness. Being used to more vivid colors than sRGB, I find the Swift's colors seem a bit flat at first.
  • The "GamePlus" mode has the crappiest design. Crosshairs are rather ugly designs and you can't have both timer and crosshair at the same time. No easy way to disable either (apparently if you wait in the menu without selecting it disables the crosshair or timer). No custom timer available. I think this could've been a neat food timer if it was adjustable. :)
  • I wish the USB ports were in a more convenient location. Now they're really only useful for peripherals that are connected all the time.
  • AG coating isn't too noticeable. Then again I didn't mind the Dell's AG much either so those who are really anal about such things might be disappointed.
  • Turbo button should've been just a menu option. It's not much faster to change between the different modes. Same for the GamePlus. They should've replaced those with something like ULMB/G-Sync toggle or something.
  • Power light is extremely subtle.
  • Menu joystick is very good! Good idea for adjusting settings. My Dell is downright awful to adjust in the rare case you need to do that.
  • The box for this thing is massive. I don't remember my Dell coming in a box half this big despite being a bigger screen.

I don't know if the firmware is in any way upgradable on this thing, but I really wish ASUS would give us better options for the GamePlus stuff. It's not a bad idea; the execution is just the definition of half-assed. If ASUS won't do anything about it, hopefully someone figures out how to replace the crosshair graphics.

It's a shame ULMB doesn't work at 144Hz. I don't know what kind of technical limitation there is for that (nobody makes backlights that can strobe at that rate?), but it would've been nice if there weren't such discrepancies between what works in which mode.

I'm going to miss my Dell's scaler for gaming at lower resolutions (though that shouldn't be too necessary with my GTX 770 SLI). Nvidia really should write a better GPU scaler; the Dell's scaler has much better image quality below native resolution.

I haven't tried it side by side with the Dell yet (have to clear some desk space first) but I could easily see using the Swift as my only monitor.
 
I suspect it is a recognition that supply will not meet demand and prices will hit 800+ anyway. Or... It could be a recognition that free sync is not only just as good or better but free...
 
Lol, you guys are so funny, man. This monitor is worth about $650 on a good day, considering how many months it was delayed and how many other manufacturers are getting in the race as well. However, almost a year's worth of beating you down with the $800 price tag has made you believe the monitor is darn cheap at $650. I love marketing tactics... they play tricks with the human mind at a different level.
 
I am (and was, as soon as I read that it was a full 8 bit high quality TN panel) fairly set on buying this whether it's 800 or 650. If it's 650 that's just a cherry on top. I already have 780 SLI's so I'm fairly Nvidia'd at the moment anyway and this is pretty much the gaming monitor to get, will likely last at least 3-5 years.

If I had to think of a reason it might retail for less all of a sudden, I'm guessing the unexpectedly high demand overseas is making them predict that they can sell more quantity here too, especially at a lower price. That much is evident just by looking at this topic. They might be figuring that moving more quantity will outweigh the smaller margin per unit. It would make quite a bit of sense, and looking at this topic it would be a good move by them. This can't cost that much in terms of raw materials, so they just need to recoup the manufacturing setup and R&D. Perhaps they can invest a bit more R&D into cheaper manufacturing, assuming it will sell more at the lower price point. Whatever, it's all speculation atm.

As for Freesync... doesn't the monitor have to support it in the first place? I know it's part of some spec, but I'm really curious whether it truly requires no extra hardware in the monitor, and whether it will really work as well... and either way you're buying a new monitor that needs to support it. They will likely also advertise it as a "Freesync capable monitor" and charge more either way. I'm not seeing much advantage, and Nvidia at least has something that has been demonstrated and is working in consumer homes at the moment... but then again, like I said, I'm already in the Nvidia camp at the moment (personally; I still suggest R9's to other people, it's just that I'm not switching). If I had R9's I'd probably be thinking about this differently.

I just really hope this won't be a super panel lottery. I have bad luck with panels. Very bad luck.
 
If you think G-Sync is taking a long time to come out, Freesync is beyond vapor by comparison as far as being something you can buy with similar functionality any time soon.

Freesync also never makes any mention of a backlight-strobing zero-blur option, which seems pretty standard in Nvidia's G-Sync monitors.
 
I suspect it is a recognition that supply will not meet demand and prices will hit 800+ anyway. Or... It could be a recognition that free sync is not only just as good or better but free...

Freesync is a tech demo hastily cobbled together. There are no shipping Freesync devices, or even devices that use the variable refresh rate spec of Displayport. Even if there were, G-Sync is more than just a way to implement a variable refresh rate.
 
I find it amusing how many people are getting upset about Freesync.

I say something like "it could be..." and guys are rushing to correct me as if they know any better than I do. Except they aren't saying "maybe". :D
Why would Asus drop the price? Just to be friendly to Americans?

Interesting timing to do it right after the Freesync FAQ comes out.

Also G-sync has nothing to do with anything BUT variable refresh rate.

We already had Lightboost and it wasn't costing any extra.
 
I am interested to hear any feedback from AMD GPU users in regards to how this monitor plays with a pair of 290/x's in crossfire in 144Hz mode, (without using GSYNC or Lightboost obviously).

Also, first-hand compare & contrast reports between this and the 120Hz capable Korean IPS monitors would be greatly appreciated. I am just trying to gauge if it will be worth upgrading to this as an AMD user who is pretty happy with his current 110Hz 27" Korean panel.
 
I am interested to hear any feedback from AMD GPU users in regards to how this monitor plays with a pair of 290/x's in crossfire in 144Hz mode, (without using GSYNC or Lightboost obviously).

Also, first-hand compare & contrast reports between this and the 120Hz capable Korean IPS monitors would be greatly appreciated. I am just trying to gauge if it will be worth upgrading to this as an AMD user who is pretty happy with his current 110Hz 27" Korean panel.
I think it'll probably be better to get a BenQ or Eizo with their own version of ULMB, especially the Eizo VA panel (if you win the panel lottery).

You AMD folks would at least want ULMB, right?
 
I'd keep them side by side. I don't have a Korean OC one, just a 60hz 2560x1440 IPS, and I plan on keeping both at the desk, upgrading from the 1080p 120hz TN that is currently the dedicated gaming monitor.

From everything I've read, the Korean OC IPS monitors' response times are slow, so they are more toward the 60hz smearing-outside-the-lines end of the motion clarity "scale". That smearing doesn't just affect individual objects moving on your screen; it smears the entire viewport/game-world during your continual movement keying + mouse looking periods in 1st/3rd person games.
Further down the page in the following link, I borrowed some pictures from blurbusters.com and some Eizo advertisement/review that try to show the blur amount difference between 60hz, 120hz, and backlight strobing "zero blur" modes.
web-cyb.org 120hz-fps-compared

So an OC'd IPS is more toward the 60hz TN baseline smear/blur end of things, from what I've read from motion clarity purists in this thread and others, and from reviews. On a 120hz TN at high fps you get about 50% blur reduction; at 144hz TN you get about 60% reduction. This results in what is more of a strong soften-blur effect, where the screen goes "fuzzy", but more within the "shadow mask"/footprint of onscreen elements. The texture detail and depth via bump mapping, etc. is all lost, just like you can't read blurry in-game text.
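Those rough percentages fall straight out of sample-and-hold persistence: a non-strobed panel holds each frame for the full refresh interval, so eye-tracking blur scales with 1/Hz. A quick sketch of the arithmetic (my own numbers, not from any review):

```python
# Sample-and-hold persistence: each frame stays on screen for the whole
# refresh interval, so perceived eye-tracking blur scales with 1/Hz.

def persistence_ms(hz):
    """Frame hold time in milliseconds for a non-strobed display."""
    return 1000.0 / hz

def blur_reduction_vs_60hz(hz):
    """Fractional blur reduction relative to a 60 Hz baseline."""
    return 1.0 - persistence_ms(hz) / persistence_ms(60)

for hz in (60, 120, 144):
    print(f"{hz:3d} Hz: hold {persistence_ms(hz):5.2f} ms, "
          f"{blur_reduction_vs_60hz(hz):.0%} less blur than 60 Hz")
```

120hz comes out to exactly 50% and 144hz to about 58%, matching the "about 50% / about 60%" figures. A strobed backlight (Lightboost/ULMB) sidesteps this entirely by cutting the hold time to the strobe pulse width, which is why it wins regardless of refresh rate.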

Obviously an IPS will have more uniform color (no TN shift / "gradient shadow") and more degrees of color. The ASUS ROG Swift seems to be capable of some good color saturation from what's being reported, which was a big concern of mine since several of the newer 120hz-144hz TN monitors have been notably pale lately. I hope the Swift can be tweaked for some lush (relatively high vibrancy) color even if it's not totally uniform nor has the most extreme color palette size. I'm happy with the color of my Samsung A750D 120hz 1080p TN for gaming, but I keep an IPS next to it for desktop/apps.

So you would gain a much tighter version of viewport motion blur - much greater motion clarity - but not G-Sync nor zero-blur backlight strobing, which (along with being 2560x1440 at 1ms, 120hz-144hz) are the other major reasons the monitor is expensive. There is a chance that someone will make a hack for AMD to use G-Sync eventually too, btw, but don't take that as a promise from me. :p
That's not first hand but I thought I'd throw it out there.
 
I don't know if this link has been mentioned since I haven't kept up with this thread, but as an FYI, here's a review of this monitor by guru3d.com. They expect this monitor to sell for 800 euros or more (a euro is a little more than a US dollar). :eek:
 
I find it amusing how many people are getting upset about Freesync.

I say something like "it could be..." and guys are rushing to correct me as if they know any better than I do. Except they aren't saying "maybe". :D
Why would Asus drop the price? Just to be friendly to Americans?


I pointed out some possible reasons for them to drop the price. Along with that, I'm gonna add that the US usually gets the better end of the stick in terms of prices, I think (I'm pretty sure we're one of the largest consumers of anything in the world). I just don't see how a simple FAQ with no substance would make ASUS change their pricing scheme. I highly doubt that has anything to do with it. If anything, it would be a large combination of factors where that might have been a (very) small consideration. Anyway, all of this is moot speculation. The reason doesn't matter because the end product doesn't change. You either think this is worth it or you don't.

And who is acting upset? This is some kind of assumption on the basis that there are two sides waging a battle on these forums. Let's get out of this "GPU vendor camp" mentality. Your job isn't to fight the GPU vendors' wars on the forums. Your job as the consumer is simply to see where the pieces fall and then make a decision based on your circumstances... and then maybe if you get enough momentum, change where the pieces are falling.

As a consumer, this is what I'm seeing: Freesync isn't out. Its requirements are still vague and there hasn't been a single monitor released despite its supposed ease of integration. G-Sync, meanwhile, is out, it's proven, it works, and it has products I can purchase. And this monitor has its own merits with or without it, but, well, you know I already have Nvidia cards that I'm fine with. So for me the G-Sync feature is part of the package without me having to do anything. I imagine this evaluation changes when you're on the other side of the coin. Simple cost-benefit evaluation. I can see that for AMD users this monitor loses a fair share of value... in fact, if it's so simple, I'm curious as to why ASUS couldn't include Freesync on this thing. Surely ASUS was not paid off to play sides. Catering to both vendors, if possible, would have been more beneficial to the bottom line.
 
And who is acting upset?
Some of the people who've been waiting for this display are definitely upset, and they definitely seem hostile to FreeSync.

I'm certainly not biased towards one manufacturer or the other. I like video cards in general. I like it when the underdog wins, though. It's an American tradition.
 
I'm not hostile to it. It just seems like beyond vapor(hard)ware at this point, just a proof of concept. How many modern G-Sync monitors have come out since it was actually developed, and how long has it been since it was demo'd... outside of the DIY kit? If you are waiting on working Freesync in a 120hz-144hz monitor that you can buy, I suspect you may be in for a very long wait. In addition, as I keep saying in posts, G-Sync monitors seem to have ULMB zero-blur backlight strobing as a pretty standard inclusion, and I see no mention of that in Freesync blurbs ever. Yes, backlight strobing can be included somehow by AMD (like 3D gaming can be) or by monitor manufacturers themselves... but monitor manufacturers could make their own dynamic Hz functionality too if they really wanted to. I welcome competing versions of these technologies. The point is what is going to be available for the next six, eight, twelve, or more months and what isn't?

For myself I'm also keeping an eye on the oculus rift's consumer release (90hz or higher input oled w/ some form of low persistence~blur elimination) if we are talking about things we could wait for.
 
I find it amusing how many people are getting upset about Freesync.

#letsmakestuffup

Also G-sync has nothing to do with anything BUT variable refresh rate.

Not really. The point of G-Sync is to have the GPU drive the display's timing. Yes, it matches refresh rate with FPS, but the big deal—and the reason why G-Sync needs hardware—is that the GPU can directly control when frames are delivered and rendered on the display, not just the refresh rate of the display.

Freesync will be better than nothing, but less effectual than G-Sync, if and when products ship that eventually use it. It doesn't solve motion blur, either, and ULMB is superior to Lightboost in that regard, from what I can tell.
 
Day two. Hooked up my Dell 3008WFP next to the Swift.

The 3008WFP at 30" 16:10 is huge compared to the Swift. I seriously don't know what to do with this much desktop space; maybe it would be nice for work (I'm a web developer), but most of the time it's not very useful. I'm not the type who runs everything all at once; I just can't handle the information overload I get with that.

Colors surprisingly are not that different; I think the ASUS does tremendously well here and the stock calibration is really good. It's somewhat hard to compare since the Dell has a wider gamut, so its colors are more saturated. I think the ASUS would be fine for non-color-critical image editing at sRGB. Viewing angles are obviously better on the Dell's IPS panel.

Portrait mode is completely ridiculous on a display this size. Viewing angles become a much bigger issue and reading is difficult as you have to tilt your head constantly. Oh well, at least the cables were easy to hook up in this orientation.

The 120/144 Hz modes have really spoiled me. Even with games where I don't come anywhere close to 120 FPS I've really liked using ULMB mode. The picture clarity and motion is so much better. Brightness with ULMB has been perfectly fine both during the day and at night. Motion clarity and smoothness is the one thing that puts this in a class of its own.

ASUS Swift works perfectly fine at 120/144 Hz with G-Sync or ULMB with the second display running at 60 Hz. No need to turn one display off or anything. Only distraction is that the second display isn't blanked during games, which can be a plus or minus depending on if you want to have info on the other display.

The Swift seems to have some issues with the Turbo button. I sometimes get a blank screen with an "Out of range" message if I try switching refresh rates through it. It also sometimes seems to set a 144 Hz 16-bit (instead of 32-bit) custom resolution which only supports 144 Hz. It might have something to do with the second display, as I didn't have this issue yesterday. In any case it might be better to change the refresh rate from the Nvidia control panel.
 
#letsmakestuffup
Wasn't even necessarily talking about this thread, although you are pretty much doing the stereotypical angry Nvidia stalwart thing, imo.

As for the rest of the mumbo jumbo in your post... sounds plausible :D
 
If you are waiting on working freesync in a 120hz-144hz monitor that you can buy I suspect you may be in for a very long wait.
But G-sync has nothing to do with why the Swift is appealing to me, and not just because I'm running AMD right now. If I wanted, I could switch back to 2x780s. I like the AMD cards better. I'm not saying the Swift is a bad buy.... I'm saying G-sync got suckerpunched and "Nvidia" people are kinda sensitive about it.

The FAQ itself got a buncha flaming comments from doubting Nvidia users. I find it amusing... that's all.

If G-sync is a flop... it doesn't matter to me. And I don't think it will be anyway. I just think Nvidia users are mad they don't have a "GOTCHA" moment on their hands.

If FreeSync never comes to fruition, it matters not to me one wink.

This is because I'm buying a Swift to run at 144Hz with Vsync. I can't wait and I'm irritated it's taken so long.

It's also a Vietnamese tradition.

Yes, which is why so many Americans were actually rooting for the Vietnamese that the Hawks in the White House had to give up. Their warlord weapons dealing friends the DOD contractors were inconsolable.
 
Wasn't even necessarily talking about this thread, although you are pretty much doing the stereotypical angry Nvidia stalwart thing, imo.

As for the rest of the mumbo jumbo in your post... sounds plausible :D

I'm explaining the difference between what you think and what actually is. If you want to call that mumbo jumbo, feel free.

By the way, I ain't even mad.
 
I'm explaining the difference between what you think and what actually is. If you want to call that mumbo jumbo, feel free.

By the way, I ain't even mad.

This is how you respond to me saying your claims seem "plausible"? The smiley face wasn't sarcastic and by mumbo jumbo I meant "I don't get it" not "it's wrong". Defensive much?
 
Hey- Hoping someone can help me out with this. I currently have three 27" cinema displays, all running at 2560x1440.

I intend to buy one of the ROG Swift monitors when they are released to replace my center monitor. I only game on the center monitor in 'windowed fullscreen'. When I am in game, my left and right monitors still show my desktops and my game takes up the entire center monitor.

So, here's my question: I want to be able to drive the ROG Swift at 144hz in 1440p. What type of video card setup would I need? I mainly play Starcraft and Battlefield 4.

Would a single 690 or 780 be sufficient? Currently I have a 6990, but since it doesn't have G-Sync, I assume I will have to upgrade it. (Right?)

Thanks
 
This is how you respond to me saying your claims seem "plausible"? The smiley face wasn't sarcastic and by mumbo jumbo I meant "I don't get it" not "it's wrong". Defensive much?

I don't think that's grounds for being defensive. You're on the internet, where it's hard to tell sarcasm apart from seriousness unless it's blatantly obvious. Personally I couldn't tell either, especially considering the first part of your post:

Wasn't even necessarily talking about this thread, although you are pretty much doing the stereotypical angry Nvidia stalwart thing, imo.

If we were going to go from another perspective I think this is getting overly defensive on your end, considering I don't think that many people in this thread have demonstrated PRIME style fanboyism. Keep separate forums separate...
 
Hey- Hoping someone can help me out with this. I currently have three 27" cinema displays, all running at 2560x1440.

I intend to buy one of the ROG Swift monitors when they are released to replace my center monitor. I only game on the center monitor in 'windowed fullscreen'. When I am in game, my left and right monitors still show my desktops and my game takes up the entire center monitor.

So, here's my question: I want to be able to drive the ROG Swift at 144hz in 1440p. What type of video card setup would I need? I mainly play Starcraft and Battlefield 4.

Would a single 690 or 780 be sufficient? Currently I have a 6990, but since it doesn't have G-Sync, I assume I will have to upgrade it. (Right?)

Thanks

I have a similar set up with three monitors, and use a 780. I get lots of blue screen crashes when I try to use gsync (in full-screen mode) unless I disable the other two displays before starting a game. It works perfectly fine if I disable them first, but it's kind of annoying. I used to have to do the same thing to get amd's frame pacing to work before I switched to Nvidia though, so I'm already used to it.

You need an Nvidia card if you want to use gsync or ulmb, but an amd card will still benefit from 144hz, though a 6990 will struggle at 2560x1440 on newer games.

Also, gsync requires real full-screen mode and won't work with windowed full-screen.
 
I have a similar set up with three monitors, and use a 780. I get lots of blue screen crashes when I try to use gsync (in full-screen mode) unless I disable the other two displays before starting a game. It works perfectly fine if I disable them first, but it's kind of annoying. I used to have to do the same thing to get amd's frame pacing to work before I switched to Nvidia though, so I'm already used to it.

You need an Nvidia card if you want to use gsync or ulmb, but an amd card will still benefit from 144hz, though a 6990 will struggle at 2560x1440 on newer games.

Also, gsync requires real full-screen mode and won't work with windowed full-screen.

When you say 144hz will still work - would a 6990 be able to push what I need? (1 monitor @ 1440p/144hz + 2 monitors @ 1440p/60hz)

I wonder if it's not worth buying an Nvidia card if I don't plan on ever playing in full screen mode.

I suppose I could also just buy a second 6990 just for the center monitor...

What would be the best bet?
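For what it's worth, the cable/link side of "1440p at 144hz" is fine over DisplayPort 1.2; it's the rendering that's the hard part. A back-of-the-envelope sketch (the ~20% blanking overhead and the 17.28 Gbit/s HBR2 payload are ballpark DP 1.2 figures, not exact timings for this monitor):

```python
# Rough uncompressed video bandwidth for 2560x1440 @ 144 Hz, 24 bits/pixel.

def active_gbps(width, height, hz, bpp=24):
    """Active-pixel bandwidth in Gbit/s, ignoring blanking intervals."""
    return width * height * hz * bpp / 1e9

swift = active_gbps(2560, 1440, 144)   # ~12.74 Gbit/s of active pixels
with_blanking = swift * 1.2            # add ~20% for blanking (rough)
hbr2_payload = 17.28                   # DP 1.2 four-lane HBR2 payload, Gbit/s

print(f"active {swift:.2f} Gbit/s, ~{with_blanking:.1f} with blanking, "
      f"vs {hbr2_payload} Gbit/s available")
```

So the link has headroom; whether a 6990 can actually render newer games at anywhere near 144 fps at 1440p is a different question.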
 
When you say 144hz will still work- Would a 6990 be able to push what I need? ( 1 monitor @ 1440p/144hz + 2 monitors @ 1440p/60hz )

I wonder if its not worth buying a nvidia card if I dont plan on ever playing in full screen mode.

I suppose i could also just buy a second 6990 just for the center monitor...

What would be the best bet?

You're not going to get 144 fps in anything more intensive than CS:Source, but it'll still work.

Let's be honest, though: if you're going to drop the dough on the monitor, you should upgrade your GPU as well. There's no point in having a 144hz screen if you never see fps higher than 60.
 
144 Hz is not only useful for 60+ fps.
Lower motion blur benefits any FPS number. Remember when we all used CRT monitors? They ran at 85 Hz (some at 100 Hz or a little more) and I didn't have a powerful graphics card at the time, yet when I switched to my current LCD monitor I instantly noticed the motion blur.

Things like ULMB or Turbo240 are extremely useful, not only for games but for general application usage too.

Our mind likes clarity.
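The CRT comparison makes sense in hold-time terms: a CRT phosphor decays within a millisecond or two, so perceived persistence is tiny regardless of refresh rate, while a sample-and-hold LCD keeps each frame lit for the whole interval. A sketch with ballpark numbers (the phosphor and strobe figures are rough, not measurements of any specific display):

```python
# Perceived persistence: roughly, how long each frame stays lit.

def lcd_hold_ms(hz):
    """Sample-and-hold LCD: frame is lit for the full refresh interval."""
    return 1000.0 / hz

CRT_PHOSPHOR_MS = 1.5   # typical phosphor decay time, ballpark
ULMB_PULSE_MS = 2.0     # strobe pulse width, order of magnitude

print(f"60 Hz LCD hold: {lcd_hold_ms(60):.1f} ms")
print(f"85 Hz CRT:      ~{CRT_PHOSPHOR_MS} ms (phosphor-limited)")
print(f"ULMB strobe:    ~{ULMB_PULSE_MS} ms")
```

Which is why an 85 Hz CRT with a weak graphics card still looked clearer in motion than a 60 Hz LCD, and why strobing helps on the desktop too, not just in games.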
 