ASUS Announces ROG SWIFT PG278Q Premium Gaming Monitor

For the first week I ran G-Sync, 20% brightness, 50% contrast, normal color. The second week I switched to ULMB, 100% brightness, 50% contrast, and pulse width at 75. Running ULMB is easier on my eyes (which are very sensitive to bright light) but ULMB requires beefy video cards in most games. Globally I set refresh rate to 120 and G-sync on, then disable G-sync in nvidia 3D settings for all games that can run at 120 fps or better.
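Roughly, the per-game rule I follow looks something like the sketch below. The game names and fps numbers are just made-up examples to show the logic, not measurements from my rig:

```python
# Hypothetical sketch of the rule described above: keep G-Sync for games that
# can't hold the refresh rate, skip it (leaving ULMB usable) for games that
# comfortably run at 120 fps or better. Names and fps values are assumptions.
REFRESH_HZ = 120

games = {
    "Counter-Strike: GO": 300,
    "Battlefield 4": 95,
    "The Evil Within": 45,
}

for game, avg_fps in games.items():
    if avg_fps >= REFRESH_HZ:
        mode = "ULMB (G-Sync disabled in the game's 3D settings profile)"
    else:
        mode = "G-Sync (variable refresh)"
    print(f"{game}: ~{avg_fps} fps -> {mode}")
```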

When I run the blur busters test, it always says synchronization failed. Anyone know why?
 
For the first week I ran G-Sync, 20% brightness, 50% contrast, normal color. The second week I switched to ULMB, 100% brightness, 50% contrast, and pulse width at 75. Running ULMB is easier on my eyes (which are very sensitive to bright light) but ULMB requires beefy video cards in most games. Globally I set refresh rate to 120 and G-sync on, then disable G-sync in nvidia 3D settings for all games that can run at 120 fps or better.

When I run the blur busters test, it always says synchronization failed. Anyone know why?

I also have ULMB on whenever I can. 10% Pulse Width, 100% brightness. It's a bit dark but nothing I can't handle.

As for your question, are you making your browser full-screen? That might be why. Otherwise try Chrome or Firefox.
 
So G-sync is just as amazing as everyone said and I suspected. It was odd at first because my mind is trained to react and be prepared for varying amounts of input lag. There is virtually no input lag that I could tell using G-sync and it actually made me better in the games I tried. I even enjoyed myself more because of it. My mind also wasn't prepared for the elimination of judder. It's confusing everything ingrained in my brain after years of dealing with LCD displays, and I'm loving it :D.

I'm going to give ULMB a shot tonight. I didn't have very good luck with the third-party Lightboost mod on my VG278HE, so looking forward to this official implementation.
I'm afraid I have to join the minority that doesn't like this monitor. I bought one last night and I was disappointed. I've never used a TN monitor before and have been using an IPS monitor for years now. I've read the various comments on this thread and thought I was prepared.

The biggest problem was it actually made my eyes hurt; it was too bright. I had to turn the brightness all the way down to zero to stop that but then it just didn't look good. It still felt too bright and too washed out. I spent a bunch of time messing around with the color temperature, brightness, contrast on the monitor and the nvidia control panel (such as vibrance) and just couldn't get it to a point where I was happy with it.

I perhaps wasn't being too fair since I was mainly using my Windows desktop and not actual in-game usage (although I did try that too and it's definitely nice and smooth). But I use my computer for my job and I need something I can be happy with both in and out of games. I guess I care more about the image definition than I thought I would.

In the end, I decided to take it back and continue waiting for another monitor. It was too bad because the monitor had no flaws. No bleeding, uneven clouding or dead/stuck pixels. I'm so sad :(
That's weird that you feel this way about the brightness. The reviews I've seen show that this monitor will go down to around 50 nits at 0 brightness. If you really wanted it darker you could have enabled ULMB and set the pulse width low. At 10% pulse width with 20 brightness, TFT Central reports it goes all the way down to ~12 nits.
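If you want to ballpark it, perceived brightness with ULMB scales roughly with the strobe duty cycle. Here's a rough sketch; the 120-nit starting point is just an assumed round number, not a measured value:

```python
# Rough model only: with ULMB the backlight strobes for a fraction of each
# refresh, so perceived luminance scales roughly with the pulse width.
def approx_luminance(full_on_nits: float, pulse_width_pct: float) -> float:
    """Estimate strobed luminance as a linear fraction of the duty cycle."""
    return full_on_nits * (pulse_width_pct / 100.0)

print(approx_luminance(120.0, 100.0))  # ~120 nits (assumed starting point)
print(approx_luminance(120.0, 10.0))   # ~12 nits, in the ballpark of TFT Central's figure
```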
 
Thanks alabrand1 & icor1031, will try full screen tonight. 10% pulse width is pretty dark!

One odd thing I noticed with ULMB on is that the mouse cursor in the [H] forum stutters really badly when moving over the dark background but is super smooth on a white background.
 
Does lowering pulse width make a huge difference? I mean I only had the Swift for a couple of days before I returned it, but I ran the blur busters scrolling map test with pulse width at 100% and the text was perfectly clear. So why even lower it? I guess it helps with really fast motion?
 
When talking about browser issues you should post which browser you are using since it could be an issue particular to a certain browser. I think there were issues with chrome posted earlier in this thread. Also post whether you are using any 60hz monitors on the same video card. Enabling scaling on the 60hz monitors fixes judder on videos.

There is a Blur Busters article about mouse pointer skipping at lower poll rates. You could try turning your mouse polling rate up to 1000 Hz:
http://www.blurbusters.com/mouse-125hz-vs-500hz-vs-1000hz/

I thought I'd mention the mouse polling rate thing but I doubt that is your problem since it is very subtle and you are probably talking about some overt skipping in chrome or something.
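For a rough sense of why polling rate matters on a high-refresh panel, here's a quick back-of-the-envelope calculation. These are just the standard USB polling rates, nothing measured on this monitor:

```python
# Back-of-the-envelope: with fewer mouse position reports per second than the
# panel has refreshes, some frames simply repeat the previous cursor position,
# which reads as skipping.
REFRESH_HZ = 144

for poll_hz in (125, 500, 1000):
    reports_per_frame = poll_hz / REFRESH_HZ
    note = "  <- some frames get no new position" if reports_per_frame < 1.0 else ""
    print(f"{poll_hz:>4} Hz polling -> {reports_per_frame:.2f} reports per frame{note}")
```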
For competitive players concerned about responsiveness, I would always suggest using high polling rates, high frame rates (with maybe a cap for consistency), high refresh rates and no v-sync (and preferably G-Sync, or soon FreeSync). Many competitive players I have seen don't use G-Sync yet, play without v-sync anyway and are perfectly fine with tearing. If you don't mind tearing, the issue of polling rate/refresh rate misalignment certainly won't bother you, as it's far too subtle for most people to notice or care about.
 
120 Hz. Latest Firefox. One monitor connected, 2-way 780 SLI enabled. Mouse polling set to 1000 in Logitech software. Stock DP cable.
 
Does lowering pulse width make a huge difference? I mean I only had the Swift for a couple of days before I returned it, but I ran the blur busters scrolling map test with pulse width at 100% and the text was perfectly clear. So why even lower it? I guess it helps with really fast motion?

Well, for me, there is noticeable ghosting at ULMB 100% Pulse Width. So I won't use that ever. It's literally worse than not using ULMB at all.

For me, ULMB at 10% does not produce any ghosting. But heck, it might just be so dark that I don't see any to begin with.
 
I can't go below 80 or I start to feel like I'm straining my eyes to see players (especially if they are off in the distance, or in a dark scene / at night), which kind of defeats the reason I got the monitor in the first place (to improve my PvP game).

It reminds me of how dim 3D movies were when 3DTVs first came out, a very annoying trade off.
 
My impressions of the Swift so far... G-Sync in combination with the Swift's turbo modes is damn good. It seems to be clearest when setting the base refresh rate closer to your average fps (I've been using 85 Hz for games that move in the 50-90 fps range). I was very surprised at how playable 30 fps (The Evil Within) was. The response time of the screen is important; IPS may not work as well.

SLI and G-Sync aren't a good combination in general; in some games it is terrible (Watch Dogs) and actually performs worse! Not sure if it will ever be ironed out. A single GPU is just smoother, with no stutter.

Colors are fine, pixel density is good. I think I may sell my Eizo FG2421 and QNIX 27" IPS and just settle on the Swift. Pretty solid tech. I hope drivers for my 980s continue to improve in terms of G-Sync.

Does the screen require firmware updates or usb attachment to the pc?
 
I got my Tuesday Newegg order yesterday and it's good so far. The Newegg inventory sticker on the box said it was from a batch of 50.

Ten weeks and four swifts later I seem to have one without hardware issues. Just in time for DA:I too.
 
One odd thing I noticed with ULMB on is that the mouse cursor in the [H] forum stutters really badly when moving over the dark background but is super smooth on a white background.

Can't confirm. Just tried and there is no difference between 120 Hz w/ULMB and 144 without.

SLI and G-Sync aren't a good combination in general; in some games it is terrible (Watch Dogs) and actually performs worse! Not sure if it will ever be ironed out. A single GPU is just smoother, with no stutter.

This is not my experience at all. I find SLI with G-Sync runs rather smoothly. I think this might be something that is game-specific. I've recently been playing Alien Isolation and Shadow of Mordor and both run nicely with SLI.
 
Can't confirm. Just tried and there is no difference between 120 Hz w/ULMB and 144 without.



This is not my experience at all. I find SLI with G-Sync runs rather smoothly. I think this might be something that is game-specific. I've recently been playing Alien Isolation and Shadow of Mordor and both run nicely with SLI.
My thought is if the game doesn't work well with SLI then G-sync is also going to suffer. All the games I've tried so far (Battlefield 3, The Evil Within, Metro 2033) work great with G-sync in SLI. Watch_Dogs has always been bad with SLI, so it's not surprising G-sync doesn't work well with SLI in that game.
I got my Tuesday Newegg order yesterday and it's good so far. The Newegg inventory sticker on the box said it was from a batch of 50.

Ten weeks and four swifts later I seem to have one without hardware issues. Just in time for DA:I too.
Yeah, seems like a good batch. I haven't heard of any complaints from it so far. I would go so far as to say the one I got is perfect. I'm happy now that I was not able to grab one early on.
 
Can't confirm. Just tried and there is no difference between 120 Hz w/ULMB and 144 without.



This is not my experience at all. I find SLI with G-Sync runs rather smoothly. I think this might be something that is game-specific. I've recently been playing Alien Isolation and Shadow of Mordor and both run nicely with SLI.

That's exactly what I said. A single GPU is smoother than SLI with G-Sync, fact. Being happy with it is more subjective than comparing the two. Stop justifying your purchase and covering up valid information.
 
What he is saying is that Watch Dogs is coded badly and is known to run poorly on SLI, whereas his other favorite games do not suffer from that poor implementation. Some console-targeted games tend to be badly optimized/coded for PC, some much worse than others. Watch Dogs has also been said to benefit less from G-Sync in general because it is coded poorly; that was posted in this thread, as a matter of fact, if I remember correctly.

From another forum:
Watch Dogs on PS4 plays "smoothly" because it's native 900p and it runs at 30 fps (not 60 fps), due to frame pacing. If it was running at 60 fps, it would be stuttering as badly as the PC version. Don't forget that half the effects on the PS4 version are also turned way down or off completely too.

From [H] Review of watchdogs:
When this game is run with "High" textures it looks like a game that is several years old. Crysis 3 from over a year ago looks incredibly better. It really is a mystery why the game requirements are so high in order to run with "Ultra" textures. Whether it be some mistake in programming, or negligence or lack of efficiency on the developers we may never know.

The requirements are darned high for "Ultra" textures and it doesn't seem like it needs to be. We've seen the kind of textures we see in "Ultra" quality in other games, and those don't have stuttering problems. We cannot explain it, only to say that the last two patches have not fixed the problem. We are still waiting ultimately for this game to be fixed, but we fear that may never happen, or it may take a really long time.

The point is this, running this game at anything lower than "Ultra" textures is just not that fun, and it seems a waste of money just to run this game at a low texture quality. Without at least Ultra textures Watch Dogs simply does not look good and it loses its immersive quality with its terrible textures. One thing we did find out though is to make sure you force 16X AF from your driver control panel if you want better image quality in this game. It makes the "Ultra" textures pop in detail and clarity a lot better. It doesn't make this game look anywhere near E3 quality, but it is one easy thing you can do and control to make it look better.


As for smoothness outside of badly coded games like Watch Dogs with its reported stuttering frame rates: you aren't going to get the greatly increased motion definition and motion articulation that a 100 fps+ average provides on a 1440p 120-144 Hz monitor at high (custom) to ultra settings with any single-card solution.

 
As for smoothness outside of badly coded games like Watch Dogs with its reported stuttering frame rates: you aren't going to get the greatly increased motion definition and motion articulation that a 100 fps+ average provides on a 1440p 120-144 Hz monitor at high (custom) to ultra settings with any single-card solution.


I'm talking about SLI and G-Sync. I'll take 100 fps+ when I can; I'll probably stick with the FG2421. SLI has added more stuttering in all the games I've played with G-Sync, some more than others. I wish we could all agree on this and call it like it is. If a person doesn't mind it (I don't, with games using SLI properly), that's different. I guess I am used to quality motion and notice it more.

Watch Dogs is an outlier for sure, yet it is pretty darn smooth with a single GPU with G-Sync. Without G-Sync, SLI does a good job with it.
 
Could be some random driver issues with SLI + G-Sync:


From forums: some people have the stutter issue with SLI + G-Sync, others don't, and apparently some miraculously/randomly fix it with fresh installs.
When I first started using it for games I was getting stuttering in SLI while playing every game I had installed (L4D2, BF4, Max Payne 3, etc.), but when I turned off SLI there was no stutter. Then one day when I turned on my computer, one of the SSDs I had installed at the time had failed. At that time I had two Samsung 840 EVO 1 TB drives in RAID 0, so I did a fresh install of everything onto two Samsung 830 SSDs, 512 GB each (I have 8 Samsung SSDs in all, 512 GB and up, BTW). After I finished installing everything and started playing again, guess what: no stutter of any kind in SLI. I checked, then double-checked, then checked again; SLI on, G-Sync on, and no stutter of any kind. No lie, it is even smoother, and BTW the drivers were always 344.11 from the GeForce website. Will this work for other folks with the same issue? I don't know, but it is worth a try. That's my 2 cents...



Also, as always with microstutter (outside of any possible SLI + G-Sync driver issues), its effect is exacerbated by non-high average frame rates: even if the graph is uneven, the whole microstutter fps range sinks lower, so the variable rate is alternately dipping into sludge/molasses fps and back up to only a medium frame rate. Conversely, keeping a high average fps means any microstutter dip never sinks to those very low frame rates. If you are starting out at a low average frame rate to begin with, forget about it; a microstutter fps graph is going to bottom out completely on dips. I've always said G-Sync really isn't a "fix" for 30-40 fps; it's more of a no-v-sync fix for frame rate variation above and below a decent (not low) frame rate.
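To put rough numbers on that, here's a quick illustration; the 30% frame-time swing is just an arbitrary example, not a measurement:

```python
# Illustration only: the same relative frame-time swing (an arbitrary +/- 30%
# here) hurts far more when the average frame rate is already low.
def stutter_range(avg_fps: float, swing: float = 0.30):
    """Return (low_fps, high_fps) if frame times vary by +/- `swing`."""
    avg_frame_ms = 1000.0 / avg_fps
    return 1000.0 / (avg_frame_ms * (1 + swing)), 1000.0 / (avg_frame_ms * (1 - swing))

for avg in (35, 60, 100):
    low, high = stutter_range(avg)
    print(f"avg {avg} fps -> dips to ~{low:.0f} fps, peaks near ~{high:.0f} fps")
```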
I have SLI Titans and a PG278Q. G-sync + SLI has more stutter than G-sync without SLI but less than anything without G-sync. It took me a while to notice it but after testing at very low frame rates (~35 fps) it suddenly became very obvious to me in Tomb Raider and now I can see it even at higher frame rates. At 100+ Hz the only way I notice it is more apparent motion blur as the stutter is too small to notice/feel.

This guy is running three gpus with three swifts and seems to be saying everything runs silky smooth.
http://www.pcper.com/reviews/Graphics-Cards/NVIDIA-G-Sync-Surround-Impressions-Using-3-ASUS-ROG-Swift-Displays
Note that it has been said that running three GPUs might minimize microstutter, since it is no longer a 1-2 cycle; having three GPUs may smooth out the microstutter graph greatly for those super sensitive to it in general.

People in that forum are saying SLI + G-Sync is flawless, while others in the same thread are saying it's a stuttering mess: http://forums.overclockers.co.uk/showthread.php?p=27052031
Seems like it might be a driver issue/bug that doesn't affect everyone. And for those experiencing it, it might also be exacerbated by certain games and lower frame rates.

It's pretty obvious that anyone who wants high framerates, even moderately high frame rates on the most demanding games and upcoming games (at high+ to ultra settings, and modded even higher perhaps) at 1440p is going to have to use more than one gpu.
 
This has got to be one of the worst monitors ever created in terms of quality control.

I just bought two of them from Microcenter. You would think at least one of them was good.

Nope. Both of them have multiple dead and/or stuck pixels. Both of them have darkened bottom right corners, one more so than the other. Probably bruising considering the styrofoam presses against the screen. One of them has this horrendous backlighting issue on the right side.

I don't understand it. This is my 4th one now. Every single one of them is defective. How the hell can they charge $800 for this piece of crap? It's criminal, it really is.

I'm sure Microcenter is going to love me for returning $1600 worth of monitors.
 
Very slightly dark in the lower right corner (probably due to the control panel being there; this has happened on other monitors I've owned).

I never noticed it in the past 3 months, and even now that I know it's there, I doubt I will notice it ever again unless I set up a white background and look at it. So it's pretty moot.
 
Can you guys check your bottom right corners? I'm curious if this is widespread because I'm seeing it in both of mine.

Mine looks fine, just got it yesterday from Newegg. Not sure where to look for a batch number or whether it has any significance to when or where you got yours. Maybe QA was taking an eggroll break in China when yours was pushed down the production line. (I'll regret that statement when I sober up.) What I meant to say is the 12-year-old that put your monitor together just ate some candy and didn't wash their hands before they returned to finish applying the matte coating to your display. Sorry, that was wrong too. I feel your pain, and $800 hurts.
 
I got my ROG Swift Friday from Newegg -- pretty quick shipping since I specified the cheapest option ($7.59) and ordered on Tuesday. I was horrified to see that they didn't put it in another box -- just the commercial container. It did arrive safe and sound though -- no damage and no dead pixels.

So far though, I've seen no motion blur difference compared to my 60 Hz monitors. I've set the Nvidia control panel for G-Sync and set the monitor refresh rate to 2560x1440 at 144 Hz. It shows 2560x1440 144 Hz, mode Normal, in the monitor's System Setup -> Information page. I was expecting to see some difference while just moving windows around, or even the mouse for that matter. The Blur Busters UFO test doesn't look any different to me either. It shows 144 fps, 144 Hz, 7 pixels per frame, 960 pixels per sec, 3 UFOs and a green Valid.

I'm assuming something's wrong, or my eyes just can't see the difference.
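For what it's worth, the numbers the test reports are at least internally consistent; a quick sanity check:

```python
# The test's own numbers check out: 960 pixels per second at 144 frames per
# second is about 6.7 pixels of movement per frame, which it rounds to 7.
pixels_per_second = 960
fps = 144
print(pixels_per_second / fps)  # ~6.67 px/frame
```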
 
So, with the repeated mentions that Microcenter has them, I drove about an hour to the closest location and picked one up.

And... I kinda love it. There was an initial scare where I had a stuck bright green pixel while the background was black, but after the screen was on for about a minute and I had moved some windows around it started responding and hasn't been an issue since.

So zero dead/stuck pixels now, maybe the best backlight uniformity of any TN panel I've ever had, etc.

And Gsync is just a great feature in general. I didn't quite understand what difference it might make until seeing it function.

I haven't really put it through its paces yet, mostly just played my usual diet of Diablo 3 and Starcraft 2, but I plan to throw some BF4 and other more graphically intensive games at it. I don't think I have Crysis 3 installed right now, might throw that one back on just to look at all the sparkly graphics.
 
I'm seeing it - very faint and about the size of the tip of my index finger.
Very slightly dark in the lower right corner (probably due to the control panel being there; this has happened on other monitors I've owned).

I never noticed it in the past 3 months, and even now that I know it's there, I doubt I will notice it ever again unless I set up a white background and look at it. So it's pretty moot.
 
Can you guys check your bottom right corners? I'm curious if this is widespread because I'm seeing it in both of mine.

Go to this webpage and click "White", then go fullscreen with F11.

http://jasonfarrell.com/misc/deadpixeltest.php

Look at the bottom right corner. Is it slightly darker than the rest?

The last two I've had have this little dark "smudge". Don't recall on the first three. It doesn't bother me because it only becomes noticeable on a bright cutscene or loading screen.
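If you'd rather not depend on a website, a throwaway script like this (just a sketch using Python's standard tkinter module) puts up a full-screen white window for checking corners and stuck pixels:

```python
# Convenience sketch: a full-screen pure-white window for inspecting backlight
# uniformity and stuck pixels. Press Escape to close it.
import tkinter as tk

root = tk.Tk()
root.attributes("-fullscreen", True)
root.configure(background="white")
root.bind("<Escape>", lambda event: root.destroy())
root.mainloop()
```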
 
I've had a ROG Swift running for about a week now. Having the same problem as this guy:
http://www.amazon.co.uk/review/R2LW...etail-glance&nodeID=340831031&store=computers

Fine vertical lines when there is fast motion or panning on large sections of the screen. I just can't unsee it. Talked to newegg CS today and they treated me real well -- getting a refund with shipping label, no fee, etc. I had high hopes but this monitor's panel let me down.
 
That's exactly what I said. A single GPU is smoother than SLI with G-Sync, fact. Being happy with it is more subjective than comparing the two. Stop justifying your purchase and covering up valid information.

My experiences with SLI haven't been any different from single GPU other than having much higher FPS, G-Sync or not. I'd think G-Sync shouldn't come into the equation really as microstutter manifestation can differ from game to game based on how it renders stuff and what kind of multi-GPU optimizations it has. The difference to a single GPU can be anything from nonexistent to noticeable.

I don't need to justify my purchase to anyone, much less myself. I'm very happy with both my SLI rig and Swift.
 
Ok, this is what I figured out about the blurbuster test. My current setup has 2 24" 60hz monitors on each side of the Rog Swift using separate displayport connections. If I disconnect these monitors, and then try the test -- it works as expected. I can then see a nice smooth horizontal scroll of the UFO's.

Is this a problem with using mixed refresh rate monitors in general, or only a DisplayPort issue? I read another thread on this forum from a couple of years ago (probably with DVI) that said mixed refresh rates made no difference unless you were trying to use Eyefinity or SurroundView. I just have the extra 60 Hz monitors as extended desktop so that they have independent resolution/refresh rate.

Anybody else with mixed 60/120hz monitors seeing the same thing?

So far though, I've seen no motion blur difference compared to my 60 Hz monitors. I've set the Nvidia control panel for G-Sync and set the monitor refresh rate to 2560x1440 at 144 Hz. It shows 2560x1440 144 Hz, mode Normal, in the monitor's System Setup -> Information page. I was expecting to see some difference while just moving windows around, or even the mouse for that matter. The Blur Busters UFO test doesn't look any different to me either. It shows 144 fps, 144 Hz, 7 pixels per frame, 960 pixels per sec, 3 UFOs and a green Valid.

I'm assuming something's wrong, or my eyes just can't see the difference.
 
[Attached image: Surround.jpg]


I got it all setup and it is glorious!!! Triple Rog Swift, 7680x1440, and in 3D Vision Surround.

However, even my beastly machine is no match for this insane setup. For example, with 2D Surround w/ G-Sync on Bioshock Infinite I could play on High settings and get a smooth 100 fps. With stereo 3D that performance tanked, and I was having to use Low settings to get the smoothness back. L4D was totally smooth in 3D Surround, but that game was never very intensive.

Still need to test more, but so far I am digging it.
 
My video playback would stutter badly when I first mixed my samsung a750D 120hz monitor into my 60hz monitor array. I found a post somewhere online back then saying that turning scaling on those monitors made it go away. I turn scaling on in the nvidia drivers on all of my 60hz monitors, but not my gaming monitor. If you are having issues with a mixed refresh rate monitor array you might want to try that out.

Edit: I am still doing the same thing with my swift replacing the a750d in the same monitor array. However, my 780ti only has one displayport output. The dvi output is using a powered adapter to minidp which goes to my cinema display, and the two side monitors are on dvi/hdmi too. I have no idea whether the scaling thing would work or not on multiple displayport outputs off of one gpu, but I'd be interested to know if you can try it out.
 
Ok, this is what I figured out about the blurbuster test. My current setup has 2 24" 60hz monitors on each side of the Rog Swift using separate displayport connections. If I disconnect these monitors, and then try the test -- it works as expected. I can then see a nice smooth horizontal scroll of the UFO's.

Is this a problem with using mixed refresh rate monitors in general, or only a DisplayPort issue? I read another thread on this forum from a couple of years ago (probably with DVI) that said mixed refresh rates made no difference unless you were trying to use Eyefinity or SurroundView. I just have the extra 60 Hz monitors as extended desktop so that they have independent resolution/refresh rate.

Anybody else with mixed 60/120hz monitors seeing the same thing?

It's a Windows problem. Turning off Aero and removing everything from the other monitors when you want to play a game will prevent any issues like that. It's a real pain in the ass and there's no real way to fix it.
 
When you say remove everything, does that include icons or just running apps? I guess either way it totally eliminates any usefulness of the other monitors. Even something like vent/teamspeak/mumble would cause an issue, right?

If this is the case then I guess I'll just create a single display 120hz gaming machine and have another system with multiple 60hz monitors for a general purpose usage.


It's a Windows problem. Turning off Aero and removing everything from the other monitors when you want to play a game will prevent any issues like that. It's a real pain in the ass and there's no real way to fix it.
 
Does anyone know why the Blur Busters moving photo test is stuttering for me? I'm running ULMB 120 Hz. No tabs open, nothing running in background. Running the latest Firefox.

http://www.testufo.com/#test=photo&photo=toronto-map.png&pps=960&pursuit=0&height=0

I don't recall this happening with my first Swift. But I did have a different video card back then (680 GTX then, 970 GTX now).

Are there any other scrolling tests (perhaps non-browser) that I can use to make sure that I'm not stuttering everywhere?
 
My monitor array is mixed 60 Hz and 120-144 Hz. I don't have any problems with video playback or other stuttering once I enable scaling in the Nvidia control panel on the 60 Hz monitors. However, during a game session while I was minimizing the game with G-Sync enabled (to check my email or something), my Aero theme/taskbar on the bottom of the monitors blinked back to Windows Basic. I'm still using Aero at the moment, and it's not really a big deal other than that Aero dropout when occasionally minimizing a G-Sync game. I run blue/black outer-space fantasy wallpapers on my monitors, so I could just change the Windows Basic theme to black if I had to, or decide to turn off Aero for good.

I had considered running one of my other PCs to the Cinema Display side of the desk on a long switchable run before, because when I had a 6990 I similarly had to run full screen (for dual GPUs to be used properly). I decided against it. It might be more convenient than (somewhat clunkily) minimizing games every time I want to do anything else on the PC during a game session, but having the monitor array and system power available for image editing, compression/decompression, rendering, and overall PC usage would be missed outside of gaming sessions, so I'd probably have to do it with a switchable setup. Switching the whole setup back and forth that way isn't really more convenient than just minimizing a game, so I'm keeping the setup as it is for now. I still have the option to make the Cinema Display + portrait-mode monitors dedicated to my spare-room PC entirely someday. I'd just have to disable the spare-room monitor when not at that desk, and vice versa, so I wouldn't lose windows, popups, alerts, etc. "offscreen" on a different room's monitor.
 
If I don't run an Aero theme on my desktop, the fastest the blurbuster ufo test will go is 60 fps - it does 144 fps with Aero. My scaling function was on aspect ratio/gpu on the Rog Swift, the other monitors were set to aspect ratio/monitor. I fooled around with the settings and think no-scaling on the Rog Swift might work better but it didn't fix the blur problem.

I'll probably try the 60hz monitors on DVI next and see if that makes a difference.

[edit] The DVI/HDMI cables made no difference. I went back to a non-Aero theme with only a single monitor connected and the UFO test worked. I then added the 60 Hz monitors via DisplayPort while the test was running and it continued to work correctly without blur. As soon as I restarted the web browser, the maximum fps was again 60, and this was true in Chrome, Firefox, and Opera. I have to believe at this point that either the test is broken or a mixed monitor configuration just doesn't work how it should. This is with a single EVGA GTX 980 card.
[/edit]
 