ASUS Announces ROG SWIFT PG278Q Premium Gaming Monitor

....
At low Hz and low fps you get greatly reduced motion definition and control definition as well.
http://www.monitortests.com/yamakasi...mouse-60hz.jpg
http://www.monitortests.com/yamakasi...ouse-120hz.jpg
Far fewer new action/animation/world-state slices are shown; you sit through longer "freeze frame" periods during which a high-Hz + high-fps player is seeing up to several newer updates. Compared to a 120Hz + 120fps user:

1/2 the motion + control definition and opportunities to initiate actions in response at 60Hz/60fps (the 120Hz/120fps user sees double the motion tracking and animation definition).
1/3 the motion + control definition and opportunities to initiate actions in response at 40fps (the 120Hz/120fps user sees three more-current scene-update slices to your one, 3x the motion tracking).
1/4 the motion + control definition and opportunities to initiate actions in response at 30fps (the 120Hz/120fps user sees four more-current scene-update slices to your one, 4x the motion tracking).
If you aren't supplying a more recent, unique frame of action for each new screen update (Hz), for example 60fps on a 120Hz monitor, you are basically just seeing a freeze-frame of the last outdated frame of action for 16.6ms (compared to an 8.3ms-per-slice update for a 120fps @ 120Hz user).
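To make that arithmetic concrete, here's a rough back-of-envelope sketch (my own illustration, not from the original post; the function name and the 120Hz/120fps reference are just chosen for the example):

```python
# A rough back-of-envelope sketch of the "freeze frame" arithmetic above.
# The function name and the 120Hz/120fps reference are just for illustration.

def motion_definition(hz: float, fps: float, reference_fps: float = 120.0):
    """Average time each rendered action slice stays on screen, and how many
    newer slices a 120Hz/120fps player sees for each one of yours."""
    unique_per_second = min(hz, fps)           # can't show more unique frames than refreshes
    hold_time_ms = 1000.0 / unique_per_second  # average "freeze frame" duration
    ratio = reference_fps / unique_per_second  # reference player's updates per one of yours
    return unique_per_second, hold_time_ms, ratio

for hz, fps in [(120, 120), (120, 60), (60, 60), (60, 40), (60, 30)]:
    unique, hold, ratio = motion_definition(hz, fps)
    print(f"{hz}Hz @ {fps}fps: {unique:.0f} unique slices/s, each held ~{hold:.1f}ms; "
          f"a 120Hz/120fps player sees {ratio:.0f} update(s) per one of yours")
```

That reproduces the numbers above: ~16.7ms holds at 60fps versus ~8.3ms at 120fps, and 2x/3x/4x as many scene updates for the 120Hz/120fps player at 60/40/30fps.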


[Image: 60hz-120hz-30-15_onesecondframe.jpg]


More unique action frames at 120Hz, but only when a new action-frame slice is fed to each refresh (Hz)
(otherwise it is "freeze-framed", skipping slices of world action/animation states).
This graphic actually understates the difference: it shows 5 frames vs 3, where 120Hz @ 120fps vs 60Hz @ 60fps would be a doubling, six more-recent action-slice frames to every three, so the motion-definition difference is even greater than shown.

[Image: 120hz-vs-60hz-gaming.jpg]


Borrowed toastyX's pics. Desktop rendering always has enough frames per second to fill 120Hz, so it's an easy example. However, other than its motion path, it isn't an animated object in itself. Higher fps + Hz together show greater animation definition as well as motion definition.

[Image: 120hz-compared_mouse-60hz.jpg]


[Image: 120hz-compared_mouse-120hz.jpg]


There is a 120fps game-recording movie sample here, but of course you need a 120Hz monitor to see 120 frames per second in it. I don't think it includes a 60fps cut for comparison though, unfortunately.
http://www.blurbusters.com/hfr-120fps-video-game-recording/
 
Probably something to do with that 930... and maybe, you know, wasting power on anti-aliasing at such a high resolution when you could just use SMAA...

Even at 1440p I find AA makes a nice difference, so I will disagree with you that it is a waste. It doesn't give me much of a hit anyways.

I would have expected my 930 to be holding me back a little as I approach 120fps, but the numbers from my 930 / SLI 780 combo compare quite well with the numbers elvn posted for that 3930 / SLI 780 Ti setup, so perhaps it isn't holding me back much at all. So SLI 780 Ti paired with one of the fastest consumer processors you can buy doesn't even get you a steady 120fps in BF4. I rest my case.

I have a feeling this monitor will be prompting a lot of us at the [H] to upgrade.
 
I know I'll be upgrading my rig with these monitors. Most likely I'll get one to start with and buy two more for a tri monitor setup after I get my rig. I'm still a bit curious as to what the whole "starting at $799" slide is all about though.

I'm running an Intel 975 with a single 580 GTX now (two 'monitors' at 1080p - 24" and 32"), but if I have any hope of playing things at max res, and ultra, I'm thinking an 8 core haswell-e with hopefully by then three GTX 880's shoved in there. Though with that I'll most likely have to water-cool it, that'll be a first time for me.

The biggest problem is I need a bigger desk to put three monitors on!
 
I know I'll be upgrading my rig with these monitors. Most likely I'll get one to start with and buy two more for a tri monitor setup after I get my rig. I'm still a bit curious as to what the whole "starting at $799" slide is all about though.

I'm running an Intel 975 with a single 580 GTX now (two 'monitors' at 1080p - 24" and 32"), but if I have any hope of playing things at max res, and ultra, I'm thinking an 8 core haswell-e with hopefully by then three GTX 880's shoved in there. Though with that I'll most likely have to water-cool it, that'll be a first time for me.

The biggest problem is I need a bigger desk to put three monitors on!

Well, buy a bigger desk! Duh! :p
 
I'm really set up better for a 1080p 120Hz ULMB/backlight-strobing monitor with my single card at present. 1080p is really the best rez for hitting 120fps in demanding games at the higher end of enthusiast budgets right now, without going to an extreme GPU budget to feed 2560x1440 at 120fps.

The release schedule for the new ULMB monitors is staggered, and 20nm GPUs won't be out until the end of the year. I'm not planning on upgrading from my 3930K (OC'd at a safe 4.6GHz currently) for a while, but I would like to get 20nm SLI in the Christmas/tax-return period at the end of the year. It's not worth it for me to get another 780 Ti only to blow a lot of money on 20nm SLI ~8 months after this monitor comes out. I should probably wait out this monitor for a bit and see what people think of it, but it sounds really nice, so I am tempted to buy it "ahead of time", well before the 20nm SLI upgrade of my rig. For now a single 780 Ti SC is giving me decent fps on my 1080p 120Hz Samsung, but it doesn't have good enough backlight-strobing tech for me to get zero blur (it still blurs during the strobe and adds a lot of input lag, so I don't enable that). The Eizo 120Hz backlight-strobing monitor (FG2421) sounds cool, but I prefer 27" monitors in my setup and would like to jump to 2560x at year end, so again it would be stupid to buy one for 10 months. :p

If this monitor turns out to be great according to early adopters, I'll probably end up grabbing one and playing some Source games and less demanding Steam-library games in ULMB mode at over 120fps. I can get 90-140fps or more in Borderlands 2 at 1080p, so that won't be happening at 2560x even in games at that level of demand unless I turn a lot of stuff down. Tomb Raider etc. would have to go way down, or drop ULMB mode for G-Sync dynamic Hz, on the most demanding games until the 20nm SLI upgrade at the end of the year.
 
I want to know if Asus is making the same version of this monitor but without V-Sync adapter.
 
I want to know if Asus is making the same version of this monitor but without V-Sync adapter.

You mean without G-Sync?

Why? So that it costs only a mere $700 instead of $800? :rolleyes:

You can turn off G-Sync for whatever perplexing reason you'd want to. If you're an AMD user, then that's understandable; the screen will work fine for you. However, if you have the bones to throw down for a display like this, you've got the bones to throw down for the better-suited and more than likely superior tech (Nvidia).

Nevertheless, there is no way they'll release a non-G-Sync version of this monitor; they won't even release a glossy version, as they've stated it would cost too much to produce both matte and glossy, and that's just a blurry sheet of plastic. Given that, creating another SKU sans the G-Sync circuitry this model is built around, to cater to people still in the dark ages so to speak, would obviously be out of the question.

So no.
 
You mean without G-Sync?

Why? So that it costs only a mere $700 instead of $800? :rolleyes:


Nevertheless, there is no way they'll release a non-G-Sync version of this monitor; they won't even release a glossy version, as they've stated it would cost too much to produce both matte and glossy, and that's just a blurry sheet of plastic.

So no.

sad
 
If you have the bones to throw down for a display like this, you've got the bones to throw down for the better-suited and more than likely superior tech (Nvidia).
AMD cards run circles around nVidia at high resolutions and can pay for themselves via mining, but nVidia has the superior tech? :rolleyes: Yeah I suppose when your big money nVidia card runs out of VRAM and chokes you are going to really need that GSYNC.
 
AMD cards run circles around nVidia at high resolutions

Demonstrably false. Even [H]'s own comparison (one of many out there) came to that conclusion:

In our evaluation today we have found out that the new GeForce GTX 780 Ti equalizes the gameplay experience with Radeon R9 290X at Ultra HD 4K display gaming. For the most part, performance was similar between the two video cards. There were a couple cases where the 780 Ti was small percentages faster, and a couple where the R9 290X were small percentages faster. On the whole, both cards are so similar to each other at Ultra HD 4K gaming that it would be impossible to discern between these while gaming on a 4K display. In all of our testing today, we tried to look for the differences between 3GB and 4GB of VRAM while gaming. We encountered no scenarios in these games where the 3GB of VRAM on the GTX 780 Ti was holding it back at Ultra HD 4K gaming. We also encountered no scenarios where the 4GB was an advantage on the R9 290X.

My 780Ti also runs at 1424 MHz for 24/7 gaming and 1529 MHz benching, clocks a 290x couldn't even dream of reaching. All the while using less power and putting out less heat clock for clock.
 
I know I'll be upgrading my rig with these monitors. Most likely I'll get one to start with and buy two more for a tri monitor setup after I get my rig. I'm still a bit curious as to what the whole "starting at $799" slide is all about though.

I'm running an Intel 975 with a single 580 GTX now (two 'monitors' at 1080p - 24" and 32"), but if I have any hope of playing things at max res, and ultra, I'm thinking an 8 core haswell-e with hopefully by then three GTX 880's shoved in there. Though with that I'll most likely have to water-cool it, that'll be a first time for me.

The biggest problem is I need a bigger desk to put three monitors on!

I just bought the desk.

I was putting together a build list (started a thread a couple days ago) with a goal to run three of these on ultra settings at 120hz. I am having doubts that a 4930k water cooled with 2 or (even 3) water cooled Titan blacks can handle it.

I am someone who rarely waits for next-gen cards because there is always something better closely approaching, but with the new Maxwell cards coming out later this year and the high demands of current monitors, I am starting to believe an expensive build like the one above should be postponed so that games over the next year or two can be enjoyed on ultra with great fps. If I can't enjoy maxed settings for at least a couple of years on a costly build like that, then it just doesn't seem worth it to me.

Any thoughts?

On another note, I was thinking of starting a thread about this 21:9 "true 4k" monitor as it was the other screen on my short list. I can't find info on refresh rates (hopefully at least 60hz) or response times, but it looks legit.
http://www.pcworld.com/article/2081...cinematic-ultrahd-pc-monitor-at-ces-2014.html
 
Demonstrably false. Even [H]'s own comparison (one of many out there) came to that conclusion:



My 780Ti also runs at 1424 MHz for 24/7 gaming and 1529 MHz benching, clocks a 290x couldn't even dream of reaching. All the while using less power and putting out less heat clock for clock.

Point taken, I see the 780Ti is a phenomenal card but we can all thank AMD and their "inferior technology" for pushing nVidia to come up with it in the first place.
 
we can all thank AMD and their "inferior technology" for pushing nVidia to come up with it in the first place.

This is very true. I only wish AMD could magically come across many billions of dollars to lubricate their R&D with, because they clearly have the brains, just not enough resources, it seems. Imagine a world where these two companies are neck and neck for the foreseeable future; a monumental struggle for the crown lasting decades, producing the most mind-blowing results for both sides and, more importantly, for us. Instead it's this cat-and-mouse bullshit. I want AMD to bring it on again so we can all benefit.

So until then, yeah, nvidia, superior tech, nanny nanny boo boo.
 
Point taken, I see the 780Ti is a phenomenal card but we can all thank AMD and their "inferior technology" for pushing nVidia to come up with it in the first place.

Ya, I'd hate to see what would happen if either side went away from the high-end GPU market.
 
I would think review samples would be shipped by now, unless they're hushed under NDA. Which is entirely possible, as Nvidia is still working the kinks out of G-Sync.
 
I just bought the desk.

I was putting together a build list (started a thread a couple days ago) with a goal to run three of these on ultra settings at 120hz. I am having doubts that a 4930k water cooled with 2 or (even 3) water cooled Titan blacks can handle it.

I am someone who rarely waits for next-gen cards because there is always something better closely approaching, but with the new Maxwell cards coming out later this year and the high demands of current monitors, I am starting to believe an expensive build like the one above should be postponed so that games over the next year or two can be enjoyed on ultra with great fps. If I can't enjoy maxed settings for at least a couple of years on a costly build like that, then it just doesn't seem worth it to me.

Any thoughts?

On another note, I was thinking of starting a thread about this 21:9 "true 4k" monitor as it was the other screen on my short list. I can't find info on refresh rates (hopefully at least 60hz) or response times, but it looks legit.
http://www.pcworld.com/article/2081...cinematic-ultrahd-pc-monitor-at-ces-2014.html

There's always the argument to buy now so you can enjoy it that much sooner, since something new is always around the corner. Personally I find that argument totally unacceptable. Yes, there's always something around the corner, but the longer you wait then (in theory anyway) the more you save by not upgrading your machine time and time again.

Spending less money overall for upgrades = other things I can buy (never thought I'd actually admit that as I do so enjoy new stuff for my computer)!

On topic though, looking at the 750 / 750 Ti, there's certainly an increase in performance coming from the Kepler series. That's to be expected of course; however, we don't know exactly how much better the 880s will be. I do know one thing with about 99% certainty though: we won't see nVidia cards in the 700 series with dual DisplayPorts. There's no way with Kepler cards we'll be able to surround these monitors, but MAYBE we'll see something like that on the Maxwell cards (don't hold your breath).

Ultimately NVidia will move there - it's the next logical step with G-Sync, but if we get dual display ports on one card, they are going to need to seriously ramp up that card's capability.

I for one do want to see what they do with their Maxwell series, and perhaps by the time the end of the year rolls around, some components will have come down in price for my new rig. Haswell-e is definitely what I'm waiting for as there's next to no reason for me to go from a 975 to a sandy-e or ivy-e.

Another thing to keep in mind is supposedly haswell-e works with DDR4 ram. That's not everywhere yet, but it's another thing to keep in mind - especially a year or two down the road.

I digress from the thread's topic though... I haven't found anything further on these monitors online but from what I've heard (total word of mouth at this point) from some of the local vendors around here is the monitor actually shows up in some of their systems as being on order, so we most likely will hear something between now and the end of April I'd think!
 
I have a U2713h and this monitor would be amazing right next to it. One monitor for gaming, one for everything else.

But I would feel pretty dumb for spending more on a TN panel than on my IPS (especially since my U2713H took three RMAs before it was flaw-free; it is, however, freaking fantastic now).
 
This sounds really disturbing, quoted from pcdiy asus comments:

some dude: "JJ please reassure this old girl that native 1440p 3D on the ROG swift just ain’t so and that its some gimmicky upscaled 1080p dipperdoodle trash."

JJ: "3D on the SWIFT is implemented per current NVIDIA 3D VISION specification of 1920×1080."

Why would this thing not work with 3D Vision at 1440p? Can someone explain, or am I just misinterpreting? Cuz if you gotta run 1080p for 3D that would be shit.
 
This sounds really disturbing, quoted from pcdiy asus comments:

some dude: "JJ please reassure this old girl that native 1440p 3D on the ROG swift just ain’t so and that its some gimmicky upscaled 1080p dipperdoodle trash."

JJ: "3D on the SWIFT is implemented per current NVIDIA 3D VISION specification of 1920×1080."

Why would this thing not work with 3D Vision at 1440p? Can someone explain, or am I just misinterpreting? Cuz if you gotta run 1080p for 3D that would be shit.

I don't think you're misinterpreting but I'd be shocked if Nvidia didn't fix this by release time or very shortly after. I'm also looking forward to 3D on the Swift for the few games that are good in 3D.
 
I don't think you're misinterpreting but I'd be shocked if Nvidia didn't fix this by release time or very shortly after. I'm also looking forward to 3D on the Swift for the few games that are good in 3D.

3D isn't a main thing for me, but I still consider it very nice icing on the cake since I would definitely use it from time to time.

I just hope it can be fixed and is not a bandwidth issue of some sort. I'm so excited for this monitor.
 
Regarding 3D @ 1080p only (and this is only from my limited understanding; I could be totally wrong here): the monitor can run at 120-144Hz @ 1440p without the need for MST mode, as opposed to other high-res displays such as 4K at 60Hz, which require MST on a single display to function. There was some concern as to whether the PG278Q would operate similarly, but that's simply not the case, as confirmed by JJ at Asus. It has been speculated that the screen can only do this through some wizardry (technical term) in how the G-Sync circuitry works. Now we know G-Sync will not (at least in this iteration) function in 3D mode. It may be, as suggested, that G-Sync is what allows the monitor to run at maximum refresh rate at native resolution without MST, and that without G-Sync the display switches to an MST operating mode, effectively killing the chance of 3D at 1440p, since as far as I know 3D will not work in MST mode.

That's the only explanation I have. I'm throwing caution to the wind here on this so please correct me if I'm wrong or confused.

But that's all I've got.
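To put rough numbers behind the bandwidth side of that speculation, here's a quick back-of-envelope sketch (my own figures, with an assumed ~12% blanking overhead roughly in line with reduced-blanking timings; nothing here is confirmed by Asus or Nvidia):

```python
# Back-of-envelope bandwidth check for the MST/SST speculation above.
# The ~12% blanking overhead is an assumption, not a vendor figure.

DP12_USABLE_GBPS = 17.28      # DisplayPort 1.2: 4 lanes x 5.4 Gbit/s, after 8b/10b encoding
BLANKING_OVERHEAD = 1.12      # assumed extra pixels for horizontal/vertical blanking

def needed_gbps(width: int, height: int, hz: int, bpp: int = 24) -> float:
    """Approximate link bandwidth needed for a given mode, in Gbit/s."""
    return width * height * hz * bpp * BLANKING_OVERHEAD / 1e9

modes = [
    ("PG278Q native, 2560x1440 @ 144Hz", 2560, 1440, 144),
    ("3D Vision spec, 1920x1080 @ 120Hz", 1920, 1080, 120),
]

for label, w, h, hz in modes:
    gbps = needed_gbps(w, h, hz)
    verdict = "fits" if gbps < DP12_USABLE_GBPS else "does not fit"
    print(f"{label}: ~{gbps:.1f} Gbit/s -> {verdict} in a single DP 1.2 stream")
```

By that rough math, 1440p @ 144Hz comes in around 14 Gbit/s, under the ~17.3 Gbit/s a single DP 1.2 stream can carry, which is at least consistent with the "no MST needed" part. It doesn't explain the 1080p cap for 3D Vision, which looks more like a spec/driver limit than raw bandwidth, but that's speculation on my part too.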
 
bunch of people complaining about how 'hard it is to clean glossy' and complaints of glare

You say that like there's something wrong with that opinion. Glare is pretty much the distinction between gloss and matte and if you don't like glare and can't avoid it, you're not going to like gloss.
 
https://www.facebook.com/asus.n.america

Asus currently has a 'poll' up of what screen type do you prefer, no wonder they chose Matte, bunch of people complaining about how 'hard it is to clean glossy' and complaints of glare

That's funny, because it's actually easier to clean glossy: you can spot grime, fingerprints and other nasties better than on a matte screen. It's a straw-man argument for matte screens; clarity and color representation be damned, they want it to be easier to clean (which it isn't). The market is stupid and backward.

It must be that (horrible) grainy sparkle effect that has them so mesmerized I guess.

Pity.
 
The majority of those who vote have likely never used or even seen a glossy monitor and Asus's excuses for not providing both glossy and matte options are retarded since they sell 3 versions of the VG278H (VG278H, VG278HE and VG278HR).
 
You say that like there's something wrong with that opinion. Glare is pretty much the distinction between gloss and matte and if you don't like glare and can't avoid it, you're not going to like gloss.

I understand the uses for matte, but I also understand the merits of glossy; that's the difference. People need to learn how to use curtains, or how to turn off a lamp, not just expect the monitor to do it for them. Sometimes there are unavoidable situations (especially in a work environment), and for those matte is sometimes the only solution, I agree. But that doesn't mean the rest of us should be gimped with terrible clarity.
 
The thread is starting to repeat the same argument again, sort of. Rather than repeat my position, I'll just post a link to the last time I replied to it in this thread.

http://hardforum.com/showpost.php?p=1040560773&postcount=214

Very interested to see the results if vega removes the coating on these monitors for anyone, incl. himself.
 
So will this monitor still be worth it to AMD users or should I just go buy a Korean 27" and try to overclock it? I do game but not as much as I do other tasks like reading :)

I'm not a fanboy AMD/Nvidia in any case but I just refused to buy a 780 @ $800 so I jumped to AMD when they released the 290x for $500. If Nvidia goes back to giving more for the money then I will change back to Nvidia again :)

I would assume I'll need a second 290x to drive this thing at 120 in BF4 maxed.
 
Well, you can probably sell that 290X now for insane cash and get a 780 Ti instead with the money received :D
 
So will this monitor still be worth it to AMD users or should I just go buy a Korean 27" and try to overclock it? I do game but not as much as I do other tasks like reading :)

I'm not a fanboy AMD/Nvidia in any case but I just refused to buy a 780 @ $800 so I jumped to AMD when they released the 290x for $500. If Nvidia goes back to giving more for the money then I will change back to Nvidia again :)

I would assume I'll need a second 290x to drive this thing at 120 in BF4 maxed.

Afaik, the only thing this monitor won't do for you with a non-Nvidia card is let you use G-Sync. You still get everything else.
 
Afaik, the only thing this monitor won't do for you with a non-Nvidia card is let you use G-Sync. You still get everything else.

Pure and total speculation: ULMB (Ultra Low Motion Blur) is a function of the G-Sync PCB as I understand it, so I'm thinking that function, as well as any new 3D functions (like the one BenQ will be releasing on their G-Sync monitors as an alternative mode to G-Sync or ULMB), will be limited to Nvidia graphics cards as well. Traditional toastyX LightBoost will work on the Swift, I have heard, so perhaps that will be the way to go for Radeon users?
 
I have the GSYNC version of the VG248QE.
I can confirm that ULMB does not work with my onboard intel 4600. It just says ULMB not available when you press the button. It might be possible to do some sort of hack to get it to work like lightboost though.
 
What does everyone think the input lag will be like? I'm hoping it will be in the range of the Asus 24-inch monitors; the 27-inch Asus I had wasn't very smooth. The BenQ MLG one I have now is great, 13ms I think. Here's hoping this new monitor will be sub one frame, hopefully in the 10ms range.
 