ASUS Announces ROG SWIFT PG278Q Premium Gaming Monitor

Like I said in previous posts, I've run two 25' heavy-gauge 24 AWG mini DP extension cables off my 6990 when I had it, and it worked well driving a 2560x1440 60 Hz plus a 1920x1080 120 Hz. However, my 780 Ti's output is much weaker for some reason, and I can't even run both monitors at the same time anymore on those runs (either/or; one will not display). I also get occasional screen blinks even running just the single 120 Hz 1080p monitor. I never had those issues on the 6990 over the 25' + ends run.

Assuming I buy a PG278Q, I'm not expecting this to go much over 10'. I might try a 10' cable, at first with a 1' mini DP to full-sized DP cable on the end as a "dongle"/port-saver type thing, and then try to swap it out for a 3' one and see what's up. I wouldn't expect anything past that, especially with my current GPU. (I'd rather keep everything mini DP on the runs and adapt to full-sized DP at each end as required, since all the other hardware and adapters I have are mini.)

This is going to require me to move my PC to a non-optimal location, but at least it won't be at my desk. Hoping that running cable from the lower part of the PC case through the floor -> ceiling to desk-height monitor won't be too much of a stretch. It's a low ceiling, so I should be OK.

Personally I'd never run non-native (in this case 1080p vs. 1440p) because non-native to me looks like mud. A 4K monitor might scale 1080p cleanly that way, being an exact quad of four 1080p quadrants, but this monitor won't. Plus, 2560x1440 gaming is the primary selling point (and one of the main reasons for the high price) of the monitor.
 
With G-Sync, I don't think the difference between 100 Hz and 144 Hz will be particularly obvious. It doesn't seem like too ugly a solution.

I'm presuming, of course, that G-Sync will actually work at 100 Hz.

I'm actually rocking two Radeon R9 290s, so for me G-Sync is a waste, but maybe AMD's FreeSync will be equal?

Maybe I should hope that a non-G-Sync version of this display comes out.

BTW, come to think of it, would it be possible to split a display with two DVI connectors in half and run Eyefinity on a single display, so one DVI drives 50% of the screen and the other DVI drives the other 50%?
 
Considering how long ago Nvidia announced G-Sync and how long it is taking to get to market outside of the DIY kits, I doubt "FreeSync" monitors/implementations will realistically come to market for a long time. AMD is just saying they can manage to put out some similar tech. Monitor manufacturers could implement their own dynamic-Hz tech as well, just like they can implement their own backlight-strobing tech. That doesn't mean you are realistically going to have anything you can buy in the next year or two that competes with G-Sync. Another point that is continually overlooked when people talk like FreeSync is going to show up soon: a ULMB/zero-blur backlight-strobing mode seems to be included in all high-Hz G-Sync monitors. There is no mention of 2D-gaming backlight strobing supported by AMD being a standard, or present at all, in FreeSync monitors.

There is a chance ToastyX or someone else will "hack" G-Sync to work on AMD cards, but there is no guarantee of that. Even if that happens, you might have to consider whether it turns into a game of catch-up between hacks and G-Sync firmware/driver update rollouts.

Even if you don't want G-Sync, you aren't going to find another 1 ms, 120-144 Hz monitor at 2560x1440 that was intentionally designed to do 120-144 Hz.
A Korean monitor guaranteed to hit 120 Hz can be expensive, and their response times are slow.

During continual movement keying and mouse-look flow/pathing:

60 Hz TN blur is like a smear of the entire viewport/game world.

120 Hz, higher-response-time IPS at higher fps is more toward the 60 Hz smear end of the scale,
but it attempts to show more motion articulation, flow, animation definition, etc. at high fps, though smeared.

120 Hz TN at higher fps is like a strong soften-blur of the entire viewport/game world: a 50% blur reduction vs. 60 Hz TN.

144 Hz TN at higher fps is about a 60% blur reduction vs. 60 Hz TN.

ULMB mode is essentially zero blur.
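Those blur-reduction percentages follow from a toy sample-and-hold persistence model (blur is roughly proportional to how long each frame persists on screen), not from measurements; a quick sketch shows 120 Hz lands exactly at 50%, and 144 Hz at about 58%, which rounds to the 60% figure above:

```python
# Sample-and-hold motion blur is roughly proportional to frame
# persistence (1 / refresh rate). Blur reduction relative to a
# 60 Hz baseline is therefore 1 - (60 / refresh_rate).

def blur_reduction_vs_60hz(refresh_hz: float) -> float:
    """Fraction of 60 Hz sample-and-hold blur eliminated at refresh_hz."""
    return 1.0 - 60.0 / refresh_hz

for hz in (60, 120, 144):
    print(f"{hz:3d} Hz: {blur_reduction_vs_60hz(hz):.0%} blur reduction")
```

Backlight strobing (ULMB) sidesteps this model entirely by shortening persistence below the frame interval, which is why it reads as "essentially zero blur."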
 
With G-Sync, I don't think the difference between 100 Hz and 144 Hz will be particularly obvious. It doesn't seem like too ugly a solution.

I'm presuming, of course, that G-Sync will actually work at 100 Hz.

G-Sync does exactly that; it syncs the monitor to the GPU's frame output. So of course it will do 100 Hz. But once the fps goes above what the monitor can do Hz-wise, who knows what happens.
 
G-Sync does exactly that; it syncs the monitor to the GPU's frame output. So of course it will do 100 Hz. But once the fps goes above what the monitor can do Hz-wise, who knows what happens.

I would imagine G-Sync prevents the FPS from going above the display's refresh rate. G-Sync directly controls refresh rate, after all.
 
As far as I know, if your performance fps is above the refresh rate, then the result is the same as V-Sync. However, if your fps dips below the refresh rate (in this case 144 Hz), then you don't have to hold the frame like you do with V-Sync. It will just update the screen as soon as the GPU is finished rendering.
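That behavior can be sketched as a toy timing model: under V-Sync a finished frame waits for the next fixed refresh tick, while under G-Sync the panel refreshes as soon as the frame is ready, but never faster than its max refresh rate allows. This is illustrative only; real driver and scaler behavior is more involved.

```python
import math

def vsync_show_time(render_done: float, refresh_hz: float) -> float:
    """V-Sync: the frame waits for the next fixed refresh tick (seconds)."""
    tick = 1.0 / refresh_hz
    return math.ceil(render_done / tick) * tick

def gsync_show_time(render_done: float, prev_shown: float,
                    max_refresh_hz: float) -> float:
    """G-Sync: show the frame as soon as it's ready, but no sooner than
    the panel's minimum frame time; frames finishing faster than the
    max refresh simply get spaced out to the panel's limit."""
    min_frame_time = 1.0 / max_refresh_hz
    return max(render_done, prev_shown + min_frame_time)

# A frame finishing at 10 ms on a 144 Hz panel (tick every ~6.94 ms):
print(vsync_show_time(0.010, 144))        # held until the ~13.9 ms tick
print(gsync_show_time(0.010, 0.0, 144))   # shown immediately at 10 ms
```

The second function also captures why fps can't meaningfully exceed the max refresh under variable refresh: frames arriving faster than the panel's minimum frame time are forced apart to that interval.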
 
I would imagine G-Sync prevents the FPS from going above the display's refresh rate. G-Sync directly controls refresh rate, after all.

G-Sync makes the monitor follow the GPU. I haven't heard one bit of news saying G-Sync limits fps to the monitor's max Hz.

Let's face it... most people struggle to get a clean 60 fps all the time in blockbuster titles. I don't think going over 144 fps is going to be a common event.
 
G-Sync makes the monitor follow the GPU. I haven't heard one bit of news saying G-Sync limits fps to the monitor's max Hz.

Let's face it... most people struggle to get a clean 60 fps all the time in blockbuster titles. I don't think going over 144 fps is going to be a common event.

Actually, going above the refresh rate with your fps can be a *problem* for people with the power to do so. It is not useful for anything other than benchmarks to go above 144 fps, but I can easily push 300 fps if I want to; the problem is that it causes a ton of heat to be generated for no reason.

G-Sync caps the framerate at the refresh rate. If you are running at 144 Hz with G-Sync you won't get higher than 144 fps; at 120 Hz, the max is 120 fps with G-Sync.
 
Actually, going above the refresh rate with your fps can be a *problem* for people with the power to do so. It is not useful for anything other than benchmarks to go above 144 fps, but I can easily push 300 fps if I want to; the problem is that it causes a ton of heat to be generated for no reason.

G-Sync caps the framerate at the refresh rate. If you are running at 144 Hz with G-Sync you won't get higher than 144 fps; at 120 Hz, the max is 120 fps with G-Sync.

I'd like to see where you got that information.
 
Q: How much performance gain can I expect to see with G-SYNC technology?
A: NVIDIA G-SYNC ensures that every frame rendered by the GPU is displayed (up to the max refresh rate of the monitor). It does not increase the rate at which the GPU renders a frame.

Source: http://www.geforce.com/hardware/technology/g-sync/faq

From their wording, it seems that G-SYNC allows the GPU to control the refresh rate of the monitor (up to 144 Hz) and has no effect on its frame rate. I'm assuming once the frame rate gets over 144 you begin to see tearing, due to the monitor not being able to display every frame rendered by the GPU.
 
Source: http://www.geforce.com/hardware/technology/g-sync/faq

From their wording, it seems that G-SYNC allows the GPU to control the refresh rate of the monitor (up to 144 Hz) and has no effect on its frame rate. I'm assuming once the frame rate gets over 144 you begin to see tearing, due to the monitor not being able to display every frame rendered by the GPU.
Not sure why you would say that after looking at the very thing you just linked to. It will simply cap at 144 Hz, just like V-Sync would.
 
I'd like to see where you got that information.

Blurbusters said:
At first, it was pretty clear that G-SYNC had significantly more input lag than VSYNC OFF. It was observed that VSYNC OFF at 300fps versus 143fps had fairly insignificant differences in input lag (22ms/26ms at 300fps, versus 24ms/26ms at 143fps). When I began testing G-SYNC, it immediately became apparent that input lag suddenly spiked (40ms/39ms for 300fps cap, 38ms/35ms for 143fps cap). During fps_max=300, G-SYNC ran at only 144 frames per second, since that is the frame rate limit. The behavior felt like VSYNC ON suddenly got turned on.

The good news now comes: As a last-ditch, I lowered fps_max more significantly to 120, and got an immediate, sudden reduction in input lag (27ms/24ms for G-SYNC). I could no longer tell the difference in latency between G-SYNC and VSYNC OFF in Counterstrike: GO! Except there was no tearing, and no stutters anymore, the full benefits of G-SYNC without the lag of VSYNC ON.

http://www.blurbusters.com/gsync/preview2/
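The Blur Busters finding above (capping fps a few frames below max refresh keeps G-Sync out of its laggy, V-Sync-like ceiling behavior) is what an engine-side limiter like fps_max does. A minimal sketch of such a limiter, assuming a trivial stand-in render function; this is not Source's actual implementation:

```python
import time

# Minimal fps-cap sketch: sleeping out the remainder of each frame
# interval keeps the frame rate at or below the cap, so a G-Sync panel
# stays in variable-refresh mode instead of pinning at max refresh.

def run_capped(render_frame, fps_cap: float, frames: int) -> float:
    """Render `frames` frames capped at `fps_cap`; returns achieved fps."""
    interval = 1.0 / fps_cap
    start = time.perf_counter()
    deadline = start
    for _ in range(frames):
        render_frame()
        deadline += interval
        remaining = deadline - time.perf_counter()
        if remaining > 0:
            time.sleep(remaining)
    return frames / (time.perf_counter() - start)
```

With a trivial render_frame the achieved rate lands at or just under the cap; sleep granularity means it can undershoot slightly, but it cannot overshoot, which is the property that matters here.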
 
It seems no one is [H] enough on OCUK to test out ULMB yet, but other initial impressions seem good.

I'm really interested to know if ULMB is as dim as it sounds. It's by far the number one feature I'm concerned about working well.
 
G-Sync does exactly that; it syncs the monitor to the GPU's frame output. So of course it will do 100 Hz. But once the fps goes above what the monitor can do Hz-wise, who knows what happens.
What I mean is whether G-Sync can be enabled when the display's refresh rate is set to 100 Hz.

G-Sync makes the monitor follow the GPU. I haven't heard one bit of news saying G-Sync limits fps to the monitor's max Hz.
When G-Sync is enabled, the frame rate cannot exceed the maximum refresh rate. In this case, that would be 144 Hz.
 
I have to disagree that very high fps of 200-300 won't be a common event, because a lot of people still play TF2, L4D2, and other Source games even if they also play more demanding titles. For people with a good number of games on Steam, there are usually a ton of easier-to-run games in their libraries as well, and a good number of them are great games. They aren't all old games either; plenty of highly rated, less GPU-demanding games have been released in the last few years, and I'm not just talking about indie platformers. In a few cases, they are rated higher than the most GPU-demanding titles at that.
 
Is it true that this is a panel made by ASUS themselves?
So I can forget about BenQ coming out with a much cheaper non-G-Sync version of this display?

I'm starting to consider selling my R9 290s and buying a GTX 780 or 780 Ti. From the things I've read, G-Sync actually is a good working product when you're not getting 144 fps; my understanding is that when you get 80-144 fps, the G-Sync feature makes it feel smooth as butter(?).
 
Is it true that this is a panel made by ASUS themselves?
So I can forget about BenQ coming out with a much cheaper non-G-Sync version of this display?
Nope.
TFT Central said:
The Asus ROG Swift PG278Q utilises an AU Optronics M270Q002 V0 TN Film panel which is capable of producing 16.77 million colours. This is achieved according to Asus' specs through a true 8-bit colour depth as opposed to Frame Rate Control (FRC) being needed.
 
What is wrong with you? It's right there in front of you, and even in bold. How many people have to keep providing links for you to get it?

Why can't you quote? I want to see where it says that the max will be a certain value.

Do you think all our readers will link? No.
 
Dunno why you guys are arguing about the details. Well no, sorry, it's Hardforum. That's our job here. Carry on.

I think this is going to be a really spectacular gaming monitor personally, one that will last at least 4-5 years. I'm pretty much certain I'm gonna get one of these. Luckily it's releasing in other regions first, so I get to see what the guys over in other places say about it before I plop down the dough.
 
Why can't you quote? I want to see where it says that the max will be a certain value.

Do you think all our readers will link? No.
At this point you are either just trolling or you have some basic reading comprehension issues. It was already quoted more than once. :rolleyes:
 
Dunno why you guys are arguing about the details. Well no, sorry it's Hardforum. That's our job here. Carry on.

I think this is going to be a really spectacular gaming monitor personally, one that will last at least 4-5 years.

If you can swallow the shitty TN image it might very well be.
 
Everything has its shitty tradeoff.

The image quality on an IPS with its slower response time, or on a 60 Hz TN, during your continual movement keying and mouse-look flow/pathing is a smeared mess that can't even be defined as a solid grid resolution. You don't play a screenshot.
The image quality in continual periods of motion of the entire viewport/game world is a non-resolution on these panels, which are (imo, considering the tradeoffs) inferior for 1st/3rd-person gaming.

A 120-144 Hz TN is at least down to a strong soften-blur effect, where the whole viewport goes "fuzzy" more within the 'shadow masks' of onscreen elements rather than a full smear. The backlight-strobing option brings this down to essentially zero blur in games you can manage to use it with.

You also don't get the higher motion articulation/delineation/flow, nor the increased motion and animation definition, on a 60 Hz IPS (including your motion and the degrees of motion of the entire viewport/game world during FoV movement).
If you pay for a Korean monitor guaranteed to overclock to 120 Hz, it will attempt to show more motion definition, but through a smeared non-resolution more toward the 60 Hz TN end of the blur scale during movement keying + mouse-looking.

The only thing comparable to this monitor for 1st/3rd-person gaming would be the Eizo FG2421 VA, which has a panel lottery/issues, is 24", and is 1920x1080. It has its own backlight strobing but no G-Sync option.
 
Here's a pic of it next to a glossy Acer.

 
Is there any benefit to getting this monitor or any upcoming G-Sync monitors if you're using an AMD video card?
 
Is there any benefit to getting this monitor or any upcoming G-Sync monitors if you're using an AMD video card?

You get the only display in the world atm that has 2560x1440 resolution, very low input lag, and 144 Hz.

Remember, the Korean displays that may be able to overclock have higher input lag.

For both AMD and Nvidia, a GPU with DisplayPort 1.2 is needed.

It's possible AMD GPUs will NEVER receive support for G-Sync. To my understanding, the benefit of G-Sync is that if you have a weak GPU that only gives 80-144 fps, then there is no screen tearing?

So it would be gaming without screen tearing, whereas if you have 80-144 fps without G-Sync there could be screen tearing; obviously, having an overkill system with 144+ fps would solve this problem(?)
 
You get the only display in the world atm that has 2560x1440 resolution, very low input lag, and 144 Hz.

Remember, the Korean displays that may be able to overclock have higher input lag.

For both AMD and Nvidia, a GPU with DisplayPort 1.2 is needed.

It's possible AMD GPUs will NEVER receive support for G-Sync. To my understanding, the benefit of G-Sync is that if you have a weak GPU that only gives 80-144 fps, then there is no screen tearing?

So it would be gaming without screen tearing, whereas if you have 80-144 fps without G-Sync there could be screen tearing; obviously, having an overkill system with 144+ fps would solve this problem(?)

Agreed. If you're looking for 120+ Hz at 2560x1440, there aren't a whole lot of choices - well, actually none that I can think of, at least. Not sure that makes it worth the extra cash, though. You're talking about a $400 premium over a 1920x1080 BenQ, for example, with the same performance just at a lower resolution. My personal opinion is it wouldn't be worth it, as the BenQ is an excellent gaming screen with no PWM dimming, very good calibrated picture quality, and what looks to be the same performance as the Asus in terms of lag and response time. So yeah, basically the only thing an AMD owner gets with this is higher resolution, but at $800 instead of $400. Hell, at that premium, I'm not entirely sure it's worth it for us Nvidia owners really. That's why I'm really hoping this thing gets into the $500-600 range soon, as that's a fairer asking price IMO.
 


Eff me running... that looks phenomenal for a TN screen. Jesus I cannot wait to grab one of these!!!
 
One report from the OC UK forums of ULMB not being too dim. I hope to hear more like that.
 
Those worried about ULMB's brightness should consider buying a nice ~$15 daylight/6500K CFL/LED light (1000-1600 lumens is perfect to compensate for a ~120 cd/m² display brightness). Most homes come equipped with 2700-3000K orange/tanning-salon-esque lights. Display whites also look nicer, especially on matte displays, when paired with nice lights.

Light Color Temperature Comparison.
 
Is it true that this is a panel made by ASUS themselves?
So I can forget about BenQ coming out with a much cheaper non-G-Sync version of this display?

I'm starting to consider selling my R9 290s and buying a GTX 780 or 780 Ti. From the things I've read, G-Sync actually is a good working product when you're not getting 144 fps; my understanding is that when you get 80-144 fps, the G-Sync feature makes it feel smooth as butter(?).

G-Sync makes lower framerates look smooth, but they do not FEEL smoother!!

Got mine this morning; the bump in resolution is very nice. Tried G-Sync with a few games but I'm not quite sure about it. Played Crysis 3; fps was bouncing between 45-90, but it did not feel smooth in the slightest. Checked to see if G-Sync was enabled in the control panel, and it is.
Then tried BF4, which was smoother than Crysis 3. Fps was between 80-144 but still didn't feel all that smooth. Not sure if my G-Sync is working correctly or it's just not that great.

For instance, in BF4 when the fps drops from the max refresh rate you can still feel it drop a bit; I was hoping G-Sync would solve that. But based on my experiences with it and Crysis 3, I'm not sure it's working correctly. Might try reinstalling graphics drivers and see if it's any better.

http://forums.overclockers.co.uk/showthread.php?t=18570584&page=57
 