I have a G400s; from the specs I think it goes up to a 1000 Hz polling rate?
Low framerates are still going to look low, especially if you're used to triple digits. IIRC G-Sync doesn't work below 30 or 40 FPS. If you're on the PG278Q, I think it looks best when you're around 70-100 FPS. It does make lower framerates more bearable, though. Previous posters are right when they say they don't even pay attention to FPS numbers anymore, because it generally doesn't matter.
One thing to keep in mind is you may also have to invest in a better mouse. I would definitely recommend getting a mouse that offers a polling rate of at least 500 Hz, as movement can get jerky any lower than that.
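For what it's worth, the arithmetic behind that advice is just the report interval; here's a quick sketch (nothing mouse-specific, the rates are the standard USB polling options):

```python
# Time between mouse position reports at common USB polling rates.
for rate_hz in (125, 250, 500, 1000):
    interval_ms = 1000 / rate_hz
    print(f"{rate_hz:>4} Hz -> one report every {interval_ms:.1f} ms")
```

At 125 Hz that's 8 ms between reports, longer than a whole frame at 144 Hz (~6.9 ms), so your aim can visibly jump between refreshes.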
Why wouldn't g-sync work below 30 fps?
What do you guys make of this beast? http://www.pcgamer.com/acer-unveil-super-quick-144hz-g-sync-ips-monitor/
Has the holy grail of monitors been found?
I've gone as low as 15-20 fps in super-modded Skyrim (more CPU-limited than anything) as well as 5K DSR Alien: Isolation (because SLI + G-Sync + DSR = only 2 of the 3 work on the Swift right now). I didn't see any flickering. I have, however, noticed it only in loading screens, where the frame rate may drop to almost nothing. So I can say that G-Sync does work below 30 fps. But the input delay feels atrocious when you're used to ~100 fps.
No one is talking about the response time of it. The wonks are saying that IPS has higher latency and therefore cannot be as fast as the Swift... but it should still be plenty fast.
I bought mine as a second, gaming-only monitor so speed is paramount. I don't really worry too much about perfect color in games.
I don't really get the G-Sync/FreeSync hype. It sounds like cool tech, but are that many people really suffering from tearing? I turn v-sync off in almost every game and haven't seen tearing in years. Is it only in FPS games (which I generally don't play)?
It isn't just tearing. When the framerate isn't equal to (or an even multiple of) the refresh rate, frames are dropped or repeated, which is ugly, and it has always bothered me way more than tearing, which is pretty minimal with high refresh rates and/or framerates.
And the beauty of g-sync is how it's always smooth even when your framerate is fluctuating a lot (which WILL happen in demanding games with maxed settings).
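If you want to see the repeat/drop pattern rather than take it on faith, here's a toy model with made-up numbers (a steady 50 fps game on a fixed 60 Hz panel, v-sync on, no VRR):

```python
import math

# Each finished frame is picked up at the next 60 Hz refresh tick, so
# some frames stay on screen for 1 refresh and some for 2. That uneven
# cadence is the judder; with G-Sync the panel instead refreshes when
# the frame is ready, so every frame gets exactly one (variable) slot.
refresh_hz, fps = 60, 50
tick, frame_time = 1 / refresh_hz, 1 / fps

first_tick = [math.ceil(n * frame_time / tick) for n in range(10)]
per_frame = [b - a for a, b in zip(first_tick, first_tick[1:])]
print(per_frame)  # a mix of 1s and 2s -> uneven frame pacing
```

The exact numbers don't matter; the point is that any framerate that doesn't divide evenly into the refresh rate produces that uneven cadence.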
That makes sense, though I've never noticed that. Now I probably will though.
I think I'll have to see one in action to see the difference.
Crappy color quality makes graphics look much worse, as does color shifting. I'd also add that the much worse dark contrast puts you at a disadvantage in any game with shadows/darker spots, honestly. For me TN, even the new 28-inch 4K ones that are improved, is an absolute no-go still. I'm currently running an Acer B326HK 32-inch IPS 4K 60 Hz DP 1.2 monitor. The only improvement I could even want now would be to add G-Sync into that mix. Before that I used an X-Star DP2710 2560x1440 at 110 Hz for a long time, and a Dell UP2414Q 4K 60 Hz 24-inch for several months. To me, even for gaming, using a TN is like slamming your graphics settings down sharply... it is just that much worse a picture compared to a good IPS.
My opinion, as an owner of the Swift, is that the tradeoff is worth it for fast-action FPSs. Other game types, maybe not so much. Anyone who is considering a Swift and has the means should at least give one a test drive, because G-Sync is a truly, no-joke amazing tech and will enhance your experience that much.
See, I can't understand this way of thinking. The display is the portal to everything you do on your computer. Ultimately, if I didn't have a choice, I'd rather run a slower GPU with lower settings plugged into a higher quality monitor than run a next-level beast GPU through a small, poor quality screen.
How many people have not posted their good experience with the display compared to those who have had a poor one? Going by some of the posts here, I would be led to believe that coil whine is a problem with all GTX 970s regardless of manufacturer, but I have had no such issues with mine. The number of complaints about monitors from batches that started shipping at the end of October has been far lower than from the earliest batches. One person's experience can't speak for everyone, just as the experience of those posting on the internet can't speak for all purchasers. Regardless, LCD monitors are always a lottery: as the number of parts goes up, so does the chance of a defect outside manufacturing tolerance.

Because not everyone has the cash to throw at a 600-800 dollar monitor. It's really that simple.
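On the "lottery" point: a toy model makes it concrete. Assume each of a panel's subpixels independently has some tiny defect probability (the rates below are invented purely for illustration):

```python
# P(flawless panel) = (1 - p)^n for n parts with defect chance p each.
n = 2560 * 1440 * 3  # subpixels on a WQHD panel
for p in (1e-9, 1e-8, 1e-7):
    print(f"p = {p:.0e}: P(flawless) = {(1 - p) ** n:.3f}")
```

Even a one-in-ten-million per-subpixel defect rate leaves roughly a two-thirds chance of at least one flaw, which is why unit-to-unit variance is so visible in these threads.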
And then there are all the notorious defects with the Swift. Some people have done 3-4 RMAs, and that's not an extreme case. I don't really understand why this is difficult to comprehend.
Adaptive-Sync support is technically bound to the hardware because the drivers have to be written to support it. Whether NVIDIA will add it to their drivers is yet to be seen, so in practice it is bound to a specific GPU, with AMD's FreeSync currently the only software offering that works with Adaptive-Sync. Open standards are good, but let's not jump the gun here. We don't even know yet how Adaptive-Sync's software reliance will compare to the hardware interface approach of G-Sync.

To OP: I'd wait for a while. Adaptive sync, the underlying tech behind both G-Sync and FreeSync, is a big deal. But you're right to question the sky-high prices and the (very) limited availability right now.
Personally, I'd prefer to buy a monitor that isn't bound to any specific GPU. Asus' 120 Hz IPS display has DP 1.2a, which includes Adaptive-Sync as an option, and AMD has said it would work with their GPUs.
I imagine Nvidia will be forced to support the open technology at some point. G-Sync has its own algorithms, but their margin for individuality, if we can put it like that, is relatively slim. So you probably don't have to worry about locking yourself into either red or green (if you're like me, at least, who typically switches between the two depending on where I am in the upgrade cycle and which has the best price/perf ratio).
I was the first to provide this comment, and you left out one important qualification:

P.S. It's interesting to read several commenters here who say that the technology works best in the 70-100 fps range. PCPer.com's Ryan Shrout has stated that 4K is really the more natural realm for G-Sync/FreeSync, because it's in the lower frame rates that the effect is the biggest.
If you're on the PG278Q, I think it looks best when you're around 70-100 FPS.

The Acer XB280HK is using a completely different panel at a different native resolution, but I'd still like to respectfully disagree. One of the complaints I sometimes hear from those advocating 30 FPS caps is that the fluctuation in framerate is lessened, which makes the experience "appear" smoother to them. While technically true, this also means the assertion that G-Sync shines at lower framerates is untrue. With framerates having the potential to vary more wildly in the upper reaches, G-Sync makes the experience better because it reduces the stutter and display input lag that occur when those changes happen.
That's not quite correct. If you can cap your framerate slightly below 144 you don't get any input lag and it's much better than v-sync. So a game like Quake (which is best played at 125fps) or Jedi Academy (best played at 100fps) benefits immensely from g-sync: perfect picture with no input lag (too bad we can't combine ULMB with it though).
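The frame-time math behind that cap, as I understand G-Sync's behavior at the top of its range (at or above max refresh it falls back to v-sync-like timing, hence capping just below):

```python
# Frame-time budget at various caps on a 144 Hz G-Sync panel.
max_hz = 144
for cap_fps in (144, 142, 125, 100):
    print(f"cap {cap_fps:>3} fps -> {1000 / cap_fps:6.2f} ms/frame "
          f"(panel's fastest refresh: {1000 / max_hz:.2f} ms)")
```

A 142 fps cap only costs ~0.1 ms per frame versus running flat out, and it keeps every frame inside the variable-refresh window.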
I don't blame you. Quake Live has mutilated the Quake III formula. Basically they lowered the skill ceiling to appeal to casuals, while alienating the original fan base. I don't think they've fixed the footstep bug at 250 fps, though I haven't been following Quake news too closely as of late.
Quake is best played at 250 fps, not 125 fps. They (somewhat) recently increased the frame cap from 125 fps to 250 fps (Quake Live).
Are there any displays that can actually output 200+ Hz at respectable resolutions?

Nope. TN panels could technically get that high without ghosting problems. OLED supposedly has the potential for "unlimited" refresh rate. I think the last CRT monitor I had could do about 200 Hz at 800x600, too.
Even my top-end CRT from 15 years ago barely hit 200 Hz at 800x600 IIRC.
OLED can theoretically deliver an infinite contrast ratio, not refresh rate. That said, the theoretically-possible refresh rate of OLED monitors - WHEN someone actually chooses to go beyond 60-75Hz - is supposed to be in the order of 1000s of Hz.
That is far off in the distance. Maybe 5+ years before we see anything beyond a 144 Hz refresh rate. Bandwidth is the issue, as well as display technology.
OLED does have the benefit of something like 0.001 ms pixel response, which is a huge advantage over the 1 ms of the fastest gaming displays.
Maybe with the new MHL standard that can do 8K at 120 Hz we'll finally have the bandwidth we've needed, and manufacturers won't have a reason not to.
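Napkin math on why bandwidth is the bottleneck (uncompressed, 24 bits per pixel, ignoring blanking intervals and any compression like DSC):

```python
# Uncompressed video bandwidth: width * height * bits-per-pixel * Hz.
def gbps(w, h, hz, bpp=24):
    return w * h * bpp * hz / 1e9

print(f"1440p @ 144 Hz: {gbps(2560, 1440, 144):5.1f} Gbit/s")
print(f"4K    @ 120 Hz: {gbps(3840, 2160, 120):5.1f} Gbit/s")
print(f"8K    @ 120 Hz: {gbps(7680, 4320, 120):5.1f} Gbit/s")
```

8K at 120 Hz works out to roughly 96 Gbit/s of raw pixel data, several times what DisplayPort 1.2's ~17 Gbit/s of effective bandwidth can carry, so a new link standard really is the prerequisite.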
Keep in mind that with higher refresh rates, you're going to need more brightness. For example, the luminance of each refresh on a 1000 Hz display will need to be 10 times greater than on a 100 Hz display if each refresh is to appear equally bright. Within a certain duration window, the human visual system integrates luminance over time, and you can trade luminance and duration against each other equally. So a 1 ms pulse of light at 100 cd/m^2 will appear to be the same brightness as a 10 ms pulse at 10 cd/m^2.
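Putting that tradeoff into one small sketch (this assumes a strobed/impulse-type display where each refresh is a single full-length pulse; the numbers are illustrative):

```python
# Perceived brightness ~ luminance * pulse duration for short flashes,
# so as the per-refresh pulse shrinks, luminance must rise to match.
reference = 100 * (1 / 100)  # 100 cd/m^2 held for one 100 Hz refresh
for hz in (100, 250, 500, 1000):
    needed = reference / (1 / hz)  # luminance for equal apparent brightness
    print(f"{hz:>4} Hz -> {needed:>5.0f} cd/m^2 per pulse")
```

Which reproduces the 10x figure: 1000 Hz needs 1000 cd/m^2 per pulse to match 100 cd/m^2 at 100 Hz.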
It will be worth it when it starts working in ULMB mode. Otherwise it's not that amazing: at low framerates you won't get any tearing, but you will get rather crappy motion resolution. It also needs 120 Hz VA monitors to be truly worth it.
Eizo Foris FG2421 is the only worthy monitor these days, but there are some bad units out there that have overly low gamma on the right and left edges.