Is G-Sync worth it?

I'm just in disagreement with your 'considerable blur' judgement.
Well it's always subjective, that's why I included links that demonstrate it in my posts.
In my opinion, the amount of blur that exists on a full-persistence display is indeed considerable, even at low speeds.
The trade-off for using G-Sync makes sense in newer 3D games which struggle to sustain consistent framerates on any current hardware, but that's basically the opposite of emulating old 2D games, where the hardware demands are not that strict and you can guarantee a certain framerate.

My point is that this level of blur is perfectly acceptable. Not many 2D games scroll as fast, as often, or as consistently as the Sonic games do, and if the occasional fast sequence does come up in a game it's not a catastrophe, provided your monitor has decent pixel response and doesn't make things worse by adding lots of smearing.
Well that's why I started with 4px/frame (already shows lots of blur) and then 8px/frame in my post.
8px/frame is slower than Super Mario World scrolls when you're running, and that is absolutely not a fast-moving game. Yet even that speed is a complete blur on a 60Hz full-persistence display.

Picture settings matter too. For instance, the latest HLSL defaults are an absurd horror, apparently adding tons of blur and color smearing on purpose; talk about misleading crap (it doesn't look even close to any decent RGB low-res CRT monitor/TV).
Instead, use integer-scaled, sync-locked, lightly filtered settings for the comparison and you'll see motion clarity is worlds apart. I keep telling people: "you own a 60Hz monitor? Then ask it to display content it's comfortable with." Since it will blur things to some degree anyway, why add heavy, blurry, contrast-destroying filters/shaders on top? They look nice in a still, but the moment things start moving they get in the way.
I nearly always use unfiltered integer scaling with emulation, that was my assumption for any kind of motion testing.

Still, the problem with how people present things now is that since strobing has come out of the niche/custom woods and become a commercial thing, and they've spent big bucks on those new monitors, they've labeled everything that came before it an absurd load of useless, unwatchable junk, like it's burning their eyes.
You've become what I call cutting-edge-owner perfectionists (a common trait of any PC hardware addict since computers became a mass-consumer market).
I understand people should always demand better/the best, but that doesn't mean it's the only equipment level whose existence makes sense in any situation.
Well I'd hardly say that it's anything new. It was the standard mode of operation for displays until LCDs came along in ~2001.
And it was back in 2006 that Philips introduced backlight scanning, with their ClearLCD/Aptura technology, to fix the problem of full-persistence motion blur.

I'm not comparing against an $800 monitor with backlight scanning, I'm comparing against a 15 year-old shadowmask CRT monitor that I picked up for $15.
After spending a few years only looking at LCDs - even LCDs with limited backlight scanning options - it was shocking how much clearer motion was when I went back to a CRT.
Going back to it after years of trying to find a decent flat panel for gaming made a far bigger impression than the initial switch away from CRT did (which was made out of necessity at the time).

So, I'm not as radical as you guys are. IMO you're dramatically exaggerating things, again like most people do after they've got their hands on the better hardware solutions.
Well my problem is that we still haven't caught up to 15+ year-old display technology, not that people should spend thousands on displays.
G-Sync displays are arguably the most expensive displays on the market right now for what you get.
 
Well you know, I've been using CRTs and RGB since about the mid-80s (still do), so overall of course I agree it's natural for someone who has stopped using them for about a decade, and suddenly decides to try one again, to experience a kind of shock.

But strictly speaking about LCDs and their performance, I'm not as shocked as you are because I'm ready to accept compromises. Really, what you see as lots of blur is acceptable/minimal to me in most cases; the 4px/frame blur here is nothing to me. I think you are very demanding. ^^
The way I see the compromise here is that the upscaled content of low-res 2D games looks extremely blocky when unfiltered, of course, so in my eyes the blur and smear produced by the average 60Hz LCD works as a kind of smoothing.
Far from ideal, but as a middle ground I've found that light, clean forms of smoothing and overlays are preferable to the aforementioned shader chains attempting to reproduce every detail of shadow masks or aperture grilles.

Anyway, since I'm relatively satisfied with how a 60Hz display can manage those 2D games as they are most of the time, it's only natural that the major remaining annoyance for me would be the sync problems.
Therefore I'm more interested in G-Sync when it comes to LCDs.
 
crts and ulmb are dim. I'm commenting in reference to "zero" blur vs what I get at high hz-fps with g-sync mode.
I guess it's all about the tradeoffs. I got used to a more brilliant LCD screen even in dim viewing settings with a 120hz 1080p samsung and now my 1440p swift, even though I don't use it anywhere near max brightness.

As per blurbusters.com, on a high-Hz, very low response time monitor vs. a baseline of 60Hz (60fps-hz):

100fps-hz yields ~40% sample and hold blur reduction
120fps-hz yields ~50% sample and hold blur reduction
144fps-hz yields ~60% sample and hold blur reduction

These also result in much higher motion definition and motion articulation of course.
This is different than blur reduction.
Locking my fps-hz lower would lose out on the higher motion def of running 100 - 120 - 144 range with g-sync.
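Those blur-reduction percentages line up with simple sample-and-hold arithmetic. A rough sketch (my own math, not blurbusters', assuming a flicker-free display that holds each frame for its full 1000/fps-hz milliseconds):

```python
# Sanity check of the blur-reduction figures above: on a sample-and-hold
# display each unique frame stays on screen for its whole duration, so
# perceived motion blur scales with frame time = 1000 / fps-hz milliseconds.

BASELINE = 60  # fps-hz baseline used above

def persistence_ms(fps_hz: float) -> float:
    """Frame persistence on a sample-and-hold display, in milliseconds."""
    return 1000.0 / fps_hz

for fps_hz in (100, 120, 144):
    reduction = 1 - persistence_ms(fps_hz) / persistence_ms(BASELINE)
    print(f"{fps_hz:>3} fps-hz: {persistence_ms(fps_hz):4.1f} ms "
          f"-> ~{reduction:.0%} less blur than at 60 fps-hz")

# 100 fps-hz: 10.0 ms -> ~40% less blur than at 60 fps-hz
# 120 fps-hz:  8.3 ms -> ~50% less blur than at 60 fps-hz
# 144 fps-hz:  6.9 ms -> ~58% less blur than at 60 fps-hz (the ~60% above is rounded)
```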

I know strobe mode reduces this a lot more but it's still a huge tradeoff vs:
- appreciable (40%) 50% (60%) blur reduction
- the benefits of g-sync mode on, riding a roller-coaster graph that fluctuates up/down around a common playing rate of 100fps-hz in demanding games
- much more brilliant and colorful screen/game world

I did test out very high frame rate, easy-to-render games like L4D2, Torchlight 2, Path of Exile, Darksiders, etc. in ULMB mode when I got my Swift, but switching back and forth was a huge difference and I went with G-Sync and its appreciable blur reduction (vs. the near blur elimination of strobing mode).

I can see where people could choose the strobing mode though, especially on the 1080p BenQ where you can boost the backlight level much higher, and you can get much higher frame rates at 1080p (especially with SLI). If the brightness were capable of going that high on my 1440p I'd revisit the choice. Maybe with DP 1.3 GPUs, DP 1.3 monitors and HDR-range backlight capabilities (but running in non-HDR mode) someday. The frame rates would still be a huge issue for me at 1440p or 3440x1440 144Hz on 21:9 DP 1.3 monitors though, so I'd likely still stick with G-Sync.
 
[benchmark graph: Rise of the Tomb Raider framerates over time, single GTX 980 vs. GTX 980 SLI]


To be clear I'm not talking about this example's averages, I'm talking about tweaking your settings (with SLI at 1440p in my case) to get an average of 100fps and using G-Sync to ride the graph, suffering the lower ranges and enjoying the higher ones in a nice smooth blend. SLI is imo necessary to achieve this on very high to very high+ custom settings (very high to ~ultra 'minus', depending how you look at it :b ) at 2560x1440. In such a scenario, balancing motion aesthetics against still-image fidelity, the most frequent lows and highs (with the average dialed in at 100+ fps-hz) would be higher as well.

For example, say you tweaked the graphics settings similarly, so that you slide all of the graphs way up uniformly in relation to each other until the midline of the 980 SLI graph sits at 100 on the Rise of the Tomb Raider graph. This would raise the graphs by around 3 rows, or +30 fps each.

The single 980 would have 54 fps-hz min, 97 fps-hz max, and about 70 fps-hz average.

The 980 sli would have 74 fps-hz min, 127 fps-hz max, and about 102 fps-hz average.
This would allow you to range back and forth +/- the 100 fps-hz smoothly utilizing g-sync.
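Just to make that arithmetic explicit, a throwaway sketch using the base figures implied above (eyeballed from the graph, so treat them as hypothetical rather than measured data):

```python
# Hypothetical restating of the example: slide every card's results up
# uniformly (+30 fps) until the 980 SLI average lands at ~100 fps-hz.
# Base numbers are simply the ones implied by the post, not benchmarks.

def shift(card: str, fps_min: int, fps_avg: int, fps_max: int, offset: int = 30):
    print(f"{card}: {fps_min + offset} min / {fps_avg + offset} avg / {fps_max + offset} max fps-hz")

shift("GTX 980    ", 24, 40, 67)   # -> 54 min / 70 avg / 97 max
shift("GTX 980 SLI", 44, 72, 97)   # -> 74 min / 102 avg / 127 max
```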
 
3440 x 1440 (21:9) and 3840x2160 (4k) won't be doing 120hz to 144hz until we have dp 1.3 output gpus and dp 1.3 input monitors.

[table: DisplayPort 1.3 refresh rate / resolution limits, incl. HDR bandwidth]
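As a rough sanity check on that claim, a back-of-the-envelope estimate (my own numbers, assuming CVT-R2-style reduced blanking and 8-bit colour; the 17.28 and 25.92 Gbit/s figures are the effective four-lane payloads of DP 1.2/HBR2 and DP 1.3/HBR3 after 8b/10b encoding):

```python
# Approximate uncompressed bandwidth needed for a mode, using CVT-R2-style
# reduced blanking (80px horizontal blank, >=460us vertical blank), compared
# against DisplayPort payload rates.

DP12_PAYLOAD_GBPS = 17.28  # HBR2, 4 lanes, after 8b/10b encoding
DP13_PAYLOAD_GBPS = 25.92  # HBR3, 4 lanes, after 8b/10b encoding

def required_gbps(width: int, height: int, hz: float, bpp: int = 24) -> float:
    """Rough uncompressed link bandwidth for a CVT-R2-style video mode."""
    h_total = width + 80                  # reduced horizontal blanking
    active_time = 1.0 / hz - 460e-6       # frame time minus minimum v-blank
    v_total = height * (1.0 / hz) / active_time
    pixel_clock = h_total * v_total * hz  # pixels per second
    return pixel_clock * bpp / 1e9

for w, h, hz in [(2560, 1440, 144), (3440, 1440, 144), (3840, 2160, 120)]:
    need = required_gbps(w, h, hz)
    print(f"{w}x{h}@{hz}Hz needs ~{need:4.1f} Gbit/s  "
          f"DP1.2: {'ok' if need <= DP12_PAYLOAD_GBPS else 'no'}  "
          f"DP1.3: {'ok' if need <= DP13_PAYLOAD_GBPS else 'no'}")

# 2560x1440@144Hz needs ~14.1 Gbit/s  DP1.2: ok  DP1.3: ok
# 3440x1440@144Hz needs ~18.8 Gbit/s  DP1.2: no  DP1.3: ok
# 3840x2160@120Hz needs ~25.8 Gbit/s  DP1.2: no  DP1.3: ok
```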


---------------------------
At 100fps-hz / 120fps-hz / 144fps-hz:
~40/50/60% blur reduction (a softer blur, with less smearing, than at 60fps-hz)
5:3 / 2:1 / 2.4:1 increase in motion definition and path articulation (often unmentioned, a huge difference)
g-sync rides the fps graph up and down without screen aberrations

Regardless of the monitor's hz, lower frame rates will be blurrier (outside of using strobe mode).
That is why I list my rates at fps-hz not fps and not hz alone. Without the frame rates, the hz is practically meaningless.

People are infatuated with graphics detail in still shots, but you don't play screenshots. If you are using variable hz at 1440p and running low (sub 75-90fps-hz most of the time in game; it really should be more like 100 at least, imo), you are essentially running a low-Hz, low-motion-definition, low-articulation, smear-blur monitor, and missing out on most of the advancements modern gaming monitors provide outside of avoiding judder/tearing/stops.

[image: 120Hz vs 60Hz fps comparison]


Preview of NVIDIA G-SYNC, Part #1 (Fluidity) | Blur Busters
-----------------------------

These real-world aesthetic benefits are not easy to show online to anyone who doesn't already possess the technology. So these huge aesthetic benefits (not just pew pew and rank seeking!) are typically overlooked by still-shot champions who play blind to the continual periods of non-resolution blur (very ugly, not "beautiful" at all imo!) and the low motion definition in the FoV movement of the entire game world in 1st/3rd person gaming.

The question really becomes which tradeoff you prefer. Both are aesthetic choices in my opinion. Motion blur reduction (or even elimination) across the entire game world/viewport, and motion articulation & definition (and even animation cycle definition), are huge aesthetic gains.
G-Sync also eliminates judder and tearing on top of that if you aren't using backlight strobing, rather than relying on v-sync's frame-limiting caps, frame drops, and in some cases input lag.
G-Sync allows you to enjoy the whole range of frame rates (75/90 -> 110 -> 13X.00) once you dial it in to a 100fps+ average, basically, and you ride that wildly varying frame rate roller coaster smoothly. You can't do that kind of frame rate range on demanding games with strobe mode, at least at 1440p, even with SLI dialed in to very high or very high+ settings. You'd be capped, and maybe chopped in half at times with v-sync, or suffer the tearing, judder and stops.
 
I agree that Sonic games blow with high persistence displays, but Sonic games are not your typical scenario. Most 2D games don't scroll anywhere near as fast as Sonic games (including most shooters).

Really, most 2D games simply do not scroll fast enough for it to matter that much.

Also, you're wrong about the speed difference being only 1%. What about the hundreds of games that run at 50hz through 55hz? There is a noticeable speed difference there when you force the games to 60hz.

For old 2D games, a CRT arcade monitor is of course the ideal solution, but for anyone who just wants a convenient all-in-one solution for playing every type of game, a variable refresh LCD works great. Another thing that people aren't taking into consideration is that when you run MAME with a 144hz G-Sync monitor, it actually is running at 144hz and just duplicating frames. I've done tests between setting the refresh rate at 60hz and 144hz, and there is less motion blur at 144hz. So not only are you getting lower persistence there at the higher refresh rate, you're not eating v-sync input lag, and you're running games at the right speed. MAME works really well with G-Sync. It's nice.

We're getting to the point where there are enough newer widescreen games where a 4:3 CRT just can't handle everything anymore.
 
By the way, I actually made a program to query the games in MAME by refresh rate and was surprised myself by how many of the games don't run close to 60hz. It's interesting how many games actually run faster than 60hz, which makes strobing them really problematic without getting audio hitching. Overall, close to 60hz is probably the exception, not the rule.
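Not that exact program, but for anyone who wants to reproduce the result, something along these lines should work: `mame -listxml` dumps every machine's metadata as XML, and each <display> element carries a refresh attribute (older MAME versions used <game> instead of <machine>):

```python
# Bucket MAME-supported games by their original refresh rate.
import subprocess
import xml.etree.ElementTree as ET
from collections import Counter

xml_data = subprocess.run(["mame", "-listxml"], capture_output=True, check=True).stdout
buckets = Counter()

for machine in ET.fromstring(xml_data).iter("machine"):
    for display in machine.findall("display"):
        refresh = float(display.get("refresh", 0))
        if refresh <= 0:
            continue
        if 59.5 <= refresh <= 60.5:
            buckets["~60Hz"] += 1
        elif refresh > 60.5:
            buckets["faster than 60Hz"] += 1
        else:
            buckets["slower than 60Hz"] += 1

for bucket, count in buckets.most_common():
    print(f"{bucket}: {count}")
```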

 
I agree that Sonic games blow with high persistence displays, but Sonic games are not your typical scenario. Most 2D games don't scroll anywhere near as fast as Sonic games (including most shooters).

Really, most 2D games simply do not scroll fast enough for it to matter that much.
Well that's why I picked a couple of examples. Super Mario World is a very slow moving platformer, while Sonic is a fast-moving platformer.
Most games seem to fall somewhere between those two.
Even the speeds in SMW blur a huge amount on a full-persistence display.

Also, you're wrong about the speed difference being only 1%. What about the hundreds of games that run at 50hz through 55hz? There is a noticeable speed difference there when you force the games to 60hz.
Well obviously you don't run those at 60Hz, you change your display to run at 50/55Hz. (or 100/110Hz with BFI if it won't sync below 60Hz)
That's why I have repeatedly said that a multi-sync CRT monitor is a better option than a G-Sync LCD for most old games - you can easily set it to run within a fraction of a percent of the original speed.

If we look at R-Type, it runs at 55.017606Hz. If you just set up lazy 55Hz timings and get 55.000Hz, the speed is only off by about 0.03%, which is small enough that I doubt anyone would notice.
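To make that arithmetic explicit, here's a quick sketch of my own (reusing the refresh rates quoted in this thread), with an optional integer multiple for the BFI/double-refresh case:

```python
# Percent speed error when an emulated game is synced to a display mode.

def speed_error(original_hz: float, display_hz: float, multiple: int = 1) -> float:
    """Speed error (%) when the game is synced to display_hz / multiple."""
    effective_hz = display_hz / multiple
    return (effective_hz - original_hz) / original_hz * 100

print(f"R-Type 55.017606Hz on a 55Hz mode:  {speed_error(55.017606, 55.0):+.3f}%")
print(f"R-Type 55.017606Hz forced to 60Hz:  {speed_error(55.017606, 60.0):+.2f}%")
print(f"R-Type 55.017606Hz on 110Hz w/ BFI: {speed_error(55.017606, 110.0, 2):+.3f}%")
# -> roughly -0.03%, +9.06% and -0.03% respectively
```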

The only area where a G-Sync monitor will beat the CRT is latency.
Of course this is personal preference, but outside of fighting games where motion blur is not much of a concern, being able to clearly see what's happening on-screen benefits me far more than shaving off a frame or two of latency.

And I just did a fairly extreme test of that yesterday, where I played Sonic 2 from start to finish on my LCD TV.
In its full-persistence mode (~17ms persistence) my TV has about 20ms latency.
In its low-persistence mode, which first interpolates to 120Hz and then adds backlight scanning to reduce persistence to ~2ms, my TV has about 120ms latency.
Even though 120ms latency is absolutely horrible, my gameplay improved significantly when I switched the TV into that mode.
Of course I was much better still on a CRT where persistence is <1ms and the display has no inherent latency, but I was surprised that I was playing much better on the LCD in its low-persistence mode despite that adding another 100ms of latency.

Another thing that people aren't taking into consideration is that when you run MAME with a 144hz G-Sync monitor, it actually is running at 144hz and just duplicating frames. I've done tests between setting the refresh rate at 60hz and 144hz, and there is less motion blur at 144hz. So not only are you getting lower persistence there at the higher refresh rate, you're not eating v-sync input lag, and you're running games at the right speed. MAME works really well with G-Sync. It's nice.
Persistence is a function of framerate, not refresh rate, on a full-persistence display.
If you had a display running at 100Hz vs one running at 10000Hz, persistence would be 20ms for both displays if your game is running at 50 FPS.
Pixel response time may be marginally improved at higher refresh rates, which is really a failing of the overdrive system in that display, but it has no effect on image persistence with a flicker-free display.
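A tiny sketch of that claim (my own simplification), with a strobed-backlight case added for contrast:

```python
# On a flicker-free (sample-and-hold) display a unique frame stays visible
# until the next unique frame arrives, so persistence depends on the
# framerate only; the refresh rate drops out entirely.

def persistence_ms(framerate: float, refresh_hz: float, strobe_ms=None) -> float:
    """Image persistence per unique frame, in milliseconds."""
    if strobe_ms is not None:
        return strobe_ms          # strobed backlight: only the flash is visible
    return 1000.0 / framerate     # sample-and-hold: refresh_hz is irrelevant

print(persistence_ms(50, 100))       # 20.0 ms
print(persistence_ms(50, 10000))     # 20.0 ms -- same, despite 100x the refresh rate
print(persistence_ms(50, 100, 1.0))  # 1.0 ms with a 1ms strobe, at any framerate
```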

If you forego G-Sync, and instead use a low-persistence display, you can get <1ms persistence at any framerate.
Even 1ms persistence is not totally free of motion blur, it's just significantly less than the 16-20ms (plus pixel response time) that you'll be seeing on a G-Sync LCD.

We're getting to the point where there are enough newer widescreen games where a 4:3 CRT just can't handle everything anymore.
There's always the FW900.

By the way, I actually made a program to query the games in MAME by refresh rate and was surprised myself by how many of the games don't run close to 60hz. It's interesting how many games actually run faster than 60hz, which makes strobing them really problematic without getting audio hitching. Overall, close to 60hz is probably the exception, not the rule.

Well, I don't know anyone using an ArcadeVGA setup. Anyone I know that is using a 15kHz CRT seems to be using CRT Emudriver, which offers a far bigger list of supported modes.
I don't know how comprehensive it is, or if it supports custom modes, because I've never used it myself.
But I'd recommend multi-sync PC monitors, not arcade/broadcast monitors or consumer televisions, unless you're really insistent on using a true low resolution display for these games.

You're repeating the same tired argument about games stuttering and having audio problems when your refresh rate is not an exact match for the framerate.

Modern emulators can lock the game speed to your refresh rate. This completely eliminates tearing, stuttering, and audio glitches. The shadows in Samurai Shodown will flicker on and off every other refresh perfectly.

Forcing the game to run at its original speed exactly, when your display is within a fraction of a percent of that, is just stupid.
That creates these hugely noticeable errors, instead of running at such a close speed to the original that no-one is going to notice, unless it's being compared side-by-side with the original.
Even then, if gameplay/audio is not synced between the two I think most people would have a hard time telling which is running at the original speed.

Outside of the highest level of players - and very likely even among them - I defy anyone to tell when a game is running within a fraction of a percent of the original speed, and not exactly the original speed.
We're not talking about PAL vs NTSC where the difference was almost 20%, we're talking a fraction of a percent.
 
Huh? R-Type synced to a 60Hz monitor runs at 109% speed, and it's clearly sped up to the point that it makes an already very hard game even more of a challenge. Take another merciless game like Seibu's Raiden II/DX at 108%: yes, even that little increase is too much.
Also you can't make working custom timings for every game; there are hundreds of different sub-60Hz arcade hardware platforms with hundreds of different resolutions, and tons of different LCDs that are more or less flexible with them. It's not a valid solution.
Whatever you say, you can talk to the hand: I've seen what G-Sync/FreeSync can do with MAME and it's golden. Next to that I don't give a damn about a little blur.

And what are you talking about in regards to CRTs? Nothing beats actual low-res consumer CRT sets, which are, apart from a few minor differences, the same as arcade monitors, assuming you have an RGB model like those easily found in Europe, or indeed a PVM for instance if you're not there (I'm no fan of BVMs, they're too 'good' imho). After all, they're the real deal, so why bother with a PC CRT?
The whole point of emulating at 15kHz is getting the same speed and resolution as the real arcade PCB on a real low-res CRT; that's what everyone in this hobby wants, for immensely obvious reasons.

Seriously, I don't know where you're coming from, but you're making several statements here that I find completely personal, diametrically opposed to pretty much everything the experienced arcade crowd would say.
 
Huh? R-Type synced to a 60Hz monitor runs at 109% speed, and it's clearly sped up to the point that it makes an already very hard game even more of a challenge. Take another merciless game like Seibu's Raiden II/DX at 108%: yes, even that little increase is too much.
I did not suggest running the games at 60Hz.
I said that on a multi-sync display you will be able to easily get within a fraction of a percent of the original speed.
From what I can see, Raiden II/DX runs at 55.470Hz. So if we use the same example from my previous post, of using lazy timings and running at 55.000Hz, the game speed is off by ~0.85%.
It will be far closer than that if you actually set up proper timings.

Is sacrificing ≥18x motion clarity worth ~0.85% speed accuracy?
(55.47Hz = 18ms persistence, CRT has ≤1ms persistence)

Also you can't make working custom timings for every game; there are hundreds of different sub-60Hz arcade hardware platforms with hundreds of different resolutions, and tons of different LCDs that are more or less flexible with them. It's not a valid solution.
Whatever you say, you can talk to the hand: I've seen what G-Sync/FreeSync can do with MAME and it's golden. Next to that I don't give a damn about a little blur.
So set up 50-65Hz in 1Hz increments, or whatever you deem necessary.
My whole point is that it's not at all necessary to be running at the exact speed.
You aren't going to notice if the speed is off by half a percent unless you're doing a direct comparison, whereas running at the exact speed and having it stutter and tear with audio glitches is something anyone would notice.

If your display won't sync below 60Hz, double the refresh rate. Sync that 55.47Hz game to 111Hz instead - which would also bring the speed within 0.05% of the original.

And what are you talking about in regards to CRTs? Nothing beats actual low-res consumer CRT sets, which are, apart from a few minor differences, the same as arcade monitors, assuming you have an RGB model like those easily found in Europe, or indeed a PVM for instance if you're not there (I'm no fan of BVMs, they're too 'good' imho). After all, they're the real deal, so why bother with a PC CRT?
The highest-end BVMs are what, 1000 TVL?
Even an average PC monitor will beat that, because they were designed to display high resolution content, not SD content.

I'd almost say that it's a waste to hook up a PC to a BVM or a PVM.
Those are displays for people running the original hardware.
A PC monitor offers just as good image quality and more flexibility for a PC-based setup.

The whole point of emulating at 15kHz is getting the same speed and resolution as the real arcade PCB on a real low-res CRT; that's what everyone in this hobby wants, for immensely obvious reasons.
Sure, if you want a true low-resolution display, nothing will emulate a <600 TVL 15kHz CRT. You need the real thing.
Then you're back to all the same limitations of getting 15kHz out of a PC, if you aren't using the original hardware.

Seriously, I don't know where you're coming from, but you're making several statements here that I find completely personal, diametrically opposed to pretty much everything the experienced arcade crowd would say.
Sure, and a lot of people in the "experienced arcade crowd" run their emulators at the original hardware speed without v-sync so the games are constantly tearing and stuttering.
Compared to that, G-Sync is a massive upgrade, even if it throws away all motion resolution.

If you're willing to compromise <1% speed accuracy you can get a perfectly smooth stutter/tear free image on a CRT while still having great motion clarity. (≤1ms persistence vs >16ms persistence)

G-Sync is wonderful for demanding 3D games which have variable framerates, for which no hardware exists which can run them at a locked framerate. But G-Sync's only advantage for emulation is latency, and a fraction of a percentage of speed accuracy.

As I said, if NVIDIA would enable a mode that combined G-Sync with ULMB and strobed the backlight natively at any refresh rate, then you have something which rivals or arguably beats the CRT. (if you had an OLED which does this, instead of an LCD, it absolutely beats the CRT)
 
Dude, you're wrong or making stuff up on so many points that it's no use; you're twisting what people say and mixing it up with your own stuff to try and force your point, when you're actually clearly misinformed, especially about actual hardware and 15kHz emulation.
There's no discussion here, I realize I'm talking to a wall. I'm outta here.
 
It's worth it, but I wish they gave it up and just went with FreeSync. It's a waste of money to get a monitor that may not work ideally with your video card in a few years.
 
Dude, you're wrong or making stuff up on so many points that it's no use; you're twisting what people say and mixing it up with your own stuff to try and force your point, when you're actually clearly misinformed, especially about actual hardware and 15kHz emulation.
Please tell me what I'm wrong about or "making up".
Because it seems like you're either skipping over or not understanding some of my points, and just saying "you're wrong" means this can't go anywhere.
 
It's worth it, but I wish they gave it up and just went with FreeSync. It's a waste of money to get a monitor that may not work ideally with your video card in a few years.

If you continue to buy Nvidia cards, it will always work... ;) Sucks to be locked in like that, though.
 
If you continue to buy Nvidia cards, it will always work... ;) Sucks to be locked in like that, though.

I don't know about that, it's pretty likely the two will combine into one standard sooner or later.
 
At the moment gsync is superior to freesync right? Or has freesync sorted out its kinks now?
It should be pretty marginal now that AMD have added Low Framerate Compensation to their drivers.
It was always said that FreeSync displays should be able to implement variable overdrive too, though I'm not sure if any do, so that would be one advantage that NVIDIA may still have.
Most G-Sync displays also support ULMB, but ULMB is so limited that it's a fairly useless feature.
 
It should be pretty marginal now that AMD have added Low Framerate Compensation to their drivers.
It was always said that FreeSync displays should be able to implement variable overdrive too, though I'm not sure if any do, so that would be one advantage that NVIDIA may still have.
Most G-Sync displays also support ULMB, but ULMB is so limited that it's a fairly useless feature.

I think G-Sync still has the edge. It's usually able to provide a wider frequency range compared to FreeSync, even on the same panel. And, as you said, it has well-implemented variable overdrive. I guess this is because Nvidia is the supplier of the G-Sync module (the scaler), and monitor manufacturers don't have to fiddle too much with it.

For FreeSync, the monitor manufacturers have to fiddle more with the firmware, I think, to make it work well with other scalers. They maybe have to implement this variable overdrive from scratch, and they aren't doing a good job yet. Several monitors were released and then had to be recalled to get patched overdrive or an updated frequency range. Overall, I don't feel comfortable buying a FreeSync monitor today, because they have messed up somewhere. With G-Sync, you seem to be guaranteed consistent behaviour. So in this regard, I think Nvidia has done a good job. But it's pretty terrible that this locks you into their ecosystem, and that the monitor you buy is tied to the graphics card.

Don't know too much about the inside technology of scaler/monitor/firmware though, so I shouldn't say too much. But that's my impressions.
 
I think G-Sync still has the edge. It's usually able to provide a wider frequency range compared to FreeSync, even on the same panel.
That's exactly the problem which Low Framerate Compensation should solve. As long as the panel supports a maximum refresh rate ≥2.5x its minimum, it should work just like G-Sync when the framerate falls below the minimum refresh rate that the panel can handle.

And, as you said, it has well-implemented variable overdrive. I guess this is because Nvidia is the supplier of the G-Sync module (the scaler), and monitor manufacturers don't have to fiddle too much with it.
That's true, I don't know if it's something that any of the FreeSync panels have solved yet.
NVIDIA solve this with a shader in their mobile G-Sync implementation, so it's certainly possible for AMD to do the same.

With G-Sync, you seem to be guaranteed consistent behaviour. So in this regard, I think Nvidia has done a good job.
That's true. Most G-Sync displays have the same featureset and should all have been optimized by NVIDIA, instead of hoping that the display manufacturer gets it right.
 
The only difference between g sync and freesync is that g sync always runs from 30Hz up to whatever the max refresh rate is. Freesync has a few models that do that, but not all.

The downside to the nvidia scaler is that your options for a g sync monitor are very limited and you're paying a $250+ premium. Freesync, on the other hand, has way more monitors to choose from, and they're cheaper to boot.

They both have their positives and negatives. To me, g sync doesn't offer any monitors over 32", and the 34" is the height of a 27" monitor. So no g sync monitors are appealing to me because they are way too small.
 
The only difference between g sync and freesync is that g sync always runs from 30Hz up to whatever the max refresh rate is. Freesync has a few models that do that, but not all.
Again: this is exactly what Low Framerate Compensation in the AMD Crimson driver fixes.
NVIDIA is not running these display panels at refresh rates lower than 30Hz either - that's a physical limitation of the panel.
NVIDIA is doubling refresh rates when the framerate drops below the panel's minimum refresh rate. The difference is that NVIDIA does it in the display hardware, while AMD does it in the GPU driver.

It seems like NVIDIA are able to achieve tighter tolerances with this than AMD, where AMD requires that the maximum refresh rate is at least 2.5x the minimum for LFC to work, but that should still cover most displays.
I do think that NVIDIA's solution is superior, but AMD can do it.

They both have their positives and negatives. To me, g sync doesn't offer any monitors over 32", and the 34" is the height of a 27" monitor. So no g sync monitors are appealing to me because they are way too small.
What I really want to see is an OLED TV with Adaptive-Sync support.
I'd be happy with G-Sync support too, but I don't see that ever happening.

However, as more and more games are being released on PC which are struggling to stay above 60 FPS (late-game Dark Souls III) I'm very tempted to pick up a G-Sync display until that happens. I just don't know that I'd be happy going from a big TV to a small monitor.
 
That's exactly the problem which Low Framerate Compensation should solve. As long as the panel supports a maximum refresh rate ≥2.5x its minimum, it should work just like G-Sync when the framerate falls below the minimum refresh rate that the panel can handle.

That's a bit interesting actually. Showing one frame over a certain timespan vs. refreshing the screen and showing the same frame 2 or 3 times during that same timespan. Maybe things like contrast, blur and brightness are affected. There must be a reason why Nvidia prefers to go down to 30Hz in their implementation.
 
That's a bit interesting actually. Showing one frame over a certain timespan vs. refreshing the screen and showing the same frame 2 or 3 times during that same timespan. Maybe things like contrast, blur and brightness are affected. There must be a reason why Nvidia prefers to go down to 30Hz in their implementation.
NVIDIA do exactly the same thing.
The issue was that AMD were not doing this until LFC was introduced, so if the monitor has a 30-75 range and the framerate is 25 FPS, it would have stuck to 30Hz. Now it will display 25 FPS at 50Hz, just like G-Sync does.
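Here's a rough sketch of what that frame-multiplication logic boils down to (my own simplification, not AMD's or NVIDIA's actual algorithm). With a 30-75Hz panel, a 2.5:1 ratio, there is always an integer multiple that lands back inside the supported range, which is presumably where the 2.5x requirement comes from:

```python
# Pick a panel refresh rate for a given game framerate: normal VRR when the
# framerate is inside the panel's range, frame repetition (LFC-style) below it.

def pick_refresh(fps: float, panel_min: float, panel_max: float) -> float:
    """Refresh rate to drive the panel at for a given game framerate."""
    if fps >= panel_min:
        return min(fps, panel_max)         # normal variable-refresh operation
    multiplier = -(-panel_min // fps)      # ceil(panel_min / fps): repeats needed
    refresh = fps * multiplier
    if refresh > panel_max:
        raise ValueError("panel range too narrow for frame multiplication")
    return refresh

print(pick_refresh(25, 30, 75))   # 50.0 -> 25 FPS shown as 2 identical refreshes
print(pick_refresh(14, 30, 75))   # 42.0 -> 3 identical refreshes
print(pick_refresh(40, 30, 75))   # 40.0 -> inside the range, no repeating needed
```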
 
The only difference between g sync and freesync is that g sync always runs from 30Hz up to whatever the max refresh rate is. Freesync has a few models that do that, but not all.

The downside to the nvidia scaler is that your options for a g sync monitor are very limited and you're paying a $250+ premium. Freesync, on the other hand, has way more monitors to choose from, and they're cheaper to boot.

They both have their positives and negatives. To me, g sync doesn't offer any monitors over 32", and the 34" is the height of a 27" monitor. So no g sync monitors are appealing to me because they are way too small.


You are paying a premium because you are getting a better monitor with g-sync.

If you go with 1440p 144Hz IPS, then the cheaper FreeSync option from Asus is missing a strobing mode and has a 35-90Hz FreeSync range, or if you go with the FreeSync Eizo you are paying more than for the Asus PG279Q or Acer XB271HU and still have two FreeSync ranges that need to be switched manually.

If you go with 1440p 144Hz 8-bit TN, then the Dell S2716DQ is currently cheaper than the FreeSync BenQ 2730Z, and other FreeSync screens are either missing strobing or ergonomic adjustment.
 
if you go with the FreeSync Eizo you are paying more than for the Asus PG279Q or Acer XB271HU and still have two FreeSync ranges that need to be switched manually.
The 56-144Hz range on that Eizo covers everything with LFC, since the maximum refresh rate is >2.5x the minimum, so you shouldn't have to switch ranges any more.

If you go with 1440p 144Hz 8-bit TN, then the Dell S2716DQ is currently cheaper than the FreeSync BenQ 2730Z, and other FreeSync screens are either missing strobing or ergonomic adjustment.
As far as I am aware, the only monitor which has good strobing options is the BenQ 2720Z, since it's the only one which offers a single strobe per refresh at any rate the monitor will sync to.
NVIDIA's ULMB is largely useless in many games because it only operates at 85/100/120Hz.
Most other strobe modes are equally useless because they strobe multiple times per frame, or have similar restrictions.
 
You are paying a premium because you are getting a better monitor with g-sync.

If you go with 1440p 144Hz IPS, then the cheaper FreeSync option from Asus is missing a strobing mode and has a 35-90Hz FreeSync range, or if you go with the FreeSync Eizo you are paying more than for the Asus PG279Q or Acer XB271HU and still have two FreeSync ranges that need to be switched manually.

If you go with 1440p 144Hz 8-bit TN, then the Dell S2716DQ is currently cheaper than the FreeSync BenQ 2730Z, and other FreeSync screens are either missing strobing or ergonomic adjustment.

You're paying a premium because of the g sync module. The g sync monitors are the exact same panels as the freesync models, just with a wider range, and now that freesync has LFC that's almost irrelevant. For instance, the g sync X34 is the same monitor as the freesync version, with the "possibility" of 100hz vs 75hz, so in that case you're literally paying the premium for 25hz that you might not even get. Adaptive sync functions the same between the two. Only like 2 g sync monitors have more than DP, which is a negative for many people. FreeSync has way more partners, monitors, freesync over hdmi, and Intel on board. G sync will be going the way of PhysX very soon. There's one 32" g sync 4k monitor, and nothing in the foreseeable future over 40" or even close to that. The X34 is like $1200+; a 40" or bigger g sync screen would be like $2k. I got a 49" IPS 10-bit passive 3D freesync display with USB 3.0 and hdmi 2.0 for less than $1k. That experience is not possible on g sync, nor may it ever be, especially at that price.
 
for pro CSGO stuff, is IPS + gsync as good as TN film? or is tn still the way to go?
144 hz IPS is still slower in terms of some transitions, so there will be more blur/ghosting, but the input lag and overall response times are comparable on the recent crop of gaming monitors.
 
do you know of a side-by-side comparison of IPS G-Sync vs TN for input lag? i can't find any; you'd think it would be common
 
in complete agreement with zone here. i will never use a display that doesn't strobe again whether i have to deal with stutter/tearing or not, sample and hold is absolutely disgusting.
Using the FW900 here and I do not know of this blur you speak of? lulz
 
Do you turn off strobing on the desktop? The flicker should be annoying...
 
for pro CSGO stuff, is IPS + gsync as good as TN film? or is tn still the way to go?

If you are buying a G-Sync monitor you can probably also afford, or already own, a card capable of running CSGO at 144fps. Hence a strobing function is more useful. The G-Sync monitors have the ULMB mode you can use instead, but for purely CSGO you may as well consider good monitors without adaptive sync that do have strobing, like the LG 24GM77.
 