Variable Refresh Rate Monitors (G-Sync) --- Refresh Rate Varies While You Play!!!

You obviously don't have even a basic understanding of the topic to be making such silly comments.

Explain your reasoning and I will demonstrate the opposite; I mean that you are the one who hasn't understood the topic.
 
How is there a software solution? How does triple buffering change the monitor refresh rate so it matches the frame rate of the game? This is what the technology is all about, there is no way of doing that through software.
 

There's no need to change the refresh rate of the monitor. The intent of triple buffering is to show all the frames the video card can draw on the monitor without tearing, and that is what triple buffering does. End of story.
 

How about watching the video and educating yourself? They show exactly what happens with v-sync + triple buffering AND the framerate dropping below the refresh rate: stuttering*. G-Sync is the ONLY solution to that. There is no software solution at all, and there can't be any.

*because when the framerate is not equal to the refresh rate, or the refresh rate is not an integer multiple of the framerate, there is NO possibility of TRULY smooth motion. It is not physically possible in our world. You can use motion blur and whatnot all you want, that's still not a solution :)
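
A rough back-of-the-envelope sketch of why that is (my own illustrative numbers, not from NVIDIA's demo): with ideal v-sync on a fixed 60Hz panel, a finished frame has to wait for the next whole refresh, so any framerate that doesn't divide evenly into the refresh rate gets held for an uneven mix of refresh cycles.

```python
from fractions import Fraction
import math

def display_durations_ms(fps, hz, frames=9):
    """How long each rendered frame stays on screen with ideal v-sync."""
    refresh_ms = 1000 / hz
    durations, prev_slot = [], 0
    for i in range(1, frames + 1):
        # the refresh cycle that is able to show frame i
        slot = math.ceil(Fraction(i * hz, fps))
        durations.append(round((slot - prev_slot) * refresh_ms, 1))
        prev_slot = slot
    return durations

print(display_durations_ms(47, 60))  # uneven mix of 16.7 ms and 33.3 ms -> judder
print(display_durations_ms(30, 60))  # steady 33.3 ms every frame -> even cadence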
 

Where is this video?
If the framerate drops, it drops; there is no way to see fluid motion if it drops.
 
With G-Sync there are no framerate drops below the v-sync frequency, because there is no fixed syncing frequency.

A lower framerate will never cause stutter.
Less fluidity and more motion blur, yes, but never stutter.
It is stutter that makes playing a game at a varying 40-60fps a nightmare, not the lack of fluidity itself.
On a 144Hz monitor, even playing games at lower framerates feels much better than on 60Hz.
G-Sync will be superior to 144Hz monitors! It will be superior to a hypothetical 1000Hz monitor :D
 
G-Sync looks like it will be fantastic, but so far the talk of real products is all about 120Hz-144Hz TN panel displays. Are there any signs of G-Sync being adopted into any IPS/PLS/VA panels?
 

Yup, says here it will be included on 2560x1440 and 4K models so they won't be TN.

Edit: Also on the official G-sync page - http://www.geforce.com/hardware/technology/g-sync/technology. "monitors will range in size and resolution, scaling all the way up to deluxe 3840x2160 “4K” models".
 

Yes, I plan on purchasing the first 4K monitor with G-Sync.

I surely hope the reason the Dell 31.5" IGZO 4K monitor has been delayed to the end of the year is that they are implementing G-Sync. Especially considering that IGZO panel has been out for a good half a year now. If Dell puts its stellar semi-gloss AR film on it, with the great picture quality of that 4K IGZO panel and G-Sync, that display would be epic.
 
It's here http://www.engadget.com/2013/10/18/nvidia-g-sync/ but you're sounding like a lost cause.

I've never seen that much stuttering in games where I can't hold a rock-solid 60FPS.
In that video the stuttering is quite difficult to perceive; if it is difficult to perceive in a demo with a pendulum, I can't understand how you can see the difference while playing.

Surely, if your GPU can't manage at least 40FPS, that is not stuttering but simply a low framerate, and the only solution to a low framerate is buying a better GPU or lowering settings; no G-Sync can solve a low framerate.
Better to save your money on G-Sync and buy better GPUs.
 

You really don't understand.
 
@sblantipodi
Did you know that SLI increases input lag?
No? Then it just means you are immune to it and G-Sync is not for you.
Some people, like myself, are irritated by such effects, and they reduce the pleasure of gaming, hence the need for better gaming displays.

If you are happy with your 60Hz SLI input-lag hell, then at least don't be ignorant in this topic...
 
Where is this video?
If the framerate drops, it drops; there is no way to see fluid motion if it drops.
Actually it's remarkable; fluidity still occurs.

That's because the position of each moving object in the frame is correct along the motion vector axis. Put a ruler along the display and imagine an object moving at a constant speed. Refresh the screen at random intervals, but paint the object at the correct position on a time basis.

That's EXACTLY how G-Sync solves the stutter problem.
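
Here's a tiny sketch of that ruler thought experiment (my own toy numbers, just to illustrate the idea): refresh at irregular intervals, but always draw the object where it should be at that instant, and the samples still lie on the ideal motion line.

```python
import random

# irregular refresh intervals (as a variable-refresh monitor would have),
# but the object is always painted at the position for *that instant*
speed_px_per_sec = 960.0
t = 0.0
for _ in range(8):
    t += random.uniform(1 / 144, 1 / 30)   # next refresh arrives whenever it arrives
    x = speed_px_per_sec * t               # position derived from real time, not frame count
    print(f"refresh at t = {t * 1000:6.1f} ms -> object drawn at x = {x:7.1f} px")
```

Every sample lands exactly on x = speed × t, so there is no positional error for the eye to interpret as stutter; only the spacing between samples (the blur trail) changes.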

Yes, there ARE minor side effects occurring (variable motion blur; the "ghosting" can increase/decrease), but it's still zero stutter.
On non-strobed variable-framerate displays, motion blur is inversely proportional to framerate (proportional to frame time). The motion blur trail length is the distance the object moves between two frames -- and that's exactly how I designed www.testufo.com/eyetracking to work. The illusion (created out of motion blur), generated by my precise knowledge of motion blur, is surprising to those who have not seen it before. Also, if you run www.testufo.com/blurtrail at a constant speed (e.g. 1000 pixels/sec), you will see that the motion blur likewise shrinks as the refresh rate (framerate=refresh rate) rises.

Now, variable refresh rate displays solve the stutter problem completely.

Without G-Sync: [image: fps-vs-hz-1024x576.png]

With G-Sync: [image: fps-vs-hz-gsync.png]


It's true: Zero stutters during variable frame rate situations.
The only side effect is variable motion blur, because motion blur is inversely proportional to framerate on non-strobed LCDs. Just see for yourself at www.testufo.com
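
If you want to play with the numbers yourself, here is a minimal sketch of that relationship (assuming you're eye-tracking a constant-speed object on a non-strobed, sample-and-hold display):

```python
def blur_trail_px(speed_px_per_sec, fps):
    # trail length ~= distance travelled during one frame of eye tracking
    return speed_px_per_sec / fps

for fps in (30, 60, 85, 120, 144):
    print(f"{fps:3d} fps -> {blur_trail_px(960, fps):5.1f} px of blur at 960 px/s")
```

At 960 px/s the trail shrinks from about 32 px at 30fps to under 7 px at 144fps: less blur as the framerate rises, but never any stutter.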
 
@sblantipodi
Did you know that SLI increases input lag?
No? Then it just means you are immune to it and G-Sync is not for you.
Some people, like myself, are irritated by such effects, and they reduce the pleasure of gaming, hence the need for better gaming displays.

If you are happy with your 60Hz SLI input-lag hell, then at least don't be ignorant in this topic...

Quite funny how a person with an i7 860 can be so squeamish :D
Upgrade your CPU before thinking about input lag; in your case the input lag comes from the CPU, not the monitor or the refresh rate. After you have upgraded the CPU you can talk about input lag.
With all due respect, I'm your friend and don't want to be rude. :)
 

This video confirms what I'm saying.
It is difficult to see the difference with a pendulum in a demo; impossible to see the difference while playing.
In any case, I don't like the idea of having to think about my monitor every time I change my video card.
If this tech is a success, then there will be a G-Sync 2, then a G-Sync Ti, then a G-Sync Mega, etc.
They will find a way to create a reason to change your monitor every time you change your GPU. If you like that, fine, it's a good technology.
 
If there's a pause for a full fraction of a second (e.g. 1/5th second freeze), you WILL see it even with nVidia G-Sync.
But if you're smoothly fluctuating 30fps through 144fps, all you see is motion blur smoothly increasing/decreasing.
 
This video confirms what I'm saying.
It is difficult to see the difference with a pendulum in a demo; impossible to see the difference while playing.
Fixed-Framerate Video is INVALID proof.

It's impossible to see the difference in the video because the video converts the variable framerate into a fixed framerate. This re-slots the correctly timed frames into incorrectly timed frames. The video is LOSING timing information. I can confirm that the difference is massively more dramatic in person (more than an order of magnitude) than in video. Also, you can't capture the LightBoost effect in regular video, either. Regular 30fps or 60fps video doesn't preserve strobe timing information either.
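
A toy illustration of how the re-slotting destroys the timing (hypothetical presentation timestamps, not measurements):

```python
# hypothetical G-Sync presentation timestamps (ms) -- NOT measured data
presented_ms = [0.0, 21.3, 38.9, 63.0, 71.4, 98.2]
video_frame_ms = 1000 / 30                     # the capture runs at a fixed 30 fps

for t in presented_ms:
    slot = round(t / video_frame_ms) * video_frame_ms
    print(f"presented at {t:5.1f} ms -> stored at {slot:5.1f} ms (error {slot - t:+6.1f} ms)")
```

Frames that were presented at distinct, correctly timed moments get shoved into the same 33.3 ms video slot, so the capture literally cannot show you what it looks like in person.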

I can safely say this: I would rather upgrade to G-Sync than spend $400 on an extra graphics card to put into SLI. For modern, highly variable-framerate games such as Crysis 3, using G-Sync is a better investment in motion fluidity than adding 50% to your framerate. Some people may not agree, but the difference is that dramatic. (...only if I were forced to choose between them and needed to play my variable-framerate games. However, if I can have my cake and eat it too: get both the GPU upgrade *and* G-Sync! And throw in strobing as well.)

Just as you may not believe in LightBoost, I believe in LightBoost (and many do rave about it).
Just as you may not believe in G-Sync, I believe in G-Sync (and everyone who has seen it in operation raves about it).
G-Sync is Blur Busters approved!
 
I have added a new prominent G-Sync section to the Blur Busters website. It's going to be rapidly expanding in 2014.
 
To be very clear, there is a very minor side effect of variable frame rates on a variable-refresh monitor: Variable motion blur.
- As framerates go down, your eyes perceive more motion blur.
- As framerates go up, your eyes perceive less motion blur.
This effect is self-explanatory in the animations at www.testufo.com where lower framerates look blurrier than higher framerates.

BUT, as long as frames don't dip to slideshow framerates (e.g. under 30fps), there are no stutters with G-Sync.
Stutter is the erratic shakiness of the image.
There is only the side effect of varying motion blur.

If framerates dip to 24fps, it just looks like a movie framerate. No stutters. At really low framerates, the motion blur becomes flickery (edge strobing) just like you see in movies. Perfect 20fps@60Hz doesn't have any stutters either.
 
It is difficult to see the difference with a pendulum in a demo; impossible to see the difference while playing.

Incorrect. Your eyes are very capable of seeing stuttering and tearing in almost any moving image, no matter what it is, at the refresh rates we deal with today: 144Hz and below.
 
It's worth pointing out that G-Sync only solves a certain classification of stuttering related to the output chain. Frame time variance will still cause issues in motion quality.

The perfect 'fluidity' of moving objects is only guaranteed if the difference between the frame's rendering time and the tick that computes the object's next position is identical from one frame to the next.
 
Yes, I plan on purchasing the first 4K monitor with G-Sync.

I surely hope the reason the Dell 31.5" IGZO 4K monitor has been delayed to the end of the year is that they are implementing G-Sync. Especially considering that IGZO panel has been out for a good half a year now. If Dell puts its stellar semi-gloss AR film on it, with the great picture quality of that 4K IGZO panel and G-Sync, that display would be epic.

That would be pretty amazing to say the least.

Dare I ask what something like this is going to cost?
 
It's worth pointing out that G-Sync only solves a certain classification of stuttering related to the output chain. Frame time variance will still cause issues in motion quality.

The perfect 'fluidity' of moving objects is only guaranteed if the difference between the frame's rendering time and the tick that computes the object's next position is identical from one frame to the next.
Yes, you are right.

But that's now purely a matter of software design, since G-Sync has made it possible for game software to take advantage of it. Proper 3D game rendering would ensure that the timing of object positions (in virtual game time) corresponds to the actual real-world time at which the frame is displayed.

Most 3D games are already "attempting" to do it this way, because that's the way it looks best during VSYNC OFF -- at least in the games that have the least-jittery, smoothest-looking VSYNC OFF. But this isn't perfect because only partial frames are delivered and you get tearing. G-Sync fills in the missing piece of the puzzle: GPU-monitor synchrony during variable GPU framerates.
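
A minimal sketch of that software side (not any particular engine's code; the speed and frame times are made up): advance game time by the measured real frame time, so that whenever the variable-refresh display shows the frame, the object is where it should be.

```python
import time

pos_px = 0.0
SPEED_PX_PER_SEC = 960.0

def update(dt_seconds):
    """Advance the object by the *measured* real time, not a fixed tick."""
    global pos_px
    pos_px += SPEED_PX_PER_SEC * dt_seconds

prev = time.perf_counter()
for frame in range(5):                 # stand-in for the render loop
    now = time.perf_counter()
    update(now - prev)                 # game time tracks wall-clock time
    prev = now
    print(f"frame {frame}: object at {pos_px:7.2f} px when presented")
    time.sleep(0.013)                  # pretend this frame took ~13 ms to render
```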
 
That would be pretty amazing to say the least.
Dare I ask what something like this is going to cost?
I'd personally hope that all 4K G-Sync monitors have a strobe mode, or nVidia is breaking their promise:

nVidia: “We have a superior, low-persistence mode that should outperform that unofficial [LightBoost] implementation, and importantly, it will be available on every G-SYNC monitor. Details will be available at a later date.”.
 

So I was mistaken on my guess of redundancy. Interesting. BOTH at the same time.

I really am looking forward to seeing what this looks like. :)
 
So I was mistaken on my guess of redundancy. Interesting. BOTH at the same time.
Nope.
You can only have one or the other (currently).

John Carmack (@ID_AA_Carmack) tweeted:
“@GuerillaDawg the didn’t talk about it, but this includes an improved lightboost driver, but it is currently a choice — gsync or flashed.”


See this post:
CONFIRMED: nVidia G-Sync includes a strobe backlight upgrade!

G-Sync == superior for variable framerates.
LightBoost == superior for fixed full framerates (120fps @ 120Hz). LightBoost stutters noticeably during variable framerates.
However, it should technically be possible to combine both.
 
If there's a pause for a full fraction of a second (e.g. 1/5th second freeze), you WILL see it even with nVidia G-Sync.
But if you're smoothly fluctuating 30fps through 144fps, all you see is motion blur smoothly increasing/decreasing.

30fps, even perfectly synced, is still nowhere near smooth for most people, and I think this would easily be detected as stutter, even if it's just fractions of a second / a few frames. 60fps is also not smooth for many.
But if you decreased the range (which is also a very realistic usage scenario) and said "But if you're smoothly fluctuating from 85fps through 144fps, all you see is motion blur smoothly increasing/decreasing", then I think it would be true for almost everyone.
I'm sure G-Sync will greatly reduce stutter during huge drops like 144->30fps as well, but I don't see any way it could mask them completely during such extreme fps drops.
 
30fps, even perfectly synced, is still nowhere near smooth for most people, and I think this would easily be detected as stutter
Let's carefully define terminology.
When I said "stutter", I mean erratic-looking motion (e.g. the kind you see during 47fps@60Hz, 35fps@60Hz, etc)

Basically, everything will look like framerate=Hz at all times.
G-Sync looks like 30fps@30fps whenever the GPU runs at 30fps.
G-Sync looks like 45fps@45fps whenever the GPU runs at 45fps.
G-Sync looks like 60fps@60fps whenever the GPU runs at 60fps.
It is continually staying in sync so the smoothness is always maxed-out for a specific Hz
etc.

Perhaps the accurate term is, "G-Sync completely eliminates irregular stutters".
Given this, Blur Busters will need to be careful when defining the terminology.
Stutter, judder, jerkiness, etc, can have different meanings for different people.
 

Hm, yeah, not sure what the "correct" definition of visual stutter is. It seems to be a word originating from sound.
In my mind it is anything that is not smooth, making the motion look choppy.
For example, 60fps@60Hz looks smooth if you move straight forward, but as soon as you do even reasonably fast FPS-gaming panning, 60fps@60Hz stutters constantly (is not smooth) until the panning ends.
 
as soon as you do even reasonably fast FPS-gaming panning, 60fps@60Hz stutters constantly (is not smooth) until the panning ends.

Something's wrong with your game or setup. Or v-sync off.
 
60fps@60Hz stutters constantly (is not smooth) until the panning ends.
Try again with VSYNC ON, or compare with www.testufo.com

G-Sync motion fluidity:
-- is similar-looking to VSYNC ON at framerate=Hz
-- does not look like VSYNC OFF at framerate=Hz

What's different is that G-Sync can maintain the "VSYNC ON framerate=Hz" look-and-feel at all framerates.
 
Something's wrong with your game or setup. Or v-sync off.

No, there are just nowhere near enough frames to do fast pans at 60fps. 60fps vsynced is fast enough for movie-speed movement/pans, but not fps-gaming speed pans, objects/players quickly moving past at an angle, etc.
 
Trolls are going to troll I suppose. Nvidia is not spending millions to develop a gimmick.

I am just going to start saving up a war chest and wait and see what unfolds next year. I REALLY want to upgrade my Acer 235HZ monitor. Not that I hate it but I really want to get a bit bigger and with better colors. It just seems stupid to do so until this tech releases.
 
No, there are just nowhere near enough frames to do fast pans at 60fps. 60fps vsynced is fast enough for movie-speed movement/pans, but not fps-gaming speed pans, objects/players quickly moving past at an angle, etc.
Then you've got a heightened sensitivity to 60fps@60Hz edge-strobing issues (similar to 30fps@60Hz). Some people see the edge strobing, others see motion blur. But that's not perceived as erratic/random stutter.

Still, 60fps@60Hz always will be smoother than 55fps@60Hz or 65fps@60Hz.

Below a certain framerate (corresponding to your flicker fusion threshold), you may see a ramp-up/ramp-down in edge flicker (edge strobing -- like the 30fps UFO at www.testufo.com ...) as the framerate varies, but the motion would not look erratic with G-Sync. From that perspective, there's no erratic/random stutter, provided object position calculations are done correctly (keeping real-world frame times in sync with virtual game physics times), so everything is where it is expected to be the moment the refresh occurs.

Beyond a certain framerate (corresponding to your flicker fusion threshold), edge flicker is so fast that it blends into motion blur. Also, if the LCD panel has slow GtG response, edge flicker won't occur until the framerates are low enough. For example, 30fps edge flicker (sometimes called "strobing") is more visible on 1ms LCDs than on old 8ms+ LCDs. Today's 120Hz and 144Hz LCDs are fast enough that some people DO see edge flicker on these panels during 60fps@60Hz or 60fps@120Hz, making motion look less smooth than 60fps@60Hz on an old 8ms monitor (which looks smoother, but with more motion blur). The less motion blur, the higher the framerate needs to be before motion looks smooth to your eyes.

Most people's flicker fusion threshold is around the 75-85Hz level, so framerates fluctuating between 75fps and 144fps will have no edge-strobing effect (the edge strobing is so fast it turns into motion blur), and thus look 'smooth'.
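
As a toy classification along those lines (assuming a personal flicker fusion threshold of about 80Hz; this varies from person to person):

```python
FUSION_HZ = 80   # assumed personal flicker fusion threshold; varies by person

for fps in (30, 45, 60, 75, 90, 120, 144):
    look = "visible edge strobing" if fps < FUSION_HZ else "blends into smooth motion blur"
    print(f"{fps:3d} fps on a variable-refresh display -> {look}")
```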
 
How is there a software solution? How does triple buffering change the monitor refresh rate so it matches the frame rate of the game? This is what the technology is all about, there is no way of doing that through software.
Both hardware and software are cooperating.

For example, the frame is pushed to the monitor the moment the Direct3D Present() API is called. Basically, the existing Direct3D software API now triggers delivery of the frame to the monitor if the monitor is currently waiting for a frame, instead of being forced to wait for the next scheduled refresh interval. Very close integration of the software with the hardware is necessary in order to support variable-refresh-rate monitors.
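
A hedged sketch of the timing difference (my own made-up render times; this is not NVIDIA's implementation, just the idea): with fixed-Hz v-sync a finished frame waits for the next refresh boundary, while a variable-refresh display scans it out as soon as Present() is called, within the panel's supported refresh range.

```python
import math

REFRESH_MS = 1000 / 60                        # fixed 60 Hz refresh period
render_ms = [9.0, 14.5, 22.0, 12.3, 18.7]     # hypothetical per-frame GPU render times

t = 0.0
for ms in render_ms:
    t += ms                                   # the moment Present() is called
    vsync_shown = math.ceil(t / REFRESH_MS) * REFRESH_MS   # wait for the next fixed refresh
    print(f"frame ready at {t:6.1f} ms | fixed 60 Hz v-sync shows it at {vsync_shown:6.1f} ms"
          f" | variable refresh shows it at {t:6.1f} ms")
```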
 
I'd personally hope that all 4K G-Sync monitors have a strobe mode, or nVidia is breaking their promise:

nVidia: “We have a superior, low-persistence mode that should outperform that unofficial [LightBoost] implementation, and importantly, it will be available on every G-SYNC monitor. Details will be available at a later date.”.

Hm, I don't think they would attempt strobing on 60 Hz 4K monitors, as that would be way too annoying, like old 60 Hz CRTs. They would have to bring frame interpolation into play, 120+ Hz.
 