I bought a G-Sync monitor; does it still make sense to have the latest and greatest video card?

GMcDougal
Bought a Dell S2716DGR, which is fantastic after the calibrated ICC profile is loaded. Now that things run smoothly all the time thanks to G-Sync, does it make sense to upgrade to a 1080 or a 1080 Ti?
 
An OCed 980 Ti trades blows with a 1070. The 1080 is going to be a 15-20% improvement in the best-case scenario. So not nothing, but to my mind not worth the $300+ cost after selling your current card, unless it's really not doing something you want right now.
 
Bought a Dell S2716DGR, which is fantastic after the calibrated ICC profile is loaded. Now that things run smoothly all the time thanks to G-Sync, does it make sense to upgrade to a 1080 or a 1080 Ti?
Yes. The benefits of higher frame rates are still going to be present, you're just not dealing with screen tearing and judder anymore. With G-Sync I can still instantly tell if a game is running at 60 FPS or lower compared to 80-100 FPS.
 
I'd wait for a 1080Ti or equivalent. I have a 980Ti and your monitor is potentially on my shortlist and that's what I'm going to do.

Yes. The benefits of higher frame rates are still going to be present, you're just not dealing with screen tearing and judder anymore.

With G-Sync I can still instantly tell if a game is running at 60 FPS or lower compared to 80-100 FPS.

Really? That seems to go against the hype, marketing, and purpose (at least how I understood it); otherwise you are some kind of freak. ;) :)

I have a 980Ti, and the OP bought a monitor potentially on my shortlist, so that's why this got my attention.
 
Don't hesitate, Q-BZ, it's a great monitor. Unfortunately, after one day I got a stuck pixel and exchanged it for another one, if that tells you how much I like it.
 
Really? That seems to go against the hype, marketing, and purpose (at least how I understood it); otherwise you are some kind of freak. ;) :)

Not at all. The purpose is to avoid tearing, stuttering, etc.

The benefit for much less demanding people is that lower frame rates become a better experience. But as a more dedicated FPS gamer, you still want the higher FPS. You can buy 240 Hz G-Sync monitors for the same reason.
 
I'd wait for a 1080Ti or equivalent. I have a 980Ti and your monitor is potentially on my shortlist and that's what I'm going to do.

Really? That seems to go against the hype, marketing, and purpose (at least how I understood it); otherwise you are some kind of freak. ;) :)

I have a 980Ti, and the OP bought a monitor potentially on my shortlist, so that's why this got my attention.
I lock the framerate to 60 FPS for Titan Quest, which I have been playing a lot of recently. That game gets all sorts of fucked up at high framerates, and I don't mind locking it down for that one. Sometimes I forget to disable the cap, though, so when I go into BF1, for example, I curse myself for not remembering. G-Sync is a godsend when I want to crank all the visual settings and DSR; I'll happily play a game running around 40 FPS.
 
Gsync is a double-edged sword.

It's amazing, but after getting used to 90+ fps on the reg, I can't go back.

It's ruined most games for me if I can't hit at least 75 FPS with decent quality settings.


On the plus side, with Gsync you get every penny out of your GPUs going forward.
 
It won't cure the fundamental itch that you--and most of us on this board--have to upgrade more often than we reasonably should.

That said, buy moar and buy [H]ard brah.:cool:
 
Don't hesitate, Q-BZ, it's a great monitor. Unfortunately, after one day I got a stuck pixel and exchanged it for another one, if that tells you how much I like it.

Not sure what to buy.

I see so much panel roulette and nonsense for a lot of them that it's kept me on the fence.
 
Not sure what to buy.

I see so much panel roulette and nonsense for a lot of them that it's kept me on the fence.
The display the OP mentions is a solid choice with a lower percentage of possible issues, plus you get Dell warranty and support. Unfortunately, as far as the panel lottery goes, there aren't any good IPS picks in the 16:9 realm, last time I checked.
 
The display the OP mentions is a solid choice with a lower percentage of possible issues, plus you get Dell warranty and support. Unfortunately, as far as the panel lottery goes, there aren't any good IPS picks in the 16:9 realm, last time I checked.

I second that. Despite all the "TN eww" from a lot of people, when it comes to these specific gaming G-Sync monitors, Dell's S2716DG is a solid quality choice with a really great picture for gaming, which is the sole purpose of this monitor. I've been using it for more than six months already. For a mere $480 I got a top-quality monitor of a proper size for its resolution that will satisfy my gaming needs for years to come.
 
I second that. Despite all the "TN eww" from a lot of people, when it comes to these specific gaming G-Sync monitors, Dell's S2716DG is a solid quality choice with a really great picture for gaming, which is the sole purpose of this monitor. I've been using it for more than six months already. For a mere $480 I got a top-quality monitor of a proper size for its resolution that will satisfy my gaming needs for years to come.

Despite my own caveats against TN... at least the 6-bit ones... I do have this one and its 24-inch brother on my short list.
 
I second that. Despite all the "TN eww" from a lot of people, when it comes to these specific gaming G-Sync monitors, Dell's S2716DG is a solid quality choice with a really great picture for gaming, which is the sole purpose of this monitor. I've been using it for more than six months already. For a mere $480 I got a top-quality monitor of a proper size for its resolution that will satisfy my gaming needs for years to come.
I agree. The 8-bit TN panel used in the Dell and my ASUS is wonderful. It's hard to tell the difference in color from head on between common IPS panels and this one. Just felt I should put that out there for anyone who prefers IPS for whatever reason.
 
As others have said, having a G-Sync monitor doesn't magically make games look smoother; it just gets rid of tearing regardless of framerate. You are still going to want the highest FPS possible (or at least 60 FPS minimum).

EDIT: That said... I'm not sure why you would buy a $700 TN panel when there are IPS G-Sync monitors for similar or less.
 
EVGA 1080 + S2716DGR owner here; I absolutely love the combo and completely recommend it. I originally ran the S2716DGR with a 970, but the frames weren't to my liking, I wasn't getting enough with the eye candy turned on, and I decided to upgrade to the 1080.

The difference was something else; I couldn't believe the smoothness the first time I fired up a game. Solid frames between 80-110 in Witcher 3 with HW off, 70-80 in GTA V with 2x MSAA, and, with some tweaking in ARK, 70-90+ depending on the location, etc. IMO, it's important to pair this awesome monitor with a nice graphics card to reap the benefits of the higher Hz.

If your bankroll allows, I wouldn't hesitate to pull the trigger.
 
As others have said, having a G-Sync monitor doesn't magically make games look smoother; it just gets rid of tearing regardless of framerate. You are still going to want the highest FPS possible (or at least 60 FPS minimum).

EDIT: That said... I'm not sure why you would buy a $700 TN panel when there are IPS G-Sync monitors for similar or less.
No, it definitely does make games smoother, unless for some reason you were playing with VSync on and its god-awful input lag before. Not sure where you got $700 from.
 
Does VSync need to be turned on in the NVIDIA Control Panel when G-Sync is enabled?
 
IMO, absolutely. G-Sync is going to give you a performance boost (you can leave VSync off with no tearing), but even with that it doesn't mean you can pull 60+ FPS in all modern games, especially when you factor in that the monitor is 2560x1440 and supports a high refresh rate if your card can push it.
As always, I'm a fan of buying the single fastest card you can afford at the time. There isn't much point to having a monitor like that if you aren't able to take advantage of it.
 
As others have said, having a G-Sync monitor doesn't magically make games look smoother; it just gets rid of tearing regardless of framerate. You are still going to want the highest FPS possible (or at least 60 FPS minimum).

EDIT: That said... I'm not sure why you would buy a $700 TN panel when there are IPS G-Sync monitors for similar or less.
I can't stand IPS glow, so that's why I go TN. As I stated earlier, the color reproduction on the 8-bit panel used in the PG278Q and S2716DG is as good as the average IPS panel I've seen in person, so the only real downside is the viewing angles.
Does VSync need to be turned on in the NVIDIA Control Panel when G-Sync is enabled?
No, it doesn't. V-Sync will just prevent the framerate from going above the maximum refresh rate, the point at which screen tearing again becomes a possibility. You will still have to deal with the input lag that comes with V-Sync should this happen. I personally use Fast Sync with G-Sync instead, but most of the time my games won't get close to the 144 Hz barrier anyway. If your games are running at or above 120/144 Hz, then you should use a frame cap and ULMB instead, since ULMB still gives superior motion clarity.
 
No, it definitely does make games smoother, unless for some reason you were playing with VSync on and its god-awful input lag before. Not sure where you got $700 from.

You mean if you weren't using VSync? Otherwise I don't understand what you're saying.

And $700 was from Googling the price of that model name.
 
EDIT: That said... I'm not sure why you would buy a $700 TN panel when there are IPS G-Sync monitors for similar or less.

AU Optronics panel roulette.

I still have the Asus PG279Q (everything I want on paper) and the Acer equivalent on my short list despite that. A lot of us are in the same boat, going back and forth on this, which is keeping us on the fence. These have been closer to $800 new from what I've seen.
 
As others have said, having a G-Sync monitor doesn't magically make games look smoother; it just gets rid of tearing regardless of framerate. You are still going to want the highest FPS possible (or at least 60 FPS minimum).

EDIT: That said... I'm not sure why you would buy a $700 TN panel when there are IPS G-Sync monitors for similar or less.


G-Sync is not there to get rid of tearing; if that were all you wanted, you could go with VSync.

G-Sync is just a variable refresh rate, so you don't get FPS drops (aka stutter) from your frame rendering time missing its deadline and having to wait for an entire new refresh before a buffer swap can happen and the GPU can continue rendering (under double buffering).

So YES, what G-Sync basically brings to the table is removing frame drops, actually making things smoother, and reducing input lag.

Or in other words: G-Sync came to be to fix a drawback of enabling VSync. VSync had already gotten rid of tearing (any kind of sync will).
 
G-Sync is not there to get rid of tearing; if that were all you wanted, you could go with VSync.

G-Sync is just a variable refresh rate, so you don't get FPS drops (aka stutter) from your frame rendering time missing its deadline and having to wait for an entire new refresh before a buffer swap can happen and the GPU can continue rendering (under double buffering).

So YES, what G-Sync basically brings to the table is removing frame drops, actually making things smoother, and reducing input lag.

The way I understand GSync, this is not accurate. GSync is a variable refresh rate system designed to match the refresh of the display to the framerate coming out of the card. The result is to allow for tearing-free gameplay without the input lag and other issues associated with having to use VSync.

It doesn't do anything about your actual framerate.
 
I picked up the 1440p 24" G-Sync version of the Dell. It's even more amazing than my Swift, and I like it a lot; it looks like a 4K monitor.
 
I don't buy anything higher than a GTX 1070. Power consumption, efficiency, heat, temperatures, a silent single fan, and a small PCB format are more important to me than top performance. That's why I still prefer NVIDIA cards. G-Sync is the best thing that has come out to improve the overall gaming experience.
 
You mean if you weren't using VSync? Otherwise I don't understand what you're saying.

And $700 was from Googling the price of that model name.
No, I mean "were", because if you were using VSync, G-Sync wouldn't look any smoother, and you're nuts for having used VSync. The S2716DG goes for under $500 all the time.
 
The way I understand GSync, this is not accurate. GSync is a variable refresh rate system designed to match the refresh of the display to the framerate coming out of the card. The result is to allow for tearing-free gameplay without the input lag and other issues associated with having to use VSync.

It doesn't do anything about your actual framerate.

It does allow for tearing-free gameplay, indeed.
But so does VSync; the entire purpose of VSync is to get rid of tearing.

I'm sorry to bring this up, but you clearly don't know what the word "sync" means in this case, or why tearing happens.
Please allow me to explain.

Sync means that the start of a refresh on your monitor is in sync with the buffer swap.
A buffer swap is when the GPU changes which buffer stores the rendered image data being sent to the monitor and which buffer it can do the actual rendering in.
Most games use two, so that is what we are going with here, but it's actually possible to use three buffers as well, called, well... triple buffering.

Before G-Sync we had two options:

no VSync, and VSync enabled.


With no VSync we get tearing. The reason for this is that the GPU doesn't care what part of the refresh the monitor is currently at when it swaps buffers. So the monitor might be midway down the screen (going top to bottom) when the GPU swaps buffers, and now a newer rendered image, slightly different from the old one, is in the buffer the monitor is reading its data from. So the bottom half of the screen shows half of a slightly different image. The misalignment of objects between those two half-images is what is called tearing.
Let's say you have a vertical line 100 pixels from the left in the first image and 105 pixels from the left in the second: it will look like the line was cut in half and its bottom part moved 5 pixels.

The benefit of this is that your GPU never needs to wait and can render as fast as it can; also, whenever it finishes rendering an image, the screen starts showing it immediately (albeit from some random place down the screen).
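
To make the vertical-line example concrete, here is a tiny Python sketch (my own illustration, nothing from a driver or from the post above): two frames with the line at x = 100 and x = 105, and a buffer swap landing mid-scanout, so the top half of the screen comes from the old frame and the bottom half from the new one.

```python
# Toy model of tearing: scanout reads the old frame for the top half
# of the screen and the new frame for the bottom half. We store only
# the line position per row, which is all we need to see the tear.

HEIGHT = 8                      # screen rows (tiny, for illustration)

old_frame = [100] * HEIGHT      # frame 1: vertical line at x = 100
new_frame = [105] * HEIGHT      # frame 2: vertical line at x = 105
swap_row = HEIGHT // 2          # the buffer swap happens mid-refresh

# What the monitor actually displays during this refresh cycle:
displayed = old_frame[:swap_row] + new_frame[swap_row:]

for y, x in enumerate(displayed):
    print(f"row {y}: line at x = {x}")
# rows 0-3 show the line at 100, rows 4-7 at 105: the line looks cut
# in half with its bottom part shifted 5 pixels, i.e. a visible tear.
```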


Now you enable VSync.
Basically, the GPU does not swap buffers until the monitor is ready for a refresh. This means there is no update in the middle of a refresh cycle, and no two halves of different images displayed within the same refresh cycle, so no tearing at all.
Of course, that means that since one buffer is holding the next image and one buffer is holding the current image, you have no buffer to render into, and the GPU has to sit and wait for the screen to be ready.
Since the screen is only ready X times per second (60 times for 60 Hz), the GPU can never do more than X buffer swaps per second, and therefore can never begin or finish more than X renders. That is where the 60 FPS cap on a 60 Hz screen comes from with VSync.

The other thing is what happens if your GPU is slightly slower. Let's say your GPU can do 50 FPS but we enable VSync on a 60 Hz monitor. Your GPU takes 20 ms to render a frame; your monitor is ready for a new image every 16.6 ms. Then this little timeline happens:

16.6 ms: monitor is ready, but the image is not, so we can't do a swap yet
20 ms: GPU has rendered a scene and now needs to wait for the monitor to be ready; it can't render anything (both buffers are full)
33.3 ms: monitor is ready; the buffer swap can happen and the GPU can start again
50 ms: monitor is ready; the GPU is not, since it's only been 16.6 ms since it started rendering after the buffer swap, and it needs 20 ms
53.3 ms: GPU is ready, but has to wait for the monitor to start a new refresh
66.6 ms: monitor is ready; the buffer swap can happen and the GPU can start again

You can now see that even though your GPU is capable of delivering a new frame every 20 ms, aka 50 FPS, it's only able to do a buffer swap every 33.3 ms, and you dropped from a potential 50 FPS to 30 FPS.
You know as well as I do that FPS is never completely stable, so if you are running around 60 FPS, you sometimes dip a bit down into the 50s. But with VSync on, that little dip goes all the way down to 30 FPS, which you will clearly feel as stutter, aka non-smoothness.
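
If you want to check that timeline yourself, here is a minimal Python sketch of the same toy model (my own illustration, using the 20 ms render time and 60 Hz refresh from above); it reproduces the drop from a potential 50 FPS to 30 FPS:

```python
import math

# Double-buffered VSync, toy model: the GPU may only swap buffers on a
# refresh boundary, and it cannot start the next frame until it swaps.

RENDER_MS = 20.0               # GPU takes 20 ms per frame (50 FPS potential)
REFRESH_MS = 1000.0 / 60.0     # display refreshes every 16.67 ms

def vsync_swaps(n_frames):
    """Times (in ms) at which buffer swaps happen under VSync."""
    swaps, t = [], 0.0         # GPU starts rendering at t = 0
    for _ in range(n_frames):
        done = t + RENDER_MS   # time this frame finishes rendering
        # the swap must wait for the next display refresh boundary
        swap = math.ceil(done / REFRESH_MS) * REFRESH_MS
        swaps.append(swap)
        t = swap               # GPU can only resume after the swap
    return swaps

swaps = vsync_swaps(10)
intervals = [b - a for a, b in zip(swaps, swaps[1:])]
print(intervals)               # ~33.3 ms apart -> every other refresh
print(1000.0 / intervals[-1])  # ~30 FPS, despite the GPU's potential 50
```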


Now let's look at G-Sync.
With G-Sync we are still syncing the buffer swaps to the display somehow; exactly how depends on whether you enable VSync along with G-Sync.
If you take the timeline above, you will see there are points in time where the screen is ready but the GPU is not, yet the display is still forced to refresh, since it has a hard refresh rate of 60 Hz.

With G-Sync that is not the case: the display can delay starting a new refresh cycle to sync up with the GPU being ready with a new image, and then do the buffer swap and start the refresh cycle.
So instead we get something along the lines of this timeline:

16.6 ms: monitor is ready, but the image is not, so we can't do a swap yet... OK, the monitor is just going to wait, then
20 ms: GPU has rendered a scene. The GPU no longer needs to wait, because the screen is waiting for the GPU and is ready; the buffer swap can happen and the GPU can continue rendering
36.6 ms: monitor is ready, but the image is not, so the monitor is just going to wait
40 ms: GPU has rendered a scene; the buffer swap can happen and the GPU can continue rendering
56.6 ms: monitor is ready, but the image is not, so the monitor is just going to wait
60 ms: GPU has rendered a scene; the buffer swap can happen and the GPU can continue rendering

What you see now is that the pesky waiting time for the GPU is eliminated, because the screen was nice enough to wait for the GPU. You'll also notice that the screen only refreshes every 20 ms, so it's actually only running at 50 Hz. But that aligns perfectly with your 50 FPS, and therefore you avoid the stutter and non-smoothness there would be under VSync from dropping to 30 FPS.

Also, just like with VSync off, any rendered frame is displayed immediately, so input lag is reduced.
However, we are still capped at the refresh rate of the monitor.
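
And the same toy model with G-Sync (again just my own sketch): the panel waits for the GPU, so the swap happens the moment the frame is done, limited only by the panel's maximum refresh rate.

```python
# G-Sync, toy model: the display delays its refresh until the GPU has a
# frame ready, so a swap happens as soon as rendering finishes; the only
# limit is the panel's maximum refresh rate (a 16.67 ms minimum gap here).

RENDER_MS = 20.0               # GPU takes 20 ms per frame
MIN_GAP_MS = 1000.0 / 60.0     # panel can't refresh faster than 60 Hz

def gsync_swaps(n_frames):
    """Times (in ms) at which buffer swaps happen under G-Sync."""
    swaps, t, last = [], 0.0, -MIN_GAP_MS
    for _ in range(n_frames):
        done = t + RENDER_MS
        swap = max(done, last + MIN_GAP_MS)  # wait for the panel if needed
        swaps.append(swap)
        last = swap
        t = swap               # GPU starts on the next frame right away
    return swaps

swaps = gsync_swaps(10)
intervals = [b - a for a, b in zip(swaps, swaps[1:])]
print(intervals)               # ~20 ms apart -> the panel runs at 50 Hz
print(1000.0 / intervals[-1])  # ~50 FPS: the GPU's full potential
```

Drop RENDER_MS below 16.67 in this sketch and the swaps pin to the panel's minimum gap, which is exactly the regime where, without VSync on top, tearing can come back.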

Some drivers will allow you to go out of sync if the FPS is higher than your monitor's Hz, but then you get tearing back in the game, unless the variable refresh rate is constantly adjusted to some even divisor of the current FPS; I have no idea whether that's done, though, and I hardly think so.


Triple buffering (and Fast Sync) changes some of this, but it's not that widely supported, and this is already a wall of text.


Sorry for all the typos; I had to write this fast during my lunch break.
I hope it was a beneficial read.
 
No, I mean "were", because if you were using VSync, G-Sync wouldn't look any smoother, and you're nuts for having used VSync. The S2716DG goes for under $500 all the time.

If you are using VSync, you get more stutter when your FPS is not as high as your Hz, so what you are saying makes no sense to me.

Or am I totally missing what you are trying to compare?
 
If you are using VSync, you get more stutter when your FPS is not as high as your Hz, so what you are saying makes no sense to me.

Or am I totally missing what you are trying to compare?
I know what it does and how it works. I thought it was obvious that a framerate equal to or greater than the refresh rate was implied. Non-native English speakers are really getting on my nerves lately.
 
It does allow for tearing-free gameplay, indeed.
But so does VSync; the entire purpose of VSync is to get rid of tearing.

(snip)

Sorry for all the typos; I had to write this fast during my lunch break.

It sounds like you are saying the same thing I just said. In your first post, however, you said that G-Sync wasn't designed to eliminate tearing and that it was a variable framerate. Neither of those is true, and in the post quoted above it seems you understand that, but it doesn't match your first post.

G-Sync is designed to replace VSync as a solution to tearing, without the input lag and "locked" framerate stepping of VSync.
 
It sounds like you are saying the same thing I just said. In your first post, however, you said that G-Sync wasn't designed to eliminate tearing and that it was a variable framerate. Neither of those is true, and in the post quoted above it seems you understand that, but it doesn't match your first post.

G-Sync is designed to replace VSync as a solution to tearing, without the input lag and "locked" framerate stepping of VSync.


If you read my post you would see it.
I didn't say G-Sync would cause tearing; I said that was not the reason G-Sync got made, because we already had a solution for tearing.
You said G-Sync didn't make games smoother, which it actually does.
The entire purpose of having G-Sync over VSync is to make games smoother. If you still don't believe G-Sync makes games smoother than VSync, then you clearly did not read/understand what I wrote above.
 
If you read my post you would see it.
I didn't say G-Sync would cause tearing; I said that was not the reason G-Sync got made, because we already had a solution for tearing. This is a statement about the reason for making it.
You said G-Sync didn't make games smoother, which is a statement about what it does. And the purpose of having G-Sync over VSync is to make games smoother.
If you still don't believe G-Sync makes games smoother than VSync, then you clearly did not read/understand what I wrote above.

G-Sync exists because the "solution" we have for tearing (VSync) sucks.

I do understand what you are saying, but I disagree that G-Sync was not developed to eliminate tearing. It replaces VSync, which was developed to eliminate tearing.
 
I know what it does and how it works. I thought it was obvious that a framerate equal to or greater than the refresh rate was implied. Non-native English speakers are really getting on my nerves lately.

Where is it implied?
People saying something but leaving out important parts, and then claiming it was implied instead of owning up to their mistake of being inaccurate, is really getting on my nerves lately.
 
G-Sync exists because the "solution" we have for tearing (VSync) sucks.

I do understand what you are saying, but I disagree that G-Sync was not developed to eliminate tearing. It replaces VSync, which was developed to eliminate tearing.

It makes no sense to say a product was made to fix an issue that was already fixed by the previous solution. If it is there to replace VSync (which it is not; it's a side-by-side feature, since the two can be combined), then G-Sync is not fixing tearing; as you said, that was already fixed by the previous solution. What G-Sync brings to the picture, and where it improves upon VSync, is that it removes the frame drops, by lowering the refresh rate.

If you run G-Sync without VSync and your frame rate goes above your refresh rate, you will again have tearing.
However, if you run G-Sync with VSync you will never have tearing (and you're only able to go above the monitor's Hz with a fully rotated triple buffer, aka Fast Sync).



Let's try with just a basic example.

Trying to get rid of ants in my back yard:

Solution 1: napalm
Benefit 1: 100% kill rate on my ants
Drawback 1: severe damage to vegetation


Then 10 years later we get:

Solution 2: nanobots
Benefit 1: 100% kill rate on my ants
No drawback of killing my vegetation

Would you still say that the nanobots got developed because we needed a solution for killing ants, or were they developed to save the vegetation from napalm damage?



Also, why does VSync suck? It only sucks if you have low FPS to begin with.
If you can keep your FPS constant at the same rate as your monitor's refresh rate, you are basically in the same situation as with G-Sync, except that you have slightly better input lag with VSync (G-Sync adds a minuscule input lag of its own due to the communication with the graphics card).

Also, VSync with triple buffering solves frame drops. It's not as good as G-Sync, but it can get pretty close.
 
The Problem: Old Tech
When TVs were first developed they relied on CRTs, which work by scanning a beam of electrons across the surface of a phosphor-coated tube. This beam causes a pixel on the tube to glow, and when enough pixels are activated quickly enough the CRT can give the impression of full-motion video. Believe it or not, these early TVs had 60Hz refresh rates primarily because the United States power grid is based on 60Hz AC power. Matching TV refresh rates to that of the power grid made early electronics easier to build, and reduced power interference on the screen.

By the time PCs came to market in the early 1980s, CRT TV technology was well established and was the easiest and most cost effective technology for the creation of dedicated computer monitors. 60Hz and fixed refresh rates became standard, and system builders learned how to make the most of a less than perfect situation. Over the past three decades, even as display technology has evolved from CRTs to LCD and LEDs, no major company has challenged this thinking, and so syncing GPUs to display refresh rates remains the standard practice across the industry to this day.

Problematically, GPUs don’t render at fixed speeds. In fact, their frame rates will vary dramatically even within a single scene of a single game, based on the instantaneous load that the GPU sees. So with a fixed refresh rate, how do you get the GPU images to the screen? The first way is to simply ignore the refresh rate of the display altogether, and update the image being scanned to the display in mid cycle. This is called ‘VSync Off Mode’ and it is the default way most gamers play. The downside is that when a single refresh cycle shows 2 images, a very obvious “tear line” is evident at the break, commonly referred to as screen tearing. The established solution to screen tearing is to turn VSync on, to force the GPU to delay screen updates until the display cycles to the start of a new refresh cycle. This causes stutter whenever the GPU frame rate is below the display refresh rate. And it also increases latency, which introduces input lag, the visible delay between a button being pressed and the result occurring on-screen.

Worse still, many players suffer eyestrain when exposed to persistent VSync stuttering, while others develop headaches and migraines. This drove us to develop Adaptive VSync, an effective, critically-acclaimed solution. Despite this development, VSync’s input lag issues persist to this day, something that’s unacceptable for many enthusiasts, and an absolute no-go for eSports pro-gamers who custom-pick their gaming hardware to minimize the life-and-death delay between action and reaction.

The Solution: NVIDIA G-SYNC

Enter NVIDIA G-SYNC, which eliminates screen tearing, while minimizing input lag, and stutter. G-SYNC achieves this revolutionary feat, by synchronizing the display to the output of the GPU, rather than the GPU to the display, resulting in a tear-free, faster, smoother experience that redefines gaming.

Industry luminaries John Carmack, Tim Sweeney, Johan Andersson and Mark Rein have been bowled over by NVIDIA G-SYNC’s game-enhancing technology. Pro eSports players and pro-gaming leagues are lining up to use NVIDIA G-SYNC, which will expose a player’s true skill, demanding even greater reflexes thanks to the unnoticeable delay between on-screen actions and keyboard commands. And in-house, our diehard gamers have been dominating lunchtime LAN matches, surreptitiously using G-SYNC monitors and laptops to gain the upper hand.

If, like eSports pros, you want the clearest, smoothest and most responsive gaming experience possible, NVIDIA G-SYNC monitors and laptops are game-changers, the likes of which cannot be found anywhere else. A true innovation in an era of iteration, NVIDIA G-SYNC will redefine the way you view games.

http://www.geforce.com/hardware/technology/g-sync/technology

G-Sync was developed to achieve these goals, as stated in the quote above:
  1. Eliminate screen tearing
  2. Minimize input lag
  3. Minimize stutter
Throughout the G-Sync section of the website, NVIDIA continually emphasizes the issues with V-Sync, including using extra framebuffers to minimize frame stepping.
 
Let's try with just a basic example.

Trying to get rid of ants in my back yard:

Solution 1: napalm
Benefit 1: 100% kill rate on my ants
Drawback 1: severe damage to vegetation


Then 10 years later we get:

Solution 2: nanobots
Benefit 1: 100% kill rate on my ants
No drawback of killing my vegetation

Would you still say that the nanobots got developed because we needed a solution for killing ants, or were they developed to save the vegetation from napalm damage?

I am wondering if there is a language or semantics barrier here, but I will answer your question.

I would say that nanobots were developed to replace napalm as a better method of killing ants, one that does not destroy vegetation.

I would not say nanobots were developed specifically to not destroy vegetation. That is not their purpose; that is a benefit of using them to kill ants over a deprecated method such as napalm.

Make sense?

BTW, I am not trying to offend you here. I just don't agree with what you are saying, but again, I wonder if that is a matter of misunderstanding.
 
I am wondering if there is a language or semantics barrier here.

No offence taken.
One does not exclude the other. :D I guess this is just a matter of viewpoint.

I'm "standing" on VSync and saying the reason we go over here (to G-Sync) is benefit B (the smoothness).
You are "standing" on no-sync and saying we go over here for benefit A, but only because it is now good enough thanks to benefit B.

But we both agree on the technical facts:

No sync:
Tearing
No FPS cap
No FPS drops/stutter when FPS is lower than Hz

VSync:
No tearing
FPS cap
FPS drops/stutter when FPS is lower than Hz

G-Sync:
No tearing
FPS cap
No FPS drops/stutter when FPS is lower than Hz

Right?
 
It does allow for tearing-free gameplay, indeed.
But so does VSync; the entire purpose of VSync is to get rid of tearing.

(snip)

Sorry for all the typos; I had to write this fast during my lunch break.
I hope it was a beneficial read.
Thank you! This was very informative.
 