Should've patented it...
And for that reason, I don't think it will see widespread adoption. I just don't see Lucid staying on top of driver updates as often as they may need to, and it adds yet another driver update for the end user to consider. It just seems dependent on too many variables.
I'm sure this was being worked on before you made that post.
Zarathustra[H];1038624282 said: As has been mentioned before in this thread, people have been doing this manually in game engines that support a max-fps configuration for over a decade.
In the Source and GoldSrc engines I believe it was just the "fps_max" console command that would do the trick. You'd just set it to equal your refresh rate and, voilà, "dynamic vsync".
How many times does it have to be said that a framerate cap is NOT the same as adaptive vsync?
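The fps_max-style cap described above amounts to a sleep-based frame limiter. Here's a minimal sketch of the idea (generic illustration, not actual Source engine code; the `render` callback is a stand-in):

```python
import time

def run_capped(render, fps_max=60.0, duration=0.25):
    """Call render() in a loop, sleeping so we never exceed fps_max."""
    frame_budget = 1.0 / fps_max   # ~16.7 ms per frame at 60 fps
    start = time.perf_counter()
    frames = 0
    while time.perf_counter() - start < duration:
        frame_start = time.perf_counter()
        render()
        # Sleep away whatever is left of this frame's time budget.
        elapsed = time.perf_counter() - frame_start
        if elapsed < frame_budget:
            time.sleep(frame_budget - elapsed)
        frames += 1
    return frames

# A trivial "render" that takes almost no time: the cap dominates,
# so we complete roughly duration * fps_max frames.
n = run_capped(lambda: None, fps_max=60.0, duration=0.25)
print(n)
```

Note that nothing here synchronizes with the display's vblank, which is exactly the reply's point: a cap keeps the framerate near the refresh rate but doesn't align frame presentation with refresh, so it is not the same as vsync.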
Well, most people including myself still see plenty of tearing at or below the refresh rate. If that didn't happen, there would be no need to even come up with adaptive vsync, which lets you get rid of the tearing at least some of the time.
Zarathustra[H];1038624361 said: It may not technically be the same, but it accomplishes pretty much the same thing.
When I had my framerate capped to the same refresh rate as my monitor, I VERY rarely saw any tearing.
Every method has some drawback. Also, I'd like to correct some things the article got wrong.
Vsync eliminates tearing but limits fps to the refresh rate. It also drops the framerate all the way down to half of your refresh rate if your framerate dips even slightly below the refresh rate. So even if you can run at 59 fps, you'll drop to a solid 30 fps on a 60 Hz display. And if you dip below and then above 60 fps often, you'll get perceivable stutter because of this quantization.
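That halving effect can be put in numbers: with double-buffered vsync, a finished frame isn't shown until the next vblank, so its display time rounds up to a whole multiple of the refresh interval. A small illustration (my own arithmetic, assuming strict double buffering on a 60 Hz display):

```python
import math

REFRESH_HZ = 60.0
VBLANK = 1.0 / REFRESH_HZ  # ~16.67 ms between refreshes

def displayed_interval(render_time):
    """With strict double-buffered vsync, a frame is shown at the first
    vblank after it finishes: its on-screen interval rounds UP to a
    multiple of the refresh interval."""
    return math.ceil(render_time / VBLANK) * VBLANK

# Rendering at 59 fps (~16.95 ms/frame) just misses every deadline,
# so each frame occupies two refreshes -> effective 30 fps.
print(round(1.0 / displayed_interval(1.0 / 59.0)))  # 30
# Rendering at 70 fps makes every deadline -> capped at 60 fps.
print(round(1.0 / displayed_interval(1.0 / 70.0)))  # 60
```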
Vsync off (immediate present mode) results in tearing that can show up at any frame rate. Technically it can even show up at 60 fps, because there's no synchronization with the display's refresh cycle: even if you render at 60 fps, the display refresh and the frame presentation timing are out of phase.
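The out-of-phase point above can be sketched with a toy model (assuming linear top-to-bottom scanout and ignoring blanking intervals, both simplifications of mine): with vsync off, a present that lands mid-scanout splits the screen at whatever scanline is being drawn at that instant, so a steady 60 fps that is phase-shifted from the refresh tears at the same spot every frame.

```python
REFRESH = 1.0 / 60.0  # one 60 Hz refresh interval

def tear_line(present_time, screen_height=1080):
    """With vsync off, a present during scanout splits the screen at the
    scanline being drawn at that instant (simplified linear scanout)."""
    phase = (present_time % REFRESH) / REFRESH  # fraction of the refresh elapsed
    return int(phase * screen_height)

# Presenting at exactly 60 fps but a quarter refresh out of phase:
# the tear sits at the same scanline (1/4 down the screen) every frame.
print(tear_line(0.25 * REFRESH))  # 270
```

This also matches the observation later in the thread that a tear can sit in one consistent place: a stable phase offset gives a stationary tear line, while a drifting offset makes it crawl up or down the screen.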
Triple buffering with vsync solves tearing and makes you less likely to drop straight to 30 fps when a frame or two runs too long to make the vsync deadline, so you avoid some of the stutter. It also lets you run at average frame rates between 30 fps and 60 fps, but it does so with some pattern of 16 ms (60 fps) frames followed by a 33 ms (30 fps) frame, repeated over and over. Obviously this gives a heartbeat-style stutter in the display of your frames, even if you're achieving some average framerate between 30 and 60 fps over the course of a second. So even if your average framerate reads 45 fps, that is merely an average of higher and lower frame times, not a consistent framerate. Additionally, triple buffering requires an extra back buffer, meaning more video memory is used and there is an extra frame of input latency.
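The "45 fps average" claim is easy to verify: a repeating pattern of two 16.7 ms frames and one 33.3 ms frame averages out to roughly 45 fps, even though no individual frame is ever delivered at 45 fps. A quick check (example pattern is mine, chosen to land near 45 fps):

```python
# Frame-time pattern from triple-buffered vsync on a 60 Hz display when
# the GPU can't quite sustain 60 fps: some frames make the vblank
# deadline (16.7 ms), the stragglers take two refreshes (33.3 ms).
pattern_ms = [16.7, 16.7, 33.3]  # repeated over and over

avg_frame_ms = sum(pattern_ms) / len(pattern_ms)
avg_fps = 1000.0 / avg_frame_ms
print(round(avg_fps))  # 45, but delivered as a 60/60/30 heartbeat, not a steady 45
```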
Informative post. So basically Adaptive VSync is overall a better option unless you can't handle the sub-60 FPS tearing.
Another way is to have displays that only refresh when signaled that there's a new frame ready; in the meantime the display would just continue to light up the pixels that were lit up by the last frame. That would be an intelligent refresh technology instead of blindly updating all the display's pixels on a fixed schedule.
It's technology we should have seen introduced years ago. Much like adaptive vsync, in fact. These things are difficult to market, though, so they just don't happen.
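The refresh-on-demand idea above can be quantified. With a fixed 60 Hz refresh, a frame that finishes just after a vblank waits almost a full refresh interval before it's shown; a display that scans out on demand would start immediately. A toy comparison (my own model, not any shipping display):

```python
import math

VBLANK = 1.0 / 60.0  # fixed 60 Hz refresh interval

def wait_for_fixed_refresh(done_t):
    """Fixed 60 Hz refresh: a frame finished at time done_t waits for
    the next scheduled tick before scanout begins."""
    return math.ceil(done_t / VBLANK) * VBLANK - done_t

def wait_for_on_demand(done_t):
    """Refresh-on-demand: the display starts a scan as soon as the frame
    is ready; until then the panel just keeps showing the old frame."""
    return 0.0

# A frame finishing 1 ms after a vblank waits ~15.7 ms on a fixed display:
print(round(wait_for_fixed_refresh(0.001) * 1000, 1))  # 15.7 (ms)
print(wait_for_on_demand(0.001))                        # 0.0
```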
Went to the nVidia site to download it (much faster connection here at work). Something odd: if I punch "GeForce 400 Series" (Windows 7 64-bit) into the dropdowns, it brings up a 296 driver, not the 300-series one. The only way I could get to the 300-series driver was to put in "GeForce 600 Series".
Will those still work on my sad old 460-based system? I run at 1920x1200 most of the time and do get tearing in the older games I run, so I wanted to try it out, but I'd prefer not to ru-ru my system with the wrong drivers...
It's a beta driver.
1) Your framerate is always lower than your monitor's refresh rate (for typical users that's <60 fps). Then Adaptive Vsync does absolutely nothing, so you might as well have it turned off, because you'll get plenty of screen tearing either way.
2) Your framerate is always higher than your refresh rate (>60 fps). Adaptive Vsync works exactly like regular vsync, so you might as well run with vsync on.
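Those two scenarios boil down to a simple per-frame policy: synchronize to vblank only when the renderer is keeping up with the display. A sketch of the idea (my own simplification, not NVIDIA's actual driver logic):

```python
REFRESH_HZ = 60.0

def adaptive_vsync(fps_estimate):
    """Adaptive vsync in one line: sync to vblank when the renderer is
    keeping up with the display, otherwise present immediately (you get
    tearing, but you avoid the drop to 30 fps)."""
    return "vsync" if fps_estimate >= REFRESH_HZ else "immediate"

print(adaptive_vsync(75.0))  # vsync      (scenario 2: behaves like vsync on)
print(adaptive_vsync(50.0))  # immediate  (scenario 1: behaves like vsync off)
```

The interesting case the two scenarios leave out is a framerate that oscillates around the refresh rate, which is exactly where this policy pays off: you keep tear-free output while above 60 fps and avoid the 30 fps quantization while below it.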
You both kind of missed what I was saying, which is that tearing will not always look exactly the same on different monitors. I've looked at it side by side and seen tearing look different from monitor to monitor. For example, I've seen a game on my CRT have a small, consistent tear in the same place, but a tear that looked like a wave in the same game at the same settings on my LCD.
I have been briefly playing some of my games to see how these beta drivers and A-Vsync affect the performance. While I haven't made up my mind on the input lag kicking in (I can feel it, but I haven't played enough to know if it really bothers me yet), the overall experience is rather positive. However, in Deus Ex: Human Revolution the stuttering seems even worse than before, making the gameplay experience rather unpleasant.
EDIT: Also experiencing a lot of tearing in RAGE.
Deus Ex: Human Revolution can be a stutterfest in spots for anyone, so don't blame adaptive vsync for that. I stutter like crazy walking around Detroit in DX9, but it's almost perfectly smooth on max DX11 settings. Really, just messing with random settings can change the amount of stutter in that game.
And why would you use adaptive vsync in RAGE when it has its own similar smart-vsync option right in the game? I'm pretty sure they recommended not forcing vsync on or off from your control panel with this game, and just using the game's options.
To clarify: I was one of those who were experiencing noticeable stutter in Deus Ex when it came out. Playing the game with the new drivers, it has become worse than before.
Regarding RAGE, my settings were already set in the Nvidia CP. I just launched the game out of curiosity and those were the results. I usually don't create profiles for games; I like to have one setting for everything and forget about it, which works fine 99% of the time. I guess that won't be the case with RAGE.
Yeah, it's not a good idea to globally force settings, as it will cause issues in a few games or other apps. The only things I globally force are high-quality texture filtering and clamping the negative LOD bias. For most games I just use the in-game settings, but for a few I do use FXAA, adaptive vsync, and 16x AF. Again, it's best never to force those types of settings globally.
Okay, thanks for the advice.
...Basically, here are two scenarios for a typical user:
1) Your frames are always lower than your monitor's refresh rate (for typical users it's <60fps), then Adaptive Vsync does absolutely nothing, so you might as well have it turned off, because you'll get plenty of screen tearing either way.
2) Your framerates are always higher than your refresh rate (>60fps). Adaptive Vsync works exactly like regular vsync. Basically you might as well run with vsync on...
It's a great feature, turn it on.
Vsync limits the display to multiples of 60?
I thought it was factors of the vertical refresh rate, meaning typically factors of 60... (11 pages later, nobody else thought this. Am I wrong? Do people even care about blatant inaccuracies?)