New to me Freesync Info

Shouldn't you first make sure you get it before you try to fix it? :confused:

Freesync is not the limiting factor; the monitor and scaler used are. When Freesync is active and the fps is within the freesync (variable refresh) range the monitor supports, there is no mouse lag or tearing.

Freesync allows Vsync to activate if the framerate goes outside of the Freesync range, but as we all know, Vsync introduces input lag. However, unlike Gsync, Freesync allows you to deactivate Vsync, so there is no input lag if the framerate goes outside of the Freesync range.

If your computer is capable of getting you above the minimum freesync refresh rate, you can cap the FPS at the maximum and you won't get mouse lag or tearing.
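To make that "cap at the panel's max" advice concrete, here's a minimal sketch (Python, names illustrative only; a real game would use the engine's or driver's own limiter) of what a software fps cap boils down to:

```python
import time

def frame_limited_loop(render_frame, fps_cap, num_frames=1000):
    """Render at most fps_cap frames per second (e.g. the panel's max
    FreeSync rate) so frame delivery never leaves the VRR window."""
    frame_budget = 1.0 / fps_cap              # seconds per frame at the cap
    for _ in range(num_frames):
        start = time.perf_counter()
        render_frame()                        # the actual game/render work
        elapsed = time.perf_counter() - start
        if elapsed < frame_budget:
            # Sleep off the remainder so fps never exceeds the cap and
            # falls back into V-Sync (lag) or tearing territory.
            time.sleep(frame_budget - elapsed)
```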

I know but that is how Freesync as a whole works. People buy a Freesync card and a Freesync monitor and may still have tearing and/or input lag. I just didn't like the idea of saying that if you use it correctly it will work. You have to know that Freesync monitors are different and how much frame rate fluctuation you are going to have, and I think most folks are not going to be told about that. Off the top of my head I don't know how many of the games I play will fall in any given frame rate range. I know when they are too low.
 
What you described as a "limitation of Freesync" is just as much of a limitation for GSync as well.

Do you really think GSync gives you a "tearing free" experience if you go higher than the monitor's refresh rate without turning on VSync?

The only true "limitation" is the usable FPS range of Freesync, and that remains to be seen atm.

I was not talking about G-Sync, and yes, others are saying it will have the same issue, and for now you can't turn V-Sync off to help. I'm sure they'll add that if AMD has it and it helps.
 
I know but that is how Freesync as a whole works. People buy a Freesync card and a Freesync monitor and may still have tearing and/or input lag. I just didn't like the idea that if you use it correctly it will work. You have to know that Freesync monitors are different and how much frame rate fluctuation you are going to have, and I think most folks are not going to be told about that. Off the top of my head I don't know how many of the games I play will fall in any given frame rate range. I know when they are too low.

You buy a GSync monitor and IT WILL STILL HAVE tearing or input lag if you go "outside of spec" so to speak.

Actually based on what's been said in this thread, it appears you don't even have the option of turning VSync off if you go out of spec, which means you have no choice but to accept input lag.
 
You buy a GSync monitor and IT WILL STILL HAVE tearing or input lag if you go "outside of spec" so to speak.

Actually based on what's been said in this thread, it appears you don't even have the option of turning VSync off if you go out of spec, which means you have no choice but to accept input lag.

I know, I know. See the post just above and post #7. I was never trying to say G-Sync works and Freesync doesn't.

We'll have to wait for some side by side reviews, but it really does look like Nvidia was more interested in a licensing fee from manufacturers than in using what was already part of the DisplayPort specifications.
 
I know but that is how Freesync as a whole works. I just didn’t like the idea that if you use it correctly it will work. You have to know that Freesync monitors are different and how much frame rate fluctuation you are going to have and I think most folks are not going to be told about that. Off the top of my head I don’t know how many games I play will fall in any given frame rate range. I know when they are too low.

This is the nature of monitors. :) There is no common standard for monitor refresh rates, pixel response times or other screen specs where all monitors are equal.

Gsync and Freesync give you the option to have the variable framerate synced to the refresh rate, but they don't give you higher refresh rates, lower pixel response times etc. You still need to research which screen to buy.

You also need to see what framerate is acceptable for you, regardless of whether you have a Gsync/Freesync screen or not. For many it's >60fps or bust. Gsync and Freesync make it easier though. The most important thing is to make sure you maintain at least a certain minimum framerate, either by lowering game settings or by getting a faster GPU.

Point is, without Gsync/Freesync, you are much worse off.
 
I know, I know. See the post just above and post #7. I was never trying to say G-Sync works and Freesync doesn't.

We'll have to wait for some side by side reviews, but it really does look like Nvidia was more interested in a licensing fee from manufacturers than in using what was already part of the DisplayPort specifications.

Oh I see, are you saying that with Freesync, the usable range would depend on each monitor's scaler, and that's an annoyance? If so fair enough.
 
I know, I know. See the post just above and post #7. I was never trying to say G-Sync works and Freesync doesn't.

We'll have to wait for some side by side reviews, but it really does look like Nvidia was more interested in a licensing fee from manufacturers than in using what was already part of the DisplayPort specifications.

On some notebooks Nvidia enabled GSYNC without a module for testing purposes. Notebook tech is the basis of GSYNC and FreeSync. Really sad that they expect their customers to pay more to port what was already there for notebook users to the desktop.
 
so 280x/280 or the rebranded 7xxx series aren't supported? :rolleyes:

this is the last straw for me, first the lack of VSR support for those cards and now FreeSync; sorry AMD, but I've lost my trust in you.

As the owner of a 7970, I too am disappointed. That being said, I've owned this card for over three years, which is a long time for a GPU imo. It's older tech and simply isn't hardware compatible like the newer chips are. It's like getting upset about not having hardware H.265 decoding once H.265 becomes relevant, when it wasn't even close to a standard when the hardware was made.
 
Oh I see, are you saying that with Freesync, the usable range would depend on each monitor's scaler, and that's an annoyance? If so fair enough.

Yeah, it looks like one of the G-Sync advantages over DisplayPort’s VRR or adaptive refresh rate is the frame buffer in the display that can help with (only?) frame rates from the GPU that go below that VRR window. But that just makes waiting for a monitor with a large VRR window that much more important. So is the VRR window on all G-Sync monitors the same?
 
Yeah, it looks like one of the G-Sync advantages over DisplayPort’s VRR or adaptive refresh rate is the frame buffer in the display that can help with (only?) frame rates from the GPU that go below that VRR window. But that just makes waiting for a monitor with a large VRR window that much more important. So is the VRR window on all G-Sync monitors the same?

No, the VRR window differs between G-Sync monitors too. Also for G-sync, the window depends largely on the monitor itself, even though they all share the same G-sync module. According to the reviews on sites like Anandtech, it seems the sweet spot is from 40 Hz/fps up to 60 on 60 Hz screens like the Acer XB280HK and from 40 Hz to 144 on 144 Hz screens like the Asus Rog. Below 40 there is flickering on both screens.
 
As the owner of a 7970, I too am disappointed. That being said, I've owned this card for over three years, which is a long time for a gpu imo. It's older tech and simply isn't hardware compatible like the newer chips are. It's like getting upset whenever h265 becomes relevant due to not having hardware decoding, when it wasn't even close to a standard when the hardware was made.

Wasn't it expected to at least work with all GCN-based GPUs, or even GPUs from much earlier than GCN, based on AMD's statement that they have had the necessary tech inside their GPUs for three generations already? Back in January when Gsync was first announced, AMD alluded that it would work with all GCN cards:

He explained that this particular laptop's display happened to support a feature that AMD has had in its graphics chips "for three generations": dynamic refresh rates. AMD built this capability into its GPUs primarily for power-saving reasons, since unnecessary vertical refresh cycles burn power to little benefit. There's even a proposed VESA specification for dynamic refresh, and the feature has been adopted by some panel makers, though not on a consistent or widespread basis. AMD's Catalyst drivers already support it where it's available, which is why an impromptu demo was possible.

http://techreport.com/news/25867/amd-could-counter-nvidia-g-sync-with-simpler-free-sync-tech

Considering that G-sync works with basically all Kepler+ Nvidia products, and Kepler launched only about 3 months after Tahiti, why didn't AMD bother to enable freesync on Tahiti if they were developing freesync for such a long time before G-sync? Also kind of fishy that GCN 1.0 only works for movies but not gaming.

Either way this won't affect me since I'm handing these cards down to my nieces; these AMD cards have depreciated so much that they aren't worth selling in my case, but it woulda been nice if they could have enjoyed this tech with them.
 
because the necessary hardware is not there.

Gsync relies on the module in the monitor; it doesn't need extra hardware on the card.
 
I actually saw some posts today criticizing AMD for blocking Nvidia out of FreeSync. Hearty chuckles ensued.
AMD just can't win, can they? Hearts and minds.
 
Wasn't it expected to at least work with all GCN-based GPUs, or even GPUs from much earlier than GCN, based on AMD's statement that they have had the necessary tech inside their GPUs for three generations already? Back in January when Gsync was first announced, AMD alluded that it would work with all GCN cards:



http://techreport.com/news/25867/amd-could-counter-nvidia-g-sync-with-simpler-free-sync-tech

Considering that G-sync works with basically all Kepler+ Nvidia products, and Kepler launched only about 3 months after Tahiti, why didn't AMD bother to enable freesync on Tahiti if they were developing freesync for such a long time before G-sync? Also kind of fishy that GCN 1.0 only works for movies but not gaming.

Either way this won't affect me since I'm handing these cards down to my nieces; these AMD cards have depreciated so much that they aren't worth selling in my case, but it woulda been nice if they could have enjoyed this tech with them.

He is talking about laptops. It has been part of the embedded DisplayPort specification since 2009. This is why GCN 1.0 APUs support adaptive sync and not the GCN 1.0 desktop GPUs.

The reason it works for movies and the desktop is that those are static frame rates; there is no need to buffer frames or anything like that.
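A quick way to see why static-rate video is the easy case: on a fixed 60 Hz panel, 24 fps content has to be shown in an uneven 3:2 cadence, while adaptive refresh can simply pin the panel at the content's rate. A small illustrative calculation (Python, numbers assumed):

```python
# 24 fps content on a fixed 60 Hz panel: 60/24 = 2.5 refreshes per frame,
# so frames alternate between 2 and 3 refreshes on screen (3:2 judder).
video_fps, panel_hz = 24, 60

cadence, err = [], 0.0
for _ in range(4):                  # a few frames of the repeating cadence
    err += panel_hz / video_fps     # 2.5 refreshes owed per video frame
    shown = int(err)                # whole refreshes this frame gets
    err -= shown
    cadence.append(shown)

print(cadence)  # [2, 3, 2, 3] -> uneven display times = judder
# With a variable refresh rate, every frame is simply held for 1/24 s.
```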
 
It's seemingly less and less cool now. No Crossfire, tearing if your frame rate outpaces the display and still needing to use V-Sync and in turn having mouse lag. :( Why can't V-Sync just generate frames in sync with the display? :( :(

Because V-Sync has no way of controlling when the display draws frames. It doesn't control that. Thus why G-Sync uses its own hardware controller. That's the entire basis of G-Sync, and why Freesync is a poor alternative.
 
Because V-Sync has no way of controlling when the display draws frames. It doesn't control that. Thus why G-Sync uses its own hardware controller. That's the entire basis of G-Sync, and why Freesync is a poor alternative.

What? Wow WRONG, FreeSync does just that.

Wow Terpfen, at least read a little about the topic prior to jumping in claiming that Gsync does something that FreeSync doesn't. They do the very same thing; the difference is that Gsync requires an extra hardware module that adds anywhere from $150 to $260 to the cost of the monitor vs FreeSync.
 
Thus why G-Sync uses its own hardware controller. That's the entire basis of G-Sync, and why Freesync is a poor alternative.

Whut, the freesync scaler does the same thing as the G-Sync board. The only difference is that if you go out of spec, or more specifically over the spec, G-sync behaves like v-sync and increases input lag by at least one frame (16 ms at 60 Hz) depending on your in-game fps.

With freesync you have the freedom to choose if you want to have v-sync enabled or disabled when going out of spec. I guess most competitive gamers would want to have v-sync disabled.

EDIT: blah someone was faster... damn you Revdarian :p
 
Because V-Sync has no way of controlling when the display draws frames. It doesn't control that. Thus why G-Sync uses its own hardware controller. That's the entire basis of G-Sync, and why Freesync is a poor alternative.

LOL funniest post that I have read here in a while :D

I presume you are joking?
 
Because V-Sync has no way of controlling when the display draws frames. It doesn't control that. Thus why G-Sync uses its own hardware controller. That's the entire basis of G-Sync, and why Freesync is a poor alternative.

Nvidia has stated that the G-Sync module has to poll the panel, which negatively impacts performance.

Pro G-Sync supporters calling FreeSync inferior made statements that FreeSync HAD to poll the panel, which obviously is false if you read any of their detailed tech rundowns.
The point of FreeSync is that the panel tells the scaler what refresh range it can do. In turn, it tells the GPU that refresh range, so the hardware controller ON THE GPU is directly sending the frames to the panel, with no need to wait for the scaler to poll the panel in between frames.
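For what it's worth, the "panel tells the GPU its range" step is typically just the driver reading the monitor's EDID. Here's a rough sketch (Python; the descriptor layout is standard EDID, but this ignores the EDID 1.4 rate-offset flags and DisplayID blocks, which real drivers also handle) of pulling the vertical refresh range out of the Display Range Limits descriptor:

```python
def refresh_range_from_edid(edid: bytes):
    """Return (min_hz, max_hz) from an EDID base block, or None.

    The VRR window is advertised in the Display Range Limits display
    descriptor (tag 0xFD). Sketch only; ranges above 255 Hz need the
    EDID 1.4 offset flags in byte 4, which are ignored here.
    """
    assert len(edid) >= 128, "need the 128-byte EDID base block"
    for off in (54, 72, 90, 108):          # the four 18-byte descriptors
        d = edid[off:off + 18]
        # Display descriptors start with 00 00 00, then a tag byte.
        if d[0:3] == b"\x00\x00\x00" and d[3] == 0xFD:
            return d[5], d[6]              # min, max vertical rate in Hz
    return None
```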
 
He is talking about laptops. It has been part of the embedded DisplayPort specification since 2009. This is why GCN 1.0 APUs support adaptive sync and not the GCN 1.0 desktop GPUs.

The reason it works for movies and the desktop is that those are static frame rates; there is no need to buffer frames or anything like that.

gotcha, thanks for clarifying that. A little disappointed but like the old saying goes, you gotta pay to play :p
 
gotcha, thanks for clarifying that. A little disappointed but like the old saying goes, you gotta pay to play :p

FYI - They did state multiple times all over the place that FreeSync gaming would only work on GCN1.1+ cards.
 
driver delayed.........ffffffffffff They should stop announcing deadlines for releases if they can't stick to them.
AMD needs better execution.... jeez
 
I actually saw some posts today criticizing AMD for blocking Nvidia out of FreeSync. Hearty chuckles ensued.
AMD just can't win, can they? Hearts and minds.

Pretty much. When nVidia does it, it's OK because of REASONS. :rolleyes:

I swear the braindead rabid fanboys are part of the reason why I wish to leave the green ecosystem.

(yes yes obviously team red has those characters too, but in a way I don't blame them for being defensive when you get shit on just because you're using AMD)

This is not to say AMD doesn't have its problems compared to nVidia. But for someone who's done his research and is either willing to put up with those issues or simply won't be affected by them, I don't see why they should be given any flak. Anyway, I digress...
 
Because V-Sync has no way of controlling when the display draws frames. It doesn't control that. Thus why G-Sync uses its own hardware controller. That's the entire basis of G-Sync, and why Freesync is a poor alternative.

G-Sync uses its own hardware controller because there were no other alternatives when Nvidia launched G-Sync. I might be a bit biased here, since I tend to buy both AMD and Nvidia cards and it's therefore an advantage to be able to use the same feature on the monitor with both, but I would have wished Nvidia would move over to AMD's DP 1.2a solution.

The hardware controller method has only one advantage as I see it, and that is that Nvidia has control over which scaler is on the screen. Problem is that that scaler is inferior, even for someone who only uses Nvidia cards and therefore has no need to connect other GPUs to the screen. Here's a few reasons:

1. Money. It adds a lot of extra cost to the monitor, which is passed on to the customer. Money the customer otherwise could have used to, let's say, upgrade from a GTX 960 to a GTX 970.

2. Overheating. As we saw with the Asus Rog screen, Asus had to overclock the module so it could be used with their resolution and refresh rate, leading to overheating problems. This can be fixed, but in its current state it's inferior.

3. Range. Freesync supports a refresh rate range from 9-240 Hz. Gsync from 30-144 Hz.

4. Availability. There is no extra cost for a monitor manufacturer to add Freesync to the display. They get a checkbox feature for free. If Nvidia were to scrap Gsync in favor of supporting the DP 1.2a variable refresh standard, I bet we would be hard pressed to find a monitor in the end that DIDN'T support DP 1.2a variable refresh/Freesync.

5. Connection options. Gsync's hardware module only supports DP. Though you need DP to use either Gsync or Freesync, Freesync doesn't prevent having HDMI, DVI and VGA connectors in addition to the Freesync-enabled DisplayPort.

Nvidia should scrap the Gsync module in favor of DP 1.2a variable refresh. Kudos to Nvidia for creating Gsync to begin with though.
 

AMDMatt said:
Hello Everyone,

First of all, I'm personally sorry for advising you all the driver would be ready today. I hate letting people down, as I know many of you have been waiting a long time for this driver.

We want the Catalyst 15.3 Beta driver with AMD FreeSync technology to be of the highest quality possible and are working hard to release it as soon as possible. I'll make sure to update everyone here when the driver is ready for download.

I don't anticipate a long wait for it to be released, but it's not going to be today as I was expecting, I'm afraid.

Thank you for your patience

http://forums.guru3d.com/showpost.php?p=5032468&postcount=1785
 
The strengths of each system really are:

Freesync
  • More affordable
  • Custom resolution scaler on the display side instead of the GPU
  • Choice of having Vsync on or off
  • Ability to use more than DisplayPort, and it doesn't disable audio over DP

Gsync
  • No Ghosting
  • ULMB as an option when playing at very high frame rates
  • Superior performance when going below the minimum refresh
  • A fixed standard refresh range set by one company instead of having ODMs choosing random values.

The crossfire thing I expect to be fixed in 2 months; it shouldn't be a big deal that it's not available atm. The difference in supported GPUs is annoying, but only a marginal few will care a year later. There are bigger concerns about the Freesync implementation.
 
The strengths of each system really are:

Freesync
  • More affordable
  • Custom resolution scaler on the display side instead of the GPU
  • Choice of having Vsync on or off
  • Ability to use more than DisplayPort, and it doesn't disable audio over DP

Gsync
  • No Ghosting
  • ULMB as an option when playing at very high frame rates
  • Superior performance when going below the minimum refresh
  • A fixed standard refresh range set by one company instead of having ODMs choosing random values.

The crossfire thing I expect to be fixed in 2 months; it shouldn't be a big deal that it's not available atm. The difference in supported GPUs is annoying, but only a marginal few will care a year later. There are bigger concerns about the Freesync implementation.

1- Just asking, but I thought the ghosting was a make/model LCD thing in the ones they had for testing and not a variable frame rate thing?
2- What is ULMB?
3- Yeah, I don't like that some may not know that the variable frame rate monitor they get may not go low enough. Hopefully that will be an advertised selling point early on. If they all get down to 20-30 Hz that should be enough. Most will not want to play games at less than 30 Hz.
4- But we should see more ranges on more displays as there is no license fee. The particular monitor you want may have some kind of variable frame rate vs. none.
 
1- Just asking, but I thought the ghosting was a make/model LCD thing in the ones they had for testing and not a variable frame rate thing?
2- What is ULMB?

1. Ghosting occurs on Gsync too, and overdrive is not controlled by Gsync, but by the screen itself. Here's from TFTcentral's review of the Acer Predator XB270HU with Gsync, which is a screen many are waiting for:

[Image: pursuit camera overdrive comparison from the TFTCentral review]


"No ghosting" is not a Gsync feature; it's controlled by how well overdrive is implemented.
http://www.tftcentral.co.uk/reviews/acer_xb270hu.htm

2. ULMB stands for Ultra Low Motion Blur, which is an evolution of the backlight strobing features used in stereoscopic-3D-capable screens. It's a monitor feature, not a Gsync feature (you can't even enable it while Gsync is on), and several monitor manufacturers have their own name for it. Benq calls it Motion Blur Reduction, and it's a feature on their Freesync screen. You can read more about ULMB at the TFTcentral link above.
 
1- Just asking, but I thought the ghosting was a make/model LCD thing in the ones they had for testing and not a variable frame rate thing?
2- What is ULMB?
3- Yeah, I don't like that some may not know that the variable frame rate monitor they get may not go low enough. Hopefully that will be an advertised selling point early on. If they all get down to 20-30 Hz that should be enough. Most will not want to play games at less than 30 Hz.
4- But we should see more ranges on more displays as there is no license fee. The particular monitor you want may have some kind of variable frame rate vs. none.

Tamlin_WSGF covered 1 and 2 mostly. ULMB is a gsync feature because Nvidia specifically tied the implementation into the gsync module. You can get monitors that have similar functionality to ULMB, but right now they are rare. Read this page to learn about the rest: http://www.tftcentral.co.uk/articles/motion_blur.htm (start from "Natively Supported Blur Reduction Methods").

Also, while ghosting could've happened with gsync displays, Nvidia saw the problem and compensated for it (which is probably one of the reasons they went with a module [hopefully that's not the case]).

I have nothing to add to 3.

The license fee has no bearing on the supported refresh ranges. What matters is the implementation of the display hardware. Technically nothing special needs to be done to implement variable refresh rate, as pointed out by the scaler ODMs, but one rep from a finished display ODM pointed out that if they really wanted to make something as good as gsync, they would have to put effort into it.

So just like everything else, from input lag, to choice of hardware scaler vs software scaler, to pre-calibrated displays, you get what you pay for when it comes to variable refresh ranges.
 
Sure, and it was just info on Freesync that I was disappointed to hear. You hear about this cool stuff that will get rid of something that has always bugged you, only to learn that it kinda sorta will fix it, maybe.

I still don't get why they can't just fix this on the GPU. They know the frame rate of the display; how is telling the display on the fly to hold an image longer easier/better than having the GPU send out the last completed frame again? Why is mouse lag with V-Sync an issue?
You can't guarantee a matching rhythm if the framerate is above the monitor's refresh rate; frames will either be finished ahead of the refresh or not yet completed when it comes time to push the image to the display. So you either introduce lag or accept screen tearing if you want fps above your refresh rate. That or you could just cap fps.
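The arithmetic behind that trade-off, with an assumed 144 Hz panel and a 200 fps game (numbers purely illustrative):

```python
# Above the refresh rate, every finished frame either waits (lag) or tears.
refresh_hz, game_fps = 144, 200

scanout_period_ms = 1000 / refresh_hz   # ~6.9 ms between refreshes
frame_time_ms     = 1000 / game_fps     # ~5.0 ms to render one frame

# V-Sync ON: a finished frame sits until the next refresh boundary,
# anywhere from 0 up to a full scanout period of added latency.
max_added_lag_ms = scanout_period_ms    # worst-case hold time
# V-Sync OFF: the buffer swap happens mid-scanout instead -> a tear line.
print(f"hold up to {max_added_lag_ms:.1f} ms (lag) or swap mid-scanout (tear)")
```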
Shouldn't you first make sure you get it before you try to fix it? :confused:

Freesync is not the limiting factor; the monitor and scaler used are. When Freesync is active and the fps is within the freesync (variable refresh) range the monitor supports, there is no mouse lag or tearing.

Freesync allows Vsync to activate if the framerate goes outside of the Freesync range, but as we all know, Vsync introduces input lag. However, unlike Gsync, Freesync allows you to deactivate Vsync, so there is no input lag if the framerate goes outside of the Freesync range.

If your computer is capable of getting you above the minimum freesync refresh rate, you can cap the FPS at the maximum and you won't get mouse lag or tearing.
So that more expensive nvidia module is doing something eh, I guess you get what you pay for. ;)
 
All the reviews I'm reading about Freesync look pretty great to me.

Have you seen PCPer's review? The tl;dr version is that if FPS dips outside the VRR window you're screwed, as you have to choose between judder and tearing.

Given that it is more likely for current games to dip below 40 FPS than to exceed 144 FPS, and having witnessed both technologies first-hand, I personally find it extremely important to stay in a variable refresh mode at the low end of the LCD panel's variable range. I'll gladly take the potential slight flicker of G-Sync over the 40 Hz judder/tearing of the BENQ. The take home point from my observations is that when gaming lower than the variable range, FreeSync panels retain the disadvantages of the V-Sync on/off setting, but amplify those effects as the panel is refreshing at an even lower rate than a standard display (e.g. 60 Hz).

And apparently there's some ghosting concerns as well.
 
Ghosting concerns have nothing to do with sync tech and everything to do with the monitor's ability to change pixels fast enough. I suppose nvidia has tighter control of monitor quality, but that's about it.
 
I'm just reporting on what PCPer said, because it wasn't exactly what I would call a favorable review. (if anything the review focused a lot on what Freesync DIDN'T do when outside the VRR window as opposed to what it DOES and how it compares to GSync)

As far as ghosting goes, that's exactly what I thought as well, that it's a panel specific phenomenon and should have nothing at all to do with Freesync.

But Allyn (second reviewer) was swearing up and down in the comments section that Freesync IS the source of ghosting on those specific panels. It all flew over my head since I'm not a panelhead, so yeah.
 
Well, this is pcper we are talking about; I'm surprised they haven't watermarked the AMD slides with nvidia logos or put a videocard box in front of the screen.

Anyways...
 
They could've at least tried to give the appearance of an objective review, but I'm glad that Allyn guy made those edits, because now the subtle biases are suddenly not so subtle.
 
I know but that is how Freesync as a whole works. People buy a Freesync card and a Freesync monitor and may still have tearing and/or input lag. I just didn't like the idea of saying that if you use it correctly it will work. You have to know that Freesync monitors are different and how much frame rate fluctuation you are going to have, and I think most folks are not going to be told about that. Off the top of my head I don't know how many of the games I play will fall in any given frame rate range. I know when they are too low.

I know this has been said a few times but you must have missed it, "Set an FPS cap."

No tearing and no lag. voila!
 
Guys, pcper aren't the only ones seeing ghosting.

At a lower performance level, such as 60 fps, very pronounced ghosting appears. Ghosting is a natural phenomenon on LCD screens, but various techniques, such as overdrive, exist precisely to eliminate it, or at least reduce it as much as possible.

But on the Acer XG270HU, either overdrive is simply disabled when variable refresh is used (3 options are offered and make no real difference), or the overdrive was calibrated only for operation at 144 Hz, with parameters not adapted to lower refresh rates. Ultimately, the problem is that on this screen, at a performance level realistic for most players, FreeSync adds another compromise: you must choose between smoothness and no ghosting.

So out of the 4 displays that could be reviewed, 3 of them are showing signs of ghosting. The fourth one is the 29" version of the LG display, so it's doubtful it escaped this problem.
 