Freesync reviews and displays released.

wand3r3r

I didn't see any threads on this yet, nor a review from [h] at this time.

It took a while to get here, but if the proof is in the eating of the pudding, FreeSync tastes just as good as G-SYNC when it comes to adaptive refresh rates. Perhaps more importantly, while you’re not getting a “free” monitor upgrade, the current prices of the FreeSync displays are very close to what you’d pay for an equivalent display that doesn’t have adaptive sync. That’s great news, and with the major scaler manufacturers on board with adaptive sync the price disparity should only shrink over time.

The short summary is that FreeSync works just as you’d expect, and at least in our limited testing so far there have been no problems. Which isn’t to say that FreeSync will work with every possible AMD setup right now. As noted last month, the initial FreeSync driver that AMD provided (Catalyst 15.3 Beta 1) only allows FreeSync to work with single GPU configurations. Another driver should be coming next month that will support FreeSync with CrossFire setups.

http://www.anandtech.com/show/9097/the-amd-freesync-review

The two graphs above show how frame rates are affected when enabling / disabling V-Sync. With V-Sync enabled (red line) on a display that has a refresh rate of 60Hz, and the games configured for high image quality settings to target the 40-60 FPS range, it is not uncommon to see frame rates bounce between 60 and 30 FPS for a time (half the monitor's refresh rate), which means many frames are duplicated, which introduces lag. We should mention that this is another area where FreeSync has an advantage over G-SYNC. With FreeSync, if V-Sync is disabled, frame rates are not limited by the max refresh rate of the connected display.
http://hothardware.com/reviews/amd-freesync-and-lg-34um67-widescreen-monitor-review
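
(A quick toy calculation of the 60/30 bounce described above - my own numbers, nothing from the article. With V-Sync on a 60Hz panel a finished frame has to wait for the next refresh boundary, so the effective rate snaps to 60, 30, 20 FPS and so on:)

REFRESH_HZ = 60
REFRESH_MS = 1000.0 / REFRESH_HZ

def effective_fps_with_vsync(render_ms):
    # a frame occupies a whole number of refresh intervals (rounded up)
    intervals = -(-render_ms // REFRESH_MS)  # ceiling division
    return REFRESH_HZ / intervals

for render_ms in (15.0, 18.0, 25.0, 34.0):
    print(render_ms, "ms ->", round(effective_fps_with_vsync(render_ms), 1), "FPS")
# 15.0 ms -> 60.0 FPS, 18.0 ms -> 30.0 FPS, 25.0 ms -> 30.0 FPS, 34.0 ms -> 20.0 FPS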

Along with this information, AMD also gave some performance data. It has long been a question of whether FreeSync will have any performance impact, and to answer this AMD has done some tests of its own. On identical platforms using a Z87 motherboard and an i7-4770K processor, AMD said that enabling FreeSync actually improved performance by about 0.2 percent when using an R9 290X. When using a GTX 780, AMD actually observed that its competitor's technology, G-Sync, reduced performance by about 1.5 percent.
http://www.tomshardware.com/news/amd-project-freesync-launch,28759.html

Hands-on with AMD's FreeSync: The technology that could kill Nvidia's G-Sync

If there's one thing the tech market doesn't need, it's another standards cat fight. But you survived Firewire vs. USB, HD-DVD vs. Blu-ray, and RDRAM vs. DDR, so get ready for the battle between Nvidia's G-Sync and AMD's FreeSync to kick into high gear.

...

But back to the consumer who will be forced to choose between the two when buying a monitor. If the 11 monitors that support FreeSync actually all appear, it would mean AMD has an advantage in support. Even almost a year and a half after announcing G-Sync, the number of current G-Sync panels is six according to Nvidia's own page. If AMD is right, and we see 20 FreeSync panels by the end of this year, that's a strength in numbers G-Sync has never enjoyed.

Nvidia's strength, on the other hand, is the popularity of its GPUs. Most hardware surveys give Nvidia roughly a 2:1 advantage in discrete graphics market share, which means there's a higher chance of a gamer buying a G-Sync monitor to match his or her Nvidia GPU.

To balance that out, monitors using FreeSync appear to have a price advantage:
http://www.pcworld.com/article/2897...echnology-that-could-kill-nvidias-g-sync.html

http://www.bit-tech.net/hardware/monitors/2015/03/19/amd-freesync-officially-launches/1
The LG 34UM67 has an MSRP of $649 while its little brother, the 29UM67, can be purchased for $449. Compared to other ultrawide monitors the new LG FreeSync display holds no additional price premium. In fact, the monitor is cheaper than the equivalently spec'd LG 34UM65 which lacks FreeSync support. Such aggressive pricing bodes well for the competitiveness of FreeSync displays in the marketplace.
http://hexus.net/tech/reviews/monitors/81694-lg-34um67-amd-freesync-monitor/

http://techreport.com/news/27987/amd-makes-freesync-official-reveals-display-pricing
 
Are there any reviews from Display specialty sources? Most of the ones you quoted sound like they have no idea what they're talking about.
 
Thank you, NVidia, for leading the way. Since I already got my G-Sync display, I'll most likely not get another gaming display for 3 to 5 years. But my next gaming display will likely be FreeSync, as long as AMD keeps the graphics battle competitive.
 
Are there any reviews from Display specialty sources? Most of the ones you quoted sound like they have no idea what they're talking about.

PCPer's articles are usually full-featured and the ones I read first.
 
Saw this on PCPer:
[attached image: ghost1.jpg]


Well, I'm pretty sure that AMD will have that problem solved by the time I get my next gaming monitor.
 
Cool. I'll still reserve judgement on how they handle this with drivers. One thing AMD has never impressed me with is good, stable drivers. Since this is software based, though, I would like to see how FreeSync holds up across millions of users versus a handful of tech sites with initial impressions. One thing Nvidia has nailed is that G-Sync just works. It has worked 100 percent from the moment I started using it, and very few games that I know of (none that I've played) have issues with it.

But if AMD can make it better and cheaper and obliterate Nvidia's price-point issue, then great. What I'm really hoping for is that Nvidia just adds FreeSync support (not immediately, because that won't happen), ditches G-Sync, and I can keep using my Nvidia GPUs without having to switch back to AMD. I refuse to go back to AMD based on my previous experiences with them. I'm pretty sure that unless we see a huge list of issues, FreeSync just wins by default.
 
As time goes by I am sure both G-Sync and FreeSync/A-Sync will get more advanced. A-Sync will be the layman's middle ground for those who want some form of tear-free monitor without suffering from vendor lock-in (it will most likely become THE standard in the next year), and nVidia, more likely than not, will end up supporting these monitors for at least basic G-Sync (they already have proof-of-concept drivers for laptop panels with A-Sync built in).

So it could mean that while A-Sync will become widespread, G-Sync monitors may have a few nifty features for a premium (e.g. I don't think ULMB, for what it's worth, is available on A-Sync monitors).

It's going to be a good year for monitors.
 
I wonder if this is caused by the GPU or the monitor - if it's the GPU, a driver update can fix it, but if it's on the display side, it would need at least a firmware update.

Hopefully it is one of those. It could also be an unavoidable problem with the given hardware, maybe one of the reasons GSYNC monitors have that extra, costly hardware in them.

If I was looking to get a FreeSync monitor I would definitely wait for them to fix this.
 
I'm really confused about what happens when the frame rate drops below the minimum refresh of the monitor.

In the PCPer review they say that when the frame rate fed to their BenQ XL2730Z drops below its 40 Hz minimum rated refresh rate, the display keeps refreshing at 40Hz for all frame rates below that number. It's also made clear that there is no mechanism present in either the GPU or the monitor to duplicate (or triplicate, quadruplicate...) a given frame in order to stay in the variable refresh rate region of the monitor, which explains why the monitor gets locked at its lower bound. And this doesn't make sense to me.

The obvious fix is to repeat frames when you're below the minimum refresh rate, and either the GPU or the monitor would have to do it. Now, I don't think doing it in the monitor is the proper way, because you'd need something like the G-Sync module, with memory and all sorts of logic and trickery, which increases costs and, in case of bugs, makes a fix very hard to implement. The monitor should be "dumb" and just display what the GPU tells it to display. Which is exactly what an Adaptive Sync monitor does.

So, why doesn't the GPU do this frame du-tri-quadru... plication? It has all the needed hardware, it would only need software/driver support. You are generating 40 FPS? Cool, output that. You drop to 39? No problem, output the last frame twice (for a total of 78 refreshes). You drop to 15 FPS? No problem, output the last frame 3 times (for a total of 45 refreshes). This would keep the monitor in variable refresh mode at all times. Well, except for when you are generating more than 120/144 FPS, but there is no way around that.

Another way of improving picture quality would be to repeat the last frame as many times as needed to get as close to the maximum refresh rate as possible, since pixel response and other parameters improve as the refresh rate goes up. You are generating 15 FPS? Instead of repeating the last frame 3 times (the bare minimum to be in the variable refresh window), repeat it 9 times, for a total of 135 refreshes (as close as possible to the maximum 144Hz).
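
Something like this is what I have in mind - just my own sketch (pseudo-Python with made-up limits, not anything AMD has announced):

VRR_MIN_HZ = 40     # e.g. the XL2730Z's lower bound
VRR_MAX_HZ = 144

def repeats_for(fps):
    # inside the window (or bogus input): send each frame once, nothing special to do
    if fps >= VRR_MIN_HZ or fps <= 0:
        return 1
    # below the window: repeat each frame as many times as still fits under the max
    n = 1
    while fps * (n + 1) <= VRR_MAX_HZ:
        n += 1
    return n

for fps in (40, 39, 15):
    n = repeats_for(fps)
    print(fps, "FPS -> show each frame", n, "time(s) ->", fps * n, "Hz at the panel")
# 40 FPS -> 1 -> 40 Hz, 39 FPS -> 3 -> 117 Hz, 15 FPS -> 9 -> 135 Hz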

Am I missing something? Why hasn't AMD implemented something like this (or NVIDIA, because they will eventually support Adaptive Sync)? Is this in the works?
 
I'm really confused about what happens when the frame rate drops below the minimum refresh of the monitor. ... Am I missing something? Why hasn't AMD implemented something like this (or NVIDIA, because they will eventually support Adaptive Sync)? Is this in the works?

The PCPer guys talked about this briefly in the comments section of their article (see below). It sounds like it's certainly possible, but I doubt AMD will get around to it anytime soon. They're six months behind on when they said the first FreeSync monitors would arrive at tech sites, and the CrossFire driver isn't ready either.

"March 19, 2015 | 02:53 PM - Posted by Allyn Malventano
Very good point. This could likely be implemented at the driver level. I for one hope to see them do just that actually!"

"March 19, 2015 | 04:21 PM - Posted by Ryan Shrout
If you read the last page of the story, I have heard AMD hint that might be coming in the future."

"March 19, 2015 | 04:55 PM - Posted by Allyn Malventano
Mostly correct.

If AMD can correctly implement some form of frame redraw multiplication on the low end of the range, and if the manufacturers can get the ghosting issue under control in their TCON's, then yes, they would then be equal. I believe those issues contribute to NVIDIA's decision to use a module, at least for the time being.

Above the VRR range, AMD actually has an advantage here as G-Sync is forced to V-Sync ON while FreeSync is selectable (though that same selection applies to operating below the VRR range - you can't choose one for high and another for low)."

http://www.pcper.com/reviews/Displays/AMD-FreeSync-First-Impressions-and-Technical-Discussion
 
What the hell? Is this really Freesync related?

According to Allyn on PCPer, the XL2730Z and the Asus ROG Swift use the same panel, yet the Swift does not exhibit the same kind of ghosting as the XL2730Z, so he seems to be leaning toward it being FreeSync-induced.
 
I'm really confused about what happens when the frame rate drops below the minimum refresh of the monitor. ... Am I missing something? Why hasn't AMD implemented something like this (or NVIDIA, because they will eventually support Adaptive Sync)? Is this in the works?

From PCPer:

Since LCD panels have a maximum time between refreshes that can not be exceeded without risk of damage, the G-Sync module inserts additional refreshes in-between the incoming frames. On current generation hardware, this occurs adaptively and in such a way as to minimize the possibility of a rendered frame colliding with a panel redraw already in progress. It's a timing issue that must be handled carefully, as frame collisions with forced refreshes can lead to judder (as we saw with the original G-Sync Upgrade Kit - since corrected on current displays). Further, the first transition (passing through 30 FPS on the way down) results in an instantaneous change in refresh rate from 30 to 60 Hz, which on some panels results in a corresponding change in brightness that may be perceptible depending on the type of panel being used and the visual acuity of the user. It's not a perfect solution, but given current panel technology, it is the best way to keep the variable refreshes happening at rates below the panel hardware limit.

Sounds like a non-trivial problem, probably with edge cases galore.
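
A toy model of the timing constraint they describe (my own guess at the idea, with invented numbers - not how the module actually works):

MAX_HOLD_MS = 33.3   # assumed longest a panel can hold a frame without a refresh (~30 Hz)
REDRAW_MS = 7.0      # assumed time to scan out one refresh at 144 Hz

def what_to_do(ms_since_last_frame, new_frame_ready):
    if new_frame_ready:
        return "display the new frame now"
    if ms_since_last_frame >= MAX_HOLD_MS - REDRAW_MS:
        # start the forced repeat early enough that it finishes before the hold limit;
        # if the real frame arrives while this repeat is still scanning out, it has to
        # wait - that's the collision/judder case the article mentions
        return "start a forced repeat of the previous frame"
    return "keep waiting for the next frame"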
 
Thanks for pointing me to the comments, guys - I hadn't read them.

Thing is, G-Sync is doing exactly what we are suggesting at the low end of the range, but on the monitor instead of the GPU, and it seems to work. If that works reasonably well, doing it on the GPU should be even better, as the driver has information about the current rendering process that the monitor completely lacks, which could help reduce possible collisions.

Another point: if refreshing at low Hz lowers the brightness, why do it at all? On a 144Hz monitor, you can start duplicating frames as early as 72 FPS input.

Ah, let's see how this tech develops.
 
The Hexus review is definitely tailored to display enthusiasts. They measured and calibrated it with a tri-stimulus colorimeter. Looks like they liked the 34UM67 quite a bit, and thought that the addition of FreeSync was most certainly a positive.

The experience with FreeSync is an overwhelmingly positive one - when the frame-rate resides in the supported range gameplay is silky-smooth. The dynamic synchronisation greatly improves the fluidity of motion-intensive games such as racing and flight simulators; combine that with the extra field of view from the ultrawide panel and it's a very immersive gaming experience.
 
Thing is, G-Sync is doing exactly what we are suggesting at the low end of the range, but on the monitor instead of the GPU... Another point: if refreshing at low Hz lowers the brightness, why do it at all?

I think it's the discontinuity in refresh rates that changes the brightness. The pixel matrix has been getting electrical signals at generally similar intervals, and then suddenly the interval is cut in half. Worse, this happens on some monitors and not others, and in the end Nvidia probably has to be involved to ensure it doesn't show up as a major defect in the product.

As for the rest of the issues, yeah, AMD will probably continue to improve this tech. They got this feature up and running, and people can buy it today and save a couple hundred bucks or whatever. If they couldn't get feature XYZ working right this instant, it makes sense to save it for future revisions rather than wait forever to develop and include every possible function you could ever want from a VRR monitor.
 
The hexus review is definitely tailored to display enthusiasts. They measured and calibrated it with a tri-stim. Looks like they liked the 34um67 quite a bit, and thought that the addition of freesync was most certainly positive.

They don't measure input lag, overshoot, motion clarity, etc., which are pretty important to gaming display enthusiasts. It is cool that they measure color uniformity, but this is a gaming display and those other factors are very important.
 
LG now just needs to add Adaptive-sync to the 34UM95. I don't think 2560x1080 is worth it on a 34" display.
 
LG now just needs to add Adaptive-sync to the 34UM95. I don't think 2560x1080 is worth it on a 34" display.

Personally, I was almost set on the 29" one - that is, until I saw those ghosting issues.

Strangely, though, PCPer seems to be the only review highlighting them heavily. I don't know what to think; I'd really dig a 21:9 monitor with FreeSync, and there's no other in sight.
 
That's mostly because a lot of those "reviews" are shit-tier quality and barely mention operation in the low frequency range.

And it's not the only one http://www.hardware.fr/focus/108/freesync-disponible-premiers-ecrans-decoivent.html

Once FreeSync is activated, the animation flows smoothly without any increase in latency. This is as good as G-Sync... at least as long as performance stays between 120 and 144 fps. Below that, the Acer screen quickly shows its limits.

At a lower performance level, such as 60 fps, a very pronounced ghosting phenomenon appears. Ghosting is a natural phenomenon for LCD screens, but various techniques, such as overdrive, are in principle there to eliminate it, or at least reduce it as much as possible.

But on the XG270HU, either Acer simply disables overdrive when variable refresh is in use (three options are offered and make no real difference), or the overdrive was calibrated only for operation at 144 Hz, with parameters not adapted to lower refresh rates.

Wat. Is this real?
 