Monitor manufacturers chime in on the state of GSYNC and FREESYNC.

cageymaru

Why AMD FreeSync is beating Nvidia G-Sync on monitor selection and price

I thought it was going to be a Red vs Green article, but then I saw that the information for the article was coming from representatives of the actual monitor manufacturers. Really interesting article because of the viewpoints expressed. You should read it from beginning to end as it is very informative.

Here's what I found most interesting, something I hadn't considered in the past as I'm a "throw more power at it" type of person.


Some display makers say Nvidia’s module requires more room inside the monitor enclosure. While that may not seem like a big deal, creating a custom product design for one type of monitor raises development costs considerably, says Minhee Kim, a leader of LG’s PC and monitor marketing and communications. By comparison, Kim says, AMD’s approach is more open, in that monitor makers can include the technology in their existing designs.

“Set makers could adopt their technology at much cheaper cost with no need to change design,” Kim says. “This makes it easier to spread models not only for serious gaming monitors but also for mid-range models.”



Even if monitor makers proceed with the necessary research and development, the resulting product will be more expensive, which inevitably means it will sell in lower volumes. That, in turn, means it’s harder for monitor makers to recoup those up-front development costs, says Jeffry Pettinga, the sales director for monitor maker Iiyama.

“You might think, oh 10,000 sales, that’s a nice number. But maybe as a manufacturer you need 100,000 units to pay back the development costs,” Pettinga says.


Meanwhile, he says, monitors are constantly improving in other areas such as bezel size. As monitors shrink from wide bezels to slim bezels to edge-to-edge displays, the risk is that a slow-selling G-Sync will become outdated long before the investment pays off.

“Let’s say you introduced, last year, your product with G-Sync. Six months of development, and you have to change the panel. You haven’t paid off your development cost,” Pettinga says. “There’s a lot of things going on on the panel side.”
 
You know, Nvidia will just make "GSync 2.0" and make it basically the same adaptive-sync, firmware-based solution as FreeSync, except it will not work on AMD cards. This way they can completely avoid the 'design and manufacturing cost' issue discussed above.

Nvidia will NEVER support FreeSync. They would rather go bankrupt.

The [H]ard prophet has spoken.
 
I have contract-designed a gsync monitor and am currently working on freesync displays, with the support of AMD.

With Nvidia, you get a reference design, modify it a bit, and they do all of the testing and tuning. Branding with gsync is basically who handles support, sales, and distribution. The monitor itself may as well be an nvidia brand. That said, nvidia's implementation, testing, and tuning is excellent and gives top-notch results.

If suddenly NV decided to drop the BS with lockin and allow adaptive-sync input to their controller, the current implementation that nvidia has on the gsync monitors would still be the best available.

With AMD, you get a spec saying "do this and freesync will be enabled". If you want to use the freesync logo and be supported by AMD in the marketing sense, you have to do some quality testing and submit a couple of prototypes to AMD so that they can verify the results and give you a pass/fail.


Nvidia will certainly support freesync when they determine that it will be more profitable to do so. I have no inside information on this, but expect that by the time freesync is a very common feature, it will happen. Might take another 2 years.
 
I think the market will be self-correcting. I don't think NVIDIA is going to develop/spend the money on a DP 1.3/1.4 G-Sync TCon. This means all the displays coming out in 2017 and beyond that need more than the DP 1.2 bandwidth of the current G-Sync chip won't have G-Sync.
 

Which is almost, but not quite, like admitting that Freesync has won the first market battle of this standards war. Looking back, who would have said when AMD launched Freesync that it would become the most common variable refresh rate tech solution?

This [H]ard prophet [H]ere predicts that there will be DP 1.3/1.4 solutions with Freesync.

There is always some chance that Freesync becomes a mandatory feature of future DP standards, while there is absolutely no chance of that for G-Sync.
 
Displayport supported adaptive refresh rate before G-Sync was developed, yet monitor manufacturers didn't support it. Instead, they were focused on gimmicks like 3D. I have very little sympathy for their cost structures when they could have avoided the entire situation by simply supporting the friggin' technology that was already available.

That all said, G-Sync and Freesync are not quite equivalent technologies, as G-Sync allows the GPU to directly control the display's timing, which is why it requires special hardware. Bake that feature into a Displayport standard revision, and allow it to go down to 24hz to eliminate the various tricks we have to use to get 24fps movies to display correctly, and then you truly will kill G-Sync.
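To put rough numbers on the 24 fps point (this is just my own back-of-the-envelope illustration, not anything from the spec or the article), here's why a fixed 60 Hz display needs pulldown tricks while a VRR range that reaches 24 Hz, or 48 Hz with frame doubling, does not:

Code:
# Rough illustration only: 24 fps film on a fixed 60 Hz display vs. a VRR display
# whose range covers 24 Hz (or 48 Hz with each frame shown twice).
FILM_FPS = 24.0
FIXED_HZ = 60.0

# 3:2 pulldown on fixed 60 Hz: film frames alternate between 3 and 2 refreshes,
# so on-screen durations alternate between 50.0 ms and 33.3 ms -> judder.
pulldown_ms = [3 / FIXED_HZ * 1000, 2 / FIXED_HZ * 1000]
print("3:2 pulldown durations:", [round(d, 1) for d in pulldown_ms])  # [50.0, 33.3]

# VRR covering 24 Hz: every film frame is held for the same 41.7 ms, even cadence.
print("VRR duration:", round(1000 / FILM_FPS, 1))  # 41.7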
 

Not quite, gsync was first. NV went with their solution to minimize time to market to maximize profit. The vendor lockin was an afterthought that they stuck with for profit reasons.

Freesync is the one which allows the GPU to control the display timing more directly. The dedicated hardware in a gsync module regenerates timing and video, particularly for the case when the refresh rate drops below the panel's minimum refresh. IIRC the freesync minimum refresh is 9 Hz, but the available freesync monitors do not support this because they'd need to do timing and video regeneration like what the module does.

Too bad Freesync is the inferior technology.

No, it isn't. The tech does the same thing. The NV controller in gsync monitors is the best controller available now, but that does not mean that gsync is a better tech. They are functionally equivalent. IMO most of the features of the gsync module *should* be done on the GPU.
 
Not quite, gsync was first. NV went with their solution to minimize time to market to maximize profit. The vendor lockin was an afterthought that they stuck with for profit reasons.

Adaptive refresh rate has existed since eDP in 2009. G-Sync was first with a more comprehensive solution where the GPU controlled the monitor timing, but adaptive refresh rate via Displayport had already existed. The problem was that no one supported it. Freesync in Displayport 1.4 is a response to G-Sync, but it is not a counter.

Freesync is the one which controls the display timing more directly.

Freesync just changes the refresh rate to match the FPS output. G-Sync goes several steps beyond, because it replaces the monitor's timing entirely and allows the GPU to control the display, rather than simply pass off frames at varying rates.

No, it isn't. The tech does the same thing.

No, they do not. They may have a similar final effect—which is somewhat debatable, as it's possible to at least subjectively measure the perceived differences in smoothness and tearing—but they absolutely do not do the same thing. G-Sync requires hardware for a reason.
 
Did we ever get the specifics of what G-sync is really doing with their hardware? If they finally revealed it I missed it.

I really hope NV sees the light and adds freesync support soon.
 
Let's talk about V-Sync, Free-Sync, G-Sync, Adaptive-Sync and Fast-Sync (self.buildapc, Reddit)

Dissecting G-Sync and FreeSync - How the Technologies Differ | PC Perspective

But what happens with this FreeSync monitor and theoretical G-Sync monitor below the window? AMD’s implementation means that you get the option of disabling or enabling VSync. For the 34UM67 as soon as your game frame rate drops under 48 FPS you will either see tearing on your screen or you will begin to see hints of stutter and judder as the typical (and previously mentioned) VSync concerns again crop their head up. At lower frame rates (below the window) these artifacts will actually impact your gaming experience much more dramatically than at higher frame rates (above the window).

G-Sync treats this “below the window” scenario very differently. Rather than reverting to VSync on or off, the module in the G-Sync display is responsible for auto-refreshing the screen if the frame rate dips below the minimum refresh of the panel that would otherwise be affected by flicker. So, in a 30-144 Hz G-Sync monitor, we have measured that when the frame rate actually gets to 29 FPS, the display is actually refreshing at 58 Hz, each frame being “drawn” one extra instance to avoid flicker of the pixels but still maintains a tear free and stutter free animation. If the frame rate dips to 25 FPS, then the screen draws at 50 Hz. If the frame rate drops to something more extreme like 14 FPS, we actually see the module quadruple drawing the frame, taking the refresh rate back to 56 Hz. It’s a clever trick that keeps the VRR goals and prevents a degradation of the gaming experience. But, this method requires a local frame buffer and requires logic on the display controller to work. Hence, the current implementation in a G-Sync module.
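For what it's worth, here's a tiny sketch of the frame-multiplication behaviour PCPer measured. The real module logic isn't public; the ~50 Hz floor below is just my guess, chosen because it happens to reproduce the three quoted data points (29 -> 58, 25 -> 50, 14 -> 56):

Code:
# Guesswork sketch of the below-window behaviour PCPer measured on a 30-144 Hz
# G-Sync panel. NVIDIA's actual algorithm is not public; the 50 Hz "comfort floor"
# is an assumption that just happens to match the quoted measurements.
PANEL_MAX_HZ = 144
TARGET_FLOOR_HZ = 50  # assumed, inferred from 29->58, 25->50, 14->56

def module_refresh_hz(fps):
    """Repeat each frame k times so the effective refresh clears the floor."""
    k = 1
    while fps * k < TARGET_FLOOR_HZ and fps * (k + 1) <= PANEL_MAX_HZ:
        k += 1
    return fps * k

for fps in (29, 25, 14):
    print(fps, "FPS ->", module_refresh_hz(fps), "Hz")  # 58, 50, 56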
 
Yah, you could totally send the repeat frame from the GPU instead of throwing an expensive frame buffer in between the GPU and the monitor. There might be a GPU silicon reason for it, but they really should fix it (and any software issues that go along with it) to enable the cheaper monitor hardware, which is Adaptive-Sync aka freesync. Sounds like AMD has some issue in their software that causes tearing below the minimum supported FPS. They say as much in the PCPer article.

If they want to call their support Adaptive-Sync instead of Free-Sync to avoid drinking any of AMD's Kool-Aid, that is fine.

Thanks for the links!
 

AMD fixed up the low-framerate tearing after that article was published with LFC. This only works on the higher-end screens, but it's essentially adaptive sync from 1Hz all the way to the max refresh.
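The idea behind LFC is simple enough to sketch (this is just the concept, not AMD's actual driver algorithm): when the render rate drops below the panel's minimum, the GPU re-presents the same frame enough times that each scanout stays inside the supported range, no monitor-side buffer required.

Code:
import math

# Conceptual sketch of GPU-side low framerate compensation, not AMD's real code.
def lfc_schedule(frame_time_s, panel_min_hz=30, panel_max_hz=144):
    fps = 1.0 / frame_time_s
    if fps >= panel_min_hz:
        return 1, fps  # inside the VRR window: one scanout per rendered frame
    # smallest duplication factor that lifts the scanout rate above the panel minimum
    m = math.ceil(panel_min_hz / fps)
    return m, min(fps * m, panel_max_hz)

for ft in (1 / 45, 1 / 20, 1 / 9):
    m, hz = lfc_schedule(ft)
    print(f"{1 / ft:.0f} FPS -> present each frame {m}x at ~{hz:.0f} Hz")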
 
I had always expected that FreeSync monitors would become much more widespread than G-Sync ones, as last I heard nVidia keeps pretty tight control over the specific modules, which leaves G-Sync poorly positioned to compete against a DP standard.

In fact, I think it's only a matter of time before EVERY monitor on this planet (laptop or desktop) comes with VRR, especially if Intel's iGPUs support it. nVidia will either continue to hide in its own G-Sync bubble (which makes business sense while you are the undisputed leader in the GPU market, but not if you ever lose that lead for any extended period of time), or give in and support DP's VRR.

It's such a pity that AMD could not exploit that position from the GPU side. Had AMD's GPU offerings been better than they are now, I'd wager many people on [H] would at least consider that option due to the much greater number of cheaper panels.

I know I probably would be, and I am a G-Sync owner.
 
FreeSync is literally the cut-rate version of G-Sync. It's cheaper and more profitable, but so is designing and producing anything of inferior quality. I still don't know why people are willing to pay thousands for GPUs but decide to go cheap when it comes to the technology on which all that output is displayed.
 
We, as consumers, have an extraordinarily bad idea of which monitors are successful and which are not, and that contributes heavily to the appearance of monitor manufacturers moving in mysterious and unexplained ways.

By this, I mean, let's take for example the Acer Predator X34. How many of them were made and sold? Is anybody here able to even guess whether 10,000 is a big number or a small number? Do we have any idea whether or not Acer thought it was a success or a failure?

This article is remarkable to me because I've never seen monitor manufacturers speak to consumers before.

I'm more than a bit annoyed that one of their primary concerns is the cost of the cable they put in the box. THAT could decide whether they go G-Sync or FreeSync with a monitor? Ugh.
 
My last post in this thread.

You are all conflating implementation with specification, probably because you don't know what the actual difference is.

Freesync only says "the display will operate properly if you vary vblank within this range".
Gsync essentially says "ask me if I can accept a new frame".

There are implications to each of these approaches when the render rate drops below the display rate.

Freesync requires that the graphics card continues to send data. It does not specify what is being displayed, only that the video stream continues to exist. It is up to the gpu+user to choose some combination of frame multiplication, buffering, and tearing.

Gsync has the monitor perform a refresh from its own framebuffer (note: requires hardware for this). The user does not get to choose behavior in this mode, but could avoid the situation by duplicating frames GPU-side. Polling is used to determine readiness for a new frame.

The end result of both approaches is a series of vblank-stretched frames; the difference is where the data source for the lower-than-limit video data is located. Freesync places the data source on the GPU, while gsync places the data source within the display.
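If it helps, here's a toy model of the two contracts as I understand them (my own framing, nothing official). The only point it makes is where the below-minimum repeat frames come from:

Code:
# Toy model only. The repeat frames come from the GPU in the Adaptive-Sync/FreeSync
# case, and from the display module's local buffer in the G-Sync case.

class AdaptiveSyncDisplay:
    """Promises to operate with any refresh interval inside [1/max_hz, 1/min_hz];
    it never stores frames, so the GPU must keep sending something."""
    def __init__(self, min_hz, max_hz):
        self.min_interval = 1.0 / max_hz
        self.max_interval = 1.0 / min_hz

    def scanout(self, frame, interval_s):
        assert self.min_interval <= interval_s <= self.max_interval
        # display the frame; duplicate/tear decisions were already made GPU-side

class GsyncModuleDisplay:
    """The GPU asks before sending; if no new frame arrives in time, the module
    refreshes the panel from its own framebuffer (hence the hardware requirement)."""
    def __init__(self, min_hz, max_hz):
        self.max_interval = 1.0 / min_hz
        self.buffered_frame = None

    def ready_for_frame(self):
        return True  # stands in for the 'are you ready' poll

    def scanout(self, frame=None):
        if frame is not None:
            self.buffered_frame = frame  # new frame arrived from the GPU
        return self.buffered_frame       # otherwise self-refresh from local buffer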

GPU-driven re-refresh is a higher-performance option since the GPU has more knowledge of what is going on and what the user wants (allow tearing or finish the duplicated frame).

PSR (panel self-refresh) has the potential for lower power consumption and is in the specs, but is basically unused outside of the mobile domain. PSR requires a framebuffer on the target display.

The short story is that Freesync gives more direct control of the display to the GPU, eliminates the 'are you ready' polling, and does not have the display-side framebuffer prerequisite. Freesync is the superior specification.

That said, Nvidia has done an excellent job with the gsync monitor controller, so none of the freesync displays currently match the overall quality of solution that nvidia offers. Gsync appears to the user as a better solution due to quality of implementation, not due to the superiority of the specification.
 

Kudos for a factually informative post. Being against closed standards as I am, I lean more towards freesync, but I do agree that the implementation is far better in NVIDIA's product. Which is what always happens with NVIDIA, or at least most of the time: it doesn't matter whether the spec or tech is actually worse when, the moment you use it, you get a better experience. Although I am very happy with my 290X after owning NVIDIA products for a decade and being disappointed with their drivers lately.

---

Regarding the war itself between Freesync and Gsync... IMO, the biggest problem Freesync suffers from doesn't come from the VRR itself, but from the GPU side of things. If you are a power user you are forced to use NVIDIA (simply because their GPU options, as of today, are unmatched), and that forces you towards Gsync too. So, IMO, Gsync monitors are sold more out of the obligation of owning a high-end GPU than because of preference. Yes, some users will see the Gsync implementation as the best one (and they will probably be right) and choose an NVIDIA card based on their monitor preference, but I think most simply pick Gsync because that is the brand they have to use based on their GPU.
 
Gsync appears to the user as a better solution due to quality of implementation, not due to the superiority of the specification.
When you don't set any standards and let manufacturers produce the least-cost solution, the result is one and the same. FWIW, Freesync monitors that approximate the performance of a G-Sync solution are not cheaper in any way (see the Eizo FS2735), even though the corollary to the above would suggest they should be. So thanks for elucidating why and how FreeSync is the inferior choice: it shifts the burden of implementation to manufacturers, who are more than happy to cut corners and provide inferior products to consumers if people let them get away with the illusion of having a product that is just as good.
 
It's about time Nvidia moved from FPGA to ASIC. Going FPGA might have made sense at first, but it's expensive and bulky. I recall Tom Petersen stating that they would move to ASIC when it made business sense. Whatever that means.
 
When you don't set any standards and let manufacturers produce the least-cost solution, the result is one and the same. FWIW, Freesync monitors that approximate the performance of a G-Sync solution are not cheaper in any way (see the Eizo FS2735), even though the corollary to the above would suggest they should be. So thanks for elucidating why and how FreeSync is the inferior choice: it shifts the burden of implementation to manufacturers, who are more than happy to cut corners and provide inferior products to consumers if people let them get away with the illusion of having a product that is just as good.

Doesn't that open the way for manufacturers to make more expensive professional models in the future that can take advantage of the full range of the technology? And for developers like cirthix to make their own boards to sell and to modify existing models? The way I see it, both GSYNC and FreeSync are relatively new tech, and thus I didn't expect the first monitors from the respective camps to take full advantage of their tech. As it matures and more fields see it as a boon, we will see more expensive and better implementations. Of course these will trickle down to cheaper models.

Asus is coming out with a GSYNC 240Hz model soon. How many 2016 AAA+ quality games can run at 240Hz on a Titan X Pascal with the settings turned up? I can't imagine a single person playing the new Mafia III game at lowest settings to reach the magical 240Hz plateau. Even at lower resolutions you would be bottlenecked by the CPU.

Does that mean that the Asus monitor is a POS? Hell no! It's probably going to be a great monitor. Likewise FreeSync isn't a POS because the monitor releases in the first couple of years failed to take advantage of everything available in the technology.

Let the technology mature on the DisplayPort side of things. Hopefully the TV manufacturers making HDR sets will enable it via HDMI, since there is the possibility of a console having the hardware to handle it. Just seems like the chips in them would. Get barebones FreeSync as a part of the cheapest office cubicle display. Then focus on getting every ounce of performance from the spec.

What would be nice to see is AMD partnering with an HDR set manufacturer and creating a prototype monitor that could handle the full range of FreeSync for the upcoming CES. An even better outcome would be for Nvidia and Intel to share that stage with them.

I can't see Red vs Green in display technology surviving. Sounds like the dumbest thing to ever happen to PC technology.
 
You can wait and wait, and you can hypothesize about what you feel manufacturers "should" do. The thing is, nvidia forced variable hz to market with what is so far the superior real-world solution, regardless of whether it is proprietary or whether freesync theoretically has potential equal to or better than it. It was announced around this time in 2013, DIY kits were available in January 2014, and I think pre-built monitors followed a bit later that year. So people have been able to use a good g-sync solution for two and a half to three years already. How much longer will freesync and/or displayport take to "mature"? How long until full-featured (high hz, variable hz, HDR, screen blanking, etc.) OLED gaming monitors are ubiquitous? 3-5 yrs from now?
Introducing Revolutionary NVIDIA G-SYNC Display Technology: Ultra-Smooth, Stutter-Free Gaming Is Here | GeForce
By Andrew Burnes on Fri, Oct 18 2013


"If you’re as excited by NVIDIA G-SYNC as we are and want to get your own G-SYNC monitor, here’s how. Early next year, G-SYNC monitors from ASUS, BenQ, Phillips and ViewSonic will be available direct from the shelves of retailers and e-tailers. If you can't wait, G-SYNC modules will be winging their way to professional modders who will install them into ASUS VG248QE monitors, rated by press and gamers as one of the best gaming panels available."
"(Update December 20, 2013: We are excited to confirm that the NVIDIA G-SYNC Do-It-Yourself Kits will be available for purchase in early January. Further details will be announced shortly.)"
 
Freesync monitors that approximate the performance of a G-Sync solution are not cheaper in any way (see the Eizo FS2735),

Since when does Eizo offer a G-Sync version of the Foris? This is a Freesync-only model, so you simply can't say that a G-Sync version wouldn't be even more expensive.
 
If I may add my input to this thread as a manufacturer of one of the 24" FreeSync monitors in the market.

Regarding G-Sync:
I believe Nvidia is in the monitor supplier business - by validating and testing their G-Sync Module (aka a scaler with PCB/inputs) to make sure it works well with specific panels before selling the G-Sync modules to monitor vendors to assemble, market, sell and distribute. I remember Tom Peterson saying this in one of his interview videos. What Nvidia is doing is what a typical "scaler supplier" does, because you have to get the G-Sync Module from Nvidia.

Regarding FreeSync/Adaptive Sync and our involvement:
We get our scaler from one of the three major scaler suppliers. We worked with the scaler supplier on validation, testing, firmware development and PCB/video input configurations. I have the tools to test whether our monitors would pass or fail FreeSync; then I submitted it to AMD to verify - and if AMD found any issues, they would support or advise us to resolve them before passing it for FreeSync certification. If it did not pass and we were not able to resolve the issue, we would still be able to sell and distribute the monitor as an Adaptive-Sync monitor (but without AMD support and FreeSync branding).

With that being said - we actually went through many PCB revisions, scaler implementations and firmware revisions in developing our FreeSync monitor. I can confidently say we essentially created a custom-designed "FreeSync Module", because at the time our monitor shipped over a year ago, it had the widest range of 30Hz to 144Hz and no ghosting issues with overdrive working. I think to this day our NX-VUE24 is still the only 24" 1920x1080 30Hz to 144Hz FreeSync monitor.
 