AMD Demonstrates "FreeSync", Free G-Sync Alternative at CES 2014

I think Koduri might have been misquoted. I read that he was asked why nVidia needed extra hardware and he said that he didn't know why, but it's possible that their GPUs don't support VBLANK.
"However, Petersen quickly pointed out an important detail about AMD's "free sync" demo: it was conducted on laptop systems. Laptops, he explained, have a different display architecture than desktops, with a more direct interface between the GPU and the LCD panel, generally based on standards like LVDS or eDP (embedded DisplayPort). Desktop monitors use other interfaces, like HDMI and DisplayPort, and typically have a scaler chip situated in the path between the GPU and the panel. As a result, a feature like variable refresh is nearly impossible to implement on a desktop monitor as things now stand.

That, Petersen explained, is why Nvidia decided to create its G-Sync module, which replaces the scaler ASIC with logic of Nvidia's own creation. To his knowledge, no scaler ASIC with variable refresh capability exists—and if it did, he said, "we would know." Nvidia's intent in building the G-Sync module was to enable this capability and thus to nudge the industry in the right direction."
http://www.guru3d.com/news_story/nvidia_responds_to_amd_freesync,2.html
 
"However, Petersen quickly pointed out an important detail about AMD's "free sync" demo: it was conducted on laptop systems. Laptops, he explained, have a different display architecture than desktops, with a more direct interface between the GPU and the LCD panel, generally based on standards like LVDS or eDP (embedded DisplayPort). Desktop monitors use other interfaces, like HDMI and DisplayPort, and typically have a scaler chip situated in the path between the GPU and the panel. As a result, a feature like variable refresh is nearly impossible to implement on a desktop monitor as things now stand.

That, Petersen explained, is why Nvidia decided to create its G-Sync module, which replaces the scaler ASIC with logic of Nvidia's own creation. To his knowledge, no scaler ASIC with variable refresh capability exists—and if it did, he said, "we would know." Nvidia's intent in building the G-Sync module was to enable this capability and thus to nudge the industry in the right direction."
http://www.guru3d.com/news_story/nvidia_responds_to_amd_freesync,2.html

After checking this stuff out I have to agree with that.
 
While I like the idea of FreeSync being free and creating competition for NV, I don't understand people getting all worked up. A consumer version of FreeSync doesn't exist and is years away from being a reality.

First the standard has to be adopted, then monitor manufacturers have to design and build units. I would imagine NV has a clause in its agreements with Acer, AOC, ASUS, BenQ, Philips and ViewSonic that they can't implement any competing technology to G-Sync during the term of the contract. So right there it knocks out a few major competitors for at least a few years.

G-Sync is a reality and is slowly rolling out in the next few quarters. Either you pay the premium for a product already in existence or you wait to see if FreeSync ever becomes a reality. Can't fault NV for creating the technology, marketing it to enthusiasts, and keeping it to themselves.

I hope FreeSync becomes a reality but I am not holding my breath.
 
I was going to post this in the other thread in response to Elledan's post, but thread was closed and now points to here.

He said:
G-Sync works with existing hardware and DP 1.2, not requiring any eDP or DP 1.3 features. All it requires is a different handling of the DP link between GPU and display so that the latter is synced to the frequency of the GPU instead of the other way around as used to be the case.

From what I've seen G-Sync doesn't work with existing hardware. The "upgrade kit" basically consists of gutting the entire control electronics out of the Asus LCD. The G-Sync board contains a new power plug + new DP connector. To me that is not "works with existing hardware".

When plugged in you basically end up with an "nvidia display", where drivers can control 100% of what's going on. Not saying this is a bad thing, just that there is no "existing hardware" left apart from the LCD panel + housing.
 
Either way, my problem with FreeSync, beyond the fact that it's not exactly what its name implies ("free"), is that in a situation where games can cause the FPS to fluctuate erratically, FreeSync is not suitable.
 
A hardware solution in this case is going to be far superior to a software solution. Just like with frame rate metering.
 
So let's take a look:

G-Sync requires a $275 module that completely replaces the electronics in your monitor.

Free-Sync requires a free VESA standard (DisplayPort 1.3) which new monitors will support.

G-Sync only works on NVIDIA hardware so you have a crippled monitor.

Free-Sync works on everything.

Both require new monitors.

A hardware solution in this case is going to be far superior to a software solution. Just like with frame rate metering.

Both Free-Sync and G-Sync are a combination of hardware and software.

It's just that Free-Sync uses hardware that already exists in GPUs and in the DP 1.3 standard.
 
I was going to post this in the other thread in response to Elledan's post, but thread was closed and now points to here.

From what I've seen G-Sync doesn't work with existing hardware. The "upgrade kit" basically consists of gutting the entire control electronics out of the Asus LCD. The G-Sync board contains a new power plug + new DP connector. To me that is not "works with existing hardware".

When plugged in you basically end up with an "nvidia display", where drivers can control 100% of what's going on. Not saying this is a bad thing, just that there is no "existing hardware" left apart from the LCD panel + housing.

The way I see it is this:

GSync is hardware-based, using a programmable Arria V GX FPGA from Altera. The Arria V SDK is also available for $4000 USD from the Altera website. Yes, I checked and posted it in the last GSync thread.

What Nvidia basically did is replace the controller board of a monitor with their own implementation in order to manipulate the VBlank interval at will. It is still controlled software-side by the Nvidia driver, which tells the GSync board over DisplayPort how and when to hold a frame and display it. DP works a lot like Ethernet or SATA, so I can see why it requires a DisplayPort connection, but also a very fast GPU to support it.

Now, if, going by the other thread, variable VBlank will exist in the DisplayPort 1.3 standard, then Nvidia's VBlank manipulation is a proprietary creation only if existing DP 1.2 doesn't have it. That doesn't make sense, since VBlank has been known to exist for quite a while, long before DisplayPort existed. So DP 1.2 SHOULD have VBlank support.
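
For anyone wondering what "manipulating the VBlank interval" actually does in practice, here's a rough back-of-the-envelope sketch (the timing numbers are assumed for illustration, not taken from Nvidia or VESA documentation): the panel keeps scanning out at a fixed pixel clock, and stretching the vertical blanking period simply delays the start of the next scan-out, which is what lets the refresh track the GPU's frame rate.

# Rough sketch of how stretching the vertical blanking interval changes the
# effective refresh rate. All timing numbers are assumed, for illustration only.

pixel_clock_hz = 148_500_000          # assumed 1080p pixel clock
h_total = 2200                        # pixels per line, including horizontal blanking
v_active = 1080                       # visible lines
v_blank_nominal = 45                  # nominal blanking lines (~60 Hz total)

line_time = h_total / pixel_clock_hz  # seconds per scanline

def refresh_hz(extra_blank_lines):
    """Effective refresh rate if the controller sits in VBLANK for extra lines."""
    v_total = v_active + v_blank_nominal + extra_blank_lines
    return 1.0 / (v_total * line_time)

print(refresh_hz(0))      # ~60 Hz: the normal fixed refresh
print(refresh_hz(375))    # ~45 Hz: the GPU took ~22 ms on a frame, so scan-out waited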

So, there are several ways to go about this:
  1. AMD obtains an SDK from Altera and implements GSync support within AMD's Catalyst drivers.
  2. A third party obtains the SDK and creates a driver to support it for non-Nvidia cards.
  3. AMD waits for LCD manufacturers to implement native hardware-based manual VBlank control once DisplayPort 1.3 is finalized.
Now, #1 would work if AMD were willing to support GSync, and given Nvidia's most recent response to FreeSync, AMD would have to do the legwork themselves with no help from Nvidia whatsoever. It's a $4000 SDK, so it's chump change for a company like AMD or Intel if they want to support it in their own drivers. And there should be a lot of existing documentation on sending signals to and from the monitor via DisplayPort, so programming shouldn't be too difficult.

But, will they? When was the last time two or more competing companies worked together for a single standards solution outside of price-fixing and collusion? :p

#2 would happen if someone with enough money, time and boredom wanted to say "Fuck you, Nvidia and AMD, I'll do this myself." Who has $4000 lying around and the time needed to reverse engineer the GSync module and implement a third-party driver solution to manually control VBlank?

This would be the best solution because that third party driver would work with either AMD or Intel GPUs.

#3 means little work for AMD. AMD will simply implement manual VBlank control within the Catalyst driver and call it FreeSync once controller boards with VBlank support are added to monitors. However, this solution requires that LCD manufacturers implement the standard natively within their own monitors, and that will require AMD pushing for it. This is probably another best-case scenario because it doesn't add more cost to an existing monitor and just implements a new or existing VESA standard. Customers benefit by not having to spend an extra $100 to $200 for a supported monitor. But this still requires AMD pushing for the standard to be implemented across a wide number of monitors by going to the manufacturers themselves.

Downside? Like HDMI 2.0 monitors for computers, we have quite some time to wait for this to be commonplace in LCD monitors.

So, out of the three possibilities, I'd say #1 is the best but would AMD be willing to do it?

Possibly not.

#2 is the next possibility, but who's bored enough to do it?

#3 is a waiting game and that will require AMD pushing LCD manufacturers to implement it.

Opinions? I'd love to hear it.
 
That, Petersen explained, is why Nvidia decided to create its G-Sync module, which replaces the scaler ASIC with logic of Nvidia's own creation. To his knowledge, no scaler ASIC with variable refresh capability exists—and if it did, he said, "we would know."

He must have missed this, then... http://www.paradetech.com/products/...troller-products/dp633-643-653-edp-psr-tcons/ and the full press release here... http://www.paradetech.com/2012/10/p...-on-chip-frame-memory-for-panel-self-refresh/

Embedded DisplayPort 1.3 display controller that supports variable refresh rates (and includes internal memory), first announced October 22, 2012.

A hardware solution in this case is going to be far superior to a software solution. Just like with frame rate metering.
Both G-Sync and FreeSync require that the monitor can handle a variable v-blank signal. The hardware-based controller in the monitor needs to be able to monitor this value and refresh on-demand.

Current DisplayPort 1.2 monitors can only handle a fixed v-blank interval. Altering this value on a normal DisplayPort 1.2 monitor results in a full mode reset.

New monitors with updated controllers (similar to the one I linked above) will be required to support G-Sync/FreeSync.
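
To make the above concrete, here's a toy sketch of what a variable-refresh-aware controller has to do. This is purely illustrative pseudocode of my own, not based on the G-Sync module's firmware or on the DP 1.3 draft; the thresholds and callback names are assumptions.

# Toy model of a variable-refresh display controller's main loop.
# Purely illustrative; thresholds and callbacks are assumed, not real firmware.

import time

MIN_FRAME_TIME = 1 / 144   # assumed panel ceiling: never refresh faster than 144 Hz
MAX_FRAME_TIME = 1 / 30    # assumed panel floor: must refresh at least every ~33 ms

def controller_loop(frame_arrived, scan_out_frame, repeat_last_frame):
    last_scan = time.monotonic()
    while True:
        elapsed = time.monotonic() - last_scan
        if frame_arrived() and elapsed >= MIN_FRAME_TIME:
            scan_out_frame()        # new frame is ready: end VBLANK and scan it out
            last_scan = time.monotonic()
        elif elapsed >= MAX_FRAME_TIME:
            repeat_last_frame()     # GPU is too slow: refresh anyway to avoid panel decay
            last_scan = time.monotonic()

A fixed-refresh DP 1.2 scaler, by contrast, has no equivalent of that first branch: it scans out on a rigid clock regardless of when frames arrive, which is why altering the v-blank value mid-stream triggers a full mode reset.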
 
A hardware solution in this case is going to be far superior to a software solution. Just like with frame rate metering.

Only freesync isn't a software solution, hilariously enough. PCPer has more details on this on their front page. Monitors supporting this require a new controller board, and that controller board isn't part of the DP 1.3 spec.
 
So let's take a look:

G-Sync requires a $275 module that completely replaces the electronics in your monitor.

Free-Sync requires a free VESA standard (DisplayPort 1.3) which new monitors will support.

G-Sync only works on NVIDIA hardware so you have a crippled monitor.

Free-Sync works on everything.

Both require new monitors.



Both Free-Sync and G-Sync are a combination of hardware and software.

It's just that Free-Sync uses hardware that already exists in GPUs and in the DP 1.3 standard.

G-sync monitors will be on the market in less than 6 months.

FreeSync will be on the market in:

AMD isn’t ready to productize this nor does it have a public go to market strategy, but my guess is we’ll see more panel vendors encouraged to include support for variable VBLANK and perhaps an eventual AMD driver update that enables control over this function.

http://anandtech.com/show/7641/amd-demonstrates-freesync-free-gsync-alternative-at-ces-2014
 
G-Sync requires a $275 module that completely replaces the electronics in your monitor.
Or you buy a monitor that already has the module built-in, like most normal people will be doing.

Oh, and the G-Sync module only replaces the monitor's scaler. It's not a full blown LCD controller in its own right.

Free-Sync requires a free VESA standard (DisplayPort 1.3) which new monitors will support.
FreeSync requires an optional portion of the (yet to be finalized) DisplayPort 1.3 spec.

A baseline DP 1.3 monitor supporting the minimum number of features to be called DP 1.3 compliant will not support FreeSync. Monitor manufacturers will have to go out of their way to include the additional logic to handle this optional feature (just like adding a G-Sync module).

G-Sync only works on NVIDIA hardware so you have a crippled monitor.
First off, we don't know if G-Sync displays will only sync up to Nvidia hardware. It's possible that any graphics card that can push an altered v-blank signal will be able to refresh a G-Sync display on-demand.

Second, in what way is the monitor "crippled"? It will, for sure, continue to work just like any normal DisplayPort 1.2 monitor when attached to a non-Nvidia GPU.

Free-Sync works on everything.
FreeSync will work on monitors that decide to implement that portion of the (again, still non-final) DisplayPort 1.3 spec.

Both Free-Sync and G-Sync are a combination of hardware and software.

It's just that Free-Sync uses hardware that already exists in GPUs and in the DP 1.3 standard.
Uh, no. No GPU that exists today implements DisplayPort 1.3 (because DisplayPort 1.3 isn't final).

Some current GPUs may be able to support the required features of DisplayPort 1.3 when it comes around, but there is no guarantee whatsoever that this will be the case for any current card.
 
Only freesync isn't a software solution, hilariously enough. PCPer has more details on this on their front page. Monitors supporting this require a new controller board, and that controller board isn't part of the DP 1.3 spec.

What exactly do you mean by needing a new controller board? Pretty much every new model of monitor will have a "new controller board"; the big difference here is that the controller board doesn't have to be licensed or purchased from NV.
 
What exactly do you mean by needing a new controller board? Pretty much every new model of monitor will have a "new controller board"; the big difference here is that the controller board doesn't have to be licensed or purchased from NV.

This controller is not mandatory and is not part of the DP 1.3 standard. NO ONE WILL USE IT unless AMD does the work to make it happen.

AMD has to convince monitor manufacturers to use a special controller in any upcoming monitors to support variable refresh. No desktop monitors currently do this. And unless AMD takes the initiative to make it happen, that won't change. AMD has to give them a reason to implement it. They can't just put something on a piece of paper and magically expect it to happen. It doesn't work like that. I'm skeptical that AMD will do the legwork to make this happen. This isn't dissimilar to HD3D where AMD basically said "Here's our API! Use it!" and nobody used it. Tri-Def charges for the HD3D driver, and AMD does not update HD3D any longer.

You tell me. Do you think AMD will consult with manufacturers to make this happen? Or will they just offer a proof of concept hoping that manufacturers will implement it? Either way, no monitors in development or on the market support free-sync right now. So I highly suspect free-sync will not happen in 2014. We shall see though. For once, I would like to see AMD take the initiative and put their money where their mouth is. As of right now this was nothing but a marketing stunt to take attention away from g-sync, despite the fact that AMD has nothing right now.
 
What exactly do you mean by needing a new controller board? Pretty much every new model of monitor will have a "new controller board"; the big difference here is that the controller board doesn't have to be licensed or purchased from NV.
He means that any monitor that wants to support FreeSync will have to have some advanced control logic (and include some internal memory) in place of a cheap scaler that can only handle fixed refresh rates.

Currently, including this additional hardware is an optional component of DP 1.3 (as in, manufacturers will have the option to release a pile of DP 1.3 monitors that will continue to use cheap scalers that only support fixed refresh rates).
 
No, that actually isn't the case. AMD has to convince monitor manufacturers to use a special controller in any upcoming monitors to support variable refresh. No desktop monitors currently do this. And unless AMD takes the initiative to make it happen, that won't change. AMD has to give them a reason to implement it. They can't just put something on a piece of paper and magically expect it to happen. It doesn't work like that. I'm skeptical that AMD will do the legwork to make this happen. This isn't dissimilar to HD3D where AMD basically said "Here's our API! Use it!" and nobody used it. Tri-Def charges for the HD3D driver, and AMD does not update HD3D any longer.

You tell me. Do you think AMD will consult with manufacturers to make this happen? Or will they just offer a proof of concept hoping that manufacturers will implement it? Either way, no monitors in development or on the market support free-sync right now. So I highly suspect free-sync will not happen in 2014. We shall see though.

Yes, I actually do think monitor manufacturers will be convinced to produce FreeSync-capable monitors.
 
Yes, I actually do think monitor manufacturers will be convinced to produce FreeSync-capable monitors.
Sure, but not all DP 1.3 monitors will support FreeSync.

You'll need to buy monitors that implement that portion of the spec. Such monitors will likely be more expensive due to the additional hardware requirements of supporting FreeSync.

Pretty similar to the situation G-Sync is currently in.
 
Yes, I actually do think monitor manufacturers will be convinced to produce FreeSync-capable monitors.

Hope you're right. AMD has a history of not delivering though. They will promise all sorts of things, but that doesn't necessarily match reality.

In any case, not to dog on AMD, but I've owned AMD cards and just learned to ignore every promise they made, because most of the time, they never came true. In free-sync's case, I'm sick of their marketing bullshit telling us that free-sync is free and implying that free-sync is imminent. As you know, neither of those is true. A special controller board is certainly not free. If AMD marketing were to be believed, everyone here thought that this was as simple as using a new driver and would work on any existing DP panel. We now know the truth. Basically, AMD marketing is full of shit, and the fact that "free-sync" was demoed when AMD had nothing, with no concrete timeframe to get this to market, speaks volumes.

Answer a question honestly. When free-sync was announced, you thought that it was going to be out soon and that it simply required a driver update. And would work on any DP monitor. Right? Is that what you thought? Most people here bought into that. That's what AMD marketing implied through the AT article. Countless people here on this very forum thought that. And we know it isn't true now.

On one hand, I expected this outcome. On the other hand, it is still disappointing. Who wouldn't want a "free" alternative to g-sync? I mean c'mon. WHO DOESN'T want that? A free alternative to g-sync would be so good for competition. Which we need. I like the fact that AMD counterbalances nvidia, even though I do prefer NV these days. Anyway, when I heard about free-sync, my intuition told me that AMD was bluffing with half-truths, and as it turns out, I was correct.
 
Sure, but not all DP 1.3 monitors will support FreeSync.

You'll need to buy monitors that implement that portion of the spec. Such monitors will likely be more expensive due to the additional hardware requirements of supporting FreeSync.

Pretty similar to the situation G-Sync is currently in.

I understand that; however, I doubt it will be as expensive as the G-Sync module, which is why I think we'll see some FreeSync monitors pop up. Hey, I could be wrong, but that is my prediction.
 
The real choice for the display makers to enable some kind of sync is this:

1. Partner with NV for G-sync and integrate their controller hardware into your monitor. Requires the users to use NV hardware.
2. Change your controller hardware to one that supports the new VBLANK spec, and thus "freesync", either by partnering with someone else or rolling your own. Requires 3rd-party driver support.

The up-front costs are likely a lot lower with #1; NV might even be willing to subsidize the hardware costs depending on how much they are convinced this will help them sell high-end cards.

The downside of #2 is you need NV, AMD, Intel and possibly others to add support for it in their drivers, and maybe even hardware. The upside will be a larger potential market than #1, and possibly better profit margins depending on what NV charges (or subsidizes) for G-sync.
 
Also note that the panel self refresh feature is NOT needed to support freesync, just VBLANK, so there is no reason to add a frame buffer to controllers that support just VBLANK. They can continue to use the far cheaper line buffers that most scaler chips use today.
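
To put rough numbers on the line-buffer vs. frame-buffer point (my own arithmetic, assuming a 2560x1440 panel at 24 bits per pixel; actual scaler designs will differ):

# Rough memory comparison: a line buffer vs. a full frame buffer for a 2560x1440 panel.
# The figures are my own illustration, not from any actual scaler datasheet.

width, height = 2560, 1440
bytes_per_pixel = 3                              # 24-bit colour

line_buffer = width * bytes_per_pixel            # a handful of scanlines is typical; one shown
frame_buffer = width * height * bytes_per_pixel  # what full-frame buffering needs

print(f"one scanline: {line_buffer / 1024:.1f} KiB")    # ~7.5 KiB, trivially fits on-chip
print(f"full frame:   {frame_buffer / 2**20:.1f} MiB")  # ~10.5 MiB, needs external DRAM

That gap is basically the whole argument: a few scanlines of on-chip storage is essentially free, while a full frame of storage pushes you to external RAM and a bigger chip.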
 
People don't pay extra for DVI ports on their monitor because it's part of a standard. If FreeSync is popular then the cost will be absorbed into the base price of all monitors. I would say that all of them will support it after the first couple of "testing the water" years. Monitor manufacturers just want a reason, based upon a low cost solution, to sell more monitors.
 
People don't pay extra for DVI ports on their monitor because it's part of a standard. If FreeSync is popular then the cost will be absorbed into the base price of all monitors. I would say that all of them will support it after the first couple of "testing the water" years. Monitor manufacturers just want a reason, based upon a low cost solution, to sell more monitors.

The same is true for g-sync, unless you get the FPGA module. G-sync will also be absorbed into monitor price. Keep in mind an FPGA (the upgrade module) is expensive per unit, while an ASIC isn't; the trade-off is that an ASIC takes significantly longer to develop, whereas an FPGA is quick to develop but costs a lot more.

There was also a 120Hz 1440p panel announced by Asus that is retailing for $799 - rumor is that it is using an ASIC for g-sync, which means it isn't unreasonable in price. That price is not bad, since it is the first commercial 1440p panel to officially support 120Hz. Their other professional 1440p panel (the Asus PB278Q, I believe) is in the $650 range. Again, this will be cheaper over time when it switches over to ASIC. If AMD's freesync solution is FPGA based, it will also be expensive. Very expensive. But this isn't the point. AMD's freesync isn't free just like g-sync isn't free. YOU MUST PAY FOR THE CONTROL BOARD. And you must buy a new monitor. It really isn't different than g-sync in terms of cost, because both solutions will require either FPGA or ASIC control boards. FPGA = expensive, ASIC = cheap.

Again - it is absorbed in the monitor price but that will also be true for g-sync. The initial solutions require the FPGA module which isn't cheap - but that will change down the road when monitors use ASIC based solutions which are a mere fraction of the cost.

The real problem here is time to market for AMD. G-sync is going to be Q1 2014. Right now there are zero monitors which are DisplayPort 1.3 compatible, and there are zero monitor manufacturers AMD has consulted with. There are zero monitors in development with confirmed DisplayPort 1.3 support. DisplayPort 1.3 isn't even finalized yet. So nvidia will have a significant head start in bringing this to market. As things are, I highly doubt that free-sync will happen in 2014 because of DP 1.3 not being finalized, and AMD not having any monitor manufacturers on board yet. AMD has to consult with manufacturers to get them to add this control board for variable refresh. They will not add to their own costs with a special control board of their own volition.
 
The same is true for g-sync, unless you get the FPGA module. G-sync will also be absorbed into monitor price. Keep in mind an FPGA (the upgrade module) is expensive per unit, while an ASIC isn't.

There was also a 120Hz 1440p panel announced by Asus that is retailing for $799 - rumor is that it is using an ASIC for g-sync, which means it isn't unreasonable in price. That price is not bad, since it is the first commercial 1440p panel to officially support 120Hz. Their other professional 1440p panel (the Asus PB278Q, I believe) is in the $650 range. Again, this will be cheaper over time when it switches over to ASIC. If AMD's freesync solution is FPGA based, it will also be expensive. Very expensive. But this isn't the point. AMD's freesync isn't free just like g-sync isn't free. YOU MUST PAY FOR THE CONTROL BOARD. And you must buy a new monitor. It really isn't different than g-sync in terms of cost, because both solutions will require either FPGA or ASIC control boards. FPGA = expensive, ASIC = cheap.

Again - it is absorbed in the monitor price but that will also be true for g-sync. The initial solutions require the FPGA module which isn't cheap - but that will change down the road when monitors use ASIC based solutions which are a mere fraction of the cost.

The real problem here is time to market for AMD. G-sync is going to be Q1 2014. Right now there are zero monitors which are DisplayPort 1.3 compatible, and there are zero monitor manufacturers AMD has consulted with. DisplayPort 1.3 isn't even finalized yet. So nvidia will have a significant head start in bringing this to market. As things are, I highly doubt that free-sync will happen in 2014 because of DP 1.3 not being finalized, and AMD not having any monitor manufacturers on board yet. AMD has to consult with manufacturers to get them to add this control board for variable refresh. They will not add to their own costs with a special control board of their own volition.

GSYNC is not going to go to zero cost because it goes from FPGA to ASIC. The module also requires RAM which is far from free. Furthermore, since when has NVIDIA ever decided to charge LESS for something simply because their cost of production went down? They will charge as much as possible until there is genuine competition, then they'll lower the price. I agree with the poster that FreeSync-type of tech will be absorbed into every monitor at minimal cost someday, and that will be cheaper than using a third-party module with all the hardware costs and NVidia licensing fees attached.
 
This is what really gets me about AMD. Answer this question for me truthfully, cageymaru. Everyone thought a few days ago that free-sync would come to market very soon and would simply require an updated driver. This was implied by many websites that reported about free-sync.

Did AMD's marketing tell you that it required DP 1.3? No. Did they tell you that DP 1.3 wasn't finalized? No. Did they tell you that it requires a new control board in desktop panels? No.

I told Digital the same thing. Who DOESN'T want a free g-sync alternative? I sure do. I was hoping I was wrong, yet my intuition told me that AMD marketing was full of shit. And it turns out, I was right. Their free-sync marketing stunt was full of half-truths; AMD didn't tell you all of the caveats that I mentioned in the above paragraph. Did they? This is why I believe nothing AMD says anymore. I learned to not trust them after dealing with 7970s in 2012. I know better now.

Answer a question honestly, cageymaru. You thought, a few days ago, that this would be a free alternative, coming to market soon, and would simply require a new driver with new monitor firmware. You didn't have to buy anything else. That's what many websites implied. Answer me honestly. That's what you thought, wasn't it? That's what nearly EVERYONE on this very forum thought. Turns out they were wrong. This is, again, why I never trust what AMD says. You can vilify nvidia for charging too much for the upgrade module. But here's the thing. Nvidia was forthcoming about EVERYTHING related to g-sync: release date, monitor partners, cost, everything. Meanwhile, AMD is full of half-truths. You see the difference here? Nvidia laid everything out on the table. AMD, on the other hand, you have to pick everything apart to discern the truths from the half-truths and lies. That's what REALLY soured me on AMD in the past year. I got sick of it.

I know what you'll say. Nvidia fanboy. Blah blah blah. Go ahead. Say it. I won't care. I gave AMD thousands of dollars for their GPUs, so my cynicism toward them is based on real issues I've had to deal with way too much - in fact, I've purchased 4 generations of GPUs from AMD. You can think I'm lying, but that's the God's honest truth. I'm sick of AMD and their half-truths. You know what? I'll go ahead and say that nvidia charges way too much for their shit. They're cocky. But there's a vast difference between nvidia delivering on promises (most of the time, it happens) versus AMD promises (rarely happens). Free-sync is a prime example, and I'm tired of hearing promises that aren't followed up on and half-truths. It just gets old after a while.
 
GSYNC is not going to go to zero cost because it goes from FPGA to ASIC. The module also requires RAM which is far from free. Furthermore, since when has NVIDIA ever decided to charge LESS for something simply because their cost of production went down? They will charge as much as possible until there is genuine competition, then they'll lower the price. I agree with the poster that FreeSync-type of tech will be absorbed into every monitor at minimal cost someday, and that will be cheaper than using a third-party module with all the hardware costs and NVidia licensing fees attached.

I didn't say zero cost. I implied the cost would lower over time. Like I said, consider the g-sync 120Hz 1440p panel being made by Asus: $800. Their IPS 1440p panel is $650.

Not unreasonable, especially given that it is the first 1440p screen with official 120Hz. The cost will lower over time. I never said "free". I said it will be absorbed into monitor cost, and will eventually be lower. And yes, FPGA costs more than ASIC. By the same token, free-sync will not be "free". The logic board costs money. I'll also remind you that 768MB of RAM is pennies. C'mon. You know this. That doesn't add a significant cost to BOM unless it is GDDR5, and it isn't GDDR5.
 
GSYNC is not going to go to zero cost because it goes from FPGA to ASIC. The module also requires RAM which is far from free. Furthermore, since when has NVIDIA ever decided to charge LESS for something simply because their cost of production went down? They will charge as much as possible until there is genuine competition, then they'll lower the price. I agree with the poster that FreeSync-type of tech will be absorbed into every monitor at minimal cost someday, and that will be cheaper than using a third-party module with all the hardware costs and NVidia licensing fees attached.

It's not going to zero cost, but it replaces part of the electronics which are normally included in a monitor.
So the price premium of a g-sync display vs. a non-g-sync one should go down over time.

As for "free" alternative I'd say it's up to Intel. If Intel implements it into their CPUs then it should get wide adoption as companies will want to tick one more "it saves energy" marketing point for bussiness buyers.
 
This is what really gets me about AMD. Answer this question for me truthfully, cageymaru. Everyone thought a few days ago that free-sync would come to market very soon and would simply require an updated driver. This was implied by many websites that reported about free-sync.

I don't know who everyone is, but the first article I read was Anand's, and it made it clear to me that AMD was not marketing this anytime soon. It was also very clear that only supported hardware could do this. Read the article linked in the OP.
 
This is what really gets me about AMD. Answer this question for me truthfully, cageymaru. Everyone thought a few days ago that free-sync would come to market very soon and would simply require an updated driver. This was implied by many websites that reported about free-sync.

Did AMD's marketing tell you that it required DP 1.3? No. Did they tell you that DP 1.3 wasn't finalized? No. Did they tell you that it requires a new control board in desktop panels? No.

I told Digital the same thing. Who DOESN'T want a free g-sync alternative? I sure do. I was hoping I was wrong, yet my intuition told me that AMD marketing was full of shit. And it turns out, I was right. Their free-sync marketing stunt was full of half-truths; AMD didn't tell you all of the caveats that I mentioned in the above paragraph. Did they?

Did you actually watch the demo and read what they said? You clearly haven't. They made it clear that it needed DP 1.3. And DP 1.3 isn't out yet, everyone knows that. So of course you will need a new display, as you need DP 1.3.
 
I read this: http://techreport.com/news/25867/amd-could-counter-nvidia-g-sync-with-simpler-free-sync-tech

Apparently there were differences among the many reports about free-sync. Although it appears that TR has since updated this article. Still didn't see any mention of the required control board and DP 1.3, but if it was reported elsewhere, I'll take your word for that.

I don't even know why AMD demoed it if they had nothing, to be honest. If they didn't know the date for DP 1.3 to be finalized, and had not consulted monitor manufacturers, why bother even demoing it? The name "free-sync" is clearly a marketing shot at nvidia as well, even though we now know free-sync won't actually be free. Nvidia didn't peep a word about g-sync until it was essentially finalized. With panel makers on board, release date determined, plans to bring an upgrade module to the market, and 4 panel vendors signed up to develop g-sync monitors, everything. There are vast differences in how these two companies operate in terms of emerging tech. Nvidia generally isn't all talk in cases like this; they had a plan to get to market, as far as I can tell. AMD doesn't.

Anyway, like I said, I'll take your word on the reports elsewhere if they did fully acknowledge the control board and DP 1.3 requirement. My mistake and apologies if so.
 
AMD is one of the developers of DisplayPort so they will know far more about what features are supported than we do.
 
Here's another example in the initial TR report:

The quantization problem can only be completely resolved via dynamic refresh rates. However, the exec initially expressed puzzlement over why Nvidia chose to implement them in expensive, external hardware.

The exec's puzzlement over Nvidia's use of external hardware was resolved when I spoke with him again later in the day. His new theory is that the display controller in Nvidia's current GPUs simply can't support variable refresh intervals, hence the need for an external G-Sync unit. That would explain things. I haven't yet had time to confirm this detail with Nvidia or to quiz them about whether G-Sync essentially does triple-buffering in the module. Nvidia has so far been deliberately vague about certain specifics of how G-Sync works, so we'll need to pry a little in order to better understand the situation.

The above statements by the AMD representative are outright lies, given that we now know that free-sync requires desktop monitors to have a module between the GPU and scaler chip and also requires DP 1.3. These were later confirmed by AMD's head graphics engineer, Koduri.

How do you reconcile those facts with the above statement from the AMD rep? You can't. Half-truths and lies. Gotta say, an AMD representative lying? Not surprising. Status quo for AMD. This is why I don't trust them. Ever. I used to listen to them in 2012 with great hope (when I owned AMD GPUs), and this just confirms what I've known for some time now. I'll take another look at the AT report and see what it says to compare.
 
Also note that while DP 1.3 isn't due to be finalized until Q2'14, the eDP 1.3 spec was published in February 2011 and is what I think they are referring to, since it's what will be used in laptop screens and other embedded (hence the e in eDP) screens.
 
Here's another example in the initial TR report:



The above statements by the AMD representative are outright lies, given that we now know that free-sync requires desktop monitors to have a module between the GPU and scaler chip and also requires DP 1.3. These were later confirmed by AMD's head graphics engineer, Koduri.

How do you reconcile those facts with the above statement from the AMD rep? You can't. Half-truths and lies. Gotta say, an AMD representative lying? Not surprising. Status quo for AMD. This is why I don't trust them. Ever. I used to listen to them in 2012 with great hope (when I owned AMD GPUs), and this just confirms what I've known for some time now. I'll take another look at the AT report and see what it says to compare.

Have you tried reading what he is saying?

He said there is no need for an external chip in the monitor, as the GPU should be able to do it all. He is not talking about the scaler.
 
Also note that while DP 1.3 isn't due to be finalized until Q2'14, the eDP 1.3 spec was published in February 2011 and is what I think they are referring to, since it's what will be used in laptop screens and other embedded (hence the e in eDP) screens.


Well, that would be mostly worthless, since most ultrabooks use Intel HD graphics and are (usually) not used for gaming. Light gaming, if that. There are high-end gaming laptops, of course, but that is a niche market in the overall scheme of MacBooks and ultrabooks.

So with that being the case, how is that even remotely useful? Gaming laptops are expensive and most of them use nvidia discrete graphics (some using AMD). But the point remains. AMD's statements made everyone think that desktop panels would essentially get this for free. Hence, "free-sync".
 
Have you tried reading what he is saying?

He said there is no need for an external chip in the monitor, as the GPU should be able to do it all. He is not talking about the scaler.

But you would be wrong. Free-sync does require a variable framerate aware control board embedded into the monitor, which no current monitors have.

Portables have eDP while desktop monitors do not. Desktop monitors, since they all have scalers, will have the requirement for a control board for either free-sync or g-sync. This was reported by PCPer today after discussion with Koduri:

To be clear, just because a monitor would run with DisplayPort 1.3 doesn't guarantee this feature would work. It also requires the controller on the display to understand and be compatible with the variable refresh portions of the spec, which with eDP 1.0 at least, isn't required. AMD is hoping that with the awareness they are building with stories like this display designers will actually increase the speed of DP 1.3 adoption and include support for variable refresh rate with them. That would mean an ecosystem of monitors that could potentially support variable speed refresh on both AMD and NVIDIA cards. All that would be needed on the PC side is a software update for both Radeon and GeForce graphics cards.

It requires new controllers in monitors which can interpret a variable refresh rate. This is part of eDP (portables), but not currently included in desktop panels. Hence, desktop panels DO need a new control board, and AMD will have to convince panel makers to include it. Note that this variable-refresh-aware control board is not part of the DP 1.3 specification, and is optional.
 
Here's another example in the initial TR report:



The above statements by the AMD representative are outright lies, given that we now know that free-sync requires desktop monitors to have a module between the GPU and scaler chip and also requires DP 1.3. These were later confirmed by AMD's head graphics engineer, Koduri.

How do you reconcile those facts with the above statement from the AMD rep? You can't. Half-truths and lies. Gotta say, an AMD representative lying? Not surprising. Status quo for AMD. This is why I don't trust them. Ever. I used to listen to them in 2012 with great hope (when I owned AMD GPUs), and this just confirms what I've known for some time now. I'll take another look at the AT report and see what it says to compare.

I'd want to see the exact quotes from the exec, since there is a big difference between a lie and poor reporting. The NV hardware still looks to be needlessly expensive to me, as it has a bunch of RAM for the framebuffer on the board in addition to the FPGA. You really should be able to do this by replacing the scaler chip on the monitor with a different one, without adding the RAM for a framebuffer, by just using a line buffer, which can easily be done with on-chip storage. The RAM costs cash and will also push you to a larger and more expensive chip (FPGA in this case) as well. Unless NV comes out and explains why they need the framebuffer, it's fairly reasonable to assume there is some limitation in their existing GPUs they are using it to work around.
 

What you're quoting in no way says you need an additional board.

All it says is that it would require monitor manufacturers to implement DP 1.3 + "Optional Specs" (which they seem to imply are part of the DP 1.3 specs, but I can't confirm that).

It just says the controller on the display (which is always there in every monitor) needs to "understand" and be "compatible" with the variable refresh portion of the specs. (which is optional)
 
I'd want to see the exact quotes from the exec, since there is a big difference between a lie and poor reporting. The NV hardware still looks to be needlessly expensive to me, as it has a bunch of RAM for the framebuffer on the board in addition to the FPGA. You really should be able to do this by replacing the scaler chip on the monitor with a different one, without adding the RAM for a framebuffer, by just using a line buffer, which can easily be done with on-chip storage. The RAM costs cash and will also push you to a larger and more expensive chip (FPGA in this case) as well. Unless NV comes out and explains why they need the framebuffer, it's fairly reasonable to assume there is some limitation in their existing GPUs they are using it to work around.



The high cost is due to the FPGA, as mentioned. You and I both know that 768MB of DDR3 RAM costs pennies. This isn't using GDDR5, so the RAM doesn't add anything meaningful to the cost.

Both free-sync and g-sync will add to the cost of monitors. That's the bottom line. So the term "free-sync" is disingenuous and deceptive. If free-sync is free, then g-sync is free. AMD is completely ignoring the added BOM cost which monitors will require - if they're doing that, will you ignore the BOM costs that g-sync adds?

Like I said, g-sync will be cheaper once it is an ASIC instead of an FPGA. The RAM is a non-issue in terms of cost. 768MB. Nothing. Now I'm not saying g-sync is free. It WILL add to cost. But free-sync will add to cost as well. Which is why I find AMD's marketing term "free-sync" to be a fabrication.
 