LG is coming out with a 34" 21:9 monitor that supports FreeSync.

Nvidia is gonna lose this battle.

Right, that huge userbase of people with the 5 latest GPU models from a manufacturer with 30-40% GPU market share will be much more attractive to display makers than users of 3 generations of GPUs from a company that holds 60-70% of the market.
 
Right, that huge userbase of people with the 5 latest GPU models from a manufacturer with 30-40% GPU market share will be much more attractive to display makers than users of 3 generations of GPUs from a company that holds 60-70% of the market.

If we're going by market share, then whichever adaptive sync solution Intel backs will be the winner.
 
Right, that huge userbase of people with the 5 latest GPU models from a manufacturer with 30-40% GPU market share will be much more attractive to display makers than users of 3 generations of GPUs from a company that holds 60-70% of the market.

Sorry, you are forgetting APUs. All GCN APUs support Adaptive Sync monitors. These aren't included in the market share you listed.

And here's another interesting bit of info: all 4th-gen Core i and Core M iGPUs from Intel can support adaptive sync too. All they need is a driver update.
 
And here's another interesting bit of info: all 4th-gen Core i and Core M iGPUs from Intel can support adaptive sync too. All they need is a driver update.

Ah, well that's quite an interesting tidbit here! Makes sense for mobility chips also.
 
Ah, well that's quite an interesting tidbit here! Makes sense for mobility chips also.

Yeah, people don't realize this at all. So when people ask why Intel would support adaptive sync, I usually reply with "why not?" It's something they can do for practically nothing that will add value to their integrated GPUs.
 
Yeah, people don't realize this at all. So when people ask why Intel would support adaptive sync, I usually reply with "why not?" It's something they can do for practically nothing that will add value to their integrated GPUs.

And low-end graphics would be the second-greatest beneficiaries of this tech, since they usually struggle to top 30fps consistently.

This would be a whole lot smoother than v-sync stuttering between 30 and 20fps, and if lots of GPUs support it, monitor makers should add support to cheaper monitors over time.

Can they make freesync work at such low framerates?
 
Assuming it's possible, do you think nVidia will allow it?

I think it's inevitable. Monitor makers will push for the cost reduction, and once you can get the functionality of G-Sync on a dedicated ASIC, there's little additional cost to add both to the same system.

Remember when motherboards used to have CFX support OR SLI support, but not both? And remember when, to compete with CFX availability on Intel and AMD chipsets, Nvidia insisted on using their PCIe switch if you wanted SLI certification on Intel chipset boards? Both of those restrictions are gone now that widely available, cheap ASICs offer all the same features and performance.

So today it's simply a licensing cost that marks the difference between CFX, SLI, and neither. The only thing Nvidia dictates is that each slot offer a physical x8 connection, and they can use any PCIe switch they want for that.
 
And low-end graphics would be the second-greatest beneficiaries of this tech, since they usually struggle to top 30fps consistently.

This would be a whole lot smoother than v-sync stuttering between 30 and 20fps, and if lots of GPUs support it, monitor makers should add support to cheaper monitors over time.

Can they make freesync work at such low framerates?

I believe they said it was 10fps-200fps
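
To put rough numbers on the stutter point: with plain v-sync on a 60Hz panel a frame can only be shown on a refresh boundary, so anything slower than 16.7ms of render time gets held to 33.3ms (30fps) or 50ms (20fps), which is exactly the 30/20fps bouncing described above. An adaptive-sync panel just refreshes when the frame is ready, within whatever window it supports. Here is a minimal sketch in Python; the 60Hz figure and the 10-200fps window are taken from the posts above and are purely illustrative.

import math

REFRESH_HZ = 60.0
VBLANK_MS = 1000.0 / REFRESH_HZ      # 16.7 ms per fixed refresh
ADAPTIVE_MIN_MS = 1000.0 / 200.0     # fastest refresh quoted for FreeSync (5 ms)
ADAPTIVE_MAX_MS = 1000.0 / 10.0      # slowest refresh quoted for FreeSync (100 ms)

def vsync_display_ms(render_ms):
    # Frame is held until the next refresh boundary, i.e. a multiple of 16.7 ms.
    return math.ceil(render_ms / VBLANK_MS) * VBLANK_MS

def adaptive_display_ms(render_ms):
    # Panel refreshes as soon as the frame is ready, clamped to its supported window.
    return min(max(render_ms, ADAPTIVE_MIN_MS), ADAPTIVE_MAX_MS)

for render_ms in (25.0, 33.0, 40.0, 45.0):  # roughly 40, 30, 25, 22 fps of GPU work
    print(f"render {render_ms:5.1f} ms -> v-sync {vsync_display_ms(render_ms):5.1f} ms, "
          f"adaptive {adaptive_display_ms(render_ms):5.1f} ms")

A 40ms frame, for example, gets held for 50ms under v-sync but shown after 40ms with adaptive sync, which is where the smoothness difference at low framerates comes from.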
 
And low-end graphics would be the second-greatest beneficiaries of this tech, since they usually struggle to top 30fps consistently.

This would be a whole lot smoother than v-sync stuttering between 30 and 20fps, and if lots of GPUs support it, monitor makers should add support to cheaper monitors over time.

Can they make freesync work at such low framerates?

I can't imagine anyone with integrated graphics buying an $800 gaming monitor. It would be like trying to power a Corvette with a moped engine.
 
I can't imagine anyone with integrated graphics buying an $800 gaming monitor. It would be like trying to power a Corvette with a moped engine.

I don't think every FreeSync monitor will cost $800. It's not G-Sync.
 
I saw this somewhere yesterday. A 144Hz IPS monitor sounds really cool. I need more details on that; it just seems like something isn't right there. There aren't any 120Hz IPS monitors at any resolution that I can think of.

Depends on whether you count those overclockable Korean "120Hz capable" IPS panels
 
I don't think every FreeSync monitor will cost $800. It's not G-Sync.

Not every G-Sync monitor is $800. The point is that it would be ludicrous to buy an expensive gaming monitor to use with integrated graphics.
 
Right, that huge userbase of people with the 5 latest GPU models from a manufacturer with 30-40% GPU market share will be much more attractive to display makers than users of 3 generations of GPUs from a company that holds 60-70% of the market.
If you were a display manufacturer, would you support

A) The technology that is part of the VESA standard and requires zero additional hardware or licensing fees to implement,

or

B) The technology which requires additional proprietary hardware and adds $100+ to the cost of your display?

Think about it.
 
If you were a display manufacturer, would you support

A) The technology that is part of the VESA standard and requires zero additional hardware or licensing fees to implement,

or

B) The technology which requires additional proprietary hardware and adds $100+ to the cost of your display?

Think about it.

Option A
 
If you were a display manufacturer, would you support

A) The technology that is part of the VESA standard and requires zero additional hardware or licensing fees to implement,

or

B) The technology which requires additional proprietary hardware and adds $100+ to the cost of your display?

Think about it.

You would support B because it has by far the highest number of potential customers. Also, A is 100% incorrect: it's an optional part of the standard and it does require additional hardware. AMD uses a lot of misinformation when trying to push its products. It's why I refuse to support them.
 
You would support B because it has by far the highest number of potential customers. Also, A is 100% incorrect: it's an optional part of the standard and it does require additional hardware. AMD uses a lot of misinformation when trying to push its products. It's why I refuse to support them.

So that's why! What company doesn't use misinformation when trying to push their products?

I think rizen worded the question wrong. FreeSync monitors will cost $100 less than G-Sync monitors. There is a cost to FreeSync, and AMD has said that all along. The cost is G-Sync minus $100.
 
You would support B because it has by far the highest number of potential customers. Also, A is 100% incorrect: it's an optional part of the standard and it does require additional hardware. AMD uses a lot of misinformation when trying to push its products. It's why I refuse to support them.

I thought the VESA standard only required a proper scaler to implement it. How much more equipment is required?
 
Not every G-Sync monitor is $800. The point is that it would be ludicrous to buy an expensive gaming monitor to use with integrated graphics.

Yeah, not every G-Sync monitor is that cheap; $800 is only on half-price sales.
 
I thought the VESA standard only required a proper scaler to implement it. How much more equipment is required?

Yeah, the idea behind Freesync is that it's relatively cheap to implement.

Why would they just copy the expensive method Nvidia is using? That would just be wasteful. I can see all monitors supporting FreeSync (it will start out as a premium feature, but will quickly filter down).
 
Yeah, the idea behind Freesync is that it's relatively cheap to implement.

Why would they just copy the expensive method Nvidia is using? That would just be wasteful. I can see all monitors supporting FreeSync (it will start out as a premium feature, but will quickly filter down).

It would be nice to see some kind of proof that it actually works the same first, given AMD's marketing history of overpromising.
 
You would support B because it has by far the highest number of potential customers. Also, A is 100% incorrect: it's an optional part of the standard and it does require additional hardware. AMD uses a lot of misinformation when trying to push its products. It's why I refuse to support them.
You are simply wrong.

http://www.amd.com/Documents/FreeSync-Whitepaper.pdf

AMD’s variable refresh rate technology has been available to notebook PC makers for quite some time as a system power saving feature for embedded notebook panels (known as DRR). The DisplayPort Adaptive-Sync feature is already a capability of the Embedded DisplayPort interface. When the system enters a static screen state (no new content), the refresh rate of the display is lowered to the minimum rate that it can support, to save power. The transition between refresh rates is invisible to the end user, and it comes at a low cost to PC makers, since no additional hardware is required to enable this feature.
Unless by "additional hardware" you mean DisplayPort connectivity, but what monitor doesn't have that already?
 
Unless by "additional hardware" you mean DisplayPort connectivity, but what monitor doesn't have that already?

Well, not to play Devil's advocate, but the truly cheap screens don't. As soon as DP is involved, it's a $200+ screen.
 
Well, not to play Devil's advocate, but the truly cheap screens don't. As soon as DP is involved, it's a $200+ screen.
Plenty of cheap monitors do support DisplayPort. I just pulled up a random Dell $150 LCD and it had a DP connector on it. The monitors that are going to offer G-Sync or FreeSync were already going to have DisplayPort anyway. I have an ROG Swift, which has G-Sync, and it ONLY has a DisplayPort connector.
 
You are simply wrong.

http://www.amd.com/Documents/FreeSync-Whitepaper.pdf

Unless by "additional hardware" you mean DisplayPort connectivity, but what monitor doesn't have that already?

http://techreport.com/news/26919/freesync-monitors-will-sample-next-month-start-selling-next-year


There are some associated hardware requirements, but the additional cost should be minimal, according to Huddy, who told us he'd be surprised if FreeSync compatibility added more than $10-20 to a display's bill of materials. Even taking additional validation costs into consideration, monitor makers should be able to support adaptive refresh rates fairly cheaply. They're still free to charge whatever premium they want, though.

Straight from AMD. That last line is key as well. Don't expect monitor manufacturers to hand out any sort of discount on a niche product. It's 100% optional to the DP spec and most new monitors won't use it.
 
Seems like all the big players have committed, so I'm not sure how you can say most won't use it.

Unless you consider "most" to mean business-class monitors, but that's obvious.
 
http://techreport.com/news/26919/freesync-monitors-will-sample-next-month-start-selling-next-year




Straight from AMD. That last line is key as well. Don't expect monitor manufacturers to hand out any sort of discount on a niche product. It's 100% optional to the DP spec and most new monitors won't use it.
Okay, so it includes an additional $10-20 in materials. Interesting that they have conflicting information, but regardless, that is peanuts compared to the $150 cost of a G-Sync module. Perhaps by "no additional hardware required" they mean beyond what would typically be included to drive a display. Either way, that's a stark difference.
 
You would support B because it has by far the highest number of potential customers. Also, A is 100% incorrect: it's an optional part of the standard and it does require additional hardware. AMD uses a lot of misinformation when trying to push its products. It's why I refuse to support them.

Actually you are 100% wrong. Option A has more potential users, as already stated in this thread.
While Adaptive-Sync may be an optional part of the VESA spec, any monitor manufacturer updating their lineup for 2015, with the exception of the lowest-end, bargain-basement models, would be very shortsighted not to be using Adaptive-Sync capable scalers.

AMD didn't use any misinformation; they were simply stating facts. There were Adaptive-Sync compatible monitors and laptops on the market at the time they made the statement. That is where the "no additional hardware required" comes from, i.e. you don't need to purchase an add-on module to get the functionality since it is all handled in the scaler.
 

Huddy guessed at the price and you are comparing it to a kit...which doesn't even exist for adaptive sync.

Actually you are 100% wrong. Option A has more potential users, as already stated in this thread.
Only a very small number of AMD models support it. Combine that with the fact that AMD represents a minority of the market to begin with. AMD was down to 20% last quarter and not all of those support adaptive sync.

While Adaptive-Sync may be an optional part of the VESA spec, any monitor manufacturer updating their lineup for 2015, with the exception of the lowest-end, bargain-basement models, would be very shortsighted not to be using Adaptive-Sync capable scalers.

Why waste the extra money on a product very few people can use and even fewer will buy?

Read the LG press release. They are not offering adaptive sync across the board even though the other monitors support the new DP spec.
 
Huddy guessed at the price and you are comparing it to a kit...which doesn't even exist for adaptive sync.

You're missing the point. The point is that the add-in kit cost $200. Not $10-$20, $200. The cost is $200 for G-Sync.


Only a very small number of AMD models support it. Combine that with the fact that AMD represents a minority of the market to begin with. AMD was down to 20% last quarter and not all of those support adaptive sync.

What makes you think that every GPU AMD makes from now on will not support FreeSync? Also, if Intel gets on board, that would increase that market share to almost 80%.

Why waste the extra money on a product very few people can use and even fewer will buy?

Read the LG press release. They are not offering adaptive sync across the board even though the other monitors support the new DP spec.

You can say the same thing about G-Sync.

I'm sorry AMD killed your family.
 
You're missing the point. The point is that the add-in kit cost $200. Not $10-$20, $200. The cost is $200 for G-Sync.
The cost for a kit. You do know the difference?

What makes you think that every GPU AMD makes from now on will not support FreeSync? Also, if Intel gets on board, that would increase that market share to almost 80%.
Intel is not on board. Nothing suggests they will be. So I'm still very much correct.


You can say the same thing about G-Sync.
Not at all. NVIDIA enjoys a very healthy market share and it's using a proven technology. It is odd that AMD has not allowed any independent reviews of "freesync". Makes you wonder why.

I'm sorry AMD killed your family.
I'm sorry we could not maintain a mature conversation.

Either way this thread is devolving quickly. I await actual reviews from monitors that can actually be purchased.
 
..
Intel is not on board. Nothing suggests they will be. So I'm still very much correct.
...

Intel is very much on board with adaptive sync technology, since the iX-XXX mobile series.
Read: Intel iGPUs have supported the VESA extensions since their introduction a few years ago.
 
Reading articles like "Industry's Biggest Scaler Vendors Pledge Support for AMD's Project FreeSync", I would think that in a few months' time the companies that make monitor scalers/chipsets with DP may have a variant with adaptive sync and one without. In another year it'll just be another logo on the box, as they won't bother making a chipset with DP but without adaptive sync. Like cellphone radio chips that have Wi-Fi and Bluetooth and so on. It doesn't seem as likely with G-Sync.
 
Huddy guessed at the price and you are comparing it to a kit...which doesn't even exist for adaptive sync.
No, he wasn't guessing; that is what he was told. Those scalers very much do exist and have for the last few months.


Only a very small number of AMD models support it. Combine that with the fact that AMD represents a minority of the market to begin with. AMD was down to 20% last quarter and not all of those support adaptive sync.
Wrong. 100% of AMD GPUs and APUs that are shipping right now support Adaptive Sync.

Why waste the extra money on a product very few people can use and even fewer will buy?
Why would they waste the money on ridiculously expensive G-Sync FPGAs that very few people can use and even fewer will buy, not to mention the 6-12+ months of lead time it takes for Nvidia and friends to R&D and hand-tune the FPGAs for each display?
Why would they use 1-2 year old scalers in a new midrange and up monitor/display line?

Read the LG press release. They are not offering adaptive sync across the board even though the other monitors support the new DP spec.
At this time.
 