AMD Demonstrates "FreeSync", Free G-Sync Alternative at CES 2014

I didn't say zero cost. I implied the cost would come down over time. Like I said, consider the G-Sync 120 Hz 1440p panel being made by Asus: $800. Their IPS 1440p panel is $650.

Not unreasonable, especially given that it is the first 1440p screen with official 120 Hz support. The cost will come down over time. I never said "free". I said it will be absorbed into the monitor's cost and will eventually be lower. And yes, an FPGA costs more than an ASIC. By the same token, FreeSync will not be "free": the logic board costs money. I'll also remind you that 768 MB of RAM costs pennies. C'mon, you know this. It doesn't add significant cost to the BOM unless it's GDDR5, and it isn't GDDR5.

And the real cost of that monitor, if G-Sync weren't overpriced, could be what? $599-699?
 
And the real cost of that monitor, if G-Sync weren't overpriced, could be what? $599-699?

I don't know. You tell me. Asus' upcoming G-Sync 120 Hz 1440p panel is $799 and will be released in the latter part of Q1 2014. Meanwhile, the VG248QE is available now in limited quantities. Yes, the upgrade module is expensive, but the switch to an ASIC (which I suspect Asus will be using) will be much cheaper. Do you consider that too much of a premium? I mean, you'll buy 280X CrossFire without flinching. I don't know. Maybe for some people it is. I find it reasonable considering that this Asus panel is the first 1440p panel with _official_ 120 Hz support.

By the same token, what do you expect the special control boards in FreeSync monitors to cost whenever they arrive (which PROBABLY won't happen in 2014)? I ask you: do you think FreeSync-enabled monitors will cost zero dollars extra? If you expect that, I have a bridge to sell you.
 
I don't know. You tell me. The Asus IPS 1440p panel retails for $650-700 USD. Their upcoming G-Sync 120 Hz 1440p panel is $799.

Do you consider that too much of a premium? I mean, you'll buy 280X CrossFire without flinching. Is $100 too much for G-Sync? I don't know. Maybe for some people it is. I find it reasonable considering that this Asus panel is the first 1440p panel with _official_ 120 Hz support.

By the same token, what do you expect the special control boards in FreeSync monitors to cost whenever they arrive (which PROBABLY won't happen in 2014)? I ask you: do you think FreeSync-enabled monitors will cost zero dollars extra? If you expect that, I have a bridge to sell you.

Actually, I do consider Nvidia's pricing absurd on everything they release. A VESA standard costs next to nothing to implement, by the way... If you want your monitor certified for the DP standard, it must meet all of the requirements of that standard, and the manufacturer doesn't gouge the customer for them. That's the opposite of Nvidia's logic and marketing. There wasn't a price premium for me to move to DP...

To give you an example, when I purchased a 5870 Eyefinity 6 in 2009, I got three HP LED 1080p monitors to go with it. At the time it was the only LED-backlit monitor available with DP. I got those monitors for $275 each...
 
The controller board with variable refresh support is not a mandatory part of the VESA specification. We've already discussed this. It is required by FreeSync and it will add to monitor costs. Please read today's article at PCPer. FreeSync requires a variable-refresh-aware logic board just like G-Sync does, and that is NOT a mandatory part of VESA or DP 1.3; it is optional.

But by all means, rant on about Nvidia pricing. Yawn. FreeSync isn't free, for the above reasons; that's my main point. It adds to the monitor's cost due to the special control board. If you think it's going to be free, AMD has already deceived you with their half-truth marketing. And it isn't coming anytime soon, since DP 1.3 isn't finalized.
 
You should show us again the source that states a special control board is required.

Because all I've seen points to an optional part of the VESA standard. So it's already in the standard, just optional. (You skipped over my previous post asking which control board.)
 
I just told you. Read today's story at PCPer. They just posted it a few hours ago.

To be clear, just because a monitor would run with DisplayPort 1.3 doesn't guarantee this feature would work. It also requires the controller on the display to understand and be compatible with the variable refresh portions of the spec, which with eDP 1.0 at least, isn't required. AMD is hoping that with the awareness they are building with stories like this display designers will actually increase the speed of DP 1.3 adoption and include support for variable refresh rate with them. That would mean an ecosystem of monitors that could potentially support variable speed refresh on both AMD and NVIDIA cards.
 
I've read that 10 times, and nowhere does it say a controller is required.

(I've broken it down for you in a previous post.)

Edit for clarity: Nowhere does it say "an additional controller" is required.
 
The controller board with variable refresh support is not a mandatory part of the VESA specification. We've already discussed this. It is required by FreeSync and it will add to monitor costs. Please read today's article at PCPer. FreeSync requires a variable-refresh-aware logic board just like G-Sync does, and that is NOT a mandatory part of VESA or DP 1.3; it is optional.

But by all means, rant on about Nvidia pricing. Yawn. FreeSync isn't free, for the above reasons; that's my main point. If you think it's going to be free, AMD has already deceived you with their half-truth marketing. And it isn't coming anytime soon, since DP 1.3 isn't finalized.

What I'm saying is that it will still be a hell of a lot cheaper than G-Sync, without being proprietary. I doubt it will ever come to market anyhow. Point is, Nvidia is just trying to lock people in. Once that happens, they can continue to gouge your wallet like they have been. You guys are the ones who will be crying the blues when you're paying 30% more for your hardware to support the G-Sync standard, which already carries a premium on top of it.

You = Sheep
 
I've read that 10 times, and nowhere does it say a controller is required.

(I've broken it down for you in a previous post.)

It is required because controllers in desktop panels do not use eDP; eDP is used by portables. Desktop screens include many inputs, so they do not use eDP, and they don't need the power-saving features that laptops have. eDP's variable refresh is a power-saving feature for laptops, so it obviously hasn't applied to desktops. Controllers in current desktop panels do not support variable refresh, but it is included within eDP (for portables) for power-saving reasons.

Why do you think AMD demoed this on a laptop? Let's not be dense here. Desktop panels do not use eDP, and thus do not include variable-refresh-aware controllers. Nvidia aren't idiots; they didn't add the control board for the hell of it. They had to make the panel variable-refresh aware, which desktop panels currently cannot be without a different controller. If current panels already included everything needed to be variable-refresh aware, there would be no need for Nvidia's FPGA. But that is not the case: desktop panels do not have variable-refresh-aware controllers. Some laptops do, since laptops can use eDP.
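To make the timing point concrete, here is a minimal Python sketch of the difference a variable-refresh-aware controller makes. The render times and the 8.3-33.3 ms refresh window are hypothetical example numbers, not measurements from any panel; the point is only that a fixed-refresh display quantizes every frame to the next scanout boundary, while a variable-refresh display can scan out as soon as the frame is ready.

```python
import math

FIXED_REFRESH_HZ = 60
SCANOUT_INTERVAL_MS = 1000 / FIXED_REFRESH_HZ   # ~16.7 ms per refresh at 60 Hz

# Hypothetical GPU render times for a few consecutive frames, in milliseconds.
render_times_ms = [14.0, 22.0, 17.5, 30.0, 15.0]

def fixed_refresh_display_ms(render_ms):
    """With v-sync on a fixed-refresh panel, a finished frame still waits for
    the next scanout boundary, so a slightly late frame slips a whole interval."""
    return math.ceil(render_ms / SCANOUT_INTERVAL_MS) * SCANOUT_INTERVAL_MS

def variable_refresh_display_ms(render_ms, min_ms=8.3, max_ms=33.3):
    """A variable-refresh-aware controller holds the vertical blank until the
    frame is ready, clamped to the panel's supported refresh window."""
    return min(max(render_ms, min_ms), max_ms)

for r in render_times_ms:
    print(f"render {r:5.1f} ms -> fixed: {fixed_refresh_display_ms(r):5.1f} ms,"
          f" variable: {variable_refresh_display_ms(r):5.1f} ms")
```

Running this shows the 22 ms and 30 ms frames slipping to a 33.3 ms boundary on the fixed-refresh path, while the variable-refresh path displays them as soon as they finish; the display-side controller is the piece that has to understand that variable vertical blank.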
 
You = Sheep

I believe the sheep would be the fucking idiot who listens to the company that spits out continual half-truths and lies and never delivers on its promises. I know what that's like. Should I name all the bullshit with 7970CF again? Did AMD deliver the CrossFire DX9/Eyefinity frame-pacing fix that they promised a year ago? Witcher 2 crashing for five straight months in CrossFire? How many beta drivers did I install just to be greeted with perpetual black screens? How many Ubisoft games had zero CrossFire support? I always submitted support tickets to AMD. "We're working on it." Blah blah blah. More promises from AMD, and they never fixed it. Here, 2.5 years later, CrossFire still has microstutter in Eyefinity on the 79xx cards, after the bug has existed for 3+ years.

Please. I normally wouldn't be so abrasive, but since you threw the "sheep" term out there, I think you're the fucking sheep. You'll listen to AMD's bullshit marketing lies and you'll buy their hardware knowing that they have endless issues, but that's cool.

You go right on ahead and listen to AMD's marketing bullshit. You go right on ahead and deal with AMD's issues. I had issues every month with 7970CF; I am cynical of AMD for a reason. I got tired of dealing with bullshit, and apparently you love dealing with it. You go and have fun with that, bro. You're probably a cryptocoin miner, so you don't even HAVE to deal with AMD's bullshit; you'll just happily mine coins with your AMD cards. Right? Am I getting this correct? I bet you don't even play PC games, do you?

Like I said, I wouldn't be this abrasive, but you're the one who threw the sheep term out there. After dealing with what I dealt with on the AMD side, I think I know who the goddamn sheep is. Actually, maybe I am a sheep. I bought four generations of GPUs from AMD, ending with 7970CF. Why? Maybe I'm a fucking idiot. I learned my lesson, though. Never again, AMD, never again.
 
All AMD did was demo it on hardware that already implemented the optional parts of the DisplayPort standard. Then they said they hope this gets manufacturers interested, so that in the future all displays will implement this optional part of the standard.

I don't think Nvidia are idiots. I just find it odd they always try to fight against the flow of things.
 
The cost of the RAM is not just the RAM; it's the increased board size, the added pin count for three DRAM chips, plus the die area needed for the pins and the DRAM controller, not to mention the added complexity of the frame-buffer logic. Even assuming you move to an ASIC solution, G-Sync will cost substantially more than what they were using before because of this.

The existing line-buffer-based scaler chips are very cheap and don't need any external storage at all. You should be able to add dynamic refresh-rate support as a fairly minor tweak to an existing scaler chip, with no pinout changes and likely no substantial die-size change either. It should amount to tweaks to a few state machines so they can tolerate the differing delays between frames.
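For a sense of scale between the two approaches, here is a quick back-of-the-envelope calculation in Python (my own arithmetic with ordinary 1440p/24-bit assumptions, not figures from any datasheet): a line-buffer scaler only ever holds a scanline or two on-chip, while buffering whole frames needs external DRAM, which is where the extra board area, pins and controller logic come from.

```python
# Back-of-the-envelope storage comparison between a line-buffer scaler and a
# full frame-buffer design. The resolution and bit depth are ordinary example
# values, not figures from any actual G-Sync or scaler datasheet.
WIDTH, HEIGHT = 2560, 1440          # 1440p panel
BYTES_PER_PIXEL = 3                 # 24-bit colour

line_bytes = WIDTH * BYTES_PER_PIXEL            # one scanline held on-chip
frame_bytes = WIDTH * HEIGHT * BYTES_PER_PIXEL  # one complete frame

print(f"one scanline: {line_bytes / 1024:6.1f} KiB (fits in on-chip SRAM)")
print(f"one frame:    {frame_bytes / 1024**2:6.1f} MiB (needs external DRAM)")
print(f"frames that fit in a 768 MiB buffer: {768 * 1024**2 // frame_bytes}")
```

That works out to roughly 7.5 KiB per scanline versus about 10.5 MiB per frame, so a 768 MiB buffer holds dozens of frames; the jump from on-chip line storage to off-chip frame storage is the cost step being argued about here.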

I'd guess the total cost difference between the frame-buffer (G-Sync) solution and the one without it is at least 2x or 3x. You don't double or triple the cost of your components without a reason, so I'd really like to know why they are doing it. I really doubt they give a flip about the power savings that drive the frame buffer in mobile displays, so it's got to be some other reason. Working around a GPU limitation seems to be the most obvious answer without additional info.
 
All AMD did was demo it on hardware that already implemented the optional parts of the DisplayPort standard. Then they said they hope this gets manufacturers interested, so that in the future all displays will implement this optional part of the standard.

I don't think Nvidia are idiots. I just find it odd they always try to fight against the flow of things.

Nvidia aren't the idiots... They are a tech corporation that is trying to stay relevant against the competition while remaining profitable. The idiots are the ones who don't own any stock in the company yet buy into their overpriced products, G-Sync being one of them... and then come on these forums and contend that anything AMD does is pure shit or a lie.

Nvidia would love for all of its current customers to be locked into proprietary hardware so that you have to keep buying their new cards, which, as we all know, are overpriced and sometimes even crippled.
 
I don't think Nvidia are idiots. I just find it odd they always try to fight against the flow of things.

Against the flow? They brought G-Sync to market first. Like most video card technology these days, they are paving the way.
 
Against the flow? They brought G-Sync to market first. Like most video card technology these days, they are paving the way.

No, they aren't.

Variable refresh rate is part of DP 1.3.

All NVIDIA did was look at what was being developed and make their own version.

They didn't come up with this; it has been in development for a while.
 
I believe the sheep would be the fucking idiot who listens to the company that spits out continual half-truths and lies and never delivers on its promises. I know what that's like. Should I name all the bullshit with 7970CF again? Did AMD deliver the CrossFire DX9/Eyefinity frame-pacing fix that they promised a year ago? Witcher 2 crashing for five straight months in CrossFire? How many beta drivers did I install just to be greeted with perpetual black screens? How many Ubisoft games had zero CrossFire support? I always submitted support tickets to AMD. "We're working on it." Blah blah blah. More promises from AMD, and they never fixed it. Here, 2.5 years later, CrossFire still has microstutter in Eyefinity on the 79xx cards, after the bug has existed for 3+ years.

Please. I normally wouldn't be so abrasive, but since you threw the "sheep" term out there, I think you're the fucking sheep. You'll listen to AMD's bullshit marketing lies and you'll buy their hardware knowing that they have endless issues, but that's cool.

You go right on ahead and listen to AMD's marketing bullshit. You go right on ahead and deal with AMD's issues. I had issues every month with 7970CF; I am cynical of AMD for a reason. I got tired of dealing with bullshit, and apparently you love dealing with it. You go and have fun with that, bro. You're probably a cryptocoin miner, so you don't even HAVE to deal with AMD's bullshit; you'll just happily mine coins with your AMD cards. Right? Am I getting this correct? I bet you don't even play PC games, do you?

Like I said, I wouldn't be this abrasive, but you're the one who threw the sheep term out there. After dealing with what I dealt with on the AMD side, I think I know who the goddamn sheep is. Actually, maybe I am a sheep. I bought four generations of GPUs from AMD, ending with 7970CF. Why? Maybe I'm a fucking idiot. I learned my lesson, though. Never again, AMD, never again.

You aren't going to have much fun with multi-display on Nvidia cards: been there, done that. AMD handles multi-display much better.
 
You aren't going to have much fun with multi-display on Nvidia cards: been there, done that. AMD handles multi-display much better.

Huh? How can you say that, considering the well-documented tearing and microstutter bugs in DX9 and Eyefinity with CrossFire on all 79xx and prior cards? I used 7970CF. Are you trying to tell me this with a straight face? Because you're either being disingenuous or you don't know what you're talking about. OR, more than likely, you don't play PC games. IF you gamed with CrossFire/Eyefinity, you'd know what's wrong with your statement.

I'm curious. You aren't a PC gamer? Some people don't game; that's cool. But there are some obvious problems with your statement, and they would be obvious if you were a gamer using CF with Eyefinity, unless you have some kind of visual impairment that keeps you from seeing microstutter and tearing. These bugs have existed since 5870CF in Eyefinity, by the way. Gotta love driver bugs that persist for 3+ years.
 
That's hilarious, considering the well-documented tearing and microstutter bugs in DX9 and Eyefinity with CrossFire on all 79xx and prior cards. I used 7970CF. Are you trying to tell me this with a straight face? Because you're either being disingenuous or you don't know what you're talking about. OR, more than likely, you don't play PC games. IF you gamed with CrossFire/Eyefinity, you'd know what's wrong with your statement.

I'm curious. You aren't a PC gamer? Some people don't game; that's cool. But there are some obvious problems with your statement, and they would be obvious if you were a gamer using CF with Eyefinity, unless you have some kind of visual impairment that keeps you from seeing microstutter and tearing. These bugs have existed since 5870CF in Eyefinity, by the way. Gotta love driver bugs that persist for 3+ years.

Maybe you should do your homework. I build gaming PCs for a living, run multi-display at home, and have tried it on both Nvidia and AMD hardware: AMD is the better experience.

Also, I design games as a hobby. A Google search of my screen name would show you that.
 
Let's just ignore the tons of websites that have shown proof of microstutter on AMD cards in Eyefinity CrossFire. We'll just pretend that never happened. I count about 20 websites that have tested frame times in CrossFire Eyefinity on 79xx cards.
 
Also, it should be noted that THIS VERY WEBSITE complained of this issue as well on 79xx cards... It is fixed on the 290/290X but not on prior cards. CF/Eyefinity and DX9/CF have microstutter on 79xx and older cards. And tearing.

But we'll take your word for it over HardOCP's. Especially since AMD admitted to the issues.
 
Your posts are starting to seem quite personal. I'm not happy with what appear to be personal attacks on me when I haven't tried to discredit or defame you in any way. If I have, I'm sorry, but your attitude thus far has escalated to inappropriate levels of attack.

Edit: Also, I make claims based on my own experience. I don't cite graphs or other sources because I'm in a position to freely test new hardware, software, and configurations thanks to my job, hobbies, etc. I like graphs, but they don't tell you the full experience. For example, Nvidia will not span a set of screens unless they are 100% identical; even different firmware revisions are enough to break NVSurround and only detect two screens in SLI. Also, some games don't respond well to Nvidia's approach to the Surround feature, stretching the HUD or locking the aspect ratio to 4:3, whereas most of those titles have no issue on AMD cards.
 
Sorry. If you're making untrue statements, I'll post about them. I don't know if you're being disingenuous or really believe what you're saying, because it isn't true.

Like I said, this issue has been shown to exist by numerous websites, and even this website complained of it. That's a fact. AMD admitted to the issue. Also a fact. So with that being the case, you're telling us that AMD has the better drivers for CrossFire/Eyefinity on older cards? Like I said, that's just an untrue statement. Period. If you want to reconcile your untrue statement into a true one, hey, great.
 
Such as here:

http://www.hardocp.com/article/2012..._gtx_680_sli_video_card_review/9#.Us4E4vRDuaU

We can't communicate to you "smoothness" in raw framerates and graphs. Smoothness, frame transition, and game responsiveness is the experience that is provided to you as you play. Perhaps it has more to do with "frametime" than it does with "framerate." To us it seems like SLI is "more playable" at lower framerates than CrossFireX is. For example, where we might find a game playable at 40 FPS average with SLI, when we test CrossFireX we find that 40 FPS doesn't feel as smooth and we have to target a higher average framerate, maybe 50 FPS, maybe 60 FPS for CrossFireX to feel like NVIDIA's SLI framerate of 40 FPS. Only real-world hands on gameplay can show you this, although we can communicate it in words to you. Even though this is a very subjective realm of reviewing GPUs, it is one we surely need to discuss with you.

The result of SLI feeling smoother than CrossFireX is that in real-world gameplay, we can get away with a bit lower FPS with SLI, whereas with CFX we have to aim a little higher for it to feel smooth. We do know that SLI performs some kind of driver algorithm to help smooth SLI framerates, and this could be why it feels so much better. Whatever the reason, to us, SLI feels smoother than CrossFireX.

Personally speaking here, when I was playing between GeForce GTX 680 SLI and Radeon HD 7970 CrossFireX, I felt GTX 680 SLI delivered the better experience in every single game. I will make a bold and personal statement; I'd prefer to play games on GTX 680 SLI than I would with Radeon HD 7970 CrossFireX after using both. For me, GTX 680 SLI simply provides a smoother gameplay experience. If I were building a new machine with multi-card in mind, SLI would go in my machine instead of CrossFireX. In fact, I'd probably be looking for those special Galaxy 4GB 680 cards coming down the pike. After gaming on both platforms, GTX 680 SLI was giving me smoother performance at 5760x1200 compared to 7970 CFX. This doesn't apply to single-GPU video cards, only between SLI and CrossFireX.

AMD has fixed this bug for single-screen DX11 resolutions only. As of now it still exists in DX9 CrossFire, DX10 CrossFire, and Eyefinity CrossFire. And it has existed for 2+ years.
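The "frametime versus framerate" distinction in that quote is easy to illustrate. The sketch below uses made-up frame times, not measured data: two runs can average the same 40 FPS while one of them alternates short and long frames, which is exactly the frame-to-frame inconsistency people describe as microstutter.

```python
# Two hypothetical frame-time sequences in milliseconds (made-up numbers).
# Both average 25 ms per frame (40 FPS), but one alternates short and long
# frames the way microstutter presents itself.
smooth_ms   = [25, 25, 25, 25, 25, 25, 25, 25]
stuttery_ms = [10, 40, 10, 40, 10, 40, 10, 40]

def summarize(frame_times_ms):
    avg = sum(frame_times_ms) / len(frame_times_ms)
    fps = 1000 / avg
    # The largest frame-to-frame swing is a simple proxy for perceived stutter.
    swings = [abs(b - a) for a, b in zip(frame_times_ms, frame_times_ms[1:])]
    return fps, max(frame_times_ms), max(swings)

for name, run in (("smooth", smooth_ms), ("stuttery", stuttery_ms)):
    fps, worst_frame, worst_swing = summarize(run)
    print(f"{name:9s} avg {fps:4.1f} FPS, worst frame {worst_frame:2d} ms, "
          f"largest frame-to-frame swing {worst_swing:2d} ms")
```

An average-FPS graph treats both runs as identical; the frame-time swings are what frame-pacing reviews (and the "smoothness" comments above) are actually measuring.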
 
AMD's response to proprietary hardware versus open standards in the computer industry.
Blog link.


Doing the Work for Everyone

In our industry, one of the toughest decisions we continually face is how open we should be with our technology. On the one hand, developing cutting-edge graphics technology requires enormous investments. On the other hand, too much emphasis on keeping technologies proprietary can hinder broad adoption.

It’s a dilemma we face practically every day, which is why we decided some time ago that those decisions would be guided by a basic principle: our goal is to support moving the industry forward as a whole, and that we’re proud to take a leadership position to help achieve that goal.

The latest example of that philosophy is our work with dynamic refresh rates, currently codenamed “Project FreeSync”. Screen tearing is a persistent nuisance for gamers, and vertical synchronization (v-sync) is an imperfect fix. There are a few ways the problem can be solved, but there are very specific reasons why we’re pursuing the route of using industry standards.

The most obvious reason is ease of implementation, both for us from a corporate perspective and also for gamers who face the cost of upgrading their hardware. But the more important reason is that it’s consistent with our philosophy of making sure that the gaming industry keeps marching forward at a steady pace that benefits everyone.

It sometimes takes longer to do things that way — lots of stakeholders need to coordinate their efforts — but we know it’s ultimately the best way forward. This strategy enables technologies to proliferate faster and cost less, and that’s good for everyone.

The same philosophy explains why we’re revealing technology that’s still in the development stage. Now’s our chance to get feedback from industry, media and users, to make sure we develop the right features for the market. That’s what it takes to develop a technology that actually delivers on consumers’ expectations.

And Project FreeSync isn’t the only example of this philosophy and its payoffs. We worked across the industry to first bring GDDR5 memory to graphics cards— an innovation with industry-wide benefits. And when game developers came to us demanding a low-level API, we listened to them and developed Mantle. It’s an innovation that we hope will speed the evolution of industry-standard APIs in the future.

We’re passionate about gaming, and we know that the biggest advancements come when all industry players collaborate. There’s no room for proprietary technologies when you have a mission to accomplish. That’s why we do the work we do, and if we can help move the industry forward we’re proud to do it for everyone.

Jay Lebo is a Product Marketing Manager at AMD. His postings are his own opinions and may not represent AMD’s positions, strategies or opinions. Links to third party sites are provided for convenience and unless explicitly stated, AMD is not responsible for the contents of such linked sites and no endorsement is implied.
 
"We worked across the industry to first bring GDDR5 memory to graphics cards..."

What? Hmm, we can list Mantle and we need one more thing... hmm... what's one more thing to make this seem like a list... uh, we were first to use GDDR5! YEAH! Open standards!
 
It's hilarious that AMD is taking credit for Infineon's (now Qimonda) work on GDDR5. LOL.
 
What's funny is that they were first with lots of stuff besides the random GDDR5 reference: DirectX 10, DirectX 11, triple-monitor gameplay, morphological anti-aliasing... All firsts by AMD, and open standards. Why they chose GDDR5... WTF...
 
Jay Lebo probably just wanted to keep the blog post short and make a point. If he had made an exhaustive list of industry standards they had worked on, it would have detracted from the overall theme of the post. Sometimes it's better to be terse and to the point rather than rambling around the topic.
 
There’s no room for proprietary technologies when you have a mission to accomplish.
I guess this statement affirms that AMD will no longer be producing proprietary graphics cards or graphics APIs.
 
I guess this statement affirms that AMD will no longer be producing proprietary graphics cards or graphics APIs.

Not sure what you mean? Ask the writer of the article; his blog even has a comment section where you can ask him directly. I was just bringing you all some news that I thought was interesting and meaningful. :)
 
It's a joke. The statement suggests that AMD does not dabble in proprietary technologies despite the fact that their graphics cards, APIs and other technologies are proprietary.

Ergo, they no longer have a mission to produce them.
 
Nvidia is free to use Mantle if they choose; they just have to figure out the best way to do so.

So it is not proprietary.

If a game comes out that is Mantle-only, AMD won't go knocking on Nvidia's door if Nvidia makes a wrapper or whatever driver tweaks are needed to get Mantle working on an Nvidia GPU.
 
Oh yeah. Nvidia is going to produce an AMD GCN-architecture GPU. That'll happen. Gotta hand it to AMD's marketing: they love trying to sound like the "good guys" even when it's bullshit. It's quite hilarious.
 
Oh yeah. Nvidia is going to produce an AMD GCN-architecture GPU. That'll happen. Gotta hand it to AMD's marketing: they love trying to sound like the "good guys" even when it's bullshit. It's quite hilarious.

Nvidia makes wrappers to convert CUDA to OpenCL. Why not make one for Mantle? The really good thing is that they don't have to make one. They can just use DX11. It seems it's already good enough for them. They should be extremely happy to use it, and their fans should be too.
 
They don't have to make a GCN-capable card to use Mantle; they just have to write a driver layer/wrapper to make it run on their cards. It won't run as fast as on an AMD GCN card, but at least the game will run.

AMD could do the same for older cards if they chose to, but they have no real need to, since they want to push their newer cards to actually make some money.

Nvidia doesn't need to do it until a Mantle-only game comes out, or unless their Mantle wrapper shows some performance improvements over DX.
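For what a "driver layer/wrapper" means in practice, here is a toy Python sketch of the general translation-layer idea. The class and method names are entirely hypothetical placeholders, not Mantle's or any driver's real API; the only point is that the same application code can be routed either to a native backend or through a layer that re-expresses the calls in terms of another API, usually at some performance cost.

```python
# A toy illustration of the wrapper / translation-layer idea. Every name here
# is a hypothetical placeholder -- none of this is Mantle's or any vendor's
# real API. The structure is the point: the same application code can run on
# a native backend or on a backend that translates the calls for another API.

class NativeBackend:
    """Stands in for a vendor's own low-level driver path."""
    def submit(self, command: str) -> None:
        print(f"[native]  executing {command} directly")

class TranslationBackend:
    """Stands in for a wrapper that maps the same calls onto another API."""
    def submit(self, command: str) -> None:
        print(f"[wrapper] translating {command} into another API's equivalent")

class ThinGraphicsAPI:
    """The interface the game is written against."""
    def __init__(self, backend):
        self.backend = backend

    def draw(self, mesh: str) -> None:
        self.backend.submit(f"draw({mesh})")

# The same 'game' code runs on either backend; only the cost of the extra
# translation step differs.
for backend in (NativeBackend(), TranslationBackend()):
    ThinGraphicsAPI(backend).draw("player_model")
```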
 
AMD has said (on Tom's Hardware, I believe) that Mantle does not require GCN; it's just that GCN is the first architecture designed to use it. They said that once Mantle is mature, other companies can craft architectures that comply with the Mantle standard.
 