Another 6 months before VEGA?

The longer we wait, the higher the expectations are for a home run. If the launch were around the corner, people would be happier with at least a ground-rule double.
I had money in 2014 for a new GPU. AMD had nothing worth the cost and the Nvidia 900 series was coming. I waited to see what AMD responded with and was let down by the Fury stuff. The 1080 came and was great; I was hoping for a Ti, but with no competition it won't come. Whichever card comes next from either vendor with the best bang for my buck will be my upgrade. The games I mainly play are fine on my 7970 for now, though several I can no longer run through Eyefinity.

If you have to buy now, pick a side and be happy. But if you can afford to wait... Vega had better be solid after the lengthy, teased wait, otherwise camp green will gain a lot more customers this cycle. Good for them, bad for the industry.
 
AMD doesn't lock you into FreeSync. It's based on an open standard. Nvidia could easily support it but they won't. So stop blaming AMD for something Nvidia doesn't want to do.

When only one IHV uses it, the word 'open' is meaningless. And now AMD is working on FreeSync 2, which will increase cost and require API work.

FreeSync doesn't work with Intel, VIA, Matrox, etc., either.
 
When only one IHV uses it, the word 'open' is meaningless. And now AMD is working on FreeSync 2, which will increase cost and require API work.

FreeSync doesn't work with Intel, VIA, Matrox, etc., either.
Stop playing with words and facts. FreeSync is AMD-only, whereas Adaptive-Sync is the VESA standard. Intel, VIA, Nvidia, and Matrox could all use Adaptive-Sync since it is an OPEN STANDARD; there is absolutely nothing keeping them from doing so. The opposite applies to G-Sync, however, as the competition would be required to PAY for access, and even then I wouldn't be too sure Nvidia would allow that access.
 
Stop playing with words and facts. FreeSync is AMD-only, whereas Adaptive-Sync is the VESA standard. Intel, VIA, Nvidia, and Matrox could all use Adaptive-Sync since it is an OPEN STANDARD; there is absolutely nothing keeping them from doing so. The opposite applies to G-Sync, however, as the competition would be required to PAY for access, and even then I wouldn't be too sure Nvidia would allow that access.

They could, but they don't. So the result is the same.

You buy G-Sync for Nvidia and FreeSync for AMD.
 
When only one IHV uses it, the word 'open' is meaningless. And now AMD is working on FreeSync 2, which will increase cost and require API work.

FreeSync doesn't work with Intel, VIA, Matrox, etc., either.
Intel has announced support already, likely coming with Cannon Lake. It would have been sooner, but there are some minor hardware requirements; they just need to update the display controller a bit. FreeSync 2 won't increase costs beyond requiring a minimal standard of hardware and basic conformance testing.

You buy G-Sync for Nvidia and FreeSync for AMD.
Then why are there so many more FreeSync than G-Sync displays out there? If what you're saying were true, I'd expect it to match the market-share numbers. Fact is, Adaptive-Sync is the industry standard while G-Sync is a proprietary addition.
 
Intel has announced support already, likely coming with Cannon Lake. It would have been sooner, but there are some minor hardware requirements; they just need to update the display controller a bit. FreeSync 2 won't increase costs beyond requiring a minimal standard of hardware and basic conformance testing.

They haven't announced any support. I think what you're referring to is a single source: Scott Wasson, an AMD employee, at TechReport, first saying Intel was positively inclined towards new standards like Adaptive-Sync and then saying they would implement it.

No such thing has been mentioned for future products yet. You may have to look as far as Ice Lake for any potential support.

Then why are there so many more FreeSync than G-Sync displays out there? If what you're saying were true, I'd expect it to match the market-share numbers. Fact is, Adaptive-Sync is the industry standard while G-Sync is a proprietary addition.

Because there is no quality control for FreeSync. That's something FreeSync 2 will introduce, and it will hence increase cost.
 
Stop playing with words and facts. FreeSync is AMD-only, whereas Adaptive-Sync is the VESA standard. Intel, VIA, Nvidia, and Matrox could all use Adaptive-Sync since it is an OPEN STANDARD; there is absolutely nothing keeping them from doing so. The opposite applies to G-Sync, however, as the competition would be required to PAY for access, and even then I wouldn't be too sure Nvidia would allow that access.

This falls down, though, with the rumours that AMD may license FreeSync 2 or charge some kind of royalty, and also with the intention to enforce the QA spec more actively with FreeSync 2.
When they were asked about the possibility of a royalty charge pertaining to FreeSync 2, they said no comment.
Cheers
 
Not looking good. According to this article we're looking at the last months of Q3 2017. Didn't I say it won't be until September before we see these parts? AMD leaves me no choice but to overpay for a GTX 1080 and hope I don't have screen-tearing issues. Who's going to be willing to wait until September? AMD is doomed (no pun intended) if they think that releasing a competing product 26 months after the Fury X makes them a company worth sticking with. The longer they wait, the longer it's going to take for them to make any kind of impact on the high end. AMD needs to stick to the mid-range; they are incapable of releasing high-end cards that fit any kind of product life cycle. If what is below is true, I'm done with AMD for now. I'll sell my monitor, get a G-Sync monitor, and overpay for both, because Nvidia can do as they please while AMD is incapable of competing at the high end, leaving no competition to help drive down prices. On top of that, by the time September comes (8 months from now) Nvidia will probably have a card faster than Vega waiting in the wings to be released in October. WTF is wrong with AMD? Retaining customers must not be in their project plan.

Here's the quote for the lazy:

"
AMD Vega 10 Release Date

Vega 10 GPUs, the first ones to utilize the new architecture of AMD, is expected to hit the store shelves in the last months of Q3 2017, according to PC Perspective. The GPU is said to come with a higher price compared to its predecessor because of its high-performing specs, but not as expensive as NVIDIA GTX 1080.

http://www.universityherald.com/art...a-10-vs-nvidia-gtx-1080-amd-vega-10-specs.htm
 
Not looking good. According to this article we're looking at the last months of Q3 2017. Didn't I say it won't be until September before we see these parts? AMD leaves me no choice but to overpay for a GTX 1080 and hope I don't have screen-tearing issues. Who's going to be willing to wait until September? AMD is doomed (no pun intended) if they think that releasing a competing product 26 months after the Fury X makes them a company worth sticking with. The longer they wait, the longer it's going to take for them to make any kind of impact on the high end. AMD needs to stick to the mid-range; they are incapable of releasing high-end cards that fit any kind of product life cycle. If what is below is true, I'm done with AMD for now. I'll sell my monitor, get a G-Sync monitor, and overpay for both, because Nvidia can do as they please while AMD is incapable of competing at the high end, leaving no competition to help drive down prices. On top of that, by the time September comes (8 months from now) Nvidia will probably have a card faster than Vega waiting in the wings to be released in October. WTF is wrong with AMD? Retaining customers must not be in their project plan.

Here's the quote for the lazy:

"
AMD Vega 10 Release Date

Vega 10 GPUs, the first ones to utilize the new architecture of AMD, is expected to hit the store shelves in the last months of Q3 2017, according to PC Perspective. The GPU is said to come with a higher price compared to its predecessor because of its high-performing specs, but not as expensive as NVIDIA GTX 1080.

http://www.universityherald.com/art...a-10-vs-nvidia-gtx-1080-amd-vega-10-specs.htm
Looking at the PCPer article, the Q3/Q4 timeframe is for the dual Vega 10, not the first Vega launch.

Cheers
 
Not looking good. According to this article we're looking at the last months of Q3 2017. Didn't I say it won't be until September before we see these parts? AMD leaves me no choice but to overpay for a GTX 1080 and hope I don't have screen-tearing issues. Who's going to be willing to wait until September? AMD is doomed (no pun intended) if they think that releasing a competing product 26 months after the Fury X makes them a company worth sticking with. The longer they wait, the longer it's going to take for them to make any kind of impact on the high end. AMD needs to stick to the mid-range; they are incapable of releasing high-end cards that fit any kind of product life cycle. If what is below is true, I'm done with AMD for now. I'll sell my monitor, get a G-Sync monitor, and overpay for both, because Nvidia can do as they please while AMD is incapable of competing at the high end, leaving no competition to help drive down prices. On top of that, by the time September comes (8 months from now) Nvidia will probably have a card faster than Vega waiting in the wings to be released in October. WTF is wrong with AMD? Retaining customers must not be in their project plan.

Here's the quote for the lazy:

"
AMD Vega 10 Release Date

Vega 10 GPUs, the first ones to utilize the new architecture of AMD, is expected to hit the store shelves in the last months of Q3 2017, according to PC Perspective. The GPU is said to come with a higher price compared to its predecessor because of its high-performing specs, but not as expensive as NVIDIA GTX 1080.

http://www.universityherald.com/art...a-10-vs-nvidia-gtx-1080-amd-vega-10-specs.htm


Q2, not Q3. When you've got such a small R&D budget, and with the layoffs and engineers leaving AMD, you can't expect them to be on the ball all the time with their products. Polaris was, performance-wise, just about as good as the 1060, right? But when it comes down to it, the 1060 is ahead in power consumption, A LOT. So is that extra six months on Vega going to change that delta? Rumor has it Vega uses 225 W and looks to be a 1080 competitor; that would mean Vega is using roughly 20-25% more power for the same performance as the 1080, if the leaks are right. So where are we? A little improvement over Polaris, but the same old story?
 
The University Herald article has factual errors. They can't even tell the difference between a USB 3 port and an Ethernet port.
The other is all based on "facts" gathered by the clickbait site VideoCardz.
 
The University Herald article has factual errors. They can't even tell the difference between a USB 3 port and an Ethernet port.
The other is all based on "facts" gathered by the clickbait site VideoCardz.

Thank you. I will never go there again. Kinda invalidates my rant. LOL!!!
 
The University Herald article has factual errors. They can't even tell the difference between a USB 3 port and an Ethernet port.
The other is all based on "facts" gathered by the clickbait site VideoCardz.

University Herald, yeah, but VideoCardz has been pretty good with their leaks, and we already know the 12 TFLOPS at 225 W figure is from AMD's Instinct cards, which coincidentally lines up percentage-wise with the TFLOPS difference between Polaris and the GTX 1060 too. Also, given that AMD has stated Q2 2017 many times, nothing has really changed there.
 
Seems like they didn't even read the article, just like me.
It's a university publication, what do you expect? :)
RTFM, in my experience, was used on people after they left university; ah, the good ol' days (or not, lol).
Lighthearted comment before anyone takes it too seriously.
Cheers
 
I just wish AMD would let us know an exact time frame. And to be honest, I still don't think we will see Vega until September at the earliest.
 
It's never been Q1.

I would expect Vega 10 to launch around May/June.

Ryzen is/was Q1 2017 and Vega is/was H1 2017.

I was trying to think where I got the early-2017 Vega release date from. It was from AMD's own roadmap. So now early Q1 2017 has turned into H1 2017? What else will they change? :(

[Image: AMD GPU roadmap slide (Roadmap-640x360.jpg)]
 
I was trying to think where I got the early-2017 Vega release date from. It was from AMD's own roadmap. So now early Q1 2017 has turned into H1 2017? What else will they change? :(

[Attachment: AMD GPU roadmap slide]

There isn't any date on the roadmap, not even a marker for where the years start. And it fuels the hype train as always; AMD is good at this. Fanboys came up with everything from Q1 2017 to December 2016 and even October 2016, because only the best possible outcome could be real.

The perf/watt comparisons were a big fat (semi) lie too. Polaris got compared to the absolute worst 28 nm part with a way overspecced TDP, while Vega uses packed math to get its perf/watt figure. Navi is a 2019 (could easily end up 2020) product on 7 nm.
 
That's your own estimate and this is why AMD is so good at hyping :)

It's how the chart works. There don't need to be 12 lines between the years to represent the months. If you draw a line straight down from Polaris, it falls right in the middle between 2016 and 2017 (June). Polaris was released in June '16. It seems as though Vega is delayed.
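For what it's worth, that chart reading is just linear interpolation along the x-axis. Here's a toy sketch of the idea, assuming the axis is linear in time; the tick and marker positions are made up for illustration, not measured from the slide:

```c
/* Toy illustration only: assumes the roadmap's x-axis is linear in time.
 * The positions below are invented, not measured from AMD's slide. */
#include <stdio.h>

/* Map an x position between the 2016 and 2017 ticks to a fractional year. */
static double position_to_year(double x, double x2016, double x2017)
{
    return 2016.0 + (x - x2016) / (x2017 - x2016);
}

int main(void)
{
    const double x2016 = 100.0, x2017 = 300.0; /* hypothetical tick positions */
    const double x_polaris = 200.0;            /* marker halfway between them */

    double year = position_to_year(x_polaris, x2016, x2017);
    double months_in = (year - (int)year) * 12.0; /* 0.5 of a year = ~6 months, i.e. late June */

    printf("Polaris marker reads as %.2f, about %.0f months into %d\n",
           year, months_in, (int)year);
    return 0;
}
```

Read the same way, a Vega marker sitting a bit short of halfway between the 2017 and 2018 ticks would land around May/June 2017.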
 
Because there is no quality control for FreeSync. That's something FreeSync 2 will introduce, and it will hence increase cost.
A couple extra cents on the cost? Not sure I'd call that a significant increase. All that's really required is that a few chips get certified as compliant, and that largely comes down to being able to support a wider range of frequencies.

They haven't announced any support. I think what you're referring to is a single source: Scott Wasson, an AMD employee, at TechReport, first saying Intel was positively inclined towards new standards like Adaptive-Sync and then saying they would implement it.

No such thing has been mentioned for future products yet. You may have to look as far as Ice Lake for any potential support.
Intel’s Chief Graphics Software Architect, David Blythe, has revealed that they will support the VESA Adaptive Sync standard in their integrated graphics processing units (iGPU) in the future. Link
That was back in 2015, though, with no specific mention of which products would bring it.
 
I was trying to think where I got the early-2017 Vega release date from. It was from AMD's own roadmap. So now early Q1 2017 has turned into H1 2017? What else will they change? :(

[Attachment: AMD GPU roadmap slide]

In their last two (or three?) quarterly reports they've been saying Q1 for (Ry)Zen and H1 for Vega. If you look at Nvidia's roadmap chart from about the same time AMD's came out, I think they had Volta as 2016 and Pascal wasn't even on it. Shit happens; things change. :unsure:

Even a few months late, I see more positive than negative in the AMD/RTG roadmap this time around. As always, people need to keep their expectations in check and realistic.
 
In their last two (or three?) quarterly reports they've been saying Q1 for (Ry)Zen and H1 for Vega. If you look at Nvidia's roadmap chart from about the same time AMD's came out, I think they had Volta as 2016 and Pascal wasn't even on it. Shit happens; things change. :unsure:

Even a few months late, I see more positive than negative in the AMD/RTG roadmap this time around. As always, people need to keep their expectations in check and realistic.
Yeah Pascal was added as an interim step to Volta.
But the original 'whitepaper' (more like a technical news brief) for Volta said 2017 :)
I have a copy of it and it says:
Introducing Summit and Sierra
The ORNL Summit system will be a leadership computing platform for the Office of Science. Delivered in 2017, Summit is expected to reach between 150 and 300 petaFLOPS and is positioned as a precursor to the U.S. DoE’s exascale system.
The Nvidia paper is dated November 2014, and is specific to Volta.

Just to add, I would say those roadmap slides are just a summary of the product cycle and not designed to give accurate dates; this applies to both AMD and Nvidia.
Cheers
 
I did use the unsure smiley. ;) Thank you for clarifying that detail.

Like you, I do believe consumer Volta is landing this year (I am leaning towards the end of Q2/beginning of Q3).
 
What if this is what the roadmap is supposed to look like? I drew the lines on, using the bits at the ends to space the years evenly. That would mean Polaris was ahead of schedule and Vega is looking like May/June-ish. The lack of vertical lines on the original is what really throws everything off!

[Image: AMD GPU roadmap with year lines drawn in (Roadmap-640x360.jpg)]
 
A couple extra cents on the cost? Not sure I'd call that a significant increase. All that's really required is that a few chips get certified as compliant, and that largely comes down to being able to support a wider range of frequencies.

Citation?
 
Citation?
Beyond the hourly cost of an engineer to sit down and verify results that are distributed across all production models, I don't have one. Without royalties or other onerous legal processes it normally doesn't cost much; not that different from the PCIe standards, for example, where companies can self-certify. Even if it costs a million dollars spread across a million panels, which would be a rather conservative volume, you're looking at an increase of a dollar per panel.

I haven't seen a hard specification on what FreeSync 2 requires, but there's not a whole lot to it if other standards are met. The goal looked to be avoiding panels with limited variable refresh ranges and supporting HDR. It seems a matter of verifying that certain frequency ranges can be achieved, a test any consumer could probably conduct by running their refresh rate up and down to the extremes.
 
Beyond the hourly cost of an engineer to sit down and verify results that are distributed across all production models, I don't have one. Without royalties or other onerous legal processes it normally doesn't cost much; not that different from the PCIe standards, for example, where companies can self-certify. Even if it costs a million dollars spread across a million panels, which would be a rather conservative volume, you're looking at an increase of a dollar per panel.

I haven't seen a hard specification on what FreeSync 2 requires, but there's not a whole lot to it if other standards are met. The goal looked to be avoiding panels with limited variable refresh ranges and supporting HDR. It seems a matter of verifying that certain frequency ranges can be achieved, a test any consumer could probably conduct by running their refresh rate up and down to the extremes.

So you don't know. One of the key points of FreeSync 2 is to add a quality level, something AMD has learned about from the hit-or-miss (to put it mildly) record of FreeSync. FreeSync 2 may also not be so free, and on the game front it requires extra development for a FreeSync 2 API.
 
So you don't know. One of the key points of FreeSync 2 is to add a quality level, something AMD has learned about from the hit-or-miss (to put it mildly) record of FreeSync. FreeSync 2 may also not be so free, and on the game front it requires extra development for a FreeSync 2 API.
I only know what common sense and math dictate. There's absolutely no reason whatsoever it will significantly increase costs, not unless they decide to arbitrarily increase those costs. Do you have a citation for these cost increases you keep alluding to? Because it sounds like pure BS.

The API work you mention, to figure out the color space of the display, sounds like a really rough development cost. So much work to call a single function during initialization, since Windows doesn't currently support the feature. Probably a lot of dev cost sunk into the five minutes that takes to implement.
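To put that in perspective, here is a minimal sketch of what such a one-time query at renderer start-up could look like. The fs2_* names and the struct are purely hypothetical stand-ins for illustration, not AMD's actual FreeSync 2 API, and the stub simply returns canned values:

```c
/* Hypothetical sketch only: the fs2_* names are invented for illustration
 * and are NOT AMD's real FreeSync 2 API. The point is that asking the
 * display for its native colour volume is a one-off call at init time. */
#include <stdio.h>

typedef struct {
    float min_nits;   /* darkest luminance the panel can reproduce   */
    float max_nits;   /* peak luminance available for HDR highlights */
    float gamut;      /* colour gamut relative to sRGB (1.0 = sRGB)  */
} fs2_display_caps;

/* Stub standing in for a driver call; a real implementation would fetch
 * the panel's reported capabilities from the display driver. */
static int fs2_query_display_caps(int display_index, fs2_display_caps *out)
{
    (void)display_index;
    out->min_nits = 0.05f;
    out->max_nits = 600.0f;
    out->gamut    = 1.25f;
    return 0; /* success */
}

static void renderer_init(void)
{
    fs2_display_caps caps;
    if (fs2_query_display_caps(0, &caps) == 0) {
        /* Tone-map straight to the panel's native range and skip the
         * monitor's own, slower, internal HDR re-mapping pass. */
        printf("Panel: %.2f-%.0f nits, %.2fx sRGB gamut\n",
               caps.min_nits, caps.max_nits, caps.gamut);
    }
    /* else: fall back to standard sRGB / HDR10 output */
}

int main(void)
{
    renderer_init();
    return 0;
}
```

Whatever the real entry point ends up looking like, the per-game work is on that order: query the display once, feed the values into the tone mapper.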
 
I only know what common sense and math dictate. There's absolutely no reason whatsoever it will significantly increase costs, not unless they decide to arbitrarily increase those costs. Do you have a citation for these cost increases you keep alluding to? Because it sounds like pure BS.

The API work you mention, to figure out the color space of the display, sounds like a really rough development cost. So much work to call a single function during initialization, since Windows doesn't currently support the feature. Probably a lot of dev cost sunk into the five minutes that takes to implement.


AMD has already stated FreeSync 2 is going to be higher cost than FreeSync. The amount is unknown, but if they are saying it, logic dictates it's going to be a decent amount; otherwise there would be no reason to mention it.
 
AMD has already stated FreeSync 2 is going to be higher cost than FreeSync. The amount is unknown, but if they are saying it, logic dictates it's going to be a decent amount; otherwise there would be no reason to mention it.
For the certification, or for the quality of the display, though? The implementation isn't changing from the status quo; the standard just requires better-performing hardware with LFC support, so those 30-60 Hz displays won't meet the spec. Monitors supporting HDR, and hopefully significantly higher than 75 Hz max refresh, will probably cost more than standard displays that only reach 60 Hz. That will be the case even without FS2 and the "higher cost" of the standard. The whole reason manufacturers went crazy with FreeSync in the first place was that it was a cheap way to provide new features. If you're making a display that already meets those requirements, supporting FS2 won't cost that much more.
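For reference on the LFC point: the usual rule of thumb is that the panel's maximum refresh has to be a healthy multiple of its minimum (commonly quoted as roughly 2x, sometimes 2.5x) so that low frame rates can be doubled back into the supported window. A quick illustrative check, with the 2.5x factor being my assumption rather than an official FreeSync 2 conformance number:

```c
/* Illustrative only: the 2.5x factor is a commonly quoted rule of thumb,
 * not an official FreeSync 2 conformance threshold. */
#include <stdbool.h>
#include <stdio.h>

static bool supports_lfc(double min_hz, double max_hz)
{
    /* Below min_hz the driver repeats frames (e.g. 25 fps shown at 50 Hz),
     * which only works if the doubled rate lands back inside the range. */
    return max_hz >= 2.5 * min_hz;
}

int main(void)
{
    printf("48-144 Hz: LFC %s\n", supports_lfc(48, 144) ? "yes" : "no"); /* yes */
    printf("40-75  Hz: LFC %s\n", supports_lfc(40, 75)  ? "yes" : "no"); /* no  */
    printf("30-60  Hz: LFC %s\n", supports_lfc(30, 60)  ? "yes" : "no"); /* no  */
    return 0;
}
```

Which is exactly why the cheap narrow-range panels are the ones a stricter FreeSync 2 spec would filter out.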
 
It doesn't matter what it's for; if they stated it's going to cost more, and they did, it's going to be a decent amount more. Just saying that is a negative selling point, so.......
 
As long as it's cheaper than G-Sync, it's still a selling point in favor of the AMD ecosystem.

But maybe now they need to start calling it NotSoFreeSync (sorry, had to go there; I'll let myself out now).
 
It doesn't matter what it's for; if they stated it's going to cost more, and they did, it's going to be a decent amount more. Just saying that is a negative selling point, so.......
Negative only in that 144 Hz HDR displays won't be widely available for $100 or at prices similar to standard 60 Hz panels. That only changes if they go for a royalty, which could be a rough sell, as the manufacturers would likely support FreeSync 2 without certification. I haven't seen anything on FS2 suggesting additional hardware or implementation costs that weren't already trivial.
 
Well, that we don't know yet; all we know is it's going to cost more ;)
 
Up to 8GB? That better be a wccftech joke since it implies a version with less.

Runs up to 4K, up to 120fps. Release up to H1 '17.

Up to

I can run a 40 mile marathon up to the one mile mark, then I may just have a kitkat
 
It doesn't matter what it's for; if they stated it's going to cost more, and they did, it's going to be a decent amount more. Just saying that is a negative selling point, so.......
At least they give you options.
 