Radeon RX Vega Discussion Thread

I'll argue with ya there, but Ryzen has its own problems lol. She didn't hype it by showing things but by not showing things, and also created hype by letting people believe that what they did show would translate over to everything else.

Raja is between a rock and a hard place. He has the leftover remnants of a dismantled ATi going against a goliath (nV) that is firing on all cylinders and doesn't look like it's slowing down any time soon. Can't really compare it to the CPU division, because Intel was just milking it. nV is not doing that; they are innovating faster and more than before (at least in performance per tier).

Ryzen is fine for the most part. If anyone expected it to be the new king, that's their own fault. Where it landed, I have absolutely no problem with it.
 


Well, that is AMD's ability to keep people in perspective, and no, it's not all fine in Ryzen land lol. What AMD showed is a part that can compete with Broadwell; unfortunately it only does that in the things they showed, nothing else. It doesn't do it in most workstation-type situations, for things like the Adobe suite or 3D modelling; it can only do what they showed in very specific applications (rendering and encoding). I'm not even going to get into gaming, we all know where that lands. Then the platform really holds it back for power workstation users. So what ya really have is a market for cheap, task-specific workstation builds and general users (who might as well not worry about the extra cores).

Was Ryzen really made only for people interested in rendering and encoding, and for general consumers who don't need more than 2 cores?

I don't think so, that wasn't how Ryzen was marketed.....

Again reality vs. marketing. Reality is what it is, marketing is just BS.
 

Wait, weren't you the one who said those workloads use the GPU? There is no point going down this road, it's never-ending, lol. So I guess we agree to disagree there. Damned if you do, damned if you don't. Some people just don't like giving AMD any credit for having a decent chip for once.

I don't even want to derail this thread any more by talking about Ryzen in a Vega thread, so I guess I will stop. There are plenty of other threads to beat up on Ryzen in.
 


Most of them can be GPU-driven for part of those workloads, not all of them, and it depends on the quality you are going for.

AMD doesn't have the market cap to get away with a mediocre chip, but that is what they have; it's not bad, it's not great either. It's OK at most tasks, but at those tasks Intel chips, whether 4-core or more, simply do better. Ryzen is very good in a few specific areas.

And if I'm rendering off my CPU, which means I'm going for the highest quality, I wouldn't even look at Ryzen. I would need tons of RAM, 128GB and up, more like 256GB for that, and the Ryzen boards launched so far don't handle that much, right?

The RAM alone will cost more than two times the price of the Ryzen chip, and two times that of an Intel chip of equal specs, so why would I bother saving 500 bucks if I can't get enough RAM into the system?

Back on topic, Vega. So far all we know is this: Doom, around 1080 performance; the OpenCL benchmark leaks don't put it in a good light either, putting it around 1080 performance when comparing other GCN products to Pascal products. And yes, we know the name is Vega, and we also know the reference cooler is white and red......

At least we know more about Vega so far than we knew about Fiji at this point, right?

The 11 SKUs are interesting. I don't even know how to make sense of that lol.
 

I don't know how true it is, but I have read in a few places on forums that the Vega ES sample has been shown running around 1200MHz in the latest demos. Idk how true that is, but if the final chip does run at 1500MHz or so boost, we may not have seen the full speed of the chip yet. Either way I am happy with the 1080 Mini I just grabbed for 460. We will see if Vega is a viable upgrade since my monitor is FreeSync. My 1080 averages over 2GHz boost and I have my monitor locked at 75Hz.
 
This is a classic example. You attack me instead of the real problem, the endless hype that can never deliver. You claim I hate AMD when I give a realistic view, when you just seem to be angry and need someone to blame for the failed hype once reality hits.

And let me know what you disagree with specifically in my post if you want to argue about it.

We have the AOTS leak and AMD demos.
NO! You and the others keep mentioning this so-called hype, which was in direct contrast to the majority of AMD posters who told you it wouldn't happen. NO ONE here mentioned these "expectations" other than you and the other AMD bashers, so that when the dust settled you could make it look like you were the rational ones, when in fact the majority of the hype ALWAYS comes from posters who make statements like this: if XXX is faster than my YYY then I will buy it. Then others like you start claiming that people are expecting XXX to be faster than YYY, when ON THIS SITE most were not expecting such an outcome and were telling you it wouldn't happen.

As FAR as the 480, most of us told you to expect 390X performance with far lower power usage. Being rational individuals and giving a range, the range would have been 390 (non-X) up to Nano or regular Fury, but not Fury X. And lo and behold, it landed right in the middle of that range, as most HERE ON THIS SITE STATED. Yet here you and so many others still parrot that you were the lone stand against the hype of fevered AMD fanbois. A damn fine example of fake news (and I hate that term, one more buzzword that tech has incorporated heavily over the last 20 years).
 

Yup. The overhype had nothing at all to do with Raja's presentation before it launched.

Remember 2x480 at $500 beating a 1080 at only 51% utilization?
 
https://www.techpowerup.com/reviews/ASUS/GTX_1080_Ti_Strix_OC/29.html

I find it interesting that the Fury X is now faster than the 980Ti at 4K - yet people called it a failed card - I guess that is true, since it did not make a bunch of money for AMD.

Anyway, the difference between the 1080 and Fury X is 29% at 4K (from a number of folks you would think it was 80%, but it is not; Nvidia is just more effective at getting people to believe their hype).

If Vega's boost clock is 1500MHz (the clock AMD normally keeps the GPU at if conditions are OK), it would be 1500/1050 ≈ 43% faster. If all the new improvements make that scale perfectly, then yes, Vega will be faster than the 1080 by a significant amount.

The 1080Ti is 82% faster than the Fury X, so unless AMD has some magic, Vega looks to be in between the 1080 and 1080Ti. Still, it all depends upon clock speed and architecture improvements. Just too many unknowns really: clock speed, which will be limited by power consumption and heat (a big die should be easier to cool, and the MI25 being passive tells a lot there), and power consumption, which we already know is 300W to hit that 12.5 TFLOPS for the Instinct MI25. I do not see AMD releasing a 300W Vega; probably a 250W Vega that OCs decently will be the result, so the user can take it up past 300W if they want to. Meaning real performance will have to be determined after launch and OC. Kinda like the Nano: slightly faster than a 980, rated at 175W, but it can perform at 980Ti/Fury X levels, except it will be pulling 250W+ when OCed with 50% PowerTune. So Vega could be rated at 225W, 1200MHz base / 1350MHz boost - but then you add 50% PowerTune (yep, AMD allows you to increase power limits by 50%!) and OC the card to way past 300W. The question is what the real max performance is against the max performance of a 1080 and 1080Ti.
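Just to put numbers on that, here's a quick back-of-the-envelope sketch (my own, not AMD's math; it assumes Vega 10 keeps Fiji's 4096 shaders and that performance scales linearly with clock, which it never quite does in games):

# Naive clock-scaling estimate - purely illustrative, real scaling is never perfect.
shaders = 4096        # Fiji shader count, assumed carried over to Vega 10
fiji_clock = 1050     # Fury X engine clock, MHz
vega_clock = 1500     # speculated Vega boost clock, MHz

fiji_tflops = 2 * shaders * fiji_clock / 1e6   # FMA = 2 FLOPs per shader per clock -> ~8.6
vega_tflops = 2 * shaders * vega_clock / 1e6   # -> ~12.3, close to the 12.5 MI25 figure

print(f"Fiji : {fiji_tflops:.1f} TFLOPS")
print(f"Vega : {vega_tflops:.1f} TFLOPS")
print(f"Clock-only uplift: {vega_clock / fiji_clock - 1:.0%}")   # ~43%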

It is all a waiting game now, which is getting very long in the tooth.
 
Yup. The overhype had nothing at all to do with Raja's presentation before it launched.

Remember 2x480 at $500 beating a 1080 at only 51% utilization?


Well, not sure why he even went that far; that wasn't overhype, that was just a faux pas.

And to Just Reason's posts: yeah, it was overhyped here too, before AMD stated they were going for midrange first. I remember quite a few people, mostly not so in tune with what was going on, who missed that AMD was not talking about performance but about perf/watt for Polaris. They took it as 2.5x the perf/watt, so same wattage, 2.5x the performance..... or 2.0x the perf/watt, and they were all talking about it being over the R9 290 parts, but the problem was AMD was not talking about over the R9 290; they were talking about over Tonga.
 
At this point I need GTX 1080 performance at or under a $400 price tag to make this interesting. Vega is so late that if it doesn't offer insane $/Performance I will just wait for Volta.


Although it will have to come in at around GTX 1080 pricing if its performance is just a tad higher. At 400 bucks I don't see it happening with HBM2, at least not with any margins that will help AMD.
 


This is the first card that is fully developed under Koduri, so there's really no way to judge performance based on the previous architectures. What has been revealed so far about the architecture is pretty interesting, especially considering what they appear to be doing with Infinity Fabric, Radeon Instinct, ROCm, Radeon Pro SSG, Naples and Vega's HBM2 and HBCC with 512TB virtual address space. It's pretty hard not to get hyped. :p Even more so considering Zen and Ryzen's perf and perf/watt.

But, "deny everything"
 
Oh but you can - :geek:

12.5/8.6 Vega FLOPS / Fiji FLOPS = 1.45 -> processing power is 45% higher.
If it performs the same as or very similar to Fiji in terms of performance per TFLOPS, then you have a 45% better performing card, except there is more than just increased clock speed here. Which, by the way, corresponds to going from the Fury X's 1050MHz to roughly 1500MHz.
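For what it's worth, you can back the implied clock straight out of that 12.5 TFLOPS figure (assuming Vega 10 keeps the 64 CU / 4096 shader layout, which is my assumption, not something confirmed for the consumer card): 12.5 TFLOPS / (2 FLOPs per shader per clock x 4096 shaders) ≈ 1526MHz, and 1526/1050 ≈ 1.45, which is where the "roughly 1500MHz" equivalence comes from.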

Now it is a matter of final clock speed, improvements that make it more efficient than Fiji or Pascal, and power consumed. I do think 225W is about right for the GPU; add in everything else and you're looking at 250W+.

Maybe AMD wants to give more headroom like the 7970, which was an overclocker from hell: clocked at 925MHz, yet mine would do over 1300MHz, and 1300/925 = 41% overclocked! Of course power went through the roof, but the card was able to handle it. Vega could be similar. My first GTX 1070 out of the box ran at 1900MHz or so; max OC was 2100MHz, and 2100/1900 = ~11%. What gets me is that people look at the rated boost clock, which is a totally meaningless number when the card actually operates at a higher speed all the time; they then actually OC it 10% but think they OC'd it 30% or some BS like that - that is hype that Nvidia sells (very effectively). That is why I am most interested in pedal-to-the-metal performance, damn the watts. Take it to the strip and full-throttle it; Nvidia stuff is pretty much full-throttled already, with 10-15% extra if you tap into it.
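To make that concrete, a tiny sketch with the numbers above (the 1683MHz is the 1070's official boost spec; the 1900/2100 are the clocks mentioned above):

# OC "headroom" depends entirely on which baseline you divide by.
rated_boost = 1683    # GTX 1070 reference boost spec, MHz
actual_clock = 1900   # what the card actually sustains out of the box
max_oc = 2100         # manual overclock

print(f"vs rated boost : {max_oc / rated_boost - 1:.0%}")   # ~25% - looks like a big OC
print(f"vs actual clock: {max_oc / actual_clock - 1:.0%}")  # ~11% - the real headroom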

Interesting that AMD put WattMan into their drivers, which to me means you will get a rated card with a lot of potential if you want to put some power into it. Kinda like having your cake (the lower power rating) and getting to eat it too (OC performance). AMD could show the power savings of their new line while also showing the OC performance (ignoring the power hike) against the competition. I would almost consider my Nano in that category - a 175W card by spec, yet through the drivers it can consume over 250W and perform well. In a nutshell the specs become meaningless - AMD power ratings, Nvidia base/boost clock speeds, etc.

Now why is Vega taking so long I would like to know.
 

You know I am all for Vega doing well. But this is the exact reason for the hype: you hyped Vega up as a hopeful overclocker from hell. Come on, let's keep it realistic. AMD hasn't had a fricking overclocker from hell for ages, and I doubt it's happening with Vega. The only reason being that they are so far behind Nvidia they will have no choice but to clock Vega probably 50-100MHz below max attainable clocks. Expect Vega not to be an overclocking monster, only because the probability of that is way higher than the alternative.
 
As FAR as the 480, most of us told you to expect 390X performance with far lower power usage. Being rational individuals and giving a range, the range would have been 390 (non-X) up to Nano or regular Fury, but not Fury X. And lo and behold, it landed right in the middle of that range, as most HERE ON THIS SITE STATED. Yet here you and so many others still parrot that you were the lone stand against the hype of fevered AMD fanbois. A damn fine example of fake news (and I hate that term, one more buzzword that tech has incorporated heavily over the last 20 years).

You keep forgetting a small detail about the RX 480.
AMD and Raja were talking about a "Premium VR experience". Did they or didn't they?
Well, after the reviews, we all now know the results of what was stated as a "Premium VR experience" ( https://www.hardocp.com/article/2016/11/17/amd_nvidia_gpu_vr_performance_google_earth/6 ):
at the bottom of the table (and far behind even its main competitor, the GTX 1060).
 
How? Ryzen did sort of back up the hype for the most part. I think Lisa is a way better presenter than Raja is. Raja just trolls you all the way. lol

I couldn't disagree more. Lisa Su is the bigger liar/hyper of the two. With RTG they more or less let people create the hype themselves. With the CPU division they just flat-out lie, like they have for over 10 years now.
 

And you just hate AMD. How could AMD possibly earn your love? Lol. It's okay, I don't think Lisa cares too much about your opinion.
 

From what we've seen with Ryzen, Polaris and Fiji I highly doubt Vega will be an overclocking monster. They all have something in common: Very little OC headroom.

I highly doubt AMD will leave much OC headroom on the table, but instead opt for as much performance out of the box as possible. Nobody gives a crap about 41% OC headroom, but if Vega running at, say, 95% of its maximum theoretical clock speed whoops a stock 1080, and matches its OCed performance, it's a completely different story.

There are 2 scenarios I can see:

1. 1500MHz is ~1080 performance, and there's at most 10% clock headroom left. No chance AMD will release full Vega at a lower clock just to claim "overclocker's dream", that's retarded. That'll mean typical performance is at 1070 level, which means 1070 pricing, and with HBM2 it's not gonna happen. They need to hit that 1080 price point to even justify selling Vega with HBM2.

2. 1200MHz is ~1080 performance, and Vega is capable of 1500MHz or a bit higher. No way in hell AMD wouldn't release Vega at 1500MHz if this is the case. If they can get to within 10-15% of GTX 1080Ti, they can suddenly sell for $600 and have much better margins. They can do it like the Fury series: Fury X would be chips that can hit 1500MHz, Fury would be chips that can't make 1500MHz, so they'll be released at, say, 1250MHz or 1300MHz, completely trouncing the 1080, for $500. Nvidia can't easily drop the price of the Ti to $600 to squeeze AMD, since that'll destroy their lower tiers pricing stack. If they drop 1080Ti to 650, AMD can still sell at 550 or 580 and make decent money. And the Ti dropping to $650 will cause everything below it to shift in price. And there's no need for Nvidia to do that.
 

Did they retest the 980Ti, for example, with the new driver, after they replaced games with Doom, Hitman, etc.?
Vega 10 has so far only been shown at 1000/1200MHz, from the OpenCL leak.
Vega 10 won't have a faster memory bus than Fury. It may even end up slower (409-512GB/sec), so this may imply a scaling penalty.
Fury's biggest failure is its VRAM limit, which it suffers badly from. Had it been an 8GB card as originally planned, it would have done much better.
OCing it may be another "overclocker's dream". AMD has liked to bin to the max since Fiji.
 

Maxwell->Pascal Titan is about 30% faster with 65% more processing power. Fiji->Vega is about 40% faster with at least 100% more processing power, thanks to FP16 usage. That's also why they don't use the P100 in the chart.
[Image: AMD-INSTINCT-VEGA-benchmarks-MI25.jpg]
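A quick note on the FP16 part, since it does a lot of work in that chart: the 25/12.5 TFLOPS figures are AMD's published MI25 numbers, and Fiji runs FP16 at the same rate as FP32, so on an FP16 workload the paper gap is roughly 25 / 8.6 ≈ 2.9x, even if real results land far lower. That's also the point about leaving the P100 (which does double-rate FP16 as well) out of the comparison.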


But we know from the OpenCL test that AMD has so far only managed to reach 1000/1200MHz.

[Image: AMD-Vega-10-Boost-Clock-and-64-CUs.png]
 
Don't forget there is a power ceiling that is viable for a graphics card, besides needing to be less than 300W. That may preclude full performance, and that was my point: a 250W+ card that can OC well. You can't clock the hell out of it to match a 1080Ti if the power would be 350W. For example, AMD rates it at 225W, brags about that aspect, then turns around and talks about OCing and how easy it is in the drivers (which Nvidia does not have) - it gets cranked up and toasts a 1080Ti, with no mention that it is pulling 350W+.

Of course, if it tops out under 300W they will probably give it a lower rated clock speed so they can actually make and sell them. Just too many unknowns.
 
I would not count them as very reliable, but how could AMD rate the MI25 at 12.5 TFLOPS if it's running at 1200MHz? Boost speed?
 
Seriously, I get your disdain, but you really need to understand marketing and how it works. "Premium" is a label attached to a multitude of products, and I guarantee most are not.
 
Did they retest the 980Ti, for example, with the new driver, after they replaced games with Doom, Hitman, etc.?
Vega 10 has so far only been shown at 1000/1200MHz, from the OpenCL leak.
Vega 10 won't have a faster memory bus than Fury. It may even end up slower (409-512GB/sec), so this may imply a scaling penalty.
Fury's biggest failure is its VRAM limit, which it suffers badly from. Had it been an 8GB card as originally planned, it would have done much better.
OCing it may be another "overclocker's dream". AMD has liked to bin to the max since Fiji.
PROOF? Stating it doesn't make it so, seeing as the Fury X generally does quite well at 4K.
 
Seriously, I get your disdain, but you really need to understand marketing and how it works. "Premium" is a label attached to a multitude of products, and I guarantee most are not.

So, since you realise that, can you tell me if these kinds of tactics can be considered hype from AMD's side?
(That's what I've been saying all this time in most of my posts!!)
 

They slapped an AIO on the Fury X, so we have precedent of them watercooling to lower TDP from 300W to 250W.
 
I would not count them as very reliable, but how could AMD rate the MI25 at 12.5 TFLOPS if it's running at 1200MHz? Boost speed?
Check out AMD's last workstation cards. They do not come close to their "rated" FLOPs there either.

Can Vega hit 1500MHz OCed? Probably. Will it do it without turning into a FurMarked Fiji furnace? No.
Fury's biggest failure is its VRAM limit, which it suffers badly from. Had it been an 8GB card as originally planned, it would have done much better.
Please, you are smarter than that. Fiji's biggest failure is an inadequate front-end for its raw compute power. That's it.
 

It's more nuanced than that, because a lot of it is using old benchmark data.
Try looking at recent games tested at PCGamesHardware, where they use custom AIB models from both AMD and Nvidia for testing launched games.
And let's be honest, neither of those cards is truly a 4K card these days but rather 1440p; even the GTX 1080 is pushed to its limits to do 4K nicely.
So at 1440p:
Mass Effect Andromeda: 35% for 980Ti, at 4K 27% for 980Ti
Ghost Recon: 31% for 980Ti, at 4K 22% for 980Ti
For Honor: 19% for 980Ti, at 4K 21% for 980Ti
Sniper Elite 4: 12% for 980Ti, at 4K equal
Resident Evil 7: 1% for 980Ti, at 4K 34% for 980Ti (probably down to the driver's dynamic caching needing work in this game on the Fury X, as the 390X is faster than it)
Watch Dogs 2: 36% for 980Ti, at 4K 33% for 980Ti
CoD Infinite Warfare: 26% for 980Ti, at 4K 16% for 980Ti - game very well optimised for Polaris
Titanfall 2: 7% for 980Ti, at 4K 8% for 980Ti
Shadow Warrior 2: 19% for 980Ti, at 4K 9% for 980Ti
Battlefield 1: 19% for 980Ti, at 4K 19% for 980Ti - looked at their DX11 results for the 980Ti and DX12 for the Fury X

Cheers
 

And in my opinion there is another parameter: the Fury X is AMD's only high-end GPU, so all their driver updates are focused on improving this GPU. Nvidia, on the other hand, has in the meantime released an entire new lineup after the 980Ti, so it's only logical that the new drivers are mainly focused on this new lineup.
 
Did they retest the 980Ti, for example, with the new driver, after they replaced games with Doom, Hitman, etc.?
Vega 10 has so far only been shown at 1000/1200MHz, from the OpenCL leak.
Vega 10 won't have a faster memory bus than Fury. It may even end up slower (409-512GB/sec), so this may imply a scaling penalty.
Fury's biggest failure is its VRAM limit, which it suffers badly from. Had it been an 8GB card as originally planned, it would have done much better.
OCing it may be another "overclocker's dream". AMD has liked to bin to the max since Fiji.
That is exactly right; the question becomes whether the data being presented accurately reflects current performance, and if not, it should be thrown out. I see many use old data without realizing that it could be off by a large degree. For example, HardOCP looked at drivers over time, particularly for the RX 480, which had a huge increase - that alone makes all the previous benchmarks obsolete. So the newest results are probably the most accurate, but even they become outdated rather quickly.

Another consideration: the 980Ti was a good overclocker, and while the Fury X may have caught up at stock, it has no way to exceed the max performance capability of the 980Ti, except maybe in some DX12 games like Sniper Elite or upcoming ones.

I have no idea how much headroom Vega will have; it being rather big suggests maybe it will be easier to keep cool and would be able to take the power if need be. All speculation.
 
Take it as best case; AMD has never shown anything else. So......
Really sounds like it will be more like 1300MHz with a boost, which also comes down to overall power, with a max of 300W unless one OCs, which isn't part of the rating. I can't see Vega being more than 275W total. Will be interesting.
 
NO! You and the others keep mentioning this so-called hype, which was in direct contrast to the majority of AMD posters who told you it wouldn't happen. NO ONE here mentioned these "expectations" other than you and the other AMD bashers, so that when the dust settled you could make it look like you were the rational ones, when in fact the majority of the hype ALWAYS comes from posters who make statements like this: if XXX is faster than my YYY then I will buy it. Then others like you start claiming that people are expecting XXX to be faster than YYY, when ON THIS SITE most were not expecting such an outcome and were telling you it wouldn't happen.

As FAR as the 480, most of us told you to expect 390X performance with far lower power usage. Being rational individuals and giving a range, the range would have been 390 (non-X) up to Nano or regular Fury, but not Fury X. And lo and behold, it landed right in the middle of that range, as most HERE ON THIS SITE STATED. Yet here you and so many others still parrot that you were the lone stand against the hype of fevered AMD fanbois. A damn fine example of fake news (and I hate that term, one more buzzword that tech has incorporated heavily over the last 20 years).

This is absolutely untrue.

You were considered a hater if you believed Polaris had the performance of an R9 390, as this was the absolute worst-case expectation of performance. The original thread got deleted because there was so much fighting and heat.

The general consensus was GTX 980 Ti and Fury X performance or a tad below. Go to other forums where their threads still exist, i.e. pre-May 2016, with thread starts in Dec 2015/Jan 2016. The most optimistic were 10-20 percent beyond the GTX 980 Ti.

And much of this was due to fans extrapolating performance from the most optimistic scenario. Don't you remember the Hitman 60 fps preview, where people claimed it was at max settings and V-synced, or people extrapolating from the 2.5x performance-per-watt claims relative to the Fury X, or the Battlefront demonstration that showed tremendous gains in performance per watt being extended to 100-watt cards with Fury X performance?

It was only a few weeks before the actual release of Polaris that people realized they had to vastly lower their expectations.
 


Yep, I remember; I even thought P10 was going to have Nano-level performance. And I remember stating that to hit Fiji-level performance it was going to need 150 watts or higher.
 


I think it's going to be around 250 watts stock.
 
I didn't expect much as soon as I found out it was 2304 shaders. That just told me about 390/390X performance, best case. If people expected it to deliver the world after that, it's their own fault.

Although I just saw reviews of the new Mass Effect game and it's up there with the Fury X and about 10-20% faster than the GTX 1060. Whatever the hell that means lol.
 


Hmm didn't know the shader count till about a month before it was released ;)

And not sure which review, because I haven't seen that yet; for the most part the 1060 has a lead, marginal but a lead. The reason it's up there with the Fury X in some of the reviews is the 4GB on the Fury X.
 

No, I know what you mean, but in that game it was across every resolution. It's the PC Gamer website that had the review up for that game.
 
And you just hate AMD. How could AMD possibly earn your love? Lol. It's okay, I don't think Lisa cares too much about your opinion.

You know, you guys keep replying to him and I have to keep seeing his lame posts; just put him on ignore and he can sit there and tell himself how much he hates AMD. Hype is always irrelevant; some people expect miracles and others expect failures no matter what. Marketing has one job, and that's to make you think you need this product, so they are only going to show you the strong points and overlook any weak points. Marketing 101: you don't lie, you just stretch it.
 