Vega Rumors

Eventually AMD will catch up to NVIDIA. It's inevitable, because you can only keep shrinking silicon so far and we're pretty close to the limit now. Once they hit a lithographic limit, I wonder what NVIDIA will do to keep increasing GPU speed beyond what AMD can do? Obviously NVIDIA using their new MCM technology with minimal performance hit will be a way to use multiple dies on a GPU, and AMD will have a GPU version of Infinity Fabric, but even then I suspect the performance difference between the two won't be as large as it is today.
 
If they keep going like they are now, RTG might not be there by the time MCM or Infinity Fabric comes to gaming cards ;), the earliest I see it happening is 2020, that is 3 years or 2 gens from now. So if Navi comes out and disappoints, we can say goodbye RTG, cause AMD will be looking at how to fix RTG up.
 
If they keep going like they are now, RTG might not be there by the time MCM or Infinity Fabric comes to gaming cards ;), the earliest I see it happening is 2020, that is 3 years or 2 gens from now. So if Navi comes out and disappoints, we can say goodbye RTG, cause AMD will be looking at how to fix RTG up.

Yeah they really need to get their power consumption under control. I think if Navi can do that, then they will survive long enough for their GPU Infinity Fabric to show up, and that's when things will really get interesting.
 
I know that Raja was not super involved with the overall design of Vega from the beginning, but it's not like this is a hands-off process once a GPU design is decided on. The actual GPU tapeout is when things are set in stone, which is usually no more than a year from launch. Even then, RAM configurations, power delivery, binning parameters, firmware and software... it's all stuff Raja has control over.
 
Yeah they really need to get their power consumption under control. I think if Navi can do that, then they will survive long enough for their GPU Infinity Fabric to show up, and that's when things will really get interesting.

Pretty much that is the only problem AMD has. If they can control voltage, that will drop power consumption; now, that isn't easy though. nV has been doing it with many different things outside of the rasterizer. The rasterizer is part of it, but not all of it. The hand-laid transistors should save them 20% of power usage at least.
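As a rough back-of-envelope, the reason voltage control matters so much is that dynamic power scales with the square of voltage (the classic P ≈ C·V²·f approximation). A minimal sketch with purely made-up numbers, not real Vega figures:

```python
def dynamic_power(c_eff, voltage, freq_hz):
    # Classic CMOS dynamic-power approximation: P ~ C * V^2 * f
    return c_eff * voltage ** 2 * freq_hz

# Illustrative values only (capacitance normalized to 1.0):
stock = dynamic_power(1.0, 1.20, 1.6e9)        # stock voltage
undervolted = dynamic_power(1.0, 1.05, 1.6e9)  # same clock, lower voltage
savings = 1 - undervolted / stock              # ~23% less dynamic power
```

So even a modest undervolt at the same clock cuts dynamic power by nearly a quarter, which is the same ballpark as the 20% figure above.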

I know that Raja was not super involved with the overall design of Vega from the beginning, but it's not like this is a hands-off process once a GPU design is decided on. The actual GPU tapeout is when things are set in stone, which is usually no more than a year from launch. Even then, RAM configurations, power delivery, binning parameters, firmware and software... it's all stuff Raja has control over.

Well he will get blamed because he is the lead, but it will be unfair for him to get blamed for some things he had no control over.
 
Pretty much that is the only problem AMD has. If they can control voltage, that will drop power consumption; now, that isn't easy though. nV has been doing it with many different things outside of the rasterizer. The rasterizer is part of it, but not all of it. The hand-laid transistors should save them 20% of power usage at least.



Well he will get blamed because he is the lead, but it will be unfair for him to get blamed for some things he had no control over.


Saying that a guy who was in charge two years before tapeout has "no control over" the final design is silly. Is the initial idea his conception? No. But this has a lot of Raja's blood, sweat and tears and a lot of his ideas went into the final design.
 
True but when your hands are tied by decisions and contracts made in the past, there is only so much design one can do :)

Pretty much. Raja had directives based on things he had no control over; when things like that happen, he would have needed to improvise, with his options limited by things he might not have wanted to do to begin with. Compromises had to be made, and those compromises do fall on his head. He made those choices.
 
Vega is going to be an EPYC disaster.

In fact, this would be the biggest disaster in the history of NVIDIA and ATI.

I doubt anything will ever top the Nvidia FX series. Now that was a disaster. I cannot find an article or image for the life of me, but I do remember that it got so bad, the FX 5900 series & higher came with around 6 free PC games & another disc that was full of demos. Not sure if it was limited to the 5900 cards, but I do recall seeing the promotion at CompUSA for an FX 5900 XT :ROFLMAO:. Plenty of images & articles floating around showing the cards coming with at least three full PC games though.
 
"Lastly, AMD reps told the public that the AMD system has a $300 US difference which means two things. First, AMD puts the average price of a FreeSync monitor compared to a G-Sync monitor as $200 US cheaper. Then if we take that $200 out of the $300 from what AMD told, it means that the Radeon RX Vega can be as much as $100 US cheaper than the GeForce GTX 1080 at launch which should be a good deal but they haven’t told anything on things aside from that like performance numbers in other titles, power consumption figures and most importantly, what clocks was Vega running at which seems a little sad."

Looks like Vega will be $400 now. I think we might be talking about Vega XT with fewer shaders? Because full Vega at $400 doesn't seem realistic to me.
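The arithmetic AMD is implying in that quote works out like this; the GTX 1080 price here is an assumption for illustration, not a figure AMD gave:

```python
# Figures from AMD's quoted claim, except the GTX 1080 street price,
# which is an assumed number for illustration only.
system_gap = 300      # AMD's stated total system price difference (USD)
monitor_gap = 200     # typical FreeSync vs G-Sync monitor price gap (USD)
gtx_1080_price = 500  # assumed GTX 1080 price at the time (USD)

gpu_gap = system_gap - monitor_gap            # portion attributable to the GPU
implied_vega_price = gtx_1080_price - gpu_gap
print(implied_vega_price)  # 400
```

Which is where the $400 figure comes from, if you take AMD's $300 system gap at face value.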
 
"Lastly, AMD reps told the public that the AMD system has a $300 US difference which means two things. First, AMD puts the average price of a FreeSync monitor compared to a G-Sync monitor as $200 US cheaper. Then if we take that $200 out of the $300 from what AMD told, it means that the Radeon RX Vega can be as much as $100 US cheaper than the GeForce GTX 1080 at launch which should be a good deal but they haven’t told anything on things aside from that like performance numbers in other titles, power consumption figures and most importantly, what clocks was Vega running at which seems a little sad."

Looks like Vega will be $400 now. I think we might be talking about Vega XT with fewer shaders? Because full Vega at $400 doesn't seem realistic to me.

Well all Nvidia has to do in that case is lower the price of the 1080 GTX. Hell, they could even lower it to $449 and it would still be a better buy just because of power/heat/overclocking performance.

I really want RTG to thrive and stay competitive, but when they show Vega against a 1080 GTX, well that tells you right there what kind of performance it will have. Then add in power consumption... well yeah, it's starting to look bad.

One thing that might save Vega is it will only be using 8GB of HBM2; that might save them some over Vega FE, but I have no idea how much.
 
Eventually AMD will catch up to NVIDIA. It's inevitable, because you can only keep shrinking silicon so far and we're pretty close to the limit now. Once they hit a lithographic limit, I wonder what NVIDIA will do to keep increasing GPU speed beyond what AMD can do? Obviously NVIDIA using their new MCM technology with minimal performance hit will be a way to use multiple dies on a GPU, and AMD will have a GPU version of Infinity Fabric, but even then I suspect the performance difference between the two won't be as large as it is today.
I mean, the trick here is that nV presently makes far better use of the same lithography; just compare the 1050 Ti and anything Polaris in the power-efficiency department.

So, for AMD to catch up they just have to pull off something akin to Kepler->Maxwell jump.
 
I doubt that AMD will ever catch up, what with the huge discrepancy in R&D.

Sure, silicon is the limitation, but software is the even bigger limitation. There might be better, mind-blowing rendering techniques out there that still haven't been discovered/researched/commercialized, ones that could blow current implementations out of the water on fidelity and performance but would require a massive redesign.

Without the cash to experiment, AMD's path is very linear. They can only follow.
 
The only chance of RTG catching up with Nvidia would be if the latter stumbles badly, but even that is not likely to happen since NV has shown itself to be a company that learns from its mistakes. I guess what RTG really needs is to start from scratch, go back to basics and focus on what is really important, like AMD and Jim Keller did with Zen, Intel with Conroe and Nvidia with Kepler.
 
The only chance of RTG catching up with Nvidia would be if the latter stumbles badly, but even that is not likely to happen since NV has shown itself to be a company that learns from its mistakes. I guess what RTG really needs is to start from scratch, go back to basics and focus on what is really important, like AMD and Jim Keller did with Zen, Intel with Conroe and Nvidia with Kepler.


Well to do that, it will take them 2 generations. So let's say Vega was the brick wall; Navi and the chip after that will be mediocre. The same thing happened with the FX series: the 5800 was bad, the 5900 was bad, and the 6800 was decent but still just caught up to ATI, did not surpass. It takes time to do this, and that was when nV was flush with cash; AMD and RTG on the other hand have a large debt and at the moment no cash coming in.

Edit: this is the same thing that happened with the R600: R600 bad, RV670 not so good, RV770 looked much better, but that was because nV stuck with the Tesla architecture for 3 generations; the 2xx series wasn't that impressive.

So minimum 2 gens of mediocrity inclusive of a bad product launch, typically 3 gens.
 
Well to do that, it will take them 2 generations. So let's say Vega was the brick wall; Navi and the chip after that will be mediocre. The same thing happened with the FX series: the 5800 was bad, the 5900 was bad, and the 6800 was decent but still just caught up to ATI, did not surpass. It takes time to do this, and that was when nV was flush with cash; AMD and RTG on the other hand have a large debt and at the moment no cash coming in.

Edit: this is the same thing that happened with the R600: R600 bad, RV670 not so good, RV770 looked much better, but that was because nV stuck with the Tesla architecture for 3 generations; the 2xx series wasn't that impressive.

So minimum 2 gens of mediocrity inclusive of a bad product launch, typically 3 gens.

But, like you said, this was with money. AMD has no money.
 
Well all Nvidia has to do in that case is lower the price of the 1080 GTX. Hell, they could even lower it to $449 and it would still be a better buy just because of power/heat/overclocking performance.

I really want RTG to thrive and stay competitive, but when they show Vega against a 1080 GTX, well that tells you right there what kind of performance it will have. Then add in power consumption... well yeah, it's starting to look bad.

One thing that might save Vega is it will only be using 8GB of HBM2; that might save them some over Vega FE, but I have no idea how much.
NV doesn't generally adjust prices of their current lineup based on anything AMD releases.
 
I think they should just release the card to reviewers and retailers without any official event. Raja is terrible on stage and will only add more laughter to an already hilarious launch.
Release the card and start talking up Navi, like Vega was just a preview of what's to come. Spin, spin, spin.
 
I think they should just release the card to reviewers and retailers without any official event. Raja is terrible on stage and will only add more laughter to an already hilarious launch.
Release the card and start talking up Navi, like Vega was just a preview of what's to come. Spin, spin, spin.

That didn't work out so well with Polaris... it wasn't even launched and they were already talking about Vega, but now Vega is a turd and they can't apply the same trick this time.
 
That didn't work out so well with Polaris... it wasn't even launched and they were already talking about Vega, but now Vega is a turd and they can't apply the same trick this time.
They can't claim the speed or efficiency crown. There's nothing to talk about at the event.
Variable refresh rate monitors, VR, AI etc. are all better on Nvidia as well. It will be a display of incompetence.
 
They can't claim the speed or efficiency crown. There's nothing to talk about at the event.
Variable refresh rate monitors, VR, AI etc. are all better on Nvidia as well. It will be a display of incompetence.

Yet they will still claim that Vega will be the most advanced and successful GPU on the planet, with unrivalled VR performance, advanced and futuristic features, the best performance/efficiency/dollar ratio, etc. You know, as always.
 
If you have a massive dick, you just whip it out and get laid. If you have a tiny dick, you talk big game and say how good you are and wait for the lights to go out before going full commando.

Same goes with advertising. If you have the better product, you just release it.
Say you do have a massive one, how exactly do you whip it out? I have always had a problem with that. Because until you are naked, nobody knows....
 
The thing is, let's say AMD didn't push Vega to the max with clocks and just went after, say, 1070 performance levels; they might have gotten to that fairly easily with 200 watts. And I think it would do just fine for sales. The 1070 is the big money maker for nV, and it has the bulk of all sales (volume) and margins..... That would be the card to go for, not the 1080 or 1080 Ti, especially when pushing voltage and power usage to the max; it's not worth it.
Absolutely agree. And, psychologically, they would be competing one level up from the 1060/rx480. But, that assumes they can achieve volumes. Could it be that they are having issues with that?
 
But, by the sounds of it, Vega is an expensive card to manufacture. So they would have competed on performance with the 1070, but more than likely cost the same as or more than the 1080. I think they are between a rock and a hard place when it comes to Vega.
That would not be something new to them. AMD margins are a fraction of NVIDIA's.
 
They can't claim the speed or efficiency crown. There's nothing to talk about at the event.
Variable refresh rate monitors, VR, AI etc. are all better on Nvidia as well. It will be a display of incompetence.
Do you have any evidence to back that up? To the best of my knowledge nobody outside of AMD has an unlocked card to gauge actual performance beyond possibly a few pro benchmarks with a different driver and compiler tool chain. No guarantee those are fully enabled either. All we know is that AMD has stated people are underestimating the card and Titan is the competition.

Yet they will still claim that Vega will be the most advanced and successful GPU on the planet, with unrivalled VR performance, advanced and futuristic features, the best performance/efficiency/dollar ratio, etc. You know, as always.
Well it is more advanced than Pascal, possibly Volta, based on DirectX features, so being more "advanced" would be factually accurate. Performance and power efficiency remain to be seen as drivers can make a huge difference there.
 
Absolutely agree. And, psychologically, they would be competing one level up from the 1060/rx480. But, that assumes they can achieve volumes. Could it be that they are having issues with that?

Well that's the thing, business is war: if you can't outright beat your opponent, you need tactics that will hurt them and in the long run give you some type of advantage. Yet AMD keeps doing the same things over and over again.... I don't know if it's an inability to change a dire situation, or bad management, or a mix of both. It just seems logical when we see what the end results are from our point of view, but why doesn't AMD see it?

Oddly enough they did it with Polaris, but they aren't going to even try it with Vega? Then we have the oddball RX 580, which is pushed way too far too?
 
Do you have any evidence to back that up? To the best of my knowledge nobody outside of AMD has an unlocked card to gauge actual performance beyond possibly a few pro benchmarks with a different driver and compiler tool chain. No guarantee those are fully enabled either. All we know is that AMD has stated people are underestimating the card and Titan is the competition.


Well it is more advanced than Pascal, possibly Volta, based on DirectX features, so being more "advanced" would be factually accurate. Performance and power efficiency remain to be seen as drivers can make a huge difference there.


NOPE to ALL of that.

Oh btw, why do you think at the Budapest event they put a 1080 and Vega head to head in a blind test? You think they don't show the best of the best and are sandbagging lol? When was the last time you saw AMD do anything of the sort? Especially after so much bad press about gaming performance with Vega FE?
 
I believe in Vega's advanced feature list as much as I trust the 970's 4GB of VRAM.

In other words, irrelevant.
 
Yet they will still claim that Vega will be the most advanced and successful GPU on the planet, with unrivalled VR performance, advanced and futuristic features, the best performance/efficiency/dollar ratio, etc. You know, as always.
"Lastly, AMD reps told the public that the AMD system has a $300 US difference which means two things. First, AMD puts the average price of a FreeSync monitor compared to a G-Sync monitor as $200 US cheaper. Then if we take that $200 out of the $300 from what AMD told, it means that the Radeon RX Vega can be as much as $100 US cheaper than the GeForce GTX 1080 at launch which should be a good deal but they haven’t told anything on things aside from that like performance numbers in other titles, power consumption figures and most importantly, what clocks was Vega running at which seems a little sad."

Looks like Vega will be $400 now. I think we might be talking about Vega XT with fewer shaders? Because full Vega at $400 doesn't seem realistic to me.

Or AMD is comparing it to a $600 aftermarket 1080, which puts Vega at $500. I'd be floored if AMD sells Vega at $400. Regardless, it's going to be the shortest-run big GPU ever released. They can't make their money back at $400, and they won't be able to sell at $500. Quite a conundrum.
 
Or AMD is comparing it to a $600 aftermarket 1080, which puts Vega at $500. I'd be floored if AMD sells Vega at $400. Regardless, it's going to be the shortest-run big GPU ever released. They can't make their money back at $400, and they won't be able to sell at $500. Quite a conundrum.

If AMD is doing well on the CPU side with Ryzen, those are going to be most of the profit for them. Heck, if they break even on the GPU side for now, that's okay. The way they are pumping out Ryzen CPUs with amazing yields, they are going to be making some damn good profits even pricing less than Intel. That's where most of the profit is going to end up. If they can sell all they make for $400 for top-end Vega, they will be just fine. Now by top end I mean the air-cooled version. The water-cooled version will likely be $100 more than the air-cooled Vega.
 
If AMD is doing well on the CPU side with Ryzen, those are going to be most of the profit for them. Heck, if they break even on the GPU side for now, that's okay. The way they are pumping out Ryzen CPUs with amazing yields, they are going to be making some damn good profits even pricing less than Intel. That's where most of the profit is going to end up. If they can sell all they make for $400 for top-end Vega, they will be just fine. Now by top end I mean the air-cooled version. The water-cooled version will likely be $100 more than the air-cooled Vega.


They aren't breaking even on the GPU side (semi-custom); that's why they are still making a $150 million loss even after almost a full first quarter of Ryzen sales.

When you have management unsure of giving numbers for future quarters, that doesn't sound like they are confident about when they will return to profitability either.
 
[citation needed]

You are in denial, bro.
https://pro.radeon.com/en-us/product/radeon-vega-frontier-edition/

Ok citation provided. Right on the product page if you scroll down a bit. Has higher theoretical numbers as well.

I don't seem to be the one in denial here.

NOPE to ALL of that.
It's been shown by every reviewer that has attempted gaming on FE so why NOPE? I don't see why you're outright rejecting literally ALL the evidence.

Oh btw, why do you think at the Budapest event they put a 1080 and Vega head to head in a blind test? You think they don't show the best of the best and are sandbagging lol? When was the last time you saw AMD do anything of the sort? Especially after so much bad press about gaming performance with Vega FE?
So they had performance counters provided to gauge performance? Looks like they were just comparing FreeSync to GSync to see if anyone could tell the difference. If Vega performance was the basis for that test, they'd have card specs up on a wall as to not invalidate a blind test. The cost difference they were advertising is largely the monitor difference.

If the drivers are still a WIP, the marketing department wouldn't have valid numbers to show. AMD hasn't provided any gaming performance numbers yet, just some pro apps that use a different compiler. All the graphics testing has basically shown falling back to Fiji paths with none of the new features online. If they were, we'd be seeing tech demos of primitive shaders and other effects. Little point in delaying a month if you don't expect the product to improve in a meaningful way.

Even for marketing, it would make sense to set expectations low to turn the inevitable narrative of all Nvidia's shills upside down.

I'm still unsure FE is the largest Vega they will release. AMD seems to have all the tech they need in place to push MCMs already. HBCC should solve all the annoying sync issues with AFR and the ACEs the dependencies easily enough. They have Infinity just like with Epyc to provide bandwidth. Might also be that they want Threadripper as a backplane to really push that capability.
 
https://pro.radeon.com/en-us/product/radeon-vega-frontier-edition/

Ok citation provided. Right on the product page if you scroll down a bit. Has higher theoretical numbers as well.

I don't seem to be the one in denial here.


It's been shown by every reviewer that has attempted gaming on FE so why NOPE? I don't see why you're outright rejecting literally ALL the evidence.


So they had performance counters provided to gauge performance? Looks like they were just comparing FreeSync to GSync to see if anyone could tell the difference. If Vega performance was the basis for that test, they'd have card specs up on a wall as to not invalidate a blind test. The cost difference they were advertising is largely the monitor difference.

If the drivers are still a WIP, the marketing department wouldn't have valid numbers to show. AMD hasn't provided any gaming performance numbers yet, just some pro apps that use a different compiler. All the graphics testing has basically shown falling back to Fiji paths with none of the new features online. If they were, we'd be seeing tech demos of primitive shaders and other effects. Little point in delaying a month if you don't expect the product to improve in a meaningful way.

Even for marketing, it would make sense to set expectations low to turn the inevitable narrative of all Nvidia's shills upside down.

I'm still unsure FE is the largest Vega they will release. AMD seems to have all the tech they need in place to push MCMs already. HBCC should solve all the annoying sync issues with AFR and the ACEs the dependencies easily enough. They have Infinity just like with Epyc to provide bandwidth. Might also be that they want Threadripper as a backplane to really push that capability.


Look man, in the history of graphics card launches, please tell me when anything you have stated has been even REMOTELY TRUE?

You seem to have missed my post about Fermi, the wood-screw extravaganza. 6 months prior to the Fermi launch nV showed off a mock-up with wood screws, right? So if nV didn't have silicon ready at that point and they were able to get drivers ready in 6 months for an ENTIRELY new architecture (not a modified GCN architecture like Vega), their driver team must be so much better than AMD's is right now with Vega, right?

That's the problem: that is not how driver development works (drivers are not done after the hardware is done). Driver development and hardware development go hand in hand prior to tapeout; all functionality is done in drivers by the time the GPU is taped out. Well, that means with Vega FE either there is a huge problem with the silicon where parts are not functional, or that's the best they've got! AMD had 1 year from tapeout till now (actually more than one year) to do minor bug fixes and optimizations. So in 1 year they are showing the same performance in the games they showed off 6 months ago: DOOM, BF1, Star Wars Battlefront.......

Don't beat around the bush; the problem is those two points. As I stated before, if drivers weren't ready a month ago AMD is totally screwed with this GPU, cause whatever functionality is fubar is going to stay that way.

You are unsure FE is the biggest Vega to be released, what? How much bigger can they make it? Will they need a nuclear power plant to power it? Come on man, that makes no sense.

FE was already stated to be the fastest compute card on the market, so where do you expect the extra ALUs to come from?

If you are not in denial, that is a pretty scary position you are in ;)

And no, they don't have all the tech in place to push MCM; we talked about this. GPU 1's control silicon doesn't know what GPU 2's control silicon does right now. They just don't have a direct communication link yet, and the interconnect isn't fast enough.
 
Shouldn't this thing be out by now? It's the last week of July.

First of August I think, but you'll have to wait until January 2018 to get one due to manufacturing shortages & cryptominers. Oh, it's going to be a bloodbath. *Grabs popcorn*.
 
All we know is that AMD has stated people are underestimating the card and Titan is the competition
Where did they say that? They NEVER said that. They just picked a Titan because they can win in a couple of pro benches against it, because it lacked pro drivers. They didn't even have the guts to compare to a Quadro P5000, which manhandles Vega FE.
Well it is more advanced than Pascal, possibly Volta, based on DirectX features, so being more "advanced" would be factually accurate. Performance and power efficiency remain to be seen as drivers can make a huge difference there.
Yeah, advanced with 500W power consumption!
Looks like they were just comparing FreeSync to GSync to see if anyone could tell the difference.
A stupid and unneeded comparison, as you very well know! Yeah, let's do research on G-Sync and FreeSync right before Vega launches! What a brilliant idea. Wake up man, they only do this because they have nothing better to talk about for Vega. This is their only trump card.

You know the last time AMD did something like that? The FX series.



All the graphics testing has basically shown falling back to Fiji paths with none of the new features online
Or they failed badly, and the new features are a bust. When was the last time you saw a driver boost performance and reduce power consumption? Disneyland?
I'm still unsure FE is the largest Vega they will release.
Yeah now you are day dreaming!

AMD knows they have a bad product on their hands. They released videos of Threadripper performance TWO WEEKS before launch; they never did the same for RX Vega, because it sucks! Plain and simple!
 