Vega Rumors

Well, it could make sense in that the temps of these RAM chips are really at the limit, so the 8GB card might run cooler? Still hard to imagine, though.
 
That still doesn't make sense: the card with the least memory having the best performance? I don't buy that for a second.

As long as it's 8GB, that's perfect for a 1080 performance level.

It all comes down to price...
 
Don't know, man, just wondering how they are going to get more clocks out of the RX version if it's already pretty much power-limited by the PCIe specs.


I don't think they will be getting more clocks out of it. In fact, I don't think that is where the performance will come from. I think the RX was supposed to be 1500 MHz and the FE was supposed to be the best compute card with the highest clocks. But hell, who really knows; Vega has been a mess of information. It's like AMD and Intel are having a competition over who can have the crappiest launch of a new product, X299 or Vega.

I know you think I am crazy, but I think most of the performance will come from drivers they will release that unlock performance in the card. I just don't think they put out the gaming drivers like they said, as people have found almost no difference in performance between them, and that would be really odd based on what we know from Nvidia's drivers. I don't expect the FE Vega gaming side to improve, as they will leave it locked to pro development and thus a small niche product that won't eat up their limited supply of HBM2.

Not too much longer to see if my thought is right or wrong; I'm not going to be sad one way or the other about it. I really hope it ends up between a 1080 and a 1080 Ti when all is said and done; after that, it's all up to the price. It would have been nice if it beat a 1080 Ti, but from the looks of things that won't be happening, and honestly, that was a bit of a stretch.
 
Never thought you were crazy, man, just a bit too excited about the possibilities before. That was probably one of the best posts I've seen so far, lol.

I still think they will get to 1600 MHz though; with the TDPs listed, they should hit those frequencies in boost at times.

They have to add game optimizations to the FE drivers, otherwise dev teams are unlikely to use it if performance stays where it is right now. I wouldn't use it for any kind of dev work outside of building new features on their primitive shaders, and why get the FE when the RX will do?

This is why I find it really hard to believe half the features they talked about for Vega are off in FE, lol. Why market something to game devs and then turn those features off? PS: Rys and Ryan from PCPer both stated there is nothing gimped in Vega FE; I really don't think there are any features turned off.
 
Agreed, pushing 275W at default clocks on the FE does not leave much room for higher clock rates, and 8GB of faster HBM may not draw that much less power than the FE's 16GB. So yep, improvements will need to come from the drivers at this stage. I am not sure how well these things will OC and what massive amount of power will be needed; I am expecting not that much in the end, maybe 1650-1700 MHz.

Which raises the question: how much more performance is on tap over time? Polaris gained around 10% via drivers, which is decent. How different is Vega from Polaris? I am more interested in DX12 and Vulkan performance rather than DX11, since tomorrow's content will be using those APIs more and more.
 
Well, at 1700 MHz it's pulling 400 watts through the PCIe connectors, so that is not too good. And if it really is a silicon lottery hit, I don't think that is too good either, because even on water they can't get more than 100 MHz? Water should give it 30-50 watts of headroom, and it seems the core is VERY hungry for power when overclocking.
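For rough intuition on why that last 100 MHz costs so much: dynamic power scales roughly with C·V²·f, so a small clock bump that also needs a voltage bump blows up the draw fast. A back-of-the-envelope sketch (every number here is an assumption for illustration, not a measured Vega value):

```python
# Back-of-the-envelope only: dynamic power ~ C * V^2 * f.
# Baseline wattage, clocks, and voltages are assumptions, not measured Vega figures.

def scaled_power(p_base, f_base, v_base, f_new, v_new):
    """Estimate dynamic power at a new frequency/voltage operating point."""
    return p_base * (f_new / f_base) * (v_new / v_base) ** 2

# Assume ~275 W at 1600 MHz / 1.10 V, and that 1700 MHz needs ~1.25 V:
print(f"{scaled_power(275, 1600, 1.10, 1700, 1.25):.0f} W")  # -> 377 W
```

A ~6% clock bump that needs ~14% more voltage lands you near 400 W, which lines up with what people are reporting.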
 
Basically it is already pretty much at max, like the Fury X. It would be fun to play around with one, but unless performance improves it will be a hard sell, at least for me. Making a sub-200W Nano-like version seems somewhat unlikely, and I'm not interested in a cut-down, reduced-shader version either. I wonder what the AIBs are thinking and what they have come up with so far; some of them probably wish they were making Nvidia cards, like XFX. Hopefully RTG will surprise us with some untapped potential in Vega - that would be most welcome.
 


I think the RX series will be able to hit 1600, but I expect them to list only 1500 on the box. So that will be your overclock, and it's the luck of the draw whether you can push it to 1700. But the wattage draw doesn't lie; it shows the chip is being pushed past the speed it was meant to run at. I think the 1080 Ti launch hurt them and this was their best answer to it.

I believe they did add some game optimizations to the FE drivers; I just think they may have left a few things out. But to be honest, the whole FE edition is just confusing as hell to me, and I don't like that they wasted valuable resources by splitting the driver team across two tasks instead of one. My idea about the drivers is just speculation. I have no hard proof, just a feeling that something is off, and it would be the easiest way to tamper with the FE's results and skew them. Now, it's possible it's running some sort of redundancy on purpose to verify data integrity, since it's a pro card, but software is not my strong suit. I will just be glad when it launches and we can see what the RX can do; I think we're all a little tired of waiting.
 
Here's pricing information:

Vega 10: $499

Vega 10b: $599

Vega 11: $449

*These prices are subject to change.

Interpretation:

Vega 10b is presumably a specially binned Vega 10 and presumably comes with a liquid cooler.

Vega 11 is presumably the same as Vega 10, but with some CUs disabled.
 
AMD doesn't think so.

Expect Vega to be sold out as soon as it is "released".
$600 for 1070-level performance, 12 months late, at 3x the power and 2x the heat...

What does Vega get you over a 1080 for $500?
 
The only SKU I see as moderately attractive is the Vega 11, IF you can get one that unlocks CUs, e.g. the GTX 465, the 290, the vanilla 6800...
 
If Vega 10 is going to be $499 in the US, that's about £465 in the UK including VAT. The cheapest GTX 1080s right now are ~£500. I really don't think that's good enough if it's only offering GTX 1080 performance whilst producing a lot more heat, especially given all GPU prices are currently inflated by the mining craze, and Vega will be inflated even further at launch, as GPUs always are when stock is short.
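For anyone checking the maths on that £465: it's just the US MSRP converted at roughly the going rate with 20% VAT added on top, since US prices are quoted ex-tax. A quick sketch, assuming ~$1.29 to the pound:

```python
# US MSRPs exclude sales tax; UK prices include 20% VAT.
# The exchange rate is an assumption (roughly the mid-2017 level).
usd_msrp = 499
usd_per_gbp = 1.29   # assumed rate
uk_vat = 0.20

gbp_inc_vat = usd_msrp / usd_per_gbp * (1 + uk_vat)
print(f"~£{gbp_inc_vat:.0f}")  # -> ~£464
```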
 
Well, ECC-wise, pretty sure that is driver-controlled; the pro drivers can turn it on. HBM has a built-in ECC controller.

https://www.cs.utah.edu/thememoryforum/mike.pdf

Enable ECC/Resilience Features

Optional: Not all stacks required to support it


So yeah, I can see a performance difference because of that feature, but not too much. I wouldn't be surprised if it's already being toggled on and off in Vega FE's drivers.
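For a sense of why ECC isn't free: classic SECDED (extended Hamming) needs extra check bits per data word, and the checking adds work on every access. A generic sketch of the check-bit overhead follows; this is textbook SECDED maths, not anything specific to HBM2's implementation, whose details aren't public:

```python
def secded_check_bits(data_bits):
    """Extended Hamming (SECDED): smallest r with 2**(r-1) >= data_bits + r."""
    r = 2
    while 2 ** (r - 1) < data_bits + r:
        r += 1
    return r

for m in (32, 64, 128):
    r = secded_check_bits(m)
    print(f"{m}-bit word: {r} check bits ({r / m:.1%} overhead)")
# -> 32-bit: 7 check bits (21.9%); 64-bit: 8 (12.5%); 128-bit: 9 (7.0%)
```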
 
They will likely add another PCIe connector on custom-PCB cards, à la the RX 480.

And right now, with the mining craze going on, I seriously doubt they care. Everything is selling out except the low-end stuff (i.e. below the RX 470).

On a side topic, I had a feeling those HBM2 stacks were burning hot and are likely a limitation on overclocking. Any heat increase from the core would easily bleed over to the HBM2, since they are so close. Going to one HBM2 die stack will help improve this situation; how much is yet to be seen. I could run a rough simulation on it, but I really don't have enough data.
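In the spirit of that rough simulation, here's about as far as you can get without real data: a toy steady-state estimate where the HBM2 temperature is ambient plus its own dissipation plus some fraction of core heat bleeding across the shared cooler. Every number is a guess, purely to show the mechanism:

```python
# Toy steady-state model: T_hbm = T_ambient + (P_hbm + bleed * P_core) * R_theta.
# All parameters are guesses for illustration; there's no public data for this.

t_ambient = 45.0    # deg C inside the case (assumed)
p_hbm = 20.0        # W dissipated by the HBM2 stacks themselves (assumed)
p_core = 275.0      # W from the GPU core at stock
bleed = 0.05        # fraction of core heat effectively reaching the HBM (assumed)
r_theta = 1.0       # deg C per W from HBM to air through the shared cooler (assumed)

t_hbm = t_ambient + (p_hbm + bleed * p_core) * r_theta
print(f"~{t_hbm:.0f} deg C")  # -> ~79 C with these guesses
```

Crude as it is, it shows the mechanism: raise the core's wattage and the memory temperature climbs with it.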
 

Yes, let's worry about power and heat when you guys will fork over an extra $200 or more for a G-Sync monitor. *eye rolls* The difference between a 1080 and this is about 100 watts. That isn't a lot more heat; it's one old-school lightbulb for a typical room, and about 1.2 cents EXTRA PER HOUR with it MAXED out.

And the final performance isn't known yet. Same old arguments. Save your judgment until the final product comes out.
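The 1.2 cents figure checks out if you assume a fairly typical US rate of about $0.12/kWh (your bill may differ):

```python
# 100 W extra for one hour = 0.1 kWh.
# The electricity rate is an assumption (roughly a typical US residential rate).
extra_watts = 100
usd_per_kwh = 0.12   # assumed rate

cents_per_hour = extra_watts / 1000 * usd_per_kwh * 100
print(f"{cents_per_hour:.1f} cents/hour")  # -> 1.2 cents/hour
```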
 
You're right, of course. We have a card that uses the same chip, in the same core configuration, with a similar TDP. We know its performance. But obviously that tells us nothing about RX Vega, because the FE is Not-A-Gaming-Card™.
Plus, obviously, everybody who buys a video card only ever considers Freesync/G-Sync monitors.
 

As has been stated multiple times:
1) It's not the same configuration. There is one less HBM2 stack, and ECC might be turned OFF.
2) Raja stated these are not the finalized gaming drivers. Basically these are hacked Fury drivers for the FE.
3) We haven't seen what water cooling will do yet. And to be fair, 1700 MHz may be a statistical anomaly. But we just don't know.

Get some coffee. Your brain's not working yet.
 

Nice guy.

I think the central point of his post was that a card comes out a year later which runs slower and offers nothing over the current options and, whether or not they matter to you, has some apparent handicaps.
Which raises the question, "Why pick it?" The answer would seem to be based on some level of common sense...

Responding with OHHHHHHH BLAAAAH YOU TOTALLY CARE ABOUT HEAT AND POWER MEHHHHHHHHHHH YOU DONT EVEN KNOW! YOU HAVE NO BRAIN!!!! is just like... huh?

Same old fanboy support. Maybe you should wait for the final product before intellectually defending it. Suggesting that one "save his judgment" would imply that if the final product were what it already seems to be, you would agree it is inferior (i.e. it doesn't look good, but save your judgment because it will probably be better!). Since that is already an unlikely outcome, it remains a foolish hook to hang your sarcasm on. Considering this is exactly what happened last cycle, being willfully ignorant of already-real facts isn't an argument. If you want to spend your money supporting AMD at the expense of slower, hotter, louder cards, that's your choice. But I think that's where the real *eye roll* comes in.

Pinning yourself into ever-smaller corners is never the best way to proceed with an argument. You're now already at "Well, we don't know what will happen when they pour liquid nitrogen on it!! Surely then it will meet current performance standards while producing more heat and using more power!" You have an actual, functioning example of the architecture available to base your judgments on, but instead you seem to refuse to accept it for what it is. It would seem reasonable to, as you suggest of others, hold your judgment about the slight possibility that another iteration of the same card might end up massively improved.

That's what everyone else is doing: taking note of reality and commenting on it. You're the one cooking up fantasy scenarios about "what if there is a change to the HBM stack with ECC off while a few driver optimizations are done with a water-cooling solution, then of COURSEEEE it will be better!" Ignoring not only how much of an obvious logical fallacy that is, but also that one could attach the same fucking conditions to the current Nvidia cards and then say "Of course they will be EVEN BETTER!"

If you're the example of someone's brain working........
Well. Do you, boo.
 

They are not hacked Fury drivers; Raja even stated that. He said it would be nice if they could do that, but they can't, lol.

Rest of it I agree with.
 

Yeah, I just ran home and back to drag this guy in from the rain.

Got it at the $575 price too, even though I'd pre-ordered at $700 after seeing some early reviews.
 
"Fight you bastards I hate peace" - famous quote.

Really, I'm not expecting much more from RX than FE. Maybe 5% if they tune up the drivers nice. But basically I will bet anyone a bitcoin it's like putting a micro furnace inside your PC if you're going to hit max performance. This card may make the idea of a separate case for GPU only a widely adopted "thing".

The biggest most important question I have is what is the price and will it have any impact on the competitions pricing structure? Given the mining craze, I don't think it matters and with that final question rendered moot I have basically lost all but the most casual interest in a product that hasn't even been released. My interest in AMD and RTG remains however.

I do remember though that when Intel was shopping around their x86 build as a GPU at least they had the brains to cancel that direction once they realized it would only hit 3870 performance levels when 4870 were already out- It never saw the light of day.

I wish AMD would just cancel this direction they're going and eat the crow until they have a new design on the table. I mean... they had a fallback plan here, what was it? Double up Polaris (single die) with DDR? There's always a fallback plan. Too late to launch the fallback plan, I'm just curious about what it was.
 
"Fight you bastards I hate peace" - famous quote.

Really, I'm not expecting much more from RX than FE. Maybe 5% if they tune up the drivers nice. But basically I will bet anyone a bitcoin it's like putting a micro furnace inside your PC if you're going to hit max performance. This card may make the idea of a separate case for GPU only a widely adopted "thing".

The biggest most important question I have is what is the price and will it have any impact on the competitions pricing structure? Given the mining craze, I don't think it matters and with that final question rendered moot I have basically lost all but the most casual interest in a product that hasn't even been released. My interest in AMD and RTG remains however.

I do remember though that when Intel was shopping around their x86 build as a GPU at least they had the brains to cancel that direction once they realized it would only hit 3870 performance levels when 4870 were already out- It never saw the light of day.

I wish AMD would just cancel this direction they're going and eat the crow until they have a new design on the table. I mean... they had a fallback plan here, what was it? Double up Polaris (single die) with DDR? There's always a fallback plan. Too late to launch the fallback plan, I'm just curious about what it was.


I have to say, yeah, at this point it's better for them not to release, but they are so far into the game they might as well do a limited release. They can get some of their investment back, however much it is ($20 million just for tape-out; who knows how much more to go into mass production, plus driver development and marketing on top of that. I wouldn't be surprised if it's $100 million or more)... They don't have money to waste, unlike Intel with the first Phi.

Dual-GPU boards just won't sell either, though, so the fallback plan will need another fallback plan...
 
Good get for the price!

Even used without Freesync, I felt like it was a solid buy for my preferences :)

I had been waiting for this exact set of features: 1440p, 120Hz+, curved, "HDR" (lol), Freesync, 30-32 inch (my eyes aren't the best and I hate wearing glasses), 1ms GtG, and non-TN.
 
I've been looking for a well-priced 4K HDR monitor; looks like that isn't going to happen. I don't care about Freesync or G-Sync either, because I need it for development. I might get that one when it comes back in stock; it's only 2K, but that will do. I need the HDR feature more than anything else.
 
Yeah, I figured RX Vega would be able to power this nicely at 1440p and higher refresh rates for at least two years. The "HDR" is not quite there yet; I will try to explore what I can, but I don't currently own any games with that feature enabled.

The dynamic brightness thing is only an 8-zone deal, so it's something I'll leave off, but being VA, the darks will be dark enough for my taste.

Here's a dude's 25-minute review; he seems to be more interested in the content-creator side of things than my gaming view:

Also, I think Freesync 2 isn't in the drivers yet, so I don't think the standard is supported properly just yet.

I have a feeling RTG will release that with the RX Vega drivers.
 

Look, I'm not a fanboi. I slammed the Fury and the RX 480.

What I get f'n tired of is f'n idiots who can't be impartial.

Now, why would someone buy a Vega? There are pros and cons to each manufacturer's approach; it's akin to choosing a Lotus versus a Dodge Demon. I'm not here to make those arguments. And quite frankly, if you think one size fits all, you would be mistaken.

But in Vega's PRO category:

1. Async Compute
2. Freesync is cheaper
3. When properly implemented, Mantle pretty much beats NVIDIA's offerings at similar price points
4. If it's anything like Fury, its long-term longevity will fare better than NVIDIA's in terms of future titles.


In NVIDIA's PRO category:
1. Raw speed overall
2. Gameworks appears to work better
3. Lower power consumption

THE FINAL UNKNOWN:
1. How the FINAL VEGA CONFIGURATION/DRIVERS AND AIR COOLING/WATER COOLING PERFORM, AND HOW THAT WORKS OUT ON A PRICE/PERFORMANCE BASIS. And this is where most of you are showing you're fan-boi idiots. You're spelling doom and gloom before it's all said and done. You just can't f'n wait to pounce.

I swear to f'n god, I get sick of this bullshit and people's cognitive dissonance on both sides. Closed- and narrow-minded.
 

Dissonance would be if people were drawing conclusions that went counter to the evidence at hand.

This isn't doom and gloom; no company has a perfect product track record. AMD and RTG just have a worse track record than most and look especially bad compared to their only competition. I have no doubt they have more designs, more plans, and future cards that will be received more favorably. They might even have a giant slayer in the pipeline, but I wouldn't take a hundred-to-one bet on that.

But FE is basically it. If you expect something more than five to ten percent better than FE, I have a bridge to sell you.
 

1. Has little impact on games.
2. Not everyone cares or is going to buy a new monitor.
3. Mantle is dead.
4. Fury is beaten by a 1070.
 

Cognitive dissonance occurs when the brain feels threatened because someone presents a belief that is counter to the subject's own. A perfect example of this is precious little snowflakes on campus who throw an epic fit because a conservative speaker was invited. It doesn't matter if there are facts involved or not. The thought that somebody might be wrong causes their brain to respond with "threat" reactions at the basal core levels, where the most basic instincts of survival and fear come from. (Proven with MRI.)

I never ONCE claimed that Vega would beat the 1080 or 1080 Ti. As I always said, "It's too early to tell." So yeah, I am neutral. I just get tired of asshat armchair experts that act like parrots for their favorite brands.

It's like when another poster here (whose name began with "S") claimed "No way it will have 16GB of HBM2... NO WAY" like it's a matter of fact. Well, we know what happened.
 

1&2. ATOS, and Doom
3. Mantle is Vulkan (more or less) So it lives on and offically supported by Microsoft.
4. I wasn't talking about raw speed. I said % degredation. If you buy a product from 2->4 years ago, is it still viable for AAA today? In terms of performance loss on average, AMD has fared better, particularly as DX12 titles have shown up. Sure I could buy a 970, then buy a 1070 a year later. But that's going to cost you more money in the end.
 