Vega Rumors

Who said I had any hope? ;)

Vega FE just isn't the card for me; the RX Vegas, maybe - I'm looking for a Nano-sized one for an SFF build.

I have multiple machines, video cards, etc. I'm thinking of selling the 7970, the 290, and one 1070, buying a 1080 Ti, and putting the other 1070 into the Biostar Ryzen machine. That would upgrade one machine from a 290 to a 1070 and take the SLI rig to a single 1080 Ti driving a 3440x1440 monitor, which seems perfect - and it probably puts money in the bank as well.

Understand now?

OK, I get what you mean, but the thread as a whole is still funny. Even when I get flamed for being a troll, I find that even funnier.

We can literally make all the assumptions about a product we want, but the danger here is that everyone is masturbating their minds into oblivion for NO reason.

I do not think a thread discussing, in some finalized tone, a mostly unknown product (the gaming Vega part) is even called for as of now. It just breeds mixed opinions and baseless assumptions, and in the end it makes it apparent that the people participating most in the debate are literally the snake eating its tail.

We might as well start a thread on Nvidia's StarKiller Base card, which they haven't even thought of, and decide that it is superior to everything ever made, forever - even though their engineers won't so much as sketch it for five more years. And at the same time decide that AMD's future card 14 generations from now, called IronJagoff, is going to be a slimy has-been junk card that is only good for mining the remnants of the failed, imploded, valueless bitcoin hype.

Disclaimer: I was never a fanboy of AMD or Intel, but I love the competition, and I am rooting strongly for the underdog because competition is great for everyone. I sincerely hope, just as you do, Noko, that the card plays out well and is a powerful 1080 equivalent or better for less money.

I JUST REFUSE to base my opinion of RX Vega (the gaming part) on the virgin release of a workstation Adobe/CAD/CAM-style card. It just doesn't make sense. Imagine how Windows 10 would perform if it were optimized to use only a single core. It is optimized to use the piss out of your processor for the best OS performance possible. The gaming version of Vega will likewise be optimized to give the best gaming performance it can. The Frontier Edition should have zero effect on how you view the upcoming performance of an entirely different-purposed product.

We can't compare Nvidia Quadro performance to their gaming parts either; that is not fair. Nvidia is an entirely different, competing technology that does the same thing in the end. AMD is a different architecture and product that will do the same thing in the end... when it is actually released.

It's just too early. There - am I still trolling?


Anyway, hot off the press: RX Vega is confirmed to be releasing at SIGGRAPH 2017, July 30th.

This is great news, isn't it? Let's hope this card shreds the 1080 dollar for dollar. I will get one to replace my aging 980 Ti if it pans out; if not, I will just get a used 1080 Ti or two. I am not biased - just being patient and not making wild assumptions.

Link: https://hothardware.com/news/amd-radeon-rx-vega-gaming-gpus-siggraph-2017-unveiling
 
OK, I get what you mean, but the thread as a whole is still funny. Even when I get flamed for being a troll, I find that even funnier.

We can literally make all the assumptions about a product we want, but the danger here is that everyone is masturbating their minds into oblivion for NO reason.

I do not think a thread discussing, in some finalized tone, a mostly unknown product (the gaming Vega part) is even called for as of now. It just breeds mixed opinions and baseless assumptions, and in the end it makes it apparent that the people participating most in the debate are literally the snake eating its tail.

No one forces you to keep reading the thread or posting here. I think the conversation is fairly good even if it is speculation about more AMD false hype.
 
No one forces you to keep reading the thread or posting here. I think the conversation is fairly good even if it is speculation about more AMD false hype.

That is fine, I agree, and I will keep reading it. I did add more to my comment above that you may not have read.
 
Yeah, I can't go back from FreeSync to no FreeSync, and unfortunately I just purchased my monitor in November. The monitor I had before that lasted me nearly six years, so I'm locked in for a while. If Vega sucks, I don't upgrade, plain and simple. I'm holding out for true 4K 144Hz HDR with FreeSync/G-Sync at this point; maybe they'll have competitive cards by then.
 
It's just too early. There - am I still trolling?


Anyway, hot off the press: RX Vega is confirmed to be releasing at SIGGRAPH 2017, July 30th.

This is great news, isn't it? Let's hope this card shreds the 1080 dollar for dollar. I will get one to replace my aging 980 Ti if it pans out; if not, I will just get a used 1080 Ti or two. I am not biased - just being patient and not making wild assumptions.

Link: https://hothardware.com/news/amd-radeon-rx-vega-gaming-gpus-siggraph-2017-unveiling


No, you actually added to the conversation.

Unless you have some edge-case need, I can't see waiting for the mythical RX Vega - that's my only real thought. (As much as I want to see one working, I also want to see the mystical unicorn of the Omega Cream Pie quadrant.)

I really do like the idea that 1080 Tis have been out long enough now that there's a used market for them... wow. But I do agree: I hope it's priced to shake up the product lines of both orgs - something I've said for months. What no one saw coming was everyone's grandma buying skids of video cards to mine with.
 
No, you actually added to the conversation.

Unless you have some edge-case need, I can't see waiting for the mythical RX Vega - that's my only real thought. (As much as I want to see one working, I also want to see the mystical unicorn of the Omega Cream Pie quadrant.)

I really do like the idea that 1080 Tis have been out long enough now that there's a used market for them... wow. But I do agree: I hope it's priced to shake up the product lines of both orgs - something I've said for months. What no one saw coming was everyone's grandma buying skids of video cards to mine with.

Every reasonable person hopes RX will shake up the market with competitive pricing. The issue is that, given the relative die sizes of the competing (in performance terms) GPUs plus more expensive memory, it is not unreasonable to be concerned. If it's priced competitively with the 1080, it will be a very low-margin part.
 
Every reasonable person hopes RX will shake up the market with competitive pricing. The issue is that, given the relative die sizes of the competing (in performance terms) GPUs plus more expensive memory, it is not unreasonable to be concerned. If it's priced competitively with the 1080, it will be a very low-margin part.


Low margin or, worse yet, no margin - and AIB partners and retailers will take hits on their margins too. It's just bad all around, not just for AMD but for anyone that sells these cards.
 


sand.jpg
 
Every reasonable person hopes RX will shake up the market with competitive pricing. The issue is that, given the relative die sizes of the competing (in performance terms) GPUs plus more expensive memory, it is not unreasonable to be concerned. If it's priced competitively with the 1080, it will be a very low-margin part.

I don't think it will be that bad. I think AMD will likely have a card around 1080 performance for cheaper, and one that sustains higher clocks - likely watercooled, or with some really good aftermarket cooling - that sucks up more watts. In any case it will likely use 50-100 more watts than the comparable Nvidia solution. It might not be all that bad, despite their shoestring R&D, if they price them right. Like someone here said, if they price this thing at $399 or $449, good luck - miners will probably grab them. I think selling enough Vegas at a reasonable price will be the least of AMD's concerns. AMD is so frickin' lucky right now that mining is popular again, or they would be in a shitload of trouble.
 

What will the excuses be after RX Vega turns out faster than FE (and it will, by at most 10%)? Hmm, not enough time for driver development? Or what? Let's hear the excuses now, because it gives you time to practice and rehearse. Since you think this place is like kids' play, you need it to fine-tune your story so it flows naturally.
 
I don't think it will be that bad. I think AMD will likely have a card around 1080 performance for cheaper, and one that sustains higher clocks - likely watercooled, or with some really good aftermarket cooling - that sucks up more watts. In any case it will likely use 50-100 more watts than the comparable Nvidia solution. It might not be all that bad, despite their shoestring R&D, if they price them right. Like someone here said, if they price this thing at $399 or $449, good luck - miners will probably grab them. I think selling enough Vegas at a reasonable price will be the least of AMD's concerns. AMD is so frickin' lucky right now that mining is popular again, or they would be in a shitload of trouble.

Very hard to come down to that price. Now do we see why NV dropped the 1080 to $500? They could have easily kept their margins on the 1080 and slotted the 1080 Ti in with no problems at all. But now the 1080's price puts a buttload of pressure on RX Vega.
 
Like someone here said, if they price this thing at $399 or $449, good luck - miners will probably grab them.

How do I put myself down as "Miner" when I order a card? If miners are getting priority, why not just declare yourself a miner?
 
Miners are getting priority because they are buying these cards before they even arrive in the suppliers' warehouses, lol. I just talked to my Best Buy and Newegg Business contacts this week; they can't get many of them because their suppliers don't show any in stock.

But for Vega to be profitable at mining, it still needs to be competitive with the RX 480 and 580 in perf/watt, so Vega needs to hit around 60 MH/s at its current power draw. That probably won't happen.

PS: that is another drawback in what we have seen so far from Vega - during mining it uses only a little less wattage than in gaming. During mining a good portion of the chip isn't being used: the back end (the ROPs in particular) is idle, and the front end isn't issuing either (no polygons), yet it's drawing a buttload of power. So if the ROPs aren't being used and the power draw barely changes, will the tile-based rasterizer help much with power draw, if it can even be activated automatically on all games, current and old?

This is exactly why, for the RX 480 and RX 580, with no modification to power limits or voltage, power consumption during mining can drop down to around 100 W or into the 90s. Vega shows a similar drop relative to its die size and unit count, but that reflects silicon that isn't even being used. So how much will it save when that silicon is in use, albeit more optimally than before - somewhere between those two figures, right?

Come on guys, the tests are out there; it's easy to see things aren't going to change dramatically with RX Vega.

What do you guys think mining tests are like? They are like synthetics: they only test one part of the chip, the shader core, so we get a pretty clean breakdown of power consumption from those tests.
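The perf/watt argument above can be made concrete with some quick arithmetic. The hash rates and board powers below are ballpark community figures from the time, not measurements, so treat them as assumptions:

```python
# Mining efficiency = hash rate / board power. For Vega to match Polaris
# perf/watt at roughly double the board power, it needs roughly double
# the hash rate. All numbers are ballpark community figures, not measurements.
cards = {
    "RX 480": (27.0, 120),   # ~27 MH/s Ethereum at ~120 W
    "Vega FE": (33.0, 250),  # early reports: ~33 MH/s at ~250 W
}

for name, (mhs, watts) in cards.items():
    print(f"{name}: {mhs / watts:.3f} MH/s per watt")

# Hash rate Vega would need at 250 W to match the RX 480's efficiency:
target = 27.0 / 120 * 250  # ~56 MH/s, close to the ~60 MH/s figure above
print(f"Vega would need ~{target:.0f} MH/s at 250 W")
```

With these assumed numbers, Vega FE would need nearly double its reported hash rate just to break even with Polaris on efficiency, which is the point being made.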
 
Tearing can happen at any framerate; even fps = Hz has tearing if you're not using a software or hardware sync method.

I still get tearing once in a while with my 75Hz FreeSync monitor (HP Omen 32" 1440p), even though my framerates are always above 60 with an overclocked RX 480.
 
No one forces you to keep reading the thread or posting here. I think the conversation is fairly good even if it is speculation about more AMD false hype.

What was the false hype? I missed it.

Every reasonable person hopes RX will shake up the market with competitive pricing, the issue is that given the relative die sizes of competing (in terms of performance) GPUs + more expensive memory it is not unreasonable to be concerned. If its priced competitively wrt 1080 then it will be a very low margin part

AMD has much lower overhead than Nvidia. Also, they are using these cards, Fiji and Vega, as test beds for HBM; the payoff is still down the road.
 
Dullard, it doesn't matter, since these cards are a total failure before we even have one to review, according to... oh... most of the members of these forums.
 
Which is less bandwidth than HBM1 used in the Fury X.

AMD never fails to amaze me.

That could easily be some function of the 16GB card that isn't applicable to the 8GB version and that we aren't yet aware of. I don't know, but it does seem silly.
 
That was sarcasm, but it zoomed right over your head.


And Araxie is being realistic ;)

Look, we know its TFLOPS now, and that is its maximum potential, when the shader array is fully utilized. We also know from generations of GCN versus Nvidia architectures that AMD traditionally can't use its shader array as effectively as Nvidia can.

And when we start seeing Vega FE's results aligning with other GCN products versus Nvidia products in the usual way (and the power consumption is just crazy), how is it possible not to take that seriously, or to ignore all of it?

Utilization-wise, Pascal isn't much different from Maxwell; where Pascal really shifted was power consumption and attainable clocks. That came from a mixture of architectural changes (extreme enhancements of Maxwell) and the node - to a much lesser degree, though the lower voltage of the new node amplifies the architectural changes. Remember, Nvidia stated they got better clocks than expected with Pascal, so a bit of luck also played into it.

AMD with Vega had to jump two generations to catch up to Pascal. They couldn't do that with Polaris, which jumped one generation, and now Vega FE is in the same boat. That is exactly where AMD should be in realistic terms. The only way they could have jumped two generations is if they had fast-tracked their next gen (not Vega, but Navi) up the timeline. And even if they had fast-tracked Navi and shifted it from 7nm to 14nm, they could only pull it in so much - maybe a year, since Navi would have been at an extremely early stage of development. That still puts it at late 2017 or early 2018.

If Pascal had only been enhanced by the node, AMD would have caught up; and if Pascal had been a typical 40-50% update per bracket, AMD would at least be within spitting distance. That is not what happened, though - Pascal went above 60% per bracket.

Again, there is nothing mythical about this. Engineering is math: expectations are set at the design phase, calculated for, and then built toward. Everything being done is known beforehand; bad luck or good luck plays its part, but not as dramatically as people want to think.
 
Well, that's the thing: assuming the same fps/TFLOP as the Fury X, it should be slightly ahead of the 1080. That also assumes it's holding 13 TFLOPS, which was based on 1600MHz, IIRC - and these cards are holding around 1440MHz, so that's -10% right there, if I'm remembering this all right.

With custom AIB cards and slightly better drivers, RX should be 1080-ish, which is fine IMO. But seeing the FE struggling against a 1070...

I wonder how it'll do with VR. Right now AMD doesn't really have a contender.
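The clock-scaling math above can be sketched out. The 4096-shader count and both clock figures are the rumored Vega numbers circulating at the time, not confirmed specs:

```python
# Peak single-precision throughput: 2 FLOPs per FMA * shader count * clock.
# The 4096-shader count and both clocks are rumored figures, not confirmed.
def tflops(shaders, clock_ghz):
    return 2 * shaders * clock_ghz / 1000.0

rated = tflops(4096, 1.600)      # ~13.1 TFLOPS at a 1600 MHz boost clock
sustained = tflops(4096, 1.440)  # ~11.8 TFLOPS if it only holds ~1440 MHz
deficit = 1 - sustained / rated

print(f"rated {rated:.1f}, sustained {sustained:.1f} TFLOPS "
      f"({deficit:.0%} below rated)")
```

Since TFLOPS scales linearly with clock, holding 1440MHz instead of 1600MHz is exactly the -10% mentioned above.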
 
That could easily be some function of the 16GB card that isn't applicable to the 8GB version and that we aren't yet aware of. I don't know, but it does seem silly.

Nothing hidden, no special function: Vega has a 2048-bit bus, the Fury X a 4096-bit bus. Even with the "slower" HBM1, the Fury X's bandwidth is higher. Bus width is one of the things many are not taking into consideration in the Vega vs. Fury X war.
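The bus-width point works out numerically like this. The memory clocks below (500 MHz HBM1, ~945 MHz HBM2) are the commonly quoted figures for each card, so treat them as assumptions:

```python
# Peak bandwidth = bus width in bytes * effective transfer rate.
# HBM is double data rate, so effective rate = 2 * memory clock.
# Memory clocks (500 MHz HBM1, ~945 MHz HBM2) are commonly quoted figures.
def bandwidth_gbs(bus_bits, mem_clock_mhz):
    return bus_bits / 8 * (2 * mem_clock_mhz) / 1000.0  # GB/s

fury_x = bandwidth_gbs(4096, 500)   # 4096-bit HBM1 -> 512 GB/s
vega_fe = bandwidth_gbs(2048, 945)  # 2048-bit HBM2 -> ~484 GB/s

print(f"Fury X {fury_x:.0f} GB/s vs Vega FE {vega_fe:.0f} GB/s")
```

The HBM2 stacks run nearly twice the clock, but halving the bus width still leaves Vega FE slightly below the Fury X in peak bandwidth.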
 
Well, that's the thing: assuming the same fps/TFLOP as the Fury X, it should be slightly ahead of the 1080. That also assumes it's holding 13 TFLOPS, which was based on 1600MHz, IIRC - and these cards are holding around 1440MHz, so that's -10% right there, if I'm remembering this all right.

With custom AIB cards and slightly better drivers, RX should be 1080-ish, which is fine IMO. But seeing the FE struggling against a 1070...

I wonder how it'll do with VR. Right now AMD doesn't really have a contender.


Yeah, and those are the results it's showing right now. VR is a combination of many things; we don't know Vega's VR-specific features yet, but they will probably have some things to help them out there.
 
I have to say Pascal is probably the best-designed GPU of its time, from HPC down to mobile - it just shines. AMD's best may match last year's 1080 in performance, but at extreme power levels and die size. It looks like RTG has a lot of work ahead of them if they want to stay in the game. I almost think Vega was on the line of being cancelled; AMD may just have to produce these at zero profit to keep some marketplace presence and get Navi out ASAP. It's great that AMD is making good headway with the Ryzen line, and hopefully with Threadripper and Epyc, so they can make good profits for once in a long time. Does AMD have anything to even match Nvidia's P100 chip, which uses HBM2 and has been shipping for about a year now? Combine that with Volta being right around the corner, and Vega FE falls way short as the start of a new generation of cards.

We have not even talked about Vega in the APUs yet. Will it actually have tile-based rendering? Will it suck as much power for little gain over the previous generation? The one problem I see with AMD's professional CPU line is that at the low end you still have to have a video card - that kills it for the average business computer doing routine low-end work, where millions of old computers are replaced yearly. RTG affects the CPU division in a very big way, from HPC down to mobile platforms needing a good APU.

Now, Raja did some rather good dancing and got Polaris respectable and competitive with Nvidia to a certain extent; I am not sure Vega can do the same. Plus no mention of Vega 11 - wow. It is almost as if Vega 10 is performing at Vega 11 expectations. Unless this is all fixable in software, Vega is not looking too good.
 
So SIGGRAPH is July 30. Assuming AMD will, at worst, announce a launch date a month out, when should we expect AIBs to start receiving chips and producing cards in enough quantity that we at least hear something or get poorly sourced Chinese benchmarks?
 
So SIGGRAPH is July 30. Assuming AMD will, at worst, announce a launch date a month out, when should we expect AIBs to start receiving chips and producing cards in enough quantity that we at least hear something or get poorly sourced Chinese benchmarks?

I think AMD was quoted as stating that RX Vega launch stock will be low. This could mean it will launch on time, but that because it is "on time" they aren't able to pump enough GPUs/cards out to AIB vendors.
 
So SIGGRAPH is July 30. Assuming AMD will, at worst, announce a launch date a month out, when should we expect AIBs to start receiving chips and producing cards in enough quantity that we at least hear something or get poorly sourced Chinese benchmarks?
With no leaked card information or tests less than a month from launch, that's not a good indicator. The Vega FE is an extremely well-designed card with a strong VRM configuration, except maybe the single phase for the HBM2 memory, so AMD should have reference designs that AIBs can use to manufacture reference cards before custom ones come out. With zero leaks or any information about RX Vega performance, it's not looking good for a hard launch - at least to me. Also, Vega has to be viable for the AIBs in the first place; they are not going to make something that has no real market. I wonder if we will see some AIBs pull out or decline to make Vega cards.
 
With zero leaks or any information about RX Vega performance, it's not looking good for a hard launch - at least to me. Also, Vega has to be viable for the AIBs in the first place; they are not going to make something that has no real market. I wonder if we will see some AIBs pull out or decline to make Vega cards.

I'm okay with no Vega leaks. Every time RTG has leaked something within the last three years it's been to push a false narrative. So silence is golden in my opinion.
 
So SIGGRAPH is July 30. Assuming AMD will, at worst, announce a launch date a month out, when should we expect AIBs to start receiving chips and producing cards in enough quantity that we at least hear something or get poorly sourced Chinese benchmarks?

They'll announce it July 30th with some BS benchmarks, and then, if we're lucky, we'll see a small batch within a month. Mass availability probably won't happen until late September at the earliest, which is a total waste for anyone who buys one, since Volta is assuredly an early 2018 release. I wouldn't be at all surprised to see a Titan Volta this year.

Basically, Vega is likely the biggest failure ATi has ever produced and will be remembered as the GPU equivalent of Bulldozer.
 
I'm okay with no Vega leaks. Every time RTG has leaked something within the last three years it's been to push a false narrative. So silence is golden in my opinion.
The R300 from ATI comes to mind - that was pure gold and got Nvidia scrambling, lol. Still, the biggest leak or show is Vega 10 in the FE. I am hopeful for some rather tight/small configurations with a decent performance increase over the Nano; right now that seems a tall order.
 
They'll announce it July 30th with some BS benchmarks, and then, if we're lucky, we'll see a small batch within a month. Mass availability probably won't happen until late September at the earliest, which is a total waste for anyone who buys one, since Volta is assuredly an early 2018 release. I wouldn't be at all surprised to see a Titan Volta this year.

Basically, Vega is likely the biggest failure ATi has ever produced and will be remembered as the GPU equivalent of Bulldozer.
ATi is dead, man - consumed and eaten by AMD. Kind of like 3dfx, which Nvidia snacked on.
 
Well, I'm going to have to like Vega because it's going to be in the new iMac Pro that I'll be requesting from my job at the end of this year. So, for my own sake, I hope to god it doesn't suck.
 
Well, I'm going to have to like Vega because it's going to be in the new iMac Pro that I'll be requesting from my job at the end of this year. So, for my own sake, I hope to god it doesn't suck.
And shortly after that, you will have to bring it in for the repair extension program because your iMac Pro suffered a meltdown from excessive heat.

Hasn't Apple launched a repair extension program for every single Mac with a dGPU?
 
And shortly after that, you will have to bring it in for the repair extension program because your iMac Pro suffered a meltdown from excessive heat.

Hasn't Apple launched a repair extension program for every single Mac with a dGPU?

My 2010 and 2013 Mac Pros haven't had any dGPU issues. Ironically, the two MacBook Pros with Nvidia GPUs did...
 