Vega Rumors

We already have Vega 56 performing around GTX 1080 levels, with the liquid-cooled Vega 64 (more hardware and higher clocks) a good deal above that. I doubt that's even counting packed math and the other optimizations.
LOL, that's a funny one. Mark my words though, Vega 56 is not going to be substantially faster than a 1070; the numbers you read in that rumor are fake or min fps (they are too low for a 1070), and AMD would have been shouting about Vega 56 if they were true. And we've already seen Vega 64 numbers, barely faster than a 1080 FE (which boosts lower than aftermarket 1080s). Either Vega 56 is so close to Vega 64 that the 64 becomes irrelevant, or these numbers are false. So nobody believes these Vega 56 results, except those grasping at straws. And rest assured, I am gonna remind you of that very soon.
 
From a 1070 killer to a 1080 killer; boy, I wish I had his optimism...
What optimism? That's precisely what the leaks showed and the expected performance.

What's odd is the sheer level of denial some of the posters have around here. Half a dozen guys running around hysterically, trying to pretend they didn't screw up big time again while everyone thinks they're idiots. I'm still laughing at the fact AMD played them so easily.
 
What optimism? That's precisely what the leaks showed and the expected performance.

What's odd is the sheer level of denial some of the posters have around here. Half a dozen guys running around hysterically, trying to pretend they didn't screw up big time again while everyone thinks they're idiots. I'm still laughing at the fact AMD played them so easily.

You mean the post from TweakTown, where an industry source provided the author with some questionable raw benchmark numbers? Last time I checked Civ 6, the GTX 1070 scored 83.8 fps at ultra settings with 8x MSAA at 1440p in a test conducted by TPU. https://www.techpowerup.com/reviews/Performance_Analysis/Civilization_6/4.html

If that is not enough for you, there is also a test conducted by TweakTown on Battlefield 1 that put the 1070 at an avg of 105 fps at ultra settings at 1440p. Of course, that is the DX11 number; the DX12 number is an avg of 99 fps. http://www.tweaktown.com/tweakipedia/116/fury-vs-gtx-1070-battlefield-dx11-dx12/index.html
Since I don't know whether DOOM was run under Vulkan or OpenGL, I will concede that the RX Vega 56 probably does run better than the GTX 1070. And I cannot find any benchmark review of the 1070 at high settings, only at ultra settings, which I can provide if you want me to.

While the analysis from the link I provided is not concrete proof regarding the performance comparison between the GTX 1070 and the RX Vega 56, it does leave me with a lot of questions about how the tests were conducted, since there is contradictory information.
 
What's odd is the sheer level of denial some of the posters have around here. Half a dozen guys running around hysterically, trying to pretend they didn't screw up big time again while everyone thinks they're idiots. I'm still laughing at the fact AMD played them so easily.
Hmm, define screwed up: the guy says Vega 64 is a 1080 Ti killer, but AMD says it only trades blows with the 1080. So he goes on a quest to justify himself by clinging to a rumor about Vega 56, then tries to reinforce his failed 1080 Ti theory through the same rumor. Yeah, I'd say he screwed up big time.
 
No, no; the RX Vega 56 trades blows with the 1080, the RX Vega 64 is the 1080Ti killer...?!? ;^p
 
If the Radeon Pro WX 9100 is the actual Vega workstation card certified for use with pro applications...

And the Radeon Pro SSG is basically a WX 9100 with two M.2 slots on board for 2TB of dataset storage...

And the WX 9100 has the ability to switch back & forth between Workstation drivers and Gaming drivers...

Can I keep a SSG in Gaming mode & load my Steam library to the 2TB of on board storage...?!?
 
And the Radeon Pro SSG is basically a WX 9100 with two M.2 slots on board for 2TB of dataset storage...
The WX 9100 is the same as the Vega Frontier Edition, except that the WX 9100 has certified drivers while the Vega FE ships the same drivers without certification.
Can I keep a SSG in Gaming mode & load my Steam library to the 2TB of on board storage...?!?
You can't, since the storage will not be mapped as a regular drive; the SSG's flash is only accessible through the driver or the API.
 
I suppose you'd like to argue with this.

[attached image: benchmark numbers]
 
If the Radeon Pro WX 9100 is the actual Vega workstation card certified for use with pro applications...

And the Radeon Pro SSG is basically a WX 9100 with two M.2 slots on board for 2TB of dataset storage...

And the WX 9100 has the ability to switch back & forth between Workstation drivers and Gaming drivers...

Can I keep a SSG in Gaming mode & load my Steam library to the 2TB of on board storage...?!?

The WX 9100 is the same as the Vega Frontier Edition, except that the WX 9100 has certified drivers while the Vega FE ships the same drivers without certification.

You can't, since the storage will not be mapped as a regular drive; the SSG's flash is only accessible through the driver or the API.

I know that the only real difference between the Vega FE & the WX 9100 (shrouds & such aside) is the certified drivers; well, and about 1200 bucks...

As for the Steam library on a SSG; dude, it was obviously a joke...
 
Anarchist4000 Your claim that Vega 56 is on par with a 1080 and that Vega 64 is a good deal above that is completely unfounded; you have no evidence whatsoever to support it. AMD's own overly enthusiastic marketing slides pit Vega 64 against the 1080. At this point you're just spewing more and more BS, and a few weeks from now we'll be chiding you over your bullshit predictions, as we have done ad nauseam since the beginning of time. You do this every fucking AMD release.
 
My personal prophecy is that Vega 56 will be the people's champ.

I'm not saying it's going to make waves or anything, but I think it should give about 1070 performance, which is fine. I don't think the 64 will be worth the power + price.

In other news, AMD's last 3 GPU designs could BIOS-unlock dormant cores... so Vega 56 may have some warranty-voiding potential...
 
I suppose you'd like to argue with this.
Since one of the numbers is BF1 in DX12, and the other is CoD IW, which for whatever reason (scene choice, eh?) has lower minimums than those widely observed by players, they are basically claiming that Vega is slower than the 1080.
Why would anyone want to argue with this?
 
Since one of the numbers is BF1 in DX12, and the other is CoD IW, which for whatever reason (scene choice, eh?) has lower minimums than those widely observed by players, they are basically claiming that Vega is slower than the 1080.
Exactly, here they are in AMD's own slides:
@1440p: the GTX 1080 wins 3 games, loses 2, and ties 1 with Vega 64.
@4K: the GTX 1080 wins 3 games, loses 2, and ties 1 with Vega 64.
Of course AMD varies the game selection at every resolution and cherry-picks the ones that show them in the best light, but even their own materials can't escape the shadow of the 1080.

[image: slides-raja-35.jpg]


[image: slides-raja-38.jpg]
 
Anarchist4000 Your claim that Vega 56 is on par with a 1080 and that Vega 64 is a good deal above that is completely unfounded; you have no evidence whatsoever to support it. AMD's own overly enthusiastic marketing slides pit Vega 64 against the 1080. At this point you're just spewing more and more BS, and a few weeks from now we'll be chiding you over your bullshit predictions, as we have done ad nauseam since the beginning of time. You do this every fucking AMD release.
LOL oh the irony
same could be said about nearly everyone in this thread...
 
Anarchist4000 Your claim that Vega 56 is on par with a 1080 and that Vega 64 is a good deal above that is completely unfounded; you have no evidence whatsoever to support it. AMD's own overly enthusiastic marketing slides pit Vega 64 against the 1080. At this point you're just spewing more and more BS, and a few weeks from now we'll be chiding you over your bullshit predictions, as we have done ad nauseam since the beginning of time. You do this every fucking AMD release.


Two words

Viral Marketing
 
My personal prophecy is that Vega 56 will be the people's champ.

I'm not saying it's going to make waves or anything, but I think it should give about 1070 performance, which is fine. I don't think the 64 will be worth the power + price.

In other news, AMD's last 3 GPU designs could BIOS-unlock dormant cores... so Vega 56 may have some warranty-voiding potential...


I think if it's not good at mining, it's going to be a good seller for gamers. Here's hoping for a good mining card though lol.
 
The post that generated the Vega RX 70-100 MH/s hash-rate rumor got the poster perma-banned.

Not sure if that was because the rumor is false or because it breaks someone's NDA. Though even if false, I suppose it could still break an NDA, if the NDA has Fight Club rules 1 & 2.
 
I highly doubt it will get that much; 60 MH/s, yeah, I can see that. Outside of API differences, the main thing mining goes by is how many FLOPS a card can do, so it really should scale fairly linearly. Now with Vega and 16-bit packed math it might get more, and I suspect this is where the extra would come from.
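
For what it's worth, Ethash specifically is usually modeled as memory-bandwidth-bound rather than FLOPS-bound: each hash fetches roughly 64 random 128-byte pages from the DAG, about 8 KiB of traffic per hash. A minimal back-of-envelope sketch, assuming that per-hash figure and the rumored bandwidth specs:

Code:
# Back-of-envelope Ethash ceiling: ~64 random reads of 128-byte DAG
# pages per hash, so a purely bandwidth-bound card manages at most
# (memory bandwidth) / 8192 bytes hashes per second.
BYTES_PER_HASH = 64 * 128  # ~8 KiB of DAG traffic per Ethash hash

def ethash_ceiling_mhs(bandwidth_gbs):
    """Upper-bound hash rate in MH/s for a given bandwidth in GB/s."""
    return bandwidth_gbs * 1e9 / BYTES_PER_HASH / 1e6

# Rumored Vega bandwidths, plus a GTX 1070 (256 GB/s) for reference:
for name, bw in [("Vega 56", 410), ("Vega 64", 484), ("GTX 1070", 256)]:
    print(f"{name}: ~{ethash_ceiling_mhs(bw):.0f} MH/s ceiling")

Vega 64's 484 GB/s caps out around ~59 MH/s under this model, which makes the ~40 MH/s observed and "60, yeah I can see that" plausible, while 70-100 MH/s would need something beyond raw DAG bandwidth.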
 
It would take divine intervention for Linux to overtake Windows in the next year.
All it would take for Android is an upgradable ARM ecosystem, a desktop ARM CPU, and 4 PCIe slots. It will happen a lot sooner than Linux overtaking Windows in gaming. Just sayin'.



Miners are AMD's strong point. Love them or hate them, they are keeping AMD relevant.
If AMD could make Vega cards more common than USB memory sticks and charge a $500 markup, they would still sell every one of them, thanks to miners.
They need a cryptomining division at AMD to make a "mining-only card": not a video card, just heavy mining work.
That would bring video cards back to gaming enthusiasts and give miners what they need.
 
I highly doubt it will get that much; 60 MH/s, yeah, I can see that.

I don't see why it (whatever is making it possible to double or triple the MH/s) wouldn't be enabled for the Vega FE right now. You'd think they'd want to sell the $1,000+ cards...
 
Well, the only reason would be if there are features in AMD's Vega architecture that weren't exposed yet; if not exposed, the miner programmers wouldn't have been able to access them when the FE launched. But I haven't seen any updated miners for the Vega FE, so... highly unlikely. It's not like anyone outside of AMD has access to that info yet either, at least not the people it would be relevant to.
 
The Claymore ETH miner got a Vega ASM update in v9.8, released a few days ago. Claymore was also working closely with AMD to fix the DAG epoch issue, though Claymore reports his Vega FE only does ~40 MH/s even with v9.8.
 
Weird, I really didn't get the update notification lol; Awesome Miner is pretty much always on top of those things. Time to update, thx!
 
Fuckers. Going to have to complain; what's the use of buying software to take care of those things when it doesn't do it :/
 
The post that generated the Vega RX 70-100 MH/s hash-rate rumor got the poster perma-banned.

Not sure if that was because the rumor is false or because it breaks someone's NDA. Though even if false, I suppose it could still break an NDA, if the NDA has Fight Club rules 1 & 2.

No he wasn't; that was just a joke from another staff member. Gibbo owns OCUK, and he isn't going to perma-ban himself.
 
No he wasn't; that was just a joke from another staff member. Gibbo owns OCUK, and he isn't going to perma-ban himself.

You're right. I don't know the site or the people; I just scanned through posts to see if there was an update. OCUK doesn't seem like a good place to shop: 1/10 stars over 129 reviews at ResellerRatings, mostly about bad customer service and being lied to. Guess we'll see about the latter in 10 days.
 
You're right. I don't know the site or the people; I just scanned through posts to see if there was an update. OCUK doesn't seem like a good place to shop: 1/10 stars over 129 reviews at ResellerRatings, mostly about bad customer service and being lied to. Guess we'll see about the latter in 10 days.

Gibbo is a salesman; he'll say pretty much anything to make a sale lol. If he says something is awesome, take it with a grain of salt; if he says things aren't looking good, expect them to be really bad.
 
You're right. I don't know the site or the people; I just scanned through posts to see if there was an update. OCUK doesn't seem like a good place to shop: 1/10 stars over 129 reviews at ResellerRatings, mostly about bad customer service and being lied to. Guess we'll see about the latter in 10 days.

ResellerRatings? What is that site? Pure rubbish. The only shops that get positive reviews there are the ones that are merchant members; it's basically a scam.

Overclockers has very good customer service: 6000+ reviews on Google in the last 12 months at 4.8 out of 5 stars, and 9.1/10 from 2100 reviews on Trustpilot.
 
Anarchist4000 Your claim that Vega 56 is on par with a 1080 and that Vega 64 is a good deal above that is completely unfounded; you have no evidence whatsoever to support it. AMD's own overly enthusiastic marketing slides pit Vega 64 against the 1080. At this point you're just spewing more and more BS, and a few weeks from now we'll be chiding you over your bullshit predictions, as we have done ad nauseam since the beginning of time. You do this every fucking AMD release.
How so?

http://wccftech.com/amd-radeon-rx-vega-mining-performance-great/

Card: Vega 56 / Vega 64 / Vega 64 LC
FP32 (TFLOPS): 10.5 / 12.6 / 13.7
Memory bandwidth (GB/s): 410 / 484 / 484

If the leaked numbers were right and Vega 56 was hovering around 1080 territory, and the higher-tier products are... higher, why wouldn't they be faster?
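
Those FP32 figures are just stream processors x 2 FLOPs per clock (one FMA) x boost clock. A quick sanity check, assuming the commonly reported Vega shader counts and boost clocks:

Code:
# FP32 throughput = stream processors x 2 FLOPs (one FMA) x boost clock.
# Shader counts and boost clocks are the commonly reported Vega specs,
# treated here as assumptions.
cards = {
    "Vega 56":    (3584, 1.471),  # (stream processors, boost GHz)
    "Vega 64":    (4096, 1.546),
    "Vega 64 LC": (4096, 1.677),
}

for name, (sps, ghz) in cards.items():
    print(f"{name}: {sps * 2 * ghz / 1000:.2f} TFLOPS")

That gives ~10.54, ~12.66, and ~13.74 TFLOPS, in line with the 10.5 / 12.6 / 13.7 above.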
 
How so?

http://wccftech.com/amd-radeon-rx-vega-mining-performance-great/

Card: Vega 56 / Vega 64 / Vega 64 LC
FP32 (TFLOPS): 10.5 / 12.6 / 13.7
Memory bandwidth (GB/s): 410 / 484 / 484

If the leaked numbers were right and Vega 56 was hovering around 1080 territory, and the higher-tier products are... higher, why wouldn't they be faster?
First off, you can't post a rumor to prove a prior rumor is true. Secondly, WCCFtech doesn't exactly have a great track record. Thirdly, mining hash rate is not equal to actual graphical performance in games.
 
First off, you can't post a rumor to prove a prior rumor is true. Secondly, WCCFtech doesn't exactly have a great track record. Thirdly, mining hash rate is not equal to actual graphical performance in games.
I'm using that source for simple hardware figures, as they have a giant table; nothing too unreasonable about that. I wasn't discussing mining hash rates, but there have been hints from reputable people for some time that they are rather strong, which is not wholly unexpected given the hardware capabilities.

As for the rumored performance in the TweakTown leak, I'm still taking that with a grain of salt. Taken at face value, my comparison seems valid, as that's precisely what the leak shows; nothing too complicated about that. More likely, new features were enabled and drivers tuned since the FE released. RX would have released earlier otherwise, unless inventory was the entire reason for the delay. So far we've seen a lot of comparisons of minimum framerates, and cards compared based on cost. That in no way invalidates the possibility that they may be significantly faster in average and maximum framerates.

I called the Polaris release correctly and the async issue correctly, and a year later I haven't seen anything inaccurate about my claims; the only odd one was a high-end Polaris model that never materialized. I follow the market for investment purposes, and being consistently wrong isn't a great strategy. Consequently I'm seeing gains of several hundred percent yearly. If any prospective consumer wants an even remotely educated opinion from reading forums, this is not the place to get it. The general understanding of engineering and of how electronics function is far too limited here, falling to individuals who only understand what is directly in front of them, with no ability to anticipate the effects of changing the system. That, and a bunch of viral marketers who probably took a lot of heat from their boss after the whole async debacle.
 
This card needs to come out; this thread is literally a dumpster fire. I'm going to buy one just because I have followed this thread for months, even if the card can't play Wolfenstein 3D.
 
Once Vega is released, I'm quite sure that regardless of how it turns out, we will go right back to arguing why some game titles should be ignored since they don't really represent Favored_Vendor's cards very well. Usually this will be explained away by the presence of an optional setting in the graphics menu which turns on a light shaft, better ambient occlusion, or makes characters have fully rendered pubes. This shows the title has some vendor bias, and so it should be ignored.

Or the "wrong" API was used to make the game.

Or something.
 
How so?

http://wccftech.com/amd-radeon-rx-vega-mining-performance-great/

Card: Vega 56 / Vega 64 / Vega 64 LC
FP32 (TFLOPS): 10.5 / 12.6 / 13.7
Memory bandwidth (GB/s): 410 / 484 / 484

If the leaked numbers were right and Vega 56 was hovering around 1080 territory, and the higher-tier products are... higher, why wouldn't they be faster?

If Vega 56 matches a 1080, why did they show Vega 64 matching a 1080 in the official AMD presentation?

As for the figures: 13.1 TFLOPS puts it well ahead of the Titan Xp (reference clocks). Vega is comfortably the single most powerful GPU in terms of FP32 throughput; it just doesn't translate into effective performance. Those are facts that have been verified tens of times by all the independent reviewers who benchmarked Vega FE. Tiled rasterization will alleviate memory bandwidth bottlenecks, thus improving performance; in the absence of memory bandwidth constraints it will reduce power consumption instead. The performance numbers presented by AMD (games) already use the new geometry pipeline (sans primitive shaders, as those require developer intervention), so I don't see where you're coming from here.
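
For intuition on the tiled-rasterization point, a toy framebuffer-traffic model helps (purely illustrative numbers, not a model of Vega's actual draw-stream binning rasterizer): immediate-mode rasterization pays a DRAM read-modify-write for every covered pixel of every triangle, while a binned rasterizer resolves overlap in an on-chip tile buffer and writes each tile out once.

Code:
# Toy framebuffer-traffic comparison, immediate-mode vs binned.
TILE = 32            # tile edge in pixels (assumed)
BYTES_PER_PIXEL = 4  # 32-bit color; depth ignored for simplicity

def immediate_traffic(pixels_per_tri):
    # read + write in DRAM per covered pixel, for every triangle
    return sum(2 * BYTES_PER_PIXEL * n for n in pixels_per_tri)

def binned_traffic(tiles_touched):
    # overlap resolved on-chip; one write-out per touched tile
    return tiles_touched * TILE * TILE * BYTES_PER_PIXEL

# 100 overlapping triangles of ~5000 px each, landing in 40 tiles:
print(immediate_traffic([5000] * 100))  # 4,000,000 bytes to DRAM
print(binned_traffic(40))               # 163,840 bytes to DRAM

The more the overdraw, the bigger the gap, which is exactly the bandwidth relief being described.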

As for your remarks about investments and knowing the industry and whatnot: this is the internet. I'm Jen-Hsun Huang, Kyle is the Dalai Lama, Razor1 is Jim Keller, and Abraham Lincoln warned the world not to believe everything you read on the internet.

HardOCP doesn't need to be a place for people to learn about electronics, and people don't need to learn about electronics to talk about hardware. Most people are perfectly reasonable, recognize there are limitations to their understanding, and will either inform themselves if interested or just not talk BS. It's not hard.

In my eyes, all you've been doing is harping on about pretty much anything AMD does as if it's the second coming of Christ, and you set ridiculously high expectations for everything; then, when things inevitably turn out to be less flabbergastingly good than you claimed, you just move on to the next thing.

I would have been very happy if Vega's performance actually scaled with throughput, but it doesn't, and I simply don't buy the whole "wait for dev support" mantra the AMD fans keep repeating. It never pans out. I have no vested interest whatsoever in the financials of either company; you might.
 
Once Vega is released, I'm quite sure that regardless of how it turns out, we will go right back to arguing why some game titles should be ignored since they don't really represent Favored_Vendor's cards very well. Usually this will be explained away by the presence of an optional setting in the graphics menu which turns on a light shaft, better ambient occlusion, or makes characters have fully rendered pubes. This shows the title has some vendor bias, and so it should be ignored.

Or the "wrong" API was used to make the game.

Or something.

If what you're talking about is the inevitable one benchmark where Vega performs exceedingly well, then absolutely not, it shouldn't be ignored; but it should be called what it is: one Vega-optimized title.

Remember Quantum Break DX12? Users here and elsewhere went absolutely mental with that one: Fury X was crushing Titan X, etc. AMD master plan, NVIDIA doom and gloom. That panned out just dandy, didn't it? Fury X outperforming the 980 Ti two years later in every game, 4 GB HBM > 8 GB GDDR5, Maxwell doesn't support async and neither does Pascal, so they can't run modern titles at max settings. What am I missing? Oh, right: something something rasterization rates aren't important, primitive discard and geometry units aren't super important, at least until AMD's are comparable to the competition's.


So this is my one and only response to the now-old "there's a group of people who always bash AMD" line: it's just normal people talking about hardware being accused of heresy if they point out any flaws or dampen others' enthusiasm.

Dd
 
If what you're talking about is the inevitable one benchmark where Vega performs exceedingly well, then absolutely not, it shouldn't be ignored; but it should be called what it is: one Vega-optimized title.

Remember Quantum Break DX12? Users here and elsewhere went absolutely mental with that one: Fury X was crushing Titan X, etc. AMD master plan, NVIDIA doom and gloom. That panned out just dandy, didn't it? Fury X outperforming the 980 Ti two years later in every game, 4 GB HBM > 8 GB GDDR5, Maxwell doesn't support async and neither does Pascal, so they can't run modern titles at max settings. What am I missing? Oh, right: something something rasterization rates aren't important, primitive discard and geometry units aren't super important, at least until AMD's are comparable to the competition's.


So this is my one and only response to the now-old "there's a group of people who always bash AMD" line: it's just normal people talking about hardware being accused of heresy if they point out any flaws or dampen others' enthusiasm.

Dd
That is actually a pet peeve of mine. There is no reason to shut off manufacturer-specific GPU features in games when doing benchmark testing; it should just be acknowledged that the feature is specific to one manufacturer. The Witcher 3 is a great example, as it uses NVIDIA GameWorks extensively for hair on people and creatures. That is an actual feature that should be discussed in GPU reviews; it makes a legitimate difference in the game's presentation and should not be ignored because it favors one GPU manufacturer over the other. For the same reason, a game that favors AMD should not be ignored; there are people who may want to play such a game, and seeing how both manufacturers handle it is part of the buying process.
 
That is actually a pet peeve of mine. There is no reason to shut off manufacturer-specific GPU features in games when doing benchmark testing; it should just be acknowledged that the feature is specific to one manufacturer. The Witcher 3 is a great example, as it uses NVIDIA GameWorks extensively for hair on people and creatures. That is an actual feature that should be discussed in GPU reviews; it makes a legitimate difference in the game's presentation and should not be ignored because it favors one GPU manufacturer over the other. For the same reason, a game that favors AMD should not be ignored; there are people who may want to play such a game, and seeing how both manufacturers handle it is part of the buying process.

I agree, but I was thinking about actual hardware-specific optimizations, like Doom's use of GCN shader intrinsics in its TSSAA implementation. HairWorks is not an example of that; it's just that GCN's geometry performance is much lower than Maxwell's, and there was a massive performance hit.
 