Vega Rumors

Vega is pure win.

Supports all the latest standards, runs cooler, is more efficient, and provides the user with the latest technology. Something a competitor can only offer in an $8k product.

RX Vega will outperform the higher-priced 1080 in modern games.




Nothing to really argue.
And if you need something more, then wait for a dual-GPU SoC. But as it stands, to get DX12.1 and the latest standards, not to mention lower-latency gaming, Vega is the easy choice for gamers.


Woah woah woah.

How is it more efficient exactly? It performs like a 1080 and draws almost double the power. Half as efficient, if anything.
 
Woah woah woah.

How is it more efficient exactly? It performs like a 1080 and draws almost double the power. Half as efficient, if anything.
It is very efficient at doing efficient things sufficiently. In fact, the closer you get to Vega efficiency, you begin to approach an efficiency quotidian that makes all other efficient GPUs look less efficient than other efficient GPUs.
 
Woah woah woah.

How is it more efficient exactly? It performs like a 1080 and draws almost double the power. Half as efficient, if anything.
ahh more BS from nTardiacs again..gone from 30-40W more to now "double"
lol yeah ok.. nearly 400+w for 1080 perf.. lay off the green chapstick and turn in your kneepads..
 
ahh more BS from nTardiacs again..
lol yeah ok.. nearly 400+w for 1080 perf.. lay off the green chapstick and turn in your kneepads..
Even without being hyperbolic, there is scant evidence to support that it is more efficient. Again, we will know for sure in a few days, but it definitely looks like a power-hungry card. That is obviously not a big deal to everyone, but it is to some. Power efficiency is something I look at because I live in the damn desert and it is too expensive to run the AC too low, so I try not to stick too much of a furnace under my desk.
 
ahh more BS from nTardiacs again..gone from 30-40W more to now "double"
lol yeah ok.. nearly 400+w for 1080 perf.. lay off the green chapstick and turn in your kneepads..

30/40 watts more can still be double if your reference is 30/40W...

Having said that, we are comparing a 180W card (1080) to a 300W card (Vega 64, air-cooled) at best.

If we look at PCPer's power testing, you will find the GTX 1080 at 165-170W in RotTR, whereas:

"In Rise of the Tomb Raider the Vega FE Liquid pulls around the 350-watt mark, compared to 290 watts for the air-cooled Vega FE and 250 watts for the GTX 1080 Ti."

And this is what the performance landscape looks like in the very same test they used for power measurements.

[chart: RoTRDX12_2560x1440_OFPS_2.png]


Well, would you look at that: it's a 1712MHz overclocked water-cooled Vega FE still losing to the 1080 despite drawing *more* than double the power. 350W is for the stock Vega WC, btw; the 1712MHz card is over 400W lol

You're more than welcome to continue accusing me of making false, unfounded claims though!
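The efficiency arithmetic argued back and forth above can be sketched in a few lines. This is a rough illustration only: the wattages are the PCPer figures quoted in the thread, and the frame rate is a placeholder assumption (the thread's premise is roughly equal performance, so efficiency reduces to the inverse power ratio).

```python
def perf_per_watt(fps: float, watts: float) -> float:
    """Frames per second delivered per watt of board power."""
    return fps / watts

# Board power under load in RotTR, from the figures cited above (watts).
power = {"GTX 1080": 170, "Vega FE air": 290, "Vega FE liquid": 350}

# Assume roughly equal frame rates, as the thread argues (illustrative value).
fps = 70
eff = {card: perf_per_watt(fps, w) for card, w in power.items()}

# At equal performance, relative efficiency is just the inverse power ratio:
ratio = power["Vega FE liquid"] / power["GTX 1080"]  # ~2.06x the power draw
```

Under these assumptions the liquid-cooled card draws about 2.06x the power for the same output, i.e. roughly half the efficiency, which is the claim being made.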
 
What is odd is AMD marketing this as such a heavily modified GCN architecture that it's essentially new; they really should have just kept their mouths shut and let it die before it even came out. I can't even imagine a bigger Polaris being much worse. The power characteristics and performance scale so much like Polaris; the only difference is higher clock speeds.

They liked the name Vega so much they wanted to use it as their flagship name. Now, in marketing terms, next-generation top-end cards can't be called Vega anymore; they've got to use something else. Fury as a marketing name is dead now too. What are they going to call their next one? They should have called it Fury Maxx; the name is already dead, just kill it some more lol. They should keep one name for their top-level card, like nV does with Titan. Vega is a good name to use for that. Oh well, just have to open up the dictionary again, I guess.
 
What is odd is AMD marketing this as such a heavily modified GCN architecture that it's essentially new; they really should have just kept their mouths shut and let it die before it even came out. I can't even imagine a bigger Polaris being much worse. The power characteristics and performance scale so much like Polaris; the only difference is higher clock speeds.

They liked the name Vega so much they wanted to use it as their flagship name. Now, in marketing terms, next-generation top-end cards can't be called Vega anymore; they've got to use something else. Fury as a marketing name is dead now too. What are they going to call their next one? They should have called it Fury Maxx; the name is already dead, just kill it some more lol. They should keep one name for their top-level card, like nV does with Titan. Vega is a good name to use for that. Oh well, just have to open up the dictionary again, I guess.
How about Vega Furrier? I want to see the white paper. Have they released that and I'm just too dumb to find it? I'm curious how different or similar the architecture is to Fiji or Polaris.
 
What is odd is AMD marketing this as such a heavily modified GCN architecture that it's essentially new; they really should have just kept their mouths shut and let it die before it even came out. I can't even imagine a bigger Polaris being much worse. The power characteristics and performance scale so much like Polaris; the only difference is higher clock speeds.

They liked the name Vega so much they wanted to use it as their flagship name. Now, in marketing terms, next-generation top-end cards can't be called Vega anymore; they've got to use something else. Fury as a marketing name is dead now too. What are they going to call their next one? They should have called it Fury Maxx; the name is already dead, just kill it some more lol. They should keep one name for their top-level card, like nV does with Titan. Vega is a good name to use for that. Oh well, just have to open up the dictionary again, I guess.

I like how they append the number of CUs to the name, though; it's going to sound like shit for low-end cards, but oh well.
 
I like how they append the number of CUs to the name, though; it's going to sound like shit for low-end cards, but oh well.


That's not all that bad IMO; it makes it easy to demarcate the lineup.

PS: this is why I say Doom is more optimized for AMD hardware; again, no one's fault other than nV's, for being late with Vulkan driver updates for their intrinsics.

[slide: arch-23.jpg]


Love this slide

[slide: arch-30.jpg]


Where did all those power savings go?
 
LOL sounds like furry (furrier).

Don't think they've released it yet, but TechPowerUp has a good write-up (overview), though nothing as in-depth as a white paper yet. I would like to see it too!

https://www.techpowerup.com/reviews/AMD/Vega_Microarchitecture_Technical_Overview/
So the HBCC sounds impressive, but everything else is just a beefed-up Fiji with some Polaris compute features thrown in. Not sure they can keep up if they stick with GCN for another generation or two.
 
A respin? An updated Vega early next year? A 16GB version using 4 stacks? Does anyone know how many revisions Vega has gone through so far? In other words, I'm really not sure how long Vega, or this version of the arch, will be on the shelves.
 
The problem with Vega, and why you are seeing so much hate for it, is the advertising campaign and guerrilla-marketing strategy used by AMD. It generated a lot of hype, and although this hype was only partially created by AMD, AMD gave the guerrilla marketers enough to work with. With claims like the biggest jump in performance per watt ever, increased IPC, and the "poor Volta" slogan, viral marketers had all the tools they needed to build hype. What AMD has been doing to maintain its credibility while getting cheap marketing done is use guerrilla marketing. We see a lot more rumors coming from AMD's side, along with mysterious posters that disappear after AMD launches.

Compare this with Nvidia, and we don't see it as much, if at all. How often do we see performance rumors of Nvidia cards aside from a couple of weeks before launch? Basically never. How often do we see them from AMD months before launch? Basically every launch since Fiji. Remember the Captain Jack slides, the Polaris slides that indicated GTX 980 Ti performance and 1600MHz overclocking, and with Vega, claims of GTX 1080 Ti performance or better, with the author of chipandbits.it claiming something similar. If we look at other forums, you will see long-term members saying things like "my sources within AMD are claiming..." What AMD has been doing is seeding fake information as cheap marketing, because their products are late and they need to stall, but they also want to deny accountability. The problem is that we discuss these rumors, and the viral marketers foster these topics into hype trains. They get so wild that if you say Vega performs like a GTX 1080, you're considered a hater and a fanboy. When these hype trains crash, it is vindication for those that didn't board them.

This launch has been particularly bad. It was not just the performance expectations; it was the fake launch dates. How often were we told to wait: until October 2016, then May for the Prey launch, then June, and finally the end of July. Add in the announcements about announcements, and presentations that lacked substance, like the Capsaicin event which showed only a damn name, and AMD was testing even the most hardened fans' patience.

Now we have AMD doing things like hindering reviews unless you test the product the way AMD wants, and split NDAs that only allow AMD products to be shown the way they want. It's just a damning marketing strategy.

Add in the inferiority of the actual product, which mixes some of the worst characteristics of Fermi (inefficiency) and the FX 5800 (performance inferiority vs. the competition), and it should be no surprise this product is getting slammed, along with those responsible for it.

If people say nothing and just pretend this marketing disaster didn't happen, they are only validating these types of marketing tricks in the future.

You're right on point here. AMD marketing (and their "influencers") have had their fingers in the pie that is the collective anticipation for Vega since the day it was announced. Watching Vega (and Polaris before it) unfold has been an education in guerrilla tactics, payola, context manipulation, and misdirection.

Not by any means suggesting nV isn't doing the same, but at least their claims are more consistent and, as you mention, there are far fewer of the planted, engineered early "leaks".
 
Would anyone explain to me why anyone should buy Vega over Pascal?

If you're looking at 1080-level performance and are not too bothered by the extra power draw (setting noise aside, as it could be a factor but depends on the model and its noise characteristics). Or, as some mention, FreeSync.
Also, considering Nvidia does seem to have issues (mostly in low-level-API games) on the AMD platform for reasons that are still not clear, I would probably go for Vega by default here; the additional benefit is that Polaris tends to perform better on Ryzen than on Intel when looking at relative DX11-to-DX12 performance.
So if on Ryzen, I feel Vega could be the right choice for now, for a few reasons.
Cheers
 
FreeSync monitor owners

The problem with this argument is that the cost of buying a more powerful power supply cuts into the savings of choosing a FreeSync monitor over a G-Sync monitor.

That's not to mention the higher electricity cost of not only powering the video card but also cooling the room.
 
The problem with this argument is that the cost of buying a more powerful power supply cuts into the savings of choosing a FreeSync monitor over a G-Sync monitor.

That's not to mention the higher electricity cost of not only powering the video card but also cooling the room.

Well, that's usually the primary argument for people choosing FreeSync over G-Sync: the cost of the monitors alone, and never the rest of the system. I compared that to choosing between a BMW and a Toyota based on the cost of windscreen-wiper replacements.

But another argument can be made that not every FreeSync monitor has a G-Sync equivalent, while the reverse almost never happens (I know of only one, the XB271HU, with no exact FreeSync equivalent). For those who favour their monitor over their GPU, that can be a pretty strong argument. For example, there are no 24" 4K or 32" 1440p 144Hz G-Sync monitors, but both exist as FreeSync versions.

Plus, the blur reduction on FreeSync monitors functions independently of the GPU, so it can be used alongside FreeSync or with an nVidia GPU. With a G-Sync monitor, ULMB and G-Sync are one or the other, and you lose both when hooking it up to an AMD GPU. Essentially, even for nVidia owners, there is some sense in going with a FreeSync monitor, while there is literally no reason for an AMD owner to buy G-Sync (there was one historically, but that is long gone).
 
Would anyone explain to me why anyone should buy Vega over Pascal?

To have a cohesive platform. I'm currently running Ryzen and a FreeSync monitor. I got rid of 2x 1070s since I was able to get back nearly 100% of their cost due to the mining craze. Waiting for the 56, and hopefully it's as good as the leaked benchmarks show; otherwise, there's no point in its existence vs. Fury.
 
A respin? An updated Vega early next year? A 16GB version using 4 stacks? Does anyone know how many revisions Vega has gone through so far? In other words, I'm really not sure how long Vega, or this version of the arch, will be on the shelves.

I don't see why they would; it would be a waste of money.
 
There is currently an internal power struggle with at least one powerful figure actively lobbying for Koduri to be removed from his position.
 
There is currently an internal power struggle with at least one powerful figure actively lobbying for Koduri to be removed from his position.

I must have missed the post where you explained it. Is there any reason anyone should believe you, beyond acknowledging the possibility that this is happening?
 
Here's one to throw on the old burn pile. This was brought up in the AMD pre-order thread.

https://videocardz.com/71591/rumor-amd-radeon-rx-vega-64-to-be-great-for-mining

Vega does 70-100 MH/s?

vs. the 1080 Ti's ~30 MH/s and the Fury X's ~28 MH/s.

Sounds preposterous... but if true, you won't be able to touch a Vega for less than $1K after launch. The mining operators will be special-ordering these things by the thousands.


This makes all the gaming arguments moot, since only a handful of gamers will ever own these cards.
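The ratio behind that claim can be put into a quick sketch. All hashrates here are the thread's rumored/approximate figures, not measured numbers; the Vega value is simply the midpoint of the 70-100 MH/s rumor.

```python
# Rumored/approximate Ethereum hashrates from the thread (MH/s).
rates_mhs = {
    "RX Vega (rumored)": 85.0,  # midpoint of the 70-100 MH/s rumor
    "GTX 1080 Ti": 30.0,
    "Fury X": 28.0,
}

# Advantage of each card relative to the 1080 Ti.
baseline = rates_mhs["GTX 1080 Ti"]
advantage = {card: rate / baseline for card, rate in rates_mhs.items()}

# Even the low end of the rumor would be ~2.3x a 1080 Ti, which is why
# miners would clear the shelves if it were true.
low_end_ratio = 70.0 / baseline
```

Even taken at its most conservative, the rumor implies well over double the hashrate of anything else on the market, which is what makes it sound preposterous.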
 
I'm just hoping the mining numbers are totally fake. Otherwise it will be almost impossible for people like me who are considering the RX for gaming.
 
I'm just hoping the mining numbers are totally fake. Otherwise it will be almost impossible for people like me who are considering the RX for gaming.
Those are probably the only real numbers, as this card seems to have been designed with mining in mind.
 
Those are probably the only real numbers, as this card seems to have been designed with mining in mind.

If that is truly the case, then I feel quite let down. I need an upgrade for my 480, and I was hoping Vega would be it. But it looks like it may be nigh on impossible to grab one at launch.
 
Anyone have a clue why it would be 3x FE Vega?
Probably a combination of hashes liking integers and the HBCC paging memory so that the physical memory requirements are lessened. Those features may not have been enabled when the FE released. The card may actually be compute-bound despite all the FLOPs.
 