Vega Rumors

Nvidia trumps AMD when it comes to anything professional, and a big part of it is drivers. AMD doesn't have the resources to do any better, and they will always be behind Nvidia when it comes to that. To this day Nvidia outdoes AMD with their DX11 drivers; it is well known and I have read up quite a bit on it. Nvidia really does a lot of magic in software where AMD just goes the brute-force way. That's just the way it's been. AMD won't get any better all of a sudden.

Then why is this card marketed towards professional 3D work and then compared to a GP102 with gaming drivers? It's not only the drivers; Nvidia has traditionally had far, far more geometry performance, which is very important in pro graphics where you can have very complex models.

This is NOT the AMD equivalent of the Titan line. These are not running gaming drivers; these are running pro drivers, for pro graphics. It should be compared to other cards with drivers for pro graphics. The people spending $2K to buy one of these are not morons, but AMD clearly intends to treat them as such if it gets them a sale, because this comparison to the Titan X is seriously shameful.

The Vega FE has only one market where it is even remotely relevant, and that's machine learning, but with fledgling support for ROCm it will have a hill to climb. All it offers is packed FP16 at double throughput, which on the Nvidia side is exclusive to GP100. In that context it costs less than half as much, but it doesn't do half-rate FP64.
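For what "packed FP16 at double throughput" means in numbers, here is a quick back-of-the-envelope sketch. The 4096-shader count and 13.1 TFLOPS figure are the rumored FE specs from this thread; the ~1.6 GHz clock is inferred from them, not confirmed:

```python
# Peak throughput from shader count and clock: each shader retires one FMA
# (2 FLOPs) per cycle; packed FP16 fits two half-precision ops per FP32 lane,
# so the FP16 rate is simply double the FP32 rate.
def peak_tflops(shaders, clock_ghz, ops_per_clock=2):
    return shaders * ops_per_clock * clock_ghz / 1000.0

fp32 = peak_tflops(4096, 1.6)       # assumed ~1600 MHz boost clock
fp16 = peak_tflops(4096, 1.6, 4)    # packed FP16: 4 FLOPs per shader per clock
print(round(fp32, 1), round(fp16, 1))  # 13.1 26.2
```

So the rumored 13.1 TFLOPS FP32 would imply roughly 26 TFLOPS of packed FP16, which is the figure that matters for the machine-learning pitch.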
 
LOL, there are NUMA APIs under Windows and Linux, man.
Those APIs affect the algorithms used to implement a function how, exactly? Or did you not even look at, or understand, what they were doing? APIs aren't required to run efficiently on a NUMA architecture at all. All they do is provide an interface to optimize scheduling and memory allocations.

Ah yes, continue pretending that the Titan X is 'terrible' at the benchmark; it's a question of the drivers used, Anarchist, and you know that perfectly well. I am not moving goal posts at all. I am simply pointing out that a card costing half as much as a Vega FE performs just like it, and in less than half the power envelope.
I say the results show a simple mathematical equation; you state that is untrue because claims that weren't made are false. How is that not moving goal posts? Is that any different from running wild with claims that Vega uses 375W? At least the performance was a measured figure; the power figures are the maximum allowable given the connectors. Vega may very well be using just 100W in that test.
 
Then why is this card marketed towards professional 3D work and then compared to a GP102 with gaming drivers? It's not only the drivers; Nvidia has traditionally had far, far more geometry performance, which is very important in pro graphics where you can have very complex models.

This is NOT the AMD equivalent of the Titan line. These are not running gaming drivers; these are running pro drivers, for pro graphics. It should be compared to other cards with drivers for pro graphics. The people spending $2K to buy one of these are not morons, but AMD clearly intends to treat them as such if it gets them a sale, because this comparison to the Titan X is seriously shameful.

True. I am with you. But AMD SUCKS in any drivers other than DX12, which reduces their shitty driver overhead. AMD won't come close to matching Nvidia in the pro market for a while. They have a long way to go. I am not making an excuse; it is a fact! I am predicting AMD will likely land close to the 1080 Ti with Vega, with more watts, on the gaming side. I expect 50 watts more on average and likely up to 375W if you overclock. If they can do that, it's still much better than Polaris.

If the gaming cards have two 8-pin connectors, then we're looking at 300W stock and a max of 375W with OC. Still shitty wattage, but at this point I think they are probably desperate to at least match Nvidia, if not surpass it. Yeah, it sucks, but it will make fanboys happy.
 
My 1080 Zotac Amp comes with two 8-pin connectors, so that's not a good way to judge power draw, yet to no surprise it's drawing less power than my old 290X.
 
I say the results show a simple mathematical equation; you state that is untrue because claims that weren't made are false. How is that not moving goal posts? Is that any different from running wild with claims that Vega uses 375W? At least the performance was a measured figure; the power figures are the maximum allowable given the connectors. Vega may very well be using just 100W in that test.

Now you're just typing a lot of words and saying nothing worth reading. I stated that claims that weren't made are false? What? The point is that Vega FE is absolute *shit* performance per $ and per W as far as pro graphics are concerned. I have concluded this based on the non-fraudulent data you are so adamantly defending. I am not stating an opinion here; we are talking about a product costing less than half as much and drawing less than half as much power matching it.

I have abandoned all hope of getting you to stop pretending the comparison to the Titan X in that context is meaningful; for the purposes of this discussion I'll pretend to also be an idiot and absolutely agree that the results are unequivocally correct and representative of the relative performance of these cards.

How am I running wild with claims that Vega uses 375W? The card is rated for 375W; you think it uses less? Then why the hell does the water-cooled edition need 75W extra with the same 13.1 TFLOPS rating? These aren't over-specced overclocking cards.
 
My 1080 Zotac Amp comes with two 8 pin connectors, not a good way to judge power draw, yet to no surprise it's drawing less power then my old 290X.

Your Zotac Amp is designed to be overclocked, and Nvidia traditionally over-specs; actual power draw is usually lower. AMD, on the other hand ...
 
Those APIs affect the algorithms used to implement a function how, exactly? Or did you not even look at, or understand, what they were doing? APIs aren't required to run efficiently on a NUMA architecture at all. All they do is provide an interface to optimize scheduling and memory allocations.


They give you the necessary extensions to create NUMA-aware programs. Without those extensions it would be very hard, well, actually impossible, to do.
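To make the disagreement concrete, here is a minimal, Linux-only sketch of the scheduling half of what those APIs expose (the `cpu0` below just stands in for "a CPU on node 0"; this is an illustration, not either poster's code). Pinning a process to the CPUs of one NUMA node keeps its memory accesses local; binding the allocations themselves would need libnuma's `numa_alloc_onnode` on Linux or `VirtualAllocExNuma` on Windows, neither of which is wrapped here:

```python
import os

def pin_to_cpus(cpus):
    """Restrict this process to the given CPU set; return the old set."""
    old = os.sched_getaffinity(0)   # currently allowed CPUs for this process
    os.sched_setaffinity(0, cpus)   # restrict scheduling to `cpus`
    return old

cpu0 = min(os.sched_getaffinity(0))  # pretend this CPU belongs to "node 0"
old = pin_to_cpus({cpu0})
assert os.sched_getaffinity(0) == {cpu0}
os.sched_setaffinity(0, old)         # restore the original affinity
```

Which is really both posters' points at once: the OS exposes the placement knobs, but nothing in them changes the algorithm the code runs.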
 
My 1080 Zotac Amp comes with two 8-pin connectors, so that's not a good way to judge power draw, yet to no surprise it's drawing less power than my old 290X.


The Zotac Amps are overkill on the power connectors, which pissed me off a bit. I just bought 18 of them for mining and had to jerry-rig the power connectors to get them to work, because the breakout boards only have 10 power connectors and I needed 12 for all 6 Zotac cards. And they definitely don't even use much power at all, 90 watts each, lol.
 
Are those max power ratings? Ratings under what conditions? Information is too sparse, but it looks like a turd. Not sure why AMD would compare a Pro card to a gaming card - other than that they would look terrible if they did. Now who in the world would pre-order one of these other than someone who has money to gamble with? When the card is launched it should be compared against similarly priced Nvidia professional cards - clear and simple. That does not mean some game tests shouldn't be run with pro drivers and game drivers, if they work.

Anyway, with the 300W and 375W ratings, it does not look like too much headroom will be left, which is the sad part. If it were 225W and 275W, that would be much better. In the end it does not look particularly promising, if the information is accurate.
 
Really? I thought Raja put out 225W / 12.5 TFLOPS for Vega. Now it's maybe 300W/375W for 13 TFLOPS, which is a huge difference. Feels very much like the Fiji launch. We just need some real reviews on this.
 
Really? I thought Raja put out 225W / 12.5 TFLOPS for Vega. Now it's maybe 300W/375W for 13 TFLOPS, which is a huge difference. Feels very much like the Fiji launch. We just need some real reviews on this.

Raja is known for overhyping everything. He is an engineer; he just needs to shut up sometimes. It's like celebrating a newborn child, only there is a big difference between a newborn and a fucking graphics card. I have no doubt Lisa pulled him from the Computex show, because he just talks everything up.
 
Anyway, with the 300W and 375W ratings, it does not look like too much headroom will be left, which is the sad part. If it were 225W and 275W, that would be much better. In the end it does not look particularly promising, if the information is accurate.

I am not exactly surprised.

I said months ago on Anandtech that Vega's development wasn't going well.

I said that the water-cooled Vega is 10%-15% faster than the GeForce GTX 1080, but is already at its power limit OOB with no headroom.

It looks like AMD went ahead and pushed even more power into it.

I also said that Vega had been delayed beyond 1H 2017.

Anyway, the AMD loyalists called me a troll and got me banned from Anandtech.
 
Ugh... I really hope AMD pulls a rabbit out of their hat on this one... If RX Vega is anything like these leaked Pro products, I weep for the industry...
 
Do we believe the 300W/375W TDP?

At the end of the day it is still 4096 shaders driving HBM memory, so why so different from Fury?

Even more so, it's now made on 14nm and has HBM2 with half the bus width of the previous card; it should be less, not more!
 
RX Vega is starting to look like another failure from the red team like the ATi Radeon HD 2900 XT.

The difference might be that with the 2900 XT, the lowly 8800 GTS ate its lunch. I don't expect the 1080 will be eating Vega's lunch.

We still know SFA, however.

What a weird non-launch.
 
RX Vega is starting to look like another failure from the red team like the ATi Radeon HD 2900 XT.
Well, that is one way to look at it - black and white - but it is more like Nvidia just hit it out of the park with Pascal, big league. It looks like a 600MHz bump for AMD over Fiji, so will they get a 60% bump in gaming performance? That is what they would need to beat a 1080 Ti (not OC). I am doubtful, so now it comes down to perf/$, plus any Nano-type version, which makes it more interesting, at least for me. As for Nvidia, I will be very interested in Volta, and RTG may just have to release Navi next year as scheduled to have any chance of keeping Nvidia in sight.
 
RX Vega is starting to look like another failure from the red team like the ATi Radeon HD 2900 XT.

The 2900 XT wasn't that bad a card, especially at higher IQ/res and once the drivers got in order (and again, it was eclipsed by the sublime G80). But yeah, it's sounding like a typical AMD release of late: middling performance vis-à-vis the competition and redlined power draw. And now with the rumors of Vega being priced higher to reap margins from miners, it's really not looking good.
 
Vega might as well be a turd. It simply looks to me like those power figures are maximum board power: 300W for the 8+6 pin and 375W for the 8+8 pin. We will find out soon enough. But yeah, going by AMD's history, they probably couldn't give us the performance we wanted without the power.
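The 300W/375W split does map exactly onto connector maximums, which supports the max-board-power reading. A quick check using the PCIe spec limits (75W from the slot, 75W per 6-pin, 150W per 8-pin auxiliary connector):

```python
# Maximum deliverable board power per the PCIe specs.
CONNECTOR_W = {"slot": 75, "6pin": 75, "8pin": 150}

def max_board_power(*aux):
    """Slot power plus every auxiliary connector's maximum."""
    return CONNECTOR_W["slot"] + sum(CONNECTOR_W[c] for c in aux)

print(max_board_power("8pin", "6pin"))  # 300 -- the air-cooled rating
print(max_board_power("8pin", "8pin"))  # 375 -- the water-cooled rating
```

Both rumored figures fall out of the connector math exactly, so they say nothing about actual draw, only about the ceiling the board is wired for.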
 
Probably best to wait for ACTUAL power consumption. You can slap five 8-pin PCIe connectors on a card; it doesn't mean it will use all the power.
 
Vega might as well be a turd. It simply looks to me like those power figures are maximum board power: 300W for the 8+6 pin and 375W for the 8+8 pin. We will find out soon enough. But yeah, going by AMD's history, they probably couldn't give us the performance we wanted without the power.
Usually you rate a card for the power draw at default clocks/boost clocks - not plug ratings - but we are dealing with AMD, so who knows what those ratings are. Especially since 375W is above the 300W standard for a card. What! Are those cooler ratings, maybe??? :wacky:
 
It's the engineering max power draw; most people should be smart enough to realize this. Just like my Zotac 1080 can technically draw 375 watts with two 8-pins, but it's nowhere near that. 250 watts or less is my expectation for power draw. Actual specs are still under NDA, but we should know the real truth soon enough.
 
Yeah, it doesn't look great, but we'll find out in a short while!

If it can stay under 300 watts for the boost, I still think just cranking the fan up would work, as with the previously linked blower cards pulling that much juice.

Honestly, this could just be their overreaction to PowerGate, with the 6-pin on the RX 480 drawing more than it was rated for.
 
It's the engineering max power draw; most people should be smart enough to realize this. Just like my Zotac 1080 can technically draw 375 watts with two 8-pins, but it's nowhere near that. 250 watts or less is my expectation for power draw. Actual specs are still under NDA, but we should know the real truth soon enough.

You are right, the R9 290X has a 250W TDP, and it used significantly less power during average use:
https://www.techpowerup.com/reviews/AMD/R9_290X/25.html

I expect Vega to use a whopping 20W less than its rated TDP during gaming.
 
Just because the 290X was a thirsty card does not mean Vega is the same. We'll know what it actually draws in a few more days.
 
Is there any info on the release date?

Hopefully the workstation cards are not the only Vega they are releasing.
 

Wow, what a turd.

#waitforvega #waitforfailure

Vega might as well be a turd. It simply looks to me like those power figures are maximum board power: 300W for the 8+6 pin and 375W for the 8+8 pin. We will find out soon enough. But yeah, going by AMD's history, they probably couldn't give us the performance we wanted without the power.

Don't you all go over to Anandtech and call Vega a turd, or you'll get hit with the ban hammer.
 
While I said it could be a turd, you all have to take those numbers with a grain of salt. Those numbers are basically max board TDP. It looks like they are giving out one for 6+8 pin and one for 8+8 pin. The reason I am pretty confident I am correct here is that the water-cooled version should help with TDP by default, as temps will be much lower, and the specs don't seem to be much different between the two versions. A water-cooled card technically shouldn't be using more power than an air-cooled one with similar specs. This makes me believe those are probably the max board power the board can sustain with 6+8 pin and 8+8 pin. What makes this even weirder is if they both have 8+8 pin but one is at 300W and one at 375W. So it could simply be a reporting error, since if they both have two 8-pins, max board power should be 375W. On the other hand, if they are both 8+8 pin, the water-cooled card shouldn't really be using 75 more watts than the air-cooled version with the same specs.
 
How does 300 or more watts help them meet their claimed efficiency gains over Polaris?

There'd be a lawsuit if they released a beast like that.
 