Vega Rumors

Rys from RTG says Vega FE's gaming drivers aren't gimped compared to RX Vega; the driver is just older.
https://forum.beyond3d.com/posts/1989522/


LOL so now AMD is saying newer drivers will make the difference for Vega. Yeah, that is what happens when you get info from AMD. *sarcasm* Newer drivers have always panned out in shifting a card up a bracket.
 
Don't think he means that; he just means that by the time RX Vega is released, this current driver will be older.
 
Doesn't matter at this point; it's just an excuse to wait for future drivers, and he should be smart enough to know that won't happen. The only person he would ever quote on something like that is an AMD rep. He is fully aware that has never happened in the past, so why even try to make an excuse for it? Vega is a huge step backward for AMD: a card that barely matches up with one three tiers down, with exorbitant power usage.

Time to get real; he should have gotten real a few months ago. For a person who prides himself on knowing what he talks about, surely he could just be honest about it.

Worse yet, AMD has been working on these drivers since tape-out, and we know the tape-out occurred in June of last year. So one more month of newer drivers is going to make a difference? Yeah, that is really realistic.

What did Raja say? It's really, really hard to work on drivers. Yeah, it's hard when the underlying silicon isn't mustering up. That was the same damn excuse Jensen gave for the FX series. If that wasn't a red herring, I don't know what was.
 
I hate to say it, as I was really hoping Vega would pull off a turnaround, despite my just getting a 1080 Ti. Even if they price it at $450, it sounds like you'll still be getting 1080-level performance (over a year after the 1080's release) with little OC potential and embarrassing power consumption. This outcome for AMD is starting to sound like an annual echo.

As it stands, I'm very happy I picked up a used 1080 Ti from someone who was "getting ready for Vega".
 
They aren't going to take 1070-to-1080 performance levels all the way up to 1080 Ti levels in games with drivers. Not going to happen.

Let's say they can solidly get to 1080 levels, or somewhere around 10% above the 1080, with RX Vega. It still won't be good unless they price it at around 450 bucks, because it consumes so much more power it's crazy; 120 watts more is a huge difference. And then Fiji doesn't look as bad by comparison: at least it kept up with the 980 Ti, the second-best card, and only used 25 watts more.

Razor,

I'm going to call you on it. You don't know shit about driver development. A 20% decrease in performance in early driver builds, or more, is absolutely common until the bugs get worked out. When you are in development, you take time to establish a good footprint which is stable and easy to maintain. Then the reports fly in and there is a flurry of changes and modifications to tweak performance in the last weeks. You don't know how many stability and logging features are in, or how much optimization is missing from, the current BETA driver code base. I mean, have you sat there and done a reverse compile?

Hell, my personal AI code speeds up 5x when I remove all the logging (which shows how it's thinking and arriving at decisions).

This is why you never do official reviews until a company says "it's ready."

Do I think they will make up that 20% gap? I don't know; I'm not going to say. It's just too early to tell.
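
If you want to see the logging effect for yourself, here's a toy sketch (my own throwaway Python, nothing to do with AMD's driver code) that times the same loop with per-iteration trace logging on and off:

import logging
import time

# Hypothetical stand-in workload; real driver hot paths are native code,
# but the pattern of per-iteration instrumentation cost is the same idea.
logging.basicConfig(filename="trace.log", level=logging.DEBUG)
log = logging.getLogger("sketch")

def work(n, trace):
    total = 0
    for i in range(n):
        total += i * i  # stand-in for the real per-item work
        if trace:
            log.debug("step %d, running total %d", i, total)  # trace logging
    return total

for trace in (True, False):
    t0 = time.perf_counter()
    work(200_000, trace)
    label = "with logging" if trace else "without logging"
    print(label, round(time.perf_counter() - t0, 3), "s")

The logged run will typically be several times slower, which is the point: instrumentation you'd never ship can dominate a hot loop.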
 
These are not testing drivers, man; these are release drivers. They would have disabled all testing code in them.

I'm fully aware how debugging code can slow down a driver or application. This is not that.

If they still had debugging code in the drivers, they shouldn't even have released the product, because 20% is a small amount; it would be more in the neighborhood of 50%. Drivers, unlike applications, rely on CPU timings and cycles, and debug code doubles that for drivers because the code looks for validation. Double the cycles.

What you are saying is that the drivers are not even in beta form, that they are in debug mode. No, they are not.

Have you ever seen drivers from AMD come with debug code (officially released on their website or shipped with their cards)? I have never seen that, even with ATi or nV, so you think they are now going to ship debug code in drivers?

Keep in mind the same excuse came up for Vega's Doom showing, that Doom was in debug mode. Not going to fly: Vulkan in debug mode shows no difference in performance, because Vulkan was made that way. You can't treat applications and drivers as the same thing when talking about debug modes.
 

No, these are drivers for VEGA FE, not VEGA RX, and the purpose of VEGA FE is NOT gaming. AMD representatives were quoted last week saying they aren't showing game drivers because those aren't ready.

There's a difference between debug mode, a release build in gimped/diagnostic-debug mode, and a full release with all optimizations.

And how do you know they aren't in beta form? How do you know they aren't in gimp mode for stability purposes? Again, have you reverse engineered the code? Talked to an AMD rep? Worked as their engineer?
 
What, you think Scott (AMD product manager) telling Rys (Hexus editor and Beyond3D owner, now looking like he is spending too much time with Scott) that there is no gimping going on in the drivers is BS? I'll say this: what is BS is the second part of that statement, that RX Vega drivers will make the difference. They won't; 10% maybe, that is it. That still puts RX Vega at 1080 performance levels @ 300 watts. Still not good.

If he had to go to Scott to get that info, which god knows why since it is common sense, this card is in deep trouble.
 
AMD confirmed the drivers aren't gimped, and the card is marketed for both gaming and compute tasks.

[Image: an ostrich with its head buried in the sand]
 
Alright, so we're betting that Vega won't improve by more than 10% from here, on what specific metric? Time Spy and Fire Strike?

We'll record that, come back after the RX Vega launch, then again 3 months after that, as I bet Raja's driver team will still be hip-deep in figuring out how to get this thing in gear.

It'll be a gentleman's bet.

But to clarify, do we include higher clocks if superior cooling allows it? Or are we just talking drivers at, say, 1600MHz?
 
I think it's a matter of understanding. They were referring to Vega FE. They said last week the RX gaming drivers aren't ready. A release version that works in games doesn't mean it's been optimized yet. When you are dealing with multiple product lines, you focus your attention and resources on what the card is aimed at. So most of the driver work no doubt went into optimizing professional applications, and not DX12 or other games specifically.


I don't have anything invested in this either way; it's not my fight. But I reserve judgment until actual reviews, or maybe a driver released a week before the actual release of said product. You don't count on things till they are done.
 
Clocks aren't part of that, because clocks have nothing to do with drivers; it's just drivers at whatever clocks RX Vega ships at. So if it's 1500MHz instead of 1600MHz, we have to take that into consideration too. Cooling isn't part of it either, because cooling will affect clock speeds.

The games that land at or below the 1070 in this showing shouldn't be part of it either; I think something is going wrong in those games (Metro Last Light and Hitman) and they shouldn't be that low. That is why GTX 1080 performance levels seem to be where it should land.

We have two people with Vega FE showing the same results in Fire Strike and Doom, the same things AMD showed in Doom 6 months ago on the same card. What is going to change? Nothing short of a miracle or the second coming of Christ is going to change that.
 
The card itself. That's what's measured by Wattman.
And we're talking about liquid, which is always cooler than what it cools. Context, bro.


We are talking drivers at a fixed clock (i.e. cooling shenanigans excluded). Basically, take Vega FE now and Vega FE a few months later with the same set fan speed/power limit, and go at it.
 
Also, make sure to wait a few more months and compare it to Volta at release, since waiting on performance for a $1000 product makes sense.
 
I think what we should be looking at is whether it delivers price:performance against its competition: at least 1080 levels for <$549, or 1080 Ti levels for <$700.

That, in the end, is all that matters, and quite frankly all we should care about: the most gaming for our buck.

Now, before you start claiming I'm recanting my claims about driver improvements: I said "they may or may not make significant improvements. It's just too early to tell." I made no predictions about final performance targets.
 
This is Raja claiming there is no developer intervention required to make use of the "higher throughput per clock cycle" in terms of geometry. A fascinating claim, really, considering that on the Vega FE spec sheet geometry throughput is literally identical to Fiji's and Polaris'.
How is that fascinating? Because the feature isn't enabled in the drivers, that means it can't work transparently?

I won't lie, I'm highly amused by this launch, if only because of the sheer number of posters who will have to eat crow after over-hyping this damned thing to Proxima Centauri and back. It is disappointing to see AMD coming up with such a lackluster product after more than a year of waiting, and frankly this whole launch is a shambles.
That's my thought exactly. You have a technically superior Vega card, essentially running Fiji drivers, with none of the new performance and energy-saving features enabled, beating the 1080. Easily half, if not more, of the performance is still on the table. This isn't even the fastest variant, and we still need to wait for game optimizations to land. At least with FE shipped, more devs can start getting those in order.

The real question is how much Volta will be able to bring to the table to match it. So far it just looks like a process advantage.

Gimped and not including incomplete features are two distinct things. The gaming and pro drivers perform nearly identically. Many of Vega's features require driver support. There is plenty of research out there showing the potential gains, and they would be rather large once enabled. Couple that with Vega being technically superior to even Pascal on DX12 features.
 
You are making other AMD fans and trolls look bad with garbage like this.

LOL, that reminds me of an Ars Technica mod post about the FX:
I think there is definitely a lot of truth to what you are saying. As you may recall, a few years ago 3dfx was pushing the same type of ideas: "cinematic" special effects, etc. Yet they couldn't get their cards out the door. While they were squandering their opportunity for something revolutionary, Nvidia was stealing their business.

It seems to me that if you are going to attempt something revolutionary, you better at least have all the basics down and be able to execute.

And this is where Nvidia failed. The latest round of their cards were not released on time and the basics of what they offer (DX7 and DX8 support) don't quite measure up to what ATi offers.

I don't think the problem is that developer tools are lagging (that's always the case). I think it is (as you said) that they just got way ahead of themselves. Between that and the borked conversion to 0.13 micron, they kind of forgot about the here and now. (Maybe they didn't forget... Maybe it was truly due to the slow, successful conversion to 0.13 micron.)

That's how I see it... Maybe there is another twist to this...

Yeah, way ahead of its time :ROFLMAO:
 
Really wanted these cards to be great. I guess, looking at their marketing strategy for FE now, it makes sense; Vega kinda sucks. It will most likely suck a month from now too.

Newegg was able to cancel my orders, thankfully. Hopefully AMD is able to improve things a bit with drivers, but I agree with most here: not looking good at all.
 
At this point, why is the sky falling because Vega doesn't look to be competitive with the 1080 Ti? We knew this.

Some hoped otherwise, but not really anybody was suggesting it that I saw.

So we are getting what we expected performance-wise. Nobody expected this to be a Titan Xp killer or even a direct 1080 Ti rival.

There's nothing deflating about this at all, yet.

Whether this card is a success depends solely on price. If it's between 1080 and 1080 Ti speeds, it should be about $600, and AMD will find a market. The benefits of FreeSync are real (as are the benefits of G-Sync), and that's a draw for many with a FreeSync monitor, even if Nvidia has the faster top card.
 
It's the truth of the matter: AMD will have to live with little to no margin to stay afloat for another day, and hope the Zen tide lifts all of AMD's R&D budgets.
 
It's the power consumption. If it were at 220 or 230 watts, it wouldn't be a big deal, but go over 250, and at 300 the difference is something that can't be ignored. And that is going to force its price down.
 
Power draw wouldn't be an issue if performance were at 1080 Ti levels or higher with some OC headroom. Unfortunately, even with a driver update giving a 10% performance boost, this won't be a reality, and given that it's using HBM2 with a large die, it will be cost-prohibitive for AMD to price it below a certain point. On the other hand, with NVIDIA using GDDR5X, they can simply drop the price on their 1080, and then nobody besides maybe miners will want a Vega.

Someone should forward this to Raja:


Also, NVIDIA is now pushing relatively thin and light gaming notebooks with their Max-Q tailored designs, and now we know for damn sure AMD won't be able to match that. So once again, NVIDIA will continue to dominate the gaming notebook market.
 
Yeah. It's the power draw that turned me away. It's bad. Real bad.

Don't worry guys, the drivers will fix the power draw. :p
 
Yeah, as someone looking for something in the GTX 1070 or 1080 class of cards, as long as AMD can bring some competition to those cards, hopefully video card prices will finally come down. Pascal and Polaris barely even sold at their MSRPs between the Founders Edition crap and poor supply on the AMD side, and now the mining craze is pushing all video card prices up.
 
Looks like we are seeing the same issues we saw with Polaris: limitations of the current GCN architecture. I guess it's showing now that the graphics division took a back seat as we got closer to the Ryzen launch. I think Polaris and Vega are just something to stay afloat on the graphics side. It's looking more and more like Navi is the true next gen, and Polaris and Vega are a dying breed of GCN, the last cards to stay relevant for now.

1382MHz seems to be the sustained clock for the air-cooled version. I think we are looking at an air-cooled Vega and then a water-cooled Vega that will sustain 1600. Makes sense. We will probably have air-cooled for $399 and water-cooled for $599, because at this point AMD can't sell a bunch if they have a 1080 competitor priced any higher than $399 for the air-cooled card.
 
Unless you are mining, 300-watt power usage vs. 250-watt power usage is completely immaterial. I don't care for one moment if my card draws 50 watts more for gaming. I only game a couple of hours a day at most. That equates to a cent or two a night in power usage. I'm not going to choose my card based on which one costs me a couple of cents more per gaming session.

----------------

Seriously, 300 watts at 10 cents per kWh (a reasonable national average in the U.S.) costs 72 cents a day running 24 hours a day. So that's 3 cents per hour.
The 1080 Ti uses 250 watts at full tilt, which is 60 cents a day if running 24 hours a day. So that's 2.5 cents an hour.

So stop it already with the massive power use bull crap. Nobody with any sense cares.
The only way it matters is if you have a load of them running 24/7 mining. To the average person, whose computer is idle 95% of the time, it's completely moot.
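
If you want to sanity-check that arithmetic, here's a quick back-of-the-envelope Python script (the 10 cents/kWh rate and the wattages are the assumptions from this post, not measured figures):

# Electricity cost for a GPU at a constant draw.
RATE = 0.10  # dollars per kWh, the assumed U.S. national average

def cost_dollars(watts, hours, rate=RATE):
    return watts / 1000 * hours * rate  # kWh consumed times price per kWh

for watts in (300, 250):
    per_day = cost_dollars(watts, 24) * 100
    per_hour = cost_dollars(watts, 1) * 100
    print(f"{watts} W: {per_day:.0f} cents/day, {per_hour:.1f} cents/hour")

Which prints 72 cents/day (3.0 cents/hour) for 300 W and 60 cents/day (2.5 cents/hour) for 250 W, matching the numbers above.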
 
It's 300 watts at GTX 1080 performance; against the 1080's 180 watts, that's a 66% delta ((300 - 180) / 180 ≈ 67%). That shows how far AMD is behind nV when it comes to architecture and power efficiency; 66% is pretty much the gap we see between generations on a smaller node.

And it's not about mining; it's rare to use over 70% of a card's full TDP there anyway. My 580s use only 100 watts when single mining, and my 1070s use only 60 watts (GPU only), plus some more for the rest of the card.
 
First of all, you're comparing it to a 1080 Ti, which is a much more powerful video card. You should be comparing it to a 1080, which has a draw of 180W, significantly less. Not only that, but AMD desperately needs to make inroads into the gaming notebook market, and now it seems they will once again be completely shut out from the top end of the market, where margins are quite lucrative. NVIDIA is simply dominating with their Max-Q designs thanks to their low power draw.

https://www.nvidia.com/en-us/geforce/products/10series/laptops/max-q/


 
That is partially true. The truth is Polaris, and likely Vega, can be really efficient if kept within certain clock and voltage parameters. The reason these cards tend to use too much power is AMD pushing GCN, trying to stretch its legs into gaming territory. When they push the limits of GCN too hard, the power usage goes through the roof. Apple is likely getting binned Vega chips tuned to stay within certain limits.
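
For what it's worth, the "through the roof" behavior is roughly what you'd expect from first principles: dynamic power scales with frequency times voltage squared, and higher clocks need more voltage. A tiny illustrative sketch in Python (the clock/voltage numbers here are made up for illustration, not AMD's actual V/F curve):

# Relative dynamic power ~ f * V^2 (classic CMOS approximation).
def rel_power(freq_mhz, volts, base_freq=1200, base_volts=0.90):
    return (freq_mhz / base_freq) * (volts / base_volts) ** 2

print(rel_power(1200, 0.90))  # 1.0x at the efficient operating point
print(rel_power(1600, 1.20))  # ~2.37x the power for ~33% more clock

So chasing the last ~33% of clock can plausibly cost well over double the power, which fits the Polaris/Vega pattern being described.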
 
The only reason they are pushing their cards past their ideal limits is Pascal; they are trying to keep some semblance of performance parity.
 
So a 1080, in the same comparison, uses 43 cents in 24 hours at full tilt (or 1.8 cents per hour).
As mentioned earlier, a 300-watt Vega will use 72 cents of electricity in 24 hours at full tilt.

If you game for two hours on each, that's 6 cents for Vega, or 4 cents for the 1080.

And your point about not using full TDP for gaming is fair, which makes it even less for both cards. If we use 66% of TDP as an estimate, then it's 4 cents for Vega for two hours of gaming and 2.5 cents for the 1080.

Still wholly immaterial, and all the concern otherwise is folly.
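
Extending the same back-of-the-envelope script to a two-hour session at the 66%-of-TDP estimate used above (all numbers are this thread's assumptions, not measurements):

# Per-session electricity cost at partial TDP utilization.
RATE = 0.10  # dollars per kWh

def session_cents(tdp_watts, hours=2, utilization=0.66, rate=RATE):
    return tdp_watts * utilization / 1000 * hours * rate * 100

print(f"Vega (300 W TDP): {session_cents(300):.1f} cents")  # ~4.0 cents
print(f"GTX 1080 (180 W): {session_cents(180):.1f} cents")  # ~2.4 cents

A delta of a couple of cents per session, which is the "wholly immaterial" point.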
 