Rys from RTG says Vega FE has no gimped gaming drivers compared to RX Vega; the driver is just older.
https://forum.beyond3d.com/posts/1989522/.
Don't think he means that; he just means that by the time RX Vega is released, this current driver will be older.

LOL, so now AMD is saying newer drivers will make the difference for Vega. Yeah, that is what happens when you get your info from AMD. *sarcasm* Newer drivers have always panned out in shifting a card up a bracket.
They aren't going to take 1070-to-1080 performance levels all the way up to 1080 Ti levels in games with drivers; not going to happen.
Let's say they can solidly get to the 1080, and to somewhere around 10% above it, with RX Vega. It still won't be good unless they price it at around $450, because it consumes so much more power it's crazy: 120 watts more is a huge difference. Then, when we start looking at Fiji, Fiji doesn't look as bad; at least it kept up and matched the 980 Ti, the second-best card, while using only 25 watts more.
These are not test drivers, man; these are release drivers. They would have disabled all testing code in them.
I'm fully aware of how debugging code can slow down a driver or application; this is not that.
If they still had debugging code in the drivers, they shouldn't even release the product, because 20% is a small amount; it would be more in the neighborhood of 50%. Drivers, unlike applications, rely on CPU timings and cycles; debug code doubles that for drivers, because the code looks for validation, doubling the cycles.
What you are saying is that the drivers are not even in beta form, that they are in debug mode. No, they are not.
Have you ever seen drivers from AMD ship with debug code (released on their website or released with their cards)? I have never seen that, even with ATi or nV, so you think they are now going to ship debug code in drivers?
Keep this in mind: the same excuses came up for Vega and Doom, claiming Doom was in debug mode. Not going to fly; Vulkan in debug mode shows no difference in performance, because Vulkan was made that way. You can't treat applications and drivers as the same thing when talking about debug modes.
No, these are drivers for Vega FE, not RX Vega, and their purpose is NOT gaming. AMD representatives were quoted last week saying they aren't showing game drivers because those aren't ready.
There's a difference between debug mode, a release in gimp mode/diagnostic debug mode, and a full release with all optimizations.
And how do you know they aren't in beta form? How do you know they aren't in gimp mode for stability purposes? Again, have you reverse-engineered the code? Talked to an AMD rep? Worked as one of their engineers?
So you think Scott (AMD product manager) telling Rys (Hexus editor and Beyond3D owner, now looking like he is spending too much time with Scott) that there is no gimping going on in the drivers is BS? I will say this: what is BS is the second part of that statement, that RX Vega drivers will make the difference. They won't; 10% maybe, that is it. That still puts RX Vega at 1080 performance levels at 300 watts, which is still not good.
If he had to go to Scott to get that info, which god knows why, since it is common sense, then this card is in deep trouble.
AMD confirmed the drivers aren't gimped and the card is marketed for gaming and compute tasks.
Alright, so we're betting that Vega won't improve by more than 10% from now, on what specific metric? Time Spy and Fire Strike?
We'll record that and come back after the RX Vega launch, then again three months after that, as I bet Raja's driver team will still be hip-deep in figuring out how to get this thing in gear.
It'll be a gentleman's bet.
But to clarify, do we include higher clocks if superior cooling allows it? Or are we just talking drivers at, say, 1600 MHz?
And we talk about liquid that is always cooler than what it cools. Context, bro.

The card itself. That's what's measured by Wattman.
But to clarify, do we include higher clocks if superior cooling allows it? Or are we just talking drivers at, say, 1600 MHz?

We are talking drivers at a fixed clock (i.e. cooling shenanigans excluded). Basically, take Vega FE now and Vega FE a few months later with the same set fan speed/power limit and go at it.
This is Raja claiming there is no developer intervention required to make use of the "higher throughput per clock cycle" in terms of geometry. A fascinating claim, really, considering that on the Vega FE spec sheet the geometry throughput is literally identical to Fiji's and Polaris'.

How is that fascinating? Does the feature not being enabled in the drivers mean it can't work transparently?
That's my thought exactly. You have a technically superior Vega card, essentially running Fiji drivers, with none of the new performance and energy-saving features enabled, beating the 1080. Easily half, if not more, of the performance is still on the table. This isn't even the fastest variant, and we still need to wait for game optimizations to land. At least with FE shipped, more devs can start getting those in order.

I won't lie, I'm highly amused by this launch, if only because of the sheer number of posters who will have to eat crow after over-hyping this damned thing to Proxima Centauri and back. It is disappointing to see AMD coming up with such a lackluster product after more than a year of waiting, and frankly this whole launch is a shambles.
Rys from RTG says Vega FE has no gimped gaming drivers compared to RX Vega; the driver is just older.
https://forum.beyond3d.com/posts/1989522/.

Gimped and not including incomplete features are two distinct things. The gaming and pro drivers perform nearly identically. Many of Vega's features require driver support. There is plenty of research out there showing the potential gains, and they would be rather large when enabled. Couple that with Vega being technically superior to even Pascal in DX12 features.
Sometimes I really can't tell if he's being sarcastic or not. He asks what Volta can bring to the table to match Vega.

You are making other AMD fans and trolls look bad with garbage like this.
I think there is definitely a lot of truth to what you are saying. As you may recall, a few years ago 3dfx was pushing the same type of ideas: "cinematic" special effects, etc. Yet they couldn't get their cards out the door. While they were squandering their opportunity for something revolutionary, Nvidia was stealing their business.
It seems to me that if you are going to attempt something revolutionary, you better at least have all the basics down and be able to execute.
And this is where Nvidia failed. The latest round of their cards were not released on time and the basics of what they offer (DX7 and DX8 support) don't quite measure up to what ATi offers.
I don't think the problem is that developer tools are lagging (that's always the case). I think it is (as you said) that they just got way ahead of themselves. Between that and the borked conversion to 0.13 micron, they kind of forgot about the here and now. (Maybe they didn't forget... Maybe it was truly due to the slow, successful conversion to 0.13 micron.)
That's how I see it... Maybe there is another twist to this...
At this point, why is the sky falling because Vega doesn't look to be competitive with the 1080 Ti? We knew this.
Some hoped otherwise, but hardly anybody was actually suggesting it that I saw.
So we are getting what we expected performance-wise. Nobody expected this to be a Titan Xp killer or even a direct 1080 Ti rival.
There's nothing deflating about this at all, yet.
Whether this card is a success depends solely on price. If it's between 1080 and 1080 Ti speeds, it should be about $600, and AMD will find a market. The benefits of FreeSync are real (as are the benefits of G-Sync), and that's a draw to many with a FreeSync monitor, even if Nvidia has the faster top card.
It's the power consumption. If it were at 220 or 230 watts, it wouldn't be a big deal, but go over 250, and at 300, that difference is something that can't be ignored. And that is going to force its price down.
Yeah. It's the power draw that turned me away. It's bad. Real bad.
Unless you are mining, 300 watts vs. 250 watts of power usage is completely immaterial. I don't care for one moment if my card draws 50 watts more for gaming. I only game a couple of hours a day at most. That equates to something like 5 cents a night in power usage. I'm not going to choose my card based on which one costs me 5 cents more per gaming session.
----------------
Seriously, 300 watts at 10 cents per kWh (a reasonable national average in the U.S.) costs 72 cents a day running 24 hours a day. So that's 3 cents per hour.
The 1080 Ti uses 250 watts at full tilt, which is 60 cents a day if running 24 hours a day. So that's 2.5 cents an hour.
So stop it already with the massive power use bull crap. Nobody with any sense cares.
The only way that matters is if you have a load of them running 24/7 mining. To the average person, whose computer is idle 95% of the time, it's completely moot.
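For anyone who wants to check the arithmetic above, here's a quick sketch of the electricity-cost math (assuming the quoted 300 W / 250 W draws and the $0.10/kWh rate; the function name is just for illustration):

```python
def daily_cost_usd(watts, hours_per_day=24.0, usd_per_kwh=0.10):
    """Electricity cost per day for a constant power draw."""
    kwh = watts / 1000.0 * hours_per_day  # energy used in kilowatt-hours
    return kwh * usd_per_kwh

vega_day = daily_cost_usd(300)  # 7.2 kWh -> $0.72/day, i.e. 3 cents/hour
ti_day = daily_cost_usd(250)    # 6.0 kWh -> $0.60/day, i.e. 2.5 cents/hour
print(f"Vega FE: ${vega_day:.2f}/day, 1080 Ti: ${ti_day:.2f}/day, "
      f"delta: ${vega_day - ti_day:.2f}/day")
```

At a couple of hours of gaming a day, the same function puts the 50 W delta at about a cent per session, which is the poster's point.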
First of all, you're comparing it to a 1080 Ti, which is a much more powerful video card. You should be comparing it to a 1080, which has a draw of 180 W, significantly less. Not only that, but AMD desperately needs to make inroads into the notebook gaming market, and now it seems they will once again be completely shut out from the top end of the market, where margins are quite lucrative. NVIDIA is simply dominating with their Max-Q designs thanks to their low power draw.

That is partially true. The truth is that Polaris, and likely Vega, can be really efficient if kept within certain parameters of clocks and voltage. The reason these tend to use too much power is AMD pushing GCN, trying to stretch its legs into gaming. When they push the limits of GCN too hard, the power usage goes through the roof. Apple is likely getting binned Vega chips tuned to stay within certain limits.
https://www.nvidia.com/en-us/geforce/products/10series/laptops/max-q/