Radeon RX Vega Discussion Thread

AMD's 'old and outdated' GPU technology has a more current DX12 and Vulkan implementation than NV's 'new' GPU technology.

Then why can't they run most of their games at 1080p and 60 frames per second? It's either old and outdated, or underpowered. Either way, that's why they are coming out with new ones. It's getting expensive for the customer these days.
 
AMD's 'old and outdated' GPU technology has a more current DX12 and Vulkan implementation than NV's 'new' GPU technology.


Now now there, nope, still don't have CRVs ;). Do you even know the differences between AMD and nV GPUs? nV GPUs have more features; that's why they are DX 12.1. Now if you are talking about tiers, that is different, but tier 2 is more than enough for the foreseeable future when it comes to games. AMD is ahead with tiers: they are tier 3 and nV is tier 2. The only reason games won't go beyond tier 2 for now is because of the consoles, and because if you are going to push above tier 2 limits, other parts of the GPU will probably be tasked hard enough that today's shader power isn't enough. Not sure what tier the Xbox Scorpio is, though. But CRVs are much more important in VR and the things Scorpio is promoting, so hopefully they put them into Scorpio's SOC, but most likely they won't, just because of what Scorpio is: a console that is backwards compatible.

Please don't make things up just because.
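To make the 12.1 talk concrete, here is a minimal Python sketch of what separates feature level 12_1 from 12_0 per the D3D12 docs: conservative rasterization and rasterizer ordered views (ROVs). The caps dicts are illustrative stand-ins, not a real driver query (real code would go through `ID3D12Device::CheckFeatureSupport`).

```python
# Illustrative caps tables; the tier/flag names mirror the D3D12 feature options.
FL_12_0_CAPS = {"conservative_raster_tier": 0, "rovs": False}  # e.g. pre-Vega GCN
FL_12_1_CAPS = {"conservative_raster_tier": 1, "rovs": True}   # e.g. Maxwell 2 / Skylake IGP

def meets_feature_level_12_1(caps):
    """Feature level 12_1 requires conservative rasterization (tier >= 1) and ROVs."""
    return caps["conservative_raster_tier"] >= 1 and caps["rovs"]
```

Which is the whole "AMD is 12_0, nV and Intel are 12_1" argument in two lines.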
 
Now now there, nope, still don't have CRVs ;). Do you even know the differences between AMD and nV GPUs? nV GPUs have more features; that's why they are DX 12.1. Now if you are talking about tiers, that is different, but tier 2 is more than enough for the foreseeable future when it comes to games. AMD is ahead with tiers: they are tier 3 and nV is tier 2. The only reason games won't go beyond tier 2 for now is because of the consoles, and because if you are going to push above tier 2 limits, other parts of the GPU will probably be tasked hard enough that today's shader power isn't enough. Not sure what tier the Xbox Scorpio is, though. But CRVs are much more important in VR and the things Scorpio is promoting, so hopefully they put them into Scorpio's SOC, but most likely they won't, just because of what Scorpio is: a console that is backwards compatible.

Please don't make things up just because.

lol nope ;) nV's architecture is designed around yesterday's APIs; even AMD's first GCN architecture from 5 years ago has more advanced next-gen API features built in at a hardware level. Of course, nV did lobby MS to add the .1 to DX12 so the fanboys could claim 'nV has 12.1 so it's better! herp derp'.
 
lol nope ;) nV's architecture is designed around yesterday's APIs; even AMD's first GCN architecture from 5 years ago has more advanced next-gen API features built in at a hardware level. Of course, nV did lobby MS to add the .1 to DX12 so the fanboys could claim 'nV has 12.1 so it's better! herp derp'.
Better or not, that's a fact: AMD GPUs have worse DX12 feature support than a fucking Intel GPU. Now that's a shameful display, if you ask me.
 
Better or not, that's a fact: AMD GPUs have worse DX12 feature support than a fucking Intel GPU. Now that's a shameful display, if you ask me.

I've never fucked an Intel GPU. How did that go for you?

Better in what way exactly?
 
lol nope ;) nV's architecture is designed around yesterday's APIs; even AMD's first GCN architecture from 5 years ago has more advanced next-gen API features built in at a hardware level. Of course, nV did lobby MS to add the .1 to DX12 so the fanboys could claim 'nV has 12.1 so it's better! herp derp'.


No they didn't. CRVs were implemented by Intel too; Skylake's IGP had them as well lol. It was a decision made by MS. Man, telling ya, you just don't know WTF you are talking about.
 
I've never fucked an Intel GPU. How did that go for you?

Better in what way exactly?

Feature-wise, Skylake's IGP is also DX12.1 compliant.

AMD just hasn't gone to DX12.1 because their consoles can't; introducing these types of features in their mainstream consumer GPUs would screw up things for their entire SOCs/embedded things they are working on. It would cause a lot of problems with their intrinsic shaders, and porting games to their hardware would be much harder. It's just smart business for them not to support those features at this point.

Well by those metrics, nV has worse DX12 feature support than a fucking Intel GPU.

Dude please, if you can't discuss the topic without comments that are BS, don't post at all. I will start reporting your posts from this point onward.

NO ONE here is making off the wall comments without knowing what they are talking about but YOU.
 
You guys need to discuss this in a mature fashion. You can debate and disagree all you want, but once the insults start flying people will start getting banned.

Consider this a warning to all.
 
Nope. Here...

That is not talking about feature differences (and it actually confirms nV has the same features as AMD, so quite the opposite of what you are stating). The DX API doesn't stipulate how features are done in hardware, just that they have to be there. And at this point there is no reason to see nV's latest hardware being held back by the way their GPUs schedule instructions when doing specific operations. If anything, AMD's GPUs don't have the performance (due to throughput issues) to catch up to nV's, even though they have much more shader capability (raw processing power), more bandwidth and, since you like that video so much, HARDWARE scheduling.
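The "raw power vs. throughput" point boils down to simple arithmetic. A toy sketch, where the peak TFLOPs are ballpark marketing-style numbers and the utilization figures are invented purely for illustration:

```python
def effective_tflops(peak_tflops, utilization):
    """Sustained throughput = peak rate times how much of the chip you keep busy."""
    return peak_tflops * utilization

# Hypothetical: a wider GPU with poor utilization vs. a narrower one with high utilization.
wide_gpu = effective_tflops(8.6, 0.55)    # more raw shader power, assumed low occupancy
narrow_gpu = effective_tflops(5.6, 0.85)  # less raw power, assumed high occupancy
```

With those (made-up) utilization numbers the narrower chip ends up ahead, which is the claim being made about nV vs. AMD above.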

And that video, although pretty good, isn't entirely correct. If you go back and look at the posts on async compute you will see where those incorrect portions are, and I just highlighted above where he got things wrong too.
 
Why the fuck are you guys arguing about feature differences? Where has this shit mattered in the last year? Seriously, what exactly does anyone gain from these fanboy pissing wars? The Vega thread is turning into DX12 vs DX12.1; if you guys want to argue that, make another thread. Jeez!
 
Why the fuck are you guys arguing about feature differences? Where has this shit mattered in the last year? Seriously, what exactly does anyone gain from these fanboy pissing wars? The Vega thread is turning into DX12 vs DX12.1; if you guys want to argue that, make another thread. Jeez!

Ask Peppercorn, he is the one that thinks it makes a difference lol.

CRVs are extremely important if you want faster performance in applications, and they will be the future. Now, Vega most likely will not have these (not saying this is bad, as I will explain later). The performance CRVs bring to the table when used for GI is in the neighborhood of 30%, easy. And we know lighting is probably the most compute-intensive task that's around. Now, I only know of two engines that use CRVs, and both of them aren't that widely used, so I don't think it's a big deal in the near future; down the road it will be though, Navi time I would say.

And this goes both ways: Tier 3 resource binding, something that nV lacks at the moment, doesn't matter, as devs aren't even getting near the limits of Tier 2 resource binding in the near future either.
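For reference, a rough Python sketch of the D3D12 resource-binding tier limits the tier talk is about ("full" meaning the full descriptor heap). The numbers follow the D3D12 documentation from memory, so treat them as approximate:

```python
# Approximate D3D12 resource binding tier limits (simplified to SRV/CBV/UAV counts).
BINDING_TIER_LIMITS = {
    1: {"srvs_per_stage": 128,    "cbvs_per_stage": 14,     "uavs_all_stages": 64},
    2: {"srvs_per_stage": "full", "cbvs_per_stage": 14,     "uavs_all_stages": 64},
    3: {"srvs_per_stage": "full", "cbvs_per_stage": "full", "uavs_all_stages": "full"},
}

def fits_tier(tier, srvs, uavs):
    """Would a shader binding this many SRVs/UAVs fit within a given tier's limits?"""
    limits = BINDING_TIER_LIMITS[tier]
    srv_ok = limits["srvs_per_stage"] == "full" or srvs <= limits["srvs_per_stage"]
    uav_ok = limits["uavs_all_stages"] == "full" or uavs <= limits["uavs_all_stages"]
    return srv_ok and uav_ok
```

A hypothetical shader binding 500 SRVs and 32 UAVs already needs Tier 2, but is nowhere near Tier 2's ceiling, which is the point being made about Tier 3 not mattering yet.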
 
I think Vulkan will likely see further support, which will make little difference in what level of DX 12 support each manufacturer has. The only thing Nvidia should be fixing is their crappy DX 12 drivers; they did great DX 11 drivers, but so far in DX 12 they have not done very well. AMD only needs to worry about getting Vega out the damn door. That is reality as it stands right now.
 
Looks like Lisa has gotten Koduri to shut up and RTG is on lockdown about Vega ROFL. Probably like "shut the fuck up and work 24/7 and only talk about Vega when you are ready to launch this shit" hahaha
 
I think Vulkan will likely see further support, which will make little difference in what level of DX 12 support each manufacturer has. The only thing Nvidia should be fixing is their crappy DX 12 drivers; they did great DX 11 drivers, but so far in DX 12 they have not done very well. AMD only needs to worry about getting Vega out the damn door. That is reality as it stands right now.
DX12 is a mess and far from any consistency, especially when it involves older engines that are hacked to be DX12, and this is where Nvidia has had the most issues.
Nvidia is fine with some, such as AoTS/Sniper Elite 4/Gears of War 4, but fails with ones such as Warhammer/Deus Ex Mankind (depends upon the review and scene)/Hitman (depends upon the review and scene)/etc.

But Vulkan could be interesting, although Microsoft has sort of pushed it down for AAA studios with Scorpio. Let's see if AMD can capitalise on the in-built DX12 for Scorpio on the PC platform in some way down the line, which would put pressure on Nvidia, but there's no guarantee AMD can capitalise on this.

Cheers
 
That is not talking about feature differences (and it actually confirms nV has the same features as AMD, so quite the opposite of what you are stating). The DX API doesn't stipulate how features are done in hardware, just that they have to be there. And at this point there is no reason to see nV's latest hardware being held back by the way their GPUs schedule instructions when doing specific operations. If anything, AMD's GPUs don't have the performance (due to throughput issues) to catch up to nV's, even though they have much more shader capability (raw processing power), more bandwidth and, since you like that video so much, HARDWARE scheduling.

And that video, although pretty good, isn't entirely correct. If you go back and look at the posts on async compute you will see where those incorrect portions are, and I just highlighted above where he got things wrong too.

Who was talking about features besides you and lolfail? I said AMD has a better hardware implementation for next-gen APIs, and you changed the subject to hardware features because you can't win that debate, hence your obvious bait and switch. Then, I posted a video that further proves my point.
 
Who was talking about features besides you and lolfail? I said AMD has a better hardware implementation for next-gen APIs, and you changed the subject to hardware features because you can't win that debate, hence your obvious bait and switch. Then, I posted a video that further proves my point.

AMD doesn't have a better hardware implementation. That myth got busted long ago.

Not sure what the console parts have got to do with Vega either, since it's not confirmed in any way that they use Vega parts. What we do know, though, is it's a 44CU chip with 40CUs active due to yield.

Vega had better be delivering the DX12.1 feature set as well.
 
AMD doesn't have a better hardware implementation. That myth got busted long ago.

Not sure what the console parts have got to do with Vega either, since it's not confirmed in any way that they use Vega parts. What we do know, though, is it's a 44CU chip with 40CUs active due to yield.

Vega had better be delivering the DX12.1 feature set as well.

Yes they do, and no it wasn't. nV had better be delivering a proper hardware scheduler and async compute support as well.

And obviously Microsoft knows what Scorpio is going to need in regards to DX12 better than you or anyone else on these boards.

And it's also safe to say that whatever is making Scorpio a beast will have its DNA in what AMD is bringing to market.
 
Who was talking about features besides you and lolfail? I said AMD has a better hardware implementation for next-gen APIs, and you changed the subject to hardware features because you can't win that debate, hence your obvious bait and switch. Then, I posted a video that further proves my point.

Dude, you stated it:

AMD's 'old and outdated' GPU technology has a more current DX12 and Vulkan implementation than NV's 'new' GPU technology.

No, AMD is behind: they have the older DX12 implementation and nV has the 12.1 implementation, which came out after. Sorry, simple to see that now?
 
I think Vulkan will likely see further support, which will make little difference in what level of DX 12 support each manufacturer has. The only thing Nvidia should be fixing is their crappy DX 12 drivers; they did great DX 11 drivers, but so far in DX 12 they have not done very well. AMD only needs to worry about getting Vega out the damn door. That is reality as it stands right now.


Nothing wrong with nV's DX12 drivers (a bug in a specific title, yeah, that's it). And yeah, please don't point to AdoredTV videos; I already talked to him @ B3D, and no one there thinks he is on to something, quite the opposite.

This is a problem: you guys are flaky with the information you have, and point to obvious flakes when it comes to these things, and this place becomes a mess.
 
Yes they do, and no it wasn't. nV had better be delivering a proper hardware scheduler and async compute support as well.

And obviously Microsoft knows what Scorpio is going to need in regards to DX12 better than you or anyone else on these boards.

And it's also safe to say that whatever is making Scorpio a beast will have its DNA in what AMD is bringing to market.


Dude, Scorpio is not a beast, it's today's mid-range GPU with a really crappy CPU lol. 6 TFlops, man, does that ring a bell? It's an RX 480 with more clocks; well, the RX 580 has more lol. It is going to have certain features for 4K rendering that will not be in AMD products (workload and automation done on the Scorpio GPU). I don't think those will be introduced into AMD desktop GPUs because it's MS that did the work on that implementation, not AMD.

Async compute nV has no issues with; we have seen games work well with nV's implementation as well, so it's all developer controlled, and we have seen AMD-sponsored titles screw up on AMD hardware too! So don't make things up; a hardware scheduler is not a 100% need. Not only that, nV has experience with hardware schedulers; if you followed the video, or if you looked at my async post from, oh, a year and a half ago, you would know this too lol. I can link the benchmarks if you want, but I think you have the memory to remember which ones those are. If you want me to link my posts from over a year and a half ago, I will do that too.

We KNOW what the differences are, but we also KNOW the implementation doesn't matter. I don't give a shit if nV has to do extra work in their drivers, cause at the end of the day it doesn't affect me as an end user, as long as they are doing the work, and they have been doing the work; we see that in most DX12 games.

You don't even have an nV card, and you are butt hurt over this, right? Why is that? Because of what the 75% of the market must know? Shit yeah, they already know what is real and what isn't; that is why nV has 75% of the market.

At the end of the day, AMD's base performance before hardware scheduling and async compute sucks soooo much that it doesn't matter what devs do, they can barely equalize performance with their nV counterparts! Good for you, you have hardware scheduling; oh, bad for you, you are still slower lol.
 
Looks like Lisa has gotten Koduri to shut up and RTG is on lockdown about Vega ROFL. Probably like "shut the fuck up and work 24/7 and only talk about Vega when you are ready to launch this shit" hahaha


I think Koduri has a lot more freedom than previous heads of the graphics units have had.
 
That is not talking about feature differences (and it actually confirms nV has the same features as AMD, so quite the opposite of what you are stating). The DX API doesn't stipulate how features are done in hardware, just that they have to be there. And at this point there is no reason to see nV's latest hardware being held back by the way their GPUs schedule instructions when doing specific operations. If anything, AMD's GPUs don't have the performance (due to throughput issues) to catch up to nV's, even though they have much more shader capability (raw processing power), more bandwidth and, since you like that video so much, HARDWARE scheduling.

And that video, although pretty good, isn't entirely correct. If you go back and look at the posts on async compute you will see where those incorrect portions are, and I just highlighted above where he got things wrong too.


I was looking for this to point out to you guys why that video is wrong; a member at B3D posted it too:

https://github.com/NervanaSystems/maxas/wiki/Control-Codes


Damn, now how the hell does the CPU have anything to do with instruction scheduling after the shader is compiled? And before anyone answers: if they actually understood what this link is about, they would know the CPU isn't involved in that once the shader is compiled! Even with Kepler! So this is why the hardware scheduling portion of that video is just incorrect.

Do I blame you guys for not understanding where the video went wrong? No. You guys are on the outside looking in and trying to figure things out; being inquisitive is a good thing, but making blanket statements on wrong information is not good.


Just think about this, guys: the CCX latency in Ryzen, that causes how much performance loss? Now, what do you think the latency is going across the PCIe bus, so the CPU can feed the GPU the instructions...... Just gander at how much more latency that is over the CCX L3 communication...... It's in the neighborhood of a 100-fold difference, guys!
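A back-of-envelope version of that latency comparison. Both numbers are rough, assumed figures (not measurements), picked to show the scale of the gap the post describes:

```python
ccx_l3_hop_ns = 100          # assumed cross-CCX L3 latency on Ryzen, in nanoseconds
pcie_round_trip_ns = 10_000  # assumed CPU-to-GPU round trip over PCIe, in nanoseconds

ratio = pcie_round_trip_ns / ccx_l3_hop_ns
print(f"PCIe round trip is roughly {ratio:.0f}x the cross-CCX hop")
```

With those assumed inputs the ratio lands at the "100 fold" scale the post is talking about; the exact figure depends entirely on the numbers you plug in.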

YouTube is akin to TV, and why TV was called the boob tube. If you don't have a basic understanding of what you're watching, it's just going to rot your brain.
 
Damn, now how the hell does the CPU have anything to do with instruction scheduling after the shader is compiled? And before anyone answers: if they actually understood what this link is about, they would know the CPU isn't involved in that once the shader is compiled! Even with Kepler! So this is why the hardware scheduling portion of that video is just incorrect.
You are actually correct on that, but from what I understand of it, it does not actually affect the point the YouTube guy tried to make: the nV driver uses whatever daemon magic it uses for multithreading their driver, while AMD's just acts as a direct bridge. I'll agree that instruction scheduling has nothing to do with it, though.
 
You are actually correct on that, but from what I understand of it, it does not actually affect the point the YouTube guy tried to make: the nV driver uses whatever daemon magic it uses for multithreading their driver, while AMD's just acts as a direct bridge. I'll agree that instruction scheduling has nothing to do with it, though.


Yeah I agree, some parts of the video are correct; that's why it's a good video, it's great for a start.
 
Nice quintuple post, jackass. Learn how to use the edit function if you're going to continue these rants.

I don't know why every Vega thread has to turn into a shitfest.


Some posts were removed lol, so if you didn't see the thread before, you wouldn't know what was going on.

It turns into a shitfest because people make things up about AMD and nV all the time without knowing WTF they are talking about. So please elaborate on what you are thinking and then go from there.
 
DX12 is a mess and far from any consistency, especially when it involves older engines that are hacked to be DX12, and this is where Nvidia has had the most issues.
Nvidia is fine with some, such as AoTS/Sniper Elite 4/Gears of War 4, but fails with ones such as Warhammer/Deus Ex Mankind (depends upon the review and scene)/Hitman (depends upon the review and scene)/etc.

But Vulkan could be interesting, although Microsoft has sort of pushed it down for AAA studios with Scorpio. Let's see if AMD can capitalise on the in-built DX12 for Scorpio on the PC platform in some way down the line, which would put pressure on Nvidia, but there's no guarantee AMD can capitalise on this.

Cheers

I agree with ya, but far too often Nvidia is lagging on performance in DX12. They have the hardware to run better than they are right now, but I think Vulkan might be the future for both. But yeah, AMD really has not made their control of consoles pay off either. Far too often AMD talks about how great it will be, then pulls up short in reality. It will be interesting to see how Vega does, as my 290X is starting to show its age now.
 
Why are people thinking that Vega is in Scorpio? At only 6 TFlops that would make Vega the biggest fail in AMD history. Even bigger than Bulldozer.
 
Why are people thinking that Vega is in Scorpio? At only 6 TFlops that would make Vega the biggest fail in AMD history. Even bigger than Bulldozer.

Isn't Scorpio an AMD APU? And isn't the Raven Ridge rumored to be used in Scorpio an iteration or cut-down version of Vega?
 
Isn't Scorpio an AMD APU? And isn't the Raven Ridge rumored to be used in Scorpio an iteration or cut-down version of Vega?


No, it's not. MS made changes to the Polaris GCN architecture and created Scorpio using TSMC 16nm. CPU-wise it's using Jaguar cores, just clocked much higher than the Xbone.

Why are people thinking that Vega is in Scorpio? At only 6 TFlops that would make Vega the biggest fail in AMD history. Even bigger than Bulldozer.

It's a console; they need to save money. Even if it was based on Vega, they wouldn't use the biggest, baddest Vega chip out there in their consoles, it would be a heavily cut-down Vega. Half the amount of CUs? That would put it at 6 TFlops, so either way it doesn't matter for this thread.

MS is not using Vega IP for Xbox Scorpio, nor is it using Ryzen. Does this mean anything for Vega? Not really. It means either Vega IP wasn't finalized in time for Scorpio, or MS went a different route, making their own SOC based on IP they already got from AMD, which made it easier for them to be backwards compatible.
 
Nothing wrong with nV's DX12 drivers (a bug in a specific title, yeah, that's it). And yeah, please don't point to AdoredTV videos; I already talked to him @ B3D, and no one there thinks he is on to something, quite the opposite.

This is a problem: you guys are flaky with the information you have, and point to obvious flakes when it comes to these things, and this place becomes a mess.
So all the links to his videos discussing why a difference existed were about an issue that didn't exist?

Damn, now how the hell does the CPU have anything to do with instruction scheduling after the shader is compiled? And before anyone answers: if they actually understood what this link is about, they would know the CPU isn't involved in that once the shader is compiled! Even with Kepler! So this is why the hardware scheduling portion of that video is just incorrect.
They also wouldn't confuse instruction scheduling with higher level wave and queue scheduling which is the concern.

Just think about this, guys: the CCX latency in Ryzen, that causes how much performance loss? Now, what do you think the latency is going across the PCIe bus, so the CPU can feed the GPU the instructions......
So AMD's software development is superior to Nvidia's, and they were able to avoid the latency enough that adding 4 cores wasn't detrimental to performance with DX12? So sending commands to two devices at a higher rate provided less of a load than a single device?

MS is not using Vega IP for xbox Scorpio
You've seen confirmation that Vega's packed math isn't in Scorpio like PS4 Pro? Along with potential changes to the command processor, geometry pipeline, etc for the bulk of the technical specs we haven't seen yet?
 
So all the links to his videos discussing why a difference existed were about an issue that didn't exist?

The problem might not even be on nV's driver side! We don't know where the hell the problem lies! BIOS changes on Ryzen have also fixed some things in the specific games he tested. It's a damn mess. Much more testing needs to be done to see what the hell is going wrong.

They also wouldn't confuse instruction scheduling with higher level wave and queue scheduling which is the concern.
True

So AMDs software development is superior to Nvidia's and they were able to avoid the latency enough that adding 4 cores wasn't detrimental to performance with DX12? So sending commands to two devices at a higher rate provided less of a load than a single device?

That isn't what I said, was it? We don't know where the problem resides, so to make that assumption is folly.


You've seen confirmation that Vega's packed math isn't in Scorpio like PS4 Pro? Along with potential changes to the command processor, geometry pipeline, etc for the bulk of the technical specs we haven't seen yet?

I would ask a developer you know that has access to Scorpio info and would be willing to share it, cause I can't talk much more about it ;)
 