Vega performance compared to the GeForce GTX 1080 Ti and the Titan Xp looks really nice!

4th generation "Graphics Core" is separate from "Next CU"; they != GCN CUs, lol. If you don't remember AMD talking about their changes to the graphics core for Polaris, don't sit around here and try that with me.
Graphics Core is not separate from the Next, though. Yes, it is on a different line, but how would you fit it there anyway? So it IS GCN CU, not GC NCU.

And yes, AMD did state Polaris was all-new, to my amusement.
 
Graphics Core is not separate from the Next, though. Yes, it is on a different line, but how would you fit it there anyway? So it IS GCN CU, not GC NCU.

And yes, AMD did state Polaris was all-new, to my amusement.


They are separate; when we start looking at the patents AMD has been filing, it's clear how separate they are. They are not talking about the same components: one is the ALUs and the other is the communications to those ALUs.

And from our perspective they aren't much different than what they had in GCN's earlier generations in most games, but they are different from a communications point of view. This is why we see that with tessellation (actually polygon throughput) Polaris has a discernible advantage over previous GCN GPUs.

Also, I specifically remember them talking about different parts of the GCN core.

[Slide image: Radeon Technologies Group, Graphics 2016, page 5]


And when they started saying certain things were new, when in fact they were not new at all: Tonga and Fiji both had color compression, the same type Polaris had; hardware scheduling, same stuff, not new. The only new thing they added was the primitive discard accelerator, which is exactly what we see in the polygon throughput tests.

In one of Raja's presentations about Vega they talk about instruction prefetch, as well as buffer sizes that are now able to accommodate it. It might be more useful in Vega, but at this point I don't see a whole lot of changes.

Improved shader efficiency: wasn't that the same thing they are now presenting with Vega?
 
They are separate; when we start looking at the patents AMD has been filing, it's clear how separate they are.
I mean, there is no "Graphics Core" brand that I have ever heard of. There is Graphics Core Next. As such, it has to be GCN CU and not GC NCU, because the latter does not exist.

And yes, I am skeptical about Vega.
 
I know what you guys are saying, but I just don't see how or why they would put them together like that when all other slides, presentations, and patents separate them distinctly.
 
That's not the point. GCN has always been referred to as Graphics Core Next! So GCN CU is just spelled out as Graphics Core Next CU. From what I remember they are just calling it NCU now. Shaders have certainly changed, since they can also handle programmable geometry. We will find out soon, I guess. But as far as that slide goes, it's just referring to GCN CU as Graphics Core Next CU. There is not really a way to avoid that.
 
That's not the point. GCN has always been referred to as Graphics Core Next! So GCN CU is just spelled out as Graphics Core Next CU. From what I remember they are just calling it NCU now. Shaders have certainly changed, since they can also handle programmable geometry. We will find out soon, I guess. But as far as that slide goes, it's just referring to GCN CU as Graphics Core Next CU. There is not really a way to avoid that.

NO, they haven't changed; they're pretty much the same as before. And if you remember what AMD was capable of doing before, it was using its shader array for geometry calculations. The only new thing, I think, is that the primitive shaders add the capability of doing RPM with geometry calculations.

There is no need for FP32 for vertex calculations; pixel calculations are where FP32 is necessary.
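As a rough sketch of why packed FP16 (RPM) matters for throughput: each 32-bit lane can execute two 16-bit operations per cycle, doubling the theoretical rate. The shader count and clock below are hypothetical, picked only to make the arithmetic concrete:

```python
def theoretical_tflops(shaders: int, clock_ghz: float, ops_per_lane: int) -> float:
    # the factor of 2 counts a fused multiply-add as two FLOPs
    return shaders * clock_ghz * 2 * ops_per_lane / 1000

fp32 = theoretical_tflops(4096, 1.5, 1)   # one FP32 op per 32-bit lane
fp16 = theoretical_tflops(4096, 1.5, 2)   # two packed FP16 ops per lane (RPM)
print(fp32, fp16)  # 12.288 24.576
```

So a shader doing vertex work in FP16 could, in principle, run at twice the rate, which is exactly why the geometry demos lean on RPM.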

What has changed is the communication from the geometry units to the CUs. That is it. Now they have increased that flexibility even more.

If they had been showing things outside of RPM for geometry performance, I would look at it differently, but that is not the case. And when asked directly, they don't have answers, or don't want to answer those questions, because they know there are no major changes over Polaris and it will not be seen without special programming. Check the Scott Wasson interview by redtechgaming on YouTube: he was asked directly about polygon throughput performance (it was about Witcher 3) and he couldn't answer it. But now they show off demos specifically of polygon throughput with RPM? That is a telling sign, isn't it? Without primitive shaders and RPM, they don't get the performance increases. Not only that, the OpenCL leaks show it too!

And this all goes back to: 6 months of extra time isn't going to change much with the GCN architecture. I wouldn't be surprised if we see performance like Fiji @ 1500 MHz. That is a good increase in performance, but compared to what is out there already, it's just that: a ~50% increase puts it at roughly 10% above a GTX 1080. Isn't that what we see with Doom? So it all comes down to this: if Vega isn't specifically programmed for, in ways that even older GCN architectures can't take advantage of, we won't see anything special......
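The rough arithmetic behind that estimate can be checked in a few lines. The relative indices here are assumptions for illustration (normalized to Fury X = 100, with the GTX 1080 assumed ~36% faster than Fury X), not measurements:

```python
# Back-of-envelope for the Fiji-scaling argument, normalized to Fury X = 100.
fury_x = 100.0
vega_guess = fury_x * 1.50      # "Fiji @ 1500 MHz": the ~50% uplift from the post
gtx_1080 = 136.0                # assumed relative index, not a measured number
lead = vega_guess / gtx_1080 - 1
print(f"{lead:.1%}")            # ~10.3%, in line with "~10% above a GTX 1080"
```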

Now let's say Prey, which is a launch title for Vega, does have Vega-specific features: it only equalizes to what Pascal does automatically. So does the programmability of its geometry pipeline really amount to an advantage for Vega? No, it just levels the playing field, with extra work for developers. Programmable geometry units: I see them as the future, but only if they are useful, with increased performance over current fixed-function implementations.
 
Jesus Christ!!!!!

GCN, CU, FP, ABC, EFG!!!!!

In the end, all I care about is whether it provides great gameplay compared to NVidia.

I thought the original statement of "Looks really nice" was silly. Now it does not sound too bad. At least not compared to all this GCN, CU, FP, ABC, EFG talk.
 
Let's just work on launching a product without a TLB bug (Phenom), a water-cooling unit issue (Fury X), an overheating issue (Polaris), gaming issues related to Windows 10 (Ryzen, recently fixed in a driver update), or BSODs among the many issues when Bulldozer launched. Just a nice, clean launch. Right out of the gate, kicking ass! My 6700K and 980 Ti didn't have issues at launch; let's just do that, AMD!!!
 
Let's just work on launching a product without a TLB bug (Phenom), a water-cooling unit issue (Fury X), an overheating issue (Polaris), gaming issues related to Windows 10 (Ryzen, recently fixed in a driver update), or BSODs among the many issues when Bulldozer launched. Just a nice, clean launch. Right out of the gate, kicking ass! My 6700K and 980 Ti didn't have issues at launch; let's just do that, AMD!!!

I think you are blowing the Polaris and Ryzen things out of proportion. Polaris wasn't overheating; it was drawing a little more power than spec, but that was later fixed via a driver update.
 
I think you are blowing the Polaris and Ryzen things out of proportion. Polaris wasn't overheating; it was drawing a little more power than spec, but that was later fixed via a driver update.

You're right, it was a power issue. His main point still stands, though: AMD needs a clean launch, aka to showcase a quality product. They've had enough time... don't give people a reason to hate on it.

Even if they fix something later most people don't seem to hear about it or care. Already tarnished.
 
You guys arguing about what AMD calls their block names? o_O

It's still Compute Units, very much like NV's SM vs SMX name change recently.

What's a Compute Unit? Just a group of ALUs/SIMDs & registers/cache/wavefront scheduler. NV called it SMX in the past.

What's a Shader Engine? A big group of Compute Units & Geometry Engines & Back Ends. NV calls theirs Graphics Processing Clusters or GPC. Same thing.

You can call it whatever you want.

The question is how significant a change the individual components are from one generation to the other.

Why did AMD call Tahiti GCN? They took the components of TeraScale (front-end, back-end), changed the Compute Units from VLIW to SIMD, and that's why they called it Graphics CORE Next. It's the core component.
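The hierarchy described above can be sketched as a toy model. The per-block counts below are illustrative (a hypothetical 4-Shader-Engine layout with 16 CUs each), not a statement about any specific chip:

```python
from dataclasses import dataclass, field

@dataclass
class ComputeUnit:              # roughly NVIDIA's SM/SMX
    simds: int = 4              # groups of ALUs inside the CU
    lanes_per_simd: int = 16

    @property
    def shaders(self) -> int:
        return self.simds * self.lanes_per_simd   # 64 "stream processors" per CU

@dataclass
class ShaderEngine:             # roughly NVIDIA's GPC
    compute_units: list = field(
        default_factory=lambda: [ComputeUnit() for _ in range(16)]
    )

gpu = [ShaderEngine() for _ in range(4)]          # hypothetical 4-SE layout
total_shaders = sum(cu.shaders for se in gpu for cu in se.compute_units)
print(total_shaders)  # 4096
```

The names map across vendors; only the grouping and the generational changes inside each block really matter.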
 
Used in that context when I think of the phrase "really nice", that's just a polite way of saying it's bad.
 
Used in that context when I think of the phrase "really nice", that's just a polite way of saying it's bad.

I think it may lean towards better bang/buck but not grabbing the crown.

Will it be a 4850/4870 type of situation? Who knows!

I hope they have a compelling product to push this market forward.
 
Let's just work on launching a product without a TLB bug (Phenom), a water-cooling unit issue (Fury X), an overheating issue (Polaris), gaming issues related to Windows 10 (Ryzen, recently fixed in a driver update), or BSODs among the many issues when Bulldozer launched. Just a nice, clean launch. Right out of the gate, kicking ass! My 6700K and 980 Ti didn't have issues at launch; let's just do that, AMD!!!

Swap Nvidia into the Windows 10 driver rant, or Intel Sandy Bridge with its USB and SATA issues, etc., etc. You just got lucky, punk. All platforms and new things have bugs/errata; it just depends on whether they're bad enough to be a major issue or not.
 
Swap Nvidia into the Windows 10 driver rant, or Intel Sandy Bridge with its USB and SATA issues, etc., etc. You just got lucky, punk. All platforms and new things have bugs/errata; it just depends on whether they're bad enough to be a major issue or not.

I'm on Windows 10 Insider Edition on the Fast Ring and I don't have any issues with nVidia's drivers. You'd think that being on a 'beta' OS I'd have had issues by now but nary a one.
 
If Vega beats Titan Xp, that would be a dream (and I'd also lose my $100 bet but in a good way).

Someone bet you $100 that Vega would beat a Titan Xp? Where can I meet this person?


But really, let's say Vega is a flop (no faster than a 1080 for around the same cost). What should AMD do next in regards to their GPUs? Ask for a bit more R&D money off the tentative success of Ryzen?
 
Someone bet you $100 that Vega would beat a Titan Xp? Where can I meet this person?


But really, let's say Vega is a flop (no faster than a 1080 for around the same cost). What should AMD do next in regards to their GPUs? Ask for a bit more R&D money off the tentative success of Ryzen?

The risk is that if Vega flops hard (think 1070 level, like the latest leaks), AMD could just decide the GPU division isn't worth throwing any more extra money at: choose not to waste their Ryzen money on it and instead reinvest it all back into CPUs, where they've at least made strides and are showing growth.
 
The risk is that if Vega flops hard (think 1070 level, like the latest leaks), AMD could just decide the GPU division isn't worth throwing any more extra money at: choose not to waste their Ryzen money on it and instead reinvest it all back into CPUs, where they've at least made strides and are showing growth.

If you look at AMD's SEC statements about R&D, they have already channeled more or less everything into CPU at an ever-increasing pace since 2011 or so, not to mention the R&D cuts. And Raja even said he works with a way-too-tiny group and resources. This is also why we now see 14nm parts with the perf/watt of a 28nm part. Both GCN, mind you.

It seems AMD pissed away RTG in favour of CPUs. Once a CPU company, always a CPU company.
 
It makes more sense at this point to stay with CPUs; they have a much better chance of making money there vs. hemorrhaging money trying to catch Nvidia. Ryzen, Raven Ridge, and Naples are far more compelling than another extremely low-margin product from RTG. Honestly, they'd do better to stick with mid-range products and make SoCs for consoles. I don't see Vega coming anywhere close to Pascal; they'd have released information at this point if it wasn't disappointing.
 
I'm on Windows 10 Insider Edition on the Fast Ring and I don't have any issues with nVidia's drivers. You'd think that being on a 'beta' OS I'd have had issues by now but nary a one.

Both are past issues. Point still stands.
 
Both are past issues. Point still stands.

I've been on the Insider Edition since before Windows 10 was released to the public. So replace past issue with non-issue and we have a deal.
 
I've been on the Insider Edition since before Windows 10 was released to the public. So replace past issue with non-issue and we have a deal.
You should read more, then. Nvidia's driver + Win10 issues were big news upon Win10's release.
 
You should read more, then. Nvidia's driver + Win10 issues were big news upon Win10's release.

Anecdotally, AMD drivers suck too, but I've never had any problems with either company's drivers. Don't pillory one and absolve the other based on bias.
 
It makes more sense at this point to stay with CPUs; they have a much better chance of making money there vs. hemorrhaging money trying to catch Nvidia. Ryzen, Raven Ridge, and Naples are far more compelling than another extremely low-margin product from RTG. Honestly, they'd do better to stick with mid-range products and make SoCs for consoles. I don't see Vega coming anywhere close to Pascal; they'd have released information at this point if it wasn't disappointing.

They are sticking with mid-range products. Look at the leaked benches: stock 1070 performance in mid-2017 = mid-range.
 
They are sticking with mid-range products. Look at the leaked benches: stock 1070 performance in mid-2017 = mid-range.

You do realize you are going off one bench, right? Run on an early engineering sample. There are new revisions of the chip out; there is also a C3 revision that no one has seen benches from. You say that like you have the product in hand and know the true outcome.
 
You do realize you are going off one bench, right? Run on an early engineering sample. There are new revisions of the chip out; there is also a C3 revision that no one has seen benches from. You say that like you have the product in hand and know the true outcome.

Having been around the block a couple of times with the Roadmap>Rumour>Announcement>Leak>Release cycle on MANY an AMD product, you should know that leaks get pretty accurate closer to launch. I remember the rabid AMD fans declaring that the bulldozer leaks (the ones showing turd-like performance) prior to launch were 'fake' or 'just engineering tests', and what do you know: the chip was a turd.
 
Having been around the block a couple of times with the Roadmap>Rumour>Announcement>Leak>Release cycle on MANY an AMD product, you should know that leaks get pretty accurate closer to launch. I remember the rabid AMD fans declaring that the bulldozer leaks (the ones showing turd-like performance) prior to launch were 'fake' or 'just engineering tests', and what do you know: the chip was a turd.

This wasn't close to launch. This revision showed up in OpenCL a while back, and the same card shows up in the 3DMark bench: the C1 revision. We know for a fact there is a C3 revision because that is reported in OpenCL as well. So just because the same revision showed up doesn't mean it's a newly leaked card with final specs. You clearly haven't been keeping up with the revision IDs.
 
This wasn't close to launch. This revision showed up in OpenCL a while back, and the same card shows up in the 3DMark bench: the C1 revision. We know for a fact there is a C3 revision because that is reported in OpenCL as well. So just because the same revision showed up doesn't mean it's a newly leaked card with final specs. You clearly haven't been keeping up with the revision IDs.

Already the memories are flooding in.... oh the nostalgia...
 
Already the memories are flooding in.... oh the nostalgia...


Ehh, Polaris's early benchmarks were way off. No one knows for sure. If it is a mid-range chip it's certainly way too big; if it's honestly 1070+ performance at that size (500 mm²+), they're going to take a major loss on Vega, and it could mean a hell of a lot worse for RTG.
 
Anecdotally, AMD drivers suck too, but I've never had any problems with either company's drivers. Don't pillory one and absolve the other based on bias.
Man, that was a quick jump to the bias card. It doesn't change the fact that Nvidia DID have issues with Win10, nor the plethora of data supporting this statement.

As another fact, I never had the microstutter issue with CF, but that doesn't mean it didn't exist; there is also a plethora of data showing that it DID exist. Bias has nothing to do with it.
 
It did not help that Win10 kept replacing the drivers, versus the download/implementation approach Nvidia wanted to use initially, or at least for the short term until Win10 had been out for a while (I doubt we will ever know the full facts behind the whys).
Now, one may say Nvidia is at fault for not tying into the Win10 update system completely, but it possibly seriously hobbled their driver team's updates at the time. Being critical, Nvidia did not fully appreciate how Win10 would impact their driver cycle process when it launched (not sure if the issue was there at the official launch or just in Insider builds).
Cheers
 
Anecdotally, AMD drivers suck too, but I've never had any problems with either company's drivers. Don't pillory one and absolve the other based on bias.

The point is they all suck at times (IntelVidiaMD), so let's stop just bashing AMD in every fucking thread.
 
AMD drivers suck? I've had fewer (by fewer I mean no) issues with this card's drivers than I did with my GTX 780. I've always heard that meme, but it's never really rung true for me. Even with my Radeon 6950 back in the day, the Catalyst software was sluggish, but I wouldn't call it bad.
 
The point is they all suck at times (IntelVidiaMD), so let's stop just bashing AMD in every fucking thread.

It'd be nice, but it will never happen. You can't even go into the AMD section and post anything good without the same tired trolls shitting on it. I'm on this site all the damn time and go all over it, but I rarely venture into the video card forum because of it.
 
It'd be nice, but it will never happen. You can't even go into the AMD section and post anything good without the same tired trolls shitting on it. I'm on this site all the damn time and go all over it, but I rarely venture into the video card forum because of it.

You should not get discouraged from participating on the forums; in the CPU section they are already gone. The level of interaction can only improve by the likes of you showing up and doing something rather than being "scared" to post. Just don't go overboard if you are being trolled :)
 
You should not get discouraged from participating on the forums; in the CPU section they are already gone. The level of interaction can only improve by the likes of you showing up and doing something rather than being "scared" to post. Just don't go overboard if you are being trolled :)

He can also post on AT instead, where even the mods got bought by AMD (XS all over). Then he will be all safe from any reality. Only a world where Red Team and Red Team Plus rule.

Then you can talk with the others about how much better Vega is than the 1080 Ti and Titan Xp, while only sipping power, and how cheap it is. Because that never backfires! :LOL:

Oh, and here is Vega Firestrike:
http://www.3dmark.com/fs/12296284
 