Vega Rumors

Wow, the only difference in the shader array from an instruction point of view is FP16! So this is the "major" change from GCN CU to NCU... They'd better do something else in Navi then.

Interesting they never mentioned anything about primitive shaders. What I'm thinking is it's just using the shader array; the geometry units aren't programmable at all, they need the shader units to feed them information. Pretty much the same thing as before, just a new name.

Armenius was correct!
Wait, what?
Did you read any of the articles?

TR wrecked the 7900X and held its own in 1080p gaming...
A 16-core/32-thread CPU is beating a 10-core/20-thread CPU in multithreaded tasks... Be still my beating heart.
 
Wait, what?

A 16-core/32-thread CPU is beating a 10-core/20-thread CPU in multithreaded tasks... Be still my beating heart.

Yeah, but it's the same price. I quite like this new HEDT generation: Intel's cache rework for datacenter-oriented tasks, Zen showing really strong performance in independent multithreading. It feels like HEDT is back to being what it should be. I only bought a 5820K because it offered far more value than Kaby Lake or Skylake systems.

Hopefully Coffee Lake will offer very strong 6c/12t performers and HEDT will continue to be more HPC- and datacenter-oriented, as it should be.

Funnily enough, Skylake-E's regression in gaming benchmarks puts it closer to Ryzen performance, so it ends up working out quite well lol.

I am curious to see how advantageous Skylake's bigger L2 will be in datacenter-oriented benchmarks. TR is offering impressive value imo; I would be hard pressed to consider the 7900X over it at the same price.

 
Wait, what?

A 16-core/32-thread CPU is beating a 10-core/20-thread CPU in multithreaded tasks... Be still my beating heart.

A 16-core/32-thread CPU is destroying a 10-core/20-thread CPU in multithreaded tasks, is within 5 to 10% in single-threaded tasks, and is the same price (cheaper when platform costs are taken into account).

fixed for you.
 
A 16-core/32-thread CPU is destroying a 10-core/20-thread CPU in multithreaded tasks, is within 5 to 10% in single-threaded tasks, and is the same price (cheaper when platform costs are taken into account).

fixed for you.
That is really the big deal here. Ryzen performance may not be where people want it in single-threaded tasks, but Threadripper is a bargain for people who have those types of workloads and needs. I am almost due for a CPU upgrade myself and am particularly excited to see how well a 6-core i5 performs; if it ends up being backwards compatible with z100 boards, it's an instant upgrade for me. But that is way off topic here. Now if only AMD could be as competitive in the GPU realm.
 
A 16-core/32-thread CPU is destroying a 10-core/20-thread CPU in multithreaded tasks, is within 5 to 10% in single-threaded tasks, and is the same price (cheaper when platform costs are taken into account).

fixed for you.
I don't know what you're fixing. I never alluded to single core performance or price.
 
I'm not sure why there's hate for Threadripper. It's a legitimately great deal. Price/performance competitive, tons of PCIe lanes, modern chipset support. I'm not very excited for Vega but Ryzen and Threadripper have been a pleasant surprise.
 
I'm not sure why there's hate for Threadripper. It's a legitimately great deal. Price/performance competitive, tons of PCIe lanes, modern chipset support. I'm not very excited for Vega but Ryzen and Threadripper have been a pleasant surprise.

I didn't see any hate? Fantastic productivity, great for gaming except at high refresh rates. It does have good value.
 
I didn't see any hate? Fantastic productivity, great for gaming except at high refresh rates. It does have good value.
Just a handful of posts over the past page or two.

It's a shame they can't seem to execute well on both fronts at once. But I kind of want a 1950X just because.
 
The MSI 1080 is an AIB OC version and it competes nicely?


Someone pointed out elsewhere, regarding the MSI Gaming X used:

The stock 1080 is 1607/1733MHz base/boost, 1708/1847MHz in OC mode.
That 1080 Gaming X is overclocked @ 1924MHz, almost a 20% overclock for a 1080.

Stock 1080s come in around 19K-20K:

http://www.3dmark.com/search#/?mode...Core i7-6700K&gpuName=NVIDIA GeForce GTX 1080

ex: http://www.3dmark.com/fs/8912050 (19650 on a 6700K @ 2114MHz)

https://www.techpowerup.com/222239/nvidia-geforce-gtx-1080-put-through-3dmark
 
Razor have you seen this?
http://developer.amd.com/wordpress/media/2013/12/Vega_Shader_ISA_28July2017.pdf

Related to this interesting little snippet
In essence the ACE on GCN 1.0 to 4.0 has 4 Pipelines, each is 128-Bit Wide. This means it processes 64-Bit on the Rising Edge, and 64-Bit on the Falling Edge of a Clock Cycle. Now each CU (64 Stream Processors) is actually 16 SIMD (Single Instruction, Multiple Data / Arithmetic Logic Units) each SIMD Supports a Single 128-Bit Vector (4x 32-Bit Components, i.e. [X, Y, Z, W]) and because you can process each individual Component ... this is why it's denoted as 64 "Stream" Processors, because 4x16 = 64.

As I note, the ACE has 4 Pipelines that Process, 4x128-Bit Threads Per Clock. The Minimum Operation Time is 4 Clocks ... as such 4x4 = 16x 128-Bit Asynchronous Operations Per Clock (or 64x 32-Bit Operations Per Clock)

GCN 5.0 still has the same 4 Pipelines, but each is now 256-Bit Wide. This means it processes 128-Bit on the Rising Edge, and 128-Bit on the Falling Edge. Each CU is also now 16 SIMD that support a Single 256-Bit Vector or Double 128-Bit Vector or Quad 64-Bit Vector (4x 64-Bit, 8x 32-Bit, 16x 16-Bit).
reddit.com/r/Amd/comments/6rm3vy/vega_is_better_than_you_think/
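The arithmetic in that snippet can be sanity-checked with a quick sketch. To be clear, this just reproduces the post's own model (4 pipelines, a 4-clock minimum operation time, double-pumped issue), not verified hardware behavior, and the function name is mine:

```python
# Reproduces the arithmetic in the quoted snippet -- the post's model,
# not a claim about actual GCN hardware.

def component_ops_per_clock(pipelines, vector_bits, component_bits,
                            min_op_clocks=4):
    """Vector ops in flight, times components per vector."""
    vectors_in_flight = pipelines * min_op_clocks   # e.g. 4 x 4 = 16
    components = vector_bits // component_bits      # e.g. 128 / 32 = 4
    return vectors_in_flight * components

# GCN 1.0-4.0: 4 pipelines, 128-bit vectors of 32-bit components
print(component_ops_per_clock(4, 128, 32))   # 64 -> the "64 stream processors"

# GCN 5.0 per the snippet: 256-bit wide, so packed 16-bit doubles throughput
print(component_ops_per_clock(4, 256, 32))   # 128
print(component_ops_per_clock(4, 256, 16))   # 256
```

If the snippet's model holds, the widening alone would double 32-bit throughput per CU and quadruple packed 16-bit throughput, which is exactly why the numbers look too good to take at face value.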

256.jpg


Is it the only major change?
 
The stock 1080 FE boosts and maintains mid-1800s on its own due to Boost 3.0. The Gaming X is nowhere near a 20% OC out of the box. 98% of cards will do around 2000MHz undervolted.


[H]'s own review shows the FE barely maintains anything over 1700MHz... never mind 1800MHz/2GHz.
https://www.hardocp.com/article/2016/05/17/nvidia_geforce_gtx_1080_founders_edition_review/5

"The clock speed seems to average out to around 1770MHz (1.77GHz) according to the table. If you look at the graph the frequency likes to hover between 1760MHz-1785MHz most of the time. The good news is that this is mostly all above the quoted GPU Boost clock of 1734MHz. It is at least 30MHz higher, and sometimes more. However, you will note that there are many times the clock speed does dip below 1734MHz."
 
Razor have you seen this?
http://developer.amd.com/wordpress/media/2013/12/Vega_Shader_ISA_28July2017.pdf

Related to this interesting little snippet

reddit.com/r/Amd/comments/6rm3vy/vega_is_better_than_you_think/

View attachment 33002

Is it the only major change?
That's a ton of speculation and theory. I'd love to see AMD release drivers that boost performance 20-30% over what we are seeing from leaks, but when has that ever happened historically so close to release?
 
That's a ton of speculation and theory. I'd love to see AMD release drivers that boost performance 20-30% over what we are seeing from leaks, but when has that ever happened historically so close to release?
Never. Ever. Not once.
 
Razor have you seen this?
http://developer.amd.com/wordpress/media/2013/12/Vega_Shader_ISA_28July2017.pdf

Related to this interesting little snippet

reddit.com/r/Amd/comments/6rm3vy/vega_is_better_than_you_think/

View attachment 33002

Is it the only major change?


I was only talking about the shader array and instruction set. Yeah, the back end changed, but the performance gain from the ROPs communicating directly through the L2 will come from saved bandwidth, and that will only show up in bandwidth-limited situations or when the application causes cache thrashing. And I'm sure the drivers for older GCN architectures limited cache thrashing as much as possible anyway. So pretty much that advantage is minimized by previous driver work, though in the future it won't require driver work.

About the 128- to 256-bit change in the pipeline: that is pretty much how they are doing half precision and full precision in the same pipeline ;) hence why I stated that was the only major change they seem to have done in the shader array.
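For anyone unclear on what packed half precision means in practice, here's a small illustration (a sketch only, emulated with NumPy, not AMD's implementation): two FP16 values ride in one 32-bit lane, so one pass through a 32-bit-wide pipeline can do two half-precision operations.

```python
import numpy as np

# Two FP16 values packed into one 32-bit word; hardware with packed math
# can operate on both halves in a single 32-bit-lane pass.
a = np.array([1.5, 2.25], dtype=np.float16)
b = np.array([0.5, 4.0], dtype=np.float16)

packed = a.view(np.uint32)[0]   # both halves now live in one 32-bit word
print(hex(packed))

# Lane-wise add on the packed pair (emulated here in software):
result = a + b
print(result)                   # [2.0, 6.25] in FP16
```

The catch, as the thread notes, is that games only benefit if their shaders actually use FP16, which needs the new shader model.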
 
That's a ton of speculation and theory. I'd love to see AMD release drivers that boost performance 20-30% over what we are seeing from leaks, but when has that ever happened historically so close to release?
Dug up the older R700 ISA documents to see what is different in the block diagrams.
http://developer.amd.com/wordpress/media/2012/10/R700-Family_Instruction_Set_Architecture.pdf

vega.jpg r700.jpg GCN3.jpg
Not much different, and the same marketing blurb above lol.

Edit2 Added GCN3


Never. Ever. Not once.
This

Maybe over a year or two; I've seen that happen a few times, but never at launch. That said, it's a Jan driver so who knows what they are up to. Considering the timing, they're probably trying for a 1-2 punch in marketing, so it would make sense to sandbag Vega till the end if it's not as solid a product as Ryzen.


About the 128 to 258 bit change in the pipeline, that is pretty much how they are doing half precision and full precision in the same pipeline ;) hence why I stated that was the only major change they seemed to have done in the shader array.

Ahah! Makes more sense now, thank you so much. With everything all sorted, how much do you think they have left on the table, and do you think there is even the slightest possibility they disabled a memory controller till launch (to account for the weird bandwidth numbers so far)?

Edit: On further thought, if they're aiming at a $500 price point for Vega 64 (which has been blown), Vega 56 will probably end up around there... So really it all comes down to: they're probably only very good at mining, hence the pack strategy this time, with not too much gaming performance left on the table.
 
All you need to do is move the power limit slider and magic: mid-1800s boost.

Most of the Gaming X's frequency advantage out of the box is from the higher factory power limit.
If I don't overclock my 1080 FE and just move the power slider to 120%, it will easily hit above 1900MHz boost.
 
I don't see AMD and partners selling these cards much over the $400 V56 to $600 V64 Water MSRPs that have been rumored so far. If Vega 64 goes much over $500, then it turns Nvidia's 1080 into a value purchase. It also turns off even more people to AMD gaming. The gaming market has longer legs than the mining market.
 
I don't see AMD and partners selling these cards much over the $400 V56 to $600 V64 Water MSRPs that have been rumored so far. If Vega 64 goes much over $500, then it turns Nvidia's 1080 into a value purchase. It also turns off even more people to AMD gaming. The gaming market has longer legs than the mining market.
You can pick up a Founders Edition 1080 on Amazon for $535, so yeah, any higher a price and it becomes an odd duck.
 
I found this quite funny. It seems like RTG isn't even a threat to Nvidia.

that is interesting....

somebody with a Twitter account please tell Nvidia that a GeForce GTX 1080 Ti and an Nvidia driver update with FreeSync support for the cheaper, 5:1 more accessible FreeSync monitors would also make for a very compelling pair.
 
Nvidia knows what's up. You think all those Threadripper buyers are gonna go Vega? Nope, those PCIe slots are going green.

Even if RTG were a threat, I would venture to say that AMD and Intel competing for the CPU space has basically no bearing on Nvidia. No reason not to encourage some more sales.
 
The stock 1080 is 1607/1733MHz base/boost, 1708/1847MHz in OC mode.
That 1080 Gaming X is overclocked @ 1924MHz, almost a 20% overclock for a 1080.
The Gaming X is not heavily or even moderately overclocked; it has a 3% clockspeed bump but a more forgiving power profile, which allows the GPU to maintain its highest boost clocks almost all the time. This is happening with all aftermarket 1080s and 1070s. You literally can't buy an FE anymore. All the cards are aftermarket and all of them boost higher.
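For what it's worth, "how big an overclock" depends entirely on the baseline. Running the clocks quoted in this thread through a quick check:

```python
# Quick check of the GTX 1080 clocks quoted in this thread (MHz).
def oc_pct(observed, baseline):
    """Percent increase of an observed clock over a baseline clock."""
    return round(100 * (observed / baseline - 1), 1)

print(oc_pct(1924, 1607))  # 19.7 -> vs the stock 1080 base clock
print(oc_pct(1924, 1733))  # 11.0 -> vs the stock rated boost clock
print(oc_pct(1847, 1733))  # 6.6  -> Gaming X OC-mode rated boost vs stock boost
```

So the "almost 20%" figure only holds against the base clock; against the rated boost clock the observed 1924MHz is closer to 11%, and the factory rated bump is smaller still.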
 
The Gaming X is not heavily or even moderately overclocked; it has a 3% clockspeed bump but a more forgiving power profile, which allows the GPU to maintain its highest boost clocks almost all the time. This is happening with all aftermarket 1080s and 1070s. You literally can't buy an FE anymore. All the cards are aftermarket and all of them boost higher.
Amazon has FEs, and so does Best Buy, which sells Nvidia OEM versions.
 

AMD's CPU division has to be vendor agnostic. Surprised that RTG isn't spun off by now.

Nothing surprising about it. Imagine GlobalFoundries x2 if RTG got spun off. It would be like AMD chopping off their arm and trying to tell the doctors (investors) it's a flesh wound. Plus, I reckon that neither could really stay afloat financially on their own.
 
Even if RTG were a threat, I would venture to say that AMD and Intel competing for the CPU space has basically no bearing on Nvidia. No reason not to encourage some more sales.

Quite the contrary. Intel has enormous bearing on NV, since most of those "green" PCIe slots are on Intel boards. Intel also chased NV out of the chipset market years ago, so there may be a grudge there too.
 
Indeed, but don't you worry. Pascal is obsolete, and V100 is in danger of being obsoleted next, according to Anarchist4000.


http://developer.amd.com/wordpress/media/2013/12/Vega_Shader_ISA_28July2017.pdf

ISA docs
Moving the goalposts to V100 now?

Considering features, it is obsolete once Vega releases. That binding model will be a real PITA for Pascal, and any further use of async will only make the situation worse.

You don't understand, AMD are just sandbagging, and when RX is in people's hands they're going to release a driver that enables the other half of the memory controller, the tiled rasterizer, shader replacement for FP16 with per-game profiles for every AAA title ever released, and support for primitive shaders whose gradual rollout will propel Vega 10 from competing with GP104 to competing with GV100.
I'm expecting a few titles patched with FP16 in short order, but we really need the new shader model to release for that. Then PC has the full console feature set. It seems all games will be using primitive shaders on Vega, just without the heavily modified paths. Per-game optimizations would be possible on existing titles.

The latest Linux drivers have even Fiji ahead of the 1080, and actually beating the 1080 Ti in one test. Civ6 eats CPU though. Still a lot of work there, but AMD did create a new unified driver stack for supporting multiple operating systems, so it's taking a while. More than just Windows, Linux, and macOS.

Probably mining, but considering Fiji's performance with the latest git, Vega may very well be ahead in gaming in several titles.

So basically AMD is selling the 56 and 64 (air cooled) for $599 now; if true, that's hilarious. Once the 64 sells out, you're stuck paying full price for the 56 if you're that loyal.
It may be a way to encourage bundles and make more off mining. MSRP could increase, but it won't necessarily apply to bundle prices or rebates if they go that route.
 
"As we are not being sampled for RX Vega and aren’t in official communications regarding the product, we’re able to provide up-to-date news without concern of embargo breach."

- Gamers Nexus

Anyone remember what I said?

July 1, 2017

RTG is apparently pretty bitter about reviews of Radeon Vega FE that have been published and is considering denying sample units of Radeon RX Vega to sites that published reviews of Radeon Vega FE.

This is not a done deal and may not happen, but is currently under consideration.
 
"As we are not being sampled for RX Vega and aren’t in official communications regarding the product, we’re able to provide up-to-date news without concern of embargo breach."

- Gamers Nexus

Anyone remember what I said?

July 1, 2017

Gamers Nexus states that

Although we were told we wouldn’t be sampled, following our defiance of AMD’s decision to express favoritism by permitting only select reviewers to publish early Threadripper data, we will still have Vega content ready on embargo lift dates. We’ve sourced information and parts elsewhere. Turns out that talking about thermal compound efficacy on an IHS is disallowed, despite the fact that TR’s thermal performance was already published by outlets expressly permitted to break embargo.

Maybe the FE reviews were a factor, but we'll have to see whether PCPer and Tom's Hardware also didn't get sampled with RX Vega.
 