Spitballing Vega GPU Performance

Well, if the rumors are right, then that makes sense to me.

Frankly, based on when Raja got there, I never thought Vega would be a 100% Raja card.


There is a new theory floating around though, and it fits in nicely with my idea that Navi will be the first real "Raja-designed" card top to bottom.

The idea is that Polaris is really still a Tonga-based card (remember, they changed names when Raja got there - if this was under the old crew, it would have an Island name), and Vega was to be the last in the line of the Island cards.

When Raja came on, Polaris was already well into design; what they did was change and test a small part of Polaris to see how effective the design was. Vega was under initial R&D and had some design work done, so more could still be changed, and they did in fact change a lot on it, but at its heart it is still an Island card.

Navi will be the culmination - essentially the first card completely designed from top to bottom under Raja's eye. They are using and selling what they have already in development to remain relevant and keep capital coming in, hoping essentially to tread water until they can blow the roof off with Navi.

Again, this is just my perception, but I think based on what little rumors I have heard and how this all has played out, this indeed is the plan.
 
It's kinda funny how no one talked about GFE telemetry and spyware when AMD was letting a third party collect their user data through the Gaming Evolved app.

Customers: Why don't we have something like the GFE?

AMD: Err... hold on... ahh wait, wait... there you go, the awesome GAMING EVOLVED. Just as good, no wait, actually much better than GFE.

Customers: Yaaay. But wait, it is being force installed. We want a choice.

AMD: Really? C'mon. Okay whatever, here... take your choice, and remember that we love you, teehee.

Customers: It's not really doing anything useful, and waitaminute, is it mining data from us for RAPTR?

AMD: WHAT? NO WAY! Well kinda. Maybe just a tiny bit.

Customers: We don't want it.

AMD: (grumble grumble) Eh? Okay whatever.

Customers: Why don't we have something like the GFE?

AMD: Grrr, it has telemetry y'know. Which is like this evil unholy thing that will rob you of your soul and eat your babies.

Customers: O... M... G!!!


Here is an alternative: take OFF the tinfoil hat.


AMD didn't force anyone to use Raptr.
 
From what I've heard, Top Vega will be faster than the GTX1080 in DX12/Vulkan, sometimes a little bit, sometimes a lot depending on the game, and about as fast or somewhat slower in DX11.

My guess is that AMD was caught off guard with the 1080Ti, so it doesn't have anything to go against it.
 
It's funny, all the 'benchmarks' that get 'leaked out' have zero to do with gaming. That should tell you something right there.
 
Unless it beats a 1080 for $399, I don't see them selling too many. And if it does beat 1080 for $399, Nvidia will drop prices.

It's like Nvidia checkmate no matter what, unfortunately. Ryzen created a space for itself because Intel's a lumbering dinosaur. But Nvidia is no Intel; they move quickly, and always have something faster ready to drop at a moment's notice.


Unless NVIDIA drops the price of G-Sync monitors, a 1080-like Vega would be quite the revolution in the gaming market. Remember that we had a $550 34" 3440x1440 100 Hz FreeSync monitor on sale on Amazon this week.
 
AMD would charge the exact same prices if they could. AMD's lack of competitive products is what let the prices go up for the only company left in the market. If AMD takes over with Vega/Threadripper and Intel and Nvidia don't respond, I'll bet you dollars to donuts Vega2/Threadripper2 see much higher prices. It's not corporate greed, it's pure supply and demand.

When AMD had the upper hand on CPUs for a couple of years, in both the consumer and server space, I don't remember their prices being outlandish. I agree they aren't run as a charity, but at this point the two companies aren't one and the same in regard to proprietary tech and greed. There is a long history of Nvidia being greedy and proprietary; not so for AMD. You might argue that is market-position related. Then again, maybe not. We don't really know.

AMD created FreeSync and Mantle, both open to competitors. Nvidia created G-Sync, PhysX, CUDA, and HairWorks, which are closed to competitors. Nvidia went so far as to disable PhysX on systems with both an AMD GPU and an Nvidia GPU installed (they'd only let PhysX run if you were Nvidia-only). That's pretty crappy, and I don't know anyone who would argue otherwise.
 
Yar, the only thing Seymour hates more than the separate closed console eco-systems, is the freesync/gsync competition. I love what freesync/gsync tech bring to the table, but really wish they would play nice with each other. I used to be able to purchase a GPU based solely on the price/performance ratio for the resolution that I wanted to game at - now with my freesync monitor, that's playing a big factor in my decision making. I'm hoping with Intel rumored (confirmed now?) to be supporting freesync for their future iGPUs and the Scorpio, NVidia will finally cave and support it as well.
 
While I think this card will stink for me in CFD/mechanics packages just from driver things, I'm pretty sure it will rock with the code I write myself for things not covered in normal packages, so I could love this card. If it does well for gaming and I can take advantage of it, I'm psyched. Hopefully it does well in VR too, though I've not had a problem with my 2x 290X in any game I've played, but maybe I just have a problem because of using corrective vision stuffs.
 
Crapshoot - they will try to sell it for a little too much $$$, same as Fury, it is known, based on how late to the game they are with its release. Unfortunately, much like Fury, there were already equal or better solutions on the market ahead of its launch. I am however hoping that they will continue to increase the VRAM on the cards so that we can finally get to around that 10GB sweet spot that will get developers on board with forward-looking textures in modern games.
 
$300 for a gsync display that'll do three 1080ti justice? Well over $700 for a basic 4k Gsync monitor.

No, but the quote said displays that will do the card justice. As mentioned above a good 40" 60hz 4k tv can be had for under $300. Now, whether that fits into YOUR definition of doing the card justice is a different conversation.
 
Nvidia created G-Sync and PhysX
Little correction: PhysX was created by Ageia, which was then bought by Nvidia, but the part about disabling their own cards if an AMD card was also installed is correct.
 
I don't think anyone in the know expects Vega to get close to the 1080Ti, which is sad. I'd love to be proven wrong.
 
My post said it's NOT corporate greed, it's supply and demand. Intel/Nvidia aren't being "dicks", they're charging what the market will bear for enthusiast products.

I'm in it for performance. I personally am not going to accept lower performance just to help a "struggling" company get back on their feet. If they were a small business or something, it'd be different, but AMD is a huge multi-national corporation. You help them out now, and in 5 years when they're on the top, they'll reward you by charging the same high prices people complain about now.

If that happens, you can just buy overpriced crap from AMD instead of Nvidia. I don't get the whining against the people that want to have a more balanced market by spending their money any way they want...
 
At 4k or higher resolutions, where compute performance matters more going by Fury scaling, it's hard to imagine a game taking advantage of FP16 not performing far better on Vega than on a 1080, given well over 3x the FP16 throughput (the 1080 runs FP16 at its FP32 rate). Console devs at the very least will support it, and it's practically required for any engine also targeting mobile. That's leaving out overclocking, tiling, geometry, DX12/Vulkan/async advantages, and anything else that may have been added. Just looking at that theoretical gap, I'd expect it to outperform the 1080 Ti. Setting the bar low seems to have been the goal.

Obviously the cores did get reworked going off the higher clocks that have been demonstrated. While just speculating, I'm eyeing the Ryzen FPU design exploded out to 16 lane SIMD and a flexible scalar or two on the side to pick up integer, scalar, and other costly work. Defining a clock as a MUL operation instead of FMA would probably net the clockspeed increase we've seen. If the budget is tight reusing parts from Zen also makes sense and SMT is a viable GPU technique we haven't seen used. Everyone keeps assuming the traditional 2 ops/clock is MUL+ADD, but MUL+MUL is a possibility along with accumulators. That works for DL and graphics and is far more compute capability than the raw TFLOPs figure would suggest.
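The ops/clock accounting above turns into simple arithmetic. A quick sketch in Python, where the shader counts and clocks are rumored/assumed figures (4096 shaders at ~1.5 GHz for big Vega, 2560 at ~1.73 GHz for the GTX 1080), not confirmed specs, and 2 ops/clock is the traditional FMA counting:

```python
def tflops(shaders, clock_ghz, ops_per_clock=2):
    """Theoretical throughput: shaders x clock x ops/clock (FMA counts as 2)."""
    return shaders * clock_ghz * ops_per_clock / 1000.0

vega_fp32 = tflops(4096, 1.5)       # ~12.3 TFLOPs FP32 (assumed specs)
vega_fp16 = 2 * vega_fp32           # packed FP16 doubles the rate: ~24.6
gtx1080_fp16 = tflops(2560, 1.73)   # the 1080 runs FP16 at its FP32 rate: ~8.9

# Roughly a 2.5-3x theoretical FP16 gap under these assumptions.
print(round(vega_fp16 / gtx1080_fp16, 2))
```

The exact ratio shifts with whichever clocks you plug in, so the "3x" figure should be read as an order-of-magnitude argument rather than a benchmark.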


It's also possible they designed it not to pull any power through the PCIE slot given all the IO on Naples and Polaris issues. At x16 a single system could have 8 GPUs attached, and I wouldn't rule out 16 discrete(Falconwitch) or even dual cards for density.

I was working similar numbers, assuming DX uses FP16 ops rather than FP32. However, FP isn't everything. Fury had great FP throughput. Where AMD choked against Nvidia was the lack of a good tiled ROP backend, which is incredibly efficient. I think it will come down to how efficient AMD's tiling mechanism is.

I'm also thinking that Vega will not be 1 GPU part, but actually 2. One optimized for GDDR5x and one optimized for HBM2. This might explain why we are hearing rumors of the gaming version (GDDR5x) being delayed till fall. AMD wants the high margin HBM2 parts out the door. And I said this months ago, and Raja also pointed out during the presentation. While the largest % of market share is in the Sub $250 category, the real profits are to be found in the high end, especially machine learning. We gamers are small potatoes compared to the potential of that market share reaping profits.
 
If that happens, you can just buy overpriced crap from AMD instead of Nvidia. I don't get the whining against the people that want to have a more balanced market by spending their money any way they want...

I don't get people who buy for something other than performance, especially considering we're in the performance computer market here. I get spending more money to support a small business, or a local business. I don't get spending money at one globalized corporate entity over another globalized corporate entity with the thought that your purchase will somehow restore the balance.
 
Response Time : 19.4 ms

Lol no thanks

Response time means nothing. If it were such a handicap I wouldn't be top of the scoreboard practically every game of Titanfall 2 which is about as fast paced of a shooter as you can get.
 
Response time means nothing. If it were such a handicap I wouldn't be top of the scoreboard practically every game of Titanfall 2 which is about as fast paced of a shooter as you can get.
It degrades image quality; doesn't really impact performance. The whole point is to get a display that looks good. That means motion clarity in fast paced games.
 
Response time means nothing. If it were such a handicap I wouldn't be top of the scoreboard practically every game of Titanfall 2 which is about as fast paced of a shooter as you can get.

I'm not trying to insult your choice of display or anything, but the quality of the display has very little to do with leaderboard standings. A good player will probably make it to the top of the list regardless of graphics quality or settings.
 
It degrades image quality; doesn't really impact performance. The whole point is to get a display that looks good. That means motion clarity in fast paced games.

I'm not trying to insult your choice of display or anything, but the quality of the display has very little to do with leaderboard standings. A good player will probably make it to the top of the list regardless of graphics quality or settings.

The video review quote was wrong. Response time on OLED is pretty much instantaneous. What he should have said was input lag.

You're right, I was reading response time as input lag which is what most people complain about on using a TV as a monitor. For response time this is basically the same as my last monitor so I didn't notice a difference on blur or anything of that nature. From looking at my friend's monitor, I also can't tell a difference.

For input lag, my Samsung KU7500 does 37ms or 20ms depending on whether you enable 4:4:4 chroma. The Asus ROG Swift is 9 to 10ms. So the way I look at it (and this could be wrong) is that if I have a 50ms ping while gaming online, it would FEEL as if I had a 60ms ping, which honestly is unnoticeable to me personally. I know latency online to a server is different from latency to a monitor, but it was my only real mental reference for what 10ms feels like. I used to always put it in game mode because I thought I "needed" that extra 17ms. Eventually I got lazy and stopped doing it, and seriously, I can't tell a difference at all. My online standing in PvP games didn't change a bit, how I played in single-player games didn't change a bit, and it was less work than constantly changing modes.

So the feeling of gaming on a giant fucking screen really, in my personal opinion, far outweighs the 27ms input lag or the 10ms of response time simply because the input lag and response time are imperceptible but a monitor that is almost twice as big is very perceptible.
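The ping comparison in that post is just addition; a tiny sketch using the post's own numbers (a 50 ms example ping, ~10 ms monitor lag, and the TV's faster 20 ms mode):

```python
PING_MS = 50         # example online ping from the post
MONITOR_LAG_MS = 10  # Asus ROG Swift input lag (post's figure)
TV_LAG_MS = 20       # Samsung KU7500 in its faster mode (post's figure)

# Naive end-to-end model: network latency and display lag are different
# things, but they stack, so the delta is what you'd actually perceive.
feels_like_tv = PING_MS + TV_LAG_MS            # 70 ms
feels_like_monitor = PING_MS + MONITOR_LAG_MS  # 60 ms
print(feels_like_tv - feels_like_monitor)      # 10 ms delta
```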
 
MY VIDEO CARD MANUFACTURER OF CHOICE IS BETTER THAN YOUR VIDEO CARD MANUFACTURER OF CHOICE!
 
Well, if the rumors are right, then that makes sense to me.

Frankly, based on when Raja got there, I never thought Vega would be a 100% Raja card.


There is a new theory floating around though, and it fits in nicely with my idea that Navi will be the first real "Raja-designed" card top to bottom.

The idea is that Polaris is really still a Tonga-based card (remember, they changed names when Raja got there - if this was under the old crew, it would have an Island name), and Vega was to be the last in the line of the Island cards.

When Raja came on, Polaris was already well into design; what they did was change and test a small part of Polaris to see how effective the design was. Vega was under initial R&D and had some design work done, so more could still be changed, and they did in fact change a lot on it, but at its heart it is still an Island card.

Navi will be the culmination - essentially the first card completely designed from top to bottom under Raja's eye. They are using and selling what they have already in development to remain relevant and keep capital coming in, hoping essentially to tread water until they can blow the roof off with Navi.

Again, this is just my perception, but I think based on what little rumors I have heard and how this all has played out, this indeed is the plan.


Bang on with my understanding; I just remember seeing, in a Raja presentation or similar official analysis, where it was confirmed Navi is the first 100% Raja design. Vega he had some part in, but not all. Sorry I can't find the source; it was a pretty minor remark in some much larger event. Maybe Ctex 16/Polaris?
 
Interesting! I had no idea OLED did that!

Actually I got my wires crossed. I had this and the OLED monitor thread open at the same time. :D

Someone in that thread was talking about Response time of OLED being bad. And then I read something similar here...
 
Well, I'm buying one (or a few :p). HBM and a decent-sized chip looks like a killer combination for mining :)
 
Bang on with my understanding; I just remember seeing, in a Raja presentation or similar official analysis, where it was confirmed Navi is the first 100% Raja design. Vega he had some part in, but not all. Sorry I can't find the source; it was a pretty minor remark in some much larger event. Maybe Ctex 16/Polaris?

Kyle's article about the internal chaos with Raja/AMD mentions how he wants to push as quickly as possible to Navi. And looking at the intended design, it holds some interesting possibilities.
 
LOL, just reading everybody's posts is hilarious. The RX version specs aren't even out yet, and almost everybody has come up with a conclusion. It's like reading about the OJ case while it was happening.
It's just a video card for gaming, nothing else (the RX). I can understand those people that work in CAD, Adobe Premiere Pro, or other industry jobs (Kyle), but many here think they're going to lose their house and job if AMD comes out with a winner. Funny.
 
I'm also thinking that Vega will not be 1 GPU part, but actually 2. One optimized for GDDR5x and one optimized for HBM2. This might explain why we are hearing rumors of the gaming version (GDDR5x) being delayed till fall. AMD wants the high margin HBM2 parts out the door. And I said this months ago, and Raja also pointed out during the presentation. While the largest % of market share is in the Sub $250 category, the real profits are to be found in the high end, especially machine learning. We gamers are small potatoes compared to the potential of that market share reaping profits.
I'm unsure on GDDR5/X, as I think Vega probably benefits from the faster memory. That lower prefetch helps memory compression when encountering sparse waves and more random workloads - it's 2 consecutive fetches versus 8 when fetching data. Given how Infinity Fabric works with Ryzen, a separate DDR or slower pool makes sense. I wouldn't rule out HBM along with GDDR, but it would cost a bit more. These cards are still longer than they should be.
 
LOL, just reading everybody's posts is hilarious. The RX version specs aren't even out yet, and almost everybody has come up with a conclusion. It's like reading about the OJ case while it was happening.
It's just a video card for gaming, nothing else (the RX). I can understand those people that work in CAD, Adobe Premiere Pro, or other industry jobs (Kyle), but many here think they're going to lose their house and job if AMD comes out with a winner. Funny.

There are almost 1 million AMD cards on the Ethereum network alone. If this card is a winner, it may force many current mines to upgrade, costing massive amounts of money to stay competitive. For some people these releases mean quite a bit more than just a gaming card.

edit: there look to be another 400,000 on Zcash
74,000 on Ethereum Classic
100,000 on Monero
2.5 million on Dash (not 100% AMD)

Most of my estimates were also pretty conservative. I assumed everyone was using a well-tuned RX 480, and in reality that's far from true.

And from a financial standpoint, it looks like mining would account for 10-25% of their revenue. (I'm assuming the majority of the cards were purchased in the same year. This is probably true for all the big mines but most likely not true for all the cards.)
 
There are almost 1 million AMD cards on the Ethereum network alone. If this card is a winner, it may force many current mines to upgrade, costing massive amounts of money to stay competitive. For some people these releases mean quite a bit more than just a gaming card.

Would you mind telling me where you learned there are a million AMD cards mining Ethereum?
 
Would you mind telling me where you learned there are a million AMD cards mining Ethereum?

Network hash rate is 26,000 GH/s (26,000,000 MH/s), and the fastest cards will only do around 30 MH/s (R9 290s and RX 480s). 26,000,000 / 30 is roughly 866,000, and I rounded up because I guarantee the majority of the network isn't running at 30 MH/s (you have to have an 8GB or flashed, overclocked card to do that).

edit: I also neglected Nvidia cards because I highly doubt they would make up any significant portion of this.
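The card-count estimate above is one division; here it is spelled out, with the caveat that both the network hash rate and the 30 MH/s per-card figure are the post's own assumptions, not verified numbers:

```python
NETWORK_HASHRATE_MHS = 26_000 * 1_000  # 26,000 GH/s expressed in MH/s
PER_CARD_MHS = 30                      # optimistic well-tuned RX 480 / R9 290

cards = NETWORK_HASHRATE_MHS / PER_CARD_MHS
print(f"~{cards:,.0f} AMD cards")  # ~866,667 cards
```

Any slower average per-card rate only pushes the count higher, which is the post's justification for rounding up toward a million.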
 
Well, I'm buying one (or a few :p). HBM and a decent-sized chip looks like a killer combination for mining :)

I think it is going to be worth preordering at least one, just in case it ends up being a winner in that area.
 
I think it is going to be worth preordering at least one, just in case it ends up being a winner in that area.

That's my plan, and if it's not decent at mining, my desktop could always use a new card.
 
I don't get people who buy for something other than performance, especially considering we're in the performance computer market here. I get spending more money to support a small business, or a local business. I don't get spending money at one globalized corporate entity over another globalized corporate entity with the thought that your purchase will somehow restore the balance.

You shouldn't worry about things you don't get. Just keep buying for performance. Why are you participating in the discussion again?
 
You shouldn't worry about things you don't get. Just keep buying for performance. Why are you participating in the discussion again?

No offense, but that's idiotic. If you "don't worry about things you don't get", how are you going to ever learn anything? Perhaps I was hoping for more dialogue on the reasons behind that decision.
 