Radeon RX Vega Discussion Thread

So a 1080 class card is now esports ready? Check. 144Hz in an esports game hardly takes a 1080 class GPU.

Again this shows you know nothing about 120hz gaming.

But this is a thread about Vega. We should get back on track.
 
Even though I don't believe the article, if you read the whole thing, the writer said he's expecting it to crush the Ti by "15-20%" at 4K.

If you buy this card and you're concerned with its 1080p performance... maybe your priorities are skewed... just sayin'.

Which is completely absurd. It's not going to go from 5% slower than a 1080 at 1080p to 20% faster than a 1080 Ti at 4K. Do remember, a 1080 Ti is about 35% faster than a 1080. It's not going to magically jump a performance tier and get 60% faster because of a change in resolution. There's wishful thinking, there's stoned out of your mind, and then there's whatever that is.
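For what it's worth, the thread's own numbers make the point; a quick back-of-envelope sketch (all figures are the ones quoted in this thread, not measured results):

```python
# Back-of-envelope using the figures quoted in this thread (not benchmarks).
gtx_1080 = 1.00                      # baseline
vega_1080p = gtx_1080 * 0.95         # "5% slower than a 1080 at 1080p"
gtx_1080_ti = gtx_1080 * 1.35        # "a 1080ti is 35% faster than a 1080"
vega_4k_claim = gtx_1080_ti * 1.20   # the article's "20% faster" than the Ti at 4K

implied_jump = vega_4k_claim / vega_1080p
print(f"implied speedup from a resolution change alone: {implied_jump:.2f}x")
# about 1.7x with the 20% figure, i.e. roughly 70% faster than its own
# claimed 1080p standing, purely from changing resolution
```

Even taking the low end of the article's 15-20% range, the implied jump is over 60%, which is the gap the post above is calling out.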
 
Again this shows you know nothing about 120hz gaming.

But this is a thread about Vega. We should get back on track.

Vega will be aimed at 4K gaming. I have no idea about its final capabilities, nor do I pretend to. But it won't be an esports gaming card (neither is a 1080), and I know enough to know when smoke is being blown up my butt. Please stop.

I still don't believe the article, but I know that it wasn't an off-the-cuff decision to feed Vega with HBM2. Given the cost, they must feel some specific outcome makes it worthwhile.
 
Which is completely absurd. It's not going to go from 5% slower than a 1080 at 1080p to 20% faster than a 1080 Ti at 4K. Do remember, a 1080 Ti is about 35% faster than a 1080. It's not going to magically jump a performance tier and get 60% faster because of a change in resolution. There's wishful thinking, there's stoned out of your mind, and then there's whatever that is.

I know. I just don't understand what they think this is going to be. But one thing we do know: no matter how hard AMD tries, their channel leaks like a sieve. As soon as partners start building cards, there are going to be leaks.
 
Let's hope it is a good product. That would benefit everyone. But with their track record, it is understandable that people are skeptical.
 
TPU got the clocks from the OpenCL leaks, so the 1200 MHz boost seems a bit low; I still expect this thing to boost up to ~1400 to ~1500 MHz. The OpenCL bench might not be reading the clocks right, or there is still room for the performance rumors to play out.
 
Vega will be aimed at 4K gaming. I have no idea about its final capabilities, nor do I pretend to. But it won't be an esports gaming card (neither is a 1080), and I know enough to know when smoke is being blown up my butt. Please stop.

I still don't believe the article, but I know that it wasn't an off-the-cuff decision to feed Vega with HBM2. Given the cost, they must feel some specific outcome makes it worthwhile.

Vega and 4K gaming is a fairy tale. Sure, if you lower the settings enough to match something like console 4K. Maybe 30 FPS too, for a cinematic effect. And the 4K bar keeps moving; I wouldn't even call a Titan Xp a 4K card.

The reason Vega uses HBM has more to do with HPC and AMD's stupidity in locking themselves into the technology way before its prime. They chose to gamble on HPC first and gaming last with the meagre resources RTG got.

HBMx as such can only do one single thing GDDRx can't do, and that's ECC. Everything else is slideware.
 
So a 1080 class card is now esports ready? Check. 144Hz in an esports game hardly takes a 1080 class GPU.

While most would be perfectly fine with a 1060, or even a 1050... I would wager that there's been some sponsorship money to get esports pros a nice 1080 from either Nvidia or some of the big guys like Asus or MSI. Not really too sexy showing off an Enhanced 580 for esports, you know!
 
Vega and 4K gaming is a fairy tale. Sure, if you lower the settings enough to match something like console 4K. Maybe 30 FPS too, for a cinematic effect. And the 4K bar keeps moving; I wouldn't even call a Titan Xp a 4K card.

Um, what?? Some of you guys' standards are simply crazy. Apparently you can't play at a resolution unless you have "90 fps and 16x AA that nobody can notice."
 
Um, what?? Some of you guys' standards are simply crazy. Apparently you can't play at a resolution unless you have "90 fps and 16x AA that nobody can notice."

16x AA I don't really give a shit about in-game; it's good for screenshots and all, or when I'm actually standing still.

90+ fps... very much yes, I can tell a difference, mostly in input. I only play a MOBA (Smite), but there is a distinct difference in my performance on a 60 Hz screen versus my 144 and 165 Hz screens. I thought for a while it was a placebo effect, but I can say with confidence that it is there... to the point that sometimes I'm curious about the advantages of a 240 Hz CRT.

As such, I think 4K is overrated. I had a 4K monitor, and I still have a few 1080p monitors in the house, but 1440p has been gaming heaven for me the last 4+ years (well, 1600p before that, but the 3007WFP was a different monster that was great for games for an odd reason for me).
 
4K with the right game can be an amazing experience. I played Resident Evil 7 like that, maxed out in 4K HDR; it was insane. I don't think it's overrated, it's really quite good, but there is an argument that it's not worth the cost or hassle, and I might agree with that.
 
LOL
"Based on TPU review data: "Performance summary" at 1920x1080
Radeon RX Vega performance estimated based on architecture, shader count and clocks"

WTF!?

The whole article has zero credibility and no real source; clickbait is the only thing I can think of from that shit site.
 
I like TPU in general, but they are sort of stretching it with this estimation and presenting it as fact. I don't buy it.
 
LOL
"Based on TPU review data: "Performance summary" at 1920x1080
Radeon RX Vega performance estimated based on architecture, shader count and clocks"

WTF!?

The whole article has zero credibility and no real source; clickbait is the only thing I can think of from that shit site.


The performance scores aren't an estimate; they're from the leaked OpenCL bench. The specs are good too. The frequency is the only thing that is up in the air, since that too is from the OpenCL bench.
 
The performance scores aren't an estimate; they're from the leaked OpenCL bench. The specs are good too. The frequency is the only thing that is up in the air, since that too is from the OpenCL bench.

Yep. So I wonder how this card could be killer at higher resolutions. Where is the magic switch that kicks in at 4K, and how does that work? I'm very skeptical. We should know within a couple of weeks, though, if the leaked slides are real, based on the NDA date in their footer.
 
http://www.3dmark.com/spy/1544741

Hmm, same clocks, generic VGA card...
Above the 1070, below the GTX 1080, just like TPU's numbers.

Definitely an HBM2 chip, with the 700 MHz on the VRAM.

Well, lots of clues point towards this being Vega under test. The AMD 1800X makes me think it was done by AMD ^_^. A 5700 GPU score is MUCH closer to the 1070 than the 1080. This is only about 10% higher than Fury X.
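That 700 MHz VRAM reading is consistent with 1.4 Gbps-effective HBM2. A rough sanity check, assuming the rumored 2-stack, 2048-bit configuration (the bus width is an assumption based on the rumors, not something the 3DMark entry states):

```python
# Sanity check of the leaked 700 MHz VRAM clock against HBM2 bandwidth.
# Assumes 2 HBM2 stacks at 1024 bits each -- rumored, not confirmed by the leak.
actual_clock_mhz = 700
effective_gbps_per_pin = actual_clock_mhz * 2 / 1000   # DDR signaling: 1.4 Gbps
bus_width_bits = 2 * 1024                              # two assumed stacks
bandwidth_gb_s = effective_gbps_per_pin * bus_width_bits / 8
print(f"{bandwidth_gb_s:.1f} GB/s")   # 358.4 GB/s; Fury X's HBM1 did 512 GB/s
```

If that holds, the leaked card would actually have less raw memory bandwidth than Fury X, which fits the modest score.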
 
So where does that leave them? Is Nvidia going to lower the price of the GTX 1070 to $250 and push AMD into the sub-$300 price category, keeping the high-end, high-margin market to itself?
 
What I think is that this one is the cut-down or down-clocked version, and there is another one that is at ~1080 performance level, hopefully...
 
So where does that leave them? Is Nvidia going to lower the price of the GTX 1070 to $250 and push AMD into the sub-$300 price category, keeping the high-end, high-margin market to itself?

I don't see a 1070 at $250 any time soon as standard pricing, as AMD can't touch it with Polaris. I think a $300 1070 and a $400 1080 kill Vega, if these leaks are to be believed. AMD probably can't cut costs enough to compete in that pricing scenario, due to the high cost of HBM2. I think they are kinda screwed, and Vega simply becomes APU tech. Is everyone ready to spend the next 2+ years screaming "where's Navi?" I'm not.

Of course, this is all based upon the leaks being representative of the final product.
 
I don't see a 1070 at $250 any time soon as standard pricing, as AMD can't touch it with Polaris. I think a $300 1070 and a $400 1080 kill Vega, if these leaks are to be believed. AMD probably can't cut costs enough to compete in that pricing scenario, due to the high cost of HBM2. I think they are kinda screwed, and Vega simply becomes APU tech. Is everyone ready to spend the next 2+ years screaming "where's Navi?" I'm not.

Of course, this is all based upon the leaks being representative of the final product.

Does anyone know how much room Nvidia has to lower prices? If Vega is 10% better than the 1070 (current MSRP $350), AMD will immediately become a player in the $350 price range. If that is the best card AMD has to offer, then Nvidia dictates where AMD plays. Does Nvidia want them there?
 
I thought the lower-end card would have 4 GB of RAM.


Can't do that unless they cut the bus size by half, which is a huge cut, or go to HBM; either way, bandwidth gets cut down by half, and that is too much of a cut to sustain good performance.
 
If this leak reflects the actual product, then the predictions of a particular Baidu user have been spot on so far:

Translated Chinese said:
- For the moment, consumer Vega is also at a 1200 MHz frequency, much lower than the RX 580; after all, the core is too large, so the frequency is low and the heat output is too high. 1500 MHz is simply not possible, unless under liquid nitrogen.

- The Vega professional card, the Radeon MI25, is currently set to have its release postponed, without an announcement.

- In my view, looking at the current production version, it is estimated that it cannot beat the 1080, and the core cost is basically higher than GP102.
 
Does anyone know how much room Nvidia has to lower prices?
AMD used to sell 350 mm^2 dies with 256-bit memory for $200. That is basically as low as you can go.

So... nV has a fuckton of space to lower prices if need be, but they will not use it, because nV was already fucked over by price cuts once.
 
Can't do that unless they cut the bus size by half, which is a huge cut, or go to HBM; either way, bandwidth gets cut down by half, and that is too much of a cut to sustain good performance.

AMD did hint there would be both a 4GB and an 8GB version (possibly one reason to really push the idea of their cache technology for consumers and gaming when they showed 2GB with Deus Ex: Mankind Divided).
But like you say, their bandwidth is becoming a bit of a worry relative to Nvidia: a 2-stack is half the bandwidth of, say, P100, which has 720 GB/s, so down to 360 GB/s rather than the 512 GB/s that SK Hynix originally was pushing all that time ago as a possibility. That would leave the 1-stack pretty weak even relative to Nvidia's raw bandwidth on the 1070.
I notice VideoCardz still has the Vega memory at the last official speed from SK Hynix, which was 1.6 Gbps effective clocks; that's higher than the recent benchmark leak at 1.4 Gbps, which matches what was available from Samsung. That suggests technical challenges ramping it up, and we see this with every memory tech when a manufacturer starts out.
Seems to me AMD was relying upon 2 Gbps HBM2 clocks to make both a 2-stack and a 1-stack work well, but has ended up at the same clock speeds, so far as we can see in the most recent leak results, at 1.4 Gbps.

Just to stop some arguments: I use Gbps and GHz interchangeably for the effective clocks (the actual clock is half that), since some understand one better than the other in the context of this discussion, whatever is technically accurate.
Cheers
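The stack/clock arithmetic in the post above can be sketched as a small helper; the 1024-bit width per stack is from the public HBM2 spec, and the clock figures are the ones being discussed here:

```python
# HBM2 bandwidth: stacks x 1024 bits per stack x effective Gbps, divided by 8.
def hbm2_bandwidth_gb_s(stacks: int, effective_gbps: float) -> float:
    return stacks * 1024 * effective_gbps / 8

scenarios = [
    ("P100-style 4-stack @ 1.4 Gbps", 4, 1.4),   # ~720 GB/s
    ("2-stack @ the touted 2.0 Gbps", 2, 2.0),   # 512 GB/s
    ("2-stack @ the leaked 1.4 Gbps", 2, 1.4),   # ~360 GB/s
    ("1-stack @ 1.4 Gbps", 1, 1.4),              # ~180 GB/s
]
for label, stacks, gbps in scenarios:
    print(f"{label}: {hbm2_bandwidth_gb_s(stacks, gbps):.1f} GB/s")
```

This is just the raw pin-rate math; real effective bandwidth depends on the memory controller, but it shows why a 1-stack card at 1.4 Gbps would look weak next to a 1070's 256 GB/s.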
 
We have had Vega ES show up at those clocks before, 1200 core. This looks like a lower-clocked model, or maybe a pro card with the full core but slower memory and a slower core. Doesn't the top Vega have 1.6 GHz on the memory? So I think this is the very early sample we saw before that's been roaming around, with the same clocks, and this one with even slower memory. Looks like AMD hasn't given anyone later revisions of the chip, since the few leaks we have seen have been 1200 MHz.
 
AMD did hint there would be both a 4GB and an 8GB version (possibly one reason to really push the idea of their cache technology for consumers and gaming when they showed 2GB with Deus Ex: Mankind Divided).
But like you say, their bandwidth is becoming a bit of a worry relative to Nvidia: a 2-stack is half the bandwidth of, say, P100, which has 720 GB/s, so down to 360 GB/s rather than the 512 GB/s that SK Hynix originally was pushing all that time ago as a possibility. That would leave the 1-stack pretty weak even relative to Nvidia's raw bandwidth on the 1070.
I notice VideoCardz still has the Vega memory at the last official speed from SK Hynix, which was 1.6 Gbps effective clocks; that's higher than the recent benchmark leak at 1.4 Gbps.
Seems to me AMD was relying upon 2 Gbps HBM2 clocks to make both a 2-stack and a 1-stack work well, but has ended up at the same clock speeds, so far as we can see in the most recent leak results, at 1.4 Gbps.

Just to stop some arguments: I use Gbps and GHz interchangeably for the effective clocks (the actual clock is half that), since some understand one better than the other in the context of this discussion, whatever is technically accurate.
Cheers

This version with the 1200 core leaked a while back; what we didn't know was the memory speed. So I am pretty confident this is an ES sample, an earlier revision of the chip with slower memory, or it is just one version with lower clock speeds.
 
We have had Vega ES show up at those clocks before, 1200 core. This looks like a lower-clocked model, or maybe a pro card with the full core but slower memory and a slower core. Doesn't the top Vega have 1 GHz on the memory? So I think this is the very early sample we saw before that's been roaming around, with the same clocks, and this one with even slower memory.


Vega's memory frequency is limited with HBM; there really isn't much room to play with HBM.

There is no such thing as "early" samples, not with GPUs; unless there are problems with the silicon or node, there is not much they can do. Now, if this is a cut-down version, its board ID doesn't match up: the C1 boards are the top-end boards that AMD/ATi has used for many years to designate the top end. So if they don't get more than 1200 MHz on boost clocks, they are screwed. Even at 1500 MHz, that is GTX 1080 performance at much higher power consumption.

Question is, has AMD sent out MI25 Instinct cards yet? They launched those last quarter, and I still haven't heard anything about them being in the hands of anyone.
 
This version with the 1200 core leaked a while back; what we didn't know was the memory speed. So I am pretty confident this is an ES sample, an earlier revision of the chip with slower memory, or it is just one version with lower clock speeds.

We have seen the exact same thing happen in the past when it comes to memory: it is really challenging to hit the maximum rated speeds, and it was always apparent HBM2 was falling short of the max rating originally given by SK Hynix.
It dropped from 2 Gbps (or a 2 GHz effective clock) to 1.6 Gbps even before any product was available from SK Hynix, so it is possible they can only match what Samsung was able to do last year with HBM2, and that was 1.4 Gbps.
It does sound like a memory rating challenge affecting both manufacturers, for all of HBM2/GDDR5X/GDDR6.
I seriously doubt you will see this increasing for quite a while; Samsung has the best chance to hit 1.6 to 2 Gbps this year, as they have an extra 12 months' manufacturing advantage over SK Hynix with HBM2.

Cheers
 
Interestingly enough, the OpenCL leak shows it's the full GPU, so the only thing left is its clock speed...
 
So that latest 3DMark leak looks like it could be real; at least it's plausible. However, I feel like if AMD can only match the 1070 and get a scant 10% lead over Fury X, then that is a huge problem. Fury X came out nearly 2 years prior; a 10% gain in 2 years is a bad joke.

The one thing we can hope is that there are a few Vega SKUs, and this is just the 1070-competitor model and there is something better. At least I hope so.
 