Vega Rumors

Vega 56 3DMark Benchmark


Just off the top of my head, these seem like OC'd 1070 scores; correct me if I'm wrong.
My OC'd GTX 1070 (reference card but with an ACX 3.0 cooler; stays below 65C while fan RPM stays below 1400) scores above 20K. No extra voltage, and not even the max overclock.
 
G-Sync "tax" aside, you are comparing two different sizes of monitor with different feature sets...
That's the closest I could get them to equate to one another, because the G-Sync tax for the C32HG70 is "it doesn't exist".

My point was that not all FreeSync monitors are budget-friendly.
Something like this:

https://www.amazon.com/AOC-G2460PF-24-Inch-Gaming-Monitor/dp/B01BV1XBEI/ref=as_li_ss_tl

$200 with a 35-144 Hz range

That's the kinda "budget" I'm talking about....
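
Worth noting why that 35-144 Hz range matters: FreeSync's Low Framerate Compensation only kicks in when the panel's max refresh is a large enough multiple of its min (~2.5x is the commonly cited rule of thumb; that ratio is an assumption here, not something from this monitor's spec sheet). A quick sketch:

```python
# Rough LFC (Low Framerate Compensation) eligibility check. The ~2.5x
# max/min ratio is the commonly cited rule of thumb, not an official figure.
def supports_lfc(min_hz: float, max_hz: float, ratio: float = 2.5) -> bool:
    return max_hz >= ratio * min_hz

print(supports_lfc(35, 144))  # True: 144/35 ~ 4.1, comfortably above 2.5
```

144/35 is over 4x, so dips below 35 fps can still get frame-doubled back into the supported window.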

For $600 total (incl. a Vega 56), ain't a bad deal!

(Especially if you're more inclined towards e-sports/crazy fps gaming etc)

*As you can see from my sig, I'm not one that is prone to dropping $700 on a monitor

It all boils down to facilitating a need in the marketplace and I believe AMD might have hit the nail on the head with the Ryzen 1600X, Vega 56, and a Freesync monitor.

Guess your thinking is far different from mine. Monitors are the very first thing I'd drop big bucks on, because the screen is the part of the computer you will stare at the most (assuming normal usage; I don't know anyone who stares at the inside of their computer more than their monitor), so the thing that most often defines my experience is the screen itself.

That being said, I am skeptical there will be a significant number of people (outside of those who know what they are doing) who will pair a 1080p monitor with a Vega 56; I'd be more inclined to believe an RX 570/580 is the likelier candidate for such a pairing.

Everything Vega has done, it has done right, that I agree with. It's the timing.
 
My OC'd 980 Ti gets a 21K graphics score in Fire Strike (normal), so this is pretty hilarious.
 
Not all FreeSync is "cheap".

See the C32HG70; it costs the same as the XB271HU.
That C32HG70 has quantum dots (new color-enrichment tech), and it is 32" and 1440p.

That Nvidia monitor is 1080, 27", and doesn't have quantum dots!
Nvidia G-Sync monitors with quantum dots are double that FreeSync monitor's price.
 
Also a different resolution, but that's an argument for a different day, since that particular G-Sync monitor has a PPI way too high for my liking; given the choice, I'd prefer the Samsung one.
 
Something like this:

https://www.amazon.com/AOC-G2460PF-24-Inch-Gaming-Monitor/dp/B01BV1XBEI/ref=as_li_ss_tl

$200 with a 35-144 Hz range

That's the kinda "budget" I'm talking about....

For $600 total (incl. a Vega 56), ain't a bad deal!

(Especially if you're more inclined towards e-sports/crazy fps gaming etc)

*As you can see from my sig, I'm not one that is prone to dropping $700 on a monitor

It all boils down to facilitating a need in the marketplace and I believe AMD might have hit the nail on the head with the Ryzen 1600X, Vega 56, and a Freesync monitor.

If people are on a budget, then they may be best off waiting for HDR monitors rather than buying Vega and FreeSync now.
I say "may" because no one can really say how much impact HDR will have on developers and the monitor market, but it will add to the cost of a monitor, making it even more painful to buy FreeSync now and then replace it with an HDR FreeSync monitor at the end of the year or very early next year.
If you already have FreeSync, sure, it may be a no-brainer, but in the context you raise of value concerns combined with VRR, I do not think it is so clear-cut, especially if HDR does (big if) live up to what it is meant to offer, and this is ignoring the possible shadow of the GV104 launch, more specifically the '2070', some 4-5 months out.
I guess one needs to weigh up how much they want VRR now and when they would then be able to get onto HDR, or whether they are happy to accept there will be another performance leap between now and Navi (which, going by AMD's record, will probably launch late, especially now that they are adding AI core functionality to it).
Point being, it is not that clear to me unless one can say HDR is going to be a flop with minimal impact and minimal interest from devs/studios (considering the consoles, they will probably push it), or one is OK being tied for a generation to current performance that has been around for 15 months already while games continue to evolve.
It is a tough call when considering the other relevant factors, and I am not sure there is one given answer.
Cheers
 
If people are on a budget, then they may be best off waiting for HDR monitors rather than buying Vega and FreeSync now.
I say "may" because no one can really say how much impact HDR will have on developers and the monitor market, but it will add to the cost of a monitor, making it even more painful to buy FreeSync now and then replace it with an HDR FreeSync monitor at the end of the year or very early next year.
If you already have FreeSync, sure, it may be a no-brainer, but in the context you raise of value concerns combined with VRR, I do not think it is so clear-cut, especially if HDR does (big if) live up to what it is meant to offer, and this is ignoring the possible shadow of the GV104 launch, more specifically the '2070', some 4-5 months out.
I guess one needs to weigh up how much they want VRR now and when they would then be able to get onto HDR, or whether they are happy to accept there will be another performance leap between now and Navi (which, going by AMD's record, will probably launch late, especially now that they are adding AI core functionality to it).
Point being, it is not that clear to me unless one can say HDR is going to be a flop with minimal impact and minimal ...

Your statements are reasonable. But how did I know you would shill for Nvidia eventually?

HDR is in the very early stages, like 4K was. Remember when you needed two cables to display it? And not every 4K refresh/depth/bit format is supported yet (even if you leave out the 3D formats). Your box may support the 21mbit format and your monitor may not.

I predict HDR will shake out the same way. All the monitors out now have compatibility problems. And to be honest, they all lack proper dynamic contrast.
 
Your statements are reasonable. But how did I know you would shill for Nvidia eventually?

HDR is in the very early stages, like 4K was. Remember when you needed two cables to display it? And not every 4K refresh/depth/bit format is supported yet (even if you leave out the 3D formats). Your box may support the 21mbit format and your monitor may not.

I predict HDR will shake out the same way. All the monitors out now have compatibility problems.

Eh?
Nothing in my post is about shilling for Nvidia.
The only one showing bias is you, with your deliberate insulting/slighting of me.
If I wanted to shill, I would have weighted my post much more towards Volta than I actually did, along with being highly critical of AMD on the GPU side with a rather large breakdown.
In fact, earlier I actually supported the leaked Fire Strike results as being pretty reasonable, both in general and in their specific context, where some were ignoring the MSI 1080.

People suggesting "go ahead and buy Vega AND FreeSync now and save yourself cash" without pointing out the possibility of HDR and the looming shadow of the Volta '2070' are the actual shills, and they may actually cost those people more financially as they pass along the simple recommendation of "do both now! It's great value".
Who is the actual shill here, me or you, who is already discounting HDR when no one can yet draw any conclusions, albeit the consoles will probably use it heavily?
There are actual members here who went FreeSync but ended up having to replace their 390/390X with an Nvidia GPU, so their lovely FreeSync monitor was not doing much.
Meaning a repeat cycle could happen where some buy the Vega 56 with FreeSync and, before they know it, end up in the very same situation as those with the 390 + FreeSync in the past, probably with a repeat upgrade cycle for the very reason I hinted at of Navi now incorporating AI cores (considering how late Vega is, this should give you some indication that Navi will run late with such R&D complexity added).

Like I said, it is a tough call, and THERE IS NOT A SIMPLE RECOMMENDATION.
 
Eventually one upgrades everything...

RX Vega 56 & low cost FreeSync monitor now...

Navi & HDR FreeSync2 monitor later...
 
Eventually one upgrades everything...

RX Vega 56 & low cost FreeSync monitor now...

Navi & HDR FreeSync2 monitor later...
I had a feeling my context would be missed.
My response was in the context of the person saying it is better value to buy Vega and FreeSync now.

Again, this is another oversimplification, because one cannot say for sure what impact HDR will have (various technologists report its impact is greater than 4K's, and if so it is pretty important, with relevance sooner rather than later), and one is also assuming a person will be happy to be locked into Vega until Navi launches. Navi is adding much greater R&D complexity than was seen with Vega, and Vega is already pretty late relative to the competition, so said enthusiasts may feel trapped.
Expecting Navi to arrive in a reasonable timeframe is likely to cause some disappointment, considering they are adding AI/DL-related cores (think Volta V100 context), and not all enthusiasts will wait for it; see my point about those stuck with FreeSync after having to move to Pascal from their 390/390X after waiting so long.
That last point is a real example involving actual members here.

I am not saying it is bad to buy Vega, just that it is not the simple value proposition of "buy Vega + FreeSync now", due to other factors that are being ignored.
If you buy Vega without FreeSync, then that removes the actual value proposition; if you buy both, it is only a value proposition if one can happily ignore the other important factors that are not far off, or the long lock-in before the next enthusiast performance upgrade cycle.
Cheers
 
As much as I have bashed AMD over Vega, the TweakTown benches weren't 3DMark, and 3DMark isn't a game. Yeah, it will give you a general ballpark, but games can be a totally different story.


It's not going to change; Vega 56 is around the 1070.
 
Your statements are reasonable. But how did I know you would shill for Nvidia eventually?

HDR is in the very early stages, like 4K was. Remember when you needed two cables to display it? And not every 4K refresh/depth/bit format is supported yet (even if you leave out the 3D formats). Your box may support the 21mbit format and your monitor may not.

I predict HDR will shake out the same way. All the monitors out now have compatibility problems. And to be honest, they all lack proper dynamic contrast.
It's a fair prediction. I'm an even more avid enthusiast in the home theater forums, and the high-end displays with HDR and top-shelf 4K players with HDR are rife with growing pains, mismatched standards, and frustrating incompatibilities.

That said, there are some other monitor techs looming, IMO, that a person buying a monitor right now might consider waiting for in addition to HDR... OLED and quantum dots being chief among them, FreeSync 2 being another. It is a difficult time to summon up the courage to buy an expensive monitor. (This is relative, but to me expensive would be $600-plus, because I think you could easily lose half of that value in resale within a year, whereas with a $300 monitor you might lose $100 in value if you were to sell in a year, e.g. the HP Omen 32" (1440p, FreeSync, VA panel, $300).) I find it much easier to recommend a cheaper one without the resale-value falloff when the newer techs hit mainstream.
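
Putting rough numbers on that resale argument, using the poster's own ballpark prices and resale estimates rather than any market data, a minimal sketch:

```python
# Depreciation comparison; purchase prices and one-year resale estimates
# are the ballpark figures from the post above, not market data.
monitors = {
    "expensive ($600 class)": (600, 300),  # (buy, est. resale after a year)
    "cheap ($300 class)": (300, 200),
}
for name, (buy, resale) in monitors.items():
    loss = buy - resale
    print(f"{name}: lose ${loss} ({loss / buy:.0%} of purchase price)")
# -> $300 (50%) vs $100 (33%): the cheaper panel risks much less while
#    waiting for OLED/quantum dot/FreeSync 2/HDR to mature.
```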
 
It's a fair prediction. I'm an even more avid enthusiast in the home theater forums, and the high-end displays with HDR and top-shelf 4K players with HDR are rife with growing pains, mismatched standards, and frustrating incompatibilities.
That, unfortunately, is the competition between two standards pushing for dominance, and they did jump in early, in 2016.
But if you read reviewers or technologists on HDR for film content on HDR TVs, they are mostly saying it is more important than 4K.
Content of course is currently an issue, but that does not apply to game devs and monitors.

Anyway, the same could be said about OLED/quantum dots; they are probably the future (for monitors too), but I am not jumping in on the first generation of those TVs.
Cheers
 
 

Finally some confirmation primitive shaders are used everywhere. Not a huge surprise considering they're programmable.
 
Finally some confirmation primitive shaders are used everywhere. Not a huge surprise considering they're programmable.
Which is actually worse than before, because their effectiveness is rather limited right now, i.e. they don't offer any tangible performance increase in current games. Vega appears locked at 4 tris/cycle. At least with developer intervention there was hope for it to offer more with proper handling. Now it won't even do that.
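
For scale, peak front-end throughput is just primitives per clock times engine clock; a minimal sketch, assuming rated boost clocks (~1.47 GHz Vega 56, ~1.34 GHz RX 580) as illustrative figures:

```python
# Rough peak geometry rate: primitives per clock * engine clock.
# Clocks are rated boost figures, used here as illustrative assumptions.
def peak_tris_per_sec(prims_per_clock: int, clock_ghz: float) -> float:
    return prims_per_clock * clock_ghz * 1e9

vega56 = peak_tris_per_sec(4, 1.47)  # Vega 56, apparently locked at 4 tris/cycle
rx580 = peak_tris_per_sec(4, 1.34)   # Polaris (RX 580), also 4 tris/cycle
print(f"Vega 56: {vega56 / 1e9:.1f} Gtris/s vs RX 580: {rx580 / 1e9:.1f} Gtris/s")
```

Same front-end width, so any separation comes from clock speed alone, which is the point of the complaint above.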
 
Uh oh... we need details! But he isn't really allowed to post any. Can he even post what he did without breaking NDA?
It does seem a bit of a grey area/tightrope he is walking.
TBH I would prefer to get mining performance from those who actually mine rather than from tech review/news sites.
I am dubious myself, but for it to be conclusive it really needs a review/analysis article from someone who is heavily active in mining and seems reasonably trustworthy.
All TweakTown would show, IMO, is performance for a casual mining setup.
Cheers
 

Finally some confirmation primitive shaders are used everywhere. Not a huge surprise considering they're programmable.


Did you read Vega's white paper?

There is no damn difference between older GCN and Vega when it comes to instruction sets, lol (outside of FP16). You know what that means? Primitive shaders are nothing new ;) They are just using a different name for what programmers have used in the past.

And yeah, Rys, what he just stated is BS; he lied. He didn't answer the question properly. It doesn't automagically work; AMD has to do the work, and guess what, that is why you won't see more than Polaris's geometry throughput most of the time. I have already confirmed this. So unless the application is changed to take advantage of primitive shaders, we will not see any improvements in Vega when it comes to geometry throughput.

And never fucking post an AMD employee's word as the truth, especially from a person like Rys. He is worse than any fucking shill.
 
Eh?
Nothing in my post is about shilling for Nvidia.
The only one showing bias is you, with your deliberate insulting/slighting of me.
If I wanted to shill, I would have weighted my post much more towards Volta than I actually did, along with being highly critical of AMD on the GPU side with a rather large breakdown.
In fact, earlier I actually supported the leaked Fire Strike results as being pretty reasonable, both in general and in their specific context, where some were ignoring the MSI 1080.

People suggesting "go ahead and buy Vega AND FreeSync now and save yourself cash" without pointing out the possibility of HDR and the looming shadow of the Volta '2070' are the actual shills, and they may actually cost those people more financially as they pass along the simple recommendation of "do both now! It's great value".
Who is the actual shill here, me or you, who is already discounting HDR when no one can yet draw any conclusions, albeit the consoles will probably use it heavily?
There are actual members here who went FreeSync but ended up having to replace their 390/390X with an Nvidia GPU, so their lovely FreeSync monitor was not doing much.
Meaning a repeat cycle could happen where some buy the Vega 56 with FreeSync and, before they know it, end up in the very same situation as those with the 390 + FreeSync in the past, probably with a repeat upgrade cycle for the very reason I hinted at of Navi now incorporating AI cores (considering how late Vega is, this should give you some indication that Navi will run late with such R&D complexity added).

Like I said, it is a tough call, and THERE IS NOT A SIMPLE RECOMMENDATION.

I'm not here to get into a pissing match. The point being, it doesn't matter if you pick Nvidia's Volta (when it does come out) or Vega; you're still going to pay the Nvidia tax. And to be honest, you should really wait a year until the HDR compatibility issues settle out.
 
Which is actually worse than before, because their effectiveness is rather limited right now, i.e. they don't offer any tangible performance increase in current games. Vega appears locked at 4 tris/cycle. At least with developer intervention there was hope for it to offer more with proper handling. Now it won't even do that.


Developers have to do the work first for primitive shaders to take advantage of the extra throughput, so pretty much, Rys never answered the question; typical AMD cronies. The work that has to be done is FP16.
 
I'm not here to get into a pissing match. The point being, it doesn't matter if you pick Nvidia's Volta (when it does come out) or Vega; you're still going to pay the Nvidia tax. And to be honest, you should really wait a year until the HDR compatibility issues settle out.
Sure, there is a 'tax' for G-Sync, and when matching otherwise-identical models from the same manufacturer the cost is between £120 and £160 unless you are looking at the most extreme G-Sync models; I doubt anyone has ever disputed that, and it is not something I argue about.
But there is much more at play than just that, as I outlined without being a shill about it.
And TBH I doubt people will replace their monitors 12 months down the line, as that still does not make good value sense (people who upgraded from a 390/390X actually preferred to keep the monitor even though they upgraded to Pascal, showing there is a reluctance to upgrade monitors with any kind of frequency, which does play into possibly waiting on the monitor side with regard to the factors I raised).
But from a cost perspective, I gave clear real-world examples where Vega + FreeSync may not be an actual value proposition.
It is not as clear-cut as some make it out, even from a cost perspective; there will be enthusiasts who will not be able to wait for Navi, or who will not want to wait 1.5-2 years for HDR if it does take off, because they just recently purchased their FreeSync monitor with Vega.
If one must buy Vega, I would seriously consider doing so but waiting on the monitor side until it becomes clearer what is happening, unless one just cannot live without VRR and, importantly, is happy to ignore the other looming factors I mentioned.
To reiterate, my posts come back specifically to the value proposition raised and the factors one should be considering/weighing that can change it.
Cheers
 
Sure, there is a 'tax' for G-Sync, and when matching otherwise-identical models from the same manufacturer the cost is between £120 and £160 unless you are looking at the most extreme G-Sync models; I doubt anyone has ever disputed that, and it is not something I argue about.
But there is much more at play than just that, as I outlined without being a shill about it.
And TBH I doubt people will replace their monitors 12 months down the line, as that still does not make good value sense (people who upgraded from a 390/390X actually preferred to keep the monitor even though they upgraded to Pascal, showing there is a reluctance to upgrade monitors with any kind of frequency, which does play into possibly waiting on the monitor side with regard to the factors I raised).
But from a cost perspective, I gave clear real-world examples where Vega + FreeSync may not be an actual value proposition.
It is not as clear-cut as some make it out, even from a cost perspective; there will be enthusiasts who will not be able to wait for Navi, or who will not want to wait 1.5-2 years for HDR if it does take off, because they just recently purchased their FreeSync monitor with Vega.
If one must buy Vega, I would seriously consider doing so but waiting on the monitor side until it becomes clearer what is happening, unless one just cannot live without VRR and, importantly, is happy to ignore the other looming factors I mentioned.
To reiterate, my posts come back specifically to the value proposition raised and the factors one should be considering/weighing that can change it.
Cheers

Well, I'm waiting for Kyle's review; then I'll decide. You can always wait for the next best thing, but then you'll have nothing. Still, with new tech you should always wait a couple of gens until all the standards settle and work nicely.
 
Remember when the 7970s came out and people said the same things about them not competing.


My 7970 can play at 1440p at over 45 fps in every game I have tried it with.

I don't think there is anything from Nvidia since the 9800 that had this kind of life in it.

Assuming release performance numbers are going to stay as low as they are now is really silly and completely ignores all history.

Release performance equal to the 1070's (a card with a year of driver refinements behind it) equates to Vega outperforming it by a fairly decent margin within a couple of months. At least that's the logical assumption based on observed history.

But I suppose this is more about chest thumping than anything else.
 
Uh oh... we need details! But he isn't really allowed to post any. Can he even post what he did without breaking NDA?


He didn't give any specifics, just said the rumor of 100 MH/s is not right. For all we know it could still be doing 75 MH/s, lol. Which is still very good and acceptable as a replacement for current cards.
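
For what a figure like that would mean in practice, the number miners actually care about is hashrate per watt; a rough sketch where every figure is a rumor or a ballpark assumption (the ~27 MH/s at ~150 W for the 1070 is a rough community number, ~210 W is Vega 56's rated board power, and the Vega rates are just the rumors above):

```python
# Hashrate-per-watt sketch; all figures are rumored or approximate,
# not measurements.
cards = {
    "GTX 1070 (~27 MH/s @ ~150 W, rough)": (27, 150),
    "Vega 56 if 75 MH/s (@ ~210 W board power)": (75, 210),
    "Vega 56 if 100 MH/s (@ ~210 W board power)": (100, 210),
}
for name, (mhs, watts) in cards.items():
    print(f"{name}: {mhs / watts:.2f} MH/s per watt")
```

Even the lower rumor would roughly double the efficiency of a 1070, which is why miners would clear the shelves.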
 
How will AMD make a profit when miners, scalpers, or stores sell them $300+ over MSRP? It's not like they turn around and give any of that money back to AMD.

These cards will most likely sell out thanks to people who hate green, people who love the underdog, and this being the only newly available card on the market for a short time.

Now, I am not saying AMD will not make any profit; whatever the margin is between the cost of the card/production (unknown to me) and MSRP, they will get it. But everyone listed above will make far more profit than AMD if they can snatch the cards up and sell them heavily overpriced.

There was a story from Videocardz that AMD already increased the price over the announced MSRP to cash in:
https://videocardz.com/71776/amd-is-trying-to-sell-radeon-rx-vega-64-at-a-higher-price

I think a lot of the "mining" increase is the whole distribution chain scraping some extra off the top. I even expect that if/when mining implodes, they will try to keep prices high after discovering people will pay these kinds of prices.
 
AMD has to do the work, and guess what, that is why you won't see more than Polaris's geometry throughput most of the time. I have already confirmed this. So unless the application is changed to take advantage of primitive shaders, we will not see any improvements in Vega when it comes to geometry throughput.

I've been saying for months, tens of times, that geometry throughput is exactly the same as Polaris's. I said several times that Polaris is to Vega what Tonga was to Fiji: they release the same architecture revision first in the form of a mid-range part to study, learn, improve, and optimize drivers and tech before the high end.

Remember when the 7970s came out and people said the same things about them not competing.

My 7970 can play at 1440p at over 45 fps in every game I have tried it with.

I don't think there is anything from Nvidia since the 9800 that had this kind of life in it.

lol, you must have very low standards for settings, or be playing Minecraft, ancient games, or modern games on low settings to keep a 7970 playable at 2560x1440. I have an R9 280X at 1200 MHz with memory at 6500 MHz and it still does a decent job at 1080p; that machine is used by my nephews every freaking week when they come to visit, and every time I have to optimize the hell out of modern games at 1080p to keep playable framerates.
 
Well I'm waiting for Kyles review. Then I'll decide. You can always wait for the next best thing, but then you'll have nothing. But with new tech, you should always wait a couple gens until all the standards settle and work nicely.
Navi wait for Navi!!!!!!!
 
Apparently even the reviewers' packaging calls for a minimum 1000 W PSU, at least for the ones that got the AIO version. :eek:
 
Just watched Kyle's unboxing video. The card looks slick. I wish the exhaust vents were a wee bit wider, for dust reasons, but nothing some snips can't help.
 
I mean... even if it's 400 watts power draw, the rest of my computer does NOT use more than 200-300 watts under load...

A high-end Intel processor with a big OC can easily pull that much, just for the processor. I've gotten up to 388 W on my 5960X at 4.8 GHz.

It's not bad advice if you're OCing everything. Honestly, if you're building a $2,000-3,000 PC, throw a decent PSU in it for another $50.
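
That 1000 W sticker recommendation is easy to sanity-check by summing worst-case component draws and adding margin for transients; a back-of-envelope sketch using the figures quoted in this exchange (the 100 W for the rest of the system is an assumed round number):

```python
# Back-of-envelope PSU sizing; component draws are worst-case figures
# taken from, or implied by, the numbers quoted above.
components = {
    "GPU (rumored worst case)": 400,
    "5960X @ 4.8 GHz (as reported above)": 388,
    "board/RAM/drives/fans (assumed)": 100,
}
load = sum(components.values())
margin = 1.2  # ~20% headroom for transients and the PSU's efficiency sweet spot
print(f"Sustained load ~{load} W -> suggested PSU >= {load * margin:.0f} W")
# ~888 W load suggests ~1066 W, so a 1000 W recommendation isn't crazy for
# an everything-overclocked build.
```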
 
I've been saying for months, tens of times, that geometry throughput is exactly the same as Polaris's. I said several times that Polaris is to Vega what Tonga was to Fiji: they release the same architecture revision first in the form of a mid-range part to study, learn, improve, and optimize drivers and tech before the high end.



lol, you must have very low standards for settings, or be playing Minecraft, ancient games, or modern games on low settings to keep a 7970 playable at 2560x1440. I have an R9 280X at 1200 MHz with memory at 6500 MHz and it still does a decent job at 1080p; that machine is used by my nephews every freaking week when they come to visit, and every time I have to optimize the hell out of modern games at 1080p to keep playable framerates.
Low standards?
2x AA and maxed aniso, everything but shadows on high or ultra.
And explain why you are talking about 1080p when I said 1440p.

I also don't believe you have to do a bunch of work to get stuff running at 1080p; my son is on a 7950 and an A8-3870K and I don't have to do a damn thing for him to play whatever he downloads at 1080p.
Is Doom new enough for you?
 