Vega Rumors

Do we actually have a figure for cards available at launch worldwide? If I remember right, Fury had only 16,000 units available at launch.

I wonder how this launch compares. Especially since AMD said they were delaying launch to build up inventory to make sure gamers got a card.
Maybe see what Gibbo had; he always tells us before launch.
 
Yet if the game is CPU bottlenecked, the card should be idling and consuming less power, not spinning its tires at high clocks. That wouldn't appear to be the case here. The current power savings mode is probably how AMD should have shipped the cards: 10-15% less performance at 150W less, with the understanding that drivers, packed math, etc. would eventually overtake the competition. Ideally all that happens for launch, but no plan is perfect.
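For reference, a quick back-of-the-envelope perf/W comparison of that claim, assuming a ~295W stock board power (an assumed figure for illustration, not a measurement):

[code]
# Rough perf/W for the claimed trade-off: ~12.5% less performance at 150W less.
# The 295W stock board power for Vega 64 is an assumption for illustration only.
stock_perf, stock_watts = 1.000, 295
saver_perf, saver_watts = 0.875, 295 - 150

stock_eff = stock_perf / stock_watts
saver_eff = saver_perf / saver_watts
print(f"perf/W gain in power-saver mode: {saver_eff / stock_eff - 1:.0%}")  # roughly +78%
[/code]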
Ah, yes. 10% lower performance and a 50% power reduction whilst obviously overtaking the competition. I see you're a morning drinker.
 
LOL, let's throw the voltage and power curves out the window..........

Damn, even Adored understands the reality of the situation with Vega lol.
 
Really?
Then he is less of an idiot than I thought after having a few exchanges with him on youtube...I found him sorely lacking in technical aspects...surprise, surprise...
 
He's a brilliant scientist. I had a discussion with him on Reddit and he totally wiped the floor with me using solid rhetoric like 'I make more money in one video than you do in a month'.

Impressive stuff
 
He also tried to play "cocky"...but I was right, he was wrong...I guess your situation was the same ^^
 
To those who aren't convinced of the gaming experience that Freesync delivers (presumably Gsync as well)

Watch this video starting at 13:34. Bear in mind he's been basically saying/showing that Vega is a failed product offering during the video to this point, based on his tested FPS and power draw deltas as compared to the 1080TI. He was given a water-cooled variant to review, and was appropriately pretty harsh on the power usage and "turbo mode". 100 more watts for a 2% frame rate increase as the overclocking option?!?!?!
The whole video is really quite good to see the weaknesses and strengths of his water cooled Vega unit compared to a 1080TI, and he effectively says if you don't have an adaptive sync monitor, or don't plan to buy one - there is no reason to buy Vega.

However --- in his conclusion - he lays out that he's been reviewing on a non adaptive sync monitor for this particular comparison, and the whole conclusion changes if you enter adaptive sync tech. --- SO make sure to watch what he says at 15:20!!!

He said he'd be more upset to give up the Vega card than the 1080TI -- even given the numbers based performance gap, because he knows what freesync has done for his gaming experience in the last year +, and he personally uses a freesync 1440p monitor.
He says these same games he is benchmarking for the Vega vs. 1080TI comparison (on a non-adaptive-sync monitor) deliver a worse gaming experience than his former RX480 and then RX580 did with freesync enabled. He continues that the gaming experience with a 1080TI on a non-adaptive-sync monitor will never beat even a much less expensive freesync experience. (He isn't concluding anything about gsync - but presumably that's just as good or better -- he's just saying freesync really is a plus to the gaming experience.)

Basically the real winner here is the adaptive sync technology - either freesync or gsync - not max FPS with a non-adaptive-sync monitor. I agree completely based on my freesync experiences, which I've relayed many times recently myself on this board.

 
Really?
Then he is less of an idiot than I thought after having a few exchanges with him on youtube...I found him sorely lacking in technical aspects...surprise, surprise...


Well, he pretty much stated in his video what I stated over a year ago: drivers and whatnot will drop power consumption without dropping performance (what I told him about Polaris). At least he understands what he is being told is based on facts; now he might still not understand the facts and why they are that way, but that's ok, it's a step in the right direction.
 

Adaptive sync working as designed? What a shock!

All snark aside, some of us would prefer to judge a card based on its merits and not what a monitor can do to gloss over the shortcomings of it. Also, some of us don't have adaptive sync monitors. I may own an nVidia card but I won't pay the G-Sync tax and, if I had waited for Vega I wouldn't have been gaming much, if at all, for over a year.
 
What's the point of freesync if the card can't sustain the high FPS you want? If you have a freesync display running between 50-70fps and another display with VRR running at closer to 100fps, the latter will still be far more fluid. Vega is not adequately powerful for high refresh gaming; apart from suffering at high framerates in general, it's just not powerful enough. The fact is that these are enthusiast tier cards, and trying to peddle them as value offerings when paired with a display while the competition is selling a 35% faster card for marginally more money is beyond ludicrous.

Yeah G-Sync is more expensive on average, but you know what is G-Sync exclusive? Fucking 1080Tis, the only cards that anyone would even bother buying high refresh 1440+ displays for
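To put rough numbers on the fluidity point, here are the frame times those rates imply (just arithmetic, nothing card-specific):

[code]
# Frame time implied by each frame rate mentioned above.
def frame_time_ms(fps):
    return 1000.0 / fps

for fps in (50, 70, 100):
    print(f"{fps:3d} fps -> {frame_time_ms(fps):5.1f} ms per frame")
# 50 fps = 20.0 ms, 100 fps = 10.0 ms: adaptive sync removes tearing and judder
# from refresh mismatch, but it can't shrink a 20 ms frame into a 10 ms one.
[/code]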
 
Nah. I picked up a 980ti for $200 and a S2417DG (1440p, 165Hz) for $250, and the experience is really good. Tear-free gaming at 80 fps at 1440p in BF1 and QC. I'd have to pay a lot more to get a 27" IPS G-Sync display and a 1080TI.
 
Lol, speaking of 980Tis, someone should really benchmark a highly OC'd one vs Vega. I have a feeling it will compare very favorably, both in terms of raw performance and in terms of perf/W.
 
What's the point of freesync if the card can't sustain the high FPS you want? If you have a freesync display running between 50-70fps and another display with VRR running at closer to 100fps, the latter will still be far more fluid. Vega is not adequately powerful for high refresh gaming; apart from suffering at high framerates in general, it's just not powerful enough. The fact is that these are enthusiast tier cards, and trying to peddle them as value offerings when paired with a display while the competition is selling a 35% faster card for marginally more money is beyond ludicrous.

Yeah G-Sync is more expensive on average, but you know what is G-Sync exclusive? Fucking 1080Tis, the only cards that anyone would even bother buying high refresh 1440+ displays for
Hi,
you must have missed this:
He says these same games he is benchmarking for the Vega vs. 1080TI comparison on a non-adaptive-sync monitor showcase a worse gaming experience than his former RX480, and then RX580, with freesync enabled (which he's personally used for the last year).

So let's see here.

$300 freesync monitor + $250 RX580 graphics card = $550 = better gaming experience
vs
$300 non-adaptive-sync monitor + $700 1080ti graphics card = $1000 = worse experience
(or $300 non-adaptive-sync monitor + $700 Vega graphics card = $1000 = worse experience)
 
Hi,
you must have missed this:
He says these same games he is benchmarking for the Vega vs. 1080TI comparison on a non-adaptive-sync monitor showcase a worse gaming experience than his former RX480, and then RX580, with freesync enabled.

So let's see here.

$300 monitor + $200 graphics card = $500 = better
vs
$600 monitor + $700 graphics card = $1300 = worse

Well it's AdoredTV, also known as Adrian McShillingtonshire. These whole card+monitor arguments are just a consequence of the actual hardware being subpar
 
Hi,
you must have missed this:
He says these same games he is benchmarking for the Vega vs. 1080TI comparison on a non-adaptive-sync monitor showcase a worse gaming experience than his former RX480, and then RX580, with freesync enabled.

So let's see here.

$300 monitor + $200 graphics card = $500 = better
vs
$600 monitor + $700 graphics card = $1300 = worse


This is only valid if a person is in the market for a monitor and if they don't want the best performance their $ can buy.

People only buy monitors every 5 to 7 years, which kinda puts a huge damper on the package deals.

So why would a person that wants, let's say, 1080 performance, and who already has a monitor, go for something like an RX580 + a monitor, which ends up costing as much as a 1080?

Or a person that wants 1080 performance and is ok spending that extra 200 bucks for a G-Sync monitor, because that 200 bucks is spread over the lifetime he owns that monitor?
 
Hi,
you must have missed this:
He says these same games he is benchmarking for the Vega vs. 1080TI comparison on a non-adaptive-sync monitor showcase a worse gaming experience than his former RX480, and then RX580, with freesync enabled (which he's personally used for the last year).

So let's see here.

$300 freesync monitor + $250 RX580 graphics card = $550 = better gaming experience
vs
$300 non-adaptive-sync monitor + $700 1080ti graphics card = $1000 = worse experience
(or $300 non-adaptive-sync monitor + $700 Vega graphics card = $1000 = worse experience)

Sure, the RX580, a card that is barely capable of 1440p, will deliver a "better gaming experience" while you're being forced to drop the graphical settings down to medium to maintain framerates. How quaint.

I guess the next thing we'll hear is, "YOUR EYES CAN'T SEE MORE THAN 30FPS ANYWAY!"
 
Radeon RX Vega 64 is barely an improvement over Radeon R9 Fury X at the same clock speed

[Chart: Radeon-RX-Vega-64-56.png]


https://www.computerbase.de/2017-08/radeon-rx-vega-64-56-test/7/
 
Lol, speaking of 980Tis, someone should really benchmark a highly OC'd one vs Vega. I have a feeling it will compare very favorably, both in terms of raw performance and in terms of perf/W.

I'd really like to see this as well. A 1500MHz 980 Ti is no joke. And cheap enough to pay the so-called G-Sync "tax" easily.
 
[Benchmark chart: upload_2017-8-16_18-52-20.png]
5% behind a 1080

[Benchmark chart: upload_2017-8-16_18-53-31.png]
8% ahead of a 1080

So Vega 64 is 13% ahead of an OCd 980Ti, drawing roughly the same power. Impressive. This is DOOM as well, virtually the only title in which Vega 64's 13+ tflops manage to eke out a measly 8% lead over a 9 tflop 1080.
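The 13% figure follows from chaining the two chart results, both normalized to the same 1080 (a rough composition, assuming the two charts share a comparable test setup):

[code]
# Compose the two relative results above, both normalized to GTX 1080 = 1.00.
gtx_1080 = 1.00
oc_980ti = gtx_1080 * (1 - 0.05)   # OC'd 980 Ti ~5% behind a 1080
vega_64  = gtx_1080 * (1 + 0.08)   # Vega 64 ~8% ahead of a 1080

print(f"Vega 64 vs OC'd 980 Ti: {vega_64 / oc_980ti - 1:+.1%}")  # about +13.7%
[/code]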
 

Radeon RX Vega 64 is barely an improvement over Radeon R9 Fury X at the same clock speed

[Chart: Radeon-RX-Vega-64-56.png]


https://www.computerbase.de/2017-08/radeon-rx-vega-64-56-test/7/
The IPC is going to be similar as the underlying architecture is the same. In the Vega white paper and in interviews with engineers, all those extra transistors go towards raising the clock speed; that is where its performance increase comes from. The only other thing Vega adds is more compute functions.
 
IPC is a bad term imo; shader throughput should have improved massively over Fiji. We were promised something like 6x peak geometry throughput, more efficient culling methods (than those introduced in Polaris), etc., so I would say the gains are rather disappointing. Literal IPC is unchanged as you have those same 4096 ALUs, but you would expect the Vega uarch to be able to better make use of those flops and achieve higher effective performance at throughput parity, aka at the same clocks Vega should be comfortably ahead of Fiji
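To put the "same 4096 ALUs, higher clocks" point in numbers, the usual peak-FP32 formula (the boost clocks below are approximate, used here only for illustration):

[code]
# Peak FP32 throughput = 2 FLOPs per ALU per clock (FMA) x ALU count x clock.
def peak_tflops(alus, clock_ghz):
    return 2 * alus * clock_ghz / 1000.0

print(f"Fury X (Fiji): {peak_tflops(4096, 1.05):.1f} TFLOPS")  # ~8.6
print(f"RX Vega 64:    {peak_tflops(4096, 1.60):.1f} TFLOPS")  # ~13.1
# Same ALU count, so the raw-throughput gain comes almost entirely from clocks;
# any gain at equal clocks has to come from feeding those ALUs more efficiently.
[/code]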
 
Even though I think Vega 64 is too late in regard to gaming, I am encouraged by its potential compute performance and what it will bring to the table in the upcoming iMac Pro.
 
IPC is a bad term imo; shader throughput should have improved massively over Fiji. We were promised something like 6x peak geometry throughput, more efficient culling methods (than those introduced in Polaris), etc., so I would say the gains are rather disappointing. Literal IPC is unchanged as you have those same 4096 ALUs, but you would expect the Vega uarch to be able to better make use of those flops and achieve higher effective performance at throughput parity, aka at the same clocks Vega should be comfortably ahead of Fiji
We will get all of that plus Titan Xp beating performance at a later date through drivers... eventually...sometime in 2020...possibly 2022...
 
tldr: Vega 56 is the one to look at.

A little irony I've noticed is that people bad-mouthed Fury X for two full years + on this forum. You could buy the XFX Fury X at newegg for $300-$325 for at least 6-9 months of those two + years.
Now that the $400 ($500?) Vega 56 is confirmed very similar performance-wise to a Fury X - over two years after the Fury X release - folk are saying that's the card to get from the Vega series?
 
You could also buy 980Tis that smoke the Fury X. Incidentally, 980Tis that will at least match Vega 56 even in titles where it performs well (Doom), and when it comes to a wider range of titles it's not even going to be a competition.

Sure, Vega 56 will be more power efficient (should be using around 220W stock iirc), but we are talking about a 4-year-old architecture vs something that came out a few days ago.
 
A little irony I've noticed is that people bad-mouthed Fury X for two full years + on this forum. You could buy the XFX Fury X at newegg for $300-$325 for at least 6-9 months of those two + years.
Now that the $400 ($500?) Vega 56 is confirmed very similar performance-wise to a Fury X - over two years after the Fury X release - folk are saying that's the card to get from the Vega series?
In comparison to the immediate competition. The performance is not comparable to a Fury X unless you undervolt it. Think of Vega 56 as a super-OC'd Fury X with more compute abilities and double the VRAM. In comparison to a similarly priced 1070 it may not be a bad deal.
 
[Benchmark chart: upload_2017-8-16_21-30-57.png]
Just noticed this on GN, not bad considering this is UE4 aka AMD's worst enemy and the prime example of NV's evil.

Is anyone else triggered by AMD's abject disdain for symmetry? I'm not talking about the small differences in height of the HBM vs the die and whatnot.
[Photo: upload_2017-8-16_21-34-52.png]


Couldn't they have lined up the HBM stacks with the die and just increased the space in the middle?
[Photo: NVIDIA-Pascal-GPU-Chip-Module.jpg]

The leatherman knows what's what

Just noticed the SMDs on either side of the GP100 die are not symmetrical. Fuck. At least the symmetric board makes up for it.. Mmmm... Why is this so satisfying to look at?
 
The IPC is going to be similar as the underlying architecture is the same. In the Vega white paper and in interviews with engineers, all those extra transistors go towards raising the clock speed; that is where its performance increase comes from. The only other thing Vega adds is more compute functions.
Slight correction...
all those extra transistors go towards raising the clock speed... the other thing Vega adds is more compute functions... AND doubled SRAM caches, to OVER 45MB total (unless people actually think doing so is "free" and uses no transistors)
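As a sanity check on that, a 6T-cell estimate of what 45MB of SRAM alone costs in transistors (ignoring tags, sense amps and redundancy, so a rough lower bound):

[code]
# Back-of-the-envelope transistor count for 45 MB of SRAM built from 6T cells.
cache_bytes = 45 * 1024 * 1024
transistors = cache_bytes * 8 * 6   # 8 bits per byte, 6 transistors per bit cell
print(f"~{transistors / 1e9:.1f} billion transistors")  # ~2.3 billion
[/code]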
 
Cache is high density, so yeah, the increase in cache size will have a larger impact on transistor counts, but that is also needed to hide the latency of the longer pipelines which are needed for the higher clocks. So for that higher clock speed, the cache increase was most likely necessary.
 
Basically the real winner here is the adaptive sync technology - either freesync or gsync - not max FPS with a non-adaptive-sync monitor. I agree completely based on my freesync experiences, which I've relayed many times recently myself on this board.

AdoredTV is an AMD shill, of course he's going to grasp at any straw to justify Vega's existence. And I've had G-Sync since it was released and while it's nice, at high FPS it becomes essentially irrelevant and I tend not to use it. In fact, FreeSync/G-Sync add some input lag and I've found it better to turn it off in games I push 140+ fps in rather than keep G-Sync on. With Vega offering nothing substantial to the market, falling back on FreeSync is the obvious tactic. I mean they even held a blind test with a best case scenario (Doom) on [H] and other websites trying to fool people into thinking Vega is worth getting.
 
Slight correction...
all those extra transistors go towards raising the clock speed... the other thing Vega adds is more compute functions... AND doubled SRAM caches, to OVER 45MB total (unless people actually think doing so is "free" and uses no transistors)
Sure, absolutely. It was not my intention to imply all of the transistors went to clock gains, but a huge chunk, roughly 3 billion, did go primarily to clock increases.
 
AdoredTV is an AMD shill, of course he's going to grasp at any straw to justify Vega's existence. And I've had G-Sync since it was released and while it's nice, at high FPS it becomes essentially irrelevant and I tend not to use it. In fact, FreeSync/G-Sync add some input lag and I've found it better to turn it off in games I push 140+ fps in rather than keep G-Sync on. With Vega offering nothing substantial to the market, falling back on FreeSync is the obvious tactic. I mean they even held a blind test with a best case scenario (Doom) on [H] and other websites trying to fool people into thinking Vega is worth getting.
There's no way that's an AMD shill video. I've never watched AdoredTV before, so I can't tell you the history, but it's pretty clear you didn't watch this particular video.
He spends about 3/4 of the video talking about how Vega is a product failure compared to the 1080TI and NVidia's current offerings --- with pretty much the singular HUGE caveat -- unless you own freesync monitors.
 
His history is, to put it kindly, overly enthusiastic toward AMD. I think the reason the review is negative is he honestly believed it would compete with a 1080ti.
 
There's no way that's an AMD shill video. I've never watched AdoredTV before, so I can't tell you the history, but it's pretty clear you didn't watch this particular video.
He spends about 3/4 of the video talking about how Vega is a product failure compared to the 1080TI and NVidia's current offerings --- with pretty much the singular HUGE caveat -- unless you own freesync monitors.


Yeah, his history is unbelievable, like AMD's master plan (part 1 and 2) to take over the game industry, which had already failed by that point; nV taking one of the consoles later on really kinda showed more of that. And the Polaris rumor videos..... Almost all his videos until recently read like an AMD cheerleader's. The past 2, well we can say 3, videos: 2 were on Vega, where he pretty much stated it won't be competition for nV in any way, and the 3rd was that the GPU war is over and nV won.

Unfortunately many of his recent conclusions, although correct, came about by some backward, reverse-Polish-notation thinking where the logic is all over the place. But the end results came to some of the same conclusions, so......
 
AdoredTV is an AMD shill, of course he's going to grasp at any straw to justify Vega's existence. And I've had G-Sync since it was released and while it's nice, at high FPS it becomes essentially irrelevant and I tend not to use it. In fact, FreeSync/G-Sync add some input lag and I've found it better to turn it off in games I push 140+ fps in rather than keep G-Sync on. With Vega offering nothing substantial to the market, falling back on FreeSync is the obvious tactic. I mean they even held a blind test with a best case scenario (Doom) on [H] and other websites trying to fool people into thinking Vega is worth getting.


Can't say I've ever seen proof of input lag with freesync or gsync on, and I've done plenty of searching. I've seen evidence to the contrary, though.


His history is, to put it kindly, overly enthusiastic toward AMD. I think the reason the review is negative is he honestly believed it would compete with a 1080ti.


He actually predicted that Vega would perform like this months ago, and that AMD had no answer to the 1080ti. He's honestly one of the most objective youtubers there is, but because that means he leans away from groupthink, people call him biased. This may be the most Radeon-positive video I've ever seen him post, to be honest.
 
Hi,
you must have missed this:
He says these same games he is benchmarking for the Vega vs. 1080TI comparison on a non-adaptive-sync monitor showcase a worse gaming experience than his former RX480, and then RX580, with freesync enabled (which he's personally used for the last year).

So let's see here.

$300 freesync monitor + $250 RX580 graphics card = $550 = better gaming experience
vs
$300 non-adaptive-sync monitor + $700 1080ti graphics card = $1000 = worse experience
(or $300 non-adaptive-sync monitor + $700 Vega graphics card = $1000 = worse experience)

Problem is that people's experience varies; even the blind experiment performed at [H] was not 100% in AMD's court. Also, the sample size is too small to honestly say one is really better than the other, or that one experience is vastly improved over the other.

I honestly don't know how FreeSync or G-Sync perform, never owned one of those types of monitors. I made a jump to a 27" before the tech was available, then made a jump to an ultrawide 34" before the tech caught up, and now I am on a 55" 4K, hoping to swap to an LG OLED later this year (if any of them ever come to our side of the world).

Would have bought one of the Acer/Asus 27" 144Hz 4K monitors next month, but now they are pushed back to Q1 or Q2 2018.
 