Vega Rumors

OK, I think I may be using TXAA, so that is probably the difference. Not saying the review was fake, just that I am getting better performance, but there are lots of variables in the equation.
 
OK, I think I may be using TXAA, so that is probably the difference. Not saying the review was fake, just that I am getting better performance, but there are lots of variables in the equation.
GTAV has the most extensive graphics tweaking options of any game ever made, so who knows which of your settings could be making the difference lol
 
A couple of discrepancies as well.

I noticed that according to TPU's results, Vega seems to be stuck at 77 fps in FO4:

https://www.techpowerup.com/reviews/AMD/Radeon_RX_Vega_64/17.html

But it resoundingly beats the 1080 at 4K, while still sitting about 10~15% behind the 1080 Ti there.

That's the major discrepancy; the other is Vega 56 beating the 1080 in certain games, though on average it comes out about equal to it.
Ummm, I have not seen anything that averages the 56 to be at about GTX 1080 performance. If that were the case, reviewers would be going nuts singing its praises.
 
Certain games, at least according to TPU's results, have Vega 56 topping the 1080.


Granted, at least one of those games is called Doom.

https://www.techpowerup.com/reviews/AMD/Radeon_RX_Vega_64/9.html
https://www.techpowerup.com/reviews/AMD/Radeon_RX_Vega_64/11.html
https://www.techpowerup.com/reviews/AMD/Radeon_RX_Vega_64/13.html
https://www.techpowerup.com/reviews/AMD/Radeon_RX_Vega_64/15.html
https://www.techpowerup.com/reviews/AMD/Radeon_RX_Vega_64/16.html
https://www.techpowerup.com/reviews/AMD/Radeon_RX_Vega_64/17.html (Fallout 4 only does so at 4K; at 1080p/1440p the game seems to be stuck at 77 fps on Vega)
https://www.techpowerup.com/reviews/AMD/Radeon_RX_Vega_64/23.html (sort of; Vega 56 is neck and neck with the 1080 in RE7)

In fact, TPU's results were a bit all over the place: certain games have Vega 56 beating the 1080, then a few have Vega 64 LOSING to the 1070...

But on the overall average, the 1080 is the same as Vega 64.

At least one thing is consistent, though: Vega 64 cannot touch the 1080 Ti at all, barring cases of an engine FPS hard limit.
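For what it's worth, a flat ceiling like that 77 fps usually smells like an engine-side frame limiter rather than the GPU topping out. A minimal sketch of how a frame cap produces that kind of wall (the 77 fps target is just the number from TPU's charts, and the loop is a generic illustration, not anything from the actual game engine):

```python
import time

def render_frame():
    # Stand-in for real rendering; assume the GPU could run faster than the cap.
    pass

def run_capped(fps_cap=77, frames=231):
    """Toy frame loop: sleep away the leftover budget so no frame
    completes faster than 1/fps_cap seconds."""
    budget = 1.0 / fps_cap
    start = time.perf_counter()
    for _ in range(frames):
        frame_start = time.perf_counter()
        render_frame()
        leftover = budget - (time.perf_counter() - frame_start)
        if leftover > 0:
            time.sleep(leftover)  # the cap: idle instead of rendering ahead
    elapsed = time.perf_counter() - start
    print(f"average fps: {frames / elapsed:.1f}")  # ~77 no matter how fast the GPU is

run_capped()
```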
 
Does it matter if we can't even buy the things? This launch is a joke. AMD delayed the release this long just to have cards go out of stock in 10 minutes?
Ha, this must be your first video card launch. The 480s sold out instantly too and were hard to get for a couple weeks.
 
Yeah, it seems it's been that way for a bit. I've only been on the "have to order the minute it releases" train for the past few years, starting with the GTX 980, and it was the same thing then.
 
Quite a sad launch for AMD. I don't think this thing will get them GAMING market share.
Also love the posts in multiple forums about waiting another 3-6 months for Vega to shine. With AMD, I would say wait for Navi and then be disappointed again.
 
Vega 56 seems like the best of the cards AMD has; Vega 64 is pretty meh. Thankfully this is the last incarnation of GCN, and Navi is already done, so Vega should only have a little over a year to exist.
That's a good point. Wonder what GCN support will look like in a few years...

Nearly all reviews were showing abysmal GTAV numbers. Vega seems to suffer in most other DX11 games too, but not nearly as badly as in GTAV.
To be frank, none of those numbers are very appealing.
 
The worst part about Vega is anyone that was dumb enough to buy a Freesync monitor is stuck with 1070/1080 performance for another year...

This is what blows my mind, really. A lot of people berate G-Sync because it's neither free nor an open standard and think they're saving money by going with FreeSync. What they fail to realize is that they are just buying into AMD's ecosystem under the guise of "cheaper" but are getting stuck with lesser GPUs that eat tons of power and can't match what NVIDIA offers. Why not pony up the extra cash and save yourself from the waiting and headache of AMD? About the only reason I can think of is that some FreeSync monitors are offered in larger sizes than G-Sync, but even then I'd just go with a 34" curved G-Sync display instead.
 
You probably just argued the case for FuryX and Maxwell....
No need to upgrade then if going by that.
Cheers
Is that supposed to be a bad thing? Not all of us upgrade every year, so the more miles we get the better. I have had my 290 for over 2 years; I gave it to the wife, who is using a FreeSync monitor, and I am now getting the WCed Vega 64, which I will use for at least another 2-3 years, mated to a FreeSync 1440p monitor for that long-lasting experience.
 
If only they had used identical panels...
Granted, that is a valid scientific concern; however, equivalent monitors that fit that criterion are few and far between. So at best we can determine that these monitors add greatly to the experience of the user and, again as I stated above, add greatly to the longevity of said GPU.
 
The worst part about Vega is anyone that was dumb enough to buy a Freesync monitor is stuck with 1070/1080 performance for another year...
Here's this dummy's FreeSync monitor gaming experience, which was butter smooth with a pair of Fury X in CrossFire at about 70 fps or above in nearly any game. I bought the pair of Fury X at $325 each about a year ago, and bought the three HP Omen FreeSync 32" monitors new at $300 each as well. A pair of Fury X is as fast in triple-A titles as a single 1080 Ti. I swapped out from Fury X to a pair of 1080 Ti in the last month and my experience got worse; I'd switch back in a heartbeat, but unfortunately I had already sold my pair of Fury X cards to fund the 1080 Tis. No FreeSync is a pretty big detriment. I went from silky smooth to encountering microstutter and noticeable frame drops (when my FPS deviated off 60 Hz), and I'm back to vsync lag. :( I was planning to pick up a liquid-cooled Vega and replace my 1080 Tis to gain back FreeSync.

It's been a great gaming experience.

What's your monitor experience, oh wise one?

(And by the way - this is the first top-shelf card that hasn't been within 10% or so of Nvidia's flagship in several generations - so the performance delta hasn't been this wide until now.)
IMG_0532.JPG
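To put rough numbers on the "deviated off 60 Hz" part: with plain vsync on a fixed 60 Hz panel, any frame that misses the 16.7 ms budget waits for the next refresh, so its delivery time doubles, while adaptive sync just delays the refresh instead. A back-of-envelope sketch, with made-up frame times for illustration:

```python
import math

REFRESH_HZ = 60
INTERVAL_MS = 1000 / REFRESH_HZ  # 16.7 ms budget per refresh

def vsync_delivery(frame_ms):
    """Fixed 60 Hz + vsync: a finished frame waits for the next refresh
    boundary, so a 17 ms frame isn't shown for 33.3 ms."""
    return math.ceil(frame_ms / INTERVAL_MS) * INTERVAL_MS

def adaptive_delivery(frame_ms):
    """FreeSync/G-Sync (inside the panel's range): the refresh waits for
    the frame instead, so delivery tracks render time."""
    return max(frame_ms, INTERVAL_MS)  # still bounded by the max refresh rate

for ms in (15.0, 16.0, 17.0, 20.0):
    print(f"{ms:4.1f} ms frame -> vsync: {vsync_delivery(ms):4.1f} ms, "
          f"adaptive: {adaptive_delivery(ms):4.1f} ms")
# The 17 ms case is the microstutter: one millisecond over budget costs a
# whole extra refresh under vsync, but only that millisecond with FreeSync.
```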


Now to apply a bit of balance:

I'm a bit bitter I can't find one single liquid-cooled Vega yet. (And I did all the prep! Signing up for notifications via text and email, hitting F5 to the point of obsession, waiting for the darn things to come into stock.) Come on AMD, you said you were delaying to make sure there were plenty at launch, and that seems a foul lie. Microcenter in KC got four cards at launch. Four. Two air-cooled Gigabyte and two air-cooled XFX. No water-cooled. Newegg, Amazon, and Best Buy sold out in moments. :(

I'm also a bit annoyed AMD released Vega without working CrossFire support, because it worked so darn well with the Fury X. (I never had a single issue the entire time I used it.) How does that even happen? You promoted the use of CrossFire in marketing through the RX 480 and RX 580. CrossFire has been super strong and reliable for several generations... and now it doesn't even work with the launch driver?

I'm also annoyed AMD doesn't offer a different cooling solution for the air-cooled models. I agree with Brent's review that the blower is too loud on AMD cards, making the water-cooled variant the only reasonable option for those of us who like a silent PC. Something like EVGA's ICX cooler is so much better than the blower that it's unreasonable to offer only the blower at this stage.

Also a little peeved at the MSRP bait and switch. And surprised. Prices were announced by AMD several weeks back. At launch, $100 higher. Lame.

I see the disappointment that Vega isn't stronger, but it never was supposed to be. Those following along knew it was only a matchup for the 1080; that's all that was ever claimed. The issue was the long delay for launch, not the performance. The delay made the performance seem weak. Six months ago people wouldn't have been bitter about it: the 1080 Ti wasn't out yet, and AMD would have been competitive with Nvidia's 1080, albeit at higher power and heat levels -- so what else is new?

A six-month delay changed the whole narrative.
 
Here's this dummy's FreeSync monitor gaming experience ... A six-month delay changed the whole narrative.

Just get a 1080 Ti or Titan Xp; you'll spend less, and it's just as fast, if not faster, than crossfired Vega would be.
 
Here's this dummy's FreeSync monitor gaming experience ... A six-month delay changed the whole narrative.

Archaea, I appreciate you relating your experience. I've got the exact same display setup as you (triple HP Omen 32) and exactly the same intention of liquid Vegas. I was watching Newegg at launch and watched the air-cooled cards disappear in minutes, waiting for the liquid-cooled ones that never came (packages don't count).

Your experience with more horsepower (the Tis) vs. FreeSync has further encouraged me to stick with Vega, but it's certainly frustrating so far.
 
The worst part about Vega is anyone that was dumb enough to buy a Freesync monitor is stuck with 1070/1080 performance for another year...

R5 [email protected] | DEEPCOOL CAPTAIN 240EX {Push/Pull}
MSI B350 Tomahawk
Corsair LPX@2933
EVGA 1070 FTW@2126/8886
Seasonic 860 Platinum | HYPER X 120GB SSD | Samsung 850 Evo 500GB SSD

Odd thing to say from a guy whose own signature shows only a 1070. So you're making fun of people with Vega, or getting Vega, yet you have less graphics power than them?
 
Odd thing to say from a guy whose own signature shows only a 1070. So you're making fun of people with Vega, or getting Vega, yet you have less graphics power than them?

In any case, please welcome team AMD to June 2016!

I'm glad they made it.

Man I feel so bad for Nvidia owners who had to buy the GTX 1080/1070 over a year ago and didn't get to enjoy the benefits of a year's worth of inflation. :ROFLMAO:
 
AMD brings nothing new to the table with these cards. They could have at least started a price war. Vega 56 at $349 and Vega 64 at $449 would have been nice.
 
Where is our prophet Anarchist4000 when we need him? Read from the holy scriptures, tell us more about how power consumption will be reduced by 30%, etc. God knows, we need it.

500W in Unigine Heaven when overclocked. This is literally half as efficient as a 1080.
Overclocked cards use more power, and you're surprised by this? Drop to power saving, stick a water block on it, and you'll get 1080 perf/watt at similar performance. Or did you not follow the reviews, where lower power modes reduce power with almost no performance hit?

Still need the drivers tuned and enabled everywhere; nothing new there. So no idea wtf you're going on about, or why you think something has changed since the last time you mentioned it.

I'm still sticking with my 1080 Ti comparison, as that's where current numbers with some new features have it. I'd buy one to test if they weren't sold out instantly because demand is so damn high.
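Since perf/watt claims are flying in both directions, the arithmetic itself is trivial; here it is spelled out with hypothetical fps and power figures for the different modes (assumed for illustration, not taken from any review):

```python
# Hypothetical (fps, board watts) figures for one title under each mode.
# These numbers are assumptions for illustration, not measurements.
modes = {
    "overclocked": (105, 500),
    "balanced":    (100, 295),
    "power saver": (95, 230),
    "GTX 1080":    (100, 180),
}

for name, (fps, watts) in modes.items():
    print(f"{name:11s}: {fps:3d} fps / {watts:3d} W = {fps / watts:.2f} fps per watt")

# Doubling board power for ~5% more fps roughly halves perf/watt,
# which is the whole disagreement in one line.
```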
 
Overclocked cards use more power, and you're surprised by this? Drop to power saving, stick a water block on it, and you'll get 1080 perf/watt at similar performance. Or did you not follow the reviews, where lower power modes reduce power with almost no performance hit?

Still need the drivers tuned and enabled everywhere; nothing new there. So no idea wtf you're going on about, or why you think something has changed since the last time you mentioned it.

I'm still sticking with my 1080 Ti comparison, as that's where current numbers with some new features have it. I'd buy one to test if they weren't sold out instantly because demand is so damn high.

I have reported you for contributing to my splitting at the seams due to excessive laughter.
 
Someone also aptly pointed out that Vega is still bottlenecked by its 4 shader engines (like Fiji), as illustrated in this game:
civ6_2560_1440.png


Anandtech asked AMD about this limitation, and they essentially said they could have added more but didn't feel like it because it's more work... lol!
LMAO.

So you're saying the card with significantly higher clock speeds is held back by 4 shader engines? At which point it gets identical performance? Because 4 triangles per clock wouldn't be higher with more clocks? That's 3rd grade math right there.

They simply don't need more than 4 SEs. Just think: only 4 SEs and already beating Pascal in some titles despite Nvidia having higher clocks. Wow!
 
LMAO.

So you're saying the card with significantly higher clock speeds is held back by 4 shader engines? At which point it gets identical performance? Because 4 triangles per clock wouldn't be higher with more clocks? That's 3rd grade math right there.

They simply don't need more than 4 SEs. Just think: only 4 SEs and already beating Pascal in some titles despite Nvidia having higher clocks. Wow!

When you say beating Pascal, do you refer to the 1070 having 60% higher geometry throughput than Vega 64? 13 TFLOPS vs 7, and the 1070 is still winning lol. Vega should be twice as fast.

But please don't let common sense stop you from making me laugh.
 
Overclocked cards use more power, and you're surprised by this? Drop to power saving, stick a water block on it, and you'll get 1080 perf/watt at similar performance. Or did you not follow the reviews, where lower power modes reduce power with almost no performance hit?

Still need the drivers tuned and enabled everywhere; nothing new there. So no idea wtf you're going on about, or why you think something has changed since the last time you mentioned it.

I'm still sticking with my 1080 Ti comparison, as that's where current numbers with some new features have it. I'd buy one to test if they weren't sold out instantly because demand is so damn high.

tumblr_lx9jb1SPMr1qdrpdr.gif
 
I have reported you for contributing to my splitting at the seams due to excessive laughter.
I take it you didn't read H's or any other site's reviews, where dropping from Turbo to Balanced simply reduces power consumption, and Power Saver reduces it further while having very little effect on performance in some cases? Your dissonance isn't my problem, though; I gave fair warning this was coming.

When you say beating Pascal, do you refer to the 1070 having 60% higher geometry throughput than Vega 64? 13 TFLOPS vs 7, and the 1070 is still winning lol. Vega should be twice as fast.

But please don't let common sense stop you from making me laugh.
Go read the H review. All that geometry throughput and they still aren't 60% ahead in all games. That's just a synthetic benchmark nobody plays anymore.

Hell, if 4 tri/clock were a bottleneck, the higher-clocked part should be faster. Just compare Fury to Vega: same SE limit with 1 GHz vs 1.5 GHz clocks, and identical performance.
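That Fury-vs-Vega point is just multiplication. Spelled out, with approximate reference clocks and the 4-triangles-per-clock front-end figure both sides of this argument are already using:

```python
TRIS_PER_CLOCK = 4  # the GCN front-end figure both Fiji and Vega share

def tri_rate_gtris(clock_ghz):
    """Peak front-end throughput in billions of triangles per second."""
    return TRIS_PER_CLOCK * clock_ghz

fury_x = tri_rate_gtris(1.05)   # ~4.2 Gtri/s at Fury X's ~1050 MHz
vega_64 = tri_rate_gtris(1.55)  # ~6.2 Gtri/s at Vega 64's ~1550 MHz boost

print(f"Fury X : {fury_x:.1f} Gtri/s")
print(f"Vega 64: {vega_64:.1f} Gtri/s ({vega_64 / fury_x - 1:.0%} higher)")
# If the 4-per-clock front end were the binding limit, that ~48% gap
# should show up as a frame-rate gap between the two cards.
```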
 
I take it you didn't read H's or any other site's reviews, where dropping from Turbo to Balanced simply reduces power consumption, and Power Saver reduces it further while having very little effect on performance in some cases? Your dissonance isn't my problem, though; I gave fair warning this was coming.


Go read the H review. All that geometry throughput and they still aren't 60% ahead in all games. That's just a synthetic benchmark nobody plays anymore.

Hell, if 4 tri/clock were a bottleneck, the higher-clocked part should be faster. Just compare Fury to Vega: same SE limit with 1 GHz vs 1.5 GHz clocks, and identical performance.

Ahahahahaha, I'm sorry mate, but today you've really done it; I can no longer take you seriously. I have never seen a more pitiful stream of excuses in the face of such staggering evidence to the contrary.

What you are doing right now is the rhetorical equivalent of holding the gates of Thermopylae alone while facing the entirety of the Persian army. This is madness.

Why would 60% higher geometry throughput result in 60% higher framerate unless that's the sole bottleneck? The 1070 is outperforming Vega 64 in some titles where this is obviously an issue; it's still 7 TFLOPS, half as powerful as Vega, and draws 150W.

Absolutely ridiculous
 
Why would 60% higher geometry throughput result in 60% higher framerate unless that's the sole bottleneck? The 1070 is outperforming Vega 64 in some titles where this is obviously an issue; it's still 7 TFLOPS, half as powerful as Vega, and draws 150W.
It wouldn't, but the original point was that the graph showed AMD's 4 SEs holding back performance, despite the obvious fact that Vega would have a 50% higher limit with its higher clock speed. Increasing throughput on a bottleneck by 50% should have more than a 0% change. That part is simple.

Civ is a classic CPU bottleneck. On Linux, even Fury is ahead of the 1080 Ti, albeit by 0.3%, along with other cards.

Ahahahahaha, I'm sorry mate, but today you've really done it; I can no longer take you seriously. I have never seen a more pitiful stream of excuses in the face of such staggering evidence to the contrary.
hitman_1920_1080.png

Not my fault if you don't believe the evidence in front of you. And yes, some driver work is still needed, as almost every reviewer I've seen has noted. Even Kyle mentioned it with the MSAA tests. Simple driver tuning right there.
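Stripped of the sniping, the claim both posts are circling is a plain bottleneck model: frame rate is set by the slowest stage, so raising a stage that isn't binding changes nothing. A toy version with invented per-stage ceilings:

```python
def fps_limit(geometry_fps, compute_fps, cpu_fps):
    """Toy model: the frame rate is capped by the slowest stage."""
    return min(geometry_fps, compute_fps, cpu_fps)

# Invented per-stage ceilings for one hypothetical scene:
base = fps_limit(geometry_fps=140, compute_fps=90, cpu_fps=120)
more_geometry = fps_limit(geometry_fps=210, compute_fps=90, cpu_fps=120)

print(base, more_geometry)  # 90 90: +50% geometry throughput, 0% fps change
# Cuts both ways: a 0% gain from 50% more geometry suggests geometry
# wasn't the binding stage in that chart to begin with.
```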
 
As someone who was on Team Red for about a decade, I can't comprehend how pathetic this launch actually is. You went from the Fury X, which traded blows with the then-flagship 980 Ti (which only a very good overclock on a Maxwell Titan could beat), to Vega 64, which trades blows with the 1080 a year and a quarter later and is beaten by no fewer than three other Nvidia cards. Fucking pathetic, to say the least.
 
Card barely matches a 1080 when overclocked and drawing 350W.

Drop the OC, enable power saving and you'll get the same performance for half the power draw. You have to wait for the drivers to be ready ahahahahaha


This is what he tells us while accusing people of having the mental acuity of third graders.

I just ignored him...he is just noise in the forum now IMHO
 
It wouldn't, but the original point was that the graph showed AMD's 4 SEs holding back performance, despite the obvious fact that Vega would have a 50% higher limit with its higher clock speed. Increasing throughput on a bottleneck by 50% should have more than a 0% change. That part is simple.

Civ is a classic CPU bottleneck. On Linux, even Fury is ahead of the 1080 Ti, albeit by 0.3%, along with other cards.


View attachment 33449
Not my fault if you don't believe the evidence in front of you. And yes, some driver work is still needed, as almost every reviewer I've seen has noted. Even Kyle mentioned it with the MSAA tests. Simple driver tuning right there.

What kind of horseshit is this? You just posted a graph of a CPU-bottlenecked, AMD-leaning game at 1080p where Balanced and Power Saver give identical performance and Turbo is lower.

Yeah. GPU power-saving features in a CPU-bound game can reduce power draw without affecting performance; real astute observation, Sherlock.
Pray tell us more of your incredible technical insights.
 