Vega Rumors

Still need the drivers tuned and enabled everywhere, nothing new there. So no idea wtf you're going on about. Or why you think something has changed since the last time you mentioned it.
  1. Wait 15 months for AMD's comeback
  2. Wait another 5 months for drivers to mature
  3. Celebrate our sound investment with me mates from /r/AMD now that we're beating Nvidia by 5%
  4. Sweat profusely in my strangely hot room
  5. There's no way leatherman can catch us now guys
  6. GOTO 1
 
  1. Wait 15 months for AMD's comeback
  2. Wait another 5 months for drivers to mature
  3. Celebrate our sound investment with me mates from /r/AMD now that we're beating Nvidia by 5%
  4. Sweat profusely in my strangely hot room
  5. There's no way leatherman can catch us now guys
  6. GOTO 1

Number 4 really made me chuckle
 
As someone who was on Team Red for about a decade, I can't comprehend how pathetic this launch actually is. You went from the Fury X, which traded blows with the then-flagship 980 Ti (a card only a very good overclock on a Maxwell Titan could beat), to Vega 64, which trades blows with the 1080 a year and a quarter later and is beaten by no fewer than three other nVidia cards. Fucking pathetic, to say the least.


Can only do so much when ya have half the R&D ;)
 
It wouldn't, but the original point was that the graph showed AMD's four SEs holding back performance, despite the obvious fact that Vega's higher clock speed should give it roughly a 50% higher geometry limit. Increasing throughput on a bottleneck by 50% should produce more than a 0% change. That part is simple.
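To put rough numbers on that clock-scaling point, here's a back-of-the-envelope sketch in Python. The one-triangle-per-clock-per-shader-engine rate and the clocks are illustrative assumptions, not published specs:

```python
# Back-of-the-envelope geometry ceiling for a 4-SE GCN part.
# Assumption: ~1 triangle per clock per shader engine (illustrative, not a spec).
def peak_tri_rate_gtris(shader_engines, clock_ghz, tris_per_clock=1.0):
    """Peak front-end triangle rate in billions of triangles per second."""
    return shader_engines * tris_per_clock * clock_ghz

fiji = peak_tri_rate_gtris(4, 1.05)  # Fury X: 4 SEs at ~1.05 GHz (assumed clock)
vega = peak_tri_rate_gtris(4, 1.55)  # Vega 64: 4 SEs at ~1.55 GHz (assumed clock)

print(f"Fiji: {fiji:.2f} Gtris/s")
print(f"Vega: {vega:.2f} Gtris/s (+{(vega / fiji - 1) * 100:.0f}%)")
# Same 4-SE layout, ~48% higher clock -> ~48% higher geometry ceiling.
# If a geometry-bound test shows ~0% gain, the real limit is somewhere else.
```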

Cover is a classic CPU bottleneck. On Linux even Fury is ahead of the 1080 Ti, albeit by 0.3%, along with other cards.


Not my fault if you don't believe the evidence in front of you. And yes, some driver work is still needed, as almost every reviewer I've seen has noted. Even Kyle mentioned it with the MSAA tests. Simple driver tuning right there.


Again, we saw the scaling pitfalls in Fiji too. And...

Fiji had issues with MSAA performance too, remember? Why else did the Fiji review guide have MSAA turned off or set low on most titles?

This is not something with drivers; it's something with the architecture and the bandwidth requirements of MSAA.
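A rough sketch of why the bandwidth cost climbs with MSAA, in Python. This counts only raw color + depth sample traffic and ignores delta color compression, caches, and depth optimizations, so the absolute numbers are illustrative only:

```python
# Naive framebuffer sample traffic for MSAA: one color + one depth write
# per sample per frame. Real GPUs compress and cache heavily, so treat
# this as an upper-bound illustration, not a measurement.
def msaa_traffic_gbps(width, height, samples, fps, bytes_color=4, bytes_depth=4):
    bytes_per_frame = width * height * samples * (bytes_color + bytes_depth)
    return bytes_per_frame * fps / 1e9

for s in (1, 2, 4, 8):
    print(f"{s}x MSAA @ 1440p / 100 fps: {msaa_traffic_gbps(2560, 1440, s, 100):.1f} GB/s")
# Sample traffic scales linearly with the MSAA factor, so a card that is
# already bandwidth-limited loses disproportionately as MSAA goes up.
```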

That is evidence that the MSAA issue and the scaling issues are not a one-off driver problem; it's a uarch issue.

How many months have I been saying it? Since Fiji's launch: AMD needs to step away from GCN, and now we have compelling evidence the problem is rooted in GCN, because we now have two "generations" with similar unit counts showing the same or similar problems. We have seen it happen many times in the past when architectures are reused over and over again. Any more than two times and the cracks show as games evolve.
 
AMD is wasting an enormous amount of money on their failed GPU division. Even the original acquisition nearly destroyed AMD.

To release a card like this, at this point, considering the performance and power draw compared to a GTX 1080, no one in their right mind should ever buy one. I can only imagine this is a no-choice buy for those locked into FreeSync.

Too many cores and too much power are required to compete. The architecture is a massive failure. It may have been good years ago, but it is highly outdated.

I believe monitor manufacturers knew of the flop, as did NVIDIA, which is why 4K 144 Hz monitors and Volta will not release until 2018. There is no competition; more money can be made selling current-gen monitors and GPUs. There's no reason to release 144 Hz 4K if nothing can drive it by today's standards.

AMD should sell its GPU division to Intel. Intel needs it, and so does AMD. It is the only logical business decision.
 
This is what blows my mind, really. A lot of people berate G-Sync because it's not free or an open standard and think they're saving money by going with FreeSync. What they fail to realize is that they are just buying into AMD's ecosystem under the guise of "cheaper" but are getting stuck with lesser GPUs that eat tons of power and can't match what NVIDIA offers. Why not pony up the extra cash and save yourself the waiting and headache of AMD? About the only reason I can think of is that some FreeSync monitors come in larger sizes than G-Sync, but even then I'd just go with a 34" curved G-Sync display instead.
Considering almost all high-refresh-rate monitors are now FreeSync, I don't see it as buying into their ecosystem. I own a FreeSync monitor because G-Sync is way too expensive. However, I use a GTX 1080. Would I love to be able to use variable refresh rate? Sure, but it is not a big enough game changer for me to buy Radeon or upgrade to G-Sync right now.
 
Well, Vega 64 and a GTX 960 provide the same experience in CS:GO. I guess that means they're equal then, despite the former being 4x faster in technical terms.

Pretty stupid example, and obviously not what I meant. Hey, an APU matches a 1080 Ti because they offer the same experience in SimCity!
 
yeah in one game though lol.
Precisely my point, and anarchist posted graphs for Hitman at 1080p as if they were meaningful in the context of power savings lol.

Even outside of CPU bottlenecks etc., I am certain there will be a new game that will be to Vega what AotS was to Fiji, in which Vega will perform really well, probably within 10% of a Ti, and it will be as if no other benchmark matters anymore. 'Vega has come to take the crown,' they will proclaim far and wide, from here to AnandTech, and then every title that uses those Vega features after that will show much less impressive gains, and the cycle will repeat itself. Wait for Navi; Vega was the beta, Navi is the real deal. Rinse and repeat.

Look at polygon throughput tests; Vega still lags far behind.

That's going to hurt going forward, as games with higher geometric detail will suffer, particularly if you're aiming for high framerates. If this sounds familiar, that's because it is.
 
yeah in one game though lol.

OK, yes, they used DOOM. But even then Vega still easily loses to the 1080 Ti in performance, yet was said to offer a better experience, so why couldn't that be true for other games?

It is similar to an R5 vs. a 7600K in gaming. Much of the time they offer similar fps, but the R5 offers a better experience since it doesn't drop in CPU-intensive areas. It's better to stay between 60 and 70 fps than to fluctuate from 80 down to 30 fps in places, even though both show '55 fps avg'.
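A minimal sketch of that averages-hide-drops point, in Python; the frame-time traces below are invented for illustration, not measurements:

```python
# Two made-up frame-time traces (in ms) that both average ~60 fps,
# but deliver very different experiences.
def avg_fps(frame_times_ms):
    return 1000 * len(frame_times_ms) / sum(frame_times_ms)

def worst_fps(frame_times_ms):
    return 1000 / max(frame_times_ms)

steady = [16.7] * 100                # locked ~60 fps every frame
spiky = [12.5] * 80 + [33.3] * 20    # ~80 fps mostly, with 30 fps dips

for name, trace in (("steady", steady), ("spiky", spiky)):
    print(f"{name}: avg {avg_fps(trace):.0f} fps, worst frame {worst_fps(trace):.0f} fps")
# Both report ~60 fps average; only the worst-frame number exposes the stutter.
```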
 
Precisely my point, and anarchist posted graphs for Hitman at 1080p as if they were meaningful in the context of power savings lol.

Even outside of CPU bottlenecks etc., I am certain there will be a new game that will be to Vega what AotS was to Fiji, in which Vega will perform really well, probably within 10% of a Ti, and it will be as if no other benchmark matters anymore. 'Vega has come to take the crown,' they will proclaim far and wide, from here to AnandTech, and then every title that uses those Vega features after that will show much less impressive gains, and the cycle will repeat itself. Wait for Navi; Vega was the beta, Navi is the real deal. Rinse and repeat.

Look at polygon throughput tests; Vega still lags far behind.

That's going to hurt going forward, as games with higher geometric detail will suffer, particularly if you're aiming for high framerates. If this sounds familiar, that's because it is.


Oddly enough, it's not just the polygon tests where it falls behind; it falls behind on many compute tasks too. There are some serious issues with GCN.
 
AMD is wasting an enormous amount of money on their failed GPU division. Even the original acquisition nearly destroyed AMD.

To release a card like this, at this point, considering the performance and power draw compared to a GTX 1080, no one in their right mind should ever buy one. I can only imagine this is a no-choice buy for those locked into FreeSync.

Too many cores and too much power are required to compete. The architecture is a massive failure. It may have been good years ago, but it is highly outdated.

I believe monitor manufacturers knew of the flop, as did NVIDIA, which is why 4K 144 Hz monitors and Volta will not release until 2018. There is no competition; more money can be made selling current-gen monitors and GPUs. There's no reason to release 144 Hz 4K if nothing can drive it by today's standards.

AMD should sell its GPU division to Intel. Intel needs it, and so does AMD. It is the only logical business decision.

To sell the GPU division would be a horrible decision. AMD has spent the past nine years integrating ATi, and since Bulldozer it has relied on the vital revenue of consoles and GPUs to shore up the company and cover the feeble CPU side. Part of AMD's strategy for shoring up confidence is that it competes in both the GPU and CPU markets and can draw potential revenue from both.

To sell RTG to Intel, AMD's competitor, would be utter suicide. AMD needs APUs to compete with Intel for the largest part of the computer market, and that requires GPU tech. For AMD to sell that GPU tech to Intel, so Intel can put better graphics in its CPUs, would be absurd.
 
OK, yes, they used DOOM. But even then Vega still easily loses to the 1080 Ti in performance, yet was said to offer a better experience, so why couldn't that be true for other games?

It is similar to an R5 vs. a 7600K in gaming. Much of the time they offer similar fps, but the R5 offers a better experience since it doesn't drop in CPU-intensive areas. It's better to stay between 60 and 70 fps than to fluctuate from 80 down to 30 fps in places, even though both show '55 fps avg'.


You can't take one game and say it's like that for all games; that was the point.

If you look through the synthetic tests for Vega, you can see where Vega will absolutely fall apart. There are no relative weaknesses in Pascal like we see in Vega. You don't even need to guess what developers will have to shy away from on Vega: if a developer wants a certain effect and uses it, it will work well on Pascal but crush Vega. Vega is screwed.

In pure compute tasks Vega should be ahead of a 1080 Ti, but in almost all synthetic compute tests it's behind the GTX 1080. That should not happen when the GTX 1080 has, what, 40% fewer TFLOPs? These are tests that only stress compute performance, in OpenCL, where nV's drivers suck.
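For reference, the theoretical FP32 arithmetic behind that TFLOPs comparison, sketched in Python; the clocks below are approximate boost figures I'm assuming, so the exact percentage shifts with whatever clocks you plug in:

```python
# Theoretical FP32 throughput = shaders * 2 FLOPs per clock (FMA) * clock.
# Clocks are approximate reference boost clocks (assumed, board-dependent).
def fp32_tflops(shaders, clock_ghz):
    return shaders * 2 * clock_ghz / 1000

vega64 = fp32_tflops(4096, 1.546)    # ~12.7 TFLOPs
gtx1080 = fp32_tflops(2560, 1.733)   # ~8.9 TFLOPs

print(f"Vega 64:  {vega64:.1f} TFLOPs")
print(f"GTX 1080: {gtx1080:.1f} TFLOPs")
print(f"Vega 64 advantage: +{(vega64 / gtx1080 - 1) * 100:.0f}%")
# By this math Vega 64 has ~43% more theoretical FP32 than the GTX 1080
# (i.e., the 1080 has roughly 30% less), yet it still trails in many synthetics.
```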
 
OK, yes, they used DOOM. But even then Vega still easily loses to the 1080 Ti in performance, yet was said to offer a better experience, so why couldn't that be true for other games?

It is similar to an R5 vs. a 7600K in gaming. Much of the time they offer similar fps, but the R5 offers a better experience since it doesn't drop in CPU-intensive areas. It's better to stay between 60 and 70 fps than to fluctuate from 80 down to 30 fps in places, even though both show '55 fps avg'.

We don't know how much of it could have just been monitor selection. The two monitors AMD selected were completely different types.

I'd love for the test to be redone with equal monitors and multiple games.
 
OK, yes, they used DOOM. But even then Vega still easily loses to the 1080 Ti in performance, yet was said to offer a better experience, so why couldn't that be true for other games?

It is similar to an R5 vs. a 7600K in gaming. Much of the time they offer similar fps, but the R5 offers a better experience since it doesn't drop in CPU-intensive areas. It's better to stay between 60 and 70 fps than to fluctuate from 80 down to 30 fps in places, even though both show '55 fps avg'.
One game is a small sample size, so stop pretending it is actually meaningful. It most certainly is not, in any fashion.
 
The back and forth started when several people were cheerleading the comment that FreeSync was a bad investment. I am saying it is a good investment when you can take a POS card like Vega and get an experience similar to the likes of a 1080 Ti.

Again, it was only one game, but I put more weight on a real-world test involving a dozen or so of the most experienced gamers than on any number of synthetic tests.

I would love to see more of these blind tests, however.
 
So you are saying Kyle's test was a big waste of time and money?


He was only allowed one game to test and didn't have time to do any more; it was AMD's requirements that stopped him. No one is blaming Kyle; he was stuck with the timeline AMD gave him. You can't do much in, what, maybe 10 hours?
 
So you are saying Kyle's test was a big waste of time and money?
Yes and no. From a marketing perspective, for the site and for driving traffic, it is a big win; but if you were looking for an objective analysis, yes, it was a waste, as it was in no way objective. I am not saying he should not have done it; it is his site and I am sure it helps bring in revenue. But we should not treat it as a benchmark of real performance.
 
Apparently, the $499/£449 pricing for the Radeon RX Vega 64 was a limited-quantity promotion, and the normal price is $599/£549.99.

https://www.overclock3d.net/news/gpu_displays/amd_s_rx_64_launch_pricing_was_only_for_early_sales/

TBH I'm not sure why anyone cares about pricing. It's not a good card at its previous price, much less this one, so why give a shit if AMD sells it at that price? If it does not sell, poor AMD, they will have to lower the price. If it does, good for AMD.
 
He was only allowed one game to test and didn't have time to do any more; it was AMD's requirements that stopped him. No one is blaming Kyle; he was stuck with the timeline AMD gave him. You can't do much in, what, maybe 10 hours?

Kyle set up the test and chose the game, so don't be so quick to defend your boyfriend.
 
Yes and no. From a marketing perspective, for the site and for driving traffic, it is a big win; but if you were looking for an objective analysis, yes, it was a waste, as it was in no way objective. I am not saying he should not have done it; it is his site and I am sure it helps bring in revenue. But we should not treat it as a benchmark of real performance.

I know we all hate hypotheticals, but just play along. If that same test group played all of the major games and they all claimed the Vega/FreeSync combo offered a better experience than the 1080 Ti in all of them, would you buy Vega? Simple yes or no.
 
Kyle set up the test and chose the game, so don't be so quick to defend your boyfriend.

Are you assuming his gender? Again, unless the monitors are equal, the test is practically useless for comparing the graphics cards.
 
Kyle set up the test and chose the game, so don't be so quick to defend your boyfriend.


I'm not defending him; what I stated was the truth. The card was sent by an AMD rep or an AIB rep, the testing was monitored by them, and then the card was taken back. Simply put, there was no time to do more than one test.

It was a marketing ploy by AMD.

You don't see Intel or nV doing such things; we have never seen them do such things, period, even when they had bad products. Why? When AMD has a shit product, they need to show it in the best light, and giving reviewers free rein will NOT do that. There is not a single review that can recommend Vega over Pascal. Why? Because that is what it came down to: Vega is an inferior product.

We even saw this with Epyc: they knew they were going to get screwed in data-locality and database testing, and that is exactly where they limited AnandTech from doing more tests. Shit, that matters for 75% or more of the servers on the web. For cloud servers, keeping data locality is crucial. For business servers built for databases of any type, it's crucial. Yet that is the one area where they wanted to limit testing?

You can sit here and bend over for AMD's hot red card all day long, but don't try to convince me or others it's a good thing to get a fiery poker up your ass.
 
should've just priced them cheap... enough to make a profit... but cheap...
Polaris was a decent launch; they should've followed suit...

as a hobbyist miner I'm disappointed
as a gamer I'm disappointed
as a consumer I'm disappointed
as an enthusiast I'm disappointed


AND NOW THEY'RE LOCKING THE BIOS

(I know it will probably be cracked... still, probably protecting Vega 56 for the time being)
 
TBH I'm not sure why anyone cares about pricing. It's not a good card at its previous price, much less this one, so why give a shit if AMD sells it at that price? If it does not sell, poor AMD, they will have to lower the price. If it does, good for AMD.

https://www.overclock3d.net/news/gpu_displays/amd_s_rx_64_launch_pricing_was_only_for_early_sales/1

Let's face it.

Vega is a flop and the only ones to buy it are the fanboys.

And if they are going to buy it anyway, why not charge $100 more?
 
I'm not defending him; what I stated was the truth. The card was sent by an AMD rep or an AIB rep, the testing was monitored by them, and then the card was taken back. Simply put, there was no time to do more than one test.

It was a marketing ploy by AMD.

You don't see Intel or nV doing such things; we have never seen them do such things, period, even when they had bad products. Why? When AMD has a shit product, they need to show it in the best light, and giving reviewers free rein will NOT do that. There is not a single review that can recommend Vega over Pascal. Why? Because that is what it came down to: Vega is an inferior product.


You can sit here and bend over for AMD's hot red card all day long, but don't try to convince me or others it's a good thing to get a fiery poker up your ass.

Settle down, man. All I am saying is that not everything in the gaming experience is quantifiable. Long charts make our epeens feel bigger, but it would be nice to get more real-world experience as opposed to just measuring charts. It's like a Mazda Miata: on paper it looks like total crap, but everyone who drives one says it's an absolute blast. Similar things happen with the gaming experience when you start mixing in FreeSync/G-Sync.

Once again, this was me defending FreeSync, not trying to champion Vega.
 
I'm not defending him; what I stated was the truth. The card was sent by an AMD rep or an AIB rep, the testing was monitored by them, and then the card was taken back. Simply put, there was no time to do more than one test.

It was a marketing ploy by AMD.

You don't see Intel or nV doing such things; we have never seen them do such things, period, even when they had bad products. Why? When AMD has a shit product, they need to show it in the best light, and giving reviewers free rein will NOT do that. There is not a single review that can recommend Vega over Pascal. Why? Because that is what it came down to: Vega is an inferior product.

We even saw this with Epyc: they knew they were going to get screwed in data-locality and database testing, and that is exactly where they limited AnandTech from doing more tests. Shit, that matters for 75% or more of the servers on the web. For cloud servers, keeping data locality is crucial. For business servers built for databases of any type, it's crucial. Yet that is the one area where they wanted to limit testing?

You can sit here and bend over for AMD's hot red card all day long, but don't try to convince me or others it's a good thing to get a fiery poker up your ass.

I don't remember AnandTech being limited by AMD anywhere aside from lack of time. You were referring to this article, right?
 
Settle down, man. All I am saying is that not everything in the gaming experience is quantifiable. Long charts make our epeens feel bigger, but it would be nice to get more real-world experience as opposed to just measuring charts. It's like a Mazda Miata: on paper it looks like total crap, but everyone who drives one says it's an absolute blast. Similar things happen with the gaming experience when you start mixing in FreeSync/G-Sync.

Once again, this was me defending FreeSync, not trying to champion Vega.


Telling me to calm down after you insult me? Yeah, I see how it works...

Dude, the Mazda Miata is a great chick's car; it's also a chick-magnet car. There is a certain reason to get that car. I would rather get a Lotus Elise: much better in both those categories, and a better driving experience, but it costs more. That's the analogy: Vega + FreeSync is your Mazda Miata, and Pascal + G-Sync is your Lotus Elise.
 
I don't remember Anandtech being limited by AMD anywhere aside from lack of time. You were referring to this article right?


Closing thoughts
First of all, we have to emphasize that we were only able to spend about a week on the AMD server, and about two weeks on the Intel system. With the complexity of both server hardware and especially server software, that is very little time. There is still a lot to test and tune, but the general picture is clear.
 
All the test showed was that, in one game that ran at 100+ FPS on both cards, Vega + FreeSync and a 1080 Ti + G-Sync performed similarly enough that two out of four people made a subjective call as to which they would name the better 'ecosystem'. That's all the test showed. It's being skewed to mean that Vega + FreeSync is the better-performing ecosystem and being extrapolated to cover all gaming scenarios.

So, yes, the test was fucking crap and, as a loyal [H] reader, I was disappointed that Kyle even deigned to honor AMD's marketing bullshit request.
 