Vega Rumors

LOL I think anyone would take the Lotus.

Yeah it was marketing.

The whole point of AMD's test, and perhaps Kyle's, is that FreeSync is so good and such a great value that it can make up for the POS card we are about to release.

Hey, it's something.
 
Welcome to the Nvidia forums; no wonder [H] is losing readers.

I think it is more of an anti-AMD thing, as a lot of these guys are in the Intel forums as well. Seriously razor, that EPYC comment came out of left field, and it makes you appear to have some sort of vendetta against AMD.
 
Well, I even stated in that thread: if someone has invested in FreeSync, yeah, get Vega; it's the better option for them. There are no ifs and buts about it. But if they are on the fence, go with nV, because right now I don't see Navi catching up to Volta, not toe to toe. It might be much closer than what we see with Vega versus Pascal, though.
 
All the test showed was that, in one game that ran at 100+ FPS on both cards, Vega + FreeSync and 1080 Ti + G-Sync performed similarly enough that two out of four people made a subjective call as to which they would have called the better 'ecosystem'. That's all the test showed. It's being skewed to mean that Vega + FreeSync is the better-performing ecosystem and is being extrapolated to cover all gaming scenarios.

So, yes, the test was fucking crap and, as a loyal [H] reader, I was disappointed that Kyle even deigned to honor AMD's marketing bullshit request.

Yep. I am sure I could pick the right game/settings and do this kind of "blind test" with a Vega 64 and a GTX 1060, and the results would be similarly split. It was an utterly one-sided marketing stunt.

That doesn't mean a GTX 1060 is equal to a Vega 64; it just means it's an utterly pointless "test".
 
I think it is more of an anti-AMD thing, as a lot of these guys are in the Intel forums as well. Seriously razor, that EPYC comment came out of left field, and it makes you appear to have some sort of vendetta against AMD.


It's the same thing they did with Vega: limiting testing to show the best points of their new product. I'm sure if they had had an option to not even include those tests, they would have taken it, but it would have been way too obvious what they were trying to do.
 
Closing thoughts

Anandtech writes on the page "Testing Notes & Benchmark Configuration"

Unfortunately, our time testing the two platforms has been limited. In particular, we only received AMD's EPYC system last week, and the company did not put an embargo on the results. This means that we can release the data now, in time to compare it to the new Skylake-SP Xeons, however it also means that we've only had a handful of days to work with the platform before writing all of this up for today's embargo. We're confident in the data, but it means that we haven't had a chance to tease out the nuances of EPYC quite yet, and that will have to be something we get to in a future article.

Now, that does not preclude AMD from being sneaky bastards, but it seems like Anandtech wanted to get first-impression clicks in with Skylake-SP and, if the last line is any indication, do more testing with EPYC afterwards and simply write another article.

Now, the only other site that has EPYC hardware and is in the business of reviews seems to be ServeTheHome (that I can find); they have done multiple articles on EPYC, from June 20 to August 8, so at least for STH, AMD did not ask for the hardware back. TBH, from everything I've heard about servers and datacenters, I find it hard to believe AMD would or could sneak stuff past anyone when it is operating from such a non-starting position in the market, and when customers usually test hardware for a long time before deployment.
 
It's not a bad idea (from AMD's perspective); everyone else can just buy 1080s or Tis.

AMD/RTG is milking the fanboys for all they've got.

Even the $699 Radeon RX Vega 64 Liquid Cooled is sold out.

Nobody but the fanboys would pay $699 for that instead of the GeForce GTX 1080 Ti.
 
I know we all hate hypotheticals, but just play along. If that same test group played all of the major games and they all claimed that the Vega/FreeSync combo offered a better experience than the 1080 Ti in all of the games, would you buy Vega? Simple yes or no.
Too many unknowns in the question to just reply yes or no.
 
Now, that does not preclude AMD from being sneaky bastards, but it seems like Anandtech wanted to get first-impression clicks in with Skylake-SP and, if the last line is any indication, do more testing with EPYC afterwards and simply write another article.

Now, the only other site that has EPYC hardware and is in the business of reviews seems to be ServeTheHome (that I can find); they have done multiple articles on EPYC, from June 20 to August 8, so at least for STH, AMD did not ask for the hardware back. TBH, from everything I've heard about servers and datacenters, I find it hard to believe AMD would or could sneak stuff past anyone when it is operating from such a non-starting position in the market, and when customers usually test hardware for a long time before deployment.


They can't sneak anything by; it's impossible. Servers and datacenters, it's all about money for them; they will look into everything. But by limiting the time to one week of testing, that is just not enough time to do much. Everyone is going to test the easiest synthetics in that short amount of time. Database testing is considerably more taxing and time-consuming because of the options available to route the data around. I can spend an entire month with a server rack testing database performance to figure out problems, and then a month or more after that fixing the issues. And as I do that, more problems will come up lol.
 
AMD/RTG is milking the fanboys for all they've got.

Even the $699 Radeon RX Vega 64 Liquid Cooled is sold out.

Nobody but the fanboys would pay $699 for that instead of the GeForce GTX 1080 Ti.

I don't think it is fanboys paying $699 for a liquid-cooled Vega, but miners. I like AMD products, but I am running my 980 Ti until Volta or Navi comes out.
 
I don't think it is fanboys paying $699 for a liquid-cooled Vega, but miners. I like AMD products, but I am running my 980 Ti until Volta or Navi comes out.

Clueless miners who believed the BS hashing numbers, and definitely fanboys. I have seen some online excited about buying a Liquid Vega, which is like 20-30% slower than a 1080 Ti for the same money.

Couple that with AMD not having much supply to start with. It looks like they only started production recently, which is likely why reviewers had only two days to review the cards.
 
Considering almost all high-refresh-rate monitors are now FreeSync, I don't see it as buying into their ecosystem. I own a FreeSync monitor because G-Sync is way too expensive. However, I use a GTX 1080. Would I love to be able to use variable refresh rate? Sure, but it is not a big enough game changer for me to buy Radeon or upgrade to G-Sync right now.

People who haven't used a variable-refresh-rate setup need to STFU until they have tried it. G-Sync is probably the best investment I've made and the biggest visual improvement of the past 5 years for me. No screen tearing with no vsync lag is amazing.
 
They can't sneak anything by its impossible, server, datacenters its all about money for them, they will look into everything. but by limiting time, 1 week of testing, that is just not enough time to do much. Everyone is going to test out the easiest synthetics in that short amount of time. Database testing is considerably more tasking and time consuming because of the options available to route the data around. I can spend an entire month with a server rack and test out database performance to figure out problems and then a month after that to fix the issues or more. And as I do that more problems will come up lol.

People often forget that inside a datacenter you don't have to cool a single server... you have to cool thousands!
Every watt dissipated needs to be moved (which requires energy), and new, cooler air needs to be brought in.

So you don't only need to supply power; you need to cool and move air too.

If your servers in the datacenter require 1 MW of power to run... guess how much power you need to cool and move air? ^^

A consumer might look and think, "150 watts more for the same performance... I can live with that!"
A datacenter will look and go, "We CERTAINLY do not need 150 more watts per unit for the same or less performance!"
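
To put rough numbers on that, here's a back-of-the-envelope sketch using the standard PUE (Power Usage Effectiveness) metric; the PUE value and deployment size below are illustrative assumptions, not measured figures:

```python
# Rough sketch: total facility power = IT power * PUE.
# A PUE of ~1.5 is a plausible mid-range figure (assumption, not a measurement).

it_load_kw = 1000.0   # 1 MW of server load, as in the example above
pue = 1.5             # assumed facility PUE

overhead_kw = it_load_kw * pue - it_load_kw
print(f"Cooling/distribution overhead: {overhead_kw:.0f} kW on top of {it_load_kw:.0f} kW of IT load")

# Per-card view: 150 extra watts per unit compounds across thousands of units.
extra_w = 150
cards = 10_000        # hypothetical deployment size
print(f"Extra IT draw: {extra_w * cards / 1000:.0f} kW, "
      f"~{extra_w * cards * pue / 1000:.0f} kW at the facility level")
```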

I am pretty sure we will never have Vega in our datacenters...but the other "V"...:whistle:
 
People who haven't used a variable-refresh-rate setup need to STFU until they have tried it. G-Sync is probably the best investment I've made and the biggest visual improvement of the past 5 years for me. No screen tearing with no vsync lag is amazing.
I have tried it, and at least for me it is not worth the investment. Vsync and triple buffering nearly eliminate lag.
 
AMD brings nothing new to the table with these cards. They could have at least started a price war. Vega 56 at $349 and Vega 64 at $449 would have been nice.

AMD can't win a price war. Vega is 484 mm² versus the 1080/1070's Pascal chip at ~314 mm²; not only is Vega a bigger chip, it has more expensive HBM2 memory. The best thing they could do is stay at a similar cost to the 1080 and 1070, and frankly, they can't even seem to do that, since prices start at $599 while there are plenty of 1080s going for around $500 on Newegg.
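
To illustrate the die-cost gap, here's a rough sketch using the standard dies-per-wafer approximation; the 300 mm wafer is assumed, and yield, scribe lines, and edge exclusion are ignored:

```python
import math

WAFER_D_MM = 300.0  # assumed 300 mm wafer

def dies_per_wafer(die_area_mm2: float) -> int:
    """Classic approximation: wafer area / die area, minus an edge-loss term."""
    r = WAFER_D_MM / 2
    return int(math.pi * r**2 / die_area_mm2
               - math.pi * WAFER_D_MM / math.sqrt(2 * die_area_mm2))

print(dies_per_wafer(484))  # Vega at 484 mm²:   ~115 candidate dies per wafer
print(dies_per_wafer(314))  # Pascal at 314 mm²: ~187 candidate dies per wafer
```

Fewer candidate dies per wafer, before yield even enters the picture, plus HBM2 and the interposer, is why Vega's bill of materials can't undercut the 1080's.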

Not a good launch at all for AMD as far as gaming is concerned. AMD needs to divert a portion of the funds from those Ryzen/EPYC/Threadripper sales into beefing up the GPU division.

No more staggered releases. If Vega had come out with the 1080 and Pascal, it would not have been great, but it still would have been better than a year late. AMD CANNOT keep allowing Nvidia to release entire GPU slates every year, only to completely cede over half the market, if not more. And losing the high end so badly hits most of the GPU profits.

AMD has to lean on GlobalFoundries and Samsung more. Samsung can be a lifeline as they are catching up in process tech against the big foundries like TSMC and GlobalFoundries, and against Intel. AMD should see if Samsung can help tweak their process to be better suited to both their CPUs and GPUs, the way TSMC seems to bend over backwards to cater to Nvidia's needs. And for f*ck's sake, whenever Navi launches, it had better effing launch with a full slate of products from top to bottom, within months of each other.


AMD should get another injection of cash if they can start getting Raven Ridge APUs into notebooks by the holidays. Those will have Vega integrated GPUs, so that is something; maybe it will at least raise the baseline performance compared to typical Intel integrated graphics, sans the more power-hungry Iris Pro variants.
 
I have tried it, and at least for me it is not worth the investment. Vsync and triple buffering nearly eliminate lag.

Done properly on a higher-refresh monitor, it should be very hard to tell the difference between vsync + triple buffering and Free/G-Sync.

A higher-refresh screen cuts any lag waiting for a sync (120 Hz means ~8 ms instead of ~16 ms for the worst-case wait), and properly done triple buffering increases the odds of having a good frame ready any time a vsync window opens.

If I wanted to skip the G-Sync tax and stay on an Nvidia GPU, I would just get any high-refresh monitor and use triple buffering.
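
For reference, those worst-case wait figures fall straight out of the refresh interval; a quick sketch:

```python
# Worst-case wait for the next vsync window is one full refresh interval.

for hz in (60, 100, 120, 144):
    print(f"{hz:>3} Hz -> worst-case vsync wait ~{1000 / hz:.1f} ms")

# 60 Hz -> ~16.7 ms and 120 Hz -> ~8.3 ms, matching the ~16 ms vs ~8 ms
# figures above; this is why high refresh narrows the gap to Free/G-Sync.
```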
 
Telling me to calm down after you insult me? Yeah, I see how it works...

Dude, the Mazda Miata is a great chick's car; it's also a chick-magnet car. There is a certain reason to get that car. I would rather get a Lotus Elise: much better in both those categories, and it's a better driving experience, but it costs more. That's what it's like: Vega + FreeSync is your Mazda Miata and Pascal + G-Sync is your Lotus Elise.


How dare you use that comparison. A better one would be to compare the now-defunct Saturn Sky to a Mazda Miata: the Sky was prettier (metal shroud) but did not handle and perform as well as the Miata. The Lotus Elise is more like a Titan Xp.
 
Done properly on a higher-refresh monitor, it should be very hard to tell the difference between vsync + triple buffering and Free/G-Sync.

A higher-refresh screen cuts any lag waiting for a sync (120 Hz means ~8 ms instead of ~16 ms for the worst-case wait), and properly done triple buffering increases the odds of having a good frame ready any time a vsync window opens.

If I wanted to skip the G-Sync tax and stay on an Nvidia GPU, I would just get any high-refresh monitor and use triple buffering.
Which is what I did
 
Which is what I did

Are you using Nvidia Fast Sync, which is a driver feature that does triple buffering + vsync?


I am sure this is good enough for me, rendering the G-Sync/FreeSync tax issue irrelevant for me, but it might matter to others.

Edit:

I actually watched the video through now. It should be great tech when properly implemented, but it looks like Fast Sync introduces stuttering. I can see it in the video.

This would be great if they could do it properly, but at the time of this video it's really not that good, and since it doesn't sell G-Sync monitors, I bet it is low on Nvidia's to-do list.
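
The stutter is easy to reproduce in a toy model; the sketch below only illustrates the "scan out the most recently completed frame" idea, not Nvidia's actual implementation, and the steady 90 fps render rate is an assumption:

```python
from fractions import Fraction

# Toy model: the game renders uncapped; at each vsync the display shows the
# most recently completed frame. When the render rate isn't an even multiple
# of the refresh rate, the game-time step between displayed frames alternates,
# which is perceived as judder.

REFRESH = Fraction(1, 60)  # 60 Hz display: seconds per scanout
FRAME = Fraction(1, 90)    # assumed steady 90 fps render rate: seconds per frame

shown = [(k * REFRESH) // FRAME for k in range(8)]  # frame index shown at vsync k
deltas_ms = [float((b - a) * FRAME * 1000) for a, b in zip(shown, shown[1:])]

print(shown)      # [0, 1, 3, 4, 6, 7, 9, 10] -> alternating 1- and 2-frame jumps
print(deltas_ms)  # alternating ~11.1 ms and ~22.2 ms of game time -> uneven pacing
```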
 
Settle down, man. All I am saying is that not everything in the gaming experience is quantifiable. Long charts make our e-peens feel bigger, but it would be nice to get more real-world experience as opposed to just measuring charts. It's like a Mazda Miata: on paper it looks like total crap, but everyone who drives one says it is an absolute blast. Similar things happen with the gaming experience when you start mixing in FreeSync/G-Sync.

Once again, this was me defending FreeSync and not trying to champion Vega.

Read any review with frametime graphs, and that pretty much shows you what you will get in "real world" experiences. The sham AMD setup with Doom + Vulkan (the absolute best-case scenario) was meant to make people think Vega is more than it really is.
 
I was hoping there would be some sort of conclusion by now where both sides shook hands and walked away from the table. Looks like that isn't going to happen for a while.

Congrats to Team Red for Ryzen/Threadripper, and a golf clap to Vega for "giving it all she's got".

AMD, you ran a messy marketing campaign for Vega and continuously dumped massive amounts of fuel into the hype train. While I chuckled at "Poor Volta" and an early marketing video that made me feel like it's better to be Red than Dead... I will just say congrats on being able to compete 15-ish months later.

Enjoy your purchases, whether it's blue, green, or red.

Now can we just let it end and move on to Navi vs. Volta? ;)
 
I was hoping there would be some sort of conclusion by now where both sides shook hands and walked away from the table. Looks like that isn't going to happen for a while.

Congrats to Team Red for Ryzen/Threadripper, and a golf clap to Vega for "giving it all she's got".

AMD, you ran a messy marketing campaign for Vega and continuously dumped massive amounts of fuel into the hype train. While I chuckled at "Poor Volta" and an early marketing video that made me feel like it's better to be Red than Dead... I will just say congrats on being able to compete 15-ish months later.

Enjoy your purchases, whether it's blue, green, or red.

Now can we just let it end and move on to Navi vs. Volta? ;)

It will never end...
 
All the test showed was that, in one game that ran at 100+ FPS on both cards, Vega + FreeSync and 1080 Ti + G-Sync performed similarly enough that two out of four people made a subjective call as to which they would have called the better 'ecosystem'. That's all the test showed. It's being skewed to mean that Vega + FreeSync is the better-performing ecosystem and is being extrapolated to cover all gaming scenarios.

So, yes, the test was fucking crap and, as a loyal [H] reader, I was disappointed that Kyle even deigned to honor AMD's marketing bullshit request.

This is directed to AMD rather than HardOCP:
itsatrap.jpg


Sorry, just watched Rogue One for the umpteenth time tonight (yeah, the meme is from Return of the Jedi, I think) :)

Also, the test had to rely on one monitor with a native refresh of 60 Hz that was overclocked, while the other was native 100 Hz.
TBH, in that context it made the FreeSync monitor even better value compared to that specific G-Sync model, but from a test-conclusion standpoint it was less than ideal, both in the scope of the test and in the monitor specs. Then again, Kyle made it clear this was a fun test more than anything else, yet many are taking it as some kind of validation of FreeSync.
Maybe HardOCP should have put more emphasis on the caveats.

Cheers
 

The good news is that messing around with voltages will make it quieter; one can do that with cards from both manufacturers, although it really is needed for AMD products.
That said, I doubt the general consumer will mess around with undervolting and testing to the point of finding a stable setup at a given voltage/clocks, but I would say most people on here will.

As a comparison, Guru3D, with their calibrated noise probe, also found both cards slightly noisier than the 290X, but I would say they never messed around with undervolting as part of that testing, going with stock instead.
I agree that stock behaviour should get the greater emphasis, because it applies to the majority of consumers, and stable undervolts can vary from card to card and with what it is used for, or with certain games. But undervolting is still worth considering in this context, IMO, for both AMD and Nvidia (Nvidia sells a Tesla GP104 that runs at 75 W with 5.5 TFLOPS FP32, a clear example of it working for them; I mention it because many do not think of undervolting power/clocks for Nvidia too).
Cheers
 
Are you using Nvidia Fast Sync, which is a driver feature that does triple buffering + vsync?


I am sure this is good enough for me, rendering the G-Sync/FreeSync tax issue irrelevant for me, but it might matter to others.

Edit:

I actually watched the video through now. It should be great tech when properly implemented, but it looks like Fast Sync introduces stuttering. I can see it in the video.

This would be great if they could do it properly, but at the time of this video it's really not that good, and since it doesn't sell G-Sync monitors, I bet it is low on Nvidia's to-do list.

Not all games suffer from stuttering. CS:GO does not appear to, and Doom on Vulkan never stutters.
 
Talk shit all you want. The card is sold out (other than bundles). Seems like a win for AMD (even with the sales of the higher-priced aluminum-shroud cards, which we all know does not warrant $100, nor cost even close to a tenth of that).
 
Talk shit all you want. The card is sold out (other than bundles). Seems like a win for AMD (even with the sales of the higher-priced aluminum-shroud cards, which we all know does not warrant $100, nor cost even close to a tenth of that).

We know sold out means nothing. AMD sells out every launch and it has no correlation to market share.

The real trials come when Volta hits or mining goes to hell. If Volta hits and mining goes to hell at the same time, team "red" will really live up to their color.
 
We know sold out means nothing. AMD sells out every launch and it has no correlation to market share.

The real trials come when Volta hits or mining goes to hell. If Volta hits and mining goes to hell at the same time, team "red" will really live up to their color.
lulz.. and so it begins.. #WaitForVolta.. except of course, NV has already said no (gaming) Volta until at least 2018

So I guess it should be #WaitForVolta2018!!
 
lulz.. and so it begins.. #WaitForVolta.. except of course, NV has already said no (gaming) Volta until at least 2018

So I guess it should be #WaitForVolta2018!!

I never said wait for Volta; I said AMD is going to struggle when one of those two things happens. There's a good chance the mining collapse and Volta will coincide. Good for gamers, bad for Nvidia and AMD.

Considering my current card pushes the only game I play (World of Warships) at 8x MSAA and 4x SGSSAA, I have no reason to upgrade until 4K/wireless VR drops, which is likely years away.

I am pretty much unbiased, besides my Fury X crapping out on me (albeit the only AMD card out of the 8 I've owned to do so).
 
I'd buy one to test if they weren't sold out instantly because demand is so damn high.
Always quick to false conclusions; stock evaporated because Vega is one of the worst-stocked GPU launches in history.


I have to say that I wasn't expecting this type of anti-consumer behavior from AMD. The reason they are apparently giving is to scare off miners by forcing people to buy 'Radeon Packs', but this is something that could easily be solved by limiting standalone GPUs to one per customer, and is basically nothing more than a glorified excuse. It is an especially low blow because AMD stated that the entire point of the Vega delay was to accumulate stock for launch, and yet we saw their entire inventory evaporate within minutes of launch, making it one of the worst-stocked GPU launches in history. Consumers will now pretty much be forced to buy Radeon Packs at gunpoint. One thing is for sure, however: every reviewer needs to update their conclusions based on the new MSRPs.
Gibbo OC UK

http://wccftech.com/amds-rx-vega-64s-499-price-tag-was-a-launch-only-introductory-offer/
 
Adding the link to the OC3D article about the price increases (the other one was missing a 1 at the end, so it didn't go to the article).

https://www.overclock3d.net/news/gpu_displays/amd_s_rx_64_launch_pricing_was_only_for_early_sales/1

Not a good sign on pricing, but it is worth noting that AMD has not officially responded to questions about whether the $600 pricing for the base model is accurate/permanent.


More BS brought to you by the same etailer that played a part in the "100 MH/s" mining rumor..

Ask yourself who has more to gain: AMD, by stating pricing would be A and instead turning it into C, alienating potential customers; or an etailer who can sell as many cards as possible, as quickly as possible, then bump pricing by $100 and claim "it's AMD" (all while pocketing the difference)?
 
More BS brought to you by the same etailer that played a part in the "100 MH/s" mining rumor..

Ask yourself who has more to gain: AMD, by stating pricing would be A and instead turning it into C; or an etailer who can sell as many cards as possible, as quickly as possible, then bump pricing by $100 and claim "it's AMD" (all while pocketing the difference)?


That is kinda shady; Gibbo usually doesn't lie like that. As it is right now, it seems like AMD really is pushing people to buy their packs; let's wait and see what is what. I am surprised AMD would increase prices like that; it really makes no sense, it just makes their cards look worse than they do now...
 