Welcome to the Nvidia forums; no wonder Hard is losing readers.
All the test showed was that, in one game running at 100+ FPS on both cards, Vega + FreeSync and a 1080 Ti + G-Sync performed similarly enough that two out of four people made a subjective call as to which they would call the better 'ecosystem'. That's all the test showed. It's being skewed to mean that Vega + FreeSync is the better-performing ecosystem and being extrapolated to cover all gaming scenarios.
So, yes, the test was fucking crap and, as a loyal [H] reader, I was disappointed that Kyle even deigned to honor AMD's marketing bullshit request.
I think it is more of an anti-AMD thing, as a lot of these guys are in the Intel forums as well. Seriously razor, that EPYC comment came out of left field and it makes you appear to have some sort of vendetta against AMD.
Closing thoughts
Unfortunately, our time testing the two platforms has been limited. In particular, we only received AMD's EPYC system last week, and the company did not put an embargo on the results. This means that we can release the data now, in time to compare it to the new Skylake-SP Xeons, however it also means that we've only had a handful of days to work with the platform before writing all of this up for today's embargo. We're confident in the data, but it means that we haven't had a chance to tease out the nuances of EPYC quite yet, and that will have to be something we get to in a future article.
It's not a bad idea (from AMD's perspective); everyone else can just buy 1080s or Tis.
Too many unknowns in the question to just reply yes or no.

I know we all hate hypotheticals, but just play along. If that same test group played all of the major games and they all claimed that the Vega/FreeSync combo offered a better experience than the 1080 Ti in all of the games, would you buy Vega? Simple yes or no.
You are entitled to your opinion and we are entitled to tell you why it is factually wrong.

@Daysk Ok, so you also think the test was useless. I don't.
AnandTech writes on the page "Testing Notes & Benchmark Configuration":
Now, that does not preclude AMD from being sneaky bastards, but it seems like AnandTech wanted to get first-impression clicks in with Skylake-SP and then, if the last line indicates correctly, do more testing with EPYC afterwards and simply write another article.
Now, the only other site that has EPYC hardware and is in the business of reviews seems to be ServeTheHome (that I can find). They have done multiple articles on EPYC, from June 20 to August 8th, so at least for STH, AMD did not ask for the hardware back. TBH, from everything I've heard about servers and datacenters, I find it hard to believe AMD would or could sneak stuff past anyone when AMD is operating from such a non-starting position in the market, and when customers usually test hardware for a long time before deployment.
AMD/RTG is milking the fanboys for all they've got.
Even the $699 Radeon RX Vega 64 Liquid Cooled is sold out.
Nobody but the fanboys would pay $699 for that instead of the GeForce GTX 1080 Ti.
I don't think it is fanboys paying $699 for a liquid cooled Vega, but miners. I like AMD products, but I am running my 980Ti until Volta or Navi come out.
Considering almost all high-refresh-rate monitors are now FreeSync, I don't see it as buying into their ecosystem. I own a FreeSync monitor because G-Sync is way too expensive. However, I use a GTX 1080. Would I love to be able to use variable refresh rate? Sure, but it is not a big enough game changer for me to buy Radeon or upgrade to G-Sync right now.
I love how things you post on the internet are hard to take back. It's fun to see all these shit tier websites have to eat crow
They can't sneak anything by; it's impossible. Servers and datacenters are all about money for those customers, and they will look into everything. But by limiting time to one week of testing, that is just not enough time to do much. Everyone is going to test the easiest synthetics in that short amount of time. Database testing is considerably more taxing and time-consuming because of the options available to route the data around. I can spend an entire month with a server rack testing database performance to figure out problems, and then a month or more after that fixing the issues. And as I do that, more problems will come up lol.
I have tried it and at least for me it is not worth the investment. VSYNC and Triple Buffering nearly eliminate lag.

People that haven't used a variable refresh rate setup need to STFU until they have tried it. G-Sync is probably the best investment I've made and the biggest visual improvement of the past 5 years for me. No screen tearing with no vsync lag is amazing.
AMD is sandbagging.
AMD brings nothing new to the table with these cards. They could have at least started a price war. Vega 56 at $349 and Vega 64 at $449 would have been nice.
I have tried it and at least for me it is not worth the investment. VSYNC and Triple Buffering nearly eliminate lag.
Telling me to calm down after you insult me? Yeah I see how it works.........
Dude, the Mazda Miata is a great chick's car; it's also a chick-magnet car. There is a certain reason to get that car. I would rather get a Lotus Elise: much better in both those categories and a better driving experience, but it costs more. That is what this is like. Vega + FreeSync is your Mazda Miata and Pascal + G-Sync is your Lotus Elise.
Which is what I did.

Done properly on a higher-refresh monitor, it should be very hard to tell the difference between Vsync + triple buffering and Free/G-Sync.
A higher-refresh screen can cut any lag waiting for a sync (120Hz means ~8ms instead of 16ms for the worst-case wait), and properly done triple buffering increases the odds of having a good frame ready any time a vsync window opens.
If I wanted to skip the G-Sync tax and stay on an NVIDIA GPU, I would just get any high-refresh monitor and use triple buffering.
Which is what I did
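The 8ms vs. 16ms arithmetic above is easy to sanity-check: with plain vsync, a frame that just misses a refresh waits one full refresh interval in the worst case. A quick toy calculation (idealized timing, not tied to any particular driver or panel):

```python
# Worst-case extra latency waiting for the next vsync window:
# a frame that finishes just after a refresh waits one full interval.
def worst_case_vsync_wait_ms(refresh_hz: float) -> float:
    return 1000.0 / refresh_hz

for hz in (60, 120, 144):
    print(f"{hz} Hz: up to {worst_case_vsync_wait_ms(hz):.1f} ms")
# 60 Hz: up to 16.7 ms
# 120 Hz: up to 8.3 ms
# 144 Hz: up to 6.9 ms
```

So just doubling the refresh rate halves the worst-case sync wait, which is the poster's point about high-refresh monitors narrowing the gap to adaptive sync.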
Settle down, man. All I am saying is that not everything in the gaming experience is quantifiable. Long charts make our epeens feel bigger, but it would be nice to get more real-world experience as opposed to just measuring charts. It's like a Mazda Miata: on paper it looks like total crap, but everyone who drives one says it is an absolute blast. Similar things happen with the gaming experience when you start mixing in FreeSync/G-Sync.
Once again, this was me defending freesync and not trying to champion Vega.
I was hoping there would be some sort of conclusion by now where both sides shook hands and walked away from the table. Looks like that isn't going to happen for a while.
Congrats to team Red for Ryzen/Threadripper, and a golf clap to Vega for "giving it all she's got".
AMD, you ran a messy marketing campaign for Vega and continuously dumped massive amounts of fuel into the hype train. While I chuckled at "Poor Volta" and an early marketing video that made me feel like it's better to be Red than Dead... I will just say congrats on being able to compete 15-ish months later.
Enjoy your purchases, whether it's blue, green, or red.
Now can we just let it end and start on Navi vs. Volta?
http://www.hardwarecanucks.com/foru...rx-vega-64-vega-56-performance-review-19.html
55 decibels?! That's reference 290X levels. Holy fuck.
Are you using NVIDIA Fast Sync, which is a driver feature to do triple buffering + vsync?
I am sure this is good enough for me, rendering the G-Sync/FreeSync tax issue irrelevant for me, but it might matter to others.
Edit:
I actually watched the video through now. It should be great tech when properly implemented, but it looks like Fast Sync introduces stuttering; I can see it in the video.
This would be great if they could do it properly, but as of this video it's really not that good, and since it doesn't sell G-Sync monitors I bet it is low on NVIDIA's todo list.
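The stutter is consistent with how Fast Sync is generally described: at each refresh the display shows the most recently completed frame, so when the render rate is not an integer multiple of the refresh rate, the game-time gap between displayed frames alternates. A toy model sketching this (hypothetical 90 FPS render on a 60Hz panel, idealized timing, not actual driver behavior):

```python
# Toy model of Fast Sync frame selection: at each 60Hz scanout, show the
# newest frame a 90 FPS renderer has finished. Frame indices advance unevenly.
refresh_hz, render_fps = 60, 90            # assumed example rates
ratio = render_fps / refresh_hz            # frames rendered per refresh (1.5)

displayed = [int(i * ratio) for i in range(8)]       # frame shown at scanout i
gaps = [b - a for a, b in zip(displayed, displayed[1:])]
print(displayed)  # [0, 1, 3, 4, 6, 7, 9, 10]
print(gaps)       # [1, 2, 1, 2, 1, 2, 1]
```

The alternating 1-frame/2-frame cadence in `gaps` is exactly the kind of uneven animation step that reads as judder on screen, even though no tearing occurs.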
Talk shit all you want. The card is sold out (other than bundles). Seems like a win for AMD (even with the sales of the higher-priced aluminum-shroud cards, which we all know do not warrant $100 nor cost even close to a tenth of that).
lulz.. and so it begins.. #WaitForVolta.. except of course, NV has already said no (gaming) Volta until at least 2018

We know sold out means nothing. AMD sells out every launch and it has no correlation to market share.
The real trials come when Volta hits or mining goes to hell. If Volta hits and mining goes to hell at the same time team "red" will really live up to their color.
You don't even need to do that; you can buy a better GPU than Vega today and they are actually available at MSRP.

lulz.. and so it begins.. #WaitForVolta.. except of course, NV has already said no (gaming) Volta until at least 2018

So I guess it should be #WaitForVolta2018!!
Always quick to jump to false conclusions; stock evaporated because Vega is one of the worst-stocked GPU launches in history.

I'd buy one to test if they weren't sold out instantly because demand is so damn high.
Gibbo, OC UK

I have to say that I wasn't expecting this type of anti-consumer behavior from AMD. The reason that they are apparently giving is to scare off miners by forcing people to buy 'Radeon Packs', but this is something that could easily be solved by limiting standalone GPUs to one per customer, and it is basically nothing more than a glorified excuse. It is an especially low blow because AMD stated that the entire point of the Vega delay was to accumulate stock for launch, and yet we saw their entire inventory evaporate within minutes, making it one of the worst-stocked GPU launches in history. Consumers will now pretty much be forced to buy Radeon Packs at gunpoint. One thing is for sure, however: every reviewer needs to update their conclusions based on the new MSRPs.
Adding the link to the OC3D article about the price increases (the other one was missing a 1 at the end, so it didn't go to the article).
https://www.overclock3d.net/news/gpu_displays/amd_s_rx_64_launch_pricing_was_only_for_early_sales/1
Not a good sign for pricing, but it is worth noting that AMD has not officially responded to questions about whether the $600 pricing for the base model is accurate or permanent.
More BS brought to you by the same etailer that played a part in the "100MH/s" mining rumor..
Ask yourself who has more to gain: AMD, by stating pricing would be A and instead turning it into C, or an etailer who can sell as many cards as possible as quickly as possible, then bump pricing by $100 and claim "it's AMD" (all while pocketing the difference)?!