Blind Test - RX Vega FreeSync vs. GTX 1080 Ti G-Sync

Well, DOOM is a game where Vega has enough horsepower to hit the max frame rate required in the blind test, but things could change if we moved on to other, more demanding titles, like maybe TW3? (I know it's an older game now, but so far every other game that runs worse than TW3 seems to have the "unoptimized POS" label attached.)

I'd be interested in seeing whether the blind test still holds for titles like Andromeda or DX:MD.


Any of the marquee games would be good, but we really need a set of games, not just one. Great, it runs well in Doom, but it might run like a dog in something else. This is why the more gaming benchmarks, the better; it's always good to have a solid pool to give gamers an idea of what is going on and what is best for their needs. I'm sure Kyle and [H] will put it through the tests.
 
Well, at least Vega can run Painkiller. I'm not sure if the AMD marketing guys are smart enough to find the painkiller weapon, though. Great game. They can play some 4K Painkiller and slather on the Capsaicin while they're at it. Pain-free living. Poor AMD will be in a world of hurt; I guess they can take solace in the fact that they can use a FreeSync monitor. Maybe Painkiller will run smoother.


Damn, that's an old game, lol. I remember being at GDC (or E3) in 2005 and watching the best in the world go head-to-head in that game!
 
I see by your post logs that you love to hate on Zen and Vega. You stopped on Zen because you were wrong, and now you're here to pick on Vega. FreeSync and G-Sync make a huge difference in how a game feels; they make a monitor without them feel like junk. There's nothing wrong with AMD or Nvidia pointing that out to their customers. As for when they release tech, well, it gets released when it's ready, not because you want it right this second. Vega never consumed over 440 watts except in extreme overclocking of the Vega FE, and let me tell you, the 1080 Ti sucks some juice when you turn up the speed as well. The simple fact is that if you overclock, you can't go on the forums and then bitch about wattage; you decided to run it out of spec. Also, last time I checked, it was Kyle running the test, not AMD. So no, AMD is not trying to deflect anything; it was awesome that they let Kyle actually have a crack at it before the NDA lifts.
That's fair; I was wrong about Ryzen, you are right about that. I was also wrong about Polaris. AMD botched both launches but managed to recover quickly. Ryzen had poor gaming performance, which they fixed with faster memory and updated motherboard BIOSes. Polaris' PCI-E power consumption was out of spec, and they fixed that with the aftermarket cards.

I could also be wrong about Vega. I would *like* to be wrong about Vega. I would love to see AMD pull a unicorn driver out of their yin-yang; I just don't see it happening. I am hopeful that Vega will turn into a reasonably competitive card once we see aftermarket cards with good cooling. We could also see a disruptive cut-down Vega card akin to the 7950 or 290. However, none of that is going to happen until there is widespread availability of the cards. In its current form, Vega is a turkey.

You can try to take away from my credibility, but I'm just telling it like it is. I have been at this game for a very long time. Sometimes I am wrong, which I just admitted; that doesn't mean too much at the moment. I think a lot of people were underwhelmed by the Ryzen and Polaris launches. If you don't want to read the honest opinions of an experienced PC hardware enthusiast, I suggest you stop reading my posts. I'm not going to change. I'm not the type of person who follows the herd mentality, and I can see right through AMD's marketing bullshit.
 
Well, DOOM is a game where Vega has enough horsepower to hit the max frame rate required in the blind test, but things could change if we moved on to other, more demanding titles, like maybe TW3? (I know it's an older game now, but so far every other game that runs worse than TW3 seems to have the "unoptimized POS" label attached.)

I'd be interested in seeing whether the blind test still holds for titles like Andromeda or DX:MD.

Not gonna get into the politics of what is optimized, but yeah, as I posted earlier, I'd like to see them apply this testing to other games too.

That's fair; I was wrong about Ryzen, you are right about that. I was also wrong about Polaris. AMD botched both launches but managed to recover quickly. Ryzen had poor gaming performance, which they fixed with faster memory and updated motherboard BIOSes. Polaris' PCI-E power consumption was out of spec, and they fixed that with the aftermarket cards.

I could also be wrong about Vega. I would *like* to be wrong about Vega. I would love to see AMD pull a unicorn driver out of their yin-yang; I just don't see it happening. I am hopeful that Vega will turn into a reasonably competitive card once we see aftermarket cards with good cooling. We could also see a disruptive cut-down Vega card akin to the 7950 or 290. However, none of that is going to happen until there is widespread availability of the cards. In its current form, Vega is a turkey.

You can try to take away from my credibility, but I'm just telling it like it is. I have been at this game for a very long time. Sometimes I am wrong, which I just admitted; that doesn't mean too much at the moment. I think a lot of people were underwhelmed by the Ryzen and Polaris launches. If you don't want to read the honest opinions of an experienced PC hardware enthusiast, I suggest you stop reading my posts. I'm not going to change. I'm not the type of person who follows the herd mentality, and I can see right through AMD's marketing bullshit.

Ten posts since 2012... and when you finally post, it's all hyper-negative and wrong. Yeah, OK.
 
I thought the whole reason behind FreeSync was to be able to game with lesser hardware. Seems like CrossFire would fly in the face of what that's all about.
Unless you have a high resolution or an Eyefinity setup.
 
Not gonna get into the politics of what is optimized, but yeah, as I posted earlier, I'd like to see them apply this testing to other games too.



Ten posts since 2012... and when you finally post, it's all hyper-negative and wrong. Yeah, OK.
You think I'm wrong? About what? What were your honest thoughts on Ryzen and Polaris when they first launched? When I looked at Polaris, I saw a card that performed slower overall than a GTX 1060 while consuming more power, with no overclocking headroom. Then there was the PCI-E power issue. Sorry, I'm not going to put hardware that violates electrical specs into my system. As for Ryzen, it had very underwhelming gaming performance; not the type of thing I would run in my system. Now that AMD has basically worked miracles with their drivers, Polaris is pretty good. Same with Ryzen: what AMD did with the faster memory support gave it a shocking boost, IMO. Faster RAM rarely makes a difference in gaming.

So again, I do apologize if my honesty bothers you. I will tell you that your baselessly attacking my credibility bothers me. How about you post some facts rather than stirring up an argument for no reason?
 
Well, DOOM is a game where Vega has enough horsepower to hit the max frame rate required in the blind test, but things could change if we moved on to other, more demanding titles, like maybe TW3? (I know it's an older game now, but so far every other game that runs worse than TW3 seems to have the "unoptimized POS" label attached.)

I'd be interested in seeing whether the blind test still holds for titles like Andromeda or DX:MD.
Dang, man. Are you paying attention? As long as it's in the sync range of the monitor's FreeSync setting, it will feel smooth. For most FreeSync monitors, that means 40Hz (fps) and up!
 
Chill, guys. It is just an interesting test showing that more people preferred the system set up that way, at that time, with that game.
The aftermath screaming of bias sounded almost as bad as an audio equipment blind test, lol.
 
While I am not one who normally enjoys these types of reviews (just give me the numbers!), it made me smile to see that Vega can offer a comparable gaming experience.

With most of the bias removed (I say most because someone will find some somewhere...) and without using an AMD-run lab, to me, that removed any doubt that AMD had provided the setup and tuned it in such a way as to show their product winning by a mile.

I patiently await the full review.

Thank you Kyle.



1080 Ti owner.
 
When TVs start coming with FreeSync and/or G-Sync, then I'll care about tests like these. Granted, that population is as small as or smaller than the one currently using FreeSync/G-Sync.

Until that time all I really care about is raw performance, especially at 4K.

So as already stated: "Show me the numbers!"
 
When TVs start coming with FreeSync and/or G-Sync, then I'll care about tests like these. Granted, that population is as small as or smaller than the one currently using FreeSync/G-Sync.

Until that time all I really care about is raw performance, especially at 4K.

So as already stated: "Show me the numbers!"
I'm in the same boat. I'm gaming on a 75" 4K HDR TV, and I would take HDR over FreeSync any day. It requires a bit of GPU overkill, but I try to stay pegged at 60fps. Worth it, IMO. There is a reason the gaming consoles have been so successful: playing games on a large TV is a lot of fun.
 
In (my) theory, G-Sync is better because there is a processor inside the monitor handling the feed from the GPU, whereas FreeSync has none. I believe there will be scenarios where G-Sync > FreeSync because of that, but I also believe that G-Sync is overpriced. FreeSync is indeed a viable option if it does what G-Sync can do in most games, as long as it isn't only for a few select games.

Those reviewers are long-time gamers, and the point wasn't about static screenshots or min/max/avg frame rates, but gameplay. The reviewers' comments were specific: they were able to stay on target better on system 2, while system 1 appeared to maintain a more consistent frame rate. I too love a consistent frame rate and believed I would go for a G-Sync monitor, but after this video, I wonder whether it is wise to feed the beast whatever it asks for, or better to give the hard-working horse a chance. Should I go with G-Sync, I will have to stick with Nvidia until the monitor breaks, knowing that it won't be easy for AMD to adopt G-Sync. Should I go with FreeSync, I will have to stick with AMD until the monitor breaks, or until Nvidia chooses to support FreeSync.

I am aware that my opinion is subjective, so if you still think that G-Sync is better and worth the extra cost, then go for it. If FreeSync rocks your boat and you like money well spent, then go for it.

All in all, great work [H], thank you.
 
In (my) theory, G-Sync is better because there is a processor inside the monitor handling the feed from the GPU, whereas FreeSync has none. I believe there will be scenarios where G-Sync > FreeSync because of that, but I also believe that G-Sync is overpriced. FreeSync is indeed a viable option if it does what G-Sync can do in most games, as long as it isn't only for a few select games.

Those reviewers are long-time gamers, and the point wasn't about static screenshots or min/max/avg frame rates, but gameplay. The reviewers' comments were specific: they were able to stay on target better on system 2, while system 1 appeared to maintain a more consistent frame rate. I too love a consistent frame rate and believed I would go for a G-Sync monitor, but after this video, I wonder whether it is wise to feed the beast whatever it asks for, or better to give the hard-working horse a chance. Should I go with G-Sync, I will have to stick with Nvidia until the monitor breaks, knowing that it won't be easy for AMD to adopt G-Sync. Should I go with FreeSync, I will have to stick with AMD until the monitor breaks, or until Nvidia chooses to support FreeSync.

I am aware that my opinion is subjective, so if you still think that G-Sync is better and worth the extra cost, then go for it. If FreeSync rocks your boat and you like money well spent, then go for it.

All in all, great work [H], thank you.
In fairness, you have just been proven wrong by Kyle's test. IMO.
 
When TVs start coming with FreeSync and/or G-Sync, then I'll care about tests like these. Granted, that population is as small as or smaller than the one currently using FreeSync/G-Sync.

Until that time all I really care about is raw performance, especially at 4K.

So as already stated: "Show me the numbers!"
TVs will start coming with FreeSync very soon; the Xbox Scorpio has FreeSync. G-Sync has lost this war. The sooner Nvidia realizes they should start supporting FreeSync, the better for everyone.
 
I think this was a great test and a surprising conclusion. Nice job.

Also, DOOM was the perfect game to test. It's well optimized and smooth in general, so if Vega failed, it would be a huge red flag.

At least we know Vega is in the realm, and I think we all realize it's not going to actually beat the 1080 Ti in FPS, but it may be close enough if people can't tell the difference.

I have a 27" G-Sync monitor and it's awesome, but I find I play more on my 40" 4K TV (or 125" 1080p projector) just because the size makes it more immersive. In a perfect world we could have everything, but that's probably years away.
 
I'm in the same boat. I'm gaming on a 75" 4k HDR TV. I would take HDR over Freesync any day. It requires a bit of GPU overkill but I try to stay pegged at 60fps. Worth it IMO. There is a reason why the gaming consoles have been so successful. Playing games on a large TV is a lot of fun.
And what "sync" technology have you tried to dismiss it so easily?
 
Interesting... I already have a G-Sync monitor (only at 1440p), so if I moved over to an AMD GPU I would lose that benefit. Still, I'm really pulling for a super-competitive card from AMD to keep prices down. Also, the first setting I change in new games is turning off motion blur. Why have a high-end GPU and then blur up your motion? Seems to me like we have these high-end systems to avoid motion blur.
 
And what "sync" technology have you tried to dismiss it so easily?
I'm old school; I played games on 100Hz+ CRT monitors back in the day, with GPU hardware powerful enough to run the games at 100fps. I will tell you, HDR makes a much bigger difference than a framerate beyond 60fps. 4K resolution also makes a big difference. I'm not seeing any HDR-enabled PC monitors with FreeSync; please correct me if I'm wrong. I've been playing Mass Effect Andromeda in HDR, and it is visually stunning, beyond anything I have seen.

The other thing is that, so long as you have enough GPU power and can stay at or above 60fps, FreeSync and the like become somewhat moot. I don't think you can game beyond 60fps on a 4K display, either. Please correct me if I am wrong.

Maybe I like nice graphics too much, but 4K HDR gaming on a 75" TV looks stunning. It's a bigger jump than I have ever seen from any display technology.
 
Interesting... I already have a G-Sync monitor (only at 1440p), so if I moved over to an AMD GPU I would lose that benefit. Still, I'm really pulling for a super-competitive card from AMD to keep prices down. Also, the first setting I change in new games is turning off motion blur. Why have a high-end GPU and then blur up your motion? Seems to me like we have these high-end systems to avoid motion blur.

The first part, about closed vs. open standards: yeah, the real-world effect of it sucks. That's why closed systems exist, to keep customers within their ecosystem.

The second part, about blur: I kind of dislike blur as well, and I assume you, like me, know what it is and simply prefer not to have it. The reality is that motion blur is a perfectly natural effect of our vision. We see motion blur in everyday life and do not notice it, but in a game it becomes apparent because we don't expect blurring there. I don't prefer it, but I don't really mind it either. Some out there like it, as the auteur is trying to mimic real life.
 
Well... the test was odd, but I appreciate that Kyle was attempting to replicate the AMD demonstration.

1.) I think the pool is way too tiny to really draw any conclusions from.

2.) All participants basically said they couldn't tell the difference between the systems when posed question A, but when pressed by question B to make a choice, half of them gave a somewhat doubtful recommendation in favor of one monitor/card combo over the other. Question B, following question A, gives the appearance that a clear choice MUST be made. It is really hard to take the question B data seriously in this circumstance, though the question A data is pretty legit.

I think the clearest takeaway is that all participants found both systems to be essentially identical in real-world performance.

Cool. I guess we just need to see Vega's price point. I really have no interest in FreeSync/G-Sync, so these comparisons worry me about where Vega will land in the market. I was really hoping it would be 1080 performance at a lower price, but all this FreeSync/G-Sync monitor/GPU pricing jargon is a bit disappointing and hard to dissect.
 
I'm old school; I played games on 100Hz+ CRT monitors back in the day, with GPU hardware powerful enough to run the games at 100fps. I will tell you, HDR makes a much bigger difference than a framerate beyond 60fps. 4K resolution also makes a big difference. I'm not seeing any HDR-enabled PC monitors with FreeSync; please correct me if I'm wrong. I've been playing Mass Effect Andromeda in HDR, and it is visually stunning, beyond anything I have seen.

The other thing is that, so long as you have enough GPU power and can stay at or above 60fps, FreeSync and the like become somewhat moot. I don't think you can game beyond 60fps on a 4K display, either. Please correct me if I am wrong.

Maybe I like nice graphics too much, but 4K HDR gaming on a 75" TV looks stunning. It's a bigger jump than I have ever seen from any display technology.

As I expected. How can you make a comparison between two things and declare one is better when you haven't tried both?

You realize that's absurd?!?!

The sync technologies do matter. Even if you typically have 60fps, you won't always have 60fps. There are dips and hesitations and frame tearing you may not even know are there... until they aren't.

BTW, I'm an older gamer too, as were a bunch of the guys in that vid. I've been gaming on PC since about 1989, on a Tandy 1000. I don't miss CRTs for a moment; they always gave me headaches and eye strain.
 
As I expected. How can you make a comparison between two things and declare one is better when you haven't tried both?

You realize that's absurd?!?!

The sync technologies do matter. Even if you typically have 60fps, you won't always have 60fps. There are dips and hesitations and frame tearing you may not even know are there... until they aren't.

BTW, I'm an old gamer too, as were a bunch of the guys in that vid. I've been gaming on PC since about 1989, on a Tandy 1000.
I could ask you the same thing: have you experienced 4K HDR gaming on a powerful PC?

You're right, I don't always have 60fps, but those times are extremely rare; I usually dial back a few settings if I have to. 99% of the time my system is running games at 60fps. I don't disagree that FreeSync is better; it clearly is. G-Sync sucks. I can't stand closed standards, and Nvidia tends to adopt closed standards a lot more than AMD does. It's annoying and it's terrible for consumers. The real problem we have today is that AMD has been unable to address the high-end GPU market for a long time now, and it doesn't look like Vega is going to do much to change that, unfortunately.
 
Btw, you never clarified this in the article: you said you toned down image quality on the monitors to make them even. WTF does that even mean?!

Please explain!

One monitor is a VA panel, the other an IPS one. So by "toning down" I'd assume he means he made them look as close as possible.
 
Btw, you never clarified this in the article: you said you toned down image quality on the monitors to make them even. WTF does that even mean?!

Please explain!
We were just trying to get the colors to match as closely as possible, doing side-by-side calibrations. The FreeSync panel had noticeably better IQ at default.
 
I could ask you the same thing: have you experienced 4K HDR gaming on a powerful PC?

You're right, I don't always have 60fps, but those times are extremely rare; I usually dial back a few settings if I have to. 99% of the time my system is running games at 60fps. I don't disagree that FreeSync is better; it clearly is. G-Sync sucks. I can't stand closed standards, and Nvidia tends to adopt closed standards a lot more than AMD does. It's annoying and it's terrible for consumers. The real problem we have today is that AMD has been unable to address the high-end GPU market for a long time now, and it doesn't look like Vega is going to do much to change that, unfortunately.

Holy shit, that was actually a decent post from you. AMD hasn't been competitive in two years; that's not that long ago. Is Vega gonna fix that? I dunno; we've got to wait three more days (I'm guessing because of SIGGRAPH) to find out. If it is, awesome. If not, I hope it's priced appropriately.
 
Not a review: one non-demanding game, subjective, with no frametime/framerate data showing that the new version/features of FreeSync appear superior to G-Sync.
So you don't like how they have been doing real-world subjective testing all these years? Do you want Fire Strike and 3DMark to replace the [Hard] methodology?
 
cool video but my panties aren't bunched up because I walk around naked at my apartment and especially when I'm gaming. :kiss::kiss::kiss::kiss::kiss::kiss::kiss::kiss::kiss::kiss::kiss::kiss::kiss::kiss::kiss:
 
Thanks for the review, Kyle. I've been tossing around the idea of ?-sync for a while now. I've already purchased my new GPU, so that pretty much dictates which kind of monitor I'll have to invest in. I'm afraid to look at the prices of good 4K G-Sync monitors.
 
I could ask you the same thing: have you experienced 4K HDR gaming on a powerful PC?

You're right, I don't always have 60fps, but those times are extremely rare; I usually dial back a few settings if I have to. 99% of the time my system is running games at 60fps. I don't disagree that FreeSync is better; it clearly is. G-Sync sucks. I can't stand closed standards, and Nvidia tends to adopt closed standards a lot more than AMD does. It's annoying and it's terrible for consumers. The real problem we have today is that AMD has been unable to address the high-end GPU market for a long time now, and it doesn't look like Vega is going to do much to change that, unfortunately.
Does a JVC RS600 count?


Yes, I think so...

But bear in mind, I never said A was better than B while only ever trying A. That's why I'm busting your chops, my man.
 
Thanks for the review, Kyle. I've been tossing around the idea of ?-sync for a while now. I've already purchased my new GPU, so that pretty much dictates which kind of monitor I'll have to invest in. I'm afraid to look at the prices of good 4K G-Sync monitors.
No need, there aren't that many of them.
 
This"taste test" was a great idea and makes a couple good points: a) experiential evaluation is as important (if not more) than flat, disparate metrics and b) in a non-bleeding edge scenario, it's very hard to see a difference between these cards. Particularly like that a blind test like this effectively removes the ability for the participants to apply any (conscious or unconscious) confirmation bias/fanboyism/wishful thinking etc. Can't think of another evaluation that wasn't subject to some bias, either by the reviewer or commenters/participants. Doom was a decent choice here as, in any practical experiment, you want to eliminate as much variance as possible in how the actors perform and this title/engine has historically performed well on both nV and AMD cards. If I had one critique, If I was only picking a single game, I might have picked a more stressful title -- but I certainly see the utility of picking a less stressful game as well.

That said, I do feel the proof will be in the bleeding-edge testing and in the card's other attributes, e.g. noise, heat, and cost.
 
Tried it; don't need it. I have a triple 24" monitor setup. And yes, I'm a gamer: over 500 games in all, with thousands of hours in games like Tribes: Ascend (2,500 hours) and the first Halo (2,000 hours). Now I play triple-A titles.


If you can honestly say that you think G-Sync is a waste of money, then either you haven't tried it or you're not a gamer. Going back to non-G-Sync is literally painful. I can't speak for FreeSync, having never tried it, but it looks like AMD has stepped it up with Vega.
 
TBH, the results are pretty much what I expected, considering this is Doom with the additional extensions AMD has implemented (Nvidia is slowly building theirs up).
They make a pretty big difference in performance, and yes, before anyone links the old blogs: Nvidia does have some extensions for Vulkan, but they are not as critical to performance as the ones coming down now.

Just as important, it would be interesting to see Nvidia on Intel while keeping AMD as it was with the 1800X. We know Nvidia has quirks at times on the AMD platform combined with low-level APIs, especially the 1080 Ti (just look at the RoTR articles identifying where its performance was as weak as a 1070/1080 in DX12).
I wonder if the perception results would have been the same in that instance, or even if both were on Intel CPUs.

So a lot of weighing needs to be done with any conclusions, IMO, by anyone posting. *shrug*
Cheers
 
While I don't think a blind test is a bad idea, I think this one could have been set up better.
Using 3440x1440 100Hz displays is fine, so long as the game/settings used never reach 100 FPS on the faster GPU.
As soon as they do, it invalidates the test, since you're capping the performance of the faster card.
Since you used DOOM, that likely means DSR/VSR would be required to render higher than the display's native resolution to keep the framerate below 100 FPS; e.g. 5160x2160.
I would also have disabled motion blur.
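To put numbers on that 5160x2160 suggestion, here's my own back-of-envelope, assuming GPU load scales roughly with pixel count:

Code:
# Pixel-count math behind rendering at 5160x2160 on a 3440x1440 panel
native = 3440 * 1440              # 4,953,600 px
supersampled = 5160 * 2160        # 11,145,600 px
print(5160 / 3440, 2160 / 1440)   # 1.5 1.5 -> a 1.5x DSR/VSR scale per axis
print(supersampled / native)      # 2.25    -> ~2.25x the shading work per frame

That extra ~2.25x load should comfortably keep even the faster card under the 100 FPS cap.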

Dang, man. Are you paying attention? As long as it's in the sync range of the monitor's FreeSync setting, it will feel smooth. For most FreeSync monitors, that means 40Hz (fps) and up!
Variable refresh rate displays allow framerates in between the divisors of your monitor's refresh rate to remain smooth.
On a fixed 144Hz display, only V-Synced 144, 72, or 48 FPS can be truly smooth (arguably only 144 and 72...).
On a VRR 144Hz display, everything from ~48-144 FPS will be smooth.

108 FPS would be a stuttering mess on a fixed 144Hz display, since that's right in the middle of two divisors, while it would be buttery smooth on a 144Hz VRR display, since the display would be running at 108Hz.
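Here's a rough sketch of that divisor math (my own illustration, not from the post; it assumes an idealized game that finishes every frame in exactly 1/fps seconds and ignores buffering details):

Code:
from fractions import Fraction
from math import ceil

def onscreen_ms(fps, refresh_hz, vrr, frames=9):
    """How long each rendered frame stays on screen, in milliseconds."""
    if vrr:
        # VRR: the display refreshes the moment a frame is ready, so every
        # frame is shown for exactly one frame interval (1/fps seconds).
        return [round(1000 / fps, 1)] * frames
    # Fixed refresh + V-Sync: frame i (ready at i/fps seconds) can only be
    # flipped on the first refresh tick at or after that moment.
    flips = [ceil(Fraction(i * refresh_hz, fps)) for i in range(1, frames + 2)]
    return [round((b - a) * 1000 / refresh_hz, 1) for a, b in zip(flips, flips[1:])]

print(onscreen_ms(108, 144, vrr=False))  # mixes 6.9 ms and 13.9 ms frames -> judder
print(onscreen_ms(108, 144, vrr=True))   # steady 9.3 ms per frame -> smooth
print(onscreen_ms(72, 144, vrr=False))   # 72 divides 144, so a steady 13.9 ms

Same average framerate, very different pacing, which is exactly why 108 FPS only feels smooth on the VRR panel.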

VRR can only fix the stuttering caused by the framerate being out of sync with the refresh rate, though.
Low framerates are still low framerates, and they still feel terrible on a VRR display.
Where VRR helps is that it might allow you to run a game at an unlocked 85-100 FPS instead of having to cap it at 72 FPS on a 144Hz display, for example.
It doesn't mean the framerate can drop below 60 FPS and I won't notice; it's just not quite as bad.

The other thing is that, so long as you have enough GPU power and can stay at or above 60fps, FreeSync and the like become somewhat moot. I don't think you can game beyond 60fps on a 4K display, either. Please correct me if I am wrong.
The problem is that there is no hardware in existence that can keep all games above 60 FPS all the time, especially at 4K.
Even if you're only targeting 60 FPS (since it's basically impossible to hold 60 FPS constantly), VRR will play smoother and you'll have much lower latency: typically about 3 frames of latency removed vs. V-Sync, which works out to 50ms at 60 FPS.
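Quick sanity check on that figure (mine, and only an approximation, since the exact V-Sync penalty depends on the buffering setup and the engine):

Code:
frame_ms = 1000 / 60     # ~16.7 ms per frame at 60 FPS
print(3 * frame_ms)      # ~50 ms of extra input latency from ~3 buffered frames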
 
I think this was a great test and a surprising conclusion. Nice job.

Also, DOOM was the perfect game to test. It's well optimized and smooth in general, so if Vega failed, it would be a huge red flag.

At least we know Vega is in the realm, and I think we all realize it's not going to actually beat the 1080 Ti in FPS, but it may be close enough if people can't tell the difference.

I have a 27" G-Sync monitor and it's awesome, but I find I play more on my 40" 4K TV (or 125" 1080p projector) just because the size makes it more immersive. In a perfect world we could have everything, but that's probably years away.

I can't wait to know that difference :)
Having an LG 34UC88 (3440x1440, max 75Hz with FreeSync enabled) paired with a GTX 980 Ti, I'm still very doubtful about purchasing a 1080 Ti or the best AIB Vega available.
 