Blind Test - RX Vega FreeSync vs. GTX 1080 TI G-Sync

I'm here to jump on the FreeSync bandwagon too. I had to play CS:GO on a non-FreeSync monitor at my buddy's house and I wanted to jump out the window.

It's like going from a 144 Hz monitor back to a 60 Hz monitor, but maybe even more important.
 
I saw adaptive sync best described as "it's not magic, you're just getting all the frames you already paid for".

Get all dem frames. Get em.
 
OK AMD. Release a 40" 4K 120Hz Freesync monitor to pair with this and I may entertain your shenanigans.
Whoever is the first to release such a monitor, FreeSync or G-Sync, has my money.
 
OK AMD. Release a 40" 4K 120Hz Freesync monitor to pair with this and I may entertain your shenanigans.
Whoever is the first to release such a monitor, FreeSync or G-Sync, has my money.

It will be a while before anything like that exists. 4K 120 Hz is beyond what a single connector can carry right now, and using two display connections for a single monitor is never a great situation for gaming.
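The arithmetic behind that bandwidth claim is easy to sanity-check. A minimal sketch (my own numbers, assuming 8-bit RGB and ignoring blanking overhead, so the real requirement is somewhat higher):

```python
# Back-of-envelope check of the single-cable bandwidth claim.
# Assumptions (mine, not the poster's): 24 bpp, no blanking overhead.
def required_gbps(width, height, refresh_hz, bits_per_pixel=24):
    return width * height * refresh_hz * bits_per_pixel / 1e9

need = required_gbps(3840, 2160, 120)  # raw pixel data for 4K at 120 Hz
dp12_gbps = 17.28  # DisplayPort 1.2 (HBR2) effective data rate
dp13_gbps = 25.92  # DisplayPort 1.3/1.4 (HBR3) effective data rate

print(f"4K@120 needs ~{need:.1f} Gbps raw; DP 1.2 carries {dp12_gbps} Gbps")
```

Roughly 23.9 Gbps raw already exceeds DP 1.2, which is what shipping monitors used at the time; HBR3 gets there on paper, but only once panels and GPUs with it actually exist.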
 
There's something else: from what I've read, GSync is better at lower frame-rates. In this test, everything was at very high frame-rates.

This is not true.

You'd be right if you said most G-Sync monitors are better than most FreeSync monitors. Because FreeSync is an open standard, monitor makers can do whatever they want, so some ship with a garbage FreeSync Hz range. G-Sync is controlled by NVIDIA, so the standards are higher.

But there are FS monitors with the full 30-144 Hz FreeSync range, with Low Framerate Compensation (same as G-Sync).

With FS you just need to do a bit of research before buying, and it will save you a LOT of money vs. G-Sync.
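For what it's worth, the LFC idea mentioned above is simple to sketch: when the game drops below the monitor's minimum VRR rate, the driver repeats each frame enough times to bring the effective refresh back into range. A rough illustration (not AMD's actual algorithm; the 30-144 Hz range is just the example from above):

```python
# Toy model of Low Framerate Compensation (LFC): below the monitor's
# minimum VRR rate, each frame is scanned out multiple times so the
# effective refresh stays inside the supported range.
def lfc_refresh(fps, vrr_min=30, vrr_max=144):
    if fps >= vrr_min:
        return fps, 1  # already in range: one scanout per frame
    multiple = 1
    # keep doubling/tripling while we're still below the floor
    # and the next multiple would still fit under the ceiling
    while fps * (multiple + 1) <= vrr_max and fps * multiple < vrr_min:
        multiple += 1
    return fps * multiple, multiple

print(lfc_refresh(25))  # 25 fps: each frame shown twice, panel runs at 50 Hz
```

This is also why a wide FS range matters: a 48-75 Hz panel can't do this trick for much of the low end, while a 30-144 Hz one can.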
 
It will be a while before anything like that exists. 4K 120 Hz is beyond what a single connector can carry right now, and using two display connections for a single monitor is never a great situation for gaming.

I'm more worried about gaming monitor makers never going above 32" for 4K.
Samsung will have to do it for them.
 
I saw adaptive sync best described as "it's not magic, you're just getting all the frames you already paid for".

Get all dem frames. Get em.

Basically, this.

It's "amazing" relatively, but it should just be the standard.

Sub-60 Hz with G-Sync, I only want to punch something a little less than I did with V-sync.


Sadly, I struggle to play games below 90 Hz on KB+M anymore; don't have time for that shit.
 
Interesting video & results. Honestly, it makes me consider FreeSync once I upgrade, which I hadn't before.

Kyle, are those wafers on your wall at the beginning of the video? What are they?
 
DOOM is just interesting because it is an ungodly (pun intended) well-optimized title for both platforms. Calling it an AMD title is silly.

I would of course look forward to seeing how this card runs other things when not locked to 100 Hz and without things like motion blur, but it is good to know that with a good title it does have chops.

Doom was also showcased first by NVIDIA on their stage at the Pascal launch. Its Vulkan implementation was also first optimized for NVIDIA GPUs. That wasn't long ago; don't say you guys forgot.

AMD optimizations came later.
 
I think it's a great game to showcase the new AMD card. Nothing wrong with picking it. It's an obvious choice since most of your testers / friends have extensive history with Doom / FPS games.

I just recently picked up an Asus G-Sync 144 Hz 24" and I am completely blown away by how smooth and fluid gameplay is. Too bad I cannot use it. 24" is like gaming on a postage stamp to me.

I should point out that out of the entire video I really, REALLY liked you pointing out the $300 cost difference. I think that's a very key point, if not THE point overall.
 
Adaptive Framerate monitors are the shit.

The End.

Seriously I will never game on something that doesn't have Freesync or Gsync again. It's amazing.

If it's that important and the gameplay experience is superior with FS/GS, why is it that reviewers never actually talk about that in the end? They always show FPS charts and say GPU X or Y is better because of factors like price, thermals, and power. No serious reviewer has ever placed real emphasis on what matters most: the actual gameplay experience, and this has to include the monitor, since it's the final output.

To give an example, GTX 1060 vs RX 480. Which can deliver the better gaming experience for the money? GPUs alone on regular monitors, they are pretty close. Add FS/GS into the mix and suddenly it's not even a close contest.
 
This wasn't meant to be a review, guys. Just some guys playing some games on two nearly identical systems to see if they could notice a difference.

Just because there are no numbers on a graph doesn't make this approach any less valid. As with SLI and CrossFire, testing has shown that actual gameplay in front of real players quickly reveals input lag, tearing, and other baddies. How smooth a game feels is important, something numbers rarely demonstrate.

Spoiler alert: RX Vega is slower than 1080ti.
 
I can save $300.00 anyway. I will not waste money on G-Sync. And for FreeSync, that's AMD. And I have been an Nvidia person for years!
 
I'm still waiting patiently for this to launch to go with my XR342CK.
 
So what's the takeaway here, that any video card is fine if you aren't trying to target 30, 60, or 120 with zero tolerance for anything in between? The real story is in the syncing technologies.
 
Nice to finally put a face to Dan_D's name.

I think Kyle chose some testers so we'd stop calling him the old fellow in the videos he appears in.

Took the time and started a $10 Patreon pledge.

Now please open a job spot for a monitor reviewer, because there are so many good monitors in North America, but all the good monitor review sites are on the other side of the pond.
 
I can save $300.00 anyway. I will not waste money on G-Sync. And for FreeSync, that's AMD. And I have been an Nvidia person for years!
If you can honestly say that you think G-Sync is a waste of money, then either you haven't tried it or you're not a gamer. Going back to non-G-Sync is literally painful. I can't speak for FreeSync, having never tried it, but it looks like AMD has stepped it up with Vega.
 
This is my 2c, but Vega could probably throw a wrench into the "FreeSync is better because it's cheaper" argument.

Vega + FreeSync is indeed cheaper than 1080 + G-Sync.

But if you crossfired/SLI'ed them, that difference largely disappears due to the higher cost of Vega.

Also, that difference does not apply to a 1080 Ti user, since an equivalent AMD GPU does not exist.

I'd say the only true purchasing factor that would tip it one way or the other is monitor selection, which is admittedly in FreeSync's favor, at least in my case. My current ideal single-monitor solution is a 32" 1440p 144 Hz panel, which only exists with FreeSync, with no G-Sync versions available.

If you can honestly say that you think G-Sync is a waste of money, then either you haven't tried it or you're not a gamer. Going back to non-G-Sync is literally painful. I can't speak for FreeSync, having never tried it, but it looks like AMD has stepped it up with Vega.

He isn't saying G-Sync is a waste of money; he is saying G-Sync's $300 cost over FreeSync is a waste of money.
 
Don't assume that every question is an attack; a lot of the time they're just questions.

Scientifically speaking... that is a very bad assumption. In my undergrad psych course we reviewed a few papers whose results strongly indicated that most questions were loaded in an attempt to prove somebody wrong on a point they asserted.
 
If it's that important and the gameplay experience is superior with FS/GS, why is it that reviewers never actually talk about that in the end? They always show FPS charts and say GPU X or Y is better because of factors like price, thermals, and power. No serious reviewer has ever placed real emphasis on what matters most: the actual gameplay experience, and this has to include the monitor, since it's the final output.

To give an example, GTX 1060 vs RX 480. Which can deliver the better gaming experience for the money? GPUs alone on regular monitors, they are pretty close. Add FS/GS into the mix and suddenly it's not even a close contest.

I would say it's because it blurs the lines of performance (no pun intended). It's more of a subjective thing than something that can be measured like FPS.

On a non-adaptive monitor, I can feel when the framerate goes below 60, I mean nearly instantly. Therefore, I need a GPU that can put out above 60 FPS all the time. With FreeSync, as long as I'm inside the monitor's range, it is flawlessly smooth. So I don't need a GPU that can run 60+ consistently to still get a great experience.

But that isn't something that can be measured.
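That "feel" does have a measurable root, though. With plain V-sync, a finished frame waits for the next 60 Hz vblank, so at ~50 fps the flip intervals alternate between one and two refresh periods. A toy model (my own sketch: constant render time, no buffering tricks):

```python
import math

# With V-sync, a finished frame is flipped at the next vblank
# (every ~16.7 ms at 60 Hz). At 50 fps the flips land unevenly,
# so flip intervals mix 16.7 ms and 33.3 ms: visible judder.
def vsync_flip_intervals(render_ms, refresh_ms=1000 / 60, n=6):
    finish, prev_flip, intervals = 0.0, 0.0, []
    for _ in range(n):
        finish += render_ms
        flip = math.ceil(finish / refresh_ms) * refresh_ms  # next vblank
        intervals.append(flip - prev_flip)
        prev_flip = flip
    return intervals

print([round(t, 1) for t in vsync_flip_intervals(20.0)])  # 50 fps on 60 Hz
```

With adaptive sync the display simply waits for the frame, so every interval would be a steady 20 ms instead of that mix. The mix is what you feel; a plain FPS counter averages it away.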
 
But if you crossfired/SLI'ed them, that difference largely disappears due to the higher cost of Vega.

I thought the whole reason behind Freesync was to be able to game with lesser hardware. Seems like Crossfire would fly in the face of what that's all about.
 
I thought the whole reason behind Freesync was to be able to game with lesser hardware. Seems like Crossfire would fly in the face of what that's all about.
The exact same argument can be made for G-Sync, and that argument is STILL in nVidia's favor due to the existence of the 1070, while AMD has nothing between Vega and the RX 580.

Things only start to turn back in AMD's favor in the RX 580's range.
 
AMD is desperate; they are talking about FreeSync to deflect attention away from the fact that they can't even compete with nVidia's second-best card, which has been on the market for over a year now. Vega needs to be overclocked and water-cooled to beat a GTX 1080, and at that point Vega is consuming over 440 W (!). Normally I don't care about power consumption, but that is just nuts!
 
I would say it's because it blurs the lines of performance (no pun intended). It's more of a subjective thing than something that can be measured like FPS.

On a non-adaptive monitor, I can feel when the framerate goes below 60, I mean nearly instantly. Therefore, I need a GPU that can put out above 60 FPS all the time. With FreeSync, as long as I'm inside the monitor's range, it is flawlessly smooth. So I don't need a GPU that can run 60+ consistently to still get a great experience.

But that isn't something that can be measured.

I would add, though, that adaptive sync cannot overcome resolution, so at large resolutions (moving above 6-8 megapixels) you still need the power required to even maintain 60 FPS.
 
AMD is desperate; they are talking about FreeSync to deflect attention away from the fact that they can't even compete with nVidia's second-best card, which has been on the market for over a year now. Vega needs to be overclocked and water-cooled to beat a GTX 1080, and at that point Vega is consuming over 440 W (!). Normally I don't care about power consumption, but that is just nuts!
Seriously, I get that you aren't interested in Vega, but is it necessary to post the exact same negative drivel over and over? It's a GPU that is stronger than other cards in its own stack. It doesn't best Nvidia's line. If you need Nvidia's performance level, then buy Nvidia. If you hate paying for Nvidia, well, tough. Pick one.
 
Great comparison, Mr. Bennett. Thank you for your time and effort. Do any of you guys know what's coming out late this summer? Rumor had it that Samsung was going to release some new panels, as well as Asus. I'd like to get a 34-35" FreeSync panel, preferably 144 Hz. Any suggestions? Looks like the 35" FreeSync Acer is selling well.
Kyle, feel better. We need this kind of content more often. PcPer and Nexus or whatever it's called are just not Hard core :D
 
Seriously, I get that you aren't interested in Vega, but is it necessary to post the exact same negative drivel over and over? It's a GPU that is stronger than other cards in its own stack. It doesn't best Nvidia's line. If you need Nvidia's performance level, then buy Nvidia. If you hate paying for Nvidia, well, tough. Pick one.


The problem is that its own stack competes in the marketplace with nV's products. Doing comparisons with FreeSync and G-Sync on is the same thing as Polaris with frame-rate locks on to show power consumption, when in reality Polaris only matches last-gen nV products on that metric. This time they are trying to lock down performance with syncing to give people a real reason to buy their product because of the price difference. Again, I shouldn't need to explain why someone buys a product in this type of market.
 
AMD is desperate; they are talking about FreeSync to deflect attention away from the fact that they can't even compete with nVidia's second-best card, which has been on the market for over a year now. Vega needs to be overclocked and water-cooled to beat a GTX 1080, and at that point Vega is consuming over 440 W (!). Normally I don't care about power consumption, but that is just nuts!

I see by your post logs that you love to hate on Zen and Vega. You stopped on Zen because you were wrong, and now you're here to pick on Vega. FreeSync and G-Sync make a huge difference in how a game feels; they make a monitor without them feel like junk. Nothing wrong with AMD or Nvidia pointing that out to their customers. As for when they release tech, well, it gets released when it's ready, not because you want it right this second. Vega never consumed over 440 watts except in an extreme overclock of Vega FE, and let me tell you, the 1080 Ti sucks some juice when you turn up the speed as well. Simple fact: if you overclock, you can't go on the forums and then bitch about wattage use; you decided to run it out of spec. Also, last time I checked it was Kyle running the test, not AMD. So no, AMD is not trying to deflect anything; it was awesome they let Kyle actually have a crack at it before the NDA lifts.
 
Great comparison, Mr. Bennett. Thank you for your time and effort. Do any of you guys know what's coming out late this summer? Rumor had it that Samsung was going to release some new panels, as well as Asus. I'd like to get a 34-35" FreeSync panel, preferably 144 Hz. Any suggestions? Looks like the 35" FreeSync Acer is selling well.
Kyle, feel better. We need this kind of content more often. PcPer and Nexus or whatever it's called are just not Hard core :D


I've been looking for a 34-inch HDR as well, but I'm waiting for prices to drop; right now they're kind of ridiculous for gaming monitors.
 
You're missing the point.

If you'd consider one of these $700 cards and don't have FreeSync or G-Sync, you're leaving a positive experience on the table.

And it'll be a positive experience as long as you're in the synced range! You don't need 90+ FPS to feel smooth. The sync tech makes it feel butter smooth all the way down to the minimum synced FPS.

The sync tech changes the rules of the game.


It'd be interesting to repeat the test next week with the sync tech turned off and see if opinions stay the same.

This, totally. In my own very subjective experience, the sync tech really is pretty good. I have an Asus 1440p 144 Hz FreeSync monitor. I was way more interested in the 144 Hz and the IPS panel than the sync, be it Free or G, so I picked up the cheaper monitor, which was FreeSync. I paired it with an open-box 1070 (TY Microcenter) and dialed in settings. I could play most things in the 90+ FPS range. Super happy. Great GPU.

I then sold the 1070 for a small profit and thought, hey, I have this FreeSync panel, let's give it a go. I picked up a Nitro RX 480 back when they went for retail and had to live with 60-90 FPS on dialed settings. However, the sync range on my panel is 30-90 Hz, so I capped the frame rate in WattMan at 90. Anyway: fewer FPS, but smoother "feeling" using the sync tech in my panel, and the GPU was $100 less. I have no experience with G-Sync, but I'm guessing it's good too; it just costs more.

It really does give a nice lift to the weaker GPUs. Even at 100+ FPS I would occasionally feel the effect of wild frame-rate changes, some stutters, etc. The older and slower I get, smooth > fast. I love spending less and getting the same, so if AMD can pull this off I will consider Vega.


Great vid, and very interesting to see what people think in a nice conversational format. Looking forward to the full [H] Vega review!
 
I see by your post logs that you love to hate on Zen and Vega. You stopped on Zen because you were wrong, and now you're here to pick on Vega. FreeSync and G-Sync make a huge difference in how a game feels; they make a monitor without them feel like junk. Nothing wrong with AMD or Nvidia pointing that out to their customers. As for when they release tech, well, it gets released when it's ready, not because you want it right this second. Vega never consumed over 440 watts except in an extreme overclock of Vega FE, and let me tell you, the 1080 Ti sucks some juice when you turn up the speed as well. Simple fact: if you overclock, you can't go on the forums and then bitch about wattage use; you decided to run it out of spec. Also, last time I checked it was Kyle running the test, not AMD. So no, AMD is not trying to deflect anything; it was awesome they let Kyle actually have a crack at it before the NDA lifts.

Kyle was under NDA and his choices were limited; that is how AMD controlled the test. For the rest of it, Kyle made sure there was no funny business going on. He is probably not at liberty to discuss anything outside of what was said in those videos, which was probably the scope of the NDA.
 
The problem is that its own stack competes in the marketplace with nV's products. Doing comparisons with FreeSync and G-Sync on is the same thing as Polaris with frame-rate locks on to show power consumption, when in reality Polaris only matches last-gen nV products on that metric. This time they are trying to lock down performance with syncing to give people a real reason to buy their product because of the price difference. Again, I shouldn't need to explain why someone buys a product in this type of market.

'Cause bar graphs are life? There's more to life than max performance; frame times are far more important.
 
Kyle was under NDA and his choices were limited, that is how AMD controlled the test.
I could have picked any game I wanted to. AMD did not have control over the testing. The overall idea for testing was theirs, and agreed to, but I could have used any game that I wanted.
 
'Cause bar graphs are life? There's more to life than max performance; frame times are far more important.


Everything is important, but subjective preferences are user preferences; my experience won't be the same as yours, nor will yours match another person's. You can't rely on subjective analysis unless you were part of that test.
 
'Cause bar graphs are life? There's more to life than max performance; frame times are far more important.
The computer gurus would take frame time.

Your average Joe/Jane gamer probably thinks the two are the same, and all the marketing and tons of reviews concentrate only on FPS data, not frame-time graphs.
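A quick illustration of why the distinction matters: two runs with an identical average FPS can feel completely different. The numbers below are made up for the example:

```python
# Two hypothetical frame-time captures: same total time and the same
# average FPS, but one has six 59 ms hitches the FPS number hides.
def summarize(frame_times_ms):
    avg_fps = 1000.0 * len(frame_times_ms) / sum(frame_times_ms)
    return round(avg_fps, 1), max(frame_times_ms)

smooth = [16.7] * 60                # every frame ~16.7 ms
stutter = [12.0] * 54 + [59.0] * 6  # same ~1002 ms total, with hitches

print(summarize(smooth))
print(summarize(stutter))
```

Both summaries report roughly 60 average FPS, but the worst frame in the second run takes 59 ms, which is exactly the kind of stutter an FPS bar graph averages away.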
 
Doom was a fine choice, as it's well optimized on both platforms. Maybe they are putting up a stink because it doesn't run like shit on AMD?
Well, DOOM is a game where Vega has enough horsepower to hit the max frames required in the blind test, but things could change if we moved on to other, more demanding titles, maybe TW3? (I know it's an older game now, but so far every other game that runs worse than TW3 seems to get the "unoptimized POS" label attached.)

I'd be interested in seeing whether the blind test still holds for titles like Andromeda or DX:MD.
 