Blind Test - RX Vega FreeSync vs. GTX 1080 Ti G-Sync

FrgMstr
Staff member
Blind Test - RX Vega FreeSync vs. GTX 1080 Ti G-Sync - Totally unscientific and subjective testing is scorned by many, so if that gets your panties in a bunch, I suggest you stop reading. We had the opportunity to preview AMD's RX Vega this weekend, and we put it up against NVIDIA's GTX 1080 Ti, both using 100Hz FreeSync and G-Sync panels, with our testers representing 223 combined years of gaming experience.
 
Just when I was looking for a new panel and GPU.
Looks like I can also save $300 to boot.

Thanks!
 
I definitely think that motion blur should have been turned OFF for this test. I'm sure AMD and NVIDIA handle motion blur differently enough for people to like one over the other. Please redo the test with motion blur turned OFF. Thanks for being so thorough otherwise.
 
I definitely think that motion blur should have been turned OFF for this test. I'm sure AMD and NVIDIA handle motion blur differently enough for people to like one over the other. Please redo the test with motion blur turned OFF. Thanks for being so thorough otherwise.
I think AMD has the cards now.
In a previous post he said he had them briefly.
 
Did you watch the video?
Yes, since we don't know the performance of Vega, we can't tell if people preferred the Vega system because FreeSync is better than G-Sync or if Vega has higher fps or shorter frame latency.
 
An interesting approach - I appreciated the video very much. So in essence, it would appear that the differences are marginal, and if I could save $300 on a system, I would do that even without knowing which one was the cheaper one.
 
Cool video. Resolution and audio were nice and crisp on my old cheapie 1080p rig.

Will be interested in seeing the full review here when the time comes.

SOT.. I like those chip slabs hanging on the wall like foo foo mirrors. I'm gonna have to hunt some of those down.
 
Essentially the video means nothing (unless you are going out and getting either G-Sync/FreeSync and possibly a GPU at the same time to save $$$$$), although it was quite interesting to see that 3 people said #2 was better by a large margin, with what, 2 people willing to pay extra?

pretty neat.


EDIT: I'm guessing AMD wouldn't let you turn off motion blur?
 
When you include the monitor tech... I think the value proposition shifts greatly to AMD. D3 @ 4k running at those rates/settings is not a severe push, but it also isn't a joke either.
 
So ASUS MX34V FreeSync display ($720) and an ASUS PG348 G-Sync display ($1300). $580 difference. Is Vega really going to be $280 more than a 1080?
 
So ASUS MX34V FreeSync display ($720) and an ASUS PG348 G-Sync display ($1300). $580 difference. Is Vega really going to be $280 more than a 1080?

Vega is estimated @ $850 right now.
 
Is FreeSync better than G-Sync? Anyone have experience with both?

IMHO as I posted in the other thread that lined up this preview, G-Sync is superior at the fringes (30-60, 120+ FPS ranges) for me but it gets pretty difficult to notice differences in the "sweet spot" (85-120 FPS for me) although generally G-Sync has better tolerance to variances. Note that I don't talk about value/price or upcoming improvements (e.g. to FreeSync) and this is totally subjective based on only two months of playing around with the tech. I'd like to add that for me, personally, I always want to be in that sweet spot, so the single-GPU card most capable of consistently higher FPS is worth a premium to me even at the monitor level.
 
Good job. As others have mentioned, Doom isn't hard on GPUs, so this is pretty much a teaser. I'm looking forward to seeing a full review in due course. I do hope that this induces a significant reduction in the GSync premium.

There's something else: from what I've read, GSync is better at lower frame-rates. In this test, everything was at very high frame-rates.

This has whetted my appetite; give me more! :)
 
Cool. Good job, great video.

Kyle did you tell the gamers what the systems were after everyone was done and interviewed?
 
Nice bit of info to delve into, Kyle and team...thanks for putting this together!

My thoughts: I've never used G-Sync or FreeSync, but I do use adaptive Vsync, because I do get a noticeable benefit at higher refresh rates/FPS (my monitor is 144 Hz capable). That being said, I've always seen G-Sync as a gimmick... albeit one that works and gets a lot of praise, but also one that NVidia gouges for. Until G-Sync monitors are either priced the same or a minuscule amount more than the FreeSync equivalents, I will likely never pay the premium in price for NVidia's offering.
 
That was a fun test I bet.

I got you for 2 schmeckels a month for fun content followed by the traditional [H] treatment when NDA's allow.
 
Pleasantly surprised! I definitely thought most would have preferred System 1.
 
Nice, these are the type of videos that make me love this site. They take real-world, real people and put them to the test. Honestly, this is what it should be: people have preferences, yes, but if a majority blindly prefers one over the other, one has to actually be better than the other, without any bias in the review.
Interesting that the performance was pretty much deemed equal by everyone but 3 people. Everyone else said it was a minimal difference.
 
Is AMD going a little OTT with this secrecy BS? The card's due to launch in a few days, yet they're paranoid about a reviewer installing the card and drivers himself. Surprised the system wasn't in its own little air-fed cubicle so it wouldn't be contaminated with second-hand air.
 
I'm guessing there is some secret sauce in that driver; that's why the AMD rep had to install it. I hope it straddles the 1080/1080Ti perf-wise.
 
Having watched the entire thing, I think it's missing the forest for the trees. What value proposition? Just the GPU and monitor alone are $2k. Who cares about $300 when the entire PC is likely $3.5k easy.

Would be nice if a 2017 *sync article came out. I thought with FPS games, you wanted 120/240Hz and a ULMB-like feature (which disables *-sync?). Absolute minimum lag? Isn't sub-100Hz sync more for RPG games or weaker systems (RX 580/GTX 1060)? If I just spent $4k on a high-end gaming PC to play DOOM or any other FPS at sub-100FPS... Maybe I am seriously out of date with my memory about lag and focusing in on targets.

Never used *sync personally and haven't seriously gamed in a long while.

edit: not meaning to be a downer. It did not go over my head that most folks thought there was no difference between the two systems. In general, most folks aren't 12 years old or on ADHD meds anymore.
 
I think this is smart from AMD; a lot of people still don't know that they can improve their gameplay experience significantly with a FreeSync/G-Sync monitor upgrade, probably a lot more than what they can get from a total system upgrade, and for a fraction of the cost.
AMD, don't push your luck with the pricing, plz.
 
Having watched the entire thing, I think it's missing the forest for the trees. What value proposition? Just the GPU and monitor alone are $2k. Who cares about $300 when the entire PC is likely $3.5k easy.

Would be nice if a 2017 *sync article came out. I thought with FPS games, you wanted 120/240Hz and a ULMB-like feature (which disables *-sync?). Absolute minimum lag? Isn't sub-100Hz sync more for RPG games or weaker systems (RX 580/GTX 1060)? If I just spent $4k on a high-end gaming PC to play DOOM or any other FPS at sub-100FPS... Maybe I am seriously out of date with my memory about lag and focusing in on targets.

Never used *sync personally and haven't seriously gamed in a long while.

There were a lot of articles on gsync/freesync when it first came out, but since then there’s been almost nothing. I’m not sure why nobody is covering it anymore.
 
The video was interesting. Were you not concerned that the freesync monitor was a VA panel vs the G-Sync IPS panel?
 
Great video - been waiting for it all day (even longer as I'm in the UK!). Was expecting the 1080Ti to score better, considering the comparisons that AMD have been doing on their little tour are against the standard 1080.

As some have said above, having a couple say they would pay the $300 for the difference was the biggest surprise for me!

Looking forward to the proper review when the actual cards are released in the coming weeks.
 
The last 1 1/2 minutes of the video pretty much sums it up.

Not a review: 1 non-demanding game, subjective, no frametime/framerate data showing that the new version/features of FreeSync appear superior to G-Sync. Hopefully the performance of the card will actually make this advantage matter.

Because if your fps can't 100% match/exceed the tick rate, it doesn't matter how smooth it is. It is also somewhat confusing why Doom was chosen over a proper multiplayer game, such as Battlefield 1 or the like (is this a Freesync thing?)

Very concerned if the Freesync 'features' of Vega are somehow game/driver based.
 
I haven't seen your mug in a while, Kyle, so it was like, hey, a new guy? Ya look so clean-cut now, like a whole other nerd-chic dude with the specs.
 
I like the video style and the testing being done in a blind manner, but like it has been mentioned before, Doom is never going to dip below 100fps on at least a 1080Ti. For me it stays closer to 200fps. So even if Vega was half as powerful, both systems would display the equivalent of a locked 100Hz as if V-Sync was on, with the only difference being possible input lag between FreeSync and G-Sync, not visual differences. If the experience was the same, that's because it was the same, not because one was faster, etc. Without a frame-time graph we only know that the limiting factor was the slow panels. I do however enjoy the format regardless of this and like hearing from real gamers. Hopefully a full review will be out soon if AMD doesn't delay Vega until 2019.
 
I seem to recall a similar outcome at a HardOCP AMD event back in 2013 or so. (I was not there, but I watched a video of it.)

The last 1 1/2 minutes of the video pretty much sums it up.

Not a review: 1 non-demanding game, subjective, no frametime/framerate data showing that the new version/features of FreeSync appear superior to G-Sync. Hopefully the performance of the card will actually make this advantage matter.

Because if your fps can't 100% match/exceed the tick rate, it doesn't matter how smooth it is. It is also somewhat confusing why Doom was chosen over a proper multiplayer game, such as Battlefield 1 or the like (is this a Freesync thing?)

Very concerned if the Freesync 'features' of Vega are somehow game/driver based.

It does matter how smooth a game plays, that is definitely not subjective but objective. (This test is not the be all, end all but, it is a good example of what is important to folks.)
 
When you include the monitor tech...I think the value proposition shift greatly to AMD. D3 @ 4k running at those rates/settings is not a severe push but it also isn't a joke either.

It wasn't running at 4k though. It was running at a lower 21:9 resolution, 3440x1440, of which 4k is 67% more pixels. Just by extrapolation alone, a GTX 1080 should average over 70fps with the same settings.
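The 67% figure checks out with quick arithmetic. A back-of-the-envelope sketch (the 120 fps baseline is a hypothetical number, and inverse-pixel-count scaling is only a rough approximation of real GPU behavior):

```python
# Back-of-the-envelope pixel math for the resolutions discussed above.
ultrawide = 3440 * 1440   # 4,953,600 pixels (the 21:9 panels used in the test)
uhd_4k = 3840 * 2160      # 8,294,400 pixels (4K UHD)

ratio = uhd_4k / ultrawide
print(f"4K has {ratio:.3f}x the pixels ({(ratio - 1) * 100:.0f}% more)")

# Naive extrapolation: assume fps scales inversely with pixel count.
fps_ultrawide = 120       # hypothetical average at 3440x1440
print(f"Estimated 4K average: {fps_ultrawide / ratio:.1f} fps")
```

This prints roughly a 1.674x pixel ratio (67% more) and an estimated ~71.7 fps at 4K, consistent with the "over 70fps" extrapolation above.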
 
I hate to be THAT guy, but I won't be the first to say....

I wish Kyle hadn't picked DOOM.... that game runs good on a potato. Either of those cards probably never dipped below 100 fps @ 1440p, which defeats the purpose of G-Sync / Freesync.

Need something demanding, a game that's going to kick the GPUs in the balls and test to see if the adaptive sync can help. I don't know what current game could punish either of those cards at 1440.

Next topic, who's dropping $1000 on a monitor? Not this guy.
 
The video was interesting. Were you not concerned that the freesync monitor was a VA panel vs the G-Sync IPS panel?
I think that's where this comment comes from,
I actually "dumbed down" the image quality on each panel so we could get closer image quality between the two panels. Do keep in mind that our interviews do not cover actual IQ.
But yeah, next time I'd suggest finding a pair of monitors that use the same panels (the FreeSync one being a Samsung LTM340YP03 and the G-Sync one an LG LM340WU2-SSA1); it's a variable that could be controlled, assuming the objective is a look just at FreeSync and G-Sync operation, not necessarily implementation and product line.

It's all a matter of objective. Seeing as this is a Vega preview, it makes me wonder if Vega handles FreeSync in a different manner from current cards. It's hard to say; seeing as this idea is an extension of what AMD was already doing, maybe AMD just thought FreeSync needed promotion, and a VS format is a popular promotion format.
 
Interesting test; however, I don't see how either G-Sync or FreeSync makes a huge difference here when the game is pretty much going to be running at 100 fps the whole time anyway. I'd be much more interested in seeing performance at 4K with a more demanding game.

Except that maybe any dips on the AMD card would be covered up by FreeSync... That is really what they are targeting here.
 
Yes, since we don't know the performance of Vega, we can't tell if people preferred the Vega system because FreeSync is better than G-Sync or if Vega has higher fps or shorter frame latency.

The impression I got was that AMD and Vega are better at handling the Vulkan API than NVIDIA is. FreeSync vs. G-Sync are so close that you wouldn't really be able to tell them apart when the frame rates you are getting are sufficient. The difference between the two systems was minimal, but clear. The Vega system felt snappier, but you couldn't say "hey, it gets xx frames more than the NVIDIA system."

Cool. Good job, great video.

Kyle did you tell the gamers what the systems were after everyone was done and interviewed?

Yes he did. I was shocked to learn that Vega was in System 2.
 