AMD Radeon FreeSync 2 vs NVIDIA G-Sync @ [H]

I like this second review where these comparisons get a somewhat unbiased opinion. Thanks for sharing.
 
Just like last year's contest - G-Sync vs. FreeSync - both techs seem to be very comparable - it's good to have that confirmation from another battery of test subjects.

Parity between the two techs has been my own commentary too, as I went from three HP Omen 32s with FreeSync, driven by a pair of Fury X cards, to a single 34" ultrawide Alienware driven by a pair of 1080 Tis. The gaming experience (smoothness/enjoyment level) between the two techs is nearly identical. Die-hard fans of one or the other need simply stand down...

----

From a home theater enthusiast perspective -- it's interesting that HDR capability didn't clearly win the crowd over. We are seeing that in the avsforum discussions too. HDR is a nice-to-have, but not a "must-have". Higher contrast, better colors, and better resolution beat out HDR on the subjective preference bullet-point lists, from what I've been reading over the last few months.
 
I've been curious about this myself. Especially since news started leaking out about possible future video standards building a similar "sync" capability into the standard itself.

As for HDR - I think it hasn't hit critical mass because HDR on LED isn't as good as HDR on OLED and OLED prices just haven't gotten to the sweet spot necessary for non-enthusiasts yet.
 
Great video and test! The comparisons between HDR and SDR were what I found the most interesting. I've been playing games in HDR for a while on my LG OLED C6, and the things I've noticed about it mostly match up with what the people in the video noticed. With HDR the colors can look a little muted and less vibrant depending on the game and implementation, but at the same time more realistic. When you look at HDR content for a while and then go back to SDR content, the colors start to look overly saturated and blown out. There is also more black crush and white clipping in SDR content, since it doesn't have the wider range of colors and contrast of HDR, which is why it's called "High Dynamic Range" to begin with. The extra brightness you get from an HDR-certified panel also makes the brighter highlights in the image pop a little more. I think someone in the video picked up on this a little bit.

I looked up the FreeSync 2 monitor being used in the video, and the max brightness is rated at 600 nits, which is a good bit less than the 1000 nits that most LED LCD HDR TVs can do. So I'm curious how the 1000-nit G-Sync HDR panels will fare when they finally come out.
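For anyone wondering what those nits numbers mean in signal terms: HDR10 uses the SMPTE ST 2084 (PQ) transfer function to map code values to absolute luminance. Here's a quick Python sketch (the constants come from the published spec; the hard min() clip is just a stand-in for whatever tone mapping a real panel does) showing where a 600-nit panel runs out of headroom versus a 1000-nit one:

```python
# Sketch: SMPTE ST 2084 (PQ) EOTF -- maps a normalized HDR10 signal value
# to absolute luminance in nits. Constants come from the ST 2084 spec.
m1 = 2610 / 16384          # 0.1593017578125
m2 = 2523 / 4096 * 128     # 78.84375
c1 = 3424 / 4096           # 0.8359375
c2 = 2413 / 4096 * 32      # 18.8515625
c3 = 2392 / 4096 * 32      # 18.6875

def pq_eotf(signal: float) -> float:
    """Normalized PQ signal in [0, 1] -> luminance in nits (peak 10000)."""
    p = signal ** (1 / m2)
    return 10000 * (max(p - c1, 0.0) / (c2 - c3 * p)) ** (1 / m1)

# Sample 10-bit code values: what was mastered vs. what each panel can show.
# (A hard min() clip stands in for the panel's real tone mapping.)
for code in (512, 640, 712, 769, 1023):
    nits = pq_eotf(code / 1023)
    print(f"code {code:4d}: mastered {nits:7.1f} nits, "
          f"600-nit panel shows {min(nits, 600):6.1f}, "
          f"1000-nit panel shows {min(nits, 1000):6.1f}")
```

Everything mastered above roughly code 712 is out of reach of the 600-nit panel, while the 1000-nit panel keeps differentiating highlights up to around code 769.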
 
As for the comments about vibrance, could it have been the driver oversaturating colors on the NVIDIA system? I know that there is an option, or at least was in the past, to adjust the vibrance in NVIDIA's driver, and it may be set higher than how AMD looks by default.
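For what it's worth, "digital vibrance" is basically a driver-level saturation boost. A toy Python sketch of the general idea (this is NOT NVIDIA's actual algorithm, just an illustration of why a higher default could read as "more vibrant" in a side-by-side):

```python
import colorsys

def boost_vibrance(rgb, amount=1.25):
    """Toy saturation boost -- NOT NVIDIA's actual algorithm, just the idea.
    rgb is (r, g, b) in 0..1; amount > 1 pushes colors toward more saturation."""
    h, s, v = colorsys.rgb_to_hsv(*rgb)
    return colorsys.hsv_to_rgb(h, min(s * amount, 1.0), v)

# A muted skin-tone-ish color gets visibly punchier. If one system ships
# with a boost like this enabled and the other doesn't, a side-by-side
# would read as "more vibrant" regardless of panel quality.
print(boost_vibrance((0.60, 0.45, 0.40)))
```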
 
Love this video comparison. Really shows that both camps have a very similar gaming experience. I have been a Nvidia user for a long time, due to them always having what I wanted in price/performance when I went to upgrade. But because of things like this video and last year's, the fact that I can save a few hundred dollars, AND the BS that is Nvidia's GPP, and me needing a new monitor, I think I will be finally getting my first AMD video card.
 
Auto CC takes some time, and isn't guaranteed. You can't force or request it in the video manager either (thanks a lot, YouTube).
 
I've stopped using G-Sync altogether in my gaming sessions. Despite what Blur Busters and other reviews say, it feels more sluggish in FPS games compared to ULMB, which to me is the gold standard in gaming right now for FPS.

As for HDR, meh is my opinion. I've got a 55" LG OLED and a 65" Samsung LED, both supporting HDR, and it doesn't look good on either TV.
 
Interesting, but it really worries me that some of these people thought that system 1 had dull and less vibrant colours than system 2. Something is very wrong here. Either system 2's monitor is amazing, or something was not configured correctly in system 1, maybe in the monitor settings, or even in Windows... Or maybe system 1's monitor was utter shit, because I simply cannot accept that people would find a properly set up HDR screen "dull" when compared to a non-HDR screen.

I also thought that it was unfair to have a water-cooled, overclocked card pitted against a much slower stock card (I appreciate that you also said this in the opening...); this would make the NVIDIA card present more smoothly, which will fool some into thinking it was better overall. It really should have been a reference 1080. Also, physics should have been disabled on the NVIDIA card, so that the only difference was HDR and frame/sync stability, not game content.
 
I simply cannot accept that people would find a properly set up HDR screen "dull" when compared to a non-HDR screen

Unless something changed a LOT over the last few months, HDR on most PC games is still broken, and the HDR experience outside OLED TVs is subpar, to say the least.

The worst issue with HDR games is content creation: a scene that looks beautifully mastered for an OLED TV may look worse than SDR when viewed on a 10-bit panel.

The test comparison was a bit unfair on this point, since it pitted a scanning quantum-dot VA against a strobing TN.
It is not unexpected that one monitor would look completely dull compared to the other, given the massive gap in contrast and color gamut specs between them.
 
As always, thanks for the blind test.
BUT:
Vega 64 starts at $950 USD and performs about the same as a GTX 1080 that starts at $700-ish (popular online retailer named after a river).

You'll find that the $400 USD in favour of AMD might have actually been $50 USD in favour of NVIDIA had the NV system been configured differently. A Volta card could have been used, and then you could have asked, "Was there a $2200 USD difference?"

Clearly, based on feedback, the systems looked different. AMD showed less, but better... but it showed less. I'm not saying that AMD is not going to render an arm or leg on an NPC, but what else might we be missing out on?
 
Loved the video. Really informative.

I love fancy gear as much as the next nerd, but when blind tested? "Not worth $200 difference".

[Opinion]
I hate the whole FreeSync/G-Sync situation!

I love monitors, as they can be one of the few great long-term parts in a build. Sort of like a good set of speakers for an audio system.

Why the hell do I need something that limits the useful lifespan or limits my choice of hardware?

Hate it!
[/Opinion]

I do realize I can still use either monitor, just without Sync. But man, it is annoying.
 
As always, thanks for the blind test.
BUT:
Vega 64 starts at $950 USD and performs about the same as a GTX 1080 that starts at $700-ish (popular online retailer named after a river).

You'll find that the $400 USD in favour of AMD might have actually been $50 USD in favour of NVIDIA had the NV system been configured differently. A Volta card could have been used, and then you could have asked, "Was there a $2200 USD difference?"

Clearly, based on feedback, the systems looked different. AMD showed less, but better... but it showed less. I'm not saying that AMD is not going to render an arm or leg on an NPC, but what else might we be missing out on?
I checked Amazon and Newegg the morning we did the test and used pricing on that day. I have no control over pricing changes and even addressed that in the video, so I am not sure what you are going on about. It was the reason the video card pricing difference was left out of the value question.
 
Speaking of HDR, I'd love to see an HDR article. It seems like it's really messed up on PC, even more so at 4K 4:4:4.

It's a shit-show; even the latest 'standards' that were released are a mess with so many tiers, and the FALD implementations being used to give LCD panels more contrast are hardly suitable for TV and cinematic content. I'm willing to bet that the artifacts will be too distracting for many gamers.

I would join you in asking for an article on HDR implementations if such monitors were more prevalent!
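To put the FALD worry above in concrete terms, here's a toy NumPy sketch (my own simplified model, not any real TV's dimming algorithm) of why zone-based backlights halo: the backlight can only vary per zone, so one bright highlight raises the black level of every pixel sharing its zone.

```python
import numpy as np

# Toy FALD model: a 1024-pixel row of an otherwise-black HDR scene lit by
# 16 backlight zones (64 pixels each), with one 1000-nit specular highlight.
pixels = np.zeros(1024)
pixels[500] = 1000.0                      # the lone highlight

zones = pixels.reshape(16, 64)            # zone boundaries are fixed in hardware
backlight = zones.max(axis=1)             # each zone driven by its brightest pixel

# Displayed black level = zone backlight leaking through the LC panel.
# Assume the panel attenuates by ~3000:1 (a typical VA native contrast).
leak = np.repeat(backlight, 64) / 3000.0
print(leak[448:512][:5])   # every "black" pixel in the highlight's zone: ~0.33 nits
print(leak[512:520][:5])   # pixels one zone over: true 0 -- a visible halo boundary
```

On an OLED each pixel is its own light source, so that step in black level simply doesn't exist.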
 
I can guarantee you that if this test was carried out on 4K monitors, people would overwhelmingly choose the NVIDIA system with the GTX 1080 Ti card. Lots of AMD propaganda on here lately. This reminds me of the Vega blind test and the GPP article, both of which were put into motion by AMD. This test was skewed to favor the AMD setup. And I wouldn't be surprised at all if this article was entirely AMD's idea.
 
I can guarantee you that if this test was carried out on 4K monitors, people would overwhelmingly choose the NVIDIA system with the GTX 1080 Ti card. Lots of AMD propaganda on here lately. This reminds me of the Vega blind test and the GPP article, both of which were put into motion by AMD. This test was skewed to favor the AMD setup. And I wouldn't be surprised at all if this article was entirely AMD's idea.

Even if it is (who could legally say?), it's still a solid data point.

There are two further points that should be expressed:
  • That while still superior to FreeSync, G-Sync isn't so superior as to warrant significantly higher costs, all other variables being equal*
  • HDR makes a big difference, and you can get it now with an AMD card, supposing you can actually get an appropriate AMD card

*[the variables are never equal]
 
so G-Sync displays are still better at providing a smoother experience over a wider range of games and performance differences.

It is the second time in a row that HardOCP has produced a blind test that fails to prove your belief that G-Sync is vastly superior to FreeSync.
So if you really believe that, and I believe you do, you should stop reading [H] articles and close your forum account, because what you are saying is that [H] produced two fake blind test articles in a row, and you should not waste your time reading fake articles or discussing fake results.
 
Thank you for this test. I enjoyed it. It helps remove the ardent fanboyism and looks at the actual experience.
 
It's great that G-Sync might not be worth $200 over FreeSync and all... but for those of us with high-end NVIDIA GPUs, what does it matter? A 1080 Ti is not compatible with FreeSync, and AMD has not produced a GPU that competes with a 1080 Ti.

Until AMD produces high-end GPUs again, and not overpriced garbage (Vega 64), high-end gaming enthusiasts are stuck with Team Green and their implementation of VRR.
 
As an nVidia user since the TNT2 days, I've NEVER bought into the GSync crapola. AMD did it the right way: FREEsync.

Well, you can't stay loyal to NVIDIA and use FreeSync. Is G-Sync crapola, as you said?

Clearly you have never used it. I am not even going to attempt to go there right now. But if you're an NVIDIA GPU owner, you have no choice but to use G-Sync if you want a superior display experience. Something you probably have never used.

Judge not that which you've never experienced.

Yes, nV needs to lower their license fees on G-Sync, but I'm sorry, unsynchronized displays plain suck ass after you've gone to synchronized.
 
It is the second time in a row that HardOCP has produced a blind test that fails to prove your belief that G-Sync is vastly superior to FreeSync.
So if you really believe that, and I believe you do, you should stop reading [H] articles and close your forum account, because what you are saying is that [H] produced two fake blind test articles in a row, and you should not waste your time reading fake articles or discussing fake results.

The technology is superior, and the testing is limited. I'm quite clear in my position above; you may believe whatever you wish. You're literally saying that you believe I believe something that contradicts what I actually said.
 
Well, you can't stay loyal to NVIDIA and use FreeSync. Is G-Sync crapola, as you said?

Clearly you have never used it. I am not even going to attempt to go there right now. But if you're an NVIDIA GPU owner, you have no choice but to use G-Sync if you want a superior display experience. Something you probably have never used.

Judge not that which you've never experienced.

Yes, nV needs to lower their license fees on G-Sync, but I'm sorry, unsynchronized displays plain suck ass after you've gone to synchronized.

My laptop has Gsync but I didn't go out of my way looking for a laptop with it. I simply wanted one of the cheapest 17" display laptops that contained a 1070 and it happened to have Gsync. But that's fine with me because I'm locked to the laptop and what's contained therein; I can't change the video card anyway so it doesn't matter to me that it has a Gsync display.

For my home desktop system, I believe that using either G-Sync or FreeSync does result in a better 'experience'. However, I don't want to be locked into an NVIDIA-only upgrade path, so I won't purchase a G-Sync display.

Tests like this can be useful, but, as with the earlier blind test, it's limited to one specific scenario, and the conclusions may not translate to many other games.
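For anyone curious what that 'experience' is mechanically, here's a simplified Python model of adaptive-sync scheduling (my own sketch, not either vendor's actual scheduler, and real ranges vary per monitor): inside the panel's refresh window the display simply refreshes when the frame is ready; below it, low-framerate compensation re-presents each frame at a multiple so the panel stays in range.

```python
def vrr_refresh_hz(frame_time_ms, hz_min=48.0, hz_max=144.0):
    """Simplified adaptive-sync model (hypothetical; not either vendor's
    real scheduler). Returns the rate the panel refreshes at."""
    fps = 1000.0 / frame_time_ms
    if fps > hz_max:                  # faster than the panel: cap at max refresh
        return hz_max
    if fps >= hz_min:                 # in range: refresh exactly when frame is ready
        return fps
    n = 2                             # below range: low-framerate compensation
    while fps * n < hz_min:           # re-show each frame enough times to get
        n += 1                        # back inside the panel's window
    return fps * n

for ft in (5.0, 10.0, 25.0, 40.0):    # 200, 100, 40, and 25 fps frame times
    print(f"{1000 / ft:5.0f} fps -> panel runs at {vrr_refresh_hz(ft):6.1f} Hz")
```

The point is that within the supported range there's no quantizing to a fixed refresh grid at all, which is where the "smoothness" both camps deliver comes from.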
 
Kyle

How come you didn't lock the settings? That would have made the graphics the same on both systems and put more of the emphasis on FreeSync and G-Sync. For example, the AMD would have had the extra leaves but maybe 10 fps less.
 
Kyle

How come you didn't lock the settings? That would have made the graphics the same on both systems and put more of the emphasis on FreeSync and G-Sync. For example, the AMD would have had the extra leaves but maybe 10 fps less.
Because I wanted to see the differences perceived by the gamer at default Ultra settings in the game.
 
Different panels: the G-Sync monitor is TN, the FreeSync is VA, and one supports HDR while the other doesn't. This is like comparing apples to oranges in terms of visual quality. The only thing that matters in a test like this is the level of smoothness.
 
Different panels: the G-Sync monitor is TN, the FreeSync is VA, and one supports HDR while the other doesn't. This is like comparing apples to oranges in terms of visual quality. The only thing that matters in a test like this is the level of smoothness.
I guess you do not understand the focus being on HDR, image quality, and its support.
 
We reached out to AMD and asked what panel they would suggest we use; they suggested the Samsung used in the article.

We reached out to NVIDIA and asked what panel they would suggest to use against the Samsung: "My question is what HDR display, in the 27 to 32" range, 144Hz, would you suggest to compare that supports G-Sync? Shooting for street price from $550 to $650."

These were their suggestions, none of which we found comparable either in terms of specs or price.

Option 1: 4K
Display suggestions:
ASUS PG27AQ or Acer XB271HK / LG MU68 or MU58. All are IPS.

Option 2: FHD 240Hz, 25-inch (really attractive to eSports players given the high refresh rate)
Display suggestions:
Alienware: AW2518H (G-SYNC) and AW2518HF. Both are 240Hz.

Option 3: 1440p, 21:9, 34-inch. Curved.
Display suggestions:
Alienware AW2418HW / Samsung C34F791 (VA) or LG 34UC98 IPS.

We decided to go with the ROG we used, for better or worse. That said, it is a beautiful panel that I think anyone would be happy with... as pointed out by most of our gamers as well.

I also asked NVIDIA what game(s) they would suggest. When I told them I could get Far Cry 5 in for testing, NVIDIA refused to give us drivers with support for the game and declared that if we used the old drivers it would invalidate our testing. NVIDIA complained that FC5 was an "AMD title." NVIDIA never suggested a game to use that was HDR. NVIDIA also suggested that I was wasting my time because Tom's Hardware did this two years ago.

NVIDIA then suggested that I set up all three above and then do the testing....

Asking AMD for a list of games, I got a list of games. I decided on Forza 7 as, after a bit of research, it seemed to have the best visuals in terms of HDR.
 
You did but, as with the other video, people are extrapolating this test to apply to all scenarios.
Well, then I should have stopped writing about anything tech-related a long time ago, if that is to be decided by people on the internet not being able to read or understand what is told to them.
 
Different panels: the G-Sync monitor is TN, the FreeSync is VA, and one supports HDR while the other doesn't. This is like comparing apples to oranges in terms of visual quality. The only thing that matters in a test like this is the level of smoothness.

But, in that case, don't make it a FreeSync vs. G-Sync battle. Use the same panel with both graphics cards, and ask which system performs better.
Your thoughts are noted.
 