AMD Radeon FreeSync 2 vs NVIDIA G-Sync @ [H]

While this monitor (the C27HG70) doesn't have enough local dimming zones, it's still better than not having HDR. Multiple times the C27HG70 has gone for $450. It's at the very least worth taking a look at.

I've had my C27HG70 since Black Friday.
 
Wow, thanks Kyle, great video on FreeSync 2 vs G-Sync. I'd like to see more of this whenever possible once the AMD X470 system comes out. The point your daughter made really makes sense: she'd want to play more realistic-looking games on FreeSync 2, but would prefer Overwatch and more stylized, game-looking titles on G-Sync. That was really helpful. Money is an issue, so I'd prefer to save the $400 in this case, as I prefer more realistic-looking games. I look forward to more monitors, and even TVs, coming out that will support FreeSync 2.
 
I thought I'd chime in about HDR and what my own workplace actually experienced.

A few months ago, the developers and artists took a serious look into HDR and decided to create a test scene using the Unity engine. They took a few scenes from one of our shipping games, redid them all in HDR, presented the result to the whole team, and gauged our reactions.

The short of the story is, we surprisingly hated it. Don't get me wrong though. The wow factor was huge. It made a big difference in the impressions department. But boy, did nobody want it in the game. There's no rational explanation for it. It reminds me of the story of making robots more humanlike: people wanted robots to be more humanlike, but when they reached a certain point where they became too humanlike, we got disgusted and hated them. HDR almost felt like that to me. This video hit home when it said something along the lines of games should look like games. Maybe it's years of conditioning on computer graphics that we've gotten used to, and we want to keep our virtual reality distinctly different from our physical reality. I don't know. All I can say is that the negative reaction is real.

Maybe we aren't ready to make that instantaneous leap into digitized reality yet. Up until today, 3D graphics have always looked that little bit fake: too clean, too flat. 2D graphics, even photorealistic images, have also always looked a bit fake, with luminance and color noise, along with contrast that is too drastic to be real. HDR can change this. It's not 100% reality yet, but it's that much closer, close enough that it can repulse people.

Edit: When I say "HDR" here, I mean photorealistic HDR. HDR can also be used for more effective lighting in games in ways that are not photorealistic at all. That's not the use of HDR I'm talking about here.
 
You're talking about the uncanny valley, and I get that!


And that's one of the reasons that games, unlike movies, will be slow on the HDR uptake (are slow): artists will have to learn how to use HDR for games, which must, as you say, be different from photorealistic HDR.

[and to wit, most people don't want 'HDR' as it is commonly misused in photography either; typically, HDR is what is captured at the sensor, and then RAW converters and photo editors are used to compress the dynamic range down!]
 
Guys, we don't even have working HDR in Windows 10 yet. MS is finally fixing it in the Spring Creators Update coming out this month.
 
I decided on Forza 7 as it seemed to have the best visuals after a bit of research in terms of HDR.

So you chose a game that offers the best implementation of HDR for a comparison where one of the competitors didn't have an HDR monitor? Is it possible that that may have swayed the opinion of the testers in one direction or the other? As in, at least some might have loved FS2 but disliked the HDR, or vice versa.
 
So you chose a game that offers the best implementation of HDR for a comparison where one of the competitors didn't have an HDR monitor? Is it possible that that may have swayed the opinion of the testers in one direction or the other? As in, at least some might have loved FS2 but disliked the HDR, or vice versa.
I could be wrong, but I believe that was partly the goal of this test: to see if FreeSync + HDR was subjectively better or worse than G-Sync. If you performed the same test on a game with a poor implementation of HDR, you might as well not be testing it. It would be nice if the same test could be run on the same game with HDR disabled, but you'd need another set of people, or to modify the format somehow, as they could be influenced by what they had experienced before.
 
The title is literally 'FreeSync 2 vs G-Sync', which, done at this time, means it will also be HDR versus SDR, because FS2 requires HDR and HDR isn't yet available in a G-Sync monitor.
 
So you chose a game that offers the best implementation of HDR for a comparison where one of the competitors didn't have an HDR monitor? Is it possible that that may have swayed the opinion of the testers in one direction or the other? As in, at least some might have loved FS2 but disliked the HDR, or vice versa.
Gotcha. Will use the shittiest HDR title I can find next time.
 
I could be wrong, but I believe that was partly the goal of this test: to see if FreeSync + HDR was subjectively better or worse than G-Sync. If you performed the same test on a game with a poor implementation of HDR, you might as well not be testing it. It would be nice if the same test could be run on the same game with HDR disabled, but you'd need another set of people, or to modify the format somehow, as they could be influenced by what they had experienced before.
The title is literally 'FreeSync 2 vs G-Sync', which, done at this time, means it will also be HDR versus SDR, because FS2 requires HDR and HDR isn't yet available in a G-Sync monitor.
Oh right, as in, "if you get FS2 then you also get HDR, which is not offered by the higher-priced competing product." Hmm... makes sense. Cheers.
 
Interesting review Kyle and a good differentiation between how they look when being played. HDR seems to be another good addition to display technology.
 
Thanks for the review Kyle. I hadn't even heard of FreeSync 2 yet.

I totally agree with your choice of Forza, but on the other side of the fence I have to say RE7 looks pretty impressive in HDR. I'm still mostly on the fence with HDR; from movies to games, it really does depend on the implementation. I think, like many, it's more probable I'll be saving for the next-gen GPUs before moving to another display, but reviews like this help keep us buyers educated on our choices. Thanks again.
 
Articles like this are one of the things that keeps me interested in Kyle's research. Anyone can do a "hard numbers" benchmark to determine which is "on paper" the better system, but playing games is about the overall experience, and Kyle has made that a key point in gaming tests. If I want hard numbers (and I do, along with the overall-experience reviews provided here), there are a myriad of other sites that use different testing methodologies to help with that. I know of no other site that provides the "overall experience" angle like, or as well as, it's done here.

Ok, now that I'm done buttering up the owner, I'll add some of my own thoughts on the overall issue -
I am excited that there are some new technologies to progress the visual performance of our games. The drive to "realism" in graphics has hit the odd point where it is realistic enough that the non-realistic portions stick out and are really annoying. As long as it is an option that can be turned on or off, you can play with the visuals to your liking. I do hope NVIDIA starts to adopt this, and who knows - maybe we'll be back to FMV games again, only with pure CGI!
 
G-Sync with HDR is apparently around the corner. I'm pretty baffled that it's taken both adaptive sync camps this long, given that adaptive sync and HDR are separate technologies that just happen to share the same cable, and that HDR is the new marketing buzzword.

But whatever; I'm still in it for a 34"+ 16:9 UHD 120Hz HDR gaming monitor when those hit, with either technology, depending on who ships the best hardware to push it, along with a VR headset.
 
Don't understand your assumption - ". . . make people happy that system 2 is "better" because it has much higher FPS."

If that were the case, then 640x480 on 240Hz monitors should make a hell of a lot of folks very happy. There is a point where more FPS has very little benefit and other factors can make for a decidedly better experience. Once again, this was a very interesting comparison: a lower-performing graphics card vs. an overclocked, 30%-60% better-performing graphics card, and in the end a tie in preference, with real gamers playing the same game where the only real differences besides the graphics cards were the monitors and how the game optimized itself using dynamic settings. In all cases, not one person thought the NVIDIA system's video card would be worth $200 more. While it was only one game, the results speak clearly.

As for people not wanting HDR in general: I would say some don't see the point, some don't see the added cost as worth it, others would like it, and then there are those who are just blown away and simply have to have it. In reality, HDR10, even HDR12, is fake HDR and does not reflect real-world dynamic range; it's not likely to, since that would mean brightness as bright as the sun and darkness as dark as the deepest cave system. The gist is that it's another approximation. Content in general isn't up to the standard yet, and I agree with the person above that artists will need to adapt and learn to tap into its full potential. Everything art-wise has been pretty limited in dynamic range, so this is new territory from an artist's viewpoint.

Sorry, I guess I should have put a sarcastic smiley on my post ;)
 
The title is literally 'Freesync2 vs G-Sync'- which done at this time means that it will also be HDR versus SDR, because FS2 requires HDR and HDR isn't yet available in a G-Sync monitor.

I get that, but even though NVIDIA hasn't released its HDR G-Sync monitors yet, I'm not sure how much the conclusion will change, since a G-Sync monitor with HDR will cost significantly more due to implementing FALD.
 
This seems more like a VA vs TN panel comparison, and it's interesting that some users felt the TN panel had more vibrant colors; my guess is it's because of the "fake" HDR and the low FALD zone count of the Samsung panel causing areas of the display to wash out. I think there should be a control system without either form of adaptive sync in these types of reviews, to see whether people felt the extra $100-400 was worthwhile. A quick chart with average frametimes would be useful too, since if both systems are holding the max refresh rate or higher, then FreeSync/G-Sync aren't doing anything.

Also, you can buy a Dell S2716DG, which has G-Sync and similar specs to the ASUS, for $100 less (almost $200 less last week) than the Samsung FreeSync monitor...
 
I have to say... I am not a videophile or audiophile. I couldn't tell the difference between Dolby Atmos and regular speakers.

I have a love/hate relationship with HDR. My eyes cannot agree on what I prefer. Playing Far Cry 5 on my KS8000, I almost prefer non-HDR gameplay to HDR. I can't quite put my finger on why... the first time I tried HDR, the game was way too dark in certain areas. I didn't like that. I adjusted the brightness a handful of times, and I would say in Far Cry 5 about 90% of the time I prefer non-HDR. I had the same problem in RE7, although it was way worse: RE7 was unplayable with HDR on; even with brightness on max it was still too dark.

Perhaps newer HDR technology is better, but I honestly don't see that much of a difference. It's almost just like higher contrast to me or something.
 
I get that, but even though NVIDIA hasn't released its HDR G-Sync monitors yet, I'm not sure how much the conclusion will change, since a G-Sync monitor with HDR will cost significantly more due to implementing FALD.

I won't make any claims to price, other than to say that we don't really know. I'd be certain that Nvidia is aware that higher prices on G-Sync monitors keep a 'plus' in AMD's column for the majority of gamers. I'm sure they prefer to be more expensive just to maintain their premium branding image, but I doubt that >US$200 even on the high end over an equivalent FS2 monitor is ideal for them. And I do hope that they address this by focusing on lowering the BOM for G-Sync...

it's almost just like higher contrast to me or something.

It's not 'just like' high contrast; it is high contrast. HDR should also enable smoother color gradients, and potentially a greater range of colors where that might be a problem for SDR, but that's the gist of it.
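To put rough numbers on the "smoother gradients" point (a back-of-the-envelope sketch of my own, not something from the video): HDR10 uses 10-bit-per-channel encoding versus the 8 bits typical of SDR, so a brightness ramp gets roughly four times as many steps, which is what reduces visible banding.

```python
# Back-of-the-envelope sketch (my own numbers, not from the video):
# HDR10 signals use 10 bits per channel vs. 8 bits for typical SDR,
# which is where the smoother gradients come from.
sdr_levels = 2 ** 8    # 256 code values per channel (typical SDR)
hdr_levels = 2 ** 10   # 1024 code values per channel (HDR10)

# Step size across a normalized 0..1 brightness ramp:
sdr_step = 1 / (sdr_levels - 1)
hdr_step = 1 / (hdr_levels - 1)

print(f"SDR: {sdr_levels} levels, HDR10: {hdr_levels} levels")
print(f"HDR10 gradient steps are ~{sdr_step / hdr_step:.1f}x finer")
```

This ignores the transfer function (HDR10's PQ curve spends those extra bits unevenly across the brightness range), but the ~4x step count is the basic reason gradients look cleaner.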
 
I don't really use G-Sync or worry about vsync at all. But HDR and games are such an amazing mix. Just give me HDR and ray tracing combined with some of the games currently out there; DAMN, now that would be badass. I would not mind throwing FPS out the window for visual bliss.
 
Can I just say something?

[H]ardOCP has a reputation for the most unbiased, highest-quality testing in the industry. They have lasted this long because they are willing to let the results speak for themselves rather than shill for some company.

So if you are some Team Red or Team Green pissant who is having a shit fit over the results of a test conducted by Kyle, I submit that the issue isn't with the test but with your fucking brain.

Just because you, whoever you are, happen to be a sycophant for whatever company you take it in the butt for doesn't mean Kyle similarly bends over.

Kyle and [H]ardOCP have earned the trust of people who can think for themselves, over the long haul and with frank and honest reviews and editorials.

And some bitches on an internet forum think they know better? Please just go and fuck right off and leave the forum posting to your more intelligent betters.

Because with the weight of history behind this site, only the legendarily stupid would try to pick a fight with Kyle and his reviews.
 
I could be wrong, but I believe that was partly the goal of this test: to see if FreeSync + HDR was subjectively better or worse than G-Sync. If you performed the same test on a game with a poor implementation of HDR, you might as well not be testing it. It would be nice if the same test could be run on the same game with HDR disabled, but you'd need another set of people, or to modify the format somehow, as they could be influenced by what they had experienced before.


Only one of the pair was HDR-capable; adding a third system to the blind test with HDR turned off would do it. But I must add that I don't really think this would be necessary for the scope of what Kyle wanted to target.

I believe he wanted to test the full range of two separate technological solutions. That each is supported by only a single GPU manufacturer is just the way things are and polarizes people's reactions to it.
 
Excellent video! Thank you guys for that. Really feels like you are for the people here and that is really appreciated.
 
gsync vs freesync?

Simple, freesync.

Anything from nvidia is to lock you into their ecosystem, so no, fuck them.
 
But, in that case, don't make it a FreeSync vs G-Sync battle. Use the same screen panel with both graphics cards, and ask which system performs better.

Well, they could still do FreeSync vs G-Sync, but slightly differently. Since both seem quite similar, perhaps use four test stations: two FreeSync and two G-Sync, where one of each pair has the sync tech enabled and the other does not. Make it a blind test, so testers aren't told which systems even have the sync tech enabled, and mix up all four (so it might be brand A no sync, brand A + sync, brand B + sync, brand B no sync). Hearing opinions from a setup like this would, I think, help a lot of gamers who haven't tried either tech be more sold on the merits of adaptive sync in general. Comparing two sync techs that are close enough that it's hard to tell them apart only really tells us they are similar. We will know how awesome they are once they're compared to no sync :)
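The four-station format described above could be sketched like this (hypothetical station labels and helper name, just to illustrate the proposed blind shuffle):

```python
import random

# Hypothetical sketch of the proposed blind test: two FreeSync and two
# G-Sync stations, one of each pair with adaptive sync enabled, shuffled
# so testers can't infer which station is which.
STATIONS = [
    ("FreeSync monitor", True),   # sync tech enabled
    ("FreeSync monitor", False),  # sync tech disabled
    ("G-Sync monitor", True),
    ("G-Sync monitor", False),
]

def blind_order(seed=None):
    """Return a shuffled copy of the stations; only the test runner keeps the key."""
    rng = random.Random(seed)
    order = STATIONS[:]
    rng.shuffle(order)
    return order

for label, sync_on in blind_order():
    print(label, "- sync on" if sync_on else "- sync off")
```

Re-shuffling per tester (a fresh `blind_order()` each time) would also wash out any order effects from playing the stations in sequence.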

As for HDR games, can you post a list of which games can do HDR? I think this deserves its own tests, and I have no idea if any games I currently play can do HDR... For TVs there are multiple levels of HDR, plus HDR quality/appearance also depends on the content, so it seems like a crapshoot at the moment. But once some more HDR panels are out, comparisons like IPS with HDR vs OLED with HDR, etc., would be valuable to [H] readers (I know they interest me). Since this too is something that probably isn't graphable, the multiple-viewer-opinion format is a great fit. You might even be able to double up the gamer opinions: have two types of chairs and get opinions on those too. Call up some chair vendors, get some new sponsors :)
 
gsync vs freesync?

Simple, freesync.

Anything from nvidia is to lock you into their ecosystem, so no, fuck them.


Because you have so many choices if you stay away from NVIDIA? (y)

How's that Matrox card still working for you? :sneaky:
 
............ Comparing the 2 sync techs that are close enough that it is hard to tell them apart, only really tells us they are similar...............

I think that this was entirely the point, to find out if one was better, or if they were comparable.

You really have to understand some things. Kyle went to great lengths to ensure as level a playing field as possible. Both video cards support DX12; the AMD card has a slight edge, as recent reporting suggests it runs this game very well and at 1080p outperforms a 1080 Ti, but at the tested resolution of 2560x1440 the NVIDIA card has more of an edge. The Samsung display maxes out at 144Hz and the ASUS ROG display at 165Hz; Kyle chose 100Hz for the refresh rate on both monitors so neither card would try to push beyond this additional limitation, again neatly selecting a middle ground between each card's advantages. I suspect, with the eye candy turned up to high settings, the frame rates probably stayed under 100 most of the time anyway, but I bet they were also pretty smooth, judging by the testers' comments.

In short, Kyle made all the right choices to ensure that these cards had a chance to produce almost identical experiences, with the only differences being those highlighted by the monitors themselves: HDR and VA vs TN panel characteristics.

Neither system was in a position to run away from the other, not that they were seriously different in capability at the settings selected.

 
It's pretty interesting overall. It was interesting to me that for many of the people the biggest factor was their preference around color rendering. Over the years I've had a couple of systems where I immediately noticed (after switching cards) that I preferred the way Radeons rendered colors and contrast on the same monitor. I wasn't sure if that was a brand thing (MSI vs ASUS, etc.) or an AMD vs NVIDIA thing; I'm still not sure, to be honest. In this case I wonder how much of their preference was the monitor vs. a familiar color profile.
 
Because you have so many choices if you stay away from NVIDIA? (y)

How's that Matrox card still working for you? :sneaky:
It’s kicking ass, next to my Kyro 2 card.

So, I guess you are OK with observing the severe anti-consumer actions on NVIDIA's part, and instead of helping take them down a couple of notches (by not giving them your money), you prefer to make fun of someone who is willing to do what is right for us, the consumers?

Oh well, sheep don't fight back on their way to the slaughter either.
 
I have FreeSync on my LG UW monitor, but the 75Hz refresh rate is locked behind the monitor requiring a card that supports it. It's a tough decision between upgrading my 980 Ti to a 1080 Ti and skipping the higher refresh rate, or going with an OC'd Vega 64 to get FreeSync working at a max of 75Hz. I'll have to mull over this review to think more on which route I want to take.
 
He's kind of right, though, in regards to the monitors. While there are FreeSync versions of the G-Sync monitors at a cheaper price, they often lack LFC, have frame-skipping issues at high refresh rates on NV cards, and they always lack ULMB (which is a better feature than either FreeSync or G-Sync, IMO). Then some, like the Samsung panel, advertise HDR support but actually have an awful HDR implementation. QA in the current display market is pretty awful, and FreeSync has no real standards, whereas G-Sync has specific requirements to be certified (all panels must have ULMB, a sync range from 30Hz to max refresh (some FreeSync monitors have ranges as narrow as 48-75Hz), high contrast ratios, etc.). Barring the normal QA issues, you can go to the store and blindly buy a G-Sync panel knowing that it will be good; not so with a FreeSync panel. Whether it's worth the extra $100-$200 is debatable, and I'd argue that you're better off spending extra on something you're going to keep for as long as most people keep monitors.

The fact that FreeSync was only equal in votes to G-Sync, despite the inclusion of HDR (and on a VA panel compared to TN, no less), shows that it's not exactly equal and is probably deserving of a lower price point.

I'm actually saddened that I'm right about the current situation; I'd prefer AMD had pushed out something competitive, but it seems like they're always a generation or more behind.
 