Blind Test - RX Vega FreeSync vs. GTX 1080 TI G-Sync

Noko, I notice you locked the fps.
I notice it is locking to 60fps nearly all of the time, with very brief dips to 59fps at times.
Also, like I said, it may come down to options; in the previous engine it was worse with higher visual settings.
So your setup is not exactly how the engine is used by everyone, tbh. That said, the point is that even if you cannot replicate the issue others have, no conclusion can be made from that.
Cheers
Adaptive sync was used, which is the reason for the 60fps. The video encoding was slightly off from 60fps, something like 59.7, which gave the video a slight judder at times; gameplay was as smooth as you can get at 60Hz. No, I did not use the same area, but I understand your point nonetheless. With vertical sync I see mouse lag; with adaptive sync I do not. What is the difference that causes the mouse lag? I don't know why adaptive sync does not have it, or has much less of it. Without adaptive sync, even at higher fps, the screen tears and the smoothness is not there, with some stuttering.
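As a rough, back-of-the-envelope illustration of why that small encode-rate mismatch shows up as a periodic hitch (the 59.7 figure is just the approximate number mentioned above, not a measurement), here is a quick Python sketch:

```python
# Rough sketch: how often a frame gets repeated or dropped when a ~59.7 fps
# encode is played back on a 60 Hz cadence. 59.7 is only the approximate
# figure quoted above, not a measured value.
capture_fps = 59.7   # approximate encode rate of the video
display_hz = 60.0    # playback cadence

# The two clocks drift apart by (display_hz - capture_fps) frames per second,
# so they fall a full frame out of step roughly every:
seconds_per_hitch = 1.0 / (display_hz - capture_fps)
print(f"~1 repeated/dropped frame every {seconds_per_hitch:.1f} s")  # ~3.3 s
```

That lines up with the occasional judder being in the recording rather than in the actual gameplay.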

I should do a short video showing what motion blur does in slow motion; it basically blinds you to the whole environment, especially when you turn. For quick actions in this game I recommend turning motion blur off. Depth of field does give the environment more of a sense of volume, so I will probably turn it back on since I am hooked on this game again.
 
"Our card is slower, but you can't tell the difference in FPS capped testing". Are they serious?
 
"Our card is slower, but you can't tell the difference in FPS capped testing". Are they serious?
Are you? Do you not understand the test or what the outcome indicates? It is a test showing what a gamer would see visually when using some form of adaptive sync. If you were using a 3440x1440 monitor like these, then this test would actually tell you a lot. The 1080Ti would likely give you a longer-lasting setup, but if you are the kind to upgrade every cycle, then you could forgo the expense of the top of the line and buy the tier that better fits your needs without sacrificing great gameplay. Honestly, we haven't seen a whole lot of adaptive sync coverage when it comes to game or GPU releases. This test and the others tell us that there may be more to gaming than chasing fps.
 
This test and the others tell us that there may be more to gaming than chasing fps.

Funny how all of a sudden FPS doesn't matter. Why highlight the area where you aren't as powerful? Instead they are relying on folks not being able to tell the difference in FPS for anything greater than 60 FPS.
 
Funny how all of a sudden FPS doesn't matter. Why highlight the area where you aren't as powerful? Instead they are relying on folks not being able to tell the difference in FPS for anything greater than 60 FPS.
Don't think for a second that I changed my stance; I have always said CONTEXT. With the 980Ti and Fury X I said it would be virtually a tie visually in the real world. I have mentioned CONTEXT when it came to CPUs in regard to monitor Hz, as far as what is or is not capable.

And it was 100Hz not 60Hz.
 
With vertical sync I see mouse lag; with adaptive sync I do not. What is the difference that causes the mouse lag? I don't know why adaptive sync does not have it, or has much less of it.

If you do not know why, then go look it up!

If you cannot understand the difference between V-Sync and Adaptive Sync, then you cannot possibly understand G-Sync and FreeSync.
 
If you do not know why, then go look it up!

If you cannot understand the difference between V-Sync and Adaptive Sync, then you cannot possibly understand G-Sync and FreeSync.
Since you think you know so much, how about explaining it to us dumb folks, or anyone else, since you are so smart?
 
This test and the others tell us that there may be more to gaming than chasing fps.
No, it tells us that capping the performance of a faster card to match that of a slower card means you can't tell the difference between them. Who would have thought.
Maybe now you understand why I have been complaining about how this test was set up.
AMD using it for marketing to show how the card is "comparable" to a 1080Ti is exactly why I have had issues with this test.

Without having actual data for how close the two cards perform there's no way to say for certain, but it seems irresponsible for [H] to have published this.

With vertical sync I see mouse lag; with adaptive sync I do not. What is the difference that causes the mouse lag? I don't know why adaptive sync does not have it, or has much less of it.
Vertical Sync buffers multiple frames so that it is able to present them in sync with the refresh rate of the monitor.
Adaptive Sync synchronizes the refresh rate of the monitor to the framerate of the game without having to buffer frames. (technically it may be buffering a single frame and using accelerated scan-out, but it's not buffering multiple frames)

That's how it eliminates tearing without adding latency.
It also fixes "judder". You cannot always guarantee that the framerate will be able to match a fixed refresh rate, which causes judder. Adaptive Sync eliminates this problem by synchronizing the refresh rate to match the framerate instead.
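To make the judder point concrete, here is a minimal toy simulation of frame presentation (my own sketch, not anything from the article; it ignores buffering and v-sync back-pressure, and the 45 fps / 60 Hz numbers are purely illustrative). Equal render times turn into an uneven on-screen cadence on a fixed refresh, but track the render rate directly with adaptive sync:

```python
# Toy model (illustration only): when do frames actually reach the screen on
# (a) a fixed 60 Hz display with v-sync, versus (b) an adaptive-sync display,
# given a steady ~45 fps render rate.
refresh = 1 / 60      # fixed scan-out interval, seconds
frame_time = 1 / 45   # steady render time per frame, seconds

# Times at which each frame finishes rendering; start slightly after the
# first scan-out so nothing lands exactly on a refresh tick.
render_done = [0.005 + i * frame_time for i in range(10)]

# Fixed refresh: a finished frame has to wait for the next scan-out tick, so
# equal render times become a mix of ~16.7 ms and ~33.3 ms on-screen steps.
fixed = [(int(t / refresh) + 1) * refresh for t in render_done]

# Adaptive sync: the panel refreshes when the frame is ready (within its
# supported range), so on-screen spacing follows the render spacing.
adaptive = render_done

def spacing_ms(times):
    return [round((b - a) * 1000, 1) for a, b in zip(times, times[1:])]

print("fixed 60 Hz  :", spacing_ms(fixed))     # uneven: ~16.7 / ~33.3 ms
print("adaptive sync:", spacing_ms(adaptive))  # steady ~22.2 ms
```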
 
No, it tells us that capping the performance of a faster card to match that of a slower card means you can't tell the difference between them. Who would have thought.
Maybe now you understand why I have been complaining about how this test was set up.
AMD using it for marketing to show how the card is "comparable" to a 1080Ti is exactly why I have had issues with this test.

Without having actual data for how close the two cards perform there's no way to say for certain, but it seems irresponsible for [H] to have published this.


Vertical Sync buffers multiple frames so that it is able to present them in sync with the refresh rate of the monitor.
Adaptive Sync synchronizes the refresh rate of the monitor to the framerate of the game without having to buffer frames. (technically it may be buffering a single frame and using accelerated scan-out, but it's not buffering multiple frames)

That's how it eliminates tearing without adding latency.
It also fixes "judder". You cannot always guarantee that the framerate will be able to match a fixed refresh rate, which causes judder. Adaptive Sync eliminates this problem by synchronizing the refresh rate to match the framerate instead.
The 1080Ti was chosen by KYLE, so stop with the tinfoil-hat commentary!
And you have absolutely no proof or facts to back up your claim. What were the frametime graphs for each card in this test? There were none, so all you are doing is whining and moaning because the outcome does not prop up your belief in how GPUs and gaming matter.

What this test shows and PROVES is that for an owner of either of these monitors, buying a 1080Ti is not necessary, as the 1080 would likely have provided equal performance as well. You don't like the results because they don't fit your short-sighted view of the world; fine, but don't expect to win any debate with that particular stance.
 
What this test shows and PROVES is that for an owner of either of these monitors, buying a 1080Ti is not necessary, as the 1080 would likely have provided equal performance as well.

And this is where you, and most of the defenders of this blind test, are wrong. THIS useless test just PROVES that in the game DOOM, an owner of either of those monitors sees no noticeable difference between Vega and a 1080, or even a 1080Ti. But in the end that's all it is: just one game at that resolution. What about other games? What about more demanding games that will really show a difference in performance? This test puts an AMD GPU in the best of all possible scenarios; there's no other game on the market with that kind of performance advantage for an AMD card versus Nvidia. Do you think that pattern will hold for every other game? What about more "neutral" games? What about scenarios that favor Nvidia? What about games that require turning down settings to reach playable FPS even on a 1080Ti? Do you see the issue with this test? This kind of test would need to be run with EVERY game in the [H] GPU review suite, at different game settings, to have any remote validity, because there will be visual differences not only in performance but also in visual fidelity due to different settings, and no FPS-smoothness technology can hide that.

This test just proves that Monitor A vs. Monitor B, calibrated the same, shows no difference to the end user as long as the cards can sustain the FreeSync/G-Sync range. That's all.
 
And this is where you, and most of the defenders of this blind test, are wrong. THIS useless test just PROVES that in the game DOOM, an owner of either of those monitors sees no noticeable difference between Vega and a 1080, or even a 1080Ti. But in the end that's all it is: just one game at that resolution. What about other games? What about more demanding games that will really show a difference in performance? This test puts an AMD GPU in the best of all possible scenarios; there's no other game on the market with that kind of performance advantage for an AMD card versus Nvidia. Do you think that pattern will hold for every other game? What about more "neutral" games? What about scenarios that favor Nvidia? What about games that require turning down settings to reach playable FPS even on a 1080Ti? Do you see the issue with this test? This kind of test would need to be run with EVERY game in the [H] GPU review suite, at different game settings, to have any remote validity, because there will be visual differences not only in performance but also in visual fidelity due to different settings, and no FPS-smoothness technology can hide that.

This test just proves that Monitor A vs. Monitor B, calibrated the same, shows no difference to the end user as long as the cards can sustain the FreeSync/G-Sync range. That's all.
This test really proves nothing other than that, on that day, 6 testers couldn't care less which one they had to game on, except that they would prefer the cheaper one; 3 thought the Radeon/FreeSync setup was better and, at the better price, a clear winner; and the one who thought the GeForce/G-Sync setup was better wasn't clear whether the increased price was worth it. That is it. I agree it would be interesting with other games, though wasn't BF1 also tested by AMD? Anyway, I think this discussion has run its course; unless HardOCP can do more tests like these, keeping them as objective as feasible, I think I will just wait until Brent does his own testing.
 
No, it tells us that capping the performance of a faster card to match that of a slower card means you can't tell the difference between them. Who would have thought.
Maybe now you understand why I have been complaining about how this test was set up.
AMD using it for marketing to show how the card is "comparable" to a 1080Ti is exactly why I have had issues with this test.

Without having actual data for how close the two cards perform there's no way to say for certain, but it seems irresponsible for [H] to have published this.


Vertical Sync buffers multiple frames so that it is able to present them in sync with the refresh rate of the monitor.
Adaptive Sync synchronizes the refresh rate of the monitor to the framerate of the game without having to buffer frames. (technically it may be buffering a single frame and using accelerated scan-out, but it's not buffering multiple frames)

That's how it eliminates tearing without adding latency.
It also fixes "judder". You cannot always guarantee that the framerate will be able to match a fixed refresh rate, which causes judder. Adaptive Sync eliminates this problem by synchronizing the refresh rate to match the framerate instead.
Thanks, that makes a lot of sense. Nvidia just says vertical sync is applied when the frame rate exceeds the monitor's refresh rate and turned off when it doesn't. It is great tech, and I consider it as good as FreeSync if you can keep your frame rates up. Now I hope we are talking about the same thing: Adaptive V-Sync, not G-Sync.
 
Thanks, that makes a lot of sense. Nvidia just says vertical sync is applied when the frame rate exceeds the monitor's refresh rate and turned off when it doesn't. It is great tech, and I consider it as good as FreeSync if you can keep your frame rates up. Now I hope we are talking about the same thing: Adaptive V-Sync, not G-Sync.
Adaptive V-Sync is not Adaptive Sync. Adaptive Sync is the name for the VESA standard that FreeSync is based on.
Adaptive V-Sync just enables V-Sync when your framerate meets or exceeds the refresh rate and disables it when it drops below that, allowing the screen to tear instead of dropping from 60 to 30 FPS.
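As a rough sketch of that behaviour (purely illustrative logic, not actual Nvidia driver code; the function name and inputs are my own):

```python
# Illustrative sketch only: the per-frame decision Adaptive V-Sync is
# described as making above.
def adaptive_vsync_on(current_fps: float, refresh_hz: float) -> bool:
    """Enable v-sync only while the game keeps up with the display."""
    if current_fps >= refresh_hz:
        return True    # lock to the refresh rate, no tearing
    return False       # allow tearing instead of dropping to refresh/2

# On a 60 Hz screen: 72 fps -> v-sync on, 50 fps -> v-sync off (tearing)
print(adaptive_vsync_on(72, 60), adaptive_vsync_on(50, 60))
```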
 
I hate to be THAT guy, but I won't be the first to say....

I wish Kyle hadn't picked DOOM.... that game runs good on a potato. Either of those cards probably never dipped below 100 fps @ 1440p, which defeats the purpose of G-Sync / Freesync.

Need something demanding, a game that's going to kick the GPUs in the balls and test to see if the adaptive sync can help. I don't know what current game could punish either of those cards at 1440.

Next topic, who's dropping $1000 on a monitor? Not this guy.

To be fair, a monitor will likely outlast your video card. I'm using a 120Hz Catleap right now and have used several video cards to drive it. My current GTX 980Ti cost me over $700.

There is also a large difference between 2560x1440 and 3440x1440.
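For a sense of scale, the pixel-count arithmetic (nothing from the test itself, just the two resolutions):

```python
# Quick comparison of the two resolutions mentioned above.
qhd = 2560 * 1440     # 3,686,400 pixels (~3.7 MP)
uwqhd = 3440 * 1440   # 4,953,600 pixels (~5.0 MP)
print(f"{uwqhd / qhd:.2f}x the pixels")  # ~1.34x, i.e. ~34% more work per frame
```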

I do wish that I hadn't paid 1k for my Dell u3415w. I often don't like using it due to the input lag.

3440x1440 with adaptive sync, a high refresh rate, and no input lag. Yeah, that's worth some money, IMO.
 
This test really proves nothing other than that, on that day, 6 testers couldn't care less which one they had to game on, except that they would prefer the cheaper one...

For Doom.

Which is an easy game to drive.

Meaning that if you play Doom and only Doom, at 3440x1440x100, this test is valid concerning how people who are not you perceive the performance of Vega vs. a 1080Ti.

This test really only tells us one thing that we can actually extrapolate and apply: Vega isn't broke. Which is good!

But it does not tell you how Vega will run other games, or at other resolutions, or with other framerate targets.

So yes, if you're going to play Doom, and you're not going to want more than 5MP at 100Hz, Vega might be right for you.

 
This may also need to be considered.
Anyone know if the following was ever resolved with the ASUS PG348 G-Sync display:
From TFTCentral:
One thing we did notice on our test systems was an odd behaviour with the overdrive impulse at the maximum 100Hz refresh rate, and also at one step down of 95Hz. If you observe the moving car in the PixPerAn tests you can see this issue. Approximately every 10 seconds the overdrive impulse seemed to "jump", causing a large amount of overshoot to momentarily appear in front of the moving object, before quickly reverting to its normal operation. It was a very quick change but you could see it clearly. At 100Hz and with OD set to normal there is no overshoot trailing at all normally behind the car. When this "jump" happens a noticeable and pretty severe dark and pale overshoot appears in front of the car before reverting back to normal again and going through the same cycle. It was an odd behaviour and we can only assume it was an artefact resulting from the excessive refresh rate boost above the native 60Hz
......
For now, we would recommend sticking with 90Hz as the maximum overclock speed and refresh rate setting where this overdrive anomaly is not present. You are still getting a 50% boost in refresh rate/frame rates compared with the native 60Hz panel, and some nice additional benefits in terms of improved response times and improved perceived motion blur. Powering the screen at 3440 x 1440 and above 90Hz on a regular basis is probably going to be a challenge in most games anyway, so we don't expect our 90Hz upper cap recommendation to be a problem really.
http://www.tftcentral.co.uk/reviews/asus_rog_swift_pg348q.htm

Nothing to do with the test done here, as it is a challenge to get matching panels, but as a buyer I am not sure I would buy this expensive monitor, considering its native rate is 60Hz and there seem to be some quirks associated with the overclock (the caveat being that I am not sure whether Asus ever addressed this, and customer reviews show a fair number of happy owners).
It makes the FreeSync one even better value in comparison to this specific model, IMO.
Cheers
 
The bannings have begun for you guys doing all the personal sniping.
 
Is it too simple to think of Vega as a higher-clocked version of Fury X?
What is the reason for AMD splitting their GPU lineup into two different designs?
 
I think you missed my point: you will still need to upgrade your monitor. You cannot buy a FreeSync monitor now and expect HDR support from that monitor in the future, meaning you would need to replace it again, so it may be costly to buy now rather than waiting.

So if I am waiting for HDR, that also brings me closer in line with Volta, a factor that may change the decision landscape, along with getting to see how HDR performs on both FreeSync and G-Sync if the standard is more broadly accepted and coded for well (although that then raises the forum argument over what the right contrast/brightness level is, as this can vary even between televisions and Blu-ray with the HDR format).

Cheers
Check Samsung. They have a line out now, or very very soon.
 
Check Samsung. They have a line out now, or very very soon.
I would wait for one of the specialist monitor sites to test the HDR monitors before buying. Case in point: it was TFTCentral that highlighted that the PG348 G-Sync monitor used in this test suffers from subtle perception issues at 95 and 100Hz, as it is not a native 100Hz panel.
See post #458.
Good to see one out now for FreeSync (a plus over G-Sync if one did commit now), but I am leery of buying any monitor this early relative to what will be coming out at the end of the year.
Cheers
 
I would wait for one of the specialist monitor sites to test the HDR monitors before buying. Case in point: it was TFTCentral that highlighted that the PG348 G-Sync monitor used in this test suffers from subtle perception issues at 95 and 100Hz, as it is not a native 100Hz panel.
See post #458.
Good to see one out now for FreeSync (a plus over G-Sync if one did commit now), but I am leery of buying any monitor this early relative to what will be coming out at the end of the year.
Cheers

Looking at Samsung's site, they have released 3 models. All are 144Hz, HDR, FreeSync 2, 10-bit, VA (178° viewing), and 1ms g2g: a 27" and a 32" at 2560x1440 (it looks like the 27" isn't out yet), and a 49" at 3840x1080. Just trying to address your original statement that you couldn't currently buy one.

http://www.samsung.com/us/compare/#.../LC32HG70QQNXZA,LC49HG90DMNXZA,LC27HG70QQNXZA
 
I would wait for one of the specialist monitor sites to test the HDR monitors before buying. Case in point: it was TFTCentral that highlighted that the PG348 G-Sync monitor used in this test suffers from subtle perception issues at 95 and 100Hz, as it is not a native 100Hz panel.
See post #458.
Good to see one out now for FreeSync (a plus over G-Sync if one did commit now), but I am leery of buying any monitor this early relative to what will be coming out at the end of the year.
Cheers

That issue with the PG348Q does not seem to be present in the X34, which uses the same panel; there was no mention of it in TFTCentral's review of the X34.

On the other hand, going by the many online discussions about it, both the PG348Q and the X34 seem to have varying degrees of visible scan lines or horizontal lines when overclocked.
 
Looking at Samsung's site, they have released 3 models. All are 144Hz, HDR, FreeSync 2, 10-bit, VA (178° viewing), and 1ms g2g: a 27" and a 32" at 2560x1440 (it looks like the 27" isn't out yet), and a 49" at 3840x1080. Just trying to address your original statement that you couldn't currently buy one.

http://www.samsung.com/us/compare/#.../LC32HG70QQNXZA,LC49HG90DMNXZA,LC27HG70QQNXZA
Yeah wrong choice of word on my part there :)
But they seem to have seriously jumped the gun ahead of everyone else, because no one else has anything available for at least another quarter, plus there have been concerns about QA of the quantum dot technology (some people are having issues with their new TVs using it). So I am putting it down to them jumping the gun a bit, and I would love to see one reviewed to check whether anything is missing from an HDR perspective.
Cheers
 
It's about a 5:1 ratio. And some big manufacturers like LG said they won't make GSync monitors.

No, LG has one: http://www.lg.com/us/monitors/lg-34UC89G-B-ultrawide-monitor but the resolution is garbage.

A lot of G-Sync monitors leave a lot to be desired, to be honest. I faced too many QC issues with Acer and Asus; BenQ was the only good one for me.

On the other hand, FreeSync support includes LG, EIZO and Samsung... which are pretty much the best companies in terms of display tech.

I don't understand why Nvidia just doesn't abolish this G-Sync crap and support FreeSync but then I remembered, this is Nvidia we're talking about... given their track record with sticking to proprietary nonsense.

Don't really care either way, this makes my purchasing decision much easier.
 
No, LG has one: http://www.lg.com/us/monitors/lg-34UC89G-B-ultrawide-monitor but the resolution is garbage.

A lot of G-Sync monitors leave a lot to be desired, to be honest. I faced too many QC issues with Acer and Asus; BenQ was the only good one for me.

On the other hand, FreeSync support includes LG, EIZO and Samsung... which are pretty much the best companies in terms of display tech.

I don't understand why Nvidia just doesn't abolish this G-Sync crap and support FreeSync but then I remembered, this is Nvidia we're talking about... given their track record with sticking to proprietary nonsense.

Don't really care either way, this makes my purchasing decision much easier.


I'll tell you why: NV has 80% of the market versus AMD's 20%. You can harp on FreeSync all you want, but not enough people give a shit about adaptive sync and not enough people give a shit about AMD graphics. NV is the high-performance leader, and anyone who wants the high-performance video cards is going to go G-Sync or doesn't care about adaptive sync. The vocal minority on this and other forums saying FreeSync is the winner are wrong. FreeSync isn't selling video cards for AMD and G-Sync isn't selling video cards for NV. The mass market doesn't give a single fuck.
 
FreeSync isn't selling video cards for AMD and G-Sync isn't selling video cards for NV.
I have no data, but I would disagree with this. FreeSync is the biggest (and maybe only) selling point for buying an AMD card today.
 
I'll tell you why: NV has 80% of the market versus AMD's 20%. You can harp on FreeSync all you want, but not enough people give a shit about adaptive sync and not enough people give a shit about AMD graphics. NV is the high-performance leader, and anyone who wants the high-performance video cards is going to go G-Sync or doesn't care about adaptive sync. The vocal minority on this and other forums saying FreeSync is the winner are wrong. FreeSync isn't selling video cards for AMD and G-Sync isn't selling video cards for NV. The mass market doesn't give a single fuck.

I don't know, I think Nvidia would control more than 80% of the market if they supported Freesync. No one would be choosing AMD for gaming at that point.

Also, due to the poor release of Vega, a majority of people bought Nvidia cards to use on their Freesync monitors.

I actually would've done the same, but I also do professional workloads on the side (60/40), and sadly Nvidia has no card that beats AMD for that kind of work unless I'm willing to spend over $5000.
 
I have no data, but I would disagree with this. FreeSync is the biggest (and maybe only) selling point for buying an AMD card today.

So you're telling me freesync is the only thing keeping RTG afloat? Sad state of affairs then.
 
So you're telling me freesync is the only thing keeping RTG afloat? Sad state of affairs then.
They've had some other advantages: mining ability, price, better open-source Linux drivers, some gains in Vulkan/DX12, etc.

However, the price advantage is largely gone as it's hard to find any of their decent cards at MSRP (due to miners, scalpers, low supply, retailers price gouging, etc.), Linux is a small market, and the Vulkan/DX12 gains are not enough to beat Nvidia's best.

So FreeSync is at least a clear marketing point that's not BS.
 
So you're telling me freesync is the only thing keeping RTG afloat? Sad state of affairs then.

Sony and Microsoft both use AMD graphics in their game consoles. That's steady business for AMD, but not very profitable; that's why Nvidia wanted out of the console market.
 
Sony and Microsoft both use AMD graphics in their game consoles. That's steady business for AMD, but not very profitable; that's why Nvidia wanted out of the console market.

I agree. The console market is such low margin it's hardly worth getting into. NV made a smart decision.
 