freesync vs FPS?

Putz

Was holding out on the Vega cards, but the 64 is just too power-hungry and hot for my ITX rig. The 56 looks OK heat/power wise (running a Corsair SF600), but it looks like it's on par with a 1070, and I'm not sure it can provide the FPS needed for freesync stability at 3440x1440. All the reviews are for 4K, and it looks like it struggles with 4K.

Debating now on a 1080 (don't really need a 1080 Ti), and just setting up adaptive sync and capping my FPS at 60 in WoW (all I play is WoW for the most part, and the 1060 now struggles with 3440x1440 at times).

So what's better? Freesync at 40-50 fps (monitor capable of 75Hz in freesync or 60Hz without), or a 1080 running with adaptive (60Hz cap) and probably never dipping below it?

Never used freesync, but not sure it's worth buying an overpriced, underperforming, power-hungry card to get it.

So what's "better"? Something that will just peg games at 60fps with adaptive sync, or freesync that probably can't keep games in the freesync range? Right now in Canada the 1080 (non-Ti) is coming in like $100-150 cheaper than the V64, so probably about the same as the V56 will be.
 
I think it's ridiculous to buy a new card for 4K and not have it be a 1080 Ti or something of similar price and performance, of which there isn't any. Anything less and you'll be making another thread like this in short order.
 
I have a 3440x1440 monitor, no sync tech; no way an overclocked 1070 would maintain 60 fps unless you reduce settings. Of course there are games where you could, almost. SLI helped but was limiting depending upon the game. Setting the monitor to 50Hz for Adaptive Sync was a workable solution for me.

I have a freesync 4K monitor, and freesync is the real deal in delivering smoother-feeling gameplay at 40fps and up. So I would think a Vega 64 and freesync at 3440x1440 would work well together.

Another option with a 3440x1440 monitor on AMD is using AMD scaling: render at 2560x1080 and have the GPU scale it to 3440x1440. AMD scaling is very good.

The 1080 Ti on my 3440x1440 monitor seems to be the best solution for a monitor without sync tech. For the most part I can maintain 60fps. I don't think the 1080 could in all games.
 
Well, technically 3440x1440 is more like 3K, which is why I was assuming the 1080 would keep WoW at 60fps with ease. I know it's a CPU-dependent game, but that's not my bottleneck.

WoW doesn't need that much GPU, but the 1060 can't quite cut the mustard at 3K, which is why I was thinking 1080: less power and standard size. All the 1080 Ti cards seem to run the extra-tall cooler, which isn't going to fit in my Ncase ITX box. But I agree with you, for 4K a 1080 Ti is a must.
 
Was holding out on the Vega cards, but the 64 is just too power-hungry and hot for my ITX rig. The 56 looks OK heat/power wise (running a Corsair SF600), but it looks like it's on par with a 1070, and I'm not sure it can provide the FPS needed for freesync stability at 3440x1440. All the reviews are for 4K, and it looks like it struggles with 4K.

Debating now on a 1080 (don't really need a 1080 Ti), and just setting up adaptive sync and capping my FPS at 60 in WoW (all I play is WoW for the most part, and the 1060 now struggles with 3440x1440 at times).

So what's better? Freesync at 40-50 fps (monitor capable of 75Hz in freesync or 60Hz without), or a 1080 running with adaptive (60Hz cap) and probably never dipping below it?

Never used freesync, but not sure it's worth buying an overpriced, underperforming, power-hungry card to get it.

So what's "better"? Something that will just peg games at 60fps with adaptive sync, or freesync that probably can't keep games in the freesync range? Right now in Canada the 1080 (non-Ti) is coming in like $100-150 cheaper than the V64, so probably about the same as the V56 will be.


Gsync and freesync are, roughly speaking, just a fix to remove the FPS drops caused by low framerates under vsync... so why even consider them if you can avoid the problem they fix to begin with?

If you are running 60fps/60Hz constantly with vsync, freesync will not give you anything extra. It's still a 16.7ms render delay between each frame shown, and heck, gsync will even worsen your input delay (microscopically).

The solution in my eyes is going with the most powerful card, and then patching up low FPS with freesync/gsync, or with fast sync if the card is not fast enough to hit the Hz cap.
Or in short: use fast sync on Nvidia cards if you don't have a gsync monitor.
 
Well, I wouldn't run vsync, it's a lot of overhead. But Nvidia's adaptive sync seems to be a no-performance-loss option to avoid the tearing from high FPS spikes. Or are you referring to fast sync as the same thing as adaptive?
 
Well, I wouldn't run vsync, it's a lot of overhead. But Nvidia's adaptive sync seems to be a no-performance-loss option to avoid the tearing from high FPS spikes. Or are you referring to fast sync as the same thing as adaptive?


... I have no words for this. Either you are totally not understanding how sync works, or I have a very wrong picture of what adaptive sync is.

My understanding is that adaptive sync = vsync being turned on and off depending on your framerate.
Also, you might want to inform Nvidia that it's not just basically vsync:
https://www.geforce.com/hardware/technology/adaptive-vsync
(I actually just went and double-checked myself :D)

Vsync overhead?
What are you talking about with vsync overhead?

Please don't take offense, but I don't think you really know what sync is and how it works with framebuffers. That's fair; very few people do, despite it being decades-old technology.
I would be happy to explain it if you want, but it's going to take a bit of time.

Or fill me in on anything I must have missed, because what you are saying just makes no sense to me.
 
I'll give you my example in particular. Since I really only play World of Warcraft, this is what I see (1060 6GB, 3440x1440):

Enable vsync in game: the card gets very hot and has a hard time maintaining above 30fps. Maybe heat related and throttling... seems odd, or maybe something game specific. I figure maybe there was some overhead with vsync causing the extra heat/performance loss, perhaps just an issue in World of Warcraft?
Disable vsync in game: the card is quiet and cool, FPS ranges from 30-90, but a lot of tearing.
With adaptive sync enabled in the Nvidia driver at a global level: games cap at 60 fps, so no tearing and no heat issues, but there are still a lot of drops down into the 30s (because the 1060 just can't keep up with 3440x1440).

My thought process was to just get a card that can keep it pegged close to 60 and leave adaptive sync on... I couldn't tell you why vsync enabled in game causes issues, it's just my observation.

You said pretty much what I was planning: get an Nvidia card that can maintain 60 fps, enable adaptive sync, and I'm good to go. I find FPS is like sound: something steadily noisy isn't too annoying, but when you hear a fan change speed or pitch it's super annoying. FPS is the same; I'm fine with 60 if it stays there, but it looks like crap when it's all over the place, dipping down into the low 30s and back up.

I have never looked into fast sync; I thought it was more for ridiculously high FPS above what the monitor can display.
 
VSYNC should not increase your GPU temps at all. There is no real overhead; if anything it reduces the load, because it limits the FPS. It would certainly have to be a buggy implementation in the game, or a driver issue, causing that.
 
if you are running 60fps/60hz constantly with Vsync, freesync will not give you anything extra.

false

Multiple people have stated, myself included, that 60Hz pegged with vsync is not as smooth as 48-75Hz freesync. Not to mention the mouse lag with vsync, which, once you've moved away from it, feels very kludgy when you come back. My 1080 Ti setup keeps the frame rates pegged at 60Hz, and it feels significantly and noticeably worse than my pair of Fury X did in Crossfire, which wouldn't always keep 60Hz pegged (7680x1440 resolution).

For that reason I'm looking to sell off my 1080 Ti SLI pair and move to a Vega, after using freesync monitors for the last year on a pair of Fury X in Crossfire.

That said, to the original poster: you need to target well above the freesync minimum. 48FPS in freesync on my monitors feels buttery smooth. 47FPS (which dips outside the 48-75Hz range) feels as choppy as you'd expect 47FPS to feel. To keep it below 75Hz you use Frame Rate Target Control on the AMD card and set it to something like 72 or 73FPS (it isn't an exact cutoff; it's a throttling mechanism). But you avoid using vsync that way, and it feels and plays oh so nice with just freesync enabled.
 
false

Multiple people have stated, myself included, that 60Hz pegged with vsync is not as smooth as 48-75Hz freesync. Not to mention the mouse lag with vsync, which, once you've moved away from it, feels very kludgy when you come back. My 1080 Ti setup keeps the frame rates pegged at 60Hz, and it feels significantly and noticeably worse than my pair of Fury X did in Crossfire, which wouldn't always keep 60Hz pegged (7680x1440 resolution).

For that reason I'm looking to sell off my 1080 Ti SLI pair and move to a Vega, after using freesync monitors for the last year on a pair of Fury X in Crossfire.

That said, to the original poster: you need to target well above the freesync minimum. 48FPS in freesync on my monitors feels buttery smooth. 47FPS (which dips outside the 48-75Hz range) feels as choppy as you'd expect 47FPS to feel. To keep it below 75Hz you use Frame Rate Target Control on the AMD card and set it to something like 72 or 73FPS (it isn't an exact cutoff; it's a throttling mechanism). But you avoid using vsync that way, and it feels and plays oh so nice with just freesync enabled.

Statements are easy to make. Is there any ABX testing behind this? Never underestimate the effect of placebo.

How would a longer render delay ever feel smoother than a shorter one?
How would a longer input delay ever feel smoother than a shorter one?
One answer... placebo.

Your thing about 47 and 48 has no relevance to running at 60/60. Not the same issue.

Also gsync has MORE input lag than vsync on its own; it only has less input lag when it saves your FPS from double-buffering issues, of which there are none at 60/60. So at 60/60 you have LESS input lag with vsync than with gsync.
Stop buying into the marketing BS.

Freesync/gsync is not magical; it just slows down your monitor's refresh cycle so that during low FPS the graphics card can make a buffer swap immediately, instead of missing its one "chance" and having to wait for the next refresh cycle.
Again, this is not an issue at 60/60, since nothing is missed, so talking about 47 and 48 fps, where it IS an issue, is not relevant.

You are thinking way too simplistically about it.

P.S.
I just noticed you said 48-75Hz... of course 60fps is not smooth there, because that's not 60/60 but 60/75.
I said 60FPS on a 60Hz monitor, not a 75Hz monitor.
If you are running 60fps under a 75Hz refresh rate, you are again introducing the buffer-swap wait delay with vsync.
If you are running 75FPS at a 75Hz refresh rate, there is no buffer-swap wait delay.

It's just a matter of understanding what sync is and how framebuffers work, and why/when the drawbacks come into play.
 
Your thing about 47 and 48 has no relevance to running at 60/60. Not the same issue.

Even my 1080TI in SLI won't always keep 7680x1440 pegged at 60hz. And when it deviates it's noticeable. (not to mention frame tearing)

Also gsync has MORE input lag than vsync on its own; it only has less input lag when it saves your FPS from double-buffering issues, of which there are none at 60/60. So at 60/60 you have LESS input lag with vsync than with gsync.
Stop buying into the marketing BS.
Stop making strawmen. I never spoke to gsync; I'm only speaking to freesync. I haven't spent enough time with gsync to formulate a valid opinion, having only sat in front of a gsync setup for a couple rounds of a couple of different games. But I've used freesync for nearly a year now, and when it's absent it's very clear.


P.S.
I just noticed you said 48-75Hz... of course 60fps is not smooth there, because that's not 60/60 but 60/75.
I said 60FPS on a 60Hz monitor, not a 75Hz monitor.
If you are running 60fps under a 75Hz refresh rate, you are again introducing the buffer-swap wait delay with vsync.
If you are running 75FPS at a 75Hz refresh rate, there is no buffer-swap wait delay.

It's just a matter of understanding what sync is and how framebuffers work, and why/when the drawbacks come into play.

This isn't my first rodeo. The HP Omen 32" monitors I have frame skip at 75hz, so I have to set them to 60hz.
So with the 1080TI SLI I'm forced to use vsync with 60hz on the HP Omen monitors.

It's not as smooth in run-of-the-mill daily gaming, to the point I immediately noticed, and when I participated in a LAN party a couple days after doing the GPU swap, I actually felt disheartened enough to not enjoy myself. I pointed out the lack of smoothness to a couple of my friends, and they also noticed. They also use either gsync or freesync technology.

Not to mention I'm encountering what I believe to be microstutter with SLI, which I never experienced with the Crossfire setup and freesync engaged on the monitors; so much so that after just a few days of going back and forth, I keep SLI mostly off now. I've been on the 1080 Ti for coming up on a month, and I'm slowly getting more used to 60Hz vsync, but it simply isn't as smooth, nor as responsive, as freesync without vsync.

You say placebo. I say not. Placebo would typically have you prefer the more expensive, more desirable product, so one would think I would gravitate toward the pair of 1080 Ti, since they are clearly superior, on paper, to two Fury X.
In this case I'm telling you plainly that I'd rather game on my monitors with two Fury X and freesync than two 1080 Ti and 60Hz locked vsync, without question!

Have you personally spent some time with gsync or freesync? If the answer is no, then you have nothing of value to add to this particular thread.
 
I'll give you my example in particular. Since I really only play World of Warcraft, this is what I see (1060 6GB, 3440x1440):

Disable vsync in game: the card is quiet and cool, FPS ranges from 30-90, but a lot of tearing.
With adaptive sync enabled in the Nvidia driver at a global level: games cap at 60 fps, so no tearing and no heat issues, but there are still a lot of drops down into the 30s (because the 1060 just can't keep up with 3440x1440).

My thought process was to just get a card that can keep it pegged close to 60 and leave adaptive sync on... I couldn't tell you why vsync enabled in game causes issues, it's just my observation.

You said pretty much what I was planning: get an Nvidia card that can maintain 60 fps, enable adaptive sync, and I'm good to go. I find FPS is like sound: something steadily noisy isn't too annoying, but when you hear a fan change speed or pitch it's super annoying. FPS is the same; I'm fine with 60 if it stays there, but it looks like crap when it's all over the place, dipping down into the low 30s and back up.

I have never looked into fast sync; I thought it was more for ridiculously high FPS above what the monitor can display.


Thank you. Something seems definitely wrong in your system.

enable vsync in game, card gets very hot ..snip....i figure maybe there was some overhead with vsync causing the extra heat/performance loss, perhaps just an issue in world of warcraft?
Vsync should not increase heat (it can actually reduce it, because it reduces your FPS or puts a cap on it). There is no overhead; it's a simple delay (which cools your GPU). More on that next.


has a hard time maintaining above 30fps

Of course it has; that is due to the buffer-swap issue. When you are running vsync with only 2 framebuffers, you can't start another frame while waiting for the refresh cycle to end, so, sparing the technical details, your card can only perform in whole-integer divisions of the refresh rate.
Aka at 60Hz you can only make:
60fps (x/1)
30fps (x/2)
20fps (x/3)
15fps (x/4)

So when your card would be rendering a frame at 58fps speed, it drops to 30.
Of course, if you are right at the 60FPS mark but a bit below, then, FPS never being perfectly constant, you'll have some frames at 30 and some at 60, so an FPS display that measures over time (as in the S of FPS) will show an average somewhere in between.
That's why it's sometimes easier to look at cycle times rather than FPS: cycle times are specific, while FPS is an average.
That is the well-known speed penalty from running vsync with low FPS. But it happens ONLY during low FPS.
More on that later if you want.
Again, this does not happen if you can actually hit your refresh cycle with your FPS constantly,
and this is the issue gsync and freesync are trying to fix.
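A quick way to see this quantization. This is my own rough sketch, not anything from a driver, just restating the x/1, x/2, x/3 list above: round the render time up to a whole number of refresh cycles and see what rate falls out.

```python
import math

# Sketch: with double-buffered vsync, a finished frame must wait for the
# next refresh tick, so the displayed rate snaps down to refresh_hz / n
# for a whole n -- 60, 30, 20, 15 at 60 Hz.

def effective_fps(render_fps: float, refresh_hz: float = 60.0) -> float:
    render_time = 1.0 / render_fps       # seconds to draw one frame
    refresh_time = 1.0 / refresh_hz      # seconds per refresh cycle
    # whole refresh cycles each frame occupies (epsilon guards exact hits)
    cycles = math.ceil(render_time / refresh_time - 1e-9)
    return refresh_hz / cycles

print(effective_fps(58))   # 30.0 -- just barely missing 60 costs you half
print(effective_fps(25))   # 20.0
```

Anything between 31 and 59 rendered fps collapses to a displayed 30, which is exactly the "58fps drops to 30" case described above.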



with adaptive sync enabled in the nvidia driver on a global level games cap at the 60 fps so no tearing, no heat issues but there is still a lot of drops down into the 30's (because the 1060 just can't keep up with 3440x1440)
You have no tearing as long as you hit 60FPS with vsync enabled. As soon as your FPS drops below 60 (or whatever the threshold for adaptive sync is), vsync is disabled and you are playing with no sync, with tearing.
Adaptive sync is NOT a specific sync method; it's just a way to enable and disable vsync. It does not give you the benefits of both at the same time, but chooses whichever it thinks will hurt you less.
So no, you DO have tearing (on a normal monitor) when you drop below 60.



you said pretty much what i was planning, get an nvidia card that can maintain 60 fps, enable adaptive sync

If you are able to maintain 60FPS (at a 60Hz refresh rate, constantly), having adaptive vsync just means you always have vsync on anyway.
You might as well disable adaptive vsync and save the monitoring resources, since you don't need it anymore.
But if it makes you feel safer, that is absolutely your choice.
I would recommend looking into using fast sync instead, though (again, more on that in a more technical walkthrough).
Again, I'm not here to tell you what to do, just trying to explain the choices.


Have never looked into the fast sync, i thought it was more for ridiculous high fps above what the monitor can display
It is for both. Fast sync works with low FPS (but not as well as gsync/freesync) and is good for high FPS (it's the only solution that removes tearing at high FPS).
With fast sync you get close to no-sync input delay on the rendering side. However, you can argue there is still an average of half a cycle time of delay before the picture is on the monitor, compared to no sync with tearing.


Again, I will happily go over what happens with the GPU's rendering, framebuffers and sync if need be.


In short:
1: Go for high FPS.
2: Patch things up with gsync/freesync (or fast sync if the former is not available) if your FPS is below the monitor refresh rate.
2B: Enable triple buffering for OpenGL, since fast sync only seems to work with DX on my end. However, I have no confirmation of that, so take it with a grain of salt. (Triple buffering in DX is horrible; never do that, ever.)
3: Enable fast sync if your FPS goes above the monitor refresh rate.
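The steps above can be restated as a tiny decision function. To be clear, this is purely my own illustration: the function name, the return strings, and the `api` flag are hypothetical, not any real driver API.

```python
# Hypothetical helper restating the rule-of-thumb list above;
# names and values are illustrative only, not an actual API.

def pick_sync_mode(fps: float, refresh_hz: float, has_vrr: bool, api: str = "dx") -> str:
    if fps < refresh_hz:
        if has_vrr:
            return "freesync/gsync"   # step 2: VRR patches low FPS best
        # step 2B: fast sync, with triple buffering as the OpenGL fallback
        return "triple buffering" if api == "opengl" else "fast sync"
    return "fast sync"                # step 3: FPS at/above the refresh rate
```

So a 40fps game on a 60Hz freesync panel lands on "freesync/gsync", while a 150fps game on the same panel lands on "fast sync".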
 
Even my 1080TI in SLI won't always keep 7680x1440 pegged at 60hz. And when it deviates it's noticeable. (not to mention frame tearing)


Stop making strawmen. I never spoke to gsync; I'm only speaking to freesync. I haven't spent enough time with gsync to formulate a valid opinion, having only sat in front of a gsync setup for a couple rounds of a couple of different games. But I've used freesync for nearly a year now, and when it's absent it's very clear.




This isn't my first rodeo. The HP Omen 32" monitors I have frame skip at 75hz, so I have to set them to 60hz.
So with the 1080TI SLI I'm forced to use vsync with 60hz on the HP Omen monitors.

It's not as smooth in run of the mill daily gaming. To the point I noticed, and several of my friends noticed. Not to mention I'm encountering what I believe to be microstutter with SLI, that I never experienced with the Crossfire setup and Freesync engaged. --- so much so that after just a few days and going back and forth I keep SLI mostly off.

Even just 60Hz/60FPS with the 1080TI doesn't feel as smooth as Freesync with more variety of frame rate on the Fury X crossfire.


You say placebo. I say not. Placebo would typically have you want to buy the more expensive, more desirable, product. In this case I'm telling you plainly that I'd rather game on my monitors with two Fury X and freesync than two 1080TI and vsync -- without question.

Have you personally spent some time with gsync or freesync? If the answer is no, then you have nothing of value to add to this particular thread.



I am not making strawmen; I'm just making sure to cover all of the sync options, as they all act slightly differently.
But OK, let's talk freesync only.
And let me note that I consider freesync superior to gsync in regards to input lag. The difference is small, but it's there, so please keep that in mind.


Even my 1080TI in SLI won't always keep 7680x1440 pegged at 60hz. And when it deviates it's noticeable. (not to mention frame tearing)

It is irrelevant how fast your system is; we were debating a specific situation. Arguing that your system is not good enough for that situation does not change the facts about that situation.
It's irrelevant to what I said and to what you are trying to "correct".


This isn't my first rodeo. The HP Omen 32" monitors I have frame skip at 75hz, so I have to set them to 60hz.
OK then, and I assume that during the freesync testing you also did not go above 60Hz?


Even just 60Hz/60FPS with the 1080TI doesn't feel as smooth as Freesync with more variety of frame rate on the Fury X crossfire.

Still, I don't see any testing or technical evidence, just he-said-she-said and the basic placebo effect. Are you actually arguing that randomly going below 60fps is smoother than being at 60fps constantly? And why would your card suddenly become slower and unable to deliver 60fps when you are running freesync?


You say placebo. I say not. Placebo would typically have you want to buy the more expensive, more desirable, product.
You have no idea what placebo is. I'm sorry, but you are 100% wrong about what placebo is.
Placebo is perceiving an effect because your subconscious wants you to. It does not have anything to do with wanting to buy more stuff; however, it is often an effect FROM buying expensive stuff.
Basically, you are perceiving wishful thinking, and it can come in many forms.

Example: when people say "yeah, I can clearly hear this is an mp3 because it lacks bass", they perceive it as worse because they "know" it's supposed to be worse, due to their knowledge of it being an mp3 file.
Put the same person in a proper ABX test and they might not detect any difference at all, because ABX isolates the knowledge and thereby eliminates the placebo effect.

ANY experiment regarding people's perception that is NOT done in a proper placebo-isolating way risks being influenced by placebo.
And any experiment regarding perception that is not double-blind is simply not sound or proper in any way, and its evidential value is close to nil.


Not to mention I'm encountering what I believe to be microstutter with SLI, that I never experienced with the Crossfire setup and Freesync engaged. --- so much so that after just a few days and going back and forth I keep SLI mostly off.
If you are getting microstutter, your FPS is dipping below the 60 mark and is thereby NOT running 60/60 as I said. And again, you are talking about a different situation than the one I was talking about.
59.9999/60 is not the same as 60/60, so of course you experience issues, because it's NOT what I was talking about.

Again, your view is just too simplistic; you bundle everything under one thing, which it is not.



Have you personally spent some time with gysync or freesync. If the answer is no, then you having nothing of value to add to this particular thread.
Very typical response from placebo believers and non-scientific people: you can't know how things work unless you have tried them... you can't know how to make the optimal healing gear for your healer unless you play one... BS, the math and physics are there.
Just because you prefer not to understand things, but only to perceive them, does not in any way mean you have ultimate knowledge of how things are. On the contrary, people like that typically have very little knowledge, because they simply don't read into and understand the effects they are perceiving.

No wonder you don't understand the difference between 60/60 and "close to 60/60": you don't understand the technical details and are going solely by a perception that has nothing to do with 60/60.




In short:
You are bringing up issues from your system NOT being able to run 60/60 (by your own information) and trying to use them in a 60/60 situation, which you simply did not have.
 
Let's bring it down to a ms-by-ms basis.

Running 60FPS/60Hz, aka 16.7ms frame and refresh cycle times.
Vsync on, no freesync/gsync.

0ms
Monitor is starting its refresh cycle 1 from... well, nothing, or whatever was before this.
GPU starts to render Frame1 in framebuffer A

16.7ms
GPU is done rendering Frame1 in framebuffer A
GPU does a buffer swap and starts rendering Frame2 in framebuffer B
Monitor starts refresh cycle 2, containing Frame1 from framebuffer A

aka Frame1 is being shown on screen

33.3ms
GPU is done rendering Frame2 in framebuffer B
GPU does a buffer swap and starts rendering Frame3 in framebuffer A
Monitor starts refresh cycle 3, containing Frame2 from framebuffer B

aka Frame2 is being shown on screen

50ms
GPU is done rendering Frame3 in framebuffer A
GPU does a buffer swap and starts rendering Frame4 in framebuffer B
Monitor starts refresh cycle 4, containing Frame3 from framebuffer A

aka Frame3 is being shown on screen

Perfect world, huh.
Notice that there is a 16.7ms rendering delay, and no tearing.



Let's try with freesync/gsync, running 60FPS, aka 16.7ms frame cycle time and a variable refresh cycle time.

0ms
Monitor is starting its refresh cycle 1 from... well, nothing, or whatever was before this.
GPU starts to render Frame1 in framebuffer A

16.7ms
Monitor checks if a new framebuffer is ready to be shown
GPU is done rendering Frame1 in framebuffer A
GPU does a buffer swap and starts rendering Frame2 in framebuffer B
Since there was a new framebuffer ready, Monitor starts refresh cycle 2, containing Frame1 from framebuffer A

aka Frame1 is being shown on screen

33.3ms
Monitor checks if a new framebuffer is ready to be shown
GPU is done rendering Frame2 in framebuffer B
GPU does a buffer swap and starts rendering Frame3 in framebuffer A
Since there was a new framebuffer ready, Monitor starts refresh cycle 3, containing Frame2 from framebuffer B

aka Frame2 is being shown on screen

50ms
Monitor checks if a new framebuffer is ready to be shown
GPU is done rendering Frame3 in framebuffer A
GPU does a buffer swap and starts rendering Frame4 in framebuffer B
Since there was a new framebuffer ready, Monitor starts refresh cycle 4, containing Frame3 from framebuffer A

aka Frame3 is being shown on screen

Same freaking thing. Why? Because the only effect freesync can have, delaying the refresh cycle, was unneeded to begin with.
It is simply not needed at this speed; the only thing freesync can do is delay the refresh cycles... and no delay was needed.


Now let's look at "close to 60fps". Let's do 50fps and 60Hz, aka a 20ms frame time and a 16.7ms refresh cycle time.
This is what the OP is experiencing when he sees his FPS drop.

0ms
Monitor is starting its refresh cycle 1 from... well, nothing, or whatever was before this.
GPU starts to render Frame1 in framebuffer A

16.7ms
Monitor starts refresh cycle 2, but no buffer swap has happened yet, so it repeats the previous "nothing, or whatever was before this"

aka still no frame being shown on screen

20ms
GPU is done rendering Frame1 in framebuffer A
GPU can't do a buffer swap because it has to wait for the refresh cycle (aka vsync)
GPU has nowhere to render now: framebuffer A has the new waiting Frame1, and framebuffer B is still being used by the monitor to show "nothing, or whatever was before this"
GPU stalls and waits (power and heat are reduced)

33.3ms
GPU does a buffer swap and starts rendering Frame2 in framebuffer B
Monitor starts refresh cycle 3, containing Frame1 from framebuffer A

aka Frame1 is being shown on screen

50ms
Monitor starts refresh cycle 4, but no buffer swap has happened yet, so it repeats the previous "Frame1 from framebuffer A"

aka Frame1 is still being shown on screen

53.3ms
GPU is done rendering Frame2 in framebuffer B
GPU can't do a buffer swap because it has to wait for the refresh cycle (aka vsync)
GPU has nowhere to render now: framebuffer B has the new waiting Frame2, and framebuffer A is still being used by the monitor to show Frame1
GPU stalls and waits (power and heat are reduced)

aka Frame1 is being shown on screen

66.7ms
GPU does a buffer swap and starts rendering Frame3 in framebuffer A
Monitor starts refresh cycle 5, containing Frame2 from framebuffer B

aka Frame2 is being shown on screen

Notice here how a new frame is now displayed only around every 33.3ms, aka 30FPS, even though our graphics card is able to handle 20ms/50fps rendering. It simply misses its chance, and then frame drops occur.
Again, this happens ONLY as long as your frame render time is slower than your refresh cycle time... and yes, it is horrible.
But it is NOT 60/60; it was 50/60... and that's a huge difference.




Wanna try freesync at 50fps, to see the wonderful benefits of freesync/gsync?
Why not; it would only be fair to also show where freesync/gsync shines.
50FPS with freesync, aka a 20ms frame time and a variable refresh cycle time.

0ms
Monitor is starting its refresh cycle 1 from... well, nothing, or whatever was before this.
GPU starts to render Frame1 in framebuffer A

16.7ms
Monitor checks if a new framebuffer is ready to be shown... there is not, so it delays the next refresh cycle

20ms
GPU is done rendering Frame1 in framebuffer A
GPU does a buffer swap and starts rendering Frame2 in framebuffer B
Monitor senses the buffer swap and starts refresh cycle 2, containing Frame1 from framebuffer A

aka Frame1 is being shown on screen

36.7ms
Monitor checks if a new framebuffer is ready to be shown... there is not, so it delays the next refresh cycle

40ms
GPU is done rendering Frame2 in framebuffer B
GPU does a buffer swap and starts rendering Frame3 in framebuffer A
Monitor senses the buffer swap and starts refresh cycle 3, containing Frame2 from framebuffer B

aka Frame2 is being shown on screen

56.7ms
Monitor checks if a new framebuffer is ready to be shown... there is not, so it delays the next refresh cycle

60ms
GPU is done rendering Frame3 in framebuffer A
GPU does a buffer swap and starts rendering Frame4 in framebuffer B
Monitor senses the buffer swap and starts refresh cycle 4, containing Frame3 from framebuffer A

aka Frame3 is being shown on screen

Look at that beauty. Now we see a 20ms delay from starting to render a frame to it being shown on screen, aka 20ms = 50fps.
But also notice how the screen has 20ms between refresh cycles, aka the monitor is running at 50Hz.

THIS is where freesync/gsync shines: instead of dropping to a 33.3ms frame display interval like above, we are down to 20ms, which is the real FPS of the graphics card.
Still not as good as the 16.7ms render delay we got at the perfect 60/60, but a huge improvement over 50/60 with vsync.
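For anyone who wants to poke at these two 50fps timelines themselves, here is a toy simulation of them. This is my own sketch, not from any driver or spec; it models only the double-buffer wait and the variable-refresh delay, nothing else.

```python
# Toy model of the two timelines above: a 20 ms render time (50 fps GPU)
# against a 16.7 ms refresh (60 Hz panel), double-buffered.
# vrr=False -> vsync: a finished frame waits for the next refresh tick.
# vrr=True  -> freesync/gsync: the panel delays its refresh for the frame.

def display_times(render_ms, refresh_ms, vrr, duration_ms=200):
    """Return the timestamps (ms) at which a *new* frame hits the screen."""
    shown = []
    frame_ready = render_ms      # first frame finishes rendering at t=render_ms
    t = refresh_ms               # first refresh tick after t=0
    while t <= duration_ms:
        if vrr and frame_ready > t:
            t = frame_ready      # VRR: stretch the refresh cycle until ready
        if frame_ready <= t:
            shown.append(round(t, 1))
            # double buffering: GPU swaps at the refresh, then renders the next
            frame_ready = t + render_ms
        t += refresh_ms
    return shown

print(display_times(20.0, 1000 / 60, vrr=False))  # new frame every ~33.3 ms -> 30 fps
print(display_times(20.0, 1000 / 60, vrr=True))   # new frame every 20 ms -> 50 fps
```

The vsync run shows a new frame only every other refresh (the 33.3ms cadence from the walkthrough), while the VRR run shows one every 20ms, matching the 50Hz-panel behavior described above.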


I hope this also illustrates why Archaea's experience with "close to 60fps" is not in any way or shape representative of a true 60/60.



I would love to show fast sync too... but once you introduce a third buffer it becomes pretty chaotic to follow, so I'm not going to.
 
SvenBent,

You don't know much about me, and yet you assumed a great deal. I organize and participate in many blind tests in my audio hobby; not so much on the PC side, because there isn't as much call to do so. (see below)

Yet my question still remains - have you tried freesync or gsync personally, for any length of time? (preferably on your own machine, where you can use it for a significant portion of time). Simply put, rattling off the science isn't the same as sitting in front of it.

I had a friend who had a 1070, but also had a freesync monitor. I let him borrow one of my Fury X cards for about a month so he could play with freesync. He thought it was an improvement, but he wasn't sure it was worth "downgrading" to a Fury X from his superior 1070. I asked for the card back, so he put his 1070 back in; within just a couple days he sold his 1070 and ordered his own Fury X -- of his own accord, no prompting on my part. I then got to tease him, because he had always previously badmouthed the Fury X and AMD as not being as good as Nvidia. He said when he left freesync and went back to vsync everything felt slightly laggy and just not as smooth, and he hated the vsync mouse lag. I can agree from my own experiences -- now that I've done the same thing. And any tearing being gone with the occasional frame rate deviation is a big bonus as well.

Here's my personal testimony on sync technology. I played Shadow of Mordor at 7680x1440 on my Fury X crossfire with freesync, beginning to end. My frame rates varied from the occasional low 40s FPS up to the 72FPS upper limit where I enabled frame rate target control. It felt amazingly smooth with all that variation -- so long as the game operated in the freesync FPS range. Lots of times I'd only be between 50 and 60 FPS, and yet it felt smoother than attempting to lock in 60 FPS without the sync technology. The frame rate was far from consistent at 7680x1440 with the Fury X Crossfire --- it didn't matter. Only when the FPS dipped out of freesync range (below 48hz) did I perceive an issue in smoothness, control, and experience -- it was immediately noticeable, and true to form, anytime I looked up at the frame rate counter when the experience was degraded it was always below freesync range, and ONLY when it was below freesync range was this degraded state observed.
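The "out of freesync range" behaviour described above boils down to a simple window check. The bounds below are the ones mentioned in this post (48Hz floor, 72FPS frame-target cap) and the function name is invented for illustration; some monitor/driver combinations also apply Low Framerate Compensation below the window, which this sketch ignores:

```python
FREESYNC_MIN_HZ = 48  # below this, the monitor falls out of its VRR window
FREESYNC_MAX_HZ = 72  # frame rate target cap the poster set

def freesync_active(fps):
    """True while the frame rate sits inside the monitor's VRR window."""
    return FREESYNC_MIN_HZ <= fps <= FREESYNC_MAX_HZ
```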

60 FPS with no deviation feels fine, but when it deviates you know it. It feels clumsy, the mouse, or control consistency is gone. Also the vsync mouse lag is a bad deal. You want a simple test? Go into the Doom engine menus. Enable vsync, move your mouse around in the menus. Disable vsync, move your mouse around in the menus. Feel that difference? That vsync lag is detracting from your gameplay experience -- and is certainly noticeable once you've gone without it for a while and you re-engage it.


---------------------------------------------------------------
As for not knowing what placebo is, or the value of blind testing: I've participated in quite a few blind tests with eye-opening results in my audio hobby.

Here are the blind testing audio g2g's I thought were most valid/interesting that I've participated in.

2012 KC Blind Meet (subwoofers)
Host - Learned not even the most avid subwoofer enthusiasts on avsforum could identify ported vs sealed vs horn subs in blind testing, in an unfamiliar room. HUGE surprise actually. Differentiation and preferences noted in scoring trends divided subs into clear tiers across the board - but based on capability (and price), not at all box alignment.
http://www.avsforum.com/forum/113-s...sas-city-blind-subwoofer-shootout-2012-a.html

2012 KC Blind Meet (speakers)
Participant - Learned that an expensive pair of speakers we all expected to wholly dominate were only slightly preferred in blind testing over a much cheaper set of speakers.
http://www.avsforum.com/t/1422525/s...s-bill-fitzmaurice-dr-250s/0_20#post_22263924

2013 Iowa Blind Meet (speakers) - post 3480
Participant - Learned that two audio enthusiasts sitting immediately next to each other in a controlled listening test could wholly prefer different speakers. (speaker preferences are NOT identical) - and that musical preferences were remarkably different and clear between a big group of enthusiasts. Big Surprise - to this point I figured that everyone would like the same 'superior speaker' if exposed to the same test in the same conditions.
http://www.avsforum.com/forum/61-area-home-theater-meets/871474-ia-meet-116.html#post22789496

2013 - the amp comparison that was a failure because we found out Audyssey engaged accidentally after a power outage - throwing off our results - but still had interesting implications for those who attempted the retest on day 2.
http://www.avsforum.com/forum/61-area-home-theater-meets/1472180-do-amps-matter-kansas-city-blind-amp-comparison-gtg.html
Participant and Host - Learned that all tested amps sounded the same (except for a super cheap HTIB Onkyo receiver) - everything from a $20 T-amp on up to an $800 Emotiva 2 channel audiophile amp was indistinguishable in instant switching when played at reasonable/typical volumes. Somewhat surprised.

2014 PA Gorilla83 HT Heavy Hitters Loudspeaker Blind meet (speakers)
Participant - Learned that completely divergent speaker designs sounded more alike than different and I could be happy (read indifferent) with any number of design choices/speaker choices - and discovered I couldn't correctly identify any of them. I was personally surprised as there were VERY different designs in play.
http://www.avsforum.com/forum/89-sp...speaker-gtg-results-thread.html#post_24235005

2014 Auto Room EQ Processor Blind comparison (Audyssey, MCACC, DIRAC, YPAO, Trinnov, ARC, AccuEQ)
Host - Learned that auto EQ and 'reference' across different AVR and Auto EQ systems is in a state of absolute chaos, and there is no trusted "Reference" in our hobby. Surprised the state of our hobby was that bad.
http://www.avsforum.com/forum/90-re...mparison-g2g-november-8-2014-kansas-city.html

2014 - The blind test of the JTR 228HT vs. Mackie in my room, where I learned the power of personal bias. Post 30.
http://www.avsforum.com/forum/155-diy-speakers-subs/1535032-mackie-c200-under-hood.html#post28544890
Participant - Learned that differences I thought I could clearly identify in sighted listening (in fact stacking the deck by picking specific songs I knew very well for the audition music) - I couldn't identify in blind testing. Quite surprised.

2016 - And finally a relatively recent blind comparison of four speakers (SVS, Mackie, JBL, DIY)
http://www.avsforum.com/forum/89-sp...mackie-c200-vs-qsc-k10-vs-diysg-volt-8-a.html
Host - further confirmed that people's audio preferences are not universal
 
Last edited:
Even my 1080TI in SLI won't always keep 7680x1440 pegged at 60hz. And when it deviates it's noticeable. (not to mention frame tearing)


Stop making strawmen. I never spoke to gsync. I'm only speaking to freesync. I haven't spent enough time with gsync to form a valid opinion, having only sat in front of a gsync setup for a couple rounds of a couple different games. -- But I've used Freesync for nearly a year now -- and when it's absent, it's very clear.




This isn't my first rodeo. The HP Omen 32" monitors I have frame skip at 75hz, so I have to set them to 60hz.
So with the 1080TI SLI I'm forced to use vsync with 60hz on the HP Omen monitors.

It's not as smooth in run of the mill daily gaming. To the point I immediately noticed, and when I participated in a LAN party a couple days after doing the GPU swap - I actually felt disheartened enough to not enjoy myself. The lack of smoothness I pointed out to a couple of my friends, and they also noticed. They also used either gsync or freesync technology.

Not to mention I'm encountering what I believe to be microstutter with SLI, that I never experienced with the Crossfire setup and Freesync engaged on the monitors. --- so much so that after just a few days and going back and forth I keep SLI mostly off now. I've been on the 1080TI for coming up on a month - and I'm slowly getting more used to 60hz vsync, but it simply isn't as smooth, nor responsive as freesync and no vsync.

You say placebo. I say not. Placebo would typically have you want to buy the more expensive, more desirable, product. So one would think I would gravitate toward the pair of 1080TI, since they are clearly superior, on paper, than two Fury X.
In this case I'm telling you plainly that I'd rather game on my monitors with two Fury X and freesync than two 1080TI and 60hz locked vsync -- without question!

Have you personally spent some time with gsync or freesync? If the answer is no, then you have nothing of value to add to this particular thread.

You have a 1080ti for the performance, and probably had to go that route because you were waiting on the upgrade cycle from AMD, so you were not getting the value out of your Freesync.
But with that experience it is probably a coin toss in terms of value, because you may find yourself back in the same situation 12 months from now, needing the Volta GV104 or the 1080ti to run AAA games well at higher resolution - meaning it may make sense to just bite the bullet on a decent GSync monitor when a sale is on now and again.

If you end up selling your 1080ti but replacing Vega again for Nvidia again in 12-15 months, well that Freesync monitor will have been a serious anchor.
I appreciate you are not the only one in this situation, as others went 390x+Freesync and ended up replacing the 390X with Pascal but kept their Freesync monitor and now looking to replace their Pascal card rather than monitor (which could mean they repeat the whole cycle again themselves 12-15 months from now going back to Nvidia).

Yeah a lot of whatifs, but that is unfortunately the situation you and quite a few others are in, and also those that have to decide whether they are going all in with Nvidia or AMD who are not yet tied to either manufacturer.
Cheers
 
The thing with FreeSync and GSync is you don't have to worry as much about keeping the frame rates super high for smooth game play. A Radeon Vega 64 and a 3440x1440 Freesync monitor is probably a very potent and great gaming setup. It took a 1080Ti on my 3440x1440 monitor to get smooth game play, and that's at 60hz. On any game where the 1080Ti can't maintain 60fps I will have to reduce settings or resolution for smooth game play. Maybe I should sell that and buy a GSync monitor - not. I figure my next major monitor purchase will be an HDR one, probably in 2019.

Now maybe I misunderstood some of the explanation above, but a FreeSync/Gsync monitor's refresh rate varies with the card's output. In other words it does not stay at 75hz but could be at 40hz, 42hz or anywhere within its range. This also means a varying frame rate is being displayed to the viewer - some faster frame times and then slower frame times. A straight 60fps has a consistent 16.7ms frame time, and there is smoothness in having a consistent frame time from one frame to the next. FreeSync/Gsync prevents screen tearing and frame judder (two or more frames repeating) - it does not prevent uneven frame times. To me a consistent 60fps would be smoother than an average 60fps using Freesync/Gsync. A consistent 100fps even better. One reason VR prefers a consistent 90fps is that we sense motion acutely, and an inconsistent frame rate confuses the brain, so to speak; once we engage both eyes with depth perception information, uneven motion due to varying frame times can cause motion sickness. In 2D it is easier to fool the brain on motion; in VR, where our full perceptual ability is more exposed and used, even 90fps is probably not fast enough.

Anyway, I do have a FreeSync monitor, 4K 60hz, and a consistent 60FPS is better than a varying 40fps-60fps with Freesync. Still, FreeSync is way better than a varying 40fps-60fps+ without Freesync. Vertical sync at less than the refresh rate of the monitor introduces mouse lag; if your rendering and monitor are in sync you don't have extra mouse lag or judder either. I hate mouse lag more than tearing, which makes Adaptive sync the best option for me on the 3440x1440 monitor.
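The frame-time argument above can be made concrete with some quick arithmetic. The fps samples are invented purely for illustration and the helper name is made up:

```python
def frame_times_ms(fps_samples):
    """Convert per-frame fps readings into frame times in milliseconds."""
    return [1000.0 / fps for fps in fps_samples]

locked = frame_times_ms([60, 60, 60, 60])       # 16.7 ms, every single frame
varying = frame_times_ms([40, 75, 55, 48, 70])  # tear-free under VRR, but uneven

# A locked 60 fps has zero frame-time spread; the VRR session above swings
# between 13.3 ms (75 fps) and 25 ms (40 fps), a spread of roughly 11.7 ms.
spread = max(varying) - min(varying)
```

Both sequences display without tearing on a VRR monitor, but only the first delivers identical frame-to-frame pacing, which is the poster's point about a consistent 60fps feeling smoother than an average 60fps.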
 
SvenBent

Watch this video starting at 13:34. Bear in mind he's been basically saying/showing the Vega is a failed product offering during the video to this point based on just FPS delta and power draw. He was given a water cooled variant to review, and was appropriately pretty harsh on the power usage and "turbo mode".

However --- in his conclusion he lays out that he's been reviewing on a non adaptive sync monitor, and the conclusion changes if you have adaptive sync tech. --- So make sure to watch what he says at 15:20!

He said he'd be more upset to give up the Vega card than the 1080TI -- even given the numbers based performance gap, because he knows what freesync has done for his gaming experience in the last year +.

That parallels my experience, and is what I've been attempting to convey.

 



Or he could give up the Freesync monitor and wait to buy a GSync one from one of the better retailers when they have them on sale now and again (although one needs to be careful selecting a GSync monitor, as many are not worth their price)....
But he is thinking that he will not be replacing Vega with anything but another future AMD GPU.
And heavily impacting this decision is how game requirements ramp up over the next 12 months, and how early into 2019 we get Navi; the context being whether you want enthusiast level settings at 1440p with high refresh, or 4K at 60Hz with max settings.

Not saying his point is not valid, just that it ignores some looming value considerations if one keeps having to change to a competitor.

In general and not directed at you: regarding fluctuating performance with VRR, there are quite a few on here who are adamant that for long term gaming they prefer VRR at a high refresh rate over VRR fluctuating at much lower refresh rates.
VRR is great technology but not a magic bullet that totally removes the need to upgrade.
Cheers
 
10-4
 
I think variable refresh monitors and what is acceptable is going to be a very personal thing; maybe some are happy with lower refresh rates and others are not, going by some owners on this forum *shrug*.
Makes it even harder to reach a consensus with that being the case.
The only way to know for sure is for those happy with the lower rates to spend time with a higher performing VRR GPU+monitor.
Might be that once they've done that they cannot go back to a weaker VRR setup, the same way those using VRR even at lower refresh rates cannot go back to a more powerful GPU without such monitor technology - using it changes your perspective, as we see with some owners on here going for ever higher VRR once they have experienced it.
Cheers
 

fair point.
FWIW, I did spend a couple weeks with a 144Hz Freesync display vs. my 75Hz Freesync display. I personally don't think I could tell the difference in Freesync 75Hz vs Freesync 144Hz in games if my very life depended on it. It's fair to assume there is a bell curve though, and some people may perceive what I cannot, etc.
 
So it sounds like it will be a very personal thing, where it is fine for some but not others.
Good to know. It probably means the best suggestion for anyone going this route without experience, like quite a few here, is to try both at 45-65Hz (fps) and also at say 100-144Hz (fps) before making a decision on what they can live with.
Cheers
 
4K has been limited to 60Hz for a while, until just now. So the choice of 4K plus high refresh simply wasn't there, and it was also limited by GPU speeds. VRR monitors probably make more impact at lower FPS conditions.
 


I was in the exact same boat as you a couple of days ago. I had sold my RX 480 on ebay for a ridiculous amount of money and wanted to upgrade to a Vega 56 for my ITX system. I have the same power supply as you and knew I could only run the 56, but only saw the 64 available. The problem I see is that the pricing is totally inflated and out of whack, even if by chance you found one in stock. These cards do not perform like $500 or $600 cards, so I can't see paying that. I have 3 27" Samsung curved freesync monitors so I really wanted to stay with AMD, but the availability and pricing is just insane right now and it doesn't make any sense. So I ended up buying a Zotac GTX 1080 mini from newegg for $539, and the performance is like double my RX 480's; from what I can tell it performs just like a full size 1080. The power draw and temps are great with this card and it fits really well in my Silverstone SG13 case. I don't really miss freesync that much; all my games run at mostly 100fps so it doesn't really matter anyways. The GTX 1080 priced at $539 is pretty fair.
 
lol i did the same, ordered a gigabyte 1080 since they dropped almost 100 bucks on Amazon.ca

The 1080ti was overkill and a lot more money, and I'm only playing Warcraft and Doom @ 3440x1440. Also less heat and power in ITX... less noise.
 
Interesting, I'm curious...
What kind of games do you play? When I went away from freesync I noticed bad tearing and jutter/lack of smoothness in my FPS games right away to the point it felt gross --
do not want!
 

That's because you're running a 75Hz panel. @ 144Hz you'd be hard pressed to see any tearing even without adaptive sync
 

I've been a high FPS user for years, and am now a Gsync and freesync user on several machines. Once you go above 120hz there's nothing that can make you go back, especially if you can sustain a steady 100+ FPS. If you have 144hz-capable hardware in both the CPU and GPU department, Gsync/freesync will be pointless - and worse than that, Gsync/freesync will not offer the same fast animation feel and smoothness. It's one of the reasons I kept 1080p as my default resolution even with high end cards until recently. Now I've moved to 1440p at high FPS and it will stay this way for a very long time, until I can reach the same maxed out game settings at high FPS on 4K panels.

Gsync and freesync can make variable FPS always feel like a 60hz synced panel on every frame. But a steady 120hz/144hz is another level.
 
Interesting. I don't think I can perceive that. I had a 144hz monitor for a couple weeks with freesync and sold it because I couldn't tell a difference between 75hz freesync and 144hz freesync in games. But I could definitely tell when freesync was on vs off. The difference was very clear.
 

Which GPU were you using with the 144hz monitor?
I appreciate one can tweak their game settings, but I'm just curious if you had to lower them much; maybe it comes back to the game played *shrug*.
Cheers
 
I was mostly playing through Doom actively at the time --- highest detail settings outside of Nightmare, which required more than 4GB of VRAM to use that preset. Fury X (just a single card, because crossfire didn't work with Doom) - IIRC on the Vulkan renderer. 90ish FPS lows at 2560x1080 -- averages well above 100 FPS. I was also spending a lot of time with Star Wars Battlefront (crossfire supported), Dirt Rally (crossfire supported), and Path of Exile at the time. I loaded up other games to test, but those were the ones I was spending the most time with and actively enjoying.

I recorded this comparison video at the time I was comparing the three monitors. HP Omen 32" (75Hz, 1440p, freesync) vs Dell 3014 (60hz, 2560x1600), vs. 35" ultrawide Acer xz350cu (144Hz, 2560x1080, freesync)




System Specs
I7-4770k o/ced to 4.5Ghz
16GB DDR3
2x Fury X Crossfire

I ended up liking the Freesync Omen the best, and bought two more.
 
Araxie, I don't think I tried 144hz without freesync --- since I had AMD cards I just left freesync on. But 75hz freesync and 144hz freesync were very comparable to me - to the point I don't think I could tell the difference. I think freesync smoothness really grew on me too. It might be one of those things, for me, that I appreciated more when it was taken away (missed it more) than I was excited about when I first experienced it. But that's not to say it wasn't noticeable at first either -- it's just more telling how obvious it was when it was gone and I went back to 60Hz vsync.

Interestingly enough - after playing with the 1080TI pair at 60hz vsync for a month I'm starting to lose that initial disgust I held for 60hz vsync after coming back from 75hz freesync. Not that it's okay - it's just human nature to adjust to what you have. When I first left freesync for 60hz vsync again I was actually disgusted enough to feel like I wanted to leave a LAN party (which I really really enjoy) because my mouse control felt kludgy, my visuals seemed jittery, I was seeing tearing again -- it was really off-putting, to the point I wasn't having as much fun gaming and felt like - forget it - not worth it.

I remember when I first started playing with freesync and seeing just how smooth Doom felt, controlled, and looked - it was a MAJOR upgrade from my 60hz Dell 3014 panel gaming experience. I replayed levels and lost track of time because it looked so good, felt so smooth, played so nice. Even though the Dell 3014 panel looked better than either the Omen or the Acer panels (better colors, better gamma, better resolution, though less desirable black levels than the Omen's VA panel) --- the smoothness of freesync outweighed the Dell 3014's better visual quality for gaming.
 

Player Unknown's Battlegrounds, CSGO, Rocket League, Killing Floor 2, Overwatch. I mostly wanted to upgrade for Battlegrounds; I went from the RX 480 @ 50-55fps to the 1080 @ 100fps - seriously, it was like double the performance with this card. This card has headroom for overclocking too, though I haven't really felt the need to increase the power limit. Stock out of the box is such an improvement from the RX 480.
 
Of those games, PUBG works terribly on AMD - the others probably didn't see much improvement, did they, assuming they were hitting 48FPS and above consistently and freesync was engaged? You would have been out of freesync range a good portion of the time with PUBG, I'd think. My Fury X couldn't handle PUBG well at all either - and it didn't support crossfire - so I returned the game through Steam. My brother's 1080TI at the same LAN party was handling it without an issue. This was a few months back. I've not played it since.

That video I linked above has the host saying his RX580 with freesync was a better gaming experience than a 1080TI with vsync.

You know, one thing I see is that our collective opinions/experiences seem to differ a great deal. Is it down to person-to-person differences in our vision systems' ability to perceive?
 
I'm firmly in the FreeSync camp after picking up a Dell 27SeHx on sale for $159 out the door. I have OC'd the LCD to 83Hz (was hoping for 90, but 85 and up acts funky). I have gamed since 13" CRTs were the new hotness, and for the last few years I have been unable to play games more than ~20 minutes without getting a terrible headache and feeling sick.

I went out and bought a 144hz LCD, thinking moar frames would solve the issue. While it helped, it did not make it go away completely. I returned the LCD and didn't play games for a few months outside of NES. Now that I have FS, I am able to play games without an issue once again! I know my use case is a bit different since not everyone has medical issues, but I would choose 75-90hz FS vs 144Hz no sync any day of the week. Even if I had never experienced simulation sickness, I'd still go the FS route.
 

Part of that comes back to my point in the FreeSync article: you need trained participants to get a more critical and correlated/aligned response, especially for subtle perception and preference differences. In the same way, you will not get a clear consensus on how much AA, and which kind, is needed to enjoy a game (higher performance vs. utmost AA visual quality). But once people are aware, they do seem to have an aligned quality preference. That shows up in audio in the context of distortion (and this is in essence a kind of distortion), and I think the same can be said about visual quality, where, as an example, chromatic aberration is mostly an irritation for many.

But then we have different levels of tolerance and thresholds, and it is possibly just that *shrug*.
I appreciate you used a diverse range of games, but we know Dirt 4 and Doom help bring AMD closer to the 1080 Ti if you're more sensitive to, say, the 80-100 Hz range, both with VRR monitors active. Although I would have expected Doom to be running over 170 fps on a 1080 Ti at the same settings, maybe hitting 200 fps at times, which is where Doom starts doing quirky things (the engine wasn't designed to hit that).
Yeah, I appreciate this is not something that can be verified relative to the same settings as used with the Fury X. In Dirt 4, a single custom 1080 Ti manages 107 fps at 1080p while a single Fury X hits 69.1 fps at the same settings.
http://www.pcgameshardware.de/Dirt-4-Spiel-21687/Specials/Benchmark-Test-Review-1230499/
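For what it's worth, those Dirt 4 numbers work out to roughly a 55% lead for the 1080 Ti; a quick check:

```python
# Relative performance from the Dirt 4 figures quoted above
# (107 fps for the custom 1080 Ti vs 69.1 fps for the Fury X at 1080p).
gtx_1080ti_fps = 107.0
fury_x_fps = 69.1

ratio = gtx_1080ti_fps / fury_x_fps
print(f"1080 Ti is {ratio:.2f}x the Fury X here ({(ratio - 1) * 100:.0f}% faster)")
```

Of course that is one title at one resolution, so it only loosely bounds how the two cards compare with VRR in the mix.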

Now, something that may be linked and that I really should look into more: some game engines actually make me feel sick for some reason, and I wonder if that is down to the lack of VRR, or whether it needs testing at a sustained 100+ Hz.
A game like that could be interesting for those who do mess around with both settings and find one that causes this for them.
Cheers
 