60hz vs 120/144hz question.

motolube

Gawd | Joined: Dec 18, 2006 | Messages: 1,002
Sorry if this has been asked to death. Old fart here, so I have always believed in the "the human eye cannot see more than 30 fps" line, but seeing some YouTube videos of the difference in gameplay between 60hz and 144hz has made me question my beliefs.

I am a Solo/Campaign gamer and I am not looking to set the world on fire. If you look at my rig (signature) you can see that I have been happy and content with low to mid settings, but I feel it is time I splurge on myself a bit, and I want to know whether buying a 144hz 1080p monitor is feasible with my current rig or if I have to upgrade to newer and/or better hardware first.

One thing I should mention is that I recently got a new Asus Strix 1050ti 4GB. Nothing to jump for joy over, but compared to my old GTX660 I guess it is an upgrade for me.

Thanks folks
 
It will make a huge difference. Just make sure that whatever you get also supports G-Sync, not just a high refresh rate panel.
 
The human eye can easily see 144hz in most cases. For some games it might not make much of a difference though, especially with the sample-and-hold nature of LCD monitors.
What does make a huge difference at 60hz is the very noticeable tearing, and the concessions you have to make to get rid of it without adaptive sync. At low refresh rates, tearing can be anywhere from mildly annoying to downright nasty, especially in first and third person games.

I'm trying out a 32" 1440p 60hz VA monitor at the moment, as a complementing monitor to my CRT monitor. I like it on the whole, even the motion performance of the panel, but a game like Skyrim SE is just unplayable for me - without V-Sync it tears like a motherf****, and adding V-Sync lag on top of the barely adequate responsiveness of this monitor makes it unacceptably unresponsive.

Luckily 144hz does make the tearing go from jarring to almost unnoticeable, so you don't necessarily have to pay extra for adaptive sync for your Nvidia card (assuming you got a 1050ti, not the 1080ti). You could also swap the 1050ti for an equally priced/performing AMD card, if you can find one, for Freesync support.
The $200-300 range does have some nice-looking-for-TN 24" 1080p 144hz Freesync monitors (Freesync gets you a modern product and doesn't cost you anything even if you can't use it with Nvidia), like the XG2401 or 24gm79, and the Samsung C24FG70 offers a good 144hz VA panel for increased image quality as well (if you can work with the curve).
 
Recently upgraded to an X34, which is 100Hz, after staying at 60Hz for over a decade. It makes gaming so much more enjoyable. Everything is so silky and smooth that I even catch myself just moving the mouse/camera around in game because it feels great. I really want to try something like 120Hz or 144Hz and see how much better it can get, but from what I hear the difference begins to taper off once you get past 75Hz.

You will not be disappointed with the upgrade. Just make sure your GPU can handle the frame output or else it will be all for naught.
 
Old fart here, so I have always believed in the "the human eye cannot see more than 30 fps" line

You are basically talking apples vs oranges here. 30fps with pre-determined motion blur, so that each of those 30 frames blurs into the next, is enough for the human eye. You can't really do that with interactive content though, because each frame has to be displayed as soon as possible to prevent lag. TVs sometimes create virtual frames in between the real frames to simulate the effect, but even that only works because the content on the TV (TV shows, movies, etc.) is known in advance. It would create noticeable lag with interactive content like a video game. Rendering more actual frames is the only real option if you want a monitor with no lag.
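
To put a rough number on why the interpolation trick doesn't work for games - this is just a toy sketch, not how any actual TV does it - the set can only blend frame N with frame N+1 once N+1 has arrived, so the picture always runs at least one source frame behind:

Code:
# Toy illustration only (made-up frames as flat lists of pixel values).
# To show a frame halfway between frame N and N+1, you must already have
# N+1 in hand, so the output is delayed by at least one source interval.
def midpoint(frame_a, frame_b):
    return [(a + b) / 2 for a, b in zip(frame_a, frame_b)]

print(midpoint([0, 0, 0], [255, 255, 255]))  # -> [127.5, 127.5, 127.5]

source_interval_ms = 1000 / 30        # 30 fps source material
added_lag_ms = source_interval_ms     # one buffered future frame, minimum
print(f"extra lag from interpolation: at least {added_lag_ms:.1f} ms")

That ~33 ms comes on top of whatever the panel already adds, which is fine for a movie but very noticeable under a mouse.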
 
Every game looks and "feels" better at 144hz. The explosions, driving, jumping, running, everything looks quite cinematic and "real". The difference between 60hz and 144hz is absolutely noticeable.
 
My strong opinion is try it before you buy.

I regularly switch between a 60hz monitor and a 144hz monitor.

I can't tell the difference.

I can see a big difference when I do that UFO test (https://www.testufo.com/#test=framerates), but when I'm actively involved in a game, I just don't notice.

I use vsync on my 60hz so tearing has never been an issue for me.

I use freesync on my 144hz. Supposedly that means I get less input lag (vs vsync), but I've never noticed input lag with vsync anyway.
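
For what it's worth, the worst case is easy to put a number on (back-of-the-envelope only, ignoring render queues and driver buffering): with vsync a finished frame can sit and wait for the next refresh, so the added wait is bounded by the refresh interval, which is probably why I never notice it.

Code:
# Rough upper bound on the extra wait vsync can add per frame
# (ignores triple buffering, render queues, etc. - ballpark only).
for hz in (60, 144):
    refresh_interval_ms = 1000 / hz
    print(f"{hz} Hz: up to ~{refresh_interval_ms:.1f} ms extra wait")
# 60 Hz -> ~16.7 ms, 144 Hz -> ~6.9 ms; freesync sidesteps the wait by
# refreshing whenever the frame is actually ready.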
 
WOW, you guys don't disappoint. Frankly, I am usually overwhelmed by the number of responses I get from you guys, but I do learn, and a lot, whenever I come here asking questions. Hopefully you won't mind me replying to all of you, since some replies may need a follow-up question.

It will make a huge difference. Just make sure that whatever you get also supports G-Sync, not just a high refresh rate panel.
Yeah, that whole thing between G-Sync & Freesync I think is whack, but it does not really matter why. More than likely, if I do decide to go with the upgrade, I will get G-Sync since I pretty much always game with Nvidia and not ATI/AMD.


The human eye can easily see 144hz in most cases. For some games it might not make much of a difference though, especially with the sample-and-hold nature of LCD monitors.
What does make a huge difference at 60hz is the very noticeable tearing, and the concessions you have to make to get rid of it without adaptive sync. At low refresh rates, tearing can be anywhere from mildly annoying to downright nasty, especially in first and third person games.

I'm trying out a 32" 1440p 60hz VA monitor at the moment, as a complementing monitor to my CRT monitor. I like it on the whole, even the motion performance of the panel, but a game like Skyrim SE is just unplayable for me - without V-Sync it tears like a motherf****, and adding V-Sync lag on top of the barely adequate responsiveness of this monitor makes it unacceptably unresponsive.

Luckily 144hz does make the tearing go from jarring to almost unnoticeable, so you don't necessarily have to pay extra for adaptive sync for your Nvidia card (assuming you got a 1050ti, not the 1080ti). You could also swap the 1050ti for an equally priced/performing AMD card, if you can find one, for Freesync support.
The $200-300 range does have some nice-looking-for-TN 24" 1080p 144hz Freesync monitors (Freesync gets you a modern product and doesn't cost you anything even if you can't use it with Nvidia), like the XG2401 or 24gm79, and the Samsung C24FG70 offers a good 144hz VA panel for increased image quality as well (if you can work with the curve).


Recently upgraded to an X34, which is 100Hz, after staying at 60Hz for over a decade. It makes gaming so much more enjoyable. Everything is so silky and smooth that I even catch myself just moving the mouse/camera around in game because it feels great. I really want to try something like 120Hz or 144Hz and see how much better it can get, but from what I hear the difference begins to taper off once you get past 75Hz.

You will not be disappointed with the upgrade. Just make sure your GPU can handle the frame output or else it will be all for naught.

I read a lot about you guys using 32" to 40" displays for gaming, and while I believe this could be sweet, I never asked the question... at what distances are you gaming with these huge monitors or TVs? I still game at a desk, so my distance from the monitor to my eyes is very short (no more than 2 feet), and I would figure a 32" or bigger TV/monitor is just too big at that distance to enjoy gaming?

If you guys are on a couch and about 8 to 12 feet away, what are you using to rest your keyboard and mouse?

And therein lies my main problem... how do I know that my GPU can handle the frame rate? Is there something in the specs that tells me that? Reading the specs, the only thing that makes some sense to me is that the max resolution is 7680 x 4320, so that tells me playing at 1080p should do the trick, but there is nothing there, at least that I can see, telling me whether it is capable of handling 144hz.




You are basically talking apples vs oranges here. 30fps with pre-determined motion blur, so that each of those 30 frames blurs into the next, is enough for the human eye. You can't really do that with interactive content though, because each frame has to be displayed as soon as possible to prevent lag. TVs sometimes create virtual frames in between the real frames to simulate the effect, but even that only works because the content on the TV (TV shows, movies, etc.) is known in advance. It would create noticeable lag with interactive content like a video game. Rendering more actual frames is the only real option if you want a monitor with no lag.
Sorry, way above my pay grade or ability to grasp all of that, but I believe I get the gist, even if it left me with more questions than before lol.

Every game looks and "feels" better at 144hz. The explosions, driving, jumping, running, everything looks quite cinematic and "real". The difference between 60hz and 144hz is absolutely noticeable.

My strong opinion is try it before you buy.

I regularly switch between a 60hz monitor and a 144hz monitor.

I can't tell the difference.

I can see a big difference when I do that UFO test (https://www.testufo.com/#test=framerates), but when I'm actively involved in a game, I just don't notice.

I use vsync on my 60hz so tearing has never been an issue for me.

I use freesync on my 144hz. Supposedly that means I get less input lag (vs vsync), but I've never noticed input lag with vsync anyway.

If I only play offline, do I still have to suffer from the lag that people suffer when playing online? Or is that minimal when talking about the difference between 60hz and 144hz?



Finally, if you guys had my rig and a couple hundred bucks for a monitor, taking into consideration the distance between my eyes and the monitor and the fact that I only play shooter games (CoD, MoH, BF, Far Cry, etc.)... which monitor would you buy?
 
Sorry if this has been asked to death. Old fart here, so I have always believed in the "the human eye cannot see more than 30 fps" line, but seeing some YouTube videos of the difference in gameplay between 60hz and 144hz has made me question my beliefs.

I am a Solo/Campaign gamer and I am not looking to set the world on fire. If you look at my rig (signature) you can see that I have been happy and content with low to mid settings, but I feel it is time I splurge on myself a bit, and I want to know whether buying a 144hz 1080p monitor is feasible with my current rig or if I have to upgrade to newer and/or better hardware first.

One thing I should mention is that I recently got a new Asus Strix 1080ti 4GB. Nothing to jump for joy over, but compared to my old GTX660 I guess it is an upgrade for me.

Thanks folks


Wait they make a 1080Ti in a 4GB version?
 
I read a lot about you guys using 32" to 40" displays for gaming, and while I believe this could be sweet, I never asked the question... at what distances are you gaming with these huge monitors or TVs? I still game at a desk, so my distance from the monitor to my eyes is very short (no more than 2 feet), and I would figure a 32" or bigger TV/monitor is just too big at that distance to enjoy gaming?

If you guys are on a couch and about 8 to 12 feet away, what are you using to rest your keyboard and mouse?

And therein lies my main problem... how do I know that my GPU can handle the frame rate? Is there something in the specs that tells me that? Reading the specs, the only thing that makes some sense to me is that the max resolution is 7680 x 4320, so that tells me playing at 1080p should do the trick, but there is nothing there, at least that I can see, telling me whether it is capable of handling 144hz.

I use the 32" from about 2 feet away. 1440p at 32" / 92 DPI is perfectly fine for any type of game at this distance, with the added benefit of finer detail in the distance (Planetside 2 is much easier as a sniper now) and increased immersion. My friend uses an old 32" 1080p TV from the same distance, which also works better than I expected, although I wouldn't use it myself.
As a bonus, the size makes it much easier to see what I'm doing when I'm playing from a distance as well. I use a Steam Controller for that, since it emulates kb+m well enough and doesn't care if you sit, stand, or lie down while playing.
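
If you want to sanity-check that 92 DPI figure yourself, it's just the diagonal pixel count divided by the diagonal size; quick sketch:

Code:
# Pixel density from resolution and diagonal size.
from math import hypot

def ppi(width_px, height_px, diagonal_in):
    return hypot(width_px, height_px) / diagonal_in

print(round(ppi(2560, 1440, 32)))  # ~92 PPI: 1440p at 32"
print(round(ppi(1920, 1080, 24)))  # ~92 PPI: same density as 1080p at 24"

So text and UI look about as sharp at 2 feet as a standard 24" 1080p screen, just bigger.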

Finally, if you guys had my rig and a couple hundred bucks for a monitor, taking into consideration the distance between my eyes and the monitor and the fact that I only play shooter games (CoD, MoH, BF, Far Cry, etc.)... which monitor would you buy?

Whatever you buy, I would advise getting something with Freesync / VESA adaptive sync (or G-Sync, but with a 1050ti I expect you don't want to pay the G-Sync markup). Freesync monitors don't cost more than ones without it, and then you'll have it if you get a lower-end AMD card at some point.
I would stick with 1080p with a 1050ti though. What kind of framerates are you getting in the games you play?
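
As for "can my GPU handle 144hz": there are really two questions. The display outputs on a 1050ti can carry a 1080p/144hz signal without breaking a sweat; a rough back-of-the-envelope check (ignoring blanking and encoding overhead):

Code:
# Ballpark pixel data rate for 1080p at 144 Hz, 24 bits per pixel
# (ignores blanking/encoding overhead - rough estimate only).
width, height, hz, bpp = 1920, 1080, 144, 24
gbits = width * height * hz * bpp / 1e9
print(f"~{gbits:.1f} Gbit/s")  # ~7.2 Gbit/s, well under DisplayPort 1.2's ~17 Gbit/s

The real question is whether the card actually renders your games at 100+ fps, and that varies per game and settings, which is why an in-game fps counter tells you more than the spec sheet.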
 
Yeah, there is a big difference of course, especially if you have an SLI setup.
But gaming on a 60hz monitor is totally possible and almost equally enjoyable.
 
Wait they make a 1080Ti in a 4GB version?
Ahhhh crap!!!

No, sorry... I meant a 1050ti

What a horrible way to botch my own post! :confused:

I have been playing shooter games since the late 90s and all my video cards for gaming have been Nvidia, so I doubt very much I will get an AMD/ATI... not saying never, but the odds are sure against it.
 
G-Sync will definitely help with the 1050Ti

Whatever you buy, I would advise getting something with Freesync / VESA adaptive sync (or G-Sync, but with a 1050ti I expect you don't want to pay the G-Sync markup). Freesync monitors don't cost more than ones without it, and then you'll have it if you get a lower-end AMD card at some point.
I would stick with 1080p with a 1050ti though.
Why would you recommend a FreeSync display if he just bought an NVIDIA GPU?
 
G-Sync will definitely help with the 1050Ti


Why would you recommend a FreeSync display if he just bought an NVIDIA GPU?

He bought a 1050ti - by default I don't expect him to want to hand out $350+ for a G-Sync monitor (non-TN ones being much more expensive even). And if you're not buying G-Sync, there is no point in getting a monitor without Freesync - "Freesync" monitors are just newer monitors with DisplayPort 1.2a and the appropriate scaler and overdrive for adaptive sync, with no added cost. The Samsung S24F350 line is just about the cheapest 24" 1080p IPS I've seen yet, for example, and they come with Freesync support.

Even if you never buy an AMD card, there's a good chance that Apple and Intel, and perhaps even Nvidia, will support those monitors later on. Apple is making an iPad with adaptive sync now, and since it saves power with no added cost, I expect to see it in much of their lineup in the future. Intel could lose sales if AMD's Raven Ridge puts adaptive sync in every single AMD laptop and they don't match that. If the rest of the industry goes with software-based adaptive sync, I wouldn't be surprised to see Nvidia offer limited-range, Freesync-like software support to complement their expensive G-Sync module products with their 30-144hz range and ULMB.
 
I'm on a 60hz panel as I'm typing this, but I have a 165hz and it's not even that amazing (AOC AGON AG271qg for reference). Most of the time you'll never even get 144fps at 1440p. There is a difference of course, but unless you purely play FPS games it doesn't really make that big of a difference.

War Thunder, Company of Heroes, WoW or StarCraft? No big deal at all. CS:GO? Much more valuable.

I am currently eyeing a 4k 32" monitor.

Size + resolution > refresh rate for me. Also, pixel response times matter; something like the 200hz Z35 is complete garbage because it's just blurry - really, such complete junk that I wonder how it even passed the internal review process.
 
For simple proof that more Hz and FPS make a difference:

Go to YouTube and watch a video in 1080p @ 60fps (most are only 30fps). You should easily be able to tell the difference. You'll soon learn that you thought 30fps was smooth, but it isn't.
 
For simple proof that more Hz and FPS make a difference:

Go to YouTube and watch a video in 1080p @ 60fps (most are only 30fps). You should easily be able to tell the difference. You'll soon learn that you thought 30fps was smooth, but it isn't.

It's amazing anyone really thought 30hz was good in the first place, other than the OP. :p
 