144 is the new 60?

lol for the love of god don't try and use this as some ace up the sleeve

a 44% increase in KDR by doing nothing but going from 60 to 144? Stupidest fucking claim I've ever heard anyone make.

maybe they missed a comma and meant 6.0 to 144, then maybe, just maybe
 
This isn’t proven by any stretch yet. I host LAN parties a few times a year and inevitably there is a guy who comes in with potato hardware and mops the floor with everybody else on his favorite game. I remember a guy getting <30 FPS in counterstrike years ago who consistently was in the top 2-3 players out of about 25 passionate gamers.
 
This isn’t proven by any stretch yet. I host LAN parties a few times a year and inevitably there is a guy who comes in with potato hardware and mops the floor with everybody else on his favorite game. I remember a guy getting <30 FPS in counterstrike years ago who consistently was in the top 2-3 players out of about 25 passionate gamers.

Of course. Skill will trump everything else, every time. I don't think anyone is really thinking that higher FPS = automatic win.

The issue is: for a given skill level: does decreased latency and improved frame rate affect performance?

Everything I've seen and experienced indicates there is an impact.
 
Right, it doesn't seem so complicated.

These players are playing twitch shooters, and with a better PC/display they are literally seeing actions on the screen sooner and able to respond quicker.

To posit that has no impact on their performance in the game seems absurd.
 
Higher framerates are going to help better gamers more. Take the Overwatch example someone posted earlier, stating they were SR 2500 (Platinum rank). 90% of the player population is in Platinum or under (ranks low to high: Bronze, Silver, Gold, Platinum, Diamond, Master, Grandmaster), 10% are Diamond and up, and only 3% make it higher than Masters. Bronze players can't hit the broad side of a barn with an aimbot, but if you switch a Masters+ player from 60FPS to 144FPS, they definitely have the ability to take advantage of the difference.

I played Quake3 competitively back in the day, and I was pushing my CRT to 100Hz because it was THAT MUCH BETTER than 60Hz because I was at the top end of the skill curve back when I was a teenager and had all the time in the world to practice.

I went from a 60Hz LCD to a 165Hz one, and it made tracking enemies noticeably easier.
 
If you don't think that getting better refresh rates and better frame rates is an advantage, then IDK why you guys are on this forum. Go game on a Dell Inspiron at 60hz and have fun getting owned. nVidia has a valid point. I find it strange everyone here is calling BS....
Sure you can do very well on a Dell Inspiron gaming at 60hz/60fps, but if you are doing well on that system and switch to a rig capable of 144fps and a monitor capable of displaying it, your KD will ABSOLUTELY increase.

Also this quote from the article sums everything up pretty well:

This data doesn’t mean that simply upgrading your GPU will make you a better player. But however you cut it, it is easy to see a relationship between the hardware used and a player’s kill/death ratio: having the right hardware enables the highest FPS and lowest latency, and that can help you reach your full potential on the battlefield.
 
Right, it doesn't seem so complicated.

These players are playing twitch shooters, and with a better PC/display they are literally seeing actions on the screen sooner and able to respond quicker.

To posit that has no impact on their performance in the game seems absurd.
Nobody is positing that.
 
I call bs.

A group of us get together twice a year to play fps games online; they pick the title and I play with them. This year was the latest Black Ops, which was surprisingly fun.

Everyone in this group is hardcore (except me) into fps games, and they are all rocking 144hz+ systems with their choice of sync. I use a 60hz laptop with a 1070 because I can't be bothered to risk moving my main system. My heyday of FPS was a decade ago; since then my reaction time has dropped with age and my skills have rusted.

Not only do I keep up, often without ever having played the game before (Overwatch, Black Ops, and PUBG were the last 3), I often outpace them and rank in the top 3 on the server.

If 144hz was such a huge advantage I could never compete in the K/D realm, let alone outscore them and most of the server.

There was more research in this post than nVidia marketing did in its graph ;)
 
I would say that minimum frame rates are probably more impactful than fps >100hz. Being able to understand what is going on during hectic scenes in online games, for example Battlefield V when there are artillery barrages and smoke, is going to be a big advantage versus someone with a rig that chokes down to 30-40fps during those same sequences.
 
I would say that minimum frame rates are probably more impactful than fps >100hz. Being able to understand what is going on during hectic scenes in online games, for example Battlefield V when there are artillery barrages and smoke, is going to be a big advantage versus someone with a rig that chokes down to 30-40fps during those same sequences.

If the tick rate is high enough, yeah. If it's a server like BFV, which can have questionable tick rates, you may have an advantage at 30-40fps lol
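
Rough back-of-the-envelope sketch of that point; the tick rates below are assumed example values, not measured BFV numbers. The server can only tell you something new every tick, no matter how fast your screen redraws.

```python
# Sketch: how often the server can send new state vs. how often the display
# can show a frame. Tick rates are assumed example values, not measured
# Battlefield V numbers.

def interval_ms(rate_hz):
    """Milliseconds between updates at a given rate."""
    return 1000.0 / rate_hz

for tick in (30, 60, 120):        # hypothetical server tick rates
    for hz in (60, 144):          # monitor refresh rates
        print(f"tick {tick:>3}/s -> new server data every {interval_ms(tick):5.1f} ms | "
              f"{hz} Hz display draws a frame every {interval_ms(hz):5.1f} ms")
```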
 
I call bs.

A group of us get together twice a year to play fps games online; they pick the title and I play with them. This year was the latest Black Ops, which was surprisingly fun.

Everyone in this group is hardcore (except me) into fps games, and they are all rocking 144hz+ systems with their choice of sync. I use a 60hz laptop with a 1070 because I can't be bothered to risk moving my main system. My heyday of FPS was a decade ago; since then my reaction time has dropped with age and my skills have rusted.

Not only do I keep up, often without ever having played the game before (Overwatch, Black Ops, and PUBG were the last 3), I often outpace them and rank in the top 3 on the server.

If 144hz was such a huge advantage I could never compete in the K/D realm, let alone outscore them and most of the server.

There was more research in this post than nVidia marketing did in its graph ;)

I think you are missing the point. Nobody is saying that you can't do well on 60hz. Nvidia is saying give any player who is decent at an FPS an upgrade from a 60hz monitor to a 144hz monitor (and hardware capable of it), and they will start playing better. 60hz in 2019 is a handicap. Take those same 144hz users you're talking about and downgrade them to 60hz and watch them complain about how they can't hit anything anymore.
 
This is why I need to save articles.

There is an incredibly in-depth, technically detailed page somewhere (oh I'll find you again and post you!) that explains exactly why this works. While it is just a marketing scheme/angle that nVidia pulled, the technical data behind high fps giving you an edge is there to back it up (unless the game has a hard cap, or breaks because of it).

That said, 60hz feels awful when you've played everything at 100 and above for a while.

Not sure what article, but this is a great resource for explaining how big a deal 144hz is.
https://www.testufo.com/
 
I think you are missing the point. Nobody is saying that you can't do well on 60hz. Nvidia is saying give any player who is decent at an FPS an upgrade from a 60hz monitor to a 144hz monitor (and hardware capable of it), and they will start playing better. 60hz in 2019 is a handicap. Take those same 144hz users you're talking about and downgrade them to 60hz and watch them complain about how they can't hit anything anymore.

BS, 44% increase my anus.

Someone has drunk too much marketing Kool-Aid. I have no doubt it makes things a little easier by being clearer in motion, but it is not a handicap; the player is the handicap.

I also played on one of their machines for a few hours. While it's slightly smoother, I did not notice a significant increase in my time to kill, reaction/response, or anything really. Maybe I'm too old, maybe I'm just too good, or maybe it really is just FUD & FOMO.

I like high refresh rates, but to me they come after fidelity, not before; I don't want to play a game that looks like a** so I can frag harder.
 
I think you are missing the point. Nobody is saying that you can't do well on 60hz. Nvidia is saying give any player who is decent at an FPS an upgrade from a 60hz monitor to a 144hz monitor (and hardware capable of it), and they will start playing better. 60hz in 2019 is a handicap. Take those same 144hz users you're talking about and downgrade them to 60hz and watch them complain about how they can't hit anything anymore.

It certainly helps, but it's nowhere near the benefit they claim. As I said earlier, I bet I could cap myself back to 60 and my K:D would only drop slightly. There are just so many other variables at play here, which is why that chart is pure marketing garbage, which is what almost all of us here are saying.
 
I play at 180hz for R6 Siege, and going back to 60hz would be a disadvantage. I would love to see 144hz replace 60hz, but high refresh rate monitors are a lot more expensive in other parts of the world. In other games, 30 fps is the best you can do for console users. Gears of War, Crackdown (and soon Halo) all have crossplay. While Halo is 60fps, Gears of War 4 players are locked at 30 fps while a PC player can have 144hz at 120fps+ with precise mouse aim. Needless to say, they are slaughtered online.

These console players need to get off 30 fps before they get to 144fps. I am hoping next gen consoles kill off 30 fps.
 
Back when I was a pro gamer (hey ladies), the best guy in our guild was running onboard video (this was in the CS:S days) and he would destroy us.
You get used to low frames, he was amazing.

I built him a new computer before I left; it took him a good 6 months to relearn how to play at high fps.
 
BS, 44% increase my anus.

Someone has drunk too much marketing Kool-Aid. I have no doubt it makes things a little easier by being clearer in motion, but it is not a handicap; the player is the handicap.

I also played on one of their machines for a few hours. While it's slightly smoother, I did not notice a significant increase in my time to kill, reaction/response, or anything really. Maybe I'm too old, maybe I'm just too good, or maybe it really is just FUD & FOMO.

I like high refresh rates, but to me they come after fidelity, not before; I don't want to play a game that looks like a** so I can frag harder.

I don't drink "Kool-Aid." 144hz over 60hz is a MASSIVE difference. It's not even really an nvidia vs amd thing, so I don't see how you can claim I'm drinking the "Kool-Aid." Look at any pro shooter or streamer; all of them are using high refresh rate monitors for a reason. It's not just slightly smoother: 60hz vs 144hz is more than double the refresh rate, so it's a LOT smoother, and all those extra frames mean that when somebody comes around a corner you see them a fraction of a second earlier. That's just one benefit. When you are making quick reflex shots, instead of there being a handful of frames between where you are looking and the target, you might have more than 10 frames with a 144hz monitor in that split second. That means aiming is a lot easier and less of just muscle memory and luck, and it also helps significantly with immersion.
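
If you want to sanity-check the "more frames in that split second" part, here's a quick sketch. The 150 ms reaction window is my own assumed ballpark, not a measured figure.

```python
# How many refreshes fit inside a short reaction window, plus the average wait
# (about half a frame time) before the display can show a new event.
# The 150 ms window is an assumed ballpark for illustration only.

REACTION_WINDOW_MS = 150.0

for hz in (60, 144, 240):
    frame_ms = 1000.0 / hz
    print(f"{hz:>3} Hz: {frame_ms:5.2f} ms/frame, "
          f"~{REACTION_WINDOW_MS / frame_ms:4.1f} frames in {REACTION_WINDOW_MS:.0f} ms, "
          f"~{frame_ms / 2:4.1f} ms average wait to show a new event")
```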
 
Back when I was a pro gamer (hey ladies), the best guy in our guild was running onboard video (this was in the CS:S days) and he would destroy us.
You get used to low frames, he was amazing.

I built him a new computer before I left; it took him a good 6 months to relearn how to play at high fps.

I like when the Australian "pro gamers" go to a tournament and get destroyed because they are used to really crappy pings and overcompensate at LAN parties.

Still, if monitor lag can reduce performance, then jumping from 60 to 144 would again mean roughly half the lag/response time. I pity the person who loses because they were 3 frames behind the winner.
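
Quick check on the "half the lag" part, counting only the display's own refresh interval; input, game, and network latency are deliberately left out.

```python
# Only the refresh interval is modelled; input, game, and network latency are
# ignored, so this is just the display's own contribution.

f60 = 1000 / 60      # ~16.7 ms per refresh at 60 Hz
f144 = 1000 / 144    # ~6.9 ms per refresh at 144 Hz

print(f"60 Hz frame time : {f60:.1f} ms")
print(f"144 Hz frame time: {f144:.1f} ms ({f144 / f60:.0%} of the 60 Hz time)")
print(f"'3 frames behind' is ~{3 * f60:.0f} ms at 60 Hz vs ~{3 * f144:.0f} ms at 144 Hz")
```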
 
I don't drink "Kool-Aid." 144hz over 60hz is a MASSIVE difference. It's not even really an nvidia vs amd thing, so I don't see how you can claim I'm drinking the "Kool-Aid." Look at any pro shooter or streamer; all of them are using high refresh rate monitors for a reason. It's not just slightly smoother: 60hz vs 144hz is more than double the refresh rate, so it's a LOT smoother, and all those extra frames mean that when somebody comes around a corner you see them a fraction of a second earlier. That's just one benefit. When you are making quick reflex shots, instead of there being a handful of frames between where you are looking and the target, you might have more than 10 frames with a 144hz monitor in that split second. That means aiming is a lot easier and less of just muscle memory and luck, and it also helps significantly with immersion.

44% difference? This is what nVidia is claiming, and what is BS.

5% maybe, but what does it matter if the server tick rate isn't also 144hz?

Massive, no; minor, yes. It looks nicer of course, but I'll take fidelity over Hz any day.
 
I like when the Australian "pro gamers" go to a tournament and get destroyed because they are used to really crappy pings and overcompensate at LAN parties.

Still, if monitor lag can reduce performance, then jumping from 60 to 144 would again mean roughly half the lag/response time. I pity the person who loses because they were 3 frames behind the winner.

That's what ended my pro gamer tard phase; I was on shitty internet when I learned how to play, so something like a 120 ping was totally normal while everyone else was sub 60. I left home and bought the fastest internet I could possibly get and magically sucked at online games out of the blue.
I had to stop playing altogether for a while and then go back in order for me not to be overcompensating, and I still to this day mess it up. I spent hours a day practicing at that ping and it's still muscle memory if I get tired.
 
I already knew this when I was a CRT gamer. In addition to the flicker, I could easily see the smoothness difference between 60Hz and 85 Hz.

That's why I stayed on CRT for so many years. There is a difference!

I can't really see a difference between 85 Hz and 120hz (let alone 240 hz), even on fast modern panels, but I figure it's just marketing to the 0.1 percenters at that point.


I agree with this. There is some benefit above 60fps, up to 90-100 somewhere, but above that I think it is marginal at best.
 
Even 75hz is a big step up.

Yeah. I can still be pretty competitive at 60fps though, and I don't play multiplayer games much anymore, which is why I haven't bothered much with above-60fps gaming since my last CRT died in late 2004, but with that new Asus 43" 120hz 4k screen due soon, I may just jump back in.

Personally I always found framerates above 60fps to be a pretty subtle improvement, but I did enjoy playing vsynced at 1600x1200@100hz on my 22" Iiyama Visionmaster Pro 510 back in the day.

If I am honest though, I don't even have much of a problem looking at 30hz content in videos or in game (I can see it is off in that UFO test for sure). It's OK to me until I actually grab the mouse and try to play, at which point the inherent input lag at 30fps kills it for me.
 
Yeah. I can still be pretty competitive at 60fps though, and I don't play multiplayer games much anymore, which is why I haven't bothered much with above-60fps gaming since my last CRT died in late 2004, but with that new Asus 43" 120hz 4k screen due soon, I may just jump back in.

Personally I always found framerates above 60fps to be a pretty subtle improvement, but I did enjoy playing vsynced at 1600x1200@100hz on my 22" Iiyama Visionmaster Pro 510 back in the day.

If I am honest though, I don't even have much of a problem looking at 30hz content in videos or in game (I can see it is off in that UFO test for sure). It's OK to me until I actually grab the mouse and try to play, at which point the inherent input lag at 30fps kills it for me.

Vsync being enabled can cause a lot more input lag than you might think. Again, another reason this is marketing nonsense.
 
Too many blind assumptions, I do believe, have been made. Whether it's true or not is another thing.

This would not be too difficult to test out. Use a control group and let them practice whatever game they prefer, keeping it on either a local network or single player. Use the same monitor, with the players not knowing the refresh rate, and let them play. Then change the refresh rate after several runs on each.

Variations would be cool, like taking the best player in an MP game on LAN who consistently wins, while keeping everyone else's refresh rate the same except his.

I think adding in FreeSync or G-Sync, with more presentable data (images on the screen) for the player, would help and should be tested as well.
 
Too many blind assumptions, I do believe, have been made. Whether it's true or not is another thing.

This would not be too difficult to test out. Use a control group and let them practice whatever game they prefer, keeping it on either a local network or single player. Use the same monitor, with the players not knowing the refresh rate, and let them play. Then change the refresh rate after several runs on each.

Variations would be cool, like taking the best player in an MP game on LAN who consistently wins, while keeping everyone else's refresh rate the same except his.

I think adding in FreeSync or G-Sync, with more presentable data (images on the screen) for the player, would help and should be tested as well.

I eagerly await your results!
 
Fast internet and a low-profile keyboard would work better than a new card; the mouse would come into play as well.
 
Too many blind assumptions, I do believe, have been made. Whether it's true or not is another thing.

This would not be too difficult to test out. Use a control group and let them practice whatever game they prefer, keeping it on either a local network or single player. Use the same monitor, with the players not knowing the refresh rate, and let them play. Then change the refresh rate after several runs on each.

Variations would be cool, like taking the best player in an MP game on LAN who consistently wins, while keeping everyone else's refresh rate the same except his.

I think adding in FreeSync or G-Sync, with more presentable data (images on the screen) for the player, would help and should be tested as well.

Honestly I think in large part that 44% claim is due to the type of player that buys a HRR monitor with a capable graphics card. If you are buying 144hz and a 1080 Ti+, you're obviously focused on FPS games in general and thus have many more hours in them than in, say, RPGs or strategy games, compared to someone happy with a 60hz or 100hz monitor (myself being 100hz).

So marketing removes that factor and comes up with a nice slide. I'll bet dollars to donuts there is little to no increase in actual performance of players of equal skill between a standard 60hz monitor and a 144hz monitor except for in the very top end of the competitive market, where the skill curve has been largely maxed out by hours played, in which case a small edge is still an edge.
 
Vsync being enabled can cause a lot more input lag than you might think. Again, another reason this is marketing nonsense.


Vsync shouldn't introduce any input lag whatsoever beyond what would be present with a theoretical GPU rendering at its max capacity at the same framerate.

All it does is hold the last completed frame and release it the next time the screen refreshes. You'd have the same amount of input lag vsynced to 60hz as if you were GPU-limited at 60fps.

Sure, with vsync you get a frame that has aged a little waiting in the frame buffer for the next refresh, but if you were GPU-limited at the same framerate, this time would be used actually rendering the frame, so the delay would be the same.

The only exception is if the GPU drops below 60fps. Then you'd wind up with the same input lag you'd have if you were GPU limited at 30fps.
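
Here's a tiny timing sketch of that argument, under the same simplifying assumptions (double buffering, a constant assumed render time, the GPU always finishing before the next refresh, no render queue or triple buffering). It's an illustration of the reasoning, not a measurement.

```python
import math

# Double-buffered vsync model: a finished frame waits in the back buffer until
# the next refresh boundary. Render time is an assumed constant.

REFRESH_MS = 1000 / 60    # 60 Hz display
RENDER_MS = 10.0          # assumed constant render time per frame

for input_time in (0.0, 4.0, 12.0):            # input sampled at these points (ms)
    frame_ready = input_time + RENDER_MS       # frame containing that input is done
    shown_at = math.ceil(frame_ready / REFRESH_MS) * REFRESH_MS
    print(f"input at {input_time:4.1f} ms -> frame ready at {frame_ready:4.1f} ms, "
          f"displayed at {shown_at:4.1f} ms (total {shown_at - input_time:4.1f} ms)")
```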
 
I don't think Nvidia was exactly saying that buying a new monitor/GPU would automatically make you a better player (though, of course, that is what they would like you to infer). That idea is obviously questionable.

The point is that given a certain skill level, say for top players, a 60Hz/60fps machine would bottleneck their skill. So if you are already a top player, then getting a hardware upgrade would unlock your full potential.

I mean, I run some of the fastest rigs available, and I still suck at multiplayer games. So good hardware doesn't mean anything without the skill. But if you have the skill, no question that it helps.
 
Definitely just another anecdote, but I agree that refresh rate and FPS do a lot more to improve gameplay than resolution or widescreens.
 
I don't think Nvidia was exactly saying that buying a new monitor/GPU would automatically make you a better player (though, of course, that is what they would like you to infer). That idea is obviously questionable.

But that is exactly what they are saying.
 
I don't think Nvidia was exactly saying that buying a new monitor/GPU would automatically make you a better player (though, of course, that is what they would like you to infer). That idea is obviously questionable.

The point is that given a certain skill level, say for top players, a 60Hz/60fps machine would bottleneck their skill. So if you are already a top player, then getting a hardware upgrade would unlock your full potential.

I mean, I run some of the fastest rigs available, and I still suck at multiplayer games. So good hardware doesn't mean anything without the skill. But if you have the skill, no question that it helps.

I'd say even that assertion is a bit of a stretch. There is unquestionably an improved feel as framerate increases, but I'd argue that it has a marginal at best impact on game performance, as long as you are at least in the 60-90 fps range, regardless of how good or competitive you are.
 
Man oh man, the same kind of people who said the eye can't see more than 24fps... Lower frametime results in less input delay. It also makes things noticeably smoother and less choppy. Less input delay means human input is closer to real time. There is NO SUBSTITUTE for skill, yet these people saying 60-90 FPS is all you need are truly special. Man, I bet it's the people who've invested in 3 4K 60fps displays justifying their choices after the fact.

Facts don’t care about your feelings you special people.


If you're playing against bottom-of-the-barrel players or Joe Noob who just started the game a week ago, or you're playing casual mode and talking about top 3 on a server, you're not playing a game like CS/Quake, or you're literally playing public-server casual mode. Try playing ESEA or competitive dueling and see how fast you accuse the other person of cheating when you face actual competition, not just someone who jumps on your surf server while you're so good you get in the top 3 as people blast music.


Obviously NGREEDIA just wants more money, but higher FPS with a higher refresh rate and low latency will help your gameplay regardless of how much NGREEDIA just wants that cheddar.

In my golfing days my Big Bertha variant was not allowed on the pro tour. Was I ever good enough to play on the pro tour? No, not close. Tiger Woods or McIlroy with a Big Bertha would be another story though.
 
Facts don’t care about your feelings you special people.

Precisely. And there are published peer reviewed scientific studies resulting in human factors engineering design principles that suggest that human beings experience anything below 100ms as instantaneous.

Now this is probably in the general population, which includes old people, but one thing is for sure. Nothing in life improves linearly forever. If 90hz is better than 60hz, is 20Mhz better than 10Mhz? Probably not. At some point you reach saturation to where the improvement makes no practical difference. We usually call those diminishing returns. I don't know exactly where that line lies for game framerates, and it probably differs slightly from person to person, but what exacerbates finding it is the amount of placebo effect and bias that is involved in ANY human experience.

In many cases a sugar pill can perform almost as effectively at reducing pain as an actual pain killer, if you tell the person it is an effective drug before they take it. No one is immune to these effects, and they are far reaching, applying to everything we as humans sense in our lives. The end result is, if you THINK that something is going to be better before you try it, you WILL experience it as better, even if it doesn't make a difference at all.

This is why audiophiles spend thousands of dollars on car-priced audio jewelry that doesn't make an iota of difference.

The only way to find the line is to take large sample sizes of gamers and do blind tests. Not on test patterns (like that stupid flying UFO animation) but actual in-game tests. Problem is, I have never seen a thorough test like this with a methodology I trust.

So, maybe you are right, and we haven't gotten even close to diminishing returns yet, and 240hz is much better than 120hz. I can't show you any objective data to discount that because for some crazy reason, I haven't seen any actual rigid testing of this using thorough methodologies. All I can do is lean on my own experience from 15-20 years ago when I still had a high end CRT and still competitively played Counter-Strike (and probably had better reflexes than I do today). It is anecdotal as well, but in that experience, 90hz and 100hz DID feel a little bit better than 60hz, but it was subtle as all hell and really not worth writing home about.

I have never played at anything above 100hz, but since the natural law of things is that of diminishing returns, the higher we get, the smaller the improvement, so if 100hz felt marginal, anything above that should feel even less.
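
To put that diminishing-returns point in numbers (simple frame-time arithmetic, nothing measured):

```python
# Each doubling of the refresh rate shaves off half as many milliseconds of
# frame time as the previous doubling did.

rates = [60, 120, 240, 480]
prev_ms = None

for r in rates:
    ms = 1000 / r
    note = f"  (saves {prev_ms - ms:5.2f} ms vs the previous step)" if prev_ms else ""
    print(f"{r:>3} Hz: {ms:6.2f} ms/frame{note}")
    prev_ms = ms
```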
 
I have to look for the differences; it's less noticeable than going from HDD to SSD.
 
Precisely. And there are published peer reviewed scientific studies resulting in human factors engineering design principles that suggest that human beings experience anything below 100ms as instantaneous.

Now this is probably in the general population, which includes old people, but one thing is for sure. Nothing in life improves linearly forever. If 90hz is better than 60hz, is 20Mhz better than 10Mhz? Probably not. At some point you reach saturation to where the improvement makes no practical difference. We usually call those diminishing returns. I don't know exactly where that line lies for game framerates, and it probably differs slightly from person to person, but what exacerbates finding it is the amount of placebo effect and bias that is involved in ANY human experience.

In many cases a sugar pill can perform almost as effectively at reducing pain as an actual pain killer, if you tell the person it is an effective drug before they take it. No one is immune to these effects, and they are far reaching, applying to everything we as humans sense in our lives. The end result is, if you THINK that something is going to be better before you try it, you WILL experience it as better, even if it doesn't make a difference at all.

This is why audiophiles spend thousands of dollars on car-priced audio jewelry that doesn't make an iota of difference.

The only way to find the line is to take large sample sizes of gamers and do blind tests. Not on test patterns (like that stupid flying UFO animation) but actual in-game tests. Problem is, I have never seen a thorough test like this with a methodology I trust.

So, maybe you are right, and we haven't gotten even close to diminishing returns yet, and 240hz is much better than 120hz. I can't show you any objective data to discount that because for some crazy reason, I haven't seen any actual rigid testing of this using thorough methodologies. All I can do is lean on my own experience from 15-20 years ago when I still had a high end CRT and still competitively played Counter-Strike (and probably had better reflexes than I do today). It is anecdotal as well, but in that experience, 90hz and 100hz DID feel a little bit better than 60hz, but it was subtle as all hell and really not worth writing home about.

I have never played at anything above 100hz, but since the natural law of things is that of diminishing returns, the higher we get, the smaller the improvement, so if 100hz felt marginal, anything above that should feel even less.
I've seen some off-the-wall YouTube tests where those participating were not sure, got it wrong, etc., which to me indicates nothing dramatic. If one has to really look, going side to side quickly numerous times versus just playing the game, and still guesses wrong, then it is not that great of a difference.

I think some see less tearing and fewer frame-to-frame jerks and confuse it with smoothness. I noticed that at 90fps and even faster, non-FreeSync can feel rather jerky compared to 60fps FreeSync.

If it made a huge impact, I would expect to see people able to go up to a monitor and say, hey, that one is at 144hz and that one is at 100hz, etc. In any case, an objective scientific test would be needed to clarify any advantages and what they are.

For me, 144hz gave me a more usable FreeSync range.
 
some people are more sensitive to such differences than others. i remember vividly how i connected my first 120 hz display, moved the mouse on the windows desktop, and the difference was so obvious to me that i said "wow" aloud, although i was alone. pair that with ulmb, as i found motion blur nauseating in shooters, and you have another world. at least, that was my experience.

a friend still plays on 60 hz, because he prefers the more vivid colors of ips and 4k, and the funny thing is, he still beats me usually, simply because he has far more time for gaming than i do and experience still matters more than frames. (as does talent, well, he has much better dexterity). and he has more fun with a beautiful image, so he plays more often.

meanwhile, my next upgrade will be a 240 hz monitor, i have already tried one and it felt much smoother than my current 120 hz screen. to me, smoothness matters more than looks. perhaps i'll play more if it's more fun, and get better ;)
 