Nvidia study finds gamers have better K/D ratios at higher framerates

If that's how you want to view it then sure.

But remember, the experiment was "does higher FPS improve gameplay?", not "what is the best way to improve gameplay?"

It's like a university releasing a study about the efficacy of a new drug used to reduce heart attacks. Should they be expected to run a separate battery of tests comparing that efficacy against every other possible treatment? Absolutely not. That makes no sense; nothing would get done.
You need to isolate a single variable and test it against the control group.

I don't know what you're arguing against. I'm in agreement with you that the only variable they tested was higher FPS. I'm just questioning the motivation behind Nvidia telling us that higher FPS makes you a better player. It's like an oil company funding an anti-climate change study. Or a liberal think tank funding a pro-climate change study. There might be elements of truth in there, but you'll always question the findings because of where the funding comes from.
 
I don't know what you're arguing against. I'm in agreement with you that the only variable they tested was higher FPS. I'm just questioning the motivation behind Nvidia telling us that higher FPS makes you a better player.
I was arguing against your quote where you said that Nvidia was claiming that the only thing that matters is FPS.
 
I was arguing against your quote where you said that Nvidia was claiming that the only thing that matters is FPS.

Well, I mean we can both read the articles and find no mention of any other variable (except possibly input latency in relation to the refresh rate) is my point. I just don't think it's a benevolent scientific experiment like evidently you do. This isn't about science. This is a sales pitch designed to sell something. A scientific experiment might point to other variables for further research. No such thing here. Just go buy RTX cards and G-sync.
 
Well, I mean we can both read the articles and find no mention of any other variable (except possibly input latency in relation to the refresh rate) is my point. I just don't think it's a benevolent scientific experiment like evidently you do. This isn't about science. This is a sales pitch designed to sell something. A scientific experiment might point to other variables for further research. No such thing here. Just go buy RTX cards and G-sync.
I think we might have read different articles.

'"On its own, correlation doesn’t mean causation of course. But put in the context of fps benefits such as animation smoothness, reduced ghosting and tearing, and lower system latency outlined in the article, the positive relationship shown in the chart makes a bit of sense," Nvidia explains.'
 
I wasn't someone who put too much stock in this kind of thinking back in the day. I was at 60Hz for years and did fine in multiplayer games. I always believed that as long as your FPS is consistent, you can get used to it, and that's what mattered most. Perhaps that's actually still true to a degree. However, going to a higher refresh rate display did, in fact, improve my K/D. I don't think dropping to 60Hz will make a good player bad, and I don't think going from 60Hz to 144Hz will make a bad player good. It makes a difference, but it won't make someone into something they aren't.
 
Pro-gamers have more cash to burn and get gifted higher end equipment.

This is correlation != causation.
 
Yeah, it's a BS article. Now you're going to have folks running a 10-year-old i5 with 8 gigs of DDR2 memory put a 2080 Super in their box and pair it with a G-Sync monitor to get the best gaming experience, and that simply is NOT the case.

The article is designed to sell video cards and buyers need to beware.
 
I hate Nvidia's marketing as much as the next guy. I deeply regret my RTX 2070 purchase, but I think most people are missing the point here.

They are saying better frame rate correlates to better k/d ratio. Not RTXOMGWTFBBQ360NOSCOPE
 
Nvidia is pimping GPUs and monitors. You need max FPS for competition: set your graphics down to the lowest settings, play at a res < 1024x768. You want to get rid of REAL latency? Play on servers that kick players for pings over 150-250ms... oh wait, they don't let you run your own servers anymore. Your brain will compensate for the rest.
 
Let me list the things that matter... MORE than frame rate.
1. Network latency.

Concerning network latency: all things being equal, probably. But in real life I think the servers do things to try to normalize all the players. For instance, with the new CoD, I find my best games are at about 35-40ms latency; if I get below that, I have terrible games. People at about 80ms are butchering me, but I don't seem to have that effect on others. When I hit 60-80ms, those games suck too. Add cross-platform shenanigans on top of that and it's hard to figure out what's going on.

I don't think godly low pings are what they used to be.
 
I hate Nvidia's marketing as much as the next guy. I deeply regret my RTX 2070 purchase, but I think most people are missing the point here.

They are saying better frame rate correlates to better k/d ratio. Not RTXOMGWTFBBQ360NOSCOPE

This is from the Nvidia marketing department. I think it is pretty clear what the take-home message is:
[Chart: increase in K/D by GPU, Fortnite/PUBG battle royale]


You should run out and buy an RTX 20xx if you want to be competitive.

Note, this isn't giving the same player new gear and measuring how much people improve.

This is comparing people that own low end and high end gear and seeing how they compare. There is probably a correlation between serious gamers and high end equipment, and casual gamers and low end equipment.
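For what it's worth, that pattern is easy to reproduce even when the GPU does nothing at all. Here's a toy Python sketch (entirely made-up numbers, purely to illustrate the confounder argument): "seriousness" drives both the hardware a player buys and their K/D, and the GPU tier never feeds into the score.

    import random

    # Toy model: how serious a player is drives BOTH what hardware they buy
    # AND their K/D. The GPU tier itself contributes nothing to the score.
    random.seed(0)
    players = []
    for _ in range(10000):
        seriousness = random.random()  # 0 = casual, 1 = hardcore
        gpu = "high-end" if seriousness + random.gauss(0, 0.2) > 0.6 else "low-end"
        kd = max(0.1, 0.5 + seriousness + random.gauss(0, 0.3))  # skill only
        players.append((gpu, kd))

    for tier in ("low-end", "high-end"):
        kds = [kd for g, kd in players if g == tier]
        print(tier, round(sum(kds) / len(kds), 2))
    # "high-end" owners show a clearly higher average K/D even though the GPU
    # never entered the K/D formula -- correlation through a confounder.

That doesn't prove Nvidia's chart is only a confounder, just that the chart alone can't distinguish the two.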
 
This is from the Nvidia marketing department. I think it is pretty clear what the take-home message is:
[Chart: increase in K/D by GPU, Fortnite/PUBG battle royale]

You should run out and buy an RTX 20xx if you want to be competitive.

Note, this isn't giving the same player new gear and measuring how much people improve.

This is comparing people that own low end and high end gear and seeing how they compare. There is probably a correlation between serious gamers and high end equipment, and casual gamers and low end equipment.
I see your point. In the future they should put the AMD equivalent with each bar. Or maybe total transistor count of each gpu.
That'd be easier to follow.
 
Your K/D ratio will also be higher if you upgrade your ball mouse to an optical mouse.
I legit laughed at this one... I mean, this is a Captain Obvious moment here... with better equipment, people tend to do better. But that doesn't mean it will make you better if you already suck...
 
I see your point. In the future they should put the AMD equivalent with each bar.

That'd be pretty rough -- PUBG isn't known to be friendly to AMD GPUs (purely a developer issue), and AMD has no high-end products, let alone any RT products, on the market.

On the one hand, including them would be 'fair', on the other... it'd be fairly one-sided.
 
It doesn't take a rocket scientist to know this.

Come on, limit someone to 3 FPS and play against them.

But I think once you're at levels that are already really high, it won't make any difference.
 
That'd be pretty rough -- PUBG isn't known to be friendly to AMD GPUs (purely a developer issue), and AMD has no high-end products, let alone any RT products, on the market.

On the one hand, including them would be 'fair', on the other... it'd be fairly one-sided.
I was just joking about that.
Why would Nvidia include their competitor's products in their data? Not doing so doesn't make them an evil corporation. It just makes them a business.
 
Today's news: Gamers enthused enough to play on higher-end hardware are better than casuals who don't. More at 11...
 
Like I said, I bought a 240Hz Acer Predator, a TN with better colors (to my eyes) than a VA. People here made fun of me, said it doesn't make a difference, that I'm an idiot, a fool who parted with my money, but my K/D went through the roof.

I shrugged the sheep off and enjoyed my 240Hz of glory. Now that Nvidia has made the same damn claim, all the internet lemmings are like "so cool bruh, gotta get me a mega refresh rate panel now."

They even made fun of me for sharing LTT's video on the subject. Yeah, you guys know who you are. I'm tempted to quote old posts, but that might be seen as flame baiting or trolling.

Lemmings
 
Like I said, I bought a 240Hz Acer Predator, a TN with better colors (to my eyes) than a VA. People here made fun of me, said it doesn't make a difference, that I'm an idiot, a fool who parted with my money, but my K/D went through the roof.

I shrugged the sheep off and enjoyed my 240Hz of glory. Now that Nvidia has made the same damn claim, all the internet lemmings are like "so cool bruh, gotta get me a mega refresh rate panel now."

They even made fun of me for sharing LTT's video on the subject. Yeah, you guys know who you are. I'm tempted to quote old posts, but that might be seen as flame baiting or trolling.

Lemmings

If they are lemmings, are you saying you went off the cliff first?

;)
 
If they are lemmings, are you saying you went off the cliff first?

;)

Yes, I jumped...

I took a gamble and found out. Put my money where my mouth is.

Unlike some of these people who have no experience yet have all the knowledge of what's right and what's wrong.


I mentioned no names or links to any one person. I just feel justified that I was right concerning high refresh tech and gaming, nothing more.
 
I mentioned no names or links to any one person. I just feel justified that I was right concerning high refresh tech and gaming, nothing more.

I hope my post didn't come across as accusatory. It was intended to be fun.

I personally "only" run 120 because I've found my eyes can't see much beyond 100Hz in most games, but I do totally get it. For me, the difference between 60 and 120/144 is somewhere between "considerable" and "mandatory". At 60Hz, it feels utterly wrong and broken.

I imagine folks with better vision, younger reflexes, better skills could push it further than I.
 
I hope my post didn't come across as accusatory. It was intended to be fun.

I personally "only" run 120 because I've found my eyes can't see much beyond 100Hz in most games, but I do totally get it. For me, the difference between 60 and 120/144 is somewhere between "considerable" and "mandatory". At 60Hz, it feels utterly wrong and broken.

I imagine folks with better vision, younger reflexes, better skills could push it further than I.

Nope not at all accusatory.
 
I hope my post didn't come across as accusatory. It was intended to be fun.

I personally "only" run 120 because I've found my eyes can't see much beyond 100Hz in most games, but I do totally get it. For me, the difference between 60 and 120/144 is somewhere between "considerable" and "mandatory". At 60Hz, it feels utterly wrong and broken.

I imagine folks with better vision, younger reflexes, better skills could push it further than I.

I think the higher Hz helps with smoothness and with your brain predicting where the object will be.

Anywho, it's been a while since I looked into this, but I remember we can see higher Hz in black and white than in color, and our peripheral vision can pick up something around 300Hz, though at lower color depth and clarity.

So it's not all black and white...
 
You same people flamed me for suggesting my 240Hz monitor was worth its weight.

I'm laughing at you now.

Except that this poll is fake news and just a way for Nvidia to get suckers to buy overpriced monitors.

"On its own, correlation doesn’t mean causation of course..."

There was no control in place, i.e. having the same player play multiple matches on 60Hz, 120Hz, and 240Hz monitors. The best players in the world are not going to play on crap monitors, and casual gamers are not going to drop big dollars on a great monitor.
 
So when everyone is pushing 240Hz, it will actually come down to skill again? lol
TL;DR: a faster car will make your lap times faster than others' until they get a faster car too.
 
Well, I mean we can both read the articles and find no mention of any other variable (except possibly input latency in relation to the refresh rate) is my point. I just don't think it's a benevolent scientific experiment like evidently you do. This isn't about science. This is a sales pitch designed to sell something. A scientific experiment might point to other variables for further research. No such thing here. Just go buy RTX cards and G-sync.
Dark12 is saying that when you read a study that doesn't mention A, you cannot infer that the study means A has no effect. When you read a study titled "bananas make you poop", the researchers are in no way implying that "tomatoes do not make you poop". It just means that is a variable the researchers decided not to study.

Nvidia is a poor source for such a study anyway (even for the basic question "does an increase in FPS cause an increase in K/D ratio?"), but it would be invalid to make claims for them that they never made.
 
So when everyone is pushing 240Hz, it will actually come down to skill again? lol
TL;DR: a faster car will make your lap times faster than others' until they get a faster car too.

Once everyone has at least a 5080 Ti playing on a 960Hz monitor, everyone will have a 1:1 K/D regardless of skill level. :p
 
People who are more "hardcore" into gaming are likely to spend more money on it.
 
They even made fun of me for sharing LTT's video on the subject. Yeah, you guys know who you are. I'm tempted to quote old posts, but that might be seen as flame baiting or trolling.

Lemmings
Most of the people in here are casual gamers who have never played competitively in anything and usually have a poor understanding of how input lag, refresh rate, etc. work. When they give their opinion about something like refresh rate, I just get amused and simply ignore them.
I don't own a 240Hz monitor, but I do own a 270Hz one, and it's even possible to get 300Hz monitors nowadays.
It's kind of sad to see how the hardforum community has slowly been shrinking and the knowledge with it. Instead we have people arguing about petty things they don't know anything about. Hardforum, XtremeSystems, and overclock.net are slowly dying.
 
Most of the people in here are casual gamers who have never played competitively in anything and usually have a poor understanding of how input lag, refresh rate, etc. work. When they give their opinion about something like refresh rate, I just get amused and simply ignore them.
I don't own a 240Hz monitor, but I do own a 270Hz one, and it's even possible to get 300Hz monitors nowadays.
It's kind of sad to see how the hardforum community has slowly been shrinking and the knowledge with it. Instead we have people arguing about petty things they don't know anything about. Hardforum, XtremeSystems, and overclock.net are slowly dying.
:cry:
 
Except that this poll is fake news and just a way for Nvidia to get suckers to buy overpriced monitors.

"On its own, correlation doesn’t mean causation of course..."

There was no control in place, i.e. having the same player play multiple matches on 60Hz, 120Hz, and 240Hz monitors. The best players in the world are not going to play on crap monitors, and casual gamers are not going to drop big dollars on a great monitor.
People who are more "hardcore" into gaming are likely to spend more money on it.
Most of the people in here are casual gamers who have never played competitively in anything and usually have a poor understanding of how input lag, refresh rate, etc. work. When they give their opinion about something like refresh rate, I just get amused and simply ignore them.
I don't own a 240Hz monitor, but I do own a 270Hz one, and it's even possible to get 300Hz monitors nowadays.
It's kind of sad to see how the hardforum community has slowly been shrinking and the knowledge with it. Instead we have people arguing about petty things they don't know anything about. Hardforum, XtremeSystems, and overclock.net are slowly dying.
This man speaks the truth
 
This is from the Nvidia marketing department. I think it is pretty clear what the take-home message is:
[Chart: increase in K/D by GPU, Fortnite/PUBG battle royale]

You should run out and buy an RTX 20xx if you want to be competitive.

Note, this isn't giving the same player new gear and measuring how much people improve.

This is comparing people that own low end and high end gear and seeing how they compare. There is probably a correlation between serious gamers and high end equipment, and casual gamers and low end equipment.

This is the truth. This is nothing other than a marketing presentation disguised as research.
 
Yeah, it's a BS article. Now you're going to have folks running a 10-year-old i5 with 8 gigs of DDR2 memory put a 2080 Super in their box and pair it with a G-Sync monitor to get the best gaming experience, and that simply is NOT the case.

And what if AMD had commissioned the exact same study? You have an anti-Nvidia bias, so your response is expected and contributes nothing to the actual topic.

...I personally "only" run 120 because I've found my eyes can't see much beyond 100Hz in most games, but I do totally get it. For me, the difference between 60 and 120/144 is somewhere between "considerable" and "mandatory". At 60Hz, it feels utterly wrong and broken...

But the faster refresh means on-screen things you need to react to show up on your physical display sooner. So don't discount that a faster refresh could help you.
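As a rough back-of-the-envelope sketch (my own numbers, not from the article): if something pops onto the screen at a random instant, it waits on average about half a refresh interval before the next frame can even show it, ignoring pixel response and scan-out time.

    # Average extra delay before a new on-screen event can be displayed,
    # assuming the GPU delivers one fresh frame per refresh and ignoring
    # pixel response / scan-out.
    for hz in (60, 144, 240):
        frame_ms = 1000 / hz
        print(f"{hz} Hz: frame time {frame_ms:.1f} ms, average added delay ~{frame_ms / 2:.1f} ms")
    # 60 Hz  -> ~8.3 ms average added delay
    # 144 Hz -> ~3.5 ms
    # 240 Hz -> ~2.1 ms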

The test was "all else being equal", which means everything else that contributes to your performance:

Your reaction time
Your ping to the server/playing on a wired connection or wireless
Your control device (wireless mouse or wired? older sensor or new 12k dpi laser one?)/mousepad
The game engine/game netcode quality
Your CPU

So without changing any of the above, the tests show that increasing the FPS improved the players' performance. This is a fact. The "control group" is themselves, with their hardware/reaction time/skill/ping/mouse/mousepad unchanged except for the FPS of the display (so they either controlled the refresh of the LCD or changed the LCD hardware to a faster one, which was the point of the study, i.e. it was the ONLY thing that changed). I'm not referring to the Nvidia marketing material but to the tests LTT did using a high-fps camera and a repeatable in-game experiment, and Blur Busters' similar testing.

I plan on getting a 144+ Hz display as soon as some good-quality panel-based ones come out. That ASUS PG35VQ (200Hz, G-Sync) comes close, but reviews were not as good as I had hoped or I would already own one.
 
And what if AMD had commissioned the exact same study? You have an anti-Nvidia bias, so your response is expected and contributes nothing to the actual topic.
Just because someone disagrees with the premises (and then likely the conclusions) of the article doesn't mean they are inherently biased. It's not biased to call something bullshit that's bullshit. All you need to do is look at that graph comparing a GTX 6XX series to an RTX 20XX series to see the bullshit. The GTX 680 was released 7.5 YEARS ago, so his comparison isn't that far off from what Nvidia is trying to demonstrate.
 