Nvidia study finds gamers have better K/D ratios at higher framerates

erek

"Nvidia is not alone in this thinking.

"For many years, esports pros have tuned their hardware for ultra-high frame rates—144 or even 240fps—and they pair their hardware with high refresh rate monitors. In fact, ProSettings.net and BattleRoyaleSettings.com report that 99 percent of Battle Royale Pros (Fortnite, PUBG and Apex Legends) are using 144Hz monitors or above, and 30 percent are using 240Hz monitors," Nvidia stated in March.

Nvidia's claim is also echoed in testing conducted by Linus Tech Tips, from back in June. Have a look:"

https://www.pcgamer.com/nvidia-study-finds-gamers-have-better-kd-ratios-at-higher-framerates/
 
I bet this has a *LOT* to do with the fact that the folks who are best at this also have the fastest gear. Kind of like how the best drivers usually have the best race cars. The best shooters have the best guns. I'm sure if I gave Jerry Miculek a regular AR-15 he could still outshoot me any day of the week, even if I had the lightest trigger on the planet.
 
My K/D ratio never went any higher than it was at 85Hz on a CRT, even though I tested frame rates of OVER 140 fps (at 140Hz, vsync off).

While I will grant that you will see much higher kill rates going from 45 to 75 fps, for most gamers those gains are going to hit a wall jumping to 120Hz or higher.

But because it's a paid study by Nvidia, they won't tell you how low the frame rate cutoff is.
 
Not a surprise: higher framerate is going to (all things being equal) give you lower latency, which gives you an advantage, particularly in the really twitchy shooters with high damage. When I played Black Ops 2, I decided to buy some skill in the form of a fast monitor. I had a 26" NEC pro IPS monitor that I used and loved; it was 60Hz, as almost all monitors were then, and had 33ms of latency. I was hard stuck at around a .9ish KDR. I got myself a BenQ 24" TN 120Hz monitor, one of the few true 120Hz panels around then, which looked like utter ass, particularly in its FPS mode, which lowered the latency to like 4ms. Worked though: my KDR went up to over 2. I could just get the shots off faster than other people on account of the faster display.
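
For rough context, here's a back-of-the-envelope sketch (Python, using only the numbers quoted above) of the worst-case delay those two monitors add between a finished frame and photons on screen; it's an illustration of the anecdote, not a measurement:

# Back-of-the-envelope sketch using the figures quoted above.
# Worst case you wait a full refresh interval, then the panel's own latency.
def worst_case_display_delay_ms(refresh_hz, panel_latency_ms):
    refresh_interval_ms = 1000.0 / refresh_hz
    return refresh_interval_ms + panel_latency_ms

print(worst_case_display_delay_ms(60, 33))   # ~49.7 ms on the 60Hz IPS
print(worst_case_display_delay_ms(120, 4))   # ~12.3 ms on the 120Hz TN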
 
There was a follow-up to this LTT video a couple weeks ago.


To sum it up: yes, higher rates help. But you can get the majority of the benefit just by having the high frame rate, even if it exceeds your refresh rate, because the server data updates on your screen with more recent positioning.
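
A toy illustration of that point (my own assumed numbers, not from the LTT video): with vsync off, a freshly rendered frame can start scanning out immediately, so the age of the newest information on screen is bounded by the render frame time rather than the refresh interval.

# Toy sketch with assumed numbers (not from the video): with vsync off,
# the newest slice of the image is at most one render frame old,
# even if the panel refreshes more slowly.
def newest_info_age_ms(render_fps):
    return 1000.0 / render_fps

print(newest_info_age_ms(60))    # ~16.7 ms of staleness at 60 fps
print(newest_info_age_ms(240))   # ~4.2 ms at 240 fps, even on a 60Hz panel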
 
There was a follow-up to this LTT video a couple weeks ago.


To sum it up: yes, higher rates help. But you can get the majority of the benefit just by having the high frame rate, even if it exceeds your refresh rate, because the server data updates on your screen with more recent positioning.



Right, as long as you game with vsync off, the benefit of higher refresh rates is mostly lost. A torn frame is as good as a fully-rendered frame.
 
Even if the human eye could see more than 30 fps, which it can't, then how come soap operas don't look as good as Star Wars?

Does it matter what brand my frames per second are? Are Nvidia 1650 ones good enough for your average pro Fortnite streamer?


Yeah, got nothing
 
You were probably younger, and an 85 Hz CRT is much faster and sharper than even a 240 Hz LCD.


Nope, I tried a newer 1ms TN LCD panel at 75Hz, and it had almost identical response times to my old CRT at 85Hz (slightly slower, but not noticeable in terms of my performance).

And no, the CRT wasn't much "sharper" than my 1080p LCD - only the outrageously expensive CRTs were the beauty queens you see everywhere on YouTube.

My $450 19" Iiyama Diamondtron CRT had glowing edges around sharp pixel transitions (i.e. white-to-black text), and a dot clock that limited it to about 900 vertical lines before getting super blurry.

The only thing my TN LCD does worse than the CRT is viewing angles and contrast (only slightly), but it doesn't have IPS glow or the ghosting issues of VA. It's a much sharper picture than my old CRT.
 
Nope, I tried a newer 1ms TN LCD panel at 75Hz, and it had almost identical response times to my old CRT at 85Hz (slightly slower, but not noticeable in terms of my performance).

And no, the CRT wasn't much "sharper" than my 1080p LCD - only the outrageously expensive CRTs were the beauty queens you see everywhere on YouTube.

My $450 19" Diamondtron CRT had glowing edges around sharp pixel transitions (i.e. white-to-black text), and a dot clock that limited it to about 1000 vertical lines before getting super blurry.

The only thing my TN LCD does worse than the CRT is viewing angles and contrast (only slightly), but it doesn't have IPS glow or the ghosting issues of VA.
I meant in motion.
 
I meant in motion.


On a 1ms overdriven Asus TN, I can't see the motion blur that everyone swears is there. Maybe if you all would stop insisting on gaming on IPS or VA?

Yeah, it was a problem back in 2005, when the best panels had maybe 16ms response times (and total response of over 40ms), but that era is over. Any decent 1ms TN panel is going to have total response times of 10ms, and that's indistinguishable from CRT motion to me.

My C7 OLED TV has similar response times and motion clarity to my old CRT, in game mode.
 
On a 1ms overdriven Asus TN, I can't see the motion blur that everyone swears is there. Maybe if you all would stop insisting on gaming on IPS or VA?

Yeah, it was a problem back in 2005, when the best screens had maybe 16ms response times, but that era is over. Any decent 1ms TN panel is going to have total response times of 10ms, and that's indistinguishable from CRT motion to me.
Different people, different eyes I guess. 10 ms is Vaseline territory.
 
On a 1ms overdriven Asus TN, I can't see the motion blur that everyone swears is there. Maybe if you all would stop insisting on gaming on IPS or VA?

Yeah, it was a problem back in 2005, when the best panels had maybe 16ms response times (and total response of over 40ms), but that era is over. Any decent 1ms TN panel is going to have total response times of 10ms, and that's indistinguishable from CRT motion to me.
Fair points but I think everyone has different sensitivities.

Also it depends on the games. I wouldn't really care about what a StarCraft player has to say compared to a twitch shooter pro when it comes to response time and ghosting.
 
Fair points but I think everyone has different sensitivities.

Also it depends on the games. I wouldn't really care about what a StarCraft player has to say compared to a twitch shooter pro when it comes to response time and ghosting.


Right, I'm not the be-all and end-all when it comes to eyesight. I'm one of the few people in my circle of friends who cares this much about display accuracy and input lag, so you can imagine how rare vision like yours is :D

I'm just saying, this is a paid promotional piece by Nvidia, not useful for most casual gamers.
 
Right, I'm not the be-all and end-all when it comes to eyesight. I'm one of the few people in my circle of friends who cares this much about display accuracy and input lag, so you can imagine how rare vision like yours is :D

I'm just saying, this is a paid promotional piece by Nvidia, not useful for most casual gamers.
I hear ya bud.

Check out that second video though. The conclusion they came to is the general consensus among FPS gamers: high fps isn't about visual fidelity, but rather the feel of the game. More frames means more updates put on your screen from your mouse and the game server.

Competitive Counter-Strike servers run at a 128 tick rate, so if you have fewer fps than that, you are simply not seeing all the movement info the server is sending you. Your game client interpolates (guesses) everything in between. That causes you to make shots that look like hits when in fact they were misses, because your client guessed the wrong position of the enemy.
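
To make the "guessing" concrete, here's a minimal sketch (my own simplification in Python, not actual Counter-Strike netcode) of a client lerping an enemy position between the two most recent server snapshots; if the enemy changed direction between ticks, the guess is wrong and an on-screen "hit" can really be a miss:

TICK_RATE = 128                  # server snapshots per second
TICK_DT = 1.0 / TICK_RATE        # ~7.8 ms between snapshots

def interpolated_position(pos_prev, pos_next, seconds_since_prev):
    # Linear guess between the last two known server positions.
    alpha = min(seconds_since_prev / TICK_DT, 1.0)
    return tuple(p + (n - p) * alpha for p, n in zip(pos_prev, pos_next))

# Enemy reported at x=0 then x=1 on consecutive ticks; halfway between
# snapshots the client draws him near x=0.5, whether or not he was really there.
print(interpolated_position((0.0, 0.0), (1.0, 0.0), 0.0039))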
 
Even if the human eye could see more than 30 fps, which it can't, then how come soap operas don't look as good as Star Wars?

Does it matter what brand my frames per second are? Are Nvidia 1650 ones good enough for your average pro Fortnite streamer?


Yeah, got nothing

You forgot the /s, you're gonna confuse some kids, lol.
 
Yeah, and Adderall makes you a better player too; compared to refresh rates it's a no-brainer.
 
Also it depends on the games. I wouldn't really care about what a StarCraft player has to say compared to a twitch shooter pro when it comes to response time and ghosting.

Not just the game, but even just the scene. The biggest driver of ghosting and motion blur is contrast. If a scene is generally light or generally dark or just generally similar, ghosting and motion blur are going to be harder to pick up. If you're scrolling a webpage that is black text on a white background, the effects are often immediately noticeable and can make the text impossible to read while scrolling. Turn that same page to black text on a dark grey background and suddenly it becomes easy to read while scrolling.
 
I hear ya bud.

Check out that second video though. The conclusion they came to is the general consensus among FPS gamers: high fps isn't about visual fidelity, but rather the feel of the game. More frames means more updates put on your screen from your mouse and the game server.

Competitive Counter-Strike servers run at a 128 tick rate, so if you have fewer fps than that, you are simply not seeing all the movement info the server is sending you. Your game client interpolates (guesses) everything in between. That causes you to make shots that look like hits when in fact they were misses, because your client guessed the wrong position of the enemy.

I can tell your heart is in the right place, but game logic is not affected by the number of frames a monitor can display.

As per your example, the game client generates frames based on game state. Each client updates the server n times a second; the server takes those inputs, tries to process them, then updates everyone on its next tick.

Say you are playing on a properly configured CS:GO server with a tick rate of 100 or higher, and you game on a machine capable of rendering 500 fps. Before variable refresh rate modes, your computer would generate the 500 fps, but ~400 of those would be dupes. On top of that, there are no 400Hz panels that I know of. Same thing if a monitor can only display 5 fps: that won't slow the game, nor cause interpolation outside your brain.
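
As a toy illustration of that ratio (assumed round numbers, using the same 128 tick figure mentioned above): when the render rate far exceeds the tick rate, most rendered frames just repeat the last server snapshot.

TICK_RATE = 128      # server snapshots per second
RENDER_FPS = 500     # frames the client can render per second

frames_per_snapshot = RENDER_FPS / TICK_RATE
stale_frames = RENDER_FPS - TICK_RATE

print(frames_per_snapshot)   # ~3.9 rendered frames per server snapshot
print(stale_frames)          # 372 of the 500 frames carry no new server data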

Hopefully my exhausted brain did a good job reminding you about things you already knew
 
I used to end up in the top four or five players on 30 player CS:S servers until these 1ms high refresh rate monitors hit the market and I didn't bother to upgrade. I would even be the top player from time to time if there were no pregnant women on the server (that's right, I said that).

Now I'm lucky if I don't come in last. I don't really game anymore, it feels like everybody is cheating. lol

There are people who are good at making their brain compensate for the lag but I'm not one of them.
 
I am SHOCKED... SHOCKED I tell you, that an Nvidia study says higher frame rates = better competitive gameplay.

Let me list the things that matter... MORE than frame rate.
1. Network Latency.
2. Disk queueing
3. Control surfaces/devices.

Get those three optimized and, even with a bog-standard video card and monitor, your gaming performance will be MUCH improved. Only once all of those are optimized would I worry overmuch about having a 240Hz panel driven by an RTX Titan.
 
I am SHOCKED... SHOCKED I tell you, that an Nvidia study says higher frame rates = better competitive gameplay.

Let me list the things that matter... MORE than frame rate.
1. Network Latency.
2. Disk queueing
3. Control surfaces/devices.

Get those three optimized and, even with a bog-standard video card and monitor, your gaming performance will be MUCH improved. Only once all of those are optimized would I worry overmuch about having a 240Hz panel driven by an RTX Titan.
Fair points but no one was saying that fps is the only thing that matters.
 
Actually, isn't that exactly what Nvidia was saying?
No
You can't address multiple variables in a single experiment.
Also, how much sense would it make for a video card manufacturer to push users to optimize their internet connection for better gaming?
 
No
You can't address multiple variables in a single experiment.
Also, how much sense would it make for a video card manufacturer to push users to optimize their internet connection for better gaming?

https://www.nvidia.com/en-us/geforc...mes/?cjevent=f83e7995177511ea83a003140a24060e

https://www.nvidia.com/en-us/geforce/news/geforce-gives-you-the-edge-in-battle-royale/

Those are the referenced "studies" that the OP article linked to. They are pushing "high FPS" and ignoring everything else to sell video cards. So I think I'm right in saying that Nvidia is pushing high FPS as the largest variable and ignoring just about anything else for the purpose of selling video cards and G-sync monitors.

They never really go into detail about where their percentages come from in terms of K/D ratio, nor whose K/D ratios they are pulling from. All it says is "esports pros." At the end of the day, I'm sure an "esports pro" in CS:GO is going to kick my ass even if I have a 240Hz monitor and a 2080 Ti and he's playing on a 60Hz monitor and an RX 580.
 
https://www.nvidia.com/en-us/geforc...mes/?cjevent=f83e7995177511ea83a003140a24060e

https://www.nvidia.com/en-us/geforce/news/geforce-gives-you-the-edge-in-battle-royale/

Those are the referenced "studies" that the OP article linked to. They are pushing "high FPS" and ignoring everything else to sell video cards. So I think I'm right in saying that Nvidia is pushing high FPS as the largest variable and ignoring just about anything else for the purpose of selling video cards and G-sync monitors.
If that's how you want to view it then sure.

But remember, the experiment was "does higher fps improve gameplay?", not "what is the best way to improve gameplay?".

It's like a university releasing a study on the efficacy of a new drug used to reduce heart attacks. Should they be expected to run a separate battery of tests comparing its efficacy against every other possible treatment? Absolutely not. That makes no sense. Nothing would get done.
You need to isolate a single variable and test it against the control group.
 