NVIDIA: RTX GPUs, High-Refresh-Rate Monitors Can Improve Your Kill-Death Ratio

Megalith

24-bit/48kHz
Staff member
Joined
Aug 20, 2006
Messages
13,000
If only to convince gamers to upgrade their GPUs and displays, NVIDIA has published new data supporting the obvious idea that better hardware improves player performance in Battle Royale titles such as Fortnite and Apex Legends. Essentially, players who can manage 144 fps score significantly higher than those limited to 60 fps: the company’s graphs suggest its RTX cards can increase K/D ratio by as much as 53%, while playing on 144 Hz and 240 Hz monitors can improve K/D ratio by 34% and 51%, respectively.

NVIDIA analyzed more than a million sample points collected via anonymous GeForce Experience data (which means no AMD cards). Specifically, NVIDIA is looking at player performance in two popular battle royale games: PUBG and Fortnite. How do you quantify player performance? NVIDIA looked at kill/death ratio, matched that up with the number of hours played per week, and then broke it down by graphics hardware and monitor refresh rate. NVIDIA limited its analysis to 1080p, which allows for the highest refresh rates and also serves to normalize things a bit.
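Purely as an illustration of the kind of bucketing NVIDIA describes, here is a rough, hypothetical sketch in pandas. The column names, tiers, and numbers are made up for the example; this is not NVIDIA's actual data or pipeline:

```python
# Hypothetical sketch of "K/D by hours played, GPU tier, and refresh rate".
# All column names and values are invented for illustration.
import pandas as pd

df = pd.DataFrame({
    "kd_ratio":     [0.8, 1.1, 1.4, 0.9, 1.6, 1.2],
    "hours_per_wk": [4, 4, 12, 12, 20, 20],
    "gpu_tier":     ["GTX 600/700", "RTX 20", "GTX 600/700",
                     "RTX 20", "GTX 600/700", "RTX 20"],
    "refresh_hz":   [60, 144, 60, 144, 60, 240],
})

# Bucket hours played first, then compare median K/D across hardware within
# each bucket, so "plays more" and "owns faster gear" aren't conflated.
df["hours_bucket"] = pd.cut(df["hours_per_wk"], bins=[0, 5, 15, 40],
                            labels=["casual", "regular", "heavy"])
summary = (df.groupby(["hours_bucket", "gpu_tier", "refresh_hz"], observed=True)
             ["kd_ratio"].median()
             .unstack("gpu_tier"))
print(summary)
```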
 
Some people will be more committed to gaming, and so spend more money on their gear, some will have more experience, and some will buy their advantage. It isn't either/or, it's all of the above.

Oh, and hi, everybody!
 
The best tool a gamer has is not even attached to the computer, it's the DEW! Mountain Dew!
That's a commercial, makes as much sense as this one.
 
So the faster your gear, the less each of your "kills" is worth. ;-)
 
Oh, a new twist on an old argument. It's been known for years that the higher the frame rate, the lower the input lag, and with a higher refresh rate monitor you also see more on screen without tearing, letting you react quicker to what you see; with a lower refresh rate and/or frame rate you may not be able to react at all because you simply didn't see it soon enough. A great example is Counter-Strike, where it has been common knowledge for ages that you run the game with the frame rate uncapped, even if you only have a 60 Hz monitor. The concept holds true for any game.
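A simplified back-of-the-envelope model of why uncapping helps even on a 60 Hz panel (my numbers, ignoring render queues, scanout, and input sampling): with vsync off, each refresh picks up the most recently finished frame, which on average is about half a render frame-time old.

```python
# Simplified model only: average "age" of the frame a 60 Hz refresh displays
# when the game renders uncapped at a given FPS. Ignores render queues,
# scanout time, and input sampling; back-of-the-envelope illustration.
def avg_frame_age_ms(fps: float) -> float:
    frame_time = 1000.0 / fps
    return frame_time / 2.0

for fps in (60, 144, 300):
    print(f"{fps:>3} FPS uncapped on a 60 Hz panel -> "
          f"frame is ~{avg_frame_age_ms(fps):.1f} ms old on average")
# 60 FPS -> ~8.3 ms, 144 FPS -> ~3.5 ms, 300 FPS -> ~1.7 ms
```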
 
As I posted in the other thread on this:
My KDR increased between 10% and 20% going from a 1200p 60 Hz fixed refresh rate monitor to a 1440p 165 Hz G-Sync monitor while using the same 1060 6GB GPU, so high refresh definitely makes a difference.
 
As I posted in the other thread on this:
My KDR increased between 10% and 20% going from a 1200p 60 Hz fixed refresh rate monitor to a 1440p 165 Hz G-Sync monitor while using the same 1060 6GB GPU, so high refresh definitely makes a difference.

Yes it does, but you also increased your resolution, and if you did not adjust your mouse sensitivity to compensate for the resolution change, then your mouse speed also changed, which can affect accuracy, especially since increasing resolution slows down your mouse movements.
 
Yes, yes it does, to a point, but you're only as good as you are. If you can't aim at anything and have a reaction speed of a second-plus, you're probably not going to get much mileage out of the higher FPS aside from it looking a lot more pleasing to the eye. But for someone like myself, with over two decades of very high-level competitive play, yes, it absolutely makes my ability to play that much better.
 
nVidia stated they accounted for hours played. In theory, that should eliminate the concerns many of you have stated with "more $$ spent = more dedication"
 
The best tool a gamer has is not even attached to the computer, it's the DEW! Mountain Dew!
That's a commercial, makes as much sense as this one.

Tell Truth!!! Love the Dew!
 
Anybody who doesn't think there is an advantage in shooters for players on high refresh rate monitors is in denial. I do not even own a monitor over 75 Hz at 2K, but it is easy to test if you know someone who has a 144 Hz, or better yet a 240 Hz, monitor (a monitor, not a TV). If the rest of my system could do 144+ FPS I would already be on a higher-end monitor.
 
The driver at a track day in the modded M5 is almost certainly going to turn in better times than the one in the stock Miata. But they'd probably beat the Miata's driver in the Miata too, because they've spent more time developing the skills needed to corner well and maintain speed. The superior hardware they brought to the track means the limit is their own skill.
The same thing is going on here: sure, the occasional spoiled rich kid will come through and act like a jackass, but most people caught the bug, found camaraderie as they worked their way up, and invested in the hobby as they could to maximize the experience as their ability grew.
 
NVIDIA: RTX GPUs, High-Refresh-Rate Monitors Can Improve Your Kill-Death Ratio:
It is guaranteed to deplete your wallet faster if you act on this. I don't have any data to support this tho :)
 
I think it was Blur Busters who had an article about this. Faster screen refresh = less input lag, coupled with you seeing new information sooner. It's basically true.

I don't believe it has to be "nVidia" cards, but their cards are the fastest, and G-Sync has the best adaptive sync tech. So you can gain the purported benefits with either AMD or nVidia; nVidia likely gives the best outcome, for the foreseeable future anyway.
 
One of NVIDIA's biggest shills on YouTube:

[attached image: youtube_herpes.jpg]


Well, who would have thought that better, more expensive hardware equals a better gaming experience? We needed research, data points, and NVIDIA to tell us that! Oh, and YouTube influencers too.
 
LTT isn't as critical about Nvidia as say... GamersNexus is, but they didn't give them a free pass on this either:

 
People who have time to play video games don't have a lot of money (except for rich kids and such), and people who have a lot of money and could afford to pay whatever NVIDIA and Intel are asking don't have time to play video games because they're busy making ... money. That's the real problem NVIDIA needs to address. However, it seems to me like NVIDIA lives in a parallel universe where they make high-end luxury products and their customers are rich snobs who need to grow their e-peens bigger with every purchase. Keep dreaming, NVIDIA.
 
People who have time to play video games don't have a lot of money (except for rich kids and such), and people who have a lot of money and could afford to pay whatever NVIDIA and Intel are asking don't have time to play video games because they're busy making ... money. That's the real problem NVIDIA needs to address. However, it seems to me like NVIDIA lives in a parallel universe where they make high-end luxury products and their customers are rich snobs who need to grow their e-peens bigger with every purchase. Keep dreaming, NVIDIA.

I have a 2080 ti and I'm not rich. I just saved money and took advantage of step-up to get it.
 
I have a 2080 ti and I'm not rich. I just saved money and took advantage of step-up to get it.

Yeah, but you don't have two of them in SLI linked together with an expensive NVLINK bridge :p

Seriously though, I'm a hardware enthusiast, just like you. I bet you don't spend an overwhelming amount of time gaming. I'm a hardware junkie, I like new stuff to play with. I don't game much, I work a lot though. I was talking strictly about folks who have enough time on their hands to game for several hours every day, or at least that's what NVIDIA 'nvisions for them, hence why they need the most expensive hardware.

At this point, NVIDIA has more in common with Scientology than a computer hardware manufacturer. Upgrading your NVIDIA GPU is almost like moving up to the next Operating Thetan level. If you can afford it, that is. Too bad NVIDIA doesn't have anything like the Sea Org...
 
Yeah, but you don't have two of them in SLI linked together with an expensive NVLINK bridge :p

Seriously though, I'm a hardware enthusiast, just like you. I bet you don't spend an overwhelming amount of time gaming. I'm a hardware junkie, I like new stuff to play with. I don't game much, I work a lot though. I was talking strictly about folks who have enough time on their hands to game for several hours every day, or at least that's what NVIDIA 'nvisions for them, hence why they need the most expensive hardware.

At this point, NVIDIA has more in common with Scientology than a computer hardware manufacturer. Upgrading your NVIDIA GPU is almost like moving up to the next Operating Thetan level. If you can afford it, that is. Too bad NVIDIA doesn't have anything like the Sea Org...

I could on weekends if I wanted to. Technically could put in a few hours a day during the week, if I wasn't tired from work. Though I don't have a family to support/spend time with so I have a bit more free time (and money) than folks that do.
 
Yes, yes it does, to a point, but you're only as good as you are. If you can't aim at anything and have a reaction speed of a second-plus, you're probably not going to get much mileage out of the higher FPS aside from it looking a lot more pleasing to the eye. But for someone like myself, with over two decades of very high-level competitive play, yes, it absolutely makes my ability to play that much better.

While you can only get so good, hardware can definitely make you better. Take two players. One with a 200 Hz monitor with 10 ms delay, and one with a 60 Hz monitor with 80 ms delay. The better monitor will make up for a lot of the reaction time. It's also why I hate playing FPSs with a ping over 40 ms (and absolutely despise how many console FPSs assume 100 ms is perfectly playable and the highest bars you can get).

While I'm not sure how influential better hardware is, I'm pretty sure everyone already knows better hardware will improve your game. The only time I've ever heard of anyone getting worse is if they were just used to their worse hardware.
 
Diminishing returns in regard to FPS and latency. Isn't G-Sync capped to the display refresh rate, or slightly below it? If that's the case, NVIDIA is comparing 16.67 ms of frame latency against 6.94 ms and 4.16 ms in regard to K/D. Oh gee, big shocker that a 9.73 ms to 12.51 ms difference leads to more face melting in an FPS game that relies heavily on latency. To get a truer latency figure you also have to add mouse and keyboard latency (both, rather than either/or), along with network lag as well.

I believe NVIDIA compared three G-Sync displays, and it's odd as hell that they went with a non-linear refresh rate comparison, inserting a 144 Hz display rather than 120 Hz and keeping it linear; that just skews the chart to look even less favorable. Anyway, here are some comparisons of latency and FPS capping and the type of non-linear latency differences you can expect. Just a reduction of roughly 10 ms to 12.5 ms is a hell of a noticeable difference when jumping up to a 144 Hz or 240 Hz refresh rate compared to 60 Hz in a capped frame rate scenario. It also completely highlights why you shouldn't cap frame rates if you can help it, or should only cap them a bit below the monitor's maximum refresh rate. Those latency reductions are nice, but at the same time the reductions are far larger at lower FPS.

240 FPS = 4.16 ms latency
144 FPS = 6.94 ms latency
120 FPS = 8.33 ms latency
60 FPS = 16.67 ms latency
30 FPS = 33.33 ms latency
24 FPS = 41.66 ms latency
15 FPS = 66.66 ms latency

Frame capped:
60 FPS vs 15 FPS: 50 ms latency difference (45 frame difference)
60 FPS vs 24 FPS: 25 ms latency difference (36 frame difference)
60 FPS vs 30 FPS: 16.66 ms latency difference (30 frame difference)
240 FPS vs 30 FPS: 29.17 ms latency difference (210 frame difference)
60 FPS vs 120 FPS: 8.34 ms latency difference (60 frame difference)
60 FPS vs 144 FPS: 9.73 ms latency difference (84 frame difference)
60 FPS vs 240 FPS: 12.51 ms latency difference (180 frame difference)
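Those figures are just frame times (1000 / FPS) and their differences, so they're easy to recompute; a quick sketch (small rounding differences versus the hand-rounded numbers above are expected):

```python
# Frame time in milliseconds for a given frame rate, plus the capped-FPS
# latency differences from the list above.
def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

for fps in (240, 144, 120, 60, 30, 24, 15):
    print(f"{fps:>3} FPS = {frame_time_ms(fps):.2f} ms")

comparisons = [(60, 15), (60, 24), (60, 30), (240, 30),
               (60, 120), (60, 144), (60, 240)]
for a, b in comparisons:
    diff = abs(frame_time_ms(a) - frame_time_ms(b))
    print(f"{a} FPS vs {b} FPS: {diff:.2f} ms latency difference "
          f"({abs(a - b)} frame difference)")
```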
 
While you can only get so good, hardware can definitely make you better. Take two players. One with a 200 Hz monitor with 10 ms delay, and one with a 60 Hz monitor with 80 ms delay. The better monitor will make up for a lot of the reaction time. It's also why I hate playing FPSs with a ping over 40 ms (and absolutely despise how many console FPSs assume 100 ms is perfectly playable and the highest bars you can get).

While I'm not sure how influential better hardware is, I'm pretty sure everyone already knows better hardware will improve your game. The only time I've ever heard of anyone getting worse is if they were just used to their worse hardware.

I did not read past the first example because it was already silly as hell. I have cheap 60 Hz monitors from like 8 years ago that have nowhere near that kind of latency. The discussion is about refresh rate, not junk. If you put someone on a laptop nipple and then gave them a mouse they'd do better too; it's a ridiculous statement. 80 ms of delay on a monitor is complete trash heap territory and something you only see on over-processed junk TVs.
 
I did not read past the first example because it was already silly as hell. I have cheap 60 Hz monitors from like 8 years ago that have nowhere near that kind of latency. The discussion is about refresh rate, not junk. If you put someone on a laptop nipple and then gave them a mouse they'd do better too; it's a ridiculous statement. 80 ms of delay on a monitor is complete trash heap territory and something you only see on over-processed junk TVs.


Indeed. And just about everyone in this thread is using the phrase "input lag" incorrectly.
 
I wonder if GPUs could deploy a form of adaptive, frame-rate-controlled AA. For example, 50/40/30 FPS would all be fairly reasonable 20 ms/25 ms/33.33 ms frame-time intervals, and the transitions could trigger dynamically in a way that favors AA over input latency, or input latency over AA. It could be set up a bit like Radeon Chill's min/max frame rate targets, raising, reducing, or turning off AA quality at trigger points. That could be a very big deal, I believe. It would be a lot like streaming quality and how it changes based on network traffic during heavy usage. Sure, you'd notice some differences, but at least the game would run smoother and then return to normal if it was a momentary frame rate dip from, say, a frag grenade.
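Purely as a hypothetical sketch of that idea (invented quality levels and thresholds, loosely modeled on Radeon Chill-style min/max frame-time targets, not any real driver feature or API):

```python
import random

# Hypothetical adaptive-AA controller: drop AA quality when frame time drifts
# above a max target, raise it again when there's headroom below the min
# target. Levels and thresholds are made up for illustration.
AA_LEVELS = ["off", "FXAA", "2x MSAA", "4x MSAA", "8x MSAA"]
TARGET_MIN_MS = 20.0   # ~50 FPS: spare headroom, can afford nicer AA
TARGET_MAX_MS = 25.0   # ~40 FPS: getting slow, trade AA for latency

def adjust_aa(level: int, frame_time_ms: float) -> int:
    if frame_time_ms > TARGET_MAX_MS and level > 0:
        return level - 1          # favor latency: step AA down
    if frame_time_ms < TARGET_MIN_MS and level < len(AA_LEVELS) - 1:
        return level + 1          # favor image quality: step AA up
    return level                  # inside the target band: leave it alone

level = 2  # start at 2x MSAA
for frame in range(10):
    frame_time = random.uniform(15.0, 35.0)  # stand-in for a measured frame time
    level = adjust_aa(level, frame_time)
    print(f"frame {frame}: {frame_time:5.1f} ms -> AA = {AA_LEVELS[level]}")
```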
 
My new 55" Samsung "BFGD" that does 120hz @ 1440p, 9ms port latency, 2ms GTG refresh has most def made me a better player in Apex and Blackout .... no question.

aiming at 120hz is very very accurate
 
FTFA:

Using anonymized GeForce Experience Highlights data on K/D events for PUBG and Fortnite

Um... why would the GeForce software even be tracking a player's K/D ratio to begin with??

And are they openly admitting that the GeForce Experience software is actually tracking users' games??

WTH? :eek:
 
FTFA:



Um... why would the GeForce software even be tracking a player's K/D ratio to begin with??

And are they openly admitting that the GeForce Experience software is actually tracking users' games??

WTH? :eek:

From nvidia's website:
GeForce Experience


GeForce Experience helps you configure your graphics card for the best gaming and content creation performance, get software updates and new features, such as tools for you to record and broadcast your gameplay, and redeem codes for free games.

To make this happen, we need to know your PC’s hardware, software for gaming and content creation (including settings, usage, and how well they run), GeForce Experience feature usage, and geographical region.

If you opt-in to recommendations, we will show you games, apps, and rewards that you might enjoy. If you opt-in to sharing technical data, you’ll send us error logs to help us find and fix bugs. You can configure collection and usage of your data by visiting Privacy Settings.

You can control that with the account you've created for GeForce Experience.
 