240hz gaming

Either that's what you're looking for- and you need to be looking at 1% and 0.1% lows for your workloads, not just 'FPS'- or you get AMD and get more cores.



And AMD users have paid hand over fist for it. At least Intel's board upgrades guaranteed plug-and-play usage. Seen the thread on Ryzen memory speeds? Lol.



This is on you- there should be no stability difference.



This would depend on your system configuration and testing scenario. If loading down the CPU, yes, the AMD system should feel 'more responsive' just due to having more cores. But that would be the same if you'd upgraded to an Intel system with more cores too.





For reasons I haven't bothered to dig in to, AMD has not provided great support for VR. If that's your goal, an Nvidia GPU is highly recommended (otherwise, yeah, get the 5700XT).

As with any of my recommendations or anyone else's, look at the games/programs you use, then start looking at which side and which processor. Chances are one side is beating the other, even with fewer cores.

On the plug and play, I had no issues with my X470/X570 when I researched what memory to buy. My 32GB and 16GB kits ran at advertised speeds by the second reboot. However, I will not dismiss that X370 had growing pains, but most of those are ironed out by now. Also, Intel plug and play? Early adopters of the X99 platform had issues too; to assume one is problem-free and the other is not is a bit dishonest.

Stability was the wrong word, smoother is what I should have said.

Sure, you can say toss more cores at it from the Intel side too, but let me know how many 6-8 core Intel parts with hyper-threading are $200 or less. That's why I said no budget, buy a 9900K/2080 Ti, and on a budget, look at upgrading to a newer Ryzen that is supported by his motherboard.

About VR: since it is my demo system I take to fundraisers/work down days, or when I have to clean my loop I can still game without rushing, it really does not need an upgrade. However, if I could ever find a VR review for the 5700 XT it might make the purchase easier.
 
I have seen Golden Eared audiophiles make similar claims for decades, and they are pretty much completely empty when there is some kind of rigorous blind testing.

If you have any kind of rigorous blind testing for 144hz vs 240Hz, I would love to see it.

------

yes, many...

https://hardforum.com/threads/240hz-is-best-quantitatively.1983341/


and (short answer, yes noticeable):



There is Blur Busters, and TFT Central did something some time ago too, if memory serves me correctly... the 1st URL - I see you didn't like the whole 240Hz thing - lol, we often beat the same things up a number of times, and before you know it, new standards or something come along and most everyone jumps aboard.

The idea of a blind test playing games, really comparing play levels (results), would be a better example - that is out there too, but I can't find it just yet.

EDIT:

Here's one you may like... actually pretty cool comparison albeit short;
 

So for fast moving games, 120/144+ is fine?
I have not played with a 240 really, if I could even reach that.
 
—-

Actually you made a great point - 240Hz is not going to make much if any difference unless you're going to have a system up to the task (minor improvements sure, but enough for most to move up? Prolly not).

If you are going to have the system, and it's a fast-paced game - it's certainly noticeable, to the point that once you have 240Hz and try to go back, you will notice it.

My opinion only on this: I think we are seeing more 240Hz (IPS and 1440p) based upon the cards coming next gen from AMD, Intel, and nVidia - they will turn out the frames and frame times to keep these high refresh rate monitors full.
 

Take a look at the review benchmarks. Heck, someone just posted a World of Tanks bench in this thread. The ratio of average FPS to 1% lows is IDENTICAL for both AMD and Intel. Intel doesn't have some magic 1% low win. The ratios are the same. If you break out a calculator you can even see the progression: Ryzen first gen as a baseline... Ryzen+ improved 5% or so on 1% lows... Zen 2 ups it a further 5% or so, and the bottom line is the ratio between the 3700X and 9900K 1% lows is identical. Not an issue. From first-hand experience I would go a bit further and say AMD drops even fewer frames below average when you're multitasking.

As for Intel doing us a favor by changing the socket every single release... even if half the new pins are dummies. Yeah, no, they are not doing that to benefit consumers. Ryzen memory depends on the CPU's memory controller, no different than Intel. I slotted a 3600X into a B450 board, selected XMP, and done. 3200 no issue. As there is really zero advantage going past 3733... I'm good.

You can claim the number of cores is the only advantage... but it's more than that. Twice the cache also makes a difference, as well as other architecture differences, not the least of which are security related. (The fixes impact multi-threaded work much more heavily than mostly single-threaded stuff like games.) So yeah, when you're multitasking, not only does Intel most of the time have fewer cores... they also have half the cache and low-level fixes that hit context switching really hard. AMD has Intel well beat in anything multi-threaded for now. I'm sure Intel's 7nm will fix a lot of that and I expect they will also double their cache.

On the NV vs AMD point, you're not wrong. In general AMD's software team is smaller and it sometimes shows. I'm sure they made a choice based on resources in regards to VR. I hope, now that the CPU side of things at AMD has officially turned around, they will refocus on the GPU end of things. With Intel coming in... as you have pointed out before, AMD is going to have some serious competition. NV may be able to hold their top-of-the-line market; the mid range though is likely to get more crowded.
 
—-


The only thing that makes sense with 240 is some of the newer IPS panels, as you pointed out, that are starting to boast 1ms response. No matter what, there will always be a gate limit on pixel flips. Obviously TN panels have hung around because of that. The reality is 60Hz sucks, yes, and 30-40ms response is noticeably bad. At 144Hz though we are dealing with 3-5ms response. For most people with a 144Hz IPS panel, that is it, they are at max. Going beyond 144Hz is pointless as their panel is still flipping pixels at 5ms. The thing is though, at 160Hz or so on the overclocked monitors you're getting down to sub-3ms. 240 only gets you a hair better than that.

I have seen 240Hz and I couldn't see the difference... I will say though, I would like to experience one of the newer 1ms IPS panels. TN panels have been a hard no for me. If IPS panel tech has progressed to give us the best of both worlds, that is a big deal. (Although I'm still not sure 144Hz at 1ms wouldn't be basically identical.)
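The "gate limit" idea above can be put in rough numbers. This is only a back-of-the-envelope sketch (not from any review): treat the pixel response time as a floor on how short a frame period the panel can actually show.

```python
# Rough sketch: the refresh rate at which one frame period equals the
# panel's pixel response time. Past that point, the panel can't finish
# a pixel flip before the next frame arrives.
for response_ms in (5.0, 3.0, 1.0):
    ceiling_hz = 1000 / response_ms  # 1000 ms per second / ms per flip
    print(f"{response_ms}ms response -> roughly {ceiling_hz:.0f}Hz useful ceiling")
```

On those rough numbers a 5ms panel runs out of headroom around 200Hz, which is why 240Hz only really pays off once response times drop toward the 1ms the newer panels are boasting.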
 

Even if it does, I don't think it's worth upgrading my existing 144Hz/1440p panel. I'd be more interested in solid HDR support. From what I've seen, the new version of the panel I have (XB270HU) basically has a 1-2ms improvement, which I'm not going to notice. And I can't use ULMB for long because the strobing effect gives me headaches.
 

I've been looking myself recently. My one kid has a 144Hz Acer which looks nice. No doubt you can see the difference in shooters. I spend most of my day though on an LG ultrawide at 75Hz. I admit it's not fantastic for twitch gaming... but I'm not sure I really want to spend 1200-1500 on a proper 21:9 144Hz even. lol. Hopefully pricing drops with some of the newer panels. I can hold out until the next round of GPUs hit. If I can grab a good mid-range card that really gets me 140fps lows, I'll have to pull the trigger. 240Hz 21:9? I don't expect that will be realistic for some time.
 

Nor do I. I've had this panel for over 2 years and it has been great. I've considered the 144Hz 4K panel, but maybe when the next gen of Nvidia/AMD cards is out, because even the 2080 Ti would struggle a bit to drive it. I'm in no rush; this current panel has covered all the bases for me, especially thanks to G-Sync.
 

That chart shows Intel at 190.4 FPS and AMD at 158.1 FPS for 1% low. How is that not a win? It’s also consistent across most games for high Hz.

WoWs runs on a different engine than WoTs as Ligtasm said btw.
 
So I have a 240Hz TN panel on my second rig, and a 166Hz IPS for my main rig. There are diminishing returns above 120Hz, honestly, but there is a difference if you look for it.

I think that the IPS is just about the best of both worlds. 166Hz is plenty, and the picture quality is nice. There is some slight ghosting / motion blur, but it is not so bad and, all things considered, still better than TN.

This monitor is actually a 1080p ultrawide (2560x1080). I ran 7680x1440 for a while, then 4K, and it was too much of a struggle to get to 60fps, so I made a choice for high refresh and I like it.

And even with a 2080 Ti, running new games like BFV, Metro, etc. is still a challenge on Ultra. With tweaked settings I can usually stay around 90fps average, which is a good sweet spot on a G-Sync monitor.

240fps is seriously hard on anything besides e-sports titles. Even with the GPU power, you might hit CPU limits, or just the engine hard-coded limit. With older games, it is nice though.
 

It's simple. What he is suggesting is that AMD's chips are wonky, and even when you see them equal Intel in FPS they have crazy dips. Which is BS.

In that chart:
Ryzen 1800X (first gen): 116.6 FPS average, 77.3 1% lows... ratio 66.3%
Ryzen 2700X (Zen+): 142.5 average, 96.5 1%... ratio 67.7%
Ryzen 3700X (Zen 2): 158.4 average, 109.4 1%... ratio 69.1%
Intel 9900K: 165.3 average, 115.4 1%... ratio 69.8%

69.8% vs 69.1% is within the margin of error and basically identical. Yes, Intel is 4% faster overall... but the performance ratios are the same. Intel doesn't have some magic win in frame drops as IIC was suggesting.

Looking over benchmarks for other games, the same is true. AMD, you could argue, was a bit worse in past generations in terms of 1% lows. First-gen Ryzen had a larger swing in frame rates, with bigger highs and lower lows. Zen+ improved things a bit, and Zen 2 puts them on par with Intel in that regard.

And seeing as we can't forget about price to performance... again going on that benchmark:
Ryzen 3900X: 159.7 FPS, 116.0 1% lows... or 72.6%. No doubt the 3900X provides the smoothest gameplay in that benchmark.
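The arithmetic behind those ratios is easy to check yourself. A quick sketch using the numbers quoted above from that benchmark:

```python
# Recompute the 1%-low-to-average ratios from the benchmark figures above.
benchmarks = {
    "Ryzen 1800X":   (116.6, 77.3),
    "Ryzen 2700X":   (142.5, 96.5),
    "Ryzen 3700X":   (158.4, 109.4),
    "Core i9-9900K": (165.3, 115.4),
    "Ryzen 3900X":   (159.7, 116.0),
}

for cpu, (avg_fps, low_fps) in benchmarks.items():
    ratio = 100 * low_fps / avg_fps  # 1% low as a share of the average
    print(f"{cpu}: {avg_fps} avg / {low_fps} 1% low -> {ratio:.1f}%")
```

Same math for any chart: if the ratio column is flat across CPUs, nobody has a hidden frame-drop advantage; the faster average just wins outright.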
 

I saw he used the word "ratio" but I guess I see where he was going. At the end of the day, all that matters to the end user is the actual 1% lows.

Intel has 21% higher fps for their 1% lows in that chart.
 

That's comparing an Intel chip overclocked past 5GHz to a lower-end stock AMD chip. At stock, the 3900X bests the i9-9900. At stock, the 3700X is within 3 FPS of the i7-9700 on 1% low numbers.

I would say that is more apples to apples... even if Intel just dropped the pricing on the i9 recently. (At least where I am; it was at the same price a few weeks back, today it's about 80 bucks cheaper than a 3900.)

Granted, Intel has a bit more overclocking headroom... which does improve the 1% lows a bit. Still, comparing it to the 3900, the only way it wins anything by more than low single-digit FPS is with a water-cooled (or high-end air) overclock.
 

(attached benchmark chart)


I don’t see AMD besting Intel at stock or OC vs OC?

Besides, this is [H]; overclockers is in the name of the page...
 
The 3900 and 3700 are doing pretty awesome there. They are all over 144, so it would really come down to price: which one is cheaper and has more of an upgrade path.
 

If 144Hz and you mainly play WoT, that's a valid point. That's basically why I buy AMD 4 out of 5 times now.
 

How can you blind test for games? You wouldn't be able to see the screen, and using blind people would be unethical.
 

Not only that, but for double-blind studies, you also have to use blind experimenters, not just blind test subjects. Do you know how hard it is to find blind scientists who specialize in monitor refresh frequency measurements? /s



So, I'm now gaming at around 160-180 FPS, which I'm comfortable with - I know my monitor is capable of more, and I'll get there with the next GPU upgrade in 6 months. Enabling G-Sync/FreeSync 2 was nice - I had to finally step up to the 4xx series of drivers, which I had been avoiding because of early stability issues.

One thing I would caution about with this particular monitor - the edges of some surfaces ghost a bit, with some grey display artifacts when fast motion is happening on the screen. I guess the VA panel isn't quite up to the 240Hz refresh yet. I'm still messing around with a bunch of monitor settings to limit this, but it is a bummer. I was not expecting artifacting like this.

Regards,
Mitch
 

So I goofed around a little. At 1080p "high" my 2080 Ti was at about 30% usage and my CPU, a 2700X, had a core pegged. So it's pretty clearly a CPU-bound game. I was averaging about 160fps.

If you upgrade your GPU I wouldn't expect a lot more frames, but you could downsample with DSR for a sharper image, jack up IQ in general.
 

Thanks! I didn't expect that! I'll double-check my gpu and cpu utilization... Maybe I'll turn up the gfx quality then...
 

It's pretty easy to try. Just keep in mind it only takes one core at ~90% for that game to be CPU bound.
 

Just to follow up on this since I just happened to be upgrading:

2700x/2080ti : 160 FPS
9900KF/2080ti : 230 FPS

Sounds like OP is around 160FPS with a 3600x/1070.

Just figured I’d post the data points since I was able to grab them. Definitely a heavily CPU constrained game.
 

Thanks for the test!

The code is likely unoptimized (or heavily single-threaded); the company behind the game isn't known for their great software development practices.

For comparison - playing WoW-Classic with all settings at ultra, I'm solidly at 200 FPS...
 
Ordered some 3733 CL15 RAM posted on the deals forum - we'll see the improvement over my current kit, which struggles to keep 2800 speed.
 


Zowie has perhaps one of the best examples of 144Hz vs 240Hz when playing a first-person shooter - at 240Hz (with a system up to gaming at 200+ frames) there's some benefit in not only getting the shot off first, but more accurately. Of course it's not all as simple as going out and buying a more expensive 240Hz monitor. Hopefully this is good information for those who've asked questions about 240Hz monitors.

The 240Hz IPS panel(s) should start selling any day now which will be interesting...

https://zowie.benq.com/en/product/monitor/divina/xl2546-blue.html
 
Sigh - that Viper DDR4-3733 won't boot at the default XMP2 profile on my ASRock AB350 Fatal1ty board. It's a total pain in the ass to tune, since I need to CMOS reset whenever I try a too-aggressive configuration. How does anyone put up with DDR timing/speed/voltage tuning? It's like pulling teeth. Am I doing this wrong?
 

Two things you can try: check if the FCLK is running at 1:1; if it is, try dropping it to 1800MHz and see if that helps (it shows up as the northbridge clock in CPU-Z). You can also try throwing more voltage at the memory; 1.4V would likely do it and not harm the memory. But just a reminder, the max clock on the QVL is 3600MHz, so it's possible the board's PCB or trace quality might not be good enough for 3733.
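For reference on that 1:1 suggestion: the relationship between the DDR rating, the actual memory clock, and the matching FCLK is just a halving. A quick sketch (the rates listed are illustrative examples, not this board's QVL):

```python
# DDR4 is double data rate: the rated MT/s is twice the memory clock.
# On Zen 2, running the Infinity Fabric (FCLK) 1:1 with the memory
# clock means FCLK must also reach half the rated speed.
for rated in (2800, 3200, 3600, 3733):
    memclk = rated / 2
    print(f"DDR4-{rated}: memory clock {memclk:g}MHz -> 1:1 FCLK {memclk:g}MHz")
```

So DDR4-3733 implies an FCLK of about 1866MHz, right at the edge of what many Zen 2 parts manage, which is why the suggestion above to drop FCLK to 1800MHz can help a kit the board can't quite run 1:1.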
 

EDIT: I read you picked up a 3600X, so your RAM could work fine.
You need to tune your timings, at least in a manner that gets reliable results.
Download DRAM Calculator for Ryzen and Thaiphoon Burner (with the memory benchmark built in):
https://www.guru3d.com/files-details/download-ryzen-dram-calculator.html

In Thaiphoon Burner, export your XMP profile as a complete HTML report, and scroll down to save the delays in nanoseconds, to your desktop or somewhere you will remember.
Then import the complete HTML report into DRAM Calculator for Ryzen.

Then, in DRAM Calculator for Ryzen, pick the settings you want and try them; they are meant as a guide, and if you have a couple of errors, give more voltage.
If games crash/stop/hitch/stutter, then your RAM timings are incorrect and you need to set them properly. This is why DRAM Calculator for Ryzen is a perfect guide for anyone out there.
 
Excellent review on 240Hz - reaction times are better at 240Hz, by better than 12ms; every test subject was more likely to make the kill, had better aim, and of course smoother gameplay. Live gaming, filmed in 4K at 1,000 frames per second in a controlled game setting.

Does 240Hz Matter for Gaming ft. Gav from Slow Mo Guys

 

Of all the inane tests Linus has done... this one might be worthy of first place. 240Hz vs 60Hz.... hmmm, yeah, pointless and stupid test.

A proper test would have been 144Hz vs 240Hz. The problem with that is that Linus and his sponsors are both well aware that any gains they could show would be margin of error at best. Of course 240Hz is insanely better than 60Hz... but 144Hz is also insanely better than 60Hz. Hell, even 75Hz could probably be scientifically shown to be superior to 60.

Over 144Hz, diminishing returns mean the gain in milliseconds per frame is so low that only a small fraction of super-high-level twitch gamers would be able to scientifically show even a small advantage. Over 180Hz or so you also start bumping up against the actual pixel gate timings. No matter what, even the best panels still need 1ms to switch a pixel.
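The diminishing returns are easy to see if you convert refresh rates to frame times. A quick sketch:

```python
# Frame period in ms at each refresh rate, plus the time saved per frame
# versus the previous step. The 60->144 jump dwarfs the 144->240 jump.
rates = [60, 75, 144, 240]
prev = None
for hz in rates:
    period = 1000 / hz  # ms per frame
    note = f" ({1000 / prev - period:.1f}ms saved vs {prev}Hz)" if prev else ""
    print(f"{hz}Hz -> {period:.2f}ms per frame{note}")
    prev = hz
```

Going from 60Hz to 144Hz buys back almost 10ms per frame; going from 144Hz to 240Hz buys back under 3ms, which is exactly the diminishing-returns point above.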
 