144 is the new 60?

some people are more sensitive to such differences than others. i remember vividly how i connected my first 120hz display, moved the mouse on the windows desktop, and the difference was so obvious to me that i said "wow" aloud, although i was alone. pair that with ulmb, as i found motion blur nauseating in shooters, and you have another world. at least, that was my experience.

a friend still plays on 60 hz, because he prefers the more vivid colors of ips and 4k, and the funny thing is, he still usually beats me, simply because he has far more time for gaming than i do and experience still matters more than frames. (as does talent; he has much better dexterity.) and he has more fun with a beautiful image, so he plays more often.

meanwhile, my next upgrade will be a 240 hz monitor, i have already tried one and it felt much smoother than my current 120 hz screen. to me, smoothness matters more than looks. perhaps i'll play more if it's more fun, and get better ;)
You called it.
I'm one of those ips 60 guys haha
 
Good marketing with just a touch of truth. High refresh is better without a doubt, but skill is still king.
 
I think ping and monitor input lag are the main factors in upping your game. Get a CRT (or that infamous Sony alpha ips), your own server, and Google Fiber, if that still exists, and you will be a god player.
 
???, just saying it is better, which I got from the read, does not make it better, nor does it quantify how much. Over and over again, in tests done by various people (all fairly poor for one reason or another), people keep getting it wrong more than right. Here is an oldie but goodie where a person playing 60hz compared to 120hz got it wrong 5 out of 6 times:



Someone more familiar with what to look for can get it right; this time 5 for 5 right. In other tests I've seen, for the most part people guess wrong. I'd like to see if folks, even with experience, can tell between 100hz, 120hz, 144hz and 240hz if the game can maintain frame rates that high with a FreeSync or G-Sync monitor using VRR. Throw in some 144hz non-VRR against 60hz VRR and 100hz VRR as well.

 
60hz always has had a flicker that my eye catches, but at 75hz I don't see it, so I dunno if 144hz would be better or not. Diminishing returns have to hit at some point?
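The diminishing-returns intuition checks out arithmetically: what a refresh-rate bump actually buys you is a smaller frame time, and each step up saves fewer milliseconds than the last. A quick sketch (plain frame-time math, no assumptions about any particular panel):

```python
# Frame time in milliseconds at each refresh rate, and the time saved
# by stepping up from the previous rate in the list.
rates = [60, 75, 120, 144, 240]

for prev, cur in zip(rates, rates[1:]):
    t_prev, t_cur = 1000 / prev, 1000 / cur
    print(f"{prev}Hz -> {cur}Hz: {t_prev:.1f}ms -> {t_cur:.1f}ms "
          f"(saves {t_prev - t_cur:.1f}ms per frame)")
```

Going 60Hz to 120Hz cuts frame time by about 8.3ms; 144Hz to 240Hz saves only about 2.8ms, which is why each upgrade feels less dramatic than the one before it.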
 
???, just saying it is better, which I got from the read, does not make it better, nor does it quantify how much....

That thread has multiple Masters-rank players (top 3% of the playerbase), multiple Grandmasters (top 1%), and a Top 500 player posting in it, all in support of fast refresh as a substantial improvement to their game. It's worth listening to their experience as some of the best players in the game. Professional Overwatch is also played on 240Hz monitors as standard.

When your aim is in the 99th percentile and you are fighting other people with 99th percentile aim, single frames do matter.

In short, yes, Nvidia wants to sell you hardware. At length, they have a valid selling point; it's not just snake oil. If better hardware weren't helpful, nobody would build "gaming" rigs. Every argument in this thread could be made against high-DPI mice, gaming keyboard switches, and good headphones too.
 
That thread has multiple Masters-rank players (top 3% of the playerbase), multiple Grandmasters (top 1%), and a Top 500 player posting in it, all in support of fast refresh as a substantial improvement to their game. It's worth listening to their experience as some of the best players in the game. Professional Overwatch is also played on 240Hz monitors as standard.

When your aim is in the 99th percentile and you are fighting other people with 99th percentile aim, single frames do matter.

In short, yes, Nvidia wants to sell you hardware. At length, they have a valid selling point; it's not just snake oil. If better hardware weren't helpful, nobody would build "gaming" rigs. Every argument in this thread could be made against high-DPI mice, gaming keyboard switches, and good headphones too.

A lot of them are paid to promote that stuff, and in lots of cases it's given to them for free. It's like that article I read about buying a couple-hundred-dollar set of golf clubs vs. a couple-thousand-dollar one. You'll never guess what the guy found out when he went out on the course and played with the $3k set.
 
I kinda doubt Nvidia paid off some random masters and grand-masters players to astro-turf :)

Paid endorsements from professional players are generally disclosed as well.
 
I kinda doubt Nvidia paid off some random masters and grand-masters players to astro-turf :)

Paid endorsements from professional players are generally disclosed as well.

If you're at their level it probably does; too bad the vast majority are not and won't see any benefit to their game. I said earlier in the thread that I read a review on Amazon where a guy thinks buying a 240hz monitor (he had a 144) got him from Gold to Plat (OW). That is so laughable to me. Just like I would never tell anyone their K:D will increase if they get a 1080 Ti over a 1070.
 
That thread has multiple Masters-rank players (top 3% of the playerbase), multiple Grandmasters (top 1%), and a Top 500 player posting in it, all in support of fast refresh as a substantial improvement to their game. It's worth listening to their experience as some of the best players in the game. Professional Overwatch is also played on 240Hz monitors as standard.

When your aim is in the 99th percentile and you are fighting other people with 99th percentile aim, single frames do matter.

In short, yes, Nvidia wants to sell you hardware. At length, they have a valid selling point; it's not just snake oil. If better hardware weren't helpful, nobody would build "gaming" rigs. Every argument in this thread could be made against high-DPI mice, gaming keyboard switches, and good headphones too.
Hmm, did not notice any of them; not in their league. Does not matter. Where are the tests, the solid proof, etc.?

The question would be more like: will I even see a gaming experience improvement when I am already over 60FPS with FreeSync in most games? Is paying $1200 for nothing worth it?

Personally, I am playing FC5 a second time around on hard, on a single Vega FE with a 144hz FreeSync monitor and frame rates in the 70s. Buying a Vega 7 I would be in the 90s; in the end it would make no significant gaming experience improvement.
 
I am fine with 60fps.
60fps got a bad rep because of double-buffered vsync, while in fact triple-buffered vsync and/or an RTSS 60fps cap lowers input lag A LOT.
I am all for graphics and image quality, and the fact is, 60hz monitors still offer a bit better image quality and uniformity, so going 4k60hz is still a viable option.
But neither 144hz 1440p nor 4k60hz is easy to achieve.
So while it's an unpopular take, I prefer great graphics at 60 or even 30 (console) fps. After all, 10 years later I will remember amazing God of War graphics, not 144hz Counter-Strike.
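The input-lag point is easy to put rough numbers on. With uncapped double-buffered vsync, the GPU can finish a frame early and leave it queued ahead of scanout, so your input can wait up to an extra refresh before reaching the screen; a frame cap at the refresh rate keeps that queue empty. A back-of-the-envelope sketch (illustrative arithmetic, not measurements):

```python
# Rough worst-case added display latency from frames queued ahead of
# scanout at 60Hz. Illustrative arithmetic only, not measured data.
frame_ms = 1000 / 60  # ~16.7ms per refresh


def queued_lag_ms(frames_queued: int) -> float:
    """Extra wait before a rendered frame reaches the screen."""
    return frames_queued * frame_ms


# Uncapped double-buffered vsync can hold one finished frame in the queue:
uncapped = queued_lag_ms(1)  # up to ~16.7ms of added lag
# A 60fps cap (e.g. via RTSS) paces rendering so the queue stays empty:
capped = queued_lag_ms(0)    # ~0ms added
```

The real pipeline has more stages than this, but it shows why a cap just at (or slightly under) the refresh rate feels so much more responsive than uncapped vsync at the same 60fps.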

I remember playing the first Unreal and Half-Life when those came out, and I was in awe of the amazing graphics and technology. Now that I think of it, those games couldn't run faster than 40fps on a Voodoo card. If a game had looked like shit but run at 120hz on the 120hz CRT I had, I would probably not remember it.

Same goes for 240hz. I bought an AOC AG251FG last year after hearing so much about G-Sync and the high-refresh-rate craze, so coming from 60hz to 240hz was quite a huge difference.
But after a few days/weeks I got used to 240hz and it felt just like the month before when I was playing at 60hz... but the image quality of that monitor was bad, and while I stopped noticing 240hz quickly, I couldn't get used to the worse image quality and lower game settings (not in every game).
 
I believe it is, as well as 1440p being the new 1080p. It's time to upgrade the monitors and GPUs.
 
So, I ended up purchasing a 240Hz monitor. It's nice, I guess, but a mixed bag.

I do feel like it looks smoother, coming from 166Hz on my main rig, but not what I expected an extra 74Hz would look like.

Understandably, there are diminishing returns, and it does feel pretty smooth honestly, but I would say even 144Hz is probably plenty.

So, it's not really a complaint about the refresh rate, but more about the trade-offs you have to make (low resolution, panel tech, etc.), which I feel make it a poor choice.

Especially as a main monitor, I think 1440p 144Hz is probably still a pretty safe choice, with many affordable options. Or going ultrawide, or 4K even, rather than 240Hz.

Still only tested for about 1 hour, so I will mess with it more, but my initial impression is that unless you are an Overwatch pro player or something, you don't need 240Hz.
 
Does it feel smoother even though you're not hitting 240fps, compared to your 166hz monitor?
 
Well, I tested two old games: Left4Dead (hitting 240fps easily) and Mirror's Edge, which was around 200 - 240fps depending.

Yes, it was definitely smoother, but not by a drastic amount. G-Sync helps too, keeping things feeling good under 240fps.

I was hoping for a night and day difference, while in practice it was just a moderate improvement.

For me, I play single player games mostly, I think 166Hz is great for what I am doing. While I could tell a difference going to 240Hz, it didn't feel needed to me.

If you are a competitive online shooter player, maybe that extra bump is an advantage. I don't really know, but I could believe that.

Or if you are on a 60Hz screen, obviously it would be a huge improvement. But the trade-offs are large (limited to 1080p, TN panel, screen size, etc.), in which case you will have to decide what is important.
 
Higher framerates are going to help better gamers more. Take the Overwatch example someone posted earlier, stating they were SR 2500 (Platinum rank). 90% of the player population is in Platinum or under (ranks low to high: Bronze, Silver, Gold, Platinum, Diamond, Master, Grandmaster), 10% are Diamond and up, and only 3% make it higher than Masters. Bronze players can't hit the broad side of a barn with an aimbot, but if you switch a Masters+ player from 60FPS to 144FPS, they definitely have the ability to take advantage of the difference.

I played Quake3 competitively back in the day, and I was pushing my CRT to 100Hz because it was THAT MUCH BETTER than 60Hz; I was at the top end of the skill curve back when I was a teenager and had all the time in the world to practice.

I went from a 60Hz LCD to a 165Hz one, and it made tracking enemies noticeably easier.

This 100%.

60hz -> 144hz is a MAJOR jump in the competitive gaming scene.
144hz -> 240hz not so much, but it's still a good jump, especially considering most 240hz panels are focused on hardcore gamer needs like low input lag and anti-ghosting measures.

The problem for me with 240hz is that it requires sustaining 240fps in order to display properly. I simply can't achieve that at 1440p on my current budget, so I'm enjoying 144hz, which I think is more than enough to get you where you need to be without making excuses about hardware whenever you get your ass handed to you in-game.
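That upkeep requirement is just the frame-time budget inverted: to saturate a given refresh rate, your CPU and GPU together have to finish every frame inside its slot. A small sketch of the budgets involved:

```python
# Per-frame budget (ms) the whole pipeline must hit to sustain each rate.
for hz in (60, 144, 240):
    print(f"{hz}Hz: every frame must be done in {1000 / hz:.2f}ms")
```

At 240Hz the budget is about 4.2ms per frame, well under the ~6.9ms you get at 144Hz, which is why holding 240fps at 1440p is so much harder than holding 144fps.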

It's absolutely crucial to have 144hz+ when it comes to FPS titles if you're looking to play competitively. I'm talking run the game on low, turn the picmip to 9000 and enjoy sacrificing every bit of graphics quality so you can have the smoothest gaming experience with the highest visibility.

To each their own :)
 
???, just saying it is better, which I got from the read, does not make it better, nor does it quantify how much. Over and over again, in tests done by various people (all fairly poor for one reason or another), people keep getting it wrong more than right. Here is an oldie but goodie where a person playing 60hz compared to 120hz got it wrong 5 out of 6 times:



Someone more familiar with what to look for can get it right; this time 5 for 5 right. In other tests I've seen, for the most part people guess wrong. I'd like to see if folks, even with experience, can tell between 100hz, 120hz, 144hz and 240hz if the game can maintain frame rates that high with a FreeSync or G-Sync monitor using VRR. Throw in some 144hz non-VRR against 60hz VRR and 100hz VRR as well.



I think it comes down to how familiar you are with a game or a particular monitor. Moving to a new setup or game, you likely won't be able to tell the difference. But if you play the same game for a long time and move to a similar monitor with a higher refresh rate, you'll see the difference. I noticed it in BF4: legs look a bit smoother when people are running, and I find it a bit easier to pick out someone moving through foliage. But I have around 1000+ hours in BF4, probably 700 or so on 60Hz.

I assume if you sat me down in front of a game I hardly played, with a different size or aspect ratio, and asked me to tell them apart after playing for 15-20 minutes, it might be harder. But when you really know a game, it can make a bigger difference.
 
Well, I am a weirdo, I guess. For the last 10+ years I was a 144Hz/165Hz guy; then I sold my G-Sync 165Hz TN panel, went with a 4K 60Hz IPS FreeSync panel, and I am perfectly enjoying the 60Hz.
I really do not find a difference, other than that FreeSync for me is more fluid/smooth than G-Sync ever was.
I need to add this for the Nvidia crazy people: the FreeSync is running on an RTX 2080 card.
 
Well, I am a weirdo, I guess. For the last 10+ years I was a 144Hz/165Hz guy; then I sold my G-Sync 165Hz TN panel, went with a 4K 60Hz IPS FreeSync panel, and I am perfectly enjoying the 60Hz.
I really do not find a difference, other than that FreeSync for me is more fluid/smooth than G-Sync ever was.
I need to add this for the Nvidia crazy people: the FreeSync is running on an RTX 2080 card.
I would agree with this as well. Playing Shadow of the Tomb Raider on SLI 1080 Ti's with DSR, HDR, etc. was fantastic, around 100fps. This was before FreeSync support. Smoothness-wise, one Radeon FE with FreeSync at a much lower frame rate and settings is smoother. Between the two, fast refresh rates versus FreeSync/VRR, I'd say FreeSync gives more of a smoothness factor than plain hz. Of course, overall the gaming experience on the 1080 Ti's was better because the IQ was much higher. I will be doing some shuffling around, putting the main system in a new case with a 1600W PSU and the water cooling inside rather than separate. Then I'll get a chance to stick the Samsung monitor back on the Nvidia system.
 
A lot of them are paid to promote that stuff, and in lots of cases it's given to them for free. It's like that article I read about buying a couple-hundred-dollar set of golf clubs vs. a couple-thousand-dollar one. You'll never guess what the guy found out when he went out on the course and played with the $3k set.

This is funny, for me at least - I justified my 1080 Ti purchase a while back based on how much my spouse paid for golf clubs. Their argument was "which will last longer"; mine was "which matters more".

I can play at Masters level. Those clubs are whacking away at the golf equivalent of Bronze/Silver. ;)
 
This is funny, for me at least - I justified my 1080 Ti purchase a while back based on how much my spouse paid for golf clubs. Their argument was "which will last longer"; mine was "which matters more".

I can play at Masters level. Those clubs are whacking away at the golf equivalent of Bronze/Silver. ;)

lol. Well, as long as he doesn't save any scorecards, there won't be any doubt!
 