Why 120hz

Ultranifty

Limp Gawd
Joined
Aug 3, 2012
Messages
167
I've used it all: 120hz at 1920x1080, and 60hz at 2560x1600/1440.

Currently I'm back to 60hz 2560x1440 on a .232 dot pitch PB278Q. I do like the immersion of triple screen, but it's intensive on the vid cards and it's low rez (1080p, usually on .31 dot pitch monitors).

Thinking about getting the new 34" 34UM95 LG monitor for that wide-view immersion and high rez. Ask anyone what the best gaming monitor is and usually they'll name some 120hz model. And here's my question:

I've used 120hz and played BF3 on it for a while. Yes, it was smoother; not that 60hz (with adaptive vsync) was bad, but 120hz was just butter smooth. BUT it was also only smooth in BF3; basically just first-person shooters.

So do gamers only play shooters? I'll go over the games I play/played and how 120hz did:

BF3 or 4 - very smooth
New Xcom - no difference but low 1920x1080 rez and bad TN colors
Borderlands 2 - hardly any smoothness change, and again, low rez and TN colors
D3 - Already smooth, 120hz made no change, low rez and TN colors
Sins of a Solar Empire - no change from 120hz, low rez, and TN colors
Company of Heroes - no change for 120hz, low rez, and TN colors
WoW - no change for 120hz, was already smooth, low rez and TN colors
yada yada yada, I'm sure I'm forgetting some games

This was on SLI'd GTX 580 Lightning Extreme 3GB cards, more than enough to push 100-120fps in these games at 1920x1080, so it's not a vid card issue. Yes, I turned off vsync while on the 120hz screen.



MY POINT IS: are you guys really getting 120hz screens just for shooters? Just for that one game? Sure, moving things around on the desktop is smooth, but big deal.

I guess I just don't get why more people don't share my view. 120hz isn't the end-all be-all of gaming. I feel you get more fun out of triple screen or a 21:9. Run with vsync on, and 60hz isn't a big enough drawback to justify dealing with a TN panel and low 1920x1080 rez.

Disagree? I'm not here to flame 120hz; for shooters, no question, it's smooth. But anything else?
 
It's "nice" in slower games like Wargame or War Thunder tanks to have better motion performance. However, going from a regular 60Hz LCD to my CRT (which is much better than your 120Hz TN) in those games, the motion performance isn't worth what I lose.
I'd rather have the sharpness and size of the LCD.

It's similar comparing 60Hz LCD to 120Hz LCD. *Typically,* you lose quality when you go for 120-144Hz, since they're TN panels, smaller, etc. But there is the overclockable Korean 27" ;)


So, yes, for me it's only a big deal in shooters and games with fast movement, e.g. War Thunder planes.



Also, see here:
http://pcmonitors.info/articles/factors-affecting-pc-monitor-responsiveness
 
I would have to say that yes, FPS are the primary games where more Hz/fps makes a real difference in playability. A lot of other games I can stand even at 30Hz, but shooters for me have to be high-Hz.
 
Yes, there is much more benefit in fast paced games, but 120 Hz screens often bring an improvement in panel speed, which benefits all games with motion, especially full screen motion, no matter how fast or which genre they are. TN is terrible, though, so it's hardly an easy decision. I am lucky to have a big widescreen CRT, so I don't have to compromise between image quality and speed/smoothness/motion sharpness. LCDs are just a shitty display technology and the sooner we get rid of it the better. It's 2014 and we are still one step forward, four steps backward compared to previous tech.
 
My observation :

DayZ -> can't even hit 60 FPS lolz (and even then there's no big change, because you walk 90% of the time)
WarThunder -> Huge change
 
Imho true, the need for 120Hz is a bit overblown; sometimes users just think they need >60Hz because they did with CRTs. The perceived difference is much bigger going the other way, though: 60Hz > 30Hz IS noticeable by almost anyone.
 
Maximum PC ran an article last year on higher refresh rates; most of the test subjects couldn't even tell the difference...

I like TN over IPS because I can run lower brightness on a TN and not worry about overwhelming black levels in dark games.
ASUS TN is good.
BenQ TN sucks.
I want to try an Eizo Foris once they get the bugs out.
 
Ultimate monitor for me will be a 3440 x 1440 rez 34" (21:9 ratio) OLED monitor ;)

No backlight, no refresh rate limit, and great colors. That's a few years off, though.
 
I notice the difference in WoW. It's night and day.

Try turning the camera around (right click + drag) on 60hz and then 120hz.

But if YOU don't notice a large difference except for shooters, and it's not such a big deal to YOU, get your 34" LG.
 
You'll never know how amazing it is unless you try it. Anything below 90fps looks chunky to me now :-/
 
When I had a CRT in 2003 I used 100hz. Then I switched to a 24" 60hz LCD in 2005. In 2009 I switched to a 120hz LCD and it is better in almost every single way. If you truly desire the best graphical fidelity go get an OLED for 5 thousand.
 
When I had a CRT, given a choice between higher resolution or higher refresh, I went for resolution, with one limit: the highest resolution had to run at least 75Hz, as below that I felt eye strain in prolonged use. When I switched to LCDs many years ago I first worried about their "just" 60Hz, but soon both my own experience and RTFM-ing about the technology differences proved me wrong. 120Hz MAY be slightly better, but it often comes at the expense of lower specs in other areas: lower res per display size, worse colors, a much higher price, or the use of multiple inputs due to the limited bandwidth of data transfer standards. I'm not willing to go a few steps back in areas I care about for a slight improvement in just one. I'm also guessing that those to whom that one area is THAT important make up a niche of no more than 1/10, possibly even less (how many of the users you know are near-professional gamers who play not just for fun but for competitive results and spend most of their free time on that?).
 
It is a night and day difference for ANY game I play.

You're most likely doing it wrong and not actually even getting 120 fps.
For example Company of Heroes has a 60 fps cap unless you add "-refresh 120" to the launch options.
 
I guess I just don't get why more don't share my view. 120hz isn't the end all be all of gaming. I feel that you get more fun out of triple screen, or a 21:9. Run with vsync on and 60hz isn't a big enough drawback to deal with a TN panel and low 1920x1080 rez.

I thought 120Hz was a huge difference in smoothness compared to 60Hz.

Unfortunately the colors looked awful and I couldn't stand how terrible BF4 looked on either monitor. This is coming from a ZR30w. I tried 2 different Asus models (24" and 27") and sent them both back. My buddy has a "calibrated" VG248QE and it looks terrible in my opinion.

The 34UM95 looks awesome and I'm waiting for mine to ship from Amazon. Since I'm happy gaming on my ZR30w I'm pretty sure I'll enjoy the 34UM95.

Another problem for me with the current selection of "gaming" monitors is they are all too small. I've been gaming on a 30" monitor for years and I'm not interested in getting a smaller monitor. I'm also not interested in multi-monitor setups no matter how small the bezels are. If the ROG Swift were bigger it would be much more tempting.
 
I thought 120Hz was a huge difference in smoothness compared to 60Hz.

Unfortunately the colors looked awful and I couldn't stand how terrible BF4 looked on either monitor. This is coming from a ZR30w. I tried 2 different Asus models (24" and 27") and sent them both back. My buddy has a "calibrated" VG248QE and it looks terrible in my opinion.

The 34UM95 looks awesome and I'm waiting for mine to ship from Amazon. Since I'm happy gaming on my ZR30w I'm pretty sure I'll enjoy the 34UM95.

Another problem for me with the current selection of "gaming" monitors is they are all too small. I've been gaming on a 30" monitor for years and I'm not interested in getting a smaller monitor. I'm also not interested in multi-monitor setups no matter how small the bezels are. If the ROG Swift were bigger it would be much more tempting.
Even my crappy Vizio TV monitor looks better than that ASUS VG248QE. I was surprised how much I preferred accurate colors. But the VG248QE definitely helped me out in most FPS.
 
You'll never know how amazing it is unless you try it. Anything below 90fps looks chunky to me now :-/

I didn't believe in 120Hz until a local [H] member showed me the difference a few weeks ago.
It's a big difference in FPS games, or any other game. Even browsing the web is smoother.

I wouldn't go back to a standard 60Hz monitor after getting my Qnix.
1440p 120Hz IPS Glossy goodness.
 
For me it's less about the increased refresh rate than about being able to use lightboost/ULMB. The reduction in motion blur is staggering. Having seen both, running at 120hz is a nice side benefit of having a panel that supports those techs.

I also have a secondary monitor at 60 Hz that has much better color if I care about that. It's a nice compromise in my opinion.
 
Maximum PC ran an article last year on higher refresh rates; most of the test subjects couldn't even tell the difference...

http://www.maximumpc.com/refresh_rate_2013

That's not actually what they found, at all. Funny that you remember the opposite of the conclusion they actually reached...

Most people could tell the difference, but _preferences_ varied. Some people prefer more motion blur for some types of content or particular games, some prefer less. You'll notice that almost every single person could tell there was a difference, and most picked 144hz as better for Portal 2, while not all did for L4D2.

Their test proves there is absolutely a significant perceptual difference that almost everyone can perceive... but whether they believe it's "better" is content-dependent. Also, a 24fps video was part of the test, but people are more used to the way 24p gets butchered at 60hz than to what it looks like when displayed correctly, so they're naturally more likely to pick that.
 
Gsync and high hz are greatly advantageous for gaming experience in 1st/3rd person games.

High hz does two things. It cuts movement blur (by 1/2 comparing 120hz vs 60hz, 60% at 144hz). At higher fps it also increases motion definition, animation definition, and overall motion flow/control flow and motion tracking. (More "slices" of unique newer action, animation, and control path states shown per second).
www.web-cyb.org 120hz-fps-compared


At 60hz, the amount of FoV movement blur during continual FoV movement in 1st/3rd person games (movement keying, mouse-look pathing, etc.) means the image isn't even resolvable as a solid grid to your eyes. All of the texture detail, the depth from bump-mapping, and some of the high-detail geometry is lost in a smear. You don't play a still screenshot.

You still get a 5:3 increase in movement definition, animation definition, and overall smoothness/flow at 100fps on anything over 100hz vs a 60fps-hz max.
This is not just something for those who seek a gameplay and motion-tracking advantage; it aesthetically looks much better and feels better in the controls.
I'm sure most people have seen the mouse-pointer tracking examples of 60hz vs 120hz. That applies to the entire viewport and control pathing in a game.
This is just a graphic trying to simulate it
(5 frames of world-state action vs 3 ~> 100fps at 100hz+ vs a 60hz-fps max ceiling; 120fps-hz would be 6:3)
120hz-vs-60hz-gaming.jpg
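For what it's worth, the arithmetic behind those ratios is easy to check. Here is a minimal sketch (my own toy helpers, not anything from the graphic), assuming a plain sample-and-hold LCD where each frame is held on screen for the full refresh interval, which is what drives eye-tracking motion blur:

```python
# Toy helpers for the hold-time blur and frame-ratio arithmetic above.
# Assumption: sample-and-hold LCD without strobing, so one frame
# persists for one full refresh interval.

def persistence_ms(refresh_hz):
    """How long one frame is held on screen, in milliseconds."""
    return 1000.0 / refresh_hz

def blur_reduction(base_hz, high_hz):
    """Fractional reduction in hold-time blur going from base_hz to high_hz."""
    return 1.0 - persistence_ms(high_hz) / persistence_ms(base_hz)

def frame_ratio(fps_a, fps_b):
    """Ratio of unique world-state 'slices' shown per second."""
    return fps_a / fps_b

print(round(persistence_ms(60), 1))       # 16.7 ms held per frame at 60hz
print(blur_reduction(60, 120))            # 0.5 -> the "1/2 the blur" figure
print(round(blur_reduction(60, 144), 2))  # 0.58 -> roughly the "60%" figure
print(round(frame_ratio(100, 60), 2))     # 1.67 -> the 5:3 ratio
```

Strobing backlights (lightboost/ULMB, mentioned elsewhere in the thread) sidestep this by flashing the frame for less than the refresh interval, which is why they cut blur further than raw refresh rate alone.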


If you get a g-sync monitor, you get dynamic hz matching whatever your fps is at all times. This eliminates the need for v-sync (which causes input lag) while avoiding the judder and tearing of an fps roller coaster unmatched to your hz, caused by varying gpu demands across scenes (complexity, number of characters), action, and viewpoints (distances, etc.).

G-sync monitors (and monitors like the eizo fg2421, with its own VA tech) also have a ulmb mode you can use instead of the g-sync mode on less demanding games (or if you have an extreme gpu budget): games where you average 120fps. This is a backlight-strobing mode that results in zero FoV movement blur, much like a crt's pristine motion.

IPS monitors have their advantages but lack all of that advanced gaming tech. 60hz monitors have double the FoV movement blur of 120hz (tn vs tn), made worse by ips response times; they lack the upcoming g-sync tech and backlight-strobing options; and they have a 60fps ceiling on viewable motion slices due to their refresh rate. Even the overclockable Korean 120hz ones have slow response times, so they blur a lot more than a designed-for-gaming 120hz-144hz monitor, and they obviously lack g-sync and strobing options.
..
In regard to higher resolutions: they are great, but they are also a lot more demanding. At the same aspect ratio you get the exact same virtual camera perspective, the exact same scene and scene elements at the same sizes, just with more pixel density/resolution. In almost all 1st/3rd person games, which use HOR+, you gain no FoV increase from a higher resolution (at the same aspect ratio). It is still a great improvement, but it has to be balanced against the tradeoffs and your gpu budget.
Even 2560x1440 is quite demanding to average 100fps or more. I would use dual 780's to get that, but I am waiting until 20nm to do sli.
GTX 780Ti Benchmarks 1x-4x SLI (Work in Progress)

The above reasons (and the tradeoffs between monitor types in general, going way back) are why I have always used two monitors, one dedicated to gaming. Personally I can't see gaming in a 1st/3rd person game on a 60hz monitor, or, going forward, without g-sync and backlight-strobing options. I'm definitely watching the asus PG278Q closely. For now, my samsung A750D 27" 1080p 120hz monitor has notably lush and vibrant color; not all TNs are as pale as some of the recent asus offerings. I have a 27" 2560x1440 60hz ips for desktop/apps next to it.
...

For much greater immersion I'm hoping for the oculus rift and competing VR rigs in the next few years, at higher resolutions than 1080p in later models. For their first consumer release the oculus rift devs are shooting for 90hz-100hz, 1080p, and some kind of low-persistence or strobing tech to eliminate FoV blur (which they consider essential for immersive perspectives/VR).


High-rez screenshots (4k at low hz and low fps, getting pushed by review sites now) might look pretty as stills, but "you don't play a still screenshot"... and you can't show people on 60hz what 120hz at high fps looks like at full speed, nor what 50%+ blur reduction or complete blur elimination looks like in motion.

As for fps numeric comparisons: it's not just lost frames as numbers, it's lost frames as lost motion. Lower fps (and fps+hz ceilings like 60hz) turn motion states into "freeze-frames" held through two or more action and animation state updates, rather than the more defined motion tracking and animation "resolution" of higher fps and hz ceilings. Moving a mouse cursor from my 120hz screen to my 60hz screen is an obvious loss of motion tracking and smoothness; in a game, a whole viewport full of motion tracking (players, creatures, FoV movement of the entire viewport itself, and all animations) gets stuck in freeze-frames of 16.6ms (60fps/hz) or longer (below 60fps, e.g. 30fps is "freeze-framed" through 4 newer, unique action/world-state slices). Then add the baseline FoV movement blur of 60hz, at a 5ms-or-worse ips response time.


This is a graphic I made comparing different resolutions and aspect ratios apples to apples.
web-cyb.org 4k_21x9_27in_30in_same-ppi.jpg

I'll leave this here too since we are talking gaming
web-cyb.org HOR-plus_scenes-compared_1-sm.jpg
 
At higher fps it also increases motion definition, animation definition, and overall motion flow/control flow and motion tracking. (More "slices" of unique newer action, animation, and control path states shown per second).

While that's true, unfortunately the vast majority of games are locked internally to 60 or even 30hz at the game-engine level. So while PC releases sometimes uncap rendering framerates, the engine itself is still processing input and calculating game state at the much lower rate. As long as consoles are the primary way most people experience games, this isn't likely to change anytime soon, as 60hz is already a struggle for consoles, let alone anything higher. Getting games to a universal 60hz would actually be a VICTORY.

That doesn't take away the purely visual motion blur benefits, of course, but it definitely hurts responsiveness.
 
Not all games, no. Some of the half-assed console ports are.
The increase in motion and animation definition is a large aesthetic advantage, not just a tracking/aiming "enhancement". If the game shows the increased animation, travel/tracking, and control-pathing states (of the entire viewport), it will look a lot more appealing to your eyes, especially when combined with a large blur reduction or outright blur elimination... even if some games' shitty netcode won't let higher fps+hz provide a real player-performance advantage against the game engine or other players.

Not all games support triple monitor or wide aspect either (or not without mods/"hacks"), which is what the OP was comparing against.
Thinking about getting the new 34" 34UM95 LG monitor for that wide view immersion, high rez. You ask anyone what the best gaming monitor is and usually they say some 120hz model.
..
I feel that you get more fun out of triple screen, or a 21:9.

Consoles themselves are trash and shouldn't even come up in a conversation about 120hz pc monitor gaming benefits imo :b (at least as far as using one on a pc monitor; obviously game ports come up as a pc issue). Supporting them by buying them and their games isn't going to help the ported-games issue either. Consoles are happy-meals-for-the-masses profit-margin machines, similar to why we are still stuck on lcd display tech: inferior yet cheap and digestible, and of course more profitable to the vested interests.

Regarding the 21:9 aspect monitors: if they made a 120hz g-sync one, it could be very appealing for 1st/3rd person gaming (to me). By the time they make such a 120hz+ 21:9 (if ever), we'd probably have a consumer-release oculus rift at 90hz-100hz with blur elimination to mess with, though.
 
Well, probably >95% of the video games you can play on PC are "shitty console ports" by that definition. Maybe even >99%. There are few counterexamples.

A large majority of the counterexamples are older games as well; increasingly, games are going backwards in this area, not forwards. For example, see the controversy over BF4's 10hz tick rate and the fact that the game has something like 3x as much engine-based input lag as CSGO. Modern games aren't getting better in terms of responsiveness/input lag/smoothness; they are getting worse.

How many of the people arguing over 120hz and +/- 5ms input-lag differences do you think are using them to play BF4, where much of the benefit of these things is negated because the game only presents you with an actual new game state once every 3 frames at 30fps... never mind at 120fps (if you even have the insane rig required to keep BF4 over 100fps consistently)?
 
Agreed on BF4, for sure. I don't play that game personally, but I am aware of those failings. I like mmo/rpgs (1st/3rd person and/or tiled~isometric), some first-person co-op, and online team-vs stuff depending on the game (TF2 still rocks), plus adventure games like Witcher 1 and 2, Bioshock and Tomb Raider, Dark Souls, Darksiders, etc. I know Skyrim also had issues with high framerates (if left unaddressed), to be fair to your point. I'm looking forward to "Evolve" from the makers of L4D2, even though it is being released on consoles as well as pc. Will have to see how that one is on pc. It's supposed to be released in late October now.
Youtube - Evolve gameplay trailer
YouTube - Evolve dev commentary gameplay vid

120fps would be great, but a 100fps average is enough to get a considerable motion definition, animation definition, and control flow improvement vs 60hz-fps max monitors. That's 5:3 frames, or even more "frozen" frames vs the higher hz+fps player if you are running less than 60fps. Besides that, 120hz-144hz yields a 50% to 60% reduction in FoV motion blur of the entire viewport during your continual movement keying and mouse-looking, so that things blur "within the lines" or shadow-masks of onscreen objects, rather than smearing outside of everything the way 60hz does, even on a 60hz tn, let alone an ips. So those pretty high-rez screenshots turn into a resolution so blurred to your eyes that it isn't even resolvable as a solid grid on a 60hz monitor.
Those points (blur reduction/elimination, greater motion definition, animation definition, motion flow/tracking, and control articulation/flow) are huge tradeoffs, aesthetically, for a lot of us who have used a 120hz monitor.

As for a 100fps average in BF4, that was among the benchmarks on the evga forum page I linked about the 780ti in 1x/2x/3x/4x configurations (1 card vs 2-4 in sli):
http://forums.evga.com/m/tm.aspx?m=2069079&p=

At 1920x ultra single 780ti = 81 fps , 780ti 2x (sli) = 145 fps ave
At 2560x ultra single 780ti = 52 fps , 780ti 2x (sli) = 99 fps ave

And it wasn't the most demanding game on that list. That is with everything at the arbitrary max, though, with AA and TressFX, etc. Still, there is something to be said for minimal or zero blur during the continual movement keying and mouse-looking that 1st/3rd person games involve throughout. A screenshot looks pretty, but 1st/3rd person games are about motion, which is a smearfest on a 60hz monitor, especially 60hz full blur combined with a slower-response (5ms-7ms+) ips.
I have a 27" 2560x1440 60hz ips right next to a notably vibrant 27" 1080p 120hz TN, and I can't see myself ever going back to 60hz full-time for 1st/3rd person gaming on my main rig.
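Incidentally, the two BF4 data points above imply very good 2-way scaling. A quick sanity check (my own throwaway helper; the fps figures are the ones quoted from the evga thread, not my measurements):

```python
# Rough 2-way SLI scaling implied by the quoted BF4 benchmark numbers.
# Throwaway helper for the arithmetic, not a benchmarking tool.

def sli_scaling(single_fps, multi_fps, cards=2):
    """Return (speedup, per-card efficiency) for a multi-GPU result."""
    speedup = multi_fps / single_fps
    return speedup, speedup / cards

print(sli_scaling(81, 145))  # 1920x ultra: ~1.79x speedup, ~90% per card
print(sli_scaling(52, 99))   # 2560x ultra: ~1.90x speedup, ~95% per card
```

So on those numbers, the second card is doing close to its full share of the work at both resolutions.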
 
Let's get something straight about "console ports" going forward.

While this term did mean something in the past, in a derogatory sense, now and in the future it should be used very loosely or not at all. The Xbox One and the PS4 are nothing more than x86 PCs. These devices are basically identical in performance/hardware to the high-end PCs we all owned here on HardOCP 4 or 5 years ago.
 
Let's get something straight about "console ports" going forward.

While this term did mean something in the past, in a derogatory sense, now and in the future it should be used very loosely or not at all. The Xbox One and the PS4 are nothing more than x86 PCs. These devices are basically identical in performance/hardware to the high-end PCs we all owned here on HardOCP 4 or 5 years ago.

It's not just about similar hardware and development.

Ports are usually defined as games built around console interaction and lacking PC graphical features.
By your point, games should now be made for PC first and then scaled down to consoles, but that's not going to happen.
Devs will continue to scale up from console to PC, and those games are ports.

As a designer myself, it's always best practice to scale down than to scale up. Scaling up always results in reduced quality.
 
I thought 120Hz was a huge difference in smoothness compared to 60Hz.

Unfortunately the colors looked awful and I couldn't stand how terrible BF4 looked on either monitor. This is coming from a ZR30w. I tried 2 different Asus models (24" and 27") and sent them both back. My buddy has a "calibrated" VG248QE and it looks terrible in my opinion.

The 34UM95 looks awesome and I'm waiting for mine to ship from Amazon. Since I'm happy gaming on my ZR30w I'm pretty sure I'll enjoy the 34UM95.

Another problem for me with the current selection of "gaming" monitors is they are all too small. I've been gaming on a 30" monitor for years and I'm not interested in getting a smaller monitor. I'm also not interested in multi-monitor setups no matter how small the bezels are. If the ROG Swift were bigger it would be much more tempting.

I've owned that ZR30w, and it is hands down the best picture of any monitor, ever. It uses a lot of power and runs hot/warm, but its picture can't be beat. Just wanted to warn you since you're going to the 34UM95: the colors aren't going to be quite as good (nothing can be), but the screen is wider, yes. You're going to notice the missing 160 vertical pixels too. Each of those monitors serves a purpose, though. The ZR30w is "can't be beat" picture; the 34UM95 is for 21:9 immersion. You'll save power and heat ;)
 
It is a night and day difference for ANY game I play.

You're most likely doing it wrong and not actually even getting 120 fps.
For example Company of Heroes has a 60 fps cap unless you add "-refresh 120" to the launch options.

It might be slightly noticeable in games other than shooters, but I don't think it's worth dealing with TN and 1920x1080. I feel you get more excitement and fun out of a wide screen (whether triple screen or 21:9) than out of the extra smoothness of 120hz.

The perfect end-all monitor is some years in the future, when they make a 3440 x 1440 21:9 OLED. That'll have great colors, no backlight, and the only refresh limit will be how fast your vid cards can push.

I've played on all of them: 120hz, triple screen, and 60hz high-rez single screen. The slight smoothness wasn't worth what you're giving up. Get a 21:9 high-rez IPS, run 60hz (with vsync on), and you're not giving anything up; you're also gaining so much.
 
While this term actually did mean something in the past, in a derogatory sense, now and in the future the term should be used very loosely or not at all. The Xbone One and the PS4 are nothing more than x86 PC's. These devices are basically identical in performance / hardware to high-end PC's that we all owned here on HardOCP 4 or 5 years ago.

Nothing has changed, because console games are still locked to 30fps or 60fps, and consoles (especially the underpowered xbone) struggle to run games designed for those platforms at 60fps, let alone anything higher; thus games are designed around these framerates internally as well...

Whether or not the architecture is x86 has nothing to do with what performance levels are assumed.
 
I've owned that ZR30w, and it is hands down the best picture of any monitor, ever. It uses a lot of power and runs hot/warm, but its picture can't be beat. Just wanted to warn you since you're going to the 34UM95: the colors aren't going to be quite as good (nothing can be), but the screen is wider, yes. You're going to notice the missing 160 vertical pixels too. Each of those monitors serves a purpose, though. The ZR30w is "can't be beat" picture; the 34UM95 is for 21:9 immersion. You'll save power and heat ;)

I guess that makes sense. I've been using this monitor for years and in my opinion "gaming" TN panels look like complete dogshit by comparison.

Thanks for the heads up. I pre-ordered the 34UM95 through Amazon so I'll just send it back if it doesn't work out.
 
Praising the ZR30W and dissing the VG248QE is like praising a rotten apple core (grainy, wide-gamut 30" with gross green and red over-saturation, poor black levels, and very obvious glow) over a rotten pear core (the VG248QE's terrible lightboost colors, black levels, and grainy matte coating)... so much fail.

-----

Owned a glossy VG236H for a year, which looked better than all of the matte 120hz+ monitors, but sold it once I bought a glossy 1440p monitor; I preferred my first 1440p monitor (S27A850D) once I got used to the input lag (had to reduce the aiming sensitivity and mouse settings in many games to compensate).

Now I use a 96hz Qnix & X-Star PLS and can't really complain about blur; with v-sync disabled, screen tearing is so obvious even at high FPS (I own a 780 Lightning) that I force triple-buffered V-Sync in games I can run without obvious fps drops.

I don't understand when people claim non-TN blur is horrible but deal with screen tearing.
 
Praising the ZR30W and dissing the VG248QE is like praising a rotten apple core (grainy, wide-gamut 30" with gross green and red over-saturation, poor black levels, and very obvious glow) over a rotten pear core (the VG248QE's terrible lightboost colors, black levels, and grainy matte coating)... so much fail.

-----

Owned a glossy VG236H for a year, which looked better than all of the matte 120hz+ monitors, but sold it once I bought a glossy 1440p monitor; I preferred my first 1440p monitor (S27A850D) once I got used to the input lag (had to reduce the aiming sensitivity and mouse settings in many games to compensate).

Now I use a 96hz Qnix & X-Star PLS and can't really complain about blur; with v-sync disabled, screen tearing is so obvious even at high FPS (I own a 780 Lightning) that I force triple-buffered V-Sync in games I can run without obvious fps drops.

I don't understand when people claim non-TN blur is horrible but deal with screen tearing.

I guess people get hung up on the tiny bit of delay vsync adds but I'm baffled as well. Vsync or frame limiters are always a part of game setup for me regardless of the title or genre.
 
I guess people get hung up on the tiny bit of delay vsync adds but I'm baffled as well. Vsync or frame limiters are always a part of game setup for me regardless of the title or genre.
Don't you get delay with an SLI setup too, so vsync isn't much different on your setup? I'm not sure... I play without vsync and frame limiters. I don't notice tearing at all, but I really notice input lag and fps dips below 120-100.
I have a single gtx 680 and a catleap 1440p 120hz. Brutal Doom is awesome at 120fps with 7.1 audio, tons of gibs and action (the sprites animate at like 3fps, but everything else is smooth).
 
Don't you get delay with an SLI setup too, so vsync isn't much different on your setup? I'm not sure... I play without vsync and frame limiters. I don't notice tearing at all, but I really notice input lag and fps dips below 120-100.
I have a single gtx 680 and a catleap 1440p 120hz. Brutal Doom is awesome at 120fps with 7.1 audio, tons of gibs and action (the sprites animate at like 3fps, but everything else is smooth).

No idea, I don't really worry much about input delay. Tearing is ugly though.
 
I've heard the flip side of this argument, which is that you can get used to how bad TN panels look. Ultimately, I think you just have to accept that you can't have it all, and you can't really say that one type is best for everything.
 
There's nothing inherently wrong with 60Hz (NTSC), imo; if you can accelerate the pixel matrix enough then it's not such an issue. However, one thing that sticks in my craw about 60Hz is that it isn't really fully compatible with film at 24FPS. If your display is 60Hz-only then you have to use 3:2 pulldown, and that results in very slightly off-pitch audio and a juddery, slightly lossy converted video format. 72Hz is 24FPS (film) compatible, 96Hz is, and strangely, so is 120Hz. So everything works on 120Hz, assuming it's vsynced or frame-limited properly.

Very slight differences in FPS and refresh rate can cause problems. It is best if they are exactly matched or perfectly scaled by a factor of .5. So if you are playing a 60Hz game with FMVs in it, 60Hz, 90Hz, or 120Hz will work great, and 72Hz or 96Hz might not, depending on how the software runs. Tomb Raider is a good example of a game that does not like refresh rates that aren't a multiple of 30; the movies tear horribly at 100 or 96Hz, butter smooth at 90Hz. Of course, the game runs fine at 60Hz too; but then you put in a BluRay that is native 24fps and it will not run in its pure form like that.
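The divisibility rule being described can be sketched in a few lines (my own illustrative helper, not from any tool; "judder-free" here just means each source frame can be held for a whole number of refreshes):

```python
# Which refresh rates can show a given content frame rate with every
# frame held an equal, whole number of refreshes (no 3:2-style pulldown).

def judder_free(refresh_hz, content_fps):
    """True if refresh_hz is an integer multiple of content_fps."""
    return refresh_hz % content_fps == 0

for hz in (60, 72, 90, 96, 120):
    print(hz, "24fps:", judder_free(hz, 24), "30fps:", judder_free(hz, 30))
# 60hz handles 30fps-based content but needs 3:2 pulldown for 24fps film;
# 72hz and 96hz handle film but not 30fps FMVs; 120hz handles both.
```

Note 90Hz passes the 30fps check but not the 24fps one, which lines up with the Tomb Raider FMV behavior described above.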
 
Now I use 96Hz Qnix & X-Star PLS panels and can't really complain about blur, but I can't leave V-sync disabled since screen tearing is so obvious, even at high FPS (I own a 780 Lightning). I force triple-buffered V-sync in games I can run without obvious FPS drops.

I don't understand people who claim non-TN blur is horrible but put up with screen tearing.

Interesting. So even with 96Hz, you use V-sync? Wouldn't it be the same as just running the monitor at 60Hz, unless you were to use a frame limiter for 96Hz? Truly just curious. I'm thinking about buying a Korean 27".
 
Praising the ZR30W and dissing the VG248QE is like praising a rotten apple core (grainy, wide-gamut 30" with gross green and red over-saturation, poor black levels and very obvious glow) over a rotten pear core (the VG248QE's terrible LightBoost colors, black levels and grainy matte coating)... so much fail.

-----

I owned a glossy VG236H for a year, which looked better than all of the matte 120Hz+ monitors, but sold it once I bought a glossy 1440p monitor. I ended up preferring my first 1440p monitor (S27A850D) once I got used to the input lag (I had to reduce the aiming sensitivity and mouse settings in many games to compensate).

Now I use 96Hz Qnix & X-Star PLS panels and can't really complain about blur, but I can't leave V-sync disabled since screen tearing is so obvious, even at high FPS (I own a 780 Lightning). I force triple-buffered V-sync in games I can run without obvious FPS drops.

I don't understand people who claim non-TN blur is horrible but put up with screen tearing.

I just can't let go of my VG236H, I've tried dozens of monitors and I just can't find one that beats this thing. It sucks because it was built in 2009 and the backlight is slowly degrading, it has a hard time turning on sometimes as well. I really can't figure out what to replace it with as nothing else has been enjoyable to use. I vastly prefer glossy to anything else but nobody seems to make any these days.

I've tried a Korean 27" which burst into flames, and I've tried several big Dell screens (3007, 3011, 3014, 2711) as well as the VG248QE, which is awful. I tried one of the Asus 27" IPS models, but it had around 500 dead pixels so that got returned. I tried the Eizo FG2421, which looked like smashed assholes compared to a 10-year-old 17" Acer LCD (I think mine was defective, because no screen has a right to look that terrible). I tried the 29" LG ultrawide, but it was too slow for gaming. In the end, the old VG236H comes back out of storage and gets used again.
 
Interesting. So even with 96Hz, you use V-sync? Wouldn't it be the same as just running the monitor at 60Hz, unless you were to use a frame limiter for 96Hz? Truly just curious. I'm thinking about buying a Korean 27".
It wouldn't be the same: 60Hz has both tearing (without V-sync) and input lag (with V-sync) in terrible amounts.
At 96Hz you get 1.6 times less tearing and 1.6 times less input lag overall... so it still comes down to the same choice with V-sync, just now you're 1.6 times less affected by either V-sync or tearing. Unfortunately, your GPU has to be 1.6 times more powerful too, to keep frame render time below the monitor's refresh time for V-synced silky smoothness :D
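The 1.6 figure is just the ratio of refresh intervals (96/60 = 1.6): anything bounded by one refresh interval, like the worst-case V-sync wait or how long a tear line persists on screen, shrinks by that factor. A minimal sketch of the arithmetic (my own, not the poster's):

```python
# Refresh intervals at 60Hz vs 96Hz. Both the worst-case V-sync wait and
# a tear line's on-screen persistence are bounded by one refresh
# interval, so both shrink by the same ratio when the rate goes up.

def refresh_interval_ms(hz: float) -> float:
    """Time between refreshes, in milliseconds."""
    return 1000.0 / hz

t60 = refresh_interval_ms(60)      # ~16.67 ms per refresh
t96 = refresh_interval_ms(96)      # ~10.42 ms per refresh
print(f"60 Hz: {t60:.2f} ms/refresh")
print(f"96 Hz: {t96:.2f} ms/refresh")
print(f"ratio: {t60 / t96:.2f}")   # 1.60 -- the '1.6 times' above
```

The same ratio explains the GPU remark: to hold one frame per refresh, each frame must now render in ~10.4 ms instead of ~16.7 ms.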
 
It wouldn't be the same: 60Hz has both tearing (without V-sync) and input lag (with V-sync) in terrible amounts.
At 96Hz you get 1.6 times less tearing and 1.6 times less input lag overall... so it still comes down to the same choice with V-sync, just now you're 1.6 times less affected by either V-sync or tearing. Unfortunately, your GPU has to be 1.6 times more powerful too, to keep frame render time below the monitor's refresh time for V-synced silky smoothness :D

:confused:
:eek:
:confused:

60Hz with V-sync won't have tearing on any monitor unless you can't maintain 60FPS.
All this 1.6-times stuff... yeah, maybe.
V-sync adds a little input lag. Not enough to even bother mentioning if you have a Korean OC monitor with *zero* input lag.
 
If I didn't have my Samsung A750D 1080p 120Hz, which is notably vibrant/saturated for a TN, the Eizo FG2421 "240Hz" 1080p VA (120Hz plus proprietary backlight strobing; panel lottery) or a "120Hz" Korean 2560x (though the rez demands have to be taken into account, and some seem to have a "Hz lottery") would be my next choices. I'm still waiting on the Asus PG278Q 120-144Hz 1ms-response monitor though, since none of the above have the G-Sync and ULMB mode options.

The Eizo does have its own backlight strobing mode but no G-Sync, and the Korean IPS screens blur more than a TN due to their response times, from all I've read from knowledgeable people on this site seeking the best motion clarity, and they won't have G-Sync either.

I can't go back to sub 100hz on 1st/3rd person games personally.

I'd really like a 3440x1440 21:9 120hz-144hz g-sync/ulmb monitor but I don't see that happening in the foreseeable future. :p
 