120Hz LCD Info Thread

Is there a particular reason people turn on v-sync other than tearing?
Vsync is sometimes useful as a frame rate cap for older games that would otherwise run at a much higher frame rate. Generally speaking, there's no reason to run older games at a frame rate higher than the monitor's refresh rate unless sim and input are linked to the rate of the renderer (i.e., when input polling, physics, or AI are affected by the renderer's frame rate). With vsync enabled, the GPU performs less work than it otherwise would, which extends the life of the card and/or keeps the fan from ramping up more than necessary.
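(Incidentally, a plain software frame cap gives the same GPU-idling effect without tying presentation to the display. A rough sketch, nothing engine-specific; render_frame() and the 120fps target are just placeholders:)

```python
# Minimal software frame cap: do the frame's work, then sleep off the
# rest of the frame budget so the GPU/CPU idle instead of spinning.
import time

TARGET_FPS = 120
FRAME_BUDGET = 1.0 / TARGET_FPS

def render_frame():
    pass  # stand-in for the game's actual update/render work

while True:
    start = time.perf_counter()
    render_frame()
    elapsed = time.perf_counter() - start
    if elapsed < FRAME_BUDGET:
        time.sleep(FRAME_BUDGET - elapsed)
```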

Vertical sync has two cons:
1) It adds one or more frames of rendering latency, though this may not always affect sim and input.
2) It halves the effective frame rate whenever a frame can't be completed in time for a flip (assuming double buffering). This can be minimized to some extent by using more buffers.
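To put rough numbers on point 2, here's a toy model (my own numbers, not anything measured from a driver) of how double-buffered vsync snaps the displayed frame rate to an integer divisor of the refresh rate the moment a frame misses its flip:

```python
# Effective frame rate under double-buffered vsync: a frame that isn't
# finished by a refresh has to wait for the next one, so render times
# just over the refresh interval halve the displayed rate.
import math

def vsync_fps(render_ms, refresh_hz):
    refresh_ms = 1000.0 / refresh_hz
    refreshes_waited = math.ceil(render_ms / refresh_ms)  # flips happen on refresh boundaries
    return 1000.0 / (refreshes_waited * refresh_ms)

for ms in (8.0, 9.0, 16.0, 17.0):
    print(f"{ms:5.1f} ms/frame -> {vsync_fps(ms, 120):6.1f} fps @ 120Hz, "
          f"{vsync_fps(ms, 60):6.1f} fps @ 60Hz")
```

A 9 ms frame (roughly 111fps uncapped) gets shown at 60fps on a 120Hz display, and a 17 ms frame drops all the way to 40fps.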

Also, note how v-sync at 60Hz feels horrible, even when your rig can output far more than 60fps and you get "only" 16.67ms of lag. The delay at 120Hz is halved, but many people can still feel it, precisely because the mouse is directly affected.
Not necessarily. Input and rendering are decoupled in some engines such that input can run at a much higher rate (and thus update at a much higher rate) than the renderer.
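As a rough illustration of that decoupling (all the names and rates here are made up for the example, not any engine's actual API), input can be sampled on its own thread at ~1000Hz while the render loop runs at whatever rate the display allows:

```python
# Toy decoupled input/render loop: input is polled at ~1000 Hz on its own
# thread; the renderer drains whatever accumulated since its last frame,
# however slowly it runs.
import random
import threading
import time

def poll_mouse():                 # stub standing in for a real platform call
    return random.randint(-1, 1), random.randint(-1, 1)

def update_camera(dx, dy):        # stub for the game's camera/sim update
    pass

def draw_frame():                 # stub; pretend vsync paces us at 120 Hz
    time.sleep(1 / 120)

pending = {"dx": 0, "dy": 0}
lock = threading.Lock()
running = True

def input_thread():
    while running:
        dx, dy = poll_mouse()
        with lock:
            pending["dx"] += dx
            pending["dy"] += dy
        time.sleep(0.001)         # ~1000 Hz sampling, independent of frame rate

def render_loop(frames=600):
    for _ in range(frames):
        with lock:
            dx, dy = pending["dx"], pending["dy"]
            pending["dx"] = pending["dy"] = 0
        update_camera(dx, dy)     # consumes everything gathered since the last frame
        draw_frame()

threading.Thread(target=input_thread, daemon=True).start()
render_loop()
running = False
```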
 
Some people are just more sensitive to it than others. I've had the glasses on for well over 100 hours now; no problems other than minor eye strain, but nothing more than I get from staring at a monitor for too long anyway.
 
Wait, what, really? I think you could have sold them for quite some monies. I'd have at least taken them off your hands to try out once or twice, then probably passed them along to someone else.

I've gotta say I get headaches after a while from 3D. Made it 3/4 of the way through Avatar before I got a headache (could feel it coming, so I took off the glasses for a few minutes). The nvidia thing just looks like a massive migraine, considering that it is pretty much like staring at a strobe the entire time. I just can't imagine anyone using it for any length of time. I don't think nvidia even made any attempt at thinking it through, as this very well could give someone a seizure. Wait till someone falls over, hits their head, and then sues nvidia. Avatar just used polarized lenses, so the only headache comes from your brain (it knows it's not 3D) trying to correct what you are seeing.

As I posted elsewhere, I ended the whole damned thing with a headache and was about to vomit. I tossed the things to the side, contemplated destroying them...picked them up, half wanting to destroy them and half wanting to just sell them or something...and squeezed them, not really trying to destroy them...and one of the lenses immediately shattered. I'm actually surprised how easy it was to destroy. Once the lens shattered I just went to town on the rest of it.

Skakruk - no. I might not be seeing the advantage to 120Hz, but no way am I going to break open an LCD...anything with AC power in it I generally handle with care if I ever open it...and the cost of the LCD is enough to keep me from that anyway. Rarely if ever do I break things in rage...I just wander off and toss a stress ball around :)

heatlesssun: So, the nVidia forums basically said I had no clue what I was doing and that it was all user error/I was insane or something. So, a coworker who has a kit around he's not using (wish I'd known this sooner...) is lending me his glasses and I'm going to take pictures. But there was a VERY CLEAR second image on the RE5 variable benchmark;
when Chris and Sheva enter the room at the beginning, I could see 3 shevas and 3 chris' basically - the main, clearest one, one offset to the right, one offset to the left (basically, 2 of each per eye) - the ghosting was truly pretty terrible. I have no doubt that, as some people on the nVidia forums suggested, the glasses are totally out of sync with what is being displayed, but nVidia offers no way to tweak timings, and USB is far from a latency-free solution.

Trine wasn't as bad, but it was still pretty bad (oh, and you know how someone on here said Trine wasn't the best 3D title? People on the NV forums are calling it one of the best.) And how someone on here said RE5 was one of the better titles to try? Well, the NV forums claim it's one of the worst.
 
Minimized? I'd say it can be all but ELIMINATED using triple buffer.
"Minimized" is probably the wrong word. Triple buffering allows for the effective frame rate to drop less severely than double buffering when a frame can't complete before a flip.
 
Triple Buffering is awesome and makes a huge noticeable difference. Too bad it doesn't work with Direct3D so it's basically irrelevant.

One note, I run a 120Hz monitor and VSYNC is still definitely detrimental to smoothness and responsiveness. Also, the effect of tearing is lessened significantly with a 120Hz monitor and VSYNC off. So I run VSYNC off.
 
Triple buffering is supported in Direct3D but can't be enabled in every game (without a tool).

As for vertical sync being detrimental to 'smoothness', that hasn't ever been my experience. Theoretically speaking, rendering in sync with the display's refresh rate should, barring some sort of engine deficiency, always appear smoother.
 
It's true that the NVIDIA CP option for Triple Buffering only applies to OpenGL, but for Direct3D you can use D3DOverrider. Also, some games have their own setting for true triple buffering that operates under Direct3D. Pretty sure Crysis is one of them.

D3D is no excuse not to use Triple Buffer.

It sure is an excuse not to use it when AMD won't offer the option in the driver settings. I didn't know it could be manually forced with an external app, however.

I haven't seen an option in Crysis 2 for triple buffering and I've already played through the game.
 
Triple buffering is supported in Direct3D but can't be enabled in every game (without a tool).

As for vertical sync being detrimental to 'smoothness', that hasn't ever been my experience. Theoretically speaking, rendering in sync with the display's refresh rate should, barring some sort of engine deficiency, always appear smoother.

Err, theoretically speaking, VSYNC is detrimental to smoothness. It only allows frame updates when the display is ready, regardless of when in "game time" the frame actually occurred. On the other hand, with VSYNC off, the frame can begin to be displayed immediately, with the side effect of that frame being broken by tearing. But there's no stutter from the constant, frame-to-frame variable mismatch between "game time" and "display time".
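To illustrate that mismatch with made-up numbers (a toy model, not a measurement of anything): if frames alternate between 15 ms and 18 ms of render time on a 60Hz display with double-buffered vsync, the on-screen spacing flips between one and two refresh intervals even though nothing ever tears:

```python
# Each frame is only shown on a refresh boundary, so render times that
# straddle the 16.7 ms refresh interval produce alternating 16.7 ms /
# 33.3 ms gaps on screen - the stutter being described.
import math

refresh = 1000.0 / 60                      # 16.67 ms per refresh
render_times = [15, 18, 15, 18, 15, 18]    # ms, purely illustrative

start = 0.0
prev_flip = 0.0
for i, r in enumerate(render_times, 1):
    finish = start + r                               # GPU finishes the frame
    flip = math.ceil(finish / refresh) * refresh     # shown at the next refresh
    print(f"frame {i}: rendered in {r} ms, shown {flip - prev_flip:5.2f} ms after the previous one")
    prev_flip = flip
    start = flip                                     # double buffering: GPU waits for the flip
```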
 
Sorry for the double post...but to conclude my wonderful venture into 3D Vision, I'll end with this:

RE5 is terrible; text is unreadable in 3D mode and ghosting is extremely bad. After trying the various bullshit the scumbags at nvidia posted on their forums, I ended the night feeling nauseous with a bad headache...the world is still spinning (and I was FINE with stuff like Avatar in 3D before)

So my solution for nVidia 3D Vision? Here.

Good use of $200, next time send 'em to me I'll be happy to pay shipping and then either keep 'em or sell 'em.
 
EDIT: Need to revise this post a bit. Will come back to it later.
 
Triple buffering is supported in Direct3D but can't be enabled in every game (without a tool).

As for vertical sync being detrimental to 'smoothness', that hasn't ever been my experience. Theoretically speaking, rendering in sync with the display's refresh rate should, barring some sort of engine deficiency, always appear smoother.

To me, v-sync "looks" smoother but doesn't "feel" smoother because of the lag it creates. I've noticed some kind of micro-stuttering in some games too (pretty rare though).
There simply isn't any FPS (even the super old ones that run at a very high framerate) I can enjoy playing with v-sync ON (I don't mind too much in RTS/RPGs most of the time however).

There are quite a lot of games with built-in framerate caps anyway, when there isn't one I usually just bump the AA/AF to something ridiculous, since my GPU squeals with insane framerates (like 400fps +).
 
Thought I posted it here: Fry's price-matched the kit to 120 for me. I'll certainly sell off the emitter if you want :-p

Thanks, but no, the glasses I could have used as an extra pair for 3D movies or SFIV with friends. The emitter is a dime a dozen; in essence, you destroyed the only part that actually had any value.
 
Thanks, but no, the glasses I could have used as an extra pair for 3D movies or SFIV with friends. The emitter is a dime a dozen; in essence, you destroyed the only part that actually had any value.

Except you've got people on the nvidia forums claiming they are the worst 3D glasses money can buy. Give me a break.

No one seems to realize: I destroyed them. I'm not on here to cry about it. I'm happy I trashed em....felt damn good.
 
It's very pathetic that you need to wear glasses to look at your screen... (talk about a no chick "dick" attraction....)
 
Hey all,

I've been thinking very seriously about buying either the Asus VG236 or the BenQ XL2410Q, both of which seem to be love 'em or hate 'em types of monitors. I'm very concerned about black levels - as such I'm also considering the Eizo Foris FS2331, which has other threads in this forum and with its PVA panel has spectacular black levels - but with quite a bit more ghosting, supposedly.

Coming from a Dell 2209WA, does anyone have any perspective as to how much I'd notice 120Hz vs. going with the PVA panel? Hell, do the two 120Hz monitors I've listed have better black levels (taking into account IPS glow, which I've really grown tired of) than the 2209WA (reviews seem to be mixed)? This is primarily for gaming, so ghosting would be a very bad thing, but I'm tired of the poor blacks of my 2209WA.

Thanks!
 
It's very pathetic that you need to wear glasses to look at your screen... (talk about a no chick "dick" attraction....)


Yeah because a geek playing games in front of a computer gets chicks SOOO hot without glasses?:confused:
 
Hey all,

I've been thinking very seriously about buying either the Asus VG236 or the BenQ XL2410Q, both of which seem to be love 'em or hate 'em types of monitors. I'm very concerned about black levels - as such I'm also considering the Eizo Foris FS2331, which has other threads in this forum and with its PVA panel has spectacular black levels - but with quite a bit more ghosting, supposedly.

Coming from a Dell 2209WA, does anyone have any perspective as to how much I'd notice 120Hz vs. going with the PVA panel? Hell, do the two 120Hz monitors I've listed have better black levels (taking into account IPS glow, which I've really grown tired of) than the 2209WA (reviews seem to be mixed)? This is primarily for gaming, so ghosting would be a very bad thing, but I'm tired of the poor blacks of my 2209WA.

Thanks!

I went from a 2209WA to the ACER 120Hz. Ghosting is almost a non-issue at 120Hz.
Input lag at 60Hz is noticeably higher than on the 2209WA at 60Hz or 76Hz. But switch to 120Hz and it all but disappears.

Colors in many situations are noticeably off/bad compared to the e-IPS. But then you get used to it after 3 days and it becomes a non issue.

Black levels are close, perhaps the ACER is slightly better. Nothing to write home about. This will be my monitor until I can get a 120Hz/240Hz displayport OLED.
 
I went from a 2209WA to the ACER 120Hz. Ghosting is almost a non-issue at 120Hz.
Input lag at 60Hz is noticeably higher than on the 2209WA at 60Hz or 76Hz. But switch to 120Hz and it all but disappears.

Colors in many situations are noticeably off/bad compared to the e-IPS. But then you get used to it after 3 days and it becomes a non issue.

Black levels are close, perhaps the ACER is slightly better. Nothing to write home about. This will be my monitor until I can get a 120Hz/240Hz displayport OLED.

Glad someone on this forum isn't a super-picky neat freak.
People need to learn to be happy with products which are 99.9% acceptable given the current market trends.
 
Glad someone on this forum isn't a super-picky neat freak.
People need to learn to be happy with products which are 99.9% acceptable given the current market trends.

Bought my ViewSonic VX2265wm when this first came out. I couldn't really care less that it doesn't have an OSD since I can control that right from the OS, and I am happy with mine :)

I am, though, interested in these mythical 27" 120hz monitors that only appear during tech conventions...
 
I went from a 2209WA to the ACER 120Hz. Ghosting is almost a non-issue at 120Hz.
Input lag at 60Hz is noticeably higher than on the 2209WA at 60Hz or 76Hz. But switch to 120Hz and it all but disappears.

Colors in many situations are noticeably off/bad compared to the e-IPS. But then you get used to it after 3 days and it becomes a non issue.

Black levels are close, perhaps the ACER is slightly better. Nothing to write home about. This will be my monitor until I can get a 120Hz/240Hz displayport OLED.

IMO, the Alienware colors look great, and while there is banding, it's barely noticeable; the only other issue is the viewing angles...they're what you'd expect from a TN panel...and if all you do is look straight at the monitor (meaning it's not a TV as well) it's fine. I don't think my Alienware is going back, even if 3D is a bunch of bullshit.

I'm hoping to run the FC2 tearing test tonight, btw.
 
Glad someone on this forum isn't a super-picky neat freak.
People need to learn to be happy with products which are 99.9% acceptable given the current market trends.
true
i don't see why so many people simply refuse to eat shit
1,000,000,000 flies seem to be perfectly happy with its quality
 
true
i don't see why so many people simply refuse to eat shit
1,000,000,000 flies seem to be perfectly happy with its quality

I wasn't going to reply to him...everyone has different things they want. Some people have golden ears and pay $500 for headphones...some people have "golden eyes" and can spot bad colors right away and so want PVA color accuracy. To each their own. If you don't care about color accuracy, viewing angles, banding etc...good for you. Doesn't make someone else's likes and dislikes any more or less valid.
 
I think most people who find LCD so acceptable haven't seen anything to compare the technology to - that is, a *healthy* CRT, or possibly a high-end plasma. Pitting a brand new LCD against a high-mileage CRT is NOT a valid comparison (although I find high-end CRTs tend to look better even if they've seen plenty of hours). Obviously, it's not a person's fault if they don't have access to a good running CRT for comparison, but it is still a case of ignorance.

And ignorance should NOT be an excuse for the kind of technological stagnation that we've seen with LCD. If more people realized just how much LCD shit they were eating, maybe manufacturers would be prompted to give us OLED etc. sooner.

The same fundamentally flawed technology, with half-arsed 'improvements' plastered over it year after year. Edging - in painfully slow increments - towards the picture quality we already had 8 years ago. And the manufacturers of this shit have so many of us believing that we are actually advancing.

I love my Panny plasma. It might buzz a little (need to find that damned coil and coat it in silicone) but it's an amazing tech - so glad I didn't buy a LCD TV...

Motion looks amazing, black levels are great, it's smaller than a tube TV (although it DOES weigh ~80lbs with its wall-mount assembly on it), costs less than a decent-looking LCD, has no backlight that'll die...etc.
 
I love my Panny plasma. It might buzz a little (need to find that damned coil and coat it in silicone) but it's an amazing tech - so glad I didn't buy a LCD TV...

Motion looks amazing, black levels are great, it's smaller than a tube TV (although it DOES weigh ~80lbs with its wall-mount assembly on it), costs less than a decent-looking LCD, has no backlight that'll die...etc.

Most LCDs are around 40-100W and, if I'm not mistaken, plasma screens use a lot more, which you'll feel in your energy bill...
 
I think most people who find LCD so acceptable haven't seen anything to compare the technology to - that is, a *healthy* CRT, or possibly a high-end plasma. Pitting a brand new LCD against a high-mileage CRT is NOT a valid comparison (although I find high-end CRTs tend to look better even if they've seen plenty of hours). Obviously, it's not a person's fault if they don't have access to a good running CRT for comparison, but it is still a case of ignorance.

And ignorance should NOT be an excuse for the kind of technological stagnation that we've seen with LCD. If more people realized just how much LCD shit they were eating, maybe manufacturers would be prompted to give us OLED etc. sooner.

The same fundamentally flawed technology, with half-arsed 'improvements' plastered over it year after year. Edging - in painfully slow increments - towards the picture quality we already had 8 years ago. And the manufacturers of this shit have so many of us believing that we are actually advancing.

This.
I'm tired of LCDs now. It looks like they will never be "acceptable" for someone like me.

You either get the almost ghosting-free 120hz TN with crappy colours and blacks, or the IPS with decent colours, crappy blacks and some ghosting, or you get the PVA with great colours and blacks... but terrible ghosting.
Plus there's that annoying "fixed" res which makes watching DVDs or playing old games so awkward.
 
Most LCDs are around 40-100W and, if I'm not mistaken, plasma screens use a lot more, which you'll feel in your energy bill...

You know, I'm at work so I'll save the math for later, but I've already done the math with regards to videocards and processors: an extra 100W isn't going to kill you. It'll cost you less than $50 a year IIRC.

Actually...no, I needed to go online to pay my electric bill, so let's do this. I'm paying $0.085578 per kWh. OK, so now, let's say that I watch TV (be it movies, games, TV etc) for 20 hours every week - probably accurate for me. So, the math is wattage * hours / 1000 * cost per kWh. OK...and my panny uses 275.03W average (this is off a site - I'll plug my Kill A Watt in tonight.) OK, big number. Wee. Let's go:

20 hours a week * 52 weeks = 1040 hours
1040 hours * 275.03W = 286031.2 Wh
286031.2 Wh / 1000 = 286.0312 kWh
286.0312 kWh * $0.08557 = $24.475689784

So, apparently...this TV costs 25 USD a year to run. OK, so...maybe my usage numbers are lower than average...but still, even if you use the TV for 2000 hours a year, well...double the cost. Now, to me paying 25 USD a year is...nothing. That's a few coffees for crying out loud.

Anyone who brings the cost of running a TV or a videocard into the discussion, in my mind, is running with a few screws loose. It doesn't matter. In a PC it matters whether you can cool it and power it. In a living room...how does it matter? Can your breaker handle it? I mean...that's all really.

(and my wattage numbers I BET are lower given I run at 25 brightness or something. I'll be mucking around on the xbox/watching movies tonight, so I'll run the kill a watt the entire time.)
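(For anyone who wants to rerun that with their own numbers, the whole thing is one line of arithmetic; the wattage and rate below are just the figures from this post:)

```python
# Annual running cost = watts * hours / 1000 * price per kWh.
def annual_cost_usd(watts, hours_per_year, usd_per_kwh):
    return watts * hours_per_year / 1000.0 * usd_per_kwh

print(annual_cost_usd(275.03, 1040, 0.085578))   # the plasma above: ~ $24.48/yr
print(annual_cost_usd(100.0, 1040, 0.085578))    # a ~100 W LCD: ~ $8.90/yr
```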
 
The point I'm trying to make is: what other choice do you have right now if you want 120Hz? There aren't a whole lot of options, and there's little sense in denying yourself 120Hz while waiting for something better than the current offerings.

Which are FINE; there are no better options for FPS gaming in realistic, practical terms (CRTs are pretty much no longer an option because of availability).

TNs suck on color, but I've yet to see one of the 120Hz panels that couldn't be correctly calibrated to acceptable levels. You're playing games; you're not going to notice.

I guess these monitors are hitting too much of a niche market; I doubt they'll make too many more if this becomes a bigger problem.
And why are we talking about plasmas here? There are no 120Hz TVs worth talking about, so comparing any LCD to plasma in this thread is kinda off-topic.
 
The point I'm trying to make is: what other choice do you have right now if you want 120Hz? There aren't a whole lot of options, and there's little sense in denying yourself 120Hz while waiting for something better than the current offerings.

Which are FINE; there are no better options for FPS gaming in realistic, practical terms (CRTs are pretty much no longer an option because of availability).

TNs suck on color, but I've yet to see one of the 120Hz panels that couldn't be correctly calibrated to acceptable levels. You're playing games; you're not going to notice.

I guess these monitors are hitting too much of a niche market; I doubt they'll make too many more if this becomes a bigger problem.
And why are we talking about plasmas here? There are no 120Hz TVs worth talking about, so comparing any LCD to plasma in this thread is kinda off-topic.

What other choice? Panny plasma at 1080p. Pretty color accurate, great black levels...

As for color: am I playing games? Yes. Am I also browsing? Looking at photos from my camera? Yep. I notice. Banding is the bigger issue, really.

As for comparing to plasma...why not? It's the spiritual successor to CRTs. It can be made into a great monitor. Frankly, I wish I'd nabbed the VT25 instead of the G25....and I'd then just hook my PC up to my TV and be done.
 
Thread title: 120Hz LCD Info Thread. Don't really care, but still.
 
Thread title: 120Hz LCD Info Thread. Don't really care, but still.

And if the info is that current 120Hz LCDs are your run-of-the-mill TN panels with all the TN pitfalls (and I'll say again, the AW2310 is better than any other TN I've used), that's great info. If people want to say that a plasma is a better choice...well, that's good info to help guide people.

On top of that, you kinda opened this can of worms yourself.
 
I suppose, although I'll agree on the Alienware; best overall.
 
I like my AW2310. Within its viewing angles it provides pretty good picture quality IMHO, not IPS level but significantly better than your average TN panel, and the finish is matte, which I love. But as with all TN panels, outside of the narrow viewing angles there's significant color shifting. But for gaming I think the 120Hz is more important than super accurate color and no color shifting.
 
You know, I'm at work so I'll save the math for later, but I've already done the math with regards to videocards and processors: an extra 100W isn't going to kill you. It'll cost you less than $50 a year IIRC.

Actually...no, I needed to go online to pay my electric bill, so let's do this. I'm paying $0.085578 per kWh. OK, so now, let's say that I watch TV (be it movies, games, TV etc) for 20 hours every week - probably accurate for me. So, the math is wattage * hours / 1000 * cost per kWh. OK...and my panny uses 275.03W average (this is off a site - I'll plug my Kill A Watt in tonight.) OK, big number. Wee. Let's go:

20 hours a week * 52 weeks = 1040 hours
1040 hours * 275.03W = 286031.2 Wh
286031.2 Wh / 1000 = 286.0312 kWh
286.0312 kWh * $0.08557 = $24.475689784

So, apparently...this TV costs 25 USD a year to run. OK, so...maybe my usage numbers are lower than average...but still, even if you use the TV for 2000 hours a year, well...double the cost. Now, to me paying 25 USD a year is...nothing. That's a few coffees for crying out loud.

Anyone who brings the cost of running a TV or a videocard into the discussion, in my mind, is running with a few screws loose. It doesn't matter. In a PC it matters whether you can cool it and power it. In a living room...how does it matter? Can your breaker handle it? I mean...that's all really.

(and my wattage numbers I BET are lower given I run at 25 brightness or something. I'll be mucking around on the xbox/watching movies tonight, so I'll run the kill a watt the entire time.)

I'm an average user, behind my screen 4 hours a day, which is 1460 hours a year.
Seeing as my wife always gets the same monitor as me, that would mean 2920 hours a year... so how much would this cost me? Around $70 a year on energy?
I am careful since we both use separate PCs and we use electric heating for the winter.

Anyhow, I have my doubts about using a TV as a monitor. I have one that can be used for PC/TV as well, which cost me around €680. I bought it a year ago, but when I plugged it in behind my PC, the first impression was that the colors were stretched (quality was poor).
So I doubt I will be using anything other than a PC monitor for my PC.
 
I don't understand why you guys are talking about plasmas in this thread at all. Don't get me wrong, I love them, but... not as a computer monitor. The picture is too soft for text in my opinion, and burn-in would be just terrible with a static web browser open multiple hours a day.
 
I don't understand why you guys are talking about plasmas in this thread at all. Don't get me wrong, I love them, but... not as a computer monitor. The picture is too soft for text in my opinion, and burn-in would be just terrible with a static web browser open multiple hours a day.

I agree, it's the main reason why I don't want one..
 
When are 120Hz IPS monitors coming? The light bleeding in TN monitors is ridiculous, but 120Hz is still so good that I'll just have to keep this piece of junk. *sigh*
 
When are 120Hz IPS monitors coming? The light bleeding in TN monitors is ridiculous, but 120Hz is still so good that I'll just have to keep this piece of junk. *sigh*

Seriously? We have had like 100 threads on this site asking the same thing....it is not happening.

Light bleeding is just as bad on IPS as TN, except you have to deal with aggressive AG coating and the response time as well.

Turn down the brightness
 