120Hz Windows Desktop (2D performance)

Trematode

Hey guys,

Trying to get a handle on this, and I'm hoping you can help.

Last week I helped a friend of mine build a new system. We were pretty intrigued by the new batch of 120Hz LCDs out there -- we've always lamented the fact that 60Hz became the standard when we all transitioned from CRTs (it's fine for most people, but if you were used to the fluidity of a high-end CRT, it was less than ideal).

Anyway, long story short, he ended up snagging an Asus VG236 to go with his GTX580, and once we fired it up and set the Windows refresh rate to 120Hz, we were NOT disappointed.

IMMEDIATELY, we saw the difference just dragging windows across the screen. The improvement was like a breath of fresh air after all these years.

I was so taken aback by just the everyday Windows performance improvement that I convinced a second friend to purchase the monitor.

This is where I get confused. He swears up, down, and sideways that he cannot see any discernible difference between the 60Hz and 120Hz settings on his display. Before you ask, he is indeed running dual-link DVI, and the monitor's built-in OSD confirms 120Hz as the input signal.

The only difference is that he is running this off of a Radeon 5770.

My question to you guys is this: nVidia and AMD users, can you notice a discernible difference between these two refresh rate settings on native 120Hz LCDs in everyday Windows usage?

My hunch is that AMD has cheated a bit in their 2D implementation, and even though the signal is being output at 120Hz, discrete 120Hz data is not actually being sent to the display. (I read about a lot of issues they were having in previous driver releases with flicker at 120Hz settings -- maybe they implemented this as a workaround?)

I have searched everywhere, but I can't find anything on the net that tests 2D performance of these monitors on different GPUs.

To me, this is a huge selling point, and nobody is talking about it. If you're going to be working in front of a machine all day, 120Hz would be a much nicer experience.
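
If anyone wants to go beyond eyeballing it, here's a rough Python + ctypes sketch I've been meaning to have him run (my own idea, not a vendor tool, and I haven't been able to verify it on his Radeon box; it assumes Windows Vista/7 with Aero/DWM composition on and a Python install). It times successive DwmFlush() calls, which block until the desktop compositor finishes its next pass, so the average interval should land near 8.3 ms if the desktop is really being composed at 120Hz, or near 16.7 ms if it's effectively stuck at 60Hz:

Code:
import ctypes
import time

dwmapi = ctypes.windll.dwmapi      # Desktop Window Manager API (Vista/7, Aero must be on)

dwmapi.DwmFlush()                  # sync up to a composition boundary first
intervals = []
for _ in range(240):
    start = time.perf_counter()
    dwmapi.DwmFlush()              # blocks until the DWM's next present completes
    intervals.append(time.perf_counter() - start)

avg = sum(intervals) / len(intervals)
print("average composition interval: %.2f ms (~%.0f Hz)" % (avg * 1000, 1.0 / avg))

On the GTX580 machine I'd expect a number near 120; if the Radeon machine reports something near 60 even while the OSD shows a 120Hz input, that would back up the hunch. Again, just a sketch, not something I've proven out.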
 
Hmmm, hadn't thought about it, I just got my VG236 today.

So switching between 120 and 60 and then moving a window back and forth, I think I can see the 120 being a little smoother. Or I could just be trying to believe that, heh. I think it is, but I'm not sure I'd have ever paid attention to it without someone asking.
 
Doesn't matter which brand video card you have. 120Hz is 120Hz. Your friend is just blind as a bat.
 
I fail to see how 120Hz at the desktop is a selling point myself. If you're being "productive" you will have some kind of input window maximized and be typing a lot. Most of the screen will be static. A quality IPS display makes far more sense in this scenario.

Having had the Samsung 2233RZ and Dell 2209WA side by side, I couldn't tell THAT much difference in desktop use, such as dragging windows around. The inherent flaw of LCD sample-and-hold means the blur of black text on a white background is always apparent. You still can't easily read text being smoothly dragged around the screen on a 120Hz LCD, whereas you can on a CRT.
 
If you can't tell the difference between 60hz and 120hz, you have terrible "sight".

I disagree. I have excellent vision and I can easily see the flicker of a 60Hz or 70Hz CRT. So much so that it really bugs me, but I cannot see a difference in 2D desktop performance between 60 and 120Hz on an LCD.

I can definitely see a difference between TN and VA/IPS though and since all 120Hz panels are TN, I highly prefer the 60Hz VA or IPS.
 
What if he switches back to a true 60Hz display? Does he see the difference then? 120Hz displays probably perform better even at 60Hz.
 
For normal everyday usage you won't really see much of a difference. It is when watching movies (especially fast-paced action) and gaming that 120Hz is going to be noticeable. And like others said, the graphics card doesn't make a difference in the Hz. Now I just need to pick one up for myself :D
 
I disagree. I have excellent vision and I can easily see the flicker of a 60Hz or 70Hz CRT. So much so that it really bugs me, but I cannot see a difference in 2D desktop performance between 60 and 120Hz on an LCD.

I'm not talking about a stationary image.

I am talking about moving windows around and resizing, scrolling web pages, and moving the cursor.

There is a HUGE difference in regular desktop usage for those, and it's something I can see quite easily.

I am curious what kind of GPU you are running if you can't see a difference when you switch from 60Hz to 120Hz -- it sounds like my friend not perceiving the difference, and I am curious whether you are running an AMD video card as well.

For normal everyday usage you won't really see much of a difference. It is when watching movies (especially fast-paced action) and gaming that 120Hz is going to be noticeable. And like others said, the graphics card doesn't make a difference in the Hz. Now I just need to pick one up for myself

I completely disagree. Everyday usage in Windows shows quite a difference, and I can see it on the GTX580 and VG236 setup I have experience with.

If anything, because video is usually filmed at 24 or 30 fps, with motion blur, you are going to see zero difference between running the display at 60 vs 120 hz.

I agree that the results should be obvious when gaming, though.
 
What if he switches back to a true 60Hz display? Does he see the difference then? 120Hz displays probably perform better even at 60Hz.

Nope -- this is what I can't understand.

It's so blatantly obvious when switching on the hardware setup I mentioned. For him not to see it on his machine leads me to believe the video card isn't outputting a proper 120Hz signal in Windows.

To clarify, I haven't been able to see the hardware running on the Radeon in person myself, and that's part of the reason I'm so confused.
 
If you can't see his hardware setup, it's kind of pointless trying to find what the issue is. Has your friend EVER seen the difference between 60Hz and 120Hz?
 
If you can't see his hardware setup, it's kind of pointless trying to find what the issue is. Has your friend EVER seen the difference between 60Hz and 120Hz?

I don't think it's pointless...

I've helped him troubleshoot online and confirmed that he's got the latest drivers, is using the proper DVI connection, has the refresh rate properly selected in the display settings, and that the display itself is reporting 120Hz for the input signal.

In my experience, when it's turned on, the difference is obvious. From the troubleshooting, it should be turned on, but he's not seeing it.

I'm just trying to gauge whether you guys find the effect obvious, and whether any disagreement splits along the line of differing Windows implementations for AMD and nVidia.
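
For what it's worth, the next thing I'm going to have him run is a quick Python + ctypes check of the mode Windows itself reports for the current display, rather than trusting the control panel or the OSD (my own sketch, assuming the VG236 is his primary display and he has Python installed; it just reads the current DEVMODE via EnumDisplaySettingsW):

Code:
import ctypes
from ctypes import wintypes

# Full DEVMODEW layout, with the printer/display union collapsed to the display fields
class DEVMODEW(ctypes.Structure):
    _fields_ = [
        ("dmDeviceName", wintypes.WCHAR * 32),
        ("dmSpecVersion", wintypes.WORD),
        ("dmDriverVersion", wintypes.WORD),
        ("dmSize", wintypes.WORD),
        ("dmDriverExtra", wintypes.WORD),
        ("dmFields", wintypes.DWORD),
        ("dmPositionX", ctypes.c_long),
        ("dmPositionY", ctypes.c_long),
        ("dmDisplayOrientation", wintypes.DWORD),
        ("dmDisplayFixedOutput", wintypes.DWORD),
        ("dmColor", ctypes.c_short),
        ("dmDuplex", ctypes.c_short),
        ("dmYResolution", ctypes.c_short),
        ("dmTTOption", ctypes.c_short),
        ("dmCollate", ctypes.c_short),
        ("dmFormName", wintypes.WCHAR * 32),
        ("dmLogPixels", wintypes.WORD),
        ("dmBitsPerPel", wintypes.DWORD),
        ("dmPelsWidth", wintypes.DWORD),
        ("dmPelsHeight", wintypes.DWORD),
        ("dmDisplayFlags", wintypes.DWORD),
        ("dmDisplayFrequency", wintypes.DWORD),
        ("dmICMMethod", wintypes.DWORD),
        ("dmICMIntent", wintypes.DWORD),
        ("dmMediaType", wintypes.DWORD),
        ("dmDitherType", wintypes.DWORD),
        ("dmReserved1", wintypes.DWORD),
        ("dmReserved2", wintypes.DWORD),
        ("dmPanningWidth", wintypes.DWORD),
        ("dmPanningHeight", wintypes.DWORD),
    ]

ENUM_CURRENT_SETTINGS = -1          # ask for the mode currently in use
mode = DEVMODEW()
mode.dmSize = ctypes.sizeof(DEVMODEW)
if ctypes.windll.user32.EnumDisplaySettingsW(None, ENUM_CURRENT_SETTINGS,
                                             ctypes.byref(mode)):
    print("%d x %d @ %d Hz" % (mode.dmPelsWidth, mode.dmPelsHeight,
                               mode.dmDisplayFrequency))

If that prints 120 and the desktop still looks no different to him, then the mode is at least set correctly on the Windows side, and the question moves to what the card actually does with it.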
 
My guess is that everything is working just fine and your friend simply doesn't notice the difference. The majority of people don't notice any of the issues often discussed here (poor viewing angles even from head on, gamma shift, input lag, response time, screen uniformity, etc.). I gave my CRT to a friend of mine because he needed a monitor, and the next time I came to his house, to my horror, he was using it at 60Hz even though the monitor is capable of up to 120Hz at that resolution. Switching back and forth I tried to show him, but he simply couldn't tell any difference in either smoothness or flicker.

I can tell right away, but I agree with what one previous poster said: I don't really see it as a huge selling point at the desktop, as everything being smoother while moving windows around hardly matters at all for actual work.
 
Eye of the beholder, gentlemen. Not everyone sees the exact same thing. I notice the difference without question. The mouse blurs less, and the overall smooth nature of windows maximizing and minimizing in W7 makes the difference easy to see for me. It's a dead-on reminder of using a CRT monitor back in the day, with its high refresh rates at higher resolutions.

Not everyone is wired the same way, and while some of us, for instance, require heavy glasses prescriptions, others have perfect 20/20 sight until the day they grow old and die.
 
Nope -- this is what I can't understand.

It's so blatantly obvious when switching on the hardware setup I mentioned. For him not to see it on his machine leads me to believe the video card isn't outputting a proper 120Hz signal in Windows.

To clarify, I haven't been able to see the hardware running on the Radeon in person myself, and that's part of the reason I'm so confused.

You really need to see what he is seeing or forget about it and move on.
 
I don't think it's pointless...

I've helped him troubleshoot online and confirmed that he's got the latest drivers, is using the proper DVI connection, has the refresh rate properly selected in the display settings, and that the display itself is reporting 120Hz for the input signal.

In my experience, when it's turned on, the difference is obvious. From the troubleshooting, it should be turned on, but he's not seeing it.

I'm just trying to gauge whether you guys find the effect obvious, and whether any disagreement splits along the line of differing Windows implementations for AMD and nVidia.

I can notice a difference in fast stuff like gaming. But moving windows around, scrolling, and maximizing/minimizing? Not really. I didn't think there was enough of a benefit in gaming to justify downgrading from the quality of an IPS or VA panel to gain a higher refresh rate.

To me, the viewing angle, colors, backlight bleed, etc. between a TN and my VA are MUCH more noticeable, even in gaming, than anything I've seen going from 60 to 120Hz.

My video card is a GTX 480 and I only used the 120Hz monitor for a day.
 
I can notice a difference in fast stuff like gaming. But moving windows around, scrolling, and maximizing/minimizing? Not really. I didn't think there was enough of a benefit in gaming to justify downgrading from the quality of an IPS or VA panel to gain a higher refresh rate.

To me, the viewing angle, colors, backlight bleed, etc. between a TN and my VA are MUCH more noticeable, even in gaming, than anything I've seen going from 60 to 120Hz.

My video card is a GTX 480 and I only used the 120Hz monitor for a day.

Yeah, if I take this back, the main reason is going to be the color. Going from 16:10 to 16:9 wasn't that big of a deal; I notice the difference, but it's not a killer. Backlight bleed isn't a killer for me either, though admittedly it's a little worse than on my VA.

Trying to get the colors to where I like them is proving to be a drag, but I've only been using this for 5 hours or so, so far.
 
I do find many people who cannot tell the difference. It is possible it is genetic, but more than that I think vision pertaining to speed is a learned behavior. It is separate from the traditional means of measuring eyesight; in fact, it is something that almost no one measures, and something you do not develop if you do not push yourself to develop it as a skill. It is possible he just has not developed this ability. Tell him to concentrate on some fast-paced games, then try switching back. It may also help to set up dual monitors, run one at 60Hz and the other at 120Hz in extended display mode, then put a window between them and move it around to see if the difference becomes obvious.

Keep in mind that LCDs have taken over so aggressively that you now run into a good number of people who never had much experience with CRTs. The few they did use were only running at 60Hz; go into almost any non-tech-savvy place and the CRTs were always set to 60Hz. So it does not surprise me that lots of people just do not have much to compare against.
 
if you don't mind, what kind of video hardware are you running?
currently just 120hz crts
but i"ve already had the lg w2363d and the old samsung/viewsonic 120hz tfts

I do find many people who cannot tell the difference. It is possible it is genetic, but more than that I think vision pertaining to speed is a learned behavior. It is separate from the traditional means of measuring eyesight; in fact, it is something that almost no one measures, and something you do not develop if you do not push yourself to develop it as a skill. It is possible he just has not developed this ability. Tell him to concentrate on some fast-paced games, then try switching back. It may also help to set up dual monitors, run one at 60Hz and the other at 120Hz in extended display mode, then put a window between them and move it around to see if the difference becomes obvious.

Keep in mind that LCDs have taken over so aggressively that you now run into a good number of people who never had much experience with CRTs. The few they did use were only running at 60Hz; go into almost any non-tech-savvy place and the CRTs were always set to 60Hz. So it does not surprise me that lots of people just do not have much to compare against.
i agree

pretty sure it has nothing to do with one's eyes themselves
i have bad genetics and bad eyesight - still good enough to bitch about 120hz not being smooth enough

in my theory this simply depends on one's brain processing speed
if you play competitive games and you play to win then your brain is simply forced to speed up and burn some energy
the results are the same as lifting weights
what was hard once eventually becomes easy and effortless

but now if you look at today's easymode games and the slow shitty monitors/hdtvs...
playing the average game today kinda became like watching a movie
you just walk right through it enjoying the scenery and the story
losing is impossible - you can be a "winner" without putting any effort into it
unfortunately, no reason to wake your brain from energy-save mode also means zero improvement
 
I'm sure there are people who can't tell the difference.

Look at DLPs, one in 100,000 can see the rainbow effect...

As for whether ATI vs. nVidia can show a difference: no, there isn't a difference that I can see on the Acer GD235HZ I have. I've used both a GTX 480 and a Radeon 5870 on it, and 120Hz looks as smooth on either in 2D, and similarly in 3D, with the only differences being FPS in that particular game/app.
 
I think your friend just can't tell in general desktop use. I'm not sure I could fault him for that; some people just aren't as observant. If he can't tell the difference in-game, I might suggest that he is using vsync. Again, less observant people might not notice how vsync gives the mouse that underwater feeling, but the visual performance would be similar.

Get him to turn vsync off (if it's on), set the monitor to 60Hz, and play for a bit. He should notice that the mouse feels more responsive since he's not "waiting" for vsync, but the image suffers from the tearing that vsync had been compensating for.

Then have him enable 120Hz with vsync off and play for a bit. He should notice better mouse response and better performance than 60Hz with vsync on, as far as fluid images are concerned.

If he doesn't notice a difference, he's probably not the best gamer and probably wasted his money, because he is incapable of seeing the difference. You can lead a horse to water...

I ordered my 120Hz LCD today, an LG W2363D. I went to a brick-and-mortar store to make sure it was what I expected, and it did not disappoint. About time we get CRT performance on LCDs; I never understood how the market could go so long with such a step backward in performance. I think your friend might be the reason for that, because sadly he represents the majority of consumers.

EDIT: I might add that depending on the game he's playing and the settings in those games, he might not be getting framerates above 60. He could be CPU-bottlenecked and unable to increase framerates despite having enough graphics horsepower. However, he should still be able to notice a difference even if only getting 60fps. I'm just throwing out the fact that having a machine capable of 120fps is the only way to truly enjoy every last drop of 120Hz goodness, and he may be falling very short of that number.
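
EDIT 2: if he'd rather not rely on a game at all, a bare-bones moving-bar test works too. Here's a rough sketch of what I mean (assuming Python with pygame 2 installed; pygame's vsync flag is only a request and, as far as I know, needs the SCALED or OPENGL mode, so treat this as a starting point rather than a guaranteed-accurate tool). The bar moves at the same real-world speed regardless of refresh rate, so at 120Hz it should simply look smoother and be easier to track with your eyes:

Code:
import pygame

pygame.init()
# vsync=1 is only a request; in pygame 2 it needs the SCALED (or OPENGL) flag
screen = pygame.display.set_mode((800, 200), pygame.SCALED, vsync=1)
clock = pygame.time.Clock()

SPEED = 600.0                      # pixels per second, identical at 60Hz and 120Hz
x = 0.0
running = True
while running:
    for event in pygame.event.get():
        if event.type in (pygame.QUIT, pygame.KEYDOWN):
            running = False
    dt = clock.tick() / 1000.0     # seconds since the previous frame
    x = (x + SPEED * dt) % 800
    screen.fill((255, 255, 255))
    pygame.draw.rect(screen, (0, 0, 0), (int(x), 0, 20, 200))
    pygame.display.flip()          # waits for vblank when vsync is actually honored
    pygame.display.set_caption("measured fps: %.0f" % clock.get_fps())

pygame.quit()

Watch the title bar: if it reads around 120 at the 120Hz setting and around 60 at 60Hz, vsync is doing its job and the comparison is fair.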
 
Skipping ahead here, but most 2D desktop use is static, and if you don't know what you are looking for, it could be missed. It's like a dead pixel toward a corner of an LCD that doesn't bother you until someone points it out, and even then, for the majority it's a non-issue. Most people will prioritize things like color accuracy, viewing angles, light bleed, uniformity, and the like, and simply pay little attention to the motion of a moving white cursor on a black background; others are anal and will track the mouse around the screen with their eyes, which some would find odd.

If you are watching video footage or playing a video game, and it's not shot or rendered at over 60 FPS, you are not going to see any difference, and with motion blur it will look smooth even at 30 FPS. Point is, the LCD cannot create frames that don't exist... well, HDTVs can, but you see what I'm getting at.

And let's not forget that there are visualphiles out there, just as there are audiophiles, who pride themselves on noticing small differences and tend to assign tremendous importance to them. It's like when my girlfriend and I were enjoying some wine recently: to me most of the stuff tastes "good" and gives a buzz. Yay. But no, to her, before you can even taste it you have to hold the glass just right so as not to minutely alter the temperature, and waft it into your nostrils and bask your nose hairs in the oaky afterbirth aroma or some such crap.
 
If anything, because video is usually filmed at 24 or 30 fps, with motion blur, you are going to see zero difference between running the display at 60 vs 120 hz.

Categorically false. Movies are filmed at 24 fps, which divides evenly into 120.
 
Could it have something to do with the way Windows 7 uses vertical sync differently from Windows XP on the desktop? Random thought/idea.

I know this has caused problems with some fullscreen Flash programs, causing massive tearing on Windows XP.
 
I'm sure there are people who can't tell the difference.

Look at DLPs, one in 100,000 can see the rainbow effect...

As for whether ATI vs. nVidia can show a difference: no, there isn't a difference that I can see on the Acer GD235HZ I have. I've used both a GTX 480 and a Radeon 5870 on it, and 120Hz looks as smooth on either in 2D, and similarly in 3D, with the only differences being FPS in that particular game/app.

I second that. I've run a 5830, a 4890, and a GTX 460 on a GD235HZ, and they all look really smooth at 120Hz. For kicks, I dropped my display (on the 460 now) to 60Hz, and the difference was immediately noticeable. Jarring, even. Then again, I'm the type of guy who cannot watch a 60Hz DLP because all I see are those stupid rainbows...

However, at work I have a couple of 60Hz LCDs, and they don't bother me at all. Things have always been noticeably smoother on my home PC, but until I looked at the difference back-to-back, the magnitude of the difference never really jumped out at me like that.
 
For normal everyday usage you won't really see much of a difference. It is when watching movies (especially fast-paced action) and gaming that 120Hz is going to be noticeable. And like others said, the graphics card doesn't make a difference in the Hz. Now I just need to pick one up for myself :D

I was under the impression monitors don't add the extra frames for smoothness. So wouldn't this have zero effect on movies? Since they're pre-rendered at 24fps or 24hz for that matter?
 
I was under the impression monitors don't add the extra frames for smoothness. So wouldn't this have zero effect on movies? Since they're pre-rendered at 24fps or 24hz for that matter?

Since movies are almost always 24fps, they can only be played back smoothly at refresh rates that are multiples of that frame rate (24Hz, 48Hz, 72Hz, 96Hz, 120Hz, etc.).

With 120Hz and 24p movie content (and 3:2 pulldown 30i content), each frame is displayed for the same amount of time, producing perfectly smooth playback. 120Hz also gives judder-free playback with native 30fps and 60fps progressive video.

With 60Hz and 24p movie content (and 3:2 pulldown 30i content), each frame has an uneven display duration, producing judder and jerky playback. 60Hz only gives judder-free playback with native 30fps and 60fps progressive video.
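
If it helps to see the arithmetic, here's a tiny Python sketch of the cadence (idealizing the film rate as a flat 24fps; the real 23.976 figure doesn't change the pattern):

Code:
import math

def cadence(film_fps, refresh_hz, frames=8):
    # refresh tick on which each film frame starts, then the gaps between starts
    starts = [math.floor(i * refresh_hz / film_fps) for i in range(frames + 1)]
    return [starts[i + 1] - starts[i] for i in range(frames)]

for refresh_hz in (60, 120):
    pattern = cadence(24, refresh_hz)
    times_ms = [round(n * 1000.0 / refresh_hz, 1) for n in pattern]
    print("%3d Hz  refreshes per film frame: %s" % (refresh_hz, pattern))
    print("        on-screen time per frame (ms): %s" % times_ms)

At 60Hz it comes out as the alternating 2, 3, 2, 3 cadence (33.3 ms, then 50 ms: the pulldown judder), while at 120Hz every frame gets exactly 5 refreshes of 41.7 ms each.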
 
Since movies are almost always 24fps, they can only be played back smoothly at refresh rates that are multiples of that frame rate (24Hz, 48Hz, 72Hz, 96Hz, 120Hz, etc.).

With 120Hz and 24p movie content (and 3:2 pulldown 30i content), each frame is displayed for the same amount of time, producing perfectly smooth playback. 120Hz also gives judder-free playback with native 30fps and 60fps progressive video.

With 60Hz and 24p movie content (and 3:2 pulldown 30i content), each frame has an uneven display duration, producing judder and jerky playback. 60Hz only gives judder-free playback with native 30fps and 60fps progressive video.

I think you're confusing syncing a movie to the monitor with adding fake frames to make it appear smoother. You can't add what's not there (aka fake frames); that is why it's recommended to turn this off for movies. Some people like it... I personally don't.
 
Bzzzz, wrong, 30p is the most common format, and that "naked" 24p format has its own issues.
Ducman69, you don't know what you're talking about.

24000/1001, aka 23.976 fps, is the default frame rate for movies, film, and TV shows.

Things filmed at 29.97 fps or 59.94 fps are by far the minority.

Bzzzz, wrong again, strike two! Frame interpolation generally produces a smoother image than native 24p. http://www.projectorcentral.com/judder_24p.htm
lol, you should really sober up before you post. Since when was this topic about frame interpolation? This topic is about 120Hz PC monitors, which don't even support frame interpolation. The fact that things filmed at 24p aren't perfectly smooth to begin with is all the more reason not to make the problem worse by using a non-multiple refresh rate like 60Hz.

I think you're confusing syncing a movie to the monitor with adding fake frames to make it appear smoother. You can't add what's not there (aka fake frames); that is why it's recommended to turn this off for movies. Some people like it... I personally don't.
For the record, my entire previous post in response to you was about syncing the movie frame rate (23.976 fps, aka 24p) to the monitor refresh rate. If you don't use a refresh rate that is a multiple of the video frame rate, you will have added jerkiness, which is especially noticeable during fast motion and panning scenes. I personally can't stand the jerky playback I get watching 23.976 fps on a 60Hz LCD, but some people don't even notice it because they've gotten so used to it. These same people are likely the ones who don't notice the increased smoothness of 120Hz monitors either.
 
Look at DLPs, one in 100,000 can see the rainbow effect...

DLP varies a lot. There are different color wheel speeds and configurations. I had an Optoma XGA DLP projector that... Well, I forget if it had a 2x or 4x wheel now. I could see the color separation all the time. Then I replaced it with a Marantz VP12S4 with a 5x 7-segment wheel and I could almost never see a rainbow with it - only once in a great while, even when I tried to see them. I converted a couple of DLP-hating friends with that machine.
 
This friend is not crazy. The same thing is happening to me. The Windows desktop is NOT at 120Hz. I recently acquired the Acer HN274H 120Hz 3D monitor. On the ASRock Vision 3D HTPC, the Windows 7 desktop is smooth as butter. It looks amazing. However, on my main machine, which has Vista and a GTX 470, Windows looks choppy. I tried swapping in a spare GPU, an AMD 5570; still choppy. Now I have installed Windows 7 and, using the 5570, it still looks choppy. I have FRAPS running, and Windows does not go above 60fps. Something is wrong.

This is not an issue of not being able to tell the difference. The difference is readily apparent. It is literally night and day, and if you disagree you clearly have never seen it. I don't want to get into an argument here, but there have been studies where pilots were able to recognize pictures of planes if they were only shown for 1/200th of a second. The human eye is capable of far more than 60Hz, not sure where this disinformation came from.

So has anyone found the fix? Is there some setting in Windows to get this to work?
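
One thing I can still check on my end, in case anyone thinks it matters: whether desktop composition (Aero/DWM) is even enabled, since with it off the desktop isn't composited at all and, as far as I understand, FRAPS only measures the desktop through DWM anyway. A rough Python + ctypes check (just my own guess at a diagnostic, nothing official):

Code:
import ctypes

enabled = ctypes.c_int(0)
# DwmIsCompositionEnabled() fills in a BOOL saying whether Aero composition is on
ctypes.windll.dwmapi.DwmIsCompositionEnabled(ctypes.byref(enabled))
print("DWM composition enabled: %s" % bool(enabled.value))

If that comes back False, turning Aero back on is the first thing I'll try before blaming the cards.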
 
never heard of windows 7 only redrawing at 60fps
i once tested 240hz/fps with my crt - no problem
 
I don't want to get into an argument here, but there have been studies where pilots were able to recognize pictures of planes if they were only shown for 1/200th of a second. The human eye is capable of far more than 60Hz, not sure where this disinformation came from.
I'd be careful, because that statement will get you into an argument. The human eye is not some sort of digital device with a sampling rate; standard monitor measurement techniques may not make sense as applied to human vision. I think we can agree that people can see the difference in fluidity of motion past 60Hz, but I don't think that 1/200th-of-a-second test is applicable.
 
I've got terrible eyes but can easily tell the difference between 50 and 60Hz, or 60 and 72/75Hz. So I can't even imagine 120Hz (I haven't used a CRT or a 120Hz LCD in ages).
I can hardly imagine someone not being able to tell the difference, tbh, even on the desktop. Just swap back and forth between the two refresh rates while playing around with Aero and the windows.

Also, I wonder why movie playback at 60Hz doesn't bother more people than it does; it's really quite awful. Close to giving me a headache in some films.
 
Also, I wonder why movie playback at 60Hz doesn't bother more people than it does; it's really quite awful. Close to giving me a headache in some films.

Most people watch camrip movies on megavideo. Do you really think something like simple jerkiness would bother them? :(
 