Hey guys, trying to get a handle on this, and I'm hoping you can help. Last week I helped a friend of mine build a new system. We were pretty intrigued by the new batch of 120hz LCDs out there -- we've always lamented the fact that 60hz became the standard when we all transitioned away from CRTs (it's fine for most people, but if you were used to the fluidity of a high-end CRT, it was less than ideal). Long story short, he ended up snagging an Asus VG236 to go with his GTX580, and once we fired it up and set the Windows refresh rate to 120hz, we were NOT disappointed. Immediately, we could see the difference just dragging windows across the screen. The improvement was like a breath of fresh air after all these years. I was so impressed by the everyday Windows improvement alone that I convinced a second friend to buy the monitor.

This is where I get confused. He swears up, down, and sideways that he cannot see any discernible difference between the 60hz and 120hz settings on his display. Before you ask: he is indeed running dual-link DVI, and the monitor's built-in OSD confirms a 120hz input signal. The only difference is that he is running it off a Radeon 5770.

My question to you guys is this: nVidia and AMD users, do you notice a discernible difference between these two refresh-rate settings on native 120hz LCDs in everyday Windows use? My hunch is that AMD has cheated a bit in their 2D implementation, and even though the signal is output at 120hz, discrete 120hz frames are not actually being sent to the display. (I read about a lot of flicker issues at 120hz settings in previous driver releases -- maybe they implemented something like this as a workaround?)

I have searched everywhere, but I can't find anything on the net that tests the 2D/desktop performance of these monitors on different GPUs. To me, this is a huge selling point, and nobody is talking about it. If you're going to be working in front of a machine all day, 120hz is a much nicer experience.
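
For anyone who wants to sanity-check their own setup, here's roughly the quick-and-dirty test I'm planning to run on his machine. It's Python + pygame (I'm assuming a pygame 2.x build, since the vsync option needs it), so treat it as a sketch rather than a proper benchmark. The idea: with vsync on, the average time between buffer flips should come out around 8.3 ms if the card is really delivering 120 discrete frames per second, and around 16.7 ms if it's effectively stuck at 60.

# Quick-and-dirty frame-pacing check -- a sketch, not a benchmark.
# Assumes pygame 2.x (the vsync option needs the SCALED or OPENGL flag).
import time
import pygame

pygame.init()
screen = pygame.display.set_mode((1280, 720),
                                 pygame.FULLSCREEN | pygame.SCALED,
                                 vsync=1)

samples = []
x = 0
for frame in range(600):      # roughly 5 seconds at 120hz
    pygame.event.pump()       # keep the window responsive
    t0 = time.perf_counter()
    screen.fill((0, 0, 0))
    # Move a bar a fixed distance per frame -- same idea as dragging a window:
    # at a true 120hz this should look noticeably smoother than at 60hz.
    x = (x + 8) % screen.get_width()
    pygame.draw.rect(screen, (255, 255, 255), (x, 100, 60, 400))
    pygame.display.flip()     # blocks until the next refresh when vsync is honored
    samples.append(time.perf_counter() - t0)

pygame.quit()
avg_ms = 1000 * sum(samples) / len(samples)
print("average frame time: %.2f ms (~%.0f Hz)" % (avg_ms, 1000 / avg_ms))

It won't prove everything (the driver could still be duplicating frames on the wire), but it would at least tell us whether the desktop swap chain on his 5770 is actually pacing at 120hz or quietly running at 60.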