I hate LCD for gaming. Even on the fastest displays I can still see ghosting, or blurring, or whatever you want to call the slow pixel response time.
Plasmas are much much better for that.
My only issue with my 60" plasma is its inability to accept and display a signal above 60Hz, even though it advertises a 600Hz sub-field refresh (would it really be so hard for them to allow the TV to display a true 120Hz refresh rate while hooked up to a PC?).
60Hz sucks for me because I can see the screen flickering. The flicker usually goes away for me around 75Hz, but I prefer 85Hz minimum; 120Hz is a dream.
I remember playing Quake 3 at a true 120Hz on my old CRT. It did more for realism than any 3D bullshit or other special effects.
And when I say 120Hz, I mean 120Hz native. None of this stupid upscaling, frame-interpolation gimmick BS that LCD TVs use.
I wonder about this myself!! Why can't our TVs take it? Our video cards are powerful enough to do it. The TV will supposedly take a 120Hz signal from a compatible Blu-ray player with the proper cable (does the cable really matter when it comes to 120Hz??), assuming it's a TV that offers real 120Hz and not some software frame-smoothing gimmick, which I've heard are actually out there...
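On the cable question: the numbers say it can matter. Here's a rough back-of-the-envelope sketch (Python, assuming the standard CEA-861 1080p raster of 2200x1125 total pixels per frame) comparing what a mode demands against the HDMI cable certification points I'm aware of:

```python
# Back-of-the-envelope: what pixel clock does a given mode need, and does
# it fit the HDMI cable categories of the era? The 2200x1125 totals are
# the standard CEA-861 1080p raster (active pixels + blanking); real modes
# vary, so treat the output as an estimate.

def tmds_clock_mhz(h_total, v_total, refresh_hz):
    """Required pixel clock in MHz for the given total raster and refresh."""
    return h_total * v_total * refresh_hz / 1e6

MODES = {
    "1080p @ 60Hz":  (2200, 1125, 60),
    "1080p @ 120Hz": (2200, 1125, 120),
}

# HDMI cable certification points (MHz): Category 1 "Standard" is tested
# at 74.25 MHz, Category 2 "High Speed" at 340 MHz (also the single-link
# TMDS ceiling for HDMI 1.3/1.4).
LIMITS = {"Standard (Cat 1) cable": 74.25, "High Speed (Cat 2) cable": 340.0}

for name, mode in MODES.items():
    clk = tmds_clock_mhz(*mode)
    print(f"{name}: needs ~{clk:.1f} MHz")
    for cable, limit in LIMITS.items():
        verdict = "OK" if clk <= limit else "too fast"
        print(f"  {cable} ({limit} MHz): {verdict}")
```

So 1080p60 already needs more than a Standard (Category 1) cable is certified for, and 120Hz doubles that to ~297MHz. That still fits under the 340MHz single-link ceiling, though, which suggests the usual bottleneck is the TV's electronics, not the wire.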
So why can't we do it from a PC? Why does the driver gray out those options once it talks to our TV? HDMI is a two-way thing. Does it display proper native 120Hz from your video card on those 120Hz 3D displays? Wouldn't it have to??
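My understanding of the graying-out: the video card reads the TV's EDID block over HDMI's data channel and only offers the modes the TV advertises. If the plasma never lists 120Hz, the driver hides it. Here's a minimal sketch of pulling the detailed timings out of a raw EDID dump, assuming Linux (the connector path below is an example; check /sys/class/drm/ for yours):

```python
# Your GPU "grays out" modes because it reads the TV's EDID and only offers
# what the TV advertises. This parses the four 18-byte detailed timing
# descriptors out of the 128-byte EDID base block (bytes 54..125).

EDID_PATH = "/sys/class/drm/card0-HDMI-A-1/edid"  # example path -- yours differs

def detailed_timings(edid: bytes):
    """Yield (width, height, refresh_hz) from the base block's descriptors."""
    for off in range(54, 126, 18):
        d = edid[off:off + 18]
        pixclk = int.from_bytes(d[0:2], "little") * 10_000  # stored in 10 kHz units
        if pixclk == 0:
            continue  # not a timing descriptor (monitor name, serial, etc.)
        h_active = d[2] | ((d[4] & 0xF0) << 4)
        h_blank  = d[3] | ((d[4] & 0x0F) << 8)
        v_active = d[5] | ((d[7] & 0xF0) << 4)
        v_blank  = d[6] | ((d[7] & 0x0F) << 8)
        refresh = pixclk / ((h_active + h_blank) * (v_active + v_blank))
        yield h_active, v_active, refresh

with open(EDID_PATH, "rb") as f:
    edid = f.read()

for w, h, hz in detailed_timings(edid):
    print(f"{w}x{h} @ {hz:.2f}Hz")
```

Caveat: TVs stash most of their modes in the CEA extension block, which this skips; a tool like edid-decode does the full job. But if 120Hz shows up nowhere in the dump, that's your answer.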
And if a 120Hz 3D LCD can do it, why not a 600Hz plasma? Or at the very least, a 3D plasma??? Or can some of them, provided you have the right model and the right settings? What does it take to get the right model? And which LCDs can you plug into your PC to really get 120Hz?
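For what it's worth, you can try shoving a 120Hz mode at the TV even when its EDID doesn't list one. A hypothetical Linux/X11 sketch using cvt and xrandr follows; the output name "HDMI-1" is an assumption, so substitute whatever `xrandr --query` shows. Worst case, the TV just reports an unsupported signal:

```python
# Force a custom mode the TV didn't advertise (Linux/X11 only). Whether
# the panel actually syncs to it is entirely up to the TV's electronics.

import re
import subprocess

OUTPUT = "HDMI-1"        # assumed connector name -- see `xrandr --query`
W, H, HZ = 1920, 1080, 120

# Ask cvt to compute a modeline for the target resolution and refresh.
out = subprocess.run(["cvt", str(W), str(H), str(HZ)],
                     capture_output=True, text=True, check=True).stdout
m = re.search(r'Modeline\s+"(\S+)"\s+(.+)', out)
name, timings = m.group(1), m.group(2).split()

# Register the mode, attach it to the output, then switch to it.
subprocess.run(["xrandr", "--newmode", name, *timings], check=True)
subprocess.run(["xrandr", "--addmode", OUTPUT, name], check=True)
subprocess.run(["xrandr", "--output", OUTPUT, "--mode", name], check=True)
```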
Argh... I hate TVs sometimes. I build computers all the time, and when I plug this in here and that in there, I know what I'm going to get. I always thought TVs were even simpler beasts, but evidently I know far less about them than I thought I did.