LCD's = crap because...

SHiZNiLTi
Thanks for viewing my thread. Below I have a couple of questions regarding wide-screen gaming and field of view benefits that I need answered; if you can take the time to answer them I would greatly appreciate it.

OK, so here is a little info on my past gaming experience with LCDs and CRTs...

I'm a serious gamer in 1st person shooters and have a ton of experience, as you can see from my Xfire profile. I always get 1st place in every server, on every map I play, in COD2, COD4, and Joint Operations. I'm very competitive about keeping this 1st spot and train every day to hold the position....

[image: profilequ6.jpg]


Currently I'm using a 21" Sony Trinitron FD CRT that I love, which does 1600 rez @ 100Hz and 1280 rez @ 140Hz.

I've recently tried out a 22" 2ms Samsung and a 22" 2ms Viewsonic LCD that I purchased only to test and then return, and I was very disappointed in their performance.

While running COD2 @ 1680x1050 on these LCDs, I noticed something very different about how the game looked and felt performance-wise. It seemed like I was lagging when making quick movements; looking around quickly just wasn't as smooth as on my CRT, and the LCDs seemed almost blurry to me in fast action. Because of this I wasn't able to get off the quick headshot snapshots as easily as I could with my CRT.

OK, so with that said, you can tell I'm biased against using LCDs. But I'm in the process of building a new PC setup for a squad mate, and he keeps getting told by his peers that he shouldn't get a 21" Sony Trinitron FD CRT for $150 and should instead drop $300 on a 22" LCD, when his main focus is FPS gaming.

The only thing that I see as an unconfirmed advantage for LCDs is the wider field of view that you get when running in wide-screen mode. The questions that I have are simple...

1.) Does COD2 & COD4 wide-screen mode really give you a wider field of view than the standard 4:3 that I'm accustomed to with my 21" CRT?

2.) Is there a 22" LCD out there that is better than these two models that I tested, one which can perform better than a Trinitron FD?

I just find it hard to believe that running in wide-screen mode actually lets you see more of the map to the left and right of the screen.
 
I think that LCDs are far superior for a couple of reasons.
#1 As far as I know there is no HDMI input for CRTs, which gives a better signal and better picture.

#2 More R&D has been put into LCDs since 2000, and they are probably superior in ways I don't even know.

From personal experience I like the clearer, crisper picture, and although it takes some getting used to, it's even harder to go back. Another aspect you may be disappointed by is that widescreen monitors run a higher resolution, and this taxes your system, which can result in glitchy gameplay.
 
I think that LCDs are far superior for a couple of reasons.
#1 As far as I know there is no HDMI input for CRTs, which gives a better signal and better picture.

#2 More R&D has been put into LCDs since 2000, and they are probably superior in ways I don't even know.

From personal experience I like the clearer, crisper picture, and although it takes some getting used to, it's even harder to go back. Another aspect you may be disappointed by is that widescreen monitors run a higher resolution, and this taxes your system, which can result in glitchy gameplay.

The issue is not really about picture quality; I do agree some LCDs have better image quality. But the key thing here is that it's being used by a pro gamer who gets about 6-8 hours of practice in a day. Performance is the main thing that I'm looking at, and the best reaction time on the monitor. When I tested the LCDs I ran them at 1680x1050, and on my CRT I used 1600x1200, so that wasn't skewing anything. They do make CRTs with DVI input, but none with HDMI.

Anyone else able to answer the question about whether widescreen mode gives a larger field of view, or does it just stretch the image without distortion? Thanks!
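
Side note on those test resolutions: if anything, the resolution difference favored the LCDs, since the CRT mode pushes more pixels per frame. Quick arithmetic (plain Python, nothing fancy):

Code:
# Pixels per frame at the two test resolutions mentioned above.
lcd_px = 1680 * 1050   # the 22" LCDs' native mode
crt_px = 1600 * 1200   # the Trinitron test mode

print(f"LCD: {lcd_px:,} px   CRT: {crt_px:,} px")
print(f"the CRT pushes {crt_px / lcd_px - 1:.1%} more pixels per frame")
# LCD: 1,764,000 px   CRT: 1,920,000 px -> the CRT pushes ~8.8% more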
 
§·H·ï·Z·N·ï·L·T·ï;1031675574 said:
The issue is not really about picture quality; I do agree some LCDs have better image quality. But the key thing here is that it's being used by a pro gamer who gets about 6-8 hours of practice in a day. Performance is the main thing that I'm looking at, and the best reaction time on the monitor. When I tested the LCDs I ran them at 1680x1050, and on my CRT I used 1600x1200, so that wasn't skewing anything. They do make CRTs with DVI input, but none with HDMI.

Anyone else able to answer the question about whether widescreen mode gives a larger field of view, or does it just stretch the image without distortion? Thanks!

Widescreen mode in COD2 and COD4 gives you a wider field of view with the same vertical angle. You do see more with widescreen.
 
§·H·ï·Z·N·ï·L·T·ï;1031675454 said:
I just find it hard to believe that running in wide-screen mode actually lets you see more of the map to the left and right of the screen.

Whether or not a game widens the FOV for a widescreen aspect ratio is up to the developers; some do, some don't. It's not right or wrong either way, despite what some people might claim.

§·H·ï·Z·N·ï·L·T·ï;1031675454 said:
Anyone else able to answer the question about whether widescreen mode gives a larger field of view, or does it just stretch the image without distortion? Thanks!

Without distorting the image there are two options: either keep the vertical FOV constant and display a wider FOV horizontally on a wide screen, or keep the horizontal FOV constant and display more at the top and bottom on a 4:3 monitor. Any game that supports widescreen modes will do one of the two, and the image won't be distorted either way. Any game that doesn't support widescreen will either fill the screen with the same image on both types (causing distortion) or show black bars on the sides of the wide screen, though LCD monitors usually have options to control that.
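
To put rough numbers on the first option (the "keep vertical FOV constant" case): the horizontal FOV follows from the aspect ratio. This is just a sketch of the geometry, not anything from the games' code, and the 80-degree 4:3 horizontal FOV below is an assumed typical default, not a confirmed COD value:

Code:
import math

def hfov(vfov_deg, aspect):
    # Horizontal FOV implied by a vertical FOV and an aspect ratio.
    v = math.radians(vfov_deg)
    return math.degrees(2 * math.atan(math.tan(v / 2) * aspect))

# Recover the vertical FOV that an 80-degree horizontal FOV implies on 4:3
# (the 80-degree default is an assumption):
vfov = math.degrees(2 * math.atan(math.tan(math.radians(80) / 2) * 3 / 4))

print(f"vertical FOV:     {vfov:.1f} deg")                # ~64.4 deg
print(f"4:3 horizontal:   {hfov(vfov, 4 / 3):.1f} deg")   # 80.0 deg (sanity check)
print(f"16:10 horizontal: {hfov(vfov, 16 / 10):.1f} deg") # ~90.4 deg

So at the same vertical angle, 16:10 shows roughly ten extra degrees side to side, which lines up with the screenshots posted later in the thread.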
 
I think that LCDs are far superior for a couple of reasons.
#1 As far as I know there is no HDMI input for CRTs, which gives a better signal and better picture.

#2 More R&D has been put into LCDs since 2000, and they are probably superior in ways I don't even know.

From personal experience I like the clearer, crisper picture, and although it takes some getting used to, it's even harder to go back. Another aspect you may be disappointed by is that widescreen monitors run a higher resolution, and this taxes your system, which can result in glitchy gameplay.

Terrible arguments - HDMI means nothing over VGA, as they are completely different types of displays. As for the second part - that's 7 years of R&D. CRTs have been around since what, 1956? How about 44 years of research and development.

But yes, 16:10 widescreen LCD monitors will give the wider field of view. For that matter, so will the 16:10 Sony FW900 24" CRT monitor that can be had for ~$300 on eBay, much better than ANY 22" LCD out there. Supposedly they look even better than the P1130 / G520, which, if I'm not mistaken, is what you have; I have one myself.
 
I just took a screenshot at 1680x1050 and another at 1280x1024.

I resized each to 50% of original for easier viewing.

Here they are:

[image: Copyofshot0003_resize.jpg]


[image: Copyofshot0001_resize.jpg]
 
HDMI = DVI (my mistake). I did not know that any CRTs actually used HDMI, so I learned something new.
 
Terrible arguments - HDMI means nothing over VGA, as they are completely different types of displays. As for the second part - that's 7 years of R&D. CRTs have been around since what, 1956? How about 44 years of research and development.

But yes, 16:10 widescreen LCD monitors will give the wider field of view. For that matter, so will the 16:10 Sony FW900 24" CRT monitor that can be had for ~$300 on eBay, much better than ANY 22" LCD out there. Supposedly they look even better than the P1130 / G520, which, if I'm not mistaken, is what you have; I have one myself.

I've always wanted that 24" CRT...

http://www.accurateit.com/details.asp?iid=1134

So I guess it is true that the wider screen will give you an advantage in seeing the enemy 1st in COD4 & COD2.

Anyone else able to confirm this by doing test screenshots?

EDIT: kumquat 2[H]4U, thanks for the SS. It looks like widescreen does give you an advantage.
 
The delay you probably saw was also the ghosting issue some LCDs have during fast movements. LCDs have their place, but if you want something professional for work, then you usually need to start spending $1000+ for an LCD.


For me, the fact that I can game for hours on an LCD without getting a headache is advantage enough. I'm sure if you used the LCD long enough you would adjust from the CRT.
 
With all current LCDs being limited to 60Hz refresh, or maybe 76Hz for some, you will never be able to achieve the same smooth high fps as on the CRT. But the test with the two 22"ers might not have been sufficient, because input lag (the 2nd problem) can differ greatly between models, even if they are from the same manufacturer.
http://www.digitalversus.com/duels.php?ty=6&ma1=36&mo1=265&p1=2507&ma2=36&mo2=224&p2=2104&ph=12
If you've tested a non-lagging Samsung 226BW, that's about as good as it gets in terms of responsiveness right now (almost no lag, but still a limited refresh rate).

More expensive, non-TN monitors will mostly perform worse, as expensive monitors like the ones from NEC or EIZO feature image enhancers that add to the lag. Almost all 30"ers lag less (lagging "only" ~16ms) than most smaller high-quality non-TN models, because the big ones haven't got any complicated internals (image enhancers weren't available for that resolution; Gateway and EIZO are now implementing some that could possibly add lag).

For sitting in front of the box all the time doing work or even graphics, I'll take an LCD's clear picture any time. (I think even my sight got better after switching from a 19" Sony CRT to an 18" Samsung LCD many years ago; maybe my eyes relearned to focus properly ^^; )

But for responsiveness there is still no way around a CRT... which could, in theory, be run at a letterboxed 16:10 resolution, something you could even try on your own CRT to clear up the widescreen issue... ^_^
 
The delay you probably saw was also the ghosting issue some LCDs have during fast movements. LCDs have their place, but if you want something professional for work, then you usually need to start spending $1000+ for an LCD.

See above; picture quality has its price, sometimes even costing responsiveness. The cheap TN panels ghost less than most expensive models.
Ghosting will only blur things and cause a very minimal delay; lag and refresh rate will have a greater impact on playability.
 
You don't need a widescreen monitor to get the wider FOV. You can select the widescreen resolution in the game while still using a 4:3 CRT monitor. This will give you the widescreen image, but the monitor will probably stretch it vertically. Use the monitor controls to shrink the image and add black bars to the top and bottom; then you'll have the wide FOV with the responsiveness of a CRT.
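
The geometry for this is simple if you want to work out the bar size; a quick sketch (1600x1200 is just an example 4:3 mode, and you can get the same result with a custom resolution or the monitor's vertical size control):

Code:
# Letterboxing a 16:10 picture on a 4:3 CRT: keep the full width and
# shrink the height so the geometry stays correct (square pixels).
crt_w, crt_h = 1600, 1200            # example 4:3 CRT mode
wide_h = crt_w * 10 // 16            # height a 16:10 image needs at this width

print(f"letterboxed mode: {crt_w}x{wide_h}")                    # 1600x1000
print(f"black bars: {(crt_h - wide_h) // 2} px top and bottom") # 100 px each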
 
I have a 21" Sony and it runs @ 75Hz 1600x1200 and it still gives me headaches

I know it might sound weird, but in my experience the video card can play a role in this. A while back I had to send off my main card for RMA, and the temp card I was using (Nvidia FX5500) was causing noticeable flicker on my CRT even at 85Hz. I remember repeatedly checking the refresh rate, and sure enough, even the monitor itself was displaying 85Hz on the OSD. It would bother me after extended use, but once my main card arrived (7800GTX 512MB) the flickering was gone @ 85Hz. Not saying this is definitely the case for you, but it's worth a mention.
 
This is a troll post.

And to the others: refresh rate limits frame rate, but it doesn't make your game run poorly. Go do some research before you just regurgitate something you heard somewhere else as fact. Smoothness is a perception, and for a game to "feel smooth" it must be ~40-60fps depending on how sensitive you are. The human eye cannot perceive above 24fps, period. For it to seem smooth it must have a higher frame rate for multiple reasons: frame rates have a propensity to drop in action, thus making the game feel choppy. Move your mouse slowly with a game cranked up enough to drop you to a low frame rate, say in the 20s; you won't notice it much. Move your mouse fast, and it'll jitter. Your eyes don't notice big black blank screens in between, they notice the jitter. A 60 hertz refresh rate means the picture is updated 60 times per second. Period. Which means you can at most have a new image 60 times per second, but the screen won't refresh less often than that either.

To the OP, stop trolling. If you've played games so **%&(&@%(*% much, you would know what FoV is (Field of View, for the uninformed). It's the same principle with 16:10 as opposed to 4:3. Imagine that: you see more left and right!

"I just find it hard to believe that running in wide-screen mode actually lets you see more of the map to the left and right of the screen." ... Widescreen TVs must really baffle you. How about movie theaters? Man, everyone's a liar; just because you don't believe it doesn't mean much.

You play seriously? Yeah... the thing that's sad is, you bought a 22" panel, so you obviously don't play seriously. There are a hundred reasons why 22" panels are crap. Do some research though.

Think before you post; you were inciting a flame war, that's it. And the few people who took your crap seriously just wasted their life and brain cells by reading your post.

I don't usually call people out for stupidity, but... you're stupid.

Edit: And btw, if you actually do care, input lag is what matters when it comes to gaming. Ghosting doesn't hinder gaming; it's slightly annoying, but input lag is the only thing that would hold you back.

S-IPS panels have the least input lag. The Dell 2007WFP, NEC 2490WUXi, and 2690 are S-IPS panels; NEC also has a 20" or 22" S-IPS that's good as well.
 
This is a troll post.

The human eye cannot perceive above 24fps, period. For it to seem smooth it must have a higher frame rate for multiple reasons: frame rates have a propensity to drop in action, thus making the game feel choppy...

That's an old & busted rumor. The human eye can perceive well above 24fps. I'd bet 1 million bucks that if you did an A/B test with the same game running at a rock-solid 30FPS on one monitor and a solid 60FPS on another monitor, every person on the planet could tell you which monitor was playing the 60fps version. Actually, I remember back in the day 3dfx had a demo called "30/60" which also proved this point.
 
Tried a Trinitron? Headaches come from crappy 60Hz CRTs.

I owned 4x 19" Sony Trinitrons @ 100Hz 1600x1200 and I got headaches after hours of use. Then, going to an LCD with such clarity, I won't ever go back to CRT.
 
That's an old & busted rumor. The human eye can perceive well above 24fps. I'd bet 1 million bucks that if you did an A/B test with the same game running at a rock-solid 30FPS on one monitor and a solid 60FPS on another monitor, every person on the planet could tell you which monitor was playing the 60fps version. Actually, I remember back in the day 3dfx had a demo called "30/60" which also proved this point.

As said, it was busted. People "assume" you can't because of TV standards, but computers change that: video / TV uses a blurring effect between frames, computers don't, thus you can perceive WELL above 24, and some people over 100+. There are tests on the net for checking what you can see.
 
You also have to take into consideration that the game would be moving at a different frame rate than your monitor. Any difference over 30 fps is negligible.
 
That's an old & busted rumor. The human eye can perceive well above 24fps. I'd bet 1 million bucks that if you did an A/B test with the same game running at a rock-solid 30FPS on one monitor and a solid 60FPS on another monitor, every person on the planet could tell you which monitor was playing the 60fps version. Actually, I remember back in the day 3dfx had a demo called "30/60" which also proved this point.

Yeah, because that wouldn't be a great way to sell video cards.

As said, it was busted. People "assume" you can't because of TV standards, but computers change that: video / TV uses a blurring effect between frames, computers don't, thus you can perceive WELL above 24, and some people over 100+. There are tests on the net for checking what you can see.

Yeah, which would be limited by the monitor you are using to view the pages.

Frame rate has NOTHING TO DO WITH WHAT YOU PERCEIVE. IT'S NOT YOUR VISION SENSITIVITY. It's the fluidity of the game keeping it from stuttering. WHY IS THAT CONCEPT SO HARD TO UNDERSTAND?
 
Yeah, because that wouldn't be a great way to sell video cards.

It still provided enough evidence that the human eye can easily perceive above 30fps.

Frame rate has NOTHING TO DO WITH WHAT YOU PERCEIVE. IT'S NOT YOUR VISION SENSITIVITY. It's the fluidity of the game keeping it from stuttering. WHY IS THAT CONCEPT SO HARD TO UNDERSTAND?

Because that comment makes zero sense, that's why. If a game is running at 30fps, not stuttering, not 29fps but 30FPS, you could STILL *easily* tell the difference between 30fps and 60fps. See how that works?
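
For what it's worth, the per-frame numbers make the difference concrete. During a fast flick, the image jumps twice as far between successive frames at 30fps as at 60fps (the 180 degrees in 200ms below is just an assumed example speed):

Code:
# Angular jump between successive frames during a fast mouse flick.
FLICK_DEG, FLICK_S = 180, 0.2   # assumed example: a 180-degree turn in 200 ms

for fps in (30, 60):
    frames = fps * FLICK_S
    step = FLICK_DEG / frames
    print(f"{fps} fps: {frames:.0f} frames shown, {step:.0f} deg jump per frame")
# 30 fps: 6 frames shown, 30 deg jump per frame
# 60 fps: 12 frames shown, 15 deg jump per frame

A 30-degree hop between frames versus a 15-degree hop is exactly the kind of thing people report being able to see.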
 
As said, it was busted. People "assume" you can't because of TV standards, but computers change that: video / TV uses a blurring effect between frames, computers don't, thus you can perceive WELL above 24, and some people over 100+. There are tests on the net for checking what you can see.

It still provided enough evidence that the human eye can easily perceive above 30fps.

Because that comment makes zero sense, that's why. If a game is running at 30fps, not stuttering, not 29fps but 30FPS, you could STILL *easily* tell the difference between 30fps and 60fps. See how that works?

Do yourself a favor: go read some books, maybe learn something, then come back.
 
Bud, play hard, go pro.

Do yourself a favor: go read some books, maybe learn something, then come back.

You first. Tell me what the light refraction index is for the forward-most part of each monitor panel, then I'll accept your statements as even possibly true. I can tell you it is easy to see the difference between 30 and 60fps. It is true that this has a lot to do with how your brain processes flashy things and colors, but on a standard LCD monitor (read: a monitor that doesn't flicker the way a CRT does), it is easy to see the difference between 30 and 60fps.

BTW, I, just like mathesar, don't understand what you meant by this:

Frame rate has NOTHING TO DO WITH WHAT YOU PERCEIVE. IT'S NOT YOUR VISION SENSITIVITY. It's the fluidity of the game keeping it from stuttering. WHY IS THAT CONCEPT SO HARD TO UNDERSTAND?

A TV runs at 24fps for bandwidth reasons only.
 
A lot of people here are just... wow.

I think it's pretty easy to follow that; I'm not going to keep explaining it.

No fricken place did I relate 24fps to TV, and I know why it's 24 for TV in the US.

It's not in other places.
 
Quite right, and in those other places standard definition and high definition aren't 640 x 480 and 1280 x 720 (respectively) :nerdy:.
 
Do yourself a favor: go read some books, maybe learn something, then come back.

If you read books, you'd know that human vision is a continuous phenomenon. We don't see in frames like a computer displays them, nor do we have a discrete "refresh rate". While the firing frequency of optical ganglia is limited, its limit is quite high. Therefore, incoming visual information is almost like an analog signal rather than a digital one. We are constantly seeing things. Our optical ganglia are also tied to motor cells (which help our eyes move our gaze away from very bright objects instantly, like when accidentally staring down a laser pointer), which are able to perceive and coordinate action within about 13 ms (well under the ~42 ms per frame that "24fps" would account for).

24fps is NOT adequate under most situations. The figure is quoted for video displayed in movie theaters, which have a very distinct set of conditions. What are these conditions? Well, have you ever noticed that gaming in the dark makes the game seem smoother? I urge you to download the Crysis demo, wait till midnight, turn the settings up high, and go for it. Much smoother than with the lights on. Why is this, you ask? It's simple: afterimages. When our eyes perceive light, a photopigment in each receptor cell is isomerized, which sends an electrical-chemical pulse to the nearest ganglion cell. Six of these pulses must travel to the local ganglion before a signal is actually sent to the brain. Bright light isomerizes (bleaches) many pigments at once, and this puts great strain on the nerve fiber; when it reverts back to normal, the nerve fires in a pattern which codes for the negative afterimage of what you just saw. Look at a waterfall for an hour, then look at the ground: it will appear to move upwards, as this is the opposite of water moving downwards. Additionally, we have our own version of interframe blending, which helps fast-moving images stay in our field of view for a few milliseconds, even if they're not actually still there. The image is blurred (due to a lack of visual information), but it allows us to perceive what we need to notice the object. In summary: this afterimage is displayed for a few milliseconds between each frame. Coupled with interframe blending, you see a seamless image, with little to no stutter.

The NTSC television signal is sent at 30fps because of the conditions one usually watches TV in: lighter conditions where the isomerization isn't so pronounced. In addition, due to interlacing, where the screen is updated in two fields each frame, we can see much more motion with less television bandwidth. IMAX films are presented at 48fps for greater depth and photorealism. Independent studies on computer screens (with no interframe blending) have shown frame rates in the 50s as the baseline cutoff for fluidity...

So the antiquated notion of "24fps" was largely propagated by film buffs and early gamers. There is a LOT more information on this, but I honestly don't have time to get into it.
 
Don't forget that older-generation games usually required a much higher frame rate to move properly, due to "feature bugs" in the game's physics engine.

A great example is Quake 3. If your frame rate wasn't a steady 125 fps then you would not be able to move as fluidly and quickly as someone who was getting a solid 125 fps.

If you hard capped the FPS to 60 the game would visually look very smooth to your eyes, but you would be completely gimped for movement.

If you hard capped the FPS to 30 it would be so choppy that it would be nearly unplayable.

If you hard capped the FPS to 125 but ran a low refresh rate (even with v-sync off) you're going to see some amount of chop.

If you hard capped the FPS to 125 but ran your monitor at 120hz things became very smooth. I'd go as far as saying "liquid smooth".

I have yet to play Quake 3 on an LCD but I really doubt it would be as smooth as a CRT running at 120Hz (when the game is running at 125 FPS).

Most newer games seem to have a hard cap of 60 fps. In these cases it should look very smooth with an LCD as long as there's no ghosting and your computer is able to consistently pump out 60 fps.

The visual problem seems to crop up when the game demands an abnormally high frame rate.
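
For anyone curious why frame rate can change movement at all: a fixed-step physics integrator's result depends on the step size. This toy sketch is NOT Quake 3's actual movement code; it just shows a naive Euler integrator producing different jump apexes at different frame rates, with the frame time rounded to whole milliseconds the way Q3 does it (which is part of why 125 fps is special there). JUMP_VELOCITY=270 and gravity 800 are Quake-style constants used purely for illustration:

Code:
# Toy illustration of frame-rate-dependent physics: the same jump,
# integrated once per frame, peaks at a different height depending on fps.
JUMP_VELOCITY = 270.0   # Quake-style jump speed, units/sec (assumed)
GRAVITY = 800.0         # Quake-style gravity, units/sec^2 (assumed)

def jump_apex(fps):
    dt = round(1000 / fps) / 1000.0   # frame time, rounded to whole ms
    pos, vel = 0.0, JUMP_VELOCITY
    while vel > 0:                    # integrate until the apex
        pos += vel * dt               # naive explicit Euler step
        vel -= GRAVITY * dt
    return pos

for fps in (30, 60, 125):
    print(f"{fps:>3} fps: apex at {jump_apex(fps):.2f} units")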
 
§·H·ï·Z·N·ï·L·T·ï;1031675574 said:
The issue is not really about picture quality; I do agree some LCDs have better image quality. But the key thing here is that it's being used by a pro gamer who gets about 6-8 hours of practice in a day. Performance is the main thing that I'm looking at, and the best reaction time on the monitor. When I tested the LCDs I ran them at 1680x1050, and on my CRT I used 1600x1200, so that wasn't skewing anything. They do make CRTs with DVI input, but none with HDMI.

Anyone else able to answer the question about whether widescreen mode gives a larger field of view, or does it just stretch the image without distortion? Thanks!

I don't think you had to tell everyone you are a pro gamer. Your 1337 name gives you away! You must be pro!
 
S-IPS panels have the least input lag. The Dell 2007WFP, NEC 2490WUXi, and 2690 are S-IPS panels; NEC also has a 20" or 22" S-IPS that's good as well.

1. Look up the lag measurements: http://www.hardforum.com/showthread.php?t=1219987

The panel crystal technology (TN, IPS, PVA, MVA, ASV, whatever) has nothing to do with lag; it's the electronics driving the panel. Slower panels (PVA) tend to be paired with complicated electronics (overdrive etc.) that add lag: calculation, frame buffering, etc. The high-end IPS NECs all buffer a frame for LUT calculation, gamma correction and so on.

Dell has a panel lottery, and the NEC seems to be discontinued.
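
A quick sanity check on the frame-buffering point: each buffered frame costs one full refresh period, so one buffered frame at 60Hz is ~16.7ms, which lines up with the "~16ms" figure quoted for the 30"ers earlier in the thread. (The frame counts below are illustrative assumptions, not measurements of any specific model.)

Code:
# Processing lag from frame buffering: each buffered frame adds one
# refresh period. Frame counts are illustrative assumptions.
REFRESH_HZ = 60
frame_ms = 1000 / REFRESH_HZ          # ~16.7 ms per refresh at 60Hz

for frames_buffered in (0, 1, 2):
    print(f"{frames_buffered} buffered frame(s): "
          f"~{frames_buffered * frame_ms:.1f} ms of added lag")
# 0 -> 0.0 ms, 1 -> ~16.7 ms, 2 -> ~33.3 ms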

Do yourself a favor: go read some books, maybe learn something, then come back.

2. If you ever complain again, first read for yourself, so we don't have to explain what you should have read...

The old US NTSC TV standard is 60 fields per second: 60 interlaced half-pictures, making 30 full pictures but showing 60fps motion. HDTV supports full 60fps @ 1280x720.
The 24 fps figure is what was considered adequate back in black-and-white cinema because it is about double the fps needed for a human to interpret changing still pictures as (very jaggy) motion; anything under 12fps reads as a slideshow. 24fps is still used in feature films, but it would be perceived as extremely jaggy if it weren't for motion blur.
A non-blurry picture (a game) will not feel perfectly fluid unless its fps is above the rate at which the user would stop seeing flicker if it were used as a CRT's refresh rate.

As long as you can perceive flicker, you can see a change in motion.
If you can't tell the difference between 30 and 60 fps, get a CRT, turn it to 30Hz, and see if you're fine with it; after all, we all know CRT flicker was only a conspiracy made to sell high-refresh-rate CRTs and actually comes from poisonous plastic used in old monitors.
 
You play seriously? Yeah... the thing that's sad is, you bought a 22" panel, so you obviously don't play seriously. There are a hundred reasons why 22" panels are crap. Do some research though.

I assume playing seriously (as in pro sports) is not identical to playing for the sake of looking at beautiful pictures ("OOOh look, I got a tennis racket made of pure gold that makes a great sound when hitting the ball, I play seriously!!").

TN panels are the best for the job, even if they look like crap.
 
I'll take the non-technical route and say this:

- I can pick my monitor up with one hand. I can also rearrange my desk setup anytime.

- Oh, and my eyes don't hurt reading text for long periods of time.
LCDs FTW. :)
 
I'll take the non-technical route and say this:

- I can pick my monitor up with one hand. I can also rearrange my desk setup anytime.

- Oh, and my eyes don't hurt reading text for long periods of time.
LCDs FTW. :)

I also like my (too big to carry) LCDs, but the problem is, it's not about ergonomics ^_^
 
You also have to take into consideration that the game would be moving at a different frame rate than your monitor. Any difference over 30 fps is negligible.

OMG!! No it isn't. For many people there is a VERY noticeable difference between, say, 30FPS and 50FPS, as I said, if you bothered to read it. Don't think 30FPS / 24FPS = TV standard, which is where this myth came from. Computers and your TV use different methods to move from one frame to another.


Facts have been provided to blow this out of the water TIME AND TIME again, but people just don't believe it, yet they don't provide any facts to prove their own claim....
 
That argument holds zero water, since you could make the same argument about people playing at different resolutions on CRTs, Mr. "1337 gamer".
 
HDMI = DVI (my mistake). I did not know that any CRTs actually used HDMI, so I learned something new.

That's not what I meant - no CRTs use HDMI except certain 30+ inch CRT HDTVs. What I meant is that VGA has every capability to equal DVI/HDMI in signal and picture quality on CRT monitors.
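
The bandwidth numbers actually back this up. Single-link DVI tops out at a 165 MHz pixel clock, while the OP's CRT modes need considerably more over VGA. A rough estimate, assuming "1280 rez" means 1280x1024 and a typical ~35% blanking overhead (exact timings vary):

Code:
# Rough pixel-clock needs of the CRT modes vs. the single-link DVI limit.
DVI_SINGLE_LINK_MHZ = 165

def pixel_clock_mhz(w, h, hz, blanking=1.35):
    # blanking=1.35 is a typical CRT timing overhead assumption
    return w * h * hz * blanking / 1e6

for w, h, hz in ((1600, 1200, 100), (1280, 1024, 140)):
    mhz = pixel_clock_mhz(w, h, hz)
    verdict = "fits" if mhz <= DVI_SINGLE_LINK_MHZ else "exceeds single-link DVI"
    print(f"{w}x{h} @ {hz}Hz: ~{mhz:.0f} MHz ({verdict})")
# ~259 MHz and ~248 MHz: both well beyond what single-link DVI could carry,
# so analog VGA was doing something the digital links of that era couldn't.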
 