Thoughts on the IBM T221

colinstu

So one day I was browsing computer resolution articles on Wikipedia. I was curious to see if any desktop monitors supported resolutions past 2560x1600. This led me to stumble upon the IBM T221 monitor.

It came out in 2001-2002 and was discontinued in 2005. Supports 3840x2400 on a 22.2" screen! Holy crap! :D The only pitfalls: 1) 48Hz is the fastest refresh rate supported at that resolution, and 2) it needs special adapters to work with today's DVI cables rather than the proprietary solution that requires a ratty old Matrox card.

The thing is, there is a guy on eBay who sells these adapters bundled with the monitors. These monitors usually go for $1,500+... but for a crazy 0.1245 mm pixel pitch... that's basically half that of most monitors!
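For anyone who wants to check that number, here's a quick back-of-the-envelope Python sketch (the 3840x2400 resolution and 22.2" diagonal are the published specs; the 24" 1920x1200 comparison monitor is just an assumed example):

# Rough check of the T221's pixel pitch from its published specs.
import math

h_px, v_px, diag_in = 3840, 2400, 22.2
diag_px = math.hypot(h_px, v_px)      # pixels along the diagonal
ppi = diag_px / diag_in               # pixels per inch
pitch_mm = 25.4 / ppi                 # pixel pitch in millimetres
print(f"~{ppi:.0f} PPI, {pitch_mm:.4f} mm pitch")   # ~204 PPI, 0.1245 mm pitch

# Assumed comparison: a typical 24" 1920x1200 monitor works out to ~94 PPI,
# i.e. roughly 0.27 mm pitch, so the T221's pitch really is about half.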

Not too sure how well contrast, color gamut, etc. hold up on these decade-old monitors, but they're still impressive.

My question to you [H]... what is your opinion/thoughts on all of this? Personally the 48Hz kind of kills it for me because I'd prefer more fps while I play games, and while I like resolution, the pixel pitch on this might be just a bit too fine for me. But still, it's an interesting find and amazing something like this could be pumped out so long ago.
 

Yes, it is quite an exquisite and historic monitor. From what I remember reading, it pushes around a 350:1 CR. I doubt that it has a large colour gamut considering its age, and it may not have strong gamma regulation, which would put it behind more recent monitors.

There are modern 5 MP+ monitors like the Coronis Fusion 6MP and the Eizo FDH3601. They are designed for specialized industries such as medicine, satellite imaging (like hyperspectral and ground-penetrating imaging, serious intelligence stuff) and geology, with a price tag to match.

Recently there has been a push for very high PPI in mobile devices by Apple, who wants 2048x1536 in a 10" IPS module for its next iPad. The newest graphics cards still limit their outputs to 2560x1600, but an increase in graphics-segment LCD pixel density is by now overdue, so hopefully we may see new panels with higher resolution in a few years. Apple has hired a lot of display engineering talent who may get to oversee this in their desktop monitor products.
 

*nods* That's why I've been semi-hesitating on a 30" monitor purchase. That's been the max for... quite a while now o.o (2004 for the 30" ACD; the early Dells came years later)

And with multi-monitor (Eyefinity and Surround) setups and 3D trying to wrestle their way in... I dread how long it will take to get a resolution or PPI increase.
 
The typical CR is rated at 400:1, and that is what they usually reach AT LEAST. The color gamut is sRGB, and it uses a DD-IPS panel (DD: Dual-Domain).
I'm currently using a panel from the same manufacturer (IDTech) in my notebook, which is 2048x1536 at 15.0" (4:3), and it is quite excellent, with very good viewing angles; black takes on a purple tint at steep angles, but it is still better than most of the laptop panels available.
It is not a modern kind of IPS matrix, so the transmittance is relatively low, requiring a relatively strong backlight, and it thus consumes a lot of power.

IDTech touted high-PPI displays and IPS like no other manufacturer at the time (2000-2001). They also had a 20.8" 2048x1536 and a monochrome 21.3" 2560x2048 panel, along with many other panels for desktops and laptops.
Here's the data sheet of the panel: http://www.beyondinfinite.com/lcd/Library/IDtech/MD22292-B2.pdf
 
Played with 'em before at a previous job. They were pretty craptacular. The high resolution was the only thing they had going for them. They would routinely quit working or go bad.
 

Awesome info! I've tried finding panel/manufacturer info before but got nothing. Thanks!
 
It came out in 2001-2002 and was discontinued in 2005. Supports 3840x2400 on a 22.2" screen! Holy crap! :D

Holy crap indeed. I have two of these. I switched from a 30" Dell 3007WFP to one of these and never regretted it. There is just no going back after you have tried 3840x2400. The resolution trumps everything else. My wife is using my old Dell now, and every time she shows me something on her computer I realize just how vastly superior the T221 is to anything else out there.

The only pitfalls: 1) 48Hz is the fastest refresh rate supported at that resolution

This is really not an issue. It is a common misconception that you need 60Hz or whatever, propagated by people who don't have a clue about how the human eye works (or its spec ;) )

The human eye can only distinguish up to around 12 individual frames per second. Beyond that you can perceive flicker (depending on the contrast and brightness between the frames) up into the high teens of fps, maybe at a stretch up to 20fps if you really have exceptional eyes. Anything more than that is only perceivable as fluid motion.

When you watch a film at the cinema, do you find that to be jerky? Or does it look perfectly smooth? Movies are shot at 24fps. So what makes you think for one moment that you might need more than 48Hz for any purpose?

2) it needs special adapters to work with today's DVI cables rather than the proprietary solution that requires a ratty old Matrox card.

Not true. I have a DG5 with the 2x DL-DVI adapter (48Hz) and a DG1 without (it came with the old Matrox card, which I never plugged in), just running off 2x SL-DVI inputs (~30Hz, not finished fully dialing it in yet). I cannot perceive any difference between them in any application (gaming, watching videos, typical desktop/office applications).

The thing is, there is a guy on eBay who sells these adapters bundled with the monitors. These monitors usually go for $1,500+... but for a crazy 0.1245 mm pixel pitch... that's basically half that of most monitors!

Worth every penny. :)
But you can probably find a good T221 for less. One thing I've noticed, though - the DG1 seems to exhibit some ghosting, which the later DG5 does not. I don't find it a big deal, but it is worth mentioning.

Not too sure how well contrast, color gamut, etc. hold up on these decade-old monitors, but they're still impressive.

They're pretty good, actually. I've not found them to be in any way lacking.

My question to you [H]... what is your opinion/thoughts on all of this?

Let me put it this way - I never regretted getting mine.

Personally the 48Hz kind of kills it for me because I'd prefer more fps while I play games

Utter nonsense. See above. The problem isn't the fps, certainly not above 24fps. The problem is the peak delay between frames. If you are getting a perfectly steady 24fps, that is a frame every 1/24 of a second. But if you have a 60Hz screen and you get 29 frames in the first half of a second and then nothing until one last frame at the end of that second, that is still 30fps on average - and completely unusable. It is the peak time between frames that determines how smooth things look. 24fps is plenty, if your GPU can keep up with every frame. In fact, if you are playing a game at 24fps with vsync on, you may find the frame rate is actually smoother and the game more playable, because by lowering the frame rate you have given the GPU more time to render the next frame rather than skip it. The problem isn't the frame rate of the monitor - the problem is finding a GPU that can actually render a first-person shooter at 3840x2400 at 48fps. Or even 24fps.
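To put some made-up numbers on that (purely illustrative Python, not a claim about any particular game or GPU):

# Two hypothetical frame-time traces over one second with similar average fps
# but very different worst-case gaps between consecutive frames.
steady = [i / 24 for i in range(25)]            # a frame every 1/24 s
bursty = [i / 58 for i in range(29)] + [1.0]    # 29 frames in ~0.5 s, then one at t = 1 s

def stats(times):
    gaps = [b - a for a, b in zip(times, times[1:])]
    avg_fps = (len(times) - 1) / (times[-1] - times[0])
    return f"{avg_fps:.0f} fps average, worst gap {max(gaps) * 1000:.0f} ms"

print("steady:", stats(steady))   # 24 fps average, worst gap 42 ms  -> looks smooth
print("bursty:", stats(bursty))   # 29 fps average, worst gap 517 ms -> looks broken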

, and while I like resolution, the pixel pitch on this might be just a bit too fine for me. But still, it's an interesting find and amazing something like this could be pumped out so long ago.

I know what you mean. The technology has really stagnated in this area over the past decade.

You may also want to have a read through the articles on getting the most out of the T221 here.
 
Thank you for such a thorough and detailed analysis! I've seen the T221 but never had the pleasure of doing any work on it myself.
It really is a shame that the technology is so stagnant, although there may be hope as phones and tablets have led to interest in high DPI displays.
I think another point to mention is that these types of monitors matter most for text. With images, our eyes fixate on individual points for much less time (microsaccades), so a lower-DPI image can appear smooth, while text, partly because it isn't a natural image of the kind our visual systems are adapted to process, will appear pixelated. That said, sensitivity to pixel density, input lag, motion smoothness and the like seems to vary tremendously from person to person.
 
This is really not an issue. It is a common misconception that you need 60Hz or whatever, propagated by people who don't have a clue about how the human eye works (or its spec ;) )


The human eye can only distinguish up to around 12 individual frames per second. Beyond that you can perceive flicker (depending on the contrast and brightness between the frames) up into the high teens of fps, maybe at a stretch up to 20fps if you really have exceptional eyes. Anything more than that is only perceivable as fluid motion.

When you watch a film at the cinema, do you find that to be jerky? Or does it look perfectly smooth? Movies are shot at 24fps. So what makes you think for one moment that you might need more than 48Hz for any purpose?

I am going to have to agree to disagree. I have a T221 9503-DGP and VP2290b's.

Cinema uses motion blur and such to make things not look jerky, but yes, it still looks jerky (and blurred) to me.

I noticed a difference going from 60Hz -> 67Hz and even 67Hz -> 100Hz, even if I am not playing but just watching. Hell, on an OTA channel that is 720p I notice when a commercial is an actual 60p commercial instead of 24p telecined to 60p; it is very obvious to me. 48Hz is fine if you're just watching video (for most types) or for desktop use, but it is definitely not enough for the vast majority of games. Why do you think so many people swear by 120Hz panels for gaming?


Not true. I have a DG5 with the 2x DL-DVI adapter (48Hz) and a DG1 without (it came with the old Matrox card, which I never plugged in), just running off 2x SL-DVI inputs (~30Hz, not finished fully dialing it in yet). I cannot perceive any difference between them in any application (gaming, watching videos, typical desktop/office applications).

But you are still using an adapter.. just one that came with the video card... Mine did not come with one and I had to buy a special one.



Utter nonsense. See above. The problem isn't the fps, certainly not above 24fps. The problem is the peak delay between frames. If you are getting a perfectly steady 24fps, that is a frame every 1/24 of a second. But if you have a 60Hz screen and you get 29 frames in the first half of a second and then nothing until one last frame at the end of that second, that is still 30fps on average - and completely unusable. It is the peak time between frames that determines how smooth things look. 24fps is plenty, if your GPU can keep up with every frame. In fact, if you are playing a game at 24fps with vsync on, you may find the frame rate is actually smoother and the game more playable, because by lowering the frame rate you have given the GPU more time to render the next frame rather than skip it. The problem isn't the frame rate of the monitor - the problem is finding a GPU that can actually render a first-person shooter at 3840x2400 at 48fps. Or even 24fps.


Actually, this is the utter nonsense. If you are going to argue that the problem is the delay between frames, well, that varies depending on the FPS/Hz, so again it's a Hz problem. I have played games (very painfully so) at 3840x2400 at 33Hz and 41Hz, and later at 48Hz, on a VP2290b -> T221. I played Quake 3 and Quake Live on a VP2290b at 3840x2400 and it had no problem getting a constant 125 FPS. The problem was that at 33Hz the game simply was not playable. I actually spent $500 on Matrox TripleHead2Go adapters (my only option at the time to get 41Hz from a single video card) just to go from 33Hz -> 41Hz.

Just 33 -> 41Hz made a difference between the game being unplayable and somewhat playable. A bump to 48Hz helped quite a bit too but now I don't game on the monitor as I ended up getting a better disk allowing for more monitors. That wikipedia article argues a human brain can detect completely.

It's extremely noticeable when playing the game, but I don't even have to be playing it; I can just watch someone playing it and notice 60Hz vs 100Hz very easily.
 
I noticed a difference going from 60Hz -> 67Hz and even 67Hz -> 100Hz, even if I am not playing but just watching. Hell, on an OTA channel that is 720p I notice when a commercial is an actual 60p commercial instead of 24p telecined to 60p; it is very obvious to me. 48Hz is fine if you're just watching video (for most types) or for desktop use, but it is definitely not enough for the vast majority of games. Why do you think so many people swear by 120Hz panels for gaming?

I think what you notice there might be more the 3:2 pulldown, since 60 is not divisible by 24. If I'm not mistaken, half the frames are held for two refreshes and the others for three. This is effectively uneven frame presentation, as gordan pointed out.
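A tiny sketch of that cadence, if it helps (a simplified 2:3 pulldown model that ignores interlaced fields):

# Map 24 film frames onto 60 refreshes by holding frames for 2 and 3 refreshes alternately.
refreshes = []
for i in range(24):
    refreshes.extend([i] * (2 if i % 2 == 0 else 3))

print(len(refreshes))    # 60 refreshes, i.e. one second's worth
print(refreshes[:10])    # [0, 0, 1, 1, 1, 2, 2, 3, 3, 3] -- uneven hold times per frame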

Regarding the other differences, it's not so clear-cut that our visual system distinguishes X amount of detail and Y amount of motion at Z distance. I think one of the bigger reasons gamers like high-refresh monitors is that even if frame rate differences themselves might be hard to notice, ghosting is perhaps more apparent. Also, as is becoming more apparent, frame latency in games can be highly inconsistent, and that seems to be the predominant issue in perceived smoothness. Perhaps at this resolution framerates are less consistent, especially without enough VRAM? I don't know, I'm not an expert.
 
I think what you notice there might be more the 3:2 pulldown, since 60 is not divisible by 24. If I'm not mistaken, half the frames are held for two refreshes and the others for three. This is effectively uneven frame presentation, as gordan pointed out.

Well, I will admit that could be why it's so noticeable, but I can also notice a difference (a lot) with rendered content in multiple games, where you don't have to worry about the difference in timing/skipped frames if outputting to something like a VP2290b/T221, which will output at odd refresh rates.

Regarding the other differences, it's not so clear-cut that our visual system distinguishes X amount of detail and Y amount of motion at Z distance. I think one of the bigger reasons gamers like high-refresh monitors is that even if frame rate differences themselves might be hard to notice, ghosting is perhaps more apparent. Also, as is becoming more apparent, frame latency in games can be highly inconsistent, and that seems to be the predominant issue in perceived smoothness. Perhaps at this resolution framerates are less consistent, especially without enough VRAM? I don't know, I'm not an expert.

Possibly, but the Quake 3 engine is one of the most responsive engines in existence, and it's old enough that even on older hardware I was able to get a constant 125 FPS, whether at 2560x1600 or 3840x2400; it does not matter.

Also, even on the same monitor I notice a huge difference going from 60Hz to 100Hz, as I have a 100Hz-capable Catleap and it's much smoother (less jagged) at 100Hz. It wasn't as noticeable until I started using a 100Hz monitor on a regular basis. It's also very noticeable in games like StepMania, where at higher speeds the arrows are only refreshed a few times going across the screen (snapshots that my brain sees at the lower refresh rate) versus a fluid motion where it actually becomes harder to make out the shape due to the speed it's moving.

But saying your brain can't tell the difference between 24 updates/sec vs 60 updates/sec is ridiculous, and that person has obviously never played a first-person shooter. 24 FPS is completely unplayable; I know because 33Hz was completely unplayable on my VP2290b. 41Hz just barely got into the realm where I was not significantly worse playing at the lower refresh rate. At 33Hz I played much, much worse.
 
But saying your brain can't tell the difference between 24 updates/sec vs 60 updates/sec is ridiculous, and that person has obviously never played a first-person shooter. 24 FPS is completely unplayable; I know because 33Hz was completely unplayable on my VP2290b. 41Hz just barely got into the realm where I was not significantly worse playing at the lower refresh rate. At 33Hz I played much, much worse.
So, I took a look at the cited article. What it's actually saying is that 10 to 12 distinct images per second is the limit of discrete images we can process separately. If I recall correctly, those types of studies put people in a room, show them images in rapid succession, and either ask them to recall what they saw or ask them to identify what they've seen from a set of choices.

This is very different from frames in smooth motion. Our visual system doesn't perceive motion in terms of discrete frames anyway. In fact, object recognition and motion are processed separately.
Also, like pretty much all human perception, we perceive contrasts, not absolutes.
 

I have to quote gordan's post above simply for posterity, as it has to be one of the most ill-informed and senseless posts I have seen in a long time.
 
Come on, that's not nice. He's mostly talking about his subjective experiences. Sure, some facts about the human visual system are off, but there are individual differences, and from what I can tell it's a great monitor for text-heavy work.
 
For 99.9 percent of people it's probably not worth pursuing at this point. If for some reason you actually need one then you probably have a very specific use for it that far outweighs all of its shortcomings or limitations. I hope they have one at the Computer History Museum.

Hopefully some 3840x2160 (60Hz) displays at around 30" are coming our way in the next few years.
 
I am going to have to agree to disagree. I have a T221 9503-DGP and VP2290b's.

Cinema uses motion blur and such to make things not look jerky, but yes, it still looks jerky (and blurred) to me.

Motion blurring isn't relevant when there are only 24 different frames being displayed. It might make the image appear more blurry, but it won't make the frame transitions any less choppy.

I noticed a difference going from 60Hz -> 67Hz and even 67Hz -> 100Hz, even if I am not playing but just watching. Hell, on an OTA channel that is 720p I notice when a commercial is an actual 60p commercial instead of 24p telecined to 60p; it is very obvious to me. 48Hz is fine if you're just watching video (for most types) or for desktop use, but it is definitely not enough for the vast majority of games. Why do you think so many people swear by 120Hz panels for gaming?

Swearing by 120Hz comes out of hype and ignorance, mostly. While there is a remote possibility that your eyes are wired for higher frame rate (we are all different), it is unlikely in the extreme that you can actually tell the difference if you looked at 30Hz, 48Hz, 60Hz and 100Hz side by side. The high refresh rates were only really relevant with CRTs because they flickered. I suspect the reason 60Hz was made the standard was largely because it allowed for reasonably clean rendering of both 24fps and 30fps video. But since the bandwidth of the optic nerve is finite, the chances are that if you can see that many fps, you can't process that many pixels anyway. For example, I recently found that my wife perceives things skipping on the screen up to a frame rate about 5-6fps higher than mine - but she has to get within 6in of my T221 to read the text on it, which I comfortably read from a little over 3ft away.

But you are still using an adapter.. just one that came with the video card... Mine did not come with one and I had to buy a special one.

What adapter are you talking about? You only need an adapter if you want to reduce the number of inputs required to drive the monitor at the full refresh rate, i.e. instead of using 4x SL-DVI links (4 DVI ports) you can use 2x DL-DVI links (2 DVI ports). It makes things a little tidier, sure, but it's not that big a deal.
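Rough numbers, for anyone wondering why it takes that many links at all (active pixel rates only, ignoring blanking overhead, so real pixel clocks are somewhat higher; 165 MHz is the single-link DVI pixel-clock ceiling):

# Link-budget sketch for 3840x2400 @ 48Hz over DVI (active pixels only).
SL_DVI_MAX_HZ = 165e6                    # single-link DVI pixel clock limit
DL_DVI_MAX_HZ = 2 * SL_DVI_MAX_HZ        # dual link roughly doubles it

full = 3840 * 2400 * 48                  # whole screen: ~442 Mpx/s
quarter = full / 4                       # one of four 1920x1200 tiles: ~111 Mpx/s
half = full / 2                          # one of two 1920x2400 halves: ~221 Mpx/s

print(full / 1e6, quarter / 1e6, half / 1e6)
# A quarter tile fits under a single link (165 Mpx/s max), a half needs a dual
# link (~330 Mpx/s max), and the whole screen is beyond any one DVI connector,
# hence 4x SL-DVI or 2x DL-DVI inputs.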

Actually, this is the utter nonsense. If you are going to argue that the problem is the delay between frames, well, that varies depending on the FPS/Hz, so again it's a Hz problem. I have played games (very painfully so) at 3840x2400 at 33Hz and 41Hz, and later at 48Hz, on a VP2290b -> T221.

Your GPU almost certainly wasn't able to saturate the fps the monitor was capable of. My GTX580 gets about 20fps on _average_ (the minimum is lower) at 3840x2400 (with AA disabled) in Crysis. Until you have a GPU setup that can saturate what the monitor can do with vsync enabled, you are not really making a meaningful comparison.

I played Quake 3 and Quake Live on a VP2290b at 3840x2400 and it had no problem getting a constant 125 FPS.

So clearly you were running with vsync disabled. Fix that and the chances are your perception will change.

The problem was that at 33Hz the game simply was not playable. I actually spent $500 on Matrox TripleHead2Go adapters (my only option at the time to get 41Hz from a single video card) just to go from 33Hz -> 41Hz.

There is no way that any Matrox card is going to push gaming frame rates like that at 3840x2400.

Just 33 -> 41Hz made a difference between the game being unplayable and somewhat playable. A bump to 48Hz helped quite a bit too but now I don't game on the monitor as I ended up getting a better disk allowing for more monitors.

A better disk allowing for more monitors? Is there something getting lost in translation here? Because that makes no sense whatsoever.

It's extremely noticeable when playing the game, but I don't even have to be playing it; I can just watch someone playing it and notice 60Hz vs 100Hz very easily.

I still think that what you are noticing is tearing from running without vsync and the slowness of the GPU, at least as far as running at 3840x2400 is concerned. I haven't re-tested with a GTX680 (haven't got one yet), but last time I did the testing with various GPUs (a 4870X2 a couple of years ago, and more recently a GTX580), they were not capable of getting anywhere near saturating 41Hz, let alone 48Hz. Maybe 1/3 of that with various settings turned down. And considering that multiple GPUs don't scale linearly, I rather doubt you could saturate 3840x2400 in a modern game even when using a maxed-out SLI/Crossfire solution.

Also, even on the same monitor I notice a huge difference going from 60Hz to 100Hz, as I have a 100Hz-capable Catleap and it's much smoother (less jagged) at 100Hz.

It sounds increasingly like the reason you are seeing differences with higher-refresh monitors is that you are running without vsync (that is what causes jaggedness), unless you are not explaining what you are seeing properly.

But saying your brain can't tell the difference between 24 updates/sec vs 60 updates/sec is ridiculous, and that person has obviously never played a first-person shooter.

If you are seeing 24fps as unacceptable, then that means that your cinema experience is just as crap, too. If you find movies at the cinema OK, then the cause must be elsewhere (e.g. no vsync, underpowered GPU, etc.). Personally I find that I cannot tell the difference above 24fps, unless there is re-framing going on (e.g. 30fps movie on 24Hz refresh or vice versa, but 24fps movie on 24Hz refresh looks perfect). With rendered content such as a FPS game, if your GPU can keep up and you have vsync enabled, I cannot see any frame skipping at all.

But as I said, everyone's eyes are wired differently. I use a MiscFixed 8-point font for development work on Linux (roughly equivalent to the 6 point bitmap font on Windows), and I can comfortably view this from about 3ft away. But then again, while I enjoy the odd blast of a FPS like Borderlands or Left4Dead with friends, I am definitely not a twitch gamer.

I have to quote gordan's post above simply for posterity, as it has to be one of the most ill-informed and senseless posts I have seen in a long time.

Care to quantify this with actual facts?
 
Motion blurring isn't relevant when there are only 24 different frames being displayed. It might make the image appear more blurry, but it won't make the frame transitions any less choppy.

Motion blurring is relevant. It's how movies avoid choppiness while showing the video. Cameras capture motion blur naturally, whereas in games it has to be added manually as an effect. Plus, your point about frame distinction only gets stronger the fewer frames are displayed per second, because you are seeing each frame for a longer period of time and persistence of vision is affected by it.

Swearing by 120Hz comes out of hype and ignorance, mostly. While there is a remote possibility that your eyes are wired for higher frame rate (we are all different), it is unlikely in the extreme that you can actually tell the difference if you looked at 30Hz, 48Hz, 60Hz and 100Hz side by side. The high refresh rates were only really relevant with CRTs because they flickered. I suspect the reason 60Hz was made the standard was largely because it allowed for reasonably clean rendering of both 24fps and 30fps video. But since the bandwidth of the optic nerve is finite, the chances are that if you can see that many fps, you can't process that many pixels anyway. For example, I recently found that my wife perceives things skipping on the screen up to a frame rate about 5-6fps higher than mine - but she has to get within 6in of my T221 to read the text on it, which I comfortably read from a little over 3ft away.

Ignorance? I disagree. I've used monitors at 30Hz, 60Hz, 75Hz, 76Hz, and 85Hz. I can tell the difference very clearly (except between 75 and 76). Going from 60Hz to 75Hz makes me go, "Oh my god, it's sooooo smooth. I gotta go show my roommate." Then he said similar. Most people can tell the difference in framerate. Putting them side by side would make seeing the difference even easier.

What adapter are you talking about? You only need an adapter if you want to reduce the number of inputs required to drive the monitor at the full refresh rate, i.e. instead of using 4x SL-DVI links (4 DVI ports) you can use 2x DL-DVI links (2 DVI ports). It makes things a little tidier, sure, but it's not that big a deal.

Yes, those adapters do just that, and they are needed for using a modern AMD GPU with this monitor. AMD only allows two DVI signals to be used without active DP->DVI adapters.

Your GPU almost certainly wasn't able to saturate the fps the monitor was capable of. My GTX580 gets about 20fps on _average_ (the minimum is lower) at 3840x2400 (with AA disabled) in Crysis. Until you have a GPU setup that can saturate what the monitor can do with vsync enabled, you are not really making a meaningful comparison.

Yes and no. But he clearly states that he was getting over 125 FPS in Quake 3 and Quake Live, so he can saturate the framebuffer just fine.

So clearly you were running with vsync disabled. Fix that and the chances are your perception will change.

Running V-sync adds latency and won't necessarily be smoother. I usually get over 60 FPS in BF3 on my setup (7680x1600), but if I enable V-sync, there's a lag that I can feel and the framerate gets capped at 30 FPS, which is unplayable for a fast twitch shooter.

There is no way that any Matrox card is going to push gaming frame rates like that at 3840x2400.

It's not a card, it's a video splitter that allows you to run multiple monitors off of a single output from the video card. I used one before the days of Eyefinity. Great piece of tech there.

Just 33 -> 41Hz made a difference between the game being unplayable and somewhat playable. A bump to 48Hz helped quite a bit too but now I don't game on the monitor as I ended up getting a better disk allowing for more monitors.

What the hell is this "disk"? Also, this goes against what you said earlier about games being playable at 24FPS.

I still think that what you are noticing is tearing from running without vsync and the slowness of the GPU, at least as far as running at 3840x2400 is concerned. I haven't re-tested with a GTX680 (haven't got one yet), but last time I did the testing with various GPUs (a 4870X2 a couple of years ago, and more recently a GTX580), they were not capable of getting anywhere near saturating 41Hz, let alone 48Hz. Maybe 1/3 of that with various settings turned down. And considering that multiple GPUs don't scale linearly, I rather doubt you could saturate 3840x2400 in a modern game even when using a maxed-out SLI/Crossfire solution.

It has nothing to do with tearing. Tearing doesn't make the video image seem jittery if the frames come in at a constant rate.

It sounds increasingly like the reason you are seeing differences with higher-refresh monitors is that you are running without vsync (that is what causes jaggedness), unless you are not explaining what you are seeing properly.

See above.

If you are seeing 24fps as unacceptable, then that means that your cinema experience is just as crap, too. If you find movies at the cinema OK, then the cause must be elsewhere (e.g. no vsync, underpowered GPU, etc.). Personally I find that I cannot tell the difference above 24fps, unless there is re-framing going on (e.g. 30fps movie on 24Hz refresh or vice versa, but 24fps movie on 24Hz refresh looks perfect). With rendered content such as a FPS game, if your GPU can keep up and you have vsync enabled, I cannot see any frame skipping at all.

Movies and games are different beasts. Again, you disregard the existence of motion blur.

But as I said, everyone's eyes are wired differently. I use a MiscFixed 8-point font for development work on Linux (roughly equivalent to the 6 point bitmap font on Windows), and I can comfortably view this from about 3ft away. But then again, while I enjoy the odd blast of a FPS like Borderlands or Left4Dead with friends, I am definitely not a twitch gamer.

Your eyes must be botched because everyone I know can see the difference in framerate.

Care to quantify this with actual facts?

For your enjoyment: http://www.100fps.com/how_many_frames_can_humans_see.htm
 
By the way, I think it's cool for its time, but it's too convoluted to be used now. QFHD is enough; even though the T221 is higher, I feel like the size is too small for me and others who use 24"+ monitors.
 
Pffffff, 120fps, 60fps, 30fps and even 12fps are for noobs! Through a special high-fiber diet and controlled rapid eye flutter, I have trained myself to enjoy only 1 FPS when I game! Because of this I can enjoy Crysis like it was meant to be played, in FULL HD with my 9700PRO, while all of the uneducated noobs continue to climb over each other like crabs in a bucket, kissing the foot of consumerism at every yearly overrated GPU release, grasping at that never-ending brass ring of price/performance/value.
 
Ignorance? I disagree. I've used monitors at 30Hz, 60Hz, 75Hz, 76Hz, and 85Hz. I can tell the difference very clearly (except between 75 and 76). Going from 60Hz to 75Hz makes me go, "Oh my god, it's sooooo smooth. I gotta go show my roommate." Then he said similar. Most people can tell the difference in framerate. Putting them side by side would make seeing the difference even easier.

I'm not convinced this is not largely a placebo effect. But I guess this is one of those "agree to disagree" things. All I can say is that I have never found that fps > 25-ish actually improves my experience.

Yes, those adapters do just that, and they are needed for using a modern AMD GPU with this monitor. AMD only allows two DVI signals to be used without active DP->DVI adapters.

On Linux ATI binary drivers suck to the point where they don't even accept custom modelines. I don't know if the Windows ones are any better.

Yes and no. But he clearly states that he was getting over 125 FPS in Quake 3 and Quake Live, so he can saturate the framebuffer just fine.

But it also shows that he isn't running vsync. If you are rendering 120fps but only showing 60, you will get 60 instances of half of one frame and half of another frame. That will cause tearing that is often perceivable as stutter.

Running V-sync adds latency and won't necessarily be smoother. I usually get over 60 FPS in BF3 on my setup (7680x1600), but if I enable V-sync, there's a lag that I can feel and the framerate gets capped at 30 FPS, which is unplayable for a fast twitch shooter.

There might be lag if your drivers are crap (and given the current duopoly, there is a very good chance that they are). And yes, this gets worse if you run multiple GPUs (then again, multiple GPUs were always a mug's game). But I find the smoothness definitely improves with vsync on.

It's not a card, it's a video splitter that allows you to run multiple monitors off of a single output from the video card. I used one before the days of Eyefinity. Great piece of tech there.

Eyefinity is just a clever way to con people into paying extra for what they had for free before. I'm happy with multiple DVI outputs, thanks.

What the hell is this "disk"? Also, this goes against what you said earlier about games being playable at 24FPS.

That was a mis-quote above. I was also querying what a disk can possibly have to do with a frame rate.

It has nothing to do with tearing. Tearing doesn't make the video image seem jittery if the frames come in at a constant rate.

Maybe this is another one of those subjective things. I find having halves of two different frames on the screen all the time makes it look both jerky and distracting.
 
I meant to write 'desk', not 'disk'. Anyway, no serious FPS gamer uses vsync. Vsync introduces latency/lag; if you don't believe that then you don't know how vsync works. Triple-buffered vsync does decrease the latency quite a bit, but it's still best with vsync disabled, which is why no serious FPS gamer uses it.

Gordan, all I can say is you are not the norm if you can't differentiate between, say, three 100Hz monitors each showing a rendering running at 25, 50 and 100 FPS. Anyone else could.
 
I get the impression people are talking past each other a bit here. The T221 is great for text; I doubt anything can beat it yet. But it isn't for gaming.
And I think the issue again comes back to contrasts. Human perception works in terms of contrast. If you haven't spent much time with a sharper monitor, crappy low-pixel-density ones will look 'good enough', and likewise for movement on faster and slower panels. If you put them next to each other, or even spent enough time with each at different times, you'd notice the difference. It's easy to assume our experiences are objective and that we perceive absolutes. But we don't.
 
I think the most important point being missed here is that there is actually no GPU hardware capable of saturating 3840x2400@48Hz with reasonably detailed textures anyway. Having played through a number of FPS-style games, from Fallout 3 and NV to Borderlands 1 and 2, I have not noticed any performance issues with my GTX580 (which cannot hope to saturate 48Hz at 3840x2400) that affected my enjoyment of the games.

And the fact that a handful of films are now being shot at 48fps (after nearly a century of 24fps being considered fine) seems a rather hollow argument that 48Hz is not enough for decent, fluid motion.

If you want to play serious twitch FPS games on high-performance engines like UT1 or Q3, you are probably best off playing at low res on a CRT monitor - that way you will get none of the lag associated with any flat-panel screen. And while you are at it, make sure you are using a PS/2 keyboard and mouse, as those have dedicated interrupts and none of the processing lag (which is inconsistent to boot - ask any embedded engineer working on designing and developing USB devices) that USB inevitably comes with. And of course disable frame pre-rendering. I know a few people who do this, but this is for competitive twitch gaming, not enjoying eye-candy on sexily featured bleeding-edge hardware.
 
I used one of these years ago, amazing pixel clarity, though the refresh rate was really poor. I also think the contrast ratio was really low and it had minimal connections compared to what's available today - personally I would not get one now. Colors were also somewhat flat looking.
 
Again, it depends on what you're using it for. The brightness is 235 cd/m^2 and the contrast ratio is 400:1. Yeah, not great, but if you're using it for text, pixel density reigns supreme. I just ordered one to be used primarily for programming and writing, and I'll gladly post a review for these specialized purposes when I get it, if anyone's interested. Since most of the writing involves \LaTeX, I'm psyched at how good rendered math will look on it. I also have the feeling it will be great for data visualization - although we'll see if pixel density beats the added area of my old 30 inch.
 
Swearing by 120Hz comes out of hype and ignorance, mostly. While there is a remote possibility that your eyes are wired for higher frame rate (we are all different), it is unlikely in the extreme that you can actually tell the difference if you looked at 30Hz, 48Hz, 60Hz and 100Hz side by side.

...And I'm out.
 
Playing at lower frames per second increases input lag considerably, especially with vsync enabled, making playing an FPS just a painful experience. I tested 50Hz on my plasma and it felt twice as bad as 60Hz.

Anyway, playing anything on a very old IPS (pixel response times so slooooow... and a contrast ratio from ass) at 48Hz can only be entertaining to crazy people, as anyone normal who is interested in ancient technology will buy a GDM-FW900 and have an ultra-sharp picture at 2304x1440@80Hz with both static and moving images, very deep blacks and proper colors.

I bet something like the T221 is very good for browsing the web and doing text/CAD-related work, but anything multimedia-like will just suck on it. I think the best thing would be an IBM T221 + Sony GDM-FW900 combo desk :cool:
 
Just for kicks, one day I played some Quake 3 at 48Hz on the T221... the input lag felt awful to me. :(

Probably not a surprise to anyone but don't buy this for anything gaming related.
 
Because of the way the human visual system works, it likely depends on the nature of the scene which frame rates are perceived as smooth enough. Also, on this monitor you're likely to have bigger issues due to ghosting, as the pixel response time is a whopping 50ms.
I'd be happy to see what the current research says about human perception of moving images and how it relates to displays. There is some out there, but it'd be nice to have a good summary if people are interested enough. It would provide a useful guide to which features matter most for which use cases, don't you think? Nothing like a rigorous study as opposed to hearsay and subjective opinions.

I don't see how you figure that. Large resolutions might be good for browsing etc., but not when they're crammed into a minuscule display - you'd either have tiny unreadable text, or bugger-all working space. Ultra-fine pixel pitches on smaller displays would be more suited to gaming, but only if other aspects of the monitor are suitable, which in the case of the T221 they are not. The fact is that this monitor just isn't much good for anything at all.

I'm going to have to disagree with you here and hazard a guess that you haven't used many high-DPI screens, at least not on anything smaller than a smartphone. I don't mean that as an insult, but it's the kind of thing where, in my experience and that of those I know, once you have it you never want to go back.

One issue with non-"reasonable resolution" displays - to crib from Linus Torvalds - is that the limit to how small you can make text is not the smallest size your eyes can comfortably work with, but rather the smallest size you can make text before it looks blurry and awful. For example, on a 100 DPI monitor I could see artifacts from subpixel rendering of mathematical symbols in a paper I was reading at what would otherwise have been an acceptable size. The same could be true for data visualization. And even if your limiting factor is size and not the blurriness induced by low-DPI monitors, it really is much more pleasing to have smoother text, and I believe there is evidence that it can help reduce eye strain and improve reading speed. Either way, the difference is night and day. In fact, I'm writing this on a Transformer Infinity tablet with ~220 PPI, and after owning this and my phone, normal displays look disgusting for text.

One last thing to point out is that the DPI difference is more noticeable for text than for games. This is partially because games are not designed to take advantage of reasonable resolution screens. It also has to do with differences in eye movements in processing text and images. With text we slowly move our eyes across the screen, fixating on individual points for longer periods of time. When looking at images, our eyes are actually darting around so we can take everything in. As a result, we're doing a lot of smoothing in our heads and not fixating on one spot of the display long enough to notice pixellation.
 
What kind of work? Also, 22 inches is still probably larger than most monitors people use. And with that DPI you really could cram more on it and have it look great. I'll be getting one soon and I'm swapping a 3008WFP out for it which I actually felt was too much space to use effectively sometimes. So I'll let you know my subjective experience.

Also, high-DPI displays reduce the need for antialiasing since aliasing is much less apparent, no? And generally, by "designed" I was referring to the fact that the interfaces are chunkily designed for console people gaming on low-resolution TVs.
 
Comfortable viewing is subjective. I always feel like I can't make text as small as I'd like because it starts to look crummy, even close up. But I'll be happy to let you know when I get it. Although the upcoming EQD Clarity screens look like they might make a good compromise at a reasonable price point.

And I'll repeat, regarding design I'm talking more about the interface. Everything is so large and chunky, in large part I've heard due to console porting. Although better textures would be great!
 
I'm with DrinkTea on this one, regarding the monitor size. Having moved from a Dell 30" to a T221, I am actually finding that the 22" is more comfortable to use than the 30", even leaving the higher resolution aside. I find that most desks aren't deep enough for the viewing distance required to comfortably take in a 30" screen filling your field of vision (too close, and turning your head side to side to see things on the outer edges gets distracting). Size isn't everything. Personally, I'd love something like a quad-2560x1600 in the same 22"-ish size. Think 4 Google Nexus 10s tiled. :)
 
This is really not an issue. It is a common misconception that you need 60Hz or whatever, propagated by people who don't have a clue about how the human eye works (or its spec ;) )

The human eye can only distinguish up to around 12 individual frames per second. Beyond that you can perceive flicker (depending on the contrast and brightness between the frames) up into the high teens of fps, maybe at a stretch up to 20fps if you really have exceptional eyes. Anything more than that is only perceivable as fluid motion.

When you watch a film at the cinema, do you find that to be jerky? Or does it look perfectly smooth? Movies are shot at 24fps. So what makes you think for one moment that you might need more than 48Hz for any purpose?
Although the number of individual frames may be correct, humans have a proven ability to see motion side-effects (e.g. stutters, motion blur, stroboscopic effects, etc.) far beyond 60fps.
Vision science is a lot more complicated than it appears on the surface.

-- Ability to distinguish frames
......is not the same as
-- Ability to distinguish stutters
......is not the same as
-- Ability to see motion blur
......is not the same as
-- Ability to see flicker directly
......is not the same as
-- Ability to see flicker indirectly (e.g. phantom array effect) -- this reference is more reputable than Wikipedia
etc.

An example is the following:
There are many sources showing that human eyes can indirectly detect visual phenomena running at many hundreds of Hertz, far beyond the human flicker fusion threshold. This information is included because knowledge about impulse-driven displays (e.g. CRTs, scanning backlights) is improved by knowledge of these topics.

Wagon Wheel Effect
- en.wikipedia.org/wiki/Wagon-wheel_effect
- www.michaelbach.de/ot/mot_wagonWheel/index.html
Example: An 8-spoke wagon wheel spinning clockwise 50 times per second under a 400 Hz stroboscope will appear to be stationary. However, if the stroboscope runs at 401 Hz, the wheel will spin slowly counter-clockwise. If the stroboscope runs at 399 Hz, the wheel will spin slowly clockwise.
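Just to sanity-check those numbers, here is the aliasing arithmetic in a few lines of Python (the only assumption is that all 8 spokes look identical, so the pattern repeats every 1/8 revolution):

# Apparent rotation of a spoked wheel under a stroboscope (temporal aliasing).
def apparent_rev_per_sec(wheel_rps, strobe_hz, spokes=8):
    spacing = 1.0 / spokes                    # one spoke spacing, in revolutions
    per_flash = wheel_rps / strobe_hz         # true rotation between flashes
    # fold into (-spacing/2, +spacing/2]: the nearest-looking spoke position wins
    offset = (per_flash + spacing / 2) % spacing - spacing / 2
    return offset * strobe_hz                 # apparent revolutions per second

for hz in (400, 401, 399):
    print(hz, round(apparent_rev_per_sec(50, hz), 3))
# 400 -> 0.0 (stationary), 401 -> -0.125 (slow counter-clockwise), 399 -> 0.125 (slow clockwise)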

Phantom Array Effect / Stroboscopic Effect
- opensiuc.lib.siu.edu/cgi/viewcontent.cgi?article=1538&context=tpr (500 Hz detected)
- www.lrc.rpi.edu/programs/solidstate/assist/flicker.asp (300 Hz detected)
- www.lrc.rpi.edu/programs/solidstate/assist/pdf/AR-Flicker.pdf (10,000 Hz detected)
- cormusa.org/uploads/2012_2.10_Bullough_CORM_2012_Stroboscopic_Effects.pdf
- people.ds.cam.ac.uk/ssb22/lighting.html
Synopsis: Humans can indirectly detect a 500 Hz flicker via the “phantom array” effect: a fast-moving flickering light source in a dark room appears as a dotted trail instead of a continuous blur. This can also occur when rapidly moving/rolling the eyes in front of flickering lights in a darkened room (e.g. old LED alarm clocks, neon lights, unrectified LED decoration light strings). Flicker all the way up to 10,000 Hz was indirectly detectable in some studies, in certain situations.

Rainbow Artifacts (DLP projectors)
- www.projectorcentral.com/lcd_dlp_update7.htm?page=Rainbow-Artifacts
- www.ausmedia.com.au/DLP_Sensitive.htm
- en.wikipedia.org/wiki/Digital_Light_Processing
Synopsis: Related to the phantom array effect, rainbow artifacts are observed by some people watching images from a single-chip DLP projector, even with a 4X or 6X color wheel (240Hz and 360Hz). Fast eye movement causes white objects on a black background to have a rainbow-colored blur (red-green-blue trails) instead of a continuous white blur.
In addition, human eyes can see continuously decreasing amounts of motion blur for motion beyond 60 Hz. The differences between 120Hz, 240Hz, 480Hz and 960Hz are noticeable in terms of eye-tracking-based motion blur on sample-and-hold displays (displays that shine continuously). This is because when you track moving objects on-screen with your eyes, your eyes are in a different position at the beginning of a refresh than at the end of a refresh, so shortening the length of a refresh (either via more Hz, or via flickering the frame a la CRT) reduces how far a frame gets smeared across your retinas. It then becomes important to understand what sample-and-hold motion blur is, for which I refer you to a very popular page, Why Do Some OLED's Have Motion Blur?, quoted here in its entirety:
Why Do Some OLED's Have Motion Blur?

OLED has been regarded as a Holy Grail for eliminating motion blur. Unfortunately, the portable PlayStation Vita and Samsung Galaxy S3 have lots of motion blur during fast scrolling animations.

Why?

The answer lies in sample-and-hold. OLED is great in many ways, but some of them are hampered by the sample-and-hold effect. Even instant pixel response (0 ms) can have lots of motion blur due to the sample-and-hold problem. Some newer OLED's use impulse-driving in order to eliminate motion blur, but not all of them do.

Your eyes are always moving when you track moving objects on a screen. Sample-and-hold means frames are statically displayed until the next refresh. Your eyes are in a different position at the beginning of a refresh than at the end of a refresh; this causes the frame to be blurred across your retinas:

[Diagram omitted (source: Microsoft Research). Vertical axis represents position of motion; horizontal axis represents time. The middle image represents flicker displays, including CRT and LightBoost; the right image represents sample-and-hold displays, including most LCDs and OLEDs.]

The flicker of impulse-driven displays (CRT) shortens the frame samples and eliminates eye-tracking-based motion blur. This is why CRT displays have less motion blur than LCDs, even though LCD pixel response times (1ms-2ms) are only recently finally matching the phosphor decay times of a CRT (with medium-persistence phosphor). Sample-and-hold displays continuously display frames for the whole refresh. As a result, a 60Hz refresh is displayed for a whole 1/60th of a second (16.7 milliseconds).

[Diagram omitted (source: Microsoft Research).]

Motion blur occurs on the PlayStation Vita OLED even though it has a virtually instantaneous pixel response time. This is because it does not shorten the amount of time a frame is actually visible for; a frame is continuously displayed until the next frame. The sample-and-hold nature of the display enforces eye-tracking-based motion blur that is above and beyond natural human limitations.

Solution to Motion Blur

The only way to reduce motion blur caused by sample-and-hold is to shorten the amount of time a frame is displayed for. This is accomplished by using extra refreshes (higher Hz) or via black periods between refreshes (flicker).

Some experimental OLED televisions now shorten frame sample lengths by strobing the pixels, and/or motion interpolation (extra refreshes). Motion interpolation can interfere with video games due to input lag. Fortunately, impulse-driving (flicker) is video game friendly.

This is why many HDTV displays now use motion interpolation and high refresh rates, as well as scanning backlights. In addition, some new 120Hz gaming computer monitors now have a strobe backlight feature such as LightBoost, which allows these LCDs to have less motion blur than sample-and-hold displays (including the PS Vita OLED). One big problem to overcome first is that impulse-driving requires a lot of brightness to compensate for the extra black period between refreshes. OLED has historically had brightness problems; however, researchers are continually improving this.

References

This list of scientific references helps you gain a better understanding of how eye tracking creates motion blur on sample-and-hold displays:

  • "Temporal Rate Conversion" (Microsoft Research) Information about frame rate conversion, that also explains how eye tracking produces perceived motion blur on a sample-and-hold display, including explanatory diagrams.
    -
  • "Correlation between perceived motion blur and MPRT measurement" by J. Someya (SID’05 Digest, pp. 1018–1021, 2005.) Covers the relationship between human perceived motion blur versusMotion Picture Response Time (MPRT)of the display. This also accounts for motion blur caused by eye tracking on a sample-and-hold display, a separate factor than pixel persistence.
    -
  • "What is needed in LCD panels to achieve CRT-like motion portrayal?" byA. A. S. Sluyterman (Journal of the SID 14/8, pp. 681-686, 2006.) This is an older 2006 paper that explains how scanning backlight can help bypass much of an LCD panel's pixel persistence.
    -
  • "Frame Rate conversion in the HD Era" by Oliver Erdler (Stuttgart Technology Center, EuTEC,Sony Germany, 2008) Page 4 has very useful motion blur diagrams, comparing sample-and-hold versus impulse-driven displays.
    -
  • "Perceptually-motivated Real-time Temporal Upsamplingof 3D Content for High-refresh-rate Displays" by Piotr Didyk, Elmar Eisemann,Tobias Ritschel,Karol Myszkowski,Hans-Peter Seidel (EUROGRAPHICS 2010 by guest editors T. Akenine-Möller and M. Zwicker) Section "3. Perception of Displays" (and Figure 1) explains how LCD pixel response blur canbe separate from hold-type (eye-tracking) motion blur.
    -
  • "Display-induced motion artifacts" by Johan Bergquist (Display and Optics Research, Nokia-Japan, 2007) Many excellent graphics and diagrams of motion blur, including impulse-driven and sample-and-hold examples.
    -
  • "Flicker Fusion" byStephen Macknik, Barrow Neurological Institute (Scholarpedia) Background information that relates to how flicker becomes a continuous image (applies to CRT and to scanning backlights).
    -
  • "Temporal Resolution" by Michael Kalloniatis and Charles Luu, Webvision (University of Utah) Background information that relates to human vision behavior and how multiple flicker events, over a short interval, blends together.

More references can be found in Science & References. Other Reading:
Higher framerates on non-flicker displays lead to less motion blur. That's why the 48fps Hobbit has half as much motion blur as the 24fps Hobbit. It's also why 120Hz LCD monitors have half as much motion blur as 60Hz LCD monitors (excluding LightBoost, which reduces motion blur even further by using a stroboscopic backlight). It's also why 30fps@60Hz has more motion blur than 60fps@60Hz.
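
To put rough numbers on those "half as much" claims, here's a quick back-of-the-envelope sketch of my own (not from the Blur Busters page); the 1000 px/s eye-tracking speed is just an assumed example:

# Rough sketch: on a sample-and-hold display, perceived smear is roughly the
# eye-tracking speed multiplied by how long each frame stays visible.
# Assumption: the viewer tracks motion at 1000 pixels/second (arbitrary example).

def blur_width_px(tracking_speed_px_s, frame_visible_s):
    """Approximate smear width in pixels for a tracked moving object."""
    return tracking_speed_px_s * frame_visible_s

speed = 1000.0  # px/s, assumed tracking speed

cases = [("24 fps sample-and-hold", 1 / 24),
         ("48 fps sample-and-hold", 1 / 48),
         ("60 Hz sample-and-hold", 1 / 60),
         ("120 Hz sample-and-hold", 1 / 120),
         ("strobed backlight, 2 ms flash", 0.002)]

for label, visible in cases:
    print(f"{label}: ~{blur_width_px(speed, visible):.1f} px of smear")

Doubling the frame rate (24 to 48 fps, or 60 to 120 Hz) halves the smear, while a short strobe cuts it far more, which matches the comparisons above.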
 
As no one has added that information so far... most humans can consciously perceive the difference in fluidity well above 200fps; independent research by both Sony and the US military indicates this. (Looks like Mark Rejhon wrote something more comprehensive while I was typing f^^; ...)

Sony once demoed an FED monitor running the game Gran Turismo at 240Hz, the ideal refresh rate for being pretty much perfectly fluid and divisible by standard frame rates like 24/30/60fps.
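
The "divisible by standard frame rates" point is easy to verify; a quick sketch of my own:

# At 240 Hz, every common source frame rate maps to a whole number of identical
# refreshes, so no uneven 3:2-style pulldown cadence is needed.

display_hz = 240
for fps in (24, 30, 60, 120):
    repeats = display_hz / fps
    cadence = "even cadence" if display_hz % fps == 0 else "uneven pulldown"
    print(f"{fps} fps on {display_hz} Hz: each frame shown {repeats:g} times ({cadence})")

# For contrast: 24 fps film on a 60 Hz display gives 60/24 = 2.5, so frames must
# alternate between 2 and 3 refreshes (3:2 pulldown), which causes judder.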

Let's just be friends and all agree that we'd be happy with a screen with 400ppi pixel density, 240Hz input, ultra-wide gamut with a working sRGB mode, true blacks, HDRI contrast and high maximum brightness, and let's say 32 bits per color channel to be absolutely safe :) ...and no aging like plasma or OLED. Or put differently, it doesn't really matter what someone considers inadequate in today's displays; everyone is right.

Some won't care for the PPI, some for the Hz, but all would have what they want.

It's unfortunate that the T221 and the other IDTech panels were too hampered by the lack of OS scaling options when they were released to create an impact on the resolutions of consumer displays...
 
As no one has added that information so far... most humans can consciously perceive the difference in fluidity well above 200fps; independent research by both Sony and the US military indicates this. (Looks like Mark Rejhon wrote something more comprehensive while I was typing f^^; ...)
Correct. Although human eyes cannot see flicker or individual frames at 200fps, there is definitely less motion blur at 200fps than at 30fps or 60fps. It's an indirect phenomenon: the motion blur is caused by your eyes tracking motion across statically displayed frames, since your eyes are in different positions throughout a frame as you track moving objects on a screen. Shorter refreshes mean less motion blur.

Sony once demoed an FED monitor running the game Gran Turismo at 240Hz, the ideal refresh rate for being pretty much perfectly fluid and divisible by standard frame rates like 24/30/60fps.
If there are still people who don't believe humans can tell apart 30fps versus 60fps, click on these links:
  1. See this webpage animation #1:
    15fps versus 30fps versus 60fps
  2. See this webpage animation #2:
    Test Motion With Multiple Custom Framerates
    - Set both balls to "Soccer Ball"
    - Set left animation to "60fps", right animation to "30fps"
    - Set motion blur to "None" (to exclude software-based motion blur for this test)
This continues beyond that too: 60fps versus 120fps, although it's twice as hard to tell apart, since motion blur does have points of diminishing returns. You need faster motion to see motion blur at shorter frame sample lengths (e.g. higher Hz, or shorter strobes). Eventually the motion blur becomes so tiny that, in order to see it, you need motion that's faster than the human eye can track.

A sample-and-hold display at X fps @ X Hz scientifically has the same amount of eye-tracking-based motion blur as an impulse-driven display that flashes each frame for 1/X of a second (even if the latter runs at a lower Hz, such as 60 Hz). Read the above scientific references to understand eye-tracking-based motion blur (sample-and-hold motion blur).
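
To illustrate the diminishing-returns point with numbers of my own (the 1-pixel "just noticeable" threshold is purely an assumption for the example):

# Rough sketch: the slowest eye-tracking speed at which the smear from a given
# frame-visibility time reaches an assumed just-noticeable width of 1 pixel.
# The shorter the frame is visible, the faster the motion has to be before any
# blur shows up at all; eventually that speed exceeds what eyes can track.

NOTICEABLE_BLUR_PX = 1.0  # assumed threshold, for illustration only

cases = [("60 Hz sample-and-hold", 1 / 60),
         ("120 Hz sample-and-hold", 1 / 120),
         ("480 Hz sample-and-hold", 1 / 480),
         ("1 ms strobe", 0.001)]

for label, visible_s in cases:
    speed_needed = NOTICEABLE_BLUR_PX / visible_s  # px/s
    print(f"{label}: blur only reaches {NOTICEABLE_BLUR_PX:g} px "
          f"once you track at ~{speed_needed:.0f} px/s or faster")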


The old "can't tell apart 30fps vs 60fps" argument is Blur Busted.
(and for that matter, 60fps vs 120fps too.)
 
I don't see how you figure that. Large resolutions might be good for browsing etc., but not when they're crammed into a minuscule display - you'd either have tiny unreadable text, or bugger-all working space.
For text reading you don't need so much "working space". What you need is properly rendered fonts. Font rendering has become so ugly, fuzzy and blurry in recent years in Windows programs that it's not an option: higher PPI plus zooming everything is a necessity to view the web comfortably. Too bad we aren't there yet :mad:
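
A small sketch of my own showing why zooming/scaling matters on a high-PPI panel (the ~204 ppi figure is the T221's approximate density; 96 ppi stands in for a typical desktop LCD):

# Pixels available to draw a 10 pt character if its physical size is kept the
# same across displays of different pixel density (1 point = 1/72 inch).

POINT_SIZE = 10  # pt, example value

for label, ppi in [("typical ~96 ppi desktop LCD", 96),
                   ("IBM T221, ~204 ppi", 204)]:
    px_height = POINT_SIZE / 72 * ppi
    print(f"{label}: a {POINT_SIZE} pt glyph gets ~{px_height:.0f} px of height")

Same physical size, roughly twice the pixels per glyph, so properly scaled text can look crisp instead of fuzzy.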

Ultra-fine pixel pitches on smaller displays would be more suited to gaming, but only if other aspects of the monitor are suitable, which in the case of the T221, they are not.
I don't find a 4K gaming monitor to be an exciting idea, because the GPU overhead would make playing any modern game a rather unpleasant experience. Processing power is better spent making 1080p run at 120Hz than 4K at 30fps...
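
The raw pixel throughput works out to be almost identical; quick arithmetic of my own (using UHD 3840x2160 as "4K" here):

# Pixels rendered per second: 4K at 30 fps versus 1080p at 120 fps.
# (Real GPU cost isn't perfectly linear in pixel count, but it's a fair ballpark.)

px_4k = 3840 * 2160
px_1080p = 1920 * 1080

print(f"4K    @  30 fps: {px_4k * 30 / 1e6:.0f} Mpx/s")
print(f"1080p @ 120 fps: {px_1080p * 120 / 1e6:.0f} Mpx/s")

Both come out to roughly 249 Mpx/s, so spending the power on 120Hz at 1080p instead of 30fps at 4K is a near-even trade in fill-rate terms.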

On my CRT I have all sorts of native resolutions, and for most games I often prefer a lower resolution + higher AA over a higher resolution, because games don't have very good textures or other details anyway, so it's better to hide things in blur than show them :eek:

The fact is that this monitor just isn't much good for anything at all.
I bet it's good for text reading; older IPS panels especially had a generally very soft and pleasant image (as opposed to older TNs :eek: )
 
I bet it's good for text reading; older IPS panels especially had a generally very soft and pleasant image (as opposed to older TNs :eek: )
The venerable IBM T221 is still damn impressive for viewing photos, even today. The first ones were often in offices that did satellite photo reconnaissance.
 
Comfortable viewing is subjective. I always feel like I can't make text as small as I'd like because it starts to look crummy, even close up. But I'll be happy to let you know when I get it. Although the upcoming EQD clarity screens look like they might make a good compromise at a reasonable price point.

And I'll repeat: regarding design, I'm talking more about the interface. Everything is so large and chunky, in large part (I've heard) due to console porting. Although better textures would be great!

Sounds like you are more like me.

I run my IBM T221 on Linux with X set to 75 DPI. This means smaller text than what is used on Windows.

I have no problems comfortably reading everything on the display from 2 feet away... It's subjective.
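
For anyone curious how much smaller that actually makes things, a rough estimate of my own (using ~204 ppi as the T221's true density and 96 DPI as the usual Windows assumption):

# How small a "10 pt" font ends up when X is told the screen is 75 dpi while
# the T221 panel is really about 204 ppi (3840 px across roughly 18.8 inches).

X_DPI = 75          # what X is configured to assume
PANEL_PPI = 204     # approximate true density of the T221
WINDOWS_DPI = 96    # the usual assumption on Windows

px = 10 / 72 * X_DPI                # pixels X lays out for a 10 pt glyph
physical_pt = px / PANEL_PPI * 72   # size those pixels actually occupy on glass

print(f"10 pt at {X_DPI} dpi -> {px:.1f} px -> ~{physical_pt:.1f} pt on the panel")
print(f"That's about {physical_pt / 10:.2f}x nominal size, and {X_DPI / WINDOWS_DPI:.2f}x "
      f"the pixels per glyph that Windows' {WINDOWS_DPI} dpi default would allocate")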
 
Sounds like you are more like me.

I run my IBM T221 on Linux with X set to 75 DPI. This means smaller text than what is used on Windows.

I have no problems comfortably reading everything on the display from 2 feet away... It's subjective.

What Linux distro do you use and did you set it up so that it spans across both halves as one monitor? I ask because I ordered one recently and I'm using Mint 14 right now. Also, does the special setup required make it a problem to configure a second monitor if I have another graphics card to run it off of?
 