24" Widescreen CRT (FW900) From Ebay arrived, Comments.

did some more experiments and WinDAS stuff.

Dynamic convergence is very cool, and pretty straightforward. Also did some testing of the sRGB mode. On my p275, switching to sRGB mode had absolutely no effect on the chromaticity of the primaries, but it did affect the luminance of each primary. My guess at this point is that it is adjusting the relative luminances (but not chromaticities) of each of the primaries to fit the sRGB gamut standards. Notice that the luminance of each primary ("Y") is specified, relative to the luminance of white.
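For what it's worth, those relative luminances can be derived from the published sRGB chromaticities themselves. Here's a quick sketch in pure Python (the chromaticity values are the standard sRGB/D65 ones; the solver is just Cramer's rule so no libraries are needed, and this is my own illustration, not anything the monitor does internally):

```python
# sRGB primary and D65 white-point chromaticities (x, y), published values
PRIMARIES = {'R': (0.64, 0.33), 'G': (0.30, 0.60), 'B': (0.15, 0.06)}
WHITE = (0.3127, 0.3290)

def xy_to_xyz(x, y):
    """XYZ tristimulus values for a color of unit luminance (Y = 1)."""
    return [x / y, 1.0, (1.0 - x - y) / y]

def det3(m):
    """Determinant of a 3x3 matrix given as a list of rows."""
    return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
          - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
          + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))

def primary_luminances():
    """Solve M * s = white via Cramer's rule; s[i] is the luminance of
    primary i relative to white (each column of M already has Y = 1)."""
    cols = [xy_to_xyz(*PRIMARIES[c]) for c in 'RGB']
    M = [[cols[j][i] for j in range(3)] for i in range(3)]  # columns -> rows
    w = xy_to_xyz(*WHITE)
    d = det3(M)
    s = []
    for j in range(3):
        Mj = [row[:] for row in M]
        for i in range(3):
            Mj[i][j] = w[i]
        s.append(det3(Mj) / d)
    return s

print(primary_luminances())  # roughly [0.2126, 0.7152, 0.0722]
```

The three numbers are the familiar 0.2126 / 0.7152 / 0.0722 weights, and by construction they sum to 1 (the luminance of white).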

On my FW900, when I switch to sRGB mode, the chromaticity does indeed change, although not by a large amount. This might mean that the GDM line of monitors has a color management system built into it. There's only so much it can do, however, since the native red primary is slightly less saturated than that of sRGB. This means that while the hue can be matched up to sRGB's red, the saturation can never be on target.

I also measured the spectral power distributions of the FW900's three phosphors. Check em out! (top to bottom: red, green, blue).

[image: x6mv6p.png (phosphor spectral power distributions)]
 
The USB hub could be at the back. My A271A is swiveled such that you can't see the USB ports unless you go around the back of the monitor. Then again, my unit has a prominent HP logo on the top bezel, and the bezels on those units are unblemished.

As for the control panel, you may be right - hard to tell from a straight-on angle, since the projection of a cylinder at that angle is very similar to that of a right-angled shape.

They're definitely CRTs and they certainly look like widescreens. What do you suppose they are?
 
If they also changed out the base to not have USB and squared off the control panel. Maybe.

I haven't seen the video, but if it's a widescreen CRT - then I would think it's either the GDM-FW900 or the GDM-W900. Both of which were made by Sony. I think Sony was the only widescreen CRT monitor manufacturer, but I could be wrong.
 
wow, I think you might be right jbl... just looked up some images of the W900 and the form factor around the control panel area looks very similar to that in the video. Wonder if they simply weren't able to find any decent FW900s, and had to settle instead!
 
The W900 is a curved tube and not quite as powerful as the FW900 AFAIK.

A bit of a wider dot pitch, curved tube, not as heavy, and a max resolution of 1920x1200 at 75 Hz, I believe. A calibrated one would be nothing to sneeze at though, and I'm sure it would make for an excellent monitor. I have a curved Dell M991 (Philips screen), and my eyes don't take long at all to adjust to it.

And as far as "settling" on a W900. Possibly. I do know that instead of a 4-pin TTL for WinDAS, it uses a DB-9 cable for DAS. Like I said - calibrated, they're probably nothing to sneeze at, and I'm sure they look fantastic.
 
hehe, it's still a great display (it's a GDM after all), but I'm sure if they had the choice between a W900 and an FW900 it'd be a no-brainer :)
 
Bit off topic, but talking about great displays... I have here in front of me an EIZO T966, 21" Trinitron tube... This was EIZO's top of the line highest spec 4:3 CRT introduced back in 2003... About the same level as Sony's 4:3 version of the FW900 (the F520)... with only a slightly lower max horizontal scan freq. Let me tell you... this thing does 1600x1200 @ 104 Hz no problems. 2048x1536 @ 80 Hz max. And I've just played Unreal Tournament on it in 800x600 @ 160 Hz.
Man. Just moving a window at that refresh rate it feels like it's gliding over the screen. It's unbelievable how smooth that is. Even coming from 85 Hz it feels way smoother, and compared to lame LCD-land 60 Hz, no kidding. It's just amazing the flexibility you get from a high-end CRT monitor. Imagine how far they could have pushed the electronics if they continued developing. We might have had 1920x1200 @ 200 Hz by now. Too bad we never got to see how far they can really go.
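For anyone wanting to guess what modes a given tube can sync, there's a quick back-of-envelope: the required horizontal scan frequency is roughly total lines per frame times the refresh rate. A rough sketch (the 5% vertical-blanking overhead is just a ballpark assumption, not any monitor's spec):

```python
def approx_hfreq_khz(active_lines, refresh_hz, vblank_fraction=0.05):
    """Rough horizontal scan frequency needed for a mode, assuming
    vertical blanking adds about vblank_fraction extra lines per frame."""
    total_lines = active_lines * (1 + vblank_fraction)
    return total_lines * refresh_hz / 1000.0

# The modes mentioned above
for lines, hz in [(1200, 104), (1536, 80), (600, 160)]:
    print(f'{lines} active lines @ {hz} Hz -> ~{approx_hfreq_khz(lines, hz):.0f} kHz')
```

All three land in roughly the 100-131 kHz range, which is why one high-end tube can juggle all of them.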

But this display is pretty kickass. Text is sharp (I adjusted it on the flyback) and when you connect it via USB cable you actually get a nifty utility that allows you to access all the monitor settings (such as color calibration, resizing, convergence etc.) through the Display options when you right click on the desktop. Playing anything on the thing is a blast. The only downside is that the anti-glare was scratched when I got it so I had to remove it. Now it looks less than glorious during the day and the static makes the glass get dusty quickly, but in a dim room and at night, it's just marvelous to look at.
 
@4ort, yup. Welcome to the club.

The dust will become its own removable antiglare layer :p

I ran my FW900 at 1200p@95 Hz, 1080p@117 Hz, and 1680x1050@120-ish; I know how it feels :D
Couldn't get the focus right at 1050 (it was a bit blurry), but that's okay, the pixels are a bit big at that res anyway.
 
I still can't get over the fact that FED/SED displays didn't make it. They were basically supposed to be a thin CRT. Can't get much better than that.

There was a prototype made some years ago and someone saw it. Check what he wrote:

"-Ikegami has a CRT sitting beside the FED prototype. The blacks on the FED
prototype look better than the CRT as it doesn't appear to suffer as much
from flaring in the glass of the monitor.
It has better blacks than a CRT (!).

-To my eyes, looks slightly sharper than the CRT. It has a full HD pixel
structure so it should be no surprise. Geometry would be perfect and corner
performance doesn't suffer.
- Color close to the CRT sitting beside it but not 100% exactly the same.
This is likely because it's a prototype and they didn't bother matching the
color. Native transfer function/gamma of the FED is slightly different, and
the phosphors are different.
- No price, not shipping, may be at least 2 years out."

http://tig.colorist.org/pipermail/tig/2008-April/013206.html
 
Ugh - if only. A thin CRT with better blacks, better color, no geometry issues.... Yeah - sounds a bit PERFECT to me.
 
FED/SED would have been the rescue for all of us trying to survive with the Trinitron, but sadly they dropped them because of OLED.

Plasma is a good option, but only for home cinema, not for PC usage.
 
Yeah, there's really no looking forward to anything in the PC display world anytime soon. I bought an IPS panel a couple of days ago (a pretty well reviewed one as well) and really tried to like it, I did. The colors were very good. Even the black level wasn't -that- bad. But the IPS glow which was -really- visible in the bottom right corner, even during the day, just killed all immersion and made the screen essentially junk because I was constantly annoyed by the glow rather than being able to enjoy the game.

Sure CRT's don't have perfect black level during the day and they look a bit washed out, but at least you can look forward to having an awesome experience at night. No such thing with the LCD, it just looks even worse...I returned it a few days later.
 

According to Wikipedia, the folks who bought Sony's FED research are still continuing to research it, and FED's not ready for consumers yet. I don't know - hopefully they bring em out.
 
Couple things:

Just resolved the issue with X-Rite about the spectroradiometer question. According to the head of tech support, the device is functioning as a spectrophotometer when it's measuring objects, such as a printed sample. When it's measuring a display, it's functioning as a colorimeter. They choose to describe it as such, because even though it's not working like a colorimeter (tristimulus device), it's only outputting colorimetric data, and not spectral radiance data (a spectroradiometer outputs spectral radiance data). However, because it can output spectral radiance data, with software such as HCFR, it can be used for spectroradiometric measurements. The reason they don't advertise it as a spectroradiometer, is because it doesn't meet the international standards and requirements for those devices.

I understand that you've reviewed those documents and compared the definitions of those different devices. But simply meeting the definitions is not the issue in this case.

Selling a spectrophotometer that can be correctly called a spectroradiometer would require X-Rite to meet all the international standards and certification specifications that those devices require. That simply isn't possible with this device, and the market for true spectroradiometers is sufficiently small that it would not be financially viable for our company to redesign and then profitably sell the resulting device.

...if you wish to use the i1PRO2 for spectroradiometric measurement, feel free. We will only advertise and promote the i1PRO2 as a spectrophotometer.




The second thing is regarding using your PC as a signal generator for WinDAS. I've been able to successfully sync my monitor to all the modes that WinDAS has requested so far, using the custom timing in Nvidia control panel. If you don't have this, you may have some luck with ATI's Catalyst Control Center. I've never used that, so I don't know if it has advanced timing capabilities, or how well they work. A couple of alternatives are ToastyX's CRU, or PowerStrip.

First, I would recommend going through this 13 page tutorial on how CRT timing signals work. It's excellent.

Now one of the hang-ups is that WinDAS requests the vertical parameters in microseconds, whereas Nvidia only allows you to input the parameters in pixels.

But the math works out.

Look at this image:

[image: 294qu84.png (WinDAS requested parameters, Nvidia custom timings, online calculator)]


Middle bottom: requested parameters from WinDAS

Top right: Nvidia custom timings settings

Top left: online calculator. The red fields indicate the values I entered. It's easy to calculate back porch: back porch = blanking pixels - (front porch + sync width), and blanking pixels = total pixels - active pixels. (This goes to show that the only independent parameters are the nine values it asks you to input, plus one for interlace mode and two for polarity.)

In the output (green fields, bottom left), notice how everything matches up? With respect to horizontal timings, it only showed the H sync value, but I'm guessing the others also match up.

For example, H total is the time it takes to scan through one full line of pixels (active plus blanking).

The horizontal frequency is 107100 lines per second. That means each line takes 1/107100 seconds = 9.337 microseconds. Check.

Horizontal front time refers to the time it takes to scan through the horizontal front porch's worth of pixels. Our horizontal front porch is 152 pixels, and our horizontal total is 2640 pixels. If it takes 9.337 microseconds to scan through 2640 pixels, then each pixel takes 9.337/2640 = 0.003537 microseconds (about 3.5 nanoseconds!). Therefore, the time it takes to scan the horizontal front porch is 152 x 0.003537 ≈ 0.538 microseconds. Check.

And so on and so forth, I'm pretty sure the math adds up.
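To make the pixel-to-microsecond conversion concrete, here's a little script. The total pixels (2640), front porch (152), and scan frequency (107100) are the ones from the example above; the active width (1920) and sync width (208) are placeholders I made up for illustration, since I don't have the full screenshot values handy:

```python
def timings_us(h_total_px, h_freq_hz, h_active, h_front, h_sync):
    """Convert horizontal timing parameters from pixels to microseconds.
    h_freq_hz is the horizontal scan frequency in lines per second."""
    line_us = 1e6 / h_freq_hz                              # time for one full line
    px_us = line_us / h_total_px                           # time per pixel
    h_back = (h_total_px - h_active) - (h_front + h_sync)  # back porch, in pixels
    return {
        'line (H total)': line_us,
        'front porch': h_front * px_us,
        'sync width': h_sync * px_us,
        'back porch': h_back * px_us,
    }

# 2640 total px at 107.1 kHz; 1920 active and 208 sync are example values
for name, t in timings_us(2640, 107100, 1920, 152, 208).items():
    print(f'{name}: {t:.3f} us')
```

The front porch comes out around 0.537-0.538 microseconds regardless of the made-up active/sync widths, since it only depends on the total pixel count and the scan frequency.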

As for test patterns, I've been making them as I go along, and will share the collection when ready. They include the white screen, the black screen, the 30 IRE screen, the grayscale, the crosshatch RGB, and the crosshatch RB patterns, in all the resolutions that WinDAS requests. I made them in Inkscape, designed from scratch for each resolution (so that there are no scaling artifacts).
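On the test patterns: if anyone prefers generating them programmatically instead of drawing them in Inkscape, something like this writes a pixel-exact crosshatch as a binary PPM (a simple format most viewers and converters handle). Since it's rendered at the exact target resolution, there's nothing to rescale. The grid spacing and colors are just example values:

```python
def crosshatch_ppm(path, width, height, spacing=64,
                   line=(255, 255, 255), bg=(0, 0, 0)):
    """Write a binary PPM (P6) crosshatch: 1-px grid lines every `spacing`
    pixels, drawn at the exact target resolution so nothing gets rescaled."""
    rows = []
    for y in range(height):
        row = bytearray()
        for x in range(width):
            on_line = (x % spacing == 0) or (y % spacing == 0)
            row += bytes(line if on_line else bg)
        rows.append(bytes(row))
    with open(path, 'wb') as f:
        f.write(f'P6 {width} {height} 255\n'.encode('ascii'))
        f.writelines(rows)

# One pattern per WinDAS resolution, e.g.:
crosshatch_ppm('crosshatch_1920x1200.ppm', 1920, 1200)
```

The same loop, with the line color changed per channel, would cover the RGB and RB crosshatch variants.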
 
Ugh - if only. A thin CRT with better blacks, better color, no geometry issues.... Yeah - sounds a bit PERFECT to me.

I dunno enough about FED, but would it have true multisyncing capability like a regular CRT? In other words, is it stuck at a native resolution (and would have to employ scaling to work with other resolutions)?
 
It's a fixed-pixel display. But even so, it would still be friggin epic. The only other thing that would be multisync might be Laser RP, like the Mitsubishi LaserVue TV's. It could display 90% of the visible color spectrum. Of course, like every other GOOD display technology out there, it got killed off last year.
 
Yea I was surprised I didn't hear much about the laservue as of late. What happened to it? Seemed like a solid piece of tech.
 
Probably not enough people cared about image quality to buy it, kept the price up high... wasn't profitable for them to sell it anymore so they binned it? Just my guess.
 
Spacediver, thanks for the timing information. So EVERY resolution that you selected in WinDAS synced up then? Awesome news indeed!
 
yep, although I haven't yet done the alignment at mid and low frequencies, so not sure if it'll ask for something funky then.
 
So I have a problem: someone won my auction for an FW900 and he's not contacting me at all. I really love this monitor, but it's too expensive to keep, so I have to give it to a good home who'll take care of it.

So who wants a good quality FW900 monitor? It was manufactured in November 2000, runs at a pretty low G2 level, has about 4380 hours on it, and has no major problems. I'm looking to get about $200-$300 for it; if the price is $500-600 I'll get LAURANAGNER to calibrate it for you. It's really a good monitor; all it needs is a good calibration, and if calibrated annually and used wisely, it can last you a long time.

I'm located in Corona, CA; PM me if interested.
 

PM'd.
 
Just an FYI - there's a cheap Quantum Data Signal Generator for sale that's listed as untested and sold as-is. I've messaged the seller to get some more information - like how has he not been able to test it (seriously - it's easy enough to do - plug it into a monitor and turn the damn thing on)? No response yet. My guess is that it's "as-is" and "untested" because it doesn't work. Just letting you guys know to beware.
 
Interesting, that's good to know. Thanks for the heads up.

Honestly, the only reason I can see myself purchasing a signal generator would be that it makes it more convenient to set up the workflow, and would allow me to calibrate other people's monitors without worrying about whether their video card and operating system allows proper signals to be sent.
 
How do you know this? Is there some way for us to look at that value without pro equipment?

I read from an eBay member that the value could be read by using WinDAS, realizing this I took a look at past documents and found that I had documented the number of hours the monitor had been running.

You cannot believe the number of modified G2 dats that I have.
 
Are there any parameters in WinDAS that allow you to make the monitor brighter? (Increase the gain/contrast?)

When I took mine in for calibration they only calibrated one of the color temperatures and calibrated it to something like 80 cd/m2 max, not even the 120 cd/m2 that is usual... it was MUCH brighter before calibration, now it's unusable during the day, and hardly even bright enough at night... And this was at an authorized Sony service center.
 
I haven't yet done the WPB with the FW900, but with the P275 it asks you to hit certain luminance targets (I believe it was around 80 nits like yours was calibrated to).

There are some parameters in the EEPROM that control these settings. I know G2 is one, but I think that may be more of a brightness, rather than contrast issue. It's a bit difficult to figure out since brightness and contrast are often defined differently, depending on context or display. But there are parameters like DRIVE_MAX or something. I personally would rather just trust the factory specs: I'm not sure if you'll be able to safely increase the maximum brightness level without sacrificing black level performance.

I'll probably be doing my first WPB on my spare FW900 this weekend. I'll report back the luminance results. I can say that my main FW900 runs at around 80 nits with a white screen. I certainly don't have any complaints.
 
Yeah the G2 has an apparent effect on black level/offsets for the guns. That's not what I'm looking for, but to make the monitor brighter as in, increase the strength of white. Same thing that you can do with the contrast/gain control on the monitor. Except that they limited mine to some stupid low level.

Increasing contrast won't affect the black level, it's a completely different setting. It basically affects how hard the guns work. But it seems there are some controls in WinDAS that can restrict the contrast/gain setting in the monitor menu, which you could normally use to make the monitor brighter.
 

Spacediver already said it, but I'll confirm. The setting you want to tweak is MAX_DRIVE. This should adjust your white level (contrast). Be careful! Don't overdo it.
 
Thanks. I'm not looking for anything excessive. They just limited it way too much. I just want to return it to the level of contrast it had before, which was around 140 cd/m2 during the day and around 70-80 at night. Nothing out of the ordinary for a CRT (especially without anti-glare).

What is your MAX_DRIVE set at if you can take a look? I lost all my old .dat files sadly.
 

Max drive for me is 210 I believe. Every tube is different though, and you should first save data from your monitor and then make changes (with a new file name of course - always backup your DAT file).
 
heh, just came across a very interesting piece of information from Charles Poynton. I've been wondering for a while how studios that use CRTs for color mastering were able to adhere to the REC 709 gamut, when their CRTs were using the SMPTE-C primaries.

Turns out they might not have been.

Which means that a lot of HD material will actually render more accurately on our FW900s :D


here's a key quote:

However, HD content creators hold a dark secret: Following international agreement on BT.709 in 1990, content creators in 60 Hz countries never switched from the SMPTE primaries of 480i SD to the BT.709 primaries, and content creators in 50 Hz countries never switched from the EBU primaries of 576i SD to the BT.709 primaries. HD content is generally not approved on BT.709 displays! SMPTE "C" (RP 145) primaries remain entrenched for 60 Hz HD in North America, Japan, and much of Asia; EBU primaries remain entrenched for 50 Hz HD in Europe and other parts of the world.
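Out of curiosity, the mismatch can be quantified. Here's a sketch that computes the 3x3 matrix taking linear-light Rec.709 RGB to SMPTE-C display RGB, both with a D65 white point (pure Python; the chromaticities are the published ones, and this is just my own illustration, not anything from Poynton's piece):

```python
# Published chromaticities (x, y); both gamuts use the D65 white point
REC709 = [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)]
SMPTE_C = [(0.630, 0.340), (0.310, 0.595), (0.155, 0.070)]  # RP 145
D65 = (0.3127, 0.3290)

def xy_to_xyz(x, y):
    return [x / y, 1.0, (1.0 - x - y) / y]

def det3(m):
    return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
          - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
          + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))

def solve3(M, v):
    """Solve the 3x3 system M x = v by Cramer's rule."""
    d = det3(M)
    out = []
    for j in range(3):
        Mj = [row[:] for row in M]
        for i in range(3):
            Mj[i][j] = v[i]
        out.append(det3(Mj) / d)
    return out

def rgb_to_xyz(primaries, white):
    """Standard construction of the RGB-to-XYZ matrix from primaries + white."""
    cols = [xy_to_xyz(*p) for p in primaries]
    M = [[cols[j][i] for j in range(3)] for i in range(3)]
    s = solve3(M, xy_to_xyz(*white))
    return [[M[i][j] * s[j] for j in range(3)] for i in range(3)]

def inv3(M):
    # invert by solving M x = e_j for each basis vector
    cols = [solve3(M, [1.0 if i == j else 0.0 for i in range(3)]) for j in range(3)]
    return [[cols[j][i] for j in range(3)] for i in range(3)]

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

# Linear-light Rec.709 RGB -> SMPTE-C display RGB
M = matmul(inv3(rgb_to_xyz(SMPTE_C, D65)), rgb_to_xyz(REC709, D65))
for row in M:
    print(['%.4f' % v for v in row])
```

The result is close to (but not) the identity matrix, since the two gamuts share a white point and the primaries only differ slightly; those small off-diagonal terms are exactly the mismatch a mastering display with SMPTE-C phosphors introduces.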
 
always backup your DAT file.

second this - I'm really glad I made a backup, as I messed up some settings somehow in WinDAS and the display geometry was all messed up. Wouldn't allow me to vertically stretch the display back to normal. If not for that backup I made right at the start, I dunno what I would have done!
 