24" Widescreen CRT (FW900) From Ebay arrived,Comments.

redid some adjustments, and bumped contrast up to 100. Managed to get a black level of 0.01 and a white level of just over 100 (over 10000:1 contrast ratio). Gonna do some more experimentation tomorrow.
 
redid some adjustments, and bumped contrast up to 100. Managed to get a black level of 0.01 and a white level of just over 100 (over 10000:1 contrast ratio). Gonna do some more experimentation tomorrow.

I believe factory defaults are that the brightness is set to 50, and contrast is set to 85 if you want to shoot for some targets.
 
I believe factory defaults are that the brightness is set to 50, and contrast is set to 85 if you want to shoot for some targets.

Yea, I remember reading about those values somewhere. Thing is, even after MPUing my display, it didn't change my brightness and contrast to those values.

Either way, I'll figure out a good contrast and black level that I'm happy with. I'm then gonna measure each of the 256 luminance levels, and use Matlab to create a 1D lookup table (LUT) so that I can hit any gamma curve I want, whether it be a flat 2.2, 2.4, or BT.1886. That way, I should be able to enjoy good contrast, without crushing detail or washing it out.
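
To give a sense of what that step looks like, here's a rough sketch of the logic (in Python/numpy, though the same thing is easy in Matlab). The file name is just a placeholder, and it assumes the 256 measured luminances rise monotonically:

```python
import numpy as np

# measured[i] = luminance (cd/m^2) of a full-field patch at 8-bit level i.
# "fw900_gray_ramp.txt" is a placeholder file with 256 readings, assumed
# to rise monotonically from black to white.
measured = np.loadtxt("fw900_gray_ramp.txt")

black, white = measured[0], measured[-1]
levels = np.arange(256) / 255.0

# Target response: a simple power-law gamma scaled between the measured
# black and white (BT.1886 would use a slightly different formula).
target_gamma = 2.4
target = black + (white - black) * levels ** target_gamma

# Invert the measured curve: for each target luminance, find the input
# level that produces it, then round back to 8 bits for the 1D LUT.
lut = np.round(np.interp(target, measured, np.arange(256))).astype(np.uint8)
np.savetxt("gamma_2.4_lut.txt", lut, fmt="%d")
```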
 
So I did some more experimentation, and noticed I was getting inconsistent white balance readings. After inspection, I figured out why.

The display doesn't have perfect color uniformity. At first I thought it was due to the aging and splotchy antiglare, so I tested it on my other unit, and it still has some issues (though I think it's not as bad, not sure yet). I wasn't able to solve this issue with the landing. I tested chromaticity at the center of the screen, and still wasn't able to match it near some of the corners no matter what landing adjustment I did. I tried both BNC and VGA cables - no diff.

I'm gonna try doing a full WinDAS calibration on this puppy and see if I can get some good results.

Anyway, in the process, I had a chance to really look at the image without the antiglare.

I have three words:

No. Fucking. Contest.

Antiglare off is an incredible image - it has a crystal clear glossiness that is just gorgeous.

And I don't think it affects black levels irreparably. Just a good idea to re-do your white point balance/g2 adjustment after removing it.

Wow!

I'm gonna stick with this display and play around with it. Once I've calibrated it to the best of my abilities, I'll report uniformity measurements across the entire screen (I'll do about 70 measurements, and report variations in luminance, x, and y, and plot them).

Future project will be to do the same measurements on my HP unit that still has antiglare on. I intend to do the measurements once with the antiglare on, and once with it off. That'll give interesting insights into how the AG changes chromaticity overall, and if and how it affects uniformity.
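
For what it's worth, here's roughly how I plan to summarize those readings (a Python/numpy sketch; the file name, grid size, and column order are placeholders for whatever the actual run produces):

```python
import numpy as np

# Placeholder layout: one row per measurement point on a 7 x 10 grid
# (70 points), columns = luminance Y (cd/m^2), CIE x, CIE y.
readings = np.loadtxt("uniformity_grid.txt").reshape(7, 10, 3)

center = readings[3, 5]   # the point nearest the middle of the screen
dY = 100.0 * (readings[..., 0] - center[0]) / center[0]   # % luminance deviation
dx = readings[..., 1] - center[1]                          # chromaticity x deviation
dy = readings[..., 2] - center[2]                          # chromaticity y deviation

print("luminance spread: %+.1f%% to %+.1f%%" % (dY.min(), dY.max()))
print("max |dx| = %.4f, max |dy| = %.4f" % (abs(dx).max(), abs(dy).max()))
```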
 
Well, I've just done three WPB's back to back. The first didn't save the settings properly. After the second was done, I realized I hadn't set my nvidia gamma to default. These are the results from the third:

[image: WPB grayscale measurement results - 2jb3xhi.png]


These results are a real testament to the stability of the display's gun control - it maintains a steady balance of the beams even in between the adjustment points.

Note that the black point is nice and low (and this is on a unit without antiglare).

I think, however, that because I set the G2 value so low during the adjustment, the blacks are extremely crushed. Notice how the luminance (Y) rises extremely slowly. This also accounts for the very low delta E at the 10% signal (delta E is a perceptual measure, and chromaticity errors are less perceptible at lower luminances). I've used the updated delta E formula (CIEDE2000).

I'm learning how to make calibrated gamma ramps (CLUTs) so that you can adjust the video card's signal and create a much better gamma curve. This will allow me to preserve the awesome black level and color accuracy.

I know many of you are like WTF is he talking about, delta E, blah blah blah, but trust me, this is the mark of an impressive piece of display technology (barring uniformity issues).

I've also gained a lot of experience in the workflow, and have found a couple of really useful features of HCFR that help with the calibration. For example, you can have a realtime visualization of RGB levels while you're adjusting, so you don't always have to think in terms of x y values. Also, sometimes when you're not sure which compromise to make (say your target is x = 0.313, y = 0.329, and you're deciding between 0.312, 0.330 and 0.314, 0.328), you can plug the values into HCFR and have it tell you what the delta E is (we're not equally sensitive to errors in x and y).
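
If you want to do that same comparison outside HCFR, here's a quick sketch using scikit-image. It assumes both candidates sit at the same luminance and uses D65 as the Lab reference white:

```python
import numpy as np
from skimage.color import xyz2lab, deltaE_ciede2000

def xyY_to_lab(x, y, Y=1.0):
    # xyY -> XYZ (white normalized to Y = 1), then XYZ -> Lab against D65
    xyz = np.array([[x * Y / y, Y, (1.0 - x - y) * Y / y]])
    return xyz2lab(xyz, illuminant="D65", observer="2")

target = xyY_to_lab(0.313, 0.329)   # the white point target
cand_a = xyY_to_lab(0.312, 0.330)   # compromise option A
cand_b = xyY_to_lab(0.314, 0.328)   # compromise option B

print("dE2000, option A:", deltaE_ciede2000(target, cand_a))
print("dE2000, option B:", deltaE_ciede2000(target, cand_b))
```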

I plan on making a comprehensive workflow tutorial, but in the meantime, if anyone has any questions about the WPB in WinDAS, I'm happy to share.
 
Do you have any before/after screenshots? I know, trying to show that is like explaining the advantage of 120hz on YouTube, but still, I'm very curious.
 
All the benefits of these calibration procedures are instantiated through the display itself. If I took a screenshot and you loaded up that screenshot on your display, it would show the image as your display interprets it. The screenshots I take will just have RGB values at each pixel. Those values don't change at all when a display gets calibrated.

If I had a decent camera, I could capture the difference by taking an actual picture of the display, and it might lend some sense of the change, but even that has a lot of problems with it.

One thing that is definitely noticeable is that skin tones in movies look so much more real and natural.
 
I know you have a signal generator, but something like white balance could probably be done without one, am I right? After all if you are displaying a grayscale image testing for the adherence to D65 then it should not matter what the exact timings are, though I do not know what exact patterns WinDAS asks for during the process.

In other words you could use something like PowerStrip for the timings and HCFR to display patterns.
 
I know you have a signal generator, but something like white balance could probably be done without one, am I right? After all if you are displaying a grayscale image testing for the adherence to D65 then it should not matter what the exact timings are, though I do not know what exact patterns WinDAS asks for during the process.

In other words you could use something like PowerStrip for the timings and HCFR to display patterns.

I'm not using a signal generator - that's jbltecnicspro who got one, not me!

See my post here, and the discussion that follows it.

I think your intuition is probably correct. I see no reason why timings would be critical in the context of color calibration. That said, I've been able to successfully sync every single mode that WinDAS has requested, including the convergence modes, the modes at high and mid frequency alignments (haven't yet tried the low frequency alignments).

Once you get the hang of it, it's pretty simple. If you set the active pixels, total pixels, refresh, and vertical front porch and vertical sync width, the rest pretty much falls into place.

I use HCFR in free measure mode so I can monitor the readings of my instrument in real time (and I have HCFR loaded on my laptop along with WinDAS). My main computer, which acts as a signal generator, is dedicated to loading test patterns that I've created from scratch in Inkscape. I use IrfanView to display them, and have ensured that no color correction is "tainting" the signal (ICC profiles, gamma correction LUTs, etc.)
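
For reference, here's the basic arithmetic behind "the rest falls into place" once you've picked your totals. The blanking totals below are plausible placeholders, not the exact numbers WinDAS asks for:

```python
# Rough timing arithmetic for a custom mode. The active size and refresh
# are the ones I use; the totals (active + blanking) are just plausible
# placeholders for illustration.
h_active, v_active = 1920, 1200
h_total,  v_total  = 2560, 1245
refresh = 85.0                                 # Hz

pixel_clock = h_total * v_total * refresh      # pixels per second
h_freq = v_total * refresh                     # line rate (Hz)

print("pixel clock: %.1f MHz" % (pixel_clock / 1e6))      # ~270.9 MHz
print("horizontal frequency: %.1f kHz" % (h_freq / 1e3))  # ~105.8 kHz
```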
 
I know you have a signal generator, but something like white balance could probably be done without one, am I right? After all if you are displaying a grayscale image testing for the adherence to D65 then it should not matter what the exact timings are, though I do not know what exact patterns WinDAS asks for during the process.

In other words you could use something like PowerStrip for the timings and HCFR to display patterns.

I'm the dude with the generator. Be warned though... If you're not using a generator and just your video card, the output levels may not be trustworthy. Remember that generators are designed for the express purpose of being on point with their accuracy. If all you're doing a WPB for is your own usage, then fine - it probably doesn't matter as much. But if your intention is to get absolute color accuracy for photo editing and film projects (or if you just care that much), then you'll probably want a generator.

Also - for anyone who cares, I've successfully tested the BNC output. So this means that my gamble on the 801GG has completely paid off. It's fully functional, self-calibrating (all tests pass), and the RS-232 works just like it should. Boom! Now the million dollar question - do I flip it? LOL - of course I won't. I wouldn't sell this tool for anything.
 
I'm the dude with the generator. Be warned though... If you're not using a generator and just your video card, the output levels may not be trustworthy. Remember that generators are designed for the express purpose of being on point with their accuracy. If all you're doing a WPB for is your own usage, then fine - it probably doesn't matter as much. But if your intention is to get absolute color accuracy for photo editing and film projects (or if you just care that much), then you'll probably want a generator.

Honestly, I see absolutely no reason why a generator is necessary even for high end WPB calibration. All that is needed to do a successful WPB is to be able to meet luminance and chromaticity targets. Video cards are more than capable of sending the correct RGB info to the monitor.
 
Yeah. There's really not much a video card can get wrong in that regard. If your top luminance (full white) is a certain brightness, and the steps on the grayscale follow the required gamma while maintaining a low enough delta E, then by all accounts the video card is doing its job properly (as you would expect - that is something pretty basic, and after all it is almost 2014).
 
This is especially the case given that it's a 2 or 3 point grayscale adjustment. So when it asks for a 30 IRE pattern, so long as you're displaying a "dark gray" you should be fine. It only asks you to hit luminance targets with the full black or full white patterns.

But even with other adjustments (convergence, alignment), the more I learn, the more I suspect that signal generators are, in some cases, redundant. So long as you have a good video card, can control all the relevant timing parameters, and can generate and display your own test patterns, I don't see why there'd be a problem.

There's also this argument:

Suppose that indeed, somehow, a signal generator gave a "purer" signal compared to your video card, so that the chromaticities were slightly different. Well if that were the case, then calibrating it with the signal generator would kinda defeat the purpose in a way, if you're going to end up using your video card to actually USE the monitor. There is the argument to be made for calibrating in the same environment that you're going to be operating in. Anyway, that's just a "what if" thought experiment. There's no good reason to suppose that color is affected.

Don't get me wrong, a signal generator can be great if you're calibrating TV's, or want more portability. Plus, as many of us know, it can sometimes be a pain in the ass to set up the correct timings for the video card. I'd love to have a good one!
 
Actually, just had a thought.

jbl, if you end up getting a colorimeter, you can do some direct comparison measurements between the signal generator and your video card. Just display a white pattern, make sure all the video card settings are at default, and measure x, y, and luminance (Y). Leave the colorimeter on the screen, and switch to the signal generator by switching inputs on the monitor. That way you can do direct A-B comparisons really fast.
 
The question is, in the PC world, what RGB value is actually 30 IRE?

As far as I understand IRE, 0 means full black and 100 means full white. In the HDTV world the RGB levels go from 16-235, but since we are working on PCs we should only concern ourselves with 0-255 levels.

IRE is dependent on gamma. So, calculating 30% IRE as 0.30 * 255 does not work without taking the gamma into account. But what gamma does the FW900 expect? Its native gamma? 2.3? Or the sRGB 2.2? Since it is supposed to be used with a signal generator, I don't know if they display IRE steps linearly and let the device's native gamma handle it, or if they implement a gamma curve themselves.
 
Yes, in general, use 0-255, unless you're dealing with very specific situations.

I wasn't aware that IRE was dependent on gamma. My understanding was that it is a unit of voltage that starts at -40 and ends at +100, where +7.5 is the black level (RGB 0 0 0). However, I've also heard that this 7.5 IRE is equivalent to black level only when dealing in video RGB, and in PC RGB it's equivalent to RGB 0 0 0.

It's hard to find solid info on this. One way to tell for sure would be to use a signal generator to output a 30 IRE pattern, measure the luminance, and then use your video card to output various RGB values until you got the same luminance.

I created a 30 IRE pattern using an RGB value of 77 77 77. I don't think it's crucial to get it precisely, because again, WinDAS never asks you to meet luminance targets at 30 IRE.

Why do you say it's dependent on gamma? IRE is a voltage, independent of gamma. The gamma tells you something about how the display reacts to voltage - i.e. luminance as a function of input signal. But RGB is also an input signal, and I would assume that the relationship between RGB and voltage is linear.
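
For what it's worth, here's the linear mapping I've been assuming, with both black-level conventions. Treat it as a working assumption rather than something I've verified against a generator:

```python
def ire_to_level(ire, setup=0.0):
    """Map an IRE value to an 8-bit level, assuming a purely linear
    relationship. setup = 0.0 treats 0 IRE as black (PC convention);
    setup = 7.5 treats 7.5 IRE as black (NTSC-style setup)."""
    frac = (ire - setup) / (100.0 - setup)
    frac = max(0.0, min(1.0, frac))
    return int(frac * 255 + 0.5)

print(ire_to_level(30))        # 77 -> the value I used for the 30 IRE pattern
print(ire_to_level(30, 7.5))   # 62 -> if 7.5 IRE were treated as black
```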
 
To be honest, I've got no clue what you guys are talking about, since the whole "hey, you must do a WinDAS calibration on the FW900 or it sucks" idea came to this thread in the last few months.

Getting kinda bored with it; it would be great if we could comment on an actual guide or so. Maybe create a new thread?

No offense of course ;)
 
If you have something else FW900 related that you'd like to contribute, feel free to share. You won't be ignored :p

There's no harm in technical discussion, and it can only help clarify issues and act as a record for future readers.
 
Spacediver, I was reading this discussion:

http://www.avsforum.com/t/1113028/what-of-100-ire-is-30-ire

And it sort of got me quite confused. Now I think I finally understand it: only the actual light output of 30 IRE varies with gamma, but 30 IRE will always be 30 IRE (30% of the 255 max RGB value), at least as long as we are talking about the PC 0-255 RGB range. On an HDTV, 30 IRE is again something else... I hope I got that right.

So when WinDAS asks for that pattern it is simply to measure the DeltaE's on the low end of the grayscale? By the way, what calibrator are you using?


@atwix No one says the FW900 sucks if it isn't calibrated. Hell, I only eyeballed mine and it looked pretty darn awesome. I really need to get it out of storage and start using it again but it has this small issue so I don't know how long it will last. But it probably has a few months before I have to get it repaired again.
 
To be honest, I've got no clue what you guys are talking about, since the whole "hey, you must do a WinDAS calibration on the FW900 or it sucks" idea came to this thread in the last few months.

Getting kinda bored with it; it would be great if we could comment on an actual guide or so. Maybe create a new thread?

No offense of course ;)

None taken. To me, calibration discussions are good because knowing how to calibrate a monitor is key to keeping our GDM's alive and kicking. It does get very technical, and there are differing opinions of course. I myself am sticking to my guns with the generator. It was cheap enough (costs the same as a budget video card), does all modes of the GDM line of monitors (and others of course), and, used alongside WinDAS, makes calibrating these devices much easier than before. And it's probably more accurate to boot.
 
If you have something else FW900 related that you'd like to contribute, feel free to share. You won't be ignored :p

There's no harm in technical discussion, and it can only help clarify issues and act as a record for future readers.

Hear, hear! (or is it Here! Here! haha). If you go back further (and there are a TON of pages), you'll see that the people of the past kind of did the same thing. Though perhaps this is evidence that we really should just make an owner's club page (Reddit?) and consolidate all the information (post guides, etc).
 
4ort thanks for the link to that fascinating thread. After reading sotti's responses, I think tbrunet is correct about IRE being "linearly mappable" onto digital input level (the mapping would change depending on whether you're using 7.5 IRE for black level or 0 IRE for black level). I think Chris and Doug were incorrect in the beginning of that thread where they said it depended on gamma. In other words, I think you have it right in your last post.

I may be meeting with someone in the near future who'll definitely have an answer, but for now I think it's a safe assumption to use 30 IRE as a 30% signal (safe in that it won't make a difference in WinDAS).

Yes, WinDAS asks you to adjust the settings at the low end (30 IRE) and settings at the high end (full white), and to keep adjusting them iteratively until you meet your targets at both levels. Read Curt Palme's guide here. It'll put the 30 IRE thing into context.

I'm using a DTP-94 (colorimeter), and I've checked it against a brand new i1 pro (spectroradiometer). It's common practice to profile a colorimeter against a reference spectroradiometer at higher luminances, and then use the colorimeter for the actual calibration, as they are faster and more accurate at lower luminances.
 
I think the HDFury 4 only supports up to 72hz @ 1080p, which means even lower refresh at 1920x1200. The admin who responded in that thread ignored the part about refresh rates.
 
I think the HDFury 4 only supports up to 72hz @ 1080p, which means even lower refresh at 1920x1200. The admin who responded in that thread ignored the part about refresh rates.

72hz at 1080p? Technically it should be just fine on our screen, barring the stupid bars on top and bottom.
 
still pretty limiting. Many people prefer a refresh of at least 85hz for visual comfort. Some people like to work at 2304x1440 - I can only imagine what refresh rate the HDFury 4 would support at that resolution.

There ain't really a solution yet.
 
still pretty limiting. Many people prefer a refresh of at least 85hz for visual comfort. Some people like to work at 2304x1440 - I can only imagine what refresh rate the HDFury 4 would support at that resolution.

There ain't really a solution yet.

Guess we're going to have to cook something up then! :D
 
With the upcoming UHD standards, there may be some hope. There's a lot of demand for high resolution displays (2160p) with high bit depths and framerates. HDMI 2.0 offers 18 Gbps - anyone know how to figure out how many Gbps a 1920x1200 @ 85hz signal requires?
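
Here's my rough attempt at the math - happy to be corrected. The 25% blanking allowance and the 8b/10b encoding overhead are just approximations, so treat the numbers as ballpark:

```python
# Back-of-the-envelope bandwidth for 1920x1200 @ 85 Hz at 24 bits per pixel.
h, v, refresh, bpp = 1920, 1200, 85, 24

active = h * v * refresh * bpp          # bits/s of visible pixels (~4.70 Gbps)
with_blanking = active * 1.25           # include blanking intervals (~5.88 Gbps)
on_the_wire = with_blanking * 10 / 8    # 8b/10b-style encoding (~7.34 Gbps)

print("active video : %.2f Gbps" % (active / 1e9))
print("with blanking: %.2f Gbps" % (with_blanking / 1e9))
print("on the wire  : %.2f Gbps" % (on_the_wire / 1e9))
```

Either way, it looks like it comes in well under the 18 Gbps figure.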
 
I think the HDFury 4 only supports up to 72hz @ 1080p, which means even lower refresh at 1920x1200. The admin who responded in that thread ignored the part about refresh rates.

Yeah, you're right. The official HDFury guy (maybe one of the engineers?) showed up and clarified that no, none of the currently available models can go much higher than 1080p72, or a total of 235 MHz.

However, he did say that they are working on a 4k model that should arrive next year. So we may be in luck. That would arrive around the time we start to see more cards without DAC's.

http://dme.ghost2.net/forum/viewtopic.php?p=288290#288290

And I should also say that these are the guys who we can really trust to bring the best solution for us. Each iteration of the HDFury has added more and more features that have kept CRT owners on pace with evolving video standards. I currently use the original model for my PS3 and it looks fantastic and has absolutely no lag.
 
Yeah, you're right. The official HDFury guy (maybe one of the engineers?) showed up and clarified that no, none of the currently available models can go much higher than 1080p72, or a total of 235 MHz.

However, he did say that they are working on a 4k model that should arrive next year. So we may be in luck. That would arrive around the time we start to see more cards without DAC's.

http://dme.ghost2.net/forum/viewtopic.php?p=288290#288290

And I should also say that these are the guys who we can really trust to bring the best solution for us. Each iteration of the HDFury has added more and more features that have kept CRT owners on pace with evolving video standards. I currently use the original model for my PS3 and it looks fantastic and has absolutely no lag.

Pardon my noobiness, but what exactly do you need an HDFury 4 for on a FW900?

Is the only reason that the RAMDAC will vanish from desktop graphics cards sooner or later? Aka, to use it as an external RAMDAC? Or are there more benefits? Like getting a better image when connecting a PlayStation 4 to it or so?
 
Is the only reason that the RAMDAC will vanish from desktop graphics cards sooner or later? Aka, to use it as an external RAMDAC?

That's definitely the main reason, I think.

Or are there more benefits? Like getting a better image when connecting a PlayStation 4 to it or so?

I think so. The whole HDMI concept is still kinda new to me, and the last console I owned was a Nintendo 8 bit! But I think yea, you'd be able to connect devices that have HDMI outputs to your monitor and get a great image.
 
However, he did say that they are working on a 4k model that should arrive next year. So we may be in luck. That would arrive around the time we start to see more cards without DAC's.

http://dme.ghost2.net/forum/viewtopic.php?p=288290#288290

And I should also say that these are the guys who we can really trust to bring the best solution for us. Each iteration of the HDFury has added more and more features that have kept CRT owners on pace with evolving video standards. I currently use the original model for my PS3 and it looks fantastic and has absolutely no lag.

this is FANTASTIC news!
 
Is the only reason that the RAMDAC will vanish from desktop graphics cards sooner or later? Aka, to use it as an external RAMDAC?

Yeah, I'm just looking to the future. My 7850 is working fine right now, but I realize that next time I upgrade, the card I want might not have a RAMDAC built in. Maybe Nvidia is planning on keeping them around for a while (780 Ti still has VGA), but AMD is certainly beginning to phase it out.

Unless any 4K games come to the PS4, you can use any of the currently available models just fine for that.
 
So I think I've figured out how to get really deep blacks, without crushing detail. Gonna do some more experimentation, but here's what I've got so far.

In WinDAS, at the early stages of the WPB, it will ask you to adjust the G2 until the first pedestal value on your grayscale image becomes barely visible. I'm using a standard graybar, where the first bar is RGB 000 and the "pedestal" bar is RGB 16 16 16.

I slide the G2 bar until the pedestal bar is just invisible, so that if I were to step the G2 value up one more notch, the bar would just become visible. From what I can gather, WinDAS doesn't actually keep the brightness at the same level it was set to during the adjustment - if it did, it would crush everything from 0-16. I think it asks you to do that as a reference, and then it adjusts things based on that.

After the WPB is done, I end up with a black level that is around 0.01 cd/m2 (at least, measured with my DTP-94 - I'm gonna try and borrow a Minolta LS-100 from the lab to really test out the black level). Either way, it's a fantastic black level (and this is on a display with the antiglare removed).

The problem is, at this level, it crushes a lot of detail at low input levels. You can get rid of the crushing by increasing the brightness, but then you lose the good black level.

You can use gamma correction to fix this. One way is to do it through your video card's display settings - there is usually something called a gamma slider, and you can boost your gamma.

Another way is to use ArgyllCMS, which gives you a lot of control over how the gamma correction is applied.

Once you've done the gamma correction (which is applied to the video LUT), your desktop should look much better. Depending on which method you use (ArgyllCMS or the display settings), you may need to take different steps to ensure video playback will incorporate this.
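
In the simplest case - assuming the display behaves like a pure power law at the measured gamma - the correction ramp is just a power function. A rough sketch of what ends up in the video LUT (the 2.85 and 2.4 are the values from my own run):

```python
import numpy as np

measured_gamma = 2.85   # roughly what the display does after my low-G2 WPB
target_gamma   = 2.4    # the end-to-end response I actually want

v = np.linspace(0.0, 1.0, 256)
# If the display turns a signal s into luminance s**measured_gamma, feeding
# it s = v**(target/measured) gives v**target overall.
ramp = v ** (target_gamma / measured_gamma)

# Windows video LUTs are 256 16-bit entries per channel; an array like this,
# one per channel, is what SetDeviceGammaRamp expects.
ramp_16bit = np.round(ramp * 65535).astype(np.uint16)
```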

Anyway, here are the results before and after:

Top is before, and bottom is after. The white line is a pure 2.4 gamma reference line. The yellow line is the measured function. In the before, the actual gamma is around 2.85!! I've chosen 2.4 as a target, for personal reasons. Depending on the viewing material (or environment), you may want to use a lower gamma.

There are probably other ways to tweak the gamma from within the WPB in WinDAS. One way is to set the G2 higher to begin with (so that the pedestal bar is more visible). Another would be to meet different luminance targets during the adjustments.

Anyway, for those who are interested, this gives the best of both worlds - deep blacks AND no crushing. And you can probably achieve similar results simply by lowering the G2 without doing a WPB, and then doing gamma correction.

[image: before/after gamma measurements - 8yesfb.png]
 
Good news! With the help of some very cool folks over at avsforums, I've discovered something very cool about CRTs (at least the FW900).

It's often been said that CRTs are capable of arbitrary bit depths, as they're analog. However, some have speculated that perhaps, since the FW900 is such an advanced CRT, the video amplification circuitry constrains it to 8 bits to reduce signal noise.

The problem is that it is very difficult to test this hypothesis. Typically, to render, say, a 10 bit image (for example, a grayscale ramp with 1024 shades of gray, rather than the 256 shades you get with 8 bits), you need a video card that supports 10 bits, a video cable that supports 10 bits, drivers that support 10 bits, software applications that can render in 10 bits, and a display that is 10 bits.

In other words, a total pain in the ass.

Well, I can just about confirm that the FW900 is capable of at least 10 bits. Not only that, but we can actually benefit from these 10 bits in an interesting way.

The video LUT contains values that instruct the display how to interpret RGB signals. A linear LUT would tell the display to treat the difference between RGB [0 0 0] and RGB [1 1 1] the same as the difference between RGB [12 12 12] and RGB [13 13 13].

Now, in an 8 bit context, there can only be 256 different values (for each color channel). And for each of those different values, you can only choose one of 256 possible numbers.

However, Windows can specify the LUT with 16-bit precision. That means that even though there may only be 256 entries, each of those entries can take on any one of 65536 values. You can think of it like this: there are 256 steps, but each step can be placed at any of 65536 positions.

Why is this important? Well, it means that even if your frame buffer is only 8 bits, you can make color adjustments that are incredibly precise!
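
Here's a toy illustration of the point: 256 entries, nudged by less than one 8-bit step, which only finer-than-8-bit gun drive can resolve (the 10-bit check is approximate):

```python
import numpy as np

# 256 LUT entries, but each entry is a 16-bit number (0..65535).
identity = np.round(np.linspace(0, 65535, 256)).astype(int)

# Nudge every entry by 64 counts (1/1024 of full scale). One 8-bit step is
# 257 counts here, so the nudge is invisible to an 8-bit pipeline but is
# exactly the kind of adjustment ~10-bit gun drive can still resolve.
nudged = np.clip(identity + 64, 0, 65535)

print("visible at  8 bits:", (np.round(nudged / 257) != np.round(identity / 257)).any())  # False
print("visible at 10 bits:", (np.round(nudged / 64) != np.round(identity / 64)).any())    # True
```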

A very real world application of this is gamma control. Try this experiment: turn your brightness all the way down to 0, so everything is crushed.

Now, if you have a gamma slider control somewhere in your video card display settings (e.g. nvidia control panel), you can try to regain the picture by sliding the gamma up. On a display with limited bit processing, doing this would result in horrible quantization artifacts (banding). But on the FW900 at least, you can do this and get all the detail back without artifacts.

This can be applied not only to gamma correction, but also to color correction. Remember, color is specified by combinations of voltage inputs to the electron guns. If you can tune those guns with 10 bit precision, you can tune your color with 10 bit precision.

Btw, the way I was able to test the 10 bit precision was by using ArgyllCMS. It can generate test patterns that are rendered with very slight modifications to the LUT, and then measures how the display reacts to those instructions. If you have a sensitive enough instrument, and the display is stable enough, Argyll will estimate the precision of the display. With my setup, it estimated 10 bits. I wouldn't be surprised if, with the right hardware, the FW900 would be able to handle 16 bits.

If someone has both a CRT and an LCD display, let us know what happens when you try the lowering brightness and raising gamma slider experiment. I'm curious to see what happens :)
 
Does anyone know the secret to judder-free movie playback? I'm trying to watch some movies that are 24 fps (23.976 to be precise). First I tried watching at an even 72hz, and noticed judder, so I tried to set the more precise timing of 71.928 hz with the Custom Resolution Utility, and still noticed judder.

How have you guys solved this?
 
Does anyone know the secret to judder-free movie playback? I'm trying to watch some movies that are 24 fps (23.976 to be precise). First I tried watching at an even 72hz, and noticed judder, so I tried to set the more precise timing of 71.928 hz with the Custom Resolution Utility, and still noticed judder.

How have you guys solved this?

Honestly - I haven't had the issue myself. 72hz gets rid of any judder that I may notice.
 
About to get one of these CRTs for $50. The guy said that it's not really working and you "MAY" be able to get it working, but it's highly unlikely and difficult to get it to power on.

Anyone have a clue as to what the problem might be or should I just drop it off at a shop to get fixed?
 