24" Widescreen CRT (FW900) from eBay Arrived: Comments

There can't be any input lag from a DAC unless the frames are buffered or processed in complex ways (e.g. a poorly implemented scaling algorithm).
 
I'm guessing that if this were to be successful, it would require a modification of the actual video card, adding a RAMDAC before things get converted to HDMI (i.e. perhaps converting from HDMI to analog inherently introduces a latency component).

I don't know enough about video cards or HDMI to know if this is even possible, though.
 
I think HDFury may be making an adapter that would be able to support this, but I think it adds a frame of input lag (about 12 ms).

Where did you read it would have lag? All I've heard is that they're developing a high-pixel clock converter.

Most HD Fury models have zero lag. It's been tested by a few people. In my experience, going from 480p from the analog out on PS3 to 1080p through HDMI>HD Fury, I couldn't tell any difference in timing. This was playing Wipeout HD, which is a super fast game where you definitely need a low lag display.

But their HDMI ports are 300 MHz. Could those ports be converted?

HD Fury adapters do convert HDMI; it's just that most of them can't do it above a 165 MHz pixel clock (the HDFury 3 can go a little higher).
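For a sense of scale, a rough pixel-clock estimate shows why a 165 MHz converter falls well short of the FW900's high-refresh modes. The blanking fractions below are assumptions for illustration; real GTF/CVT timings vary with the mode:

```python
# Back-of-envelope pixel clock for a CRT video mode.
# The 25% horizontal / 5% vertical blanking overheads are assumptions;
# actual GTF/CVT timings differ somewhat per mode.
def pixel_clock_hz(h_active, v_active, refresh_hz, h_blank=0.25, v_blank=0.05):
    h_total = h_active * (1 + h_blank)   # active pixels plus horizontal blanking
    v_total = v_active * (1 + v_blank)   # active lines plus vertical blanking
    return h_total * v_total * refresh_hz

print(pixel_clock_hz(1920, 1200, 85) / 1e6)   # ~257 MHz: already beyond 165 MHz
print(pixel_clock_hz(2304, 1440, 80) / 1e6)   # ~348 MHz: needs a fast RAMDAC
```

Even a common mode like 1920x1200 @ 85 Hz blows past the 165 MHz limit, which is why a higher-pixel-clock converter matters so much for this monitor.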
 
Ah interesting. I looked up HD Fury 1-3, and I couldn't find mention of any added latency, but also couldn't find mention that there is no added latency. Do you know this for sure? Or just based on your own personal experience?
 
Yeah, it's for sure. You can find mention of it on a few different forums. Users ran a few kinds of tests to determine it had near-zero lag. I think there may be video of some of the tests on YouTube. I know one person tested straight HDMI vs HDMI>HDFury>VGA on a low-lag LCD and there was something like a 1 ms difference. It was also mentioned in the marketing and FAQs on the site, though that may have changed after site redesigns.
 
I think making our own little converter won't be that difficult. No idea on cost. Might just need to mod an existing converter by replacing a chip. Who knows, maybe the results will be heaps better than the lousy DACs they've been putting into most cards over the past 10 years.
 

awesome, thanks, that's great to know :)

I did find that video link but it had been removed from youtube as the account was deleted.


Not sure if it will be that simple, as I think you need to deal with the HDMI part of the equation (not sure, anyone know about this?)

Also, what makes you think the DACs in most cards are lousy?
 

I've been using a variety of cards with the FW900 since 2006. I'd say some of my earlier cards, especially from ATI, had the best IQ. The best one I had was an X1950 Pro AGP card; it was very crisp. I swapped it out for two different 3850 AGP cards, and both were massively poorer in IQ. The last card with really good IQ was the HD 5870. I've had a GTX 285, GTX 580, and GTX 680 since, and they were just okay in sharpness.

I'm only speculating, but good analogue IQ on gaming cards is not a huge priority for hardware makers, and there are probably cost-saving reasons not to use the best parts for the DACs.
 
as someone who desperately wants to use his fw900 with his r9 290 again, i'm curious about what more knowledgeable people will find out :p
But one thing I really don't understand is: why are active adapters so limited AND expensive when every $30 video card has a 400 MHz RAMDAC?
 

Because the world has transitioned pretty quickly to non-analogue displays. These active adapters are niche products for a very niche crowd. Putting a few chips on a card is much less labor intensive than creating little boxes that very few will buy.
 

Interesting. I'm close to finalizing my setup for measuring sharpness objectively. Once I do, if anyone wants to send me a video card that they think has better analog out than my GTX 660, I'd be able to compare the two with some very precise measurements.
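One crude but common objective proxy for sharpness is the variance of the Laplacian of a captured grayscale test image: soft edges spread the second derivative out and lower its variance. A minimal sketch (my own illustration, not the precise measurement setup described above):

```python
import numpy as np

def sharpness_score(gray):
    """Variance-of-Laplacian sharpness proxy: higher means crisper edges."""
    g = gray.astype(float)
    # 4-neighbour discrete Laplacian computed with array slicing
    lap = (-4.0 * g[1:-1, 1:-1]
           + g[:-2, 1:-1] + g[2:, 1:-1]
           + g[1:-1, :-2] + g[1:-1, 2:])
    return lap.var()

# A hard vertical edge scores higher than the same edge softened:
sharp = np.zeros((64, 64)); sharp[:, 32:] = 1.0
soft = np.zeros((64, 64)); soft[:, 31] = 0.33; soft[:, 32] = 0.67; soft[:, 33:] = 1.0
print(sharpness_score(sharp) > sharpness_score(soft))  # True
```

In practice you'd run this on photos of the same test pattern shot through different cables or cards, keeping camera settings identical between shots.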
 
Sorry for the bad quality of this post, but first, I'm not a native English speaker, and second, my hands are shaking.

FUCK. My girlfriend was so kind to clean up my desk a little. Or rather mess with my FW900.

I can't tell you how lucky I was to buy an almost-new FW900 some time ago. It has almost perfect geometry, deep blacks, clean whites, and is razor sharp... I've seen maybe a hundred CRT monitors, and this particular one is miles ahead of the competition.

Every single day since I bought this gem, it has made me smile every time I turned on my PC. Lots of CRTs die due to their age, but this one seemed not to care about being old, providing the best gaming experience money could buy.

So, back to my GF. As always, she got rid of the dust on top of it ("oh honey, your CRT is a dust magnet," I've heard a hundred times), and this time she decided to clean the glass. Well, it actually deserved it, but I'd heard stories about how sensitive the anti-glare coating is to scratching, so I used to be super delicate with every single smudge it had.

Now I came home and saw this: an AG coating scratch the size of a fucking fist.

[attached image: rsnVFee.jpg]


I can't tell you how angry I am. I've lost my appetite, I feel dizzy, I feel like I'm losing a member of my family. No single piece of electronics or toy has ever given me as much joy as this monitor. I always treated it with the utmost caution, and now I feel like someone stabbed me in the back. Not to mention it's my GF. It's not that I care about things in general: I used to drop and crack new phones and burn the newest GPUs. But FW900s in good condition are so extremely rare nowadays that I guess I might never again have the opportunity to buy such an excellent piece.

Now we sit in the same room and do not talk to each other. She tried to find any info on new AG coatings, but I guess it's a waste of time. Damn, I even lost any lust. Wonder if we could ever have sex again :rolleyes:

I guess I'll have to get rid of the AG, but I have no idea where to start or how to open the case. Any videos/photos on how not to fuck things up even worse?

At least I'm going to provide really good before/after photos, as I do serious photography and can keep really steady lighting conditions in all scenarios.

Any ideas on what patterns I should capture? For sure I'm going to compare:
- blacks and whites in different lighting conditions (heavy to no ambient light)
- text clarity
- white balance point
- contrasty patterns like white stripes on a black background and vice versa


once again : FUCK.


Unkle Vito, I hope that you still have some good pieces at your place. When I make some good money, I'm going to personally fly over the ocean and choose the best one :p
 
Despite what most people here say, I actually liked the monitor more with the AG on; it was usable in a wider range of ambient lighting, and the tube did not immediately become washed out the moment you let any light into the room.

Needless to say I'm sorry for your loss, as there is no going back with the AG, save for finding some sort of neutral density filter that you could tape onto the glass. I don't know if there are any companies manufacturing such a thing. I'm sure you could find some if you really wanted to.

You have to be very explicit with everyone who lives with you when it comes to sensitive electronics. "DON'T F****** TOUCH IT" while making a decapitating motion with your index finger should convey the message clearly enough, most of the time.:)

The AG removal is easy enough and has been documented here multiple times. Just take it nice and slow.

Anyway, my FW900 is currently retired, as it developed issues 2 years ago and I don't want it to break big time. So I shelved it until I get my first paycheck; then I can take it to get properly repaired. My other CRT, a 21" EIZO, is also about to be retired, as it has started to flicker blue when cold. I've had it for 5 years and it was made in 2003. Going back to LCD land for the time being, I guess.
 
Thanks for kind words.

This is what I'm afraid of. I already have a hard time darkening the room, and if you say it's gonna get even worse... damn.

I have two spare CRTs, both doing 1600x1200 @ 100 Hz (or even 1920x1200 @ 100 Hz with black stripes; a Dell 1130 and an Eizo FS931), but holy crap, after I first saw the FW900 I never thought of going back.

I really hope that I get some advantage in contrast/sharpness in a dark room so that it will actually be worth it at all, cause otherwise I'm gonna get a divorce :p

edit

Looking for a spare solution, I found things like this: https://www.youtube.com/watch?v=KEN9whWOqTI

I think I'm gonna give it a shot; I just found a 24" anti-glare protector for about $40. Maybe not all hope is lost.

edit2

Just wondering, has anyone here ever tried any anti-glare protectors? They are commonly used on iPads and other tablets, so there should be no big difference using them on the FW900. Any hints on which one to choose?
 

We are down to three (3) units... The GDM-FW900s without antiglare can be used effectively with a monitor hood. A very inexpensive hood can be built by cutting pieces of black foam core (3/16") and piecing them together with duct tape. Then attach it to the frame and monitor body with velcro straps and you will be a happy camper.

Use the unit in a light-controlled room, with 5000 K lighting (preferable) and in a dim environment.

All of my photographer and videographer customers use the GDM-FW900 that way and never have any problems and/or issues with glare.

Try it and see how it works...

Hope this helps...

UV!
 
Now we sit in the same room and do not talk to each other. She tried to find any info on new AG coatings but I guess it's waste of time. Damn I even lost any lust. Wonder if we could ever have sex again :rolleyes:

I hope you are joking here.
 

Unfortunately, I use the FW900 as my main monitor and have two others beside it, which rules out any hood. I have two 5500 K LED lamps behind my monitors; they give enough ambient light without affecting the FW900's blacks. Surprisingly, contrast and blacks are even better than they used to be, so it looks like my fears were unnecessary. I never thought I would be happier without the AG. I am really delighted with the current no-AG setup, so no need to go overseas at the moment, but any day I have spare cash I would be happy to come see the beauties you have in stock :D Well, I hope that tomorrow, when the sun shines, it will at least be satisfying; I hope I won't have to work in a darkroom for the next few years.


I hope you are joking here.

Of course I am :p I had to exaggerate to liven up the story. Hope you enjoyed it; the divorce is cancelled :D I would give away all the FW900s on earth for my lady's heart. I thought that was a really obvious joke :D


Right now my main concern is that the bare glass with no AG builds up so much static that it catches all the dust around. I have to clean this thing every 5 minutes, though now with the finest cloth I could find.
 
Of course I am :p I had to exaggerate just to leaven the story. Hope you enjoyed, the divorce is cancelled :D I would give away all FW 900's on earth for my lady's heart, thought that was really obvious joke :D

glad to hear, wasn't 100% sure :p

Right now my main concern is that glass itself with no AG is so damn electricized it catches all the dust around. I have to clean this thing every 5 minutes, but now with finest cloth I could find.

Get a good microfiber cloth, the kind made especially for cleaning delicate surfaces like displays. The static isn't a huge issue for me; I find a wipe every day or two is all I need. I think my cleaner might have anti-static properties too.
 
My reverse-ring adapter should arrive in a week or two. Once it does, I'll be able to take in-focus shots of my CRT close to or at 1:1 magnification. That means that if I take an image of an inch of my screen, it will be captured on an inch of the camera sensor. The sensor in my camera is about 0.85 inches across and has over 4200 pixels across the horizontal dimension.

If I subsample the image to get full XYZ information, that's about 2500 pixels per inch, or about 100 pixels per mm (i.e. 0.01 mm per pixel). The dot pitch in the center of the FW900 is around 0.23 mm (the width of three sub-pixels), which means each phosphor stripe is around 0.08 mm wide. So I should be able to get about 8 pixels of information per sub-pixel!
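Spelling out that arithmetic (sensor and dot-pitch figures as stated above; the 2x Bayer subsampling factor is my assumption about how the XYZ demosaicing works out):

```python
# Sampling math for the 1:1 macro setup described above.
sensor_px = 4200        # photosites across the sensor's horizontal dimension
sensor_in = 0.85        # sensor width in inches
bayer = 2               # ~2x subsampling to recover full XYZ per site (assumption)

px_per_in = sensor_px / sensor_in / bayer      # ~2470 px per inch
px_per_mm = px_per_in / 25.4                   # ~97 px per mm

triad_mm = 0.23                  # dot pitch: width of an RGB triad at screen centre
stripe_mm = triad_mm / 3         # one phosphor stripe, ~0.077 mm
print(round(stripe_mm * px_per_mm, 1))         # ~7.5 px per sub-pixel
```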

Even if diffraction limits come into play (and because I'll be using very high aperture, it's unlikely this will be a problem), this will allow me to measure luminance and color variations across small patches of the screen with incredible precision.

This will be great for measuring things like halation ("bleeding of light") and comparing it between my unit with the AG on and with it off. It will also allow me to objectively compare the sharpness of an image with different cables, such as BNC vs DVI, and different video cards.
 
don't forget about vignetting due to the camera

for cleaning the screen:
if the ag is already removed, the glass is pretty strong so you don't need to be too careful

if the ag is still on, i'd avoid using any type of cloth... as there's always the potential for dust to get trapped in the cloth that can scratch the ag the next time you wipe the screen. i usually use a disposable coffee filter dampened with some distilled water to clean my screen.

and i like to always swipe horizontally, so that if i do accidentally make a hairline scratch it's less noticeable as vertical scratches will line up with the phosphor stripes and make rainbow patterns across the scratch (e.g. on my iphone: http://i.imgur.com/UQn3Ouc.jpg look under "Thursday, January")
 
don't forget about vignetting due to the camera

Good call. I wonder if there's a good way to accurately characterize the vignetting function (other than finding a reference patch with completely flat luminance). One thing I might do is generate 100 different random-noise images, take an image of each, and use some sort of correlation/classification image analysis to reveal any consistent pattern across them.

edit: nice, just found this
 


For units with NO ANTIGLARE ON... we use PANCRO lens cleaner, used extensively here in Hollywood to clean movie-camera lenses and glass. To wipe off the cleaner, we use the KLEAR SCREEN cloth, which is lint-free. Coffee filters are also good on the glass, and microfiber cloths are also OK to use. Another alternative is a non-abrasive glass cleaner, water-based or solvent-based, such as high-grade alcohol.

For units with the ANTIGLARE ON... use strictly a water-based glass and optics cleaner such as KLEAR SCREEN or any other high-grade water-based optics cleaner. UNDER NO CIRCUMSTANCES are you to use a solvent or any abrasive cleaner on the antiglare. If you do... the damage to the AG coating is irreversible.

Hope this helps...

UV!
 
edit: nice, just found this

didn't really read it, since it's looking to calculate the vignetting based on a single image, which isn't applicable to your case: with a single image it's impossible to distinguish between vignetting and screen nonuniformity (i.e. one equation, two unknowns)

i can't think of any straightforward way to estimate it... you'll probably be fine just using the center of your pictures (where vignetting should be negligible)
 
didn't really read it since its looking to calculate the vignetting based on a single image, which isn't applicable to your case since with a single image it's impossible to distinguish between vignetting and screen nonuniformity. (i.e. one equation, two unknowns)

I'm pretty sure they use carefully designed test gradients to characterize the vignetting, and can then correct any other image taken with the same camera. The "single image" idea, as far as I understand (I've only barely skimmed the paper), means that they only need a single image to characterize the vignetting. And as long as the statistics of the gradients aren't correlated with the nonuniformities of the screen, you don't need a perfectly uniform screen for it to work. There is a preliminary step where they establish ground truth for the optical center of the vignetting, which requires multiple images.

Another cool thing about the paper is that it's from the end of 2014, so the intro will have a lot of good references.

Given that I'm using a macro-photography approach, I'll need a good chunk of the sensor if I want to measure a sufficient amount of real estate on the screen. Plus, it'd be really cool to get a handle on the vignetting (also, I wonder if it will be inverted from normal vignetting because I'm flipping the lens, but I'm not sure).
 
Last edited:
Despite what most people here say I actually liked the monitor more with the AG on, it was usable in a wider range of ambient lighting and the tube did not immediately become washed out the moment you let any light into the room.

I agree with you. I regret having had to take the filter off.


Use the unit on a light controlled room, with 5000K light (preferable) and in a dim environment.

Why not 6500K? (or is it different for bias vs ambient lighting?)
 
There's a bit here (haven't read it yet) that might shed some light. Also, apparently the ambient illumination assumed by sRGB encoding is D50. So perhaps if you're doing photography, a D50 light would be good, but as a bias light for watching HD content, I think D65 is the viewing standard.
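For what it's worth, the D-series illuminants sit on the CIE daylight locus, so the D50/D65 difference can be computed directly. This polynomial and its coefficients apply for correlated color temperatures of roughly 4000–7000 K; a different x(T) polynomial is defined above 7000 K:

```python
def daylight_xy(cct):
    """CIE daylight-locus chromaticity, valid for ~4000 K <= CCT <= 7000 K."""
    t = float(cct)
    x = -4.6070e9 / t**3 + 2.9678e6 / t**2 + 0.09911e3 / t + 0.244063
    y = -3.000 * x * x + 2.870 * x - 0.275
    return x, y

# Note the CIE "corrected" temperatures: D50 is defined at 5003 K, D65 at 6504 K.
print(daylight_xy(5003))   # D50: roughly (0.3457, 0.3585)
print(daylight_xy(6504))   # D65: roughly (0.3127, 0.3291)
```

So a "5000 K lamp" and a "6500 K lamp" differ by a clearly visible chromaticity shift along the daylight locus, which is why matching the bias light to the calibrated white point matters.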
 
there's a bit here (haven't read it yet) that might shed light. Also, apparently the ambient illumination for sRGB encoding is D50. So perhaps if you're doing photography, a D50 light would be good, but as a bias light for watching HD content, I think D65 is the viewing standard.

Hmm, I didn't get that from the article. They haven't mentioned ambient lighting in relation to CRT calibration or I've missed it.
 

Is this the Klear Screen cleanser you're talking about?
http://www.amazon.ca/gp/aw/d/B0018C...l=1&dpID=416EwZ3mdtL&ref=plSrch&pi=SY200_QL40
 
True, but if you calibrate your display for D65 and then use 5000 K bias lighting, your color perception will be altered and you will lose the benefit of the calibration.

I believe 5000 K is the white point used in photography and photo studios.
 
True, but if you calibrate your display for D65 and then have 5000K bias lighting, your color perception will be altered and you will lose the calibration benefit.

We calibrate and adjust all reference points (D93, D65 and D50) in a dark environment with no incident light directed at the screen of the monitor.

When the user performs the software calibration of the unit, which generates the ICC profile, he can choose whichever environment he wants... Again, beauty is in the eye of the beholder...

UV!
 
I'm pretty sure they use carefully designed test gradients to characterize the vignetting and then can correct any other given image taken with same camera. The "single image" idea, as far as I understand (I've only barely skimmed paper), means that they only need a single image to characterize the vignetting.

OK, I think we were both wrong about the paper. Anyway, I tried a different idea: I loaded a uniform gray pattern on my screen and took 50 images, each at a random location on the screen and at a random orientation. All images were taken with the same focus, aperture, ISO, and zoom, and at the exact same distance from the screen.

I then averaged the 50 images. Here's a plot of the change in pixel value across the middle row of the averaged image. Again, this is without subsampling and calculating XYZ information, so the line looks thick because of the high-frequency energy introduced by the raw Bayer array.

[attached image: 15druv7.png]


Now I just need to think of a way to represent this image (the whole image, not just that one row) such that I can factor it out of any other image I take under those conditions.

Of course, I'll need to take a new set of 50 images once I have my macro setup, since the distance, zoom, aperture, focus, etc. will all be different.
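One standard way to factor an averaged frame like that out of later shots is flat-field correction: normalise the average to unit mean, then divide each new image by it. A minimal NumPy sketch, with synthetic data standing in for the real captures:

```python
import numpy as np

def flat_field(stack):
    """Average a stack of flat-frame shots and normalise to unit mean."""
    flat = stack.mean(axis=0)
    return flat / flat.mean()

def correct(image, flat_norm, eps=1e-6):
    """Divide out the lens/vignetting response captured in the flat field."""
    return image / np.maximum(flat_norm, eps)

# Synthetic demo: a radial falloff standing in for real vignetting.
h = w = 128
yy, xx = np.mgrid[0:h, 0:w]
vignette = 1.0 - 0.4 * ((yy - h / 2) ** 2 + (xx - w / 2) ** 2) / (h / 2) ** 2

stack = np.stack([vignette * 100.0 for _ in range(50)])   # 50 "flat" shots
corrected = correct(vignette * 100.0, flat_field(stack))
print(corrected.std() < 1e-6)   # the vignetting divides out almost exactly
```

With real captures the 50 randomly positioned shots also average away the screen's own nonuniformity, which is the point of randomising the position and orientation.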
 
True, but if you calibrate your display for D65 and then have 5000K bias lighting, your color perception will be altered and you will lose the calibration benefit.

You're not limited to 6500 K. When you do the white-balance adjustment, the procedure has you calibrate 9300 K, 6500 K, and 5000 K. So if you just select 5000 K and use that as a bias light, you're in business.

I wonder if anyone makes a switched lamp, that is, one with a switch for 5000 K and another for 6500 K?
 
You're not limited to 6500k. When you do the white balance adjust, the procedure has you calibrate 9300k, 6500k, and 5000k. So if you just select 5000k and use that as a bias light, you're in business.

I wonder if anyone has a switched-lamp - that is - one switch for 5000k and one for 6500k?

Correct!
 