24" Widescreen CRT (FW900) From Ebay arrived,Comments.

I was looking at CRT monitors on ebay, and stumbled upon this:

[screenshot of the eBay listing]


The shipping cost, WTF???

The shipping plus tariff/duty/VAT to Europe is quite expensive. The UPS quote is, unsurprisingly, not out of the ordinary for shipping to Europe... Now, if this is a quote for shipping to the contiguous 48 US states, then it is a total ripoff, and this Dell Trinitron model is not worth it.

UV!
 
These are ours... Excellent color reference monitor, and it comes with the CRS program (only compatible with WinXP), a color sensor, a digital VGA-USB cable, and the monitor hood. Also, it can be calibrated with any commercial calibration system such as X-Rite i1, GretagMacbeth, ColorVision, Monaco Optix, etc.

UV!

How long does a CRT like that last? I've been reading that a CRT lasts 10-15 years. If that Sony CRT was made in 2002 and it's now 12+ years old, how much longer can you expect it to last if it has been used every day for several hours?

The shipping plus tariff/duty/VAT to Europe is quite expensive. The UPS quote is, unsurprisingly, not out of the ordinary for shipping to Europe... Now, if this is a quote for shipping to the contiguous 48 US states, then it is a total ripoff, and this Dell Trinitron model is not worth it.

UV!

No, that's shipping to Europe. I guess it's expensive cuz it's fast shipping, just 3-5 days.
 
i have one myself. it's a very good monitor but i think the gdm series are a bit more polished/refined in every way.

my only complaints for my particular monitor are:
0. 0.24mm grille instead of 0.22mm. but it's not a huge deal
1. geometry slightly imperfect around the center. not fixable with windas. hard to notice except when scrolling text.

everything else is pretty much perfect... or as good as i can imagine a crt monitor can possibly be.

$300 though... i don't think it's worth it unless you want to become a crt collector :D or unless you live somewhere where it's rare to find these

No CRT will have perfect geometry. The only ones that come close are round tubes with tons of correction options. My GDM-F520 isn't perfect with geometry either. Like your monitor, it's not correctable. But it's still in spec and is still an A+ monitor. Believe me - Vito will tell you - I was concerned when I saw the imperfect geometry. I expected everything to be perfect. But I've seen quite a few top-grade CRTs, and none of them were ever perfect geometry-wise. It's a limitation of the technology, I'm afraid.

I'm still very satisfied with my F520 and I played UT2004 the other day on it at 1920x1440 and damn - what a sight to behold.
 
From the OP (to which I contributed via edits submitted to Mathesar)...

Initial Setup Panic? "Image Restore" to the Rescue!
When you first unpack and power up your monitor, don't worry if black levels and color are way off. Let it warm up for about 30 minutes. After thirty minutes in the powered-on state (don't let the monitor enter "sleep" mode), the "Image Restore" option becomes available under the "Expert" and "sRGB" color settings menus. Set your preferred color temperature in the Expert area and then activate the "Image Restore" option. Your colors should then "pop" into place. Edit: it's also good to turn up the monitor's Brightness setting before running Image Restore. For example, I normally run mine at around 25, but before running Image Restore I'll turn it up to 50, then back down after running it. Doing this normally gives the Brightness setting a wider range, especially if you're currently having to run the monitor at a low Brightness setting to achieve good black levels.

Green Tinge?
If you notice a "green tinge" to greys and blacks, do the "Image Restore" procedure above. But first, raise the Brightness setting for your monitor appreciably above where you'd normally like it. Then, do the "Image Restore" operation.
Someone (for some reason in an unrelated thread) is getting all bent out of shape because they're claiming (with some plausibility) that this advice will cause overbrightness issues that will only be reversible via WINDAS (which most users do not have, nor the cable necessary to use it).

I seem to recall some caution being advised at times throughout this thread about over-using Image Restore. But nobody took issue with this oft-repeated advice (until now).

Just thought I'd note this here and perhaps ask that a note be appended to these tips on the first page. Personally, I don't recall anyone ever complaining here that these tips (raise brightness a bit and then use Image Restore) caused any issues for them. To the contrary, quite a few people reported improvement even in black levels. But this thread is enormous and I stopped following long ago. So perhaps I've missed issues and/or further discussion.

Odd that the tip has been there for almost ten years and yet we're only hearing this now. But, regardless, people should use those tips with caution.

--H
 
well now that we have folks taking objective measurements on these units, it should be easy enough to test how performing image restore at diff brightness levels affects the behaviour of the tube. My own hunch is that it makes no difference either way, but it could be interesting to test. Maybe I'll do that with one of my backup units. btw, you might be interested in this thread if you haven't already seen it.
 
Someone (for some reason in an unrelated thread) is getting all bent out of shape because they're claiming (with some plausibility) that this advice will cause overbrightness issues that will only be reversible via WINDAS (which most users do not have, nor the cable necessary to use it).
Gosh my first stalker. <tears>

If helping to avoid further and not readily fixable damage to $1800 monitors qualifies as "bent out of shape", I'll leave it to readers to review the referenced discussion on OLED Computer Monitors.
 
Gosh my first stalker. <tears>

If helping to avoid further and not readily fixable damage to $1800 monitors qualifies as "bent out of shape", I'll leave it to readers to review the referenced discussion on OLED Computer Monitors.
There's irony. You clearly went back looking through my post history to try to find something you could use to distract from the rhetorical ass whooping you're enduring in another thread, and found this. . . and I'm the stalker?

You had a (perhaps genuine) concern. And without even mentioning your name, I posted it here. One would think you would applaud that if your concern for fellow CRT owners were genuine.

Please, don't sully another thread with your inane BS. You've embarrassed yourself enough in the other one already.
 
what i know
1. i have no idea what image restore actually does in terms of changing the values in the eeprom
2. monitors warm up faster if you display a white/bright image
3. as monitors age, the black level (which can be controlled by the osd brightness) increases and the max luminance (which can be controlled by the osd contrast) decreases
4. in windas, moving g2 slider up increases the black level. i forgot whether this corresponds to actually increasing or decreasing the g2 voltage
5. in windas, when you start a wpb procedure, the default g2 is set to 100 which is (for my 2 monitors at least) way lower than the proper setting
 
4. It does.
5. Never saw this in a wpb procedure on our F500R. It's supposed to (and does ime) default to whatever G2 value is stored for the currently loaded monitor profile. Ours came from the factory at 130, which even when it was new was ridiculously overdriven. :) Sony's major competitor for prosumer monitors was Mitsubishi's Diamondtron, an absolute gamut monster even compared to Trinitrons. So Sony's (and everyone else's who was using Trinitrons) answer was to ship monitors with these stupid G2 voltages. It was responsible imo for many early tube deaths.
 
it does what?

idk which tube youre talking about but for instance the 2070sb, ive read, uses b22 (ebu) phosphors which are close to rec709
 
it does what?

idk which tube youre talking about but for instance the 2070sb, ive read, uses b22 (ebu) phosphors which are close to rec709
It increases G2 voltage. Wasn't that your question?

Cadalyst has (or at least had, look it up) comparative reviews of a F500R and 2040u. It was the one that made us buy the second of those before the first (not to mention the $600 msrp difference). I didn't mean to imply the Trinitrons were some kind of gamut slouch, only that somehow Mitsubishi managed to outdo them.
 
4. in windas, moving g2 slider up increases the black level. i forgot whether this corresponds to actually increasing or decreasing the g2 voltage

I believe it actually increases the voltage. G2 can be thought of as the first accelerating anode in the gun assembly. By increasing its voltage, you pull electrons with more force, resulting in a higher beam current passing through G1.
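
If it helps to make that concrete, here's a toy sketch (my own, not anything from Sony documentation) of the standard CRT transfer model, L = (gain*V + offset)^gamma, where the offset term is what the brightness control / G2 effectively shifts:

[code]
# Toy model of a CRT's light output vs. drive signal (my own sketch, not
# from any service manual). Standard CRT transfer function:
# L = max(gain*V + offset, 0) ** gamma, where V is the normalized video
# signal (0..1), gamma ~ 2.5 for the gun, and offset is the black-level
# term that brightness / G2 screen voltage effectively moves around.

GAMMA = 2.5  # typical CRT gun exponent

def luminance(v, gain=1.0, offset=0.0, gamma=GAMMA):
    """Relative light output for normalized drive v in [0, 1]."""
    drive = gain * v + offset
    return max(drive, 0.0) ** gamma

# With offset = 0, black is truly black:
print(luminance(0.0))                 # 0.0
# Raising the offset (e.g. G2 set too high) lifts the black level,
# so "black" glows and the image looks washed out:
print(luminance(0.0, offset=0.05))    # ~0.00056: blacks now glow
# A negative offset (G2 too low) crushes near-black detail instead:
print(luminance(0.05, offset=-0.05))  # 0.0: shadow detail is lost
[/code]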

5. in windas, when you start a wpb procedure, the default g2 is set to 100 which is (for my 2 monitors at least) way lower than the proper setting

I don't remember this being the case on my tubes. I'll double check (gonna do a WPB on a couple of FW900s soon).

Sony's major competitor for prosumer monitors was Mitsubishi's Diamondtron, an absolute gamut monster even compared to Trinitrons. So Sony's (and everyone else's who was using Trinitrons) answer was to ship monitors with these stupid G2 voltages. It was responsible imo for many early tube deaths.

this makes little sense for 2 reasons:

1: as flood pointed out, the gamut should be based on the SMPTE-C gamut, which is associated with Rec 601 (and very close to Rec 709). The gamuts of the two tubes should be virtually identical. Not sure where this notion of one having more saturated colors came from, but I'd like to know.

2: Increasing G2 would desaturate the colors, as you're diluting the primaries with extra white. You get the most saturated colors with a deep black level (i.e. lower G2).
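
To put rough numbers on point 2, here's a quick sketch using standard CIE xyY additive mixing. The chromaticities are my assumptions (SMPTE-C red, D65 white), purely to show the direction of the shift:

[code]
# Quick check of the "raising G2 desaturates" claim using CIE xyY mixing.
# Chromaticities assumed here: SMPTE-C red (0.630, 0.340) and a D65 white
# point (0.3127, 0.3290); swap in EBU / Rec.709 values if you prefer.

def xyY_to_XYZ(x, y, Y):
    return (x * Y / y, Y, (1.0 - x - y) * Y / y)

def XYZ_to_xy(X, Y, Z):
    s = X + Y + Z
    return (X / s, Y / s)

def mix(c1, c2):
    """Additive mix of two lights given as (x, y, Y) triples."""
    X1, Y1, Z1 = xyY_to_XYZ(*c1)
    X2, Y2, Z2 = xyY_to_XYZ(*c2)
    return XYZ_to_xy(X1 + X2, Y1 + Y2, Z1 + Z2)

red   = (0.630, 0.340, 10.0)   # red primary at 10 cd/m^2
black = (0.3127, 0.3290, 0.5)  # elevated black: 0.5 cd/m^2 of white

print(mix(red, black))  # ~(0.614, 0.340): red has moved toward white
[/code]

The shift looks small on paper, but lifted blacks across the whole screen are exactly the washed-out look you get with G2 set too high.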
 
link? id be interested to see actual chromaticities of the phosphors of that monitor
 
I believe it actually increases the voltage. G2 can be thought of as the first accelerating anode in the gun assembly. By increasing its voltage, you pull electrons with more force, resulting in a higher beam current passing through G1.



I don't remember this being the case on my tubes. I'll double check (gonna do a WPB on a couple of FW900s soon).



this makes little sense for 2 reasons:

1: as flood pointed out, the gamut should be based on the SMPTE-C gamut, which is associated with Rec 601 (and very close to Rec 709).
I can only tell you our F500R came from the factory with G2 at 130 and wound up, calibrated, at 110.

EDIT: Sorry I managed to lop off your second point. Yes I know, my claim is that Sony was trading gamut for punch at the time.
 
Is the Mitsubishi Diamond Pro 2070 SB as good as the FW900? I found one for 90€, that's pretty cheap.
 
no/maybe/who knows, but thats a really good price if its in decent condition

i think for these top-of-the-line crts the most important thing is how many hours the tube has gone through.. for example my newish cpd-g520p is way better than my fw900
 
no/maybe/who knows, but thats a really good price if its in decent condition

i think for these top-of-the-line crts the most important thing is how many hours the tube has gone through.. for example my newish cpd-g520p is way better than my fw900

Bingo.
 
I can only tell you our F500R came from the factory with G2 at 130 and wound up, calibrated, at 110.

EDIT: Sorry I managed to lop off your second point. Yes I know, my claim is that Sony was trading gamut for punch at the time.

Can you give us the link to that review? I'm genuinely curious, as I have a rebranded Diamondtron monitor myself in my arsenal.

As for your F500R - interesting! So it came from the factory with a bad G2 level? Was the F500R a top-end CRT for Sony? I think I read from SH-1 on this board that the F520 is a second-gen F500R of sorts.

If it's not a top-end CRT for Sony, then my explanation would be that maybe that's just what they set all monitors to at the factory. But who knows...

I have read and heard that later Sony CRTs just weren't as reliable as earlier ones, so it honestly wouldn't shock me if they were just setting all monitors to 130 for the G2, sending them out, and still charging a premium. Sony seems to have a reputation for that - which could be part of the reason they're hurting now, I would speculate.
 
Is the Mitsubishi Diamond Pro 2070 SB as good as the FW900? I found one for 90€, that's pretty cheap.

If it's in great condition, go for it. Honestly, for normal applications (general display and gaming), my Diamondtron monitor (LaCie Electron Blue 22 II) is just as good as any of my Sony monitors. It's not as sharp as the F520 (0.24mm pitch), but it's a solid monitor that works very well.

And the bright side is that the service menu is FAR easier to get into on the Mitsus than on the Sonys. With Sony, you need a WinDAS cable and the WinDAS software. With the Mitsus you just need to enter a key combination and go through a couple of menus. But then again - the Mitsus don't allow you to back your settings up if you screw something up, so be aware of that.

The only things you should be calibrating on a CRT are the white balance and maybe the convergence. Geometry shouldn't be touched unless it's really f--ked up. Oftentimes you'll find that you can very well make it WORSE than it was before.
 
Is the Mitsubishi Diamond Pro 2070 SB as good as the FW900? I found one for 90€, that's pretty cheap.

I owned both quite a long time ago. The Mitsu tubes were always brighter and clearer than the Trinitron tubes, especially with age.
 
Can you give us the link of that review? I'm genuinely curious as I have a rebranded Diamondtron monitor myself in my arsenal.
The review was on the 2040u (same Diamondtron tube as the 2060/70).

As for your F500R - interesting! So it came from the factory with a bad G2 level?
There is no bad G2. There is only marketing. It's the same reason CRTs were shipped with a blinding default 9300K temperature.

Was the F500R a top-end CRT for Sony? I think I read from SH-1 on this board that the F520 is a second-gen F500R of sorts.

If it's not a top-end CRT for Sony, then my explanation would be that maybe that's just what they set all monitors to at the factory. But who knows...
People tend to have short memories; in 2001 the world was literally swimming in amazing CRT monitor producers. The F-series were the best consumer-grade CRTs ever produced by Sony, yet the competition in this market was still fierce, and the echo heard over and over again in reviews was "gorgeous and ridiculously expensive", or even overpriced. Mitsubishi's Diamondtron reviews, by contrast, usually garnered judgments of "gorgeous and moderately expensive". :) The MSRP difference was several hundred dollars.

I have read and heard that later Sony CRTs just weren't as reliable as earlier ones, so it honestly wouldn't shock me if they were just setting all monitors to 130 for the G2, sending them out, and still charging a premium. Sony seems to have a reputation for that - which could be part of the reason they're hurting now, I would speculate.
I know it was common only because default G2 settings from 130-140 are what's usually reported in Icrontic's and other ongoing long-term discussions.
 
The review was on the 2040u (same Diamondtron tube as the 2060/70).

There is no bad G2. There is only marketing. It's the same reason CRTs were shipped with a blinding default 9300K temperature.

Thanks for the review. I read a bunch of other reviews of monitors on that site too. Good reads!

As for your second point... I'm afraid you've lost me. Can you elaborate on what you mean by "no bad G2 - there is only marketing"? :confused:
 
If it's in great condition, go for it. Honestly, for normal applications (general display and gaming), my Diamondtron monitor (LaCie Electron Blue 22 II) is just as good as any of my Sony monitors.

could you measure its gamut i.e. chromaticities of the primaries? or maybe also the spectrum if you have a spectro(radio/photo/whatever)meter. thanks in advance
 
Can you explain what that means? It's a little bit of Greek to me. :) (I'm talking about the Cm, Zone A and B results).
 
no idea. all i know is that cm stands for contrast modulation = (max - min) / (max + min), so higher is better.
i think they use 1px white / 1px black striped images. zones probably refer to areas on the monitor
 
no idea. all i know is that cm stands for contrast modulation = (max - min) / (max + min), so higher is better.
i think they use 1px white / 1px black striped images. zones probably refer to areas on the monitor

Contrast modulation refers to the ability to produce sharp transitions between neighbouring lines of black and white. Things like glare, halation, dot pitch, spot size and focus of the beam all contribute to this. Not sure what the units on those graphs are, though. Good contrast modulation is critical in things like medical imaging.
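
If anyone wants to play with the math, here's a little numpy sketch of mine (the spot sizes are made up) showing how the beam's spot profile eats into CM on a 1px on / 1px off pattern:

[code]
import numpy as np

# Contrast modulation, CM = (Lmax - Lmin) / (Lmax + Lmin), computed on a
# simulated 1px-white / 1px-black grille blurred by a Gaussian "spot".
# The spot sizes below are invented purely for illustration.

SAMPLES_PER_PX = 64
# 32 periods of 1px on / 1px off, finely sampled
pattern = np.tile(np.r_[np.ones(SAMPLES_PER_PX), np.zeros(SAMPLES_PER_PX)], 32)

def cm(signal):
    return (signal.max() - signal.min()) / (signal.max() + signal.min())

def blur(signal, sigma_px):
    """Convolve with a Gaussian spot profile of the given sigma (in px)."""
    sigma = sigma_px * SAMPLES_PER_PX
    xs = np.arange(-int(4 * sigma), int(4 * sigma) + 1)
    kernel = np.exp(-0.5 * (xs / sigma) ** 2)
    return np.convolve(signal, kernel / kernel.sum(), mode="same")

for sigma_px in (0.1, 0.3, 0.5):
    core = blur(pattern, sigma_px)[1024:-1024]  # trim convolution edge artifacts
    print(f"spot sigma = {sigma_px} px  ->  CM = {cm(core):.2f}")
# A tight spot keeps CM near 1.0; defocus the spot and the lines smear
# together, so CM (and perceived sharpness) collapses.
[/code]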
 
also, not that there is any question, but the specifications state that the 2070sb is sRGB.

So please stop referring to this as a gamut monster, and instead reserve that term for the diamondtron RDF225WG (which I'm not yet sure was actually released. I read somewhere it was $5000 tho, so not sure.)
 
ah this explains it (AG = aperture grille presumably):

Well, the FW900 was like $2k when CRTs were the thing, so not exactly a budget option compared to $600 24" LCDs, which are bigger. Today it would cost even more. For example, Sony still makes a 24" widescreen CRT for broadcast and television and it costs $24,000. Not saying it would be that much, but selling to a small market would raise costs well above $2000, making them unsellable. I mean, everyone has a price threshold before they say "this is good enough".

NEC tried to keep a 22" AG-CRT afloat for the last two years (Diamondtron UWG RDF225WG) and sold them for $5000 with no takers, thus discontinuing the line.


The dilemma is that certain fixed costs must be amortized over the monitors sold. Sell a lot, and costs go down. Sell a few (or even just project to sell a few and price them high) and no one buys. The next problem is that the materials used in each AG-CRT cost a lot more than in any LCD.

I'm afraid high-end CRTs will never come back.
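
the amortization point in a nutshell, with completely made-up numbers (just a sketch, not nec's or sony's actual cost structure):

[code]
# Back-of-the-envelope unit pricing: fixed costs (tooling, line,
# engineering) get spread over however many units you expect to sell.
# Every number here is invented for illustration.

def breakeven_price(fixed_costs, unit_cost, units_sold):
    return unit_cost + fixed_costs / units_sold

FIXED = 10_000_000  # hypothetical fixed costs
UNIT = 400          # hypothetical materials + labor per AG-CRT

for units in (100_000, 10_000, 1_000):
    price = breakeven_price(FIXED, UNIT, units)
    print(f"{units:>7,} units -> break-even at ${price:,.0f} each")
# 100,000 units -> $500 each; 1,000 units -> $10,400 each.
# Shrink the expected market and the price explodes, which is
# basically the RDF225WG story.
[/code]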
 
How long does a CRT like that last? I've been reading that a CRT lasts 10-15 years. If that Sony CRT was made in 2002 and it's now 12+ years old, how much longer can you expect it to last if it has been used every day for several hours?



No, that's shipping to Europe. I guess it's expensive cuz it's fast shipping, just 3-5 days.

If you follow the instructions and guidelines I provide with the purchase of our units, the units will last you a very long time. I am living proof of that... I own six (6) GDM-FW900s that I unboxed in 2000 that are still as bright and as sharp as the day I took them out of the box. Also, I own four (4) GDM-C520K Artisans that I unboxed in 2002 that are still as bright and as sharp as the day I took them out of the box... All I've done is follow the instructions and guidelines provided with the purchase.

Shipping to Europe via FedEx is 3-5 working days for Economy and 2-3 working days for expedited service. That is just the shipping; it does not include the time it takes to get the unit out of customs.

Hope this helps...

UV!
 
No CRT will have perfect geometry. The only ones that come close are round tubes with tons of correction options. My GDM-F520 isn't perfect with geometry either. Like your monitor, it's not correctable. But it's still in spec and is still an A+ monitor. Believe me - Vito will tell you - I was concerned when I saw the imperfect geometry. I expected everything to be perfect. But I've seen quite a few top-grade CRTs, and none of them were ever perfect geometry-wise. It's a limitation of the technology, I'm afraid.

I'm still very satisfied with my F520 and I played UT2004 the other day on it at 1920x1440 and damn - what a sight to behold.

That's correct! It is due to the amount of glass that is placed on the CRT in order to make it flat.

UV!
 
I owned both quite a long time ago. The Mitsu tubes were always brighter and clearer than the Trinitron tubes, especially with age.

Our experiences differ, I'm afraid. I have an older Mitsu tube and it's pretty good, but I would still rate it a little lower than my Sonys. But mine came from a photo lab, and I'm sure it has tons of hours on it.
 
also, not that there is any question, but the specifications state that the 2070sb is sRGB.

So please stop referring to this as a gamut monster, and instead reserve that term for the diamondtron RDF225WG (which I'm not yet sure was actually released. I read somewhere it was $5000 tho, so not sure.)
It's a relative term, and an accurate one relative to everything under $2K available today, and even relative to the F500R which sits directly next to it. You are more than welcome to come over and compare. The 2040u does things the Sony can't, e.g. on an RGB chart of all possible values from 0,0,0 to 8,8,8, the Sony has never been able to display much above the bottom half of the chart at default monitor settings. On the 2040u all 24 boxes are or can be made visible. Etc. I'm probably overstating anecdotal experience here but it is what it is.
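
For anyone who wants to try the same thing, here's a rough Python/Pillow script that generates a similar near-black chart. It's my reconstruction from memory; I don't have the original file, so the exact levels and layout are a guess:

[code]
# Generate a near-black test chart: a row of grey patches stepping up
# from RGB 0,0,0. My own reconstruction of the idea; the 24-box layout
# is a guess. Requires Pillow (pip install pillow).
from PIL import Image, ImageDraw

LEVELS = range(24)  # RGB 0,0,0 through 23,23,23 -> 24 boxes
PATCH = 80          # patch size in pixels

img = Image.new("RGB", (PATCH * len(LEVELS), PATCH + 20), (0, 0, 0))
draw = ImageDraw.Draw(img)
for i, v in enumerate(LEVELS):
    draw.rectangle([i * PATCH, 0, (i + 1) * PATCH - 1, PATCH - 1],
                   fill=(v, v, v))
    # label each patch so you can tell where visibility starts
    draw.text((i * PATCH + 4, PATCH + 4), str(v), fill=(128, 128, 128))

img.save("near_black_chart.png")
# View it full screen in a dark room: the lowest patch you can tell
# apart from level 0 shows how much shadow detail the monitor resolves.
[/code]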
 