24" Widescreen CRT (FW900) From Ebay arrived,Comments.

Hey Unkle Vito, it just won't power on. It arrived at my house damaged from delivery.

I plugged in all the appropriate cords.

I appreciate your help in this matter but I'd feel guilty if you are giving me technical support and not getting remunerated for it.

I'm pretty sure it is DOA, maybe some transformer went bad or something.

Really from packaging to delivery, FedEx completely dropped the ball, no pun intended.

Call me... I think you have my phone #.... If not.. PM me...

UV!
 
Off topic, but I was at Sam's Club waiting for a tire install and I strolled by their display area. There was a Samsung Quantum Dot TV on display, and I have to admit I'm impressed! Blacks looked black (under the fluorescent lighting of Sam's Club, but still, it was the darkest black of every screen there) and 4K material looks stunning.

Of course, it was calibrated to shit, so it was SUPER-OVER-POWERED!!! in the saturation department. But apparently these TVs allow some very thorough calibration options. I think we're finally getting to a display that can take on ye olde CRT and be relatively affordable.
 
yea perhaps in terms of picture quality, but don't forget the importance of a strobed display, and low input lag, and RGB 4:4:4 (i.e. no chroma subsampling)
 
I agree with all the points made above. I'm just saying - since Plasma's demise, all I've seen are shitty panels. It's nice to see a panel that's not so shitty. Sure, there are things that need to be ironed out. But at least we're finally getting there.

EDIT: Nevermind. I read a review of it and the panel only offers just over a 3000:1 contrast ratio. Sheesh. Even my tubes, with the black level raised for lit-room conditions, offer more.
 
3000:1 isn't terrible :), and also keep in mind that the ANSI contrast is probably better than a CRT (i.e. a white patch next to a black patch won't bleed light into the black region).

But yea, nothing like deep inky blacks on a CRT that has good G2 ;)
 
3000:1 isn't terrible :), and also keep in mind that the ANSI contrast is probably better than a CRT (i.e. a white patch next to a black patch won't bleed light into the black region).

But yea, nothing like deep inky blacks on a CRT that has good G2 ;)

The review I read used the ANSI contrast to come up with that number, so it may actually be good (no backlight dimming was used).
 
for lcds with no backlight dimming, ansi contrast is similar to sequential contrast (full white/black images) right? probably just some va panel then.

and afaik all so-called "quantum dot" displays are just lcds that use quantum dot stuff in the backlight to increase the gamut. so they carry all of the issues with lcd's pretty much

But yea, nothing like deep inky blacks on a CRT that has good G2 ;)
nothing like the nonexistent blacks of oled's :p

for viewing angles though, crts still win vs any non-projected technology i know of.
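For what it's worth, the two contrast figures being discussed are computed differently, and the point about ANSI vs sequential contrast on a non-dimming LCD can be made concrete with a quick sketch. All luminance numbers below are made up for illustration:

```python
# Sketch of how sequential vs ANSI contrast are computed from luminance
# readings (cd/m^2). All readings here are hypothetical.

def sequential_contrast(full_white, full_black):
    # Full-white frame luminance over full-black frame luminance,
    # measured one after the other.
    return full_white / full_black

def ansi_contrast(white_patches, black_patches):
    # 4x4 checkerboard: mean of the white squares over mean of the
    # black squares, measured simultaneously on a single frame.
    return (sum(white_patches) / len(white_patches)) / \
           (sum(black_patches) / len(black_patches))

# With no backlight dimming, the black level barely changes between the
# two tests, so the two figures come out close for an LCD.
print(sequential_contrast(300.0, 0.1))         # 3000.0
print(ansi_contrast([295.0] * 8, [0.12] * 8))  # similar ballpark
```

On a CRT the comparison flips: sequential contrast can be enormous, but a bright patch bleeds light into an adjacent black patch, so the ANSI figure drops.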
 
Hi everyone, I need some help. I've had this Sun GDM-90W10 (16:10) for some time, and I want to connect my PS4. The problem is that when I try to set the resolution to 1080p, the screen goes black and the indicator turns orange (signal out of range), even when I try to set 1080p from my PC. Every other resolution (1080 @ 70Hz, 1080 @ 75Hz) works; the problem seems to be just with 1080p @ 60Hz. I tried 1080p on another CRT (a 4:3 SGI) and it works fine, so the cable is fine.

[Attached photo: IMG_20150921_230027.jpg]

This is the monitor:
http://www.shrubbery.net/~heas/sun-feh-2_1/Devices/Monitor/MONITOR_Color_24_Premium_CRT.html
 
yea perhaps in terms of picture quality, but don't forget the importance of a strobed display, and low input lag, and RGB 4:4:4 (i.e. no chroma subsampling)

Is the FW-900 the only CRT capable of RGB 4:4:4 with no chroma subsampling?

To be honest, I am not sure an OLED is going to usurp the strengths of the CRT which is what matters to me the most.

But corporations aren't interested in a consumer product that can deliver all the PQ strengths CRTs have and then some, due to cost and whatever. Neither are consumers themselves.

The average consumer isn't going to care about PQ differences if they are that marginal.

Whatever, the only hope is that a company that specializes in computer displays will come out with something worthwhile.
 
Is the FW-900 the only CRT capable of RGB 4:4:4 with no chroma subsampling?

I'm pretty sure all computer monitors (CRT or otherwise), if not all CRT displays, can handle full chromatic spatial resolution. It's TVs that typically do subsampling, though some TVs have an option for full 4:4:4.

To be honest, I am not sure an OLED is going to usurp the strengths of the CRT which is what matters to me the most.

In theory an OLED will trump a CRT in just about anything that matters, with a couple potential caveats:

1) OLEDs still receive a digital signal, so the bit depth is limited by the circuitry of the panel.

2) It would need to be strobed, which might or might not add a negligible amount of input lag.
 
@Alfredo3001
what a lovely looking CRT!
1600x900@100Hz should work on it

As for the PS4, it should too. There is no reason why it would not accept 1920x1080@60Hz, especially from a PC. Have you tried other similar resolutions at 60Hz? Or a different input, maybe?

@Alexnova
comment you quoted was in regard to HDTV LCD
all LCD monitors have RGB 4:4:4 and 'full range' (0-255) inputs
CRT monitors with RGB inputs such as VGA monitors do not have any chroma subsampling.
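On the GDM-90W10 1080p question above: one quick sanity check for any mode is whether its horizontal scan rate falls inside the monitor's supported range. A rough sketch, where the ~8% vertical blanking allowance is an assumption (real timings come from GTF/CVT, and the actual sync limits come from the monitor's spec sheet):

```python
# Rough estimate of a video mode's horizontal scan rate.
# The ~8% vertical blanking overhead is an assumed typical value.

def approx_horizontal_freq_khz(v_active, refresh_hz, v_blank_frac=0.08):
    # Horizontal scan rate = total lines per frame * frames per second.
    v_total = v_active * (1 + v_blank_frac)
    return v_total * refresh_hz / 1000.0

# 1080 active lines at 60 Hz -> roughly 70 kHz horizontal scan rate.
print(round(approx_horizontal_freq_khz(1080, 60), 1))
# 75 Hz needs a proportionally higher scan rate, yet reportedly works on
# this monitor, so the 60 Hz failure looks like a mode/EDID quirk rather
# than a sync-range limit.
print(round(approx_horizontal_freq_khz(1080, 75), 1))
```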
 
I'm pretty sure all computer monitors (CRT or otherwise), if not all CRT displays, can handle full chromatic spatial resolution. It's TVs that typically do subsampling, though some TVs have an option for full 4:4:4.



In theory an OLED will trump a CRT in just about anything that matters, with a couple potential caveats:

1) OLEDs still receive a digital signal, so the bit depth is limited by the circuitry of the panel.

2) It would need to be strobed, which might or might not add a negligible amount of input lag.
I was almost ready to buy an OLED TV tomorrow, but I wanted to wait a year or two until some of the quirks are ironed out.

It's a shame we may not see any real OLED computer monitors for a couple years.

Question I have for you spacediver is this:

Input lag is a big deal to me, so are you saying that if an OLED display is not strobed you are getting input lag on par with an LCD?

Isn't the motion blur on an OLED already good enough to where you shouldn't need to worry about strobing?
 
Input lag is a big deal to me, so are you saying that if an OLED display is not strobed you are getting input lag on par with an LCD?

To my knowledge, the input lag generally depends upon the amount of back end processing that the panel does. There could be significant input lag even without strobing, depending upon the panel circuitry. But this back end processing isn't a necessary component. If and when OLED gaming monitors make it to the market, I'm fairly confident there'll be little to no back end processing.

Isn't the motion blur on an OLED already good enough to where you shouldn't need to worry about strobing?

Eye tracking based motion blur arises from the "sample and hold" nature of non CRT displays. An OLED isn't an impulse display (a CRT is). The only way to eliminate this sort of motion blur is to emulate an impulse display by black frame insertion/strobing.

Here's a good read for you.
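The gist of sample-and-hold blur comes down to one line of arithmetic: when your eye tracks a moving object, the blur width is roughly the tracking speed times the time each frame stays lit. A sketch with illustrative numbers:

```python
# Eye-tracking motion blur on a sample-and-hold display: while a frame
# stays lit, the tracking eye sweeps past it, smearing it across the retina.

def motion_blur_px(speed_px_per_s, visible_time_s):
    # Blur width = pixels the eye moves while one frame is visible.
    return speed_px_per_s * visible_time_s

# Full-persistence 60 Hz sample-and-hold: each frame lit for 1/60 s.
print(motion_blur_px(960, 1 / 60))   # ~16 px of blur at 960 px/s

# Strobed at 2 ms persistence: same speed, far less blur.
print(motion_blur_px(960, 0.002))    # ~1.9 px
```

This is why strobing/black frame insertion, not pixel response time, is what closes the gap with a CRT's impulse behavior.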
 
To my knowledge, the input lag generally depends upon the amount of back end processing that the panel does. There could be significant input lag even without strobing, depending upon the panel circuitry. But this back end processing isn't a necessary component. If and when OLED gaming monitors make it to the market, I'm fairly confident there'll be little to no back end processing.



Eye tracking based motion blur arises from the "sample and hold" nature of non CRT displays. An OLED isn't an impulse display (a CRT is). The only way to eliminate this sort of motion blur is to emulate an impulse display by black frame insertion/strobing.

Here's a good read for you.
Thanks bro, and holy shit no back end processing on an OLED :eek:

I'm already excited.

Thanks for the link, I'm reading it now. Appreciate it.
 
Which means whatever resolution is displayed is native?

If it helps you to think of it that way - sure. :) It's a little more complicated than that. But yes - CRTs are only really limited by their phosphor densities and the electronics driving the whole show. Your GDM monitor is only really able to fully resolve 1600x1200. Anything higher and it doesn't quite fully resolve it. The GDM-F520 - Sony's tightest monitor - can't do more than 1700-ish pixels either. The GDM-FW900 could fully resolve its 1920x1200 pixels. Not more, though.

EDIT: And while most would poo-poo this short-coming, I like to think of it as free anti-aliasing. :D
 
spot size of the electron beam is a huge limiting factor, but yes, so are the electronics and the density of the aperture grille.
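As a back-of-the-envelope check on the grille-density limit: the ~0.23 mm stripe pitch and ~484 mm viewable width below are approximate published figures for the FW900, so treat the result as a rough ceiling, not a spec:

```python
# The aperture grille's phosphor stripe count is a hard ceiling on how
# many horizontal pixels a Trinitron-style CRT can fully resolve
# (spot size and video bandwidth usually bite before this does).

def max_grille_stripes(viewable_width_mm, pitch_mm):
    # Number of stripe triads across the visible screen width.
    return viewable_width_mm / pitch_mm

# FW900: ~484 mm wide at ~0.23 mm pitch -> roughly 2100 stripes,
# just above its 1920-pixel-wide flagship mode.
print(round(max_grille_stripes(484, 0.23)))
```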
 
Strobed OLEDs will be done the same way a CRT does it, except that instead of a moving super-bright point that gets dimmer and dimmer as time goes on, there will be a moving bar of image. Perceived motion resolution will depend on the height of this bar: the shorter the bar, the better the motion resolution, but the brighter the pixels will have to be. A shorter bar will also mean more perceived flicker. Due to the limited maximum brightness of the panels, the bar will technically have to be quite wide, at least at first.

The motion blur reduction math is the same as for a strobed LCD:

[Image: motion-blur-graph.png]


There doesn't need to be any input lag in displays whose pixels are self-illuminated; it can be exactly the same as on a CRT. Strobed LCDs need input lag because they flash the whole image during the v-blank period and do not display anything while the pixels are actually changing on screen. It is an inherent flaw of strobed LCD technology. Everything else being the same, strobed OLEDs should also be easier on the eyes, especially at night.

BTW, a CRT doesn't have a native resolution, and it shows XD XD
For 21" Trinitrons, anything above 1280x960 looks severely distorted. The FW900's limit is at 1600x1000; the most popular 1920x1200 is already distorted due to the similarity of the resolution and the pitch size, and 2304x1440 is broken due to lost pixels. And frankly, imho, 1280x800 is the last resolution that looks sharp on the FW900.

For games, which are the typical use scenario for these monitors, it doesn't matter at all though.
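The rolling-bar trade-off described above (shorter lit bar = less blur but a dimmer image) is just a duty-cycle relationship. A sketch with assumed, illustrative numbers:

```python
# Rolling-scan strobing on a self-emissive panel: each pixel is lit only
# while the bar passes over it, so persistence blur shrinks by the duty
# cycle, but peak brightness must rise by its inverse to keep the same
# average luminance.

def rolling_scan_tradeoff(bar_height, screen_height, avg_nits):
    # Fraction of the frame each pixel spends lit.
    duty = bar_height / screen_height
    # Instantaneous brightness needed to preserve average luminance.
    required_peak_nits = avg_nits / duty
    return duty, required_peak_nits

# A bar 1/10th of the screen height: 10x less persistence blur, but the
# pixels must flash ~10x brighter to hold a 100-nit average.
duty, peak = rolling_scan_tradeoff(120, 1200, 100)
print(duty, peak)
```

This is why limited panel brightness forces a tall bar (high duty cycle) at first, exactly as the post says.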
 
yep, once the refresh rate is high enough to exceed everyone's flicker fusion threshold, then there'll be no perceived flicker. The visual system will temporally integrate it according to Bloch's law.
 
Over the years, I went from 1920x1200@85Hz as my desktop resolution to 1440x900@120Hz.

The latter is indeed a lot sharper: no motion blur, no distortion.

I even use it in older games now, because the eye strain is considerably less if you spend 8+ hours in front of your CRT.

At 1920x1200@85Hz I was always more tired and really 'felt' my eyes eehhh.. cooking :)

Now I don't have to stare to read small ingame fonts etc.

And 120Hz, or even 160 or 180Hz, gives a considerable advantage in PvP FPS shooters like Counter-Strike!

A CRT having no 'native resolution' is a BIG advantage.
 
Hard to determine whether it's the sharpness that's an issue, or simply the smaller fonts. I use the NoSquint plugin for Firefox to adjust text size - I find 115% works well on hardforum.
 
For 21" Trinitrons anything above 1280x960 looks severly distorted. FW900 limit is at 1600x1000 and most popular 1920x1200 is already distorted due to similarity of resolution and pitch size and 2304x1440 is broken due to lost pixels. And frankly imho 1280x800 is last resolution that look sharp on FW900.

I have a 21" Nokia 445Pro. It is a Diamondtron tube, tho. 1600x1200 and 1920x1440 are both very sharp, far better than 1280x960. 1920x1440 does need InterLacing to prevent flicker or else it only works at 80Hz. Going above 2048x1536 exceeds the dot pitch, but in games, it is OK. I usually use 1600x1200p@100 on the desktop. 2560x1920i@120Hz, 1920x1440i@160 and 1280x960i@240 for games.
 
Interesting that you use interlaced resolutions. I think I was able to push the 5402 to 160Hz, which is pretty neat.
 
I checked the laced modes: 1600x1000i@160Hz is completely hideous, and 1920x1200i@160Hz is only slightly less hideous. 2560x1440i@160Hz was for some reason not listed.
Anyhow, it looks completely unusable; I see scan lines when looking at a static image as I move my eyes, and on moving images it's even worse because the eyes move a lot. I feel tricked for even trying it out :mad:

I just added 1280x720@160Hz and oh my god, does it look... stable and just beautiful in comparison.
I would rather use virtual resolution (super sampling anti-aliasing) at 720p than even 2560x1440i@160Hz
 
uh are we talking about sharpness relative to the physical world or relative to pixels? obviously lower resolutions make it easier to make out individual pixels, but that doesn't mean the total image contains more detail

as for me, i'm fine with 1600x1200@85 on desktop for my cpd-g520p. and imo text is actually more readable than on (normal pixel density) lcd's.
 
Says who? I experienced one scenario where my AMD card looked sharper than my Nvidia one but I don't know if the reason was Nvidia vs AMD...
 
Hello!

Long time no see. I sadly lost interest in the FW900 and this forum because of real life, and my monitor was running well. Now it is defective and I am going to repair it. It is maybe a longer story, which I will tell here for everyone if all goes well.

Short version: My FW900's picture was collapsing horizontally, and the brightness alternated between dim and high once the monitor warmed up; the more heat, the faster it collapsed. If I gently touched the sides or the back of the monitor, the picture sometimes came back, so I think it is a cold solder joint. And my problem with the zoomed-in picture in the cold state (I wrote about it 2 years ago) was only a side effect of this (I think the solder joint was already cold before, but not as badly as now).

I have resoldered some solder joints on the A-board and D-board, including the flyback. I removed the suction cap and anode and properly discharged it. But there was something, maybe silicone-based, under the suction cap to insulate it. Maybe it is only glue, I don't know. But it is not clean anymore. Does anyone know if I can remove and clean the suction cap and the underlying area on the tube with alcohol and put the suction cap back on the tube without this glue? Or will something bad happen then?

It would be nice if someone could answer me about this. I have found nothing about it. Maybe Vito is still here? Thanks! :)

By the way: there are minimal scratches in the hole where the anode plugs in. The brown coating in the hole is minimally damaged. Will this cause problems, or is it not needed for the proper function of the monitor?
 
My 980 gives a garbled, blurry signal, but my 7970 is as clear as an LCD, even at extreme resolutions.

Can you provide some photos please? Take a closeup image of a white cross, that is in the center of the screen, on a black background. Switch video cards, take same image, and post both. Ideally, you want to mount the camera so that you don't have to move it in between shots. If you install both video cards at once, all you need to do is switch the cable from one output to the other.
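If it helps, a test image like that is easy to generate yourself. Here's a minimal sketch that writes a white cross on a black background as a plain binary PPM file (the filename, size, and arm thickness are arbitrary; any image viewer that understands PPM can display it full-screen):

```python
# Generate a focus test pattern: white cross centered on a black field,
# written as a binary PPM (P6) using only the standard library.

def write_cross_ppm(path, width=640, height=480, arm=2):
    cx, cy = width // 2, height // 2
    rows = []
    for y in range(height):
        row = bytearray()
        for x in range(width):
            # White wherever we're within `arm` pixels of the center
            # row or the center column; black everywhere else.
            on = abs(x - cx) <= arm or abs(y - cy) <= arm
            row += b"\xff\xff\xff" if on else b"\x00\x00\x00"
        rows.append(bytes(row))
    with open(path, "wb") as f:
        f.write(b"P6\n%d %d\n255\n" % (width, height))
        f.writelines(rows)

write_cross_ppm("cross.ppm")
```

Photographing the same file from both cards, with the camera fixed, keeps the comparison fair.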
 
Can you provide some photos please? Take a closeup image of a white cross, that is in the center of the screen, on a black background. Switch video cards, take same image, and post both. Ideally, you want to mount the camera so that you don't have to move it in between shots. If you install both video cards at once, all you need to do is switch the cable from one output to the other.

My 980s are not hooked up right now. The 7970 is air-cooled in a separate PC. The 980s are water-cooled, so they can't be interchanged. I don't have the PC that the 980s are in running right now due to stability problems. But I can get a picture of the 7970 vs a 6600GT or 7800GTX, as those are the cards in my PowerMac. (I'm currently tripping; will update w/ pics tomorrow)
 
yes, that would be great. Also make sure you're running the same resolution/refresh and timing parameters (e.g. GTF timing).

Enjoy the acid.
 
Hello!

Long time no see. I sadly lost interest in the FW900 and this forum because of real life, and my monitor was running well. Now it is defective and I am going to repair it. It is maybe a longer story, which I will tell here for everyone if all goes well.

Short version: My FW900's picture was collapsing horizontally, and the brightness alternated between dim and high once the monitor warmed up; the more heat, the faster it collapsed. If I gently touched the sides or the back of the monitor, the picture sometimes came back, so I think it is a cold solder joint. And my problem with the zoomed-in picture in the cold state (I wrote about it 2 years ago) was only a side effect of this (I think the solder joint was already cold before, but not as badly as now).

I have resoldered some solder joints on the A-board and D-board, including the flyback. I removed the suction cap and anode and properly discharged it. But there was something, maybe silicone-based, under the suction cap to insulate it. Maybe it is only glue, I don't know. But it is not clean anymore. Does anyone know if I can remove and clean the suction cap and the underlying area on the tube with alcohol and put the suction cap back on the tube without this glue? Or will something bad happen then?

It would be nice if someone could answer me about this. I have found nothing about it. Maybe Vito is still here? Thanks! :)

By the way: there are minimal scratches in the hole where the anode plugs in. The brown coating in the hole is minimally damaged. Will this cause problems, or is it not needed for the proper function of the monitor?

That is NOT GLUE! It is a special thermal grease. If you are not an experienced technician specializing in CRT technology and/or do not have practical experience working inside CRTs, I strongly recommend that you do not attempt any repairs, or you may run the risk of sustaining severe injuries, including electrocution.

Hope this helps...

UV!

 