24" Widescreen CRT (FW900) From Ebay arrived,Comments.

Yeah, my Dell P992 is a little bright with a reddish tint when I first turn it on, but it settles after 15 minutes. Would love a 21" version, which is basically what MeltBanana is looking at.
 
I'm noticing the greenish border around the desktop image, which could indicate it needs the infamous G2 adjustment via WinDAS, or it might just still be cold (sometimes symptoms go away after warmup).

Either way, it's fixable and would be hard to turn down at $15.

Honestly - sounds like a White Balance adjust is what the doctor ordered. The bias is probably a little off. No bigs. :D
 
I think SpaceDiver is the best person to answer these questions, but any input is welcome!
- This is a stupid question I am asking without doing any Google research, but what is the bit depth of CRT monitors, and of the FW900 in particular? For example, most LCD panels are 8bit, which limits them to only a 0-255 grayscale range. Many TN panels are actually 6bit + FRC (dithering), but some LCD panels are 10bit, and I think most HDTV's, including mine, are 12bit.

- Let's assume I get an FW900 and it requires a 1D LUT / ICC profile. On 8bit panels, a high quality 1D LUT made by the ArgyllCMS / dispcalGUI combo can create grayscale gradient banding on this test - http://www.lagom.nl/lcd-test/gradient.php . However, on 10bit and especially 12bit panels, these gradients are "smoothed out" almost to the point of imperceptibility. I know this from experience, because until recently nVidia CP only allowed 8bit depth on my HDTV, but the latest drivers allowed me to switch to 12bit depth. I applied my 1D LUT and toggled between 8bit and 12bit modes while viewing that grayscale gradient. 8bit depth produced some banding, not severe, but I could notice it. 12bit, however, eliminated 90% of that banding and only left a barely noticeable hint that there was banding. Will CRT monitors also show banding once a 1D LUT is applied?
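
As a rough illustration of where that banding comes from (a quick Python sketch; the 1.1-gamma curve is just an arbitrary stand-in for a real calibration LUT, not anything generated by ArgyllCMS):

```python
# Quantizing the same correction curve to different output depths: at 8 bits,
# neighbouring input levels collapse onto the same output code (visible steps
# in a gradient); at higher depths, all 256 inputs keep distinct codes.

def unique_output_levels(output_bits, correction_gamma=1.1):
    """Count how many distinct output codes survive after applying a mild
    1D-LUT-style correction to the 256 input levels at a given output depth."""
    max_code = (1 << output_bits) - 1
    codes = set()
    for i in range(256):
        x = i / 255.0
        y = x ** correction_gamma           # the "calibration" curve
        codes.add(round(y * max_code))      # quantize to the output depth
    return len(codes)

for bits in (8, 10, 12):
    print(f"{bits}-bit output: {unique_output_levels(bits)} distinct levels out of 256")
```

The 8-bit case ends up with fewer than 256 distinct levels (hence the visible steps), while the 10bit and 12bit cases keep every level distinct, which matches the smoother gradient I saw after switching to 12bit.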
 
Yea bit depth is a bit of a hairy chestnut.

A crt has essentially an infinite bit depth. The voltage you feed the tube will have a direct physical effect upon the beam current. You are essentially limited by the noise level of the circuitry. Data from Jenny Read's tests suggest that her old Compaq P1210 is capable of at least 16 bits (she was limited by the precision of her photometer). In order to do those tests, however, she used a datapixx device which is capable of feeding a monitor with such finely quantized voltages.

Then there is the difference between the number of LUT values, and the precision with which each entry can be specified, and this distinction is often not made clear.

If you have 256 LUT values, you might still be able to specify these values with 10 (or higher) bit precision. In other words, you may only have 256 levels between black and white, but you may be able to choose from 1024 different values for each of these levels. Both these forms of bit depth will help against banding.

You can actually do tests to see how much precision your own system has: you can write code that alters a LUT value in real time (Windows allows you to specify each value with 16-bit precision), and see how fine a change in this value the system can actually register, by measuring in real time with an instrument.
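
Something along these lines would do it on Windows (an untested sketch using the Win32 gamma-ramp calls through Python's ctypes; the entry index, step sizes, and the pause for taking a reading are arbitrary choices):

```python
# Nudge one mid-gray LUT entry by progressively smaller 16-bit increments and
# take an instrument reading after each change; the smallest increment that
# still produces a measurable luminance change tells you the effective
# precision of the video LUT. Windows only, and note that Windows may reject
# ramps that stray too far from the identity curve.
import ctypes
import time

gdi32 = ctypes.windll.gdi32
user32 = ctypes.windll.user32

Ramp = (ctypes.c_ushort * 256) * 3            # 3 channels x 256 entries, 16 bits each

hdc = user32.GetDC(None)                      # device context for the whole screen
ramp = Ramp()
if not gdi32.GetDeviceGammaRamp(hdc, ctypes.byref(ramp)):
    raise RuntimeError("could not read the current gamma ramp")

entry = 128                                   # mid-gray LUT entry to perturb
base = [ramp[ch][entry] for ch in range(3)]

for step in (256, 64, 16, 4, 1):              # ever finer 16-bit increments
    for ch in range(3):
        ramp[ch][entry] = min(0xFFFF, base[ch] + step)
    gdi32.SetDeviceGammaRamp(hdc, ctypes.byref(ramp))
    print(f"entry {entry} raised by {step}/65535 - take a luminance reading now")
    time.sleep(5)                             # time to measure with the instrument

for ch in range(3):                           # restore the original values
    ramp[ch][entry] = base[ch]
gdi32.SetDeviceGammaRamp(hdc, ctypes.byref(ramp))
user32.ReleaseDC(None, hdc)
```

If the measured luminance stops changing below a certain step size, that's roughly where the video card / display chain runs out of precision.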

I've never heard of a 12 bit HDTV (other than perhaps studio grade broadcast displays). Which model do you have?

fwiw, on my FW900, I don't see any banding in this image, but I also have a well calibrated display with a fairly low peak luminance (with higher peak luminance, banding will be more visible as the levels are "stretched out").
 
wtf
why can't i see banding in that image

but can in http://www.lagom.nl/lcd-test/gradient.php

yup that madshi.net image has dithering (open in gimp, increase contrast)

btw i've been using 5000k for the last week and i'm so used to it to the point that i don't recognize that it's not 6500k :p
 
ah yes that makes sense. in the lagom test I can barely make out a hint of banding, but then again my visual system may not be as keen as others.

and yep, that doesn't surprise me - we adapt rather readily to different whites. Important thing is having it consistent across the grayscale.
 
A crt has essentially an infinite bit depth. [...] I've never heard of a 12 bit HDTV (other than perhaps studio grade broadcast displays). Which model do you have?

It appears that most HDTV's are 12bit actually. Mine is a Samsung CCFL SPVA from 2009, but it really is 12bit. I tested the gradient and there was an obvious difference in gradation visibility between 8bit and 12bit when the same 1D LUT was specified. I also tested with ArgyllCMS using the dispcal.exe -R -y1 command. That command reports your bit depth up to 10bits only. When I used 8bit mode, ArgyllCMS reported 8bit, but when I used 12bit mode, ArgyllCMS reported "unknown depth", because I think it is limited to 10bit. Of course, only 0-255 grayscale levels will be measured and used to create a 1D LUT, or it may take an extremely long time to create a 1D LUT using 12bit depth (4000+ grayscale levels or something like that).

Originally, nVidia drivers did not have any bit depth options (while AMD drivers have had a 10bit depth option for a long time), but the 2 latest nVidia drivers suddenly provided an option to change color bit depth. At first, I could only use 12bit @ 23Hz, because using 12bit @ 60Hz resulted in severe artifacts all over the screen. Someone on the Doom9 madVR forums suggested that I get a High Speed HDMI cable, which I always thought was a gimmick, like HDMI versions. Still, I bought a High Speed HDMI cable and BAM - 12bit mode works @ 60Hz without artifacts! It appears that there is truly a difference in bandwidth between Standard HDMI and High Speed HDMI cables. I used the original HDMI cable from 2009, which did not provide enough bandwidth for 12bit color depth @ 60Hz.

Did you know that Standard HDMI cables had specifications only for 720p and 1080i? Good cables, though, allowed 1080p and higher I believe, but still, they were not really meant to.
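
A back-of-the-envelope check bears that out (a sketch using the nominal HDMI cable certification points: Category 1 "Standard" cables are tested to a 74.25 MHz TMDS clock, Category 2 "High Speed" to 340 MHz, and under HDMI deep color the TMDS clock scales with bits per channel):

```python
# 1080p60 uses a 148.5 MHz pixel clock; at 12 bits per channel the TMDS clock
# is 1.5x that, which is far beyond what a Standard cable is certified for but
# comfortably within a High Speed cable's rating.
PIXEL_CLOCK_1080P60 = 148.5e6        # Hz, standard CEA-861 1080p60 timing
STANDARD_CABLE_LIMIT = 74.25e6       # Hz, Category 1 ("Standard") certification point
HIGH_SPEED_CABLE_LIMIT = 340e6       # Hz, Category 2 ("High Speed") certification point

for bits_per_channel in (8, 12):
    tmds_clock = PIXEL_CLOCK_1080P60 * bits_per_channel / 8
    std = "ok" if tmds_clock <= STANDARD_CABLE_LIMIT else "out of spec"
    hs = "ok" if tmds_clock <= HIGH_SPEED_CABLE_LIMIT else "out of spec"
    print(f"1080p60 @ {bits_per_channel} bpc: TMDS clock {tmds_clock / 1e6:.2f} MHz "
          f"(Standard cable: {std}, High Speed cable: {hs})")
```

Even 8bit 1080p60 is past the Standard cable's certification point (which is why those cables were only specced for 720p/1080i), though short runs often get away with it; 12bit @ 60Hz pushes the clock to 222.75 MHz, which only a High Speed cable is rated for.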

So far, 12bit in games and in films is only useful for gradient smoothness. However, there is at least one game, Alien: Isolation, that has an option to use Deep Color (10bit / 30bit). It looks fantastic @ 12bit. Black-to-dark-to-lighter-dark shadow transitions are smoother than in any game I have seen!

Since you say that CRTs have almost infinite color bit depth, then a 1D LUT that produces banding in a grayscale gradient on an 8bit LCD would produce no banding, or only extremely mild banding, on a CRT? That would be wonderful!

I recall seeing an FW900 at some point, and one thing I noticed was slight curvature on the bottom of the screen, so geometry was not perfect. It was no big deal, but I once owned a Samsung DynaFlat ShadowMask CRT that had a rather bad curvature on the bottom and there was no way to adjust it. AFAIK, only left/right side curvature can be calibrated/corrected.
 
It appears that most HDTV's are 12bit actually. Mine is a Samsung CCFL SPVA from 2009, but it really is 12bit.

Where are you getting this info?


Originally, nVidia drivers did not have any bit depth options (while AMD drivers have had a 10bit depth option for a long time), but the 2 latest nVidia drivers suddenly provided an option to change color bit depth. At first, I could only use 12bit @ 23Hz, because using 12bit @ 60Hz resulted in severe artifacts all over the screen.

Can you show a screenshot of this option?

So far, 12bit in games and in films is only useful for gradient smoothness. However, there is at least one game, Alien: Isolation, that has an option to use Deep Color (10bit / 30bit). It looks fantastic @ 12bit. Black-to-dark-to-lighter-dark shadow transitions are smoother than in any game I have seen!

I was under the impression you needed an Nvidia quadro card for deep color support (assuming you were using an Nvidia card).

Also not sure what you mean by "only useful for gradient smoothness". And when you refer to 12 bit, are you referring to the number of levels between black and white? Or the precision with which the 256 levels can be specified?


Since you say that CRTs have almost infinite color bit depth, then a 1D LUT that produces banding in a grayscale gradient on an 8bit LCD would produce no banding, or only extremely mild banding, on a CRT? That would be wonderful!

This is where my knowledge tends to run out. But even with infinite bit depth throughout the entire video chain, you can create banding by design if you wanted to.
 
The image on my FW900 has become slightly yellow in the following portion of the screen.

[attached image: kYN2hsJ.png, showing the discolored region]


Had it been in the corner, I'd have adjusted the landing.

If I degauss, the monitor displays the image clearly for a split second before settling, which clearly shows how discolored that part gets.

How can I remedy this? Windas adjustment (don't have a cable yet)? Degauss coil?
 
Windas has a Landing option that can adjust that portion of the screen. It's the first slider option.

EDIT: But you should only really play around with it if you know what you're doing. WinDAS tells you the values it needs to be at to be in spec. If you adjust it to the values in spec but still find an issue with the yellowing, then what you'd need to do is adjust the purity magnets on the back. Hurray - not easy.
 
yea just saw this. Better be in good condition for that price. June 2003 manufacturing date from what I can tell, not bad.
 
HOLYSHIT, that's priceless.

Is it even possible to get one of these somehow?

If you mean the GDM5402...looks like there's still at least one left. (There's been a pattern of caches of new CRTs being discovered and sold off on eBay over the years.)
 

Part of the G1 family apparently. Not a CR1 (GDM-F520, Artisan, etc.) but still - a brand-new, 0.24mm AG, 21-inch tube... If I had the computer fund and space I'd take it. :) Though looking at the OSD menu, I don't think these things have the "Expert" color adjustments that some of our higher-end users like. But that's what the WinDAS WPB adjust is for. :)
 
Just pulled out my FW900 to play some Outlast (among other games). Holy crap, this monitor looks soooo good. Currently playing at 2304x1440 at 70hz and using that resolution right now as I type this. Monitor is still sharp as ever and it works fantastic. I really really hope this thing lasts. :)
 
I always find it amusing that the image shown in the ad pictures of these excellent quality CRTs is always BAD.

Makes me wonder if this pinnacle of CRT technology (aka the GDM-FW900) wasn't there too early. Graphics cards etc. might not have been able to get the most out of this CRT.

I mean, I doubt there was a game, let alone a graphics card, that could display a game with 4k texture resolution at the highest resolution back then.

At launch, this monitor was used only for film editing and photo work I imagine?

I wonder, if somehow this type of CRT had only launched after 2005, whether flat panels would still have had 99% of the market for gamers.
 
At launch, this monitor was used only for film editing and photo work I imagine?

I'm sure it was used by gamers who could afford it - but they probably didn't run their games at such high resolutions back then.

But yea, the FW900 was a big deal among graphics and video professionals.
 
So it looks like all the AMD 300 series cards will have analog VGA out:

http://www.pcworld.com/article/2936...fury-graphics-cards-new-r300-series-gpus.html

So that's pretty cool. The fury won't, but running the 390x in crossfire will still crush any game coming out.

wat?! :)

I was planning to replace my 7950 with a GTX 970, but if they really decided to roll back the DAC removal and put DVI-I on all the 390/390X cards and not just the prototype/display-model, I know what I'm buying instead
 
At launch, this monitor was used only for film editing and photo work I imagine?

I wonder, if somehow this type of CRT had only launched after 2005, whether flat panels would still have had 99% of the market for gamers.

I bet LCDs would have 99% of the market because nobody would make, sell, or advertise the FW900. Basically, the LCD market was worth billions and billions of dollars to corporations, more than any of us can begin to fathom. I think I heard a figure of over 100 billion. If only they had put that money into science... instead of flat panel marketing and pockets. They made a global fortune with LCDs. In this age, if someone tried to market such a display after all that LCD marketing, it would get swallowed up by its mainstream competitors, no matter how good it was. At the very least it would be bought out and killed off. Every mainstream publication would be paid off and no one would hear about it.
  • Right now, if relisted by some kind of force, I think the display would have a hard time competing with the new 4k LCDs. On the other hand, while still superior in many ways, it would be harder to market, and no one would be allowed to market it; OLED causes enough trouble for the still multibillion-dollar LCD world, and it's a way more specialty thing.
  • For gamers, they might advertise response time; the response is still faster than even the 1ms LCD displays, I'm sure, down at the millisecond level.
  • The speed would also be a point in its favor; rocking 90 Hz is just fine, a huge improvement over the 60Hz LCDs EVERYONE flocked to about a decade ago. But they're unlikely to realize that when shopping for their next display.
  • Viewing angles are better than, or on par with, the best IPS panels - better, imo, and I own and have used many IPS displays. Everyone seems to love viewing angles now, but everyone ditched them 10 years ago. No one thought twice about gamma stability or off-angle viewing back then. So many people have been crying for a 90Hz wide-viewing-angle IPS solution over the past 3 years, yet turned a blind eye to the past. But of course not widescreen; most of us had 4:3s at the time, and many of us the smaller kind. :p
  • And the ability to properly display deep color, something the market has withheld from gamers for a later date. With new programs coming out in "deep color" - movies, etc. - going back to CRT has benefits, and that could be advertised. But they are only advertising it now; since people have bought 16.7 million colors, now they want to sell us our colors back like it's so valuable! But that could be used to sell an FW900.
  • The lighting: it isn't LED-based and has more uniformity than the typical LCD. Some people buy low-bias LCD panels. Most gamers don't care about that, but if you bring it up as a problem in the other camp...
  • It will lose most people on the highly desirable 4k resolution. It also loses the 27"+ crowd, people who want a bigger display.
  • Most people will choose 4k rez (in programs with support), 16.7 million colors, and a size increase, over multi-scan (non-fixed resolution), 1 billion colors (in programs with support), contrast, and black point. Most people run the FW900 past its best point anyhow to try and gain extra rez. While detail is nice and can help show some things, so does contrast, which people have historically moved away from in favor of fashion, and now the FW900 would have to deal with both 4k and new larger panels.
But what if the FW900 were in the 26-inch class instead of 24, 1440p and 16:9 widescreen, with a .2 to .231 pixel pitch, and less than 1 ms response time? I think it could be popular again then, at least for some special group of people. Uber geeked.
 
So it appears the 390 and 390x will not have VGA, only the 380 and below.

So, as of right now, the 280X is still the best single-GPU AMD card with VGA. And that will probably be the case for a while unless they plan on releasing a 380X. And all the Fury cards seem to be HDMI and DisplayPort only.
 
So I'm pretty sure this is it. She's finally gone. The power indicator is blinking orange. I tried different cables and a different video card, and I used an HDFury2 for both the BNC and VGA connections to see if it was faulty. Same result.
 
So I'm pretty sure this is it. She's finally gone. The power indicator is blinking orange. I tried different cables and a different video card, and I used an HDFury2 for both the BNC and VGA connections to see if it was faulty. Same result.

No need to guess. Check the service manual to see what the blinking sequence means.

EDIT: This will tell you what's wrong with it and what you'll need to do to fix the guy.
 
Just wanted to chime in. I finished playing Outlast on my PC with ye olde FW900. 2304x1440 at 70hz was what I played. Crisp and sharp, and nice and black. This monitor STILL kicks so much ass it's not even funny.
 
nice nice :D

I'm just finishing up the Mass Effect series. The graphics look bloody awesome on the display.
 
I have to ask for some advice from people who own the FW900. I just got a new video cable that I was hoping would be the best way to hook up the monitor.

I got a straight high quality 5-bnc cable and a DVI to 5-bnc female adapter.

This is the BNC cable I got:

http://www.youravcablestore.com/component-video-5-bnc-sonicwave-zoom.html

Here is the DVI to BNC adapter I got:

https://rubimages-liberty.netdna-ssl.com/hi-res/DVI_A-5BNCF.png


I hooked up everything and it works, BUT the screen is totally purple. I know that this usually means the color space is wrong or that there is a mismatch between RGB and YPbPr signals.

How do I correct this though? Are there any settings that I can change to get the correct signal?

I was under the impression that any and all 5-bnc cables would work but just not the three bnc component video cables.

I even think I saw someone else mention they were using the exact same Sonicwave BNC cables I got somewhere on this thread.

I'm sure there is some setting I need to adjust but I can't seem to find it.

Any advice would be appreciated.
 
You could try fiddling with the green cable and see if that fixes it. Purple-ish hues are a sign that the green channel is absent.
 
You could try fiddling with the green cable and see if that fixes it. Purple-ish hues are a sign that the green channel is absent.

Thanks for the suggestion. I just switched back to a straight VGA cable and the purple tint remains! What the hell is going on?!

It's almost like connecting that cable triggered something in the video card to output a certain color format or something.

I know that Nvidia doesn't give you the option to switch color formats between RGB and YCbCr or anything like that.

I really don't know what is going on.

It was working fine a couple hours ago with my other cables. To clarify, I just tried the 5-bnc cable and adapter cable and that particular cable was never working correctly.
 
I tried yet another cable and I get the exact same thing. Purple screen. This is true in the bios and in Windows.

I've never heard of something like this happening. All I did was plug in this new BNC cable I just got in the mail and now I have these same symptoms no matter what cable I use! I am using the exact same cable that was working fine two hours ago and now I've got a purple tint.

This is a classic example of a color space mismatch, but I don't know how to fix it. There are no settings that I know of to adjust.

I'm open to all suggestions.
 