24" Widescreen CRT (FW900) From Ebay arrived,Comments.

If you remember my problem with glowing around white squares on a totally black background: I now think it is not the same as the problem I can only describe as "ghosting" (if you move the mouse over a black background, you see a faint white trail on the background for a very short time).

I now think that "glowing" problem is normal and comes from the contrast setting?

http://www.productionapprentice.com/tutorials/general/using-color-bars-to-set-up-your-equipment/

On that page, search for the text about contrast or "blooming"; it describes exactly the problem I have. It suggests lowering the contrast, and if I do that it gets much better, but it is not completely gone. If I lower the contrast so far that I can't notice the glow around bright objects on a black background, then the contrast is very poor! I don't think that can be right. My contrast is at 90%, as the sRGB mode suggests. If I raise the contrast further, the glowing also increases.

Is this a common problem with Trinitrons? For a test, run Nokia Monitor Test, then the black level test; on the third page there should be a test with five little white squares on a black background.

Any ideas?

I have also found out that the "ghosting" problem may be a mechanical problem with grounding or the connector, see this:

http://hardforum.com/archive/index.php/t-1308443.html

I tested a much better VGA cable, which was also shorter, better shielded, and looked much more solid. But the image was a little bit worse. Also, after restarting I only had a maximum of 1024x768. I wondered about that and found out that this cable is probably quite crappy; it only looks like good quality. The problem was that with this cable the PC could not read the monitor's EDID info for resolutions, frequencies and so on. I am back on the default cable and the resolution is normal again. But the glowing is also back.

By the way: while chasing the other problem I looked at the EDID info the monitor reports, and found out that Sony itself tells the PC that the gamma of the FW900 is 2.5! And there is no way to correct gamma with the OSD settings. I now have a gamma correction of about 1.19 in the driver and all gamma tests pass. I don't think it is possible to pass the gamma tests with the OSD alone; the black level would end up far too bright. Gamma only affects the midtones, not the overall brightness. If you tried to reach the correct gamma value with brightness/contrast alone, the black point would become far too bright, and vice versa. If someone says that's not true, show me the OSD settings. It would be nice to have gamma 2.2 without video driver gamma correction AND without any loss of true black. Anyone here with that? Also, gamma is not the same as D65: I have set the 6500K color temperature, and that has nothing to do with gamma or with brightness; gamma is only about the midtones.
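
If you want to check what your own monitor reports, the gamma value sits in a single byte of the EDID block (per the EDID 1.3 spec, byte 23 stores gamma*100 - 100). A rough Python sketch, assuming you have a raw 128-byte EDID dump saved as "edid.bin" (the filename and the dump tool are up to you):

[code]
# Minimal sketch: decode the "display gamma" byte from a raw 128-byte EDID dump.
# Per the EDID 1.3 spec, byte 23 (0x17) stores round(gamma * 100) - 100,
# so a value of 150 means gamma 2.5.  "edid.bin" is a hypothetical dump file.

def edid_gamma(path="edid.bin"):
    with open(path, "rb") as f:
        edid = f.read(128)
    raw = edid[0x17]
    if raw == 0xFF:               # 0xFF means "gamma not defined in this byte"
        return None
    return (raw + 100) / 100.0

if __name__ == "__main__":
    print("EDID reports display gamma:", edid_gamma())  # FW900 reportedly 2.5
[/code]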

The gamma test is here again:

http://www.simpelfilter.de/farbmanagement/images/gammapyramide-rgb-flicker05.gif
There should be no difference between the two color sides, only perhaps slight flickering.

If someone can achieve this with 6500K color temperature AND true blacks, but without gamma correction in the video driver like I currently use, please write how you did it. :)
 
@ZeosPantera

Thanks, I found it somewhere. Does it work? Yours is also nice, because it fills the whole screen.

How did you calibrate the gamma? Also to 2.2? And did you change gamma with the video driver, or is it possible to do it via the OSD? I don't think that is possible via the OSD. A CRT does not reproduce the signal linearly, and gamma correction exists to compensate for that so the scale matches human vision; if the CRT could do this by itself, we would never need gamma correction. But I have also heard that some LCDs and CRTs do have this option in the OSD nowadays...

EDIT: But I get very different results between my image and yours. With your image it looks like green is out of range, and if I adjust using your picture I need 1.08 for blue, 1.08 for green and 0.16 for red. If I then switch to my flickering image, it flickers much more and I have to adjust the values again, to 1.16 for red, blue and green. What are your corrections? Do you also see differences? And did you have to set a specific channel like green to a different value than red, for example, or all three to one value?

Also you have the "problem" that if you codlstart your monitor that the top and right edge are out of the screen a little bit and after 10 minutes or so they are again in screen? i mean out of the right and top broder(brezel) if you colstart. Only right and the top. Not the bottom and not the left. After 10 minutes the black borders on the right and top are equal like the bottom and left one. Is that an damage or so?

EDIT2: Also funny: http://web.mit.edu/jmorzins/www/gamma/rwb/gamma2.gif
If I follow this picture I have a gamma of about 1.8. But that's not true; the 2.2 patch in the first row is too bright to match the 2.2 one at the bottom. Anyone have results here?

EDIT3: What do you think about "gammax", a hardware gamma booster that does the same thing as the video driver gamma boost but in hardware, as if the FW900 had a gamma control in the OSD? Has anyone tested it?
 
Most somewhat high-end CRTs do 120 Hz at 1024x768.

I know my Dell 21" Dell D1626HT (UltraScan 1600HS Series) does. So that, or any model which is better.
 
Using the program CRU I'm able to set the desktop resolution to 1920x1080 @ 100 Hz.

But when I go into a game, it's at 85 Hz.

Anyone know how to fix this?
 
How did you calibrate the gamma? Also to 2.2? And did you change gamma with the video driver, or is it possible to do that via the OSD?

I do my gamma adjustments in my ATI software. The numbers are low: 1.10 red, 1.05 green, and blue gets no tweak.
 
I do my gamma adjustments in my ATI software. The numbers are low: 1.10 red, 1.05 green, and blue gets no tweak.

OK, I also did it with a driver adjustment... As far as I know there is no hardware way. Hmm, does it mean your monitor has a color shift of its own, since you had to change the gamma values per channel?

Also, with your image the green is not right... have you tested mine? I can't pass both: if green does not flicker in my image, your image shows dark green bars -.- Do you also see this?

@atwix
Nice tutorial, but please test your gamma with the picture from ZeosPantera. It would be interesting to see whether green is out of range for you too. I can't get the flickering picture and ZeosPantera's picture in range at the same time, only a compromise between the two.
Also, I use Linux. But can you set the gamma value in the Windows 7 calibration app? And yes, brightness is not the brightness in the NVIDIA or ATI control panel, but there should also be a gamma setting there. If you don't set gamma with the NVIDIA or ATI control panel, the question is: with what then? Can Windows 7 set it there by itself?

Also interesting: http://www.lcdreviewz.com/GammaCorrection
If gamma were really right, there should be no difference anywhere. But that is simply impossible for me! If it is 100% right at the top, then some other area is a little (or a lot) out of range. What can cause that? I also read that if you change gamma in software, your number of colors is reduced; for example, out of 255 red levels you only keep about 234, because of 8-bit gamma correction in the video card or something. There are so many people here who have calibrated their monitors. It would be very nice if someone could see all three of these pictures correctly at the same time! And if not, it confirms my suspicion that some of the pictures are simply not right! Also, this test http://www.bberger.net/rwb/gamma.html absolutely does not work for me, like all the other "chessboard" images. I think that effect just doesn't work on the FW900: no matter how far away I go, I can still see the chessboard dots and it never becomes a single solid color -.-

Please, there must be someone here who has already done all of this and can answer my questions. It is frustrating not knowing which pictures I can trust -.-
 
Well, all the gamma screens on the lcdreviewz site are perfect for me. With the other site you need to squint to really use them.

Here is a photo of the gamma calibration image on my new Epson 8350 projector.
http://i.minus.com/iN8I0scaQ6qlf.jpg
You can see where the very corners start to misalign, but that is it. This had to be adjusted with my monitor's gamma corrections in place, since I clone one display to the other. Took a while, but the result is excellent. http://i.minus.com/ibCITbrXbDFMr.jpg

Not sure what is up if you can't calibrate your 900. I would clear the software tweaks for now and just focus on brightness, contrast and the color options in the OSD. After those you can get into the nitty gritty.
 
crtfw900.jpg


I used a lot of material made by Norman Koren in this guide. All credit goes to him for making the images used in this guide, for the program QuickGamma, and for some passages of his website used in this manual.

Visit his website here:

http://www.normankoren.com/makingfineprints1A.html

INTRODUCTION

This is a CRT calibration guide. Well, more of a CRT monitor image adjustment guide. I hope it will be useful to you.


------------->I heard that some people use customised resolutions and refresh rates for games on CRT, for instance to play counterstrike at 120 hertz on a fw900. How do they do this?

They use ToastyX's CRU program. It allows you to safely override your monitor's EDID without messing with the INF files. I increased the default refresh rate for my 1920x1200 resolution to 95 Hz on my FW900 with this program. You can set any resolution you want, at any refresh rate, as long as you don't exceed the 400 MHz bandwidth limit of the signal.

A couple of examples: 2304x1440@80Hz uses 399.xx MHz of bandwidth, and 1920x1200@96Hz also 399.xx. You can set a resolution of 1440x900@120Hz or 1280x800@1xxHz. Even really high resolutions like 2560x1600@68Hz are possible, but the FW900 can't resolve all the pixels then due to dot pitch limitations. :p The possibilities are endless.
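
For those who want to sanity-check a custom mode before trying it, here is a back-of-the-envelope Python sketch for the pixel clock. It assumes GTF-like blanking (roughly 30% of each scanline plus about 550 us of vertical blanking); CRU's own GTF/CVT math is the authoritative number, this just gets you in the ballpark:

[code]
# Rough pixel-clock estimate for a custom mode (a sketch, not CRU's exact math).
# Assumptions: ~30% of each scanline is horizontal blanking and ~550 us of
# vertical blanking per frame, which is roughly what GTF produces for a CRT.

def approx_pixel_clock_mhz(h_active, v_active, refresh_hz):
    v_total = v_active + 50                      # first guess for blank lines
    for _ in range(3):                           # line rate and v_total depend on each other
        line_rate = refresh_hz * v_total         # lines per second
        v_total = v_active + int(550e-6 * line_rate) + 3
    h_total = int(h_active / 0.70)               # ~30% horizontal blanking
    return h_total * v_total * refresh_hz / 1e6

for mode in ((2304, 1440, 80),    # lands near the ~400 MHz limit quoted above
             (1920, 1200, 85)):   # the FW900's standard mode
    print(f"{mode}: ~{approx_pixel_clock_mhz(*mode):.0f} MHz")
[/code]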

If you want to play first person shooters on a CRT at 120hertz, USE THIS PROGRAM!

http://www.monitortests.com/forum/Thread-Custom-Resolution-Utility-CRU

------------>Monitor Calibration and Profiling: What's the Difference?

To "calibrate" is to change the behavior of a monitor (or printer or scanner) to return it to a standard. Periodic calibration will maintain the monitor so that the way it produces color will stay consistent over time.
To "profile" is to analyze the monitor to see how it produces its color. With a profile you can tell other applications (like Photoshop, for instance) how to convert color settings so the image looks right on screen.
In practice, most monitor calibration and profiling software performs both of these tasks at once and you may not notice when it moves from one task to another.

----------->I've heard I should calibrate my monitor. Why?

Monitors vary their color output over time as they age and with normal use. Calibration keeps them operating in a stable way and keeps the profile valid. A monitor profile is a conversion table that describes how a monitor produces color. It's used by your system and applications to convert colors for display - for when you want that scanned photo to look good on screen. It's also used to convert screen images for use elsewhere - for example, when you've edited an image on screen, like what you see, and want to reproduce the colors on your printer.
Monitor calibration is especially needed if your CRT monitor shows smearing or light trails. Imagine looking at a torch in a dark alley while moving past it: if the torch leaves a light trail, then you have this problem. It has to do with the phosphors of the monitor; basically, they no longer fall back from really bright to black fast enough. Usually this can be fixed, as it can be caused by bad cables, bad DVI/VGA converters, or video cards that are not very good at producing analog signals (yes NVIDIA, I am looking at you! :p).
Don't throw away a CRT that has this problem. Consider having it hardware-calibrated at a shop.

If your monitor has light trail problems: look at this post below.

http://hardforum.com/showpost.php?p=1039349278&postcount=8765

----------->How do I calibrate and profile my monitor?

There are a couple of different methods to calibrate and profile your monitor:
1. Software only. Control-panel style utilities can be used to profile and calibrate your monitor "by eye". The software walks you through several steps to set your monitor's gamma and white point, and lets you select your monitor from a predefined list.
2. Software and Hardware. Using a colorimeter or spectrophotometer (instruments which measure color output) in conjunction with software will also calibrate and profile the monitor. The hardware is "stuck" right to the surface of the screen and reads multiple color patches.

When only software is used, you are left to guess at the phosphor colors the monitor displays. With a hardware instrument the red, green and blue phosphor colors, as well as the white points, are all accurately measured and this builds a much more accurate profile. It also takes into consideration the aging of your monitor.


---------->The best video and FAQ I ever found on CRT TECHNOLOGY

This is a really nice find on YouTube: a guy who explains in full detail how CRT technology works, and how to do the 10-12 most common CRT repairs yourself, without too much electronics knowledge. It's actually quite relaxing to watch, and a good refresher course for people who used to know a lot about CRT monitor tech and repair.

http://www.youtube.com/watch?v=YsZ5PJB-w2s

As for the FAQ I found on the net:

It covers a lot of ground, and I learned quite a lot about CRT technology reading it all. But boy, is it outdated. It mentions a 15000 dollar plasma display as the comparison against CRT technology. I laughed, because that price-versus-quality comparison stayed true for about ten years (until flat screens could compete with CRTs).
link below

http://arcadecontrols.com/files/Miscellaneous/crtfaq.htm

------------>A very good program to test convergence, moiré, colours and lots more: Nokia Monitor Test. It's freeware from 2004, so it probably won't be compatible with Windows 7 64-bit. :(

download link below.

http://www.majorgeeks.com/downloadget.php?id=960&file=15&evp=ef020e4dc4accd5ec550443783275680

------------------------> Should I remove the anti-glare coating on the FW900 if it's undamaged (no scratches)?

If you want my opinion: it does not hurt to remove it, but only consider it if you use your FW900 in a room with no direct lights or sunlight falling on the monitor.
In reality, monitors both WITH and WITHOUT the AG coating look very good. The black level is beautiful with the AG removed and with it on, and the same goes for sharpness and colors. You can always remove it, but once it's off you can't get it back on!!!

People say removal improves colors and can "restore" a screen with blurry text output. So if you play in a dark room (as most gamers do, right?) you should consider taking it off. I posted some links below on how to remove the bezel of the monitor (so you can access and remove the glue of the coating) and on the best procedure for removing the AG coating itself.

WARNING: don't do this without reading all of the material. SOME PARTS INSIDE THE MONITOR CAN GIVE YOU AN ELECTRIC SHOCK EVEN WHEN THE MONITOR IS COMPLETELY DISCONNECTED FROM MAINS POWER!! So be careful and follow the step-by-step guide.

Uncle Vito has this to say on the matter:

ANTIGLARE ISSUE: There are no fixes to the antiglare as it is a thick film pressured adhered to the screen at the factory. The only thing that can be done is to remove it. At the customer's request, I do that quite often as I've observed thru measurements that the antiglare film interferes with the calibration process and we've obtained different calibration data with and without the antiglare. Like I said, it is the customer's preference. Now, to remove the antiglare, you have to open the case, and then detach the bezel. Before you do that, MAKE ABSOLUTELY SURE THAT THE MONITOR IS OFF AND DISCONNECTED FROM POWER SOURCE! Then to remove the thick antiglare is very tricky and you must be patient! Carefully lift the film from any corner of the screen using a blade, and once you have enough detached film, with your fingers start pulling off the film. The pull action must be done slowly and from the top and bottom corners working inwards towards the center of the screen. If you pull too hard, you may peel off the film leaving a thick coat of glue and pieces of the film that are not easy to remove.

How to remove the bezel of the FW900: photo guide at this link. http://hardforum.com/showpost.php?p=1031915495&postcount=3668


What a damaged anti-glare coating looks like: http://hardforum.com/showpost.php?p=1039067786&postcount=8480 If you got a unit damaged like this, DO NOT TOSS IT AWAY! Removing the coating fixes it.

A guide to removing the anti-glare coating at this link: http://hardforum.com/showpost.php?p=1036390264&postcount=6402

A couple of pictures that demonstrate the difference between an AG-coated FW900 and a unit that has it removed: http://hardforum.com/showpost.php?p=1037137292&postcount=6975

1. CLARIFICATION ABOUT COMMON MISCONCEPTIONS ABOUT CALIBRATING A CRT MONITOR


I want to clarify some things about how to properly calibrate a CRT.

------------->1.1 Most people refer to "calibrating a CRT" when in fact they talk about adjusting a CRT's image output

(Thanks Uncle Vito for clearing that up, I fell for it too.) A "real" calibration is done with a WinDAS cable, colorimeters, etc., which is beyond most users' capabilities at home. This guide explains how to properly adjust the image output of a CRT by installing a good profile, configuring the color options in the OSD of the monitor (the "MENU" button on the CRT itself; choose whether you want to go with D65 or the customised values for color gain and bias listed below), configuring the gamma of the monitor with software, and configuring brightness/contrast in the OSD of the monitor.
This procedure won't help CRTs that have some kind of fault, like faulty guns. In that case talk to Uncle Vito, the specialist in selling and refurbishing FW900s. He goes by LAGRUNAUER on this forum.

------------->1.2 Calibrating a CRT is not as easy as modifying brightness/contrast on the monitor.

That might seem obvious to many of us, but most people don't know it, since LCD panels don't have very good black levels and CRTs DO. Calibrating a CRT is usually a matter of getting the GAMMA CORRECTION right. The more used a CRT monitor is, the harder it gets: aged CRT monitors have much darker image output than new ones and need different gamma correction values than less-used monitors. Note that you can *still* get all values right on a very dark, well-used CRT. It's all about settings.

The Contrast control on CRT monitors and television sets is actually brightness or white level, and the Brightness control is black level. The nomenclature is confusing but deeply entrenched. For example, on old CRT televisions operating in typical viewing conditions, where high ambient light limits the visible dynamic range, the Contrast control affects the apparent contrast.
The native gamma of monitors -- the relationship between grid voltage and luminance -- is typically around 2.5, though it can vary considerably. This is well above any of the display standards, so you must be aware of gamma and correct it. A display gamma of 2.2 is the de facto standard for the Windows operating system and the Internet-standard sRGB color space, which also assumes a D65 (6500K) white point. Use the D65 (6500K) color profile that comes with the driver package for these monitors. They usually come in the driver ZIP files; I know most modders who adapted the driver to Windows 7 added them to their ZIP. To install a color profile file: right-click on the Windows wallpaper (the background outside any open windows), then click on Properties, Settings. Color profiles are usually loaded under a COLOR tab. It shouldn't be all that hard to find in Windows.

Typically, a properly prepared image for the web will be in sRGB, which is a color gamut suitable for display. Adobe RGB, among others, is used for images going to print.
So an FW900 should be configured in sRGB, as I advised, and gamma should be set around 2.2 to match the sRGB profile. An image in RGB space should become lighter as you raise the gamma setting and darker as you decrease it. Only tamper with gamma if it's really needed. As you mentioned, chris, 2x brightness / 90 contrast in sRGB might appear dark at first, but your eyes adapt; the real goal is having real whites and real blacks, on websites and in games. BUT YOU MUST CALIBRATE BRIGHTNESS/CONTRAST AT THE SAME TIME AS GAMMA, because higher or lower gamma also means higher or lower apparent brightness. It's not technically true, but it looks that way at first glance.

----------->1.3 If a monitor still has washed-out black levels after adjusting, then you probably set something wrong after all.

Especially if you are used to flat panels, your eyes need to get used to real black levels on a CRT. Generally the FD Trinitrons fail bright (the G2 problem) or have HV failures (snap, crackle, pop); they don't seem to fail toward washed-out blacks. A CRT that has seen a lot of use can develop darker image output, but that's about it. If the black levels aren't right it is a case of bad settings; many settings affect each other, especially in the more advanced modes. For example, you can have bad grey values on a CRT, weird bright colorisation, a grey image with a red, green or blue tint, or overall far too bright colors. Make sure your individual RGB channel gains haven't been changed in the OSD and that the individual RGB gamma is set correctly. Also check that you didn't adjust settings in the ATI/NVIDIA panel ALONG with settings in the OSD or profile; all of this could cause such tints. Test it with the QuickGamma program. I hope that is not the case for you, or anyone, as most people then think their CRT is dying. It usually is NOT. CRTs are sensitive to environmental factors like electromagnetism and mistreatment during transport. That is why I use QuickGamma: it lets you change the individual RGB channels in case something looks odd color-wise. Always make sure your monitor is set to its original D65 (6500K) color setting IN ITS OSD MENU before fiddling with gamma, contrast, brightness etc.

----------->1.4 General advice: DO NOT PANIC if something looks WAY off. It's the OSD settings, or bad settings for gamma, the color profile and the like in Windows.

----------->1.5 I must warn you that the more used a monitor is, the harder it gets to achieve perfect black levels. Aged CRT monitors have much darker image output than new ones.


2. GUIDE ON CALIBRATING: BEFORE YOU START CALIBRATING:

--------> 2.1 Your monitor should be operated in subdued light; strong direct light should not reach the screen.

Dark areas of the screen should appear dark to the eye. I work in a semi-darkened room with a lamp to the left of my screen (positioned so that no direct light reaches the screen). Total darkness is unnecessary! It can even be a worse idea than a semi-darkened room.

-------->2.2 Set your monitor's color temperature (white point) to 6500K on the menu button of the monitor under colors.

White point calibration is a point of some discussion. In theory, calibrating your monitor to 5000 Kelvin would achieve a match with a light booth containing 5000 K lights. In practice, however, due to instrument inaccuracies or the differences in human perception between monitor-produced white and paper-reflected light, the calibration may not match. Many monitors ship from the manufacturer at a 9300 Kelvin white point. Manufacturers prefer this bluer version of white because it is bright, but the blue gun is driven at a high level and this can drastically shorten the life of the monitor. Calibrating your monitor to 6500 or 5000 Kelvin will reduce the wear and tear on your monitor and bring it into more standard viewing conditions. So I recommend setting the white point to 6500K in your monitor's OSD, or trying the custom values I list below. The FW900's white point options are 5000K, 6500K, and 9300K: 5000K will look dark and reddish and 9300K very blueish and bright. Just go for 6500K if you game a lot.
Note the confusing terminology: artists call higher color temperatures (bluer) "cooler" and lower color temperatures (yellower) "warmer". The huge variety of available hardware can make setting color temperature confusing. You may have the option of setting color temperature on the monitor (preferred) or with video card or monitor calibration software. Do not set it in both; this may result in an overcorrection and your monitor will appear dim and yellow. Software settings work correctly if the monitor is uncorrected, i.e., at about 9000-9300K.

Another note: I don't advise using the preset values the sRGB white point setting offers in the OSD of a FW900. It's always better to make sure those settings are correct yourself. Thanks chris2000 for reminding me of it.

------------->2.3 Your display adaptor software should be set to 24 or 32 bit color (True Color). To see the setting, right-click on the Windows wallpaper (the background outside any open windows), then click on Properties, Settings.

------------>2.4 Let your monitor WARM UP AT LEAST HALF AN HOUR BEFORE CALIBRATING!!!!! I can't stress enough how important this is.

------------->2.5 Also, DO A COLOR RESTORE in your CRT's OSD under the color options after setting the monitor to D65 (6500K) color temperature, before calibrating further. These last two points are very important.

------------->2.6 Install a color profile AND driver that is perfect for your monitor and windows version.

(This post has links to modded drivers for windows 7 64 bit for the fw900. Might be useful to people...

http://hardforum.com/showpost.php?p=1038424294&postcount=8063 )

This means that you have to use a color profile matched to your white point of 6500K, 5000K or 9300K. I recommend 6500K (see the explanation above). You can use the profiles that are likely in the ZIP of your modded FW900 Windows 7 driver (they might be called D65, D50 or D93).

choice 1: To create your own personal color profile, use this application. I used it and it works wonders, same as QuickGamma which I mention further on in this guide. The reason I recommend this program is that it makes the color profile STICK IN EACH GAME. Yes, not all games use YOUR color profile, unfortunately.

http://www.simtel.net/product/download/id/61876

Open MCW and put a check next to "Load at Windows startup" and "Persistent profile". Never worry about calibrating another game again; your color profile will stick!!! Yay! :D

choice 2: install a color profile manually in Windows 7. Details below.

Save the attached profiles somewhere on your computer (official profiles, on Windows 7 systems, are stored at ...Windows\system32\spool\drivers\color).
Go to Display, Screen Resolution.
Select your monitor and click "Advanced Settings"
Select "Color Management" tab
Click "Color Management . . ." button
Click "Use my settings for this device" check box
Click "Add" under the "Profiles associated with this device" box.
Select "Browse" and select the color profiles you wish to add.
With more than one profile in the box - simply click "Set as Default Profile" to make that one default.
That will set the color profile to use in color-managed programs, but it won't load the profile's gamma correction. You also have to go to the Advanced tab and enable the "Use Windows display calibration" option. The option may be grayed out in the initial screen, so you have to click "Change system defaults..." and go to the Advanced tab there to enable it.
This is very important: why calibrate if the result does not get used by Windows? They should make this more obvious in the Windows 7 calibration tool... Oh well. Windows!! :rolleyes:



So to summarize: use the Windows 7 calibration tool, QuickGamma, Nokia Monitor Test etc. AFTER doing these steps:
1. letting the monitor warm up for half an hour or more
2. setting D65 (6500K) color temperature in your monitor's OSD (the menu button on the monitor), then installing the D65 color profile in Windows that you probably got along with the FW900 driver, or using the tool I mentioned above to make your own color profile
3. doing a color restore via the option in the monitor's OSD under color/expert, after half an hour. If you don't wait half an hour the option won't be available.

Now you are ready for the real work.

3. GUIDE ON CALIBRATING: WHEN CALIBRATING FOR GAMMA

----------> 3.1 introduction

For people who have never heard about gamma correction: it's something that mainly needs manual attention on a CRT, which is why I explain it extensively here. Usually an LCD needs little gamma correction, though testing gamma on an LCD CAN still be useful. You can think of gamma as a "layer" that gets drawn over your image output, a shader effect so to speak.

Basically, gamma is the relationship between the brightness of a pixel as it appears on the screen, and the numerical value of that pixel.
You probably already know that a pixel can have any 'value' of Red, Green, and Blue between 0 and 255, and you would therefore think that a pixel value of 127 would appear as half of the maximum possible brightness, and that a value of 64 would represent one-quarter brightness, and so on. Well, that's just not the case, I'm afraid.
Cathode-ray tubes, such as the screen you're probably reading this on at the moment (why would you otherwise :p), have a peculiar relationship between the voltage applied to them, and the amount of light emitted. It isn't linear, and in fact it follows what's called by mathematicians and other geeks, a 'power law' (a number raised to a power). The numerical value of that power is what we call the gamma of the monitor or system. For a CRT, the gamma that relates brightness to voltage is usually in the range 2.35 to 2.55; video look-up tables in computers usually adjust the system gamma to the range 1.8 to 2.2, which is in the region that makes a uniform encoding difference give approximately uniform perceptual brightness difference.

What does this all mean in plain English? For simplicity, consider the example of a monochrome CRT. Pure black (0.0) and pure white (1.0) are the only shades that are unaffected by gamma. In this case, when a video signal of 0.5 (representing mid-gray) is fed to the display, the intensity or brightness is about 0.22 (a dark gray). The same math applies to the shading the monitor gives to colors.
So a gamma correction is needed to get the best possible output for colors and white/black levels. Read more about gamma at the link below.

http://www.cgsd.com/papers/gamma.html
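
You can reproduce the mid-gray example from the paragraph above in a couple of lines of Python (the 0.22 figure corresponds to a gamma of about 2.2):

[code]
# The mid-gray example from the paragraph above: luminance = signal ** gamma.
# With a gamma of 2.2, a 0.5 input lands near 0.22; the same power law is
# applied per channel for colors.

for gamma in (1.0, 1.8, 2.2, 2.5):
    print(f"gamma {gamma}: input 0.5 -> luminance {0.5 ** gamma:.3f}")
[/code]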

Why do you have to set gamma for white/black AND for the red/green/blue channels (RGB)? Do your digital images look too dark when you print them? Or maybe the colors look washed out when you upload the images to a photo sharing site and view them in a web browser? If so, you're probably suffering from a breakdown in the color management "ecosystem", in particular in its handling of the non-linear transformation of pixel values called gamma correction. The greatest effect of gamma on the representation of colors is a change in the overall brightness of all colors. Look at the image below. It's all about how your CRT displays light/black and how it displays colours ACCURATELY, the way the source file was MADE. Wrong values give a different image on your screen than the one its creator intended. The same goes for games. In theory you could gain an advantage in online FPS games with values that illuminate all the shadowy hiding spots... but overall it damages the immersion of the games. Imagine playing a stealth game in an over-bright corridor that is supposed to be dark. Bweh :mad:



On the left is the image as it might appear on an uncorrected monitor. The centre image should look right on a monitor with a gamma of around 1.8, and the right-hand image is how a system with a linear response (gamma of 1.0) might display the image. I think the difference is obvious: the left image doesn't look as if it was taken in broad daylight.
On a CRT I aim for a gamma correction of 2.5. "Why not 2.2?", you might ask, "it looks fine to me!" Well, colours usually do NOT get displayed right at gamma 2.2 on my unit. The reason? A change in gamma can also affect the hue of a particular color. Most web content is made for sRGB (close to D65 or 6500K), and in my experience it only displays right here at a correction of 2.5.
I use a gamma correction of 2.5 on MY FW900 and all tests show it is MY best value for gaming and for white/black levels in-game. Note that I use custom values for color gain (everything at 100) and color bias (all close to 50, or blue at 73 for some things). This is mainly because my FW900 has seen a lot of use and has much "darker" image output than when it was new. I had it calibrated and this came out as the best result. But beauty is in the eye of the beholder, and you can choose gamma 2.2 or 1.8 instead and KEEP the 6500K color gain and bias settings in the OSD of your monitor, if it suits your eyes better or if you got a practically unused FW900 (one can dream)...

Gamma on a monitor is similar to dot gain on press: it describes the midtones of your images. Monitors can be calibrated to a gamma of 1.8, 2.2, 2.5, or sometimes anything in between. 1.8 has traditionally been the setting for Macs, as it is similar to dot gain on paper; 2.2 is the typical gamma setting for PC monitors and televisions. This gamma difference is the main reason images appear different between Macs and PCs on the web. If your work is headed for the web, 2.2 is probably the best choice; if it is headed for print, 1.8 or 2.2 are both good choices. A CRT that is mainly used for games may need a different gamma correction value than a CRT used for image editing. Since people hardly use CRTs for image editing anymore, set your gamma correction "for your eyes only". What you think is best IS best for the CRT.
Experimentation is key. The example below shows what I mean.




Correction = 1.0 || Correction = 2.5

Source-->Output||Source-->Output

R 80%-->R 57%||R 80%-->R 80%
G 20%-->G ~0%||G 20%-->G 20%
B 20%-->B ~0%||B 20%--> B 20%

What does the above mean? A change in gamma can also affect the hue of a particular color. At anything below a correction of 2.5 (on a tube with native gamma 2.5), the output colours change: a source colour of 80% red, 20% green and 20% blue will look darker and more saturated toward red at a correction of 2.2 and below. Either you like this effect that happens when lowering the gamma correction below 2.5 or 2.2, or you don't; adapt as you wish. Granted, things look more saturated, and some people like that for GAMING... but if you use a CRT for photo editing you MUST set gamma right or you can't see the correct colours and light/black values of the original image! I recommend setting color saturation in the NVIDIA/ATI panel if you want your colors to be more vibrant and saturated, at the cost of immersion in many games. But be warned that this setting affects everything, even DVD watching. I advise against it, unless you can make separate saturation profiles for each game. A bit like having different shader effects for each game.
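
The table can be reproduced numerically. A small Python sketch, assuming (as the table does) a tube with native gamma 2.5 and a software correction c applied as value^(1/c) before the tube:

[code]
# Reproducing the correction table above, assuming a tube with native gamma 2.5
# and a software correction c applied as value ** (1/c) before the tube, so
# displayed = source ** (2.5 / c).  With c = 2.5 the colour comes out as
# authored; with no correction (c = 1.0) the 80/20/20 red turns darker and
# more saturated (the 20% channels fall to roughly 2%).

NATIVE = 2.5
source = {"R": 0.80, "G": 0.20, "B": 0.20}

for c in (1.0, 2.2, 2.5):
    out = {ch: v ** (NATIVE / c) for ch, v in source.items()}
    shown = ", ".join(f"{ch} {v:.0%}" for ch, v in out.items())
    print(f"correction {c}: {shown}")
[/code]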

------------->3.2 Conclusion?

If gamma is not set right, your screen MORPHS the way you see everything that appears on it. So: bad idea. But if you think a gamma correction of 2.5 looks godawful, then change it to 1.8 and see if that other extreme suits you better. Gamma should NEVER be corrected to a value outside the 1.8 to 2.5 range.
GAMMA affects black and white levels BUT ALSO color representation, so you MUST set it before you set BRIGHTNESS. Most people are happy with an adjustment to 2.2 on all channels, but the best settings vary from monitor to monitor. The more use a monitor has seen, the darker it gets, so some people need more or less gamma. The same goes for the color channels: they can get knocked out of balance, during transport for example. If you have a specific color tint on grey images, adjust the individual color channels in the Windows 7 calibrator or QuickGamma. See below.


------------->3.3 I recommend using this gamma tool, QUICKGAMMA. I use it and it works wonders. It has detailed help and it's very easy to use.


http://www.normankoren.com/QuickGammaV2EN.exe

You can use this gamma tool alongside the gamma image that the Windows 7 color calibration tool offers. QuickGamma makes sure gamma stays the same at startup. It is a lot easier than posting several gamma calibration images here. Try it out, you won't be disappointed. It has detailed help on how to use it.

NOTE: if you start the Windows calibration AFTER you have altered gamma in ANOTHER way (like QuickGamma, or in the NVIDIA/ATI panel), gamma reverts to 2.2 when the calibration starts. Be aware of that!!
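
For reference, the video LUT that QuickGamma and the NVIDIA/ATI panels manipulate can also be set directly on Windows through GDI's SetDeviceGammaRamp. This is only a sketch of the mechanism (I don't know what QuickGamma does internally, and some drivers clamp or reject unusual ramps):

[code]
# Minimal ctypes sketch: build a gamma ramp for a given correction value and
# hand it to the display driver via gdi32's SetDeviceGammaRamp.
import ctypes

user32 = ctypes.windll.user32
gdi32 = ctypes.windll.gdi32
user32.GetDC.restype = ctypes.c_void_p                       # HDC handle
user32.ReleaseDC.argtypes = [ctypes.c_void_p, ctypes.c_void_p]
gdi32.SetDeviceGammaRamp.argtypes = [ctypes.c_void_p, ctypes.c_void_p]

def apply_gamma_correction(correction=1.19):
    # WORD[3][256] ramp: one 16-bit curve per channel (R, G, B)
    ramp = ((ctypes.c_ushort * 256) * 3)()
    for i in range(256):
        value = min(65535, int(65535 * (i / 255.0) ** (1.0 / correction) + 0.5))
        ramp[0][i] = ramp[1][i] = ramp[2][i] = value          # same curve for all channels
    hdc = user32.GetDC(None)                                  # device context of the whole screen
    ok = gdi32.SetDeviceGammaRamp(hdc, ctypes.byref(ramp))
    user32.ReleaseDC(None, hdc)
    return bool(ok)

if __name__ == "__main__":
    print("gamma ramp applied:", apply_gamma_correction(1.19))
[/code]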


----------> 3.4 To start Display Color Calibration in windows 7:

If you don't trust a program, use windows 7 calibration tool or look at these images below.

1. Open Display Color Calibration by clicking the Start button, and then clicking Control Panel. In the search box, type "calibrate display", and then click "Calibrate display color". If you're prompted for an administrator password or confirmation, type the password or provide confirmation.

2. In Display Color Calibration, click Next to continue.

To read up on how this standard Windows calibrator works, go here:

http://windows.microsoft.com/is-IS/windows7/Calibrate-your-display

NOTE: gamma reverts back to 2.2 each time you start the Windows calibration. Be aware of this!!!! Make sure the changes are persistent, i.e. check them several times across restarts. If Windows reverts them, make sure the gamma changes actually get saved; see point 2.6, choice 2, to resolve this. I enabled test mode when I hooked the FW900 to my Windows 7 desktop so I could use modified drivers and resolutions/refresh rates for it.


-----------> 3.5 Explanation on how to use gamma "calibrating" web images

Gamma is estimated by locating the position where the average luminance across the gamma pattern is constant. The corresponding gamma is shown on the left. You should be far enough from your monitor that the line pattern is not clearly visible. In plain English: let your eyes look a bit "beyond" the image. Look for the setting where everything blurs and you can't discern a change in grey tint between the grey background and the middle vertical bar with horizontal greyish lines. Look at the image below to see what I mean. Adjust your gamma slider so that the middle vertical bar with lines has neither a whitish "bleed" nor a dark "smear". I know it's hard to see; just imagine you lost your glasses and have to peer at text to be able to read it. Relax the focus of your eyes :) Unless you have a calibrator you'll have to trust your eyes: white and gray images (where R = G = B) should appear tonally neutral, i.e. they should have no visible tint. If gamma for the three color channels (R, G, and B) is inconsistent, you'll notice color variation across the calibration image (it should be uniform neutral gray).

Use these to set gamma: http://click.project-kb.com/?url=http://www.lcdreviewz.com/graphics/ColorGammaGradient.png&id=1 and http://click.project-kb.com/?url=ht...t/images/gammapyramide-rgb-flicker05.gif&id=1
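
For the curious, the math behind these striped charts: alternating full-black and full-white lines average out, from a distance, to 50% of maximum luminance, so the solid patch of pixel value v that visually matches them satisfies (v/255)^gamma = 0.5. A small Python sketch (the patch values below are illustrative, not taken from any specific chart):

[code]
# Why the striped gamma charts work: the stripes average to 50% of maximum
# luminance, so a matching solid patch of pixel value v implies
# (v/255) ** gamma = 0.5, i.e. gamma = ln(0.5) / ln(v/255).
import math

def gamma_for_matching_patch(pixel_value):
    return math.log(0.5) / math.log(pixel_value / 255.0)

for v in (193, 186, 174):   # illustrative patch values near gamma 2.5, 2.2, 1.8
    print(f"patch value {v} matches the stripes at gamma ~ {gamma_for_matching_patch(v):.2f}")
[/code]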



4. GUIDE ON CALIBRATING: WHEN CALIBRATING FOR BRIGHTNESS and contrast

Look at the image above, the one on the left. Your monitor's brightness control (which should really be called black level) can be adjusted using the mostly black pattern on the right side of the chart. This pattern contains two dark gray vertical bars, A and B, which increase in luminance with increasing gamma. (If you can't see them, your black level is way too low.) This is why you MUST set gamma before brightness/contrast in your OSD: luminance increases if you set gamma higher, so if you set gamma after brightness/contrast, your brightness values won't be good for pure blacks and whites. The left bar (A) should be just above the threshold of visibility opposite your chosen gamma (anything between 1.8 and 2.5); it should be invisible where gamma is lower by about 0.3. The right bar (B) should be distinctly visible: brighter than (A), but still very dark.

Once again, these images are TOO SMALL to do this properly; I just posted them to give you the idea. Use the Windows 7 calibration, another calibrator, or the images linked below for a much better result.

--------------> choice 1: I refer to this site; the images at these links are quite large and good, for color gamma and for brightness/contrast.

http://click.project-kb.com/?url=ht...t/images/gammapyramide-rgb-flicker05.gif&id=1 and http://click.project-kb.com/?url=http://www.lcdreviewz.com/graphics/ColorGammaGradient.png&id=1


-------------->choice 2: or take a look at this image here.

crtmonitorcalibrationch.png



Here's how this chart works: Simply use your monitor adjustments, so that the following things happen:

1. The middle bars, or Zone Steps, indicate the nine step transition from screen black (not TRUE BLACK) to pure white -1 (NOT SCREEN WHITE). The darkest bar or "I" should appear as black as your screen can go. The lightest bar, or "X" should appear SLIGHTLY LESS WHITE THAN THE SURROUNDING WHITE BACKGROUND. If the two appear completely the same, then there is a problem with your monitor calibration.

2. The lower left hand box is the black calibration box. You should adjust your monitor so that the gradient within it can be seen but only enough so that it appears to vanish into the surrounding blacks between 1/3 and 1/2 from the left to the middle of the box.

3. The lower right hand box is the white calibration box. You should adjust your monitor so that there is a difference between the two inner boxes and the outer one, but only enough so that they do not blend completely together. The middle box is SCREEN WHITE.
 
I remember when those were new. I drooled, wanting one badly. I used to be a CRT aficionado; the black levels were always better than on the new LCDs and projectors of the day. I still have my Sony KV-30HS420. Specs:

542162e89da043b127c53110.L.jpg
 
Pretty much every review and comparison of the BEST CRT HDTV picture you could buy was won by the Sony KD34XBR960 (never mind that it was a 220 lb pig :eek:).

Specs here:

http://reviews.cnet.com/direct-view-tvs-crt/sony-kd-34xbr960/4507-6481_7-30787600.html

It is a 34-inch digital CRT TV capable of 1080i.

This puppy had everything too: three FireWire inputs, a Memory Stick slot, HDMI, ATSC/NTSC/QAM tuners, DRC, CableCARD, and the best tube Sony ever made: the Super Fine Pitch Trinitron.

I STILL use it alongside my fw900.

I bet some wiseguy might show me a digital flatscreen (worth a couple of grand :D) that CAN display blacks as well as this CRT TV... but then I just laugh at him and say I got my 34-inch HD CRT TV second-hand, refurbished, for 300 dollars, and that it will probably keep working until 2030, as opposed to his shiny flatscreen.

They usually shut up, or roll their eyes and leave me to my nutty fascination with CRT technology, saying it's not supported anymore and you get no warranty, blah blah. Quite untrue: Sony still has a repair service for these AND for the FW900. But you usually have to pay, since the warranty has (indeed) expired.

One more point I want to make: you can use an HDFury to make PS3 games look a LOT better on a FW900 or any CRT TV. You can also use an HDFury to display digital video or Blu-ray player content on a FW900. From their website, http://www.hdfury.com:

3Dfury allows you to view 3D content from your PS3, Blu-ray player, satellite/cable or internet on all standard 50/60Hz or 100/120Hz HD displays with either digital or analog inputs, or both at the same time!
If your HDTV or projector can play regular non-3D games and Blu-ray movies, then the 3Dfury will work for you! It turns your existing HDTV or projector into a 3D home theater!
If your HDTV or projector cannot play HDMI content because it lacks an HDMI input, the 3Dfury can still turn it into a 3D home theater, thanks to an embedded HDMI converter based on HDfury technology.
Upgrade to 3D and HDMI at a fraction of the cost of buying a new display!

The 3Dfury supports every 3D signal type in existence today, including frame-packing, side-by-side and top-bottom.
We are CRT lovers in this thread after all, right? The reason I still use it is that I simply CANNOT get used to the difference in black level that flatscreens offer.

The HDFury can't be used for computer games, unfortunately. It simply cannot pass a 400 MHz signal; it can't even handle 1920x1200@85Hz, which is the FW900's native resolution. It won't give a better in-game image if you put an HDFury between a desktop computer and the FW900. But boy, on a CRT TV connected to a PS3 it really shines...

A FW900 can also be used as a TV for the PlayStation without an HDFury: you can run a component cable from the PlayStation to the FW900 as one input source, and the VGA cable from the desktop to the FW900 as the other. But the image from the PS3 will be a lot worse.
Then you can use the "select input" button to switch between the two. Quite handy.
 
Pretty much every review and comparison of the BEST CRT HDTV picture you could buy was won by the Sony KD34XBR960 (never mind that it was a 220 lb pig :eek:).

I have the 34XBR960N version, which comes from the factory without the antiglare coating (a slightly brighter picture). Apparently they're pretty rare, as Sony only made them towards the end of its production run.

I didn't realize I was getting an N version when I ordered it online (nor did I know anything about it); needless to say, the picture quality on these CRTs is pretty spectacular (N version or not).

The only downside is that the geometry / convergence / corner-to-corner focus isn't in the same league as a PC CRT such as the FW900 (you can fix some of it in the service menu), but these issues are much less of a problem on an HDTV used mainly for movies or console gaming.

Crank_Bluray_1080i.jpg
 
crtfw900.jpg




1. CLARIFICATION ABOUT COMMON MISCONCEPTIONS ABOUT CALIBRATING A CRT MONITOR

Calibrating a CRT is not as easy as modifying brightness/contrast on the monitor. That might seem obvious to many of us, but most people don't know it, since LCD panels don't have very good black levels and CRTs DO. Calibrating a CRT is usually a matter of getting the white and black levels right.

I want to clarify some things about how to properly calibrate a CRT.

In answer to chris2000: yes, you can set gamma with the Windows 7 calibrator.

First and foremost: ------------> Let your monitor WARM UP AT LEAST HALF AN HOUR BEFORE CALIBRATING!!!!! I can't stress enough how important this is. Also, DO A COLOR RESTORE in your CRT's OSD under the color options after setting the monitor to D65 (6500K) color temperature, before calibrating further.

I refer to my post here, where I explained in basic language how to calibrate a CRT.

http://hardforum.com/showpost.php?p=1039306386&postcount=8711




To start Display Color Calibration in windows 7:

1. Open Display Color Calibration by clicking the Start button, and then clicking Control Panel. In the search box, type "calibrate display", and then click "Calibrate display color". If you're prompted for an administrator password or confirmation, type the password or provide confirmation.

2. In Display Color Calibration, click Next to continue.

To read up on how this standard Windows calibrator works, go here:

http://windows.microsoft.com/is-IS/windows7/Calibrate-your-display

But I use a dedicated program for it called QuickGamma. Link below.


The Contrast control on CRT monitors and television sets is actually brightness (white level), and the Brightness control is black level. The nomenclature is confusing but deeply entrenched. For example, on old CRT televisions operating in typical viewing conditions, where high ambient light limits the visible dynamic range, the Contrast control affects the apparent contrast.
The native gamma of monitors -- the relationship between grid voltage and luminance -- is typically around 2.5, though it can vary considerably. This is well above any of the display standards, so you must be aware of gamma and correct it. A display gamma of 2.2 is the de facto standard for the Windows operating system and the Internet-standard sRGB color space, which also assumes a D65 (6500K) white point. Use the D65 (6500K) color profile that comes with the driver package for these monitors. They usually come in the driver ZIP files; I know most modders who adapted the driver to Windows 7 added them to their ZIP. To install a color profile file: right-click on the Windows wallpaper (the background outside any open windows), then click on Properties, Settings. Color profiles are usually loaded under a COLOR tab. It shouldn't be all that hard to find in Windows.

Typically, a properly prepared image for the web will be in sRGB, which is a color gamut suitable for display. Adobe RGB, among others, is used for images going to print.
So an FW900 should be configured in sRGB, as I advised, especially if you calibrate using web images. As you mentioned, chris, 2x brightness / 90 contrast in sRGB might appear dark at first, but your eyes adapt; the real goal is having real whites and real blacks, on websites and in games. BUT YOU MUST CALIBRATE BRIGHTNESS/CONTRAST AT THE SAME TIME AS GAMMA, because higher or lower gamma also means higher or lower apparent brightness. It's not technically true, but it looks that way at first glance. In my opinion, you can't set brightness and contrast right if you didn't calibrate gamma FIRST. To summarize: this is because luminance increases if you set gamma higher.

2. GUIDE ON CALIBRATING: BEFORE YOU START CALIBRATING:

Use the Windows 7 calibration tool, as I have mentioned so many times now, AFTER: 1. installing a color profile, 2. letting the monitor warm up for half an hour or more, 3. applying custom values for color gain and color bias if you wish, via your FW900's OSD under the color/expert tab. My custom values are listed in the post I linked below.


--------> Your monitor should be operated in subdued light; strong direct light should not reach the screen. Dark areas of the screen should appear dark to the eye. I work in a semi-darkened room with a lamp to the left of my screen (positioned so no direct light reaches the screen). Total darkness is unnecessary. Set your monitor's color temperature (white point) to 6500K, D65, or sRGB, which is equivalent to 6500K. This is preferable to setting it in video card or monitor calibration software. The FW900's white point selections are 5000K (D50), 6500K (D65), and 9300K.
Note the confusing terminology: artists call higher color temperatures (bluer) "cooler" and lower color temperatures (yellower) "warmer". The huge variety of available hardware can make setting color temperature confusing. You may have the option of setting color temperature on the monitor (preferred) or with video card or monitor calibration software. Do not set it in both; this may result in an overcorrection and your monitor will appear dim and yellow. Software settings work correctly if the monitor is uncorrected, i.e., at about 9000-9300K. Unless you have a calibrator you'll have to trust your eyes: white and gray images (where R = G = B) should appear tonally neutral, i.e. they should have no visible tint. If gamma for the three color channels (R, G, and B) is inconsistent, you'll notice color variation across the calibration image (it should be uniform neutral gray).
Your display adaptor software should be set to 24 or 32 bit color (True Color). To see the setting, right-click on the Windows wallpaper (the background outside any open windows), then click on Properties, Settings.

3. GUIDE ON CALIBRATING: WHEN CALIBRATING FOR GAMMA

I recommend using this gamma tool; I used it and it works wonders: http://www.normankoren.com/QuickGammaV2EN.exe

It is a lot easier than posting several gamma calibration images here. Try it out, you won't be disappointed. It will also prevent Windows from reverting gamma settings to default, which can be a pain in Windows 7!!!

If you don't trust it, use the Windows 7 calibration tool or look at the images below, and make sure the changes are persistent, i.e. check them several times across restarts. I don't know why Windows sometimes reverts them to default... damn Windows 7. I enabled test mode when I hooked the FW900 to my Windows 7 desktop so I could use modified drivers and resolutions/refresh rates for it, and after that the gamma changes didn't revert anymore either. Maybe it's better to adjust the gamma values in the ATI/NVIDIA panel; I haven't tried that yet.




Gamma is estimated by locating the position where the average luminance across the gamma pattern is constant. The corresponding gamma is shown on the left. You should be far enough from your monitor that the line pattern is not clearly visible. In plain English: let your eyes look a bit "beyond" the image. Look for the setting where everything blurs and you can't discern a change in grey tint between the grey background and the middle vertical bar with horizontal greyish lines. Look at the image above to see what I mean. Adjust your gamma slider so that the middle vertical bar with lines has neither a whitish "bleed" nor a dark "smear". I know it's hard to see; just imagine you lost your glasses and have to peer at text to be able to read it. Relax the focus of your eyes :)



4.1 choice 1 GUIDE ON CALIBRATING: WHEN CALIBRATING FOR BRIGHTNESS AND CONTRAST

Your monitor's brightness control (which should really be called black level) can be adjusted using the mostly black pattern on the right side of the chart. This pattern contains two dark gray vertical bars, A and B, which increase in luminance with increasing gamma. (If you can't see them, your black level is way too low.) This is why you MUST set gamma before brightness/contrast in your OSD: luminance increases if you set gamma higher, so if you set gamma after brightness/contrast, your brightness values won't be good for pure blacks and whites. The left bar (A) should be just above the threshold of visibility opposite your chosen gamma (2.2 or 1.8); it should be invisible where gamma is lower by about 0.3. The right bar (B) should be distinctly visible: brighter than (A), but still very dark.

Once again, these images are TOO SMALL to do this properly; I just posted them to give you the idea. Use the Windows 7 calibration, another calibrator, or the images linked in this thread for a much better overall result when calibrating gamma, brightness, contrast and color gamma.

4.2 choice 2 GUIDE ON CALIBRATING: TO CALIBRATE brightness/contrast for black and white levels with a specific image.

I refer to this post by ZeosPantera; the images in it are quite large and good, for color gamma and for brightness/contrast.

http://hardforum.com/showpost.php?p=1039228358&postcount=8589

or take a look at this image here.



Here's how this chart works: Simply use your monitor adjustments, so that the following things happen:

1. The middle bars, or Zone Steps, indicate the nine step transition from screen black (not TRUE BLACK) to pure white -1 (NOT SCREEN WHITE). The darkest bar or "I" should appear as black as your screen can go. The lightest bar, or "X" should appear SLIGHTLY LESS WHITE THAN THE SURROUNDING WHITE BACKGROUND. If the two appear completely the same, then there is a problem with your monitor calibration.

2. The lower left hand box is the black calibration box. You should adjust your monitor so that the gradient within it can be seen but only enough so that it appears to vanish into the surrounding blacks between 1/3 and 1/2 from the left to the middle of the box.

3. The lower right hand box is the white calibration box. You should adjust your monitor so that there is a difference between the two inner boxes and the outer one, but only enough so that they do not blend completely together. The middle box is SCREEN WHITE.


Apart from QuickGamma and the calibration web images, use the Windows 7 calibration tool, as I have mentioned so many times now, AFTER: 1. installing a color profile, 2. letting the monitor warm up for half an hour or more, 3. applying custom values for color gain and color bias if you wish.


Although these guidelines and procedures are very detailed and easy to follow, I want to stress that they are not the Sony factory calibration & white point balance adjustment via WinDAS/WinCAT that either Sony or we perform in the lab, which requires a laboratory-grade spectrophotometer (or a high-grade colorimeter & program) and pattern-generating equipment.

The outlined procedures will not achieve satisfactory image adjustment if any or all of the following conditions are present in your GDM-FW900:

a) Bad/faulty guns (evident if your CRT has strong color casts or tints that cannot be fixed via color restore function),
b) Out of spec G2 voltage setting (evident if your CRT has retrace lines and/or if the image is very bright even with the brightness adjustment set all the way down to zero),
c) HV faults of any kind (e.g. faulty FBTs),
d) H-K or G2 shorts,
e) ABL faults,
f) Landing and/or heavily magnetized CRTs that cannot be fixed via landing adjustments,
g) Low emission CRT.

Now, if your CRT is in good functional condition (meaning free of faults), then the guidelines and procedures provide you with a cost-effective, alternative way to adjust (not calibrate) your monitor without generating an accurate ICC profile, which can be generated ONLY with the use of a measuring instrument or device after a color calibration is performed.

Lastly, congratulations to ATWIX for taking the time in producing the guidelines & procedures...

Hope this helps...

Sincerely,

Unkle Vito!
 
I have my screen looking the way I want, but I can't get the picture to fit the screen correctly anymore, which is OK when I'm in the dark, but seeing a border around the screen is getting annoying.

I've played with all the geometry settings for quite some time, but I can't get every corner right; I always have one corner running off the image if the border is gone.
 
Another thanks for the detailed calibration information. I've had my FW900 for five years now, and it's still going strong *knocks on wood*. But my picture looks better than ever after really taking the time to calibrate as well as possible with my naked eye. I also removed the anti-glare coating recently.

The black levels are especially good now; inky black but with still vibrant colors.
 
@atwix

Thanks for your nice guide!

But if I follow your guide, I think I have a problem.

http://img820.imageshack.us/img820/128/crtmonitorcalibrationch.png

Here in the bottom black box, I can only see the 3 blocks if my brightness is over 42%, but then I don't have real black. My gamma is also right. But at my brightness of 29%, I can't see a single one of the 3 blocks. What is the problem here? How is that possible?

Also, in games there are some (only a few) areas which are so dark that gamma correction doesn't brighten them (like true black, which is also not changed by gamma correction), but in these areas there are details I can't see (like the 3 blocks in the black box in the linked image). And I can only see the 3 blocks at 42% brightness, but then black is not black anymore.
 
Like I said, setting black levels is the hardest part of adjusting a CRT's image output. The image above is a big bitmap converted to a small jpeg; it might be the conversion... I did warn you the images aren't good. I'd try different images, like the ones zeospantera posted; the guide has links to them too. The image, and especially the black box, is too small. But be warned, the difference between monitor black and the slightly brighter black can be VERY hard to notice. That is why I always do it with very BIG specialised images.

Or try doing it with a static scene IN A GAME. For instance, any FPS with dark corridors is perfect for setting brightness and contrast. After you think it's OK, move to another spot with more/less ambient light in-game and compare. I used a scene in-game where you could look out of a window into space while standing in a dark corridor. Such images give the best results.

What gamma levels did you choose? I went for 2.5 across the board and about 30 brightness and 90 contrast.

Each CRT is different, unfortunately... It also depends on your color profile, for a small part.

I retried setting gamma, and I think setting it in the software panel of ATI/Nvidia is the best idea for Windows 7. It keeps reverting to 2.19 sometimes, for no apparent reason.
 
Last edited:

In zeospantera's images I can see all the black levels. The last one is difficult, but I can just barely see it. I am also at 29% black level and 90% contrast. And what did you mean when you said you are on gamma 2.5? As I posted in the thread, the FW900 reports a default gamma of 2.5, but what is the main advantage versus 2.2? For me it is too dark with gamma 2.2 and your black level and contrast, so gamma 2.5 has to be really dark, or can you see everything? I think the main point of gamma correction is to get rid of gamma 2.5 and use gamma 2.2. If you say your natural gamma is 2.19, then you should pass all gamma tests WITHOUT gamma correction, right? Then maybe something is wrong, because the FW900 reports itself as gamma 2.5...

I am not using a color profile; I am running Linux. A color profile is possible, but I don't think that is the problem. I have adjusted everything to D65, and I also did image restoration. And what is the magic behind the brightness and contrast values from the sRGB mode in the OSD? What does that option suggest for your brightness and contrast? And how are they calculated, or what do they refer to? Maybe we can find out something like "if your suggested sRGB brightness value is under xx, your FW900 is at end of life / in very good condition", or vice versa. Or maybe I have damage that crushes blacks and loses black detail because some part of the monitor is dying or something?

I set my black level to 29% and contrast to 90% like the sRGB mode suggested. If your 30% black level and 90% contrast are also the values suggested by sRGB mode, then I think our FW900s should be in nearly the same condition? Because the sRGB mode suggestions refer to the different voltages and emissions, which depend on the condition of the monitor?
 
Look... you really make too much of an issue of this :) If you can see all the black levels in zeospantera's image, then it should be OK. I suggested sRGB because it is the best mode to browse the internet with, but it does not differ much from D65 6500K in terms of colors. sRGB IS a bit darker, however, and requires a different brightness setting than 6500K. And I use 2.5 gamma across the board, because 2.2 is too low. The native gamma of most CRTs is 2.5, not 2.2. The 2.2 setting shows "too low" in the Windows 7 calibrator image... 2.5 came out best, as expected. I set black levels and white levels, and thus brightness and contrast, with gamma 2.5.

Just forget about HOW it works and try setting it with an in-game image that has a lot of different blacks in it. I'll try to find some expert images in big format for setting black levels.

If a monitor still has washed-out black levels after adjusting... Generally the FD Trinitrons fail bright (G2 problem) or have HV failures (snap, crackle, pop); they don't seem to fail "dark". Many settings affect each other, especially in the more advanced modes. You must be certain that your individual RGB channel gains haven't been changed. I hope that is not the case for you, or anyone.
I must warn you that the more used a monitor is, the harder it gets to achieve perfect black levels, due to the problems listed above. The reason I use the quickgamma program is that you can adjust the different RGB channels with it.


Edit: I found a utility suited for Linux, chris. Take a look at

http://www.argyllcms.com/
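For what it's worth, a per-channel correction can also be pushed from a couple of lines on most plain X11 setups via the standard xgamma utility (just an untested sketch from me, and the three correction values are purely illustrative, not measured for any particular FW900):

Code:
import subprocess

# Illustrative per-channel gamma corrections (NOT measured values).
corrections = {"r": 1.14, "g": 1.10, "b": 1.13}

# xgamma ships with the X server utilities and accepts separate
# red/green/blue factors; this just shells out to it.
subprocess.run(
    ["xgamma",
     "-rgamma", str(corrections["r"]),
     "-ggamma", str(corrections["g"]),
     "-bgamma", str(corrections["b"])],
    check=True,
)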
 
Last edited:
Firstly, thanks for your answer :)

Yes, I said 2.5 is native for a CRT, but then you don't have to tune your gamma up to 2.5. And I think the Windows 7 calibrator gamma images are meant for reaching gamma 2.2. I think you are maybe also using 2.2 if you say 2.2 is too low? Most programs don't show the value you are actually at (that is impossible, I think); they only show what value your gamma correction has.

Yes, maybe I really do make too much of an issue of this ^^ But the whole time I have owned the monitor I have thought something is wrong with the dark images... It is not only that all the other monitors I had before were too bright (I get that effect on TFTs too); it is about crushed blacks... I don't think the in-game textures are meant to be displayed that dark... What does the sRGB mode in the OSD say for brightness and contrast? Just for comparison.

And if Windows 7 says "too low" with a gamma correction of 2.2, then that "too low" always refers to gamma 2.2, I think. It might just be that the gamma correction in this program, which fits most monitors, doesn't fit yours, and you have to set the gamma CORRECTION to 2.5 to reach a final gamma of 2.2? Because if you want to reach 2.5 gamma, then normally you should do nothing with gamma correction. If I don't use any gamma correction, I also have gamma 2.5.
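The arithmetic the two of you are circling around, as a minimal sketch (this assumes the common convention where a "correction" of c remaps each normalized value v to v^(1/c) on top of the monitor's native response; which number a given tool's slider actually exposes is exactly the open question here):

Code:
# Effective gamma = native_gamma / correction, under the convention that a
# correction of c remaps each normalized value v to v ** (1.0 / c).
NATIVE_GAMMA = 2.5   # what the FW900 reportedly has natively

def effective_gamma(correction: float) -> float:
    return NATIVE_GAMMA / correction

print(effective_gamma(1.0))    # no correction      -> 2.5
print(effective_gamma(1.14))   # ~1.14 correction   -> about 2.2
print(effective_gamma(2.5))    # a "2.5" correction -> roughly linear (1.0)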

And you are aware that gamma 2.5 is darker than 2.2, right? With only 30% brightness, your image should have much more crushed blacks than mine...

EDIT: I have read your edit. Thanks for the link to that application! Hm, what did you mean by HV failure? The only problem I have (see my posts before) is that after cold-starting the monitor the top and right edges are stretched past the borders, and only once the monitor has warmed up do those edges come back into range. Is that an HV problem?

Also, the problem is not that I can't achieve real blacks, I think; it's more that black is much more than black? I mean not the final 255 black... the 254 black. It should be possible to see a difference there, right? The only explanation could be that my G2 is too low. But like I said, it is not the final black level; it is the "range". It is too low to see a difference between nearly black and true black, if you understand what I mean. If I raise the brightness/black level, this "range" gets much better, but then my true black is nearly lost... Let's say at 35% brightness I can see the differences between the various black levels MUCH better, but THEN true black is not true black anymore. I don't know how to describe it... What did you mean by gains? I have only set the OSD to sRGB mode, which is D65 and the suggested brightness and contrast settings. But to get the right results with zeospantera's gamma test image (which also shows strong moire in the green and red parts), I have to set a different gamma value for each color, like green 1.10, red 1.19, blue 1.13. These are gamma CORRECTION values. With all the other images on the web, I only need to change all 3 color gammas at once. It is only zeospantera's gamma image that needs different values for different colors. I think one of these images is not constructed correctly... and that would have to be zeospantera's, because nearly ALL the other images don't need separate color gammas. Also, I have never seen any color cast.

But also, no one has tested both zeospantera's image and the flickering one and reported being able to pass BOTH with the SAME gamma setting at the same time.

EDIT: Only starting at 42% brightness do I get the same "scale" for black and white in these two pictures: http://www.imaging-resource.com/ARTS/MONCAL/zBlackCalCheck.jpg and http://www.imaging-resource.com/ARTS/MONCAL/zWhiteCalCheck.jpg. The results for white are very good, and I can see the last block's difference from true white without having to concentrate. But for black I have to get very close to the monitor and concentrate as hard as I can just to see a difference in the last row, and between the last black block and true black I can't see any difference at all. My gamma is right, and so are all the other settings. How is that possible? :(

The goal of gamma correction is that my eye can see the differences (range/scale) for black and for white with the same intensity. But that is not the case below 42% brightness for me.

Maybe that could be an effect of a bad cable or VGA-to-DVI adapter? ... I have bought the cable "VGA to DVI 1.5m cable Clicktronic", which needs no DVI adapter, has gold connectors, is shorter than my current cable, and is made with oxygen-free copper. It should be the best DVI-to-VGA cable sold in my country, or anywhere. But it is still being shipped; I don't have it yet. Maybe next week.

Also, under Linux I use "monica" for gamma correction, and it has a test image built in. If I get that 100% perfect for gamma 2.2 and THEN look at zeospantera's gamma image, it shows that the individual colors would be out of range. Therefore I would say that image is broken. No one has ever reported that it worked for them. That is nothing against zeospantera; maybe he grabbed it from somewhere and the original creator made it wrong.
 
Last edited:
Pretty much every review and comparison of the BEST CRT HDTV picture you could buy was won by the Sony KD34XBR960 (never mind that it was a 220 lb pig :eek:).

Specs here:

http://reviews.cnet.com/direct-view-tvs-crt/sony-kd-34xbr960/4507-6481_7-30787600.html

It is a 34-inch, 1080i-capable 3D DIGITAL CRT TV.

This puppy had everything too: 3 FireWire inputs, a Memory Stick slot, HDMI, ATSC/NTSC/QAM tuners, DRC, CableCARD, and the best tube Sony has ever made: the Super Fine Pitch Trinitron.

I have a Sony XBR960 too. They have an HDMI input, so the PS3 works fine. When I got my PS3 Move and later bought some gun attachments for it to play shooting-gallery type games, I realized the input lag and ghosting/trailing on my Samsung VA TV were horrible for it. The floating gun sight in shooting-gallery games would trail horribly, and my quick aim was off. I'd have to swoop the reticle around swimmingly to maintain accuracy, even with the TV in game mode. Once I hooked it up to my XBR960, the difference was night and day. One part of the game is a quick-draw duel where you lift the gun as fast as you can to shoot the opponent first; other parts require re-targeting as fast as you can. The VA LCD could not keep up at all and the game was unplayable. Rock Band timing also improved dramatically. Calibrating Rock Band (using its in-game app) on the VA TV helps enough to make it playable, but on the CRT the timing and accuracy are just "there". Also, no crazy ghosting, which was very annoying with the gun reticles at very high speed. I have very fast reflexes and my VA TV could not keep up.
The black levels were always better than those of the new LCDs and projectors of the day.
Although LCD panels, VA in particular, have come a long way in black depth, many that I've seen still lack a fine amount of detail in blacks. To get that detail, you usually have to use settings that lighten the scene's brightness/gamma/contrast. I keep three sets of settings on my TV for different light conditions in my room, but I also switch between them even in a dark room on some darker movies in order to see more detail in the blacks.
 
Last edited:
@chris 2000: regarding your question about HV...

The problem you get is high-voltage arcing / a whining sound, heard around the focus/G2 pot assembly or the picture-tube socket. White sparks can be seen on the screen together with the arcing sound. The intensity of the arcing can be adjusted to some extent with the focus pot. When the set is turned off, the intensity of the arcing decreases, and after a few seconds it's gone. It might be resolved by changing out the safety hold-down capacitors, the larger ones near the HOT and the flyback.

If these are open, you must replace them with the correct value, or something like that... I'm no expert :p

Regarding your concerns about real blacks... I did warn you that calibrating with web images is not the way it should be done!!

Beauty is in the eye of the beholder... Like you said, I stand corrected: it's gamma correction. I achieved the best results with a correction to 2.5. It depends on the age of the monitor... The more used the thing is, the more washed out the blacks, because a CRT gets darker with old age. Mine was used a lot before I bought it. But that does not always mean it can't still be calibrated decently!

MY settings work for me, and I got them from someone doing WinDAS and all that on this monitor, doing some stuff on the bottom-right side of the monitor when facing the back. In theory you can change a lot of video variables: the brightness can be decreased, with the result that the 360 true black level increases, and you can change the max contrast level to 130 or more...

It's like Uncle Vito said: you can only calibrate a CRT monitor decently THAT way! My guide is like fixing an airplane with iron wire; you will never get the best results without using the right hardware.

My guide is for basic adjustment, with SOFTWARE and OSD changes. It only tries to achieve the best possible result using those means. But each CRT monitor is different if you really want to calibrate it after a period of use. My settings won't work on any other FW900, since it all depends on the monitor itself, the use it has seen, and even the monitor's surroundings...

The only thing that will be "almost" the same on each CRT is the color settings.


Calibrating a CRT the "real" way is beyond 80% of the people that still use one, I bet. If you really want to know more, I can only refer you to Uncle Vito... he is the real expert.
I must admit I can't help you with real calibrating. Sorry. I'm just a CRT lover who compiled CRT knowledge (that is likely 10+ years old) to write the above guide...

If you REALLY want ANY information on CRTs, go to this link (you'll be busy quite a while reading all of it!):

http://arcadecontrols.com/files/Miscellaneous/crtfaq.htm

The amount of info in that text is quite a collection... I never found a better FAQ. I linked it in the guide.

Once again chris, if you really want better results than what you get adjusting the OSD with the images from zeospantera and the others, then just do a real calibration, or let someone do it for you. You won't be disappointed. If there is ONE major advantage CRT still has, it's the CAPABILITY of the best true blacks. ACHIEVING this is a whole other matter, unfortunately. Like I said, even the theory behind it goes well beyond 80+% of CRT users.
 
Last edited:
I would like to mention that I bought Black Ops 2 for the PC... it looks unbelievable on the FW900; I've never seen such colors on it before.
 
The images I made are pretty simple. The brightness/contrast one is self-explanatory, and the color gamma one is simply an expanded version of my old Paint Shop Pro's software gamma correction, where it is just a 255-red and 0-black checkerboard and 128-red bars. Same with green and blue.
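For anyone who wants that pattern bigger than the posted files, here is a rough Pillow sketch of the same construction for one channel (only an illustration I'm adding: Pillow is assumed to be installed, and the sizes and bar widths are arbitrary):

Code:
from PIL import Image

# One channel's gamma patch, roughly as described above: alternating vertical
# bars of a 255/0 checkerboard and of solid 128, drawn in the red channel.
W, H, BAR_W, CELL = 512, 256, 32, 2
img = Image.new("RGB", (W, H), (0, 0, 0))
px = img.load()
for y in range(H):
    for x in range(W):
        if (x // BAR_W) % 2 == 0:            # checkerboard bar
            on = ((x // CELL) + (y // CELL)) % 2 == 0
            value = 255 if on else 0
        else:                                # solid 128 bar
            value = 128
        px[x, y] = (value, 0, 0)             # swap channels for green/blue
img.save("gamma_patch_red.png")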

The other levels of gamma adjustment would require images like these (attached): ipZx3emc3NhHb.png, isJ1c2yQQNM5r.png, ibsvrcvfuVYx02.png
 
I edited the guide extensively: a FAQ at the start, links to some useful programs for gamma correction and custom color profiles, and a detailed explanation of why gamma correction is really needed on a CRT.

I also gave it a better layout. A lot of credit goes to Norman Koren; I used his calibration images and linked his quickgamma program.
 
I now have my DVI-to-VGA cable from Clicktronic; it's the best available in my country. The image quality is now maybe a little bit better. But I still have phosphor lag, or ringing, or whatever it is called...

If I have a bright object like the mouse cursor and move it fast, I can see a trail behind it. Now it cannot be the cable anymore. Vito and others said it should be the cable or the DVI-to-VGA adapter, but the problem still exists, unchanged.

Any ideas? Is my monitor at its end of life, or what?
Or is something broken, or is it a bad-caps-related problem?
For other Sony monitors I have read that the VGA connector on the monitor side tends to come loose... has anyone checked that? Or are there other people here with this "problem"?

Also, with the new cable the crushed blacks are less of a problem, even without gamma correction. But I have set it up now with the 3 pictures above :)
 
Phosphor lag is normal on all CRTs. You can't get rid of it.

I don't think you know what to expect from a CRT. They are not like other, newer displays. Each will act "funny" from time to time. Like this one (my SGI): it shifts the image to the left about 5 mm every once in a while, and I need to change resolutions back and forth once to fix it. That's just what happens.
 
Phosphor lag from a bad cable only affects static images. If it happens on moving images, I'd get a collection of different DVI/VGA adapters and try them all out; my bet is that with one of them it will vanish. I concur with lagranuer on that.

It worked for me once... I don't know why. Maybe it has to do with the material it's made of... I got rid of the trail by using a silver DVI/VGA adapter that came with my very old ATI 9800 Pro card. I found it in the graphics card box in my basement and said "what the heck, let's try this one." POOF, problem solved. :D Maybe try that, and if it works, buy an adapter made of very good material (silver?) along with a VGA-to-VGA cable to go with it.

Although I don't understand why a DVI/VGA adapter with a VGA-to-VGA cable would be BETTER than a DVI-to-VGA cable... But it worked for my old NEC CRT. I thought it was at the end of its lifespan... guess not.

As for collecting those adapters: any gamer or computer shop should have a boatload of used ones, since they came in every graphics card box for years.

Or try hooking a PS3 to your FW900 with a component cable and see if the image output has the same problem. If so, then it's just your monitor that has this problem due to old age. You can usually solve it by lowering the contrast, or by fiddling with gamma.
If all this fails, it's probably your CRT that is just worn out :( When you first turn it on, the picture should look normal in well under a minute. If it is dim, tinted, or blurry for more than a minute or two, the CRT is getting weak.


EDIT: I'm adding this for people who get a similar problem... Mouse trailing or window trailing is an entirely different thing.

It's not window trailing you are talking about, right? Does it happen when moving windows over each other on the desktop, so that they trail too?

If it is only the mouse cursor that trails, then it is the mouse signal (polling) rate, which is available in the mouse options. Turn it to 125 or so; the lower the better. Disable "Display Pointer Trails" too!

If it's windows on the desktop that trail when moved over each other: 1. Right-click on the desktop. 2. Select Properties. 3. Select the Desktop tab. 4. Select Customize Desktop. 5. Select the Web tab and uncheck any boxes that are checked. 6. Click OK. Try moving an open window on the desktop and see if it stopped trailing. Good luck.
 
Last edited:
Hello,

I have used the same silver VGA-to-DVI adapter from ATI; I had one from my old card too, but it didn't work! I had also read your post about it, but it did not work for me, sadly.

But if you say that you got rid of it that way, then the adapters could indeed be the problem. Then again, with a one-piece cable, without an adapter, the image should be at least as good as with the best adapter... And it is a very expensive, high-end cable.

And this is my problem: one of you, zeospantera, says it is normal, you say it is not, and Vito also says it is not normal and that he was able to fix it with a good cable.

Also, I had a CRT before the FW900; I have never owned a TFT, so I really should know the CRT feeling. I had a crappy Philips 19" 109E5 which was too bright out of the factory. I never knew about lowering G2 back then, but maybe that was also the problem there. The fact is that I never had this "phosphor lag" on that or any other CRT. It is like when you play, let's say, Quake 3 and look around: you see a trail behind bright objects, there for a very short time, and it really sucks.

Also, if you start the EIZO monitor test, there is a test where white blocks fall from the top at different speeds, and the trails are visible only on the 2 fastest blocks. If this test exists, then maybe it really is normal. But Vito said it is not, and you got rid of it. So what is true? ... It is crazy that only now someone is looking into this :D

http://www.meintrendyhandy.de/shop/clicktronic-dvi-i-vga-49103p.html

It is German, sorry. It has 24 gold-plated contacts, triple shielding, oxygen-free copper and 2 ferrites. It should be the top of the top. It costs 30€ or more, and I got it for 10€ because it was end of line :)

It looks very, very high end, with completely solid connectors machined from a single block of steel or something.
 
I'll be blunt now: plug the CRT into another computer. IF the problem persists, it's the CRT that is close to dying, or it needs a "real" calibration. I added some explanation on this at the bottom of this post.

Regarding the fact that people say different things about phosphor lag being common on CRTs:

Well, we are all three right, in a way... I'll clarify.

Phosphor lag is only really common AND bad on plasma screens and CRT TVs. CRT monitors CAN get phosphor lag at the end of their lifespan, but usually it is caused by something else, like a bad cable/adapter or a video card that's crappy at producing analog signals. If your CRT has phosphor lag you cannot get rid of in any way (even after a hardware calibration), then it's likely your CRT itself is at the end of its lifespan, or it needs a hardware calibration.

So zeospantera, Uncle Vito and I are all right, in a way. I hope this is not too hard to understand :p

Regarding the solution with a silver/gold adapter and a VGA-to-VGA cable instead of a DVI-to-VGA cable:

What do you mean, the adapter won't work? It should, really. These adapters were used a lot more than DVI-D to VGA cables. A better cable and adapter can help, as it might reduce reflections inside the connection; consider this "less interference", in plain terms. I'd rather say it's the graphics card that's responsible for a crappy signal if all of this does not help... Read on below.

Regarding your gold-connector cable: gold has always been one of the best materials for cables, so I think you can safely use it :) I just HAD to ask, since there *ARE* a lot of crappy cables and adapters out there that aren't suited for the high resolutions of CRT monitors.

IF THE PROBLEM IS NOT YOUR CRT ITSELF (or the fault of either a bad DVI/VGA adapter or a bad VGA/VGA or DVI/VGA cable):

Then the problem is most likely the analog signal that comes from your graphics card. ATI has been known to offer a LOT better support for CRT analog signals in its newest video cards. Do you use an NVIDIA card? Then it might be that... The analog output of video cards is only tested internally by the manufacturers and you nearly never get to see the results, so you basically buy a pig in a poke. A friend of mine did some analog tests of graphics cards himself with an oscilloscope, and he said it is really scary how many top-notch cards offer crappy analog signals. So it might just be that your card performs poorly at its analog connectors, which is rather likely if it has a double DVI connector. I recommend trying toastyx's CRU program and raising the 1920x1200 refresh rate to something like 95 Hz; it might just be that your graphics card offers a better signal at that setting. Also try 1440x900 at 120 Hz or a similar setting. The trail may well vanish, or be less distinct, at higher refresh rates.

--------------------------------> So, I'd try a lot of different settings to see if the visuals improve. It might be possible to lessen the trails, if not remove them 100%.

1. Lower the resolution and go with a higher refresh rate. If that solves the problem, then get toastyx's CRU program and increase the refresh rate of your 1920x1200 resolution to 95 or even 97 Hz. That is about the max it can do before you hit the 400 MHz signal bandwidth limit that recent video cards have on their RAMDAC (the part that converts the video signal to analog); see the rough sketch after this list. IF THIS WORKS, then make a desktop resolution of 1920x1080 and push the refresh rate even further. You will hardly see a difference between 1920x1200 and 1920x1080. You can even settle for a desktop resolution of 1440x900 and run games at a higher one... The possibilities are endless with toastyx's program.

GET IT HERE http://www.monitortests.com/forum/Thread-Custom-Resolution-Utility-CRU

2. Try setting custom values for your 6500K preset in the monitor's OSD. Turn the red, green and blue color gains all the way up to 100. Then go to the color bias and turn the red up to 55, the green down to 43, and the blue up to 74. This gives the best, richest, most vibrant picture possible on MY monitor. Granted, bias settings will differ from monitor to monitor, but at least try all gains at 100. You might get a better picture after you adapt the brightness/contrast to these new gain and bias settings; the values I quote here come from a "real" calibration of this FW900.

3. Try this too: set your convergence right for the resolution you use on the monitor. Basically, set it to the point where you start to see black bars coming in from the sides and the top and bottom of the screen; once you see them, set it so they just reach the edges of the screen and you can't see them anymore.

4. If it still does not help, plug the CRT into another computer. IF the problem persists, it's the CRT that is close to dying, or it needs a "real" calibration with hardware at a repair shop... :(
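As an aside on the 400 MHz reasoning in point 1, here is a back-of-the-envelope mode check (only a sketch of my own: the ~25%/~5% blanking factors are rough assumptions rather than real GTF/CVT timings, the ~121 kHz figure is the horizontal scan limit usually quoted for the FW900, and CRU itself will show the exact numbers for whatever modeline you build):

Code:
# Rough check: does a WxH@Hz mode plausibly fit under a 400 MHz RAMDAC
# pixel clock and the monitor's horizontal scan limit?
H_BLANK = 1.25          # assumed ~25% horizontal blanking (not exact GTF/CVT)
V_BLANK = 1.05          # assumed ~5% vertical blanking
MAX_PCLK_MHZ = 400.0    # typical RAMDAC limit mentioned above
MAX_HSCAN_KHZ = 121.0   # horizontal scan limit usually quoted for the FW900

def check_mode(width, height, refresh_hz):
    h_total = width * H_BLANK
    v_total = height * V_BLANK
    hscan_khz = v_total * refresh_hz / 1e3
    pclk_mhz = h_total * v_total * refresh_hz / 1e6
    return hscan_khz, pclk_mhz

for w, h, hz in [(1920, 1200, 96), (1920, 1080, 100), (1440, 900, 120)]:
    hscan, pclk = check_mode(w, h, hz)
    fits = hscan <= MAX_HSCAN_KHZ and pclk <= MAX_PCLK_MHZ
    print(f"{w}x{h}@{hz} Hz -> {hscan:.1f} kHz, {pclk:.0f} MHz, fits={fits}")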

If it's the CRT itself that is the problem:

Then it is likely a problem with the persistence of the phosphors, i.e. the time a phosphor dot in a CRT remains illuminated after being energized. Long-persistence phosphors reduce flicker but generate ghost-like images that linger on screen for a fraction of a second. Sounds familiar? I think this is what you mean with your complaint, right? It happens a lot on CRT TVs, but a lot less on CRT monitors, because CRT monitors are made with different phosphors than CRT TVs.
You might be able to fix it with a real calibration like Uncle Vito does. CRT display persistence decay must be fast enough to prevent smearing of a moving image from one frame to the next. To measure the actual persistence of a CRT monitor, put a photocell (or a photodiode connected to an amplifier) on the face of your monitor and observe the resulting waveform on an oscilloscope.
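If anyone actually captures such a waveform, a quick way to put a number on the persistence is to fit an exponential decay to the samples exported from the scope (a sketch only, assuming NumPy/SciPy are installed and that you have time/voltage pairs in two arrays; the sample data below is made up):

Code:
import numpy as np
from scipy.optimize import curve_fit

# Made-up photodiode samples: time in milliseconds, voltage in volts.
t_ms = np.array([0.0, 0.2, 0.4, 0.6, 0.8, 1.0, 1.5, 2.0])
volts = np.array([1.00, 0.74, 0.55, 0.41, 0.30, 0.22, 0.10, 0.04])

def decay(t, amplitude, tau_ms):
    """Simple exponential decay model for phosphor persistence."""
    return amplitude * np.exp(-t / tau_ms)

(amplitude, tau_ms), _ = curve_fit(decay, t_ms, volts, p0=(1.0, 1.0))
print(f"fitted persistence time constant: about {tau_ms:.2f} ms")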

Hope this helps. I hope it isn't the CRT itself, all in all... Most people ditch their CRT when problems like this occur and don't take it to a repair shop for real hardware calibrating. If the conclusion is that the problem is caused by your CRT itself, it will be up to you whether or not to have it calibrated for real, and whether that is worth it.

I added a link to this post in the CRT calibration guide I posted a bit back, as I can imagine many CRTs will start showing these problems sooner or later :(
 
Last edited:
I asked a question a while back about why people think 2304x1440 or 2560x1600 looks better on this monitor, since the aperture grille can only fully resolve 1920x1200. I've now tried it out for a couple of weeks, and I have to say I'm convinced 2560x1600 looks a LOT better than 1920x1200 or even 2304x1440.

Really high resolutions like 2560x1600@68 Hz are possible on a Sony GDM-FW900... And what's really nice is that, since the aperture grille can only fully resolve 1920x1200, higher resolutions end up looking like they are anti-aliased even with AA off. It sucks for text (still readable if you need it, but it doesn't look pretty), but for games it's absolutely gorgeous and looks just as good as, if not better than, FSAA.

I just post this here to promote the use of toastyx's CRU program to unlock all refresh rates and resolutions on a FW900. Link in the post above.

Some people will get headaches from gaming at a 68 Hz refresh rate; others will not. 68 Hz is a big difference vs 60.
If you do get headaches, try gaming at a 16:9 resolution that's a bit lower; 2560x1440 will net you 70+ Hz.

Just try it out, people... If you can stand the 68 Hz: the image output looks better at 2560x1600 without AA than at 2304x1440 WITH FORCED AA ON.

The funny part is that you get better image output at the higher resolution, and I actually got better FPS at 2560x1600@68 Hz with no AA than at 2304x1440@80 Hz with forced AA, in any game.
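To sanity-check wide modes like these before typing them into CRU, the same back-of-the-envelope arithmetic from the sketch a few posts up can be reused (again only an illustration: the blanking factors and the ~121 kHz / 400 MHz limits are assumptions, and CRU shows the real numbers):

Code:
# Same rough assumptions as the earlier sketch: ~25% horizontal and ~5%
# vertical blanking, ~121 kHz scan limit, 400 MHz RAMDAC pixel clock limit.
H_BLANK, V_BLANK = 1.25, 1.05
MAX_HSCAN_KHZ, MAX_PCLK_MHZ = 121.0, 400.0

for w, h, hz in [(2560, 1600, 68), (2560, 1440, 70), (2304, 1440, 80)]:
    hscan_khz = h * V_BLANK * hz / 1e3
    pclk_mhz = w * H_BLANK * h * V_BLANK * hz / 1e6
    ok = hscan_khz <= MAX_HSCAN_KHZ and pclk_mhz <= MAX_PCLK_MHZ
    print(f"{w}x{h}@{hz} Hz -> {hscan_khz:.1f} kHz, {pclk_mhz:.0f} MHz, ok={ok}")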
 
Since the topic of cable/converter quality, as well as video card DAC output, is alive, I'll mention a growing concern I've had as a CRT enthusiast: it seems likely to me that the day will come when enthusiast-level video cards don't come with a DAC at all anymore. Maybe an external dedicated DAC will be the best solution in the coming years.

Problem is, I'm sure such DACs, if available, are aimed at the professional video editing/broadcasting market and would be super expensive. I don't consider the HDFury products viable, as they've had substantial limitations in my experience.

It would be super cool to construct an "open source" DIY kit that could take a digital video signal up to 120 Hz or more and output a high-quality analog RGB signal for our beloved CRT monitors. Then we could wash our hands of the inconsistencies in the DAC implementations on video cards.
 
Yeah, cards without a RAMDAC unit would suck :(

Although I doubt ATI would give up support for it.

I also looked up how a RAMDAC works and wondered if I could overclock it somehow.

I found out that it would make no difference for this CRT, since you hit the horizontal scan limit at almost the same point as the RAMDAC's 400 MHz limit. 2304x1440@80 Hz on a FW900 is only a small margin away from the maximum of both the horizontal scan limit AND the 400 MHz RAMDAC limit. So a higher RAMDAC clock will NOT allow a FW900 to do 2304x1440 at 85 or 120 Hz.

I also never got it confirmed anywhere whether the FW900 ITSELF has a 400 MHz limit on its signal INPUT.

Anyway... if they removed the RAMDAC, all beamers (projectors) and such would need to be digital too, right? I don't see that happening anytime soon. I think the analog VGA signal won't disappear anytime soon.
 
I have read somewhere that it is possible to overclock the RAMDAC; I have also read that you can get exotic resolutions and refresh rates with it on the FW900. Maybe the programs which let you set custom resolutions only do that...

Also, why should a CRT be at its end of life if it shows phosphor lag? As a CRT ages, the intensity and brightness of the phosphor decrease... the phosphor lag should then be much less visible. I have read that about plasma TVs too: when they are new, the phosphor lag is most noticeable. Also, the intensity of my phosphor looks very high; I don't think my CRT is at its end of life. So how did you come to the conclusion that it is an end-of-life symptom?

Also, I HAVE an ATI card :)

Also, it is not only when I move something. If you look at the EIZO or Nokia monitor test where 4 white blocks just sit on a black background, you can see a white glow around them.

It looks like the screen is just reflecting the light...

Also, why should a calibration correct this? You can only lower the brightness; then it is not very noticeable, but everything is very dark overall. And it has nothing to do with contrast, G2, color temperature or anything else. What would be calibrated to get rid of it?

And yes, an open-source RAMDAC would be awesome :)
There are open-source projects for building a mobile phone and a WLAN jammer, and there was a project to build a fully open-source graphics card with an open-source driver and full documentation, but the project went wrong and was sadly never finished :(

Matrox has the best image quality, but it's not enjoyable for gaming...

I also think the analog VGA signal will stay on graphics cards for a long time; most consumer TFT panels are still VGA-only...
 
Say you are in Doom 3, and there is a door with a light shining on it, and everything around it is pitch black. If you move the view quickly to one side, a shadow of the door will persist for a moment.

THIS is your problem, right? That's phosphor trailing.

If my suggestions above don't help, then maybe correcting the source signal INSIDE the CRT might help, or something like that. But all in all it is more a question for Uncle Vito; I thought this could be done with a hardware monitor calibration -- anyone care to elaborate on that? All I know is that I solved the matter with a silver DVI/VGA adapter combined with a VERY good VGA cable. You have a DVI-to-VGA cable, so I doubt it is the cable, especially since cables usually only cause problems in static images.

Please try the suggestions above first and let us know the result. There might be a setting that reduces the trails to the point where you almost can't see them...
 
Last edited:
OK, I will test your suggestions. And yes, then it is phosphor trailing. But what is phosphor lag, then? And what is it when I have a pitch-black background with a white dot on it, and the background around the dot is "glowing"?

And again the question about hardware calibration... what can a hardware calibration do against this? It is not a color problem or anything like that. A hardware calibrator can only set the right RGB values, brightness, etc., and as far as I know those values don't affect this problem. And Vito said he has only seen it once and could fix it with a better cable and A LOT of ferrite cores.

EDIT: I have googled for a video showing a phosphor trail, but I don't see yellow or green trails... the trail is only in the color of the bright object, mostly white.

Someone wrote that it is normal and that it is an atomic-level property of the phosphor. I have read that Sony used slow phosphors for the FW900. Maybe, like many other companies, they bought various kinds, and some of our FW900s use phosphors from other suppliers which are slower than others... but then it couldn't be possible that you, atwix, fixed it with the cable. Or your problem was a different one.
 
Last edited: