24" Widescreen CRT (FW900) from eBay arrived, comments.

I might have misunderstood you, but did you say it's possible to play games, for example at 30fps or 45fps, and get the same visual motion clarity as when the frame rate matches the monitor's vertical frequency (so no triple or double image), by using interlaced resolutions and double vsync instead?

No, you still get a double image in the form of combing artifacts, but interlacing allows you to use higher resolutions than progressive. Like on a 21" monitor, say you want to play a game at 2560x1920 and 45fps. At that resolution, you can't do 90hz progressive, but you can do 90hz interlaced. The resulting image ends up looking pretty similar, just a little more flickering on horizontal lines.
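
The arithmetic behind that can be sketched. The limiting factor on a CRT is the horizontal scan rate, and an interlaced field only draws half the lines per vertical sweep. A rough illustration (the ~5% blanking overhead and the ~130kHz tube limit are assumptions; real modelines vary):

```python
def horizontal_scan_khz(height, refresh_hz, interlaced=False, v_blank=1.05):
    """Approximate horizontal scan rate needed for a CRT mode.

    height:     visible vertical resolution
    refresh_hz: vertical refresh rate
    v_blank:    assumed vertical blanking overhead (~5% here)

    An interlaced field draws only half the lines per vertical sweep,
    so the required line rate is halved at the same refresh rate.
    """
    lines_per_sweep = height * v_blank / (2 if interlaced else 1)
    return lines_per_sweep * refresh_hz / 1000.0  # kHz

# 2560x1920 @ 90Hz on a 21" CRT (top-end tubes manage roughly 130kHz):
print(horizontal_scan_khz(1920, 90))                   # ~181 kHz -> out of range progressive
print(horizontal_scan_khz(1920, 90, interlaced=True))  # ~91 kHz  -> doable interlaced
```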

And if a game has adjustable motion blur, like recent Battlefield games, that really comes in handy when running low frame rates like 45 or 30fps. You can tweak it to where it appears to blend adjacent frames together.
 
Just joined the club!

I need some help getting the plastic shell on the back snapped in all the way; I couldn't find any info, pictures, or video on it. Somehow the bottom plastic HINGE-like joints separated on the left and right side of the back plastic panel. There's no way I can bend the plastic enough to get the ball joint under the cup joint; I don't think it was designed for this maneuver while the rest of the plastic is in place. I think the bottom hinges are supposed to go on FIRST, and then the top folds over and snaps in place. So my problem is the top is snapped in, and I think I need to snap it OUT so I can put the bottom hinges in first.

Anyone have pictures/video or advice for these top snaps (they're snaps, right)? They're not easy to unsnap and I don't want to damage the plastic any more than I have to.
Thanks!
 
There's a tricky schematic that illustrates how to do it in the service manual. I have problems following it, and every time I disassemble the chassis I forget how to do it. I always end up figuring it out, though. Something to pry open the latch might help.

This post with photos will probably help

https://hardforum.com/threads/24-wi...rived-comments.952788/page-59#post-1030076125

Thanks!
Looks like most of the time the back plastic panel stays attached to the main one and they're treated as one piece. I assume the back is designed to come on and off while the main shell is still attached, though; otherwise it's a strange design to split the shell into two pieces. Then again, maybe it was easier than manufacturing one big shell like almost every other smaller CRT.
 
I'm pretty sure you want WinDAS, and you'll need a USB to TTL cable and a colorimeter.

The guide here should apply to the W900 (pretty sure, but not 100% sure)

Sorry, for my model you really need Sony DAS, and you also need the Sony cable for it. On one side it has a mini-DIN and on the other RS232.

What I don't know is whether that's enough or whether you also need an interface unit from Sony. The service manual for the W900 only shows the cable, so maybe I'm lucky, but I still need the software. On several pages of this thread it's stated that it is DOS software and can be found on the Internet. Now I need a link or someone who can mail it to me.

Thanks much
 
Sorry, for my model you really need Sony DAS, and you also need the Sony cable for it. On one side it has a mini-DIN and on the other RS232.

What I don't know is whether that's enough or whether you also need an interface unit from Sony. The service manual for the W900 only shows the cable, so maybe I'm lucky, but I still need the software. On several pages of this thread it's stated that it is DOS software and can be found on the Internet. Now I need a link or someone who can mail it to me.

Thanks much

Just opened WinDAS and you might be right - I don't see a listing for the W900 in the model selection.

For Sony DAS, this came up, so you should probably search through here (assuming you haven't already).

As for the cable, I looked up mini-DIN, and it looks similar to a serial interface (though I know next to nothing about this). I think the USB to RS232 interface would work, no? I believe the driver for the cable takes care of "formatting" the data so that whatever comes out on the side that plugs into the monitor is in the correct format.
 
Here are the pics you requested. In one you can see the lines present; in the other they're not there. Those lines flash about twice per second. I tried lowering the refresh rate to 85Hz @ 1024x768 but the lines were still there, they just moved upwards slightly.

I never had any problems using a DVI-I to VGA adapter on my old computer with a GTX 980. I tend to think it's not an issue with my new computer or the 1060, but with the converter. I've gotten cables from Monoprice that were DOA; I hope that's the problem. I really want to keep using this CRT; all new LED monitors give me major eye strain.



So I got monoprice to replace the DP to VGA converter on account of those lines. Sadly, the replacement unit does the same thing with the flashing raster lines.

Does anyone know what could be causing this? Is it worth spending $25 for one of the "better" DP>VGA converters, or is this an issue with my PC/monitor that can't be fixed with any converter? When running my GTX 980 with this monitor I had no issues whatsoever, so I'm inclined to think the problem is the converter. I have had some crappy cables from Monoprice before....
 

Attachments: 20170419_163121.jpg, 20170419_163123.jpg
Definitely the converter. VCOM adapter is the way to go from what I understand. Goes a little higher than HD Fury Nano GX, and my GX has a little ringing in the signal, not sure if that's typical.
 
OMFG, interlacing empire strikes back...

Interlacing blurs still images, blurs moving images, blurs at full frame-rate and blurs at half refresh rate. Using it to push resolution past the dot size is nonsensical, if not completely absurd. It degrades CRT image quality even more than upscaling does on LCD.

Rather than using idiotic methods like interlacing, just buy a fancy new high refresh rate, variable refresh QHD/4K monitor. It will have better image and motion quality, and it will be bigger.
 
Interlacing blurs still images, blurs moving images, blurs at full frame-rate and blurs at half refresh rate.

At half refresh rate it gives almost equivalent quality; the progressive version just contains redundant visual information. And my whole point was that running half-refresh vsync at high resolutions isn't always possible in progressive.
 
No, you still get a double image in the form of combing artifacts, but interlacing allows you to use higher resolutions than progressive. Like on a 21" monitor, say you want to play a game at 2560x1920 and 45fps. At that resolution, you can't do 90hz progressive, but you can do 90hz interlaced. The resulting image ends up looking pretty similar, just a little more flickering on horizontal lines.

And if a game has adjustable motion blur, like recent Battlefield games, that really comes in handy when running low frame rates like 45 or 30fps. You can tweak it to where it appears to blend adjacent frames together.

Interesting info, thanks, but the motion blur... that's the main reason, among other things, I can't stand current display technology. I'd rather lower some settings to keep fps matching the monitor's vertical frequency, with a decent flicker frequency. And I have to agree that image quality degrades even at high resolutions compared to progressive.
 
I've realized that if I shrink the vertical and horizontal size of the FW900 to 0 via the OSD size menu, contrast noticeably improves: the viewable area becomes brighter without needing to raise what the OSD calls "brightness", or better said, without needing to raise and crush the black level. So what you see on screen really looks even better at that size. My contrast is already at 100%, or 255 when looking at the corresponding WinDAS line. I guess the electronics inside have to work harder when the screen is full, so contrast is reduced even when set to 100. From what I've read and understand, it has to do with what's called "luminance" and white level (not black or the G2 adjustment).

So I'm just curious whether there's a way to improve that contrast further, maybe by changing a variable via WinDAS, but without needing a colorimeter?
 
As far as I know brightness/contrast level can be lowered by the ABL protection if its values are set too low. But the only way to get them back properly at normal levels is to perform several white point balance procedures until the ABL values stabilize. I suspect the ABL protection is your problem since it's triggered by the total brightness on the screen area. If you decrease the display surface that increases the brightness allowed per surface for the same ABL limit.

Of course there may also be onboard issues influencing the contrast level but I've not yet found which specific components are involved in its control. That's one of the few functions without a line with a clear label in the datasheets.
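
The per-area reasoning can be sketched numerically. This is only a toy model with made-up numbers (the real ABL acts on beam current, not brightness directly), but it shows why a smaller raster can run brighter under the same limit:

```python
def allowed_peak_brightness(abl_cap_full_screen, lit_fraction):
    """Toy ABL model: if the limiter caps total light output
    (average brightness x lit area), shrinking the lit area lets
    that area be driven proportionally brighter before the cap hits.

    abl_cap_full_screen: brightness cap with 100% of the screen lit
    lit_fraction:        fraction of the screen emitting (0 < f <= 1)
    """
    return abl_cap_full_screen / lit_fraction

print(allowed_peak_brightness(90, 1.0))   # 90  -> full raster
print(allowed_peak_brightness(90, 0.5))   # 180 -> half-size raster can go twice as bright
```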
 
@EnhancedInterrogator
maybe it's a personal thing, but for me any interlaced mode looks unbearably ugly
and besides, I use my 21" CRT at 1280x960@120Hz with v-sync off, because on this display size resolution past 1280 pixels wide makes no difference whatsoever to me when I actually play a game, while refresh rate, performance, and input lag make a huge difference

2560x1440@90i is ugly; I checked something like this on the FW900 and saw no point in it whatsoever
running a game at half the frame rate of the refresh rate looks ugly on a CRT because it causes double images
using v-sync only makes any sense far beyond 100Hz, and only if the game runs at full frame rate all the time, not half
but frankly v-sync sucks and I'd rather compromise on tearing, which at >100Hz is not really visible on a CRT

Any further discussion on this topic does not make sense. If you do not see how crappy interlacing is, then maybe you are an interlacing-liking person... like there are guys who like fat girls... one can only wonder but never really fully understand XD
 
@EnhancedInterrogator
maybe it's a personal thing, but for me any interlaced mode looks unbearably ugly
and besides, I use my 21" CRT at 1280x960@120Hz with v-sync off, because on this display size resolution past 1280 pixels wide makes no difference whatsoever to me when I actually play a game, while refresh rate, performance, and input lag make a huge difference

2560x1440@90i is ugly; I checked something like this on the FW900 and saw no point in it whatsoever
running a game at half the frame rate of the refresh rate looks ugly on a CRT because it causes double images
using v-sync only makes any sense far beyond 100Hz, and only if the game runs at full frame rate all the time, not half
but frankly v-sync sucks and I'd rather compromise on tearing, which at >100Hz is not really visible on a CRT

Any further discussion on this topic does not make sense. If you do not see how crappy interlacing is, then maybe you are an interlacing-liking person... like there are guys who like fat girls... one can only wonder but never really fully understand XD

O.....K. Seems like you have a lot of set-in-stone rules for yourself, regardless of the game you're playing.

For me, it's a situational thing. Just like on consoles, some developers choose to make their game 30, some choose 60. It depends on what works best for the type of experience they're trying to craft. I use my PC the same way, except I have way more freedom to tweak the experience. You know, 45fps, 75fps, 90fps, whatever fits best with what I'm playing.

Running a game like The Witcher 3 on low settings at 100Hz makes no sense to me, since it's a slower-paced, non-twitchy game. 60fps is more than enough for that game, and that allows you to pump up the lighting quality, geometry, all sorts of stuff that makes a bigger artistic impression than a higher frame rate.
 
dude, look at my sig, I have a plasma to play games on

obviously plasma is far superior in everything except the things that for me make a CRT worth keeping: easy to drive at medium res @120Hz with 0ms input lag. CRT is only good for Quake-style games
 
dude, look at my sig, I have a plasma to play games on

obviously plasma is far superior in everything except the things that for me make a CRT worth keeping: easy to drive at medium res @120Hz with 0ms input lag. CRT is only good for Quake-style games

Plasma doesn't have the flexibility we've been talking about, though. It's either 30fps or 60fps, or tons of ugly tearing and stuttering if you can't accept one of those two frame rates. And a properly calibrated CRT should have deeper blacks, in my understanding
 
my suggestion is to ignore XoR. This individual has a long history of being rude and unwilling to engage others in a respectful and open minded manner.
 
Plasma would be the GOAT multimedia display technology were it not for the atrocious input lag (and to a lesser extent the fixed resolution and burn-in). As it is, CRT>>> for gaming, plasma>>> for movies.
 
As far as I know brightness/contrast level can be lowered by the ABL protection if its values are set too low. But the only way to get them back properly at normal levels is to perform several white point balance procedures until the ABL values stabilize. I suspect the ABL protection is your problem since it's triggered by the total brightness on the screen area. If you decrease the display surface that increases the brightness allowed per surface for the same ABL limit.

Of course there may also be onboard issues influencing the contrast level but I've not yet found which specific components are involved in its control. That's one of the few functions without a line with a clear label in the datasheets.

I'm not certain it can be considered a problem. Comparing with another Compaq 17" CRT monitor I have, which still has excellent overall image quality, the same thing occurs: the picture becomes brighter when I reduce the viewable area via the OSD, so it seems to be normal? It's not that my FW900 screen is dark when full, it's good indeed, but when reduced it looks even better, so I just see it as bonus contrast, and I'm curious whether it might be possible to achieve it at the full screen area.

Reading some posts from you where you talk about the ABL thing, I found two related parameters when checking the viewer list REG section, in a recently imported dat file:
ABL_SHUTDOWN_LMT= 193
ABL_CONT_LIMIT=168

But before going further with this: I think you calibrated your FW900 professionally with colorimeters, so I would like to ask if you could please reduce your viewable area the same way I did, just to know if the same occurs on a professionally calibrated FW900, and to what extent. ;)
 
my suggestion is to ignore XoR. This individual has a long history of being rude and unwilling to engage others in a respectful and open minded manner.
this is a personal attack and is much more 'rude' than anything I ever wrote

from what I remember, I did the polarizer mod and posted my findings, and you made a fuss about some unrelated thing, "WinDAS White Point Balance", and said anything I say is invalid because I do not subscribe to your bullshit. You were unable to provide even a single explanation of why doing this WPB would give different image quality than lowering G2, raising max contrast, and then doing the rest of the calibration from the OSD. And you attacked me with the same bullshit as now. And it is me who is rude? F.u.u.s.r
 
@EnhancedInterrogator
CRT is flexible, but using a bunch of different displays for their strong sides makes more sense than trying to squeeze everything onto one display.

Blacks on plasmas, especially those with proper screen filters, are second only to OLEDs, and CRTs are not really comparable despite some people's dubious claims of full-black-screen black levels, which:
a) I can also reproduce, so it is not an issue with my hardware
b) do not translate to real-life performance at all

As soon as anything other than pure black is displayed, the CRT's black level rises.
Due to this and other effects such as flaring and inner glass reflections, it is necessary to raise the black level considerably to avoid some colors (those using the lowest-intensity RGB subpixels) being displayed improperly. This is why, back when CRTs were used professionally for things like photography, they were calibrated to rather poor, unimpressive black levels, worse even than early IPS panels.

For pure multimedia purposes raising the black level is not such a big issue, but for a keen eye this inconsistency in colors is certainly noticeable =)

@aeliusg
Most Panasonic plasmas have 24ms input lag.
Is that 'atrocious'? For Quake Live, perhaps, but for those games even 60Hz with 0ms input lag is atrocious, and you need >100Hz with strobing, preferably a GDM-FW900 set to exactly 125Hz with v-sync off (set this way, it only ever tore at the top of the screen) =)
In the realm of TVs, 24ms is the best of the best, and even among computer monitors it is acceptable. Certainly a lot of people have used and still use laggier displays and do not get brain cancer
Burn-in? Couldn't care less
 
Reading some posts from you where you talk about the ABL thing, I found two related parameters when checking the viewer list REG section, in a recently imported dat file:
ABL_SHUTDOWN_LMT= 193
ABL_CONT_LIMIT=168

But before going further with this: I think you calibrated your FW900 professionally with colorimeters, so I would like to ask if you could please reduce your viewable area the same way I did, just to know if the same occurs on a professionally calibrated FW900, and to what extent. ;)
Well, I spent enough time setting up geometry right, now the settings are locked and I prefer not to touch anything on this one.

But I remember that after stabilizing, ABL_CONT_LIMIT was close to 200, and below that I couldn't reach all the peak brightness levels required during the white point balance procedure. Two quick tests may help to find out whether the limit is triggered on your screen:
- display a black screen and switch to a full white one; normally you should see the white start bright and dim half a second later if the ABL limit is active.
- try lowering contrast. If you barely see a brightness difference while it decreases, and past a given contrast value there is a significant brightness drop as contrast decreases, then the ABL limit is active. The fact that you run the screen at 100% contrast makes me think so, by the way.
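
The second test amounts to looking for a plateau at the top of the contrast range. A toy model (the cap and scale numbers here are invented, purely to illustrate the shape):

```python
def displayed_brightness(contrast_pct, abl_cap=80.0, full_scale=120.0):
    """Requested brightness scales with the contrast control, but an
    engaged ABL clips it at a fixed cap. A flat region at high
    contrast settings is the telltale sign the limiter is active."""
    requested = full_scale * contrast_pct / 100.0
    return min(requested, abl_cap)

for c in (60, 70, 80, 90, 100):
    print(c, displayed_brightness(c))
# output flattens at 80 once the request exceeds the cap (~67% contrast here)
```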

And you know, calibrating a screen is nothing that "professional", I just use a common Spyder 3 colorimeter and free software, nothing too expensive or complicated actually. ;)

XoR_
White point balance is the procedure to follow by design to fix G2, simply because things are more complicated than this single value, and the procedure also sets "hidden" related values accordingly. Editing the G2 value alone improves things as long as your only concern is a G2 value that's too high, but it's the dirty and inaccurate way to do things.
 
You were unable to provide even a single explanation of why doing this WPB would give different image quality than lowering G2, raising max contrast, and then doing the rest of the calibration from the OSD.

And once again, you appear to have a very selective memory.

I addressed this in quite some detail. Funnily enough, you never responded to this post to say "thanks" or "I see what you mean". You completely ignored it (as per usual).

In WinDAS I only changed some setting that allowed for higher luminance/contrast. Color-wise nothing was really necessary.
What settings relate to tube health, and why?

First off, controlling overall luminance is important (determined by beam current), as higher beam currents wear out the tube faster. The phosphors age faster, and more impurities are generated by the increased electron bombardment. Impurities are bad, as they accumulate on sensitive structures within the tube, reducing gun efficiency and eventually causing shorts.

I don't have a good grasp on these next two, but:

Second, the relationship between the G2 voltage and the cutoff voltage has pretty dramatic consequences. If things are not well calibrated, the electrical fields around G1 cause the aperture through which electrons flow to "appear" smaller to the electrons. This means they are drawn from a smaller area of the cathode, which intensifies cathode loading.

Third, the cutoff voltage has an impact on the burden on the amplifier circuitry, as it determines how much the amplifier has to "swing" to generate the necessary voltages to produce the desired beam currents. I think it may also have an impact on focus considerations.

WinDAS WPB systematically adjusts ALL these various parameters to factory specs, including cutoff voltages, which ensures optimal operation. You simply cannot adjust the tube to the same state using only the OSD.
 
@EnhancedInterrogator
CRT is flexible, but using a bunch of different displays for their strong sides makes more sense than trying to squeeze everything onto one display.

Well I'm already up to two CRT's running in my place at all times. High-frequency PC CRT for PC and modern consoles, and then I have 15kHz Sony PVM's for retro games. So that's all I'm willing to do for now. In the future I hope it becomes viable to replace the PC CRT with a high PPI, high refresh rate OLED.

But even then, I'm wondering how flexible the OLED's will be. I will definitely want the ability to select an arbitrary refresh rate and use it in conjunction with scanning/strobing, as I'm able to do with my CRT now.
 
@EnhancedInterrogator
I bet that if you took the panel from a Dell UP3017Q and gave it better electronics, it would be possible to make a near-perfect gaming display for both new (strobed G-Sync, square pixels at 1080p, even a form of scanlines for better looks) and old (15kHz, proper (original) progressive and interlaced rendering) hardware. In reality we will get all sorts of badly designed displays with all sorts of flaws, and for 15kHz your only solution is a stockpile of old CRT TVs =(

@spacediver
technically, I asked for the actual settings in WinDAS, which you did not really provide with your answer. If anything, you communicated your belief that automatic calibration is the best way to ensure the best longevity and image quality for the tube. I do not need to agree with any of it, and I want to tweak my hardware manually. That is, if I see a need for it, which I did not and still do not; I don't think my FW900 needs any WPB.

@Strat_84
Unlike some people, I do not share the belief that it is good for tube health to regularly run procedures that make the monitor click numerous times and set values to who knows what effect. Having tons of CRT monitors, including many Trinitrons, it is possible to judge whether a monitor is in good health and whether it needs any tinkering with voltages.
I never indicated in this topic that my FW900 has any issues at all. Rather, I always claimed it has a super sharp image, a black level as if the monitor were completely off, and tons of contrast to spare, or in other words: a perfect tube, and after putting on a proper coating, the best CRT monitor ever.
 
Plasma doesn't have the flexibility we've been talking about, though. It's either 30fps or 60fps, or tons of ugly tearing and stuttering if you can't accept one of those two frame rates. And a properly calibrated CRT should have deeper blacks, in my understanding

Eh, it's a wash. Depending on the plasma display in question. The two technologies are similar in that they're emissive and both deal with phosphor but that's where the similarities end. Plasmas should be able to get as dark as direct-view CRT's. In fact, the plasma television should have about the same full on/off contrast but where it actually outshines CRT is in intra-scene contrast, or ANSI contrast. CRT has pretty bad ANSI contrast, and plasma does not. In my opinion, a plasma display with very low input lag, which is calibrated of course, should make for an excellent display.
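
For anyone comparing numbers: full on/off and ANSI contrast are measured differently, and the gap between them is where CRT flare shows up. A sketch with hypothetical luminance readings (the values are invented for illustration):

```python
def full_on_off_contrast(white, black):
    """Full-field white vs full-field black, measured separately."""
    return white / black

def ansi_contrast(whites, blacks):
    """ANSI contrast: mean of the 8 white patches over mean of the
    8 black patches of a 4x4 checkerboard shown simultaneously.
    In-scene flare and inner-glass reflections lift the black
    patches on a CRT, which is why its ANSI number is far lower
    than its full on/off number."""
    return (sum(whites) / len(whites)) / (sum(blacks) / len(blacks))

# Hypothetical CRT readings: superb full-field black, but
# checkerboard blacks lifted by flare:
print(full_on_off_contrast(100, 0.01))         # ~10000:1
print(round(ansi_contrast([100] * 8, [0.6] * 8)))  # ~167:1
```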

EDIT: The problem, for me at least, is that most plasma displays are consumer televisions with little in the way of tweaking the image settings. Such a disappointment. I had the opportunity to get a P65-S2 - 65 inch plasma - and passed on it because I figured that if I was going to get such a large image, I may as well say screw it and just get a projector - something that would have way more tweakability.
 
@Strat_84
Unlike some people, I do not share the belief that it is good for tube health to regularly run procedures that make the monitor click numerous times and set values to who knows what effect. Having tons of CRT monitors, including many Trinitrons, it is possible to judge whether a monitor is in good health and whether it needs any tinkering with voltages.
I never indicated in this topic that my FW900 has any issues at all. Rather, I always claimed it has a super sharp image, a black level as if the monitor were completely off, and tons of contrast to spare, or in other words: a perfect tube, and after putting on a proper coating, the best CRT monitor ever.
That's an interesting mix of contradictions, ignorance and self-satisfaction. But if you're happy with that, it's OK then.

For your information, talking about tube health like you do is indeed belief, because almost every variation or issue has to do with the electronics behind it. Components age in different ways; it's illusory to think the finely tuned settings from when the screen was new give the very same result after 15 years of use. There are already slight variations between brand-new boards out of the factory, simply because components of the same type have a tolerance in value. Calibration procedures are there for that: to get past these variations and obtain the intended display on all screens.
The fact that you added a coating with characteristics completely different from the original makes such a procedure even more relevant for you than for anyone else.
 
That's an interesting mix of contradictions, ignorance and self-satisfaction. But if you're happy with that, it's OK then.

For your information, talking about tube health like you do is indeed belief, because almost every variation or issue has to do with the electronics behind it. Components age in different ways; it's illusory to think the finely tuned settings from when the screen was new give the very same result after 15 years of use. There are already slight variations between brand-new boards out of the factory, simply because components of the same type have a tolerance in value. Calibration procedures are there for that: to get past these variations and obtain the intended display on all screens.
The fact that you added a coating with characteristics completely different from the original makes such a procedure even more relevant for you than for anyone else.


Thank you for posting this. Calibration will indeed keep your equipment running healthy and well. In one particular instance, adjusting the output levels of the CRT so that the proper light output is reached will prevent you from overdriving the CRT and shortening its life.
 
There is not a single proof that regularly doing WPB (or doing it at all!) will lead to increased monitor health. When all you have are assumptions on top of more assumptions, all without any actual measurements and proper component wear analysis based on those measurements, then your guesses are as good as mine.

Besides, who cares about 'factory' anything?
The factory screwed up so many things in this monitor that it might be smarter not to opt for factory settings XD

@jbltecnicspro
Black level on the GT30 and VT30, compared to my CRT's black when displaying a pure black screen, is comparable.
I mean, it can be made much better on these CRTs, basically like the screen being off, but to avoid needing LUTs (only simple gamma correction) or issues with the darkest shades being 'eaten' by glowing black from flaring/inner glass reflections, I usually end up using a black level more like what these plasmas have.

ANSI contrast is as good on a PDP as it is on any LCD, which is almost perfect.
The coating of a modded FW900 and the factory coating of the P1110 are similar to that of the GT30, with the VT30 resembling LCD/OLED in full light.

The nice thing about plasmas is the current price of used units. Very inexpensive, and the only thing beating them in pure performance for games (on large displays) and movies is OLED, which is prohibitively expensive.

Of course it is no replacement for the GDM-FW900, which is one of a kind and irreplaceable =)
 
There is not a single proof that regularly doing WPB (or doing it at all!) will lead to increased monitor health. When all you have are assumptions on top of more assumptions, all without any actual measurements and proper component wear analysis based on those measurements, then your guesses are as good as mine.
That depends on what you call regular. I don't think it's really necessary to do a WPB every given period just for the sake of doing it, but it's a good thing to do when you can assess that some parameters have drifted. And the issue with most monitors of this kind is the G2 value drifting. The first goal of WPB is of course to restore the properties of the display you actually see, but as a side effect, restoring the G2 to lower values, and the balance between the related voltages, should prevent excessive stress the circuits weren't designed for.

BTW, if you've followed the topic a bit, you must have noticed this is actually what I ended up doing on my GDM-5410: measurements everywhere and assessing the condition of components to find the root of some issues. So far I've desoldered and checked more than half of the components on the D board, and there's been extensive work on the A board as well. I think I gained some knowledge on the subject in the process.
 
Strat_84, yes, there is a dimming difference when switching from a white to a black background, but it's barely noticeable. I repeated the test on both full and shrunk display areas, with the same barely noticeable dimming result.

I just messed around with the image restoration setting, reset all color parameters to default with the reset button, and also pushed the ASC button (I know that one is for size, but I wanted to reset everything I could). Then I set everything up again using the same parameters I had before the image restoration and global reset, and now I notice some improvement in contrast, not huge but noticeable. I use 100 contrast because I also use custom color settings different from the default color temp modes.

After all that I did the dimming test again for both sizes, with the same barely noticeable result. The display still gets brighter when shrinking the display area. I imported the dat file again with WinDAS, and ABL_SHUTDOWN_LMT=193 and ABL_CONT_LIMIT=168 still have the same values as before, so maybe that extra contrast is not related to the ABL thing?

To be honest, I have my doubts that a colorimeter + WPB will give dramatically improved results on this monitor, at least for my tastes. Unless it can raise contrast levels on the complete screen area to the level seen when the screen size is reduced to 0, it wouldn't be worth it. When searching for the colorimeters recommended here by FW900 users, even the cheap ones would still be expensive where I live because of national taxes and international shipping. The only one I see available here is the i1Display Pro, which costs as much as building a basic gaming computer.

So just testing the reduced screen size and comparing the contrast with the full-size screen on a calibrated monitor would give me an idea of whether it's worth investing in a not-so-expensive colorimeter to try those methods.

Strat_84, you would only need to unlock the OSD and reduce the vertical and horizontal size to 0 from the size menu; you don't need to mess with geometry settings, and to restore the horizontal and vertical size you just go back to the numbers you had previously. You can also do a DAT backup beforehand to feel safer. It would also be possible to test with any other resolution, regardless of aspect ratio, and shrink it to 0 size; that way it would be easier to compare the contrast gain by just switching between the full and the shrunk resolution. If you definitely don't want to do it, I understand, I don't want to bother you with this. Thank you anyway ;)
 
even the cheap ones would still be expensive where I am because of the national taxes and international shipping. The only one I see available here is the i1Display Pro, which costs as much as building a basic gaming computer system.


what the hell, do you live on Mars?

There's a DTP-94 on eBay for $20, and you can get an i1 Display Pro for under $200.
 
3dfan I know what I would have to do to reduce the display width of course, but the thing is I carefully set the width/height to an exact 16:10 ratio and I don't want to bother checking everything again with a ruler. Besides, this way of testing is less relevant than the ones I suggested to you. ;)

Ok, a few things now:
- the relation between contrast and the ABL limit is that when ABL is triggered, brightness is capped. In other words, you might very well see no actual brightness increase when increasing contrast from 80 to 100, for example.
- I wouldn't have expected a reset or color restore to change ABL. As far as I know, it's a "hidden" value set during WPB, and it might change (decrease?) automatically during use if ABL errors are encountered. The ABL is a safety function; it's not meant to be changed freely by users.
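The capping behavior described above can be sketched in a few lines of Python. This is a conceptual model only: the `abl_limit` value and the linear contrast × average-picture-level product are illustrative assumptions, not the monitor's actual firmware logic.

```python
# Conceptual sketch of ABL (Automatic Beam-current Limiter) behavior.
# Assumption: requested beam current scales with contrast x average
# picture level (APL); the real firmware behavior is not public.

def effective_brightness(contrast, apl, abl_limit=80):
    """contrast: 0-100 OSD value; apl: average picture level, 0.0-1.0."""
    requested = contrast * apl        # proxy for requested beam current
    return min(requested, abl_limit)  # ABL clamps the total current

# Full white field (apl=1.0): contrast 80 and 100 look identical,
# because both requests exceed the cap.
print(effective_brightness(80, 1.0), effective_brightness(100, 1.0))  # 80.0 80.0

# Small white patch (apl=0.1): ABL never triggers, so raising
# contrast still raises brightness.
print(effective_brightness(80, 0.1), effective_brightness(100, 0.1))  # 8.0 10.0
```

This is why a reduced-APL image can keep getting brighter with contrast while a full white field stalls at the cap.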
 
@3dfan
Reducing the monitor size should only make the display brighter because the same number of electrons now hit a smaller area: a very small, barely noticeable difference => not worth it. I tested it, and it works as expected, a small barely noticeable difference. I do not think it should change anything related to ABL, as the size setting only changes how much the deflection coils 'stretch' the beam and pretty much nothing else. (I could be mistaken here. Will try to test this in a moment...)

To test the ABL limit issue, display something like a very small white rectangle and then switch to an image of a full white screen. If there is a very strong visible difference in the brightness of white, then you hit the ABL limit; if not, that's not it. Even then, ABL should not be much of an issue with real-life images, as it would only make very bright images somewhat dimmer, not a biggie.
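For anyone who wants ready-made versions of those two test patterns, a short script can generate them. This is a sketch using only the Python standard library; the 640x400 size and the roughly 4% patch area are arbitrary choices.

```python
# Generate the two ABL test patterns as binary PPM files (P6 format),
# which most image viewers can open. Pure standard library.

def write_ppm(path, width, height, pixel_fn):
    with open(path, "wb") as f:
        f.write(b"P6\n%d %d\n255\n" % (width, height))
        for y in range(height):
            for x in range(width):
                f.write(bytes(pixel_fn(x, y)))

W, H = 640, 400

# Pattern 1: small centered white rectangle on black (low APL).
def small_patch(x, y):
    inside = abs(x - W // 2) < W // 10 and abs(y - H // 2) < H // 10
    return (255, 255, 255) if inside else (0, 0, 0)

# Pattern 2: full white field (maximum APL, where ABL would kick in).
write_ppm("abl_small_patch.ppm", W, H, small_patch)
write_ppm("abl_full_white.ppm", W, H, lambda x, y: (255, 255, 255))
```

Flip between the two images full screen: if full white is visibly much dimmer than the small patch, ABL is limiting.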

Post your bias/gain settings for RGB from the expert color setting.

In any case these monitors are exceptionally bright for CRTs and have a very translucent AG coating, so it is strange that you would lack brightness...
 
spacediver
I'm in Colombia, South America, where import taxes, the dollar exchange rate, and other taxes (which are currently even higher than before thanks to corruption, mediocre government management, etc.) make a cheap product like the $20 DTP-94 at least 3x more expensive. Also, it's a used and old product, which makes me doubt whether it would work as expected, and the i1Display Pro is at least 2 times more expensive here. I also don't believe my monitor is in such bad condition that I cannot set it up correctly to suit my tastes via the OSD; the only thing I have not been able to improve further is the contrast level I have been talking about, but the current contrast is not bad for me. I'd just like to get that additional contrast I see at size 0 on the full screen, like a bonus. However, I could consider the DTP-94 if I can get it considerably cheaper in the future when some relatives come from the US and can bring it, so I could skip the additional taxes and other costs.

Strat_84
thanks for the info. Would it be possible to know how you got an exact 16:10 ratio? I mean, via the OSD? Via WinDAS? If so, what values did you change for that?


XoR_
interesting info you got there, thanks for doing the test I was referring to. I raised both ABL settings a bit (250 shutdown, 230 limit) with a hex editor, imported the DAT, and tested; I did not see a difference, still noticeably brighter with the reduced area, and the same barely noticeable difference when going from a black to a white image that I had already tested, so I reverted to a backup DAT with the default ABL values.

So in my case there is a noticeable difference in display brightness when the size is reduced to 0; it is as if I raised the contrast from, say, 100 to 150 (about a 50% difference) when using my RGB settings from the expert color settings. But now that you mention it, I have realized that if I use some of the default colors in the easy section, for example 5000K (reddish), and then reduce the size to 0, the extra brightness is less noticeable; at 6500K it is more noticeable than at 5000K, at 9300K even more so, and at my settings even more. It's like the closer I get to a whiter white, the brighter the area gets when the size is reduced.

My bias/gain settings for RGB in the expert color setting are

100
100
80
90
91
100

Those settings give me a whiter feeling for what is white while keeping overall good color to my taste. I also use a custom coating made from polarized car film, which is a bit darker than the original.
 
Strat_84
thanks for the info. Would it be possible to know how you got an exact 16:10 ratio? I mean, via the OSD? Via WinDAS? If so, what values did you change for that?
The geometry settings in WinDAS do that among other things, making the image ratio fit the resolution ratio. Then you can use the OSD controls (normally only zoom, if the WinDAS setup is right) to expand that to the display area, since the WinDAS settings leave some black borders that can be used.


interesting info you got there, thanks for doing the test I was referring to. I raised both ABL settings a bit (250 shutdown, 230 limit) with a hex editor, imported the DAT, and tested; I did not see a difference, still noticeably brighter with the reduced area, and the same barely noticeable difference when going from a black to a white image that I had already tested, so I reverted to a backup DAT with the default ABL values.

So in my case there is a noticeable difference in display brightness when the size is reduced to 0; it is as if I raised the contrast from, say, 100 to 150 (about a 50% difference) when using my RGB settings from the expert color settings. But now that you mention it, I have realized that if I use some of the default colors in the easy section, for example 5000K (reddish), and then reduce the size to 0, the extra brightness is less noticeable; at 6500K it is more noticeable than at 5000K, at 9300K even more so, and at my settings even more. It's like the closer I get to a whiter white, the brighter the area gets when the size is reduced.

My bias/gain settings for RGB in the expert color setting are

100
100
80
90
91
100

Those settings give me a whiter feeling for what is white while keeping overall good color to my taste. I also use a custom coating made from polarized car film, which is a bit darker than the original.
Well, THAT is very informative.

First, I said it before but I'll say it again: DON'T EDIT ABL SETTINGS. This is part of an automatic safety function; it's a very bad idea to change that manually.

Second, you use a darker polarizer, hence the 100 contrast (normally very unusual).

Third, all the gain/bias values are at or close to maximum (+ contrast), and that may also explain the brightness variations. The electronics are overdriven and may not like it very much.

Fourth, even though a colorimeter is a luxury in your country, you would certainly benefit from one. With the polarizer + all settings at maximum, there's not a single chance your screen still displays colors right.
 
spacediver
I'm in Colombia, South America, where import taxes, the dollar exchange rate, and other taxes (which are currently even higher than before thanks to corruption, mediocre government management, etc.) make a cheap product like the $20 DTP-94 at least 3x more expensive. Also, it's a used and old product, which makes me doubt whether it would work as expected, and the i1Display Pro is at least 2 times more expensive here. I also don't believe my monitor is in such bad condition that I cannot set it up correctly to suit my tastes via the OSD; the only thing I have not been able to improve further is the contrast level I have been talking about, but the current contrast is not bad for me. I'd just like to get that additional contrast I see at size 0 on the full screen, like a bonus. However, I could consider the DTP-94 if I can get it considerably cheaper in the future when some relatives come from the US and can bring it, so I could skip the additional taxes and other costs.

The DTP-94 has excellent build quality. Those things last decades. The second hand one which is $20 would be $60 for you because of all the shipping costs etc. That's still a good deal.
 
By the way, a little update on the AR coating subject.

I've taken brightness measurements on a non-damaged area with the coating on, then again after taking the coating off. Measurements were taken with the following settings in the OSD (not calibrated): 5000K (white display), 6500K (white, red, green, blue), 9300K (white), after letting the screen run on a white display for at least 1h30.

The results in % (brightness with coating / brightness without):

White: 5000K 66.6%, 6500K 66.4%, 9300K 66.1%
Red: 6500K 67%
Green: 6500K 66%
Blue: 6500K 63%

Removing the original coating increases the color temperature. I measured about 7300K (at the 6500K setting) with the coating, becoming about 7800K without it. That's consistent with the previous measurements.
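The color temperature shift follows directly from the per-channel numbers above: blue is attenuated more than red, so stripping the coating boosts blue relative to red. A quick sanity check in Python (using only the posted 6500K ratios; the ~6% figure is just arithmetic, not a new measurement):

```python
# The posted figures are per-channel transmittances of the coating
# (brightness with coating / brightness without) at the 6500K preset.
transmittance = {"red": 0.67, "green": 0.66, "blue": 0.63}

# Removing the coating boosts each channel by 1/transmittance.
boost = {c: 1 / t for c, t in transmittance.items()}

# Blue is attenuated the most, so stripping the coating shifts the
# white balance toward blue, i.e. a higher color temperature
# (consistent with the ~7300K -> ~7800K readings above).
shift = boost["blue"] / boost["red"]
print(f"blue gains {100 * (shift - 1):.1f}% relative to red")  # ~6.3%
```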

So roughly, the original AR coating is one with a transmittance of 66%, with a curve similar to the bold one here (extract from "The Transmittance Control of Film Laminated CRTs by Light Absorbent Anti-reflective Coating and Pigmented Adhesive"):

[attached image: transmittance_vs_wavelength.jpg]
 