24" Widescreen CRT (FW900) From Ebay arrived,Comments.

After reading this post I bought the Delock 62967. It is quite cheap, and for future compatibility it will suffice, as I don't really care for anything higher than 1920x1200@96Hz.

Should I expect any compatibility issues due to the default cable being bad quality?

It depends on the strength of the DisplayPort signal of your graphics card.
For example, on an MSI GTX 1070 Gaming I tested 8 different samples and they all work perfectly. On a Gigabyte 7950, one sample is perfect, two reach HBR2 (over 180 MHz) but are unstable, and the other five samples can't go over 180 MHz. But on this card I have mini DisplayPort connectors, so between the adapter and the card there is another adapter that messes up the signal further.
Like danny_discus said, it's hit or miss: adapters aren't all the same and the graphics card model changes the result. If it doesn't work, changing the cable should solve all the problems.
 
Derupter
Okay then. If there are any issues I will replace the cable. It shouldn't be at all difficult.
Are there any other issues or considerations with this adapter I should be aware of?
 
It's not that difficult. If you know someone skilled with a soldering station it's an easy task; I've done it on a sample with a shit soldering iron.
There are no other issues.
 
Yes I do. It's an i1 Pro. I've done a full white balance calibration already.

I'm just not sure I completely follow your instructions regarding determining whether the adapter is 10 bit.

And if the adapter is NOT 10 bits what does this mean practically? Does it only matter if I'm adjusting the gamma with dispcal? Or would I see banding artifacts anyway?

I'm happy to run the tests and let everyone know if you could provide some easy to follow instructions.

Sorry for the delayed reply, I was away for a few days.

You said i1 pro - do you mean the i1 display pro (colorimeter), or the i1 pro (spectrophotometer/spectroradiometer)?

Assuming you have ArgyllCMS installed (instructions are in the WPB guide), all you have to do is place the measuring instrument so that it's measuring the center of the screen, and type dispcal -R on the command line. A test will be performed that tries to determine how many bits your LUT has, and then it will print the results on the screen.

Having a 10 bit LUT is important if you want to do any software color calibration after performing the hardware calibration.

Most of us (there are some exceptions) are running in an environment that has 8 bits per channel framebuffers. This means each pixel, in any given image, is constructed by choosing from a palette of 256 unique values for each of the RGB channels. The video LUT (look-up table) is a table that maps each of these values to the voltage that gets sent to the CRT. Think of it as a matrix that has three columns (one for each channel) and 256 rows. Each element of this matrix has a value that is represented with 16-bit precision in Windows. That means each value can be an integer between 0 and 65535.

The default LUT is a linear one, which means the 256 entries are spread evenly across the range from 0 to 65535. So the first value would be 0, the second would be 257, the third would be 514, and the 256th would be 65535.

Now even though the LUT is specified with 16 bits of precision, this isn't 16 effective bits. For example, if you nudge an entry from 514 to 516, you might think you'd get a small bump in voltage. But with only 8 bits of effective precision, every value inside the same 256-wide window maps to the same output, so you won't get a bump until the value crosses into the next window (at 768 in this case).

With 10 bits of precision, even though you can define only 256 video input levels, you can choose from a palette of 1024 values for each of those 256 levels.
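
If it helps to see that concretely, here's a rough Python sketch. I'm assuming the DAC simply truncates the 16-bit value down to its own precision, so the exact boundary values are illustrative rather than gospel:

```python
# Rough sketch of why 16 bits of LUT precision don't translate into 16
# effective bits when the DAC behind it is only 8 bit. Truncation is an
# assumption here; the real rounding may shift the boundaries slightly.

def linear_lut(entries=256, max_val=65535):
    """Default linear ramp: 256 LUT entries spread evenly over 0..65535."""
    return [round(i * max_val / (entries - 1)) for i in range(entries)]

def dac_level(value, dac_bits=8):
    """Collapse a 16-bit LUT value to the DAC's actual precision."""
    return value >> (16 - dac_bits)

lut = linear_lut()
print(lut[:3], lut[-1])                      # [0, 257, 514] 65535

# Every 16-bit value inside the same 256-wide window lands on one DAC level,
# so nudging an entry by a few counts usually changes nothing at the output:
print(dac_level(514), dac_level(600), dac_level(767))   # 2 2 2
print(dac_level(768))                                   # 3 -> first real bump

# With a 10-bit DAC the windows are only 64 wide, so much finer
# adjustments actually reach the screen:
print(dac_level(514, dac_bits=10), dac_level(600, dac_bits=10))   # 8 9
```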

Suppose you've hardware calibrated your CRT to perfection, and need no software calibration (i.e. modification to the LUT). Then everything is fine.

But suppose you calibrated your CRT so that you get really inky blacks by using a lower than normal G2 voltage. A side effect of having such a low G2 is that the gamma/luminance function/EOTF of the display isn't ideal. The blacks will be crushed (high gamma). The way to address this is to adjust the LUT.

Here's the important part: with a LUT that has only 8 bits of precision, you only have 256 unique voltages to assign to each of the 256 video input levels. If you want to change the gamma by raising or lowering the default voltage of one of the video input levels, this new voltage may end up being the same voltage as a neighbouring video input level. Each time you make a change in this fashion, you drop the number of unique voltages being used, and therefore reduce the number of unique colors being displayed. The more radical the change to the gamma, the more unique colors you sacrifice.

Here's an image I made that may help illustrate the limitation of having only 8 bits of LUT precision:

[image: illustration of 8-bit LUT quantization]


The gray line represents the default luminance function of a hypothetical display (gamma = 2.2).

The black line represents the target luminance function (gamma = 1.0, which is the standard in many visual psychophysical experiments).

In order to change the gamma from 2.2 to 1.0, you end up dropping from 256 unique luminance levels to 184 luminance levels (you can see the quantization artifacts more clearly in the magnified inset).
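
If you want to play with the counting yourself, here's a rough sketch of the idea behind that figure. It assumes a native gamma of 2.2, a target of 1.0, and simple rounding at the DAC, so don't expect to land on precisely 184:

```python
# Rough sketch of how re-targeting gamma through the LUT costs unique levels.
# Native gamma, target gamma, and the rounding model are all assumptions here;
# the exact surviving-level count depends on them.

def surviving_levels(native_gamma=2.2, target_gamma=1.0, dac_bits=8, levels=256):
    dac_max = (1 << dac_bits) - 1
    outputs = set()
    for i in range(levels):
        x = i / (levels - 1)                      # normalized video input level
        # LUT value v chosen so that v**native_gamma == x**target_gamma
        v = (x ** target_gamma) ** (1.0 / native_gamma)
        outputs.add(round(v * dac_max))           # what the DAC can resolve
    return len(outputs)

print(surviving_levels(dac_bits=8))    # well below 256
print(surviving_levels(dac_bits=10))   # all 256 input levels stay distinct
```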
 

Thanks for the instructions. I'm using an i1 Pro spectrophotometer, not the i1 Display Pro.

Here are the results:

[screenshot: dispcal -R results]



So... 8 bits.

After using the 1070 Ti and Sunix adapter for about a week or so, I'm really considering going back to the 980 Ti. Maybe I'll even find a Titan X.

I'm much more concerned with absolute image quality than the small increase in FPS I'd get in some games with the 1070 Ti.

I've also noticed that the Sunix adapter causes periodic wavy distortion, and the screen will flicker black every once in a while. Subjectively, the image quality seems a bit worse to me.

On the plus side, I'm able to run any of the resolutions and refresh rates that the 980 Ti could run, but I can't push it past the 400 MHz ceiling.

I'll see if EVGA will replace this one, or I'll have to sell it and buy a different card from Ebay.
 
Thanks for sharing these results. The i1 pro spectro is a good instrument to have, but the i1 display pro can read much lower luminance levels. That's going to be important when doing software calibration with deep blacks.
 

I have been thinking about getting an i1 Display Pro and then profiling it against the i1 Pro. Then I'd have the best accuracy, plus the speed and the ability to read down to low luminance levels.

The problem with colorimeters is that they drift so you can't really know whether they are accurate without having a spectro to measure them against. I actually own an old i1 Display 2 which reads lower than the i1 Pro but not nearly as low as the i1 Display Pro. I might profile it against the i1 Pro and recalibrate the display and adjust the gamma with ArgyllCMS. But I agree with you, as far as colorimeters go, the i1 Display Pro is the one to get.
 
Thanks for the instructions. I'm using an i1 Pro spectrophotometer, not the i1 Display Pro.
A spectrophotometer is great for measuring gamut but not really for calibrating CRTs.

Can you post the spectral characteristics of your FW900 and the RGB CIE values?
Argyll has tools to measure these things.
 

Sorry to hear the Sunix adapter didn't work out =(. I guess it really is a lottery with the Sunix (or maybe it doesn't like Nvidia cards). Not sure if you missed it, but for me, to fix the waviness, I had to use a refresh rate not ending in 5. So 60, 70, 80, 90, 100 Hz work, while 65, 75, 85 were wavy. Also, to fix the black flicker, I used USB power.
 

The i1 Display Pro is a very finely engineered instrument, and you probably won't have to worry about drift if you take decent care of it.

Don't forget that spectros drift too.

As a sanity check, you can always measure your primaries every few months and see if the chromaticity remains stable (being sure to measure the same part of the screen each time, to be safe).
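
For what it's worth, the arithmetic for that check is trivial; something along these lines, with made-up XYZ numbers standing in for real readings:

```python
# Quick drift check: convert two XYZ readings of the same primary (taken a few
# months apart) to xy chromaticity and compare. The XYZ numbers below are
# placeholders, not real measurements.

def xy(X, Y, Z):
    s = X + Y + Z
    return X / s, Y / s

old_x, old_y = xy(21.3, 11.2, 1.0)   # e.g. red primary, first session (made up)
new_x, new_y = xy(21.0, 11.1, 1.0)   # same primary, months later (made up)

print(f"delta x = {new_x - old_x:+.4f}, delta y = {new_y - old_y:+.4f}")
# How much drift matters is a judgment call; a consistent shift of more than a
# few thousandths in x or y is worth a second look.
```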
 

I'm using USB power and I've tried using different refresh rates. The wavy distortion comes and goes and it may happen less at refresh rates that don't end in 5, but it still happens. And periodic flashes to black still occur. Now, these don't happen too often, but coupled with all the other little hassles I've experienced I'd rather just forget this card and go back to a 980 Ti or a Titan X.

Don't get me wrong. It's not terrible, all things considered. If I actually needed the extra speed, I might put up with it or seek a workaround. But I prioritize absolute image quality now, and issues like the 8-bit LUT versus 10-bit would bother me since I use ArgyllCMS to adjust gamma. Maybe it is just because I had to switch to a different cable, but text seems a little fuzzier.

It's a combination of all these things actually. Others might have a better experience.
 
On Radeon cards 8-bit is not an issue thanks to dithering, which IMHO is excellently implemented.
As for the adapter, maybe the Delock will be better (as suggested by Derupter). It should be good for 1920x1200@96Hz, which is fine for the FW900 and not as demanding on the GPU, so a constant 96 fps should be doable on a fast GPU. 1440p on the FW900 is IMHO overdoing it anyway.

If you go the 980 Ti route, be sure to get the fastest model around.
In any case, what's great about a CRT is that you can always lower the resolution for more demanding games without any scaling (y)
 
Can the Delock 62967 do 1920x1200 96 Hz guaranteed, or is it a lottery? Also, where is everyone in the US buying them from? I found this website but I'm not sure if it's the best place to order it. http://www.grooves-inc.com/delock-d...7PfhcCuq0tILjqCJji78ilu_C9L8uEewaApSUEALw_wcB

Does it work with a VGA to BNC adapter? I'd rather use BNC to my FW-900.

You have various solutions:

1) Delock 62967: 1920x1200 96 Hz is guaranteed, but if it doesn't work you will need to replace the cable, so if you are able to do that this is the most economical solution

2) There are USB-C adapters with the same chipset as the 62967, like:
Delock 62796
Plugable USBC-VGA
Sunix C2VC7A0
With these you need a card that acts as a DisplayPort to USB-C adapter:
Sunix UPD2018 or Delock 89582 (you need a free x1 PCI Express slot)
This is a working and stable but less economical solution

3) Sunix DPU3000: very high pixel clock, but some users have had various problems

4) Wait for a revised 62967, but I don't know if or when that will happen
 
Alright, my only completely functional FW900, set up and with its film in pretty good shape, just decided it was time for a holiday... The display width started decreasing/increasing at a fast rate after I powered it on, and finally there was no image after a few minutes, with 2 orange blinks.

Of course the service manual of the FW900 gives a very useful piece of advice about orange blinking: contact your Sony authorized dealer. :ROFLMAO:

In the P1130 service manual I can find HV failure for this, but I don't know if the code is the same for the FW900. Does anyone know for sure what it means?


Of course I'm not going to let that lazy bastard get away with it; it won't get much rest and it will remain my slave forever. But if I can get some information pointing me in the right direction, that would surely help. :)


edit: OK, never mind, the answer is hidden in the G1 chassis manual (the one that somewhat describes the functions of the electronics in the FW900 and the 21" models of the same generation):

HV_FAILURE: Amber (0.5 sec), Off (0.5 sec), Amber (0.25 sec), Off (1.25 sec): Extremely high voltage, or high voltage stop

HV_FAILURE
If high voltage detection continues for more than 2 sec, or if the voltage value is outside the specified range, the system is forcibly shut down. Concretely, the voltage on pin 86 (HVDET) is monitored.
 
Well, I think I'm finally lining up a deal to source a replacement D board/flyback transformer! Hopefully, I can rejoin the FW900 crowd soon enough; mine's been out of commission for far too long, and I refuse to just junk it.

Sure, my FG2421 is more convenient in a lot of ways (60 Hz modes that modern games default to don't flicker the hell out of my eyes, for starters), but it's just not the same with that VA black smear and 1920x1080 limit.

OK, I plugged my USB TTL cable into the monitor.

Windows knows it is a USB-to-serial adapter; however, it won't install drivers. I am reading about RXD and TXD cables being swapped? Does the TXD go to the RXD and vice versa?
If you see a COM port in the Device Manager, you're good to go driver-wise.

As for the Tx and Rx connections, remember that those are labeled according to which device you're looking at and need to be crossed over accordingly. Thus, computer Tx to monitor Rx, and vice versa.

Imagine it being like a two-lane highway; if you connected Tx to Tx on both ends, you'd have traffic crashing head-on into each other!

Also, as an aside for you retrocomputing enthusiasts out there, these TTL/UART serial adapters are also exactly the sort you can use to flash a Gotek floppy emulator with custom firmware like FlashFloppy or HxC, instead of paying ridiculous sums to a reseller just to get a pre-flashed drive. If you're willing to solder on the pin headers yourself and make sure to disconnect the LCD before flashing, it's pretty easy.

Why wouldn't it work through BNC? BNC and VGA carry the same signal.
At a base level, this is true; the RGBHV lines don't change just because they're going through quintuple BNC connectors rather than a DE-15.

However, the extra pins on a DE-15 VGA connector carry EDID information that there are no provisions for on the BNC input, which is why it comes up as a "Generic Non-PnP Monitor" in the Device Manager. This may result in weird complications with setting the higher-resolution modes, given my experiences with trying to use the BNC input in the past.
 
HV_FAILURE: Amber (0.5 sec), Off (0.5 sec), Amber (0.25 sec), Off (1.25 sec): Extremely high voltage, or high voltage stop

HV_FAILURE
If high voltage detection continues for more than 2 sec, or if the voltage value is outside the specified range, the system is forcibly shut down. Concretely, the voltage on pin 86 (HVDET) is monitored.
It is so f*****g sad when a FW900 dies :cry:
 
I won't let it die, don't worry. :p

After checking the fail info with WinDAS I could find an error for vertical deflection. I need to check the surrounding components extensively, but it seems likely the vertical deflection IC fried and in the process caused instabilities on the 15V / -15V lines, which triggered the HV protection.

Fortunately I have a spare IC of that kind on a scrap board.
 
Brace yourselves, ridiculous story. :p


I checked the vertical deflection IC, compared it to the spare one, and it seemed OK. I still replaced it. I checked the components around it, nothing wrong. Yet I would only get a green light with no picture at all.

I swapped the N board with a working one (the problem could have come from the signal generation), still nothing. Not the G board either (it shouldn't have been that since the voltages are OK, but it had to be ruled out).

... And in the process of swapping the S and A boards I discovered one of the flat cables between the N and D boards had a broken wire. The one for Vshape... And 2 wire ends were unstuck and folded up on the other one. Actually, the vertical shape issues I have had from the start on that monitor probably came from that f***** cable, and it's not the resoldering on the N board that fixed it temporarily, but moving the cable around. :rolleyes:

With another cable the monitor isn't brain dead anymore. :D

The display is significantly brighter, and quite blurry though; I still have to investigate whether there is another issue to fix or if it just needs to be calibrated again. While it was open I did replace a disc capacitor on the G2 line that is plain crap (type E ceramic with -20/+80% tolerance, an insane capacitance drop with temperature/voltage, and fast aging). This may be the reason for the sudden change. I suspect that capacitor is the actual root of the G2 drift issue.
 
I'm trying to convince Delock to do the 62967 without the cable, like this:
http://www.delock.com/produkte/1023_Displayport-male---VGA-female/65653/merkmale.html
The 62967 is amazing: it costs nothing, it is stable as a rock, I have a sample that can do 355 MHz, the most unlucky sample I have can do 340 MHz, and if I remember correctly it can do interlaced resolutions; any resolution within that pixel clock is accepted without problems.
But they used a shit cable, it doesn't work well with most video cards, and I want to solve this problem.
For anyone who doesn't care about a higher pixel clock, it is the best solution.
About the Sunix adapter: inside the Synaptics VMM2322 chipset there is a triple 8-bit DAC.
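
If anyone wants to check whether a given mode fits inside their sample's limit, the arithmetic is just horizontal total times vertical total times refresh rate; the totals below (blanking included) are example figures, plug in whatever your custom-resolution tool reports:

```python
# Pixel clock = horizontal total x vertical total x refresh rate.
# The totals below are plausible example timings, not the exact ones anyone
# here is using; read the real values out of your custom-resolution tool.

def pixel_clock_mhz(h_total, v_total, refresh_hz):
    return h_total * v_total * refresh_hz / 1e6

# e.g. 1920x1200 @ 96 Hz with some typical blanking (say 2560 x 1245 total):
print(pixel_clock_mhz(2560, 1245, 96))   # ~306 MHz, inside a 340-355 MHz sample
```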
Mine does 348 MHz. It does not work at 349, and in between there are artifacts.

I do not have the FW900 here, so I do tests on a Dell P1110, and the maximum resolution I tried was 2360x1770@60Hz, to test FastSync and RTSS at 60 fps. Butter smooth and less input lag than V-Sync ON :)

Given that I took the time to set it up, I also added 1920x1150p at 90/1.001, 96/1.001 and 100 Hz and madVR resolution auto-switching, and will use it to play videos instead of the HP LP2480zx. Now I have the full vacuum tube experience :cool:

The Dell P1110 blocks/reflects ambient light much better than the FW900 with polarizer (the screen looks blacker when off and has less flaring/inner glass reflections), and its maximum whites are in the 'my eyes hurt, it is so bright' range. I do not even want to imagine the momentary brightness of the raster dot... if it wasn't moving around so fast it would probably be like staring at the sun :dead:

So far I have not seen anything I could complain about with the Delock 62967. It handles any resolution and refresh rate I throw at it without any issues :cat:
I will do a sharpness comparison with the 980 Ti later.

Oh, and it does have banding on an NV card because the adapter is 8-bit. On AMD cards there won't be any banding because they know the word 'dithering' ;)
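
If you want to see roughly what dithering buys you, here's a toy sketch; the noise model is deliberately simplified and is not what the GPU hardware actually implements:

```python
# Toy illustration of why dithering hides banding: quantizing a smooth ramp
# straight to 8 bits leaves long flat bands, while adding a little noise before
# quantizing breaks them up into fine grain.
import random

def quantize(values, bits=8, dither=False):
    q = (1 << bits) - 1
    out = []
    for v in values:                           # v in 0.0 .. 1.0
        if dither:
            v += (random.random() - 0.5) / q   # +/- half a quantization step
        out.append(min(max(round(v * q), 0), q))
    return out

def longest_flat_run(seq):
    best = run = 1
    for a, b in zip(seq, seq[1:]):
        run = run + 1 if a == b else 1
        best = max(best, run)
    return best

ramp = [i / 4095 for i in range(4096)]         # a smooth high-precision gradient
plain = quantize(ramp)
dithered = quantize(ramp, dither=True)

print("plain:   ", longest_flat_run(plain), "samples in the widest band")
print("dithered:", longest_flat_run(dithered), "samples in the widest band")
```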
 

So it works well and you didn't have to change the cable; what video card do you have?
348 MHz is a nice sample. If you want to be sure it is stable, you can try right-clicking on a black desktop and looking for a little noise on the window border.
The behavior of your sample is right: between a perfectly stable image and a black screen there are usually 2-3 MHz max.
About the banding on Nvidia cards, I've never seen it during my tests (2D and 3D) on a 1070, but it's not my computer and I did only short tests, so maybe I didn't notice it.
Is there a specific test to do?
Did you see banding in games or only on specific things?
I'm interested in this because my next card will probably be Nvidia.
 
Some news regarding the AR film. There's probably no way to have a film manufactured with the correct specifications unless someone personally knows the right people in the right company. All the companies I contacted obviously don't bother thinking about something that isn't expected to bring them much profit. I give up.

I've received 2 samples of standard film to consider as a stopgap though: one AR film, and another which is both antireflective and antiglare.

The AR/AG film is really good regarding reflections; they are low and you can't guess what that reflected shadow is. But as the specifications suggested (haze <= 5%), that film is pure crap on a display despite being advertised for that use. The picture becomes quite blurry.

The AR film seems on par with the one on FW900s regarding reflections, maybe a bit better. It has purple reflections whereas the original has rather blue ones. It seems to bring a slight increase in contrast, which is consistent with the specs (transmittance of about 92%), and there is no significant static electricity on the surface with a conductive tape simply stuck to the outside of the film. If necessary I may try to improve the electrical contact between the tape and the film by fastening a staple through the film. The picture is as sharp as without a film. Pretty much a winner, I think.
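
The contrast gain makes sense when you work it out: light from the tube crosses the film once, while ambient light reflected off the faceplate crosses it twice, so even before counting the reduced surface reflection the film helps a little. A rough back-of-the-envelope, with made-up luminance numbers:

```python
# Back-of-the-envelope for why a ~92% transmittance film can slightly raise
# effective contrast in a lit room: emitted light is attenuated once, ambient
# light bouncing off the faceplate is attenuated twice. The luminance numbers
# are made up for illustration; only the 92% figure comes from the spec sheet.

T = 0.92          # film transmittance
white = 80.0      # cd/m2 off the phosphors at white (example value)
black = 0.05      # cd/m2 of residual glow at black (example value)
ambient = 1.0     # cd/m2 of room light reflected back by the bare faceplate (example)

bare_contrast = (white + ambient) / (black + ambient)
filmed_contrast = (white * T + ambient * T**2) / (black * T + ambient * T**2)

print(f"bare glass : {bare_contrast:.0f}:1")
print(f"with film  : {filmed_contrast:.0f}:1")
```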

AR film vs no film:

DSC03170.JPG

DSC03181.JPG

AR/AG film vs no film:

DSC03168.JPG

DSC03186.JPG

AR/AG (left top) vs AR (left bottom) vs bare glass vs original coating (right):

DSC03194.JPG


There's one funny thing I really can't explain though. I wanted to roughly check the transmittance values/profile of the films vs no film with my colorimeter, but I can't. The measurements don't make any sense: I obtain higher luminance values with the films than without. :wacky:
 
So it works well and you didn't have to change the cable; what video card do you have?
348 MHz is a nice sample. If you want to be sure it is stable, you can try right-clicking on a black desktop and looking for a little noise on the window border.
The behavior of your sample is right: between a perfectly stable image and a black screen there are usually 2-3 MHz max.
About the banding on Nvidia cards, I've never seen it during my tests (2D and 3D) on a 1070, but it's not my computer and I did only short tests, so maybe I didn't notice it.
Is there a specific test to do?
Did you see banding in games or only on specific things?
I'm interested in this because my next card will probably be Nvidia.
Without changing the GPU LUT contents there is no banding. On AMD cards there won't be any banding at all because their output implementation does not suck balls.

I didn't need to change the cable. It works flawlessly. The GPU is a 980 Ti, an Asus Poseidon to be precise. It makes little sense to use an adapter with this GPU obviously, but it was cheap and now I am prepared for the future =) I won't ever need a faster adapter, as even 320 MHz is all I ever wanted and this one does even more.
 
Does anyone know where a flat cable compatible with the FW900 connectors can be found? I've searched all over the internet with no luck so far; everything I find has a different width for the same number of pins. :banghead:

References for the cable are the following: E66085 AWM 20624 80C 60V VW-1 BANDO-S-F, 25 pins / about 33 mm width / 210 mm length
 
Strat_84, I don't know how hard it can be to find an exact cable like that in your location, or to find a computer repair center, but where I am (Colombia), for example, laptops use similar cables where it's hard to find the exact model, and technicians at computer repair malls take a longer, wider cable with the same thickness and pin spacing and cut it with scissors to match the width of the original, and it works as a replacement. The FW900 flat cables seem very similar, so you could try going to a repair center to get one and cut it?
 
In France repair centers are more or less a thing of the past. But the trick of cutting a wider cable to size is a good idea, I'm going to look in that direction. ;)
 
13 years on the front page of [H] Display Sub Forum. Legend.

I finally dumped mine. They both died; one was surely fixable, but the hassle was not worth it. I let them sit in my room for over a year, and finally, with deep sadness, wheelbarrowed the damn things to the truck. :/
 
The display is significantly brighter, and quite blurry though; I still have to investigate whether there is another issue to fix or if it just needs to be calibrated again. While it was open I did replace a disc capacitor on the G2 line that is plain crap (type E ceramic with -20/+80% tolerance, an insane capacitance drop with temperature/voltage, and fast aging). This may be the reason for the sudden change. I suspect that capacitor is the actual root of the G2 drift issue.
I investigated this further today. The capacitor in question is C919, BTW. The original one measured about 4150 pF after being heated during desoldering (meaning the value was probably lower before) instead of 4700 pF. Not out of spec yet, but pretty much worn out considering it may have measured up to 8460 pF when new.
But the interesting part is about brightness. The monitor was calibrated for 65 cd/m² brightness a year and a half ago. G2 may have drifted a bit, but not too much. With the capacitor replaced by a new 4700 pF / 2 kV type B ceramic, brightness now reaches up to 125 cd/m² with the very same settings. No wonder the display is blurry, the G2 is too high now. Which means that on top of having crap specifications, that capacitor was also leaky. :facepalm:
 
Is it possible to treat scratches on the special coating with some kind of chemical or spot repair, or just convert to bare glass?
 
It's can't be "repaired". The only thing you can do is removing the entire film, and stay like that or replace with another film. (if you intend to replace, better keep the damaged film on until you can replace it to avoid as much dust/fat/whatever on the glass as possible)

Regarding replacement, I ordered some AR film for my screens; it should be delivered next week.
That's the one I talked about previously. It's the best I could find, but the minimum order is 2 m² and that makes the invoice pretty harsh for someone with a single damaged screen. I'll give some feedback about the final result as soon as possible.

If it performs as well as it should and people are interested, I suppose there could be some kind of collective [H] order(s) so that everyone can obtain a piece of film suited to their needs at a reasonable price. I'll have some extra surface for sale myself, as I won't use everything.
 
Oh wow, I didn't think it could be replaced! That's nice to know. Any other before-and-afters of DIY film replacement?

OK, this thing is heavy so I'm gonna ask first! Can I rotate the base 90 degrees so my keyboard isn't ramming into/interfering with the USB ports?
 
The circle part is actually the front of the base and the USB ports are supposed to point right, so you shouldn't have issues with that.
 