24" Widescreen CRT (FW900) From Ebay arrived,Comments.

Should be pretty good. Its horizontal and vertical refresh rates are identical to the GDM-FW900's.
View attachment 565064
Nice! FD Trinitron too.

One thing to note: while a lot of the 20" monitors spec'd 1600x1200 as the max, that was often only because they were never tested any higher. That's how I found out my LaCie 22" could do 2560x1600 over VGA. From what I remember, most 20"+ tubes could do 2048x1536.
 
That's why I said it would have to be some killer application. I'm not sure what, but Porsche's first car in 1898 was an EV and GM had the EV1 in the 1990s, and both companies thought those were trash technology until Tesla came along; now everyone is falling all over themselves to make EVs again. That's the type of scenario under which I could see CRTs coming back, and I do think it would have to be for a very specific application where money is no object (like being able to see into a working nuclear reactor with your bare eyes, or some craziness like that, where people would pay ridiculous prices because the industry has trillions in it and $20k for a monitor would be a normal expense).
Everyone knows, and always knew, that robots in the future will use a CRT for a head.
1682082457769.png

Apparently using a CRT will give them much-needed e-motional clarity.

Other than that, I cannot think of any good reason for CRT tech to make a comeback 😪
 
Hello! I need a real Sherlock Holmes here, or LAGRUNAUER (or, if you happen to know any CRT wizard, please show them my post).

One component on my FW900 was failing and has now finally failed; I need to locate it.

This CRT is my life; I cannot exist without it, literally.

Key points - present
1) When the monitor does manage to turn on, there are absolutely no issues as long as it is displaying a picture. Once the monitor loses power or goes into standby mode, it stops working and cannot be turned on again.

2) When the monitor does somehow turn on, it flashes green, red, or both, depending on the room temperature.

2a) If it receives a high-resolution picture from the graphics card (higher voltage needed to operate), it flashes green and red, goes into standby mode, and cannot be turned on again. It cannot stabilize.

3) It used to be necessary to keep it at 640x480 until it warmed up.

4) If the back of the monitor is preheated from the sides with a hairdryer and then turned on, there is no green and red flickering and it can instantly operate at higher resolutions with no issues - if it turns on at all.

5) When it cannot be turned on and no graphics card is connected, the LED is green and the monitor goes into standby mode. No OSD menu or green INPUT label is displayed on the screen at all.

6) If a graphics card is connected (to video INPUT 1), after a few seconds the monitor starts blinking orange - no light - orange - no light, indefinitely. In the video below I am switching from INPUT 2 (no source) to INPUT 1 (VGA source). The same happens with BNC sources.
No OSD menu or green INPUT label is displayed on the screen at all.

Key points - before
1) The monitor used to turn on with no issues, but the LED would flash green a few times.

2) The issue started progressing after a few power surges (power outages on the grid): more green flashing, and occasional shutdowns when a high resolution was used (with different pictures displayed) before it had warmed up for a few minutes. This went on for two years, and now it is almost impossible to turn on.

It was doing precisely what is shown in the bottom video from this time stamp.

3) Now, when it is lucky enough to turn on cold (no preheat), it behaves like this.

What I have done already
- I have the schematic
- The monitor is disassembled and checked for cold solder joints and any visual anomaly on the electrical components; no issues found

What is the problem
- I have not the slightest idea which exact components are responsible for this behavior or where they are located (the guy from the video above has not managed to repair his FW900 either).
- I need someone with a good understanding of CRT monitors, or who has encountered similar issues. I have seen a few videos with this exact issue (like the video above), but sadly no solutions, due to amateur repairs.
- I can measure, I can solder, and I can replace components.
- For someone with experience this should be pretty obvious, and that is the kind of help I am hoping for; there is just one thing failing.
 
It's insane how wide the gap between the GDM-FW900 and the C9 is; we had to live through almost 20 years of LCDs before a proper gaming OLED came out.
 
"This CRT is my life I can exist without him, literally. "

I truly understand your desire to keep your unit up and running, but I hope you are just kidding when it comes to your seeming level of anguish. :) A CRT is a meager thing when compared to the best things in life.
 
Hi guys, I'm just curious: how many members do we have here, in total and interacting with this now LEGENDARY thread?
I don't have my GDMs anymore, but I know WinDAS very well and have calibrated my screens many a time, so I like to hang out and help when I can.
 
Hi, I just pulled the trigger on the only FW900 I could find on the European market in near-mint condition. Holy expensive!
Having skimmed through ~50 pages of this impressive thread, some of it has stuck with me and a lot more hasn't.

I will be picking this unit up in a week, yet I am concerned about how to run it properly.
To make sure I do, I need to get some questions and uncertainties out of the way.
I will try to explain how I envision setting up this screen, taking into account what I think I have gathered as someone who never dealt with CRTs except for playing A Link to the Past as a five-year-old... and reading this thread 20 years later.

My current graphics card is an RTX 4080, which features DisplayPort exclusively. This means I have to get an adapter.
Important to me is running the displayed picture at 2304x1440 with a refresh rate as high as possible.
I have read about people driving this resolution at 80 Hz; however, I am not sure whether this is attainable at all using adapters.
Can any of these adapters run the max resolution above 60 Hz?
1. StarTech DP2VGAHD20
2. Sunix DPU3000
3. Delock 62967
If I missed any, I would be most grateful for recommendations!

After this, I'd set my Windows resolution to something lower than 2K that would fit onto the FW900, since I'm about to plug it in - probably won't matter anyway, but just a precaution.
FW900 turned on, my 4080 connected to the FW900 via the adapter (which I have yet to choose); at this point I should see my desktop while the CRT warms up.

I have no idea which resolution I am going to see, but it will probably be something. I have heard of something called EDID, which is transmitted from the monitor to the PC and tells Windows what the monitor can and can't do - correct me if I'm wrong, please. I have also heard of some adapters causing Windows to "play it safe" and force 60 Hz because the EDID is not being correctly relayed.

I expect to be able to change the resolution to 2304x1440 under "NVIDIA Custom Resolution and Refresh Rates" in the control panel, as stated in the original post.
If I am for some reason forced to 60 Hz, I would have to install a program named CRU and overwrite my display settings in the registry with what I desire. This is limited by my adapter's bandwidth(?) and the vertical and horizontal scan rates of the FW900 (both things I know close to nothing about).
It seems like I will have to find out what refresh rate I can drive at 2304x1440 without running into picture problems. I am worried that I could damage the CRT if I mess this up.

Max resolution at a high refresh rate is important to me. Most important to me is the health of this beautiful relic, so please stop me from doing anything idiotic!
I will consider calibration at a later point in time.

After having typed this up, I realize I'm shamefully asking for a run-through... Just throw me a tip here and there, and I'll gladly keep reading through the countless previous pages in the meantime.
It is a true shame that the reference list link is dead. Anyway, I will try to document my experience setting up this impressive dinosaur when the time comes.
 
First of all, congratulations on getting your FW900!

So regarding the adapters, to be able to go up to 2304x1440 at 80Hz you'd need the Sunix DPU3000 / Delock 87685 (rebadged Sunix).
At that resolution, 80Hz is indeed the maximum the FW900 can do. I tend to use both that resolution and 1920x1200 at 96Hz, the latter a bit more often, at least for recent games.

No need to change your resolution before plugging in a new monitor; the resolution you have on your current monitor won't affect the one you just plugged in. Windows will just select a resolution that seems appropriate for the monitor, according to the EDID it got over the VGA port. If you're using the 5x BNC connectors, no EDID info will be communicated to your computer, so Windows will default to a low resolution at 60Hz as you mentioned, likely to be compatible with a wide range of displays. It should work just fine; I've never encountered any issue plugging a monitor into a Windows machine, with or without EDID.
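Side note, since EDID came up: it's just a 128-byte block the monitor exposes over the DDC pins of the VGA connector. Purely as a rough sketch of my own (based on the standard EDID 1.3 layout, function name is just mine, so double-check before relying on it): once you've dumped the raw bytes with whatever tool you like, the part that tells Windows the supported ranges is the "display range limits" descriptor, and reading it looks something like this:

Code:
# Rough sketch: pull the "display range limits" descriptor out of a raw
# 128-byte EDID base block (standard EDID 1.3 layout). `edid` is assumed to be
# bytes you've already dumped with a tool of your choice.

EDID_HEADER = bytes([0x00, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0x00])

def range_limits(edid: bytes):
    if edid[:8] != EDID_HEADER:
        raise ValueError("not a valid EDID base block")
    # Four 18-byte descriptor slots start at offset 54.
    for offset in (54, 72, 90, 108):
        d = edid[offset:offset + 18]
        # A display range limits descriptor is tagged 0x00 0x00 0x00 0xFD.
        if d[0] == 0 and d[1] == 0 and d[2] == 0 and d[3] == 0xFD:
            return {
                "v_min_hz": d[5],          # minimum vertical rate
                "v_max_hz": d[6],          # maximum vertical rate
                "h_min_khz": d[7],         # minimum horizontal rate
                "h_max_khz": d[8],         # maximum horizontal rate
                "max_pixel_clock_mhz": d[9] * 10,
            }
    return None  # some displays/adapters simply don't provide one

If that descriptor never arrives (BNC inputs, or an adapter that doesn't pass EDID through), Windows has nothing to go on, which is exactly why it falls back to a safe low mode.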

Now, since you're using an Nvidia GPU, whether or not you have EDID, you can just go to the Nvidia custom resolution tool and create a custom resolution yourself, using automatic GTF timings (CVT works too). You just have to set the resolution to 2304x1440 (or whatever else you want) and the frequency you want to go for. As long as the adapter allows it, and it is within the specs of the monitor, it should work right away.

As far as I remember, recent Nvidia drivers don't appreciate CRU too much. I might be wrong; someone else please correct me on that if I am.
But if I remember right, I had issues getting CRU resolutions to be recognized by Windows itself when using my RTX 3080.

I need to find my notes on that, but I remember the bandwidth limit of the DPU3000 is around 540MHz, so you have a lot of room to play with using that adapter. The monitor itself doesn't have a real pixel clock limit (I pushed it to like 650MHz or something like that once), but it has horizontal (121.99kHz) and vertical (160Hz) frequency caps. You can see those frequencies calculated in both CRU and the Nvidia tool when you enter a custom resolution. Just make sure the values get updated in the Nvidia tool when you enter new numbers; you might have to click elsewhere in the Nvidia window to get the values to refresh after you change them. You'll see it for yourself.

Basically, let's say you want to do 1920x1200: you enter this resolution in the Nvidia tool, with GTF timings. It will show you that at 60Hz you have 74.52kHz and a pixel clock of 193.1558MHz, so you're far from the 121.99kHz limit of the FW900 and the 540MHz limit of your adapter. So you decide to enter 100Hz; now you see you would be at 127.10kHz and 337.5776MHz, above the 121.99kHz limit. Your monitor can't go that high. You'll see that at 96Hz you have 121.73kHz, which is within the monitor's specs. So you can use that resolution.

The same thing goes for higher resolutions. Let's say you're curious and want to go crazy and try 2880x1800: you will see that at 65Hz vertical you're getting 121.42kHz horizontal and a 481.7946MHz pixel clock using GTF timings. That's fine for your monitor too, but the pixel clock gets really high now. If you're using the DPU3000 this will work (I just tried it, I can confirm)!
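By the way, the math above is just the standard GTF formula that the Nvidia tool and CRU are applying. If you want to sanity-check a mode on paper before punching it in, here's a rough Python sketch I threw together (my own, with the default GTF constants as I remember them from the spec, the function names are just mine, and the limits are the ones quoted above, so treat it as an estimate rather than gospel); the numbers it prints should land very close to what the tools show:

Code:
# Minimal GTF timing calculator (default GTF constants, progressive, no margins).
# Sketch for sanity-checking custom resolutions against the FW900 limits quoted
# in this thread (121.99 kHz horizontal, 160 Hz vertical) and a ~540 MHz adapter.

def gtf_mode(h_pixels, v_lines, refresh_hz):
    CELL = 8             # pixel cell granularity
    MIN_PORCH = 1        # minimum front porch, in lines
    MIN_VSYNC_BP = 550   # required vsync + back porch time, in microseconds
    C_PRIME, M_PRIME = 30.0, 300.0  # derived GTF blanking constants

    h_round = round(h_pixels / CELL) * CELL
    # Estimate the horizontal period (us), then refine it
    h_period_est = ((1.0 / refresh_hz) - MIN_VSYNC_BP / 1e6) / (v_lines + MIN_PORCH) * 1e6
    vsync_bp = round(MIN_VSYNC_BP / h_period_est)
    total_lines = v_lines + vsync_bp + MIN_PORCH
    v_rate_est = 1e6 / (h_period_est * total_lines)
    h_period = h_period_est * v_rate_est / refresh_hz

    # Horizontal blanking from the GTF ideal duty cycle
    duty = C_PRIME - (M_PRIME * h_period / 1000.0)
    h_blank = round(h_round * duty / (100.0 - duty) / (2 * CELL)) * (2 * CELL)
    total_pixels = h_round + h_blank

    return 1000.0 / h_period, total_pixels / h_period  # kHz, MHz

def check(h, v, hz, h_limit=121.99, v_limit=160.0, clock_limit=540.0):
    hf, pc = gtf_mode(h, v, hz)
    ok = hf <= h_limit and hz <= v_limit and pc <= clock_limit
    print(f"{h}x{v}@{hz}Hz -> {hf:.2f} kHz, {pc:.1f} MHz: {'OK' if ok else 'out of range'}")

check(1920, 1200, 60)    # ~74.5 kHz, ~193 MHz
check(1920, 1200, 96)    # ~121.7 kHz, just inside the cap
check(1920, 1200, 100)   # ~127.1 kHz, too fast for the FW900
check(2304, 1440, 80)    # the max-resolution sweet spot
check(2880, 1800, 65)    # high pixel clock, still under ~540 MHz
check(1024, 640, 160)    # max vertical refresh territory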

I would say you won't risk damaging your CRT by fiddling with custom resolutions, as the firmware of the CRT itself will not allow any resolution above those limits to be displayed. I've tried some extremes myself already (4K 60Hz on an F520, 240p 420Hz on an Iiyama 514) without any issues so far. But I'll let someone else comment on that.

The maximum refresh rate you can achieve on the FW900 is 160Hz, and that would match a resolution a bit below 720p. If you stay around a 16:10 aspect ratio, you can do something like 1024x640 at 160Hz; that's going to be 112.48kHz, so you even have a little room to go higher. That's what's amazing about these monitors: you can use absolutely whatever resolution and refresh rate you like, it will always look super clean, and you can run weird resolutions that are not common (as long as the games support it, but that's another issue haha).

It's late where I live, so I hope I didn't write anything wrong LOL. But here are my two cents. Hope it helps you better understand things so you can soon set up your FW900!

And also, one question: do you know if it still has the anti-glare film, or has it been removed by the previous owner? If you still have it and intend to keep it, be careful cleaning the screen! I don't have the film on mine anymore, so I don't remember exactly what the best way to clean it is, but it can be easily damaged as far as I remember. People here will have mixed opinions on what's better, with or without the film, but if you have it and it is still in great condition (no scratches or blemishes), I'd say try to keep it in good shape.
 
On another subject, I was finally able to get my hands on a replacement compatible flyback for my FW900.

The original flyback from the FW900 (G1W chassis) is the NX-4504//J1D4 (1-453-348-11).

I remember some talk about the flyback from the Sony G1 chassis being compatible and working flawlessly. Those are the references I noted (all being the same flyback).
NX-4502//J1D4 (X-4560-175-1, 8-598-827-00, 8-598-827-10)

I managed to order one recently, I'm waiting for it to arrive. It was pulled from a poor IBM P260 monitor which had its tube damaged in transport, unfortunately. So it's clearly used but technically working.

My FW900 is working fine at the moment, but years ago the flyback started showing signs of weakness, and it scared me at the time. Fortunately the issue looks to be gone now, but I've been hunting for a replacement since then. So the day something wrong happens to it, I'll have a replacement ready. Hopefully that day does not come anytime soon.

That said, I was curious to know if the person who tried that is still around, and how the monitor is doing after all this time. Everything still going perfectly with the replaced flyback?
 
Thank you so much for the insight, etienne51!
As for your question, the unit still has the anti-glare film. I will look into how to clean it properly, sure enough; thanks for the heads-up.
I will be sure to post an update when the day comes around :)
 
I suggest not cleaning the screen too often, but rather regarding it as a special event, and when you do, treating it as a coated camera lens: no cleaners with an anti-static solution, which the manual warns will scratch, and use a lens-cleaning cloth with the lens cleaner (which I would afterwards discard for a fresh one).

With that, the anti-glare lasted on my first FW900 without any sign of damage.

EDIT: Also, please make sure any speakers you have near the screen are truly magnetically shielded. (Can't trust manufacturers' claims regarding this at this point.)
 
I've been using soap on my LaCie for close to 10 years now. It's always super mild castile soap, and just a small amount for loosening oil and dirt
 
About that mysterious LK7112 HDMI chipset that can do 395 MHz: originally these appeared in some older versions of an adapter from "Gembird".

Looks like a guy in Turkey cracked open a bunch of converters to find which ones had the chipset, and he resold the ones with a LK7112 inside a 3D printed shell:

https://www.ebay.com.my/itm/385652361429

Pretty awesome service to the community. Looks like he's sold out, though.

Derupter, I don't think you added this one to your master list?

Unfortunately I can't even find any sort of information on it, like a datasheet from the manufacturer. Like tears in the rain.

s-l1600.png
 
About that mysterious LK7112 HDMI chipset that can do 395 MHz...

Added this and other info to the list
 
Today I had a chance to speak on the phone with a CRT repair tech about running a CRT monitor beyond spec. This is what he told me: "Yes, you can run a CRT monitor above its rated specs, such as higher resolution and refresh rate." He prefaced his answer with, "It will shorten the life of the CRT by causing above-normal heating of the yoke, and the caps will also dry out faster." That said, your mileage will vary according to the current state of the monitor.
 
There's no chance of any manufacturer showing off the best professional CRT ever made, with HDR and Adaptive-Sync/G-Sync Ultimate. We'll have to settle for the FW900 for eternity, which is already the best from the consumer market.
 
Today I had a chance to speak on the phone with a CRT repair tech about running a CRT monitor beyond spec...

The thing is, you don't have to run it over spec all the time. Maybe just for a particular game you really like, once in a while.
 
Today I had a chance to speak on the phone with a CRT repair tech about running a CRT monitor beyond spec...
Funny answer. :LOL:

I don't see any reason why a simple coil running at a slightly higher temperature would shorten the life of the tube. As for capacitors drying out faster, that's plain bullshit. BUT the electronics before the tube may not like higher resolutions or refresh rates, because they could simply be off spec for the various semiconductors in the signal path (like the amplifiers). That may degrade signal quality, or cause excessive heating of said amplifiers and their premature death. That is provided the monitor agrees to display off-spec pictures at all, which is not a given in the case of Trinitrons.
 
Will the arrival of newer and more plentiful OLED monitors make the prices of high-end CRTs drop? Barring motion clarity, a fast OLED monitor is an upgrade even coming from good CRTs.
 
Every time I look at the price of my 21" Diamondtrons on ebay, they've gone up by another $100
They can ask whatever they want, but I doubt actual buyers exist. Until now, some people still preferred CRTs because of the superior uniformity, colors, black levels, low display lag, and motion clarity. But now that OLED monitors offer similar benefits barring motion clarity (which should just be a matter of time), what reason is there for CRTs to command such prices?
 
Will the arrival of newer and more plentiful OLED monitors make the prices of high-end CRTs drop? Barring motion clarity, a fast OLED monitor is an upgrade even coming from good CRTs.
Doubt it. CRT still has its place and its fans will be fanning away, baby. That being said I'd get one. :)
 
So, I finally have it. 830-mile trip :confused:
As promised, I wanted to post an update.
I will abstain from pictures; they would just be pictures of an FW900 with a different flavor of desk and room than the other FW900 pictures ;)
The unit seems flawless externally. Every square inch of anti-glare is pristine, except for a finger-smudged corner I will not wipe down until I've got some good-quality microfiber. The housing is pristine too.

It runs over: DP to miniDP > Delock 87685 > VGA cable,
which seems... fine? I can display the max resolution at 80 Hz, however with periodic trembling that becomes most apparent on thin lines. It's not continuous, but happens in a "pulsating" manner every 10 minutes or so.
During this phenomenon, the perceived jitter seems to come from the picture moving up and down by a very small amount before coming to rest. When moving down, it leaves a white line at the top.
I cannot source this disturbance; however, running at 1920x1200@96Hz made it less frequent, and it has completely gone away at 1920x1200@92Hz (-4 Hz).

At high brightness + contrast (it happens with either, probably to do with total display brightness), the middle of the screen "selectively" gets out of focus in a strange manner. If you look at a thin horizontal line, it starts to "split" into two parallel lines as it goes from the sides into the central field of vision, before merging again as it approaches the other side of the monitor. It is only noticeable with dark lines on bright backgrounds; the effect does not happen with bright lines on dark.
The space between the split lines increases with increasing brightness settings, and starts to disappear when decreasing them. Fortunately, there is a point (27 brightness, 66 contrast) where it's gone. It doesn't affect verticals.
The entire effect is only visible on very thin lines, can be countered, and seems negligible in itself. First and foremost, it is just strange.

That aside, I am amazed at what I see, coming from a lifetime of LCD use. Amazed at how even low resolutions simply look good. The motion clarity.
And at last, the colours. This thing needs calibrating. I run it at 25 brightness to have good black; if I go brighter I see a red/green tinge I can only combat so much with visual tuning.
Image restoration and a little eye-measurement has left me at a point that looks good. Stunning, even.
I have a DTP94 and TTL cables + adapter on the way, and I am stoked to suffer through WinDAS.

All in all, when it comes to the FW900, I can see what you guys are all about. You're onto something, big time, with this monitor :)
 
So, I finally have it. 830-mile trip :confused: [...]
I have a DTP94 and TTL cables + adapter on the way, and I am stoked to suffer through WinDAS.
Take. Your. Time. Really. Relax and take your time. You're not going to get it the first go round. Congratulations though!
 
All in all, when it comes to the FW900, I can see what you guys are all about. You're onto something, big time, with this monitor :)

Don't forget: now that you don't have G-Sync, you've got to use vsync to get proper motion clarity (frame rate = refresh rate).

There are a few ways to get low-lag vsync:

Latent Sync in Special K. This even includes something called "delay bias", where in games that have some GPU overhead you can get lower lag than with vsync off!

Then there is Scanline Sync in RTSS. This is essentially vsync off, except the software keeps the tearing line in the blanking interval, so it's not visible.

Then there is always just using in-game vsync combined with a frame rate cap, which can be a good method depending on the game engine.
 
Just picked up a 22in HP P1230 Diamondtron as a personal monitor. You can really tell the difference when you have an aperture grille and a shadow mask sitting next to each other. I've got an excellent 21in Cornerstone p1500 shadow mask next to the HP P1230. The Cornerstone is the definition of a "text monitor": the tight .22mm dot pitch makes working and reading text much easier on the eyes. By contrast, the HP P1230 with its Diamondtron tube and .24mm pitch is less of a performer when it comes to text, but much more vivid when playing games. I know I could probably give the colors of the shadow mask a little more "punch" using NVCP should I choose, but it's rather inconsequential as I prefer my colors more muted and natural. Here are some pics of the HP P1230.
 

Attachments

  • 20230627_135048.jpg (327.6 KB)
  • 20230627_155330.jpg (369.2 KB)
  • 20230627_155423.jpg (335.7 KB)
colors of the shadow mask a little more "punch" using NVCP should I choose,

Actually I'm pretty sure this would give you banding, since you'd most likely be using 8-bit color, and all those 256 steps of R,G, and B are already accounted for. So any adjustments would cause overlap between colors that are supposed to be separate.
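Just to put a rough number on that (a quick sketch of my own, nothing rigorous, and the helper name is just mine): if you push every 8-bit code through a gain tweak and re-quantize, you can count how many of the 256 steps survive. A gain below 1 makes codes that used to be distinct collide, a gain above 1 clips the top and leaves holes in the range, and both show up as banding:

Code:
# Quick illustration of why post-hoc "punch" on an 8-bit signal causes banding:
# re-quantizing to 8 bits after a gain tweak either merges neighbouring codes
# (gain < 1) or clips and skips codes (gain > 1).

def remap(gain):
    """Apply a gain to every 8-bit code and clamp back into 0-255."""
    return [max(0, min(255, round(v * gain))) for v in range(256)]

for gain in (0.85, 1.15):
    out = remap(gain)
    survivors = len(set(out))                        # distinct output codes used
    merged = 256 - survivors                         # inputs that now collide
    skipped = (max(out) - min(out) + 1) - survivors  # holes inside the used range
    print(f"gain {gain}: {survivors}/256 distinct levels, "
          f"{merged} inputs merged, {skipped} output codes skipped")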
 
Actually I'm pretty sure this would give you banding, since you'd most likely be using 8-bit color, and all those 256 steps of R,G, and B are already accounted for. So any adjustments would cause overlap between colors that are supposed to be separate.
Good info! Thanks. I'm running the HP P1230 at 2880x2160@60Hz just for a couple of sessions of Metro Exodus. Very nice!
 
So, I finally have it. 830-mile trip :confused: [...]
I can display the max resolution at 80 Hz, however with periodic trembling that becomes most apparent on thin lines. [...] Running at 1920x1200@96Hz made it less frequent, and it has completely gone away at 1920x1200@92Hz.
1920x1200 @ 96Hz is the most I was able to get out of the FW; 97Hz was out of scan range. It sounds like your FW was simply crying for help at 96, which is why it is behaving better at 92. Of all the different resolutions I've tested, I finally settled on 2235x1397 @ 83Hz using Custom Resolution Utility, which still works fine with Nvidia. I run mine with the StarTech DP2VGAHD20, but I also have the Delock 87685 sitting in a box. I bought a spare StarTech just in case, because I am paranoid these will eventually stop existing lol.
 
So, I finally have it. 830-mile trip :confused:

If I may ask, what was the price? I am thinking about selling mine in the same condition - 1st owner, flawless AR, overall pristine condition without a scratch. I am in the EU but could possibly ship anywhere in the world.
 
If I may ask, what was the price? I am thinking about selling mine in the same condition - 1st owner, flawless AR, overall pristine condition without a scratch. I am in the EU but could possibly ship anywhere in the world.
I've bought a few in the last 8 months and paid between 1,000 and 2,500 USD. Going off recent eBay listings, pristine ones (or near enough) have sold for around 2,800.
 
Actually I'm pretty sure this would give you banding, since you'd most likely be using 8-bit color, and all those 256 steps of R,G, and B are already accounted for. So any adjustments would cause overlap between colors that are supposed to be separate.
Thanks, sir! I am not as well versed in this subject as I'd like, so it's always good to pick up a bit of knowledge from someone like yourself. :)
 