24" Widescreen CRT (FW900) From Ebay arrived,Comments.

This doesn't have much relevance, other than you guys were talking about old stuff and fans:

Remember, the Lord of Apple decided that even a single fan in an Apple III would be aesthetically displeasing. If comparable BS were to happen today, Apple fanatics would just get more powerful ACs for their Apple Huts.

IT DID HAPPEN TODAY! The trash can Mac Pro only has one fan and runs hotter than the sun. It was polarizing and controversial, and they are supposedly going to roll out a new design that ISN'T a trash can Mac Pro, IF they don't kill off the Mac Pro altogether.
 
I did, it's in my wall of text that you apparently missed :p.

Unless you're insinuating 90 Hz is different from the 95 Hz that I tested? Regardless, both work fine and input latency is exactly the same as VGA direct out. I'm able to hit resolutions even higher than 1200p, actually. I'll see if I can post a video.

Since you wrote "1900 by 1200" (you did not put the "2" after the "9", but now I understand it was a mistake, no prob ;) ).
I said 90 Hz because that's the sweet spot to me between a good resolution and a good refresh rate on the FW900.

Okay guys, moment of truth:

First image - my FW900 with the adapter, running at 1920 x 1200 @ 95 Hz.
View attachment 96015

Second image - both my FW900s (one VGA direct out and the other with the adapter) running at 1900 x 1200 @ 85 Hz (Windows would only let me duplicate at this refresh rate).
View attachment 96014
Video - taken after the second image, running a stopwatch test in slow motion to show zero input latency difference between the two monitors. Compared to my laptop's LCD, both displays are approximately 15-30 ms faster. I'm too lazy to upload the slow-mo videos comparing each FW900 with the laptop's LCD, so you're going to have to take my word for it.


It's attached, and link is also here: https://imgur.com/P4HvfcR

Edit - maybe next week I can provide an input lag comparison between my FW900 and my PG279Q @ 165 Hz. I'd have to downclock both to the lowest common denominator for refresh rate and resolution though.
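For anyone who wants to sanity-check numbers like these themselves, the arithmetic is just frame counting. A minimal sketch, with made-up frame counts and camera speed rather than the values from my footage:

```python
# Hypothetical numbers for illustration - read the real frame indices off
# your own slow-mo footage of the two stopwatches.

def lag_difference_ms(frames_between_updates, camera_fps):
    """Latency gap between two displays filmed side by side: how many slow-mo
    frames pass between the faster display showing a new stopwatch value and
    the slower one catching up."""
    return frames_between_updates / camera_fps * 1000.0

print(lag_difference_ms(0, 240))   # 0.0 ms - the two outputs update together
print(lag_difference_ms(4, 240))   # ~16.7 ms - e.g. an LCD trailing a CRT
```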

Many thanks for this. However, it worries me that I see some waves and a somewhat blurry image in the first picture you posted. Is that normal, or maybe a camera effect and focus? (Hope so.)

It would be interesting to see the FW900 vs PG279Q comparison.
 
Many thanks for this. However, it worries me that I see some waves and a somewhat blurry image in the first picture you posted. Is that normal, or maybe a camera effect and focus? (Hope so.)

It would be interesting to see the FW900 vs PG279Q comparison.

Yes, that is my camera lens distorting the image; the geometry isn't actually that bad. This is a common issue with most cell phone cameras. https://talk.sonymobile.com/t5/Xper...as-serious-distortion-issue/td-p/1232765#gref
 
I see that the adapter you use has a 5V input. Did you need to use that, or did it just work without that additional voltage?
 
I see that the adapter you use has a 5V input. Did you need to use that, or did it just work without that additional voltage?

It's needed to do the actual conversion. I think DisplayPort adapters don't need a separate power input because DisplayPort already provides the 5 V but HDMI doesn't. I could be wrong though.
 
The BIAS/GAINS control on the OSD is there to accurately white point balance/color calibrate the display using any commercial color calibration system. It gives the user independent and accurate control of each color gun in a defined color space at D93, D65 and D50 targets. However, these controls cannot be independently adjusted while performing the white point balance via WinDAS and/or WinCAT because during the process, both programs lock the OSD, and the process is performed via EEPROM and not via the OSD.

If, during a white point balance using any commercial color calibration system (software calibration), the gun(s) reach 100 on either BIAS/GAIN, then the unit needs hardware white point balance calibration via WinDAS and/or WinCAT. Now, if after the hardware white point balance the unit still reaches 100 on either BIAS and/or GAINS via OSD adjustment, then the gun(s) are bad, the tube cannot achieve accurate white point balance, and the tube may be near the end of its life. In addition to this, the luminance parameter can be adjusted via WinDAS, but if during the process the unit adjusts above 200 (0-255 on the WinDAS adjustment scale) on the second white background pass, then the tube's emission is pretty low and the tube may have reached the end of its life.
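To make those rules easier to follow, here is the same logic sketched in Python; the thresholds (100 on the OSD BIAS/GAIN scale, 200 on the 0-255 WinDAS luminance scale) come from the description above, while the function and argument names are invented for illustration.

```python
def diagnose_tube(osd_bias_gain, after_hardware_wpb, windas_luminance_pass2=None):
    """Rough paraphrase of the rules above, not an official Sony procedure.

    osd_bias_gain: per-gun OSD values reached during software calibration,
                   e.g. {"R": 62, "G": 100, "B": 55}
    """
    gun_maxed = any(value >= 100 for value in osd_bias_gain.values())
    if gun_maxed and not after_hardware_wpb:
        return "needs hardware WPB via WinDAS/WinCAT"
    if gun_maxed and after_hardware_wpb:
        return "gun(s) bad - tube may be near the end of its life"
    if windas_luminance_pass2 is not None and windas_luminance_pass2 > 200:
        return "emission low - tube may be near the end of its life"
    return "within adjustment range"

print(diagnose_tube({"R": 62, "G": 100, "B": 55}, after_hardware_wpb=False))
```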

There are many other factors that will directly affect the life of the tube; however, contrary to many beliefs, in my professional opinion, running the unit at 100% contrast or less will not affect the life of the tube, as the Trinitron CRT is designed to sustain such a level of contrast. If the monitor is hardware calibrated via WinDAS (and/or WinCAT) every six months, then the life of the tube can be extended for a very long time. Proof of that... I own, among many others, a pair of GDM-FW900s with manufacturing dates of 2001 that are as bright and sharp as the day I bought them and look pretty much new, because I perform hardware calibration via WinDAS every six months. It's like owning a high performance vehicle and performing a tune-up and regular maintenance every six months...

On cathode poisoning... I never experienced this scenario. If, on my two-CRT setup, I am using one of the monitors to watch a movie or play a game and do not want to be disturbed by the other unit, I simply turn it off.

Hope this helps...

Sincerely,

Unkle Vito!

Does this still hold true Unkle Vito?
 
I bought some more cheap cables from Amazon, this time DisplayPort to VGA instead of HDMI. I'll report back on performance next week if I get the time.
You were pretty lucky with the HDMI to VGA one; I doubt you'll get a result as good with a cheap DisplayPort to VGA converter. Most of them have a pixel clock capped at 200 MHz and image quality issues. That's why we had to look for the two Sunix or Delock models; they are pretty much the only ones with proper performance levels.
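For a rough sense of why a 200 MHz cap matters for FW900 modes, here is a back-of-the-envelope pixel clock estimate. The ~30% horizontal / ~4% vertical blanking figures are only ballpark numbers for CRT-style (GTF-like) timings, not the exact timings any particular adapter uses:

```python
def approx_pixel_clock_mhz(h_active, v_active, refresh_hz,
                           h_blank_frac=0.30, v_blank_frac=0.04):
    """Very rough pixel clock estimate assuming GTF-like blanking overhead."""
    h_total = h_active * (1 + h_blank_frac)
    v_total = v_active * (1 + v_blank_frac)
    return h_total * v_total * refresh_hz / 1e6

print(approx_pixel_clock_mhz(1920, 1200, 95))   # ~296 MHz - well past a 200 MHz cap
print(approx_pixel_clock_mhz(2560, 1440, 75))   # ~374 MHz
```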
 
Does this still hold true Unkle Vito?
If it was true for fifteen-plus years, then it is most probably still true...
...but consider this: the six-month interval was derived from using the monitor 100% of the time you use the computer, and that should not be the case today. If you mostly use the computer for web browsing and other text/desktop tasks, you should not use the FW900 for that, as it is terrible at it; if you care about the monitor's longevity, you should limit its usage to the bare minimum: gaming.

Other than that, whether the whole WPB thing is really so beneficial for the monitor is debatable anyway. What it most probably does is set what voltage values the OSD settings correspond to, and while doing WPB the tube is switched on and off many times, which does not look or sound all that healthy...

So considering all that, I would rather recommend doing it once every two or three years to correct things like the G2 voltage (for which, BTW, you do not need to do a WPB and thus do not need to own a calibration probe), and doing a normal OSD calibration once every few months instead.
 
But fortunately, given the continuous flow of bullshit you keep writing, I hope nobody will take your recommendations too seriously.

If you knew anything about WPB you would not pretend the tube is "switched on and off many times" in the process... And setting the G2 alone by manually editing values works against proper operation of the monitor (as well as being a terrible idea for anyone caring about color accuracy, BTW).
 
Whoa whoa, can we chill with the hostility? We're a small bunch - a dying breed - let's all just get along :)
I once said WPB is bullshit and that one needs to put a proper coating on the CRT for best image quality, and that WPB was unimportant, which went against the notions that "removing the AG made the FW900 brilliantly perfect" and "doing WPB makes CRT black look Vantablack".

People here do not know how CRTs are controlled, never gave any thought to what WPB can actually do, and believe in magic; hence the 'conflict'.

PS. Oh, and did I mention I dared to point out flaws of CRT tech and that in some respects some LCDs are superior? That sure must have driven some people into mindless overdrive...
 
People here do not know how CRTs are controlled, never gave any thought to what WPB can actually do, and believe in magic; hence the 'conflict'.
Oh, then can you please share some of your genius thoughts with us unworthy ignorants and explain what you think WPB does? It's going to be interesting... :cool:
 
It stands for White Point Balance. It's the procedure used to set the color/G2 factory settings of the monitor using a colorimeter, the 4-pin interface at the back of the monitor, and WinDAS.
 
Oh, then can you please share some of your genius thoughts with us unworthy ignorants and explain what you think WPB does? It's going to be interesting... :cool:
It does absolutely nothing for my FW900, as it does not need it and won't need it for years to come.

I do have a Dell P1110, though, whose green gun somehow needs the GAIN/BIAS setting at 100 while red and blue are set near 0 for good color. That one will get a WPB, because it will soon go out of the range where it can be calibrated. Ultimately, if I did the WPB now, while still within calibration range, the guns would get exactly the same voltages and the monitor would have the same image quality.

The values in the EEPROM that WPB changes can relate to the ranges of BIAS/GAIN/BRIGHTNESS/CONTRAST in one of two ways:
1. They control different regulators on the board than the OSD settings do, and thus set the ranges in a more "analogue" fashion.
2. They are just factored into the calculations performed by the microprocessor <- hint: the cheaper and more sane solution.

In either case the most important part of the CRT monitor, the electron guns, gets the same set of voltages and will produce identical image quality, dependent only on their wear and not on how the control voltages are derived. The last transistor, the one which actually drives the gun and is most likely to fail, is also controlled in the same way.
The rest is low-power components that are unlikely to fail, and thus irrelevant, even if in case no. 1 doing a WPB can shift some stress from one part of the electronics to another.

A CRT dies regardless of whether someone does a WPB every six months or just sets G2 once every few years and calls it a day. This part of the electronics has the simplest and easiest job; it is relatively low power and low voltage, unlike e.g. the flyback transformer, sawtooth wave generator and coil drivers, i.e. the stuff that actually does the heavy lifting.
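To illustrate option 2 above (and why, on this view, the guns end up seeing the same drive either way), here is a toy model. The numbers and names are invented; nothing here is taken from the FW900 service data:

```python
def gun_drive(osd_setting, eeprom_range):
    """Map a 0-100 OSD setting into whatever range the EEPROM calibration set."""
    low, high = eeprom_range
    return low + (high - low) * osd_setting / 100.0

# Two different (EEPROM range, OSD setting) pairs that reach the same drive level,
# so the gun itself is stressed identically in both cases:
print(gun_drive(80, (0.0, 1.0)))   # 0.8
print(gun_drive(50, (0.3, 1.3)))   # 0.8
```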

I believe you and the people who claim that after WPB you see an improvement in colors, because I know how sight works and how placebo works. I can control the vibrancy of colors with my will as I would move a hand. I can control them from the range of normal colors to the point where I experience colors with every sense (it's called synesthesia) and colors are literally glowing orgasmic pleasure and look magical. You and the likes of you who do not know this and cannot control it fall into all sorts of placebos. It doesn't take much of this to make you see and feel that an incredible improvement happened when in fact nothing happened. Oh, and I learned this because I did not believe that sight cannot be improved, or that all there is to it is the 'factory setting' being the best one or the one you should use. You people are all about factory settings, factory coatings and factory everything. One could argue your whole existence is a factory one, and because of that you like the idea so much :ROFLMAO:
 
Nice, we've had a good laugh. :LOL:

Now, let's cut all the parts about people being idiots, magic, placebo, and false notions about basic electronics, and explain things straight and quickly.

The tube of a CRT display is controlled by electronics, and the signal from the VGA input goes through electronic components which do not all have exactly the same values from board to board. The primary goal of WPB is to compensate for this and obtain the same display result on all monitors (within a tolerance) by setting some voltage values using a feedback control loop. That feedback loop is materialised by WinDAS and a colorimeter.
That makes the procedure primarily useful when a monitor is assembled or after boards are replaced/repaired.

BUT with time some components age and drift, and some voltages drift significantly as well, like the G2 for instance. WPB is also useful for fixing this, as it allows all these voltages to be set again according to the current state of the electronics, not the state of 15 years ago.
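As a rough illustration of that feedback control loop (not the actual WinDAS procedure), here is a sketch; measure_xy() and write_gains() stand in for the colorimeter reading and the EEPROM write, and the crude xy-to-RGB correction is only there to show the shape of the loop:

```python
D65 = (0.3127, 0.3290)   # target white point chromaticity

def white_point_balance(measure_xy, write_gains, gains, tol=0.002, step=2000.0):
    """Iterate until the measured white point is within tolerance of D65."""
    for _ in range(200):                      # safety bound on iterations
        x, y = measure_xy()
        err_x, err_y = D65[0] - x, D65[1] - y
        if abs(err_x) < tol and abs(err_y) < tol:
            return gains                      # converged
        # crude heuristic: too blue -> raise R and G, lower B (and vice versa)
        gains["R"] += step * err_x
        gains["G"] += step * err_y
        gains["B"] -= step * (err_x + err_y)
        write_gains(gains)
    raise RuntimeError("did not converge - gun(s) may be out of range")
```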
 
OK, so maybe let's try a different approach...
Do you think the G2 setting in WinDAS and BRIGHTNESS and BIAS in the OSD control different things, or is it the same voltage?
If they are different, is the difference constrained to the internal electronics only, or does it actually influence the voltages going to the tube? I mean, for example, if I make two extreme settings with the same arbitrary black level in both scenarios, one with low G2 and high BRIGHTNESS/BIAS and the other with higher G2 and low B/B, would the tube be driven differently or the same?

If such a difference exists, it should show up when measuring the gamma response with a colorimeter.

So how do you think it goes in this example?
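A sketch of what that colorimeter check could look like: measure a few gray steps for each of the two settings and compare the fitted gamma. read_luminance() is a placeholder for whatever the calibration software exposes, not a real API:

```python
import math

GRAY_STEPS = (32, 64, 96, 128, 160, 192, 224)

def effective_gamma(read_luminance):
    """Average per-step gamma from normalised input/output pairs."""
    black = read_luminance(0)
    white = read_luminance(255)
    gammas = []
    for level in GRAY_STEPS:
        y = (read_luminance(level) - black) / (white - black)   # normalised output
        x = level / 255.0                                       # normalised input
        gammas.append(math.log(y) / math.log(x))                # assumes y > 0
    return sum(gammas) / len(gammas)

# Run once with low G2 + high BRIGHTNESS/BIAS, once with high G2 + low B/B;
# if the tube really is driven differently, the two curves should not match.
```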
 
You don't need to have an opinion about that, it's just a FACT.

If you know how to read an electronic diagram, just have a look at the schematics of the A board in the FW900; the answer is straightforward. The G2 is a separate voltage input to a specific pin on the neck of the tube, while all the other settings like R, G, B bias, contrast and brightness are successive modifications of the VGA signal through several integrated circuits.
 
You don't need to have an opinion about that, it's just a FACT.

If you know how to read an electronic diagram, just have a look at the schematics of the A board in the FW900; the answer is straightforward. The G2 is a separate voltage input to a specific pin on the neck of the tube, while all the other settings like R, G, B bias, contrast and brightness are successive modifications of the VGA signal through several integrated circuits.
Fair enough.
What are the other settings in WinDAS that influence what goes into the tube?

Friends, friends. Let us all come together in harmony and enjoy a toast to our beloved cathode ray tubes!
Someone please give this man a vodka shot :)
 
Alright gents, the input lag difference between my PG279Q and my FW900 is approximately 16 ms in the FW900's favor, with the PG279Q trailing behind.
 
Alright gents, the input lag difference between my PG279Q and my FW900 is approximately 16 ms in the FW900's favor, with the PG279Q trailing behind.
16 ms seems pretty high, too high in fact, which suggests the PG279Q was not set to the optimal mode in which it should be used (2560x1440 @ 144 Hz).
An LCD panel driven at different refresh rates might exhibit changes in color, and because of that, when there is a refresh rate mismatch the incoming frame must be buffered. In real life this is not an issue, because these monitors are never used at 60 Hz or non-native resolutions.

What was the testing methodology?
 
You're right, it wasn't set to its optimal setting, since to do the test I had to set it to the lowest common denominator - 2304 by 1440 @ 85 Hz. However, even so, I don't think 16 ms is unreasonable at all. Very rarely have I seen an LCD with an input latency of less than 10 ms (via Leo Bodnar).
 
Alright gents, the input lag difference between my PG279Q and my FW900 is approximately 16 ms in the FW900's favor, with the PG279Q trailing behind.

Thanks for the report. If you want, can you please do the test with both monitors set to 2560x1440 @ 75 Hz? That way there should not be the supposed buffering due to refresh rate mismatch, and the PG279Q would be at its native resolution.
(I know the FW900 supports that setting progressive; I have been able to set it from the NVIDIA control panel or Custom Resolution Utility.)

Just out of curiosity ;).
 
Thanks for the report. If you want, can you please do the test with both monitors set to 2560x1440 @ 75 Hz? That way there should not be the supposed buffering due to refresh rate mismatch, and the PG279Q would be at its native resolution.
(I know the FW900 supports that setting progressive; I have been able to set it from the NVIDIA control panel or Custom Resolution Utility.)

Just out of curiosity ;).

The reason I couldn't do that initially is that my 1080 Ti doesn't support analog out, and my DAC adapter doesn't go all the way to 2560 x 1440. But thinking about it now, I can look for a miniDP to DP cable and see if I can use my laptop with a GTX 660M and VGA out for the test. I'll report back whenever I get the chance.
 
Thanks for the report. If you want, can you please do the test with both monitors set to 2560x1440 @ 75 Hz? That way there should not be the supposed buffering due to refresh rate mismatch, and the PG279Q would be at its native resolution.
(I know the FW900 supports that setting progressive; I have been able to set it from the NVIDIA control panel or Custom Resolution Utility.)

Just out of curiosity ;).
The GPU might not do buffering, but the monitor will, so any such comparison is invalid.

The only viable input lag testing methodology for games would be a very high refresh rate camera and counting frames from pressing the mouse button (with an LED indicator) to the actual action on-screen, using modes that are actually used during gaming.
Something tells me the G-Sync monitor would actually be faster, especially a V-Synced CRT @ e.g. 80 Hz vs a G-Sync monitor with a frame rate limiter like RTSS.
In strobed mode the LCD would certainly be slower than the CRT.
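In code form, that methodology boils down to counting frames between two events on the high-speed footage; the frame numbers and camera speed here are placeholders:

```python
def click_to_photon_ms(led_frame, screen_frame, camera_fps):
    """Frames from the mouse-button LED lighting up to the on-screen reaction."""
    return (screen_frame - led_frame) / camera_fps * 1000.0

# e.g. LED lights at frame 1200, the action shows at frame 1230, filmed at 960 fps:
print(click_to_photon_ms(1200, 1230, 960))   # ~31 ms end-to-end
```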
 
especially a V-Synced CRT @ e.g. 80 Hz

This is incredibly game dependent. Like in recent Battlefield games, you can set the internal frame limit to match your refresh rate, and combining this with vsync gives noticeably less lag than vsync with no frame limit. In fact, it feels pretty damn close to vsync off. I've read that this trick works in other games by precisely setting a limit via RTSS.

So yeah, if you're going to test vsync'ed CRT vs Gsync, you need to consider stuff like this so you're getting the best possible vsync implementation.
 
I was unable to force my PG279Q to 75 Hz and unable to get my FW900 to 85 Hz. So with the PG279Q @ 85 Hz and the FW900 @ 75 Hz, they are pretty much neck and neck, though I don't think it's a fair test without the refresh rate being the same. I tried CRU to force the PG279Q to 75 Hz but that didn't work either =\.
 
I was unable to force my PG279Q to 75 Hz and unable to get my FW900 to 85 Hz. So with the PG279Q @ 85 Hz and the FW900 @ 75 Hz, they are pretty much neck and neck, though I don't think it's a fair test without the refresh rate being the same. I tried CRU to force the PG279Q to 75 Hz but that didn't work either =\.
The thing is that, because of how G-Sync monitors work, anything less than the native refresh rate is not 'fair', because the monitor won't even start drawing a frame until it has loaded the whole thing into a buffer, and only then draws it to the LCD panel in 1000/144 ms.
Especially if you consider that you would never ever actually use the PG279Q at 75 Hz.

Of course, with the strobed mode of G-Sync monitors vs a strobed CRT, the CRT would have a large advantage.
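Back-of-the-envelope numbers for that buffering argument, under the simplifying assumption that the monitor waits for a whole frame at the incoming rate and then scans it out at its native 144 Hz (my reading of the post above, not a measured G-Sync figure):

```python
def buffered_delay_ms(input_hz, native_hz=144.0):
    """Worst case: one full input frame buffered, then one native scan-out."""
    return 1000.0 / input_hz + 1000.0 / native_hz

print(buffered_delay_ms(85))   # ~18.7 ms if fed 85 Hz
# At the native rate the panel can scan out as the frame arrives, so this
# extra buffering penalty largely disappears.
```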

This is incredibly game dependent. Like in recent Battlefield games, you can set the internal frame limit to match your refresh rate, and combining this with vsync gives noticeably less lag than vsync with no frame limit. In fact, it feels pretty damn close to vsync off. I've read that this trick works in other games by precisely setting a limit via RTSS.

So yeah, if you're going to test vsync'ed CRT vs Gsync, you need to consider stuff like this so you're getting the best possible vsync implementation.
Sure thing.
I even tested Fast Sync + RTSS to get something similar to V-Sync (with occasional frame drops... though V-Sync will have those too, especially when limiting frame rates to the refresh rate) but with less input lag (especially in DX9 games).
G-Sync is, however, much more reliable than anything you can do with V-Sync/FreeSync/etc., and in this category vastly superior.
 
Monitors with fans, though? WTF? Not even CRT and plasma displays needed those, and those output far more heat than LCDs do!
It depends on what's putting out the heat, and what needs cooling.

With CRT and plasma displays, it's the display tube or panel itself that's emitting the heat, and there's nothing particularly temperature-sensitive in there.

LCDs with fans aren't cooling the panel, they're cooling the scaler electronics. (I've got an IBM T221 which has fans to cool its large, fast FPGAs that are acting as the scaler system.)
 