24" Widescreen CRT (FW900) From Ebay arrived,Comments.

and I am shocked that in 2017 it is still very hard, not to say impossible, to find a monitor capable of OLED-level natural blacks, IPS-level color quality, excellent contrast, excellent viewing angles, very good geometry, clear motion without sacrificing brightness or requiring high vertical frequencies, multiple resolutions without crappy scaling, excellent latency, excellent static image clarity, widescreen resolutions up to 1920x1200@96Hz and even higher (I can get my FW900 up to 2560x1600@68Hz with still very good image clarity and quality; more experienced users can probably go even higher), and a still very good flat screen size at 22.5 inches viewable, widescreen 16:10 aspect ratio. All these features in one package: the FW900.

I bet this is why many people like me still want and enjoy this more-than-a-decade-old "obsolete" monitor, and not just this one, but CRT technology in general.

Man, I miss the days when even modest, cheaper color CRT monitors offered many of these features that no current monitor offers in a single package, so practically all you had to choose was the screen size.

I'm a bit surprised too. I really thought by now we would have good OLED monitors, but they're holding that technology back so they can milk shitty "144Hz, adaptive-sync, HDR" LCD panels for every dollar possible by slapping on a new feature every year and forcing consumers to compromise even with $1000+ panels. Maybe in a year or two we'll see some OLED monitors (they are appearing in laptops already), but I'm afraid it's going the way of SED monitors. The industry knows OLED will blow away every single LCD on the market, so they are stubbornly refusing to give consumers what they want.

I sold my two A7217As (HP FW900) two years ago thinking it was a good time to sell and OLED was gonna be here soon... huge mistake. Now I'm running a 21" 4:3 CRT I found for $10. It's great, but I often miss the sharpness and widescreen of my old Trinitrons. I want to "upgrade" to a new monitor, but even the most expensive LCDs have worse colors and cause significantly more eye strain. I'll be keeping my giant CRT until there is a real upgrade where I won't have to pay hundreds to compromise on colors, blacks, image depth, input lag, contrast, etc.
 
Any time you add new resolutions in CRU, you have to restart your PC, or restart your driver with the included "restart64.exe".

So for 1920x1200 interlaced at, say, 100Hz, you would enter the resolution with "CRT standard timings" selected at the top, input 100Hz for vertical refresh, then check the "Interlaced" check box at the bottom. Then restart, and when you go to select the refresh rate in Windows display settings, you'll see it listed as "50Hz interlaced".

Your monitor will report the vertical frequency as 100Hz, but Windows counts how many full 1920x1200 frames are being produced, which is technically 50.
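
A tiny illustrative sketch (nothing CRU actually runs, just to show how the field rate the monitor sees maps to the frame rate Windows lists):

```python
# Illustrative only: relate an interlaced mode's field rate (what the monitor's
# OSD reports) to the full-frame rate Windows lists for it.

def interlaced_rates(field_rate_hz: float) -> tuple[float, float]:
    """Return (monitor-reported field rate, Windows-reported frame rate)."""
    # Each interlaced frame arrives as two fields (odd lines, then even lines),
    # so the number of complete frames per second is half the field rate.
    return field_rate_hz, field_rate_hz / 2

fields, frames = interlaced_rates(100)
print(f"Monitor OSD vertical frequency: {fields:.0f} Hz")
print(f"Windows refresh rate listing:   {frames:.0f} Hz interlaced")
```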

thanks. I'll have time to do this tomorrow. If you don't hear from me by the end of Friday, just post back here to remind me :)
 
I got something similar when I surpassed the 165MHz limit, but that's not your case. Have you used the CRT timings preset in CRU?
I played around with CRU a bit. I tried using the CRT timing and setting my native to 1280x1024@85Hz and 1280x960@90Hz, but both gave me "out of sync" when I tried to set them in Windows or the Nvidia control panel. Does this mean that my monitor just can't go that high?

https://books.google.com/books?id=p...#v=onepage&q=hitachi superscan pro 21&f=false


^ That's the only info I can find regarding the resolutions this monitor is capable of.
 
I played around with CRU a bit. I tried using the CRT timing and setting my native to 1280x1024@85Hz and 1280x960@90Hz, but both gave me "out of sync" when I tried to set them in Windows or the Nvidia control panel. Does this mean that my monitor just can't go that high?

That seems low for a 21-inch monitor, but maybe not, since it's from 1995? My 19-inch from 2002 can do 105Hz @ 1280x960.
 
I played around with CRU a bit. I tried using the CRT timing and setting my native to 1280x1024@85Hz and 1280x960@90Hz, but both gave me "out of sync" when I tried to set them in Windows or the Nvidia control panel. Does this mean that my monitor just can't go that high?

https://books.google.com/books?id=pzoEAAAAMBAJ&pg=PA76-IA4&lpg=PA76-IA4&dq=hitachi+superscan+pro+21&source=bl&ots=nM9KbNTKZw&sig=ahlk7wZvu6mNhBZkzZAomcm7MzQ&hl=en&sa=X&ved=0ahUKEwjOuczribLTAhUE5YMKHRrWDKYQ6AEIUzAH#v=onepage&q=hitachi superscan pro 21&f=false


^ That's the only info I can find regarding the resolutions this monitor is capable of.
Well... 85kHz horizontal frequency. Sadly, that means you don't need to buy the vcom adapter, since the bottleneck isn't the adapter but the display itself.
In order to get a valid resolution you need to be within these limits; the program will tell you.

Those two resolutions you tried work out to more than 85kHz.
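
If you want to sanity-check a mode before restarting, a rough estimate is active lines plus vertical blanking, times the refresh rate. The ~5% blanking figure below is my own typical-CRT assumption, not the exact CRU timing, but it's close enough to show why both of those modes blow past 85kHz:

```python
# Rough horizontal-frequency estimate; the 5% vertical blanking overhead is an
# assumption typical of CRT timings, not this monitor's exact numbers.

def approx_h_freq_khz(v_active: int, refresh_hz: float, v_blank: float = 0.05) -> float:
    v_total = v_active * (1 + v_blank)      # active lines + estimated blanking lines
    return v_total * refresh_hz / 1000      # scan lines per second -> kHz

for name, v_active, hz in [("1280x1024", 1024, 85), ("1280x960", 960, 90)]:
    print(f"{name}@{hz}Hz -> ~{approx_h_freq_khz(v_active, hz):.1f} kHz (monitor limit: 85 kHz)")
# Both come out above 85 kHz, which is why the monitor drops out of sync.
```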
 
EnhancedInterrogator you're right about the interlacing. When I answered previously, I just checked if the image was displayed while I was in the nvidia custom res utility. In it, I can create any interlaced resolution, and it'll properly display as interlaced; I can even save it. But once saved, I can't actually get Windows to display that resolution. It's not even available as an option.

I tried creating a custom interlaced resolution in CRU as well. Same deal; it won't show up at all in nvidia's display properties, nor will it display in the Windows display dialog.

So what do you think the probability of this getting fixed, or even noticed, is? Zero? :) Interlacing wasn't something I've ever used, but as someone who is still using a CRT in 2017, I can understand how it's frustrating when "legacy" stuff that still works just fine is abandoned.

On a similar Windows Creator's Update note, now my StarTech DP to VGA adapter (good for 230 MHz pixel clock) will not display anything above 1280x1024x85. I can't even force it with nvidia's control panel or CRU. I hadn't checked this until just now since I bought a used 980 Ti and had no need for the adapter. But I know it definitely worked before!
 
Oh wasup man!

you're right about the interlacing.

Actually, I was a little wrong. I just installed an earlier AMD driver from March, and it has no problem with interlaced resolutions. So it appears the interlacing issue is with AMD's newest driver and how it interacts with the 1703 Windows update. I'm also waiting to see what spacediver's results are.

As for why you can't select interlaced stuff, have you gone into Windows display settings, then clicked on the blue text "Display Adapter Properties"? Under that, I think you should see interlaced refresh rates. But you may need to create a progressive refresh rate for the resolution as well so it shows up in the drop-down box in the first place.

I think that's the workaround I found last time I had an Nvidia card. Also, this utility, QuickRes, is good for selecting weird resolutions that don't show up in Windows or Nvidia CP https://www.ultimarc.com/download_old.html (at the bottom of page).

As for it getting noticed? I think it's pretty unlikely, other than the fact that TV broadcasts are 1080i, and maybe PCs need to support it for production work? But I submitted bug reports to both Microsoft and AMD, hoping they'll fix it. I also don't like seeing legacy tech being chipped away at, especially when I still use it.

On a similar Windows Creator's Update note, now my StarTech DP to VGA adapter (good for 230 MHz pixel clock) will not display anything above 1280x1024x85. I can't even force it with nvidia's control panel or CRU. I hadn't checked this until just now since I bought a used 980 Ti and had no need for the adapter. But I know it definitely worked before!

Hmm, I had an issue similar to this with an HD Fury HDMI>VGA adapter, and I solved it by importing the HDMI.dat file from CRU's home page. Apparently the adapter was being detected as DVI single-link, which is 165MHz max. The HDMI.dat increased that to 600MHz.

So maybe there's a similar trick for Displayport? I'd peruse the CRU forum, and ask ToastyX himself if you can't find anything.
 
I'm a bit surprised too. I really thought by now we would have good OLED monitors, but they're holding that technology back so they can milk shitty "144Hz, adaptive-sync, HDR" LCD panels for every dollar possible by slapping on a new feature every year and forcing consumers to compromise even with $1000+ panels. Maybe in a year or two we'll see some OLED monitors (they are appearing in laptops already), but I'm afraid it's going the way of SED monitors. The industry knows OLED will blow away every single LCD on the market, so they are stubbornly refusing to give consumers what they want.

I sold my two A7217As (HP FW900) two years ago thinking it was a good time to sell and OLED was gonna be here soon... huge mistake. Now I'm running a 21" 4:3 CRT I found for $10. It's great, but I often miss the sharpness and widescreen of my old Trinitrons. I want to "upgrade" to a new monitor, but even the most expensive LCDs have worse colors and cause significantly more eye strain. I'll be keeping my giant CRT until there is a real upgrade where I won't have to pay hundreds to compromise on colors, blacks, image depth, input lag, contrast, etc.

Yes, I agree that consumerism is the determining factor here, and it is really frustrating being forced to acquire newer but inferior quality products. However, there is some interesting news I found: the OLED 4K 30" 60Hz Dell UP3017Q monitor, discussed here and on the Blur Busters forums, is a 60Hz 4K OLED monitor with strobing to improve motion clarity (I wish motion clarity would finally become common in all monitors). It still has some flaws, such as noticeable input lag and an expensive price, as reported by one of its owners, but I think it's a good start, so those flaws can begin to be improved, better features added, and prices brought down as it goes mainstream in the near future.
 
So is there any readily available solution for new cards that gives more bandwidth than 225MHz?
Later models in the HDFury range claim to hit that. Can they be overclocked to more reasonable speeds?
 
What are the best tests, patterns, or software to run for INSPECTING the condition of a CRT like this? Ballparking the number of hours and the age of the electron guns and phosphors, things that can't be fixed or calibrated to work better.

Also, any advice on seeing how deep scratches are, besides the fingernail dig test? To see if they go past the factory-installed anti-glare coating or not?

Thanks!
 
What are the best tests, patterns, or software to run for INSPECTING the condition of a CRT like this? Ballparking the number of hours and the age of the electron guns and phosphors, things that can't be fixed or calibrated to work better.

Also, any advice on seeing how deep scratches are, besides the fingernail dig test? To see if they go past the factory-installed anti-glare coating or not?

Thanks!
https://www.eizo.be/monitor-test/
You won't be able to see the exact number of hours on the tubes.
 
Any time you add new resolutions in CRU, you have to restart your PC, or restart your driver with the included "restart64.exe".

So for 1920x1200 interlaced at, say, 100Hz, you would enter the resolution with "CRT standard timings" selected at the top, input 100Hz for vertical refresh, then check the "Interlaced" check box at the bottom. Then restart, and when you go to select the refresh rate in Windows display settings, you'll see it listed as "50Hz interlaced".

Your monitor will report the vertical frequency as 100Hz, but Windows counts how many full 1920x1200 frames are being produced, which is technically 50.


Ok, I added the interlaced resolution into the "detailed slots" thingie.

Then I did the restart64.exe thing. My monitor flashed, showed a weird-looking resolution very briefly, and then went back to my regular 1920x1200 desktop resolution.

Now what? How do I actually activate the interlaced resolution?
 
Ok, I added the interlaced resolution into the "detailed slots" thingie.

Then I did the restart64.exe thing. My monitor flashed, showed a weird-looking resolution very briefly, and then went back to my regular 1920x1200 desktop resolution.

Now what? How do I actually activate the interlaced resolution?

Go into Windows display settings, choose the resolution, then click on the blue text "Display Adapter Properties", then the monitor tab, then choose the interlaced refresh rate. You may also need to make a separate, progressive refresh rate for the same resolution if it doesn't show up in the first window.
 
OK, I think I didn't do it properly the first time. This time, when I went to change the resolution in Nvidia control panel, a 50Hz option was there. But when I select it, the screen flashes and my OSD reports no change (85Hz, 107.2kHz), even though it's clear that Nvidia control panel thinks I'm in 50Hz mode.
 
Maybe these screenies will help you guide me.

Top left: some of the resolutions that Nvidia control panel has. If I click the refresh dropdown, a 50Hz one shows.

Top right: some of the adapter modes. Notice that GDM-FW9011 is selected.

Bottom left: monitor screen refresh rate options. Note that it says Generic PnP Monitor. When I select 50Hz here, the same thing happens.

Bottom right, left panel: CRU, showing the interlaced resolution I added.

Bottom right, right panel: CRU, showing what monitors are listed when I pull down the menu. Notice the PnP monitor at the bottom.

I don't think the generic PnP thing is an issue, but I'm not sure.

Give me a non standard 16:10 resolution to try, and I'll try creating it with CRU, and selecting it with quickres.

 
The QuickRes utility is at the bottom of the page. It says XP only, but it works in other operating systems; at least it works in Win 10.

As far as resolution goes, you can create any resolution you feel like. Let's say 2688x1680i@100hz
 
Yeah, I'm pretty sure it's just scaling. No click, but the image looked like a higher-res version where everything got smaller. The OSD showed no change. Maybe modern CRTs are incapable of interlaced scanning? I've always found interlaced stuff a bit confusing.

Suppose we have a really low res CRT that does 10x10 pixels.

In progressive mode (say 100 hz) , the CRT scans each of the 10 rows, and refreshes those 10 rows 100 times a second.

In interlaced mode, I'm assuming the CRT scans each of the 10 rows, but turns the beam off for alternating rows, and then repeats the scan, now turning the beam off for the complementary set of alternating rows.

So is the CRT itself in a different scanning mode?

Or is it the video signal itself where the interlacing occurs?

You could feed an interlaced video signal to your CRT while it's in "progressive mode", and it would produce an interlaced video. The video signal would just have black lines for half the rows, and then the next frame would have black lines for the complementary set of rows. The CRT doesn't know the difference.

You see why I'm confused?
 
Man, that's so weird, I thought I remembered getting interlaced video to work the last time I had an Nvidia card. I guess the last thing you can do is make sure VSR (DSR on Nvidia) is not enabled. That's like GPU downscaling, instead of upscaling. I can't think of what to do beyond that. Somebody else with an Nvidia card is going to have to step in and help.

Anyway, as far as old vs. new CRTs and interlaced signals: I've tried interlaced video on all my CRTs, the oldest being from 2000 and the newest from 2004.

Interlaced video isn't really something your CRT does differently, it's what the video card does differently. It's only sending half the lines, and it's offsetting (via the front porch or sync width, I think) where the first line lands each refresh. That way, they tend to line up correctly each scan.

So really, from the CRT's perspective, there's not much of a difference between interlaced and progressive. Like 540p signals and 1080i signals, your CRT isn't going to know the difference. Both are 33.75kHz horizontal frequency.

In fact, you can actually take those early-2000s Sony 1080i HDTV CRTs, feed them 540p over the component input, and they will display it. It's impossible for the TV to tell the difference.
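
If the numbers help, here's a quick check of that 540p/1080i point, assuming the standard 1125-total-line HD raster (1080 active lines plus blanking per full frame):

```python
# 1080i delivers half the 1125-line raster per field; 540p is scanned on the
# same half-raster timing per frame, so the line rates come out identical.

RASTER_LINES = 1125          # total lines per full HD frame (active + blanking)
RATE = 60                    # fields per second (1080i) or frames per second (540p)

h_1080i = (RASTER_LINES / 2) * RATE / 1000
h_540p = (RASTER_LINES / 2) * RATE / 1000
print(f"1080i60 line rate: {h_1080i:.2f} kHz")   # 33.75 kHz
print(f"540p60  line rate: {h_540p:.2f} kHz")    # 33.75 kHz
# Same horizontal frequency and sync timing, so the set can't tell them apart.
```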
 
Thanks, I think that helps clear things up, especially the offsetting between each frame (or field, rather).

Feel free to suggest more ways that I can try to get interlacing to work. I'm willing to be your experimenter by proxy for as long as it takes.

I'm on a GTX 660 (or is it 660 GTX, too lazy to check), and DSR factors is (and has always been) set to "off".
 
Thanks again man. I'll try to come up with some more ideas to get back to you over the weekend.

But yeah, more food for thought: Nintendo and Sega games of the '80s/'90s weren't 480i, they were actually 240p. TV broadcasts, on the other hand, were 480i. They were both able to display on our TVs because they both had the same 15kHz horizontal frequency. It was called non-interlaced mode back in the day.
 
Also, how severe do scratches have to be before they can't be fixed by removing the factory-installed anti-glare film?

As in, how can you tell if a scratch or mar goes past it? How THICK is this built-in anti-glare film?
I would guess somewhere between 0.1 and 1mm thick, but I really have no idea, and that's not precise enough.
Any good tests to tell if damage goes past it?

And is buffing them out a viable option on a flat screen? I've seen it done for curved screens; there's some big guide someone made somewhere...

Thanks!
 
What I do is select the interlaced resolution in the Displays section of the Windows control panel, then test the custom interlaced resolution again in the nVidia control panel. If you cannot test the resolution again in the nVidia CP, adjust it by one Hz. That usually makes things work.

See this thread:



In games I run my Nokia 445 Pro at 2560x1920 120Hz, 1600x1200 190Hz, and 1280x960 220Hz through nVidia GTX 980 2x SLI.
 
Also, how severe do scratches have to be before they can't be fixed by removing the factory-installed anti-glare film?

As in, how can you tell if a scratch or mar goes past it? How THICK is this built-in anti-glare film?
I would guess somewhere between 0.1 and 1mm thick, but I really have no idea, and that's not precise enough.
Any good tests to tell if damage goes past it?

And is buffing them out a viable option on a flat screen? I've seen it done for curved screens; there's some big guide someone made somewhere...

Thanks!

I've never heard of the glass itself being damaged when the antiglare is present. Hell, most of my tubes have the antiglare on and the glass on all of them is in perfect condition.
 
What I do is select the interlaced resolution in the Displays section of the Windows control panel, then test the custom interlaced resolution again in the nVidia control panel. If you cannot test the resolution again in the nVidia CP, adjust it by one Hz. That usually makes things work.

Thanks, that actually seems to have worked. The 50Hz 1920x1200 interlaced resolution shows up as 100Hz on the OSD, 89.3kHz. The viewable picture is smaller, though. I can't get it to fully fill the screen, even when I max out the size/position OSD controls.
 
There is GDM C520 for $15 near me in great working condition. Any reason to get it when I already have two FW900s?
 
Thanks, that actually seems to have worked. The 50Hz 1920x1200 interlaced resolution shows up as 100Hz on the OSD, 89.3kHz. The viewable picture is smaller, though. I can't get it to fully fill the screen, even when I max out the size/position OSD controls.

Sounds like something is wrong there. According to CRU, 1920x1200 interlaced at 100Hz should only be 64kHz. Maybe it's being centered inside of a higher resolution?
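
A rough back-of-the-envelope check (my own estimates, not measured timings) of why that OSD readout points at centering inside a larger raster:

```python
# Divide the reported line rate by the reported field rate to see how many scan
# lines the monitor is actually drawing per vertical period.

reported_h_khz = 89.3        # horizontal frequency from the OSD
reported_v_hz = 100          # field rate from the OSD

lines_per_field = reported_h_khz * 1000 / reported_v_hz
print(f"Scan lines per field actually drawn: ~{lines_per_field:.0f}")       # ~893

# A true 1920x1200 interlaced mode only needs ~600 active lines per field,
# plus a few percent of blanking, so roughly 630. Seeing ~893 suggests the
# 1200-line image is sitting centered inside a much taller timing.
expected = 600 * 1.05
print(f"Expected for genuine 1920x1200i: ~{expected:.0f} lines per field")
```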

What I do is select the interlaced resolution in the Displays section of the Windows control panel, then test the custom interlaced resolution again in the nVidia control panel. If you cannot test the resolution again in the nVidia CP, adjust it by one Hz. That usually makes things work.

Can you still do interlaced resolutions after the latest update?

By the way, you should submit a bug report to Nvidia about not being able to select interlaced directly from Windows or Nvidia CP without jumping through hoops.
 
I'm surprised anyone even wants to go back to interlaced resolutions, since the combing artifacts annoy the hell out of me. They're just a compromise from the days of SDTV, and even then, one that consoles weren't really held back by throughout the 5th gen, since they ran in 240p except for the occasional 480i menu screen.

They're especially bad when using a capture card and the video preview window doesn't deinterlace the view properly!

Still, I suppose pixel clock limitations in current DP to VGA adapters force everyone's hand here.
 
I'm surprised anyone even wants to go back to interlaced resolutions, since the combing artifacts annoy the hell out of me.

I use interlaced resolutions if I want to play at a high resolution + low frame rate. Say you want to play a game at 1920x1440, but you can only really maintain 45fps consistently. You can't set your monitor to 45Hz, too much flicker. You can't set it to 90Hz progressive, because that's beyond your monitor's range. So instead, you set it to 90Hz interlaced, then use double vsync to get a perfectly smooth 45fps. Combing artifacts aren't a big deal here, because you get ghosting artifacts anyway when using double vsync.
 
For Nvidia users, the method that rabidz7 posted does work, so thank you! And EnhancedInterrogator, I should have done this sooner. This is a game changer. :)

I'm using a Dell P1130, which has a max horizontal freq. of 130ish kHz and tops out at 170 Hz refresh regardless of the h.freq. (so I can't run any higher at super low resolutions). For general desktop use and most games, I use 1340x1005x120Hz. For older games, 920x690x170Hz. For games locked at 60 fps, I use 2496x1872x60Hz, which doesn't actually hit the h.freq. limit, but the 400 MHz pixel clock instead.

I tried out two interlaced resolutions: the first was 1840x690 at that same 170 Hz (h.freq. limited) and the second 2448x918 at 120 Hz (pixel clock limited).

At that lower resolution, the even and odd lines are slightly visibly different and the image is a bit soft. Combing artifacts are only visible when you're not eye-tracking, and I only noticed them a few times running past a grate texture. But playing Quake Live at "1840x1380" at 170 Hz is amazing. I tried the new UT as well. The image is definitely softer than native, but the increase in resolution is huge compared to 920x690.

"2448x1836" is even better. This CRT is soft anyway at these resolutions in progressive mode, so I wouldn't use this on the desktop, but it looks fantastic in games, especially in UT. The 980 Ti can barely keep up here, so I had to lower some settings to maintain 120 fps.

I need this all the time! I'm either going to have to find some prehistoric GeForce driver that doesn't require the work-around, or I'm going to have to face the ridicule that'll come from asking them on forums.geforce.com to support interlaced resolutions.

You should all at least give this a try.
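
For anyone wanting to guess in advance which limit a mode will hit, this is the kind of rough pixel-clock estimate I mean. The blanking percentages are my own round numbers, not the driver's exact GTF/CVT figures, but they land close enough to show which wall you're up against:

```python
# Rough pixel clock: total pixels per frame (active + assumed blanking) times refresh.

def approx_pixel_clock_mhz(h_active: int, v_active: int, refresh_hz: float,
                           h_blank: float = 0.30, v_blank: float = 0.05) -> float:
    h_total = h_active * (1 + h_blank)   # active width + assumed ~30% horizontal blanking
    v_total = v_active * (1 + v_blank)   # active height + assumed ~5% vertical blanking
    return h_total * v_total * refresh_hz / 1e6

print(f"2496x1872@60:  ~{approx_pixel_clock_mhz(2496, 1872, 60):.0f} MHz")    # close to the 400 MHz cap
print(f"1340x1005@120: ~{approx_pixel_clock_mhz(1340, 1005, 120):.0f} MHz")   # well under it
```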
 
Haha, yeah, I've already been ridiculed on the AMD forums.

I've never really tried running games at high fps + interlaced. I just figured progressive scan near your monitor's PPI with downsampling or AA would look better than running beyond your monitor's PPI in interlaced mode. I guess I'll give it a try sometime to see how I like it.
 
I use interlaced resolutions if I want to play at a high resolution + low frame rate. Say you want to play a game at 1920x1440, but you can only really maintain 45fps consistently. You can't set your monitor to 45Hz, too much flicker. You can't set it to 90Hz progressive, because that's beyond your monitor's range. So instead, you set it to 90Hz interlaced, then use double vsync to get a perfectly smooth 45fps. Combing artifacts aren't a big deal here, because you get ghosting artifacts anyway when using double vsync.

How do you enable double vsync? I have an Nvidia card but can't find a setting for that.
 
How do you enable double vsync? I have an Nvidia card but can't find a setting for that.

I haven't used an Nvidia card in a while. They may call it "half vsync". It's also available in the settings for a lot of games now. The last few Far Cry games have had it, for sure.

If neither of those is an option, you can always use a frame limiter to cap your frame rate to half your refresh rate.
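
The frame-limiter idea is just to pad every frame out to two refresh periods. A minimal sketch of that pacing loop (purely illustrative; real limiters like in-game caps or RTSS do this with much better precision):

```python
import time

REFRESH_HZ = 90
FRAME_TIME = 2.0 / REFRESH_HZ          # "half vsync": one new frame every two refreshes

def render_frame():
    pass                               # stand-in for the game's actual per-frame work

start = time.perf_counter()
for _ in range(45):                    # simulate one second's worth of frames
    t0 = time.perf_counter()
    render_frame()
    leftover = FRAME_TIME - (time.perf_counter() - t0)
    if leftover > 0:
        time.sleep(leftover)           # pad the frame out to the 1/45 s budget
print(f"45 frames in {time.perf_counter() - start:.2f} s (target: 1.00 s)")
```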
 
Hey folks

Can someone point me to the Sony DAS software? NOT WinDAS, only DAS.

I have a GDM-W900 and I think I found the Sony cable; now I only need the software.

Thanks much

EDIT: As I have now done more research, I have found a claim that you also need a Sony interface unit to use the DAS software. Is this true?

Can anyone quickly explain how I can make the same changes on a W900 that you guys talk about for your FW900s?
I don't have a problem with my W900 so far, it runs well, but I want to be ready when I need it...
Can you adjust convergence, focus, etc. without the DAS software too?

Sorry for the questions, I'm quite new to this subject but willing to learn...
 
I haven't used an Nvidia card in a while. They may call it "half vsync". It's also available in the settings for a lot of games now. The last few Far Cry games have had it, for sure.

If neither of those is an option, you can always use a frame limiter to cap your frame rate to half your refresh rate.

I might have misunderstood you, but did you say that it is possible to play games, for example at 30fps or 45fps, and have the same visual motion clarity as when the frame rate matches the monitor's vertical frequency (so no triple or double image), by using interlaced resolutions and double vsync instead?
 
Hey folks

Can someone point me to the Sony DAS software? NOT WinDAS, only DAS.

I have a GDM-W900 and I think I found the Sony cable; now I only need the software.

Thanks much

EDIT: As I have now done more research, I have found a claim that you also need a Sony interface unit to use the DAS software. Is this true?

Can anyone quickly explain how I can make the same changes on a W900 that you guys talk about for your FW900s?
I don't have a problem with my W900 so far, it runs well, but I want to be ready when I need it...
Can you adjust convergence, focus, etc. without the DAS software too?

Sorry for the questions, I'm quite new to this subject but willing to learn...

I"m pretty sure you want WinDAS, and you'll need a USB to TTL cable, and a colorimeter.

The guide here should apply to the W900 (pretty sure, but not 100% sure).
 