24" Widescreen CRT (FW900) From Ebay arrived,Comments.

I'm looking to buy a GTX 1080 Ti. I'm wondering: with the FW900, is it possible to "downscale" higher resolutions (e.g. 1440p) onto the monitor (running at 1920x1200) during games?

The FW900 can do 2560x1600, so no downscaling is needed on that side, but it depends on whether your DAC can actually support that resolution (not a problem with the 980 Ti and older cards). If your DAC can't go that high, you have several ways to downscale:

1) The in-game resolution scale slider, as seen in Frostbite games like Battlefield 1
2) Nvidia DSR
3) For DX9 games, GeDoSaTo is a good downsampling tool
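On the DSR option: Nvidia's factors scale the *pixel count*, not the edge lengths, which trips people up when picking a factor. A quick sketch (the resolutions are just the ones discussed above; rounding here is approximate, not necessarily exactly what the driver does):

```python
import math

def dsr_resolution(w: int, h: int, factor: float) -> tuple:
    """Render resolution for a DSR pixel-count factor (e.g. 1.78, 4.00).

    DSR factors multiply total pixels, so each edge scales by sqrt(factor).
    """
    scale = math.sqrt(factor)
    return round(w * scale), round(h * scale)

# From a 1920x1200 desktop:
print(dsr_resolution(1920, 1200, 1.78))  # roughly 2560x1600
print(dsr_resolution(1920, 1200, 4.00))  # 3840x2400
```

So the 1.78x factor is the one that lands near the 2560x1600 mode people run on these tubes.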
 
No, the 1080 Ti requires a converter, and no converter available today really does 2560x1200 well. Hence I'll need DSR for newer games, though I never had luck with DSR on the GTX 680. I'd rather not run 2560x1600 anyway; 1920x1200 is my go-to, and I prefer to downsample to get that extra crispness.
 
Just got myself an FW900 last week. It appears to be in good shape. I asked the seller to send me the usage history of the thing. Here is what I got:
--------------
Hello, it is a little hard to tell. I used it from 2003 until 2008, maybe 5 hours a week or less. From 2008, for about 18 months, I used it daily, 4-6 hours Mon-Fri. From 2009 until today it was mostly off; I just kept it because I paid so much for it new. I believe this helped.
---------------

So, based on this, and provided the information is true, how much life do you think it has left? He seems to have used it for approximately 3000 hours at most.

Cheers.
 
Well, it will probably last as long as nothing important fails or drifts too much on the boards. :p

I doubt anyone can accurately predict the lifetime of a complex device like this.
 
Is the F520 the tightest stripe/dot/mask pitch CRT or were there ever even finer ones?
 
I found a DAC that claims it can do 2560x1600 at 60 Hz:

http://www.delock.de/produkte/N_62796/merkmale.html?setLanguage=en

That adapter was released at the end of March. It has the chipset we need (the ANX9847), but it is USB Type-C to VGA, and there are no cables to connect it to a graphics card.
The adapter they are working on is DisplayPort to VGA with the ANX9847.
MSI recently announced an Nvidia GTX 1080 Ti with an onboard USB Type-C connector. If other video card manufacturers start using that connector in the future, this adapter would be a good option.
 
Is the F520 the tightest stripe/dot/mask pitch CRT or were there ever even finer ones?

Yeah, I believe so. I'm pretty sure there were higher-resolution medical CRTs, but those were monochrome and didn't have stripes or a dot mask, just a single phosphor surface.
 
Googling about CRTs and their color gamut capabilities, I found this Diamondtron that claims to cover 98% of Adobe RGB. The exterior looks exactly the same as all the Diamondtron 2070SB clones, and the horizontal frequency is also the same, but I guess it would need different guns to achieve that coverage.
https://www.manualslib.com/manual/381983/Mitsubishi-Diamondtron-Rdf225wg.html?page=4#manual
http://www.necdisplay.com/documents/UserManuals/RDF225WGmanual051804.pdf

There is also video proof that they actually made them.

Does anyone have more information about this particular model? AFAIK the FW900 only covers ~100% of sRGB, and the 2070SB more or less the same.
 
Googling about CRTs and their color gamut capabilities, I found this Diamondtron that claims to cover 98% of Adobe RGB. The exterior looks exactly the same as all the Diamondtron 2070SB clones, and the horizontal frequency is also the same, but I guess it would need different guns to achieve that coverage.

The gamut of a CRT has nothing to do with the guns and everything to do with the phosphors used. 98% of Adobe RGB is great, but it would mean that using the display in a regular sRGB environment (so you can browse the web and watch videos with accurate colors) would require a lot of hassle, such as 3D lookup tables.

The monitor was discussed earlier in this thread. See here, and the posts that follow:

NEC tried to keep a 22" AG-CRT afloat for the last two years (Diamondtron UWG RDF225WG) and sold them for $5000 with no takers, thus discontinuing the line.

(source)
 
Keep in mind that the official specs for the ANX9847 are the same as for the ANX6212:
Max input bandwidth: 360 MHz, 24-bit
VGA max pixel clock: 270 MHz, 24-bit
The resolution given in the specs assumes standard LCD timings (2560x1600 @ 60 Hz = 268 MHz; roughly 348 MHz with CRT timings).
We know that the prototype with the ANX6212 can do 341 MHz, but we don't know what changes were made (firmware? power? nothing?).
The ANX9847 has the same specs, but it is a different chip, and we don't know if it can reach the same limit as the 6212.
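To make that timing arithmetic concrete, here's a rough Python sketch. A mode's pixel clock is just total pixels per frame times refresh rate; the CVT reduced-blanking totals below are the standard ones for 2560x1600 @ 60 Hz, while the "CRT" totals are hypothetical numbers chosen only to illustrate how generous CRT-style blanking pushes the same mode toward the ~348 MHz figure quoted above:

```python
def pixel_clock_mhz(h_total: int, v_total: int, refresh_hz: float) -> float:
    """Pixel clock in MHz: total pixels per frame * refresh rate."""
    return h_total * v_total * refresh_hz / 1e6

# 2560x1600 @ 60 Hz with CVT reduced blanking (LCD-style timings)
lcd = pixel_clock_mhz(2720, 1646, 60)    # ~268.6 MHz

# Same mode with generous CRT-style blanking (illustrative totals)
crt = pixel_clock_mhz(3504, 1658, 60)    # ~348.6 MHz

print(f"LCD timings: {lcd:.1f} MHz, CRT timings: {crt:.1f} MHz")
```

That's why a DAC spec'd for "2560x1600 @ 60 Hz" at 270 MHz can still fall short on a CRT: the advertised resolution assumes the tight LCD blanking, not the wider blanking intervals a CRT needs.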
 
98% of Adobe RGB is great, but it would mean that using the display in a regular sRGB environment (so you can browse the web and watch videos with accurate colors) would require a lot of hassle, such as 3D lookup tables.

Not really, I think. Monitors with wider-than-sRGB gamuts are more and more popular now. All you need is a proper ICC profile and software that recognizes it, and you should be fine.
 
Not really, I think. Monitors with wider-than-sRGB gamuts are more and more popular now. All you need is a proper ICC profile and software that recognizes it, and you should be fine.

I don't think you can properly remap primaries with only a 1D LUT, and to properly implement an ICC profile you need color-aware applications (i.e., some Adobe applications). Recently, with the ReShade injector, this can be done in many games too (also see here). I'm not sure whether ReShade works with all games, or whether there's a performance hit, but it's certainly a great step forward. I'm also not sure what the landscape is these days with Windows 10, so I don't know if you can get full color management natively in Windows.

btw, when people use ICC profiles to implement color correction, they're usually talking about the VCGT tag (see my post here)
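To see why a per-channel 1D LUT can't remap primaries, here's a toy sketch. The 3x3 matrix below uses made-up numbers (it is not a real wide-gamut to sRGB transform); the point is only the structure: remapping primaries mixes channels, which per-channel curves cannot do.

```python
def apply_matrix(rgb, m):
    """Full 3x3 transform: each output channel mixes all three inputs."""
    return [sum(m[i][j] * rgb[j] for j in range(3)) for i in range(3)]

def apply_1d_luts(rgb, luts):
    """Per-channel LUTs: each output depends only on its own input channel."""
    return [luts[i](rgb[i]) for i in range(3)]

# Hypothetical gamut-remap matrix (illustrative values, rows sum to 1)
M = [[0.80, 0.15, 0.05],
     [0.05, 0.90, 0.05],
     [0.02, 0.08, 0.90]]

pure_green = [0.0, 1.0, 0.0]
remapped = apply_matrix(pure_green, M)  # R and B channels light up

# A 1D LUT stack maps (0, 1, 0) to (f(0), g(1), h(0)). To reproduce the
# matrix result it would need f(0) > 0, but then *every* color with R=0
# would gain red. Hence the need for a matrix or 3D LUT.
identity = lambda x: x
unchanged = apply_1d_luts(pure_green, [identity] * 3)

print(remapped, unchanged)
```

This is exactly the difference between the VCGT tag (three per-channel curves in the video card's gamma table) and full color management: the curves can fix grayscale tracking, but not where the primaries sit.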
 
Yes, that's what I mean: if you want proper color management, you have to use proper software. I use DisplayCAL to create an ICC profile, which is then recognized by Adobe software or free software like IrfanView. It's quite easy to check with green: we all know it sits somewhat beyond the sRGB gamut with the FW900 phosphors. If I create a 0 255 0 image in Photoshop and measure it with HCFR, its coordinates land almost exactly on the sRGB green primary.

Thanks for linking ReShade - need to read more about it, sounds promising.
 
spacediver, do you know anything about problems with ICC profiles and 3D games in full screen mode? I would imagine there are some issues there, which would probably make me want to do all calibration monitor-side.
 
So what's the deal so far with digital to analog conversion? Any devices out there that can handle the big monsters yet, or are we still boned?
 
spacediver, do you know anything about problems with ICC profiles and 3D games in full screen mode? I would imagine there are some issues there, which would probably make me want to do all calibration monitor-side.

Not much experience, but there is software that attempts to force a 1D LUT from the ICC profile (e.g., CPKeeper, I think). Some people use borderless windowed mode, but I think there are input lag issues there (not sure).
 
Some people use borderless windowed mode, but I think there are input lag issues there (not sure).
Generally, yes, but it can be negligible.
The exception is Nvidia Fermi cards and older, where you can run without desktop composition (so no Aero, or use Windows XP).

After Fermi, Nvidia cards do this weird thing where the tearing is not uniform.

http://forums.blurbusters.com/viewtopic.php?f=2&t=1541&p=11209

(which means I may consider "upgrading" to a GTX 580 in my future PC :D)
 
Ah, so does this mean it's better to disable desktop composition when running Human Benchmark (so that vsync is disabled)?
 
Probably, though how much better it is depends.

If you find that the numbers are always quantized to some multiple of 10 ms or 16 ms or something, then don't trust the results too much.
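A quick way to check for that quantization is to see whether your samples all sit suspiciously close to multiples of the frame interval (~16.7 ms at 60 Hz). A minimal sketch, with made-up sample data and an arbitrary 1 ms tolerance:

```python
def looks_quantized(samples_ms, step_ms, tol_ms=1.0):
    """True if every sample lies within tol_ms of a multiple of step_ms."""
    def dist_to_grid(x):
        r = x % step_ms
        return min(r, step_ms - r)
    return all(dist_to_grid(s) <= tol_ms for s in samples_ms)

# Hypothetical reaction-time samples (ms)
vsynced = [183.3, 200.0, 216.7, 250.0]   # all near n * 16.667: suspicious
free    = [181.2, 197.9, 213.4, 244.1]   # spread off the grid

print(looks_quantized(vsynced, 16.667))  # True -> distrust these numbers
print(looks_quantized(free, 16.667))     # False
```

If the first check fires on your real numbers, the timer is likely being rounded to whole frames by vsync/composition, and the measured differences between setups are meaningless.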
 
Just swapped in the F520. I've been rotating between the Artisan and the F520 every other month or so, since I only have one computer. I don't want the monitors just sitting there powered off and doing nothing. :D
 
The 1080N laptop card is within +/- 10% of its desktop counterpart, and my laptop's VGA port works with my mobile 980M. I plan on getting the 1080N; I'm using the laptop as a desktop replacement.

EDIT: what is an F520?
 
It's not certain that you can replace the graphics card in a laptop, and if you can, I would be surprised if the VGA port weren't part of the graphics card itself. The GTX 1080 series does not have a built-in digital-to-analog converter, and a VGA port would be useless without one.

The F520 is a 21" CRT.
 
I have a customized HP A7217A with a Sony faceplate and a hardwired connection for WinDAS tweaking.
My name appears at the bottom of the first page, added by the thread creator.
For sale, U.S. inquiries only; PM me.
 
The 1080N laptop card is within +/- 10% of its desktop counterpart, and my laptop's VGA port works with my mobile 980M. I plan on getting the 1080N; I'm using the laptop as a desktop replacement.

EDIT: what is an F520?

I had one of those F520s, and I'm thinking I should have had it repaired instead of getting rid of it.
 
Thanks, I'm getting by, although every so often I move house and haul three CRTs with me :D two FW900s and one P1130.



Same feeling; my Dell P1130's built-in coating is much better than anything the FW900 has ever had with any foil.



That's correct. I have one FW900 with polarizing foil and another with the original AG coating side by side.

Polarizing foil:
- better, smoother appearance
- better sharpness
- darker

Original AG:
- better antiglare; reflections seem much more dispersed
- higher transmittance
- delicate

The FW900 with polarizing foil is perfectly sharp at around 55-60 cd/m2, and I like this image much more than the one with the original AG: it looks crisper, more contrasty, and smoother. It behaves worse in a well-lit room, though, and at 85 cd/m2 the image is just blurry compared to the original AG. I must admit it looks amazing at 85, just blurry due to the overdrive.

The goal for now is to find a polarizing foil with higher transmittance, so it would not be as dark and would allow reaching 85 cd/m2 with perfect sharpness. When that happens, I will get rid of the original AG on the second monitor with no regret at all.



That's incorrect. As mentioned a few paragraphs earlier, the FW900 with polarizing foil is perfectly clear at lower brightness. You could try calibrating your FW900 with the original AG to, say, 180 cd/m2, and you would see it become blurry. Or you could just go to OSD -> Color -> Expert and move the RGB gains all the way to 100%; even with the original AG it becomes blurrier.



I'd say polarizing foil is a tad better than plain foil just because it cuts out some polarized light. In my previous apartment I had a glass wardrobe behind my back, and the polarizing foil cut out maybe 80% of the reflections. Polarizing or not, lowering the black point is well worth it, because an FW900 without any kind of darkening is barely usable in a non-perfectly-dark room.



Let us know how it works out; I would buy the leftover foil if the result is satisfactory :)



These are my thoughts exactly.




Polarizing or not, I find the AG to be somewhat different (probably the AG coating itself). The original AG produces much more diffused reflections and is thus easier on the eyes.


If someone knows of a foil with transmittance similar to the original AG, I would be more than happy to try it out.


Late to the party, but what about no AG or polarizer at all?

Actually, I think the analog signal is gone on the GTX 1000 series; you would need a GTX 970 or 980.
On the AMD side, the best card that still has an internal DAC is the R9 380X. There's also an R9 380 4 GB made by Sapphire in the ITX form factor; it's slightly less powerful, but it can be found quite cheap. I guess it's a way to get rid of the remaining 380 chips they had in stock.

Not on the laptop side: the 10-series laptop cards (except the Max-Q series) are within +/- 10% of the desktop cards, and some laptops, like my M17x R4, have VGA.
 
It's not certain that you can replace the graphics card in a laptop, and if you can, I would be surprised if the VGA port weren't part of the graphics card itself. The GTX 1080 series does not have a built-in digital-to-analog converter, and a VGA port would be useless without one.

The F520 is a 21" CRT.


Get learnt son! :p

http://forum.notebookreview.com/

It's a great place to see which laptops have BGA GPUs and which have MXM GPUs (upgradable), as well as benchmarks, user reviews, suppliers, trading, buying and selling, and even guides on upgrades and mods.

On a serious note, my M17x R4 most certainly has an upgradable card, as I upgraded the 7970M to a 980M and will be upgrading to a 1080N. I am using my SGI FW900 (it actually carries a slightly different model number).

I'm getting 40-80 fps in GTA V, but it's not 100% optimized, with the occasional hiccup everyone complains about. The 1070N is 35% faster than the 980M, and this link used to say the 1080N (not the Max-Q version) is more powerful than two 980Ms in SLI in some benchmarks and just as fast as 980M SLI in others.


https://www.notebookcheck.net/NVIDIA-GeForce-GTX-1080-Laptop.171212.0.html
 
Goal for now is to find polarizing foil with higher transmittance / transparency, so it would be not as dark and would allow to achieve 85 cd/m2 with perfect sharpness. When it happens I will get rid of original AG on second monitor with no regret at all.

Y'all are too caught up on the idea of using a "polarizer".
Any neutrally tinted film achieves the same effect (and will generally be even better than a linear polarizer).
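The reason a neutral tint helps at all: the tube's own light crosses the film once (attenuated by its transmittance T), while room light reflecting off the tube crosses it twice (attenuated by T squared), so ambient-light contrast improves by roughly 1/T. A toy sketch with made-up luminance numbers:

```python
def ambient_contrast(white_nits: float, reflected_nits: float, T: float = 1.0) -> float:
    """Contrast ratio under ambient light with a film of transmittance T.

    Emitted light passes the film once (T); reflected room light passes
    it twice (T**2), so darkening hits the reflections harder.
    """
    return (white_nits * T + reflected_nits * T**2) / (reflected_nits * T**2)

bare   = ambient_contrast(85.0, 1.0, T=1.0)  # no film
tinted = ambient_contrast(85.0, 1.0, T=0.6)  # hypothetical 60% film

print(f"bare: {bare:.0f}:1, tinted: {tinted:.0f}:1")
```

Same math whether the film is a polarizer or plain tint; the polarizer just adds extra rejection of already-polarized reflections (like off that glass wardrobe) on top of this.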


Polarizing or not, I find AG to be somewhat different (like probably AG coat). Original AG produces much more diffused reflections and thus easier for the eyes.
Also, I don't know about the FW900 film, but the original CPD-G520P film isn't "diffusing" like matte LCD antiglare films. The thin-film coating on it literally reduces reflectance; it doesn't spread reflections out.
(Any diffuse reflections you see from a CRT come from the phosphor layer.)
 
Y'all are too caught up on the idea of using a "polarizer".
Any neutrally tinted film achieves the same effect (and will generally be even better than a linear polarizer).



Also, I don't know about the FW900 film, but the original CPD-G520P film isn't "diffusing" like matte LCD antiglare films. The thin-film coating on it literally reduces reflectance; it doesn't spread reflections out.
(Any diffuse reflections you see from a CRT come from the phosphor layer.)


Reminds me of when I took the film off my old FW900. I really wish I hadn't. Yes, under perfect lighting conditions it was technically better, but the lighting had to be damn near perfect to see the benefit.
 
2 Questions:

1. Is there any consensus on the best graphics cards to use with this? I assume analog out is a prerequisite. I know there are other solutions like the HDFury, but they don't sound all the way there yet. Besides, if you're running mostly older games, you don't need all the GHz in more modern cards anyway.

2. Is there a good guide for calibrating these somewhere? I know WinDAS with the special USB cable is likely involved near the middle/end of the process, but I'm looking for a fully comprehensive guide. I'm big on calibration and getting the most out of the hardware.
 