24" Widescreen CRT (FW900) From Ebay arrived,Comments.

That also happens with mine with my Sunix DPU3000 when playing at 1280x960@60Hz, but like yours, it's so small and so far outside the usable screen area that I don't find it annoying or anything to be concerned about at all.

EDIT: In fact, I've realized it happens with all the 4:3 resolutions I use (2048x1536, etc.) that leave some unused space at the vertical borders of the screen. It seems related only to the DPU3000 in my case; I didn't see it when I was using native analog outputs. But again, nothing to worry about.
 
While gaming, the game will change the refresh rate to 85Hz even though my desktop is set to something higher.

e.g. 1600x1200 at 100Hz on the desktop, but the game runs it at 85Hz instead.

Anyone know how to lock the in-game refresh rate to the desktop's?
 
Anyone know how to lock the in-game refresh rate to the desktop's?
1) Switch to 100Hz before you launch the game
2) Use Nvidia's "use max refresh rate" option in the 3D settings
3) Use a third-party program like RefreshLock

and if everything else fails:

4) Make a custom resolution of 1624x1218 @ 100Hz and just use that
 
So I was trying to see how high a resolution I could do at a 100Hz vertical refresh on the FW900, using GTF timings, keeping within the display's true aspect ratio, and keeping things divisible by 8 (if that's still a thing). I ended up with this:

FW900.PNG


Is it odd that the FW900 goes this high in horizontal refresh? (The display itself also reports 122kHz.) Also, I'm thinking maybe it's not a great idea to run it at its very limit...
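
If anyone wants to sanity-check numbers like these without a modeline calculator, here's a rough Python sketch of the VESA GTF formula (default constants, no margins or interlace). It matched the usual gtf-tool output for the modes I checked, but treat it as an estimate rather than a reference implementation:

Code:
def gtf_estimate(h_pixels, v_lines, refresh_hz):
    """Rough VESA GTF timing estimate (default C'/M' constants, no margins)."""
    MIN_PORCH = 1              # vertical front porch, lines
    MIN_VSYNC_BP = 550.0       # vertical sync + back porch, microseconds
    CELL_GRAN = 8              # horizontal granularity, pixels
    C_PRIME, M_PRIME = 30.0, 300.0   # GTF blanking-formula constants

    # estimate the horizontal period (us) from the requested refresh rate
    h_period_est = (1e6 / refresh_hz - MIN_VSYNC_BP) / (v_lines + MIN_PORCH)
    v_sync_bp = round(MIN_VSYNC_BP / h_period_est)       # vsync + back porch, lines
    total_v_lines = v_lines + v_sync_bp + MIN_PORCH
    v_field_est = 1e6 / (h_period_est * total_v_lines)
    h_period = h_period_est * v_field_est / refresh_hz   # refined horizontal period

    # ideal horizontal blanking duty cycle -> blanking width, rounded to 16 px
    duty = C_PRIME - M_PRIME * h_period / 1000.0
    h_blank = round(h_pixels * duty / (100.0 - duty) / (2 * CELL_GRAN)) * 2 * CELL_GRAN

    total_pixels = h_pixels + h_blank
    pixel_clock_mhz = total_pixels / h_period     # h_period is in us, so this is MHz
    h_freq_khz = 1000.0 / h_period
    return round(pixel_clock_mhz, 2), round(h_freq_khz, 2), total_pixels, total_v_lines

print(gtf_estimate(1600, 1200, 100))   # -> (280.64 MHz, 127.1 kHz, 2208, 1271)
print(gtf_estimate(1600, 1200, 85))    # -> (234.76 MHz, 107.1 kHz, 2192, 1260)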
 
I've been running my LaCie at 135kHz-140kHz for years now and things seem OK. But that's only when I'm playing games. For general desktop use, I'm back down to around 100kHz, or whatever 1600x1200 @ 85Hz comes out to.

And have you tried using CRU? I find it more intuitive than AMD and Nvidia's tools, and it allows you to save profiles which is a big deal when updating drivers.
 
IMO it's only a matter of the entire electronic chain being able to sustain the signal frequency without distortion. There shouldn't be a risk of failure just from using the screen in the upper part of its refresh range.
 
Thank you both. I used to use PowerStrip to force resolutions back in the day, but just the built-in tools after that. Downloaded CRU. Looks good. Just used it to fix 1600 by 1024 at 100Hz, I think.
 
And have you tried using CRU? I find it more intuitive than AMD and Nvidia's tools, and it allows you to save profiles which is a big deal when updating drivers.
AMD's custom resolutions are stuck at 6bpc max, so I tried CRU. AMD doesn't use EDID overrides created by CRU (or by any other method that writes to the registry) when the display is connected to an MST hub. EDID overrides appear to be stored per display model and GPU port, so you need to copy the override to each port you connect the display to. While creating the override in CRU, you may want to change the display name in it to include the name of the port it's connected to; the modified name makes it obvious when the custom EDID is actually in use. In the failing case (display on an MST hub), Windows shows the new display name but the AMD software does not. When the display isn't on an MST hub, the AMD software shows the new name as well.
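
If you want to check which connections actually have an override installed, the overrides end up under the monitor device keys in the registry. Here's a quick sketch; my assumption is that CRU uses the standard Windows EDID_OVERRIDE location under each monitor instance's Device Parameters key, so if nothing shows up, check CRU's documentation for where your version writes:

Code:
import winreg

# List monitor instances that carry an EDID_OVERRIDE subkey under
# Device Parameters. Assumption: CRU piggybacks on the standard Windows
# EDID override location; adjust the path if your CRU version differs.
BASE = r"SYSTEM\CurrentControlSet\Enum\DISPLAY"

def subkeys(key):
    i = 0
    while True:
        try:
            yield winreg.EnumKey(key, i)
        except OSError:
            return
        i += 1

with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, BASE) as display:
    for model in subkeys(display):
        with winreg.OpenKey(display, model) as model_key:
            for instance in subkeys(model_key):
                try:
                    path = instance + r"\Device Parameters\EDID_OVERRIDE"
                    winreg.OpenKey(model_key, path).Close()
                    print("override present:", model, instance)
                except OSError:
                    pass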
 
That's an interesting test, joevt.

I got around the issue by using an Extron EDID 101V, which I programmed with a Dr. HDMI from HD Fury. Dr. HDMI will take BIN files generated by CRU.

Convoluted as hell, but at least I have a decent selection of default resolutions.

The other problem is that "standard resolutions" are output with CVT reduced blanking timings on AMD, so they're not usable on a CRT (reduced blanking doesn't leave enough retrace time).
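
To put rough numbers on that (figures from the usual GTF/CVT calculators, so treat them as approximate): 1920x1200@60 with CVT reduced blanking is a 154 MHz pixel clock with only 160 pixels of horizontal blanking, i.e. about 1 µs for the beam to fly back, while the GTF version of the same mode is around 193 MHz with 672 pixels of blanking, roughly 3.5 µs of retrace time. A CRT needs the latter, which is why reduced-blanking "standard" timings end up unusable on it.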
 
Greetings, I'd like to ask some basic questions. I can get an IBM P275, but I would need a converter to be able to connect it to my GTX 1060 (3x DisplayPort, 1x DVI-D, 1x HDMI). I only care about 1600x1200@85Hz, which is the monitor's native resolution, and possibly 100Hz if it's achievable, but that's not mandatory. I care about colors, pixel effects and mostly about input latency. But which converter should I choose? I've noticed that all those "professional" converters (i.e. Synaptics VMM2322 hardware) cost more than the actual CRT. Do I need such an expensive converter for my 1600x1200@85Hz (100Hz)? Also, as there is some selection available, which port is best for converters: DisplayPort, DVI-D or HDMI? I've read, for example, that most active converters require extra power, and while DisplayPort has the highest bandwidth, it's also 3.3V while HDMI and DVI-D are 5.0V. Does that matter?

To add, I'm from Europe so my list of possibilities is limited to what's available here.
I have found that I could try to get this Tendak here: https://www.amazon.com/Tendak-Converter-Adapter-Portable-Connector/dp/B01B7CEOVK
or
This Delock: https://www.reichelt.com/de/en/adap...ga-female-black-delock-62967-p211925.html?r=1

Are those even any good or advisable, and which one is better?
 
Greetings, I'd like to ask some basic questions. I can get an IBM P275, but I would need a converter to be able to connect it to my GTX 1060 (3x DisplayPort, 1x DVI-D, 1x HDMI). I only care about 1600x1200@85Hz, which is the monitor's native resolution, and possibly 100Hz if it's achievable, but that's not mandatory. I care about colors, pixel effects and mostly about input latency. But which converter should I choose? I've noticed that all those "professional" converters (i.e. Synaptics VMM2322 hardware) cost more than the actual CRT. Do I need such an expensive converter for my 1600x1200@85Hz (100Hz)?


The StarTech DP2VGAHD20 DP to VGA adapter seems to be one of the best for price/performance/pixel clock among those tested in this forum topic, from what I remember (up to a 350+ MHz pixel clock, which is even higher than the roughly 280 MHz required for something like the 1600x1200@100Hz combo you're asking about).

At the moment of writing this, it seems to be available on Amazon UK:
https://www.amazon.co.uk/StarTech-c...d=1&keywords=DP2VGAHD20&qid=1621422954&sr=8-1

For more info about adapters with good test results, search here for posts by user "derupter".
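
As a quick sanity check, here's a throwaway snippet; the mode requirements are GTF estimates (same ballpark as the ~280 MHz figure above) and the 350 MHz limit is just what's been reported in this thread for the DP2VGAHD20, not a datasheet value:

Code:
# GTF-estimated pixel clocks (MHz) vs. the adapter limit reported in this thread
modes = {
    "1600x1200@85":  235,
    "1600x1200@100": 281,
    "2048x1536@85":  388,
}
limit = 350   # MHz, reported for the StarTech DP2VGAHD20 (not an official spec)

for mode, clock in modes.items():
    verdict = "fits" if clock <= limit else "too high"
    print(f"{mode}: {clock} MHz -> {verdict}")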
 
Delock 62967 is solid. It does up to around 340-something MHz pixel clock, which will be plenty for 1600x1200 :)
I enable dithering in the registry to get 16-bit gamma, which is maybe not the best approach, but 1) I'm not sure 10-bit adapters even exist, and 2) dithering would still be needed for best results (note: 8-bit with dithering will be better than 10-bit without it, so anyway...)
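
To illustrate the point about dithering (a toy sketch, nothing to do with the actual registry flags): a level that falls between two 8-bit steps can be alternated between them so the average over many pixels or frames lands on the intended value, which is effectively extra gamma precision at the cost of a little noise.

Code:
import random

def quantize(v):
    # plain truncation to an 8-bit step
    return int(v)

def dithered(v):
    # random dither: pick the upper step with probability equal to the fractional part
    base = int(v)
    return base + (1 if random.random() < (v - base) else 0)

target = 100.3   # an "in-between" level, e.g. produced by a gamma/LUT correction
plain = sum(quantize(target) for _ in range(100_000)) / 100_000
dith  = sum(dithered(target) for _ in range(100_000)) / 100_000
print(plain, round(dith, 2))   # 100.0 vs ~100.3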
 
I only care about 1600x1200@85Hz, which is the monitor's native resolution,

CRT monitors don't have a native resolution. They don't have individual pixels like an LCD.

So you shouldn't limit yourself to one resolution, especially if you play a lot of different games.

Download CRU and start making custom resolutions. That way you can run higher refresh rates for some games, or higher resolutions for others. Or if a game is really intense on your GPU and you can't vsync at 85Hz, you can make a 60Hz resolution instead to make it smoother.
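
For example, the GTF figures for a 1600x1200@60 mode (the kind of numbers you'd enter in CRU's detailed resolution dialog; double-check them with a GTF calculator or CRU's automatic timing options) come out to roughly: pixel clock 160.96 MHz, horizontal 1600 active / 104 front porch / 176 sync / 280 back porch (2160 total), vertical 1200 active / 1 front porch / 3 sync / 38 back porch (1242 total), which is about 74.5 kHz horizontal.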
 
Thanks. Did you ever try running a CRT through the iGPU's VGA port in a dual-GPU setup? You can clone desktops or force just the CRT, and it will use the dGPU as the renderer, but at a performance hit. I've seen plenty of topics where people discuss it, so maybe there's something to it. And what about input lag in such a setup?

Also, I want to ask about the BENFEI DisplayPort to VGA adapter (I've read a lot about the HDMI to VGA ones here), did anyone try it? Is it a passive or active adapter? It's like ten times cheaper than the Sunix here.
 
For Cyberpunk 2077, I ended up with a resolution of 1880 by 1200 at 72 Hz. This is with all graphics settings through ray tracing maximized, except I had to enable DLSS (quality setting). I also tried lower resolutions without DLSS, but Cyberpunk appears to benefit more from the combination of higher resolution plus DLSS.

Supporting hardware: RTX 3090 (EVGA FTW Ultra), 5800X, stock timings including RAM at 3600 MHz, Resizable BAR enabled.

The frame rate appears to be locked, or almost so, with the big caveat that I have only seen the initial portions of the game. (An exception to what I've seen so far is a big FPS drop in the training area. However, as the training area appears to leave significant GPU and CPU headroom on the table, I think this is an issue with the game.)

(As to 1880 versus 1920, it may be a bit silly, but I always advocate such slight tweaks to match a CRT's true aspect ratio.)
 
It's probably more of an issue with the FW900, with 1280 by 1024 being the classic exception to avoid on 4:3 tubes. (And, ironically, the 1600 by 1024 version is almost spot on for the FW900.)

FWIW...I also still make sure both the horizontal and vertical are divisible by 8.
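
If anyone wants to automate that, here's a tiny sketch. The aspect value is whatever you take the tube's visible ratio to be (the ~1.567 below is just what the 1880x1200 figure above implies, not a measured number); it returns the nearest width divisible by 8 for a given height:

Code:
def crt_width(height, aspect=1.567, gran=8):
    # nearest width for the given aspect ratio, rounded to a multiple of 8
    return round(height * aspect / gran) * gran

for h in (1200, 1440):          # heights that are themselves divisible by 8
    print(h, crt_width(h))      # 1200 -> 1880, 1440 -> 2256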
 
Hello everyone.

Yesterday I received my Sony GDM-FW900. But there are problems.
The monitor arrived by car.
There is a red spot in the left corner and a green one in the right corner. The degauss function didn't help. It does seem to work, though: there is a click when it's pressed and the image shakes.
If the monitor is turned on its side or upside down, the spots move to the other side of the screen.
I tried to tweak it with the LANDING menu; I managed to get the left side of the screen back to normal, but the right side turned red-green.

I found a similar issue in this thread on page 447 (posted by CommandoSnake), but I didn't find whether he managed to fix it.
He was given the following answers:
1. page 447 (Enhanced Interrogator)
All signs point to a broken aperture grille, which is unfixable.
But there is a possibility that the yoke just came loose in the drop, which is fixable.
2. page 448 (LAGRUNAUER)
If the tube was hit (and it looks like it was), the tube is rendered useless...
3. page 450 (mathieulh)
Yes, this can be fixed, I would get it myself but I am in the EU and shipping this from the US would cost way too much.
This is a "landing" issue (the magnets stuck on the tube to mitigate deflection probably moved; you'd need to reset them in place).

I contacted a CRT repair technician.
He advised me to first leave the monitor turned off for a day (last night the monitor was turned off with its button, but it was still connected to the power supply).

Is this a displacement of the magnets?
Where are they located on the monitor?
Is there any chance of getting back to normal colors?


Thank you.

P.S. Sorry for the mistakes, I'm not a native English speaker. As for the scratches on the anti-glare layer: it will be removed if the monitor can be repaired.
 

Attachments: IMG_8369 (1).jpg, IMG_8377.jpg, IMG_8379.jpg
Damn that's really bad.

The problem is with "purity". If you google that, you will find a lot of information.

If you're lucky, only the yoke came loose. You're going to have to open it up and see if the yoke has shifted.

Magnets popping off is another possibility, you can look inside to see if any fell off.

If you're unlucky, the aperture grille inside the tube is loose, and you'll have to find another tube. Though the rest of the electronics inside should still be working fine
 
Damn that's really bad.

The problem is with "purity". If you google that, you will find a lot of information.

If you're lucky, only the yoke came loose. You're going to have to open it up and see if the yoke has shifted.

Magnets popping off is another possibility, you can look inside to see if any fell off.

If you're unlucky, the aperture grille inside the tube is loose, and you'll have to find another tube. Though the rest of the electronics inside should still be working fine
Thanks, I've seen this issue with other monitors, but I couldn't find out whether they were fixed in the end.

Please tell me if I have circled the location of the magnets in the pictures correctly.

To begin with, I'll wait another 8 hours, so that a full day will have passed since I turned it off. If that doesn't help, then I will look for a repairman.
If it really is just the magnets, or the degaussing coil helps, then I'll clean the dust out of the monitor at the same time.
I could climb inside myself, but I'm too young to die))

The chance of replacing the tube is close to zero, because I won't find a second monitor in my country(
 

Attachments: 1.png, 2.png
Well, the last image looks a lot better. Looks like you had some degree of luck with the landing adjustments.

It wasn't as bad as your FW900, but an F520 arrived here with an error beyond what the landing controls could compensate for. I addressed it, at least temporarily, by positioning some magnets on my desk.

I suppose it goes without saying that you've moved any speakers or other inadvertent sources of magnetism well away from the display.
 
Well, the last image looks a lot better. Looks like you had some degree of luck with the landing adjustments.

It wasn't as bad as your FW900, but an F520 arrived here with an error beyond what the landing controls could compensate for. I addressed it, at least temporarily, by positioning some magnets on my desk.

I suppose it goes without saying that you've moved any speakers or other inadvertent sources of magnetism well away from the display.
These are images of the same monitor; IMG_8369 (1) was just taken in the evening, and the other two on a sunny morning with a different refresh rate (the monitor was also turned on its side and upside down).

I have an idea of what might have happened. When I carried the monitor up to the 4th floor it was in my backpack, and there were 2 magnets in the clasp... I completely forgot about them...

On the other hand, they are not that powerful. It's also possible that the magnets got shifted during transportation (I really hope so).

upd1. Disassembled the monitor and took 2 photos (attached). Please check whether everything is in order and where those damn magnets are.

upd2. Found a magnet at home. With a spacer made from a piece of wine cork, I fixed the right side.

upd3. I straightened out the left side with the same magnet (I only have one), but a small greenish tint appeared at the bottom left.

So is the aperture grille still undamaged?
 

Attachments: Screen+Shot+2018-04-18+at+7.41.24+AM.png, IMG_8382.jpg, IMG_8383.jpg, IMG_8384.jpg, IMG_8385.jpg
Well, if you keep the tube close to a powerful enough magnet, it may stay magnetized, and that could indeed explain your issue. There aren't many ways to fix this: wait and repeatedly apply the internal degauss function to see if it improves, and if it helps but isn't enough to fix the issue entirely, bring it to a professional to be treated with an external degausser.
Putting more magnets into the mix isn't exactly a good way to deal with this, IMO.
 
Well, if you keep the tube close to a powerful enough magnet, it may stay magnetized, and that could indeed explain your issue. There aren't many ways to fix this: wait and repeatedly apply the internal degauss function to see if it improves, and if it helps but isn't enough to fix the issue entirely, bring it to a professional to be treated with an external degausser.
Putting more magnets into the mix isn't exactly a good way to deal with this, IMO.
Thank you. I removed the magnets; I had placed them only as an experiment.
I'm waiting for a technician with an external degausser.
 
Thank you. I removed the magnets; I had placed them only as an experiment.
I'm waiting for a technician with an external degausser.

Back in the day, I demagnetized a CRT by duct-taping 2 magnets to the chuck on my drill, pulling the trigger, and waving it around the screen.
 
The external degausser didn't help. So far I've worked around the problem with 3 small magnets.
Any ideas? Is the tube dead after all?

I removed the anti-glare layer today, and I can see 2 thin horizontal stripes running across the screen. Is this normal?
Are these part of the aperture grille (i.e. the damper wires designed to damp the vibrations of the vertical strings that make up the grille)?
 

Attachments: IMG_8391_(1).jpg
Yeah, that is part of the tube; all Trinitrons have those thin damper wires across the screen.
Thank you. On a light background it's terribly distracting. My BVM A14F5M has one such line, but it is only visible at resolutions higher than 480p.
 
Did anyone disassemble a BENFEI or RANKIE or another cheap adapter and check which chipset exactly sits inside? Those adapters are a real lottery, as they all look the same but can have totally different electronics inside.

I've also found the DELL 5KMR3 (also known as M9N09) adapter; it looks sturdier than the other adapters and I can get it equally cheap. Did anyone try those DELL adapters?
 
I have a Benfei I can take apart sometime. I don't use it.

It can do 4K60 at 4:2:0, but the colors are wrong. Haven't tested other resolutions yet.
 
That would be great, but bear in mind that all those adapters are of very poor quality, and one common issue is that the VGA port doesn't hold well in the case. They save money even on glue when making them, which could make reassembly a bit of an issue. I have to choose between DP to VGA and HDMI to VGA. I've heard that technically DP should be superior, but in practice they used higher-grade chipsets in those cheap HDMI adapters, and the DP versions are usually limited to a 165MHz pixel clock; moreover, sometimes it's not even a pixel clock limit but some over-the-top refresh rate limiter (60Hz). Anyway, in my case just 1600x1200@100Hz is fine, and even 1600x1200@85Hz would be acceptable as I am on a 17" DiamondTron. I could probably use YCbCr as well since I'm on an NVidia card.

A little tip for everybody, nothing new, but perhaps it will help some people out. I am using image sharpeners on old CRT games and some of them look very good. The software is NVidia Control Panel or ReShade or any similar tool. It won't work with everything; sometimes it requires compatibility patches/mods to make games use a supported version of DX/OpenGL.
 
Alright hemla, I got a few pics of my Benfei HDMI-to-VGA + stereo audio adapter. I can't read exactly what the bottom numbers say, but the number at the top is probably more important.
 

Attachments: DSC_0330.JPG, DSC_0328.JPG, DSC_0324.JPG