24" Widescreen CRT (FW900) From Ebay arrived, comments.

Sorry, I should have clarified.

It has a noticeable scratch in the antiglare directly in the middle of the screen, creating a rainbow effect visible on flat colors (like a blue sky, etc.).

If I can manage the issues of removing the antiglare just by using it in the dark, I would happily do that.
If there's a rainbow effect, this is probably not just damage to the AR layer. There must also be a dent in the glass; I can't think of any other explanation.
 
Having recently thought a lot about human vision and colors, I remembered how, after removing the AG coating on my FW900, colors became much less vibrant, while before they were vibrant in ways I didn't see on other Trinitrons.

So first, a humble reminder: DO NOT REMOVE THE AG COATING ON A SONY GDM-FW900. If it's scratched, just deal with it, unless the AG filter is so completely ruined that using the monitor is impossible. In that case there is nothing else that can be done, but otherwise don't go willy-nilly ruining an amazing monitor by removing the AG because you read some posts from idiots recommending it. And of course we can assume good will, but that one guy who was selling broken FW900s and advertised an AG-removal service, always claiming the image looks much better, was just dishonest. Of course it was immediately suspicious, and I should not have ignored that.
This comment finally motivated me to complete a job I set out to do last year.

I removed the AG coating from my monitor around 2011-2012 since I was comfortable using it in a dark environment and had good monitor calibration equipment to tweak the settings to my heart's content. The initial calibration results were amazing—whites jumped from 90 cd/m² to 105/110 cd/m², and gamma was set at 2.4 with no issues. On paper, it seemed like a free upgrade.

However, over time, I noticed whites blooming and overpowering the picture, dulling colors and reducing detail. Simple white subtitles would cause this bloom, turning background blacks to grey. I constantly had to tweak ReShade settings (Levels, Colourfulness, Vibrance) for each game.

Bloom effect on subtitles

I then came across this youtube video BVM/PVM Tint where someone applied a 50% tint on a BVM/PVM. I ordered various tints to see if I could replicate the original AG coating and fix the bloom and color loss. After testing, I found the 50% tint to be the best and used it on my FW900.

In the pictures below, the FW900 is calibrated without the AG coating, so the tint patch is not calibrated and at around 55/65 cd/m².

50% Tint Patch

I’m glad I finally applied the tint that had been sitting in my room for over a year—the results are better than expected. I just wish I had done it before finishing BOTW, Sekiro, and other RPGs.

So yeah, I have to agree with the above comment: running without any form of coating is a headache in lit rooms, with color/detail loss.

New tint applied
 
whites jumped from 90 cd/m² to 105/110 cd/m²
In the pictures below, the FW900 is calibrated without the AG coating, so the tint patch is not calibrated and at around 55/65 cd/m²
How many nits does your monitor do at actual maximum, without and with your filters?
I understand your FW900 can do more than what you quote, right?

Mine gets 140 nits with the linear polarizer before a white image starts to get that extreme blur all overdriven CRTs get. Without the filter it was more than twice that, as the polarizer removes over 50%. I measured >300 nits at some point after I removed the filter.

I used my FW900 at usually 120 to 140 cd/m² for a few years with no issues.
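Side note: these numbers are roughly self-consistent, and easy to sanity-check. Here's a quick Python sketch; the 45% figure is an assumed real-world transmission for a nominal 50% linear polarizer, not a measured value:

```python
# Sanity check of the luminance figures above. An ideal linear polarizer
# passes at most 50% of unpolarized light; real films pass somewhat less.
# The 45% transmission below is an assumption, not a measured value.

def bare_tube_nits(measured_nits: float, transmission: float) -> float:
    """Estimate the unfiltered luminance from a reading taken through a filter."""
    return measured_nits / transmission

print(round(bare_tube_nits(140, 0.45)))  # ~311, consistent with >300 nits measured bare
```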

I then came across this youtube video BVM/PVM Tint where someone applied a 50% tint on a BVM/PVM. I ordered various tints to see if I could replicate the original AG coating and fix the bloom and color loss. After testing, I found the 50% tint to be the best and used it on my FW900.
Good video.
I wonder if car tint film would be better than a polarizer. In theory it might have better light properties and also not cause the Haidinger's brush effect.

Looks nice.
A lot more reflections compared to my polarizer, which is excellent at blocking them. Though I guess in the same lighting conditions, from the same angle, it would probably be similar. Not sure, but straight on it is much harder to see my face than on any other CRT.

Anyways, your post does in effect motivate me to look for a film replacement. My polarizer application was not perfect (films with adhesive are a bitch to apply without pesky air bubbles) and the film itself had a flaw on one side which I did not notice before cutting a piece. Also, and perhaps more importantly, I would like to find a better filter for the effect I want, something closer to the IBM P275. Imho it has the same phosphors underneath, but the image looks much better on the P275 compared to the FW900 even with the filter.

I will aim for a less natural-looking filter: something slightly more magenta, restoring green by driving the green gun harder. I guess this is why the P275's green gun especially wears out much faster over the years, and I have to correct its levels from time to time each time I visit my father, to whom I gave my P275. It might not be related though - not really sure. I'm not even sure if this is a film like on the FW900 that can be removed, or what. There is a tiny scratch, and it seems like some kind of paint; underneath it the image is brighter and more green. For obvious reasons I didn't attempt to scratch it off to uncover its secrets 🙃
 
Well guys, super happy to have finally found a second FW900 after years and years of looking! :D

Slightly bad news: it is not quite 100% functional, but it is in very, very good condition. No anti-glare though... but that's the same for pretty much all FW900s these days, sadly.

According to the seller, when the monitor works, it works perfectly fine, nothing weird on screen. But sometimes the power indicator just remains orange, with no reaction whatsoever from the monitor no matter what.
So I'm gonna have to figure this out, but at least I will have a mostly good monitor!

Now for the crazy story...

I am from Reunion Island. There is no way I would ever find a high end CRT here locally. Only once, a friend got lucky and, after talking to a random guy locally, learned that he had an unused Mitsubishi Diamond Plus 230SB in a family member's basement. We managed to get the monitor, so that was cool! It was by FAR the highest-end monitor we ever found here in something like 5 years of actively looking.

Okay, so no, the FW900 I just found is not local, it is from France. But after talking to the seller over the phone... what do I learn? He is actually from Reunion Island too; he moved to France a few years ago, and the FW900 was actually on the island before - he brought it with him to France when he moved!! Originally, he acquired the monitor from Germany back in 2009 if I remember right, and had it sent to Reunion.

So that monitor went from Germany to Reunion Island, back to Europe in France, and now I will bring it back to Reunion Island. :eek:

I had no idea, that was so random it's crazy!

So after some discussion the guy told me he would help me since we are both from the island: he will pack the monitor for me so I can ship it via maritime freight, on a pallet.

This is quite the oversized monitor compared to the rest, but my first FW900, which I got back in 2015 (post below), was also packed and shipped from France, that time via DHL Express (air).

I've been looking for a used FW900 for nearly a year with no luck, until last month. I have to say that I live in Reunion Island, a French island somewhere far, far away from France. The monitor I found was located in France.
I have to thank the guy who sold me that monitor because I had no way to see it in person, I had to trust him. He was really nice, and he did a hell of a packaging job. Shipping was handled by DHL Express, and I received the monitor exactly 2 weeks ago in perfect condition. Believe me when I tell you that I was really happy that day.

DSC02156.jpg


So the monitor is a Sony branded GDM-FW900; the guy sold it for about $55, and I paid him about $100 to help with the packaging and everything. I couldn't find better, and I wanted a Sony, not a rebranded HP or something else. Of course the shipping company wasn't willing to offer their services to me, and I had to pay about $220 more to actually get this behemoth shipped. :( In the end, I paid a bit less than $400 with taxes and all that stuff.
I was worried about pretty much everything while waiting for my monitor to arrive.

DSC02159.jpg

DSC02160.jpg

DSC02163.jpg

DSC02167.jpg

Back then I got a triple wall 75 x 60 x 60cm box (29.5 x 23.6 x 23.6in). The FW900 measures 57.2cm side to side (22.44in) x around 52cm depth (20.47in) x 50cm high (19.69in).
This time I am getting a 65 x 60 x 60cm box (25.6 x 23.6 x 23.6in), which seems more appropriate and cheaper to ship, though it's double wall not triple, but that's the only box I can get. Just purchased the box, it will be delivered at the seller's place.

It should be fine in my opinion. I will ask the seller to maybe put something rigid, like a wood board, at the bottom of the box for the CRT to sit on (might be a bad idea - in case of a dent to the box, it might contribute to breaking the monitor stand more easily?), and maybe something similar in front of the glass as well, just to be safe.
Hopefully he will be able to do that. I guess I will ask him to wrap the monitor in bubble wrap first, and then add a layer of random cardboard around it, and fill the free space in the box very tightly with packing peanuts like I did the first time.
And slap stickers like UP, FRAGILE, and GLASS all over the box lol. :dead:

Maybe I could go for something like those expanding foam things - Instapak I think it's called - but I never tried that before, and I'm not sure I'm gonna go that route; since this will eventually be shipped on a pallet, it should be less risky than my first FW900 shipment.
But before the box is put on a pallet, I'm not sure how it will be handled at the warehouse.

What do you guys think?
 
Is the stand removable?
I was asking myself the same question yesterday, taking measurements. But then I realised there is the USB hub in the stand... so I guess that might be a little too much to ask the seller. It's not like some monitors where you can just remove the stand, it seems.

Where mine is located right now is difficult to access. If anyone here knows, please let us know; otherwise I will check on mine in the next few days. Or maybe I'll check the service manual or something.
 
How many nits does your monitor do at actual maximum, without and with your filters?
I understand your FW900 can do more than what you quote, right?

Mine gets 140 nits with the linear polarizer before a white image starts to get that extreme blur all overdriven CRTs get. Without the filter it was more than twice that, as the polarizer removes over 50%. I measured >300 nits at some point after I removed the filter.

I used my FW900 at usually 120 to 140 cd/m² for a few years with no issues.
Color temperature is set to 6.5K. With the AG coat removed, using Windas WPB + DisplayCal 2.4 gamma calibration, I was achieving 105 cd/m². After applying the 50% tint, I redid the WPB + DisplayCal 2.4 gamma calibration, and now I'm hitting 90 cd/m². I had to bump the G2 up by 3 points.

How can I increase the nits without affecting the WPB or causing the blacks to turn gray?

Looks nice.
A lot more reflections compared to my polarizer, which is excellent at blocking them. Though I guess in the same lighting conditions, from the same angle, it would probably be similar. Not sure, but straight on it is much harder to see my face than on any other CRT.

Anyways, your post does in effect motivate me to look for a film replacement. My polarizer application was not perfect (films with adhesive are a bitch to apply without pesky air bubbles) and the film itself had a flaw on one side which I did not notice before cutting a piece. Also, and perhaps more importantly, I would like to find a better filter for the effect I want, something closer to the IBM P275. Imho it has the same phosphors underneath, but the image looks much better on the P275 compared to the FW900 even with the filter.

The pictures of the screen with the tint applied were taken with it deliberately positioned to face maximum light from the bay doors and windows, to capture all the bubbles and imperfections. Once the screen was fully assembled, I took a final picture to highlight a small crease visible near the top center, slightly to the right side of the screen. This particular tint was really easy to work with, as it allows you to peel the sides and readjust. However, on this occasion, I peeled away too quickly and created a small crease.

I had fully accepted that I would need to buy another sheet of tint and do a better job next time. However, once I turned the screen on, I realized I couldn’t see the crease—no rainbow effect, no distortion at all.
 
Color temperature is set to 6.5K. With the AG coat removed, using Windas WPB + DisplayCal 2.4 gamma calibration, I was achieving 105 cd/m². After applying the 50% tint, I redid the WPB + DisplayCal 2.4 gamma calibration, and now I'm hitting 90 cd/m². I had to bump the G2 up by 3 points.

How can I increase the nits without affecting the WPB or causing the blacks to turn gray?
You can try increasing the contrast, but I'm not sure how much that will raise the black level - probably less than increasing the brightness would.

90 nits is decent if you're able to manage lighting in your viewing environment.
 
According to the seller, when the monitor works, it works perfectly fine, nothing weird on screen. But sometimes the power indicator just remains orange, with no reaction whatsoever from the monitor no matter what.
So I'm gonna have to figure this out, but at least I will have a mostly good monitor!

I'm dealing with the exact same issue with my FW900. When I try to turn it on, a solid orange light displays on the power indicator - no degauss or anything indicating that it's powered on.

When connecting it to Windas it shows the following errors

1725417182469.png


Refreshing this temporarily fixed it but the issue still persists whenever I shut the monitor off and turn it on again. Do you think it could be related to the power supply? Maybe it shorted somewhere?
 
When connecting it to Windas it shows the following errors
HV OVP means High Voltage Over Voltage Protection, so it looks like there is some issue with the flyback circuitry for some reason, and it goes into protection.
ABL is the Automatic Brightness Limiter, which is also linked to the high voltage stuff, so it definitely looks to be the same issue behind both errors.
I don't think there is a short, maybe someone would be of better help than me for this.

I have no idea yet what the issue on mine is, I will have to wait something like 4 months to receive mine since it will be shipped by sea. Then I will be able to have a closer look.
 
How can I increase the nits without affecting the WPB or causing the blacks to turn gray?
In the OSD, in advanced/custom color on profile 3 (somehow the brightest), you have bias (same as brightness, but per RGB channel) and gain (which is contrast), and you can max your gain there. With these close to 100 (one should be at 100 for max contrast) you should be able to max out the tube with normal contrast at 90 or so. You might also want to change the max gain value in WinDAS to 255 too - on my unit it was close but not maxed out by default, so I set it to 255.

I always use HCFR https://www.homecinema-fr.com/colorimetre-hcfr/hcfr-colormeter/ to do white point balance on CRTs.
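To illustrate what bias and gain do to each channel, here's a rough sketch of the transfer. The math is purely illustrative (the scale factors are my assumptions, not the monitor's actual firmware behavior):

```python
# Rough model of per-channel bias (black-level offset) and gain (contrast)
# applied to a normalized 0..1 video level before the tube's gamma.
# The scale factors are illustrative, not the FW900's actual internals.

def channel_out(level: float, gain: int, bias: int) -> float:
    """level in 0..1; gain and bias as OSD-style 0..100 controls."""
    v = level * (gain / 100.0)       # gain scales the whole signal (contrast)
    v += (bias - 50) / 100.0 * 0.1   # bias shifts everything, lifting or crushing black
    return max(0.0, min(1.0, v))

print(channel_out(1.0, 100, 50))  # 1.0 - full white at max gain, neutral bias
print(channel_out(0.0, 100, 60))  # > 0 - raised bias greys out black
```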

----
BTW. You say
DisplayCal 2.4 gamma calibration
Did you make sure to have dithering enabled?
If you use AMD then, whether you use an older model with VGA output or any DP/HDMI converter, you need not worry, even running at 8-bit.
Nvidia users, however, have the issue that dithering is disabled for full-range RGB signals, and the only ways to get it are registry tweaks or switching to the limited RGB range. Registry tweaks are the preferred method to squeeze out maximum luminance, but if that is no real consideration then limited range is fine. You will only need to make sure the tube doesn't run out of spec when the range is set to full - just in case - and to adjust brightness/bias to compensate for the raised voltage of black.

I'm writing this in case you use gamma correction with an Nvidia card, as in that case correcting gamma will cause banding. Lots of people talk about needing 10-bit DACs (do we even have these yet?), but actually 8-bit + dithering gives even higher accuracy, as 10-bit alone without dithering (like on the last Nvidia cards with a VGA port, such as the GTX 980Pro) still has tonal errors, even if there usually isn't banding in the sense of completely missing levels. The best image quality was of course on the last Radeons that still supported analog outputs, as then you had 10-bit DACs + dithering. Not sure if there are DACs which have 10-bit inputs.
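The "8-bit + dithering beats undithered quantization" claim is easy to demonstrate numerically. A small numpy sketch (the 0.50121 target is just an arbitrary level that falls between 8-bit steps; the noise model approximates temporal dithering):

```python
import numpy as np

# Why 8-bit + dithering can beat plain quantization: adding sub-LSB noise
# before rounding and averaging over time (as temporal dithering does)
# recovers precision that rounding alone throws away.
rng = np.random.default_rng(0)

def quantize8(v):
    """Round a 0..1 value to the nearest 8-bit level."""
    return np.round(v * 255) / 255

target = 0.50121  # arbitrary level between two 8-bit steps

plain_err = abs(quantize8(target) - target)          # fixed error, every frame

noise = (rng.random(10_000) - 0.5) / 255             # +/- half an 8-bit LSB
dithered_err = abs(quantize8(target + noise).mean() - target)

print(plain_err, dithered_err)  # the dithered average lands far closer to the target
```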

-----
Last trick for ultimate image quality

If you do use a Radeon then it is also possible to do basic gamut correction. I say basic because the accuracy of encoding the monitor's gamut is limited, but it is enough to correct Rec.709 colors to the SMPTE-C gamut that the FW900 uses. It could work out of the box with the default values encoded in the monitor, which do represent the SMPTE-C gamut, but Radeon reacts strangely to the xy white point encoded in the EDID, and this value needs to be reset to standard white - then only the gamut is corrected. To pull that off an EDID emulator is needed. It can be one for VGA, or, if you use DP or HDMI adapters, a model for VGA or DP/HDMI depending on which adapter you use.

Personally I didn't want to spend any money, so I hacked the ATtiny chip inside a DVI-A to VGA converter and flashed it using Arduino to respond with my own EDID. As a bonus I could set a custom resolution and use BNC cables with a full EDID.

The advantage is that all colors within the intersection of SMPTE-C and Rec.709/sRGB are displayed with correct hue and saturation, and all colors which are in Rec.709/sRGB but not SMPTE-C at least have correct hue.

A perhaps simpler way hardware-wise, but more involved in usage and not as universal, is using ReShade shaders that correct the gamut via matrix multiplication. It is also possible to correct gamut in video players, but I don't think there is a way to apply shaders to something like YouTube. Or at least in the past web browsers were CMS-aware, but things like video players were not. Not sure how it looks today, as I have not felt the need to correct gamut for more than a decade at this point.
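The 3x3 matrix such a shader multiplies by can be derived from the published primaries. Here's a sketch of the standard colorimetry recipe for the Rec.709 to SMPTE-C conversion (chromaticities taken from the respective standards; this is the general method, not any particular shader's code):

```python
import numpy as np

# Derive a linear-light RGB->RGB matrix that remaps Rec.709 input to the
# SMPTE-C phosphor gamut. Apply it before the display gamma, which mirrors
# what a gamut-correcting shader does.

def rgb_to_xyz(primaries, white):
    # Columns: XYZ of each primary, scaled so RGB=(1,1,1) lands on the white point.
    M = np.array([[x / y, 1.0, (1 - x - y) / y] for x, y in primaries]).T
    xw, yw = white
    white_xyz = np.array([xw / yw, 1.0, (1 - xw - yw) / yw])
    return M * np.linalg.solve(M, white_xyz)

REC709  = [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)]
SMPTE_C = [(0.630, 0.340), (0.310, 0.595), (0.155, 0.070)]
D65 = (0.3127, 0.3290)

# Rec.709 RGB -> XYZ -> SMPTE-C RGB, folded into a single 3x3 matrix:
M = np.linalg.solve(rgb_to_xyz(SMPTE_C, D65), rgb_to_xyz(REC709, D65))
print(np.round(M, 4))
print(M @ np.ones(3))  # white maps to white: ~[1, 1, 1]
```

Since both gamuts share the D65 white point, white passes through unchanged, which is why only the gamut (not the white balance) gets corrected this way.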
 
You can try increasing the contrast, but not sure how much that will raise the black level - probably less than increasing the brightness.

90 nits is decent if you're able to manage lighting in your viewing environment.
First and foremost, a massive thank you for the excellent WPB guide. I’ve calibrated many screens in my time, from LCDs and plasmas to newer OLEDs, but I’ve always enjoyed calibrating the FW900 using your WPB guide.

I experimented with raising the G2 by +1 and then applying WPB again, but it seemed to result in a loss of black quality. Your suggestion to raise the contrast from 90 to 100 was the best approach, and I hit 102 nits with no loss in black quality.

In the OSD, in advanced/custom color on profile 3 (somehow the brightest), you have bias (same as brightness, but per RGB channel) and gain (which is contrast), and you can max your gain there. With these close to 100 (one should be at 100 for max contrast) you should be able to max out the tube with normal contrast at 90 or so. You might also want to change the max gain value in WinDAS to 255 too - on my unit it was close but not maxed out by default, so I set it to 255.

I always use HCFR https://www.homecinema-fr.com/colorimetre-hcfr/hcfr-colormeter/ to do white point balance on CRTs.
Aha, this used to be my old calibration method, using HCFR with the FW900 bias levels, but I much prefer the WinDAS WPB into a DisplayCAL profile.

BTW. You say

Did you make sure to have dithering enabled?
If you use AMD then, whether you use an older model with VGA output or any DP/HDMI converter, you need not worry, even running at 8-bit.
Nvidia users, however, have the issue that dithering is disabled for full-range RGB signals, and the only ways to get it are registry tweaks or switching to the limited RGB range. Registry tweaks are the preferred method to squeeze out maximum luminance, but if that is no real consideration then limited range is fine. You will only need to make sure the tube doesn't run out of spec when the range is set to full - just in case - and to adjust brightness/bias to compensate for the raised voltage of black.
Yep, I have an Nvidia card, and dithering has been an issue for me as well. I tried a third-party add-on a while back, but I couldn't get it to work. If you have a working solution via the registry, I’m all ears. Thanks!
 
First and foremost, a massive thank you for the excellent WPB guide. I’ve calibrated many screens in my time, from LCDs and plasmas to newer OLEDs, but I’ve always enjoyed calibrating the FW900 using your WPB guide.

I experimented with raising the G2 by +1 and then applying WPB again, but it seemed to result in a loss of black quality. Your suggestion to raise the contrast from 90 to 100 was the best approach, and I hit 102 nits with no loss in black quality.

You're most welcome! And glad the contrast did the trick :)
 
Yep, I have an Nvidia card, and dithering has been an issue for me as well. I tried a third-party add-on a while back, but I couldn't get it to work. If you have a working solution via the registry, I’m all ears. Thanks!
I used this site https://hub.displaycal.net/forums/topic/how-to-enable-dithering-on-nvidia-geforce-with-windows-os/

I remember having to redo that each time I reinstalled the drivers. Also, and much worse, it would disable itself after sleep or hibernation (or both) until a restart.
That said, it is an image improvement, so if you do it and it stops working, at most you're back where you were before.
Also, the last time I used it was two years ago - since there have been changes in behavior between Nvidia driver versions and even Windows versions, it is hard to say what the current state is.
 
Sorry for the FW900-adjacent post. My FW900 is being repaired, so I’ve turned my attention to my F520.

I’m having an issue which I’m totally lost with. The individual RGB biases constantly drift. They will literally be different each time I power the set on and let it warm up for at least an hour before measuring. It’s almost like the G2 compensation is in overdrive and happening rapidly. Sometimes green has become elevated, sometimes red and blue have increased. It might not bother most people that much, but as someone who really cares about colour calibration, I’d love to fix it. I can go in and adjust it in expert mode each time, but it’s wearing thin.

10k hours on the monitor, and I’ve successfully completed WPB (several times). I’ve cleaned all the contacts on the video board and applied some fresh thermal paste where it had slightly dried. I’ve visually inspected the A board, tried both inputs, and made sure G2 and ABL are set correctly; nothing has improved it.

Any ideas? Is there a value within the .dat file that controls G2 compensation? As there is no physical Drift Correct circuit labelled like on earlier Sony monitors, I thought it might be BCF_G2_CRCT, but adjusting that didn’t seem to help.

Am I out of luck?
 
Do the Radeon RX Vega 64 and the Radeon VII support interlaced resolutions through HDMI on modern drivers???
 
Might need some capacitors replaced. Don't forget these monitors were produced during the capacitor plague.
Thanks, on inspection they all look okay, but I haven’t properly tested them. I think I’ve ruled out just about everything else though. I imagine it will be components on the A board or neck board?
 
If capacitors need to be replaced, it's rather because of old age/use than bad quality. Many capacitors are Japanese Rubycon/UCC brands, and some are even rated for 105°C use. That's pretty good for the period.
 
If capacitors need to be replaced, it's rather because of old age/use than bad quality. Many capacitors are Japanese Rubycon/UCC brands, and some are even rated for 105°C use. That's pretty good for the period.
I’ve recently come across a very old post where someone detailed replacing a resistor (R941) on the main board of a CR1 chassis monitor to fix the G2 drift. I wonder if that’s my issue and the resistor is just so cooked it’s allowing the brightness to drift much faster than it usually would.
 
Your issue has little to do with G2; G2 has an impact on brightness, not colors. I'm not sure what you're talking about with that resistor, but it may be the resistor mod, a "dirty" and outdated way to fix excessive G2 from back when people didn't know about Windas and the white point balance procedure.
 
Your issue has little to do with G2; G2 has an impact on brightness, not colors. I'm not sure what you're talking about with that resistor, but it may be the resistor mod, a "dirty" and outdated way to fix excessive G2 from back when people didn't know about Windas and the white point balance procedure.
It may well have nothing to do with G2. But in my original post I explained that the symptoms I’m seeing are the same as what happens with the expected G2 drift on Trinitrons, but vastly accelerated. That is, only the biases drift, in a non-uniform way, upwards. I did the WPB on this monitor just a few months ago with WinDAS. Since then (and before), each time I turn the monitor on, the black level has drifted into either green or purple territory and I have to rebalance the RGB biases and slightly lower the OSD brightness. In only a few months I could no longer get good black levels with the G bias setting and OSD bias setting at 0. Is that not a common occurrence with G2 drift, but at a much more rapid pace?

I’m not saying that particular resistor will be the problem, but I thought it might be worth investigating.
 
And as I said, you're complaining about color balance, not excessive brightness. That part of the circuitry is completely different and not related to G2.
 
And as I said, you're complaining about color balance, not excessive brightness. That part of the circuitry is completely different and not related to G2.
Well yes, but the brightness is creeping up at the same time. When G2 drifts on a Trinitron and gets excessive, it usually also tints the colours. Maybe those two things are just happening discretely, but you can see why I made the comparison? Either way, it was just an observation. What I’m really after is any experience anyone might have had with a similar issue, and where to start to fix it. I’m not super experienced with electronics and don’t want to go blundering in with a monitor this nice and regret it.
 
It may well have nothing to do with G2. But in my original post I explained that the symptoms I’m seeing are the same as what happens with the expected G2 drift on Trinitrons, but vastly accelerated. That is, only the biases drift, in a non-uniform way, upwards. I did the WPB on this monitor just a few months ago with WinDAS. Since then (and before), each time I turn the monitor on, the black level has drifted into either green or purple territory and I have to rebalance the RGB biases and slightly lower the OSD brightness. In only a few months I could no longer get good black levels with the G bias setting and OSD bias setting at 0. Is that not a common occurrence with G2 drift, but at a much more rapid pace?

I’m not saying that particular resistor will be the problem, but I thought it might be worth investigating.
Sounds like the normal big-Trinitron warmup issue, plus you need to lower G2 in WinDAS.
During warmup, if G2 is too high, you often cannot compensate for it with brightness, and the screen usually becomes more green.

Nothing out of the ordinary from your description.
 
Sounds like the normal big-Trinitron warmup issue, plus you need to lower G2 in WinDAS.
During warmup, if G2 is too high, you often cannot compensate for it with brightness, and the screen usually becomes more green.

Nothing out of the ordinary from your description.
It’s 100% out of the ordinary. I’ve used and successfully recalibrated several Trinitron monitors, including another CR1 chassis, and I’ve never had this problem. I never make any judgements until the monitor has properly warmed up, and I’ve done the white balance procedure in WinDAS several times. The issue is more like the warm-up never ends. After one hour all the biases have drifted out of alignment, after two hours they’ve drifted further, etc. They never just stay put. If I put them all back in alignment in expert mode, turn the monitor off and come back the next day, by the time it’s warmed up again they are all out of line again.
 
Interlaced works fine on driver version 537.58; tried it with the CRU technique. It's a fairly recent-ish driver, dated Oct. 10, 2023, so just last year. But definitely not the latest driver. Some recent games might not like an older driver like this one.

Tried 2560x1600i at 140Hz, which is 429.95MHz. It displayed fine, but I was starting to see some noise looking closely at the screen.
My LK7112 adapter does not have a custom heatsink yet, just the tiny one from the eBay seller. So in just a few seconds it did start cutting out. So no more testing for now, I'll continue tomorrow.

I will get some of those heatsinks on Mouser, I have an order to place soon anyway, but tomorrow I'll try to find something to help dissipate the heat for further testing.

EDIT: Tried it again quickly at 3200x2000... and it crashed the driver. Same thing at 3840x2400, of course. Oh well, that damn Nvidia driver that never allowed going past 2560 pixels wide on interlaced. I forgot about that!
Well... so 2560x1600 at around 140, maybe 144Hz, is the best I can do here, it looks like, so far.

EDIT2: 2800x1750 interlaced at 118Hz works. The limit is not 2560 then, it's a bit higher.
Having to reboot in the mode that ignores driver signatures, apply the CRU inf file, and reboot again is not the most efficient way to test resolutions, but well, it gets the job done lol.

EDIT3: 3000x1920 interlaced at 100Hz works as well! Considering 3200 does not, the limit is either 3000 or very close. I will find the exact limit later.
Question: does the latest version of Cyberpunk 2077 work on the 1080 Ti with those drivers (537.58)?
 
btw here's an update on the same (what I thought was) uber-shit generic $5 adapter I bought off the streets near my hood, from this post: https://hardforum.com/threads/24-wi...-ebay-arrived-comments.952788/post-1045887000

I just thought to try it and check if it could run the maximum desirable interlaced resolution on my monitor (Samsung SyncMaster 997MB), 1920x1200i 144Hz - this is 94kHz and 246MHz bandwidth with CVT timings... this little shit runs it like a charm LOL

btw this means that 1920x1440i 120Hz will run just as well, since it's 93kHz and 242MHz... so... lower than 16:10 1200i 144Hz.
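For anyone wondering where those kHz/MHz figures come from, they can be ballparked from the mode alone. A rough Python sketch - the 36% horizontal blanking overhead and 53 blank lines per field are my assumed CVT-ish values, not numbers from the actual spec tables:

```python
# Ballpark the horizontal frequency and pixel clock of an interlaced CVT mode.
# The blanking figures below (36% horizontal overhead, 53 blank lines per
# field) are assumptions that roughly match CVT, not values from the spec.

def interlaced_timings(h_active, v_active, field_rate_hz,
                       h_total_frac=1.36, v_blank_per_field=53):
    lines_per_field = v_active / 2 + v_blank_per_field   # interlaced: half the lines per field
    h_freq_khz = field_rate_hz * lines_per_field / 1e3
    pixel_clock_mhz = h_freq_khz * 1e3 * h_active * h_total_frac / 1e6
    return h_freq_khz, pixel_clock_mhz

hf, pc = interlaced_timings(1920, 1200, 144)
print(round(hf), round(pc))  # ~94 kHz and ~246 MHz, matching the figures quoted above
```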

Here's a photo taken with my phone to confirm I'm not trippin'... lol, I always get terribly unlucky with these things, it was about time luck came my way... so many people spending $20, $30, $50 on StarTech adapters that can't even hold 220MHz properly... and I literally get a decent, fully no-name generic HDMI to VGA adapter without black crush.

Times have changed brothers.
 

Attachments

  • IMG_20240920_193314_HDR[1].jpg
Juanme555, how high can you take it? I was able to take the StarTech DP2VGAHD20 to about 375MHz. I use it at 2235x1397 @ 83Hz.
 
I found an amazing tool - the ability to correct gamut using an ICC profile:
AMD: https://github.com/dantmnf/AMDColorTweaks
NVIDIA: https://github.com/ledoge/novideo_srgb

I only tested the AMD version.
AMD has supported this trick for a long time, but the EDID had to have correct values with an unchanged white point - because correcting the white point doesn't work that well. The FW900 has a different value, so whatever its EDID values are worth, they are unusable. Also, the precision of the chromaticity points in the EDID is very limited, encoded as 10-bit integer values if I remember correctly. Not to mention the biggest issue: you need an EDID emulator.
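The 10-bit precision limit is easy to see in code: EDID stores each chromaticity coordinate as a binary fraction out of 1024. A tiny sketch of the encode/decode and the resulting rounding error:

```python
# EDID stores each chromaticity coordinate as a 10-bit binary fraction
# (raw / 1024), which is the limited precision mentioned above.

def edid_encode(coord: float) -> int:
    """Chromaticity coordinate (0..1) to the 10-bit value EDID stores."""
    return round(coord * 1024)

def edid_decode(raw: int) -> float:
    """10-bit EDID value back to a chromaticity coordinate."""
    return raw / 1024

x = 0.3127                    # D65 white point x
raw = edid_encode(x)          # 320
print(raw, edid_decode(raw))  # 320 0.3125 - about 0.0002 of rounding error
```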
 
Only had to wait a few decades to catch up to previously existing technology :D.
I mean yes. But once it does, it will have pretty much supplanted CRT in all aspects (more or less). That would be a wonderful day. It's like saying "we finally have something that completely replaces the automobile." Or something like that? We like car analogies here, so there you go.
 
I mean yes. But once it does, it will have pretty much supplanted CRT in all aspects (more or less). That would be a wonderful day. It's like saying "we finally have something that completely replaces the automobile." Or something like that? We like car analogies here, so there you go.
Screen life (when not abused) and color purity might be iffy for a while longer, but maybe by the time 1kHz monitors are out it'll be beyond concern.
 
Alright, so we are halfway there lol. I wonder how that is going to affect the GDM’s value when the 1kHz ones come out.
 
Alright, so we are halfway there lol. I wonder how that is going to affect the GDM’s value when the 1kHz ones come out.
Trying to brute force something that a CRT does so effortlessly. Doubt even a 5090 will be enough against 500Hz. And I imagine we can forget about ever getting true multi-scan back.

That said, an OLED can be nice on its own terms. Cue another plug for 4K combined with the late great OLED Motion Pro... :)
 