24" Widescreen CRT (FW900) From Ebay arrived,Comments.

Hello everyone. I thought for sure I'd posted in here before but I guess I hadn't...

Anyways, I'm coming here due to the recent experimental firmware for the RetroTINK 5X Pro scaler by Mike Chi, which allows 720p content to be scaled up to 2560x1440. I have (and love) my FW900 here, and using a GTX 980 Ti I can readily output that resolution over native VGA, but the RetroTINK only outputs HDMI, so I am trying to find an HDMI -> VGA converter that will allow this. I've seen all of the usual suspects before as far as DP -> VGA goes, but not HDMI like the TINK uses. Ideally this would be primarily for 720p-native-focused game consoles such as the Xbox 360, PS3, and Wii U, but in theory it would also be useful should I pick up an Xbox Series X that natively supports 1440p (or a PS5 if Sony were to be so kind), so 60 Hz is all that's necessary. I see this works out to roughly a 312.455 MHz pixel clock, though I'm a complete amateur at this stuff compared to you guys, so I could have gotten that wrong. I've seen some Delock and Sunix DP adapters posted here as high as 375 MHz, so I imagine this is at least in the realm of possibility?
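For reference, here's a rough sanity check on that pixel clock figure. The blanking totals below are assumed CVT standard-timing numbers (roughly 3504x1486 total for 2560x1440), not an actual modeline, so treat it as a ballpark only:

```python
# Rough pixel-clock sanity check for 2560x1440 @ 60 Hz on a CRT.
# h_total / v_total are assumed CVT standard-blanking figures; real modelines vary.
h_total, v_total, refresh = 3504, 1486, 60
pixel_clock_mhz = h_total * v_total * refresh / 1e6
h_freq_khz = v_total * refresh / 1e3
print(f"pixel clock ~{pixel_clock_mhz:.1f} MHz, horizontal scan ~{h_freq_khz:.1f} kHz")
# -> roughly 312 MHz and 89 kHz, consistent with the ~312.455 MHz figure above
# and within the FW900's rated 121 kHz horizontal limit.
```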

If no such adapter exists, would it be possible to get an HDMI -> DP converter, and then use one of the aforementioned DP -> VGA units in a sequence? I'm not sure how this would affect input latency, but I would think, perhaps naively, that an HDMI -> DP converter and a DP -> VGA converter that are both held in positive regard for their performance would still work well in tandem... Either way, I figured you guys were probably the best people around to ask. :)
 
Ideally this would be primarily for 720p-native-focused game consoles such as the Xbox 360, PS3, and Wii U
Why would you want to do that?
720p games can work natively at 1280x720, and at this resolution they will look their best.
For a 2560x1440 monitor or a 5K monitor (probably also 4K) it probably makes sense to upscale the image from these consoles to 1440p, but for a CRT? I would say not.

Rather than keeping the natural CRT scanlines that make games look better, such integer upscaling will remove the scanlines and make games look pixelated.
It is the same as comparing a 15 kHz CRT to a 30 kHz double-scanned VGA image. The latter will look worse. If you double-scan 240p to 480p you might as well use an even higher upscale ratio to remove scanlines completely, as they already do not match the picture, but I would not upscale native 480p to 960p or 720p to 1440p just to turn a perfectly good image into a blocky one.

BTW, for the question itself, "which HDMI to VGA converter does 1440p?", I unfortunately have no answer. I do have a question for you: is it necessary to output 2560 pixels wide from the RetroTINK? I do not support this idea, as I mentioned above, but that aside, if 2560-wide output were not necessary (and it is not) then the bandwidth requirements would be roughly halved. I mean, if you only need to double-scan 1280x720, then a resolution of 1280x1440 is enough, in which case an ordinary cheap converter should be sufficient.
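To put a number on that, here is the same back-of-the-envelope math applied to the half-width mode (again using assumed CVT-style totals, so ballpark only):

```python
# Approximate pixel clock for 1280x1440 @ 60 Hz vs the full-width 2560x1440 mode.
# 1712 x 1486 are assumed CVT-style totals for 1280x1440; real modelines vary.
h_total, v_total, refresh = 1712, 1486, 60
print(f"1280x1440 @ 60 Hz: ~{h_total * v_total * refresh / 1e6:.0f} MHz pixel clock")
# -> ~153 MHz, roughly half of the ~312 MHz needed for 2560x1440 @ 60 Hz,
# which is why a cheaper HDMI-to-VGA adapter might cope with the half-width mode.
```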
 
One way you could do 2x scaled 720p with a standard HDMI adapter would be if the Retrotink supports 1280x1440 output. So a 1x scale on the horizontal axis, 2x on the vertical. This would look identical to a full 2x 2560x1440.

That said, I don't really see the point of doing this over running 720p straight to your FW900. Do you not like the visible scanlines or something? I would imagine they wouldn't be too noticeable, but maybe they are at 24".

I'm not aware of any HDMI adapters that can do full 1440p. Though there is one adapter from Benefei that people have said will go to decently high clocks when it is fed YPbPr. I have the adapter, but I don't have an Nvidia GPU. My Radeon card doesn't let me switch to YPbPr for non-TV resolutions.

Hopefully, HD Fury will eventually release an HDMI 2.0 DAC to fix our problems for good. As of a couple of years ago, they said they still wanted to do it. But obviously it's a low priority for them since they're doing well with their other video processors.
 
BTW, I checked the RetroTINK 5X Pro webpage and it doesn't seem to have an HDMI input...
Meaning that to get 720p upscaled to 1440p one would need to use the analog Component inputs.

Also *720p and 1080i inputs are chroma limited. All other input resolutions preserve the full original chroma bandwidth. 1080i inputs use 'Bob' deinterlacing, 480i/576i inputs can utilize all deinterlacer modes.
Meaning it will have lower chroma resolution when doing anything with the RetroTINK for those consoles, making it now a REALLY BAD IDEA
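Roughly speaking, "chroma limited" means color is stored at a lower horizontal resolution than brightness. Here is a toy sketch of the general 4:2:2-style idea; this is not the RT5X's actual pipeline (its exact subsampling factor isn't documented here), just an illustration of what gets lost:

```python
# Toy illustration of horizontal chroma subsampling ("chroma limited" signals).
import numpy as np

def subsample_chroma_422(ycbcr_row: np.ndarray) -> np.ndarray:
    """Keep every luma (Y) sample, but reuse each even pixel's Cb/Cr for its odd neighbour."""
    out = ycbcr_row.copy()
    out[1::2, 1:] = out[0::2, 1:]   # odd pixels inherit chroma from even neighbours
    return out

# One scanline, 8 pixels of [Y, Cb, Cr]: constant brightness, a red->blue edge after pixel 2
row = np.array([[128, 90, 240]] * 3 + [[128, 240, 110]] * 5, dtype=np.uint8)
print(subsample_chroma_422(row))
# Luma (first column) is untouched, but pixel 3 now carries its neighbour's color,
# so sharp colored edges (fine colored text, HUD elements) get slightly smeared
# while overall detail and brightness stay intact.
```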

That said, I don't really see the point of doing this over running 720p straight to your FW900. Do you not like the visible scanlines or something? I would imagine they wouldn't be too noticeable, but maybe they are at 24".
720p on FW900 looks... ABSOLUTELY GLORIOUS 😎

ps. sorry for shouting 😅
 
I believe the reason they don't track TVs is because the majority are either OLED or VA panels designed for HDR movies, where contrast is paramount. Neither caters to motion clarity specifically.

Motion clarity is also something only a small, "woke" subset of the gaming community cares about. Our irrational love for CRTs made us IMMUNE to the bullshit that g4m3r companies like Asus and Acer sell; gtfo with your 360 Hz tub of vaseline smeared on my screen.
Damn shame. Using the monitor more and playing Doom 3. I have to say that I'm seeing the ANSI contrast advantage more and more. Even on a dark game like Doom 3 this monitor pops a little more than my CRT in most of the scenes. The only difference, and it still sucks but whatever, is that black isn't really black. Hard to explain to my CRT folks who are used to the lower ANSI contrast of the tube.

I think, in the end, I personally still prefer the image of the tube though. Just having that black background be there when the scene calls for it is nice. But I'm slowly getting used to this one. What holds IPS displays back, in terms of contrast?
 
It would make sense 3000:1 panel would be better for CRT owners wanting LCD replacement, wouldn't it?
Of course they are not better :)
I got into VA panels myself after CRTs. Had three until I realized what a load of crap they were. Then I switched to a 600:1 IPS and, despite it having 5x lower contrast ratio as measured with a calibration probe, I finally found something I could watch comfortably.

VA panels are plagued with obvious issues in darker shades. And the worst VA panels are actually the ones people use in comparisons against IPS, where in a dark room the VA panel's black level stays black off-angle while the IPS gets much brighter. Why? Because #000000 might stay the same, but #010101 gets brighter off-angle very quickly. So quickly, in fact, that having two eyes, one eye will see it much brighter than the other just because of the minuscule angle difference. Just look at these sweet photos https://hardforum.com/threads/any-d...e-with-decent-blacks.1993774/#post-1044550912. On IPS panels you would struggle to see any stripes compared to the source image. I know because I created these stripes on IPS. I didn't even have a VA panel at the time, but people started posting photos and it worked perfectly... unfortunately 🙃

Anyways, on VA panels which have no off-angle light glow on pure black, you see splotches of blackness. Try to get immersed while watching a dark movie, or playing a dark game where there is evil lurking in the shadows, when you can see there is absolutely nothing even rendered there. And still it isn't even pure black but has an obvious glow to it. There are also compression artifacts on the edges of these black blobs, and in dark shades in general. Some VA panels have off-angle glow on black and it looks like they just do not use pure black at all. Those do not look nearly as good compared to IPS from a wide angle, but in actual viewing are less distracting. Of course one can raise the black level of panels which have black splotches to minimize how much this effect appears, at the expense of contrast ratio... or better yet, do not buy VA panels at all :D

IPS in comparison has quite a wide angle before anything changes, and when it does, the effect looks like glare. Kinda like some pesky light behind you was in the room and reflected off the screen surface, making it harder to see stuff in this spot. Irritating when it happens, and I can see how people who have faced it on their monitors might find IPS panels bad... but VA is not the solution to any viewing-angle issues 🤣

IPS with A-TW, or OLED-like displays, are the solution to those issues.
Hopefully in the next few years we will have the option to buy IPS panels with a reasonable number of dimming zones and an A-TW-like polarizer, and "black not being black" will be replaced with slight haloing. Just like on a CRT, but with better sharpness, better ANSI contrast, and HDR :)

BTW, I use IPS for almost all video watching: movies, YT, Netflix, music videos, etc., and even without ambient light I never really notice that black could be better. It is what it is; it doesn't draw attention to itself in any way. It is the same for CRT, with the exception that CRT actually draws more attention to dark stuff, because haloing can make certain details seem less visible than they should be. This is especially obvious when I e.g. watch a music video on the CRT that I have watched numerous times on IPS. For new content I haven't seen before it's less obvious. And I can adapt to the CRT quickly, and after a few hours of use I consider the CRT the true reference for how an image should look. After a week or so of using the CRT, watching a movie on an LCD seems strange. But then again I can just as quickly adapt back to IPS, so I consider both good display technologies.

To VA however... I cannot adapt at all. Same for TN panels.
 
It would make sense 3000:1 panel would be better for CRT owners wanting LCD replacement, wouldn't it?
Of course they are not better :)
I got into VA panels myself after CRTs. Had three until I realized what a load of crap they were. Then I switched to a 600:1 IPS and, despite it having 5x lower contrast ratio as measured with a calibration probe, I finally found something I could watch comfortably.
I’m not sure I’m following you. I’m not saying I want a VA monitor. I’m just saying that if this particular monitor had a deeper contrast it would be perfect for me.

I only mention VA because I know they can achieve that contrast and better and I was just wondering how they could do it is all.
 
Why would you want to do that?
720p games can work natively at 1280x720, and at this resolution they will look their best.
For a 2560x1440 monitor or a 5K monitor (probably also 4K) it probably makes sense to upscale the image from these consoles to 1440p, but for a CRT? I would say not.

I mean, if you only need to double-scan 1280x720, then a resolution of 1280x1440 is enough, in which case an ordinary cheap converter should be sufficient.

Isn't this subjective? I think it's pretty easy to say you think a game should be run at its original resolution only, but as for something looking "the best" then isn't that up to the individual viewer? Heck, most games from that generation ran at 720p native and were intended to be scaled by the console to 1080p anyway, so even authorial intent is an iffy argument for how they "should" be seen. For what it's worth, I'm fine with the scanline look of 1280x720 on the FW900, but I'm also fine with not having them there at 1080p or 1440p. I don't get upset when a 1080p game doesn't have that scanline look to it, and I don't lose sleep over games running higher than their original resolutions or framerates in emulators and the like. What if I did prefer NOT to have scanlines; would that be objectively 'wrong' or 'worse' than having them? Isn't one of the FW900's greatest perks being able to run content at flexible resolutions, refresh rates, and even aspect ratios as I enjoy? (PLEASE NOTE I'm not saying that everything you guys talk about as far as optimal settings for this or that doesn't matter, I'm just saying some people might have different preferences or want to experiment).

As far as "ordinary cheap converters being sufficient," I've currently tried a few but none have supported 1440p. I've used the Tendak and the white box HDMI2VGA one that is powered by Mini-USB, and neither of them accept 2560x1440 16:9 content or 1920x1440 4:3 content. Again, I'm not nearly as knowledgeable on the technical stuff as you guys, and I don't think you meant that they would work with 1440p as is, just figured I would throw in that they either lack the pixel clock (if that is the right term) or just don't allow any 1440p resolutions in firmware.

That said, I don't really see the point of doing this over running 720p straight to your FW900. Do you not like the visible scanlines or something? I would imagine they wouldn't be too noticeable, but maybe they are at 24".

So to clarify a bit, the RT5X has a few features that are fun to mess around with, such as its own fairly high quality scaling filters and scanline generation. One bonus is a smoothing algorithm that can supplement the anti-aliasing (or lack thereof) of these aging consoles, while others allow you to soften your image to recreate more of an authentic 'retro' look akin to 15 kHz consumer CRT televisions. The scanline options are fun to play with, especially since you can get a taste of what these games might look like on various shadow mask/aperture grille tubes. I guess this might be 'heresy' to you guys to even think of altering the already perfect look of the FW900, but maybe I'm just a weirdo that likes to tinker with experimental stuff. ;)

BTW, I checked the RetroTINK 5X Pro webpage and it doesn't seem to have an HDMI input...
Meaning that to get 720p upscaled to 1440p one would need to use the analog Component inputs.

Also *720p and 1080i inputs are chroma limited. All other input resolutions preserve the full original chroma bandwidth. 1080i inputs use 'Bob' deinterlacing, 480i/576i inputs can utilize all deinterlacer modes.
Meaning it will have lower chroma resolution when doing anything with the RetroTINK for those consoles, making it now a REALLY BAD IDEA


720p on FW900 looks... ABSOLUTELY GLORIOUS 😎

ps. sorry for shouting 😅

This is exactly correct; the RT5X does not support HDMI input, but all of the above consoles are able to output analog video, or can simply be run through a lagless HDMI -> Component YPbPr converter. I've been a lurker in this thread for a while now, but I'm also a layperson when it comes to things like color depth and gamuts and such. What exactly do I lose, in simple terms, by having chroma-limited output? Just from the term alone, it makes it seem like some of the color information would be gone, but to my obviously untrained eye, this doesn't seem to be making 720p content look 'bad' coming out of the RT5X, such as original Xbox games or 720p HDMI content via the aforementioned HDMI -> YPbPr converter. I won't disagree that 720p content looks glorious on the FW900, but from my own experience, EVERYTHING looks glorious on the FW900, thus my desire to tinker and play around with stuff to my fancy. Easily the best display I've ever owned! :D

I'm not gonna lie, I didn't expect this much resistance and skepticism to my initial post, haha. I wanna be clear though I'm not meaning to antagonize or say anyone is flat out "wrong" lol. I would still like to know though what you all think about going the HDMI -> DP route in general, if not for the RT5X or FW900 but possibly for other content like a PS5 or Series X. Or, heaven forbid, I had another GPU and all the DP connections were taken by other outputs or just didn't work, and I decided I wanted to use an HDMI output to go to a DP input monitor that might itself have its HDMI connections unavailable.
 
@XoR,

You're conflating a few different things together which are separate issues with different consequences.

VA has significantly higher contrast ratio, but it has leakage like any other lcd.

Even with that leakage though, the overall contrast off-axis is still higher than IPS.

The reason IPS looks more even is because the contrast ratio is So-Low that the leak is not as obvious relative to its horrendous black level.

As for "shadow detail", the gamma tracking plays a big roll in determining what is , isn't , displayed.

With IPS, alot of manufactuers compensate for its poor contrast by using black point compensation, raising the gamma near black, I've seen 1.1, 1.5 gamma out of the box at 20%. so you see the shadow detail, but it's actually very very wrong, as the backlight leak is significantly brighter than those pixels should be. The backlight actually crushes 100% of the shadow detail, but it's being Raised way above spec to pop them back out. it's there but it's "wrong".

With VA, they usually follow a pure power curve, flat 2.2 or flat 2.4 all the way down, so you get legit shadow detail at the correct luminance on-axis, but of course, due to the limitation of LCDs, off-axis that detail is subject to contrast drift, which makes it look something like the IPS shadows.
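To make the contrast-floor/gamma interaction concrete, here is a rough sketch; the 100 nit white, 1000:1 contrast, and 2.2 gamma figures are illustrative assumptions rather than measurements of any specific panel:

```python
# How a panel's black floor interacts with a pure power-law gamma near black.
# All numbers are illustrative assumptions, not measurements of a specific panel.
white_nits  = 100.0
contrast    = 1000.0                  # assumed IPS-like static contrast
black_floor = white_nits / contrast   # ~0.1 nit of leak even for a 0% signal
gamma       = 2.2

for signal in (0.02, 0.05, 0.10):     # 2%, 5%, 10% gray
    target = white_nits * signal ** gamma
    shown  = max(target, black_floor) # the panel cannot dip below its leak
    print(f"{signal:4.0%} gray: target {target:.3f} nit, floor-limited to {shown:.3f} nit")
# A 2% gray "should" sit near 0.02 nit, but the leak is ~0.1 nit, so the detail is
# either swallowed by the glow or the near-black gamma is raised to pop it back out.
```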

So with the pattern you linked, the VA is giving you the more accurate image, DESPITE the leakage and drift. An IPS will in most cases give you an image that is compensating for its raised black, so yes, you see it as more even, but it's extremely off, and flat looking.

If you watch real content instead of the test pattern, the VA will render significantly more depth because of the deeper blacks; essentially, subjects will look much more set into their background, whereas on IPS, characters might look like they're pieces of 2D paper sliding left and right.

For Movies, it's no contest, 24p filmography is not sensitive to motion blur because there's so much blur in its long shutter times. VA all the way.

IPS is appropriate for fast games if motion matters, but it's pretty horrendous for movies.
 
Yes Mike, I watch test patterns and not real images. You got me 🤪
And you of course totally watch real content and don't obsess about contrast ratio and how black your black level is... that is why you are not bothered by the gamma shift that makes gamma completely wrong across 99.9% of the screen. Makes sense. Thankfully VA panels do not have much retention, otherwise you would have ANSI pron burned into the screen 🙃

Watching an IPS panel is like watching a well-made (self-)illuminated print. I would prefer to watch something with a better contrast ratio, but real-world content doesn't even always have black at #000000, and it often changes from scene to scene. Hence, from watching my precious test patterns I learned to just ignore the constant non-black black and treat it as the current black.

Heck, I didn't have to learn it. When I got an IPS monitor which I measured at 450:1, and which replaced a VA monitor I measured at ~2000:1, my impression of it was: "black is pretty terrible... how am I supposed to watch anything horrendous like this?" and two hours later, after watching a movie, my thought was "damn, I totally forgot about this whole terrible black level thing, have to watch more videos and play more games and then surely I will be bothered by the black level..." And years later, especially on much better displays, it is pretty much the same story, except at 1200:1 it is much harder to be bothered by something that was not bothersome even at a lower contrast ratio. I just do not see it. Maybe if I increase backlight brightness to 450 nits I can notice something, but at that point I am much more bothered by the image being too bright. I usually use the monitor at ~200-300 nits for videos/games.

Watching VA I was constantly thinking about black level, black crush, wrong gamma, viewing angles and also unrelated issues like pixel transition times and overdrive artifacts. Does ambient light make black displayed on IPS look inky black? Yes it does. Does ambient light make VA gamma shift any less ugly? No it does not.

So here you have it, you can easily fix any IPS issues with a light bulb 💡 Less-than-ideal contrast ratio is pretty much the only issue these panels really have... except maybe more glow off steeper angles and backlight bleed issues, which are a result of the manufacturing process being a bit harder for IPS to get uniformity right.

BTW, I do not really get why you are making all this fuss about black compensation. Compensation, if any, is slight, and other than the gamma displays are calibrated to usually being slightly too low compared to where it should be (Rec. 709 is the standard people should use for content consumption, not sRGB), I do not see any issue. If you want a different gamma curve then you can always calibrate the display, or even get a display which has hardware calibration. In any case, if calibration is something you worry about, you should get a display with this capability.
 
I mean, for the most part the contrast of the XG isn’t an issue. A few times it rears its head, but it’s just those few times.

I’ll say this. If you ever see a plasma or an OLED you learn to appreciate that black is a black hole, so to speak. Not a dark gray one. But whatever. I’m not trying to turn this into an argument.
 
I apologize for coming off as judgy, brother XoR. That was not my intention.

I'm not an IPS h8er. In fact, LMCL is basically dual IPS, one panel stacked in front of the other; I am very much looking forward to Hisense's next LMCL with better sync.

As for gamma, while it is a case of "to taste", we generally try to work within the artistic intent of the media. In that respect, IPS is just _not quite right for movies_. This is different for many PC games, because they're very loosely graded and some studios take into account that most gaming is done on IPS, so they do not take advantage of contrast in ways that could be compromising.

For movies, which have always been graded using the deepest black possible, IPS is not faithful.
 
Movies are not made for VA panels either. If anything, if anyone cared about VA, they would rather avoid dark scenes without anything bright to draw attention and put a ton of noise over everything to make all the artifacts less visible... wait... sheet ☹️
 
Movies are not made for VA panels either.☹️

It's not graded ON a va-panel, but VA gets much closer to the intent.

For example, SDR was often graded with a 0.01 nit black point. <these days they use OLED, so 0 nit, but still sometimes 0.01>

On a 6000:1 contrast VA panel, it can drive a range from 0.016 nit to 100 nit, using a flat gamma response.

On a 1000:1 IPS panel, you're looking at 0.1 nit to 100 nit. This KILLS virtually ALL shadow detail. Building these panels, they have to boost gamma at the bottom Way-Up off spec to retain any shadow detail, and the net result is the image looks flatter.

For games, this is less of an issue; if most scenes are bright and dark outlines are not used as part of gameplay, IPS gets away with it. But as jbltecnicspro pointed out, with Doom 3, or any dark game, you definitely notice the panel glow if you've seen the game on other technologies.

If he plays something like Overwatch, everything is bright, so he'd never notice, and the motion clarity is there with the blurbuster panel.
 
It all depends, Mike, on what irritates you more.
To me 0.1 nit is pretty black. Heck, I'm currently watching at 0.3 nits in a black room and I think it's still black. Lots of videos have parts with an even higher black level; I'm not sure if that's for artistic purposes or someone didn't master them correctly, but I am hardly bothered by this.

I cannot, however, stand gamma shift in any amount. And imho, talking about gamma in dark shades being 'boosted' on IPS to argue VA is better makes absolutely no sense to me. VA's whole issue is that the gamma of dark shades decreases (i.e. they are boosted) and is way too low on most of the screen surface, making dark details stick out too much. It is impossible to get consistent gamma on a VA panel.

Anyways, LCD panels are only good for Excel, because they can be made to fit in a laptop or on a desk and they do not have burn-in, and for movies one can get a big OLED. No point in continuing this discussion.
 
Anyways, LCD panels are only good for Excel, because they can be made to fit in a laptop or on a desk and they do not have burn-in, and for movies one can get a big OLED. No point in continuing this discussion.
Sheesh, we get it.

Edit. Moving on. Has anyone tried a Moome external box? It’s an external HDMI interface that also has neat features like gamma adjustment.

For you CRT folk who want deep blacks but not crush and don’t want to be tied to a PC.
 
Kind of irrelevant to black levels and display types, but do you guys prefer to use the BNC connectors or the VGA input on your FW900? Seems like the BNC input is more prone to noise or color issues but my cable could also just have been cheap or faulty...
 
Kind of irrelevant to black levels and display types, but do you guys prefer to use the BNC connectors or the VGA input on your FW900? Seems like the BNC input is more prone to noise or color issues but my cable could also just have been cheap or faulty...
With good cables I never noticed the difference between the two in image quality.
 
One neat thing about BNC connectors is that you can easily hook up a PS2 with a component cable and 3 RCA>BNC adapters if it's set to RGB mode. At least for the games that support 480p or work well with forced 480p/960i via homebrew.

Also a PS3 but unfortunately you're limited to 480p max in analog RGB (though there may be a way around this via homebrew as well)

Has anyone tried a Moome external box? It’s an external HDMI interface that also has neat features like gamma adjustment.

For you CRT folk who want deep blacks but not crush and don’t want to be tied to a PC.

No but I should look into this. Do some models have analog inputs as well? Because it would be nice to use this with a Dreamcast's VGA out, for example.
 
No but I should look into this. Do some models have analog inputs as well? Because it would be nice to use this with a Dreamcast's VGA out, for example.
Digital only I’m afraid. :(
 
That's not true jbl, BNC gives you 300% more e_peen.. that's like at least 50% boost to most people's K/D.
:LOL:

Come on, the BNC connection uses the exact same signals as VGA except it's missing the line that allows plug-and-play capability. The BNC cables may actually be a tad worse, as the wires in VGA cables are shielded from one end to the other, whereas the wires of my BNC cables seem to be unshielded where the cable splits to connect each line separately to the monitor.
 
Goodness, is this actually a thing people think/thought at one point?
Probably not. Assuming good cables the only thing you’ll notice is that your device will no longer sense what kind of screen it is as there’s only RGBHV signals sent. No EDID, no TTL.
 
:LOL:
the wires of my BNC cables seem to be unshielded where the cable splits to connect each line separately to the monitor.
Size matters, if the unshielded section isn't huge, it's not a big deal. But when you have a giant length of it, it's like a net that catches interference.

BNC is just cool though, very cybrpunk. That's how I want to be hooked up to the matrix.
 
You don't happen to have the dead FW900 anymore? Are you located in Germany?
Yes, I'm in Germany, and no, I don't have it anymore. I'd sent it to someone who's knowledgeable with Sony CRTs, but the tube didn't survive the overvoltage, and neither did the voltage regulator. He sold it for me to someone who paid 200 or 300.
 
Motion clarity is also something only a small, "woke" subset of the gaming community cares about. Our irrational love for CRTs made us IMMUNE to the bullshit that g4m3r companies like Asus and Acer sell; gtfo with your 360 Hz tub of vaseline smeared on my screen.
Hahaha
 
CRTs do have motion blur in the phosphor trails, it's just "less visually obvious" than the persistence on other technologies.

OLED can already achieve the motion clarity of CRTs, it just can't do it while being bright enough for everyday use.

MicroLED is already as good as CRTs in terms of motion clarity, but only available for Millionaire$.

p00r peeps like us have to wait for Samsung's QNED (distinct from LG's QNED, which is just IPS).

LMCL is also a possibility, but it's tough to synchronize the dual LCD panels for motion clarity.

Some argue the fast-IPS blurbuster monitor that jbltecnicspro just bought is already pretty darn close.

The hurdles left for fast IPS are matching FPS to the refresh rate, and strobe crosstalk at 240 Hz; it already has near-perfect strobing at 120-144 Hz.
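For anyone wondering why strobing is the key to that, the usual rule of thumb is that perceived motion blur scales with how long each frame stays visible while your eye tracks movement. A rough sketch; the tracking speed and pulse widths below are assumed example values:

```python
# Rule-of-thumb motion blur: smear width ≈ eye-tracking speed × time each frame is lit.
# The tracking speed and pulse widths are assumed example values, not monitor specs.
speed_px_per_s = 960                              # object tracked across the screen

cases = {
    "sample-and-hold @ 120 Hz": 1 / 120,          # lit for the whole refresh (~8.3 ms)
    "strobed backlight, 1 ms pulse": 0.001,       # lit only during a short flash
    "CRT-like, ~0.2 ms phosphor": 0.0002,
}
for name, visible_s in cases.items():
    print(f"{name}: ~{speed_px_per_s * visible_s:.1f} px of smear")
# ~8 px of smear for sample-and-hold vs ~1 px strobed vs ~0.2 px CRT-like,
# which is roughly why a well-strobed LCD can get "pretty darn close" to a CRT.
```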

I don't think OLED can achieve the same motion feel though. I've tried my phone's OLED screen and an OLED TV in comparison with my FW900. There's no ghosting, but the motion has a sort of lifelike smoothness to it on the CRT. As for brightness, QOLED is coming out, and it looks to be nearly twice as bright as OLED, which I'm excited about for use with HDR content... even though I probably won't have the money for it 😅.

I didn't know MicroLED was that fast. Google says it's 0.2 nanoseconds 🤯.
 
I don't think OLED can achieve the same motion feel though. I've tried my phone's OLED screen and an OLED TV in comparison with my FW900. There's no ghosting, but the motion has a sort of lifelike smoothness to it on the CRT. As for brightness, QOLED is coming out, and it looks to be nearly twice as bright as OLED, which I'm excited about for use with HDR content... even though I probably won't have the money for it 😅.

I didn't know MicroLED was that fast. Google says it's 0.2 nanoseconds 🤯.
I was referring to strobed OLED; it actually responds so fast that you can strobe it just like you would a CRT to achieve the same motion clarity. However, it doesn't brighten fast enough for that strobe to hit modern usable brightnesses, so current OLEDs require sample and hold.
 
I was referring to strobed OLED; it actually responds so fast that you can strobe it just like you would a CRT to achieve the same motion clarity. However, it doesn't brighten fast enough for that strobe to hit modern usable brightnesses, so current OLEDs require sample and hold.
Ah, ok. I know that backlight strobing gets rid of the image being held after being sampled for the frame and before the next, but I haven't experienced it yet on fast displays. I have a 1440p 144 Hz VA monitor that I used for some time, and while it felt better to play with backlight strobing on, because of less ghosting, the response time didn't feel any faster. I know VA panels have a weird thing where even though the response time is fast, it can feel slower than on IPS panels that have the same response time...
 
VA has a lot of smearing because its grey-to-grey time is much slower than IPS. With strobe, it cuts down on the smearing, but there's still a lot of crosstalk, i.e. double-image flicker.

More than fast enough for movies, and casual gaming. For hardcore gaming, it's either ips or oled or CRT.
 
So, mostly out of morbid curiosity, I decided to buy a couple of converters for testing purposes, specifically chaining an HDMI to DP converter into a DP to VGA converter for use with the FW900. I found a combo that does in fact work together, and even works with my other HDMI equipment like splitters and such, but when I feed 2560x1440 @ 60 Hz into it, the FW900 shows:

"MONITOR IS WORKING
INPUT 1: 44.4kHz/ 30Hz
OUT OF SCAN RANGE"

Even though we know the FW900 can indeed support 1440p... Does anyone have suggestions on why this is the case? Is it an issue with the digital converters specifically, or is 1440p just not possible for external video DACs? For the record, I'm using the following devices:

HDMI to DP converter (claims to support 4K60): https://www.staples.com/club-3d-0-8...-1331/product_IM17RB016?cid=EM:ordconfrm::sku
Startech DP2VGA20 converter (said to have a ~350MHz pixel clock): https://www.startech.com/en-us/audio-video-products/dp2vgahd20

If anyone has any suggestions, please feel free to chime in! Otherwise I'm going to keep buying more of these converters to test! For science!
 
44.4kHz/ 30Hz looks like a wrong mode rather than an issue with the chip.

I recommend the Delock 62967.
Not the fastest kid on the block, but it can do 1920x1200@97Hz just fine and at least has very good image quality. I didn't notice any difference vs the GTX 980 Ti other than having to use dithering because of the 8bpp limit.
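As a sanity check on that reading: horizontal scan frequency is just total lines per frame times the refresh rate, so the numbers in the OSD point at a 30 Hz mode. The ~1480-line vertical total below is an assumed CVT-RB-style figure for 1440p, not the adapter's actual timing:

```python
# Why "44.4 kHz / 30 Hz" smells like a 30 Hz fallback mode rather than a broken DAC.
# v_total is an assumed CVT-RB-style figure for 2560x1440 (1440 active + blanking).
v_total = 1481
for refresh_hz in (60, 30):
    print(f"{refresh_hz} Hz -> horizontal scan ~{v_total * refresh_hz / 1e3:.1f} kHz")
# 60 Hz would read ~88.9 kHz; 30 Hz gives ~44.4 kHz, matching the OSD message,
# so the adapter chain appears to be renegotiating the mode down to 30 Hz, and the
# FW900's minimum vertical rate (48 Hz) makes it report OUT OF SCAN RANGE.
```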

VA has alot of smearing because its grey2grey time is much slower than IPS. With strobe, it cuts down on the smearing, but there's still alot of crosstalk, double-image-flicker.

More than fast enough for movies, and casual gaming. For hardcore gaming, it's either ips or oled or CRT.
It is not the overall G2G response times that cause the smearing but the black to dark gray transitions specifically.
Some transitions from black can take multiple frames, hence the visible dark streaks.
This is caused by the liquid crystal in a VA panel really liking to stay in the position where it produces a black screen, and once it's there it doesn't want to move.

In the past, and on some TVs probably even today, there is a trick where they pre-tilt the crystal when a black -> dark gray transition is about to happen, and this sped up these black to dark gray transitions considerably. Old S-PVA panels which used this didn't have these dark streaks. IPS panels were still faster and had fewer issues with overdrive artifacts, but overall response times on S-PVA were quite good for the time. Heck, a lot of TN monitors had more dark streaks (a similar issue) than S-PVA.

The issue with this trick is that in order to do the pre-tilting they would need to know in advance whether there will be such a transition, and because LCD panels have crystals but not crystal balls, the trick adds input lag. It would not be such a big issue if:
- the panel was driven faster than the refresh rate, e.g. twice as fast
- it was optional - for movies, input lag doesn't matter as much as dark streaks or the overdrive artifacts caused by trying to force a pixel out of its preferred orientation

Especially for faster panels adding 4-8ms input lag wouldn't be such a big issue even for games if it improved response times.
 
I know VA panels have a weird thing where even though the response time is fast, it can feel slower than on IPS panels that have the same response time...
It is not some weird thing; VA panels actually just have worse response times.
Visually, you notice smearing for the slowest transitions, not the average. The average G2G transition time is a pretty pointless way to measure response times on an LCD panel, and it is only used because it tends to give the really low numbers monitor manufacturers need for marketing. For the same reason, they include those extreme overdrive modes which are unusable even in the most optimal case and absolutely horrendous at e.g. 60 Hz.

If you have one panel whose transitions mostly fall between, say, 4 ms and 8 ms, and another panel whose transitions range from 1 ms to 30 ms, and on average they are the same, then the first one will not only feel smoother with less smearing, but any overdrive artifacts will also be less noticeable. When you get to really slow panels there can be really weird overdrive artifacts, like posterization (banding) on gradients.

Best to look at actual tables for response time measurements and not average values.
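A rough way to see why the slowest transition dominates what you perceive; the 8 ms and 30 ms figures are just the illustrative numbers from above, and 144 Hz is an assumed refresh rate:

```python
# How long a single pixel transition lingers, expressed in refresh periods.
# Transition times and refresh rate are illustrative assumptions, not measurements.
refresh_hz = 144
frame_ms = 1000 / refresh_hz                      # ~6.9 ms per frame
for name, t_ms in (("worst case, faster panel", 8), ("worst case, slower panel", 30)):
    print(f"{name}: {t_ms} ms ≈ {t_ms / frame_ms:.1f} frames of visible trail")
# An 8 ms worst case is done within roughly one frame, while a 30 ms worst case
# smears across ~4 frames, which is why two panels with identical *average* G2G
# numbers can still look completely different in motion.
```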
 
44.4kHz/ 30Hz looks like a wrong mode rather than an issue with the chip.

I recommend the Delock 62967.

It's definitely weird... The converter combo I have took 1920x1440 but keeps giving me the OUT OF SCAN RANGE for 2560x1440. Not entirely sure why it's saying "30 Hz" at all when everything I'm feeding it is 60, but I went ahead and ordered a Delock per your suggestion so we'll see if that solves the issue.
 
It is not some weird thing; VA panels actually just have worse response times. Visually, you notice smearing for the slowest transitions, not the average.

Best to look at actual tables of response time measurements and not average values.
Makes sense. Although, I remember watching a video that tested the response time of the VA monitor in depth before buying it, and the response times from lowest to highest were similar to, and even a bit lower than, the IPS monitor I had before it, yet the difference was still there. Either way, I'm only using my FW900 now, so it doesn't matter to me anymore. I hope my FW900 lasts me a while :)
 