24" Widescreen CRT (FW900) From Ebay arrived, Comments.

Referring to top-edge compressed linearity issue or top-edge folding issue

This may not be applicable to your particular issue, but it's a generic symptom seen on almost any CRT when:
- You try to use LCD timings on a CRT.
- You enter too-small sync/porch numbers in Manual mode in a Custom Resolution Utility.
- The CRT's vertical-reset capacitors get slightly too weak to reset the electron gun fast enough.

Sometimes the simple workaround for an edge linearity problem is to use a larger blanking interval. The symptom means the blanking interval is too short for the CRT gun to move from the bottom back to the top edge.

If the problem is a slight top-edge linearity problem, try small increases to Vertical Back Porch in a Custom Resolution Utility. If your linearity problem spans roughly 50 pixels, then try increasing Vertical Back Porch by about 50.

You can also experiment with transferring Front Porch to Back Porch, so you have more time on the other side of the sync signal. Older CRTs usually need more time after Vertical Sync to allow the electron gun to reset back to the top edge; if the gun hasn't revved back up to a constant speed when it starts the top-to-bottom sweep, it creates a top-edge linearity issue (compressed linearity at the top, or a fold-over effect).
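To make the porch-transfer idea concrete, here is a minimal sketch (hypothetical 1920x1200-class timing numbers, not actual FW900 factory timings):

```python
# Moving lines from Vertical Front Porch to Vertical Back Porch keeps the
# Vertical Total (and therefore the refresh rate and horizontal scan rate)
# unchanged -- it only relocates hidden time to AFTER the vertical sync,
# where the electron gun needs it. Timing numbers are hypothetical.

def v_total(active, front, sync, back):
    return active + front + sync + back

original    = v_total(active=1200, front=3, sync=6, back=26)
transferred = v_total(active=1200, front=1, sync=6, back=28)  # 2 lines moved front -> back

assert original == transferred   # 1235 == 1235, so bandwidth and refresh stay the same
```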

The numbers in a Custom Resolution Utility can easily be visualized as overscan area beyond the edges of the screen, in this signal-structure layout:

VideoSignalStructure.png

Pixels are delivered left-to-right, top-to-bottom, like a calendar or a book. The back porches are the overscan areas that act as the "CRT electron gun acceleration area" after the reset (triggered by the sync signal): the overscan area above the top edge (vertical), and the overscan area to the left of the left edge (horizontal). So any compressed linearity there is indicative of not enough beam acceleration.
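Since the attached diagram may not come through for everyone, here is a rough stand-in sketch of one frame in transmission/scan order (the line counts are made-up placeholders, just to show where the back porch sits):

```python
# One video frame in vertical scan order. The back porch is the hidden
# "electron gun acceleration area" between the sync-triggered reset and the
# next visible top edge. Line counts are made-up placeholders.

frame_in_scan_order = [
    ("active video",          1200),  # visible lines, drawn top to bottom
    ("vertical front porch",     3),  # hidden lines below the bottom edge
    ("vertical sync",            6),  # commands the gun to reset bottom -> top
    ("vertical back porch",     26),  # hidden lines "above" the next top edge:
]                                     # the gun re-accelerates here

for name, lines in frame_in_scan_order:
    print(f"{name:22} {lines:5} lines")
```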

The fix, naturally, is to embiggen the Back Porch (overscan area) -- giving the CRT gun more hidden "acceleration room" in the overscan area, so that it is at a constant speed by the time it emerges below the top edge (vertical) or from the left edge (horizontal). Constant speed = linearity issue disappears.

So fudging around with the numbers can add/remove extra time for the CRT electron gun to reposition and re-accelerate (horizontally or vertically) so that it's at a constant speed before picture data is shown -- preventing the linearity issue caused by not enough time for the CRT gun to accelerate before displaying picture data.

Cables have delivered pixels in this sequence for the last 100 years, from 1930s analog TVs to 2020s DisplayPort: left-to-right, top-to-bottom. Yesterday, syncs/porches were analog commands/timings to control the CRT electron beam. Today, for digital flat panels, they're just de facto digital comma-separators. But the signal layout has been the same for 100 years -- even a 240 Hz FreeSync or G-SYNC signal is transmitted in this layout and sequence too, complete with porches and sync pixels, a century later.

Be noted, increasing Vertical Back Porch increases your horizontal scan rate unless you decrease the refresh rate. So you may need to back off by a few Hz in refresh rate, to get enough horizontal scan rate headroom to hide an edge-only linearity issue.

If you're already maxed out on refresh rate spec-wise, then hiding a 5% top-edge-only linearity problem with the "bigger blanking interval" trick may require roughly a 5% reduction in refresh rate.

To ease the "transfer-bandwidth-to-bigger-VBI" math, you can use ToastyX CRU and lock the Horizontal Scan Rate (Horizontal Refresh Rate) while increasing Vertical Back Porch a few lines at a time (start with roughly the pixel height of your linearity problem, then adjust slightly bigger/smaller until your "too-little-time-for-electron-gun" linearity problem is safely hidden in the VBI). ToastyX will automatically reduce the refresh rate in proportion to the increased Vertical Total.
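If you want to sanity-check what ToastyX is doing under the hood, the arithmetic is simple -- a minimal sketch, again with hypothetical timing numbers:

```python
# "Lock horizontal scan rate, grow the VBI" arithmetic.
# All timing numbers below are hypothetical placeholders.

def refresh_rate_hz(h_scan_rate_khz, v_total):
    """Vertical refresh (Hz) = horizontal scan rate / total lines per frame."""
    return h_scan_rate_khz * 1000.0 / v_total

h_scan_khz = 106.0   # locked horizontal scan rate (kHz)
v_total    = 1235    # 1200 active lines + porches + sync

print(refresh_rate_hz(h_scan_khz, v_total))       # ~85.8 Hz originally
print(refresh_rate_hz(h_scan_khz, v_total + 50))  # ~82.5 Hz after adding 50 back-porch lines
```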

If your linearity problem at the top edge is slowly getting worse and worse, it's sometimes a hallmark of a CRT electron gun reset becoming weaker and weaker (becoming too slow to reset during the time interval of a VBI) -- this will require repair work, hopefully just replacing caps (easy) that are no longer able to give a strong enough gun-position-reset kick fast enough.

But it also happens when you use "Automatic" instead of "Manual" in a custom resolution utility, since those blanking intervals are often too small for CRTs. In that case it's possibly simply a VBI too small for the CRT's spec -- it would have had the same problem even when the CRT was new.

On the other hand, a minor degradation may have happened since the display was new -- in which case making the VBI bigger with a larger Back Porch (more time to accelerate the scanout to a constant velocity) fixes the linearity. This is fine for those small 5% top-edge-only linearity problems, but keep a hawk eye out for a gradually degrading linearity issue.

Hope this helps understand left-edge and top-edge compressed-linearity and how to quickly fix them;

YMMV -- but sometimes a simple fix, sometimes not.
 
It's harder to see when you're running fullscreen 4:3, it's more subtle. So there's a chance you just didn't notice it back then

In my picture it's running 2560x1600 (16:10). It's especially pronounced when running 16:9 from modern consoles.
I remember when it was new that I spent hours adjusting the geometry with the various resolutions using different models of grids, I would certainly have noticed the problem.
Now I notice the problem just by looking at the desktop icons at the top of the screen.
Moving the icons from the bottom to the top of the screen, I notice that they change in size vertically, but only at the top end.
I discovered this problem 4 or 5 years ago but I honestly don't know how long it took to show up.
 
I remember when it was new that I spent hours adjusting the geometry with the various resolutions using different models of grids, I would certainly have noticed the problem.

Interesting. Then maybe it's a problem that shows up in the first few thousand hours, then levels out. Because mine hasn't perceptibly changed one way or the other since I purchased it from a printing studio 8 years ago.

Referring to top-edge compressed linearity issue or top-edge folding issue

Thanks for the response, I've actually thought of a lot of this stuff, and shifting lines from the front to back porch does fix it.

BUT, I also play consoles on my CRT, and timings are not adjustable on those.

One solution I did find was that Extron RGB processors, like the 203rxi, have a vertical position adjustment that will shift the blanking interval around. This actually works great at 720p, but the Extron isn't fast enough to give a good picture at 1080p, which is the resolution I use for PS4 and Switch. Maybe there is another Extron RGB processor that can handle 1080p out there somewhere, but I haven't found it.

But what you said about the gun reset time possibly being tied to degraded caps is interesting. My problem is I don't know how to study the schematic for a CRT and find the circuits responsible for vertical deflection.
 
Yes -- consoles are a tough problem when a CRT is too degraded to reset fast enough.

Sometimes the "capacitor shotgun approach" works wonders -- replacing all electrolytic capacitors with ones of the same farad rating and at least the same or higher voltage rating. There are some YouTube videos about re-capping CRTs, but it depends on how worthy the CRT is of rescue (a Sony FW900 would be!). Definitely heed the high-voltage warnings (leave the CRT unplugged for a while, then discharge all caps). There are HOWTOs specific to particular models, which vary from model to model and may have simpler solutions than shotgun recapping. But shotgun recapping is a favourite legitimate approach of novice CRT preservationists (who are otherwise adept with soldering irons or other electronics). Be noted that problems can also be with the flyback transformer or other issues, and that gets more complex since those are specialized components. It depends on how good you decide to become at preserving a special CRT specimen, and whether you spend time troubleshooting the exact problem, or simply do the shotgun approach on a CRT that would otherwise be junked.

That said, if it is minor enough (5%), only affects non-PC devices, and is fixable on the PC via CRU, I wouldn't bother until it becomes really bad and the CRT is too difficult/valuable to replace (like the FW900 and its clones!).
 
If the device is really old (like built in the 80s) or was heavily used, full recapping is indeed useful.
But for monitors built in the early 2000s, it shouldn't make much sense yet if the capacitors used were quality ones in the first place. They may be aging, but not to the point where it has any impact on the device's operation. ;)
 
When a fuse breaks, 99% of the time (save the 1% where the fuse itself is defective), it's because something else is defective and caused an abnormal current consumption. Don't count too much on a simple fuse change fixing everything. ;)
Yes, seeing that the fuse broke means that the fuse saved a lot of other components from burning due to the overvoltage, which is why I'm happy to know that it broke. It's currently sitting with the person who's gonna fix it and he'll get to working with it in a few weeks or so since he's busy with other fixes for now.
 
Could anyone recommend me an adapter for the Sony GDM FW900 ? I have a GTX 980 Ti which I can use the DVI to VGA through, but I will upgrade in 2021 to 3080 or RX 6800 XT because of games getting hard to run even at lower resolutions and because I want to experience Ray Tracing. The faster the adapter/the higher resolution and refresh rates it can do and the lower the latency for conversion the better of course.
 
melibond go back 50 pages and just skim through the thread. We've talked about a bunch of different adapters.

You could also just read Derupter's post history, he's made a few good summary posts
 
If the device is really old (like built in the 80s) or was heavily used, full recapping is indeed useful.
But for monitors built in the early 2000s, it shouldn't make much sense yet if the capacitors used were quality ones in the first place. They may be aging, but not to the point where it has any impact on the device's operation. ;)
This is true, so shotgun recapping is a lot of time potentially wasted (but the easier Novice Way for anyone with basic knowledge of caps and soldering).

For those dedicated/inclined, it's best to learn which caps drive the vertical deflection, test the integrity of those caps, and/or replace only the one slightly-degraded capacitor -- degradation caused by a good CRT specimen being heavily used at max spec for months/years, especially the specific caps that were driven harder at high temperatures for years. Some are more heavily used than others. Small degradations can exist after 20 years on heavily used equipment, even if 90%+ of the caps are perfectly fine.

New slight edge-only linearity compressions (unfixable for standard signals the CRT used to handle fine) can potentially be one of these cases -- as long as it's not a source-specific issue affecting all units of that CRT (i.e. a PS5 showing linearity compression on all units of a specific model, even recapped ones), which would indicate timings being out of spec for that specific model of legacy CRT.
 
Yeah, finding the major capacitors in the vertical deflection circuit seems to be the thing to do, but that will probably take many nights of reading old CRT repair manuals and browsing old message boards.

Chief Blur Buster, by the way, on an unrelated note, I read that you guys are making 60Hz strobing required for your certification on new displays. But I was wondering, will you have a higher level of certification that would require arbitrary strobing rates? Like support for, say, 75Hz, 80Hz, 90Hz, etc?

Because the reason I still won't switch from my CRT is because it literally strobes at any refresh rate I choose, and 80hz and 90hz are some of my most commonly used refresh rates.
 
Chief Blur Buster, by the way, on an unrelated note, I read that you guys are making 60Hz strobing required for your certification on new displays. But I was wondering, will you have a higher level of certification that would require arbitrary strobing rates? Like support for, say, 75Hz, 80Hz, 90Hz, etc?

Because the reason I still won't switch from my CRT is because it literally strobes at any refresh rate I choose, and 80hz and 90hz are some of my most commonly used refresh rates.
Yes, Blur Busters Approved Version 2.0 (Year 2021) definitely tries to get LCDs closer to CRT flexibility bit by bit. Although it was not previously explicitly listed as a requirement that all refresh rates must be strobeable, a display must strobe at far more refresh rates than simple presets (like ULMB). Blur Busters Approved requires a very high degree of strobe flexibility as one of the tests to pass.

Even the first monitor that was approved (the ViewSonic XG270) can strobe any refresh rate in ~0.001Hz increments from 75Hz through 240Hz -- all the way to the refresh rate granularity limits of your Custom Resolution Utility. So it strobes 80Hz and 90Hz just fine (using its 75Hz strobe tuning, which has less crosstalk than its 100-120Hz strobe tuning range). It's been sort of heralded as the LCD equivalent of the Sony FW900 by some reviewers -- in the ApertureGrille review by a5hun, at least at the specific 119Hz refresh rate, the XG270 has less motion blur than a CRT (with comparison photos to boot). Even though it has all the typical IPS imperfections like IPS glow and poor blacks, it still managed to beat a CRT in lack of crosstalk + lack of motion blur + lack of ghosting -- at specific refresh rates at least. I managed to successfully out-tune any NVIDIA ULMB refresh rate on any IPS panel. Where NVIDIA artificially caps the strobe rate, I recommended that strobe rates be uncapped/unlocked -- to let end users choose the quality-vs-crosstalk tradeoff.

Blur Busters has independently come up with low-cost strobe-tuning tricks that are far more effective and cheaper than ULMB, while also supporting much more refresh rate flexibility.

Although I cannot yet disclose vendor / model (NDA) -- the prototype strobed LCD sitting on my desk can single-strobe any refresh rate in 0.001Hz increments all the way from ~59 Hz through 240 Hz, sync'd to whatever custom resolution you dream up. Using a custom resolution containing a Vertical Total 4500 at 60Hz (4x QFT), it is practically free of strobe crosstalk for the entire vertical height of the screen, Top/Center/Bottom.

<Technical>
(Behind The Scenes stuff, loved by advanced readers)

The higher the max Hz of an LCD, the easier it is to achieve CRT clarity at low-Hz on the LCD.

To get LCDs closer to CRTs, one of the biggest problems is trying to educate generic scaler firmware programmers (who barely know English) about large vertical totals, since it is harder to strobe-tune if the panel is not compatible with large vertical totals / Quick Frame Transport -- which kills two birds with one stone, reducing input lag AND reducing strobe crosstalk. Some parts of the screen surface can have less input lag than a CRT, thanks to quick frame transport (QFT) + quick scanout (bottom edge 4.2ms versus bottom edge 16.7ms).

It's the art of transmitting a 60Hz refresh cycle faster from GPU to monitor (e.g. over the cable in 1/240sec) and then scanning it out on the panel in only 1/240sec (top-to-bottom sweep in 1/240sec). This leaves lots of LCD GtG time in the VBI between refresh cycles, so GtG can finish unseen in total darkness between refresh cycles.

Crosstalk-free is kind of a "cram the GtG elephant into the VBI drinking straw" problem. Less than 5% of strobed LCDs achieve crosstalkless operation (such as the Oculus Quest 2 VR LCDs). Most VBIs are less than a millisecond, but using large VBIs fixes that. A 60Hz refresh cycle is a 16.7ms refresh cycle, but accelerating it by 4x means the refresh is completed in only 4.2ms. That leaves about 12.5ms of VBI to let the 1ms GtG finish completely unseen by human eyes. A humongous 12ms+ vertical blanking interval permits 10ms real-world GtG100% and then a 1ms strobe flash. Large vertical totals can be done internally via scan-conversion (for backwards compatibility with, say, gaming consoles that slow-transmit a 60Hz refresh cycle in 1/60sec over the HDMI cable), or via an external Custom Resolution Utility.
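As a minimal sketch of that "elephant in the straw" arithmetic, using the same 4x-QFT-at-60Hz numbers as the paragraph above:

```python
# 60 Hz refresh cycle scanned out 4x faster (QFT-style), leaving the rest of
# the refresh period as a dark VBI for GtG to settle plus a strobe flash.

refresh_hz = 60
speedup    = 4                        # e.g. Vertical Total ~4x the active lines
period_ms  = 1000 / refresh_hz        # ~16.7 ms per refresh cycle
scanout_ms = period_ms / speedup      # ~4.2 ms top-to-bottom sweep
vbi_ms     = period_ms - scanout_ms   # ~12.5 ms of total darkness

gtg_budget_ms   = 10                  # let real-world GtG100% finish unseen
strobe_flash_ms = 1                   # then flash the backlight once

print(f"scanout {scanout_ms:.1f} ms, VBI {vbi_ms:.1f} ms")
print("GtG + flash fits in the VBI:", gtg_budget_ms + strobe_flash_ms <= vbi_ms)  # True
```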

There are pros/cons to the two approaches. Internal scan conversion makes it easy (simply turn the feature on/off), while external large vertical totals require custom nonstandard EDIDs/DisplayIDs (or the user creating the custom resolution manually). I've been trying to teach scaler vendors to support both, but I have not succeeded (yet) due to the language barriers (Asian panel developers/programmers), since they don't quite understand the latency-vs-crosstalk-vs-convenience tradeoffs involved. Sometimes they're hardcoded to a specific mode, and I can only "polish turds" as much as I can (Blur Busters has rejected the logo for multiple different LCDs in year 2021, even though I helped tune them better than they would have been without my help. To this day, those vendors I helped aren't even allowed to advertise my involvement, as I don't want to dilute the Blur Busters brand).

But the Holy Grail strobed LCD panel would need to support both, optionally -- low-lag Quick Frame Transport strobing while still being compatible with video game consoles / BluRay players / DirecTV / Comcast / etc. boxes not capable of custom resolutions. (As good as they are, not even many people at NVIDIA quite understand the need for this, either.) But. One domino at a time!

I do not have control over many other aspects of LCD engineering, but I have been making movement on backlight programming/engineering, which has finally made it practical to make 60 Hz single-strobe a mandatory requirement -- I've convinced two manufacturers to stop balking and just do it. Focussing on the core speciality (strobe): NVIDIA only certifies the G-SYNC, AMD only certifies the FreeSync, and Blur Busters only certifies the strobing. But we're getting good at what Blur Busters does.

A future domino of bringing LCDs closer to perfect CRT clarity will also be rolling-scan Full Array Local Dimming (FALD) backlights. Those who remember me from almost ten years ago remember the Arduino scanning backlight experiments, but there were major problems with internal backlight diffusion (also acknowledged in a5hun's 60 Hz single-strobe hack YouTube video -- check it out). Unfortunately, highly programmable high-LED-count low-diffusion Full Array Local Dimming (FALD) backlights are still too expensive, but by ~2025, cheap highly programmable HDR FALDs should enable the 3-figure strobed LCD with CRT-quality blacks (using surge HDR nit headroom to compensate for the darkness of the strobe). Multi-thousand-element MiniLED backlights, mounted very close to the LCD and allowing extremely low internal diffusion with simpler optics, make it a lot more practical than it was ten years ago.

While CRT has a bit of diffusion too (phosphor bloom / halos), light diffusion in a scanning FALD is still a big problem, even if it theoretically helps the GtG problem. One problem is that a 100,000:1 contrast ratio is actually only a 100:1 contrast ratio for the backlight, combined with a 1000:1 contrast ratio for the LCD glass. A 100:1 contrast ratio for the backlight means 1% strobe crosstalk, still human-visible, like RGB(252,252,252) versus RGB(255,255,255). To push strobe crosstalk below the human-visible noise floor, internal light diffusion in the backlight ideally needs to be smaller than a 1000:1 backlight contrast ratio, so achieving a true million-to-one contrast ratio is horrendously difficult for FALD (1000:1 backlight contrast x 1000:1 LCD contrast). We can settle for 100:1, since many FALDs only do 20:1 or less contrast ratio at the backlight level due to the diffusion. At a certain point it begins to look better than the bloom around a CRT phosphor dot or CRT ghosting, and finally gives a good scanning backlight experience. But it's still too expensive to do cheaply.... yet.
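A minimal sketch of the contrast-ratio stacking described above (round numbers, treating the 8-bit values as roughly linear the way the paragraph does):

```python
# Contrast ratios stack multiplicatively: backlight contrast x LCD glass contrast.
backlight_contrast = 100     # what internal diffusion limits a FALD zone to
lcd_contrast       = 1000    # typical LCD glass
print(backlight_contrast * lcd_contrast)   # 100000 -> the "100,000:1" figure

# A 100:1 backlight means ~1% of a bright zone leaks into dark zones.
# Roughly, a 1% difference is about RGB(252) vs RGB(255) -- still human-visible:
print(1 - 252 / 255)                       # ~0.012

# Pushing the backlight itself to 1000:1 is what a true million-to-one
# stacked contrast ratio would require:
print(1000 * lcd_contrast)                 # 1000000
```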

Initially, for CRT equivalence, that's why Blur Busters has long known that global-flash strobe backlights are the easier, cheaper way to sidestep the internal-diffusion problem. Doing the QFT trick (gigantically huge VTs, more than 3x the active vertical resolution) has become easy with 240Hz-360Hz panels, with 360Hz FreeSync panels capable of Vertical Total 6,750 at 60Hz now permitting a 2.7 millisecond 60Hz refresh cycle sweep! Simply refresh the LCD in its entirety in total darkness, wait for GtG to finish, THEN globally flash the backlight on a fully perfect refresh cycle (like a5hun's 119Hz review of the 240Hz ViewSonic XG270). It's going to be a multi-year slog, but we're getting there bit by bit.

The best LCD horse is currently ahead of the best OLED in motion clarity (cherrypicked OLED versus cherrypicked top-1% LCD) for a number of reasons also explained here. Engineering is improving both concurrently, but currently the LCD horse is way ahead in motion-blur reduction (zero ghost, zero crosstalk, zero blur, zero motion artifact) when doing an ultra-quick QFT scanout + excellent OD tuning + heatsinked stadium-bright LEDs (it's hard to cram all those lumens into tiny OLED pixels in ultra-brief flashes as bright as a CRT electron beam dot) -- I've seen very bright 0.3ms real-world, real-measured MPRT100% crosstalk-free LCD strobes already. OLED hasn't yet been able to achieve 0.3ms MPRT with fewer artifacts than a top-1% tune job of an LCD. There are pros/cons and both horses stay in the race to retina refresh rates, but the LCD horse is ahead at the moment.
</Technical>

TL;DR: Strobing is easy at all refresh rates. But doing it crosstalk-free and making it closer to CRT is HARD, just see an impressive DIY attempt by ApertureGrille:

 
Hey Chief. So I played some games with some Samsung PLS (their version of IPS?) screens and I think I could tolerate the lack of contrast if it meant good motion clarity.
 
[...] In the ApertureGrille review by a5hun, at least at the specific 119Hz refresh rate, the XG270 has less motion blur than a CRT (with comparison photos to boot). Even though it has all the typical IPS imperfections like IPS glow and poor blacks, it still managed to beat a CRT in lack of crosstalk + lack of motion blur + lack of ghosting -- at specific refresh rates at least. [...]

I don't want to create controversy here, and with a straightforward and respectful intention I want to share a couple of points, as a gaming computer monitor enthusiast who deserves to know both the positive and negative aspects of the product being advertised -- but I am really disappointed with Blur Busters' over-bias toward this monitor, which in the end does not deliver the quality they advertise.

CRTs are crosstalk-free at any refresh rate; I have yet to see a single CRT ever create crosstalk (and surely you know it).
By the way, now that you are once more comparing the XG270 to CRTs and praising it, you are not mentioning (as expected) important things like the XG270's low brightness level when using its CRT-motion-quality mode, the permanent crosstalk at the top and bottom of the screen even in its best strobing mode, and the other flaws mentioned in a5hun's XG270 video review, which I highly recommend watching in its entirety. I personally found that review credible, among other things because of his honesty and real ethics toward customers, who expect to be transparently informed about the reviewed monitor's features, both negative and positive.


I also want to share some quoted comments from the a5hun video review that I find critical, at least to me. Along with the other described XG270 flaws, they were much more informative than just trusting a very limited "certification"/"approval" page which, when I go to read why the monitor is "certified", only contains words praising the XG270's motion "superiority" over CRTs, along with other advertising information that contradicts the real final quality of the monitor. That made me realize, and saved me from the big frustration and wasted money of finding out, how overrated the XG270 is and how far it is from the "superior motion experience than CRTs" that Blur Busters tried to convince a CRT motion enthusiast like me of (to ensure their commission from ViewSonic, for sure), with certifications/approvals based more on marketing than on technical merit.


Here are the quotes from the video review I was referring to (and the link to the whole video review is at the end of this post):

quote "the xg270 has the best backlight strobing mode i have ever seen but its really only at one specific refresh rate 119 hertz" at minute 18:40

quote "i think the xg270 looks better than the crt........at the center of the screen" at minute 20:46
So the XG270 is only able to reduce its crosstalk at the center of the screen -- not across the whole screen, even in its best strobing mode.

quote "119hz mode does increase input lag slightly" at minute 21:45



The XG270 has a low brightness level of just 70 nits in its best CRT-motion-quality mode, "PureXP Ultra":
quote "its just to dark at 70 nits" at minute 18:23
quote "ultra is just to dark to recommend generally" at minute 21:18
If the user wants a brighter mode, like "Extreme" (the next one after "Ultra"), they will have to sacrifice motion clarity, no longer being in the CRT-motion-quality realm.
quote "not quite yet matching the crt" referring to the "extreme" mode at minute 20:08

So users will have to stick to Ultra with such a dark screen, requiring an absurdly high constant 119fps, with crosstalk still at the bottom and top of the screen and added input lag, to enjoy the advertised "superior motion experience than CRT".........🙄



I just wanted to share this and my personal opinion with you all, from the perspective of a customer who expects transparency and honesty from a product manufacturer, advertiser, or whatever it may be called, so that everyone is aware, just as I would expect to be informed about the negative and positive aspects of this or any other product. I don't pretend to tell anyone whether they should or shouldn't buy this or any particular monitor; obviously it's up to everyone what they want to do with their money and who to support, and of course people deserve to get their commissions from the products they advertise. But it's underwhelming when it's done with false advertising and by fooling users with a flawed product, especially by abusing one's earned popularity as a tactic to do so.


Link to the mentioned XG270 video review:
There is also a written review at https://www.aperturegrille.com/reviews/ViewSonicXG270/ which covers pretty much everything said in the video.




 
I can kind of see the "crosstalk" implied here:
1609797770547.png

Maybe the phosphor decay in the top part is what they mean. caption "CRT" at 20:39 of the above video (a5hun's XG270 review).

Overall I think Mr. Rejhon is almost unimpeachable. If you have followed his work from Blur Busters onward he doesn't appear to be the kind of guy who can be paid off, certainly not at the prices Viewsonic can afford ;).

His statement here is honest and accurate IMO to the best of my knowledge:
Yes, Blur Busters Approved Version 2.0 (Year 2021) definitely tries to get LCDs closer to CRT flexibility bit by bit.
He makes no claim that CRT has been dethroned, and honestly I doubt it ever will in a consumer display even when the technology that makes VR displays so fast and crisp is brute forced into a gaming monitor. The instantaneous brightness of the kilovolt CRT trace combined with the phosphor warmth of the aperture grille inlaid on leaded glass - it must be considered an art form. (kind of joking - but not really :geek:)
 
<SideTrack>
For [H] readers: This reply refers to the ViewSonic XG270 comments discussion about whether it matches CRT motion clarity or not -- in a discussion between myself and 3dfan on the comments section of a5hun's YouTube channel.
I also want to share some quoted comments from the a5hun video review that I find critical, at least to me. Along with the other described XG270 flaws, they were much more informative than just trusting a very limited "certification"/"approval" page which, when I go to read why the monitor is "certified", only contains words praising the XG270's motion "superiority" over CRTs, along with other advertising information that contradicts the real final quality of the monitor. That made me realize, and saved me from the big frustration and wasted money of finding out, how overrated the XG270 is and how far it is from the "superior motion experience than CRTs"
I totally agree with the flaws a5hun picks out. Just because the XG270 has superior motion versus a CRT at specific refresh rates and settings doesn't mean everything else is superior -- for example, CRT blacks versus IPS blacks (grey blacks, glow, backlight bleed, firmware bugs -- issues that also affect operating systems and big-brand G-SYNC and FreeSync monitors too).

3dfan interprets that video as a bad review of the XG270.
But I think the contrary; it's a high compliment that he chose the XG270 as the gold standard to compare against a CRT -- not a worse LCD.

As you know, everybody sees differently and everybody is picky about different things. Stutters? Latency? Tearing? Motion blur? Flicker? Brightness? Blue light? Colors? You name it. Sometimes it's a personal preference, and sometimes vision factors also contribute (not having 20/20 vision, the 12% of the population with some form of visual impairment, person-specific eyestrain from a specific antiglare filter texture, Akinetopsia (motion insensitivity), eyes that hurt with monitors way too bright (100 nits is too bright for some), CRT fuzziness tricking the eyes into focussing wrongly, disliking strobing/flicker/impulsing due to flicker headaches -- which also happen with CRTs -- or not being picky about color quality or blacks quality when bright pictures look great, etc.).

What this means is that what's amazing about the XG270's motion can outweigh other attributes for some people, yet other people may dislike the XG270 because they want a better standard such as CRT or OLED blacks, etc. The LCD industry has a lot of typical "race-to-the-bottom" sub-$1000 desktop gaming LCD flaws. Computers used to cost much more, and premium desktop monitors were far more expensive than they are today -- like CAD$1500 for a Samsung SyncMaster 17GLsi when it was new, a mere 17 inches in 1995. But that doesn't tarnish the motion clarity aspect, if that's your primary attribute. To some people, it's a huge upgrade. Just because I said the XG270 can have superior motion to a CRT at specific settings doesn't mean it's superior in all other aspects. And just because one person is dissatisfied with the XG270 doesn't mean many others aren't satisfied with it -- every product has a dissatisfaction % rating.

Everyone has different preferences and sensitivities. HardForum posters can research the Amazon reviews, the various YouTube reviews, etc., and make their decision. Certainly I do highlight Blur Busters Approved monitors above other monitors, but not to the exclusion of other monitors. Sometimes things like monitor lists go out of date because I haven't had time to update them, while other things like general quality control are hard to deal with -- NVIDIA G-SYNC and AMD FreeSync monitors have been subject to the same issue; even the big companies have had difficulty forcing better quality control onto third parties. I try my best (at least in the strobe department) to be a Narrow Master of Strobing rather than a Jack of All Trades, Master of None. So, mea culpa if you're dissatisfied with your XG270 unit -- but I have been honest.

The namesake, Blur Busters, means we're about motion quality and motion blur -- and we've talked about lots of TN panels. Before the XG270, almost anything strobed was much worse quality. I remember how poor the LightBoost colors were compared to the ViewSonic XG270's colors. Sure, they are not perfect, but if you're a LightBoost user, it's quite a significant improvement mind you -- it depends on where you are coming from. If you are a CRT user and have a huge preference for great blacks, I have already pointed that out in many places too.

Blur Busters tried to convince a CRT motion enthusiast like me (to ensure their commission from ViewSonic, for sure) with certifications/approvals based more on marketing than on technical merit.
I never pretended about the motion. It actually has motion preferable to a CRT for some (see above about different people being picky about different things). But it's not for you if you want all the CRT checkboxes ticked (blacks-vs-other-pros, panel lottery, refresh rate preference, VSYNC preference, global-vs-rolling flicker preference, etc). And many reviews agree. After all, it does have high average Amazon reviews.

Blur Busters has also left money on the table -- refusing money from some monitors that failed certification or were likely to fail (e.g. VA panels that also contained KSF phosphor -- two strikes!). That's why only 1 monitor was Blur Busters Approved in year 2020. For 2020 (taking a cut in my time and the vendor's time as a compromise), I permitted a credit towards helping tune and attempting (and failing) to certify a future monitor instead of approving a strobe-flawed monitor. In those cases I effectively took large pay cuts in one-time situations, failing to certify 3 monitors for the payment of 1 monitor. Zero logos, three times the work -- and it takes lots of work.

Naturally I am biased toward Blur Busters Approved, much like NVIDIA is biased toward G-SYNC and AMD is biased toward FreeSync. To some people that scope (motion blur reduction) is quite broad enough, and to others it is quite narrow -- it is a matter of perspective. Strobe tuning is a very big universe of work and extremely difficult to do crosstalklessly. There are many G-SYNC monitors with bad backlight bleed, and many FreeSync monitors with the 55Hz behavior issue on non-AMD cards (AlienWare, Acer, ViewSonic etc.) -- if you compare the Amazon reviews of the XG270 versus the Amazon reviews of other monitors, the XG270 holds its own pretty well.

But the bottom line, frankly, is that a few percent of ANY monitor's buyers are unsatisfied -- whether 1% or 5% or 2% or whatever -- and 2% is still 200 out of 10,000. Any tech support forum is frequently full of users needing support rather than happy users. It's just something I have to deal with, much like the anti-Linus followings or anti-RTINGS followings, or such. I can only try my best, and I have shown well-intentioned honesty. The ViewSonic Amazon end-user reviews are consistently averaging above 4 stars (4.7-star average on Amazon USA). Understandably, filter/sort the reviews to view 5-star, 4-star, 3-star, 2-star, and 1-star, and you'll see both the happiest and the frustrated users. Even back in their original era CRTs had flaws -- there was always a % of CRTs that were defective (e.g. misconvergence in a corner, difficulty syncing to a specific refresh rate, an electron beam astigmatism/focus issue, etc.), so per-panel defects (e.g. variable backlight bleed) are par for the course, something even NVIDIA/AMD/myself/third parties have difficulty controlling. Blur Busters can only try our best, in the narrow scope where we excel.

Neither the gorilla brands of NVIDIA nor AMD are able to force all panel vendors to massively improve quality control (and even CRTs back in their respective eras have had quality control issues too! Maybe you are not picky about corner misconvergence issues, but another person might be. Everyone is picky about visuals in different ways, and may prefer the combo of a perfect LCD grid combined with great CRT motion clarity for example, even if not caring about blacks as much as you do). Blur Busters Approved is proudly a narrow-scope certification focussed on the name sake, Blur Busters. Even all the big companies know it is hard to make a factory run perfect from unit to unit, especially after the race-to-bottom pricing (four figure display prices becoming three figure prices, etc). That said, I do my best to improve the specialty of Blur Busting -- and that's what I am good at.

- Many strobe backlights don't have a clarity-vs-brightness tradeoff adjustment. (i.e. stays permanently dim). Blur Busters require user choice of tradeoff.
- Many strobe backlights are stuck at fixed Hz presets. Blur Busters require a flexible strobe continuum.
- Many strobe backlights prevent max-Hz strobing. Blur Busters recommends giving users the ability to choose max-Hz strobing (even if it is lower quality than best-Hz strobing).
- Many strobe backlights have too much crosstalk at all refresh rates. Blur Busters requires greatly reduced crosstalk at sufficiently high refresh rates.
Etc.
Then I add new requirements (e.g. single-strobe 60 Hz).
I can only cherry-pick the battles I am able to win, given industry inertia. In situations where many big vendors won't do something, it's miraculous that Blur Busters is able to effect a movement.

Also, because Blur Busters under-advertised Blur Busters Approved in 2020, a5hun (at the time) was unaware of the strobe support going as low as 75 Hz (which does even better than 119Hz). 75 Hz is not a factory EDID inclusion in the XG270, so it has to be created as a custom resolution.

I do understand skepticism by others towards Blur Busters, the hobbyist-turned-small-business -- I still have my hobbyist blood, which helps make me an ambassador / advocate / go-between between The Little Guy and the Big Corporations. I do very well in this role. I do have a reputation to keep and I'm certainly not perfect, but I do want a more complete record here, so:

I even criticize all vendors, including ViewSonic, in some posts. I have no qualms about doing so, even as I promote "Everything Better Than 60Hz".
Crossposting my reply on your YouTube comments, in the interest of advanced-user public information:

YouTube Comments #1 Crosspost

LINK: YouTube Comment Reply by Chief Blur Buster

@3dfan No worries about controversy; three tidbits to clarify that may assuage this: (A) While Blur Busters certainly earns commissions, if you've visited my forums often, you'll see I frequently recommend monitors other than the XG270, including models BB earns no commission from; (B) also, to protect our integrity, Blur Busters does not "mass-review" monitors, in order to avoid competing with reviewers who use Blur Busters' freely available inventions at [Link Self Removed To Respect HardForums Rules] and [Link Self Removed To Respect HardForums Rules] -- I do encourage people to convince reviewers to test monitors. (C) Occasionally I have beefs with some reviewers' occasionally-flawed boilerplate review script templates, including RTINGS', and with situations of limited detail (I am with you there); you should see my comments about this on Blur Busters Forums.

Beyond encouraging users/reviewers to include monitors (sure, shameless plug) -- I have no control over what info reviewers add/omit from their reviews, their lack of allowance to deviate from the script, etc. Obviously I love how RTINGS has improved the reviewer industry compared to 10 years ago and how they have helped popularize one of Blur Busters' most famous testing inventions (the pursuit camera), even though I've had no payments from RTINGS ever to this date (2020) -- my pursuit camera invention is FREELY AVAILABLE! A5hun uses my invention (with full blessing) with his Frog Pursuit app. But yes, BB does reputationally benefit from popularizing inventions (such as the pursuit camera, which replaced $30,000 of commercial lab equipment with a $100 homebrew pursuit camera test that can be built DIY for free -- see the link above about my free-to-build invention).

There certainly may be some bias -- but we are huge believers in "a rising tide lifts all boats":
NVIDIA certifies G-SYNC, and they are biased on G-SYNC.
AMD certifies FreeSync, and they are biased on FreeSync.
Blur Busters tests strobing, and we are biased on good strobing.

Certainly, Blur Busters fully acknowledges bias toward good strobing. However, I never suppressed the XG270's flaws on Blur Busters Forums. I do provide solutions, obviously (e.g. lower the Hz to reduce crosstalk). Just as NVIDIA earns from G-SYNC and AMD earns from FreeSync, Blur Busters does earn from good strobe technology, fitting the Blur Busters namesake -- and most users don't mind, as we already wholeheartedly acknowledge that we earn money off improving the industry's strobing -- we've out-tuned NVIDIA, with 119Hz/120Hz PureXP+ being better quality than 120Hz NVIDIA ULMB on IPS. Certainly, I'm at least damn proud of that (even if 240Hz strobe is so-so on all 240Hz IPS panels). My bias may be showing. COVID hurt the program a bit (slowdowns), making it look like a single-monitor bias (sigh). But Blur Busters Approved 2.0 will have multiple vendors listed by the end of 2021, as BB has signed contracts now. It's a matter of time (thanks to yearlong monitor engineering cycles), so one has to be patient.

___

Also, I noticed that occasionally review scores are artificially penalized by scripted test methodology. One famous example: NVIDIA artificially caps ULMB to a lower refresh rate because of strobe crosstalk, while BenQ/ViewSonic/etc. uncap their max strobe rate. What this means is that some reviewers end up testing max-Hz strobe for strobe quality, rather than using the refresh rate headroom (120Hz strobing on a 240Hz panel can produce superior results to 240Hz strobing on the same panel). In one sense, one has to rhetorically ask -- are reviewers penalizing uncapped strobe rates, even if 144Hz-vs-144Hz strobe is similar (for a 240Hz monitor), when one panel has a 144Hz strobe cap and the other has uncapped strobe (240Hz strobe allowed)?

Max-Hz strobing (240Hz) is usually bad on all panels, so artificially capping the strobe to a lower Hz can improve reviewer scores, while depriving users of choice. There's a continuum of slowly worsening crosstalk as you go a few Hz up and up from 75Hz towards 240Hz, but reviewers usually only test one strobe Hz.

Sometimes a review script is "Test Max Hz Strobe for Quality" or such, which can penalize the score of manufacturers nice enough to uncap their max strobe Hz. We think users should have the choice to strobe at any refresh rate that they want. More crosstalky higher Hz strobe, versus less crosstalky lower Hz strobe.

Another way a review score gets penalized is missing EDIDs. For example, the minimum EDID strobe Hz on the XG270 is 100Hz, but if you use ToastyX CRU or NVIDIA Custom Resolution, you can create a 75 Hz strobe mode (or even a lower-lag Quick Frame Transport 120Hz strobe mode with a VT2250 large VBI), so you can improve strobe lag or improve strobe crosstalk even further (75Hz min instead of 100Hz min). RTINGS temporarily docked the score for the XG270 because they thought 100Hz was the minimum strobe, but now the RTINGS score for XG270 strobe is a higher 8.6 because they realized 75 Hz is the min-strobe. And they upgraded the firmware & added a 2nd photograph for the 120Hz strobe (which looks better). However, that's as far as they'll go within the confines of the manager-approved review testing process (the script of test inclusions).

Another way review scores may be docked is testing older firmwares. RTINGS tested only the older XG270 firmware too (the one prior to the Blur Busters Approved firmware), which worsened the strobe score. However, they've corrected that.
"RTINGS.COM: Update 12/11/2020: We updated the monitor's firmware and retested the BFI range. It can flicker as low as 75Hz when you set a custom resolution and use a 75Hz signal."
Right now, it's a score of 8.6 which is better than the first score because of the old firmware + wrong min Hz knowledge.

Unfortunately, reviewer employees often aren't allowed to deviate from the management-edicted review script, which may sometimes penalize a specific monitor's strobe tests. No 240Hz monitor can do a crosstalk-free job of 240Hz strobing, but sometimes uncapped strobe is unfairly penalized -- since their photograph is always of max-Hz strobe. They do a good job of reviewing a lot of criteria, but sometimes the review methodology is restricted by a management-edicted review script given to one of their fleet of testing employees, so RTINGS clearly was limited in how much they could write about it. While I love RTINGS, I don't give them universal acclaim. If you saw what I wrote on Blur Busters Forums, you have seen I've already had a beef about RTINGS' limited info. However, I have already talked a lot about the XG270's flaws and virtues on Blur Busters Forums. If you need better quality 240Hz strobe, you need a recent TN panel (like the XL2546S or XL2546K), but as far as IPS panels go, I haven't seen a 119Hz/120Hz strobed IPS as clean as the XG270's (so far).

For strobe Hz, all refresh rates should be strobeable, no matter what monitor (including monitors I don't tune). Users should be able to choose any custom strobe Hz! As a hobbyist, I have a major beef with strobe restrictions.

Even though Blur Busters earns commissions from monitors, Blur Busters has already turned down Blur Busters Approved money for 3 models this year -- because they did NOT pass the strobe tuning quality benchmarks. There are a few models on the market as of 2020 that Blur Busters has strobe-tuned (but that the general public doesn't know about, because Blur Busters has not authorized the Blur Busters namesake in their advertising). Some of them were because of KSF red phosphor at [Link Self Removed To Respect HardForum Rules] ... No additional models came because of COVID. (This will be fixed by the end of 2021, with multiple models logo-stamped by end of year.)

While Blur Busters is blur-reduction-biased, clearly (pun!), Blur Busters is not sacrificing the integrity of the Blur Busters Approved program -- even though it was a bit slow during the 2020 COVID era, and will pick up in 2021.

Hopefully this summary (of what I've already written elsewhere in bits and pieces) ... clarifies the behind-the-scenes things better!

YouTube Comments #2 Crosspost

LINK: YouTube Comment Reply by Chief Blur Buster

Just as AMD certifies only the FreeSync and NVIDIA certifies only the G-SYNC, we only certify the strobing. That's what the levels are for: you can choose a brighter strobe which still looks good. Also, there are better panels (TN) than the XG270 if you want 240Hz strobing, such as the XL2546K, and there is the LG CX OLED if you want perfect blacks, etc. However, different people are picky about different artifacts (tearing, or stutter, or motion blur, or too bright, etc). People who need a desktop form factor and prioritize blur reduction with less crosstalk than the competition (NVIDIA ULMB is also dim and has worse images at top/bottom too) have been attracted to the XG270's relatively excellent strobing (compared to any available 24"-27" competition).

When you witnessed the XG270 on your desk, have you seen any strobed IPS better than XG270 [as of 2020] when you're using strobe 100% of the time at its best strobed refresh rate? There are issues on a lot of IPS panels, so a panel is often a tradeoff with different panels. Remember, some of us have motion blur headaches and have to use strobe modes to avoid it -- so it is not a "boring" feature for some of us. Our name sake Blur Busters is a beacon for people who really need to avoid motion blur (even at other cost of flaws). I understand the monitor definitely has other flaws in reviews. And to be fair, you can get PureXP above 200 nits in a blur-to-brightness compromise. The user has been given a choice to adjust the balance at least (it was a mandatory requirement) -- many strobe modes on other models don't even let you adjust this.

PureXP Ultra = 10% persistence
PureXP Extreme = 20% persistence, ~2x brighter
PureXP Normal = 30% persistence, ~3x brighter
PureXP Light = 40% persistence, ~4x brighter
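If it helps put those levels in CRT terms: assuming "persistence" here means the strobe pulse width as a fraction of the refresh period (an assumption for this sketch, not an official ViewSonic spec), the rough MPRT numbers at the 119 Hz sweet spot work out like this:

```python
# Rough MPRT estimate per PureXP level, assuming "persistence" = pulse width
# as a fraction of the refresh period (assumption, not a ViewSonic spec).

refresh_hz = 119
period_ms  = 1000 / refresh_hz     # ~8.4 ms per refresh cycle

for level, persistence in [("Ultra", 0.10), ("Extreme", 0.20),
                           ("Normal", 0.30), ("Light", 0.40)]:
    mprt_ms = period_ms * persistence
    print(f"PureXP {level:8} ~{mprt_ms:.1f} ms MPRT, ~{persistence / 0.10:.0f}x Ultra's brightness")
```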

Metaphorically, it is like an adjustable-persistence CRT phosphor. To get brighter without overvolting the LEDs too much (to death), you need more persistence. ULMB's default setting is higher persistence than PureXP's default setting, so NVIDIA ULMB is brighter but blurrier at default settings, with a slight bit more crosstalk at the lower refresh rates (in the 75Hz-120Hz range). If you adjust ULMB Pulse Width it gets much better, but ULMB then gets even dimmer than PureXP Ultra (try NVIDIA ULMB Pulse Width 10 as an example -- less than 50 nits!). I love NVIDIA ULMB; it's been an inspiration to try to beat -- not easy to do for a small company.

However, some of us are dependent on strobing (can't stand motion blur) or really prefer blur reduction above all else. Some of us get more headaches from motion blur than from flicker, etc.

We'll post some new articles about Blur Busters Approved to help educate people on the purpose of Blur Busters Approved, as we don't certify other aspects of the monitor.

Be noted (contrary to assumptions), we are not a "review site". We don't mass-review monitors; that is done by others. We do third party certification of strobing standards. We do special editions (like GSYNC 101) to reveal nuances of new technologies and such, as well as our advanced Area51 research like [Link Self Removed To Respect HardForum Rules] -- incubating inventions and knowledge for industry.

Bear in mind,
- Strobe tuning is 75Hz-240Hz. The lower the refresh rate, the higher quality the strobe. 75 Hz strobe tuning is even better than 119-120Hz strobe tuning, so you can lower your refresh rate.
- All CRTs and strobe modes have the same stutter flaw: They all require framerate=Hz for best performance without duplicate images.
- All strobe modes on monitors have a blur-vs-brightness tradeoff due to a law of physics called the Talbot-Plateau Theorem. Even at its brighter settings, it can still outperform competition such as IPS-panel ULMB.
So that's the deets that brings more clarity (clarity pun intended in Blur Busters flavour).

Cheers,
</SideTrack>
 
Thanks for the exhaustive explanations Chief Blur Buster. You definitely have your work cut out for you in the coming years.

Do you think the dual layer LCD's (monochrome light-blocking LCD panel sandwiched behind color LCD panel), like the tech in Sony's PVM's and BVM's, and some Hisense TV's, could be competitive with FALD in regards to strobing ability?
 
I fear
Do you think the dual layer LCD's (monochrome light-blocking LCD panel sandwiched behind color LCD panel), like the tech in Sony's PVM's and BVM's, and some Hisense TV's, could be competitive with FALD in regards to strobing ability?
These are fun questions for me to answer, but I need to self-throttle my reply rate on large, semi-offtopic replies (even if this is in scope for FW900 motion-clarity comparisons). I give bigger replies to these kinds of questions in the Area51 section of Blur Busters Forums.

But the short answer is "it depends". Dual-layer LCDs are a valid horse in the motion blur reduction (strobing / refresh rate) race, although they are specializing more in color gamut. There are technological variables such as the GtG speed and refresh pattern capability of the 2nd layer -- which determine that LCD layer's ability to double as a stand-in for a scanning backlight or strobe backlight.

Speaking of which, after I'm vaccinated I'm hoping to get more time again with a local FW900 specimen (spacediver is a Canadian friend of mine). It has long been considered the gold standard to beat (one line item at a time, bit by bit) with desktop flat panels, simply because it's the most similar in size and shape of screen to contemporary desktop monitors.
 
Thanks for the exhaustive explanations Chief Blur Buster. You definitely have your work cut out for you in the coming years.

Do you think the dual layer LCD's (monochrome light-blocking LCD panel sandwiched behind color LCD panel), like the tech in Sony's PVM's and BVM's, and some Hisense TV's, could be competitive with FALD in regards to strobing ability?
I would love to see gaming variants of these. No doubt.
 
If replacing that kind of film were that easy, we'd have found a good alternative long ago, not just placeholders. Way better? Good luck finding that. It's a custom film with specific electrical and optical properties; the only way to get it is to have it manufactured to specification, and the few companies that could do that will laugh at you for any order of a lower magnitude than hundreds of square meters.

And before you think about pretending a monitor without the film is OK (or that it even makes the display better, as that Russian seller claims -- but well, he's trying to sell his junk after all), no, it's not. It's just the last possibility you have to make the picture less bad than with a heavily damaged film. But that film is there for good reasons and it's required for a good quality display.

I bought my FW900 back in 2006. It was in perfect condition cosmetically and image quality. The anti-glare film had no scratches. About 8 yrs back the film was scratched when I was moving. I ended up removing it and have had zero problems with it gone. No static build-up and the blacks are as black as ever as long as there aren't any light sources reflecting off the screen in front of it. The image is sharper and colors more vibrant with the film removed. The only reason I would still want the anti-glare film is to protect the glass. So I disagree that the monitor isn't OK without the film and the film is required for a good quality display.
 
I could not get my FW900 to resolve black even in a very dark room without the filter. Got it to work again with the Kantek filter. (The details are in a post I put end of last year.) Not for everyone though as it requires the monitor be debezeled.

And the caveat others mentioned about not spending much on something that might break pretty quickly at this point. (Unless it's NOS or you've got enough money that it wouldn't bother you.)

Could you provide more information on the Kantek filter you used on your FW900 please? I removed my factory filter some years back and have no negative results from doing so. Better colors and contrast and no static build up. Just have to make sure there are no external light sources in front of the screen to get good blacks and image quality. I would like to replace the screen filter mainly to protect the glass.
 
I bought my FW900 back in 2006. It was in perfect condition cosmetically and image quality. The anti-glare film had no scratches. About 8 yrs back the film was scratched when I was moving. I ended up removing it and have had zero problems with it gone. No static build-up and the blacks are as black as ever as long as there aren't any light sources reflecting off the screen in front of it. The image is sharper and colors more vibrant with the film removed. The only reason I would still want the anti-glare film is to protect the glass. So I disagree that the monitor isn't OK without the film and the film is required for a good quality display.
I agree... In the past, we debunked the issue of AG on vs. AG off and picture/image quality ...

Unkle Vito!
 
For new video cards with USB-C output there are also:
Delock 62796
Plugable USBC-VGA
Sunix C2VC7A0
They have the same chipset as the Delock 62967, and the last two are available in the USA.
And for EU users there is the Delock 87685, with the same Synaptics chip as the Sunix DPU3000 for high pixel clock.

Speaking about chipsets, I talked with a guy from the Lontium R&D center; they are updating one of their best chips and this is what he said:

We are actually updating LT8612SX to LT8612UX , which support HDMI 2.0 input, we will try to make sure that VGA output support upto 400MHz

So input bandwidth should be 600 MHz with 24-bit color and 480 MHz with 30-bit, oh and the DAC is 10-bit.
Well, it sounds interesting, we'll see.
Any news on the LT8612UX update ?
 
If anyone with a debezeled FW900 is interested in mounting the Kantek Filter (https://www.amazon.com/Kantek-Anti-Glare-Monitors-Diagonally-LCD19/dp/B00P85FWG4?th=1), I did so with these:

View attachment 204376 View attachment 204379


Brackets assembled from:
1) 6 mm metric extruded U Nuts. (Long ones for top, regular for bottom)
2) M4 .7 50mm screws
3) Corresponding nuts and rubber bumpers

(Was all available at a local hardware store.)

I used the top most mounting hole in each corner for the brackets. (The screw in the other mounting hole in each corner continuing to hold the CRT in place.)

I removed the silver layer on top, which was just tape. Drilled out the original hangers. (Carefully to avoid damaging the filter.)

The product description is a bit of a mess and all over the place. In there, beyond the generic "antiglare" designation, it says it's an antireflective coating, which appears to be true. And has FWIW a "neutral light tint".

As you are looking through a relatively thick piece of acrylic as opposed to a thin film, stuff you are looking down at through the filter, e.g., text on the bottom part of a page, has a slight distortion or echo. I mitigated this by raising the angle of the monitor up a bit. Subjectively, overall, in my use of the screen for gaming, other media, and as a secondary screen for productivity (office apps, coding, etc.) it seems kind of great actually. To me.


(Sorry, I did not try it against the bezel with the original hangers. I suspect it might ride a little high as it barely covered the visible area without the bezel with those hangers. I like the look, but I'm not advocating debezeling. Especially if safety concerns, e.g., kids or cats have access to the room.)

What are your thoughts on just hanging the kantek filters on the outside of the FW900 screen/bezel like how they are intended to be mounted on an LCD screen? I'm thinking of picking one of these up...
 
Could anyone recommend me an adapter for the Sony GDM FW900 ? I have a GTX 980 Ti which I can use the DVI to VGA through, but I will upgrade in 2021 to 3080 or RX 6800 XT because of games getting hard to run even at lower resolutions and because I want to experience Ray Tracing. The faster the adapter/the higher resolution and refresh rates it can do and the lower the latency for conversion the better of course.
Here is a summary of the various adapters.
If a 375 MHz max pixel clock is enough for you, buy the Startech DP2VGAHD20; it doesn't cost too much, it's available everywhere and it works well.
If you want more performance, the only solution is one of the Synaptics VMM2322 based adapters.
There are also the USB-C Lontium based adapters, but they are difficult to find, we don't know their real performance, and you need a video card with USB-C output or a special card to convert the DisplayPort output to USB-C.
About the LT8612UX, the chipset is done but they have not tested it at 400 MHz, so we don't know the real performance; for now I don't know of any adapters with this chipset.
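To judge whether a given adapter's pixel clock limit covers your own modes, the arithmetic is just total pixels per frame times refresh rate -- a rough sketch, where the blanking totals are generic GTF-style guesses rather than exact FW900 timings:

```python
# Pixel clock = horizontal total x vertical total x refresh rate.
# Blanking totals below are rough GTF-style guesses, not exact FW900 timings.

def pixel_clock_mhz(h_total, v_total, hz):
    return h_total * v_total * hz / 1e6

print(pixel_clock_mhz(2080, 1235, 85))   # 1920x1200@85  -> ~218 MHz, easy for a 375 MHz adapter
print(pixel_clock_mhz(3072, 1496, 80))   # 2304x1440@80  -> ~368 MHz, close to the 375 MHz cap
```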
 
BUT, I also play consoles on my CRT, and timings are not adjustable on those.

One solution I did find was that Extron RGB processors, like the 203rxi, have a vertical position adjustment that will shift the blanking interval around. This actually works great at 720p, but the Extron isn't fast enough to give a good picture at 1080p, which is the resolution I use for PS4 and Switch. Maybe there is another Extron RGB processor that can handle 1080p out there somewhere, but I haven't found it.
My impression always was that 720p works OK on CRTs, but 1080p does not have enough horizontal blanking and causes horizontal image warp. Though maybe the image was not perfect, I'm not sure. Definitely 1080p does not look OK, while 720p is more or less OK.
Some CRTs also do not support 1080p at all, while generally working OK with 720p.

Regardless of the slight image warping, the FW900 is a superb display for console gaming, especially for the PS3 as it allows you to avoid upscaling. I played many games on the FW900 including The Last of Us 😎 The only better display for the PS3 was a Panasonic plasma TV at 1024x720; it made games look even better than they really do. I also played some PS4 games on the CRT, e.g. Bloodborne, hence my experience with image warping at 1080p.
 
I bought my FW900 back in 2006. It was in perfect condition cosmetically and image quality. The anti-glare film had no scratches. About 8 yrs back the film was scratched when I was moving. I ended up removing it and have had zero problems with it gone. No static build-up and (...)
I stopped reading there. Sorry but this is bullshit. This is physical, static electricity will build up on any CRT without proper grounding, and that film is what grounds the front of the CRT. If you can't notice that, then you're not able to notice anything.
 
My impression always was that 720p works OK on CRTs, but 1080p does not have enough horizontal blanking and causes horizontal image warp.

Yeah on my 4:3 monitor that's not a big deal, I just overscan the 5% or so of the picture that is warped, since it's all on the extreme right and left. I never thought of 16:10 monitors, you can't really overscan too much of the sides without also overscanning the top. But I guess you'd have to. Something like a 5% overscan on the sides and 3% vertical.
 
Greetings to all and happy new year!

Back on pages 433 and 434 I described some problems with my HP A7217A (a differently branded FW900). Basically the main problem was that it turned off the moment I plugged a VGA cable into it. After leaving the monitor in a repair shop for a few months, this issue is finally fixed. The repairman said there were broken circuits on the D board. Now the monitor works; however, when I first turn it on, the image is extremely bright and it takes about 30 minutes to come down to normal. The problem can't be fixed by adjusting the G2 voltage with WinDAS. I have the same problem with a Sony E530, except it warms up faster and the brightness goes down to normal in about 10 minutes.

Any ideas on how to fix this?
 
What are your thoughts on just hanging the kantek filters on the outside of the FW900 screen/bezel like how they are intended to be mounted on an LCD screen? I'm thinking of picking one of these up...

Rechecked with my spare Kantek, which still has the hangers. Negative on hanging the Kantek from the FW900 bezel. It's not tall enough to cover the visible screen area. Leaves a big gap at the bottom.

To just protect the screen I think it's not worth it anyway. I put a filter on to restore blacks in a dimly lit (but not completely dark) room. When I held swatches of filters against the screen during a game, the filters restored the original black. Otherwise the phosphors are just too reflective of the ambient light. Still enjoyed the screen without the filter, but it wasn't the same though.
 
I stopped reading there. Sorry but this is bullshit. This is physical, static electricity will build up on any CRT without proper grounding, and that film is what grounds the front of the CRT. If you can't notice that, then you're not able to notice anything.

Lol ok dude whatever you say. I've been messing with a FW900 a lot longer than you have but if you think you know everything and want to be belligerent keep right on doing it.
 