LG finally launches the first large-screen OLED TV

It looks like the two current front-runners are very close in results. You can change the brightness in the OSD, but the BenQ seems to have a crimson tint where the Asus does not.
- There is a crimson tint. It becomes more subdued if I raise the Contrast in BENQ's OSD above "43", but then it starts clipping colors.
- The ASUS colors do look better at the desktop. I don't notice it in games though; the BENQ actually looks slightly sharper because it is smaller (24" rather than 27").
.
I don't think he is comparing the newer Asus in the quote below. I think these results and comments compare an ASUS VG278HE (144Hz) to a BENQ XL2411T. Asus is releasing a new model, the VG248QE (144Hz, 24", 1ms), this month; it has a panel similar to the BenQ XL2411T's but different circuitry. The newer Asus has not yet been received and put through its paces by owners (especially Mark R.).

Preliminary BENQ XL2411T observations

I want to keep most of my data for a detailed web page, but I will post some preliminary data. LightBoost works as advertised, following my own instructions (I now have more screenshots for the series of steps to install an EDID override file; I will post those in the next month or two).
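(One mechanical detail for anyone hand-editing an EDID for an override file in the meantime -- a general EDID fact, not part of Mark's specific steps: every 128-byte EDID block must sum to 0 mod 256, so the final checksum byte has to be recomputed after any edit, or the override will be rejected. A minimal Python sketch, with hypothetical filenames:)

def fix_edid_checksum(block: bytes) -> bytes:
    """Recompute the final checksum byte of a 128-byte EDID block."""
    assert len(block) == 128, "EDID base blocks are exactly 128 bytes"
    checksum = (256 - sum(block[:127]) % 256) % 256
    return block[:127] + bytes([checksum])

with open("edid_override.bin", "rb") as f:      # hypothetical input filename
    edid = f.read()
with open("edid_override_fixed.bin", "wb") as f:
    f.write(fix_edid_checksum(edid[:128]))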

When enabling LightBoost:
- BENQ is a much, much better panel from a trailing-artifact perspective. This is the most impressive aspect of the BENQ.
- There is a crimson tint. It becomes more subdued if I raise the Contrast in BENQ's OSD above "43", but then it starts clipping colors.
- The well-known "BENQ AMA coronas" completely disappear.
- When LightBoost is set to MAX in the monitor's OSD, the LightBoost brightness on the BENQ at contrast 50 is similar to the ASUS at contrast 90.
- I see faint horizontal lines at the upper right, but only when LightBoost is enabled. This appears to be a normal LightBoost artifact for a BENQ.
- The ASUS colors do look better at the desktop. I don't notice it in games though; the BENQ actually looks slightly sharper because it is smaller (24" rather than 27").
- The LCD inversion artifacts are much milder on the BENQ than on the ASUS. LightBoost does not amplify inversion artifacts on the BENQ nearly as much as it does on the ASUS.
- LightBoost doesn't seem brighter on the BENQ than on the ASUS (the most surprising aspect).
- I need to do more MPRT tests, but I measured a preliminary MPRT of about 1.9ms at default settings; if I set LightBoost down to 10% (not OFF) via the monitor's OSD, I can get MPRT down to 1.4ms. That's a much dimmer image (a rough duty-cycle model follows below). I need to verify I'm reproducing the same test conditions as Vega.
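For context on those two figures (my own rough model, not Mark's measurement method): on a strobed backlight the MPRT is roughly the strobe flash length, so at 120Hz the numbers imply duty cycles like the following, which is also why the 1.4ms setting is so much dimmer. A minimal Python sketch:

REFRESH_HZ = 120
period_ms = 1000 / REFRESH_HZ              # ~8.33 ms per refresh at 120Hz

for mprt_ms in (1.9, 1.4):                 # the two preliminary MPRT figures
    duty = mprt_ms / period_ms             # fraction of each refresh the backlight is lit
    print(f"MPRT {mprt_ms} ms -> backlight lit ~{duty:.0%} of each refresh")
# MPRT 1.9 ms -> backlight lit ~23% of each refresh
# MPRT 1.4 ms -> backlight lit ~17% of each refresh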

That said, the most impressive aspect of the BENQ: during LightBoost, the crosstalk between refreshes is practically zero to the eye (less than 1% for sure -- possibly about 0.5% inter-frame GtG pixel-persistence leakage). The LightBoost-specific response-time compensation (non-adjustable when LightBoost is enabled) is very good. This is very, very good for 3D glasses, as it means you won't notice 3D crosstalk with these "1ms" panels. During 2D motion, there isn't even the faint 'sharp doubled-up edge' minor artifact that is seen on the ASUS VG278H. I will attempt to have high-speed camera comparisons of the BENQ vs ASUS refresh within a month or two.

Although it is very hard to tell since both are zero-motion-blur LCDs, motion looks slightly better on the BENQ due to its fewer side-effect artifacts (except the crimson tint), and the BENQ is well known to have less input lag. If competitive FPS gamers are looking for an answer -- the BENQ's input lag will win out.
 
And everything else - outside of crt and Lightboost2 1ms synchronized backlight strobing in 2D - turns into fingerpaint colors smeared across the screen during blurred FoV movement in games, where all detail is lost.
We have the 60hz motion-blur mess; 120hz, with better motion tracking (more recent action per hz) and around 1/2 as much blur; and finally something new: a 120hz + 1ms lightboost2 backlight-strobing lcd with essentially zero blur. (The only other thing with essentially zero blur like that is a fw900 crt, with its own considerable tradeoffs.)
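To put rough numbers on that (my back-of-envelope, not measured data): for eye-tracked motion on a sample-and-hold display, the perceived blur width is approximately tracking speed times persistence. A quick Python sketch:

speed_px_s = 1000                          # assume a fast FoV pan, 1000 px/s

cases = {
    "60hz sample-and-hold":  1 / 60,       # frame held ~16.7 ms
    "120hz sample-and-hold": 1 / 120,      # ~8.3 ms -> about half the blur
    "120hz + ~1.4ms strobe": 0.0014,       # pixels lit only during the flash
}
for name, persistence_s in cases.items():
    print(f"{name}: ~{speed_px_s * persistence_s:.1f} px of blur")
# 60hz: ~16.7 px, 120hz: ~8.3 px, strobed: ~1.4 px (essentially zero)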
.
In regard to the dismissive color reply.. there is a big difference between the saturated (yet non-uniform across the panel due to TN shift/shading) color available on the better of the current TN monitors, and having an overall crimson tint while gaming.
 
Fw900 tradeoffs, off the top of my head

imperfect geometry issues
most fade or bloom over their lifetime
can lose their full focus
can have difficulty getting text as sharp and non-fatiguing as an lcd (potentially glowing edges, lacking full focus, etc.)
dpi varies from center to edges
viewable display size (22.5")

require a warmup period (1/2 hr?) to reach full contrast/clarity
are very bulky (take up a lot of room)
weight, obviously (~96 lb); not realistic to put on a monitor arm mount
power button/power unit can crap out
availability and buyer's/quality risk from most vendors/sellers
high cost for a pristine+calibrated one, and lack of any easy shipping options from some vendors
lack of affordable or even available repair/shipping options, let alone warranty
require considerable knowledge of how to tinker with them in order to get their PQ to its highest and maintain it over time
in time usually start exhibiting malfunctions, anywhere from fading or blooming to screen aberrations and failures of display initialization, etc. (a ticking time bomb in some respects)
other potential issues: driver/inf support itself, driver/inf support for certain resolutions and refresh rates being available, more modern gpu's omitting support for some timings
screen anomalies due to unclean power sources/magnetic interference
can be difficult to get them to "play along" in multiple-monitor setups
.
I love fw900s at their best. I've owned two (one faded and later died). I'd still prefer a blur-free, easy-to-set-up-and-maintain lcd, especially going forward, as fw900s require knowledgeable tinkering and tend to start showing failures eventually. People have been saying their fw900 will hold them out until SED, OLED, etc. for what seems like a decade now >.< . Until the lightboost2 thing came out I would have kept my (second go at a) fw900 at my desk until its display degraded or died, but sometime this year it may finally be replaced (paired with one or two gtx780's). I currently own a 27" 2560x1440 ips for desktop/apps/imagery, a 27" 120hz non-lightboost2 1080p TN for gaming, and a 22.5"-viewable fw900 for gaming and the odd video playback.
 
I currently own a 27" 2560x1440 ips for desktop/apps/imagery, a 27" 120hz non-lightboost2 1080p TN for gaming, and a 22.5"-viewable fw900 for gaming and the odd video playback.

So in other words you need 3 monitors to do what a single CRT did 10 years ago and it's still worse / debatable? No wonder LCD manufacturers have agreed to price gouge for so long. I didn't think people actually did this.

Btw I never had any of those problems with my CRT but I didn't have an FW900. Mine eventually died after 7ish years because the edges started to vibrate along with color warping at the edges. I'm happy with getting close to a decade out of a $60 monitor.
 
You are putting words in my mouth. Two monitors is reasonable though, considering how things are. One for desktop/apps/imagery and one for games.
.
I used to use a 1920x1200 lcd next to my fw900.
The fw900 faded, and later died.
Later I got the 27" 108.8ppi 2560x1440 ips, which is gorgeous for desktop apps/imagery, and tried to just use that with a few smaller "sidebar" panels. The blur was very annoying in games.
.
Because I found the blur in games so annoying, I ordered a B~B+ grade fw900 from a warehouse store and crossed my fingers.
.
I had read a lot on hardforum about 120hz TNs, and -- having had a fw900 die on me recently and having replaced it with a B/B+ grade one with who knows how much life left in it -- when I saw a 27" 120hz TN on sale for $399 very briefly, I was undecided, but not wanting to miss such a deal, I sprang for it, figuring I could resell it if I had to. They subsequently shot up to over $600 for quite some time. The 120hz (non-lightboost2) has about 1/2 as much blur as a 60hz lcd, which is appreciable. They also have better motion tracking, from more frames of more recent action crammed in per second (at high fps). I use the 27" 120hz a lot now, but the fw900 is still around since I spent the $ on it, and it makes a good comparison tool, especially since I am on this forum so much. :b
.
In my opinion, considering the massive tradeoffs, using one monitor for desktop stuff and one for games is the way to go. I just happen to have three because that's the way it happened, due to availability, sale timing, time of failure, and because I like trying out new tech advancements.
.
Yes, I have been on a quest for even two monitors with the best tradeoffs I can get. A crt cannot do 27" at 108.8ppi with no-issue perfect geometry and good brightness, its lifespan as a pristine display can be something of a ticking time bomb, and it requires tinkering. I listed all the other considerable tradeoffs already, although I omitted the lack of brightness. I am a fan of fw900 crt's, but like every other monitor we are talking about, they all have considerable tradeoffs. You can't say crt is pure excellence in anything outside of motion blur (and perhaps input lag compared to some displays), and with the lightboost2 tech there is another option for zero-blur gaming. I'm hoping to get a 120hz 1ms lightboost2 monitor and one or two gtx780's sometime this year. I'll sell my non-lightboost2 TN and possibly retire my fw900. I'll keep the 27" 2560x1440 108.8ppi ips for all things desktop outside of games.
 
You can't say crt is pure excellence in anything outside of motion blur (and perhaps input lag compared to some displays), and with the lightboost2 tech there is another option for zero-blur gaming. I'm hoping to get a 120hz 1ms lightboost2 monitor and one or two gtx780's sometime this year. I'll sell my non-lightboost2 TN and possibly retire my fw900. I'll keep the 27" 2560x1440 108.8ppi ips for all things desktop outside of games.
I would go as far as stating that the only true advantage the fw900 offers is deep blacks, but that's only if there's no ambient light.

PS: the number of problems and limitations of CRTs is enough to put people off. I guess some people forgot how bad things used to be.
 
Nah, I just remember playing competitive FPS games that were liquid smooth. I remember spending hours tweaking my game configs and adjusting PS/2 sampling rates to ensure proper tracking and precision.

It was really important to have absolute pixel perfect precision. I feel like I'm really sensitive when it comes to this stuff.
 
You are putting words in my mouth. Two monitors is reasonable though, considering how things are. One for desktop/apps/imagery and one for games.
.
I used to use a 1920x1200 lcd next to my fw900.
The fw900 faded, and later died.
Later I got the 27" 108.8ppi 2560x1440 ips, which is gorgeous for desktop apps/imagery, and tried to just use that with a few smaller "sidebar" panels. The blur was very annoying in games.
.
Because I found the blur in games so annoying, I ordered a B~B+ grade fw900 from a warehouse store and crossed my fingers.
.
I had read a lot on hardforum about 120hz TNs, and -- having had a fw900 die on me recently and having replaced it with a B/B+ grade one with who knows how much life left in it -- when I saw a 27" 120hz TN on sale for $399 very briefly, I was undecided, but not wanting to miss such a deal, I sprang for it, figuring I could resell it if I had to. They subsequently shot up to over $600 for quite some time. The 120hz (non-lightboost2) has about 1/2 as much blur as a 60hz lcd, which is appreciable. They also have better motion tracking, from more frames of more recent action crammed in per second (at high fps). I use the 27" 120hz a lot now, but the fw900 is still around since I spent the $ on it, and it makes a good comparison tool, especially since I am on this forum so much. :b
.
In my opinion, considering the massive tradeoffs, using one monitor for desktop stuff and one for games is the way to go. I just happen to have three because that's the way it happened, due to availability, sale timing, time of failure, and because I like trying out new tech advancements.
.
Yes, I have been on a quest for even two monitors with the best tradeoffs I can get. A crt cannot do 27" at 108.8ppi with no-issue perfect geometry and good brightness, its lifespan as a pristine display can be something of a ticking time bomb, and it requires tinkering. I listed all the other considerable tradeoffs already, although I omitted the lack of brightness. I am a fan of fw900 crt's, but like every other monitor we are talking about, they all have considerable tradeoffs. You can't say crt is pure excellence in anything outside of motion blur (and perhaps input lag compared to some displays), and with the lightboost2 tech there is another option for zero-blur gaming. I'm hoping to get a 120hz 1ms lightboost2 monitor and one or two gtx780's sometime this year. I'll sell my non-lightboost2 TN and possibly retire my fw900. I'll keep the 27" 2560x1440 108.8ppi ips for all things desktop outside of games.

How old was the FW900 that had the (phosphor?) fading and defocusing? CRT tubes are ultimately disposable and I had heard of those issues, but as extreme end-of-life things...
 
The fw900 thread has plenty of stories about screens wavering, dimming/fading, bloom, the display popping loudly out and back on, interference patterns, power buttons/power units failing, etc., and otherwise "on the way out but still using it for sub-par performance until it won't work at all" stories. They have a limited lifespan and are usually purchased having already been used at some point. Some people's prime lifespan is luckier than others'. Their convergence/pots/focus etc. sometimes have to be tweaked too (opening the chassis and screwing around inside of it), plus some "repair menu"/technician editing of settings using a special cable interface if you want them calibrated to their best or to otherwise attempt to fix annoying issues. They are great screens when all set up, but a 1080p lcd at 100hz or 120hz with zero blur dedicated to gaming would be a great alternative and less of a ticking time bomb (and tinkerbox), on top of all the other tradeoffs I mentioned. Especially if you can swing the ridiculously good deal price (~$350) of a korean 2560x1440 108.8ppi ips to throw next to it someday for everything outside of games.

Fw900 tradeoffs, off the top of my head

<edit> inferior brightness (looks best in a dark room)
<edit> require a considerably high hz setting to avoid a flickery appearance/eye-strain; the highest resolutions slightly limit max hz
imperfect geometry issues
most fade or bloom over their lifetime
can lose their full focus
can have difficulty getting text as sharp and non-fatiguing as an lcd (potentially glowing edges, lacking full focus, etc.)
dpi varies from center to edges
viewable display size (22.5")

require a warmup period (1/2 hr?) to reach full contrast/clarity
are very bulky (take up a lot of room)
weight, obviously (~96 lb); not realistic to put on a monitor arm mount
power button/power unit can crap out
availability and buyer's/quality risk from most vendors/sellers
high cost for a pristine+calibrated one, and lack of any easy shipping options from some vendors
lack of affordable or even available repair/shipping options, let alone warranty
require considerable knowledge of how to tinker with them in order to get their PQ to its highest and maintain it over time
in time usually start exhibiting malfunctions, anywhere from fading or blooming to screen aberrations and failures of display initialization, etc. (a ticking time bomb in some respects)
other potential issues: driver/inf support itself, driver/inf support for certain resolutions and refresh rates being available, more modern gpu's omitting support for some timings
screen anomalies due to unclean power sources/magnetic interference
can be difficult to get them to "play along" in multiple-monitor setups
 
I would go as far as stating that the only true advantage the fw900 offers is deep blacks, but that's only if there's no ambient light.

PS: the number of problems and limitations of CRTs is enough to put people off. I guess some people forgot how bad things used to be.

Controlled lighting, yes, but LCD black is so bright that it certainly doesn't have to be a totally dark room to see the difference. And also the dynamic range -- CRT's ability to show a bright scene and then reach deep for the lows in a following scene of video or graphical animation is something I've really noticed versus LCD. And the picture uniformity in dark scenes on LCD gets to me too (though I realize that varies by luck of the draw...)

Anyway, sure...CRTs are plenty of trouble. And I'd actually think most folks who had CRTs at work, which seemingly were invariably set to 60 Hz and otherwise misconfigured, do not have fond memories of them. And maybe direct view CRTs were always doomed, because they couldn't practically scale to today's sizes...all that said, the F520 and FW900 are awesome displays, nailing what many of us would consider to be the fundamentals, and it's very much a shame that the mainstream market's sensibilities deprived the computer enthusiast community of those monitors and what should have rightfully been their successors, if raw performance had been the only dictate...(I do realize that in the real world, it's much more about economics...)
 
The fw900 thread has plenty of stories about screens wavering, dimming/fading, bloom, the display popping loudly out and back on, interference patterns, power buttons/power units failing, etc., and otherwise "on the way out but still using it for sub-par performance until it won't work at all" stories. They have a limited lifespan and are usually purchased having already been used at some point. Some people's prime lifespan is luckier than others'. Their convergence/pots/focus etc. sometimes have to be tweaked too (opening the chassis and screwing around inside of it), plus some "repair menu"/technician editing of settings using a special cable interface if you want them calibrated to their best or to otherwise attempt to fix annoying issues. They are great screens when all set up, but a 1080p lcd at 100hz or 120hz with zero blur dedicated to gaming would be a great alternative and less of a ticking time bomb (and tinkerbox), on top of all the other tradeoffs I mentioned. Especially if you can swing the ridiculously good deal price (~$350) of a korean 2560x1440 108.8ppi ips to throw next to it someday for everything outside of games.

Indeed, tubes are consumable and wear out. My original FW900 died after about 9 years. It had a good picture to the end, but was old enough that we didn't try for a repair...

Lots of tubes are maybe at end of life at this point, at least in terms of optimal PQ...

With LCD, I would also go multi-panel and large, maximizing, to my mind, the tech's advantage...
 
I used a sony 34" xbr960 1080i/"720p" widescreen crt (with hdmi input) for several years for tv and movie watching, back while the earliest lcd tv blacks were total crap. It wasn't until modern VA-panel lcd tv's had decent black levels and detail in blacks, far superior to IPS and TN, that I bought a 46" samsung VA tv (glossy). Plasma has even better blacks and, due to pixel flashing and other tech, somewhat less blur than an lcd tv (though both still have 60hz input regardless of their higher hz claims). Then there is input lag and ghosting on tv's for gaming, but that's another topic (ugh).
.
I just keep any movie watching to my tv for the black levels and detail-in-blacks. I usually don't notice how mediocre the computer monitor's blacks are unless I watch movies on them. That said, my modern monitors' black levels (at least their black depth) seem a lot better than some of my older monitors' (outside of the fw900). I think glossy displays help how blacks appear somewhat, and also help with color vibrancy and clarity in general.
 
Back in 2006/7, I got sick of TN LCDs and ordered a 22" IBM CRT monitor via eBay. It had the latest Sony FD Trinitron tube, which I had always wanted but could never afford.

When I got it, I was so underwhelmed. The picture was dim, and under room lighting the phosphors became visibly gray (not dark gray as I had hoped), which washed out the picture. I also couldn't get the geometry spot on, and the screen wasn't perfectly flat. The two lines also started to annoy me. I don't like any visible artifacts between me and the content.

Maybe I should have opted for a high-brightness Diamondtron monitor, but after exchanging 4-5 monitors directly with IBM (the original was under warranty), I gave up. In the end IBM gave me an LCD, which I sold, and I got a NEC 1970NXp with a PVA panel from eBay.

I had nothing but bad experiences with CRTs. I had to return 3 WEGA TVs due to geometry problems. Luckily the 4th replacement was close to perfect, and it served me well for 3 years (although it crushed blacks an awful lot). LCDs and PDPs have their share of problems, but most are trivial compared to CRTs'; not to mention they are so much easier to return than CRTs.
 
For console gaming and high-speed-action tv/movies, that sony xbr960 is still great. I only watch movies on my lcd tv because it's so much larger, and I try not to pay attention to its motion failings.
.
I can't even play ps3 Move shooting-gallery games on my lcd tv because the input lag and trailing are so bad, even in game mode.
.
FW900 graphics-professional crt's are pretty tight when calibrated decently: lush color, detail-in-blacks (not just "black depth" via black blob areas), and zero motion blur. Lots of other tradeoffs though.
 
Back in 2006/7, I got sick of TN LCDs and ordered a 22" IBM CRT monitor via eBay. It had the latest Sony FD Trinitron tube, which I had always wanted but could never afford.

When I got it, I was so underwhelmed. The picture was dim, and under room lighting the phosphors became visibly gray (not dark gray as I had hoped), which washed out the picture. I also couldn't get the geometry spot on, and the screen wasn't perfectly flat. The two lines also started to annoy me. I don't like any visible artifacts between me and the content.

Maybe I should have opted for a high-brightness Diamondtron monitor, but after exchanging 4-5 monitors directly with IBM (the original was under warranty), I gave up. In the end IBM gave me an LCD, which I sold, and I got a NEC 1970NXp with a PVA panel from eBay.

I had nothing but bad experiences with CRTs. I had to return 3 WEGA TVs due to geometry problems. Luckily the 4th replacement was close to perfect, and it served me well for 3 years (although it crushed blacks an awful lot). LCDs and PDPs have their share of problems, but most are trivial compared to CRTs'; not to mention they are so much easier to return than CRTs.

Maybe you meant 21", as the 22" aperture grille was the Diamondtron. And it would not have had the most advanced Trinitron tube, as Sony only sold that under its own brand as the GDM-F520. And the F520 has the brightness modes. So no, it appears you never experienced the best CRT technology had to offer, and Sony's top end really took it to another level.

That said, based on your criteria, you would have still been disappointed. CRTs have reflective screens and do wash out. They only shine with controlled ambient lighting...

As to PDPs...I do like those for TV...just wish on the larger sizes I couldn't see the dots...I suppose OLEDs will have the same issue in that regard....
 
These are the only two I would regard as significant concerns from that list:

And weight and bulk arguments against CRTs always ring hollow to me, especially on enthusiast forums. In terms of raw IQ, the LCD/Plasma tradeoffs are overwhelming in comparison.

The FW900 is nothing but a widescreen Trinitron.

When I went from Trinitron to LCD it was better in every way that mattered.

The sharpness is in a whole other league. The perfect geometry was a breath of fresh air. The color accuracy was also much better. This was the trifecta for me.

I had both on my desk for a little while, but the CRT just had to go; it is simply outclassed for image quality.

I could see a twitch gamer benefiting from the high refresh rates, low input lag, and strobing effect to improve motion resolution.

But outside of twitch gaming, CRTs were just a bunch of mushy artifacts.
 
I could see a twitch gamer benefiting from the high refresh rates, low input lag, and strobing effect to improve motion resolution.

But outside of twitch gaming, CRTs were just a bunch of mushy artifacts.
I actually like fw900 crts; I just don't think they are without tradeoffs. A properly focused and calibrated fw900 graphics-pro crt is very tight, with a fine dot pitch, fast refresh rates, and considerably high resolutions. Nothing like your run-of-the-mill desktop crt of old.
.

It's not just about twitch gaming for many of us. It's about visual clarity during motion, not scoring higher. I appreciate the perfect geometry, size, and other benefits of lcd, but the motion blurring was a horrible tradeoff. I'm looking forward to lightboost2 backlight-strobed LCDs with zero blur.

LCDs (outside of the new 1ms backlight-strobe-synchronized setups) are mush blurs during FoV movement. 120hz ones with aggressive response-time compensation have a sort of softened blur haze, closer to the "shadow mask" of scene elements, but still blur out all object detail and textures, incl. bump mapping and shaders... and of course any nameplates/text/writing/logos in the scene.

---- A 60hz TN blurs horribly during FoV movement; a 60hz IPS blurs even worse.
---- The 120hz TNs reduce this blur by about 50% compared to a 60hz TN.
---- The 120hz IPS (a limited number of korean ips) blur a little worse than 50% compared to a 60hz TN.
---- The 144hz 1ms TNs with 1ms Lightboost2 3D capability, set at 100hz and running 100+fps, or set at 120hz running 120+fps, result in ZERO BLUR during FoV movement. (Rough numbers in the sketch after this list.)
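Putting crude numbers on those four cases (my model, with my assumed GtG figures, not measurements): total smear is roughly tracking speed times (hold time + GtG transition), which is also why a slower IPS lands a bit behind a TN at the same hz. A Python sketch:

speed_px_s = 1000                                  # assume a fast FoV pan, 1000 px/s

def blur_px(hold_ms, gtg_ms):
    # perceived smear ~= speed * (frame hold time + pixel transition time)
    return speed_px_s * (hold_ms + gtg_ms) / 1000

print(f"60hz TN   (~2ms GtG): ~{blur_px(16.7, 2):.0f} px")
print(f"120hz TN  (~2ms GtG): ~{blur_px(8.3, 2):.0f} px  (about half of 60hz)")
print(f"120hz IPS (~6ms GtG): ~{blur_px(8.3, 6):.0f} px  (a little worse than the TN)")
print(f"120hz + ~1.4ms strobe: ~{blur_px(0, 1.4):.0f} px (essentially zero)")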
.
The most modern games, with extremely high-resolution texture mapping plus bump-mapping depth and shaders, make my (60hz ips) lcd screen's blur even more obvious and eye-wrenching than before, since the blur on fast FoV movement washes out the extremely high detail + 3d depth my eyes "have a lock on" every time. It strains my eyes and is much more obnoxious to me than simpler-textured/older games.

Most people seem to agree with this representation of 60hz/120hz LCD vs CRT blur in games.

[image: lcd-blur.jpg - 60hz/120hz LCD vs crt motion-blur comparison]


So it appears to me that 120hz, against the limitations of LCD pixel response times and retinal-retention blur, would still not be enough to retain focus on texture detail (much like fine text scrawled on a surface that gets smudged out) and bump-map depth.

It's like you have goggles filled with some liquid gel, and every time you turn quickly, your eyes see all fine detail lost in a blur. 120hz might replace your goggles with a fluid of double the viscosity, blurring nearly half as much... but it's still a lousy prescription compared to clear sight imo.
.

The 1ms lightboost2 monitors result in the same full clarity as the crt representation.
Consider that blur effect not just on a single, simple cartoon cel-shaded car... but rather on a scene, your entire "viewport" full of high-detail objects, architecture, landscape, high-detail textures, depth via bump mapping, shaders, etc., all smearing out during FoV movement. I find it hard to label any monitor that smears like that a "superior picture" or "higher quality display" in regard to gaming. Desktop/app use is another matter, which is why I use a high-ppi IPS next to my gaming monitor and intend to keep doing so going forward. It's all about tradeoffs.
 
Nah, I just remember playing competitive FPS games that were liquid smooth. I remember spending hours tweaking my game configs and adjusting PS/2 sampling rates to ensure proper tracking and precision.

It was really important to have absolute pixel perfect precision. I feel like I'm really sensitive when it comes to this stuff.

So in other words, you completely forgot how bulky CRTs were, how much they weighed, how much heat they emit, the number of CRTs with geometry issues, the fact that trinitrons used dampening wires - so 2 visible horizontal wires were always on the screen - and text sharpness worsening slightly as the monitor "warmed up"?

I owned a 15-inch, then a 19-inch trinitron, and I loved them for their time. However, I don't miss those days at all - I don't miss their drawbacks. In terms of image clarity within windows, a top-quality IPS panel is in a completely different world compared to a trinitron CRT - much, much better. I understand that some competitive counterstrike or FPS players love their CRTs though, and that's cool.
 
Maybe you meant 21", as the 22" aperture grille was the Diamondtron. And it would not have had the most advanced Trinitron tube, as Sony only sold that under its own brand as the GDM-F520. And the F520 has the brightness modes. So no, it appears you never experienced the best CRT technology had to offer, and Sony's top end really took it to another level.

That said, based on your criteria, you would have still been disappointed. CRTs have reflective screens and do wash out. They only shine with controlled ambient lighting...

As to PDPs...I do like those for TV...just wish on the larger sizes I couldn't see the dots...I suppose OLEDs will have the same issue in that regard....

The FW900 came with a £1000 price tag, and most 4:3 20" models retailed for £500-800 :(. So I couldn't afford 'em at the time. I was only 15/16 back then.

I got the IBM cos of the manufacturer's warranty (and a good thing too). I would have kept it if it weren't for the ridiculously low brightness. I prefer around 100-120 cd/m2 (due to my room lighting).

I guess I'll never know the true apex of CRT tech, but I'm not too disappointed. I've got my 42G30 plasma for movies and console games and a 27" IPS for PC games.


PS: I suppose in terms of cost vs performance, LCD and PDP are leaps ahead of CRT.
 
Here's a tip: if you ever want your anti-CRT arguments to be taken seriously, exclude the bulk/weight/heat clichés. We're on an enthusiast forum, for Christ's sake. For example: is the average enthusiast likely to write off the GTX680 because it takes up two slots and puts out a lot of heat? No.

I'm not making an argument. I'm stating that I liked Trinitrons during their time, but there is no way I'd go back. The ridiculous weight was definitely part of that - my 19-inch trinitron weighed nearly 50 pounds - however, that is only one point. There are numerous other disadvantages that factor into why I would never go back. I don't want to stare at dampening wires 24/7; I'd go into more, but I'd just repeat what I've already stated. It's interesting that you focused completely on the weight issue.

You're free to like trinitron CRTs, that's cool.. I'm just stating my preference. Although they're pretty good for gaming, comparing one of my current IPS panels to any trinitron in a windows environment - the IPS would quite literally blow it away in terms of perfect geometry, text sharpness and overall image quality.

That's what getting a monitor is all about: pick the tech you enjoy the most and live with the compromises - none are perfect. Do you want 120hz? Then you'll need a CRT or TN panel, and you'll have to put up with the low resolution of TN as well as poor color accuracy/viewing angles, etc. Or you can get an IPS, but that doesn't give you 120hz. Maybe OLED will be the next big thing, but so far it is not ready for prime time in terms of cost and practical desktop/HDTV use.

Basically, it's highly subjective.
 
Still a completely pointless, expensive gimmick that will only detract from image quality.

The real use for transparent displays is in Heads-Up-Displays. Not monitors/TVs.

Well, actually, for a lot of space-constrained people a window that becomes a TV would be pretty useful. A ton of people out there hate the way a TV breaks up their decorating.

If all anyone cared about was quality, then TVs would not even ship with the shitty speakers they have. But people very often sacrifice large amounts of quality to get aesthetics or convenience.


I think the main thing everyone has to say is that we just have to wait and see. I am not worried about 4k content, but what if the OLEDs ship with issues, as most new tech does? Then I would rather go 4k; then again, 4k might lag like hell.

One of the things everyone seems to forget is that a lot of content will be generated: hook your PC up and you are 4K-ready now. Games, programs, etc. can all use it today, so it's not like it's totally useless. Second, most of us are not watching any true 1080p content; we are watching compressed content sold to us as 1080p, but that did not stop any of us from buying 1080p sets. Why should 4k be any different?
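One caveat on "hook your PC up and you are 4K-ready" (my arithmetic, with an assumed ~10% blanking overhead in the CVT reduced-blanking style): the link bandwidth has to be there too. A quick Python check:

h, v, hz = 3840, 2160, 60
overhead = 1.1                                  # assumed blanking overhead
pixel_clock_mhz = h * v * hz * overhead / 1e6
print(f"~{pixel_clock_mhz:.0f} MHz pixel clock needed for 4k at 60hz")
# ~547 MHz -- beyond single-link DVI (~165 MHz) and HDMI 1.4's ~340 MHz
# TMDS limit, so driving 4k60 means DisplayPort or tiled/multiple inputs.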
 
Although BD is compressed, it can reproduce 1080 lines. It would be ideal to have uncompressed video content, but the difference in quality isn't worth the hefty storage space. Broadcast content can also resolve 1080 lines (static), but during motion the resolution will drop due to interlacing and de-interlacing (unless the content is broadcast in 1080p with sufficient bandwidth for high motion).

In reality, we don't always see 1080 lines' worth of detail (especially when the content is shot with a camera). The only way to get it is via CGI (mostly video games).
 
I'm not making an argument. I'm stating that I liked Trinitrons during their time, but there is no way I'd go back. The ridiculous weight was definitely part of that - my 19-inch trinitron weighed nearly 50 pounds - however, that is only one point. There are numerous other disadvantages that factor into why I would never go back. I don't want to stare at dampening wires 24/7; I'd go into more, but I'd just repeat what I've already stated. It's interesting that you focused completely on the weight issue.

You're free to like trinitron CRTs, that's cool.. I'm just stating my preference. Although they're pretty good for gaming, comparing one of my current IPS panels to any trinitron in a windows environment - the IPS would quite literally blow it away in terms of perfect geometry, text sharpness and overall image quality.

That's what getting a monitor is all about: pick the tech you enjoy the most and live with the compromises - none are perfect. Do you want 120hz? Then you'll need a CRT or TN panel, and you'll have to put up with the low resolution of TN as well as poor color accuracy/viewing angles, etc. Or you can get an IPS, but that doesn't give you 120hz. Maybe OLED will be the next big thing, but so far it is not ready for prime time in terms of cost and practical desktop/HDTV use.

Basically, it's highly subjective.

Bah, you can't even see the dampening wires. I'm as OCD as you can get, and they don't even bother me. I can't even see them, actually, unless I'm on a light-colored screen, which I hardly ever am; I run everything dark grey or black. The only thing white on my screen is text, for the most part. Can't see the wires in games or movies.
Weight? I don't know, I don't move my monitor - ever. I wouldn't move my monitor if it were an lcd either. It sits on my desk and there it stays. Geometry is perfectly fine - lcd is better, sure, but it's nothing really noticeable. I sit in front of an lcd at work and come home to a fw-900; I don't notice jack, other than how much the fw-900 pwns the F*ck out of lcd's.

Geometry issues - nah, not really. Since I use illustrator, corel draw, etc. - straight lines and grids all the time - it's perfectly acceptable; a non-issue.

Let's count the negatives of it, shall we?

Weighs a lot... Solution: don't lug it around your house or take it to lan parties (do people still do that?).

Puts out a lot of heat... Mmm, heat; keeps me warm in the frosty winter morn.

Sucks a lot of energy - Don't Care. Rather, Don't Give a Flying F*ck. I only ever have like 1 other light bulb on in the house at a time. My pc is my entertainment center. I use less energy in my home than some people on vacation.

Damn, I just can't think of any more.. Eye jizz every time you look at it?

Ok Positives.

Blackest pitch blacks.

Perfect screen and brightness distribution.

Unmatchable color quality.

Liquid fluid motion.

Instant response.

True depth of image/solidity, not an over-bright "lightbox" effect.

Tons of resolutions/refresh rates.

Finest Dot Pitch - (until most recently, now being matched).

No Glow, Gamma Shift, Color Shift, stick Shift, oh wait it does have a stick shift....perfect image at any angle or distance.

Watching movies or even "HD" content looks incredible - better than the best 55" Samsung whatever HD TV - due to all of the reasons stated above.


Pros for LCD

lightweight, thin, easy to pack up and ship back to sender.
 
Liquid fluid motion.

Instant response.
Unfortunately, bad news: instant response does not, by itself, eliminate motion blur, because of sample-and-hold (see Why Do Some OLED's Have Motion Blur?). It's a solvable problem, however, with some additional engineering. I'd like to see measurements of the LG OLED's ability to eliminate motion blur. I'm particularly interested in how it compares to Panasonic's VT50 plasma (2500Hz FFD) as well as LightBoost LCDs. Most current full-color OLEDs still have more motion blur than those displays.
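The distinction, in numbers (my illustration, not Mark's data): pixel response and frame persistence are separate quantities, and sample-and-hold blur follows persistence, not response. A tiny Python sketch:

oled_response_ms = 0.1        # near-instant pixel transition, yet...
hold_60hz_ms = 1000 / 60      # ...each frame stays on screen ~16.7 ms
strobe_flash_ms = 1.4         # a LightBoost-style flash: low persistence

print(f"60hz sample-and-hold OLED: ~{hold_60hz_ms:.1f} ms persistence "
      f"despite a {oled_response_ms} ms response time")
print(f"strobed LCD: ~{strobe_flash_ms} ms persistence despite slower GtG")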
 
I have had a few FW900s, but I would only use such a small screen for gaming. It is 23" viewable, and 16:10 at that, so 16:9 content shrinks in the case of movies too. More appropriate for movies and console use, imo: I have a sony xbr960 34" widescreen crt with hdmi input that I use for ps3 and dvd's. I use it for these since it has very little input lag and no ghosting or blurring, unlike my 46" samsung lcd (led edge-lit) tv. Lower-rez content looks a lot better on the crt too (dvds, even though the ps3 upconversion is good to start with, and my ex-gf's wii looked 100x better on crt than lcd). The crt is in the corner, so the back of it utilizes the corner of the room. My slim 46" 'led' tv is in front of a window (back to the window). Its larger screen size and slim profile are tradeoffs. I do see motion blur on it, but I find viewing movies and shows enjoyable. The black levels are decent when tweaked, and the colors are good. In hindsight, I probably would have gotten a good plasma tv, since I mostly watch movies, where the black levels and detail in blacks would be a little better. Plasma also has slightly better motion vs blur due to the nature of the tech.
.
I'm keeping my eye on OLED, but burn-in (burn-out?) of colors, whether it addresses the sample-and-hold motion blur that lcd's are plagued with, and input hz for computer games and movies going forward are some of the things I'd be wary of.
 