24" Widescreen CRT (FW900) From Ebay arrived,Comments.

I appreciate the good discussion around this, and I'm curious to see that side-by-side comparison.

Here's my damage level; it's a little more obvious in person:

[image: oQEdJjR.jpg]


But it's easier to ignore when it isn't on a solid bright area:

[image: rUNypRI.jpg]


(The fuzziness and wavy lines are just my camera being confused by the CRT.)
 
Well, if you remove the film, the monitor is basically unusable under overhead lighting. (Not that such lighting is a CRT's friend in any case.) And even in a darkened room, if there's a little ambient light, the phosphor layer will reflect it and the floor of your black level will be fixed at that higher point.

That said, alas, the damage in those images looks quite severe to me.
 
After an insane amount of work (which involved biking to Best Buy and spending $90 on VGA cables, navigating the clusterfuck that is the Santa Claus parade, making multiple trips to the basement at the back of the house, and carrying three FW900s across ice and snow and up and down flights of stairs, since the first two units I brought up were the wrong, nonfunctional ones!), I finally got the comparison done between antiglare and no antiglare.

I turned down the G2 to 0 on both machines at the same time, and the one with AG was much blacker.
I then calibrated each one, setting the luminance in each of the two G2 steps (remember how it asks you to adjust G2 twice; the second time is with the green gun) to 0.02 nits, and then following the WinDAS targets. I didn't do a post-hardware calibration with Argyll.
Next, I checked the delta E's of the gray levels and the luminance curve with HCFR on each monitor. Similar results. Black level on both was around 0.17 nits, which tells me that the black level you set G2 at isn't the same as the resulting black level.
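(If anyone wants to sanity-check HCFR's numbers: a gray-level delta E boils down to converting the measured and target XYZ readings to Lab and taking the distance between them. A minimal Python sketch of the old CIE76 formula, assuming a D65 white point - HCFR itself may use dE94/dE2000, which differ in the details:)

```python
# Minimal CIE76 delta E between a measured and a target XYZ reading.
# Assumes a D65 white point; HCFR's own math (and dE94/dE2000) differs in detail.
D65 = (95.047, 100.0, 108.883)  # reference white, Y normalized to 100

def xyz_to_lab(xyz, white=D65):
    def f(t):  # CIE nonlinearity, with the linear toe near black
        return t ** (1 / 3) if t > (6 / 29) ** 3 else t / (3 * (6 / 29) ** 2) + 4 / 29
    fx, fy, fz = (f(c / w) for c, w in zip(xyz, white))
    return (116 * fy - 16, 500 * (fx - fy), 200 * (fy - fz))

def delta_e76(measured_xyz, target_xyz):
    return sum((m - t) ** 2 for m, t in
               zip(xyz_to_lab(measured_xyz), xyz_to_lab(target_xyz))) ** 0.5

# e.g. a mid-gray patch that drifted slightly blue versus its target:
print(f"{delta_e76((20.1, 21.0, 24.5), (20.8, 21.9, 23.8)):.2f}")
```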
With lights off, there was no perceptible difference in image quality.
With lights on, there was more glare on the unit without antiglare, and black levels were raised more than on the unit with antiglare.
I also did a halation test, measuring black level on a full-field black screen versus a patch of black surrounded by thick white borders.
Results:

Unit without antiglare:
full-field black: 0.174 nits
small patch of black: 0.985 nits

Unit with antiglare:
full-field black: 0.173 nits
small patch of black: 0.821 nits

So antiglare seems to provide a slight improvement in halation.
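(To put rough numbers on that: dividing the white-surround reading by the full-field reading gives a crude halation ratio. A quick sketch using nothing but the measurements above:)

```python
# Crude halation ratio: luminance of a black patch inside thick white borders
# divided by full-field black, using the readings above.
readings = {
    "no antiglare":   {"full_field": 0.174, "white_surround": 0.985},
    "with antiglare": {"full_field": 0.173, "white_surround": 0.821},
}
for name, r in readings.items():
    ratio = r["white_surround"] / r["full_field"]
    print(f"{name}: white surround raises black ~{ratio:.1f}x")
# -> ~5.7x without the film vs ~4.7x with it
```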

I'm probably going to remove the antiglare on this unit though. There is a scratch and a couple of small scuff marks, and I'd like to be able to run the tube at a lower beam current (the antiglare blocks about 33% of the light, so with it removed the tube needs only about two thirds of the beam current for the same brightness).

That said, I do concede that, tube health and scratch issues aside, having the antiglare on is superior.

Some photos for your amusement:



[image: Z0jJlgP]

[image: KCvJgEH]

[image: 6UgNDqS]
 

Attachments:
• 4fGnB3w.png (459.6 KB)
• 4SylMT2.png (270.4 KB)
• j5cXLe3.png (713.6 KB)
Good writeup.

Don't know why you spent $90 on VGA cables at Best Buy though. Goodwill has bins full of them for $2 apiece. Nice meaty ones too, with ferrite chokes.
 
Higher beam current with the antireflection film is a non-issue IMO. The screen is built for it, and that model has been used for almost 20 years now without significant reports of tube issues (apart from a few catastrophic failures that have little to do with aging). It is only something you should pay attention to if you put on a film which is significantly darker than the original one.
 

I'm not saying that it presents an "issue" per se, but rather that the life of the monitor may be extended beyond its normally expected lifespan. CRTs don't last forever, and one of the reasons for this is that positively charged ions (impurities) can accumulate on negatively charged elements (e.g. the cathode or a grid). See sch3mat1c's answer to my question in this thread.

Now the assumption here is that reducing the cathode voltage, and thus the beam current, by a third (which is essentially what removing the antiglare lets you do) may reduce the rate at which the cathode deteriorates. The reasoning is that at lower beam currents there are fewer electrons flying through the tube, and at lower velocities; thus fewer and less violent impacts with stray gas atoms, which is what apparently causes the issues in the first place.

I've had two FW900s experience pretty severe flashing issues on startup, and one of them lost a lot of focus. I'm pretty sure both of these things can be attributed to the phenomena described above.

What remains an open question is what kind of function relates beam current to the rate of impurity accumulation. Perhaps a question for the arcade tube restoration experts.

Also, lower beam current means phosphors won't age as fast.
 
I understand the theory behind tube aging, but the foreseeable consequence should just be less brightness, with a potential drift in color (since the blue, red and green phosphors won't be impacted at the same rate).

Regarding flashing at startup, I'd sooner suspect a problem with the electronics, or an intermittent short related to the tube itself. That's a defect, not a tube aging problem. Same for the focus issue. The tube just renders the signal that is output by the boards; it doesn't process it in any way. If the focus is wrong then the signal must be wrong in the first place.
 

Did you read the thread I linked to? Shorts themselves can be caused by accumulation of impurities, which in turn are caused by collisions between electrons and gas atoms.
Also, when the centre of a cathode becomes less emissive (due to accumulation of impurities), the beam spot profile is widened, which directly impacts focus.
 
Shorts CAN be caused by accumulations; CAN is the key word here. It might happen, but under which conditions? You would need quite a massive accumulation for this to happen if everything was properly assembled and healthy in the beginning. That's not happening in real-life use, or there would be people reporting faulty screens all over this thread by now.

Just because one thing could in theory be explained a certain way doesn't mean that's necessarily what happens. Problems related directly to the tube are far less likely than problems coming from the rest of the screen, simply because the tube is a less complex and sturdier part than consumer-grade electronics. In real life, when a tube has a problem, it's most of the time either because there was a fault during manufacturing, or because it was physically damaged later. ;)
 

Flashing Trinitrons are quite common, and seem to be very much associated with age.

Are you suggesting that the cathode life of a properly assembled tube is infinite?
 
No no no. If it were an aging problem and flashing started because the tube's lifetime was exceeded, it wouldn't just be "common", it would have hit most of the screens by now. And by the way, I've seen and I own quite a few Trinitrons; the only time I've seen the display flashing on one, it was one channel of a video amplifier that died on the A board. And it only flashed green for a couple of days before losing the channel completely and leaving a purple display.

Flashing means an intermittent problem, and if it clears up temporarily once the screen is hot, then you have all the typical symptoms of a VERY COMMON problem with electronics of that era: BAD SOLDER JOINTS. It can also betray a problem with a component, but it's not always easy to find which one.

A cathode certainly doesn't have an infinite life. But it is much longer than that of an electronic board with hundreds of different components, some of which have a quite limited life expectancy, and more than twice as many potential solder issues as there are components. Really, focusing on the tube as the primary source of issues is as ridiculous as blaming the flyback for every problem, especially without an extensive diagnostic of the screen to at least roughly locate the part causing the problem.
 

I'm not claiming that cathode wear is the primary source of issues, but the cathode and other grid components do age! When people talk about how much life is left in a tube, it's primarily due to these effects.

Are you suggesting that things like G1 and H-K shorts are caused by bad solder joints???

Cathode wear and cathode poisoning *are* issues that often present themselves! There are many associated symptoms, and some of them can be (temporarily) alleviated through tube restoration techniques. Devices like the Sencore CR7000 can test gun emission and shorts.

Here is an excerpt from Jeroen Stessen, an engineer who was at Philips for 25 years (LinkedIn profile here):

I've bolded the relevant parts (taken from here)


CRT Degradation
CRT Aging - Effects on Electrical Characteristics and Performance
(From: Jeroen H. Stessen ([email protected]).)
Specifications for Philips CRTs can be found in the regular series of data books from Philips Components. Companies and universities usually have them. Usually the data sheets show typical Ik/Vk characteristics. They also list the spread on cutoff voltage and cathode gain, and this spread is quite large even on new CRTs. They also list phosphor sensitivity (Lum/Ik), this too has a large spread. But they almost never list anything about the aging process.

Here are some of the effects:

- Phosphor ages due to burn-in, particularly on static pictures; this is immediately obvious on visual inspection. If the aging is even (no pattern) then at least the efficiency is reduced.
- Cathodes age due to loss of emission material, particularly for oxide cathodes. The central part of the cathode surface has carried the most current density and will wear out first. The surrounding area takes over; this will lead to an unsharp picture. Adjusting the focus voltage will not really improve it. The tube is worn out.
- Poisoning of the cathode surface may also occur. This can be cured temporarily by short-time overheating ("re-conditioning").
- The cathode that wears out first (often the red one) also loses gain, so the white point of the image will shift (to cyan). The white point can be re-adjusted with the gain potentiometers and the contrast, but peak brightness will not be as high as new.
- The cutoff voltages of all cathodes will drift. Common drift is adjusted by the user by controlling the brightness. Differential drift leads to a coloration of the black background level. In extreme cases vertical flyback lines will appear. Cutoff voltage can be adjusted with potentiometers, or there is automatic stabilisation. Still, the VG2 (screen) voltage may need periodic adjustment too.
- Leakage currents may disturb VG2 and the focus voltage; re-adjustment has only a temporary effect.
- VG2 and focus potentiometers may wear out due to electromigration etc. A hole may form under the wiper; re-adjustment is then impossible.
- Some types of cathode wear (according to a friend in Philips Semiconductors) can cause the Ik/Vk transfer characteristic to diverge so much from an ideal gamma function that no adjustment can compensate for it. Then the tube is really worn out.
I hope that this helps you to distinguish between a really worn out tube and one that still has some life in it after re-adjustment.
 
Are you suggesting that things like G1 and H-K shorts are caused by bad solder joints???
Please stop suggesting things I've never suggested and just read what I write; that'll save a pointless argument. If a short happens in the tube of a Trinitron, then it most probably is a tube with a defect, not the result of aging under normal conditions of use.

Cathode wear and cathode poisoning *are* issues that often present themselves! There are many associated symptoms, and some of them can be (temporarily) alleviated through tube restoration techniques. Devices like the Sencore CR7000 can test gun emission and shorts.

Here is an excerpt from Jeroen Stessen, an engineer who was at Philips for 25 years (LinkedIn profile here):

I've bolded the relevant parts (taken from here)
Well, your quote says exactly the same thing I did a couple of posts ago: tube aging affects brightness and may lead to color drift ...

I understand the theory behind tube aging, but the foreseeable consequence should just be less brightness, with a potential drift in color (since the blue, red and green phosphors won't be impacted at the same rate).

Your quote also talks about the picture losing sharpness (that makes sense), but be careful: this is not a focus problem and it doesn't involve the same parts/circuits at all. The beam becomes wider and hits a bigger area, BUT it is still focused. A focus issue can't be caused by tube aging.

Besides, keep in mind that this is a very instructive website about CRTs in general (I've used it often, BTW), but it isn't specific about models or conditions: what is or may have been a problem with a CRT from the 80s will not necessarily happen on a Trinitron from 2000, or not the same way, nor under the same conditions. CRT TVs also won't behave the same way as a computer monitor.
 
Your quote also talks about the picture losing sharpness (that makes sense), but be careful: this is not a focus problem and it doesn't involve the same parts/circuits at all. The beam becomes wider and hits a bigger area, BUT it is still focused. A focus issue can't be caused by tube aging.

I'll grant you that, although cathode aging can indirectly cause focus issues, since the electron optics have a range of voltages over which they perform optimally. If the cathode is depleted too much, the drive needs to be boosted, which affects the ability to focus.

But independent of focus, cathode aging directly affects image sharpness due to the wider spot.

So I guess we disagree on how much the more modern Trinitrons' cathodes age. Vito would often comment about how he could measure the remaining tube life of his FW900s by measuring emission. Doesn't this fact alone tell you that these tubes do age, independently of the other electronics?
 

I would hazard a guess that when checking emission, it's relative to whatever level of current/voltage you're applying to the cathodes. In other words, if I give the cathode, let's say (this is an arbitrary example), 50 mV and it's only able to produce 85 cd/m2 of full white, while another tube can produce 95 cd/m2 given the same voltage, then you could probably surmise that the first tube is more worn than the second.

All of this, of course, is a guess on my part.

From personal experience, it's true that aged tubes require more "juice" to reach a given specification. Newer tubes require less effort to do so. I saw this in many different televisions that I used to take apart and play with.
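(That guess reduces to simple division; a toy sketch using only the arbitrary example numbers above, not a real test procedure:)

```python
# Toy wear indicator: luminance ratio at the same (hypothetical) drive level.
def relative_emission(lum_tube, lum_reference):
    return lum_tube / lum_reference

# The arbitrary 50 mV example from above: 85 vs 95 cd/m2 of full white.
print(f"{relative_emission(85, 95):.0%} of the reference tube's output")
```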
 

Yeah, I believe the Sencore CR7000 directly measures the current across the gun under certain controlled conditions.
 
Don't get the Studio; it's a somewhat neutered version of the Pro.

More details here

My recommendation is the OEM Rev B (one caveat is that I don't think that version will work with the X-Rite software, but you won't be needing that software anyway).

The WinDAS cable is not a special cable; it's just a USB to TTL cable. The PL2303HX chipset is the one I use, and I think it has been tried and tested successfully by others here.
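(Side note for anyone building such a cable: before fighting with WinDAS, it's worth checking that the adapter even enumerates. A quick sketch using the pyserial package; port names and description strings vary by OS and driver:)

```python
# List serial ports and flag anything that looks like a Prolific PL2303.
# Needs the pyserial package (pip install pyserial); names vary by OS/driver.
from serial.tools import list_ports

for port in list_ports.comports():
    desc = (port.description or "").upper()
    hint = "  <- likely the WinDAS cable" if "PL2303" in desc or "PROLIFIC" in desc else ""
    print(f"{port.device}: {port.description}{hint}")
```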

I have an X-Rite Eye-One Display 2, part no. 35.54.30 Rev A.

Is it possible to use that with WinDAS? If so, with what operating system?
 

You can use any instrument with WinDAS, including your own biological eyes if you trust them. As far as I know, the Eye-One Display 2 is compatible with Argyll and HCFR (both of which work in Windows; not sure about Linux or Mac). Keep in mind that the instrument probably can't read luminances as low as the i1 Display 3 can, so you might not want to lower the G2 too much when setting the black level.
 
Finally tried a patch of the dark film from 3DLens (https://3dlens.com/linear-polarizer-film.php) we discussed earlier. Alongside a patch of the original film and the raw glass.

In one screen of a game, I could see where even the original, lighter film was enough to meaningfully bring a black part of the game's image home, as opposed to seeing the phosphor reflection in place of black. In another screen with a black area, I didn't perceive much difference, I guess because of the overall image in that screen.

All in all, with both films it seemed kind of meh to me. Probably I'm just too used to how it looks without the film at this point. And this room is pretty dim much of the time.

Definitely think less is more with regard to the amount of tinting. Might check out some other stuff along that theme... (And I would not remove the original film unless it were damaged.)

It would still be interesting to me whether Sony designed the similar D24 without any AR treatment. In images, it looks like it's without, but who knows.
 
This is a bug in recent AMD drivers, I think. It's definitely an issue with my 5700 XT. Any "standard resolution" is using reduced LCD timings instead of CRT timings.

What you need to do is recreate the resolution in AMD's custom resolution tool and choose GTF or CVT timings.

You should also try using CRU to do it, just to do me the favor of finding out whether it works with your card or not, because AMD is definitely ignoring CRU overrides with my 5700 XT for some reason.
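(For anyone wondering what choosing GTF/CVT actually changes: the blanking intervals, and with them the pixel clock and the horizontal retrace time the monitor gets. A rough comparison for 1920x1200 @ 60 Hz using totals from the published CVT modelines - exact clocks differ by a hair since CVT rounds them:)

```python
# CRT-friendly CVT vs. reduced-blanking CVT-RB for 1920x1200 @ 60 Hz,
# with htotal/vtotal taken from the published CVT modelines. A CRT needs
# a few microseconds of horizontal blanking for beam retrace, which is
# exactly what the "reduced" LCD timings cut away.
H_ACTIVE, REFRESH = 1920, 60.0
modes = {
    "CVT (CRT-style)":    {"htotal": 2592, "vtotal": 1245},
    "CVT-RB (LCD-style)": {"htotal": 2080, "vtotal": 1235},
}
for name, m in modes.items():
    pclk = m["htotal"] * m["vtotal"] * REFRESH          # pixel clock, Hz
    hblank_us = (m["htotal"] - H_ACTIVE) / pclk * 1e6   # blanking time per line
    print(f"{name}: ~{pclk / 1e6:.0f} MHz pixel clock, "
          f"~{hblank_us:.1f} us horizontal blanking")
# -> ~194 MHz with ~3.5 us blanking, vs ~154 MHz with only ~1.0 us
```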
Hey, I tried CRU at 1920x1200 @ 70 Hz and it worked with no issues, thanks!
 
Finally tried a patch of the dark film from 3DLens (https://3dlens.com/linear-polarizer-film.php) we discussed earlier. Alongside a patch of the original film and the raw glass.

In one screen of a game, I could see where even the original, lighter film was enough to meaningfully bring a black part of the game's image home, as opposed to seeing the phosphor reflection in place of black. In another screen with a black area, I didn't perceive much difference, I guess because of the overall image in that screen.

All in all, with both films it seemed kind of meh to me. Probably I'm just too used to how it looks without the film at this point. And this room is pretty dim much of the time.

Definitely think less is more with regard to the amount of tinting. Might check out some other stuff along that theme... (And I would not remove the original film unless it were damaged.)

It would still be interesting to me whether Sony designed the similar D24 without any AR treatment. In images, it looks like it's without, but who knows.

Thanks! This is exactly the kind of test I was hoping to hear about. Would you recommend the 3DLens film as a replacement if the original is scratched up like mine? It looks like their P620A could be cut down to the right size and stuck on.

I'd also be curious how it compares to the original with reflections in a brighter room. Maybe you could turn on a lamp behind you? :¬)
 
A couple of people have used polarizers similar to this one and seemed happy with them. It is more effective at getting good black levels than the original film, but keep in mind it has absolutely no EMI shielding effect (static electricity will also build up on the surface), probably little antireflection effect either, and it will require pushing the tube much harder to reach the same brightness. At that level it may become a problem sooner or later (funny to say this just after that little debate on tube aging :p).
 
It is more effective at getting good black levels than the original film, but keep in mind it has absolutely no EMI shielding effect (static electricity will also build up on the surface...

This is something I'm really beginning to appreciate, now that I have the antiglare on. Not having to wipe off the dust every day is quite nice!
 
Both films reflect, but I agree that the 3DLens film does so much more than the original.

Both are effective at suppressing reflection from the phosphor, making the screen more usable in a well-lit room. (And with a flashlight pointed at it as well.)

That said, I don't think a CRT, which carves out its big dynamic range on the black side, works well in a well-lit room. It needs to be a dim room, I think.

As to recommending the film: well, it's probably too dark for me. Actually, even with the original film, there's something there, the way it mutes the colors or something, that I don't think I like anymore. It comes at the cost of the black levels though. So maybe a film, but the lighter the better, to a point.
 
Actually, even with the original film, there's something there, the way it mutes the colors or something, that I don't think I like anymore. It comes at the cost of the black levels though. So maybe a film, but the lighter the better, to a point.

Did you see my recent experiment (I had both monitors being fed identical signal via a splitter, and calibrated them simultaneously with WinDAS)? I couldn't see a difference with lights off.

One thing that did somewhat appeal to me without antiglare was that the screen looked more "glassy" with the lights on. But that was precisely because of the reflections!
 
A couple of people have used polarizers similar to this one and seemed happy with them. It is more effective at getting good black levels than the original film, but keep in mind it has absolutely no EMI shielding effect (static electricity will also build up on the surface), probably little antireflection effect either, and it will require pushing the tube much harder to reach the same brightness. At that level it may become a problem sooner or later (funny to say this just after that little debate on tube aging :p).

My FW900's black is a bit light already, which I've heard is something that Trinitrons drift towards with age, so a slight darkening might actually help. Hmm, I may go for this.

It would be nice to find a film with anti-static properties too, but I'm not coming up with much on that search.
 
Did you see my recent experiment (I had both monitors being fed identical signal via a splitter, and calibrated them simultaneously with WinDAS)? I couldn't see a difference with lights off.

One thing that did somewhat appeal to me without antiglare was that the screen looked more "glassy" with the lights on. But that was precisely because of the reflections!

Your experiment A/B-ing the two monitors was far superior to me sticking patches of the original and 3DLens films on the screen.

You found the colors to be just as vibrant with the film? (More of an issue, I suppose, with the darker film.)
 

Yes, no difference in anything like that.

For some reason, which I hope to understand in the near future, even though I set the G2s to the same level, the colors weren't identical for the first 20-30% of the luminance range post-calibration. But this probably had nothing to do with the antiglare. I'm pretty sure that if I had set the black level lower, this wouldn't have been an issue.

So it wasn't as clean an A/B test as it could have been, but I'm confident that this difference didn't interfere with my ability to compare the quality of the image. The contrast and luminance ranges were virtually identical, and I couldn't detect any difference. I loaded up a very high quality Blu-ray rip of Quantum of Solace and looked carefully at a few scenes too.

I can certainly understand that if you were to do a side-by-side patch comparison on the same screen, as in your case, the colors would appear less vibrant - colors lose their vibrancy at lower luminances. And as you understand, if you were to install a film on a naked screen, you'd need to boost the luminance to compare apples to apples.
 
My FW900's black is a bit light already, which I've heard is something that Trinitrons drift towards with age, so a slight darkening might actually help. Hmm, I may go for this.

It would be nice to find a film with anti-static properties too, but I'm not coming up with much on that search.
If your screen is getting too bright, what you need is to use WinDAS to set it properly again. Using a darker film is certainly not a good way to fix that, as G2 will keep increasing over time, to the point where the display becomes green/blurry and scan lines become visible.

If you still want to replace your film with a polarizer, I would advise you to look for either:
- a polarizer with low efficiency (meaning not all of the light flux is polarized; some can maybe be found around 60-65% transmittance). But you have to be careful, as this may also mean low film quality and some haze issues.
- a polarizer rated for 50% transmittance. That's the clearest polarizer that can be found, and pretty new. Few companies sell it, and I'm not sure how much it may cost, though.
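(To put those transmittance figures in perspective: the required light output scales roughly as one over the film's transmittance. A quick sketch, taking the ~67% figure quoted earlier in the thread for the stock AG film:)

```python
# Relative tube output needed for the same on-screen brightness through
# different films, assuming output simply scales with 1/transmittance.
# Stock AG ~67% is the figure quoted earlier in the thread.
films = {
    "bare glass":     1.00,
    "stock AG film":  0.67,
    "65% polarizer":  0.65,
    "50% polarizer":  0.50,
}
for name, transmittance in films.items():
    print(f"{name}: {1 / transmittance:.2f}x light output needed")
```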
 
Not the greatest reviews on quality but I am going to try this as well.
I agree. Try it and let me know what you think. So far I did manage to get 2560x1600, but it looks weird; there are some green lines in the blacks. But I am satisfied with 1920x1200 @ 70 Hz when gaming.
 
don't bother with a polarizer unless you are sure it is a circular polarizer.
a neutral density film is better than a linear polarizer.

personally i would not remove the ag film unless it's really really scratched
 

yea, after using this new tube with the AG for a few days, I'm beginning to get accustomed to its benefits. It would be nice if there were a good replacement though; I do have a couple of scratches - thick enough that they're not rainbowing, but still noticeable.

I don't know anything about the physics of polarization/glare reduction, but why is circular better in this context?

I get why neutral density is important (you don't want to shift the primaries).
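(For what it's worth, the usual explanation is that a circular polarizer absorbs its own reflections: ambient light is circularly polarized on the way in, a specular reflection flips its handedness, and the return pass absorbs it, whereas light bounced back through a linear polarizer keeps its orientation and sails straight out. A toy budget with idealized numbers - real phosphor reflection is partly diffuse, which depolarizes the light and softens both figures:)

```python
# Toy ambient-reflection budget: linear vs. circular polarizer film on a CRT.
# Idealized: the polarizer passes exactly half of unpolarized room light, the
# tube face reflects 5% specularly, and diffuse (depolarizing) reflection off
# the phosphor is ignored - real films land somewhere between these extremes.
AMBIENT, SPECULAR_R = 1.0, 0.05

def linear_film():
    into = AMBIENT * 0.5         # half the unpolarized light gets through
    back = into * SPECULAR_R     # a specular bounce keeps its orientation...
    return back * 1.0            # ...so it exits the polarizer unattenuated

def circular_film():
    into = AMBIENT * 0.5         # same half gets in, circularly polarized
    back = into * SPECULAR_R     # the bounce flips the handedness...
    return back * 0.0            # ...so the exit pass absorbs it (ideally all)

print(f"linear: {linear_film():.4f}, circular: {circular_film():.4f} of ambient")
```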
 
Last edited:
Some fresh information about the Delock 62967 ... I bought 4 of them for my CRTs from Reichelt.de, unpacked one and tried it:
- With a HD5850 / Catalyst 14.12, I only get up to 1280x1024 @ 85 Hz and can't set anything higher with CRU.
- With a R9 380X / Adrenalin 18.12.3, I get resolutions up to 1600x1200 @ 60 Hz, and when I try setting standard resolutions for the FW900 (like 1920x1200 @ 85 Hz), the screen keeps clicking and the display remains black most of the time, with some sporadic images. On top of that the system more or less hangs; I have to unplug the adapter for this to stop.
I'll try a few more things, but digging back in the thread, I guess I've fallen victim to their shit connector and I'll have to add some cable soldering to my long list of things to do ... :dead:
 
Confirmed, this is the exact same behaviour reported previously. The display is stable as long as the resolution/refresh rate is low enough to remain on HBR links; as soon as it switches to HBR2, the converter goes south.
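(Back-of-the-envelope on where that boundary sits: a DisplayPort link's payload is lanes x lane rate x 0.8 for the 8b/10b coding, and the video stream needs roughly pixel clock x bits per pixel. A rough calculator - it ignores packetization overhead, so a real driver may jump to HBR2 earlier than this suggests:)

```python
# Which DisplayPort link rate a mode roughly needs, assuming 4 lanes and
# 8b/10b coding (payload = lanes * lane_rate * 0.8). Ignores packetization
# overhead, so a real driver may switch up to HBR2 earlier than this.
LANE_GBPS = {"RBR": 1.62, "HBR": 2.70, "HBR2": 5.40}  # raw rate per lane

def link_needed(pclk_mhz, bpp=24, lanes=4):
    stream_bps = pclk_mhz * 1e6 * bpp
    for name, rate in LANE_GBPS.items():
        if stream_bps <= lanes * rate * 1e9 * 0.8:
            return name
    return "beyond HBR2"

print(link_needed(193))  # 1920x1200 @ 60: fits comfortably in the low rates
print(link_needed(368))  # ~2304x1440 @ 80 territory: tips past HBR into HBR2
```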

I suppose no one has gotten a retail Delock 62967 working properly with an AMD card so far?

edit: I tried cleaning both the male and female DisplayPort connectors with isopropyl alcohol just in case, and also tried pushing the male connector into the female one with more force. It seems to hold better, and the display was also more stable. Did anyone try just removing the plastic cover of the DP connector of the adapter? I'm starting to wonder if the culprit isn't something as stupid as the cover overhanging the connector itself by 1 or 2 mm, preventing the plug from being pushed in deep enough for proper contact. And maybe DP ports are a tad deeper on AMD cards than on Nvidia ones.

edit2: Bingo. I compared with the DisplayPort plug of the Sunix adapter; the Delock's is about 1.5 mm shorter. So the issue must definitely be the nice crappy red and black plastic cover of the plug ... :facepalm:
 
yea, after using this new tube with the AG for a few days, I'm beginning to get accustomed to its benefits. It would be nice if there were a good replacement though; I do have a couple of scratches - thick enough that they're not rainbowing, but still noticeable.

I don't know anything about the physics of polarization/glare reduction, but why is circular better in this context?

I get why neutral density is important (you don't want to shift the primaries).

I think circular was the one that has a visible seam. It didn't appear to be an option. I tried the linear one.

Even if one has to come up with their own way to mount it, this one might be interesting -- http://www.kantek.com/p-LCD24W/c-10...y-Filter---Fits-121-Widescreen-Notebooks.html
 
Nope, not interesting at all, especially at such a price. This is sold as an ANTIGLARE protection filter, meaning the optical properties are most certainly crap. This is even more likely because there is absolutely not a single precise figure for said optical properties (haze, transmission, reflection levels ...), only vague promises.
 