What about HD Fury 4?
Can it handle 400MHz RAMDAC?
Just confirmed that 2304x1440 @ 80 Hz works on it. He can't do the 10-bit test yet, as he doesn't have a colorimeter.
Awesome. Derupter do we know if a DP version of this adapter exists?
The Lontium LT8711X-B is native USB-C
OK, so now we know they've connected all four lanes to the chip; I was worried about that.
About 10-bit: just check whether there's an option in the graphics panel to set 8- or 10-bit on the digital output, because without that I doubt it can work.
Amazing to see this thread still going! I used to frequent this thread daily for a good year or two, roughly 6-7 years ago, when I purchased a FW900 from UncleVito on eBay. It served me incredibly well for a few years when I was into playing Counter-Strike, but it had been collecting dust for the last couple of years. With the new Call of Duty releasing, I decided to bring the beast out again! Happy to report that after she warmed up, everything looked as good as I last recall.
On the product page it's labeled as a DP converter with USB-C as the alternate mode:
http://www.lontiumsemi.com/uploadfiles/pdf/LT8711X_Product_Brief.pdf
I imagine it would be possible to connect a display port cable, if we had the pinout for the chip. But then it might need new firmware or something, who knows.
Keep in mind the power issue. An adapter connected to a USB-C port can draw as much power as it needs; one connected to a DisplayPort input will receive only a limited power supply (hence the external USB power supply on the Sunix). That's why just replacing the USB-C plug with a DisplayPort one is likely to fail.
Oh, I forgot that the USB-C supply voltage is 5 V to 20 V while DisplayPort's is 3.3 V. The chipset should have a dedicated pin for that if it's designed to connect directly to a DisplayPort cable as well as a USB-C cable.
Does anyone know of a good DVI-D to VGA converter that can at least do 1920x1200 properly without the edges cut off?
I guess a modification would need to cut the lines for the video signal while leaving the power lines intact to go to a USB-C charger or connector? Then splice a DisplayPort cable directly to the chip?
I know we'd need a data sheet, and then there are voltage and current tolerances that would probably need to be considered. And then it might not work at all because the chip may need a different firmware.
I guess we really need a datasheet first.
What's your GPU?
Asus R9 390X
Given the limited demand for such devices, it's pretty likely all the adapters you'll find on the market are from the same batch that was tested here in the beginning. Either there is enough demand and they'll make another batch when everything is sold, or that reference will just disappear.

Looked quite a way back; OK, looks like I have to solder on a new DisplayPort connector, as Delock's are awful.
The cable was bought in the last month, so obviously Delock haven't bothered to improve things.
Hi,
I have two IBM C220p monitors, and have recently put two Delock 62967 (ANX9847) adapters on a Vega 56. However, it won't do more than 60 Hz, and it's limited to 6-bit colour per channel, so there's banding on screen. Same with both Windows 8.1 and Windows 10.
Any advice? I have a 780Ti I need to fix that I can potentially test the adapters with as well.
It's the 'almost original reference' Vega 56 as it's an MSI Airstream OC.
(No, I do not want to use another graphics card. I need something that works with PCI passthrough, that isn't too loud, and has decent open source support (RX5700 is too new, anything new Nvidia has poor open source support))
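For a sense of why 6-bit output bands so visibly, here is a quick sketch; plain Python, and the numbers are just standard bit-depth arithmetic, not measurements from this particular adapter:

```python
# Grey levels available per channel at different output bit depths.
# A 6-bit output quantizes a smooth gradient into far fewer steps,
# which is what shows up as visible banding on screen.

def levels(bits: int) -> int:
    """Number of distinct values per colour channel."""
    return 2 ** bits

def step_on_8bit_scale(bits: int) -> float:
    """Approximate spacing between adjacent levels on a 0-255 scale."""
    return 255 / (levels(bits) - 1)

for bits in (6, 8, 10):
    print(f"{bits}-bit: {levels(bits):4d} levels, "
          f"step of about {step_on_8bit_scale(bits):.2f} on a 0-255 scale")
```

So 6-bit gives only 64 levels per channel, with adjacent steps roughly four 8-bit counts apart, which the eye picks up easily on gradients.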
it's in decent condition, the only flaws are scratches to the AG film ...
My FW900 has the same problem. Quick warning: wiping with a microfiber cloth will scratch this film!
Did you end up doing anything about it? I've seen a video of someone just removing the anti-glare film, and I'm wondering if there's a good replacement that could be applied, maybe something like ClearCal. I'm worried if I just remove it, reflections are going to drive me nuts.
Removing the AR film is the most stupid thing to do unless it is so damaged it makes the screen unusable. It reduces reflections, increases contrast, protects against EMI and drains static electricity. That's some kind of custom film produced specifically for that application; you won't find any standard equivalent.
Stay away from ClearCal or any standard film advertised as ANTI-GLARE, as that's crap not suitable for a high-resolution display; AG means it makes the display blurry. The correct type of film is an AR (ANTI-REFLECTIVE) film, but even then, any standard film you'll find will only be a poor stopgap. It may improve things compared to no film at all, but it won't properly replace the original.
Although I do kind of like the appearance of the raw glass on my debezeled (de-cased?) FW900, I certainly would not have removed the film if I hadn't damaged it.
Thankfully, for much of the day in the room, it's dark enough that it's ok. I think I read that Sony did not apply the film to the BVM displays using the same or similar tubes? If so, that's at least some solace.
Ideally, the FW900 would have gotten a proper baked on AR treatment like the F520, etc. (And the rest of the F520's advancements, but I think the FW900 is close enough that the additional size probably wins out.)
I wonder how long I'd be able to extend the life of my monitors if I removed the anti-glare, since I would be able to reduce brightness by some amount.
I can't find the post again, but I answered the transmission question some time ago when I made measurements with a colorimeter, first with the film and then with the film removed. It's about 66%. If I remember correctly: 63% for blue, 66% for green, 67% for red.

In order to answer that question, we'd have to know two pieces of information:
1: The transmission of the film (i.e. what percentage of light from the phosphors passes through).
2: How much the cathode ages as a function of beam current (I believe beam current is linearly proportional to luminance). It would also be nice to know the same function for phosphor aging.
flod and I attempted to deduce (1), but didn't get very good data. See the discussion starting here. One way to do it would be to take luminance measurements of a full-field white screen while some of the antiglare is off: measure a spot where the film is still present, then compare it to a spot where it's removed.
(2) is likely much trickier to ascertain, and cathode aging is probably not linearly related to beam current.
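Purely as a back-of-envelope sketch, if one assumes both that luminance is proportional to beam current and that cathode wear rate is proportional to beam current (the second assumption is exactly the shaky one noted above), the life-extension estimate works out like this:

```python
# Rough life-extension estimate from removing the AG film.
# Assumptions (both questionable, see the discussion above):
#   1. luminance is proportional to beam current
#   2. cathode wear rate is proportional to beam current
# T is the film's single-pass transmittance; 0.67 is the value
# measured later in the thread, used here only as a placeholder.

T = 0.67

# Without the film, the tube only needs T times the beam current
# it needed with the film in place to hit the same screen luminance.
relative_current = T

# Under assumption 2, wear rate scales with current, so estimated
# cathode life scales with its inverse.
life_multiplier = 1 / relative_current

print(f"relative beam current without film: {relative_current:.2f}")
print(f"estimated life multiplier: {life_multiplier:.2f}x")
```

Under those assumptions, removing the film would stretch cathode life by roughly 1.5x at matched brightness; if wear is superlinear in current, the real gain could be larger.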
I just spent half an hour trying to get transmission measurements. Not satisfied with the way I set it up; it's hard to keep the smartphone (which I was using as a flashlight) at the same angle consistently, etc.
But got some decently consistent measurements.
With AG: 4.5 nits
Without AG: 10 nits.
(The flashlight's light passes through the film twice, once going in and once reflecting back out, so the measured ratio is T^2; hence the square root.)
(4.5/10)^0.5 = 0.67.
So ~67% transmittance, just as flod suspected.
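For anyone repeating this, here's the arithmetic as a tiny sketch; the square root is there because the reflected flashlight light crosses the film twice, once in and once out:

```python
import math

# Reflected-light transmittance estimate: the flashlight beam passes
# through the AG film going in AND coming back out, so the measured
# luminance ratio is T^2, and the single-pass transmittance T is its
# square root.

with_ag = 4.5      # nits, measured off a spot with the film present
without_ag = 10.0  # nits, measured off a spot with the film removed

T = math.sqrt(with_ag / without_ag)
print(f"single-pass transmittance is about {T:.2f}")
```

Swapping in your own two readings gives your film's estimate directly; the main practical error source is keeping the light at a consistent angle, as noted above.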
Crazy, so removing the AG means the tube only has to produce about two-thirds of the beam current to achieve the same brightness!
I may repeat in future with a better setup. If I have a regular flashlight, I can create a consistent setup from screen to screen fairly easily.
I have a PVM monitor that doesn't have the anti-glare; the look is exactly the same as an AG-less FW900. FYI, the FW900 tube and the BVM-24 tube aren't the same. They're very similar, and both employ SMPTE-C phosphors, but they're not the same tube, as far as I'm aware.
hah, turns out the measurements from 4.5 years ago were actually pretty accurate!