Discussion in 'Displays' started by mathesar, Sep 13, 2005.
That and the settings don't transfer inside games either.
I can see how LUT adjustment could degrade quality without a 10 bit DAC, but if you do have a 10 bit DAC, then a LUT adjustment makes a massive difference if done correctly. And depending on the game, you can often get the game to respect your LUT, and there are tools like cpkeeper that can help in many cases (though perhaps not all).
I think around 30-50% of games I've tried did override the custom LUT. When that happens you're stuck with the in-game gamma slider, which sometimes isn't "wide" enough. I think LUTs look really nice, especially in movies; it's like they de-crush the whole image and there's suddenly much more detail everywhere. The image looks very "soft" and "real" compared to a more contrast-y original.
By the way, I still get slightly crushed blacks with LUTs, possibly because I only have a DTP94 and not something better? Is there maybe some GUI tool where I could adjust the luminance curve further myself and see the results on-the-fly? I've only worked with the dispcal CLI so far.
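I don't know of a dedicated GUI for this, but the curve itself is simple enough to play with in a few lines of Python before loading it with your usual tool. A minimal sketch of the kind of ramp a 1D LUT applies; the 10-bit size, the gamma values, and the `black_lift` parameter are my own assumptions for illustration, not what dispcal actually computes:

```python
def build_lut(size=1024, native_gamma=2.4, target_gamma=2.2, black_lift=0.0):
    """Build a 1D LUT (10-bit entries by default) that reshapes the
    display's native gamma toward a target, with an optional lift.

    The display applies out = in ** native_gamma, so to end up with
    out = in ** target_gamma we pre-apply in ** (target / native).
    black_lift > 0 raises the whole curve off zero, which can reveal
    shadow detail that would otherwise be crushed.
    """
    lut = []
    for i in range(size):
        x = i / (size - 1)
        y = x ** (target_gamma / native_gamma)
        y = black_lift + (1.0 - black_lift) * y  # lift shadows slightly
        lut.append(round(y * (size - 1)))
    return lut

ramp = build_lut(black_lift=0.02)  # 2% lift to de-crush near-black
```

Tweaking `black_lift` and re-loading the ramp is basically the "adjust and see it on-the-fly" loop, just without the GUI.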
spacediver, what are our options today for a 10 bit DAC? The Sunix adapter is 8 bit, right? Same goes for modern Nvidia cards? That info from Derupter about the LT8612UX sounds really sweet!
Define "modern". You mean a 980 Ti? What about GTX 4xx/5xx?
I copied the OP and posted it to another forum that I hang out on, as if I just bought this monitor recently.
People are losing their minds over it.
Modern as in "can play the latest games reasonably well". A GTX 480 would be a bit of a stretch then, but could still be considered modern, I guess. But feel free to include old cards too if you want; I am building a GeForce 7 rig right now, so there is plenty of love for the older cards too.
What forum is it? I want to have fun too.
If anyone is interested in selling mint-condition FW900s with AG still intact, please PM me. Thanks.
^I'll pay 1.5x whatever he's offering
Lol. Ok let’s start a bidding war.
Radeon cards do 8-bit dithering and can even correct the gamut if you have an EDID emulator. Unfortunately the white point is not set properly, and the Radeon gamut correction can only work with data from the EDID; it behaves strangely when the white point is not exactly where the sRGB spec says it should be.
Though of course the gamut is close enough to sRGB for that not to be an issue, and for movies, where gamut actually matters, gamut/gamma/etc. correction is best done with madVR <- as is everything related to watching movies on a PC, best piece of code ever, especially for SD movies.
BTW, it's ridiculous that NV just doesn't add the ability to load an ICC profile. That plus dithering (which these cards can do, they just won't enable it) would solve all issues related to calibration...
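For anyone wondering what the dithering actually buys you: truncating a 10-bit LUT output to 8 bits throws away the bottom two bits, but spreading them over neighbouring pixels preserves the average level, which is why dithered 8-bit output avoids banding. A toy sketch with a 2x2 ordered-dither threshold map (the matrix and pixel layout are illustrative; this is not how the Radeon hardware actually implements it):

```python
BAYER_2X2 = [0, 2, 3, 1]  # 2x2 threshold map; each value 0..3 appears once

def dither_10_to_8(v10):
    """Spatially dither one 10-bit value over a 2x2 block of 8-bit pixels.

    The two dropped bits become a 0..3 fraction; that many of the four
    pixels get bumped up by one level, so the block's total equals the
    original 10-bit value exactly (i.e. the average is preserved).
    """
    base, frac = divmod(v10, 4)  # 10-bit value = 4 * 8-bit base + remainder
    return [base + (1 if t < frac else 0) for t in BAYER_2X2]

block = dither_10_to_8(515)  # 515 = 4*128 + 3: three pixels at 129, one at 128
```

The eye averages the block, so a 10-bit gradient survives on an 8-bit link instead of collapsing into visible steps.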
I'm thinking about upgrading my PC and so I would like to upgrade my 980Ti which we all know is the last DVI-I card.
Are there any better alternatives to the DeLOCK 62967? Of course I'm going to use WPB and a custom ICC profile...
The Delock 87685 is better or if you are in USA, the SUNIX DPU3000.
I'm looking at upgrading soon too. I really want to keep my recently purchased FW900. Is there a way I can use a 2080 Ti with it? Interestingly, we're in the same ballpark, although I reside in the UK. I did notice the EVGA 1080 Ti has a DVI out on it (if the image is correct). But after doing light research I saw that I may be able to use DisplayPort to connect to my FW900. Is there a possibility of using a 2080's DisplayPort or HDMI to connect my monitor? Which is best and possibly available in the UK? I saw these but I am unsure of the quality: DELL and HP. I could really do with some expert advice on the right way to approach it. Thanks in advance.
popalazarou If you're getting a 2080Ti, then you want the Sunix DPU3000 so you can run crazy shit like 2304x1440@85hz.
Since you're in the UK, it may be easier to find the Delock version. They all use the Synaptics VMM2322 chip.
I can see one Sunix DPU3000 on Amazon UK so I'm pretty chuffed; it does look precisely what I'm after. This sounds nuts. I was a little skeptical of my recent FW but I have a reasonable calibration and brightness that I'm happy with (perfect blacks). It seems even better than my last one and is in very good condition, so I may now go down this road. Witcher 3, Cyberpunk, Metro, Tomb Raider would be pretty amazing at 2304x1440 @85Hz. Does it come with software? I've never used one before. https://www.amazon.co.uk/gp/offer-listing/B00M89Q3FC/ref=dp_olp_new?ie=UTF8&condition=new
I noticed this item at a much lower price; would it do as good a job as the Sunix?
I don't remember which Delock adapter had the Synaptics chip.
Derupter do you remember which one it was? This one looks like it may also have the chip.
I listed it above, as it looks exactly like the SUNIX, but I could be wrong.
Edit: I looked at Delock's website and it's the correct one.
These are just plug and play. There is no software.
No idea, it may be hit or miss. It was just luck that the VMM2322 is able to support high clocks.
Whoops, I missed that post. Yeah, the Delock 87685 is the same as the Sunix and might be cheaper in the UK.
Though that adapter popalazarou linked might be worth looking into, now that Nvidia's cards have USB-C out. It's also a splitter, so there's a possibility it's using the Synaptics chip.
This is the 87730. I just asked Delock which chipset is inside it; if it's the VMM2322, it will be very good for that price.
Looking at Amazon UK, if you don't need a pixel clock over 330-340 MHz (1920x1200 96 Hz is around 322 MHz), the best is this:
Other USB-C solutions are:
Delock 62994 with Realtek RTD2169U
Delock 63924 with IT6562 and 10 bit DAC
Both have 360 MHz input bandwidth, but no one has tested them, so we don't know how high the DAC can go.
If you need more, like 2304x1440 80 Hz or extreme pixel clocks, the only solutions are:
Sunix DPU3000-D3 (D2 is for cards with mini DisplayPort)
Both use the Synaptics VMM2322 chipset.
On Amazon UK there is also the adapter discovered by CommandoSnake
It's been tested up to 1920x1200 96 Hz; for that price it's very interesting. The only problem is that it needs to be set to YCbCr instead of RGB, and I don't know if that can cause problems with some configurations.
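If it helps anyone sanity-check a mode against an adapter's pixel clock limit: the clock is just total pixels per frame (active plus blanking) times the refresh rate. A quick sketch using the VESA DMT totals for 1280x1024 @ 85 Hz as the worked example; note the blanking totals depend on which timing standard you pick in CRU, so other standards give slightly different clocks:

```python
def pixel_clock_mhz(h_total, v_total, refresh_hz):
    """Pixel clock = (active + blanking) pixels per frame * frames per second."""
    return h_total * v_total * refresh_hz / 1e6

# VESA DMT timing for 1280x1024 @ 85 Hz: 1728 x 1072 total pixels
clk = pixel_clock_mhz(1728, 1072, 85)  # ~157.5 MHz
print(f"{clk:.1f} MHz, fits a 330 MHz adapter: {clk <= 330}")
```

Run the same arithmetic with your own custom mode's totals before buying an adapter.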
Thanks for all the replies, so it's the Delock 87685 or the Sunix DPU3000-D3. Both are more expensive than what I paid for each of my FW900s. I'll stick with the Delock 87685 as it's available in my country. Even though it's a splitter, does it work as a normal DisplayPort to VGA converter? So just one output from Nvidia Control Panel, just as if it were a single VGA output?
They call it a splitter because it has multiple outputs, but it works like all the other adapters.
If you need more than 1920x1200 96 Hz, those two are the only ones we know of, but keep in mind that some users have had minor problems with the Synaptics chip, at least with the Sunix adapter.
So if your new card comes with a USB-C output and you don't need super high resolutions, consider a cheaper solution like the Plugable USB-C to VGA or the Delock 62796.
Yeah, it even passes through the monitor's EDID to the GPU.
Because it's a splitter, you can actually connect multiple monitors too! Though I have not tried that myself, lol.
I have a small, semi off-topic question. I bought an Eizo T57S CRT; it has a nice Trinitron display and its spec is 1280x1024 at 85 Hz. I have a Quadro P620 GPU, which only has mini DisplayPort outputs. I bought a cheap Akasa mini DisplayPort > VGA adapter and, using Win 10, I could only run 60 Hz no matter the resolution.
I installed Win 7 and somehow got the option to set it to 75 Hz, which helped a lot with flicker, but it's still not 85 Hz. Any idea how to get 85 Hz? I don't know if it's a limitation of the VGA adapter or a software thing.
Try with CRU:
Read the instructions, then set a resolution (detailed or standard or both); with detailed you have more settings (use CRT timing there).
After this, restart the driver with the integrated utility.
Your monitor has these limits:
30 to 92 kHz horizontal frequency
50 to 160 Hz vertical frequency
Usually DisplayPort adapters are limited only by pixel clock; if you set a resolution over the max pixel clock, it simply doesn't appear in the Windows control panel.
You can do 1280x1024 85 Hz without problems, or for example 1024x768 110 Hz or 640x480 160 Hz.
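Those two scan limits are easy to check for any custom mode before you even open CRU: the horizontal frequency is just the vertical refresh times the total line count (active plus blanking). A quick sketch; the 1072 total-line figure is the VESA DMT value for 1280x1024 and will differ a bit with CRT/GTF timing, so treat it as an illustrative number:

```python
def mode_fits(v_total_lines, refresh_hz,
              h_range_khz=(30, 92), v_range_hz=(50, 160)):
    """Check a mode against a CRT's scan-frequency limits.

    Horizontal frequency (kHz) = total lines per frame * frames per
    second / 1000. Defaults are the Eizo T57S limits quoted above.
    """
    h_khz = v_total_lines * refresh_hz / 1000
    return (h_range_khz[0] <= h_khz <= h_range_khz[1]
            and v_range_hz[0] <= refresh_hz <= v_range_hz[1])

# 1280x1024 @ 85 Hz with 1072 total lines -> ~91.1 kHz, just under the 92 kHz cap
print(mode_fits(1072, 85))
```

Note how close 85 Hz already is to the 92 kHz horizontal limit at this resolution; pushing refresh higher means dropping lines or resolution.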
I received a reply from Delock, no Synaptics chipset inside 87730 splitter.
Those of you with good graphics cards: how do the recent Nvidia cards from the last 2-3 generations do when it comes to supersampling/downsampling? I'm looking to run games in 1440p or 4k and downsampling to 1920x1200, but I've never been able to get Nvidia's downsampling from the control panel to work properly.
It should work fine. It'll be 4x your native resolution at most.
But I’ve never got DSR to work.
It's a bit of a headache for CRT monitors. If you want to downscale to 1200p, you have to delete all higher resolutions with CRU, like 2304x1440.
And even then, some games won't detect the DSR resolutions. Recently, I found Mega Man 11 doesn't detect the DSR resolutions for some reason.
On a GTX 980 Ti with the FW900, I have tested adding a DSR 4x resolution downsampling to 1920x1200, with a result of 4096x3072.
What I do is make Windows believe 1920x1200 is the monitor's native resolution (tested on Windows 10 Pro 64-bit 1709). For that, I create the resolution with CRU (tested with Custom Resolution Utility 1.4.1) under detailed resolutions, using the automatic CRT standard, and move it to the top with the up arrow; it seems the top resolution on that list is what Windows believes is the native one. Then restart the PC, check that Nvidia Control Panel lists 1920x1200 as native, create DSR 4x, and that's it.
Now if what you want is 4K, i.e. 3840x2160: I created 2560x1440 with the same procedure and set DSR to 2.25x. (The FW900 supports 1440p up to 2560x1440 75 Hz in 16:9, and 2560x1600 68 Hz in 16:10, without the need for DSR, at least on the GTX 980 Ti.)
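The DSR numbers are less mysterious once you remember the factor applies to the total pixel count, so each axis scales by its square root. A quick sketch of that arithmetic (nothing Nvidia-specific, just the math):

```python
import math

def dsr_resolution(width, height, factor):
    """DSR factor multiplies total pixels, so each axis grows by sqrt(factor)."""
    s = math.sqrt(factor)
    return round(width * s), round(height * s)

print(dsr_resolution(1920, 1200, 4))     # 4x: each axis doubled
print(dsr_resolution(2560, 1440, 2.25))  # 2.25x: 1440p rendered at 4K
```

That's why 2.25x on a 2560x1440 desktop renders at exactly 3840x2160 before scaling back down.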
How does it look?
Hello, I bought this cable to plug my FW900 into the GTX 650 Ti Boost in my PC. It doesn't seem to work.
Can the GPU handle the analog signal?
Or is it the cable?
Well, if your video card has a DVI-I output it should work. But it's hard to tell without knowing the exact model you have; there might be custom manufacturer cards that don't carry the analog signal but still use DVI-I keyed connectors instead of DVI-D ones.
Fool-proof question: did you think about selecting input 2 with the monitor's input switch when plugging in that cable?