24" Widescreen CRT (FW900) From Ebay arrived,Comments.

I don't know how Nvidia's user interface works, but I know on CRU it's easy to accidentally double the vertical resolution if you enter the number wrong. With interlaced checked, you have to make sure to enter half the resolution in the vertical column, and it will show the full resolution total on the right. So, for example, you will have 540 lines entered, and it will give a 1080-line total on the right.
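The halving rule can be sketched as a tiny helper (hypothetical, just mirroring how CRU's interlaced entry behaves, not part of any tool):

```python
# Hypothetical helper mirroring CRU's interlaced entry: you type the per-field
# line count into the vertical column, and the doubled total shown on the
# right is what the monitor actually displays.

def lines_to_enter(target_vertical: int) -> int:
    """Per-field line count to type into CRU for a target interlaced total."""
    if target_vertical % 2 != 0:
        raise ValueError("interlaced vertical totals must be even")
    return target_vertical // 2

# For a 1920x1080 interlaced mode, you enter 540; CRU reports the 1080 total.
print(lines_to_enter(1080))  # 540
```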

Yeah I saw that. Still doesn't work for me. Which graphics card do you have?
 
I've been playing Battlefield 1 at an interlaced resolution for a year and a half and have a 71% win percentage.

Besides, to watch movies in 4K on an FW900 you need interlaced modes: 96Hz to get perfect cadence on 24fps movies.
You can watch 4K movies at 1920x1200@96Hz just fine, or at some higher resolution like 2560x1600@72Hz. It is a small monitor with a blurry image... do the numbers mean that much to you, even if interlacing makes the image ugly?

In games it is impossible to have proper motion sharpness with interlacing because of interlace artifacts, and on top of that you lose half of the lines your GPU worked hard to draw (if you maintain a frame rate as fast as the refresh rate, that is)... and a lot of pixels horizontally, because the gun hits the shadow mask. So many pixels not displayed at all... it really makes these resolution numbers meaningless...

A lower resolution + SSAA (or DSR, and FXAA while at it!) makes much more sense on this monitor than a higher resolution. Imho.
 
Yeah I saw that. Still doesn't work for me. Which graphics card do you have?

Radeon 380x. Interlaced resolutions work up to a point for me on the regular VGA connection. They don't work at all on the Sunix, though I think that's on AMD's side. Aktan said they work fine in Ubuntu.

You can watch 4K movies at 1920x1200@96Hz just fine, or at some higher resolution like 2560x1600@72Hz. It is a small monitor with a blurry image... do the numbers mean that much to you, even if interlacing makes the image ugly?

Well in my experience, even if some of the pixels are being smashed together by the mask, you still have almost unlimited vertical resolution on a Trinitron. So I bet having the ability to display all 2160 lines would give a noticeable increase in detail. I haven't tried it myself though, I'd need to get a Blu-ray drive hooked up to my PC, or a high quality digital rip.

In games it is impossible to have proper motion sharpness with interlacing because of interlace artifacts, and on top of that you lose half of the lines your GPU worked hard to draw (if you maintain a frame rate as fast as the refresh rate, that is)... and a lot of pixels horizontally, because the gun hits the shadow mask. So many pixels not displayed at all... it really makes these resolution numbers meaningless...

A lower resolution + SSAA (or DSR, and FXAA while at it!) makes much more sense on this monitor than a higher resolution. Imho.

When you're playing a game where your frame rate matches your refresh rate, then yeah, I'm totally with you.

But I play BF1 at 45fps with max settings, which would be too low of a refresh rate in progressive, so I need some multiple of 45. And the higher the multiple, the better, since it will reduce stuttering when you drop below your target. So I'm running the game at 135Hz interlaced, and it looks fantastic.

Another example more recently is Final Fantasy XV, which I was playing at 30fps. Instead of playing it at 60Hz progressive, I played it at 150Hz interlaced. Really smooths out the picture in those hectic moments where it may drop to 27-25fps.
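The pick-a-multiple reasoning above can be sketched as a quick calculation (illustrative only; the 160Hz cap is just an assumed interlaced ceiling for the monitor, not a measured FW900 limit):

```python
def candidate_refreshes(target_fps: int, max_refresh: int) -> list:
    """Refresh rates that are integer multiples of the frame-rate cap, so each
    frame is shown a whole number of times and judder is avoided."""
    return [target_fps * n for n in range(1, max_refresh // target_fps + 1)]

# 45fps (BF1) -> highest multiple under the assumed cap is 135Hz
print(candidate_refreshes(45, 160))  # [45, 90, 135]
# 30fps (FFXV) -> 150Hz
print(candidate_refreshes(30, 160))  # [30, 60, 90, 120, 150]
```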
 
Ok I finally got my hands on the Sunix adapter. For the Delock one, they took almost 3 weeks to refund me after telling me my address was not valid. I'll order again next month, and provide a friend's address.

Anyway, I spent the past few hours testing the Sunix adapter on my GTX1080. So far this is what I can say.

Of course all the stock resolutions of the FW900 are working perfectly fine. I tried to push things further and I managed to get all the following resolutions to work fine.

- 2304x1440@80Hz (120.80kHz / 382.6944MHz)
- 2560x1600@72Hz (120.24kHz / 423.2448MHz)
- 3000x1920@61Hz (121.39kHz / 502.0691MHz)
- 3232x2020@58Hz (121.22kHz / 539.1866MHz)

This last one is touching the limits of horizontal scan and pixel clock of the monitor. I applied these resolutions using the Nvidia Control Panel, with CVT timings.
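Those kHz/MHz figures follow directly from each mode's total timings (active plus blanking). A rough sanity checker might look like this; the ceilings and the 3168x1510 total are inferred from the figures quoted above (382.6944 MHz / 120.80 kHz = 3168 total pixels, 120800 / 80 = 1510 total lines), not from Sony's official specs:

```python
# Assumed ceilings, inferred from the modes listed above, not official specs.
MAX_HSCAN_KHZ = 121.5
MAX_PIXCLK_MHZ = 540.0

def mode_fits(h_total: int, v_total: int, refresh_hz: float) -> bool:
    """Check a mode's TOTAL timings (active + blanking) against the ceilings."""
    hscan_khz = v_total * refresh_hz / 1e3
    pixclk_mhz = h_total * v_total * refresh_hz / 1e6
    return hscan_khz <= MAX_HSCAN_KHZ and pixclk_mhz <= MAX_PIXCLK_MHZ

# Totals derived from the 2304x1440@80Hz entry (120.80kHz / 382.6944MHz):
print(mode_fits(3168, 1510, 80))  # True
```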

I remember reading about someone who tried some odd resolutions which made the adapter crash to a black screen. Well, I don't get a black screen, but the monitor is just not stable at all with certain resolutions. I don't remember exactly what these were. Not much of a problem though, at least not for me so far. But that's still something to investigate further.

Now I was hoping to try some interlaced resolutions, including 4K (3840x2400) interlaced. But I spent the last few hours trying everything I could (except reinstalling Windows 10 or trying Linux), and I just can't apply a single interlaced resolution on this computer.
The Nvidia Control Panel just says my monitor won't support it, which is BS. With CRU, I was able to see the progressive resolutions I added when opening "List all modes" in the Windows Control Panel, but no interlaced resolutions ever showed up... And now CRU has stopped working entirely. It doesn't do a thing anymore. Any ideas?

I tried with the VGA cable, where the monitor's EDID is recognized, and with the 5-BNC cable through a VGA adapter with no EDID, where it's recognized as Synaptics VMM2300 DEMO, but the issue is the same.

I tried 3 different Nvidia drivers, the stock one I had on the computer 391.01, then I upgraded to 391.35, and then downgraded to 388.13. Nothing helped.

I should mention sometimes the adapter starts behaving weirdly. I don't know if that's due to a bad contact with the DisplayPort connector or the actual chip itself, but sometimes the image gets - how can I describe this - cut in 3 parts, and reassembled out of order. The right side of the screen is in the middle, etc... But everything is still perfectly sharp and stable. I have to move the cables a bit, and it gets back to normal.

Nice! What about image quality, is it good like the 62967?
That thing with the image out of order is indeed very strange. Does it also happen at lower clocks, or only at super high resolutions?
 
Thanks again all.

In the last week, I got super-busy with work. I am pulling 60+ hours/week hard labor in service of my employer. They pay a fair wage, so I have no complaints.

In the mail I have gotten an Optix color unit that I have not tested, as well as a TTL cable. Is there a picture or something showing where/how to hook up the TTL to the monitor? I am confused as to how to hook it up. I have never done anything like this, and would like to read about it before I attempt it.

I have not gotten my polarizer film yet, so am not ready yet to do the work. I look forward to updates with pictures.

My ONLY change I have had time for was creating a NAS server out of an old HP DC5800 which I am now using as a monitor riser. The extra few inches of lift is great for viewing angle....
 
Thanks again all.

In the last week, I got super-busy with work. I am pulling 60+ hours/week hard labor in service of my employer. They pay a fair wage, so I have no complaints.

In the mail I have gotten an Optix color unit that I have not tested, as well as a TTL cable. Is there a picture or something showing where/how to hook up the TTL to the monitor? I am confused as to how to hook it up. I have never done anything like this, and would like to read about it before I attempt it.

I have not gotten my polarizer film yet, so am not ready yet to do the work. I look forward to updates with pictures.

My ONLY change I have had time for was creating a NAS server out of an old HP DC5800 which I am now using as a monitor riser. The extra few inches of lift is great for viewing angle....
[attached image: pinout diagram for the ECS port]

That's the pinout for the ECS port accessible at the rear-left of the monitor. Check to make sure your TTL matches that. I don't think you need to have USB power hooked up as the monitor itself powers the 5V from what I remember. Search for the WinDAS WPB guide written by spacediver for details about using WinDAS.

Edit: It's rear-right if you're looking at the back of the monitor.
 
Radeon 380x. Interlaced resolutions work up to a point for me on the regular VGA connection. They don't work at all on the Sunix, though I think that's on AMD's side. Aktan said they work fine in Ubuntu
These devices are dumb DACs, and that is the only reason they work. It is the last GPU with analog output, and not that powerful: rather dog slow by today's standards.

Well in my experience, even if some of the pixels are being smashed together by the mask, you still have almost unlimited vertical resolution on a Trinitron. So I bet having the ability to display all 2160 lines would give a noticeable increase in detail. I haven't tried it myself though, I'd need to get a Blu-ray drive hooked up to my PC, or a high quality digital rip.
Actually you lose chroma resolution. Luma too, but that is mostly resolved by something which looks like subpixel rendering, though it is not quite that either.
There is a great explanation here. Great channel BTW, covering a lot of older technologies like CRT tech in early television, Trinitron, etc.

Many 4K UHD releases of movies have a detail level that is exactly the same as the 1080p release.
For excellent scaling I recommend madVR and NGU Anti-Alias upscaler. It can do magic, especially for SD movies. Check it out https://imgur.com/a/7hUVzam

When you're playing a game where your frame rate matches your refresh rate, then yeah, I'm totally with you.
With a good enough GPU it is possible to drive even high progressive resolutions at a frame rate equal to the refresh rate.

But I play BF1 at 45fps with max settings, which would be too low of a refresh rate in progressive, so I need some multiple of 45. And the higher the multiple, the better, since it will reduce stuttering when you drop below your target. So I'm running the game at 135Hz interlaced, and it looks fantastic.

Another example more recently is Final Fantasy XV, which I was playing at 30fps. Instead of playing it at 60Hz progressive, I played it at 150Hz interlaced. Really smooths out the picture in those hectic moments where it may drop to 27-25fps.
I would rather use a lower progressive resolution with AA, e.g. MLAA or in-game FXAA, and I actually did when using it with an HD7950, going as low as 1280x800 (or 720p on PS3), and guess what... it looked excellent :cigar:

But of course, you do you, and if you like it that interlacey way then good for you... or not, since you won't be able to upgrade your GPU... ever :dead:
 
Actually you lose chroma resolution.

The trinitron mask doesn't affect vertical resolution (neither chroma nor luma).

The vertical resolution of a trinitron is limited by the size of the phosphor grain, scattering of light, scattering of electrons, and the electron optics of the tube (how tightly the beam can be focused).
 
Actually you lose chroma resolution. Luma too, but that is mostly resolved by something which looks like subpixel rendering, though it is not quite that either.

Glad you mentioned subpixel rendering. I think that's basically the reason that games will continue to look better as you increase the resolution, even beyond the line count of the aperture grille. You get diminishing returns of course, but the image becomes less and less aliased and gives the impression of increasing detail.

But maybe you're right about 4K video. Maybe it's better to let a computer algorithm downscale intelligently instead of having the aperture grille and phosphors do it. I may bust out my LaCie sometime to do some 4K comparisons between the two methods. Though the FW900 would be a better candidate since it's wider.

These devices are dumb DACs, and that is the only reason they work. It is the last GPU with analog output, and not that powerful: rather dog slow by today's standards.

I actually have two 380x's. For games that support both I still get a high-end experience. For those that don't (a majority sadly) I still get a pretty good experience.

But yeah, in regards to the DAC, I'm hoping we can figure out how to get interlaced resolutions with this Sunix adapter on at least one of the two GPU brands. Not a super common use case, but I don't like it missing features that the built-in DACs have had for years.

I would rather use a lower progressive resolution with AA, e.g. MLAA or in-game FXAA, and I actually did when using it with an HD7950, going as low as 1280x800 (or 720p on PS3), and guess what... it looked excellent :cigar:

Some games like FF15 are designed to be played at 30fps. So dropping resolution significantly to get 60fps would be kind of a bad trade on a slower paced game with very detailed models and terrain. So that's one of the few times I'd recommend switching to interlaced to get the refresh rate boost at higher resolutions.
 
https://www.3dlens.com/shop/circular-polarizer-500x1000mm.php
Left handed is probably better. It seemed that way, not sure why.

Absolutely does not work. There is no way that this product fits on my screen (or anyone's GDM-FW900). PERIOD. There is a "retarder" seam, and at no point in the entire sheet is there enough material to cover the glass. There will always be lines where no film is.

That was a total waste of $100 + shipping. I guess I should have researched more before trusting some random person on the internet with my hard-earned money. It won't be the last time, I am sure, but it will be the last time I read anything you type, XoR_. Technically the cost of this film was about 2 hours of work given to my employer, but that doesn't include the time I wasted getting the monitor taken apart only to find out that there is absolutely no way to make it fit.

This is a caveat to anyone who read the post that specifically stated the exact product to use for a polarizing film: you are wasting your money on this specific product, and if you find another that does actually work, feel free to let me know. My part in this is that I cut a piece to fit my monitor, not knowing that it was impossible to do... not that it mattered. It was a total waste from the start. Thanks.

As for my TTL

Check to make sure your TTL matches that.

I opened my TTL cable today, and I am a bit confused, but I think this will work. There are no markings on the 4x plugs whatsoever. The wires are colored: red, black, green, white. While I was wasting time taking the shell off to apply the incompatible polarizer film, I found the plug(s) for them. I am not plugging anything in at this point, until I can figure out for certain how to do it perfectly. Any suggestions on what color goes where?

*EDIT*

I looked at where I bought it from, and they provided the list of what each cable does.
Black cable-----GND
Green cable-----TXD
White cable-----RXD
Red cable -------VCC
I am putting this here because I am too hot to even bother with this right now. Perhaps this weekend.


Without film, the white point balance procedure should work well.

So the TTL WPB adjustments would work without a window film?

Can I use the Optix to calibrate without a film? I'd like to know before I waste any more time on this pipe dream.

Friends: If you can't tell, I am outright fuming. If anyone wants to buy a sheet of film 26"x20" that won't fit your GDM monitor, feel free to get in touch with me. I would gladly send you a piece for a fair price.
 
You don't need anti-glare film to do a WPB. You can do a perfectly good WPB calibration with or without the anti-glare.
 
Ok, I plugged my USB TTL cable into the monitor.

Windows knows it is a USB-to-serial adapter; however, it won't install drivers. I am also reading about RXD and TXD cables being swapped? Does TXD go to RXD and vice versa?
 
I have a dilemma and I'm hoping someone could help me out. I recently RMA'd my 980 Ti to Nvidia. I just received the replacement, but it's not a 980 Ti.

They sent me a 1070 Ti instead. This is a significantly faster card, but it doesn't have analog video output. I guess they didn't have any 980 Ti cards for replacing anymore.

I guess I could either try and sell this card, and buy a used 980 Ti from Ebay, or I could go ahead and try out the Sunix adapter.

What would you do in my situation? I don't need to use a bunch of odd resolutions, and I certainly don't care about interlaced resolutions.

If I did keep the 1070 Ti, is the Sunix adapter the one to get? Is it the very best outboard DAC you can buy?

Is the image in any way inferior to the 980 Ti?

Thanks.
 
Ok, I plugged my USB TTL cable into the monitor.

Windows knows it is a USB-to-serial adapter; however, it won't install drivers. I am also reading about RXD and TXD cables being swapped? Does TXD go to RXD and vice versa?
I’m not sure if they’re swapped since it’s been a while. As for the drivers, check the website for the particular chipset your adapter is using. I believe there are some counterfeit adapters out there as well.

Edit: At this point I’d only do a quick G2 adjustment once you get WinDAS working as the full WPB will take a while and involves geometry adjustments as well. Better to save yourself the trouble for now as getting things wrong here can make your monitor worse off.
 
I have a dilemma and I'm hoping someone could help me out. I recently RMA'd my 980 Ti to Nvidia. I just received the replacement, but it's not a 980 Ti.

They sent me a 1070 Ti instead. This is a significantly faster card, but it doesn't have analog video output. I guess they didn't have any 980 Ti cards for replacing anymore.

I guess I could either try and sell this card, and buy a used 980 Ti from Ebay, or I could go ahead and try out the Sunix adapter.

What would you do in my situation? I don't need to use a bunch of odd resolutions, and I certainly don't care about interlaced resolutions.

If I did keep the 1070 Ti, is the Sunix adapter the one to get? Is it the very best outboard DAC you can buy?

Is the image in any way inferior to the 980 Ti?

Thanks.
Sunix is fine and any difference in quality is marginal. Stick with the new card.
 
For the cable, the order is black, red, green, white (top to bottom).

Thanks for the outlook. I certainly need to swap the RXD & TXD cables. This explains why my PC doesn't seem to recognize it. This will be a task for another day. Thanks again!

What would you do in my situation?
I have RMA'd products before: once RAM, and once a mobo. I got the same thing back that I mailed in, and would have been hot otherwise.

I have a 9 series nvidia gpu SPECIFICALLY for the analog output. IF you purchased your card for that reason, I would contact them and advise them that the replacement does not meet your standards. Most people would be happy with this sort of upgrade, but we are not most people. I would find a receipt for the card you bought and ask for your money back if they cannot uphold their end of the purchase agreement. This is me however. I ALWAYS get what I want from customer service.
 
It's unclear whether the Sunix adaptor supports 10-bit control over your LUT, whereas the 9-series GPU does. If adjusting gamma without causing any potential banding artifacts is important to you, then there is value in 10-bit LUT control. You can, for example, calibrate your tube to produce inky blacks (which has a side effect of raising the gamma dramatically, crushing the blacks), and then use a LUT to bring the gamma to 2.2 or 2.4, without any banding.
 
It's unclear whether the Sunix adaptor supports 10-bit control over your LUT, whereas the 9-series GPU does. If adjusting gamma without causing any potential banding artifacts is important to you, then there is value in 10-bit LUT control. You can, for example, calibrate your tube to produce inky blacks (which has a side effect of raising the gamma dramatically, crushing the blacks), and then use a LUT to bring the gamma to 2.2 or 2.4, without any banding.

I'm not sure I follow this. Why would the Sunix adapter have this limitation?

I've already ordered the Sunix adapter to test it out. How would I tell if this limitation exists, or would matter to me?

I'm really interested in knowing what limitations or advantages the Sunix adapter has compared to a 980 so if anyone else has anything else to contribute I'd appreciate it.

Someone a while back suggested that the Sunix adapter might have superior image quality in some respects (less ghosting, sharper text, etc.). I just don't know. On the other hand, the Sunix adapter can be pushed higher than 400MHz, so I could presumably run resolutions and refresh rates higher than a 980.

I'm just not sure. I'd appreciate the increased power of the 1070 Ti, I just don't know if the trade-offs are worth it.
 
I'd be surprised (but extremely pleasantly surprised) if the chipset in the Sunix adaptor was 10-bit. You can test it using ArgyllCMS with the command "dispcal -R": this will adjust your LUT in various increments and monitor the output of your colorimeter to see whether those changes result in a change in luminance. If there is a change in luminance with an increment that corresponds to a precision of 10 bits, it means you have 10-bit control over the LUT.

I did essentially the same experiment manually using Matlab and psychtoolbox, as described here
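As a toy illustration of why the LUT bit depth matters (this is just the underlying arithmetic, not the argyll or Matlab procedure): quantizing a gamma-correction ramp to 8 bits merges adjacent levels, which is exactly what shows up as banding, while 10 bits keeps every input step distinct.

```python
def distinct_output_levels(lut_bits: int, gamma: float = 2.2,
                           inputs: int = 256) -> int:
    """Count distinct levels left after pushing a 1/gamma correction ramp
    through a LUT quantized to lut_bits of precision."""
    steps = (1 << lut_bits) - 1
    outputs = {round(((i / (inputs - 1)) ** (1.0 / gamma)) * steps)
               for i in range(inputs)}
    return len(outputs)

print(distinct_output_levels(8))   # fewer than 256: merged levels = banding
print(distinct_output_levels(10))  # 256: every input level survives
```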
 
I have a 9 series nvidia gpu SPECIFICALLY for the analog output. IF you purchased your card for that reason, I would contact them and advise them that the replacement does not meet your standards. Most people would be happy with this sort of upgrade, but we are not most people. I would find a receipt for the card you bought and ask for your money back if they cannot uphold their end of the purchase agreement. This is me however. I ALWAYS get what I want from customer service.
That might not be a good idea in the present situation. Given the crazy price increases of GPUs during the past year, they might well refund much less money than the card is worth today. ;)


Oh, BTW, regarding the quest for an AR film: one more company claiming loudly that they can manufacture anything to your specifications, but that is actually only interested in selling you standard shit that doesn't fit your needs. Trying the next one.

I might succeed in a few years ... :LOL:
 
I have a dilemma and I'm hoping someone could help me out. I recently RMA'd my 980 Ti to Nvidia. I just received the replacement, but it's not a 980 Ti.

They sent me a 1070 Ti instead. This is a significantly faster card, but it doesn't have analog video output. I guess they didn't have any 980 Ti cards for replacing anymore.

I guess I could either try and sell this card, and buy a used 980 Ti from Ebay, or I could go ahead and try out the Sunix adapter.

What would you do in my situation? I don't need to use a bunch of odd resolutions, and I certainly don't care about interlaced resolutions.

If I did keep the 1070 Ti, is the Sunix adapter the one to get? Is it the very best outboard DAC you can buy?

Is the image in any way inferior to the 980 Ti?

Thanks.
It depends on what resolutions you use. My Sunix can't properly handle anything above 1920x1440, and I get occasional image shifting (image cut in 3 parts) and waviness at all resolutions. See my posts. I have a GTX 980 and I gave up on the Sunix. There is also the Delock 87685. I haven't tried it yet, but etienne51 said he would.
 
It depends on what resolutions you use. My Sunix can't properly handle anything above 1920x1440, and I get occasional image shifting (image cut in 3 parts) and waviness at all resolutions. See my posts. I have a GTX 980 and I gave up on the Sunix. There is also the Delock 87685. I haven't tried it yet, but etienne51 said he would.

I've got the Sunix adapter coming tomorrow, so I'll try it out and see what my success rate is. 95% of the time I'll just be using 1920x1200 at 85hz. But I'd like as much flexibility as possible. It seems like others have had more success running higher resolutions so maybe I'll luck out.

Just checking Ebay prices, 1070 Ti cards are selling for over $500, while I've seen 980 Ti cards going for under $400. Worst case scenario is that I'll sell this card and grab a 980 Ti from Ebay, and pocket the difference.

I'm sure I won't be able to get a 980 Ti from EVGA, since they never would have sent me a 1070 Ti if they had any in stock.

I'll do some experiments and see how it goes.
 
For the cable, the order is black, red, green, white (top to bottom).

I managed to get the plugs fixed up, but lost the black plastic shield on the green cable. It was easier to plug in without the plastic shields, and only that one lost its shield. Windows now recognizes the TTL cable as a USB-to-serial COM adapter.

I got the drivers you recommended installed, and am running a version of WinDAS I found earlier in this thread that has all the necessary DLL files.


However, I am getting an error on starting WinDAS: "No MDL file."
 
try getting a copy from the link in this video. If that doesn't work, try the link in this video (also see instructions in that second link's description). I've only tested winDAS on windows xp, but I know others have gotten it to work on later version of windows.
 
try getting a copy from the link in this video. If that doesn't work, try the link in this video (also see instructions in that second link's description). I've only tested winDAS on windows xp, but I know others have gotten it to work on later version of windows.

This was great, and led me to this video which shows how to register the .ocx file in windows 7 x64.



I got WinDAS to open up, and set the model correctly. Is there a guide I can use from here that gives me some info on backing up the EEPROM and fixing issues from there?
 
I wouldn't worry too much about backing up the EEPROM. Not sure what you mean by fixing issues, but you'll want to follow my guide.
 
Backing up the current settings is still a necessary fool-proof operation to perform before messing around, IMO. I do it every time, even if I know the software well enough not to risk making a mistake.

It's easy to do:
File -> Save data to file -> set a file name and go

To restore a previously saved file:
File -> Load data to set -> select the file to load and go
 
Absolutely does not work. There is no way that this product fits on my screen (Or anyone with a GDM-FW900). PERIOD. There is a "retarder" seam, and at no point in the entire sheet is there enough material to cover the glass with the material provided. There will always be lines where no film is. That was a total waste of $100 + shipping. I guess I should have researched more before trusting some random person on the internet to spend my hard earned money. It won't be the last time I am sure, but it will be the last time I read anything you type XoR_.
I never said I used this exact product. It was an example link, and I was not aware of the seam issue.
Your reaction is completely out of proportion.

BTW, thanks for the heads up. I will buy a linear polarizer then, as it should be fine and enough for 3 FW900 mods.

And I guess you didn't even try the polarizer anyway, to see how it looks and whether you like the effect?
 
I'd be surprised (but extremely pleasantly surprised) if the chipset in the Sunix adaptor was 10-bit. You can test it using ArgyllCMS with the command "dispcal -R": this will adjust your LUT in various increments and monitor the output of your colorimeter to see whether those changes result in a change in luminance. If there is a change in luminance with an increment that corresponds to a precision of 10 bits, it means you have 10-bit control over the LUT.

I did essentially the same experiment manually using Matlab and psychtoolbox, as described here

So, I got the Sunix adapter today. So far it seems to be working great. I haven't had any trouble with any resolutions I've tried so far. I have had occasional waviness that I hope I can eliminate somehow. Otherwise the image quality looks about equal to the 980 Ti.

I don't know how to do the testing you mentioned to see if the Sunix adapter is 10 bit.

Is there an image or test pattern that I could use to see if there are any banding artifacts?
 
So, I got the Sunix adapter today. So far it seems to be working great. I haven't had any trouble with any resolutions I've tried so far. I have had occasional waviness that I hope I can eliminate somehow. Otherwise the image quality looks about equal to the 980 Ti.

I don't know how to do the testing you mentioned to see if the Sunix adapter is 10 bit.

Is there an image or test pattern that I could use to see if there are any banding artifacts?

Do you have access to a colorimeter? If not, it'll be tricky, but not impossible in principle.
 
Do you have access to a colorimeter? If not, it'll be tricky, but not impossible in principle.

Yes I do. It's an i1 Pro. I've done a full white balance calibration already.

I'm just not sure I completely follow your instructions regarding determining whether the adapter is 10 bit.

And if the adapter is NOT 10 bits what does this mean practically? Does it only matter if I'm adjusting the gamma with dispcal? Or would I see banding artifacts anyway?

I'm happy to run the tests and let everyone know if you could provide some easy to follow instructions.
 
jrodefeld, many thanks for your valuable testing! ;) Can you please test 1920x1200@90Hz on your 1070 Ti, FW900, and the Sunix? Also, can you please share a link to the exact Sunix adapter version you bought?

Also, if you can, can you test the max refresh rate you can push, stable and artifact-free, at 1920x1200?
 
I'm trying to convince Delock to do the 62967 without the cable, like this:
http://www.delock.com/produkte/1023_Displayport-male---VGA-female/65653/merkmale.html
The 62967 is amazing: it costs nothing and it is stable as a rock. I have a sample that can do 355 MHz, the most unlucky sample I have can do 340 MHz, and if I remember correctly it can do interlaced resolutions; any resolution within that pixel clock is accepted without problems.
But they used a shit cable; it doesn't work well with most video cards, and I want to solve this problem.
For whoever does not care about a higher pixel clock, it is the best solution.
About the Sunix adapter: inside the Synaptics VMM2322 chipset there is a triple 8-bit DAC.
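For a rough sense of what a 355 MHz ceiling allows, here is a back-of-the-envelope estimate; the 30% blanking overhead is an assumed GTF-like figure, not a measured one, and the 355 MHz clock is the best sample mentioned above:

```python
# Assumed total/active pixel ratio, typical for CRT (GTF-like) timings.
ASSUMED_BLANKING = 1.30

def approx_max_refresh(width: int, height: int, pixclk_mhz: float) -> float:
    """Approximate the highest refresh rate a pixel-clock ceiling permits."""
    return pixclk_mhz * 1e6 / (width * height * ASSUMED_BLANKING)

# Best 62967 sample (355 MHz) at 1920x1200:
print(round(approx_max_refresh(1920, 1200, 355)))  # 119
```

So even the cheap adapter comfortably clears the 1920x1200@96Hz target under these assumptions.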
 
Hey, I have a Sony GDM-FW900 and it won't let me choose anything above 85Hz for any resolution. I have a good graphics card; what could be the issue?
 
Hey, I have a Sony GDM-FW900 and it won't let me choose anything above 85Hz for any resolution. I have a good graphics card; what could be the issue?

Maybe GPU scaling is on?

Might want to download Custom Resolution Utility as well to take a look at your EDID.
 
Nah, GPU scaling is off.

I downloaded CRU and clicked to make a new profile, but now it asks me to input values for the front+back porch and sync width, and I don't know what those are. I also have no idea what my monitor can do at a certain resolution, so how do I pick a Hz that's safe?

Also, should I type anything for pixel clock?
 
I was just telling you to download it so you can take a look at your EDID. If you don't see anything in there over 85Hz, then you know your issue. Also, if you're hooked up with BNC connectors, your FW900 EDID won't show up. Which is fine, you can still add whatever you want with CRU.

If you ever decide to add resolutions just pick "standard CRT timings" in the drop down box instead of adjusting all the porch values.

And your monitor won't display anything it can't handle. If you try to do something crazy like displaying 4K at 60Hz progressive, your monitor will just say "out of range" and ignore it until you go back to a resolution it can handle. This is why Windows always has that "would you like to keep this resolution" prompt.
 
I'm trying to convince Delock to do the 62967 without the cable, like this:
http://www.delock.com/produkte/1023_Displayport-male---VGA-female/65653/merkmale.html
The 62967 is amazing: it costs nothing and it is stable as a rock. I have a sample that can do 355 MHz, the most unlucky sample I have can do 340 MHz, and if I remember correctly it can do interlaced resolutions; any resolution within that pixel clock is accepted without problems.
But they used a shit cable; it doesn't work well with most video cards, and I want to solve this problem.
For whoever does not care about a higher pixel clock, it is the best solution.
About the Sunix adapter: inside the Synaptics VMM2322 chipset there is a triple 8-bit DAC.
After reading this post I bought the Delock 62967. It is quite cheap, and for future compatibility it will suffice, as I do not really care for anything higher than 1920x1200@96Hz.

Should I expect any compatibility issues due to the default cable being bad quality?
 