24" Widescreen CRT (FW900) From Ebay arrived,Comments.

AMAZING!

It blows the Nvidia DAC in my GTX 980 away and is even better than AMD's DACs! The cheap Amazon DAC I got has LCD-level clarity. I thought the blur of my monitors was just an inherent CRT property, but it was really a case of send shit in, get shit out.

I'm skeptical of this claim, but I remain open minded.
 
No. The refresh rate has no effect on CRT clarity. You are probably using a crappy DAC that can't handle the bandwidth. I will send you a link to the one I bought on Amazon when I check later. I thought the same thing you do, but once I used an external DAC, I realized the truth. 1600x1200 @ 85 looks as good on my eDAC as 1600x1200 @ 50 on the on-GPU one. If you interlace, 180i on the eDAC is as clear as 100i on the GPU. Too bad it only does 225MHz, but 16x12 @ 85 (or 180i) is OK for me.

Still looking for that link...
 
So can someone share this mystical DAC and a link? I had one that could do 1600x1200@75Hz out of the box, but I broke it :) Poor material quality and the weight of a thick VGA cable just pulled the VGA part off and left the rest inside the HDMI port on the GPU xD. It could even do 94Hz with some resolution "tweaking", so with a DAC that could do 225MHz I think I could hit a bit more.
 
Would a 225MHz DAC do 1920x1440 @ 75Hz or 72Hz or 70Hz?

Looks like that would require a pixel clock of around 300MHz, so no.
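For a rough sense of that arithmetic, here's a sketch (my own helper, not from this thread; the ~35% horizontal and ~4% vertical blanking factors approximate VESA GTF timings):

```python
# Rough pixel-clock estimate for a CRT mode. CRTs need blanking intervals,
# so the total pixels scanned per frame exceed the visible ones.

def pixel_clock_mhz(width, height, refresh_hz,
                    h_blank=1.35, v_blank=1.04):
    h_total = width * h_blank    # visible + horizontal blanking
    v_total = height * v_blank   # visible + vertical blanking
    return h_total * v_total * refresh_hz / 1e6

print(pixel_clock_mhz(1920, 1440, 75))  # ~291 (VESA DMT spec: 297.0 MHz)
print(pixel_clock_mhz(1600, 1200, 85))  # ~229 - right at a 225MHz DAC's limit
```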


For this to be achievable I need a DAC, correct? Can you recommend one?

You only need an external DAC if your graphics card doesn't have one built in - that is, if it has no DVI-I port. All Nvidia Maxwell cards have DVI-I, just not the new 10xx series. AMD discontinued it on their high-end cards starting with the 290 series.
If you don't have DVI-I you can buy, for example, the $10 DAC rabidz7 suggests.
With only 225MHz of bandwidth it wouldn't be able to drive the high-end resolutions/refresh rates high-end CRTs are capable of, though - the ones the 400MHz pixel clock of a graphics card's own DAC can handle.

I just went over your other posts. Are only the W900 and F520 available at this point?
The thing is, you are limited by the horizontal scan rate of the CRT monitor. That means the W900, with its lower 96kHz horizontal scan, would only be able to do ~100Hz at 1440x900. The FW900 goes higher at 121kHz, but the F520 has a whopping 137kHz scan rate. Unless you really need to play in wide format or the F520 is prohibitively expensive, grab that CRT and never let it go.
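To put numbers on that limit (a quick sketch of mine; the ~4% vertical blanking factor is a typical figure, not a measured one):

```python
# Max refresh at a given vertical resolution: the tube can only draw so many
# scanlines per second, so refresh ~= horizontal scan rate / total lines.

def max_refresh_hz(hscan_khz, v_visible, v_blank=1.04):
    v_total = v_visible * v_blank  # visible lines + vertical blanking
    return hscan_khz * 1000 / v_total

print(max_refresh_hz(96, 900))    # W900  at 1440x900:  ~103 Hz
print(max_refresh_hz(121, 1200))  # FW900 at 1920x1200: ~97 Hz
print(max_refresh_hz(137, 1536))  # F520  at 2048x1536: ~86 Hz
```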
 
I'd buy both, JayFkayy, not sure what the hold-up is. Are they both really expensive or something? The most I've paid for a CRT in the past 5 years is $50 (21" LaCie), but most of them were either <$20 or free.

If you get both, you can use the W900 when you're playing widescreen-only, low refresh rate games (like Street Fighter 5, which is 16:9, 60fps only), and you can whip out the F520 when you're playing a 4:3 capable game that can play at higher refresh rates (like the upcoming Battlefield 1, or any other popular shooter)
 
I'm skeptical of this claim, but I remain open minded.
This is rabidz we're talking about - he said he'd delid a 3960X and posted a Core 2 Extreme quad. I would take anything he says with a grain of salt.

Also emailed moome. Man, I do wish his English was better; it makes it so hard to email him.
 
I'd buy both, JayFkayy, not sure what the hold-up is. Are they both really expensive or something? The most I've paid for a CRT in the past 5 years is $50 (21" LaCie), but most of them were either <$20 or free.

If you get both, you can use the W900 when you're playing widescreen-only, low refresh rate games (like Street Fighter 5, which is 16:9, 60fps only), and you can whip out the F520 when you're playing a 4:3 capable game that can play at higher refresh rates (like the upcoming Battlefield 1, or any other popular shooter)

I agree with this. If you can, get both. Sorry Jay - I was distracted by the 225 MHz RAMDAC that has yet to make an appearance (seriously rabidz - where's that fucking link?).
 
Personally I'm more interested in the possibility of very common budget 225MHz adapters delivering a crisper image than my Maxwell card. I doubt there's a single one that is vastly superior to all the other $10 choices.


---------------------


PSA: Nvidia Fast Sync is live! Well, sort of...
Fast Sync is available right now • /r/nvidia

The first part of Nvidia's efforts to make V-Sync obsolete was G-Sync, the tech that eliminates tearing at framerates below the refresh rate without high input lag. People were not satisfied at initially being locked into V-Sync at higher framerates, though, and enabling the option to run with no V-Sync didn't address the tearing.
So now there is "Fast Sync", the second piece of the puzzle: basically a triple-buffer solution (but not regular triple buffering) that only draws the last fully rendered frame onto the screen, eliminating tearing at virtually no input lag at above-refresh framerates.
CRTs don't get G-Sync, obviously, but CRTs plus input-lag- and tearing-free gaming without the hiccups of V-Sync below refresh is a match made in heaven.

Explanation / comparison to old-school triple buffering:
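In code terms, the gist is roughly this (a toy sketch of mine, not Nvidia's actual implementation):

```python
# Toy model of Fast Sync: the game renders unthrottled into spare back
# buffers, and at every vblank the display flips to whichever frame
# finished most recently. Flips only happen at vblank, so no tearing;
# the renderer is never blocked waiting on the display, so no V-Sync lag.

class FastSyncChain:
    def __init__(self):
        self.front = None          # frame currently being scanned out
        self.last_complete = None  # newest fully rendered frame

    def render_complete(self, frame):
        # Renderer runs at its own pace; older undisplayed frames are
        # dropped (classic triple buffering would queue them, adding lag).
        self.last_complete = frame

    def vblank(self):
        # Once per refresh: present the newest complete frame.
        if self.last_complete is not None:
            self.front = self.last_complete
        return self.front
```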



In practice I find it very enjoyable. Path of Exile, for example, is eerily smooth and responsive at 2048x1280@90Hz using Fast Sync.
Using that refresh rate as context: in some more demanding games it seems to consistently cap at 90fps and stay there, while in an easy-to-run game like PoE it sometimes sits at 180 or even 270fps. This behaviour seems to address the inherent loss of smoothness you get when you just draw whatever frame was rendered last at an arbitrary framerate - which is incidentally what happens when I try it in Source games: it aims for such a multiple but ends up all over the place most of the time. My fix so far has been to cap at 91fps, which seems to work reasonably well.
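A quick simulation shows why exact multiples of the refresh rate pace so evenly under a draw-the-newest-frame scheme (my own toy numbers, not measurements):

```python
# Time deltas between the frames actually displayed on a 90Hz screen when
# each vblank shows the newest completed frame.

def displayed_deltas(fps, hz=90, duration_s=0.1):
    frames = [i / fps for i in range(int(fps * duration_s))]  # render times
    shown = []
    for v in range(int(hz * duration_s)):
        t = v / hz  # vblank time
        done = [f for f in frames if f <= t + 1e-9]
        if done:
            shown.append(done[-1])  # newest frame finished before this vblank
    return [round(b - a, 4) for a, b in zip(shown, shown[1:])]

print(displayed_deltas(180))  # constant ~0.0111 s gaps: smooth
print(displayed_deltas(130))  # gaps alternate 0.0077 / 0.0154 s: judder
```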

Keep in mind that it's not officially in the drivers yet, probably still in beta.
Some games don't seem to run with it enabled at all, there are some hiccups to be addressed, some people with pre-Pascal/Maxwell cards are reporting BSODs and crashes, etc.
Fun to play around with, though.
 
Just out of curiosity: looking at the EVGA webpage, searching the products for the GTX 1080 and 1070 and checking the specs and details for those cards, the 1070 is listed as analog VGA compatible - in both specs and details it lists "Max Analog: 2048x1536". I checked the 1080 in case all cards were listed the same by mistake, but for the 1080s only digital resolutions are described. Interesting...
 
Just out of curiosity: looking at the EVGA webpage, searching the products for the GTX 1080 and 1070 and checking the specs and details for those cards, the 1070 is listed as analog VGA compatible - in both specs and details it lists "Max Analog: 2048x1536". I checked the 1080 in case all cards were listed the same by mistake, but for the 1080s only digital resolutions are described. Interesting...
Could you paste the link for those specs?

---------Edit
Never mind, it also states DVI-D, so no love for the analog output.
 
Yes, in the picture there is also only DVI-D. I believe EVGA made a mistake on the 1070 specs and details - though I'll keep my fingers crossed hoping it's not a mistake.
 
Well, the StarTech has arrived. No luck - same capabilities as the cheap converter. It tops out at 1080p@60Hz (unusable).
So it's the 980 Ti until HDFury releases a 400MHz DAC, or new cards at 1440x1900@96Hz (the best I've been able to get with the adapter). First option, I think.
 
Have had my eye on this for a while as an alternative to the current HDFury offerings:

Extron Digital to Analog Converters - DVI-RGB 200

It does 1920x1200@60Hz max, if I read the user guide properly. Not exactly amazing, but it would work very well for consoles via an HDMI adapter. Extron build quality is also very good in my experience.

Edit: Went ahead and ordered the older 150 variant, which is limited to 1080p@60Hz, for $40 shipped. Will report back after I get it.
 
Have had my eye on this for a while as an alternative to the current HDFury offerings:

Extron Digital to Analog Converters - DVI-RGB 200

It does 1920x1200@60Hz max, if I read the user guide properly. Not exactly amazing, but it would work very well for consoles via an HDMI adapter. Extron build quality is also very good in my experience.

Edit: Went ahead and ordered the older 150 variant, which is limited to 1080p@60Hz, for $40 shipped. Will report back after I get it.

The HD Fury 3 would probably be a better option; it can do 1080p at 72Hz. Probably a similar pixel clock to the Extron, but the HD Fury guys had a big focus on eliminating input lag. You may be rolling the dice with the Extron.

Do let us know how it works out, though.
 
The HD Fury 3 would probably be a better option; it can do 1080p at 72Hz. Probably a similar pixel clock to the Extron, but the HD Fury guys had a big focus on eliminating input lag. You may be rolling the dice with the Extron.

Do let us know how it works out, though.


I'm going to use this primarily for HDMI consoles on the FW900, so no need for more than 1080p/60, but lag may be an issue (though I doubt it would be more lag than a console straight into an average LCD HDTV).

What's the best program/website to check lag with?

I have an older PC with VGA that I can use as a baseline, then the Extron to compare.
 
The article talks about a reference PCB for that 1080, but it isn't one.
Look at this pic - this is the reference PCB:
http://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_1080/images/front.jpg

It's different from the second pic of that article.

I hope they made a new PCB with DVI-I, and that it's not an older card used as a sample to show off the cooler.

By the way, on the Inno3D website the custom PCBs only have DVI-D. But that pic isn't a Maxwell PCB either.
 
Has anyone had luck with the HDFury Nano GX and the latest PS4 firmware update (3.50)? I can't get HDCP working with it.
 
Has anyone had luck with the HDFury Nano GX and the latest PS4 firmware update (3.50)? I can't get HDCP working with it.

Oh crap, I don't have a PS4, but I have the Nano GX. Is it just movies that don't work, or are you not getting a picture, period?

If it's no picture, period, then you can try hooking up the PS4 to something else and resetting the video mode to 720p or something.

Either way, the HD Fury website has a forum, and the developers may be able to help you out (or they'll just tell you to buy Dr. HDMI).
 
Oh crap, I don't have a PS4, but I have the Nano GX. Is it just movies that don't work, or are you not getting a picture, period?

If it's no picture, period, then you can try hooking up the PS4 to something else and resetting the video mode to 720p or something.

Either way, the HD Fury website has a forum, and the developers may be able to help you out (or they'll just tell you to buy Dr. HDMI).
It's cool that you have one, too. I can't find much user info with the usual web search. Niche product, I guess. I did send them an email about it.

It works when I disable HDCP in the system settings (LED is steady green with USB 5V connected), but enabling it blanks out the display output (even though the Nano GX was supposed to have taken care of HDCP issues with the PS4, going by descriptions on the site and past reviews). I'm guessing Sony blacklisted the EDID with a recent firmware or something, after that attempt to take HDFury down. Xbox One still works fine.
 
Spent some time messing with the G70. Oh boy - this will take a while. It's a beast and a pain in the ass to set up. I'll be glad once I'm done.
 
It's cool that you have one, too. I can't find much user info with the usual web search. Niche product, I guess. I did send them an email about it.

It works when I disable HDCP in the system settings (LED is steady green with USB 5V connected), but enabling it blanks out the display output (even though the Nano GX was supposed to have taken care of HDCP issues with the PS4, going by descriptions on the site and past reviews). I'm guessing Sony blacklisted the EDID with a recent firmware or something, after that attempt to take HDFury down. Xbox One still works fine.

That stinks. I've had to use my Nano GX to get HD video to play from Amazon Prime, because they will only play SD if they detect a CRT monitor or other non-HDCP display. If Sony blacklists the Nano, I'm guessing it's only a matter of time before the rest of the services do.

It really feels like the media companies are just trying to force us to buy new monitors, because I highly doubt pirates convert movies to analog to rip them. I'm sure they have better ways.
 
That stinks. I've had to use my Nano GX to get HD video to play from Amazon Prime, because they will only play SD if they detect a CRT monitor or other non-HDCP display. If Sony blacklists the Nano, I'm guessing it's only a matter of time before the rest of the services do.

It really feels like the media companies are just trying to force us to buy new monitors, because I highly doubt pirates convert movies to analog to rip them. I'm sure they have better ways.
I don't know, I'm just guessing. Maybe mine is simply borked. Xbox One works like a charm, though (it doesn't require the extra 5V, however). I'll see about posting on their forum.
 
So, what is the tested black point of this in cd/m2? What is the luminance - 90-120 cd/m2? What is the contrast ratio and delta E?
Sorry, I can't find these measurements easily, and there are tons of pages :D
 
Black level and peak luminance depend on how you calibrate it. The black level can go quite low. See this thread, where I measured it to be around 0.0006 cd/m2. Peak luminance can probably go quite high, but if you follow the WinDAS steps, you end up with a value of around 85-87 cd/m2.
 
Been emailing the HDFury people. Crowdfunding for a $500+ device may be possible. I wonder how much interest there would be.
 
Been emailing the HDFury people. Crowdfunding for a $500+ device may be possible. I wonder how much interest there would be.

You still owe us a link to that 225MHz adapter that you've been raving about. Seriously - is CTRL+C and CTRL+V too hard?
 
Been emailing the HDFury people. Crowdfunding for a $500+ device may be possible. I wonder how much interest there would be.

Can you give us a link, or at least the name, of the 225MHz adapter you claim to own? Can you take a picture of it? I'm assuming it is made of visible matter and can be photographed, right?
 
Black level and peak luminance depend on how you calibrate it. The black level can go quite low. See this thread, where I measured it to be around 0.0006 cd/m2. Peak luminance can probably go quite high, but if you follow the WinDAS steps, you end up with a value of around 85-87 cd/m2
WOW. So at .00006, what was the calibration set to? Surely if you set brightness to 10 cd/m2, that figure would be moot. But if you set it to anything over 90 cd/m2, then it's an amazing figure ;) But hey, it's CRT! I had an NEC Diamondtron, but at night you could still see the black background glowing greyish. I really don't know why I wasn't able to get good results out of a premium monitor back in 2002. The luminance was also low. The focus was also screwed up, and it took me ages to find a 12" probe to focus it. I guess that's the luck I got on eBay >.>
 
WOW. So at .00006, what was the calibration set to? Surely if you set brightness to 10 cd/m2, that figure would be moot. But if you set it to anything over 90 cd/m2, then it's an amazing figure ;) But hey, it's CRT! I had an NEC Diamondtron, but at night you could still see the black background glowing greyish. I really don't know why I wasn't able to get good results out of a premium monitor back in 2002. The luminance was also low. The focus was also screwed up, and it took me ages to find a 12" probe to focus it. I guess that's the luck I got on eBay >.>

0.0006 not 0.00006 :)

Brightness and contrast were at the defaults (31 brightness and 90 contrast). This was all done in the white point balance adjustment using WinDAS. Keep in mind that these figures represent dynamic contrast, not static contrast. So it's great for scenes that are really dark, like a star field, or when there's a black period between two scenes. But the black level rises considerably on the FW900 when a light patch is neighbouring a dark patch.
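For a sense of scale, the on/off (dynamic) contrast those two figures imply is simple arithmetic:

```python
# On/off contrast implied by the measurements quoted above.
peak_white = 85      # cd/m^2, WinDAS-calibrated peak luminance
black_level = 0.0006 # cd/m^2, measured black level
print(f"{peak_white / black_level:,.0f}:1")  # ~141,667:1
```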
 
0.0006 not 0.00006 :)

Brightness and contrast were at the defaults (31 brightness and 90 contrast). This was all done in the white point balance adjustment using WinDAS. Keep in mind that these figures represent dynamic contrast, not static contrast. So it's great for scenes that are really dark, like a star field, or when there's a black period between two scenes. But the black level rises considerably on the FW900 when a light patch is neighbouring a dark patch.

Yes, and this is what's known as ANSI contrast. To measure it, you put a checkerboard pattern up on the screen and measure the difference between the light and dark patches. CRTs don't have good ANSI contrast due to reflections in the glass. If only there were a better option for creating a vacuum enclosure that's completely transparent with no reflections. Hah - but those are the things of fiction or stupidly expensive materials... which, for the consumer, mean fiction. :D
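As a worked example of the checkerboard method (the readings below are invented for illustration, not actual FW900 measurements):

```python
# ANSI contrast: mean of the white patches over mean of the black patches
# in a checkerboard. Glass reflections raise the black readings.
white_patches = [78.0, 77.2, 79.1, 76.8, 78.5, 77.9, 78.8, 77.5]  # cd/m^2, hypothetical
black_patches = [0.45, 0.52, 0.48, 0.50, 0.47, 0.51, 0.49, 0.46]  # cd/m^2, hypothetical

ansi = (sum(white_patches) / len(white_patches)) / (sum(black_patches) / len(black_patches))
print(f"{ansi:.0f}:1")  # ~161:1 - orders of magnitude below the on/off figure
```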

Interestingly enough, has anyone actually measured ANSI contrast on the FW900?
 
0.0006 not 0.00006 :)

Brightness and contrast were at the defaults (31 brightness and 90 contrast). This was all done in the white point balance adjustment using WinDAS. Keep in mind that these figures represent dynamic contrast, not static contrast. So it's great for scenes that are really dark, like a star field, or when there's a black period between two scenes. But the black level rises considerably on the FW900 when a light patch is neighbouring a dark patch.
OK, that's pretty good in the dark scenes. Well, is there still contrast punch in mixed scenes, despite the measurements? And can you get a refresh rate OVER 60Hz at 1080p (or the nearest neighbouring res)?
 
That stinks. I've had to use my Nano GX to get HD video to play from Amazon Prime, because they will only play SD if they detect a CRT monitor or other non-HDCP display. If Sony blacklists the Nano, I'm guessing it's only a matter of time before the rest of the services do.

It really feels like the media companies are just trying to force us to buy new monitors, because I highly doubt pirates convert movies to analog to rip them. I'm sure they have better ways.
Good news, they sent me a firmware update, and now all is well. Sorry for the unfounded speculation. :p
 
OK, that's pretty good in the dark scenes. Well, is there still contrast punch in mixed scenes, despite the measurements? And can you get a refresh rate OVER 60Hz at 1080p (or the nearest neighbouring res)?

Yes, contrast is still punchy, especially with a nice 2.4 gamma curve (or a BT.1886 curve, which is essentially the same as 2.4 for displays with low black levels). Keep in mind I do all my "critical viewing" in a fairly well light-controlled environment. I run 1920x1200@85Hz, so yes, that's over 60Hz, and I don't experience flicker.
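If you're curious, the "essentially the same as 2.4" point is easy to check against the BT.1886 EOTF (a sketch plugging in the white and black levels measured earlier in the thread):

```python
# BT.1886 EOTF. With a black level this low, the offset term b is tiny and
# the curve collapses to a plain 2.4 gamma power law.

def bt1886(v, lw=85.0, lb=0.0006, gamma=2.4):
    a = (lw ** (1 / gamma) - lb ** (1 / gamma)) ** gamma
    b = lb ** (1 / gamma) / (lw ** (1 / gamma) - lb ** (1 / gamma))
    return a * max(v + b, 0.0) ** gamma

for v in (0.1, 0.5, 0.9):
    print(f"{v}: BT.1886={bt1886(v):.2f}  pure 2.4={85.0 * v ** 2.4:.2f}")
# Outputs track closely; only deep shadows are nudged up slightly.
```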
 