Video cards with 500 MHz RAMDAC or more?

atwix

Limp Gawd
Joined
Oct 7, 2012
Messages
169
As far as I know, the standard on all video cards since 2006 has been a 400 MHz RAMDAC.

I read somewhere that the NVIDIA 670 and 680 (and maybe others) may have a 500 MHz RAMDAC, but I may have misread it.

The RAMDAC is the part of the video card that generates the analog signal for CRT displays.

The reason I ask: my FW900 is limited by this 400 MHz RAMDAC, because the monitor itself can handle resolutions and refresh rates that need a higher pixel clock. My old Radeon 9800 Pro had a 500 MHz RAMDAC or more, if I recall correctly, and back then I could run 1920x1200 at 100+ Hz using PowerStrip... but the cards of that era weren't strong enough to actually hit 100 FPS in most games. With a 400 MHz RAMDAC I can set up a 2560x1600@68Hz custom resolution for this FW900, but I read somewhere that people with an NVIDIA 670 can actually squeeze out 75 Hz at that resolution, supposedly thanks to a higher RAMDAC limit. The difference between 68 and 75 Hz is huge on a CRT... I'd also like to run the highest possible resolution at 120 Hz for first-person shooters on this FW900, but the 400 MHz RAMDAC is holding me back. Hope you can understand my problem better now. :(
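To show the math I'm working with, here is a rough sketch (the blanking overhead factors are ballpark GTF-style guesses of mine, not the exact timings PowerStrip or the driver would generate):

[code]
# Rough pixel-clock estimate. A CRT needs blanking intervals on top of the
# active pixels, so the RAMDAC has to clock quite a bit more than width x
# height per frame. The ~30% / ~5% overheads are ballpark assumptions.

def pixel_clock_mhz(width, height, refresh_hz,
                    h_blank_factor=1.30, v_blank_factor=1.05):
    h_total = width * h_blank_factor
    v_total = height * v_blank_factor
    return h_total * v_total * refresh_hz / 1e6

for hz in (68, 75, 85):
    mhz = pixel_clock_mhz(2560, 1600, hz)
    verdict = "fits a 400 MHz RAMDAC" if mhz <= 400 else "needs a faster RAMDAC"
    print(f"2560x1600 @ {hz} Hz -> ~{mhz:.0f} MHz pixel clock ({verdict})")
[/code]

With those rough numbers, 68 Hz lands just under 400 MHz and 75 Hz lands just over it, which matches what I'm seeing.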

Just wondering, for the next desktop I build, whether any of the new cards (or any in the pipeline) will have MORE than a 400 MHz RAMDAC. I want to keep using my FW900 CRT until 2030 and beyond, if possible.

Please spare me the smartass comments like "they will likely remove the RAMDAC from new cards soon" or "why the heck are you still using a CRT?". Those won't quite help me.

Thx in advance for any help.:)
 
Last edited:
Why the heck are you still using a CRT? :D
hehe...

Indeed, both the GeForce 680 and the Radeon 7970 have a 400 MHz RAMDAC.

I was like you, I preferred the image quality of CRT monitors... until I bought a Dell UltraSharp :)
My next monitor upgrade would be a FED monitor or something similar :cool:

PowerStrip? Wow, I remember using that program back in the Riva TNT days!!
 
I have seen that thread too... I was just trying to find out whether the next generation of video cards will have a 400 MHz RAMDAC, a 500 MHz one, or no RAMDAC whatsoever...

You people are confusing signal MHz with RAMDAC MHz. They're two different things...

Again, the RAMDAC only comes into play if you use a CRT that needs an analog signal. The RAMDAC converts the digital signal to analog, and it usually has a 400 MHz limit...

It has nothing to do with dual-link DVI, since a CRT needs a VGA signal, and therefore a DVI-to-VGA cable or a DVI/VGA adapter plus a VGA cable.

I guess most of you don't know much about CRTs anymore...

But IMHO it's still the best gaming monitor, thanks to its black levels. Stuff in-game looks like oil paintings on this CRT technology... You folks should see BF3 on this CRT versus a 27-inch Catleap at 120 Hz.

For some games where black levels are important, the FW900 still beats the 27 inch 2560x1440 displays.

That's why I'm worried that VGA support (the RAMDAC) will disappear in the new video card generations...


EDIT: Does something like an external RAMDAC exist? Something that converts the digital signal to analog OUTSIDE the computer? Or would the card's own RAMDAC limit still apply then?
 
No, you are pretty much screwed. New cards will focus less and less on their analogue outputs. As for external converters, yes, you can find such a thing, but they are designed more for pro work and they don't support insanely high resolutions. Gefen makes one with 350 MHz of bandwidth, HDFury has one with 225 MHz, but that is really all you are going to see.
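To give a rough idea of what those bandwidth figures buy you at FW900-type resolutions, a back-of-the-envelope sketch (the blanking overheads are ballpark assumptions of mine, not the converters' real supported timings):

[code]
# Rough upper bound on refresh rate for a given analogue bandwidth.
# Blanking overheads (~30% horizontal, ~5% vertical) are ballpark guesses.

def max_refresh_hz(width, height, bandwidth_mhz,
                   h_blank_factor=1.30, v_blank_factor=1.05):
    pixels_per_frame = (width * h_blank_factor) * (height * v_blank_factor)
    return bandwidth_mhz * 1e6 / pixels_per_frame

for name, bw in (("Gefen, 350 MHz", 350), ("HDFury, 225 MHz", 225)):
    for w, h in ((1920, 1200), (2304, 1440)):
        print(f"{name}: {w}x{h} tops out around {max_refresh_hz(w, h, bw):.0f} Hz")
[/code]

Usable for 1920x1200 at decent refresh rates, but neither box gets you past what the card's own 400 MHz RAMDAC already does.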

There is just little interest in analogue displays these days, and to the extent there is, it is not really in computers.

Personally, I'd say if you have black level issues, particularly with something like BF3 which is not that dark, your answer isn't a CRT, it is a better LCD. Get something that can handle an advanced transfer function, something more than a simple power curve.

If it is just about boosting low levels for shooters and image quality isn't all that important, the BenQ XL2420T does a nice job. It's a very low latency 120 Hz LCD and it has a setting that lets you raise black levels without blowing out the image. Good shooty monitor.

Now if you want something for high quality images, then look at an NEC PA series display. They can be calibrated in hardware for advanced gamma curves like sRGB and, my favourite, L*. These preserve shadow detail without blowing out the high end. I love my 2690 (the previous generation to the PA series) in L* mode. Shadows are detailed, highlights and midtones are correct, the correction applies in games too (since it lives in the monitor itself, not the graphics card), and there is no banding since it uses 12-bit LUTs.
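If you are curious what the L* curve actually does compared to a plain power-law gamma, here is a minimal sketch of the target curve. This is just the standard CIE L* math, nothing NEC-specific; the 12-bit LUT size is only there to mirror what the monitor uses internally:

[code]
# Minimal sketch of an L* target curve: equal input-code steps map to equal
# steps in CIE lightness (L*), instead of a plain power-law gamma.

def lstar_to_luminance(code):      # code in 0..1 -> relative luminance 0..1
    lstar = 100.0 * code
    if lstar > 8.0:
        return ((lstar + 16.0) / 116.0) ** 3
    return lstar / 903.3           # linear segment near black

# A 12-bit LUT like the one the monitor applies internally.
lut = [lstar_to_luminance(i / 4095) for i in range(4096)]

# Shadow behaviour versus a plain 2.2 power curve at a dark input level:
code = 0.05
print(f"L* target: {lstar_to_luminance(code):.4f}")   # ~0.0055
print(f"2.2 gamma: {code ** 2.2:.4f}")                # ~0.0014
[/code]

That is the whole point: the dark end sits noticeably higher than a 2.2 power curve, so shadow detail stays visible without the low end being crushed.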

If you want to stick with CRTs, you are basically limited to what you find in GPUs. Those aren't going to get any better for analogue, they'll either hold steady or get more limited since the demand is just not there.
 
Yeah, I think you are pretty much out of luck on this front. I remember when RAMDAC speed was a selling point; nowadays it isn't even talked about anymore.

And why is he still using a CRT?? Because it's a FW900, that's why.
 
I still use CRT! :D

After a long and extensive search for a good LCD I ended up coming back to my Dell/Sony Trinitron P1110. Aside from the 4:3 aspect ratio (which is great for desktop use... I still prefer 16:9/16:10 for gaming, which I CAN do, it's just that 16:9 results in roughly a ~17 inch picture) and the size/weight, I have NEVER seen an LCD (other than maybe OLED) that comes anywhere close to my monitor as far as black levels and colors go. Plus native pixels at any resolution, high refresh rates, and no backlight bleed are a pretty big bonus as well! 2048x1536 or 1600x1200 @ 95 Hz is pure beauty. I am firm in the belief that, weight and size aside, CRTs are still vastly superior to LCDs in every way. To deny this I think is quite ignorant, really...

Anyways, congrats on the FW900 atwix! If I could find one I would own one. Most definitely.

CRT for life! :D
 
Last edited:

Give up, nothing you can do, really. Stick with what you have.

You have an amazing CRT, yet you're trying to squeeze out 2560x1600@75Hz, which is terrible for your eyes. Anything below 100 Hz is a pure flickerfest.

Sycraft recommended the BenQ XL2420T, which is a fine monitor and I am very happy with it; however, you HAVE to give up colors big time. Also, a 120 Hz CRT will always be better than an LCD. Man, I hate LCDs, but it seems like FED (nuCRT :D) died out, so we're stuck for years to come.
 