does anybody have a copy of the modified Windows 7 64-bit GDM-FW900 drivers anywhere?
I lost my backup :(
If any of you still have it anywhere, I'll store a copy in my Dropbox.
Also, would these drivers help with using the GDM-FW900 as an extended desktop monitor?
When I start Windows 7 now (without...
I'm no expert on this, but I am pretty sure this is a very BAD idea.
Installing a new coating will only ruin the image further, no doubt.
As recommended in this thread: if the anti-glare coating is removed, the idea is to use the monitor in perfect lighting conditions to get the most out of it.
I use mine in dimly lit...
the CRT blacks are SO black that the 'power on' light of the CRT is the only thing visible.
I'm looking at the blacks in the screenshots above, and is it just ME, or can you actually SEE the difference between the CRT screen, the bezel, and the CRT's surroundings?
Must be my...
Isn't the HDFury something for PlayStations and old CRT televisions? Or are they designing the new one with CRT monitor users in mind?
I looked at them before, but the HDFury could never handle high pixel clocks, like 1920x1200@85Hz..
same here. Some of my old 21-inch 4:3 Trinitrons had various problems after a few years.
I bought my FW900 in 2007; I have been using it 2-8 hours (usually eight) a day on weekdays ever since, and the thing is still flawless.
I got the CRU utility from ToastyX and installed a...
So the FW900 wasn't the pinnacle of CRT technology?
A 0.19mm shadow mask monitor sounds like it is better?
One thing about the FW900 that always annoyed me is that the maximum (realistic) resolution is 1920x1200, because at higher resolutions the dot pitch starts interfering..
Or not?
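For the curious, here's a rough back-of-the-envelope for why ~1920 wide is the realistic ceiling. The viewable width and grille pitch figures below are my own assumptions (quoted from memory, not the spec sheet), so treat the numbers as ballpark only:

```python
# How many phosphor stripes fit across the tube? Assumed figures:
# GDM-FW900 viewable width ~482 mm, aperture-grille pitch ~0.23 mm
# (it actually varies from centre to edge, so this is an approximation).
viewable_width_mm = 482
grille_pitch_mm = 0.23

stripes = viewable_width_mm / grille_pitch_mm
print(f"~{stripes:.0f} phosphor triads across the screen")

# At 1920 horizontal pixels each pixel still gets slightly more than one
# triad; at 2304 pixels you are already below one triad per pixel, so
# fine detail starts to smear into the grille.
print(f"triads per pixel at 1920: {stripes / 1920:.2f}")
print(f"triads per pixel at 2304: {stripes / 2304:.2f}")
```

So right around 1920x1200 is where the pixel grid and the grille pitch stop cooperating, which matches the experience above.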
Over the years, I went from 1920x1200@85Hz as my desktop resolution to 1440x900@120Hz.
The latter is indeed a lot sharper: no motion blur, no distortion.
I even use it in older games now, because the eye strain is considerably less if you spend 8+ hours in front of your CRT.
At 1920x1200@85hz I was...
I always find it amusing that the image shown in the ad pictures for those excellent-quality CRTs is always BAD.
Makes me wonder if the pinnacle of CRT technology (aka the GDM-FW900) wasn't there too early. Graphics cards etc. might not have been able to get the most out of this CRT.
I mean, I...
that is exactly why many players who play multiplayer first-person shooters on a CRT at 800x600@160Hz HAVE AN EDGE in that game.
Better reaction time = EDGE.
As to why that is? dunno :p
My best guess: an LCD changes each pixel's colour when displaying a moving scene at 144Hz.
CRT actually...
I just installed a 1920x1080@96Hz resolution with CRU.
The idea is to make a resolution whose refresh rate is a multiple of 24Hz, for movie watching.
Is it even possible to watch movies at 96Hz on a GDM-FW900?
I vaguely remember reading something about it being possible, but that the values needed...
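The arithmetic behind picking 96Hz is just divisibility by 24: film is 24 fps, so a refresh rate that is an exact multiple of 24 shows every frame the same number of times and avoids uneven pulldown judder. A trivial sketch:

```python
# Which common refresh rates divide evenly by the 24 fps film rate?
for hz in (60, 72, 85, 96, 120):
    repeats = hz / 24
    judder_free = (hz % 24 == 0)
    print(f"{hz} Hz -> {repeats:.2f} refreshes per film frame "
          f"({'judder-free' if judder_free else 'uneven pulldown'})")
```

At 96Hz every film frame is shown exactly 4 times; at 60Hz or 85Hz the repeat count isn't whole, which is where the judder comes from.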
I have owned a GDM-FW900 since 2007.
Can anyone point me to a good old CRT calibration program that includes geometry patterns, contrast images, and colour images for calibrating the picture with the OSD? It has become really hard to find a download on the web that doesn't involve installing shady download...
I actually bought a good-quality 27-inch monitor a while back, able to display 1440p at 120 FPS.
I was playing dark caves in Skyrim, and no matter what I tried, the blacks never got displayed correctly. I could see all the walls even in total darkness.
On my trusted Sony GDM-FW900, I actually saw...
I'd get it. Like the GDM-FW900, that monitor was designed for professional photo editing.
A 19-inch monitor is good too; just sit a bit closer to the screen if you think it's tiny :)
Whether or not it's worth buying if you already own a good CRT is up for debate.
Nice. Drag a FW900 to a LAN party someday.
I bet all the youngsters won't be able to decide whether you're a LOON using a stone-age CRT, or whether to approach the beast out of curiosity.
are you sure it's not a software problem? Check the ATI or NVIDIA control panel and look
whether the "adapt image size" option is on or not.
Messing with that option lets you stretch 4:3 games to the entire screen, but I think it makes old games look weird.
Ok, sorry for reacting the way I did.
But, to be honest, this isn't true for a CRT.
On an LCD, playing old games, or newer ones at lower resolutions and higher refresh rates, looks bad and blurry.
Not on a CRT.
Playing 800x600 or 960x(xxx)@160Hz gives a good image on a CRT.
In...
don't be so patronizing. You have no clue what you're talking about.
I agree that for work an LCD is fine, but for gaming a CRT is still best. Period. Try playing Counter-Strike on an LCD at 800x600@160Hz. FAIL. A CRT can do that. That's just one example.
You are thinking of low-res 14-inch...
Pretty much every review and comparison of the BEST HDTV CRT picture you could buy was always won by the Sony KD34XBR960 (never mind it was a 220lb pig).
It is a 34-inch, 1080i-capable HD digital CRT TV.
I STILL use it alongside my fw900.
I know, it's 1080i. But still, I watch Blu-rays on...
found out what the problem is. Interlaced resolutions only accept 60Hz (2x30Hz).
So I'll try the highest 16:9 60Hz resolution I can get for interlaced.
Speaking of which: how do I calculate this? What is the highest 16:9 interlaced resolution a FW900 can handle?
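If anyone wants to estimate it, here is a rough sketch. The limits I use (121 kHz maximum horizontal scan, ~400 MHz pixel clock) and the blanking overheads are my own assumptions from memory, not official figures, so take the result as ballpark only:

```python
def fits(w, h, field_rate=60):
    """Check whether a w x h interlaced mode at the given field rate fits
    the assumed limits: horizontal scan <= 121 kHz, pixel clock <= 400 MHz."""
    h_total = w * 1.25              # assume ~25% horizontal blanking
    v_total = h * 1.04              # assume ~4% vertical blanking
    frame_rate = field_rate / 2     # interlaced: 60 fields/s = 30 frames/s
    h_freq = v_total * frame_rate   # horizontal scan rate in Hz
    pixel_clock = h_total * h_freq  # Hz
    return h_freq <= 121_000 and pixel_clock <= 400e6

# Walk down the 16:9 ladder until a mode fits both limits
# (step of 36 keeps h divisible by 9, so the 16:9 width stays integral):
for h in range(2880, 0, -36):
    w = h * 16 // 9
    if fits(w, h):
        print(f"highest assumed 16:9 interlaced mode: {w}x{h}i@60")
        break
```

With these assumptions the interlaced ceiling is the pixel clock rather than the scan rate; tweak the two limit constants to match whatever the real spec sheet says and rerun.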
best gaming monitor, best picture and colours, best black and white levels, etc. etc.
And you can play online shooters at 120Hz, or even 160Hz or more, for an advantage.
it seems the interlaced resolutions just don't get accepted.
I got 2240x1260@85Hz and 2400x1350@85Hz installed for Far Cry 4 (since it has to be a 16:9 resolution there).
The interlaced ones just don't show up.
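For anyone building custom modes for this game: a mode is exactly 16:9 when width x 9 equals height x 16, which is a quick way to sanity-check entries before typing them into CRU:

```python
# Verify that the custom modes above really are exact 16:9.
for w, h in [(2240, 1260), (2400, 1350)]:
    assert w * 9 == h * 16, f"{w}x{h} is not 16:9"
    print(f"{w}x{h} -> exact 16:9")
```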
i still can't set the interlaced resolution. In AMD Catalyst Control Center it shows the maximum supported resolution is 3264x2040, but I can't select it. Not in Far Cry 4 either.
How do I set a resolution with RadeonPro? Can't find it...
i will try this out for sure, since most games these days are harder on the CPU than the GPU.. And 30-40 fps max in games is fine with me, as long as it isn't an online shooter.
And why 1020?
edit: never mind, Far Cry 4 doesn't support 16:10 resolutions without black bars.
If you want my opinion: it does not hurt to remove it, but only consider it if you use your FW900 in a room with no direct lights or sunlight that can fall on the monitor.
In reality, both monitors, WITH and WITHOUT the AG coating, look very good. Black level is beautiful on both AG-removed and AG-coated...
maybe a silly question, but here goes.
When I bought this FW900, I used brightness 30.
Then 25, 20, 15, and now I'm at 10. I used the Windows 7 brightness test and some others (and yes, the X is still vaguely visible, as it should be).
I wonder, is this a case of being too used to washed-out...