GeForce FX 5900U - A retrospective by you, the forum user

Mr. Bluntman

Supreme [H]ardness


I'm starting this thread to get some feedback from fellow [H]ers about how (if at all) the GeForce FX line's drivers and performance have improved since the cards launched roughly a decade ago. I was recently given an AMD Athlon XP 2700+ (T-Bred B) system with a decent KT600 motherboard by my gf's stepdad, and I would like to replay the games of yesteryear that don't really work well on multicore systems and Windows 7. I thought this would be a fun distraction, and the box would also make a serviceable internet computer for when someone else needs a PC to use.

*Now before anything is said about how "the GeForce FX line sucked" - it's around a third to half the price of the few Radeon 9800 Pro/XT cards on eBay, and considering I am between jobs at the moment, money is scarce, so I am looking to do this on the cheap. If I really had the money, I would suck it up and buy an Athlon 64 FX-62 and a GeForce 7900 GTX. The card is already paid for and en route via USPS. And hey, at least it's better than the 64MB Radeon 7000 that's currently installed in that PC, amirite?

I have been poring over reviews, and they don't seem to use any drivers other than the 44.xx series Detonators or the 52.xx ForceWare drivers with the first revision of the HLSL compiler.

So... the question I am posing to you is this: with the most recent driver revisions that support the card, what kind of gameplay/overall experience can I expect?

System specs will be:

AMD Athlon XP 2700+ Thoroughbred B @ 2.16GHz (soon to be upgraded to a 3200+ Barton)
1GB DDR-333 CL3 (soon to be 2x512MB DDR-400 CL3)
Gigabyte GA-7VT600-RZ (again, looking to upgrade to an ASUS A7N8X Deluxe)
GeForce FX 5900 Ultra 256MB (reference, pulled from a Gateway AGP system)
Maxtor DiamondMax 9 160GB PATA 7200RPM w/8MB cache
AGI 300W ATX PSU
Windows XP SP3



I will be playing stuff like Silent Hill 3, Doom 3, Far Cry, NFS:U, Serious Sam, Max Payne 1/2, Half-Life 2, Unreal 2k3/4 and Unreal II, TR: AOD, etc. I'm also open to suggestions on what other great games from 10 years ago would work well on this system.

Thanks in advance and look forward to reading the ensuing discussion!
 
The thing you have to remember about the GeForce FX series of cards is that they weren't tuned for Shader Model 2.0 (DirectX 9) performance. It's fully supported, but the architecture isn't heavily geared towards it.

When the FX series was being designed, DirectX 8.1 was still in heavy use, so a lot of the improvements in the NV30 core went into Shader Model 1.x performance. End result was beastly DX 8.1 performance compared to the GeForce 4, totally overlooked because most performance tests were done using DX9 on the FX cards.

If you can stick to games where you can force them to use DX 8.1 instead of DX9, you should actually be pretty happy with the performance. Case in point: there's a mod called "Oldblivion" that re-implements Oblivion's shaders in Shader Model 1.4 and actually gets the game running quite well on even the lowly GeForce FX 5200. Similarly, adding the "-dxlevel 81" launch option to Source engine games like Half-Life 2 will also give them a massive performance boost on the FX series.
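
For anyone who wants the exact incantation, it's the standard Source launch option (set through Steam's Launch Options or a shortcut target, IIRC both work):

Code:
hl2.exe -dxlevel 81

You can confirm it took by checking mat_dxlevel in the developer console afterwards.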
 
The thing you have to remember about the GeForce FX series of cards is that they weren't tuned for Shader Model 2.0 (DirectX 9) performance. It's fully supported, but the architecture isn't heavily geared towards it.

When the FX series was being designed, DirectX 8.1 was still in heavy use, so a lot of the improvements in the NV30 core went into Shader Model 1.x performance. End result was beastly DX 8.1 performance compared to the GeForce 4, totally overlooked because most performance tests were done using DX9 on the FX cards.

If you can stick to games where you can force them to use DX 8.1 instead of DX9, you should actually be pretty happy with the performance. Case in point: there's a mod called "Oldblivion" that re-implements Oblivion's shaders in Shader Model 1.4 and actually gets the game running quite well on even the lowly GeForce FX 5200. Similarly, adding the "-dxlevel 81" launch option to Source engine games like Half-Life 2 will also give them a massive performance boost on the FX series.

It was because NVidia built this card for FP16 and FP32, while ATi went with FP24 following the DX9.0b spec for all pixel shader ops, wasn't it? I remember there being a huge hubbub about this... And NOBODY ever implemented the FX12 shaders built into the hardware, IIRC... If anyone has examples of this, great! Show me...

I suppose the question I was posing was: did NVidia ever improve SM2.0 performance beyond what was seen with the 52 series ForceWare drivers in the initial reviews, and what games does this forum recommend I revisit that would run well on said hardware?
 
It was because NVidia built this card for FP16 and FP32, while ATi went with FP24 following the DX9.0b spec for all pixel shader ops, wasn't it? I remember there being a huge hubbub about this... And NOBODY ever implemented the FX12 shaders built into the hardware, IIRC... If anyone has examples of this, great! Show me...
That was an issue specific to the Source engine, for the most part. Valve used FP24 precision for all shaders, which ATi cards could run natively without a problem... in fact, FP24 was the only level ATi cards supported at the time.

The GeForce FX cards supported both FP16 and FP32, but skipped FP24. Attempting to run FP24 precision shaders on a GeForce FX results in the card having to run them in FP32 mode (with a fairly large performance hit).

The controversy sprang up because the shaders that caused these issues on the FX series cards could have been implemented in FP16 without any real drop in visual quality. This would have resulted in a massive speedup for GeForce FX cards, but no improvement for ATi cards (which would then have to run the FP16 shaders at FP24 precision). People accused Valve of playing favorites with ATi, when in reality they were just following the DX9 spec (which says FP24 is the minimum).

I suppose the question I was posing was: did NVidia ever improve SM2.0 performance beyond what was seen with the 52 series ForceWare drivers in the initial reviews, and what games does this forum recommend I revisit that would run well on said hardware?
Most drastic way to improve SM 2.0 performance is probably by forcing FP16, as seen in this thread here: http://hardforum.com/showpost.php?p=1026929557&postcount=9

May or may not lead to visual artifacts, depending on the game... but it will prevent the use of the performance-crippling FP32 mode. Switching to DX 8.1 mode also avoids the whole issue.
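
If you want a feel for the kind of thing that can break, here's a quick sketch in Python (numpy's float16/float32 standing in for the shader FP16/FP32 formats, which is only a rough analogue): half precision tops out around 65504, so any shader that squares a decent-sized world-space distance falls over, while shaders that stay in the 0..1 color range are usually fine.

Code:
# Rough illustration of a forced-FP16 failure mode: squaring a world-space
# distance of a few hundred units overflows half precision (max ~65504),
# while the same math is trivial in FP32.
import numpy as np

np.seterr(over="ignore")  # silence the expected float16 overflow warning

pos = np.array([300.0, 400.0, 0.0])  # a point a few hundred units from the eye

for name, dt in (("FP16", np.float16), ("FP32", np.float32)):
    v = pos.astype(dt)
    dist = np.sqrt((v * v).sum())  # 300^2 = 90000 already overflows float16
    print(name, "distance =", dist)

# Prints roughly:
#   FP16 distance = inf
#   FP32 distance = 500.0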
 
That was an issue specific to the Source engine, for the most part. Valve used FP24 precision for all shaders, which ATi cards could run natively without a problem... in fact, FP24 was the only level ATi cards supported at the time.

The GeForce FX cards supported both FP16 and FP32, but skipped FP24. Attempting to run FP24 precision shaders on a GeForce FX results in the card having to run them in FP32 mode (with a fairly large performance hit).

The controversy sprang up because the shaders that caused these issues on the FX series cards could have been implemented in FP16 without any real drop in visual quality. This would have resulted in a massive speedup for GeForce FX cards, but no improvement for ATi cards (which would then have to run the FP16 shaders at FP24 precision). People accused Valve of playing favorites with ATi, when in reality they were just following the DX9 spec (which says FP24 is the minimum).

But the lackluster performance in SM2.0 was not just limited to Source, correct?

And there is the matter of ATi giving Valve $6M in funding. The conspiracy theorist in me *dons tinfoil hat* still thinks this is no small coincidence. But I'm not trying to necro a nine year old flamewar on the subject...
 
But the lackluster performance in SM2.0 was not just limited to Source, correct?
Correct, though Source hit FX cards particularly hard compared to other game engines.

And there is the matter of ATi giving Valve $6M in funding. The conspiracy theorist in me *dons tinfoil hat* still thinks this is no small coincidence. But I'm not trying to necro a nine year old flamewar on the subject...
Nah, there's really no need for a flamewar now. These cards and issues are ancient.

For what it's worth, precision hints are also part of the DX9 spec, which tell the card when it can use lower precision to render a shader. Valve neglected to use that part of the spec, meaning all shaders were given to the card as "FP24 minimum" even when most would have worked fine with hints telling the card it could use FP16 instead. Again, this wouldn't have helped or hurt ATi cards (since all they can run is FP24), but it would have made GeForce FX cards perform a hell of a lot better.
 
We do not talk about FX/NV30. That is classified information. Therefore we have no idea what you are talking about, sir. :eek:
 
Correct, though Source hit FX cards particularly hard compared to other game engines.


Nah, there's really no need for a flamewar now. These cards and issues are ancient.

For what it's worth, precision hints are also part of the DX9 spec, which tell the card when it can use lower precision to render a shader. Valve neglected to use that part of the spec, meaning all shaders were given to the card as "FP24 minimum" even when most would have worked fine with hints telling the card it could use FP16 instead. Again, this wouldn't have helped or hurt ATi cards (since all they can run is FP24), but it would have made GeForce FX cards perform a hell of a lot better.

Agreed, which raises the question: if Valve was trying to make the game run as well as possible on the widest range of hardware, why the hell didn't they use those flags?

Anyways, back on topic: Has performance been noticeably improved since initial reviews through driver updates and recompiler improvements? Or no?
 
Agreed, which raises the question: if Valve was trying to make the game run as well as possible on the widest range of hardware, why the hell didn't they use those flags?

Anyways, back on topic: Has performance been noticeably improved since initial reviews through driver updates and recompiler improvements? Or no?

We won't know that until you get your card. I highly doubt anyone has done any tests on this. Most people just forgot about the FX series of cards once the GeForce 6 series came out.
 
I really don't think things will have improved that much. It also looks like they stopped updating drivers for the GeForce FX line in 2006 (ForceWare 96.85).

Edit: Ok, never-mind, it gets weirder...

Latest drivers for Vista/7 = 96.85
Latest drivers for WinXP = 175.19
 
I really don't think things will have improved that much. It also looks like they stopped updating drivers for the GeForce FX line in 2006 (ForceWare 96.85).

So, really no point in installing a driver past that revision, no?

EDIT: I think the 175s break Remote Desktop, but that's not really a concern of mine. Ultimately, performance increases DO matter.

I'm going to take it upon myself to find out if no one else has a GeForce FX 59xx series card with AGP system to match.
 
We won't know that until you get your card. I highly doubt anyone has done any tests on this. Most people just forgot about the FX series of cards once the GeForce 6 series came out.

I think you may be right. Funny in retrospect, because R600 was an equally huge flop, albeit slightly more competitive. Well I guess I am going to have to shed some light on this, then...

Grab the popcorn, this is going to be interesting. :D
 
it would have made GeForce FX cards perform a hell of a lot better.
Along with introducing all kinds of precision artifacts and compromises in image quality.

fp16 only gives you enough mantissa to accurately sample a 256x256 texture

fp24 has enough mantissa to sample 2048x2048 textures (the minimum required by directx 9)

fp32 could sample 524288x524288 textures but that was overkill for the time.
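
(Rough back-of-the-envelope version of that argument, if anyone wants to play with it. numpy's float16 and float32 stand in for the shader FP16/FP32 formats - there's no FP24 type to test - and the exact cutoffs you land on depend on how many sub-texel bits you budget for filtering, which is why the quoted figures vary.)

Code:
# Worst-case texture addressing resolution for a given float format.
import numpy as np

def texels_resolvable(dtype):
    # eps is the gap between 1.0 and the next representable value; texture
    # coordinates live in [0, 1], so this is the worst-case step size.
    step = float(np.finfo(dtype).eps)
    # A texture N texels wide needs coordinate steps of 1/N, so N is capped
    # at roughly 1/step (before reserving any sub-texel bits for filtering).
    return int(round(1.0 / step))

for name, dt in (("FP16", np.float16), ("FP32", np.float32)):
    print(name, "resolves roughly", texels_resolvable(dt), "texels per axis")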

The fact of the matter was that NV3X was never meant for DirectX 9. They only had FP32 for marketing purposes. (DaveBaumann was a mod at Beyond3D and is now working at ATI/AMD.)

DaveBaumann said:
On the way back to the hotel from E3 one night I cadged a lift on the NVIDIA bus. When I got on I noticed the name badge of Keith Galocy, a name I recognised from the 3dfx days. We were talking about a number of things, such as what he's up to and the NV3x parts etc., and he made that exact same point himself. He said that NV30 is really a good DX8 performer, but with DX9 hardware, which is similar to the previous generations. With the number of FX12 units NV30 is a superb DX8 class performer, but not quite so hot a DX9 performer.

ATI took a slightly more generic route of not bothering with full FP32 precision, but more FP24 units that can generically cope with both DX9 and DX8 shaders, so they have ended up with a more balanced architecture for current titles and new titles - if NV stick to running DX9 shaders at full DX9 precision, half of the NV30 pipeline is wasted as it's not float.


Then came the Dawn demo, the wrapper to run it on ATI hardware, and then the upgrade to full FP32. This is what the author of that upgrade, Uttar (Arun, a mod at Beyond3D), had to say regarding NV3X:

I'm gonna be clear here, comparing the R350 and NV35...

nVidia got higher AA speed, but lower AA quality.
nVidia got higher AF speed, and roughly equal quality.
nVidia got roughly equal VS speed ( depends on operation )

But...

nVidia got fifty billion times slower Pixel Shaders.

I mean, damn, this is ridiculous! Okay, maybe not fifty billion times, but for complex programs, they are easily *five times slower than the R350*. No joking here.

And what will guarantee to nVidia that the NV35 will run well in future games?
Oh, well, that one ain't hard. They're slowing down the whole industry - because their hardware can't run it, and many FXs are gonna be sold, developers are not going to use such complex shading programs.

Want it or not, the NV3x is really crap at pixel shaders. Unless the NV40 fixes this massively, I fear I've got to say that the R350 is a much better architecture than the NV35 - too bad nVidia will do everything to prevent it from looking so, and thus making the NV35 better. Eh.


Uttar


There is no need for conspiracy theories here when the simple fact is that Nvidia got outdesigned by ATI's engineers.

P.S.: It's quite funny to read that last part of his comment and remember how Nvidia made Ubisoft pull the DirectX 10.1 patch from Assassin's Creed. The way it's meant to be bribed, lol. Old habits die hard, eh?
 
Great find, Lorien. Interesting. So, if FP32 was useless, why the hell bother to implement it other than for purely marketing purposes? As an aside, didn't Dave Baumann get let go some time after the ATi/AMD (DAAMiT) merger?

They must have been passing some really GOOD SHIT around NVidia's offices after the 3dfx buyout to celebrate...
 
Great find, Lorien. Interesting. So, if FP32 was useless, why the hell bother to implement it other than for purely marketing purposes? As an aside, didn't Dave Baumann get let go some time after the ATi/AMD (DAAMiT) merger?

They must have been passing some really GOOD SHIT around NVidia's offices after the 3dfx buyout to celebrate...

He still works at AMD as a Product Manager. He was in fact directly responsible for how good the Radeon 4850 turned out and has had a hand on recent AMD products as well. http://www.anandtech.com/show/2679/9
 
Along with introducing all kinds of precision artifacts and compromises in image quality.
Erm... the idea behind precision hints is that you only tell the card to run at low precision when it won't impact image quality. It's part of the DX9 standard (the spec requires FP24, but lower levels are supported and accounted for).

Most of the effects in HL2 do, in fact, work fine in FP16 mode (meaning Valve could have used hints to improve performance on Nvidia cards). The only things that really needed FP24 were the water shader and window reflections, as far as I can tell.

The fact of the matter was that NV3X was never meant for Directx 9.
You don't have to tell me, that's one of the first things I said on the matter :p
 
Erm... the idea behind precision hints is that you only tell the card to run at low precision when it won't impact image quality. It's part of the DX9 standard (the spec requires FP24, but lower levels are supported and accounted for).

Most of the effects in HL2 do, in fact, work fine in FP16 mode (meaning Valve could have used hints to improve performance on Nvidia cards). The only things that really needed FP24 were the water shader and window reflections, as far as I can tell.

I still have to scratch my head as to why Valve didn't do this... Either laziness or a $6M kickback...
 
I had a FX5900 back in the day. I really liked it.

IIRC there was a driver that forced FP16 for games that used FP24. I don't remember if it was official or a hack, but it sped things up tremendously in shaders.
 
Holy shit this thread brings back memories of Doom 3 and Unreal Tournament goodness. I had a FX5950 Ultra at the time (major leafblower under load) and this 5900 makes me really want to pick up a 5950 again just for fun.
 
like these?

[photos of an acrylic display case full of GeForce FX cards]

;)

The FX series had some of the best-looking heatsink/fan setups... PNY not so much, but cool purple PCB... Chaintech and Gainward always had the fun-looking coolers.

As for performance, once you hit something like the Doom 3 engine these start to work hard... I played Max Payne and SiN on a 5950 a year or so ago and those played fine... Unreal 2 even played pretty well. I think I played a little Pariah also... but it was struggling with that.
 
Holy shit, revenant! That's a hell of a collection! What else you got in that acrylic case?

I can only dream of having such a collection one of these days. :D
 
Holy shit this thread brings back memories of Doom 3 and Unreal Tournament goodness. I had a FX5950 Ultra at the time (major leafblower under load) and this 5900 makes me really want to pick up a 5950 again just for fun.

Indeed, hence the build. I would have bought a 5950U, but they were $20 more (I paid $30 shipped for the card) and that isn't worth the price premium, IMO. A GeForce 6800 GT/Ultra would be a bit overkill for this system (and more money!). I played Doom 3 originally on an FX 5900 128MB non-Ultra on a 2600+ Barton. It was a solid setup, until the neighbors downstairs busted the front door off the hinges and even blasted a hole in my locked bedroom door to steal that system. They sold it off to get some meth, AFAIK (definite druggies, read: SCUMBAGS). I was SOOO pissed! Cops wouldn't do shit because I didn't have serial numbers for EVERY component in the system. :mad:

As an aside, had the same thing happen in May with my Xbox 360. I started taking photos of all the hardware I have with serial numbers in view just in case.

Guess karma is finally paying it forward 9 years later. Better late than never, I suppose. Anyone have a 21/22" Trinitron CRT laying around? :p

Now karma can pay it forward with a new 360... :p
 
Holy shit, revenant! That's a hell of a collection! What else you got in that acrylic case?

I can only dream of having such a collection one of these days. :D

haha... ebay is your friend. I got a lot of these for $30 bux or less (less than $10 in some cases)... a couple were my old cards..

[more photos of the graphics card collection]


I ran out of space, kinda... I'm going to build a new display that will hold all the cards on a shelf with some lighting.

Sorry about all your theft issues. I have an extra 6800GT PCIe if you want one... PM me.
 
haha... ebay is your friend. I got a lot of these for $30 bux or less (less than $10 in some cases)... a couple were my old cards..



I ran out of space, kinda... I'm going to build a new display that will hold all the cards on a shelf with some lighting.

Beautiful! And yeah, that's where I scored my card. Once things get more situated financially, I'll have to start my own collection! Is that a reference GeForce 3 in the top left? I remember the Maximum PC article from March 2001 covering it. Good stuff!

EDIT: PM sent. Thanks! Now I just need a PCIe Motherboard and Athlon 64 and I am G2G. :D
 
My memories of the FX-5900 were of my friends dogging me for getting a Radeon 9700pro (this was before the FX series was even out) and not waiting for the GeForce 5 series. :D

Edit - er... I mean 5800.
 
My memories of the FX-5900 were of my friends dogging me for getting a Radeon 9700pro (this was before the FX series was even out) and not waiting for the GeForce 5 series. :D

Edit - er... I mean 5800.

Good thing you declined to wait, huh? The FX5800 was awful, even I will admit.
 
Holy shit this thread brings back memories of Doom 3 and Unreal Tournament goodness. I had a FX5950 Ultra at the time (major leafblower under load) and this 5900 makes me really want to pick up a 5950 again just for fun.

Really? I had my 5900 flashed to 5950 and it wasn't loud at all.
 
When I used that Chaintech 5950 (with the chrome cooler and two fans) it was VERY loud under load... and heat was pouring off it. I think they ran pretty hot no matter what.
 
Really? I had my 5900 flashed to 5950 and it wasn't loud at all.

The stock coolers on the 5900 were significantly quieter than the 5950U's. I knew someone who had a 5950, and while it wasn't 5800-loud, you definitely heard it running in your PC.

When I used that Chaintech 5950 (with the chrome cooler and two fans) it was VERY loud under load... and heat was pouring off it. I think they ran pretty hot no matter what.

Yes they did. Even the ASUS 5900 NU I had in my PC that got stolen in '04 ran pretty toasty.

As an aside, I got my 5900 Ultra today. I'm definitely going to do some benchmarking when I get home this evening. It will be interesting to see if there were any major performance improvements courtesy of the drivers. :)
 
Good thing you declined to wait, huh? The FX5800 was awful, even I will admit.

Oh yes. To be honest, even when the 5900s came onto the scene, I still felt no need to upgrade. The Radeon 9700 Pro lasted me until 2005, when I went up to an X800XL. And to be even more frank, if it hadn't died on me, I would still be using it for one of our LAN boxes. :D
 
As an aside, I got my 5900 Ultra today. I'm definitely going to do some benchmarking when I get home this evening. It will be interesting to see if there were any major performance improvements courtesy of the drivers. :)


Looking forward to the results. For some reason, this thread has piqued my curiosity. :)
 
Looking forward to the results. For some reason, this thread has piqued my curiosity. :)

As am I. No hardware review site, AFAIK, has ever revisited the GeForce FX series in such a manner, investigating whether there were any performance improvements since the initial 44.xx and 52.xx drivers. I am extremely curious whether NVidia continued to improve the shader compiler or just stopped after the GeForce 6 release...

I plan on testing Tomb Raider: Angel of Darkness, Max Payne 1/2, Serious Sam, Doom 3, UT2K3, and 3DMark2001/03. What other games/benchmarks should I test?
 
As am I. No hardware review site, AFAIK, has ever revisited the GeForce FX series in such a manner, investigating whether there were any performance improvements since the initial 44.xx and 52.xx drivers. I am extremely curious whether NVidia continued to improve the shader compiler or just stopped after the GeForce 6 release...

I plan on testing Tomb Raider: Angel of Darkness, Max Payne 1/2, Serious Sam, Doom 3, UT2K3, and 3DMark2001/03. What other games/benchmarks should I test?

Far Cry, Oblivion, and The Chronicles of Riddick. I'm curious to see how big of a beating Oblivion gives it.
 
What's really interesting is benchmarking old 3DFX Voodoo graphics cards in modern systems.

They lack Hardware T&L, which means it ends up running in software on your CPU. Pair a Voodoo card with a Core i7 and you'll see some pretty jaw-dropping performance compared to release-day benchmarks.
 
What's really interesting is benchmarking old 3DFX Voodoo graphics cards in modern systems.

They lack Hardware T&L, which means it ends up running in software on your CPU. Pair a Voodoo card with a Core i7 and you'll see some pretty jaw-dropping performance compared to release-day benchmarks.

Absolutely fascinating. Anyways, the card is in and seems to be working well so far, hooked up to my LP2475w through DVI. I installed the 163.75 ForceWare drivers to start. I will go back through driver revisions and see what works best, and what doesn't.


[photos of the FX 5900 Ultra installed and running]


More to come!
 