GeForce FX 5900U - A retrospective by you, the forum user

This whole thread is just a huge nostalgia trip. I had those RAM slots on my old AMD64 system. Back then, the more lights I had on the case, the better.
 
I've been wondering... Can an Opteron 180 and a 7800 GTX match or exceed PS3 IQ settings in PS3/PC multiplatform titles today? We're gonna find out... :D
 
If a flub in the pictures on an eBay auction turns out to be real, I'll do testing on single and dual 7800 GTX cards. Worst case, I'll have a Quadro to fuck around with. Gonna be fun times :)

I will still continue FX 5900 Ultra testing on a more regular basis after this week comes to a close. Also, because of this thread's expanding nature, the thread title needs an edit. If a mod could take care of that when reading this, that would be awesome.

More to come mid next week!
 
at this point I think it's about having fun and messing around with old school hardware.. kinda.. :)
 
at this point I think it's about having fun and messing around with old school hardware.. kinda.. :)

Agreed, with an emphasis on testing the older nVidia cards to see how they have aged. Also of note, I have 2 7800 GTX 256MB cards with 512-style double-slot heatsinks coming (they're Dell reference cards), along with an accompanying SLI bridge. This will be fun... :D
 
Agreed, with an emphasis on testing the older nVidia cards to see how they have aged. Also of note, I have 2 7800 GTX 256MB cards with 512-style double-slot heatsinks coming (they're Dell reference cards), along with an accompanying SLI bridge. This will be fun... :D

maybe I'll toss my pair of 512s in a similar system and compare...
 
The 7800 GTX 512 was a whole different card performance-wise, not just a 7800 with extra memory, if I remember right.. quite a step up from the 256MB variant. It would be interesting indeed to see the comparison. :)
 
I own an FX 5950 Ultra and I've tested it in a number of systems against a number of graphics cards, and I pretty much got the same results everyone else is reporting in this thread. For everything up to DX 8.1 it's actually a pretty good card. It's a beast at UT2k4 and even Killing Floor (DX8 mode), but the second you give it anything made for DX9 it just chokes hard. HL2 running on Source 2007 (the EP2 engine) is basically unplayable on it at any setting. My 9800 Pro just destroys it in every aspect except for VRAM (128MB vs 256MB).

Currently I've got it in a dual Athlon MP 2600+ system and occasionally play some UT2k4 old-school style on it.


On another note, I've been mucking around with a 7900 GTX I had lying around and managed to get it to run Tomb Raider 2013 with drivers released during its heyday (not the latest ones, as the game refused to start on those). It definitely manages to put up a fight, but I'd be kidding myself if I said it ran it as well as a PS3.


I've got all sorts of goodies to play with. I have every Nvidia flagship card from the GeForce 2 Ultra up to the 8800 Ultra.
 
maybe I'll toss my pair of 512s in a similar system and compare...

Please do! :)

I own an FX 5950 Ultra and I've tested it in a number of systems against a number of graphics cards, and I pretty much got the same results everyone else is reporting in this thread. For everything up to DX 8.1 it's actually a pretty good card. It's a beast at UT2k4 and even Killing Floor (DX8 mode), but the second you give it anything made for DX9 it just chokes hard. HL2 running on Source 2007 (the EP2 engine) is basically unplayable on it at any setting. My 9800 Pro just destroys it in every aspect except for VRAM (128MB vs 256MB).

Currently I've got it in a dual Athlon MP 2600+ system and occasionally play some UT2k4 old-school style on it.


On another note, I've been mucking around with a 7900 GTX I had lying around and managed to get it to run Tomb Raider 2013 with drivers released during its heyday (not the latest ones, as the game refused to start on those). It definitely manages to put up a fight, but I'd be kidding myself if I said it ran it as well as a PS3.


I've got all sorts of goodies to play with. I have every Nvidia flagship card from the GeForce 2 Ultra up to the 8800 Ultra.

Wish I could find some 7800 GTX 512 cards for cheap, but they are practically non-existent if you're not running a Mac Pro (and definitely not worth the price!), and I may as well buy a 7950 GX2 for the price of one 7900 GTX.

It will be interesting indeed...

Also, try rendering at 1152x640, the resolution the PS3 (or was it Xbox 360?) version used. Set it under custom resolutions in the nVidia control panel. You should get much better results on a mix of medium and high settings with newer drivers. It will be interesting to see how that does...

Personally, if I can get most games I want to test running at console-quality settings or better, I'll consider this worth the time and money. Can you say S939 club v2? :p
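
For context on why dropping the render resolution helps so much: fill-rate-bound performance scales roughly with the number of pixels drawn per frame (very roughly; this ignores vertex and CPU costs entirely). A quick, purely illustrative sketch of the math in Python:

```python
# Pixel counts per frame -- fill-rate-bound performance scales
# roughly with these (ignoring vertex/CPU load entirely).
resolutions = [(1152, 640), (1280, 720), (1280, 1024), (1600, 1200)]
base = 1280 * 720  # 720p as the reference point

for w, h in resolutions:
    px = w * h
    print(f"{w}x{h}: {px:>9,} px ({px / base:.0%} of 720p)")
```

1152x640 works out to 80% of the pixels of 720p, and well under half of 1600x1200, so there's real headroom to spend on medium/high settings instead.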
 
The 7800 GTX 512 was a whole different card performance-wise, not just a 7800 with extra memory, if I remember right.. quite a step up from the 256MB variant. It would be interesting indeed to see the comparison. :)

Yeah, I think it was slightly faster than a 7950 GT 512... It was definitely beastly. It would be awesome if I lucked out and got two 512 cards by mistake... One can hope for an eBay miracle, no? :D
 
Please do! :)



Wish I could find some 7800 GTX 512 cards for cheap, but they are practically non-existent if you're not running a Mac Pro (and definitely not worth the price!), and I may as well buy a 7950 GX2 for the price of one 7900 GTX.

It will be interesting indeed...

Also, try rendering at 1152x640, the resolution the PS3 (or was it Xbox 360?) version used. Set it under custom resolutions in the nVidia control panel. You should get much better results on a mix of medium and high settings with newer drivers. It will be interesting to see how that does...

Personally, if I can get most games I want to test running at console-quality settings or better, I'll consider this worth the time and money. Can you say S939 club v2? :p

Nope, PS3 Tomb Raider rendered at 1280x720 just like the 360 version and actually looked sharper thanks to a less aggressive FXAA algorithm and better texture filtering.

It takes running at 800x600 on normal details (the max you can get from a DX9 setup) to get anything resembling playable out of a 7900 GTX. The CPU it was paired with was a Phenom II X4 965, so it wasn't exactly CPU limited.

The PS3 version obviously had several optimizations made to squeeze the most out of the Cell and the RSX's independent pixel and vertex shaders. The PC version seems to highly favor a unified shader setup. I tossed in a lowly GeForce GT 220 (with DDR2 memory, even) and watched as it handled 1280x720 with the same sort of framerates the 7900 GTX was getting at 800x600.
 
The PS3 version obviously had several optimizations made to squeeze the most out of the Cell and the RSX's independent pixel and vertex shaders.
Yeah, PS3 optimization is a bit goofy at times. It's not uncommon to see developers actually run things like post-processing shaders in software, on the Cell CPU, rather than in hardware on the GPU.

It gains them some extra headroom on the GPU side of things while also letting them run totally custom shader code. It makes porting games off the PS3 to other systems a massive pain in the ass, though.
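
Just to put toy numbers on that headroom argument (completely made-up figures, and real scheduling is nowhere near this clean): if the SPUs chew through post-processing in parallel while the GPU renders the next frame, the GPU only pays for the scene itself.

```python
# Toy frame-budget model of moving post-processing off the GPU.
# All numbers are invented for illustration only.
GPU_RENDER_MS = 28.0  # main scene rendering on the GPU
POST_FX_MS = 6.0      # post-processing pass (AA, bloom, DoF...)

gpu_does_everything = GPU_RENDER_MS + POST_FX_MS
# With post-processing pipelined onto the CPU, the slower of the two
# overlapped stages sets the frame time.
spu_offload = max(GPU_RENDER_MS, POST_FX_MS)

print(f"GPU does everything: {gpu_does_everything:.1f} ms "
      f"({1000 / gpu_does_everything:.0f} fps)")
print(f"SPU post-processing: {spu_offload:.1f} ms "
      f"({1000 / spu_offload:.0f} fps)")
```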
 
So far it seems that CompuG##K is correct. Even in SLI, it seems like anything past 2007 is too much for this system as is. S.T.A.L.K.E.R. wants to be playable maxed at 1280x720, but I think I am RAM limited, so lots of disk swapping ensues. Outside of the PS3 versions themselves, there seems to be little optimization for hardware that predates the 8 series debut.

More findings and screens tomorrow!
 
So far it seems that CompuG##K is correct. Even in SLI, it seems like anything past 2007 is too much for this system as is. S.T.A.L.K.E.R. wants to be playable maxed at 1280x720, but I think I am RAM limited, so lots of disk swapping ensues. Outside of the PS3 versions themselves, there seems to be little optimization for hardware that predates the 8 series debut.

More findings and screens tomorrow!

Yep. It doesn't help that SLI scaling back then with those cards wasn't anywhere near as efficient as it is now with newer cards. Anyway, I popped an 8800 Ultra into the same system I tested the other two cards in and watched it breeze through Tomb Raider at 1600x1200 with no problems. Amazing how much oomph a 7-year-old card still has. Either that, or it's sad to see how little games have progressed after being held back by the 360 and PS3 hardware. :(
 
Yep. It doesn't help that SLI scaling back then with those cards wasn't anywhere near as efficient as it is now with newer cards. Anyway, I popped an 8800 Ultra into the same system I tested the other two cards in and watched it breeze through Tomb Raider at 1600x1200 with no problems. Amazing how much oomph a 7-year-old card still has. Either that, or it's sad to see how little games have progressed after being held back by the 360 and PS3 hardware. :(

I would say a bit of both, actually... The 8800 Ultra was a beast, for sure. Then again so were the 7800 GTX 256MB cards 2 years before... :/

Still, with the 307.83 release, SLI works more often than not. I went through driver/corruption hell all day trying to upgrade nForce and downgrade ForceWare drivers... UGH. Welcome to SLI, I suppose. :p

From what I understand, 2 7800 GTX = 9600 GT... Accurate?
 
a year or so ago I was tinkering with some older cards and games and concluded this:

The X1900 series was much more future-proof than the 7800/7900 series. My X1900 XTX could play modern games like Batman: AA fluidly with good details, while one of my 7800 GTX 512s was quite a few ticks slower across the board.

Once you tap into the GeForce 8 series, there isn't much they can't play well with reasonable details on a 1680x1050 screen or lower.
 
I would say a bit of both, actually... The 8800 Ultra was a beast, for sure. Then again so were the 7800 GTX 256MB cards 2 years before... :/

Still, with the 307.83 release, SLI works more often than not. I went through driver/corruption hell all day trying to upgrade nForce and downgrade ForceWare drivers... UGH. Welcome to SLI, I suppose. :p

From what I understand, 2 7800 GTX = 9600 GT... Accurate?

I would think the 9600 GT is way ahead of two 7800 GTXs in anything modern.
 
From what I understand, 2 7800 GTX = 9600 GT... Accurate?
It's going to be hard to compare those two setups...

Each 7800 GTX has 24 pixel and 8 vertex shaders, so you have a total of 48 pixel shaders and 16 vertex shaders working for you with two of them in SLI (with varying degrees of effectiveness, depending upon the game).

A 9600 GT has 64 stream processors, which can be used for either pixel or vertex operations. A 9600GT could potentially allocate all of its stream processors to pixel or vertex shading (depending on the requirements of the game) and blow the 7800GTX SLI setup out of the water. Any games that are heavy on vertex shaders will favor the 9600GT by a huge degree.
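
Here's a crude way to see why the mix matters so much: a back-of-the-envelope Python sketch, assuming perfect SLI scaling and identical per-unit throughput (neither of which is true in practice).

```python
# Back-of-the-envelope: fixed pixel/vertex split vs. unified shaders.
# Illustrative only -- this ignores clocks, SLI overhead, memory
# bandwidth, and per-unit efficiency differences.

def fixed_split(pixel_units: int, vertex_units: int, pixel_frac: float) -> float:
    """Effective throughput with dedicated units: the slower side of
    the split becomes the bottleneck for the whole frame."""
    t_pixel = pixel_frac / pixel_units
    t_vertex = (1.0 - pixel_frac) / vertex_units
    return 1.0 / max(t_pixel, t_vertex)

def unified(stream_processors: int) -> float:
    # Unified units take whatever work arrives, so the mix doesn't matter.
    return float(stream_processors)

for frac in (0.9, 0.75, 0.5):  # share of shader work that is pixel work
    print(f"pixel share {frac:.0%}: "
          f"2x 7800 GTX ~{fixed_split(48, 16, frac):.0f} effective units, "
          f"9600 GT {unified(64):.0f}")
```

At exactly a 3:1 pixel-to-vertex mix the 48/16 split is perfectly balanced and the two land in the same ballpark; shift the workload toward vertex work and the SLI setup's effective width collapses while the unified part doesn't care.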
 
It's going to be hard to compare those two setups...

Each 7800 GTX has 24 pixel and 8 vertex shaders, so you have a total of 48 pixel shaders and 16 vertex shaders working for you with two of them in SLI (with varying degrees of effectiveness, depending upon the game).

A 9600 GT has 64 stream processors, which can be used for either pixel or vertex operations. A 9600GT could potentially allocate all of its stream processors to pixel or vertex shading (depending on the requirements of the game) and blow the 7800GTX SLI setup out of the water. Any games that are heavy on vertex shaders will favor the 9600GT by a huge degree.

This. I may have posted it earlier in this thread, but I ran Sonic All-Stars Racing on a rig with an AMD Radeon HD 3450 and on a rig with a GeForce 6600. The 6600 essentially outperforms the HD 3450 in older games from before unified shader architectures existed, and the HD 3450 obliterates the 6600 in unified-shader games. Sonic All-Stars Racing was almost decently playable on the HD 3450, whereas the 6600 just would not and could not do it at all. I'm guessing the 3 vertex pipes were to blame (along with the obvious efficiency advantage the HD 3450 had over the older tech). I think AnandTech threw some older graphics cards like the GeForce 7900 series into some recent IGP benchmarks for comparison. The 7900 series got the smack-down put on them when it came to geometry processing, due to the 8 vertex pipes.
 
a year or so ago I was tinkering with some older cards and games and concluded this:

The X1900 series was much more future-proof than the 7800/7900 series. My X1900 XTX could play modern games like Batman: AA fluidly with good details, while one of my 7800 GTX 512s was quite a few ticks slower across the board.

Once you tap into the GeForce 8 series, there isn't much they can't play well with reasonable details on a 1680x1050 screen or lower.

Judging from what I have seen so far, I'm inclined to agree. These 7800 GTX 256 cards are far and away WAY better than the GeForce FX 5900 Ultra that kicked off this thread, but as soon as I start messing around with games from past mid '07 they run out of steam real fast. I ordered a Mushkin 2GB DDR400 kit with 2-3-2-5 timings, equal to the 1GB OCZ kit a generous forum member sent me :D so that opens the door to testing many more new games. But for reference, just look at Xenos vs. RSX... In games that lean on the GPU rather than the CPU (read: Cell Broadband Engine), the Xbox 360 pulls ahead almost every time. Sometimes significantly so.

And, well... Guess what's next on the chopping block once I'm done with these fellas? :p

It's going to be hard to compare those two setups...

Each 7800 GTX has 24 pixel and 8 vertex shaders, so you have a total of 48 pixel shaders and 16 vertex shaders working for you with two of them in SLI (with varying degrees of effectiveness, depending upon the game).

A 9600 GT has 64 stream processors, which can be used for either pixel or vertex operations. A 9600GT could potentially allocate all of its stream processors to pixel or vertex shading (depending on the requirements of the game) and blow the 7800GTX SLI setup out of the water. Any games that are heavy on vertex shaders will favor the 9600GT by a huge degree.

That's true, it's a huge advantage to be able to switch your workloads on the fly as needed. From my understanding, the general-purpose "stream processors" aren't as efficient per-clock as dedicated pixel and vertex shaders, but the flexibility more than makes up for it.

This. I may have posted it earlier in this thread, but I ran Sonic All-Stars Racing on a rig with an AMD Radeon HD 3450 and on a rig with a GeForce 6600. The 6600 essentially outperforms the HD 3450 in older games from before unified shader architectures existed, and the HD 3450 obliterates the 6600 in unified-shader games. Sonic All-Stars Racing was almost decently playable on the HD 3450, whereas the 6600 just would not and could not do it at all. I'm guessing the 3 vertex pipes were to blame (along with the obvious efficiency advantage the HD 3450 had over the older tech). I think AnandTech threw some older graphics cards like the GeForce 7900 series into some recent IGP benchmarks for comparison. The 7900 series got the smack-down put on them when it came to geometry processing, due to the 8 vertex pipes.

I do remember reading about that, the new 3DMark, correct?

That's interesting nonetheless. There is a reason the fixed pipelines went the way of the dodo...
 
And... By request of revenant... EXTREME CLOSE UP! WOAAAAAAAA!

Time to put these boys to work on some benchmarking next week!
 
whoa... those are big. :eek: is that the stock opty 180 cooler?

Yes, sorry. That's why last time I posted thumbnails. :p

That is indeed the stock heatsink for the Opty 180 (or any 110-125W AMD CPU).


The cards themselves are pulls from old Pentium D Dell XPS 600 systems, hence the double-slot Quadro FX 4500/GeForce 7800/7900 GTX 512 heatsinks rather than the single-slot 7800 GTX 256MB ones we are used to. It's a good thing, too, because the top card gets super toasty in 3D (10-15 deg. C hotter than the bottom one), and even idling it's still several degrees warmer.

At any rate I love the smell of vintage hardware in the morning. :D
 
Those 7800s have some mem chips on the backs also.. the ones on the bottom card might get warm.. but should be ok. :)

man.. this thread = such a nostalgia trip.
 
I should throw an interesting twist to this thread and pull out my HD 3850 AGP and run some tests... :D

Or maybe I should dig out my 5950 Ultra and see how well it runs when paired with a modern quad-core CPU...


That board with the Phenom II X4 965 that I tested the 7900 GTX on just so happens to have BOTH a PCI-E x16 slot AND an AGP 8x slot!

http://www.asrock.com/mb/NVIDIA/ALiveDual-eSATA2/

It's the only board I know of that supports up to a Phenom II X4 965 while also offering AGP 8x and a full-speed PCI-E x16 slot that isn't gimped to x4 like on the equivalent Intel boards that support C2Q Kentsfield chips. It's incredibly hard to find these days and I was lucky to get this one from a guy in Europe.

To the lab, Igor!
 
DFI Motherboards. So damn sexy

Yes they are. I had a LanParty NF3 250Gb back in '06-'07 that was totally awesome. I love the RAM slots over the CPU; dunno why, but it is damn sexy.

Those 7800s have some mem chips on the backs also.. the ones on the bottom card might get warm.. but should be ok. :)

man.. this thread = such a nostalgia trip.

Yep, these do. Other than the Quadro/GTX 512 heatsinks, these are totally stock 7800 cards. And the memory modules on the back of the top card get so hot under load I can't stand to touch them! I think the top card gets up to the mid-70s C and the bottom one to the low-to-mid 60s. Does that sound about right for air cooling?

I should throw an interesting twist to this thread and pull out my HD 3850 AGP and run some tests... :D

Or maybe I should dig out my 5950 Ultra and see how well it runs when paired with a modern quad-core CPU...


That board with the Phenom II X4 965 that I tested the 7900 GTX on just so happens to have BOTH a PCI-E x16 slot AND an AGP 8x slot!

http://www.asrock.com/mb/NVIDIA/ALiveDual-eSATA2/

It's the only board I know of that supports up to a Phenom II X4 965 while also offering AGP 8x and a full-speed PCI-E x16 slot that isn't gimped to x4 like on the equivalent Intel boards that support C2Q Kentsfield chips. It's incredibly hard to find these days and I was lucky to get this one from a guy in Europe.

To the lab, Igor!

Awesome score, and I think you should definitely throw a screwball in this thread to spice it up!

Eheh... Yesss Masster! Eheh! :p
 
Once properly configured, and as long as you don't ask too high a resolution of them, dual 7800 GTX cards make relatively quick work of F.E.A.R. 2. :D

The final WHQL release for the GeForce 7 series is 307.83, and honestly it seems to work the best so far, maybe 80% of the time.

More to come.
 
Nice... I finally just finished FEAR2 and the little DLC expansion. Not as good as the original, but it still had some of the feel and sound of the original... whereas FEAR3 is a pretty big departure.. not in a good way imo.
 
Nice... I finally just finished FEAR2 and the little DLC expansion. Not as good as the original, but it still had some of the feel and sound of the original... whereas FEAR3 is a pretty big departure.. not in a good way imo.

This is my third or fourth time going through. At first I was definitely on the bandwagon of rampant "consolitis" plaguing this game, but in retrospect, playing it on this hardware, I realize it's really not all that bad.

Especially in comparison to F3AR, which, as you said, was nowhere near as good by a long shot.

2GB DDR400 kit arrives tomorrow. Then the fun begins. :D
 
RAM has arrived and is working. 3DMark03, 05, and 06 kick things off later. Stay tuned.
 
Does your profile name have any reference to choof? ;)

Anywho, I should be able to gain some free time tomorrow or later this week to get this shit set up as I promised earlier.
 
Does your profile name have any reference to choof? ;)

Anywho, I should be able to gain some free time tomorrow or later this week to get this shit set up as I promised earlier.

Perhaps... I will leave the guesswork to you, though... ;)

Sounds good. I've been busy dealing with personal drama and other issues, so that's why benchmarks have not gone up. Things are seemingly better, so I should be able to get to some benchmarks this weekend. A few new pics to follow as well. Apparently my board is an NF4 Ultra-D modded to an SLI-DR... I accidentally cleaned away the conductive dab of AS5 making the bridge, but it was nothing a 0.9mm mechanical pencil couldn't fix! Also, with the GTX 512 heatsinks on these cards I found that an overclock to near-7900 GT clocks was a cinch, so I've flashed them to 450/1300. Considering the age of these boards, and the fact that the top card easily gets 10-15C hotter than the bottom (the RAM sinks on the top card become unbearable to touch under load without a 120mm fan blowing on them), I don't want to push any harder. I'm playing it safe...

More to come!
 