The GPU Nvidia would rather forget – GeForce FX

Thank you, I didn’t know this history
ikr? I never knew that either. I rode a 9700 Pro for a long-ass while, skipping over the whole FX era, and then went straight to a 6800GT. I did watch from the sidelines, and it seemed like people were pretty underwhelmed by the FX cards for the most part.. well, until the final couple of SKUs..

EDIT: Dan_D and WilyKit - Yeah, the X1900 XTs (I think one is an XTX) were nice.. I still have a pair of those which I ran in CF back in the day.. loud as hell, but they were beautiful cards for the day.. They still look pretty badass too.
 
The FX series was perhaps the last time that NVIDIA just completely fell flat on their face with no backup plan. Even the previous GeForce 4 Ti4600 looked pretty competitive for the money against the FX series, to say nothing of ATI’s dominant 9x00 series. Any other NVIDIA failures since then have been short-lived or happened during AMD’s long malaise era, so they went unpunished. The embarrassment of the FX 5800 Ultra probably led to the all-out dominance of the 8800 GTX, so we should probably be thankful.

The last few times we got a powerhouse card from NVIDIA, it was when they were worried about stiff competition. The 8800 GTX was designed to defeat the mythical Radeon HD 2x00 series (not the actual one, which was a disappointment), the 1080 Ti was supposed to topple mythical Vega (but only had to contend with actual Vega), and the 4090 was meant to go up against what the 7900 XTX was supposed to be. AMD doesn’t need to actually bring the fight for consumers to win, but NVIDIA needs to think they will.
 
EDIT: Dan_D and WilyKit - Yeah, the X1900 XTs (I think one is an XTX) were nice.. I still have a pair of those which I ran in CF back in the day.. loud as hell, but they were beautiful cards for the day.. They still look pretty badass too.
Yes they were, I had an X1900 XTX; in fact, that was my very last Radeon ever. I remember how loud that damn red blower fan was inside the clear polycarbonate housing. :ROFLMAO:
 
At some point, I think around the 8800GTX, I realized NVIDIA was far superior to ATi for antialiasing due to having more ROPs. I never went back to ATI/AMD after that. Whatever card AMD had up against the 8800GTX was a disaster, but the benchmarks partially glossed over the defeat because many of them didn’t push AA performance. I returned whatever card that was and got the 8800GTX, which was a beast.
 
At some point, I think around the 8800GTX, I realized NVIDIA was far superior to ATi for antialiasing due to having more ROPs. I never went back to ATI/AMD after that. Whatever card AMD had up against the 8800GTX was a disaster, but the benchmarks partially glossed over the defeat because many of them didn’t push AA performance. I returned whatever card that was and got the 8800GTX, which was a beast.
AMD underinvested in ROPs that generation and tried to make up some of the shortfall by leaning heavily on shader power to perform AA. The problem below the high-end 2900 series parts was that the video decoding block took up an appreciable amount of die space, roughly equal to a block of 40 CUs or so. The 2900s, with a maximum of 320 CUs, weren't impacted - IIRC they missed out on at least some video acceleration capabilities, and the 2600 and 2400 cards were better off in that respect - but the inclusion of the video block took the 2600 series, which could have been a 160 CU part, down to 120 CUs, and the 2400, which could have been an 80 CU part, down to 40 CUs. So on top of the ROP shortfall, outside of the highest-end parts there also wasn't enough raw shader grunt to handle antialiasing well, given the finite memory bandwidth and the constrained compute.
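To put that into a quick back-of-envelope sketch (the ~40-unit area cost of the video block is just my rough estimate from above, not an official spec; the part names and counts are the same ones I used in the paragraph):

```python
# Rough sketch of the die-budget tradeoff described above.
# VIDEO_BLOCK_COST is an estimate, not an official figure.
VIDEO_BLOCK_COST = 40  # area of the video decode block, in shader-unit equivalents

# (part, units the die area could have supported, video block included?)
parts = [
    ("HD 2900", 320, False),  # skipped the video block, so no area penalty
    ("HD 2600", 160, True),
    ("HD 2400", 80, True),
]

for name, potential_units, has_video_block in parts:
    actual = potential_units - (VIDEO_BLOCK_COST if has_video_block else 0)
    print(f"{name}: {potential_units} possible -> {actual} shipped "
          f"({'with' if has_video_block else 'without'} video block)")

# HD 2900: 320 possible -> 320 shipped (without video block)
# HD 2600: 160 possible -> 120 shipped (with video block)
# HD 2400: 80 possible -> 40 shipped (with video block)
```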

The 8800GT/GTX/Ultra were bonecrushers, even if their own low-end derivative parts suffered a bit for reasons not totally dissimilar to AMD's...
 
The FX series was perhaps the last time that NVIDIA just completely fell flat on their face with no backup plan. Even the previous GeForce 4 Ti4600 looked pretty competitive for the money against the FX series, to say nothing of ATI’s dominant 9x00 series. Any other NVIDIA failures since then have been short-lived or happened during AMD’s long malaise era, so they went unpunished. The embarrassment of the FX 5800 Ultra probably led to the all-out dominance of the 8800 GTX, so we should probably be thankful.

The last few times we got a powerhouse card from NVIDIA, it was when they were worried about stiff competition. The 8800 GTX was designed to defeat the mythical Radeon HD 2x00 series (not the actual one, which was a disappointment), the 1080 Ti was supposed to topple mythical Vega (but only had to contend with actual Vega), and the 4090 was meant to go up against what the 7900 XTX was supposed to be. AMD doesn’t need to actually bring the fight for consumers to win, but NVIDIA needs to think they will.

AMD underinvested in ROPs that generation and tried to make up some of the shortfall by leaning heavily on shader power to perform AA. The problem below the high-end 2900 series parts was that the video decoding block took up an appreciable amount of die space, roughly equal to a block of 40 CUs or so. The 2900s weren't impacted - IIRC they missed out on at least some video acceleration capabilities, and the 2600 and 2400 cards were better off in that respect - but the inclusion of the video block took the 2600 series, which could have been a 160 CU part, down to 120 CUs, and the 2400, which could have been an 80 CU part, down to 40 CUs. So on top of the ROP shortfall, outside of the highest-end parts there also wasn't enough raw shader grunt to handle antialiasing well, given the finite memory bandwidth and the constrained compute.

The 8800GT/GTX/Ultra were bonecrushers, even if their own low-end derivative parts suffered a bit for reasons not totally dissimilar to AMD's...
Wasn’t the 8800GTX basically the Crysis card? Except when the game actually hit, the card was nowhere near adequate. My recollection is that for other titles like Battlefield and Half-Life 2 it provided an unprecedented amount of headroom.
 
Wasn’t the 8800GTX basically the Crysis card? Except when the game actually hit, the card was nowhere near adequate. My recollection is that for other titles like Battlefield and Half-Life 2 it provided an unprecedented amount of headroom.
Was an Oblivion card for me. With mods that game looked great.
 
The one individual card of the series they would prefer, more than anything else in the history of the company, to erase would be the FX 5200.

It had all the fancy features, with none of the actual performance to drive it. Pixel Shader 2.0? Sure, supported! Want to actually see something like.... water with it? lol. lolol. lololol. DX9? Yup! Totes! If you want to run at the lowest settings. Any game that was only DX9 would never have run on this card at any decent settings.
 
The one individual card of the series they would prefer, more than anything else in the history of the company, to erase would be the FX 5200.

It had all the fancy features, with none of the actual performance to drive it. Pixel Shader 2.0? Sure, supported! Want to actually see something like.... water with it? lol. lolol. lololol. DX9? Yup! Totes! If you want to run at the lowest settings. Any game that was only DX9 would never have run on this card at any decent settings.
Yeah, it was odd. Coming from the previous gens, where feature support was completely different per model, they just decided to make the whole FX line support all the features. Now, whether many of the FX models were actually capable of driving said features... well, that's another story entirely. :ROFLMAO:
 
Wasn’t 8800GTX basically the Crysis card? Except when the game actually hit, the card was no where near adequate. My recollection is that for other titles like Battlefield and Half-Life 2 it provided an unprecedented amount of headroom.
A single one ran Crysis fine on mostly high settings at 1600x1200. With two in SLI you could crank most settings up to their highest.
 
I eventually began damn near speed-racing how quickly I could take down an Oblivion gate. Make yourself super quick + an invisibility ring... man, I was OP AF.
 
I eventually began damn near speed-racing how quickly I could take down an Oblivion gate. Make yourself super quick + an invisibility ring... man, I was OP AF.
Yup, I farmed the Moth Priests for OP rings in my original playthrough. It made the game unfair... for the AI.
 
The one individual card of the series they would prefer, more than anything else in the history of the company, to erase would be the FX 5200.

It had all the fancy features, with none of the actual performance to drive it. Pixel Shader 2.0? Sure, supported! Want to actually see something like.... water with it? lol. lolol. lololol. DX9? Yup! Totes! If you want to run at the lowest settings. Any game that was only DX9 would never have run on this card at any decent settings.
Yep, I remember in Feb '04 I had a chance to get either a 5200 or a Ti4200 64MB. I chose the 4200.
 