Question for the veterans of PC gaming: why was the Radeon 9700 Pro such a big deal?

6800NU with all the 16 pixel pipelines unlocked anyone? ^_^

And how much did the DDR memory bandwidth hold that 16-pipe performance back (versus the 6800 GT)?

It was probably as huge a gap as the 6600 versus the 6600 GT.
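Napkin math on the bandwidth side, assuming the commonly quoted reference memory clocks (700 MHz effective DDR on the 6800 NU, 1000 MHz effective GDDR3 on the GT, both on a 256-bit bus) - treat the numbers and the little helper below as illustrative assumptions, not gospel:

[CODE]
#include <cstdio>

// Peak memory bandwidth = bus width in bytes x effective data rate.
// Clocks are the commonly quoted reference specs, not measured values.
static double bandwidth_gb_s(int bus_bits, double effective_mhz)
{
    return (bus_bits / 8.0) * effective_mhz * 1e6 / 1e9; // bytes/s -> GB/s
}

int main()
{
    std::printf("6800 NU, 256-bit DDR @ 700 MHz eff.:    %.1f GB/s\n", bandwidth_gb_s(256, 700.0));
    std::printf("6800 GT, 256-bit GDDR3 @ 1000 MHz eff.: %.1f GB/s\n", bandwidth_gb_s(256, 1000.0));
    return 0;
}
[/CODE]

That works out to roughly 22.4 GB/s versus 32 GB/s, so on paper the NU gives up about 30% of the GT's memory bandwidth.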

The reason I bought the 6600 GT AGP over the 6800 was that the 6800 cost an extra $50 and wasn't decisively faster.

https://www.anandtech.com/show/1546/13

This is the page that made me jump on that launch-week XFX 6600 GT AGP: to play Half-Life 2. It played it splendidly at 1280x960, high settings, 2x AA.

If I was going to spend that kind of money, why not go 6800 GT?
 
I kept up with the nVidia 6800 series when it came out and I never heard of the 6800 NU. I knew about the vanilla, GS, GT, and Ultra, but never heard of that.
 
It was, by far, the better card. I remember at the time nvidia wasn't even discussed in my circles.
 
I kept up with the nVidia 6800 series when it came out and I never heard of the 6800 NU. I knew about the vanilla, GS, GT, and Ultra, but never heard of that.
6800NU = Non Ultra or vanilla
There was also a 6800LE that could unlock pipelines to the full 16.
 
For me it was the first AMD card I ever bought. I owned their CPUs before but never a GPU. I had bought the FX 5800 and was just not satisfied with the performance and especially the noise/temps. It taught me an important lesson about brand loyalty.
 
The early 2000s were pretty much peak PC gaming, if you ask me, and the Radeon 9700/9800 lines were some of the reasons why. Alas, I only had a 9600 XT because my parents didn't feel like letting me pitch in a bit extra to step up to the much faster 9700, but oh well, I got the Half-Life 2 voucher out of it.

Everyone's already gone over the contextual relevance by now; that was when ATI curbstomped NVIDIA just like AMD curbstomped Intel with the Athlon 64. GeForce FX, while now sought-after for retrogaming builds due to having paletted texture support and a few other things dropped in the 6 Series that some older games rely on, simply sucked at DX9/Shader Model 2.0, to the point that the top-of-the-line GeForce FX 5950 Ultra was only about as fast as a mere Radeon 9600 in SM2.0, while costing about as much as a Radeon 9800 XT - back when a top-of-the-line gaming graphics card only cost about $500-550. (If only I could say the same of today's RTX 2080 Ti!)

What hasn't been touched on as much were the games that were dropping around this time, the reason why I'm calling this the peak PC gaming period. This was right around when games were starting to require programmable pixel/vertex shaders, so if you were stuck with an integrated GeForce 2 GTS or whatever, sorry, no Deus Ex: Invisible War or Thief: Deadly Shadows for you, go buy something that properly outclasses the original Xbox already.

On top of that, while some other hit games like Unreal Tournament 2003/2004, Doom 3 and Half-Life 2 didn't technically require programmable shaders (and the latter two largely kept a fallback path because of the popular but infamously misleading GeForce 4 MX cards being glorified GF2s stuck with a fixed-function pipeline, instead of anything like the GF3-succeeding GeForce 4 Ti with Shader Model 1.1), they really benefited from the massive leap in GPU performance.

Doom 3's one of the few from that time that didn't suck hard on GeForce FX due to its use of OpenGL and how it handles dynamic shadows, but we also got Far Cry and Half-Life 2 in 2004 as well, which really cemented how much of a gigantic leap top-of-the-line PC gaming was above the aging consoles of the time (note the heavily downgraded Doom 3 and Half-Life 2 ports on Xbox, and how much better Splinter Cell: Chaos Theory looked on PC despite the Xbox version not being a slouch for the time) and gave people reasons to buy these expensive graphics cards to run 'em right.

Follow that up with the likes of the original F.E.A.R. and The Elder Scrolls IV: Oblivion in the following years, and the Radeon 9700/9800 still held up decently, even though the GeForce 6800 and Radeon X800 lines had released and literally doubled performance over the old 9800 XT at the same $500-550 price range for the very best gaming graphics cards available, with the 6 Series even adding Shader Model 3.0 support that Splinter Cell: Chaos Theory and Far Cry got patched to utilize for HDR rendering (sadly mistaken for overdone bloom in those days).

It was a time of excitement and AAA hits out the wazoo, before the industry got poisoned with things like excessive monetization through paid DLC, reliance on upgrade trees and grinding for competitive games, the relative disuse of dedicated servers and server browsers as well as game mods, and the stagnation in PC hardware we saw last decade, where hardware prices got jacked way up for much less of a practical performance increase. (On the other hand, still being able to game pretty well on my aging 4770K/GTX 980 build, or my over-a-decade-old Q6600/GTX 760 setup is pretty nice. It's mostly VR games that make me want a new GPU.)
 
Very well said NamelessPFG!

The games made a huge difference, I can confirm the main reason I picked up the 9700 Pro was due to it being used to showcase the upcoming Doom III. Up until that time, it was the most graphically impressive game I had ever seen. And I wanted to be sure I could play it. Our family Dell shipped with a Geforce 3 Ti200.
Ironically the Geforce 3 had also been used to showcase Doom III, although I didn't know it at the time.

Just off the top of my head, some of the titles I got to enjoy were obviously Doom III. But also Half Life 2, Far Cry, Call of Duty, Rainbow Six 3 Raven Shield, Battlefield 2, Splinter Cell, and even Command and Conquer Generals. It was truly an amazing time to be a PC Gamer.
 
The 8800 GTX was too expensive for most people; most had to wait a year for the 8800 GT to get a taste of that performance. The 9700 Pro launched at a normal price, and actually the 9700 was an even better deal at nearly $100 less.
How convenient of you not to mention the 8800GTS 320 & 640mb versions, which were released at the same time as the GTX. Those were reasonably priced and offered almost twice the performance of the previous gen, and ATI had nothing at that performance level at the time. It wasn't the 8800GTX that was the star of the show, but the GTS versions, due to their tremendous bang-for-the-buck performance. The 8800gt came out a year later and further improved upon these cards at an even lower price.
 
I moved from an FX 5900 XL over to a 9700 Pro as DELL was pushing ATi hard, with display booths set up in area malls running demos for people to interact with the games. Back then I played MOHAA and then the first COD on an Asus P 800 before jumping over to AMD socket 939 and a 64-bit 3200+ single-core, then my X2 3800+ dual-core which is still running today.
 
Why? It was fast, had DX9 features, and didn't sound like a dust buster, all while being relatively inexpensive. It was also heavily geared toward Half-Life 2.

I still have mine...and it's still working in a retro-rig. ;)
 
How convenient of you not to mention the 8800GTS 320 & 640mb versions, which were released at the same time as the GTX. Those were reasonably priced and offered almost twice the performance of the previous gen, and ATI had nothing at that performance level at the time. It wasn't the 8800GTX that was the star of the show, but the GTS versions, due to their tremendous bang-for-the-buck performance. The 8800gt came out a year later and further improved upon these cards at an even lower price.

The 640mb wasn't very affordable and the 320mb was VRAM starved, plus it cost more than the 8800gt, hence why I didn't mention it.
 
The hype around cards like the 9700 Pro/9800 Pro is at least partly due to the games that were released after them, namely Half-Life 2, Doom 3, and Far Cry. While there were eventually better options for these games later on, these were the cards that many used to carry them through this era. Half-Life 2 has a particular connection to Radeon cards from this time: not only did they perform well, many Radeon cards had Half-Life 2 compatibility checkmarked on the box. Radeon jumped on the hype train for Half-Life 2 and it seems like it turned out to be a good marketing strategy.

Looking back, the 8800gtx, another card famous for its performance, was released before Crysis came out and was able to play it decently.

I think there is definitely a correlation between these cards and the games they were used to play.
 
The ATI 9700 FULLY supported DX 9.0c at the time, and it also came with a voucher for Half-Life 2, which was also fully DX 9.0c. Nvidia at the time defaulted to DX 8.1 and never ever made it to DX 9.0c, which pissed off all the Nvidia fanboys and others........ a HUGE downturn and lie from Nvidia at the time.
 
The 640mb wasn't very affordable and the 320mb was VRAM starved, plus it cost more than the 8800gt, hence why I didn't mention it.
They were $449 MSRP on release and over 50% faster than the 7900GTX on average. Sometimes they were over 100% faster. Comparatively, the 2080 was $700 MSRP on release and what, a measly 10% faster than the 1080 Ti at best?
The 8800GTS offered a lot of performance for the $$$.
 
The 640mb wasn't very affordable and the 320mb was VRAM starved, plus it cost more than the 8800gt, hence why I didn't mention it.
320mb at the time (2007) was above average, certainly more than anything ATI had at the time. Plus it had a 320-bit bus. The 8800gt had even less vram!! 256mb! I guess you weren't old enough to remember cards from that era.

This was the most demanding game at the time. Look at the ATI cards in comparison:

[attached benchmark chart]

And before you comment on the low FPS, 1920x1200 was like 4k back in the day. Most systems were on 1280x1024 res back then.
 
320mb at the time (2007) was above average, certainly more than anything ATI had at the time. Plus it had a 320-bit bus. The 8800gt had even less vram!! 256mb! I guess you weren't old enough to remember cards from that era.

This was the most demanding game at the time. Look at the ATI cards in comparison:

[attached benchmark chart]

And before you comment on the low FPS, 1920x1200 was like 4k back in the day. Most systems were on 1280x1024 res back then.

That is incorrect, the 8800 GT launched with 512MB of VRAM. The 256MB was a cheap SKU that launched way later, and hardly anybody wanted that model except the cash-strapped who couldn't afford a $200 512 MB model, since those gaming at higher resolutions had been avoiding the 320MB 8800 GTS.
 
That is incorrect, the 8800 GT launched with 512MB of VRAM. The 256MB was a cheap SKU that launched way later, and hardly anybody wanted that model except the cash-strapped who couldn't afford a $200 512 MB model, since those gaming at higher resolutions had been avoiding the 320MB 8800 GTS.
Yep, my bad. Meant to say the 8800gt was 256-bit (as opposed to the 320-bit bus of the earlier cards), but it came with 512mb vram as well as the 256mb.
I've owned the 8800gts 640mb from when it was released, and in most benches for the cards of that period the 320mb was mostly equal with the 640mb (@ 1680x1050). Also, the 320mb was still more vram than the competition had at the time and sufficient for mainstream resolutions of that period. These cards were what caused the huge migration of ATI owners to Nvidia, basically.
 
The ATI 9700 Pro was the first ATI card that made me jump ship from nVidia, having years previously jumped ship from 3DFx. Up to that point, ATI was known for making somewhat good hardware but really bad drivers, never really taking advantage of the performance it was capable of. They were always behind both 3DFx and nVidia. The 9700 Pro was the first ATI card that really leaped to the forefront. Even with the rough drivers at release time, it was still faster than anything on the market at the time. I remember I was one of the first to get it (Platinum edition with t-shirt and extras). Great memories.
 
The ATI 9700 FULLY supported DX 9.0c at the time, and it also came with a voucher for Half-Life 2, which was also fully DX 9.0c. Nvidia at the time defaulted to DX 8.1 and never ever made it to DX 9.0c, which pissed off all the Nvidia fanboys and others........ a HUGE downturn and lie from Nvidia at the time.

Minor correction: The Radeon 9700 (and ATi's R300 line in general) supported DirectX 9.0. 9.0a was a nearly forgotten revision rushed out to give Nvidia's FX series some kind of fighting chance; 9.0b was issued for the second-generation R400 Radeons; and 9.0c was pushed out for the R500 Radeons and the NV40 Geforce parts, the 6000 and 7000 series, for Shader Model 3 support and the initial trickle of GPGPU capability.

Nvidia did support DirectX 9.0 properly, they were just abysmally slow when running at full FP32 precision. They were hoping developers would go to the trouble to make rendering paths optimized for FP16 and FP32, but compared to writing a proper DirectX 9 path that worked fine and a fallback path for older / less performant hardware, that was a big enough hassle to alienate most developers. Given that most devs didn't support an FX-optimized DirectX 9 codepath, and that the line ran DirectX 8.1 pretty well, that was a sane default option. I think you're specifically referencing Half-Life 2, which was Valve's decision and not uncommon across the industry generally. Even with an early, FX-tuned codepath the 5800 Ultra managed a speed less than half that of the Radeon 9700 Pro; I can't blame Valve for divesting themselves of what they thought was a bad investment while still providing a pretty solid experience for FX card owners.
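To make the "default path" idea concrete, here's a minimal C++ sketch of the kind of caps-based render-path selection Direct3D 9 games of that era did at startup. The enum and function names are made up for illustration, and note the wrinkle in the comments: GeForce FX cards actually report SM2.0 caps here, which is why Valve steered them onto the DX8.1 path with a per-card default rather than trusting a check like this alone.

[CODE]
// Minimal sketch of startup render-path selection against the Direct3D 9 caps.
// Enum/function names are illustrative, not from any actual engine.
#include <d3d9.h>

enum RenderPath { PATH_DX7_FIXED, PATH_DX81, PATH_DX9_SM20 };

RenderPath ChooseRenderPath(IDirect3D9* d3d)
{
    D3DCAPS9 caps = {};
    d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps);

    // Full SM2.0 path for cards that report ps_2_0 or better.
    // (GeForce FX reports ps_2_0 here too, which is why Valve overrode the
    // default for those cards instead of relying on the caps check alone.)
    if (caps.PixelShaderVersion >= D3DPS_VERSION(2, 0))
        return PATH_DX9_SM20;

    // DX8/8.1-class hardware (GeForce 3/4 Ti, Radeon 8500) gets the SM1.x path.
    if (caps.PixelShaderVersion >= D3DPS_VERSION(1, 1))
        return PATH_DX81;

    // Everything older falls back to fixed-function rendering.
    return PATH_DX7_FIXED;
}
[/CODE]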
 
Bought my 9700 pro from circuit city back in the day just to play the original far cry for pc. 1024x768 all day.
 
My first custom gaming PC used the Radeon 9700 Pro. It's synonymous with my childhood love of computers. I owned something high tech that just performed amazingly. UT was smooth as butter. Word of mouth spread so fast for that card and it kick-started my career. It was paired with an AMD 2700+ and 256MB of RAM on what I think was an Abit board. All this was built into a custom chassis that was epic.
 
I had an Nvidia 4800 Ti at the time and was playing the fuck out of Unreal Tournament and Return to Castle Wolfenstein MP. I upgraded to the 9800 XT 256 when Half-Life 2 came out and loved that card... Unreal and RTCW-MP were amazing as well. Prolly my fav card of all time. That card finally died a few years ago in my old P4 Extreme system.
 
Yep, my bad. Meant to say the 8800gt was 256-bit (as opposed to the 320-bit bus of the earlier cards), but it came with 512mb vram as well as the 256mb.
I've owned the 8800gts 640mb from when it was released, and in most benches for the cards of that period the 320mb was mostly equal with the 640mb (@ 1680x1050). Also, the 320mb was still more vram than the competition had at the time and sufficient for mainstream resolutions of that period. These cards were what caused the huge migration of ATI owners to Nvidia, basically.
I wonder if the extra VRAM on the GTS would give it the edge over an 8800GT in today's games. I actually have the rare 112SP 8800GTS SSC, which was even faster and right on the heels of the 8800GTX. Come to think of it, I have a 9800GT lying around too. Would be interesting to see.
 

You forgot the 9550 and 9600 series, which were basically half of a 9800.

The 9550 was an interesting card that got a lot of negative press because the public assumed it was a variant of the existing 9500, when in fact it was a cut-down 9800 core with half the pixel pipelines, vertex shaders, TMUs and ROPs. Due to this, it was considerably slower than the former two cards it was designed to replace, and burned quite a few people who were either scammed or didn't do their research on it. While the 9550 was generally a potato, it did have significant overclocking headroom, which could squeeze out some respectable gains in performance. I do have one 9550 with an aftermarket cooler (the original fan seized) and it can do 450 MHz core / 425 MHz RAM (up from the 250/400 stock) with no real trouble. I've had it as high as 480 MHz, but the cooler can't keep up with it at that speed.

I also have both a 9600 and 9800 Pro, which I bought new back in the day. I'm not sure where the 9550 came from though, but I've had it for a long time.
 
Ahh, I have fond memories of my 9500np. Soft-modded it, and I believe it also had a pencil mod to add more voltage. Had a water-cooling plus Swifty peltier setup. Had a lot of fun with that card. Killed it eventually with condensation.
 
You forgot the 9550 and 9600 series, which were basically half of a 9800.

The 9550 was an interesting card that got a lot of negative press because the public assumed it was a variant of the existing 9500, when in fact it was a cut-down 9800 core with half the pixel pipelines, vertex shaders, TMUs and ROPs. Due to this, it was considerably slower than the former two cards it was designed to replace, and burned quite a few people who were either scammed or didn't do their research on it. While the 9550 was generally a potato, it did have significant overclocking headroom, which could squeeze out some respectable gains in performance. I do have one 9550 with an aftermarket cooler (the original fan seized) and it can do 450 MHz core / 425 MHz RAM (up from the 250/400 stock) with no real trouble. I've had it as high as 480 MHz, but the cooler can't keep up with it at that speed.

I also have both a 9600 and 9800 Pro, which I bought new back in the day. I'm not sure where the 9550 came from though, but I've had it for a long time.

The 9550 wasn't bad when it was new - I had one I kept running in an Athlon XP box for years alongside a Voodoo2 SLI. From what I remember it was a Radeon 9600 with a core clock of 250 MHz instead of the 9600's stock 325, but the same RAM speeds as the faster part. One nice side effect was that they could be passively cooled, and were still fast enough to drive early DirectX 9 games at resolutions that were fine on contemporary CRTs. I didn't have many complaints at the time - they'd set you back about as much as a GeforceFX 5200 with 128-bit memory, but would run circles around those cards, and they still had solid VESA 2.0 support for DOS games, which I still cared about back in 2003 or so. One nice perk they inherited as R300 parts was that they didn't support full-scene antialiasing in 16-bit color: the driver just set a 32-bit mode and tricked the application into believing it was using a 16-bit framebuffer. In conjunction with its support for anisotropic filtering, a lot of early 3D games looked INCREDIBLY good and still ran very well, and all on the cheap.
 
For me it was a breakthrough because at the time, 1024 x 768, which is what I believe I was on, pushed all the prior hardware too hard for 4x anti-aliasing. The 9700Pro got the job done. But there was more to it. It was heavily tied to a few AAA titles I believe.
 
For me it was a breakthrough because at the time, 1024 x 768, which is what I believe I was on, pushed all the prior hardware too hard for 4x anti-aliasing. The 9700Pro got the job done. But there was more to it. It was heavily tied to a few AAA titles I believe.
HL2
 
Nvidia did support DirectX 9.0 properly, they were just abysmally slow when running at full FP32 precision. They were hoping developers would go to the trouble to make rendering paths optimized for FP16 and FP32, but compared to writing a proper DirectX 9 path that worked fine and a fallback path for older / less performant hardware, that was a big enough hassle to alienate most developers. Given that most devs didn't support an FX-optimized DirectX 9 codepath, and that the line ran DirectX 8.1 pretty well, that was a sane default option. I think you're specifically referencing Half-Life 2, which was Valve's decision and not uncommon across the industry generally. Even with an early, FX-tuned codepath the 5800 Ultra managed a speed less than half that of the Radeon 9700 Pro; I can't blame Valve for divesting themselves of what they thought was a bad investment while still providing a pretty solid experience for FX card owners.
Valve could have made all the DX9 effects available on DX8, though.
There was even a fan-made path that you could install to have DX9 effects on DX8 cards such as the GeForce3, and the performance hit of the additional effects was actually pretty minimal. It is impossible that Valve did not know they could do this. It probably was a marketing ploy to have HL2 marketed as a DX9 game, and a very advanced one at that.

BTW, for a long time game developers actually did care about GeForce FX performance and games were optimized for it. That is why the FX initially did compete in benchmarks. Not all games though, and of course later games did not care at all, so games like Oblivion were virtually unplayable on FX cards even though they ran just fine on cheaper Radeons.
 
The 9550 wasn't bad when it was new - I had one I kept running in an Athlon XP box for years alongside a Voodoo2 SLI. From what I remember it was a Radeon 9600 with a core clock of 250 MHz instead of the 9600's stock 325, but the same RAM speeds as the faster part. One nice side effect was that they could be passively cooled, and were still fast enough to drive early DirectX 9 games at resolutions that were fine on contemporary CRTs. I didn't have many complaints at the time - they'd set you back about as much as a GeforceFX 5200 with 128-bit memory, but would run circles around those cards, and they still had solid VESA 2.0 support for DOS games, which I still cared about back in 2003 or so. One nice perk they inherited as R300 parts was that they didn't support full-scene antialiasing in 16-bit color: the driver just set a 32-bit mode and tricked the application into believing it was using a 16-bit framebuffer. In conjunction with its support for anisotropic filtering, a lot of early 3D games looked INCREDIBLY good and still ran very well, and all on the cheap.
The 9550 was the most popular card at the time.
I was working at a computer shop back then and these cards sold like hot cakes. Of course a lot of FX 5200s were also sold, but it was always sad when someone deliberately chose the slowest graphics card just to save a few pennies.
The 9550 was also great for overclocking; you could get a hefty performance increase. Also, some cards actually used 9600XT boards and could be overclocked as far as you would a 9600XT. I was actually using such a card and it ran all games great.

As for the R300 family of cards and 16-bit, I am not sure games actually displayed 32-bit (not that you said they did), but 16-bit mode looked great and there was an improvement in dithering versus using 16-bit without AA. I was using 16-bit mode with AA for GTA San Andreas as it ran much, much better than 32-bit mode. If I remember correctly, dithering and reduced precision on Z-buffer calculations led to some minor artifacts on distant objects. But the game ran really much better at 16-bit and still looked great, and I actually liked the dithered look. True 32-bit mode looked smoother but not really better. I tried the game on an FX card and it looked ok in 16-bit, but AA did not really improve it, more like ordinary 16-bit rendering. On the 6x00 cards NV removed dithering in 16-bit completely, so the mode was completely unusable.
 
The 9700 pro was a special card at the time - it enabled new features (DX9/advanced shaders) that allowed a true, noticeable generational leap in games, was really really fast with current games at the time of release, and was reasonably priced for a top-end card. I knew that HL2 was being developed with this card before it came out from interviews with Valve, and shortly after its release there was a leaked version of HL2 that you could "play"; while despicable as an act of sabotage/blackmail, it was basically a really early debug version and more like a tech demo, but it blew minds at the time at what could be done with this level of tech. Only the 9700 pro could run this tech at full fidelity.
The 9800 pro was out by the time HL2 released, I believe, but the 9700 was still a great card, and the 9800 was just a faster 9700 pro for the most part and not another generational leap - in that sense less special. IMHO, this is peak Radeon; there's never been a better card for them, although some are close.
 
Ah, the days when each new card release brought about incredible new features and a lot more performance almost every year. The 9700 Pro was a fantastic card, for sure. It was just an exciting time in general because we were always waiting for ATI and Nvidia to one-up each other.

I didn't ever get the 9700 Pro; I was a kid in high school (with a high school kid's budget) and just beginning to tinker in the high-end PC market. I did however get a Geforce 6800 GT and overclock it to 6800 Ultra levels. Total beast of a card.
 
I didn't get the 9700 when it came out, but I did get the 9500 Pro that came on the 9700 board. I later upgraded it to a 9800 Pro and later a 9800 XT.
 