erek
[H]F Junkie
- Joined: Dec 19, 2005
- Messages: 10,897
I gave the FX 5800 Ultra in for a broken one (that one is now repaired and owned by CoolTweak). Very rare card.
I remember my dustbuster very well.
And later they released the Bulldozer of video cards: the HD 2900 XT. The comment thread for the review here on that one was glorious. Good times.
I was one of the people with the 2900 XT. Gotta say, when mildly overclocked it wasn't nearly as bad as most people made it out to be.
man, i remember this generation VIVIDLY. ati really knocked it out of the park with this one
The thing that people often fail to recognize here is that, like AMD's Athlon and later Athlon 64, ATi's Radeon 9700Pro was the product of the right company and technology acquisition at the right time. ATi bought out a company called ArtX, which gave them the technology they needed to create the Radeon 9700 Pro.
It was still way slower than NVIDIA's offerings, and ATi was caught cheating on the benchmarks. About the only thing that card did better was 3DMark. This is where I realized that 3DMark scores meant virtually nothing, as the variables that mattered to it weren't the ones that necessarily improved game performance.
You do know Nvidia was caught cheating with the FX 5800 Ultra on 3DMark.
Very true. I didn't mean to imply that ATi was the only one doing this; at various times, NVIDIA and ATi have both been guilty of it. I actually edited to be a little more clear.
The FX 5000 series was truly infamous.
I still have my 9800 Pro. Lots of fond memories with that one.
The 9700 Pro was pretty much the best video card I have ever owned. At least nostalgia wise, I have extremely fond memories of that card and playing any game I wanted to play with all the settings cranked.
At the time, this was true. Later on, when larger LCDs hit the market and multi-monitor became a thing, the top-end video card was no longer sufficient without buying them in pairs. Today, we could still use the power of a second card, but multi-GPU isn't supported well enough for that to be a reasonable option.
I was a reliable buyer of Nvidia from the TNT2 up to the FX 5800 series. I switched because the Radeon was better. That and the 5870 were the only AMD GPUs I ever owned.
ah, yes, the 9700 PRO, great card, but it was the card that marked the beginning of the end for ATI.
The X100 and X1000 series were solid product lines as well. It really wasn't a problem until the HD 2000 series and the beginning of the DX10 era, when ATI/AMD began to falter.
I was pretty much the same, starting with the TNT. I bought a new card pretty much every time they launched one for years: TNT, TNT2, GeForce 256, GeForce2, GeForce3, skipped the GeForce4, then jumped to the 9700 Pro. Since the 9700 Pro I stopped buying every launch and have skipped back and forth based on value. Funny thing is, I'm not sure if I stopped buying every launch because of the prices skyrocketing or because I got married! It seems I switched to a best-value strategy after getting hitched. Then this last rotation I insisted on AMD because I was fed up with Nvidia's new Apple-like business strategies.
I remember getting to take a look at one in a box at QuakeCon, after hearing FrgMstr speak
Talk about shitting the bed... just as Nvidia's FX series should have been good GPUs (and they actually were, but developers didn't code for them, imagine that!), the HD 2900 XT should also have been a good GPU.
Between these two generations, I concluded that:
- For DX9, the ATi 9700 Pro was basically the hardware reference. ATi also designed it to use 24-bit precision throughout, so for 32-bit sources they tossed some fidelity, and for 16-bit stuff they carried extra overhead. But they did not do what Nvidia did, which was to run 32-bit precision at half speed. Since developers targeted ATi and shipped 24-bit sources, the FX 5800 essentially ran at half speed, or nearly so.
- For DX10, this flipped: Nvidia essentially had the reference hardware with the 8800 GTX, which was a knockout not just because it ran DX10 pretty well, but because it also ran DX9 very well.
Flipped again in DX11, where Fermi was a furnace.
That is because of the Xbox conflict: Nvidia was hired by Microsoft to design the first Xbox, and Nvidia used the platform to launch their line of nForce motherboards without paying any royalty to Microsoft. In revenge, MS withheld the DX9 specs from Nvidia but disclosed them to ATI, inflicting a financial lesson on Nvidia far more painful than any royalty.
I don't think so. Sure, the FX 5800 was a fiasco, but Nvidia sold gazillions of FX 5600, 5700, and 5900 cards. I loved my FX 5900 (flashed to Ultra).
I had a 5700 that played UT pretty well, paired with an Athlon 64. The 6800 GT that replaced it, though, was beastly in comparison.
Had the same upgrade path with a 5700U to a 6800 GT soft-modded to Ultra clocks. Was blown away with how much faster it was than the little FX.
I went from the FX5900 to a 6800GT. Actually I was quite happy with the FX performance, but the 6800 was a nice jump.
Agreed completely. My 5700 Ultra wasn't a bad card per se, but in HL2 and DOOM 3 the 6800 GT destroyed it.
Wow, really? I didn't know that!