2900 XTX? Is it?

erek

Is this the 2900 XTX?

https://www.ebay.com/itm/183524453672


When I google "102-b00101-00" I find references that it might be...


GPU Device Id: 0x1002 0x9400 113-B00101-19e 102-B00101-00 R600XTX BIOS GDDR4 796e/1009m (C) 1988-2005, ATI Technologies Inc. ATOMBIOSBK-ATI VER010.039.000.000.023973
 
The concern is whether it's an R600XT / OEM ES or an R600XTX ES.
 
Well, I'll be. I remember hearing about it but didn't realize it never released. But still, $800? :confused: Must be opening a GPU museum; probably gonna cost $25 to get in.


Same, I remember hearing about it back then but never knew it wasn't actually released.
 
Here's the BIOS: https://www.techpowerup.com/vgabios/205389/205389

Here's the GPU-Z: https://www.techpowerup.com/gpuz/details/52af


Apparently it's an ATI Radeon HD 2900 XTX 1GB (A12 revision) prototype.

[Attached photos of the card: IMG_1366 through IMG_1371]
 
Looks more like something from the FireGL line. How could one tell the difference?
 
So did you buy it and then take the pics? Or do you know the buyer?
 
The only difference between the XT and XTX was a slight increase in clock speed.

5% higher core clocks wouldn't have saved this card from second place: the castrated texture units (unchanged from X1900 days) and broken MSAA were what sealed its fate.

[Benchmark chart: 14639.png]
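
For scale, here's a minimal Python sketch of the clock math (assumptions: the commonly cited R600 paper specs of 320 SPs at 2 FLOPs each and 16 TMUs, the XT's ~743 MHz core clock, and the 796 MHz engine clock from the BIOS string quoted earlier in the thread):

Code:
# Toy calculation: how far a core-clock bump alone moves R600's paper specs.
# Unit counts and clocks are assumptions from public specs + the BIOS dump above.

def paper_specs(core_mhz, sps=320, flops_per_sp=2, tmus=16):
    """Theoretical GFLOPS and texture rate (GTexel/s) at a given core clock."""
    return sps * flops_per_sp * core_mhz / 1000.0, tmus * core_mhz / 1000.0

for name, mhz in (("2900 XT @ 743", 743), ("2900 XTX @ 796", 796)):
    gflops, gtex = paper_specs(mhz)
    print(f"{name} MHz: {gflops:5.1f} GFLOPS, {gtex:5.2f} GTexel/s")

print(f"uplift: {796 / 743 - 1:.1%}")  # ~7% across the board

Every unit scales by the same few percent, so a texture- or AA-limited game stays exactly as limited; the bump just nudges an already second-place curve.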
 
5% higher core clocks wouldn't have saved this card from second place: the castrated texture units (unchanged from X1900 days) and broken MSAA were what sealed its fate.

Don't forget about that wonderful 512-bit ring bus the memory used...
Oh lord, that was a bad era for ATI (AMD by that point); they really did fix everything wrong with the HD2000 series with the HD3000 and HD4000 series, though, which were both much better performers.
 
Don't forget about that wonderful 512-bit ring bus the memory used...
Oh lord, that was a bad era for ATI (AMD by that point); they really did fix everything wrong with the HD2000 series with the HD3000 and HD4000 series, though, which were both much better performers.


Actually, the HD 3870 mostly matched the performance of its predecessors, until you turned on AA, and then the smaller memory bus reared its head. They both had the same broken ROPs.

https://www.anandtech.com/show/2376/7

https://www.bit-tech.net/reviews/tech/graphics/sapphire_radeon_hd_3870/1/

Newer, more demanding games like Crysis, Oblivion, and Opposing Fronts all stressed the memory bus to its limits. This card only pulled ahead in less stressful multiplayer titles and console ports.

It was faster in games that stressed the memory less, but they really did nothing for the memory subsystem except revert to its intended external bus width of 256-bit.

The complete gutting of the architecture didn't come until the HD 4870. Then they copied NVIDIA's memory controller design, redesigned the texture units and ROPs, and took advantage of the die shrink to double the shader count.
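
On the bus-width point, a quick bandwidth sketch in Python (the bus widths and data rates are the commonly cited reference specs, so treat the exact figures as approximate):

Code:
# Peak memory bandwidth = bus width (bits) / 8 * data rate (MT/s).
# Data rates below are the commonly cited reference specs (approximate).

def bandwidth_gbs(bus_bits, mtps):
    """Peak bandwidth in GB/s."""
    return bus_bits / 8 * mtps / 1000.0

cards = [
    ("HD 2900 XT (512-bit GDDR3 @ ~1650 MT/s)", 512, 1650),
    ("HD 3870    (256-bit GDDR4 @ ~2250 MT/s)", 256, 2250),
    ("8800 GTX   (384-bit GDDR3 @ ~1800 MT/s)", 384, 1800),
]
for name, bus, rate in cards:
    print(f"{name}: {bandwidth_gbs(bus, rate):6.1f} GB/s")
# ~105.6 / ~72.0 / ~86.4 GB/s

Roughly 105.6 GB/s down to 72.0 GB/s: the 3870 gave back about a third of R600's peak bandwidth, which is exactly what AA workloads hit hardest.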
 
Very cool that it works and all. And yes, the HD2000 series was a dark time for team red. So many things didn't go their way back then: a late response to the green team, performance just wasn't there, power consumption was through the roof, and these cards ran warm. Combine that with Nvidia's absolute grand-slam G80 products. The 8800 GTX is arguably the most legendary Nvidia GPU release.
The HD3000 series mostly improved power efficiency over the HD2000 series. As mentioned, performance was nearly identical. AMD's day of reckoning didn't arrive until the HD4000 series.
 
Actually, the HD 3870 mostly matched the performance of its predecessors, until you turned on AA, and then the smaller memory bus reared its head. They both had the same broken ROPs.

https://www.anandtech.com/show/2376/7

https://www.bit-tech.net/reviews/tech/graphics/sapphire_radeon_hd_3870/1/

Newer, more demanding games like Crysis, Oblivion, and Opposing Fronts all stressed the memory bus to its limits. This card only pulled ahead in less stressful multiplayer titles and console ports.

It was faster in games that stressed the memory less, but they really did nothing for the memory subsystem except revert to its intended external bus width of 256-bit.

The complete gutting of the architecture didn't come until the HD 4870. Then they copied NVIDIA's memory controller design, redesigned the texture units and ROPs, and took advantage of the die shrink to double the shader count.
You know, you're totally right about that, especially on the lack of AA because of it; I didn't realize the memory bus was so extremely gimped on the HD3000 series.
While the GPUs were a bit more powerful, halving the memory bus was a terrible decision: they simply could not compete with NVIDIA's 8 or 9 series GPUs.

I did also manage to find a review from here on one of them as well: https://www.hardocp.com/article/2008/02/25/asus_eah3870_top/1



Funny to think that the "high resolutions" used back in 2008 were still only 1280x1024 and 1920x1200 (16:10, woo!).
Thought we were sooo high tech back then, haha, how things change. :D
 
The only difference between the XT and XTX was a slight increase in clock speed.

5% higher core clocks wouldn't have saved this card from second place: the castrated texture units (unchanged from X1900 days) and broken MSAA were what sealed its fate.


What in the actual fuck were they smoking, thinking they could get away with keeping the texture unit and ROP count the same? Yes, more shaders were definitely the way of the future, but the graphics card needed to perform well NOW while looking forward.

The Radeon 9700 Pro did this perfectly.
 
My first AMD card was a 5750. I was super pleased with that card and had three 24-inch monitors in Eyefinity goodness.
 
You know, you're totally right about that, especially on the lack of AA because of it; I didn't realize the memory bus was so extremely gimped on the HD3000 series.
While the GPUs were a bit more powerful, halving the memory bus was a terrible decision: they simply could not compete with NVIDIA's 8 or 9 series GPUs.

I did also manage to find a review from here on one of them as well: https://www.hardocp.com/article/2008/02/25/asus_eah3870_top/1

Funny to think that the "high resolutions" used back in 2008 were still only 1280x1024 and 1920x1200 (16:10, woo!).
Thought we were sooo high tech back then, haha, how things change. :D

"Looking closely at the image above, some differences are certainly visible, but it's just not that big of a deal."
 
The 1:1 TMU-to-ROP ratio wasn't very forward-thinking, IMO. Texture filtering performance was dismal, and it was speculated that AA was handled by the shaders rather than the ROPs. AA performance suffered on R600 because shader performance wasn't great: ATI took 64 five-way shader units and marketed them as 320 stream processors, but they weren't scalar like G80's and couldn't process instructions on individual threads like NVIDIA's GPU.

I've assembled a number of older benchmarks on an older C2Q test bed, and it's shocking how badly the 8800 GTX beats the 2900 XT 512, 2900 XT 1GB, and 2900 XTX 1GB (756 MHz).
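
To make the scalar-vs-VLIW point concrete, here's a toy Python model (purely illustrative: R600's 64 x 5 layout and G80's 128 scalar SPs are the public specs, and the slot-fill values are made up to show the trend):

Code:
# R600's "320 stream processors" = 64 units, each issuing one 5-wide VLIW
# instruction word per clock. If the compiler can only fill N of the 5 slots
# from a thread's independent ops, the remaining lanes idle.
# Purely illustrative; not a simulator.

R600_UNITS, VLIW_WIDTH = 64, 5   # 64 x 5 = the marketed 320 SPs
G80_SCALAR_SPS = 128             # G80: 128 scalar SPs (at a much higher clock)

def r600_busy_lanes(filled_slots):
    """ALU lanes actually doing work per clock at a given slot fill."""
    return R600_UNITS * min(filled_slots, VLIW_WIDTH)

for fill in (5, 4, 3, 2, 1):
    print(f"{fill}/5 slots filled -> {r600_busy_lanes(fill):3d} busy lanes "
          f"(G80's {G80_SCALAR_SPS} scalar SPs stay busy regardless)")

At 5/5 you get the marketing number (320); scalar-heavy shader code drops R600 toward 64 usable lanes per clock, which is part of why the paper specs never showed up in games.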
 
The 1:1 TMU-to-ROP ratio wasn't very forward-thinking, IMO. Texture filtering performance was dismal, and it was speculated that AA was handled by the shaders rather than the ROPs. AA performance suffered on R600 because shader performance wasn't great: ATI took 64 five-way shader units and marketed them as 320 stream processors, but they weren't scalar like G80's and couldn't process instructions on individual threads like NVIDIA's GPU.

I've assembled a number of older benchmarks on an older C2Q test bed, and it's shocking how badly the 8800 GTX beats the 2900 XT 512, 2900 XT 1GB, and 2900 XTX 1GB (756 MHz).
It would be very interesting to include an X1900XTX or X1950XTX to compare ATI's previous-gen flagship to the HD2900XT. Measuring the performance gain from one generation to the next is almost as important as comparing competing products of the same generation. With the CPU bottleneck removed, the 8800GTX was up to, and sometimes more than, 100% faster than the 7900GTX.


Think about that for a minute. When was the last time Nvidia (or AMD) released a new flagship that was 100% faster than the previous one? Even the beastly 1080 Ti is nowhere near 100% faster than the 980 Ti.
 

Attachments: quake4_2048_1536.gif (42.4 KB)
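
The "100% faster" framing above is just new/old - 1; a trivial Python sketch (the fps figures below are hypothetical placeholders, not measured results):

Code:
# Generational speedup = new/old - 1.
def gen_speedup(new_fps, old_fps):
    return new_fps / old_fps - 1

# Hypothetical example: a GPU-bound scene at 30 fps on a 7900 GTX and
# 61 fps on an 8800 GTX would be:
print(f"{gen_speedup(61, 30):.0%} faster")  # 103% -- "more than 100%"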
It would be very interesting to include an X1900XTX or X1950XTX to compare ATI's previous-gen flagship to the HD2900XT. Measuring the performance gain from one generation to the next is almost as important as comparing competing products of the same generation. With the CPU bottleneck removed, the 8800GTX was up to, and sometimes more than, 100% faster than the 7900GTX.


Think about that for a minute. When was the last time Nvidia (or AMD) released a new flagship that was 100% faster than the previous one? Even the beastly 1080 Ti is nowhere near 100% faster than the 980 Ti.


Now I might have to do a video on that too :) . As far as a 100% gain from gen to gen, it's been since the 8800. I will say the 6800 U did that sometimes vs. the 5950 U, but it varies greatly from game to game. The 7800 GTX was much faster in SM3 vs. the 6800 U, but not 100% faster overall, if I remember correctly.
 
Now I might have to do a video on that too :) . As far as a 100% gain from gen to gen, it's been since the 8800. I will say the 6800 U did that sometimes vs. the 5950 U, but it varies greatly from game to game. The 7800 GTX was much faster in SM3 vs. the 6800 U, but not 100% faster overall, if I remember correctly.

Would love to see a video on it. Actually, I don't think there's a video you've published yet that I don't thoroughly enjoy. And I believe you're right on the 6800 Ultra vs. FX 5950 Ultra. The 7800 GTX might be in the same elite group, especially if comparing the 512MB 7800 GTX.
 
So, curious: what exactly is the difference between the 2900 XT and XTX? I watched the f2f video, and they seemed to perform identically at a given clock speed.
 
So, curious: what exactly is the difference between the 2900 XT and XTX? I watched the f2f video, and they seemed to perform identically at a given clock speed.

It's the same core. The XTX was supposed to clock high enough to compete with the 8800 GTX (to make up for its poor design), but that never happened.

It's just like the X700 XT: there were hand-picked samples sent to reviewers, but they could never get the clocks of that chip reliably above 425 MHz.

The X700 XT was a whole lot closer to being a product than the 2900 XTX ever was. At least they got samples available in quantity at target clocks.

When you can't beat your competition, you try whatever you can to work the problem. Unfortunately, new silicon takes a long time.
 
I have an MSI Diamond HD 2600 XT (512MB GDDR5), and I was gaming with it like 4 or 5 years ago, maybe less. Obviously limited, but it can still handle a number of titles. It wasn't a 1080p widescreen, though; it was 1600x1200.
 
Dammit, I need to be on the lookout for the 2900 XTX prototypes now... lol. I only have a pair of X1950 XTXs.

[Attached photos of the X1950 XTX pair: rSftUINl.jpg, vI7LiiRl.jpg, 1FDeWG3l.jpg]
 
I've got a 1950 XTX as well. It might be one of my favorite cards, between the noise level, form factor, and build quality.
 