Question about polygon performance

Forealz

I remember back in the day when every graphics card and console was measured by its polygons/sec performance. Lately, however, I never see this figure. I remember the PS2 coming out with a maximum of something like 70M polygons, which was absurd. Anyone know what today's cards are pushing, or why they stopped using this measure of processing power?
 
I believe the marketing buzz changes every couple of generations or so. Back in the old days it was all about "bits", then came "polygons", and now it's "shaders". The PS2 could in theory render 70M polygons per second, but only if the system was doing nothing but pumping out polygons. It would be interesting to know what today's cards can push in terms of polygon performance, though.
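Just to put that peak figure in perspective, here's a rough back-of-the-envelope sketch (Python; the 70M/sec marketing peak and the 60 fps target are assumed inputs, not measured numbers):

# What a theoretical peak polygon rate means per frame.
# Assumed inputs: the ~70M polys/sec marketing peak and a 60 fps target.
PEAK_POLYS_PER_SEC = 70_000_000  # claimed peak (polygons only, nothing else)
FPS = 60                         # assumed target frame rate

polys_per_frame = PEAK_POLYS_PER_SEC / FPS
print(f"Theoretical budget: {polys_per_frame:,.0f} polygons per frame")
# -> ~1,166,667 polys/frame at the absolute peak; once the system also has
#    to texture, light, and run game logic, real scenes come in well below this.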
 
The processing units have become more generic and programmable, and the marketing figures tend to reflect that (shader counts and throughput instead of polygon rates). Plus, Moore's law seems to apply: performance tends to double roughly every 1.5 years.

You might be interested in this: Tech ARP - Graphics Card Comparison Guide -- see the bottom for theoretical figures on different NVIDIA models. If you hunt around, they also have comparisons of technical specs, other brands, etc.

According to that chart, the new GTX 280 is just about 100 times more powerful than the original GeForce. If my dates are right (GeForce 256 in 1999, GTX 280 in 2008), that's roughly nine years, and doubling every 1.5 years would only predict about 64x, so 100x is slightly ahead of Moore's law, assuming the prices were comparable.
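For anyone who wants to check that against Moore's law, a quick sketch (the release dates are my assumption: GeForce 256 in 1999, GTX 280 in 2008):

import math

years = 2008 - 1999                    # assumed release dates
predicted_ratio = 2 ** (years / 1.5)   # doubling every 1.5 years
print(f"Moore's law predicts: {predicted_ratio:.0f}x")  # ~64x

# The chart's ~100x figure implies a slightly faster doubling time:
doubling_time = years / math.log2(100)
print(f"Implied doubling time: {doubling_time:.2f} years")  # ~1.35 years

So 100x over nine years works out to a doubling roughly every 1.35 years, a bit ahead of the usual 1.5-year rule of thumb.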
 