Who said there was ever going to be an 8900?
It would be wise for Nvidia not to worry about a refresh and to concentrate all efforts on the G92, or whatever will probably be known as the GeForce 9800.
I remember reading about the 8900s back in like February: a GTX, GTS, GT, and an 8950 GX2.
The HD 2900XT sucked. Sorry all you ATi fanboys.
-It doesn't do AA well
-It is slower than the competition's cheaper card
-It runs hot and sucks energy
Did Nvidia put the 8900 series of cards on hold? I haven't heard anything about them in a while. Is Nvidia holding them back because the 2900 isn't faster than the 8800 GTS?
The original ATI Radeon "sucked" compared to the competition.
The GeForce FX 5800 Ultra "sucked" compared to the competition.
The Radeon HD 2900XT does not "suck". It's as fast as the second-tier card from its competitor (the 8800 GTS), and it's priced at that tier. Both of the above examples of cards that DID actually "suck" failed to compete with the second-tier card from their competitor, and were priced to target the top end.
Yes, we would all have liked the HD 2900XT to be a better part...but that doesn't mean it is a BAD part...just not the one we wanted.
There's still hope that revisions and a move to 65nm can save the 2900XT and make it a better product. The question, of course, is whether that's even worth it if the next-gen Nvidia card is less than six months away.
Unless ATI follows AMD's example by cutting prices drastically on its new cards, Nvidia has no reason to rush to release a new series, or to cut prices on the 8800s.
The only place I've ever heard about the 8900 is the Inquirer, and I've been doubting its existence all this time.
I seriously doubt that a 65nm part alone would make that much of a difference. The HD 2400 and HD 2600 cards are 65nm and they aren't any better than their competitors either.
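Some rough napkin math on why a shrink alone may not move the needle much: dynamic CMOS power scales roughly as P ∝ C·V²·f, so if voltage and clocks stay put, a die shrink mostly just trims switching capacitance. The scaling factors below are illustrative assumptions, not real R600 or 65nm-process numbers:

```python
# Back-of-envelope dynamic-power scaling: P ~ C * V^2 * f.
# All factors below are illustrative assumptions, not measured specs.

def rel_dynamic_power(c_scale, v_scale, f_scale):
    """Relative dynamic power after scaling capacitance, voltage, and clock."""
    return c_scale * v_scale**2 * f_scale

# Shrink alone: assume ~20% less switching capacitance, same voltage and clock.
shrink_only = rel_dynamic_power(0.8, 1.0, 1.0)      # ~0.80 of original power

# Spend that headroom on a ~25% clock bump instead, and you're back where
# you started on power draw.
shrink_plus_clock = rel_dynamic_power(0.8, 1.0, 1.25)  # ~1.0
```

The point being: a shrink buys headroom, but it only becomes a faster *and* cooler card if the design changes too, not just the process.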
I have seen a similar statement twice now today, and I have to say (again) that I think this is completely wrong. There is at least one compelling reason for NVIDIA to release newer high-end cards: people like me who currently have no upgrade path have no reason to buy any more NVIDIA products. New cards mean I'll open my wallet again and buy something else.
Plus there is a second reason: marketing. Releasing new products builds confidence in the company and a positive reputation. NVIDIA probably doesn't really make money on high-end video cards, but by providing a flagship part that is better than anything the competition has, it gives Joe Sixpack the perception that NVIDIA's products are better than ATI's, even if that's not true.
We certainly know that the lineup in the lower and mid range can look quite different from the flagship part in terms of cost-to-performance ratios. Often we've seen NVIDIA rule the roost at certain price points and ATI at others. But what your average computer shopper hears is that NVIDIA has the fastest cards, and that's what they should look for, even if NVIDIA really only has one faster card at the high end. Sometimes the shopper's information will even be a generation or two out of date. So it's important to keep the lead as often as possible.
forget that, I'm holding out for the 10800 series
specifically the 10850XTX LE PE GTO GX8 8096mb with Hyper-Knitting technology
Someone already pointed out that when they move to the smaller process, they could have changed the layout to fix leakage, failed portions of the core (i.e., parts not working), etc., and hopefully make it a great contender.
Oh, and I seriously doubt Nvidia could do an 8800 GX2 version on this generation of cards, unless they underclocked it to a 500 MHz core speed, and what the hell would be the point of that?
ATI has mentioned to us that the Radeon HD 2900 XT should be a great overclocker, under the right conditions nearing the 1 GHz clock frequency mark. We are certainly not there with these two video cards currently. Raising the clock frequencies does positively impact the performance of these video cards. We saw positive results in 3DMark06, with much improved pixel shader and vertex shader performance when overclocked. If these video cards could reach the 1 GHz barrier, they certainly might be a force to contend with. As it is right now, however, we seem to be hitting a brick wall at default voltages and thermal configuration.
On the subject of a theoretical 7950 GX2-type card based on dual G80 cores, I think the power consumption of such a card would be too high, and the thermal solution might have to be excessively large or ridiculous in some way to make it work. Then again, no one ever imagined two 7900 GTX GPUs on one card either, and it happened.
I'm fairly certain that if a GX2-type card were made based on the current 8800 GTX, it would set some kind of record for power use, heat, and size all in one package.
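For scale, the same kind of napkin math applies to a hypothetical dual-G80 board. The per-GPU wattage below is an assumed round figure for illustration, not an official spec:

```python
# Illustrative estimate only: assumed figures, not official G80 specs.
PER_GPU_LOAD_W = 150        # assumed load draw for one G80-class GPU
SHARED_BOARD_FACTOR = 0.9   # assume modest savings from underclocking/shared parts

dual_g80_estimate = 2 * PER_GPU_LOAD_W * SHARED_BOARD_FACTOR
print(dual_g80_estimate)    # 270.0
```

Even with generous assumptions, roughly doubling a single high-end card's draw is what pushes the cooling and board size into "ridiculous" territory.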
I wish they would give more info about the G90 or G92 or whatever the next high-end card is. I want to know if it's worth waiting for. I also want to know about Nvidia's new motherboards coming out with PCIe 2.0.
They could certainly clock the R600 a lot higher, which is what some ATI engineers seemed to hint at as the reason it was lackluster. I remember someone from ATI saying that they had hoped for much higher clocks at release but had no choice.