9800 GX2 Pictures and Specs

I should be within the step-up period from my 8800GTX, but if I have to pay for the upgrade I probably won't take it... it just seems like a dud to me.
 
Haven't they learned? The 7950GX2 was a mess, and they throw us this? I hope to be proven wrong, but I don't expect much. I want a real G100 (or whatever NVIDIA's new high-end is called), not two G80s (or G92s) packed together!
 
We suspected Nvidia would answer ATi's 3870 GX2. Last rumor I heard, ATi's was looking at 15% faster than an Ultra.
 
Thanks for the sneak peek [H]. :)

Very disappointed in Nvidia.

I guess I won't have to worry about buying an 8800GT now though.
 
It would be nice to find out that this card is not truly being released as Nvidia's high-end, next-gen card for the near term. Maybe it will land close to the release of the newer, true 9-series high-end part. I have read rumors, whatever they really mean, that February will be the month for the new part. If this 9800 GX2 really is what they consider the next high-end part, I, along with others, sure hope ATI/AMD come out swinging. I will be all over it if they can.
 
I love the way they went with 7950GT / 7950GX2 => 8800GT / 9800GX2; my next upgrade would be the card in the same price range as the 8800GT :D A lot of people want a 7900GTX => 8800GTX kind of jump, but I would rather stick with a 7950GT => 8800GT kind of upgrade.
 
I like it. They shrunk the die size, which is a good test of the new technology, since the die shrink is what usually allows for all the power. I think they have more chance of getting the 9800 cards right with a test 9800GX2 like this, and it gives people a good, cheap 8800 SLI solution if it sells at a price anything close to the 7950GX2's (I owned one, and it was a very good card, pretty much on par with the 8800GTS released later in the 8xxx series).

Hopefully after this we'll see some proper high-end 9800 parts.
 
Makes sense why Nvidia has released quad SLI drivers. I thought they were being nice and finally supporting those who had troubles with the 7950 GX2...
 
Because of the quoted RAM amount on the 'card': 1GB.

I wonder if it'll use GDDR3 or 4?

So why not a 512-bit bus then?

This is based on the G92, though, which so far has been 256-bit.

And I don't think the low-end 8x00 parts were based on the G80.

Well, there was still 384-bit (GTX/Ultra) and 320-bit (GTS).

Also, it might use GDDR5, since Samsung (?) is producing that now. It's supposed to be a lot better, from what I recall of a news post a few months ago, so maybe it would help with the smaller bus if there is one.
 
Meh, not all that impressed. Now if nVidia were to figure out how to make two GPUs act as one so there wouldn't be any SLi related issues (read: lack of support) this type of card might be worth it, but until such a time I won't be considering an upgrade before true "next gen" GPUs are out...
 
So why not a 512-bit bus then?

Well, there was still 384-bit (GTX/Ultra) and 320-bit (GTS).

Also, it might use GDDR5, since Samsung (?) is producing that now. It's supposed to be a lot better, from what I recall of a news post a few months ago, so maybe it would help with the smaller bus if there is one.

A 512-bit bus on two GPUs would be a massive cost; that's why people see the issue there. And then who knows whether the G92 is even designed to scale to that.

The GTX/Ultra and old GTS are based on the old G80, not the new die-shrink G92, so I'm not at all sure what your point is.

GDDR5 is a bit of a way out; most GPUs these days are hardly even using GDDR4, so again, GDDR5 would be an extreme cost in the quantities they're talking about.
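The bus-width vs. memory-speed trade-off being argued here is just bandwidth arithmetic: a narrower bus with faster memory can match a wider bus with slower memory. A quick sketch (the clock figures below are illustrative assumptions, not confirmed specs for any of these cards):

```python
# Peak theoretical memory bandwidth = bus width (bits) / 8 * effective data rate (GT/s).
# All figures below are illustrative assumptions, not confirmed 9800 GX2 specs.

def bandwidth_gbs(bus_width_bits: int, effective_rate_gtps: float) -> float:
    """Peak theoretical bandwidth in GB/s."""
    return bus_width_bits / 8 * effective_rate_gtps

# A hypothetical 512-bit bus with 1.6 GT/s GDDR3-class memory...
wide_slow = bandwidth_gbs(512, 1.6)
# ...delivers the same peak bandwidth as a 256-bit bus with 3.2 GT/s (GDDR4/5-class) memory.
narrow_fast = bandwidth_gbs(256, 3.2)

print(wide_slow, narrow_fast)  # both 102.4 GB/s
```

This is why faster memory on a 256-bit bus could plausibly stand in for a much more expensive 512-bit PCB layout.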
 
Price will be what matters now that Nvidia has their underwhelming next gen card in the works.

My opinion: at $400-$450 it will sell well for them; at $450+ it's just a dumb "upgrade" for people who already have an Ultra, or a GTX clocked to Ultra speed.

Something seems off about Kyle's info here. No offence, Kyle, but this is a really weak offering, even from the undisputed high-end champ at the moment. If Nvidia isn't interested in raising the bar until ATI gets a competitive part, then why not just stay with the Ultra, or release a single-PCB G92 variant that slightly beats the Ultra? It would seem that that would be cheaper than this... thing...
 
I was looking forward to putting together a new gaming rig this fall to run 2560x1600, but I won't be using this 'SLI on a card'. Looks like I'll be waiting till 2009 :(
Come on, ATI/AMD! Get some performance parts out and slap Intel/Nvidia.
Intel and Nvidia are really spoon-feeding their technology with AMD in such a state. :(
 
Nice, we got a card that requires an SLI motherboard and only works on games with "certain profiles"... WONDERFUL
 
Something seems off about Kyle's info here. No offence, Kyle, but this is a really weak offering, even from the undisputed high-end champ at the moment. If Nvidia isn't interested in raising the bar until ATI gets a competitive part, then why not just stay with the Ultra, or release a single-PCB G92 variant that slightly beats the Ultra? It would seem that that would be cheaper than this... thing...

Welcome to the world of building integrated circuits, wafer supply, die shrinks, process transitions, and making more money per square millimeter. ;) If you have better information than mine, I would love to hear it.
 
What a GHEY friggin' card. I'll be waiting for the next GTX to replace MY GTX. Dangit nV!

Talk about dragging their asses - sheesh.
 
The GTX/Ultra and old GTS are based on the old G80, not the new die-shrink G92, so I'm not at all sure what your point is.

My point was that there are multiple bus sizes for the G80 cards, so it's not impossible to have different bus sizes for the G92.
 
I'm disappointed as well.

Also, I never figured they would use the 9800 numbering.
 
I've been waiting patiently for a good single-card SLI solution. But unless this card can outperform a pair of 8800 GTX Ultras in SLI by a reasonable margin (10 to 20%, depending on the game), it won't be for me.

As far as the naming goes, I believe it should be labeled an 8800GX2, not a 9800GX2.
 
Bah, I was hoping for a single card like the Ultra, only faster. That would have been better; the last GX2 they did was a flop.
 
Yep, can't say I'm at all excited: probably 80%+ more power and more heat for only a 30%-ish performance gain over the current Ultra. And then there are all the SLI-type issues, boo.

Looks like the rumor mill got it right again.
 
Wow, what a load of crap.

I have been waiting to upgrade my 7800 GT for a few months now, ever since I first heard rumors of a new flagship card coming in November (later changed to Q1 '08), and THIS is what they throw at us? A card that performs a mere 30% better than one released a year ago? Disappointing, to say the least.

ATI really has to get in the game asap and get the competition going.
 
I realize a lot are disappointed by GX2 revisions, but does anyone think this is where graphics processing is going?

I mean, how much better can we do above current GTX/Ultra technology? Not that I'm denying progress, but sometimes progress comes from unexpected sources/methods...

Seems like a baby step (albeit an annoying one for us) in the right direction. With a little time and support, I can see these multi-core solutions becoming a staple for enthusiasts.
 
My point was that there are multiple bus sizes for the G80 cards, so it's not impossible to have different bus sizes for the G92.

The "G80 cards" are actually 3 different cores: G80 is 384- or 320-bit, G84 is 128-bit, and G86 is 64- or 128-bit.

G92 is 256- or 192-bit.

Source.

Anyways, this card looks like it'll be pretty terrible, honestly. Given nvidia's track record with GX2 cards, it will not end well. Still, people will likely buy it because the number is bigger, so I guess it works out for nvidia.

Steve said:
If two GeForce 8800 GPUs on a dual PCB, single video card doesn’t make your heart flutter, the fact that these will most likely support quad-SLI surely will.
Maybe not my heart, but at least my power supply. Three Ultras draw 800W+ from the wall at peak... this will be lower power draw per PCB than an Ultra, but there are still four of them. I wouldn't be surprised to see this hit 900W of power draw.
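That 900W guess can be roughed out with simple arithmetic. A sketch, where only the 800W figure comes from the post above; the per-GPU wattages are my own assumptions for illustration:

```python
# Back-of-the-envelope quad-SLI power estimate. The 800 W figure is from the
# post above (three Ultras at the wall, peak); per-GPU wattages are assumptions.

ultra_system_peak_w = 800        # measured wall draw for a 3-Ultra system (per post)
per_ultra_w = 175                # assumed wall draw attributable to one Ultra GPU
system_base_w = ultra_system_peak_w - 3 * per_ultra_w   # rest of the system: 275 W

per_gx2_gpu_w = 150              # assume each GX2 GPU draws a bit less than an Ultra
quad_sli_estimate = system_base_w + 4 * per_gx2_gpu_w   # 275 + 600

print(quad_sli_estimate)  # 875 W, in the ballpark of the 900 W guess
```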
 
I realize a lot are disappointed by GX2 revisions, but does anyone think this is where graphics processing is going?

The average high-end gamer will not accept 400+ watts of power just to go SLI or CF to play the latest games. True enthusiasts will find it acceptable, but they make up a very small percentage of potential buyers.

AMD and Nvidia are getting it wrong pushing CF and SLI so heavily.


This BS seems lazy to me.
 
Looks like grabbing a gts a year ago was the right move to make. I'll consider upgrading when the next-gen stuff hits, but certainly not for this.
 
I don't see the problem with naming it 9800 GX2... ATI did it with the HD 3800 series, and that was the same architecture. Has anybody considered that maybe a new architecture just isn't done yet? I think we're lucky Nvidia is giving us this to munch on before they release a new architecture, rather than sticking with the 8800 Ultra as the high end, which they could do.
 
The average high-end gamer will not accept 400+ watts of power just to go SLI or CF to play the latest games. True enthusiasts will find it acceptable, but they make up a very small percentage of potential buyers.

True on all points, but my only thought was that at some point we've got to move beyond single-core solutions (as CPUs did).

Perhaps now is not the best time, given the reaction here... but it's got to happen eventually, right? Maybe when the power consumption drops a bit? :confused:
 