9800 GX2 Pictures and Specs

The GTX/Ultra and old GTS are based on the old G80, not the new die-shrink G92, so I'm not at all sure what your point is.

My point was there are multiple bus sizes for the G80 cards, so it's not impossible to have different bus sizes for G92.


True, but there is no way in hell this GX2 card, coming to a Newegg near you, will have a bus that wide...


I want to know more about the 9800GTX that Kyle references in the second article!
 
I don't see the problem with naming it 9800 GX2... ATI did it with the HD 3800 series, and that was the same architecture. Has anybody considered that maybe a new architecture isn't done yet? I think we're lucky Nvidia is giving us this to munch on before they release a new architecture, rather than sticking with the 8800 Ultra as the high end, which they could do.
The 3800 series had some changes that allowed it to be PCI-E 2.0 and DX 10.1 compliant. This GX2 card from Nvidia appears to be nothing more than two G92 aka 8800GT/GTS cards, so there's NO reason for it to be named 9800 series. Nvidia didn't even change the name when going from G80 to G92 core cards, so why would they go to a complete series name change just for sticking two cards together?
 
pooper.

Also, I expect a new record MSRP on this card, and then comes the overpricing. Wouldn't be surprised to see this at $1k+ for a bit.
 
LAME. I don't think the card is bad; I think the whole GX2 idea is kind of lame.

Pretty much guaranteed to be better than the "R680", however. I don't know why Nvidia is calling this a 9800; if it's 128 SP x2, that sounds to me like it's just 2x G92 chips. This is the old shit given a new name. I've got no problem with this card, but it's two old cards packaged together, and it's being sold as "the new shit".

Thank you ATi for starting the trend of relabeling the old stuff.

Kyle, please ask Nvidia for an explanation of the numbering.
 
I got the real specs on the card!!

1. Meh
2. Meh
3. Meh

Seriously, is this a joke?
 
The 3800 series had some changes that allowed it to be PCI-E 2.0 and DX 10.1 compliant. This GX2 card from Nvidia appears to be nothing more than two G92 aka 8800GT/GTS cards, so there's NO reason for it to be named 9800 series. Nvidia didn't even change the name when going from G80 to G92 core cards, so why would they go to a complete series name change just for sticking two cards together?

Well, if this article at DailyTech is correct, the 9600GT is supporting 10.1. I see no reason why the 9800GX2 wouldn't. Nvidia seems to just be doing the same thing that ATI is doing: a small refresh to the architecture, and 10.1 support.
 
LAME. I don't think the card is bad; I think the whole GX2 idea is kind of lame.

Pretty much guaranteed to be better than the "R680", however. I don't know why Nvidia is calling this a 9800; if it's 128 SP x2, that sounds to me like it's just 2x G92 chips. This is the old shit given a new name. I've got no problem with this card, but it's two old cards packaged together, and it's being sold as "the new shit".

Thank you ATi for starting the trend of relabeling the old stuff.

Kyle, please ask Nvidia for an explanation of the numbering.

I think the R680 will rival the current 8800 Ultra in performance, but my guess is it will still be hindered in AA at high resolutions.

None of this information originated with NVIDIA, so I will not ask for any comment, as they do not comment on future products.
 
I doubt Nvidia is relaxing; it's more like pacing themselves. AMD is looking to take somebody's crown, and since it can't be Intel's at the moment, why not Nvidia's? Expect something close between both GPUs. The GTX will come when AMD's R700 shows up ;)
 
Well, if this article at DailyTech is correct, the 9600GT is supporting 10.1. I see no reason why the 9800GX2 wouldn't. Nvidia seems to just be doing the same thing that ATI is doing: a small refresh to the architecture, and 10.1 support.
Well, if this GX2 is just two 8800GT/GTS cards then no, it won't support 10.1. The 9600GT appears to be an architectural change, where this card isn't, but who really knows?
 
I hope that cooler (coolers?) is good enough for the G92... and not weak like the one the 8800GT came with...
 
Wow, so it's really just a 512MB card, because that 1GB is split between the two GPUs.
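Rough math, assuming the usual SLI behavior where each GPU keeps its own full copy of every texture and buffer (that's my assumption for this card, nothing confirmed):

total_memory_mb = 1024                     # advertised total across both GPUs
gpu_count = 2
per_gpu_mb = total_memory_mb // gpu_count  # 512MB physically attached to each GPU

# Each GPU holds its own copy of the working set, so the usable pool for
# game assets is one GPU's share, not the sum of both.
print(f"Advertised: {total_memory_mb}MB, effective: {per_gpu_mb}MB")  # -> 512MB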
 
Bah! Cheesy stopgap if you ask me. If I wanted SLI, which I don't, I would have bought two cards already. Why would I now want the same two cards just glued together instead? So I could go Quad when it's hard enough to justify the expense of SLI? Forget it! Release new cards when there's something new to release.
 
The general consensus on all the forums I have looked over is a big thumbs down on this new 'GX2' crap, that's for sure.

It seems many are aware of the past support issues that Nvidia had/has with the last incarnation of this garbage. :eek:


I really want to know more about the 9800GTX mentioned. How many SPs, and how fast? Core clocks? Bus width, and RAM speeds/types and amounts?
 
My point was there are multiple bus sizes for the G80 cards, so it's not impossible to have different bus sizes for G92.

Look at those numbers in the bus sizes; you might notice a pattern. You might also need to realize there were a number of reasons Nvidia didn't go for 512-bit: for one, it's insanely complicated and therefore expensive. Just look at how AMD moved away from it...
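Spelling the pattern out (the widths are the ones I'm assuming from the retail cards: 8800GT 256-bit, 8800GTS 320-bit, 8800GTX/Ultra 384-bit): they are all multiples of a 64-bit memory channel, and every extra channel means more pins and traces on the PCB.

# Each bus width is a stack of 64-bit memory channels; wider = more channels.
for bus_bits in (256, 320, 384, 512):
    print(f"{bus_bits}-bit = {bus_bits // 64} x 64-bit channels")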
 
lol, how much is this thing supposed to retail for? I like the ATI two-chips-on-one-PCB idea a lot more than this two-PCBs-in-one-box type thing.
 
lol, how much is this thing supposed to retail for? I like the ATI two-chips-on-one-PCB idea a lot more than this two-PCBs-in-one-box type thing.

Yeah, but it allows them to keep the card as compact as possible. The downside to the ATi solution, at least from the pictures I've seen (which may mean nothing), is that it's longer than this.
 
The pictures I saw posted of the R680 show it to be the same length as an Ultra.
 
Or is this all a ploy to confuse ATI? :p


The pictures I saw posted of the R680 show it to be the same length as an Ultra.



Possibly even a hair shorter than the Ultra, and certainly cheaper than this frankenstein.
 
I hope that cooler (coolers?) is good enough for the G92... and not weak like the one the 8800GT came with...

Being that you are a noobie, I won't flame you for spreading misinformation. But the coolers that the GT came with were fine. A slight fan-speed software bug, yes. Weak cooler? No.

Secondly, a new chip process means better yields. Don't forget that. Better yields = more profit, or more simply, more cards that work right to make money off of. So why is Nvidia flooding the market with all these variations? New processes, and also centralization of certain processes. It's all about the bucks; do you consumers really think you are THAT important?
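To see why the shrink matters so much to them, here's a toy model (every number below is made up for illustration, not actual G80/G92 data):

import math

def dies_per_wafer(wafer_diameter_mm, die_area_mm2):
    # Crude gross-die estimate with an edge-loss correction term.
    r = wafer_diameter_mm / 2
    return int(math.pi * r**2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

def poisson_yield(die_area_mm2, defects_per_mm2):
    # Simple Poisson yield model: a smaller die catches fewer defects.
    return math.exp(-die_area_mm2 * defects_per_mm2)

for area in (484, 324):  # a big die vs. its shrink; areas are illustrative
    good = dies_per_wafer(300, area) * poisson_yield(area, 0.002)
    print(f"{area} mm^2: ~{good:.0f} good dies per 300mm wafer")

Shrinking the die gets you more candidates per wafer AND a higher fraction of working ones, which is exactly the "more cards that work right" point.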
 
True on all points, but my only thought was that sometime we've got to start moving beyond single-core solutions (as CPUs did).

Perhaps now is not the best time, given the reaction here... but it's got to happen at some point, right? Maybe when the power consumption drops a bit? :confused:

We moved beyond single-"core" GPUs about four or five years ago. The G70 and G71 have 24 "cores". The X1800, and to some extent the X1900, have 16 "cores". If we're comparing CPUs to GPUs, then an execution core on a CPU is a pipe on a GPU. The reason CPUs took so long to manage 2 "pipes" or "cores" is because they have to do a whole lot more than the factory GPU model of plot a point, map textures, shade, render to 2D, export.

If you're talking about multiple dies and/or packages, then it just isn't cost-effective to run 2 or more. When you run two packages on the PCB you simply have far too much latency to have both compile the same frame the way they would if it was a single die. You have to split the frame logically between the two: split it (SFR) or have each core render every other frame (AFR).
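For anyone who hasn't seen the two modes spelled out, here's a rough sketch of how the work gets divided (pure illustration, not actual driver code; the function names are mine):

def assign_frame_afr(frame_number, gpu_count=2):
    # Alternate Frame Rendering: each GPU owns every Nth whole frame.
    return frame_number % gpu_count

def split_frame_sfr(frame_height, gpu_count=2):
    # Split Frame Rendering: each GPU renders a horizontal slice of one frame.
    rows = frame_height // gpu_count
    return [(gpu, gpu * rows, (gpu + 1) * rows) for gpu in range(gpu_count)]

# With 2 GPUs: frames 0, 2, 4, ... go to GPU 0 and frames 1, 3, 5, ... to GPU 1 (AFR);
# in SFR, a 1200-row frame splits into rows 0-599 (GPU 0) and 600-1199 (GPU 1).
print(assign_frame_afr(5))    # -> 1
print(split_frame_sfr(1200))  # -> [(0, 0, 600), (1, 600, 1200)]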

We have yet to see the effect of the latency of two dies on a single package.

2007 is going to go down in my books as the most stagnant year in recent GPU history. It looks like 1H 08 is going to go down in much the same way. Still, CES ought to have a couple of interesting things in this area for us, if only the R680.

The G92 is an artifact, and I only realized it now.

If ATI's HD2900XT had been the card we were all hoping it would be, performing on par with or better than the 8800 Ultra, the burner under Nvidia's ass would have been turned up a couple of notches. The logical result (based on the last two generations) would have been Nvidia creating a die shrink and renaming their smaller lineup. The 8600 probably would have become the 8700, and the 8800 probably would have become the 8900. Again, under the assumption ATI had answered the call with the R600 and had created a whole decent-looking lineup from midrange to high end that was putting good pressure on NV, the current 8800GT would have been in the prime position to be named the 8900GT, with another revision (a 65nm or even smaller version of G80) taking up the name of 8900GTX/Ultra/GX2, with most of the same properties as this 8900GT: the evolved texture-mapping architecture and probably a bigger memory bus.

But instead ATI couldn't answer the call. Already having started development on the "8900GT", I guess they figured it would still be released, but under a different nomenclature.

The result is we're in a similar situation to the one we were in when the 7900GTO was available. We have two cards which will outperform each other randomly, on a roughly 50/50 basis: do you want more memory throughput (7950GT) or do you want a faster core clock (7900GTO)?...
(Actually, for me it was a moot point, since the X1950XT 256MB could be found for like $250 at that time.) Do you want a card with more texture-mapping horsepower (8800 256-bit) or do you want one with more memory throughput (8800 384-bit)?
 
Look at those numbers in the bus sizes; you might notice a pattern. You might also need to realize there were a number of reasons Nvidia didn't go for 512-bit: for one, it's insanely complicated and therefore expensive. Just look at how AMD moved away from it...

AMD also didn't have a part that could compete with the 8800GTX, and it wasn't making use of its memory bandwidth.

If this part is ~30% or more faster than an Ultra, then it will be able to utilize that extra memory bandwidth, especially if you have them in quad SLI.
 
V5 5500... the last single-card/dual-GPU (SLI) setup worth getting :) [where did the V6 end up?]


FSAA ftw!!




I actually find it a little odd that this new card in SLI (the two onboard) is only 30% faster than a single 8800 Ultra...

So is SLI 8800 Ultra still the king?

They should have just glued two 8800 Ultras together so they could claim it's 50% faster than a single 8800 Ultra... :)
 
I bought the 7950GX2.... I will not make the same mistake twice....

Ah, it wasn't that bad really. They were decent cards and did great at higher resolutions.

Hey, with this new 9800GX2, could you stick three of them on, say, the 780i and run sex-SLI? (sex-SLI = 6, no jokes needed)
 
V5 5500... the last single-card/dual-GPU (SLI) setup worth getting :) [where did the V6 end up?]

FSAA ftw!!
I actually find it a little odd that this new card in SLI (the two onboard) is only 30% faster than a single 8800 Ultra...

So is SLI 8800 Ultra still the king?

They should have just glued two 8800 Ultras together so they could claim it's 50% faster than a single 8800 Ultra... :)

Depending on its size and whether you could triple-SLI it, it might beat out a triple-SLI Ultra setup... but who knows if the driver or bridge support will be there for it :p
 
It's only 30% faster than an Ultra because it's using 256-bit buses and 512MB of RAM on each card.
I'm willing to bet a true 8800GTX or Ultra SLI setup will smoke this thing at high resolutions.
IMHO anyway.
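For what it's worth, here are the back-of-the-envelope numbers behind that hunch (the clocks are my assumptions, roughly the stock 8800GT and Ultra figures):

def bandwidth_gb_s(bus_bits, mem_clock_mhz, ddr_multiplier=2):
    # Memory bandwidth = bus width in bytes x effective data rate.
    return bus_bits / 8 * mem_clock_mhz * ddr_multiplier / 1000

print(bandwidth_gb_s(256, 900))   # ~57.6 GB/s per G92 GPU (256-bit, 900MHz GDDR3)
print(bandwidth_gb_s(384, 1080))  # ~103.7 GB/s for the Ultra (384-bit, 1080MHz GDDR3)

Each G92 GPU would be working with roughly half the Ultra's bandwidth, which is where the high-resolution worry comes from.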
 
How can NVIDIA call this a 9800? I can understand it being the 8800 GX2, but 9800 GX2? I really hope that this doesn't represent what the future 9xxx architecture is. I'm getting so tired of the 8800 stuff now. It's been too long. We need something new.
 
As others have said, this card is a weak offering from nVidia. I just want to add:

A high-clocked G92 (~800MHz, it's doable) with a 384-bit memory bus would have been a much better choice than this GX2, and SLI would have been a MUCH better option, since two cards is the sweet spot vs. three or four.
 
Mmm, glad I got my 8800GT... think I'll just sit back and see how it all turns out by Q3.
 
Disappointing move by Nvidia.
I was hoping we'd get another GTX ripper; looks like we'll have to wait.

I just hope it isn't another overpriced piece of trash lacking driver support...

I suppose time will tell.
 
Hey, let's just put two 8800 GPUs on one card, slap an unimaginative HSF on it, and call it a 9800! 30% is rather disappointing.
 
Kyle, in your opinion, do you not think this pales in comparison to the AMD 3870 X2 with its one-PCB design? At least from an engineering perspective... AMD's card certainly seems more elegant, if nothing else.

I was afraid this would happen. I want a single-GPU solution, not more GX2 crap. If the past is any indication, this will get questionable support at best over time.

It may not be as elegant, but I can almost guarantee that two 8800GTX GPUs shrunk to 65nm will DEMOLISH a 3870 X2 card. I do hope that NVIDIA handles things better than they did with the 7950GX2. If they don't, I'll be pissed off.
 
It may not be as elegant, but I can almost guarantee that two 8800GTX GPUs shrunk to 65nm will DEMOLISH a 3870 X2 card. I do hope that NVIDIA handles things better than they did with the 7950GX2. If they don't, I'll be pissed off.



Strong words for a card that claims only 30% better performance than an Ultra! :)


I don't think Nvidia learned their lesson with the first one, and they will repeat the bad customer support the original saw. I also think this will be a product with a very short shelf life.
 