9800 GX2 Pictures and Specs

The GTX should outperform the GX2. If it doesn't I will be FURIOUS.

I do not think it will outperform it. I think all that has changed is the naming convention. The biggest change is the shrink from 80nm to 65nm, the same transition the GTS went through.

The only thing worth waiting for is whatever gets released in mid-2008. Either way, I am happy with mah GTX.
 
I do not think it will outperform it. I think all that has changed is the naming convention. The biggest change is the shrink from 80nm to 65nm, the same transition the GTS went through.

The only thing worth waiting for is whatever gets released in mid-2008. Either way, I am happy with mah GTX.
It was 90nm. ;)
 
The GTX should outperform the GX2. If it doesn't I will be FURIOUS.

Why do you think this is the case? In the past the 7950GX2 was the flagship card. The GX2 has topped the GTX before, and the Ultra topped them all. So why would you think this generation would be any different?
 
Wasn't the GTX Nvidia's new king that dethroned the 7950GX2?
Why shouldn't the new GTX be faster as in the past?
 
Wasn't the GTX Nvidia's new king that dethroned the 7950GX2?
Why shouldn't the new GTX be faster as in the past?

The 8800GTX was the 8800 series, the 8-series, completely different from the 7950GX2 which was the 7-series, which had its own GTX's, the 7800GTX and 7900GTX.

So of course the new generation GTX is going to outperform the old generation flagship, just like the 7950GX2 beat the 6800Ultra, etc.

The 9800GX2 is the 9800 series, so using that same logic, it should beat the current 8800GTX and would beat the 9800 series' own GTX.

It remains to be seen, however, whether the new GTX is going to be a new architecture (which means it could beat a G92-based GX2), or whether it uses a G92-derivative architecture (in which case it probably won't, driver problems aside).
 
The 9800 is a renamed G92 chip (or chips).
256-bit vs. the current 384-bit on the GTX.

More than likely two downclocked 512MB GTS G92s, for heat and power reasons. Two GTSs in SLI should outperform the GX2.
 
So basically anybody with a GTS or above should pass it up, unless you play at like 1920x1200, or with loads of AA at 1600x1200?

~Ibrahim~
 
The 9800GX2 is the 9800 series, so using that same logic, it should beat the current 8800GTX and would beat the 9800 series' own GTX.

Except it's not, really. The 9800GX2 is G92 based, it's not the new technology that a "real" 9800 would bring. Nvidia is getting slippery with the naming here.
 
8900 GX2 pictured

Edit:

Fudzilla and a few others are saying it will be named the 8900GX2

Not the 9800GX2....
 
:mad: If this is going to be the high end, I hope that Nvidia and ATI make sure their SLI/CrossFire drivers stay on par with the single-card solutions. I also hope that there aren't any games that leave SLI/CrossFire users hanging for too long.
 
Except it's not, really. The 9800GX2 is G92 based, it's not the new technology that a "real" 9800 would bring. Nvidia is getting slippery with the naming here.

Technically we don't know if this board is G92 based; no source has said the words "G92 based". It is assumed, however, given the basic specs, that it is.

I still don't believe a G92 based dual GPU design would be called 9800 anything, but it remains to be seen.
 
I'm worried about driver problems like the 7950GX2 had.
Better to wait for the 9800GTX, I think? :confused:
 
Is it my imagination or does it seem like there is always more confusion about Nvidia's upcoming cards than ATI's?

Almost feels like there is someone out there intentionally trying to confuse us. I'm thinking Nvidia is playing mind games with us!
 
Is it my imagination or does it seem like there is always more confusion about Nvidia's upcoming cards than ATI's?

Almost feels like there is someone out there intentionally trying to confuse us. I'm thinking Nvidia is playing mind games with us!

Marketing hype. Who knows, maybe the GX2 is more than 30% faster.
 
All I can say is that I am praying to God that this is the 8900.... Otherwise I'm going to turn into an ATI fan boy. :mad:
 
All I can say is that I am praying to God that this is the 8900.... Otherwise I'm going to turn into an ATI fan boy. :mad:



As others more learned than I have said, I don't give a good god damn what it's called, only about the performance and the price. I just hope it has a black PCB. Oh, and that Thermalright puts out an aftermarket cooling solution that allows higher OCs and lower temps.

Whoever has the best product the day I buy a new card gets my cash. Fuck being a fanboi! I am a fan of technology, period. Well, that and profiting from the technology. :eek:
 
As others more learned than I have said, I don't give a good god damn what it's called, only about the performance and the price. I just hope it has a black PCB. Oh, and that Thermalright puts out an aftermarket cooling solution that allows higher OCs and lower temps.

Whoever has the best product the day I buy a new card gets my cash. Fuck being a fanboi! I am a fan of technology, period. Well, that and profiting from the technology. :eek:
Agreed in a way, but I am just saying a 30% increase should be something you see in a jump from an 8800 to an 8900, not an 8800 to a 9800, and this is the GX2??? And also, they built the 9800 on the same technology as the 8800, so you're a fan of old technology?
 
Is it my imagination or does it seem like there is always more confusion about Nvidia's upcoming cards than ATI's?

Almost feels like there is someone out there intentionally trying to confuse us. I'm thinking Nvidia is playing mind games with us!

It's because Nvidia can hold a secret better than ATI, so there are more people who "speculate" (make shit up) when it comes to Nvidia's side. We knew about the 3850/70 because AMD told everyone about a month before its launch, to try and gain PR, which they did. Nvidia doesn't need PR; it just needs to put out cards that are more kick-ass than the competition, which it has for the past 18 months.
 
All I can say is that I am praying to God that this is the 8900.... Otherwise I'm going to turn into an ATI fan boy. :mad:
They can name it whatever they want; to me, any G92-based card is just an 8850 or at best 8900-series.
The 9800 is, for me, the next card, aka G100 or whatever the chip is.
 
Agreed in a way, but I am just saying a 30% increase should be something you see in a jump from an 8800 to an 8900, not an 8800 to a 9800, and this is the GX2??? And also, they built the 9800 on the same technology as the 8800, so you're a fan of old technology?


It's ok, I understand! :)
 
Agreed in a way, but I am just saying a 30% increase should be something you see in a jump from an 8800 to an 8900, not an 8800 to a 9800, and this is the GX2??? And also, they built the 9800 on the same technology as the 8800, so you're a fan of old technology?

30% makes a difference. That is about a 10 fps difference, and I don't see why people are bitching. What did you all expect?
 
30% makes a difference. That is about a 10 fps difference, and I don't see why people are bitching. What did you all expect?

I believe most of the complaints are valid given the long wait and high expectations. 10 fps increase is disappointing when you take that into account.
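For context, the "30% ≈ 10 fps" figure only holds around a ~33 fps baseline; a quick sketch of the arithmetic (the baseline values here are hypothetical, not benchmark results):

```python
# Back-of-envelope: absolute fps gained from a relative speedup.
# Baseline values below are hypothetical, not benchmark results.
def fps_gain(baseline_fps: float, speedup: float) -> float:
    """Extra frames per second from a fractional speedup."""
    return baseline_fps * speedup

for baseline in (20.0, 33.0, 60.0):
    gain = fps_gain(baseline, 0.30)
    print(f"{baseline:.0f} fps + 30% -> {baseline + gain:.1f} fps (+{gain:.1f})")
```

The same 30% uplift is worth 6 fps at a 20 fps baseline but 18 fps at 60 fps, which is why the absolute gain people feel depends so much on where they start.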
 
I think it is obvious that this card is not aimed at existing 8 series owners. The 8 series line is being refreshed and repriced.

I think the GX2 is more aimed at the small but growing number of users who are looking at bigger, higher-resolution monitors. A 30% performance jump would probably serve them well. I am talking about people who want to play Orange Box, Call of Duty, Unreal Tournament, BioShock, etc. at resolutions well above the standard 1680x1050. 30% doesn't sound like a lot, but if it is the difference between running your 24- or 30-inch monitor's native resolution or not, then it can become a big difference.

But Crysis is a totally different story. I don't know a whole lot about graphics card architecture, but I am guessing that Crysis really needs twice the shader power of what is available now to run like we want it to.

The reason why SLI is being pushed so hard is that it is probably sufficient for 99% of the games on the market right now and in the near future.
 
I want a card that can handle my 30-inch LCD monitor at 2560x1600 with everything on High when playing games.
Question is, which one: 9800GX2 or 9800GTX?
 
30% makes a difference. That is about a 10 fps difference, and I don't see why people are bitching. What did you all expect?

Because you can take an 8800 GTX, toss an aftermarket cooler on it, and OC the thing to get that 30% FPS... and that is at 1/4 the damn price!

So why are we bitching? Because the 8 series saw a 100% increase in FPS over the 7 series, and expectations are that the 9 series must do the same.

A 10fps increase is not going to enhance your overall gaming experience. A 30fps increase is!

Simple!
 
Because you can take an 8800 GTX, toss an aftermarket cooler on it, and OC the thing to get that 30% FPS... and that is at 1/4 the damn price!

So why are we bitching? Because the 8 series saw a 100% increase in FPS over the 7 series, and expectations are that the 9 series must do the same.

A 10fps increase is not going to enhance your overall gaming experience. A 30fps increase is!

Simple!
Very true. I am not usually a complainer, but I'm sorry, if you cannot boost performance by at least 30fps in Crysis, for instance, you should just go back to the drawing board. I game @ 1080p on my Westinghouse. Crysis is unplayable for the most part for me, and I have SLI'd 8800GTXs OCed to 680/2000 on water. Nvidia should have set their sights on being able to hit a 60fps average @ 1920x1200 with everything @ Very High. Anything less is pretty much worthless. Also, I should not have to go SLI to achieve this.
 
I want a card that can handle my 30-inch LCD monitor at 2560x1600 with everything on High when playing games.
Question is, which one: 9800GX2 or 9800GTX?

We won't know until they are released. Anything now is pure speculation.
 
Let's figure it out!

SLI'd 8800GTXs = about 30fps in Crysis at max settings at 1920x1080 (currently what I'm running).

So, the 9800GX2 is supposed to be 40% better (the Ultra being 10% faster than a GTX and the GX2 being 30% faster than an Ultra); yes, I know that is technically 43%. And let's face it, 30fps is the bare minimum.

So 1920x1200 = 2,304,000 pixels and 2560x1600 = 4,096,000 pixels. That is roughly 78% more pixels to render, and thus I would say the GX2 will be so-so for your situation.

The GTX will most likely be the better fit. If you are planning on SLI'ing GX2s, though, then I would say you can expect a playable experience at that resolution.
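The two calculations in this post are easy to verify; note that the 10% and 30% uplift factors are the thread's speculation, not measured numbers:

```python
# Check the compounded speedup and the pixel-count jump discussed above.
# The 1.10 and 1.30 factors are speculative figures from the thread.
ultra_vs_gtx = 1.10          # Ultra assumed ~10% faster than the GTX
gx2_vs_ultra = 1.30          # GX2 assumed ~30% faster than the Ultra
combined = ultra_vs_gtx * gx2_vs_ultra
print(f"GX2 vs GTX: +{combined - 1:.0%}")   # compounds to ~43%, not 40%

def pixels(width: int, height: int) -> int:
    """Total pixels the GPU must render per frame at this resolution."""
    return width * height

lo = pixels(1920, 1200)      # 2,304,000 pixels
hi = pixels(2560, 1600)      # 4,096,000 pixels
print(f"{lo:,} -> {hi:,}: +{(hi - lo) / lo:.0%} more pixels to render")
```

Because percentage speedups compound multiplicatively, and because the pixel load nearly doubles going to 2560x1600, a ~43% faster card does not close the gap on its own.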
 
This card will suck at 2560x1600, remember: 512MB

I'm betting a 768MB GTX will be faster than this card at that resolution in Crysis.
 
Would the 9800GX2 really be a PITA to run in an IP35 board? (Intel, P35 chipset based)

I mean, I've never had an SLI sort of card in a board that doesn't have 2x PCIe x16 slots.
 
As others more learned than I have said, I don't give a good god damn what it's called, only about the performance and the price. I just hope it has a black PCB. Oh, and that Thermalright puts out an aftermarket cooling solution that allows higher OCs and lower temps.

Whoever has the best product the day I buy a new card gets my cash. Fuck being a fanboi! I am a fan of technology, period. Well, that and profiting from the technology. :eek:

If it's a dual-PCB design, how are you going to use aftermarket cooling on it? You will only be able to cool one chip, LOL. This card is laughable IMO; ATI's card is a heck of a lot more impressive with both processors on one PCB. Nvidia needs to go back to the drawing board on this one.
 
Given my gaming rig chews up everything out there... with the SOLE exception of Crysis... (which makes my system get on its knees and ask for its daddy)... I am concluding that A) the problem is Crysis, and B) the only reason I would have to go to one of these bizarre GX2 cards is to play Crysis slightly better, but probably still not perfectly... and let's face it, I finished Crysis already!

Therefore Crysis can bite me for making me play their game at suboptimal settings by releasing it before proper hardware existed.

AND

Nvidia can polish my toilet until they can release a real next gen card.

Q.E.D.
 