9800 GX2 Pictures and Specs

Anyone else notice that odd numbered Nvidia releases seem to suck?
TNT, Geforce (buggy), Geforce 3 (late and short lived), 5000 series (no words), 7000 series (plagued by ram failures), and now the 9xxx series?

I've bought only even numbered series (with exception of 7800 and on third replacement for bad ram). TNT2 Ultra, Geforce4, 6800GT, 8800GTS and have been happy with them all. I'll be waiting this round out. Bring on the 10000 series.

My 8800GTS purchase made in Nov. '06 sure has had legs.

I can certainly understand why nVidia has released this kind of product. It's a good way to clear out inventories of low-speed-binned G92 GPUs ahead of the true next-gen release.

Now ATI needs to counter with a dual PCB 3870 x4...LOL
 
I can't agree with you. I've had some issues out of even-numbered cards, specifically the Ti4600s and Ti4200s.

Riva TNT2 Ultra -No problems.
GeForce 2 GTS 64MB -No problems.
GeForce 3 Ti500 -No problems.
GeForce 4 Ti4200 -Company I worked for burned up 32 of these in less than a year.
GeForce 4 Ti4600 -Two failures.
GeForce FX 5200 -Purchased for a server, no problems.
GeForce 6800GT AGP -No problems.
GeForce 6800GT x2 SLI -No problems. Overclocked to Ultra speeds all their lives.
GeForce 7800GTX x2 SLI -No problems.
GeForce 7900GTX x2 SLI -No problems.
GeForce 8800GTS 320MB -No problems.
GeForce 8800GTX x3 3-Way SLI -No problems.

So with the exception of the GeForce 4 Ti series I haven't had any problems at all. I burned up one card in like a week and took it back to the retailer. The one I got after that burned up more than a year later and I replaced it with an ATI Radeon 9600Pro. Back then I couldn't afford the 9800Pro.
 
I wasn't only talking about failure rates...more about the impact on the market and the overall performance/enhancements that differentiated each generation from the previous one.

The TNT struggled to compete with Voodoo, but the TNT2 pretty much took Voodoo out.
The original GeForce was kinda buggy, but what do you expect for the first GPU...even so, the GeForce 2 was a significant improvement.
The GeForce 3 hadn't been out what, 3 months, before it was eclipsed by the 4000 series. I had a Ti4400 and it was an excellent upgrade and good bang for the buck.
The 5200 series might have been reliable but oh man, the performance sucked.
The 6800 cards were another jump and, with the exception of some dishonesty on hardware video acceleration support, were great cards.
The 7000 series didn't offer a vast jump in performance over the 6000 series, and my experience has been poor reliability-wise. I'm on my third Go 7800GTX, and frankly, I'm running it underclocked just so I don't have to worry about sending my XPS off for a third time.
The 8000 series was another jump, equal IMO to the one from the 5000 to the 6000 series.
So far the 9000 series looks to be about on par with the GF2->GF3 step or the 6800->7800 step.
 
I wasn't only talking about failure rates...more about the impact on the market and the overall performance/enhancements that differentiated each generation from the previous one.

The TNT struggled to compete with Voodoo, but the TNT2 pretty much took Voodoo out.

I'm talking about the TNT2 Ultra, which was the big dawg of the day that provided the best performance and image quality of the time.

The original GeForce was kinda buggy, but what do you expect for the first GPU...even so, the GeForce 2 was a significant improvement.

I never had an original GeForce card. I was actually still trucking along with my TNT2 Ultra at that time. I didn't have the funds to buy a new video card every six months to a year.

The GeForce 3 hadn't been out what, 3 months, before it was eclipsed by the 4000 series. I had a Ti4400 and it was an excellent upgrade and good bang for the buck.

No, the GeForce 3 had a run of about 3 months and then the GeForce 3 Ti 500 came out and replaced it. So the total GeForce 3 run was six months, if I am not mistaken. The GeForce 4 Ti 4600 was excellent aside from the fact that some brands had terrible voltage regulators or faulty capacitors. I was struck by the faulty voltage thing several times at home and at work.

The 5200 series might have been reliable but oh man, the performance sucked.

The GeForce 5200 sucked balls. It was worse than the card it replaced. The whole FX line was crap and even NVIDIA knows it. Or you could look at it this way: The FX line wasn't really all that bad but in contrast the ATI Radeon 9000 series was just THAT good.

The 6800 cards were another jump and, with the exception of some dishonesty on hardware video acceleration support, were great cards.

Yeah, the PureVideo thing didn't work out as well as they had hoped. The later mid-range cards fixed this issue, but generally speaking the GeForce 6 series was great. They also brought us a reason to go PCIe on the high end, and that was SLI.

The 7000 series didn't offer a vast jump in performance over the 6000 series, and my experience has been poor reliability-wise. I'm on my third Go 7800GTX, and frankly, I'm running it underclocked just so I don't have to worry about sending my XPS off for a third time.

The jump from the 6800Ultra to the 7800GTX was pretty large as I recall. Not quite the same as the GeForce FX 5950Ultra to the 6800Ultra, but similar. The real disappointment in the 7-series came from the poor incremental upgrades. The 7800GTX 256MB to the 7800GTX 512MB wasn't really worth the cost of upgrading. The jump from the 7800GTX 512MB to the 7900GTX wasn't huge either. Granted, the difference between the 7800GTX 256MB and the 7900GTX was worthwhile for some, but that in-between step was almost useless.

The 7900GX2 was an OEM-only solution, which was a big fuck you to the DIY and enthusiast communities, and of course the length and power requirements were bullshit and SLI'ing two of them wasn't an option until much later. The 7950GX2 came out and was better than the 7900GX2 in that it was smaller, slightly faster, and used less power, but originally couldn't be SLI'ed due to poor drivers, same as the 7900GX2 couldn't be paired up in SLI. For three months the 7950GX2 was slower than the 7900GTX SLI setups that many of us had. Then they released Quad SLI 3 months before the release of the 8-Series. Even then, the poor drivers and terrible Quad-SLI scaling made it a poor purchasing decision in my opinion. NVIDIA continued to screw the 7950GX2 crowd over in Vista for some time.

The 8000 series was another jump, equal IMO to the one from the 5000 to the 6000 series.
So far the 9000 series looks to be about on par with the GF2->GF3 step or the 6800->7800 step.

The 8-Series was a huge jump and worthwhile for anyone. While not always faster than the 7950GX2 Quad-SLI setup, image quality was better, as were power usage, heat output, and noise. Not to mention you could get a slot back going from two 7950GX2's to one 8800GTX, though I suspect most Quad-SLI users went to dual 8800GTX's, which of course was a massive increase over the 7950GX2 Quad-SLI setup.

The 8800 Ultra was almost insulting, as the cost was ridiculous and all it really offered was better overclocking. Stock for stock it was a small increase over the 8800GTX. I think the 8800Ultra was released for marketing reasons and little else. Then NVIDIA got confusing with 8800GT's and a new 8800GTS with less memory. Still, the entire 8-series, at least the 8800's, are all stellar cards and can be had at a surprising number of price points. Truly an exceptional generation. Though it seems that for the moment the 9-Series may be underwhelming, with competition like ATI's been putting out lately I wouldn't be surprised if NVIDIA half-assed the 9-Series and still manages to stay ahead of ATI.
 
Great summary of the entire lineup's strengths and weaknesses.

I think it really shows that there are multiple forces at play in any product line, when you consider that this is a company whose goal is to make money and increase market share.

When the opportunities arose to make a high margin product, they took it and “you” bought it.

When the product was, or was perceived as, superior and generating sales, R&D fell off and the current product was sold or rehashed for as long as possible to maximize profits.

When the product was less appealing to the marketplace than the competition's, more R&D was done and a more competitive product was launched to retain or gain market share.

All of these things have given us a company that was voted Forbes “Company of the Year” last year, a highly profitable multi-billion-dollar tech company with brand loyalty on par with Star Trek. (Let's face it, some of you would KILL an ATI fanboy if you could get away with it…)

Take those facts into consideration and you can pretty much tell where the next products are heading and what to expect from virtually any company.
 
I think it is obvious that this card is not aimed at existing 8 series owners. The 8 series line is being refreshed and repriced.

I think the GX2 is more aimed at the small but growing number of users who are looking for bigger, higher resolution monitors.

Also, there are people like me who still have a 7900 GT and are gonna be getting a new video card with their tax rebate, and all we hear is "It's faster than an Ultra." As long as the card is $699 or less I will be buying it.
 
It will be interesting to see final retail cards and performance numbers from reputable sites.

I can't say for sure, but I suspect the 9800GX2 will be lackluster, yet still faster than anything else out at the time. It will be interesting to see what the 9800GTX will be like. As many have speculated, that will be the real high-end card even though the name suggests otherwise. The last two GX2 cards were both higher end than the GTX cards of the day.

This may be about like the upgrade from the 9700Pro to the 9800Pro. Sure, a jump can be seen on paper, but it won't be worth the upgrade as far as most people are concerned. That's the way it was for me back in the day when I purchased a GeForce 2 GTS 64MB. They released a Pro model and one other upgrade of the same card as I recall. All of them were so incremental that I ended up using my GeForce 2 GTS 64MB for about a year and a half until the GeForce 3 Ti500 came out. That's when I made the change. Of course I sort of got boned on that when 3 months later the GeForce 4 Ti 4600 came out and was considerably faster than the GeForce 3 series cards.

Of course, if you are running a 7-series card now, the 9-series may have some real appeal. Honestly, if these things are using G92 cores I don't think they should be called 9-series cards at all. Maybe 8900GX2 would be a more appropriate name, and that card should stick around for six months until a real next-generation part comes along. If that ends up being the case I'll probably stick with what I've got.
 
I think you're thinking of the GF2 Ultra. It started out with the GF2 and GF2 Ultra, added the Pro somewhere along the line, and of course the MX varieties.
 
You know... it's almost worth buying the highest-end card from each generation and dumping it the moment the next high-end card comes out, to recoup your costs a bit.
 
I think you're thinking of the GF2 Ultra. It started out with the GF2 and GF2 Ultra, added the Pro somewhere along the line, and of course the MX varieties.

That's the one! I couldn't remember it to save my life. I knew there were at least two revisions of the GPU that came out that weren't really worth the cost to upgrade to. Through overclocking and voltage mods I was able to get nearly to the Ultra's performance level, and definitely on par with the Pro.
 
Therefore Crysis can bite me for making me play their game at suboptimal settings by releasing it before proper hardware existed.

Admittedly, I really don't get this- Crysis, even on High, looks amazing. Hell, with my 8800GTS 640MB I can get 23fps (abysmal in most games, not in Crysis though) at 1600x1200, 16xAF, vsync, and everything on High (and Textures on Very High) in DX10. I'd assume a GTX could pull better performance and could tune a few more settings to Very High (I *might* be able to- still testing Crysis in Vista). Nevertheless, no card can truly run Crysis at an "acceptable" frame rate with everything on at the moment- but why should it? Why should a game's graphics become dated as soon as the next generation of hardware comes out? Why can't a game scale for future hardware, so that when you're loading the game up with your new GeForce 10800GTX the game is still fresh and amazing visually?
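
(Side note on the numbers above: frame rate is easier to reason about as a per-frame time budget, since frame time in milliseconds is just 1000/fps. A trivial sketch, using the 23fps figure from the post plus a few reference points:)

```python
# Convert fps into a per-frame time budget in milliseconds.
# 23 fps is the figure from the post above; the others are reference points.

def frame_time_ms(fps: float) -> float:
    """Milliseconds available to render one frame at a given fps."""
    return 1000.0 / fps

for fps in (15, 23, 30, 60):
    print(f"{fps:>3} fps -> {frame_time_ms(fps):5.1f} ms per frame")
```

Seen that way, going from 23 to 30fps only buys back about 10ms per frame, which is one reason a steady 23fps in Crysis can still feel playable.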
 
I agree and disagree with your points.

I am running Crysis on a 7900GS OC with the rig in my sig, and I feel that it is more than playable on mostly medium settings at 1024x768. Looks great to me... so I agree.

The only thing that I don't agree on so much is the game being too forward-looking for the current generation. I have had "high end" graphics cards since the 2MB Cirrus Logic VLB cards came out, and I always found that the novelty of seeing an old game "in all its glory" months or years after its release was less than satisfying. Sure, it's cool, but by the time the new hardware cycles through, better games, even on the same engine, have come out, and it really detracts from the experience; the "sting" of disappointment from the initial game's poor performance never wears off.

The only way I could see Crysis as a lead-in would be if the 9xxx series offered proper performance for the game right off the bat and were released within 90 days of the game. At that point, Crysis would have become a real driver for upgrades.
 
Terran, you wrote:

"Nevertheless, no card can truly run Crysis at an "acceptable" frame rate with everything on at the moment- but why should it? Why should a game's graphics become dated as soon as the next generation of hardware comes out? Why can't a game scale for future hardware, so that when you're loading the game up with your new GeForce 10800GTX the game is still fresh and amazing visually?" -

Well WHAT THE HELL IS THE POINT of a game that you can only play now at lower settings? They could have put resources into optimizing rather than making unplayable settings available. Or you can wait to play the game until next-gen hardware arrives, but if that's the choice then there was no reason to push the game out yet; resources could have been used on better gameplay in current games, and when the game did come out it would be better, because the coders would have actually been able to optimize more, most notably on next-gen hardware, which I doubt was available to the coders working on Crysis.

Another thing is that now next-gen hardware engineers have to, at least to some extra degree, worry more about supporting old crap than about making the truly best next-gen piece of silicon. I know that it is ultimately a symbiosis, but I think we all do better if the coders have to adapt to the hardware engineers more than vice versa.

just IMHO of course...
 
What game looks better than Crysis on High? None that I've seen lately. High settings are certainly feasible with cards currently on the market, and the graphics offered by them still dominate every other game available for every platform. Only the Very High settings are out of your grasp, but those aren't really meant to be the game's present highest settings- they're simply not achievable right now. As far as spending more time on optimization, the game may have been developed by Crytek, but it was published by EA- you do the math.

As far as hardware developers needing to "cater to old crap", I very much doubt that in the case of Crysis- and in fact you could perhaps make the argument that such is the reason Very High settings are only available on DX10 (excluding the hack of course, but as [H] itself has proven, the hack is not true Very High- on a slightly OT note, having just upgraded to Vista, even w/the same settings as I used on XP [no Very High], DX10 seems to offer superior lighting and image quality- it's subtle, but it seemed noticeable to me). I doubt such a line of thought is valid, but if you're going to make your previous argument, you may as well make this one too.

Finally, as far as what settings to make available and what is required to achieve those settings, such decisions are made far in advance of release. Crytek obviously planned for Very High for quite some time- quite possibly long before even the 8800 series existed. Thus, they needed to estimate what hardware would be available at release. As many have previously noted, if nVidia followed their usual path of releases (or if you want to look at it another way, if ATi had challenged the 8800 GTX and Ultra), the GeForce 9xxx series would have been launched already. Would even this have been enough? Looking at the 9800GX2, perhaps not (refer to my prior parentheses ;)), or perhaps so. It depends, actually. If you don't mind playing Crysis in the ~20-30fps range, it may well be (I play it at ~23fps- Crysis really doesn't need a very high fps to be smooth enough, and the sheer beauty of it more than balances out the low framerates).

Anyway, the point is, Crytek decided to plan for such high-end visuals, and thus at the time of release they had two options: make those settings available out-of-the-box, or only make them available in a patch after hardware had been released that could cope with the game. Given that choice, I think the former is obviously the better option, for it gives the user the choice- if you're willing to sit through a slideshow, feel free. There really isn't an option to sit on the game until technology becomes available to play it at its highest settings- it's not an economically viable option. Ultimately, you are criticizing Crysis for trying to put forth the most cutting-edge visuals it possibly could while attempting to stay within the confines of what modern hardware can handle. Sure, their estimates may have been a little off, but that fact does not make the game unplayable for higher-end users- it just means you'll need to cope with the fact that there is a game out there that your rig can't play on its highest settings. My apologies in advance for the hit your ego will take because of this.

My only complaint towards Crytek on this issue is that Crysis's performance is not constant- the second half of the game essentially has different requirements. What settings sufficed in the first half no longer do (well, they kinda do- you really don't need a smooth FPS to conquer these sections easily) in the second half. This should not happen- the game should be set up so that the settings are consistent throughout. On the plus side, you can modify most graphical settings in-game. Also, Crytek really needed to do a better job of implementing SLI support- to demand so much of graphical hardware but to make the "best" solutions available to tackle the performance requirements of Crysis's highest settings ineffective at launch is unacceptable. Crysis is the kind of game many enthusiasts put together their high-end rigs to play, and to deny them the benefits those rigs supply is shortchanging them somewhat. However, imo, this goes back to the Crytek and EA connection, and I feel it reflects more on EA than Crytek.
 
Terran, you wrote:

"Ultimately, you are criticizing Crysis for trying to put forth the most cutting-edge visuals it possibly could while attempting to stay within the confines of what modern hardware can handle."

Yes I am, because they failed. I am not at all impressed that they could put out a game with such great graphics that it isn't playable. Sorry, that just doesn't make for a great gaming experience.


On the other hand, Terran, I agree with you for the most part. I am just pointing out that the best situation for gamers is when the current-gen software is optimized for the current-gen hardware... any mismatch is at best useless (to me, right now) and at worst frustrating and irritating... corporate concerns aside.

I know this is to a large extent my own psychology, but having played Crysis at nearly but not quite Very High, and having been irritated by the sudden tank in fps 2/3 of the way through the game... I will never get to play the game in all its glory. By the time proper hardware is out, I doubt I will want to go back and replay the same exact places and objectives. Now, the game kinda fooled me, really taxing the hardware... but if I had known it was going to get MUCH worse more than halfway in, I just might have shelved it until next-gen vaporvideo appeared... I have done this before. As a matter of fact I did so with Oblivion... really, really became a huge fan of Oblivion when I could play it in all its glory. Problem is, Crysis has nothing on that game in terms of longevity of play... no reason to go back. Sure, the company still has my money... woo woo, they win... I just don't feel like I won.

Oh well...looking forward to games with this engine in the future...when they invent hardware for it...
 
Well, with a mid-range card I'm able to get everything on at least High and some things on Very High at 1600x1200 with "acceptable" performance, and it looks better than anything else around. In terms of Crysis tanking 2/3 of the way through, I totally agree that should never have happened, but it doesn't really matter, imo, because anything past halfway through the game isn't much worth replaying anyway. But the first half of the game sports more replayability than any game in my sizable library- already I've played the first mission alone a number of times that creeps into double digits, and it's no short mission. I will relish the day when I can do so with the settings maxed- and that day will likely mark the play-through that sends the number of times I have played the first level into triple digits.
 
The original GeForce was kinda buggy, but what do you expect for the first GPU...even so, the GeForce 2 was a significant improvement.
I bought a GeForce DDR the first week they came out. Never had any issues. At the time, it ran games like Quake 3 and Thief 2 flawlessly. It even did okay with Morrowind. It gave me a lot of mileage, almost 3 years of gaming until UT2003 put it into retirement. IIRC, the GeForce 1 and 2 are the same graphics tech (DX7), but the 2 has faster clocks. So the original was put to good use.

It doesn't seem like many people will bite on the 9800GX2, so hopefully it's only a half-step and NVIDIA will be forced to release the next gen soon. I just read that the R700 may arrive in late spring.
 
Sorry if this has been posted already, but I'm just wondering if there are any "dates" revealed for this launch? Tentative or speculative or anything.
 
Wish they'd made something better than this for a hyped 9800-model video card. For a GX2: a 512-bit bus, 1GB per core per PCB, and 512 stream processors. You know they can do it, so why not? Why not go far more extreme for the highest-priced card on the market?

OS: Vista Ultimate x64 on 2x Seagate 250GB 16MB-cache SATA2 in RAID0
Motherboard: GA-K8N-PRO-SLI, nForce 4 chipset
CPU: AMD Socket 939 Opteron 180 "Denmark" dual core, 2.4GHz
Memory: PC4000 Ballistix Tracer 4GB dual channel, 250MHz, 3-4-4-10, 2.5V
PCI-E: XFX GeForce 7800GT 256MB Extreme Edition
Power supply: Rosewill 600W ATX 2.01
Slave hard drives: Seagate SATA2 300GB, SATA2 160GB
 
If it really is only going to be $449 I can answer your question...
 
Wish they'd made something better than this for a hyped 9800-model video card. For a GX2: a 512-bit bus, 1GB per core per PCB, and 512 stream processors. You know they can do it, so why not? Why not go far more extreme for the highest-priced card on the market?

ATI shot for the same, and failed miserably. A simple rule applies to engineers too: don't bite off more than you can chew.
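
(For rough context on what a 512-bit bus actually buys: peak theoretical memory bandwidth is just the bus width in bytes times the effective transfer rate. A minimal sketch, using the widely published 8800GTX figures, 384-bit GDDR3 at 900MHz (1800MT/s effective), next to a hypothetical 512-bit part at the same memory clock:)

```python
# Peak theoretical memory bandwidth: (bus width in bytes) x (transfers per second).
# 384-bit / 1800 MT/s are the published 8800GTX specs; the 512-bit line is a
# hypothetical card at the same memory clock, for comparison only.

def bandwidth_gb_s(bus_width_bits: int, effective_rate_mts: float) -> float:
    """Peak bandwidth in GB/s for a given bus width and transfer rate (MT/s)."""
    return (bus_width_bits / 8) * effective_rate_mts * 1e6 / 1e9

print(bandwidth_gb_s(384, 1800))  # 8800GTX: 86.4 GB/s
print(bandwidth_gb_s(512, 1800))  # hypothetical 512-bit part: 115.2 GB/s
```

A third more bandwidth on paper, but as the R600 showed, a 512-bit bus costs die area and board complexity that the rest of the chip has to justify.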
 
That tidbit has been mentioned in several sources, some more reputable than others. If you think about it, that pricing fits in line with the current offerings and with the card that should come shortly after it, the GTX, which would be $549.99-ish.

So not that unrealistic.
 
I'm not sure if I should go for the 9800GX2 or the GTX, because of the driver problems they had with the 7900GX2.
I think a 30-inch LCD monitor at 2560x1600 will benefit from a GX2, I hope?
 
You would definitely benefit greatly from the GX2 in all games that support SLI (contingent upon good drivers, but it would behoove nVidia to deliver those if they wish to make a decent number of sales on this card).
 
This is a stupid question, but is it easy to set up SLI (9800GX2)? I've never done one before.
 
I don't have any experience w/the 7950GX2, but from my interpretation of what I've heard about it, it's really no different than a single-GPU video card: you plug it into your PCIe slot, install the drivers, and play. The only difference, I think, is that you'll need to deal with SLI on the software front, which basically just means enabling it in the CP (if the drivers are good, this should probably be done for you automatically just by installing them once a 9800GX2 has been detected) and creating SLI profiles for games that do not directly support SLI, to get SLI performance in them.
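
(To set expectations for what a working profile is worth, here's a toy model of dual-GPU scaling; the 80% figure is my own illustrative assumption, not anything NVIDIA publishes:)

```python
# Toy model of dual-GPU (SLI) scaling. The 0.8 default is an assumed,
# illustrative figure for a well-supported title, not a published NVIDIA
# number; a game with no working SLI profile effectively scales at 0.0.

def dual_gpu_fps(single_gpu_fps: float, scaling: float = 0.8) -> float:
    """Estimated fps with two GPUs, given the fraction of the 2nd GPU realized."""
    if not 0.0 <= scaling <= 1.0:
        raise ValueError("scaling must be between 0 and 1")
    return single_gpu_fps * (1.0 + scaling)

print(dual_gpu_fps(30))       # ~54 fps with the assumed 80% scaling
print(dual_gpu_fps(30, 0.0))  # 30 fps: no profile, the second GPU sits idle
```

That gap between the two lines is exactly why the profile step matters on a card like the GX2.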
 
I'm considering whether or not I should use my step-up from EVGA to get this card, or to wait for the GTX...but the GTX release date may be too late. Anyone know a date for that one?
 
Suspected to be Feb/Mar, slightly after the 9800GX2- I don't know any more than that and don't have total confirmation on it either. As far as the step-up goes, I'm guessing you paid ~$350. If the GX2 really is the top of the line and really is $450, then the GTX will likely be $400. For $50 plus shipping, it's probably going to be worth it for the GTX at least.
 
I'm considering whether or not I should use my step-up from EVGA to get this card, or to wait for the GTX...but the GTX release date may be too late. Anyone know a date for that one?

Simple: wait until the last day, and if the GTX is out, pull the trigger on that card; if not, pull the trigger on the GX2.

Either way you're getting a jump in performance for very little money.
 
If the GX2 lives up to its "30-50% performance increase over the 8800 Ultra," it would definitely be worth $100 to step up from an 8800GTS 512MB.
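
(Back-of-the-envelope math on that step-up; the 1.3-1.5x multipliers are the rumored "30-50% over the Ultra" claim from above, and the Ultra-vs-GTS-512 gap is my own ballpark assumption, so treat the output as illustrative only:)

```python
# Step-up value math. The GX2-over-Ultra multipliers come from the rumored
# "30-50% over the 8800 Ultra" claim; ULTRA_OVER_GTS512 is an assumed
# ballpark figure, so the results are illustrative only.

ULTRA_OVER_GTS512 = 1.10  # assumption: Ultra roughly 10% ahead of a GTS 512
STEP_UP_COST = 100        # dollars, per the post above

for gx2_over_ultra in (1.3, 1.5):
    gain = gx2_over_ultra * ULTRA_OVER_GTS512 - 1.0
    print(f"GX2 at {gx2_over_ultra:.1f}x the Ultra -> "
          f"~{gain:.0%} over a GTS 512 for ${STEP_UP_COST}")
```

So even at the low end of the rumor, the step-up would roughly buy a 40%+ jump for $100.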
 
Wait, so the GX2 is for sure going to be better than the GTX?

By this I'm assuming there won't be a 9800 "Ultra" and the GX2 will be the top card? Because if that's the case then I won't bother waiting for the GTX and will snag the GX2 for sure.
 