Nvidia GeForce 9800GTX specifications released!

To the guy who convinced me about the GTS, what should I get (Evga for sure) 320 or 640MB? I play at 1680x1050.


Although I am not that person... I have found you should always go for as much RAM as you can afford, and this is true with a video card as well. While the 320MB version is nice, I have seen some reports of it choking at higher resolutions. The 640MB should be the way to go.
 
Depends on what kind of games you play and how much you have to spend... since you were looking at getting a 2900, the 640MB should be in your price range. Get EVGA, BFG, or XFX (or even OCZ; I think they have lifetime warranties as well).

The 640MB will allow you to run higher texture / AA levels at higher resolutions because of the extra memory, as shown in Oblivion / LOTR.
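To put rough numbers on that, here's a back-of-envelope sketch (my own arithmetic, not from any review; it assumes 4 bytes each for color and depth per sample): with multisample AA, color and depth are stored once per sample, so the render targets alone scale with both resolution and AA level, which is where the extra memory earns its keep.

```python
# Back-of-envelope VRAM math (rough assumptions, not from any benchmark):
# with multisample AA, color and depth are stored once per sample, so the
# render targets alone scale with both resolution and AA level.

def render_target_mb(width, height, aa_samples, bytes_color=4, bytes_depth=4):
    """Approximate MB for the color + depth buffers at a given AA level."""
    return width * height * aa_samples * (bytes_color + bytes_depth) / (1024 ** 2)

for aa in (1, 4, 8):
    print(f"1680x1050 @ {aa}xAA: ~{render_target_mb(1680, 1050, aa):.0f} MB of render targets")
```

And that's before textures, which are what actually fill most of the card's memory, so the gap between 320MB and 640MB only widens at high settings.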

OCing the cards is easy, and as shown, it provides a nice increase in FPS and thus in detail settings as well.

If you buy a card now, you will probably also get the newer core (A3), which has shown itself to be an excellent overclocker.
 
What if it is louder, hotter, and uses more power than the HD2900? Then what are you going to do?

Lol, still getting it and putting watercooling on it. Idk about the power; I don't see anything higher than 1200W power supplies :/
 
Wrong. You hit your opponent while he's down, not when he's getting up. NVIDIA does well to keep this kind of pressure on AMD/ATI.
Also, your remark about 8800s not really being DX10, but rather "DX9 usage, with DX10 support", makes no sense. The fact that there are no actual games that take advantage of all DX10 features is hardly the hardware manufacturers'/designers' fault; it's the developers'. Even so, 8800s are dominating the few titles that take advantage of DX10 features.

There's so much time until ATI brings out its best that 3 months won't make a difference imo. I don't think you understood what I was saying on the DX9/10 part (I agree with what you posted on the subject). I'm saying that if the new generation of cards is released this fall/winter, then those are the ones that are really going to lead the way into DX10. The 8800 series would have been for DX9 ownage and a few DX10 patch-jobs (hardly a decent share of DX10 for the first DX10 cards). The G90 cards would be the ones to get from then onward. Whereas if they don't release a G90 high-end card, then the 8800 series will actually get a few games under its belt (as a few competent DX10 games are just starting to roll out this fall/winter).

I'd also like to point out that this rumor is hardly one, just some BS conjecture from an article COMMENT about 2 months ago, and nothing's been uttered since... except for the OP who stumbled across it. Those who hope should wait until rumors are actually, ya know, rumors, meaning when it's all over the net. They remind me of the R600 specs, to be honest lol.
 
Well, you know what? I'm going to look through the FS/FT threads for a used 7900. Since Nvidia's releasing another wave of cards this fall/winter, I no longer see myself with an 8800 or something. I should have realized this a long time ago.
 
Not if you got an EVGA.

(checks calendar)

Well, no, I guess you wasted your money. But there isn't anything in the computer industry that doesn't immediately become obsolete.

People who bought an 8800 GTX at its launch still don't have an obsolete part... how many months has it been?
 
It's beautiful stuff like this that makes me glad I'm waiting a while to upgrade.
 
This internet circle just won't die.

Those specs aren't real at all and are at least a month old!
 
Even if it's just a rumor, a little part inside of me wishes it's not a mid-range product and is shooting for the higher end.

If so, I'm down for a 9800GTX :)
 
Released? No. How about rumored, because that's all it is.

I feel sorry for people who believe everything they read on the internet.
 
I also feel sorry for people who believe everything they read in the bible.
 
I really should have looked closer before commenting.

It is not even a blurb at Xbit; it is a comment by someone in the comment section of a blurb.
It is old, besides that.

So we have an old message board comment and we all get excited. I hang my head in shame.

Ok, where is NV engineering located? We need to start going through the trash for real clues. ;)
 
Interesting. I do not believe these specs, nor do I deny that they are a possibility. I've already been planning to buy the G9x, and there are only three things that can stop me. One is if by some miracle my 8800 GTX can manage to pull out minimum frame rates of 30 in Crysis, UT3, and Assassin's Creed at 1920x1200; another is if ATI manages to pull out a card that can before the G9x. The last is if not even the G9x can do said things :|
 
What strikes me as odd is that most of these specs are already present on the 1GB 2900XT, or were rumored to be in the case of the ones that ended up not being there (like the eDRAM). It feels like a cut/paste job from old R600 rumors.

Not that I don't hope it's all true.
 
Does anyone know if I could add one of these next-gen cards (9800?) to an existing 8800 GTS or GTX in SLI mode. One person told me you had to have identical cards for SLI. Another said that you could add a different NVidia card, but both cards would run at the level of the slower card.
 
No, you will not be able to; SLI requires two cards with the same GPU.
 
I call BS for two reasons. First, that has been out for almost two months, and if it were real, we would see it in multiple places; not even the Inq has attempted to spin it. Second, when that post was made, there were still over 5 months until its supposed release date. Everyone believed that G80 was a fixed-pipeline design up to a month before its release, because people simply didn't know, and that's good work on the part of Nvidia.

Also, I call BS because it looks like this isn't an "article" as much as it is a post on some Xbit forums by some random person.

I also call BS because that post says 65nm, when Nvidia has yet to ship a 65nm part. Releasing their first 65nm design as a top-end part, without working out the bugs (and there will be bugs), would seem a bit risky for both Nvidia and TSMC; it's common practice to release lower-end parts on a new process before going full tilt on it.

I also call BS on the timetable: 12 months between generations, not even a refresh, just a brand-new high-end part.

It's not that I don't believe Nvidia is capable of what was listed in those specs; I just believe they're smarter than that when it comes to both PR and design. I say PR because news like this would take focus off their 8800 parts, and people would wait 4-5 months for a card of this magnitude, and that's just bad business.

Then again, they released 680i with the 8800GTX; it might not be far off that they do the same thing here.

ninja edit: I wouldn't be surprised if this kind of speculation hurts them right now; just looking at other posts on the front page, people jump to conclusions and hold off on their purchases. Nothing atm is set in stone as far as I am concerned.

You are 100% correct. Greed means that this is not at all an official statement of any kind or "leak". It could be insider info, yes, anything is possible, but greed and satanic lust for money says one thing:

don't kill the enthusiasm for the 8800GTX/GTS line while it is selling like sick puppies on fire.
 
If the 9800 (hopefully they come up with a new naming scheme) is released before anything new from AMD or Intel, and it performs at least 50% faster than the 8800 Ultra, it will be held up as a success whether it has major problems or not.

And I am sorry, but it is going to be bigger, louder, and hotter than the 8800 series. The 7800 was hotter and bigger than the 6800, and the 8800 was bigger and hotter than the 7800. Trends don't change; only people's perception of them does.
 
yeah, the chance that it runs cooler is like 0

the good part is that PCIe 2.0 is coming out.

the standard for it is that each slot provides 150 watts and x32 speed. I couldn't care less about the x32, but the 150 watts means that no special PSU crap requirements will be needed.
 
You best run memtest, SoftNSquishy. The 7800 was most definitely smaller (single-slot design), used less power, and ran cooler than the 6800 leaf blower. Here's a ref.

Here are two more perception calibrators:

HardOCP 7900 vs 7800
TechReport 7900 vs 7800 vs 6800
 
If anything, this blurb kept me from pulling the trigger for just a little bit longer. :)
 
Which is exactly why NVIDIA is releasing them. When they released G80 last November, everyone knew that DX9 performance was the most important. Vista was not even officially out and there were no DX10 games (even just in patch form). And so, 8800s delivered the best DX9 performance ever, which was exactly their main purpose, though of course they had the capability to run DX10 games, if and when they really came out.
Now that more and more DX10 games are coming out, NVIDIA will release their new chip, rumored to be twice as powerful as an 8800 GTX, aimed at games like Crysis, Bioshock and UT3 (for example). It's definitely the best thing they can do. AMD/ATI is down and NVIDIA won't give them any breathing space. If G90/G92 is what it is rumored to be, AMD will be in big trouble, considering that they are still catching up with the "old" 8800 GTX/Ultra performance.

Miscommunication said:
I'd also like to point out that this rumor is hardly one, just some BS conjecture from an article COMMENT about 2 months ago, and nothing's been uttered since... except for the OP who stumbled across it. Those who hope should wait until rumors are actually, ya know, rumors, meaning when it's all over the net. They remind me of the R600 specs, to be honest lol.

A rumor is a rumor and this is no different. It will be considered and discussed as a rumor until we know for a fact whether it's true or not.
And actually, most of the rumored R600 specs turned out to be true... Let's just wait and see what happens.
 
THAT is absolute utter bollocks!

edit:

And to clarify why I think that, here are a couple of points. They've basically copied some specs from ATI chips and messed them up.

- eDRAM die for "FREE 4xAA".
Riiiiiiiiiiiiiiight. Where have I heard this before? Oh, that's right: on the Xenos. They just won't do this on a PC graphics chip. Unless the amount is LOADS (which it won't be; the die space could be used better), how is it gonna apply free 4xAA at the resolutions used on the PC?

- built-in audio chip.
Riiiiiiiiiiiiiight again. Even on the HDMI ATI cards there's no sound chip; it just uses a pass-through from the sound card to combine the audio with the video signal over HDMI.


I say again, UTTER bollocks.
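For what it's worth, the eDRAM point is easy to sanity-check with some rough arithmetic (my own sketch, using the Xenos's 10 MB figure and assuming 4 bytes each for color and depth per sample, as on Xenos):

```python
# Rough check of why "free 4xAA" eDRAM doesn't scale to PC resolutions:
# Xenos carries 10 MB of eDRAM, sized around 720p with tiling. At 4x MSAA,
# each pixel stores 4 color + 4 depth samples (8 bytes per sample assumed).

EDRAM_XENOS_MB = 10  # what the Xbox 360's Xenos actually has on its daughter die

def samples_mb(width, height, aa_samples, bytes_per_sample=8):
    """Approximate MB of sample storage a 4xAA framebuffer would need."""
    return width * height * aa_samples * bytes_per_sample / (1024 ** 2)

for w, h in ((1280, 720), (1680, 1050), (1920, 1200)):
    print(f"{w}x{h} @ 4xAA needs ~{samples_mb(w, h, 4):.0f} MB vs {EDRAM_XENOS_MB} MB on Xenos")
```

Even 720p overflows 10 MB without tiling, and PC resolutions need several times more, so a useful amount of on-die eDRAM would be a huge chunk of die space.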
 
I've been waiting three years for this card. I always wanted an nVidia card to have the same name as an ATI card.
 