Info leaked on GeForce 9600GT!!

When you have two people saying the same thing to you, don't you think it might be time to take a step back and consider that maybe you are doing exactly what we said you're doing? Just a thought. I really couldn't care less; if you want to spread rumors, go right ahead, but most of us here look down on that.
Some of the stuff you quoted me on is nothing more than my thoughts and opinions, just like anybody else would have in a thread like this. I would never start a thread based on something at Fudzilla or The Inquirer, but I see nothing wrong with pointing out info from those sites while speculating on unreleased and/or likely products.
 

Assuming I agree with the "some of the stuff" statement, that still leaves all the 8800 GX2 info you were touting without any hint of "this is just my speculation"; you seemed pretty adamant about this card coming out.

Pretty sure I'm done responding to you regarding this.
 
IMHO it's pretty obvious anyway; if it's officially released information, it's probably all over the web (or at least reported personally by Kyle/Brent/Dan/etc.). If not, it's pretty obvious it's just speculation and rumor.
 
You really need to lose the attitude, because that's certainly not how you would talk to someone in person. I guess since I don't claim to work for Nvidia or know anybody that does, the "imo" is implied. A GX2-type card appears to make sense, as this has been rumored for a while and the 3870 X2 is basically official now. Yes, maybe using Fudzilla for info may be wrong, but we are all just speculating, so I don't see the big deal. ;)
 
Well, the 8600GTS has a 128-bit memory bus, which is probably holding it back.

If the 9600GT has a 256-bit memory bus, it should perform much better than the 8600GT/GTS and any 7xxx-series card.
 
It's mainly the 32 SPs and 8 ROPs that are holding the 8600GT/GTS back.
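To put numbers on the bus-width argument: peak memory bandwidth is just bus width times effective memory clock, so a quick back-of-the-envelope sketch (the clocks below are rough assumed figures for illustration, not confirmed specs) shows why a 256-bit part would have far more headroom:

```python
# Peak theoretical memory bandwidth = (bus width in bytes) x (effective clock).
# The clocks below are assumed round numbers for illustration, not official specs.

def bandwidth_gb_s(bus_bits, effective_mhz):
    """Peak theoretical bandwidth in GB/s for a given bus width and memory clock."""
    return (bus_bits / 8) * effective_mhz * 1e6 / 1e9

cards = {
    "8600GT  (128-bit, ~1400 MHz eff.)": (128, 1400),
    "8600GTS (128-bit, ~2000 MHz eff.)": (128, 2000),
    "9600GT? (256-bit, ~1800 MHz eff.)": (256, 1800),
}

for name, (bits, mhz) in cards.items():
    print(f"{name}: {bandwidth_gb_s(bits, mhz):.1f} GB/s")
```

Even so, bandwidth alone doesn't save the 8600 series; as noted above, the SP/ROP counts are the bigger limit.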
 
I'm actually quite glad the high-end parts aren't due for another 3-4 months minimum; it gives me time to save up after the Christmas wallet rape.
 
The 8600GTS is equal to, or slightly better than, the 7950GT.

Only in 3DMark, but once you get to the stuff that actually matters, the 8600 folds over, and even the 78xx series is a better option, let alone the 79xx series.

There are really only two reasons to buy an 86xx card.

The first is the HD decoder support, and if you buy it for that, you're still an idiot, because the 8500 does the same thing for a lot less. The second is that some guy at Best Buy told you it would be an upgrade over a really old card, but not a good upgrade, which still wouldn't be a wise purchase.
 
Well, the 8600GT is better than the 7900GS, and usually the 7900GT, in newer games like BioShock, Crysis, CoD4, and UT3.
 

Haven't kept up too much with prices of these lower-end cards, but the last time I looked, the 3850 was a lot more powerful for only a little bit more than the 8600. I would expect the 8600 to be better/equivalent in the more shader-intensive games; however, I would think this is due to the unified architecture of G80, and I would only expect this at the lower resolutions. Provide documentation if I am wrong; I have not really kept up with 86xx performance other than general knowledge.

I should have mentioned a resolution when making my comment, since that seems to be a pretty significant aspect these days. The 128-bit bus suffers pretty badly above 1280x1024/1680x1050, or is at the very least a significant bottleneck, coupled of course with the lack of shaders/ROPs.
 
Yeah, the 3850 is roughly twice as fast on average as the 8600GT/GTS. The 8600GT is really only a 1024x768 card for most modern games unless you turn down a lot of game settings. The 128-bit bus doesn't really make that much difference, since the 8600GT doesn't have the power to run modern games at higher resolutions anyway.
 

Yes, exactly.
 
After reading 3 pages I'm now :confused:. First off, is it or is it not confirmed that the 9600GT will be out in Feb? I couldn't care less about dual-GPU cards, as all they are is SLI on one card, with all the same problems and more.

Why is it a "9600" with an 8-series core? :confused: And why would Nvidia not release the 9800GTX in Feb? It would not affect 8800 GT and GTS sales that much, since they could charge $1K for the GTX (and make a crapload of money) and people would still buy a crapload. Most "smart or normal" people would opt for the GT or GTS anyway, since those cards play all current and upcoming games pretty well. The GTX would hit the niche market: the rich, the stupid, benchmarking, and/or high-res gaming.

Oh...I'm part of the GTX group:D
 
Wow, what a mess; I guess we'll have to wait and see what actually comes out. It sucks that I won't be stepping up, but I'm glad I got a GTS 512 to hold me over until the spring :D.
 
I'm actually quite glad the high-end parts aren't due for another 3-4 months minimum; it gives me time to save up after the Christmas wallet rape.
I'm with you. I'm already down over a thousand so far, and I'm not even done!
 
Dude, he was just joking. Nobody knows what the price is going to be, and we can only assume that it will be at least as much as the current 8800GTX.

Prices for computer hardware have typically gone down substantially over the years as power/capacity increases.

A PC used to cost $2,000. Now a much, much faster PC can be purchased new for a little over 10% of that cost.

This trend of video card prices going up and up is an anomaly; eventually things should turn around, and nice powerful 3D cards will start going back down in price.
 
Only when they start hitting a brick wall like sound cards did. I really don't see that happening too soon, because they can bump up to multiple cores and add memory until things really start to slow down. Right now, cards can barely play Crysis. I don't see what you're talking about happening in the foreseeable future, especially since the US dollar is worth as much as a Canadian one.
 
I think consoles will cause that brick wall before anything else. They can do whatever they want with the hardware; GPU cores are really already multi-core, so it's just a matter of adding more. But if you don't have software to take advantage of the hardware, the hardware is worthless. SLI is a good example: not that it's worthless, because it clearly has its benefits, but not many people go the SLI route, so not many games take huge advantage of it, or an advantage worthwhile to most people.
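To put a rough number on why "taking advantage" matters: frame work never splits perfectly across GPUs, so a second card buys much less than 2x unless the game and driver parallelize well. A toy Amdahl's-law sketch (the fractions are invented for illustration, not measurements from any game):

```python
# Toy Amdahl's-law model of SLI scaling.
# `parallel_fraction` of each frame's work splits across GPUs; the rest stays
# serial (CPU, driver overhead, inter-GPU sync). Fractions here are made up.

def sli_speedup(parallel_fraction, n_gpus):
    """Overall speedup when only part of the frame time scales across GPUs."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / n_gpus)

print(sli_speedup(0.9, 2))  # well-supported title: ~1.82x, not 2x
print(sli_speedup(0.4, 2))  # poorly-supported title: ~1.25x
```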
 
I can definitely vouch for the poor SLI support: of the ~8 new games in my library, not even half have any decent support for it. Going to test my OB games now (play some TF2, xD); judging from what the nVidia CP is saying, those should support SLI. But UT3, CoD 4, GoW, and to an extent Crysis do not. In Crysis, right now, SLI is basically useless; I'm going to try plopping driver-induced AA on the second card, which would probably do more good than what true SLI is doing for it at the moment. The patch should hopefully change this; I would really like to be able to play Crysis on Very High, and Vista x64 will be here in a couple of weeks, so...

Props to ET: QW for making really good use of SLI, and Red Orchestra (not exactly a new game, but one that has stayed relatively up-to-date thanks to a fantastic post-release effort on Tripwire's part) for doing the same; the benefit of SLI is readily apparent in both games. Admittedly, it's not even really a problem in CoD 4 and Company of Heroes (which I'm not totally sure really supports SLI; I kind of fudged up my initial tests and have just shifted AA to the second card since). Even in UT3 I can just shift the AA to the second card: my single 8800GTS already has enough power to tackle those games at 1600x1200, 16xAF, and max settings without AA at excellent framerates, and the 16xAA the second card can add on just adds to the sweetness. But it hurts in Crysis, because the framerates are abysmal to begin with, and once I get Vista the poor SLI performance will stop me from attaining the maximum settings I should otherwise be able to achieve.

Consoles are definitely a limiting factor, but I think that once we start to get more affordable hardware that can cope with the likes of Crysis on max settings at a decent resolution, you will start to see the console effect diminish.

As far as sound card prices go, I just picked up an X-Fi XtremeGamer for $50, plus a free $40 headset (which is a pretty awesome headset: great mic, fairly comfortable, and excellent sound quality). I'm not 100% sure, but I think the card is at least 1-2 years old, and it easily handles the highest sound quality from every game in my library. Granted, I achieved the same thing with my Audigy 2 Value, but that was on the back of my CPU, and this last upgrade was intended to be, and will be, my last for quite some time, so it's in the interest of letting my CPU deal with what it really needs to be dealing with.

As far as software goes, Crysis is here, Far Cry 2 is coming, and there are quite a few other games around the corner that should push the hardware. I feel that if nVidia and ATi show they can put out a $250 card that can handle Crysis on Very High at 1280x1024 (at least, and after the January patch), more developers will put more effort into upgrading their PC versions' graphics over their console counterparts. Of course, the other problem is that even though more people have PCs that can play Crysis than there are people with PS3s and Xbox 360s combined (and, ostensibly, if you can play Crysis, you built/purchased your rig to play games), PC game sales are abysmal, which is the biggest factor in why you don't see more effort being put into the PC versions of games. To make matters worse, one of the few games that sold over 100,000 copies this year was Call of Duty 4, which certainly had zero graphical upgrades for the PC over the console versions (just the usual culprits: AA, AF, and vsync). There really just isn't much incentive for developers to push the envelope on PC graphics, or for publishers to allow their developers to.
 
Well, SLI never really worked well in D3D from day one and still struggles to this day, and I won't even get into the awful vsync support. It always worked like it should in OpenGL; that's why Quake Wars does well. I hope Nvidia irons out the issues with the 9 series, but I'm not holding my breath. This is why dual-GPU cards fail as well: you have, say, an 8850 that has twice the RAM and GPUs, yet it runs like a single card most of the time. SLI works well with 3DMark because Nvidia made sure it would. ;)

As far as sound cards in Vista go, they are a waste for gaming imo. Creative doesn't even use a true DX10 path; it works by working around it rather than with it.


The thing we forget is that a lot of very good original games come to PC first simply because people can develop for the PC more easily than for a console, and with far less money. Take Far Cry, for example. Crytek was a nobody developer with big hopes and dreams of making a new action-packed FPS. They had more power to develop an engine with, and the ability to do almost anything they wanted with the game on the PC platform. Now look how well the game turned out. If Crytek had tried to make Far Cry for a console at the time, they may very well never have been able to get the game off the ground, and if they had, it probably would have turned out like crap. I imagine the same could be said for Half-Life, Painkiller, CoD, Quake, Doom, FEAR, and others. Even Halo was originally done for the PC, but with the extra backing from MS they were able to bring it to console. I still believe that if Halo had stuck with the PC it would have been a better game, but money talks.

We are still going to see new and original games come to PC first for the same reasons they always have. New developers are the ones who bring out games that stand apart from the rest, ones we will not hear much about until you read a review one day about how great they are. I see most new interesting console games being done by devs with lots of money and experience, like Borderlands, Project Origin, Fallout 3, etc. Though these games will probably be good, they will not be anything like the Dooms or Far Crys done on the PC with cutting-edge hardware and a fresh, do-whatever-you-want dev team.

The incentive for great PC games is more about the passion of a team of people wanting to make something great and break into the scene. From there they move on to bigger and better things (like money), such as console gaming. The same could be said for music bands or movie directors: a lot of great music and movies came from people struggling to make something they believed in or had a passion for, all the while hoping to make it big someday.

The point of my post... um, well, we need better graphics cards and PC hardware to pave the way for new and exciting games and future console hardware :cool:
 
I still believe that if Halo had stuck with the PC it would have been a better game, but money talks.

And Gearbox's PC version of Halo was much better than the Xbox version. Anyway, I pretty much agree with your post.
 
[The] 7950 GX2 was [snip] built on the 7950GT mobile chip, which used a lot less power and didn't have all the pipelines enabled like the regular 7950GT...

Yes, it did have all the pipes enabled. The 7950 GX2 has 16 vertex shaders, 48 pixel shaders, and 32 ROPs per card (8 vertex, 24 pixel, and 16 ROPs for each GPU, same as G70). You can check it here for confirmation.

And power consumption wasn't all that much better per GPU than the 7900GT, either. Improved, yes, but not dramatically.

And I immediately call BS on D9P being PCB backward-compatible with the 8600/7600/6600GT. Those boards all had a 128-bit bus wired into them. I could see a 7900GT-like PCB for these cards in regards to size, though. But yeah, you can't squeeze blood from a turnip.
 
I hope they release the high-end ones first so I can do the EVGA Step-Up program. :D
 
So if the performance is somewhere between the 3850 and 8800GT, that means it may be around 3870 performance? Hmm, I might get this one.
 
Probably the biggest bit of info I took out of that blurb was that the 9600GT won't support Tri-SLI. I had previously thought that nVidia just wanted to start getting Tri-SLI moving with the high-end cards (and, ostensibly, the enthusiasts) for the 8xxx generation, and then move down the ranks for the 9xxx generation. However, it seems they are still keen on keeping Tri-SLI to the higher-end cards. That's interesting, in that I don't know whether it's a commentary on Tri-SLI itself not being worth it at the lower card ranges, or if nVidia just feels that people picking up a 9600GT won't want to use Tri-SLI. Personally, though I'm set with my two 8800GTS's for a while now, I would use Tri-SLI capability more as a way to extend my rig's lifespan than get Tri-SLI on day one, much the same way I handle SLI.

Tri-SLI usually doesn't make much sense if you can get the performance with a single, much more powerful video card, or two somewhat more powerful cards in regular SLI.
 

That's if you're planning on purchasing all three cards at one time. However, if you purchase one card, then wait 6-12 months and go for another (the price drops quite a bit there), and then wait another 3-6 months (the price should be as low as it will ever be by then), it becomes "worth it".
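As a sanity check on that staggered-purchase logic, here's a toy cost comparison; all prices and depreciation percentages below are invented for illustration, not real card prices:

```python
# Hypothetical cost of three identical cards: all at launch vs. staggered buys.
# Prices and depreciation rates are made-up illustrative numbers.

launch_price = 300.0   # assumed launch price per card
after_6_12mo = 0.60    # assume the card sells at 60% of launch after 6-12 months
after_12_18mo = 0.45   # assume 45% of launch after a further 3-6 months

all_at_once = 3 * launch_price
staggered = (launch_price
             + launch_price * after_6_12mo
             + launch_price * after_12_18mo)

print(all_at_once, staggered)  # → 900.0 615.0
```

Of course, by the time the third card is cheap, a single newer card may beat all three, which is the usual counterargument.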
 

That's a big mess. If that's the case, you can just sell your old graphics card and get a new, more powerful one for a better price; you shouldn't ever need to mess with more cards if a single card can give more performance. You avoid all the SLI driver hassles, as well as having lower power consumption, with less heat to boot.
 
Crysis is the game that seems to be causing people to think their cards are slow. It's more like just bad game design. They add a forest of a zillion trees to the game, then it runs like crap... it's the zillion moving trees that are causing the slowdown in the game's fps. Then add AA to drawing all those trees, and it's not good.

The game itself seems to run better just playing it than it does when you run the benchmark program.
 
Yes, it did have all the pipes enabled. The 7950 GX2 has 16 vertex shaders, 48 pixel shaders, and 32 ROPs per card (8 vertex, 24 pixel, and 16 ROPs for each GPU, same as G70). You can check it here for confirmation.

Wow, I don't know how I screwed that one up. I never really paid attention to the 7950GX2 because I couldn't afford it, and even then I wouldn't have wanted it; I messed something up along the line there.
 

It's a game designed to push hardware... we are undoubtedly getting to the point where a realistic jungle setting like Crysis' should be doable with good graphics and a stable framerate. Granted, it was released before Crytek really wanted to release it, hence it is not the best-optimized game around at the moment, but that will be changing this month, and it will be interesting to see how performance changes when that patch arrives. Nevertheless, Crysis wasn't designed to be a game that everyone could run at Very High settings on day one; quite the contrary, in fact. Cevat Yerli himself noted that he wanted the game's graphics to scale into the future, so it is no surprise that few combinations of modern hardware can truly handle Crysis at its highest settings. One additional note: if nVidia had followed their previous release patterns, the 8800GTX would have been replaced already, and it's possible that a single card might then have been able to tackle Crysis on Very High at a decent resolution.
 

Have you even looked at the source code for the game? Have you really looked at the design documents?

I doubt it, considering both are internal-only. There is no reason to bash a game that has amazing graphics just because it beats the crap out of the current high-end graphics cards. The game is designed to do that. It's a high-end game designed to push the limits. Not bad design, not bad coding; they wanted to see what they could achieve by pushing the limits of technology. That's why technology advances.
 
I agree. In the end, it's just about how much the video card can do relative to what the game is asking. Crysis looks absolutely amazing; nothing else looks as good, and it's ahead of its time. Does that mean they can't tweak the engine to run better on current hardware? Not at all, and we'll see what the first patch does.
 
New high-end to be "announced in March/April"? It'll be 6 months before it becomes feasible/available to buy! :eek:
 
What about the lower-end cards (similar to the 7600GT and 8600GT)? Has anyone heard anything about those? That's what I need.
 