(G92) 8800GT first review!

Well, the first orders will be first-gen hardware. Let's hope there aren't any problems à la the 2900XT.
 
Seems to be the only brand showing up via Froogle.

Yeah, my guess is that they were first to the distributors so far :). Someone linked a BFG card from an individual seller on a forum before, but as far as vendors go, I've seen nothing yet but PNY.
 
Wow... when I first heard of this 8800GT priced at ~$200, I thought it'd be a bit slower than the GTS. But this is crazy!!!
 
Re-read my post; I added some color to help you with the important bit. Remember, these are launch prices; prices only go down from there.

Also consider that an 8600GTS replacement on a smaller, faster process is likely. They can't release every chip at the same time; they need to concentrate where it makes the biggest difference. The G92 is a fantastic midrange part for anyone in that market, and it exceeds what everyone was calling for in a midrange card.

Wow, I went to work, came home, and there are 5 more pages.

Nvidia makes the bulk of their money on parts costing $200 or less, and that space is currently filled by the 8600GTS and below. The 8400 and 8500 series are already 65nm and they're going to be slow anyway; no one is going to dispute that, but the 8600GT(S) is the real issue here. If you remember the 6600GT and 7600GT, Nvidia got the price/performance ratio dead on at $200 or less, and I'm sure if this card hits $200 (at least the 256MB version) it will do the same and make tons of money; those were their best-margin cards. The 8600GT(S), not so much, but they were still alright.

All that being said, the reason the 6600GT and 7600GT were their best-margin cards was simple: the PCBs cost less, the power circuitry cost less, and the chips themselves were designed to cost less and have a higher yield rate. They were also great because their performance-to-price ratio was pretty bitchin'. I almost bought one, but I held out and 8 months later bought a 6800GT instead (3 weeks before the 7800s came out :( ).

Now, the G92 is a new chip (with millions in development behind it) and for all intents and purposes could be their next high-end chip, just like the G80; a new high-end board could have as many as 160 SPs unlocked, just like the G80 (the G80 does have 160 SPs with only 128 enabled; it's not well known, but it is a fact). That being said, if they sell this board for $200-250, that's really cutting into the profits (yes, 65nm makes it cheaper and more efficient to crank them out, but it still costs money), and I don't think those Flextronics PCBs are that cheap, not for the 8800 line and the power it needs to be fed; at least more than what it costs to produce an 8600GTS PCB, which I believe uses the same design, pinouts, and circuitry as the 6600GT and 7600GT (also why they're cheap). So selling 30,000 boards that cost $150 each to produce isn't as good as selling 75,000 boards that cost $80 each (just a hypothetical).
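Just to put rough numbers on that hypothetical (the sale prices here are my own made-up assumptions for illustration, not real figures):

```python
# Back-of-the-envelope margin math for the hypothetical above.
# Sale prices are assumptions for illustration, not real figures.

def total_profit(units, sale_price, unit_cost):
    # Total profit = units sold * per-board margin
    return units * (sale_price - unit_cost)

# 30,000 expensive boards at $150 to produce, assume they sell for $250:
print(total_profit(30_000, 250, 150))  # 3,000,000

# 75,000 cheap boards at $80 to produce, assume they sell for $150:
print(total_profit(75_000, 150, 80))   # 5,250,000
```

Even with a smaller per-board margin, the cheap boards win on volume, which is the whole point about the midrange.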

So one of two things needs to happen: the 8600GT(S) needs to be killed off immediately and relaunched (which doesn't sound so unreasonable since they're both 80nm chips), or the 8600GT (or GTS) needs to be killed off and the other one dropped to $100, because that's the max that card is worth. More importantly, anyone with a clue wouldn't buy one for $150-200 given the EXTREME performance gap between the 8600GTS and the 8800GT.

Also, Nvidia would need to drop the GTS/Ultra down to $400 if they don't intend to replace them with something newer in the lineup before the next series, or risk bad PR on price for those two cards (who wants an Ultra for $550+ when you can get a GT for $250 and overclock it to match?).

I am all for Nvidia kicking ass and taking names where it counts ($200-250), but they really don't want to forget about the customers who got them to where they are now ($200 and less), or those customers might just start forgetting about them, if you know what I mean. The 2600XT is cheaper (and slower) than the 8600GTS, but would the speed matter if the 2600 is $50 cheaper and Nvidia is ripping you off?
 
It beats ATi's top end card at a fraction of the cost... sad to say it but ATi and AMD went down the drain.
 
It beats ATi's top end card at a fraction of the cost... sad to say it but ATi and AMD went down the drain.

This statement completely ignores the numerous rumors of an RV670 chip launching in mid-November.
 
So there are no complaints about this site not being legit?

If it's true, I think it's time to eBay my 320MB. Sometimes the 640 is close, but the 320 is always behind =\
 
Because 3DMark is such an awesome game and is the be-all and end-all of how a game will perform with 8xAA, or just in general. :rolleyes:

He asked for an 8x AA to 4x AA comparison with the same application... stop nitpicking and let it go already.
 
Think it would be a good idea to sell my 8800GTS 640 for like $300 and buy the GT?
 
He asked for an 8x AA to 4x AA comparison with the same application... stop nitpicking and let it go already.

Nitpicking, eh? Well for one, he didn't specify 3DMark scores, and I doubt he gives a shit about 3DMark scores; anyone looking for a comparison between cards using 3DMark needs to pull their head out of their ass. See http://enthusiast.hardocp.com/article.html?art=NDkxLDEsLGhlbnRodXNpYXN0 if you disagree.

But since I'm such a nitpicker, let me break it down for you. The 8600GT gets good 3DMark06 scores but falls flat on its face in games. Why? Its 128-bit memory bus. So, yes, 3DMark06 is an awesome program for testing the limits of the memory bus. Once again... :rolleyes:
 
Nitpicking, eh? Well for one, he didn't specify 3DMark scores, and I doubt he gives a shit about 3DMark scores; anyone looking for a comparison between cards using 3DMark needs to pull their head out of their ass. See http://enthusiast.hardocp.com/article.html?art=NDkxLDEsLGhlbnRodXNpYXN0 if you disagree.

But since I'm such a nitpicker, let me break it down for you. The 8600GT gets good 3DMark06 scores but falls flat on its face in games. Why? Its 128-bit memory bus. So, yes, 3DMark06 is an awesome program for testing the limits of the memory bus. Once again... :rolleyes:
The 8600GT would suck almost as bad with a 256-bit bus. It's the 32 shaders that are killing it.
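For what it's worth, the raw bandwidth math (a rough sketch; the clocks are the stock specs as I remember them, so treat the exact figures as approximate):

```python
# Peak memory bandwidth = (bus width in bytes) * effective data rate.
# Clocks are the stock specs as I remember them; treat as approximate.

def bandwidth_gbs(bus_bits, effective_mhz):
    # bus_bits / 8 bytes per transfer, effective_mhz million transfers/sec
    return (bus_bits / 8) * effective_mhz / 1000

print(bandwidth_gbs(128, 2000))  # 8600GTS, 128-bit @ 2 GHz effective: 32.0 GB/s
print(bandwidth_gbs(256, 2000))  # same card with a 256-bit bus: 64.0 GB/s
print(bandwidth_gbs(256, 1800))  # 8800GT, 256-bit @ 1.8 GHz effective: 57.6 GB/s
```

Doubling the bus doubles the peak, but if 32 shaders can't generate enough work to consume it, the extra headroom mostly goes to waste, which is the argument here.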
 
He asked for an 8x AA to 4x AA comparison with the same application... stop nitpicking and let it go already.

Maybe I should have been clearer. I assumed that, this being the [H] of all places, it was understood I was referring to game benchmarks and that "no one" cares about 3DMark.


and I doubt he gives a shit about 3DMark scores; anyone looking for a comparison between cards using 3DMark needs to pull their head out of their ass. See http://enthusiast.hardocp.com/articl...hlbnRodXNpYXN0 if you disagree.

But since I'm such a nitpicker, let me break it down for you. The 8600GT gets good 3DMark06 scores but falls flat on its face in games. Why? Its 128-bit memory bus. So, yes, 3DMark06 is an awesome program for testing the limits of the memory bus. Once again...

Bold highlights the truth, btw. The HD2900XT showed how much 3DMark is worth in the real world; if I wanted to stop at 4xAA I would've gotten an HD2900XT, tbh. Like I was trying to say/imply, there's no 4xAA to 8xAA in-game comparison here, and Hard|OCP is one of maybe two reputable sites on the net that do them. I plan to continue using 8xAA 24/7, and I'm very skeptical of this card with its smaller bus.

EDIT: Edited for completeness via nissanztt90 :p.
 
I respectfully disagree.


EDIT: w00t.
I would love for you to explain to me why an 8600GT would be much better with all the same specs except a 256-bit bus in place of the 128-bit one. Overall it probably wouldn't make much difference in real-world gaming. I'm not saying it wouldn't help, but it's not the main reason the card sucks. IMO, having 48 shaders would have done much more for it than a 256-bit bus.
 
I would love for you to explain to me why an 8600GT would be much better with all the same specs except a 256-bit bus in place of the 128-bit one. Overall it probably wouldn't make much difference in real-world gaming. I'm not saying it wouldn't help, but it's not the main reason the card sucks. IMO, having 48 shaders would have done much more for it than a 256-bit bus.
I don't get it either. Everyone knows by now that it's the lack of shaders that hurts the 8600 series more than anything else; bumping it up to 256 bits would help, but the card still wouldn't have enough shaders to process all of the data that would now be funneled to it.

Hell, these 8800GT numbers are enough to prove that.
 
I don't get it either. Everyone knows by now that it's the lack of shaders that hurts the 8600 series more than anything else; bumping it up to 256 bits would help, but the card still wouldn't have enough shaders to process all of the data that would now be funneled to it.

Hell, these 8800GT numbers are enough to prove that.

It's not just the shaders; it has horrible fill rate, and that's its main weakness.
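Rough fill-rate math to back that up (unit counts and clocks are the published specs as I recall them, so double-check before quoting me):

```python
# Fill rate = number of units * core clock.
# Unit counts and clocks are the specs as I recall them; double-check.

def fill_rate_g(units, core_mhz):
    # Result in gigapixels or gigatexels per second
    return units * core_mhz / 1000

# 8600GTS: 8 ROPs / 16 texture units @ 675 MHz core
print(fill_rate_g(8, 675), fill_rate_g(16, 675))    # ~5.4 Gpix/s, ~10.8 Gtex/s

# 8800GT: 16 ROPs / 56 texture units @ 600 MHz core
print(fill_rate_g(16, 600), fill_rate_g(56, 600))   # ~9.6 Gpix/s, ~33.6 Gtex/s
```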
 
I don't get it either. Everyone knows by now that it's the lack of shaders that hurts the 8600 series more than anything else; bumping it up to 256 bits would help, but the card still wouldn't have enough shaders to process all of the data that would now be funneled to it.
Agreed, but the wider bus would help. It wouldn't make the card particularly more impressive in some scenarios, but it would help to some degree in many scenarios, especially with regard to AA.
 
How about everyone go on Google and post places to preorder this thing already... lol
 
Well, at least for those running GPUs that are already lower-end than the GT, it will be a big step up.

I'm running a GTX, but in my wife's system (yeah, she plays FPS games with me) I put my old X1900XTX. For a measly $200, the GT is (or should be) a huge step up from that, considering it's currently showing as between the GTS and GTX.

So I'll be snagging one to throw in her system, as long as [H] gives it a solid work-through and it doesn't end up being a "dud" in any way, especially in terms of handling AA.
 
I thought you didn't believe in rumors ;).

Where there's smoke there's fire, and that fire is real, but the rumors are like people trying to identify the cause by looking at the smoke on a television screen 100 miles away; they can't tell you, but they figure it's their job to report it to you anyway.

What I'm trying to say is: don't take information reported by sources other than the manufacturer or vendor at face value. Question its validity; don't blindly accept something just because you are TOLD it. That doesn't mean they're wrong, or even off.

But in June, July, and August, all we were told about this video card was that it was called G92 and that it did one teraflop. If the 8800GT can't beat an 8800 Ultra, and that card does 300-something GFLOPS, what does that tell you? It pisses me off when people propagate information they learned third-hand with nothing to back it up, but they do it anyway. They spread this stuff and convince people not to buy the current cards because "something is coming and it's much faster". Isn't that sabotage?
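For reference, the shader math behind those figures (a sketch; this counts only the MADD as 2 flops per SP per clock, the conservative way those numbers were usually quoted, and the clocks are the stock specs as I recall them):

```python
# Shader throughput: GFLOPS = SPs * shader clock (GHz) * flops per SP per clock.
# Counting only the MADD (2 flops); clocks are stock specs as I recall them.

def gflops(sps, shader_clock_ghz, flops_per_clock=2):
    return sps * shader_clock_ghz * flops_per_clock

print(gflops(128, 1.35))   # 8800GTX, 128 SPs @ 1.35 GHz: ~345.6 GFLOPS
print(gflops(128, 1.512))  # 8800 Ultra, 128 SPs @ 1.512 GHz: ~387 GFLOPS
# A "one teraflop" G92 would need roughly triple that; it never added up.
```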

But I digress. Just because someone reports smoke and calls it a chemical fire when it turns out to be arson doesn't mean I don't believe a fire is burning. What can I say, I'm a cynic; your statements are false to me until proven true.

I would love for you to explain to me why an 8600GT would be much better with all the same specs except a 256-bit bus in place of the 128-bit one. Overall it probably wouldn't make much difference in real-world gaming. I'm not saying it wouldn't help, but it's not the main reason the card sucks. IMO, having 48 shaders would have done much more for it than a 256-bit bus.

It's bitchin' for games below 1280x1024, but the memory bus is choking those 32 stream processors, so they're not the problem. There's a clear line of demarcation in the benchmarks: when you do something memory-intensive (something 128-bit won't cut it for), performance just drops off.
 
I don't get it either. Everyone knows by now that it's the lack of shaders that hurts the 8600 series more than anything else; bumping it up to 256 bits would help, but the card still wouldn't have enough shaders to process all of the data that would now be funneled to it.

Hell, these 8800GT numbers are enough to prove that.

People here seem to be obsessed with memory bus width. Few people recommended the 7600 GT over the X800/850 XT because it only had a 128-bit memory bus, even though it was cheaper, smaller, quieter, and had basically the same performance. Fast forward a bit and you see people slamming the 8600s and acting like no one in their right mind would get an 8600 GT or GTS over an X1950 Pro or 7900 GS. Sometimes they cite the slight performance advantages of those cards over the 8600s, but usually they just say "the small memory bus chokes the life out of that card" as if that settles the argument.

But whether it's the memory bus, the SPs, or the fill rate that hurts the 8600s most, it's really rather silly to chastise nVidia for their "stupidity" in designing it that way. Yes, if they increased all three, it would perform like an 8800. That's because it would be an 8800, and it would be priced accordingly. There's no magical way to squeeze vast increases in performance out of a midrange card without increasing the price to match. Bus width isn't some magical talisman of GPU goodification (to borrow a term from Frank Caliendo); it's a tradeoff that has its costs, just like everything else.

P.S. Anyone who thinks the 8800 GT will launch at $150-$200 needs to have their head examined and all the wishful thinking surgically removed.
 
WHATS THE RELEASE DATE ????? O and For sure is this card better than the 8800gts 640 mg ? If so im selling it tonight to a friend for 280 SHOULD I DO IT >>???? LET ME KNOW
 
WHATS THE RELEASE DATE ????? O and For sure is this card better than the 8800gts 640 mg ? If so im selling it tonight to a friend for 280 SHOULD I DO IT >>???? LET ME KNOW

Maybe with a little less caps lock I would have answered your question, but because of it, you're going to need to read the first 9 pages of this topic first. And don't say you did, because you wouldn't be asking if you had.

Edit: You PM'd me? You asshole, I'm not giving you information because you can't sit still long enough to read some forum posts. Get some Ritalin, dumbass.
 
Maybe with a little less caps lock I would have answered your question, but because of it, you're going to need to read the first 9 pages of this topic first. And don't say you did, because you wouldn't be asking if you had.

Edit: You PM'd me? You asshole, I'm not giving you information because you can't sit still long enough to read some forum posts. Get some Ritalin, dumbass.

Sorry, seriously, this guy is basically giving me seconds to make my choice, so I need answers ASAP. Please get back to me.
 
Sorry, seriously, this guy is basically giving me seconds to make my choice, so I need answers ASAP. Please get back to me.

I answered just before your post; go read! There's no possible situation where you'd need the info faster than you could find it in this thread. Heck, you probably spent more time asking than looking ;)!
 
But whether it's the memory bus, the SPs, or the fill rate that hurts the 8600s most, it's really rather silly to chastise nVidia for their "stupidity" in designing it that way. Yes, if they increased all three, it would perform like an 8800. That's because it would be an 8800, and it would be priced accordingly. ..., it's a tradeoff that has its costs, just like everything else.

IIRC, the use of a 128-bit bus was partly to make a GPU that was pin-compatible with the existing low-end 7xxx PCBs. Of course, that raises the question of why they didn't do the same with the 256-bit designs...

(Hell, they could have created a 192-bit bus, given how successful the 384-bit GTX design proved to be.)
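Rough numbers on that idea (a sketch; a 192-bit bus is just six 32-bit channels, and I'm borrowing the 8600GTS's 2 GHz effective memory clock as an assumption for comparison):

```python
# Bandwidth scales linearly with bus width; 192-bit is six 32-bit channels.
# The 2 GHz effective clock is an assumption borrowed from the 8600GTS.
for bus_bits in (128, 192, 256):
    gbs = (bus_bits / 8) * 2000 / 1000
    print(f"{bus_bits}-bit: {gbs:.1f} GB/s")  # 32.0, 48.0, 64.0
```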
 
sorry seriously this guy is basically giveing me seconds to make my choice so i need answers asap please get back to me

Um, he is giving YOU seconds? And you are taking it like a little bitch? Grow some balls. There is always tomorrow.
 
At the risk of getting flamed myself ... :)

Guys, get a room please... and webcam the results, thanks.
 
SWEET, 20W less consumption at load makes me really happy. I've been upset over the 8800 GTS, which consumes an ungodly 110W at full load... 90W plus better performance is a little easier to swallow.
 
I really wish it blew the hot air outside the case, though, even though it's nice that it doesn't require a 2-slot cooler.
 
Has anyone noticed that a few months before the launch of a new card, companies start shipping free games and massive mail-in rebates? Almost every Nvidia package right now comes with a free game.
 
Has anyone noticed that a few months before the launch of a new card, companies start shipping free games and massive mail-in rebates? Almost every Nvidia package right now comes with a free game.

Was the same way when I ordered my card in Feb... $30(?) MIR + free game.
 