GF 9xxx in February 2008

The G80 has 681 million transistors, not 686 million. :D


Another thing: the next high-end card could get along just fine with a 256-bit bus (instead of 320, 384, or even 512), as long as the memory tied to it was fast enough.

Newer compression algorithms for color, etc., obviously play a large role with these newer chips, as they don't seem to be as hampered by their lower memory bandwidth.
There isn't memory that's fast enough to make up for the gap between 384-bit and 256-bit, not to mention the gap between 512-bit and 256-bit. Right now the GTX uses 384-bit at 1800MHz, which equals 86.4GB/s of bandwidth, and the Ultra has 2160MHz, resulting in 103.68GB/s. If the next gen has 256-bit, then it would have to use 2700MHz memory (which doesn't even exist) just to equal the GTX. That wouldn't even be close to the Ultra, plus that memory would be very expensive. The next gen, aka 9800, will definitely not use anything lower than 384-bit for the GTX, and it will certainly need the bandwidth.
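(For anyone who wants to check the arithmetic, here's a rough sketch of the GB/s math used above: bus width in bits divided by 8, times the effective memory clock. The 2700MHz figure is just the hypothetical clock a 256-bit card would need, not an existing part.)

```python
# Rough bandwidth math used above: bus width (bits) / 8 * effective clock (MHz) -> GB/s
def bandwidth_gbs(bus_bits, effective_mhz):
    return bus_bits / 8 * effective_mhz / 1000

print(bandwidth_gbs(384, 1800))  # 8800GTX:   86.4 GB/s
print(bandwidth_gbs(384, 2160))  # 8800 Ultra: 103.68 GB/s
print(bandwidth_gbs(256, 2700))  # hypothetical 256-bit part needed just to match the GTX: 86.4 GB/s
```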
 



You're wrong, check out Beyond3D and read up on it. :)
 
The cost of using extremely high-clocked, possibly engineering-sample (think 7800GTX 512) DDR4 would probably outweigh the cost of using a denser (more layers) PCB with more memory trace connections and easier-to-produce, lower-clocked DDR3 chips, adding one or two more memory channels.

Also, cannondale, did it ever occur to you that NVIDIA may not want to tell us that the G80 and G92 have more than 8 blocks of 16 SPs, because such information could hurt sales of current products (if they had plans to release a new card based on the same chip with more SPs)? How many people who bought the 8800GT wouldn't have knee-jerked into buying it on Oct 29th if they knew a slightly faster, slightly better 128-SP 8800GTS, filling the old 8800GTS's slot, would be around on December 10th? How many people would wait if they knew with certainty that something better was coming on some other date? See what I'm saying?

They can increase profit both by limiting the flow of information to the people who make the knee-jerk purchase, and by using it to their advantage to get rid of the first batch of chips, the batch that will have the most bad dies, some of which can be salvaged by disabling some SPs and resold.

What I am trying to say is, how would you know that the G80 and G92 are limited to just 8 blocks of 16 SPs, using anything other than the info NVIDIA makes available to you and the info that leaks to the press once the boards hit AIBs? Can you at least admit to me that the above is possible?
 
Well, if NVIDIA wanted to lie about their G80/G92 architecture, that would seem kind of odd. Also, wouldn't somebody have figured it out by now if NVIDIA were lying about how many blocks they have? Why are you so hell-bent on thinking there are more SPs available when there clearly aren't??

When it comes to the memory, who really knows? All we are doing here is speculating, but let's at least use common sense. I doubt NVIDIA will drop below the current 384-bit bus of the 8800GTX when they go to the 9800GTX. I personally think 512-bit might be a tad too expensive, but who knows? ;)
 
Perhaps you should actually look at what I was replying to. I didn't say anything about the 9800, and I have no idea what core it will use. All this started when Simpson5774 suggested a G92 with 160 SPs, and that is not possible because the G92 only has 8 blocks.
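For illustration, here's a small sketch of the cluster math behind that claim, assuming (as the publicly known specs suggest) that SPs come in clusters of 16 and that G92 has at most 8 clusters:

```python
# SP counts as multiples of 16-SP clusters; 8 clusters is the assumed G92 maximum.
SP_PER_CLUSTER = 16
G92_MAX_CLUSTERS = 8

for sp_count in (96, 112, 128, 160):
    clusters = sp_count / SP_PER_CLUSTER
    ok = clusters.is_integer() and clusters <= G92_MAX_CLUSTERS
    print(f"{sp_count} SPs -> {clusters:g} clusters, fits in an 8-cluster G92: {ok}")
# 96, 112 and 128 map to 6, 7 and 8 clusters; 160 would need 10, hence the objection.
```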

I did look at what you were replying to. Based on your comments so far, it seemed likely that you were overgeneralizing. Reading your posts after this one, I stand corrected--you're overspecifying instead. Since you were speaking specifically about the actual G92 (rather than doing as many have done and using that as shorthand for the entire upcoming generation), my comment stands corrected with regard to you specifically, but still makes an important point about the 9XXX series for the benefit of the thread as a whole (remember what the thread was about?).

However, your continuing bold assertions about what is and isn't possible or what will and will not be are plainly bluster. ATi's misadventures with a 512-bit bus and the success of the 8800GT with a 256-bit bus are ample proof that theoretical memory bandwidth is no match for architectural innovation. And yet you confidently claim (because you are a GPU engineer?) that nVidia must maintain and increase the top-end bus width of its GPUs.
 


You want more proof :p? The 3870 has a 256-bit bus with STOCK 1125MHz (2250MHz effective) DDR4, and it outperforms the 2900XT, which has 1600MHz 512-bit memory, in a few tests. I'd say that's enough proof that the current cards really are not that bandwidth-limited.
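Plugging those quoted clocks into the same back-of-the-envelope bandwidth math:

```python
# GB/s = bus width (bits) / 8 * effective clock (MHz) / 1000
print(256 / 8 * 2250 / 1000)  # HD 3870:   256-bit @ 2250MHz effective ->  72.0 GB/s
print(512 / 8 * 1600 / 1000)  # HD 2900XT: 512-bit @ 1600MHz effective -> 102.4 GB/s
# ~30% less raw bandwidth on the 3870, yet it still wins some tests.
```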
 
Yes, I remember what the thread was about, but I was replying to a specific comment about the G92 architecture. When it comes to NVIDIA's next completely new generation, I still say they will maintain 384-bit, or maybe go higher for the very top card. Of course I am no expert, but that just seems like the most likely thing to me. ;)
 
So to lighten the mood a little:

I wonder why Nvidia is choosing to go with the 9XXX pattern? I know it's "next", but I would skip it, if I were Nvidia marketing. It's going to confuse the crap out of the mainstream purchasers, as ATI already had a successful 9XXX series of cards. Yah, I know, that was years ago, we should know better. But will your average consumer? It's just confusing. I am surprised they didn't jump straight to the GeForce 10 (and they may still, the current naming is all speculation).
 
Is there something I am missing that prevents an intelligent naming scheme for GPUs? Look at AMD's CPUs: while not a direct measure of performance by any stretch, at least they make some sort of logical sense and progression. Why MUST they name over the top of current cards?
 

Now that I have vented, let me say that if the next cards have a fat bus and really use it well, that can only be a good thing. :cool: I don't know what is preventing larger-than-256-bit buses from making a tangible performance difference--I guess as Viper-X says, we're just not limited there yet. Hopefully we can get back to nice "even" doubles now that nV is on a smaller process.


I foresee continued struggles with naming. They want to convey the idea that higher=newer=better, but ATi already hit the 10,000 wall and tried to solve it by replacing it with an X. In my opinion it didn't work too well, especially with their X-heavy suffixes. Thankfully they seem to have dropped all the X's, but before long they'll be tripping over series numbers that overlap old nV cards.

Now nV is at the 10,000 barrier, and what will they do? They are trying the whole 9E-9P thing, which would take all the extra zeros out, but will they be bold enough to use that on the consumer side or only as codes that are familiar to enthusiasts? I hope they use it across the board, because then they would be good for dozens of generations before it started getting unwieldy again.
 

Well, I am not sure, but I don't think the 9800 Pro is still being sold brand new, so it shouldn't really be too confusing. But I know what you mean.
 
Was there any mention of the price for the cheaper one? Is it going to be the new GT, or what?
 
I feel that at least something needs to change. I would like to see M/P/E replace GS, GT, GTS, GTX, Ultra. Obviously the G means nothing at all, and the S, T, TS, TX don't really give any indication unless you are in-the-know enough to realize that performance goes in alphabetical order :p. Or at least, that's how it was until NVIDIA decided to have 4 somewhat different cards under the same model name (8800GTS). I'm honestly curious as to why NVIDIA thought it was a good idea to reuse that model name rather than going to 8850GTS and 8900GTS...

At least M/P/E actually stand for words! However, I feel that there's still ambiguity between Performance and Enthusiast - the average joe won't immediately realize that "Enthusiasm" is faster than "Performance" :confused:.

AMD has it right. Just about everybody knows that 3870 is a higher number than 3850. There's simply no ambiguity. You could break it down into 3=generation, 8=market segment, 7=variant but nobody actually needs to learn that to compare the two. It's just simple.
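Just to illustrate how little decoding a buyer would actually need to do, here's a toy sketch of that breakdown (purely illustrative; not anything AMD publishes):

```python
# Toy decoder for the AMD-style 4-digit model numbers described above (illustrative only).
def decode(model: int) -> dict:
    digits = str(model)
    return {
        "generation": int(digits[0]),  # 3 -> HD 3000 series
        "segment": int(digits[1]),     # 8 -> high-end segment
        "variant": int(digits[2]),     # 7 vs 5 -> faster vs slower variant
    }

print(decode(3850), decode(3870))
# But a plain numeric comparison (3870 > 3850) already tells you which is faster.
```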

Stupid suffixes are far more likely to confuse the uninformed consumer than the NVIDIA 9800 vs ATI 9800 issue, because the ATI 9800 is not being sold, and hasn't been for years. So I really just hope we see card names like GeForce 9850, 9800, etc.

As for the 10,000 issue... I don't know, maybe it's time for NVIDIA to retire the GeForce name? They should hold another contest :D.
 

Going by the last few releases, I think it's safe to assume there will be two high-end parts, one with less RAM, slower clocks, and/or disabled shaders.
 
Hell Yes! Thanks for posting. I am building my rig, and it just so happens it was going to be finished around February, and the 9 series will be my last purchase... :D
 
I'm ready to upgrade to the new top-end GeForce 9. The only thing I'm concerned about is the power requirements. I have an Enermax Galaxy DDX 850W and I'm not sure it's going to be enough. I hope it uses the same or less power than an 8800GTX. I also hope the PCI Express power requirement stays at 6-pin, because I'd hate to change that power supply. It's a pain taking it out of my case. It was a pain to fit it, and I scratched the nice paint on the PSU doing so.
 

Same here. I haven't even finished my new system yet and my PSU with the 6-pin PCIe power connectors is already outdated.
 
I don't know, maybe it's time for NVIDIA to retire the GeForce name? They should hold another contest :D.

Amen to that, brother! I'm sick of the same old names. GeForce has been around for nearly as long as the Pentium name, which Intel has gotten rid of. Same for Radeon.

I think it would be more interesting if NVIDIA and ATI gave new names to every generation of cards instead of just changing the numbers. Honestly, numbers are getting old and boring.
 

Dropping the GeForce name would have its disadvantages, though. The GeForce name is well recognized and could cause Joe Schmoe to ask "Where did GeForce go? They were a great company" when the new nVidia cards are sitting on the shelf in front of him. Even so, it's time for a change.

I don't think a new name for every generation is the world's best idea though. Numbers are old and boring but they're easy to understand.
 
I wouldn't miss the GeForce name. I thought it was a stupid name when they announced it as the winning name for the new GPU at that time.
 
What are some of the old-school video card names that never happened? I know 3dfx had some cool stuff planned for their future cards beyond the 6000, as well as nVidia and ATI.

Voodoo "Banshee"
ATI "Dragons Head" or something ??
nVidia ??
 
Banshee did happen. It was a less powerful Voodoo 2 derivative with onboard 2D capabilities (the regular ones needed a separate card for 2D).
 
The new card must be able to run Crysis with Very High settings at 1600 resolutions for me to consider it a true heir to the 8800GTX. That card was finally able to run Oblivion at 1900 resolutions with all settings maxed and transparency AA. If gamers need two G92 premium cards to run Crysis, then it's only a revision, a slightly faster cousin at best.
 
If the card is in fact a 9800GTX (rather than an 8900), I would expect it to be roughly 40% faster than an 8800GTX, give or take. This is just based on history.
 

40% would be a huge letdown for a product coming out almost two years later; the 8800GTX came out in 2006 and this one in 2008.

Based on history, the next-generation high-end product from nVidia has always beaten the previous one in SLI, i.e., one 9800GTX should be as fast as two 8800GTXs in SLI.

Because one 8800GTX = two 7800GTXs, and one 7800GTX = two 6800 Ultras.
 
This would be good, since it takes 8800GTXs in SLI to run Crysis at high settings in 1080p. I'm not buying the game until I have a card that can do that.
 

:confused: I really didn't think the 7800GTX was that much of a revolutionary jump in speed. I do think the 8800 bit off a big chunk in a hurry. I'm hoping the next card at least does to 8800's what the 8800 did to 7800/7900
 
Yes it was. The 7800GTX was TWICE as fast as the 6800 Ultra. Even the midrange 7600GT was faster than the 6800 Ultra. http://www23.tomshardware.com/graphics_2007.html?modelx=33&model1=713&model2=720&chart=318



New generations should be about twice as fast as their predecessors. With Crysis, even doubling the performance of the 8800GTX will not be enough for Very High settings, AA, and resolutions like 1920 and above.
 
Now, G92 has pretty much all the same bits and pieces that G80 has. The difference is, it's 65nm and it has "VP2" like the 8600GTS, so it's better at video decoding than the 8800GTX because the decode functions are now on the chip.

G80 to G92 is like NV40/41 to G70 (formerly NV47). It's not as dramatic of a change as (G70 to G71), but more like a new chip revision with some add-ons; barring the die shrink, not too terribly spectacular.

The reason I bring all this up is that G92 has a slightly different ratio of texture addressing units to filtering units than the G80: G80 has 32 texture addressing units / 64 filtering units, where G92 has 64 texture addressing units / 64 filtering units.

http://www.anandtech.com/video/showdoc.aspx?i=3140&p=3

Now here's the interesting factor: according to that link, the 8800GT with 112 SPs has 16 ROPs, where the 8800GTX has 24 ROPs. The 8800GTS with 96 SPs has 20 ROPs.

Now, for all I know, there should be no reason why there would be fewer than 24 ROPs built onto the G92 chip. I don't really know the formula for how you get the number of SPs, but my question to you, and to anyone else questioning my speculative comment, is: is there any reason why there would be fewer than 24 ROPs, whether functioning or not, on the G92 die? And why does the ratio change between G80 and G92?

There is not enough data either way to know how many ROPs are on the G92 core at this point in time. We know for sure that there are at least 16 ROPs, because from the data we have available, ROPs are tied to bus width: for every 64-bit channel you have 4 ROP units. So there are at least 16 on the G92 core, with 24 on the G80 core.
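A quick sketch of that relationship as described here (4 ROPs per 64-bit memory channel):

```python
# ROP count implied by bus width, assuming 4 ROPs per 64-bit memory channel as stated above.
def rops_from_bus_width(bus_bits, channel_bits=64, rops_per_channel=4):
    return (bus_bits // channel_bits) * rops_per_channel

print(rops_from_bus_width(256))  # 8800GT  (G92, 256-bit) -> 16 ROPs
print(rops_from_bus_width(320))  # 8800GTS (G80, 320-bit) -> 20 ROPs
print(rops_from_bus_width(384))  # 8800GTX (G80, 384-bit) -> 24 ROPs
```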

There is also no formula for the stream processors, as they are not tied to ROPs or bus width. Take a look at the eVGA SSC with 96+ SP units, which is a 112 SP part based on the older G80 core rather than the newer G92 core. This card still has a 320-bit wide memory interface, so it still only has 20 ROPs. Nvidia can activate or deactivate SP clusters at will, independent of the memory channels.

I would say the G80 to G92 transition at this time is much more similar to the G70 to G71 transition than to NV40 to G70, from the perspective that the overall SP count has not increased; as of now, the highest SKU we know of based on the G92 is the newer version of the 8800 GTS with 128 SPs, coming out in two weeks or so. Most of the changes in this transition have been cost cutting, which is very similar to G70 to G71.

I don't think we will see major functionality added until D9E, coming out in February 2008, which I don't think will be based on G92. Maybe it will be based on "G90" or whatever 192 SP part on the 65nm process; who knows at this point, Nvidia is pretty good at keeping secrets.

I also think the G80 to G92 transition can be compared to NV40 to NV42. Remember the 110nm GeForce 6800 GS? That was a nice performance-mainstream part; the only difference in that case is that the 7800 GTX was already out at that time.

I expect the D9E, or GeForce 9800 GTX or whatever the name may be, will make the GeForce 8800 GT look like the GeForce 6800 GS did next to the 7800 GTX.

The ratio of texture address units to texture filtering units changed because that improvement was already pioneered back on the G84 core with the 8600 series, so the G92 borrows both the VP2 improvement and the texture address unit improvement; not entirely surprising.

From what we have seen, ROPs are not tied to texturing units. The eVGA 8800 GTS SSC 96+ (112) and the 8800 GT prove that.
 
So to lighten the mood a little:

I wonder why Nvidia is choosing to go with the 9XXX pattern? I know it's "next", but I would skip it, if I were Nvidia marketing. It's going to confuse the crap out of the mainstream purchasers, as ATI already had a successful 9XXX series of cards. Yah, I know, that was years ago, we should know better. But will your average consumer? It's just confusing. I am surprised they didn't jump straight to the GeForce 10 (and they may still, the current naming is all speculation).

That was ages ago in computer hardware terms; the Radeon 9xxx series debuted in 2002, and we're going to be in 2008 by the time the GeForce 9 series hits. It's just logical not to skip a number. It's not confusing when the potentially confusing hardware has long been discontinued.

Regarding GeForce 10, I am not sure that is so wise; having 5 digits in a marketing name is a bit much. Look at how everyone seems to be sticking to 4-digit sequences. Maybe they will go GeForce X800, who knows?
 
Is there something I am missing that prevents an intelligent naming scheme for GPUs? Look at AMD's CPUs: while not a direct measure of performance by any stretch, at least they make some sort of logical sense and progression. Why MUST they name over the top of current cards?

The reason is that generational changes happen more often on the GPU front than they do on the CPU front. The graphics card companies only have one brand to work with, while companies like AMD and Intel have several:


Nvidia: Geforce
ATI: Radeon

Intel: Core 2, Pentium, Celeron
AMD: Phenom, Athlon, Sempron
 
:confused: I really didn't think the 7800GTX was that much of a revolutionary jump in speed. I do think the 8800 bit off a big chunk in a hurry. I'm hoping the next card at least does to 8800's what the 8800 did to 7800/7900

Because we had the X850 XT PE to compare it with, and compared to that card it was at times only 20-30% faster, while it was at least 60-70% faster than the 6800 Ultra.

This time around, with the 7900 to 8800 transition, you only have the NVIDIA series to compare it to, as ATI was late to the party, and you got at least a 70-80% increase, and even better in shader-intensive games that the 7900 was weak at.


Yes it was. The 7800GTX was TWICE as fast as the 6800 Ultra. Even the midrange 7600GT was faster than the 6800 Ultra. http://www23.tomshardware.com/graphics_2007.html?modelx=33&model1=713&model2=720&chart=318

New generations should be about twice as fast as their predecessors. With Crysis, even doubling the performance of the 8800GTX will not be enough for Very High settings, AA, and resolutions like 1920 and above.

That is the 7800 GTX 512, which is an entirely different SKU. I am sure the guy was talking about the original June 2005 7800 GTX 256, which was not quite as powerful.
 
The 9800GTX will make the 8800GT look like what it really is: just a $200 video card... Historically, the $200 cards were always average or just OK for gaming, but the $400-$500 cards were the geek's dream for running games the way they were meant to be run. :) :)
 