Nvidia Readies New “G92” Graphics Processing Unit for November

Svengali

http://www.xbitlabs.com/news/video/display/20070902154658.html

Nvidia Corp., the world’s largest supplier of discrete graphics processing units (GPUs), reportedly plans to update its lineup of chips in mid-November this year. While media reports claim the company has no plans to dethrone its high-end GeForce 8800 Ultra with the new product code-named G92, it is highly unlikely that Nvidia will forgo a new top-of-the-range product this fall.

Several media reports claim that the G92 chip, due to be commercially launched on the 12th of November, will be positioned as a performance-mainstream part, not as an ultra high-end product. The chip will sport an improved PureVideo HD video engine, a PCI Express 2.0 bus, and DVI, DisplayPort and HDMI outputs, media reports claim. It remains to be seen whether the new product will also feature DirectX 10.1 capabilities. The new chip is projected to be made using 65nm process technology at TSMC.

Earlier this year Michael Hara, vice president of investor relations at Nvidia Corp., said that the company’s forthcoming flagship product would have peak computing power close to 1TFLOPs, about two times that of the current code-named G80 chip, which is used in the GeForce 8800 GTS, GTX and Ultra products.

Nvidia’s code-named G80 chip was introduced in mid-November, 2006, as a high-end offering from the company. Made using 90nm process technology, the chip featured 681 million transistors, not counting its separate output logic, which means that the solution was quite expensive to build.

Santa Clara, California-based Nvidia Corp. already has a history of creating expensive solutions and then making them faster and more affordable to manufacture. In June, 2005, Nvidia introduced its G70 processor, which powered the company’s GeForce 7800 family of products and had a rather large die size. By March, 2006, the company had released its G71 chip, which had the same capabilities as its predecessor but could operate at higher clock-speeds and was also cheaper to make. While the GeForce 7900 did not offer a two-times performance improvement over its predecessor, the GeForce 7950 GX2 graphics card boasted unbelievable peak computing power for its time, as it featured two G71 GPUs.

Besides creating a chip that would feature over a billion transistors, Nvidia may choose to produce a dual-chip graphics card based on relatively inexpensive graphics processors. Unfortunately, there is no reliable information on whether Nvidia’s G92 is actually a complex high-end graphics processor or a chip of modest complexity meant to replace the Nvidia GeForce 8800 GTS and power a dual-GPU “GX2” graphics card.

Nvidia does not comment on news stories.
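
As an aside on the article's numbers, here's a back-of-envelope check of the "close to 1TFLOPs, about two times G80" claim. The 128 stream processors and 1.35GHz shader clock below are the published GeForce 8800 GTX specs; counting 3 flops per clock (MADD + MUL) is the usual convention, so treat this as a sketch, not gospel:

```python
# Rough peak-throughput check for the "~1 TFLOPS, about 2x G80" claim.
# 128 SPs @ 1.35 GHz are the published 8800 GTX specs; 3 flops/clock
# (MADD + MUL) is the usual counting convention.
def peak_gflops(stream_processors, shader_clock_ghz, flops_per_clock=3):
    return stream_processors * shader_clock_ghz * flops_per_clock

g80 = peak_gflops(128, 1.35)          # ~518 GFLOPS (8800 GTX)
print(f"G80 peak: ~{g80:.0f} GFLOPS")
print(f"2x G80:   ~{2 * g80:.0f} GFLOPS, i.e. close to 1 TFLOPS")
```

Doubling the ~518 GFLOPS figure lands at roughly 1037 GFLOPS, which is where the "close to 1TFLOPs" marketing line comes from.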
 
I have a feeling the card will be as powerful as the 8800GTS/GTX while putting out less heat, and hopefully being a single-slot card.
 
Well this "news" report pretty much sums up what we already know lol, didn't really need another thread for this... I'm just waiting for it to be officially confirmed that the G92 is not a high-end product (it has a #2, sheesh), then I'll commence with a "I told you so" thread :p. Seriously though, I don't think they'll do 3 card releases in 8 weeks time. You've got the G92 and the lower end G98, I just don't see a new high-end card being released in 2007. 2008 (even early in the year) is wide open and I'd expect it sometime then.

I don't see how this card (clocked much higher, like the 8600GTS) will replace the 8800GTS performance-wise with a 256-bit bus. It'll fit right in between the two for that sweet spot of performance and price, especially since it isn't a new core to boot. I consider the G92 an 8600GTS taken to the next step, what the 8600GTS should have been ;).
 
I'm cool with this. Looks like I guessed right going with the 8800GTX then.
 
I'm cool with this. Looks like I guessed right going with the 8800GTX then.

It only makes sense imo, the G80 core is by far the most expensive development they've done yet. They need mainstream parts to pull in more sales/revenue, hence more money to fill the hole left by their R&D. I'm not saying they're "struggling" for revenue, but they certainly need it like anyone else. Most people are still on the 7x00 and X19x0 series; G80 customers are still a small minority even though the 8600s have started to contribute to sales. The G80 line needs to give your average PC gamer more performance for the "right amount" of money ($150-$200). Right now, the overall better choices are still the X19x0- and 7900-class cards.

Nvidia invested time into Vista and DX10 (and still is, on the driver side), and I think they're pushing people to go DX10/Vista. Why else release the G92 so close to Crysis' release ;)? The damn 8600GTS can't really cut it in DX10 (and when it can, DX9 looks better).
 
So is this "G92" just gonna be a refresh for the 8600's? OR is it gonna do anything to the 8800 line? I know it says that Nvidia doesnt want to dethrone its 8800GTX's with a new gen of cards, but does that mean we could see a improvement like a 8850GTX?
 
Like the story said, I think that it is highly unlikely that nVidia WON'T release a new top-of-the-line consumer GPU this year. They've done so every year like clockwork for the last six years, so I don't know why this year would be any different (the 8800 Ultra is a refresh, not a new chip). At the very least they'll paper launch it this year, but I'd be shocked if they don't have a new hard-launched top-of-the-line product in November.

We should know one way or the other no later than the beginning of next month, for sure.
 
While they did release a new high end every year, that was also when ATI offered stiff competition. And technically, the G71 core was a direct refresh of the G70 core, and it took nearly a year to get from the G70 to the G71. From the G70 to the G80, though, it took a year and a half.

Anyways, HKEPC and Digitimes both reported the G92 as being a card to replace the 8800GTS, with higher clocks and a die-shrink but a 256-bit bus. This *does* make sense, however; keep in mind that neither Nvidia nor ATI does a high-end part on a new process until mid-range and low-end parts have already been tested and proven to work on it. The G80 was on the 90nm process while the 8600/8500/8400 were on the 80nm process. However, since ATI jumped to 65nm, Nvidia was basically forced to follow suit to 65nm, caught between the fear that ATI's technological leap would hurt them and the risk of releasing on an unproven 65nm process and facing a ton of problems.

This source also says G92 is 256-bit: http://we.pcinlife.com/thread-815475-1-2.html

Anyways, the big thing is that the GTX and Ultra are still priced at $500+ and I'm sure that Nvidia would love to keep milking that cash cow while ATI offers no resistance. Hence they might not feel pressured to release a new high end by year's end.

Another big thing is that with Crysis' release in November and all the hype around it, there's no doubt that a lot of mainstream gamers will want a DX10 card for it. Lots of people are considering the 8600GTS, but its performance sucks in DX10. ATI is supposedly releasing the 2900/2950PRO to slot between the 2600 and 2900XT, and Nvidia knows that if it doesn't release a real competitor in that segment, it might be shut out of the biggest market: the mainstream gamer.

Anyways, that DOES NOT mean that no high end will come out. But from what has already been said, and from the whispers within companies in the channel business for video cards, the word is that the codenames G92 and G98 refer to upper-mid/lower-high-end and low-end cards (such as an 8700 series). But, as I said, that doesn't mean no high end will come out; it's just that the codename people are hyping up might not even be it.
 
NV has committed to a one-year full refresh cycle, starting with the G80 line of GPUs. There is no doubt a new high-end card will be released. What it is, though, is still pretty much anyone's guess.
 
I hope someone can answer my question. I think I know, but I'm not 100%... since the new g90xx cards will use PCI Express 2.0, will they be backwards compatible with today's PCI Express 1.0 boards?

I'm sure those cards won't come close to saturating the limits of PCIe 1.0, let alone 2.0, so I think I read that it's really just a marketing gimmick--still, if they DIDN'T WORK in my board so I couldn't upgrade, that would suck!

I also remember reading that one PCI Express 2.0 benefit would be delivering more power through the slot so the card wouldn't need external power. Hopefully they'll still be built with an external power adaptor for those of us with older boards, right?

Any info on this is appreciated.
 
A high mid-range / low enthusiast card makes perfect sense to me. The mid-range and enthusiast segments make up over 80% of revenue, after all. If they can make a card that is priced like a GTS 320 and performs like a 640 or slightly better but is cheaper to produce because of the 256-bit bus, then they'll really rake in the cash. Replacing the GTX and Ultra is a pretty tall order to fill anyway, and it may take 4-6 more months of research and advancement before they can satisfactorily top those monsters even if they wanted to.
 
I hope someone can answer my question. I think I know, but I'm not 100%... since the new g90xx cards will use PCI Express 2.0, will they be backwards compatible with today's PCI Express 1.0 boards?

I'm sure those cards won't come close to saturating the limits of PCIe 1.0, let alone 2.0, so I think I read that it's really just a marketing gimmick--still, if they DIDN'T WORK in my board so I couldn't upgrade, that would suck!

I also remember reading that one PCI Express 2.0 benefit would be delivering more power through the slot so the card wouldn't need external power. Hopefully they'll still be built with an external power adaptor for those of us with older boards, right?

Any info on this is appreciated.

IIRC, I think they will work fine. Think in terms of the AGP 4x and 8x options, etc. And if they keep cranking up power draw, it won't matter whether it's delivered through the slot or externally; a person will still need a larger PSU. I just bought a new PSU, and if the next generation skips over mine, I'll just wait until GeForce X/10 or whatever to upgrade. This whole DDR3/PCIe 2/45nm business is too expensive for my blood to rebuild again so soon lol (though I was late to the DDR2 party). My next upgrade will be octo-core ;).
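
For a rough sense of the bandwidth side of that question, here's a minimal sketch (the 2.5 GT/s and 5.0 GT/s signaling rates and 8b/10b encoding come from the published PCIe specs; the function name is just for illustration):

```python
# Per-direction PCIe bandwidth: signaling rate (GT/s) x 8b/10b encoding
# efficiency x lane count, divided by 8 bits/byte. 2.5 GT/s (1.x) and
# 5.0 GT/s (2.0) are the spec signaling rates.
def pcie_gbytes_per_s(gt_per_s, lanes=16):
    return gt_per_s * (8 / 10) * lanes / 8

print(f"PCIe 1.x x16: {pcie_gbytes_per_s(2.5):.0f} GB/s per direction")  # 4 GB/s
print(f"PCIe 2.0 x16: {pcie_gbytes_per_s(5.0):.0f} GB/s per direction")  # 8 GB/s
```

Since today's cards don't come close to saturating the 4 GB/s of a 1.x x16 slot, the doubling to 8 GB/s is mostly future-proofing, which backs up the "marketing gimmick for now" read.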
 
I know this doesn't belong here, but seeing as it's a driving force for many people building and upgrading their PCs I'll post it.

http://www.incrysis.com/index.php?option=com_content&task=view&id=456


If this pans out to be true, taking Cevat's word, then I should be all set for my GTS @ 1280x1024 res lol :cool:. Also, in case you missed it, it appears that quads will only offer a 25% -physics- performance increase over a dual-core of the same speed. It's Google-translated, but it explains how CryEngine2 spreads the core mechanics of the game across processors. Quads could see an overall fps increase over dual-core, but it's unknown. Basically the engine is designed to keep the renderer separate from the physics, so it's the physics that takes the hit when the going gets tough instead of the renderer/gameplay (see the sketch after the link).

http://translate.google.com/translate?u=http%3A%2F%2Fwww.pcgameshardware.de%2F%3Farticle_id%3D610309&langpair=de%7Cen&hl=en&safe=off&ie=UTF-8&oe=UTF-8&prev=%2Flanguage_tools
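
To make that decoupling idea concrete, here's a minimal Python sketch (emphatically not CryEngine2's actual code; the class and names are made up for illustration): physics steps on its own thread and publishes each completed state, while the render loop always draws the latest published state, so when physics bogs down it's the simulation rate that drops, not the frame rate.

```python
import threading
import time

# Minimal sketch of a render loop decoupled from physics: the physics
# thread publishes completed states, and the renderer always draws the
# latest published state. Heavy physics therefore slows the simulation
# rate, not the frame rate.
class DecoupledLoop:
    def __init__(self):
        self.lock = threading.Lock()
        self.state = {"tick": 0}      # latest completed physics state
        self.running = True

    def physics_thread(self):
        while self.running:
            with self.lock:
                tick = self.state["tick"]
            time.sleep(0.05)          # stand-in for heavy physics work
            with self.lock:
                self.state = {"tick": tick + 1}   # publish completed step

    def render_loop(self, frames=30):
        for _ in range(frames):
            with self.lock:
                snapshot = dict(self.state)       # grab latest state
            print(f"frame drawn with physics tick {snapshot['tick']}")
            time.sleep(0.016)         # ~60 fps render pace

loop = DecoupledLoop()
threading.Thread(target=loop.physics_thread, daemon=True).start()
loop.render_loop()
loop.running = False
```

Under that kind of split, extra cores mostly speed up the physics side, which would fit with a quad raising the physics rate by 25% without necessarily raising fps by the same amount.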
 
Doubt we'd have a lot of use for an 8800GTX replacement right now; nothing would make a great deal of use of the additional power. Maybe Crysis when it's out, but that's just one game...
 
Yay for a card that will hit the much-desired "sweet spot"...hopefully at a sweet price, too!
 
Yeah, I would love nothing more than to get a card in line with an 8800GTS but down in the $200 range. That, or drop some nasty new performers on us and lower the prices of the existing ones a bit :D
 