G92 by Xmas???

This guy is amazing.

He knew information back in May that no one knows even now.

Tales to amaze and astound!

He needs a new turban.
 

Um... maybe you missed the news posts reporting that there was an analyst webcast where NVIDIA made these claims themselves? Everyone's known it since then, e.g.:

http://www.theinquirer.net/default.aspx?article=39829

http://www.fudzilla.com/index.php?option=com_content&task=view&id=1153&Itemid=34

http://www.xbitlabs.com/discussion/3953.html
 
The Inquirer said that G80 was a fixed-pipeline design around a month before it came out; they must have had some real good intel on that one.

Fuad's and the Inq's stories are almost mirror images.

And the third post is unverified. Although NVIDIA does call it the G92 and does say 1 teraflop, no dates were given, so it can be assumed that whatever the specs are, they're subject to change (see the rough math below on what that 1-teraflop figure actually implies).

Talk = talk, and there is no confirmation from NVIDIA for any of what was said in the third link, not even the 65 nm part. NVIDIA has made it a priority to keep info from the public for as long as possible; seeing as most people didn't see the 8800 GTX until a week before it launched, it can be assumed that not much will actually be known for a while.

I'm not an ATI fanboy, but I hate when I see people validate speculation for no reason other than "it must be true".
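For what it's worth, the "1 teraflop" number in these rumors is almost certainly a theoretical peak, the kind you get by multiplying shader count by shader clock by FLOPs per shader per clock. Here's a minimal Python sketch of that arithmetic; the G92 shader counts and clocks below are illustrative guesses, not confirmed specs:

# Theoretical peak GFLOPS = shaders * clock (GHz) * FLOPs per shader per clock.
# G80's stream processors are usually credited with a MAD (2 FLOPs) plus a
# MUL (1 FLOP) per clock, hence the default of 3.
def peak_gflops(shaders, clock_ghz, flops_per_clock=3):
    return shaders * clock_ghz * flops_per_clock

print(peak_gflops(128, 1.35))  # 8800 GTX (G80): ~518 GFLOPS
print(peak_gflops(128, 2.6))   # hypothetical G92: ~998 GFLOPS, near the rumored 1 TFLOP
print(peak_gflops(192, 1.74))  # another hypothetical combination landing around 1 TFLOP

And a theoretical peak says nothing about game performance anyway, which is the point: even the one hard number in that link doesn't pin down the design.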
 
Well, it wouldn't surprise me if NVIDIA launched their next-gen cards around the holidays. We know they're trying to do to ATI what Intel is trying to do to AMD right now: smack your opponent down while he's down! Considering the pace of technology these days, I wouldn't be surprised by a G90, or whatever they plan on calling the next-gen video card, in time for Xmas.
 
It must be true since Fudzilla and the Inq said it was :eek: ...


...or not. Since there is no "8950" anyway, if you are holding out for a new NVIDIA card, you will most likely be holding out until around Christmas. I'd imagine we can expect something new in about three months, since the G80 is getting a little long in the tooth at this point.

Maybe AMD/ATI will actually have something competitive, too, around that time.
 

Everything said about the G90/G92 specs is nothing but rumors, and the earlier poster is most likely right that we won't know them for sure until a week before launch.

However, NVIDIA's release schedule has been known for a while now, in fact ever since G80 launched. That schedule is to launch a new high-end card at the end of each year. This was said during G80's launch and again by an NVIDIA PR rep a couple of months ago when talking about NVIDIA's next big thing, G90/G92. Yes, it can change, but this is no rumor; it was confirmed twice, by NVIDIA itself.
 
Bring on the G90/G92s, I'm ready. Too bad AMD/ATI always release their stuff late. I hope they really strike hard with Barcelona and Phenom; otherwise I will go for the Q6600 and an NVIDIA 9800 GTS.
 
As long as the new cards come out within 3 months of my purchasing my new system, I'll be happy. Step-Up rocks!
 
I think NVIDIA will name it the 9000 series and won't confuse people with ATI's old 9800 series name. It also means it won't be the true next-generation 9800 GTX we thought, but just a couple of steps above an 8800: it won't blow your socks off like the 9800 GTX will in 2008, but it will be a nice jump above the 8800 GTX. As for putting it on hold: of course, they have no competition right now from anyone and they own the top spot, so why release another high-end video card just to beat themselves?

But I bet something new will be out in November for sure, just in time for NVIDIA's silent partner Epic to release UT3, which is said to need the latest and best hardware available to run even at medium settings; per Mark Rein, even 8800 Ultra SLI will not cut it.

And don't forget Crysis just got pushed back to an official release date of November 16, close to UT3's, which is the perfect time for NVIDIA to bring out the new card to handle these two brand-new next-generation games. So I'd bet big money there will be a new video card in November.
 
I thought NVIDIA wasn't going to release any new GPUs since they already have ATI by the balls.

I truly don't think NVIDIA will release anything anytime soon.
 

No? What about when the new DX10 games come out with the possibility of running like shit on our current DX10 cards?
 

Well, ultimately game companies will either release games for the cards that are in people's systems, or they'll just go out of business. This idea that companies have the ability to say "our game is going to require X much power and those fucking video card companies better provide it" is beyond silly. Games have to take into account what cards on the market are capable of; if they don't, they fail. Now, they can target different parts of the scale: some games target the very high end, many target more of the midrange. But they have to decide what they can and can't include based on the cards that are available and becoming available. They don't get to lay down the law with the graphics card companies (remember, both companies are already trying as hard as they can to release faster and faster chips).

Also, for DX10: please, people, just wait. Just chill. Give it time. Right now, anything you think you understand about DX10 performance is wrong. It has been out for a very short time, and we've got NO idea how optimized it might be or what problems it might have. The drivers might need tuning, the apps themselves might need tuning (a company releasing something in DX10 doesn't mean they did it right), and in fact DX10 itself may need tuning; it seems a patch that does just that is on the horizon. The performance you see now has nothing to do with what the performance will be after all the groups involved straighten everything out.
 

I still doubt that we'd be able to run DX10 games at max settings at, let's say, 1440x900 to 1680x1050 with an 8800 GTX.
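For perspective on those resolutions, here's a quick back-of-the-envelope sketch in Python; the simplifying assumption (mine, not anyone's benchmark) is that shading load grows roughly with pixel count:

# Raw pixel counts for common resolutions; all else equal, more pixels
# means proportionally more shading work (a rough simplification).
for w, h in [(1280, 1024), (1440, 900), (1680, 1050), (1920, 1200)]:
    print(f"{w}x{h}: {w * h / 1e6:.2f} Mpixels")

# 1280x1024: 1.31 Mpixels
# 1440x900:  1.30 Mpixels
# 1680x1050: 1.76 Mpixels (~36% more work than 1440x900)
# 1920x1200: 2.30 Mpixels

So going from 1440x900 to 1680x1050 alone adds about a third more pixels for the card to push.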
 
I think the developers aim to have their games run at medium settings on the midrange hardware that the bulk of users seem to have, then usually at high settings on the best hardware, with Ultra settings reserved for those with SLI and for future generations.

It bugs me that DX10 is getting a bad rap for supposedly not being usable with current-gen hardware. The simple fact is that we're reaching the end of the graphics card cycle: the 8xxx series is showing its age, and with cards 2x faster only months away, it is fast sliding from top-end to midrange. That means developers will be aiming midrange settings at this hardware and reserving high-end settings for the new hardware that's coming out.
 
Of course they are going to release another card. They are in the business of selling cards, and a good chunk of next-gen video card buyers are going to be people who already have an 8800 series card.

I would guess NVIDIA has been working with the UT3 and Crysis folks to get a card out around the time those games are released. Games drive video card sales more than benchmarks do.

It's kind of a perfect storm of buying madness: Christmas, UT3, and a video card release.
 
Failing to release another card because the current one is better than ATI's would be brain-damaged. When your competitor's down, you kick him; you don't rest on your laurels and wait for him to turn the tables. Besides, it takes a long time to develop a card: they would have started R&D on the G92 before they ever knew what ATI's new cards were going to look like. Also, don't forget that the HD 2900 can overclock to insane speeds given proper cooling. With a die shrink and further development, I wouldn't immediately discount it against NVIDIA's architecture.
 