G92 = GF 8700 GTS

Why does no one believe it? I mean, if I were Nvidia and had a clear lead, why would I update the high-end part first? Nvidia is trying to win back the OEM market with the G92 first; otherwise ATI will have made the most money in that sector. They have no reason to release a 9800GTX, or whatever it might be called, because they have the performance lead and they know ATI won't catch up until Q1 2008. By then they will probably have refined their 65nm process for their high-end part, and I think the real high-end war will be in Q1 2008. I'm sure ATI will have learned a lesson or two by then.
 
Well, here's another thought. With Crysis coming out in November, we really have no idea how that game will run on the 8800 series. Sure, they may be DX10 cards, but how they perform in a DX10-heavy game like Crysis is another story. You're looking at year-old technology. I'm sure Nvidia would want the baddest of the bad out for when that game launches. I could be wrong, though. But just like ATI launched the 9800XT with the HL2 coupon, Nvidia could pull the same thing here. They did it with COD2 and Q4.


Plus, to me it wouldn't sit right for them to give the 8700 the code name G92, when it would go against how they have labeled their cards for the past two years.
 
Well, here's another thought. With Crysis coming out in November, we really have no idea how that game will run on the 8800 series. Sure, they may be DX10 cards, but how they perform in a complete DX10 scenario is another story. You're looking at a year or more of old technology. I'm sure Nvidia would want the baddest of the bad out for when that game launches. I could be wrong, though. But just like ATI launched the 9800XT with the HL2 coupon, Nvidia could pull the same thing here. They somewhat did it with COD2 and Q4.


Plus, to me it wouldn't sit right for them to give the 8700 the code name G92, when it would go against how they have labeled their cards for the past two years.

Well, I can understand that, but Crysis won't be outdated for a while, and it's been demonstrated over and over. I've even seen a video on YouTube where the developer shows the system, and it was running a single 8800GTX with the game on ultra settings; Crysis is a well-optimized game. I wouldn't be surprised if Nvidia brings out a new high-end card, but I highly doubt it, looking at all the developer comments and videos. If Crysis runs fine on the 8 series, Nvidia has no need to release a new high-end chip, because they are losing the OEM market to ATI with the mainstream cards. They already got Apple to use the 2600 and 2400 series, and other OEMs are jumping on it because it uses less power and runs cooler. I think that's the market Nvidia wants to capture. Hey, who knows, they might have a dual-chip card just like the upcoming RV670, which is highly rumored to be a dual-chip card; I read somewhere that the design kit is already out for it.
 
I've never seen prices this high for this long. I've been waiting for the GTX to come down in price, or at least a GTS 640. Looks like it won't happen this year. I have all my parts ready, waiting for a video card. :(
 
Well, I can understand that, but Crysis won't be outdated for a while, and it's been demonstrated over and over. I've even seen a video on YouTube where the developer shows the system, and it was running a single 8800GTX with the game on ultra settings; Crysis is a well-optimized game. I wouldn't be surprised if Nvidia brings out a new high-end card, but I highly doubt it, looking at all the developer comments and videos. If Crysis runs fine on the 8 series, Nvidia has no need to release a new high-end chip, because they are losing the OEM market to ATI with the mainstream cards. They already got Apple to use the 2600 and 2400 series, and other OEMs are jumping on it because it uses less power and runs cooler. I think that's the market Nvidia wants to capture. Hey, who knows, they might have a dual-chip card just like the upcoming RV670, which is highly rumored to be a dual-chip card; I read somewhere that the design kit is already out for it.



I could be wrong on this as well, but I did see the CEO of Crytek talking, demonstrating the game, and saying it was using 8800 GPUs. I even made a post about this a week or so ago. BUT someone informed me that that video was over a year and a half old. The game has matured since then. Kind of like when HL2 was coming out and people were like, oooo, the 9800 Pro will dominate the game. The 9800 series played it, but not to the extent they had claimed. I think the X800 was the one that picked it up a notch. But again, I could be wrong on a few things here and there.
 
I mean, this sucks! Why is it that when one of the most highly regarded games of a generation is coming out, there is no great new high-end video card along with it? It just doesn't make sense to me. It's going to be a year since the 8800 dropped, and Nvidia is always touting how they like to refresh the market every few months, so you're going to refresh it with low-end video cards? Cheese and crackers, cheese and crackers. I know most people are right and the game has matured, but I am one of those people willing to spend hundreds of dollars on a high-end video card just because, so I can stand up and say "look at me now bitches, hahahaha"!! But NO, now I'm going to have to take it in the butt like I live in some sort of public prison and endure the pain of watching the release of one of the best video games possibly to come out in the past decade and say "oh look, tight-ass video game, yippy, sure looks great on hardware that is a year old."
 
I'm putting my money on the new high-end cards... why? Because I just bought an 8600GTS and I don't want to have to go out and buy another 'new' midrange card.
 
We all know how powerful the 8800GTX is; it's way too fast for any current-gen game, and I'm sure you all agree, and Crytek is known for optimizing its games. Here is the problem, guys: Nvidia won't hurry up if they don't have to. If ATI releases their next refresh (probably not possible) before Christmas, then we might see Nvidia bring out a new high-end part. The real money is in the mainstream market, and that is what Nvidia is losing right now, so I'm not surprised that Nvidia is refreshing their midrange and mainstream first and the high end later. Nvidia wants to capture the OEM market, and Apple already jumped on with ATI in their new iMacs. OEMs go for the best bang for your buck, and ATI's mainstream parts run cooler, perform pretty close to Nvidia's parts, and consume less power.
 
Why does no one believe it? I mean, if I were Nvidia and had a clear lead, why would I update the high-end part first? Nvidia is trying to win back the OEM market with the G92 first; otherwise ATI will have made the most money in that sector. They have no reason to release a 9800GTX, or whatever it might be called, because they have the performance lead and they know ATI won't catch up until Q1 2008. By then they will probably have refined their 65nm process for their high-end part, and I think the real high-end war will be in Q1 2008. I'm sure ATI will have learned a lesson or two by then.
Because some of the people most likely to buy high-end cards are consumers who currently have high-end cards. You can't stop putting products out just because your competitor hasn't; you're still losing a huge number of sales from consumers wanting to upgrade.

All the big releases I'm following are slated for a Q4 07 release, not Q1 08. It seems a MUCH smarter decision to have their new high-end cards out at the same time as the major releases of the upcoming six months rather than wait. If they miss the new game releases, people aren't going to have any real reason to get the cards in Q1 08 unless there's something out then, but to be honest I don't think there are any real big names out then. By then people will have finished the Q4 titles.

Q1 also misses Xmas, which I'm sure is a time many people will be getting games and possibly systems too. Sure, they want to meet average consumers there too, but I think you're foolish to leave the high-end market to the side just because ATI isn't around.

If Nvidia doesn't think ATI is still a threat, they'll more likely than not just get complacent. That same mistake cost Intel years ago by allowing AMD back into the game, and it has in turn hurt AMD, who haven't been able to get any hot new products out since they took the lead. If you want to beat someone, kick them while they're down, and don't delay tech advances because you "think" the opposition can't catch up.
 
Lay down the pipe.
The 8800GTX "doesn't cut it" anymore, and it's the fastest card on the market at the moment... but you're instead apparently planning on buying the closest comparable card, the 2900XT, which is slower in at least 80% of scenarios (unless you like AA; then it's at least 95%). And all because nVidia is trying to close the massive gap between the 8600GTS and the 8800GTS. Wow, when ePenis gets taken too far...

edit:
Oh, unless you mean you'd spend it on something entirely different, like a girlfriend or something. I just don't often see people here thinking of things besides video cards.


Ummm, I pretty much put down the 2900XT in the following post. And yeah, I meant elsewhere, i.e. other hobbies, duh. I already have an 8800GTS, and it isn't about "ePenis"; it's about gaming. Just because you don't have the money to upgrade every refresh, you don't need to blow a gasket. I honestly don't give a crap about midrange cards, and if people want performance, the 8800 GTS 320 isn't THAT expensive.
 
It's all AMD's fault. Right now there is no competition outside the mainstream market. And yes, there is a need for a next-gen card; World in Conflict and MOH: Airborne don't run that great on 8800 series hardware with the resolution cranked to 1920x1200, which is becoming the new standard as affordable 24" displays become very common.
 
Who ever rights their "articles" needs to learn how to use a spell-check :eek:.

"rights"
--noun 1. (used with a plural verb) civil rights.
–adjective 2. civil-rights: a rights worker.


"writes"
v. wrote (rōt), writ·ten (rĭt'n) also writ (rĭt), writ·ing, writes.

Whoever WRITES their "articles" needs to learn how to use a spell-check.
 
"rights"
--noun 1. (used with a plural verb) civil rights.
–adjective 2. civil-rights: a rights worker.


"writes"
v. wrote (rōt), writ·ten (rĭt'n) also writ (rĭt), writ·ing, writes.

Whoever WRITES their "articles" needs to learn how to use a spell-check.

lol pwned.


Regarding the OP, I think there are good reasons on either side. nVidia really needs to grab the low-end OEM market, where they are getting killed, so I would hope it's a lower-power, higher-performing mid-range chip (scalable down to the budget/integrated level). However, I don't think the need for that was quite so obvious to nVidia when the decisions leading to the upcoming release were made.

Anyway, my bet is that, whatever it is, it will give ATI a few new bruises.
 
Why do people keep saying that Nvidia is getting killed in the low-end market? ALL of their cards have much better price/performance than ATI's this generation. The only card that beats out the 8600 series is ATI's last-gen 1950XT/Pro. The 7900GS is a bit faster, but has lower IQ.
 
I strongly believe that the G92 series will be the flagship 9800GTS/GTX that we have been eagerly waiting for. If you go to incrysis.com, you will learn that Crytek is working exclusively with Intel and Nvidia on releasing their upcoming game Crysis simultaneously with the new flagship GeForce 9 and flagship Penryn processors.

Well, I hope nVidia doesn't use the 9800GTS name, or anything like 9700 or 9600, because those cards are already well known.

I'm sick of the four-digit model numbers. Rename the GeForce series. It's time for DX10, and ten generations later... give it a new name and shorten the model numbers!
 
Why in the world would Nvidia bring out cards that compete with cards already out?
So they're bringing out 8700s that sit between the 8600 and 8800? Is there really a huge gap in price and performance between the top-end 8600/7900s and the low-end 8800? If there is, I understand why they would release a sort of "test card" with the new die shrink, but otherwise it will just waste what they have already released.

They already have midrange cards, and if they're being beaten by ATI in the midrange, the best way to fix the problem is to drop the current 8800s in price and bring out new high-end video cards. That way you beat ATI in both midrange and high-end cards.

Either there was a misunderstanding, or Nvidia purposely wanted people to stop waiting for the new generation, buy what's already out, and clear their current stock. Then they release the new high end later this year, and if people complain they'll just blame the reporter.
 
They can't cut the price of the 8800 series... the 8800 series is expensive to make.

Who says they need to keep producing the 8 series cards if the next gen is cheaper to produce and faster?

They aren't going to hold back on releasing a card; the card was probably already designed before ATI released the 2xxx series.
 
Who says they need to keep producing the 8 series cards if the next gen is cheaper to produce and faster?

They aren't going to hold back on releasing a card; the card was probably already designed before ATI released the 2xxx series.

Yeah, and if people keep buying 8800s cheap and they eventually run out, they'll release midrange next-gen cards, which are cheaper to make. Then none of the 8800s will sit collecting dust.
 
It seems the boys over at Fudzilla are convinced that G92 is the new high-end card, which will be G80 on a 65-nanometer process with some performance tweaks, like going from the 7800 series to the 7900 series. G98 will be a 65nm die-shrunk G84. This throws a monkey wrench into my plans. :(

Looks like I will be keeping my 320MB 8800GTS a bit longer than I anticipated. *sigh*
 
Bringing out 8900s makes way more sense than 8700s. There is room for a faster card: the memory bus width is lower, but with faster memory and probably a faster clock. This may keep the 8800s a decent seller, with the 8900s being more profitable as a replacement if they cost less to manufacture. It makes those who recently bought 8800s less wary of their video card going obsolete, but leaves those looking for next-gen high-end cards a bit disappointed.
 
I hope this isn't true, because it means we're probably going to be stuck waiting at least another three months for a flagship replacement card. With all these new demanding games coming out, the 8800GTS/GTX just doesn't cut it anymore. I was already saving my pennies for a new Nvidia series 9 card; guess I can spend my money elsewhere.

You're on a 19" monitor with an 8800GTS and you're saying it can't handle today's games... :confused:

Find a game for me that you can't run at your native res with 4x AA, 16x AF, and 60fps...

Is the 8700 going to be faster than the 8800s?

No, that would be a marketing screwup for the ages.

Nvidia told us they would stick to the six-month release schedule: six months after release day we would see a refresh to the series, the 8900s and 8700s. Word is they've dropped that plan due to ATI's inability to compete. They have stated, however, that they will start a new series 12 months after release, in accordance with the six-month refresh schedule.

If it's true, and this turns out to simply be a revamp of the 8 series, ATI has caught one hell of a break, and will probably be able to bring out their R700 long before Nvidia's 9 series starts.
 
Oh dear, and the 2900XT was just starting to show some benchmarks comparable to the 8800GTS :)
 
You're on a 19" monitor with an 8800GTS and you're saying it can't handle today's games... :confused:

Find a game for me that you can't run at your native res with 4x AA, 16x AF, and 60fps...

World in Conflict,
and I'm going to bet Crysis and Stranglehold.
 
You're on a 19" monitor with an 8800GTS and you're saying it can't handle today's games... :confused:

Find a game for me that you can't run at your native res with 4x AA, 16x AF, and 60fps...

I actually game on the 21", and Bioshock doesn't run a steady 60fps at 1680x1050 with 16x AF. Don't people read before posting? Oh, and I'm guessing Crysis won't either, and Oblivion doesn't exactly run smooth either. Don't get all riled up; I'm not saying it's a bad card, just a bit long in the tooth. Who's to say I won't have a 27"+ by November? Why shit on people just because they want a better card?
 
I think Nvidia is giving AMD/ATI a chance to catch up. It would be really harsh to beat them again with the release of a 9800/G90 series flagship card this fall. AMD is focusing on their CPUs, where again they are behind.

All I really want is a 24" monitor and a graphics card that can run World in Conflict, UT3, and Bioshock at 1920x1200 with 4x AA / 16x AF and no slowdowns, but I guess I'll have to wait until Q1 08 or even Q2 08.

Right now my system can run those games at a decent framerate, but we all know a steady 60+ fps is far more enjoyable than constant drops in framerate during intense battles.
 
Yeah, and if people keep buying 8800s cheap and they eventually run out, they'll release midrange next-gen cards, which are cheaper to make. Then none of the 8800s will sit collecting dust.

Why do you think Nvidia cares about cards collecting dust? It's not like they even make the cards.

High end - 8800 GTX/Ultra
Mid end - 8800 GTS 640/320
Low end - 8600 cards

There is a market Nvidia will go after: all those people sitting there with year-old GTX cards. This isn't always about trading blows with AMD; it's also about extending one's lead. To think Nvidia has spent the last year working only on low- and mid-range cards is just stupid.

So, since all you people really seem to believe every word of this bullshat article, do any of you have any non-forum sources to confirm it?

I bet my next video card upgrade budget that we will see a new high end before the year is out.
 
World in Conflict,
and I'm going to bet Crysis and Stranglehold.

You shitting me? I run the demo maxed out with 16x MSAA at 1680x1050 and get at least 60fps on Vista. I am quite happy with my 8800; no reason to upgrade :)
 
I actually game on the 21", and Bioshock doesn't run a steady 60fps at 1680x1050 with 16x AF. Don't people read before posting? Oh, and I'm guessing Crysis won't either, and Oblivion doesn't exactly run smooth either. Don't get all riled up; I'm not saying it's a bad card, just a bit long in the tooth. Who's to say I won't have a 27"+ by November? Why shit on people just because they want a better card?

Well, quite frankly, while I do agree there is room/reason for them to release new cards, there are already better cards out there than what you have: the GTX/Ultra.



Nvidia won't give ATI time to come back; WHY would they want to? "Hmm, I have a superior product, so lemme sit on my ass now until I get beat again." Yeah, sounds like an excellent marketing team :rolleyes:

Like people (myself included) have already said, Nvidia was working on these new cards before ATI released the 2xxx series. And Nvidia did do a "refresh" in the form of the 320MB and Ultra cards.

And why would G9X cards fall in line with the 8 series numbering scheme?
 
They have done this before with notebook video cards.

Remember the ATi Mobility 9800? It was R420, which is an X850.
 
Oh dear, and the 2900XT was just starting to show some benchmarks comparable to the 8800GTS :)

That started happening quite a while ago, actually; now nearly all of the benchmarks show the 2900XT being quicker, which is just as well, as it is a more expensive graphics card than the GTS.
 
That started happening quite a while ago, actually; now nearly all of the benchmarks show the 2900XT being quicker, which is just as well, as it is a more expensive graphics card than the GTS.

What reviews show that, exactly? So far the new driver releases haven't done much of anything performance-wise for the 2900XT.
 
There is a market Nvidia will go after: all those people sitting there with year-old GTX cards. This isn't always about trading blows with AMD; it's also about extending one's lead. To think Nvidia has spent the last year working only on low- and mid-range cards is just stupid.

First of all, this market of ours is an anthill in the big picture. "All those people" is really an extremely small group, even after all this time. The mainstream, the vast majority of people, still have last-gen 7x00 and X1x00 cards. No one's saying that Nvidia has been sitting on its tush looking at daisies; that's highly unlikely. They may want to fill out a few more gaps to court people into the new DX10-enabled era of Nvidia cards, meaning G80 and beyond. That's where the real money and need are, not exclusively in this high-end arena. Vista adoption is still low; everyone's protesting to stay with XP for X and Y reasons. I think DX10 performance needs to be optimized or "fixed" at the software and implementation levels before you start throwing more hardware at the problem.

For all we know, a high-end card could be released at the very end of this year, although that would make for a very crowded and confusing release period: G9x, G92, and G98 (I will address the G92 and G98 in the next paragraph). They could just as easily release the high-end card in early 2008 or shortly thereafter. The enthusiast market will buy it whenever it releases.


So, since all you people really seem to believe every word of this bullshat article, do any of you have any non-forum sources to confirm it?

The real point is there's no proof saying otherwise, so why do you call this article or the current "state of affairs" BS? Neither side, opinion, or viewpoint has any real concrete proof; it's just that one particular point of view is being pushed much harder now than it was originally, by multiple sites, articles, or what have you (and was ignored by most back then).

Another interesting aspect that few consider is that Nvidia will release the new series when it's ready. They have the time to do that; they just don't have the time to "take all day." That's the problem with trying to meet a silly one-year anniversary: it isn't crucial, because nothing matters unless the product is ready. AMD had all kinds of time with the R600 and still couldn't pull out a showstopper, while Nvidia stepped out first and their hard work still pays off.

Once you tape out a core there's no going back to "fix" a certain thing deep down (AA? ;)). Nvidia could rush themselves to release "on time" and ATI could come out later and trump them. It could still happen even if they're careful; it all depends on who has the best architecture in the end. Quality control > rushed. Oh, and the latest interview with Cevat of Crytek has him recommending an 8800GTS or 2900XT to play Crysis at max settings at 1280x+ resolutions. Anything higher (1600x+) would require a GTX/Ultra; it's in Maximum PC magazine's August issue. Will it end up being true? What's his definition of playable, 30fps, 50fps? Only two weeks to find out for sure.
 
Many people will be building new computers between Thanksgiving and Christmas with the release of Penryn from Intel, and with X38 around the corner bringing PCIe 2.0 and DDR3 adoption. With something like Crysis launching in those months, people will be looking for a good card (price/performance) to buy. G92 should meet that demand at the high end (performance at or above the 8800s), but hopefully priced cheap enough that it isn't limited to the enthusiasts who mostly make up HardOCP. If G92 costs as much as 8800 Ultras did a few months ago at release ($800+), I will be sorely disappointed. I'm hoping G92 will be high performance (recommended to play Crysis), but due to its narrower bus and cheaper manufacturing, will cost less and be targeted at the larger population that is usually not willing to spend more than $500 on a video card.
 