G80 8800GTX 30% faster than X1950XTX, 8800GTS even

I think we should remember the NV40 launch with regard to driver immaturity. This is a wholly new architecture, so driver tweaking has only just begun.
 
The forum-famous crystal ball predicts scores of 13,500 with Conroe setups and 11,000-plus with an Athlon 64 FX-62. The G80 should also score near those numbers in 3DMark06 even at 1600x1200, which is going to be the biggest shocker :eek: SLI on a 16x10 widescreen panel will not be necessary to max your games out until the more demanding DX10 games come out.
 
ChrisMorley said:
Great questions to ask - this card is closely guarded by NVIDIA in terms of performance metrics and specs. I highly suggest you guys wait until NDAs are lifted before coming to conclusions...

I totally agree, but half the fun of all this is hearing the forecasts of the other members here. This way, later on, we can see who was right and who the idiots are. (Meant in a nice way.)
;)
 
Strangely enough, it's too early to point fingers, guys. But if it's really only 30% faster than the X1950 XTX, then I am really happy with my Asus GeForce 7950 GX2 1GB. It gives me 60+ FPS in GRAW @ 1600x1200, and I play that game the most.
 
keysplayr said:
I totally agree, but half the fun of all this is hearing the forecasts of the other members here. This way, later on, we can see who was right and who the idiots are. (Meant in a nice way.)
;)

Think of it this way: those who really do know aren't allowed to say. ;)
 
So now that ATi is dead and R600 will likely be their last major performance architecture, do you guys think nVidia will slow down its pace of introducing faster cards every few months?
 
ChrisMorley said:
Great questions to ask - this card is closely guarded by NVIDIA in terms of performance metrics and specs. I highly suggest you guys wait until NDAs are lifted before coming to conclusions...

When is the NDA going to be lifted so we can see reviews posted?
 
When they [H]ard launch on 11/07/2006. I bet Brent Justice is locked in a basement room slaving away for hours with 3 other assistants, just cramming away to finish the review in time. No bathing or breaks allowed. Once a day Kyle Bennett will throw 1 bone down there with just a little meat on it and let them fight over it. :D :D
 
Lord_Exodia said:
When they [H]ard launch on 11/07/2006. I bet Brent Justice is locked in a basement room slaving away for hours with 3 other assistants, just cramming away to finish the review in time. No bathing or breaks allowed. Once a day Kyle Bennett will throw 1 bone down there with just a little meat on it and let them fight over it. :D :D

LOL
:p
 
phide said:
700 million transistors are not dedicated solely to DX10. To assume such is nonsense. The 128 shader processors will function regardless of what Shader Model a particular application is using, and the shader processors are the significant transistor meat of G80.

Things make more sense when you consider the context in which they are said. Someone stated that with 700 million transistors we should expect much more than a 30% performance boost, and I stated that all those transistors are there for DX10 support. The point is that tripling the number of transistors in no way means that performance should triple, or even double. The amount of transistor budget used up by DX10 features is an unknown quantity. What on earth makes you think that a DX9-class pipeline and a DX10-class pipeline would use the same number of transistors?
 
Lord_Exodia said:
When they [H]ard launch on 11/07/2006. I bet Brent Justice is locked in a basement room slaving away for hours with 3 other assistants, just cramming away to finish the review in time. No bathing or breaks allowed. Once a day Kyle Bennett will throw 1 bone down there with just a little meat on it and let them fight over it. :D :D
I'd be even more surprised if the review sites get the hardware a week prior to launch.
 
5150Joker said:
So now that ATi is dead and R600 will likely be their last major performance architecture, do you guys think nVidia will slow down its pace of introducing faster cards every few months?

What the crack pipe?
Why is ATI dead?
 
5150Joker said:
So now that ATi is dead and R600 will likely be their last major performance architecture, do you guys think nVidia will slow down its pace of introducing faster cards every few months?


Hmm, I don't think nV will slow down; otherwise they would be in trouble. The R600 isn't the last of ATi's cards. They probably still have at least 2 generations in development, so AMD will have a nice track (hopefully a smooth one, with fewer delays) to start on.
 
Tigerblade said:
Lmao, in your dreams f_a_n_b_o_y :cool:


ATi still exists as an independent company? Nope. The question is: will AMD continue to waste resources trying to compete with nVidia, or will they instead focus on the more lucrative complete-platform market, like Intel's Centrino? I think the latter is a lot more likely than the former, and that's why I speculated whether or not R600 is ATi's last major high-performance architecture. I can see AMD producing midrange and low-end cards in the future, but not the uber high end; I think they'll leave that to nVidia.


Anyway, I think the 30% figure is probably b.s. unless it takes HDR+AA into account, and even then I'd say it's an underestimate.
 
^eMpTy^ said:
The amount of transistor budget used up by DX10 features is an unknown quantity. What on earth makes you think that a DX9 class pipeline and a DX10 class pipeline would use the same number of transistors?
I never assumed such. Where exactly did I state this?
 
5150Joker said:
ATi still exists as an independent company? Nope. The question is: will AMD continue to waste resources trying to compete with nVidia, or will they instead focus on the more lucrative complete-platform market, like Intel's Centrino? I think the latter is a lot more likely than the former, and that's why I speculated whether or not R600 is ATi's last major high-performance architecture. I can see AMD producing midrange and low-end cards in the future, but not the uber high end; I think they'll leave that to nVidia.


Anyway, I think the 30% figure is probably b.s. unless it takes HDR+AA into account, and even then I'd say it's an underestimate.

lol, what a Joker
 
5150Joker said:
ATi still exists as an independent company? Nope. The question is: will AMD continue to waste resources trying to compete with nVidia, or will they instead focus on the more lucrative complete-platform market, like Intel's Centrino? I think the latter is a lot more likely than the former, and that's why I speculated whether or not R600 is ATi's last major high-performance architecture. I can see AMD producing midrange and low-end cards in the future, but not the uber high end; I think they'll leave that to nVidia.


Anyway, I think the 30% figure is probably b.s. unless it takes HDR+AA into account, and even then I'd say it's an underestimate.


Yeah, AMD bought one of the 2 major IHVs to disband it, give up a market segment, and hand NV a virtual monopoly. I hear AMD will next give up PCs altogether and become a brokerage firm...
 
5150Joker said:
ATi still exists as an independent company? Nope. The question is: will AMD continue to waste resources trying to compete with nVidia, or will they instead focus on the more lucrative complete-platform market, like Intel's Centrino? I think the latter is a lot more likely than the former, and that's why I speculated whether or not R600 is ATi's last major high-performance architecture. I can see AMD producing midrange and low-end cards in the future, but not the uber high end; I think they'll leave that to nVidia.


Anyway, I think the 30% figure is probably b.s. unless it takes HDR+AA into account, and even then I'd say it's an underestimate.
Of course they will. After all, who needs all that money?

The fact is that ATI has a lot of sunk costs in development, manufacture, and distribution of high-end cards. A smart company is a diversified company, not one that "focuses" on the most lucrative aspect of the business to the exclusion of all else. Besides, as far as I know, AMD has not replaced ATI's management, so there won't be any drastic changes of direction.
 
I doubt the 8800 GTX is "only" 30% faster than an X1950 XTX. I think the 30% advantage might be for the 8800 GTS.
The 8800 GTX should be around 60-70% faster (in most games, not all).
What I really want to see is DX10 performance, since DX9 performance will be in NVIDIA's "lap" for at least three months or so.
 
5150Joker said:
So now that ATi is dead and R600 will likely be their last major performance architecture, do you guys think nVidia will slow down its pace of introducing faster cards every few months?

ATI is not "dead". Far from it. The fact that AMD bought it won't make them stop releasing new cards. Though I do agree with you on one thing: they will likely not release any major "single" chip after the R600, because with AMD, they will invest in a CPU/GPU combo instead.

And even so, I doubt NVIDIA will slow its pace, because they will also need something to compete with this new combo. That's why I believe the rumors of NVIDIA working on a CPU are not far from the truth: in time, they will need to start development of a CPU/GPU.
Considering this, it would probably make sense for Intel to buy NVIDIA, but I doubt it. At least not right now, because Intel has a sweet spot with integrated graphics. Plus, I believe NVIDIA has the guts to pull something like this off on its own, so they don't need to be bought by Intel.
 
5150Joker said:
So now that ATi is dead and R600 will likely be their last major performance architecture, do you guys think nVidia will slow down its pace of introducing faster cards every few months?
If AMD does indeed drop ATI from the high-end market, NVIDIA will slow its release schedule, and more than likely price drops will also slow down. :(

Basically look at Creative and the sound card market.

We need the competition to stay just as it is. Both companies fighting tooth and nail by releasing better products and price wars.
 
PRIME1 said:
If AMD does indeed drop ATI from the high-end market, NVIDIA will slow its release schedule, and more than likely price drops will also slow down. :(

Basically look at Creative and the sound card market.

We need the competition to stay just as it is. Both companies fighting tooth and nail by releasing better products and price wars.
I doubt AMD would drop ATI from the high end. AMD is a processor company; they know all about epeen-waving value.
 
ATi is far from dead. The high-end market may not be as lucrative as the integrated graphics market, but that doesn't mean there isn't still a large amount of money to be made off of it. Obviously, the AMD acquisition of ATi means AMD is headed towards an integrated platform, but not to the detriment of the existing ATi marketshare. The staff of ATi remains in place after the buyout, so it's dubious to propose that they are dead as a company.

Besides, as PRIME1 said, having the competition between the two companies is a GOOD thing for consumers. Despite any !!!!!!-ism and bashing of the other company, ATi and nV play off of each other and help keep graphics technology quickly evolving and prices down somewhere reasonable. Without either company we'd be in a much worse situation than we are now. Even you nV !!!!!!s need to realize that ATi going under would be a bad thing for you, even if you never have and never will purchase one of their cards.
 
That's_Corporate said:
Same with Bugatti, but they still make the fastest road car in the world.


Yeah, but management has a lot to do with direction; that's the problem with a merger. There is a good chance things at ATi will change substantially. Something we just have to wait and see.

Anyways, yeah, your point is valid. ATi can still be very competitive even though they aren't going to be in charge of what they do.
 
Martyr said:
I doubt AMD would drop ATI from the high end. AMD is a processor company; they know all about epeen-waving value.
Actually, that is exactly why AMD may drop ATI out of the high end. AMD may divert ATI's engineering focus towards making CPUs instead of GPUs. AMD did not buy ATI just to have it do no work for AMD.
 
Silus said:
I doubt the 8800 GTX is "only" 30% faster than an X1950 XTX. I think the 30% advantage might be for the 8800 GTS.
The 8800 GTX should be around 60-70% faster (in most games, not all).
What I really want to see is DX10 performance, since DX9 performance will be in NVIDIA's "lap" for at least three months or so.
My point in an earlier post was that "30% faster" is a meaningless statement, and might very well be true in certain comparisons even if it's highly misleading overall.
 
Oh yeah!!!

Since the G80 broke the 10,000 mark in 3DMark06 this year, Futuremark will have to release 3DMark07!

Yes! I hated both 3DMark05 and 06. 06 was nothing new, rushed out to market with only HDR and a couple of new tests added. Now 07 had better be completely new, and I hope Futuremark was prepared for it.

Futuremark said that it's their policy to release a new version of 3DMark in the following year whenever the 10,000 mark is broken by a single video card.
 
ATI isn't gone.

I think ATI was bought to try and get a dedicated team of people working on mobile chipsets for AMD... and AMD can also make ATI/AMD combos faster than ever on their platform.
 
GAWD... will you people please stay on topic? I've been trying to find out more information on the G80, and all I have heard for the past 2 pages is dumb asses saying ATI is dead and people arguing with each other about transistors. If you don't have any information to offer, then start up a flame war somewhere else.
 
It is not our responsibility to make your endeavors easier. If you want to pay me for G80 information, though, that can be arranged.
 
I can't believe that one card broke the 10,000 mark. With my SLI I'm getting 9850.
 
Yeah, maybe with a 3GHz Core 2 Duo.

So they got an extra few K just from having a badass CPU and lots of RAM.

Pair it with like any AMD single-core CPU and that 3DMark won't go past 8k.
 
bobrownik said:
Yeah, maybe with a 3GHz Core 2 Duo.

So they got an extra few K just from having a badass CPU and lots of RAM.

Pair it with like any AMD single-core CPU and that 3DMark won't go past 8k.

And the performance will go down too, hence the lower rating.

"O yea! I bet if I mash it into an AGP slot it won't get ANY 3d marks!! ROFLPWNED nVidia!"
 
phide said:
It is not our responsibility to make your endeavors easier. If you want to pay me for G80 information, though, that can be arranged.


When did I ever say it was your responsibility to make my endeavors easier? I'm just asking for people to stay on topic in this thread... and that IS definitely your responsibility. For somebody with so many posts, I would assume you knew that. No thread crapping, no hijacking, etc., etc.

Hell.. I'm breaking the rules just by arguing.. :p
 
HeXeD said:
Hell.. I'm breaking the rules just by arguing..
You've already broken one of the biggest rules by calling contributors to this thread "dumb asses".

In the time you've taken to make your two posts, you could have acquired any and all information you've been seeking (and then some). Consider that for a moment, then re-evaluate your intentions.
 