ChrisMorley said:Great questions to ask - this card is closely guarded by NVIDIA in terms of performance metrics and specs. I highly suggest you guys wait until NDAs are lifted before coming to conclusions...
keysplayr said:I totally agree, but half the fun of all this is to hear all of the forecasts of other members here. This way, later on, we can see who was right and who are the idiots. (Meant in a nice way.)
Lord_Exodia said:When they [H]ard launch on 11/07/2006. I bet Brent Justice is locked in a basement room slaving away for hours with 3 other assistants just cramming away to finish the review in time. No bathing or breaks allowed. Once a day Kyle Bennett will throw 1 bone down there with just a little meat on it and let them fight over it.
phide said:700 million transistors are not dedicated solely to DX10. To assume such is nonsense. The 128 shader processors will function regardless of what Shader Model a particular application is using, and the shader processors are the significant transistor meat of G80.
Lord_Exodia said:When they [H]ard launch on 11/07/2006. I bet Brent Justice is locked in a basement room slaving away for hours with 3 other assistants just cramming away to finish the review in time...
I'd be even more surprised if the review sites get the hardware a week prior to launch.
5150Joker said:So now that ATi is dead and R600 will likely be their last major performance architecture, do you guys think nVidia will slow down its pace of introducing faster cards every few months?
5150Joker said:So now that ATi is dead and R600 will likely be their last major performance architecture
Tigerblade said:Lmao, in your dreams f_a_n_b_o_y
^eMpTy^ said:The amount of transistor budget used up by DX10 features is an unknown quantity. What on earth makes you think that a DX9 class pipeline and a DX10 class pipeline would use the same number of transistors?
I never assumed such. Where exactly did I state this?
5150Joker said:ATi still exists as an independent company? Nope. The question is will AMD continue to waste resources trying to compete with nVidia, or will they instead focus on the more lucrative complete platform market, like Intel's Centrino? I think the latter is a lot more likely than the former, and that's why I speculated whether or not R600 is ATi's last major high performance architecture. I can see AMD producing midrange cards and low end ones in the future but not the uber high end--I think they'll leave that to nVidia.
Anyway, I think the 30% figure is probably b.s. unless it takes into account HDR+AA, and even then I'd say it's still wrong and being underestimated.
5150Joker said:ATi still exists as an independent company? Nope. Question is will AMD continue to waste resources trying to compete with nVidia or will they instead focus on the more lucrative complete platform market like Intel's Centrino?...
Of course they will. After all, who needs all that money?
5150Joker said:So now that ATi is dead and R600 will likely be their last major performance architecture, do you guys think nVidia will slow down its pace of introducing faster cards every few months?
If AMD does indeed drop ATI from the high end market, NVIDIA will slow its release schedule, and more than likely price drops will also slow down.
5150Joker said:ATi still exists as an independent company? Nope.
PRIME1 said:If AMD does indeed drop ATI from the high end market, NVIDIA will slow its release schedule, and more than likely price drops will also slow down.
I doubt AMD would drop ATI from the high end. AMD is a processor company; they know all about the value of epeen waving.
Basically look at Creative and the sound card market.
We need the competition to stay just as it is: both companies fighting tooth and nail with better products and price wars.
That's_Corporate said:Same with Bugatti, but they still make the fastest road car in the world.
Martyr said:I doubt AMD would drop ATI from the high end. AMD is a processor company; they know all about the value of epeen waving.
Actually, that is exactly why AMD may drop ATI out of the high end. AMD may divert ATI's engineering focus towards making CPUs instead of GPUs. AMD did not buy ATI just to have it do no work for AMD.
Silus said:I doubt the 8800 GTX is "only" 30% faster than a X1950 XTX. I think the 30% advantage might be for the 8800 GTS.
My point in an earlier post was that "30% faster" is a meaningless statement, and might very well be true in certain comparisons even if it's highly misleading overall.
8800 GTX (in most games, not all) should be around 60-70% faster.
What I really want to see is DX10 performance, since DX9 performance will be on NVIDIA's "lap" for at least three months or so.
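As an aside on the "30% faster" dispute above: a single percentage depends entirely on which benchmark you pick, which is why reviewers usually summarize relative GPU performance across a whole suite, typically with a geometric mean of the per-game ratios. A minimal sketch with hypothetical frame rates (illustrative numbers only, not real 8800 GTX or X1950 XTX results):

```python
from math import prod

# Hypothetical average frame rates (fps) per game -- illustrative only,
# not measured numbers for any real card.
fps_new = {"Game A": 91, "Game B": 130, "Game C": 64}
fps_old = {"Game A": 70, "Game B": 100, "Game C": 40}

# Per-game speedup ratio new/old; "30% faster" means a ratio of 1.30.
ratios = {g: fps_new[g] / fps_old[g] for g in fps_new}

# The geometric mean of the ratios summarizes the suite in one number;
# an arithmetic mean of ratios would over-weight the largest outlier.
geo_mean = prod(ratios.values()) ** (1 / len(ratios))

for game, r in ratios.items():
    print(f"{game}: {100 * (r - 1):.0f}% faster")
print(f"Suite (geometric mean): {100 * (geo_mean - 1):.0f}% faster")
```

With these made-up numbers, quoting Game A alone gives "30% faster" while the suite average works out closer to 40%, which is exactly how a technically true per-game figure can misrepresent overall performance.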
bobrownik said:Yeah, maybe with a 3GHz Core 2 Duo. So they got an extra few K just from having a badass CPU and lots of RAM; pair it with any AMD single core CPU and that 3DMark score won't go past 8K.
phide said:It is not our responsibility to make your endeavors easier. If you want to pay me for G80 information, though, that can be arranged.
Vapor03 said:I just want DX10 lol..
HeXeD said:Hell.. I'm breaking the rules just by arguing..
You've already broken one of the biggest rules by calling contributors to this thread "dumb asses".