GeForce 9800 GX2 Launch Delayed Till 18th March

Yeah, correct on the last few ATi releases. Back last year they told us all about the 8900 series, told us how it would do 1 teraflop, and all that BS. Then I think they circulated the rumor, like everyone else, that G92 was also 1 teraflop. To be honest, they haven't been producing the bulk of the rumors that they have in the past, because their writing staff keeps leaving the Inq, at least the ones that could write.

They didn't call it anything at all. They just said their next high end part would be capable of close to 1tflop. Nothing more. They never called it 8900GTX, G92, or even G100.
 
They didn't call it anything at all. They just said their next high end part would be capable of close to 1tflop. Nothing more. They never called it 8900GTX, G92, or even G100.

Maybe Simpson5774 went to some super secret meeting with nvidia where they made the 1tflop 8900 announcement. Who knows. :p:D
 
I know the real reason they delayed it a week.... They knew i was gonna use my step up :(:( Now I'm SOL :D:D:eek::p;)
 
They didn't call it anything at all. They just said their next high end part would be capable of close to 1tflop. Nothing more. They never called it 8900GTX, G92, or even G100.

Yes, but they did say 1TF in '07, per nVidia...

http://www.beyond3d.com/content/news/230
NVIDIA confirms Next-Gen close to 1TFlop in 4Q07
Wednesday 23rd May 2007, 12:12:43 PM, written by Arun

In recent analyst conferences that were publicly webcast on NVIDIA's website, Michael Hara (VP of Investor Relations) has claimed that their next-generation chip, also known as G92 in the rumour mill, will deliver close to one teraflop of performance. In a separate answer to an analyst's question, he also noted that they have no intention of diverging from the cycle they have adopted with the G80, which is to have the high-end part ready at the end of the year and release the lower-end derivatives in the spring.

I can't fault them for that. I think they were talking about the GX2 also....

Then came the devastating news that the 1TF in '07 was undoable:
:mad:
http://www.fudzilla.com/index.php?option=com_content&task=view&id=4045&Itemid=1

"Nvidia told a few souls that its new high-end card is getting ready for Q1 2008 launch. Nvidia knows it is too late for the holiday season, but it didn't have much option, as its new high-end card was simply not ready for Q4 2007.

The whole new high-end is codenamed D8E. Enthusiasts will have two chipsets on two PCB's with two huge power connectors. Most of you remember the 7950 GX2, and this will be a similar card. We still have to confirm if the chipsets on this card are two G92s, but we heavily suspect that Nvidia can enable the eighth cluster of this 65-nanometer chip and make the current 8800 GT chip even faster. If Nvidia does that, the 8800 GT chip will become an 8800 GTS 512MB, just with a touch of a flash tool.

After all, the G92 has more transistors than the G80, which means it has at least the same brute power, faster Shaders and new video processors; and we are sure that it has eight clusters, which are just as many as the G80 has.

Two of these chips should run very fast and we believe that such a card should be able to beat R680, ATI’s high-end card scheduled for the same release timeframe.

Nvidia also plans a new single chipset high-end card for mid 2008 codenamed G100, or D9E".

Needless to say I was crushed... :(

Now we fast forward to Feb '08, all ready for the Big Dogs' release after getting our appetites whetted by a speedy G92 8800 GT and GTS, and the news breaks of the 9 Series GTX and GX2's release at the end of Feb, or early March.

The excitement builds again.... They're going to be released!... They're going to be released!.. Could be runners?? :) :D :cool:

Then we get some dude posting an OC'ed picture of an alleged GTX hitting only 14K....


http://www.techpowerup.com/53685/NVIDIA_GeForce_9800_GTX_Scores_14K_in_3DMark06.html

Can't be true.. naaaa. Not the 9800 GTX right....

Then this which added more credibility to the 14K on the 9800 GTX:
http://www.vr-zone.com/articles/GeFo...arch/5602.html

Note: Nvidia has informed their partners the upcoming GeForce 9800 GX2 launch is delayed again till March 18th. No reasons provided by Nvidia. However, it is probably due to driver issues, according to partners. Right now, the card is scoring about 14K+ in 3DMark06, lower than the Radeon HD 3870 X2.

So now the GTX and GX2 are both getting 14K in 3DMark06? I am seeing a pattern... Disturbing, yes, but still a pattern. It's also well known that cards in SLI that aren't working properly can give you the same score as a single GPU. The 14K again... Gasp! This could be what is going on. Not sure, but I believe there is now a speed issue going on with both cards...

Then more confirmation on the delay...

http://www.theinquirer.net/gb/inquirer/news/2008/02/18/nv-release-woes
"Oh yes, we probably should mention that the 9800X2, or whatever they call it, has also moved from CeBIT to the end of March, for now. Before you get all hot and bothered, this is basically a couple of downclocked G92s on 2 PCBs".

Not exactly a good intro for the new 1TF Speed King, huh? ... :eek:
Sounds kinda 14K'ish to me.
Thank God we had theinquirer tip us off.....
We would probably still be expecting its release any day.
And still expecting the 1TF nVidia promised us in 07. :rolleyes:
The rumor sites are letting us down as gently as possible, but still feels like a plenty hard landing to me.
We have a good 30 more days I bet? Sigh...

I don't think this is what any of us were expecting for the arrival of the new GTX and GX2....

You must admit, it has been a long ride, and for the largest GPU maker in the world not to have a better PR department is just plain sad to me! You'd think they would want to keep us happy, and more informed than they do? :confused:

I for one am glad we at least get a few tips/rumors now and then. :)

It helps to lessen the pain of the sharp stick in the ass nVidia sometimes deals us.
 
I know the real reason they delayed it a week.... They knew i was gonna use my step up :(:( Now I'm SOL :D:D:eek::p;)
Lol, that's how it always works. I'm too ashamed to say what video card I'm using, but needless to say, getting the 9800 GX2 is going to be leaps and bounds better than what I have.

I don't have the luxury of owning an 8800 GTX, as (through my eyes) what I'm currently using has worked fine for the games I play. Going into a new build with new product coming out (new technology, too) makes putting together a system more enjoyable.

Less than a month now!
 
Bleh, looks like I'll be keeping the GTS512 SLI setup until I build my Nehalem rig and then switch to CrossFire, because Nvidia refuses to give up SLI on Intel boards.
 
God, if you bought an 8800gtx back in November/December of 2006, that would quite possibly be the best money you could ever spend :p
Indeed. I could not be happier with that decision.

I approached the buying decision thusly: What's the best available today (not six months from now. . . or six months after that)? Can I afford it? Okay then. . . ding!

18 months later. . . high-res, high-detail eye candy happiness prevails.

Meanwhile, I look back "fondly" at all the folks angrily posting that those buying the 8800gtx were idiots and morons because the "refresh" just a few months later would run cooler, faster, and be so much "better". . . and the ATi offering would beat even that. Just wait! Just wait!

I suspect those were the same folks who --more recently-- have spent the last several months (and some continue to do so) claiming that nvidia has been sitting on their "monster card" and just pushing back its release because ATi hasn't been challenging them.

Just a lot of angry people who can't just appreciate what's available, buy what they can afford, and then merely enjoy their purchases. It's all about predicting the video card future, finding/inventing villains, concocting conspiracy theories or ill-conceived explanations for a video card company's behavior. . . and, of course, being oh-so-self-righteously angry at some imagined injustice. That "injustice," of course, being that the video card they really want hasn't been released yet. . . or doesn't even exist.

People should just research what's available, decide what they can afford, try to determine that nothing newer/better is being released within 30 days. . . and then just pull the trigger. . . and not look back.
 
I don't count Fudzilla as accurate. Obviously, since no 8900GTX was ever released, I can rest my case concerning their accuracy. What I said was that NVIDIA never called their next-generation card 8900GTX or 8900GX2, and you've shown me nothing to the contrary.

It does seem that I was mistaken about G92 being referred to as a 1tflop part.

Someone had said the INQ had been spot on the past few releases; I said only on the ATi side, and that on the NV side they're always wrong until someone else is right. You disagreed for some other reason; I wasn't paying attention, and I showed you the link thinking it was you who disagreed with me about the INQ. I wasn't talking about an 8900GTX that nVidia had planned, I was talking about how the people at the INQ make shit up.

Sorry 'bout that.
 
Meanwhile, I look back "fondly" at all the folks angrily posting that those buying the 8800gtx were idiots and morons because the "refresh" just a few months later would run cooler, faster, and be so much "better". . . and the ATi offering would beat even that. Just wait! Just wait!

I suspect those were the same folks who --more recently-- have spent the last several months (and some continue to do so) claiming that nvidia has been sitting on their "monster card" and just pushing back its release because ATi hasn't been challenging them.

Just a lot of angry people who can't just appreciate what's available, buy what they can afford, and then merely enjoy their purchases. It's all about predicting the video card future, finding/inventing villains, concocting conspiracy theories or ill-conceived explanations for a video card company's behavior. . . and, of course, being oh-so-self-righteously angry at some imagined injustice. That "injustice," of course, being that the video card they really want hasn't been released yet. . . or doesn't even exist.

This doesn't make logical sense. Your comment would mean that Nvidia has not advanced technologically in the past 18 months. Hence the "monster card" theory.

I did not say you were wrong, though, just that it doesn't make any logical sense.

If true, there is something REALLY, REALLY WRONG with Nvidia's R&D right now.
 
This doesn't make logical sense. Your comment would mean that Nvidia has not advanced technologically in the past 18 months. Hence the "monster card" theory.

I did not say you were wrong, though, just that it doesn't make any logical sense.

If true, there is something REALLY, REALLY WRONG with Nvidia's R&D right now.

Of course nvidia is advancing their cards, but the idea that they are sitting on a new "monster card" is laughable at best. nVidia is a company, nvidia needs to turn profits on R&D, and nvidia would make more money releasing a new card it already had than milking their current lineup. Let's face it, people who can afford an 8800GTX would have bought it by now, so nvidia is lacking something to offer the enthusiasts with cash. While it may not bring in the bulk of their revenue, it is certainly a high-profit-margin card and drives lower-end sales. They need a new card just as much as ATI does.

My guess is that either nvidia is having trouble with their true next gen, or decided to put some extra work into it to really polish it off (I'm hoping the latter - I want to see another huge jump soon, like from the 7xxx to the 8xxx). The G92 feels too much like a "meh, here's a new card, now stfu" response from nvidia.
 
This doesnt make logical sense. Your comment would mean that Nvidia has not advanced, technologically in the past 18 months. Hence the "monster card" theory.

I did not say you were wrong though, just that it doesnt make any logical sense.

If true, there is something REALLY, REALLY WRONG with Nvidia's R&D right now.
You have a peculiar sense of "logic" if you think that my comments necessarily mean that nvidia must not have made any progress in the last eighteen months. Rather, it's readily apparent that they have been making slow and steady progress. . . just not according to the timetable of some of the more excitable, angry folks around here.

As I've said many times before, rather than become invested in fanciful conspiracy theories, it's far more likely that nvidia really outdid themselves with the 8800gtx and are having just as much trouble as ATi in topping it. That's a much more "logical" conclusion than believing in some phantom "monster card" that nvidia is just sitting on. Companies don't tend to invest in and create products that would absolutely bury their competition only to merely sit on it. . . thus giving their competition time to recover and pull back into contention as ATi is now doing.
 
You have a peculiar sense of "logic" if you think that my comments necessarily mean that nvidia must not have made any progress in the last eighteen months. Rather, it's readily apparent that they have been making slow and steady progress. . . just not according to the timetable of some of the more excitable, angry folks around here.

As I've said many times before, rather than become invested in fanciful conspiracy theories, it's far more likely that nvidia really outdid themselves with the 8800gtx and are having just as much trouble as ATi in topping it. That's a much more "logical" conclusion than believing in some phantom "monster card" that nvidia is just sitting on. Companies don't tend to invest in and create products that would absolutely bury their competition only to merely sit on it. . . thus giving their competition time to recover and pull back into contention as ATi is now doing.

Slow and steady progress does not equal a new product that, 18 months later, does not outperform the old product. People upgrade for performance increases. I'm not dropping $200 to save on my electricity bill.

No performance increase = no upgrade. And at that point I have a big problem even calling it an "upgrade."

I don't see how you took that post all personally and responded with irrelevant comments. I'm talking strictly performance. No one cares if the die size is smaller with less heat if there is no performance increase. If I wanted that, I'd get a laptop. This is beyond disappointing. (But then again, we have not seen real game benchmarks yet, so THIS ENTIRE THREAD is speculation.)

Chill.
 
Slow and steady progress does not equal a new product that, 18 months later, does not outperform the old product. People upgrade for performance increases. I'm not dropping $200 to save on my electricity bill.

No performance increase = no upgrade. And at that point I have a big problem even calling it an "upgrade."

I don't see how you took that post all personally and responded with irrelevant comments. I'm talking strictly performance. No one cares if the die size is smaller with less heat if there is no performance increase. If I wanted that, I'd get a laptop. This is beyond disappointing. (But then again, we have not seen real game benchmarks yet, so THIS ENTIRE THREAD is speculation.)

Chill.
I'm not exactly sure where I'm taking things personally or getting heated. I merely responded. Nor am I clear on how what I said in my response was "irrelevant," considering that it directly addressed your (to my mind, ill-considered) contention that there being "no withheld monster card" necessarily/logically compels us to conclude that no progress has been made over the last 18 months.

Indeed, that's a straw man. You continue to repeatedly say that there is no performance gain. There clearly (if these 3DMark numbers are accurate) is one, and it's not insubstantial. Though I'm not a fan of 3DMark, it's currently all we have from the new cards. . . my single 8800gtx scores 9,900 with all stock settings (C2D @ 2.4GHz). With my CPU at 3.375GHz, I get 11,000. We're seeing 3DMark of the 9800GTX at 14,000 with the CPU @ 3GHz. Just for context, 15,000 is what I score in SLI with my CPU @ 3.375GHz.

So, we're looking at a performance increase from ~10,000 3DMarks to ~14,000 3DMarks, with a bit of wiggle room in there due to the difference in CPU speeds and number of cores (dual vs quad). Yet, from these rough numbers, the single 9800GTX could be interpreted as approaching 8800gtx SLI speeds. And, let's also remind ourselves one more time that 3DMark06 sucks and especially recently --along with canned benchmarks-- hasn't been doing new nvidia cards justice. Yet you've said repeatedly that there is no performance gain, and your point of view and assertion that someone else isn't making any sense is entirely based on that faulty foundation.
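A quick back-of-the-envelope on those numbers (a sketch only; the scores are the rough figures quoted above, and the CPU differences are ignored):

```python
# Rough 3DMark06 figures quoted above (different CPUs, so ballpark only).
single_8800gtx = 10_000   # ~single 8800 GTX
leaked_9800gtx = 14_000   # leaked 9800 GTX score
sli_8800gtx    = 15_000   # 8800 GTX SLI, CPU @ 3.375 GHz

gain = (leaked_9800gtx - single_8800gtx) / single_8800gtx
print(f"{gain:.0%} over a single 8800 GTX")     # 40% over a single 8800 GTX
print(round(leaked_9800gtx / sli_8800gtx, 2))   # 0.93 of the SLI score
```

So even from these crude numbers, the leaked score is roughly 40% up on a single card and sits within shouting distance of 8800 GTX SLI.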

But again, I haven't gotten heated and I don't need to "chill." I wonder if maybe there is some projection going on. But I do tend to point out when someone is using the term "logic" in an odd way. It's pretty bold to state that someone isn't making any "logical sense" while you yourself are making several leaps and building your point of view on unsound ground.
 
Indeed, that's a straw man. You continue to repeatedly say that there is no performance gain. There clearly (if these 3DMark numbers are accurate) is one, and it's not insubstantial. Though I'm not a fan of 3DMark, it's currently all we have from the new cards. . . my single 8800gtx scores 9,900 with all stock settings (C2D @ 2.4GHz). With my CPU at 3.375GHz, I get 11,000. We're seeing 3DMark of the 9800GTX at 14,000 with the CPU @ 3GHz. Just for context, 15,000 is what I score in SLI with my CPU @ 3.375GHz.

So, we're looking at a performance increase from ~10,000 3DMarks to ~14,000 3DMarks, with a bit of wiggle room in there due to the difference in CPU speeds and number of cores (dual vs quad). Yet, from these rough numbers, the single 9800GTX could be interpreted as approaching 8800gtx SLI speeds. And, let's also remind ourselves one more time that 3DMark06 sucks and especially recently --along with canned benchmarks-- hasn't been doing new nvidia cards justice. Yet you've said repeatedly that there is no performance gain, and your point of view and assertion that someone else isn't making any sense is entirely based on that faulty foundation.

The reason you need to "chill" is the same reason your last post was so long.
You are not reading what I wrote, but somehow changing it in your head into what you think I'm implying.

I'll keep this short, so I don't confuse you anymore. And BTW, you brought up the "monster card theory"; I only used it as an example. (Evidenced by my saying "I did not say you were wrong, only that it does not make logical sense [from a business point of view].")

I get 13,500 3DMarks on an 8800GT.
If the new 9800GTX only gets 14K, that is beyond disappointing.

But again, like I said before, we have not seen real game benchmarks yet, so THIS ENTIRE THREAD is speculation.
 
I thought about getting another GTX. I've had mine since release.

I'd need to get another motherboard to use SLi and to go to water cooling. I wouldn't want two GTXs in my case on air come the summer. 680i|780i are warm as well.

I don't plan on buying a card that idles over 60C with stock cooling again.
 
The reason you need to "chill" is the same reason your last post was so long.
You are not reading what I wrote, but somehow changing it in your head into what you think I'm implying.
Actually, you said that what I wrote made "no logical sense". . . and then went on to clearly state that what I was saying (no "monster card") must mean that nvidia has made no progress in eighteen months. That is a clearly illogical statement on many levels. But fear not, what you wrote wasn't terribly complex and I'm reading it just fine. It's not beyond the comprehension of mere mortals. But, as you can see if you'll take a moment and really think it through, it's ironic that in fact it is you who attributed your own invented meaning to my words that made "no logical sense" itself. For the record, if you're going to castigate the logic of others, it's incumbent upon you to demonstrate both where the logical flaw exists in the argument of others and also maintain some semblance of logic yourself. You've failed to do either. And now you seem to just want to make this personal or complain about post length.

I'll keep this short, so I dont confuse you anymore. And BTW, the "monster card theory", you brought up and I only used it as an example. (Evidenced by why I said "I did not say you were wrong, only that it does not make logical sense[from a business point of view])
And, again, as I and others have repeatedly asked (though you call it "irrelevant"): why does it make more "business sense" for nvidia to sit on a card and give ATi time to recover. . . and even 18 months later release these cards instead of the "monster," when they could have buried ATi 6-12 months ago (according to the conspiracy theorists) had they just released this fanciful "monster"?

So, in keeping with your "so I don't confuse you" theme (nice), I'll be succinct here: You somehow got it into your head that my saying that there is no "monster card" is equivalent to saying that there has been no progress on the part of nvidia. This was a leap of logic on your part (ironically while you accused another of the same) that you have been unable to support.

I get 13,500 3DMarks on an 8800GT.
If the new 9800GTX only gets 14K, that is beyond disappointing.
You keep talking about no performance gain in 18 months. Yet now you want to compare to a card that is three months old (that costs less than half as much as its current nearest performance competitor)? Do you not see anything wrong or. . . dare I say. . . "illogical" in suddenly making that leap from eighteen months to three? That's really moving the goalposts. Especially when you take into account the fact that we're now talking about performance numbers at a substantially lower price than the $600-650 we paid for the 8800GTX eighteen months ago.
 
You keep talking about no performance gain in 18 months. Yet now you want to compare to a card that is three months old (that costs less than half as much as its current nearest performance competitor)? Do you not see anything wrong or. . . dare I say. . . "illogical" in suddenly making that leap from eighteen months to three? That's really moving the goalposts. Especially when you take into account the fact that we're now talking about performance numbers at a substantially lower price than the $600-650 we paid for the 8800GTX eighteen months ago.

My card is an example I used because it's all I have.
A GTX that is 18 months old is faster. What exactly do you not understand here? Their top-of-the-line card then is just as fast as their top-of-the-line card now. Show me the amazing 18-month performance advance. This is not about money. It's about overall top-of-the-line performance. Oh yeah, it's 15%.......:rolleyes:

Whatever, dude. I'm not arguing anymore. You just want to dissect posts and not read what I'm conveying. Good night.
 
As far as what I was expecting, I would make these points:

1 - The 'close to 1TF' card Michael Hara was talking about, the one they were trying to release in '07, was the GX2.

2 - It had been posted around the web that the speed of the current 8800 Ultra was about 1/2 a TF.

3 - I was thinking that with 2 G92 GPUs on 1 card, we would have been closer to Michael Hara's 1TF.
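For what it's worth, point 3 pencils out. A back-of-the-envelope sketch using the per-chip GFLOPS figures quoted elsewhere in this thread (624 for a G92 at GTS clocks, 576 for the Ultra); it ignores whatever downclocking the real GX2 ships with, so treat it as an upper bound:

```python
# Two G92s on one card vs. Michael Hara's "close to 1 TFLOP" claim.
g92_gts_gflops = 624   # G92 8800 GTS MADD+MUL figure quoted in this thread
ultra_gflops = 576     # 8800 Ultra, roughly the "1/2 a TF" of point 2

gx2_paper_gflops = 2 * g92_gts_gflops    # 1248 GFLOPS if neither chip is downclocked
print(gx2_paper_gflops / 1000)           # 1.248 "TF" on paper
print(round(gx2_paper_gflops / ultra_gflops, 2))  # ~2.17x the Ultra, theoretically
```

Which is exactly why a card that only matches a single-GPU 14K in 3DMark06 feels so far short of the marketing math.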

I am not the smartest GPU dude in the forum, but I don't buy that 14K is fast enough yet... Even if 3DMark06 is a poor measuring stick....

I'm still not even sure if it really hit the reported 30% minimum speed increase over the Ultra yet, either.

I also can't believe that Michael Hara was really talking about the G100 being the 1TF card and it being released in '07... It wasn't near ready.

The GX2 was supposed to be the Beast!

I for one am disappointed, but will probably still buy one.

I am not waiting for the G100 with my 8600GT. I have my 1,920 x 1,200 res to think about, and too many games still need to be played. :)
The 8600 GT is the fastest GPU I have ever owned. I want major techno shock when I plug the next GPU into my newly built quad system.
 
I don't care about 3dmark scores, show me some in-game performance. I thought we had all moved past this glorious 3dmark age?
 
I never thought that the 3870 would give ATI some room to work with until the 48xx series by June/July... and I'm proven wrong :D Go ATI! Hopefully the 4000 series will be up to the task...

I was imagining that ATI was gonna be killed off because nvidia had everything in their hands, i.e. they already got the tech to stop whatever ATI throws at 'em... but it's not the case. I'm disappointed, because I wanted to get a 9 series card given how good the 8 series is... but it's too early to say.
 
I don't care about 3dmark scores, show me some in-game performance. I thought we had all moved past this glorious 3dmark age?
I wish they would have posted some in game frame rates...

3DMark06 Score is all we have so far.
 
I never thought that the 3870 would give ATI some room to work with until the 48xx series by June/July... and I'm proven wrong :D Go ATI! Hopefully the 4000 series will be up to the task...

I was imagining that ATI was gonna be killed off because nvidia had everything in their hands, i.e. they already got the tech to stop whatever ATI throws at 'em... but it's not the case. I'm disappointed, because I wanted to get a 9 series card given how good the 8 series is... but it's too early to say.

There is a momentum shift in the air. I can feel it! ;)

Competition is a good thing. It will keep the gas on hard at both camps.
 
Can't we all just stop posting Inquirer, Fud, and VR-Zone links? They want ad traffic, and they generate it with ridiculous crap like this.

As crappy as they are, the Inq and Fudzilla give us crap to discuss and argue about. Gives forum veterans something else to do besides answer questions about PSUs being powerful enough or parts for new builds :p.
 
My card is an example I used because it's all I have.
A GTX that is 18 months old is faster. What exactly do you not understand here? Their top-of-the-line card then is just as fast as their top-of-the-line card now. Show me the amazing 18-month performance advance. This is not about money. It's about overall top-of-the-line performance. Oh yeah, it's 15%.......:rolleyes:
You repeatedly state that there has been "no technological advancement" for eighteen months and base your entire point (and your criticism of mine) on that "fact." Then, you bring up a three-month-old card (8800GT). Yet, oddly, despite the fact that the 8800GT is less than half as much money as the 8800GTX, has fewer shaders, less RAM, uses a different manufacturing process, incorporates new texture compression technology, takes up less space, and runs cooler (etc.). . . and nearly matches the GTX in performance, you still claim there has been no "technological advancement." Your criteria for "technological advancement" seem rather peculiar. And when you attempt to point out that the "monster card" must exist or nvidia has made "no technological advancement," your idiosyncratic criteria for "advancement" sorta bite you in the butt and pull the "logic" rug right out from under you. Which would be fine and wouldn't trouble me at all if you weren't committing these logical errors while erroneously declaring someone else's argument to be "illogical."

You just want to dissect posts and not read what I'm conveying.
Again, I read and understand what you are writing perfectly fine. You just don't seem to want to own up to what you've said and instead seem to now just want to assert that I'm just misunderstanding you. You intended --and, for your criticism of my point to stand, needed-- to "convey" that there has been "no technological advancement" in the last 18 months if the "monster card" does not exist. You now (without actually acknowledging it) seem to want to magically transform what you said into: "I'm disappointed that the newer cards don't appear to be as fast as I would have liked. They don't look like they'll be much faster than what's currently available." And the best part is that it's apparently my problem that I don't consider those two points of view to be essentially synonymous.

The problem is that those are two very different things. . . and you originally made the (obviously inaccurate) statement that there had been no technological advancement in 18 months in order to try to demonstrate that someone else was saying something illogical. It's a shame that you can't just acknowledge that you were actually the one conflating terms and operating under a narrow criterion of "advancement" and thereby jumping to illogical conclusions yourself.

Instead, you prefer to walk out while pretending that you're just being misunderstood.

Dude, if you're going to throw the word "logic" around and castigate the "logic" of others. . . you should be more prepared to back up what you say and you certainly shouldn't play as "fast and loose" as you seem to want to do with commonly accepted terms and definitions. Or, in other words, you'd best make sure your own argument is "logical" before you start questioning the logic of others.
 
I like cupcakes.

If nvidia offered a cupcake with their 9800GTX, it would offset any performance shortcomings.

The 8800GTX does 518 GFLOPS theoretically (MADD+MUL), but it's much lower when you factor out the MUL.

Here's a quote off Wikipedia:

The claimed theoretical processing power for the 8 Series cards given in FLOPS may not be correct at all times. For example the GeForce 8800 GTX has 518.43 GigaFLOPs theoretical performance given the fact that there are 128 stream processors at 1.35 GHz with each SP being able to run 1 Multiply-Add and 1 Multiply instruction per clock [(MADD (2 FLOPs) + MUL (1 FLOP))×1350MHz×128 SPs = 518.4 GigaFLOPs][4]. This figure may not be correct because the Multiply operation is not always available[5] giving a possibly more accurate performance figure of (2×1350×128) = 345.6 GigaFLOPs.

The 8800 Ultra = 576 GFLOPS, the 8800 GTS (G92) = 624 GFLOPS. Those numbers are single-precision FP32.

Sure, a GX2 with two G92s would technically be over 1 TFLOP (so is the 3870 X2), but the numbers are skewed.
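The arithmetic in that Wikipedia quote is easy to reproduce. A minimal sketch; the 1.5 GHz and 1.625 GHz shader clocks are my assumptions, chosen because they reproduce the 576 and 624 GFLOPS figures quoted above:

```python
def theoretical_gflops(stream_processors, shader_clock_ghz, flops_per_sp_per_clock):
    """Peak single-precision throughput: SPs x shader clock x FLOPs issued per clock."""
    return stream_processors * shader_clock_ghz * flops_per_sp_per_clock

# 8800 GTX: 128 SPs @ 1.35 GHz, MADD (2 FLOPs) + MUL (1 FLOP) = 3 FLOPs/clock
print(round(theoretical_gflops(128, 1.35, 3), 1))   # 518.4, the quoted peak
# With the often-unavailable MUL factored out (MADD only, 2 FLOPs/clock):
print(round(theoretical_gflops(128, 1.35, 2), 1))   # 345.6

# Assumed shader clocks that match the 576 / 624 GFLOPS figures above:
print(theoretical_gflops(128, 1.5, 3))    # 576.0  (8800 Ultra)
print(theoretical_gflops(128, 1.625, 3))  # 624.0  (8800 GTS G92)
```

The whole 1TF debate hinges on that third multiplier: count the MUL and two G92s clear 1 TFLOP on paper; drop it and they don't get close.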

I found an interesting page about CUDA while typing this out; it actually goes into detail about a lot of things.
 