Theoretical GT300 performance vs. 5870?

It's a bit annoying that Nvidia is trying to create or break into other markets, adding new features not entirely or necessarily related to gaming, and we're going to have to pay for it through higher prices... as if Nvidia cards weren't expensive enough.

The more I think about it, the more it looks like I might be heading to ATi land.
 
Regardless, I predict that the GT300 will outperform the HD 5870 - it has to. Nvidia can't afford to be three months late to the ball and then NOT have a faster product.

Other than that, I hope Nvidia makes a competitive product; if not, I'll end up with an HD 5850 this time around :)

Knowing Nvidia, in all likelihood you'll be able to get TWO HD 5850s for the price of ONE GT300. And probably the slightly gimped one at that, rather than the flagship model. ;)
 
It's a bit annoying that Nvidia is trying to create or break into other markets, adding new features not entirely or necessarily related to gaming, and we're going to have to pay for it through higher prices... as if Nvidia cards weren't expensive enough.

The more I think about it, the more it looks like I might be heading to ATi land.
They haven't even released it yet, or announced (or even rumored) a price, and you're complaining that it's going to be expensive? You also assume that it isn't going to help gaming, which is something I'm not convinced of. Maybe you should wait until we have some real clue as to what it's going to cost and how it's going to perform before you make such statements?
 
Nvidia's drivers blow. I still get constant "display driver has recovered" messages after my games crash. I did not have that issue with my 3870x2. Not once.

Nvidia *can* afford to be three months late and not live up to the massive hype. Why? Fanboys, and ATI did it with the HD 2000 series. It'll take recovery time, but that's really a non-issue.

While the specs are impressive, I'll wager it at roughly 39% faster than the GTX 285 at equivalent clock speeds. Since clocks are really unknown at the moment, I'd rather not base a prediction on them so much as on a comparison with fewer variables.

I'm willing to bet you have Vista 32-bit. Every post/person I talk to who has issues with Vista and any hardware, I don't care who makes it, has issues at about a 4:1 margin compared with those on 64-bit. I run 64-bit Vista Ultimate with 2x GTX 260s, and my last BSOD was over a year ago, back when I had 2x 8800 GTS 640s and the driver was failing due to a Creative sound card. I removed the card and haven't had a single issue since, ever. And I've updated to every new WHQL driver they have released since the 17x series.
 
They haven't even released it yet, or announced (or even rumored) a price, and you're complaining that it's going to be expensive? You also assume that it isn't going to help gaming, which is something I'm not convinced of. Maybe you should wait until we have some real clue as to what it's going to cost and how it's going to perform before you make such statements?

Well what do you expect? The bogus thread was started using tons of assumptions.
 
They haven't even released it yet, or announced (or even rumored) a price, and you're complaining that it's going to be expensive? You also assume that it isn't going to help gaming, which is something I'm not convinced of. Maybe you should wait until we have some real clue as to what it's going to cost and how it's going to perform before you make such statements?

Why can't he make those statements now if people are making statements projecting the GT300 to perform better without seeing any hardware, benches, or numbers, but based solely on assumptions? That's like the pot calling the kettle black.

For example: I want to assume A, B, and C; thus the GT300 is faster. Sorry, but you cannot assume A or B based on the GT300 using C percent of its transistors for non-GPU operations, because that might lead to the conclusion that the GT300 is slower. Huh?
 
Why can't he make those statements now if people are making statements projecting the GT300 to perform better without seeing any hardware, benches, or numbers, but based solely on assumptions? That's like the pot calling the kettle black.

For example: I want to assume A, B, and C; thus the GT300 is faster. Sorry, but you cannot assume A or B based on the GT300 using C percent of its transistors for non-GPU operations, because that might lead to the conclusion that the GT300 is slower. Huh?

There have already been sites claiming to have sources that say the GF100 (or GT300, whatever) will have no problem besting the 5870. With the white papers released, we have starting points for the specifications. Assuming equal clock speeds after a die stretch hardly seems outrageous.

But assuming it is going to be "expensive" "just because" is a joke. At least start with some basis in reality.
 
Well what do you expect? The bogus thread was started using tons of assumptions.

I would guess Nvidia will release slightly faster hardware (10-20%) in the form of a 380 vs. the 5870 and charge 15-25% more for it. I would also expect them to release a 360, price it even with the 5870, and have it perform on par. Somehow that seems more probable than Nvidia releasing hardware four months late and trying to charge more for it than what is already on the market.
 
I would guess Nvidia will release slightly faster hardware (10-20%) in the form of a 380 vs. the 5870 and charge 15-25% more for it. I would also expect them to release a 360, price it even with the 5870, and have it perform on par. Somehow that seems more probable than Nvidia releasing hardware four months late and trying to charge more for it than what is already on the market.

You really don't know Nvidia, do you? The only company that likes to charge a premium more than they do is Intel. Nvidia GPUs have always commanded a $50-$100 premium when there was no competition. As soon as any competition arrived, the price fell by that $50-$100. That translates into a lot of profit. ATI is a bit better than that: they could charge an arm and a leg for the 5870 right now, close to the 295, but they are not.
 
They haven't even released it yet or released (or even a rumored) price and you are complaining that it is going to be expensive? You also assume that it isn't going to help gaming which is something, I'm not convinced of. Maybe you should wait till we have some real clue as to what it is going to cost and how it is going to preform before you make such statements?


Historically, they are always more expensive. So it isn't an assumption, it's a prediction based on historical evidence. Plus, it's a completely different architecture, with components that aren't necessarily related to gaming.

If you believe it won't be more expensive... shit, I'll go ahead and say a lot more expensive than the ATi card... then I have a bridge in Brooklyn I think you'll be interested in.

You could be right, of course, but I wouldn't bet on it.
 
You really don't know Nvidia, do you? The only company that likes to charge a premium more than they do is Intel. Nvidia GPUs have always commanded a $50-$100 premium when there was no competition. As soon as any competition arrived, the price fell by that $50-$100. That translates into a lot of profit. ATI is a bit better than that: they could charge an arm and a leg for the 5870 right now, close to the 295, but they are not.
Actually, they priced it in line with their own 4870 X2, because otherwise the 4870 X2 would cut into the 5870's profits. Are you going to tell me it costs them nearly twice as much to make one of these as a 4890? I find that hard to believe. So please, continue to tell me how ATI is not profit mongering.


Historically, they are always more expensive. So it isn't an assumption, it's a prediction based on historical evidence. Plus, it's a completely different architecture, with components that aren't necessarily related to gaming.

If you believe it won't be more expensive... shit, I'll go ahead and say a lot more expensive than the ATi card... then I have a bridge in Brooklyn I think you'll be interested in.

You could be right, of course, but I wouldn't bet on it.
I didn't ever say it won't be more expensive. In fact I think I just said it would be more expensive. Perhaps you should read my posts?

And exactly what historical evidence? Last gen, ATI was late to market, not Nvidia.
 
I am thinking that most people in this thread do not understand the whitepaper. Assuming equal clock speeds, "Fermi" will be one hell of a monster compared to anything out there. The progression from the 7x00 series to the 8x00 series was a full 100% upgrade (7800 GTX SLI is within +/- 5% of an 8800 GTX), and the same goes for 8x00 to GT200. The specs for this not only more than double the shader count, but also add 50% more memory bandwidth and put much improved thread management in place.
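Just to put rough numbers on that (the GT200 figures are the GTX 285's; the Fermi bandwidth below is simply the "50% more" claim plugged in, an assumption rather than a confirmed spec), here's the kind of back-of-envelope math I'm doing, in Python:

# Rough spec-ratio comparison, GTX 285 vs. the rumored Fermi part.
# Shader counts: 240 (GT200) vs. 512 (from the Fermi whitepaper talk).
# Memory bandwidth: 159 GB/s for the GTX 285; the Fermi number is just
# "50% more", as assumed above, not a confirmed figure.
gt200_shaders = 240.0
fermi_shaders = 512.0
gt200_bw_gbs = 159.0
fermi_bw_gbs = gt200_bw_gbs * 1.5

print("Shader ratio at equal clocks: %.2fx" % (fermi_shaders / gt200_shaders))  # ~2.13x
print("Memory bandwidth ratio:       %.2fx" % (fermi_bw_gbs / gt200_bw_gbs))    # 1.50x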

I am not going to try to guess what kind of numbers we will see with whatever incarnation the "gamer" version takes, but I will be waiting to see....... unless they don't have "Eyefinity", then F 'em and 5870 CF here I come.
 
They haven't even released it yet, or announced (or even rumored) a price, and you're complaining that it's going to be expensive? You also assume that it isn't going to help gaming, which is something I'm not convinced of. Maybe you should wait until we have some real clue as to what it's going to cost and how it's going to perform before you make such statements?

Actually, they priced it in line with their own 4870 X2, because otherwise the 4870 X2 would cut into the 5870's profits. Are you going to tell me it costs them nearly twice as much to make one of these as a 4890? I find that hard to believe. So please, continue to tell me how ATI is not profit mongering.


I didn't ever say it won't be more expensive. In fact I think I just said it would be more expensive. Perhaps you should read my posts?

And exactly what historical evidence? Last gen, ATI was late to market, not Nvidia.

Are you off your meds or something? In your first reply to me you mentioned nothing about how you said Nvidia was going to be more expensive.

And what do you mean, "exactly what historical evidence"? Do you even own a video card? I have to ask this ridiculous question since everyone knows Nvidia cards are always more expensive.

Maybe you're the one who needs to read because I never even talked about being late to market.

I think you're just an ATi fanboy who is foaming at the mouth so badly, and trying to respond to so many different posts with your drivel, that you're mixing up who you are talking to.

Your meds, go back on them.
 
Judging by the fake card they showed at the developers conference, I'd say nVidia fell asleep at the driver's wheel. I'd say we're lucky if we get something by Q2 2010. And by then ATI should have made enough revisions to release new versions of the 5850 and 5870 - maybe call them the 5860 and 5880?
 
Judging by the fake card they showed at the developers conference, I'd say nVidia fell asleep at the driver's wheel. I'd say we're lucky if we get something by Q2 2010. And by then ATI should have made enough revisions to release new versions of the 5850 and 5870 - maybe call them the 5860 and 5880?

Fell asleep at the driver's wheel? Don't you mean developed a completely new, possibly revolutionary card in almost the same amount of time ATi managed to put out little more than a refresh?

Not that I approve of them cramming shit onto their cards that does anything other than improve gaming, but c'mon, people. You act like they just pumped out another card......

Of course this doesn't surprise me, given what 99% of you know about developing, engineering, and manufacturing anything.
 
Fell asleep at the driver's wheel? Don't you mean developed a completely new, possibly revolutionary card in almost the same amount of time ATi managed to put out little more than a refresh?

Not that I approve of them cramming shit onto their cards that does anything other than improve gaming, but c'mon, people. You act like they just pumped out another card......

Of course this doesn't surprise me, given what 99% of you know about developing, engineering, and manufacturing anything.

Did you watch the conference? It was mind-numbing (albeit a developers' conference and not a gamers' one).

Nvidia hasn't pumped out anything new. Nothing works; reports are out that the Fermi card he showed off was a non-working model, and the CUDA and PhysX stuff is the same thing we've been seeing for a while.

A quarter of that conference was dedicated to the idea of 3D glasses... which, go try them on at your local Fry's, they make you downright dizzy!!!

nVidia seems more interested in the computation side of the business than in the look and feel of a game, which for me was enough to take the Red Pill.

Now, with that said, I guarantee I will try nVidia again in the future, and I applaud their achievements, as they have real-world benefits, especially in the medical field. I also would never want a single GPU provider; competition is what drives this innovation!
 
on your "linear increase with more cores" and "SLI doesn't double performance", look up Amdahl's Law, and you'll understand whats wrong there...

point well taken.
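For anyone who hasn't played with it, here's a quick Python sketch of Amdahl's Law. The parallel fraction used below is just an illustrative assumption, not a measured number for any actual game:

# Amdahl's Law: speedup = 1 / ((1 - p) + p / n), where p is the fraction of the
# work that parallelizes and n is the number of cores/GPUs.
def amdahl_speedup(p, n):
    return 1.0 / ((1.0 - p) + p / n)

# If only 80% of a frame's work scales across GPUs (an assumed figure),
# two cards top out around 1.67x rather than 2x, and returns keep shrinking.
print(amdahl_speedup(0.8, 2))  # ~1.67
print(amdahl_speedup(0.8, 4))  # ~2.5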

For example, 5870 vs. 4870/4890, or GTX 285 vs. 8800 GTX/Ultra: it isn't double the speed, but it's about half again as fast. Based on that, your estimations of the GT300's performance are within reality. (This doesn't mean it couldn't be twice as fast, mind you; nVidia has released stuff close to 2x faster than its previous generation before (the GeForce 6800 comes to mind), not to mention that nVidia is notorious for absolutely BEASTLY designs. It's like Intel and nVidia are racing to see who can get more transistors on a die: Itanium vs. GeForce.)

Hahahaha, good old Itanium. I'm still hoping that architecture gains some traction; from what I've seen, it sure would allow us to slipstream a lot of code into substantially fewer clock cycles (read: big performance bump). I remember ages ago someone posted about how they were angry that AMD did the Pentium Pro approach better than Intel did (Athlon 64 vs. P4), and was thus causing Intel to lose Itanium investors. I sort of laughed at the guy at the time, but now, after giving Itanium a couple of hours of my time, I'm not angry at AMD or anything; still, if every server out there had unanimously made the switch, we'd be well on our way to web 3.0. Anyways, I digress...

Regardless, I predict that the GT300 will outperform the HD 5870 - it has to. Nvidia can't afford to be three months late to the ball and then NOT have a faster product.

Other than that, I hope Nvidia makes a competitive product; if not, I'll end up with an HD 5850 this time around :)

Yeah, if Nvidia can't hit the '09 Christmas season they'd better have a damn good reason, because you can expect that by Christmas 2010 AMD is going to have a full range of SKUs, possibly including a 32nm part, based off what they have now. If Nvidia can't compete there, their investors walk.

/sigh as much as I'd love to chime in here it seems this thread has been invaded by idiots. Another time another thread I suppose.

I was just chiming in to say I'm surprised how civilized everyone has been...

As per GT300 I'd like to look at it from as practical a point of view as I can:

In the past (RV770 vs. GT200b), ATI crammed just under 1 billion transistors into a 256 mm² package (at 55nm, of course), while Nvidia managed 1.8 billion transistors in 497 mm²; thus both companies demonstrated ~4 million transistors/mm². If we assume ATI and Nvidia have the same transistor density at 40nm, then for 3 billion transistors Nvidia's GT300 die is ~520 mm² (a touch bigger than the GT200 (edit: corrected from GT200b)), which falls nicely into line with the rumored terrible GT300 yields, considering this is Nvidia's first stab at 40nm (and TSMC isn't so hot at 40nm, as demonstrated by the RV740).

So, and take this as you will, but off a single 300mm wafer ATI is going to get ~1.6 times as many raw chips (working or otherwise) as Nvidia is.
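If anyone wants to sanity-check that ~1.6x figure, here's roughly how I got it in Python. The Cypress die size below is my own assumption (~330 mm²), and this is the simple gross-die estimate, which ignores defects and scribe lines:

import math

def gross_dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    # Classic gross-die estimate: wafer area / die area, minus an edge-loss term.
    # Counts raw candidate dies only; says nothing about how many actually work.
    wafer_area = math.pi * (wafer_diameter_mm / 2.0) ** 2
    edge_loss = math.pi * wafer_diameter_mm / math.sqrt(2.0 * die_area_mm2)
    return int(wafer_area / die_area_mm2 - edge_loss)

cypress = gross_dies_per_wafer(330)  # ASSUMED ~330 mm2 for the 5870's die
gt300 = gross_dies_per_wafer(520)    # the ~520 mm2 estimate above
print(cypress, gt300, round(cypress / float(gt300), 2))  # ratio lands around 1.6-1.7x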

In typical Nvidia fashion, if they're going to have a huge die, and it's going to be late, you'd better expect it's going to scream through the benchmarks and take your wallet along for the ride.
 
Fell asleep at the driver's wheel? Don't you mean developed a completely new, possibly revolutionary card in almost the same amount of time ATi managed to put out little more than a refresh?

Not that I approve of them cramming shit onto their cards that does anything other than improve gaming, but c'mon, people. You act like they just pumped out another card......

Of course this doesn't surprise me, given what 99% of you know about developing, engineering, and manufacturing anything.

LOL, it makes me laugh how offended people get in these threads. Yeah, you are right, 99% of us are probably not engineers, and you don't have to be an engineer to be a gamer. Yes, the card will be revolutionary. Yes, ATI doubled everything, but somehow they managed to double the transistor count and still keep the same power envelope as the HD 4890, actually a little less. Nvidia did design a great chip, but it is in the sense of a cGPU, meaning it is more of a general-purpose GPU than a gaming GPU. For gaming, of course, it will be fast, but the gaming part of it is basically like ATI's: doubled shader units and more ROPs. The revolutionary part of it is not really for games; it is probably for Tesla cards and for making supercomputers out of these cards. Yes, Nvidia has to do this to survive in the future, so I don't blame them.
 
I would guess Nvidia will release slightly faster hardware (10-20%) in the form of a 380 vs. the 5870 and charge 15-25% more for it. I would also expect them to release a 360, price it even with the 5870, and have it perform on par. Somehow that seems more probable than Nvidia releasing hardware four months late and trying to charge more for it than what is already on the market.

Are you off your meds or something? In your first reply to me you mentioned nothing about how you said Nvidia was going to be more expensive.

And what do you mean, "exactly what historical evidence"? Do you even own a video card? I have to ask this ridiculous question since everyone knows Nvidia cards are always more expensive.

Maybe you're the one who needs to read because I never even talked about being late to market.

I think you're just an ATi fanboy who is foaming at the mouth so badly, and trying to respond to so many different posts with your drivel, that you're mixing up who you are talking to.

Your meds, go back on them.

Try checking some timestamps; I wrote that 16 minutes before your post. And I'm an ATI fanboy? Yeah, that's why I've got an Nvidia card in my rig? (Check the sig.) I think you should start looking for that medicine yourself.
 
In the past (RV770 vs. GT200b), ATI crammed just under 1 billion transistors into a 256 mm² package (at 55nm, of course), while Nvidia managed 1.8 billion transistors in 497 mm²; thus both companies demonstrated ~4 million transistors/mm².

Actually, G200 B3 (the 55nm GTX 285) was only 1.4B transistors at ~452 mm² (or 470 mm²). I actually haven't been able to ascertain for certain how big the G200b die is.

Here's one tech site which actually bothers to explain this.
TSMC, who produce the GT200 and GT200b, specify a linear shrink of 10 percent for the new architecture. If Nvidia was able to make full use of those possibilities, the Die size of the GT200b should be about 490 mm² - this is still almost twice the size of the RV770 Die which is working on AMD's Radeon HD 4800 cards.

http://www.pcgameshardware.com/aid,673142/Geforce-GTX-285-reviewed/Reviews/
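For what it's worth, a full 10% linear shrink means each dimension scales by 0.9, so the area scales by roughly 0.9² ≈ 0.81; starting from the 65nm GT200's ~576 mm², that works out to about 576 × 0.81 ≈ 467 mm², which is in the same ballpark as the ~452-470 mm² figures floating around for the G200b.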
 
Try checking some timestamps; I wrote that 16 minutes before your post. And I'm an ATI fanboy? Yeah, that's why I've got an Nvidia card in my rig? (Check the sig.) I think you should start looking for that medicine yourself.

OH WOW LOL........ That post of yours that you quoted wasn't even to me. I didn't read it; I was trying to read your conversation with me. I wasn't aware that I had to follow our conversation, plus the conversations you were having with everyone else, in order to have a proper debate with you. :rolleyes:

And I probably ignored you at first because your very first post in this thread was you calling everyone idiots and saying how you weren't going to post. But here you are, balls deep in the thread, looking to be the only idiot around.

Like I said, your meds, find them.
 
OH WOW LOL........ That post of yours that you quoted wasn't even to me. I didn't read it; I was trying to read your conversation with me. I wasn't aware that I had to follow our conversation, plus the conversations you were having with everyone else, in order to have a proper debate with you. :rolleyes:

Yeah, a whole one post. What a burden.
 
There have already been sites claiming to have sources that say the GF100 (or GT300, whatever) will have no problem besting the 5870. With the white papers released, we have starting points for the specifications. Assuming equal clock speeds after a die stretch hardly seems outrageous.

But assuming it is going to be "expensive" "just because" is a joke. At least start with some basis in reality.

Assuming that the GT300 will be more expensive because it has a LARGER die and because it produces lower YIELDS is just common sense. Not to mention knowing Nvidia's past pricing policies (hello, $649 GTX 280!). Not to mention that it has 3 billion transistors, an unknown percentage of which won't even be used for GPU/gaming work, but the cost will be passed along to the consumer nonetheless. I don't see any joke here, just logical conclusions based on the facts.
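To illustrate the yield side of that argument, here's a toy Poisson yield model in Python. The defect density is a made-up, purely illustrative number (real 40nm defect data isn't public); the point is just how quickly yield falls as the die gets bigger:

import math

def poisson_yield(die_area_mm2, defects_per_cm2):
    # Simple Poisson yield model: Y = exp(-D * A).
    # Real fabs use fancier models, but the trend is the same:
    # yield drops off exponentially with die area.
    return math.exp(-defects_per_cm2 * die_area_mm2 / 100.0)

D = 0.4  # defects per cm^2 -- ASSUMED for illustration, not real TSMC data
for area in (330, 520):  # roughly a 5870-sized die vs. a ~520 mm2 GT300 estimate
    print("%d mm2 -> %.0f%% good dies" % (area, 100 * poisson_yield(area, D)))

Fewer good dies per wafer means a higher cost per good die, which is the whole basis for expecting a price premium.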
 
Assuming that the GT300 will be more expensive because it has a LARGER die and because it produces lower YIELDS is just common sense. Not to mention knowing Nvidia's past pricing policies (hello, $649 GTX 280!). Not to mention that it has 3 billion transistors, an unknown percentage of which won't even be used for GPU/gaming work, but the cost will be passed along to the consumer nonetheless. I don't see any joke here, just logical conclusions based on the facts.

What facts? We don't know if the consumer cards will have the same features as what was demonstrated at the GPU Technology Conference. It was meant for developers, because NVIDIA doesn't just focus on the gaming market with their GPUs. Ever since G80, this IS a fact.
It's been widely speculated that the consumer cards won't have ECC support, because NVIDIA itself mentioned that with ECC enabled there's a possibility of a 15-20% drop in performance. And there are other features that may not even see the light of day in consumer cards, which will make the chip smaller and cheaper. Fermi as a whole is an HPC chip, but given its scalability, it will be "downgraded" for consumer cards.
There are no "facts" that point to what you said, especially since you are assuming that NVIDIA didn't learn their lesson from their price adjustments during GT200 vs. RV770.

And what about the $649 GTX 280? They launched it before the competition, and when the competition came very close to their performance, they adjusted the pricing. They even had their partners give the difference back to early adopters. So what's the problem? Yes, $649 is expensive, but why is that used against them, when that was the normal price for a high-end card until then? The HD 5870 X2 will cost about that much (rumors point to $599). Isn't that expensive too?
 
Because everyone should live without their sound card to be compatible with Nvidia hardware? Huh?

If people are stupid, dumb, idiotic, or retarded enough to buy a Creative sound card, then yes, they should. Creative sucks at writing drivers; by far the worst out there of any hardware maker.
 
If people are stupid, dumb, idiotic, or retarded enough to buy a Creative sound card, then yes, they should. Creative sucks at writing drivers; by far the worst out there of any hardware maker.

This one hundred thousand times.
 
And what about the $649 GTX 280? They launched it before the competition, and when the competition came very close to their performance, they adjusted the pricing. They even had their partners give the difference back to early adopters. So what's the problem? Yes, $649 is expensive, but why is that used against them, when that was the normal price for a high-end card until then? The HD 5870 X2 will cost about that much (rumors point to $599). Isn't that expensive too?

None of what you just said refutes the proposition that the GT300 will be expensive, and likely a lot more expensive than the 5870. I also find it amusing that you try to compare the price of an overpriced Nvidia single-GPU card to an ATI DUAL-GPU card. So you're saying that it's OK for Nvidia to overprice their cards as long as the single-GPU one costs roughly the same as an ATI dual-GPU card? That's like saying, "See, you can buy TWO ATI GPUs for the price of ONE Nvidia GPU. I don't see the problem with that pricing!" LOL. Hilarious.
 