GT300 taped out

I didn't say they aren't important, but I did say that they are not responsible for the highest profit margins.

And answering your question: because it's not cost effective. The highest-end chip obviously costs more to produce than all the others derived from it. The R&D costs also revolve around the highest end, since more chips will be created from that one design. High-end cards are not meant for high production volumes, so they keep a high price for longer than the others, which is usually justified because they perform better. And that's true, but for money-making companies what matters is how much it costs to produce one and how much profit they can make out of that one. Since they sell fewer of those, they charge more for them, and that's the reasoning behind "supply and demand". They could probably price the high end low enough for more people to buy it, just not while keeping a good profit margin on it.

Just look at what happened with the 7800 GTX 512 and the 8800 GT when they first debuted. The 7800 GTX 512 was obviously just meant to spoil the already late arrival of the X1800 XT, but NVIDIA didn't anticipate that people would flock to it like they did, even at the ridiculously high price it had. Same for the 8800 GT: launched before ATI released the HD 3800 series, to spoil the launch once more. Only this time the 8800 GT had a very attractive price, which again led people to grab it off the shelves and click the checkout button like maniacs. NVIDIA had to increase orders to TSMC because of that. There was a piece of news about it; I'm sure you can find it.

Well, I'll just leave it at I disagree with your reasoning.
 
Then by all means share yours!

I think that while Nvidia can afford to lose profits on its high-end cards, it can't afford to "lose touch". ATI is determined to fight a price/performance war despite Nvidia pretty much handing them last gen, letting them launch their cards at a VERY profitable price point. Frankly, I disagree with ATI's decision. I think that given AMD's financial situation, they needed to suck it up and take the profits when given the chance, but that is another story.

Nvidia has the fastest graphics chips on the high end. I seriously doubt that will change next generation, simply because of the relative size of the R&D budgets at AMD and Nvidia. Furthermore, I suspect Nvidia will correct the mistake of not launching a dual-PCB card sooner. I understand why they didn't launch the 295 sooner, and that was because they had launched the 9800 GX2 (which was a mistake to launch in the first place).

Bearing all that in mind, I don't believe Nvidia can afford the market image that will result if they launch a single-GPU card much above $500. The market didn't bear it when they launched the 285, and there were games demanding more GPU power then. There are fewer games demanding insane GPU power now (there are still plenty of GPU-hungry games, don't get me wrong). I think they know that if they launch at $500 and AMD launches at $400, they are going to have to deliver that extra $100 in performance, and if they don't, they are going to lose a lot of market share.

I believe they know that ATI is quickly becoming the "AMD" while Nvidia is becoming the "Intel" of the Pentium/K6-2 era. Public perception sells video cards, and while the "big money" is in some of the lower-end/OEM parts, the "4870/275" value-minded enthusiast market is no small market.

They should have seen how few GTX 285s they sold this round because of the difference in price to performance. While it is faster than a 4890, it isn't faster by enough to justify the price difference. How they expect someone to pay $600 for a video card that is unlikely to outperform a GTX 295, which isn't exactly flying off the shelves at $500... well, I just hope they aren't that dumb.

Basically, to sum up the semi-random paragraphs above: they can't afford to look out of touch with price/performance in an economic slowdown. They can't afford to cost twice as much as an Xbox. If they do, they will hurt the name that sells the OEM parts that "bring home the bacon". Furthermore, they still have to sell the high-end parts, and if the prices are too far out of line, people will simply buy older chips (perhaps not even their older chips) and CrossFire/SLI them.
 
Don't get me wrong, I have nothing against a company making a healthy profit for producing a quality part. However, the non-NRE cost is the same as a GTX 280 assuming the same yields, so cost isn't the issue. What I'm referring to is what the market will bear, and I don't see them being able to sell an ultra-high-end card for three times the price of its half-performance partner.

The problem is that new generation GPUs are expensive to create. I don't think ATI can keep this price war up, especially considering they keep losing money, further dragging AMD down.

I don't like paying high prices for cards, but you can't expect a company to operate for years without a profit and stick around.
 
I think that while Nvidia can afford to lose profits on its high-end cards, it can't afford to "lose touch". ATI is determined to fight a price/performance war despite Nvidia pretty much handing them last gen, letting them launch their cards at a VERY profitable price point. Frankly, I disagree with ATI's decision. I think that given AMD's financial situation, they needed to suck it up and take the profits when given the chance, but that is another story.

No, they don't, just like any other money-making company. But comparing the two, NVIDIA was, and still is, in a much better position to lower its profit margins than AMD. But obviously they don't want to, as would be expected.

vengence said:
Nvidia has the fastest graphics chips on the high end. I seriously doubt that will change next generation, simply because of the relative size of the R&D budgets at AMD and Nvidia. Furthermore, I suspect Nvidia will correct the mistake of not launching a dual-PCB card sooner. I understand why they didn't launch the 295 sooner, and that was because they had launched the 9800 GX2 (which was a mistake to launch in the first place).

They didn't launch the GTX 295 sooner because they needed the 55nm chips, which they didn't have at the time they launched the GTX 2xx cards. It had little to nothing to do with the GX2. A chip is designed for a certain process. GT200 was designed for 65nm first (a process NVIDIA had formidable results with: G92). Going to 55nm was obviously in the queue for GT200, but that doesn't happen overnight.

vengence said:
Bearing all that in mind, I don't believe Nvidia can afford the market image that will result if they launch a single-GPU card much above $500. The market didn't bear it when they launched the 285, and there were games demanding more GPU power then. There are fewer games demanding insane GPU power now (there are still plenty of GPU-hungry games, don't get me wrong). I think they know that if they launch at $500 and AMD launches at $400, they are going to have to deliver that extra $100 in performance, and if they don't, they are going to lose a lot of market share.

Well, here we seem to be in agreement. I'm sure they both learned from their mistakes and will correct them this time. Just don't expect the ultra high-end to be cheap, because it most certainly won't be. Expect mid-range cards to be priced correctly, according to the performance they provide, but the high-end... it will be expensive, as always.

vengence said:
I believe they know that ATI is quickly becoming the "AMD" while Nvidia is becoming the "Intel" of the Pentium/K6-2 era. Public perception sells video cards, and while the "big money" is in some of the lower-end/OEM parts, the "4870/275" value-minded enthusiast market is no small market.

Disagreed. The market for cards such as those was practically forged during this period. A GTX 275 offers nearly 95% of the performance of NVIDIA's high-end single-GPU card, the GTX 285, and the HD 4890 IS ATI's high-end single-GPU card. As prices were being adjusted and lowered for cards such as the HD 4850, HD 4870, 9800 GTX and GTX 260, neither company could release the new cards at a much higher price. The HD 4890 is marginally faster than an HD 4870 1 GB, and a GTX 275 isn't that much faster than a GTX 260 216.

When the new generation of cards is released, their prices will set the performance/price bar that the other company must follow and adjust to in order to compete, or have its profit margins killed. If they are originally priced very low, to grab market share (as was AMD's intention with the HD 4800s), then the situation will repeat itself, but I have my doubts that they want to keep losing profits when they so badly need them, i.e. they will price it adequately, but high enough to profit. For example: between $350 and $450 for the high-end single chip. And this goes for NVIDIA too.

vengence said:
They should have seen how few GTX 285s they sold this round because of the difference in price to performance. While it is faster than a 4890, it isn't faster by enough to justify the price difference. How they expect someone to pay $600 for a video card that is unlikely to outperform a GTX 295, which isn't exactly flying off the shelves at $500... well, I just hope they aren't that dumb.

Well, I don't know the numbers for either of those cards, so I really can't comment on that, but with what you said, you seem to be assuming that a company knows how many cards it will sell. They don't, so they place orders in a cost-effective manner. As I mentioned before, what matters is the money it takes to produce them and how much they can profit from it. And that's why the high end (in this case the GT200b chip that powers the GTX 285) isn't produced in large volumes. They order a few thousand chips based on sustainable costs for that order and watch supply and demand. If it increases, they order more. If it doesn't, they stay put. The low-end chips for OEMs are mass-produced, because the orders for those chips were already guaranteed by the OEMs a while back.

vengence said:
Basically, to sum up the semi-random paragraphs above: they can't afford to look out of touch with price/performance in an economic slowdown. They can't afford to cost twice as much as an Xbox. If they do, they will hurt the name that sells the OEM parts that "bring home the bacon". Furthermore, they still have to sell the high-end parts, and if the prices are too far out of line, people will simply buy older chips (perhaps not even their older chips) and CrossFire/SLI them.

Agreed, since that's pretty much what I said with the "they learned from their mistakes" bit. But again, the ultra high-end will be as expensive as always. Don't expect the dual-chip cards to cost any less than $500-$550.
 
Given that NV made DX10 a joke and is still messing about trying to get proper DX10.1 cards... I'd say there's little hope of an NV DX11 card before the end of next year.
 
Given that NV made DX10 a joke and is still messing about trying to get proper DX10.1 cards... I'd say there's little hope of an NV DX11 card before the end of next year.

Uh? NV made DX10 a joke? I really want to see the answer to that one...
 
They got MS to pull most of it, making it a pointless exercise, because they couldn't get hardware working.

It's also the reason the Vista drivers got redone (the driver was supposed to be totally out of the core; NV couldn't get it sorted in time, so MS "tweaked" it so they could do it). Intel and ATI got proper drivers working.
 
Uh? NV made DX10 a joke? I really want to see the answer to that one...

I think he was just being a little harsh. NVIDIA had some of the original features removed; some of them showed up in DX10.1, which NVIDIA is now (why?) supporting. I have heard a couple of versions of this though, so it might need a grain of salt.
 
They got MS to pull most of it, making it a pointless exercise, because they couldn't get hardware working.

It's also the reason the Vista drivers got redone (the driver was supposed to be totally out of the core; NV couldn't get it sorted in time, so MS "tweaked" it so they could do it). Intel and ATI got proper drivers working.

LOL, and where did this fairytale come from?

All that's missing now is for you to say that this contributed to the delays of R600...

AMD/ATI fans are totally off the charts in terms of conspiracy theories and making things up...

Back to reality though: Microsoft didn't pull anything, and NVIDIA had nothing to do with the success or failure of any DX version. They did in fact push its adoption, since they had working hardware BEFORE Vista was even out. And the drivers being redone is another fairytale. Where did you get this from? Because there were lots of problems with the drivers at first? Well, that always happens when there's new hardware and a new OS, especially when the new OS has a new driver model. ATI didn't face as many problems because they had NO hardware at launch.
 
I think he was just being a little harsh. NVIDIA had some of the original features removed; some of them showed up in DX10.1, which NVIDIA is now (why?) supporting. I have heard a couple of versions of this though, so it might need a grain of salt.

...:rolleyes:

No. There were no original features removed. DX10 had its spec well defined (as is the case every time), and NVIDIA had hardware that supported all the DX10 features, which is why it got certified by Microsoft as DX10-compliant hardware. DX10.1 is a minor increment, of which NVIDIA supports a few features, but not all, and thus couldn't get DX10.1 certification. Are there any more wild conspiracy theories to debunk?

DX11 is a superset of DX10.1, and thus having DX10.1 is the minimum for supporting some of the DX11 features in Windows 7.
 
I have my doubts that they want to keep losing profits when they so badly need them, i.e. they will price it adequately, but high enough to profit. For example: between $350 and $450 for the high-end single chip. And this goes for NVIDIA too.

You do realize that AMD made the die size of the 4800 series small compared to NVIDIA's and ATI's past chips, right? This smaller die size has allowed AMD to get very good yields and to make healthy profits off their cards this past round, which in turn put some heavy pressure on NVIDIA. The profit margins on AMD cards were good enough to pull in a new AIB partner (XFX), so considering these facts, and that AMD plans to keep its current strategy of small die size / decent performance, there should be decent prices to come from AMD in the new generation of cards.

You can read more here: http://www.anandtech.com/video/showdoc.aspx?i=3469
 
You do realize that AMD made the die size of the 4800 series small compared to NVIDIA's and ATI's past chips, right? This smaller die size has allowed AMD to get very good yields and to make healthy profits off their cards this past round, which in turn put some heavy pressure on NVIDIA. The profit margins on AMD cards were good enough to pull in a new AIB partner (XFX), so considering these facts, and that AMD plans to keep its current strategy of small die size / decent performance, there should be decent prices to come from AMD in the new generation of cards.

You can read more here: http://www.anandtech.com/video/showdoc.aspx?i=3469

They both lost profit margins, because they engaged in a price-cut war (you should read the rest of the thread). AMD gambled on the assumption that NVIDIA couldn't keep up with the price cuts, but they did. Just like AMD wanted to capture market share, even if it meant losing some profits, NVIDIA also didn't want to lose market share and thus had to lower profits. The HD 4850 competed with the year-old 9800 GTX, which had around the same die size and fewer transistors than the HD 4850. Can you guess who had the better yields (considering the maturity of the chip) and who made more money out of those two?

It's undeniable that the GTX 200 cards had more costs attached to them than the HD 4800s, but AMD was using the very recent GDDR5, which wasn't cheap and didn't get cheaper, because no one else was using it (thus not increasing demand) and one of their main GDDR5 suppliers, Qimonda, was having some serious financial problems.

I don't know the details of why XFX started selling ATI cards, but they didn't stop selling NVIDIA cards, so nothing really confirms great profit margins on the AMD products. Just that XFX wanted a wider range of products to sell, which obviously means more profits, since until then they were NVIDIA-exclusive.

As for the "small die strategy" that will keep prices down, let's wait and see. AMD is in the red for 11 quarters now. They won't be able to get out of the red, while engaging in price cuts with companies that had nothing but profits until the recession started.
 
They both lost profit margins, because they engaged in a price-cut war (you should read the rest of the thread). AMD gambled on the assumption that NVIDIA couldn't keep up with the price cuts, but they did. Just like AMD wanted to capture market share, even if it meant losing some profits, NVIDIA also didn't want to lose market share and thus had to lower profits. The HD 4850 competed with the year-old 9800 GTX, which had around the same die size and fewer transistors than the HD 4850. Can you guess who had the better yields (considering the maturity of the chip) and who made more money out of those two?

It's undeniable that the GTX 200 cards had more costs attached to them than the HD 4800s, but AMD was using the very recent GDDR5, which wasn't cheap and didn't get cheaper, because no one else was using it (thus not increasing demand) and one of their main GDDR5 suppliers, Qimonda, was having some serious financial problems.

I don't know the details of why XFX started selling ATI cards, but they didn't stop selling NVIDIA cards, so nothing really confirms great profit margins on the AMD products. Just that XFX wanted a wider range of products to sell, which obviously means more profits, since until then they were NVIDIA-exclusive.

As for the "small die strategy" that will keep prices down, let's wait and see. AMD is in the red for 11 quarters now. They won't be able to get out of the red, while engaging in price cuts with companies that had nothing but profits until the recession started.

Not to be rude, but did you happen to read the link I posted? It actually addresses some of the things I said and talks about how AMD no longer plans to seek out the top performance crown, but rather small die sizes with better prices and yields.
 
I'm hoping my 650W PSU will be fine for this monster? They seriously need to make these GPUs more energy efficient.

I'm on an i7 PC with 4 HDDs and a GTX 260, and so far it's running well, but I am looking to add more HDDs and possibly one of these GPUs, and I'm worried that my PSU won't cope.
 
I am exceedingly confident that my Antec NeoPower Blue 650 will be enough for one card, two HDDs, 4 sticks of RAM and an overclocked Q6600.
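For what it's worth, a back-of-the-envelope power budget points the same way. Every per-component wattage below is a rough assumption (the GPU number is just the rumored "< 236 W" ceiling mentioned later in this thread), so treat it as a sanity check, not a spec:

```python
# Rough system power budget for a single next-gen card on a 650 W PSU.
# All wattages are ballpark assumptions, not measured figures.
budget = {
    "GPU (rumored TDP ceiling)": 236,
    "Overclocked quad-core CPU": 150,
    "Motherboard + 4 sticks of RAM": 50,
    "2-4 HDDs": 40,
    "Fans, optical drive, USB devices": 30,
}

total = sum(budget.values())
print(f"Estimated peak draw: ~{total} W")          # ~506 W
print(f"Headroom on a 650 W unit: ~{650 - total} W")
```

Even with pessimistic numbers there is over 100 W of headroom left, assuming the PSU can actually deliver its rated wattage on the 12 V rails.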
 
Just wondering, and it might be a dumb question, but will CPUs such as the Phenom II X4 suffer any bottlenecking with the new-gen cards, or are those CPUs good for a long time to come?
 
Just wondering, and it might be a dumb question, but will CPUs such as the Phenom II X4 suffer any bottlenecking with the new-gen cards, or are those CPUs good for a long time to come?
They will certainly suffer a little compared to overclocked i7 setups. It really just depends on how fast these cards turn out to be. For the next gen of dual high-end SLI setups, only a fool would go with anything other than an i7. Core 2 and Phenom II CPUs already get their asses kicked badly in tri-SLI setups, and two of the next-gen cards will be much faster. I have a feeling there will be quite a few people severely bottlenecked in a few months.
 
I'm just wondering because I'm building a new Phenom II X4 955 rig this coming week, and I'm concerned I will be bottlenecked, and I've been reconsidering just buckling down for an i7 rig.
 
I'm just wondering because I'm building a new Phenom II X4 955 rig this coming week, and I'm concerned I will be bottlenecked, and I've been reconsidering just buckling down for an i7 rig.
Well, if you plan on running high-end GPUs plus upgrading down the road, then I would certainly go i7. Going Phenom II will bite you in the ass later for sure, since even now multi-card setups of this gen already run better on i7.
 
Cool. I figured this will probably be the only time I'll be able to build for a while, and I've asked many users for help building, and everyone keeps talking me into the 955, but with the cash I've got now I've been going back to the i7 build I've got saved.
 
Cool. I figured this will probably be the only time I'll be able to build for a while, and I've asked many users for help building, and everyone keeps talking me into the 955, but with the cash I've got now I've been going back to the i7 build I've got saved.
Why not just wait a few weeks for the Lynnfield CPUs? P55 boards will be much cheaper than X58, and the i5 CPUs will perform about the same as i7. Heck, from what I understand, P55 will also have i7 CPUs.
 
Cool, yes, that's what I was referring to. It's hard to hold onto 1500 dollars till then, haha. So I might just buckle down and build an i7 anyway. Every time I get money something comes up and I have to spend it, so if I spend it now, I can't spend it elsewhere.
 
Can't wait for the new 400 GB/s+ memory bandwidth card. Has anybody confirmed how big this card is and how much power it sucks up during use?
 
GT200 has 240 single precision shaders PLUS 30 shaders used exclusively for double precision work, since the normal shaders can't do it. This is a major weakness of GT200:
- Double precision performance is poor
- These DP units can't be used for single precision work
- It's one of the reasons why GT200 is so big

Now, GT300 would have 512 MIMD cores, with no need for a group of shaders dedicated exclusively to double precision work. This means a huge performance increase (double precision):
GTX 295 - 149 gigaFLOP/s
GT300 - 1000-1200 gigaFLOP/s (depending on clocks)

So one GT300 could be 6.5-8 times faster in double precision calculations than 2x GT200. There's no double precision work in games... basically, it could mean that NVIDIA's Tesla products could become pretty popular.

Isn't the GTX 280 a 933 gigaFLOP/s card? And the 4850 a 1 teraFLOP/s card?
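(For what it's worth, those 933 GFLOP/s and 1 TFLOP/s figures are single precision peaks, while the 149 GFLOP/s number above is double precision. A minimal sketch of where the commonly quoted peaks come from, assuming the usual published shader counts, FLOPs per clock and shader clocks:)

```python
# Peak throughput = units x FLOPs per unit per clock x clock (GHz) -> GFLOP/s.
# Shader clocks (1.296 / 1.242 / 0.625 GHz) are the commonly quoted values.
def peak_gflops(units, flops_per_clock, clock_ghz):
    return units * flops_per_clock * clock_ghz

# Single precision
print(peak_gflops(240, 3, 1.296))       # ~933  GFLOP/s: GTX 280 (MAD + MUL)
print(peak_gflops(800, 2, 0.625))       # ~1000 GFLOP/s: HD 4850 (MAD)

# Double precision on GT200 goes through the 30 dedicated DP units
print(peak_gflops(30, 2, 1.296))        # ~78  GFLOP/s: one GTX 280
print(2 * peak_gflops(30, 2, 1.242))    # ~149 GFLOP/s: GTX 295 (two GT200b GPUs)
```

So the "6.5-8x" comparison is a double precision one, which is exactly why it matters for Tesla rather than for games.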
 
Can't wait for the new 400 GB/s+ memory bandwidth card. Has anybody confirmed how big this card is and how much power it sucks up during use?

The rumor mill points to less than the max TDP of GT200, i.e. < 236 W.

And the rumor mill also puts GT300 (or whatever it's called) at around 260-290 GB/s of bandwidth, not 400 GB/s.

As for how big the card is, the rumor mill isn't specific, but they do mention that the chip itself should be a bit smaller than GT200b, i.e. < 490 mm².
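For reference, memory bandwidth falls straight out of bus width times effective data rate, so you can see what kinds of configurations those rumored numbers would imply. The bus widths and data rates below are purely illustrative assumptions, not confirmed GT300 specs:

```python
# Bandwidth (GB/s) = bus width in bits / 8 * effective data rate in GT/s.
# The configurations below are illustrative assumptions, not confirmed specs.
def bandwidth_gbs(bus_width_bits, data_rate_gtps):
    return bus_width_bits / 8 * data_rate_gtps

print(bandwidth_gbs(512, 2.214))   # ~142 GB/s: GTX 280-style 512-bit GDDR3
print(bandwidth_gbs(512, 4.5))     # ~288 GB/s: hypothetical 512-bit GDDR5
print(bandwidth_gbs(384, 4.8))     # ~230 GB/s: hypothetical 384-bit GDDR5
```

A 512-bit GDDR5 setup lands right in that rumored 260-290 GB/s range, while 400 GB/s on the same bus would need roughly 6.3 GT/s GDDR5, which wasn't really shipping at the time.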
 
The rumor mill points to less than the max TDP of GT200, i.e. < 236 W.

And the rumor mill also puts GT300 (or whatever it's called) at around 260-290 GB/s of bandwidth, not 400 GB/s.

As for how big the card is, the rumor mill isn't specific, but they do mention that the chip itself should be a bit smaller than GT200b, i.e. < 490 mm².

Awww, I wish the cards would get even bigger. Probably not, unless they make something like a GTX 300 GX2 :D

Still, high 200s of memory bandwidth for a single card/core is mighty impressive. I remember the days when the 8800 Ultra was king of the hill with a then-huge ~104 GB/s.
 
Why do you want cards to get bigger? They won't fit in most cases if they get any bigger.
 
Why do you want cards to get bigger? They won't fit in most cases if they get any bigger.

That's the whole point: I want to see how big these cards can get as they advance over the years to come. Sorry, from my point of view it's good, but from your point of view it might be stupid :)
 
That's the whole point: I want to see how big these cards can get as they advance over the years to come. Sorry, from my point of view it's good, but from your point of view it might be stupid :)

There are serious cost and yield issues at 500 mm²; I can't imagine trying to produce silicon economically at sizes larger than that.
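To put some very rough numbers on that: die cost scales worse than linearly with area, because a big die both fits fewer candidates on a wafer and is more likely to catch a killer defect. The defect density and the simple Poisson yield model below are illustrative assumptions, not foundry figures:

```python
import math

# Gross dies per 300 mm wafer (standard approximation with an edge-loss term).
def dies_per_wafer(area_mm2, wafer_d_mm=300):
    wafer_area = math.pi * (wafer_d_mm / 2) ** 2
    edge_loss = math.pi * wafer_d_mm / math.sqrt(2 * area_mm2)
    return wafer_area / area_mm2 - edge_loss

# Simple Poisson yield model: Y = exp(-die area * defect density).
def poisson_yield(area_mm2, defects_per_cm2=0.5):
    return math.exp(-(area_mm2 / 100) * defects_per_cm2)

for area in (250, 350, 500):  # illustrative die sizes in mm^2
    good = dies_per_wafer(area) * poisson_yield(area)
    print(f"{area} mm^2: ~{dies_per_wafer(area):.0f} gross dies, "
          f"yield ~{poisson_yield(area):.0%}, ~{good:.0f} good dies per wafer")
```

With those made-up numbers, a 500 mm² die leaves only a handful of good dies per wafer versus several times that for a 250 mm² die, which is the whole economic argument against giant GPUs.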
 
Awww, I wish the cards would get even bigger. Probably not, unless they make something like a GTX 300 GX2 :D

Still, high 200s of memory bandwidth for a single card/core is mighty impressive. I remember the days when the 8800 Ultra was king of the hill with a then-huge ~104 GB/s.

It will be on a 40nm process, so a "same size" die as GT200 will allow it to pack in close to twice the transistors. Which means it will have ~twice the shaders, etc.

It will essentially be ~2x as big, shrunk into the same area. That's what die shrinks do.
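The "roughly twice the transistors in the same area" figure is just first-order geometric scaling: area per transistor shrinks with about the square of the feature size. A quick sketch of the ideal case (real processes usually deliver somewhat less):

```python
# Ideal density gain from a process shrink: the same die area holds roughly
# (old_nm / new_nm)^2 as many transistors. Real-world gains are usually lower.
def ideal_density_gain(old_nm, new_nm):
    return (old_nm / new_nm) ** 2

print(ideal_density_gain(65, 40))   # ~2.6x vs the original 65nm GT200
print(ideal_density_gain(55, 40))   # ~1.9x vs the 55nm GT200b
```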
 
There are serious cost and yield issues at 500 mm²; I can't imagine trying to produce silicon economically at sizes larger than that.

Especially at 40nm (at least right now): they can't produce enough HD 4770s, let alone a chip of that size.
 
Who cares? There really is no reason to upgrade. There is no Crysis on the horizon. Every game released on the PC now is either a 360 port or an indie-type game from a publisher like Stardock that's more about gameplay than graphics.
 
Who cares? There really is no reason to upgrade. There is no Crysis on the horizon. Every game released on the PC now is either a 360 port or an indie-type game from a publisher like Stardock that's more about gameplay than graphics.
Clear Sky on max DX10 settings and Cryostasis are two games it would be nice to have way more GPU power for. Also, everyone with a 30-inch monitor and/or the AA whores still like to have as much GPU power as possible. Yeah, most games certainly don't need more than a current GTX 285, though.
 
Who cares? There really is no reason to upgrade. There is no Crysis on the horizon. Every game released on the PC now is either a 360 port or an indie-type game from a publisher like Stardock that's more about gameplay than graphics.

QFT.
I'm probably going to pick up two of these new GT300 series cards down the road, but in all truth I know it's solely a want and not a need situation. Sorry to say, it's true.
 
Clear Sky on max DX10 settings and Cryostasis are two games it would be nice to have way more GPU power for. Also, everyone with a 30-inch monitor and/or the AA whores still like to have as much GPU power as possible. Yeah, most games certainly don't need more than a current GTX 285, though.

I mean, yeah, you're right, but that's all fluff and nice to be able to do, but hardly game-changing or game-breaking. I mean shit, heaven forbid we can't run gobs of AA. Yeah man, 4x AA simply isn't enough, I need 16XCSMPAACXRSSADDDGG AA LOL.
 