55nm Production Status (Rumors)

vengence

http://www.tcmagazine.com/comments.php?id=20755&catid=6


Seems the 280s (as well as all their current GPUs) could see the 55nm shrink by the end of Q3 / beginning of Q4, which realistically puts them sometime in late September / early October.

Judging by what they've been able to do with other chips on die shrinks, we could well see a "stock" speed increase to 700MHz. Seeing how EVGA has already binned their chips to put out a 700MHz water-cooled version, the same binning might result in an 800MHz version. I know some people have already pushed that EVGA card past 750MHz. It might (low odds) be possible we could see someone get 900MHz out of one of these. I say might, because it's not going to be everyone; in fact most won't. But still, the idea of a single one of them running 50% faster than the stock 280 now is just... well, it's just sick. :p
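Quick back-of-the-envelope on those clocks, assuming the GTX 280's stock 602MHz core clock (the other numbers are just the speculative bins from this thread, not announced parts):

```python
# Sanity check on the clock-speed speculation above.
# 602MHz is the GTX 280's reference core clock; the targets are
# the speculative bins discussed in this thread, not real products.
stock = 602  # MHz
for target in (700, 750, 800, 900):
    gain = (target - stock) / stock * 100
    print(f"{target}MHz would be a {gain:.0f}% core overclock over stock")
# 900MHz works out to roughly +49%, which is where the "50% faster" figure comes from.
```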


Disclaimer: This post is about bleeding edge hardware. Price/performance ratio will not be discussed in this post. Why? Because this is about future hardware. The price of the 280 has dropped by 30% in a month, so talking about the price of a card that hasn't even been released would be dumb.
 
The price of the 280 has dropped by 70% in a month, so talking about the price of a card that hasn't even been released would be dumb.

70%? I think you meant to say it dropped 30%. It's at roughly 70% of what it was at launch.
 
If nVidia can tweak this architecture, up the speeds, and bring the costs down, the GT200b is going to be a formidable chip for AMD to counter. And it looks like soon after 55nm they are going to 40nm. If this architecture scales well, they could put 2 billion transistors in less space than they currently put 1.4 billion. Go to GDDR5, reduce the memory bus to 256-bit, and add DX10.1.

I think that this is an architecture that was really meant for the future more than now.
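For what it's worth, the 256-bit GDDR5 idea checks out on paper. A rough sketch, using the GTX 280's published 512-bit GDDR3 numbers and borrowing the HD 4870's GDDR5 data rate as a stand-in:

```python
# Memory bandwidth = (bus width in bytes) x (effective transfer rate).
# GTX 280: 512-bit bus, GDDR3 at 1107MHz (2214 MT/s effective).
# GDDR5 figure borrowed from the HD 4870: 900MHz command clock, 3600 MT/s effective.
def bandwidth_gbps(bus_bits, mtps):
    return bus_bits / 8 * mtps / 1000  # GB/s

print(bandwidth_gbps(512, 2214))  # ~141.7 GB/s, the GTX 280 today
print(bandwidth_gbps(256, 3600))  # ~115.2 GB/s on half the bus width
# Faster GDDR5 bins (4000+ MT/s) would close the rest of the gap,
# with a much cheaper PCB than a 512-bit layout requires.
```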
 
So if we see a 55nm version of the GT200 cards in the next few months, would that mean the same specs except for higher clocks, lower temps, and a slightly lower price tag?
 
Ahh, the sweet smell of speculation...

[attached image]


^ Hopefully the shrink will prevent this from happening :p
 
I was thinking: I guess what Nvidia would basically do is the same thing as with the 9800 GTX, and make a "+" version of the GT200 cards.
 
If nVidia can tweak this architecture, up the speeds, and bring the costs down, the GT200b is going to be a formidable chip for AMD to counter. And it looks like soon after 55nm they are going to 40nm. If this architecture scales well, they could put 2 billion transistors in less space than they currently put 1.4 billion. Go to GDDR5, reduce the memory bus to 256-bit, and add DX10.1.

I think that this is an architecture that was really meant for the future more than now.

40nm would be sick. Really sick, especially for the clock speeds. Yeah, the architecture really looks like it was made for a 55nm or 40nm process.

I'm not sure about the GDDR5. This card needs its 1GB of RAM, and 1GB of GDDR5 would be horrifically expensive.
 
POPEGOLD is not going to like your disclaimer.

hahaha, yeah. He and all the other people who just want to start fan boy flame wars can find another thread. Speaking of fan boy flame wars, hello trudude! Go find somewhere else to start one. This thread is about Nvidia's latest products. Period.
 
Okay, since we are talking about bleeding edge hardware, let's talk about the HD4870x2 with 2GB of GDDR5. That is pretty bleeding edge. Come back when nVidia has something bleeding edge.

Sure it is. 1.4 billion transistors on a chip isn't exactly a high school science project either. In any case, all of these parts are going to be short-lived as the ante has been upped. By this time next year we should have 40nm GPUs from both AMD and nVidia. Not to mention that nVidia has some interesting software coming out in PhysX and CUDA games and applications. I can't wait for Badaboom.

Don't expect the 4870x2 to have the performance crown for long. Heck, AMD might very well dethrone it themselves before the year is out. In the next year we are going to see more activity in the GPU space than we've seen this decade.
 
Ahh, the sweet smell of speculation...

[attached image]


^ Hopefully the shrink will prevent this from happening :p

Ya know, it's pictures like that that make me think the "ATX LONG" standard should be coming out soon. Really, it's amazing what you could do for temperatures and noise if you doubled the amount of space for the heatsink/fan.

IIRC, one of the Voodoo cards that never made it to market had a wall wart.

Although, a GTX 280+x2 would do several things!
1) Cause lots of people to die from the most horrible name ever invented! :D
2) Shut everyone up who screams 4870x2! :rolleyes:
3) Cause EVGA to come out with a water-cooled version that can be yours for only $10,000. :p
 
Sure it is. 1.4 billion transistors on a chip isn't exactly a high school science project either. In any case, all of these parts are going to be short-lived as the ante has been upped. By this time next year we should have 40nm GPUs from both AMD and nVidia. Not to mention that nVidia has some interesting software coming out in PhysX and CUDA games and applications. I can't wait for Badaboom.

Don't expect the 4870x2 to have the performance crown for long. Heck, AMD might very well dethrone it themselves before the year is out. In the next year we are going to see more activity in the GPU space than we've seen this decade.

Don't even bother replying to him. This thread is about new architecture, not who can take existing products, solder them together, and call it bleeding edge.
 
http://www.tcmagazine.com/comments.php?id=20755&catid=6


Seems the 280s (as well as all their current GPUs) could see the 55nm shrink by the end of Q3 / beginning of Q4, which realistically puts them sometime in late September / early October.

Judging by what they've been able to do with other chips on die shrinks, we could well see a "stock" speed increase to 700MHz. Seeing how EVGA has already binned their chips to put out a 700MHz water-cooled version, the same binning might result in an 800MHz version. I know some people have already pushed that EVGA card past 750MHz. It might (low odds) be possible we could see someone get 900MHz out of one of these. I say might, because it's not going to be everyone; in fact most won't. But still, the idea of a single one of them running 50% faster than the stock 280 now is just... well, it's just sick. :p


Disclaimer: This post is about bleeding edge hardware. Price/performance ratio will not be discussed in this post. Why? Because this is about future hardware. The price of the 280 has dropped by 30% in a month, so talking about the price of a card that hasn't even been released would be dumb.

Actually, I don't see the 280 die shrink doing that much for them. The chip itself will still be too big to get much in the way of clock speed. Still, it would improve the yields and margins, and it would still be the fastest single GPU on the market.

What would be interesting to see is a G94 die shrink. Given the way it scales with SLI, that would make for a good dual-GPU card.
 
Actually, I don't see the 280 die shrink doing that much for them. The chip itself will still be too big to get much in the way of clock speed. Still, it would improve the yields and margins, and it would still be the fastest single GPU on the market.

What would be interesting to see is a G94 die shrink. Given the way it scales with SLI, that would make for a good dual-GPU card.

Die shrinks allow clock speeds to be increased. Look at the G80 -> G92 shrink: the G80 vs. G92 GTS, or the 9800 GTX vs. GTX+. Both gained serious clock speeds. Lower thermal loads also allow the chips to be pushed harder with the same cooler design.

Die shrinks actually tend to hurt the yield in terms of good chips per chips made. However, the number of chips per wafer increases more than the yield percentage decreases, making the cost per chip go down.
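A toy model of that trade-off (every number here is made up purely for illustration, not actual foundry pricing or defect data):

```python
import math

# Toy model: cost per good die = wafer cost / (dies per wafer * yield).
# All figures below are assumptions for illustration only.
WAFER_COST = 5000.0            # dollars per 300mm wafer (assumed)
WAFER_AREA = math.pi * 150**2  # mm^2 of a 300mm wafer

def cost_per_good_die(die_area_mm2, yield_fraction):
    dies = WAFER_AREA // die_area_mm2  # crude count, ignores edge loss
    good = dies * yield_fraction
    return WAFER_COST / good

# 65nm GT200 at ~576mm^2 vs. a shrunk die, with yield dropping on the new process
print(cost_per_good_die(576, 0.50))  # ~$82 per good die
print(cost_per_good_die(412, 0.40))  # ~$73 per good die, despite the worse yield
```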
 
Die shrinks allow clock speeds to be increased. Look at the G80 -> G92 shrink: the G80 vs. G92 GTS, or the 9800 GTX vs. GTX+. Both gained serious clock speeds. Lower thermal loads also allow the chips to be pushed harder with the same cooler design.

Die shrinks actually tend to hurt the yield in terms of good chips per chips made. However, the number of chips per wafer increases more than the yield percentage decreases, making the cost per chip go down.

That is true to a point, but you're still left with the problem of actual distance. Don't get me wrong, it will help, but I don't see anything close to what you were talking about. I do expect that other improvements will come. But I am thinking that AMD has the right strategy here: make smaller, scalable chips. I think Nvidia could do this now, and I don't understand why they don't.

Yields = number of good chips per wafer. At least that is how I was using it.
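On the "too big" point: an ideal optical shrink scales die area by the square of the node ratio. A sketch, assuming GT200's roughly 576mm² die at 65nm (real shrinks rarely hit the ideal figure):

```python
# Ideal optical shrink: linear dimensions scale by (new node / old node),
# so area scales by the square of that ratio. Real shrinks fall short of this.
GT200_AREA = 576  # mm^2 at 65nm, approximate published figure

scale = (55 / 65) ** 2     # ~0.716
print(GT200_AREA * scale)  # ~412 mm^2: smaller, but still a very big die
```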
 
That is true to a point, but you're still left with the problem of actual distance. Don't get me wrong, it will help, but I don't see anything close to what you were talking about. I do expect that other improvements will come. But I am thinking that AMD has the right strategy here: make smaller, scalable chips. I think Nvidia could do this now, and I don't understand why they don't.

Yields = number of good chips per wafer. At least that is how I was using it.

I think we could see a 700MHz stock version. It is very much in line with what we have seen from the 9800 GTX to GTX+ line. Now, it is possible we could see 750MHz. There was much talk that they were unhappy with the clock speeds they got out of the 65nm process; they may have fixed the limiting problems, who knows.

The 900MHz dream above was talking about OCing binned parts that are used on water-cooled cards. There is no way the stock version will be that fast.
 
if you think a shrink of the g200 is going to save nvidia you're a moron

because the 9800gtx+ was such a big improvement over the 9800gtx amirite
 
if you think a shrink of the g200 is going to save nvidia you're a moron

because the 9800gtx+ was such a big improvement over the 9800gtx amirite

Ok, there you go again with the personal attacks. As the moderators are quick to point out, it's a free country; however, these forums aren't. Behave! :)
 
The last ATI card I've owned was a 9800 Pro. No ATI fanboy here. I just think nVidia deserves a kick in the nuts for slacking off all this time.

I don't think anyone here disagrees, I know I don't. nVidia needed the heat turned up on them. That's good for them and for consumers as we have seen these last few weeks.

However, this is just one generation, and I think nVidia will respond in a positive way. What will be interesting is whether AMD can be consistent. That's not something they are renowned for. Plus, as the prices have come down on the GTXs, they are a hell of a lot more competitive than they were at release. Now it's AMD's turn to respond with the 4870x2.

I just saw this review: http://www.driverheaven.net/reviews.php?reviewid=586&pageid=1

Not exactly sure what these cards are going for, but at the same time this isn't bad performance compared to 4870 CF. A shrink to 55nm, getting the clock speeds on this puppy up, and getting the cost down could make this a really nice part.
 
I think we could see a 700MHz stock version. It is very much in line with what we have seen from the 9800 GTX to GTX+ line. Now, it is possible we could see 750MHz. There was much talk that they were unhappy with the clock speeds they got out of the 65nm process; they may have fixed the limiting problems, who knows.

The 900MHz dream above was talking about OCing binned parts that are used on water-cooled cards. There is no way the stock version will be that fast.

There are other improvements that could be made. For real, to take the speed up like that it would have to go to 45nm or so. But still, even a good jump in clock speeds, memory improvements, and the all-important driver improvements could make a world of difference here. Then again, what do I know? I went with Socket 939 instead of AM2.
 
After the 90nm to 65nm shrink (G80 -> G92), a 65nm to 55nm shrink seems a bit weak for major improvements.
Sure, it's better than nothing, but I wonder how much they can improve the big-die, power-hungry GT200s?
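That intuition holds up if you compare the ideal area-scaling factors of the two steps (real shrinks achieve less than this):

```python
# Ideal area-scaling factor for each node transition: (new / old) squared.
for old, new in ((90, 65), (65, 55)):
    print(f"{old}nm -> {new}nm: area x{(new / old) ** 2:.2f}")
# 90nm -> 65nm: area x0.52  (nearly half the silicon)
# 65nm -> 55nm: area x0.72  (a much smaller step)
```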
 
After the 90nm to 65nm shrink (G80 -> G92), a 65nm to 55nm shrink seems a bit weak for major improvements.
Sure, it's better than nothing, but I wonder how much they can improve the big-die, power-hungry GT200s?

Really, the 280 isn't that power hungry for what it does. Watt for watt, it is amazing when compared to, say, 2x 8800GTXs, which are pretty close to its equal. I think the biggest thing is people just aren't used to seeing that much power on a single card.

The die shrink will help it some; however, I expect the power draw to stay the same, or go up as they increase the speed and maybe add a volt mod on the 55nm version.
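On the watt-for-watt point, a rough comparison using commonly cited board-power figures (these TDPs are approximate, and actual draw depends on load):

```python
# Approximate board-power (TDP) figures; real draw under load varies.
GTX_280_TDP = 236  # watts, commonly cited figure
X8800GTX_TDP = 155  # watts each, commonly cited figure

sli_draw = 2 * X8800GTX_TDP
print(sli_draw)                # 310W for the 8800GTX SLI pair
print(GTX_280_TDP / sli_draw)  # ~0.76: similar performance at ~3/4 the power
```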
 
To everyone who is screaming that Nvidia is going to die and that this will do nothing: find another thread to do it in. This thread is about tech, not business. This thread has already seen posts deleted for starting crap; I'd hate to see anyone get banned over not being able to control themselves.
 
We knew about this before the GTX 200 based cards were out. It will obviously reduce production costs, reduce power draw, and allow better overclocks.
I don't remember NVIDIA ever launching their high-end on a fab process they hadn't used before, so this was expected by us, and by them no doubt. I'm not sure that they ever had plans to release a GX2-like card, but if they do, it will be exclusively to take the performance crown back, which should shift from them to ATI once the HD 4870 X2 hits the market.
 
Judging by the small performance leap from the 9800 GTX to the 9800 GTX+, I wouldn't expect much of a performance boost... What it will do is lower costs, which is always a great thing.

It may also make a GTX 280 X2 possible...
 
if you think a shrink of the g200 is going to save nvidia you're a moron

because the 9800gtx+ was such a big improvement over the 9800gtx amirite

What the hell are you talking about? :confused: nVidia does not need saving. AMD/ATI finally gets its shit together and creates a pretty decent line of cards, and now that = nVidia needs saving? You fear, uncertainty, and doom slingers need to go play in the traffic or something..... :rolleyes:
 
Competition makes R&D worth it for both companies when the profits are coming in, whether you're first or second in the market. If the GPU market turns into a battle with each new card trumping the last, it will be a good time not only for enthusiasts, but for the mainstream, too.

ATI has done well and that's why we can hope for better from Nvidia.
 
Judging by the small performance leap from the 9800 GTX to the 9800 GTX+, I wouldn't expect much of a performance boost... What it will do is lower costs, which is always a great thing.

It may also make a GTX 280 X2 possible...

I wouldn't call the difference between 600 and 700MHz small. It's not "next gen" big, but it makes a major move on price/performance.
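Worked out, that bump is a bit over 16%, which is the same figure quoted in the next post:

```python
# A 600MHz -> 700MHz core bump as a percentage gain.
print((700 - 600) / 600 * 100)  # ~16.7%, i.e. the "16% boost" cited below
```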
 
These days with GX2s, X2s, Tri-SLI, and quad crossfire, it's rather hard for anything to hit the market and produce numbers that are "off the charts".

I'll gladly take a 16% boost, and probably more OC headroom with volt mods, though.
 
These days with GX2s, X2s, Tri-SLI, and quad crossfire, it's rather hard for anything to hit the market and produce numbers that are "off the charts".

I'll gladly take a 16% boost, and probably more OC headroom with volt mods, though.

Well, if the reports saying the 4870x2 with 2GB of GDDR5 is ripping the GTX 280 at a 50 to 80% clip across the board are true... a 16% boost is kinda like pissing in the wind.

Has the GT300 core hit the prototype stage yet?
 
http://www.fudzilla.com/index.php?option=com_content&task=view&id=8391&Itemid=34

According to this, if it is true, the 55nm cards will be out as soon as September. I have to question just how much better the 55nm cards will perform over the current cards. Obviously we have no way to know right now. If all that happens is what we saw with the 9800 GTX cards (the 9800 GTX+ is a 9800 GTX with higher clocks), then I have no problem with that. However, I am planning on buying a GTX 260 FTW card next week.

I don't know what to do. Do I gamble and buy the card, hoping that the overclocked FTW card works out to the same performance as the new cards at stock? If the new cards happen to be a huge improvement in performance and my Step-Up period has expired, I will be beyond mad (and will probably become an ATI fanboy at that point lol).
 