GTX 480 PCB pictures

Wow, that's a first. It's so hot it actually needs to pull air from BOTH sides of the fan. Pic looks believable, so that says more about its power consumption than any of the rumours I've seen to date! :eek:
 
What's up with blocking out the PCIe connector?
I'd guess it's so an accurate measurement of the chip can't be made using the connector as a scale reference.
 
I'd guess it's so an accurate measurement of the chip can't be made using the connector as a scale reference.

Well, that wouldn't make much sense. You can still see the full width of the connector, and there are plenty of other (and better) things you could use to scale it.
 
Reminds me a lot of what the 8800GTX looks like with its cooler removed.
 
Maybe I missed the picture. Which one shows the length of the PCI-E connector so it can be used to scale the chip/card?
 
It doesn't matter, you can clearly see a two-pin fan connector! Scale away! Who knows why they blocked the PCI Express connector. Probably because they have no idea WTF they're doing?
 
Yes, there are known components that will be used to try to guess the exact size of the chip.
Watch enough pre-launches and the pictures being posted will show you how much free time many people have, racing to be the first with a measurement.

I'd guess they were instructed by nVidia to hide it.


P.S. Over at B3D, they're suggesting the blackout is as simple as hiding a logo or ID number.
 
What you're seeing is the heatspreader, not the core itself. The core is big, yes, but not THAT big :p
 
^ Heh, you would think people could tell the difference after all this time. I mean, we've only had heatspreaders on GPU cores since 2006...
 
It doesn't matter, you can clearly see a two-pin fan connector! Scale away! Who knows why they blocked the PCI Express connector. Probably because they have no idea WTF they're doing?

Probably because the PCIe connector and the chip have identifier numbers on them; don't want to give away the leak source now, do we?

Same reason the cooler has the sticker blocked out.
 
I thought the GTX 480 was going to use Volterra VRMs.

The cool thing is that the chip's IHS appears to be smaller than that of the GTX 285 (or G200 B3). Hmmm... this is definitely getting very interesting, because tons of rumours were putting the core size in 470-500 mm^2 territory, and judging by the IHS it's obviously way smaller than that.

Dare I say... it's about the same size as the G80 IHS.

Oh shiii..........
 
The reason a lot of stuff is obscured is that nVidia knows exactly which partner's PCB looks like what.

This is to protect the source of the photos and the person at the partner company who leaked them.

It said that on all the news sites that hosted the pics.
 
^ Heh, you would think people could tell the difference after all this time. I mean, we've only had heatspreaders on GPU cores since 2006...

Don't forget that the GeForce FX GPUs also had them, back in 2003-2004.
 
Maybe I missed the picture. Which one shows the length of the PCI-E connector so it can be used to scale the chip/card?

Middle pic. You can see the shadow of the connector pretty clearly under the blacked-out area.

But you've still got the slot bracket, the SLI connectors, and the PCI-E power connectors, which are standard sizes and easy to measure against to scale things.

Not like it matters that much since the chip has a heatspreader on it anyway.
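
If anyone actually wants to scale off a photo, it's just a ratio against a part of known size. A quick Python sketch -- the pixel counts are made-up placeholders you'd measure yourself in an image editor, and the 89 mm figure is the commonly quoted length of a PCIe x16 edge connector:

[code]
# Estimate a feature's real size from a photo by cross-multiplying
# against a reference part of known physical length.
REF_MM = 89.0  # PCIe x16 edge connector, roughly 89 mm per the spec

def real_size_mm(feature_px, ref_px, ref_mm=REF_MM):
    """Scale a pixel measurement by the known reference length."""
    return feature_px * ref_mm / ref_px

# Hypothetical pixel measurements off the leaked pics:
ref_px = 712.0  # connector length in pixels
ihs_px = 340.0  # heatspreader edge in pixels

print("IHS edge ~ %.1f mm" % real_size_mm(ihs_px, ref_px))
# And remember: that's the heatspreader, not the die underneath.
[/code]

The same trick works with the bracket or a power connector; whichever reference sits most parallel to the card face gives the least perspective error.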
 
Wow, that's a first. It's so hot it actually needs to pull air from BOTH sides of the fan. Pic looks believable, so that says more about its power consumption than any of the rumours I've seen to date! :eek:
Probably explains part of the reason why NVIDIA has been missing in action, and also implies that unless NVIDIA's engineers get serious about optimizing their design to get the same performance output for less heat, less power, and smaller process nodes and board sizes -- they'll be out of the gaming-graphics industry in no time.

Look at ATI. For the same bang, the ATI cards use less power and generate less heat per nm, can handle more heat per nm, and, as a consumer plus -- they're cheaper! Efficiency is the key, something ATI is doing way better than NVIDIA.

Instead of making their designs and architectures more efficient, NVIDIA is just pumping in more power (and thus more heat) and upping voltages. Great game plan![/sarcasm]

FYI: No, I'm not an ATI fanboy. I buy the card that's the best bang for the buck (aka the most cost-effective).
 
Wow, that's a first. It's so hot it actually needs to pull air from BOTH sides of the fan. Pic looks believable, so that says more about its power consumption than any of the rumours I've seen to date! :eek:

A few of NVIDIA's past cards had the opening in the PCB to allow better airflow, but usually only the high-end cards. The 7 series, I think, and the 8 series had it for sure -- though on the 8 series I think only the GTX?
 
I like the venting / air duct on the PCB... that's how more hot-running cards should be, IMO. More spots to get air into that fan.
 
Instead of making their designs and architectures more efficient, NVIDIA is just pumping in more power (and thus more heat) and upping voltages. Great game plan![/sarcasm]

You do realize the GTX 480 (250 W) draws less power than a GTX 295 (289 W), right? It's also quite likely to be faster than the 295.

Well, would you look at that. Less power draw and more performance at the same time. ;)
 
I have every confidence that EVGA will develop a GF100 board that displays to 3 monitors.
 
You do realize the GTX 480 (250 W) draws less power than a GTX 295 (289 W), right? It's also quite likely to be faster than the 295.

Well, would you look at that. Less power draw and more performance at the same time. ;)
Aaaaand compared to its ATI equivalent?
 
Well, looks like no real change in layout since the 8800 GTX, and I see they still use that crappy clay-type heatsink paste.
 
Probably explains part of the reason why NVIDIA has been missing in action, and also implies that unless NVIDIA's engineers get serious about optimizing their design to get the same performance output for less heat, less power, and smaller process nodes and board sizes -- they'll be out of the gaming-graphics industry in no time.

Look at ATI. For the same bang, the ATI cards use less power and generate less heat per nm, can handle more heat per nm, and, as a consumer plus -- they're cheaper! Efficiency is the key, something ATI is doing way better than NVIDIA.

Instead of making their designs and architectures more efficient, NVIDIA is just pumping in more power (and thus more heat) and upping voltages. Great game plan![/sarcasm]

FYI: No, I'm not an ATI fanboy. I buy the card that's the best bang for the buck (aka the most cost-effective).

Well, that's nothing new. AMD have always undercut most of nVidia's and Intel's prices; this goes way back to when AMD used to have the rights to make chips to fit Intel's slots and sockets.
 
Aaaaand compared to its ATI equivalent?

The HD 5870 uses 188 W, while the HD 5970 uses 295 W. That puts the GTX 480 almost in the middle at 250 W.

So if it performs between the 5870 and the 5970, it fits its power draw just fine.
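
For the curious, "almost in the middle" is easy to put a number on. A quick sketch using only the board-power figures quoted in this thread:

[code]
# Where the GTX 480's quoted 250 W sits between the two ATI boards.
hd5870_w, gtx480_w, hd5970_w = 188.0, 250.0, 295.0

fraction = (gtx480_w - hd5870_w) / (hd5970_w - hd5870_w)
print("GTX 480 sits %.0f%% of the way from HD 5870 to HD 5970"
      % (fraction * 100))
# ~58%, i.e. slightly past the midpoint of the range
[/code]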
 
A few of NVIDIA's past cards had the opening in the PCB to allow better airflow, but usually only the high-end cards. The 7 series, I think, and the 8 series had it for sure -- though on the 8 series I think only the GTX?

The GTX 295 had it as well; the sandwich-board design did, anyway.
 
The HD 5870 uses 188 W, while the HD 5970 uses 295 W. That puts the GTX 480 almost in the middle at 250 W.

So if it performs between the 5870 and the 5970, it fits its power draw just fine.

Except that the 5970 (and the 295, for that matter) have two GPUs.

The fair comparison for the GTX 480 is the 5870 and/or the 280/285, from a heat/power-consumption standpoint. Two GPUs will of course bias the results.

Compare a GPU to another GPU, not to two of them.
 
Nice, m8! Those are good pics! Everyone saying this may not even be Fermi, just photoshopped old cards -- hence the black mark (not sure why they blacked that out) -- just got owned.
 
The GTX 280 was going to run a lot cooler than the GTX 295?
Is that correct?
 
well thats nothing new amd have always undercut most of nvida and intel for prices this goes way back to when amd used to have the rights to make chips to fits the intel slots and sockets

They have? I remember buying some damned expensive Athlon 64 X2s back in 2005.

Back on topic... screw pictures, I want to see some testing. :D
 
Except that the 5970 (and the 295, for that matter) have two GPUs.

The fair comparison for the GTX 480 is the 5870 and/or the 280/285, from a heat/power-consumption standpoint. Two GPUs will of course bias the results.

Compare a GPU to another GPU, not to two of them.

If it all fits in a single PCIe slot, then it's a fair comparison, to me at least.

Just because one company can do it with 2 GPUs and another with 1, for example, doesn't mean they shouldn't be compared.

Also, I look at the price; if they both sell for, say, $499, why not compare them?
 
Exactly. If the price is close, they'll be compared on which offers the best performance for the price.
 

ATI HD5870 Solo - 188 W
NV GTX480 Solo - 250 W
ATI HD5970 Dual - 295 W
NV GTX295 Dual - ??? W

What else should I add to this? I think it's interesting. Just looking at the first two, the GTX 480 uses more power per nm (is that correct to say?). Not sure how much the 295 pulls, so... *shrug*

EDIT: I am looking at efficiency here, with price not being a factor at the moment. You could incorrectly assume that each GPU on the HD 5970 pulls ~147.5 W, which is quite a bit better than the one GPU on the GTX 480 -- but then you would have to factor in whether or not the HD 5970 performs similarly to or better than the GTX 480. Otherwise this would be meaningless LOL.

EDIT2:
Exactly. If the price is close, they'll be compared on which offers the best performance for the price.
True true. Not sure how to factor it in, though -- my brain hurts. o_O I'm sure there's a proper way to evaluate the overall efficiency of these cards.
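
One common way to fold it into a single number is performance per watt. A toy sketch -- the board powers are the figures quoted in this thread, but the fps values are pure placeholders, not real benchmarks:

[code]
# Toy perf-per-watt comparison. Only the board powers come from this
# thread; the fps values are hypothetical stand-ins for benchmarks.
cards = {
    "HD 5870": {"tdp_w": 188.0, "fps": 60.0},  # fps is made up
    "GTX 480": {"tdp_w": 250.0, "fps": 70.0},  # fps is made up
    "HD 5970": {"tdp_w": 295.0, "fps": 90.0},  # fps is made up
}

for name, c in sorted(cards.items(),
                      key=lambda kv: kv[1]["fps"] / kv[1]["tdp_w"],
                      reverse=True):
    print("%s: %.3f fps/W" % (name, c["fps"] / c["tdp_w"]))

# Dividing a dual-GPU board's power by two (295 / 2 = 147.5 W) only
# makes sense if you also halve its performance, which is exactly the
# trap the EDIT above points out.
[/code]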
 