Videocardz: NVIDIA GeForce RTX 3080 pictured? (now that's different)

Looks like JPEG compression artifacts to me, but you can see what appears to be the gold fingers:

[attached image]


enhance.gif
 
those are the "groves" the fingers are supposed to go in.

Fingers don't go in grooves (or groves); they are etched from the copper on the PCB and plated.

The absurdities that occur when people can't admit to making a mistake.

We have a low-quality image of the PCB edge connector, and if you zoom in you can see some of the fingers.

But sure, rationalize whatever nonsense you can come up with to avoid admitting a mistake, and stick with your original claim.
 
Fingers don't go in grooves (or groves); they are etched from the copper on the PCB and plated.

The absurdities that occur when people can't admit to making a mistake.

We have a low-quality image of the PCB edge connector, and if you zoom in you can see some of the fingers.

But sure, rationalize whatever nonsense you can come up with to avoid admitting a mistake, and stick with your original claim.
blah blah blah, guess we'll see if that's what they look like when they're actually announced
 
I'm gonna call BS on this. The fans don't look right. They designed special fans for the 20xx series cards. Don't see any reason they would abandon them.
 
those are the "groves" the fingers are supposed to go in. the tab is one consistent color all the way across. combine that with the other irregularities, I'm saying fake.
Agree, the blurring used to mask the bad color fills...
 
I'm looking at this and wondering about the power connectors. There's no confirmation in the pictures (and no visible connectors, either), but the renders seem to like the idea that the power sockets are on the rear of the heatsink, meaning wires are running through the sink and behind the fan to the PCB. While "neat," this just doesn't seem like the best-engineered design possible vs. a GTX 670-like option.

Or these could just be fake PCBs designed to test the fitment of the fansink prototypes...
 
So either Nvidia has gotten a LOT better at making smaller PCBs or they are just saying screw it? I mean, maybe there was a lot of junk space on older PCBs just to hold spacing and shrouds?

Also exactly how much heat is this recirculating back into the case anyway?
Maybe they went with HBM, as a half-length card would require, like the old Fury Nano and the rest...
 
Love the idea. If it's real, can't wait to see what the thermal performance is like.
 
I'm looking at this and wondering about the power connectors. There's no confirmation in the pictures (and no visible connectors, either), but the renders seem to like the idea that the power sockets are on the rear of the heatsink, meaning wires are running through the sink and behind the fan to the PCB. While "neat," this just doesn't seem like the best-engineered design possible vs. a GTX 670-like option.

Or these could just be fake PCBs designed to test the fitment of the fansink prototypes...
That was my first thought as well when people said half-sized card; don't you want the power connectors directly on the board? But then I was thinking: why would the flagship product be a half-sized card? You want to spread everything out so you don't localize heat too much; half-sized cards come later with the 3060-type models.


Conclusion: there is no way this is a half-sized card, and if it's real, that isn't a push-pull design; it is blowing air on both halves.
 
That was my first thought as well when people said half-sized card; don't you want the power connectors directly on the board? But then I was thinking: why would the flagship product be a half-sized card? You want to spread everything out so you don't localize heat too much; half-sized cards come later with the 3060-type models.


Conclusion: there is no way this is a half-sized card, and if it's real, that isn't a push-pull design; it is blowing air on both halves.

I can't entirely agree with your conclusion. There are two notable counterexamples of high-end cards on small footprints: the Radeon Fury (granted, HBM was used, reducing board real estate) and my aforementioned GTX 670, which used the same GK104 parent chip as the GTX 680 but on a smaller board. If the power circuitry needs can be cut down, why couldn't the board be made smaller?

Also, if there's a fan on the back side, wouldn't that make it a 2-piece board to accommodate the offset? Or is there a big hole in the PCB?
 
That was my first thought as well when people said half-sized card; don't you want the power connectors directly on the board? But then I was thinking: why would the flagship product be a half-sized card? You want to spread everything out so you don't localize heat too much; half-sized cards come later with the 3060-type models.


Conclusion: there is no way this is a half-sized card, and if it's real, that isn't a push-pull design; it is blowing air on both halves.

But the 3080 won't be the flagship card. This is considered mid-tier/high-end. The flagship card will be the 3080 Ti, just like it was for the 900, 1000, and 2000 series.

Nvidia released the GTX 980, GTX 1080, and RTX 2080 first (the 2080 Ti was only 7 days later).
 
But the 3080 won't be the flagship card. This is considered mid-tier/high-end. The flagship card will be the 3080 Ti, just like it was for the 900, 1000, and 2000 series.
This is almost correct. Based on my sources at a Chinese textile mill plus a father high up at Nintendo, Ampere will break down as follows:

3080 - Mid-tier
3080 Ti - Mid-High
3080 Te (Tear Extractor™) - self explanatory
 
Half-sized card? That thing is still like a brick! Remember what cards used to look like? They were like half an inch thick and a third shorter. Now they take up half the case and are like 3 inches thick!
Cards back then didn't require auxiliary power, only drawing 30-40W from the slot. The card that introduced us to the dust buster, the FX 5800 Ultra, was a 70W card. Top-end cards these days use 250-300W.
 
Well what happened?! CPUs are not following the same trend, aye? Bollocks!!

Someone noticed that relative performance could skyrocket if you added a bunch of power and cooling, and then multiple copies of units (analogous to "cores" today, but not really the same), which also increased power. The idea first became normalized in the consumer market around the Pentium 4 era (even if it wasn't actually true at that time), but CPU/GPU power requirements had been steadily increasing over time, along with performance. CPUs followed the same increasing power trend as GPUs, and have almost never stopped. Consumer GPUs seem to at least have hit a "250W at stock is enough" plateau, but CPUs keep going up.

Before Pentium 4 (50-130W and they meant it), the top Pentium III was a 35W cpu. The hottest OG Pentium was 17W. 486s with heatsinks were a new and rare concept, and they ran 4-7W. MOS 6502: <1W. Core 2 was a remarkable return to efficiency, but even with the "2% performance gain for every 1% increase in power" policy, look where Intel is, now: double TDP at least to run at stock all-core turbo. AMD's 280W Threadrippers are very well-received (even after the 220W FX 9590 debacle), and even they followed a similar path.

I could write a several-page paper expounding on this. But I won't.
 
Well thank you, Sir. You seem very educated on this subject.

What about size, though? CPUs have stayed relatively the same size vs. 20 years ago, but look what happened to GPUs! Morbidly obese!! Are we looking at bigger units in the next 5 years, with dedicated power supplies to keep them running?
If you include the memory slots, VRMs, and heat dissipation surface areas that today’s CPUs require, you’ll see that they easily rival GPUs.
 
I was talking about the actual consumed areas on the motherboards. Half the area on today's HEDT boards is easily consumed by the CPU socket, the VRMs and their heatsinks, and the 8 memory slots, vs. maybe up to 1/3 of a high-end 939 or 478 board of 2004. A heatsink that's just barely adequate for an 18-core HEDT chip dwarfs an outstanding one for an Athlon FX-51 or a Pentium 4 Extreme Edition.

And the 6600 GT was a mid-range card. The 6800 Ultra was very nearly the size of today's high-end cards.

Then there are the CGA cards of the early '80s.
[attached image]

Or VLB cards of the '90s:
[attached image]
 
Thank you, sir. I have a 9800 GT, which is arguably longer than a mid-tier video card, but when compared to the mid-tier GTX 460, the latter is twice as thick and requires two power plugs vs. one on the 9800 GT.

When taking into consideration more recent GPUs, only the most lowly brethren sport a single-slot configuration...

Taco has feeling we are heading in the wrong direction.
The more heat an electronic circuit generates, the quicker you need to dissipate it before it burns itself to death. You increase the rate of dissipation in basically two primary ways: pushing it through a heat-conductive material with as large a surface area as possible, preferably with the aid of air pressure from a fan or fans, or by exchanging the heat with another medium such as water. In either case you are taking up a certain volume of space within a PC environment, and the former is easier and cheaper to manage. Unfortunately, that means you can't get effective cooling in a single-slot solution, since you are severely limited in the surface area of the heatsink you can use.

Aside from the core and memory, you also have the yards of tiny wire intricately woven through the layers of the PCB. This, too, needs cooling. The way this is done is by strategically placing tiny holes throughout the PCB that heat can be dissipated through. Again, this takes up surface area to be effective. There is a technical name for the holes in a PCB, but I can't remember what it is...

More performance = more power needed = better cooling capacity required. You can't beat physics.
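
To put rough numbers on that argument, here is a minimal sketch of the steady-state math: temperature rise is roughly heat load times the total die-to-air thermal resistance, so more watts demand a lower thermal resistance, which in air cooling means a bigger heatsink and/or more airflow. The thermal resistance and power figures below are assumptions for illustration, not measurements of any real cooler.

```python
# Minimal sketch: steady-state temperature rise ~= power * thermal resistance.
# The theta values (degrees C per watt) are assumed for illustration only.

def die_temperature(power_w: float, theta_c_per_w: float, ambient_c: float = 30.0) -> float:
    """Approximate die temperature for a given heat load and die-to-air thermal resistance."""
    return ambient_c + power_w * theta_c_per_w

SMALL_COOLER = 0.30   # hypothetical single-slot cooler, C/W
LARGE_COOLER = 0.15   # hypothetical triple-slot open-air cooler, C/W

for power in (70, 160, 250, 320):
    print(f"{power:3d} W -> small cooler ~{die_temperature(power, SMALL_COOLER):.0f} C, "
          f"large cooler ~{die_temperature(power, LARGE_COOLER):.0f} C")
```

With the assumed numbers, a 70 W card is fine under either cooler, while a 250-320 W card pushed through the small cooler lands far above safe die temperatures, which is the whole reason single-slot high-end cards disappeared.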
 
GPU performance has increased far, far more than chip size or power consumption since the early 2000s. You guys are just spoiled.
 
More performance = more power needed = better cooling capacity required. You can't beat physics.
That's what's bothering Taco.

I think there's still a slight disconnect between the cooling we use vs. the cooling we need. In the physical sense, Armenius is dead on, but in the marketing/consumer application, something else is going on. I'm not claiming 100% certainty here (because I do not work for a tech company), but I think the trend to bigger and badder coolers isn't necessarily driven by increased power, but rather by decreasing noise for a better user experience. Cards haven't changed that much in power draw at the top end, yet the coolers keep growing.

The blower fan is probably the most effective solution, in that it works very well for a single fan and removes heat directly from the case, yet it is now rare and relegated to "cost saving" models: the noise sucks. The bigger a cooler is, the more heat it can dissipate. Putting a 250W-capacity cooler on a 250W card (or CPU, for that matter) means it is working at 100% to accomplish that, with all the noise that goes with it. Put a 500W-capacity cooler on a 250W card, and now you can turn down the fans and the cooling system is much less noticeable to the end user (this is why watercooling with giant radiators is so awesome). Now, why the 2-slot, 3-fan cooler on my OC'd GTX 980 Ti can keep the card cool at 100% OpenCL load at only 50% fan speed (and not loud at all), while still doing a better job than the 3-slot cooler with three bigger fans on my non-OC'd 1080 Ti, makes me wonder.
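
To show why the oversized cooler gets so much quieter, here is a minimal sketch using the usual fan affinity rule of thumb: airflow scales roughly linearly with fan speed, while radiated sound power rises roughly with the fifth power of speed (about 50·log10 of the speed ratio in dB). The specific speed fractions below are assumptions for illustration, not a measured fan curve from any card.

```python
# Rule-of-thumb fan affinity laws (idealized):
#   airflow      ~ fan speed (roughly linear)
#   noise change ~ 50 * log10(speed ratio) dB (sound power ~ speed^5)
import math

def noise_delta_db(speed_ratio: float) -> float:
    """Approximate change in fan noise (dB) when fan speed is scaled by speed_ratio."""
    return 50.0 * math.log10(speed_ratio)

# If an oversized cooler can hold the same temperature at a fraction of full fan
# speed (the fractions below are assumed, not measured), the noise drop is large:
for ratio in (1.0, 0.8, 0.6, 0.5):
    print(f"fan at {ratio:.0%} speed -> {noise_delta_db(ratio):+.1f} dB vs. full speed")
```

Under that rule of thumb, dropping to 60% fan speed cuts roughly 11 dB and halving speed cuts roughly 15 dB, which is why a cooler with lots of headroom sounds dramatically quieter even though the card dissipates the same wattage.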

Linking this back to the 3080 topic, it is my sincere hope that if this really is the new cooler, the radical design accomplishes great cooling and low noise, because Grebby needs a new GPU and his expansion slots back.
 
Thank you, sir. I have a 9800 GT, which is arguably longer than a mid-tier video card, but when compared to the mid-tier GTX 460, the latter is twice as thick and requires two power plugs vs. one on the 9800 GT.

Edit: photos for comparison.
[attached images]

When taking into consideration more recent GPUs, only the most lowly brethren sport a single-slot configuration...

Taco has feeling we are heading in the wrong direction.

The difference in size between those cards has a lot to do with the big leap in efficiency that Nvidia made from G80 to G92, not to mention that the single-slot coolers on the 8800 GT/9800 GT were completely inadequate (loudest GPUs I've ever owned).

The 9800 GT was basically a cheaper, more efficient 8800 Ultra/GTX, which both had dual-slot coolers. Perhaps more importantly, it was followed in the midrange by the GTX 260, which was dual-slot with a 181W TDP, vs. the 460's 160W TDP. I think that shows that while power usage has gone up over time, it's also cyclical (as long as we keep getting process shrinks).
 
You can always just water cool it to get some of that space back, but then you've got to have room for radiators; either way, it's a matter of where you want stuff to take up space.
 
You can always just water cool it to get some of that space back, but then you've got to have room for radiators; either way, it's a matter of where you want stuff to take up space.
You might need a block on both sides, or two blocks; I'd be interested to see if the GPU is on one side and the VRMs and RAM are on the other side of the board. I would think very fast GDDR6 will be producing a lot of heat.
 
Or VLB cards of the '90s:
[attached image]

I am a broken record on this, but the GPU form factor could really use a radical shift.

The x86 PC general-purpose card cage came into being with the first IBM PC; there was never any expectation that it would need to consume/cool 200W+ loads. Even in the 1990s you can see the VLB graphics cards didn't even have passive heatsinks, so they were consuming insignificant power.

Now high-end GPUs consume hundreds of watts, and in most gaming PCs they consume much more than the CPU. Yet the CPU has a vastly superior airflow and cooling path.

In a PC tower, the CPU has near-ideal airflow: intake fan in front, in line with the tower cooler, and in line with the exhaust fan at the back.

OTOH, GPU cooling is a mess. Being based on vertical card slots, it doesn't really matter if it takes more slots; they are limited to flat blow-down designs, which are quite inferior to tower coolers. And there is no decent cooling path. There might be a front intake fan, but it might also be partially obstructed by a PSU shroud, and even if it isn't, airflow is not that good, because there is no outlet at the back to allow hot air to escape. Blower cards try to make their own exhaust path, but they don't really have enough throughput to cool high-end GPUs, so their main benefit is ventilating cramped cases for better CPU thermals. So instead we have multiple-fan, open-air, blow-down coolers that dump heat back in the case, making the best of a bad situation.

Ideally the GPU needs a cooling solution and air path more like the CPU's, with dedicated intake and exhaust and tower cooling in between them.

Though it will never happen because we tend to get stuck with standards of the past, no matter how ill-suited they are for today's needs.

/rant


Initially this odd cooler design looked interesting, as there was some thought that it could be a push-pull design, which could yield some nice GPU cooling results (though it would still be dumping heat in the case), but it seems evident now that this isn't push-pull, just more of a pointless gimmick, so no progress in any real way.
 
I am a broken record on this, but the GPU form factor could really use a radical shift.

The x86 PC general-purpose card cage came into being with the first IBM PC; there was never any expectation that it would need to consume/cool 200W+ loads. Even in the 1990s you can see the VLB graphics cards didn't even have passive heatsinks, so they were consuming insignificant power.

Now high-end GPUs consume hundreds of watts, and in most gaming PCs they consume much more than the CPU. Yet the CPU has a vastly superior airflow and cooling path.

In a PC tower, the CPU has near-ideal airflow: intake fan in front, in line with the tower cooler, and in line with the exhaust fan at the back.

OTOH, GPU cooling is a mess. Being based on vertical card slots, it doesn't really matter if it takes more slots; they are limited to flat blow-down designs, which are quite inferior to tower coolers. And there is no decent cooling path. There might be a front intake fan, but it might also be partially obstructed by a PSU shroud, and even if it isn't, airflow is not that good, because there is no outlet at the back to allow hot air to escape. Blower cards try to make their own exhaust path, but they don't really have enough throughput to cool high-end GPUs, so their main benefit is ventilating cramped cases for better CPU thermals. So instead we have multiple-fan, open-air, blow-down coolers that dump heat back in the case, making the best of a bad situation.

Ideally the GPU needs a cooling solution and air path more like the CPU's, with dedicated intake and exhaust and tower cooling in between them.

Though it will never happen because we tend to get stuck with standards of the past, no matter how ill-suited they are for today's needs.

/rant


Initially this odd cooler design looked interesting, as there was some thought that it could be a push-pull design, which could yield some nice GPU cooling results (though it would still be dumping heat in the case), but it seems evident now that this isn't push-pull, just more of a pointless gimmick, so no progress in any real way.
Power is also provided by the big ass motherboard you have to set a CPU in. Memory is also modular for the CPU. A GPU also needs all that stuff.

So what would you suggest in how to do it better? You could potentially take the GPU out of the case, but you still need to provide it power. The further away the GPU is from the interface the more latency you introduce. We could make it completely modular like the CPU, but that would potentially double the space needed inside your PC case, making the internal temperature argument moot.
 
Power is also provided by the big ass motherboard you have to set a CPU in. Memory is also modular for the CPU. A GPU also needs all that stuff.

So what would you suggest in how to do it better? You could potentially take the GPU out of the case, but you still need to provide it power. The further away the GPU is from the interface the more latency you introduce. We could make it completely modular like the CPU, but that would potentially double the space needed inside your PC case, making the internal temperature argument moot.
He's talking about reorienting it, not moving it to another state... Like a PCIe extension riser that lets it sit horizontal instead of vertical, so you can put a CPU-style heatsink on it instead of making cards 2-3 slots thick and still cramped. Lay it flat and put a 60+ mm tall heatsink on it with a fan (or two) blowing right through it.
 
He's talking about reorienting it, not moving it to another state... Like a PCIe extension riser that lets it sit horizontal instead of vertical, so you can put a CPU-style heatsink on it instead of making cards 2-3 slots thick and still cramped. Lay it flat and put a 60+ mm tall heatsink on it with a fan (or two) blowing right through it.
A picture of how that is supposed to work would be great.
 
Ever seen a mini-ITX case with a 90-degree PCIe adapter so the GPU can lay flat to keep the height down? Like that, except instead of keeping the height down, install a full-size cooler and keep it tall. If you have seen them, I'm sure I can track down a picture somewhere. Obviously some imagination is still required, as nobody actually makes it the way he's describing.

Edit:
https://www.tweaktown.com/reviews/6932/azza-mini-itx-gaming-chassis-review/index.html
This is one of the cases I have at the house. If you look at the photo gallery you can see the GPU laying flat. If designed properly, and in a regular-height case, you could slap a nice tall cooler on it. Obviously the orientation isn't quite right on it for this to work; it would have to be designed to do it.
 
Ever seen a mini-ITX case with a 90-degree PCIe adapter so the GPU can lay flat to keep the height down? Like that, except instead of keeping the height down, install a full-size cooler and keep it tall. If you have seen them, I'm sure I can track down a picture somewhere. Obviously some imagination is still required, as nobody actually makes it the way he's describing.

Edit:
https://www.tweaktown.com/reviews/6932/azza-mini-itx-gaming-chassis-review/index.html
This is one of the cases I have at the house. If you look at the photo gallery you can see the GPU laying flat. If designed properly, and in a regular-height case, you could slap a nice tall cooler on it. Obviously the orientation isn't quite right on it for this to work; it would have to be designed to do it.
Okay, I can picture it now. And it looks ridiculous. I can't imagine it really being practical.
 
This is almost correct. Based on my sources at a Chinese textile mill plus a father high up at Nintendo, Ampere will break down as follows:

3080 - Mid-tier
3080 Ti - Mid-High
3080 Te (Tear Extractor™) - self explanatory

Don’t forget the 3080 Super Te.
 
A picture of how that is supposed to work would be great.

Start with an ITX motherboard, but put a PCIe edge connector on the opposite side of the board, so it lies flat with the cooler side up; then you can use tall coolers.

With that, you can connect a flat-lying GPU with tower cooling into it. The goal isn't looking pretty, it's improved cooling. An extreme example, but a tower flow-through design is better for cooling:
[attached image: TowerGPU.jpg]
 
The GPU needs to be plugged directly into the motherboard, flat-side down, with a new interface. No more PCIe. The entire card needs to act as a daughterboard and look like one too, and be placed directly over or into the CPU, allowing for lowest latency, as well as an easier time watercooling due to the proximity bonus when routing pipes. Also, watercooling is now mandatory due to lack of airflow.
 
That's a great picture. Yeah, an edge connector would be simplest and allow it to lay flat. Could even have standardized mounting in the case so it can't flex as much.
 