Brackle
Old Timer
- Joined
- Jun 19, 2003
- Messages
- 8,566
I don't know about anyone else, but I am shocked that the 3090 is a dual-slot design. Even with the leaf blower I wouldn't think it can keep it cool enough.
Crazy
Probably gonna end up being known as the "Throttle Edition"
Same. I didn't think there would be any blower-style 2-slot designs. I wonder how well it performs, because that's really interesting for an SFF build.
Yeah, I had planned to build an SFF PC for the OLED TV man-cave area I am building, but kind of gave up once I saw the 3090 specs being triple slot. Might have to change my mind now... very interesting
I put mine in my basement and ran extension HDMI cables. That way it's completely silent, I leave the side panel off, and it's mATX so there are no SFF "fees".
Hmmm, do they even make long, quality HDMI 2.1 cords? Sounds like a good idea!
Gigabyte GeForce RTX™ 3090 TURBO 24G
https://www.gigabyte.com/Graphics-Card/GV-N3090TURBO-24GD#kf
View attachment 280480
Gen 3 vs Gen 4: it only shows gaming applications and highlights the 1-2% gain. However, it seems 3DMark shows that a bandwidth difference exists...
Did you actually watch the whole video? They show productivity benches near the end and there is literally no difference. They also said that the 1-2% differences in gaming were within margin of error. The take-away is that currently there is no difference between Gen 3 and Gen 4 for performance.
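For what it's worth, the bandwidth gap that a synthetic test like 3DMark picks up can be sketched straight from the PCIe spec numbers (8 GT/s vs 16 GT/s per lane, 128b/130b line coding). This is just spec math, not anything measured in the video:

```python
# Theoretical one-direction bandwidth of a PCIe x16 link, from the
# published per-lane transfer rate and line-code overhead.
def pcie_x16_gbps(transfer_rate_gt: float, encoding: float) -> float:
    """Theoretical x16 bandwidth in GB/s (GT/s * coding efficiency / 8 bits, times 16 lanes)."""
    lanes = 16
    return transfer_rate_gt * encoding / 8 * lanes

gen3 = pcie_x16_gbps(8.0, 128 / 130)   # PCIe 3.0: 8 GT/s, 128b/130b encoding
gen4 = pcie_x16_gbps(16.0, 128 / 130)  # PCIe 4.0: 16 GT/s, same encoding

print(f"Gen 3 x16: {gen3:.2f} GB/s")  # ~15.75 GB/s
print(f"Gen 4 x16: {gen4:.2f} GB/s")  # ~31.51 GB/s
```

So the raw link bandwidth really does double; current games just don't saturate a Gen 3 x16 link, which is why the gaming numbers land within margin of error.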
Adding on to my "why buy small cases" post a few days ago. Exhibit A:
https://i.redd.it/sjgy75hlgzn51.jpg
I'm waiting for reviews of the Asus ROG STRIX and OC versions; there are none so far.
Starting to look like the ASUS TUF is the best overall 3080.
Yeah, my card isn't set to ship until Sep 30th at the earliest. The flagship cards are going to be delayed a little bit; the AIBs didn't have enough time to redesign everything for 400W cooling. 300W vs. 400W is a substantial uptick in heat management.
Saw a board analysis of it and they really focused on getting the most out of the cooler while not cutting too many corners on the power delivery.
That's the board I'm thinking of buying. If there are no physical differences I might even go for the non-OC version.
I'm looking at the non-OC version; the board and cooler are the same.
Imo it doesn't matter how good the cooling is on the card if they gimp the power limit. Make sure you find out the power limit if you plan to overclock a card.
How much power do you want your GPU to pull out of the wall? 400W? I think when it comes to efficiency Ampere is a joke. Yes it is 65% faster than a 2080, but we have seen much more impressive generational leaps before.
The jump we got with the GTX 1080 was also 63% compared to the GTX 980, but it had the same power draw as the GTX 980.
This time it is like 65% more performance with 80% more power draw, if you compare the 2080 with the 3080. I bet this is because Nvidia couldn't get a place in line at TSMC or they didn't want to pay the price that TSMC wanted to manufacture the GPUs. Is there any info on if Nvidia is going back to TSMC in the future or will they stay with Samsung?
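Running the numbers from that post as rough perf-per-watt arithmetic (the 1.63/1.65/1.80 scaling factors are just the percentages quoted above, nothing independently measured):

```python
# Relative efficiency change between generations: perf scaling divided
# by power scaling, minus one. Positive means better perf/W, negative worse.
def perf_per_watt_gain(perf_scale: float, power_scale: float) -> float:
    """>0 means perf/W improved, <0 means it regressed."""
    return perf_scale / power_scale - 1.0

pascal = perf_per_watt_gain(1.63, 1.00)  # GTX 980 -> GTX 1080: +63% perf at the same power
ampere = perf_per_watt_gain(1.65, 1.80)  # RTX 2080 -> RTX 3080: +65% perf, +80% power (per the post)

print(f"980 -> 1080 perf/W: {pascal:+.0%}")   # prints +63%
print(f"2080 -> 3080 perf/W: {ampere:+.1%}")  # prints -8.3%
```

By those figures Pascal improved perf/W by the full 63%, while the 3080 would actually be slightly *worse* per watt than the 2080, which is the efficiency complaint in a nutshell.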
Would be interesting to see how much power they would draw...
Their data-center top-of-the-line Ampere part, the A100, is on TSMC 7nm. Those customers demand high efficiency, so there is that.
I know what you are saying. If you are going to pay a $150-200 premium for better cooling and overclocking and find out you're limited to 375W, then what's the point? I mean, this is the [H] forum. You are damn right I want to pull as much power as I want to overclock. That's why you pay a premium.
Do you guys think I would have a problem using one of these on one of the two PCIe cables from my PSU (650W) to power a 3-plug EVGA FTW 3080? Not planning to OC or anything; I'll probably undervolt if possible... Just don't want to have to replace my PSU.
View attachment 280826
What is your PSU?
Corsair 650
Best practice is 1 to 1.
Not many PSUs have three 8-pin outputs though; that seems to be the main issue. So you generally have to split one of the cables.
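A rough sketch of why one pigtailed cable is usually still within spec at stock: the 150W-per-8-pin and 75W-slot figures below are the PCIe connector ratings, the 320W board power is the reference 3080 spec (an AIB FTW-style card may be higher), and the even split across plugs is a simplifying assumption that ignores transient spikes:

```python
# Power-budget sketch for feeding two of a card's 8-pin inputs from one
# PSU cable via a pigtail. Connector ratings are the PCIe spec figures;
# the even split across plugs is an assumption, not a measurement.
PIN8_RATING_W = 150  # rated draw per 8-pin PCIe power connector
SLOT_POWER_W = 75    # power the card can pull through the slot itself

def cable_load(board_power_w: float, plugs_on_card: int, plugs_on_cable: int) -> float:
    """Estimated draw on one cable, assuming an even split across all plugs."""
    from_connectors = board_power_w - SLOT_POWER_W  # remainder after slot power
    per_plug = from_connectors / plugs_on_card
    return per_plug * plugs_on_cable

# 3-plug card at 320W board power, one pigtailed cable feeding 2 plugs:
per_plug = cable_load(320, plugs_on_card=3, plugs_on_cable=1)
shared = cable_load(320, plugs_on_card=3, plugs_on_cable=2)
print(f"~{per_plug:.0f}W per plug (rating {PIN8_RATING_W}W), ~{shared:.0f}W on the shared cable")
```

By this estimate each connector sits well under its 150W rating at stock, but the pigtailed cable itself carries the combined load, which is why 1-to-1 stays the safer habit once overclocking or transient spikes enter the picture.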