carrierPigeon
Limp Gawd
Joined: Sep 22, 2017
Messages: 162
Background: I was about to upgrade the power supply in an old computer from 250W to 450W. The reason is that I installed a new video card a couple of months ago to test something; it solved the problem, and I had forgotten how weak the power supply was. According to a review site, the GPU draws 65W at idle and 130W at full load. Interestingly, everything seems to be working perfectly fine without the new power supply. Since the video card is PCIe 3.0 x8 and the motherboard slot is PCIe 2.0 x16, perhaps that limits its performance and power consumption.
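For context, here's the back-of-envelope headroom math I'm going on. The GPU numbers are from the review mentioned above; the CPU and "rest of system" figures are placeholders I'm assuming for illustration, not measured values:

```python
# Rough full-load system draw vs. PSU capacity.
gpu_load_w = 130          # full-load GPU draw, per the review
cpu_load_w = 65           # assumed CPU draw (placeholder, not measured)
rest_of_system_w = 50     # assumed drives, fans, motherboard (placeholder)

total_w = gpu_load_w + cpu_load_w + rest_of_system_w
psu_w = 250

print(total_w, psu_w, total_w <= psu_w)  # 245 250 True
```

With those guesses the 250W unit is just barely enough, which would be consistent with the system running fine so far.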
Anyway, the graphics card manufacturer gives no information about how many watts the card uses, only a recommendation for what size power supply you should have. That seems strange to me; is it typical?
Comparing the two units, the 250W supply actually has a higher rating on the 5V rail (around 14A, versus 10A on the 450W). So it's possible the 250W unit is actually better suited for the job.
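The per-rail comparison is just volts times amps; a quick sketch using only the amp figures quoted above (the labels' ratings, which are approximate):

```python
# Per-rail capacity in watts: watts = volts * amps.
def rail_watts(volts, amps):
    return volts * amps

psu_250w_5v = rail_watts(5.0, 14.0)  # 5V rail on the 250W unit
psu_450w_5v = rail_watts(5.0, 10.0)  # 5V rail on the 450W unit

print(psu_250w_5v)  # 70.0
print(psu_450w_5v)  # 50.0
```

So the smaller supply offers roughly 20W more on the 5V rail, even though its total rating is lower.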
I also looked at the motherboard manual, and I could find no data as to what conversions the motherboard is doing (if it's converting any of the voltages from the 20/24 pin connector to something else).
Is there a way to figure out the voltage rail/rails that our hardware actually uses and what conversions our motherboards are doing? Or do you just have to go insanely overboard on the power supply, and waste money on electricity?