Any pros to AIO-cooled GPUs anymore?

Napoleon

[H]ard|Gawd
Joined
Jan 27, 2003
Messages
1,073
With the OC'ing capacity of the GTX 10-series not appearing to be limited much by air vs. water cooling, is there any point in AIO or non-AIO water cooling anymore?

My CPU is currently under water and I don't plan on plumbing a GPU into the current loop.

I imagine installing an AIO or aftermarket cooler kills the warranty; is this incorrect?

I had myself convinced that a 'hybrid' type stock card was 'for me' (I may have had a crush on the 980 Ti Hybrids...), but after the many reviews I'm not sure it's 'worth' it, price-to-performance wise. It appears 2 GHz is common and even 2.1 GHz is attainable.

I was thinking the AIO/hybrid would theoretically have better boost performance, thermals, longevity, and OC headroom.

I understand that when you feel the need for speed then value per $ takes a back seat; I just enjoy getting a good price/performance deal.

Thanks for your thoughts and opinions!
 
is there any point in AIO or non-AIO water cooling anymore?

I like it for the quietness. That alone matters enough to me. I had my 570s on AIOs, then my 780, and now my 980 Ti. The 570s and the 780 were markedly cooler and quieter, of course.

I imagine installing an AIO or aftermarket cooler kills the warranty; is this incorrect?

If you're careful, it doesn't hurt anything, and you can reinstall the stock cooler for any warranty issues. It would be best to double-check with the manufacturer, though, since YMMV; some brands actually put warranty-voiding stickers on the heatsink screws these days.

I had myself convinced that a 'hybrid' type stock card was 'for me' (I may have had a crush on the 980 Ti Hybrids...), but after the many reviews I'm not sure it's 'worth' it, price-to-performance wise. It appears 2 GHz is common and even 2.1 GHz is attainable.

I have the 980 Ti Hybrid. It probably wasn't worth the expense versus buying a plain card and doing it myself, but it's still really nice.
 
It matters on 250 W+ cards like the Titan X. A CLC cooler on a reference 10.5" board fits far more cases than a giant triple-slot, triple-fan 12"+ card does, especially now that smaller cases are all the rage.

Obviously, if you are running a full-size tower on an open bench or with a side fan, a big triple-slot, triple-fan cooler (with a custom fan curve) will be comparable in performance and noise.
 
Warranty-wise, it depends on the manufacturer and where you live. Some manufacturers put warranty-void stickers on and explicitly state they won't warranty the card if the sticker is broken. Some countries, however, have laws that require the manufacturer to warranty the card even if the sticker is broken. Some manufacturers don't use stickers, so they wouldn't be able to tell if you had AIO CLC cooling at one point, and some do use stickers but warranty the card anyway as long as nothing is physically damaged. Basically, as I said, it depends on where you live and which manufacturer you buy from.

As far as whether it makes sense, it depends on the card's wattage and your tolerance for noise. I have a very low tolerance for noise, and I found that high-wattage cards like the R9 290 just cannot be cooled quietly enough for me with a regular cooler; they need to be under water. In fact, I did put my R9 290 under water when I had it; that was the only way I could make it quiet enough. I have an RX 480 now, which is a low-wattage card, and I find that a quality cooler like the one on the MSI Gaming X is quiet enough (practically silent) that I just don't want to bother with an AIO CLC. If I were to ditch my RX 480 and buy a high-wattage card, I'd probably put it under an AIO CLC again.
 
Heat and noise, especially if you have multiple GPUs. SLI cards on air can definitely warm up a room during a long session.
 
I'm thinking of the Alphacool expandable system: the full block on the GPU with a pump that can be added to their expandable CPU line. But that's next month :p
 
Heat and noise, especially if you have multiple GPUs. SLI cards on air can definitely warm up a room during a long session.

Uh what?

It doesn't matter whether the cards are cooled by air or water; they're still dissipating the same amount of heat into the room.

An air-cooled 980 Ti will heat up a room just as much as a water-cooled 980 Ti at the same clocks.
 
Uh what?

It doesn't matter whether the cards are cooled by air or water; they're still dissipating the same amount of heat into the room.

An air-cooled 980 Ti will heat up a room just as much as a water-cooled 980 Ti at the same clocks.

No, they are not. Water conducts heat better than air, hence the better temperatures with this method. I noticed that, as a result of switching my CPUs and GPUs over to water, my room warmed MUCH more slowly and the final temperature was several degrees lower. I measured this using a portable thermometer.
 
No, they are not. Water conducts heat better than air, hence the better temperatures with this method. I noticed that, as a result of switching my CPUs and GPUs over to water, my room warmed MUCH more slowly and the final temperature was several degrees lower. I measured this using a portable thermometer.

You are wrong. Just because the CPU and GPU are cooler under water doesn't mean they're not dissipating the same amount of heat. As you stated, water just moves the heat away from the CPU and GPU better, so you see better temperatures on those chips. Your room will still heat up exactly the same. Period.
 
No, they are not. Water conducts heat better than air, hence the better temperatures with this method. I noticed that, as a result of switching my CPUs and GPUs over to water, my room warmed MUCH more slowly and the final temperature was several degrees lower. I measured this using a portable thermometer.

Yeah, the water moves heat away from the CPU/GPU faster, but it doesn't magically disappear. You're just relocating the heat more efficiently than an air cooler does.
 
I never stated the heat magically disappeared. I simply stated that water is more conductive, which is true, and that my room heats up much more quickly, and to a higher final temperature, with an air-cooling solution versus a water-cooling one.

In your heated response to "win" on the internets, you completely missed what I was saying. I stated my experience, as this was actually one of the many curiosities that motivated me to try watercooling.
 
Water is not conductive. The minerals in water are what make it conductive.

Additionally, the room heats up the same with water cooling as with air cooling.

I missed nothing. You're just wrong.

I think you should let this thread get back on track. Water is the better cooling solution. My room did not heat up the same as with air. I measured it at the same ambient temperature, under the same load, with air cooling and with water cooling.

I'm not wrong about anything.
 
It has a lot to do with this thread. The OP doesn't need to buy an AIO GPU expecting his room to heat up less. His room follows the laws of physics, unlike your fairy-land room.

I really think you should conduct this test yourself!

Anyway, I think AIO GPU water-cooling solutions are absolutely worth the cost, but mainly on the more power-hungry cards that put out more heat. It buys peace of mind that the card won't throttle (e.g. a 1080 Ti), and it may allow for more overclocking headroom.
 
I've got five 980 Ti AIO cards and two 980 Ti air-cooled cards. It has nothing to do with testing; it's basic physics, man. The cards are still drawing the same watts. If AIO cards gave off less heat, they would use less power. They do not.
 
I love the internet, threads like this are gold. Fuckin' thermodynamics man, how does it work?
 
You are wrong. Just because the CPU and GPU are cooler under water doesn't mean they're not dissipating the same amount of heat. As you stated, water just moves the heat away from the CPU and GPU better, so you see better temperatures on those chips. Your room will still heat up exactly the same. Period.

In general, you are correct. However, in reality the better-cooled card will operate at lower temperatures and therefore be more efficient (fewer electrical losses), so it'll give off slightly less heat. It's fairly negligible, though. For example, a card at 100% load operating at 75 C may draw 250 W, while the same card at 100% load cooled to 45 C may draw only 240 W.

In reality, you might also OC that 45 C card a bit more and end up with the same power draw anyway.
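For a rough sense of scale, here's a quick back-of-envelope in Python. The 250 W / 240 W figures are just the illustrative numbers from above, not measurements of any real card:

```python
# Back-of-envelope: extra heat from the hotter (leakier) card over a session.
# The wattages and session length are illustrative assumptions, not data.
p_hot = 250.0   # watts drawn at ~75 C
p_cool = 240.0  # watts drawn at ~45 C
hours = 3.0     # length of a gaming session

extra_joules = (p_hot - p_cool) * hours * 3600  # 10 W sustained for 3 hours
print(f"Extra heat into the room: {extra_joules / 1000:.0f} kJ ({extra_joules / 3600:.0f} Wh)")
# 10 W over 3 hours comes to 108 kJ (30 Wh), roughly one small night-light.
```

So even granting the leakage effect, the difference in room heating over a whole session is tiny.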
 
Water is not conductive. The minerals in water are what make it conductive.

Additionally, the room heats up the same with water cooling as with air cooling.

I missed nothing. You're just wrong.

I think you should let this thread get back on track. Water is the better cooling solution. My room did not heat up the same as with air. I measured it at the same ambient temperature, under the same load, with air cooling and with water cooling.

I'm not wrong about anything.

At the very least, your statements do not tell the whole truth.

The amount of power consumed, and thus the heat generated, by a graphics card is a fixed quantity within spec for a given workload. When we talk about cooling, what we are really talking about is heat dissipation. Water coolers do this much more quickly and efficiently than air coolers do, for multiple reasons: flow rate, surface area, etc. are all part of why water cooling is more effective and results in lower GPU temperatures.

Statements about room temperature are anecdotes, not scientific evidence. Since the amount of heat generated by the GPU is more or less fixed, what's exhausted into the room should be the same amount of heat either way. What changes is how that heat is removed from the GPU. Air coolers simply blow it off the aluminum or copper heat sinks and exhaust it. This typically creates hot spots around the system that water cooling may not, due to more even dispersion of heat out of the system through the radiator's surface area. A portable thermometer is again a potential data point, but it's suspect: ambient temperature, time of day, atmospheric conditions such as humidity, the precise system workload, and any number of other variables that are difficult to account for could easily alter the context of such data. Effectively, the statement is virtually meaningless without precise context.

More effective and consistent heat dissipation into the environment may impact ambient temperatures to a lesser degree than some hot-ass corner of the room does. I don't know; that part of the discussion is really outside my wheelhouse.
 
Heat and noise, especially if you have multiple GPUs. SLI cards on air can definitely warm up a room during a long session.

It doesn't matter if they are on air or water; they are still going to warm up the room the same amount. It just means all the heat is being dumped out of your loop's radiator instead of inside the case (or out the back, for blower types). You think water coolers are magic? =P
 
I ran a pair of GTX 480s with Corsair H60s zip-tied to them and it worked great.

In general, you are correct. However, in reality the better-cooled card will operate at lower temperatures and therefore be more efficient (fewer electrical losses), so it'll give off slightly less heat. It's fairly negligible, though. For example, a card at 100% load operating at 75 C may draw 250 W, while the same card at 100% load cooled to 45 C may draw only 240 W.

In reality, you might also OC that 45 C card a bit more and end up with the same power draw anyway.

I had one of those original GTX Titans that would throttle like crazy due to power draw without a modded BIOS. A water block made a huge difference; that card never throttled with a water block on. It was a full-cover block, so I'm sure it helped the RAM and VRMs run a little more efficiently as well.
 
For me, less noise and more stable clocks are big pluses. I find that outside of MSI cards (which, funnily enough, happen to be the best recipients for AIO coolers due to the separate VRAM plate and VRM heatsink), the stock coolers on many cards are too loud under load. Then you have the FE cards, which seem to throttle more. While air coolers are fine for the xx70/xx80 series, the Ti models should come with AIO water coolers built in.

I can see myself picking up an MSI 1080 Ti or Volta card and slapping on the Corsair H55 + NZXT G10 bracket that's on my MSI 980 Ti at the moment.
 
I really think you should conduct this test yourself!

Anyway, I think AIO GPU water-cooling solutions are absolutely worth the cost, but mainly on the more power-hungry cards that put out more heat. It buys peace of mind that the card won't throttle (e.g. a 1080 Ti), and it may allow for more overclocking headroom.

There's no need for anyone to conduct this "test" themselves, because of how physics works. If your hardware is generating X amount of energy (heat) that is being dissipated via a heatsink into the air of a room, it's going to dissipate the same amount of energy passing air through a radiator. The only things you can change are how quickly heat is pulled from the source (which only affects the maximum temperature of that source: the CPU, GPU, VRMs, etc., which, by the way, have a minuscule surface area compared to any heatsink, radiator, or room), and where that heat is dumped, either by exhausting it somewhere other than the open air of the room or by putting the radiator elsewhere (doable, but impractical for most people at home).

Your claim that the room gets warmer or cooler when you have the same source (again, the GPU, CPU, etc.) dumping the same amount of heat into the same room is absolutely silly.
 
I used AIO coolers on my cards. It was worth it when I ran SLI. I have a triple-slot air cooler on a single card now, and temps are as good as or better than they were with my AIO coolers.
 
I like an AIO on my GPU so I can have more fans on it spinning slower for less noise. As mentioned above, it will also help with more than one video card.

 
There's no need for anyone to conduct this "test" themselves, because of how physics works. If your hardware is generating X amount of energy (heat) that is being dissipated via a heatsink into the air of a room, it's going to dissipate the same amount of energy passing air through a radiator. The only things you can change are how quickly heat is pulled from the source (which only affects the maximum temperature of that source: the CPU, GPU, VRMs, etc., which, by the way, have a minuscule surface area compared to any heatsink, radiator, or room), and where that heat is dumped, either by exhausting it somewhere other than the open air of the room or by putting the radiator elsewhere (doable, but impractical for most people at home).

Your claim that the room gets warmer or cooler when you have the same source (again, the GPU, CPU, etc.) dumping the same amount of heat into the same room is absolutely silly.
Fun fact:
A more efficient AIO cooler would put MORE heat into the air, since less of it will be left on the CPU. ;)
 
In general, you are correct. However, in reality the better-cooled card will operate at lower temperatures and therefore be more efficient (fewer electrical losses), so it'll give off slightly less heat. It's fairly negligible, though. For example, a card at 100% load operating at 75 C may draw 250 W, while the same card at 100% load cooled to 45 C may draw only 240 W.

In reality, you might also OC that 45 C card a bit more and end up with the same power draw anyway.

I see what you're saying, but a card that runs hotter will most likely hit some sort of thermal throttle and thus use less energy while it brings itself back down to cooler temps. That limit is less likely to be reached by an AIO card, so it'll be able to run 100% loads for longer.
 
With the OC'ing capacity of the GTX 10-series not appearing to be limited much by air vs. water cooling, is there any point in AIO or non-AIO water cooling anymore?

My CPU is currently under water and I don't plan on plumbing a GPU into the current loop.

I imagine installing an AIO or aftermarket cooler kills the warranty; is this incorrect?

I had myself convinced that a 'hybrid' type stock card was 'for me' (I may have had a crush on the 980 Ti Hybrids...), but after the many reviews I'm not sure it's 'worth' it, price-to-performance wise. It appears 2 GHz is common and even 2.1 GHz is attainable.

I was thinking the AIO/hybrid would theoretically have better boost performance, thermals, longevity, and OC headroom.

I understand that when you feel the need for speed then value per $ takes a back seat; I just enjoy getting a good price/performance deal.

Thanks for your thoughts and opinions!
Maybe the bigger AIB air heatsinks are different, but my Founders Edition 1080 definitely sees a benefit from water cooling. With the stock cooler, it will reliably maintain maybe 1850-1900 MHz, whereas with a full-cover block it stays at 2088-2100 MHz for hours on end. That's a ten-ish percent increase in clock speed, to say nothing of the lower power consumption in general thanks to the lower temperatures.

I'm not a fan of AIOs in general, though. The ones I've actually used did not produce temperatures much lower than a similarly priced metal heatsink, but they came with the obvious disadvantage of a finite lifespan.
 
I like AIO setups in small cases because it's much easier to direct all the hot air from the processor (CPU or GPU) straight out of the case, giving you overall cooler temps inside.
 
I see what you're saying, but a card that runs hotter will most likely hit some sort of thermal throttle and thus use less energy while it brings itself back down to cooler temps. That limit is less likely to be reached by an AIO card, so it'll be able to run 100% loads for longer.

Possibly, possibly not. To tell for sure, you'd have to have good enough air cooling to keep the device below the thermal-throttling limit for an entire run. Such coolers certainly exist for both CPUs and GPUs.

There's no question that a hotter processor will consume more power for the same tasks, but the effect is pretty minor in a real-world piece of silicon. That, plus power/temperature throttling in newer processors, makes this extremely hard to measure reliably.
 
Fun fact:
A more efficient AIO cooler would put MORE heat into the air, since less of it will be left on the CPU. ;)
Except you're still dissipating the same amount of heat into the room; otherwise you'd have a CPU whose temperature increased forever. It simply boils down to how quickly you can get the heat off the die, and how much builds up on its tiny surface area.

On the flip side, you could have the worst AIO on the planet, lubricated with sand and generating a ton of heat of its own... but I think it's a safe assumption that this guy's water-cooling rig was "reasonable".
 
In general, you are correct. However, in reality the better-cooled card will operate at lower temperatures and therefore be more efficient (fewer electrical losses), so it'll give off slightly less heat. It's fairly negligible, though. For example, a card at 100% load operating at 75 C may draw 250 W, while the same card at 100% load cooled to 45 C may draw only 240 W.

In reality, you might also OC that 45 C card a bit more and end up with the same power draw anyway.

Of course, when you are breaking it down to this level, you also need to factor in the water pump using energy and creating its own heat :eek:
 
My average GPU clock speed went from ~1920MHz on my Titan XP to around ~2070MHz once I put it under water, along with being substantially quieter.

Worth it, IMO.
 
The biggest pro to any AIO cooling unit - GPU or otherwise:

You can control exactly where the heat is exhausted to.

It's still the same amount of heat coming off the GPU, so the amount your room heats up isn't going to change appreciably.

You ~may~ get better temps, but I wouldn't guarantee it - water is an excellent transfer medium, but you are still dumping the heat to the air, so your final temp will be a function of water block efficiency, radiator surface area, and air moving through the radiator.

The best benefit: you can mount the radiator wherever you please and direct the airflow through it so you are exhausting directly outside the case, or intaking directly from an air source cooler than the inside of the case (or both). Either way is much better than most open-type air coolers, which suck air in from the case and exhaust a good portion of that hot air back inside, raising the temperature of everything in your case and requiring more case ventilation.
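That point about final temps being a function of block efficiency, radiator area, and airflow can be sketched as a series thermal-resistance chain. All the numbers below are made-up placeholders for illustration, not specs for any real block or radiator:

```python
# Series thermal-resistance sketch: die -> water block -> coolant -> radiator -> room air.
# All R values (kelvin per watt) and the heat load are invented placeholders.
p_gpu = 250.0      # heat load in watts
t_ambient = 25.0   # room air temperature in C

r_radiator = 0.10  # coolant-to-air resistance (drops with more area / faster fans)
r_block = 0.05     # die-to-coolant resistance of the water block

t_coolant = t_ambient + p_gpu * r_radiator  # steady-state coolant temperature
t_gpu = t_coolant + p_gpu * r_block         # steady-state die temperature

print(f"Coolant: {t_coolant:.1f} C, GPU die: {t_gpu:.1f} C")
# A better block, bigger radiator, or faster fans shrink the R values and
# lower the die temperature, but the full 250 W still flows into the room.
```

Swapping in a smaller `r_radiator` (bigger rad, more airflow) drops both temperatures, which is exactly why the final temp depends on the whole chain and not just on "water vs. air".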
 
Water is not conductive. The minerals in water are what make it conductive.
You are confusing conducting heat with conducting electricity. Water has a very high heat capacity, and it absolutely does conduct heat. You can test this by comparing deionized water with tap water.
 
*sigh*

A few notes:

Everyone arguing about whether water-cooled and air-cooled setups dump the same amount of heat into the room is both right and wrong. Assuming identical thermal conductivity, the statement is true.

Personally, I think it's a safe assumption that an aftermarket cooler, regardless of type, is going to have superior thermal characteristics versus a mass-produced OEM heatsink. In that case, the aftermarket cooler should be pulling MORE heat from the emitter and transferring this increased heat load into the same environment, creating a slightly hotter environment (dependent, as already noted, on individual variables not accounted for).

That said, a water-cooling loop (as atmartens alluded to above) has a significantly higher heat capacity than the entire box of air in the computer case. Seriously, water has an impressive ability to carry thermal energy.

So what does that mean for our overall system? Well, that water is holding a significant portion of the thermal energy the emitter is generating. If it were not so, your water would stay the same temperature at all times (as all heat picked up would be completely ejected at the radiator), and anyone can tell you that is clearly not the case. We haven't built some 100%+ energy-transfer system with a water-cooling loop (even with a Peltier block, energy is added to facilitate removal of the thermal energy, and that removal is itself less than 100% effective).

In essence, what you have is a large quantity of the room's heat being held in that water loop, which shows up as slower heating of the room versus air cooling. This is further helped by the fact that most air-cooled cards push air at significantly greater velocities than the fans in a water loop need to (which also accounts for the water loop's better acoustics), thereby helping in their own little way to transfer the heat into the environment more quickly (as they are, after all, designed to do!).

Another way of looking at it: the water loop acts as a giant-assed heatsink. Said heatsink takes a lot longer to heat up and transfer that thermal energy into the environment, making the environment seem cooler. The heat just isn't up in your face :)
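For scale, here's a rough Python estimate of how much heat a loop's thermal mass can actually hold. The coolant volume and temperature rise are assumed round numbers, not measurements of any particular loop:

```python
# How much heat can a water loop "hold" before it saturates?
# Assumed round numbers, not specs of any particular loop.
coolant_kg = 1.0        # roughly 1 litre of water; most AIOs hold much less
specific_heat = 4186.0  # J/(kg*K), specific heat capacity of water
delta_t = 15.0          # K the coolant warms from idle to loaded equilibrium
gpu_watts = 250.0       # steady heat load from the card

stored_joules = coolant_kg * specific_heat * delta_t
buffer_seconds = stored_joules / gpu_watts

print(f"Loop soaks up ~{stored_joules / 1000:.0f} kJ, about {buffer_seconds / 60:.0f} minutes of GPU output")
```

Under those assumptions the loop buffers only a few minutes' worth of a 250 W card's output, so the "giant heatsink" effect is real but short-lived: over a long session, essentially all the heat still ends up in the room.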
 
Pretty sure the First Law of Thermodynamics says that if you pump 250 W into something, you're going to get 250 W worth of whatever out the back end.** It doesn't matter how many steps, or what type of heatsink, happen to be in between in the overall process.

Heat capacity is irrelevant, because no matter what you use, in a closed system it's eventually going to saturate, and you're going to end up with an end-to-end process where the electricity consumed by your electronics goes into the atmosphere (either in your room, or outside your home if you want to count your home's HVAC as well), which is your ultimate heat sink. Even the quality of the heat sink is irrelevant: yes, it will affect your temperature, but as your temperature goes up, the efficiency of the heat transfer goes up, and lo and behold, it will all meet in an equilibrium that does not violate the First Law, and power in will equal power out.

**(Unless you happen to be creating or destroying matter, in which case you are probably doing it wrong.)
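That equilibrium argument can be sketched with a crude lumped model using Newton's law of cooling. The transfer coefficients below are arbitrary illustrative values, not properties of any real cooler:

```python
# Crude lumped model: heat out of the cooler is k * (T - T_ambient), where k
# stands in for heatsink quality (watts per kelvin). Values are arbitrary.
def equilibrium_temp(p_in, k, t_ambient=25.0):
    # At steady state, heat out equals heat in: k * (T - t_ambient) = p_in.
    return t_ambient + p_in / k

p_in = 250.0                # watts pumped into the chip
for k in (2.0, 5.0, 10.0):  # worse -> better heatsink
    t_eq = equilibrium_temp(p_in, k)
    heat_out = k * (t_eq - 25.0)
    print(f"k = {k:4.1f} W/K -> T_eq = {t_eq:6.1f} C, heat into room = {heat_out:.0f} W")
# A better heatsink only lowers the equilibrium temperature of the chip;
# the steady-state heat flowing into the room is 250 W in every case.
```

The cooler only changes how hot the chip has to get before outflow matches inflow; the power delivered to the room is the same either way.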
 