Nvidia GTX 1080 & 1070 EVGA Cards Reportedly Catching Fire & Dying Due To VRMs Overheating

It is somewhat confusing, but it seems from that video that they are now overcompensating by ensuring the VRM area gets an ample amount of clamping force. The pad over the chokes is larger than just the chokes and covers the VRM area too. This seems to suggest that the FETs always had thermal pads, but they were thin and not providing enough pressure. That video bugs me, that guy is touching everything and leaving his skin oils all over the place lol.

smh, it's strange that they didn't cast some fins into the cooling plate. I suppose they thought that a simple thin plate was enough lol.

What clamping force?

JayzTwoCents and GamersNexus both have videos where they apparently follow EVGA's instructions, and they put the pad on top of the chokes, with nothing touching it on the other side. They then put one on the back of the card between the PCB and backplate. I guess that one accomplishes something, but the one that touches the chokes looks like it's a placebo meant to pacify totally ignorant n00bs.
 
That's only if they intentionally lied and tried to deceive, like Nvidia did with the 970 4GB. This is a fiasco alright, but I'm not seeing intent to deceive here.
EVGA has issued at least two statements about this now, and they sound increasingly desperate, downplaying the issue as much as possible.
We've now learned that they even flubbed the VRAM pads. It just keeps getting worse.

#wheresvega
 
What clamping force?

JayzTwoCents and GamersNexus both have videos where they apparently follow EVGA's instructions, and they put the pad on top of the chokes, with nothing touching it on the other side. They then put one on the back of the card between the PCB and backplate. I guess that one accomplishes something, but the one that touches the chokes looks like it's a placebo meant to pacify totally ignorant n00bs.

The cooling plate screws into the backplate; this creates the clamping force.
 
EVGA has issued at least two statements about this now, and they sound increasingly desperate, downplaying the issue as much as possible.
We've now learned that they even flubbed the VRAM pads. It just keeps getting worse.

#wheresvega

Maybe... I dunno, this just seems like a lot of stupid things that happened that shouldn't have. Maybe I'm just more tolerant of it, being a longtime AMD GPU owner lmao?
 
The cooling plate screws into the backplate; this creates the clamping force.
Yeah, but they're putting the pad on the wrong side of the cooling plate. Watch the video. They never remove the front cooling plate from the card at all, and just drape the pad over the chokes. It literally does not touch anything on one side.
 
Yeah, but they're putting the pad on the wrong side of the cooling plate. Watch the video. They never remove the front cooling plate from the card at all, and just drape the pad over the chokes. It literally does not touch anything on one side.

Yea, true. I would have liked to see underneath that cooling plate too, and some more careful examination of the height tolerances. But that doesn't change that they raised the clamping force by putting the pad between the PCB and backplate. Think of it like an EK backplate: the backplate, through its thermal pads, pushes into the back of the PCB, adding to the clamping force.
 
An alternative route EVGA could have taken was to simply ignore the issue and RMA the cards that end up with problems. That wouldn't be the first time an AIB did that.
 
Yeah, but they're putting the pad on the wrong side of the cooling plate. Watch the video. They never remove the front cooling plate from the card at all, and just drape the pad over the chokes. It literally does not touch anything on one side.
So, I asked the "Actually Hardcore Overclocking" dude about this.

According to him, there's already a thermal pad under the plate, and the purpose of the one they put on the front is so that the main GPU core heatsink will get smushed into it, and use the heatsink fins for additional cooling of the FETs.
 
So, I asked the "Actually Hardcore Overclocking" dude about this.

According to him, there's already a thermal pad under the plate, and the purpose of the one they put on the front is so that the main GPU core heatsink will get smushed into it, and use the heatsink fins for additional cooling of the FETs.

That is a bad idea imo. I've seen scratched dies due to lopsided or unbalanced forces on the cooler. If we use it like that, the cooler becomes one bigass lever, and the fulcrum is the core. smh...

Personally I don't think they need the pads on top of the chokes, as that is somewhat redundant. They are already improving clamping force by sticking pads between the PCB and backplate. Someone needs to take a side shot of the cooler to show the clearances.
 
That is a bad idea imo. I've seen scratched dies due to lopsided or unbalanced forces on the cooler. If we use it like that, the cooler becomes one bigass lever, and the fulcrum is the core. smh...

Personally I don't think they need the pads on top of the chokes, as that is somewhat redundant. They are already improving clamping force by sticking pads between the PCB and backplate. Someone needs to take a side shot of the cooler to show the clearances.
No, the point of the pad is just to fill the small air gap between the heatsink and cooling plate, allowing heat to be conducted into the fins from the plate. It doesn't materially contribute to the clamping force, I don't think.
 
No, the point of the pad is just to fill the small air gap between the heatsink and cooling plate, allowing heat to be conducted into the fins from the plate. It doesn't materially contribute to the clamping force, I don't think.

But that doesn't make any sense. The pad doesn't do jack unless it's sandwiched between a heat conductor and the heat source. I would think a pad that isn't used to transfer heat from the source to a heat conductor... would just insulate the heat?
 
But that doesn't make any sense. The pad doesn't do jack unless it's sandwiched between a heat conductor and the heat source. I would think a pad that isn't used to transfer heat from the source to a heat conductor... would just insulate the heat?
It does. On one side, it's got the heat spreader that actually touches the VRM. On the other, according to AHOC dude, the heatsink touches it. Because it's kind of a goopy material, the heatsink just sort of smushes into it a little, providing an interface surface for heat to be conducted.

I'm sure if EVGA had this to do again, they'd physically attach the heatsink to the plate, like Sapphire does, but this appears to be a "best we can do without replacing expensive metal parts" kind of solution.
 
It does. On one side, it's got the heat spreader that actually touches the VRM. On the other, according to AHOC dude, the heatsink touches it. Because it's kind of a goopy material, the heatsink just sort of smushes into it a little, providing an interface surface for heat to be conducted.

I'm sure if EVGA had this to do again, they'd physically attach the heatsink to the plate, like Sapphire does, but this appears to be a "best we can do without replacing expensive metal parts" kind of solution.

Sorry, that explanation is still wacky. Thermal pads are not the best conductors of heat; they have a rating that measures how much they suck as conductors. They are used because they get soft/gooey when heated up and thus fill in the micro gaps of the surfaces they are mounted to. Thermal pads are more tolerant of variations in part tolerances, unlike thermal grease, which is superior to pads. They need a real heat conductor to transfer any heat that they don't waste themselves. The explanation from whoever said that putting them on top of the cooling plate and then letting the fins of the heatsink smush into them makes a valid heat-transfer path is freaking crazy.
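To put rough numbers on it (a back-of-the-envelope sketch with assumed values, not EVGA's actual pad specs), you can estimate what the pad alone costs you from its rated conductivity:

```python
# Conduction through a thermal pad slab (all values assumed for illustration).
# Thermal resistance: R = t / (k * A); temperature drop across the pad: dT = P * R.

thickness_m = 1.0e-3          # 1 mm pad thickness (assumed)
k_w_per_mk  = 3.0             # W/(m*K), a mid-grade pad rating (assumed)
area_m2     = 0.020 * 0.010   # ~20 mm x 10 mm strip over the FET row (assumed)
power_w     = 15.0            # heat pushed through the pad (assumed)

r_pad = thickness_m / (k_w_per_mk * area_m2)   # K/W
dt    = power_w * r_pad                        # temperature drop in K

print(f"Pad resistance ~{r_pad:.1f} K/W, drop across the pad ~{dt:.0f} K")
```

Even with a proper metal conductor clamped on both faces, the pad alone eats on the order of 25 K at those assumed numbers, which is exactly why it only works as a gap filler inside a clamped stack and not as the cooling path by itself.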

And the topic of having an integrated VRM cooler built into the finned heatsink/cooler, as if it is some miraculous invention, is more BS. Companies have been using integrated VRM cooling on finned and heatpiped coolers here and there for as long as open heatpipe coolers have been popular. I get the impression that half the people posting are irate because EVGA didn't use integrated VRM cooling.

The way the EVGA ACX cooler is designed is essentially the same as every open-fan cooler before it. The cooler only directly cools the core, the cooling plate covers the other VRM/memory ICs, and the cooler's fans push air down over the plate. Where EVGA messed up is that they didn't cast cooling fins into the cooling plate like MSI did below. Btw, the Lightning cooler below uses integrated VRM cooling... lol, not on the VRMs but on the 4 memory ICs, which highlights how hilarious that is. They don't integrate the VRM cooling on the 16 phases lol, but instead on the 4 memory ICs. Clearly it didn't matter to MSI, or did it?

https://www.techpowerup.com/reviews/MSI/R9_290X_Lightning/4.html

[image: MSI R9 290X Lightning cooler, from the TechPowerUp review linked above]


A 290X, which pulls down like 3 times the power draw, and zomg, no direct VRM cooling on the actual cooler?
 
This isn't the first time EVGA has pulled a snafu on customers. Some years back they sold a large batch of cards with VRAM on them that couldn't do their factory OC. Can't remember which model it was now, but I had one and had to do two RMAs with them because the first replacement card they sent me had flaky VRAM too. I think it was the GeForce 7800 or 7900 series.

At first I fixed it myself by modifying the BIOS on the card to run the RAM at a slower speed, but later decided to RMA it anyway after resetting the BIOS to factory default.

I've never bought an EVGA card since, because of that fiasco. I mostly go with Gigabyte Windforce now, have never had an issue with their cards, and they are competitively priced too.
 
Here is a video explaining what the temperature problem is, what the BIOS cooling improvements bring, and what applying the thermal pads does for the card temperature-wise.

 
Sorry, that explanation is still wacky. Thermal pads are not the best conductors of heat; they have a rating that measures how much they suck as conductors. They are used because they get soft/gooey when heated up and thus fill in the micro gaps of the surfaces they are mounted to. Thermal pads are more tolerant of variations in part tolerances, unlike thermal grease, which is superior to pads. They need a real heat conductor to transfer any heat that they don't waste themselves. The explanation from whoever said that putting them on top of the cooling plate and then letting the fins of the heatsink smush into them makes a valid heat-transfer path is freaking crazy.

And the topic of having an integrated VRM cooler built into the finned heatsink/cooler, as if it is some miraculous invention, is more BS. Companies have been using integrated VRM cooling on finned and heatpiped coolers here and there for as long as open heatpipe coolers have been popular. I get the impression that half the people posting are irate because EVGA didn't use integrated VRM cooling.

The way the EVGA ACX cooler is designed is essentially the same as every open-fan cooler before it. The cooler only directly cools the core, the cooling plate covers the other VRM/memory ICs, and the cooler's fans push air down over the plate. Where EVGA messed up is that they didn't cast cooling fins into the cooling plate like MSI did below. Btw, the Lightning cooler below uses integrated VRM cooling... lol, not on the VRMs but on the 4 memory ICs, which highlights how hilarious that is. They don't integrate the VRM cooling on the 16 phases lol, but instead on the 4 memory ICs. Clearly it didn't matter to MSI, or did it?

https://www.techpowerup.com/reviews/MSI/R9_290X_Lightning/4.html

[image: MSI R9 290X Lightning cooler, from the TechPowerUp review linked above]


A 290X, which pulls down like 3 times the power draw, and zomg, no direct VRM cooling on the actual cooler?
Dunno what to tell you, man. It appears that's what the prescribed fix is, for better or for worse. I'm not claiming it's a terribly great solution; I was just curious, so I asked the AHOC dude about it.

Also, the 290X I have actually did come with a heatsink that has fins in that area, but even with a big meaty waterblock, the VRM on that thing runs hot.
 
Does the BIOS add any performance? I put a waterblock on mine before I even installed the card so temps aren't really a concern for me. I just want more POWA.
 
No, it doesn't. It just changes the fan curve (though the BIOS update prior to that, IF you got a Micron-based VRAM unit, will supposedly allow a better memory OC).
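If you want to double-check which VBIOS actually ended up on the card after running EVGA's updater, the driver tools can report it. A quick sketch, assuming nvidia-smi is installed and on the PATH (the exact version string varies by card):

```python
# Print the card name and installed VBIOS version via nvidia-smi.
import subprocess

out = subprocess.run(
    ["nvidia-smi", "--query-gpu=name,vbios_version", "--format=csv,noheader"],
    capture_output=True, text=True, check=True,
)
print(out.stdout.strip())  # e.g. "GeForce GTX 1070, 86.04.xx.xx.xx" (string varies)
```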
 
Does anyone know the dimensions/thickness of the thermal pads EVGA is sending out? I'm going to request a set of pads, but if I have some of the right thickness/size I may just go ahead and apply those before I get the EVGA ones.
 
Does anyone know the dimensions/thickness of the thermal pads EVGA is sending out? I'm going to request a set of pads, but if I have some of the right thickness/size I may just go ahead and apply those before I get the EVGA ones.

It should be on the EVGA forum in the main stickied topic. Sorry, I don't remember.
 
Does anyone know the dimensions/thickness of the thermal pads EVGA is sending out? I'm going to request a set of pads, but if I have some of the right thickness/size I may just go ahead and apply those before I get the EVGA ones.
The GamersNexus guy talks about this in his video, I think.
 
The more I read about this, the more I want this to be about an $80 vid card I bought, not a $680 vid card I bought.

I have had too many expensive items not fully work like they are supposed to.

(couldn't sell this now and get a decent price, fix a $700 product yourself?)

EVGA"fiy
 
The more I read about this, the more I want this to be about an $80 vid card I bought, not a $680 vid card I bought.

I have had too many expensive items not fully work like they are supposed to.

(couldn't sell this now and get a decent price, thanks EVGA)

Why would you want to sell it? What problems are you personally experiencing? Just apply the thermal pads and the BIOS update and continue enjoying your purchase.
 
I am getting an Advanced RMA of my GTX 1070! EVGA will first send me a brand new GTX 1070 with the thermal pads installed and the updated BIOS!
 
If you all had just watercooled your graphics cards like you were supposed to, none of this would have ever happened!


TBH I am also a bit surprised that this has become such an issue. I've had cards where the VRMs ran up to like 100 C and never blew up or caught on fire. MOSFETs can usually take the heat without dying as long as there's some airflow; you just might get some instability. How different is EVGA's ACX cooler design from anything else out there right now? It doesn't look like there's much fundamentally different from the XFX DD 290X coolers I had, but I guess the devil is in the details.
 
If you all had just watercooled your graphics cards like you were supposed to, none of this would have ever happened!


TBH I am also a bit surprised that this has become such an issue. I've had cards where the VRMs ran up to like 100 C and never blew up or caught on fire. MOSFETs can usually take the heat without dying as long as there's some airflow; you just might get some instability. How different is EVGA's ACX cooler design from anything else out there right now? It doesn't look like there's much fundamentally different from the XFX DD 290X coolers I had, but I guess the devil is in the details.

These are hitting 110 C within seconds of startup. Look at the JayzTwoCents video I linked.
 
These are hitting 110 C within seconds of startup. Look at the JayzTwoCents video I linked.
Sorry, I don't support that clown, so I'll pass. 110 C in seconds sounds ridiculous. Even uncooled, I wouldn't think they'd get that hot, that fast. I've had other, admittedly lower-powered, GPUs with nothing on the VRMs at all. Even mining and drawing full power, they didn't get that hot with just normal airflow from the cooler blowing on the core.
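For anyone who wants to sanity-check how fast their own card heats up after boot, here's a minimal polling sketch. Big caveat: nvidia-smi only exposes the core sensor, so it can't directly confirm or refute any VRM numbers; board-level probes or a thermal camera are needed for that. It at least shows the trend:

```python
# Log the GPU core temperature once per second for a minute (core sensor only;
# VRM temperatures are not exposed by nvidia-smi).
import subprocess
import time

for _ in range(60):
    temp = subprocess.run(
        ["nvidia-smi", "--query-gpu=temperature.gpu", "--format=csv,noheader,nounits"],
        capture_output=True, text=True,
    ).stdout.strip()
    print(f"{time.strftime('%H:%M:%S')}  {temp} C")
    time.sleep(1)
```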
 
We're approaching class-action lawsuit territory, boys.

I am critical of the EVGA situation, but they may have some wriggle room if someone tried a lawsuit, primarily because EVGA has stressed that in normal operation all components are within spec; meaning no additional OC beyond what EVGA sets, and fan profiles not tweaked to be quieter.
Note it only reached what could be deemed the specification ceiling with FurMark in the Tom's Hardware test. I still see that test being useful, but it needs context: the result needs to be balanced against the OC/fan profiles that consumers will play with for gaming.

Like I said before, it sucks, though, to buy an expensive custom AIB model if one cannot unlock its clocks/tweak its fan profile/etc.
FurMark is an interesting test, because even AMD does not like it; that said, I think it is still worth including, it just needs a caveat/explanation of its value, just like how Tom's Hardware's power-demand analysis could be taken out of context (so many tried to argue with PCPer that the 960 was out of spec even when they showed multiple times why it is not and how to interpret the data).
So EVGA may be lucky with their wriggle room, although for now they seem to be engaged in assisting consumers while also continuing to stress within-spec use; so they are covering themselves legally in multiple ways.
Cheers
 
Quick FYI -- For anyone doing an RMA with EVGA for a new 1070 or 1080, they've got a 3-4 week backlog on shipping out the replacement cards.

So can you RMA for a brand new card?

I am getting an Advanced RMA of my GTX 1070! EVGA will first send me a brand new GTX 1070 with the thermal pads installed and the updated BIOS!

Which model did you purchase and how did you get that service? My card is only 1 month old and I don't feel like taking it apart myself. Have a feeling I'd tear out those little fan wires or something.
 
These are hitting 110 C within seconds of startup. Look at the JayzTwoCents video I linked.
I checked the video, and while he mentions 110 C, he never said it happened within seconds of startup. I was looking at the 3m 5s mark.

One thing about his video: IMO he is downplaying the noise levels of the more aggressive fan profile solution EVGA is implementing with the BIOS.
Cheers
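For what it's worth, nobody is locked into whatever curve the BIOS bakes in: EVGA Precision X(OC) or MSI Afterburner will let you set your own curve on Windows, and on Linux it can be scripted. A rough sketch, assuming an X session with Coolbits enabled; the attribute names come from nvidia-settings, nothing EVGA-specific:

```python
# Force a fixed fan speed through nvidia-settings (Linux/X11, Coolbits "4" required).
import subprocess

def set_fan_percent(percent: int, gpu: int = 0, fan: int = 0) -> None:
    """Take manual fan control and pin the target fan to the given duty cycle."""
    subprocess.run(
        ["nvidia-settings",
         "-a", f"[gpu:{gpu}]/GPUFanControlState=1",
         "-a", f"[fan:{fan}]/GPUTargetFanSpeed={percent}"],
        check=True,
    )

set_fan_percent(70)  # e.g. pin the fans at 70% while gaming, at the cost of noise
```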
 
A pretty popular thread on Reddit; it seems some cards have VRAM chips that aren't being properly cooled.


In response to this post on Reddit and EVGA forums, EVGA will be adding to their thermal pad offer:
Reddit thread for followup:
Starting next week we will ship thermal grease, memory thermal pads, along with PWM thermal pads in the package. It is recommended to remove the existing grease on GPU and memory pads and apply the new ones. For any customers that did not receive memory thermal pads, please contact us so we can arrange it.

I suggest that if you don't like/want to DIY the card modifications, you contact EVGA about RMAing it; that is an option. Also, for those already sending in their cards, read through this page of the thread: http://forums.evga.com/gaps-between...thermal-pads-on-the-midplate-m2570619-p4.aspx
But I would recommend contacting support and making sure these newest modifications are completed before the card ships back to you.
 
what a fucking headache.
After browsing the EVGA forums, it appears you might not get a new card back if you return yours for cross-shipment. They may send you a refurbished one with the thermal mod instead.

http://forums.evga.com/FindPost/2576983

You would think they could at least ensure you receive the same card back, but then that also has costs associated with it; although, considering the issue comes down to cutting corners in the first place, that is the minimum they should do IMO.
Not a fan of refurbished GPUs being sent as replacements, and I doubt consumers paying this kind of money for a GPU want a card with an unknown history of care that was returned to EVGA and refurbished.

If sending it back, I would write the serial number down and try to get some kind of guarantee of whether it would be a new card or the same card back.
Cheers
 
So can you RMA for a brand new card?



Which model did you purchase and how did you get that service? My card is only 1 month old and I don't feel like taking it apart myself. Have a feeling I'd tear out those little fan wires or something.

GTX 1070, and I emailed tech support. They put me under the Advanced RMA for free, which means they send me a card with the thermal pads and BIOS update first. Then I ship them my other card for free.

I am getting a brand new card since I filed the RMA within 30 days of the original purchase date!
 