So a 1080, in the same comparison, uses 43 cents in 24 hours at full tilt (or about 1.8 cents per hour).
As mentioned earlier, a 300-watt Vega will use 72 cents of electricity in 24 hours at full tilt.
If you game for two hours on each, that's 6 cents for the Vega or about 4 cents for the 1080.
And your point about not using full TDP for gaming is fair; that makes it even less for both cards. If we use 66% of TDP as an estimate, two hours of gaming costs about 4 cents on the Vega and about 2.5 cents on the 1080.
Still wholly immaterial, and all the concern otherwise is folly.
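For anyone who wants to plug in their own numbers, here's a quick sketch of the arithmetic above. It assumes a flat ~10¢/kWh rate (which is what the quoted 43c/day and 72c/day figures imply) and the 1080's 180 W TDP; both are inferred, not stated in the thread, so swap in your own rate and wattages.

```python
# Sketch of the cost math above. Assumes a flat electricity rate of
# 10 cents/kWh, which matches the quoted figures (43c/day at ~180 W,
# 72c/day at 300 W).

RATE_PER_KWH = 0.10  # USD per kWh, assumed from the numbers in the post

def gaming_cost(tdp_watts, hours, load_fraction=1.0, rate=RATE_PER_KWH):
    """Electricity cost in USD for running a card at some fraction of TDP."""
    kwh = tdp_watts * load_fraction * hours / 1000
    return kwh * rate

# Two hours of gaming: full TDP vs. the ~66% typical-load estimate
for name, tdp in [("GTX 1080", 180), ("Vega", 300)]:
    full = gaming_cost(tdp, 2)
    typical = gaming_cost(tdp, 2, load_fraction=0.66)
    print(f"{name}: {full*100:.1f}c at full TDP, {typical*100:.1f}c at 66% TDP")
```

Running it reproduces the rounded figures above: 3.6c/2.4c for the 1080 and 6.0c/4.0c for the Vega over two hours.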
It's not immaterial if the heat forces other component fans to run harder, the build is a small form factor (SFF) case, the end user wants to OC, or the turd is heating up the room. And where do you draw the line with efficient products? Or do you not? Because by your logic, crap engineering only costs $0.05 a day, so we should never really consider or reward efficiency unless it's clearly the better deal.