Let the RX 580 overclocking hype train begin... LOL

You realize this is effectively like caring about gas mileage on a performance car instead of only caring about performance for the money? At least for me, power has never been a consideration.

For some people, power is a consideration. Maybe they live somewhere electricity is expensive.
 
Or they don't want their room heated up by a "space heater" and then have to deal with that. Or a combination of both :)

While energy cost may be an issue (though a 100 W difference isn't going to amount to much overall...), I don't think the effective heat output of two to three 100-watt incandescent light bulbs is really going to make that big of a difference. If we were talking an actual power draw of 1,000 watts, I could see how it would be an issue. How many bulbs does your ceiling fan have on it? :)

In fact, I would almost guess that my subwoofer and studio monitors probably put out more heat than that.
 
For some people, power is a consideration. Maybe they live somewhere electricity is expensive.
It doesn't make much of a difference. It's probably a few dollars a year, maybe a bit more. It's not going to break your wallet. Most people who have jobs don't game 24/7. My second rig with the RX 580 sometimes doesn't get turned on for a week. It's in a guest room. Sometimes my wife gets on it, or my nephews if they come over. People talk about it like everyone is sitting in front of a screen 24/7 playing games.
 
It doesn't make much of a difference. It's probably a few dollars a year, maybe a bit more. It's not going to break your wallet. Most people who have jobs don't game 24/7. My second rig with the RX 580 sometimes doesn't get turned on for a week. It's in a guest room. Sometimes my wife gets on it, or my nephews if they come over. People talk about it like everyone is sitting in front of a screen 24/7 playing games.


Operating 24/7, a 100-watt card would use approximately 876 kWh a year. Going with a $0.10 per kWh rate to simplify the math, you would spend $87.60 a year per 100 watts of consumption. So a 50-watt difference would be roughly $43.80 a year at $0.10 per kWh, or $3.65 a month, I think. Increase or decrease depending on the cost of power for you (I only pay $0.07 per kWh).

And that is playing 24 hours a day, 365 days a year. Obviously, it's going to be even lower with the normal amount of time people game.
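If anyone wants to plug in their own rate and hours, here's a minimal sketch of that arithmetic in Python (the 100 W draw, the $0.10/kWh rate, and 24/7 use are just the assumptions from above):

```python
# Rough yearly cost of a constant extra power draw (flat per-kWh rate assumed).
def power_cost(extra_watts, price_per_kwh, hours_per_day=24.0):
    kwh_per_year = extra_watts / 1000.0 * hours_per_day * 365
    return kwh_per_year, kwh_per_year * price_per_kwh

print(power_cost(100, 0.10))          # (876.0, 87.6) -> 876 kWh, $87.60/year at 24/7
print(power_cost(50, 0.10)[1] / 12)   # ~3.65 -> about $3.65/month for a 50 W difference
```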
 
While energy cost may be an issue (though a 100 W difference isn't going to amount to much overall...), I don't think the effective heat output of two to three 100-watt incandescent light bulbs is really going to make that big of a difference. If we were talking an actual power draw of 1,000 watts, I could see how it would be an issue. How many bulbs does your ceiling fan have on it? :)

In fact, I would almost guess that my subwoofer and studio monitors probably put out more heat than that.

You'd be surprised. Our heating usage for an entire year is 1.5 MWh, for comparison, which works out to around 170 W on average, and we get room temperatures of around 25°C, with the bathroom around 28°C (36°C floor).

My ceiling fan doesn't have any light bulbs. We don't have a fan :)

And everything is LED these days. The six spotlights in the bathroom may be the top user, at around 30 W combined.

Operating 24/7, a 100-watt card would use approximately 876 kWh a year. Going with a $0.10 per kWh rate to simplify the math, you would spend $87.60 a year per 100 watts of consumption. So a 50-watt difference would be roughly $43.80 a year at $0.10 per kWh, or $3.65 a month, I think. Increase or decrease depending on the cost of power for you (I only pay $0.07 per kWh).

And that is playing 24 hours a day, 365 days a year. Obviously, it's going to be even lower with the normal amount of time people game.

Paying 10 US cents or below isn't normal. And you have indirect costs too if you have to start running AC to get rid of the heat.

But again, why pay in the first place? It's simply a tax on the end user. Imagine if the lifetime power cost of the card were added to the initial price instead. What would an RX 580 be then? $400? $500?
 
Operating 24/7, a 100-watt card would use approximately 876 kWh a year. Going with a $0.10 per kWh rate to simplify the math, you would spend $87.60 a year per 100 watts of consumption. So a 50-watt difference would be roughly $43.80 a year at $0.10 per kWh, or $3.65 a month, I think. Increase or decrease depending on the cost of power for you (I only pay $0.07 per kWh).

And that is playing 24 hours a day, 365 days a year. Obviously, it's going to be even lower with the normal amount of time people game.

Exactly, and those playing 24/7 probably have their parents paying the bill because they have nothing else to do! lol!
 
Exactly, and those playing 24/7 probably have their parents paying the bill because they have nothing else to do! lol!

In Denmark, even with just 3 hours of gaming a day, after 3 years an RX 580 costs over $100 more than a GTX 1060 in ownership cost alone. And that's excluding all the extra costs an RX 580 brings as well.
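Rough sanity check on that figure (a ~100 W draw difference is my assumption; the ~2.44 DKK/kWh electricity price is the one quoted further down the thread):

```python
# 3 hours/day of gaming over 3 years with a ~100 W draw difference, at Danish electricity prices.
extra_kw, dkk_per_kwh = 0.100, 2.44
kwh = extra_kw * 3 * 365 * 3      # ~328.5 kWh over the 3 years
print(kwh * dkk_per_kwh)          # ~800 DKK, roughly $115 at 2017 exchange rates
```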
 
Operating 24/7, a 100-watt card would use approximately 876 kWh a year. Going with a $0.10 per kWh rate to simplify the math, you would spend $87.60 a year per 100 watts of consumption. So a 50-watt difference would be roughly $43.80 a year at $0.10 per kWh, or $3.65 a month, I think. Increase or decrease depending on the cost of power for you (I only pay $0.07 per kWh).

And that is playing 24 hours a day, 365 days a year. Obviously, it's going to be even lower with the normal amount of time people game.
Did the math. The $10 price advantage the 480/580 has over the 1060 in local stores at the moment is gone within a year, assuming 8 cents per kWh and 20-24 hours per week of actual active usage.

It adds up steadily, you know.
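A quick sketch of that break-even math, using the numbers from the post (8 cents/kWh, ~22 hours/week of use) and assuming roughly a 100 W draw difference:

```python
# Hours of active use before a $10 price advantage is eaten by a ~100 W draw gap.
price_gap, usd_per_kwh, extra_kw = 10.0, 0.08, 0.100
breakeven_hours = price_gap / (usd_per_kwh * extra_kw)   # 1250 hours
print(breakeven_hours / 22)                              # ~57 weeks, i.e. about a year
```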
 
In Denmark, even with just 3 hours of gaming a day, after 3 years an RX 580 costs over $100 more than a GTX 1060 in ownership cost alone. And that's excluding all the extra costs an RX 580 brings as well.

Doesn't it get fairly cold in Denmark? The average summer temp in July is 73°F/23°C, and any time outside of summer you could probably count the heat from the card as a reduction against your normal heating expenses. It's probably pretty close to a wash. It is certainly more efficient to use the gaming card to produce heat AND play games than it is to burn electricity/coal/natural gas/whatever just for heat.
 
Did the math. The $10 price advantage the 480/580 has over the 1060 in local stores at the moment is gone within a year, assuming 8 cents per kWh and 20-24 hours per week of actual active usage.

It adds up steadily, you know.

I don't know why you would assume that is the only way to approach the problem. In the winter, I can keep my house warm enough with computers and electronics without turning on the heater. I routinely keep my room warm with my 980 Ti as long as it doesn't drop below freezing. In that case, you are saving money, not paying extra.
 
Doesn't it get fairly cold in Denmark? The average summer temp in July is 73°F/23°C, and any time outside of summer you could probably count the heat from the card as a reduction against your normal heating expenses. It's probably pretty close to a wash. It is certainly more efficient to use the gaming card to produce heat AND play games than it is to burn electricity/coal/natural gas/whatever just for heat.

District heating costs a fraction of what electricity does, so that's not even a use case.

And then there is the summer period, with no AC, where it simply adds heat to the room that you don't want.

In short, we use ~1,500 kWh a year on heating, including hot water, and ~1,800 kWh a year on electricity for the entire household.

Current prices:
District heating = 0.66181 DKK per kWh
Electricity = 2.4394 DKK per kWh
 
District heating costs a fraction of what electricity does, so that's not even a use case.

And then there is the summer period, with no AC, where it simply adds heat to the room that you don't want.

In short, we use ~1,500 kWh a year on heating, including hot water, and ~1,800 kWh a year on electricity for the entire household.

Current prices:
District heating = 0.66181 DKK per kWh
Electricity = 2.4394 DKK per kWh

That is crazy low energy usage. I remember in college with my roommates we were averaging 1,800 kWh a month during the summer. Keeping a house cool is expensive when it is 100-110°F outside. Right now I average 500 kWh a month.
 
That is crazy low energy usage. I remember in college with my roommates we were averaging 1,800 kWh a month during the summer. Keeping a house cool is expensive when it is 100-110°F outside. Right now I average 500 kWh a month.

Keeping the house cool is fuckin expensive here in the northern California PG&E market. My bill in summer shot up to $600-700, and those summer months caused me to get solar panels. Rates are going up more and more. At one point PG&E even tried to raise the connection rate for solar. Fuckin thug move, because so many here have converted to solar. I financed my panels and it's still saving me $2-3k a year. Probably more down the road as PG&E rates increase overall.
 
That is crazy low energy usage. I remember in college with my roommates we were averaging 1,800 kWh a month during the summer. Keeping a house cool is expensive when it is 100-110°F outside. Right now I average 500 kWh a month.

Modern power-efficient appliances, standard LED lighting, etc. That way you don't have to compromise on anything. The tumble dryer has a heat pump too, and so on.

And for the home, A-rated construction. That means heat recovery on the ventilation (obviously not in summer, where it uses a bypass), triple-glazed windows, 300 mm of Rockwool or equivalent insulation in the walls, etc., and a recirculation system for the floor heating.

Hell, we even have passive houses now that produce more energy than they use. Ground-source heating is also a super cheap way to get heating and hot water, and that system can also be used for cooling.
 
Keeping the house cool is fuckin expensive here in the northern California PG&E market. My bill in summer shot up to $600-700, and those summer months caused me to get solar panels. Rates are going up more and more. At one point PG&E even tried to raise the connection rate for solar. Fuckin thug move, because so many here have converted to solar. I financed my panels and it's still saving me $2-3k a year. Probably more down the road as PG&E rates increase overall.

I barely spend $2.5k per year on electricity here in Texas, and I have a 3,000 sq ft home in south Texas. I keep my home cool too, around 74°F in summer and 70°F in winter.
 
An extra 50-75 W of heat dumped into a room can mean the difference between a fan and an air conditioner.

Those living in hotter climates can attest to that.
 
You realize this is effectively like caring about gas mileage on a performance car instead of only caring about performance for the money? At least for me, power has never been a consideration.

Same. Especially in winter, it's just a heater that plays games.
I don't game much, so when I do, a slight power bump means basically nothing.
People act like they will be gaming 24/7 and that 50-70 W is a big deal. Save shekels on a fucking 50 W smaller PSU? Seriously? You'd crimp your build and expansion capabilities to save a few shekels on a PSU by buying a slightly more efficient and more expensive GPU? It's as if they never used graphics cards from the last 5+ years, which were normally around that power envelope and provided far less performance than today's.
 
Same. Especially in winter, it's just a heater that plays games.
I don't game much, so when I do, a slight power bump means basically nothing.
People act like they will be gaming 24/7 and that 50-70 W is a big deal. Save shekels on a fucking 50 W smaller PSU? Seriously? You'd crimp your build and expansion capabilities to save a few shekels on a PSU by buying a slightly more efficient and more expensive GPU? It's as if they never used graphics cards from the last 5+ years, which were normally around that power envelope and provided far less performance than today's.

So it has to be winter, and you must use it very little.

What else do you want to add to defend 120 W-class performance at 200-240 W of usage?

For 3 years of casual ownership here, you could just as well have bought a GTX 1070. The cost would be the same over that time period.
 
So it has to be winter, and you must use it very little.

What else do you want to add to defend 120 W-class performance at 200-240 W of usage?

For 3 years of casual ownership here, you could just as well have bought a GTX 1070. The cost would be the same.

Depends on what you do with your card. Mine often sits there idling away, as I don't usually have time to game. If it takes you 3 years to save the difference, you can't afford it anyway.
An 80-120 W difference is a light bulb or two, or a small-to-medium screen.
Big fucking deal.
 
Depends on what you do with your card. Mine often sits there idling away, as I don't usually have time to game. If it takes you 3 years to save the difference, you can't afford it anyway.
An 80-120 W difference is a light bulb or two, or a small-to-medium screen.
Big fucking deal.

If your light bulbs use that much, you are doing it wrong as well ;)

Funny you mention "can't afford it". A GTX 1070 in that case is a much better investment. So buying an RX 580 must mean you couldn't afford anything better at the time of purchase and are just paying a constant usage tax instead. So maybe you should avoid silly comments like that in the future.
 
If your light bulbs use that much, you are doing it wrong as well ;)

Funny you mention "can't afford it". A GTX 1070 in that case is a much better investment. So buying an RX 580 must mean you couldn't afford anything better at the time of purchase and are just paying a constant usage tax instead. So maybe you should avoid silly comments like that in the future.
Silly comment = when you assume power usage and cost are a huge factor for EVERYONE. I run my 8350 (4.74 GHz)/290 (1100) 24/7, gaming about 35-40 hours a week, and it all sits in an air-conditioned case that has run 24/7 for the last 5 years. It idles at ~100 W, and gaming can typically be anywhere between 240 W and 500 W depending on what games I feel like playing. And the truth is my power bill has never been so high that I felt the need to lower it. I never get over $150 in a month, usually around $90-120. Add to that my wife playing around the same amount on her 7870 (4.5 GHz)/370 (1050).

Even with all that, the power usage and cost are reasonable enough that the power draw of any particular card isn't even on my list of criteria.

Now, I get that you and others around the world care, some out of necessity, others just to keep energy costs in check, and that it ranks higher on your criteria list. But to assume it is a huge factor on everyone's criteria list is asinine.
 
Silly comment = when you assume power usage and cost are a huge factor for EVERYONE. I run my 8350 (4.74 GHz)/290 (1100) 24/7, gaming about 35-40 hours a week, and it all sits in an air-conditioned case that has run 24/7 for the last 5 years. It idles at ~100 W, and gaming can typically be anywhere between 240 W and 500 W depending on what games I feel like playing. And the truth is my power bill has never been so high that I felt the need to lower it. I never get over $150 in a month, usually around $90-120. Add to that my wife playing around the same amount on her 7870 (4.5 GHz)/370 (1050).

Even with all that, the power usage and cost are reasonable enough that the power draw of any particular card isn't even on my list of criteria.

Now, I get that you and others around the world care, some out of necessity, others just to keep energy costs in check, and that it ranks higher on your criteria list. But to assume it is a huge factor on everyone's criteria list is asinine.

Power still costs money, doesn't it? So no matter how you try to twist it, it's still part of the TCO. So when you use the excuse that an AMD card is cheaper, the ownership of said card may not be. You simply pay over time for what AMD couldn't deliver, despite all their PowerPoint slide lies.

http://www.hardware.fr/articles/961-7/consommation-efficacite-energetique.html
http://www.hardware.fr/articles/961-28/recapitulatif-performances.html

Slower and uses more power.
 
And faster, that's the funny part.

Exactly :)

 
Oh gee, sorry, I was only going off a couple of reviews like [H]'s because I don't care as much. Thanks for doing the research.
 
Power still costs money, doesn't it? So no matter how you try to twist it, it's still part of the TCO. So when you use the excuse that an AMD card is cheaper, the ownership of said card may not be. You simply pay over time for what AMD couldn't deliver, despite all their PowerPoint slide lies.

http://www.hardware.fr/articles/961-7/consommation-efficacite-energetique.html
http://www.hardware.fr/articles/961-28/recapitulatif-performances.html

Slower and uses more power.
You know what AMD delivers? Sustained performance over time. Cases in point: the 7970/280X and the 290/290X (390/390X), each steadily distancing itself from its Nvidia counterpart.

But that wasn't the point. The point is that not everyone cares about power usage. You can keep hammering away at it, but it will do little to deter those like me who don't care about it. I concede it matters to some, whether by choice or by some set of rules or laws.
 
You know what AMD delivers? Sustained performance over time. Cases in point: the 7970/280X and the 290/290X (390/390X), each steadily distancing itself from its Nvidia counterpart.

But that wasn't the point. The point is that not everyone cares about power usage. You can keep hammering away at it, but it will do little to deter those like me who don't care about it. I concede it matters to some, whether by choice or by some set of rules or laws.

That myth stopped long ago.
[attached benchmark charts at 1920 resolution]
 
Nice cherry-picked benchmarks. Looks like those games just came out and AMD didn't even get CF working yet. lol, "beta"
 
You know what AMD delivers? Sustained performance over time. Cases in point: the 7970/280X and the 290/290X (390/390X), each steadily distancing itself from its Nvidia counterpart.

But that wasn't the point. The point is that not everyone cares about power usage. You can keep hammering away at it, but it will do little to deter those like me who don't care about it. I concede it matters to some, whether by choice or by some set of rules or laws.


Longevity for AMD cards? It's really not there; different strokes for different folks. Games now use higher polygon counts than before, so older GCN products are going to get hit hard, just like in the past when newer games used more shader resources and older nV cards got hit harder, and because those typically had less VRAM, that hurt them some too. But overall, talking about 3-generation-old hardware, you've got to turn some settings down. If a person is holding on to high-end cards from previous generations for 2 gens, it's better to get mid-range cards and buy them every generation; the end result is you get the best gaming experience every generation. And if that person resells their cards, they end up saving more money that way, instead of holding a product that has depreciated over 2 gens.

People who buy a card above $400 typically update their cards every generation anyway (if the upgrade is worth it). So that point is fairly moot.

AMD no longer has a raw shader throughput advantage. That is because nV has been able to ramp up their clocks with Maxwell, and more so with Pascal, a ramp-up we typically don't see: a 25% increase with Maxwell and another 25% with Pascal.

In previous generations, a 25% increase was hard to do even with node changes...
 