GoldenTiger
Fully [H]
- Joined
- Dec 2, 2004
- Messages
- 29,801
Thanks!
I'm also not forgetting that I save money each month with the 5850.
I don't need a new power supply.
I don't need more cooling, which uses more power.
Depending on how much I game in a month, I can most likely save $10 or so a month by keeping the 5850. I keep my PC on all the time. I normally have Blu-rays burning (1080p home video edited with effects and whatnot) when I go to sleep or leave the house, and I have torrents going too (legal stuff).
That's $120 a year. I only plan on running these cards for a year, but assuming $260 vs. $350 + $100, it's going to be an easy win in my book.
Power draw at load is about 75W higher going from your 5850 to a GTX 470, as an example (source below). Let's pretend you don't even encode, so your system is *always* at idle when not playing a game. In fact, you've stated that it's often at partial load, but let's ignore that and take the less favorable scenario for how much it would cost you.
Let's say you have an average household electric rate of $0.15 (15 cents) per kWh (kilowatt-hour). Let's also say you don't have a job and game 10 hours per day, every day, for an entire year, keeping the card at high load (highly unlikely, but let's pretend).
That is a difference of $41.06 on your electric bill for the year.
Let's take a more average scenario: a heavy gamer playing an average of 4 hours a night after work comes to $16.43 for the year. And that's *every... single... day... without fail...* as well.
From FiringSquad.
You literally cannot construct a scenario, even 24/7 gaming by a zombie who never sleeps, eats, or uses the bathroom, that racks up a $120 difference at the average household rate. Some rates are higher, some lower ($0.10/kWh through around $0.20/kWh is typical). In fact, even at the top of that range you'd barely manage it with this imaginary situation.
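The math above is easy to check yourself. Here's a quick sketch using the post's own figures (a 75W load-power difference and $0.15/kWh; the helper name and defaults are just for illustration):

```python
# Annual electricity-cost difference from an extra 75 W at load,
# at $0.15/kWh (both figures from the post above).
WATT_DIFF = 75   # extra watts at load, GTX 470 vs. 5850
RATE = 0.15      # dollars per kWh

def annual_cost(hours_per_day, watts=WATT_DIFF, rate=RATE):
    """Extra dollars per year of running `watts` for `hours_per_day` daily."""
    kwh_per_year = watts * hours_per_day * 365 / 1000
    return kwh_per_year * rate

print(f"10 h/day: ${annual_cost(10):.2f}/yr")  # the jobless 10-hour gamer
print(f" 4 h/day: ${annual_cost(4):.2f}/yr")   # 4 hours every night after work
print(f"24 h/day: ${annual_cost(24):.2f}/yr")  # the sleepless zombie
```

This reproduces the ~$41 and ~$16 figures, and shows that even the 24/7 zombie stays well under $120 at $0.15/kWh; only at the top of the typical rate range (around $0.20/kWh) does the 24/7 case clear it.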