Data Centers Are Finally Using Less Electricity

Megalith

A study shows that despite a “boom in online activity,” data center electricity consumption has barely grown in the last few years; it seems tech companies have done a great job of making their server farms more efficient. Projections show that electricity usage will grow at a similarly slow rate in the coming years, and could even be reduced by as much as 45 percent.

From 2000 to 2005, the study found, data centers in the United States increased their electricity consumption by 90 percent. From 2005 to 2010, it went up another 24 percent, even with the Great Recession. But since then it has been flat, growing by only 4 percent from 2010 to 2014 despite a boom in online activity, millions of new smartphones, social media mania and other trends that have driven Americans to spend ever more time online. And, the study projects, from now until 2020 electricity use from U.S. data centers will grow only 4 percent and could actually be reduced by as much as 45 percent -- back to 2003 levels -- with additional energy efficiency measures.
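To put those figures in perspective, here is a quick back-of-the-envelope calculation of the compound annual growth rates they imply. The period totals come straight from the quote above; the annualized numbers are my own arithmetic, not from the study itself.

```python
# Back-of-the-envelope: convert the study's period growth totals
# into implied compound annual growth rates (CAGR).

def cagr(total_growth: float, years: int) -> float:
    """Annualized rate implied by total growth over a period."""
    return (1.0 + total_growth) ** (1.0 / years) - 1.0

periods = [
    ("2000-2005", 0.90, 5),   # +90% over 5 years
    ("2005-2010", 0.24, 5),   # +24% over 5 years
    ("2010-2014", 0.04, 4),   # +4% over 4 years
    ("2014-2020", 0.04, 6),   # projected +4% over 6 years
]

for label, growth, years in periods:
    print(f"{label}: {cagr(growth, years):.1%} per year")

# Output:
# 2000-2005: 13.7% per year
# 2005-2010: 4.4% per year
# 2010-2014: 1.0% per year
# 2014-2020: 0.7% per year
```

Going from roughly 14 percent annual growth to under 1 percent is a dramatic flattening, which is what makes the efficiency story notable.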
 
I can't say I'm surprised. The time frame fits with the rise of vSphere and Hyper-V usage. A lot of data centers are probably like ours: we have shut down dozens of physical machines, moving them to a half dozen ESXi hosts on an EMC SAN. The power savings really add up, and not only from the reduction in physical machines; the air conditioning system is taxed much less, saving power there as well.
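A rough sketch of how that consolidation math can work out. All the server counts, wattage figures, and the PUE value below are hypothetical placeholders for illustration, not measurements from any real data center:

```python
# Rough consolidation estimate: retire many lightly loaded physical
# servers, replace them with a few virtualization hosts, and account
# for cooling overhead via PUE (power usage effectiveness).

RETIRED_SERVERS = 36     # hypothetical: old physical boxes shut down
OLD_SERVER_WATTS = 350   # hypothetical average draw per old box
NEW_HOSTS = 6            # hypothetical: virtualization hosts replacing them
NEW_HOST_WATTS = 600     # hypothetical average draw per host
PUE = 1.8                # hypothetical: total facility power / IT power

HOURS_PER_YEAR = 8760

old_it_watts = RETIRED_SERVERS * OLD_SERVER_WATTS
new_it_watts = NEW_HOSTS * NEW_HOST_WATTS

# PUE scales IT power up to include cooling, UPS losses, etc.,
# which is why shutting down servers also unloads the A/C.
old_total_kwh = old_it_watts * PUE * HOURS_PER_YEAR / 1000
new_total_kwh = new_it_watts * PUE * HOURS_PER_YEAR / 1000

print(f"Before: {old_total_kwh:,.0f} kWh/year")
print(f"After:  {new_total_kwh:,.0f} kWh/year")
print(f"Saved:  {old_total_kwh - new_total_kwh:,.0f} kWh/year "
      f"({1 - new_total_kwh / old_total_kwh:.0%})")
```

With these made-up numbers the consolidation cuts consumption by about 70 percent, and the PUE factor shows why every watt of IT load removed also saves a good fraction of a watt on the cooling side.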
 
You can get a lot more CPU power with the newer servers, and the newer CPUs draw less power, especially at idle.

The last server I bought had dual 10-core CPUs. Even with multiple VMs running, I doubt I'll be able to use all that CPU power over the next few years.
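That headroom is easy to see with a little vCPU arithmetic. The VM counts, sizing, and the 4:1 overcommit ratio below are illustrative assumptions, not recommendations:

```python
# Illustrative vCPU headroom math for a dual 10-core host.
# Overcommit ratio and VM sizing are assumed for the example.

SOCKETS = 2
CORES_PER_SOCKET = 10
OVERCOMMIT = 4.0         # assumed vCPU:pCPU ratio for light workloads

physical_cores = SOCKETS * CORES_PER_SOCKET          # 20 pCPUs
vcpu_capacity = int(physical_cores * OVERCOMMIT)     # 80 vCPUs

vms = 12                 # hypothetical VM count
vcpus_per_vm = 2         # hypothetical sizing
used_vcpus = vms * vcpus_per_vm

print(f"Physical cores:       {physical_cores}")
print(f"vCPU capacity @ 4:1:  {vcpu_capacity}")
print(f"vCPUs in use:         {used_vcpus}")
print(f"Headroom:             {vcpu_capacity - used_vcpus} vCPUs")
```

Even a modest fleet of small VMs leaves most of a dual 10-core box unused, which is exactly why one new host can absorb so many retired machines.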
 