Despite U.S. government warnings on exploding energy consumption, data centers are not getting more power-efficient. An ongoing Uptime Institute survey found that, from 2005 to 2008, the electricity usage of its members' data centers grew at an average of about 11 percent per year. "We've done a lot of great stuff at the infrastructure level, but we haven't changed our behavior," says Microsoft's Rob Bernard.
Speakers at the recent Uptime Symposium 2010 conference highlighted a number of power-draining factors, including energy-indifferent application programming, siloed organizational structures, and better hardware. One problem is that developers regularly build programs that allocate too much memory and hold onto the processor for too long. "If the application isn't energy-aware, it doesn't matter that every other application on the client is," Bernard says.
Speakers at the conference estimated that average CPU utilization sits somewhere between 5 and 25 percent. The industry needs a more dynamic way for the data center to scale its power usage with the amount of work that needs to be done, according to the speakers. "As an industry, what we'd like to see is truly linear scaling, where you'd go from using zero watts when doing zero work to drawing a lot of power [only] when you are doing more work," says analyst John Stanley.
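The gap Stanley describes can be sketched with a toy model. The sketch below is a hypothetical illustration, not data from the article: the idle and peak wattages are assumed numbers, chosen to reflect the common observation that servers draw a large fraction of peak power even when idle, while an ideal energy-proportional server would draw power strictly in proportion to utilization.

```python
def typical_power(utilization, idle_watts, peak_watts):
    """Simple linear model of a real server: a high idle floor,
    rising toward peak power as utilization grows."""
    return idle_watts + (peak_watts - idle_watts) * utilization

def proportional_power(utilization, peak_watts):
    """Ideal energy-proportional server: zero watts at zero work."""
    return peak_watts * utilization

# Hypothetical 300 W server that idles at 180 W (60% of peak),
# evaluated at the 5-25% utilization range cited by the speakers.
for u in (0.05, 0.25):
    typical = typical_power(u, idle_watts=180, peak_watts=300)
    ideal = proportional_power(u, peak_watts=300)
    print(f"{u:.0%} utilization: typical {typical:.0f} W, ideal {ideal:.0f} W")
```

At these low utilization levels the assumed "typical" server burns most of its power just staying on, which is the behavior the speakers argue the industry needs to move away from.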
Abstracts Copyright © 2010 Information Inc., Bethesda, Maryland, USA