
AI Could Help Data Centers Run Far More Efficiently

By MIT News

August 28, 2019

Massachusetts Institute of Technology (MIT) researchers have created a system that automatically learns to allocate data-processing workloads optimally across thousands of servers, boosting data center efficiency.

Decima, the resulting scheduler, uses reinforcement learning to tailor its decisions to specific workloads running on specific server clusters.

Decima tries out many strategies for allocating incoming workloads across the servers, searching for the best trade-off between computational resource use and fast processing.
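
The abstract does not detail how Decima learns, so the Python sketch below only illustrates the trial-and-reward idea it describes: a simple learner tries two toy allocation strategies ("pack" jobs onto the busiest server versus "spread" them to the least busy one), scores each run with a reward that weighs utilization against completion time, and gradually favors whichever scores best. Every server count, strategy, and weight here is invented for illustration; this is not Decima's actual algorithm.

```python
import random

NUM_SERVERS = 4

def simulate(strategy, jobs):
    """Assign each job's runtime to a server; return (utilization, makespan)."""
    load = [0.0] * NUM_SERVERS            # total busy time per server
    for runtime in jobs:
        if strategy == "pack":            # pile work onto the busiest server
            i = max(range(NUM_SERVERS), key=lambda s: load[s])
        else:                             # "spread": send to the least busy one
            i = min(range(NUM_SERVERS), key=lambda s: load[s])
        load[i] += runtime
    makespan = max(load)                  # time until the last server finishes
    utilization = sum(load) / (NUM_SERVERS * makespan)
    return utilization, makespan

def reward(utilization, makespan):
    # The trade-off the article describes: reward high resource use and
    # fast completion. The 0.05 weight is purely illustrative.
    return utilization - 0.05 * makespan

# Bandit-style loop: try a strategy on each incoming batch of jobs, keep a
# running average of its reward, and increasingly pick the best performer.
values = {"pack": 0.0, "spread": 0.0}
counts = {"pack": 0, "spread": 0}
for episode in range(200):
    jobs = [random.uniform(1.0, 10.0) for _ in range(30)]
    strategy = (random.choice(list(values)) if random.random() < 0.1
                else max(values, key=values.get))
    r = reward(*simulate(strategy, jobs))
    counts[strategy] += 1
    values[strategy] += (r - values[strategy]) / counts[strategy]

print(values)  # "spread" ends up with the higher estimated reward
```

In the full system a learned policy stands in for the two hard-coded strategies, but the feedback loop (act, observe a reward, adjust) is the same in spirit.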

Decima completes jobs roughly 20% to 30% faster than the best handwritten scheduling algorithms, the researchers say.

MIT’s Hongzi Mao observed that “any slight improvement in utilization, even 1%, can save millions of dollars and a lot of energy in data centers.”

Abstracts Copyright © 2019 SmithBucklin, Washington, DC, USA
