Google just gave control over data center cooling to an AI
MIT Technology Review, August 17, 2018
Google revealed today that it has given control of cooling at several of its leviathan data centers to an AI algorithm.
Over the past couple of years, Google has been testing an algorithm that learns how best to adjust cooling systems—fans, ventilation, and other equipment—in order to lower power consumption. This system previously made recommendations to data center managers who would decide whether or not to implement them, leading to energy savings of around 40 percent in those cooling systems.
Now, Google says, it has effectively handed control to the algorithm, which is managing cooling at several of its data centers all by itself.
“It’s the first time that an autonomous industrial control system will be deployed at this scale, to the best of our knowledge,” says Mustafa Suleyman, head of applied AI at DeepMind, the London artificial intelligence company Google acquired in 2014.
The project demonstrates the potential for artificial intelligence to manage infrastructure—and shows how advanced AI systems can work in collaboration with humans. Although the algorithm runs independently, a person manages it and can intervene if it seems to be doing something too risky.
The algorithm exploits a technique known as reinforcement learning, which learns through trial and error. The same approach led to AlphaGo, the DeepMind program which vanquished human players of the board game Go (see “10 Breakthrough Technologies: Reinforcement Learning”).
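To make the idea concrete, here is a minimal sketch of tabular Q-learning, the textbook form of reinforcement learning, applied to a made-up cooling problem. Everything in it (the fan-speed levels, the toy temperature model, the reward) is invented for illustration and implies nothing about how DeepMind's actual system works.

```python
# Illustrative only: a toy tabular Q-learning agent, not DeepMind's system.
# The agent picks a fan-speed level for a simulated hall and is penalized
# for power use and for letting the hall overheat; it learns a policy by
# trial and error.
import random

SETTINGS = [0, 1, 2, 3]           # hypothetical fan-speed levels (low..high)
TEMPS = list(range(18, 31))       # discretized hall temperature, in degrees C

def step(temp, setting):
    """Toy dynamics: higher fan settings cool more but draw more power."""
    cooling = setting * 2
    heat_load = random.choice([2, 3, 4])            # server heat, arbitrary
    new_temp = min(max(temp + heat_load - cooling, TEMPS[0]), TEMPS[-1])
    power_cost = setting * 1.0
    overheat_penalty = 10.0 if new_temp > 27 else 0.0
    return new_temp, -(power_cost + overheat_penalty)  # reward = negative cost

q = {(t, a): 0.0 for t in TEMPS for a in SETTINGS}
alpha, gamma, epsilon = 0.1, 0.95, 0.1

temp = 22
for _ in range(50_000):
    # Epsilon-greedy: mostly exploit the best known action, sometimes explore.
    if random.random() < epsilon:
        action = random.choice(SETTINGS)
    else:
        action = max(SETTINGS, key=lambda a: q[(temp, a)])
    new_temp, reward = step(temp, action)
    best_next = max(q[(new_temp, a)] for a in SETTINGS)
    q[(temp, action)] += alpha * (reward + gamma * best_next - q[(temp, action)])
    temp = new_temp

# Learned policy: the preferred fan setting at each temperature.
print({t: max(SETTINGS, key=lambda a: q[(t, a)]) for t in TEMPS})
```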
DeepMind fed its new algorithm information gathered from Google data centers and let it determine what cooling configurations would lower energy consumption. The project could generate millions of dollars in energy savings and may help the company lower its carbon emissions, says Joe Kava, vice president of data centers for Google.
Kava says managers trusted the earlier system and had few concerns about delegating greater control to an AI. Still, the new system has safety controls to prevent it from doing anything that has an adverse effect on cooling. A data center manager can watch the system in action, see what the algorithm’s confidence level is about the changes it wants to make, and intervene if it seems to be doing something too risky.
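As a rough illustration of what such safety controls might look like, the sketch below vets each proposed action against operator-defined limits and a minimum confidence before applying it, and otherwise holds the last known-safe settings and flags the proposal for a human. The field names, limits, and threshold are all hypothetical, not Google's.

```python
# Hypothetical guardrail logic, invented for illustration; not Google's code.
from dataclasses import dataclass

@dataclass
class ProposedAction:
    fan_speed_pct: float   # assumed actuator, 0-100
    setpoint_c: float      # assumed chilled-water setpoint, in degrees C
    confidence: float      # the agent's own confidence estimate, 0.0-1.0

SAFE_FAN_RANGE = (20.0, 100.0)      # example operator-defined limits
SAFE_SETPOINT_RANGE = (16.0, 24.0)
MIN_CONFIDENCE = 0.8

def vet(proposal, last_safe):
    """Return (action_to_apply, needs_human_review)."""
    in_bounds = (SAFE_FAN_RANGE[0] <= proposal.fan_speed_pct <= SAFE_FAN_RANGE[1]
                 and SAFE_SETPOINT_RANGE[0] <= proposal.setpoint_c <= SAFE_SETPOINT_RANGE[1])
    confident = proposal.confidence >= MIN_CONFIDENCE
    if in_bounds and confident:
        return proposal, False
    # Out of bounds or low confidence: hold the last safe settings, alert a human.
    return last_safe, True

last_safe = ProposedAction(fan_speed_pct=60.0, setpoint_c=20.0, confidence=1.0)
proposal = ProposedAction(fan_speed_pct=15.0, setpoint_c=19.0, confidence=0.9)
applied, needs_review = vet(proposal, last_safe)
print(applied, "needs review:", needs_review)
```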
Data center energy consumption has become a pressing issue for the tech industry. A 2016 report from researchers at the US Department of Energy’s Lawrence Berkeley National Laboratory found US data centers consumed about 70 billion kilowatt hours in 2014—about 1.8 percent of total national electricity use.
But efforts to improve energy efficiency have been significant. The same report found that efficiency gains are almost canceling out increases in energy use from new data centers, although the total is still expected to reach around 73 billion kilowatt-hours by 2020.
“Use of machine learning is an important development,” says Jonathan Koomey, one of the world’s leading experts on data center energy usage. But he adds that cooling accounts for a relatively small share of a data center’s energy use, around 10 percent.
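Putting the article's own figures together gives a quick sense of scale: a cut of roughly 40 percent to a load that accounts for about 10 percent of consumption works out to around 4 percent of a facility's total energy, which is Koomey's point.

```python
# Back-of-envelope arithmetic using only the figures quoted in this article.
cooling_share = 0.10     # Koomey: cooling is roughly 10% of a center's energy use
cooling_savings = 0.40   # the earlier recommendation system: ~40% savings on cooling

overall_savings = cooling_share * cooling_savings
print(f"Approximate share of total facility energy saved: {overall_savings:.0%}")  # ~4%
```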
Koomey thinks using machine learning to optimize the behavior of the power-hungry computer chips inside data centers could prove even more significant. “I’m eager to see Google and other big players apply such tools to optimizing their computing loads,” says Koomey. “The possibilities on the compute side are tenfold bigger than for cooling.”