How Google Uses DeepMind's AI to Reduce Its Cooling Bill by 40%

There are over 7,500 data centers around the world, and the top 20 global cities host 2,600 of them. The Natural Resources Defense Council estimates that data centers consume about 3% of global energy production. London has the highest concentration, with 377 data centers, more than any other city.

In 2014, Google started using machine learning in its data centers and managed to reduce the amount of energy required for cooling by 40 percent. This was a vast improvement and a phenomenal step towards energy efficiency at a time when data centers, and their energy demand, are growing rapidly.

DeepMind’s AI

DeepMind is an artificial intelligence research and development company founded in 2010 and headquartered in London. In 2014, Google acquired DeepMind, which later became a subsidiary of Alphabet. Notable entrepreneurs such as Elon Musk and Scott Banister invested in DeepMind, along with major venture capital firms such as Horizons Ventures and Founders Fund.

In 2016, DeepMind and Google jointly developed an AI-powered recommendation system to improve the energy efficiency of Google's already highly optimized data centers. Rather than relying on human-implemented recommendations, DeepMind's AI system directly controls data center cooling, while remaining under the expert supervision of data center operators.

How It Works

Using a system of neural networks trained on different operating scenarios and parameters within Google's data centers, DeepMind created a more efficient and adaptive framework for understanding data center dynamics and optimizing energy use.

DeepMind collected a large amount of historical data from thousands of sensors fitted in Google's data centers – readings such as pump speeds, temperatures, cooling outlet settings, and power draw. This data was used to train the deep neural networks on the average future Power Usage Effectiveness (PUE), defined as the ratio of total building energy usage to IT energy usage.
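The PUE metric itself is a simple ratio. A minimal sketch, using hypothetical sensor readings (the variable names and kW figures here are illustrative assumptions, not Google's actual telemetry):

```python
# Hypothetical sensor readings in kW; a real deployment would pull these
# from data center telemetry.
total_facility_energy_kw = 1200.0  # everything: IT load, cooling, lighting, losses
it_equipment_energy_kw = 800.0     # servers, storage, and network gear only

# Power Usage Effectiveness: total building energy divided by IT energy.
# 1.0 is the theoretical ideal (zero overhead); lower is better.
pue = total_facility_energy_kw / it_equipment_energy_kw
print(f"PUE = {pue:.2f}")  # → PUE = 1.50
```

Every kilowatt spent on cooling rather than computation pushes PUE above 1.0, which is why cooling is the natural target for optimization.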

DeepMind also trained two additional groups of deep neural networks to predict the data center's temperature and pressure over the next hour.
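To illustrate the idea of predicting a future operating condition from sensor snapshots, here is a minimal sketch that fits a linear predictor on synthetic data. The linear model, the synthetic sensor features, and the coefficients are all stand-in assumptions; DeepMind used deep neural networks on real historical data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic snapshots standing in for historical sensor data: each row is
# three made-up features (e.g. pump speed, outside temperature, IT load).
X = rng.uniform(0.0, 1.0, size=(500, 3))

# Hypothetical "next-hour temperature" with a simple linear relationship
# plus measurement noise.
true_w = np.array([2.0, 5.0, 1.5])
y = X @ true_w + 20.0 + rng.normal(0.0, 0.1, size=500)

# Fit a linear predictor (a toy stand-in for a deep neural network):
# append a bias column and solve ordinary least squares.
Xb = np.hstack([X, np.ones((500, 1))])
w, *_ = np.linalg.lstsq(Xb, y, rcond=None)

# Predict the next-hour temperature for a new snapshot.
snapshot = np.array([0.5, 0.5, 0.5, 1.0])  # features + bias term
predicted_temp = snapshot @ w
print(f"predicted next-hour temperature: {predicted_temp:.1f} °C")
```

The real system's value comes from such forecasts: if the model can anticipate temperature and pressure an hour ahead, cooling actions can be chosen proactively rather than reactively.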

Upon deploying the model at a live data center on a typical day of testing, the machine learning recommendations were turned on for a period and then turned off. The sensor data collected was used to generate a graph showing that the AI achieved a consistent 40 percent reduction in the amount of energy used for cooling, which equates to a 15 percent reduction in overall PUE overhead.

However, a single neural network and machine learning architecture cannot simply be reused for another data center, since every data center has different processing methods and cooling systems. The deep neural networks need to be customized to each data center's architecture.

AI-powered Recommendation System

Every five minutes, DeepMind's cloud-based AI pulls a snapshot of the data center cooling system from thousands of sensors and feeds it into its deep neural networks, which predict how combinations of potential actions will affect future energy consumption. It then identifies the actions that minimize energy consumption while satisfying safety constraints. The recommendations are sent back to the data center, verified by the local control system, and then implemented.
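The loop just described (snapshot → predict → pick the safe, lowest-energy action) can be sketched as follows. Everything here is an illustrative assumption: the snapshot fields, the candidate actions, the safety limit, and the two `predict_*` functions are toy stand-ins for DeepMind's trained neural networks, not their actual interfaces.

```python
SAFETY_MAX_TEMP_C = 24.5  # assumed operating limit for this sketch

def predict_energy_kw(snapshot, action):
    # Toy stand-in for the trained energy model: more cooling costs more power.
    return snapshot["it_load_kw"] * 0.1 + action["pump_speed"] * 50.0

def predict_temp_c(snapshot, action):
    # Toy stand-in for the temperature model: higher pump speed cools more.
    return snapshot["temp_c"] + 2.0 - action["pump_speed"] * 5.0

def recommend(snapshot):
    # Enumerate a few candidate cooling actions.
    candidates = [{"pump_speed": s} for s in (0.2, 0.4, 0.6, 0.8, 1.0)]
    # Keep only actions whose predicted temperature stays within the safety
    # limit, then pick the one minimizing predicted energy use.
    safe = [a for a in candidates
            if predict_temp_c(snapshot, a) <= SAFETY_MAX_TEMP_C]
    return min(safe, key=lambda a: predict_energy_kw(snapshot, a))

snapshot = {"it_load_kw": 800.0, "temp_c": 24.0}
action = recommend(snapshot)
print(action)  # → {'pump_speed': 0.4}
```

Note how the cheapest action (`pump_speed = 0.2`) is rejected because its predicted temperature exceeds the safety limit; the constraint check runs before the energy minimization, mirroring the safety-first ordering described above.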

The recommendation system delivered consistent energy savings of about 30 percent on average, with further improvements expected.

An average data center consumes over 100 times the power of a large commercial building; a large data center may use as much electricity as a small town. The energy needed to power and cool the massive amounts of equipment in data centers can account for as much as 40 percent of total operational costs. Reducing energy usage has been a significant focus for many IT giants, and much has been done, but as energy consumption keeps rising, even more remains to do.

