There are over 7,500 data centers around the world, of which roughly 2,600 are located in the top 20 global cities. The Natural Resources Defense Council has estimated that data centers consume about 3% of global electricity production. London has the largest concentration of data centers of any city, with 377.

In 2014, Google started applying machine learning to its data centers and managed to reduce the amount of energy required for cooling by 40 percent. That is a remarkable improvement and a significant step toward energy efficiency at a time when data centers, and their energy demand, are growing rapidly.

DeepMind’s AI

DeepMind is an artificial intelligence research and development company founded in 2010 and headquartered in London. Google acquired DeepMind in 2014, and it later became part of the Alphabet group. Notable entrepreneurs such as Elon Musk and Scott Banister invested in DeepMind, along with major venture capital firms such as Horizons Ventures and Founders Fund.

In 2016, DeepMind and Google jointly developed an AI-powered recommendation system to improve the energy efficiency of Google’s already highly optimized data centers. Rather than relying on human-implemented recommendations, DeepMind’s AI system directly controls data center cooling, while remaining under the expert supervision of data center operators.

How It Works

DeepMind used neural networks trained in diverse Google data center scenarios to develop an adaptable framework for energy optimization.

DeepMind gathered extensive historical data from the thousands of sensors inside Google’s data centers, covering signals such as temperatures, power draw, pump speeds, and cooling setpoints, and used it to train deep neural networks. These networks were trained to predict Power Usage Effectiveness (PUE), defined as the ratio of total building energy usage to IT equipment energy usage.
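PUE itself is a simple ratio; a minimal sketch in Python (the kilowatt-hour figures below are made up for the example):

```python
def power_usage_effectiveness(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """PUE = total building energy usage / IT equipment energy usage (1.0 is the ideal)."""
    return total_facility_kwh / it_equipment_kwh

# A facility drawing 1,500 kWh while its IT equipment draws 1,200 kWh:
print(power_usage_effectiveness(1500, 1200))  # 1.25
```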

DeepMind also trained two additional ensembles of deep neural networks to predict the future temperature and pressure of the data center over the next hour.
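DeepMind has not published its model code, so the sketch below is only illustrative: the sensor features, synthetic targets, network sizes, and the use of scikit-learn are assumptions made for clarity, not Google’s actual setup.

```python
# Illustrative only: feature layout, targets and model sizes are assumptions.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(42)

# Each row: a sensor snapshot (fan speed, pump speed, outside air temperature,
# chilled-water setpoint, IT load) followed by three proposed control actions.
X = rng.uniform(size=(10_000, 8))

# Synthetic targets standing in for logged historical data.
pue = 1.1 + 0.3 * X[:, 0] + 0.1 * X[:, 3] - 0.2 * X[:, 5]     # current PUE
temp_1h = 22 + 5 * X[:, 1] - 3 * X[:, 5]                      # cold-aisle temp in 1 h (degC)
pressure_1h = 1.0 + 0.2 * X[:, 2] + 0.1 * X[:, 6]             # water-loop pressure in 1 h (bar)

net = dict(hidden_layer_sizes=(64, 64), max_iter=300, random_state=0)
pue_model = MLPRegressor(**net).fit(X, pue)
temp_model = MLPRegressor(**net).fit(X, temp_1h)
pressure_model = MLPRegressor(**net).fit(X, pressure_1h)
```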

During testing at a live data center, the model’s machine learning recommendations were enabled for a period and then deactivated. Analysis of the sensor data showed that the AI consistently lowered the energy used for cooling by 40 percent, which translated into a roughly 15 percent reduction in overall PUE overhead.
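The analysis step is conceptually simple: group the logged snapshots by whether the recommendations were active and compare the averages. A toy log with made-up numbers sketches the idea in pandas:

```python
# Toy analysis with made-up numbers: compare average cooling energy and PUE
# for periods with the ML recommendations enabled versus disabled.
import pandas as pd

logs = pd.DataFrame({
    "ml_enabled":         [False, False, True,  True],
    "cooling_energy_kwh": [300.0, 310.0, 185.0, 180.0],
    "total_energy_kwh":   [1500.0, 1520.0, 1390.0, 1385.0],
    "it_energy_kwh":      [1150.0, 1160.0, 1155.0, 1150.0],
})
logs["pue"] = logs["total_energy_kwh"] / logs["it_energy_kwh"]
print(logs.groupby("ml_enabled")[["cooling_energy_kwh", "pue"]].mean())
```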

A given neural network and machine learning architecture is not universally applicable across data centers, because processing and cooling approaches vary. The deep neural networks must be customized to the architecture of each data center.

AI-powered Recommendation System

Every five minutes, DeepMind’s cloud-based AI pulls a snapshot of the data center cooling system from thousands of sensors and feeds it into its deep neural networks, which predict how different combinations of potential actions will affect future energy consumption. The system then identifies the actions that minimize energy consumption while satisfying safety constraints. These recommendations are sent back to the data center, where the local control system verifies and then implements them.
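A hedged sketch of that loop, reusing the toy models from the earlier training sketch; the candidate actions, grid search, and safety limits below are illustrative assumptions, not DeepMind’s actual optimizer:

```python
# Sketch of the five-minute recommendation loop under the assumptions above.
import itertools
import numpy as np

def recommend(snapshot, pue_model, temp_model, pressure_model,
              max_temp_c=27.0, max_pressure_bar=1.3):
    """Return the action combination with the lowest predicted PUE that stays
    within the safety constraints, or None if no candidate is safe."""
    grid = np.linspace(0.0, 1.0, 5)          # coarse grid over 3 control knobs
    best_action, best_pue = None, float("inf")
    for action in itertools.product(grid, repeat=3):
        features = np.concatenate([snapshot, action]).reshape(1, -1)
        if temp_model.predict(features)[0] > max_temp_c:
            continue                          # would overheat the cold aisle
        if pressure_model.predict(features)[0] > max_pressure_bar:
            continue                          # would exceed the pressure limit
        predicted_pue = pue_model.predict(features)[0]
        if predicted_pue < best_pue:
            best_action, best_pue = action, predicted_pue
    return best_action, best_pue

# Called roughly every five minutes with the latest sensor snapshot:
snapshot = np.array([0.5, 0.4, 0.6, 0.5, 0.7])
action, predicted_pue = recommend(snapshot, pue_model, temp_model, pressure_model)
print(action, predicted_pue)  # the recommendation then goes to the local control system
```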

The recommendation system delivered consistent energy savings of about 30 percent on average, with further improvements expected.

An average data center consumes over 100 times as much power as a large commercial building, and a large data center may use enough electricity to power a small town. Powering and cooling this extensive equipment can account for up to 40% of a data center’s operational costs. Many IT giants have placed significant emphasis on reducing energy usage and have accomplished a great deal, but addressing the growing energy consumption of data centers still requires much more effort.