Google Is Using AI to Cool Down Its Data Centers

On Friday, August 17, a news report came out claiming that Google is taking the human element out of its data centers.

At the same time, the company is looking to save on energy costs in a space where those costs have long been skyrocketing. Even without the exact numbers, you most likely know that for many technology companies, data centers are among their biggest expenses, because of how much energy is consumed by continuously running, and cooling, the servers in these facilities. According to TechCrunch, Google is now turning to AI systems in an attempt to optimize the cost of running its data centers. Given that it is perhaps the biggest technology company on the planet, with perhaps the most data centers, this would seem to be an ideal course of action at this time.

Interestingly enough, the way Google came to this decision traces back to its most serious foray into artificial intelligence: its subsidiary DeepMind, which we have published other pieces on as well. Reportedly, back in 2016, Google began devoting a large share of its AI resources to examining what effect handing the temperature regulation of its data centers over to an AI would have on its cooling costs.

What they found was actually quite telling.

Based on Google’s research, doing so with DeepMind at scale would cut its cooling bills by 40%, which would appear to indicate that an AI can consistently maintain a higher level of efficiency in this area than human operators. In the same report, Google pointed to a fact that has been worrying a great many businesses involved in computing in any capacity: as computers evolve, energy costs will keep skyrocketing unless the world does something to combat them.

Given that Google is already known for taking specific, significant steps to curb its energy usage in several other areas of its operations, it is not surprising that it would lead the charge here, too. Lending credence to this idea is the company’s stated goal of eventually running every one of its business units on 100% renewable energy. Alongside that goal, Google claims that by 2016 it had already found a way to get at least 3.5 times the computing power it had in 2011 while using the same amount of energy. Moreover, reaching that conclusion involved giving the AI nearly unfettered access to just about every part of a Google data center, with the aim of understanding all aspects of how and why those facilities do what they do.
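
To put those two figures in perspective, here is a quick back-of-the-envelope calculation. This is our own arithmetic based only on the numbers quoted above, not anything Google has published:

```python
# Back-of-the-envelope arithmetic from the figures quoted above.
compute_growth = 3.5      # 2016 computing power relative to 2011
total_energy_ratio = 1.0  # same total energy as 2011
cooling_reduction = 0.40  # reported cut in cooling bills with DeepMind

# 3.5x the compute on the same energy means each unit of compute
# uses roughly 29% of the energy it did in 2011.
energy_per_unit_compute = total_energy_ratio / compute_growth
print(f"Energy per unit of compute vs. 2011: {energy_per_unit_compute:.0%}")

# A 40% cut in cooling costs leaves 60% of the original cooling bill.
remaining_cooling_cost = 1.0 - cooling_reduction
print(f"Cooling cost vs. baseline after DeepMind: {remaining_cooling_cost:.0%}")
```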

In doing so, they also set careful guardrails for the AI, including, apparently, explicitly telling it, in terms it can understand, which actions are never acceptable under any circumstances. In truth, the system had reportedly already been allowed to offer extra guidance to data center employees after the 2016 report was published. At the time, it was left to the discretion of the human workers, on a center-by-center basis, whether DeepMind would have any influence over data center operations at all.

Now, however, the DeepMind team has decided that its AI is effectively ready to take control of the temperature regulation, and as a result the energy consumption, of its data centers. For the time being, the solution has only been rolled out to a few key locations, which would seem to show that DeepMind and Google are still wary of being overconfident in a solution that largely takes humans out of the equation. Judging by information that TechCrunch has obtained, however, every implementation comes with a large set of what the article calls “checks and balances,” including requiring the AI to evaluate and make all of its decisions based on the data it obtains every five minutes from thousands of sensors in each data center.
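
To make the idea of these “checks and balances” concrete, here is a minimal sketch in Python of what a supervised control loop of this kind might look like. The five-minute cadence and the notion of thousands of sensors come from the TechCrunch report; everything else (the function names, the temperature bounds, the clamping rule, and the fallback behavior) is an assumption for illustration, not a description of Google’s or DeepMind’s actual system.

```python
import random  # stands in for real sensor readings in this sketch
import time

# Hypothetical safety bounds; the real constraints are not public.
SETPOINT_MIN_C = 18.0
SETPOINT_MAX_C = 27.0
POLL_INTERVAL_SECONDS = 5 * 60  # decisions are re-evaluated every five minutes


def read_sensors(num_sensors: int = 1000) -> list[float]:
    """Placeholder for polling a data center's temperature sensors."""
    return [random.uniform(17.0, 30.0) for _ in range(num_sensors)]


def model_recommend_setpoint(readings: list[float]) -> float:
    """Placeholder for the learned model's recommended cooling setpoint."""
    # A trivial heuristic stands in for the actual ML model here.
    return sum(readings) / len(readings) - 1.0


def clamp_to_safe_range(setpoint: float) -> float:
    """Clip any proposed action to a pre-approved operating range."""
    return max(SETPOINT_MIN_C, min(SETPOINT_MAX_C, setpoint))


def control_loop(iterations: int = 3) -> None:
    for _ in range(iterations):
        readings = read_sensors()
        proposed = model_recommend_setpoint(readings)
        applied = clamp_to_safe_range(proposed)
        if applied != proposed:
            # In a real deployment this is where operators would be alerted
            # and local controllers would fall back to safe defaults.
            print(f"Proposed setpoint {proposed:.1f}C out of bounds; using {applied:.1f}C")
        else:
            print(f"Applying setpoint {applied:.1f}C")
        time.sleep(POLL_INTERVAL_SECONDS)  # shorten this when experimenting


if __name__ == "__main__":
    control_loop()
```

The important property in a setup like this is the clamp: no matter what the model proposes, nothing outside a pre-approved operating range ever reaches the cooling equipment, and anything unusual is surfaced to human operators.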

Even so, a large risk remains simply because of the nature of machine learning models, which are the foundational elements of every AI system. In short, machine learning models are at their weakest when first deployed and require time to become truly useful. By their nature, they learn from failure, which would seem to indicate that keeping an element of control in human hands is the best option at this point.

Resources:

Primary Source:

https://techcrunch.com/2018/08/17/google-gives-its-ai-the-reins-over-its-data-center-cooling-systems/?guccounter=1

6 Things to Know about Data Centers and Artificial Intelligence:

https://www.telehouse.com/2018/01/data-centers-and-artificial-intelligence/

Impact of AI in the Data Center:

https://betanews.com/2018/04/30/data-center-ai/
