
Google's Chiller-less Data Center

Google has begun operating a data center in Belgium that has no chillers to support its cooling systems, which will improve energy efficiency but make local weather forecasting a factor in its network management.

Rich Miller

July 15, 2009

4 Min Read

The equipment yard at the Google data center in Saint-Ghislain, Belgium, features no chillers. (Photo from Google)


Google (GOOG) has begun operating a data center in Belgium that has no chillers to support its cooling systems, a strategy that will improve its energy efficiency while making local weather forecasting a larger factor in its data center management.

Chillers, which are used to refrigerate water, are widely used in data center cooling systems but require a large amount of electricity to operate. With the growing focus on power costs, many data centers are reducing their reliance on chillers to improve the energy efficiency of their facilities.

This has boosted adoption of "free cooling," the use of fresh air from outside the data center to support the cooling systems. This approach allows data centers to use outside air when the temperature is cool, while falling back on chillers on warmer days.

Google has taken the strategy to the next level. Rather than using chillers part-time, the company has eliminated them entirely at its data center near Saint-Ghislain, Belgium, which began operating in late 2008. The facility also features an on-site water purification plant that allows it to draw water from a nearby industrial canal rather than from a municipal water utility.

Year-Round Free Cooling
The climate in Belgium will support free cooling almost year-round, according to Google engineers, with temperatures rising above the acceptable range for free cooling about seven days per year on average. The average temperature in Brussels during summer reaches 66 to 71 degrees Fahrenheit, while Google maintains its data centers at temperatures above 80 degrees.
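To make that climate math concrete, here is a minimal sketch, not Google's actual control logic, of the free-cooling check those numbers imply. The five-degree approach margin is an assumption for illustration.

```python
# A minimal sketch of the free-cooling envelope described above: outside
# air can carry the cooling load as long as it stays comfortably below
# the facility's ~80 F operating setpoint. The margin is an assumption.

FACILITY_SETPOINT_F = 80.0   # Google runs its data centers above 80 F
APPROACH_MARGIN_F = 5.0      # assumed headroom needed between outside air
                             # and the cold aisle for heat exchange

def free_cooling_available(outside_temp_f: float) -> bool:
    """Return True if outside air alone can hold the facility setpoint."""
    return outside_temp_f <= FACILITY_SETPOINT_F - APPROACH_MARGIN_F

# Average Brussels summer temperatures of 66-71 F leave ample headroom,
# so this check fails only about seven days per year.
for temp in (68.0, 71.0, 83.0):
    status = "free cooling" if free_cooling_available(temp) else "too warm"
    print(f"{temp} F -> {status}")
```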

So what happens if the weather gets hot? On those days, Google says it will turn off equipment as needed in Belgium and shift computing load to other data centers. This approach is made possible by the scale of the company's global network of data centers, which gives it the ability to move an entire facility's workload elsewhere.
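As a rough illustration of that decision, the sketch below flags forecast days that exceed an assumed free-cooling limit and marks them for diversion. The threshold and the five-day outlook are hypothetical, not Google data.

```python
# A hedged sketch of forecast-driven diversion: if the outlook shows
# temperatures beyond the free-cooling envelope, plan to shift that
# day's load to other data centers. Threshold and data are illustrative.

FREE_COOLING_LIMIT_F = 75.0  # assumed upper bound for reliable free cooling

def days_needing_diversion(forecast_highs_f):
    """Return forecast days whose highs exceed the free-cooling limit."""
    return [(day, high) for day, high in forecast_highs_f
            if high > FREE_COOLING_LIMIT_F]

# Hypothetical five-day outlook for Saint-Ghislain (day, forecast high, F).
outlook = [("Mon", 70), ("Tue", 73), ("Wed", 81), ("Thu", 84), ("Fri", 72)]
for day, high in days_needing_diversion(outlook):
    print(f"{day}: {high} F -> plan to shift load to other data centers")
```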

In a March interview, Urs Holzle, Google's Senior Vice President of Operations, said the company typically uses manual tools to manage data-center-level outages and downtime. "Teams regularly practice failing out of or routing around specific data centers as part of scheduled maintenance," he said. "Sometimes we need to build new tools when new classes of problems happen."

Redirecting Workloads Instantly
At last month's Structure 09 conference, Google's Vijay Gill hinted that the company has developed automated tools to manage data center heat loads and quickly redistribute workloads during thermal events (a topic covered by The Register). 

"You have to have integration with everything right from the chillers down all the way to the CPU," said Gill, Google's Senior Manager of Engineering and Architecture. "Sometimes, there's a temperature excursion, and you might want to do a quick load-shedding to prevent a temperature excursion because, hey, you have a data center with no chillers. You want to move some load off. You want to cut some CPUs and some of the processes in RAM."

Gill was asked if this was a technology Google is using today. "I could not possibly comment on that," Gill replied.

Look Ma: No Chillers!
But Google engineers had already disclosed the existence of the chiller-less Belgium data center at the Google Data Center Efficiency Summit in April in Mountain View, Calif. At the event, we asked specifically: are there chillers on-site that are rarely used, or no chillers at all?

The answer: no chillers at all. The facility will rely entirely on free cooling, and redirect workload on days when it's too hot to operate. This approach makes local weather an issue in network management, although advanced forecasting can help Google anticipate days when it may need to divert work from the Belgium facility.

Nonetheless, rerouting an entire data center periodically challenges even Google, as seen in a February Gmail outage in which a data center was overloaded while workloads were being shifted. Traffic redirection was also a factor in a brief outage in May.

An Enabler for "Follow the Moon"?
The ability to seamlessly shift workloads between data centers also creates intriguing long-term energy management possibilities, including a "follow the moon" strategy which takes advantage of lower costs for power and cooling during overnight hours. In this scenario, virtualized workloads are shifted across data centers in different time zones to capture savings from off-peak utility rates. 

This approach has been discussed by cloud technologists Geva Perry and James Urquhart as a strategy for cloud computing providers with global data networks, who could offer a "follow-the-moon" service to enterprise customers who would normally build data centers where power is cheap. But this approach could also produce energy savings for a single company with a global network - someone like Google.
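As a speculative sketch of how a "follow the moon" scheduler might pick sites, the snippet below prefers data centers currently inside an overnight off-peak window. The site list, UTC offsets, and rate windows are hypothetical.

```python
# A speculative sketch of "follow the moon": among a set of data centers,
# prefer those where it is currently overnight and off-peak utility rates
# apply. Sites, offsets, and windows below are illustrative assumptions.

from datetime import datetime, timedelta, timezone

SITES = {            # site -> UTC offset in hours (hypothetical)
    "belgium": 2,
    "oregon": -7,
    "taiwan": 8,
}
OFF_PEAK_WINDOWS = (range(22, 24), range(0, 6))  # assumed off-peak local hours

def local_hour(utc_now, offset_hours):
    """Compute the local hour at a site from UTC time and its offset."""
    return (utc_now + timedelta(hours=offset_hours)).hour

def overnight_sites(utc_now):
    """Return sites currently inside an assumed off-peak overnight window."""
    return [site for site, offset in SITES.items()
            if any(local_hour(utc_now, offset) in w for w in OFF_PEAK_WINDOWS)]

now = datetime.now(timezone.utc)
print("candidates for overnight load:", overnight_sites(now))
```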
