By Casey Laughman, Managing Editor
The cooling infrastructure isn't the only place where Google has a sharp focus on efficiency. The servers in its data centers are custom-built and stripped down to only what they need; unnecessary components such as peripheral connectors and video cards are left out. The fans run only fast enough to keep the machines from overheating, rather than at full speed around the clock. And, in a departure from industry norms, the servers carry backup batteries directly on the racks, which not only cuts power demand but also eliminates two AC/DC conversion stages on the way into the servers, reducing power loss.
That means the servers are not supported by the usual UPS structure. But, as Kava points out, a UPS is not meant to ride out long-term outages; anything beyond what the on-board battery can cover falls to a generator anyway.
"Even if you've got 10 minutes of battery, you're probably not going to fix the problem within 10 minutes," he says.
All of Google's efficiency efforts mean that its data centers run at a power usage effectiveness (PUE) of 1.12. In other words, for every watt Google uses to power the IT equipment, it uses an additional 0.12 watts to power the facility the equipment is housed in. According to a 2012 study from the Uptime Institute, the industry average PUE is 1.88, which works out to 880 additional watts of overhead power for every kilowatt used to power IT equipment. (To account for the on-board batteries taking the place of a UPS, Google counts only the servers, storage, and networking equipment as IT power.)
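The PUE arithmetic is straightforward, and a short sketch makes the gap between 1.12 and 1.88 concrete. This is purely illustrative, restating the figures above; it is not Google's or Uptime's tooling:

```python
def overhead_watts(pue: float, it_watts: float) -> float:
    """Facility overhead (cooling, lighting, distribution losses) implied
    by a given PUE and IT load. PUE = total facility power / IT power,
    so overhead = (PUE - 1) x IT power."""
    return (pue - 1.0) * it_watts

# Google's reported 1.12: about 120 W of overhead per 1,000 W of IT load.
google = overhead_watts(1.12, 1000)
# Uptime Institute's 2012 industry average of 1.88: about 880 W per kW.
industry = overhead_watts(1.88, 1000)
print(round(google), round(industry))
```

At the same IT load, the average facility in that study spends roughly seven times as much power on overhead as Google's reported figure.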
This focus on efficiency becomes even more pronounced when you consider that, at Google's scale, multiple megawatts are being consumed, which means efficiency improvements, however small, add up quickly.
"If you make a very, very minor efficiency improvement across a data center that has 1 MW of critical load, it might take a very long time for that efficiency change to pay for itself," says Kava. "If you have that same efficiency change over a data center that maybe has 10 MW of critical load, well, then, the payback can be much faster."
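Kava's payback point can be sketched numerically: a one-time upgrade with a fixed cost recoups itself in proportion to the load it applies to. All the numbers below (upgrade cost, savings fraction, electricity price) are illustrative assumptions, not figures from the article:

```python
def payback_years(upgrade_cost: float, load_mw: float,
                  savings_fraction: float, price_per_mwh: float) -> float:
    """Years for a one-time efficiency upgrade to pay for itself,
    assuming it trims a fixed fraction of the facility's power bill."""
    hours_per_year = 8760
    annual_savings = load_mw * hours_per_year * savings_fraction * price_per_mwh
    return upgrade_cost / annual_savings

# Same hypothetical $100,000 upgrade saving 0.5% of the bill at $60/MWh:
print(payback_years(100_000, 1, 0.005, 60))    # ~38 years on 1 MW of critical load
print(payback_years(100_000, 10, 0.005, 60))   # ~3.8 years on 10 MW
```

Because the annual savings scale linearly with critical load while the upgrade cost stays fixed, ten times the load means one-tenth the payback period, which is exactly the contrast Kava draws.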