Reducing Data Center Energy Use Means Focusing on PUE, Air Flow, LEED
First of a three-part article on how to make critical facilities more energy efficient.
Given all the power that data centers use, it’s no surprise that energy efficiency is a hot topic in the industry. “Interest in energy efficient data centers has grown as companies realize they can lower the cost of operating a data center by reducing fees for power consumption,” says Robert Cassiliano, CEO at Business Information Services and chairman of 7x24 Exchange.
The good news for facility managers is that a range of proven measures — including some relatively low-cost steps — can be taken to cut energy use significantly.
Energy efficiency is far more than talk for many data centers today. One tangible sign of progress is that typical power usage effectiveness (PUE) numbers have been falling. PUE, the industry's main measure of data center energy efficiency, is defined as total facility power divided by IT equipment power; the closer the ratio is to 1.0, the less energy is spent on overhead such as cooling.
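As a quick illustration of the ratio — using hypothetical numbers, not figures from any facility cited here — PUE can be computed directly:

```python
def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Power usage effectiveness: total facility power divided by IT power.

    A PUE of 1.0 would mean every watt drawn goes to IT equipment,
    with nothing spent on cooling, power distribution, or lighting.
    """
    if it_equipment_kw <= 0:
        raise ValueError("IT load must be positive")
    return total_facility_kw / it_equipment_kw

# Hypothetical facility: 1,400 kW total draw supporting 1,000 kW of IT load.
print(round(pue(1400, 1000), 2))  # 1.4
```
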
In general, says Paul Schlattman, senior vice president at ESD Consulting, a design PUE of 1.4, with an annualized operating goal of 1.2, has become the basis of design. Seven or eight years ago, he says, a PUE of 1.8 or higher was acceptable.
Cassiliano says he sees a slightly higher PUE value in the design of new corporate data centers. “New data centers typically are targeting PUEs of 1.6 or better,” says Cassiliano.
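To see why those targets matter, here is a back-of-the-envelope sketch — assuming a hypothetical 1 MW IT load running around the clock, numbers not taken from the article — of the overhead energy separating the older 1.8 norm from today's 1.4 design basis:

```python
HOURS_PER_YEAR = 8760

def overhead_kwh_per_year(it_load_kw: float, pue: float) -> float:
    """Annual non-IT energy (cooling, distribution losses, lighting).

    Overhead power is IT load times (PUE - 1), since PUE is total
    facility power divided by IT power.
    """
    return it_load_kw * (pue - 1.0) * HOURS_PER_YEAR

it_kw = 1000  # hypothetical 1 MW of IT load
old = overhead_kwh_per_year(it_kw, 1.8)  # the older acceptable norm
new = overhead_kwh_per_year(it_kw, 1.4)  # today's design basis

# Roughly 3.5 million kWh of overhead avoided per year.
print(f"{old - new:,.0f} kWh saved per year")
```

At typical commercial electricity rates, a difference on that scale translates into hundreds of thousands of dollars annually, which is why even a few tenths of a point of PUE draw so much attention.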
The move to energy-efficient data centers got a boost when the U.S. Green Building Council released its LEED for Data Centers rating system in 2013. Corey Enck, vice president of LEED technical development, says that the USGBC worked with the industry to quantify how data centers are different from other buildings and rate them accordingly. Of course, energy use is “orders of magnitude different from a typical site,” Enck says. Beyond that, commissioning a data center requires a specific skill set, Enck says, and the standards for indoor environmental quality are actually lower. That’s because so few employees typically work there.
Water use, too, is far different from that of a typical office building, where LEED standards reflect sinks and toilets. A data center's water use has to be evaluated using industrial standards. Enck says it's "just a different output, data instead of widgets."
The new LEED certification applies only to buildings solely devoted to data centers. Developing criteria for an office building that might have one floor of data storage is a longer-term project, Enck says.
Whether a data center is standalone or part of a larger building, or new construction or legacy space, a set of fundamental best practices can go a long way to achieving energy efficiency.
A good place to start is with air flow.
“It’s important for customers to maintain discipline around their own equipment,” says David Rinard, senior director of global sustainability for Equinix, which runs 146 colocation data centers totaling 6 million square feet. “Air is like water. It seeks the path of least resistance.” To help control that flow, Equinix uses blanking plates in server cabinets with unused slots; if a customer’s cabinet has 10 empty spots, a metal and foam block can plug those spaces. To reduce overall energy use, Equinix also builds its facilities with variable-speed drives in its HVAC equipment and installs smart control systems.
Along with effective air flow management, hot-aisle or cold-aisle containment is a proven way to trim energy use.
A standard practice at new facilities is to install servers in double rows, with their hot back sides facing each other. Building partitions to keep the hot aisles and cold aisles separate lowers overall cooling costs, because cooled air feeds the servers’ intakes before it can mix with the hot exhaust air. Containment can also be retrofitted into existing data centers.
Perhaps the simplest energy-saving technique of all is to let the building get warmer. ASHRAE and the IT industry have agreed that servers can tolerate higher temperatures than they once could. Years ago, 68 to 72 degrees Fahrenheit was considered the standard, Rinard says.
Today’s IT equipment, however, can handle temperatures of 78 degrees or more, meaning less cooling is needed.
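As a minimal sketch of how an operator might sanity-check a warmer setpoint — assuming ASHRAE TC 9.9's recommended server-inlet envelope of roughly 18 to 27 degrees C (about 64 to 81 degrees F); the function and constant names here are illustrative, not from any real monitoring product:

```python
# ASHRAE TC 9.9 recommended inlet-air envelope, in degrees Fahrenheit.
ASHRAE_RECOMMENDED_F = (64.4, 80.6)

def inlet_temp_ok(temp_f: float) -> bool:
    """Return True if a server inlet temperature falls inside the
    ASHRAE-recommended envelope."""
    low, high = ASHRAE_RECOMMENDED_F
    return low <= temp_f <= high

print(inlet_temp_ok(78.0))  # True: the warmer setpoint cited above is in range
print(inlet_temp_ok(60.0))  # False: overcooled, wasting cooling energy
```
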
About 7X24 Exchange
7x24 Exchange is the leading knowledge exchange for the mission-critical industry. Through its conferences and 7x24 Exchange Magazine, it will continue to present topics of interest, case studies, and industry trends that provide value to its membership and conference participants. Visit 7x24 Exchange’s website for more information.