Part 1: Mission Critical Facilities Require Cooperation Between Facilities and IT
Part 2: Avoiding Conflict Between FM and IT
Part 3: Teaching IT About Energy Efficiency in Data Centers
Part 4: ASHRAE and Energy Star for Green Data Centers
By Rita Tatum
January 2010
Peoples says that IT managers need to understand two important points when it comes to energy efficiency. "Electrical power is not limitless and it is not free," he says. "Data center capital costs, their total power capacity, and the energy consumption of IT equipment should all be important considerations in the value proposition of IT investments."
Reed says that one way to reach IT professionals is with good metrics. For an Internet search company, for example, present what energy costs per search, he says. Other options include translating energy use into transactions per day or data hits per watt.
Another way to get the attention of IT managers is to point out that sharp increases in data center energy consumption affect IT as well, even if IT doesn't pay the electric bill out of its own budget. As electricity climbs from 1 percent of overall IT spending to 8 percent or more, companies find they have less to spend on software and other IT needs, says Brill.
For those who still don't get the importance of energy consumption, one facility manager suggests changing the way overhead is assigned. "The only way to educate them is to charge them kilowatt-hours instead of square footage," he says.
What makes one data center more efficient than another? Obviously, electrical and mechanical equipment efficiencies are one factor. Today's uninterruptible power supplies and transformers easily operate at 95 percent efficiency. "In fact, much of today's gear is 97 to 98 percent efficient," says Dennis Cronin, principal with Gilbane - Mission Critical.
Mechanical cooling gear varies widely. DX, centrifugal, screw, absorption and other compressors all need to be evaluated for efficiency. Other savings opportunities include waterside and airside free cooling, or recovering waste heat for use elsewhere.
Another element is the level of reliability to which the data center is designed. In fault-tolerant facilities, there is more standby redundant equipment capacity that works at less than peak efficiency due to partial loading, says Cronin.
Load versus capacity is a key factor in efficient operations. Data centers operate most efficiently at 100 percent load. However, that rarely occurs. "To address loading versus capacity issues, many designs are migrating to a modular concept," says Cronin, "where capacity can be quickly added in discrete steps as the load grows."
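The advantage of the modular approach Cronin describes can be sketched with a little arithmetic. The efficiency curve and capacity figures below are purely illustrative assumptions, not vendor data: the point is simply that right-sized modules keep equipment near its efficient operating range while a monolithic build-out runs deeply underloaded for years.

```python
import math

def efficiency_at_load(load_fraction):
    """Hypothetical efficiency curve: gear runs near peak efficiency
    above roughly 50 percent load and falls off sharply below that.
    Illustrative numbers only."""
    return 0.95 * min(1.0, load_fraction / 0.5)

def input_power_kw(it_load_kw, capacity_kw):
    """Utility power drawn to serve a given IT load at a given
    installed capacity, under the hypothetical curve above."""
    eff = efficiency_at_load(it_load_kw / capacity_kw)
    return it_load_kw / eff

# Day-one IT load of 400 kW against a facility sized for 2,000 kW.
monolithic = input_power_kw(400, 2000)          # 20% loaded, inefficient

# Modular design: 500 kW increments, deployed only as load grows.
modules_needed = math.ceil(400 / 500)           # one module suffices
modular = input_power_kw(400, modules_needed * 500)  # 80% loaded

print(round(monolithic), round(modular))  # 1053 421
```

Under these assumed numbers the underloaded monolithic plant draws well over twice the utility power of the single right-sized module for the same IT load, which is the "discrete steps" argument in miniature.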
Good design and efficient technology are important, but they only go so far. Visiting site after site, Ken Brill of The Uptime Institute says he often finds the initial data center design is "very efficient, but the client is not operating it as it was intended." Often, he says, the problem goes back to lack of engineering knowledge. "The hot aisle should be hot; the cold aisle should be cold, but not that cold."
— Rita Tatum
When it comes to tackling data center energy use, knowledge is power. Dennis Cronin, principal with Gilbane - Mission Critical, says that facility managers should focus on data-center energy-use measurements because the monitoring technology exists.
"You can now monitor power right down to the outlet," he says. "You can even do it over a wireless network." Still, only high-end data center operators are installing the load monitoring devices and analytical software needed for extensive monitoring.
Thomas Reed, senior principal, director of mission critical projects for KlingStubbins, suggests that the common measurement of watts per square foot needs tweaking.
A better metric is kilowatts per cabinet, he says. Knowing the kilowatts used per cabinet can point to operational inefficiencies. Servers, like chillers, work best at their optimum conditions. So if some cabinets are operating at 2 kilowatts when they are rated for 15 to 20, IT loads should be shifted, "taking advantage of some substantial energy and footprint savings," says Reed.
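Reed's kilowatts-per-cabinet screen is easy to automate once per-cabinet monitoring is in place. The sketch below is a minimal illustration; the cabinet names, readings, and the 25 percent utilization floor are all hypothetical values chosen for the example.

```python
# Hypothetical readings: cabinet name -> (measured kW, rated kW).
cabinets = {
    "A01": (2.0, 15.0),   # far below rated density
    "A02": (1.5, 15.0),
    "B01": (12.0, 15.0),  # well loaded
    "B02": (3.0, 20.0),
}

def consolidation_candidates(readings, utilization_floor=0.25):
    """Flag cabinets drawing well below their rated density; their IT
    load could be shifted onto fewer, better-loaded cabinets for the
    energy and footprint savings Reed describes."""
    return sorted(name for name, (kw, rated) in readings.items()
                  if kw / rated < utilization_floor)

print(consolidation_candidates(cabinets))  # ['A01', 'A02', 'B02']
```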
Two common ways to measure the efficiency of data center energy use are power usage effectiveness (PUE) and data center infrastructure efficiency (DCIE). PUE is expressed as a ratio, with efficiency improving as the ratio decreases toward 1.0. Basically, PUE is the result of dividing the power entering a data center by the power used to run the computer infrastructure. DCIE is PUE's reciprocal and is expressed as a percentage that improves as it nears 100 percent.
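The two metrics are simple ratios of the same two measurements, so converting between them is trivial. A minimal sketch, with an illustrative 1,000 kW facility draw and 500 kW IT load:

```python
def pue(total_facility_kw, it_kw):
    """Power usage effectiveness: total power entering the data center
    divided by power used by the IT equipment. Lower is better;
    1.0 is the theoretical floor."""
    return total_facility_kw / it_kw

def dcie(total_facility_kw, it_kw):
    """Data center infrastructure efficiency: the reciprocal of PUE,
    expressed as a percentage. Higher is better, up to 100 percent."""
    return 100.0 * it_kw / total_facility_kw

# Illustrative: 1,000 kW enters the building, 500 kW reaches IT gear.
print(pue(1000, 500))   # 2.0
print(dcie(1000, 500))  # 50.0
```

At a PUE of 2.0, each watt trimmed from the IT load removes two watts of utility demand, which is the multiplier behind the facility manager's rule of thumb.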
Many experts use PUE or DCIE to discuss how power is used. One facility manager puts it this way: "Every watt saved in the IT equipment is 2 watts off the electric bill, assuming a PUE of 2. Therefore, IT efficiency has the largest impact on the data center's electric bill and on greenhouse gas reductions."
Another way to think of energy use in a data center with a PUE of 3.0 is this: "One watt in CPU reflects three watts at the utility," says Paul Schlattman, vice president, mission critical facilities group for Environmental Systems Design.
For the environmentally conscious, Rocky Mountain Institute (RMI) takes that watt back to its source — the power plant. RMI estimates every watt saved in IT equipment can save as much as 10 watts of power plant energy.
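A multiplier of that order can arise from compounding losses at each stage between the power plant and the processor. The stage efficiencies below are assumed round numbers for illustration, not RMI's actual model:

```python
# Illustrative stage efficiencies between fuel input and the IT load.
stages = {
    "generation":        0.33,  # thermal plant, fuel to electricity
    "transmission":      0.92,  # grid and distribution losses
    "facility_overhead": 0.50,  # PUE of 2.0: half the power reaches IT
    "power_supply":      0.80,  # server PSU losses
    "voltage_regulation": 0.85, # on-board conversion losses
}

def source_watts_per_it_watt(stages):
    """Watts of power-plant fuel input needed per watt delivered
    to the IT load, compounding each stage's losses."""
    multiplier = 1.0
    for eff in stages.values():
        multiplier /= eff
    return multiplier

print(round(source_watts_per_it_watt(stages), 1))  # 9.7
```

With these assumptions, one watt at the IT load traces back to nearly 10 watts of source energy, consistent with the scale of the RMI estimate.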
Paul Peoples, group manager, data center operations for Target, takes a holistic view of data center energy use. "I believe 100 percent comes from IT equipment," he says. "The data center supplies continuous power and cooling for the exclusive use of the IT equipment. If the IT equipment did not require cooling, then energy consumption for cooling would be zero. If power consumption were not required to be continuous, then the incremental energy use beyond the IT equipment would be minimal. So data center energy use is directly related to the environmental requirements of the IT equipment as necessary to deliver the critical business functions that equipment enables."