By Rita Tatum
Three decades ago, larger companies had glass- and drywall-enclosed data centers housing one or more enormous mainframe computers that read punched cards to process and analyze data. Then technology shrank computing down to desktops and smaller mainframes. By the early 1990s, the in-house data center had often been dismantled and its space reused for other functions.
But the need for instant data has grown astronomically since 2000. Today, many companies use buildings dedicated exclusively to mission-critical operations. Even smaller companies rely on 10,000- to 20,000-square-foot internal data centers to manage the information that today's global economy demands.
Data, data everywhere means that facility managers are seeing IT energy costs growing about 20 percent annually, according to Ken Brill of The Uptime Institute. "Over the last 10 years, we have seen data center energy costs rise from 1 percent to between 8 and 15 percent of a company's total electricity consumption," Brill says.
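To put Brill's roughly 20 percent annual growth rate in perspective, a quick compound-growth calculation shows how fast such costs double. This is a minimal illustrative sketch; the function name and the fixed-rate assumption are mine, not figures or methods from the article.

```python
import math

# Illustrative only: at a fixed compound growth rate, how many years
# does a cost take to double? At ~20 percent per year (the rate Brill
# cites for IT energy costs), doubling comes surprisingly quickly.
def years_to_double(annual_growth_rate):
    """Return the doubling time, in years, at a fixed annual growth rate."""
    return math.log(2) / math.log(1 + annual_growth_rate)

print(round(years_to_double(0.20), 1))  # about 3.8 years to double
```

At that pace, an energy bill quadruples in well under a decade, which is consistent with Brill's observation that the data center share of total electricity consumption has climbed from 1 percent toward 8 to 15 percent over ten years.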
That change, coupled with the rising capital and operating costs for facilities, has corporations looking high and low for ways to control those expenses. One of the biggest opportunities might seem to be one of the easiest to take advantage of: better teamwork between facility and IT managers. But in the real world, that teamwork isn't so easy to attain. It means breaking down long-standing barriers and adopting new ways of thinking among both facility managers and their IT peers.
One important step to better teamwork is understanding the underlying factors that have made data-center energy use such a big issue.
The phenomenal growth in power consumption comes in part from something as tiny as a computer chip. About every two years, the number of functions on that chip doubles. With more functions come more energy use and more waste heat that must be managed.
What's more, costs for better, faster servers have dropped dramatically, so more companies can afford them. According to Brill, the number of watts per $1,000 for servers has gone up by a factor of five since 2000. "In 2000, you got 32 watts per $1,000," he says. "That same $1,000 in 2008 bought 150 watts."
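A back-of-the-envelope check of Brill's quoted numbers confirms the "factor of five" is a round-up; the variable names below are mine, and the figures are simply those quoted in the article.

```python
# Watts of server capacity purchased per $1,000 spent,
# in 2000 vs. 2008, as quoted by Brill.
watts_per_1000_in_2000 = 32
watts_per_1000_in_2008 = 150

increase = watts_per_1000_in_2008 / watts_per_1000_in_2000
print(round(increase, 1))  # roughly 4.7x, close to the quoted "factor of five"
```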
The increased wattage translates into dizzyingly fast computations at the server level, but it also adds tremendous amounts of waste heat that must be managed. And while today's servers tolerate consistently higher operating temperatures better than their predecessors did, they remain sensitive to heat and humidity.
Falling costs for computing power put facility costs in the spotlight. When IT equipment was much more expensive, the facility cost was a small percentage of the total data center spend. But as the costs for IT equipment decrease, the proportion of costs attributable to facilities increases. Energy costs rise, partly because the new servers require more power and partly because the spaces that house those servers need more cooling.
Costs aren't the only risk with rapidly growing energy use. Some countries already are instituting load restrictions. "In places like Japan and the United Kingdom, there are strict mandates on how much energy can be consumed by a building, no matter how much of that building's space is a data center," says Andrew Fanara, program manager for U.S. EPA's Energy Star Program.
Some facility managers view the rapid rise in energy consumption as a one-time spike. But that's probably not the case, says Paul Peoples, group manager of data center operations for Target. "IT equipment energy consumption will continue to grow over time and that increase is primarily driven by the long-term trend of rapid technology innovation in IT equipment that has been occurring since the early 1970s," Peoples says.
Other experts see the same thing. William Kosik, energy and sustainability director at HP Critical Facilities Services, delivered by EYP MCF, says that today's data centers consume 20 to 30 times the annual energy used by the average commercial office facility. One reason demand has grown is that virtualization and other high-tech applications require more from servers than ever before.
"Nothing suggests that demand for IT technology will diminish," says Fanara.
Brill says there's no magic bullet to kill the energy beast in data centers. He argues that a 40 percent energy reduction in data centers is very achievable using today's technology. "We have a management problem, not a technology problem," he says.
That's why FM-IT teamwork is high on many experts' to-do lists for data center energy efficiency. In many organizations, that's a big shift from the way facility managers and IT managers have dealt with each other.