fnPrime


Key Advances Rewrite Data Center Management



Facility executives and managers face crucial decisions related to power density, liquid cooling, flexible energy and unified data platforms.


By Ronnie Wendt, Contributing Writer  
Other parts of this article:
Pt. 1: This Page
Pt. 2: The AI Data Center Demands a New Skill Set


For decades, facility executives and managers relied on predictable design assumptions for data centers: steady rack densities, traditional air cooling and incremental power growth. Artificial intelligence (AI) has shattered that model. 

AI-driven data centers are forcing a shift in the way facilities are designed, powered, cooled and operated. Graphics processing units (GPUs) now dominate workloads, driving unprecedented power densities, thermal loads and power grid impacts. 

As a result, a facility manager’s role now includes strategic oversight of some of the most energy-intensive infrastructure in the built environment. 

“Data centers are becoming the epicenters of energy consumption,” says Sadiq Syed, senior vice president of the digital energy software business with Schneider Electric. “Facility managers have a tough balancing act — managing uptime and the highest levels of SLA [service level agreements] while also managing energy consumption. This is no longer a traditional facility management role.” 

From a power and electrical architecture perspective, the growth curve is even more stark. 

“With generative AI becoming prime time, the energy consumption from the data center business is increasing exponentially,” says Bin Lu, executive vice president of power products at Schneider Electric. 

Mark Swift, who leads engineering and product management at Starline, agrees. 

“Whether it’s an enterprise data center or a cloud or hyper-scale data center, AI deployments and denser rack topologies are putting a substantial strain on a facility’s existing power structure,” he says. 

A new era of power density 

Most enterprise data centers operate in the 5-20 kilowatts (kW) per rack range. But with AI, that benchmark is becoming obsolete. 

“With AI deployments, we’re seeing requests for 100 kW, 150 kW and even 200 kW per cabinet,” Swift says. 

Lu says that soon, even those numbers may be conservative. 

“We will move to a one megawatt (MW) level and above in the next one to two years,” he predicts. “Facility managers will no longer be adding 20 percent more electrical demand every year. They will be looking at doubling and even tripling their energy consumption yearly.” 

That leap will fundamentally change facility design. He says that a rack with identical physical dimensions will see its power draw escalate from 50 kW to 1 MW, a 20-fold increase in density. 
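The arithmetic behind these projections is easy to check. A minimal sketch follows, using the per-rack figures quoted above; the 10 MW starting facility load is a hypothetical value for illustration, not a number from the article.

```python
# Back-of-the-envelope check of the density and growth figures cited
# in the article. Rack values come from the quoted projection; the
# 10 MW facility load is an assumed illustrative starting point.

RACK_TODAY_KW = 50      # typical high-density AI rack today
RACK_FUTURE_KW = 1000   # projected 1 MW rack in the same footprint

density_multiple = RACK_FUTURE_KW / RACK_TODAY_KW
print(f"Density increase per rack: {density_multiple:.0f}x")

# "Doubling and even tripling their energy consumption yearly":
# compound a hypothetical 10 MW facility load over three years
# under the doubling scenario.
facility_mw = 10.0
for year in range(1, 4):
    facility_mw *= 2
    print(f"Year {year}: {facility_mw:.0f} MW")
```

Even the conservative doubling case takes the hypothetical 10 MW facility to 80 MW in three years, which is why the article frames this as a design problem rather than an incremental capacity upgrade.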

Michael Giannou, global general manager of data centers with Honeywell, says this shift will redefine facility design. 

“It’s all about density in a rack,” Giannou says. “For years, data centers sat at 5 to 20 kilowatts per rack. With AI chips, you’re now seeing configurations between 30 and 100 kilowatts, and some hyper-scalers are pushing far beyond that.” 

Power availability pressures 

These power demands are not just a design challenge. They are an electrical grid challenge. 

“The U.S. power grid is currently very stretched,” Lu says. “Every conference I go to, people are asking where they can get access to the power they need.” This issue will persist, according to industry projections. Lu says the United States is projected to see a 30-50 percent increase in energy demand over the next five years, and data centers will be the primary driver of this growth. 

Some areas already are experiencing this strain, he says. For example, data centers account for 39 percent of the state of Virginia's electricity use. This shift is altering timelines, impacting capital expenditures and influencing data center site selection. 

“We’re seeing site selection driven by access to power or natural resources,” Swift says. “Some operators are even building their own substations or leveraging on-site natural gas to secure the power they need.” 

For facility executives and managers, access to electrical power is a strategic constraint that demands earlier and deeper engagement with utilities, regulators and energy partners. 

It also demands that they consider steps to reduce strain on the grid, such as an on-site microgrid: a small-scale power grid that can operate independently of, or in conjunction with, the main grid. These systems draw on multiple energy sources, including solar panels and batteries, and can continue operating during power interruptions. 

Some managers are also turning to battery energy storage systems (BESS) and demand response programs to access the energy they need. 

“Battery energy storage systems can support mission-critical reliability and control costs,” says David Chernis, director of flexible compute platforms for CPower Energy. “They protect against short-term power loss and improve power quality, but they can also earn revenue through demand response and energy flexibility programs.” 

Ronnie Wendt is a freelance writer based in Minocqua, Wisconsin. 



Posted on 3/5/2026



