High Performance Computing Requires Careful Heat Management

  July 14, 2011

This is Casey Laughman, managing editor of Building Operating Management magazine. Today's tip is to consider the amount of heat produced by a high-performance computing system.

High-performance computing generates massive amounts of heat in a small area. These systems will require an upgrade of the conventional HVAC system in the form of packaged chillers and chilled-water piping. Although chillers are an expensive first-cost item, they are more cost-efficient to operate over the life-cycle of the facility than competing cooling technologies. Converting an entire facility to chilled-water cooling will save money over the long term.

With a packaged chilled-water system removing the heat of the high-performance computing system, the net result will be a very small heat load on the air-side system. This may enable a data center to turn off some air conditioning equipment while maintaining enough running units to manage humidity. Chilled water also opens up the option of installing a fluid cooler, which, depending on location, allows free cooling during the cooler seasons. The Jaguar HPC is one of the fastest computers in the world. According to Cray, the Jaguar's manufacturer, the room housing it required 100 fewer computer room air conditioners than before because of its pumped refrigerant system.

Manufacturers are experimenting with high-temperature HPC products, which will reduce cooling requirements.

An HPC is unlikely to require qualitative changes to the data center's existing uninterruptible power supply topology or emergency power generation model; what will be required is a substantial increase in UPS power delivered to a small area of the data center. The availability of chilled water, UPS power, and generator power will vary by facility, and all of these facets will need to be researched before installing an HPC.
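A first step in that research is checking whether the existing UPS plant has headroom for the added load. The sketch below is a hypothetical example with assumed figures, not data from this article; it also assumes a common design practice of keeping UPS loading below a ceiling (90% here) rather than running modules at nameplate capacity.

```python
# Illustrative sketch (assumed figures): spare UPS capacity available
# for a new HPC installation, holding loading under a design ceiling.
def ups_headroom_kw(ups_rating_kw: float, current_load_kw: float,
                    max_loading: float = 0.9) -> float:
    """Usable spare capacity in kW, keeping the UPS below a loading
    ceiling (90% by default) instead of at its nameplate limit."""
    return ups_rating_kw * max_loading - current_load_kw

spare = ups_headroom_kw(ups_rating_kw=500.0, current_load_kw=300.0)
print(spare)  # prints 150.0 (kW available for the new HPC load)
```

If the headroom comes up short of the planned HPC load, the facility is looking at added UPS modules and generator review, which is exactly why this legwork belongs in the planning phase.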

Given the multiple forces that could drive adoption of HPC in commercial and institutional data centers, now is the time to start researching and planning.
