Part 2: Increasing Data Center Capacity and Improving Efficiency
By James M. Krolikowski
July 2010
When the base building systems described above lack the capacity or fail to meet the redundancy and reliability needed to support high-density, high-performance technology tenants, supplemental mechanical and electrical equipment may be necessary. Large corporations will need more than basic supplemental cooling equipment and, depending on the power density of the operation, may require space on the roof or in the mechanical room; smaller companies may need less. In-row coolers, chilled water AC units and heat pumps paired with air-cooled condensers, chillers and cooling towers are common answers where greater power density is required.
The traditional solution for increasing capacity in a data center is the addition of perimeter cooling units. When this isn't possible due to space or other constraints, in-row cooling can be a convenient alternative. It works like this: A cooling device about 9 inches wide is placed between pairs of racks, conditioning air locally. Unlike a CRAC unit, which is designed to cool the entire space, the in-row cooler is dedicated to the racks it serves, allowing for localized higher densities and giving data center facility managers the opportunity to add capacity without a major reconfiguration of the IT infrastructure.
Piped overhead or under the floor, in-row cooling uses a refrigerant that evaporates in the event of a leak, making it a relatively low risk to data center operations. Heat absorbed by the in-row cooler is rejected to an outdoor, air-cooled condenser or to a small water-cooled condenser in a mechanical closet.
Located in the building's mechanical space or wherever the landlord can accommodate them, water-cooled chillers are a preferred source of supplemental cooling because of their kilowatt-per-ton efficiency and proven reliability. Air-cooled chillers consume more electricity, must be installed outdoors, occupy a large footprint relative to their cooling capacity, require some form of freeze protection and are less reliable in extreme temperatures. When they are employed, air-cooled chillers are often paired with a dry cooler for two reasons: manufacturers cannot guarantee the machine will start in low ambient conditions, and the dry cooler allows the owner to take advantage of a waterside economizer cycle.
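The kilowatt-per-ton difference compounds quickly because a data center load runs around the clock. The sketch below uses assumed, illustrative efficiency figures (roughly typical published ranges, not values from the article) to show the annual energy gap.

```python
# Illustrative chiller energy comparison. The kW/ton figures below
# are assumed typical values for illustration only; actual numbers
# vary by machine, climate and operating conditions.

HOURS_PER_YEAR = 8760  # data center cooling runs continuously


def annual_kwh(load_tons: float, kw_per_ton: float) -> float:
    """Annual compressor energy for a constant cooling load."""
    return load_tons * kw_per_ton * HOURS_PER_YEAR


load = 100  # tons, assumed example load
water_cooled = annual_kwh(load, 0.6)  # water-cooled: ~0.6 kW/ton assumed
air_cooled = annual_kwh(load, 1.2)    # air-cooled:   ~1.2 kW/ton assumed

print(f"water-cooled: {water_cooled:,.0f} kWh/yr")  # 525,600 kWh/yr
print(f"air-cooled:   {air_cooled:,.0f} kWh/yr")    # 1,051,200 kWh/yr
```

At these assumed efficiencies, the air-cooled plant uses roughly twice the energy every year, which is why water-cooled machines are the preferred choice when the building can host them.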
When additional chillers must be installed, it is important to consider the tenant's location relative to the roof and the mechanical floor. Can the new equipment be piped to the tenant floor economically and without disrupting other tenants? In addition, cooling towers and other HVAC equipment can be extremely heavy and, depending on the height of the building, may require a crane or helicopter lift.
Code deficiencies are often present in older buildings and are a factor in any upgrade. If the new design includes the addition of chillers, code requires a refrigerant evacuation system to be installed, which includes an alarm, a strobe and sometimes self-contained breathing apparatus (SCBA) at the entrance to the mechanical room. Other building system configurations may be grandfathered in, but modifying the system in any way may also trigger an upgrade to current code requirements. The additional costs of these requirements must be considered prior to construction.
Older data centers can improve their energy efficiency by implementing some of today's design best practices. One of the most common and effective airflow management techniques for a data center is hot/cold air containment. Today's new data centers have the luxury of designing this method in from day one, but with a little effort existing data centers can take advantage of the benefits as well.
The first step is to keep hot air discharged from the rear of equipment racks from mixing back into the conditioned cold zone. The ceiling plenum can be converted into a path for collecting and isolating hot air by ducting the CRAC unit's return opening into it. Replacing the ceiling tiles above hot aisles with perforated grilles increases both cooling equipment efficiency and the overall cooling capacity of the room.
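The payoff of containment can be seen in the standard sensible-heat airflow relation, Q (BTU/h) = 1.08 × CFM × ΔT (°F): by preventing mixing, containment raises the return-air temperature the CRAC unit sees, widening ΔT and extracting more heat from the same airflow. A minimal sketch, with assumed illustrative temperature differences:

```python
# Sensible-heat airflow relation used in data center cooling:
#   Q (BTU/h) = 1.08 x CFM x delta-T (deg F)
# The 15 F (mixed) and 25 F (contained) deltas below are assumed
# illustrative values, not figures from the article.


def sensible_btuh(cfm: float, delta_t_f: float) -> float:
    """Sensible heat carried by an airstream at standard conditions."""
    return 1.08 * cfm * delta_t_f


cfm = 10_000
mixed_return = sensible_btuh(cfm, 15)  # uncontained: hot air recirculates
contained = sensible_btuh(cfm, 25)     # contained: full rack delta-T returns

print(f"{contained / mixed_return:.2f}x")  # ~1.67x capacity, same airflow
```

The same fans and coils move the same air; the containment simply stops the cold supply from diluting the hot return before it reaches the cooling coil.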
To take this concept to the next level, owners can individually isolate higher-density racks by installing passive or fan-assisted chimneys between the rack and the ceiling plenum. Another step is to install partitions with doors at the ends of data cabinet aisles to form a temperature zone enclosure.
With the 2008 update of the American Society of Heating, Refrigerating and Air-Conditioning Engineers' (ASHRAE) TC 9.9 guideline, "Thermal Guidelines for Data Processing Environments," designers have a larger window of opportunity to maximize energy efficiencies in terms of acceptable temperature and humidity. Engineers are trying to dispel the "colder is better" mindset that is still prevalent in IT operations across the country. Cooling of electronics is a process load composed almost entirely of sensible heat. Most server manufacturers warranty equipment for operating inlet temperatures of up to 90 degrees F. With this in mind, it makes good sense to raise air and water temperature set points. Colder water is necessary to handle the latent loads associated with typical office environments, but in the rarely occupied data center it is a waste of energy.
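The savings from raising set points can be roughed out with a common industry rule of thumb: chiller compressor energy drops on the order of 1.5 percent for each degree Fahrenheit the chilled water supply temperature is raised. The figure and the 8 °F reset below are assumptions for illustration, not values from the article or from ASHRAE.

```python
# Rule-of-thumb sketch of compressor savings from a chilled water
# set point reset. The ~1.5%/deg F factor is an assumed, commonly
# cited rule of thumb; actual savings depend on the chiller plant.

SAVINGS_PER_DEG_F = 0.015  # assumed fractional savings per deg F raised


def compressor_savings(raise_deg_f: float) -> float:
    """Approximate fractional energy savings for a set point reset."""
    return raise_deg_f * SAVINGS_PER_DEG_F


# Raising supply water from, say, 45 F to 53 F (an 8 F reset):
print(f"{compressor_savings(8):.0%}")  # ~12% compressor energy savings
```

Because a data center's load is nearly all sensible heat, the warmer water still satisfies the cooling coils; the latent (dehumidification) capacity that colder water provides is simply not needed there.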
While its systems may be dated, the existing office building still offers great opportunities for high-tech companies and their power and efficiency needs. With these key considerations in mind, building owners and managers will have the tools to choose the right space for their unique, high-powered tenants.
James M. Krolikowski, P.E., is a senior mechanical engineer at Environmental Systems Design in Chicago. His current focus is mechanical engineering for mission critical facilities. Krolikowski can be reached at email@example.com.
Part 1: Determining if an Existing Facility's Infrastructure can Meet High-tech Needs
Part 3: Cooling a Trading Floor's Data Center