By James Piper
August 2004
Electrical chillers represent the single largest electrical load in most institutional and commercial facilities, accounting for 35-50 percent of a building’s annual electricity use. For this reason, maintenance and engineering managers should look at their building’s chiller systems first when looking for ways to reduce operating costs.
Fortunately, developments in chiller technology have provided a number of chiller-related tools that can help them to achieve higher operating efficiencies.
The most dramatic improvement in operating efficiency comes from replacing an older chiller with a new, high-efficiency chiller. Centrifugal chillers that are 15-20 years old had a peak efficiency in the range of 0.75-0.85 kW/ton. Those 10-15 years old had a peak efficiency of 0.60-0.70 kW/ton. And those ratings were for the chillers at their time of purchase.
Even with good maintenance — including a comprehensive chemical water-treatment program and regular cleaning of chiller tubes — those efficiencies declined with time, leaving chillers with peak efficiencies of 0.80-1 kW/ton or worse.
Today’s centrifugal chillers offer peak efficiencies of 0.50 kW/ton or better. When coupled with a variable-frequency drive, they offer high efficiency over a wide range of cooling loads. High-efficiency chillers often can reduce cooling energy requirements by 30-50 percent annually.
Refrigerant management is one of the most effective tools for maintaining or improving a chiller’s operating efficiency. When refrigerant management is mentioned, most managers think in terms of compliance with regulations, inventories, and chiller refrigerant conversions. But refrigerant management also refers to a way to help ensure that chillers operate as efficiently as possible.
During operation, sealed refrigerant systems develop small leaks. As the system loses refrigerant, operating efficiency decreases, and the loss often goes unnoticed. When it is noticed, mechanics add the required amount of refrigerant and record the addition in the chiller log.
It is important for optimal chiller efficiency that technicians monitor the addition of refrigerant. They must review logs regularly to determine trends or changes that might indicate performance-decreasing leaks.
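The log review described above can be automated. The sketch below is a minimal, hypothetical example: the log format, entry names, and alert threshold are illustrative assumptions, not taken from the article or from any particular chiller-management product.

```python
# Hypothetical sketch: flag a possible refrigerant leak by totaling the
# additions recorded in a chiller log by month. The 5-lb monthly alert
# threshold is an illustrative assumption; a real plant would set its own.

def pounds_added_per_month(log):
    """Sum refrigerant additions (lbs) by month from (month, lbs) entries."""
    totals = {}
    for month, lbs in log:
        totals[month] = totals.get(month, 0.0) + lbs
    return totals

def possible_leak(log, threshold_lbs=5.0):
    """Return the months whose total additions exceed the alert threshold."""
    totals = pounds_added_per_month(log)
    return sorted(m for m, lbs in totals.items() if lbs > threshold_lbs)

log = [("2004-05", 2.0), ("2004-06", 3.5), ("2004-07", 6.0), ("2004-07", 1.5)]
print(possible_leak(log))  # months worth investigating for a leak
```

A rising monthly total, even below the threshold, is the kind of trend the article suggests reviewing logs for.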
One significant advance in chiller efficiency is the development of chiller management and maintenance software. These programs continuously monitor operation, providing operators and system managers with information on cooling rates, kW/ton efficiency at different cooling loads, operating costs, water flow rates, and water temperatures.
Use of the software allows operators to identify problems as they develop, spot situations in which the chiller is operating outside its design conditions, and flag temperature and flow sensors that have drifted out of calibration. In plants with multiple chillers, the programs can identify which chillers should run to meet the current load at the highest operating efficiency.
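The kW/ton figure such software reports can be computed from the measurements the article lists (water flow rates, water temperatures, and power draw). The sketch below shows the standard calculation; the example numbers are illustrative assumptions, not taken from the article.

```python
# Cooling load from chilled-water flow and temperature drop:
# tons = gpm * deltaT(F) * 500 / 12,000, where 500 comes from
# 8.34 lb/gal * 60 min/hr * 1 Btu/lb-F, and one ton = 12,000 Btu/hr.

def cooling_tons(gpm, delta_t_f):
    """Cooling load in tons from flow (gpm) and temperature drop (deg F)."""
    return gpm * delta_t_f * 500.0 / 12000.0

def kw_per_ton(compressor_kw, gpm, delta_t_f):
    """Operating efficiency in kW/ton; lower is better."""
    return compressor_kw / cooling_tons(gpm, delta_t_f)

# Illustrative example: 1,200 gpm with a 10 deg F drop is a 500-ton load;
# a 300-kW compressor draw at that load works out to 0.60 kW/ton.
print(cooling_tons(1200, 10), kw_per_ton(300, 1200, 10))
```

Trending this number across loads is how the software detects the gradual efficiency losses the article describes.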
Contaminants in a chiller’s refrigerant circuit reduce operating efficiency. Low-pressure systems, such as centrifugal chillers, are more susceptible to contaminants because any leak results in air and moisture being introduced into the system.
To minimize the impact of contaminants, manufacturers added purge systems to chillers to separate air and moisture from refrigerant. While these systems worked, they were not very efficient. Besides not removing all of the contaminants, their use also resulted in the loss of refrigerant.
Newer, high-efficiency purge systems are both efficient and effective. In addition to separating most air and moisture from refrigerant, they do so without allowing any refrigerant to escape. The result is a refrigerant practically free of efficiency-robbing contaminants.
High-efficiency purge systems are available on new chillers or can be added to existing chillers. In both cases, technicians should regularly monitor the run time for the purge unit. Changes in purge-unit operation can be an early sign of a developing leak.
One problem that decreases the operating efficiency of many centrifugal chillers is contamination of the refrigerant circuit with oil. Most centrifugal chillers use a series of gears in their drive systems. These gears require lubrication. In normal operation, only a small amount of this oil ever gets carried over into the refrigerant circuit.
Eventually, this oil collects in the system’s evaporator, decreasing system capacity and efficiency. Special purge units designed to remove oil from the refrigerant can help, but they cannot remove all of the oil, nor can they clean the inside of the evaporator’s tubes.
Advanced designs for the latest generation of centrifugal chillers allow them to operate without using gears — a direct-drive system. Elimination of the gearing enables the units to operate without oil, eliminating the risk of oil contamination and the resultant loss of operating efficiency. Centrifugal chillers that require no oil also offer the promise of reduced maintenance requirements and even greater reliability, due to their simpler design and lower part count.
One common mistake from an operations standpoint is designing a building’s central chilled-water plant around one chiller. While using a single chiller reduces first costs and saves space, it causes problems for managers.
Single-chiller operation gives managers no backup. Chiller failure forces the facility to operate without air conditioning. That outage can continue for a long period of time if technicians must order replacement parts or if the chiller must undergo a major overhaul to be made operational.
Single-chiller operation also hurts operating efficiency. Chillers must be sized to meet a facility’s peak cooling load, even though that peak load might occur for only a few hours each year.
Unfortunately for managers, operating efficiency falls off very quickly as the load decreases, causing the chiller to operate at reduced efficiency for more than 95 percent of the cooling season.
An alternative approach is to use multiple chillers with different capacities. The combined capacities of the chillers would be needed only when the facility was experiencing its peak-cooling load. As the load decreases from peak, managers would have the option of selecting the chiller that most closely matches the cooling load, thus improving the operating efficiency of the cooling plant over the entire cooling season.
Managers would also gain the added advantage of having backup should one chiller be out of service. While one chiller might not be able to meet the entire facility’s cooling load, it would be able to provide cooling to critical areas.
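The chiller-selection logic described above can be sketched as a small search over chiller combinations. The capacities and kW/ton ratings below are illustrative assumptions, and the model simplifies by letting each chiller run near its rated efficiency; a real sequencing program would also account for part-load penalties.

```python
from itertools import combinations

# Hypothetical plant: name -> (capacity in tons, rated kW/ton).
# These figures are illustrative only.
chillers = {
    "CH-1": (600, 0.55),
    "CH-2": (400, 0.58),
    "CH-3": (200, 0.62),
}

def best_combination(load_tons):
    """Return the chiller combination meeting the load at the lowest
    total kW, assuming each unit runs near its rated kW/ton and the
    load is shared in proportion to capacity (a simplification)."""
    best, best_kw = None, float("inf")
    for r in range(1, len(chillers) + 1):
        for combo in combinations(chillers, r):
            capacity = sum(chillers[c][0] for c in combo)
            if capacity < load_tons:
                continue  # this combination cannot carry the load
            kw = sum(load_tons * (chillers[c][0] / capacity) * chillers[c][1]
                     for c in combo)
            if kw < best_kw:
                best, best_kw = combo, kw
    return best, best_kw

combo, kw = best_combination(350)
print(sorted(combo), round(kw, 1))  # the single chiller closest to the load wins
```

This is the selection the article attributes to chiller-management software: at 350 tons, running one appropriately sized machine beats splitting the load across two lightly loaded ones.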
For large applications, electric-drive centrifugal chillers have been the most widely used units for building air conditioning. Centrifugal units offer high efficiency, high reliability, and low maintenance.
While centrifugal chillers remain a good choice for many applications, facilities that have a large central chilled-water plant with multiple chillers might benefit from using alternative-fuel chillers. These chillers might not offer improved operating efficiency, but they can reduce operating costs due to the changes brought about by deregulation.
Deregulation has created real-time pricing for electricity. Under this pricing structure, managers have a strong incentive to reduce peak electrical loads. Since cooling loads very closely track peak electrical loads, any reduction in chiller electrical energy use during peak periods can produce significant savings.
New-technology chillers, including direct-fired-absorption and natural-gas-driven centrifugal units, allow managers to use alternative fuels to produce chilled water at times when electricity rates are highest. Plants that use both conventional and alternative-fuel chillers allow managers to choose the most cost effective chiller to operate, based on the cost of the energy source at the time.
Taken together, recent advances in chiller design can mean very significant savings. Installations that take advantage of all of the new technologies and design features can be expected to use only 50-60 percent of the energy required by systems that were installed even as recently as 10 years ago.