4  FM quick reads on Data center

1. Experts Conclude Data Centers Can Be Warmer


Today's tip is to consider a warmer temperature in your data center. Engineers are working on more sophisticated technologies for both computers and cooling equipment.

In the early days of data centers, computer equipment was kept at very cool temperatures, typically 68 F to 70 F. These temperatures were mandated by computer manufacturers, who would not guarantee their equipment at higher temperatures. In 2008, however, the American Society of Heating, Refrigerating and Air-Conditioning Engineers (ASHRAE) widened its recommended temperature band to 64.4 F to 80.6 F. "What is so powerful about these new recommended ranges is that they apply to legacy IT equipment," says Don Beaty, president of DLB Associates.

Using the higher end of the new temperature ranges can significantly reduce cooling energy use in chilled water systems. "Normally the water in the chiller would be cooled to 44 degrees, but if they take it to 50 degrees, it requires half of the energy," says Jim McEnteggart, vice president of Primary Integration Solutions.
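
The direction of that savings is easy to sanity-check with a rough lift calculation. The sketch below is a back-of-the-envelope estimate only; the 85 F condenser water temperature and the ideal-cycle (Carnot) assumption are mine, not McEnteggart's, and real savings also depend on equipment curves, part-load behavior and added economizer hours.

```python
# Rough sanity check: why warmer chilled water cuts chiller compressor energy.
# Assumptions (not from the article): 85 F condenser water, ideal (Carnot) cycle.
# Real plants save more or less depending on equipment, controls, and the extra
# free-cooling hours a warmer setpoint allows; this only shows the direction.

def f_to_k(temp_f):
    """Convert Fahrenheit to Kelvin."""
    return (temp_f - 32.0) * 5.0 / 9.0 + 273.15

def ideal_cop(chilled_f, condenser_f):
    """Ideal (Carnot) coefficient of performance for the given temperature lift."""
    t_cold = f_to_k(chilled_f)
    t_hot = f_to_k(condenser_f)
    return t_cold / (t_hot - t_cold)

CONDENSER_F = 85.0  # assumed condenser water temperature

for chilled_f in (44.0, 50.0):
    cop = ideal_cop(chilled_f, CONDENSER_F)
    print(f"{chilled_f:.0f} F chilled water: ideal COP ~ {cop:.1f}")

# A higher COP at 50 F means less compressor energy per unit of cooling; the
# ideal-cycle gap understates field savings, which McEnteggart's "half the
# energy" figure also attributes to how real plants operate at the warmer setpoint.
```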

Some data centers are running at only slightly higher temperatures, says Paul Mihm, executive vice president, technical services group, of Rubicon, "because the time to reach critical temperature is significantly shorter in the event of a failure." One solution, says Mihm, is to have a backup system to exhaust hot air out of the space. Most environments could withstand a short period of heat until the backup system kicked in, he says.

Another option is thermal energy storage using tanks of cold water. In water-cooled systems, the tank could be used to pump cold water through the system very quickly when a facility switches to emergency power.

While the type of data center will, in large part, determine its tolerance for the new ASHRAE ranges, geography also plays a role, since a facility in a hot climate will likely have a lower threshold for higher temperatures. If the data center needs to be cooled off rapidly, the outside air will not be much help, so there is less margin for error.
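
One practical way to manage that margin is simply to watch how close server inlet temperatures are getting to the top of the recommended band. The sketch below is a minimal, hypothetical monitoring check; the sensor names, readings and the 5 F alert margin are invented for illustration, and only the 64.4 F to 80.6 F band comes from the ASHRAE guidance cited above.

```python
# Minimal sketch: flag inlet temperatures approaching the top of the ASHRAE
# recommended band (64.4 F to 80.6 F). Sensor names, readings, and the 5 F
# alert margin are made up for illustration.

ASHRAE_LOW_F = 64.4
ASHRAE_HIGH_F = 80.6
ALERT_MARGIN_F = 5.0  # assumed buffer below the recommended ceiling

inlet_readings_f = {          # hypothetical sensor data
    "rack-A01": 72.3,
    "rack-B07": 77.9,
    "rack-C12": 81.2,
}

for sensor, temp_f in inlet_readings_f.items():
    if temp_f > ASHRAE_HIGH_F or temp_f < ASHRAE_LOW_F:
        print(f"{sensor}: {temp_f:.1f} F is outside the recommended band")
    elif temp_f > ASHRAE_HIGH_F - ALERT_MARGIN_F:
        print(f"{sensor}: {temp_f:.1f} F is within {ALERT_MARGIN_F:.0f} F of the ceiling")
    else:
        print(f"{sensor}: {temp_f:.1f} F is comfortably inside the band")
```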


2.  Ways To Protect Data Centers From Fire

Today's tip is to make sure your data center is adequately protected from fire. In the past, of course, data centers could use Halon to put out an electrical fire. But once it became known that Halon depletes the ozone layer, it was phased out for new systems.

Halon alternatives generally fall into two categories: clean agent systems, many of which use halocarbons, and inert gases. Clean agent systems extinguish fires by removing heat. Inert gases essentially suffocate the fire by depriving it of oxygen. Both can be "excellent, reliable systems," if they are properly designed and commissioned, says Scott Golly, senior fire protection engineer at Hughes Associates. Inert gas systems use a higher concentration of gas to extinguish a fire than halocarbon systems, so they require more storage space.

Any facility using a "dry," gaseous product for fire suppression must also have a water-suppression system, according to Kevin J. McCarthy, vice president of engineering company EDG2. But using water in a data center "can cause catastrophic damage to equipment," Golly says.

The risk of accidental discharge from conventional sprinklers may justify a pre-action sprinkler system, which requires multiple events before the pipes flood with water. A pre-action system has a valve that holds back the water supply, so the pipes remain empty until the system is tripped.

A double-interlock system uses a clean agent to put out a fire long before a sprinkler head is set off. A pre-action system may also require the activation of two smoke detectors in two different zones before a deluge valve opens to fill the pipes. The clean agent or inert gas suppression is designed to put out the fire before the sprinkler heads begin discharging water.
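
The multiple-event idea is simple enough to spell out as logic. The sketch below is purely illustrative; the zone names and event sequence are invented, and a real installation enforces this sequencing with a listed releasing panel and supervised wiring, not application code.

```python
# Illustration of the multiple-event logic described above. Zone names and
# the event sequence are invented; real systems use listed releasing panels.

def preaction_valve_opens(zones_in_alarm):
    """Pipes fill only after smoke detectors in two different zones alarm."""
    return len(zones_in_alarm) >= 2

def water_discharges(zones_in_alarm, sprinkler_head_fused):
    """Even with filled pipes, water flows only where a head has fused from heat."""
    return preaction_valve_opens(zones_in_alarm) and sprinkler_head_fused

# Example sequence: one zone alarms, then a second, then a head fuses.
for zones, head_fused in [({"zone-1"}, False),
                          ({"zone-1", "zone-2"}, False),
                          ({"zone-1", "zone-2"}, True)]:
    print(f"zones in alarm: {sorted(zones)}, head fused: {head_fused} -> "
          f"pipes filled: {preaction_valve_opens(zones)}, "
          f"water discharging: {water_discharges(zones, head_fused)}")
```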

Facility managers must evaluate what would constitute an acceptable loss. "Can you afford to have all those computers taken off line for several weeks?" Golly says. "If you cannot, then you cannot rely solely on sprinklers." Other considerations include storage space and cost. Both clean agent and inert gas systems are more expensive than pre-action systems.

3.  FM And IT Need To Work Together

Today's tip is to involve both facility management and IT in designing a new data center. With input from both groups, design decisions can benefit both sides.

When FM and IT don't work together, the most common problem is overbuilt mechanical and electrical infrastructure: too much UPS capacity, too many generators and excessive precision cooling installed on Day 1. The oversized equipment then operates inefficiently, and reliability expectations may not be achieved.

The IT group may have critical applications that require a higher level of reliability than facility management plans to build. Often, one of the biggest issues is the need to keep the data center operating while maintenance is performed, which the industry calls concurrent maintenance.

FM and IT need to agree on a set of performance objectives and success measures. Learn to communicate free of jargon, using easy-to-understand terms and descriptions. Describe the challenges each side faces, and interdependencies between both disciplines. Facility management and IT also need to jointly educate themselves about risk analysis, assessment and mitigation, in order to explain to each other what can happen under various scenarios. Necessary steps include going beyond the Uptime Institute Tier and other rating systems; exploring failure rates and their effects; and considering different ways to address risks, such as operations and maintenance improvements.

All of these considerations should be communicated to the on-site facility management and all shifts of the IT staff. If a problem occurs, both IT and facility management need to know. Finally, the team should conduct a post-event evaluation to determine the cause and prevent it from happening again.

Ten years ago, reliability was the top priority for data center design and operation. Now, cost to build and cost to operate are equally important. Solutions have to be scalable to allow for critical power and cooling to be installed in increments to match the growth in IT build-out. This cannot happen without close coordination between facility management and IT.

4.  DCIM Offers Benefits In Legacy Data Centers

This is Casey Laughman, managing editor of Building Operating Management magazine. Today's tip is that Data Center Infrastructure Management (DCIM) offers benefits in legacy data centers.

Where there is an existing plant, a middleware DCIM system is ideal. Configured and placed on the industrial Ethernet or IT network, the system listens on the wire for any pre-defined data, or receives traps from legacy management and monitoring systems, and reports or archives accordingly.
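
As a rough picture of that "listen on the wire" role, the sketch below is a minimal, hypothetical receiver that accepts SNMP traps on UDP port 162 and archives their arrival. A real DCIM or middleware product would decode the SNMP payload, poll BMS/BAS and EPMS protocols, and map the data to assets; none of that is attempted here.

```python
# Minimal sketch of the "listen and archive" role a middleware DCIM plays.
# This only logs raw UDP packets arriving on the standard SNMP trap port;
# it does not decode SNMP, poll BACnet/Modbus, or map data to assets.
import datetime
import socket

TRAP_PORT = 162             # standard SNMP trap port (binding may require privileges)
ARCHIVE_FILE = "traps.log"  # hypothetical archive location

def run_listener():
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("0.0.0.0", TRAP_PORT))
    print(f"listening for traps on UDP {TRAP_PORT} ...")
    with open(ARCHIVE_FILE, "ab") as archive:
        while True:
            payload, (source_ip, _) = sock.recvfrom(65535)
            stamp = datetime.datetime.now().isoformat()
            # Archive the raw trap with its source and arrival time.
            archive.write(f"{stamp} {source_ip} {len(payload)} bytes\n".encode())
            archive.flush()

if __name__ == "__main__":
    run_listener()
```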

The benefits of this type of installation include use of the existing BMS/BAS/EPMS (emergency power management system) and network management systems, and less installation time, since the points at the far end are already connected. However, there are drawbacks: the risk that custom software will have to be developed to fit existing systems; incomplete data gathering if a building's legacy tool cannot integrate; the potential that staff may tire of the system prior to full implementation and shelve it; concern that the DCIM will be yet another platform that increases operational expenses; and the probability that something on the raised floor will change during DCIM implementation, rendering Day 1 data out of date.

Don't let the drawbacks weigh too heavily, though. Tying existing systems into a central point of collection wisely capitalizes on the existing investment in management systems and enables cross-system data sharing. But be wary when evaluating DCIM or middleware products: ask a lot of questions and give vendors a list of the systems you want integrated.

Questions should include the obvious: Can you integrate with everything on my list? What protocols have you successfully integrated with? What systems have you successfully integrated with? Can I use this to tie not only one, but multiple data centers together?

Be prepared for a lot of "vaporware." At this level of integration, middleware and DCIM form an emerging field, and vendors will promise that the next release will contain everything. Don't expect an out-of-the-box solution from any of them; there are just too many types of systems. Consider creating a bubble diagram that shows the existing systems by manufacturer name and function and reveals any existing relationships among them, as well as the desired future relationships. This can go a long way toward illustrating the desired integration.

