Building Operating Management

Benchmarking — Inside and Out





By Greg Zimmerman, Executive Editor, Facilities Management

Ask 100 facility executives how they benchmark, and you’re likely to get 99 different answers. But what every benchmarking exercise has in common is the quest for the proverbial apples-to-apples comparison. Whether that is best done with external benchmarks from organizations like BOMA or IFMA, or internal ones that compare two facilities in the same organization, or the same facility over two periods of time, depends on the goals of the organization. A strong benchmarking system is the result of the facility executive’s ability to weigh the pros and cons of both internal and external benchmarks and make them work for a given situation.

But the usefulness of benchmarking data isn’t the only reason to develop comparative data. Top management frequently wants to see benchmarks — especially external benchmarks — to help evaluate the performance of the facility executive.

“One problem we all face is that CFOs read reports about benchmarks for real estate costs,” says James Morgensen, formerly vice president of real estate for Macromedia. “They see something that says a cost should be 50 and they want to know why we’re at 60.”

If that’s not enough reason to benchmark facility performance, it’s worth remembering that, if an in-house facility executive won’t provide benchmarks, an outsourcing firm will.

Handling Complications

With external benchmarks, the biggest drawback is that there are so many variables specific to each facility that an industry benchmark is rarely an exact comparison. There are a variety of ways to address that issue. For many facility executives, external benchmarks serve as a general frame of reference for their own numbers. The external benchmark is a tool for spotting trends in their own numbers.

“You benchmark and find that the average operating expense for office buildings minus rent is $18 per square foot,” says Robert Gross, principal of facility management for The Vanguard Group. “Then you ask, ‘How are we trending?’ Let’s put that $18 on our graph and let’s see if we’re moving closer to it or moving away from it, and let’s find out why.”

Jim Bullock turned that approach around. Bullock, director of facilities at The Getty Center, uses data collected by the International Association of Museum Facility Administrators (IAMFA) for external benchmarking. But because The Getty Center has more than 800 acres of grounds, IAMFA’s usual metric of cost per building square foot didn’t provide a good benchmark for The Getty Center. With so much grounds space, and the extra cost associated with maintaining the grounds, Bullock’s numbers were always skewed.

“Over a couple-year period, we compromised,” Bullock says. He worked with IAMFA to develop a new metric: cost per acre. “It is a much better fit,” he says.

Of course, before using industry benchmarks, the facility executive should ensure that internal numbers are being reported in a way that matches industry-wide categories as closely as possible. That isn’t always easy.

The problem is that the facility executive may want numbers that no one else in the organization cares about. That’s more work for the accounting department. “You may have to push to get the detail you need,” Morgensen says. “You need to be crisp about the categories of numbers you need and the reasons why you need them.”

Instead of looking strictly at numbers for trends, some facility executives use benchmarking in a looser sense of the term: to qualitatively define best practices.

For example, Don Sposato, a consultant and retired vice president of project management and engineering services for Becton Dickinson Co., used anecdotal evidence from other organizations on how to secure entrance lobbies.

“We went out and benchmarked other organizations,” he says. “How did they handle the entrance? Was there a guard there? Who controls signs? Who controls visitors from the lobby into the organization? We found out after benchmarking that we were pretty lean, so we went back to management and said X company does it this way, Y company does it this way and here’s where we are. We’re kind of low on the scale.”

Looking In

One advantage of using internal benchmarks is that it’s easier to discover the story behind the numbers, and react accordingly.

Don King, director of maintenance operations consulting services for Kaiser Permanente, already had standardized operating procedures in place when he began an internal benchmarking program. King collected facility data at the HMO’s 920 buildings and 30 hospitals. By analyzing the data, he determined that the numbers between similar facilities sometimes varied greatly, even though these facilities were supposed to be using similar practices. “I said, ‘Maybe I’m not a rocket scientist, but something’s going on here, team.’”

King studied the facilities that had better numbers to identify things they were doing right. Those strategies could then be disseminated to facilities whose metrics were off — a sort of restandardization process. “Here are the tactics being used, go forth and stabilize your metric,” King told his facility managers. “A lot of them did, but you still had some saying, ‘We’re different.’”

“We’re different.” That objection is a common problem facing facility executives seeking to use internal benchmarking to improve performance. “Everybody sees you collecting metrics that might have an impact on them and they want to index out their variation,” says King. “They say that the metrics don’t take into account the degree days or the color of the walls or the traffic.”

King’s answer to those who criticize his metrics is short and sweet: “Until you can bring better numbers to the table, mine stand.”

‘Things Get Better’

Finding ways to squelch the “we’re different” cries is one of the challenges of a successful internal benchmarking system. But there’s good reason to tackle that problem, says Gary Boyd, senior vice president, clinical and support services for John Muir/Mt. Diablo Health System and COO of Brentwood Medical Center.

Boyd benchmarked capital projects at two of the organization’s acute care hospitals. He determined that more expensive capital projects at one hospital were basically a result of differences in how projects were managed by the contractor and architect. So Boyd established a standard team of architects and contractors that both hospitals could pull from. His goal is to bring higher cost projects into line with lower cost ones. The move has worked so far.

“If we start measuring things, things get better,” he says. “Suddenly, the architects and contractors have gotten a lot more responsive to my need to get our arms around costs.”

Benchmarking is work. But the information that benchmarking provides is so valuable that some facility executives can’t imagine not having it.

“I’m acutely aware of companies that don’t benchmark,” says Morgensen. “But I have always viewed benchmarking as part of my responsibility to get value for the money we’re spending. I expect my staff to know what their numbers are. And when I ask them what their neighbors are paying, I expect them to know that too.”

How It’s Done: Energy Benchmarking for K-12 Schools

A benchmarking process for K-12 school systems was developed by researchers from MIT in cooperation with the West Contra Costa Unified School District and Pacific Gas & Electric. The project was sponsored by the Public Interest Energy Research Program, funded by the California Energy Commission.

     Step 1 Compile energy use data for all fuels, working with the local utility if information isn’t readily available. Analyze daily consumption patterns, not annual energy bills, to better identify problems in systems and in different facilities. Adding meters that record energy use at shorter intervals, such as daily or hourly, can provide more accurate benchmarking information.

     Step 2 Tabulate absolute energy consumption in terms of site energy use per year, broken down by electricity and natural gas. List total consumption, cost and peak demand.

     Step 3 Establish energy intensity indicators. Determine site energy consumption and cost, site energy and cost per student, site energy intensity and cost per unit floor area, energy per student-hour of operation, and energy intensity per hour of operation.

Consider measuring student population and density, school schedules, and features of the building itself. Be sure to account for differing hours of operation.

     Step 4 Rank the schools using an index. Each school’s position in the index should be computed as the average of the ranked positions of that school under each indicator in Step 3. Rank-ordering the facilities shows at a glance how one school stacks up to others in the system.

     Step 5 Identify the worst performers. The rank index will reveal the schools that consume more energy than their peers.

     Step 6 Develop an action plan for the highest energy users after using an audit or energy monitoring to determine which building systems might be malfunctioning and causing higher energy use in those facilities.
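Steps 3 through 5 can be sketched in code. The example below uses hypothetical school names and energy figures (none of them come from the article) to compute a few of the intensity indicators, average each school's rank across indicators into a single index, and flag the worst performer for an audit.

```python
# Illustrative sketch of Steps 3-5. All school names and numbers
# below are hypothetical, invented for the example.

schools = {
    "Lincoln":   {"site_kwh": 1_200_000, "cost": 150_000, "students": 800, "floor_sqft": 90_000, "hours": 2_000},
    "Roosevelt": {"site_kwh":   950_000, "cost": 118_000, "students": 650, "floor_sqft": 70_000, "hours": 2_100},
    "Jefferson": {"site_kwh": 1_600_000, "cost": 210_000, "students": 700, "floor_sqft": 85_000, "hours": 1_900},
}

# Step 3: energy intensity indicators for each school.
def indicators(d):
    return {
        "energy_per_student": d["site_kwh"] / d["students"],
        "energy_per_sqft":    d["site_kwh"] / d["floor_sqft"],
        "cost_per_sqft":      d["cost"]     / d["floor_sqft"],
        "energy_per_hour":    d["site_kwh"] / d["hours"],
    }

metrics = {name: indicators(d) for name, d in schools.items()}
keys = list(next(iter(metrics.values())))

# Step 4: rank each school under each indicator (1 = least intensive),
# then average those ranks into a single index per school.
totals = {name: 0 for name in schools}
for key in keys:
    for rank, name in enumerate(sorted(schools, key=lambda n: metrics[n][key]), start=1):
        totals[name] += rank
index = {name: total / len(keys) for name, total in totals.items()}

# Step 5: the highest average rank flags the worst performer.
worst = max(index, key=index.get)
print(sorted(index.items(), key=lambda kv: kv[1]))
print("Audit candidate:", worst)
```

Averaging ranked positions, rather than raw values, keeps any single indicator with large absolute numbers from dominating the index.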

Posted on 4/1/2006