November 2011
While the amount a department might save in consulting and service fees can be significant, managers have other ways to measure ROI related to infrared technology. As with all reliability-testing methods, the goal of thermography is greater equipment uptime and lower maintenance costs. Assigning a value to those savings is the tricky part.
Most of the benefits in this regard show up as a reduction in unscheduled maintenance. When a breakdown occurs, maintenance technicians must drop everything to respond, and most facilities have a cost associated with downtime. What is often less evident is the cost of the response to downtime, beyond the costs directly tied to the equipment. Unplanned downtime drives up technician overtime costs.
Also, managers must find a way to accurately track and account for express-shipment costs for needed parts that are not in stock, the increased cost of maintaining inventory in case of failure, and costs tied to equipment disposal.
Though hard to measure, these factors all have real costs. Burden rates — charges beyond the actual costs for labor and materials — for maintenance technicians can be $65-75 per hour, and higher if overtime is included. So increased efficiency means money saved.
Excess inventory (the just-in-case-it-breaks parts) carries costs of its own, as does storing, managing, and organizing those parts. Depending on the facility, carrying excess inventory can cost as much as 25 percent of the inventory's value each year. Storing that 20-horsepower, $6,000 motor can cost $1,500 a year.
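The carrying-cost arithmetic above can be sketched in a few lines. The 25 percent rate and $6,000 motor are the article's figures; the function name is illustrative, not from any particular maintenance package:

```python
def annual_carrying_cost(part_value, carrying_rate=0.25):
    """Yearly cost of keeping a spare part on the shelf,
    estimated as a fraction of the part's value."""
    return part_value * carrying_rate

# The article's example: a $6,000 spare motor at a 25% carrying rate.
print(annual_carrying_cost(6_000))  # -> 1500.0
```

Multiplying that figure across every just-in-case spare in the storeroom shows why trimming excess inventory is one of the most direct savings a thermography program can document.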
Using infrared technology to find failures in motor circuits in their infancy can extend the life of the motor and motor circuit, potentially eliminating the need to keep such excess inventory and allowing managers to avoid all of these unnecessary costs related to unplanned downtime.
One of the more sophisticated metrics for measuring the effectiveness of a predictive-maintenance or condition-based monitoring program such as thermography is overall equipment effectiveness (OEE). OEE is the product of three factors: the percentage of available uptime; the percentage of maximum processing rate; and the percentage of quality yield.
An OEE of 1 occurs when a piece of equipment is available 100 percent of the time, can run at maximum output, and never produces a defective outcome. Other metrics that point toward operational efficiency include: mean time between failures, average equipment life, and percentage of work flow that is planned maintenance.
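The OEE calculation described above is a straightforward product of the three rates, each expressed as a fraction between 0 and 1. The sample numbers below are illustrative, not from the article:

```python
def oee(availability, performance, quality):
    """Overall equipment effectiveness: the product of
    availability, performance rate, and quality yield,
    each given as a fraction between 0 and 1."""
    return availability * performance * quality

# A machine up 90% of the time, running at 95% of its maximum
# rate, with a 98% quality yield (hypothetical figures):
print(round(oee(0.90, 0.95, 0.98), 3))  # -> 0.838
```

An OEE of 1.0 requires all three factors to be perfect, which is why even a machine that looks healthy on each individual measure can score well below 1 overall.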
If those principles seem too daunting to calculate right away, managers can show the impact of thermography using other measurements. For example, overall maintenance costs as a percentage of the asset base is a simple metric managers can glean from basic accounting data.
A properly applied thermography program can quickly lower an organization's maintenance and repair costs, and managers can measure this impact almost immediately. These savings are part of a manager's ROI, which can grow right alongside the overall impact of a thermography program.
Dave Sirmans, CMRP, is an instructor and consultant with the Snell Group. He has worked in reliability as lead engineer in an electrical-testing company, where he developed and implemented NFPA 70E compliance for infrared inspections.