In my previous column, I discussed the importance of using metrics to measure the success of a lubrication-management program, specifically metrics that one would develop from oil analysis data. In that column, I alluded to what I called use-dependent parameters and use-independent parameters. I'm going to discuss these further in this article.
Let's assume that you have a lot of pumps in your plant. You've learned, by experience, that most pump bearing failures are attributable to water contamination. So two of the important metrics you've identified as ones you'd like to monitor are water contamination and elemental iron (Fe).
It might be reasonable to assume that calculating the average water contamination and average elemental iron would be a good way to track the health of your pumps. This method works well for determining water contamination, but it is misleading for determining bearing wear.
Let's say the plant has just had a major outage. During the outage, the oil in 90 percent of the pumps was changed. A month or two after the outage, you look at average moisture and average elemental iron for the pumps, and you are pleasantly surprised to see that while water contamination has decreased only by a small amount, the elemental iron has decreased by an order of magnitude. Does this mean the pumps are wearing less because they have nice fresh oil? Of course not. All it means is that using the average iron value to monitor pump wear is not a good way of measuring wear.
Here is why: contamination is not a function of how long the oil has been in the pump, but iron (from wear) most definitely is. Whether the oil has been in use for one day or one year, there is a certain limit of moisture that should not be exceeded. Water contamination is what I call use-independent. On the other hand, the level of elemental wear readings in a lubricant is a function of how long the oil has been in use. These are use-dependent metrics.
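The distinction can be sketched in a few lines of Python. The limit values and function names below are hypothetical, chosen only to illustrate the idea; real alarm limits depend on the machine, the lubricant, and your own operating experience:

```python
# Hypothetical alarm limits for the sketch; real limits are machine-specific.
WATER_LIMIT_PPM = 500            # use-independent: same limit at any oil age
FE_RATE_LIMIT_PPM_PER_MONTH = 5  # use-dependent: the limit applies to the rate

def water_alarm(water_ppm):
    """Water is use-independent: compare the reading to a fixed limit."""
    return water_ppm > WATER_LIMIT_PPM

def iron_alarm(fe_ppm, months_in_service):
    """Iron is use-dependent: normalize by oil age before comparing."""
    return fe_ppm / months_in_service > FE_RATE_LIMIT_PPM_PER_MONTH

print(water_alarm(650))    # True: over limit whether the oil is new or old
print(iron_alarm(40, 12))  # False: about 3.3 ppm/month, acceptable
print(iron_alarm(40, 2))   # True: 20 ppm/month, abnormal wear
```

The same 40 ppm iron reading is benign in year-old oil but alarming in two-month-old oil, which is exactly why the raw reading alone cannot be the metric.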
So how do we go about establishing a useful metric to track machine wear? It's easy: monitor the average wear rate rather than the average wear level. To get a wear rate, divide the wear reading by some measure of time in service. It could be weeks, months, miles - whatever is appropriate for the machine and can be easily tracked.
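As a rough sketch of why this works, consider a hypothetical group of four pumps, where each sample is a pair of (Fe reading in ppm, months the oil has been in service). All numbers are invented for illustration:

```python
# Hypothetical samples: (Fe ppm, months of oil service). Values invented.
before_outage = [(42, 12), (56, 14), (35, 10), (64, 16)]
# After the outage, three pumps got fresh oil; their Fe readings reset
# even though the underlying wear behavior is unchanged.
after_outage = [(4, 1), (4, 1), (3, 1), (64, 16)]

def avg_fe(samples):
    """Average raw Fe reading - misleading right after oil changes."""
    return sum(fe for fe, _ in samples) / len(samples)

def avg_fe_rate(samples):
    """Average Fe wear rate in ppm per month - robust to oil changes."""
    return sum(fe / months for fe, months in samples) / len(samples)

print(avg_fe(before_outage), avg_fe(after_outage))            # 49.25 18.75
print(avg_fe_rate(before_outage), avg_fe_rate(after_outage))  # 3.75 3.75
```

The raw average collapses after the outage, falsely suggesting that wear has improved, while the rate metric correctly shows that nothing about the pumps has changed.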
As a rule of thumb, wear readings are use-dependent, while contamination and fluid-property parameters are use-independent. There are exceptions, however.
Let's take a look at some of the more common parameters you might use to measure machine wear.
Particle Count and Moisture
These are both important parameters, and as indicators of contamination, they are both use-independent. In other words, you can average them directly across the machines in the group.
Soot

Soot contamination is a useful measure of the combustion efficiency of a fleet of diesel engines. Soot is a contaminant, but it's an exception to the rule. This is one contamination parameter that is actually use-dependent. The amount of soot present in an engine oil is a function of how long the oil has been in use. For this reason, you should use the rate of soot formation rather than just soot levels in developing the metric.
Iron

In most industrial equipment, the iron level is probably the best indicator of wear. It is a use-dependent parameter, and therefore, the rate of iron formation must be used. In environments where iron can also enter the oil as a contaminant (such as an iron ore mine), the reading becomes difficult to interpret as a wear metric and should probably be avoided.
Copper

Copper should be used as the wear indicator in worm gearboxes that have a bronze bull gear. Copper is use-dependent.
Ferrous Density

Ferrous density is a good indicator of abnormal wear, particularly in unfiltered gear lubrication systems. It is use-dependent.
The concepts of use-dependence and use-independence are simple, but if oil analysis metrics are to be used effectively, they must be well understood and correctly applied.