It is well established that oil analysis can effectively monitor three parameters of lubrication.
First and foremost, many tests monitor the health of a lubricant. These tests are fairly straightforward, and the results are generally compared to the properties of the new lubricant to gauge how much the in-service lubricant has changed. It is common practice to set condemning limits or monitor the trends for significant changes.
Contamination levels are also monitored using oil analysis, and cleanliness targets are usually used to trigger maintenance activities to bring the lubricants back to acceptable levels of contamination.
A program based on proactive maintenance principles, which monitors and corrects the parameters mentioned above, will significantly reduce the likelihood of machine wear - the third parameter that can be effectively monitored by oil analysis. While it is certainly true that maintaining a healthy, clean lubricant will minimize machine wear, there are still many wear modes that can arise in spite of these efforts. Misalignment, imbalance, overloading, improper installation, fatigue … the list goes on. Abnormal wear, for whatever reason, happens more often than maintenance professionals like to think. Therefore, it is essential to have a strategy in place to monitor machine wear.
Oil analysis remains the best tool in the predictive maintenance toolbox for the early detection of wear problems. Wear metal and wear particle levels will begin to increase well before the machine exhibits symptoms, such as vibration, temperature or noise. It is difficult, however, to determine the correct wear metal level thresholds. This is particularly true in industrial applications where equipment categories traditionally used are so general. The following gearbox example reinforces this point.
The question “How much iron is too much in a gearbox?” sounds simple. However, once the many different sizes, types, loads, environments and applications are factored in, the question becomes more complex. Add the many lubrication systems and lubricant types in use, and it becomes more complicated still. Is it realistic to think that there could be a good answer to such a question?
Probably not. Yet in most cases, this is exactly the type of question that is being asked each time an oil sample is taken. If an oil analysis program is expected to detect machine wear problems effectively, better questions must be asked.
What really needs to be determined is what is normal. Therefore, “normal” must be defined. According to Webster’s Dictionary, normal means conforming to a usual or typical pattern. How can a pattern be identified in a category as broad as gearboxes? The answer is fairly simple: by evaluating as much data as possible. Before continuing, it is worth reviewing how wear metals have traditionally been evaluated.
Many programs have used fixed limits, giving simple pass or fail criteria for each wear metal. Table 1 is an example of what fixed alarm limits might look like.
The drawback to this type of alarming technique is that it does not account for different contributing factors. Gearboxes come in many sizes and shapes. A lightly loaded gearbox running at constant speed would be expected to have a low wear rate; it might be in serious trouble if its iron level reached 200 ppm. At the other end of the spectrum, a low-speed, reversing, heavily loaded gearbox may not have had less than 500 ppm of iron in its oil since it was tested at the assembly plant.
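In code, fixed-limit alarming amounts to nothing more than a threshold comparison per metal. The sketch below uses made-up ppm limits for illustration, not the actual values from Table 1:

```python
# Hypothetical fixed alarm limits in ppm (illustrative values only,
# not the actual limits from Table 1).
FIXED_LIMITS = {"iron": 100, "copper": 50, "tin": 25}

def check_fixed_limits(sample, limits=FIXED_LIMITS):
    """Return a simple pass/fail verdict per wear metal for one oil sample."""
    return {metal: ("FAIL" if sample.get(metal, 0) > limit else "PASS")
            for metal, limit in limits.items()}

# One sample: iron exceeds its limit, copper and tin do not.
sample = {"iron": 120, "copper": 12, "tin": 5}
print(check_fixed_limits(sample))  # iron FAILs, copper and tin PASS
```

Note that the verdict depends only on the limit table, which is exactly the weakness described above: the same 100 ppm iron limit is applied whether the gearbox normally runs at 20 ppm or at 500 ppm.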
The lubrication method can have a large impact on wear metal levels as well. Many gearboxes are splash lubricated and hold only a small oil volume. As such, wear metals will build in the lubricant as time goes on. This situation would reveal a steadily increasing wear metal level and cause a false positive reading when the level breached the fixed alarm. Other gearboxes might be lubricated by a highly filtered circulating system, where wear particles are removed by filtration as rapidly as they are generated. In this case, the wear metal trend might be flat, and a significant change could occur without surpassing the fixed alarm. Such an exception would likely be missed by a fixed limit system.
Trend analysis allows the development of a pattern of behavior for a particular unit. If the sampling technique and interval are consistent, regular monitoring of the wear metal levels can effectively monitor for changes in the wear rate. This helps account for many of the variables within the equipment group. An uncharacteristic increase in iron, for example, would indicate a change in the wear rate.
Many techniques can be applied to evaluating trend data, such as averages, standard deviations and linear regression. All are intended to identify a condition that is not normal in relation to the machine’s past behavior. What is missing here is identifying what is normal for that machine type. Is it normal for a gearbox like this to generate this level of iron?
Figure 1. Trend Analysis
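One of the statistical techniques mentioned above, a mean-plus-standard-deviation check against a machine's own history, might be sketched as follows. The iron readings and the two-sigma rule are illustrative assumptions, not lab standards:

```python
import statistics

def trend_alarm(history, new_reading, n_sigma=2.0):
    """Flag a reading that departs from this machine's own history.

    history: past wear-metal readings (ppm) taken with a consistent
    sampling technique and interval, as the text requires."""
    mean = statistics.mean(history)
    stdev = statistics.stdev(history)
    return new_reading > mean + n_sigma * stdev

# Illustrative iron trend (ppm) for one gearbox.
iron_history = [42, 45, 44, 47, 43, 46, 45, 44]
print(trend_alarm(iron_history, 46))  # within normal scatter -> False
print(trend_alarm(iron_history, 60))  # uncharacteristic jump  -> True
```

Because the alarm threshold is derived from the machine's own readings, the heavily loaded 500 ppm gearbox and the lightly loaded 20 ppm gearbox each get a limit that fits their history.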
The answer to that question can be found using family analysis. This is a technique that compares the wear metal levels of groups of similar or identical equipment to identify a “usual or typical pattern.” It works like this:
Equipment is grouped together by family. A family may consist of identical equipment located in many plants, such as GE Frame 7 gas turbines in many power plants across the country. Equipment might also be grouped based on load, size, lubricant type and operating parameters, such as a group of agitators at a chemical plant. The wear metal data is then evaluated as a whole. Next, the data for each machine is compared to the family wear rate.
As an example, let’s say that we have a family of 50 motor bearings at a steel mill. The average tin reading is 7 ppm, with 90 percent of the bearings reading less than 10 ppm. It would then be safe to assume that it is “normal” for these bearings to have less than 10 ppm tin in their oil. If one of the bearings were found to have 35 ppm of tin, it would be safe to say that its wear rate is “abnormal.” An effort could then be initiated to determine the cause of the higher wear rate and correct the problem. The problem can be detected, identified and resolved before damage occurs, averting a premature bearing failure and the associated replacement costs.
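The family limit in the bearing example is essentially an empirical percentile of the group's readings. A minimal sketch, using made-up tin readings that mirror the example rather than real lab data:

```python
def family_limit(readings, percentile=0.90):
    """Empirical family limit: the value at or below which roughly
    `percentile` of the family's readings fall (simple sorted-index
    estimate; a lab would use a more careful percentile method)."""
    ordered = sorted(readings)
    idx = min(int(percentile * len(ordered)), len(ordered) - 1)
    return ordered[idx]

# Illustrative tin readings (ppm) for a family of motor bearings;
# these numbers are invented to mirror the example in the text.
tin_family = [5, 6, 7, 7, 8, 6, 9, 7, 8, 10, 6, 7, 9, 8, 7, 6, 8, 7, 9, 5]
limit = family_limit(tin_family)
print(limit)       # the family's "normal" ceiling -> 9 for this data
print(35 > limit)  # a 35 ppm bearing is abnormal  -> True
```

Any single bearing is then judged against the family limit rather than against its own history alone, which is what lets the technique answer “is this level normal for a machine of this type?”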
Family analysis techniques can have a significant impact on both large and small companies’ programs. A large company could use such a program to monitor large fleets of similar equipment among its plants, as well as to benchmark the performance of individual plants. Companies with smaller fleets can compare their wear rates to equipment in many other plants, taking advantage of the lab’s vast database of equipment data.
Realistically, the ideal analysis program would be a blend of the three analysis techniques discussed here. The data evaluation process would become extremely cumbersome if each of these were applied manually to each wear metal for every machine tested in a program. This is where information technology systems come into play. Computers can automate this process so that each parameter is evaluated using numerous techniques, and the best possible analysis is obtained.
Computers are now capable of using statistical calculations, database mining and a rule-based knowledge hierarchy to compare the test data to fixed limits, trend analysis and family analysis, and they can select the most appropriate evaluation for each application.
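To see how the three techniques might be combined, the sketch below runs a fixed-limit check, a trend check and a family check on one reading and reports every alarm that fires. All thresholds, readings and the two-sigma rule are illustrative assumptions, not lab standards or any particular vendor's rule set:

```python
import statistics

def evaluate_sample(reading, history, family, fixed_limit):
    """Combine fixed-limit, trend and family alarming into one verdict.

    reading:     the new wear-metal level (ppm)
    history:     this machine's past readings (ppm)
    family:      readings from similar machines (ppm)
    fixed_limit: a simple pass/fail threshold (ppm)
    """
    alarms = []
    if reading > fixed_limit:
        alarms.append("fixed limit exceeded")
    if len(history) >= 3:  # need some history before trending
        mean = statistics.mean(history)
        sd = statistics.stdev(history)
        if reading > mean + 2 * sd:
            alarms.append("uncharacteristic trend change")
    ordered = sorted(family)
    fam_limit = ordered[min(int(0.9 * len(ordered)), len(ordered) - 1)]
    if reading > fam_limit:
        alarms.append("abnormal for this equipment family")
    return alarms or ["normal"]

print(evaluate_sample(
    reading=60,
    history=[42, 45, 44, 47, 43],
    family=[40, 44, 46, 50, 43, 45, 48, 47, 41, 52],
    fixed_limit=100,
))  # trend and family alarms fire even though the fixed limit does not
```

Note how the 60 ppm reading sails under the fixed limit but is caught by both the trend and family checks, which is precisely the argument for blending the techniques rather than relying on any one of them.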