The ability to interpret oil analysis results is crucial for guiding decisions about preventive maintenance activities. Having someone in your organization who can pick up a report and interpret it in the context of the environment is essential. This is a skill that can easily be developed with a minimal investment in training and certification. This article will address the fundamentals of oil analysis and how to interpret the resulting reports.
Once an analysis is completed, it is important to review the report and interpret the accompanying data. Based on the report, you can determine whether action is needed. The report does not always pinpoint specific problems, but it does provide a starting point for analysis.
Each test should be clearly identified. The information is usually organized in a spreadsheet format, with numbers indicating the test results. When reviewing your reports, the first thing to do is confirm that they are indeed your reports. Be certain the report includes your name, lube type, machine manufacturer and machine type.
The report should also clearly state your machine and lubricant condition. The laboratory should have a rating system that notifies you of normal, marginal and critical levels. In addition, the report should include comments from the analyst who reviewed your results. These comments will help you gauge the criticality of the problem and provide a suggested course of action.
Viscosity is the most common test run on lubricants because it is considered a lubricant’s most important property. This test measures a lubricant’s resistance to flow at a specific temperature. If a lubricant does not have the right viscosity, it cannot perform its functions properly.
If the viscosity is not correct for the load, the oil film cannot be established at the friction point. Heat and contamination are also not carried away at the appropriate rates, and the oil cannot adequately protect the component. A lubricant with improper viscosity can lead to overheating, accelerated wear and ultimately the failure of the component.
Industrial oils are identified by their ISO viscosity grade (VG). The ISO VG refers to the oil’s kinematic viscosity at 40 degrees C. To be categorized at a certain ISO grade, the oil’s viscosity must fall within plus or minus 10 percent of the grade.
So for an oil to be classified as ISO 100, the viscosity must fall between 90 and 110 centistokes (cSt). If the oil's viscosity is within plus or minus 10 percent of its ISO grade, it is considered normal. If it deviates from the grade by more than 10 percent but no more than 20 percent, it is considered marginal. A deviation greater than 20 percent from grade is critical.
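These thresholds can be expressed as a short rating function. This is only an illustrative sketch of the rule described above; the function name and labels are not from any standard.

```python
def classify_viscosity(measured_cst: float, iso_grade: float) -> str:
    """Rate a kinematic viscosity reading (cSt at 40 degrees C)
    against its ISO viscosity grade using the plus/minus 10 and
    20 percent thresholds described above."""
    deviation_pct = abs(measured_cst - iso_grade) / iso_grade * 100
    if deviation_pct <= 10:
        return "normal"
    elif deviation_pct <= 20:
        return "marginal"
    return "critical"

# An ISO VG 100 oil measured at 95 cSt deviates 5 percent: normal.
print(classify_viscosity(95, 100))   # normal
print(classify_viscosity(115, 100))  # marginal
print(classify_viscosity(125, 100))  # critical
```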
Analyzing an oil analysis report involves understanding the concentration of expected and unexpected elements in your oil. Some contaminants are picked up as the oil circulates and splashes off different machine components and surfaces. Other contaminants can enter the machine during manufacturing or routine service, as well as through faulty seals, poor breathers or open hatches. No matter how the contaminants enter the oil, they can cause significant damage.
Elemental spectroscopy is a test used to determine the concentration of wear metals, contaminant metals and additive metals in a lubricant. A concentration of wear metals can be indicative of abnormal wear. However, spectroscopy cannot measure particles larger than roughly 7 microns, which leaves this test blind to larger solid particles. As with any type of testing, spectroscopy is subject to inherent variance.
When oil additives containing metallic elements are present, significant differences between the concentrations of the additive elements and their respective specifications can indicate that either incorrect oil is being used or a change in the formulation has occurred. Also, keep in mind that sump sizes can vary in custom applications.
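The additive-element comparison above amounts to checking each measured concentration against the lubricant's specification within some tolerance. The sketch below assumes hypothetical specification values and an illustrative 25 percent tolerance; real figures come from the lubricant supplier's data sheet and the lab's own flagging rules.

```python
# Hypothetical additive specification (ppm) for the oil in service;
# real values come from the lubricant supplier's data sheet.
ADDITIVE_SPEC_PPM = {"zinc": 1100, "phosphorus": 1000, "calcium": 2300}

def flag_additive_drift(measured_ppm: dict, spec_ppm: dict,
                        tolerance_pct: float = 25.0) -> list:
    """Return the additive elements whose measured concentration
    deviates from specification by more than tolerance_pct percent.
    A large deviation may indicate the wrong oil is in use or a
    formulation change, as noted above."""
    flagged = []
    for element, spec in spec_ppm.items():
        measured = measured_ppm.get(element, 0.0)
        if abs(measured - spec) / spec * 100 > tolerance_pct:
            flagged.append(element)
    return flagged

sample = {"zinc": 1050, "phosphorus": 450, "calcium": 2250}
print(flag_additive_drift(sample, ADDITIVE_SPEC_PPM))  # ['phosphorus']
```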
When reviewing the wear levels in your test results, look at the trend history of each machine, not just the recommendations from the original equipment manufacturer (OEM). OEM limits offer good benchmarks, but it is unwise to follow them blindly, because no two machines are operated in exactly the same way.
For example, two identical pieces of equipment may show vastly different elemental spectroscopy results due to variations in load, duty cycle and maintenance practices. Their particle count levels may differ as well. Based on the trending of the analysis, both machines could still be considered healthy.
Trending is extremely important in determining a machine’s health. A good rule of thumb is to use your best judgment and review the trend data. Has anything changed with the operating conditions? Have you been running the machine longer? Have you been putting more load on the machine? You should also discuss the test results with the lab analyst before making any decisions.
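One simple way to put trending into practice is to flag a reading that departs sharply from the machine's own history. The two-standard-deviation rule below is a common statistical convention, not an OEM limit or anything prescribed by the article; treat it as a starting point for the judgment and analyst discussion described above.

```python
from statistics import mean, stdev

def wear_trend_alert(history_ppm: list, latest_ppm: float,
                     sigma: float = 2.0) -> bool:
    """Flag a wear-metal reading that exceeds the machine's own
    historical mean by more than sigma standard deviations."""
    if len(history_ppm) < 3:
        return False  # too little history to establish a trend
    limit = mean(history_ppm) + sigma * stdev(history_ppm)
    return latest_ppm > limit

iron_history = [12, 14, 13, 15, 14]       # ppm Fe from past samples
print(wear_trend_alert(iron_history, 15))  # False: within normal scatter
print(wear_trend_alert(iron_history, 25))  # True: investigate further
```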
Contamination causes a number of oil system failures. It often takes the form of insoluble materials such as water, metals, dust particles, sand and rubber. The smallest particles (less than 2 microns) can produce significant damage. These typically are silt, resin or oxidation deposits.
The objective with contaminants is to detect the presence of foreign materials, identify where they came from and determine how to prevent further entry or generation. Contaminants act as a catalyst for component wear. If this cycle is not broken, wear accelerates and the component's serviceable life is cut short.
Typical elements that suggest contamination include silicon (airborne dust and dirt or defoamant additives), boron (corrosion inhibitors in coolants), potassium (coolant additives) and sodium (detergent and coolant additives).
When free water is present in oil, it poses a serious threat to the equipment. Water is a very poor lubricant and promotes rust and corrosion of metal surfaces. Dissolved water in oil produces oxidation and reduces the oil’s load-handling ability. Water contamination can also cause the oil’s additive package to precipitate. Water in any form results in accelerated wear, increased friction and high operating temperatures. If left unchecked, water can lead to premature component failure.
The Karl Fischer coulometric moisture test is the most common method used to analyze water levels in oil. When reviewing these test results, remember that low levels of water are typically the result of condensation, while higher levels can indicate a source of water ingress. In most systems, water should not exceed 500 parts per million.
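The 500 ppm guideline can be turned into a simple rating, keeping in mind that appropriate water limits vary widely by machine type (turbine and hydraulic systems often warrant far tighter limits). The half-limit "marginal" band below is an illustrative assumption, not part of any standard.

```python
def classify_water(ppm: float, limit_ppm: float = 500.0) -> str:
    """Rate a Karl Fischer water result against a machine-specific
    limit. The 500 ppm default follows the general guideline above;
    the half-limit marginal band is an illustrative convention."""
    if ppm <= 0.5 * limit_ppm:
        return "normal"
    elif ppm <= limit_ppm:
        return "marginal"
    return "critical"

print(classify_water(150))  # normal: likely condensation
print(classify_water(400))  # marginal: watch the trend
print(classify_water(900))  # critical: look for a point of ingress
```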
Common sources of water include external contamination (breathers, seals and reservoir covers), internal leaks (heat exchangers or water jackets) and condensation. Read Water In Oil Contamination for more depth on this.
Acid number (AN) is an indicator of oil condition. It is useful in monitoring acid buildup. Oil oxidation causes acidic byproducts to form. High acid levels can indicate excessive oil oxidation or additive depletion and can lead to corrosion of internal components.
Acid number testing uses titration to detect the formation of acidic byproducts in oil. This test involves diluting the oil sample and adding incremental amounts of an alkaline solution until a neutral end point is achieved. Because the test measures the concentration of acids in the oil, dilution of the in-service oil (for example, by top-ups with fresh oil) can mask acid buildup and reduce the test's effectiveness.
Similarly, some oils containing anti-wear or extreme-pressure additives that are mildly acidic can also provide false high or low readings due to additive depletion. Acid number values should be considered in concert with other factors such as additive health and water content. Take a look at Acid Number: A Comprehensive Guide for more detail.
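In practice, acid number is usually trended against a new-oil baseline rather than judged in isolation. The sketch below flags a rise above the baseline; the 0.4 mg KOH/g default is purely illustrative, since condemning limits depend on the oil type and should come from the lubricant supplier and the lab, alongside the additive-health and water-content context noted above.

```python
def an_alert(baseline_an: float, current_an: float,
             max_rise: float = 0.4) -> bool:
    """Flag an acid number (mg KOH/g) that has risen more than
    max_rise above the new-oil baseline. The 0.4 default is an
    illustrative figure, not a published condemning limit."""
    return (current_an - baseline_an) > max_rise

print(an_alert(0.5, 0.7))  # False: modest rise, keep trending
print(an_alert(0.5, 1.1))  # True: oxidation or additive depletion suspected
```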
The concentration of wear particles in oil is a key indicator of potential component problems. Therefore, oil analysis must be capable of measuring a wide range of wear and contaminant particles. Some types of wear produce particles that are extremely small. Other types of wear generate larger particles that can be visually observed in the oil. Particles of any size have the propensity to cause serious damage if allowed to enter the lubricating oil.
Particle count analysis is conducted on a representative sample of the fluid in a system. The particle count test provides the quantity and particle size of the various solid contaminants in the fluid. The actual particle count and subsequent ISO cleanliness code are compared to the target code for the system. If the actual cleanliness level of a system is worse than the desired target, corrective action is recommended.
Particle counts generally are reported in six size ranges: greater than 4 microns, greater than 6 microns, greater than 14 microns, greater than 25 microns, greater than 50 microns and greater than 100 microns. By measuring and reporting these values, you can gain an understanding of the solid particles in the oil.
Monitoring these values also can help confirm the presence of large wear particles that cannot be seen through other test methods. However, particle counting simply indicates the presence of particles and does not reveal the type of particles present.
The ISO cleanliness code is utilized to help determine solid contamination levels in both new and used oils. The current ISO standard for reporting cleanliness is ISO 4406:99. Take a look at What Is the ISO Cleanliness Code? for more information.
In accordance with this standard, the values used from the particle count data are related to the greater than 4, greater than 6 and greater than 14 micron levels. The raw data at these micron levels are compared to a standard table and then translated to a code value.
It is important to understand the concept behind the ISO code table. The upper limit of each range is approximately double the upper limit of the preceding range, which means the lower limits are also nearly doubled from one range to the next. Specifically, the upper limit for a given code number is obtained by raising 2 to the power of that code number, dividing the result by 100 and rounding.
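The doubling scheme above can be approximated in a few lines of code. This is a sketch of the arithmetic, not a substitute for the published ISO 4406:99 table: because the standard rounds its range limits (e.g., 2,500 rather than 2,621 particles/mL for code 18), counts near a boundary may come out one code different from a lab report.

```python
from math import ceil, log2

def iso_range_code(particles_per_ml: float) -> int:
    """Approximate the ISO 4406 range code for one size channel.
    Each code's upper limit is roughly 2**code / 100 particles/mL,
    so the code is the smallest integer N with count <= 2**N / 100.
    Counts near a rounded table boundary may differ by one code."""
    return max(1, ceil(log2(particles_per_ml * 100)))

def iso_cleanliness(c4: float, c6: float, c14: float) -> str:
    """Build the three-part ISO 4406:99 code from particle counts
    per mL at the >4, >6 and >14 micron levels."""
    return f"{iso_range_code(c4)}/{iso_range_code(c6)}/{iso_range_code(c14)}"

# e.g. 4,000 particles/mL >4 um, 1,000 >6 um and 90 >14 um:
print(iso_cleanliness(4000, 1000, 90))  # 19/17/14
```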
Analytical ferrography is among the most powerful diagnostic tools in oil analysis today. When implemented correctly, it can be an excellent tool for identifying an active wear problem. However, it also has limitations. Analytical ferrography is frequently excluded from oil analysis programs because of its comparatively high price and a general misunderstanding of its value.
The results of an analytical ferrography test typically include a photomicrograph of the found debris along with specific descriptions of the particles and their suspected cause. Particles are categorized based on size, shape and metallurgy. Conclusions can be made regarding the wear rate and health of the component from which the sample was drawn.
The analyst relies on the composition and shape of the particles to characterize them. This test is qualitative, meaning it depends on the skill and knowledge of the ferrographic analyst. Given its subjective nature, it is best to rely on the analyst's interpretation when deciding what action to take.
While most lubrication professionals rely on oil analysis to help safeguard their equipment from unplanned downtime, an inability to dissect and comprehend a problematic report often yields inappropriate action when abnormal results appear. Your lab can only provide you with the machine condition data. It is up to you to take action.