Simply hoping your lubricants are operating within their protective-specification limits doesn’t make it so.
By Ken Bannister, MEch Eng (UK) CMRP, MLE, Contributing Editor
Lubricants are designed and chosen to perform as finite, perishable, yet integral components of their host machines. Rarely, if ever, will a lubricant be employed under identical application and environmental conditions. Enter oil-analysis testing.
Why we test
The uniqueness of each lubricant application dictates how and when the lubricant must be tested, maintained (filtered and temperature controlled), and changed out. Stresses and influences such as load-induced shear, thermal degradation, various types of contamination, and the catalyzing effect of wear metals alter and prematurely degrade lubricant properties.
Oil is made up of a base oil and an additive package that’s designed to combat ambient and working environmental stresses/influences and deliver reasonable lubricant life. Outside stresses produce an array of detrimental effects, including oxidation, polymerization, cracking, hydrolysis, and evaporation, which manifest as viscosity thickening or dilution, acid buildup, and sludge. Additionally, when oil loses some of its protective ability, surfaces in its host bearings can come into contact with one another and release metal wear particles into the lubricant, which then act as a bearing-attacking abrasive (three-body abrasion).
These effects and conditions are why we analyze oil. This testing is how we ensure lubricants are serviceable and bearing surfaces are protected.
Oil analysis is analogous to a blood test wherein a single, properly extracted fluid sample is used for a variety of diagnostics that indicate machine and lubricant conditions. To ensure an accurate interpretation of results every time—reliable ones suitable for trending and historical analysis—samples must be collected in a consistent manner and sent to the same laboratory for testing on the same equipment.
The lab will also require a virgin sample of any lubricant to be tested. This sample is used to document baseline measurements of base-oil type, additive-package levels (metals and chemicals), cleanliness level (dirt-contamination level), and viscosity and acidity. A set of initial samples detailing how and where each was taken will also be required for each machine.
Good laboratories also document an operational profile for each machine tested. Based on it, they can recommend additional beneficial testing, e.g., a Karl Fischer water-contamination test for a food plant with daily machine wash downs; tests for soot and glycol in mobile equipment and generator engines; or ferrographic analysis of metal particulates to determine specifically how a bearing is failing.
Basic oil analysis concentrates primarily on fluid properties and fluid contamination.
In analyzing fluid properties, laboratories typically look at viscosity, acidity, and additive elements—the “big three” characteristics that make oils unique—and which, through their changes in service, can tell us how to better maintain our lubricants.
Viscosity. The viscosity rating of new oil is typically measured in centistokes (cSt), i.e., its kinematic viscosity, which depicts measured resistance to flow and shear under the force of gravity. As oil thickens or dilutes over time, however, its specific gravity changes, leading to errors in gravity-based tests. A more consistent measurement is the absolute (dynamic) viscosity rating, which depicts oil’s resistance to flow and shear through measurement of its internal friction. Because absolute viscosity is derived by multiplying kinematic viscosity by the oil’s actual specific gravity, it is the accurate trending method of choice for most laboratories. To understand which test your lab used, note the measurement scale: kinematic viscosity (good test) is reported in centistokes (cSt), absolute viscosity (best test) in centipoise (cP).
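The conversion between the two scales is a single multiplication. A minimal sketch (the viscosity and specific-gravity values below are hypothetical examples, not lab data):

```python
def absolute_viscosity_cp(kinematic_cst: float, specific_gravity: float) -> float:
    """Absolute (dynamic) viscosity in centipoise:
    cP = cSt x specific gravity (density relative to water)."""
    return kinematic_cst * specific_gravity

# Example: an ISO VG 68 oil measured at 68 cSt with a specific gravity of 0.88
print(absolute_viscosity_cp(68.0, 0.88))  # 59.84 cP
```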
Given oil’s many variables, it’s best to work with a laboratory that’s experienced in setting up caution and critical limits for your industry type. Most labs typically start with a clearly defined set of viscosity limits of –10% CL (critical lower), –5% CaL (caution lower), +5% CaU (caution upper), and +10% CU (critical upper) for industrial oils. In more severe environments, the CaU and CU limits can be reduced to +4% and +8%, respectively. For oils with viscosity improvers, the lower limits are usually doubled.
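The banding described above can be expressed as a simple check of a reading against the virgin-oil baseline. This is a sketch using the standard ±5%/±10% industrial limits from the text; the function name and thresholds are illustrative, and real limits should come from your lab:

```python
def viscosity_status(measured_cst: float, baseline_cst: float,
                     caution_pct: float = 5.0, critical_pct: float = 10.0) -> str:
    """Classify a viscosity reading against caution/critical bands
    defined as +/- percentages of the virgin-oil baseline."""
    change_pct = (measured_cst - baseline_cst) / baseline_cst * 100.0
    if abs(change_pct) >= critical_pct:
        return "critical"
    if abs(change_pct) >= caution_pct:
        return "caution"
    return "normal"

# ISO VG 68 oil with a 68-cSt virgin baseline:
print(viscosity_status(74.9, 68.0))  # about +10.1% -> "critical"
print(viscosity_status(71.5, 68.0))  # about +5.1%  -> "caution"
print(viscosity_status(69.0, 68.0))  # about +1.5%  -> "normal"
```

For severe environments, pass `caution_pct=4.0, critical_pct=8.0` to tighten the upper bands as described above.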
Thickened, more viscous oil points to oxidation (depleted additives), air entrainment, and/or contamination. Thinner, less viscous oil points to a wrong substitution or fuel dilution.
Acidity. The acid number, or AN, is a measurement of the acid concentration in the oil, not the acid strength, and is greatly affected by the presence of water within the oil. Most oils start with an AN of less than 2.
Setting limits for acidity isn’t as easy as setting those for viscosity. The caution and critical limits are dependent on the type of additive package used in the oil. Most standard mineral oils are considered corrosive over AN 4, whereas AW (anti-wear) or R&O (rust-and-oxidation-inhibited) oils are considered critical well below AN 3. Working with your oil supplier’s engineering department and/or a reputable oil lab with experience in your industry is the best way to set up meaningful acceptable limits for your environment.
A change in the oil’s acid number (AN, sometimes reported as total acid number, TAN) points to base-oil deterioration, oxidation, and contamination.
Additive Elements. The table on p. 38 lists the standard elements for which oil analysis typically tests. Because some elements serve multiple functions, their measured values must be checked against the virgin sample and the machine’s operational profile to determine whether their presence is beneficial (additive) or detrimental (contaminant or wear debris).
Dirt, water, and chemical contaminants are highly destructive to lubricants. For the most part, however, they’re easily avoidable.
Solids contamination. Testing for solid contaminants involves particle counting to ISO Cleanliness Code ISO 4406:1999, which assigns a scale number to the particle count per milliliter of oil at each size threshold. One method requires a technician to use a light microscope and manually count the particulates in the sample; the counts are then compared with the ISO 4406 range tables to derive the cleanliness rating. The more common, automated approach uses sensors and light-absorption principles to count particles at three sizes, >4 microns, >6 microns, and >14 microns, yielding a three-number ISO cleanliness code (e.g., 18/16/13).
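The scale numbers follow a doubling rule: each step up in code roughly doubles the permitted particle count per milliliter. The standard publishes rounded range tables, so this sketch, which uses the underlying 2^N/100 relationship, can differ by one code near rounded boundaries; counts below are hypothetical:

```python
import math

def iso4406_scale_number(particles_per_ml: float) -> int:
    """Smallest scale number N whose range contains the count,
    per the doubling rule: 2**(N-1)/100 < count <= 2**N/100."""
    if particles_per_ml <= 0:
        return 0
    return max(0, math.ceil(math.log2(particles_per_ml * 100.0)))

def iso4406_code(n4: float, n6: float, n14: float) -> str:
    """Three-number ISO 4406 code from counts/mL at >4, >6, >14 microns."""
    return "/".join(str(iso4406_scale_number(c)) for c in (n4, n6, n14))

# Counts per mL at >4 / >6 / >14 microns
print(iso4406_code(2000, 400, 30))  # -> 18/16/12
```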
Water contamination. Water in oil promotes rust and corrosion—and, in a dissolved state, will accelerate oxidation. Water can be introduced as contamination through wash downs of equipment or leakage. Prevention measures include coalescing filters/breathers and physical waterproof protection around areas susceptible to moisture ingression.
Testing for water contamination typically involves the Karl Fischer moisture-titration method: A vaporized oil sample is carried by oxygen-free nitrogen into a reaction vessel containing methanol. The trapped moisture is titrated to an end point with Karl Fischer reagent to establish the water content in parts per million.
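The reported figure is a simple mass ratio: one microgram of titrated water per gram of oil sample equals one part per million. A quick conversion sketch (sample masses are hypothetical):

```python
def water_ppm(water_micrograms: float, sample_grams: float) -> float:
    """Water content in ppm by mass: micrograms of titrated water
    per gram of oil sample (1 ug/g == 1 ppm)."""
    return water_micrograms / sample_grams

# 250 ug of water titrated from a 1.0-g oil sample
print(water_ppm(250.0, 1.0))   # 250.0 ppm

# The same water mass in a 2.0-g sample reads half as high
print(water_ppm(250.0, 2.0))   # 125.0 ppm
```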
Beyond why and what
The procedures discussed here represent the major components in standard, inexpensive oil-analysis testing. In most cases, they’ll indicate when to change oil, based on condition. Unusual or inconclusive findings should generate more-specific testing that can lead to positive outcomes for both lubricant and machine. MT
Ken Bannister is managing partner and principal consultant for EngTech Industries Inc., Innerkip, Ontario, an asset management-consulting firm now specializing in the implementation of certifiable ISO 55001 lubrication-management programs and asset-management systems. For further details, telephone 519-469-9173, or email firstname.lastname@example.org.