Can automation of diagnostics make a difference?
SymphonyAI Industrial diagnostic software performance put to the test
Three mechanical engineers, each of whom has spent the past 2–4 years analyzing vibration data as their main job responsibility, were each given data on 332 machines to analyze. The data was collected by one of SymphonyAI's clients and sent in as part of a routine data submittal; it was not pre-screened to weed out bad data or to provide especially interesting or challenging cases. The population comprised 23 different machine types and over 52 unique machine models, including centrifugal pumps, blowers, generators, piston pumps, screw pumps, turbines and reduction gears.
Each set of data was presented to the engineers with an average baseline overlay, which represents the vibration signature of each particular machine as it appears when the machine is in good health. Machine diagrams and schematics identifying forcing frequencies were also provided to aid in the analysis. The engineers were asked to identify any mechanical faults present and, for each fault found, to provide supporting information for their analysis, assign one of four severity levels (slight, moderate, serious and extreme), and recommend an appropriate repair action.
The results were then compared to the unedited output of SymphonyAI's Expert Automated Diagnostic System (EADS). A senior engineer with 15 years of daily vibration analysis experience also analyzed the data alongside the EADS reports and corrected the diagnoses. This EADS output, reviewed and edited by the senior engineer, was considered the "control" in the study, and his analysis was deemed the best available. The performance of the EADS system and of the other engineers was thus compared against the senior engineer's analysis.
EADS was, on average, 8% more accurate than the three engineers, scoring 86% against the senior engineer's analysis versus roughly 78% for the junior engineers. EADS took a total of 20 minutes to analyze all of the data and produce a concise report, whereas the engineers took an average of 130 hours to analyze the same data, without producing a polished report!
Discussion of results
Among the discrepancies in both the EADS and the human analyses, 93% of the errors involved differentiating between the slight and moderate severity levels, which typically result in no repair recommendation or simply a recommendation to monitor the machine for changes in vibration. This implies that both the humans and the automated system are best at finding the more serious or obvious mechanical faults. Fortunately, these are also the most important faults to recognize.
Another interesting result of the study is that 67% of the faults missed by the junior engineers were one of multiple faults in the machine. This implies that the human analysts tend to identify the predominant fault but then fail to look deeper for additional faults, whereas EADS looked for all of the faults present in the machines.
Finally, both EADS and the human analysts averaged about the same number of missed faults as false positives. Although the scoring criteria allowed EADS or the engineers to be off by one level of fault severity and still be counted correct, there were few cases where such discrepancies existed.
It should also be noted that most of the erroneous calls EADS made were caused by bad data (incorrect test conditions) in this data set. Where bad data was present, the human analysts were slightly better at determining its cause, such as accelerometer overload or incorrect machine speed. EADS was nearly as proficient at determining that the data was abnormal, but was not as specific about the cause.
History of EADS
The Expert Automated Diagnostic System was originally developed for the US Navy (where it is still in use) by DLI, beginning in the early 1980s, and was made available commercially in 1990. The main reason for developing the system was to reduce the number of man-hours needed to successfully analyze hundreds (and later thousands) of machine tests per month, and to provide an objective analysis of machine health that would not rely on the experience level of the analyst.
The system is empirically based on more than 20,000 machine tests collected annually since the early 1980s, and it continues to evolve today as new machine types are encountered and added. Currently it is installed and operating successfully in hundreds of plants around the world, covering industries ranging from breweries to aircraft carriers, pharmaceutical companies to computer chip makers, and nuclear power plants to oil refineries.
How it works
EADS uses the same process as a human analyst to analyze vibration data. The run speed of the machine is identified using an automatic data-normalization process, which also serves to line the data up with the reference data.
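The normalization step can be sketched in a few lines. The function below is a hypothetical illustration (not EADS code) of rescaling peak frequencies into "orders" of running speed, so that spectra taken at slightly different speeds line up with the reference:

```python
# Illustrative sketch of run-speed normalization (not EADS internals):
# express peak frequencies as multiples ("orders") of the running speed,
# so a machine tested at 1770 RPM lines up with a reference at 1750 RPM.

def normalize_to_orders(peaks_hz, run_speed_hz):
    """Convert peak frequencies in Hz to orders of running speed."""
    return [f / run_speed_hz for f in peaks_hz]

# A machine running at 29.5 Hz (1770 RPM) with peaks at 1x, 2x,
# and a suspected bearing tone:
peaks = [29.5, 59.0, 153.4]
orders = normalize_to_orders(peaks, 29.5)
print([round(o, 2) for o in orders])  # -> [1.0, 2.0, 5.2]
```

In order-normalized form, the 1x and 2x peaks of every test of the same machine fall in the same bins regardless of small speed variations, which is what makes direct comparison to the baseline possible.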
Known forcing frequencies are identified and extracted from the spectra, as are non-synchronous peaks that may be identified as bearing tones. Cepstrum analysis is used to determine whether non-synchronous peaks are parts of harmonic series or have sidebands, further confirming that they are bearing tones. If demodulation (envelope detection) data are collected, they too are screened for bearing tones and compared to the spectra.
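As a rough illustration of the question the harmonic-series test answers, the sketch below checks peak spacing directly in the frequency domain. The function and tolerance are invented for illustration; cepstrum analysis establishes the same thing in the quefrency domain, where a harmonic series produces a strong cepstral peak:

```python
# Illustrative check (not EADS internals): do these non-synchronous peaks
# form a harmonic series of a candidate fundamental? A true bearing tone
# typically shows harmonics at integer multiples of the defect frequency.

def is_harmonic_series(peaks_hz, fundamental_hz, tol=0.02):
    """True if every peak lies within `tol` (relative to the fundamental)
    of an integer multiple of the candidate fundamental frequency."""
    for f in peaks_hz:
        n = round(f / fundamental_hz)
        if n < 1 or abs(f - n * fundamental_hz) > tol * fundamental_hz:
            return False
    return True

# Suspected bearing tone at 153.4 Hz with clean harmonics:
print(is_harmonic_series([153.4, 306.8, 460.2], 153.4))  # -> True
# An unrelated peak at 290 Hz breaks the series:
print(is_harmonic_series([153.4, 290.0, 460.2], 153.4))  # -> False
```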
Once the peaks of importance have been identified and extracted from the spectra, they are compared to the baseline and processed through the complex rules that apply to the particular machine type. Typically, each test location comprises tri-axial data in two frequency ranges, plus one demodulation point if the unit has rolling-element bearings. (Note that this data collection may be initiated by a single swipe of a barcode reader.) The rule base not only compares these spectra to each other when determining what faults are present; it also compares data from different positions on the machine. For example, for coupling misalignment to be cited, signs of misalignment must be present on both sides of the coupling, and other potential faults must be disqualified. This is thus quite different from a simple system of alarms or alarm bands.
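The cross-position logic of the misalignment example can be sketched as follows. The rule shape, names, and the 3x-baseline threshold are all hypothetical, invented for illustration; the actual rule base is far more involved:

```python
# Hypothetical sketch of a cross-position diagnostic rule (not an actual
# EADS rule): cite coupling misalignment only when evidence appears on
# BOTH sides of the coupling and a competing fault has been ruled out.

def coupling_misalignment(driver_axial_2x, driven_axial_2x,
                          baseline_2x, imbalance_suspected):
    """True only if elevated 2x axial vibration (here, >3x baseline,
    an invented threshold) is seen on both coupling sides and
    imbalance has been disqualified."""
    both_sides_elevated = (driver_axial_2x > 3 * baseline_2x and
                           driven_axial_2x > 3 * baseline_2x)
    return both_sides_elevated and not imbalance_suspected

print(coupling_misalignment(0.9, 0.8, 0.2, False))  # -> True
print(coupling_misalignment(0.9, 0.1, 0.2, False))  # -> False (one side only)
```

A simple alarm band would fire on the first machine position alone; the cross-position check is what lets the rule distinguish misalignment from other faults that also raise vibration on one side.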
Unique aspects of the system, as mentioned earlier, are that it does not rely on industry-standard alarms. Instead it uses a baseline of statistically averaged data from the machines themselves, along with a rule base of over 4,500 unique rules for identifying individual faults in a variety of machine types. Additionally, the system provides a concise report identifying individual faults and their severity, along with a repair recommendation and a corresponding priority level, rather than just data (Figure 1). Configuration of the system is accomplished using a setup “wizard” that asks simple questions about the machine.
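A statistically averaged baseline of the kind described can be sketched as a bin-by-bin mean over spectra from healthy machines; the function and data below are illustrative only, not the actual averaging method:

```python
# Illustrative sketch (not EADS code) of a statistical baseline:
# average equal-length amplitude spectra bin by bin, so reference levels
# come from the machines themselves rather than industry-standard alarms.

def average_baseline(spectra):
    """Element-wise mean of equal-length amplitude spectra."""
    n = len(spectra)
    return [sum(bins) / n for bins in zip(*spectra)]

# Three spectra from the same machine type in known good health
# (amplitudes per frequency bin, invented values):
healthy = [
    [0.10, 0.40, 0.05],
    [0.12, 0.38, 0.07],
    [0.08, 0.42, 0.06],
]
print([round(b, 2) for b in average_baseline(healthy)])  # -> [0.1, 0.4, 0.06]
```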
Notes on cost savings
As mentioned in the last section, EADS performs its analysis in the same way a human would approach the task, and both require the same information to analyze data successfully: a minimum of information about the machine and the types of faults to which it is prone, plus good data compared against some sort of reference. Holding the time to collect these inputs equal, the time to actually analyze the data and produce reports can be compared.
If one assumes a U.S. labor rate of $30–$40 per hour for a mid-level engineer with 2–4 years' experience, then 130 man-hours cost between $3,900 and $5,200. Divided by the 332 machines analyzed, that is approximately $12–$16 per machine test. If machines are tested monthly, the total comes to $144–$192 per machine per year. Note that in this study the engineers did not write formal reports of their findings, so the actual cost per machine per year would be considerably higher, depending on the detail needed in the reports.
If the time to initiate downloading and processing data through EADS is considered negligible (it takes less time to process data through EADS than to print graphs for manual analysis), then these figures can be read as the cost savings associated with using EADS. Returning to the total man-hour cost of analyzing this data set ($3,900–$5,200), the annual savings from monitoring this group of 332 machines on a monthly basis is between $46,800 and $62,400. Again, these figures do not include the generation of formal reports.
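The cost arithmetic above can be verified in a few lines; the hours, rates, and machine count are the study's stated figures and assumptions:

```python
# Reproducing the study's cost arithmetic (rates are assumptions, as stated).
HOURS = 130                    # engineer time for one analysis pass
MACHINES = 332                 # machines in the data set
RATE_LOW, RATE_HIGH = 30, 40   # assumed $/hour for a mid-level engineer

cost_low, cost_high = HOURS * RATE_LOW, HOURS * RATE_HIGH
print(cost_low, cost_high)                                  # -> 3900 5200
print(round(cost_low / MACHINES), round(cost_high / MACHINES))  # -> 12 16
print(12 * round(cost_low / MACHINES),
      12 * round(cost_high / MACHINES))                     # -> 144 192
print(12 * cost_low, 12 * cost_high)                        # -> 46800 62400
```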
Not only was EADS 8% more accurate than degreed engineers who had manually analyzed hundreds of machine tests a week, month after month, for 2–4 years; it accomplished the task nearly 400 times faster (20 minutes versus 130 hours). Instead of taking more than three man-weeks and costing $3,900–$5,200, EADS completed the task in 20 minutes, and it did a better job.