Machine vision research to expand from laboratory to field studies near oil production, distribution regions
Credit: Southwest Research Institute
With over 80,000 miles of oil pipelines across the United States, many waterways are at risk for environmental damage from incidents such as the 2010 Kalamazoo Spill, which cost more than $1.2 billion and took three years to clean up. Monitoring waterways near oil pipelines is costly and time-consuming with conventional solutions that rely on satellite remote sensing or laser spectroscopy.
SwRI addresses these challenges with its Smart Leak Detection on Water (SLED-W) system, which uses algorithms to process visual and thermal data from cameras affixed to aircraft, stationary devices or watercraft.
“SLED-W was able to detect two different types of oil with unique thermal and visible properties,” said Ryan McBee, a research engineer who led the project for SwRI’s Critical Systems Department. “SLED-W showed positive initial results, and with further data collection, the algorithm will handle more varied external conditions.”
The internally funded project expands on previously developed SLED technology that detects methane gas from pipelines as well as liquid leaks on solid surfaces such as soil, gravel and sand.
SwRI applied a multidisciplinary approach to develop SLED-W. Computer scientists teamed with oil and gas experts from the Institute’s Mechanical Engineering Division to train algorithms to recognize the unique characteristics of oil on water. Oil can spread over water or blend with it, making it difficult for sensors to detect under varying lighting and environmental conditions.
“Labeling oil is a significant challenge. For SLED-W, we had to account for different behaviors so it would know what to consider and what to ignore to avoid false positives,” McBee said.
By combining thermal and visible cameras, SLED-W analyzes scenes from different perspectives. Visible cameras alone are limited by glare and have difficulty capturing transparent thin oils that blend with water. Thermal cameras require temperature differences to discern features, which can lead to false positives near animals and other warm objects. By feeding both thermal and visible images into the machine learning system, the algorithms can weigh the most relevant information from each modality, mitigating the weaknesses of each sensor.
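The article does not describe SLED-W's internals, but the fusion idea above can be illustrated with a minimal, hypothetical sketch: stacking a visible RGB frame and a co-registered thermal frame into one multi-channel array, normalized so neither modality dominates, before it would be passed to a learned classifier. The function name and normalization scheme here are illustrative assumptions, not SwRI's actual implementation.

```python
import numpy as np

def fuse_frames(visible_rgb, thermal):
    """Combine a visible RGB frame (H, W, 3) and a co-registered
    single-channel thermal frame (H, W) into one 4-channel array.

    This is an illustrative early-fusion sketch: real systems may
    instead fuse learned features from separate per-sensor networks.
    """
    vis = visible_rgb.astype(np.float32)
    thr = thermal.astype(np.float32)[..., np.newaxis]  # add channel axis
    fused = np.concatenate([vis, thr], axis=-1)        # (H, W, 4)

    # Per-channel min-max normalization to [0, 1] so the thermal
    # channel's raw scale does not overwhelm the visible channels.
    mins = fused.min(axis=(0, 1), keepdims=True)
    maxs = fused.max(axis=(0, 1), keepdims=True)
    return (fused - mins) / np.maximum(maxs - mins, 1e-6)
```

A downstream classifier trained on such 4-channel inputs can, in principle, learn when to trust the visible channels (e.g. thick, opaque oil) and when to lean on the thermal channel (e.g. thin sheens under glare), which is the complementarity the paragraph above describes.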
Next, the team will perform field testing to train the algorithms and is currently working with industry partners to equip aircraft with SLED-W to gather data in real-world conditions.
For more information, visit https:/
YouTube video for the release: https:/