

Redefining Rail Inspection with AI-Based Computer Vision


Railroads are the connective tissue of our world’s infrastructure. But despite their critical role in global transportation and supply chains, most railroad track maintenance begins with a human inspector.

To identify damaged rail ties and tracks, human inspectors walk or drive miles of railroad every day looking for inconsistencies. The total time and cost of manually inspecting the United States’ 160,000 miles of track alone are difficult even to quantify, and because the work relies on human judgment and attention, it is also inherently error-prone.

Recent advancements in computer vision (CV) have opened new opportunities to automate railroad inspection, significantly reducing costs and improving accuracy in the process. But railroads present unique challenges for CV systems that range from high variability in the deployment environment to a safety-critical industry’s preference for trusted solutions.

That’s why organizations like Ignitarium, a product engineering company, are reinventing CV using AI technology to address pain points and reduce the need for human track inspection.

Overcoming Challenges in Infrastructure Inspection with AI-Based CV

Unlike controlled indoor environments where CV systems have a proven track record, railroads present a wide range of lighting conditions, weather variations, and other unpredictable factors. These variables can significantly impact the performance and accuracy of CV systems.

Another uphill battle for CV technology in outdoor railroad applications is changing what Ignitarium CTO Sujeeth Joseph calls “a highly traditional mindset in the industry,” referring to rail professionals’ desire to use tried-and-tested methods over novel approaches.


These challenges led to the development of Ignitarium’s TYQ-i platform, which aims to blend the best of classical CV techniques with advanced custom neural nets. The result is an efficient solution that can detect a wide range of anomalies over many miles of track.

The operation of TYQ-i can be broken down into four stages, illustrated with a conceptual sketch after the list:

  • Ingestion: The platform supports many visual sensors, including RGB, 3D, laser, and multispectral interfaces. In the rail industry, 2D cameras and laser scanners are the go-to sensors, says Joseph.
  • Preprocessing: Ignitarium has developed a library of image processing components that prepare the data for analysis. These include basic operations such as scaling and rotation as well as more complex tasks like stitching, tracking, and noise reduction.
  • Deep learning: At the heart of TYQ-i are custom AI models for specific use cases. These models have been pretrained to detect various anomaly classes, delivering high levels of accuracy and efficiency with minimal input from the customer.
  • Presentation: The processed data is then presented to the user through dashboards and files that are human- and machine-readable. This enables the platform to integrate seamlessly with existing processes, helping overcome resistance to adopting new technology.
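To make the flow concrete, here is a minimal Python sketch of such a four-stage pipeline: ingest frames from a camera, preprocess them, run an anomaly detector, and emit a machine-readable report. It is an illustrative skeleton built on assumed names, not Ignitarium’s code; the anomaly classes, the denoising parameters, and the detect_anomalies stub are placeholders for what a trained model would actually do.

```python
import json
import cv2          # image preprocessing and camera ingestion
import numpy as np

ANOMALY_CLASSES = ["missing_fastener", "cracked_tie", "low_ballast"]  # placeholder labels

def preprocess(frame: np.ndarray) -> np.ndarray:
    """Scale, denoise, and normalize a raw camera frame (stand-in for the
    platform's preprocessing library)."""
    frame = cv2.resize(frame, (640, 640))
    frame = cv2.fastNlMeansDenoisingColored(frame, None, 10, 10, 7, 21)
    return frame.astype(np.float32) / 255.0

def detect_anomalies(frame: np.ndarray) -> list[dict]:
    """Placeholder for the deep-learning stage: a real system would run a
    trained detection model here and return per-anomaly scores and boxes."""
    return [{"class": ANOMALY_CLASSES[0], "score": 0.93, "box": [120, 40, 220, 90]}]

def report(detections: list[dict], frame_id: int) -> str:
    """Presentation stage: emit a machine-readable record that dashboards or
    downstream maintenance systems can consume."""
    return json.dumps({"frame": frame_id, "detections": detections})

# Ingestion stage: read frames from a track-facing camera (device index 0 here).
capture = cv2.VideoCapture(0)
frame_id = 0
while capture.isOpened():
    ok, raw = capture.read()
    if not ok:
        break
    detections = detect_anomalies(preprocess(raw))
    print(report(detections, frame_id))
    frame_id += 1
capture.release()
```

In a production system, the placeholder detector would be replaced by a trained deep-learning model, and the JSON records would feed dashboards or downstream maintenance equipment.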

According to Joseph, one example of how these capabilities can be used is railroad ballast, the bed of crushed stone on which railroad ties are laid. A drone or a camera mounted on the underside of a locomotive could use TYQ-i to detect areas where ballast needs to be replenished, as well as areas that should be avoided because of safety or other operational concerns. That information would then pass to the ballast-laying and tamping machine so it could automatically perform maintenance in only the appropriate areas.

Achieving Scale and Flexibility with TYQ-i

To achieve the accuracy and reliability Ignitarium targets, the TYQ-i platform’s models were initially trained using TensorFlow and PyTorch, two of the most popular open-source frameworks for machine learning and neural networks. That training was performed on powerful Intel® CPU and GPU targets, providing a solid foundation for the platform’s AI capabilities.
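For readers unfamiliar with that workflow, the sketch below shows what training a rail-defect classifier can look like in PyTorch (with a recent torchvision), starting from an ImageNet-pretrained backbone and exporting to ONNX for later optimization. The dataset layout, class labels, and hyperparameters are assumptions for illustration, not details Ignitarium has disclosed.

```python
import torch
import torch.nn as nn
from torchvision import datasets, models, transforms

# Assumed layout: track_images/{defect,normal}/*.jpg -- a hypothetical dataset.
train_data = datasets.ImageFolder(
    "track_images",
    transform=transforms.Compose([
        transforms.Resize((224, 224)),
        transforms.ToTensor(),
    ]),
)
loader = torch.utils.data.DataLoader(train_data, batch_size=32, shuffle=True)

# Start from an ImageNet-pretrained backbone and replace the classifier head.
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.fc = nn.Linear(model.fc.in_features, 2)  # defect vs. normal

optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
criterion = nn.CrossEntropyLoss()

model.train()
for epoch in range(5):  # placeholder epoch count
    for images, labels in loader:
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()
    print(f"epoch {epoch}: loss {loss.item():.4f}")

# Export to ONNX so the trained model can later be optimized for Intel hardware.
model.eval()
torch.onnx.export(model, torch.randn(1, 3, 224, 224), "defect_classifier.onnx")
```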

But to truly scale performance across a variety of use cases, Ignitarium recognized the need for a more versatile processing solution. This led to the decision to migrate TYQ-i to Intel® Core™ and Intel® Xeon® processors. While not a common target, there is even a port to the Intel® Arria® family of FPGAs.

The interoperability of these processors helps the company keep costs in check. “If the workload is heavier, we would go with server-class machines,” explains Joseph. But for lighter workloads, the company uses solutions like 12th Gen Intel® Core™ processors, which can accelerate AI with their built-in integrated graphics processors (iGPUs).

The migration to Intel® processors also brought additional benefits. For instance, it allowed Ignitarium to take advantage of the robust software infrastructure of the Intel ecosystem, which offers a wide range of tools and resources to optimize performance and efficiency.

One such tool is the OpenVINO AI toolkit, which Ignitarium used to further optimize TYQ-i. OpenVINO is designed to facilitate the deployment of AI applications at the edge, offering support for a variety of neural network architectures and providing a comprehensive set of tools for optimizing performance.

Because the toolkit supports a wide range of Intel processors, Ignitarium can pick a processor and “the code just compiles and runs,” Joseph explains. At the same time, OpenVINO offers a variety of tools that help developers get the most out of their chosen processor. “We optimize using everything that the toolkit can provide us,” says Joseph.
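A rough sketch of that “compile and run” workflow, using the OpenVINO Python API from recent releases, might look like the following. The model file and device choices are placeholders, and this generic pattern stands in for whatever Ignitarium’s actual deployment code does.

```python
import numpy as np
import openvino as ov

core = ov.Core()
print("Available devices:", core.available_devices)  # e.g. ['CPU', 'GPU']

# Load a trained network (the hypothetical ONNX classifier exported earlier).
model = core.read_model("defect_classifier.onnx")

# Compile for a specific target, or let "AUTO" pick the best available device.
compiled = core.compile_model(model, device_name="AUTO")

# Run inference on a dummy frame; a real pipeline would feed preprocessed camera data.
frame = np.random.rand(1, 3, 224, 224).astype(np.float32)
result = compiled(frame)[compiled.output(0)]
print("Defect scores:", result)
```

Because the same compiled-model call works whether the target is a CPU, an integrated GPU, or another supported accelerator, switching hardware is largely a matter of changing the device_name argument.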

All of these capabilities allow TYQ-i to run in various environments, from edge devices to cloud-based systems. At the edge, TYQ-i can process data in real time, providing immediate insights and allowing for quick decision-making. This is particularly useful in situations where low latency is crucial, such as detecting defects on a high-speed railway line.

For larger-scale applications, TYQ-i can also be deployed in the cloud to support vast amounts of data and perform more-complex analyses—a useful feature for monitoring extensive rail networks.

This flexibility allows it to be deployed in a wide range of scenarios, making it a highly adaptable solution for infrastructure monitoring.

The Future of Infrastructure Inspection Is Here

The challenges facing the rail inspection industry are significant. From the vast expanse of tracks to highly varied environments, the industry is in desperate need of innovative solutions. Ignitarium’s TYQ-i platform, with its blend of AI and CV technologies, offers a powerful answer to these challenges.

TYQ-i’s custom AI models, honed for high performance with minimal customer data, make for a solution that folds readily into existing workflows, easing the industry’s wariness of new answers to old problems. The result is a platform that is winning over track maintainers across the US.

As we look to the future, it’s clear that AI-based computer vision solutions like TYQ-i will play a crucial role in transforming the infrastructure inspection industry, delivering improved accuracy, efficiency, and safety for all.
 

This article was edited by Christina Cardoza, Associate Editorial Director for insight.tech.

About the Author

Brandon is a long-time contributor to insight.tech going back to its days as Embedded Innovator, with more than a decade of high-tech journalism and media experience in previous roles as Editor-in-Chief of electronics engineering publication Embedded Computing Design, co-host of the Embedded Insiders podcast, and co-chair of live and virtual events such as Industrial IoT University at Sensors Expo and the IoT Device Security Conference. Brandon currently serves as marketing officer for electronic hardware standards organization, PICMG, where he helps evangelize the use of open standards-based technology. Brandon’s coverage focuses on artificial intelligence and machine learning, the Internet of Things, cybersecurity, embedded processors, edge computing, prototyping kits, and safety-critical systems, but extends to any topic of interest to the electronic design community. Drop him a line at techielew@gmail.com, DM him on Twitter @techielew, or connect with him on LinkedIn.
