

Transforming the Factory Floor with Real-Time Analytics


Manufacturers are under a lot of pressure to take advantage of all the intelligent capabilities available to them—technologies like machine vision and AI-driven video analytics. These can be crucial tools to enable everything from defect detection and prevention to worker safety. But very few manufacturers are experts in the AI space, and there are many things to master and many plates to keep in the air—not to mention future-proofing a big technology investment. These new technologies need to be built to be adaptable and interoperable.

Two people who can speak to these needs are Jonathan Weiss, Chief Revenue Officer at industrial machine vision provider Eigen Innovations; and Aji Anirudhan, Chief Sales and Marketing Officer at AI video analytics company AllGoVision Technologies. They talk about the challenges of implementing Industry 4.0, what manufacturers have to do to take advantage of the data-driven factory, and how AI continues to transform the factory floor (Video 1).

Video 1. Industry experts from AllGoVision and Eigen Innovations discuss the transformative impact of AI in manufacturing.

How can machine vision and AI address Industry 4.0 challenges?

Jonathan Weiss: All we do is machine vision for quality inspection, and we’re hyper-focused on industrial manufacturing. Traditional vision systems really lend themselves to detecting problems within the production footprint, and they will tell you if the product is good or bad, generally speaking. But then, how do you help people prevent defects, not just tell them they’ve produced one?

And that’s where our software is pretty unique. We don’t just leverage vision systems and cameras and different types of sensors, we also interface directly with process data—historians, OPC UA servers, even direct connections to PLCs at the control-network level. We give people insights into the variables and metrics that actually went into making the part, as well as what went wrong in the process and what kind of variation occurred that resulted in the defect. And a lot of what we do is AI and ML based.
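As a loose illustration of that idea—not Eigen’s actual software—pairing a vision system’s defect flag with the process sample recorded closest to it in time might be sketched like this. All data, variable names, and the matching heuristic here are hypothetical:

```python
from bisect import bisect_left

# Hypothetical process-variable history, e.g. pulled from a historian or
# OPC UA server: (timestamp_seconds, {variable: value}) samples.
process_history = [
    (10.0, {"weld_temp_c": 412.0, "feed_rate_mm_s": 5.1}),
    (11.0, {"weld_temp_c": 455.0, "feed_rate_mm_s": 5.0}),
    (12.0, {"weld_temp_c": 498.0, "feed_rate_mm_s": 4.2}),
]

def process_context(defect_time, history):
    """Return the process sample closest in time to a detected defect."""
    times = [t for t, _ in history]
    i = bisect_left(times, defect_time)
    candidates = history[max(0, i - 1): i + 1]
    return min(candidates, key=lambda s: abs(s[0] - defect_time))

# A vision system flags a defect at t=11.8 s; look up what the process
# was doing at that moment.
t, variables = process_context(11.8, process_history)
print(t, variables["weld_temp_c"])  # -> 12.0 498.0
```

In a real deployment the history would come from live connections to historians or PLCs rather than an in-memory list, but the core move is the same: join the defect event to the process variables that produced the part.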

How can video analytics address worker risk in the current industrial environment?

Aji Anirudhan: The primary thing in this industry is asking how you enhance the automation, how you bring in more machines. But people are not going to disappear from the factory floor, which basically means that there is going to be more interaction between the people and the machines there.

The UN has data indicating that workplace injuries and damages cost companies $2.68 trillion annually worldwide. This cost is a key concern for every manufacturer. Traditionally, what they have done is looked at different scenarios in which there were accidents and come up with policies to make sure those accidents don’t happen again.

But that’s not enough to bring these costs down. There could be different reasons why the accidents are happening; a scenario that is otherwise not anticipated can still create a potential accident. So you have to have a real-time mechanism in place that actually makes sure that the accident never happens in the first place.

That means that if a shop-floor employee is supposed to wear a hard hat and doesn’t, it is identified so that frontline managers can take care of it immediately—even if an accident hasn’t happened. The bottom line is: Reducing accidents means reduced insurance costs, and that adds to a company’s top line/bottom line.
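A minimal sketch of that kind of check—assuming an upstream detector already produces bounding boxes for people and hard hats—could look like the following. The names and the head-region heuristic are hypothetical, not AllGoVision’s implementation:

```python
def has_hard_hat(person_box, hat_boxes):
    """True if any detected hard hat's center falls in the top third
    of the person's bounding box. Boxes are (x1, y1, x2, y2), y grows down."""
    px1, py1, px2, py2 = person_box
    head_y_limit = py1 + (py2 - py1) / 3
    for hx1, hy1, hx2, hy2 in hat_boxes:
        cx, cy = (hx1 + hx2) / 2, (hy1 + hy2) / 2
        if px1 <= cx <= px2 and py1 <= cy <= head_y_limit:
            return True
    return False

# Hypothetical detections from one video frame.
persons = [(100, 50, 160, 250), (300, 60, 360, 260)]
hats = [(110, 55, 150, 85)]  # only the first person wears one

violations = [p for p in persons if not has_hard_hat(p, hats)]
print(len(violations))  # -> 1: one worker flagged for a missing hard hat
```

A production system would run continuously on the video stream and route each violation to a frontline manager in real time, which is the point of the approach: the alert arrives before an accident, not after.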

In the industrial-manufacturing segment, it’s a combination of different behavioral patterns of people, or different interactions between people and machines or people and vehicles. And what we see in worker-safety requirements is also different between customers: oil and gas has different requirements from what is needed in a pharmaceutical company—the equipment, the protective gear, the safety-plan requirements.

For example, we worked with a company in India where hot metal is part of the production line, and there are instances when it gets spilled. It’s very hazardous, both from a people-safety and from a plant-safety point of view. The company wants it continuously monitored and immediately reported if anything happens. 

Are manufacturers prepared to take on the data-driven factory at this point?

Jonathan Weiss: Manufacturers as a whole are generally on board with the need to digitize, the need to automate. I do think there’s still a lot of education required on the right way to go about a large-scale initiative—where to start; how to ensure program effectiveness and success; and then how to scale that out beyond factories.

In my world, it’s helping industrials overcome the challenges of camera systems being siloed and not communicating with other enterprise systems. Also, not being able to scale those AI models across lines, factories, or even just across machines. That’s where traditional camera systems fail. And at Eigen, we’ve cracked that nut.

“By bringing vision systems and #software tools to factories, we’re enabling them to inspect parts faster” – Jonathan Weiss, @EigenInnovation via @insightdottech

But what Aji and I do is a small piece of a much larger puzzle, and the one common thread in that puzzle is data. That’s how we drive actionable insights or automation, by creating a single source of truth for all production data. Simply put, it’s a single place to put everything—quality data, process data, safety data, field-services-type data, customer data, warranty information, etc. Then you start to create bidirectional connections with various enterprise-grade applications so that ERP knows what quality is looking at, and vice versa.

It’s having that single source of truth, and then having the right strategy and architecture to implement various types of software into that single source of truth for the entire industrial enterprise.
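The single-source-of-truth pattern can be sketched as a tiny hub that stores every production record and notifies subscribed systems of updates, so that, for example, an ERP view stays in sync with quality events. This is a purely illustrative toy, with all names hypothetical:

```python
from collections import defaultdict

class ProductionDataHub:
    """Toy sketch of a 'single source of truth': one store for all
    production records, with subscriber callbacks so enterprise systems
    (ERP, quality, safety) see each other's updates bidirectionally."""

    def __init__(self):
        self.records = {}                 # topic -> list of records
        self.subscribers = defaultdict(list)

    def subscribe(self, topic, callback):
        self.subscribers[topic].append(callback)

    def publish(self, topic, record):
        self.records.setdefault(topic, []).append(record)
        for callback in self.subscribers[topic]:
            callback(record)

hub = ProductionDataHub()
erp_view = []
hub.subscribe("quality", erp_view.append)  # ERP mirrors quality events
hub.publish("quality", {"part": "A-100", "result": "pass"})
print(erp_view[0]["part"])  # -> A-100
```

In practice this role is played by enterprise-grade data platforms rather than an in-process object, but the architecture is the same: one authoritative store, with every application reading from and writing back to it.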

How can manufacturers apply machine vision to factory operations?

Jonathan Weiss: You have to understand first what it is that you’re trying to solve. What is the highest value defect that occurs the most frequently that you would like to mitigate?

In the world of welding it’s often something that the human eye can’t see, and vision systems become very important. You need infrared cameras in complex assembly processes, for example, because a human eye cannot easily see all around the entire geometry of a part to understand if there’s a defect somewhere—or it makes it incredibly challenging to find it.

It’s finding a use case that’s going to provide the most value and then working backwards from there. Then it’s all about selecting technology. I always encourage people to find technology that’s going to be adaptable and scalable, because if all goes well, it’s probably not going to be the only vision system you deploy within the footprint of your plant.

Aji Anirudhan: Most factories are now covered with CCTV cameras for compliance and other needs, and our requirements at AllGoVision easily match with the in/output coming from them. Maybe the position of the camera should be different, or the lighting conditions. Or maybe very specific use cases require a different camera—maybe a thermal camera. But 80% of the time we can reuse existing infrastructure and ride on top of the video feed.

What’s the importance of working with partners like Intel?

Aji Anirudhan: We were one of the first video-analytics vendors to embrace the Intel OpenVINO architecture. We have been using Intel processors from the early versions all the way to Gen4 and Gen5 now, and we’ve seen a significant improvement in our performance. What Intel is doing in terms of making platforms available and suitable for running deep learning-based models is very good for us.

There are new enhancements for running those deep learning algorithms—like the integrated GPUs or the new Arc GPUs—and we are excited to see how we can use them to run our algorithms more effectively. Intel is a key partner with respect to our current strategy and also going forward.

As this AI space continues to evolve, what opportunities are still to come?

Jonathan Weiss: At Eigen, we do a variety of types of inspections. One example is inspecting machines that put specialty coatings on paper. One part of the machine grades the paper as it goes through, and you only have eight seconds to catch a two-and-a-half-millimeter buildup of that coating on the machine or it does about $150,000 worth of damage. And that can happen many, many times throughout the course of a year. It can even happen multiple times throughout the course of a shift.

And when I think about what the future holds, we have eight seconds to detect that buildup and automate an action to prevent equipment failure. We do it in about one second right now, but it’s really exciting to think about when we do it in two-thirds of a second or half a second in the future. 
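The coating example boils down to a time-budgeted threshold check: detect a 2.5 mm buildup well inside the eight-second window. A hedged sketch of that logic, with entirely hypothetical sample data and names, might be:

```python
BUILDUP_LIMIT_MM = 2.5   # damage threshold from the example
RESPONSE_BUDGET_S = 8.0  # time available to act before ~$150K of damage

def first_breach(samples, limit=BUILDUP_LIMIT_MM):
    """Scan (timestamp_s, thickness_mm) samples and return the timestamp
    of the first reading at or above the limit, or None if never breached."""
    for t, thickness in samples:
        if thickness >= limit:
            return t
    return None

# Hypothetical thickness estimates streamed from a camera-based measurement.
stream = [(0.2, 0.4), (0.4, 1.1), (0.6, 2.0), (0.8, 2.6), (1.0, 3.0)]

detected_at = first_breach(stream)
print(detected_at)                        # -> 0.8
print(detected_at <= RESPONSE_BUDGET_S)   # -> True: time left to intervene
```

The hard part in the real system is producing those thickness estimates from imagery quickly enough; once they exist, the detection-and-respond loop is a race against the budget, which is why shaving detection from one second toward half a second matters.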

So I think what’s going to happen is that technology is just going to become even more powerful, and the ways that we use it are going to become more versatile. I see the democratization of a lot of these complex tools gaining traction. And at Eigen, we build our software from the ground up with the intent of letting anybody within the production footprint, with any experience level, be able to build a vision system. That’s really important to us, and it’s really important to our customers.

Although in our world we stay hyper-focused on product quality, there’s also the same idea that Aji mentioned earlier that people aren’t going away. And I think that speaks to a common misconception of AI, that it is going to replace you; it’s going to take your job away. What we see in product quality is actually the exact opposite of that: by bringing vision systems and software tools to factories, we’re enabling them to inspect parts faster. Now they’re able to produce more, which means the company is able to hire more people to produce even more parts.

A lot of my customers say that some of the highest turnover in their plants is in the visual-inspection roles. It can be an uncomfortable job—standing on your feet staring at parts going past you with your head on a swivel for 12 hours straight. And so this may have been almost a vitamin versus a painkiller sort of need, but it’s no longer a vitamin for these businesses. We’re helping to alleviate an organizational pain point, and it’s not just a nice-to-have.

Aji Anirudhan: What is interesting is all the generative AI, and how we can utilize some of those technologies. Large vision models are essentially about interpreting complex scenes and scenarios. I’ll give an example: There is an environment where vehicles go but a person is not allowed to go. And the customer says, “Yes, the worker can move on that same path if he’s pushing a trolley.” But how do you determine whether the person is with a trolley or without a trolley?

So we are looking at new enhancements in technology, like the LVMs, to bring out new use cases. Generative AI technology is going to help us address these use cases in the factory in a much better way in the coming years. But we still have a lot to catch up on. So we are excited about technology; we are excited about the implementation that is going on. We look forward to a much bigger business with various customers worldwide.

Related Content

To learn more about AI-powered manufacturing, listen to AI-Powered Manufacturing: Creating a Data-Driven Factory and read Machine Vision Solutions: Detect and Prevent Defects. For the latest innovations from AllGoVision and Eigen Innovations, follow them on Twitter/X at @AllGoVision and @EigenInnovation, and on LinkedIn at AllGoVision and Eigen Innovations Inc.

This article was edited by Erin Noble, copy editor.

About the Author

Christina Cardoza is an Editorial Director. Previously, she was the News Editor of the software development magazine SD Times and the IT operations online publication ITOps Times. She received her bachelor’s degree in journalism from Stony Brook University, and has been writing about software development and technology throughout her entire career.
