

Getting the Smart Factory to 20/20 Machine Vision


In the past couple of years, manufacturers have been under a lot of pressure to streamline their operations. One way to do that is through the transformation to a smart factory. That in itself can mean a lot of things, one of them being the use of camera systems for machine vision. Throw AI into the machine vision solution, and the solution could start to seem more intimidating than the problem, particularly without data scientists or AI developers on hand.

David Dewhirst, Vice President of Marketing at Mariner, a provider of technology solutions leveraging IoT, AI, and deep learning, breaks the situation down for us. David spotlights Mariner’s crucial area of expertise—harnessing machine vision and AI for quality assurance on the factory floor, because, as he points out, quality needs to be paid for one way or another. And paying for it up front, on the factory floor, sure beats paying for it in a reputation for shoddy product.

What does it mean to be a smart factory?

I like to draw a distinction between data and information. Data is just the inputs—and they’re everywhere. You need to somehow transform that data so that you can do useful things with it as information. When I’m thinking about a smart factory, or a connected factory, I’m thinking about all of the information that’s inherent on the factory floor. So how do you connect all the data together? How do you process that data to get useful results out of it—to get information? And how do you avail yourself of new sensors and technology to really advance the state of the art in manufacturing?

How are manufacturers doing on this journey to becoming smart factories?

In fact, there is a high project-failure rate in this space. But you have to do it anyway, because all of your competitors are doing it. If you don’t, you’re going to be left behind.

In my observation, when these projects fail it’s because manufacturers haven’t actually thought through what they’re trying to do. They know they need to do this cool thing, but they may not necessarily be doing it to solve a specific problem. But that’s how I think any smart-factory initiative should proceed. If you’re charged with digital transformation in your factory, find the use case that may not be the coolest thing that you can do, but that solves the biggest, hairiest problem. Our solution is very pointedly aimed at improving defect detection in the factory, so that’s one kind of use case.

It’s also important to find those use cases where you can sell your project both below and above—to the engineers who are impacted by it, but also to the decision-makers who cut the checks. And then you’ll be on a clear middle path towards smart factory. Clearly identifying your use case will help you sell it, and it will also help you solve it; if it’s a defect-detection problem, you can go looking for companies like Mariner that specialize in that. And from there, maybe you’ll identify other use cases that you can tackle later on.

The best way to start identifying these use cases is to talk to the people who have the problems. Talk to the people on the factory floor, the engineers—the boots on the ground. They will often be aware of day-to-day problems; they may even be suppressing problems, or just trying to ameliorate problems that they would love to have a solution for if you just asked them. Also talk to the people above you. Say to them, “What is costing us money?”

What’s the importance of machine vision to the smart factory?

When we talk about machine vision or camera systems or computer vision in the factory setting, those are typically fixed cameras of a fixed type in a fixed position. They are very bespoke to the production line: their placement, their lighting, and their setup are all designed to target the specific product on that line. Their importance lies in their ability to improve the quality-control process.

There is the concept of total cost of quality, right? You’re going to spend money on your factory floor to have good quality that goes out the door. Or, if you don’t do that, you’re going to have a lot of returns, and you’re going to have warranty claims. Not spending money on quality on the factory floor means you’re still going to spend money on quality; it’s just going to be in canceled contracts and bad brand association.

“If you’re charged with #digital transformation in your #factory, find the use case that may not be the coolest thing that you can do, but that solves the biggest, hairiest problem.” – David Dewhirst, @MarinerLLC via @insightdottech

The cheapest, highest-ROI way to pay this cost is to do the quality work on the factory floor. This isn’t a new concept. Ever since the first assembly line in Dearborn, Michigan, you’ve had guys at the end of the line looking at products and doing quality control. Machine vision systems, or camera systems, to help do that have been around for decades. They are useful because they can present a very consistent look from piece to piece and from part to part. It always looks the same because the camera, as I said before, is fixed and precisely situated.

How does AI help take this process to the next level?

For the past several decades, machine vision systems have been very good at solving binary problems. For example, is there a hole in this piece, or is there not a hole in this piece? That’s a binary thing: yes or no. It’s very easy using traditional programming, which relies on those true/false questions to come up with a true/false answer.

But what happens when your problem isn’t binary? What happens when, instead of just asking whether there is a hole or not, you’re looking at, for example, whether something is an oil stain on fabric or a piece of lint? They’re both kind of fuzzy. Maybe the stain is a little bit fuzzier and the lint is less fuzzy, but you have to draw an arbitrary line between the fuzziness levels. Then what happens if there is lint that is a little bit fuzzier than where you drew the line? That gets called a defect. What happens if the stain is a little less fuzzy than you thought it would be? That will escape, because you might think that it’s lint. That’s where AI comes in.
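To make the contrast concrete, here is a hypothetical sketch (in Python, with made-up names and a made-up threshold value) of the kind of hard-coded rule a traditional system relies on, and how a fixed cutoff misclassifies borderline cases:

```python
# Hypothetical illustration of the rule-based approach described above:
# score a region's "fuzziness" and compare it to a hard-coded threshold.
# The function name and threshold value are illustrative only.

def classify_region(fuzziness_score: float, threshold: float = 0.5) -> str:
    """Label a region 'defect' (stain) or 'ok' (lint) by a fixed cutoff."""
    return "defect" if fuzziness_score >= threshold else "ok"

# Lint that is slightly fuzzier than the line gets flagged as a defect...
print(classify_region(0.55))  # -> "defect" (false positive)
# ...while a stain that is slightly less fuzzy than expected escapes.
print(classify_region(0.45))  # -> "ok" (escape / false negative)
```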

With machine learning, with deep-learning techniques, you don’t need to draw an arbitrary line for a true/false answer. You can just train the AI with enough samples of stains and lint, and the AI will learn on its own what the difference is. AI can solve those kinds of challenges that weren’t really solvable before with just traditional programming, so oftentimes you can get your machine vision system, your camera system, to do what you hired it to do, which it has never really been able to do well before.
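As an illustration only (not Mariner’s actual pipeline), training that kind of stain-versus-lint classifier might look something like the following Python/PyTorch sketch, assuming a hypothetical folder of labeled example images:

```python
# Minimal sketch: train a binary image classifier to tell "stain" from "lint".
# Assumes a hypothetical folder layout: data/stain/*.jpg and data/lint/*.jpg.

import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, transforms, models

transform = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
])

# ImageFolder infers the two class labels from the subfolder names.
train_data = datasets.ImageFolder("data", transform=transform)
loader = DataLoader(train_data, batch_size=16, shuffle=True)

# Start from a pretrained backbone and replace the head with a 2-class output.
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.fc = nn.Linear(model.fc.in_features, 2)

optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
loss_fn = nn.CrossEntropyLoss()

model.train()
for epoch in range(5):
    for images, labels in loader:
        optimizer.zero_grad()
        loss = loss_fn(model(images), labels)
        loss.backward()
        optimizer.step()
```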

What can manufacturers do if they have a lack of IT or AI support?

At Mariner, we use a tool. We ask your quality guys to take all the images you have of your product that show defects, upload them to the tool, and draw a little box around each defect. That lets your quality guys do what they’re good at—looking at these images and pointing out defects. We can take advantage of that and then do the part we’re good at, which is the data science. Our data scientists will build that AI model so you don’t need data science guys on the factory floor. We’ve got you covered on that.
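For illustration, an annotation produced by that kind of labeling step might look something like the sketch below; the file name and field names are hypothetical, not Mariner’s actual schema:

```python
# Hypothetical example of a defect annotation created when a quality
# engineer draws a box around a flaw in an uploaded image.

import json

annotation = {
    "image": "fabric_roll_0142.jpg",
    "defects": [
        {"label": "oil_stain",
         "bbox": {"x": 312, "y": 88, "width": 64, "height": 41}},
    ],
}

print(json.dumps(annotation, indent=2))
```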

Other companies with other solutions and other spaces will ship prebuilt models. Those may or may not work, depending on how closely those prebuilt models match what your particular situation is on the factory floor.

Where is all the data collection and processing happening—the edge or the cloud?

It depends. If you have 10,000 sensors all over your factory and you’re generating terabytes of data, you’re going to have to do it in the cloud. In machine vision there’s a little bit less reliance on the cloud. Mariner, with our Spyglass Visual Inspection solution—SVI—actually uses a hybrid solution. And that’s because, for the real-time defect-detection work, we don’t have time to make a round trip to the cloud. We’re doing our actual defect detection and the AI-inference work on the factory floor because then, even if you lose internet connection, your production isn’t shut down, your factory isn’t shut down.

We do also make use of the cloud. SVI is designed to run headless, without anybody standing around, but engineers can go back and review the decisions that the AI has made. If the AI got something wrong, the engineers can correct it. That will go up to the cloud. And if the AI model needs to be retrained, we can do that in the cloud because it doesn’t require real-time connectivity.
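As a rough, assumed sketch of that hybrid pattern (not Mariner’s actual SVI code): inference stays local on the factory floor, while reviewer corrections are queued and uploaded for cloud retraining when connectivity allows.

```python
# Assumed structure for the edge/cloud split described above.

import queue

corrections = queue.Queue()  # stands in for a durable local store

def inspect(image, model):
    """Run defect detection locally so production never waits on the cloud."""
    return model.predict(image)  # hypothetical model interface

def record_correction(image_id, predicted, corrected_label):
    """An engineer overrides a wrong AI decision; keep it for retraining."""
    corrections.put({"image": image_id,
                     "predicted": predicted,
                     "actual": corrected_label})

def sync_to_cloud(upload):
    """Flush queued corrections when connectivity allows; retraining happens in the cloud."""
    while not corrections.empty():
        upload(corrections.get())
```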

How do you work with other partners in this ecosystem to make it all come together?

Number one, we don’t sell cameras; we are an AI software-as-a-service solution. If you need cameras, we work with a vision integrator that will get you the right camera. By and large, we don’t care what the camera is; we can make use of any camera you already have, or work with you to get one.

Partner number two: because we need some powerful processing capabilities, we work very closely with Intel® and Nvidia, both on the factory floor. We ship AI software as a service that, ironically, arrives at your site on a server box. We do that because then we can build those server boxes to do what we want. So we have Intel® Xeon® chips in there for really muscular, beefy processing, and we have Nvidia cards in there for extra GPU power.

We also partner on the cloud with Microsoft, typically in Azure. There are a lot of prebuilt services and other capabilities in Azure that we can make use of, while also being certain about security, speed, and all those other important things.

Anything else you would like to add?

You may not need Mariner’s solution, but you will need to move forward with industrial IoT and AI. Actually, you may or may not need AI, given your use case, but you are going to need to have industrial IoT of some kind. Mainly I would encourage people to think about the use cases and the situations that are right for them. Find that hook, get in, and don’t be the last guy.

Related Content

To learn more about defect detection, read A Guaranteed Model for Machine Learning and listen to Product Defect Detection You Can Count On: With Mariner. For the latest innovations from Mariner, follow it on Twitter at @MarinerLLC and LinkedIn at Mariner.

 

This article was edited by Erin Noble, copy editor.

About the Author

Christina Cardoza is an Editorial Director for insight.tech. Previously, she was the News Editor of the software development magazine SD Times and the IT operations online publication ITOps Times. She received her bachelor’s degree in journalism from Stony Brook University, and has been writing about software development and technology throughout her entire career.
