

Bringing Digital Twins to the Factory Floor


Imagine a crystal ball that could tell you whether a future project would be successful, or whether it would suffer fatal glitches along the way. Now imagine that crystal ball as more of a mirror image. Welcome to the world of digital twins.

For manufacturers and the whole factory environment, this concept has exciting implications, and it has been drawing a lot of interest recently. Martin Garner, COO and Head of IoT Research at CCS Insight, and Ricky Watts, Industrial Solutions Director at Intel, break it down for us: What exactly is a digital twin? What challenges might manufacturers face with this new technology? And what are the short- and long-term benefits?

What exactly is a digital twin—especially in the context of manufacturing?

Martin Garner: I like the view from the Digital Twin Consortium—the industry body around these things—that a digital twin is a virtual model of machines, factory processes, and other things that exist in the real world. There needs to be some sort of synchronization between the real thing and the virtual model, and that could be in real time or it could be much slower than that. You also need to know the quality of the data that’s being synced, and how frequently that’s happening. That might all sound quite simple, but there are a lot of layers going on.

For manufacturing you can think of a James Bond-style diagram on the boardroom wall, where the live state of all the operations is there in one view. From there you can analyze processes and look at wear rates, and you can do predictive maintenance, process modeling, and optimization. You can also do staff training without letting people loose on the real machine. One of the uses of digital twins that I really like is software testing and simulation. For example, you can do a software update on the virtual machine first, validate it, make sure it doesn’t crash or break, and then download the software to the real thing.

We’re also now starting to think about a grander, scaled-up vision of digital twins. Factories are at the hub of very large supply chains, so why not have a digital twin of the whole supply chain? That might even include a machine that you’ve supplied to a customer to see how things are working at that level. Tesla does that with its cars.

Ricky Watts: I would add that, if I think about why digital twins exist and where they come into the picture, really it relates to data. As we’re starting to see more and more data coming out of factories, we need to be able to intelligently understand that data before it’s applied. What a digital twin is, to some extent, is a way of representing the data as it’s coming out of the machine—making some assessment of that data, and understanding it before applying an output or an outcome.
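To make that picture concrete, here is a minimal sketch in Python of the kind of virtual model Garner and Watts describe: it ingests timestamped readings from the physical machine, keeps a virtual state in sync, and tracks how fresh that synchronized view is. The class and field names (MachineTwin, ingest, and so on) are illustrative assumptions, not taken from any particular product.

```python
from dataclasses import dataclass, field
from datetime import datetime, timedelta
from typing import Dict, List

# Illustrative sketch of a digital twin: a virtual model that mirrors a
# physical machine by ingesting timestamped sensor readings and tracking
# how well-synchronized its view of the real asset is.

@dataclass
class SensorReading:
    sensor_id: str
    value: float
    timestamp: datetime

@dataclass
class MachineTwin:
    machine_id: str
    max_staleness: timedelta = timedelta(seconds=5)  # sync-quality threshold
    state: Dict[str, SensorReading] = field(default_factory=dict)
    history: List[SensorReading] = field(default_factory=list)

    def ingest(self, reading: SensorReading) -> None:
        """Synchronize the virtual state with a new reading from the real machine."""
        self.state[reading.sensor_id] = reading
        self.history.append(reading)

    def is_stale(self, sensor_id: str, now: datetime) -> bool:
        """True if the twin's view of this sensor is older than the sync threshold."""
        reading = self.state.get(sensor_id)
        return reading is None or (now - reading.timestamp) > self.max_staleness

# Example: mirror a spindle temperature and check sync quality.
twin = MachineTwin(machine_id="press-07")
twin.ingest(SensorReading("spindle_temp_c", 61.4, datetime.utcnow()))
print(twin.is_stale("spindle_temp_c", datetime.utcnow()))  # False if recently synced
```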

“There are some very good uses that have really good payback times; #PredictiveMaintenance and #software testing are two of the key ones.” – Martin Garner, @CCSInsight via @insightdottech

What are some of the challenges that come with these new digital technologies?

Ricky Watts: One of the challenges is that manufacturers are not generally people who really understand AI and machine learning; we do have a skills gap, to some extent. So how do you implement something like this within the workforce that you’ve got today? Another thing, of course, is that this is all relatively new: How do you trust the data? How do you apply it? I think those are some of the challenges with AI and ML as well.

There are also small- and medium-sized businesses that represent a huge amount of the industrial footprint. So bringing scale into these solutions is important—scaling a digital twin for a car manufacturer versus one for somebody who makes screws for a car manufacturer. And it’s about making sure that we don’t just empower manufacturers; we also need to empower an ecosystem to be able to go out and service these models as well.

These are some of the things we’re certainly working on at Intel: how to simplify consuming some of these technologies; and building partnerships and ecosystems to bring in the infrastructure. There’s a lot that goes on behind the scenes to do that.

How can manufacturers successfully adopt digital twins?

Martin Garner: In the fullest version, digital twins can be quite a long-term project across both OT and IT. In that case it’s really not a quick fix. And in the current economic climate, some companies may hesitate to step into that bigger, long-term project. But using digital twins for short-term gain can be done. There are some very good uses that have really good payback times; predictive maintenance and software testing are two of the key ones. The trick is to make sure you get a properly architected system, one that is open enough to build up the ecosystem, plug in other machines, and expand toward the fuller vision.

Ricky Watts: We’re technologists; we see this huge potential. But right now I think concentrating on the needs of manufacturers is crucial. These manufacturers are very much focused on how they’re going to survive—probably in a very tough fiscal environment for the next few years. It really is around getting very tactical: what can I do with something today that’s going to give me a benefit tomorrow—not next week, not the week after, not next year.

Yes, predictive maintenance is a great example. If you’ve got a digital twin that represents some part of your machine, and that digital model tells you that your machine is going to fail, then you can fix something before it happens. If you do that, you keep your factory operating.

So, focus on something that’s going to add near-term value. That has two benefits. One, it’s solving a problem for today. And, two, it allows manufacturers to start learning themselves. It gives you a near-term outcome to address some of your near-term challenges; but also, long term, it allows you to expose your workforce to the use of data in those environments. And those opportunities in the near term could actually benefit us in the mid- to long term, with progress toward more expansive use of digital twins.
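As a rough illustration of the predictive-maintenance case Watts describes, the sketch below watches a rolling average of vibration readings from the twin's history and flags the machine for service before the trend crosses a failure threshold. The thresholds, window size, and data are hypothetical, not a production model.

```python
from collections import deque
from statistics import mean

# Hypothetical predictive-maintenance check: flag a machine for service when
# the rolling average of a sensor feed drifts past a warning threshold, so
# the fix happens before the failure does.

WINDOW = 20           # number of recent readings to average
WARN_THRESHOLD = 7.0  # vibration level (mm/s) assumed to precede failure

def needs_service(readings, window=WINDOW, threshold=WARN_THRESHOLD) -> bool:
    recent = deque(maxlen=window)
    for value in readings:
        recent.append(value)
        if len(recent) == window and mean(recent) > threshold:
            return True  # trend suggests failure is approaching: schedule maintenance
    return False

# Example: a slowly rising vibration trend taken from the twin's history.
vibration_mm_s = [4.0 + 0.05 * i for i in range(120)]
print(needs_service(vibration_mm_s))  # True: intervene before the machine fails
```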

What skill sets do manufacturers need to have available for this process?

Ricky Watts: You can’t create data scientists en masse. So, what can you do to effectively turn, say, a process-control engineer in an oil and gas plant into a data scientist without needing 10 years of training? We’re creating tools and capabilities in the background to repurpose the skills of that process-control engineer. To say, “Use the skills you have right now to tell me what’s going on. And then we’ll apply that to the data models and the digital twin models in a very simplistic way.”

In a sense, I do think skills are going to be needed: How do you install compute? How do you look after it? But let’s not lose the benefit of those process-control engineers: They know the outcome. They know when something’s wrong in their manufacturing. We can translate that into compute code that sits inside the digital twin and our models and our AI, and that then recognizes the issue.

What are the tools and technologies needed to implement this approach?

Martin Garner: One of the bits people find hardest is just getting the data feeds organized and set up so that the data is compatible. Different sensors and different machines present data in a whole variety of ways because they weren’t expected to have to be compatible. Factories are complicated things. They have a whole range of different machines and technologies—technologies at all levels of the technology stack, from connectivity all the way up to AI. That makes it very hard to do any kind of templating.

What that means is that for a larger system there might well need to be quite a lot of systems integration work to really get the value. I think you can start small and simple, start getting value from it in one small area, and then progressively build out from there. But it quickly becomes a bigger project as you scale it up.

Ricky Watts: One of the things we are doing is trying to create uniformity through factory standards such as OPC UA, a universal language for the machines to talk to each other so that every machine understands every other machine, at least to some extent.

Martin Garner: And the great thing about that is that it turns a stove-piped, vertical, proprietary thing into much more of a horizontal-platform approach. That’s much better for building out scale across manufacturers, supply chains, different sectors, and so on. It’s just an all-around better approach.
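For a sense of what that looks like in practice, here is a minimal sketch assuming the open-source python-opcua client library; the endpoint URL and node identifier are placeholders. The point is that any OPC UA-capable machine exposes its data through the same address-space model, so the same read works regardless of vendor.

```python
# Minimal OPC UA read, assuming the open-source python-opcua package
# (pip install opcua). Endpoint and node ID below are placeholders; a real
# machine publishes its own address space.
from opcua import Client

ENDPOINT = "opc.tcp://192.0.2.10:4840"          # placeholder machine endpoint
NODE_ID = "ns=2;s=Line1.Press07.SpindleTemp"    # placeholder node identifier

client = Client(ENDPOINT)
client.connect()
try:
    node = client.get_node(NODE_ID)
    value = node.get_value()  # same call for any OPC UA server, any vendor
    print(f"SpindleTemp = {value}")
finally:
    client.disconnect()
```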

How is Intel® working to make digital-twin concepts successful?

Ricky Watts: What Intel does extremely well—in addition to obviously building those wonderful compute platforms that run these things at the edge of the network—is look at scale and at creating standards. We’re working with industry partners on foundational efforts, building coalitions to define these standards. We’ve been doing that in the oil and gas industry around what they call the OPAF, the Open Process Automation Forum.

We’ve been looking at compute platforms—making sure that we’re bringing through the technologies that manufacturers are going to need. For example, they need to be synced on atomic clocks so that data on one platform is synced with time stamping to data on another platform. We’re enabling the software ecosystem to use these capabilities, making sure we validate with Linux operating systems, with Windows, with virtual machines, with Kubernetes—all these wonderful things that are basically software-infrastructure layers allowing us to run the applications.

And of course working with the end-user community to make sure we’re not creating Frankenstein’s monster here.
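One practical consequence of the time synchronization Watts mentions: once two platforms stamp their data against the same clock, their feeds can be joined on time. The sketch below, assuming pandas and made-up sensor feeds and column names, aligns readings from two sources to the nearest timestamp within a small tolerance.

```python
import pandas as pd

# Illustrative only: with both platforms time-stamping against a shared clock,
# their feeds can be aligned on time. Data and column names are made up.
press = pd.DataFrame({
    "timestamp": pd.to_datetime(["2024-05-01 10:00:00.000",
                                 "2024-05-01 10:00:00.100",
                                 "2024-05-01 10:00:00.200"]),
    "force_kn": [102.1, 103.4, 101.8],
})
vision = pd.DataFrame({
    "timestamp": pd.to_datetime(["2024-05-01 10:00:00.004",
                                 "2024-05-01 10:00:00.103",
                                 "2024-05-01 10:00:00.205"]),
    "defect_score": [0.02, 0.03, 0.41],
})

# Join each force reading to the nearest camera reading within 10 ms.
aligned = pd.merge_asof(press, vision, on="timestamp",
                        direction="nearest", tolerance=pd.Timedelta("10ms"))
print(aligned)
```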

Any final takeaways to leave us with?

Martin Garner: The full vision of digital twins might include something like planetary-scale weather and geological systems that can help us better understand global warming and things like that. But against that there are lots of smaller companies that really don’t know where to start. So we need to make it easier for them and more worthwhile for them to invest in this concept.

That means really focusing on the short term: how to save money using digital twins in the next quarter, how to make them easier to buy and set up. The vision is one thing, but we need to pull along the mass market of people who might use this as well. We can’t just do one or the other; we need to do both.

Ricky Watts: I think Martin is absolutely spot on. Keep it small, keep it simple. We do have solutions that are available to start you on your journey, and we’re really very much focused on what your problem is today.

Related Content

To learn more about digital twins in manufacturing, listen to the podcast The Role of Digital Manufacturing Operations. For the latest innovations from CCS Insight and Intel, follow them on Twitter at @ccsinsight and @Inteliot, and on LinkedIn at CCS-Insight and Intel-Internet-of-Things.

This article was edited by Erin Noble, copy editor.

About the Author

Christina Cardoza is an Editorial Director for insight.tech. Previously, she was the News Editor of the software development magazine SD Times and IT operations online publication ITOps Times. She received her bachelor’s degree in journalism from Stony Brook University, and has been writing about software development and technology throughout her entire career.
