There are so many shiny new toys to play with on the factory floor, from edge computing to digital twins, that have the potential to lead to faster, safer, more sustainable industrial processes. But with those benefits come challenges, like bridging the IT/OT divide and pairing advanced technologies with legacy infrastructure.
Rainer Brehm, CEO of Factory Automation at industrial manufacturing solution provider Siemens, discusses these industrial trends and transformations. He talks about standardization, autonomous AI, and use cases that include Siemens itself. He’s been at the company since 1999, and has seen firsthand how the space has evolved, as well as where it might be going next.
What trends can we expect in 2023 and beyond for the industrial space?
The trends we are seeing are combining the digital world (the simulation models, the digital twins) and the real world. You basically simulate everything up front and then you implement it. Now you have a feedback loop: you get real-time data out of the operation and feed it back to the digital twin, and then you can further optimize it. Leveraging data is significantly important, because AI isn't yet really a big thing on the shop floor, but it will become a big thing as data becomes more and more available.
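The feedback loop described here can be sketched in a few lines. This is a toy illustration, not Siemens code: `twin_predict`, `real_line`, and the bias term are all invented for the example. The twin predicts line throughput, the real line reports what actually happened, and the prediction error is fed back to correct the twin's model.

```python
# Illustrative sketch of a digital-twin feedback loop (all names assumed).

def twin_predict(speed: float, bias: float) -> float:
    """Toy twin model: throughput peaks near speed 50, plus a learned bias."""
    return 100.0 - 0.02 * (speed - 50.0) ** 2 + bias

def real_line(speed: float) -> float:
    """Stand-in for the physical line; it runs 2.0 units below the model."""
    return 100.0 - 0.02 * (speed - 50.0) ** 2 - 2.0

bias = 0.0
for _ in range(10):
    predicted = twin_predict(45.0, bias)
    measured = real_line(45.0)            # real-time data from the operation
    bias += 0.5 * (measured - predicted)  # feed the error back into the twin

# After calibration the twin tracks the real line closely.
print(round(twin_predict(45.0, bias) - real_line(45.0), 3))  # → 0.002
```

The point is the loop shape, not the model: simulation up front, live data back in, and a progressively better twin to optimize against.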
We will also see software-defined control, or software-defined automation. Currently, everything is very much bundled and tied with hardware, and it’s going to be more decoupled, more virtualized.
And, last but not least, especially when we look at the shop floor, the users of these more complex technologies are still the people operating the machines. These are not IT experts, but they still need to be capable of operating and maintaining those lines, those machines, those infrastructure plants. Therefore, we have the topic of human-centric automation: How can we make it as easy as possible?
What are the challenges to reaching those Industry 4.0 goals?
I think a lot of the technologies are there. But the reason why they aren't scaling is that OT and IT people simply speak different languages. I experience that even within our own organization, where I'm more the OT guy. When I talk about connectivity, I think about connectivity to the real world, to the equipment, to the sensors, to the drives, and so on. The IT person, when he talks about connectivity, is thinking about connectivity to databases, to cloud, to data lakes. And what we experience in our company, our customers experience as well. There is still a gap between the IT department and the OT people, who are the ones defining how you're going to automate something, how you set up the equipment, how you set up the lines, how you maintain it all to optimize it.
So how do you bring the languages together? This could be about terms, but it could also be about, for example, how you program the OT landscape. We have introduced a new programming environment called SIMATIC AX (Automation Xpansion). It's called an expansion because it makes the OT world more accessible to IT people.
The landscape is also very, very heterogeneous. A lot of the machines don't speak the same language because they're from different vendors. There aren't standards, so you can't really scale; you need a standard for that. This applies to new machines, to greenfield, but it applies even more to brownfield. A factory normally runs a minimum of 10 years; most run 20 or 30 years. If you go to the energy sector or the chemical sector, it might run 40 years.
How has the emergence of the edge and AI complicated factory automation?
When you talk about edge computing on a shop floor, there are more requirements. And if you talk about real time, maybe it's a jitter of microseconds. If you imagine a very fast process, in a microsecond a lot of things could happen. And if you're not reacting fast enough, then you might damage a machine, or you might get different results. So the topic of real time is very important.
Secondly, if you want to deploy AI workload on a shop floor and you want it to react very fast, it's important that this AI workload has an inference close to the machine, simply because of the speed of light. The other aspect is that you want the AI to interact frequently with your real process. Basically, you're going to interfere with the process, so you want to have that kind of close colocation, close to the machine or to the line. You also want to take data out of the process and feed it back into the AI.
I can give you one example. In our factory in Hamburg, we produce about 10 terabytes of data every day. You don’t want to send those 10 terabytes of data into a cloud; you want to have it there where the source of the data is. That is different, maybe, to a classical IT landscape. But we need to add not only real-time capabilities, we also need to add the safety.
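The reason those 10 terabytes stay at the source can be shown with a tiny sketch. This is an illustrative example, not Siemens code: the window size, signal, and summary fields are all assumptions. Raw samples are reduced at the edge, and only a compact summary record would ever leave the machine.

```python
# Illustrative edge-side data reduction: raw samples stay local,
# only small aggregates are forwarded upstream.

from statistics import mean

def summarize_at_edge(samples: list[float]) -> dict:
    """Reduce a window of raw sensor samples to a small summary record."""
    return {
        "count": len(samples),
        "mean": mean(samples),
        "max": max(samples),
        "min": min(samples),
    }

# One second of a hypothetical 10 kHz signal: 10,000 raw values...
raw_window = [0.1 * (i % 7) for i in range(10_000)]

# ...collapses to a four-field record before it leaves the machine.
summary = summarize_at_edge(raw_window)
print(summary["count"])  # → 10000
```

At factory scale the same idea turns terabytes of raw process data into a stream small enough to analyze centrally, while the full-resolution data stays where the real-time workloads need it.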
It’s a little bit like autonomous driving, where safety is also a very important aspect. You could imagine that, when you do autonomous driving in the car industry, you don’t want the cloud to be defining whether you stop or not if a child is running into the street. You want that reaction being executed as fast as possible directly in the car. The same is true on a machine. If a press is going down and somebody has his finger there, it should stop immediately. So you need to have that kind of fast reaction.
But why not think ahead? When I started at Siemens in 1999, what you automated, basically, were very repetitive tasks. And mass production was perfect for that because mass production has a lot of repetitive tasks. Or you automated something that was predictable. You could basically only automate what you knew.
Now we're leveraging AI for optimizing processes, but couldn't we also use AI for a more autonomous factory? How could we use AI so that a machine, a robot, could decide itself what to do? That means AI is not only optimizing the process and enhancing the engineering, but really steering the robot, the machine, and the line. And that application of AI is really, really exciting, because it opens up new fields for automation.
What are some Siemens use cases that show these solutions in action?
Let's start with our own factories; what we apply to our customers, we apply to ourselves here as well. One example of a use case of IT/OT leveraging AI is, again, in our plant in Hamburg. There's a very high throughput of PCB lines, and a complex process of placing the components on the circuit board. In the past we normally did an X-ray inspection of the PCB at the end, and there was always a bottleneck there. So, by leveraging AI, we now predict whether each individual PCB has high quality or not, and every board with a very, very high probability of having no quality issue we don't send to the X-ray machine anymore; it bypasses the X-ray machine and goes to final assembly.
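The routing logic behind this kind of inspection bypass is simple to sketch. This is a hedged illustration, not the Hamburg plant's actual system: the threshold value and the `route_board` function are assumptions, and in practice the probability would come from a trained quality-prediction model.

```python
# Illustrative AI-gated inspection routing (threshold and names assumed).

BYPASS_THRESHOLD = 0.99  # only near-certain boards skip inspection

def route_board(p_good: float) -> str:
    """Route a PCB based on the model's predicted probability it is good."""
    if p_good >= BYPASS_THRESHOLD:
        return "final_assembly"   # bypass the X-ray bottleneck
    return "xray_inspection"      # uncertain boards still get inspected

scores = [0.999, 0.95, 0.995, 0.70]
routes = [route_board(p) for p in scores]
print(routes)
# → ['final_assembly', 'xray_inspection', 'final_assembly', 'xray_inspection']
```

The design choice is that the model only removes work it is very confident about; anything below the threshold still goes through the physical inspection, so the X-ray machine stops being a bottleneck without lowering quality assurance.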
Another example is in infrastructure, doing tunnel automation. If you drive through a tunnel in the Alps or in the Rocky Mountains, there’s a high probability that those tunnels are automated and controlled by our PLCs. We are now using AI more and more in order to detect emergency situations in those tunnels—if there’s a traffic jam, if there’s a fire. If you need to react fast, how do you evacuate the tunnel? How do you switch on or off vents or lights?
Going back to the factory again: we're doing real-time flexible grasping, where something is taken out of a box. The AI tells a robot where to grasp an object without having to train or program that robot on the thing that needs to be picked up. We train the robot on the skill: to pick up. Basically, the robot can pick up anything that is necessary, as long as the gripper allows it. So, with that skill of grasping, we can start automating something unknown or unpredictable.
And my last use case, which is not currently reality, but it's where I invest money: Can you in the future automate repair? Take, for example, a car battery. Maybe in the future you can take a car to a workshop; the battery is taken out, there's a defect, and a system can automatically detect where the problem is and autonomously repair the battery cell. That is also automating the unknown, because every battery is a unique thing. Can you automate that by leveraging AI? So those are some of the use cases I'm really excited about; they really will make a difference in the future.
Tell us about the value of partnerships like the one with Intel.
We have worked with Intel probably for four decades. But I know that we started in 2012 with TAP, the Technology Accelerator Program, to enable low-latency functionality, especially for those workloads where you need to act in microseconds. That was very, very fruitful, and helped us use Intel chips in our controllers.
We’re currently working with Intel on the supply chain crisis. So—also thanks to Intel—I think we have been capable of fulfilling, maybe not all the demands of our customers, but as much as possible.
Machine vision is a workload that consumes a lot of compute power. And for that we will bring out a new portfolio leveraging 4th Gen Intel® Xeon® Scalable processors. We're looking forward to introducing it to the market in the middle of 2023. So, very excited to have that new portfolio element, which addresses exactly the need we see on the shop floor.
Any final thoughts for us?
First of all, I strongly believe there will be no sustainable future without automation, electrification, and digitalization. And, therefore, what we do together with Intel really is a significant contribution for our future. Number two, I believe the area of automation will expand more and more as we automate workload that is unpredictable and individualized. And, third, we need to make this technology as user-friendly as possible so that OT people can handle this complex technology.
This article was edited by Erin Noble, copy editor.