

AI Robots Navigate the Smart Factory


By now we’re all familiar with Industry 4.0, and the ways in which connectivity and analytics are being used to revolutionize productivity and efficiency in automation environments. But you might be surprised to learn that Industry 5.0, a world in which humans work with and alongside smart robots, is already on the horizon. And that horizon is taking shape in the form of collaborative robots.

These industrial or service robots are independent and operate freely within complex, semi-structured environments.

This is a big step forward from automated guided vehicles (AGVs), which are still widely deployed today. A common sight in factories and warehouses, AGVs are typically used to transport goods and materials from one location to another. Their major limitation is that they are “guided,” meaning they follow predetermined paths marked out with lines, wires, tape, lasers, or even computer vision that recognizes landmarks along a specified route.

This makes AGVs excellent at performing repetitive, fixed functions, but not much else. In the evolution to collaborative robots, the first step is the autonomous mobile robot (AMR).

From Automated to Autonomous

AMRs leverage cutting-edge intelligent vision systems and simultaneous localization and mapping (SLAM) software to afford freedom of movement and operational flexibility. These advanced robotic platforms are able to recognize their surroundings, navigate them accordingly, and even identify and avoid objects in their path.

In short, AMRs are more mobile, efficient, flexible, and safe. They can even collaborate with other AMRs to increase productivity, and take themselves offline to recharge when their batteries run low.

Of course, the ability to achieve mobile autonomy is the result of more sophisticated AMR technologies. For instance, SLAM technology requires higher-precision sensors, which in turn require more compute resources to process data and act on it. And building an AMR from scratch based on these components is a complex undertaking that presents systems integrators with many hardware, operating system, and protocol engineering challenges.

“There are a lot of systems integrators that build their own AMR frameworks,” explains Kev Wang, a product manager at Vecow Co., Ltd., a developer of innovative industrial automation products. “But they have to test and verify from the very beginning, which alone could take a team of five to eight people three to four months or more.”


Accelerate AMRs with an All-in-One AI Kit

An alternative is to get a head start with a rapid prototyping platform. For example, the Vecow VHUB Robot Operating System (ROS) is a turnkey AI development kit that provides systems integrators with a fully integrated hardware, software, and tools stack for accelerating AMR design (Figure 1).

Figure 1. The VHUB ROS AI development platform includes a complete autonomous mobile robot (AMR) development stack. (Source: Vecow)

VHUB ROS is the result of a collaboration between Intel®, Vecow, and Virtuoso Robotics, and is compatible with select hardware starter kits based on 11th Gen Intel® Core i7/i5/i3 processors (also known as Tiger Lake UP) and Intel® Movidius VPU accelerators. A Perception software development kit (SDK) layers on top of those hardware targets and includes ROS 2, control firmware, and an AI engine.

“With the VHUB ROS, we cooperated with Intel and Virtuoso Robotics to deliver a standard framework that gives systems integrators a better understanding of how to approach AMR design,” says Wang.
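Vecow does not detail the internals of the Perception SDK here, but as a rough illustration of how perception components typically plug into the underlying ROS 2 framework, the sketch below shows a minimal rclpy node that subscribes to a camera topic. The node name and topic are hypothetical stand-ins, not part of VHUB ROS.

```python
# Minimal ROS 2 (rclpy) sketch: a perception-style node listening to a camera topic.
# Node and topic names are hypothetical; a real stack would forward frames to the AI engine.
import rclpy
from rclpy.node import Node
from sensor_msgs.msg import Image


class PerceptionNode(Node):
    def __init__(self):
        super().__init__("perception_node")
        self.create_subscription(Image, "/camera/image_raw", self.on_frame, 10)

    def on_frame(self, msg: Image) -> None:
        # Placeholder for handing the frame to an inference or SLAM pipeline.
        self.get_logger().info(f"received frame {msg.width}x{msg.height}")


def main():
    rclpy.init()
    rclpy.spin(PerceptionNode())
    rclpy.shutdown()


if __name__ == "__main__":
    main()
```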

The AI engine within the SDK gives developers access to more than 200 of the most commonly used AI models, including SLAM, object detection and recognition, and other algorithms. And it works with industry-leading training platforms like Caffe2, MXNet, and TensorFlow.

To prepare them for use on resource-constrained AMR platforms such as the Vecow EMBC-5000, models trained in these frameworks are passed through the Intel® OpenVINO Toolkit, a cross-platform model optimizer and inference engine.

“We’ve tested the VHUB ROS SDK using CPUs, CPUs with the Movidius VPU, and now CPUs with integrated Intel® Xe graphics with Movidius VPUs,” Wang says. “When we test these platforms with OpenVINO, we see at least twice the performance.”
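As a rough sketch of what that optimization flow looks like on the inference side, the snippet below loads a network that has already been converted to OpenVINO’s intermediate representation and runs it on a CPU target. The file names, input shape, and device choice are placeholders for illustration, not values specified by the VHUB ROS SDK.

```python
# Hedged OpenVINO inference sketch: file names, input shape, and device are placeholders.
import numpy as np
from openvino.runtime import Core

core = Core()
model = core.read_model("detector.xml")      # IR produced by the Model Optimizer
compiled = core.compile_model(model, "CPU")  # could also target integrated graphics or a VPU

frame = np.zeros((1, 3, 416, 416), dtype=np.float32)  # stand-in for a preprocessed camera frame
results = compiled([frame])                  # returns a dict keyed by the model's outputs
detections = results[compiled.output(0)]
print(detections.shape)
```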

Deeper Insights for Collaborative Robots

The biggest differentiator of an AMR design today is its on-board AI, which means developers will spend a great deal of time and effort labeling data and training AI models in the cloud. To facilitate this, the VHUB ROS SDK can be hosted as a containerized instance on cloud services like Amazon Elastic Container Service (ECS).
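What hosting that containerized instance might look like is sketched below, using boto3 to register and launch a task on ECS Fargate. The cluster, task family, image URI, IAM role, and subnet are all hypothetical placeholders rather than values supplied by Vecow or Intel.

```python
# Hypothetical sketch: launching a containerized training workload on Amazon ECS (Fargate).
import boto3

ecs = boto3.client("ecs", region_name="us-west-2")

# Register a task definition pointing at a placeholder training container image.
ecs.register_task_definition(
    family="vhub-training",  # hypothetical task family
    requiresCompatibilities=["FARGATE"],
    networkMode="awsvpc",
    cpu="4096",
    memory="16384",
    executionRoleArn="arn:aws:iam::123456789012:role/ecsTaskExecutionRole",  # placeholder role
    containerDefinitions=[{
        "name": "trainer",
        "image": "123456789012.dkr.ecr.us-west-2.amazonaws.com/model-training:latest",  # placeholder
        "essential": True,
    }],
)

# Run one instance of the task in a placeholder cluster and subnet.
ecs.run_task(
    cluster="robotics-training",
    taskDefinition="vhub-training",
    launchType="FARGATE",
    networkConfiguration={"awsvpcConfiguration": {
        "subnets": ["subnet-0123456789abcdef0"],
        "assignPublicIp": "ENABLED",
    }},
)
```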

But since the process of refining AI algorithms is based on capturing inferences from an AMR and feeding them back into the VHUB model training instance, systems integrators need a mechanism for transmitting edge data into cloud containers in real time. To accomplish this, Vecow leverages Intel® Edge Insights for Industrial (Intel® EII).

EII is a containerized software architecture that collects, stores, and analyzes time-series and vision sensor data, then orchestrates and manages it across a variety of operating systems and protocols from the edge to the cloud. It does so securely and in a near-real-time (<10 ms) closed loop.
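EII’s own ingestion and message bus interfaces are not shown here; as a generic stand-in for the edge-to-cloud path it provides, the sketch below publishes an inference result from the robot to a cloud-side broker over MQTT using paho-mqtt. The broker address, topic, and payload fields are assumptions for illustration only.

```python
# Generic edge-to-cloud publishing sketch (not EII's actual API).
import json
import time

import paho.mqtt.client as mqtt

client = mqtt.Client()                      # paho-mqtt 1.x constructor assumed
client.connect("broker.example.com", 1883)  # placeholder cloud-side broker
client.loop_start()

# Illustrative inference result captured on the AMR.
detection = {"robot_id": "amr-07", "ts": time.time(), "label": "pallet", "confidence": 0.91}
client.publish("amr/amr-07/inferences", json.dumps(detection), qos=1)

client.loop_stop()
client.disconnect()
```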

Vecow engineers have extensive experience deploying EII to optimize AI inferencing. They are adept at leveraging the software’s capabilities to maximize the performance of multiple AI functions running simultaneously in real time. The team has already performed research, implementation, testing, and verification of all the AI capabilities, tools, and platforms that make up the VHUB ROS.

This eliminates large swaths of the evaluation and integration lifecycle, allowing AMR systems integrators to move immediately into application design and reducing their time to market by several months.

AMRs: Toward a Collaborative Future

AMRs are poised to transform the automation market once again and usher in an era of never-before-seen partnership between people and machines.

But to realize the potential of these early collaborative robots, organizations will first have to work with technology partners who can simplify the growing complexity of AMR hardware, software, and connectivity stacks and open the door to disruptive application engineering. With turnkey platforms like the VHUB ROS, fast-paced AI innovation is within reach.

About the Author

Brandon is a long-time contributor to insight.tech going back to its days as Embedded Innovator, with more than a decade of high-tech journalism and media experience in previous roles as Editor-in-Chief of electronics engineering publication Embedded Computing Design, co-host of the Embedded Insiders podcast, and co-chair of live and virtual events such as Industrial IoT University at Sensors Expo and the IoT Device Security Conference. Brandon currently serves as marketing officer for electronic hardware standards organization, PICMG, where he helps evangelize the use of open standards-based technology. Brandon’s coverage focuses on artificial intelligence and machine learning, the Internet of Things, cybersecurity, embedded processors, edge computing, prototyping kits, and safety-critical systems, but extends to any topic of interest to the electronic design community. Drop him a line at techielew@gmail.com, DM him on Twitter @techielew, or connect with him on LinkedIn.
