embedded world 2023: Modular Building Blocks at the Edge

So, you’ve got an idea for a new intelligent edge system. Maybe you just finished modeling an AI-powered robot prototype, or maybe you’ve already completed an industrial computer vision proof of concept. Now the real work of specifying components and building an embedded system that meets your design requirements begins.

Where do you start? Nearly 27,000 of your colleagues began their embedded solution development journey at embedded world 2023, where modular vision, AI, and workstation building blocks from Intel and the Intel® Partner Alliance put them on the fast track to next-gen system deployment.

Here’s a snapshot of what you missed at the show.

Intel® Arc Graphics Gives Modular Vision Tech a Boost

Computer vision technology is now mainstream, popping up in places you’d never have expected it a few years ago. From retail kiosks to autonomous robots, embedded solutions engineers now need a way to quickly add, upgrade, and scale their vision-equipped embedded system designs to meet market demand.

The challenges of doing so at the edge are well documented: high-performance vision technology brings power, thermal, and cost constraints to embedded systems that are already resource-limited. Intel® Arc graphics processors were developed to help device engineers thread this needle. Companies like edge computing leader ADLINK Technology, embedded and automation solutions provider Advantech, and embedded computing technology provider Kontron have already delivered plug-in hardware solutions that streamline adding these GPUs to any design.

At embedded world 2023, Kontron’s Thomas Stanik, a Senior Sales and Business Development Manager, unveiled a collaboration with Intel that brings the new Arc GPUs into embedded workstations at the edge (Video 1). Designed for factory automation, medical imaging, and other environments that demand high-performance vision processing in compact power and thermal envelopes, the liquid-cooled embedded workstation is built around a long-lifecycle-supported K3851-R industrial ATX motherboard outfitted with 12th and 13th Gen Intel® Core processors and either the Arc A40 or A50 GPU.

Video 1. Embedded workstations are an ideal entry point for Intel® Arc GPUs in industrial environments. Shown here is a liquid-cooled workstation proof of concept based on latest generation Intel® Core processors and the new GPUs. (Source: insight.tech)

With that much performance so close to the industrial edge, developers can create immersive human-machine interfaces for operators that still pack enough horsepower to offload vision processing tasks from nearby camera platforms.

Elsewhere at the show, Advantech broke down how developers can leverage the new GPUs to match the demand for data analytics at the edge using modular building blocks. For instance, the company’s portfolio includes single-board computers (SBCs), computer-on-modules (COMs), and plug-and-play solutions that accelerate the design and deployment of intelligent edge solutions.

Thomas Kaminski, Director of Product Sales Management, Marketing, and Technical Support at Advantech, explained the options for scaling edge analytics and vision performance across new or existing designs: from platforms designed around Intel embedded processors with integrated Intel® Iris® Xe graphics, to high-performance COM-HPC modules, to discrete PCIe 4.0 or MXM cards equipped with Arc GPUs.

At the ADLINK Technology booth, the company’s Head of Modular Solutions, Henk van Bremen, introduced showgoers to the MXM-AXe, a VR-ready MXM 3.1 Type A module based on an Intel Arc GPU. The module provides up to eight Xe cores, ray tracing units, and an AI engine for driving as many as four 4K displays (Figure 1). It also carries 4 GB of dedicated onboard GDDR6 memory and, importantly, uses eight PCIe Gen 4 lanes carried over a 16-lane, 314-contact connector for quick system integration.

Figure 1. The ADLINK Technology MXM-AXe module is built around an Intel® Arc GPU that delivers eight Xe cores, ray tracing units, and an AI engine, and can slot into COM Express systems over PCIe Gen4 links. (Source: ADLINK Technology)

The company was showcasing the MXM-AXe alongside its COM Express Type 6 Rev. 3.1 development kit that the graphics module can easily slot into. This architecture not only consolidates component sourcing and procurement down to a couple of modules available through a single vendor, but it also unifies the software stack around x86 devices.

Completing the Vision at embedded world 2023

Of course, building block embedded hardware is just one step on the way to completing a design. With that foundation in place, work with embedded software and tools can begin. And Intel partners like embedded computer modules supplier congatec and IoT solution developer SECO were showcasing end products built on Intel solution stacks at the show.

The first of these, presented by Christian Eder, Director of Product Marketing at congatec, was a demonstration of an AI- and vision-enabled robotic pick-and-place machine running on a hypervisor-managed multicore Intel Core processor implemented in a COM-HPC module (Video 2). A real-time hypervisor from the company’s affiliate Real-Time Systems GmbH partitions the cores so that the image analysis and AI workloads that let the robot sense its environment run on a Linux operating system, while control and actuation functions execute on a separate RTOS on the same chip.

Video 2. Real-time hypervisors allow multiple operating systems to run on the same chip so embedded engineers can maximize resource utilization and minimize cost in systems like AI-powered robotic pick-and-place machines. (Source: insight.tech)
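The static core partitioning a real-time hypervisor performs can be loosely approximated in software on a stock Linux system using CPU affinity. The sketch below is a plain-Python illustration of the idea only (the core split is hypothetical, and this is not congatec’s or Real-Time Systems’ actual mechanism): it dedicates one subset of cores to the vision/AI process and reserves the rest for a separate control process.

```python
import os

# A real-time hypervisor statically assigns CPU cores to each guest OS
# (Linux for vision/AI, an RTOS for control). On stock Linux, CPU
# affinity gives a rough software-only analogue of that partitioning.
all_cores = sorted(os.sched_getaffinity(0))        # cores available to us
split = max(1, len(all_cores) // 2)
vision_cores = set(all_cores[:split])              # run inference here
control_cores = set(all_cores[split:]) or vision_cores  # control loop here

# Pin this (the "vision") process to its partition; a separate control
# process would call os.sched_setaffinity with control_cores instead.
os.sched_setaffinity(0, vision_cores)
print(sorted(os.sched_getaffinity(0)))
```

Unlike a hypervisor, affinity is cooperative rather than enforced, and it offers no real-time guarantees, which is why the demo relies on a dedicated RTOS partition for deterministic control.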

The AI and vision portions of the system are powered by the OpenVINO toolkit, which facilitates the “playful” capabilities of the autonomous robot without the need for discrete graphics processors.

But intelligent vision processing is also available on entry-level devices. SECO’s Chief Product Officer, Maurizio Caporali, demonstrated this in a retail kiosk on the embedded world show floor, showcasing a dual-core Intel Atom® x6000 processor, available on SMARC or COM Express modules, that accepts video input from a single camera, then applies both facial recognition and emotion detection algorithms to the stream.

This type of AI can be developed and deployed onto SECO’s range of Intel-based edge hardware using the company’s Clea AI and IoT platform, which provides easy-to-use APIs for optimizing AI models, then updating endpoints in the field through a single pane of glass.

From Vision to Reality

These were just a few of the ways Intel partners enable development, deployment, and adoption of AI and computer vision technology at the intelligent edge, where it will soon be the rule rather than the exception.

Embedded hardware building blocks based on Intel Arc GPUs, 13th Gen Intel Core processors, the Atom x6000E Series, and other Intel technologies are available from multiple distributors, including Rutronik, EBV Elektronik, and Arrow. Many of these distributors also offer design and manufacturing services to help bring your design ideas to life as quickly as you can envision them.

Learn more about what’s possible on the embedded world digital event platform, where many of the conference proceedings are archived, or by discovering all the next-generation vision and AI solutions available now from Intel partners.

This article was edited by Christina Cardoza, Associate Editorial Director for insight.tech.

About the Author

Brandon brings more than a decade of high-tech journalism and media experience to his current role as Editor-in-Chief of the electronics engineering publication Embedded Computing Design. His coverage focuses on artificial intelligence and machine learning, the Internet of Things, cybersecurity, embedded processors, edge computing, prototyping kits, and safety-critical systems, but extends to any topic of interest to the electronic design community. Brandon leads interactive YouTube communities around platforms like the Embedded Toolbox video interview series and Dev Kit Weekly hardware reviews, and co-hosts the Embedded Insiders Podcast. Drop him a line at brandon.lewis@opensysmedia.com or DM him on Twitter @techielew.
