

Why Containers Are the Future of the IoT

Containers are incredible tools in the world of agile software development. And with the convergence of information technology (IT) and operational technology (OT) in many organizations, containers are making the leap from enterprise systems to IoT equipment.

Although they once had little in common, IT and OT are becoming so interdependent and closely connected that they often need to use the same approaches—from hardware to software to design methodologies. This has led to the adoption of IT concepts like microservices and DevOps in the embedded space.

Similarly, established IT platforms are borrowing ideas from the world of embedded design, such as real-time capabilities and distributed computing. One of the most important developments arising from the convergence of enterprise and embedded is the emergence of containers for IoT.

Containers Are Like Virtual Machines Without the Overhead

Containers make it easier to build, deploy, and maintain embedded designs, capabilities that are critical in a competitive business environment driven by agile software development.

Conceptually, containers are similar to virtual machines (VMs). Each container is a process with its own virtualized resources (memory, CPU, disk, and so on) and its own filesystem, isolated from other applications and containers (Figure 1).


Figure 1. Containers are like lightweight virtual machines. (Source: Wind River)

The key difference is that containers run in user space on the host Linux OS rather than in an entirely separate environment, as VMs do. They are therefore far less resource-intensive and considerably smaller and lighter weight than a virtual machine. Containers can run in parallel with other applications in user space, stand alone on a system, coexist with virtual environments, and even run inside one or more VMs.
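
For a concrete sense of how lightweight this is, the sketch below starts a container as an ordinary user-space process with capped resources. It uses the Docker SDK for Python on a host with a running Docker daemon; the image name and the specific memory and CPU limits are illustrative choices, not requirements.

  # Minimal sketch using the Docker SDK for Python (docker-py).
  import docker

  client = docker.from_env()  # connect to the local Docker daemon

  # Start a container as an ordinary user-space process with its own
  # filesystem and capped virtual resources.
  container = client.containers.run(
      "alpine:3.19",                                  # small, illustrative base image
      ["sh", "-c", "echo hello from an isolated process"],
      mem_limit="64m",                                # cap this container's memory
      nano_cpus=500_000_000,                          # roughly half of one CPU core
      detach=True,
  )

  container.wait()                                    # let the short-lived command finish
  print(container.logs().decode())                    # read the container's output
  container.remove()                                  # clean up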

As illustrated in Figure 2, containers are packaged and managed through daemons. Docker was one of the first to commercialize a container daemon and the surrounding software, simplifying the building and management of containers on Linux.


Figure 2. Containers are managed through a daemon and command-line tools. (Source: Red Hat)

Since then, the Open Container Initiative (OCI) has established standards for container images (the OCI Image Format) and runtimes so that containers are easily portable across hosts. Other container projects provide registries where container images can be published, downloaded, and run on any containerized host. These enhancements have helped make containers the popular development tools they are today.
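
The practical effect of these standards is that an image published to a registry runs the same way on any compliant host. The sketch below pulls a public image and runs it locally, again using the Docker SDK for Python on a host with a running Docker daemon; the image name is only an example.

  import docker

  client = docker.from_env()

  # Download a published image from a public registry...
  image = client.images.pull("python", tag="3.11-slim")
  print(image.tags)

  # ...and run it unchanged, just as it would run on any other
  # OCI-compatible host.
  output = client.containers.run(
      "python:3.11-slim",
      ["python", "-c", "print('portable')"],
      remove=True,                # clean up the container after it exits
  )
  print(output.decode())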

Moving Intelligence to the Edge

Containers are key to pushing intelligence to the edge. Many edge devices are now built with enough compute capability to do much more than the traditional tasks of acquiring and moving data: they can analyze incoming data streams using trained machine learning models.

Consider AI applications. By moving inferencing out of the cloud and into edge devices, IoT systems can provide real-time analytics. Medical scanners can detect anomalies and thus assist clinicians. Cameras and visual gateways can identify situations quickly and alert operators instead of sending data to a central location.

Similarly, infrastructure functions like firewalls and packet inspection can move to the edge. Now processing can happen where the data first appears instead of in the core of the network.

Using containers, these and many other processes can be developed, deployed, and run simultaneously in isolated instances on the edge device.
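
As a rough illustration, the sketch below launches two such edge functions side by side as isolated, resource-limited containers using the Docker SDK for Python. The image names (example/vision-inference, example/packet-inspector) are hypothetical placeholders rather than published images.

  import docker

  client = docker.from_env()

  # Hypothetical edge services, each packaged as its own container image.
  edge_services = [
      {"name": "vision-inference", "image": "example/vision-inference:latest", "mem": "512m"},
      {"name": "packet-inspector", "image": "example/packet-inspector:latest", "mem": "256m"},
  ]

  for svc in edge_services:
      client.containers.run(
          svc["image"],
          name=svc["name"],
          mem_limit=svc["mem"],                        # isolate each service's memory
          restart_policy={"Name": "unless-stopped"},   # keep the service running
          detach=True,
      )

  # Each function now runs in its own isolated instance on the same device.
  for c in client.containers.list():
      print(c.name, c.status)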

Containerized Services for IT, OT, and IoT

To make IoT containers more practical, several cloud service providers (CSPs) have created services that support the development, monitoring, and management of containers on edge devices. For example, AWS Greengrass from Amazon Web Services enables users to deploy trained inference models to IoT end devices running the AWS Greengrass Core container image. AWS Greengrass is part of the Amazon Web Services cloud platform for IoT, which includes training models and monitoring edge devices in the cloud.

Similarly, Microsoft offers a cloud-based service for building and deploying Azure IoT Edge modules, which are Docker-compatible containers that run Azure IoT Edge applications. Another example is the IBM Watson IoT Platform Edge, a platform for developing edge applications, including trained machine learning models, and deploying them to edge devices. All of these services are end to end, providing tools to develop, deploy, and manage devices.

Changing Design Cycles

The wide adoption of containerization reflects how software development has evolved to meet developer expectations, lifecycle requirements, and customer needs. Long development cycles with careful performance optimization are a thing of the past. Today's business environment and customer demands emphasize rapid iteration to enable quick deployment, as well as ongoing upgrades as customers ask for richer functionality or regulations require quick changes.

All of these trends are driving functionality out of hardware and into software, in both IT and IoT. Unlike traditional embedded systems, where functionality is built into the hardware, today's IoT solutions are implemented in software. They are often deployed on commercially available acquisition and compute devices, and more of them are installed as containers.

In fact, IoT systems are moving toward a Device as a Service model. Initial deployments may offer only basic functionality (i.e., a minimum viable product, or MVP), with new capabilities added over time. This approach also allows devices to adapt to changing circumstances, from changes in analytics frameworks to new security threats. The challenge is that this new approach is more resource-intensive on the compute platform.

Containers Need Cores

While containers enable agility and isolation, they have larger memory footprints than traditional embedded applications, and they are often used to run several isolated applications side by side. The underlying hardware therefore typically needs multiple cores and expanded resources, especially where it must support hard partitions between containers.

The latest Intel® Core processors and the Intel® Xeon® Processor E-2100M family, codenamed Coffee Lake-H, are well suited to these demands. The new processors offer more cores than previous-generation devices, with six cores instead of four, at no significant cost increase.

With 50 percent more computational resources, they readily support numerous containers and VMs while leaving headroom for future expansion and extensibility of the end device. Because the processors are built on the same architecture and technologies that already exist in the data center, they can easily support demanding server-like functions at the edge.

COM Express + Containers Create Powerful IoT Solutions

Single-board computers (SBCs) built on the COM Express standard are a great way to take advantage of containerization. They offer the small footprint that many remote devices require, and they are available with powerful computing capabilities to support a wide range of applications and inferencing at the edge.

One example is the Express-CF/CFE, a COM Express Basic Size Type 6 module from ADLINK. This PICMG COM.0 R3.0 module offers:

  • Up to 6-core processors with turbo boost up to 4.4 GHz for uncompromising system performance and responsiveness.
  • Support for Windows 10 and Linux, so it can run the most popular containers.
  • Up to 48 GB of DDR4 memory and support for Intel® Optane storage for high-performance, memory-intensive workloads.
  • Integrated Intel® Generation 9 Graphics and support for up to three independent displays.

The module’s high-performance graphics are well suited to industrial automation, medical, and other visual applications that use OpenGL 4.5, DirectX 12/11, and OpenCL 2.1/2.0/1.2, or that require video acceleration for modern codecs.

An Agile Approach to the Future

The use of containers is becoming more commonplace in IoT devices. Containers support the convergence of IT and OT, and they bring the benefits application developers have long enjoyed to the creation of innovative IoT devices with growing intelligence at the edge.

But containers and containerized applications at the edge need more computational resources than legacy single-board computers can provide. COM Express modules built on the latest Intel Xeon processors are a powerful foundation for building flexible IoT solutions with the resources that modern edge applications need.

About the Author

Ken Strandberg is a technical storyteller, creative writer, and amateur filmmaker. He writes articles, white papers, seminars, case studies, web-based training, video and animation scripts, technical and non-technical marketing literature, and interactive collateral for emerging technology companies, Fortune 100 enterprises, multinational corporations, startup businesses, and non-profits. His work has appeared on a wide range of websites from large enterprises to a carpet cleaning service, in leading trade publications, and on blogs. Mr. Strandberg’s technology areas include Software, HPC, Industrial Technologies, Design Automation, Medical Technologies, Semiconductor, and Networking and Telecom. For the last ten years, he and his wife roamed North America, traveling in and working out of a van (vanlife.us), until recently settling in Oregon, USA.
