Containerizing Build and Runtime Environments for Hardware in the Loop Testing

Ari Mahpour
Created: April 16, 2024  |  Updated: April 24, 2024

I’ve been getting a lot of questions lately about containerizing environments for automated testing with Continuous Integration systems. If you didn’t understand the majority of that sentence, don’t worry: we’re going to do a deep dive into containers, Docker, and how to leverage them in your embedded build environment and hardware in the loop testing.

What are Containers?

There are plenty of excellent write-ups on containers, including this one from Docker (one of the most popular container runtime engines out there). Containers in build environments (e.g. for embedded systems) and test environments (e.g. for hardware in the loop testing) give us the ability to abstract away all the messy setup every time we want to bring up a new machine. This isn’t just about new test machines; it also applies to scaling our operation in the cloud for building embedded firmware.

These days, companies of every size leverage the cloud rather than keeping bare metal servers around. A core DevOps principle is that any software we write should build and run anywhere, at any time. Constantly spinning up new machines in the cloud and installing compilers, libraries, and other software packages doesn’t scale well. This is precisely why containerization has become so popular. We can take our build (or runtime) environment, package it up into something resembling a very lightweight virtual machine, and deliver it to any machine, whether in the cloud or on our own personal computer.

Creating and Using Containers

Let's explore how to actually create and use these containers in your projects. When we first begin creating a container image, we have to start from an existing “base image.” In most cases, some variant of a Linux operating system, such as Debian, Ubuntu, or Alpine, will suffice. Once you create your Dockerfile, you refer to the base image like so:

FROM ubuntu:latest

This indicates that the container’s base operating system will be the latest Ubuntu Docker image. After that, we’ll need to install whatever libraries are necessary for our build or test environment. In one example repository, I’ve installed the Arduino IDE using the Debian package manager (apt) and then added more layers by installing the Arduino SAM board drivers as well. Running this container in privileged mode (or passing the device’s mount point into the container), I can compile and upload an Arduino sketch from the command line on a brand new machine that has only Docker installed (i.e. no IDE or drivers).
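
To make that concrete, here is a rough sketch of what such a Dockerfile could look like. The package name, board-install step, and comments below are illustrative assumptions on my part, not the exact contents of the example repository:

# Sketch of an Arduino build image (package names and install steps are
# illustrative assumptions)
FROM ubuntu:latest

# Avoid interactive prompts while installing packages
ENV DEBIAN_FRONTEND=noninteractive

# Install the Arduino IDE from the Ubuntu repositories
RUN apt-get update && \
    apt-get install -y arduino && \
    rm -rf /var/lib/apt/lists/*

# Add another layer with the Arduino SAM board package so SAM-based boards
# can be targeted. Depending on the IDE version, this step may need a
# virtual display (e.g. xvfb-run).
RUN arduino --install-boards arduino:sam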

We can do the same thing with machines that are connected to our devices under test. In this Docker container that I’ve put together, I install all the dependencies and software needed to run an Analog Discovery 2 device. In theory, I can spin up the Docker container on a brand new machine (again, one that contains only Docker) and start talking to the Analog Discovery 2 without any fuss. With the Analog Discovery 2 I can write tests to validate my ADCs and DACs, or send I2C/SPI commands to different chips on my board (along with a myriad of other capabilities).
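
As a hedged illustration of how such a container reaches the hardware, the run command below uses standard Docker flags for device access; the image name analog-discovery-test and the script run_tests.py are hypothetical placeholders rather than names from my repository:

# Launch the (hypothetical) test image with access to the host's USB devices.
# Privileged mode plus the /dev/bus/usb mount exposes the Analog Discovery 2
# to the container; exact device paths vary by host.
docker run --rm -it \
    --privileged \
    -v /dev/bus/usb:/dev/bus/usb \
    analog-discovery-test \
    python3 run_tests.py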

Scaling with Continuous Integration

Now let's discuss how containers enhance Continuous Integration (CI) systems to boost efficiency and scalability; this is where the real magic happens. We may have dozens, if not hundreds, of physical test machines, or access to thousands of cloud machines for our build and test servers. To scale practically, as mentioned above, we can’t hand-configure each individual machine as it comes online. Delivering a container with every CI run not only gives us a systematic, repeatable way of running builds and tests but also liberates us from having to reconfigure a machine every time we bring it up (which, in the cloud, happens on almost every CI run). By leveraging containers for embedded builds and physical hardware testing, we give ourselves and our companies a level of scale that previous generations could only dream of.
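
Since the example repositories live on GitLab, a GitLab CI job is a natural way to sketch this pattern. The snippet below is a hedged example, not taken from any of my repositories: the image path, runner tag, and test command are placeholders, while image:, tags:, and script: are standard GitLab CI keywords. The image: keyword pulls the container for every run, and the tags: keyword routes the job to a runner that is physically connected to the device under test:

# .gitlab-ci.yml sketch; image path, runner tag, and test command are
# illustrative placeholders
hardware_test:
  image: registry.gitlab.com/your-group/analog-discovery-test:latest
  tags:
    - hil-bench        # route the job to a runner attached to the test hardware
  script:
    - python3 run_tests.py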

Conclusion

In this article, we've highlighted the crucial role of containers in embedded systems development, especially for keeping build and test environments consistent across different types of infrastructure. Their integration with Continuous Integration systems not only streamlines development but also boosts scalability and reliability. As container technology advances, its adoption will only become more important for developers. Dive deeper and experiment with different setups by visiting the example repositories at https://gitlab.com/docker-embedded.

About Author

Ari is an engineer with broad experience in designing, manufacturing, testing, and integrating electrical, mechanical, and software systems. He is passionate about bringing design, verification, and test engineers together to work as a cohesive unit.
