What's in a Hardware-Software Co-design Process

Zachariah Peterson
|  Created: November 5, 2022

Software developers have it easy. They can prototype infinitely, and they don’t have to worry about building a physical device that implements their application. As soon as you bring a piece of hardware into your product ecosystem, you’ve opened a totally new can of worms that requires a collaborative process for success. Hardware/software co-design aims to implement a streamlined process for building these products, one that can optimize costs and performance while minimizing the need for design modifications prior to volume production.

A PCB designer building embedded systems will have to work with an embedded developer to ensure the software/firmware implementation matches the design intent in the PCB. This might sound like a simple matter of communication on a messaging platform (Slack anyone?), but it’s easy for technical details to get lost in a linear communication chain.

Instead of a linear process, designers and their developer team members can implement a co-design process, where hardware and software are being implemented simultaneously, but with cross-checking between teams along the way. When you look below the surface, embedded developers and software developers have the potential to influence many aspects of real designs, so they need to work in tandem with hardware developers.

Firmware and Software, Meet Your Hardware

It’s no secret that software and hardware go together. Without hardware, the code we create for an application is meaningless. So it would make sense that, when developing embedded systems, we develop the two in sync. The two domains have to work together to create the best quality product.

At a very high level, the development process for hardware systems with a software component is based on acknowledging which side creates constraints for the other:

  1. Developers define performance requirements based on one discipline (either hardware or software)
  2. These requirements define constraints in the other discipline that limit its design capabilities

So, we basically have two possible directions for hardware-software co-design. Whether for a general-purpose or application-specific system, co-design has been driven by these two approaches:

  • In the hardware-driven approach, the entire system was instantiated in hardware to the greatest extent possible (with FPGAs and ASICs), then the remaining portions of the system were built in software running on the system host.
  • In the software-driven approach, the entire system logic was instantiated as software, then tasks were offloaded onto hardware with the goal of minimizing or optimizing compute workloads.
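The software-driven direction above usually starts with a profiling pass: measure where compute time goes in a pure-software implementation, then flag the dominant tasks as candidates for hardware offload. Here is a minimal sketch of that decision; the task names and the 50% threshold are hypothetical stand-ins, not part of any real workload.

```python
import time

# Hypothetical pure-software tasks; in a real project these would be
# the functions that make up the system logic.
def fir_filter(samples):
    taps = [0.2, 0.3, 0.3, 0.2]
    return [sum(t * samples[i + j] for j, t in enumerate(taps))
            for i in range(len(samples) - len(taps) + 1)]

def parse_packet(data):
    return {"header": data[:4], "payload": data[4:]}

def profile(task, arg, runs=100):
    """Time a task over several runs and return seconds per call."""
    start = time.perf_counter()
    for _ in range(runs):
        task(arg)
    return (time.perf_counter() - start) / runs

samples = [float(i % 16) for i in range(2000)]
packet = bytes(range(64))

timings = {
    "fir_filter": profile(fir_filter, samples),
    "parse_packet": profile(parse_packet, packet),
}

# Tasks that dominate the compute budget become candidates for
# FPGA/ASIC offload; the rest stay in software on the host.
budget = sum(timings.values())
offload = [name for name, t in timings.items() if t / budget > 0.5]
print(offload)  # → ['fir_filter']
```

The same idea scales up: the profiling data defines the hardware requirements, which is exactly the direction of flow in the software-driven approach.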

Computer systems and embedded systems design largely started with a hardware-driven approach, until computing power reached the point that most tasks could be run in software. Today, we’ve transitioned to a mix of the two approaches; we typically find hardware that can implement the most critical features in the product, then fill in the rest of the functionality with software. The transition between the two has been enabled by greater feature integration in the components we use in PCBs.

The hardware-software co-design spectrum.

I’ve drawn out the drivers of hardware-software co-design as a spectrum: the performance requirements in one area put constraints on the design space in the other area. So, if you start with hardware, you inherently limit what the software platform will be able to do based on its interactions with an embedded device; the software can only do what the hardware enables! Looking in the other direction, software functional requirements might limit you to very specific pieces of hardware to support those functions, and some of that hardware might not exist!

I think this illustrates why so many PCB designers and development teams implement a hardware-driven approach to product development for embedded systems. The hardware is selected from a range of off-the-shelf components, and the firmware/software fills in the gaps. The software layer comes last, creating the user interface, but it is totally constrained by the capabilities of the hardware.

Why SoCs?

Why does this class of component deserve its own place on the hardware-software co-design spectrum?

The SoC movement has totally changed the way developers can implement co-design under the hardware-driven approach. In the past, the hardware approach was implemented with a huge number of ASICs and a weak processor, as shown in the spectrum graphic above, with everything placed on large PCBs. Newer MCUs, SoCs, and SiPs have totally changed the dynamic; many of the important features that would have required discrete semiconductors are now built into the package of an SoC/SiP. Modules (SoMs, CoMs) do essentially the same thing in a larger footprint.

So with an SoC-driven process, you consolidate many discrete components into a single package and can implement basic functionality directly in firmware. Vendors also offer libraries for interfacing with another system through software, or for running an embedded application directly on the device.

Why FPGAs?

In my opinion, an FPGA is a much better piece of hardware for enabling a software-driven co-design process. An FPGA allows custom logic to be defined at the gate level, rather than implemented in software where it could be very inefficient. In an advanced application area like AI, vendors provide tools that allow developers to translate their code into hardware-level instructions. This makes it much easier to start systems development from the software end rather than by selecting hardware.


If you’re an HDL buff, then you’ll most likely instantiate the logic yourself as a custom instruction set, which is then called by the software deployed on the device. Overall, the compute workload and latency are massively improved in this approach. Many FPGA vendors are also embracing RISC-V as a development option for these systems, as it allows instantiation of custom processor instructions at the logic level in FPGAs. This can be a big enabler of unique features running in hardware on the device as well as in software.

Testing After Prototyping

After building an initial prototype, the design will need to be thoroughly tested. The main aspects of PCB testing with a fabricator or assembler will have already been qualified by this stage. In a hardware/software co-design process, this stage is all about functional testing rather than bare board or assembly testing. This is the first stage where you will verify the board boots and that it can interact with its software application.

Board bring-up - The first step in prototype testing is board bring-up, where you simply boot up the board. In some ways, initial testing on only the hardware is much easier: you’re taking direct measurements as part of bring-up, probably connecting to a COM port on a computer, and checking that the firmware on the device boots as intended. If you wrote the firmware to provide test messages and errors at the bring-up stage, then you can quickly identify problems in the firmware.
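Those firmware test messages become useful the moment the host can parse them. A minimal sketch of a host-side bring-up check follows; in practice the log lines would arrive over a COM port (for example, via pyserial’s `serial.Serial`), and the `[BOOT]`/`FAIL` message format here is a hypothetical convention, not a standard.

```python
# Captured boot log standing in for live COM-port output.
# The message format is a hypothetical firmware convention.
BOOT_LOG = """\
[BOOT] clock init OK
[BOOT] DDR init OK
[BOOT] PMIC rail check FAIL (rail=1V8)
[BOOT] ethernet PHY OK
"""

def check_bring_up(log: str):
    """Scan firmware boot messages; return (passed, failing lines)."""
    failures = [line for line in log.splitlines()
                if line.startswith("[BOOT]") and "FAIL" in line]
    return (not failures, failures)

passed, failures = check_bring_up(BOOT_LOG)
print(passed)       # → False
print(failures[0])  # → [BOOT] PMIC rail check FAIL (rail=1V8)
```

A check like this turns bring-up from eyeballing a terminal into a pass/fail step that can run on every prototype.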

Application testing - The application testing stage focuses on ensuring the hardware-software interaction is correct. Closed-loop testing, or hardware-in-the-loop testing, brings greater efficiency to the testing process and enables a seamless integration with the software side of a product. Testing can be highly automated and data can be pulled directly into a software instance to qualify performance. Make sure to read the linked article above to learn more about this process and how to set up your own hardware-in-the-loop testing process.
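The closed-loop idea above can be sketched as a small harness: the test code sends stimuli to the device under test (DUT) and asserts on the responses. In this self-contained sketch a fake DUT stands in for the serial-attached board, and the `LED ON`/`READ LED` command protocol is purely hypothetical.

```python
class FakeDut:
    """Stand-in for the device under test. A real harness would talk to
    the board over a transport such as pyserial's serial.Serial; the
    command protocol here is a hypothetical example."""
    def __init__(self):
        self.led = False

    def send(self, cmd: str) -> str:
        if cmd == "LED ON":
            self.led = True
            return "OK"
        if cmd == "READ LED":
            return "ON" if self.led else "OFF"
        return "ERR"

def run_hil_tests(dut) -> dict:
    """Drive the DUT through stimulus/response pairs; record pass/fail."""
    cases = [("LED ON", "OK"), ("READ LED", "ON")]
    return {cmd: dut.send(cmd) == expected for cmd, expected in cases}

results = run_hil_tests(FakeDut())
print(results)  # → {'LED ON': True, 'READ LED': True}
```

Because the harness only depends on a `send()` interface, the same test cases can run against the fake device during software development and against real hardware in the loop once a prototype exists.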

Other Developer Resources

No matter which path you take to develop a new product, there are resources to help accelerate development on the driving side of the spectrum and quickly define requirements for the rest of the system. From modules to open-source projects, developers can access several resources that help teams jump into the process quickly.

All-hardware

  • Reference designs
  • Open-source hardware platforms
  • Application notes

SoC/FPGA-enabled

  • Modular platforms (dev boards, SBCs, etc.)
  • Vendor libraries for SoCs
  • Vendor IP for FPGAs
  • 3rd party libraries (e.g., Arduino)
  • 3rd party code examples

All-software

  • Open-source projects on GitHub
  • Vendor development toolkits and applications
  • 3rd party code examples

Starting with reference designs is probably the most common approach. At the schematic level there is nothing wrong with this, especially if there is some critical hardware-enabling feature in a particular component that the system needs. Be careful with the PCB layout in this case, as reference design practices do not always translate successfully into a custom PCB.

Semiconductor vendors are offering a lot of support with libraries and software tools to help speed up development. This actually enables a software-driven approach to system development. Designers can focus on building the software stack and using those features to inform hardware requirements, then they can go back and build the hardware. Many times, this drives the use of FPGAs simply because they offer programmable logic that can best match the software requirements.

Software Releases Will Be Slower, But Higher Quality

Frankly, there is no one approach that is objectively “better” than the other. Depending on the performance requirements and user experience you want to create, and the available hardware to support those goals, one approach might be better than the other. It’s a matter of weighing available components, required features, and ease of implementation for both.

Overall, design teams may find that a hardware-software co-design process tends to increase the time required to get a software application built and into production. This is because the software has to follow the hardware implementation; whatever is implemented or changed on the hardware must be reflected in the device firmware and software.

The tradeoff here is that alignment between software developers and hardware designers can be ensured at each stage of the design process. In smaller companies, this might be less of a challenge because communication can be much easier. In larger companies, where each domain operates semi-autonomously, alignment across teams is critical for preventing wasted redesigns and lost time. An integrated environment can help teams work together and provide visibility into project data throughout the design process.

When you need to implement a cohesive workflow across your design team, make sure you use the complete set of PCB design tools in Altium Designer® and the team collaboration features in Altium 365™. Altium’s unified design environment gives teams the tools they need for comprehensive product development and management throughout the product lifecycle.

We have only scratched the surface of what’s possible with Altium Designer on Altium 365. Start your free trial of Altium Designer + Altium 365 today.

About Author

Zachariah Peterson has an extensive technical background in academia and industry. He currently provides research, design, and marketing services to companies in the electronics industry. Prior to working in the PCB industry, he taught at Portland State University and conducted research on random laser theory, materials, and stability. His background in scientific research spans topics in nanoparticle lasers, electronic and optoelectronic semiconductor devices, environmental sensors, and stochastics. His work has been published in over a dozen peer-reviewed journals and conference proceedings, and he has written 2000+ technical articles on PCB design for a number of companies. He is a member of IEEE Photonics Society, IEEE Electronics Packaging Society, American Physical Society, and the Printed Circuit Engineering Association (PCEA). He previously served as a voting member on the INCITS Quantum Computing Technical Advisory Committee working on technical standards for quantum electronics, and he currently serves on the IEEE P3186 Working Group focused on Port Interface Representing Photonic Signals Using SPICE-class Circuit Simulators.
