What is PCB Data Management?
PCB Design Data
The successful design and manufacture of any PCB comes down to managing data. First, there are the requirements of the circuit: an expression of what the final product should do, its specification and tolerances, and its operating environment. Then there is the data for each component, whether in a downloaded datasheet or stored electronically in a design tool library. Then there is the data for the PCB itself, such as the material properties and conductor thicknesses. Finally, designs rarely start from scratch; they often reuse parts of existing successful designs as a starting point, and circuit designs and PCB layouts held in electronic format support this reuse.
The critical questions for the designer are:
- Do I have all the data I need?
- Is it correct and up to date?
- Has anyone made changes that I don't know about yet?
Working with incomplete or out-of-date data will result in a design that doesn't perform as required.
The problem is that the data changes throughout the development process. Customers change their requirements. The design team's understanding of the requirements can change as ambiguities are resolved and assumptions are challenged. Design decisions in one aspect of the design force changes to other elements: for example, a change to the enclosure shape may mean a PCB component is now too tall to fit inside. Operating environment changes mean the design needs to cope with different ambient temperatures or higher vibration levels. The design of a section of control logic may require the power supply design to provide power within different tolerances. The list of possible changes is endless, so it is imperative to have processes in place to manage the data and address any changes.
These issues are magnified when there's collaborative working on a PCB design. For example, processes need to ensure everyone on the design team knows that a specification has changed or that a different component with different physical or electrical properties has been swapped into a design.
As mentioned, using incorrect or out-of-date data is a recipe for failure. So the critical question here is how you know that you're using the correct information for your PCB layout.
Requirements management tools are an established mechanism for capturing and managing customer requirements and tracing them through to design decisions. They support the management of changes and highlight which changes may impact which parts of a design or its specification. Verification of the requirements is then demonstrated at the end of the development process through product testing. Formal requirements management tools give confidence that the captured requirements are maintained as well as possible during the development process.
For component and PCB specifications, datasheets are generally obtained from the manufacturer and taken on trust. Typically there are no announcements when datasheets are revised, so the design team is primarily responsible for checking that it is using the latest version and that no errors have been reported in the public domain. Often this comes down to engineering experience and choosing components from trusted manufacturers with an established record of providing correct data. A good tip: for parts available from multiple vendors, the datasheets from each vendor can be compared and any differences investigated. Additionally, vendors with a history of datasheet errors can be moved from trusted status to requiring additional verification checks should their products be used.
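The multi-vendor comparison above can be partly automated. The sketch below is a minimal illustration, assuming locally stored copies of the datasheet files: it flags when two copies differ byte-for-byte, a cheap first check that cues a manual review of the differences.

```python
import hashlib
from pathlib import Path

def file_digest(path):
    """Return the SHA-256 digest of a file's contents."""
    return hashlib.sha256(Path(path).read_bytes()).hexdigest()

def datasheets_match(paths):
    """True when every listed copy of a datasheet is byte-identical."""
    digests = {file_digest(p) for p in paths}
    return len(digests) == 1
```

Matching digests only prove the files are identical; differing digests are the trigger to investigate what changed between revisions.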
Internal design decisions and specification data for bespoke components, such as hand-wound inductors or laser-cut busbars, are usually managed less formally. Often this relies on communication within the design team to ensure everyone has the latest information and is kept up to date with changes. This is, in essence, a project management issue. However, problems arise when stakeholders forget this need and communications break down. This is a particular problem when a collaborative team is not co-located or does not hold regular meetings to discuss progress and decisions.
Management of Data
Ad hoc solutions and manual processes will only get you so far. They rely on designers remembering to tell the rest of the team that a change has been made to a circuit design or component choice, which can never be reliable. The answer is an integrated data management solution that provides what is known as a single source of truth: a definitive dataset with no duplication that everyone can access in real time. A change made by one person is then immediately visible to everyone else.
This data management can be even more challenging when you remember not all data is static. For example, while component tolerances and specifications tend to be set in stone, dynamic data such as component prices and lead times can change on a daily basis. Monitoring dynamic data for changes and managing updates to the dataset can be an effort-intensive activity that is an ideal candidate for automation, given the potential consequences of using erroneous or out-of-date data.
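As a small illustration of the automation meant here, the sketch below compares a previous snapshot of dynamic component data with a fresh one and reports exactly which fields changed. The data shapes are my assumption, not any particular tool's format.

```python
def diff_component_data(old, new):
    """Return {part: {field: (old_value, new_value)}} for every changed field.

    old and new map part numbers to dicts of dynamic fields
    (e.g. price, lead time in weeks).
    """
    changes = {}
    for part, fields in new.items():
        prev = old.get(part, {})
        # Keep only the fields whose value differs from the snapshot.
        delta = {f: (prev.get(f), v) for f, v in fields.items() if prev.get(f) != v}
        if delta:
            changes[part] = delta
    return changes
```

Run on a schedule, a report like this lets the team review only the parts whose price or lead time actually moved, instead of rechecking the whole dataset.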
The challenge has been that the dataset is typically vast and fragmented across incompatible formats. The data held in electrical computer-aided design (ECAD) and mechanical computer-aided design (MCAD) tools is rarely compatible between the two, let alone with the data in requirements management tools. Effective data management requires a single integrated solution that links tools together with managed libraries of data, seamlessly and effortlessly from the design team's point of view.
Adopting a solution like Altium to provide a unified data management platform for the development process is an excellent foundation for effective design processes. But there is still work to do in creating automated processes around the development lifecycle to eliminate human error and ensure the validity of the dataset.
Maintenance of Data
The data associated with a component library changes over time. New components become available; changes to manufacturing processes can alter specifications, resulting in an updated datasheet; and older parts cease manufacture and become obsolete as stocks dwindle. Keeping the library up to date can be a full-time job in itself.
Some organizations undertake continuous maintenance of their libraries so the design teams can be sure they are using the latest data and that the components they select will be available. In other, typically smaller, organizations, the design team will need to check that they have the correct information and that their chosen components are still available.
How data is maintained will depend on the nature of the organization. The essential point is that there is a process. There's nothing worse than spending weeks refining the perfect circuit design only to find that critical components are now obsolete and the alternatives have subtly but significantly different specifications.
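One such process can be sketched as a lifecycle check of a bill of materials against the managed library. The status names and data shapes below are illustrative assumptions:

```python
def check_bom_lifecycle(bom_parts, library):
    """Flag BOM parts that are obsolete, not recommended, or missing.

    bom_parts: iterable of part numbers used in the design.
    library:   dict mapping part numbers to records with a "status" field.
    """
    issues = []
    for part in bom_parts:
        status = library.get(part, {}).get("status", "missing")
        if status in ("obsolete", "not_recommended", "missing"):
            issues.append((part, status))
    return issues
```

Run before design reviews, a check like this surfaces obsolescence problems while alternative parts can still be designed in cheaply.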
Security of Data
Having all the data collated and managed is great until something catastrophic happens that affects all of it. The accidental deletion of the entire dataset, or worse, a malicious hacker changing the data, can bring a project to a halt. Data security has three vital components: confidentiality, integrity, and availability.
Confidentiality means protecting data from unauthorized access. For example, design decisions or customer requirements can include sensitive intellectual property or copyrighted materials that a competitor would love to see. Therefore, the systems that store the data must be protected so that only authorized members of the design team and other key stakeholders have access.
Integrity means protecting data from unauthorized change. A disillusioned team member or a hacker can alter data, which can wreak havoc if the alterations are not noticed straight away. Finding a problem at the end of the development process, when the first board has been manufactured, means enormous amounts of wasted time and resources; trawling back through the data to find and correct errors can take days of effort. The systems that store the data must be protected so that only authorized personnel can alter data, and records must be kept when changes are made.
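One common way to keep such change records tamper-evident is a hash-chained audit log, where each entry commits to the one before it, so a later edit to any entry breaks the chain. A minimal sketch, with record fields that are my own assumption:

```python
import hashlib
import json

GENESIS = "0" * 64  # placeholder "previous hash" for the first record

def _record_hash(body):
    """Deterministic SHA-256 over a record's fields."""
    return hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()

def append_audit_record(log, user, change):
    """Append a change record that commits to the previous record's hash."""
    record = {"user": user, "change": change,
              "prev": log[-1]["hash"] if log else GENESIS}
    record["hash"] = _record_hash(record)
    log.append(record)
    return log

def verify_audit_log(log):
    """True if no record has been altered and the chain is unbroken."""
    prev = GENESIS
    for rec in log:
        body = {"user": rec["user"], "change": rec["change"], "prev": rec["prev"]}
        if rec["prev"] != prev or rec["hash"] != _record_hash(body):
            return False
        prev = rec["hash"]
    return True
```

This does not prevent tampering by itself (an attacker with full write access could rebuild the chain), but combined with access controls and off-site copies of the log it makes silent alterations detectable.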
Availability means ensuring the design team has access to the data when they need it. Highly secure data that no one can access is pointless, and disruption to networks through equipment failures or denial-of-service attacks that prevent access to data can bring a design project to a halt until the problem is solved.
So far, we've talked about the data used in the development process. At the end of this process, the developers will send off a completed and verified design for manufacture. The purchasing department needs bills of materials to source components. The board manufacturer needs the PCB fabrication data to produce the production-standard PCBs. The assembly line requires the PCB layout and assembly drawings to build the boards. Then there are the mechanical drawings the assembly line needs to fix the boards into their enclosure, along with the other non-board-mounted components and wiring requirements.
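To make one of these outputs concrete, a bill of materials is typically consolidated from per-placement design data into one line per part number, with a quantity and the reference designators that use it. A sketch under assumed data shapes:

```python
from collections import Counter

def build_bom(placements):
    """Consolidate per-placement records into BOM lines.

    placements: iterable of (reference_designator, part_number) pairs,
    e.g. ("R1", "RES-10K"). Returns one line per part number.
    """
    placements = list(placements)
    counts = Counter(pn for _, pn in placements)
    refs = {}
    for ref, pn in placements:
        refs.setdefault(pn, []).append(ref)
    return [{"part": pn, "qty": counts[pn], "refs": sorted(refs[pn])}
            for pn in sorted(counts)]
```

Generating the BOM from the same dataset the layout was produced from, rather than maintaining it by hand, is exactly the kind of step where a single source of truth pays off.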
The completed design will also have its own specification data: its input requirements for power and signals, and its outputs in terms of levels, tolerances, and power ratings. This essential information is generated by the development process for use in interface specifications and operating manuals.
All this data needs to be generated, verified, and delivered to those who need it so that they receive the correct data. It's no use managing data to create the perfect design if the wrong version of the PCB track layout is sent off to the manufacturer. The data management needs to be a complete end-to-end solution, from component datasheets coming in at one end to the assembly drawings going out the other.
PCB design revolves around data. Data coming in that's used in the design process, data created and used within the design process, and data produced by the design process that's needed in the next stage of a product life cycle. Managing this data is key to trouble-free design processes. Any solution needs to collate data, both static and dynamic, from diverse sources into a single source of truth that designers can rely on being up-to-date and accurate. Automating this process to minimize data management overheads and eliminate human error is essential.