From Excel to Enterprise: Modern PLM Solutions for Hardware Teams

James Sweetlove
Created: August 22, 2025 | Updated: November 14, 2025

Discover how modern hardware teams are abandoning Excel spreadsheets for cloud-native PLM solutions that streamline product development from design to manufacturing. Michael Corr, Founder and CEO of Duro Labs, shares insights from his 20-year engineering career and explains how AI-powered product lifecycle management is transforming the hardware industry.

Learn why traditional PLM systems have been considered a "four-letter word" among engineers and how new-generation solutions are changing that perception. This episode covers supply chain integration, the cultural shift happening in hardware development, and practical strategies for implementing PLM workflows that actually work for fast-moving teams.

Resources from this episode:

Listen to the Episode

Watch the Episode

Transcript

James: Hi everyone, this is James from the Ctrl+Listen podcast, brought to you by Octopart. Today I have a guest for you. This is Michael Corr, founder and CEO at Duro Labs. Thank you so much for joining us today.

Michael: Thanks for having me, James.

James: Anytime. Before we get into the nitty-gritty, do you want to introduce us to the company, tell us a little bit about Duro’s story and your background as well?

Michael: Yeah, happy to. A quick summary: Duro is a cloud-native, AI-powered PLM platform. We handle customers’ CAD data, bills of materials, and supply chain, and it provides a progressive, agile, revision-controlled workflow for handling all the disparate data that’s needed to go from design to manufacturing for almost any type of hardware product.

James: Okay. And what’s the story behind how the company was formed?

Michael: So my background: I’m actually an electrical engineer, and I’ve been designing and manufacturing various products throughout my career—nearly 20 years in the industry. My area of expertise was RF and telecom equipment.

I started my career building radios for the government, then moved into cleantech, wireless lighting controls. I did IoT, robotics, wearables, drones—quite a handful of different products. I did quite a bit of manufacturing in China and overseas, fell in love with Hong Kong, and moved there for a few years. I had my own business helping companies get up and running in factories there and learned quite a bit about the whole gamut from design all the way through production—what it takes to get products into market.

Throughout my career I was following what were considered industry best practices and using the leading software tools for managing all this content. Being in the trenches with products being manufactured, I saw how critical it was for your bill of materials and supplemental data to be accurate, to reduce the risk of manufacturing mistakes.

Contract manufacturers are often on the receiving end of all this content. They weren’t part of all the dialogue that happened during the design cycle, so they don’t have the tribal knowledge of the nuances of your product. They’re really depending on the documentation they get from you. If there are inconsistencies, errors, or ambiguity, it puts a lot of pressure on them to resolve it and hopefully build your product correctly.

As we all know, hardware is not a cheap endeavor. Mistakes can get extremely expensive—not just in capital, but in lost time and opportunity as well. I recognized that, yes, humans make mistakes—it’s part of the equation—but a lot of those mistakes were rooted in poor data management.

A lot of it was rooted in what I felt were software tools and processes that at one point were reasonable and considered best practice, but so much had evolved in terms of technology and capabilities that most of this stuff was now antiquated and horribly inefficient. The root problem was still a heavy dependency on manual involvement for managing the data. We’re human, we make mistakes, we’re inconsistent, and it just wasn’t the right pillar to build your data management on.

It really should be leaning on software tools to automate those processes, to catch inconsistencies or enforce rules. With the boom in the software industry in the mid-2000s and what’s now commodities—cloud technologies, SaaS, APIs—I saw a lot of the technology was there and available to bring the hardware industry where it needed to be, to resolve a lot of these manual processes. That was really the impetus for starting Duro.

James: Wow. It’s a very rich background you have there, and I think it really gives you a different understanding compared to someone who’s just created software—understanding only the software side of things but not the practical application of it.

Michael: Yeah. I like to think I’ve been exposed to quite a few parts of the hardware ecosystem.

James: Definitely. I remember speaking with you and you said there was a product relaunch. What led to that pivot, and what is the relaunch that you’ve done?

Michael: It was kind of bittersweet, if that’s the right word. As I said, when we first started around 2018 or so, we saw the writing on the wall—we saw the cultural shift happening in the hardware industry. There were more and more young engineers entering hardware than there really had ever been before, and it was starting to pick up pace.

But the market was still dominated by the existing workforce, legacy tools, and legacy workflows. So we focused on that new generation: startups and small teams looking to move fast. They understood cloud, APIs, and this plug-and-play mentality.

Our first product was very intentionally simple, straightforward, and I’d say opinionated. It implemented industry best practices. It didn’t require the team to know how to properly configure a PLM, how to set up a part numbering system or revision scheme. A lot of that stuff isn’t taught in school—it’s learned on the job.

Through my experience, I’d seen so many teams—through no fault of their own—naively set up and configure their PLM systems, not really knowing what they were doing, and they’d always end up painting themselves into a corner with complexity that wasn’t necessary. We felt that by removing that complexity and enforcing best practices, we allowed teams to get up and running quickly. They didn’t have to use Excel as a stopgap. They could use Duro to implement proper PLM workflows with a fraction of the time and knowledge needed.

That worked really well for small teams, fast-paced teams, or NPI (new product introduction) teams within larger companies. But, for good or bad, the hardware economy caught up and accelerated at a much faster pace than we anticipated. Now there is broader understanding and knowledge of agile workflows and tech stack concepts. There are more software-for-hardware companies in the market now than there were five years ago.

The market’s more educated on these workflow models and implementing them for hardware. Admittedly, our first product didn’t have the infrastructure to adapt to how fast the hardware industry evolved and how fast the software tools for hardware came to market. To keep up with customers’ wishes—the integrations they wanted and features they wanted—we did what we could to update the existing product, but we reached a crossroads where we said, “Let’s pick our heads up, take a breath, and look around. What’s the right opportunity?”

Very fortuitously, there were a couple of ingredients that helped shape our decision. One was simply that SaaS technology as a whole had evolved tremendously since we first started five years ago. There are so many advantages of newer SaaS technologies that allow you to build a product faster. A lot of the core foundations we had to build ourselves were now available out of the box.

The other aspect was that AI was now prevalent and reasonably mature—not just to add AI tools to your product for your customers, but to be a consumer of AI yourself to help you build your product. Various models and tools were out there that helped expedite software development. That really empowered us to say, “We have a small window of opportunity to build something fresh and new. We can start with 2025 SaaS technology and, on top of that, use AI to accelerate. We can build the product we want from scratch in a fraction of the time it would have taken years ago, or even to continue evolving our existing product.”

Combined with years of empirical experience working with customers and knowing what they want, we had clarity on exactly what the market needed. We relaunched, basically just a couple of weeks ago, a brand-new product called Duro Design, which is a more agile PLM workflow platform.

It’s very API-leaning—we started with an API-first approach. Everything available to the customer through the web application and our plugins is available through the API. One thing we recognized culturally is that a lot of hardware teams now have in-house software development expertise, and many of them enjoy building their own tools and building on top of other platforms. We wanted to give that access to our customers.
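
As a concrete illustration of that API-first idea, here is a minimal sketch of the kind of in-house tooling a hardware team might build on top of a PLM API. The endpoint, field names, and token handling below are hypothetical stand-ins for illustration, not Duro's actual API; consult the vendor's documentation for the real interface.

```python
import os

import requests

# Hypothetical endpoint and schema for illustration only; not Duro's actual API.
BASE_URL = "https://plm.example.com/api/v1"
TOKEN = os.environ.get("PLM_API_TOKEN", "demo-token")

def fetch_bom(product_id: str) -> list:
    """Fetch the current bill of materials for a product."""
    resp = requests.get(
        f"{BASE_URL}/products/{product_id}/bom",
        headers={"Authorization": f"Bearer {TOKEN}"},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["components"]

# A team with software expertise can build its own reporting on top,
# e.g. flag any BOM line that is missing a datasheet link.
for part in fetch_bom("PRD-0001"):
    if not part.get("datasheet_url"):
        print(f"{part['part_number']}: missing datasheet")
```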

And then, of course, we have a line of AI-empowered features available to customers as well, which I can also get into.

James: Yeah, I’d love to hear about that. Just before we get into it, for anyone who’s not super familiar with the terminology, do you want to outline what a PLM is?

Michael: Sure. I recognize our industry is very acronym-heavy. PLM stands for Product Lifecycle Management.

The best analogy, especially for people more familiar with software workflows, is that it’s like GitHub. It’s revision control, for all intents and purposes. It’s the central hub of content where you upload your CAD content, you build out your bill of materials, and you supplement it with supply chain information, procurement information, manufacturing assembly drawings, test information.

It’s basically how you define the recipe to build your product. The output of a PLM is the cookbook you give to your manufacturing team—whether in-house or third party—and they use that to know how to procure parts, manipulate them, and eventually build and test your product.

On top of that, it has revision control. As you make changes—which is a natural part of product evolution—it provides ways to track your changes, review them, approve or reject them, and have traceability for when changes were introduced. You can compare revisions, and so on. It’s a true revision control platform.

I refer to it as a multi-dimensional GitHub. GitHub really just tracks software files—text files—in a fairly linear fashion. With hardware data, you’ve not only got CAD files (which are analogous to text files), but also supply chain information, assembly information—multiple vectors of information that can change at any time.
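
To make the "multi-dimensional" point concrete, here is a toy Python model (an editorial sketch, not Duro's actual data model) showing how a single BOM line carries several independent vectors of change, any one of which can drive a new revision.

```python
from dataclasses import dataclass, field

# A toy model of why hardware revision control is "multi-dimensional": a single
# BOM line can change along several independent vectors, not just the CAD file.
@dataclass
class BomLine:
    part_number: str
    cad_revision: str                             # design vector (like a source file)
    sourcing: dict = field(default_factory=dict)  # supply chain vector
    assembly_notes: str = ""                      # manufacturing vector

@dataclass
class Revision:
    label: str                                    # e.g. "A", "B", "C"
    lines: list = field(default_factory=list)

def diff_part_numbers(old: Revision, new: Revision) -> list:
    """Report part numbers added or removed between two revisions."""
    old_pns = {line.part_number for line in old.lines}
    new_pns = {line.part_number for line in new.lines}
    return ([f"added {pn}" for pn in sorted(new_pns - old_pns)]
            + [f"removed {pn}" for pn in sorted(old_pns - new_pns)])

rev_a = Revision("A", [BomLine("R-0402-10K", "1")])
rev_b = Revision("B", [BomLine("R-0402-10K", "2"), BomLine("C-0603-1U", "1")])
print(diff_part_numbers(rev_a, rev_b))            # ['added C-0603-1U']
```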

James: Thank you. That was a fantastic explanation. To get a little deeper now: about your UI and why that’s unique—can you give a couple of examples and some use cases?

Michael: Absolutely. That’s partly my own personal pet peeve: strong UI/UX—user interface and user experience. I fundamentally believe that strong UI/UX leads to better adoption and usability of your product and more productivity.

The more work it is to do a task, the more people are going to be hesitant to do it. The more complicated or ambiguous a process is, the more likely there will be deviations or inconsistencies among users.

In the end, the whole tech stack—from CAD to PDM, PLM, ERP, MES, and all the other software—is what I call a “garbage in, garbage out” problem. If there’s erroneous, inconsistent, or incomplete information going into these platforms, it’s not going to get cleaned up. It won’t be any better coming out into your production.

The more intuitive, easy, and automated a user interface is, the more encouraging and enjoyable it is to use—which encourages adoption—and it helps ensure there’s less garbage coming into the platform.

We’ve always stressed clean and simple UI. We’ve got a fantastic product team. We’ve won many accolades about how simple and intuitive our product is.

We do things like automating workflows. Software is so much better at managing data—certainly large volumes of data—looking for inconsistencies, missing information, conflicting content. It’s simple to build business logic to scrape your data and identify issues.

For example, when you import spreadsheets or CAD information, we run a litany of validation rules and quickly identify anything inconsistent or problematic so you can immediately address it before it’s too late.
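
A minimal sketch of what such an import-time validation pass might look like, with illustrative rules and BOM field names rather than Duro's actual rule set:

```python
# Simple consistency checks over an imported BOM; rules and fields are
# illustrative assumptions, not Duro's actual validation logic.
REQUIRED_FIELDS = ("part_number", "manufacturer", "quantity")

def validate_bom(rows: list) -> list:
    """Run consistency checks over imported BOM rows and collect issues."""
    issues = []
    seen = {}
    for i, row in enumerate(rows, start=1):
        for f in REQUIRED_FIELDS:
            if not row.get(f):
                issues.append(f"row {i}: missing {f}")
        pn = row.get("part_number")
        if pn in seen and seen[pn] != row.get("manufacturer"):
            issues.append(f"row {i}: {pn} listed with conflicting manufacturers")
        elif pn:
            seen[pn] = row.get("manufacturer")
        qty = row.get("quantity")
        if qty is not None and (not isinstance(qty, (int, float)) or qty <= 0):
            issues.append(f"row {i}: invalid quantity {qty!r}")
    return issues

# Usage: surface problems at import time, before they reach manufacturing.
problems = validate_bom([
    {"part_number": "R-0402-10K", "manufacturer": "Yageo", "quantity": 12},
    {"part_number": "C-0603-1U", "manufacturer": "", "quantity": 0},
])
print("\n".join(problems))
```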

Other things include the change order itself. That’s the process where an engineer submits a change that needs to be reviewed by a team before it’s adopted into the main branch. Again, looking at software, the analogy is a pull request. A change order is such a pivotal experience and process that if it’s not simple, intuitive, and efficient, people aren’t going to use it properly. And it’s critical to PLM and design management.

The stories I tell are that if a change order takes too much effort for the reviewer to review, and they have to do a lot of manual steps to properly analyze it, they’re either going to rubber-stamp it—say, “I know that engineer, he’s good, I don’t have time to go through this,” and just approve it—or they’re going to wait until they can block out a chunk of time to roll up their sleeves and do the due diligence. Neither is efficient.

We spent a lot of time studying how we can automate and speed up the change order process. How can we do the data management and automatic reporting about what’s happening in the change order? How can we fill in summaries based on the context of what’s been submitted? How do we integrate with other tools—simulations or validation tools—that autonomously validate content in the change?

By the time a reviewer comes to do their job, everything is presented on a silver platter so they can still do a thorough analysis, but all the legwork is done for them. It becomes a much more efficient process.

James: Fantastic. And I know a big part of that is integrations. You offer a lot of integrations with other software and products. Do you want to explain that a little bit?

Michael: Absolutely. A PLM has no value without data, and a lot of that data starts in CAD. We wanted to really control that user experience and provide plugins for the CAD tools we support.

A lot of legacy PLM providers won’t touch CAD plugins with a 10-foot pole—they let the market develop its own. I’ve found that creates an inconsistent experience because now you have two or more brands of products trying to work together. There’s a lag when one changes its APIs or features before the other can catch up. As a consumer, I now have two different contracts to manage or two different vendors to chase if there are issues.

We felt that by Duro owning that whole experience, we can make it consistent and efficient. There’s no lag: as we add features, they’re immediately available across the entire platform.

Today we have CAD integrations for Altium, SolidWorks, Onshape, and NX, with several others coming down the pike. On the other side, products like ERP—NetSuite or SAP—are market-leading ERP platforms. As teams reach certain maturity points, they tend to use ERP to manage purchasing and orders. We have plugins to the major ERP platforms as well.

We’re really trying to make it an out-of-the-box plug-and-play experience for our customers, so they’re not chasing multiple vendors or paying expensive consultants to build custom integrations. We’re trying to make it as simple as possible: put down a credit card, enter your account credentials, authenticate the products to communicate with each other, and that’s it—simple, very smooth.

James: I know you have some supply chain integrations as well. Do you want to explain what that means and why you feel that’s an important feature for the software?

Michael: That goes back to some of my personal pet peeves. When I was an active electrical engineer, we spent so much time chasing down parts—resistors, capacitors, inductors, ICs.

I had a web browser open with five tabs: Arrow, Digi-Key, Avnet, Mouser, etc. They were always open so that as I was designing boards and going to production I was constantly looking at part availability. It was a complete waste of my time.

When I found parts, I’d manually copy content from those catalogs into my PLM system. Based on my own patience at the time, and because it was a manual process, that was prone to error.

One of the first features we built, to be honest (and not just to plug your product), was our Octopart integration. When I discovered Octopart I thought, “Why weren’t you available during most of my career?” It made the whole task trivial.

Again, software is better at managing data than people. Octopart was a perfect product: I could put in a manufacturer part number and get all the information I needed from those catalogs in milliseconds across multiple vendors. In Duro, I could set up rules about what content I wanted to capture so it was consistent, error-free, and exhaustive.
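
For illustration, a generic MPN lookup against a parts-data API might look like the sketch below. The URL, parameters, and response shape are placeholders, not the actual Octopart/Nexar schema; consult the vendor's API documentation for the real interface.

```python
import requests

# Illustrative only: placeholder endpoint and response fields, not the real
# Octopart/Nexar API.
def lookup_part(mpn: str) -> dict:
    """Query a parts-data API by manufacturer part number and summarize offers."""
    resp = requests.get(
        "https://parts.example.com/search",
        params={"mpn": mpn},
        timeout=30,
    )
    resp.raise_for_status()
    offers = resp.json()["offers"]
    # Capture a consistent subset of fields, as described above, so the data
    # entering the PLM is uniform and complete.
    return {
        "mpn": mpn,
        "distributors": [o["distributor"] for o in offers],
        "best_price": min(o["unit_price"] for o in offers),
        "min_lead_time_days": min(o["lead_time_days"] for o in offers),
    }
```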

It’s so important to make that content easily accessible—not just for obvious efficiency gains by removing humans from the loop, but also for greatly reducing development cycles.

Another classic example every engineer has gone through: as an electrical engineer, I’m designing a board. I’m using these catalogs to confirm parts are available today. But by the time I finish the board design and release the Gerbers, and then actually go to produce it, a lot can happen in supply chain. Those parts might not be available anymore.

Often, especially at bigger companies, we had a separate team responsible for procuring parts. Because of those factors, by the time we actually went to produce, those individual parts weren’t necessarily available or didn’t meet purchasing criteria—pricing or lead time. I had to go back to the drawing board.

Best case, I found an alternate that was package-compatible. But like many electrical engineers, I’ve had situations where I couldn’t find a package-equivalent part and had to go back to the board and redesign it to change the footprint so it was compatible with supply chain.

You lose so much time and so many cycles through that process because there’s so much effort to confirm supply availability that you only do it when absolutely necessary. With products like Octopart and other digital supply chain tools, with a simple refresh operation I can find out whether all these parts are still available.

During design cycles I can incorporate that information much more readily and don’t have to wait for procurement or manufacturing cycles to find out. By empowering the design team with more information, more efficiently, you reduce the dependency on procurement and manufacturing to figure out supply issues.

James: Fantastic. And I’m sure things like end-of-life for products as well—you really want to check that so you don’t waste a lot of time in production and design cycles.

Michael: Exactly. And again, software is very good at processing data. You can set alerts: let me know if these parts go end-of-life, or if lead time changes, or if pricing exceeds a threshold. You can now be more passive instead of constantly proactive to get this information, and respond much more efficiently.
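
A small sketch of that passive-alert pattern, with assumed field names and thresholds; a real system would run checks against a live data feed rather than an in-memory list.

```python
# Encode thresholds once, then let software watch the feed. Field names,
# thresholds, and the notify hook are assumptions for illustration.
ALERT_RULES = {
    "lifecycle": lambda p: p["lifecycle"] == "EOL",
    "lead_time": lambda p: p["lead_time_days"] > 60,
    "price": lambda p: p["unit_price"] > 1.50,
}

def check_alerts(parts: list, notify=print) -> None:
    """Evaluate each part against every rule and notify on any hit."""
    for part in parts:
        for name, rule in ALERT_RULES.items():
            if rule(part):
                notify(f"ALERT [{name}]: {part['mpn']}")

check_alerts([
    {"mpn": "LM317T", "lifecycle": "Active", "lead_time_days": 14, "unit_price": 0.48},
    {"mpn": "XYZ-123", "lifecycle": "EOL", "lead_time_days": 90, "unit_price": 2.10},
])
```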

James: Fantastic. I know you offer a couple of different plans for your software. Do you want to talk us through those and how they scale?

Michael: We’ve learned over the years that no two teams are the same. By monitoring how our customers use our platform, we were able to bin them, to keep it simple, into three tiers of maturity, along with the types of tools and features they need at each stage of the journey.

We offer three subscription tiers: Team, Business, and Enterprise.

At a high level, Team is when you’re just getting your CAD integrated. You want to set up business rules, change order workflows, part numbering, and you want to get going. You don’t want to use Excel or any homegrown solution to manage your information. You recognize the value of using proper PLM much earlier.

It’s simple: create an account, connect your CAD tools, and you’re up and running. We have out-of-the-box workflows you can choose from, a bit of configurability, but it’s for teams that don’t want to be bothered with days or weeks of configuration.

As you progress, you’ll want to integrate with other tools: your ERP system, MES for manufacturing, other supply chain tools, requirements management tools, etc. Now you’re starting to mature—that’s when you get into the Business tier. You’re really in production, not just early-stage design concepts. The Business plan includes all those features and allows more storage and more users.

Finally, the Enterprise tier is completely configurable. As I mentioned earlier, we’re an API-first platform. You get exposed to everything. Any feature available in our plugins or web client is available through our API. I kind of joke that you can rebuild the entire front end of the web client using our API because all the logic is in the backend.

It really empowers customers to take control of their platform. There are so many new software-for-hardware companies entering the market now; there’s no way we can be responsible for building integrations for every single one of them. We didn’t want to block our customers from growth—which, unintentionally, the legacy product sometimes did.

That’s why we’re focused on making sure everything is accessible. Whether they want to build an integration or configure their own system within Duro, it’s all available through our Enterprise platform.

James: Fantastic. I want to come back to something we talked about earlier in the episode—it’s a very popular topic right now: AI. I want to talk about the role AI has played in the PLM space so far. Obviously things are progressing incredibly rapidly. What changes have you seen in recent years, and what do you think is coming?

Michael: We’re very bullish on how AI can supplement and support the PLM category. We were sitting on the sidelines for a bit to see where the market was going and waiting for some of these models and tools to mature, but now it’s very clear that AI is a perfect fit for PLM in many respects.

One of the reasons is that PLM really is the aggregation point of multiple data sources coming from different CAD tools, different ERP tools, MES, and other resources. These are different siloed buckets of data, each with their own formatting, culture, and vernacular.

Historically, to get all that data into your PLM system and make value out of it—even with advanced algorithms like machine learning—you had to build translators that convert the data from its source format to your PLM platform’s model. Those translators are expensive and have to be built; some are more trivial, others more difficult.

AI has a unique characteristic: it’s very good at processing data in its native format. It removes the burden of building those translation tools. As long as there’s a digital connection—a digital thread—between the two platforms, AI can process the data as it is.

That allows Duro and our users to get value out of disparate data sources more quickly, at lower cost than ever before. That’s one fundamental reason why AI is a perfect fit for PLM.

James: That makes total sense. I think AI is perfect in any sort of data or analytics role—it’s really freeing up so much time for creativity and other aspects of work.

Michael: Agreed. I’m oversimplifying, but I see AI in two categories. It’s being used to generate content—whatever the medium—and to process existing content. I’d argue the processing-content category is a little more mature than the generating-content category.

That’s not to say generating content won’t mature; in some areas it already is. But Duro is leveraging AI to process content. That’s our job: you bring your content—CAD data, production data—and say, “Here it is. What can you do with it?” We use AI to get insights, reports, and value out of that information.

To give a concrete example: again, change orders. The change order module is where all the activity around your product lives. What are people doing? How are they commenting? Are they rejecting changes, approving them?

When there’s a production issue or success, you can connect your MES and ERP system back to PLM and create a digital thread from the production event back to the change order that led to it. There is a huge amount of content about the evolution of your product in the change order module, and it’s all accessible through text or structured data.

We feed that content to our AI engines to process and extract insights, so the system can learn. When new change orders are submitted, we can offer value like: what is the probability this change will be successful? Should there be recommended adjustments before you submit it?

Even if we make change orders more efficient, it still takes people time to review them. What can we do to help the author increase the chances of the change order being approved? How can we learn from prior events to help ensure new changes have a higher degree of success? That’s one area.

Other areas are natural language search. As I mentioned earlier, PLM is a garbage-in, garbage-out problem. When it’s difficult to find content in your PLM system, engineers can be lazy. If they find it hard to locate a part in the existing PLM, they might say, “Forget it, I’ll just create a new entry and we’ll figure it out later.” That continues to build the garbage.

Natural language is a fantastic, intuitive way to ask, “Does this part exist with these parameters in my library?” NLP and search can more accurately tell you, yes or no, or find similar parts so the engineer can evaluate, “Yes, this is exactly what I’m looking for,” or, “No, I can definitively say it doesn’t exist and need to create a new entry.”
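
As a toy stand-in for that idea, the sketch below ranks library entries by textual similarity to a query. Production natural-language search would use semantic matching rather than simple string similarity; this only illustrates surfacing similar parts instead of forcing a duplicate entry.

```python
import difflib

# A tiny illustrative parts library; real systems search structured parameters
# plus descriptions.
LIBRARY = ["RES 10K 0402 1% 1/16W", "RES 10K 0603 5% 1/10W", "CAP 1UF 0603 X7R 16V"]

def find_similar(query: str, n: int = 3, cutoff: float = 0.3) -> list:
    """Rank library descriptions by similarity to the query (case-insensitive)."""
    q = query.lower()
    scored = [(difflib.SequenceMatcher(None, q, d.lower()).ratio(), d) for d in LIBRARY]
    return [d for score, d in sorted(scored, reverse=True)[:n] if score >= cutoff]

print(find_similar("res 10k 0402"))  # closest matches first
```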

James: So, taking a step back from AI and looking at the PLM space outside of AI advancement, what changes have you seen in recent years? Do you see anything coming in the future that would shake things up?

Michael: One of the things I’m most excited about is that PLM is now being more embraced and recognized for the value it provides. It wasn’t that long ago that PLM was considered a four-letter word. People hated it.

They would procrastinate as long as possible before bringing a PLM system into their team, which is why a lot of companies were still building products on the backs of Excel and homegrown file directory structures. They felt that the effort to set up and run a PLM system—and still have mistakes—wasn’t worth the investment. They’d rather take on that responsibility themselves and build some homegrown system.

Many people understood the risk, but at least they owned it and could move fast with it. Many teams procrastinated until the data became too much to manage manually, and only then finally deployed a PLM system.

Now I’m seeing more cultural acceptance and understanding of what PLM provides as a value-add. Companies like Duro are making it much easier to implement, which supports that shift.

I can’t stress enough how exciting it is that there’s a new generation of hardware engineers entering the workforce. What makes them unique, certainly compared to when I entered or my predecessors did, is that they’re all very competent at software development and intuitively understand the cloud and APIs, at much bigger scale.

When they realize that a PLM system is analogous to Git, they immediately understand the value. There’s no software developer in the world who doesn’t use Git or some revision control system—it’s unheard of.

A great comparison is: every software developer’s very first line of code is git init. They set up their Git repository before they start writing code. Traditionally, hardware developers dive into CAD first, and once they reach a breaking point they retrofit their revision control system—PLM.

It’s a very different approach: do I start with revision control or lag with it? I’m seeing more people understand its value and bring PLM in much sooner, closer to day zero when they start with CAD.

James: Wow. So it’s a cultural shift that you’ve seen that’s made a huge difference.

Michael: Correct. It’s a big cultural shift.

James: What about effects from world events? Have you seen any effects post-COVID or from regional conflicts that affect this space?

Michael: I think the biggest impact on PLM—and certainly Duro’s position—is in supply chain integration.

Duro has always been bullish on bringing supply chain access into earlier stages of design cycles through PLM. Electronic parts distributors were some of the first to digitize their catalogs and make APIs accessible so we could integrate them. Mechanical parts followed suit, and we’re seeing more custom manufacturing—PCBA fabs, machined parts, injection molding—offering equivalent web-based digital services where you can automate procurement and quoting.

World events like COVID, tariffs, even tanker ships getting stuck in the Suez Canal affect supply chain. Supply chain is fragile—everyone kind of knew it was a house of cards, but COVID exposed it to the general public.

By digitizing these resources and creating digital threads, users can respond more quickly to changing events. PLM is the central point because the BOM—the bill of materials—is what dictates what you need to buy, and the BOM is stored in the PLM.

By integrating supply chain with PLM—and I’d argue not just ERP—you can evolve and adapt much faster as world events change your supply conditions: availability, pricing, and so on. I’m seeing more awareness that having a PLM system integrated with digital resources for monitoring and procuring supply chain capabilities makes teams more nimble and able to adapt as things evolve.

James: We’re coming up on time, so I only have two more questions. The first is: is there anything coming up from Duro Labs we should keep an eye out for—any announcements we should watch out for?

Michael: We’ve launched Duro Design, and it’s continuing to evolve. We’ve got a huge list of features and integrations we’ll be adding, so definitely keep an eye on our website and subscribe to our LinkedIn page.

We’re launching a PDM product called Duro Drive. We find that a lot of customers are also looking for easier ways to manage check-in/check-out of their files—their CAD files and other documents. There’s a natural relationship between PLM and PDM.

We’re really excited about the way the industry is evolving. There are fantastic partners we’re working with, and we’re bringing on new partners all the time. We have a partner program: anyone who has a software tool they want to integrate with PLM—talk to us. We’ll find ways to add value.

James: Fantastic. The second question is something you kind of answered already: best ways to follow you and keep up with updates. So website and LinkedIn are the two best?

Michael: I would start there, absolutely. Our website is getduro.com, and you can find us on LinkedIn and follow our page as we make announcements.

James: Fantastic. Well, thank you so much, Michael, for your time. It’s been fascinating talking with you and you’ve been very thorough—honestly, great guest.

About Author

James Sweetlove is the Social Media Manager for Altium, where he manages all social accounts and paid social advertising for Altium as well as the Octopart and Nexar brands, and hosts the CTRL+Listen Podcast series. James comes from a background in government, having worked as a commercial and legislative analyst in Australia before moving to the US and shifting into the digital marketing sector in 2020. He holds a bachelor’s degree in Anthropology and History from USQ (Australia) and a post-graduate degree in political science from the University of Otago (New Zealand). Outside of Altium, James manages a successful website, podcast, and non-profit record label, and lives in San Diego, California.
