The NVIDIA-Arm Deal Will Expand Hardware for AI and Deep Learning
On September 13, NVIDIA announced a definitive agreement to acquire Arm from SoftBank for a massive $40 billion in stock and cash. Shortly thereafter, MarketsandMarkets released their AI chipset market forecast, projecting a market size of $57.6 billion by 2026, equating to a 40.1% CAGR. With GPUs being the current workhorse for AI workloads in data centers and at the edge, market leader NVIDIA is set to see major growth for their core products.
The Arm acquisition reflects much more than expected growth in the market for AI applications. When a company is willing to spend $40 billion on chip IP, it must see serious growth potential. There are several reasons NVIDIA is positioning themselves this way, and one becomes clear when we look at the expected market growth for AI chipsets and their hardware platforms.
What’s Going on at NVIDIA?
Without getting too deep into the financials, NVIDIA will own all of the Arm IP that is licensed to chipmakers. As part of the NVIDIA-Arm deal, it was noted NVIDIA “will continue Arm’s open-licensing model and customer neutrality and expand Arm’s IP licensing portfolio with NVIDIA technology.” This licensing model cuts both ways: Arm’s customers can now gain access to NVIDIA’s AI IP and GPU IP. Some have called the deal a "technology disaster" and "reckless". I see the deal as a natural pairing of two sets of IP (chip IP and AI IP) that will see huge growth in the years to come.
To see why this is advantageous, look to the future of computing, in which AI will play a major role both at the edge and in data centers. With AI and edge computing (including the combination of the two) set for huge year-over-year growth over the next 5-10 years, NVIDIA will own a suite of IP to serve customers in both areas.
Although the financial terms have been settled, there are still regulatory hurdles to overcome. The deal could be seen as giving NVIDIA an unfair advantage, as NVIDIA now owns IP their competitors may need in order to compete with NVIDIA. It remains to be seen whether NVIDIA will maintain its commitment to customer neutrality while acting as a technology licensor. In addition, Arm’s efforts in developing AI chipsets play directly into NVIDIA’s recent foray into the AI hardware market, putting NVIDIA in a favorable growth position.
To be fair, NVIDIA is not the only game in town when it comes to AI chipsets, and other companies could certainly spin off subsidiaries to offer their IP under a licensing model. Increased adoption of AI at the edge and in the data center will still depend on FPGAs, x86 CPUs, and AI-optimized SoCs. Startups are abuzz with AI chipset innovations, and it’s conceivable that one of NVIDIA’s competitors could acquire a startup’s IP. NVIDIA may be the leader in this area, but who knows how long that will last in the hyper-competitive chip market.
The Next 5-10 Years
Over the next 5-10 years, companies like Microsoft, Amazon, Google, and a slew of startups will play a critical role in developing software stacks and enterprise-level AI applications. Some might argue we’re living in an AI software bubble, where hardware has yet to catch up to the level of development seen at major software companies. Today, we’re still using bulky GPUs for AI computation; these processors generate a lot of heat, have a large footprint, and are expensive. Still, they’re the closest thing we have to a hardware platform optimized for the repetitive tensor calculations involved in AI computation.
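To make those "repetitive tensor calculations" concrete: the core of most neural-network inference is little more than large matrix multiplications followed by simple element-wise operations, repeated layer after layer. A minimal NumPy sketch of one fully connected layer (the function name and sizes are illustrative, not from any particular framework):

```python
import numpy as np

def dense_layer(x, weights, bias):
    """One fully connected layer: a matrix multiply plus an
    element-wise ReLU activation. This is exactly the kind of
    repetitive tensor arithmetic that GPUs accelerate."""
    return np.maximum(weights @ x + bias, 0.0)

rng = np.random.default_rng(0)
x = rng.standard_normal(512)               # input activations
weights = rng.standard_normal((256, 512))  # layer weights
bias = rng.standard_normal(256)            # layer bias

y = dense_layer(x, weights, bias)
print(y.shape)  # (256,)
```

A real model stacks hundreds of these operations, which is why a processor with thousands of parallel arithmetic units outperforms a general-purpose CPU on this workload despite its power and cost downsides.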
The NVIDIA-Arm deal is about more than just dominating the embedded and data center markets; it gives NVIDIA ownership over the IP that will appear in the next generation of SoCs and ASICs. This includes AI chipsets, whether they are produced by NVIDIA or another chipmaker. Over the next 5-10 years, the AI chipset market will consist of major players like NVIDIA, Intel, AMD, IBM, Samsung, Qualcomm, and Xilinx, as well as a small number of startup companies working on neuromorphic computing and unique transistor architectures. NVIDIA is positioning themselves to dominate this up-and-coming market.
Board designers can already use platforms like NVIDIA’s Jetson Nano with a custom carrier board to create production-grade AI applications involving computer vision, on-device NLP tasks, and small robots. If you’re working on AI hardware, watch for new products from Intel and NVIDIA that bring AI compute workloads out of the data center and onto end devices. As processor core architectures continue to be re-engineered to optimize AI compute workloads, embedded board designers will have a wealth of AI-driven functionality they can bring to the device level.
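The computer-vision workloads mentioned above are dominated by convolutions, which is the operation edge AI modules like the Jetson Nano accelerate in hardware. As a rough sketch of what that operation actually computes (a naive, unoptimized implementation for illustration only; real deployments use an accelerated library):

```python
import numpy as np

def conv2d(image, kernel):
    """Naive 2D convolution (valid padding, no kernel flip, as in most
    ML frameworks). Each output pixel is a weighted sum of a small
    neighborhood of input pixels."""
    kh, kw = kernel.shape
    h = image.shape[0] - kh + 1
    w = image.shape[1] - kw + 1
    out = np.empty((h, w))
    for i in range(h):
        for j in range(w):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

# Sobel horizontal-gradient filter, a classic edge-detection kernel
edge_kernel = np.array([[-1.0, 0.0, 1.0],
                        [-2.0, 0.0, 2.0],
                        [-1.0, 0.0, 1.0]])
image = np.arange(36, dtype=float).reshape(6, 6)  # toy 6x6 "image"
print(conv2d(image, edge_kernel).shape)  # (4, 4)
```

A modern vision model applies millions of these multiply-accumulate operations per frame, which is why dedicated accelerator silicon, rather than a general-purpose CPU core, is what makes production-grade vision feasible on a small embedded board.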
As the industry evolves and new products become available, we’ll be here to keep you updated on critical developments like the NVIDIA-Arm deal. If you’re ready to innovate and develop new AI products and hardware platforms, the PCB design tools in Altium Designer® contain everything you need to stay at the cutting edge.