Can Big Tech Break Nvidia Chip Dominance With AI-Networking Standard?
Last week, leading tech firms announced they are developing an AI-networking standard for linking hardware inside data centers. According to TechCrunch, industry heavyweights like Google, Meta, and Microsoft are setting up an industry group known as the Ultra Accelerator Link (UALink) Promoter Group. Other members of the UALink Promoter Group include Cisco Systems, Intel, Broadcom, and Hewlett Packard Enterprise.
Connecting AI Accelerators
The UALink Promoter Group’s move is driven by the need to connect AI accelerators across data centers, which requires a standardized computing interface.
In a joint statement, the companies said, “An industry specification becomes critical to standardize the interface for AI and Machine Learning, HPC (high-performance computing), and Cloud applications for the next generation of AI data centers and implementations.”
The group is developing the AI-networking standard to link different AI accelerators in data centers. Accelerators are systems that speed up the processing of the huge amounts of data used in AI tasks. They include chips that accelerate the fine-tuning, training, and running of AI models, and they range from GPUs to tailor-made solutions.
Enriching the AI Ecosystem
UALink 1.0 is the initial version of the new AI-networking standard. It will connect up to 1,024 AI accelerators in a single computing pod, which the group defines as one or more server racks. Based on open standards, UALink 1.0 will allow direct loads and stores between the memory attached to AI accelerators and reduce data transfer latency to boost speeds.
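The UALink specification and its programming interface have not been published, so the sketch below is only a rough point of comparison: it uses Nvidia's existing CUDA peer-to-peer API to show what direct access between two accelerators' memories looks like today on a single node, the kind of memory semantics UALink aims to standardize across vendors and at pod scale. The device indices and buffer size are illustrative, and error checking is omitted for brevity.

```cuda
#include <cstdio>
#include <cuda_runtime.h>

// Illustrative only: CUDA peer-to-peer access between two GPUs on one node.
// UALink is a separate, vendor-neutral spec whose API has not yet been published.
int main() {
    int canAccess = 0;
    // Check whether GPU 0 can directly address GPU 1's memory.
    cudaDeviceCanAccessPeer(&canAccess, 0, 1);
    if (!canAccess) {
        printf("Peer access between GPU 0 and GPU 1 is not available.\n");
        return 0;
    }

    // Allocate a buffer on each GPU (size is arbitrary for the example).
    const size_t bytes = 1 << 20;
    float *buf0 = nullptr, *buf1 = nullptr;
    cudaSetDevice(0);
    cudaMalloc((void**)&buf0, bytes);
    cudaDeviceEnablePeerAccess(1, 0);  // allow GPU 0 to load/store into GPU 1's memory
    cudaSetDevice(1);
    cudaMalloc((void**)&buf1, bytes);

    // Copy directly between the two accelerators' memories, bypassing host RAM.
    cudaMemcpyPeer(buf1, 1, buf0, 0, bytes);
    cudaDeviceSynchronize();

    printf("Copied %zu bytes from GPU 0 to GPU 1 over the peer interconnect.\n", bytes);

    cudaFree(buf1);
    cudaSetDevice(0);
    cudaFree(buf0);
    return 0;
}
```

UALink is expected to extend this kind of direct accelerator-to-accelerator memory access beyond a single vendor's interconnect and scale it to pods of up to 1,024 accelerators.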
In a briefing with reporters, Forrest Norrod, AMD’s GM for Data Center Solutions, said, “The industry needs an open standard that can be moved forward very quickly, in an open [format] that allows multiple companies to add value to the overall ecosystem. The industry needs a standard that allows innovation to proceed at a rapid clip unfettered by any single company.”
The UALink Promoter Group is set to create a consortium in the third quarter of 2024 to oversee the development of the UALink specifications, and UALink 1.0 will be released around that time. An updated version with higher bandwidth, UALink 1.1, is expected in the fourth quarter of 2024. According to Norrod, the first UALink products are likely to arrive over the next couple of years.
Breaking Nvidia’s Dominance
The tech giants’ move to develop an AI-networking standard is the latest attempt to break Nvidia’s dominance in the market. Every year, tech companies invest billions of dollars in the hardware needed to support AI applications, driving demand for AI data centers as well as the chips needed to run them.
With a market share of about 80%, Nvidia dominates the AI chip market. Meta and Google pay Nvidia billions of dollars for the GPUs they need to train their AI models and power their clouds, and they appear keen to reduce their dependence on a supplier that currently dominates the AI hardware ecosystem. Nvidia’s networking business is a critical part of the package that supports its AI dominance.