In Focus
Global technology firm Broadcom has unveiled a new networking chip called Thor Ultra. According to Reuters, the new chip connects hundreds of thousands of data processing chips and is designed to help businesses build AI computing systems.
The tech giant unveiled the chip a day after the OpenAI–Broadcom chip alliance was formally announced.
With Broadcom’s Thor Ultra networking chip, computing infrastructure operators can deploy more chips to build and run the power-intensive large models behind AI applications like ChatGPT.
With the new networking chip, Broadcom aims to entrench its dominance in network communications within AI data centers as it takes on NVIDIA’s AI chips. The Thor Ultra chip links AI systems with other parts of data centers, allowing operators to move information around within those facilities.
“In the distributed computing system, the network plays an extremely important role in building these large clusters. So I’m not surprised that anybody who’s in the GPU business wants to make sure that they are participating in the networking,” Broadcom Senior Vice President Ram Velaga said.
In July 2025, Broadcom launched the Tomahawk Ultra Ethernet switch, which improves AI workloads and supports high-performance computing.
Broadcom views AI as a huge growth opportunity, and networking chips form an integral part of its plan. However, the AI chips the company designs for cloud computing giants like Google are more lucrative. Broadcom has developed multiple versions of Google’s Tensor processor, which the search giant started developing over a decade ago.
Analysts estimate that Broadcom has generated billions of dollars from Tensor chips. In 2024, Chief Executive Hock Tan said the company is pursuing a $60 billion to $90 billion market with its AI chips by the end of 2027. This includes the data center processors and networking chips Broadcom develops with Google and OpenAI.
Broadcom generated $12.2 billion in AI revenue in 2024. In September 2025, the company announced that it had onboarded a new customer that needs $10 billion worth of custom AI chips for data centers.
Under the latest deal, OpenAI is partnering with Broadcom to develop and launch its own AI processors. This decision could redefine the dynamics around data center networking and chip supply strategies as OpenAI works to secure computing power for its fast-growing AI workloads.
Broadcom will supply 10 gigawatts of custom accelerators and Ethernet-based networking systems to OpenAI starting in 2026. This points to a shift towards custom silicon and open networking architectures that could affect how companies build and scale AI data centers in the future.
“The racks, scaled entirely with Ethernet and other connectivity solutions from Broadcom, will meet surging global demand for AI, with deployments across OpenAI’s facilities and partner data centers,” OpenAI said in a statement posted on its website.
Broadcom’s New Networking Chip at a Glance
The Broadcom vs NVIDIA networking battle heats up with OpenAI’s decision to adopt Ethernet fabric rather than NVIDIA’s InfiniBand interconnects. The decision also highlights OpenAI’s intention to build a more scalable, open networking architecture that might set the pace for AI infrastructure across enterprises.
Analysts say this aligns with a wider industry trend towards open networking standards, which offer more flexibility and compatibility between systems.
“Ethernet offers broader interoperability and avoids vendor lock-in, which could accelerate the adoption of disaggregated AI clusters. This move is another attempt to challenge InfiniBand’s dominance in high-performance AI workloads and may push hyperscalers to standardize on Ethernet for ecosystem diversity and digital sovereignty,” said Charlie Dai, Principal Analyst at Forrester.