Intel and AMD Lead UALink Alliance to Challenge NVIDIA's NVLink Technology
The Ultra Accelerator Link (UALink) Consortium, initiated by companies including Intel and AMD, has officially been established. The consortium aims to create an open standard for high-speed, low-latency communication between AI accelerators in data center servers.
Its board members include representatives from Intel, AMD, Cisco, Hewlett Packard Enterprise (HPE), Meta, AWS, Google, Microsoft, and Astera Labs, among others. Additionally, the consortium is actively seeking more contributing members.
The Ultra Accelerator Link Consortium will provide an open industry standard for scale-up connection of large numbers of AI accelerators, countering Nvidia's proprietary NVLink technology, Nvidia's solution for GPU-to-GPU communication within and across servers.
The chairperson of the UALink Consortium has also publicly invited interested companies to join as contributing members and support its mission of establishing an open, high-performance accelerator interconnect for AI workloads.
Although the consortium has only just been established, the underlying technology has been in development for some time. The UALink 1.0 specification will be made available to members within the year, offering connections of up to 200 Gbps per lane for as many as 1,024 accelerators within a single AI pod.
For instance, if a server like the NVIDIA HGX houses 8 AI accelerators, UALink could connect up to 128 such machines within a pod. In practice, however, UALink is likely to be deployed at smaller scale, typically interconnecting around 8 servers per pod, with scale-out beyond the pod handled by Ultra Ethernet.
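The pod-sizing arithmetic above can be sketched in a few lines; the 1,024-accelerator pod limit comes from the UALink 1.0 figures cited here, while the per-server accelerator counts are illustrative assumptions:

```python
# Back-of-the-envelope pod sizing under the UALink 1.0 limit cited above.
UALINK_MAX_ACCELERATORS = 1024  # maximum accelerators per AI pod


def max_servers_per_pod(accelerators_per_server: int) -> int:
    """Largest number of servers whose accelerators fit in one UALink pod."""
    return UALINK_MAX_ACCELERATORS // accelerators_per_server


# An 8-accelerator server (e.g. an HGX-class box) gives the 128-server figure.
print(max_servers_per_pod(8))  # 1024 // 8 = 128
print(max_servers_per_pod(4))  # 1024 // 4 = 256
```

A realistic deployment would stop well short of this ceiling, since, as noted above, pods of around 8 servers with Ultra Ethernet handling traffic between pods are the more likely configuration.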
The UALink standard is set to be officially released in the first quarter of 2025, alongside the first version of Ultra Ethernet. AMD has already announced the industry's first network card supporting Ultra Ethernet at 400G.
Both Ultra Ethernet and UALink aim to challenge Nvidia's dominant position in AI infrastructure, but the ultimate outcome remains to be seen.