PC maker Dell and Advanced Micro Devices (AMD) have partnered to produce new generative AI servers capable of running Large Language Models (LLMs) like OpenAI's ChatGPT. Dell's new AMD-powered server joins the company's high-performance computing portfolio for AI workloads.

Dell's new generative AI server is called the PowerEdge XE9680, and it fills a similar role to the company's Nvidia-powered servers, giving customers more variety in their AI infrastructure choices.

The PowerEdge XE9680 is powered by eight AMD Instinct MI300X accelerators that provide a combined 1.5TB of HBM3 memory and more than 21 petaFLOPS of performance. This hardware makes the XE9680 well suited to training and running in-house LLMs, a market that has proven highly profitable for other hardware makers such as Nvidia.
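The headline memory figure follows directly from the per-accelerator spec. A back-of-the-envelope sketch, assuming AMD's published figure of 192GB of HBM3 per MI300X:

```python
# Rough check of the XE9680's aggregate memory spec.
# Assumption: 192 GB of HBM3 per MI300X (AMD's published figure),
# eight accelerators per chassis.
GPUS_PER_SERVER = 8
HBM3_PER_GPU_GB = 192

total_gb = GPUS_PER_SERVER * HBM3_PER_GPU_GB
total_tb = total_gb / 1024

print(f"{total_gb} GB total HBM3, i.e. {total_tb:.1f} TB")  # 1536 GB, 1.5 TB
```

That pooled 1.5TB is what lets a single chassis hold the weights of very large models in GPU memory without sharding across machines.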

Dell's latest product stands out for its scalability, achieved through AMD's Global Memory Interconnect (xGMI) standard, which lets customers expand deployed systems efficiently. The server's AMD GPUs can also be linked over an Ethernet-based AI network using the Dell PowerSwitch Z9664F-ON. This offering follows Dell's earlier introduction of a model equipped with Nvidia H100 GPUs.

Dell has also introduced the Dell Validated Design for Generative AI with AMD, a reference standard designed to help organizations implement their own hardware and network infrastructure for large language models (LLMs). Central to this standard is AMD's ROCm software stack, which gives developers drivers, development toolkits, and APIs compatible with AMD Instinct accelerators in a single open-source package.
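In practice, mainstream AI frameworks target ROCm through the same interfaces they use for CUDA: PyTorch's ROCm builds, for example, reuse the `torch.cuda` namespace and expose `torch.version.hip` to identify the backend. A minimal sketch of detecting which backend a PyTorch installation targets (the `describe_backend` helper is hypothetical, written here for illustration):

```python
def describe_backend(hip_version, device_available):
    """Summarize which GPU backend a PyTorch build targets.

    hip_version mirrors torch.version.hip: a version string on ROCm
    builds, None on CUDA builds. device_available mirrors
    torch.cuda.is_available().
    """
    backend = "ROCm" if hip_version else "CUDA"
    if device_available:
        return f"{backend} device ready"
    return f"{backend} build, no device visible"


if __name__ == "__main__":
    try:
        import torch
        # On ROCm builds PyTorch reuses the torch.cuda namespace, so the
        # same calls cover AMD Instinct accelerators.
        print(describe_backend(torch.version.hip, torch.cuda.is_available()))
    except ImportError:
        print("PyTorch not installed")
```

Because the API surface is shared, code written against `torch.cuda` generally runs unmodified on an MI300X-equipped system with a ROCm build of PyTorch.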

Dell is stepping away from Nvidia's networking approach by working with the Ultra Ethernet Consortium (UEC), which advocates the use of open Ethernet for AI. This stance, shared with AMD, promotes an open Ethernet environment for AI applications, fostering compatibility across switches from multiple providers within a unified system.

Dell’s strategy is designed to encourage companies to adopt a holistic and open model, integrating computing, network, and storage components, which are vital for the successful deployment of internal generative AI applications.