AMD Makes NVIDIA GPUs EPYC

Even as competition soars between the two leading chip manufacturers, AMD and NVIDIA, a new partnership is quietly emerging between the two rivals in the AI and computing world. Few realise that AMD’s EPYC CPUs are critical in powering NVIDIA’s GPUs for large-scale AI workloads.

Case in point: As AI models grow in complexity, the demand for both GPU and CPU performance is increasing, with AMD CPUs showing significant potential to enhance NVIDIA’s GPUs.  

“We’ve shown a 20% improvement in training and 15% improvement in inference when connecting EPYC CPUs to NVIDIA’s H100 GPUs,” said Ravi Kuppuswamy, senior vice president and general manager at AMD. At the Advancing AI 2024 event, Kuppuswamy cited Llama 3.1 inference running on eight H100 GPUs, where the CPU added significant value in large-scale GPU clusters.

He added that this is a joint collaboration between AMD and NVIDIA, where they have identified the best EPYC CPUs to optimise the configuration between CPU and GPU. 

AMD Makes NVIDIA GPUs EPYC

AIM noted this interesting synergy as the chip manufacturers partnered to integrate AMD’s EPYC CPUs into NVIDIA’s HGX and MGX GPU systems. It optimises AI and data center performance by leveraging AMD’s high-core processors alongside NVIDIA’s parallel computing GPUs, while promoting open standards for greater flexibility and scalability.

“We don’t want to force choices on our customers… We will continue to push open standards and interoperate with vendors across the industry,” said Madhu Rangarajan, corporate VP at AMD, EPYC Products. He emphasised AMD’s open approach, supporting diverse customer needs.

AMD’s 5th-generation EPYC processors, paired with NVIDIA’s HGX and MGX GPU clusters, are expected to take data center and enterprise performance to the next level. “Even the fiercest of rivals can come together when it benefits their customers,” said AMD, underscoring the importance of this partnership in advancing AI and high-performance computing.

Previously, AMD claimed its EPYC processors deliver twice the performance of NVIDIA’s Grace Hopper Superchip across multiple data center workloads, showcasing significant advantages in general-purpose computing and energy efficiency. “Our EPYC processors provide a lower total cost of ownership due to their performance, energy efficiency, and extensive x86-64 software compatibility,” the company said, highlighting how NVIDIA’s Arm-based CPUs lag behind AMD’s Zen 4 EPYC processors in non-AI workloads.

“This is good news, as NVIDIA is currently using the vastly inferior Intel Xeon in its systems,” a Reddit user posted, suggesting that AMD should leverage its strengths to capture a larger share of the HPC CPU market.

After NVIDIA, AMD Partners with Intel 

In an unlikely turn of events, AMD today partnered with Intel to create an x86 ecosystem advisory group, bringing together technology leaders to shape the future of the world’s most widely used computing architecture.

“We are on the cusp of one of the most significant shifts in the x86 architecture and ecosystem in decades – with new levels of customisation, compatibility and scalability needed to meet current and future customer needs,” said Pat Gelsinger, CEO of Intel. 

AMD’s chief, Lisa Su, said that this collaboration brings the industry together to pave the way for future architectural enhancements and extend the success of x86 for decades. “Establishing the x86 Ecosystem Advisory Group will ensure that the x86 architecture continues evolving as the compute platform of choice for developers and customers,” she added. 

Both AMD and Intel believe that x86 is still relevant in the era of AI. An AMD executive told AIM at the Advancing AI 2024 event that the company was the first to bring neural processors to the x86 environment.
“In the x86 world, we introduced those first neural processors in 2023 with a product we call ‘Phoenix Point’ delivering 10 TOPS of neural processing performance and enabling several workloads, such as Windows Studio effects and many other third-party ISVs that were supporting those early chatbots and assistants on the device,” shared the executive.


Shalini Mondal

Shalini is a senior tech journalist, exploring the latest advancements in AI. When she's not reporting on the latest innovations, you can find her immersed in her next literary adventure.