‘Most Data Centres Are Not Ready for Liquid Cooling’, says Oracle Exec on NVIDIA Blackwell

Oracle Cloud Infrastructure (OCI) is bringing NVIDIA’s Blackwell Ultra GPUs to its cloud platform, a move announced at the GTC 2025 AI conference. While this expands OCI’s capabilities, it also demands new infrastructure, such as liquid cooling in its data centres, which brings its own challenges. 

“Most data centres are not ready for liquid cooling,” said Karan Batta, senior vice president at OCI, in an exclusive interview with AIM, acknowledging the complexity of managing the heat produced by the new generation of GPUs.

He added that cloud providers must choose between passive or active cooling, full-loop systems, or sidecar approaches to integrate liquid cooling effectively. Batta further noted that while server racks follow a standard design (and can be copied from NVIDIA’s setup), the real complexity lies in data centre design and networking. 

Batta explained that today, every cloud provider essentially buys a rack from NVIDIA. “The differentiation comes from the data centre design—how hot you can run these GPUs and how much you can scale them,” he said, adding that ensuring the highest uptime and minimising failures is critical. 

“The biggest challenge is not deploying the GPUs—anyone can do that—but actually managing and operating a massive GPU cluster,” Batta said.  

Built on the Blackwell architecture introduced last year, Blackwell Ultra features the NVIDIA GB300 NVL72 rack-scale solution and the NVIDIA HGX B300 NVL16 system. The GB300 NVL72 delivers 1.5 times the AI performance of the NVIDIA GB200 NVL72.

Last year, Oracle also announced the launch of the world’s first zettascale cloud computing clusters powered by NVIDIA Blackwell GPUs. These clusters scale up to 131,072 GPUs and deliver 2.4 zettaFLOPS of peak performance.

Batta added that NVIDIA’s DGX Cloud offering is also hosted on Oracle Cloud Infrastructure. “As we launch GB200 this quarter and later GB300, DGX Cloud will continue to run on our infrastructure,” he said.  

Additionally, Batta mentioned that Oracle is collaborating with other cloud service providers, such as Google and Microsoft Azure, to establish multi-cloud partnerships at the infrastructure level by deploying OCI within their data centres.

“We’re already doing a lot with Microsoft by integrating Oracle databases and various other services. With Google, it also makes sense because their customer base is different from ours—there’s no overlap,” said Batta, adding that this leaves room for collaboration, especially since Google has a strong AI model, Gemini.

On compute needs, he said demand is not going to slow down. “It will only increase as customers find more use cases and more inferencing to do,” Batta said. Oracle’s strategy is to be an open cloud provider that offers a wide variety of AI models rather than favouring any specific one. “We are already collaborating with OpenAI, Meta, and Cohere, and we continuously update our offerings with the latest versions.”

OCI x NVIDIA AI Enterprise 

Oracle has partnered with NVIDIA AI Enterprise, allowing customers to accelerate AI adoption, including sovereign AI initiatives. This cloud-native software platform will be available across OCI’s distributed cloud and purchasable using Oracle Universal Credits.

Batta said Oracle customers can now access the NVIDIA AI Enterprise suite within Oracle Cloud. He explained that customers do not need to purchase it separately; instead, they can use their existing Oracle Cloud credits to access it.

Unlike other NVIDIA AI Enterprise offerings, OCI will make it accessible directly through the OCI Console, enabling faster deployment, direct billing, and customer support. 

Customers can use over 160 AI tools, including NVIDIA NIM microservices, to streamline generative AI model deployment. The integration allows enterprises to build applications and manage data across multiple deployment environments.

Batta said that for Oracle, ‘distributed cloud’ refers not just to the commercial cloud but also to environments such as OCI’s public regions, Government Clouds, sovereign clouds, OCI Dedicated Region, Oracle Alloy, OCI Compute Cloud@Customer, and OCI Roving Edge Devices.

He further added that Nomura Research Institute (NRI), one of the largest financial system integrators in Japan, uses Oracle Alloy to deliver customised cloud services with NVIDIA Hopper GPUs and plans to deploy NVIDIA AI Enterprise to support AI use cases. 

“Half of the Nikkei Index runs through their books, and all of that operates on Dedicated Region—one in Tokyo and a disaster recovery site in Osaka. They are also deploying GPUs and have access to the NVIDIA AI Enterprise suite of software as well,” Batta said. 

Speaking of India, he said that Oracle already has two cloud regions in India and is building a third. 

“We also have a multi-cloud strategy in India, where we are partnering with AWS, Google, and Microsoft to interconnect our regions and provide our database services through these cloud providers,” he concluded.


Siddharth Jindal

Siddharth is a media graduate who loves to explore tech through journalism, putting forward ideas worth pondering in the era of artificial intelligence.