Oracle recently joined the Open Cloud Compute (OCC) project and is collaborating with People+ai and its partners to strengthen India’s compute ecosystem and support the country’s digital growth.
Under the OCC project, a network of interoperable micro data centres will be established across the country to meet the growing demand for compute, offering faster processing, lower latency, and improved data sovereignty.
“Many compute providers in India are smaller than major global cloud providers,” said Pramod Varma, former chief architect of Aadhaar, adding that with OCC, India can create a network of hundreds of smaller players that together function like a mega network.
“OCC can benefit from Oracle’s distributed cloud capabilities like speeding up project execution, providing greater transparency, delivering enterprise-grade efficiency and unlocking innovation through AI and analytics in customer data centres or on the public cloud,” said Shailender Kumar, senior vice president, Oracle India.
Oracle’s Love for India
Oracle is witnessing significant demand in India, with over 600 customers across various sectors. These include notable names such as UPI, Flipkart, Apollo Hospitals, Fortis, Ola, ICICI Bank, Federal Bank, Wipro, PwC, Meesho, and Genpact.
“We run core banking systems for banks in India, handling every transaction in the core banking ledger. We manage hotel systems, from check-in to room key card taps and beyond. We run hospitality management systems, point of sales terminals, and utilities, including customer care and billing,” said Chris Chelliah, senior vice president of Oracle, JAPAC.
“We have got a fundamentally different network, or cloud, that can build the largest clusters of these GPUs working together,” he said, adding that OCI networking offers secure, low-latency, and high-performance connections within a virtual cloud network. This enables users to experience on-premises performance in the cloud.
Oracle Cloud Infrastructure incorporates RDMA (Remote Direct Memory Access) to improve the performance of its services. RDMA in OCI allows for faster data transfer rates and lower latency, making it ideal for high-performance computing (HPC), AI, and other data-intensive applications.
“Oracle commercialised RDMA in 2008, and we are 14-15 years ahead in this technology,” said Chelliah, adding that even four or five years ago, most people did not know about RDMA.
Moreover, he explained that with OCI, customers can fine-tune existing models. For example, he said people could use OCI to build AI startups in India, such as developing an Indic LLM specifically for the country’s legal system.
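Chelliah did not describe the tooling, but a fine-tuning workflow of the kind he mentioned typically follows a pattern like the minimal sketch below. It assumes an open base model and the Hugging Face transformers and datasets libraries running on GPU instances; the model name, corpus path, and hyperparameters are illustrative placeholders, not anything Oracle has published.

```python
# Minimal sketch: fine-tuning an open base model on a legal-text corpus.
# Model name, dataset path, and hyperparameters are illustrative only.
from datasets import load_dataset
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer,
                          TrainingArguments)

BASE_MODEL = "meta-llama/Llama-2-7b-hf"  # hypothetical choice of base model

tokenizer = AutoTokenizer.from_pretrained(BASE_MODEL)
tokenizer.pad_token = tokenizer.eos_token
model = AutoModelForCausalLM.from_pretrained(BASE_MODEL)

# Plain-text corpus of Indian legal documents, one example per line (assumed path).
dataset = load_dataset("text", data_files={"train": "indic_legal_corpus.txt"})

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=1024)

tokenized = dataset["train"].map(tokenize, batched=True, remove_columns=["text"])

trainer = Trainer(
    model=model,
    args=TrainingArguments(
        output_dir="indic-legal-llm",
        per_device_train_batch_size=2,
        gradient_accumulation_steps=8,
        num_train_epochs=1,
        learning_rate=2e-5,
        bf16=True,  # assumes GPUs with bfloat16 support
    ),
    train_dataset=tokenized,
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
```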
He further said that Oracle offers customers choice in the cloud. “We’re providing customers with a single database that can store structured data, unstructured data, document data, relational data, and graph data, and can access your object storage data,” said Chelliah.
Oracle’s Autonomous Database allows users to rapidly build new features using SQL, JSON documents, graphs, geospatial data, text, and vectors in a single database.
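The article does not show how this looks in practice, but the “many data types, one database” idea can be illustrated with a hedged sketch using the python-oracledb driver: a relational column and a JSON document column sit in the same table and are queried in a single SQL statement. The connection details, table, and data below are invented for illustration.

```python
# Sketch: relational and JSON data side by side in one Oracle table.
# Connection details, table name, and data are illustrative only.
import oracledb

conn = oracledb.connect(user="demo", password="demo_pw", dsn="localhost/freepdb1")
cur = conn.cursor()

cur.execute("""
    CREATE TABLE orders (
        id        NUMBER PRIMARY KEY,
        customer  VARCHAR2(100),                       -- relational column
        details   VARCHAR2(4000) CHECK (details IS JSON)  -- JSON document column
    )""")

cur.execute(
    "INSERT INTO orders VALUES (:1, :2, :3)",
    [1, "Flipkart", '{"items": [{"sku": "A-42", "qty": 3}], "priority": "high"}'],
)
conn.commit()

# One SQL statement mixes a relational column with JSON path expressions.
cur.execute("""
    SELECT customer, JSON_VALUE(details, '$.priority')
    FROM   orders
    WHERE  JSON_VALUE(details, '$.items[0].qty' RETURNING NUMBER) > 1""")
print(cur.fetchall())
```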
Indian Cloud Providers
In India, Oracle faces stiff competition from Microsoft Azure, AWS, and local cloud players like Yotta, E2E Networks, and Krutrim AI Cloud. “We can’t force customers. We’re here to provide a platform where customers can bring in and train their models,” said Chelliah, adding that at the end of the day, customers will deploy based on a variety of factors.
Meanwhile, Ola CEO Bhavish Aggarwal recently fulfilled his promise of moving all AI infrastructure workloads from Microsoft Azure to Krutrim AI Cloud, citing high costs.
Oracle believes it is way ahead of its competitors. “One of them [the competitors] is very API dependent. One of them is all about ‘my model is the best’. Another says, ‘I have a kitchen where you can cook anything. Just bring your ingredients’,” said Chelliah.
According to Chelliah, Musk’s favourite, Oracle, is growing 50-60% faster than its competitors. He added that although its competitors have been around for 20 years, only 30% of workloads have moved to the cloud.
Cohere for the Win
Further, Chelliah said that unlike other players, Oracle is not betting on building its own LLMs, as doing so requires a huge amount of money and computation. “[Even] If I train this model on everything in the Encyclopaedia Britannica and it has read every bit of William Shakespeare, it’s completely useless in solving a manufacturing problem,” he said.
Oracle has a strong partnership with Cohere, and recently added generative AI capabilities to the Oracle Fusion Cloud Applications Suite, which consists of applications designed to manage various aspects of a company, including finance, human resources, supply chain, sales, marketing, and customer service.
Meanwhile, Cohere recently announced Aya 23, a family of generative LLMs with open weights, available in 8-billion and 35-billion parameter versions.
These models cover 23 languages: Arabic, Chinese, Czech, Dutch, English, French, German, Greek, Hebrew, Hindi, Indonesian, Italian, Japanese, Korean, Persian, Polish, Portuguese, Romanian, Russian, Spanish, Turkish, Ukrainian, and Vietnamese.
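Because the weights are open, the smaller checkpoint can be run with standard open-source tooling. The sketch below assumes the Hugging Face transformers library, a GPU, and the repository id CohereForAI/aya-23-8B; access to the weights may require accepting the model licence on Hugging Face.

```python
# Sketch: running the 8B Aya 23 checkpoint locally with transformers.
# The repository id is an assumption; adjust it to the published weights.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "CohereForAI/aya-23-8B"

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForCausalLM.from_pretrained(
    MODEL_ID, torch_dtype=torch.bfloat16, device_map="auto"
)

# Hindi prompt, one of the 23 supported languages.
messages = [{"role": "user", "content": "भारत की राजधानी क्या है?"}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```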