Interviews and Discussions – Analytics India Magazine

Why GenAI is India’s High-Performance, Low-Cost Game Changer https://analyticsindiamag.com/ai-features/why-genai-is-indias-high-performance-low-cost-game-changer/ Mon, 23 Dec 2024 05:56:07 +0000 https://analyticsindiamag.com/?p=10144143 AWS’s Satinder Pal Singh points out that while 99% of employers envision their companies becoming AI-driven organisations by 2028, nearly 79% struggle to find skilled professionals.]]>

Generative AI is at the forefront of India’s tech evolution, delivering great performance without the steep price tag. This has been possible thanks to some big players who continue to up their game. With resilient infrastructure, cutting-edge technology, and strategic investments, cloud giant AWS is helping Indian businesses scale AI while redefining price-performance benchmarks. 

Satinder Pal Singh, head of solution architecture for AWS India and South Asia, leads a team of architects tasked with helping customers design scalable, efficient solutions using AWS services. During an in-depth conversation with AIM at AWS re:Invent 2024 held in Las Vegas recently, Singh shared insights into AWS’s strategic focus on India, customer needs, and groundbreaking launches that promise to redefine cloud technology in the region.

“India is a significant country for us,” Singh began, emphasising AWS’s extensive infrastructure in the country. With two operational regions—Mumbai and Hyderabad—spanning three availability zones, 33 points of presence, and nine direct connect locations, Singh noted, “We provide extensive capabilities to our customers, enabling them to leverage more than 240 services—from infrastructure to analytics, IoT, and machine learning—while keeping their data within India.”

Custom Chips: Redefining Price-Performance

AWS’s investments in custom-built chips—Graviton, Trainium, and Inferentia—have been a game-changer for its customers. Graviton chips, designed for general workloads, have already delivered significant benefits. Companies like Zomato and Paytm reported reduced costs by up to 30% and improved performance by 20-35%, a testament to the efficiency of these chips.

Trainium and Inferentia, on the other hand, cater specifically to AI workloads. Trainium delivers a 64% performance boost compared to previous offerings, while Inferentia is optimised for inference, helping global clients save up to 40% on costs. Singh acknowledged that these advancements were critical for Indian enterprises navigating the challenges of scaling AI while managing budgets.

Upskilling India: AWS AI-Ready Program

One of the most pressing challenges for Indian organisations is the AI talent gap. Singh cited a startling statistic—while 99% of employers envision their companies becoming AI-driven organisations by 2028, nearly 79% struggle to find skilled professionals. To address this, AWS launched the AI-Ready Program, a comprehensive initiative to upskill over 2 million individuals globally by 2025. 

In India alone, over 5.9 million individuals have been trained in cloud services since 2017. AWS’s investment in India’s local cloud infrastructure is projected to reach $16.4 billion by 2030, supporting 131,700 jobs annually and contributing $23.3 billion to India’s GDP by 2030.

Why Mumbai and Hyderabad?

The decision to establish data centres in Mumbai and Hyderabad was driven by performance and reliability considerations while providing multi-region availability. AWS’s infrastructure is built with resilience at its core, comprising multiple availability zones, which are essentially isolated data centres capable of functioning independently.

This design has helped AWS customers achieve exceptional reliability. For instance, ANI Technologies, the parent company of ride-hailing platform Ola, utilised this setup to create a robust failover mechanism, ensuring uninterrupted services even during system failures.

Singh explained that such resilience is crucial for mission-critical applications, especially in industries like finance and healthcare.

He elaborated on the layered design of AWS’s infrastructure: “Each region consists of multiple availability zones. Think of one availability zone as one or more isolated data centres. If one zone fails, workloads can shift seamlessly to another.” This architecture ensures resilience even in extreme scenarios like natural disasters.
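To make the failover idea concrete, here is a minimal, generic client-side sketch of shifting traffic to whichever zone is healthy. It is purely illustrative and not AWS's actual routing machinery (which typically lives in managed services such as load balancers and DNS); the zone endpoint URLs are hypothetical placeholders.

```python
# Generic zone-failover sketch (illustrative only; endpoints are hypothetical placeholders).
import urllib.request
import urllib.error

ZONE_ENDPOINTS = [
    "https://app.zone-a.example.internal/health",
    "https://app.zone-b.example.internal/health",
    "https://app.zone-c.example.internal/health",
]

def first_healthy_endpoint(endpoints, timeout=2):
    """Probe isolated zones in order and return the first one that answers."""
    for url in endpoints:
        try:
            with urllib.request.urlopen(url, timeout=timeout) as resp:
                if resp.getcode() == 200:
                    return url
        except (urllib.error.URLError, OSError):
            continue  # zone unreachable: shift traffic to the next one
    raise RuntimeError("No healthy zone available")

# Usage: direct traffic to whichever zone is currently healthy.
# active = first_healthy_endpoint(ZONE_ENDPOINTS)
```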

Generative AI, a key area of focus for AWS, has witnessed increasing interest from Indian enterprises. Singh highlighted two major innovations announced at re:Invent: model distillation and advancements in automated reasoning.

The model distillation technique, offered via Amazon Bedrock, allows smaller, cost-efficient models to inherit the learnings of larger, more accurate ones, significantly lowering operational costs without compromising on precision.
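As an illustration of the underlying idea (a smaller 'student' model trained to mimic a larger 'teacher'), here is a minimal, generic PyTorch-style sketch. It is not Amazon Bedrock's managed distillation API; the teacher, student, optimiser and batch are placeholders assumed for the example.

```python
# Generic knowledge-distillation sketch -- illustrative only, not Bedrock's API.
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=2.0, alpha=0.5):
    # Soft targets: the student matches the teacher's temperature-softened distribution.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)
    # Hard targets: the student still learns from ground-truth labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1 - alpha) * hard

def train_step(student, teacher, batch, optimizer):
    inputs, labels = batch
    with torch.no_grad():            # the larger teacher model stays frozen
        teacher_logits = teacher(inputs)
    student_logits = student(inputs)
    loss = distillation_loss(student_logits, teacher_logits, labels)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```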

Another critical advancement is AWS’s approach to minimising hallucination in AI models using automated reasoning. Singh emphasised that this enhancement ensures model reliability, making it an ideal solution for industries like insurance and healthcare, where accuracy is non-negotiable. 

These innovations are a direct response to the needs of Indian customers, who often demand high performance at lower costs. Singh detailed AWS’s use of automated reasoning to enhance model reliability. “With this, businesses like insurance companies can rest assured that their models provide accurate responses, eliminating the risk of incorrect outputs,” he said.
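For a rough intuition of what such output checking can look like, here is a deliberately simplified sketch: a model's draft answer is released only if it is consistent with explicit, machine-checkable rules. Real automated reasoning relies on formal logic rather than hand-written Python checks, and the policy rules and claim fields below are invented for illustration; none of this is AWS's implementation.

```python
# Simplified illustration of checking a model's answer against explicit rules
# (not AWS's automated reasoning; rules and fields are invented for the example).
from dataclasses import dataclass

@dataclass
class Claim:
    policy_active: bool
    damage_type: str
    claimed_amount: float
    coverage_limit: float

POLICY_RULES = [
    ("policy must be active", lambda c: c.policy_active),
    ("damage type must be covered", lambda c: c.damage_type in {"fire", "flood", "theft"}),
    ("claim cannot exceed coverage limit", lambda c: c.claimed_amount <= c.coverage_limit),
]

def verify_answer(claim: Claim, model_says_payable: bool) -> str:
    violations = [name for name, rule in POLICY_RULES if not rule(claim)]
    if model_says_payable and violations:
        # The generated answer contradicts the rules, so it is blocked rather than returned.
        return "Blocked: answer violates rules: " + ", ".join(violations)
    return "Approved: answer is consistent with the policy rules"

print(verify_answer(Claim(True, "flood", 8000.0, 10000.0), model_says_payable=True))
```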

Driving Innovation Across Industries

AWS’s democratisation of technology has empowered organisations across diverse industries. For instance, manufacturing firms like Apollo Tyres are leveraging AWS services and reported a 9% increase in productivity, while banks like Axis and HDFC enhanced customer experiences with data-driven insights. 

Singh emphasised AWS’s commitment to innovation while ensuring inclusivity. “We democratise technology so that SMBs have the same access to cutting-edge capabilities as large enterprises,” he said. 

The importance of AWS’s partner ecosystem cannot be overstated. Globally, AWS collaborates with over 140,000 partners, with a significant presence in India. Companies like HCLTech and Persistent Systems are leveraging AWS’s capabilities to deliver innovative solutions to their clients.

Meet Ishit Vachhrajani, Global Head of Enterprise Strategy at AWS https://analyticsindiamag.com/videos/meet-ishit-vachhrajani-global-head-of-enterprise-strategy-at-aws/ Wed, 04 Dec 2024 06:34:05 +0000 https://analyticsindiamag.com/?p=10142426 Meet Ishit Vachhrajani with a discussion on how generative AI is revolutionising industries like healthcare, uncovering transformative use cases and innovations shaping the future.]]>
Meet Ishit Vachhrajani with a discussion on how generative AI is revolutionising industries like healthcare, uncovering transformative use cases and innovations shaping the future.
OpenAI’s Internal Progress Toward AGI with Pragya Misra https://analyticsindiamag.com/videos/openais-internal-progress-toward-agi-with-pragya-misra/ Sat, 30 Nov 2024 15:20:35 +0000 https://analyticsindiamag.com/?p=10142097 Meet Pragya Misra, sharing her insights into OpenAI's internal progress toward AGI. Learn what a typical week at OpenAI looks like, and the experience of working with Sam Altman.]]>
Meet Pragya Misra, sharing her insights into OpenAI’s internal progress toward AGI. Learn what a typical week at OpenAI looks like, and the experience of working with Sam Altman.
Granite 3.0 is IBM’s Love Letter to Indian Developers https://analyticsindiamag.com/ai-features/granite-3-0-is-ibms-love-letter-to-indian-developers/ Fri, 22 Nov 2024 12:00:00 +0000 https://analyticsindiamag.com/?p=10141464 “India’s developer ecosystem is vibrant and growing,” Vishal Chahal said, “and we are committed to supporting it every step of the way.”]]>

Small language models and AI agents are the talk of the AI town. With the release of the latest Granite 3.0 and Bee Agent Framework, IBM has shifted its focus towards smaller and more efficient models while maintaining transparency to a level previously unheard of in the AI industry.

A few days after the launch of Granite 3.0, Armand Ruiz, VP of product–AI Platform at IBM, publicly disclosed the datasets on which the model was trained. “This is true transparency. No other LLM provider shares such detailed information about its training datasets,” said Ruiz.

This is a practice IBM has adhered to even in the past with new model releases. Vishal Chahal, VP of IBM India Software Lab, told AIM that it is important for companies to be fully open about their language models and the data they’ve used to build them. 

Despite its smaller size, Granite 3.0 has been performing exceptionally well in tasks like coding and making small chatbots. The model is applicable across diverse domains, from healthcare to finance, demonstrating its adaptability. It appears open-source models are finally catching up with their closed-source counterparts.

“You can start small,” Chahal highlighted, sharing that Granite 3.0 can even run on a Mac. This flexibility allows businesses to scale as their needs grow without initially needing extensive GPU clusters.

The open-source philosophy extends beyond Granite’s datasets to its availability on platforms like GitHub, Hugging Face, and IBM WatsonX. IBM’s AI alliance with partners such as Meta, LlamaIndex and Ollama further broadens accessibility for developers worldwide.

Open Source–The Backbone of Granite’s Success in India 

India’s developer ecosystem has embraced Granite 3.0 wholeheartedly. Thousands of developers are already leveraging it, drawn by its performance and versatility. Beyond individual developers, significant collaborations are happening at the institutional level. “India’s developer ecosystem is vibrant and growing,” Chahal said, “and we are committed to supporting it every step of the way.”

“Institutions like IIT Madras, IIT Bombay, and IIT Jodhpur have become part of IBM’s AI alliance,” Chahal shared. These institutions, along with initiatives like AI4Bharat, are using Granite to create benchmarks for Indian languages like MILU, which was released last month. 

Major players like Infosys, KissanAI, Wadhwani AI, and Sarvam AI have also integrated Granite into their AI ecosystems, demonstrating its relevance to startups and enterprises alike. This collaboration is fostering innovation tailored to India’s unique needs, from agriculture to vernacular language processing.

Chahal explained that many startups are using Granite alongside other models, such as Meta’s Llama. “Granite is exceptional for tasks like coding instructions,” he said, “but a multi-model approach ensures startups can pick the best model for each use case.” IBM’s WatsonX platform facilitates this by hosting both IBM’s and partner models, allowing developers to choose the most suitable tools for their needs.

Chahal said that IBM plans to release a multimodal model soon, which will include speech and video models.

Beyond just language models, at its recent TechXchange conference, IBM unveiled advancements in AI agentic capabilities through frameworks like the Bee Agent and MARC (Multi-Agent Resource Coordination) frameworks. “We are officially in the agentic world,” Chahal declared.

IBM offers multiple ways to build agents, from WatsonX and Watson Orchestrate to WatsonX Flows, catering to diverse business requirements. “Soon, we will see a multi-agent world, where hundreds of agents work collaboratively across enterprises,” he predicted.

Meanwhile, IBM’s Bee Agent Framework is designed to help developers create powerful agents with minimal adjustments to existing implementations, and IBM is actively working to optimise it for other popular LLMs.

The framework includes key features for building versatile agents. Its built-in Bee agent, configured for Llama 3.1, is ready to use, though users can also customise their own agents using built-in tools in JavaScript or Python. 

Supporting the Developers

“For developers, trust is paramount. A model should transition seamlessly across the development, testing, and production environments without requiring constant adjustments,” said Chahal, adding that the ability to retrain models efficiently is also critical for developers. 

Retraining should not require duplicating models and consuming excessive resources. 

At IBM Red Hat’s InstructLab, the team has developed an open-source tool that allows developers to retrain models using a single GPU cluster. By adding knowledge layers tailored to industry-specific needs, this tool eliminates the need for multiple copies and streamlines the retraining process. For developers, this solves a significant challenge: managing infrastructure while scaling capabilities.

“We aim to lower the barriers for developers by providing access to smaller, efficient models that work even on a single GPU or CPU cluster,” said Chahal, while adding that beyond AI models, developers also need robust data tools and pipelines. This is where IBM’s research software and systems labs collaborate with colleges and universities to share these tools and empower developers with end-to-end capabilities.

In July, IBM held an International GenAI Conclave in Kochi, where college students competed in a hackathon, and winners were recognised by dignitaries. IBM also hired some of these developers, demonstrating its commitment to nurturing talent.

The company also launched a GenAI Innovation Centre in Kochi, which allows colleges, universities, startups, and partners to experiment with business use cases and learn how GenAI is applied in real-world scenarios. This centre will soon be replicated in other locations, creating open zones for innovation.

“Building in India, for India, Bharat, and the world,” Chahal said, adding that IBM wants Indian developers to aspire to have a global impact. With IBM’s support, they can create solutions that not only address local challenges but also have global relevance.

Interview with Amit Kapur (How Lowe’s Improved OpenAI’s Models) https://analyticsindiamag.com/videos/interview-with-amit-kapur-how-lowes-improved-openais-models/ Wed, 20 Nov 2024 12:48:25 +0000 https://analyticsindiamag.com/?p=10141315 ]]>
Learn how Lowe’s partnership with OpenAI fuelled an AI-driven transformation in the retail industry with Amit Kapur.
Why to Build Indic AI with Tanuj Bhojwani (PeoplePlusAI head) and Vishnu Subramanian (Founder of JarvislabsAI) https://analyticsindiamag.com/videos/why-to-build-indic-ai-with-tanuj-bhojwani-peopleplusai-head-and-vishnu-subramanian-founder-of-jarvislabsai/ Wed, 20 Nov 2024 12:34:43 +0000 https://analyticsindiamag.com/?p=10141312 Meet Tanuj Bhojwani (PeoplePlusAI head) and Vishnu Subramanian (Founder of JarvislabsAI) to learn why to build an Indic AI & how it is important for Bharat.]]>
Meet Tanuj Bhojwani (PeoplePlusAI head) and Vishnu Subramanian (Founder of JarvislabsAI) to learn why to build an Indic AI & how it is important for Bharat.
What is Nikhil Malhotra’s Dream AI? https://analyticsindiamag.com/ai-features/what-is-nikhil-malhotras-dream-ai/ Thu, 14 Nov 2024 08:48:48 +0000 https://analyticsindiamag.com/?p=10140945 Dream AI aspires to create agents that “dream” by simulating environments and learning with a nuanced understanding of context, instead of relying on vast datasets alone.]]>

AI is a compute-hungry beast, pushing companies to make better hardware and experiment with the existing system architectures to ease the load. In this bid, Nikhil Malhotra, the global head of Makers Lab at Tech Mahindra, coined the idea of ‘Dream AI’.

Deep Reinforced Engagement Model AI, or Dream AI, as Malhotra explains is an architecture combining symbolic AI with deep reinforcement learning, marking a shift from conventional models and addressing fundamental limitations of today’s AI. In his LinkedIn post, Malhotra pointed out three problems with current AI systems.

1. AI of today is just pattern recognition in a narrow domain. It’s not generalised.
2. Autoregressive LLMs are divergent. If they take on a state space, it’s very difficult to bring them back.
3. They do well to pass the Turing test but have no reasoning or context of what they say.

Enter Dream AI

Highlighting the core challenge of current systems, Malhotra explained to AIM that Dream AI builds on a neurosymbolic architecture, drawing from two foundational schools in AI—connectionist (or deep learning) and symbolic (logic and symbols). 

Over time, deep learning architectures like Transformers have gained traction for their remarkable performance. However, as Malhotra explained, they do not inherently understand or reason; they excel at remembering sequences rather than forming world models. 

For instance, a Transformer model might present varying responses to slight prompt changes, which can lead to hallucinations—a symptom of shallow context alignment. 

This is where Dream AI steps in, creating a dual-loop system where symbolic reasoning informs world models while neural networks enable actions within those models. By incorporating symbolic AI, the framework empowers agents to simulate and act based on physical and logical rules, much like human cognition. 

“Symbolism helps us build world models, and Transformers enable actions based on those models,” Malhotra said. Dream AI, thus, aspires to create agents that “dream” by simulating environments and learning with a nuanced understanding of context, instead of relying on vast datasets alone.

The Role of RL in Dream AI

Speaking at Cypher 2024, Malhotra shared that his key research goal is to make AI less compute-intensive. 

He calls this the ‘min-max regret model’, which shifts away from traditional reward models and aims to empower AI to “dream” about its capabilities and understand its existence more profoundly. Malhotra said that this “dreaming model” allows AI to contemplate its own questions and aspirations. 

“Can you dream about yourself? Once you develop your dream, now come back to life and start with the life that you have,” he said, highlighting the physical aspect of existence and how it relates to AI. Drawing parallels between human cognition and AI functionality, Malhotra said that just as humans subconsciously store information in the hippocampus, AI systems will use their ‘memory’ to inform decision-making processes. 

Central to Dream AI is its use of reinforcement learning (RL) to harmonise symbolic reasoning with neural actions. This learning process allows AI agents to interact within simulated environments, learning optimal behaviours through feedback. For example, OpenAI decided to move away from RLHF and shift towards RL for improving the reasoning capabilities of o1.

In this hybrid architecture, reinforcement learning serves as the bridge between symbolic world models—built through simulation engines like the NVIDIA Omniverse—and the actions guided by deep neural networks. As Malhotra noted, “The AI doesn’t merely repeat learned patterns; it refines its decision-making process based on the impact of its actions within a realistic context.”
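A toy sketch of that dual loop, under stated assumptions (hand-written symbolic rules standing in for a world model, a table of action scores standing in for a neural policy, and a scalar reward as the reinforcement signal), might look like the following. It is purely illustrative and not Tech Mahindra's Dream AI code.

```python
# Conceptual dual-loop sketch: symbolic rules constrain actions, a learned policy picks
# among them, and a reward signal refines the policy. Illustrative only.
import random

WORLD_RULES = {
    "move_forward": lambda s: s["path_clear"],    # symbolic constraint: can't walk through walls
    "pick_up":      lambda s: s["object_in_reach"],
    "wait":         lambda s: True,
}

policy_scores = {a: 0.0 for a in WORLD_RULES}     # stand-in for a neural policy's action values

def choose_action(state):
    valid = [a for a, rule in WORLD_RULES.items() if rule(state)]              # symbolic loop
    return max(valid, key=lambda a: policy_scores[a] + random.random() * 0.1)  # "neural" loop + exploration

def learn(action, reward, lr=0.1):
    policy_scores[action] += lr * (reward - policy_scores[action])  # RL feedback bridging the two loops

state = {"path_clear": True, "object_in_reach": False}
for _ in range(20):                                # "dreamed" rollouts in a simulated environment
    a = choose_action(state)
    reward = 1.0 if a == "move_forward" else 0.0   # toy reward for the simulated task
    learn(a, reward)

print(policy_scores)
```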

Likening it to cycling—an instinctual skill honed from childhood that remains stored in our subconscious—he said, “A lot of the data that you collect and a lot of your information still resides at the back of the hippocampus. As a result, you pull out that memory when you have to cycle.”

Efficient and Scalable AI Training

The dual learning approach makes Dream AI a powerful solution for dynamic, complex environments, enabling agents to learn contextually rather than reactively. This is particularly useful in applications where conventional AI struggles, such as real-world robotics, autonomous systems, and other domains where both physical laws and logical reasoning are essential.

By incorporating symbolic models, Dream AI aims to reduce the training burden on AI systems. Rather than relying on enormous datasets, Dream AI uses symbolic structures to “dream” or simulate scenarios, effectively minimising repetitive data input. 

Symbolic AI provides a contextual foundation, accelerating learning and reducing the dependency on real-world data for every scenario. This process not only shortens the training time but also yields models that are resilient and adaptable, allowing for faster deployment and reducing costs significantly.

This is similar to what Amit Sheth, the chair and founding director of the AIISC, also told AIM. “The government is focused on AI for health, cybersecurity, and education as three of the top application areas,” Sheth said. 

That’s what Sheth conveyed to the ministry and Prime Minister Narendra Modi regarding the areas of AI India should prioritise for future investment. While generative AI has caught everyone’s attention, he outlined the relevance of neurosymbolic AI, which he believes will drive the third phase of AI.

A Leap Beyond Traditional AI Architectures

Malhotra’s Dream AI architecture offers multiple advantages over traditional AI models. Its integration of symbolic reasoning with deep learning enables a level of adaptability and contextual awareness often missing in autoregressive models. 

Traditional systems are confined to specific tasks, lacking a broader understanding of real-world contexts. Dream AI, however, allows agents to simulate world views, thereby aligning their actions with physical and logical principles. The dual-learning loop—symbolic reasoning paired with neural-driven action—fosters a self-reinforcing cycle of refinement and understanding.

By reducing reliance on extensive real-world data, Dream AI makes training both more efficient and scalable.

Avathon’s Industrial AI to Power Petrol Pumps, Planes https://analyticsindiamag.com/ai-features/avathons-industrial-ai-to-power-petrol-pumps-planes/ Mon, 11 Nov 2024 06:55:27 +0000 https://analyticsindiamag.com/?p=10140743 The company claims that its AI platform now powers 20% of India’s petrol pumps and ensures safety at over 17,000 retail outlets across the country. ]]>

Avathon, an industrial AI company formerly known as SparkCognition, has rebranded itself with an ambitious plan to triple its workforce in India within two years and transform industrial AI in sectors such as energy, aviation, and supply chain management.

This rebranding underscores Avathon’s focus on advancing legacy infrastructure through AI, turning traditional systems into autonomous, sustainable, and resilient ecosystems. With over $100 trillion of ageing infrastructure globally under strain from supply chain disruptions, workforce shortages, and escalating security threats, Avathon is poised to tackle these challenges as a leader in industrial AI solutions.

The company claims that its AI platform now powers 20% of India’s petrol pumps and ensures safety at over 17,000 retail outlets nationwide. The company is strengthening its India presence to attract premier AI and engineering talent.

Furthermore, Avathon’s AI is contracted to enhance safety at major oil and gas facilities in India, including 83 airport terminals and 15 airport fueling stations. With key partnerships with technology leaders like NVIDIA and Qualcomm, Avathon continues to deliver cost-efficient products tailored to client needs. 

In line with this expansive mission, Avathon’s digital twin technology has been designed to operate beyond isolated factory floors, encompassing the entire supply chain. Pervinder Johar, CEO of Avathon, compared the platform to NVIDIA’s Omniverse, telling AIM, “While many digital twins focus solely on individual factories, our approach includes every step of the supply chain to optimise quality control and problem-solving.”

“We’re bridging the physical and computational aspects of AI,” Johar elaborated. Avathon is now positioned as a robust platform that integrates mechanical engineering advancements with AI innovations, a concept Johar believes is becoming increasingly relevant as industries shift toward intelligent, autonomous systems.

Digital Twin for Aviation Industry

Avathon’s digital twin solution addresses quality inspection issues not only at the factory level but also across tier-one suppliers and beyond. By deploying computer vision and other AI techniques, Avathon helps clients detect defects in aeroplane or car components early in the supply chain, avoiding costly recalls and logistical challenges. “If a part has a flaw, identifying it before it even reaches the assembly line saves time, resources, and ultimately, customer satisfaction,” Johar noted.

The expertise of Kunal Kislay, India president at Avathon, significantly strengthens Avathon’s computer vision and low-code AI platform. Kislay joined the company following the acquisition of his company, Integration Wizards. With a rich background in computer vision and AI engineering, Kislay now helps the company scale AI-driven industrial applications.

Johar shared broader industry insights and examples of Avathon’s impact. He illustrated how clients such as HPCL and IOCL leverage Avathon’s platform for everything from safety monitoring to productivity improvements. For HPCL, Avathon’s computer vision monitors security and safety on-site, identifying potential hazards before they occur. 

Highlighting the economic impact, Johar estimated that Avathon’s solutions have the potential to influence trillions of dollars in asset value across industries. “Just in aviation, our AI can provide predictive maintenance for fleets, allowing airlines to extend the life of their planes, which represents billions in capital investment savings,” he explained.

Even with a marginal cost improvement of 10%, clients can see significant returns on their investments. In sectors like aviation, this could mean substantial operational savings and improved efficiency for airlines managing vast fleets.

AI as a Copilot

Avathon’s emphasis on collaborative AI solutions extends to predictive and prescriptive maintenance for complex assets. The company’s AI copilots provide essential guidance, even for technicians unfamiliar with specific machinery. This capability is particularly valuable in industries like aerospace, where technicians may only specialise in certain types of equipment. 

“Parts may be interchangeable between Boeing and Airbus planes, but often mechanics aren’t trained across both,” Johar pointed out. “Our AI copilots step in to bridge that knowledge gap, saving time and avoiding the need to fly specialists worldwide.”

Looking towards the future, Johar envisions an AI-driven evolution across industries, particularly with the advent of humanoids and physical AI in manufacturing. He referenced the work of Avathon’s joint venture with Boeing, SkyGrid, which focuses on autonomous air traffic control for a future filled with autonomous aircraft. 

“As air traffic grows, we need a system that can manage the skies without relying on human controllers,” he explained. Similarly, Avathon is working toward creating autonomous supply chain planning systems that not only support human planners but could potentially automate decision-making processes entirely, as the global supply of mechanical engineers slows.

Sparking an Idea

Originally founded as SparkCognition 11 years ago, Avathon emerged from the University of Texas at Austin, where it began as a niche AI project led by a UT Austin PhD student and Dr. Bruce Porter, the university’s former computer science dean.

“Back then, AI was just beginning to gain traction. We were ahead of the curve, focusing on applying AI to real-world problems long before AI became the trend it is today,” Johar recalled. This foundation in academia drove a decade of innovation, especially for large Fortune 500 clients, as Avathon carved its niche in the AI-driven infrastructure space.

Over time, Johar realised that its original mission—‘sparking an idea’—was evolving into a more comprehensive vision. With a focus on building a sustainable platform that would serve industries for decades, the company rebranded to Avathon, a name Johar explained as stemming from two Greek words that mean ‘to bind together’.

Why India Needs a Quantum Mission https://analyticsindiamag.com/ai-features/why-india-needs-a-quantum-mission/ Sat, 09 Nov 2024 03:30:00 +0000 https://analyticsindiamag.com/?p=10140722 “The US has banned the export of quantum. They don't want to share the technology, so that's the kind of message that we are getting from many countries, that nobody really wants to share,” said Ajai Chowdhry, co-founder of HCL and Chairman-Mission Governing Board of National Quantum Mission.]]>

In April 2023, India inaugurated its ambitious National Quantum Mission (NQM), representing a critical step forward in the nation’s journey towards harnessing quantum technology. With a target budget of ₹6,000 crore and an eight-year timeline, the mission aims to establish India as a global leader in quantum technology under the leadership of Ajai Chowdhry, co-founder of HCL and the founder and chairman of EPIC Foundation. 

“When it comes to quantum computing, we need to make a quantum computer of 1,000 qubits over the next eight years. With respect to communication, we are supposed to have on-ground quantum communication of 2,500 km, and 2,500 kilometres in space. This way, the targets are well set for each of the areas,” said Chowdhry in an exclusive interaction with AIM.

Establishing India’s Quantum Framework

By January 2024, Chowdhry and his team had established the quantum mission’s framework, beginning with a request for proposals (RFP) to attract expertise and innovative ideas across the quantum spectrum.

“We requested proposals, and within a week of our board meeting, we had already set up calls for proposals,” said the Padma Bhushan awardee. The NQM received an overwhelming response, with 385 applications submitted for review. By August, 84 proposals were approved, covering four primary areas, namely quantum computing, quantum communication, sensors, and materials science.

Ambitious and Aggressive Targets 

Chowdhry referred to the mission’s goals as “very tough”, while emphasising that such high targets are essential to placing India on the quantum technology map. While goals are one part of the mission, the funding has been strategically distributed.

“It was decided that ₹4,000 crore will be directly with the National Quantum Mission and ₹2,000 crore will be spent by others, which include the departments of space, defence, and atomic energy. So, it’s a ₹6,000-crore plan,” he said.

The Need for Quantum

A critical challenge identified by Chowdhry is the need to shift researchers’ focus from academic publications to practical applications. Historically, researchers in India, much like other countries, have prioritised publishing papers over product development. NQM’s approach requires a shift to a goal-oriented mindset that aligns with technological targets rather than purely theoretical research.

“A lot of these large corporates will need quantum technology four to five years from now,” said Chowdhry. “For drug development, there’s nothing better than a quantum computer. It’ll dramatically reduce the time for drug discovery and the cost of drugs, so Indian pharma companies can go far ahead if they start thinking about quantum technologies,” he added.

Chowdhry emphasised the urgency of making India quantum-ready, especially in terms of cybersecurity. As quantum technology advances, traditional encryption methods could become obsolete, making sensitive data vulnerable to breaches. 

“When a quantum computer comes up that can crack current cybersecurity, you will be completely open to an attack, and this can happen maybe four years, five years, six years,” cautioned Chowdhry. “At a time when a powerful quantum computer is ready, it can crack your financial systems, it can crack your security systems.” 

Startup and International Collaboration

Chowdhry believes in the importance of involving startups in the quantum mission, recognising the potential they bring for rapid innovation and commercialisation. He explained that the NQM has already reached out to approximately 45-50 startups and created a groundbreaking startup policy, which grants up to ₹25 crore per startup, significantly higher than the usual government grants. 

“This is absolutely unique in the country. For the first time, we are giving startups up to ₹25 crore as support,” he said, adding that such backing is crucial considering the expensive and resource-intensive nature of quantum research.

Chowdhry also emphasised the promising nature of the Indian startup ecosystem. “There are already about three or four startups that are commercially selling their products, with one having a subsidiary in America,” he said. 

While collaboration with other nations might seem a logical path to accelerate India’s quantum mission, Chowdhry highlighted the limitations of this move. Countries with significant advancements in quantum, like the US, are often reluctant to share details about core technologies due to concerns over security and competition. 

“It’s nice to talk, but difficult to execute. The US has banned the export of quantum. They don’t want to share the technology, so that’s the kind of message that we are getting from many countries, that nobody really wants to share,” he quipped. 

India is negotiating alternative collaboration models, such as joint intellectual property (IP) development. For instance, Chowdhry proposed inviting foreign companies that have achieved intermediate quantum advancements to set up operations in India. This approach could enable India to build on existing innovations and leapfrog into more advanced stages of quantum technology development.

Quantum, AI and India 

In the long term, Chowdhry envisions a synergistic relationship between quantum computing and AI. While quantum computers are not yet advanced enough to significantly enhance AI applications, Chowdhry is optimistic about the progress of this combination in the upcoming years. 

“AI and quantum are going to be a deadly combination,” he said, forecasting that advancements in quantum computing will eventually transform AI capabilities.

Reflecting on his journey from co-founding HCL to spearheading the quantum mission, Chowdhry, commonly referred to as the ‘Father of Indian Hardware’, expressed confidence in India’s potential to lead in technology. He stressed that while software services have fuelled India’s growth over the past two decades, the time has come for India to focus on products.

“If we don’t shift towards a product-oriented mindset, we’ll never be a developed country,” he asserted. “Tomorrow’s people should be in India. This is a growth country,” he concluded. 

Ajai Chowdhry’s vision for NQM underscores India’s aspiration to become a quantum powerhouse, positioning the country among the leaders in one of the most transformative technological fields of the future. 

Lenovo’s ‘Custom-Fit’ Data Centres Push Against Competitors https://analyticsindiamag.com/ai-features/lenovos-custom-fit-data-centres-pushes-against-competitors/ Wed, 30 Oct 2024 12:38:17 +0000 https://analyticsindiamag.com/?p=10139860 “We’re building exascale systems designed to be accessible to any customer, regardless of their data centre setup or size,” said Scott Tease, VP and GM of AI and HPC at Lenovo. ]]>

At the recently held Lenovo Tech World 2024 event, the company showcased its expertise in the concept of ‘Hybrid AI,’ encompassing products and launches focused on personal AI, enterprise AI, and public AI. While Lenovo PCs and AI buddies drew plenty of attention, Lenovo’s continued push in the infrastructure domain was undoubtedly the highlight.

Scott Tease, VP and GM of AI and High-Performance Computing (HPC) at Lenovo, explained what sets Lenovo’s HPC solutions apart from competitors like Dell. “We’re building exascale systems designed to be accessible to any customer, regardless of their data centre setup or size,” he said while interacting with AIM on the sidelines of Tech World in Seattle.

Data Centre for All-Scale Business

Unlike competitors whose HPC solutions often require specialised facilities, Lenovo’s systems are engineered to fit into standard data centre racks and operate with standard power requirements. Tease emphasised that Lenovo’s HPC technology is designed for flexibility, with options for both air and water cooling and open interconnects, making it ideal for regular data centres.

“Our competitors might offer the same building blocks, but they’re often custom-designed for very large systems,” he explained when asked about HPC solutions offered by competitors such as Dell and HPE. 

Lenovo’s approach prioritises accessibility without the need for costly infrastructure changes like reinforced floors or oversized entryways. Tease explains that, with Lenovo’s technology, there’s no need to rebuild data centres or reinforce floors, as the equipment can fit through a standard 2-metre door or freight elevator. In contrast, competing solutions involve 4,000-kilogram racks that are 2.5 metres wide, exceeding typical floor support capabilities.

“We’re designing exascale for every scale,” he added, underscoring Lenovo’s mission to democratise high-end HPC technology.

Sustainable Power to Run the Show

While offering HPC as a major service, Lenovo is also managing to hit sustainability goals. Tease emphasised the importance of reducing power consumption to achieve both Lenovo’s and its clients’ ESG targets. “80% of a device’s carbon impact comes from power usage,” he said, underscoring Lenovo’s focus on lowering energy use to reduce overall emissions.

Lenovo’s sixth-generation Neptune cooling system, the ThinkSystem N1380 Neptune, unveiled at Tech World, is a water-cooling system built to support NVIDIA’s Blackwell platform and AI applications. The system is said to support 100 kW+ server racks while achieving 100% heat removal.

“If one of these large language model nodes costs you 10 kilowatts to run the compute itself, you’re going to spend another 4 kilowatts just for air conditioning. With Neptune, we can do away with all that air movement. We can do away with all the air conditioning and the air movement in the data centre,” said Tease. It could easily save 30-40% of the power bill. 

Tease explained that twelve years ago, Lenovo’s first server node consumed 330 watts of power. Today, the new Grace Blackwell super node uses around 14 times more power but offers nearly 1,000 times the performance of those early systems. This focus on increasing performance per watt has become essential for customer support. 
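A quick back-of-envelope check of the two sets of figures above (illustrative only; actual savings depend on each facility's cooling design):

```python
# Back-of-envelope arithmetic for the quoted figures; not Lenovo's own calculation.
compute_kw, cooling_kw = 10, 4                   # per LLM node, as quoted
cooling_share = cooling_kw / (compute_kw + cooling_kw)
print(f"Cooling share of node power: {cooling_share:.0%}")   # ~29%, near the quoted 30-40% once fans and air handling are included

old_watts, power_factor, perf_factor = 330, 14, 1000         # early node vs Grace Blackwell figures
perf_per_watt_gain = perf_factor / power_factor
print(f"Performance-per-watt improvement: ~{perf_per_watt_gain:.0f}x")  # roughly 70x
```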

Lenovo’s Neptune liquid cooling system aims to help clients rethink data centre design by addressing not only the power needed to operate AI but also the substantial energy required for cooling.

Cooling systems have become the need of the hour, with a number of big tech companies partnering with data cooling system providers. For instance, NVIDIA is not just partnering with Lenovo but also Super Micro Computer, for liquid cooling. The company also announced its partnership with Elon Musk’s xAI to power Colossus, the world’s largest AI supercomputer. 

HPC in India

“For HPC, you have some very big clusters going on in India, a lot of research, a lot of great universities, and we want to be a part of every one of those we possibly can,” Tease said.

Tease stated that the deployment of IT technologies in India is tailored to meet specific regional requirements. “India is like a hotbed of AI startups,” he said. He also committed to identifying and supporting the most talented Indian startups, not only within India but also on a broader scale. 

Lenovo’s ambitions in India have been growing steadily, in step with the country’s hyperscaler aspirations. The company also announced a full-stack portfolio that is completely ‘Made in India’.

Last month, Lenovo inaugurated its state-of-the-art research and development lab in Bengaluru, making it the fourth city to host an infrastructure R&D lab globally. 

Why India Needs People+ai https://analyticsindiamag.com/ai-features/why-india-needs-peopleai/ Fri, 25 Oct 2024 12:00:00 +0000 https://analyticsindiamag.com/?p=10139413 “If you look at how much an AI solution could mean to a user, it’s much higher in India with a much larger volume,” said Tanuj Bhojwani.]]>

When we visited People+ai a couple of weeks ago, we had no idea we would walk into a space filled with AI engineers, volunteers and researchers, and of course, walls adorned with AI-generated art, all in line with the philosophy of ‘What can (AI) do for you?’ — straight from Tanuj Bhojwani’s LinkedIn tagline. 

Under his leadership, People+ai is championing AI for India and addressing real challenges at scale with the aim to accelerate through various initiatives and government partnerships. As he put it, “We have a seat at the table… not just in building models but in creating the world’s largest use cases for AI where the difference in people’s lives is tangible”.

People+ai was born out of Nandan Nilekani, Rohini Nilekani, and Shankar Maruwada’s EkStep Foundation in June last year. This community-driven effort aims to harness AI capabilities to help a billion people. It is led by Bhojwani, who has previously been an investor, a consultant, and a co-founder.

Bhojwani explained the importance of building Indic language models and their use cases within the country. “If I put a gun to your head and tell you that you cannot use Google from next month, you will maybe push back for one or two months but then start coughing up.” Bhojwani said that everyone is accustomed to the idea of going to this magic box, typing what they think, and getting the results.

But according to him, most of the Indian population is yet to experience that type of internet because of the country’s low literacy rate. The same people are going to access the new internet multimodally, through voice and by pointing the camera at things. This is why it is important to take AI to the grassroots level of the country, and to do it the way India does it – frugally.

People+ai is currently working on projects like Jan Ki Baat, Sthaan, and Open Cloud Compute (OCC) to make this a reality.

“I think there are problems that are going to be unique for India. Who is going to solve that?” Bhojwani asked rhetorically while explaining that copying the West for ideas is not the right way forward. He also added that the behemoth Amazon has been shaken by Zepto, Instamart, and Blinkit – the same should apply to AI, rather than playing catch-up with the West.

Nilekani, the co-founder and non-executive chairman of Infosys, previously noted that just as India benefited hugely from Digital Public Infrastructure (DPI), Aadhaar, and UPI, AI holds similar transformative potential. “AI is a very powerful technology, but it’s a technology like any other,” said Nilekani, adding that AI needs to be used with appropriate safety and guardrails.

This is the exact principle that People+ai works on. “When it comes to the infrastructural pieces, which involves DPI and policy, we step in,” said Bhojwani.

India as the AI Use Case Capital of the World

Building AI in India is a game that’s very different from that in the West. “If you look at how much an AI solution could mean to a user, it’s much higher in India with a much larger volume,” he added. In the West, it’s about acquiring enterprise customers willing to spend millions of dollars. 

But in India, it’s a high-volume, low-value game, where the AI users would not be paying so much. These people would be more comfortable using AI in their own native languages. This solution is at the population scale. For India to flourish in AI, models must be created that understand India’s linguistic nuances and cultural complexities. 

Bhojwani had earlier told AIM that the foundation had been exploring the benefits of AI for quite some time. However, it was only in the past year that they decided to establish a specialised unit focused on extensively exploring AI use cases.

He believes that in a decade, India is going to be the AI use case capital of the world because of its huge population, diversity, languages and the need for specialised AI tools, be it in healthcare, education or any other field. This is exactly the idea Nilekani puts forward with ‘Adbhut India’.

While enterprises and other companies are pushing the idea of building models and acquiring big customers, it is also essential to make AI reach the roots of the country, which is what AI for Bharat stands for. People+ai is involved with stakeholders to determine if better tools and resources can be developed. “For this, we are collaborating and talking to different organisations, some in Africa, because they are also facing similar problems,” Bhojwani said.

AI for Bharat

If India is going to be the AI use case capital of the world, a substantial compute infrastructure is imperative. People+ai is actively working on developing an open compute infrastructure network to meet the rising demand for compute while promoting market competitiveness.

The idea here is to help small businesses based in Tier 2 or Tier 3 cities access computing infrastructure for training or inferencing at a lower cost than services from the likes of AWS, Google or Azure.

People+ai is already working with Indian computing service providers E2E Networks, Jarvis AI Labs, NeevCloud, and Vigyan Labs and the idea is to create a network of micro data centres with interoperable standards, allowing small businesses and startups to easily plug and play, based on their specific requirements.

“However, the biggest stumbling block right now is the lack of understanding regarding the potential use cases of AI. While building chatbots is undoubtedly crucial, there is a significant amount of work that still needs to be undertaken, especially in the context of India,” Bhojwani said.

“If I had to choose one of them first, I would choose the use cases,” said Bhojwani, while explaining that every company in the world is increasingly getting interested in Indian AI models, be it OpenAI, Google, or Meta. The moat usually stands in building AI use cases as it is easier to go up that supply chain, rather than going down.

OpenAI did not build GPT-4 on day one; it took years and several iterations to reach that level. The same would go for Indic language models built by Indian AI companies. “Being one or two generations behind the SOTA models is still good enough.” 

The long-term vision of all AI companies is the same—to have indigenously developed AI models. Given the network and access to resources that the Indian AI companies have right now, the next best move would be to build a GPT-2 level model instead of competing with the West.

Building models is getting cheaper, and catching up with SOTA a few years later would be astronomically cheaper. “What is the hurry?” asked Bhojwani, explaining that it is better to solidify a market that could sustain the models. “For a constrained set of resources, where would you rather apply them?” he asked, adding that it is good to build models, but if one had to pick what to do first, defining the use cases is more important.

The Journey From Startups to IPO By Neha Singh – Co-Founder and CEO of Tracxn Technology https://analyticsindiamag.com/videos/the-journey-from-startups-to-ipo-by-neha-singh-co-founder-and-ceo-of-tracxn-technology/ Fri, 25 Oct 2024 04:28:21 +0000 https://analyticsindiamag.com/?p=10139358 Learn how Tracxn Technologies went from a startup to IPO in just 10 years with Neha Singh, the CEO and Co-Founder of Tracxn.]]>
Learn how Tracxn Technologies went from a startup to IPO in just 10 years with Neha Singh, the CEO and Co-Founder of Tracxn.

Interview with Tushar Vashisht, CEO & Co-founder of HealthifyMe‬ https://analyticsindiamag.com/videos/interview-with-tushar-vashisht-ceo-co-founder-of-healthifyme/ Wed, 23 Oct 2024 05:19:59 +0000 https://analyticsindiamag.com/?p=10139151 ]]>
Meet Ria, Healthify’s in-house AI, and learn how OpenAI integration is transforming user experience with Tushar Vashisht, CEO & Co-founder of HealthifyMe.
Raj Verma, CEO, Singlestore – The Database Market in India is $120 Billion https://analyticsindiamag.com/videos/raj-verma-ceo-singlestore-the-database-market-in-india-is-120-billion/ Wed, 09 Oct 2024 05:50:03 +0000 https://analyticsindiamag.com/?p=10137910 Watch Raj Verma, CEO, Singlestore talk about the company's journey, his views on the Indian database market.]]>
Watch Raj Verma, CEO, Singlestore talk about the company’s journey, his views on the Indian database market.

What is Database Market?

The database market encompasses the industry focused on developing, distributing, and managing database management systems (DBMS) and related services. The market includes various types of databases, such as relational (RDBMS), NoSQL, NewSQL, and cloud databases. Key players in the database market include Oracle, Microsoft, AWS, and MongoDB, with demand driven by the exponential growth of data, big data analytics, and the shift toward cloud-based solutions for scalability and flexibility.

Mukund Raghunath – Breaking Down Entrepreneurship, Data Explosion, and the GenAI Revolution https://analyticsindiamag.com/videos/mukund-raghunath-breaking-down-entrepreneurship-data-explosion-and-the-genai-revolution/ Mon, 07 Oct 2024 12:01:26 +0000 https://analyticsindiamag.com/?p=10137735 Meet Mukund Raghunath, Founder and CEO of Acies Global, sharing insights on the evolving landscape of data science and engineering.]]>
Meet Mukund Raghunath, Founder and CEO of Acies Global, sharing insights on the evolving landscape of data science and engineering.

Startups are all About Solving Problems that were not Solved Before https://analyticsindiamag.com/videos/startups-are-all-about-solving-problems-that-were-not-solved-before/ Mon, 07 Oct 2024 05:18:11 +0000 https://analyticsindiamag.com/?p=10137675 Explore the impact of AI in shaping the future of analytics with Kishore Gopalakrishna, Co-Founder and CEO at StarTree]]>
Explore the impact of AI in shaping the future of analytics with Kishore Gopalakrishna, Co-Founder and CEO at StarTree

The Rise of Autonomous AI at Kyndryl https://analyticsindiamag.com/ai-features/the-rise-of-autonomous-ai-at-kyndryl/ Tue, 17 Sep 2024 04:30:00 +0000 https://analyticsindiamag.com/?p=10135574 "We believe this area is going to grow and is an important aspect for us," said Sreekrishnan Venkateswaran, CTO of Kyndryl India.]]>

Tech leaders and experts have been vocal about their predictions that AI agents will be the next big thing. Meta CEO Mark Zuckerberg recently said there could be more AI agents in the world than humans. Similarly, Google is also going big on agents with their senior developers dropping subtle hints on how AI assistants and agents will be a game-changer. 

A tech player who is already going big on this is Kyndryl.

Spinning off from IBM in 2021, Kyndryl, the world’s largest IT infrastructure services provider serving various sectors including cloud, security, network, enterprise, and applications, has been heavily investing in AI too.

“We have a lot of customers and our customers are across all verticals. They are seeking ways to ingest AI into their environments and then elicit advantage out of it,” said Sreekrishnan Venkateswaran, CTO of Kyndryl India in an exclusive interaction with AIM. He also noted that the majority of companies are using GenAI in production, not just for prototyping.

Speaking about the future of AI applications, Venkateswaran emphasised how agentic AI will bring a significant shift in how things work. 

“We often use Agentic AI when mixed problem-solving approaches are required. For example, in customer engagement, the AI system directs each query to the most suitable agent based on its content. One agent might perform retrieval-augmented generation from an internal corpus, while another accesses real-time data through external APIs,” said Venkateswaran. 

The CTO explained that their collaborative workflow, where multiple agents tackle different aspects of a problem based on the query, is a prime example of Agentic AI. He emphasised that they leverage these Agentic AI patterns to effectively address several practical challenges faced by their customers.
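As a rough illustration of that routing pattern (and not Kyndryl's actual system), the sketch below dispatches each query either to a stand-in RAG agent over a tiny internal corpus or to a stand-in real-time agent. The corpus entries and keyword rule are invented for the example; a production router would more likely use an LLM classifier.

```python
# Illustrative agent-routing sketch: send each query to the agent best suited to it.
INTERNAL_CORPUS = {
    "refund policy": "Refunds are processed within 7 working days.",
    "warranty": "All products carry a one-year warranty.",
}

def rag_agent(query: str) -> str:
    # Minimal stand-in for retrieval-augmented generation over internal documents.
    for key, passage in INTERNAL_CORPUS.items():
        if key in query.lower():
            return f"(from internal docs) {passage}"
    return "No relevant internal document found."

def realtime_agent(query: str) -> str:
    # Stand-in for an agent that would call an external real-time API (e.g. order tracking).
    return f"(live API call) Latest status fetched for: {query}"

def route(query: str) -> str:
    # Keyword rule keeps the sketch self-contained; a real router might use an LLM classifier.
    needs_live_data = any(w in query.lower() for w in ("status", "track", "current"))
    agent = realtime_agent if needs_live_data else rag_agent
    return agent(query)

print(route("What is your refund policy?"))
print(route("Track the current status of order 1234"))
```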

Agentic AI is the Future

Venkateswaran explains Agentic AI as a concept that combines language models, custom code, data, and APIs to create intelligent workflows capable of solving business problems. He highlighted the significance of this development, noting AI and data to be particularly valuable in India due to its abundant data and skilled AI professionals. “We believe this area is going to grow and is an important aspect for us,” he stated.

Agentic AI represents a shift towards more autonomous decision-making systems. An “agent” in this context is a piece of code capable of perceiving its environment—through sight, sound, or text—and making decisions based on that input. 

This capability can range from simple tasks, like generating creative code and sharing it via WhatsApp, to complex functions such as managing supply chains or enhancing customer engagement systems. 

 “An agent takes the initiative and makes decisions to solve problems autonomously,” said Venkateswaran.

Venkateswaran, whose illustrious career in technology spans over 28 years, including 25 at IBM before moving to Kyndryl, stresses the importance of building broad technical skills and hands-on coding, cautioning against over-specialisation. As he puts it, “Leadership, yes, but you also need to bring in a C-level skill,” underscoring the need for both leadership and technical expertise. 

He also teaches computer science at BITS Pilani, emphasising that foundational skills like maths and algorithms remain crucial despite the industry’s evolving landscape.

Speaking about the progress in AI systems, Venkateswaran said that neural networks required large datasets to learn and perform complex tasks. However, with foundational models like GPT, which are pre-trained, this requirement has been significantly reduced. 

“The human advantage in learning from smaller datasets is no longer valid,” the CTO observed. 

Nuanced AI Support

Speaking about Kyndryl’s applications across domains, Venkateswaran highlighted how AI has helped transform customer engagement in industries such as retail, travel and transportation, and believes AI is enabling more nuanced approaches.

“Customer engagement is much more than a Q&A with a chatbot,” he said. He believes that to truly replicate or enhance live interactions, GenAI must be adept at “understanding the sentiment associated with the text or sarcasm” and responding appropriately in the detected language.

Similarly, Kyndryl has found strong use cases in education, healthcare, and finance, the last of which is one of its biggest focus areas. “All of them [customers] ask for ways to use this new tech to solve fraud detection, anomaly detection, suspicious transactions, and so on,” said Venkateswaran. 

He also explains how adoption of generative AI in edutech surged post-COVID, with AI automating the evaluation of descriptive exam answers and creating tailored questions. In healthcare, AI is improving patient-provider connections by interpreting symptoms more effectively than traditional keyword searches.

“So any project where we have AI and data, it is usually a modern application project. And if you have Gen AI, it is like a modern application project, up on steroids. So it is not really, there is no real bifurcation between an AI app and a modern app,” said Venkateswaran. 

Kyndryl has partnered with top players including Google Cloud for developing responsible generative AI solutions and accelerating adoption by customers. “We have partnerships based on what we need from each of those partners, and then it goes back to what we can offer them with customers,” said Venkateswaran. 

Kyndryl has been rapidly expanding its operations in India, particularly its base in Bengaluru. The company opened its third office in the city in April. Last month, Kyndryl launched a security operations centre (SOC) there, leveraging AI to provide comprehensive support and advanced protection against cyberthreats. 


AI agents are predicted to be the next major technological breakthrough, with leaders like Meta’s Mark Zuckerberg suggesting AI agents could soon outnumber humans. Google also sees AI assistants as transformative game-changers. Kyndryl, the world’s largest IT infrastructure services provider, has been a key player in this space since its spin-off from IBM in 2021, focusing on AI to boost operational efficiency and innovation across sectors.

Sreekrishnan Venkateswaran, CTO of Kyndryl India, envisions AI agents, or Agentic AI, as the future of business. These intelligent systems combine language models, custom code, and real-time data to autonomously address complex tasks. Like a self-driving car that adapts without human input, AI agents handle tasks like fraud detection and customer sentiment analysis, reducing manual intervention and enhancing efficiency in industries such as finance, healthcare, and education. Agentic AI promises to revolutionize global operations by automating supply chains and procurement in e-commerce while interacting with customers. This level of automation allows human roles to focus on strategic decision-making and innovation.

As AI continues to advance, industries will become more adaptive, efficient, and customer-centric. Kyndryl’s partnerships with companies like Google Cloud underscore its commitment to responsible AI development.

Disclaimer: The opinions expressed in this expert analysis summary are solely of the AIM council members and do not reflect the views or opinions of the organization they are affiliated with.



]]>
Former Nutanix Founder’s AI Unicorn is Changing the World of CRM and Product Development https://analyticsindiamag.com/ai-features/former-nutanix-founders-ai-unicorn-is-changing-the-world-of-crm-and-product-development/ Mon, 19 Aug 2024 10:56:33 +0000 https://analyticsindiamag.com/?p=10132923 Backed by Khosla Ventures, DevRev recently achieved unicorn status with a $100.8 million series A funding round, bringing its valuation to $1.15 billion. ]]>

DevRev, founded by former Nutanix co-founder Dheeraj Pandey and former SVP of engineering at the company, Manoj Agarwal, is an AI-native platform unifying customer support and product development. It recently achieved unicorn status with a $100.8 million series A funding round, bringing its valuation to $1.15 billion.

Backed by major investors, such as Khosla Ventures, Mayfield Fund, and Param Hansa Values, the company is on the road to proving the ‘AI bubble’ conversation wrong. “Right now, there’s a lot of talk in the industry about AI and machine learning, but what we’re doing at DevRev isn’t something that can be easily replicated,” said Agarwal in an exclusive interview with AIM.

Agarwal emphasised the unique challenge of integrating AI into existing workflows, a problem that DevRev is tackling head-on. Databricks recently announced that LakeFlow Connect would be available in public preview for SQL Server, Salesforce, and Workday; DevRev is on a similar journey, but with AI at its core, Agarwal believes its platform will be hard to replicate.

DevRev’s AgentOS platform is built around a powerful knowledge graph, which organises data from various sources—such as customer support, product development, and internal communications—into a single, unified system with automatic RAG pipelines. 

This allows users to visualise and interact with the data from multiple perspectives, whether they are looking at it from the product side, the customer side, or the people side.

The Knowledge Graph Approach

Machines don’t understand the boundaries between departments. The more data you provide, the better they perform. “Could you really bring the data into one system, and could you arrange this data in a way that people can visually do well?” asked Agarwal. 

The Knowledge Graph does precisely that – offering a comprehensive view of an organisation’s data, which can then be leveraged for search, analytics, and workflow automation.
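As a toy illustration of the idea (not DevRev’s actual schema), the sketch below links records from support, product, and people systems as nodes in a single graph, so one traversal can move from a ticket to the feature it concerns and on to the engineer who owns that feature.

```python
# Toy knowledge graph: unify support, product, and people records so that
# one traversal links them. The schema here is invented for illustration.
import networkx as nx  # pip install networkx

g = nx.DiGraph()
g.add_node("ticket:482", kind="support", text="Checkout fails on mobile")
g.add_node("feature:checkout", kind="product")
g.add_node("dev:asha", kind="person")
g.add_node("customer:acme", kind="customer")

g.add_edge("ticket:482", "feature:checkout", rel="about")
g.add_edge("ticket:482", "customer:acme", rel="raised_by")
g.add_edge("dev:asha", "feature:checkout", rel="owns")

# "Which engineer should see this customer's open ticket?"
feature = next(v for _, v, d in g.out_edges("ticket:482", data=True) if d["rel"] == "about")
owners = [u for u, _, d in g.in_edges(feature, data=True) if d["rel"] == "owns"]
print(owners)  # ['dev:asha']
```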

Agarwal describes the DevRev platform as being built on three foundational pillars: advanced search capabilities, seamless workflow automation, and robust analytics and reporting tools. “Search, not just keyword-based search, but also semantic search,” he noted.
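A minimal sketch of the distinction Agarwal draws: keyword search matches tokens, while semantic search matches meaning through embeddings. The sentence-transformers checkpoint below is a widely used public model chosen purely for illustration, not anything DevRev is known to ship.

```python
# Keyword search matches tokens; semantic search matches meaning via embeddings.
# The checkpoint below is a common public model used only for illustration.
from sentence_transformers import SentenceTransformer, util  # pip install sentence-transformers

model = SentenceTransformer("all-MiniLM-L6-v2")
docs = ["Refund was not processed", "Payment failed at checkout", "How to export a report"]

query = "customer charged but order did not go through"
scores = util.cos_sim(model.encode(query), model.encode(docs))[0]
print(docs[int(scores.argmax())])  # likely "Payment failed at checkout", despite no shared keywords
```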

On top of these foundational elements, DevRev has developed a suite of applications tailored to specific use cases, such as customer support, software development, and product management. These apps are designed to work seamlessly with the platform’s AI agents, which can be programmed to perform end-to-end tasks, further enhancing productivity.

“The AI knowledge graph is the hardest thing to get right,” admitted Agarwal, pointing to the challenges of syncing data from multiple systems and keeping it updated in real-time. However, DevRev has managed to overcome these hurdles, enabling organisations to bring their data into a single platform where it can be organised, analysed, and acted upon.

The Open Approach

The company’s focus on AI is not new. In fact, DevRev’s journey began in 2020, long before the current wave of AI hype. “In 2020, when we wrote our first paper about DevRev, it had GPT all over it,” Agarwal recalls, referring to the early adoption of AI within the company. 

Even today, DevRev primarily uses OpenAI’s enterprise version but also works closely with other AI providers like AWS and Anthropic. In 2021, the platform organised a hackathon where OpenAI provided exclusive access to GPT-3 for all the participants. 

This forward-thinking approach allowed DevRev to build a tech stack that was ready to leverage the latest advancements in AI, including the use of vector databases, which were not widely available at the time.

One of the biggest challenges that DevRev addresses is the outdated nature of many systems of record in use today. Whether it’s in customer support, CRM, or product management, these legacy systems are often ill-equipped to handle the demands of modern businesses, particularly when it comes to integrating AI and machine learning.

Not a Bubble

DevRev’s architecture is designed with flexibility in mind, allowing enterprises to bring their own AI models or use the company’s built-in solutions. “One of the core philosophies we made from the very beginning is that everything we do inside DevRev will have API webhooks that we expose to the outside world,” Agarwal explained. 
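In practice, exposing everything through webhooks means a customer can subscribe to platform events with a few lines of code. The sketch below is a hypothetical receiver; the endpoint path, event type, and payload fields are invented for illustration and are not DevRev’s actual webhook contract.

```python
# Hypothetical webhook receiver: the route, event type, and fields are
# invented for illustration and are not DevRev's actual webhook contract.
from fastapi import FastAPI, Request  # pip install fastapi uvicorn

app = FastAPI()

@app.post("/hooks/devrev")
async def handle_event(request: Request):
    event = await request.json()
    # React to platform events in your own systems, e.g. mirror a new ticket.
    if event.get("type") == "work_created":
        print("New work item:", event.get("id"))
    return {"ok": True}

# Run with: uvicorn webhook_listener:app --port 8000
```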

As DevRev attains unicorn status, Agarwal acknowledges the growing concerns about an “AI bubble” similar to the dot-com bubble of the late 1990s. “There’s so many companies that just have a website and a company,” he said, drawing parallels between the two eras. 

However, he believes that while there may be some hype, the underlying technology is real and here to stay. “I don’t think that anybody is saying that after the internet, this thing is not real. This thing is real,” Agarwal asserted. 

The key, he argues, is to distinguish between companies that are merely riding the AI wave and those that are genuinely innovating and solving real problems. DevRev, with its deep investment in AI and its unique approach to integrating it into enterprise workflows, clearly falls into the latter category.

]]>
Accel’s Prayank Swaroop on Navigating Challenges and Data Moats in Indian AI Startup Investing https://analyticsindiamag.com/ai-features/accels-prayank-swaroop-on-navigating-challenges-and-data-moats-in-indian-ai-startup-investing/ Mon, 19 Aug 2024 05:28:28 +0000 https://analyticsindiamag.com/?p=10132881 “My belief is that India is a great market, and smart founders come and keep on coming, and we'll have enough opportunities to invest in,” said Prayank Swaroop, partner at Accel. ]]>

As pioneers in the startup VC ecosystem, Accel (formerly known as Accel Partners), with over four decades of experience, entered the Indian market in 2008. They placed their initial bets on a nascent e-commerce company poised to compete with Amazon. 

In 2008, Accel India invested $800,000 in seed capital into Flipkart, followed by $100 million in subsequent rounds. The VC firm went on to back some of today’s most successful ventures, including AI startups. “We’ve invested in 27 [AI] companies in the last couple of years, which basically means we believe these 27 companies will be worth five to ten billion dollars [in the future],” said Prayank Swaroop, partner at Accel, in an exclusive interaction with AIM. 

Swaroop, who joined Accel in 2011, focuses on cybersecurity, developer tools, marketplaces, and SaaS investments, and has invested in companies such as Aavenir, Bizongo, Maverix, and Zetwerk. Having placed careful bets in the AI startup space, he continues to be optimistic, yet wary, about the Indian ecosystem. 

Swaroop observed that while the Indian ecosystem has impressive companies, not all can achieve significant scale. He mentioned that they encounter companies that reach $5 to $10 million in revenue quickly, but they don’t believe those companies can grow to $400 to $500 million, so they choose not to invest in them.

Swaroop told AIM that Accel doesn’t have any kind of capital constraints and can support as many startups as possible. However, their focus is on startups with the ambition to grow into $5 to $10 billion companies, rather than those aiming for $100 million. “I think that is our ambition,” he said. 

Accel has also been clear that it has no inhibitions about investing in wrapper-based AI companies. As long as a startup can prove it will find customers by building GPT or AI wrappers on top of other products, the firm is comfortable backing it.  

“The majority of people can start with a wrapper and then, over a period of time, build the complexity of having their own model. You don’t need to do it from day one,” said Swaroop.

However, he also pointed out that for a research-led foundational model, it’s crucial to stand out, and that one cannot just create a GPT wrapper and claim it’s a new innovation.

Accel has invested in a diversified portfolio including food delivery company Swiggy, SaaS giant Freshworks, fitness company Cult.fit, and insurance tech firm Acko. India is Accel’s second-largest market by number of investments, with 218 companies backed, behind only the United States at 572. In 2022, the market value of Accel’s portfolio was over $100 billion.

Accelerating AI Startups

Accel has a dedicated programme called Accel Atoms AI that looks to invest in promising AI-focused startups across early stages. The cohort of startups will be funded and supported by Accel partners and founders to help them grow faster. 

Selected startups in Accel Atoms 3.0 received up to $500k in funding; cloud service credits, including $100,000 for AWS, $150,000 for Microsoft Azure, and $250,000 for Google Cloud Platform; GitHub credits; and other perks. The latest edition, Atoms 4.0, is expected to begin in a couple of months.

While these programmes are in place, Accel has been following a particular investment philosophy for AI startups. 

Accel’s Investment Philosophy

Accel’s investment philosophy for AI startups rests on a number of key criteria, starting with the team itself. “It’s a cliched thing in VC, but we definitely look at the team,” said Swaroop, adding that founders need to have a genuine appreciation of AI.

He emphasised that teams must embrace AI, and be willing to dive into research and seek help when needed, demonstrating both a commitment to learning and effective communication.

Accel also focuses on startups that solve real problems. Swaroop believes that founders should clearly identify their customers and show how their solution can generate significant revenue.

“We get team members who are solving great things, and we realise they are solving great things, but they can’t say that. When they can’t say that, they can’t raise funding. Basically, are you a good storyteller about it?” he explained.

Revenue Growth Struggles  

Swaroop further explained how VCs are increasingly expecting AI startups to demonstrate rapid revenue growth. 

Unlike traditional deep tech companies that may take years to generate revenue, AI firms must show significant commercial traction within 12 to 18 months. He also stated that as VC investment in AI rises, startups without clear revenue paths face growing challenges in securing funding. 

“Even to the pre-seed companies, we say, ‘Hey, you need to show whatever money you have before your next fundraiser. You need to start showing proof that customers are using you.’ Because so many other AI companies exist,” he said. 

Swaroop also highlighted how investment behaviour towards AI startups has changed over the last year, with investors now asking the hard questions.

VCs Obsess Over Data Moats

Speaking about what differentiates an AI startup and constitutes its moat, Swaroop highlighted that the quality of datasets can be a deciding factor, though “not so much” in the case of Indic datasets.

“I don’t think language datasets can be a moat, because everybody understands language. Recently, in the Bhashini project, IISc gave out 16,000 hours of audio, so it is democratic data. Everybody owns it, so what’s proprietary in it for you?” asked Swaroop.  

Proprietary datasets, such as those in healthcare or specialised fields, are valuable due to their complexity and the effort required to create them. “I think startups should pick and choose an area where they have uniqueness of data, where they will have proprietary data which is different from just democratic data. That’s the broad thing,” said Swaroop.

Irrespective of the moat, India continues to be a great market with multiple opportunities for investment. In fact, at a recent Accel summit, Swaroop jokingly mentioned that he did not invest in Zomato in its early stage, but that he has no regrets. Interestingly, Accel has invested heavily in Zomato’s competitor, Swiggy.

“I think the first thing you have to let go of as a VC is FOMO, the fear of missing out, that’s why I could not think of a company that I regret not investing in, because, my belief is that India is a great market. Smart founders come and keep on coming. We’ll have enough opportunities to invest in,” concluded Swaroop, excited to meet the next generation of founders working in the AI startup ecosystem. 

]]>
Meet Deepak Joy Cheenath, co-founder of Quizizz https://analyticsindiamag.com/videos/meet-deepak-joy-cheenath-co-founder-of-quizizz/ https://analyticsindiamag.com/videos/meet-deepak-joy-cheenath-co-founder-of-quizizz/#respond Sun, 11 Aug 2024 05:11:42 +0000 https://analyticsindiamag.com/?p=10132064 Meet Deepak Joy Cheenath, co-founder of Quizizz, a powerful tool for enhancing student learning through interactive and engaging quizzes.]]>
Meet Deepak Joy Cheenath, co-founder of Quizizz, a powerful tool for enhancing student learning through interactive and engaging quizzes.
]]>
https://analyticsindiamag.com/videos/meet-deepak-joy-cheenath-co-founder-of-quizizz/feed/ 0
Devnagri is Building a Multilingual ‘Brain’ to Enable Companies Expand to Tier 2 & 3 Cities https://analyticsindiamag.com/ai-features/devnagri-is-building-a-multilingual-brain-to-enable-companies-expand-to-tier-2-3-cities/ https://analyticsindiamag.com/ai-features/devnagri-is-building-a-multilingual-brain-to-enable-companies-expand-to-tier-2-3-cities/#respond Sun, 04 Aug 2024 14:01:45 +0000 https://analyticsindiamag.com/?p=10131291 Devnagri's dataset is robust, comprising over 750 million data points across 22 Indian languages.]]>

Hyperlocal content is becoming crucial for businesses to expand into tier two and tier three cities in India. Devnagri, a data-driven generative AI company, is paving the way by developing a solution it calls the ‘brain’ for Indian companies. 

Nakul Kundra, the co-founder and CEO, told AIM about the company’s moat in the era of Indic AI startups.

“Devnagri is dedicated to helping businesses move into new markets by providing hyperlocal content. Our machine translation capabilities enable businesses to transform their digital content into multiple languages, allowing them to engage with diverse customer bases,” Kundra explained.

Based in Noida and founded in 2021, Devnagri specialises in personalising business communication for non-English speakers. The company recently raised an undisclosed amount in a Pre-Series A round led by Inflection Point Ventures. The newly acquired funds will be used for marketing, sales, technology enhancement, R&D, infrastructure, and administrative expenses.

Devnagri leverages open-source LLMs such as the latest Llama 3, integrating them with its existing dataset and proprietary translation engine covering 22 languages. It tailors business communications for diverse linguistic audiences, seamlessly integrating its technology into both private and government infrastructures. 

“We built application layers on top of our machine translation engine,” Kundra elaborated. “These layers allow customers to upload documents, select languages, and even customise the content before translation. The system understands specific tones and terminologies, ensuring that the translated content aligns with the business’ communication style.”
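A client for this kind of application layer might look like the hypothetical sketch below: upload a document, choose target languages, and attach tone and terminology preferences before translation. The base URL, endpoint, and field names are invented for illustration and are not Devnagri’s actual API.

```python
# Hypothetical client for a translation application layer; the URL, endpoint,
# and field names are illustrative only, not Devnagri's real API.
import requests  # pip install requests

API = "https://api.example.com/v1"              # placeholder base URL
HEADERS = {"Authorization": "Bearer <token>"}   # placeholder credentials

# 1) upload a document, 2) pick target languages, 3) set tone and terminology
with open("brochure.pdf", "rb") as f:
    job = requests.post(
        f"{API}/translation-jobs",
        headers=HEADERS,
        files={"document": f},
        data={
            "target_languages": "hi,ta,bn",     # Hindi, Tamil, Bengali
            "tone": "formal",
            "glossary_id": "banking-terms-v2",  # domain terminology to enforce
        },
        timeout=30,
    ).json()

print(job.get("status"), job.get("job_id"))
```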

Similarly, New York-based RPA firm UiPath recently partnered with Bhashini. The focus of this collaboration is the integration of Bhashini’s language models with the UiPath Business Automation Platform. This will facilitate seamless translation of documents and other essential content, specifically targeting Indian languages supported by Bhashini.

Companies such as CoRover.ai and Sarvam AI are in a similar field, building translation capabilities for businesses. Even big tech companies such as Microsoft and Google are heavily focused on translation into Indic languages to cater to the Indian market. 

What’s the Moat Then?

However, Kundra said that Devnagri’s proprietary technology lies at the heart of this initiative and forms the company’s moat. “We’ve created our own machine translation capabilities from scratch,” Kundra said. “Businesses can use our APIs to integrate this technology directly into their platforms, localising content in real-time.”

Devnagri’s dataset is robust, comprising over 750 million data points across 22 Indian languages. “We initially built our models using a vast dataset, and recently we’ve incorporated SLMs and LLMs to enhance quality and address grey areas identified through customer feedback,” Kundra said. 

The goal is to create a single brain for businesses, integrating all touchpoints and datasets into a cohesive system that understands and responds in the desired tone.

“We adapt existing models and integrate them with our proprietary technology, ensuring high-quality multilingual capabilities,” Kundra added.

Collecting data for such a comprehensive system is no small feat. “Our data comes from multiple sources, including open-source dataset corpus, customer data, and synthetic datasets we create,” Kundra explained, adding that the introduction of new datasets from Bhashini also helps the company improve its models.

Devnagri’s multilingual capabilities extend beyond text to voice-based conversational bots. “We are developing multilingual bots that allow customers to interact in their preferred languages, whether it’s Marathi, Kannada, or any other language,” Kundra said, adding that they aim to reduce latency to as little as possible.

The Road Ahead

When asked about Devnagri’s differentiating factor, Kundra emphasised its multilingual bots. “These bots are essential for companies operating pan-India. They handle calls in multiple languages, switching seamlessly to accommodate the caller’s preference, all with the lowest latency.”

Security and privacy are paramount, especially when dealing with government organisations and customers such as UNDP and Apollo, among several others. “All our modules are proprietary, enabling us to bundle and position them securely within enterprises or government agencies,” Kundra assured.

Devnagri’s journey has been remarkable, marked by milestones like its appearance on Shark Tank India in 2022. “Multilingual conversation is the need of the hour, and our solutions aim to optimise costs and improve efficiency for enterprises,” Kundra said.

The firm has also received numerous prestigious awards, including the TieCon Award 2024 in San Francisco, the Graham Bell Award 2023, and recognition as NASSCOM’s Emerging NLP Startup of India.

“Our machine translation engine is a foundational model. It enables us to build conversational bots that understand and respond in multiple languages, tailored to specific business needs,” Kundra said.

As Devnagri looks to the future, their focus remains on building comprehensive AI solutions that cater to the diverse linguistic landscape of India. “We aim to create an ecosystem where businesses can thrive in any language, offering seamless multilingual interactions and superior customer experiences,” Kundra concluded.

]]>
https://analyticsindiamag.com/ai-features/devnagri-is-building-a-multilingual-brain-to-enable-companies-expand-to-tier-2-3-cities/feed/ 0
Meet Reema Lunawat – Accelerating Women In Data Science and Tech https://analyticsindiamag.com/videos/meet-reema-lunawat-accelerating-women-in-data-science-and-tech/ https://analyticsindiamag.com/videos/meet-reema-lunawat-accelerating-women-in-data-science-and-tech/#respond Fri, 02 Aug 2024 08:43:35 +0000 https://analyticsindiamag.com/?p=10131225
Learn about Reema Lunawat’s journey as a woman in tech and how ZS’s inclusive culture and diversity initiatives made a tangible difference for her.

]]>
https://analyticsindiamag.com/videos/meet-reema-lunawat-accelerating-women-in-data-science-and-tech/feed/ 0
Meet CEO of Wadhwani AI – Shekar Sivasubramanian https://analyticsindiamag.com/videos/meet-ceo-of-wadhwani-ai-shekar-sivasubramanian/ https://analyticsindiamag.com/videos/meet-ceo-of-wadhwani-ai-shekar-sivasubramanian/#respond Thu, 01 Aug 2024 06:18:03 +0000 https://analyticsindiamag.com/?p=10130938 Shekar Sivasubramanian is the CEO at Wadhwani AI. He is currently driving the organization towards establishing AI-driven solutions and ecosystems.]]>
]]>
https://analyticsindiamag.com/videos/meet-ceo-of-wadhwani-ai-shekar-sivasubramanian/feed/ 0
Meet Mansoor Rahimat Khan CEO of Beatoven ai https://analyticsindiamag.com/videos/meet-mansoor-rahimat-khan-ceo-of-beatoven-ai/ https://analyticsindiamag.com/videos/meet-mansoor-rahimat-khan-ceo-of-beatoven-ai/#respond Thu, 01 Aug 2024 06:18:03 +0000 https://analyticsindiamag.com/?p=10130944 Interview with Mansoor Rahimat Khan, CEO and Co-Founder of BeatOven AI]]>
Interview with Mansoor Rahimat Khan, CEO and Co-Founder of BeatOven AI
]]>
https://analyticsindiamag.com/videos/meet-mansoor-rahimat-khan-ceo-of-beatoven-ai/feed/ 0
C P Gurnani Addresses How India has Proven Sam Altman Wrong https://analyticsindiamag.com/videos/c-p-gurnani-addresses-how-india-has-proven-sam-altman-wrong/ https://analyticsindiamag.com/videos/c-p-gurnani-addresses-how-india-has-proven-sam-altman-wrong/#respond Thu, 01 Aug 2024 06:18:02 +0000 https://analyticsindiamag.com/?p=10130931 In MachineCon 2024 Event by AIM, CP Gurnani, addressed how India has proved Sam Altman wrong by successfully developing its own Large Language Model (LLM)]]>
At AIM’s MachineCon 2024 event, CP Gurnani addressed how India has proved Sam Altman wrong by successfully developing its own Large Language Model (LLM).
]]>
https://analyticsindiamag.com/videos/c-p-gurnani-addresses-how-india-has-proven-sam-altman-wrong/feed/ 0
GitHub CEO Thomas Dohmke’s Shocking Response on AGI https://analyticsindiamag.com/videos/github-ceo-thomas-dohmkes-shocking-response-on-agi/ https://analyticsindiamag.com/videos/github-ceo-thomas-dohmkes-shocking-response-on-agi/#respond Thu, 01 Aug 2024 06:18:00 +0000 https://analyticsindiamag.com/?p=10130905 GitHub CEO Thomas Dohmke explained a shocking response on AGI]]>
]]>
https://analyticsindiamag.com/videos/github-ceo-thomas-dohmkes-shocking-response-on-agi/feed/ 0
Meet Naveen Rao, the Indian Origin VP of GenAI at Databricks https://analyticsindiamag.com/videos/meet-naveen-rao-the-indian-origin-vp-of-genai-at-databricks/ https://analyticsindiamag.com/videos/meet-naveen-rao-the-indian-origin-vp-of-genai-at-databricks/#respond Thu, 01 Aug 2024 06:18:00 +0000 https://analyticsindiamag.com/?p=10130909 Meet Naveen Rao, the Indian origin VP of GenAI at Databricks. At a discussion on AI, Neuroscience.]]>
]]>
https://analyticsindiamag.com/videos/meet-naveen-rao-the-indian-origin-vp-of-genai-at-databricks/feed/ 0
Abhishek Sinha, COO at L&T Technology Services (LTTS) Thinks GenAI is Hyped https://analyticsindiamag.com/videos/abhishek-sinha-coo-at-lt-technology-services-ltts-thinks-genai-is-hyped/ https://analyticsindiamag.com/videos/abhishek-sinha-coo-at-lt-technology-services-ltts-thinks-genai-is-hyped/#respond Thu, 01 Aug 2024 06:17:56 +0000 https://analyticsindiamag.com/?p=10130893 Generative AI is hyped but it’s real for the consumer; watch the latest interview with Abhishek Sinha from L and T Technology.]]>

]]>
https://analyticsindiamag.com/videos/abhishek-sinha-coo-at-lt-technology-services-ltts-thinks-genai-is-hyped/feed/ 0
Meet Mufeed VH, Adarsh Shirawalmath and Adithya S Kolavi in Tech Talk https://analyticsindiamag.com/videos/meet-indian-ai-developers-and-engineers-in-tech-talk/ https://analyticsindiamag.com/videos/meet-indian-ai-developers-and-engineers-in-tech-talk/#respond Thu, 01 Aug 2024 06:17:55 +0000 https://analyticsindiamag.com/?p=10130872 Meet the Indian AI developers and AI founders in tech talk podcast with Analytics India Magazine and learn from their tech journey.]]>
In this latest AI Tech Talk with AIM (Analytics India Magazine), meet the young Indian AI engineers, developers, and founders: Mufeed VH, the creator of Devika; Adarsh Shirawalmath, the founder of Tensoic and creator of Kannada Llama; and Adithya S Kolavi, founder of CognitiveLab and creator of the Indic LLM Leaderboard.
]]>
https://analyticsindiamag.com/videos/meet-indian-ai-developers-and-engineers-in-tech-talk/feed/ 0
Ashan Willy CEO New Relic Interview | AI Observability https://analyticsindiamag.com/videos/ashan-willy-ceo-new-relic-interview/ https://analyticsindiamag.com/videos/ashan-willy-ceo-new-relic-interview/#respond Thu, 01 Aug 2024 06:17:55 +0000 https://analyticsindiamag.com/?p=10130877 Interview with Ashan Willy CEO of New Relic, he delves into how New Relic's AI-powered observability platform is assisting over 12,000 customers in India]]>
In this interview, Ashan Willy, CEO of New Relic, delves into how New Relic’s AI-powered observability platform is assisting over 12,000 customers in India, including prominent companies like Swiggy and Ola Cabs.
]]>
https://analyticsindiamag.com/videos/ashan-willy-ceo-new-relic-interview/feed/ 0