Google Cloud – Analytics India Magazine
https://analyticsindiamag.com | AIM: News and Insights on AI, GCC, IT, and Tech | Feed updated Fri, 07 Mar 2025

NoBroker Diversifies into SaaS with Multilingual ConvoZen.AI
https://analyticsindiamag.com/ai-startups/nobroker-diversifies-into-saas-with-multilingual-convozen-ai/ | Tue, 25 Feb 2025

Real estate services company NoBroker has unveiled ConvoZen.AI, a conversational intelligence platform that aims to transform customer engagement for businesses across sectors. The move marks NoBroker’s entry into the software-as-a-service (SaaS) market with an AI-driven automation workflow. 

ConvoZen.AI is engineered to analyse and transcribe customer engagements across multiple channels, including calls, meetings, chats, and social media. 

The platform supports multiple languages, including English, Hindi, Tamil, Telugu, Kannada, and Marathi. By employing ML models, ConvoZen.AI offers features such as speaker identification, sentiment analysis, and entity recognition, enabling businesses to enhance customer service and operational efficiency.
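The analytics pass described above can be illustrated with a toy sketch. This is not ConvoZen.AI’s implementation: real platforms use trained ML models for sentiment and entity recognition, while the keyword lexicons and function names here are invented purely for illustration.

```python
# Illustrative sketch only: a crude keyword-based sentiment tagger standing in
# for the trained models a conversational-intelligence platform would use.
POSITIVE = {"great", "thanks", "helpful", "resolved"}
NEGATIVE = {"angry", "refund", "complaint", "delay"}

def analyse_turn(speaker: str, text: str) -> dict:
    """Tag a single conversation turn with a toy sentiment label."""
    words = {w.strip(".,!?").lower() for w in text.split()}
    score = len(words & POSITIVE) - len(words & NEGATIVE)
    label = "positive" if score > 0 else "negative" if score < 0 else "neutral"
    return {"speaker": speaker, "sentiment": label}

transcript = [
    ("agent", "Thanks for calling, how can I help?"),
    ("customer", "There is a delay in my refund and I am angry."),
]
results = [analyse_turn(speaker, text) for speaker, text in transcript]
```

A production system would replace the lexicon lookup with model inference and add speaker identification and entity recognition over the transcribed audio.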

“Unlike conventional solutions built using third-party models and optimised for Western accents, we built ConvoZen.AI to empower businesses operating at a Bharat scale to embrace Gen AI,” said Akhil Gupta, co-founder and CPTO of NoBroker. 

Google Cloud for ConvoZen.AI

The company has partnered with Google Cloud to use its Cloud AI infrastructure for the development and deployment of custom models optimised for large-scale operations. These models are trained on extensive datasets derived from NoBroker’s customer interactions, which include 45,000+ hours of contact centre conversations. 

Since its launch, ConvoZen.AI has been adopted by numerous companies across sectors such as lending, insurance, edtech, and e-commerce. The platform processes substantial volumes of customer interactions daily, providing businesses with insights that drive improvements in customer engagement and operational processes. 

Naren Kachroo, head of GTM, Google Cloud India AI, emphasised the growing importance of AI-driven agents during his keynote speech at the ConvoZen.AI Summit in Bengaluru. “2025 is going to be the year of AI agents. It is going to be the year of agentic AI.” 

He further elaborated on the significance of NoBroker’s entry into this space. “The work that NoBroker is doing with ConvoZen.AI is so important and relevant today because it ties to a business objective in a business workflow and accomplishes a task.”

Notably, ConvoZen.AI has significantly automated quality audits, leading to enhanced agent efficiency and more effective compliance tracking. Fifteen companies, including Cars24, LendingKart, LeapScholar, and Tata AIG, leverage ConvoZen.AI in their enterprise workflows.

NoBroker in SaaS Landscape

NoBroker’s entry into the SaaS market with ConvoZen.AI positions it among a growing number of companies offering AI-driven customer engagement solutions. The platform’s focus on multilingual capabilities tailored to the Indian market, combined with the credibility of its established real estate platform, provides a distinct competitive advantage. 

“It is wonderful to see ConvoZen.AI utilising the power of our Generative AI models toward creating new-generation ‘Agentic’ capabilities, both internally and for their customers,” said Manish Gupta, Director of Google Research India. 

Manish Gupta, Director, Google Research India at ConvoZen.AI Summit. Source: AIM

Interestingly, Wipro Limited has also partnered with Google Cloud to launch the Google Gemini Experience Zone. 

This initiative provides enterprises hands-on access to Google’s AI technologies, including Gemini models and Vertex AI. The Experience Zone enables businesses to experiment with generative AI applications such as natural language processing, image generation, customer interaction tools, and predictive analytics, facilitating the co-creation of tailored AI solutions to address specific industry challenges.

NoBroker aims to establish ConvoZen.AI as a leading conversational intelligence platform for mid-to-large enterprises. By making conversational AI more accessible and effective, NoBroker’s strategic shift into the SaaS domain with ConvoZen.AI reflects its commitment to innovation and addressing the dynamic needs of the market.

NTT DATA to Acquire Niveus Solutions to Strengthen Google Cloud Expertise
https://analyticsindiamag.com/ai-news-updates/ntt-data-to-acquire-niveus-solutions-to-strengthen-google-cloud-expertise/ | Wed, 27 Nov 2024

NTT DATA, a global leader in digital and IT services, has announced plans to acquire Google Cloud Platform (GCP) specialist Niveus Solutions. The move will make NTT DATA a top global partner for Google Cloud, enhancing its cloud modernisation and AI capabilities across key industries. Pending standard conditions, the transaction is expected to be completed within 30 to 60 days.

Expanding Google Cloud Platform Expertise

Niveus Solutions will bring 1,000 GCP experts to NTT DATA, strengthening its ability to deliver industry-specific solutions for sectors such as finance, manufacturing, retail, and logistics. This acquisition builds on NTT DATA’s partnership with Google Cloud, focusing on innovative data analytics and AI solutions.

“The acquisition of Niveus Solutions will firmly position us as a leading Google Cloud powerhouse, propelling NTT DATA as one of the world’s largest system integrator partners for Google Cloud,” said Charlie Li, head of cloud and security services at NTT DATA. “With Niveus Solutions, we are set to redefine possibilities in the AI-driven cloud era, enabling clients to accelerate digital transformations and achieve meaningful business outcomes across industries and geographies.”

As per reports, worldwide end-user spending on public cloud services is forecast to total $679 billion in 2024 and to exceed $1 trillion in 2027. Niveus Solutions, recognised as Google Cloud’s 2024 ‘Breakthrough Partner of the Year – Asia Pacific’, has a strong track record of success. 

NTT DATA’s Recent Innovations in Cloud and AI

Previously, NTT DATA and IBM launched SimpliZCloud, a hybrid cloud service on IBM’s LinuxONE platform, to support critical financial applications like core banking and risk management for India’s financial sector. Earlier this year, NTT DATA unveiled an ultralight edge AI Platform. It unifies IoT devices, using lightweight AI models, and simplifies deployment through streamlined data discovery, integration, and management.

JSW MG Motor India, Google Cloud Launch AI Chatbots to Enhance Customer Experience
https://analyticsindiamag.com/ai-news-updates/jsw-mg-motor-india-and-google-cloud-launch-ai-chatbots-avira-and-vir-to-enhance-customer-experience/ | Wed, 23 Oct 2024

JSW MG Motor India, in partnership with Google Cloud India, has unveiled two generative AI-powered chatbots, Avira and Vir, designed to enhance customer interactions and streamline services. Developed using Google Cloud’s PaLM 2 technology and Riafy’s AI expertise, these chatbots represent a significant leap forward in digital customer engagement for the automotive industry.

JSW MG Motor India is the first car manufacturer in the country to leverage this advanced AI technology for customer service. Avira and Vir are accessible across multiple platforms, including the MG website, WhatsApp, EVPEDIA website, and EVPEDIA WhatsApp, offering a seamless user experience across various touchpoints. The company sees this development as part of a broader AI initiative to boost customer satisfaction, operational efficiency, and sales.

“These chatbots mark a significant milestone in our journey to enhance consumer experience,” said Satinder Singh Bajwa, Chief Commercial Officer at JSW MG Motor India. “By leveraging AI, we are able to redefine customer engagement and operational excellence.”

The chatbots are designed to provide real-time, personalised support through natural conversations. With multilingual capabilities, they can communicate effectively across a range of languages and adapt to customer preferences over time. This AI-driven personalisation allows the system to tailor responses based on the customer’s interaction history, thus building trust and improving satisfaction.

Bikram Singh Bedi, Vice President and Country MD of Google Cloud India, expressed enthusiasm about the collaboration. “We are excited to partner with JSW MG Motor India as they reimagine their consumer experience using our AI platform,” he said.

Enhancing Customer Interaction

Avira is focused on enhancing the car-buying experience, offering interactive and personalised assistance for potential customers. It helps users navigate the car selection process, answering inquiries about JSW MG cars with human-like interactions to make the journey more seamless and engaging.

Vir, on the other hand, serves as an expert on electric vehicles (EVs). It provides detailed insights and educates users on the latest trends in the EV market. Accessible through the EVPEDIA platform, Vir helps customers make informed decisions about electric cars, empowering them with comprehensive knowledge.

This AI-driven innovation underscores JSW MG Motor India’s commitment to using technology to improve customer experience. The introduction of Avira and Vir builds on other tech-first initiatives such as MG Xpert, MG Epay, and MGVerse, further solidifying the company’s position as a leader in customer-centric innovation.

Automotive and AI

This is not the first collaboration between a major car manufacturer and a big tech company, as many players are looking to bring AI and cloud capabilities into their vehicles.

Earlier this year, Volkswagen integrated ChatGPT into its infotainment system across its vehicle lineup, including models like the Tiguan, Passat, Golf, and its ID family of electric vehicles. The voice assistant enhances the in-car experience by offering features such as controlling infotainment, navigation, air conditioning, and answering general knowledge questions.

Other automakers are also embracing AI-powered assistants. At CES 2024, Mercedes-Benz unveiled its MBUX Virtual Assistant, based on MB.OS, which offers empathetic, natural responses with four emotional tones. Additionally, Mercedes-Benz had beta-tested ChatGPT last year for voice assistance, enabling drivers to engage in various conversations while staying focused on the road. 

Similarly, BMW had collaborated with Amazon to introduce an LLM-powered Alexa, offering a more intuitive way to interact with the vehicle and perform tasks.

Google Cloud Partners with Sequoia for Enhanced Support
https://analyticsindiamag.com/ai-news-updates/google-cloud-partners-with-sequoia-for-enhanced-support/ | Thu, 10 Oct 2024

Google Cloud announced on October 9 that it is partnering with Sequoia Capital, an American venture capital firm. The partnership will allow Sequoia’s portfolio companies to access Google’s various cloud services, credits, and enhanced support. “It will help companies quickly build and scale their products on Google Cloud,” said Thomas Kurian, the CEO of Google Cloud. 

Under the terms of the deal, which is non-exclusive, Sequoia-backed companies can get migration support and up to 45 minutes of daily “white glove” support from Google staffers. They can also get up to $500,000 worth of free cloud computing, training, and other services, compared to the $350,000 in credits and other benefits Google offers other AI startups.

The move could give Google a leg up on its rivals OpenAI and Microsoft, helping it to increase its consumer base for cloud services and credits. 

With this partnership, Google has added to its list of multiple partnerships with other VC firms and accelerators, including Y Combinator and 500 Global. The tech giant also recently partnered with ParallelDots and Tech Mahindra, as reported by AIM earlier. 

“You don’t have to invent every wheel,” said James Lee, the general manager of Google Cloud’s startups and AI program, in an interview. He expressed that beyond the cloud credits, Google can help startups compare various options, which include Google and third-party models. 

Meanwhile, Sequoia partner Bogomil Balkansky said the deal benefits pre-seed-stage companies that might have only a few million dollars in funding, a considerable part of which gets eaten up by cloud costs. He noted, however, that portfolio companies are not required to build on Google Cloud.

Thomas Kurian
https://analyticsindiamag.com/people/thomas-kurien/ | Mon, 23 Sep 2024
CEO of Google Cloud

From being an IIT dropout to becoming the CEO of Google Cloud, Thomas Kurian’s journey is nothing short of remarkable. Today, Kurian is spearheading Google Cloud’s generative AI initiatives and has introduced the Model Garden on Vertex AI, which offers over 130 foundation models. This wide range of options provides customers with extensive AI possibilities.

Kurian is also pushing forward with AI infrastructure, investing in custom AI chips like TPUs to enhance performance. Kurian’s vision includes practical applications of AI for businesses. He’s developing solutions to improve customer service, boost efficiency, and increase productivity. He’s also committed to ethical AI, ensuring responsible development and deployment.

Kurian is expanding Google Cloud’s AI ecosystem through partnerships and is making AI more accessible to developers with tools like the PaLM API and MakerSuite. He’s also promoting AI in cybersecurity and advocating for AI education and skill development.

Google Cloud Partners with ParallelDots to Enhance Retail Shelf Monitoring with AI
https://analyticsindiamag.com/ai-news-updates/google-cloud-partners-with-paralleldots-to-enhance-retail-shelf-monitoring-with-ai/ | Tue, 03 Sep 2024

Google Cloud has announced a collaboration with ParallelDots, a leader in retail image recognition solutions, to deliver advanced, real-time AI solutions to global consumer packaged goods (CPG) manufacturers and retailers. The goal is to combine the strengths of both companies to unlock better data accuracy and simpler AI training for CPGs, which the companies expect to improve in-store execution, customer satisfaction, and sales. 

AI in Retail

Google Cloud’s secure infrastructure gives end users reliable performance and security. The integration will allow customers to deploy ParallelDots’ solutions quickly and easily, and to integrate ParallelDots’ shelf data within Google Cloud platforms, eliminating the costly and complex task of manual integration.

This partnership addresses losses in the retail industry, which is estimated to lose 25% of sales annually to poor in-store execution. The limitations of manual store audits also mean a dearth of real-time data, efficient audits, and timely reporting. In this context, advanced AI and image recognition (IR) solutions are gaining significance, as they address problems like missing SKUs, missing price labels, and incorrect product placement. 
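The audit step such IR systems automate can be sketched as a simple planogram check: compare the SKUs an image-recognition model reports against the expected shelf layout and flag the gaps. The data and function below are hypothetical illustrations, not ParallelDots’ ShelfWatch API.

```python
# Hypothetical sketch of an automated shelf audit: the IR model (not shown)
# yields a set of detected SKUs; we compare it against the planogram.
def audit_shelf(planogram: dict, detected: set) -> dict:
    """Return missing SKUs and on-shelf availability as a fraction."""
    expected = set(planogram)
    missing = sorted(expected - detected)
    availability = 1 - len(missing) / len(expected)
    return {"missing_skus": missing, "on_shelf_availability": availability}

# Illustrative planogram: SKU -> expected facings on the shelf.
planogram = {"COLA-500": 3, "CHIPS-90G": 2, "SOAP-75G": 4, "TEA-250G": 1}
detected = {"COLA-500", "SOAP-75G", "TEA-250G"}  # notionally from the IR model
report = audit_shelf(planogram, detected)
```

In a real deployment the detected set would come from image recognition over shelf photos, and the report would feed alerts and timely restocking, replacing the manual audits described above.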

Commenting on the partnership, Bikram Singh Bedi, Vice President and Country MD, Google Cloud India said, “Our aim is to offer our customers a secured infrastructure and capabilities to seamlessly perform complex tasks. With this collaboration we aim to empower ParallelDots to deliver unparalleled solutions to their customers. Our advanced cloud capabilities combined with their innovative technologies will simplify tasks, efficiently run complex problems and enhance cost efficiency.”

Ankit Singh, co-founder and CTO of ParallelDots, expressed confidence in the partnership. “This milestone marks a significant advance in delivering a world-class Retail Image Recognition solution to the global CPGs and retailers. Our partnership with Google Cloud enhances the reliability, security, and speed of our solutions, dramatically reducing deployment time, scaling our Image Recognition solution ShelfWatch, and boosting platform reliability and cost-effectiveness. This is a pivotal moment in our mission to create the world’s foremost retail shelf insights platform,” he said. Google Cloud’s strong presence in the retail space strengthens the collaboration in the marketplace, and its platform security and robust infrastructure make it easier for end customers to adopt ParallelDots’ technology.

DeepLearning.AI Offers New Course on Large Multimodal Model Prompting with Gemini
https://analyticsindiamag.com/ai-news-updates/deeplearning-offers-new-course-on-large-multimodal-model-prompting-with-gemini/ | Thu, 29 Aug 2024

Andrew Ng’s DeepLearning.AI, in collaboration with Google Cloud, has launched a new course named ‘Large Multimodal Model Prompting with Gemini’, aimed at providing learners with essential skills in prompting large multimodal models. Unlike large language models (LLMs), which accept only text prompts as input, large multimodal models (LMMs) like Gemini can take text, images, and video as input prompts; the course teaches how to combine these modalities to deliver more comprehensive and accurate outputs. 

The course covers effective techniques for multimodal prompting, as well as the differences and use cases for the Gemini Nano, Pro, Flash, and Ultra models. Taught by Erwin Huizenga, developer advocate for generative AI at Google Cloud, it also shows how to integrate Gemini with external APIs using function calling, along with best practices for building multimodal applications.


Innovation First

The ability of a single model to reason across text and images is quite new. Before LMMs such as Gemini, one common way to work with an image and text together was to use an image-captioning model to describe the image and then feed that caption to an LLM. With LMMs, the images themselves can be interpreted directly as inputs alongside the text.
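The contrast between the two approaches can be sketched with stubbed model calls. No real LLM or LMM is invoked here; the captions and answers are placeholder strings, and only the pipeline shapes differ.

```python
# Stand-ins for the three models involved; real systems would call model APIs.
def caption_model(image: bytes) -> str:
    return "a red car parked outside a house"   # stand-in image caption

def llm(prompt: str) -> str:
    return f"answer based on: {prompt}"         # stand-in text-only model

def lmm(prompt: str, image: bytes) -> str:
    return f"answer from pixels + '{prompt}'"   # stand-in multimodal model

image = b"\x89PNG..."                           # placeholder image bytes
question = "What colour is the car?"

# Older two-step route: lossy, because the LLM only ever sees the caption.
two_step = llm(f"{question} Image description: {caption_model(image)}")

# LMM route: the image itself is part of the prompt.
direct = lmm(question, image)
```

Anything the captioner omits (position, count, small text in the image) is lost to the LLM in the two-step route, which is the limitation LMMs remove.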

DeepLearning.AI is offering a flurry of new courses. This month alone, the company has unveiled a course on federated learning, which enables secure training on private data, in partnership with Flower Labs. A separate collaboration with Upstage helps students efficiently pretrain large language models, including cost-saving techniques like depth up-scaling.

With this diverse range of offerings, DeepLearning.AI is providing a valuable resource for those seeking to advance their skills in the ever-evolving field of AI. Rounding out the recent launches is a course on optimising retrieval augmented generation (RAG) in partnership with MongoDB. This course equips learners with the skills for building efficient and scalable RAG applications, covering techniques like vector search and prompt compression.

Also Read: Andrew Ng’s DeepLearning.AI Unveils New Course on Building AI Applications with Haystack

Tech Mahindra Partners with Google Cloud to Accelerate Generative AI Adoption
https://analyticsindiamag.com/ai-news-updates/tech-mahindra-partners-with-google-cloud-to-accelerate-generative-ai-adoption/ | Thu, 22 Aug 2024

Tech Mahindra, a global leader in technology consulting and digital solutions, has announced a strategic partnership with Google Cloud aimed at accelerating the adoption of generative AI and driving digital transformation across Mahindra & Mahindra (M&M) entities. 

This collaboration seeks to leverage cutting-edge AI and ML to enhance various aspects of engineering, supply chain, pre-sales, and after-sales services for M&M, one of India’s leading industrial enterprises.

As part of the partnership, Tech Mahindra will spearhead the cloud transformation and digitisation of M&M’s workspace, deploying the company’s data platform on Google Cloud. This effort is expected to revolutionise M&M’s operations by integrating advanced AI-powered applications into critical business areas. 

Notably, Google Cloud’s AI technologies will be used to detect anomalies during the manufacturing process, with the goals of zero breakdowns, optimised energy efficiency, enhanced vehicle safety and, ultimately, an improved overall customer experience.

Bikram Singh Bedi, vice president and country MD at Google Cloud, emphasised the importance of this collaboration, saying, “Google Cloud is committed to providing companies like M&M with our trusted, secure cloud infrastructure, and advanced AI tools. Our partnership with M&M will help enable a significant cloud and AI transformation for its enterprise and its global customers.”

The partnership will also see Tech Mahindra managing various enterprise applications and workloads for simulators, leveraging its expertise in analytics and cloud migration. This strategic move promises significant value to M&M’s global customer base, aligning with Tech Mahindra’s ongoing efforts to enhance productivity through gen AI tools.

Rucha Nanavati, Chief Information Officer at Mahindra Group, also welcomed the partnership.

Tech Mahindra and Big Cloud Partnerships

Tech Mahindra has been steadily partnering with big-tech cloud providers to leverage generative AI on their platforms. Recently, the company partnered with Microsoft to deploy dedicated Copilot tools across its workplace. 

Similarly, it has partnered with Yellow.ai to enhance HR and customer service automation. 

Google Maps API To Cost 70% Less Now
https://analyticsindiamag.com/ai-news-updates/google-maps-api-to-cost-70-less-now/ | Wed, 17 Jul 2024

At Google I/O Connect in Bengaluru on Wednesday, Google announced India-specific pricing reductions for the Google Maps Platform API. 

“Starting August 1, you can expect to pay up to 70% less on Google Maps Platform API,” said Ambrish Kenghe, vice president of Google Pay. The company also announced a special program in collaboration with Open Network for Digital Commerce (ONDC). 

“If you’re building for the ONDC, you might be eligible for up to 90% off on select map APIs,” Kenghe added.

New Collaborations

At the I/O summit, Google also announced new collaborations as part of its efforts to help the next generation of startups and developers solve real-world challenges. It announced that it is working with the MeitY Startup Hub to enable 10,000 Indian startups in their journey with AI. 

“As part of this effort, we are supporting eligible AI startups with up to $350,000 in Google Cloud credits to help them invest in AI infrastructure. We are also reorienting our existing programs, like the Startup School and AppScale Academy, to be AI-first,” Kenghe said. Kenghe also announced a nationwide Gen AI Hackathon and an AI startup boot camp.

Google’s Answer To Ola?

The move comes days after Ola replaced Google Maps with its in-house Ola Maps. In an X post, Ola CEO Bhavish Aggarwal wrote, “We used to spend 100 crores a year, but we’ve made that 0 this month by moving completely to our in-house Ola Maps!” Ola Maps is positioning itself as a cost-effective alternative to Google Maps.

Aggarwal is not the only one challenging Google Maps at the moment. Recently, ISRO chief S Somanath claimed that “ISRO’s Bhuvan is 10x better than Google Maps”.

In a bid to keep up the race, Google has introduced new features to Maps in India, including Lens in Maps and Live View walking navigation. With these features, users can see arrows, directions, and distance markers overlaid on the Maps screen, helping them quickly figure out which way to go. 

The tech giant also introduced Address Descriptors on Google Maps to help users understand addresses better, in a way they are used to in real life. Google is now experimenting with generative AI as well.

AWS, Google and Other Cloud Giants Go After AI Agents
https://analyticsindiamag.com/ai-trends/aws-google-and-other-cloud-giants-go-after-ai-agents/ | Sat, 13 Jul 2024

At the AWS New York Summit this week, the cloud provider announced that AI agents built through Amazon Bedrock would have enhanced memory and code interpretation capabilities. AWS’ AI and data vice president, Swami Sivasubramanian, said that this was part of a larger update to AWS’ overall GenAI stack available to their enterprise customers.

“At the top layer, which includes generative AI-powered applications, we have Amazon Q, the most capable generative AI-powered assistant. The middle layer has Amazon Bedrock, which provides tools to easily and rapidly build, deploy, and scale generative AI applications leveraging LLMs and other foundation models (FMs).

“And at the bottom, there’s our resilient, cost-effective infrastructure layer, which includes chips purpose-built for AI, as well as Amazon SageMaker to build and run FMs,” he said.

Agents built using Bedrock with these improved capabilities are notable because they can carry out complex, multistep tasks, such as automating the processing of insurance claims or booking flights for the business with prior knowledge of preferences.

As mentioned before, these agents now also have code interpretation abilities, which means they can generate and run code when the LLM deems it necessary, “significantly expanding the use cases they can address, including complex tasks such as data analysis, visualisation, text processing, equation solving, and optimisation problems”, the company said on the update. 
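The gating AWS describes, where generated code runs only when the model decides the task needs it, can be sketched with stubs. The decision rule, generated snippet, and function names below are invented for illustration; Bedrock’s actual mechanism is internal to the service.

```python
# Toy sketch of a "semi-autonomous" agent loop: code interpretation happens
# only when the (stubbed) model plans for it. All model calls are stand-ins.
def model_plans_code(task: str) -> bool:
    """Stand-in for the LLM deciding whether code interpretation is needed."""
    return any(k in task.lower() for k in ("compute", "average", "plot", "solve"))

def model_writes_code(task: str) -> str:
    return "result = sum([12, 7, 31])"          # stand-in generated code

def run_agent(task: str) -> str:
    if not model_plans_code(task):
        return f"text answer for: {task}"       # plain-text path
    namespace: dict = {}
    exec(model_writes_code(task), namespace)    # real systems sandbox this step
    return f"computed: {namespace['result']}"

no_code = run_agent("Summarise the meeting")               # text-only path
with_code = run_agent("Compute the total of these claims") # code path
```

The user never chooses the path; the model does, which is exactly the limited autonomy the article calls semi-autonomous. A production interpreter would also sandbox execution rather than calling `exec` directly.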

Despite these updates, AWS still seems to be slightly behind, as Azure also announced similar capabilities for enterprise AI agent building in April and GCP prior to that, though memory retention is not as seamless for agents built on GCP’s Vertex AI.

However, these rapid updates coming from the top three biggest cloud providers in the industry mean one thing: the next wave in the AI revolution is already underway.

Cloud > Generative AI > Building Agents

After the initial panic for businesses to move towards cloud-based systems to avoid going under, companies have quickly grown wise to how these systems can be leveraged to get the most out of their data.

Long story short, cloud providers identified these needs, deploying all-encompassing generative AI capabilities for their customers. As Sivasubramanian said, “They need a full range of capabilities to build and scale generative AI applications tailored to their business and use case.”

Now, the shift towards focusing on building AI agents and improving their overall capabilities signifies a larger need to seamlessly connect all of these services under an easy-to-use interface for employees.

The entire point of deploying generative AI for enterprises is to streamline the order of operations within a business. The focus on agents is particularly important because companies rely on the ability to customise and fine-tune their AI to fit specific, industry-relevant needs, especially as agents are malleable enough to execute varied tasks depending on the ask.

During Google Cloud Next 2024, CEO Thomas Kurian said, “Many customers and partners are building increasingly sophisticated AI agents.” With AI agents becoming all the rage, improving their capabilities has become a priority, which explains the slew of updates to agent-building capabilities in the last year alone.

What Can They Improve On?

These updates signify pretty exciting possibilities for what AI agents can do in the future. As is already the case, AI agents are a step towards achieving AGI. Whether that be in the near future or years away, agents still manage to reflect the best in terms of AI innovations. 

In its recent announcement, AWS specified that the agent’s code interpretation capabilities are used only when the LLM deems them necessary. Though this limits how the capabilities are used, since the choice is not up to the user, it also marks a form of semi-autonomy.

However, fully autonomous AI agents remain far off. “I think it’s going to require not just one but two orders of magnitude more computation to train the models,” said Microsoft AI chief Mustafa Suleyman.

Nevertheless, the enterprise focus on seamless operations and better customer experiences means that agent capabilities will continue to expand, potentially allowing them to act and execute tasks autonomously to produce relevant and digestible outputs for the company’s employees.

As Sivasubramanian has said of AWS, “We’re energised by the progress our customers have already made in making generative AI a reality for their organisations and will continue to innovate on their behalf.”

This seems to be the sentiment across the board for both GenAI and cloud providers, as many industry stalwarts, including Andrew Ng, Andrej Karpathy and Vinod Khosla, have voiced a need for more education around and funding in agent research.

Bewakoof Teams Up with Google Cloud to Bring GenAI in Indian Fashion
https://analyticsindiamag.com/ai-news-updates/bewakoof-teams-up-with-google-cloud-to-bring-genai-in-indian-fashion/ | Thu, 13 Jun 2024

Popular pop culture-based Indian clothing brand Bewakoof has announced a new collaboration with Google Cloud to design a collection of AI-generated t-shirts. 

This partnership leverages Google Cloud’s expertise in generative AI and machine learning capabilities to create unique and creative designs. The collaboration involves using Google’s AI tools to analyse trends, customer preferences, and other data to generate t-shirt designs.

Google Cloud provides advanced generative AI capabilities through LLMs like Gemini, enabling businesses to innovate with new content in text, images, and code. Emphasising responsible AI development, the company ensures ethical and secure use of such AI.

Bewakoof belongs to the TMRW House of Brands, an Aditya Birla Group venture. TMRW has acquired a majority stake of 70-80% in Bewakoof.

https://www.instagram.com/reel/C7rF5XqPT01/?utm_source=ig_web_copy_link

What will this Partnership Bring to the Table?

This collaboration showcases how technology can be integrated into fashion, pushing the boundaries of traditional design methods and introducing a modern, tech-driven approach to creating apparel. 

“We are excited to partner with Google to bring the power of GenAI to the hands of our consumers – enabling expression and personal connection,” said Prashanth Aluru, CEO at TMRW.

“Our generative AI solutions, especially the use of the latest Imagen model, provide the ideal foundation for Bewakoof to bring its creative image generation tool to life. We’re excited to see the unique ways their customers will embrace this technology,” said Bikram Bedi, VP and Country MD at Google Cloud India. 

Founded in 2012 by IIT Bombay graduates Prabhkiran Singh and Siddharth Munot, Bewakoof is an Indian e-commerce brand known for its trendy and affordable casual clothing.  

]]>
Soket AI Labs Partners with Google Cloud to Boost Pragna-1B Model https://analyticsindiamag.com/ai-news-updates/soket-ai-labs-and-google-cloud-launched-indias-first-multilingual-ai-model/ Wed, 15 May 2024 12:54:20 +0000 https://analyticsindiamag.com/?p=10120567 Pragna-1B, developed by Soket AI Labs and Google Cloud, delivers state-of-the-art performance for vernacular languages. ]]>

Soket AI Labs, the Indian AI research firm behind Pragna-1B, India’s first open-source multilingual foundation model, has announced a new collaboration with Google Cloud to further enhance the model’s capabilities and reach. Pragna-1B, which was initially released on May 1, 2024, aims to enable the adoption of Generative AI in India by providing support for vernacular languages such as Hindi, English, Bengali, and Gujarati.

Abhishek Upperwal, Founder of Soket AI Labs, said, “By leveraging Google Cloud, Pragna-1B, despite being trained on fewer parameters, is efficient and compares in performance on language processing tasks to similar-category models.”

He further added, “Tailored specifically for vernacular languages, Pragna-1B offers balanced language representation and enables faster and more efficient tokenization suited for organisations seeking optimised operations and enhanced functionality.”

The collaboration also aims to make Pragna-1B more accessible to developers and organisations. Soket AI Labs plans to list its AI Developer Platform on the Google Cloud Marketplace and the Pragna series of models on the Google Vertex AI model registry. This integration will provide developers with a streamlined experience for fine-tuning models using high-performance resources like Vertex AI and TPUs.

The model has been designed specifically with Indian contexts in mind, ensuring transparency and clarity for enterprises integrating AI into their operations. Soket AI Labs leveraged Google Cloud’s AI infrastructure to achieve efficiency and cost-effectiveness in the development of Pragna-1B.


The collaboration between Soket AI Labs and Google Cloud also extends to technical work on training large-scale models and curating high-quality datasets for Indian languages. This joint effort aims to promote AI innovation in India while ensuring transparency and clarity in the development process.

The story so far

Soket AI Labs, founded by Abhishek Upperwal in 2019, created ‘Bhasha,’ a series of high-quality datasets designed for training Indian language models. This includes ‘Bhasha-wiki,’ which consists of 44.1 million articles translated from English Wikipedia into six Indian languages, and ‘Bhasha-wiki-indic,’ a refined subset focusing on content relevant to India. 

Pragna-1B features a Transformer decoder-only architecture with 1.25 billion parameters and a context length of 2,048 tokens. Trained on approximately 150 billion tokens, with a focus on Hindi, Bangla, and Gujarati, it delivers state-of-the-art performance for vernacular languages in a small form factor.

In a recent LinkedIn post, Upperwal highlighted the improvements in GPT-4o’s tokenizer, whose vocabulary now spans 200k tokens. However, he noted that Pragna-1B’s tokenizer still outperforms GPT-4o’s on Kannada, Gujarati, Tamil, and Urdu, which serves as motivation for Soket AI Labs to improve support for Hindi and other Indian languages.
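The tokenizer advantage Upperwal describes can be made concrete with a rough fertility metric (tokens produced per word). The sketch below is purely illustrative and is not Pragna-1B’s actual tokenizer: it models the worst case, a byte-level fallback with no learned merges, where every UTF-8 byte becomes one token. Devanagari characters cost three bytes each, so Hindi text becomes far more expensive than English of similar length.

```python
def byte_fertility(text: str) -> float:
    """Tokens per word for a naive byte-level tokenizer with no merges,
    where every UTF-8 byte is one token (the worst case for Indic scripts)."""
    words = text.split()
    n_tokens = len(text.encode("utf-8"))
    return n_tokens / len(words)

# Devanagari codepoints encode to 3 bytes each in UTF-8, so the same
# two-word greeting costs over three times the tokens of its English counterpart.
print(byte_fertility("नमस्ते दुनिया"))  # 18.5 tokens/word
print(byte_fertility("hello world"))   # 5.5 tokens/word
```

A tokenizer trained on Indic corpora learns multi-byte merges for these scripts, pulling fertility down towards one or two tokens per word, which directly reduces context usage and inference cost.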

Soket AI Labs is also experimenting with a Mixture of Experts model, expanding the languages supported and exploring different architectures for increased optimisation. 

]]>
Zebra Brings Generative AI to the Frontlines with Google, Android, and Qualcomm https://analyticsindiamag.com/ai-news-updates/zebra-brings-generative-ai-to-the-frontlines-with-google-android-and-qualcomm/ Mon, 06 May 2024 13:05:55 +0000 https://analyticsindiamag.com/?p=10119721 Zebra's collaboration with Google, Android, and Qualcomm brings Gen AI to Android phones and tablets.]]>

Zebra Technologies has announced a partnership with Google Cloud, Android, and Qualcomm to bring generative AI capabilities to frontline workers across industries.

The collaboration integrates Zebra’s technological expertise with advanced AI from Google Cloud, hardware from Qualcomm, and software from Android. 

The new capabilities are designed to ease the cognitive load on front-line employees and help them make better decisions in real time.

By harnessing generative AI with domain-specific knowledge, frontline staff will soon have access to a chat experience on their handheld devices, allowing them to retrieve information and get answers to task-related queries easily.

Tom Bianculli, Chief Technology Officer at Zebra Technologies, emphasised the conversation shift around generative AI, moving from ‘how’ it works to ‘what’ it can achieve. 

He envisions a future where planning and execution systems merge seamlessly, accelerated by finely tuned, real-world AI models capable of scheduling tasks, responding to requests, and providing context-based recommendations.

A European supermarket chain has experienced this collaboration firsthand: with the AI model fed the company’s entire standard operating procedure (SOP) library, employees can now tap into a vast knowledge base derived from policies, procedures, and product information. 

This ‘always-on’ digital assistant has the potential to reduce time to competency, ensure consistent best practices, improve customer interactions, and enhance employee satisfaction.

Rouzbeh Aminpour, Global Retail Solution Engineering Manager at Google Cloud, emphasised the fundamental change generative AI brings to organisations, fuelling a new era of customer and employee interactions with businesses and brands.

Beyond this partnership, Zebra also worked with Qualcomm to showcase how its phones and tablets could run a large language model (LLM) without needing connectivity to the cloud.

]]>
Microsoft Eats into Amazon’s Cloud Market Share  https://analyticsindiamag.com/global-tech/microsoft-eats-into-amazons-cloud-market-share/ Fri, 03 May 2024 12:39:11 +0000 https://analyticsindiamag.com/?p=10119562 Microsoft Azure inches closer to Amazon with 25% cloud market share. ]]>

The $76-billion global cloud infrastructure services market has once again been captured by the big three with a 67% combined market share. Amazon continues to dominate the cloud market with a 31% share, taking a 1% hit from the previous year. Microsoft, on the other hand, has been surging forward. 

Microsoft Azure is the King

Microsoft Azure has shown steady growth in the cloud sector, capturing an increasing share of the market. The recent quarter’s cloud revenue was $35.1 billion, up 23% year-on-year (YoY). The company is closely trailing Amazon with a global cloud market share of 25%.

Microsoft’s large spread of AI offerings across its enterprise suite is proving to be its golden egg (the goose being OpenAI). 

“Our AI innovation continues to build on our strategic partnership with OpenAI. More than 65% of the Fortune 500 now use Azure OpenAI Service,” said Microsoft chief Satya Nadella, in a recent earnings call.  

Nadella also confirmed that the quantity of Azure deals valued at over $100 million rose by over 80% compared to the previous year, while the number of deals exceeding $10 million more than doubled. 

Guided by Nadella’s strategic brilliance, Microsoft’s cloud share has been advancing by 1% each quarter, mirroring the deliberate steps of the king on a chessboard.

Copilot Mode ON

Microsoft’s Copilot is proving to be the backbone for AI-powered products for its customers. “30,000 customers across every industry have used Copilot Studio to customise Copilot for Microsoft 365 or build their own, up 175% quarter-over-quarter,” said Nadella.  

In the earnings announcements, Nadella spoke at length about Copilot’s applications across domains. He claimed that almost 60% of Fortune 500 companies use Copilot and that adoption has accelerated across industries, with companies such as Amgen, BP, Cognizant, Koch Industries, Moody’s, Novo Nordisk, NVIDIA, and Tech Mahindra each purchasing over 10,000 seats.

“We’re not stopping there. We’re accelerating our innovation, adding over 150 Copilot capabilities since the start of the year,” said Nadella. 

While Microsoft skyrockets, Google has maintained its 11% share of the cloud market.


Google Cloud Remains Resilient

Google witnessed staggering growth in the recent quarter with 15% revenue growth YoY and a net income of $23.7 billion, which is a jump of 57% from the previous year. The company attributes a considerable chunk of growth to Google Cloud. 

“Today, over 60% of funded GenAI startups and nearly 90% of GenAI unicorns are Google Cloud customers,” said Google chief Sundar Pichai. The company posted an operating income of $900 million on cloud services. The company even acknowledged that the growth across the cloud is underpinned by the benefits of AI.

In its cloud business, Google has announced over 1,000 new products and features in the past eight months. 

AI Integration Continues for AWS 

Though Amazon saw a 1% dip in the recent results, it is not backing down in any way. AWS’s segment sales increased 17% YoY to hit $25 billion, and the company has been investing extensively in bringing AI to its platform.

Recently, AWS announced the general availability of Amazon Q, the company’s most advanced AI-powered assistant. Amazon Q comes in three forms – for developers, for enterprises, and as Q Apps – enabling companies to build generative AI apps using their company data.

“The combination of companies renewing their infrastructure modernisation efforts and the appeal of AWS’ AI capabilities is reaccelerating AWS’ growth rate,” said Andy Jassy, Amazon’s president and CEO. AWS is at a $100 billion annual revenue run rate. 

Amazon Bedrock, AWS’s generative AI service that allows users to leverage the latest LLMs for building AI applications, also witnessed remarkable numbers in the recent quarter. Amazon confirmed that thousands of organisations worldwide are using Amazon Bedrock. 

]]>
HCLTech Launches Strategic Initiative with Google Cloud to Scale Gemini  https://analyticsindiamag.com/ai-news-updates/hcltech-launches-strategic-initiative-with-google-cloud-to-scale-gemini/ Wed, 03 Apr 2024 08:26:33 +0000 https://analyticsindiamag.com/?p=10117589 HCLTech will enable 25,000 engineers on Google Cloud’s latest GenAI technology to better support clients at every stage of their AI projects]]>

HCLTech announced an expanded alliance with Google Cloud to create industry solutions and drive business value with Gemini, Google’s multimodal large language AI model.

HCLTech will enable 25,000 engineers on Google Cloud’s latest GenAI technology to better support clients at every stage of their AI projects. This includes developing new use cases and capabilities for HCLTech platforms and product offerings, initially focusing on bringing GenAI capabilities to clients in manufacturing, healthcare, and telecom.

The IT giant recently launched HCLTech AI Force, a pre-built GenAI platform that optimises engineering lifecycle processes from planning through development, testing and maintenance. 

HCLTech will now enhance the HCLTech AI Force platform with Gemini’s advanced code completion and summarisation capabilities, which will allow engineers to generate code, remediate issues and accelerate the delivery time and quality of software projects for clients.

It will also use Gemini models to strengthen and expand the portfolio of industry solutions built out of its dedicated Cloud Native Labs and AI Labs, which focus on accelerating client innovation and are staffed by leading AI experts and engineers. Both labs will enable clients to better scope, manage and refine gen AI projects on Google Cloud’s infrastructure.

“HCLTech and Google Cloud have a long-standing strategic partnership. This collaboration will bring to market HCLTech’s innovative GenAI solutions using Google’s most capable and scalable Gemini models. We believe this helps us to bring even more value to global enterprises through HCLTech’s differentiated portfolio,” said C Vijayakumar, CEO and managing director, HCLTech.

]]>
Google Cloud, Bhashini, and MachineHack Come Together for Bhasha Techathon https://analyticsindiamag.com/ai-highlights/google-cloud-bhashini-and-machinehack-come-together-for-bhasha-techathon/ Fri, 08 Mar 2024 06:00:00 +0000 https://analyticsindiamag.com/?p=10115179 The techathon is scheduled to take place between 8th March and 15th May, 2024. ]]>

Unleash the power of language diversity with the Bhasha Techathon, presented by Digital India Bhashini Division in partnership with Google Cloud and MachineHack. This groundbreaking event is where innovation converges with real-world impact.

Since its launch by Prime Minister Narendra Modi in 2022, Bhashini has revolutionised digital accessibility by providing services and internet access in various Indian languages, including voice-based functionalities. It has already been making waves across multiple sectors. Take, for instance, AskDISHA, the chatbot introduced by the Indian Railway Catering and Tourism Corporation (IRCTC), which now harnesses Bhashini’s capabilities.

Witness the transformative potential of Bhashini firsthand as Prime Minister Modi demonstrates its versatility. From addressing gatherings in multiple Indian languages with effortless grace to seamlessly translating speeches, as seen at the Kashi Tamil Sangamam event in Varanasi last December, Bhashini is shaping the future of digital communication in India.

For more information visit our website: https://bhashatechathon.com/

Start Date: 8th March 2024

End Date: 15th May 2024

Register Now

About the Hackathon 

The Techathon is set to address key challenges in NLP and seeks to cultivate indigenous solutions to language-specific hurdles. 

Problem Statement Categories

1. Chatbot Assistance in Regional Languages (Government e-MarketPlace): Develop a multilingual chatbot for the Ministry of Panchayati Raj, supporting 22 Indian languages and enhancing user interaction and outreach.

2. Conversion of FAQs Section (GeM): Make FAQs on a website accessible in 22 Indian languages, featuring translation, transliteration, and a multilingual chatbot. It includes NLP, search functionality, categorisation, and multimedia integration for improved user engagement and satisfaction.

3. Voice-to-Text and Bucketing of Complaints (Centre For Railway Information Systems): Automate complaint handling by converting voice messages to text in 22 languages and categorising them using AI/ML for improved efficiency and accuracy.

4. Video-to-Text and Bucketing of Complaints (CRIS): Automate video complaint analysis through video-to-text conversion and AI/ML categorisation to streamline processes and improve customer satisfaction.

5. CDSS in Multiple Indian Languages (National Health Authority): Enhance Clinical Decision Support Systems (CDSS) accessibility by developing multilingual versions for 22 Indian languages to improve patient care and support digital health initiatives.
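As a toy illustration of the complaint-bucketing idea in categories 3 and 4, here is a minimal keyword-based router over already-transcribed text. The bucket names and keywords are hypothetical; a real solution would pair speech-to-text across the 22 languages with a trained multilingual classifier, not English keyword matching.

```python
# Hypothetical buckets and trigger keywords, for illustration only.
BUCKETS = {
    "cleanliness": ("dirty", "unclean", "garbage"),
    "delay": ("late", "delayed", "waiting"),
    "refund": ("refund", "charged", "payment"),
}

def bucket_complaint(transcript: str) -> str:
    """Assign a transcribed complaint to the first bucket whose
    keyword appears in the text, defaulting to 'other'."""
    text = transcript.lower()
    for bucket, keywords in BUCKETS.items():
        if any(word in text for word in keywords):
            return bucket
    return "other"

print(bucket_complaint("My train was delayed by three hours"))  # delay
print(bucket_complaint("The coach was very dirty"))             # cleanliness
```

Even this crude routing shows why automated bucketing helps at railway scale: each complaint lands in a queue owned by the right team instead of a single undifferentiated inbox.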

Challenge Stages

1. Submission and Selection: Participants submit their approaches for each category.

2. Presentation and Evaluation: Selected participants present their solutions at an in-person event before the jury.

3. Winner Announcement and Prize Distribution: Winners are announced, and prizes are distributed.


Awards and Achievements

  • Accelerate Your Career: Develop and implement your solution for adoption across government bodies.
  • Audience Reach: Gain exposure on a widely-viewed platform, allowing you to present and market your innovation to leaders from various sectors of the Indian industry.
  • Expand Your Horizons: Connect with peers in your field and stay updated on the latest developments in the ecosystem.
  • Acknowledgement and Incentive: Earn substantial prize money at different program stages. The Hackathon winners will receive prizes of INR 50,000 for first place, INR 35,000 for second place, and INR 15,000 for third place.

Who Can Participate

  • Working professionals
  • Startups
  • Entrepreneurs
  • Students
  • Innovators
  • Freelancers


]]>
Kyndryl Expands Partnership to bring Google Gemini for Generative AI Solutions https://analyticsindiamag.com/ai-news-updates/kyndryl-expands-partnership-to-bring-google-gemini-for-generative-ai-solutions/ Thu, 08 Feb 2024 11:45:14 +0000 https://analyticsindiamag.com/?p=10112251 Kyndryl is expanding its partnership with Google Cloud to implement generative AI solutions for enterprise customers ]]>

Kyndryl, the leading provider of IT infrastructure services globally, today announced an expanded collaboration with Google Cloud aimed at developing responsible generative AI solutions and accelerating customer adoption.

Since 2021, Kyndryl and Google Cloud have collaborated to help transform global enterprises using Google Cloud’s sophisticated AI capabilities and reliable infrastructure. The next stage of this partnership will concentrate on integrating Google Cloud’s in-house AI technologies, such as Gemini, its most powerful large language model (LLM), with Kyndryl’s proficiency and managed services to create and implement generative AI solutions for clients. 

Kyndryl’s advisory and implementation services will help clients identify optimal generative AI use cases and data foundations, leveraging their expertise in Google Cloud technology for business transformation. Kyndryl also plans to offer its new LLMOps Framework to Google Cloud customers, facilitating responsible and cost-effective solutions for common challenges in generative AI adoption.

Kyndryl intends to utilise the Google Cloud Cortex Framework to enhance the business value derived from customers’ Enterprise Resource Planning (ERP) data on Google Cloud, aiming to enhance productivity, foster innovation, and offer improved business insights, ultimately driving new outcomes for customers.

“Given Kyndryl’s data services expertise, along with our more than 30 years of experience managing large enterprise environments, Kyndryl understands the complexities in moving a generative AI solution from an idea into production. By combining this unique perspective and the Kyndryl Responsible AI Principles with Google’s AI history and in-house generative AI capabilities, we can quickly and responsibly bring this new generation of AI to customers and drive their business value,” said Nicolas Sekkaki, Kyndryl’s Global Applications, Data and AI Practice Leader. 

In an earlier exclusive interview with AIM, Naveen Kamat, Vice President & CTO of Data and AI Services at Kyndryl, spoke about how the company is charting its course in IT and AI. “Apart from our skills, experience and resources, we also have been investing into our partner ecosystem, which includes Microsoft, and AWS, apart from data platforms like CloudEra and Databricks. We bring in additional differentiation when we work with clients with our IP, assets and accelerators.” 

]]>
Samsung ‘Galaxy AI’ was Not About Samsung https://analyticsindiamag.com/ai-trends/samsung-galaxy-ai-was-not-about-samsung/ Thu, 18 Jan 2024 11:30:00 +0000 https://analyticsindiamag.com/?p=10111068 It was AI, provided by another big tech]]>

The much-publicised ‘Galaxy Unpacked’ event, where Samsung had promised to unveil ‘a new era of phone’, took place yesterday in San Jose, and, yes, it was packed with a lot of ‘AI.’ While the event was all about Samsung, the true protagonist was another big tech company, without whom the AI capabilities the phone maker is betting big on would not have happened. Enter Google.  

Unveiling its new Galaxy S24 series – the Samsung Galaxy S24, S24 Plus, and S24 Ultra – the company focused on showcasing all-new AI capabilities on the smartphones, calling them Galaxy AI. However, the integration of Google’s advanced AI features on the phones took the limelight. 

Google All The Way

When Samsung teased the sparkle-shaped Samsung Galaxy AI logo, its uncanny resemblance to Google Bard’s logo drew everyone’s attention. At the Galaxy Unpacked event, the suspicion was confirmed when the company announced Google’s latest AI-enabled features on the new smartphones. Almost all the new features announced in the S24 series are powered by Google.

The new ‘Circle to Search’ feature was separately but simultaneously announced by Google and Samsung. It lets users draw a circle around any object in an image or video and get AI-powered information from Google; with a simple gesture, a user can search for anything without switching apps. Live Translate is another feature, where the integrated AI model provides real-time voice and text translation during calls in the native phone app. 

While the new AI-enabled features sound impressive, the biggest announcement is the integration of Gemini on Samsung phones. 

Gemini to See the Light of the Day

Google Gemini, considered Google’s most capable and advanced multimodal AI model and released last month, will be integrated on the Galaxy S24 phones. The series will have built-in Gemini Nano, a model optimised for on-device tasks. 

Furthermore, Samsung will be the first Google Cloud partner to deploy Gemini Pro, Google’s AI model for a wide range of tasks, and Imagen 2, Google’s advanced text-to-image diffusion technology, on Vertex AI. Samsung phone applications such as Notes, Voice Recorder and Keyboard will utilise Gemini Pro for summarisation features. 

Samsung will also be one of the first customers to test Gemini Ultra which is the most capable and largest AI model for highly complex tasks. Thomas Kurian, CEO of Google Cloud, considers the Samsung-Google Cloud partnership as a ‘tremendous opportunity for generative AI to create meaningful mobile experiences.’ With Gemini, Samsung smartphones can leverage Google Cloud’s infrastructure and advanced AI capabilities. 

Samsung’s Diversified Partnerships

CES 2024 – Samsung Keynote Speech. Source: The Verge

Samsung’s strong collaboration with Google will help power its devices with the latest AI-enabled features; however, that has not stopped the smartphone maker from tying up with other big tech giants. 

At the latest CES 2024 that concluded last week, where Samsung revealed new devices, the company announced its strategic partnership with Microsoft to bring Copilot features. Microsoft Copilot capabilities are set to arrive in Galaxy phones in March. 

Furthermore, the company also said that the Galaxy smartphone camera can be used as a webcam on a PC. The webcam support will allow the usage of both front and rear facing cameras and enable users to apply background blur and other customisations that are allowed in video calls.

With the recent release of Copilot Pro, which offers a number of AI-enabled features, Samsung’s partnerships with Microsoft on one side and Google on the other are enabling the company to experience the best of both worlds.  

Tweet by co-founder and CTO of Lattice Labs. Source: X

The Android Upgrade 

Beyond the AI-enabled features powered by Google, another big development from the house of Google is in the OS segment. Samsung phones will now receive seven years of security updates and seven generations of OS upgrades, starting with the S24 series. 

This is a significant upgrade from a previous announcement: in early 2022, Samsung had promised four generations of Android OS upgrades and five years of security updates, starting with the S21 series. 

Interestingly, Google had also announced seven years of software updates for their latest Pixel 8 and Pixel 8 Pro phones that were released in October last year. With Samsung also announcing the same, the competition in the Android smartphone market will still continue between both companies. 

However, given the large market share that Samsung and Apple already hold in the smartphone market, Google’s Pixel share barely registers. It is evident that, despite being in the phone market, Google’s AI models for smartphones are not solely for its own customers. By tapping into the market of large Android phone makers, Google’s AI offering is only going to be a win for the company. 

Considering how Google is providing Gemini exclusively (barring Pixel) for the Samsung S24 series, the collaboration might have involved huge finances as well. Samsung, however, will be desperate to reap the benefits of the S24 collaboration with Google, having recently lost its smartphone market lead to Apple by 0.6 percentage points.

]]>
Cypher 2023: Key Highlights (Day 1) https://analyticsindiamag.com/ai-trends/cypher-2023-key-highlights-day-1/ Thu, 12 Oct 2023 04:10:42 +0000 https://analyticsindiamag.com/?p=10101376 Cypher day 1 was successful with more to come!]]>

Cypher 2023 was jam-packed. Day one of the event witnessed more footfall than we anticipated, with over 1,500 participants and 600+ companies taking part in India’s biggest AI conference.

The event kicked off with Karthik Ranganath, general manager of Business IT at Shell R&D, talking about unleashing AI innovations for the better.

In his keynote discussion, Ranganath spoke about how Shell is harnessing the power of AI to make the energy sector more efficient, alongside sharing its partnerships with multiple AI startups to help tackle some of the pressing challenges in the energy sector. 

This was followed by a talk on “Digital Minds” by Jacy Reese Anthis, co-founder at the Sentience Institute, who emphasised the relationship between humans and machines in the backdrop of generative AI advancements.

He explained that there is a profound shift in not only how we interact with computers but also how we as a society interact with each other. “We talk to the computer like we talk to a friend with natural language instead of writing commands in code,” he said. 

With the increase in chatbots, there is a danger of forming attachments to computers that take on human-like characteristics through their language. His philosophical questions were thought-provoking and left the audience reflecting on his ideas. 

George Kuruvilla, the chief data platforms evangelist at SingleStore, shared how the company has built one of the best real-time data management systems, eliminating the need to run to multiple vendors to store, manage and harness data. 

He explained the real-time ease of working with a single platform. He also spoke of the issues that startups face in data management while also explaining how their larger customers work with extensive data. 

Biren Ghose, with his jovial personality, really connected with the audience.

Ghose spoke about how his company, Technicolor Creative Studios, has employed AI tools to create innovative animations and visualisations. He showcased this with stunning videos played while talking about the how and the why of the entire creative process. 

Moving on to a more instructive session, Abhishek Nandy, chief data scientist at PrediQt Business Solutions Pvt. Ltd and a certified Intel Corporation instructor, gave a workshop on the AI kit tailored for Intel® architecture, showing data scientists and AI developers how to deploy AI models, particularly LLMs, seamlessly. 

He also touched upon the Intel® Developer Cloud, access to Ponte Vecchio instances, and practical demos using LLMs with LangChain, OpenAI, and Stable Diffusion.

Moumita Sarker from Deloitte spoke about how organisations can effectively balance the need for speed to market with the requirement for robust AI model development and testing. While the latter does take time, there is an urgency to have a product. She shared, “AI has to be served in modules, organisations have to make it easily pluggable. Spend more time on the pilot, spend more time on the user co-creation. Make it valuable, give them the option to customise.”

One of the panel discussions explored how AI has barged into every field, and into everyone’s life and workplace. The conversation was between industry experts Akanksha Singh, Jayachandran Ramachandran, Vinodh Ramachandran and Chirag Jain.

The discussion was largely around how the adoption of AI has affected employees and how to navigate this introduction positively. 

In addition to this, it also touched on the legal, academic, and enterprise perspectives and the panellists also elaborated on the best work practices. 

“The management should provide resources and clearly explain the vision of the organisation. It is the responsibility of anyone in the leadership position to instil faith and rally the organisation towards a common goal,” Vinodh Ramachandran clearly summed up.

Lastly, Jonty Rhodes, the South African cricketer and legendary fielder, spoke on the role of data gathering and analytics in giving players an edge in cricket. The retired player, now a coach to IPL teams, touched upon different aspects of analysing players. This inevitably improves strategy, he said, but not without the peril of too much information, which can lead to decision paralysis. 

The ever-humble Rhodes was in high spirits and spoke about his love for India. He also gave away the Minsky Awards to some of the exemplary leaders and companies in AI and analytics for their contribution and impact.

]]>
NVIDIA Expands Cloud Business with Investments, Partnerships https://analyticsindiamag.com/global-tech/hugging-face-users-get-a-much-needed-hug-from-nvidia/ Fri, 15 Sep 2023 09:59:53 +0000 https://analyticsindiamag.com/?p=10100136 With NVIDIA partnership, Hugging Face users get access to SOTA GPUs and infrastructure needed to rapidly train and finetune foundation models at scale and drive a new wave of enterprise LLM development. ]]>

Hugging Face is steadily growing into the one-stop solution for AI models, even more so after its partnership with NVIDIA. Recently, the companies announced that Hugging Face users will have access to NVIDIA DGX Cloud AI supercomputing to train and fine-tune their AI models. 

With this partnership, Hugging Face users get access to state-of-the-art GPUs and the infrastructure needed to rapidly train and fine-tune foundation models at scale, driving a new wave of enterprise LLM development. 

Hugging Face has broken down the costs of building models on DGX by parameters, tokens, and datasets, with estimates ranging from $32,902 to $18,461,354, making the process more transparent. “We hope to spur a new wave of experimentation and learning in AI – exploration that simply wasn’t feasible before,” said Julien Chaumond.

NVIDIA CEO Jensen Huang also acknowledged the immense potential of Hugging Face as an AI enabler and said, “I think there are 50,000 companies with 2 million users and there’s some 275,000 models and 50,000 datasets. Just about everybody who creates an AI model and wants to share it with the community puts it up on Hugging Face.” 

NVIDIA, now a stakeholder in Hugging Face, is further driving its own products through the platform, and is likely to reap revenue benefits this year and next on the back of massive customer investments. 

While Hugging Face got a big boost from this partnership, NVIDIA has its own plan: growing the user base of DGX Cloud by inviting in the open-source community and accelerating AI innovation at scale. 

Is NVIDIA Eating Cloud? 

NVIDIA has positioned itself smartly between cloud service providers and their customers. 

In March, NVIDIA announced that it is partnering with leading cloud service providers to host DGX Cloud infrastructure, starting with Oracle Cloud Infrastructure (OCI). The company also said that Microsoft Azure and Google Cloud will host DGX Cloud soon. Thanks to the partnership, users can quickly access GPU servers and DGX Cloud without committing to multiple cloud vendors.

While NVIDIA insists the partnership is a shared success, it is clearly at odds with other cloud providers. Using DGX as a “single software platform” from NVIDIA lets a company streamline the operation of its AI software across various cloud providers and within its own data centres, enhancing efficiency.

Additionally, NVIDIA’s DGX Cloud servers are built by engineers with intimate knowledge of the company’s chips, putting them in a better position to fine-tune the machines. As a result, they outperform the AI-centric servers rented out by other cloud providers, as confirmed by individuals closely acquainted with the service. 

Interestingly, NVIDIA made a similar partnership offer to AWS, which refused. Joshua Bernstein, a former manager at AWS and Google Cloud, commented, “It puts NVIDIA’s brand front and centre over a cloud provider’s brand.” AWS is the biggest player in the cloud sector and is already a formidable competitor with its EC2 P5 service. 

NVIDIA for Startups

NVIDIA is slowly building up its foothold in the AI startup and enterprise ecosystem. The startups themselves might be slow to arrive, but NVIDIA is making sure to enable them. 

Last year, NVIDIA open-sourced parts of its GPU software for Linux after criticism that it was not open-source friendly. 

The company also said it is making code run more efficiently across different types of processors, such as CPUs, GPUs, and AI accelerators. To support open-source projects, it maintains a team of engineers who help build and support its services. This inevitably helps startups and enterprises build on top of those services as a comprehensive solution. 

Apart from its DGX Cloud capabilities, NVIDIA offers a host of cloud products and services at reasonable cost for enterprises. It unveiled a suite of cloud-based services tailored to developing generative AI models specialised for specific domain tasks, such as medical imaging. 

These services fall under the umbrella of NVIDIA AI Foundations and encompass two distinct offerings: NVIDIA NeMo, dedicated to language models, and NVIDIA Picasso, geared towards generating image, video, and 3D content. Once these models are prepared for deployment, businesses have the flexibility to run them either within NVIDIA’s cloud infrastructure or on other platforms of their choice.

Source: NVIDIA NeMo

NVIDIA is also heavily investing in AI startups. It has invested in more than 20 companies this year apart from Hugging Face, most recently Databricks, which announced yesterday that it received funding from NVIDIA and Capital One. These agreements help NVIDIA keep AI companies loyal to its products. The majority of these startups and enterprises are already NVIDIA customers, from which the company stands to recoup its investments. 

]]>
Google Turns AI ‘Bold & Responsible’ https://analyticsindiamag.com/ai-features/google-turns-ai-bold-responsible/ Wed, 30 Aug 2023 04:30:00 +0000 https://analyticsindiamag.com/?p=10099235 The tech goliath has been carrying around the ‘bold and responsible’ placard since 2023 began ]]>

Nearly half the planet uses Google and trusts the tech company to keep its data safe and secure. Yet beyond its search engine, Google’s other popular products and services have long haunted the company, with data breaches, leaks, and privacy scandals becoming commonplace since the infamous 2018 Google+ API breach.

In 2023, the company made sure its ‘bold and responsible’ approach remained centre stage at all its events. Within the first eight months, Google has already released 49 security blog posts, with the most (13) in May. Each post introduced a batch of updates, ranging from AI-powered projects to Android OS. 

At Google I/O alone, the company announced 11 new security updates, largely focused on AI and its large language models. The updates included a Safe Browsing API, which uses AI to identify unsafe sites and alert users to avoid scams, and a similar tool that lets users know if their data is being misused on the dark web. 

The Next security overhaul

Yesterday, at Google Cloud’s Next ’23 conference, Sunil Potti, general manager and VP of cloud security, revealed GCP’s security strategy. The announcements build on Security AI Workbench, which leverages Sec-PaLM, Google’s security-specific large language model (LLM), released in April.

Potti said the new strategy is built on three pillars: leveraging Mandiant expertise, infusing security into Google Cloud innovations, and making that expertise available across environments. “The ongoing challenges in security include evolving threat landscapes, increasing number of security vendors and tools, and scarcity of security talent,” he said, explaining the potential of generative AI to address those challenges. The announcement covered the technology’s applications across different security pillars: Security AI Workbench, Chronicle security operations, and the cloud security console.

Overall, Potti discussed the strategy of incorporating AI and expertise to enhance security across different aspects of Google Cloud. The presentation aimed to provide a structured approach for securing and leveraging AI in the security domain.

Bold & responsible – from Mountain View to Bangalore

Google has been carrying around the ‘bold and responsible’ label ever since its executives started talking up generative AI at the beginning of 2023. The new mantra has been repeated over and over at every conference from Mountain View to Bangalore.

In fact, at the I/O, a drinking game emerged: take a sip every time a speaker utters the words “responsible AI”. And this did not stop in California; the first-ever I/O Connect event in Bangalore was a carbon copy. As a responsible parent in the AI world, Google made sure every announcement was spiked with a hint of the responsible approach. Every speaker, from the CTO of Google Cloud to the VPs of Google Pay and Android, made sure to emphasise the ethical use of AI.

While the company has been portraying itself in a ‘don’t be evil’ light, its researchers have raised several red flags in the recent past over internal shenanigans at the search giant led by Sundar Pichai. James Manyika, the company’s head of tech and society, recently spoke to The Washington Post about the downsides of AI. 

Before addressing a packed arena, he discussed the scourge of misinformation and how AI has become an echo chamber reflecting society’s misdemeanour. Manyika warned about the emergence of new problems as the technology improves. As he stepped on stage, the words ‘bold and responsible’ ironically flashed on the audiences’ screens.

In an attempt to regain pole position on the AI-first podium, the Google Search creator has been trying to ship products without much oversight. Even though the company believes that ethics cannot be an afterthought, its actions have spoken differently for a while. As technology progresses, so does the risk of its misuse. Given Google’s history of antitrust cases and data-theft scandals, the company looks happy carrying around the ‘bold and responsible’ placard for now. 

]]>
How Many Jobs has AI Actually Gobbled Up? https://analyticsindiamag.com/ai-features/how-many-jobs-have-ai-actually-gobbled-up/ Tue, 25 Jul 2023 10:03:01 +0000 https://analyticsindiamag.com/?p=10097515 Given the anxiety over AI replacing jobs, let's find out how much of it has actually happened this year]]>

“AI will not replace you, but someone who uses AI will.”

Like a warning label stuck on hazardous substances, the above line has been floating around for quite some time now. Economists and tech leaders have been ringing the death knell for a while, cautioning about AI as the potential trigger of an impending apocalypse in the job market. With predictive stats claiming certain jobs will be gone within an ‘xyz’ timeframe, a reality check can help keep anxieties in check. So what’s the brouhaha all about? Has AI, or someone using AI, replaced you? 

Executive outplacement and career consulting firm Challenger, Gray & Christmas attributed 4,000 job losses in May to artificial intelligence, the first time the company has cited AI as a cause of job losses. 

Even though there have been massive layoffs in the past six months owing to the economic downturn, including at big tech companies such as Microsoft and Meta, none of them has pointed to AI as a cause. However, observing how each of those same companies is adopting generative AI in its workflow, it is not difficult to connect the dots. That said, trends in the job market paint another picture. 

Generative AI Fuels Job Market

With the massive adoption of generative AI in enterprises, fuelled by either anxiety or enthusiasm, AI has indeed kick-started a job evolution in the market. As per AIM Research, the generative AI job market has witnessed steady growth from January to June this year. Generative AI-related job postings in the United States are said to have risen by 20% in May, and openings climbed from 3,000 in April to 4,500 in June. The IT sector has posted the highest number of generative AI roles.

The figures may indicate that jobs are not declining, but job roles have been modified to suit the current wave. For instance, the role of a ‘generative AI engineer’, which never existed before, requires skills drawn from several fields. It encapsulates the roles of a deep learning engineer, an ML engineer, an NLP engineer, and a software engineer. 

New roles that didn’t exist earlier are also sprouting in full vigour. The role of prompt engineer, a result of the chatbot revolution, has seen an uptick as companies increasingly seek such hires. With the role offering salaries higher than those of Python developers, the job market is looking positive. The massive shift brought by AI has also sparked a debate on how an entire generation will study for jobs that won’t exist. It might not be an overstatement to say that certain job roles may become redundant, but that is mostly because the nature of those roles is changing. 

Together We Grow 

Enterprises are approaching the generative AI rage in a coalition of sorts, not via replacements, but through implementation and by training their employees to tame the system. TCS, which initially partnered with Google Cloud for its generative AI services, recently partnered with Microsoft Azure to train 25,000 engineers on Azure OpenAI. 

In addition to implementation and training, enterprises are also building ways to help other companies thrive on generative AI. Tech Mahindra, which partnered with Microsoft to enable generative AI-powered enterprise search, unveiled its Generative AI Studio to help other enterprises kick-start their generative AI efforts. Other IT companies have followed suit. 

AI Over Humans? 

While most companies have found ways to work around the generative AI job buzz, some have openly embraced AI over human resources. Dukaan, a platform that enables merchants to set up e-commerce businesses, recently laid off 90% of its support staff, replacing them with its new AI chatbot. The company claims to have saved costs and reduced customer resolution time since the move. Telecom company British Telecommunications said that over 55,000 jobs will be cut by the end of the decade, a fifth of them in customer service, where AI will replace staff.

There are also industries that have no choice but to embrace generative AI, the travel industry being one of them. Hit hardest by the pandemic, the industry is now slowly recovering, and companies across it have integrated AI chatbots, ChatGPT plugins, and other features, with AI as the saviour. 

While some companies and industries rely on AI, Zerodha, on the other hand, is all out to safeguard its employees from any AI-driven job takeover. The company has been clear that it will adopt AI only where it deems necessary, and never at the cost of someone’s job. With a few companies letting AI replace jobs, and many others creating new roles or embracing AI to empower employees without threatening their positions, it is fair to say that AI is becoming integral to all jobs. Whether it will be a deciding factor in safeguarding one’s job, however, remains inconclusive.

]]>
Park+ & Google Cloud Collaborate to Enhance Smart Parking Solutions https://analyticsindiamag.com/ai-news-updates/park-google-cloud-collaborate-to-enhance-smart-parking-solutions/ Wed, 19 Jul 2023 11:56:45 +0000 https://analyticsindiamag.com/?p=10097240 Park+ is leveraging the power of open source services on Google Cloud to build custom integrations and dashboards or orchestration pipeline]]>

Google Cloud and Gurgaon-based car tech company Park+ have partnered to support Park+’s open-source software needs with Google Cloud offerings including Cloud SQL, Google Kubernetes Engine, Anthos, and Global Load Balancer. 

With the public cloud software constantly evolving, Park+ is leveraging the power of open-source services on Google Cloud to build custom integrations and dashboards or orchestration pipelines without any hassle of maintenance. Park+ is developing solutions with next-generation ML and AI capabilities for a unique digital and conversational commerce experience for its customers. 

Park+ has been using Google solutions since the company’s inception, including Google Analytics, Firebase, Google AdMob, and now Google Cloud. As hardcore open-source enthusiasts, the team was thrilled by the out-of-the-box integration of the open-source software they use, which made Google Cloud an immediate choice. The latency of their applications decreased by 12%, from 100ms to 88ms, and they anticipate saving more than 900 hours a year previously spent maintaining and managing open-source infrastructure.

Park+ is a superapp for car owners, offering a comprehensive range of services in the transportation sector. It assists users in discovering and reserving parking spaces, rapidly recharging FASTags, accessing daily car cleaning services, reviewing e-challans, monitoring car health, purchasing and renewing insurance, and much more. 

Co-founded in 2019 by Amit Lakhotia, a former Vice President of Business at Paytm, and Hitesh Gupta, a former head of engineering for payments, Park+ prioritizes cutting-edge technology to streamline car ownership and maintenance. With a presence in over 20 cities, Park+ dominates the Indian market as the largest distributor of FASTags and access control systems, boasting an extensive inventory of parking slots throughout the country.

Inside the AI & Analytics Team of  Park+

Park+ is an AI-driven platform for car owners. It employs data-tracking to offer a range of services, such as analyzing driving habits, detecting damages through 360° video, suggesting nearby hospitals during emergencies, personalized content recommendations, predicting car prices, managing FASTag transactions, identifying popular parking spots, recommending EV charging stations, suggesting maintenance schedules, and tracking car movement within specific areas. They also provide a unique service where users can send car videos for remote assessment. Park+ employs data science to track stolen cars using camera footage and RFID tags. The data science team, comprising 10 members, processes extensive data for valuable insights.

Read more: Data Science Hiring Process at Park+

]]>
Google, SAP Unveil Data Cloud, the Next Big Thing in Business Intelligence? https://analyticsindiamag.com/ai-news-updates/google-sap-unveil-data-cloud-the-next-big-thing-in-business-intelligence/ Fri, 12 May 2023 08:57:20 +0000 https://analyticsindiamag.com/?p=10093218 This offering, which complements the RISE with SAP solution, enables organisations to access business-critical data in real time. ]]>

The enterprise application software vendor SAP recently announced an extensive partnership with Google Cloud to introduce an end-to-end data cloud that brings data from across the enterprise landscape using the SAP® Datasphere solution. This offering, which complements the RISE with SAP solution, enables organisations to access business-critical data in real time. 

The solution addresses a significant challenge faced by organisations that need to invest substantial resources in building complex data integrations, custom analytics engines, and generative AI and natural language processing (NLP) models to derive value from their data investments. 

By combining SAP software data on supply chains, financial forecasting, human resources records, omnichannel retail, and more with non-SAP data on Google Cloud from virtually any other data source, organisations can significantly accelerate their digital transformation. The approach provides a fully defined data foundation that retains complete business context. 

“SAP and Google Cloud share a commitment to open data and our extended partnership will help break down barriers between data stored in disparate systems, databases, and environments,” said Christian Klein, CEO and member of the Executive Board of SAP SE.

The companies said businesses will be able to utilise these analytics capabilities, as well as advanced AI tools and large language models, to find new insights from their data.

At SAP Now India conference held in Mumbai, Paul Marriott, President, SAP Asia Pacific Japan, told AIM, “Whether it’s ChatGPT, generative AI, or what we’re doing in the automotive sector with Catena-X to create a global exchange that drives a more efficient supply chain for the industry, or initiatives around energy transition to create a greener future, all this innovation is being delivered into the RISE and SAP cloud platforms.”

SAP and Google Cloud also plan to partner on joint go-to-market initiatives for enterprises’ largest data projects. The SAP Sapphire® conference, which will take place May 16-17 in Orlando, Florida, will host demos of joint AI and data solutions, including how enterprises can apply generative AI to common workflows and applications, such as using a chatbot to search, create, and edit purchase requests.

Alongside Google Cloud, SAP also announced a collaboration with IBM to embed IBM’s Watson technology into applications such as SAP Start, a digital assistant designed to work with SAP’s cloud solutions, including those integrated with SAP S/4HANA Cloud.

]]>
Why Enterprises Are Super Hungry for Sustainable Cloud Computing https://analyticsindiamag.com/it-services/why-enterprises-are-super-hungry-for-sustainable-cloud-computing-future/ Tue, 25 Apr 2023 06:09:33 +0000 https://analyticsindiamag.com/?p=10092247 Cloud providers prioritise sustainability in data center operations, while the IT industry needs to address carbon emissions and energy consumption.]]>

The widespread adoption of the cloud has brought about a big shift in organisations’ software engineering practices, particularly in developing and deploying cloud-based IT solutions. But just shifting to the cloud isn’t enough; organisations are also demanding sustainable cloud infrastructure, driven by the growing need to cut greenhouse gas emissions. 

However, many companies still refuse to prioritise sustainability, clinging to the false belief that energy usage is negligible or already optimised. Yet the broader push towards greener business practices has turned sustainability into the norm. 

Read more: How This Indian IT Giant Surpassed Others in Achieving Carbon Neutrality

Importance of Sustainability Goals for Cloud Service Providers

Cloud service providers have established various corporate sustainability objectives, which dictate their approaches to designing, constructing, energising, operating, and decommissioning their data centers.

Findings from a McKinsey report highlighted the need to address carbon emissions in the IT industry, with end-users and data centers being the biggest culprits. In contrast, cloud-based workloads have been shown to generate far fewer emissions. 

Meanwhile, another report by the International Energy Agency (IEA) revealed that global data centers consumed a staggering 200 terawatt-hours (TWh) of energy in 2022, with consumption expected to grow even further. This is a call to action for all organisations to embrace sustainable, energy-efficient practices in cloud computing and data management.

Cloud computing requires components in cloud data centers (CDCs) to run continuously, which drives high energy costs and a large carbon footprint. Attaining sustainable cloud services calls for energy-aware resource management, efficient cooling mechanisms, and data center placement based on waste heat recovery, green energy sources, and proximity to free cooling. Resource management should be handled holistically to reduce energy consumption in CDCs.
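The energy-aware placement idea can be sketched as a simple policy: among regions whose latency penalty a workload can tolerate, pick the one with the lowest grid carbon intensity. The sketch below is purely illustrative and is not from the article; the region names, intensity figures, and latency numbers are hypothetical placeholders.

```python
# Illustrative sketch: choosing a deployment region for a deferrable job
# by grid carbon intensity. All figures below are hypothetical.

# Hypothetical average grid carbon intensity, in gCO2e per kWh.
CARBON_INTENSITY = {
    "region-a": 450,   # coal-heavy grid
    "region-b": 120,   # hydro-heavy grid
    "region-c": 230,   # mixed grid
}

# Hypothetical extra round-trip latency (ms) versus the default region.
EXTRA_LATENCY_MS = {"region-a": 0, "region-b": 140, "region-c": 60}


def pick_region(max_extra_latency_ms: int) -> str:
    """Return the lowest-carbon region whose latency penalty is acceptable."""
    candidates = [
        region for region, extra in EXTRA_LATENCY_MS.items()
        if extra <= max_extra_latency_ms
    ]
    return min(candidates, key=CARBON_INTENSITY.__getitem__)


# A latency-sensitive job stays put; a deferrable one travels for greener energy.
print(pick_region(max_extra_latency_ms=0))    # region-a
print(pick_region(max_extra_latency_ms=200))  # region-b
```

Latency-tolerant batch workloads are the natural fit for this kind of scheduling, which is exactly the trade-off the article raises later when it notes that longer latency may allow moving computation to greener regions.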

Source: Sustainable Cloud Computing: Foundations and Future Directions

How can Software’s Carbon Footprint be improved?

Communication is typically the most energy-consuming part of a software application. By reducing data payloads and eliminating redundant communication, businesses can save up to four metric tons of greenhouse gases per year. To cut computational effort, organisations can use pure functions, limit abstraction layers, and aggregate incoming data. 
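The payload-reduction point can be illustrated with a toy comparison: sending one network message per reading pays a fixed framing overhead every time, while aggregating readings into one batched message pays it once. This sketch is not from the article; the per-message overhead figure is a hypothetical placeholder.

```python
# Illustrative sketch: aggregating incoming readings before transmission
# instead of sending one message per reading. Figures are hypothetical.
import json

readings = [{"sensor": "s1", "t": i, "value": 20 + i % 3} for i in range(100)]

# Naive approach: one message (with its own framing overhead) per reading.
PER_MESSAGE_OVERHEAD = 200  # hypothetical bytes of headers per request
naive_bytes = sum(len(json.dumps(r)) + PER_MESSAGE_OVERHEAD for r in readings)

# Aggregated approach: one batched message, overhead paid only once.
batched_bytes = len(json.dumps(readings)) + PER_MESSAGE_OVERHEAD

print(naive_bytes, batched_bytes, batched_bytes < naive_bytes)
```

The same reasoning applies to chattier protocols: every eliminated round trip removes both its payload and its fixed overhead, which is where the emission savings described above come from.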

Source: Data Centres and Data Transmission Networks

Companies should also involve end-users in decision-making and assess the costs and benefits of non-functional requirements. Educating users about conscious trade-offs that save energy can also help; for instance, tolerating longer latency may allow computation to move to regions with greener energy sources. Upgrading software without considering the customer’s needs can lead to unnecessary features and functionality that increase computation and carbon footprint.

Road Ahead

It is crucial for developers, CTOs, and companies to act to enhance sustainability in software and data. They can minimise carbon footprints by adopting a conscious approach towards their decisions and making environmental sustainability a non-negotiable aspect of their software development process. Measuring the environmental impact of software as a metric can further strengthen their efforts. Such actions can pave the way for considerable reductions in carbon emissions and positively impact the environment.

Read more: AI Startups Need to Learn Stability Tricks from Databricks

]]>
Google and Replit’s Quest to Become the Next Copilot X https://analyticsindiamag.com/global-tech/google-and-replits-quest-to-become-the-next-copilot-x-2/ Thu, 30 Mar 2023 10:00:00 +0000 https://analyticsindiamag.com/?p=10090416 With the Google partnership, Replit believes that they will now get access to newer models as they are released which will ultimately reach the developers and help with the goal of “accelerating tech into everyone’s hands”]]>

Google Cloud recently announced its partnership with ‘Replit’, a cloud-based integrated development environment that allows developers to write and deploy code in various programming languages from their web browsers. 

The partnership aims to turn “non-developers into developers”. With the Google partnership, Replit will get full access to Google Cloud’s infrastructure and Google’s machine learning platform, ‘Vertex AI’. Along with improving productivity, Replit claims that programmers can build complex-architected software in 1/1000th of the time.

A day after the partnership announcement, Amjad Masad, CEO and head of engineering at Replit, confirmed that the company will remain an “open platform” and that it is open to working with more companies to “expand the ecosystem”.

The Need

Masad, in an interview with Semafor, explained the limitations the company faced in its early days. When GPT-2 came out in 2019, he started playing around with code generation, but it was only after GPT-3’s release that the technology’s potential became obvious to people. They were also unable to do much because OpenAI was strict about what got productised and what did not. 

“In order to produce something like a Copilot, you have to do a lot of low-level engineering, have access to weights and be really fast.” This was something Replit did not have, but Microsoft had that advantage due to its “special relationship” with OpenAI. It was only after models started getting open-sourced that the company was able to build its own. 

With the Google partnership, Amjad Masad believes that they will now get access to newer models as they are released which will ultimately reach the developers and help with the goal of “accelerating tech into everyone’s hands”.

Platforms and products that work only in silos are ultimately limited when it comes to full adoption. For example, a developer who wants to use LLMs in daily work needs an integrated development environment (IDE) with LLMs built in for wider functionality. This is probably where the Replit and Google Cloud partnership will shine.

Replit has already been implementing artificial intelligence through Ghostwriter, an AI-powered ‘coding partner’ launched in October 2022. Ghostwriter runs on an LLM trained on publicly available code and fine-tuned by Replit. The company even launched a chatbot for Ghostwriter last month, named ‘Ghostwriter Chat’, considered the ‘first conversational AI programmer’ to offer an interactive experience like ChatGPT. 

Over 30% of the code developed in Ghostwriter is generated by its coding AI. Powered by LLM chat applications, complete programs can be generated. 

Vertex AI allows users to train and deploy AI applications and ML models. AutoML, a model-training option within Vertex AI, lets users train on image, text, or video data without writing code. Vertex AI’s multimodal training capabilities will subsequently help elevate user-facing functionality in Replit. 

Battle of the Behemoths

Running similar functionalities, Replit’s closest competitor is GitHub. With the announcement of Google Cloud’s partnership, the spotlight has returned to the race of the tech giants supporting both companies. 

Microsoft’s GitHub, first launched in 2008, is used by over 94 million developers. The comparatively new Replit, founded in 2016, has racked up over 20 million developers, as mentioned in its company blog. With the Google Cloud partnership, Replit aims to support “one billion software creators” and, ultimately, the goal of enabling companies to drive development using AI. 

While both offer similar functionality, including AI features, Replit’s Ghostwriter has an edge over GitHub on certain parameters. Replit offers a “real-time multiplayer editor”, and users can build, test, and deploy “directly from the browser”, a function exclusive to Replit. In addition, the Replit app supports voice commands: users can instruct the application via voice prompts to, say, “make an app” for a specific need. The application will also provide the source code in case the user needs further modifications. 

Is Partnership the Way Ahead?

With Microsoft’s GitHub Copilot considered an essential for coders and Microsoft bringing ChatGPT-like capabilities to GitHub with Copilot X, Google’s push to make a mark in the developer community is evident through its new partnership with Replit.  

To remain relevant in the AI race, and perhaps take on fellow giant Microsoft, forming crucial partnerships with existing players is a promising route. 

Not far behind is another power partnership: AWS and Hugging Face. Amazon Web Services’ partnership with Hugging Face, a company that develops and maintains open-source libraries for natural language processing (NLP) and machine learning, is another tech giant’s bid to accelerate next-generation ML models by helping developers build them.

]]>
AIM Research: Product Partnerships Of Data Service Providers https://analyticsindiamag.com/ai-features/aim-research-product-partnerships-of-data-service-providers/ Thu, 23 Mar 2023 08:00:00 +0000 https://analyticsindiamag.com/?p=10089865 A significantly high percentage of partnerships with companies like AWS, Microsoft, and Google shows their wide-ranging technology facilities, like cloud storage, database, reporting and visualization, data and analytics, computing, etc.]]>

Data science is a rapidly growing field, and demand for these services has been increasing as organizations of all sizes seek to harness the value of the large amounts of data they collect. Data service providers are companies that specialize in helping organizations leverage the power of data to make informed decisions and drive business outcomes.

To provide such services, data service providers form product partnerships with technology providers. A product partnership involves two or more companies collaborating to offer a more comprehensive suite of data services to their clients; these partnerships typically include sharing resources and technology to create innovative solutions that meet the evolving needs of the business.

The report mainly considers data science service providers in India. This could be firms headquartered in India or companies with a majority of their delivery team sitting in India.

This report can help data science service providers analyze the current market trend in forging technology partnerships that enable a high standard of work delivery. Consumer companies can also use it to understand which tools are predominantly in use and invest in the right technologies when making an effort toward digitization, while technology providers can gauge where they stand against their competition.

Read the complete report here:

]]>
GPT-4 Hype Can’t Hurt Google https://analyticsindiamag.com/ai-highlights/gpt-4-hype-cant-hurt-google/ Fri, 17 Mar 2023 07:30:00 +0000 https://analyticsindiamag.com/?p=10089524 Many have taken GPT-4 to be one more nail – or perhaps the final nail? – in the coffin of Google.]]>

The much-awaited GPT-4 is here. The new transformer model is touted to outperform its predecessor ChatGPT on several competitive exams while also being safer and more aligned. Many have taken this to be one more nail—or perhaps the final nail?—in the coffin of Google. 

Google—for reasons known only to them—also made several announcements around the time of GPT-4’s release. However, these announcements haven’t been the talk of the town, since the only word that has captured public consciousness is GPT. Was Google simply worried about being left behind in this AI frenzy?

AI Cloud

The big news from Google this week is the release of the ‘PaLM API’. Pathways Language Model, or PaLM, is Google’s 540-billion-parameter language model, now made publicly accessible via the API. Since its release last year, the expectation has been that Google would soon use it to power a variety of products, as it did with BERT, which now powers our entire search experience. The model surpasses the performance of the 175-billion-parameter GPT-3 and is claimed to rival even GPT-4, whose architecture remains undisclosed.

The move to lease it out as an API before integrating it into Search is quite uncharacteristic of Google. But the push follows the huge cloud market that Microsoft has already tapped with OpenAI’s GPT APIs. This is perhaps why, alongside the API, Google also announced the launch of two generative AI products on Google Cloud, which will help developers build products on top of foundation models—their own and others’.

In addition, Google will also give access to ‘MakerSuite’, a tool developers can use to prototype ideas, with provisions for prompt engineering, synthetic data generation, and custom-model tuning, all of which Google claims will be supported by robust safety tools.

Looking at the current state of AI, in which big tech companies chiefly use it to sell their cloud businesses, it makes sense for Google to go the Microsoft way.

Recently, Microsoft purchased Fungible, a data processing unit (DPU) startup, to streamline its cloud service functions. Add to that Lumenisity, a hollow-core fibre (HCF) solution provider Microsoft had acquired a month earlier, and it is clear Microsoft has been quite aggressively strengthening Azure.

Nearly all major cloud providers have relationships with AI chip suppliers. AWS has its own silicon along with custom Intel processors, while Google Cloud uses Arm-based Ampere Altra chips to augment its infrastructure. The cloud game is too competitive to be won easily, and companies are pouring money into it to take the lead.

Ethics, maybe? 

Unlike Microsoft, Google has repeatedly stressed safe and responsible AI. This is why, instead of blanketing the internet with boasts of its achievements, the company is providing limited access to select testers for generative AI in Workspace, starting with Docs and Gmail. This will allow it to pressure-test new experiences before releasing them broadly to end users.

By contrast, the Redmond-based company made headlines recently for laying off one of its responsible AI teams. “The pressure from [CTO] Kevin [Scott] and [CEO] Satya [Nadella] is very very high to take these most recent OpenAI models and the ones that come after them and move them into customers hands at a very high speed,” reads an article by Platformer.

In a tweet, Emily Bender discussed a paper from 2018 that gives two recommendations for mitigating systemic bias in language models.

What was unsurprising to Bender was that even four years later, in the wake of GPT-4’s release, OpenAI failed to disclose details about the architecture (including model size), hardware, training compute, dataset construction, and training method.

It is also important that Google set its eyes beyond search and treat the current state of generative AI purely as a productivity tool, one that gives users the liberty to accept, edit, or modify its suggestions.

Among other things, Google’s announcements this week also include ‘Med-PaLM 2’, a medical language model that is an 18% improvement over its predecessor. The model, said to perform at an “expert” doctor level, is already being used to explore AI-assisted possibilities in ultrasound, cancer treatment planning, and tuberculosis screening.

Meanwhile, Google-backed Anthropic also released its own chatbot, ‘Claude’, which is now generally accessible. The AI startup had been quietly testing the model with partners like Robin AI, AssemblyAI, Notion, Quora, and DuckDuckGo. Anthropic is addressing the usual pitfalls of chatbots like ChatGPT, which are known for showing bias, producing harmful content, and hallucinating, with a technique called “constitutional AI”.

Whereas previous techniques needed tens of thousands of human feedback labels, constitutional AI uses only a list of rules or principles to train less harmful AI assistants. The technique also makes it possible to fix mistakes in AI behaviour simply by changing the principles provided, instead of fine-tuning on large RLHF (reinforcement learning from human feedback) datasets.
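The critique-and-revise loop at the heart of constitutional AI can be sketched in a few lines. This is a toy illustration of the idea, not Anthropic’s implementation: `draft_response`, `critique`, and `revise` are hypothetical placeholders standing in for real language-model calls, and the principles are invented examples.

```python
# Toy sketch of constitutional AI: instead of training on large sets of
# human feedback labels, the model critiques and revises its own draft
# answers against a short written list of principles.

CONSTITUTION = [
    "Avoid content that is harmful or dangerous.",
    "Do not express biased views about groups of people.",
    "If unsure of a fact, say so instead of guessing.",
]

def draft_response(prompt: str) -> str:
    return f"Draft answer to: {prompt}"  # placeholder for a model call

def critique(response: str, principle: str) -> str:
    # placeholder: a real system would ask the model to critique its own output
    return f"Check '{response}' against: {principle}"

def revise(response: str, critique_text: str) -> str:
    # placeholder: a real system would ask the model to rewrite the response
    return response + " [revised]"

def constitutional_pass(prompt: str) -> str:
    """One critique-and-revise loop over every principle in the constitution."""
    response = draft_response(prompt)
    for principle in CONSTITUTION:
        response = revise(response, critique(response, principle))
    return response
```

The key design point the article describes falls out of the structure: changing the model’s behaviour only requires editing the `CONSTITUTION` list, not collecting a new RLHF dataset.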

Final Thoughts

Beneath the current wave of AI hype, a more nuanced story is playing out, one that centres on the balance between noise and impact. Google’s primary objective, as seen with MedPaLM, appears to be leveraging the AI trend to create tangible value.

Moreover, one thing Google has believed in since its inception is making technology do the work for users, instead of making users do the work for technology. What we have seen so far from Microsoft, and its closest ally OpenAI, is end users doing the ultimate work of training and improving the model by interacting with it more and more. In this light, the hope is that Google will set a precedent for others to follow and make the technology create value in our everyday lives.

]]>
Ford Revives Argo AI From the Dead https://analyticsindiamag.com/ai-trends/ford-revives-argo-ai-from-the-dead/ Mon, 06 Mar 2023 09:30:00 +0000 https://analyticsindiamag.com/?p=10088762 This seems like a strategy on behalf of Ford to be able to take incremental steps to develop systems in-house]]>

In October last year, after having invested $1 billion in the autonomous vehicle technology company Argo AI, Ford pulled the plug. Argo AI shut shop, but, contrary to appearances, Ford’s dreams didn’t come to rest along with its investment.

The company recently announced Latitude AI, a brand-new venture to develop autonomous driving technology, with a focus on a “hands-free, eyes-off-the-road” driver-assist system for next-generation Ford vehicles.

Re-inventing Argo AI? 

Argo AI’s two main investors – Ford and Volkswagen – were incurring mounting losses as commercialisation of Level 4 advanced driver-assistance systems seemed almost impossible. If we talk numbers, “Ford recorded a $2.7 billion non-cash, pretax impairment on its investment in Argo AI, resulting in an $827 million net loss for Q3.” 

Meanwhile, the company also failed to secure new outside investors, so its demise was imminent. Moreover, amid the wave of consolidation happening at the time, it was believed that only companies with the capital and infrastructure to build end-to-end systems would survive, with Tesla as the poster company for that realisation. A growing sentiment thus favoured reaching full self-driving by improving driver assistance (like Tesla and Comma) over the robotaxi business model (like Cruise and Waymo).

In this vein, Ford also maintained that it intends to invest more time and money in Level 2/3 driver-assist tech than in Level 4 robotaxi technology. The automaker’s own BlueCruise system, classed as an Active Driving Assistance (ADA) system, is “the simultaneous use of a car’s adaptive cruise control (ACC) to control speed and lane centering assistance (LCA) to control steering”. The two systems work together to keep the vehicle at a safe distance from the ones ahead, and at or near the centre of the lane.
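The interplay of the two sub-systems can be sketched as a pair of simple control laws: one adjusts speed based on the gap to the lead vehicle, the other steers toward the lane centre. This is an illustrative toy, not Ford’s implementation; the function names, gains, and proportional-control logic are all assumptions.

```python
# Minimal sketch of how ACC and LCA could combine in a driver-assist system.
# ACC: slow down proportionally when closer than the safe gap, otherwise
# recover gently toward the driver's set speed.
# LCA: steer against the lateral offset from the lane centre.

def acc_speed_command(gap_m: float, safe_gap_m: float, speed_mps: float,
                      set_speed_mps: float, k: float = 0.5) -> float:
    """Return a new speed command based on the gap to the lead vehicle."""
    if gap_m < safe_gap_m:
        # too close: brake in proportion to the gap shortfall
        return max(0.0, speed_mps - k * (safe_gap_m - gap_m))
    # safe: creep back toward the set speed, never exceeding it
    return min(set_speed_mps, speed_mps + 1.0)

def lca_steering_command(lateral_offset_m: float, k: float = 0.2) -> float:
    """Steer opposite the offset from lane centre (positive offset = right of centre)."""
    return -k * lateral_offset_m
```

Running both functions each control cycle captures the “simultaneous use” the article quotes: longitudinal control (speed) and lateral control (steering) are computed independently and applied together.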

The “hands-free” technology comes in handy on straight stretches and in traffic jams, where drivers can engage in relaxed driving. Additionally, what makes BlueCruise rank higher in the Consumer Reports assessment is the safety it guarantees with its direct driver monitoring system (DDMS): with infrared cameras pointed at the driver’s face, DDMS sounds an alert and slows down the car if the driver doesn’t pay attention to the road.

The fight for dominance in the “self-driving car” market is getting fierce, with Mercedes-Benz announcing early this year that it has cracked the code for a Level 3 driving system, ahead of Tesla. BMW has also confirmed the launch of an L3 self-driving system this year. Then there are Jaguar Land Rover, Audi, Volkswagen, Volvo, Nissan – you name it. The list is endless, and they are all going the Tesla way.

Pulling its investment out of Argo AI, which developed self-driving products and services, to form its own company from Argo AI’s remains looks like a strategy that lets Ford take incremental steps toward developing systems in-house. It will also shift the focus away from the overarching goal of “autonomous cars”.

‘Data Centres on Wheels’

With all automobile companies jumping on the autonomous vehicle (AV) bandwagon, one company has been at the centre of it all, enabling this – Google. The giant has partnered with automobile manufacturers like Mercedes-Benz, Renault, Ford, Volvo, Toyota, Nissan, Kia, Honda, and several others. Likewise, AWS has set foot in this space with BMW, Alfa Romeo, Chrysler, and Jeep, and Microsoft Azure with Tata Motors and Volkswagen.

It’s a no-brainer that software is crucial to self-driving cars. The vehicles collect an extraordinary amount of real-time data from sensors (cameras, lidar, and radar) to predict the movement paths of objects in the surrounding environment and evaluate possible courses of action. In this way, cars of the future will be nothing but ‘data centres on wheels’, and being able to compute that data in the cloud will be key.

Additionally, with carmakers making up a growing segment of Google’s Cloud business, there is an opportunity for Google to win back market share from AWS and Azure. Automotive companies will lean heavily on Google Maps’ geospatial data and navigation capabilities, alongside Google Cloud’s AI and ML capabilities to create, train, and deploy AI models at speed.

Dangers Ahead

Not everything is merry when these companies hold such immense amounts of data, and we are already starting to see the dangers. For example, reports emerged that Ford recently applied for a patent on a system that would assist with vehicle repossession in the case of delinquent payments. The patent, whose publication was first reported by The Drive, describes a variety of procedures to enable this, including sending notifications to the owner’s smartphone or the car itself, completely locking out the driver, disabling certain functions like air conditioning, and geofencing the vehicle to restrict it to a specific time or location.

This kind of system completely ignores the reasons why an individual has fallen behind on payments, and takes a rather autocratic stand. And as The Verge’s article points out, the future looks like a hobbled reality in which extended software systems decide where we go, what we do, and how we do it.

]]>
Google’s Winning the Automotive AI Race  https://analyticsindiamag.com/global-tech/google-is-winning-the-automotive-ai-race/ Mon, 27 Feb 2023 09:00:00 +0000 https://analyticsindiamag.com/?p=10088204 The tech giant has partnered with Mercedes- Benz, Renault, Volvo, GMC and many other automobile companies]]>

With Google’s recent partnership with Mercedes-Benz, a “next-generation navigation experience” is not too far away. In this deal, Google Maps will provide geospatial data and navigation capabilities for the car manufacturer, while Mercedes-Benz will use Google Cloud’s AI and machine learning capabilities to create, train, and deploy AI models at speed. This will enhance customer experience, alongside building faster and more efficient data processing platforms to analyse fleet data. Mercedes-Benz also plans to leverage Google’s open infrastructure to secure and scale from on-prem to the edge to the cloud, across its technology ecosystem.

Google chief Sundar Pichai said that the company will provide AI and data capabilities to accelerate Mercedes-Benz’s sustainability efforts, advance autonomous driving, and create an enhanced customer experience. The tech giant’s association with the carmaker should not be construed as a mere attempt to work in the infotainment or navigation space alone, but as a way of slowly building itself into a crucial player in the automotive AI space.

Mercedes has been making waves in the autonomous driving space with SAE Level 3 driving powered by the NVIDIA DRIVE platform, making it the first automaker certified at this level in the US, ahead of Tesla, which is still at Level 2 autonomy.

Google’s efforts to carve an AI niche in the autonomous driving segment have been brewing for quite some time, through a series of collaborations with leading automobile companies, and Mercedes is not the first to help Google get there.

In November 2022, Renault Group and Google announced a partnership to build a ‘software-defined vehicle’ (SDV), combining Renault’s automotive technology with Google’s software capabilities on top of the existing Android Automotive OS. Google was also named the preferred cloud supplier for Renault Group, with Google Cloud technology employed for data capture and analytics.

In 2021, Google began collaborating with Ford, whose teams have been using Google Cloud, Android, and Google AI to help transform Ford’s business and build automotive technologies.

Automobiles Powered by Cloud 

The partial exclusivity Google enjoys in the software segment of automobiles is an added advantage. Big brands whose vehicles ship with built-in Google applications include Volvo, GMC, Chevrolet, Polestar, Cadillac, and Honda.

Google’s Android Auto, an application that mirrors an Android device onto a car’s entertainment unit, established Google’s software dominance over automobiles. Launched in 2015, Android Auto is part of the Open Automotive Alliance, an association of auto manufacturers and tech companies promoting Android in vehicles, and is available in 46 countries, supporting over 500 car models. Its only counterpart is Apple’s CarPlay, which, though similar in features, is not superior to Android Auto. The make-or-break factor is Google Maps’ edge over Apple Maps.

But Google is not alone: both Amazon and Microsoft have been partnering with automotive companies to leverage their technologies. In October 2022, BMW partnered with Amazon to integrate AWS cloud computing into its systems. Like Google, Amazon is developing software-defined vehicles built on AWS, working toward mobility solutions for autonomous vehicles. Amazon is also working with car manufacturer Stellantis, maker of the Alfa Romeo, Chrysler, and Jeep brands.

Microsoft has also tied up with automobile companies like Volkswagen to provide its cloud solution, Azure; in 2017, it had tied up with Tata Motors.

Self-Driving Ambitions  

According to McKinsey research, the autonomous driving segment could generate $300 to $400 billion in revenue by 2035, and by 2030, 12% of passenger vehicles sold will have L3+ technologies. This is an indication of how widely AI and ML tools will be adopted in the automotive sector in the coming years. Google, incidentally, was one of the first technology companies to ambitiously take up a self-driving car project. The project, later renamed ‘Waymo’, first took shape in 2009; the 4th-generation Waymo Driver (a Chrysler Pacifica minivan) launched ten years later, and the 5th-generation Waymo Driver (an electric Jaguar I-PACE) followed in 2021, which was probably as far as Google went as a manufacturer of autonomous vehicles.

Waymo has been functioning as a ride-hailing service, and though the service is fraught with technical problems, Google is still pushing updates to its self-driving segment.

Waymo also signed a partnership with Uber in June 2022 to deploy autonomous trucks on the Uber Freight network. However, Google now appears to be reining in its aggressive expansion plans for Waymo: as part of the recent layoffs at Google, Waymo employees were given the pink slip, and there were claims that the company will hold back expansion of its autonomous trucking unit, ‘Via’. Having spent over 12 years on a self-driving project, Waymo is still far from making profits.

Given Waymo’s growth trajectory, it is practical for Google to continue prioritising software solutions that can be implemented in the automotive sector, as opposed to the conventional route of building autonomous cars itself. Google’s latest partnership with the luxury carmaker is a step towards software dominance in the automobile sector, and in the long run it probably paves the way for autonomous driving and for taking on the other AI biggies in the automotive space.

Though there are other cloud contenders in the automotive space, Google has a clear edge over them: having already entered the autonomous driving segment with its flagship driverless cars, Google is ahead in the automotive AI race.

]]>