Google Releases Data Science Agent in Colab
https://analyticsindiamag.com/ai-news-updates/google-releases-data-science-agent-in-colab/ | Tue, 04 Mar 2025
The agent achieves goals set by the user by orchestrating a composite flow which mimics the workflow of a typical data scientist.

Google released a Data Science Agent on the Colab platform on Monday, powered by its Gemini 2.0 AI model. The agent can autonomously generate the required analysis for a data file uploaded by the user, and it produces fully functional notebooks rather than just code snippets.

Google said the agent “removes tedious setup tasks like importing libraries, loading data, and writing boilerplate code”. The agent achieves goals set by the user by “orchestrating a composite flow” which mimics the workflow of a typical data scientist. Users can rely on it to clean data, perform exploratory data analysis, run statistical analysis, build predictive models and handle other such tasks.
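To make that concrete, the sketch below shows the kind of setup, cleaning and modelling boilerplate such an agent would typically automate. It is a minimal illustration, not the agent’s actual output: the file name `sales.csv`, the `target` column and the choice of a random forest are all assumptions.

```python
# Minimal sketch of the boilerplate a data-science agent is said to automate:
# imports, data loading, quick exploratory checks and a baseline model.
# The file name, column names and model choice are illustrative assumptions.
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import train_test_split

df = pd.read_csv("sales.csv")          # hypothetical uploaded file
df = df.dropna()                       # basic cleaning: drop incomplete rows

print(df.describe())                   # summary statistics
print(df.corr(numeric_only=True))      # quick correlation scan

# Baseline predictive model, assuming all feature columns are numeric
X, y = df.drop(columns=["target"]), df["target"]
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

model = RandomForestRegressor(n_estimators=200, random_state=42)
model.fit(X_train, y_train)
print("MAE:", mean_absolute_error(y_test, model.predict(X_test)))
```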

The generated code can be customised and extended to meet users’ needs. Moreover, results can also be shared with other developers on Colab. Google also said that the agent ranked fourth on DABStep (Data Agent Benchmark) on Hugging Face, ahead of GPT-4o, DeepSeek-V3, Llama 3.3 70B and more.

The Data Science Agent was launched for trusted testers last December, but is now available on Google Colab. Colab is a free, cloud-based environment where Python code can be written and run within the web browser. It also provides free access to Google Cloud GPUs and TPUs. 

“We want to simplify and automate common data science tasks like predictive modelling, data preprocessing, and visualisation,” Google said.

Recently, Google also announced the public preview of Gemini Code Assist, a free AI-powered coding assistant for individuals. The tool is globally available and supports all programming languages in the public domain.

It is available in Visual Studio (VS) Code and JetBrains IDEs, as well as in Firebase and Android Studio. Google also said the AI coding assistant offers “practically unlimited capacity with up to 1,80,000 code completions per month”.

40 Under 40 Data Scientists Awards 2025 – Meet the Winners
https://analyticsindiamag.com/ai-highlights/40-under-40-data-scientists-awards-2025-meet-the-winners/ | Fri, 07 Feb 2025

Amidst the three days of AI and ML workshops, conferences, presentations, and tech talks at the Machine Learning Developers Summit (MLDS) 2025, about 40 dynamic data scientists were presented with the 40 Under 40 Data Scientists Award on Thursday. 

This award recognises India’s top data scientists and their achievements in the machine learning and analytics industry.

This year’s winners are driving real impact at some of the world’s most influential companies, including Razorpay, HSBC, Genpact, PepsiCo, Bloomberg, Ford Motors, Paytm, Tata, Wells Fargo, Accenture, and more. 

They are creating AI solutions that improve efficiency and developing data models that prioritise privacy. More than just driving innovation, they are also fostering a culture of learning and growth.

The Winners of the 40 Under 40 Data Scientists Awards 2025

Abhinav Vajpayee, Senior Manager, Analytics at Razorpay Software Private Limited

A hands-on innovator in data-driven business strategy, Abhinav takes part in optimising payments, ads, and content acquisition. His solutions at Razorpay, Swiggy, and Vuclip boosted retention, ad revenue, and cost efficiency, earning industry recognition.

Abhishek Kumar, VP, Analytics Lead at HSBC

Abhishek is a data science leader known for high-impact analytics solutions across banking, FMCG, and retail. His work spans forecasting, pricing models, and customer insights, earning multiple awards for innovation and business impact.

Akhil Makol, Principal Engineer at NatWest Group

Akhil leads the data strategy and cloud architecture for commercial & institutional domains. He drives AI and analytics adoption by aligning data products with banking standards and leveraging AWS Data Lake.

Akshay Jain, AGM – Lead Digital Downstream at Hindalco Industries Ltd

Akshay Jain is a data scientist transforming aluminium manufacturing with AI, predictive analytics, and Industry 4.0. He leads a team that optimises operations and drives AI adoption on the shop floor, focusing on people-centric implementation.

Amresh Kumar, General Manager at Niva Bupa Health Insurance

Amresh, a data scientist with more than 15 years of experience, has worked across insurance, banking, and digital marketing. He has successfully implemented renewal, planning, and reinsurance models, with certifications in advanced insurance and Google Analytics.

Ankit Sati, Senior Manager at Genpact

Ankit is a vital member of Genpact’s AI/ML practices. He specialises in computer vision and GenAI solutions. With more than seven years of experience across industries, he is also an active Kaggle competitor and hackathon enthusiast.

Anup Kumaar Goenka, Deputy Director of Data Science at PepsiCo.

Anup is a data science innovator known for AI-driven solutions in forecasting and automation. He developed an award-winning AI meeting summarisation tool and led predictive analytics projects optimising supply chains and financial planning.

Anupam Tiwari, Data Science Manager at GoTo Company

Anupam contributed to GenAI for Southeast Asian languages, developing Sahabat AI, a suite of LLMs for Indonesian dialects. His models, openly available on Hugging Face, support AI innovation and adoption in Indonesia.

Arjit Jain, Co-Founder and CTO at TurboML

Arjit is an ML researcher specialising in real-time machine learning for fraud detection and personalisation. A former Google researcher and IIT Bombay graduate, he has published award-winning papers with more than 200 citations.

Avinash Kanumuru, Senior Manager – Data Science & Engineering at Niyo

Avinash is a data science contributor who publishes acclaimed articles on platforms like ‘Towards Data Science’. He has also developed open-source Python libraries (ml-utils, pyspark-utils), simplifying ML workflows and enhancing industry best practices.

Debanjan Mahata, Senior ML Research Engineer at Bloomberg

Debanjan Mahata is a leading researcher in NLP, machine learning, and Document AI, with publications in top conferences and patented innovations in document analysis. His recent work focuses on multimodal Retrieval-Augmented Generation (RAG) for DocVQA, enhancing financial and ESG data extraction.

Dr. Shital Patil, Solution Architect at Robert Bosch

Dr. Shital, a Prime Minister’s Fellowship recipient, specialises in AI-driven machinery condition monitoring and predictive maintenance. With four patents, she excels in fault diagnosis, XAI research, and solution architecture across India and the Middle East.

Dr. Vikram Singh, Senior Vice President of AI & Digital at EightBit AI Private Limited

Dr. Vikram is a researcher specialising in image super-resolution, deblurring, and deep learning. His work includes high-frequency refinement techniques and advanced neural networks for sharper image and video processing.

Gaurav Mhatre, Director at Tiger Analytics

Gaurav, a data science leader with 13+ years of experience, drives AI innovations across CPG, healthcare, telecom, and eCommerce. At Tiger Analytics, he has led route-to-market analytics, pet food optimisation, and 5G network routing using cutting-edge AI algorithms.

Gopinath Chidambaram, Global Technical Director, AI/ML & Cloud at Ford Motors

Gopinath, an AI/ML professional, holds five patents in autonomous vehicle perception and has filed two more in AI-driven monitoring systems. He is also co-authoring Gen AI Untrained, an upcoming book on AI concepts and applications.

Kantesh Malviya, Associate Vice President – Analytics at Paytm

Kantesh is a data science leader known for mentorship and thought leadership. He has spoken at IIT Bombay and developed innovative analytics solutions, driving user engagement and revenue growth across industries.

Kulbhooshan Patil, Head of Data Science and Analytics at TATA AIG General Insurance Company

Kulbhooshan is an award-winning AI leader recognised for innovation in risk management and user experience in insurance. His AI-driven solutions have earned multiple industry accolades, including the Best AI Technology Implementation of the Year and Outstanding AI & ML Solution Provider.

Mahima Bansod, Data Science and Analytics Leader at LogicMonitor

Mahima is a Data and AI leader with a decade of experience driving digital transformation at companies like Salesforce and Siemens. She has implemented ML models for customer retention, achieving a 94% renewal rate and 40% growth in product adoption.

Mahish Ramanujam, Associate Director – Analytics at Games24X7

Mahish led an award-winning project, developing a game-wise adaptive user-engagement model inspired by cricket analytics. The model boosted D30 LTV by 20% while reducing spending by 15%.

Mehuli Mukherjee, Vice President at Wells Fargo

Mehuli, VP at Wells Fargo, is an analytics leader specialising in GenAI, LLMs, and NLP. A gold medallist and PhD researcher, she is developing an Indian Sign Language recognition system while mentoring in AI and advocating for social impact.

Namit Chopra, VP-2 at EXL Service

Namit developed EXL Property Insights Solution (patent pending) and applies NLP/GenAI to insurance claims. His work includes LLM-based claim summarisation, fraud detection, and cause-of-loss identification.

Namita Khurana, Data Scientist Associate Director at Accenture

Namita is a leader in Revenue Growth Management (RGM), specialising in pricing, promotion, and assortment analytics across global markets. She has developed patented solutions, including AI-driven conversational tools for strategy optimisation and decision-making.

Nandita Saini, Manager – AI & Cognitive Solutions at e& enterprise

Nandita Saini led the productisation of e& Enterprise’s GenAI-based Utilities Copilot. She successfully turned the concept into a launched product, driving innovation.

Nishant Ranjan, Head of Analytics at Godrej Consumer Products

Nishant developed innovative pricing, forecasting, and AI-powered analytics models, including a first-of-its-kind promotion attribution model. He also pioneered MLOps best practices, enabling scalable machine learning deployment.

Pankaj Goel, Associate Vice President – Innovations at BA Continuum India Pvt Ltd (Bank of America subsidiary)

Pankaj Goel pioneered demand prediction in the CPG industry, analysing country-level demand impact on product lines, earning recognition from Procter & Gamble. He recently completed a proof of concept on digital transformation using LLM/GenAI.  

Pavak Biswal, Senior Manager at Merkle  

Pavak, a data science leader with 13+ years of experience, has driven business impact through AI and analytics innovations. He designed GenAI-powered chatbots, optimised pricing models, and led multimillion-dollar analytics projects across industries.  

Pawan Kumar Rajpoot, Lead Data Scientist at TIFIN 

Pawan has more than 10 years of experience in NLP research and development and has been the winner of 10-plus international competitions. His previous work experience includes companies like Tact.ai, Rakuten India and Huawei.

Puspanjali Sarma, Senior Manager – AI at ServiceNow

Puspanjali is an AI and data science expert specialising in AI product management, NLP, and predictive analytics. Her work at ServiceNow and beyond has driven innovative, AI-driven solutions with measurable business impact.  

Rajaram Kalaimani, Senior Principal Data Scientist at Mindsprint

Rajaram is the architect behind Mindverse, a GenAI platform, and precision agriculture solutions for the agri supply chain. His innovations are now hosted on Google, expanding AI capabilities at Mindsprint.  

Ritwik Chattaraj, Data Science Manager/Senior Data Scientist at Commonwealth Bank of Australia  

Ritwik Chattaraj is a data science leader with expertise in Generative AI, LLMs, and robotics. He has led AI/ML innovations at major banks, published extensively, and received multiple excellence awards.  

Sachin Kumar Tiwari, Deputy Vice President at Canara HSBC Life Insurance Company  

Sachin Kumar has led key AI and analytics projects, including GenAI chatbots, customer genomics, and sales governance models. His work spans predictive modelling, sentiment analysis, and geospatial analytics to drive business decisions.  

Sairam Mushyam, Head of Data and AI (SVP) at Zupee  

Sairam has revolutionised Real Money Gaming in India through AI-driven innovations, regulatory frameworks, and data infrastructure. His work spans blockchain-based fairness validation, user integrity systems, and GenAI-powered gaming experiences, driving $30M+ in annual revenue growth.  

Shravan Kumar Koninti, Associate Director – Data Science at Novartis  

Shravan is an AI researcher and innovator who is developing self-service AI tools and large-scale AI projects. He has won multiple hackathons, pioneered Generative AI applications, and contributed to healthcare AI advancements through collaborative research.  

Sumeet Pundlik, Delivery Unit Head at TheMathCompany (MathCo)  

Sumeet is an AI and data science expert with a patented service location optimisation system for a global CPG brand. He also explores Edge analytics, pushing the boundaries of predictive maintenance.  

Swapnil Ashok Jadhav, Senior Director – Machine Learning & Engineering at Angel One  

Swapnil developed Yubi’s first open-source repository and India’s first Fintech language model, YubiBERT, earning recognition from Meta and media coverage. At Unacademy, he created EdOCR, an OCR tailored for EdTech, and presented it at NVIDIA GTC 2021.

[Image: Google Cloud stage at MLDS 2024]

Meet the Winners of Previous Years

2024 | 2023 | 2022 | 2021 | 2020 | 2019

8 Best Certified Companies for Data Professionals to Work For
https://analyticsindiamag.com/ai-highlights/8-best-certified-companies-for-data-professionals-to-work-for/ | Tue, 31 Dec 2024
These firms not only nurture talent but also empower data scientists to excel in their roles.

The best firms for data scientists and data engineers in 2024, as recognised by AIM, exemplify what it means to create workplaces where data professionals thrive. 

These companies prioritise continuous learning through robust upskilling and mentorship programs while fostering diversity with significant representation in leadership and inclusive team cultures. 

By offering flexible work arrangements, strong recognition systems, and comprehensive well-being initiatives, they address the holistic needs of their employees. 

These firms not only nurture talent but also empower data scientists to excel in their roles, making them standout choices for those seeking both career growth and a supportive work environment.

In alphabetical order, here are eight of the best-certified companies for data scientists to work for:

Aays

Aays, founded in 2018, is an enterprise analytics solutions provider headquartered in Gurugram. The company specialises in data and AI solutions in manufacturing, consumer packaged goods, retail, and automotive industries. 

The company has also developed a decision intelligence platform called AaDi. The platform serves as a copilot for finance functions and provides agentic capabilities and conversational features that help perform root cause analysis, flux reporting and variance or bridge analysis. Its multi-agent system orchestrates several tasks involving information retrieval, natural language understanding, and data extraction, summarisation and visualisation. 
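The article describes AaDi’s orchestration only at a high level. As a generic illustration of the pattern – several specialised agents passing a shared context through retrieval, summarisation and similar steps – here is a minimal plain-Python sketch; every class, task and return value in it is hypothetical and not Aays’ implementation.

```python
# Generic multi-agent orchestration sketch: each "agent" is a step that reads
# and enriches a shared context. Illustrates the pattern described, not AaDi.
from dataclasses import dataclass, field
from typing import Callable, Dict, List


@dataclass
class Task:
    name: str
    run: Callable[[Dict], Dict]      # an agent consumes and enriches the context


@dataclass
class Orchestrator:
    tasks: List[Task] = field(default_factory=list)

    def execute(self, question: str) -> Dict:
        context = {"question": question}
        for task in self.tasks:      # sequential hand-off between agents
            context = task.run(context)
        return context


pipeline = Orchestrator(tasks=[
    Task("retrieve", lambda ctx: {**ctx, "rows": ["Q3 spend up 12%"]}),        # stub retrieval
    Task("summarise", lambda ctx: {**ctx, "summary": "; ".join(ctx["rows"])}), # stub summary
])
print(pipeline.execute("Why did spend increase in Q3?")["summary"])
```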

“I am incredibly proud of the team we’ve built – talented, experienced professionals who bring genuine passion and purpose to their work every day. The success we are seeing wouldn’t be possible without their dedication, and I am committed to making sure Aays is a place where they can keep growing, pushing boundaries, and thriving while creating impactful solutions for our clients,” said Dwarika Patro, founder and COO at Aays.

The company earned the award for the second time in 2024. 

Carrier

Carrier Global, the company that specialises in heating, ventilation, air conditioning and refrigeration, established its first digital hub in Hyderabad in 2017.

Carrier’s India Hub is the company’s largest digital transformation centre, making up nearly 40% of the company’s digital talent from its two locations in Hyderabad and Bengaluru.

The India hub significantly contributes to Carrier’s flagship products: Abound, a cloud-native platform for healthy and efficient buildings, and Lynx, a cold chain platform that uses advanced data analytics, the Internet of Things (IoT), and machine learning. The company is also set to establish an AI centre of excellence in India with over 100 employees.

“Carrier Digital Hub India plays a pivotal role in our global strategy, accelerating our digital transformation journey. By leveraging India’s robust talent pool and strategic capabilities in cybersecurity, cloud operations, customer experience, IoT, and data science, we enhance operational agility and drive innovation across our enterprise,” said Bobby George, senior vice president and CDO of Carrier, in a recent interview with AIM.

Diggibyte

Diggibyte Technologies Private Limited is a technology consulting and services company that specialises in automating IT operations with artificial intelligence and business intelligence tools. Their services also include IoT streaming, and the company processes a trillion messages yearly from devices. 

The company is also said to have developed over 1,000 data pipelines, which process 10 petabytes of data yearly, and to have successfully migrated over 500 legacy data warehouse objects to modern warehouses. 

“We are delighted to share the exciting news that Diggibyte Technologies Pvt Ltd has been honoured as the best firm for data engineers. This prestigious recognition is a reflection of our exceptional team and the outstanding culture we’ve fostered,” said Lawrance Amburose, co-founder at Diggibyte. 

In 2024, Diggibyte Technologies was certified as the best firm for data engineers by AIM.

General Mills

General Mills’ foundation dates back to 1866 in Minneapolis, US. The company houses some of the world’s iconic brands like Cheerios, Pillsbury, Nature Valley and Häagen-Dazs. 

In 1996, it established the General Mills India Centre (GIC) in Mumbai, which currently has 2,000 employees. The company delivers value across multiple arenas of supply chain, sales strategy and intelligence, consumer and market intelligence, and so on. 

Earlier this year, the company introduced MillsChat, a private generative AI tool for employees in non-plant locations across the US, Canada, the UK, and India.

It is built atop Google Gemini and provides a secure platform for writing, summarising, and brainstorming assistance. The company is also examining how AI can drive value, competitive advantage, and growth with its commercial, marketing, and supply chain teams utilising various AI and ML products.

“We have cultivated a culture of innovation through our focus on continuous learning and inclusion. Our team thrives on collaboration and is dedicated to pushing boundaries,” said Ashish Mishra, head of digital and technology at General Mills India Centre.

Hansa Cequity

Hansa Cequity, founded in 2008, is headquartered in Mumbai. It helps brands and organisations harness the power of AI and data analytics for marketing, customer strategy, campaign management, and customer relationship centre services. 

Hansa Cequity also offers a suite of ten AI-enabled tools, all designed to transform customer interactions and enhance decision-making and operational efficiency. Tools like Varta enable AI-powered call analytics, while Cequity SMART offers a real-time recommendation engine for e-commerce. Other tools include a market potential analysis platform, an image analytics tool, and a data deduplication tool to ensure accuracy. 

“Hansa Cequity excels at merging data rigor with marketing deployment and customer centricity, creating impactful solutions for clients. This commitment not only drives good results but also empowers our analytics team members to thrive in their careers. The environment fosters growth and innovation, paving the way for professional excellence,” said Prasad Kothari, head of data science and AI at Hansa Cequity.

The company has been certified as the best firm for data scientists thrice, including in 2024.

Intuit

Intuit was founded in California in 1983, and the company established a centre in Bengaluru in 2005. Intuit India focuses on developing and improving the company’s flagship products, namely QuickBooks, TurboTax, Credit Karma, and Mailchimp. 

Intuit’s India development centre employs around 1,900 people. In India, the company designed the Intuit Enterprise Suite, which enables rapid scaling and adoption of the company’s financial management tools. 

“At Intuit AI, I am inspired every day by the exceptional team of ML scientists and engineers I work with. We are dedicated to pushing the boundaries of machine learning and AI to deliver outstanding benefits to our customers,” said Anusha Mujumdar, senior manager in data science and AI at Intuit.

“We are committed to nurturing our teams, fueled by mentorship, access to the latest tech stack, and a vibrant AI culture,” she added. 

MiQ

MiQ is a global programmatic media partner that offers data-driven marketing solutions for agencies and brands. Founded in the United Kingdom in 2019, the company opened its first office in India in 2020. 

MiQ’s services include predictive analytics, campaign management, and audience targeting to deliver high-performance marketing campaigns. MiQ’s Performance Engine enables marketing professionals to deploy “intelligent optimisation strategies” and use necessary custom algorithms to achieve their KPIs. 

The platform crunches over 2.6 petabytes per day from over 170 data feeds to enable over 500 custom analytics setups. The company also mentions that it completes nearly 10,000 campaign optimisations every week. 

“The impact that data has and will have continues to grow every day. Our focus has been on solving major business challenges that go beyond media campaigns. Thanks to our entire team for making our DS practice one of the best in the industry,” said Ramya Parashar, chief operating officer at MiQ.

AIM has awarded MiQ the certification for the third time. 

Rakuten

Bengaluru-based Rakuten India houses more than 1,700 employees. The company provides businesses with a wealth of knowledge in multiple sectors of technology, such as mobile and web development, web analytics, platform development, backend engineering, data science, artificial intelligence (AI), machine learning (ML) and more. 

The company also features dedicated centres of excellence for data analytics, DevOps (development operations), information security, engineering, and mobile applications development.

“With a culture fueled by innovation, usage of cutting-edge technology, collaboration and strong business communication, we’re proud to be the premier destination where AI talent thrives, and revolutions begin,” said Anirban Nandi, head of AI products and analytics (Vice President) at Rakuten India.

Rakuten was certified as the best firm for data scientists for the second time in 2024. 

Palantir’s New Cohort to Drive Manufacturing Innovations at ‘Warp Speed’
https://analyticsindiamag.com/ai-news-updates/palantirs-new-cohort-to-drive-manufacturing-innovations-at-warp-speed/ | Thu, 12 Dec 2024
‘At the dawn of WW2, we didn’t have a Defense Industrial Base; we had an American Industrial Base. This is also what our future must look like.’

Palantir, the US-based data analytics firm, has announced a new cohort of companies that will use its tool Warp Speed to reindustrialise the United States’ manufacturing and production capabilities. Companies including Anduril Industries, L3Harris, Panasonic Energy of North America (PENA), and Shield AI are part of the cohort. 

Warp Speed is Palantir’s manufacturing operating system that provides a unified platform for companies to access multiple production tools. Warp Speed is focused on adapting to companies’ production and business processes rather than the other way around. 

The platform combines several tools like enterprise resource planning (ERP), manufacturing execution systems (MES), product lifecycle management (PLM), and programmable logic controllers (PLC). This includes products from notable companies such as Siemens, SAP, Oracle and SolidWorks.

“The inaugural cohort is already using the software to gain an advantage in dynamic production scheduling, engineering change management, automated visual inspection for quality, and more,” read the announcement. 

One of the member companies in the cohort, Anduril, observed a 200-fold efficiency gain in dealing with supply shortages. A few days ago, Anduril announced a partnership with OpenAI to bring AI technologies to US national defence and security interests. 

Leaders from other companies followed suit with a similar sentiment. “Warp Speed is enabling us to rapidly transform our manufacturing operation in Nevada and accelerate the ramp-up of our new factory in De Soto, Kansas,” said Allan Swan, President of PENA. 

In another instance, Shield AI says they are able to handle a ‘record demand’ for V-BAT, their flagship unmanned aerial system, through Warp Speed OS. “Warp Speed will help our different functions identify chokepoints and stay in lock step,” said Ryan Tseng, CEO and founder of Shield AI.

Last month, Palantir also announced a partnership with AWS and Anthropic to provide US intelligence and defence agencies with Claude 3 and 3.5 models. 

“Our partnership with Anthropic and AWS provides US defence and intelligence communities the toolchain they need to harness and deploy AI models securely, bringing the next generation of decision advantage to their most critical missions,” said Shyam Sankar, chief technology officer at Palantir.  

Lingaro CEO Thinks the GenAI Enterprise Revolution is Slower Than it Looks
https://analyticsindiamag.com/ai-features/lingaro-ceo-thinks-the-genai-enterprise-revolution-is-slower-than-it-looks/ | Sat, 23 Nov 2024
India is a critical hub for Lingaro, as CEO Mantle emphasised the country's "tremendous wealth of data talent" and its reputation as a global leader in the tech services ecosystem.

When Lingaro Group CEO Sam Mantle visited Bengaluru last week, the city’s traffic may have overwhelmed him, but the comfort of one of its luxurious hotels provided him with a much-needed respite. We at AIM had the opportunity to catch up with him in this elegant setting, delving into his thoughts and vision for Lingaro’s future in India.

The Polish IT firm is focused on data, and data alone, as Mantle highlighted Lingaro’s unwavering focus on this core area.

The company first set foot in India a few years ago. Now, it is here again with ambitious plans to increase its revenue by 30% year over year and double its workforce to 400 employees. 

A realist at heart, Mantle is pragmatic about industry trends and has strong opinions on generative AI. 

“There’s been a lot of hype around generative AI, but the promise hasn’t been realised. It’s going to take a lot longer than people think,” he said. Mantle isn’t the only one with this sentiment, and there’s more to the story. 

Barriers to the AI Promise

For one, Mantle mentioned that the legacy workforce is the biggest barrier to adopting a disruptive technology like AI. 

“Most people have not grown up with the capabilities that are available today. So, we have to rewire the way we think and the way we’re organised,” he explained. 

Moreover, Mantle also pointed out that, unlike individuals, it isn’t going to be easy for enterprises to adopt generative AI quickly, like the flip of a switch. 

Despite the difficulties, companies have been actively deploying AI services and products in their workflows. At the recent Microsoft Ignite 2024 event, Microsoft said that nearly 70% of the Fortune 500 companies now use Microsoft 365 Copilot. Similarly, LangChain’s recent survey revealed that 51% of companies have already implemented AI agents in their tech stack. 

India is not far behind, either. It was recently reported that over 18,000 developers at Infosys have written code using AI and that the service provider giant is fully embracing generative AI.

That said, Lingaro is a proponent of AI, and the progressive adoption of AI tools aligns seamlessly with its vision. No matter what specific AI use case a company is implementing, Lingaro says it provides the building blocks – data services, which it describes as its core ethos. 

Lingaro sees a big opportunity to assist companies in adopting generative AI. Mantle said that most companies don’t have the right level of data ownership and governance. If companies do not know what data they are using for AI algorithms, they might not be able to realise the best outcome. 

Mantle believes that regardless of where companies choose to prioritise, the backend, the engine, the accelerators, and the data assets must all be orchestrated seamlessly so they can quickly gain an advantage.

This is also what the likes of Snowflake and Databricks are doing, albeit as products. Mantle revealed that Lingaro isn’t offering a product of its own; rather, it partners with Snowflake, Databricks, and other data-focused platforms on the market. Moreover, Lingaro has established close relationships with ‘hyperscalers’ like Microsoft, Google, and AWS, indicating its role in a broader section of the ecosystem. 

“Increasingly, the big enterprises are moving more towards hybrid environments. For example, you have to combine Azure with GCP. You have to combine GCP with AWS. Nobody wants to be all in with one—that’s beautiful for us because that’s the complexity that we need to help them navigate,” Mantle further said. 

That said, its big ambitions in India face even bigger competitors in the world of AI. 

Battle With the Great Indian IT

India is a critical hub for Lingaro, as Mantle emphasised the country’s “tremendous wealth of data talent” and its reputation as a global leader in the tech services ecosystem.

With just over 2,000 successful projects, Lingaro’s portfolio is rather humble in comparison to industry giants like HCL, Infosys, or TCS. These companies are also exponentially expanding their project portfolios with time. 

For example, in the Q1 2024-25 earnings call, TCS chief K Krithivasan said, “We are currently executing about 270 AI projects across TCS. Our AI pipeline has doubled in a quarter to $1.5 billion. Our investments in research and innovation continue. In Q1, we applied for 154 patents and were granted 277 patents.” 

At the FY25 Q2 earnings call, the company announced that over 600 AI and generative AI engagements had been deployed successfully. 

Even HCL recently onboarded 25 new clients, owing to the success of its AI suite, ‘GenAI Force’. This suite consists of some of the best market-leading AI products, including Anthropic’s Claude and GitHub Copilot. So, if Lingaro has to make a mark, it has strong competition to overcome. 

Lingaro differentiates itself by avoiding the ‘doing everything for everyone’ approach. Mantle said Lingaro focuses on data-specific services for its clients instead of whole solutions. “We’re only delivering data-related services, so we’re not distracted by all of the other things that are going on in the industry,” he added. 

Mantle also highlighted Lingaro’s priority of understanding clients’ technology and business needs before laying out its expertise in data. He believes this ability to combine technology and domain-level data understanding is what sets Lingaro apart from most other, more generic services. 

Notably, we are living in a time when AI agents are poised to make it big, possibly threatening service providers. This is especially true due to the ease with which companies can build and deploy these agents. Owing to this, several companies in India’s IT sector may face difficulties.

Mantle, however, believes that deploying application layers across all parts of a company’s IT estate isn’t everything one needs to do. He believes the onus must instead be on the data component inside these applications and its role. 

“I’m interested in who owns the data component that may sit in that application. Because if we really want to streamline things, somebody has to be responsible for that data, no matter where it flows within the organisation,” he concluded.

UST Expands India Presence with a Second Office in Bengaluru
https://analyticsindiamag.com/ai-news-updates/ust-expands-india-presence-with-a-second-office-in-bengaluru/ | Tue, 05 Nov 2024

UST, a digital transformation solutions company, has opened its second delivery center in Bengaluru, Karnataka, as part of its ongoing expansion in India. The new facility, located in Helios Business Park, Kadabeesanahalli, covers over 17,000 square feet and accommodates more than 300 workstations, featuring a Design Experience Center and other modern amenities.

Founded in 1999 and headquartered in California, UST operates multiple offices across India and employs over 20,000 individuals in the country. The company is marking its 25th anniversary this year, reflecting on its growth and commitment to innovation in the digital transformation space.

Bengaluru is now UST’s second-largest global delivery center, housing over 6,000 employees. The company first established operations in the city in 2012. UST’s growth in India has been significant, with plans for additional facilities, including a second campus in Kochi, Kerala, aimed at creating 3,000 jobs over the next five years.

The inauguration of the new office was attended by UST executives, including Alexander Varghese, Chief Operating Officer, and local leadership, who highlighted the importance of Bengaluru as a hub for IT and technology talent. The expansion is expected to enhance UST’s capabilities in delivering innovative solutions across various sectors, including healthcare, logistics, and retail.

Bangalore Leads the Way in Sourcing Talent for Frontend, Backend, DevOps, and Data Science Roles in India
https://analyticsindiamag.com/ai-news-updates/bangalore-leads-the-way-in-sourcing-talent-for-frontend-backend-devops-and-data-science-roles-in-india/ | Tue, 08 Oct 2024
Bangalore’s ability to consistently produce top-tier talent is a key reason behind India’s standing on the global stage, not just in terms of technological output but also as a source of talent.

A recently released Instahyre report states that Bangalore has emerged as the most sought-after sourcing location for multiple BFSI tech skills – frontend, backend, DevOps and data science – followed by Pune and Hyderabad. The report further revealed that Bangalore also tops the list as a source of DevOps talent, providing more than 30% of professionals skilled in Docker, Kubernetes, Jenkins and AWS. 

Over the years, Bangalore has become a key location for the disruptive startup ecosystem, tech MNCs, and a steady influx of technology innovations. The city is now a hotbed for companies looking to tap into top-tier talent.

Despite facing uncertainties and layoff announcements, tech hiring in the Indian IT industry remains resilient, showing a positive growth outlook. 

Data indicates that major IT companies are planning to expand their workforce to meet the growing demand for IT services in India. On average, these tech giants are expected to add between 40,000 and 50,000 new employees across various tech roles in the near future.

Bullish on the industry’s growth, Sarbojit Mallick, co-founder of Instahyre, said, “Bangalore, the ‘Silicon Valley of India’, has solidified its position as the leading hub for tech talent in the country. It consistently outshines other Indian cities, making it a crucial player in both the national and global tech landscape.” Mallick further added that along with Hyderabad, Bangalore was the cradle of tech services right from the outsourcing days. “Building on that legacy and the startup boom, it has cemented its leading position year-on-year, as our report’s data shows,” he noted. 

In data science roles, Bangalore again hosts more than 30% of the talent for machine learning, computer vision, NLP and data visualisation, followed by Pune and Hyderabad. The city also leads the pack in sourcing security professionals proficient in vital security skills like information security, security testing, application security, and network security. 

When it comes to talent sourcing, the Instahyre report reveals that Bangalore holds the lion’s share, supplying experts skilled in Data Analysis, Warehousing, Data Collection, and Data Extraction.

In addition, the city also leads tech hiring within the Networking space, providing skilled talent from Network Analysis, Network Testing, Network Admin and Troubleshooting functions and QA + Risk skills including Quality Assurance, Quality Control, Risk Assessment, and Risk Management. 

Given Bangalore’s cutting-edge technology ecosystem, the city continues to strengthen its position as the preferred destination for companies seeking skilled BFSI tech talent across various functions. Without doubt, Bangalore’s ability to consistently produce top-tier talent is a key reason behind India’s standing on the global stage, not just in terms of technological output, but also as a source of talent.

Data Science Hiring and Interview Process at SAP Labs India
https://analyticsindiamag.com/ai-hiring/data-science-hiring-process-at-sap-labs-india/ | Mon, 29 Jul 2024
As a data scientist at SAP Labs, you will analyse large datasets, implement best practices to enhance ML infrastructure, and support engineers and product managers in integrating ML into products.

German tech conglomerate SAP Labs has been one of the major players in the generative AI race on the enterprise side. The company recently introduced Joule, a natural-language generative AI assistant that allows access to the company’s extensive cloud enterprise suite across different apps and programs. It will provide real-time data insights and action recommendations.



With a global presence in 19 countries, SAP Labs is responsible for driving SAP’s product strategy, developing and localising its core solutions, and contributing to the SAP Business Technology Platform. 

SAP was founded in 1972 by five former IBM employees: Dietmar Hopp, Hasso Plattner, Claus Wellenreuther, Klaus Tschira, and Hans-Werner Hector. SAP Labs is the R&D arm of SAP, with its second-largest office space in Bengaluru. 

AIM got in touch with Shweta Mohanty, vice president and head of human resources, SAP India, and Dharani Karthikeyan, vice president and head of engineering for analytics, SAP Labs India, to understand the company’s AI and analytics play, customer stories, hiring process for data scientists, work culture and more. 

AI & Analytics Play

“We have fully embraced generative AI in our business AI concept, aiming to provide AI that is responsible, reliable, and relevant. The goal is to infuse AI into business applications, with a focus on trust and outcomes,” Karthikeyan told AIM.

SAP has a portfolio of over 350 applications spanning various use cases, from cash management to document scanning. The company is enhancing its Business Technology Platform (BTP) with a generative AI layer. The team aims to improve business processes while maintaining human control over decisions. They have collaborated with Microsoft on Human Capital Management tools, combating biases in recruiting, and introduced a Business Analytics tool for faster insights. 

SAP is also partnering with Google Cloud to launch a holistic data cloud, addressing data access challenges. Additionally, they have invested in generative AI players Anthropic, Cohere, and Aleph Alpha, diversifying their capabilities.

Interview Process

The hiring process for tech roles involves five to six steps starting with profile screening, focusing on the candidate’s development background and programming language proficiency. As described by Mohanty, this is followed by an online assessment to test programming skills, lasting 60 to 90 minutes. Technical interviews include case studies to assess proficiency and hands-on experience. 

For senior roles, there’s a discussion with a senior leader to gauge cultural alignment. The final step is an HR discussion focusing on cultural fit and interest in the organisation. For college recruitment, the process includes live business solutions assessments. The process concludes with a rigorous background verification.

When it comes to finding the right fit for SAP Labs, “the ideal candidate should have a comprehensive understanding of ML algorithms and the ability to build and maintain scalable solutions in production,” added Karthikeyan, highlighting that this involves using statistical modelling procedures, data modelling, and evaluation strategies to find patterns and predict unseen instances. 

The roles involve applying computer science fundamentals such as data structures, algorithms, computability, complexity, and computer architecture. Collaborating with data engineers is also essential for building data and model pipelines, as well as managing the infrastructure needed for code production. 

As a data scientist at SAP Labs India, you will also analyse large, complex datasets, research and implement best practices to enhance the existing ML infrastructure, and support engineers and product managers in implementing ML into products.
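As a rough illustration of the modelling and evaluation work described above, the sketch below builds a scikit-learn pipeline and estimates how it would perform on unseen instances via cross-validation. It is a generic example under assumed column names, not SAP’s tooling or code.

```python
# Illustrative model pipeline with an evaluation strategy for unseen instances.
# The dataset path, the "label" column and the estimator are assumptions.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler

df = pd.read_csv("training_data.csv")              # hypothetical dataset
X, y = df.drop(columns=["label"]), df["label"]     # placeholder target column

pipeline = Pipeline([
    ("scale", StandardScaler()),                   # keep preprocessing inside the pipeline
    ("clf", LogisticRegression(max_iter=1000)),
])

# 5-fold cross-validation approximates performance on data the model has not seen
scores = cross_val_score(pipeline, X, y, cv=5, scoring="f1_macro")
print(f"Mean F1: {scores.mean():.3f} +/- {scores.std():.3f}")
```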

Work Culture in SAP

SAP’s work culture is characterised by abundant learning opportunities and hands-on experiences where employees have chances to shadow leading data scientists, participate in fellowship projects for stretch assignments, and explore various aspects. This hands-on approach extends to customer interactions and pre-sales experiences. 

“These opportunities, along with the focus on learning and customer engagement, give SAP an edge over other organisations hiring in data science and machine learning,” Mohanty commented.

SAP prioritises its employees’ well-being through a comprehensive set of benefits and rewards. The company recognises diverse needs beyond healthcare and retirement plans, offering global and local options for work-life balance, health and well-being, and financial health. 

Embracing a highly inclusive and flexible culture, the company promotes a hybrid working model allowing employees to balance office and remote work. Employee Network Groups foster a sense of community, and inclusive benefits include competitive parental leave and disability support. 

The ERP software giant also aims to foster personal and professional growth, providing learning opportunities, career development resources, and a leadership culture focused on doing what’s right for future generations. It values fair pay, employee recognition, generous time-off policies, variable pay plans, total well-being support, and stock ownership opportunities for all employees.

Why Should You Join SAP Labs?

SAP Labs offers a sense of purpose and involvement in transformative technology phases. At SAP, candidates dive into cutting-edge technologies, explore diverse industries, and embrace continuous learning and innovation. 

Mohanty explained how the team values adaptability, emphasising fungible skills and a proactive mindset, especially in areas like AI and generative AI. 

“We seek individuals ready to tackle new challenges and solve complex problems, fostering a dynamic and impactful work environment,” she explained. 

Adding to what Mohanty said, Karthikeyan noted, “The work at SAP involves mission-critical applications, like supporting cell phone towers or vaccine manufacturing, so the integration of generative AI into these applications offers a unique combination of purpose and technological advancement, providing developers with a high sense of purpose in seeing their software run essential business and retail operations. This phase of technological transformation at SAP is especially significant for new joiners.” 

Check out the job openings here.

Data Science Hiring and Interview Process at ServiceNow
https://analyticsindiamag.com/ai-hiring/data-science-hiring-process-at-servicenow/ | Mon, 29 Jul 2024
The company has eight open positions for applied research scientists and ML engineers.

California-based ServiceNow, one of the leading cloud companies delivering software as a service (SaaS), has brought purpose-built AI to its intelligent Now Platform. Last September, the company expanded this with a domain-specific ServiceNow language model designed for enterprise use.



The Now Platform converts machine intelligence into practical actions, aiming to enhance process efficiency, reduce risk, optimise workforce productivity, and facilitate automated workflows with the help of purpose-built AI, providing users with self-solving capabilities through augmented intelligence.

“With this (NOW Platform), we are enabling enterprises to increase process efficiency, minimise risk by avoiding human mistakes, optimise workforce productivity to focus on higher value tasks​, leverage automated workflows to drive standardisation and empower users to self-solve with augmented intelligence,” Sumeet Mathur, vice president and managing director of ServiceNow’s India Technology and Business Center, told AIM. 

The company has eight open positions in data science.

Applied research scientists in the Core LLM Team focus on developing generative AI solutions, collaborating with diverse teams to create AI-powered products and work experiences. Simultaneously, they conduct research, experiment, and mitigate risks associated with AI technologies to unlock novel work experiences. 

On the other hand, as a machine learning engineer, you’ll craft user-friendly AI/ML solutions to enhance enterprise services’ efficiency, emphasising accessibility for users with varying technical knowledge. 

Inside ServiceNow’s AI & Analytics Lab

The Now platform aims to create proactive and intelligent IT processes. The platform is built around big data and advanced analytics, incorporating real-time and stored data to enhance accessibility and support various use cases, such as self-service, incident detection, pattern discovery, knowledge base optimisation, workflow automation, and user empowerment. 

ServiceNow’s self-service has evolved with augmented AI and automation, using intelligent virtual agents to understand customer intent and resolve complex issues. Augmented agent support focuses on improving human capabilities through recommendation engines, automated workflows, and increased productivity, aligning with specific business objectives for measurable value.

Tapping into Generative AI

Last September, the company expanded its Now Platform using a domain-specific ServiceNow language model designed for enterprise use, prioritising accuracy and data privacy. The Now LLM incorporates top-notch foundational models, including StarCoder, a pre-trained model developed in collaboration with Hugging Face, alongside a partnership with NVIDIA and other open-source models. 

The initial release of Now LLM introduces features such as interactive Q&A, summarisation capabilities for incidents/cases and chats, and assistive code generation for developers. The development of this model involved significant efforts from engineering, research, product, QE, and design teams, as well as data centre operation teams managing the GPU infrastructure. 
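As a loose illustration of what an incident-summarisation feature does under the hood, the sketch below condenses free-text incident notes with an open-source summarisation model via Hugging Face transformers. It is not ServiceNow’s Now LLM or its API; the model choice and the sample notes are assumptions.

```python
# Generic incident-summarisation sketch with an open-source model.
# This illustrates the kind of feature described, not ServiceNow's actual API.
from transformers import pipeline

summariser = pipeline("summarization", model="facebook/bart-large-cnn")

incident_notes = (
    "User reports the VPN disconnects every 30 minutes since the gateway upgrade. "
    "Network team rolled back the gateway firmware; monitoring shows no further drops."
)
result = summariser(incident_notes, max_length=40, min_length=10, do_sample=False)
print(result[0]["summary_text"])
```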

Clients like Mondelez, Delta, Standard Chartered, Coca-Cola, LTIMindtree, and various other companies across industries have used the platform for AI applications in areas like improving healthcare workflows, providing financial auditors with quick insights, and transforming supply chain management in manufacturing. 

“We believe that the most constructive and value-creating strategies for generative AI are grounded in embedding human experience and expertise into its core capabilities,” added Mathur. 

The company thus adopts a human-in-the-loop model for generative AI, integrating human expertise into its core capabilities. The Now Platform’s generative AI is applied in diverse use cases, including case summarisation, content generation, conversational exchanges, and code generation. 

Interview Process

“Our hiring process for data science roles follows a structured approach aimed at attracting a diverse pool of qualified candidates. We publish job openings on various platforms, including our career site, job boards, social media, and professional networks,” added Mathur. The process involves careful evaluation through interviews to ensure the selection of the right candidate. 

The interview process consists of three technical rounds, each focusing on key competencies such as programming proficiency and experience with core ML and LLM. This assessment is followed by an interview with the hiring manager and, for certain roles, an additional round with the senior leadership. 

However, Mathur shared that during the data science interview process, candidates often make common mistakes that should be avoided. Some of them include inadequate technical readiness, a limited understanding of the company’s objectives and role, failure to ask insightful questions, overlooking the latest AI/ML trends, and neglecting to demonstrate effective problem-solving skills. 

Expectations

Upon joining the data science team at the Advanced Technology Group (ATG) of ServiceNow, candidates can expect to work within a customer-focused innovation group. The team builds intelligent software and smart user experiences using advanced technologies to deliver industry-leading work experiences for customers. 

The ATG comprises researchers, applied scientists, engineers, and product managers with a dual mission: building and evolving the AI platform and collaborating with other teams to create AI-powered products and work experiences. The company expects that team members will contribute to laying the foundations, conducting research, experimenting, and de-risking AI technologies for future work experiences.

Work Culture

“Our company fosters a purpose-driven work culture where employees have the opportunity to be part of something big. We make work better for everyone—including our own. We know that your best work happens when you live your best life and share your unique talents, so we do everything we can to make that possible for our employees,” Mathur added.

Some of the key perks include a hybrid working model, paid time off, well-being days, employee belonging groups, DEI learnings, internal opportunities, and paid volunteering.

According to him, joining ServiceNow means becoming part of an inclusive and diverse community with resources for well-being, mental health, and family planning, among others. Prioritising value and trust, the SaaS giant provides ongoing support for learning and development, growth pathways, and action-oriented feedback aligned with clear expectations. The programs cater to individuals at all career stages. 

“We’re committed to creating a positive impact on the world, building innovative technology in service of people – with a core set of values and a deep responsibility to each other, our customers and our global communities,” he concluded.

Check out the careers page now.

Data Science Hiring and Interview Process at Happiest Minds Tech
https://analyticsindiamag.com/ai-hiring/data-science-hiring-process-at-happiest-minds-2/ | Mon, 29 Jul 2024
Happiest Minds is currently on the lookout for a specialist in marketing analytics with over 8 years of relevant experience.

Founded in 2011 by Ashok Soota, a serial entrepreneur and Indian IT veteran, Happiest Minds boasts a robust data science team comprising over 300 members, including data engineers, intelligence specialists, and data science experts.



Based in the Silicon Valley of India, Bangalore, and extending its reach across the global landscape, including the US, UK, Canada, Australia, and the Middle East, this IT juggernaut seamlessly blends augmented intelligence with the art of understanding human language, deciphering images, analysing videos, and harnessing cutting-edge technologies such as augmented reality and virtual reality.

This dynamic fusion empowers enterprises to craft captivating customer interactions that surpass rivals and set new industry standards.

Happiest Minds distinguishes itself from traditional IT companies by avoiding legacy systems like SAP and ERP, believing that staying entrenched in these technologies limits growth and innovation. “Instead, we have chosen to focus on digital technologies like AI, which is the future of IT,” said Sundar Ramaswamy, SVP, Head of Analytics CoE, in an exclusive interview with AIM.

The team conducts regular market scans to identify the latest technologies and ensures that they are always on the forefront of innovation. This approach allows them to co-create and innovate with clients while building new solutions.

Now Hiring

Happiest Minds is currently on the lookout for a specialist in marketing analytics. The ideal candidate should possess a Master’s or Bachelor’s degree in Computer Science, STEM, or an MBA, demonstrating strong problem-solving skills. They should also have over eight years of experience in the analytics industry, particularly in marketing. 

This experience should include a track record of using AI to enhance the customer journey, encompassing areas such as customer acquisition, nurturing, retention, and improving the overall experience.

The technical skills required include proficiency in statistical techniques, ML, text analytics, NLP, and reporting tools. Experience with languages and tools such as R, Python, Hive and SQL, along with the ability to handle and summarise large datasets using SQL, Hive-SQL, or Spark, is essential.

Additionally, knowledge of open-source technologies and experience with the Azure or AWS stack are desirable.
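For a sense of what the “summarise large datasets with Spark” requirement looks like in practice, here is a minimal PySpark sketch that aggregates a hypothetical transactions table. The storage path and column names are placeholders, not Happiest Minds code.

```python
# Minimal PySpark aggregation over a large (hypothetical) transactions table.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("marketing-summary").getOrCreate()

df = spark.read.parquet("s3://bucket/transactions/")       # placeholder source path

summary = (
    df.groupBy("customer_segment")                          # assumed column
      .agg(
          F.countDistinct("customer_id").alias("customers"),
          F.sum("order_value").alias("total_revenue"),
          F.avg("order_value").alias("avg_order_value"),
      )
      .orderBy(F.desc("total_revenue"))
)
summary.show()
```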

AI & Analytics Play

This team collaborates closely with domain teams across diverse industry verticals. Their analytics process follows eight key steps. They integrate data from multiple sources, use BI tools for descriptive analytics, perform ad hoc analysis, build data pipelines and auto ML pipelines, retrain models regularly, focus on customer understanding, optimise cloud usage, and ensure data governance.
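Steps like building auto ML pipelines and retraining models regularly typically reduce to a scheduled retrain-and-promote job. The skeleton below is a generic sketch of that idea; the data extract, the churn target, the AUC threshold and the joblib “registry” are all assumptions, not the team’s actual tooling.

```python
# Generic retrain-and-evaluate skeleton: refit on a refreshed extract and only
# promote the model if it clears an evaluation gate. All specifics are assumed.
import joblib
import pandas as pd
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

AUC_THRESHOLD = 0.75                                    # assumed promotion gate


def retrain(data_path: str, model_path: str) -> float:
    df = pd.read_csv(data_path)                         # hypothetical refreshed extract
    X, y = df.drop(columns=["churned"]), df["churned"]  # placeholder target column
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

    model = GradientBoostingClassifier().fit(X_tr, y_tr)
    auc = roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])

    if auc >= AUC_THRESHOLD:                            # only promote models that pass
        joblib.dump(model, model_path)
    return auc
```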

Their key industry verticals are CPG retail, healthcare (bioinformatics), FSI, media entertainment, and Edtech, with growing interest in manufacturing. The team works with classical analytics, deep learning, computer vision, NLP, and generative AI. This includes advanced applications like language translation and content generation from 2D to 3D images using generative AI.

Recognising the growing importance of generative AI, they have formed a dedicated task force comprising approximately 50 to 60 members, drawn from diverse domains, under the leadership of their CTO with the primary objective to leverage generative AI in addressing industry-specific challenges.

To achieve this, they’ve identified and categorised 100 to 250 distinct use cases across ten different domains, tailored to the specific requirements of each domain. The team is diligently working on creating demos and proof of concepts (POCs) that are domain-specific. 

Some team members come from analytics backgrounds, contributing their technical expertise, while others from domain areas contribute to shaping ideas and ensuring results align with the industry’s needs. This undertaking is substantial for the organisation, considering they have around 5,500 employees, with 100-160 dedicated solely to generative AI. 

In addition to building demos, the company is also focusing on educating its entire workforce about LLMs and their applications to equip all team members with a basic understanding of generative AI’s capabilities and potential applications.

To bring generative AI into action, the company is working with Microsoft’s suite of products. “We are a Microsoft select partner and are also experimenting with different language models,” he added.

The team initially experimented with Google’s BERT and now employs models like GPT-2. They have a strategic inclination towards refining existing models to suit specific applications, rather than developing entirely new foundational models. For example, they collaborate with a healthcare company to craft adaptive translation models with reinforcement learning.

Interview Process

“Data science is not just about technical skills; it also involves an element of art. Candidates are assessed on their ability to communicate their results effectively and their capacity to approach problems with creativity,” said Ramaswamy.

The interview process for data science candidates at Happiest Minds typically involves three to five levels of interviews. The first level is a screening by the HR team based on the job description. This is followed by a written test to assess the candidate’s proficiency in relevant languages and skills. For example, if the position is for a data engineer, the test might evaluate their ability to work with SQL and other database-related tasks. 

Technical interviews are conducted using case studies to evaluate the candidate’s problem-solving ability and approach. The interview process concludes with a leadership interview, especially if the position is a senior one.

In addition to understanding the interview process, candidates often wonder about the common mistakes they should avoid. According to Ramaswamy, there are two main pitfalls that candidates often fall into. First, many candidates focus excessively on specific tools or techniques and become fixated on mastering them.

“While technical proficiency is essential, it’s equally important to explain the problem being solved, the reasons for approaching it a certain way, and considering alternative solutions,” he added.

The second common mistake is becoming too narrowly focused on the solution without understanding the broader context. It's crucial to see the big picture, understand why the problem is being solved for the client, and ask relevant questions about the projects they've worked on. 

In terms of skills, the company looks for both technical and non-technical abilities. The specific skills depend on the role of the position, such as data engineering, business intelligence, or data science. 

However, primary technical skills include proficiency in relevant tools and technologies, certifications, and problem-solving abilities. Non-technical skills are communication and presentation skills, problem-solving skills, and the ability to coach and mentor, as collaboration and teamwork are essential for senior positions.

Work Culture

"As the company's name suggests, we aim to cultivate a distinctive work culture based on four fundamental pillars," Ramaswamy commented. Certified as a Great Place to Work, the company prioritises the well-being of its employees, believing that "a content workforce leads to happy customers". They monitor and maintain employee happiness closely, offering support to those facing personal or professional challenges.

Collaboration is another key element of their culture, as they encourage a unified approach within and across different units and locations. “As a company born in the digital age, Happiest Minds thrives on agility, adapting swiftly to meet the ever-changing needs of customers and the digital industry,” he added.

Transparency is the fourth pillar, as they openly share key performance indicators and objectives with their employees, investors, and stakeholders. This culture of transparency and goal-oriented approach ensures that their efforts are always aligned with clear objectives and tracked diligently.

If you think you fit the role, check out their careers page now. 

Read more: Data Science Hiring Process at PayPal

]]>
Data Science Hiring and Interview Process at Wipro https://analyticsindiamag.com/ai-hiring/data-science-hiring-process-at-wipro/ Mon, 29 Jul 2024 10:03:31 +0000 https://analyticsindiamag.com/?p=10103532 With over 30,000 AI and analytics professionals, the team is building its own LLMs. ]]>

Wipro, which began as a family-operated vegetable oil manufacturer in the small town of Amalner, India, in 1945, is now one of the largest IT companies globally, operating in more than 167 countries.


The company has been a key player in driving generative AI. With over 30,000 AI and analytics professionals, the team is building its own LLMs. 

AIM got in touch with Sriram Narasimhan, global head of data, analytics and insights at Wipro, to understand their data science applications, hiring for these roles, work culture and more. 

Inside Wipro’s Data Science Team

"Data serves as the bedrock for every AI and ML initiative, laying the groundwork for success. The pivotal factor lies in guaranteeing precise data of optimal quality—an indispensable catalyst for these processes that yield the desired results," Narasimhan told AIM.

Profound insights emerge from the ability to scrutinise, profile, and decipher patterns within the data landscape, identifying outliers to extract meaningful conclusions. At the heart of any data science endeavour is the adeptness to construct automations through AI and ML algorithms, elevating and refining the data and insight ecosystem.

This transformative process enhances operational efficiencies, underscoring the fundamental role of data science and engineering as the critical inaugural stride in the pursuit of quality outcomes in AI/ML implementations.

Wipro’s AI and analytics team is substantial, with over 30,000 practitioners globally. The company boasts 500+ AI/ML patents, 20 innovation centres, and over 15 partnerships with a strong presence in various industries. 

Recognised as a leader by agencies like Everest Group and IDC, Wipro specialises in industry-specific solutions and horizontal offerings like ML Ops and Legacy modernisation.

“The team co-builds solutions, leveraging tools like the Wipro Data Intelligence Suite (WDIS), prebuilt Industry Business Applications, and the Wipro Enterprise Generative AI (WeGA) Framework,” he added. These tools accelerate customer implementations, supporting the modernisation journey and enabling responsible AI with safety and security guardrails.

Riding the Generative AI Wave

Wipro has been actively involved in generative AI initiatives for over two years, collaborating with research institutes like the AI Institute at the University of South Carolina and IIT Patna. The company is committed to training its sizable workforce of 250,000 in generative AI. It has developed its own LLMs, enhancing versatility and future-proofing, and has established a unique partnership with Google Cloud to integrate its generative AI products and services.

The company’s generative AI applications cover diverse themes, including cognitive chatbots, content creation and optimisation for marketing, media, automation in code generation, and synthetic data generation. The company’s internal initiative, Wipro ai360, focuses on incorporating AI across all platforms. Notable client projects include assisting a chocolate manufacturer in enhancing product descriptions and collaborating with a European telecom company to extract value from data.

Wipro is invested in the generative AI landscape, with two-thirds of its strategic investments directed towards AI. The company plans to further support cutting-edge startups through Wipro Ventures and launch a GenAI Seed Accelerator program to train the top 10 generative AI startups.

Acknowledging the challenges associated with generative AI, the Bengaluru-based tech giant has implemented a control framework, emphasising responsible usage. Initiatives include dedicated environments for developing generative AI solutions, GDPR-compliant training, and efforts to detect AI-generated misinformation. They have also established an AI Council to set development and usage standards, emphasising ethical guidelines, fairness, and privacy.

The team is attuned to evolving regulatory frameworks and is adapting strategies accordingly. The company envisions widespread benefits to the IT industry, with generative AI influencing code generation and call centres. The team anticipates a wave of AI services emerging in the next five years, facilitating enterprises in harnessing AI's full potential. In the long term, they foresee AI disrupting every industry, with specific verticals like precision medicine, precision agriculture, hyper-personalised marketing, and AI-led capabilities in smart buildings and homes gaining prominence.

Interview Process

When hiring for data science roles, Wipro seeks candidates with practical experiences, strong programming and statistical skills, analytical abilities, domain knowledge, and effective presentation skills. 

“The hiring process involves a comprehensive evaluation based on real-world use cases, emphasising not only technical proficiency but also the candidate’s understanding of problem statements and the application of statistical methodologies to solve complex issues,” he added.

“Joining our data science team promises exposure to cutting-edge, real-life AI/ML problems across various industries as we encourage a democratic approach to AI, allowing teams the independence to build solutions while adhering to organisational processes,” Narasimhan commented.

The company offers a diverse range of competencies, including data engineering, data science, conversational AI, ethical AI, and generative AI, enabling associates to work on projects aligned with their capabilities and aspirations.

In interviews, Wipro emphasises the importance of showcasing real-life use cases rather than being overly theoretical. Candidates are encouraged to highlight their practical experiences, demonstrating how they understand, consider options, and provide solutions to problems in the realm of data science, AI, and ML.

Work Culture

Wipro fosters a work culture rooted in values and integrity for its global workforce of 250,000+. Guided by the ‘Spirit of Wipro’ and ‘Five Habits’ principles, it emphasises respect, responsiveness, communication, ownership, and trust. With a 36.4% gender diversity goal, the company supports inclusion through programs like Women of Wipro (WOW), addressing various aspects of diversity such as disability, LGBTQ+, race, ethnicity, and generational diversity.

For talent management, they use tech solutions like the MyWipro app and WiLearn. These tools facilitate goal documentation, feedback, skill-building, and awareness of biases. The company conducts biannual performance reviews, offers training, mentoring, and leadership programs, including global executive leadership initiatives.

Employee benefits encompass a comprehensive package, including 401k, pension, health, vision, dental insurance, competitive pay, bonuses, paid time off, health savings, flexible spending accounts, disability coverage, family medical leave, life insurance, and more. Additional perks involve retirement benefits, stock purchase plans, paid holidays, legal plans, insurance for home, auto, and pets, employee discounts, adoption reimbursement, tuition reimbursement, and well-being programs.

]]>
Data Science Hiring and Interview Process at Pegasystems https://analyticsindiamag.com/ai-hiring/data-science-hiring-process-at-pegasystems/ Mon, 29 Jul 2024 10:03:15 +0000 https://analyticsindiamag.com/?p=10103957 For data science roles, Pega focuses on the candidate's ability to learn and adapt rather than specific tech skills. ]]>

Pegasystems, commonly known as Pega, is a global software company founded in 1983, focusing on customer engagement and operational excellence solutions. The Cambridge-based company has become a leader in business process management and customer relationship management.


Its primary offering, Pega Infinity, acts as a comprehensive platform for businesses to create, implement, and improve applications, aiming to enhance customer experiences and streamline operational processes.

The company utilises AI and data science throughout its platform to improve decision-making, automate processes, and provide personalised customer interactions. Cisco, HSBC, and Siemens are a few of its primary customers. 

The latest iteration of the platform, Pega Infinity 23, introduces over 20 new features, including generative AI-powered boosters to enhance efficiency. The Connect Generative AI feature enables organisations to quickly utilise generative AI with a plug-and-play structure for low-code development.

AIM caught up with Deepak Visweswaraiah, vice president, platform engineering and site managing director, and Smriti Mathur, senior director and head of people, Pegasystems, India, to understand their generative AI play, hiring process and more.

Pega has open positions for solutions engineers and senior software quality test engineers in Hyderabad and Bengaluru.

Decoding Pega’s AI Ventures

In their core platform, Pega Infinity, the organisation relies heavily on data science, which plays a critical role in analytics, insights generation, natural language processing (NLP), generative AI, and various other applications that drive functionalities such as real-time decision-making and personalised customer communications based on attributes.

Data science also contributes significantly to the development of generative AI models, enhancing the overall intelligence of the platform. Its impact extends beyond the core platform to applications like customer service, one-to-one engagement, decision-making, sales automation, and strategic smart apps for diverse industries.

Pega GenAI provides insights into AI decision-making and streamlines processes, such as automating loan processing. “The benefits of generative AI extend to developers and end-users, improving productivity through query-based interactions, automatic summarisation, and streamlined case lifecycle generation,” Visweswaraiah told AIM

End-users also benefit from realistic training scenarios using simulated customer interactions.

Regarding proprietary foundational models, the organisation’s product architecture prioritises openness and flexibility. They support various language models, including those from OpenAI and Google. 

“In upcoming product versions, we are actively working to support and ship local language models to meet specific use case demands, focusing on accuracy, productivity, and performance in response to customer preferences for diverse capabilities,” he added. 

Interview Process

The company follows a global hybrid working model, encouraging collaboration in the office while providing flexibility, with about 60% of the workforce attending the office around three days a week. This approach aims to attract talent globally, fostering a vibrant culture and hybrid working environment.

In upskilling employees, technical competencies are crucial, and the company emphasises learning through its Pega Academy, offering online self-study training, live instructor-led courses, and online mentoring. Skill gaps are regularly assessed during performance reviews, providing learning opportunities through gateways and supporting external courses with an educational reimbursement policy.

“For data science roles, we focus on the candidate’s ability to learn rather than specific data science skills,” Mathur told AIM. The company looks for individuals capable of extracting insights from data, making informed decisions, and building models for application in various use cases.

Mathur further shared that the company emphasises the importance of understanding its problem-solving approach and creating deterministic models that consistently provide performant and real-world solutions. It encourages candidates to think from the customer’s perspective and avoid getting lost in vast amounts of data, highlighting the significance of models producing consistent and reliable answers.

Work Culture

The company emphasises diversity and inclusivity, fostering a culture centred on innovation and collaboration. It has been ranked as the best workplace for women by Avatar for five consecutive years. Pega values individuals who think independently, challenge norms and question the status quo to seek better solutions.

The company encourages leadership and curiosity in approaching tasks, promoting an environment where employees are empowered to innovate. Compared to competitors, Pega’s work culture stands out due to the unique problems it addresses and its distinctive approach.

Understanding the product architecture is crucial for employees, given the nature of the challenges they tackle. Pega’s ability to integrate technology into the platform is a significant differentiator, enhancing its capability to address complex issues. 

“With a focus on adapting to market changes, our mantra of being “built for change” reflects our commitment to staying dynamic and responsive to evolving needs,” concluded Mathur.  

So, if you want to join the dynamic community of Pega, check out the careers page here. 

]]>
Data Science Hiring and Interview Process at WNS https://analyticsindiamag.com/ai-hiring/data-science-hiring-process-at-wns/ Mon, 29 Jul 2024 10:03:07 +0000 https://analyticsindiamag.com/?p=10104280 Consisting of over 6,500 AI experts, WNS Triange serves as a partner for 200 global clients in more than 10 industries]]>

Headquartered in Mumbai, India, WNS is a prominent global Business Process Management (BPM) and IT consulting company with 67 delivery centers and over 59,000 employees worldwide. 


Combining extensive industry knowledge with technology, analytics, and process expertise, the company collaborates with clients across 10 industries to co-create digital-led transformational solutions. WNS is renowned for its strategic partnerships, delivering innovative practices and industry-specific technology and analytics-enabled solutions. The company’s services cover diverse sectors, characterised by a structured yet flexible approach, deep industry expertise, and a client-centric partnership model.

WNS Triange, the AI, analytics, data and research business unit, has successfully harnessed the power of data science to develop robust solutions that effectively address a myriad of business challenges faced by its clients. 

Among these solutions are sophisticated applications such as an advanced claims processing system, a finely tuned inventory optimisation mechanism, and the implementation of a retail hyper-personalisation strategy.

Consisting of over 6,500 experts, WNS Triange serves as a partner for 200 global clients in more than 10 industries. 

“The team is organised into three pillars: Triange Consult focuses on consulting and co-creating strategies for data, analytics, and AI; Triange NxT adopts an AI-led platform approach for scalable business value; and Triange CoE executes industry-specific analytics programs, transforming the value chain through domain expertise and strategic engagement models,”  Akhilesh Ayer, EVP & Global Business Unit Head – WNS Triange, told AIM in an exclusive interaction last week. 

WNS’s AI & Analytics Play

The data science workflow at WNS Triange follows a meticulously structured process that guides the team through various stages, including problem outlining, data collection, Exploratory Data Analysis (EDA), cleaning, pre-processing, feature engineering, model selection, training, evaluation, deployment, and continuous improvement. A pivotal element of this methodology is the proprietary AI-led platform, Triange NxT, equipped with Gen AI capabilities. This platform serves as a hub for domain and industry-specific models, expediting the delivery of impactful insights for clients.
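
As a generic illustration of the model-building stages described above (pre-processing, feature engineering, model selection, training and evaluation), here is a minimal scikit-learn sketch on synthetic data. It is not WNS Triange code and does not represent the Triange NxT platform.

```python
# Generic sketch of a data science workflow: pre-processing, model selection,
# training and evaluation wrapped in a single scikit-learn pipeline.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score

# Synthetic stand-in for a cleaned, feature-engineered dataset.
X, y = make_classification(n_samples=5000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)

model = Pipeline([
    ("scale", StandardScaler()),          # pre-processing
    ("clf", GradientBoostingClassifier()) # model selection
])

model.fit(X_train, y_train)               # training
proba = model.predict_proba(X_test)[:, 1] # scoring
print("AUC:", round(roc_auc_score(y_test, proba), 3))  # evaluation
```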

“When it comes to claims processing, we deploy predictive analytics to conduct a thorough examination of data sourced from the First Notice of Loss (FNOL) and handler notes,” said Ayer. This approach allows for the evaluation of total loss probability, early settlement possibilities, and subrogation/recovery potential. 

Simultaneously, its Marketing Mix Modeling (MMM) is employed to optimise resource allocation by quantifying the impact of marketing efforts on key performance indicators. Furthermore, the application of advanced analytics techniques aids in the detection of suspicious patterns in insurance claims for risk and fraud detection. 
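
To make the MMM idea concrete, the toy regression below estimates each marketing channel's contribution to a KPI. The data and channel names are synthetic, and a real marketing mix model would add adstock and saturation transforms, seasonality and other controls; this is only a sketch of the principle.

```python
# Toy Marketing Mix Modeling sketch: regress weekly sales on channel spend
# to estimate per-channel contributions. Data below is synthetic.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
weeks = 104
df = pd.DataFrame({
    "tv_spend": rng.uniform(0, 100, weeks),
    "digital_spend": rng.uniform(0, 80, weeks),
    "print_spend": rng.uniform(0, 30, weeks),
})
df["sales"] = (
    500 + 3.0 * df["tv_spend"] + 4.5 * df["digital_spend"]
    + 1.2 * df["print_spend"] + rng.normal(0, 25, weeks)
)

X = sm.add_constant(df[["tv_spend", "digital_spend", "print_spend"]])
model = sm.OLS(df["sales"], X).fit()
print(model.params)  # estimated contribution per unit of spend for each channel
```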

Ayer shared that the team also actively leverages generative AI across diverse sectors. In the insurance domain, it is employed to streamline claims subrogation by efficiently processing unstructured data, minimising bias, and expediting insights for recovery. 

Similarly, in healthcare, it empowers Medical Science Liaisons (MSLs) by summarising documents and integrating engagement data for more impactful sales pitches. Generative AI’s versatility is further demonstrated in customer service interactions, where it adeptly handles natural language queries, ensuring quicker responses and retrieval efficiency.

The combination of LLM foundation models from hyperscalers like AWS with WNS Triange’s proprietary ML models enables the delivery of tailored solutions that cater to various functional domains and industries. Where necessary, WNS Triange employs its AI, ML and domain capability to fine-tune existing foundation models for specific results, ensuring a nuanced and effective approach to problem-solving.

Tech Stack

In its AI model development, the team utilises vector databases and deep learning libraries such as Keras, PyTorch, and TensorFlow. Knowledge graphs are integrated, and MLOps and XAI frameworks are implemented for enterprise-grade solutions. 

“Our tech stack includes Python, R, Spark, Azure, machine learning libraries, AWS, GCP, and GIT, reflecting our commitment to using diverse tools and platforms based on solution requirements and client preferences,” said Ayer. 

Even when it comes to using transformer technology, particularly language models like Google’s BERT for tasks such as sentiment analytics and entity extraction, its current approach involves a variety of language models, including GPT variants (davinci-003, davinci-codex, text-embedding-ada-002), T5, BART, LLaMA, and Stable Diffusion. 

“We adopt a hybrid model approach, integrating Large Language Models (LLMs) from major hyperscalers like OpenAI, Titan, PaLM2, and LLaMA2, enhancing both operational efficiency and functionality,” he commented. 
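
As a rough illustration of the transformer-based sentiment analytics and entity extraction mentioned above, the sketch below uses the Hugging Face pipeline API with its default public checkpoints. These are stand-ins for the fine-tuned, domain-specific models a production setup like WNS Triange's would rely on; the example text is invented.

```python
# Minimal sketch of sentiment analysis and named-entity extraction with
# Hugging Face pipelines. Default public models are used for illustration.
from transformers import pipeline

sentiment = pipeline("sentiment-analysis")
ner = pipeline("ner", aggregation_strategy="simple")

text = "The claims portal was quick, and the payout reached Mumbai in two days."

print(sentiment(text))  # e.g. [{'label': 'POSITIVE', 'score': ...}]
print(ner(text))        # entity spans such as locations and organisations
```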

Hiring Process

WNS Triange recruits data science talent from leading engineering colleges, initiating the process with a written test evaluating applied mathematics, statistics, logical reasoning, and programming skills. Subsequent stages include a coding assessment, a data science case study, and final interviews with key stakeholders.

“Joining our data science team offers candidates a dynamic and challenging environment with ample opportunities for skill development. And while engaging in diverse projects across various industries, individuals can expect exposure to both structured and unstructured data,” said Ayer. 

The company fosters a collaborative atmosphere, allowing professionals to work alongside colleagues with diverse backgrounds and expertise. Emphasis is placed on leveraging cutting-edge technologies and providing hands-on experience with state-of-the-art tools and frameworks in data science. 

WNS Triange values participation in impactful projects contributing to the company’s success, offering access to mentorship programs and support from experienced team members, ensuring a positive and productive work experience.

Mistakes to Avoid

Candidates are encouraged to not only showcase technical prowess but also articulate the business impact of their work, demonstrating its real-world relevance and contribution to business goals.

Ayer emphasised, "Successful data scientists must not only be technically adept but also skilled storytellers who can present their findings in a compelling manner, as overlooking this aspect can lead to less engaging presentations of their work."

He added that candidates sometimes focus solely on technical details without articulating the business impact of their work, missing the opportunity to demonstrate how their analyses and models solve real-world problems and contribute to business goals.

Work Culture

Recognised by TIME magazine as one of the best companies to work for, WNS has built a work culture centred on co-creation, innovation, and a people-centric approach. It emphasises diversity, equity, and inclusivity, prioritises a respectful workplace culture, and extends its commitment to community care through targeted programs by the WNS Cares Foundation. 

“Our focus on ethics, integrity, and compliance ensures a safe ecosystem for all stakeholders, delivering value to clients through comprehensive business transformation,” said Ayer. 

In terms of employee perks, it offers various services and benefits, including transportation, cafeterias, medical and recreational facilities, flexibility in work hours, health insurance, and parental leave. 

“Differentiating ourselves in the data science space, we cultivate a work ecosystem that fosters innovation, continuous learning, and belongingness for the data science team. Our initiatives include engagement tools, industry-specific training programs, customised technology-driven solutions, and a learning experience platform hosting a wealth of content for self-paced learning,” he added. 

Why Should You Join WNS?

“At WNS, we believe in the transformative power of data, where individuals play a key role in shaping our organisation by directly influencing business strategy and decision-making. Recognising the significant impact of data science, we invite individuals to join our collaborative and diverse team that encourages creativity and values innovative ideas. In this dynamic environment, we prioritise knowledge sharing, continuous learning, and professional growth,” concluded Ayer. 

Find more information about job opportunities at WNS here.

]]>
Data Science Hiring and Interview Process at Marlabs https://analyticsindiamag.com/ai-hiring/data-science-hiring-process-at-marlabs/ Mon, 29 Jul 2024 10:03:00 +0000 https://analyticsindiamag.com/?p=10104968 Marlabs is currently hiring for 10 data science roles, including ML Architect, ML Engineer, and Statistical Modeling positions.]]>

Founded in 2011, New York-based IT services and consulting firm Marlabs helps companies of various sizes to undergo AI-powered digital transformation. It provides a wide range of services, including strategic planning, creating rapid prototypes in specialised labs, and applying agile engineering techniques to develop and expand digital solutions, cloud-based applications and AI-driven platforms.


Marlabs’s data science team addresses a range of industry challenges, emphasising tasks like extracting insights from extensive datasets and employing pattern recognition, prediction, forecasting, recommendation, optimisation, and classification.

Exploring Generative AI at Marlabs

“In operationalising AI/ML, we have tackled diverse projects, such as demand forecasting, inventory optimisation, point of sale data linkage, admissions candidate evaluation, real-time anomaly detection, and clinical trial report anomaly detection,” Sriraman Raghunathan, digital innovation and strategy principal, Marlabs, told AIM in an exclusive interaction. 

The team is also exploring generative AI applications, particularly in knowledge base extraction and summarisation across domains like IT service desk ticket resolution, sustainability finance, medical devices service management, and rare disease education.

However, it is not developing foundational models as of now due to substantial capital requirements. “Instead, we are focussing on the value chain beyond foundational models, offering tools and practices for deploying such models within organisation boundaries, tailored for specific domains,” he added. 

Marlabs employs a variety of tools and frameworks depending on project specifics, utilising R and Python for development, Tableau, Power BI, QlikView for data exploration and visualisation, and PyTorch, TensorFlow, Cloud-Native tools/platforms, and Jupyter Notebooks for AI/ML model development.

The team leverages Transformer models like GPT-3, especially in NLP use cases, implementing them in TensorFlow, and PyTorch, and utilising pre-trained models from Hugging Face Transformers Library. For generative AI, their toolkit includes LangChain, Llama Index; OpenAI, Cohere, PaLM2, Dolly; Chroma, and Atlas.
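
To illustrate the retrieval step behind the knowledge base extraction and summarisation use cases mentioned earlier, here is a small sketch using the Chroma vector store named in the toolkit. The documents, collection name and query are made-up service-desk snippets, not Marlabs data, and a full pipeline would pass the retrieved hit to an LLM for summarisation.

```python
# Tiny retrieval sketch with Chroma: index a couple of knowledge-base
# articles and fetch the most relevant one for an incoming ticket.
import chromadb

client = chromadb.Client()  # in-memory instance for illustration
collection = client.create_collection(name="service_desk_kb")

collection.add(
    documents=[
        "To reset a VPN password, raise a ticket under category ACCESS-VPN.",
        "Laptop battery replacements are handled by the hardware desk within 3 days.",
    ],
    ids=["kb-1", "kb-2"],
)

# Retrieve the best-matching article; an LLM would then summarise it.
hits = collection.query(query_texts=["user cannot log in to VPN"], n_results=1)
print(hits["documents"][0])
```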

Hiring Process

The hiring process for data science roles at the organisation emphasises a blend of technical knowledge, practical application, and relevant experience. The initial steps involve a clear definition of the role and its requirements, followed by the creation of a detailed job description. 

The interview process comprises technical assessments, video interviews with AI/ML experts, and HR interviews. Technical assessments evaluate coding skills, data analysis, and problem-solving abilities. 

Video interviews focus on the candidate’s depth of knowledge, practical application, and communication skills, often including a discussion of a relevant case study or project. HR interviews center around cultural fit, interpersonal skills, collaboration, and the candidate’s approach to handling challenges. 

Expectations

“Upon joining the data science team, candidates can anticipate a thorough onboarding process tailored to their specific team, providing access to essential tools, resources, and training for a smooth transition,” commented Raghunathan. 

The company’s AI/ML projects involve cutting-edge technologies, exposing candidates to dynamic customer use cases spanning natural language processing, computer vision, recommendation systems, and predictive analytics. The work environment is agile and fast-paced. The company places a strong emphasis on team collaboration and effective communication, given the collaborative nature of data science and AI/ML projects. 

In this rapidly evolving field, the company expects new hires to demonstrate continuous learning, tackle complex technical and functional challenges, operate with high levels of abstraction, and exhibit creative and innovative thinking.

Mistakes to Avoid

“The most prevalent error observed in candidates during data science role interviews is a lack of clear communication,” he added.

The ability to effectively communicate insights to non-technical stakeholders is crucial in the AI/ML space, and this skill is frequently overlooked. 

Another common mistake is a failure to comprehend and articulate the business context and domain knowledge of the problem, which is essential in AI/ML applications with significant business impact.

Work Culture

“We are recognised for our value-based culture focused on outcomes, emphasising a flat organizational structure to spur innovation and personal growth. Key values such as respect, transparency, trust, and a commitment to continuous learning are central to their ethos, all aimed at exceeding customer expectations,” he said.

The company’s robust learning and development program has prepared over 150 young managers for leadership roles, with a strong emphasis on AI and technology for organisational insights and sentiment analysis.

The company offers a comprehensive benefits package, including versatile insurance plans, performance incentives, and access to extensive learning resources like Courseware and Udemy, supporting a hybrid work model. Additionally, they provide mental health support and reward long-term employees based on tenure. 

Raghunathan further explained that in the data science team, Marlabs stands out for its innovative and collaborative environment, encouraging creativity and continuous learning. “This distinctive culture and investment in employee growth make us a leader in data science, differentiating it from competitors in the tech industry,” he added. 

Why Should You Join Marlabs?

“Join Marlabs for a dynamic opportunity to work with a passionate team, using data to drive meaningful change. In this collaborative setting, data scientists work with brilliant colleagues across various industries, including healthcare, finance, and retail. You’ll tackle complex issues, contributing to significant business transformations. Marlabs supports your career with essential tools, resources, training, competitive compensation, benefits, and opportunities for professional growth and development,” concluded Raghunathan.  

]]>
Data Science Hiring and Interview Process at Global Fintech Company Fiserv https://analyticsindiamag.com/ai-hiring/data-science-hiring-process-at-fiserv/ Mon, 29 Jul 2024 10:00:43 +0000 https://analyticsindiamag.com/?p=10095785 Fiserv prioritises up-skilling employees to help them excel in their roles and adapt to new technologies and client needs]]>

Headquartered in Brookfield, Wisconsin, Fiserv is a global leader in payments and financial technology, with operations across 100 countries. Aspiring to move money and information in a way that moves the world, Fiserv helps its clients achieve best-in-class results through a commitment to innovation and excellence. Boasting a global workforce exceeding 41,000 professionals, Fiserv’s data science team plays a pivotal role in supporting various business domains, driving innovation, and cultivating engineering excellence to enhance client experiences.


Founded 39 years ago through the merger of First Data Processing and Sunshine State Systems, Fiserv has experienced rapid growth by strategically acquiring prominent entities such as CheckFree Corporation, M-Com, CashEdge, and PCLender. In 2019, Fiserv underwent a transformative merger with First Data, creating a globally recognised leader in payment solutions and financial technology with the capabilities to deliver exceptional value to financial institutions, corporate clients, small enterprises, and consumers alike.

Analytics India Magazine got in touch with Manisha Banthia, vice president, data and analytics – global services, Fiserv, to understand the importance of AI and analytics for the company and how it hires the finest tech talent. 

Inside the Data Science Team of Fiserv

Being a major player in the fintech industry, the 75-member strong data science team of Fiserv tackles various complex challenges in the fields of banking, cards, payments, digital solutions, and merchant acquisition. Besides creating embedded analytics solutions for financial institutions (FIs) to aid their decision-making processes, they offer advisory services throughout the lifecycle of FIs and merchants, covering areas such as acquisition, growth, cross-selling, and retention. The team also focuses on building solutions to optimise internal processes and operations across different businesses.

For a major US retailer, they leveraged ML to identify prepaid cardholders who would benefit from targeted marketing strategies, resulting in increased engagement and reduced attrition. In another initiative, Fiserv aimed to expand its merchant user base for cash advance loans and achieved this by developing a risk model and an AI algorithm that enabled the sales team to target the right merchants, leading to portfolio growth, reduced marketing expenses, and cost optimisation.

Furthermore, the data science team developed an advanced ML-based solution to address fraud detection and prevention for financial institutions, replacing rule-based engines. “Our data science team follows a pod structure consisting of data scientists, domain experts, ML engineers, visualisation experts, and data engineers who constantly add value to our organisation,” said Banthia. 

Data scientists apply advanced techniques and provide recommendations. Domain experts offer business context, translate problems, and validate results. ML engineers deploy ML models for performance and reliability. Visualisation experts represent data insights visually. Last but not least, data engineers collect, process, and maintain data quality.

The team actively works with Python, PySpark, Azure, Watson, Snowflake, Adobe Analytics, and Alteryx. 
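
For readers curious about what ML-based fraud flagging of the kind described above can look like in code, here is a minimal, generic sketch using scikit-learn's IsolationForest on synthetic transactions. It is an illustration only and does not reflect Fiserv's production approach or data.

```python
# Generic anomaly-flagging sketch: fit an IsolationForest on synthetic
# transactions and surface the ones that look unusual for manual review.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(7)

# Synthetic transactions: [amount, seconds since previous transaction].
normal = rng.normal(loc=[60, 3600], scale=[20, 600], size=(1000, 2))
suspicious = np.array([[2500, 30], [1800, 45]])  # large, rapid-fire payments
transactions = np.vstack([normal, suspicious])

model = IsolationForest(contamination=0.01, random_state=7).fit(transactions)
flags = model.predict(transactions)   # -1 marks likely anomalies
print(np.where(flags == -1)[0])       # indices to route for review
```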

Interview Process

The interview process consists of a thorough examination by both technical and managerial authorities. Candidates with strong programming skills, statistical knowledge, and problem-solving capabilities, evaluated through case studies and in-depth domain knowledge assessment, are ideal. Following it is an HR assessment to check interpersonal skills and the culture fit.

“A successful data scientist should prioritise a client-centric approach, seeking feedback, adapting to specific needs, and aligning analytical solutions with objectives,” said Banthia. 

Technical skills like solving unstructured problems, exploring AI and ML techniques, conceptualising solutions, and simplifying findings for stakeholders are valued. Fiserv also looks for strong leadership, business acumen, and functional expertise in executive hires. When interviewing, prospective candidates should showcase a balanced combination of technical, business, and leadership skills. They should effectively communicate their proficiency without excessive technical jargon and demonstrate the ability to lead teams and collaborate effectively.

Work Culture

Certified by Great Place To Work® in 2023, Fiserv aims to foster a fast-paced and dynamic work environment. Adaptability and the ability to iterate quickly and respond to market needs are highly valued. The company prioritises up-skilling employees to help them excel in their roles and adapt to new technologies and client needs.

Besides providing an inclusive culture and professional growth opportunities, the fintech giant offers learning programs, wellness plans, and engagement initiatives. “We are committed to being an equal opportunity employer with an inclusive workplace culture and clear communication through an open office concept,” she concluded. 

Check out their careers page now. 

Read more: Data Science Hiring Process at MediBuddy

]]>
Data Science Hiring and Interview Process at Lendingkart https://analyticsindiamag.com/ai-hiring/data-science-hiring-process-at-lendingkart/ Mon, 29 Jul 2024 09:57:24 +0000 https://analyticsindiamag.com/?p=10106527 Founded in 2014 by Harshvardhan Lunia, Indian digital assembly fintech lender Lendingkart utilises a data-powered credit analysis system to facilitate online loans, aiming to improve accessibility in small business lending. The company’s proprietary underwriting mechanism utilises big data and analytics to evaluate the creditworthiness of borrowers. The company has so far disbursed over $1 billion […]]]>

Founded in 2014 by Harshvardhan Lunia, Indian digital assembly fintech lender Lendingkart utilises a data-powered credit analysis system to facilitate online loans, aiming to improve accessibility in small business lending. The company’s proprietary underwriting mechanism utilises big data and analytics to evaluate the creditworthiness of borrowers.


The company has so far disbursed over $1 billion in loans across more than 1,300 cities in the country, especially in Tier 2 and Tier 3 cities. Having recently reported its first-ever profit of Rs 118 crore on total revenues of Rs 850 crore in FY23, the company specialises in providing unsecured business loans to micro, small, and medium-sized enterprises (MSMEs). 

The fintech company is backed by Bertelsmann India Investments, Darrin Capital Management, Mayfield India, Saama Capital, India Quotient and more. 

“Data science has always been at the heart and center of our operations. The AI/ML-based underwriting that this team has developed has been used to underwrite over one million MSMEs,” said Dhanesh Padmanabhan, chief data scientist, Lendingkart, in an exclusive interaction with AIM.

The 35-member data science team of the Ahmedabad-headquartered firm is organised into three main groups: analytics, underwriting modelling, and ML engineering. The analytics team, with approximately 15 members, is further divided into three sub-teams focusing on revenue, portfolio (credit and risk), and collections.

“One of the key challenges addressed by our team at Lendingkart is credit risk management where we employ a combination of analytics and AI/ML models at different stages of the underwriting and collections processes to assess eligibility, determine loan amounts and interest rates, and ensure timely customer payments or settlements,” he added.

This underwriting modelling team consists of about five members dedicated to developing underwriting models, while the 10-member ML engineering team focuses on MLOps, feature store development, and AI applications.

Additionally, there are individual contributors like an architect and a technical program manager, along with a two-member team specializing in setting up the underwriting stack for the newly established personal loan portfolio.

The company has open positions for senior data scientist and associate director in Bengaluru.

Inside Lendingkart’s AI & Analytics Team

The team leverages AI and ML across various functions, for example, in outbound marketing to target existing customers and historical leads through pre-approved programs. Additionally, a lead prioritization framework helps loan specialists focus on leads for calling and digital engagement.

The company also employs an intelligent routing system to direct loan applications to credit analysts, and a terms gamification framework aids negotiation analysts in negotiating interest rates with borrowers. Its fraud identification framework flags potentially manipulated bank statements for further review, and a speech analytics solution is deployed to extract insights from recorded calls for monitoring operational quality.

On the other hand, collections models prioritize collections based on a customer’s likelihood of entering different delinquency levels, and computer vision models are used for KYC verification.

“We are also exploring the use of generative AI for marketing communication, chatbots, and data-to-insights applications,” said Padmanabhan. Moreover, there are plans to build transformer-based foundational models using call records and structured data sources like credit histories and bank statements for speech analytics, customer profiling, and underwriting purposes.

The tech stack comprises SQL running on Trino, Airflow, and Python. For ML tasks, they leverage scikit-learn, statsmodels, and SciPy, along with PyTorch and TensorFlow. Natural language processing and computer vision applications involve the use of transformers and CNNs.

The API stack is powered by FastAPI services deployed on Kubernetes (k8s). In ML engineering, the team prefers Kafka and MongoDB. Additionally, there are applications built on Flask and Django, and they are currently developing interactive visualisations using the MERN stack.
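
To show what serving a model through an API stack like this can look like, here is a minimal FastAPI sketch that exposes a scoring endpoint. The feature names, endpoint path and toy model are hypothetical; a real service would load a persisted underwriting or collections model rather than fit one at startup.

```python
# Minimal sketch of exposing a trained model behind a FastAPI endpoint.
# The toy logistic regression stands in for a real delinquency model.
from fastapi import FastAPI
from pydantic import BaseModel
from sklearn.linear_model import LogisticRegression
import numpy as np

# Toy training data: [credit utilisation, missed payments] -> delinquent flag.
X = np.array([[0.2, 1], [0.9, 5], [0.4, 2], [0.8, 6]])
y = np.array([0, 1, 0, 1])
model = LogisticRegression().fit(X, y)

app = FastAPI()

class Borrower(BaseModel):
    utilisation: float
    missed_payments: int

@app.post("/score")
def score(borrower: Borrower) -> dict:
    proba = model.predict_proba(
        [[borrower.utilisation, borrower.missed_payments]]
    )[0, 1]
    return {"delinquency_probability": round(float(proba), 3)}

# Run locally with, e.g.: uvicorn scoring_service:app --reload
# (assuming this file is saved as scoring_service.py)
```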

Interview Process

Lendingkart’s data science hiring process includes four to five interview rounds, evaluating candidates with strong backgrounds in analytics, modelling, or ML engineering. In leadership roles such as team leads and managers, the company places emphasis not only on technical proficiency but also on crucial skills in team and stakeholder management.

During the interview process, non-managerial candidates undergo initial technical assessments in SQL, Python, or ML. Subsequent rounds explore general problem-solving and soft skills, with assessments conducted by peers, managers, and HR.

Expectations

Upon joining the team, candidates can expect to participate in a diverse range of projects encompassing revenue, risk, collections, and the development of tech and AI stacks for these applications. Collaboration with various stakeholders remains a significant aspect of the role. For example, the development of a new underwriting algorithm involves comprehensive reviews with risk and revenue teams to align with business objectives, followed by collaboration with product and ML engineering teams for successful implementation.

However, Padmanabhan notes that there is a common mistake which candidates make – they overlook the importance of thoroughly understanding the business context of the given problems.

“While they may possess knowledge of various algorithms used in different domains, they may struggle to articulate solutions or approaches when those algorithms are applied within a financial process context,” he added, highlighting the importance of connecting technical expertise with a deep understanding of the specific business challenges at hand.

Work Culture

“Our work culture is fast-paced and dynamic, characterised by group problem-solving focused on specific business goals with competitive ESOP packages and industry-standard insurance,” said Padmanabhan.

The data science team operates hands-on at all levels, adopting best practices like agile and MLOps. The “hub and spoke” approach involves data scientists taking responsibility for the entire process from conceptualization to implementation, distinguishing the work culture from competitors in the space.

At Lendingkart, you’ll collaborate closely with stakeholders on projects like developing underwriting algorithms. The company maintains a well-established agile practice led by the technical program manager and team leads, focusing on efficient planning, best practices, and clear communication to create a productive work environment. So if you think you are fit for this role, apply here. 

]]>
Data Science Hiring and Interview Process at Verizon https://analyticsindiamag.com/ai-hiring/data-science-hiring-process-at-verizon/ Mon, 29 Jul 2024 09:54:17 +0000 https://analyticsindiamag.com/?p=10112646 The company plans to expand its footprint in India by hiring around 70 professionals this year for data science, data engineering, ML engineering, and cloud roles across Chennai, Hyderabad, and Bengaluru.]]>

New Jersey-based telecommunication operator Verizon is at the forefront of leveraging AI and analytics to transform its network, services, and customer interactions.


“When it comes to AI, especially generative AI, I think at the edge of the network it will be very important to have AI to make quick decisions very close to the end user,” Hans Vestberg, chief executive officer at Verizon, said on a panel discussion at WEF 2024, highlighting the significance of AI in powering future business growth.

“We focus on the core areas, which include customer experience and loyalty driven by personalisation when it comes to our customers. We also use AI to drive operational efficiency in areas like workforce demand prediction and audience targeting in marketing,” said Ebenezer Bardhan, senior director of data science at Verizon India, in an exclusive interaction with AIM.

Currently, the company has over 400 models deployed in production across various lines of business within sales, service and support.

Verizon plans to expand its footprint in India by hiring around 70 professionals this year for data science, data engineering, ML engineering, and cloud roles across Chennai, Hyderabad, and Bengaluru. 

Inside Verizon’s Data Science Lab

The AI and analytics team at Verizon consists of two divisions: Data and Analytics (D&A) focusing on enterprise analytics, and AI & Data (AI&D) comprising data engineers, AI engineers, and data scientists, with a team size of around 350 in India.

Some of the recent use cases of the company’s AI initiatives include personalisation and churn reduction to improve customer retention to add business value. From an AI services standpoint, it has a custom explainability and interpretability framework that data scientists in the team have adopted to diagnose models.  
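
Verizon's explainability framework is proprietary, so the snippet below only illustrates the general idea of diagnosing a model with the open-source SHAP library on a toy gradient-boosted classifier and synthetic data; it is not Verizon's implementation.

```python
# Illustrative model-diagnosis sketch with SHAP on a toy classifier.
import numpy as np
import shap
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier

X, y = make_classification(n_samples=1000, n_features=8, random_state=0)
model = GradientBoostingClassifier().fit(X, y)

explainer = shap.TreeExplainer(model)
shap_values = explainer.shap_values(X[:100])

# Mean absolute SHAP value per feature gives a quick global importance view.
print(np.abs(shap_values).mean(axis=0))
```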

"We are also exploring use cases in generative AI to improve information access for frontline users," Ganesh Siddhamalli, director of AI and ML engineering, told AIM.

Internally, the company is developing a charter as part of the Responsible AI initiative to establish appropriate guardrails facilitated by LLMOps services. Additionally, it has a Generative AI Center of Excellence (CoE) to provide a unified perspective on all potential use cases, allowing for evaluation and shared learning among teams throughout the journey.

"We primarily use frameworks like TensorFlow, PyTorch, scikit-learn, Domino Data Lab, the Vertex suite, Seldon, Ray, Great Expectations, and Pydantic across AWS/GCP and on-prem for AI and analytics," Bharathi Ravindran, director, data science, Verizon India, told AIM. 

Transformer-based models have been employed since 2018, particularly for real-time intervention in call and chat transcripts, aligning with generative AI use cases.

Interview Process

The team’s hiring process is well-defined and thorough, beginning with an internal posting for eight days to encourage applications from within the company. Subsequently, the job openings are publicised on various external platforms, including job boards, social media, and other professional networks, to attract a diverse and qualified pool of candidates.

“The meticulous and structured hiring process aims to thoroughly assess candidates’ decision-making and leadership capabilities, ensuring the identification of the right diverse talent to contribute to our success,” Samir Singh, director of talent acquisition, Verizon India, told AIM.

He further explained that for data science roles, the interview process consists of three comprehensive rounds. The initial stages focus on technical competencies, such as programming proficiency and experience in essential skills like Python, personalisation, forecasting, GenAI, Churn models, responsible AI, and networking. 

A techno-managerial round follows this, assessing the candidate’s ability to blend technical expertise with managerial skills, and finally, an interview with the leadership team.

On the other hand, data engineering roles involve two rounds that evaluate technical competencies in areas like Big Data, Hadoop, Teradata, and Google Cloud Platform (GCP), followed by a managerial round. 

Expectations

When joining the data science team at Verizon, candidates can expect a vibrant work culture, ample learning opportunities, and exposure to cutting-edge technologies. The company places a strong emphasis on integrity and expects candidates to embody this core value throughout their tenure.

As for expectations from candidates, Verizon values not only technical expertise but also the ability to connect the dots between technological advancements and business objectives. Candidates should confidently articulate their past or current roles during interviews. 

“In an AI-centric organisation like ours, where the daily focus is on enabling business through innovative solutions, the capacity to seamlessly integrate technical excellence with practical application is of great importance for success in data science roles,” said Singh. 

Work Culture

The company’s work culture centres around innovation, collaboration, and customer-centricity. Employees are encouraged to think creatively, take risks, and embrace change. Diversity and inclusion are integral values, creating a supportive and inclusive work environment. 

The company emphasises integrity, respect, performance, and accountability, offering benefits beyond the basics, like flexible working hours, wellness programs, and support for childcare and eldercare.

The key differentiator in Verizon’s work culture, especially within the data science team, is a strong emphasis on R&D, providing freedom for experimentation and learning, as shared by Singh. Apart from experimentation, an open culture fosters collaboration, making Verizon stand out as a great workplace.

Verizon prides itself on a diverse team, promoting gender diversity, supporting differently-abled individuals, and initiatives like WINGS (Women returnee program) and LGBTQ inclusion. The company has consciously made Diversity, Equity, and Inclusion (DEI) a core part of its culture.

“We foster a culture of inclusion, collaboration, and diversity both within the company and among our customers, suppliers, and business and community partners,” concluded Singh. 

If you think that you are a good fit for the company, check out its careers page now. 

]]>
Data Science Hiring and Interview Process at Nucleus Software https://analyticsindiamag.com/ai-hiring/data-science-hiring-process-at-nucleus-software/ Mon, 29 Jul 2024 09:53:10 +0000 https://analyticsindiamag.com/?p=10114019 The company has positions open for ML engineer, lead data scientist, lead data engineer, data architect, and data and analytics manager.]]>

Founded in 1986, Nucleus Software is an Indian IP product company specialising in financial technology. The company focuses on developing and selling its own software solutions. Tailored for the banking industry, its key products, FinnOne Neo and FinnAxia, cater to lending and transaction banking needs.


These solutions aid banks in optimising processes, efficiently managing loans, and delivering innovative financial services. With a presence in 50 countries and a client base exceeding 200, some of its notable customers include State Bank of India, Citi Bank, ICICI Bank, Tata Capital, and Mahindra Finance.

The team has successfully applied AI and analytics in various business scenarios, including fraud detection in transaction banking, business optimisation through insights generation, natural language processing, real-time decisioning systems, AI-powered chatbots, auto summarisation, and hyper-personalised customer communications. 

AIM got in touch with Abhishek Pallav, associate vice president, Nucleus Software, to understand the AI activities of the company and the kind of people it looks for.

Inside Nucleus Software’s AI & Analytics Lab

In terms of implementing AI and ML solutions, Pallav said that Nucleus has developed a foundational framework for its AI components, ensuring reliability and scalability in the dynamic financial services sector. 

The application of AI and ML is evident in the real-time fraud detection engine within the transaction banking platform, natural language processing, customer satisfaction improvement, revenue increase, cost reduction, and operational optimisation for financial institutions. 

The company also leverages an in-house chatbot for L1 and L2 production support.

Regarding generative AI, Nucleus Software strategically deploys tailored solutions to enhance service operations and improve customer experience. This approach enables data access democratisation and the extraction of value from unstructured data. 

“We are in the process of building systems around generative AI and have developed use cases around customer behaviour and transaction history,” Pallav told AIM.

Interview Process 

"When it comes to hiring for data science roles, we look for candidates who have strong technical and domain acumen," said Pallav, highlighting that the company seeks candidates with strong problem-solving skills for real-world data science scenarios. Proficiency in machine learning and deep learning, particularly in areas like finance, retail banking, fraud prevention, and hyper-personalisation, is highly valued. 

Preference is given to candidates from premier engineering colleges with a background in mathematics, computer science, and AI certifications.

Pallav elaborated that the hiring process for data science candidates begins with a written test evaluating IQ and logical reasoning, followed by two technical discussion rounds. The process concludes with an HR round. 

The written test, conducted online through the company’s platform, includes multiple-choice questions related to the domain and coding problems. Candidates are given a defined time window to complete the test.

Tech Skills Needed

Nucleus Software is currently hiring for five data science roles in Noida. 

The positions include ML engineer, lead data scientist, lead data engineer, data architect, and data and analytics manager. Candidates with experience in the retail banking domain are preferred. The minimum qualifications needed include a BTech in computer science from a premier institute. 

In terms of tech tools, applications, and frameworks, Nucleus Software employs a variety of them for natural language processing, computer vision, directed acyclic graph (DAG), hyper-personalisation, and explainability in its product R&D. 

The desired skills for candidates include proficiency in tools and frameworks such as Python, PySpark, Spark, Kafka, Hive, and SQL. Knowledge of BI tools is desirable. Candidates should have expertise in various data science techniques, including analytics using AI/ML, deep learning, statistical modelling, time series analysis, and test & learn. The company values candidates who align with its core values of innovation, result orientation, collaboration with integrity, and mutual respect.

Nucleus Software’s products are built on cutting-edge technologies, are platform-agnostic, web-based, cloud-native, and powered by AI capabilities. 

Expectations

“Upon joining, an associate goes through a specialised training program offered by Nucleus School of Banking Technology, where they become domain experts,” said Pallav, emphasising that after completing the program, associates join the Build R&D team. 

However, Pallav explained that he has often observed candidates making the common mistake of adopting an overly generic approach. The company emphasises the need for a holistic approach to value delivery, coupled with strong analytical and logical skills.

Work Culture

Nucleus Software values its employees as crucial assets. This is evident in its various programs that promote employee well-being, such as job enrichment initiatives, competitive compensation, recreational activities, family outings, and diversity and inclusion programs. 

Currently following a hybrid work approach, the company prioritises upskilling with a range of online courses, expert-led training, and on-the-job mentoring. It believes in supporting employees in their professional development. Additionally, the company stands out by offering educational reimbursement for these courses and certifications.

According to Pallav, the work culture at Nucleus is distinguished by its commitment to inclusivity, diversity, engagement, and security. “For candidates driven by challenges, we offer a plethora of complex business use cases. These opportunities not only enhance your technical knowledge but also contribute to enriching your work experience, paving the way for you to become a top-tier professional,” concluded Pallav. 

Find more about the job opportunities here. 

]]>
Data Science Hiring and Interview Process at Confluent https://analyticsindiamag.com/ai-hiring/data-science-hiring-process-at-confluent/ Mon, 29 Jul 2024 09:51:36 +0000 https://analyticsindiamag.com/?p=10117329 The company is seeking data scientists and engineers to further bolster its tech team. ]]>

In September last year, Confluent, a leading provider of data streaming solutions, introduced ‘Data Streaming for AI’, a new initiative to speed up organisations’ creation of real-time AI applications. More recently, the company announced the general availability of Confluent Cloud for Apache Flink. This fully-managed service enables customers to process data in real-time and create high-quality, reusable data streams.

Behind all the innovations in this space is Confluent’s strong and resilient AI and analytics team. “Building a truly data-driven culture is one of the top priorities for Confluent’s Data team. A critical part of achieving that is applying data science to address real-world requirements in business operations,” Ravi Kiran Yanamandra, manager of data science for product and growth marketing at Confluent, told AIM.

Yanamandra, along with Karthik Nair, director of international talent acquisition at Confluent, took us through the company’s AI applications, hiring process, skills needed, and work culture. 

The company is seeking data scientists and engineers to further bolster its tech team. 

Inside Confluent’s Data Science Wing

The data team at Confluent is structured into sub-teams specialising in data engineering, data science, and business intelligence.

The organisation has leveraged data science in experimentation to inform product decisions and optimise marketing investments across multiple channels. This involves a multi-channel attribution model.

Additionally, machine learning forecasting models improve the predictability of critical business KPIs, enabling more precise planning.

In terms of implementation, Confluent’s data science team uses a combination of online and offline machine learning models to support various aspects of the business. For example, online algorithms are deployed to evaluate the quality of leads or new signups in real time, allowing for immediate actions based on the insights generated. 

Furthermore, offline models are operationalised to assist business partners in making informed decisions, such as guiding marketing spend decisions through the marketing channel attribution model and providing predictive insights into future performance through revenue forecasting models.

“While still in the experimental phase, we are actively exploring the potential of generative AI as a productivity tool,” said Yanamandra, highlighting that the initial applications include enhancing internal search capabilities and evaluating the quality of support provided through channels like chatbots and email communications. 

Moreover, through its comprehensive data streaming platform, organisations can stream, connect, process, and manage data in real time, creating innovative solutions previously considered unattainable. By integrating generative AI with data streaming, organisations can swiftly address inquiries with up-to-date and comprehensive information.

In addition to leveraging existing technologies, the team also builds proprietary models on its own data assets to address specific business challenges, said Yanamandra. These models, such as consumption forecasting and lead scoring, are tailored to Confluent’s unique needs, further enhancing its competitive advantage in the market.

The team predominantly uses SQL and Python for modelling and analysis, supported by tools like dbt, Docker, and VertexAI for data pipeline management and production model deployment. Tableau is the primary platform for visualisation and reporting, enabling stakeholders to gain actionable insights from the data effectively.

Interview Process

“The hiring process for our data team focuses on assessing candidates in three areas – technical, analytical, and soft skills,” commented Yanamandra. 

Candidates are evaluated based on their proficiency in Python and SQL, experience in ML algorithms and data modelling, and familiarity with A/B testing and data visualisation. Analytical skills are assessed through problem-solving abilities and structured thinking, while soft skills, such as business understanding and communication, are also crucial.

The interview process begins with technical screening focussed on SQL and Python, followed by a real-world business scenario assessment. For data science roles, there’s an additional stage dedicated to statistics knowledge and machine learning abilities. The final interview with the hiring manager evaluates project delivery experience, technical leadership, motivations, and cultural fit.

Expectations

When joining Confluent’s data science team, new members can expect to actively engage with business partners, focusing on solving their specific business challenges. Successful candidates join as subject matter experts for the company’s data tools and technologies, and training is provided to deepen their understanding of the relevant business domain. 

New joiners can expect to work in a “highly collaborative, innovation-driven, and fast-paced environment on the data science team. We move quickly and prioritise translating data insights into tangible business impact”, Yanamandra added.

“Another unique aspect is that candidates are exposed to diverse domains, offering opportunities to collaborate across functions such as marketing, sales, finance, and product analytics,” Nair told AIM.

Mistakes to Avoid

While interviewing candidates, Yanamandra has noticed a common pattern. Candidates often assume that proficiency in technical skills such as SQL, Python, or machine learning is the sole criterion evaluated for data science roles. 

However, while these skills are definitely crucial, Confluent equally prioritises problem-solving abilities and the capacity to apply data science concepts to practical business scenarios. 

Work Culture 

Confluent strives to prioritise hiring individuals who demonstrate empathy and interact well with others, fostering a collaborative and inclusive environment. “As a rapidly growing company, our employees are self-motivated and driven to seize market opportunities. We follow unified decision-making and communication with an open and hierarchical-free structure,” Nair told AIM.

Nair stated that the company also offers flexibility through a ‘remote-first’ policy, allowing employees to work from various locations. Alongside competitive benefits, it ensures each employee’s contributions are recognised and valued. 

“Our team thrives on a culture of intellectual curiosity and innovation, where individuals will be encouraged to push the boundaries of what’s possible,” said Nair. The company strives to build an equal, diverse and inclusive work culture.

“We’re a high-growth company with incredible results and achievements, yet only scratching the surface of our potential impact in a rapidly growing market. Joining us promises an exhilarating journey of growth,” concluded Nair. 

If you think you are the right fit for Confluent, apply here.

]]>
Data Science Hiring and Interview Process at AstraZeneca https://analyticsindiamag.com/ai-hiring/data-science-hiring-process-at-astrazeneca-2/ Mon, 29 Jul 2024 09:48:48 +0000 https://analyticsindiamag.com/?p=10123796 AstraZeneca is active in leveraging generative AI, with use cases spanning research assistance, executive reporting, and competitive intelligence.]]>

With over 45 years of presence in India, British-Swedish pharmaceutical and biotechnology giant AstraZeneca has data science and AI capabilities deeply ingrained across its entire drug development lifecycle.

For the company, AI plays a pivotal role in accelerating target identification, drug discovery, clinical trials, and commercial analytics. AstraZeneca has implemented rigorous processes to ensure the responsible development and deployment of AI and ML solutions. 

Internally built use cases, solutions, or tools undergo an AI governance maturity assessment before production deployment to ensure compliance with the company’s responsible AI development standards, aligning with their data ethics principles. 

Teams from all business areas contribute to this process, fostering a collaborative approach to ensure AI is developed and deployed safely and responsibly.

AIM recently got in touch with Arun Maria, director of data and AI, R&D IT; Govind Cheramkalam, director of commercial reporting and analytics; and Anuradha Kumar, head HR of global innovation and technology centre, AstraZeneca, to understand more about the GCC’s AI operations, expansion opportunities, hiring process, work culture and more. 

Inside AstraZeneca’s Data Science Labs

AstraZeneca is also active in leveraging generative AI, with use cases spanning research assistance, executive reporting, and competitive intelligence. For example, AZ ChatGPT, an AI-powered research assistant, uses the company’s extensive biology and chemistry knowledge repositories to answer complex questions and provide prompts on discovery and clinical inquiries. 

“We are currently evaluating the capabilities of LLMs like AZ ChatGPT to improve insight generation for executive reports distributed to CXOs and decision-makers in brand and marketing companies,” Cheramkalam told AIM. 

Another such example is the Biological Insight Knowledge Graph (BIKG), a proprietary model developed by AstraZeneca. 

“It utilises the company’s exclusive machine learning models to serve as a recommendation system, enabling scientists to make efficient and informed decisions regarding target discovery and pipeline management. The primary goal of BIKG is to minimise patient attrition rates and improve clinical success,” Maria explained. 

The company has an Enterprise Data and AI strategy unit, with data and AI teams embedded across business and IT functions, fostering a collaborative environment to ensure the provision of foundationally FAIR (Findable, Accessible, Interoperable, and Reusable) data at the source.  

Data engineers, MLOps engineers, AI and ML engineers work as unified teams, promoting collaboration and accelerating business outcomes through structured learning and development programs that cultivate new skills internally.

“AstraZeneca uses a plethora of in-house and externally sourced tools, frameworks and products ranging across very proprietary in-house tools as well as  Databricks and PowerBI,” said Maria. 

The company uses Transformer and GPT models, including testing Microsoft’s Azure OpenAI Service with cutting-edge models like GPT-4 and GPT-3.5. To foster innovation and engagement, AstraZeneca follows a hybrid working model, promoting collaboration while offering flexibility.

Interview Process

AstraZeneca aims to become a data-led enterprise, integrating data into all aspects of its operations. To achieve this, the company seeks candidates with strong skills in Python, machine learning, deep learning, computer vision, and NLP, along with a mindset geared toward growth through innovation.

The interview process is designed to understand both the candidate’s suitability for the role and the company. “For all our roles we look for candidates that not only have the skills, knowledge, experience and competence but can also live our values and behaviour,” Kumar told AIM.

Apart from this, assessments at AZ focus on evaluating whether candidates will perform well in the position, demonstrate leadership potential, exhibit enthusiasm and motivation, and work collaboratively in a team. 

Potential candidates should prepare by understanding these key areas and reflecting on their experiences and qualities that they can bring to the table.

What Candidates Should Expect

Joining AZ’s data science team offers opportunities to collaborate with diverse teams, tackle new challenges, and work with the latest technology. The environment supports development and innovation, with the ultimate goal of powering the business through science and market delivery. 

“As a candidate, research our strategic objectives, core values, and the position you’ve applied for. Use our social media or website to learn about the organisation, team, and people. In the interview, be ready to discuss your past experiences and what you’ve learned from them,” said Kumar. 

This helps candidates avoid the common mistake of being underprepared about the company and the specific role they are applying for.

Work Culture

AstraZeneca fosters a supportive and inclusive workplace where employees can learn and grow. New hires benefit from onboarding and buddy programs, while extensive training and career development opportunities are available for all.

The gender ratio in AstraZeneca India is approximately 64.6% male to 35.4% female.

The company was also recognised by AIM Media House in 2022 for its excellent work in AI and ML, thanks to its focus on putting patients first. It was once again recognised by AIM in 2024 for its data engineering excellence.

In the data science team, AstraZeneca encourages cross-disciplinary collaboration and lifelong learning through the 3Es framework: Experience, Exposure, and Education. 

The company has a global peer-to-peer recognition system and offers comprehensive benefits, including medical insurance covering parents or parents-in-law, personal accident insurance, term life insurance, and childcare support for new parents.

If you think you are a good fit for the company, check out its career page here. 

]]>
Data Science Hiring and Interview Process at Diggibyte https://analyticsindiamag.com/ai-hiring/data-science-hiring-process-at-diggibyte/ https://analyticsindiamag.com/ai-hiring/data-science-hiring-process-at-diggibyte/#respond Mon, 29 Jul 2024 09:48:28 +0000 https://analyticsindiamag.com/?p=10124504 The company is currently hiring for five data science positions in Bengaluru with at least two years of experience in the field.]]>

Founded in 2021 by Lawrance Amburose, Sekhar Reddy, William Rathinasamy, and Anuj Kumar Sen, Bengaluru-based Diggibyte Technologies focuses on providing data and platform engineering, data science and AI, and consulting services to a variety of industries. The founders bring a wealth of experience and expertise in leveraging big data and analytics to drive business innovation and efficiency.

AIM got in touch with Anuj Kumar Sen, the chief technology officer of Diggibyte, to learn more about the company’s AI and data science operations, expansion plans, interview process, and work culture.

The company is currently hiring for five data science positions in Bengaluru, requiring at least two years of experience in the field. 

Inside Diggibyte’s Data Science Team

Diggibyte’s data science team has tackled numerous issues across various domains. “We have solved multiple problems using data science across the domain, which include customer insights and personalisation (home appliance),” Sen told AIM.

By collecting and analysing data from multiple channels, the 50-member data science team provided insights into customer purchasing behaviour, preferences, and needs, helping tailor marketing efforts and improve customer satisfaction.

When it comes to logistics, the team has addressed the problem of inaccurate demand predictions, an issue that can potentially lead to mismanagement of fleet drivers and resources. By analysing historical sales data, market trends, and seasonal variations, the team claims to have developed accurate demand forecasts. 

An improved predictive capability helps clients optimise and plan their fleets and drivers, ensuring resources meet customer demand without impacting delivery efficiency or profitability.

Another notable project is the development of an AI-driven headline generator for Norwegian-language articles. Using NLP models, the solution scans article content and, the company claims, generates compelling headlines that capture the story’s essence.

Besides, the company also employs generative AI for multiple clients in construction, media, and retail. “Our team is currently working on a latent diffusion image-to-image in-painting model that can automate the process of lifestyle product photography and commercial photography for sports accessories,” he added.

Tech Stack

The team leverages both Transformer-architecture models and GPT models, depending on the task requirements. They have developed solutions using the Transformers library for a retrieval-augmented generation (RAG) system. Additionally, they created a chatbot for customer support utilising GPT-3.5 Turbo in Azure ML Studio.

In potential candidates, the company looks for proficiency in Python, SQL, Pandas, NumPy, and machine learning. They should also be adept at using frameworks like Keras or TensorFlow and have experience with Databricks. 

In addition to these core competencies, data scientists are encouraged to have competitive programming skills, be fast learners, and stay updated with the latest developments in generative AI, preferably with hands-on experience. 

Interview Process

The interview process at the company is thorough and structured, assessing both technical and interpersonal skills. It begins with two technical assessment rounds where candidates are evaluated on their fundamental data science skills and adaptability to the latest trends in data science and AI. 

These are followed by an HR round focusing on the candidate’s interpersonal skills, career goals, collaborative abilities, and employment history.

The company relies on a multi-channel approach to find top data science talent. This includes leveraging its internal careers page (set to be deployed in 20 days), online career portals, and trusted vendors.

Work Culture

Once employed, new hires can expect to work on cutting-edge AI projects. The company also provides training to bridge any gaps in both technical and interpersonal skills, supported by the leadership team. 

“Since we have a separate team for analytics, engineering, visualisation, and machine learning, we expect candidates to have specialisation in one skill rather than general knowledge in multiple skills,” Sen explained. 

Employees can pursue certifications from reputable sources such as Azure and Databricks, with the company covering the exam costs. It offers a flexible work-life balance, accommodating four days of work-from-office alongside work-from-home arrangements, as well as a supportive leave policy. 

The company also has policies in effect to make job options accessible, inclusive and diverse for all. Efforts around healthcare and a positive office atmosphere promote employee well-being.

Diggibyte Technologies was certified as a Best Firm for Data Engineers and received the recognition at DES 2024.

]]>
[Exclusive] Altair is Hiring Multiple Data Scientists in India https://analyticsindiamag.com/ai-hiring/exclusive-altair-is-hiring-multiple-data-scientists-in-india/ https://analyticsindiamag.com/ai-hiring/exclusive-altair-is-hiring-multiple-data-scientists-in-india/#respond Mon, 29 Jul 2024 09:08:36 +0000 https://analyticsindiamag.com/?p=10129455 From interns to seasoned professionals, the company is looking for people skilled in building data pipelines, data preparation, data modelling, model validation, and deployment. ]]>

Recently, computational intelligence solutions giant Altair launched HyperWorks 2024, an AI-powered engineering and simulation platform that integrates AI across the product lifecycle, with the company claiming faster design exploration, efficient thermal analysis, and rapid behaviour prediction. 

It offers AI-embedded workflows, HPC environments, photorealistic graphics, Python and C++ scripting, generative design, and meshless ECAD simulation to boost productivity and streamline engineering workflows.

Gilma Saravia, the chief people officer of Altair, told AIM in an exclusive interaction that the company is planning to expand its footprint in India and is hiring for multiple roles in AI and analytics. It is also offering internship roles.

Skills Needed

Altair seeks individuals who are up for innovation and continuous learning. Specifically, for data science roles, the ideal candidates should be able to design and build systems for collecting, storing, and analysing data at scale and be good at improving data reliability and quality.

They should be skilled in building data pipelines, data preparation, data modelling, model validation, and deployment.

“We assess their (candidates) case studies and code completion assessments, as well as domain expertise to understand their level of technical proficiency. It’s through this assessment of their skills in statistics, programming, ML, and data visualisation that provide us the best understanding of where we can place the candidate,” she added. 

Interview Process

“The interview process at Altair is designed to identify candidates who align with the company’s core values: envisioning the future, seeking technological and business firsts, communicating broadly and honestly, and embracing diversity and risk-taking,” added Saravia. 

Initial Screening: Evaluation of candidates’ experience and domain expertise.

Technical Evaluations: A series of technical assessments focusing on coding, model building, and problem-solving abilities.

Behavioural Assessments: Evaluation of candidates’ fit with Altair’s culture and core values.

Project-Specific Questions: Tailored inquiries to assess applied skills in machine learning algorithms, statistical modelling, and domain-specific knowledge.

Practical Exercises: Candidates are tested on their practical coding and modelling skills, including code snippet completion, data preprocessing, and model optimisation.

The company expects candidates to have a customer-focused outlook, the ability to think complexly yet arrive at simple solutions, and a deep curiosity for exploring beyond the obvious. In return, it offers competitive rewards, flexible work schedules, and ample opportunities for career development. 

“We also greatly value an entrepreneurial spirit and team members who are always looking forward to new ideas and challenges. Candidates who are not able to envision the future or who do not possess a ‘problem-solving mindset’ may not qualify as a top candidate in the process,” Saravia concluded.

Check out the careers page to apply.

Read Next: Data Science Hiring Process at Target

]]>
Time To Scale Down Large Language Models https://analyticsindiamag.com/ai-trends/time-to-scale-down-large-language-models/ https://analyticsindiamag.com/ai-trends/time-to-scale-down-large-language-models/#respond Tue, 16 Jul 2024 05:56:13 +0000 https://analyticsindiamag.com/?p=10129231 Advancements in hardware (H100 GPUs), software (CUDA, cuBLAS, cuDNN, FlashAttention), and data quality have drastically reduced training costs.]]>

Renowned research scientist Andrej Karpathy recently said that the llm.c project showcases how GPT-2 can now be trained in merely 24 hours on a single 8XH100 GPU node—for just $672. 

Karpathy’s journey began with an interest in reproducing OpenAI’s GPT-2 for educational purposes. He initially encountered obstacles in using PyTorch, a popular deep-learning framework. 

Frustrated by these challenges, Karpathy decided to write the entire training process from scratch in C/CUDA, resulting in the creation of the llm.c project. It eventually evolved into a streamlined, efficient system for training language models.

The project, which implements GPT training in C/CUDA, has minimal setup requirements and offers efficient and cost-effective model training.

Scaling down LLMs 

In his post, Karpathy mentioned how advancements in hardware (H100 GPUs), software (CUDA, cuBLAS, cuDNN, FlashAttention), and data quality have drastically reduced training costs. 

Mauro Sicard, the director of BRIX Agency, agreed with Karpathy. “With the improvements in both GPUs and training optimisation, the future may surprise us,” he said.

Scaling down LLM models while maintaining performance is a crucial step in making AI more accessible and affordable. 

According to Meta engineer Mahima Chhagani, LLMLingua is a method designed to efficiently decrease the size of prompts without sacrificing significant information. 

Chhagani said using an LLM cascade, starting with affordable models like GPT-2 and escalating to more powerful ones like GPT-3.5 Turbo and GPT-4 Turbo, optimises cost by only using expensive models when necessary.

FrugalGPT is another approach that uses multiple APIs to balance cost and performance, reducing costs by up to 98% while maintaining performance comparable to GPT-4. 
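
To make the cascade idea concrete, here is a minimal Python sketch in the spirit of FrugalGPT: a cheap model answers first, and the request escalates only when a scoring step judges the answer too weak. The model names, the call_model and quality_score helpers, and the 0.8 threshold are illustrative assumptions, not any specific vendor’s API.

```python
# Illustrative sketch of an LLM cascade (FrugalGPT-style routing).
# The model names and both helper functions are hypothetical stand-ins.

MODELS = ["cheap-small-model", "mid-tier-model", "frontier-model"]

def call_model(name: str, prompt: str) -> str:
    # Stand-in: replace with the real API call for each tier.
    return f"[{name}] draft answer to: {prompt}"

def quality_score(prompt: str, answer: str) -> float:
    # Stand-in heuristic; FrugalGPT trains a small scorer model for this step.
    return 0.9 if len(answer) > 60 else 0.3

def cascade(prompt: str, threshold: float = 0.8) -> str:
    """Return the first answer that clears the quality bar, cheapest model first."""
    answer = ""
    for name in MODELS:
        answer = call_model(name, prompt)
        if quality_score(prompt, answer) >= threshold:
            break  # good enough, no need to pay for a larger model
    return answer

print(cascade("Summarise the key points of this support ticket."))
```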

Additionally, a Reddit developer named pmarks98 used a fine-tuning approach with tools like OpenPipe and models like Mistral 7B, cutting costs by up to 88%.

Is There a Real Need to Reduce Costs?

Cheaper LLMs, especially open-source models, often have limited capabilities compared to the proprietary models from tech giants like OpenAI or Google. 

While the upfront costs may be lower, running a cheap LLM locally can lead to higher long-term costs due to the need for specialised hardware, maintenance overheads, and limited scalability.

Moreover, as pointed out by Princeton professor Arvind Narayanan, the focus has shifted from capability improvements to massive cost reductions, which many AI researchers find disappointing.

Cost over Capability Improvements

Narayanan argued that cost reductions are more exciting and impactful for several reasons. They often lead to improved accuracy in many tasks. Lower costs can also accelerate the pace of research by making it more affordable and by putting more functionality within reach.

So, in terms of what will make LLMs more useful in people’s lives, cost is hands down more significant at this stage than capability, he said.

In another post, Narayanan said that the cheaper a resource gets, the more demand there will be for it. Maybe in the future it will be common to build applications that invoke LLMs millions of times in the process of completing a simple task.

This democratisation of AI could accelerate faster than we imagined, possibly leading to personal AGIs for $10 by 2029.

]]>
Vector Databases are Ridiculously Good https://analyticsindiamag.com/ai-features/vector-databases-are-ridiculously-good/ https://analyticsindiamag.com/ai-features/vector-databases-are-ridiculously-good/#respond Mon, 15 Jul 2024 06:05:11 +0000 https://analyticsindiamag.com/?p=10126833 With the increasing adoption predicted by experts and the introduction of educational resources, vector databases are set to play a pivotal role in shaping the next era of AI technology]]>

Building large language models requires complicated data structures and computations, which conventional databases are not designed to handle. Consequently, the importance of vector databases has surged since the onset of the generative AI race. 

This sentiment was reflected in a recent discussion when software and machine learning engineer Santiago Valdarrama said, “You can’t work in AI today without bumping with a vector database. They are everywhere!”

He further added that vector databases, with their ability to store floating-point arrays that can be searched using a similarity function, offer a practical and efficient solution for AI applications.
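
At its core, that is all a vector store does: hold arrays of floating-point numbers and return the ones nearest to a query. The brute-force NumPy sketch below illustrates the idea with cosine similarity; production vector databases add approximate indexing (such as HNSW), metadata filtering, and persistence on top.

```python
import numpy as np

def cosine_top_k(query: np.ndarray, vectors: np.ndarray, k: int = 3) -> np.ndarray:
    """Return indices of the k stored vectors most similar to the query."""
    q = query / np.linalg.norm(query)
    v = vectors / np.linalg.norm(vectors, axis=1, keepdims=True)
    scores = v @ q                      # cosine similarity against every stored row
    return np.argsort(-scores)[:k]      # indices of the highest-scoring rows

# Toy store: 1,000 embeddings of dimension 384, plus one query embedding.
rng = np.random.default_rng(0)
store = rng.normal(size=(1000, 384))
query = rng.normal(size=384)
print(cosine_top_k(query, store, k=5))
```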

Vector databases provide LLMs with access to real-time proprietary data, enabling the development of RAG applications.

Database companies are pivotal in driving the generative AI revolution and its growth. Redis enhances real-time efficiency for LLM-powered chatbots like ChatGPT, ensuring smooth conversations. At the same time, enterprises are leveraging MongoDB Atlas and the Google Cloud Vertex AI PaLM API to develop advanced chatbots.

Making it Easier

However, major database vendors, regardless of whether they were originally built as SQL or NoSQL systems, such as MongoDB, Redis, PlanetScale, and even Oracle, have all added vector search features to their existing solutions to capitalise on this growing need.

In an earlier interaction with AIM, Yiftach Shoolman, the co-founder and CTO of Redis, said, “We have been working with vector databases even before generative AI came into action.” 

Redis not only fuels the generative AI wave with real-time data but has also partnered with LangChain to launch OpenGPT, an open-source project that allows flexible model selection, data retrieval control, and data storage management.

Another important challenge vector databases claim to solve is hallucinations, which have been a persistent issue for LLMs. 

“Pairing vector databases with LLMs allows for the incorporation of proprietary data, effectively reducing the potential range of responses generated by the database,” said Matt Asay, VP of developer relations at MongoDB, in an exclusive interaction with AIM at last year’s Bengaluru edition of the company’s flagship event, MongoDB.local.

During a recent panel discussion, Pinecone founder and CEO Edo Liberty explained that vector databases are made to manage these particular types of information “in the same way that in your brain, the way you remember faces or the way you remember poetry”.

Most of the prominent names in the industry have already implemented vector capabilities. Think Amazon Web Services, Microsoft, IBM, Databricks, MongoDB, Salesforce, and Adobe.

Jonathan Ellis, the co-founder and CTO of DataStax, explained that while OpenAI’s GPT-4 is limited to information up until September 2021, indexing recent data in a vector database and directing GPT-4 to access it can yield more accurate and high-quality answers. This approach eliminates the need for the model to fabricate information, as it is grounded in updated context.
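
The pattern Ellis describes can be sketched in a few lines of Python. The embed, vector_search, and chat functions below are hypothetical stand-ins for whichever embedding model, vector database, and LLM API a team actually uses; the point is simply that retrieval supplies fresh context and the model is asked to answer only from it.

```python
from typing import List

def embed(text: str) -> List[float]:
    # Stand-in: replace with a call to a real embedding model.
    return [float(ord(c)) for c in text[:16]]

def vector_search(query_vec: List[float], top_k: int = 4) -> List[str]:
    # Stand-in: replace with a similarity query against the vector database.
    return ["(retrieved passage 1)", "(retrieved passage 2)"][:top_k]

def chat(prompt: str) -> str:
    # Stand-in: replace with a call to the LLM provider of choice.
    return "(answer grounded in the supplied context)"

def answer_with_context(question: str, k: int = 4) -> str:
    """Retrieve fresh, indexed context first, then ask the model to use only that."""
    context = "\n\n".join(vector_search(embed(question), top_k=k))
    prompt = (
        "Answer using only the context below. If the answer is not in the "
        "context, say you don't know.\n\n"
        f"Context:\n{context}\n\nQuestion: {question}"
    )
    return chat(prompt)

print(answer_with_context("What changed in the product after September 2021?"))
```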

What Next? 

However, vector databases are not without challenges. A recent report by Gartner noted that using vector databases for generative AI may raise issues with raw data leakage from embedded vectors. Raw data used to create vector embeddings for GenAI can be re-engineered from vector databases, making data leakage possible.

“Given the compute costs associated with AI, it is crucial for organisations to engage in worker training around vector database capabilities,” Gartner analyst Arun Chandrasekaran emphasised in an interview with Fierce. “This preparation will help them avoid expensive misuse and misalignment in their AI projects.”

Nevertheless, several vector database startups are now gaining prominence. During an otherwise weak year for venture capital, hundreds of millions of dollars are flowing into vector database businesses like Pinecone, which raised $100 million in April 2023 from Andreessen Horowitz. 

Pinecone is not the only one. Dutch firm Weaviate secured $50 million from Index Ventures. The Weaviate AI-native vector database simplifies vector data management for AI developers.

There are emerging divisions in the vector database arena, particularly between open- and closed-source players, and between dedicated vector databases and those with integrated vector storage and search functionality. 

On the dedicated, open-source side, Chroma, Qdrant, and Milvus (in collaboration with IBM) stand out, while Pinecone is a leading dedicated, closed-source player. Meanwhile, Snowflake, although not a dedicated vector database, offers vector search capabilities within its open-source framework.

And there’s a good reason why so many people are jumping into this sector. Chandrasekaran predicts that 30% of organisations will employ vector databases to support their generative AI models by 2026, up from 2% in 2023. 

Understanding their importance, Andrew Ng has also introduced free learning courses on vector databases with MongoDB, Weaviate, Neo4j and more.

With the increasing adoption predicted by experts and the introduction of educational resources, vector databases are set to play a pivotal role in shaping the next era of AI technology. 

As organisations continue to integrate these powerful tools, the potential for innovation and improved AI capabilities becomes ever more significant, heralding a new age of intelligent applications and solutions.

]]>
There is No Such Thing as Experts https://analyticsindiamag.com/ai-features/there-is-no-such-thing-as-experts/ https://analyticsindiamag.com/ai-features/there-is-no-such-thing-as-experts/#respond Thu, 11 Jul 2024 07:35:56 +0000 https://analyticsindiamag.com/?p=10126512 Nobody with real experience is ever credited with any major innovation.]]>

In a recent interview, Vinod Khosla, the founder of Khosla Ventures, had some unconventional insight to share. He claimed that major innovations almost always stem from disruptors originating outside the field. Nobody with real experience is ever credited with any major innovation, he said. 

As per Khosla, these disruptors often challenge conventional wisdom and apply cross-disciplinary knowledge to create breakthrough technologies and products.

In Khosla’s words, “Retailing didn’t come from Walmart, it came from Amazon. SpaceX didn’t come from Lockheed or Boeing. Companies like SpaceX and RocketLabs didn’t work in space before entering the field.”

Outsiders Bring in Fresh Ideas

And he isn’t wrong—everywhere you look, industries are being disrupted. The hotel industry encountered it with Airbnb, and the music industry experienced it with Spotify. Mark Zuckerberg created a social media platform that changed online interaction despite not being an expert in social networking.

Each one of these businesses, in Khosla’s words, offered a more practical service than the tried-and-true approaches. By doing this, they not only fundamentally altered the rules for their rivals and users but also inspired a new wave of innovation.

Khosla discussed how, in 2018, he decided to invest in OpenAI, which back then was a nonprofit focused on artificial general intelligence. 

“Many people told me that I was crazy. It was a nonprofit and there were lots of reasons not to invest in it. It was an odd structure and they didn’t have a business plan. They didn’t know if they’d ever get revenue. And I made the largest initial investment,” he said.

Elon Musk is another prime example of this. Although he had no prior experience in the automotive or aerospace industries, he managed to disrupt both sectors by applying new perspectives and innovative thinking. His concept of a powerful, long-range EV was compelling. 

However, when Musk first introduced his vehicle, the Roadster, based on the Lotus platform, it experienced multiple failures and recalls. The technology was not flawless, the market was unpredictable, and the pricing was hard to justify.

Steve Jobs, for instance, revolutionised many industries, including computing, telecommunications, and music, not because he was a specialist in any of them. He did it because he approached problems from a user-centric standpoint, with an unwavering emphasis on design and simplicity.

Apple noticed that while laptops excelled in functionality—thanks to larger, more comfortable screens and keyboards, as well as fast processors—they lacked the portability that cell phones had.

There was a clear demand for a device that was more portable than a laptop but more capable than a smartphone, which led to the development of the iPad. At its peak, the iPad accounted for 28% of Apple’s revenue.

The Non-Expert Advantage

“You don’t go hire somebody from IBM, who’s done IBM for 20 years; that experience will kill you for sure,” Khosla said in the interview. 

Khosla believes that disruptive innovation typically comes from individuals or teams not deeply entrenched in traditional industry practices.

Let’s take a look at one of Khosla’s investments, Commonwealth Fusion Systems. In the interview, he spoke about meeting with a senior fellow from MIT’s plasma fusion lab. 

Khosla felt that the world needed a different kind of electricity than what was available, so he decided to take a chance with fusion energy despite nobody at the Department of Energy wanting to talk about it at the time. 

Commonwealth Fusion Systems is now working to develop fusion as a viable energy source.

Another example is Okta, one of the first companies he invested in. Before Okta, several companies like Microsoft and IBM already offered identity and access management (IAM) solutions. 

However, they were primarily designed for on-premises environments. Okta thus began offering an entirely cloud-based program, which allowed for easier deployment, maintenance, and scalability. Today, Okta holds a 28% market share.

Innovation Comes from Disruptors

Another story played out in the transition of Netflix, which has been around since 1997, from DVD shipping to streaming. This transition required the company to disrupt itself, an extraordinary task, as most successful disruptive innovations attack someone else’s profit pool, not one’s own. 

Conversely, if another industry’s core offering is delivered as a service, consider whether it can also be packaged as a product. HubSpot provides an excellent example of this type of innovation. 

Following the success of Google’s search engine, other companies started offering consulting services aimed at helping their clients become more visible in Google searches (i.e. search engine optimisation). 

HubSpot developed software enabling companies to optimise for search engines, thus transforming a core offering that others provided as a service into a product.

Learn Lessons in Disruption 

That said, experts still play a crucial role in driving innovation and advancing technology, and Elizabeth Holmes is a cautionary example of what can happen without them. She was a non-expert in medical technology who founded Theranos, promising revolutionary blood-testing technology. 

The lack of deep medical knowledge led to flawed technology that failed to deliver accurate results, ultimately resulting in legal issues and the company’s collapse.

So, to innovate effectively, it’s often beneficial to balance the fresh perspectives of non-experts with the deep knowledge of experts. 

]]>
Adobe is Hiring GenAI Researchers in India https://analyticsindiamag.com/ai-hiring/adobe-is-hiring-genai-researchers-in-india/ https://analyticsindiamag.com/ai-hiring/adobe-is-hiring-genai-researchers-in-india/#respond Mon, 08 Jul 2024 12:07:26 +0000 https://analyticsindiamag.com/?p=10126210 It is looking for researcher in NLP, LLMs, computer vision, deep learning, ML, and more. ]]>

Creative tech giant Adobe, which houses around 8,000 employees in India, is expanding its generative AI team and is hiring talented researchers across various fields.

This includes natural language processing, natural language generation, human-computer interaction and user experience research, computer vision, deep learning and machine learning, artificial intelligence, image, audio, video, and multi-modal data processing, and autonomic systems and computing.

Upon joining, the researcher will be responsible for creating, designing, developing, and prototyping AI innovations. Researchers will also transform these innovations into advanced technologies for Adobe’s products, explore new research areas, publish their findings in leading journals and conferences, and collaborate with colleagues from top universities globally.

The ideal candidates must have proven research excellence and a strong publication record. They should have an educational background in computer science, electrical engineering, or mathematics. 

For senior roles, a minimum of seven years of research experience is necessary. Additionally, candidates must show an ability to take bold initiatives, solve complex problems, prioritise tasks, and make informed decisions. Strong analytical, mathematical modelling, and software skills are essential, along with the capability to work at various levels of abstraction.

Check out its career page here.

Previously, Prativa Mohapatra, VP and MD of Adobe India, told AIM that the company’s integration of generative AI focuses on three primary areas within its cloud products: Adobe Experience Cloud, Creative Cloud, and Document Cloud. These address the need for faster and more effective content workflows, directly benefiting enterprise customers by reducing time-to-market and enabling personalised customer interactions at scale.

The company wants to democratise generative AI-powered content creation with product integrations. 

Adobe entered the generative AI race in March 2023 with Adobe Firefly, which it built in collaboration with NVIDIA, and has been coming up with new AI updates pretty frequently. 

Now, Adobe will integrate third-party AI tools, including OpenAI’s Sora, into Premiere Pro. Partnering with AI providers like OpenAI, RunwayML, and Pika, it aims to offer users flexibility in choosing AI models for their workflows.

]]>
AIM launches CDO Vision Mumbai and Dallas: Meet 30 Top CIOs & CDOs to discuss future of AI https://analyticsindiamag.com/ai-highlights/aim-launches-cdo-vision-mumbai-and-dallas-meet-30-top-cios-cdos-to-discuss-future-of-ai/ https://analyticsindiamag.com/ai-highlights/aim-launches-cdo-vision-mumbai-and-dallas-meet-30-top-cios-cdos-to-discuss-future-of-ai/#respond Mon, 08 Jul 2024 12:03:05 +0000 https://analyticsindiamag.com/?p=10126203 CDO Vision 2024 brings together top CDOs and AI leaders in Mumbai and Dallas for exclusive networking and insightful discussions on data-driven decision-making and AI strategies. ]]>

CDO Vision is an exclusive networking event series presented by AIM, bringing together over 30 top CDOs and AI leaders from large enterprises to discuss critical business issues and the future of data and analytics. This August, AIM is proud to announce two major events: CDO Vision Mumbai and CDO Vision Dallas.

CDO Vision Mumbai: August 30, 2024

Scheduled to take place in the heart of Mumbai, this event will provide a platform for the city’s elite CDOs and CIOs to connect and collaborate over an intimate lunch. The focus will be on fostering collaboration between tech and business for data-driven decision-making and exploring the latest advancements and strategies in data management and analytics.

Mumbai Agenda:

  • 9:00 AM – 9:15 AM: Registration & Networking
  • 9:15 AM – 9:30 AM: Welcome & Introductions
    • AIM representatives will introduce the CDO Vision Series, its objectives, and the key themes.
  • 9:30 AM – 10:00 AM: Keynote
  • 10:00 AM – 10:30 AM: Panel Discussion – Fostering Collaboration Between Tech and Business for Data-Driven Decision Making
  • 11:00 AM – 12:30 PM: Roundtable Discussion – Data Strategies for the Future: Balancing Innovation, Governance, and Technology
    • Thriving with Responsible AI: Explore how to balance cutting-edge data use with ethical considerations and responsible data governance practices.
    • Navigating the Regulatory Maze: Discuss strategies for staying compliant with evolving data privacy regulations.
  • 12:30 PM – 2:00 PM: Lunch and Networking

Register for CDO Vision Mumbai.

CDO Vision Dallas: August 23, 2024

Taking place at the Fairmont Dallas, this event will feature a gathering of over 30 top CDOs and analytics leaders. The focus will be on exploring the role of CDOs in today’s dynamic business landscape, especially in the context of Dallas’ thriving business environment.

Dallas Agenda:

  • 9:00 AM – 9:15 AM: Registration & Networking
  • 9:15 AM – 9:30 AM: Welcome & Introductions
    • AIM representatives will introduce the CDO Vision Series, its objectives, and the key themes for the day.
  • 9:30 AM – 10:00 AM: Keynote
  • 10:00 AM – 10:30 AM: Panel Discussion – Blind Spot in Generative AI
    • Leaders will discuss the importance of addressing blind spots in AI implementation strategies to ensure the completeness and effectiveness of their plans.
  • 10:30 AM – 11:00 AM: Panel Discussion – How to Monetize Generative AI
    • Strategic approaches for leaders to monetize Generative AI effectively, leveraging AI-generated content or designs, licensing AI models, and integrating AI-driven automation.
  • 11:00 AM – 12:30 PM: Roundtable Discussion – Generative AI: Basics, Pitfalls, and Best Practices
    • Industry leaders and specialists will discuss core principles, potential pitfalls, and key best practices for Generative AI.
  • 12:30 PM – 2:00 PM: Lunch and Networking

Why Attend?

  1. Exclusive Insights: Gain firsthand knowledge from top 30 CDOs and CIOs about the latest trends and strategies in data management and AI.
  2. Networking Opportunities: Build meaningful connections with industry peers and thought leaders.
  3. Strategic Discussions: Engage in in-depth discussions on how to harness the power of Generative AI and other advanced technologies to drive business success.
  4. Learn from the Best: Hear from leaders who have successfully navigated the complexities of data-driven decision-making and AI implementation.

Speakers and Participants

The events will feature an impressive lineup of speakers and participants, including:

  • Phanii Pydimarri – Head of Strategic Planning & Partnerships at Health Care Service Corporation
  • Preet Nagvanshi – SVP, Head of GTM Commercial Operations at Thomson Reuters
  • Arjun Srinivasan – Director – Data Science at Wesco
  • Rishi Bhatia – Director – Data Science at Walmart Global Tech
  • Erum Manzoor – Senior Vice President at Citi
  • Pritam Debnath – Director, Inbound Supply Chain Technology & Innovation at Sysco Corporation
  • Trey Connolly – VP, Data & Analysis at Digitas North America
  • Kalyana Bedhu – AI/ML Leader at Fannie Mae
  • Robert Garagiola – Senior Director – Global Enterprise Data at James Hardie
  • Paul Davis – SVP, Head of AI/ML Model Development for Consumer Lending at Wells Fargo
  • Ambika Saklani Bhardwaj – Data & Analytics Product Leader at Walmart
  • Zulfikar Sidi – Managing Director, Data & Analytics, Retail, Preferred and Wealth Management at Bank of America
  • Ram Chandra – Director of Data Science at Toyota North America
  • Sandeep Ahluwalia – Director – Data and Analytics at Citi
  • Satheesh Ramachandran – Head of AI and Analytics Product at Charles Schwab

Register for CDO Vision Dallas

These events follow the success of previous CDO Vision series held in Singapore, Dubai, New York, and San Francisco, all organized by AIM. CDO Vision is an AIM intellectual property designed to bring together the best minds in data and analytics for productive discussions and networking opportunities.

For more details and to register for these exclusive events, visit the official CDO Vision website here. Don’t miss this opportunity to connect with industry leaders and drive the future of data and analytics.

]]>
From Jaipur to Jersey: The Growth Story of Celebal Technologies https://analyticsindiamag.com/ai-features/from-jaipur-to-jersey-the-growth-story-of-celebal-technologies/ https://analyticsindiamag.com/ai-features/from-jaipur-to-jersey-the-growth-story-of-celebal-technologies/#respond Fri, 05 Jul 2024 11:29:55 +0000 https://analyticsindiamag.com/?p=10126011 It has achieved remarkable success, from Jaipur to becoming a global company with presence in USA, India, APAC, UAE, Europe, & Canada. ]]>

Amidst the AI annotation, semiconductor and other tech talent brewing in India’s tier-2 and tier-3 cities, Jaipur has emerged as a leading hub for AI talent in the country, with the highest number of AI job openings among tier-2 cities. 

A recent report revealed that 13% of AI job postings from December 2023 to April 2024 were in tier-2 and 3 cities, with Jaipur at the forefront, followed by Indore and South Goa.

Celebal Technologies, a premier software services company based in Jaipur, has played a significant role in developing the local AI talent ecosystem. 

With this focus on talent development, the company is helping unfold a quiet IT revolution in the heart of Rajasthan’s capital city.

Celebal, co-founded by Anirudh Kala and Anupam Gupta in 2016, is helping legacy enterprises embrace modern cloud innovations and AI, delivering end-to-end digital transformation to clients across the globe. 

In just a few years, the company has achieved remarkable success, growing from its humble beginnings in Jaipur to becoming a global company with a presence in the USA, India, APAC, UAE, Europe, and Canada. 

The company now boasts an impressive client base that includes 90% of Fortune 500 companies, a dedicated workforce of over 2000 professionals, and a track record of triple-digit revenue growth.

“Our mission is to make data simple and easy to understand for all organisations,” Kala told AIM in an exclusive interview on the sidelines of the recently concluded Data+AI Summit.

“We are committed to providing solutions powered by modern cloud and artificial intelligence that integrate with traditional enterprise software. Our tailor-made solutions help enterprises maximise productivity and improve speed and accuracy,” he added.

Nurturing AI Talent in Tier 2, 3, 4

At the core of Celebal Technologies’ success is its focus on nurturing and empowering talent. The company has grown its workforce from just 300 employees in FY21 to over 2,300 in FY24, achieving a compound annual growth rate (CAGR) of approximately 105% during this period.

“There is no dearth of talent in this country,” Kala emphasised. “Specifically the fact that there are so many folks who have not had the chance to prove themselves. Sometimes I would meet people who are extremely introverted, but great coders. I have met talented folks from very humble backgrounds.”

Celebal has made it a priority to provide opportunities to individuals from diverse backgrounds, including those from tier 4 and tier 5 towns in India who may not have had access to privileged education. 

“We have been training people like those,” Kala said. “I can tell you they are amazing guys. There is no reason for intellect to remain with those who are privileged. It is with everyone.”

By focusing on creating talent rather than just hiring from the market, Celebal Technologies has built a strong foundation for its growth. 

The company provides mentorship and training, and shows patience in nurturing its workforce. It recognises that once an individual has learnt the necessary skills, there is no stopping them.

Innovation with Data and AI

Celebal Technologies’ success is also rooted in its unwavering focus on innovation, particularly in the areas of data science and artificial intelligence. 

Kala, who has spent over a decade working in AI, has witnessed firsthand how the technology has evolved from a “good to have” to a “must have” for businesses.

“We really focus on data and AI. And now, with data specifically, the kind of customer conversions that we have, I think it is second to none because we have never had such momentum,” Kala beamed.

The company leverages its expertise in AI and data to deliver cutting-edge solutions to its clients, helping them improve business efficiencies and unlock the full potential of their data. 

The company’s solutions span across various domains, including predictive maintenance, anomaly detection, inventory forecasting, customer 360, IoT analytics, and more.

Strategic Partnerships and Industry Recognition

Celebal Technologies’ success has been further fueled by its strategic partnerships with industry giants such as Microsoft and Databricks. 

The company has been recognised as a Microsoft Gold Partner and crowned Microsoft India Partner of the Year for two consecutive years in 2021 & 2022 and Global Partner of the Year for 2023.

“Winning this award for the first time felt like an incredible achievement but winning it twice in a row is mind-boggling,” said Anupam Gupta, co-founder and president of Celebal Technologies. 

“This is a validation of our strategy in the domain of SAP innovation on Azure, and Power Platform along with our committed execution of this strategy,” Gupta added.

In addition to its Microsoft accolades, Celebal has also emerged victorious at the Databricks Partner Awards for three consecutive years. 

“We firmly believe that the cornerstone of AI innovation lies in robust data foundations. This recognition reflects our unwavering commitment to leveraging data to drive transformative, industry-specific GenAI solutions in production for our clients.”

What’s Next? 

As the company continues its remarkable journey, it remains focused on its fundamentals and the promise of creating intelligent enterprises. It aims to help organisations with legacy systems and outdated software ecosystems embrace the power of modern cloud platforms and AI.

“Traditional enterprises often carry the baggage of antiquated technology, heavy licensing-based limitations, and scalability challenges,” Gupta explained. “But they also want high levels of data analysis that puts them on par with far younger unicorns and startups.”

Celebal bridges this gap by providing legacy businesses with the right tech stack, platforms, and partnerships to enable their digital transformation. 

The company’s long-term vision is to make AI an implicit part of business processes, seamlessly integrating it into the very fabric of organisations.

“Eventually, we want organisations to make AI an implicit part of their processes. For example, if we build a house, we know the pipelines, the plumbing work, right? It has to be there. We don’t think of building it explicitly. 

“Could we have a business environment and processes that would have the same concept? It is implicit that we use AI,” Kala said.

While scaling presents its own challenges, Celebal Technologies remains committed to its fundamentals and the potential of each individual to contribute. The company aims to do full justice to each person’s career and ability, fostering a culture of continuous learning and growth.

“Scaling has its own challenges. I will not shy away from the fact that we have struggled, we are still struggling,” Kala admitted. 

“Of course, we are trying to figure out how to cut down that struggle. I think that struggle is going to be there even if we get more and more projects, more and more people. The way to scale is only through focusing on your fundamentals.”

As the company continues to push the boundaries of what is possible with data and AI, it is not just transforming businesses but also shaping the future of the industry, one solution at a time.

“I would say it is not a hype because we are using it,” Kala emphasised, referring to the AI revolution. “It is here to stay. We are seeing things happening. We are seeing things in our daily lives.”

Why Data Quality Matters in the Age of Generative AI https://analyticsindiamag.com/ai-features/generative-ai/ Thu, 04 Jul 2024 04:30:00 +0000 GenAI can augment human intelligence by identifying patterns and correlations that humans may miss.

In the dynamic realm of data engineering, the integration of Generative AI is not just a distant aspiration; it’s a vibrant reality. With data serving as the catalyst for innovation, its creation, processing, and management have never been more crucial.

“While AI models are important, the quality of results we get are dependent on datasets, and if quality data is not tapped correctly, it will result in AI producing incorrect results. With the help of Gen AI, we are generating quality synthetic data for testing our models,” said Abhijit Naik, Managing Director, India co-lead for Wealth Management Technology at Morgan Stanley.

Speaking at AIM’s Data Engineering Summit 2024, Naik said that generative AI, along with today’s machine learning, neural network, and deep learning models, represents the next stage of automation after the RPA era.

“Gen AI will always generate results for you. And when it generates results for you, sometimes it hallucinates. So, what data you feed becomes very critical in terms of the data quality, in terms of the correctness of that data, and in terms of the details of data that we feed,” Naik said. 

However, it’s important to note that human oversight is crucial in this process, Naik added. When integrated carefully into existing pipelines and with appropriate human oversight, GenAI can augment human intelligence by identifying patterns and correlations that humans may miss.
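
To make this concrete, below is a minimal, hypothetical sketch of the kind of cheap data-quality checks a team might run before feeding a dataset into a model; the column names and checks are illustrative assumptions, not anything described by Naik or used at Morgan Stanley.

import pandas as pd

def quality_report(df: pd.DataFrame) -> dict:
    """Cheap pre-training checks for issues that quietly degrade model output."""
    report = {
        "rows": len(df),
        "duplicate_rows": int(df.duplicated().sum()),
        "missing_by_column": df.isna().sum().to_dict(),
    }
    # Domain-specific sanity check (hypothetical column name).
    if "amount" in df.columns:
        report["negative_amounts"] = int((df["amount"] < 0).sum())
    return report

# Toy transaction-like data standing in for a real feed.
df = pd.DataFrame({
    "amount": [120.0, -5.0, None, 120.0],
    "merchant": ["A", "B", "C", "A"],
})
print(quality_report(df))  # flags 1 duplicate row, 1 missing amount, 1 negative amount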

Documenting every aspect of how generative models function and the knowledge they derive from data is a complex task. This underscores the need for caution and a thorough understanding when integrating generative AI. 

Due to their vast size and training on extensive unstructured data, generative models can behave in unpredictable, emergent ways that are difficult to document exhaustively.  

“This unpredictability can lead to challenges in understanding and explaining their decisions,” Naik said.

GenAI in Banking

Naik emphasised GenAI’s importance in the banking and finance sectors. “It can generate realistic synthetic customer data for testing models while addressing privacy and regulatory issues. This helps improve risk assessment,” he added.

This is especially critical when accurate data is limited, costly, or sensitive. A practical example could be creating transactional data for anti-fraud models.

GenAI models, including Generative Adversarial Networks (GANs), Variational Autoencoders (VAEs), and large language models like GPT, can generate synthetic data that mimics the statistical features of real-world datasets. 
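
As a rough, hypothetical illustration of that idea (and nothing like a production GAN, VAE, or LLM pipeline), the Python sketch below samples synthetic “transactions” that only match simple per-column statistics of a toy dataset; the column names, distributions, and fraud rate are assumptions invented for the example.

import numpy as np
import pandas as pd

rng = np.random.default_rng(42)

# Toy "real" transactions standing in for sensitive production data.
real = pd.DataFrame({
    "amount": rng.lognormal(mean=3.0, sigma=1.0, size=1_000),
    "hour": rng.integers(0, 24, size=1_000),
    "is_fraud": rng.random(1_000) < 0.02,
})

def synthesize(df: pd.DataFrame, n: int) -> pd.DataFrame:
    """Sample synthetic rows that match simple per-column statistics (marginals only)."""
    log_amount = np.log(df["amount"])
    return pd.DataFrame({
        "amount": rng.lognormal(mean=log_amount.mean(), sigma=log_amount.std(), size=n),
        "hour": rng.choice(df["hour"].to_numpy(), size=n, replace=True),
        "is_fraud": rng.random(n) < df["is_fraud"].mean(),
    })

synthetic = synthesize(real, 5_000)
print(synthetic.describe(include="all"))

Real generative models capture joint distributions and rare patterns; this toy preserves only marginals, which is precisely why GANs, VAEs, and LLM-based generators are used instead.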

For example, Capital One and JPMorgan Chase use GenAI to strengthen their fraud and suspicious-activity detection systems. Morgan Stanley has implemented an AI tool that helps its financial advisors find data anomalies when detecting fraud, and Goldman Sachs uses GenAI to develop internal software. Meanwhile, customers globally can benefit from 24/7 chatbots that handle and resolve concerns, assist with banking issues, and expedite financial transactions.

A recent study showed that banks that move quickly to scale generative AI across their organisations could increase their revenues by up to 600 basis points (6%) within three years.

“Of course, the highly regulated nature of banking/finance requires careful model governance, oversight, explainability and integration into existing systems,” Naik concluded.

Is Data Annotation Dying? https://analyticsindiamag.com/ai-features/is-data-annotation-dying/ Tue, 25 Jun 2024 12:04:40 +0000 AI is coming for data annotation jobs. How ironic.

Voxel51 co-founder and University of Michigan robotics professor Jason Corso recently published an article arguing that data annotation is dead, which sparked discussions on LinkedIn and X alike.

One might have assumed that the wave of generative AI would make data annotation jobs even more abundant. Yet that very wave is the reason these jobs are slowly becoming obsolete. 

Industry leaders specialising in computer vision solutions stress that, despite advancements in AI, meticulously curated, high-quality annotated image datasets remain essential. These datasets are critical as operations scale and diversify, and the notion that untested technologies could disrupt established workflows is not only impractical but potentially harmful. 

Moreover, human-created datasets are proving even more relevant in fields beyond computer vision, extending to generative AI and multimodal workflows. There have been several reports about companies such as OpenAI, Amazon, and Google acquiring cheap labour in countries such as India or even Kenya for labelling and annotating data for training AI models. 

In India, companies such as NextWealth, Karya, Appen, Scale AI, and Labelbox are creating jobs within the country, specifically in rural areas, for data annotation. When speaking with AIM, NextWealth MD and founder Sridhar Mitta said, “The beauty of GenAI is that it allows people from remote areas to do these tasks.” 

So, are these companies about to slowly die?

Not So Dead After All

Human annotation has played a pivotal role in the AI boom, providing the foundation for supervised machine learning. The process involves manually labelling raw data to train machine learning algorithms in pattern recognition and predictive tasks. 

While labour-intensive, this approach ensures the creation of reliable and accurate datasets. It turns out that the need for human-generated datasets is even more crucial now than ever before. 

The most plausible disruption comes from weak supervision and self-supervised learning. Tilmann Bruckhaus, an engineering leader in AI, said, “These techniques reduce the need for manual labelling by using noisy or automatically-generated labels (weak supervision) or enabling models to learn from unlabeled data (self-supervision).”
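
As a minimal sketch of the weak-supervision idea Bruckhaus describes, the hypothetical example below combines a few hand-written heuristic labelling functions by majority vote; the functions, labels, and sample texts are invented for illustration and are far cruder than purpose-built weak-supervision frameworks.

from collections import Counter
from typing import Optional

# Hypothetical labelling functions: cheap heuristics standing in for human annotators.
def lf_refund(text: str) -> Optional[str]:
    return "complaint" if "refund" in text.lower() else None

def lf_exclamations(text: str) -> Optional[str]:
    return "complaint" if text.count("!") >= 2 else None

def lf_thanks(text: str) -> Optional[str]:
    return "praise" if "thank" in text.lower() else None

LABELLING_FUNCTIONS = [lf_refund, lf_exclamations, lf_thanks]

def weak_label(text: str) -> Optional[str]:
    """Combine noisy heuristic votes by majority; abstain when nothing fires."""
    votes = [vote for lf in LABELLING_FUNCTIONS if (vote := lf(text)) is not None]
    return Counter(votes).most_common(1)[0][0] if votes else None

docs = [
    "I want a refund now!!",
    "Thank you for the quick fix.",
    "The app keeps crashing.",
]
print([weak_label(d) for d in docs])  # ['complaint', 'praise', None]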

Corso believes that human annotation will be needed for gold-standard evaluation datasets, which will also be combined with auto-annotation in the future. 

This process involves using AI models to automatically label data. While this approach shows promise, its applicability is limited. Auto-annotation is most useful in scenarios where the model’s performance needs to be adapted to new environments or specific tasks. However, for general applications, a reliance on auto-annotation remains impractical.
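
A minimal sketch of what auto-annotation (pre-labelling) can look like in practice is shown below, assuming the ultralytics package and a pretrained YOLOv8 checkpoint are available; the image path and confidence threshold are placeholder assumptions, and a real pipeline would route these machine-generated labels to human reviewers.

from ultralytics import YOLO  # assumes the ultralytics package is installed

model = YOLO("yolov8n.pt")  # small pretrained detector; weights download on first use

# Pre-label an image: keep only reasonably confident detections for later human review.
results = model("example.jpg", conf=0.5)  # placeholder image path
for result in results:
    for box in result.boxes:
        label = model.names[int(box.cls[0])]
        x1, y1, x2, y2 = [round(v, 1) for v in box.xyxy[0].tolist()]
        print(label, round(float(box.conf[0]), 2), (x1, y1, x2, y2))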

Adding to this, current AI models increasingly rely on synthetic data. SkyEngine AI CEO Bartek Włodarczyk said that with synthetic data, “one does not have to worry about data labelling as any masks can be instantly created along with data.”

Dangerous Times Ahead?

Even if human annotation remains the gold standard in the future, companies that fail to adapt and thrive during the current boom will face dangerous times ahead. People For AI founder and data labelling director Matthieu Warnier said, “As labelling tasks become more automated, the ones that remain are notably more complex. Selecting the right labelling partner has become even more crucial.”

This was also reflected by Hugging Face co-founder and CSO Thomas Wolf. “It’s much easier to quickly spin and iterate on a pay-by-usage API than to hire and manage annotators. With model performance strongly improving and the privacy guarantee of open models, it will be harder and harder to justify making complex annotation contracts,” said Wolf, further stating that these will be some dangerous times for data annotation companies. 

It seems that manual data annotation might take a backseat in labelling data for AI training, with models such as YOLOv8 or SAM-based tools like Unitlab able to annotate almost anything with little need for human intervention. 

On the other hand, manual data annotation will remain a premium service, but the volumes are expected to drop. Companies that employ workers in different parts of the world to create high-quality datasets will have to cut costs soon. 

So, the data annotation market might see major shifts as it adapts to the changing landscape. While its size is set to shrink, manual data annotation companies will be the ones setting the gold standard, making themselves the benchmark for the automated data annotation market.
