Indian IT GenAI Hangover Begins
India’s homegrown IT giant Infosys is leading the pack with a series of generative AI announcements made in the past few days. The company recently partnered with Sarvam AI to develop specialised small language models (SLMs) using NVIDIA’s AI stack. This collaboration will enhance Infosys Topaz BankingSLM and ITOps SLM, tailored for banking and IT operations.
At the NVIDIA AI Summit India, held in Mumbai last week, Infosys announced its partnership with the chipmaker, which recently beat Apple to become the world’s most valuable company. Infosys is incorporating NVIDIA AI Enterprise into its Infosys Topaz suite to enable businesses to rapidly implement and integrate generative AI into their workflows.
In addition, Infosys launched an NVIDIA Center of Excellence dedicated to employee reskilling, solution development, and the widespread adoption of NVIDIA technology across organisations.
There’s more: At Meta’s Build with AI Summit in Bengaluru last week, Infosys announced a partnership with Meta to utilise the Llama stack, a collection of open-source large language models and tools, to build AI solutions across industries.
As an early adopter of Llama 3.1 and 3.2 models, Infosys is integrating these models with Infosys Topaz, the in-house AI platform, to create tools that deliver business value. One example is a document assistant powered by Llama that improves the efficiency of contract reviews.
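To give a sense of what such a Llama-powered document assistant can look like in practice, here is a minimal sketch using the Hugging Face Transformers chat pipeline. The model ID, prompt wording, and helper function are illustrative assumptions, not Infosys’ actual implementation.

```python
# A minimal sketch of a Llama-based contract-review assistant, assuming a recent
# version of Hugging Face Transformers and access to a gated Llama 3.1 checkpoint.
# Model, prompts, and function names are illustrative, not Infosys' actual setup.
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="meta-llama/Llama-3.1-8B-Instruct",  # gated repo; requires access approval
    device_map="auto",
)

def review_clause(clause: str) -> str:
    """Ask the model to summarise one contract clause and flag unusual terms."""
    messages = [
        {"role": "system", "content": "You are a contract-review assistant. "
                                      "Summarise the clause and flag unusual or risky terms."},
        {"role": "user", "content": clause},
    ]
    out = generator(messages, max_new_tokens=256)
    # With chat-format input, the pipeline returns the conversation with the
    # assistant's reply appended as the last message.
    return out[0]["generated_text"][-1]["content"]

print(review_clause("The supplier may terminate this agreement at any time without notice."))
```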
“We are building industry-wide solutions. For market research, we have built one proof-of-concept (POC), and internally we are leveraging many Llama models for our AI-first journey,” an Infosys representative told AIM. The representative added that Infosys has several other use cases, including document summarisation, some of which are already in production.
This comes after Infosys revealed that it is working on small language models for its clients’ various applications. During the recent earnings call, Infosys CEO and MD Salil Parekh said, “It’s an incredible approach that leverages various open-source components along with a narrow set of industry data and Infosys’ proprietary dataset.”
Interestingly, in March, Yann LeCun, the chief of Meta AI, disclosed that he had met an Infosys co-founder who was funding a project based on Llama 2, Meta’s open-source model, to make it recognise all 22 official Indian languages.
Mirroring the NVIDIA Center of Excellence, Infosys has also strengthened its partnership with Meta by establishing a Meta Center of Excellence (CoE) to accelerate the integration of enterprise AI and promote contributions to open-source communities. The CoE will cultivate expertise in the Llama stack, develop industry-specific applications, and facilitate customer adoption of generative AI.
Infosys vs the world
NVIDIA is also supporting other Indian IT companies besides Infosys. Tata Consultancy Services (TCS) is creating AI solutions using NVIDIA’s NIM Agent Blueprints for sectors including telecommunications, retail, manufacturing, automotive, and financial services.
Its offerings feature NeMo-powered, domain-specific language models that can handle customer inquiries and respond to company-specific questions across all enterprise functions, such as IT, HR, and field operations.
In Q2 FY25, TCS reported over 600 GenAI engagements, a significant increase from nearly 270 last quarter. “Last quarter, we had eight engagements that went into production. This quarter, we have almost 86 engagements going into production,” shared TCS chief K. Krithivasan, noting that it’s a sign of maturity.
Wipro, on the other hand, with its AI-powered consulting and extensive employee reskilling efforts, is looking to build an “AI-powered Wipro” that drives efficiency and transformation. “Net-net, I think GenAI will be positive for us and for the industry,” said Srini Pallia, CEO and MD at Wipro, adding that they are investing big into GenAI.
“We have now trained and certified over 44,000 employees on advanced AI, and we also have a significant number of employees actively using AI developer tools across the company for all our clients,” said Pallia.
Wipro is applying NVIDIA AI Enterprise software, which includes NIM Agent Blueprints and NeMo, to assist businesses in crafting custom conversational AI solutions, such as digital humans, for customer service engagement.
Meanwhile, Tech Mahindra recently launched Indus 2, a Hindi-centric AI model powered by Nemotron-4-Hindi 4B and targeted at local language engagement. The company has reskilled 45,000 employees to support its AI roadmap through an internal proficiency framework.
Enjoy the full story here.
Meta’s Llama 3.1 is the Missing Link for Indic Datasets
Meta’s Llama 3.1, with its vast 405B parameters, is currently being used to generate synthetic Indic data and enable cost-effective, powerful language models that bridge the country’s critical dataset gap. These models empower startups like Sarvam AI and scale across Meta’s platforms—WhatsApp, Instagram, Facebook, and Threads—for impactful, region-specific AI solutions.
Speaking at India’s biggest AI summit, Cypher 2024, Vivek Raghavan, the chief of Sarvam AI, revealed that they used Llama 3.1 405B to build Sarvam 2B. He explained that it is a 2-billion-parameter model trained on 4 trillion tokens, of which 2 trillion are Indian language tokens.
“If you look at the 100 billion tokens in Indian languages, we used a clever method to create synthetic data for building these models using Llama 3.1 405B. We trained the model on 1,024 NVIDIA H100s in India, and it took only 15 days,” said Raghavan. Read on.
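To illustrate the general idea Raghavan describes, below is a minimal sketch of synthetic data generation: a large “teacher” model (Llama 3.1 405B, served behind any OpenAI-compatible endpoint) is prompted to produce Indian-language instruction-response pairs that can later train a small “student” model. The endpoint URL, model name, and prompt format are placeholders; Sarvam AI’s actual pipeline has not been published in this detail.

```python
# A minimal sketch of synthetic Indic data generation with a large teacher model.
# The endpoint, model name, and prompt format are placeholders, not Sarvam AI's
# actual pipeline; any OpenAI-compatible server hosting Llama 3.1 405B would do.
from openai import OpenAI

client = OpenAI(base_url="https://your-llama-endpoint/v1", api_key="YOUR_KEY")  # placeholder

def synth_pair(topic: str, language: str = "Hindi") -> dict:
    """Generate one instruction-response training pair in the target Indian language."""
    resp = client.chat.completions.create(
        model="llama-3.1-405b-instruct",  # placeholder name on the serving stack
        messages=[
            {"role": "system", "content": f"Write an instruction and its answer in {language}."},
            {"role": "user", "content": f"Topic: {topic}. Return as 'INSTRUCTION: ...\\nRESPONSE: ...'"},
        ],
        temperature=0.9,  # diversity matters more than determinism for synthetic data
    )
    text = resp.choices[0].message.content
    instruction, _, response = text.partition("RESPONSE:")
    return {
        "instruction": instruction.replace("INSTRUCTION:", "").strip(),
        "response": response.strip(),
    }

# Thousands of such pairs, filtered for quality, could then feed the training of a
# ~2B-parameter student model along the lines of Sarvam 2B.
```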
AI Bytes >>