Yann LeCun ❤️ India | Jensen Huang ❤️ India
“I actually want to take a selfie with you,” Yann LeCun said excitedly when two AI startup founders, who are building on top of Meta’s Llama, approached him at his hotel in Bengaluru and asked to be photographed with him.
“You guys are building the future here,” the chief scientist of Meta AI said. After visiting IIT Madras and speaking with the people of AI4Bharat about their tremendous contributions to open-source AI for Indic languages, LeCun came to Bengaluru, continuing his Indian AI tour, and clicked a photo of the crowd with his Meta Ray-Ban smart glasses.
Speaking at the Meta Build with AI Summit, LeCun stressed the importance of diverse contributions in advancing AI and acknowledged the strong impact of open-source projects in India, highlighting initiatives like Sarvam AI, AI4Bharat, and more. “India, going forward, has an important role to play, not just in technology and product development, not just for local products, but for international products and research,” he said, adding that India is brimming with talent.
He drew parallels to the establishment of FAIR Paris 10 years ago. “I think we can sort of use this blueprint to accelerate the progress of AI in India even further.”
The day before, LeCun visited IIT Madras and AI4Bharat, and was impressed by projects that focus on language translation and cultural preservation, such as IndicTrans2 and others – all thanks to Llama. “AI is going to help inspire,” he said, particularly in regions with many languages and cultural complexities. He had earlier praised projects built on top of Llama, such as Kannada Llama.
LeCun envisions AI as a shared infrastructure that democratises access to knowledge. He articulated a future where “the big frontier AI systems will not be produced by smaller companies... but will be trained in a distributed fashion”. This approach would allow for localised data processing while maintaining a global consensus.
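To make the idea concrete, here is a minimal, purely illustrative sketch of federated averaging in Python – one way “distributed training with a global consensus” can work, where raw data stays local and only model weights are shared and averaged. The toy linear model and node data below are assumptions for illustration, not anything LeCun described:

```python
import numpy as np

# Illustrative federated averaging: each "region" trains on its own data
# and only shares model weights, which a coordinator averages globally.

def local_update(weights, local_data, lr=0.1):
    """One gradient step on a node's private data (toy linear regression)."""
    X, y = local_data
    grad = X.T @ (X @ weights - y) / len(y)
    return weights - lr * grad

def federated_round(global_weights, nodes):
    """Each node updates locally; averaging the results forms the 'consensus'."""
    updated = [local_update(global_weights.copy(), data) for data in nodes]
    return np.mean(updated, axis=0)

# Toy data for three regions; the raw data never leaves its node.
rng = np.random.default_rng(0)
nodes = [(rng.normal(size=(100, 4)), rng.normal(size=100)) for _ in range(3)]

weights = np.zeros(4)
for _ in range(20):
    weights = federated_round(weights, nodes)
print("consensus weights:", weights)
```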
“Open source technology is crucial today but will become even more so in the future. This is because AI is increasingly becoming a shared infrastructure that can be used globally.” LeCun added that for AI to reach its full potential, it needs to serve as a repository of human knowledge. While we’re attempting to achieve this now, we’re limited by a lack of diverse data in terms of languages, cultures, values, and interests.
Local and Global
Centralising all this data isn’t feasible, nor would it fully represent the vast range of human knowledge. LeCun underscored the importance of collaboration among governments, industries, and researchers to establish the necessary infrastructure for this vision. “There’s a lot of work to do,” he acknowledged, stressing the need to empower individuals, especially in underserved areas.
Even if we could access the necessary data, fine-tuning these AI systems would require input from people with diverse cultural and linguistic backgrounds. No single organisation or country can do this alone; it must be a collective global effort. In this sense, AI development needs to be transparent and decentralised.
While speaking on a panel with Infosys co-founder Nandan Nilekani and People+ai head Tanuj Bhojwani, LeCun shared the idea of building India into the AI use case capital of the world. Nilekani said, “We think that we can build on top of digital public infrastructure.” He said this provides the foundation to move to AI much more rapidly, a concept he previously referred to as “DPI to the power of AI”.
India will need to invest in infrastructure, expertise, and policies that support such a decentralised approach to AI. Rather than seeing it as a threat to sovereignty, this would enable true global sovereignty over knowledge and technology.
“I see a future where we’ll all have smart glasses or devices that allow us to interact with AI assistants,” said LeCun. These systems could eventually be smarter than us, but LeCun said that we shouldn’t see that as threatening. Instead, it’s empowering, like having a team of experts at our disposal.
Everyone, not just people in tech or academia, should have access to these tools. “Imagine rural communities in India being able to ask questions in their own languages, improving sectors like agriculture and healthcare. This kind of future would bring significant positive changes,” added LeCun.
LLMs are Not All We Need
To reach the next level of AI – what LeCun calls advanced machine intelligence (AMI) – we need systems that can truly help people in their daily lives. This involves developing systems that understand cause and effect and can model the physical world.
LeCun used the example of a child who can figure out how to load a dishwasher on their first attempt – something no AI has yet been able to achieve. We need systems that can learn and adapt to new tasks with common sense.
“We need to train AI to understand the world by observing it, just like humans do. This requires new architectures that move beyond today’s models.” LeCun emphasised focusing on objective-driven AI instead of prompt-based models, as such systems would be able to solve problems that current AI models can’t.
“These systems will be able to plan actions and predict outcomes, and they’ll do so while adhering to built-in safety measures, or ‘guardrails’. This approach not only makes AI more powerful but also ensures it operates within safe boundaries,” said LeCun.
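To illustrate the shape of that idea, the toy Python sketch below plans by predicting outcomes with a stand-in world model and rejecting actions whose predicted states violate a guardrail. Every function here is an illustrative assumption, not LeCun’s actual architecture:

```python
import numpy as np

def world_model(state, action):
    """Toy stand-in for a learned model that predicts the next state."""
    return state + action

def task_cost(state, goal):
    """How far the predicted state is from the objective."""
    return float(np.sum((state - goal) ** 2))

def guardrail_cost(state):
    """Heavily penalise predicted states outside a safe region (the 'guardrail')."""
    return 1e6 if np.any(np.abs(state) > 10.0) else 0.0

def plan(state, goal, candidates):
    """Pick the action whose predicted outcome minimises task + guardrail cost."""
    outcomes = [world_model(state, a) for a in candidates]
    costs = [task_cost(s, goal) + guardrail_cost(s) for s in outcomes]
    return candidates[int(np.argmin(costs))]

state, goal = np.zeros(2), np.array([3.0, -2.0])
candidates = [np.array(a) for a in ([1.0, 0.0], [0.0, -1.0], [20.0, 0.0])]
print("chosen action:", plan(state, goal, candidates))  # the unsafe [20, 0] is rejected
```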
Read more here.
Jensen Thinks ‘NVIDIA is AI in India’
Jensen Huang, the chief of NVIDIA, is also on his India tour. Speaking at the NVIDIA AI Summit 2024 in Mumbai, Huang highlighted his immense love for India and the country’s potential to build global AI products. Interestingly, he, too, spoke about starting local and scaling globally.
NVIDIA announced that Indian AI factories would see 20x NVIDIA GPU growth by year-end. Citing Yotta, Tata Communications, E2E Networks, and Netweb, Huang said NVIDIA has built a rich ecosystem in India in the span of just one year. “By the end of this year, we will have nearly 20 times more compute here in India than [we did] just a little over a year ago,” he added.
Reliance Industries chairman Mukesh Ambani joined Huang on the stage for a lively discussion about India’s potential in driving the future of AI.
In a light-hearted exchange, Ambani mentioned that his version of NVIDIA would be vidya – a Sanskrit word for knowledge, symbolising the Hindu goddess Saraswati – and pointed out that knowledge naturally draws Lakshmi, the goddess of wealth.
In response, Huang enthusiastically suggested that he might have chosen the perfect name for his company, given NVIDIA’s strong ties with India.
Huang highlighted India’s significance for NVIDIA, noting that the company’s name almost spells “India” minus the letter “V”. He stressed that India is central to NVIDIA’s vision as the world moves into what he calls the “intelligence age”. Huang said that today NVIDIA has 10,000 engineers in India. He also praised India’s rapid digital transformation, led by Reliance Jio, which has propelled the nation from 158th to first place in global digital connectivity in just eight years.
Huang spoke of India’s potential to shift from being a global IT back-office to a leader in next-generation AI, describing this transition as “Software 2.0”. He remarked that India’s ability to develop AI-driven software could soon position the country as an exporter of advanced AI solutions worldwide.
“Once you crack LLMs in India, you can build them anywhere in the world.” Huang specifically pointed to the complexity of Hindi, where dialects change every 50 kilometres, presenting a significant challenge for AI developers.
NVIDIA Also Loves Indian Open Source
In line with its commitment to developers, NVIDIA also launched the Nemotron-4-Mini-Hindi-4B model, a small language model for Hindi, enabling businesses to deploy AI solutions specific to local needs. The model, offered as an NVIDIA NIM microservice, can be deployed on NVIDIA GPU-accelerated systems, optimising performance for various applications.
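For readers who want to experiment, below is a minimal sketch of running the model with Hugging Face’s transformers library. The model identifier and prompt are assumptions based on NVIDIA’s naming, and the officially supported route remains the NIM microservice:

```python
# Hedged sketch: loading a small Hindi model with Hugging Face transformers.
# The model ID below is an assumption; check NVIDIA's model card for the exact name.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "nvidia/Nemotron-4-Mini-Hindi-4B-Instruct"  # assumed identifier
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

# Sample Hindi prompt: "Why is weather information important for agriculture in India?"
prompt = "भारत में कृषि के लिए मौसम की जानकारी क्यों महत्वपूर्ण है?"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```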
Tech Mahindra was the first to implement this model, creating Indus 2.0, which focuses on Hindi and its dialects. Sarvam AI has also developed Sarvam 1, India’s first multilingual language model, trained on NVIDIA H100 GPUs. This model supports English and ten major Indian languages.
Large enterprises, too, have been using NeMo – Flipkart integrates NeMo Guardrails into its conversational AI systems for enhanced safety, Krutrim is developing a multilingual foundation model using Mistral NeMo 12B, while Zoho Corporation plans to use NVIDIA TensorRT-LLM and Triton Inference Server for its language models.
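As a rough sketch of what wiring NeMo Guardrails into a conversational system involves, the snippet below uses the library’s standard Python entry points; the "./config" directory and the sample message are placeholders, not Flipkart’s actual configuration:

```python
# Minimal NeMo Guardrails usage sketch (pip install nemoguardrails).
# "./config" is assumed to contain a config.yml defining the LLM backend and
# the rails (e.g. topical or safety rules) -- placeholder only.
from nemoguardrails import LLMRails, RailsConfig

config = RailsConfig.from_path("./config")
rails = LLMRails(config)

# Responses pass through the configured guardrails before being returned.
response = rails.generate(messages=[
    {"role": "user", "content": "Where is my order?"}
])
print(response["content"])
```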
It is clear from LeCun and Huang’s appreciation for the Indian open source and infrastructure ecosystem that India is headed in the right direction for building AI.