AI should be used for solving societal problems

Nandan Nilekani at People +AI

The post-ChatGPT world is in awe of Artificial Intelligence. This awe is akin to a society bedazzled by powerful arc lights, as if at a magic show. The magic of AI is everywhere and in almost everything at the same time. Suddenly governments, bureaucrats, technocrats, technologists, and even the social sector are talking about the magic that can solve everything, or ruin everything.

If there is hype, there is money to be made, and VC funds, as always, are sniffing around for the next big thing that will disrupt big tech. There is nothing wrong with following the money; it is a trait that creates wonderful entrepreneurs and all their shiny apps. It also lets the everyday salaried worker feed a family.

But some technologies are about more than just making money; there is more that can be done. There are a few people who are not dazzled, a few who are not driven by lucre alone, and a few who can see the potential for a Super Intelligence to solve society's most wicked problems.

Magically conjuring lots of money is one thing; magically solving the most vexing problems is a bigger thing. The problems of poverty, agriculture, livelihoods, and education outcomes can be the focus of AI engineers. But these are not the subjects being discussed in the conference rooms of big tech companies. The large tech companies have already gone into a huddle; they have stopped publishing new research.

A small cohort of committed Indians who reinvented open source to solve at scale are trying to bend the arc again. So on a dry, hot Saturday afternoon in Bengaluru, on April 1st, 2023, they got together under the arches of the Leela hotel. The day and time are important, as the debate on AI has to shift from amazement to purpose. Nandan Nilekani, Pramod Varma, Shankar Maruwada, Vivek Raghavan and many others from EkStep Foundation brought together a cohort of engineers, bureaucrats, and policy researchers to frame problem sets, suggest possible AI-based solutions, and prioritize the ones with the maximum impact. Yes, AI's impact at a population scale. Not to write a better email or get a summary of search results, but to help a student answer a science question in Kannada, not with a pat answer but with a nudge that guides him towards the answer without giving him one.

Nandan Nilekani set the tone for how AI can and should be leveraged for the public good, and why it is important for it to become a community effort, because only then can solutions at population scale be created. He pointed out that in the process of creating numerous digital public goods (DPGs), there is now a playbook of sorts for what is needed to build a solution that can finally create impact. For instance, improving student learning outcomes or their ability to practice mathematics, which the ASER report consistently shows to be poor.

Pramod Varma pointed out that it is the role of AI to take on unsolved societal problems and apply its algorithms to them. This will not happen by itself; engineers will have to commit time and effort to create a system to do it. The People+AI program is designed to serve as a community that enables just this to happen.

How difficult is it to train a Super Intelligence to solve, say, the problem of customizing advisories for each and every farmer in the country? Customizing the advisory for his crop, seed, and fertilizer based on the climatic, water, carbon, nutrient, and soil conditions of his farm. The permutations of these variables can run into the millions, and it needs AI along with computational power to provide this solution in the native language of the farmer. This is the problem that has to be solved, and AI can curate the solution for exactly this problem in multiple Indian languages, as both voice and text output. Google has been able to capture some of these elements of farms, as is visible from the picture below.



Of course, the gap here is the data, or the large language model, to train AI to customize the solution for each individual farm. But as the Google slide above shows, it has been able to digitize soil data and, hopefully, over time this data, once combined with Google's geo-landscape data, will get better. The crucial question is where this data will reside: will big tech simply scrape and scramble it for its own use, or will there be some method to it? And more importantly, where will the algorithm that delivers the advisory reside?



This is one of the areas where the two worlds diverge. The People+AI cohort in Bengaluru is thinking about very different problems, while big tech is still thinking about how AI can be used to squeeze more attention from netizens.


AI is only as good as the data it gets. Pramod Varma and Nandan Nilekani both stressed the importance of India being a data-rich country. This has happened because India has taken the lead in creating digital public goods, which has driven digital transformation right down to the last person in the chain. Antyodaya in India has happened through the digital public goods route.

AI leaps across the language barrier

There is a severe divide between Indian language speakers and learners, and English. Most technology development has English as the default language for interface, information, and communication. This limits the use of the internet and its applications to a limited set of English-speaking netizens.

This language barrier creates bias and division in all AI applications, tools, and technologies. Interestingly, the solution also lies with machine learning, and big tech has been looking to bridge this gap.



Google's Manish Gupta talked about MorNI and MuRIL (Multilingual Representations for Indian Languages), which Google has been working on to bridge the Indian language divide. MuRIL, as explained by Google, is intended to be used for a variety of downstream NLP tasks for Indian languages. The model is also trained on transliterated data, a phenomenon commonly observed in the Indian context. It is not expected to perform well on languages other than the ones used in pretraining, i.e. 17 Indian languages.



Project Vaani will be implemented jointly by the Indian Institute of Science (IISc) and Google to gather speech data from across India for the creation of an AI-based language model that can understand diverse Indian languages and dialects. Pramod Varma, architect of Aadhaar and UPI, explained that AI4Bharat has already created Vaani and Bhashini as DPGs to solve the language barrier.

The language barrier is not just a divide in learning; it is also an access issue. Not everyone can type text into small smartphones and access internet-based applications. Voice-based access in multiple Indian languages can democratize access across the country.

Access is not the only crucial question for learners interacting with AI. They do not need pat answers or ready-typed essays; AI has to nudge them towards a solution without providing one. Learning can be customized to each individual's learning curve, recognizing and adapting to the need. Whether one is a visual learner or a rote learner, AI can certainly help in customizing the lesson accordingly.

There is also the issue of language itself not being easily accessible. For instance, the legalese that lawyers, judges, and even regulators or bureaucrats adopt can be difficult to understand. While translation is possible, it is not enough to explain the issue. For instance, in the case of a property dispute, what part of the law will be applicable, and can an AI engine render that in an Indian language so that even a so-called illiterate person can understand it? This seems possible with the capability of AI and its large language models, akin to magic.



The magic can also be applied to make natural language usable for programming. Arun Singh, a visionary analyst, pointed out that with the integration of Wolfram with ChatGPT, this is now closer to reality. In a way, it will be possible for a liberal arts student to create a program in plain English without knowing any programming language. To know more about how this is possible, listen to this interview by Wolfram. There is also a language barrier between computer engineers and non-engineers: the ability to write in a programming language is limited to engineers. As a result, computing power and the advance of digital technology cannot be fully leveraged by non-engineers. The only way to remove, or at least reduce, the inherent bias of programmers weighing heavily on AI is for programming to become available to all. And in a way, this is the magic of AI.


The magic of AI will disrupt many industries and sectors in a data-rich country like India. It is crucial that it is not openly hawked or harnessed to create AI engines that will make Indians unemployable. AI4India also needs a policy framework that addresses this challenge.


–ends

An abridged version of this article appeared here https://www.cdotrends.com/story/18049/can-we-train-ai-higher-purpose
