What I think about AI

Futurist Ray Kurzweil talks about the Law of Accelerating Returns. He says that the amount of progress achieved between 2004 and 2014 equaled the entire 20th century’s gains, and he believes the same volume of progress will happen again before 2021.

The world is becoming an intensely interesting place because of this speed of change, and technology has been the biggest driver. Every day we hear news of changes driven by advanced technologies. Artificial Intelligence, or AI, used to belong in science-fiction movies, but not any longer.

Today, AI has embedded itself into our daily lives. The Siris, Google Nows, and Cortanas of the world operate using AI. Your Netflix experience is driven by AI. Clearly, AI is here to stay, and the numbers support this: the AI market is expected to be worth $190.61 billion by 2025. But what is AI all about? This post is my attempt to understand and articulate what AI means.

Back to basics – what AI really is

I have heard people use the terms AI and Machine Learning interchangeably. However, as similar as they might sound, they are not the same. As I understand it, AI, put simplistically, is a computer program that is able to ‘think’ for itself without being explicitly programmed. (My previous attempt at demystifying Machine Learning is here for anyone interested.) AI systems employ deep learning algorithms that learn by acquiring information and rules from data. They use the information at hand, apply reasoning to reach approximate or definite conclusions, and enable self-correction. In a nutshell, AI is the simulation of human intelligence processes by computer systems or machines.

AI is usually categorized as strong or weak. Virtual personal assistants such as Cortana and Siri are good examples of weak AI. Strong AI, also called Artificial General Intelligence, possesses generalized human cognitive abilities; such systems can find solutions to unfamiliar tasks without human intervention, such as finding a face in a large mosaic.

The basic components of AI

Have you heard of NLP, or Natural Language Processing? Or Machine Learning? Or Predictive Analytics? These are some of the technologies that allow computer systems to understand language, learn, and make predictions. Here are the technologies that form the core of AI:

Machine Learning

ML gives computer systems the capability to learn and improve on their own without explicit programming. ML algorithms analyze data in depth and base their predictions on it. Netflix recommendations are an everyday example of ML.
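
To make that concrete, here is a minimal sketch in Python (using scikit-learn) of what "learning from data instead of explicit rules" looks like. The data set is synthetic and the numbers are arbitrary; a real recommender like Netflix's would obviously train on actual viewing history.

```python
# A toy classifier that learns patterns from data rather than from
# hand-written rules (synthetic data stands in for real examples).
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# 1,000 synthetic examples with 10 numeric features each.
X, y = make_classification(n_samples=1000, n_features=10, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42)

model = RandomForestClassifier(n_estimators=100, random_state=42)
model.fit(X_train, y_train)  # the "learning" step: patterns come from the data
print("held-out accuracy:", model.score(X_test, y_test))
```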

Deep Learning

A subset of Machine Learning, Deep Learning employs artificial neural networks that learn by processing data. These networks can have many layers that work together to turn many inputs into a single output. Deep Learning enables machines to teach themselves through constant processing and reinforcement of what they learn. Speech recognition, such as talking to a voice assistant like Siri, is an application of Deep Learning.
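
To illustrate the "many layers, single output" idea, here is a rough Keras sketch; the layer sizes and the random stand-in data are assumptions for demonstration, not any specific production model.

```python
# A small feed-forward network: many inputs, several layers, one output.
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

model = keras.Sequential([
    layers.Input(shape=(20,)),              # 20 input features (arbitrary)
    layers.Dense(64, activation="relu"),    # layers work together...
    layers.Dense(32, activation="relu"),
    layers.Dense(1, activation="sigmoid"),  # ...to determine a single output
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

# Random data stands in for a real dataset.
X = np.random.rand(500, 20)
y = np.random.randint(0, 2, size=500)
model.fit(X, y, epochs=3, batch_size=32, verbose=0)
```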

Neural Networks

The enablers of Deep Learning, neural networks are modeled on the neural connections of the human brain. Much like how bundles of neurons form neural networks in the brain, stacks of perceptrons (this is getting complicated now!) form neural networks in computer systems. Neural networks process data many times over to find associations and give meaning to previously undefined data.
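
For anyone curious what a perceptron actually computes, here is a tiny Python/NumPy sketch of one perceptron and a small "stack" of them; the weights and inputs are made-up numbers, just to show the weighted-sum-plus-threshold mechanics.

```python
import numpy as np

def perceptron(inputs, weights, bias):
    # Weighted sum of the inputs plus a bias, then a simple "fires or not" threshold.
    return 1 if np.dot(inputs, weights) + bias > 0 else 0

# A tiny "stack": two perceptrons looking at the same three inputs,
# each with its own weights, loosely mirroring a bundle of neurons.
x = np.array([0.5, 0.8, 0.1])
layer_weights = [np.array([0.4, -0.6, 0.9]), np.array([-0.2, 0.7, 0.3])]
layer_biases = [0.1, -0.05]
print([perceptron(x, w, b) for w, b in zip(layer_weights, layer_biases)])
```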

Cognitive Computing

The purpose of Cognitive Computing is to imitate and improve interactions between machines and humans. It aims to recreate the human thought process in a computer model by understanding human language and the meaning of the images at hand.

Natural Language Processing (NLP)

Natural Language Processing (NLP) helps computers understand human language and speech, enabling seamless interactions between humans and machines. A good example of this is Skype Translator, which interprets speech across multiple languages in real time.
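
A full speech translator such as Skype Translator also involves speech recognition and synthesis, but the text-translation core of NLP can be sketched in a few lines with the Hugging Face transformers library; the choice of the small t5-small model here is purely illustrative.

```python
# Text-to-text translation as a minimal stand-in for the NLP layer
# of a tool like Skype Translator (model choice is illustrative).
from transformers import pipeline

translator = pipeline("translation_en_to_fr", model="t5-small")
result = translator("Artificial intelligence is changing how we work.")
print(result[0]["translation_text"])
```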

Computer Vision

Computer Vision employs deep learning and pattern recognition to understand and interpret the contents of images and other visual data. This can include pictures in PDF documents, graphs, and tables, along with text and video. Computer Vision is being used quite extensively in the healthcare sector, especially in research and development.
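
As a hedged sketch of the idea, the snippet below classifies a single image with a pretrained network in Keras. The file name photo.jpg is a placeholder, and a healthcare application would use a model trained on domain-specific imagery rather than the general-purpose ImageNet weights used here.

```python
# Classify one image with a pretrained MobileNetV2 ("photo.jpg" is a placeholder).
import numpy as np
from tensorflow.keras.applications.mobilenet_v2 import (
    MobileNetV2, decode_predictions, preprocess_input)
from tensorflow.keras.preprocessing import image

model = MobileNetV2(weights="imagenet")

img = image.load_img("photo.jpg", target_size=(224, 224))
x = preprocess_input(np.expand_dims(image.img_to_array(img), axis=0))
preds = model.predict(x)
print(decode_predictions(preds, top=3)[0])  # top-3 labels with confidence scores
```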

Most industries today are looking at AI to get data-driven solutions to pressing business challenges.

So, how does AI deliver a business impact?

Automates repetitive work

Have you heard of Robotic Process Automation? It could be called an early form of AI at work (early, because its applicability is limited to rules-based processes). Instead of simply automating manual tasks, AI drives automation that performs frequent, high-volume tasks reliably, even when they are not completely manual.

It brings ‘Intelligence’

AI brings with it the capability to process vast volumes of data. Plugging AI technology into products or processes improves the product’s capabilities or the process’s effectiveness, because it is now powered by data.

It is adaptive

Owing to the self-learning capabilities of AI algorithms, the algorithm itself becomes the predictor. These models adapt automatically when given new data and can deliver immense business benefits by capably processing the data at hand.

Higher data analysis capabilities

Since AI employs neural networks, it can analyze larger volumes of data and find deep correlations between large data sets, often faster than traditional approaches. And the more data you feed these neural networks and deep learning algorithms, the stronger and more robust they become. It is because of this technology that building a robust fraud detection system for banks with five hidden layers is possible today.
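
To give a feel for what "five hidden layers" means in practice, here is an illustrative Keras sketch of such a network making a binary fraud/not-fraud decision. The layer widths and the assumption of 30 transaction features are mine for illustration, not a description of any real banking system.

```python
from tensorflow import keras
from tensorflow.keras import layers

fraud_model = keras.Sequential([
    layers.Input(shape=(30,)),             # e.g. 30 transaction features (assumed)
    layers.Dense(128, activation="relu"),  # hidden layer 1
    layers.Dense(64, activation="relu"),   # hidden layer 2
    layers.Dense(32, activation="relu"),   # hidden layer 3
    layers.Dense(16, activation="relu"),   # hidden layer 4
    layers.Dense(8, activation="relu"),    # hidden layer 5
    layers.Dense(1, activation="sigmoid"), # probability that a transaction is fraudulent
])
fraud_model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
```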

Greater accuracy

AI brings greater accuracy to data analysis through deep learning and neural networks. Using these, AI can now find cancers on MRIs with statistically greater accuracy than a trained radiologist.

Makes data intellectual property

Since AI algorithms are self-learning, the data itself becomes intellectual property almost automatically. The answers to countless questions lie hidden in this data. With AI, we don’t need to do much beyond applying it to an analytics program; this alone can take data from good to great and helps create a strong competitive edge. This is why there is so much discussion now about “who owns the data?” at Google, Facebook, and other such “free” services.

Clearly, AI has many advantages. However, it is still an expensive technology with its own set of systems and programming demands. Then there are the challenges of obtaining data sets that are comprehensive and correct enough to serve as training data.

Additionally, we are yet to achieve proficiency in ‘transfer learning’, a technique in which an AI model trained on a generalized task is then applied to a similar but distinct activity.
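
In code, transfer learning typically looks like the rough sketch below: take a network pretrained on a general task, freeze it, and train only a small new head for the related task. This is a generic Keras illustration; the two-class head and the image input shape are assumptions.

```python
from tensorflow import keras
from tensorflow.keras import layers

# A base network pretrained on a generalized task (ImageNet classification).
base = keras.applications.MobileNetV2(
    weights="imagenet", include_top=False, input_shape=(224, 224, 3))
base.trainable = False                      # keep the general features as-is

# A small new head trained for the similar-but-distinct task.
transfer_model = keras.Sequential([
    base,
    layers.GlobalAveragePooling2D(),
    layers.Dense(2, activation="softmax"),  # e.g. two new classes (assumed)
])
transfer_model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
```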

From what I’ve studied, I think that AI will very soon make itself an integral part of our ecosystem and of the way we live and work. It will definitely impact the way we work; it is already impacting the way we do everyday things. As a technologist, I can also see AI working its way into the software development and testing landscape, and I am keenly observing its growth to see what comes next.





Andrew Rowe

Community service leader, mentor, investor and global adventurer

5y

Good post, Ashwin. We are deploying these technologies at our company and are excited about how they will improve quality, performance, and scalability and help us grow.

Daljeet Gora

Program Test Manager | Testing Lifecycle Optimization | Retail Banking | Data, Insights & Analytics

5y

I echo the words of Hemant.

Fantastic post, Ashwin. Thanks for jotting down the ABCs of it; a good guide for anyone who would like to walk this path, whether for tourism or for serious learning.

Dhaval Mandalia

Empowering companies with AI, Quantum Solutions, Data & Cloud Engineering to reach their full potential. | Founder/Director @ Arocom | Certified Data Scientist | T1D Warrior | Volunteer - Diabetes Awareness Programs

5y

Very nice article with an intuitive flow. I would like to mention that applications of AI have the potential to contribute, and in some cases already have contributed, beyond what is mentioned here. That's the beauty of it: it grows by leaps and bounds. Do you believe, then, that it will reach the point of singularity within the next decade?
