Your little AI Cheatsheet
More than sixty years after the term was first proposed in 1955, Artificial Intelligence is once again in the limelight. We celebrate the strides being made and hold sensationalized conversations about how far AI will go. Traces of AI’s growing influence are evident in achievements such as autonomous vehicles, but how far along is AI today?
The term’s resurgence in recent years is due to the extraordinary progress made in machine learning and the explosion of data in a growing climate of connectivity enabled by the internet. The convergence of the two has led to unprecedented developments in the field.
AI today is best described as Weak AI, which is limited to narrow, specified tasks. Although there have been notable breakthroughs in the learning capabilities of AI technology in areas such as speech and image recognition, AI today is still “no smarter” than a 6-year-old child. Nonetheless, these advances have opened a wealth of opportunities that were unimaginable only a few years ago, making it worthwhile to look at what the popularized terminology actually means.
Algorithm: An algorithm is a step-by-step set of instructions written to perform a specified task. Algorithms serve as the building blocks of AI technology.
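To make that concrete, here is a minimal sketch of my own (not part of the original definition): a tiny algorithm that scans a list once and keeps the largest value seen.

```python
def find_max(numbers):
    """A simple algorithm: step through the list, remembering the largest value."""
    largest = numbers[0]
    for n in numbers[1:]:
        if n > largest:
            largest = n
    return largest

print(find_max([3, 1, 7, 4]))  # prints 7
```

Every AI system, however sophisticated, is ultimately built from precise step-by-step procedures like this one.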
Artificial General Intelligence (AGI): The term refers to the founding and ultimate pursuit of the field: creating technology that exhibits human-like general intelligence.
Artificial Intelligence (AI): Fundamentally, AI refers to the development of intelligent machines with the ability to think like humans. It is an umbrella term encompassing a wide variety of technology programmed with a range of human-like capabilities, including reasoning, problem-solving, decision-making, planning, perceiving, and learning from experience.
Artificial Neural Networks (ANN): ANNs are multi-layered computing systems designed to mirror the functioning of neurons in the human brain. They allow data to be analyzed through successive layers of weighted connections, enabling complex abilities associated with the human brain, such as image recognition and natural language processing.
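As an illustrative sketch (my own, with made-up weights, not from the article): the basic unit of an ANN is a "neuron" that takes a weighted sum of its inputs and squashes it through an activation function, and layers are simply neurons whose outputs feed the next set of neurons.

```python
import math

def neuron(inputs, weights, bias):
    # Weighted sum of inputs, followed by a sigmoid activation:
    # the basic unit repeated in every layer of an ANN.
    total = sum(i * w for i, w in zip(inputs, weights)) + bias
    return 1 / (1 + math.exp(-total))

# Two stacked "layers": the outputs of the first feed the second.
hidden = [neuron([0.5, 0.8], [0.4, -0.6], 0.1),
          neuron([0.5, 0.8], [0.7, 0.2], -0.3)]
output = neuron(hidden, [1.2, -0.9], 0.05)
```

In a real network the weights are not hand-picked as they are here; they are adjusted automatically during training.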
Big Data: Big data refers to the explosive growth in the volume of available data as technology permeates our lives, presenting a unique opportunity to gather, identify, and process this data for our benefit. Big data is analyzed with ever-advancing computational power to reveal meaningful insights, trends, and patterns.
Cognitive computing: Cognitive computing is the use of machine learning to simulate the human thought process.
Data Mining: The process of identifying patterns within large sets of raw data to extract useful information. Data mining has gained value with the rise of big data.
Deep Learning: Deep learning is a subset of machine learning that operates through a multi-layered system of ANNs, allowing an understanding of the complex patterns and associations within data sets.
Digital Transformation: The transformation of businesses based on opportunities that have arisen in the wake of technological advancements. This involves a thorough re-thinking of business models, operations, value creation, leadership, and culture.
Machine learning: Machine learning equips technology with the ability to learn and improve from experience over time without explicit pre-programming.
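A minimal sketch of what "learning without explicit pre-programming" means (my own example, assuming simple least-squares line fitting): the rule relating inputs to outputs is never written into the code; it is recovered from example data.

```python
def fit_line(xs, ys):
    """Learn a slope and intercept from example pairs via least squares."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
             / sum((x - mean_x) ** 2 for x in xs))
    return slope, mean_y - slope * mean_x

# The rule y = 2x is never programmed; it is learned from the examples.
slope, intercept = fit_line([1, 2, 3, 4], [2, 4, 6, 8])
```

Modern machine learning uses far richer models than a straight line, but the shift is the same: from hand-coding rules to estimating them from data.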
Narrow Artificial Intelligence: Alternatively called Weak Artificial Intelligence, it is machine intelligence limited to a single, specified task. It is a non-sentient approach to AI.
Natural Language Processing (NLP): NLP is what allows technology to process and understand human language, whether written or spoken; speech recognition is one of its best-known applications.
Supervised Learning: Supervised learning is a category of machine learning in which the algorithm is fed data that has already been pre-labeled and classified.
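As a hedged sketch of "learning from pre-labeled data" (my own toy example, using a one-nearest-neighbor rule, not anything from the article): each training example carries a label, and a new point is classified by copying the label of the closest labeled example.

```python
def nearest_neighbor(labeled, point):
    # labeled: list of (features, label) pairs supplied in advance.
    # Predict by copying the label of the closest training example.
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(labeled, key=lambda ex: dist(ex[0], point))[1]

training = [((1.0, 1.0), "cat"), ((8.0, 9.0), "dog")]
print(nearest_neighbor(training, (2.0, 1.5)))  # prints cat
```

The defining feature is that the labels ("cat", "dog") were provided up front by a human.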
Third Platform Technology: A term offered by the International Data Corporation (IDC) for the technology that will make digital transformation a reality for businesses. It comprises four Pillars (mobility, big data/analytics, cloud, and social) and six Innovation Accelerators (Internet of Things (IoT), Augmented & Virtual Reality, Cognitive/AI Systems, NextGen Security, 3D Printing, and Robotics).
Unsupervised Learning: A category of machine learning in which the algorithm is fed data that has not been pre-classified or labeled; instead, the computer is expected to meaningfully categorize the data on its own.
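To contrast with the supervised case, here is a toy sketch of my own (a bare-bones one-dimensional k-means, with naive initialization, not from the article): no labels are given, yet the algorithm groups the numbers into clusters by itself.

```python
def kmeans_1d(values, k=2, rounds=10):
    """Group unlabeled numbers into k clusters with plain k-means."""
    centers = values[:k]  # naive initialization: first k values
    clusters = []
    for _ in range(rounds):
        clusters = [[] for _ in range(k)]
        for v in values:
            # Assign each value to its nearest cluster center.
            nearest = min(range(k), key=lambda i: abs(v - centers[i]))
            clusters[nearest].append(v)
        # Move each center to the mean of its assigned values.
        centers = [sum(c) / len(c) if c else centers[i]
                   for i, c in enumerate(clusters)]
    return clusters

data = [1.0, 1.2, 0.8, 9.0, 9.5, 8.8]  # no labels provided
print(kmeans_1d(data))  # two groups: the small values and the large ones
```

Nobody told the algorithm that there were "small" and "large" values; that structure was discovered from the data alone.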
As one learns about trending topics such as Machine Learning and AI, the potential, purpose, and limitations of each concept, as well as its possible applications, become evident. Most applications today create new knowledge, giving users insight. These concepts and technologies can be harnessed as part of an ecosystem to make the leap from knowledge creation to the decision-making realm. Check out the new world of Cognitive Decision-Making and see why this business-first approach can deliver value on day one at getdiwo.com.