Learning AI Series: Part II - Common AI Techniques

Introduction

In Part I: Demystifying Artificial Intelligence (AI), we described AI at its most basic level to establish a baseline understanding. Here in Part II, we'll describe several of the most common techniques used in AI. There are many different techniques, and some of them overlap across AI and machine learning (ML), but we'll be clear about which is which to reinforce your understanding of this space. If you read Part I and understand what's written here in Part II, you'll be in good shape.

Artificial Intelligence Techniques

Quick refresher from Part I: Artificial intelligence is "the art and science of using computers to mimic the actions or behaviors of human beings." AI focuses on the 'thinking' part and uses automation to perform tasks and work that would otherwise be done by humans.

AI Technique #1: Natural Language Processing

Natural language processing (NLP) is the ability of computers to understand and interpret language, in either written or verbal form. NLP is particularly useful with unstructured data, where large amounts of free-form written text are available in digital format (e.g., documents, emails, articles, social media). In verbal form, NLP receives (hears) commands or requests, interprets (processes) them, and returns feedback to the user. Think of voice assistants like Siri, Alexa, or Google Assistant.

More specifically, one of my favorite NLP techniques is named entity recognition (NER). NER solutions essentially parse out the 5W1H (i.e., Who, What, When, Where, Why, and How). Granted, the Why and How can be more subjective, but NER is very good at parsing out the Who, What, When, and Where. (Interpreting the Why and How can be improved with additional techniques such as topic modeling.)

The image below shows sample output from an NLP NER model, with the recognized entities highlighted in different colors (the colors can be customized). You can also customize the model to recognize entities that are specific to your industry/domain, increasing its accuracy.

Sample output from an NLP NER model.
Source: Great Learning Team. "What is Named Entity Recognition (NER) Applications and Uses?" Great Learning, 18 Nov 2022.
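
If you'd like to try NER yourself, below is a minimal sketch in Python using the open-source spaCy library (my choice here, not necessarily the tool behind the screenshot above). It assumes you've installed spaCy and downloaded its small English model (python -m spacy download en_core_web_sm); the sample sentence is just an illustration.

    import spacy

    # Load spaCy's small English pipeline
    # (assumes: pip install spacy && python -m spacy download en_core_web_sm)
    nlp = spacy.load("en_core_web_sm")

    text = ("Apple is reportedly looking at buying a U.K. startup for $1 billion, "
            "according to a report published in London on Friday.")

    # Print each named entity with its label -- the Who, What, When, and Where
    doc = nlp(text)
    for ent in doc.ents:
        print(ent.text, "->", ent.label_)

Out of the box, spaCy recognizes entity types like ORG, GPE (locations), DATE, and MONEY; as noted above, you can train custom entity types for your own industry/domain.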

AI Technique #2: Computer Vision

Similar to how NLP processes language in written or verbal form, computer vision processes images and videos, much as humans use sight to gather information. Computer vision can be used for real-time applications (e.g., security video surveillance, drones) or batch applications (e.g., classifying photographs). The animated image below shows a sample object detection solution that identifies people, cars, trucks, traffic lights, etc.

Animated image illustrating object detection on a busy city street.
Source: Agrawal, Tanmay. "Computer Vision - Object Detection Principles." Medium, 2 Apr 2023.

If you're looking to experiment with computer vision solutions quickly, I recommend starting with Python and OpenCV/YOLO. (YOLO stands for You Only Look Once.) A quick sketch follows below. Computer vision algorithms make use of neural networks to interpret images, which leads into the next technique we'll discuss.
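
Here's that sketch, using the ultralytics YOLO package (one of several ways to run YOLO from Python); the model name and image file are placeholders, so substitute your own.

    # pip install ultralytics
    from ultralytics import YOLO

    # Load a small, pre-trained YOLO model (weights download on first run)
    model = YOLO("yolov8n.pt")

    # Run object detection on an image of your choosing (placeholder file name)
    results = model("street_scene.jpg")

    # Print each detected object's class and confidence score
    for result in results:
        for box in result.boxes:
            label = result.names[int(box.cls)]
            print(f"{label}: {float(box.conf):.2f}")

A pre-trained model like this already knows common classes (person, car, truck, traffic light), which is why it's a quick way to get a feel for object detection before investing in custom training.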

AI Technique #3: Neural Networks

Inspired by neurons 'firing' in the human brain, neural networks process information using a series of messages passed between nodes in layers to arrive at a calculated conclusion. There is an input layer, one or more hidden layers, and an output layer. Each node in the layers represents a task or function that contains a mathematical formula, including weights. When a node's output exceeds a threshold, it passes information to the next layer; otherwise (below the threshold), it does not pass information. The passing of this information from one node to another in the layers results in a conclusion in the output layer.
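
To make the weights-and-thresholds idea concrete, here's a toy forward pass in Python/NumPy. It's deliberately oversimplified: real networks learn their weights from data and use smoother activation functions, but the flow (weigh the inputs, sum them, compare to a threshold, pass the result forward) is the same.

    import numpy as np

    def step(x, threshold=0.0):
        # A node "fires" (outputs 1) only when its weighted input
        # exceeds the threshold; otherwise it outputs 0
        return (x > threshold).astype(float)

    rng = np.random.default_rng(42)

    # Toy network: 4 inputs -> 3 hidden nodes -> 2 outputs, random (untrained) weights
    W_hidden = rng.normal(size=(4, 3))
    W_output = rng.normal(size=(3, 2))

    x = np.array([0.9, 0.1, 0.4, 0.7])    # input layer
    hidden = step(x @ W_hidden)            # hidden layer: weigh, sum, threshold
    output = step(hidden @ W_output)       # output layer: the "conclusion"

    print("hidden layer:", hidden)
    print("output layer:", output)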

The image below shows how a handwritten seven (7) in a 28 x 28 pixel box is interpreted by a neural network. As each node analyzes the pixels of the handwritten number (784 pixels = 28 * 28), the calculations produce values that progressively point to which number was most likely written. The far left is the input layer of all 784 pixels, the two hidden layers in the middle process the input, and the far right is the output layer suggesting that a "7" was written.

Animated image illustrating neural network processing.
Source: Jude, Harris. "How to build a handwritten Digits Classifier?" Medium, 15 Nov 2020.

This kind of processing using nodes in layers to accept input, process input with mathematics, and output a value mimics how the human brain instantaneously processes information with neurons to decide what we see (e.g., a cat in a photo, a handwritten number or word).
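
If you'd like to recreate something like the handwritten-digit example above, here's a minimal sketch using TensorFlow/Keras. The layer sizes and training settings are illustrative (not tuned), and the model in the animation above may be built differently, but the structure is the same: 784 input pixels, hidden layers, and 10 outputs for the digits 0 through 9.

    import tensorflow as tf

    # MNIST: 28 x 28 pixel images of handwritten digits (0-9)
    (x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()
    x_train, x_test = x_train / 255.0, x_test / 255.0   # scale pixels to 0-1

    # Input layer (784 pixels), two hidden layers, output layer (10 digits)
    model = tf.keras.Sequential([
        tf.keras.layers.Flatten(input_shape=(28, 28)),
        tf.keras.layers.Dense(128, activation="relu"),
        tf.keras.layers.Dense(64, activation="relu"),
        tf.keras.layers.Dense(10, activation="softmax"),
    ])

    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])

    model.fit(x_train, y_train, epochs=3)
    print("test accuracy:", model.evaluate(x_test, y_test)[1])

Even this small model typically lands in the high 90s for accuracy after a few epochs, which shows just how approachable the basics have become.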

Getting Started

Seek Input from Experts

Many organizations are deeply interested in AI but struggle with where to use it and how. With assistance from subject matter experts (both technical & functional), I recommend brainstorming use cases for your business/mission areas. (And I mean real use cases, not token use cases.) Oftentimes the answers are hidden in plain sight and can be uncovered as a team using various ideation techniques. Don't underestimate the power of collaboration, focus, talent, and expertise; when combined, they produce game-changing results.

Try Something!

Once you've identified a handful of use cases, qualify each one to determine where the return on investment is expected to be highest. Then start with one, yes one (1), use case and build a proof of concept (or prototype) for it. And try to leverage platforms you've already invested in to reduce the barrier to entry into the world of AI:

  • If you're using Microsoft, try something as simple as AI Builder in Power Automate.
  • If you're using AWS, try Bedrock.
  • If you're using GCP, try Document AI.
  • If you're using Snowflake, try Cortex AI.
  • If you're using Databricks, try Mosaic AI.
  • If you're using ___________, try ___________ (you fill in the blanks).
  • And if you're not currently using anything, try Python.

You get the picture: try something!

Overcome Analysis Paralysis to Innovate

The important thing is to begin experimenting so you can fail fast, learn, and make progress. Then that progress, through iterations, starts turning into success. Far too many organizations suffer from analysis paralysis and hesitate to invest time and money in innovation, hoping things will just magically fall into place. The next thing you know, you're behind the curve and limping along with primitive solutions. Yes, innovation is easier said than done, but it's entirely possible with a strong team and the right mindset.

"Chance favors the prepared mind." ~ Louis Pasteur

So, what are you waiting for?
