#10 Intro to AI
What does it mean? What do we need? Why is it taking off?
The Takeaway
The thing people say most often about artificial intelligence (AI) is “I don’t know enough about it.”
This article explains the basics of AI in simple terms. It also helps you think through the societal questions, and why American AI is well positioned to boost human productivity. It does not go into the specific use cases you’re seeing everywhere.
Q: What do the terms mean?
A: Artificial intelligence (AI) is about using computers to do tasks that mimic human intelligence to increase human productivity.
Machine learning (ML) is when computers learn how to do tasks using human-made rules in calculations (algorithms). Example: existing fraud prevention software.1
Deep learning is a subset of machine learning where computers learn how to do a task instead of being told how to do it. Deep learning models are designed to take in data and connect the dots through layers of intermediate steps (hidden layers), much like a person learns from experience.
Neural networks are mathematical models within deep learning that work like the human brain’s neural network.2
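For instance, a single artificial “neuron” is just arithmetic: multiply each input by a weight, add them up with a bias, then squash the result into a 0-to-1 range. A minimal Python sketch (all the numbers here are made up for illustration):

```python
import math

def neuron(inputs, weights, bias):
    """One artificial neuron: a weighted sum of inputs passed through a
    squashing (sigmoid) function, loosely mimicking a brain cell 'firing'."""
    total = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1 / (1 + math.exp(-total))  # sigmoid: always between 0 and 1

# Made-up example: two input signals, two learned weights, one bias.
output = neuron(inputs=[0.5, 0.8], weights=[0.4, -0.2], bias=0.1)
print(round(output, 3))
```

A real neural network chains thousands to billions of these neurons in layers; training is the process of nudging the weights until the outputs are useful.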
Data quality is important to AI. Models for an ML/deep learning task use 3 separate datasets (training, validation, and test) to keep the model from becoming biased into thinking it is more accurate than it is.3
In an example of a data labeler distinguishing between cars and motorcycles, the model would train, validate, and test on 3 different sets of traffic camera footage.
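To make the three-dataset idea concrete, here is a minimal Python sketch that shuffles labeled examples and carves them into training, validation, and test sets. The 70/15/15 proportions are a common convention, not a rule, and the frame names are invented:

```python
import random

def train_val_test_split(items, seed=42):
    """Shuffle labeled examples, then carve them into three separate sets:
    ~70% to learn from, ~15% to tune with, ~15% held back as the final exam."""
    rng = random.Random(seed)
    shuffled = items[:]
    rng.shuffle(shuffled)
    n = len(shuffled)
    train_end = int(n * 0.70)
    val_end = train_end + int(n * 0.15)
    return shuffled[:train_end], shuffled[train_end:val_end], shuffled[val_end:]

# Made-up example: 100 labeled frames ("car" or "motorcycle") from camera feeds.
frames = [(f"frame_{i}", "car" if i % 2 else "motorcycle") for i in range(100)]
train, val, test = train_val_test_split(frames)
print(len(train), len(val), len(test))  # 70 15 15
```

Because the test set is never used during training, it gives an honest estimate of how the model will do on footage it has never seen.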
Q: What do we need more of?
A: We need more power…and data!
Computing Power
Semiconductors or chips power basic and advanced technological systems, from coffee makers to cars.
Moore’s Law is Intel co-founder Gordon Moore’s 1965 prediction that the number of transistors in an integrated circuit (a semiconductor or chip) would double every year for a decade, improving computing power and decreasing costs. The prediction was later revised to doubling every two years, which held true for over 50 years. Companies like TSMC, Intel, Nvidia, and SMIC are all competing to shrink chips. Once companies cannot shrink chips past their already nano-metric proportions (a crazy slim fraction of a human hair), some like Intel are optimistic that 3D designs will lead to the next computing power curve.4
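The compounding in that doubling is easy to underestimate. A back-of-the-envelope Python sketch, starting from the roughly 2,300 transistors of Intel’s 1971 4004 chip (a starting point chosen purely for illustration):

```python
def transistors_after(years, start=2300, doubling_period=2):
    """Project transistor count under Moore's Law-style doubling.
    start=2300 is roughly the Intel 4004 (1971), used only to illustrate."""
    return start * 2 ** (years // doubling_period)

# 50 years of doubling every two years = 25 doublings.
print(f"{transistors_after(50):,}")
```

Twenty-five doublings turn a few thousand transistors into tens of billions, which is why each generation of chips could do so much more for so much less.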
Nvidia has shifted the focus to the GPU (graphics processing unit), which was originally designed for high-resolution video games and now increasingly drives AI/ML computing. Fun fact: Nvidia’s new superchips are named for Admiral Grace Hopper, the legendary US Navy computer scientist.5
As quantum computing develops, the ability to process many calculations in parallel could boost AI processes even further, because qubits can represent a blend of 0 and 1 at the same time (superposition), unlike classical bits that are strictly 0 or 1.6
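A simplified way to see the appeal (this sketch ignores measurement and real-world noise): n classical bits hold exactly one n-bit value at a time, while the state of n qubits is described by 2**n amplitudes at once, so the state space grows exponentially:

```python
def qubit_amplitudes(n_qubits):
    """Number of amplitudes describing an n-qubit register's state.
    By contrast, n classical bits hold just one of their 2**n values."""
    return 2 ** n_qubits

for n in (1, 10, 50):
    print(f"{n} qubits -> {qubit_amplitudes(n):,} simultaneous amplitudes")
```

At 50 qubits that is already over a quadrillion amplitudes, more than a classical machine can comfortably track one by one.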
Data
The modern cloud began with Amazon Web Services (AWS) in 2006. AWS built physical warehouses of computer servers (data centers) and started renting out online storage space and services. An estimated 120 zettabytes (a zettabyte is a trillion gigabytes) of data is out there, and the number will only grow.7
The US has over 2,500 data centers, about one-third of the global total.8
Real estate investors like Blackstone capitalized on this trend.
AI/ML models need more data for inputs to become more accurate. This is why you will hear the phrase “garbage in, garbage out” applied to computer science.9
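A toy Python sketch of “garbage in, garbage out,” with made-up numbers: one corrupted reading drags an otherwise sensible average far off, and a model trained on such data inherits the nonsense:

```python
clean_readings = [98.6, 98.4, 98.7, 98.5]      # plausible sensor data
garbage_readings = clean_readings + [9860.0]   # one mis-entered value

clean_avg = sum(clean_readings) / len(clean_readings)
garbage_avg = sum(garbage_readings) / len(garbage_readings)

print(round(clean_avg, 2))    # sensible output from clean input
print(round(garbage_avg, 2))  # wildly wrong output from one garbage input
```

Scale that single bad value up to millions of unlabeled, duplicated, or mislabeled records and the case for careful data quality makes itself.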
The outputs become data for online products.
You guessed it: we will also need energy, space, and other hardware to support the growing use of power and data centers.
Q: What are some societal questions about AI?
A: Ethics are part of AI, just as they are part of society and our legal systems. There is no new AI-specific regulation in the US. Those in favor of regulation believe it is necessary to write in safeguards to protect people, and for the US government to be a global leader. Those opposed believe it would harm the fast pace of innovation, that existing laws are sufficient, and that private businesses will carry the norms. Big firms have been strongly encouraged by the government to make safety pledges, and they just did.
Human biases (irrational judgments) affect data models about people and lead to issues like unfairness. The issues are magnified when the outcome or process is not explained (black box AI). Enter companies that specialize in explainable AI (XAI), which help people understand the calculations that went into an answer, though this can be an imperfect solution.
Large language models like OpenAI’s ChatGPT can confidently produce something that is not true (hallucinations) even with accurate data. If connected to inaccurate data, the models can generate more inaccurate information that compounds. There are also debates around creativity and ownership, such as someone not citing your work or someone making their work sound like yours.10
You might write with AI and generate a very generic product, or use it creatively to improve structure, style, and subject matter.
Businesses will use more interconnected devices (internet of things) with AI to record, analyze, and drive new insights. People will continue to be concerned about their data privacy and security. We haven’t even begun connecting to people’s brains, though Elon Musk’s Neuralink provides an early medical use.
Most interesting will be how the type of work we do changes and the meaning we make of our work.
Summary
AI is not just the hot new trend. It has fascinated people since the sci-fi novels about robots 100 years ago. The US has led 60 years of progress in semiconductor manufacturing, 40 years of the personal computer and internet, and 20 years of cloud services. These developments reshaped society and dramatically lowered the costs of computing while increasing access to data. Great innovators from Silicon Valley to Boston, and Google to government, have built off theories to drive current momentum, kicking off a new wave of innovation.11
I hope you enjoyed what is a basic framework. What are your thoughts?
1 Machine Learning, SAS Insights, 2023, https://www.sas.com/en_us/insights/analytics/machine-learning.html
2 Deep Learning, IBM, 2023, https://www.ibm.com/topics/deep-learning
3 Pragati Baheti, “Train, Test, Validation Split,” V7 Labs, 2023, https://www.v7labs.com/blog/train-validation-test-set
4 Dr. Ann Kelleher, “Moore’s Law, Now and in the Future” Intel, 2022, https://www.intel.com/content/www/us/en/newsroom/opinion/moore-law-now-and-in-the-future.html
5 Benj Edwards, “Nvidia’s New Monster CPU+GPU Chip May Power the Next Generation of AI Chatbots,” Ars Technica, 2023, https://arstechnica.com/information-technology/2023/06/nvidias-new-ai-superchip-combines-cpu-and-gpu-to-train-monster-ai-systems/
6 Susan Galer, “If You Think AI Is Hot Wait Until It Meets Quantum Computing,” Forbes, 2023, https://www.forbes.com/sites/sap/2023/03/21/if-you-think-ai-is-hot-wait-until-it-meets-quantum-computing/?sh=4e881ead1ff6
7 Petroc Taylor, “Volume of Data/Information Captured, Copied, and Consumed Worldwide From 2010 to 2020, With Forecasts From 2021 to 2025,” Statista, 2022, https://www.statista.com/statistics/871513/worldwide-data-created/
8 Ogi Djuraskovic, “Big Data Statistics 2023: How Much Data Is in the World,” FirstSiteGuide, 2023, https://firstsiteguide.com/big-data-stats/
9 Conversations with US Naval Academy graduate
10 Conversations with Schwarzman Scholars
11 US government agency DARPA helped create the internet. Google changed search. OpenAI made generative AI accessible.