AI: Technology of the Year
As 2017 comes to a close, I have been mulling over what deserves the title of "Technology of the Year." Clearly, Artificial Intelligence (AI) is the winner!
Quite a few terms are used interchangeably when discussing AI, including Deep Learning, Machine Learning, Neural Networks, Graph Theory, Random Forests, and so on. Let's clarify those terms. AI is the broad subject: it describes how machines gain intelligence through machine learning, using various algorithmic approaches such as graph theory, neural networks, and random forests. Deep learning is a specialized form of machine learning that uses multi-layer neural networks to learn from large data sets.
I first worked on Artificial Intelligence during my final semester of engineering school, when I wrote my first AI program in LISP for medical diagnosis. Let us call her "Dr. Smart." It was thrilling to watch her diagnose and give me recommendations when I punched in the symptoms. The more facts I fed her, the more intelligent she became. However, if I deviated from the script, she failed. The fundamental thing she missed was learning. She was first-generation AI and could not learn on her own about symptoms that were not in her knowledge base. All she did was verify facts and apply pre-determined logic to spit out an answer using a decision tree. If she could have learned new symptoms outside her knowledge base and gotten smarter, she would have been like the Siri or Alexa of today.
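The limitation described above can be sketched in a few lines. This is a hypothetical, minimal Python reconstruction of a first-generation rule-based system like "Dr. Smart" (the symptoms, conditions, and function names are illustrative, not from the original program): it verifies the input against a fixed knowledge base and, when the input deviates from the script, it simply fails rather than learning.

```python
# A minimal sketch of a first-generation, rule-based diagnosis program:
# fact verification plus a fixed decision rule, with no ability to learn.
# All symptoms and diagnoses here are hypothetical examples.

RULES = {
    ("cough", "fever"): "possible flu",
    ("fever", "rash"): "possible measles",
    ("headache",): "possible tension headache",
}

def diagnose(symptoms):
    """Match reported symptoms against the fixed knowledge base."""
    key = tuple(sorted(symptoms))
    if key in RULES:
        return RULES[key]
    # Deviate from the script and the system fails: no learning occurs,
    # and the unknown symptoms are never added to the knowledge base.
    return "unknown: symptoms not in knowledge base"

print(diagnose(["fever", "cough"]))  # matches a rule
print(diagnose(["dizziness"]))       # fails, exactly as described above
```

A modern learning system would instead update its model from the unmatched input; here the dictionary never changes, which is precisely the gap between first-generation AI and today's assistants.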
This study and summary from PwC clearly depict the evolution of machine learning algorithms.
Several factors in recent decades have accelerated AI's progress, fueling entrepreneurs' interest in making AI ready for prime time:
1. Algorithm Evolution: The theory of deep learning was published in 1965, and subsequent research has produced multiple algorithm choices suited to different learning tasks, such as voice, images, and complex data processing. In 2015, Microsoft's convolutional neural network (CNN)-based computer vision system identified objects in pictures more accurately (95.1%) than humans (94.9%).
2. Hardware Advancement: Graphics Processing Units (GPUs) are slashing the time required to train the neural networks used for deep learning. Nvidia's GPU processing power has increased roughly 70x over the last three years.
3. Cost of Computing: Thanks to the cloud, the cost of computing has dropped drastically, making high-powered computing easier and less expensive to harness. This has opened the door to AI advancement for individuals, small companies, and startups, broadening participation beyond large enterprises and university research labs.
4. Data-Rich World: The world is producing 2.5 quintillion bytes of data every day, and 90% of the data in the world today has been created in the last two years. This data comes from everywhere: sensors used to gather shopper information, posts to social media sites, digital pictures and videos, purchase transactions, and cell phone GPS signals, to name a few. This big data provides a far larger sample set for deep learning.
5. Automation: The rubber meets the road for AI with automation. Whether in business with Robotic Process Automation (RPA) or in improving user experience with Consumer Process Automation (CPA), AI is the engine fueling these automations.
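The "70x in 3 years" figure in item 2 implies a striking year-over-year multiplier. As a quick worked example (the growth math, not a figure from the original), the annual rate is the cube root of the total speedup:

```python
# Implied annual growth from "70x over 3 years": solve rate**3 == 70
# for the per-year multiplier (compound growth).
total_speedup = 70
years = 3
annual_rate = total_speedup ** (1 / years)
print(f"~{annual_rate:.1f}x per year")  # roughly 4.1x each year
```

In other words, training hardware would need to get about four times faster every single year to compound to a 70x gain in three years.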
AI is valuable because it offers revolutionary, rather than evolutionary, capabilities. There are four broad areas of AI applications:
- Logical Reasoning: Solving problems through logical deduction, for use cases like financial loan processing, recruitment screening, and games.
- Decision Making & Knowledge Management: A knowledge base as the key basis for value delivery. Examples: medical diagnosis, media recommendations, purchase prediction, financial market trading, fraud prevention, and cybersecurity.
- Communication with Natural Language Processing: Natural language processing as the foundational technique. Examples: intelligent agents, assistants, and customer support; real-time translation of written and spoken languages; real-time transcription.
- Perception with Complex, High-Volume Data Sets: Leveraging deep learning to build perception from complex, high-volume data sets and drive decisions. Examples: autonomous vehicles, medical diagnosis, and surveillance.
So where do we go from here? There is a passionate debate happening right now over whether AI will harm or improve our way of life, especially our work life. One viewpoint holds that AI will accelerate economic polarization even further than computerization did, and that this must be addressed structurally; the opposing view is that AI can enhance human capability, and that even in self-learning systems the design decisions remain under our control. One suggestion is to tax corporations for their bots, much as companies pay payroll taxes for their employees, to ensure AI doesn't lead to social problems.
If we call the steam engine the first industrial revolution, electricity the second, and computers the third, then AI is the fourth industrial revolution. With every phase, some of the old world is destroyed, but significant value is delivered to the new one. Each revolution forced us to change the way we live, but it also gave us back time and more options to do the things we want to do.
Personally, I feel optimistic about AI. It is important for businesses to figure out what AI means to them, regardless of the industry they operate in!