A.I. Island 1: An introduction to Artificial Intelligence... to music
Ken Finnegan
Director of Innovation, Creativity, and Entrepreneurship | Empowering the Next Generation of Changemakers
Whether you work in the world of technology or not, you have more than likely heard the term Artificial Intelligence over the past few years. Actually, if you are of the same vintage as me, you will probably have heard Artificial Intelligence discussed back in the 1980s too (or, if you are of an even better vintage, in the '50s and '60s). There is so much noise out there about Artificial Intelligence: how it will make our lives easier and we are about to enter the promised world of more leisure and family time, or, alternatively, how AI will destroy jobs and livelihoods until the machines eventually take over, and then where will we be? Is it "the end of the world as we know it"...? (Play the tune while reading. You may need to play it directly from YouTube; it's worth it though.)
As the Chief Technologist in IDA Ireland's technology division, I want to clear up a few myths, clarify some of the technologies and terminology, and discuss, in this series of blog posts, how we can prepare for the annihilation or blissful paradise that is heading our way.
First of all, what is Artificial intelligence? According to the father of Artificial Intelligence, Stanford researcher John McCarthy, it is “The science and engineering of making intelligent machines, especially intelligent computer programs”. Okay…
So in other words, Artificial Intelligence is a way of making a computer, a computer-controlled robot, or software think intelligently, in a manner similar to the way intelligent humans think. A.I. is accomplished by studying how the human brain thinks, and how humans learn, decide, and work while trying to solve a problem, and then using the outcomes of this study as a basis for developing intelligent software and systems.
John McCarthy coined the term in 1956 during what is now called the Dartmouth Conference, at which the core mission of AI was defined. We are not at the end of the AI journey by any stretch of the imagination; however, the evolution of technology capability over the past 60 to 70 years has put us on the path to AI.
There are two forms of AI, Strong and Weak:
Strong AI: The work aimed at genuinely simulating human reasoning tends to be called strong AI in that any result can be used to not only build systems that think but also explain how humans think as well. Genuine models of strong AI or systems that are actual simulations of human cognition have yet to be built!
Weak AI: The work in the second school of thought, aimed at just getting systems to work, is usually called weak AI in that while we might be able to build systems that can behave like humans, the results tell us nothing about how humans think. One of the prime examples of this was IBM’s Deep Blue, a system that was a master chess player but certainly did not play in the same way that humans do and told us very little about cognition in general.
According to Dr. Kristian Hammond, Chief Scientist at Narrative Science, this is the third time since the term Artificial Intelligence was coined that AI has reared its head as a trend and promised to change the world we live in. And this time, yessss, I believe it is true. Why this time and not previously? Well, if we look at the evolution of technologies and data availability, we can see a lot more resources at a maturity that will enable AI.
Five to ten years ago the world witnessed the advent of Big Data and cloud computing. Companies, people, machines, industries and so on are all producing tonnes of data, so much so that they don't know what to do with it all. So then, how about we feed this data to the machines, add some clever code, and see what happens?
It’s also worth noting that we have begun to see the importance of structuring data, and not just ‘dumping’ it into a database, so it can be consumed by AI. The cloud has provided what appears to be an infinite amount of space to store data and software applications. No problems there then. (Next tune)
Compute power has increased exponentially over the past 50 years in line with Moore's Law: ‘Moore’s law still gives an important cadence for the development of AI. The basic AI that we have today has developed with the power obtained from the current processors, developed using Moore’s principle. And that sets the base of the idea of superintelligence technology.’
In addition, network capacity and speed, high-performance computing, and the proliferation of sensors collecting data have all given rise to the ability of machines to turn data into knowledge.
Depending on whom you talk to and what their business or area of interest is, AI means different things to different folks. It might seem a little confusing at first, as a lot of the terminology is interchangeable and AI is what I would describe as an ‘umbrella term’, meaning that many technologies fall under the AI umbrella. So let's get the terminology out of the way and see if it makes sense. There is Artificial Intelligence, cognitive computing, machine learning, deep learning, natural language generation, neural networks, speech recognition, broad and narrow systems, NLP (Natural Language Processing), vision processing systems, and weak and strong AI (explained above). Each topic is a discrete field of research in itself.
Cognitive Computing: The goal of cognitive computing is to simulate human thought processes in a computerised model. Using self-learning algorithms that use data mining, pattern recognition, and natural language processing, the computer can mimic the way the human brain works.
Machine Learning: Machine Learning at its most basic is the practice of using algorithms to parse data, learn from it, and then make a determination or prediction about something in the world. So rather than hand-coding software routines with a specific set of instructions to accomplish a particular task, the machine is “trained” using large amounts of data and algorithms that give it the ability to learn how to perform the task.
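As a purely illustrative sketch (not any particular library's API), the "train from examples rather than hand-code rules" idea can be shown with a toy nearest-neighbour classifier in pure Python:

```python
# Instead of hand-coding rules, "train" a classifier from labelled examples:
# a toy 1-nearest-neighbour model in pure Python.

def train(examples):
    """'Training' here simply stores the labelled examples."""
    return list(examples)

def predict(model, point):
    """Classify a new point by the label of its nearest training example."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    nearest = min(model, key=lambda ex: dist(ex[0], point))
    return nearest[1]

# Toy data: two clusters labelled "small" and "large".
data = [((1, 1), "small"), ((2, 1), "small"),
        ((8, 9), "large"), ((9, 8), "large")]
model = train(data)
print(predict(model, (1, 2)))   # → small
print(predict(model, (8, 8)))   # → large
```

The point of the sketch: nothing in `predict` mentions "small" or "large"; the behaviour comes entirely from the data the model was given.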
Neural Networks: Neural Network is a computer system designed to work by classifying information in the same way a human brain does. It can be taught to recognise, for example, images, and classify them according to elements they contain. Essentially it works on a system of probability – based on data fed to it, it is able to make statements, decisions, or predictions with a degree of certainty. The addition of a feedback loop enables “learning” – by sensing or being told whether its decisions are right or wrong, it modifies the approach it takes in the future.
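That feedback loop can be sketched as a single artificial neuron (a perceptron) that nudges its weights whenever its decision is wrong; a minimal pure-Python illustration, not a production network:

```python
# A toy single-neuron "network" with a feedback loop: when its decision is
# wrong, it adjusts its weights (the classic perceptron learning rule).

def step(x):
    return 1 if x >= 0 else 0

def train_perceptron(samples, epochs=10, lr=1):
    w = [0, 0]
    b = 0
    for _ in range(epochs):
        for inputs, target in samples:
            out = step(sum(wi * xi for wi, xi in zip(w, inputs)) + b)
            error = target - out          # feedback: right (0) or wrong (±1)
            w = [wi + lr * error * xi for wi, xi in zip(w, inputs)]
            b += lr * error
    return w, b

# Learn the logical AND function purely from examples.
samples = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = train_perceptron(samples)
for inputs, target in samples:
    assert step(sum(wi * xi for wi, xi in zip(w, inputs)) + b) == target
```

Each wrong answer moves the weights slightly toward a correct decision, which is exactly the "modifies the approach it takes in the future" behaviour described above.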
Tune 3...
Deep Learning: Essentially, Deep Learning involves feeding a computer system a lot of data, which it can use to make decisions about other data. This data is fed through neural networks, as is the case in machine learning. These networks are logical constructions that ask a series of binary true/false questions of every bit of data that passes through them (or extract a numerical value from it), and classify the data according to the answers received.
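A minimal sketch of the layered idea: a tiny two-layer network of true/false "questions". The weights below are hand-set purely for illustration; in real deep learning they are learned from data:

```python
# Two stacked layers of step-function neurons. Stacking lets the network
# answer a question (XOR) that no single neuron can answer on its own.

def step(x):
    return 1 if x >= 0 else 0

def layer(inputs, weights, biases):
    """Each neuron asks a true/false question about its inputs."""
    return [step(sum(w * x for w, x in zip(ws, inputs)) + b)
            for ws, b in zip(weights, biases)]

def network(x, y):
    hidden = layer([x, y], [[1, 1], [1, 1]], [-0.5, -1.5])  # "OR?", "AND?"
    out = layer(hidden, [[1, -2]], [-0.5])                  # "OR but not AND?"
    return out[0]

for x, y in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print((x, y), network(x, y))   # XOR: 0, 1, 1, 0
```

The first layer's answers become the second layer's inputs, which is what "deep" refers to.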
Natural Language Processing: Natural language processing or NLP is a branch of artificial intelligence that has many important implications for the ways that computers and humans interact. Human language, developed over thousands and thousands of years, has become a nuanced form of communication that carries a wealth of information that often transcends the words alone. NLP will become an important technology in bridging the gap between human communication and digital data.
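The very first step such systems take can be sketched as a toy tokeniser and word counter in pure Python (real NLP pipelines go far beyond this, of course):

```python
# Turning raw human text into something a machine can count: normalise,
# split into word tokens, tally frequencies.
import re
from collections import Counter

def tokenize(text):
    """Lowercase the text and split it into word tokens."""
    return re.findall(r"[a-z']+", text.lower())

sentence = "Human language, developed over thousands and thousands of years."
tokens = tokenize(sentence)
counts = Counter(tokens)
print(counts["thousands"])   # → 2
```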
Natural Language Generation: Natural Language Generation (NLG) is a subsection of Natural Language Processing (NLP). NLG software turns structured data into written narrative, writing like a human being but at the speed of thousands of pages per second. NLG makes data universally understandable and seeks to automate the writing of data-driven narratives like financial reports, product descriptions, meeting memos, and more. Importantly, while NLG software can write, it can’t “read”. NLG turns structured data into human language but it is not able to, for example, read a news story and pull figures out of it. In fact, the science of taking unstructured data like a book and making it structured is called Natural Language Understanding.
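The "structured data in, narrative out" direction can be sketched with a simple template. The function and record fields below are hypothetical toys, nothing like a commercial NLG engine:

```python
# Turn a structured record into a written sentence via a template.

def narrate(record):
    direction = "rose" if record["change"] >= 0 else "fell"
    return (f"{record['company']} revenue {direction} by "
            f"{abs(record['change'])}% to {record['revenue']}m "
            f"in {record['quarter']}.")

report = {"company": "Acme", "quarter": "Q3", "revenue": 120, "change": -4}
print(narrate(report))   # → Acme revenue fell by 4% to 120m in Q3.
```

Note the one-way nature described above: this code can write the sentence from the data, but nothing here could read such a sentence and recover the data.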
Speech Recognition: Speech recognition fundamentally functions as a pipeline that converts PCM (Pulse Code Modulation) digital audio from a sound card into recognised speech. The elements of the pipeline are:
1. Transform the PCM digital audio into a better acoustic representation.
2. Apply a "grammar" so the speech recogniser knows what phonemes to expect. A grammar could be anything from a context-free grammar to a full-blown language.
3. Figure out which phonemes are spoken.
4. Convert the phonemes into words.
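The four stages can be strung together as a toy pipeline. Every detail here (the energy "features", the phoneme lookup, the two-word grammar) is a hypothetical placeholder for illustration; real recognisers work very differently inside each stage:

```python
# A toy four-stage speech pipeline mirroring the steps above.

GRAMMAR = {"yes": ["y", "eh", "s"], "no": ["n", "ow"]}

def acoustic_features(pcm):
    # Stage 1: transform PCM samples into a better acoustic representation
    # (here just per-frame energy; real systems use spectral features).
    frame = 4
    return [sum(abs(s) for s in pcm[i:i + frame])
            for i in range(0, len(pcm), frame)]

def expected_phonemes(grammar):
    # Stage 2: the grammar tells the recogniser which phonemes to expect.
    return {ph for phs in grammar.values() for ph in phs}

def decode_phonemes(features, allowed):
    # Stage 3: figure out which phonemes were spoken (toy lookup by
    # energy band, keeping only phonemes the grammar allows).
    bands = {0: "n", 1: "y", 2: "eh", 3: "s", 4: "ow"}
    return [bands[min(f // 10, 4)] for f in features
            if bands[min(f // 10, 4)] in allowed]

def phonemes_to_words(phonemes, grammar):
    # Stage 4: convert the phoneme sequence into words via the grammar.
    return [w for w, phs in grammar.items() if phs == phonemes]

pcm = [3, 3, 3, 3, 6, 6, 6, 6, 9, 9, 8, 8]   # fake audio samples
feats = acoustic_features(pcm)
words = phonemes_to_words(decode_phonemes(feats, expected_phonemes(GRAMMAR)), GRAMMAR)
print(words)   # → ['yes']
```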
Broad and Narrow Systems: Another division in AI development is Narrow AI versus Broad AI. Given that the requirement is that the machine performs the same task as a human, Narrow AI allows for lots of limited or narrowly defined examples, including outside of deep learning. For example, NEST thermostats, or systems that recommend options (what to watch, who to date, what to buy). Broad AI, then, is a system that can be applied to many context-sensitive situations in which it can mimic human activity or decision making. The goal of Artificial General Intelligence (AGI) is to create a platform that is Strong (simulates human reasoning) and Broad (generalises across a broad range of circumstances).
Computer Vision: Computer Vision is an interdisciplinary field that focuses on how machines can emulate the way in which the human brain and eyes work together to visually process the world around us. Computer Vision is concerned with the extraction of understanding from images. There are two key components to Computer Vision: engineering autonomous systems that are able to perform the tasks of human vision, and developing algorithms and computational models that are able to replicate the inner workings of vision and biological understanding.
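A small taste of "extracting understanding from images": a toy horizontal-gradient filter, a much-simplified cousin of the convolutions real vision systems use, finding a vertical edge in a tiny greyscale image:

```python
# Detect where brightness changes sharply from one pixel to the next:
# large gradient values mark edges in the image.

def horizontal_gradient(image):
    """For each pixel, the brightness difference to its right neighbour."""
    return [[abs(row[x + 1] - row[x]) for x in range(len(row) - 1)]
            for row in image]

# A 4x4 "image": dark on the left, bright on the right.
image = [
    [0, 0, 9, 9],
    [0, 0, 9, 9],
    [0, 0, 9, 9],
    [0, 0, 9, 9],
]
edges = horizontal_gradient(image)
print(edges[0])   # → [0, 9, 0]  (strong response where dark meets bright)
```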
Over the coming weeks and months, I’ll dig deeper into the application areas of these technologies, the potential disruption, and new forms of jobs to be created from AI in a series of blog posts.
As Henry Ford famously said, “Whether you think you can, or you think you can't--you're right”. Similarly, whether you believe AI to be an opportunity or a threat, you are right. Change is coming, so let's get ready for it.
If you want to find out more about how IDA Ireland is helping develop the AI industry please get in touch: [email protected] / www.idaireland.com
Last tune :)