Cognitive Computing and Artificial Intelligence
50th in a series of 50 Knowledge Management Components (Slide 64 in KM 102)
Definitions
Background
Cognitive computing and artificial intelligence are very hot topics at the time of this writing. In researching this article, I found a large number of recent articles on these and related topics. Many of the writers seem to think that AI is a relatively new technology. It is not. AI has been around for a long time, having gone through several hype cycles.
Awareness of AI is widespread. Movies such as 2001: A Space Odyssey (1968), Colossus: The Forbin Project (1970), and A.I. Artificial Intelligence (2001) have portrayed AI somewhat realistically.
In 1983, Janet Johnson, my colleague at Digital Equipment Corporation, attended extended, intensive training to become an AI specialist. When she returned, we were able to place her at McDonnell Douglas in an expert systems project. But nothing came of it, and before long, she was working on other efforts that did not take advantage of her AI expertise.
The promise of AI has always been tantalizing, but it has struggled to deliver on that promise. The key is to create, promote, and implement cognitive computing killer apps that are markedly more capable than existing alternatives. In a television commercial for Amazon Echo, Alec Baldwin asks Alexa to check the traffic. This is not likely to convince many people to buy an Echo. More compelling use cases for AI are needed.
One of the early areas where AI was applied was expert systems for medicine. AI can help diagnose illnesses, prevent problems with drug interactions, and detect new associations not yet seen by humans. My radiologist friend, Dr. David Osher, told me that computers will soon be able to read most x-rays better than he can. Now that is a compelling use case.
Uses and Benefits
Cognitive computing and artificial intelligence can be used to support decision-making, deliver highly-relevant information, and optimize the available attention to avoid missing key developments. Implement cognitive computing to help achieve these desirable results:
Cognitive computing can simulate human thought processes and mimic the way the human brain works, addressing complex situations that are characterized by ambiguity and uncertainty. Artificial intelligence can perform operations analogous to learning and decision making in humans. Intelligent personal assistants can recognize voice commands and queries, respond with information, or take desired actions quickly, efficiently, and effectively.
Using these approaches can enhance the capabilities of humans by augmenting their powers of observation, analysis, decision making, processing, and responding to other people and to routine or challenging situations. Cognitive computing tools such as IBM Watson, artificial intelligence tools such as expert systems, and intelligent personal assistant tools such as Amazon Echo, Apple Siri, Google Assistant, and Microsoft Cortana can be used to extend the ability of humans to understand, decide, act, learn, and avoid problems.
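As a toy illustration of the command-and-response loop behind such assistants, here is a minimal, hypothetical intent matcher in Python. The intents and phrasings are invented for illustration; real assistants use trained language models rather than keyword rules.

```python
# Minimal keyword-based intent matcher: a toy stand-in for the
# language-understanding step in an intelligent personal assistant.
# All intents and keywords here are invented for illustration.

INTENTS = {
    "check_traffic": ["traffic", "commute"],
    "play_music": ["play", "music", "song"],
    "set_timer": ["timer", "remind"],
}

def match_intent(utterance: str) -> str:
    """Return the first intent whose keywords appear in the utterance."""
    words = utterance.lower().split()
    for intent, keywords in INTENTS.items():
        if any(kw in words for kw in keywords):
            return intent
    return "unknown"

if __name__ == "__main__":
    print(match_intent("Alexa, check the traffic"))  # check_traffic
    print(match_intent("play my favorite song"))     # play_music
    print(match_intent("what's the weather"))        # unknown
```

A production assistant replaces the keyword table with a statistical model, but the overall flow (utterance in, intent out, action taken) is the same.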
Cognitive computing and artificial intelligence are the key elements of an Augment Strategy for knowledge management. Here are three examples of such a strategy:
Insights
1. Primer: Make sense of cognitive computing by Bob Violino – quotes Paul Roma of Deloitte
There are three main ways cognitive computing can be applied today:
How is cognitive computing different from AI?
Deloitte refers to cognitive computing as “more encompassing than the traditional, narrow view of AI.” AI has been primarily used to describe technologies capable of performing tasks normally requiring human intelligence, he says. “We see cognitive computing as being defined by machine intelligence, which is a collection of algorithmic capabilities that can augment employee performance, automate increasingly complex workloads, and develop cognitive agents that simulate both human thinking and engagement.”
2. The First Wave of Corporate AI Is Doomed to Fail by Kartik Hosanagar and Apoorv Saxena
Early efforts of companies developing chatbots for Facebook’s Messenger platform saw 70% failure rates in handling user requests. Yet a reversal on these initiatives among large companies would be a mistake. The potential of AI to transform industries truly is enormous. Recent research from McKinsey Global Institute found that 45% of work activities could potentially be automated by today’s technologies, and 80% of that is enabled by machine learning. The report also highlighted that companies across many sectors, such as manufacturing and health care, have captured less than 30% of the potential from their data and analytics investments. Early failures are often used to slow or completely end these investments.
For quick wins, one might focus on changing internal employee touchpoints, using recent advances in speech, vision, and language understanding. Examples of these projects might be a voice interface to help pharmacists look up substitute drugs, or a tool to schedule internal meetings. These are areas in which recently available, off-the-shelf AI tools, such as Google’s Cloud Speech API and Nuance’s speech recognition API, can be used, and they don’t require massive investment in training and hiring. (Disclosure: One of us is an executive at Alphabet Inc., the parent company of Google.) They will not be transformational, but they will help build consensus on the potential of AI. Such projects also help organizations gain experience with large-scale data gathering, processing, and labeling, skills that companies must have before embarking on more-ambitious AI projects.
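The pharmacist example above reduces to a simple lookup once speech has been transcribed. Here is a hedged sketch of that back end; the drug names and substitutions are invented placeholders, and a real tool would front this lookup with a speech-to-text service and a vetted formulary database.

```python
# Toy back end for the "substitute drug" lookup mentioned above.
# Drug names and substitutions are invented placeholders, not
# real pharmaceutical guidance.

SUBSTITUTES = {
    "brandamol": ["genericamol 500mg", "acetaphen 325mg"],
    "exampril": ["genexapril 10mg"],
}

def lookup_substitutes(drug: str) -> list[str]:
    """Return known substitutes for a drug name, or an empty list."""
    return SUBSTITUTES.get(drug.strip().lower(), [])
```

The point of such a quick win is not the lookup itself but the surrounding plumbing: capturing spoken queries, normalizing them, and returning an answer fast enough to be useful at the counter.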
For long-term projects, one might go beyond point optimization, to rethinking end-to-end processes, which is the area in which companies are likely to see the greatest impact. For example, an insurer could take a business process such as claims processing and automate it entirely, using speech and vision understanding. Allstate car insurance already allows users to take photos of auto damage and settle their claims on a mobile app. Technology that’s been trained on photos from past claims can accurately estimate the extent of the damage and automate the whole process. As companies such as Google have learned, building such high-value workflow automation requires not just off-the-shelf technology but also organizational skills in training machine learning algorithms.
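An end-to-end claims workflow like the one described can be sketched as a short pipeline. Everything below is a hypothetical stub: the damage "model," the payout formula, the deductible, and the thresholds are all invented to show the shape of the automation, not how Allstate or any insurer actually computes settlements.

```python
# Hypothetical end-to-end claims pipeline. Each step is a stub
# standing in for a real service; all numbers are invented.

def estimate_damage(photo_bytes: bytes) -> float:
    """Stand-in for a vision model trained on past-claim photos.

    Returns a severity score in [0.0, 1.0]. Here we fake it from
    the payload size; a real system would run image classification.
    """
    return min(len(photo_bytes) / 1000, 1.0)

def settle_claim(photo_bytes: bytes, deductible: float = 500.0) -> dict:
    """Automate the whole flow: photo in, estimated payout out."""
    severity = estimate_damage(photo_bytes)
    payout = max(severity * 5000.0 - deductible, 0.0)
    return {"severity": round(severity, 2), "payout": round(payout, 2)}
```

The organizational lesson in the paragraph above maps onto this sketch directly: the hard part is not the glue code but training and maintaining the `estimate_damage` model on labeled historical claims.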
3. How Companies Are Already Using AI by Satya Ramaswamy
4. How People Will Use AI to Do Their Jobs Better by H. James Wilson and Cyrille Bataller
5. The War on Experts by Gary Klein
Artificial Intelligence and Big Data have each claimed to be able to replace experts. However, each of these claims is unwarranted. Let’s start with AI. Smart systems should be able to do things like weather forecasting better (and more cheaply) than humans, but the statistics show that human forecasters improve the machine predictions by about 25%, an effect that has stayed constant over time. AI successes have been in games like chess, Go, and Jeopardy — games that are well-structured, with unambiguous referents and definitive correct solutions. But decision makers face wicked problems with unclear goals in ambiguous and dynamic situations, conditions that are beyond AI systems.
As Ben Shneiderman and I observed in a previous essay, humans are capable of frontier thinking, social engagement, and responsibility for actions. Big Data approaches can search through far more records and sensor inputs than any human, but these algorithms are susceptible to seeing patterns where none really exist. Google’s Flu Trends project was publicized as a success story, but subsequently failed so badly that it was removed from use. Big Data algorithms follow historical trends, but may miss departures from these trends. Further, experts can use their expectancies to spot missing events that may be very important, but Big Data approaches are unaware of the absence of data and events.
6. Steer Clear of the Hype: 5 AI Myths by Christy Pettey
7. Micro Explanations For Nine Essential AI Technologies by Mike Gualtieri
8. 12 Ways AI Will Disrupt Your C-Suite by Lisa Morgan
Resources
1. LinkedIn Topics: Cognitive Computing - Artificial Intelligence
2. SlideShare: Cognitive Computing - Artificial Intelligence
3. Cognitive Computing Consortium (archives)
4. (the late) Sue Feldman
5. Seth Earley
6. Adi Gaskell
7. IBM
8. APQC
9. KMWorld
10. Rolling Stone: Inside the Artificial Intelligence Revolution: A Special Report
11. Information Week
12. CNET: AI
13. InfoWorld: AI
14. SAS: AI
15. Forbes: AI
16. MIT: AI
17. Gartner: AI
18. Forrester: AI
19. Deloitte
20. David Schatsky
21. Gary Klein
22. Tony Rhem - The Connection between Artificial Intelligence & Knowledge Management
23. Mary Abraham
24. Matt Moore
25. Dave Snowden
26. Luis Suarez
27. Futurism: AI
28. What deep learning really means by Martin Heller
29. The incomplete A-Z of cognitive computing by BBC
30. The Growing Importance of Natural Language Processing by Stephen F. DeAngelis
31. What’s the Difference Between Artificial Intelligence, Machine Learning, and Deep Learning? by Michael Copeland
32. 5 ways chatbots are revolutionizing knowledge management by Matt Wade
33. AI Use Cases That Will Advance the Industries in 2021 by Ivana Kotorchevikj
34. How Do Generative AI Systems Work? by Page Laubheimer
Software
1. Artificial Intelligence Software by Capterra
2. Top Cognitive Computing Companies by Venture Radar
3. Top Cognitive Computing Companies by Predictive Analytics Today
4. G2
5. Liam H?nel
Generative AI Tools
Training
1. Udacity
2. Deloitte
3. edX
4. Coursera
Books
Video