Kaye Vivian: Profiles in Knowledge, Part 3
This article contains Kaye Vivian's Dove Lane blog posts from January 19, 2006 to April 25, 2006. Later posts are in Part 1 and Part 2, and earlier posts are in Part 4 and Part 5.
April 25th, 2006
Patrick Lambe at Green Chameleon told an interesting story about how he has been blacklisted by one of the large KM conference organizations. One of their inexperienced marketing people told him by mistake. It was eye-opening to me, and made me start thinking again about my KM conference experiences. I posted something related last fall.
I’ve been speculating a bit myself recently about why KM conferences aren’t more successful for attendees — namely, why don’t the so-called big name people show up and stay? We pay a lot of money, and would value the opportunity to shake hands with the gurus whose books we read and whose advice we follow. Patrick’s article points to a possibility I had not considered: that conference organizers may deliberately keep some people out. I have wondered things like, why don’t the speakers who do come stay for the entire conference, instead of flying in for their sessions and out the same day? Are they really that busy? Are they worried that people might ask them for free advice or a job? Why, when some people are obviously a hit with the attendees and have something valuable to share, aren’t they back the following year? There are clearly politics of some sort at work.
I’ve wondered if the advisory boards of these conference organizations aren’t a little closed to outside thoughts. Invariably there are recognized names on those boards, but it’s hard to find a single one of them without a vested interest in having their own colleagues as presenters or in pushing their proprietary point of view forward. In principle, I don’t object to that…people do volunteer work for business development and other reasons. What I do object to is not hearing at conferences about the creative things being done in the field. Each year we seem to get repeat speakers who have a financial interest in participating. They are there to push their forthcoming book (orders taken at the table in the back) or new software. They have a new training or certification program for KM (fee only, of course). They bought a big exhibit space. We are not getting the ones who have launched KM on a shoestring using open source software, or run a novel CoP pilot, or created a completely new KM model without input from a big name consulting organization, or written a provocative thesis.
People on the business side of KM only seem to get invited if they work for a big corporate name. They are, unfortunately, often the weakest presenters at the event because they are one-trick ponies and can’t bring context to the learnings that the rest of us would like to take away.
Finally, there is the issue of last-minute substitutions. Do conference organizers publish the conference speaker lists too early in the process? Do the speakers disrespect their commitments so much that they will fail to show up after their name has been published? (After all, they’ve had the free publicity, why should they go to the trouble to prepare and show up?) Why would three, five, or ten speaker substitutions announced at the conference be tolerated? There’s a blacklist that should be created! A few can be understood, but when there are so many that it creates confusion about what session you are sitting in, doesn’t it become the conference version of bait and switch?
There is a schism that threatens KM, and it’s the same one that appears in most fields where the financial stakes are high for being “right”. Biotech and software development come to mind. Practitioners rush to publish and copyright or trademark their idea or their work, and then spend the rest of their careers defending their one good idea. The way to make the KM field grow and gain meaning and respect is collaboration — building upon what others have contributed. It doesn’t take a Wiki. It takes a willingness. It’s optimistic to hope that the people who rush to create training programs focused on only their own approaches to the problem will have an epiphany and suddenly start to share their knowledge freely, but wouldn’t it be nice?
Right now the planning is on for the big fall conferences. I hope that the organizers for all of them can rise above the competition of proprietary interests, and the demands of financial backers to have sponsors, and put the best people on the podium to talk about the real issues and the creative solutions.
April 12th, 2006
I always enjoy reading the blog of the Anecdote consulting organization in Australia. They are great people and frequently post insightful observations on communities of practice and knowledge management. Recently Shawn Callahan posted about a little test he uses to determine whether a community has a chance to succeed in building an identity and affinity among members. Since community has to start with a sense of belonging in order to succeed, I think this is a useful tool for any KM practitioner to have in their back pocket.
Shawn wrote, “When someone says, ‘I would like to start a community of practice.’ I ask, ‘Can you describe the potential members by completing the following sentence? I am a ...’ If they can fill in the blank in a way that people can passionately identify with the descriptor then there is a chance a community might emerge.” In his example, “I am a project manager” had a good chance to succeed, while “I am a technical” did not.
I’ve seen this in my own experience. Groups with a clear sense of what commonality binds them together are more likely to have a viable community. If someone can say “I am an underwriter” or “I am a third grade teacher” or “I am the parent of a brain injured child” or “I am a User Interface designer”, then there will be a clear match in interests with anyone else who answers the same way. In my gaming life, that even includes being a player of a certain game, or a member of a particular guild/clan within a certain game or a certain class of player.
Trying to build communities that are too broad, for example, “I am an XYZ Company employee” or “I am a management consultant” will result in some or all of the following:
· lack of participation/interest
· coalescing of smaller and more narrowly defined sub-groups who share common interests
· ineffective use of the community resource/tools
· failure to achieve desired objectives
Building a community of everyone who can say “I am a developer” will successfully segregate the programmers from the people in business operations; however, it will still be far too broad to be relevant for participants. Defining subsets of developers into “I am a Cobol programmer” or “I am a web user interface developer” or “I am a team leader” will bring people with greater commonalities together and help them start the conversations that will add value to their work and to the business.
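To make the test concrete, here is a rough sketch of my own (the people, the descriptors, and the “too broad” list are all invented for illustration) showing how you might group potential members by their “I am a …” answers and flag the descriptors that are too broad to bind a community:

```python
from collections import defaultdict

# Purely illustrative: each person completes the sentence "I am a ...".
# The names, descriptors, and the TOO_BROAD list are made up for this sketch.
statements = {
    "Ana":   "Cobol programmer",
    "Ben":   "web user interface developer",
    "Chris": "Cobol programmer",
    "Dana":  "XYZ Company employee",
    "Eli":   "web user interface developer",
    "Fran":  "XYZ Company employee",
}

TOO_BROAD = {"XYZ Company employee", "management consultant", "technical"}

def candidate_communities(statements):
    """Group people by their 'I am a ...' descriptor and flag the weak ones."""
    groups = defaultdict(list)
    for person, descriptor in statements.items():
        groups[descriptor].append(person)
    for descriptor, members in sorted(groups.items()):
        yield descriptor, members, descriptor not in TOO_BROAD

for descriptor, members, viable in candidate_communities(statements):
    status = "possible community" if viable else "too broad; expect sub-groups to form"
    print(f"I am a {descriptor}: {len(members)} people ({status})")
```

The point of the sketch is only that the grouping key is the identity statement itself; if the best available key is something as broad as an employer name, the exercise predicts exactly the splintering described above.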
April 11th, 2006
In another posting, I noted some definitions that show how fragmented our understanding of the relationship between KM and learning is. Shirley Hazlett et al. suggest that KM is in a state of pre-science and we lack understanding of the underlying assumptions: “…attempts to develop an optimal KM methodology are misplaced unless the underlying assumptions and paradigms are identified and understood…KM is currently in a state of ‘pre-science,’ wherein proponents of different paradigms have their own beliefs and values and often disagree with others about fundamentals within the field.” I believe this is true. I see it in the discussion groups I’m a member of. For me, the confusion and conflict are what make KM such an interesting problem to solve. It’s a time when everyone’s voice is equal to everyone else’s, and while we are coming up with applications, approaches, and solutions right and left, the core statements of what KM is and what it does are still very much undefined. There’s a kind of “we know it when we see it” dynamic at work.
To me, there is significant overlap in the processes of knowledge management and learning, but they are distinctly different processes. It is worthwhile for us to understand where they overlap and where they are different in order to help organizations use the benefits each offers effectively. For example, both focus on people and use or create content, but learning creates courses to close knowledge gaps, while KM measures what is already known and creates processes to capture it. Verna Allee said in 2000 “eLearning could be a cornerstone of knowledge management but most elearning companies have failed to master the basic theory and practice of knowledge management. They not only cannot intelligently speak about knowledge management practice from a marketing perspective, they don't even have a coherent internal understanding of knowledge management or a serious knowledge management strategy of their own.”
G.P. Huber identified four integral elements linked to knowledge in the organizational learning process, and suggested that knowledge is essential for learning. Those learning elements are:
· Knowledge acquisition. Knowledge may be acquired intentionally (searching) or unintentionally (noticing).
· Knowledge/information distribution. Information/knowledge from various sources must be shared: the wider the distribution, the greater the ability to learn. Distribution may be through formal processes or through informal contacts and learning by doing.
· Information interpretation. Information is given meaning and shared understandings are developed. This may occur through formal meetings and discussions, or through recursive and informal, intuitive experiences.
· Organizational memory. Knowledge is stored for future use, either formally codified (reports, memos and so on) or institutionalized in cultural values.
With a nod to the first President Bush, I have been known to say “It’s the people, stupid!” in response to business managers who get excited about KM and then lurch into action immediately to create new repositories of content in order to “capture knowledge”. Information is simply the input. Human insight is the output that changes information into knowledge.
Knowledge is critical to organizational success, as noted by Nonaka and Takeuchi, but people are the critical component in KM, not databases. I’m not saying that databases, information management and intellectual capital are not important. They are. I’m saying they are not “knowledge”. Knowledge is always the tacit wisdom contained in the head of someone with experience related to the topic. For that reason, expertise location and knowledge sharing are more important to KM than information/content management, even though both are needed. From the dawn of the species, humans shared knowledge without databases and indices.
There is a strong correlation between the importance of knowledge to organizational success and the need to nurture employees, as evidenced in the early 1990s by Buckman Labs. Nurturing so-called knowledge workers occurs through providing an environment in which they can both develop new knowledge (learning) and share what they know. Steve Barth wrote, “A knowledge worker is an asset that appreciates over time. Knowledge itself is more often a depreciating asset.” There are abundant examples of the importance of nurturing knowledge workers in the extensive internal learning and KM programs of all the large consulting and professional services organizations today, where people are the product they offer.
People are the key to both learning and knowledge management, and cultural readiness is an important component of any KM program. Providing technologies for knowledge sharing, motivating people to share what they know, and improving knowledge sharing processes are the realm of KM–for example, in efforts to “enable the knowledge worker”. Identifying knowledge gaps, and providing curricula for education and to create personal knowledge are the realm of learning.
Learning and KM Alignment - the point of divergence
I posed the question of the relationship between learning and KM to a discussion group recently, and here are some of the comments. Joe Firestone, who has written books that cover this topic, said, “KM is not a subset of learning, but is the set of management activities intended to enhance learning processes.” Matt Moore, with IBM’s Business Consulting Services division, said “most of those involved in learning…know that most learning does not occur within classrooms - but rather on the job. Coaching and mentoring programs can help here, but increasingly they are looking to knowledge management for support around ‘just in time’ learning programs.” Some educators see learning as a product that they produce. Euan Semple commented that his organization named their KM program “Informal Learning” and it was difficult for some in the training group (where the KM initiative reported) to accept. “You are right though that some in training found this a challenge. The idea that the best knowledge is out there, current and manifest in conversations, can be challenging to those who have made a career out of dispensing it as a product.”
Matt Moore further highlighted the strain of differences in a KM or a traditional learning approach: “People tend to fall back on what they know, and if you are a fantastic workshop facilitator or a great instructional designer, then nurturing a community of practice or running ‘lessons learned’ activities can be an alien experience. The tensions between JIT Learning & prescriptive curricula are also becoming apparent.” In another communication that presents a positive relationship between knowledge sharing and learning, Mark Spain said he advised a small business owner to use structured team discussions of unusual occurrences to enhance organizational knowledge and create a learning organization. “If you learn to review critical incidents with respect, openness with each other and a willingness to improve by tackling the difficult or embarrassing aspects of the conversation, you are starting to be what the theory describes as a Learning Organisation. You will know you have the opportunity to learn (or change) if you feel uncomfortable in parts of the process but get support from each other to continue because it adds some value to each other.”
F. J. Miller provides some fine-tuning for learning, saying that training is explicit (i.e., information delivery) and learning is tacit (i.e., the making of personal meaning). Experiments conducted by BHP Engineering attempted to understand the meanings people attach to certain key words in the workplace. When the word training was thrown into the ring, surprisingly, it typically evoked negative reactions. Words like teaching, classrooms, schedules, assessment, authority, competency measurement, control, accreditation, dependency, tests, discipline, boredom, and manipulation covered the whiteboard in the room. Learning, on the other hand, generated a quite different and more positive list, evoking such responses as: self-direction, understanding, enthusiasm, self-pacing, independence, open discussion, success, commitment, freedom, ease of access, excitement, maturity, and honesty. Despite these very different perceptions and responses, organizations still continue to use the language of training and learning virtually synonymously.
In Summary…
Knowledge management and learning work in tandem for greatest organizational and individual effect. In a nutshell:
Knowledge exists only in a person’s brain and it is unique to each person. Learning is a building process for creating knowledge. Knowledge is the product of learning. Knowledge in a person’s head becomes information as soon as it is written or transmitted. Information is used to develop learning modules. The learning process passes structured information to brains, where it is selectively converted to knowledge as the information gains personal meaning. Information can be organized and managed; knowledge cannot. Knowledge management is an oxymoron, but using KM techniques to enhance learning initiatives results in a wide range of organizational benefits.
April 11th, 2006
Recently I had some discussions with colleagues and friends about the relationship between learning and knowledge. I captured one of those conversations between me (“A”) and a good friend (“B”), and I’m reprinting it here (with permission, of course):
A:
“What is the relationship between learning and knowledge?”
B:
We know what we’ve learned.
A:
Can you know without learning?
B:
I don’t think so.
A:
Do you know enough German to know the difference between kennen and wissen?
B:
No, I have not that knowledge, for I have not learned German
A:
(smile)
Okay. Back to knowing without learning.
I think you can know without learning. I think you can “know” how to stop breathing when you are under water or that food is necessary. Things like that. Babies don’t come out learning to eat. They know to eat, and learn HOW to use utensils to eat.
B:
Is that knowledge, or reflex?
After all, the first time you try breathing under water and inhale a mouthful of water, you’ll *LEARN* not to do that.
I’m not sure you’re born with that knowledge.
A:
I want to get to a definition of knowledge, not debate reflexes vs. learned behaviors. I’ve been reading a lot of definitions recently, and they are all somewhat wrong. Some are partially right, but terms are not defined in the same way. I’m trying to close in on a working definition I can propose.
B:
Well, there are some things then that you don’t learn…for example, you never “learn” how to make your heart beat… it just does. Is that knowledge?
A:
Right. I was trying to avoid inherent biological functions. Though maybe they are “known”…but I think they are unconscious instincts.
B:
They are definitely not known.
A:
But on some level it is a knowing. Maybe in the midbrain or something there is an unconscious “knowing”.
B:
I don’t *know* anything about digestion, kidney functions, etc
A:
But your medulla oblongata does, because it monitors and manages those functions. Your higher brain may not.
B:
It doesn’t “know”, it responds to chemical stimuli.
A:
Don’t get into chemicals…if it comes down to it, everything in the body is chemical stimuli and response.
B:
That may be, but some of those chemical stimuli and responses result in “knowledge”, and an awareness of that knowledge, whereas other stimuli just amount to my stomach digesting the hummus and chips I’m eating.
A:
So, in 1967 Michael Polanyi identified that there is “tacit knowledge” and “explicit knowledge”. I think we are talking around that here. Tacit knowledge exists in the organism/brain and is information that has been filtered through your own personal experiences and reflection to have a personal meaning. Explicit 'knowledge' (as he called it) is the stuff that gets written down/codified. I believe, however, that anything that is codified is no longer knowledge. It is information. Information with more or less value to others, but information all the same…a kind of higher data.
B:
He was a physicist. What does he know anyway? :-)
A:
Here’s what I’m proposing for a definition. What do you think? I think it’s stuffy and arcane, but … many are worse!
Knowledge management is a process that uses a variety of social tools and technologies to capture information that an individual has absorbed and modified, using their own personal experiences and personal understandings as a filter, into a modified iteration of information that can be reviewed and used by others.
B:
So is “information” higher or lower than “knowledge”?
That is, is there “information” that is not “knowledge”, or is there “knowledge” that is not “information”?
A:
There’s no higher or lower. Information is used to create knowledge and it can be the product of knowledge. That’s a good question, though. I’ll have to think it through.
B:
Bottom line is, what’s the difference? Because according to your suggested definition, “knowledge management” is about “information”, not “knowledge”.
A:
Bingo! The difference is what can you manage. You can manage information but not knowledge. KM is an oxymoron.
B:
So why not call it “information management”?
A:
Because IT already owns that, and it’s about databases and fields and variables. KM also has a people/expertise component…an evaluation is implied. It’s not just hard, analytical data. It’s also opinions, observations, assessments, plans, etc. Plus, managing knowledge sounds more important than managing data.
B:
So I think that that’s the crux of the difference… to me, information is more facts and knowledge is more interpretation/analysis.
I can say “the temperature outside is 41 degrees”… that is more information than knowledge, though you could say that you now know the temperature in Columbia… but that’s because the verb “to know” is limited.
A:
Well, that’s not exactly correct. Information is facts, but it can be more than facts, because not all information is true. I’m assuming here that facts are true.
Knowledge is information that is passed through one’s personal filters of all kinds and interpreted.
B:
False information is still information
A:
But you said information is facts. Facts are true, therefore, the implication is that information is true. I’m saying not all information is true.
B:
To me, information management is about databases and fields and variables, but lacks the insights gained through analysis, and those insights are knowledge. So maybe in a way, information is barebones and primal, while knowledge has to do with interpreting information and extracting stuff that goes beyond the information. For example, take a profile captured by a radiation detector: it’s a bunch of numbers, representing the number of gamma particles hitting the detector at regular intervals. I can give you such a profile: that is all the information there is, and we put that in a database. However, if you have the knowledge, you can analyze the profile and figure out what material it was that generated those numbers. So, information management has to do with capturing/managing the raw numbers, while knowledge management needs to capture/manage the analysis by which someone determines whether those raw numbers are generated by one radioactive material or another.
A:
Not exactly…KM can’t capture/manage the analysis process, just the results of it. And that is exactly the point! Knowledge takes a brain! Which means people hold knowledge, not databases. You can’t capture knowledge. Once it can be put into a database, it is no longer knowledge, even though someone may still know the information. It is simply information. From a KM standpoint, we are less concerned with what information is captured or how it’s stored. We are more concerned with finding ways to entice people to share information so it can be captured and later retrieved by others for a variety of reasons, learning among them. So KM has to do with creating an environment where information can be shared, while maintaining a record of who knows the details of the shared information to make it possible for a seeker to find the knower and get enhanced information from him/her - with technology or otherwise - and rewarding people for participating.
A:
The reality is that KM practitioners are not going to call themselves information managers, because there are big interpersonal components to the KM practice that go way beyond building databases and input tools. For example, communities and how they create new information through member interaction, social networks and how they affect the flow of information, process workflows, etc. Expertise location capabilities. Rewards and recognition. Valuing of intellectual capital assets. Identifying information needs. All of those are part of KM.
A:
In general, I assume KM is a specialized type of information management — one that is partially the same as conventional information management, yet more. It’s a specialized type of management, because it actually creates the environment in which people can interact in a way that facilitates exchanges of personal knowledge between people…typically via codifying the information and storing it somewhere for reuse by others.
A:
The management comes in the managing of the variables that enable the free flow of information from someone who has it to someone who needs it. We can’t manage knowledge. (And I think that’s why many people think KM is the wrong discipline name for what we do - but we can’t seem to come up with another one!)
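Rereading B’s radiation-detector example, here is a toy sketch of my own (the numbers, topic, and contact are invented) of the split we kept circling: the raw counts are information a database can hold, a crude screen over those counts is still just information processing, and identifying the material takes a person, so the most a KM system can manage is a pointer to who knows how to do that analysis.

```python
# Toy illustration of the information/knowledge split from the conversation above.
# All numbers, topics, and contact names are invented for this sketch.

# Information: raw gamma counts per interval. A database can hold this.
detector_profile = [12, 15, 14, 230, 512, 498, 220, 16, 13, 12]

# Expertise location: a record of WHO can interpret such profiles.
# KM can manage this pointer; it cannot manage the analyst's knowledge itself.
experts = {"gamma spectrometry": ["j.doe@example.org"]}

def looks_like_a_peak(profile, background=20):
    """A crude screen: are any counts well above background?

    This is still information processing. Deciding WHICH material produced
    the peak takes a person with the relevant experience and tacit knowledge.
    """
    return any(count > 10 * background for count in profile)

if looks_like_a_peak(detector_profile):
    print("Possible source detected; route the profile to:",
          experts.get("gamma spectrometry", []))
else:
    print("Nothing above background; archive the raw counts.")
```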
You can see that the conversation digressed from KM’s relationship with learning. If you have any thoughts on how knowledge, learning, knowledge management, and information management are related, I’d love to hear them.
March 15th, 2006
In the March 2006 issue of The Source for KM Professionals, Chris Collison, one of the authors of the famous narrative on the development of KM in British Petroleum called Learning to Fly, discusses the value of metrics in KM. One of his statements really struck me. He was talking about how they had gone back to various participants in the KM program and asked them for stories that would demonstrate the value of KM to executives, and said “Did they have credibility as stories? Absolutely yes - because of who was telling them. Did the stories inspire others and give momentum to what was going on? Definitely.”
To me, this is the only way that KM practitioners can demonstrate the value of KM. KM will always be one of many influencing factors that results in more sales or cost avoidance or expense reduction or better products. What gives KM credibility is having the business leaders who have experienced value from the system stand up and say so. Other business leaders will listen and believe, because of who is telling them. As KM practitioners, we can report the same statistics or make the same claims, but it won’t have as much credibility as having the business person tell the story and give the credit.
I think that may change how I next approach looking for budget approvals or increases. I plan to quote the executives who have a value story to tell about KM, and let them make the sale for me. Think it will work?
March 13th, 2006
This morning I received a copy of the Ark Group’s most recent study results on e-learning. While it is UK-based and focuses on technology, it shows a shift has occurred from what I observed personally in the U.S. several years ago, and shows some maturation of the field, as well as the beginnings of standardization in tools. In the late 1990s most trainers and corporate educators were largely unfamiliar with using personal computers. They were very experienced with face-to-face interactions with students, developing learning plans, and cutting, pasting and photocopying materials for courses. Learning management systems were new, challenging to learn, and not terribly intuitive for novice users — both instructor and student.
The Ark study indicates that e-learning systems have changed from inhouse developed applications to more mainstream technologies. It also says that e-learning has failed to deliver both satisfaction and ROI, when on the surface, e-learning ought to be an easy sell, to both management and students. E-learning, like knowledge management, talent development programs, and employee communications, simply must be able to demonstrate a justifiable return on investment for management to be able to continue to support it. It’s not enough in today’s business climate for well-intentioned (and intelligent) educators and HR professionals to say that learning/development programs are intuitively worthwhile. Financial people need hard results they can attach dollars/pounds/euros/yen to. Identifying the right metrics to use to measure learning impact may be the trickiest part of implementing and maintaining a learning management system.
Here is a list of metrics from the study, submitted by an Australian hotel chain:
· Positive audit on compliance
· Improved availability and accuracy of reports with reduction of time in delivery
· Decrease in cost of providing training
· Decrease in time to have an employee workplace ready
· Increase in number of employees who believe they have sufficient opportunity to develop skills/eligibility for other roles or promotion
· Satisfaction increase in access to reports and data for analysis
· Increase in effectiveness of management reports
· Increase of approval for all course applications
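As a crude illustration of how one of these might be quantified, here is a small sketch of my own (the figures are invented) for the fourth item, the decrease in time to have an employee workplace ready, computed from before-and-after records:

```python
# Invented figures: days until new hires were judged "workplace ready",
# measured before and after an e-learning rollout.
days_before = [21, 25, 19, 30, 24]
days_after = [14, 16, 12, 18, 15]

def mean(values):
    return sum(values) / len(values)

reduction = mean(days_before) - mean(days_after)
percent = 100 * reduction / mean(days_before)
print(f"Average time to workplace-ready fell by {reduction:.1f} days ({percent:.0f}%).")
```

Most of the other items on the list would need a survey or a cost report behind them, but the shape of the calculation is the same: a baseline, a post-rollout measurement, and a difference that finance can attach a currency to.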
I found it interesting that so many high-ranking responses to the questions mirrored the pre-Internet way of education, showing that many educators still appear to be looking for a balance between the “old way” and e-learning (and perhaps reflecting an aging training workforce?). For example, on the question “What would you say are the main drawbacks of e-learning?”, high-ranking responses included “Lack of face-to-face contact” and “Reduced interaction between tutor/pupil”. On the question, “How valuable would you say e-learning tools are to corporate training?”, the top answer was “Useful, but not essential.” The survey was obviously geared to corporate educators, though it’s unclear from the study report how many participants there were. It would be interesting to see if a survey of learners of a variety of ages using e-learning tools would have the same responses.
Read the study, called State of the Art: E-Learning 2006, from Inside Knowledge magazine here.
February 22nd, 2006
The more I read about KM, the more humbled I am by the insights and wisdom of the people who have attempted over the years to bring the field into focus. Every time I think I have had a brilliant insight, I seem to find a paper from 1999 or 2002 by someone who articulated what I just discovered beautifully. Instead of feeling embarrassed or wanting to disprove what they say, I feel joy! First, the fact that I got to an idea that someone else thought important to publish is invigorating and reaffirming, and secondly, I feel awe that the human brain is such an amazing instrument. At times it makes me want to give up writing about the things I’m learning, because I don’t see how I can improve upon what someone else said, yet I also often have the experience of reading a phrase and having it spark new insights for me.
I am intrigued most by the human dynamics of KM. One of the fuzzy areas to me, and I think to many people in the field, is how learning and knowledge management are related. I’ve seen several models representing the dynamics of knowledge and learning, and frankly, I haven’t found them compelling. They feel incomplete. It’s hard to create a complete model when each author starts from scratch and defines terms in their own way. The effort to compare apples to apples becomes daunting. I’m starting to sound like a broken record to myself, but it would be very helpful to have definitions we can all agree to and use consistently! To move KM forward, we need to build upon a base of …knowledge… we all share. I think we will spin in place until we do. This is the first part of a three-part series on the intersection of learning and knowledge management, and it deals with…definitions!
We All Think We Know
Learning and knowledge are things that everyone thinks they know about and understand (we all went to school, after all), but when you probe a little, few people really do. Our understanding is colored by assumptions filtered through our own experiences, rather than by unassailable and commonly accepted definitions of terms that can form the base for understanding and scholarly discourse.
You’ll see what I mean if you try this little experiment the next time you are out with friends or colleagues. It’s guaranteed to liven up the dinner conversation! I asked a small sample of people, “What is the relationship between learning and knowledge?” Most people become somewhat confused by the question, but, with a little pushing, you’ll receive answers that are more or less insightful, depending upon the person. I’m willing to bet, though, that your average response will be something along this line: “Well, learning is something you do, and knowledge is…what you know!”
People in learning/teaching/training professions tend to consider knowledge a product of their work and knowledge management to be one of the tools of their profession. People who work in knowledge management typically take a different view, seeing learning as a knowledge creation process and separate from the knowledge management process. This is mainly an academic point, until you happen to be in an organization that wants to develop a knowledge management system or improve its learning delivery approach. Battles can develop quickly over who dominates on the organization chart. Clearly, overlap exists in the terminology, technology, and approaches used in both areas, yet most “corporate U” people and “KM” people don’t themselves understand how it all works together. Instead of collaborating, they can waste a lot of energy and resources on staking out and trying to hold their turf. One only has to review the well-known case studies of organizations like Xerox and British Petroleum to see that there is no mention of learning or the relationship between KM and learning when knowledge management activities and metrics are described.
The organizational struggle between learning and KM reminds me of the organizational factionalism that exists between corporate communications and public relations departments, which have more commonality than difference. It’s short-sighted feudalism, and detracts from getting work done.
Some Working Definitions
In 1998 Nancy Zurbuchen said “The subject is too young for fads, let alone for tried-and-true disciplines; but it is old enough to need a vocabulary.” Lack of widely accepted definitions remains one of the biggest problems with KM today.
Some see KM as a business process, some as information or content management, some as a toolkit, some as a technology. As an example, I have reviewed knowledge management presentations from a number of organizations and individuals, and inevitably in their presentations they offer a slide that defines knowledge management. They are amazingly different! Each definition of KM is skewed toward the skill set or product they want to sell…software, business process improvement consulting, call center efficiency, “knowledge” capture. Inability to define what it is we do and want to do in a common way is limiting. Perhaps we can get closer to a definition by comparing KM and learning.
According to the Wikipedia, learning is:
“…the process of acquiring knowledge, skills, attitudes, or values, through study, experience, or teaching, that causes a change of behavior that is persistent, measurable, and specified or allows an individual to formulate a new mental construct or revise a prior mental construct (conceptual knowledge such as attitudes or values). It is a process that depends on experience and leads to long-term changes in (an individual’s) behavior potential.”
To sum this definition up, learning occurs in the brain when new information is gained for an organizational purpose (in the business world) or self-betterment.
Knowledge is defined as both “information of which someone is aware” and “the confident understanding of a subject, potentially with the ability to use it for a specific purpose.” Sadly, I believe this definition misses the target entirely. Serious work is clearly still needed on this definition (which is a problem with using Wikipedia definitions). Knowledge may a priori require awareness of information, but understanding is not inherent, and potential applications of knowledge are not relevant to defining it. Let’s try another source.
The American Heritage Dictionary defines knowledge as,
· The state or fact of knowing.
· Familiarity, awareness, or understanding gained through experience or study.
· The sum or range of what has been perceived, discovered, or learned.
This is pretty good, but hard to turn into a compelling bullet point for a management presentation.
Tom Davenport (1998) offers this definition, “Knowledge is information combined with experience, context, interpretation, and reflection. It is a high-value form of information that is ready to apply to decisions and actions.” The first statement is correct, however, knowledge is a “high-value form” of information only to the knower. Knowledge is completely personal, and is no longer knowledge once it is codified. Knowledge cannot be codified or transferred — it can only exist in a brain, where the personal experience and personal understanding resides. Michael Polanyi defined personal knowledge as tacit knowledge in 1967, saying “we can know more than we can tell.” Ikujiro Nonaka built on that work, saying “tacit knowledge has a personal quality, which makes it hard to formalize and communicate.” I agree that these refer to knowledge. They unwittingly started us all down the path of confusion, though, by defining “explicit knowledge”, which is actually “information” and not knowledge.
Here’s a further example from the same page of the Wikipedia that shows the confusion between knowledge and information even more specifically:
“Knowledge management treats knowledge as a form of information which is impregnated with context based on experience. Information is data which causes a difference to an observer because of its observer-specific relevance.” There are so many things wrong with these two statements that I don’t want to take the space to point them out. This definition doesn’t further the discussion at all. To attempt to clarify, I would revise it to say “Knowledge management is a process that uses a variety of social tools and technologies to capture information that an individual has absorbed and modified, using their own personal experiences and personal understandings as a filter, into a modified iteration of information that can be reviewed and used by others.” Not perfect, but it’s closer.
Only information can be documented/captured. Knowledge exists in the brain. Which begs the obvious question: Can one manage what is inside a brain? Of course, the answer is no. So as aspiring KM practitioners, we have a profession with a name that is an oxymoron (as David Skyrme first noted), we can’t define the boundaries of the playing field, and we can’t articulate how learning and knowledge management are related. Reminds me of that classic print ad of the elderly male executive in a wing-backed chair with a list of negatives beside him, saying “And you want to sell me what?” Skyrme offers a pretty good definition for KM: “Knowledge management is the explicit and systematic management of vital knowledge and its associated processes of creating, gathering, organizing, diffusion, use and exploitation. It requires turning personal knowledge into corporate knowledge that can be widely shared throughout an organization and appropriately applied.” It shares the problem of the Davenport and Nonaka definitions above, calling “vital knowledge” and “corporate knowledge” knowledge when they are actually information. Personal knowledge is the only knowledge.
I think Paul Hildreth and Chris Kimble are closer to understanding what knowledge management is about. “Can knowledge be managed or can we just facilitate the development of a person’s knowledge? Is the knowledge being shared or an environment being created where a person develops their knowledge through interaction with, and guidance by, an old-timer?” The KM process to me is about creating an environment where a person interacts with others and is guided by their experiences — a neutral environment adapted to the unique requirements of each organization and its members.
I was having a conversation with a friend yesterday and he asked, “Is there ‘information’ that is not ‘knowledge’, or is there ‘knowledge’ that is not ‘information’?” That became an interesting conversation, so it will be the third part of this series. More soon on the overlapping strategies and tactics of KM, collaboration and learning.
February 16th, 2006
Anyone who is reading this probably already knows something about knowledge management, so I’ll ask you something that’s nagging at me. Why, after nearly 15 years of more or less organized thinking, debate and studying of KM, haven’t we collectively been able to:
· Define what knowledge management is
· Create an unassailable model of how it works
and perhaps more importantly,
· Sell the KM value proposition to organizations that clearly need it?
It seems fundamental to further productive discourse, and yet, we can’t seem to resolve these basic questions. Why? There are many bright, educated, intelligent, capable, interested, articulate, clear-thinking people involved with this work. Some amazing insights and results and benefits have been captured and tested and reported. Why after all this time and effort and energy have we been unable to unify all our experience and insight and results into a single-minded understanding? Why are we still seeing the legs and trunk and tail of the elephant and not the elephant? This profession (if that is what we allow ourselves to be called) is churning. The tires are spinning and the steering wheel is being turned this way and that, but we can’t seem to get the traction that will break our inertia and send us moving down a road. I’m not sure any of my questions below will help to find an answer, but over the years I have learned that when I want to understand where a blockage is or why something is stalled, I have to challenge all the basic assumptions about it to be sure they are true. Here is how I am puzzling out possible reasons for the wheel spinning.
Is it because KM is truly a new approach to how people work and this is part of the normal slow startup curve? If that is so, there must be parallels in nature. Nature has a model for everything we do. The human race is working out a new way of interacting. Perhaps this is a Cambrian-explosion-like period for KM, teeming with possibilities, a time of immense creativity and variety that will eventually resolve into a few versions that are viable. Can we borrow a model from nature to accelerate our thinking?
Is KM only a fad, as some have said? We don’t want it to be a fad. We are putting energy and thought into figuring it out, and we believe something is in there, but are we wishing something into existence that doesn’t really exist? Is KM just a tactical step on the path of CRM or collaboration or some other interactive process that we have attempted to elevate to more than it is?
Are we assuming incorrectly that all organizations need KM? Perhaps we should refocus our efforts toward defining and prioritizing who really needs it and will benefit from it, and what value they can expect from it. Most organizations that are interested still consider it a “nice to have” and not a business necessity. We need to focus on the ones that know they need it. Perhaps we could focus on creating a tiered approach, a way of defining organizations on the basis of the types or amount of value KM could bring to them — a Maslow-like pyramid of organizations with criteria around it.
Are we putting a lot of thought and effort into KM when it is only a subset of a larger concept no one has yet defined? Before Chaos Theory was formulated in physics, scientists dealt with a lot of subsets of “something”, but they couldn’t quite understand what the “something” was that would make it all hang together. The same was true with the discovery of gravity. And the Theory of Relativity. Until then, a lot of observations were made about results, but no one knew how to make all the results make sense in a bigger framework. Maybe we are in a similar situation with KM, and we are churning on the subsets and missing the bigger picture.
Are we preaching to the choir too much, and excluding new or different voices? The maxim goes “the devil you know is better than the devil you don’t know.” It’s comfortable to be a big fish in a small pond. It’s easy to become complacent and stop questioning. Some of the leaders of the KM community have been around since the earliest days, and, while they have many good insights and much wisdom to offer, some seem more interested in being acknowledged by their peers than making new contributions to the KM discussion. Perhaps we are all coasting a bit, and pooh-poohing the new ideas and contradictory views offered by new and different voices. After all, it’s human nature to resist change. It just doesn’t advance the field.
Is KM a product or a process? There is a lot of debate that goes on in KM circles about what constitutes KM. Some experts talk about it as if it is a product…as the result or outcome of some processes that occur in an organization. Some talk about it as if the processes themselves, a particular collection of steps or actions, comprise KM. It’s interesting to hear the debates, because it’s like the blind men and the elephant. Both views are correct, and neither is the whole picture. We can’t get our hands around KM because none of the prevailing theories can encompass both points of view, and the correct understanding of KM must include both concepts.
Is the concept of KM being hijacked by a small group of consultants? Each person or group with an idea (trademarked, of course) throws it out to the world as KM and tries to advance it against the others. Some of these are untested hypotheses, and some are tactical methodologies. It’s a kind of “capture the flag” game, where one consultant raises a flag and then another one steals it away and carries it to their home base, only to have it snatched away a few months later by someone else. Even the software vendors get into this game. They have gone so far as to hijack the term “knowledge management” and equate it so successfully with technology that we may need an entirely new name for what we really do (which is only supported by technology). KM is either universal or it’s a subset of something that is. We have no governing body that represents what KM is or should be, how it works, and who is qualified to consult on it.
Is the factionalization of competing KM theories and methodologies confusing both customers and ourselves? We have vertical silos in KM, like portals or repositories or just in time learning or creating capabilities or intellectual capital or knowledge markets. And we have horizontal approaches, like communities of practice, social networking, collaboration or organizational storytelling to cut across silos. They are floating in a fuzzy sea of “improving decision making” and “empowering the knowledge worker”, cultural change, knowledge transfer, “the new KM” and the KM of complexity. They can’t all be right, yet they are partially right. How can we assess which are true? It’s confusing, and it sets off interesting and heated debates, most of which occur among the very people who stand to profit from having their own approach accepted as the authoritative one! KM is confusing to KM scholars and practitioners, and it’s even more confusing to business people…who don’t have time to make heads or tails of it. They just table the conversation!
Do our non-scientific or non-financial backgrounds make it difficult to produce results that can be accepted critically? At this point in time, KM is largely learned by doing. If we are lucky, we have a customer and a budget and learn under optimum conditions. KM is still new enough that only a few academic programs offer a KM curriculum track (and even fewer organizations want to hire them!). This means that most of us came to KM from other careers, and most of us were trained in the “soft skills,” not in hard analytical skills like math and science. The scientific evidence to support the claims of KM simply isn’t there yet because so few of us know how to apply scientific rigor to our work. Unfortunately, the people we need to convince tend to be analytical people, and that is what they want to see.
Each of these topics is worth separate discussion and debate, and because I haven’t read everything published, perhaps some good work I have missed has already been done. Speaking for myself, I’m bored with so-called KM conferences on content management and search capabilities and taxonomies and portals, and I’m even getting bored with seminars defining communities and valuing intellectual property and identifying incentives to share knowledge and hype cycles and KM infrastructure and causal maps and social networks and knowledge transfer. Where is the big picture? We need to pull ourselves up out of the weeds and find the grand unifying theory of KM.
February 15th, 2006
Following up on my previous post about Fahey and Prusak’s 11 Deadliest Sins of KM, I decided to offer another view and a new idea. First, an example of the 11 Deadliest Sins being cited in a research paper. Elisabeth Davenport and Blaise Cronin used the “sins” to offer examples related to their concept of the evolution of KM. Their approach says KM1 = information management, which evolved to KM2 = processes and ontologies, which evolved to KM3 = knowledge as capability, where people are put back into KM and where we are today.
Deadly Sins of KM (After Fahey and Prusak, 1998)
1. Not developing a working definition of knowledge - KM1
2. Emphasizing knowledge stock to the detriment of knowledge flow - KM1
3. Viewing knowledge as existing predominantly outside the heads of individuals - KM1 KM2
4. Not understanding that a fundamental intermediate purpose of managing knowledge is to create shared context - KM1 KM2
5. Paying little heed to the role and importance of tacit knowledge - KM1 KM2
6. Disentangling knowledge from its uses - KM1
7. Downplaying thinking and reasoning - (none given)
8. Focusing on the past and the present and not on the future - KM1
9. Failing to recognise the importance of experimentation - KM1 KM2
10. Substituting technological contact for human interface - KM1 KM2
11. Seeking to develop direct measures of knowledge - KM1 KM2
Personally, I’d prefer to see a new approach to the original list altogether. Here’s my suggestion for “Axioms (and Corollaries) of Knowledge Management”. There happen to be 11, since I started where I did, but maybe there are more. Any thoughts?
Axioms (and Corollaries) of Knowledge Management
1. Knowledge can be defined.
Corollary: We have not yet defined knowledge.
Corollary: We have not yet defined knowledge management.
2. Knowledge management is a process dependent upon people and what they know.
Corollary: Knowledge management generates information artifacts.
Corollary: Information artifacts are used to generate new knowledge.
Corollary: Knowledge cannot be codified.
3. Knowledge cannot exist outside the heads of individuals.
Corollary: Information can.
4. Knowledge exchange requires a shared context between individuals.
Corollary: Knowledge can be exchanged or created within a shared context.
5. Tacit knowledge is the true knowledge and cannot be managed.
Corollary: To capture tacit knowledge is to make it explicit and convert it to information.
6. Applications of knowledge are not the same as knowledge.
Corollary: Using knowledge is not knowledge management.
Corollary: Knowledge is separate from its uses.
7. Thinking and reasoning are the engine of the KM process.
Corollary: Thinking and reasoning result in knowledge.
Corollary: Communicating the results of thinking and reasoning creates information artifacts.
8. Documenting the past has value when no changes are anticipated.
Corollary: The future can be influenced by today’s thinking and reasoning.
Corollary: Documenting the past is content management.
9. Experimentation is crucial to improvement.
Corollary: Experimentation will occasionally result in failure.
Corollary: Experimentation can result in big successes.
10. Human interactions cannot be replaced by technology.
Corollary: Knowledge development and exchange occurs in people’s brains.
Corollary: Technology provides a means to capture discussions and convert them to information artifacts.
Corollary: Knowledge management is not technology.
11. Knowledge cannot be measured directly.
Corollary: Knowledge has value to an organization.
Corollary: Conventional balance sheet metrics do not adequately measure knowledge.
Corollary: Information resulting from knowledge management can be measured.
February 14th, 2006
A few months ago, John Maloney reprinted this list for reconsideration. Larry Prusak studied some 100 knowledge projects during the 1990s, and the list represents his and L. Fahey’s joint understanding of the problems facing knowledge management in 1998. It’s been quoted often. In researching the list, I found more than 30 applications of it in all kinds of research papers.
The main issue I have with the original list is that it’s a mixed bag of admonitions. I get the sales hook of using “deadliest sins” in the title, but the list is confusingly written. The audience is not clear either. Is it KM professionals? Business people? Newcomers to the field? Some points seem to play to some but not to all. Of course, it was written in 1998, and that was fairly early in KM’s evolution, so this list may have been Fahey and Prusak’s straw man — a first attempt to provide some guidance to a new field — that was never intended to be the Rosetta stone for KM eight years later (even though we still face many of the same issues). To make their list easier for myself to understand, I revised it, taking into account the common knowledge of today, and decided to share it:
The 11 Deadliest Sins (Fahey and Prusak, 1998)
1. Not developing a working definition of knowledge
Comment: We’ve had a lot of working definitions of knowledge since 1998, and KM experts are still in strong disagreement about what it is. We’ll get there, but for now the problem is failure to define it.
Suggested change: Failing to define knowledge.
2. Emphasizing knowledge stock to the detriment of knowledge flow.
Comment: These terms have been somewhat superseded and create confusion. Knowledge stock refers to information — objects or artifacts that can be put into a database and retrieved using a search engine. Knowledge flow refers to the process of creating and reusing knowledge.
Suggested change: Emphasizing content artifacts instead of knowledge flow.
3. Viewing knowledge as existing predominantly outside the heads of individuals.
Comment: Many people believe erroneously that knowledge refers to documents and other information artifacts. Knowledge is not equal to information. Knowledge is only found and created in brains. The rest is information.
Suggested change: Believing knowledge can exist outside the heads of individuals.
4. Not understanding that a fundamental intermediate purpose of managing knowledge is to create shared context.
Comment: Knowledge can be shared when the person having it and the person needing it have the same understanding of the parameters. Creating an environment where contexts are shared is vital to effective knowledge exchange. KM has to do more than provide and catalogue artifacts. It has to help people to talk about circumstances.
Suggested change: Believing that creating shared context is not an important milestone in the process of managing knowledge.
5. Paying little heed to the role and importance of tacit knowledge.
Comment: Tacit knowledge is, by definition, unspoken. If anyone thinks that information captured in a database is the whole story, they are wrong. That point of view neglects the “people” dimension of KM. The knowledge that matters most is often situation-specific variables that are known but not documented.
Suggested change: Failing to understand the role and significance of tacit knowledge.
6. Disentangling knowledge from its uses.
Comment: Knowledge isn’t tangled in its uses, since knowledge only exists in the brain of the knower. Knowledge is de facto separate from its uses, but knowledge can be applied to new situations.
Suggested change: Confusing information creation with applying knowledge to new situations.
7. Downplaying thinking and reasoning.
Comment: Thinking and reasoning are human traits, and are critical to the knowledge creation process. Think about it. Would you rather look up a report in a database, or talk to someone who has experience with the issues? The human component in KM is currently undervalued.
Suggested change: Overlooking the importance of thinking and reasoning to the KM process.
8. Focusing on the past and the present and not on the future.
Comment: The process of documentation is always backward-facing. What’s important is the future — better decision making, improved time to market, faster processes, greater competitiveness, smarter workers. It’s useful to know where we’ve been, but not at the expense of building a road to the future.
Suggested change: Documenting the past and present, and ignoring the future.
9. Failing to recognize the importance of experimentation.
Comment: KM is a new field. All innovation creates change. The KM process is new and requires experimentation to get it right. Experimentation sometimes results in failure. Many businesses have low tolerance for failure. Knowledge results from both success and failure.
Suggested change: Failing to acknowledge the importance of experimentation and failure.
10. Substituting technological contact for human interface.
Comment: Message boards, email, and online collaborations of various sorts will never fully replace human interaction. Capturing and documenting information can’t replace a live, nuanced conversation that establishes context and facilitates a transfer of knowledge from one person to another.
Suggested change: Substituting technological contact for face-to-face interactions.
11. Seeking to develop direct measures of knowledge.
Comment: “Direct measures” appears to mean quantifiable measures, such as might be used for accounting and valuation purposes. There are many viable measures of knowledge management success today, depending upon which aspect of KM one wants to measure. Some experts have suggested ways to account for the value of KM’s “soft” benefits, and over time these are likely to be more commonly accepted as society keeps moving toward a service economy. Bearing in mind that “knowledge” is not the same as “information”, information can definitely be measured and valued using current balance sheet metrics. Knowledge has to be assessed differently, perhaps as a component of the value of an individual to an organization.
Suggested change: Attempting to measure knowledge using the metrics of balance sheets.
Of course, this is my own interpretation. Did I misstate anything or lose the original intent? Comments and criticisms are welcome, as always…
February 2nd, 2006
In addition to understanding the risks executives see in KM, the KM team will need to assess organizational barriers and create strategies for overcoming them. I’ve listed below some you may encounter. Before you begin a KM program, it’s helpful first to understand the context of the KM challenge you are attempting to master, to be sure you apply the right solutions to the problems.
KM Context: Content, Collaboration or Process
Large consulting organizations approach KM in different ways, depending upon their own in-house expertise and intellectual property. That’s why, if you collect their presentations, proposals and published materials, you find differences that can sometimes be confusing to someone just starting to understand knowledge management and what it can bring to an organization. This variety of approaches can make it difficult for a KM leader to reconcile viewpoints about the right KM options to apply to a specific problem. You may have experienced this if you have spoken with PricewaterhouseCoopers, Accenture, KPMG, McKinsey or IBM. Each has excellent expertise in knowledge management, and each makes recommendations that necessarily reflect the strengths or viewpoint of their individual organizations.
This can be confusing for managers tasked with finding a KM solution for a complex business problem. Before you launch a KM initiative, whether you use outside consultants or do it yourself, it’s helpful to understand the three broad categories into which KM projects/programs typically fall, so you can apply the right KM tools. Each has its own challenges and barriers to overcome. One way to classify them is Collaboration, Process, or Content (which are similar to what we used to call people, process and technology). If you can assess the broad type of problem you are trying to solve with KM, you have a better chance of getting to the right solutions.
In the collaboration or personal interaction scenario, executives want people to work together differently/better. First determine what information is needed to support decision making. Next, identify and establish virtual communities. Provide incentives and make it easy for people to collaborate and share information on current and past projects. Identify experts, make them visible, and reward them for sharing what they know. Acclaim by peers is a heady elixir. Measure community activity and employee satisfaction.
In the process scenario, executives are focused on improving how a type of work gets done. First determine what information is needed to support decision making. Next, identify and document key business processes. Within those processes, define the steps where knowledge is created, needed, captured and used. Create plans for how to improve those knowledge points. Redefine the business processes and publish/communicate the changes. Measure sales increases, cost savings, customer satisfaction, and improved speed to market.
In the content scenario, executives think that capturing and organizing information better is the need. They think information is the same as knowledge and believe knowledge transfer is the answer. Very often, this type of project occurs in a technology-related area. Determine first what information is needed to support decision making. (A red “risk” flag should go up for the KM practitioner if an IT group is trying to solve a business need rather than their own documentation needs.) Next, identify existing content sources and create a taxonomy. Define key roles in the content creation and approval processes. Identify the technologies that will be used. Use KM techniques to capture information and transfer it to knowledge bases and other workers. Establish update and maintenance accountabilities. Measure increased intellectual capital/assets, information reuse, and efficiencies gained.
Whichever applies initially, the KM team will have some educating to do. All three scenarios plus education are legs of one table, and the table will not be strong and stable without four legs. Build a training/education component into your final recommendations and plan.
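If it helps to see the three scenarios side by side, here is a small illustrative sketch in Python that summarizes them as a simple lookup table. The structure and the names (KM_SCENARIOS, metrics_for) are my own shorthand for the points above, not a standard framework or any particular tool:

# Illustrative only: a compact summary of the three scenario types described above.
KM_SCENARIOS = {
    "collaboration": {
        "first_steps": [
            "determine what information is needed to support decision making",
            "identify and establish virtual communities",
            "identify, publicize and reward experts for sharing",
        ],
        "metrics": ["community activity", "employee satisfaction"],
    },
    "process": {
        "first_steps": [
            "determine what information is needed to support decision making",
            "identify and document key business processes",
            "define where knowledge is created, needed, captured and used",
        ],
        "metrics": ["sales increases", "cost savings", "customer satisfaction", "speed to market"],
    },
    "content": {
        "first_steps": [
            "determine what information is needed to support decision making",
            "identify existing content sources and create a taxonomy",
            "define content roles, technologies and maintenance accountabilities",
        ],
        "metrics": ["intellectual capital/assets", "information reuse", "efficiencies gained"],
    },
}

def metrics_for(scenario):
    # Look up the suggested measures for a given scenario type.
    return KM_SCENARIOS[scenario]["metrics"]

print(metrics_for("process"))  # ['sales increases', 'cost savings', 'customer satisfaction', 'speed to market']

However you write it down, the point is simply to agree up front on which type of problem you are solving and which measures will count as success.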
Some Universal Barriers for KM
Your own organization will have its specific barriers and challenges; however, some barriers are universal. I’ve based these on a presentation by Joe Katzman. If you are like me, right away your brain starts spinning with potential solutions:
Organizational
· Lack of an executive level champion who can and will knock down the barriers
· Making KM a priority with managers
· Culture that rewards the status quo
· Don’t know what we don’t know
· Competition with the corporate education/training group
Individual
· Knowledge hoarding by subject matter experts
· Credibility of the source. How do “experts” acquire credibility in this group?
· Inadequate search skills, difficulty of finding something relevant
· Junk accumulation and need for quality control
· Keeping users (and experts) up-to-date related to both KM and their professional expertise
· Access to and easy use of technology tools
· Users don’t trust information that’s there/lack of positive experiences
· Want to work it out by themselves – enjoy the challenge
· Teenager syndrome (this has never happened to anyone else but me, so why bother to look elsewhere for answers?)
For the KM practitioner, success is about planning; for the knowledge worker, however, success is about publishing. Don’t let your broader KM objectives become a barrier for participants. Workers aren’t interested in your plans and ideas and Phase II of the grand scheme for KM. They have work to do and need to understand what the payoff will be for them if they take time to participate now. Focus on how to keep workers engaged with the KM process…as a subject matter expert, as a facilitator of a community, as a contributor of written comments, as a behind-the-scenes adviser. For workers, success should be tied to writing and profiling what they know so others can find them when more specific information is needed.
Be judicious in the KM tools you use. The KM system is a technology tool to support the interpersonal interactions of KM. This is an important distinction. Other KM tools, like social network analysis or business process mapping or content mapping or email parsing, can be valuable in specific situations or to achieve specific objectives. The wrong tool in the wrong culture/environment, or no tool at all when one is needed, can cause friction and disengage the participants. Guard against letting technology tools dominate the KM conversation. KM is a people problem with a data component, and technology is only one possible solution.
Related to the previous thought, failing to choose the right technology based on the KM program’s objectives can be a barrier to success. You have to match the technology platform to the style and skills of the group/company/culture you are working in. Don’t rely on what the software vendor says…get some users to test drive the tools and talk to other companies that have used the tools before making any long-term commitments. Make the vendors prove their claims about KM. And hold your IT department to the same level of proof when they tell you they already have systems in-house that will do what you need. Do they really? Have you given them written requirements that spell out what the KM system needs to be able to do?
February 1st, 2006
Recently I had a conversation with a colleague about a problem he was having in his department with subject matter experts who jealously guarded their knowledge. The company is in the middle of a technology outsourcing project, and members of the offshore organization were on site interviewing subject matter experts about what they do and how they do it, so they could document processes and sources for their own use. So-called “knowledge transfer” is underway in many companies that outsource software development, customer service, and order fulfillment, and I’m always surprised that those companies are surprised when workers hoard their knowledge and don’t tell the new service provider everything they know. In this case, unwillingness to tell all is actually a symptom of a deeper underlying issue — job insecurity.
I believe the concept of knowledge hoarding is largely a myth. We’ve all heard stories about how some go-to people in organizations refuse (overtly or covertly) to write down what they know in order to make themselves indispensable to the organization. In at least one large company I know, being the sole source of a particular knowledge set was an unofficial post-retirement career path. For example, if you are the only person who knows a particular backend processing system, or who understands a critical process start to finish, then you could retire with your pension, and be hired back immediately as a consultant at double your former salary to do the same work. It happened at least four times in two years that I’m aware of in one department. This story is exceptional, though (I hope!), and it’s not really about “knowledge hoarding”, which we typically attribute to an individual. It’s about bad management that fails to enforce quality assurance practices and/or build in redundancy.
As a KM practitioner, it’s important to be alert to so-called knowledge hoarders, because they are strong indicators of other organizational issues that need to be resolved for KM to be successful. We need to understand the root cause of any hoarding and how widespread it really is in order to elevate the problem to the organizational level where it can be addressed. Here are some things to consider if you think hoarding is a problem for your KM initiative.
· In some organizations, the reward system is simply wrong for KM. People are given financial incentives to maintain the status quo and be an expert in high demand. There is value to being the only one who knows something, and experts with critical knowledge are praised and exhibited to others. Many organizations have a disincentive to share and reuse knowledge, especially in middle management where bonuses and promotions are based on short term deliverables and thinking. It’s job security.
· Keeping experts on top of their subject matter is vital. One reason some experts withhold information is that they lack confidence in their own knowledge. Many companies fail to invest in helping a subject matter expert maintain current expertise. Professional people need to attend conferences, subscribe to trade publications, and meet with peers in other organizations — and be given the time to do these things — in order to keep their knowledge fresh. Some knowledge is subjective and may not be fully quantifiable. Experts who hoard may fear that their knowledge is outdated, and if challenged they could be embarrassed.
· There are a few individuals who like to control people and situations through withholding knowledge, and use this as a way to demonstrate power, but they are the exception. Often these people present what they know as mysterious or too arcane for most people to understand, and they reveal little, even when pressured. Fortunately, most organizations find a way to shift such people out of mission critical roles. If you find someone like this blocking a knowledge flow important to your KM initiative, you have three choices: find a way to give that person a role in the KM initiative so they want to help it succeed, have a serious conversation with HR and the KM champion, or start dusting off your resume.
· Establish non-threatening ways to encourage knowledge-sharing. People tend to share what they know when it is valued by someone else. They will usually share one-on-one when asked by someone who needs the information (though that may not apply equally to someone simply conducting a “knowledge transfer” interview for documentation purposes). People share when they are rewarded for sharing (and the rewards don’t necessarily have to be monetary). Rewards can be ego stroking, pride in having the answer, a sense of belonging, satisfaction that they could do something no one else could do, or even leaving a legacy. Gartner Group says people who hoard knowledge are not really experts. They don’t have the ability to create new information, and are afraid of losing what they perceive to be a personal advantage, so they try to hang on to what they do have by becoming a gatekeeper to the information. Real experts share knowledge, because they know they can always make more.
Update 1/10/06: As so often happens, I found a nice article related to this topic today, so I decided to reference it. Carol Goman says the five reasons people don’t share what they know are:
1. People believe that knowledge is power
“If I know something you don’t know, I have something over you.”
2. People are insecure about the value of their knowledge
“I feel that people tend to underestimate life experience, that intellect has been so over praised, and for some people without a formal education, that it is hard for them to believe that they can add value in a very different way.”
3. People don’t trust each other
“I didn’t know the other members of the team personally, so I didn’t trust them.”
4. Employees are afraid of negative consequences
“I was afraid that my idea would be ridiculed if it were slightly ‘over the top,’ rather than looked at as a useful brainstorming point.”
5. People work for other people who don’t tell what they know
“Personally, I have had more problems with managers and decision makers withholding information than I have had with colleagues or team members.”
The quotes are from managers in her study. You can read the article here. –KV
January 31st, 2006
This morning I came across an article by Don Moyer I first read nearly two years ago called “In Favor of Messing Around.” My friend Jeff sent it to me, and I found it very comforting to discover that some of the “time wasting” things I do, have done, want to do, or would have done are valuable!
Of course, high achievers that we all are, messing around can’t be time wasting. “Wasting”. Something our Depression Era parents taught us never to do. Very broadly speaking, Moyer’s messing around is goal oriented. His messing around means working with freedom. Playing with a purpose. Exploring a topic with no rigid goals, no particular agenda, no clients, no deadlines, and no specific deliverables in mind. But it is actively working. It sounds like what I do every day when I’m thinking about the things that matter to me — letting the mind range free in whatever direction it’s called, adding to my own mental knowledgebase, and then trying to make sense out of it.
Here’s an example from yesterday. I had a reason to do some research on Mad Cow disease. I started with Google, tabbing the most interesting-looking potential sites, and just flipped through them, reading and making notes at a relaxed pace. University of Illinois Urbana-Champaign, FDA, Prionics, Abbott Diagnostics, Wikipedia, VegSource.com (interesting counter culture view of the big “cover up” the author suggests is occurring in the US, from a site that also says Homeland Security is spying on vegans), and the USDA. As I was drafting a summary of that reading, I was also drafting simultaneously a new KM article for my blog on “Barriers to KM”, following a nice and unrelated exchange of messages with Jack Vinson at Knowledge Jolt .
I had to make some appointments with various people related to a new health program I’ve started, so in between calls I did a little research on walking, on diet programs, on some possible vacation locations. I then went back to some old Groove data I have and did more research dabbling related to the KM article, and discovered that there were two other topics I wanted to start for future articles, so I created new drafts. By then it was nearly 4 pm and I was restless, so I went into the living room and turned on the TV and watched Oprah talk with the bank robber father who was turned in by his sons. Not a subject I’m interested in really, but it started me thinking about morality and society. What is morality? How does a moral person make difficult choices? Is American morality somehow different from the morality of Eastern European or Mexican immigrants or Islamic fundamentalists or Hamas? Have we lost our morality? How can a family deal with such a blow and grieve for the loss of the parent/husband they used to know? What would life be like for them in the future, and would they be able to forgive one another for turning their father in and changing their lives? Is there such a thing as truth or morality or reality? I didn’t have answers, but I had been stretched in a new direction.
After a day of purposeful reading and thinking and writing, I wanted to do something different and spontaneous. What I really wanted to do was jump into World of Warcraft and play my little horde shaman I recently started, but it was still “business hours”, and something in my Puritan work ethic struggles with that when I have unfinished work I could be attending to. So I compromised. I went to my favorite news site, and caught up on the world and the weather forecast, and then I started doing more research — but this time, on a quest I’m working on in the game. I read a couple of web sites, and got some ideas about strategies I could try when I did get in and play next.
This is a superficial chronicle of about seven hours of my day, and there was another seven of similar but unrelated activities, followed by three hours of game play. I don’t want to bore you with minutiae when I’ve made my point. On the surface, only a few of the things I did seem productive. Yet, as has so often been proved in my own life, things that we find ourselves learning for no apparent reason can one day, completely unexpectedly, become the pivotal piece of information in making sense out of previously unrelated things — what some educators call an “ah-HA!” moment. It becomes our own knowledge, and we own it. Some days you are stuffing unrelated items into the old food processor brain, and other days you miraculously realize you know how to solve a problem that has been baffling you for years.
I think that was the point of Don Moyer’s article: make time for free-ranging and unstructured rambles to stretch what we know. Create a personal “broad playground of self-initiated projects and inspired play” to become smarter and more productive. It can only make each of us become a more interesting person and someone who smiles more. Here are some questions to think about. What are you inquisitive about? If time and money were not an issue, what would you be studying? What are you waiting for? Well?
January 30th, 2006
One of the most difficult challenges I faced in trying to establish a new KM initiative was the lack of understanding among the managers and executives I worked with. As I’ve said elsewhere, they were an older group (on the whole) who had worked most or all of their careers in the same conservative company, and, at least in the first few years I was there, many didn’t even have personal computers at home. They equated computers with work and with e-mail, which was seen as a necessary evil and an intrusion on getting “real work” done. The traditional way of doing constructive business was on the golf course or meeting with customers face to face.
Over the course of about 18 months, my team members and I met with about 70 business leaders, from executive vice presidents down through associate vice presidents and directors. Some we met with multiple times. Our approach was to educate them individually about why KM is useful, what it is, and how others are benefitting from it. Our goal was to identify several business areas where some aspect of KM (communities, knowledge base, expertise location, etc.) would be viable and to identify every key executive who might play a role in the decision process for choosing KM. We created dozens of customized presentations, tailored to what we thought were the interests, level of understanding and needs of each individual. We wanted to educate them…to give them facts and evidence to support that KM is a good thing…so that when we came back later with a concrete proposal for a KM initiative they would view it favorably and support funding it.
During the course of these meetings, we made extensive notes on their comments, reservations, concerns, interest and, ultimately, their understanding of what we were suggesting. Many interesting things emerged, including what they perceived the risks of KM to be. Those risks were both business-related and personal.
· Disruption of work flows. Their workers were already working at peak capacity, and most managers felt somewhat understaffed. With accountability for the financial results of their areas, they didn’t want to take their workers’ eyes off the ball and distract them with something that could potentially keep them from performing up to “plan”.
· Taking resources away from other important projects. Without a formally recognized (and funded) initiative, money to support a KM initiative could only come from cutting corners in other approved projects — projects with deliverables that were built into corporate financial projections. They couldn’t afford to take money or work hours away from important, approved initiatives to gamble on KM.
· KM solution cost could be high. The organization had an unusual financial structure. There was no “corporate” budget and few enterprise-wide initiatives. Each business line funded its own projects and initiatives separately, even areas such as HR and IT and customer service, and there was little attention to “shared resources”. Project budgets were exceeded frequently, and a KM initiative might potentially result in mistakes, delays, high costs, overtime, or downtime. Any group or department that wanted to initiate KM would have to bear the full cost of developing the technology, hiring any staff, and promoting it to the rest of the business. The risk was that no one else would want to play. No one group wanted to take on a daunting financial obligation like KM alone.
· Uncertain legal considerations. Without a lot of experience in web-based technologies, managers and lawyers expressed concern about various types of potential liabilities, getting processes in place to comply with Sarbanes-Oxley requirements, and privacy considerations. How could they trust the data captured? Who owned the copyrights to posted materials? What happened when someone left the company? Was it really advisable for employees to get to know so much about areas of the business they weren’t responsible for? Who would manage the updating process? What if leaks of confidential information occurred? Without legal’s approval, nothing would be done. With no resources to address such big issues head on, KM was pushed aside as a “nice to have” for consideration later.
· Not a part of the company’s long-term strategic plans. Since the project was not included in the current budget, it was unapproved and no funding was officially available. The company’s history showed a preference to continue down an established path rather than change in midstream to take advantage of a new or better solution. Managers who wanted to take a chance on KM were afraid to commit to something new and innovative that would take time to develop because they needed to produce results now. Ongoing support for projects like KM with no immediate (quantifiable) payback would not be available. They were afraid of spending money to start KM only to find out that the company had decided to go in a different direction or that the KM team couldn’t deliver it, leaving them to eat all the incurred costs with nothing to show for it.
· Diluting scarce technical resources might jeopardize other technology priorities related to serving customers. All technology investments were validated on the basis of how they improved the ability to serve (or strengthen relationships with) customers. Certain big-ticket IT investments had been made previously, and they were lumbering along, mired in problems, so no one was willing to entertain other options that might siphon off resources from the big investments. In addition, KM technology products are not yet mature, so there was no off-the-shelf answer. Customization would be required, and that required resources.
· Lack of credible metrics. Managers want to see quantifiable value attached to the results of a KM initiative. Benefits offered by KM practitioners are often “soft” intangibles…like confidence in the reliability of information or employee satisfaction. Financial executives don’t give much credence to intangibles, even talk of increased intellectual capital. They need benefits in terms of sales increases or cost reductions or doing more work with fewer people that can be attributed directly to KM, yet the nature of KM makes such metrics difficult to define and measure. As one manager said to me, “I need some tangible benefits to put into the CBA, even though everyone agrees intuitively that there’s something valuable in KM, or it’s not going to fly.”
· Their own ignorance. This was the underlying and unspoken risk that most of them shared. They didn’t understand KM or communities, didn’t see how it would help what they did “now”, didn’t have time to spend on learning something new, and had a perception from their children or young friends that talking online with others is purely social and a waste of time. They were afraid to fail. In some companies, managers are expected to know and be expert in everything their people work with. The concept of working online asynchronously and expecting to get value from it can be completely foreign and threatening to them personally. I suspect this is common in most old-line companies where baby boomers run the company and have not yet retired.
A survey of Fortune 500 CEOs by the Baldridge Foundation shows that knowledge management is the second most important challenge facing companies, behind globalization. As KM practitioners, we need to figure out better ways to address these risks and present the KM value proposition effectively.
January 29th, 2006
Can you talk about knowledge without talking about reality and perception? I was inspired this morning by David Weinberger’s article called Four Former Truths About Knowledge . I like that he attempted to bring the discussion way up into the stratosphere, where it’s possible to see very broad patterns and a longer view of history. The need to understand patterns is something instinctive to me, and it’s probably why I became a generalist and not a specialist. By understanding the patterns, we know what to expect next and we know how to thread our ways safely through potential minefields in life (at least, theoretically).
People who need to understand patterns find it easy to categorize things — four former truths, 10 reasons why something fails, three critical reasons, eight views of speciation, six principles of archeological excavation, four types of business outcomes. They can look at lists of information and reports, or seemingly random sets of letters and numbers, or chaotic behaviors, and make sense of them by asking questions like, where does all this seem to be headed? what’s the underlying tension that’s causing these results? what are the elements of this process, and is it like any other processes in life? what are the principles upon which this series of outcomes is based? What differentiates pattern finders from other people is that often they find answers, or at least plausible theories, for questions others can’t answer. These people are natural model builders. They see life as components and elements that are moving and interacting with each other in endless, fascinating ways that produce a myriad of results. They have an instinctive ability to organize information.
This brings me back to reality and perception. Is there such a thing as reality, or do we simply have a shared perception that we all tacitly or explicitly agree to accept as a principle? If one person has an experience, it becomes real to them. They have a sensory interaction…they see it, hear it, feel it, perhaps taste or smell it. They feel emotions related to it, and the emotions give weight to the importance of the experience. They record the experience in their brains and have that experience available to draw upon for the rest of their lives. (This recording is learning, by the way, but that’s another article.) That recorded experience becomes a reality to the person who had it and combines with other experiences (both personal and vicarious, through stories or observation) to become that person’s body of knowledge. Each person builds reality — a highly personalized view of reality — upon their personal knowledge.
Everyone’s view of reality is, of course, a perception. It may be possible to comprehend the vast complexities of life’s potentialities, but then again, it may not be. Our brain receives and stores input from our environment by sorting and sifting through all the data and emotions and history it has available, and then it applies the new information to previously stored information. This is the process of understanding. Any new information we learn at any point is filtered against all of our understandings. But our understandings are based on perceptions only.
In laboratory science there is a process called precipitation, where different chemicals or components are mixed together in a test tube so that certain ones can combine and settle to the bottom of the test tube to form a substance called precipitate. Learning is like the precipitation process: we put in a lot of pieces of new and old information, stir/shake them together in our brains, and then wait to see what precipitate settles out of it. To take the analogy one step further, what we call “knowledge” is that precipitate.
Knowledge is a product of sensory input, emotional filtering, prior knowledge, and a mixing process similar to scientific precipitation. Knowledge always incorporates personal experience. It’s no wonder KM initiatives fail. We can never understand another person’s context, so trying to “capture knowledge” is futile! All we are left with after a “knowledge capture” process are facts and opinions that we (or others) weigh and sift against our own experiences in an attempt to gain personal knowledge of the subject from the data markers that were provided to us. In the end, I believe the best (and maybe only) knowledge management system will be the one that lets people find other people with expertise on a certain topic, and facilitates discussion among them. Unfortunately, most people today would then capture those discussions, call it “knowledge”, slap it into a database and feel like they are doing KM. That’s reality. One reality.
January 24th, 2006
A while back I participated in a “branding” exercise to help define the purpose, describe value, and create an approach for a new KM initiative in a large company. The exercise involved three brainstorming steps. Each step had a focus, and the outcomes of the three steps were to be woven into a mission statement that would govern subsequent actions. These are my notes and the results of the session. Perhaps it will be helpful to someone else who is just starting out!
Before starting the exercise, it’s good to have a clear understanding of the potential “clients” for your knowledge management initiative. (See below for some discussion about the audience.) For brainstorming on branding, it’s not necessary to understand any technology the organization has in place or plans. The purpose is to be as objective as possible about what you want the initiative to accomplish, how it fits into important organizational goals, and to find the best way to present it to others so they will buy in and participate.
Brainstorm 1 – Value.
Ask: What should our clients value about us/the KM initiative?
Our winner: Value = Added Capacity
Other ideas: Community, collaboration, united, unifies, sharing, experience identified, expertise located, defragment, efficiency, assets, involvement, buy-in, centralization of information, enterprise-driven, powerful resource, themselves, collective, one company, growth, efficiency, being a team player.
Brainstorm 2 — Think.
Ask: How do you want clients to think about KM?
Our winner: Think = Innovative
Other ideas: efficient use of time, savings, valuable, mandatory, finally – the answer, time has come, it’s a commitment/useful/informative/innovative, “my property”, legacy for those who come after.
Brainstorm 3 — Feel.
Ask: How do you want clients to feel about KM?
Our winner: Feel = Vested
Other ideas: Satisfied, energized, accepting, welcoming, empowered, happy, winner, proud, ah-ha, relieved, like it’s easy, important, invaluable, it’s integrated with everything they need to do, vested.
Know the Audience/Clients
For this brainstorm, the clients/audience = three groups: Leaders of the organization (executives), Line managers, and Employees. Note that not all employees are “knowledge workers”. You may want to do a separate brainstorming step to define which groups of employees are knowledge workers, because initially you will be most interested in the true knowledge workers. It will help to focus your efforts effectively.
What does our “Value” winner (“added capacity”) mean for each audience?
· For executives: It’s balance sheet driven. You have to have hard, quantifiable benefits, like head count reductions or reduced product time to market, for it to count. Innovation and more time to do more things are good, but they have value only if the time freed up is applied toward the core business activities.
· For line managers: Whatever happens, make them look good (or like a hero or like an innovator). They are on the firing line to deliver the organization’s financial goals for the year. Help them to do that, which means attach your KM priorities to the most important business objectives and start there. Added capacity means getting more done with the same number of people.
· For employees: The ability to do more/different things. Give them more time, reduce the amount of information they have to wade through daily, make it possible to do more with existing resources, find a creative solution to a routine problem, help them get more education, eliminate redundancies, improve quality of interactions with customers.
An important distinction came out of our brainstorm. As KM professionals we need to Sell to the executives, but we need to Market to the line managers. Getting the line managers to play the KM game will take education, persuasion, and a focus on the WIIFM (what’s in it for me).
To sell to executives, use effective sales techniques to identify a pain point, describe the problem, the solution, and the probable outcomes (and a rough order of magnitude of the value to be obtained). Drop names. Talk about how other similar organizations have used KM and the kinds of value they have achieved. Executives have the buying power and the ability to secure agreement/cooperation across multiple business lines. They can influence one department or division or business line to compromise for the good of the whole.
To market to line managers effectively, remember that they have day-to-day responsibilities for profitability, for employee satisfaction, for customer satisfaction, for surfacing new ideas, for holding down costs, for increasing sales. Line managers have the power to give/deny the KM team access to employees. They don’t normally have broad buying authority, and will follow the lead (and the subtle hints) of the executives. They are primarily concerned with the success of the area they are accountable for, not for the entire company, and they need to be shown how KM will help them to succeed personally and look good. Help them see how diverting the time of busy workers with an already heavy workload is going to produce a bigger payback.
Not all employees are knowledge workers. In most businesses probably only half are, and they should be the first focus of your efforts.
January 23rd, 2006
In a knowledge management initiative where communities of practice are used to create and validate best practices, it’s possible to use a lifecycle approach to formalize the creation process, using something along the lines of peer reviews used in science. Formalizing what is typically a more casual approach can ensure the best organizational benefit/learning from the results of an activity or event. The process might be something like this, adapted to the type of information to be reviewed and the working style of the community:
- capture
- disperse
- review
- consolidate
Capture. Depending upon the norms of the community, write a description of the scenario, the environment and parameters of the problem/situation, the objective of the actions, the participants, the action steps, and the results. Include information about what caused the situation to be normal or abnormal, and the names of participants, as well as the contact information for the person designated to be the primary correspondent related to the proposed “best practice”. Describe enough of the problem to help any later readers understand its scope and complexity.
Disperse. Once the story is written, distribute the draft to key reviewers. At a minimum, that should include the people in the community designated (usually volunteers) to review and comment on best practice drafts. Don’t forget to include people who actually participated in the situation being described, since they may have other relevant details or different points of view about what occurred and how the process unfolded.
Review. Create a standard operating procedure for reviews. For example, if you distribute the draft in Microsoft Word format, ask reviewers to use “tracking” so all their changes can be seen and considered (or rejected). If the draft is a PowerPoint presentation, then making comments in the bottom section of the screen viewed in “Notes” format can work. You may want to consider setting up a standard for commenting, for example, a designation for a comment that indicates how strongly the reviewer believes his/her change should be made. Be sure to indicate in every draft distributed when and to whom the comments should be submitted.
Consolidate. Designate one person to consolidate all the remarks/comments on the draft into a final, official version. Post the final version to the entire community, as well as to any reviewers who may be outside the community.
Blogging and/or wikis can also be useful in the distribution and review steps since these tools facilitate group editing and commenting; however, the person assigned to consolidate all comments on the best practice needs to have the final say over what the end product contains.
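For communities that want to track drafts through these four steps in something more structured than email, here is a minimal sketch in Python of what the record for one best practice draft might look like. The class, field and method names are hypothetical, invented for illustration rather than taken from any particular KM system:

from dataclasses import dataclass, field

# Hypothetical record for one best-practice draft moving through the lifecycle above.
@dataclass
class BestPracticeDraft:
    title: str
    correspondent: str                    # primary contact for questions about the practice
    scenario: str                         # problem, environment, participants, actions, results
    reviewers: list = field(default_factory=list)
    comments: dict = field(default_factory=dict)   # reviewer name -> tracked comments
    stage: str = "capture"

    def disperse(self, reviewers):
        # Distribute the draft to designated reviewers, including original participants.
        self.reviewers = list(reviewers)
        self.stage = "disperse"

    def add_review(self, reviewer, comment):
        # Record each reviewer's comments for the consolidation step.
        if reviewer not in self.reviewers:
            raise ValueError(reviewer + " was not on the distribution list")
        self.comments[reviewer] = comment
        self.stage = "review"

    def consolidate(self, editor):
        # One designated person merges all comments into the official version.
        self.stage = "consolidate"
        return "Final version of '%s' consolidated by %s from %d reviews." % (
            self.title, editor, len(self.comments))

# Example walk-through of the four steps:
draft = BestPracticeDraft("Handling a late vendor data feed", "community facilitator",
                          "Description of the situation, participants and results goes here.")
draft.disperse(["reviewer A", "reviewer B"])
draft.add_review("reviewer A", "Clarify the scope statement; add the second workaround we tried.")
print(draft.consolidate("designated editor"))

Whether the record lives in a database, a wiki page, or a spreadsheet matters less than having one designated owner per draft and a clear marker of which step it is in.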
January 19th, 2006
My colleague Peter Miller recently posed this question: Is there any research that proves there are benefits to problem solving in a group, rather than individually or in pairs? We were talking about communities of practice and their value. We were also discussing whether there is an optimum size for communities, specifically, communities in a business environment, where the community typically forms around a business issue or service, as opposed to a personal hobby or interest.
Clearly the normal kinds of personal interest communities, where any number of people come together voluntarily based upon a shared interest, have a high tolerance for large numbers of participants and varying levels of involvement. After all, most members never actually participate/contribute, or they contribute occasionally on only limited, highly specialized topics. Business communities are often smaller, more specialized subsets of a broader group that break out so people with like interests can find each other quickly and solve common problems more efficiently. Participants in these groups have a strong motivation to participate in problem-solving and contribute - it helps them do their work better. Research I’ve read over the years indicates that about 250 members is as large as a community can get before becoming ineffective (maybe that depends upon how you define the group’s purpose, e.g., casual or purposeful). There does seem to be an upper limit or point of diminishing returns, although I haven’t seen a definition of what that may be.
From my experience, in most discussion groups, no matter how lively the topic or how interested the community is in the subject matter, usually no more than 6-10 members carry the discussion on that topic actively, with others occasionally chiming in. If you imagine the inside of an active volcano (or the Lakkari Tar Pits in Un’goro Crater, if you happen to be a gamer playing World of Warcraft), where bubbles slowly rise and burst at irregular locations throughout the crater, you have a good sense of community dynamics. Some topics rise slowly and are large. Some are smaller and faster. They rise up and subside. You don’t know where the next one will appear, and the ripples and splashes created by each are unique. Community discussions are like that.
A person with a question to answer (and a deadline approaching) will find some sort of answer on his/her own, through reading, through hunches based upon experience or through asking others. What especially intrigues me, though, is the actual value we can ascribe to having additional people help answer the question, and the parameters around getting the optimum answer from a group. How do we quantify or assess the value of having additional people engaged in solving a problem or making a decision? What’s the right number of people to involve in the discussion?
If you have three people working on the question, does the 80/20 rule apply? It’s obviously faster and less expensive than getting a team of 20 involved. Will you get enough of an answer that you can move forward with reasonable confidence? If you have six people, can you get to the optimum answer/solution 99% of the time? Perhaps a group that size can reach a reasonably ironclad conclusion. If you have 12, is the resulting incremental information gained of sufficiently high value to justify the time/expense of having those additional people involved? Or do you hit a point where the incremental value of new thoughts is so low that it becomes too costly to add more voices? What is that point? And let’s not forget that other people who don’t participate in the discussion still read and learn from it. There’s value that should be applied to that.
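Just to make the arithmetic behind these questions concrete, here is a tiny back-of-the-envelope sketch in Python. It assumes, purely for illustration, that each additional participant contributes a geometrically shrinking amount of new insight while costing roughly the same amount of time; the starting value, decay rate and cost figures are invented, not research results:

# Illustrative model only: the nth participant adds value of value_first * decay**(n-1),
# while every participant costs about the same amount of meeting/discussion time.
def marginal_value(n, value_first=100.0, decay=0.6):
    return value_first * decay ** (n - 1)

def breakeven_group_size(cost_per_person=15.0, max_size=20):
    # Keep adding people while the marginal value still exceeds the marginal cost.
    size = 0
    for n in range(1, max_size + 1):
        if marginal_value(n) > cost_per_person:
            size = n
        else:
            break
    return size

for n in range(1, 7):
    print(n, round(marginal_value(n), 1))   # 100.0, 60.0, 36.0, 21.6, 13.0, 7.8
print("Break-even size with these made-up numbers:", breakeven_group_size())  # 4

Under different assumptions about how quickly new voices stop adding insight, the break-even point moves, which is exactly why real data on the value of group problem solving would be so useful.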
Moving out of communities into a real world situation, the ability to know the optimum size for problem-solving groups could streamline all kinds of organizational meetings, saving time and speeding processes. It’s easy to imagine a Six Sigma project to define types of meetings and optimum configurations for them. If we knew how to assess the value of group decision making, then collaboration and knowledge management professionals would have a basis upon which to establish high-performance work groups, teams and communities designed for action. But we need some data to prove that better decisions result from problem solving in a group. This may be challenging since results-oriented subgroups are not a traditional dynamic of communities. Communities don’t typically have accountability for the discussions they enable, they aren’t obliged to ensure full and complete analysis, and topics are often free-flowing and have no closure — but that’s a digression from this topic.
Even in the volcanic bubbling of a community, there are some good indicators that group decision making and problem solving have value. Research suggests that learning is largely a social activity, learning in groups generally improves participants’ learning, successful group work can improve higher-order thinking, and having a facilitator/moderator can improve group collaboration. (Facilitation is another component or dynamic in communities that can improve or impede collaborative performance on a problem-solving task.) One strong predictor of problem-solving success is whether members of the group share a mental model, i.e., they are working with similar conceptions of the problem and its states. A good conceptual model of the problem, together with the strategic knowledge to generate appropriate solutions and procedural knowledge to carry them out, results in more successful solutions — but it still doesn’t tell us what the value of having more than one person involved in the process is.
How to improve problem solving and decision making in organizations has been discussed widely in recent years, and we seem collectively to understand more about how to make the decision making process better. But has anyone quantified the value of having more than one person participate in the decision-making process? It's an interesting topic.