The Anatomy of Sound Decision-Making

Every decision in life carries both a cost and a benefit. Sometimes these are easy to see and calculate; at other times they are not (often because cost and benefit are separated by time, or because the issues involved are complex or even apparently conflicting). Either way, this ‘trade-off’ is always there!

In broad terms, most of us are poor at assessing which of our decisions will end up as a net cost or a net benefit, to ourselves or to others. This is hard enough when a decision affects only us personally. It gets even harder when the decision reaches well beyond us, such as deciding what our team or whole organization should do. In both situations we make many potentially bad decisions because we are inherently poor at objectively assessing risks and rewards. In this article, I therefore want to describe a broad process that I believe can help most people who care enough to try it, at least to some extent, whenever a particular decision is important enough to them.

In the rest of this article, I will describe two discrete but related processes that we need to appreciate if we want to make better decisions. These are:

A) The External or ‘Outside-in’ process: how we should think about the decision and how it is best made. In essence, this means looking beyond ourselves and our organization for guidance on how to go about making the decision, and I want to look at how we do this specifically. As we’ll see, this has four sequential parts: thinking about our Data, our relative Knowledge or Expertise, our Thinking Methods, and finally the potential Social Traps or Errors we can fall into.

B) The Internal or ‘Inside-out’ process: how we should consider the potential for psychological, or what are often called Cognitive, Biases in ourselves and in others on the decision-making team or within our organization. Once again, as we’ll see, this has four sequential parts: Inertia Biases, Pattern Recognition Biases, Extrapolation Biases, and finally Social Biases.

Let’s look at the four ‘Outside-in’ categories first.

1. Considering Data Problems

All decisions should be based on a sound appreciation of the data or evidence relevant to the decision. The broad question here is, therefore, “do we have a good grip on the data or evidence we need?”. There are 4 main ways in which this may not be the case:

a) We are potentially too hasty or careless in our consideration of the data. For instance, we may not take the time to ask questions about the pertinent facts or data that exist, or to question their accuracy and relevance, in favor of hurrying things along to reach a quick conclusion. For example, we may be asked to make a decision by Friday, but that deadline may be arbitrary and may not allow enough time to properly gather and check the accuracy of the data, or to deliberate. What we, therefore, should be doing here is asking “are we operating slowly enough, given the circumstances, and taking the care in our considerations that we should?”.

b) We potentially lack a proper context for individual data or evidence and thereby miss ‘bigger picture’ issues that may lead to a different and/or better decision, if known. For example, the question “should we replace an individual on our team because someone is leaving?” could have many ‘wider context’ issues around availability of time and skills in other team members to cope or not, and even time and skills available in other teams (not to mention the potential to redesign the job or role). What we, therefore, should be doing here is asking “are we putting this decision into a proper wider context and, when we do, what exactly is this and how does it possibly change our perspective?”

c) We potentially forget or fail to check that the data we are presented with is factually correct (or at least credible) and, furthermore, that it is sufficient and we are not missing other data that would be germane to the decision needing to be made. For example, a statement such as “our biggest competitor put its product pricing up by almost 5%” may mean we think about increasing our product prices too, but this may have been reported loosely or inaccurately and not be the case, especially if the rise were actually 3% or if they had also reduced the product volume or size by 5%! What we, therefore, should be doing here is asking “what is the potential for this data to be incorrect or to be missing other parts that are relevant?”.

d) Particularly in a complex or multi-faceted decision, we potentially may not pay sufficient attention to data or evidence which conflicts or tells a contradictory story and therefore needs discovery work to find the ‘truth’, if possible. For example, revenues may have declined in the last 6 months for an organization, apparently because there was a 50% jump in turnover of sales staff. However, other data suggests that a new product technology from a major competitor has been making significant inroads and causing more lost contracts than usual. These could both be independently true, both be partial contributors to the situation, or in fact both be minor contributors compared to another factor yet to be surfaced. What we, therefore, should be doing here is asking “Is any of the data or evidence in this situation at odds, and why might this be the case?”.

2. Considering Knowledge and/or Experience gaps

All decisions should ideally be taken by a person or a team that has appropriate knowledge, experience and/or expertise in the circumstances (whether it is one person or an assembled team to make the decision). The broad question here is, therefore “do we have the expertise needed here or in this situation?”. There are 4 main ways in which this may not be the case:

a) We potentially have too little knowledge of the issues that matter to make a sound decision. In this case, the knowledge we are looking for should be germane to the decision we are seeking to make. For example, if we are looking for good ways to minimize future organizational tax liabilities, we would want to ensure that solid accounting and tax knowledge is in the decision-making team or readily available to it. What we, therefore, should be doing here is asking “do we have the knowledge we need to make a demonstrably wise decision in these circumstances?”

b) We potentially have too little experience/expertise about the issues that matter to make a sound decision. In this case, the experience/expertise we are looking for may not be domain knowledge but past action experience which would allow for some issue navigation now. For example, if a team is seeking to decide what objectives and key results (OKRs) to apply, and to write them up and disseminate them, some expertise in what good OKRs look like, and how they are best shared, would be extremely useful. What we, therefore, should be doing here is asking “do we have the experience/expertise we need to make a wise decision in these circumstances?”

c) We potentially often have only a binary choice when making a decision. This boils down to either ‘do this’ or ‘do that’ (such as do nothing or this one other option). This may be narrower than wanted, and the decision may need more options to be generated, or made available, to increase the quality of the future outcome. For example, a business may think it needs to make a cost-cutting decision such as stopping all of its staff from traveling by air, or stopping all travel and relying only on electronic communication. In fact, there are many alternatives that could be generated to cut costs, some of which are not travel-related at all, and some of which are more granular options (such as limiting trip expenditure or changing the approval process). What we, therefore, should be doing here is asking “do we have the expertise/input we need to generate enough future options here?”

d) We potentially face an inhibiting mental laziness or lack of focus when we are looking to make a decision, which will often naturally affect the quality of the outcome negatively. For example, when faced with a decision that a team thinks it has seen before (let’s say deciding what budget to set for next year), it is tempting to take this year’s budget and add a little to it, or even to leave it to the finance people to extrapolate some numbers and not pay much attention. This is likely to lead to a less useful decision than looking at what products and programs need funding, and by how much. What we, therefore, should be doing here is asking “are we able to be fully engaged in this process in order to reach the best decision we can?”

3. Considering Thinking Mistakes

All decisions should ideally be taken logically and with sound systems and approaches which extrapolate appropriately from solid assumptions or data/evidence. The broad question here is, therefore “Are we thinking about the issues in sound and logical ways?”. There are 4 main ways in which this may not be the case:

a) We potentially have the wrong assumptions and/or beliefs about an issue, and these remain accepted or go without challenge, as if they were true or factually accurate. For example, when faced with a decision to launch a new product, it may be believed that it must be cheaper than the major competitors. Without this belief or assumption being properly validated (usually with proper research that may find that customers want better quality more than a lower price) we may make a poor decision. What we, therefore, should be doing here is asking “Have we rigorously assessed our assumptions and beliefs about the issue at hand?”

b) We potentially have a faulty memory about what we’ve been told in relation to the issue in front of us. Sometimes this is because we rely on memory rather than take the time to check the facts, or because we misrecall a situation or apparent ‘fact’. For example, we may think we remember that a competitor went bankrupt 6 months ago and that our organization may be able to expand into a strong territory they held. When checking, we actually find that they filed for Chapter 11 (reorganization) and not Chapter 7 (liquidation), and they are still in business. What we, therefore, should be doing here is asking “How do we know for sure that the memory we are relying on to weigh in properly is sound and accurate?”

c) We potentially overcorrect for possible biases in our thinking that may not exist at all! Simply put, we are not as trapped by bias in many situations as we think we are, and we don’t want to end up with ‘analysis paralysis’ because of this. For example, if a decision-making team is prone to being relatively optimistic much of the time, a worry might be that over-optimism bias is present when it’s not. What we, therefore, should be doing here is asking “How do we know that we are not looking for thinking biases when there are none (and should instead pay closer attention to other factors)?”

d) We potentially misunderstand or get wrong the amount of time we have and/or need to make a wise decision in the circumstances. This often means rushing to a decision when we don’t need to. For example, we may have been told that a decision comparing 5 different external contract bids must be completed within 2 weeks (the time usually allowed for this type of decision). Not only is this a standard and relatively arbitrary time frame, it does not take account of the fact that the bids may have been two or three times as detailed and lengthy as usual and needed at least double the time for a wise decision. What we, therefore, should be doing here is asking “Are we taking all of the time needed to decide wisely in these circumstances?”

4. Considering Social Errors

All decisions should ideally be taken by a person or persons who are able to think independently and assess their alignment with the thinking of others without undue social pressure or influence. The broad question here is therefore “Are we taking care to avoid potential social convention traps?”. There are 4 main ways in which this may not be the case:

a) We potentially face some degree of dishonesty and/or corruption. While this may often be small in scale and influence, sometimes it is not. In either case, it may not be in plain sight and should not be ignored as a factor. For example, some people on the decision-making team may profit directly or indirectly from one option over another and may therefore push for outcomes that are not optimal. What we, therefore, should be doing here is asking “Do we know for sure that there is no minor or major corruption and/or dishonesty at play in making this decision?”

b) We potentially pay too much attention to analogies or metaphors that may limit how we think about the issues that matter. This is because both of these are descriptive shortcuts, but we can forget (unfortunately) that the ‘map is not the territory’. For example, when seeking to make a decision about how to best acquire new customers, a decision-making process leader may say that the team needs to concentrate on “the low hanging fruit”, suggesting the easiest and most accessible targets should be the main focus, when this is not necessarily the most profitable path. What we, therefore, should be doing here is asking “Have we assessed the analogies/metaphors that are being used to test for their appropriateness?”

c) We potentially think a little or a lot too much about our personal status in the decision-making team and/or organization relative to others around us. This can go in both directions. We may be ‘senior’ in status and expect our voice to carry more weight or we may be more ‘junior’ in status and stay quieter than we might. For example, when a person 2 or 3 levels more senior is in the room (such as the CEO) we might wait to be called upon to speak or minimize our inputs to avoid social ostracization. That same CEO, on the other hand, may say more than most or speak first on many occasions, thus diminishing input from more junior members of the decision-making team. What we, therefore, should be doing here is asking “Are we willing and able to challenge people with more status and power than we have?”

d) We potentially may be overly personally or emotionally invested in the decision we are being asked to make. This often means that our emotions can ‘flood’ the calmer, logical parts of our mind. For example, when faced with a decision to sell a division or subsidiary company, we may be very attached to the people who are part of it, fear or regret the loss of future closeness or friendship, and argue to retain this part of the organization for more emotional than logical reasons. What we, therefore, should be doing here is asking “Are we as free as we can be to decide without an inappropriate amount of emotion coming into the equation?”

What each of the above categories forces us to do is consider how the decision-making process should ideally operate in general, and the questions that should ideally be asked in broad terms, no matter what the issue or topic. This means that we first need to take a strong ‘outside-in’ perspective, which is more about asking: do we have the right context? Do we have the best external reference points? Are we framing our questions in the right way? Are we thinking about the obvious consideration ‘traps’ we might fall into? These four category headings help us to be more rigorous about how we do this, and the chart below summarizes them around its edge (as we have detailed more fully above).

[Chart: the four ‘Outside-in’ categories around the edge, with the cognitive bias clusters in the center]

Taking Possible Cognitive Biases into Account

Our decision-making journey does not end with better rigor in what should be cognitively obvious. As the chart above illustrates in its center, the next step is to consider carefully whether we or others hold cognitive biases that can negatively affect the end result we ultimately come to. This is a far less conscious process, one in which we develop an appreciation, or at least a basic awareness, of the problem a particular bias could present if it is indeed playing a role. As we said earlier, this more Internal or ‘Inside-out’ process concerns how we should consider the potential for these biases in ourselves and in others on the decision-making team, or within our organization.

In broad terms, a cognitive bias is a deviation from a ‘norm’ or even a departure from what is deemed to be a rational judgment. In this sense, an individual may create his or her own “subjective reality” and have no idea that their judgment is ‘clouded’ or biased at all.

Although psychologists have so far identified more than 200 individual cognitive bias types, for the purposes of this article we believe they can be bracketed into four clusters. These are Inertia Biases, Pattern Recognition Biases, Extrapolation Biases, and finally Social Biases. We also take the view that these clusters are best considered sequentially.

Let’s look more closely at the four ‘Inside-out’ cognitive bias clusters to which we should pay most attention:

A. Considering Possible Inertia Biases

‘Inertia’ is the tendency, common to all people, once a familiar trajectory or course has been established, to continue on that course unless acted on by a greater force. This ‘familiar trajectory’ may have been created by a cognitive bias; we may be unaware that it has corrupted our thinking, and we need to find a path, or ‘greater force’, to counter it.

There are four major inertia inducing cognitive biases. These can all influence a person individually or in combination. However, in general, they tend to manifest most often in the order in which they are presented here. Let’s look at each of these in more detail:

-Anchoring Bias: This is where an individual depends too heavily on an initial piece of information offered (the “anchor”) to make one, or sometimes many, subsequent judgments during decision-making. In simple terms, this often happens when we have a few facts or figures available and assume them to be true, even if they are irrelevant. The example familiar to most people is buying a car (especially a secondhand one). The ‘sticker price’ might be completely arbitrary but, more often than not, it becomes a mental ‘anchor’ for the subsequent negotiation (and car sellers know this anchor will often work well!).

-Storytelling or Narrative bias: This is about people’s tendency to interpret information as being part of a larger story or pattern, regardless of whether the facts actually support the wider narrative. For example, you may be considering buying a small competitor company for whom a management colleague used to work. She says they were well-run and ‘had a positive culture’ when she was there, so you reason that it is likely to be a good fit with your company’s strong culture and performance ethos. However, you never looked at the competitor’s sales data, talked to their suppliers or looked at the ‘Glassdoor’ ratings on employee satisfaction.

-Risk/Loss Aversion Bias: This relates to people’s tendency to prefer avoiding losses to acquiring equivalent gains. The bias rests on the fact that a loss we think we may bear will be ‘felt’ more keenly than a gain of a similar amount. A familiar example is that many of us are unwilling to sell our home for less money than we paid for it. Another example: if we lose as a result of our decision, we may appear weak, and there may be a reputational cost. If we invest in a project but then give up because the marginal cost is greater than the marginal benefit, we may not be given another project. In such circumstances, it may seem to be in our interest to persevere and put a gloss on the failing project, rather than admit we were wrong.

-Sunk Cost Fallacy Bias: Related to loss aversion bias, this is about people’s tendency to follow through on an endeavor if they have already invested time, effort or money into it, whether or not the current costs outweigh the benefits. A familiar example is that individuals sometimes order too much food and then over-eat just to “get their money’s worth”. A business example would be deciding to offer a new hire a $5,000 starting bonus. If the employee is hired but doesn’t work out, that bonus is a sunk cost, and it should have no bearing on whether you offer the same bonus to the next hire, should you choose to do so.

B. Considering Pattern Recognition Biases

The process of pattern recognition involves matching incoming information with information already stored in the brain, and it can be a positive way to make judgments in business and life in general (a simple mental shortcut). However, pattern recognition cognitive biases tend to occur when individuals identify what they believe to be a pattern, but the association or matching process is inaccurate or false.

Once again, there are four major pattern recognition cognitive biases. These can all influence a person individually or in combination. However, in general they tend to manifest most often in the order in which they are presented here. Let’s look at each of these in more detail:

-Confirmation Bias: This is the tendency to search for, interpret, favor, and recall information in a way that confirms or supports our prior opinions, beliefs, hypotheses or values. We see this regularly in TV police series, when a detective identifies a suspect early in an investigation but then seeks only information, or even evidence, which confirms rather than disproves a private theory about what happened (and may then arrest the ‘wrong’ person). A simple business example occurs with all sorts of workplace stereotypes that often prevail, relating to categories like age (“young people are naïve”, “old people are over-the-hill”), gender (“it’s a boys’ club around here”, “women are too emotional”), or even something as basic as height (tall people get further in their careers than short ones, or vice versa).

-Availability Bias: This is often a mental shortcut that relies on the immediate examples that come to a given person’s mind when evaluating a specific topic, concept, method or decision. A familiar everyday example is buying lottery tickets: the probability of winning is infinitesimally small, but heavily publicized winners come easily to mind, so people still play anyway. A business example might be that, after seeing several news reports about a rise in break-ins and thefts at inner-city retail stores, you judge that the same rise is highly likely to affect your business or become more common, even though you have no retail outlets in inner cities and may have seen no data whatsoever.

-Curse of Knowledge Bias: This typically occurs when an individual, communicating with others, unknowingly assumes that they have the background or experience to understand. In even simpler terms, the individual does not, or cannot, put themselves in others’ shoes. For example, a senior manager may make a presentation about ‘Unlocking shareholder value’ to an audience that doesn’t even understand the presentation title, let alone what shareholder value is at a detailed level (and the information goes over their heads).

-Hindsight Bias: This is a phenomenon that allows people to convince themselves, after an event, that they had accurately predicted it before it happened. In other words, it is the tendency to overestimate our ability to have predicted an outcome that could not reasonably have been predicted. In essence, hindsight bias is like saying “I knew it!” when an outcome (either expected or unexpected) occurs. This “I knew it all along” bias can be dangerous because, by assuming they already know the outcome that is going to occur, a person might fail to do any (or enough) of the necessary preparation or analytical work.

C. Considering Extrapolation Biases

Like pattern recognition, extrapolation in general can be a positive and helpful mental process that takes simple data and extends it more broadly. However, extrapolation cognitive biases tend to occur when individuals take a single or limited data point and extend or broaden it inappropriately, or in ways that are not logical (even though they may deem the extension fair or logical in their own minds).

Once again, there are four major extrapolation cognitive biases. These can all influence a person individually or in combination. However, in general, they tend to manifest most often in the order in which they are presented here. Let’s look at each of these in more detail:

-Overconfidence Bias: This is the tendency people have to be more confident in their own abilities than is objectively reasonable. An obvious and commonplace example is an individual who thinks his or her sense of direction is much better than it actually is (and, for instance, travels without a map and refuses to ask for directions). A business-world example came out of a study of 300 investment managers a few years ago, which asked whether they believed themselves above average in ability. 74% believed that they were above average at investing, and of the remaining 26%, most thought they were average. In short, virtually no one thought they were below average.

-Experience Bias: This is the tendency to believe presented information that agrees with a person’s expectations and disbelieve, discard, or downgrade data that appears to conflict with those expectations. In other words, we may assume our view of a given situation (or sometimes problem) constitutes the whole truth. A common example of this in business is when managers interview candidates for a job and make quick judgments based on their past experience of what constitutes a good applicant and a bad one. This may result from arbitrary information, such as the strength of their handshake or the candidate’s preferences for a sport, for example.

-Present Focus Bias: This is the tendency to give stronger weight to payoffs closer to the present when considering trade-offs between two or more future moments. If you offer people $100 today or $150 a year from now, most will take the $100 today. In other words, we tend to overestimate the value of something now and underestimate the value of something later. For example, suppose there is a new skill such that, by practicing for 30 minutes a day, you would get 1% better each day for the next year, and suppose you actually did practice every day. How much better would you be at the end of the year? Instinctively, you probably think, “365 days in a year, perhaps 400%?” If you’re somewhat familiar with compounding, you might know to guess extra high, maybe 1,000%. However, the real answer is about 3,778% of where you started, or nearly 38 times better.
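The compounding arithmetic behind that example is easy to check directly. This is a minimal sketch using the numbers from the example above (1% daily improvement over 365 days):

```python
# Improving 1% per day compounds multiplicatively, not additively.
daily_factor = 1.01
days = 365

final_multiple = daily_factor ** days

# Roughly 37.8x the starting level, i.e. about 3,778% of where you began,
# versus the ~4x (400%) that linear intuition suggests.
print(round(final_multiple, 2))
```

The gap between the intuitive linear guess (365 × 1% ≈ 400%) and the compounded result is exactly the kind of underestimate that present focus bias encourages.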

-Pessimism Bias: This is the tendency to overestimate the likelihood of negative things and underestimate the likelihood of positive things. This can cause people in business to avoid a willingness to try new things, due to the assumption that failure will be the result. For example, many managers may assume a pessimistic attitude in considering a new company acquisition, even where the data actually points to a range of highly positive benefits that may well accrue.

D. Considering Social Error Biases

Social error biases tend to occur when individuals take too much account of the views or opinions of others around them when looking to make decisions, and/or when they accept social conventions that may not be appropriate to the decision that needs to be made. The most common examples occur when individuals falsely compare themselves to others.

Once again, there are four major social error cognitive biases. These can all influence a person individually or in combination. However, in general, they tend to manifest most often in the order in which they are presented here. Let’s look at each of these in more detail:

-Groupthink Bias: This occurs when a group of well-intentioned people makes irrational or non-optimal decisions, spurred by the urge to conform or by the belief that dissent is impossible. Commonly cited examples are the way the US Navy treated the WW2 threat of a Japanese attack on Pearl Harbor, and the way President Kennedy’s team ignored practically all the signs that the ‘Bay of Pigs’ raid in Cuba would fail before it went ahead. It’s important to note that groupthink can occur in small and large groups and at all levels of an organization.

-Competitor Neglect Bias: This is the tendency to assume that our decisions will not prompt other organizations, especially competitors, to make countermoves. It happens a lot when companies engage in strategic or other planning, or make changes such as new marketing or sales pushes, and fail to plan for competitors countering with plans of their own. Dominant incumbents often respond by dropping prices (because they have a dominant market share) and waiting until the smaller competitor fails to make money and quits the market.

-Self-Serving Bias: This is the habit of taking credit for positive events or outcomes but blaming outside factors for negative ones. In other words, it is the tendency to blame external forces when bad things happen (as a defense mechanism) and, on the other hand, to give ourselves credit (usually gratuitously) when good things happen. A business example: after a disastrous meeting with a large potential new client, an individual may blame losing the account mainly on a competitor’s underhand business practices, and not on the fact that he or she did not spend the time to research the new client’s expectations. Had he or she won the account (largely fortuitously, perhaps), the same person might have claimed their ‘great presentation’ saved the day.

-‘Halo Effect’ Bias: This is the tendency for an overall impression of a person to heavily influence how we feel and think about his or her character. Simply put, the halo effect means we tend to give far too much credit to people we perceive to have social status. It is our tendency to assume that taller or better-looking people are smarter than they actually are, for example, or that successful people are more interesting, accomplished, experienced or expert than they actually are. In business, it is commonly believed, for example, that senior managers are more influential or powerful than they actually are. This is why CEOs or senior VPs in a room should ideally declare that they are only one voice and one vote among many!

All of the above cognitive biases, in all four of the clusters to which they belong, are summarized, with brief descriptions, in the chart below.

As we’ve seen, all of these cognitive biases represent a collection of faulty ways of thinking that appear to be ‘hardwired’ into the human brain at times, especially when we decide quickly or engage in fast, largely intuitive decisions, or what psychologist and behavioral-economics author Daniel Kahneman calls “System 1 thinking”. Our best strategy to counter these biases, at least as a start, is to be aware of them and how they tend to occur, and then engage in a slower and more deliberate process, or what Kahneman calls “System 2 thinking”. System 2 thinking is the mind’s slower, analytical mode, and typically requires longer, conscious mental exertion.

A common example used to demonstrate the two systems is this puzzle: A bat and a ball together cost $55. The bat costs $50 more than the ball. How much does the ball cost? Faced with this puzzle, the majority of people instantly guess $5. The correct answer, however, is $2.50 but we need the slower ‘system 2’ thinking brain to avoid the cognitive bias here.
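The ‘System 2’ work here is just a little algebra: if the ball costs b, the bat costs b + 50, so b + (b + 50) = 55 and b = 2.50. A short Python check of that reasoning (using the dollar figures from the puzzle above):

```python
total = 55.0       # bat + ball, in dollars
difference = 50.0  # bat - ball, in dollars

# From ball + (ball + difference) = total:
ball = (total - difference) / 2
bat = ball + difference

print(ball, bat)  # 2.5 52.5

# Sanity checks: the pair satisfies both conditions of the puzzle.
assert ball + bat == total
assert bat - ball == difference
```

Note that the intuitive ‘System 1’ answer of $5 fails the check: a $5 ball would make the bat $55 and the total $60.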

Because biases appear to be so hardwired and difficult to shift, most of the attention paid to countering them to date hasn’t dealt with the problematic thinking itself and how ‘wrong’ it might be, but has instead been devoted to changing input behavior in the form of incentives or “nudges” (popularized by Richard Thaler and Cass Sunstein in their book ‘Nudge’). For instance, while ‘present-focus bias’ is hard to change, some organizations have been able to nudge employees into contributing to retirement plans by making saving the default option rather than something to opt into: you have to actively take steps in order to not participate. This also becomes a specific tactic for overcoming laziness or inertia bias. With this kind of approach, procedures can also be organized in a way that dissuades or prevents people from acting on other biased thoughts. A well-known example is the checklists for doctors and nurses put forward by Atul Gawande in his book ‘The Checklist Manifesto’ (which he based on the checklist system that pilots use before taking off).
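The default-option nudge described above can be sketched in code. The function name, parameters, and the 5% contribution rate below are hypothetical, chosen only to illustrate how an opt-out default differs from an opt-in one:

```python
# Hypothetical sketch of opt-in vs. opt-out retirement enrollment.
# Names and the 5% default rate are illustrative assumptions, not from the article.

def contribution_rate(action_taken: bool, opt_out_default: bool,
                      default_rate: float = 0.05) -> float:
    """Return an employee's contribution rate.

    opt_out_default=True  -> enrolled at default_rate unless they act to leave.
    opt_out_default=False -> enrolled only if they act to join.
    """
    if opt_out_default:
        return 0.0 if action_taken else default_rate
    return default_rate if action_taken else 0.0

# Inertia bias means most people take no action (action_taken=False):
print(contribution_rate(False, opt_out_default=False))  # 0.0  -> opt-in: not saving
print(contribution_rate(False, opt_out_default=True))   # 0.05 -> opt-out: saving by default
```

The point of the sketch is that nothing about the employee changes between the two cases; only the default branch does, which is exactly why the nudge works on inertia bias.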

Summary

Our aim in this article has been to offer a model for sound decision-making that we can draw upon when a situation calls for it, or when the outcomes matter for each of us personally or for the wider organization of which we are a part. Everything described above is therefore represented on the ‘busy’ but convenient chart below. This chart can consequently be used as its own kind of ‘checklist’ to ensure that we have at least thought about how we should ideally go about the decision-making task: first using an ‘outside-in’ process (the outer edge), and then using an ‘inside-out’ process to ensure that we have also considered whether or not cognitive biases might be at play in our thinking (this is shown in the inner diamond).


Clearly, this chart would be overkill for minor decision-making tasks, and even for more significant decisions there may not be the time or appetite to use it fully. For this reason, the chart includes several ‘shortcuts’. The first is to engage in the ‘outside-in’ process first, in the sequence shown (Data first, Knowledge and Expertise second, and so on); even where time or resources are limited, the bold first sub-step in each box will be extremely helpful to consider. Once we have considered all four boxes around the edge, we can move to the ‘inside-out’ cognitive processes and assess for potentially biased thinking. Once again, this is best done in the sequence shown (Inertia first, Pattern Recognition second, etc.), and, as with the outer-edge categories, where time or resources are limited the bold first cognitive bias in each cluster is often a very useful subject to consider and discuss.

Jon Warner is CEO of Silver Moonshots (www.SilverMoonshots.org), a research and support organization for enterprises interested in the 50+ older adult markets, with its own aging-focused virtual accelerator. He is also Chapter Ambassador for Aging 2.0 and on the Board of St Barnabas Senior Services (SBSS) in Los Angeles, California.

Jeannine Siviy

Sustainable Growth Strategist & Solutionist | Systems Thinker | Practical Innovator | Catalyst | Performance & Decision Analytics Thought Leader | Pathfinder

4y

Great article, including a quick-reference infographic, Jon Warner. It strikes me as part process (or pointers to frequent gaps in process) and part readiness. In both cases, it's a useful distillation of many factors (esp biases) into a manageable "most crucial short list." A couple thoughts percolating in my mind. *A flavor of "data correctness" (1c on the list, and implied a bit by the example) is "data is incomplete" -- which raises an interesting balance to strike with managing uncertainty. And the use of intuition in decision-making. *And I thought the point about overcorrecting for biases that don't exist was interesting. Still mulling this one - Do you have another example besides the one about optimism?

Justin Radeka

CEO | Startup Founder | Entrepreneurship Expert | Entrepreneurship Education

4y

Great Article Jon!
