
Kahneman, D., D. Lovallo and O. Sibony. 2011. Before you make that big decision: Dangerous biases can creep into every strategic choice. Here's how to find them - before they lead you astray. Harvard Business Review (June): 50-60.

Summary by James R. Martin, Ph.D., CMA
Professor Emeritus, University of South Florida


Many executives recognize that biases can distort strategic decisions, but this awareness has not significantly improved the quality of those decisions. This article addresses that problem by describing a way to detect bias and minimize its effects in non-routine decisions involving the review of recommendations that are important and recurring enough to justify a formal process. Examples used in the article include a radical overhaul of a company's pricing structure, a substantial capital investment in a manufacturing site, and a major acquisition that would complement a product line.

The Challenge of Avoiding Bias

In this section the authors address the question of why people are incapable of recognizing their own biases. Cognitive scientists have determined that there are two modes of thinking: intuitive and reflective. Intuitive thinking involves impressions, associations, feelings, intentions, and preparations for action that flow effortlessly. Reflective thinking is effortful and deliberate, and is mobilized when rule-based reasoning is required. Intuitive thinking is used most of the time, and because it is highly sensitive to context, it can lead us astray in various ways. Cognitive biases that creep into intuitive thinking provide one example, and we have no way of knowing when this is happening. Because we are unaware that we are making a biased decision, we accept our intuitive thinking without question. This is why knowing that you have biases is not enough to help you overcome them.

There is a way to deal with biases when we move from the individual to the group, or from the decision maker to the decision-making process. Biases can be neutralized, or at least reduced, at the organizational level because members of a group can apply rational thought to spot biases in the thinking of others. The authors propose adding a systematic review of recommendations to the decision-making process to identify biases that may have influenced the people making important proposals.

Decision Quality Control: A Checklist

The proposed review process includes a 12-question checklist designed to reveal the cognitive biases of the teams making recommendations. The checklist includes three categories: questions the decision makers should ask themselves, questions they should ask the team making the recommendation, and questions focused on evaluating the proposal.

Questions that decision makers should ask themselves:

1. Is there any reason to suspect motivated errors, or errors driven by the self-interest of the recommending team? This question is designed to check for self-interested biases that can result in significant motivated reasoning.

2. Have the people making the recommendation fallen in love with it? Has the recommending team been overly influenced by the affect heuristic, minimizing the risks and costs and overstating the benefits of a proposal they like (or the reverse for one they dislike)?

3. Were there dissenting opinions within the recommending team? If the recommendation is unanimous, it may be a case of groupthink imposed by the team's leader, or it may reflect a lack of diversity of background and viewpoint on the team.

Questions that decision makers should ask the team making recommendations:

4. Could the diagnosis of the situation be overly influenced by salient analogies? Past success stories might be used to promote a recommendation, but they may be less relevant than they appear and can lead to faulty inferences. This is referred to as saliency bias. The team should explore alternative, more rigorous diagnoses.

5. Have credible alternatives been considered? Teams are inclined to generate one probable hypothesis along with evidence to support it, but this might be the result of confirmation bias.

6. If you had to make this decision again in a year, what information would you want, and can you get more of it now? Failing to consider what data is missing is referred to as availability bias. In some cases, additional information that would be useful to the decision maker is available now.

7. Do you know where the numbers came from? Which numbers in the proposal are facts and which are estimates? Were the estimates developed by adjusting from another number? An anchoring bias can occur when initial estimates are best guesses that go unchallenged, or when estimates are based on historical data. The trap of anchors is the belief that they can simply be disregarded. To avoid this bias the team should reconsider its assumptions.

8. Can you see a halo effect? Is the team making false inferences that a person, organization, or approach that was successful in one area will be successful in another? Basing inferences about a recommendation on the reputation of a company or leader who successfully implemented a similar recommendation introduces a halo effect into the decision.

9. Are the people making the recommendation overly attached to past decisions? Has the team fallen for the sunk-cost fallacy, treating past expenditures that are irrelevant to the decision as if they were relevant?

Questions focused on evaluating the proposal:

10. Is the base case overly optimistic? This question can reveal a number of problems with a recommendation, including overconfidence, the planning fallacy, optimistic biases, and competitor neglect. The planning fallacy occurs when the team focuses on the case at hand and ignores the history of similar projects. Competitor neglect is the failure to anticipate how competitors will respond to the decision. The decision maker needs to take an outside view and adopt a "war games" approach.

11. Is the worst case bad enough? A range of scenarios should include a best and a worst case, but the worst case presented may still not be bad enough. Failing to ask what could happen that we have not thought of is referred to as disaster neglect. A technique used to avoid this problem is the "premortem," in which the worst case is assumed to have occurred and the team develops a story about how it happened.

12. Is the recommending team overly cautious? This question is designed to check for loss aversion or excessive conservatism. This problem is difficult to address because our desire to avoid losses is stronger than our desire to obtain gains. Explicitly sharing responsibility for risk is one way to address this issue, because no individual or team wants to be responsible for a failed project.

Implementing Quality Control Over Decisions

There is a time and place for the systematic review described above, and there are ways to make it part of your decision-making process.

When to use the checklist

As pointed out earlier, the 12-question review process is applicable to non-routine decisions involving the review of recommendations that are important and recurring enough to justify a formal process. It is not designed for routine decisions that executives rubber-stamp.

Who should conduct the review?

There should be a clear separation between the decision maker and the team making the recommendation.

Enforcing discipline

Using the checklist depends on discipline, not genius. Applying only some of the questions may be a recipe for failure.

Costs and benefits

The real challenge in implementing decision quality control is not the time and cost involved, but developing an awareness that even highly experienced, competent, and well-intentioned managers are fallible and subject to a variety of biases.

12-Question Checklist for Decisions
