

Don’t Get Caught in the Bias Trap


Are you more likely to be killed by a cow or a shark?

How many muscles are in the trunk of an African elephant?

Your fantasy baseball player with a .335 batting average had no hits in the last three games. Should you keep him in the lineup, or replace him with a .250 hitter who had four hits in the previous game?

And what in the world do these questions have to do with supply chain?

Well, how people answer these questions affects the quality of their decision-making — and shows whether they have fallen for a bias trap. Despite our best intentions, we all fall prey to various forms of bias that interfere with good decision-making. Fundamentally, bias means favoring one idea without giving equal consideration to alternatives. A common example in supply chain is forecast bias, or the tendency of a forecast to run persistently too high or too low.
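To make forecast bias concrete, here is a minimal sketch in Python, with invented demand figures, of one common way to measure it: the mean signed error between forecast and actual demand. A persistently positive value signals over-forecasting; a negative value signals under-forecasting. The numbers and the simple mean-error metric are illustrative assumptions; practitioners also use measures such as mean percentage error or tracking signals.

    # Illustrative only: measure forecast bias as the average signed error.
    # The demand figures below are invented for the example.
    forecasts = [120, 110, 130, 125, 140, 135]
    actuals = [100, 104, 118, 112, 126, 121]

    errors = [f - a for f, a in zip(forecasts, actuals)]
    bias = sum(errors) / len(errors)  # mean signed error per period

    print(f"Mean error: {bias:+.1f} units per period")
    if bias > 0:
        print("Forecasts run high (over-forecasting).")
    elif bias < 0:
        print("Forecasts run low (under-forecasting).")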

Many of these traps stem from the fact that people don’t carefully process every piece of information in every choice they make. Instead, we rely on mental shortcuts known as heuristics. These allow us to make decisions in the face of uncertainty or when extensive analysis is too costly or time-consuming. Nearly all decisions lean on these shortcuts, and that is what enables us to make so many decisions in a day — from choosing what we want for breakfast to hitting the brakes when we approach a red light. Human brains are wired to make these decisions to save time and even save our lives. This frees us to spend time on higher priorities and more complex problems.

But biases also can hinder us. They can creep in and distort our thinking if we blindly rely on them.

Keep an eye out

Although there are dozens of types of biases, they tend to fall into four main categories.

1. Overconfidence bias is the tendency to believe you are more certain than you really are about a given situation. Psychologists have shown that human beings are systematically overconfident in their judgments. Overconfidence has been blamed for the sinking of the Titanic, the nuclear accident at Chernobyl, the loss of Space Shuttles Challenger and Columbia, the subprime mortgage crisis of 2008, and the Deepwater Horizon oil spill in the Gulf of Mexico.

We tend to assume that the accuracy of our judgments or the probability of success in our endeavors is more favorable than the data would suggest. When there are risks, we alter our read of the odds to assume we’ll come out on the winning side. When we feel too confident in our understanding, we don’t spend enough time or money acquiring more information or running further analyses.

Experienced professionals and senior decision-makers who have been promoted based on past successes are especially vulnerable to this bias because they have received positive signals about their decision-making abilities throughout their careers. For example, a newly hired senior project manager with a successful track record of completing projects on time, within scope and under budget might feel extremely confident about excelling in new projects, even when those projects involve unfamiliar scope and complexity. This manager might approach the work without assessing the new challenges or planning how to address them.

2. Confirmation bias is the tendency to look for information that supports a preconceived notion and reject information that casts doubt on the desired outcome. Have you ever had a hunch about a particular stock and subsequently cherry-picked information supporting why it would be a great investment? If so, you’ve been swayed by confirmation bias.

This is the ultimate decision-making trap, and it becomes even harder to avoid when individuals face pressure from bosses or peers. For example, a supply chain executive who opposes the company’s practice of offshoring production might only present the benefits of reshoring, rather than helping the supply chain team effectively analyze both options. 

3. Stereotyping involves one’s perception of another person based on a generalization of that individual’s gender, ethnicity, personality or other factors, without having actual information about the person in question. This bias starts with humans’ automatic tendency to categorize individuals or objects to make sense of the world. Categorization provides order and predictability that we can rely on to guide our interactions with others. However, it also can lead us down paths we regret. For example, if you assume all Gen Zers don’t like to work in groups, all baby boomers are technophobes, and all millennials have a sense of entitlement, you could easily misjudge a valuable candidate or coworker.

Stereotypes also can represent positive opinions. For example, if your warehouse manager is a retired military commander, he might assume that all veterans have a strong work ethic, are disciplined and respect authority. If he sees these as ideal qualities for new hires, he might push human resources to consider only veterans for open roles. Although many veterans may indeed have these characteristics, so do many other candidates, and screening them out means overlooking qualified people.

4. Sunk-cost bias is the tendency to escalate commitment to a failing course of action because one already has invested a great deal of time, money and other resources that are not recoverable. Project Leap Frog was supposed to be a game changer for Universal Fulfillment Inc. It was going to seamlessly integrate 53 legacy systems into a single, user-friendly platform within five years and at a cost of $250 million. Five years and $500 million later, the platform still is not operational.

An internal assessment by the engineering and information technology teams indicated a low probability of success, and even then completion would require another three years and $150 million of investment. Despite the cost overruns, delays and prognosis, the CEO fumed at the suggestion of abandoning the work. He felt that they’d already spent $500 million, so there was no turning back. The CEO was emotionally attached to the effort. A rational decision-maker, by contrast, would weigh only the marginal costs and benefits of continuing and ignore the sunk costs.
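To see what ignoring sunk costs looks like in numbers, consider this sketch of a go/no-go check for a case like Project Leap Frog. The success probability and payoff below are hypothetical placeholders (the story says only that the assessed probability was low); the point is that the $500 million already spent never enters the calculation.

    # Hypothetical figures; only future costs and benefits matter.
    sunk_cost = 500_000_000          # already spent; deliberately unused below
    remaining_cost = 150_000_000     # three more years of investment
    payoff_if_success = 400_000_000  # assumed value of a working platform
    p_success = 0.25                 # assumed "low probability of success"

    expected_value_of_continuing = p_success * payoff_if_success - remaining_cost
    print(f"Expected value of continuing: ${expected_value_of_continuing:,.0f}")
    # Negative here, so the rational choice is to stop,
    # regardless of the $500 million already sunk.

With these assumed figures, continuing has an expected value of negative $50 million, so stopping is the rational choice even though walking away feels like wasting half a billion dollars.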

How to outsmart bias traps

All of these biases regularly skew decision-making — even when the best data, experience and logic are available. In addition, more than one bias can exist in a given situation, which makes avoiding these traps even more difficult. But once you’re aware that these traps exist, you can use a variety of tools and techniques to effectively mitigate your biases.

First, you should embrace scientific thinking. This is all about following a disciplined process in order to

  • carefully observe and understand the current state
  • describe it in an objective, fact-based and time-bound manner
  • form a hypothesis or prediction about the probable — although not necessarily desired — outcome
  • conduct experiments or tests
  • observe changes and review data
  • draw a conclusion.

For example, you could frame a problem like this: During 2020, the coronavirus pandemic contributed to a six-week delay in critical parts from one supplier in Italy. This resulted in several customer complaints and $50,000 in customer credits on $1 million in revenue. Such a description is specific, grounded in facts and data, bounded in time, and free of sweeping generalizations or emotion. From here, you can investigate your issue.

If you are having trouble framing your problem in a scientific manner, consider these questions:

  • What are we missing from our current understanding of the problem? Where are our blind spots? 
  • Do we have the right people involved in this discussion?
  • Do we have data that can help us? Is the data reliable?
  • Are the issues really symptoms of something else?
  • How are we framing the problem from a time perspective?
  • Is there a bigger problem to solve?

Another helpful technique is to bring in an outside perspective. Reach out to people who can play devil’s advocate or provide different viewpoints to stimulate active debate. This might include stakeholders from other departments who are not involved in the project. Neutral parties tend to be frank and ask both basic and pointed questions because they are removed from the situation.  

To avoid confirmation bias, actively look for information that disproves your beliefs. Ask yourself, “If my expectations are wrong, what pattern would I likely see in the data?”
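As a sketch of what that disconfirmation habit can look like with data (the lead times and the 14-day threshold below are invented for illustration), examine the whole sample and count the cases that contradict your belief, rather than tallying only the orders that support it.

    # Invented lead times (days) for one supplier.
    lead_times = [9, 12, 15, 11, 21, 13, 10, 18, 12, 25, 14, 16]
    threshold = 14  # belief: "deliveries arrive within 14 days"

    late = [t for t in lead_times if t > threshold]
    share_late = len(late) / len(lead_times)

    # Confirmation bias would count only the on-time orders.
    # The disconfirming pattern: 5 of 12 orders exceed the threshold.
    print(f"{len(late)} of {len(lead_times)} orders "
          f"({share_late:.0%}) contradict the belief.")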

Conducting a premortem thought exercise can be helpful in avoiding sunk-cost bias. In a postmortem analysis, the task typically is to understand the cause of a past failure. In a premortem thought exercise, you imagine a future failure and then explain the cause. Before committing to Project Leap Frog, the team in the example above could have asked some premortem questions: What happens if we’re unable to integrate with all 53 systems? What’s our backup plan? Will the technology be obsolete in five years?

This premortem technique helps temper optimism by identifying potential problems that ordinary foresight won’t bring to mind. It also guides you to prepare backup plans and exit strategies so you don’t keep sinking deeper into a financial hole.  

When trying to guess an outcome, such as in the case of forecasting, make three estimates: a low one, a medium one and a high one — also known as a pessimistic one, a base one and an optimistic one. People tend to give wider ranges when they think about their low and high estimates separately, which is why making three guesses is important. If you are unsure of the answer, provide a low and a high guess such that you are 90% sure the correct answer falls somewhere between the two.
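One common way to combine the three estimates into a single planning number is a PERT-style weighted average, a technique the article does not name and which is used here purely as an illustrative assumption. It weights the base case most heavily and uses the low-to-high spread as a rough measure of uncertainty.

    # Hypothetical three-point demand estimate for next quarter (units).
    low, base, high = 8_000, 10_000, 15_000  # pessimistic / base / optimistic

    # PERT-style weighted average: the base case gets four times the weight.
    estimate = (low + 4 * base + high) / 6
    spread = (high - low) / 6  # rough standard deviation of the estimate

    print(f"Planning estimate: {estimate:,.0f} units (+/- {spread:,.0f})")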

One last technique is to use pairwise ranking, a simple but powerful tool for setting priorities among multiple options. Each option is compared directly against every other option, and the output is a numerical ranking from best to worst.
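Here is a minimal sketch of pairwise ranking in Python; the options and the preference judgments are invented for illustration. In practice, a person or team judges each pair, a point goes to the preferred option, and the totals determine the ranking.

    from itertools import combinations

    # Invented options; in practice a person or team judges each pair.
    options = ["Reshore plant", "Dual-source parts", "Add safety stock"]

    # Stand-in for human judgment: the preferred option of each pair.
    preferences = {
        ("Reshore plant", "Dual-source parts"): "Dual-source parts",
        ("Reshore plant", "Add safety stock"): "Add safety stock",
        ("Dual-source parts", "Add safety stock"): "Dual-source parts",
    }

    wins = {opt: 0 for opt in options}
    for a, b in combinations(options, 2):
        wins[preferences[(a, b)]] += 1

    ranking = sorted(wins.items(), key=lambda kv: kv[1], reverse=True)
    for rank, (option, score) in enumerate(ranking, start=1):
        print(f"{rank}. {option} ({score} wins)")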

Psychology Today reports that the average person makes 35,000 decisions per day. For most of them, relying on a heuristic is perfectly appropriate. But for the small subset of important decisions that require more objectivity, stay alert to these biases and counter them with the strategies above.

And, in case you’re still wondering:

  • Cows kill four times more people than sharks.
  • The African elephant has 100,000 muscles in its trunk.
  • Keep your top baseball player in the lineup; a handful of games is too small a sample to be statistically meaningful.

About the Author

Peter J. Sherman, CSCP, Managing Partner, Riverwood Associates

Peter J. Sherman, CSCP, is managing partner at Riverwood Associates, a process improvement training and consulting firm based in Atlanta. He is a certified Lean Six Sigma Master Black Belt and previously served as lead instructor of Emory University's Six Sigma Certificate program. Sherman may be contacted at peter@riverwoodassociates.com.