Outsmart Your Own Biases: Improving Decision Quality

In their article, “Outsmart Your Own Biases,” Soll, Milkman, and Payne argue that we (as humans) are cognitive misers and that our decisions are marred by biases. This is not a critique of intelligence; it is a critique of energy conservation. The brain seeks shortcuts. The errors come from two sources: System 1 thinking (fast, automatic judgments based on past experience) and misdirected System 2 thinking (deliberate reasoning focused on the wrong things).

These biases lead us to focus on only one possible future, one objective, and one option in isolation. The danger is rarely that the single option is “bad,” but that it is fragile. By ignoring alternatives, leaders create plans that work perfectly in one specific scenario but fail catastrophically if reality deviates even slightly.

This article covers the core mechanisms of these biases, practical ways to overcome them, and where the research meets the reality of business speed.

Drivers of Biases: Emotion and Investment

The authors note that these cognitive biases are amplified by strong emotional attachments or investments. When a leader has “skin in the game” or has championed a project publicly, their ability to evaluate it objectively degrades. Trip wires, or predetermined decision points, can help ensure that logical decisions are made and preempt these biases.

A trip wire acts like a circuit breaker: it forces a pause when a specific metric is hit (e.g., “if revenue does not grow by 10% in Q1, we must review the strategy”). Without trip wires, commitment bias takes over, and teams double down on failing courses of action simply because they have already invested so much effort. How to overcome these biases is what I found most intriguing, and I will cover those methods below.
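A trip wire of this kind is easy to encode as a pre-committed rule. The minimal sketch below uses the revenue example above; the metric name and the 10% threshold are illustrative, not prescribed by the authors:

```python
def trip_wire_fired(actual_growth, required_growth=0.10):
    """Return True when the pre-committed review point is hit.

    Example rule from the text: "if revenue does not grow by 10%
    in Q1, we must review the strategy."
    """
    return actual_growth < required_growth

# Q1 revenue grew only 4%, so the trip wire forces a strategy review.
if trip_wire_fired(0.04):
    print("Trip wire hit: pause and review the strategy.")
```

The value of writing the rule down in advance is that the threshold is chosen before commitment bias sets in, so the review cannot be rationalized away later.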

Bias 1: One Possible Future

The One Possible Future bias means we are overconfident in our estimates, settling on a single best guess rather than hedging our bets. This manifests as “single-point forecasting”: assuming the project will take exactly six months, or the market will grow exactly 5%. To overcome this bias, the authors suggest four techniques.

First, make at least three estimates rather than a single guess or a vague range. The low and high estimates bracket the planning space, while the middle estimate stays realistic. Ranges are vague; specific points force the brain to construct scenarios. If the low estimate is “sales drop 20%,” the team has to ask why that would happen, which uncovers risks that a simple “+/- 10%” range hides.
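Three point estimates can also be combined into a single planning number. The PERT-style weighted mean below is a common project-management convention, not something the authors prescribe, and the duration figures are hypothetical:

```python
def three_point_estimate(low, likely, high):
    """Combine low/likely/high estimates into a PERT-style weighted mean.

    The 1-4-1 weighting is a common convention, assumed here for
    illustration; the key point is that all three scenarios feed in.
    """
    expected = (low + 4 * likely + high) / 6
    return {"low": low, "likely": likely, "high": high, "expected": expected}

# Hypothetical project-duration estimates, in months.
estimate = three_point_estimate(low=4, likely=6, high=12)
```

Note that the expected value lands above the “likely” guess whenever the downside tail is long, which is exactly the correction single-point forecasts miss.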

Second, think twice: project two outcomes separated in time. Estimating once, waiting, and estimating again yields two independent perspectives and surfaces information the first pass missed. It also reduces the influence of momentary mood or recent news (recency bias) on the forecast.

Third, use pre-mortems: identify potential future issues and plan how to tackle them before they occur. A pre-mortem differs from a risk assessment because it assumes failure has already happened. “It is one year from now and the project has failed. Why?” This reframing frees teams to speak openly about flaws they might otherwise suppress to be “polite” or “optimistic.”

Fourth, take an outside view on projects to avoid the planning fallacy, the tendency to assume your project is unique and will therefore beat the odds. Taking an outside view means asking, “How long do projects like this usually take for other people?” Reference class forecasting uses that external data to anchor your timeline in reality rather than hope.
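Reference class forecasting can be sketched as blending the inside-view estimate with the base rate from comparable projects. The 50/50 weighting and the sample durations below are illustrative assumptions, not figures from the article:

```python
import statistics

def outside_view(inside_estimate, reference_durations, weight=0.5):
    """Anchor a forecast on what similar projects actually took.

    weight is the share given to the reference class; 0.5 is an
    assumed starting point, not a recommendation from the authors.
    """
    base_rate = statistics.median(reference_durations)
    return (1 - weight) * inside_estimate + weight * base_rate

# We believe 6 months; five comparable projects took 8 to 12 months.
anchored = outside_view(6, [9, 10, 12, 8, 11])
```

Even a crude blend like this pulls an optimistic inside view toward the historical record, which is the whole point of the outside view.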

Bias 2: Thinking About Objectives

Thinking About Objectives is the bias of unwittingly setting direction from only a subset of goals because a larger range is never considered. For example, a team might optimize purely for “speed to market” and forget “maintainability,” ensuring they launch a product that immediately breaks. There are two approaches to overcoming this bias.

Start by seeking advice from others on objectives in an independent setting, to avoid anchoring them on your own framing. Independent input matters because if people hear your goals first, they will frame their advice to fit your bias. Asking “what should we care about?” before sharing “here is what I care about” produces a richer set of success criteria.

The next step is to cycle through the objectives, reviewing the merit of each one until a converged view emerges. Instead of trying to solve for everything at once, look at the decision solely through the lens of cost, then solely through the lens of quality, then solely through the lens of speed. This serial attention ensures no critical dimension is ignored just because it was not the loudest one in the room.
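Serial attention can be sketched as scoring every option through one objective lens at a time. The options, objectives, and scores below are invented for illustration:

```python
def cycle_objectives(options, objectives):
    """For each objective, find the option that wins on that lens alone."""
    winners = {}
    for objective in objectives:
        best = max(options, key=lambda opt: opt["scores"][objective])
        winners[objective] = best["name"]
    return winners

# Hypothetical vendor options scored 1-10 on each dimension.
vendors = [
    {"name": "A", "scores": {"cost": 3, "quality": 9, "speed": 4}},
    {"name": "B", "scores": {"cost": 8, "quality": 5, "speed": 7}},
]
winners = cycle_objectives(vendors, ["cost", "quality", "speed"])
```

Cycling the lenses shows that A wins on quality while B wins on cost and speed, so the quality dimension cannot be silently dropped from the discussion.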

Bias 3: Thinking About Options

Thinking About Options is the bias of rarely considering more than one option: questions are framed to elicit a yes-or-no response, and we are constrained by assumptions rooted in past experience. “Should we fire this vendor?” is a yes/no trap. A better question is, “What are all the ways we could improve vendor performance?” The two ways to avoid this bias are making joint decisions and trying the vanishing-options test.

In the former, decisions are made by evaluating the available choices jointly rather than sequentially. When options are compared side by side, subtle trade-offs become visible. In the latter, you assume that none of the available options can be selected, forcing you to generate alternatives that might otherwise never be considered. If you could not do the thing you want to do, what would you do? That question often reveals the actual best path.
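Both moves can be sketched together: score the options side by side, then remove the front-runner and see how the comparison looks without it. The option names, criteria, weights, and scores below are all hypothetical:

```python
def evaluate_jointly(options, weights):
    """Score all options side by side so trade-offs become visible."""
    return {name: sum(weights[c] * score for c, score in scores.items())
            for name, scores in options.items()}

def vanish_option(options, favourite):
    """Vanishing-options test: pretend the favourite is off the table."""
    return {name: s for name, s in options.items() if name != favourite}

options = {
    "keep vendor":   {"cost": 7, "quality": 4},
    "switch vendor": {"cost": 4, "quality": 9},
    "renegotiate":   {"cost": 6, "quality": 6},
}
weights = {"cost": 0.5, "quality": 0.5}

scores = evaluate_jointly(options, weights)
fallback = evaluate_jointly(vanish_option(options, "switch vendor"), weights)
```

With the favourite removed, “renegotiate” surfaces as the strongest remaining path, an option a sequential yes/no framing would likely never have examined.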

The final recommendation from the authors is to anticipate three future possibilities, three key objectives, and three viable options. By applying these approaches, the biases can be preempted and we can nudge ourselves in the right direction. This “Rule of 3” prevents the tunnel vision that defines System 1 errors.

Shortcomings: The Time Factor & "Blink" Decisions

From a practical perspective, all the suggested methods are valid, though time as a factor is rarely taken into consideration. Malcolm Gladwell, in his book "Blink," describes "thin slicing," the ability to find patterns based on narrow windows of experience. If time is of the essence, then considering multiple futures, objectives, or options may be a luxury that is not available.

There is a tension here between accuracy and velocity. The research optimizes for accuracy; business often demands velocity. The skilled leader knows when to slow down for System 2 rigor (strategic bets) and when to trust System 1 intuition (operational fires).

The Product Life Cycle (PLC) and Decision Making

From a business perspective, the decision-making process can be correlated to the stage of growth in the Product Life Cycle.

  • Embryonic Stage: Decisions are rapid, as there are minimal to no processes that exist. The cost of delay often exceeds the cost of a mistake.
  • Growth Stage: Decisions are made to ensure growth is viable, though pivoting is still possible, since mass-market adoption of the product may not yet have occurred. Structure begins to form, but flexibility remains key.
  • Maturity Stage: Processes have to exist and recommendations have to be considered, to avoid the decline of the business. Here, biases are most dangerous because complacency (“we’ve always done it this way”) is a strong anchor.

Ref: Soll, J. B., Milkman, K. L., & Payne, J. W. (2015). Outsmart your own biases. Harvard Business Review, 93(5), 64-71. Retrieved from https://hbr.org/2015/05/outsmart-your-own-biases