Thinking in Bets

Annie Duke


Separate decision and outcome. Separate skill and luck. Find other people who are seeking truth and hold each other accountable. This book is full of actionable advice for improving decision-making.

Notes

Over time, those world-class poker players taught me to understand what a bet really is: a decision about an uncertain future. The implications of treating decisions as bets made it possible for me to find learning opportunities in uncertain environments.

Thinking in bets starts with recognizing that there are exactly two things that determine how our lives turn out: the quality of our decisions and luck. Learning to recognize the difference between the two is what thinking in bets is all about.

Pete Carroll was a victim of our tendency to equate the quality of a decision with the quality of its outcome. Poker players have a word for this: “resulting.” When I started playing poker, more experienced players warned me about the dangers of resulting, cautioning me to resist the temptation to change my strategy just because a few hands didn’t turn out well in the short run.

When we work backward from results to figure out why those things happened, we are susceptible to a variety of cognitive traps, like assuming causation when there is only a correlation, or cherry-picking data to confirm the narrative we prefer. We will pound a lot of square pegs into round holes to maintain the illusion of a tight relationship between our outcomes and our decisions.

Poker, in contrast, is a game of incomplete information. It is a game of decision-making under conditions of uncertainty over time. (Not coincidentally, that is close to the definition of game theory.) Valuable information remains hidden. There is also an element of luck in any outcome. You could make the best possible decision at every point and still lose the hand, because you don’t know what new cards will be dealt and revealed. Once the game is finished and you try to learn from the results, separating the quality of your decisions from the influence of luck is difficult.

We are discouraged from saying “I don’t know” or “I’m not sure.” We regard those expressions as vague, unhelpful, and even evasive. But getting comfortable with “I’m not sure” is a vital step to being a better decision-maker. We have to make peace with not knowing.

What makes a decision great is not that it has a great outcome. A great decision is the result of a good process, and that process must include an attempt to accurately represent our own state of knowledge. That state of knowledge, in turn, is some variation of “I’m not sure.”

Any prediction that is not 0% or 100% can’t be wrong solely because the most likely future doesn’t unfold. When the 24% result happened at the final table of the charity tournament, that didn’t reflect inaccuracy about the probabilities as determined before that single outcome. Long shots hit some of the time.

Decisions are bets on the future, and they aren’t “right” or “wrong” based on whether they turn out well on any particular iteration. An unwanted result doesn’t make our decision wrong if we thought about the alternatives and probabilities in advance and allocated our resources accordingly.

Should we be willing to give up the good feeling of “right” to get rid of the anguish of “wrong”? Yes.

By treating decisions as bets, poker players explicitly recognize that they are deciding on alternative futures, each with benefits and risks. They also recognize there are no simple answers. Some things are unknown or unknowable. The promise of this book is that if we follow the example of poker players by making explicit that our decisions are bets, we can make better decisions and anticipate (and take protective measures) when irrationality is likely to keep us from acting in our best interest.

Truthseeking, the desire to know the truth regardless of whether the truth aligns with the beliefs we currently hold, is not naturally supported by the way we process information. We might think of ourselves as open-minded and capable of updating our beliefs based on new information, but the research conclusively shows otherwise. Instead of altering our beliefs to fit new information, we do the opposite, altering our interpretation of that information to fit our beliefs.

Then your friend says, “Wanna bet?” Suddenly, you’re not so sure. That challenge puts you on your heels, causing you to back off your declaration and question the belief that you just declared with such assurance. When someone challenges us to bet on a belief, signaling their confidence that our belief is inaccurate in some way, ideally it triggers us to vet the belief, taking an inventory of the evidence that informed us.

Offering a wager brings the risk out in the open, making explicit what is already implicit (and frequently overlooked).

What if, in addition to expressing what we believe, we also rated our level of confidence about the accuracy of our belief on a scale of zero to ten?

There is no sin in finding out there is evidence that contradicts what we believe. The only sin is in not using that evidence as objectively as possible to refine that belief going forward.

How we figure out what—if anything—we should learn from an outcome becomes another bet. As outcomes come our way, figuring out whether those outcomes were caused mainly by luck or whether they were the predictable result of particular decisions we made is a bet of great consequence. If we determine our decisions drove the outcome, we can feed the data we get following those decisions back into belief formation and updating, creating a learning loop.

If making the same decision again would predictably result in the same outcome, or if changing the decision would predictably result in a different outcome, then the outcome following that decision was due to skill.

Unfortunately, learning from watching others is just as fraught with bias. Just as there is a pattern in the way we field our own outcomes, we field the outcomes of our peers predictably.

Instead of feeling bad when we have to admit a mistake, what if the bad feeling came from the thought that we might be missing a learning opportunity just to avoid blame? Or that we might be basking in the credit of a good result instead of, like Phil Ivey, recognizing where we could have done better? If we work toward that, we can transform the unproductive habits of mind of self-serving bias and motivated reasoning into productive ones.

The people with the most legitimate claim to a bulletproof self-narrative have developed habits around accurate self-critique.

The key is that in explicitly recognizing that the way we field an outcome is a bet, we consider a greater number of alternative causes more seriously than we otherwise would have. That is truthseeking.

The first step is identifying the habit of mind that we want to reshape and how to reshape it. That first step is hard and takes time and effort and a lot of missteps along the way. So the second step is recognizing that it is easier to make these changes if we aren’t alone in the process. Recruiting help is key to creating faster and more robust change, strengthening and training our new truthseeking routines.

Such interactions are reminders that not all situations are appropriate for truthseeking, nor are all people interested in the pursuit.

Forming or joining a group where the focus is on thinking in bets means modifying the usual social contract. It means agreeing to be open-minded to those who disagree with us, giving credit where it’s due, and taking responsibility where it’s appropriate, even (and especially) when it makes us uncomfortable.

In combination, the advice of these experts in group interaction adds up to a pretty good blueprint for a truthseeking charter: a focus on accuracy (over confirmation), which includes rewarding truthseeking, objectivity, and open-mindedness within the group; accountability, for which members have advance notice; and openness to a diversity of ideas.

As the Supreme Court has become more divided, this practice has all but ceased. According to a New York Times article in 2010, only Justice Breyer regularly employed clerks who had worked for circuit judges appointed by presidents of both parties. Since 2005, Scalia had hired no clerks with experience working for Democrat-appointed judges. In light of the shift in hiring practices, it should not be so surprising that the court has become more polarized.

A growing number of businesses are, in fact, implementing betting markets to overcome the difficulty of eliciting and encouraging contrary opinions. Companies implementing prediction markets to test decisions include Google, Microsoft, General Electric, Eli Lilly, Pfizer, and Siemens. People are more willing to offer their opinion when the goal is to win a bet rather than get along with the people in a room.

Within our own decision pod, we should strive to abide by the rule that “more is more.” Get all the information out there. Indulge the broadest definition of what could conceivably be relevant. Reward the process of pulling the skeletons of our own reasoning out of the closet.
