Thinking in Bets: Making Smarter Decisions When You Don’t Have All the Facts
By: Annie Duke
Written by a former professional poker player, the book explores lessons from the world of poker, which, unlike chess, is a game of incomplete information: decisions (bets) are made under conditions of uncertainty and luck can influence the outcome.
The key concept of the book is that you should approach many decisions in life in the same way you approach bets in a game of poker:
- Avoid decision making traps.
- Learn from results in a rational way.
- Keep emotions out of the decision making process.
In the spirit of the book’s useful behavioral suggestions: this book had a promising start but didn’t deliver as many insights as I was hoping for. Much of the content could easily fit into a blog post.
Viewing decision making through the lens of betting is an interesting, if at times restrictive, approach. Some of the real-life examples used in the book are easy to poke holes in or feel forced. It’s not always clear how the betting lens is more helpful than going back to what the social sciences or information theory have to say about making decisions under uncertainty (for instance, “The Laws of Medicine” is similarly aimed at making you a better “belief calibrator”, but probably does a cleaner job of providing simple and practical rules). Some elements of decision making could be explored in more detail, such as goal-setting or the role of feedback (for instance, saying “wanna bet” is likely effective because it gives beliefs immediate and measurable consequences – an aspect the book doesn’t spend much time on).
- Higher IQ increases the risk of larger blind spots: smarter people are better at constructing elaborate narratives that support their existing beliefs.
- Be a better belief calibrator.
- Test, adjust, be open.
We are biased from an evolutionary perspective to first experience, then believe what we experience, and finally, (sometimes) question if the belief that we have just formed is actually “true”. We are biased to avoid the last step of evaluating our beliefs because of the low evolutionary cost of false positives: it’s better to be wrong about a noise being a potential snake in the grass than to question that belief first. Consequently, we seek out evidence that confirms our beliefs and we avoid contradictory evidence (“motivated reasoning”). The smarter you are, the better you may become at constructing narratives that support your beliefs (IQ correlates with larger blind spots). Protecting your beliefs leads to biased reasoning, increased polarization and inhibits “truth-seeking” behavior.
Bets are based on your beliefs about the world. It therefore pays to be a better belief calibrator, using your experience and information to adjust your beliefs about the world. Saying “Wanna bet” triggers the step of having to vet your beliefs and forces an examination of beliefs in a less biased way. Beliefs can become more nuanced, acknowledging levels of uncertainty (instead of: something or someone is “always” or “never” good, bad, etc.).
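Belief calibration is, at bottom, the kind of updating Bayes’ rule describes. As an illustration (mine, not the book’s), here is a minimal sketch of revising confidence in a belief after seeing one piece of evidence:

```python
def update_belief(prior, p_evidence_if_true, p_evidence_if_false):
    """Bayes' rule: revised probability that a belief is true,
    given the prior and how likely the observed evidence is
    under each hypothesis."""
    numerator = prior * p_evidence_if_true
    denominator = numerator + (1 - prior) * p_evidence_if_false
    return numerator / denominator

# Start 70% confident; observe evidence that is twice as likely
# if the belief is false (0.6) as if it is true (0.3).
belief = update_belief(0.70, 0.3, 0.6)  # drops to roughly 0.54
```

The point of “wanna bet?” is to force exactly this move: instead of holding a belief as all-or-nothing, you state a prior and let the evidence shift it by degrees.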
Outcomes are influenced by a combination of skill (our decision making) and luck (things we can’t control). When we seek to understand how outcomes came to be, driven by the need for a positive self-image, we typically take 100% of the credit (skill) for good outcomes and blame bad outcomes 100% on conditions (luck) – the “self-serving bias”. It follows that (especially in zero-sum games) other people’s good outcomes are driven by good luck (they have to be, because my corresponding bad outcome has to be due to bad luck) and other people’s bad outcomes are driven by their lack of skill – which also lets us compare ourselves favorably to our peers. “Fielding outcomes” along these lines is habitual and inhibits the feedback loop of learning and adjusting beliefs.
Habits are neurological loops: a cue triggers a routine, which yields a reward. To change a habit, you keep the old cue and insert a new routine that still delivers the old reward. When we field outcomes in a self-serving manner, the reward the brain looks for is a positive self-image. The same positive self-image can be achieved by deliberately introducing probabilistic thinking techniques: testing alternative hypotheses, switching perspectives, acknowledging uncertainty (all of this is perhaps more common sense than “wanna bet”). These techniques minimize motivated reasoning and self-serving biases, improve accuracy, and thereby improve decision making.
Pursuing this change of habits is easier in a group than solo. It pays to form groups that focus on accuracy (over confirmation), accountability, openness and diversity of beliefs. Rules of engagement for truth-seeking groups include communism (share the data), universalism (separate the message from the source; don’t automatically discredit something because of who said it), disinterestedness (don’t let the outcome influence your assessment of the decision-making process), and organized skepticism (“I’m not sure – are you sure?”). Outside of your truth-seeking group, be more careful – not everyone is on board with this type of truth-seeking behavior. Use appropriate language (“Yes, and” instead of “No” or “Yes, but”) and understand time and place (sometimes people are not looking for truth or advice; they just want to vent).
Sometimes outcomes are not known for a long time, or it is tempting to pursue short-term outcomes that are visible and attractive now but potentially harmful in the long run (favoring your present self at the expense of your future self: “temporal discounting”). In these cases, decision making can be improved by better accounting for future outcomes:
- Imagine the future more vividly, including potential regret.
- Avoid in-the-moment (emotional) decisions based on immediate outcomes.
- Take the long view instead of focusing on the latest up- or down-tick.
- Raise barriers ahead of time against irrational decisions.
- Plan for both positive and negative scenarios.
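Temporal discounting has a standard quantitative form in behavioral economics – hyperbolic discounting – which the book alludes to but doesn’t formalize. A sketch (my illustration; the parameter k and the dollar figures are made up for the example):

```python
def hyperbolic_value(amount, delay_days, k=0.05):
    """Hyperbolic discounting: the perceived present value of a
    reward received `delay_days` from now. Larger k = more
    impatient, i.e. steeper discounting of the future self."""
    return amount / (1 + k * delay_days)

# $100 today vs $150 in 30 days: with k=0.05 the delayed reward
# is perceived as 150 / (1 + 1.5) = 60, so the impatient chooser
# takes the smaller reward now.
today = hyperbolic_value(100, 0)    # 100.0
later = hyperbolic_value(150, 30)   # 60.0
```

The techniques above (vivid imagining, pre-commitment barriers) all amount to lowering the effective k – making the future self weigh more heavily at decision time.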