Our mentor suggested that we read "Thinking in Bets: Making Smarter Decisions When You Don't Have All the Facts" by Annie Duke, saying the book would help us make smarter decisions.
Who is this book for?
This book is for everyone, absolutely everyone! Each of us makes numerous decisions every day, and if we can improve the decision-making process even slightly, the final outcomes can be vastly improved.
Disclaimer: This post is by no means a summary of the book; I would encourage everyone to grab the book and give it a go.
Book Notes
Life Is Poker, Not Chess
Humans have a tendency to equate the quality of a decision with the quality of its outcome. Poker players have a word for this: "resulting".
- Why are we so bad at separating luck and skill?
- Why are we so uncomfortable knowing that the results can be beyond our control?
- Why do we create such a strong connection between results and the quality of decisions preceding them?
If we are asked to list our best and worst decisions of the previous year, most of us remember the best and worst results rather than the best and worst decisions.
Hindsight bias is the tendency, after an outcome is known, to see the outcome as having been inevitable. We say this all the time: "I should have known this would happen" or "I should have seen it coming". In doing so, we are succumbing to hindsight bias. Those beliefs develop from an overly tight connection between outcomes and decisions, and this is typical of how we evaluate our past decisions.
When we work backwards from results to figure out why those things happened, we are susceptible to a variety of cognitive traps, like assuming causation when there is only a correlation or cherry-picking data to confirm the narrative we prefer.
Many decision-making missteps originate from the pressure on the reflexive system to do its job fast and automatically. Most of what we do daily exists in automatic processing. The challenge is not to change the way our brains operate but to figure out how to work within the limitations of the brains we have.
Poker players have to make multiple decisions with significant financial consequences in a compressed time frame, and do it in a way that lassoes their reflexive minds to align with their long-term goals. This makes the poker table a unique laboratory for studying decision-making.
Chess, on the other hand, is a well-defined form of computation. It contains no hidden information and very little luck. If you lose a game of chess, it must be because there were better moves that you didn't make or didn't see. Chess, for all its strategic complexity, isn't a great model for decision-making in life, where most of our decisions involve hidden information and a much greater influence of luck.
Poker, in contrast, is a game of incomplete information: a game of decision-making under conditions of uncertainty over time. Valuable information remains hidden, and there is an element of luck in any outcome. You could make the best possible decision at every point and still lose the hand.
The quality of our lives is the sum of decision quality plus luck.
Wanna Bet?
When you are betting, you have to back up your belief by putting a price on it. By treating decisions as bets, we can explicitly recognise that we are deciding on alternative futures, each with benefits and risks. All decisions are bets. In most decisions we are betting against all the future versions of ourselves that we are not choosing. Whenever we make a choice we are betting on a potential future.
We bet based on what we believe about the world. Part of the skill in life comes from learning to be a better belief calibrator, using experience and information to more objectively update our beliefs so that they more accurately represent the world. The more accurate our beliefs, the better the foundation of the bets we make.
We form beliefs in a haphazard way, believing all sorts of things based just on what we hear out in the world but haven't researched for ourselves. This is how we form beliefs:
- We hear something;
- We believe it to be true;
- Only sometimes, later, if we have the time or the inclination, do we think about it and vet it, determining whether it is, in fact, true or false.
Once a belief is lodged, it becomes difficult to dislodge it. It takes on a life of its own, leading us to notice and seek out evidence confirming our belief, rarely challenge the validity of confirming evidence, and ignore or work hard to actively discredit information contradicting the belief. This irrational, circular information processing pattern is called motivated reasoning.
Fake news isn't meant to change minds. As we know, beliefs are hard to change. The potency of fake news is that it entrenches beliefs its intended audience already has and then amplifies them. The internet is a playground for motivated reasoning, and many social media sites tailor our experience to show us more of what we already like.
When someone challenges us to bet on a belief, it triggers us to vet that belief, taking an inventory of the evidence that informed it. The "Wanna bet?" question pushes us into the third step of belief formation: vetting the belief.
We would be better served as communicators and decision-makers if we thought less about whether we are confident in our beliefs and more about how confident we are. What if, in addition to expressing what we believe, we also rated our level of confidence in the accuracy of each belief on a scale of 0 to 10? Forcing ourselves to express how sure we are of our beliefs brings their probabilistic nature into plain sight.
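To make that concrete, here is a minimal sketch (my illustration, not the book's) of why the rating matters: if we read a 0-10 confidence as a rough probability, we can weigh what acting on the belief is worth before we commit. The function name and payoff numbers below are made up for the example.

```python
# Toy sketch (not from the book): read a 0-10 confidence rating as a
# rough probability and use it to weigh the bet behind a belief.

def expected_value(confidence_0_to_10, payoff_if_right, loss_if_wrong):
    """Treat a 0-10 confidence rating as a probability and return the
    expected value of acting on the belief."""
    p = confidence_0_to_10 / 10  # e.g. "I'm a 7 out of 10" -> 0.7
    return p * payoff_if_right - (1 - p) * loss_if_wrong

# A "7 out of 10" belief is worth acting on here; a "5 out of 10" is a coin flip.
print(expected_value(7, payoff_if_right=100, loss_if_wrong=100))  # 40.0
print(expected_value(5, payoff_if_right=100, loss_if_wrong=100))  # 0.0
```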
Bet to Learn: Fielding the Unfolding Future
Experience can be an effective teacher. But, clearly, only some students listen to their teachers. People who learn from experience improve and advance. The future we have bet on unfolds as a series of outcomes. As the future unfolds into a set of outcomes, we are faced with another decision: Why did something happen the way it did?
As outcomes come our way, figuring out whether they were caused mainly by luck or were the predictable result of particular decisions we made is a bet of great consequence. If we determine that our decisions drove the outcome, we can feed the data from those outcomes back into belief formation, creating a learning loop.
Actively using outcomes to examine our beliefs and bets closes the feedback loop, reducing uncertainty. The bets we make on when and how to close the feedback loop are part of the execution.
The way our lives turn out is a result of two things: the influence of skill and the influence of luck. Any outcome that is a result of our decision-making falls in the skill category; if an outcome occurs because of things we can't control, the result is due to luck. Chalk up an outcome to skill, and we take credit for the result; chalk it up to luck, and it wasn't in our control. Only the outcomes we attribute to skill feed back into the learning loop.
Outcomes don't tell us what's our fault and what isn't, what we should take credit for and what we shouldn't. This makes learning from outcomes a pretty haphazard process. We take credit for the good stuff and blame the bad stuff on luck so it won't be our fault, and the result is that we don't learn from experience well. This is called "self-serving bias", and it is a deeply embedded and robust thinking pattern. Understanding why it emerges is the first step to developing practical strategies to improve our ability to learn from experience.
We blame our own bad outcomes on bad luck, but when it comes to our peers, bad outcomes are clearly their fault. Our own good outcomes are due to our awesome decision-making, but when it comes to other people, good outcomes are because they got lucky.
Be a better credit-giver than your peers, more willing than others to admit mistakes, more willing to explore possible reasons for an outcome with an open mind, even, and especially, if that might cast you in a bad light or shine a good light on someone else.
The Buddy System
We need to find only a handful of people willing to do the exploratory thinking necessary for truth seeking. Being in a group can improve our decision quality by exploring alternatives and recognising where our thinking might be biased, but a group can also exacerbate our tendency to confirm what we already believe. Interacting with similarly motivated people improves the ability to combat bias not just during direct interactions but when we are making and analysing decisions on our own.
Accountability improves our decision-making and information processing when we are away from the group because we know in advance that we will have to answer to the group for our decisions. To get a more objective view of the world, we need an environment that exposes us to alternate hypotheses and different perspectives. To view ourselves in a more realistic way, we need other people to fill in our blind spots. Accuracy, accountability and diversity wrapped into a group's character all contribute to better decision-making, especially if the group promotes thinking in bets.
Dissent to Win
Be a data sharer. That's what experts do. In fact, that's one of the reasons experts become experts. They understand that sharing data is the best way to move towards accuracy, because it extracts the highest-fidelity insights from your listeners.
When we have a negative opinion about the person delivering the message, we close our minds to what they are saying and miss a lot of learning opportunities. Likewise, when we have a positive opinion of the messenger, we tend to accept the message without much vetting. Both are bad.
Our brains have built-in conflicts of interest, interpreting the world around us to confirm our beliefs, to avoid having to admit ignorance or error, to take credit for good results following our decisions, to find reasons bad results following our decisions were due to factors outside our control, to compare well with our peers, and to live in a world where the way things turn out makes sense.
Skepticism is about approaching the world by asking why things might not be true rather than why they are true. A productive decision group would do well to organise around skepticism.
Adventures in Mental Time Travel
When we make in-the-moment decisions (and don't ponder the past or future), we are more likely to be irrational and impulsive. This tendency to favour our present self at the expense of our future self is called temporal discounting: we are willing to take an irrationally large discount to get a reward now instead of waiting for a bigger reward later. When we think about the past and the future, we engage our deliberative mind, improving our ability to make more rational decisions.
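To see how steep that discount can get, here is a rough back-of-the-envelope sketch (my numbers, not the book's) of the annual discount rate implied by grabbing a smaller reward now instead of a larger one later.

```python
# Toy illustration of temporal discounting (numbers made up for the example).

def implied_annual_discount_rate(now_amount, later_amount, years):
    """Rate r such that later_amount / (1 + r) ** years == now_amount."""
    return (later_amount / now_amount) ** (1 / years) - 1

# Taking $70 today instead of $100 a year from now means discounting the
# future at roughly 43% per year, far steeper than any reasonable return.
print(implied_annual_discount_rate(70, 100, 1))  # ~0.43
```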
Moving regret in front of a decision has numerous benefits. First, and most obviously, it can influence us to make a better decision. Second, it helps us treat ourselves more compassionately after the fact. We can anticipate and prepare for negative outcomes: by planning ahead, we can devise a plan to respond to a negative outcome instead of just reacting to it.
By working backwards from the goal, we plan our decision tree in more depth. The most common form of working backward from our goal to map out the future is known as backcasting. In backcasting, we imagine we've already achieved a positive outcome. Then we think about how we got there. Backcasting makes it possible to identify when there are low-probability events that must occur to reach the goal.
A premortem is an investigation into something awful, but before it happens. Backcasting and premortems complement each other. Despite the popular wisdom that we achieve success through positive visualisation, it turns out that incorporating negative visualisation makes us more likely to achieve our goals. Imagining both positive and negative futures helps us build a more realistic vision of the future, allowing us to plan and prepare for a wider variety of challenges.
Conclusion
Overall, it was a good read, and I am applying the lessons from the book to become a better decision-maker.