# Thinking in Bets
## The Book in 3 Sentences
Thinking in Bets explores how to make better decisions by treating them as probabilistic bets rather than binary right/wrong choices.
Duke draws from poker, psychology, and cognitive science to show how our natural tendency to believe without questioning and draw tight connections between results and decision quality leads to poor choices.
The book provides practical frameworks for more rational decision-making by embracing uncertainty and implementing systems to combat emotional and cognitive biases.
## Impressions
Duke brings a fascinating perspective from her poker background to decision science, making complex concepts accessible through engaging examples.
The book’s strength lies in its practical tools like the 10-10-10 framework and “decision swear jar”.
I think one of the most important aspects of this is that you get skin in the game, as Taleb likes to point out.
Thinking in bets is essentially switching from talk mode to action mode.
| Talking | Betting/Taking Action |
|---|---|
| Playing the status game - it is more important to be liked than to be right. Also related to the river vs. the village mentality from Nate Silver | Real stakes involved - you risk losing something real, beyond the social game |
| Binary thinking - things are simply “right” or “wrong” | Probabilistic thinking - you consider odds and multiple possible outcomes (30% chance of X, 70% chance of Y) |
| Easy to be overconfident - there is no cost to being certain about uncertain things | Forces honest assessment - when you have to bet money or reputation, you become more realistic about what you actually know |

The advantage of bets/actions is that you decouple yourself from the noise; the disadvantage is that a lot of extremely important context sometimes lives in that noise.
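The probabilistic-thinking contrast above can be made concrete with a little arithmetic: treating a decision as a bet means comparing the expected values of alternative futures instead of asking which choice is "right". A minimal sketch (the probabilities and payoffs are invented for illustration):

```python
# Treating a decision as a bet: compare expected values of the
# alternative futures instead of labeling one choice "right".
# Probabilities and payoffs below are made up for illustration.

def expected_value(outcomes):
    """outcomes: list of (probability, payoff) pairs; probabilities sum to 1."""
    assert abs(sum(p for p, _ in outcomes) - 1.0) < 1e-9
    return sum(p * payoff for p, payoff in outcomes)

# Option A: 30% chance of a big win, 70% chance of a small loss.
option_a = [(0.30, 100), (0.70, -20)]
# Option B: a sure, modest gain.
option_b = [(1.00, 10)]

ev_a = expected_value(option_a)  # 0.3*100 + 0.7*(-20) = 16.0
ev_b = expected_value(option_b)  # 10.0
best = "A" if ev_a > ev_b else "B"
print(f"EV(A)={ev_a}, EV(B)={ev_b} -> bet on {best}")
```

The point is not the specific numbers but the shift in framing: either option can still lose on any particular iteration, yet the bet with the higher expected value is the better decision.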
## My Top Quotes
- A hand of poker takes about two minutes. Over the course of that hand, I could be involved in up to twenty decisions. And each hand ends with a concrete result: I win money or I lose money.
- Drawing an overly tight relationship between results and decision quality affects our decisions every day, potentially with far-reaching, catastrophic consequences.
- Science writer, historian, and skeptic Michael Shermer, in The Believing Brain, explains why we have historically (and prehistorically) looked for connections even if they were doubtful or false. Incorrectly interpreting rustling from the wind as an oncoming lion is called a type I error, a false positive.
- I particularly like the descriptive labels “reflexive mind” and “deliberative mind” favored by psychologist Gary Marcus. In his 2008 book, Kluge: The Haphazard Evolution of the Human Mind, he wrote, “Our thinking can be divided into two streams, one that is fast, automatic, and largely unconscious, and another that is slow, deliberate, and judicious.” The first system, “the reflexive system, seems to do its thing rapidly and automatically, with or without our conscious awareness.” The second system, “the deliberative system … deliberates, it considers, it chews over the facts.”
- In The Ascent of Man, scientist Jacob Bronowski recounted how von Neumann described game theory during a London taxi ride. Bronowski was a chess enthusiast and asked him to clarify. “You mean, the theory of games like chess?” Bronowski quoted von Neumann’s response: “‘No, no,’ he said. ‘Chess is not a game. Chess is a well-defined form of computation. You may not be able to work out the answers, but in theory there must be a solution, a right procedure in any position. Now, real games,’ he said, ‘are not like that at all. Real life is not like that. Real life consists of bluffing, of little tactics of deception, of asking yourself what is the other man going to think I mean to do. And that is what games are about in my theory.’”
- Trouble follows when we treat life decisions as if they were chess decisions.
- Decisions are bets on the future, and they aren’t “right” or “wrong” based on whether they turn out well on any particular iteration.
- By treating decisions as bets, poker players explicitly recognize that they are deciding on alternative futures, each with benefits and risks. They also recognize there are no simple answers. Some things are unknown or unknowable.
- The definition of “bet” is much broader. Merriam-Webster’s Online Dictionary defines “bet” as “a choice made by thinking about what will probably happen,” “to risk losing (something) when you try to do or achieve something” and “to make decisions that are based on the belief that something will happen or is true.”
- As for the dog-to-human age ratio, it’s just a made-up number that’s been circulating with no basis, yet with increasing weight through repetition, since the thirteenth century.
- This is how we think we form abstract beliefs: we hear something; we think about it and vet it, determining whether it is true or false; only after that do we form our belief.
- It turns out, though, that we actually form abstract beliefs this way: we hear something; we believe it to be true; only sometimes, later, if we have the time or the inclination, we think about it and vet it, determining whether it is, in fact, true or false.
- “Findings from a multitude of research literatures converge on a single point: People are credulous creatures who find it very easy to believe and very difficult to doubt. In fact, believing is so easy, and perhaps so inevitable, that it may be more like involuntary comprehension than it is like rational assessment.”
- Gilbert and colleagues demonstrated through a series of experiments that our default is to believe that what we hear and read is true. Even when that information is clearly presented as being false, we are still likely to process it as true.
- We use all parts of our brain. The 10% figure was made up to sell self-improvement books; neural imaging and brain-injury studies disprove the fabrication.
- The same belief-formation process led hundreds of millions of people to bet the quality and length of their lives on their belief about the merits of a low-fat diet. Led by advice drawn, in part, from research secretly funded by the sugar industry, Americans in one generation cut a quarter of caloric intake from fat, replacing it with carbohydrates. The U.S. government revised the food pyramid to include six to eleven servings of carbohydrates and advised that the public consume fats sparingly. It encouraged the food industry (which enthusiastically followed) to substitute starch and sugar to produce “reduced-fat” foods. David Ludwig, a Harvard Medical School professor and doctor at Boston Children’s Hospital, summarized the cost of substituting carbs for fats in the Journal of the American Medical Association: “Contrary to prediction, total calorie intake increased substantially, the prevalence of obesity tripled, the incidence of type 2 diabetes increased many-fold, and the decades-long decrease in cardiovascular disease plateaued and may reverse, despite greater use of preventive drugs and surgical procedures.” Low-fat diets became the suited connectors of our eating habits.
- Stanford law professor and social psychologist Robert MacCoun studied accounts of auto accidents and found that in 75% of accounts, the victims blamed someone else for their injuries. In multiple-vehicle accidents, 91% of drivers blamed someone else. Most remarkably, MacCoun found that in single-vehicle accidents, 37% of drivers still found a way to pin the blame on someone else.
- Dissent channels and red teams are a beautiful implementation of Mill’s bedrock principle that we can’t know the truth of a matter without hearing the other side. This commitment to diversity of opinion is something that we would be wise to apply to our own decision groups. For example, if a corporate strategy group is figuring out how to integrate operations following a merger, someone who initially opposed the merger would be good to have as part of the group.
- “Even research communities of highly intelligent and well-meaning individuals can fall prey to confirmation bias, as IQ is positively correlated with the number of reasons people find to support their own side in an argument.”
- The term “devil’s advocate” developed centuries ago from the Catholic Church’s practice, during the canonization process, of hiring someone to present arguments against sainthood.
- Nietzsche said that remorse was “adding to the first act of stupidity a second.” Thoreau, on the other hand, praised the power of regret: “Make the most of your regrets; never smother your sorrow, but tend and cherish it till it comes to have a separate and integral interest. To regret deeply is to live afresh.”
- “Every 10-10-10 process starts with a question… . [W]hat are the consequences of each of my options in ten minutes? In ten months? In ten years?”
- Having a nuanced, precise vocabulary is what jargon is all about. It’s why carpenters have at least a dozen names for different kinds of nails, and in the field of neuro-oncology, there are more than 120 types of brain and central nervous system tumors.
- Tilt is the poker player’s worst enemy, and the word instantly communicates to other poker players that you were emotionally unhinged in your decision-making because of the way things turned out.
- The concept of tilt comes from traditional pinball machines. To keep players from damaging the machines by lifting them to alter the course of the ball, the manufacturers placed sensors inside that disabled the machine if it was violently jostled. The flippers stopped working, the lights went off, and the word “tilt” flashed at numerous places on the layout. The origin of tilt in pinball is apt because what’s going on in our brain in moments of tilt is like a shaken pinball machine. When the emotional center of the brain starts pinging, the limbic system (specifically the amygdala) shuts down the prefrontal cortex. We light up … then we shut down our cognitive control center.
- Recognize that when we are on tilt we aren’t decision fit. Aphorisms like “take ten deep breaths” and “why don’t you sleep on it?” capture this desire to avoid decisions while on tilt.
- Ulysses contracts can help us in several ways to be more rational investors. When we set up an automatic allocation from our pay into a retirement account, that’s a Ulysses contract. We could go through the trouble of changing the allocation, but setting it up initially gives our goal-setting, System 2–self a chance to precommit to what we know is best for our long-term future. And if we want to change the allocation, we have to take some specific steps to do so, creating a decision-interrupt.
- A “decision swear jar” is a simple kind of precommitment contract that we can apply to many of the key concepts of this book. For the decision swear jar, we identify the language and thinking patterns that signal we are veering from our goal of truthseeking. When we find ourselves using certain words or succumbing to the thinking patterns we are trying to avoid because we know they are signs of irrationality, a stop-and-think moment can be created. You can think about this as a way to implement accountability.