💎 Even short breaks can disrupt habits

Recent research suggests that anything more than a short lapse in a behavior we hope to make habitual (say, multiple missed visits to the gym rather than just one) can be costly. Seinfeld’s mantra “Don’t break the streak” is astute. It also helps explain the logic behind twenty-eight-pill packages of birth control. Scientifically speaking, the pills are necessary only on the first twenty-one days of a twenty-eight-day menstrual cycle. However, most birth control packages include seven sugar pills along with twenty-one hormone pills to ensure that people on birth control won’t fall out of the habit.

Excerpt from: How to Change: The Science of Getting from Where You Are to Where You Want to Be by Katy Milkman

💎 On the importance of updating our beliefs when presented with new information

My colleague Phil Tetlock finds that forecasting skill is less a matter of what we know than of how we think. When he and his collaborators studied a host of factors that predict excellence in forecasting, grit and ambition didn’t rise to the top. Neither did intelligence, which came in second. There was another factor that had roughly triple the predictive power of brainpower.

The single most important driver of forecasters’ success was how often they updated their beliefs. The best forecasters went through more rethinking cycles. They had the confident humility to doubt their judgments and the curiosity to discover new information that led them to revise their predictions.

Excerpt from: Think Again: The Power of Knowing What You Don’t Know by Adam Grant

💎 On the value of changing your mind

With all due respect to the lessons of experience, I prefer the rigor of evidence. When a trio of psychologists conducted a comprehensive review of thirty-three studies, they found that in every one, the majority of answer revisions were from wrong to right. This phenomenon is known as the first-instinct fallacy.

In one demonstration, psychologists counted eraser marks on the exams of more than 1,500 students in Illinois. Only a quarter of the changes were from right to wrong, while half were from wrong to right. I’ve seen it in my own classroom year after year: my students’ final exams have surprisingly few eraser marks, but those who do rethink their first answers rather than staying anchored to them end up improving their scores.

Excerpt from: Think Again: The Power of Knowing What You Don’t Know by Adam Grant

💎 On using our understanding of the natural world to our advantage

In another life-and-death situation, in 1989 Bengal tigers killed about 60 villagers from India’s Ganges delta. No weapons seemed to work against them, including lacing dummies with live wires to shock the tigers away from human populations.

Then a student at the Science Club of Calcutta noticed that tigers only attacked when they thought they were unseen, and recalled that the patterns decorating some species of butterflies, beetles, and caterpillars look like big eyes, ostensibly to trick predators into thinking their prey was also watching them. The result: a human face mask, worn on the back of the head. Remarkably, no one wearing a mask was attacked by a tiger for the next three years; anyone killed by tigers during that time had either refused to wear the mask, or had taken it off while working.

Excerpt from: The Great Mental Models Volume 1: General Thinking Concepts by Shane Parrish and Rhiannon Beaubien

💎 On the danger of only measuring the first-order effects of an intervention

In 1963, the UC Santa Barbara ecologist and economist Garrett Hardin proposed his First Law of Ecology: “You can never merely do one thing.” We operate in a world of multiple, overlapping connections, like a web, with many significant, yet obscure and unpredictable, relationships. He developed second-order thinking into a tool, showing that if you don’t consider “the effects of the effects,” you can’t really claim to be doing any thinking at all.

When it comes to the overuse of antibiotics in meat, the first-order consequence is that the animals gain more weight per pound of food consumed, and thus there is profit for the farmer. Animals are sold by weight, so the less food you have to use to bulk them up, the more money you will make when you go to sell them.

The second-order effects, however, have many serious, negative consequences. The bacteria that survive this continued antibiotic exposure are antibiotic resistant. That means that the agricultural industry, when using these antibiotics as bulking agents, is allowing mass numbers of drug-resistant …

Excerpt from: The Great Mental Models Volume 1: General Thinking Concepts by Shane Parrish and Rhiannon Beaubien

💎 On the illusion of explanatory depth

… isolation is powerful but misleading. For a start, while humans have accumulated a vast store of collective knowledge, each of us alone knows surprisingly little, certainly less than we imagine. In 2002, the psychologists Frank Keil and Leonid Rozenblit asked people to rate their own understanding of how zips work. The respondents answered confidently – after all, they used zips all the time. But when asked to explain how a zip works, they failed dismally. Similar results were found when people were asked to describe climate change and the economy. We know a lot less than we think we do about the world around us. Cognitive scientists call this ‘the illusion of explanatory depth’, or just ‘the knowledge illusion’.

Excerpt from: Conflicted: Why Arguments Are Tearing Us Apart and How They Can Bring Us Together by Ian Leslie

💎 On the advantage of being familiar with a number of accurate models of human behaviour, rather than just knowing a series of unrelated facts

In a famous speech in the 1990s, Charlie Munger summed up this approach to practical wisdom: “Well, the first rule is that you can’t really know anything if you just remember isolated facts and try and bang ’em back. If the facts don’t hang together on a latticework of theory, you don’t have them in a usable form. You’ve got to have models in your head. And you’ve got to array your experience both vicarious and direct on this latticework of models. You may have noticed students who just try to remember and pound back what is remembered. Well, they fail in school and in life. You’ve got to hang experience on a latticework of models in your head.”

Excerpt from: The Great Mental Models Volume 1: General Thinking Concepts by Shane Parrish and Rhiannon Beaubien

💎 Bayesian thinking and the importance of applying a base rate when interpreting new data

The core of Bayesian thinking (or Bayesian updating, as it can be called) is this: given that we have limited but useful information about the world, and are constantly encountering new information, we should probably take into account what we already know when we learn something new. As much of it as possible. Bayesian thinking allows us to use all relevant prior information in making decisions. Statisticians might call it a base rate, taking in outside information about past situations like the one you’re in.

Consider the headline “Violent Stabbings on the Rise.” Without Bayesian thinking, you might become genuinely afraid because your chances of being a victim of assault or murder are higher than they were a few months ago. But a Bayesian approach will have you putting this information into the context of what you already know about violent crime. You know that violent crime has been declining to its lowest rates in decades. Your city is safer now than it has been since this measurement started. Let’s say your chance of being a victim of a stabbing last year was one in 10,000, or 0.01%. The article states, with accuracy, that violent crime has doubled. It is now two in 10,000, or 0.02%. Is that worth being terribly worried about? The prior information here is key. When we factor it in, we realize that our safety has not really been compromised.

Excerpt from: The Great Mental Models Volume 1: General Thinking Concepts by Shane Parrish and Rhiannon Beaubien
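
The base-rate arithmetic in this excerpt is simple enough to check for yourself. A minimal sketch in Python (the numbers come from the excerpt; the function name is my own):

```python
def updated_risk(base_rate: float, relative_increase: float) -> float:
    """Scale a base rate by a reported relative change.

    A headline saying violent crime 'doubled' is relative_increase = 2.0.
    """
    return base_rate * relative_increase

# Numbers from the excerpt: a 1-in-10,000 chance of being stabbed last year.
base_rate = 1 / 10_000                      # 0.01%
new_rate = updated_risk(base_rate, 2.0)     # crime "doubled"

print(f"old risk: {base_rate:.4%}")         # old risk: 0.0100%
print(f"new risk: {new_rate:.4%}")          # new risk: 0.0200%
# The headline is accurate, yet the absolute risk remains tiny:
# the prior (base rate) is what keeps a "100% increase" in context.
```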

💎 On how we can be trapped by our own perspective

The first flaw is perspective. We have a hard time seeing any system that we are in. Galileo had a great analogy to describe the limits of our default perspective. Imagine you are on a ship that has reached constant velocity (meaning without a change in speed or direction). You are below decks and there are no portholes. You drop a ball from your raised hand to the floor. To you, it looks as if the ball is dropping straight down, thereby confirming gravity is at work.

Now imagine you are a fish (with special x-ray vision) and you are watching this ship go past. You see the scientist inside, dropping a ball. You register the vertical change in the position of the ball. But you are also able to see a horizontal change. As the ball was pulled down by gravity it also shifted its position east by about 20 feet. The ship moved through the water and therefore so did the ball. The scientist on board, with no external point of reference, was not able to perceive this horizontal shift.

This analogy shows us the limits of our perception. We must be open to other perspectives if we truly want to understand the results of our actions. Despite feeling that we’ve got all the information, if we’re on the ship, the fish in the ocean has more he can share.

Excerpt from: The Great Mental Models Volume 1: General Thinking Concepts by Shane Parrish and Rhiannon Beaubien

💎 Beware claimed data (people don’t like to admit they ‘don’t know’ when questioned)

However, serious academic consideration of public opinion about fictitious issues did not start until the ’80s, when George Bishop and colleagues at the University of Cincinnati found that a third of Americans either favoured or opposed the fictitious Public Affairs Act. Bishop found that this figure dropped substantially when respondents were offered an explicit ‘don’t know’ option. However, 10 per cent of respondents still selected a substantive answer, even when given a clear opportunity to express their lack of familiarity. Similar findings were reported in the US at around the same time by Howard Schuman and Stanley Presser, who also found that a third of respondents to their survey expressed positions on issues which, though real, were so obscure that few ordinary citizens would ever have heard of them.

Excerpt from: Sex, Lies and Politics: The Secret Influences That Drive our Political Choices by Philip Cowley and Robert Ford

💎 Beware interpreting stats on anything you have a strongly held view about (from politics to Covid and beyond)

It’s much more challenging when emotional reactions are involved, as we’ve seen with smokers and cancer statistics. Psychologist Ziva Kunda found the same effect in the lab when she showed experimental subjects an article laying out the evidence that coffee or other sources of caffeine could increase the risk to women of developing breast cysts. Most people found the article pretty convincing. Women who drank a lot of coffee did not.

We often find ways to dismiss evidence that we don’t like. And the opposite is true, too: when evidence seems to support our preconceptions, we are less likely to look too closely for flaws.

The more extreme the emotional reaction, the harder it is to think straight.

Excerpt from: How to Make the World Add Up: Ten Rules for Thinking Differently About Numbers by Tim Harford

💎 Beware the Rosser Reeves effect when interpreting tracking data (communication effectiveness)

Research routinely shows that people who’re aware of communication from brand X are more likely to buy that brand. Sometimes used as evidence that communication drives sales, in fact causality usually runs the other way: buying brand X makes you more likely to notice its communications. This phenomenon (the so-called ‘Rosser Reeves effect’ – named after the famous 1950s adman) has been known for decades, yet is still routinely used to ‘prove’ communication effectiveness (most recently to justify social media use).

Excerpt from: How not to Plan: 66 ways to screw it up by Les Binet and Sarah Carter

💎 On the tendency of marketers to exaggerate the amount consumers change (social trends)

Marketing and advertising people can talk a load of nonsense at the best of times. But if you want to hear them at their worst, ask them to talk about social trends. The average social trends presentation is a guaranteed mix of the obvious, irrelevant and false.

Recently, we were listening to a conference speech about ‘changing lifestyles’. Life nowadays is faster than ever, said the speaker. We work longer hours. We have less free time. Families are fragmenting. Food is eaten on the run.

We’ve been listening to this bullshit for 30 years. And it’s no more true now than it was then. The inconvenient, less headline-worthy truth is that people have more free time than ever. Economic cycles wax and wane, but the long-term trend in all developed economies is toward shorter, more flexible working hours. And longer holidays. People start work later in life and spend much longer in retirement. Work takes up a smaller percentage of our life than it used to.

Related myths about pressures on family time are equally false. Contrary to popular belief, in developed economies parents spend more time with their children these days. Not less. Research shows the amount of time families spend eating together has stayed remarkably constant over the years, as has the amount of time they spend together watching TV.

Excerpt from: How not to Plan: 66 ways to screw it up by Les Binet and Sarah Carter

💎 On the benefits of brevity (sell your idea or your dream in 10 to 15 minutes)

Let’s put this in perspective. Abraham Lincoln inspired generations in a speech that lasted two minutes. John F. Kennedy took 15 minutes to shoot for the moon. Martin Luther King Jr. articulated his dream of racial unity in 17 minutes. Steve Jobs gave one of the most famous college commencement speeches of our time at Stanford University in 15 minutes. If you can’t sell your idea or your dream in 10 to 15 minutes, keep editing until you can.

Ideas don’t sell themselves. Be selective about the words you use. If they don’t advance the story, remove them. Condense, simplify, and speak as briefly as possible. Have the courage to speak in grade-school language. Far from weakening your argument, these tips will elevate your ideas, making it more likely you’ll be heard.

Excerpt from: Five Stars: The Communication Secrets to Get From Good to Great by Carmine Gallo

💎 All speeches have three versions (before, during, ideal)

“There are always three speeches for every one you actually gave: the one you practiced, the one you gave, and the one you wish you gave.”

– Dale Carnegie

Excerpt from: 100 Things Every Designer Needs to Know About People (Voices That Matter) by Susan Weinschenk

💎 On our minds working on problems even when we’re not consciously thinking about them (John Cleese)

Graham and I thought it was rather a good sketch. It was therefore terribly embarrassing when I found I’d lost it. I knew Graham was going to be cross, so when I’d given up looking for it, I sat down and rewrote the whole thing from memory. It actually turned out to be easier than I’d expected.

Then I found the original sketch and, out of curiosity, checked to see how well I’d recalled it when rewriting. Weirdly, I discovered that the remembered version was actually an improvement on the one that Graham and I had written. This puzzled the hell out of me.

Again I was forced to the conclusion that my mind must have continued to think about the sketch after Graham and I had finished it. And that my mind had been improving what we’d written, without my making any conscious attempt to do so. So when I remembered it, it was already better.

Chewing this over, I realised it was like the tip-of-the-tongue phenomenon: when you can’t remember a name, and you chase after it in your mind …

Excerpt from: Creativity: A Short and Cheerful Guide by John Cleese

💎 Kleiner Perkins’s tactic for avoiding their staff developing entrenched positions in meetings (flip-flop)

Another renowned venture capitalist, Kleiner Perkins’s Randy Komisar, takes this idea one step further. He dissuades members of the investment committee from expressing firm opinions by stating right away that they are for or against an investment idea. Instead, Komisar asks participants for a “balance sheet” of points for and against the investment: “Tell me what is good about this opportunity; tell me what is bad about it. Do not tell me your judgment yet. I don’t want to know.” Conventional wisdom dictates that everyone should have an opinion and make it clear. Instead, Komisar asks his colleagues to flip-flop!

Excerpt from: You’re About to Make a Terrible Mistake!: How Biases Distort Decision-Making and What You Can Do to Fight Them by Olivier Sibony

💎 Analysing successful brands can be misleading (survivorship bias)

The models whose success we admire are, by definition, those who have succeeded. But out of all the people who were “crazy enough to think they can change the world,” the vast majority did not manage to do it. For this very reason, we’ve never heard of them. We forget this when we focus only on the winners. We look only at the survivors, not at all those who took the same risks, adopted the same behaviors, and failed. This logical error is survivorship bias. We shouldn’t draw any conclusions from a sample that is composed only of survivors. Yet we do, because they are the only ones we see.

Our quest for models may inspire us, but it can also lead us astray. We would benefit from restraining our aspirations and learning from people who are similar to us, from decision makers whose success is less flashy, instead of a few idols.

Excerpt from: You’re About to Make a Terrible Mistake!: How Biases Distort Decision-Making and What You Can Do to Fight Them by Olivier Sibony

💎 On why partial knowledge is often victorious over full knowledge (it conceives things as simpler than they are)

Such misleading stories, however, may still be influential and durable. In Human, All Too Human, philosopher Friedrich Nietzsche argues that “partial knowledge is more often victorious than full knowledge: it conceives things as simpler than they are and therefore makes its opinion easier to grasp and more persuasive.”

Excerpt from: The Myth of Experience: Why We Learn the Wrong Lessons, and Ways to Correct Them by Emre Soyer and Robin M Hogarth

💎 On the danger of a theory-free analysis of mere correlations (winter detector)

The ‘winter detector’ problem is common in big data analysis. A literal example, via computer scientist Sameer Singh, is the pattern-recognising algorithm that was shown many photos of wolves in the wild, and many photos of pet husky dogs. The algorithm seemed to be really good at distinguishing the two rather similar canines; it turned out that it was simply labelling any picture with snow as containing a wolf. An example with more serious implications was described by Janelle Shane in her book You Look Like a Thing and I Love You: an algorithm that was shown pictures of healthy skin and of skin cancer. The algorithm figured out the pattern: if there was a ruler in the photograph, it was cancer. If we don’t know why the algorithm is doing what it’s doing, we’re trusting our lives to a ruler detector.

Excerpt from: How to Make the World Add Up: Ten Rules for Thinking Differently About Numbers by Tim Harford
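
The failure mode Harford describes is easy to reproduce on synthetic data. A minimal sketch (the dataset and feature names are invented for illustration; this is not the code from the studies mentioned):

```python
# A toy 'winter detector': the labels are perfectly predicted by a spurious
# feature (snow in the photo), so a classifier can score highly without
# learning anything about the animal itself.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
n = 1000
is_wolf = rng.integers(0, 2, n)             # label: 1 = wolf, 0 = husky

ear_shape = is_wolf + rng.normal(0, 2.0, n)  # genuine but very noisy signal
snout_len = is_wolf + rng.normal(0, 2.0, n)  # genuine but very noisy signal
has_snow = is_wolf.astype(float)             # spurious: wolves shot in snow

X = np.column_stack([ear_shape, snout_len, has_snow])
clf = DecisionTreeClassifier(max_depth=2).fit(X, is_wolf)

print(dict(zip(["ear_shape", "snout_len", "has_snow"],
               clf.feature_importances_.round(2))))
# Typically ~{'ear_shape': 0.0, 'snout_len': 0.0, 'has_snow': 1.0}:
# the model is a snow detector, not a wolf detector. Checking *why* a model
# decides, not just how often it is right, is the point of the excerpt.
```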

💎 On the lack of data proving the effectiveness of ad campaigns (designed to boost loyalty)

The advertising industry – whose only important asset is ideas – has learned nothing from this. We keep heading in the wrong direction. We keep bulking up everything in our arsenal except our creative resources. Then we take the people who are supposed to be our idea people and give them till 3 o’clock to do a banner.

Sure, we need people who are tech-savvy and analytical. But more than anything, we need some brains-in-a-bottle who have no responsibility other than to sit in a corner and feed us crazy ideas. We keep looking to “transform” our industry but ignore the one transformation that would kill.

Excerpt from: How not to Plan: 66 ways to screw it up by Les Binet and Sarah Carter

💎 Five-pronged model for encouraging behaviour change (REDUCE)

REACTANCE

When pushed, people push back. So rather than telling people what to do, or trying to persuade, catalysts allow for agency and encourage people to convince themselves.

ENDOWMENT

People are attached to the status quo. To ease endowment, catalysts surface the costs of inaction and help people realize that doing nothing isn’t as costless as it seems.

DISTANCE

Too far from their backyard, people tend to disregard. Perspectives that are too far away fall in the region of rejection and get discounted, so catalysts shrink distance, asking for less and switching the field.

UNCERTAINTY

Seeds of doubt slow the winds of change. To get people to un-pause, catalysts alleviate uncertainty. Easier to try means more likely to buy.

CORROBORATING EVIDENCE

Some things need more proof. Catalysts find corroborating evidence, using multiple sources to help overcome the translation problem.

Excerpt from: Catalyst by Jonah Berger

💎 Six psychological biases that help explain why we fail to prepare for disasters

1. Myopia: a tendency to focus on overly short future time horizons when appraising immediate costs and the potential benefits of protective investments;
2. Amnesia: a tendency to forget too quickly the lessons of past disasters;
3. Optimism: a tendency to underestimate the likelihood that losses will occur from future hazards;
4. Inertia: a tendency to maintain the status quo or adopt a default option when there is uncertainty about the potential benefits of investing in alternative protective measures;
5. Simplification: a tendency to selectively attend to a subset of the relevant factors to consider when making choices involving risk; and
6. Herding: a tendency to base choices on the observed actions of others.

Excerpt from: The Ostrich Paradox: Why We Underprepare for Disasters by Robert Meyer and Howard Kunreuther

💎 The ten steps for a successful behavioural science intervention

1. Establish the scope.
2. Break the challenge into addressable parts.
3. Identify the target outcome.
4. Map the relevant behaviors.
5. Identify the factors that affect each behavior.
6. Choose the priority behaviors to address.
7. Create evidence-led intervention(s).
8. Implement the intervention(s).
9. Assess the effects.
10. Take further action based on the results.

Excerpt from: Behavioural Insights by Michael Hallsworth

💎 Three ideas from psychology that explain why brainstorms tend to be ineffective (from social loafing to production blocking)

Research shows there are many psychological processes at work which together limit the effectiveness of brainstorming. ‘Social loafing’ – a group situation encourages and allows individuals to slack off. ‘Evaluation apprehension’ – we’re nervous of being judged by colleagues or looking stupid. ‘Production blocking’ – because only one person can speak at a time in a group, others can forget or reject their ideas while they wait. We’re also learning more about the power of our ‘herd’ tendencies. As humans, we have innate desires to conform to others with only the slightest encouragement. When asked to think creatively, these implicit norms are invisible but powerful shackles on our ability to think differently.

No wonder so few ideas emerge.

Excerpt from: How not to Plan: 66 ways to screw it up by Les Binet and Sarah Carter

💎 On the danger of statistical methods being used to control the world (rather than understand it)

Social scientists have long understood that statistical metrics are at their most pernicious when they are being used to control the world, rather than try to understand it. Economists tend to cite their colleague Charles Goodhart, who wrote in 1975: ‘Any observed statistical regularity will tend to collapse once pressure is placed upon it for control purposes.’ (Or, more pithily: ‘When a measure becomes a target, it ceases to be a good measure.’) Psychologists turn to Donald T. Campbell, who around the same time explained: ‘The more any quantitative social indicator is used for social decision-making, the more subject it will be to corruption pressures and the more apt it will be to distort and corrupt the social processes it is intended to monitor.’

Goodhart and Campbell were on to the same basic problem: a statistical metric may be a pretty decent proxy for something that really matters, but it is almost always a proxy rather than the real thing.

Excerpt from: How to Make the World Add Up: Ten Rules for Thinking Differently About Numbers by Tim Harford

💎 There is no such thing as a wholly original idea (but there is such a thing as unique combinations)

It is to be found in the exceptional human capacity to synthesize our experiences, influences, knowledge and feelings into one, unified, original entity. To have such an inbuilt facility that enables us to make seemingly random connections across a broad … It has to be the single most important creative faculty we have, as Einstein observed when he said, ‘Combinatory play seems to be the essential feature in productive thought.’

The process our conscious and unconscious selves go through when editing, connecting and combining all that we know and feel into an original coherent thought happens over a period of time. It cannot be forced. It happens when we are awake and when we are asleep. It happens when we are thinking about something else entirely, or playing a game of tennis. It happens because a stimulus in our immediate surroundings – usually without our knowing …

Excerpt from: Think Like an Artist: . . . and Lead a More Creative, Productive Life by Will Gompertz

💎 Too often our brain works like a lawyer (it will find arguments to defend our convictions whatever the cost)

The explanation for Kahan’s results? Ideology. Irrespective of the actual figures, Democrats who identified as liberal, normally in favour of gun control, tended to find that stricter laws brought crime down. For the conservative Republican participants, the reverse was the case. They found that stricter gun control legislation did not work.

These answers are no longer to do with the truth, Kahan argued. They are about protecting your identity or belonging to your tribe! And the people who were good at maths, Kahan also found, were all the better at this. Often completely subconsciously, by the way. It was their psyche that played tricks on them.

Excerpt from: The Number Bias: How Numbers Lead and Mislead Us by Sanne Blauw

💎 Stacking can make the adoption of new habits easier (like flossing)

In evidence that stacking works, consider dental floss. Many of us clean our teeth regularly but fail to floss. To test whether stacking increases flossing, researchers gave fifty British participants, who flossed on average only 1.5 times per month, information encouraging them to do it more regularly.

Half of the participants were told to floss before they brushed at night, and half after they brushed. Note that only half of the participants were really stacking – using an existing automated response (brushing their teeth) as a cue for a new behavior (flossing). The other half, who first flossed and then brushed, had to remember, oh, yes, first I need to floss, before I brush. No automated cue.

Each day for four weeks, participants reported by text whether they flossed the night before. At the end of the month of reminders, they all flossed about twenty-four days on average. Most interesting is what they were all doing eight months later. Those who stacked, and flossed after they brushed, were still doing it about eleven days a month. For them, the new behavior was maintained by the existing habit. The group originally instructed to floss before they brushed ended up doing it only about once a week.

Excerpt from: Good Habits, Bad Habits: The Science of Making Positive Changes That Stick by Wendy Wood

💎 The paradox of progress (and the paradox of choice)

The paradox of progress, and the paradox of choice: There is a familiar story of a New York banker vacationing in Greece, who, from talking to a fisherman and scrutinizing the fisherman’s business, comes up with a scheme to help the fisherman make it a big business. The fisherman asked him what the benefits were; the banker answered that he could make a pile of money in New York and come back to vacation in Greece; something that seemed ludicrous to the fisherman, who was already there doing the kind of things bankers do when they go on vacation in Greece.

The story was well known in antiquity, under a more elegant form, as retold by Montaigne (my translation): When King Pyrrhus tried to cross into Italy, Cynéas, his wise adviser, tried to make him feel the vanity of such action. “To what end are you going into such enterprise?” he asked. And Pyrrhus answered, “To make myself the master of Italy.” Cynéas: “Then?” Pyrrhus: “To conquer Africa, then … come rest at ease.” Cynéas: “But you are already there; why take more risks?”

Excerpt from: Skin in the Game: Hidden Asymmetries in Daily Life by Nassim Nicholas Taleb