💎 The more we think about an event, the more we think it’s likely to happen (often wrongly)

One of the earliest experiments examining the power of imagination to sway intuition was conducted during the U.S. presidential election campaign of 1976. One group was asked to imagine Gerald Ford winning the election and taking the oath of office, and then they were asked how likely it was that Ford would win the election. Another group was asked to do the same for Jimmy Carter. So who was more likely to win? Most people in the group that imagined Ford winning said Ford. Those who saw Jimmy Carter taking the oath said Carter. Later experiments have obtained similar results. What are your odds of being arrested? How likely is it you’ll win the lottery? People who imagine the event consistently feel the odds of the event actually happening are higher than those who don’t.

Excerpt from: Risk: The Science and Politics of Fear by Dan Gardner

💎 The anchoring effect is influential (even when the anchors are ridiculous)

They asked people two versions of the Gandhi questions. One version is what I’ve repeated here. The other began by asking people whether Gandhi was older or younger than 140 when he died, which was followed by the same direction to guess Gandhi’s age when he died. Strack and Mussweiler found that when the first question mentioned the number nine, the average guess on the following question was 50. In the second version, the average guess was 67. So those who heard the lower number before guessing guessed lower. Those who heard the higher number guessed higher.

Excerpt from: Risk: The Science and Politics of Fear by Dan Gardner

💎 When weighing up the merits of a product or dangers of a technology we often rely on how it makes us feel (rather than laboriously compute the facts)

In a second experiment, Slovic and Alhakami had students at the University of Oregon rate the risks and benefits of a technology (different trials used nuclear power, natural gas, and food preservatives). Then they were asked to read a few paragraphs describing some of the benefits of the technology. Finally, they were asked again to rate the risks and benefits of the technology. Not surprisingly, the positive information they read raised students’ ratings of the technology’s benefits in about one-half of the cases. But most of those who raised their estimate of the technology’s benefits also lowered their estimate of the risk – even though they had not read a word about the risk. Later trials in which only risks were discussed had the same effect but in reverse: people who raised their estimate of the technology’s risks in response to the information about risk also lowered their estimate of its benefit.

Excerpt from: Risk: The Science and Politics of Fear by Dan Gardner

💎 With uncertainty we prefer conformity (groupthink)

Crutchfield’s experiment involved slightly more ambiguous questions, including one in which people were asked if they agreed with the statement ‘I believe we are made better by the trials and hardships of life.’ Among subjects in a control group that was not exposed to the answers of others, everyone agreed. But among those in the experiment who thought that everyone else disagreed with the statement, 31 per cent said they did not agree. Asked whether they agreed with the statement ‘I doubt whether I would make a good leader,’ every person in the control group rejected it. But when the group was seen to agree with the statement, 37 per cent of people went along with the consensus and agreed that they doubted themselves.

Excerpt from: Risk: The Science and Politics of Fear by Dan Gardner

💎 How our perception of risk is skewed by “what makes a good story or hypothesis” (rather than a cold calculation of the odds)

Many other studies produced similar results. Kahneman and Tversky divided 245 undergrads at the University of British Columbia in half and asked one group to estimate the probability of ‘a massive flood somewhere in North America in 1983, in which more than 1,000 people drown’. The second group was asked about ‘an earthquake in California sometime in 1983, causing a flood in which more than 1,000 people drown’. Once again, the second scenario logically has to be less likely than the first, but people rated it one-third more likely than the first. Nothing says ‘California’ quite like ‘earthquake’.

Excerpt from: Risk: The Science and Politics of Fear by Dan Gardner
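
A brief aside on why the second scenario “logically has to be less likely”: the claim follows from the conjunction rule of probability, a standard textbook result sketched here for clarity (the notation is mine, not Gardner’s). A compound event can never be more probable than either of its components:

$$P(\text{earthquake} \cap \text{flood}) = P(\text{flood}) \times P(\text{earthquake} \mid \text{flood}) \leq P(\text{flood})$$

Since $P(\text{earthquake} \mid \text{flood}) \leq 1$, adding the vivid earthquake detail can only shrink the probability of the scenario, even as it makes the story feel more believable.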

💎 Group polarisation and the danger of surrounding yourself with people who share similar opinions (How correct am I?)

But they won’t. Decades of research have proved that groups usually come to conclusions that are more extreme than the average view of the individuals who make up the group. When opponents of a hazardous waste site gather to talk about it, they will become convinced the site is more dangerous than they originally believed. When a woman who believes breast implants are a threat gets together with women who feel the same way, she and all the women in the meeting are likely to leave believing they had previously underestimated the danger. The dynamic is always the same. It doesn’t matter what the subject under discussion is. It doesn’t matter what the particular views are. When like-minded people get together and talk, their existing views tend to become more extreme.

In part, this strange human foible stems from our tendency to judge ourselves by comparison with others. When we get together in a group of like-minded people, what we share is an opinion that we all believe to be correct and so we compare ourselves with others in the group by asking ‘How correct am I?’ Inevitably, most people in the group will discover that they do not hold the most extreme opinion, which suggests they are less correct, less virtuous, than others. And so they become more extreme. Psychologists confirmed this theory when they put people in groups and had them state their views without providing reasons why – and polarization still followed.

Excerpt from: Risk: The Science and Politics of Fear by Dan Gardner

💎 Why psychologists believe that focus groups are far less insightful than some marketers think (Head cannot look into Gut)

‘The heart has its reasons,’ Blaise Pascal wrote more than three centuries ago, ‘which reason knows nothing of’. So it is with the conscious and unconscious minds. Head cannot look into Gut and so it has no idea how Gut assembles its judgments, which is why psychologists believe that focus groups are far less insightful than some marketers think. If you put people together in a room, show them a car commercial, and ask them how they feel about the car, you will get clear answers. ‘I don’t care for it,’ a man may say. Fine. Why not? He frowns. ‘Um, the styling on the front is ugly. And I want a more powerful engine.’ That looks like good insight, just the sort of thing a company can use to design and market its products. But it’s not. This man’s snap judgment – ‘I don’t like that car’ – came from Gut. But the interviewer is talking to Head. And Head doesn’t have a clue why Gut doesn’t like the car. So Head rationalizes. It looks at the conclusion and cobbles together an explanation that is both plausible and, quite possibly, wrong.

Excerpt from: Risk: The Science and Politics of Fear by Dan Gardner

💎 We often selectively interpret evidence to fit our prior beliefs (and use it to cement those beliefs further)

In 1979 – when capital punishment was a top issue in the United States – American researchers brought together equal numbers of supporters and opponents of the death penalty. The strength of their views was tested. Then they were asked to read a carefully balanced essay that presented evidence that capital punishment deters crime and evidence that it does not. The researchers then retested people’s opinions and discovered that those opinions had only gotten stronger. Participants had absorbed the evidence that confirmed their views, ignored the rest, and left the experiment even more convinced that they were right and those who disagreed were wrong.

Excerpt from: Risk: The Science and Politics of Fear by Dan Gardner