The Art of Thinking Clearly

Rolf Dobelli, 2013 (finished 2020)

1. Survivorship Bias

We often look only at those with positive outcomes—"survivors"—for guidance, because failures rarely get far enough to share their perspectives (and their individual voices are diluted). As a result, our perception of risk and of success rates is skewed!
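A minimal sketch of the effect, using made-up fund returns (all numbers are illustrative assumptions, not from the book): if losing funds shut down and disappear from the record, averaging only the survivors overstates how well funds do.

```python
import random
import statistics

random.seed(42)

# True average annual return of ALL funds is 0% by construction.
all_returns = [random.gauss(0, 10) for _ in range(10_000)]

# Funds that lost more than 5% shut down and vanish from the record;
# a later observer sees only the survivors.
survivors = [r for r in all_returns if r > -5]

print(f"true mean return: {statistics.mean(all_returns):.2f}%")
print(f"survivors' mean:  {statistics.mean(survivors):.2f}%")
# The survivors' mean is pushed upward purely by the filtering.
```

The filtering alone shifts the observed mean up by several points, even though nothing about the underlying funds changed.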

2. Swimmer's Body Illusion

Correlation ≠ causation. Good swimmers have very good bodies—which cause their ability, not vice versa. (Harvard: smart people, or good education? etc.) Very prevalent in advertising (beauty products being sold by beautiful people).

3. Clustering Illusion

We like to find patterns in data even when none exist; this often amounts to overfitting, leading people to impart meaning to noise and to overestimate how well they understand things. Randomness exists, and it is common!
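A quick way to see how "pattern-like" pure randomness is: simulate fair coin flips (the 100-flip and 5-streak numbers below are arbitrary choices, not from the book) and count how often a long streak of identical results appears by chance alone.

```python
import random

def max_run(flips):
    """Length of the longest run of identical outcomes."""
    best = cur = 1
    for prev, nxt in zip(flips, flips[1:]):
        cur = cur + 1 if nxt == prev else 1
        best = max(best, cur)
    return best

random.seed(0)
# How often do 100 fair coin flips contain a streak of 5+ identical
# results? No "hot hand" is needed — randomness clusters on its own.
trials = 10_000
hits = sum(
    max_run([random.randint(0, 1) for _ in range(100)]) >= 5
    for _ in range(trials)
)
print(f"P(streak of 5+ in 100 flips) ≈ {hits / trials:.2f}")
```

The streak shows up in nearly every sequence, yet anyone watching flip-by-flip would be tempted to read meaning into it.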

4. Social Proof

We tend to follow other people's example en masse, in both action and opinion. Implicit peer pressure: we assume that what is common is right—and conform.

5. Sunk Cost Fallacy

"Bygones are bygones!" We shouldn't weigh sunk, irrecoverable past costs when making value-based decisions about the future—only what happens from here counts. We're bad at this...

6. Reciprocity

If someone does something good for us, it feels cold-hearted not to do something back. Evolved for food sharing, now very common for sales (give small gift → ask/pitch) and social connections / meetings / etc.

7–8. Confirmation Bias

The classic blunder: only paying attention to data that confirms your pre-existing beliefs, and dismissing nonconforming data as "an exception" or incorrect. Occurs everywhere, especially for our core beliefs.

Beliefs shored up by confirmation bias often wear strong disguises. One way to find and analyze them is to search for opinions supported largely by nonspecific assertions. Or, actively seek out contrary data! Writing your beliefs down can help here.

9. Authority Bias

If people have authority—power or expertise—we tend to take their words at face value. We also generally seek to please them; this can lead to automatic dissent suppression, which is bad.

10. Contrast Effect

We're horrible at judging absolute value—rather, we compare things (prices) to their context. Sales/discounts exploit this, as do add-ons to expensive purchases (options seem cheap next to a car's price). Remember: $1 = $1 in every context!

11. Availability Bias

We judge by whatever examples come to mind most easily, and prefer a wrong solution to no solution at all—so we force the world we face into the mold of our past experiences. Be willing to say "I don't know"!

12. Worse-Before-Better Fallacy

It's a "win-win" to predict something getting worse before it gets better when giving a "solution." Works? It's better sooner → recipient is happy. Doesn't? Built-in delay before facing consequences.

13. Story Bias

More correlation ≠ causation. Humans like a narrative, so we tend to retroactively fit meaning to events, giving us a false sense of understanding. The scientific method (predicting in advance) can slice right through this!

14. Hindsight Bias

In retrospect everything seems obvious ("I knew it all along"), so we believe we're better at understanding and predicting the future than we actually are—leading to inflated self-assurance and inflated risk-taking.

15. Overconfidence Effect

There is a massive gulf between what people actually know and what they think they know. Be skeptical of predictions; start from a pessimistic view. Men are more susceptible to this.

16. Chauffeur Knowledge

We have a tendency to trust people who present information (news anchors, etc.), even if they don't actually know what they're talking about. How to tell a mouthpiece from an expert? Experts say "I don't know."

17. Illusion of Control

We tend to think we can influence events even when we can't or don't (fake dials/switches, buttons, etc). Combination of placebo effect + confirmation bias.

18. Incentive Super-Response Tendency

People respond exactly to the incentives they're given, ignoring any intention behind them. If intent and reward are mismatched, the behavior that is rewarded will prevail.

19. Regression to Mean

Extremes do not tend to repeat; hence, bad things get better and good things get worse, on average. Don't misattribute! (For instance, punishment mistakenly seems more effective than reward.)
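A minimal simulation of the mechanism (the population size, score model, and "top 10%" cutoff are my assumptions for illustration): if each test score is fixed skill plus luck, the top performers on one test were partly lucky, so as a group they score worse on the next test—without anyone's skill changing.

```python
import random
import statistics

random.seed(1)

# Each person has a fixed skill; each test score = skill + luck.
people = [random.gauss(0, 1) for _ in range(10_000)]
test1 = [s + random.gauss(0, 1) for s in people]
test2 = [s + random.gauss(0, 1) for s in people]

# Take the top 10% on test 1 and see how they do on test 2.
ranked = sorted(range(len(people)), key=lambda i: test1[i], reverse=True)
top = ranked[: len(people) // 10]

m1 = statistics.mean(test1[i] for i in top)
m2 = statistics.mean(test2[i] for i in top)
print(f"top decile, test 1: {m1:.2f}")
print(f"top decile, test 2: {m2:.2f}")  # lower: the luck didn't repeat
```

This is why scolding after a bad performance "seems to work" (the next attempt regresses upward anyway) and praise after a great one "seems to backfire."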

20. Outcome Bias

Result quality ≠ decision quality! People over-attribute results to decisions, especially with randomness involved.
