In this article, 16 heuristics you need to know:
- Availability heuristic
- Attribute substitution
- Anchoring and adjustment
- Affect heuristic
- Contagion heuristic
- Effort heuristic
- Familiarity heuristic
- Fluency heuristic
- Naive diversification
- Occam’s razor
- Peak-end rule
- Representativeness heuristic
- Scarcity heuristic
- Similarity heuristic
- Social proof
- Stereotyping
What are heuristics?
Heuristics are simply mental shortcuts or “rules of thumb” that we use to speed up our decision making and problem solving, especially when we have limited time or information to work with.
You’re probably familiar with a bunch of heuristics already; you just didn’t know they were called “heuristics”.
- An educated guess or a guesstimate
- Copying others when you’re unsure of what to do
- Doing what the authorities and experts recommend you do
- Going with your gut
- Using a rule of thumb e.g. “Measure twice, cut once”
- Using “common sense”
- Avoiding interviewing with a company with a bad reputation
- Breaking a large task into smaller steps
- Paying attention to ratings and reviews
- Process of elimination
- Trial and error
- Working backwards
Why do we use heuristics?
We often use heuristics when:
- We don’t have a lot of time to make a decision or to solve a problem
- We don’t have all the information we need; our information is often incomplete
- We’re tired and we don’t have enough energy to think it through
Bottom line: We use heuristics because they’re easy and practical, they save us time and energy, and even though they can lead to errors in our thinking, they’re right more often than not.
The simple fact is that we don’t always have time to analyse and assess every potential option, perform a cost-benefit analysis, rank all of the alternatives, compare all the pros and cons etc.
Although we might do this for the big choices e.g. a change of career or starting a new business, it’s too time consuming and exhausting to do this for every single decision.
Bounded rationality theory
Heuristics operate according to the theory of bounded rationality, which is the idea that our rationality is limited, because there are limits to our time, information, mental resources, thinking capacity etc.
History of Heuristics
The study of heuristics in decision-making was initially introduced by Nobel laureate Herbert A. Simon, and then developed in the 1970s and 1980s by psychologists Amos Tversky and Daniel Kahneman.
Why is it important to know about heuristics?
The more you know about the way you think, make decisions, solve problems, draw conclusions etc. the better you’re going to become at it.
List of Heuristics
Availability heuristic
The availability heuristic occurs when people make judgments about the importance of an issue, or the likelihood of an event, by the ease with which examples come to mind.
We are biased towards information that is easily recalled, so if an issue comes to mind quickly and easily, then we tend to assume it must be more important, or more likely, than issues that don’t come to mind as easily.
What determines what ideas, information, events etc. come to mind more readily?
Largely it’s the media and social media. If an event or news story is covered 24/7 nonstop ad nauseam, or if it contains a strong emotional charge, it’ll come to mind more easily, and you’ll naturally assume that it must be important.
However, the availability heuristic can mislead us: just because an idea or issue comes to mind easily, that doesn’t mean it’s more important, or more likely, than one that doesn’t.
Mass shootings, plane crashes and terrorist attacks all come to mind easily, and this gives us the false impression that they’re a lot more common than they really are. But more people die from car crashes than plane crashes, drownings than shark attacks, suicide than murder etc.
The availability heuristic can even mislead us on small things:
Question: Are there more words that begin with the letter K, or that have K as the third letter?
Answer: There are actually twice as many words that have K as the third letter, but because it’s easier to think of and recall words that start with K (kitchen, ketchup, kill etc.) and a lot harder to think of words that have K as the third letter (acknowledge, bike, coke, lake etc.) most people are more likely to guess that there are more words beginning with K.
Attribute substitution heuristic
The attribute substitution heuristic is that when presented with a difficult problem, instead of trying to solve it, most people unconsciously substitute an easier problem and solve that instead.
“A bat and a ball together cost $1.10. The bat costs $1 more than the ball. How much does the ball cost?”
What is your answer?
Most people, including me, and more than 50% of students at Harvard, MIT and Princeton, and over 80% of students in other universities, incorrectly answer “10 cents”. It seems intuitively right but it’s wrong.
If the ball costs 10 cents, and the bat costs $1.00 more than the ball, then the bat would cost $1.10, and the total would be $1.20.
The reason most people make this mistake is that they tend to unconsciously substitute the “more than” statement in the problem (the bat costs $1.00 more than the ball) with an absolute statement (the bat costs $1.00). This makes the math easier to work with. If a ball and bat together cost $1.10, and the bat costs $1.00, then the ball must cost 10 cents ($1.00 + $0.10 = $1.10).
However, the problem requires that the bat costs $1 more than the ball.
The correct answer is 5 cents:
Ball: $0.05
Bat: $1.05 (the bat costs $1 more than the ball)
Total: $1.10
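The correct answer falls straight out of simple algebra: if the ball costs x, the bat costs x + 1.00, and together they cost 1.10. A minimal Python check (not from the original article, just a sketch of the arithmetic):

```python
# Let x be the price of the ball. The bat costs x + 1.00, and together
# they cost 1.10, so: x + (x + 1.00) = 1.10  =>  2x = 0.10  =>  x = 0.05
ball = (1.10 - 1.00) / 2
bat = ball + 1.00

print(f"Ball: ${ball:.2f}")         # Ball: $0.05
print(f"Bat: ${bat:.2f}")           # Bat: $1.05
print(f"Total: ${ball + bat:.2f}")  # Total: $1.10
```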
The attribute substitution heuristic is why, when you ask someone a difficult question, instead of answering your question they sometimes answer a similar but easier question. Sometimes this is intellectually dishonest (politicians often do this), but most of the time it’s unconscious. The person doesn’t even know they’re doing it.
“What is the most popular holiday destination: Hawaii or Thailand?”
If you don’t know the answer, you might substitute this question with:
“Where would you rather go on holiday: Hawaii or Thailand?”
Note: We not only substitute difficult questions with simpler ones, we also substitute complex answers with simpler ones that are easier to understand.
Anchoring and adjustment heuristic
The anchoring and adjustment heuristic causes us to rely too heavily on the first piece of information offered (the “anchor”) when making decisions.
“Is the population of Venezuela more or less than 50 million?”
If you don’t know the answer, it’s highly likely that you’ll assume the figure of 50 million is somehow significant, that it’s some kind of guide to the truth (even if it isn’t), and you’ll guess around it (say 30 million to 60 million).
However if I increased the anchor and presented you with an even higher figure, that would also influence you.
“Is the population of Venezuela more or less than 90 million?”
Now you’re likely to think that the figure of 90 million is significant, that it’s some kind of guide to the truth, and guess around it (say 80 million to 100 million).
The problem is that I could present you with almost any number, even a completely arbitrary number, and that would also influence you.
Why is this important?
It’s important for several reasons:
If the first figure presented to you is grossly inaccurate, it’s still highly likely to influence your thinking and decision making, and you’re still likely to “anchor” around it and treat it as if it were some kind of reference point to the truth – even when it isn’t.
Advertisers, marketers, salespeople etc. know this and they all take advantage of the anchoring effect by marking up prices and then offering you “25-50% off”. Why do they do this? Because they know that a $50 item looks much more attractive to you if it’s “on sale” from the $100 price they’ve first anchored in your mind than the regular $50 price tag.
Examples of anchors:
- Advertised salaries
- Marked down prices in retail stores
- Minimum payments on credit card bills
- Prices on Amazon
- Sticker prices on cars
The anchoring effect is hard to resist, because it seems like the first figure we’re presented with really means something, that it’s an accurate representation of something – even when it isn’t.
How to overcome the anchoring effect:
- Don’t trust the first number presented to you (the “anchor”)
- Do your homework. Seek information from a wide variety of sources, and educate yourself to understand what something should cost, how long it should take etc.
- Argue against the anchor presented to you, think of all of the reasons it isn’t right
- When shopping: Don’t trust the first price you see. Shop around. Look online and do your homework to find out what something should cost.
- In a negotiation: Don’t trust the first figure you’re offered. Where possible, make the first offer yourself and set the anchor in your favor.
The bottom line: The more educated and knowledgeable you are on a subject, the less the anchoring effect is likely to influence you.
However, the less you know about a subject, the easier it will be for others to mislead you with irrelevant facts and figures.
Affect heuristic
The affect heuristic causes our emotions to influence our decisions, and our perceptions of risk/reward ratios. It is the equivalent of “going with your gut”. (“affect” is a psychological term for emotional response)
If you have good feelings about someone e.g. a politician, you’re more likely to take what they say as true, and to spend less time fact-checking.
Why going with your gut isn’t always reliable
The problem with emotional reasoning is that your feelings about a person or thing aren’t always accurate. Believe it or not: your gut feeling can be wrong.
Your feelings can be manipulated in a variety of ways:
- How information is framed and presented to you (e.g. something you could gain vs something you could lose)
- Who presents it to you (e.g. a celebrity or someone you like and trust vs someone you hate)
- Someone’s level of attractiveness
- The tonality and volume with which they speak
- Nonstop positive or negative media coverage (hatchet jobs vs puff pieces)
- Past positive or negative experiences with someone or something similar
- Ratings/Reviews/Social proof (all can be faked)
…and a zillion other things.
Contagion heuristic
The contagion heuristic causes us to avoid something that is thought to be bad or contaminated.
If one brand of eggs is recalled due to a salmonella outbreak, we might avoid all eggs “just in case”.
Effort heuristic
The effort heuristic causes us to perceive objects that took a longer time to produce to be of higher quality and value.
If something took forever to create e.g. Michelangelo’s David, the Sistine Chapel, the Taj Mahal, the Great Pyramids of Giza, Petra, La Sagrada Família (still under construction) we perceive it to be much more valuable than the average building or skyscraper.
The effort heuristic even applies to small things:
- If you earn $100, it seems more valuable than if you found $100 on the street
- The harder you have to work to achieve a goal, the more you’ll value that goal when you achieve it
Familiarity heuristic
The familiarity heuristic is that we tend to favor the familiar over the strange.
We often equate familiarity with reliability and safety; what’s familiar can seem “right”, the safe choice.
The familiarity heuristic is the reason we like brands, products and people we’re more familiar with. It’s also why, if you don’t know the answer to something, you’ll simply go with what’s more familiar.
Who do you think will win Wimbledon: Roger Federer or some guy you’ve never heard of?
The familiarity heuristic can mislead us, however: just because something is more familiar, that doesn’t mean it’s better than something unfamiliar. The known is not necessarily better than the unknown.
The familiarity heuristic can cause us to make errors of judgement about probability too. We tend to overvalue companies we’re more familiar with, and undervalue those we’re less familiar with. You’re more likely to rank Coca-Cola over ICBC, despite the fact that ICBC is a much bigger and more profitable company.
Fluency heuristic
The fluency heuristic is that the easier an idea or information is to understand, the more likely it is to be accepted.
This means that given a choice between two options, one easy to understand and one difficult to understand, you’ll tend to favor the easy one (even if the easy answer is wrong and the difficult answer is right).
This is why it’s important to explain your ideas clearly and simply in plain English. Don’t use a $50 word when a $5 word will do. The easier your ideas are for the average person to understand, the more likely they are to be accepted, whether they’re right or not.
Naive diversification
The naïve diversification heuristic states that when people are asked to make multiple choices at once, they tend to diversify more than when making the same type of decision sequentially.
When asked to choose five candies from a selection of ten types, people will tend to choose a variety. On the other hand, when asked to choose one candy from among ten types once a week for five weeks in a row, people are more likely to select the same one.
Occam’s razor
“Occam’s razor is the problem-solving principle that, when presented with competing hypothetical answers to a problem, one should select the one that makes the fewest assumptions.” – Wikipedia
Occam’s razor is a heuristic which favors the explanation containing the fewest assumptions.
The “razor” refers to removing as many unnecessary assumptions from a hypothesis as possible, because the more assumptions there are, the more possibilities there are for error.
Occam’s razor in a nutshell: The simplest explanation is usually the correct one.
For example, which is more likely to be true:
A UFO in the sky is:
a) Aliens from another galaxy
b) A type of aircraft or drone you haven’t seen before
Paleontologists have discovered dinosaur bones in the earth because:
a) Dinosaurs once lived on the earth
b) God (or Satan) put the dinosaur bones in the earth to test the faith of Christians
A woman drowned her five children in the bathtub because:
a) God told her to
b) She is insane or schizophrenic and is hearing voices in her head
Yep, the simplest explanation is usually – but not always – the correct one.
Occam’s razor states that not only should you start with the simplest and most likely explanation, you also shouldn’t overcomplicate, or add any unnecessary extra layers to your explanation.
“Entities should not be multiplied beyond necessity.” – William of Ockham
You don’t need to say that gravity works because of the laws of physics – and invisible men – if just the laws of physics will do.
“If a thing can be done adequately by means of one, it is superfluous to do it by means of several; for we observe that nature does not employ two instruments if one suffices.” – Thomas Aquinas
It’s important to note that Occam’s razor is not an unbreakable law or rule – the simplest explanation is not always the correct one – but it usually is. Therefore you should always start by asking:
“What is the simplest and most likely explanation?”
instead of starting with complex or far-fetched theories, which are less likely.
Occam’s razor doesn’t mean you eliminate complex or far-fetched theories completely; it just means you start with the simplest, most likely ones. If someone has a pounding in their head, it could be brain cancer, but let’s start off by assuming it’s a headache.
“If you have two theories that both explain the observed facts, then you should use the simplest until more evidence comes along”
Occam’s razor is also not about oversimplifying theories or excluding data or evidence, so if the simplest explanation doesn’t account for all of the available data and evidence, then it’s not the best explanation.
“Everything should be made as simple as possible, but not simpler.” – Albert Einstein
In summary: Occam’s razor is simply a good rule of thumb for thinking and problem solving when two explanations are equally explanatory, and it turns out to be right more often than not.
Peak-end rule
The peak-end rule is that we judge an experience largely based on how we felt at the most emotionally intense points (the “peak”), and at its end, instead of judging the experience as a whole. This bias of memory occurs regardless of whether the experience is pleasant or unpleasant.
The peak-end rule can apply to a movie, a vacation, a relationship – anything.
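The peak-end rule is often summarized as: the remembered rating of an experience is roughly the average of its most intense moment and its final moment, regardless of duration. A toy sketch with made-up numbers (purely illustrative, not real data):

```python
# Hypothetical minute-by-minute "pleasantness" scores for an experience,
# on a 0-10 scale (made-up numbers, purely illustrative).
ratings = [3, 4, 8, 2, 2, 2]

# How good the experience "actually" was, averaged over its whole duration:
actual_average = sum(ratings) / len(ratings)

# What the peak-end rule predicts we'll remember: roughly the average of
# the most intense moment (the peak) and the final moment (the end).
remembered = (max(ratings) + ratings[-1]) / 2

print(actual_average)  # 3.5
print(remembered)      # 5.0
```

The gap between the two numbers is the point: the middle of the experience barely registers in memory, while the peak and the ending dominate.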
Representativeness heuristic
The representativeness heuristic is that we tend to judge the likelihood of someone or something belonging to a category based on how similar it is to other members of that category.
If someone looks like a stereotypical nerd, most people are likely to assume that they work in accounting, finance, IT etc. rather than being a pro-athlete or a construction worker.
However, just because someone or something seems to fit the stereotype, that doesn’t mean they do.
Base rate fallacy
One problem with the representativeness heuristic is that it causes people to commit the base rate fallacy.
The base rate fallacy is the tendency to ignore relevant statistical information (the “base rate”) when estimating how likely an event is. It can cause people to overestimate the likelihood of something very rare, or to underestimate the likelihood of something very common.
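A classic way to see the base rate fallacy is the medical-test example. The numbers below are hypothetical, chosen purely for illustration: a rare condition (1 in 1,000 people) and a test that is 99% accurate.

```python
# Hypothetical numbers: a condition affects 1 in 1,000 people (the base
# rate), the test detects it 99% of the time, and falsely flags healthy
# people 1% of the time. Most people guess a positive test means ~99%
# chance of having the condition; Bayes' theorem says otherwise.
base_rate = 0.001       # P(condition)
sensitivity = 0.99      # P(positive | condition)
false_positive = 0.01   # P(positive | no condition)

# P(positive) = true positives + false positives
p_positive = sensitivity * base_rate + false_positive * (1 - base_rate)

# Bayes' theorem: P(condition | positive)
p_condition_given_positive = sensitivity * base_rate / p_positive

print(round(p_condition_given_positive, 2))  # 0.09 -- only about 9%
```

Ignoring the 1-in-1,000 base rate is exactly the fallacy: the intuitive “99%” answer overestimates a rare event more than tenfold.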
Scarcity heuristic
The scarcity heuristic causes us to desire and value things that are rare, limited edition, or hard to find.
If there is only one of something, if it is a limited edition, or even if they stop making your favorite jeans and you know you can’t replace them, they immediately become more valuable.
Advertisers appeal to the scarcity heuristic and FOMO (fear of missing out) to convince you to buy:
“Only 5 items left”
Similarity heuristic
The similarity heuristic is that we make choices and judgements of people and things in the present, based on how similar they are to something we’ve experienced in the past.
If you see an Italian restaurant that looks like your favorite one, with a similar decor, menu, prices etc. you’re likely to perceive it favorably.
If you see a preview for a movie that has similar characteristics to the types of movies you’ve enjoyed in the past, you’re likely to give it a chance.
The similarity heuristic is about learning from past experience, and letting lessons learnt from past experiences guide our future choices.
However, the similarity heuristic can cause us to negatively prejudge people and things. If someone reminds us of someone we don’t like (even if they’re nothing like that person), we will probably want to avoid them, due to the similarity.
Social proof
Social proof is what advertisers, influencers and the media use to mold our perception.
Most people are followers: when in doubt about what choice to make in a given situation, they follow the crowd and do what “everyone else” is doing.
Advertisers and influencers know this and that’s why they use social proof to manipulate your thinking and behavior, in order to get you to buy their products and services. Instead of trying to convince you logically to use their product or service, they show you ads and pics of celebrities and people like you (in the target demo) doing it. The line of reasoning is “If celebrity X is doing it, and if everyone else is doing it, it must be the right thing to do”.
7 types of social proof:
- Celebrity: If popular celebrities are doing or recommending something, other people will copy them and follow suit
- Influencers: Instagram and YouTube influencers with millions of followers can have the exact same effect
- Expert: If “9/10 doctors agree” or some industry expert endorses a product or service, people are far more likely to buy it
- Reviews: If a product or service has great reviews on Amazon, eBay, Facebook, Tripadvisor, Yelp etc. people are more likely to trust it
- Social media shares: The more likes, comments and shares a particular blog article, podcast, YouTube video etc. has, the more credible it seems
- Certification: Certifications make someone seem more credible
- Canned laughter: Sitcoms often use canned laughter to make even unfunny jokes seem funnier than they really are. Similarly late shows and talk shows instruct guests to cheer and laugh at even the slightest attempt at a joke from the host
Stereotyping
Stereotyping is a common heuristic in which we unconsciously categorize people according to certain traits often possessed by their gender, race or culture, especially if the group is portrayed that way by Hollywood, the media, online etc.
- Asians are good at math
- Black guys are athletic
- Gays are good dressers
- Jews are good with money
- Women are emotional
- White guys can’t dance
Stereotypes are obviously not always true; however, they’re accurate more often than not. It isn’t for no reason that people think and say the things they do.
You probably recognise a bunch of these heuristics from your own life, and you can see how common they are: we use them every day. Hopefully you can also see the errors and flaws in these mental shortcuts.