...

MAKING BETTER DECISIONS – LOGICAL FALLACIES AND BIASES

This is the second part of a two-part series:

  1. MENTAL MODELS (READ THIS FIRST)
  2. LOGICAL FALLACIES AND BIASES

As mentioned in part 1 of this series, the key to better decisions is being wrong less often. Understanding logical fallacies and biases is essential to this.

Logical fallacies are errors in reasoning built on unsound arguments. We form these patterns of thought over time and they become part of the way we think. They take a front-row seat when we are making quick decisions.

Biases are mental shortcuts, formed over the years, that give us a sense of perceived rationality. Many different biases influence individuals in different ways, but their common characteristic is that they lead to judgments and decisions that deviate from rational objectivity.

System 1 and System 2

Daniel Kahneman, in his seminal work Thinking, Fast and Slow [check it on Amazon], explained that our ‘lazy’ brain uses fallacies and biases as part of System 1 to make quick and efficient decisions. The downside is that these decisions are not always correct.

System 2 is calculative rather than intuitive. Mental models and a knowledge of fallacies and biases will help you use System 2 more efficiently and avoid the mistakes System 1 makes.

An easy way to illustrate this is causes of death in the USA. Ask a friend: if a random person died in the USA, was the cause more likely heart disease or an unintentional accident? Most people answer accidents. In reality, heart disease kills far more people than accidents do.
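A quick back-of-the-envelope comparison makes the gap concrete. The numbers below are rough, rounded annual counts I am using purely for illustration (the CDC publishes the actual figures):

```python
# Rough, rounded annual US death counts; approximations for
# illustration only, not official statistics.
deaths = {
    "heart disease": 700_000,
    "unintentional accidents": 200_000,
}

ratio = deaths["heart disease"] / deaths["unintentional accidents"]
print(f"Heart disease kills roughly {ratio:.1f}x as many people as accidents.")
```

Accidents are vivid and newsworthy, so System 1 reaches for them first; the base rates point the other way.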

Here are some of my favorite logical fallacies and biases:

CONJUNCTION FALLACY

This is a common mathematical fallacy rooted in probability: the probability of two events occurring together can never be greater than the probability of either one occurring alone, yet our intuition often tells us otherwise.

The book Thinking, Fast and Slow [check the book on Amazon] gives a very good example to explain this –

Linda is thirty-one years old, single, outspoken, and very bright. She majored in philosophy. As a student, she was deeply concerned with issues of discrimination and social justice and also participated in antinuclear demonstrations.

The author now gave his subjects two options:

  1. Linda is a bank teller
  2. Linda is a bank teller and is active in the feminist movement

85-90% of people chose option 2. But if you think in terms of probability, the answer should be option 1, because the probability of Linda just being a bank teller (or even just being a feminist) is higher than the probability of her being both.
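As a sanity check, here is a minimal sketch (my own illustration, with made-up probabilities) of why the conjunction can never come out ahead:

```python
# Hypothetical probabilities, chosen purely for illustration.
p_teller = 0.05                # P(Linda is a bank teller)
p_feminist_given_teller = 0.6  # P(active feminist, given bank teller)

# The joint probability is the product, so it can never exceed P(teller).
p_both = p_teller * p_feminist_given_teller

print(p_teller)  # 0.05
print(p_both)    # ~0.03, always <= p_teller

# This holds for ANY probabilities in [0, 1], not just these made-up ones.
assert p_both <= p_teller
```

Adding a condition can only shrink a probability, never grow it.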


I personally chose option 2 as well. Based on the description provided, I judged that she couldn’t just be a bank teller. But that was my opinion, and the author wasn’t asking for one.

AVAILABILITY HEURISTIC

We tend to judge the likelihood of an event based on its frequency, its recency (a closely related bias is called the recency bias), its importance, and the ease with which examples of it come to mind.

The brain has energy-saving and inertial tendencies that we have little control over, and the availability heuristic is likely one of them. We often ignore the big picture and focus on whatever happened most recently.

A great example from Thinking, Fast and Slow [check the book on Amazon] is the crime rate. We are currently living in the least violent period in history.

However, news reports and media coverage lead us to perceive crime rates as far higher than they actually are. Our brain is biased to think crime is rampant.

SOCIAL BIAS


This is the herd-mentality bias everyone keeps discussing. Humans are social animals: we do things in groups and build community and trust with one another. But this can create the bias that large social groups are always correct.

The 2008 financial crisis is a good example: the few people who actually went against the herd made a profit.

An easy way to question this bias is to remember Oscar Wilde’s quip: “Everything popular is wrong.”

FIRST CONCLUSION BIAS

As Charlie Munger pointed out, the human mind works a lot like the human egg: the moment one sperm gets in, it shuts down so no other can enter.

Similarly, our brain shuts down as soon as we find a conclusion.

In fact, in a lot of instances we will go out of our way to prove that the first conclusion is correct. Challenging your first conclusion is necessary.

SMALL SAMPLE BIAS

Forming a conclusion from a small sample is dangerous. This bias is exactly that tendency.

My mom, who is trying to convince me to stay in India, always says that everyone she knows who works in India makes a move abroad after 2-3 years anyway. Apart from the opinion being coloured by her love for me, the sample from which she has formed this conclusion is rather small.

I know this because when I ask her how many people she actually knows who moved abroad after working in India, the answer is usually 2-3 out of the 4 people she has talked to.
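To see how little a sample of 4 can tell you, here is a minimal sketch (my own illustration, using a standard Wilson score interval) comparing the uncertainty around mom’s 3-out-of-4 observation with the same rate observed in a hypothetical sample of 400:

```python
import math

def wilson_interval(successes, n, z=1.96):
    """95% Wilson score confidence interval for a proportion."""
    p = successes / n
    denom = 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
    return centre - half, centre + half

# Mom's sample: 3 of the 4 people she knows moved abroad.
print(wilson_interval(3, 4))      # ~(0.30, 0.95): almost no information
# The same observed rate from a much larger (hypothetical) sample:
print(wilson_interval(300, 400))  # ~(0.71, 0.79): a claim worth debating
```

With n = 4, “everyone moves abroad” is compatible with anything from roughly 30% to 95%.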

FUNDAMENTAL ATTRIBUTION ERROR

We tend to stamp traits onto a person based on their behavior, without considering their situation.

For example: “this person is super greedy and never pays at the bar.” We often fail to consider the situation that would make someone behave that way; maybe they are in dire need of money right now but still value the friendship.

As such, we are usually surprised when, the next time around, they don’t live up to the trait we assigned them.

Humans have a tendency to form judgments, but we should always take a step back and look at the situation with empathy.

CONFIRMATION BIAS

What we believe, we see. Our brain is really lazy, so it prefers to see whatever requires the least effort, i.e. whatever reinforces your existing bias.

Confirmation bias: we pick the evidence that supports our theory and leave the counter-evidence in the dark.

Take claims like “Muslims incite terrorism in India” or “‘only’ Muslims are going out during the pandemic” (‘only’ being the key word). People who believe Muslims are the main agents of terrorism in India will actively seek out articles and news channels that confirm their bias, most of the time ignoring every piece of counter-evidence. The same is true in the other direction as well.

This is how news channels and media houses make a buck. In today’s world of filter-bubble-based social media applications, confirmation bias is something you have to be careful of. 

SURVIVORSHIP BIAS

Survivorship bias happens when you only look at the people or things that have “survived” a specific experience or situation, without considering the ones that didn’t make it.

Survivorship bias: successful outcomes don’t represent the full picture.

We often cast the winners (or losers) as the norm instead of considering them outliers.
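A minimal simulation (my own sketch, with made-up parameters) makes the distortion visible: give every venture the same random luck, then compare the average outcome of all ventures with the average of only the survivors:

```python
import random

random.seed(42)

def run_venture(years=10):
    """One venture: random yearly returns; it drops out of sight below 0.5."""
    value = 1.0
    for _ in range(years):
        value *= random.uniform(0.7, 1.4)  # made-up yearly multiplier
        if value < 0.5:
            return value, False  # failed: the stories we never hear
    return value, True           # survived: the stories we do hear

results = [run_venture() for _ in range(10_000)]
all_values = [v for v, _ in results]
survivors = [v for v, alive in results if alive]

print(f"average over ALL ventures:   {sum(all_values) / len(all_values):.2f}")
print(f"average over survivors only: {sum(survivors) / len(survivors):.2f}")
print(f"share that survived:         {len(survivors) / len(results):.0%}")
```

Judging the game by the survivors alone inflates how good the game looks.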

ANCHORING

Anchoring or focalism is a term used in psychology to describe the common human tendency to rely heavily on, or “anchor on,” one trait or piece of information when making decisions.

This effect is widely used in retail stores. If you walk into a perfume store and see that the first perfume on the display is worth $700, the next perfume costing $200 will seem relatively cheap; you might even go for it. 

CURSE OF KNOWLEDGE

Imagine the time you were an intern and your manager gave you only high-level details of what you had to accomplish for the day. You went back to your desk with zero understanding of what actually needed to be done! This is the curse of knowledge.

It is the inability to communicate ideas because you wrongly assume others have the necessary background to understand what you’re talking about.

Always assume the person you’re talking to doesn’t know the concept yet – without being condescending, of course.

SYSTEM JUSTIFICATION THEORY


According to system justification theory, people are motivated (to varying degrees, depending on situational and dispositional factors) to defend, bolster, and justify prevailing social, economic, and political arrangements, i.e. the status quo. It is closely related to the status quo bias.

In simple terms, individual incentives will sustain systemic stupidity if there are individuals who benefit from them.

*Cough cough* Donald Trump *cough* GOP *cough cough*.

FREQUENCY ILLUSION (BAADER-MEINHOF PHENOMENON)

Ever read a word for the first time in a book, looked up the definition, and then seen the word everywhere for the next 7 days? It happens to me a lot.

This is the frequency illusion also called the Baader-Meinhof phenomenon. It is a cognitive bias that describes our tendency to see new information, names, ideas or patterns ‘everywhere’ soon after they’re first brought to our attention.

If this is the first time you’re reading about mental models, I’m pretty sure you will see the term everywhere for the next 30 days.

This, coupled with the availability heuristic, can be dangerous. Nerdwriter1 does a good job of explaining this effect.

IMPOSTOR SYNDROME

Impostor syndrome is a psychological pattern in which one doubts one’s accomplishments and has a persistent internalized fear of being exposed as a “fraud”.

Even when the person is perceived as competent by external sources, they will attribute their success to luck and/or advantage rather than their own skill. This usually happens to people who are already knowledgeable or expert in their field.

I personally feel this helps you learn new things, since you always feel you don’t know enough. But it sucks when it starts affecting your confidence.

AUTHORITY BIAS

Authority bias: the authority outweighs rationality.

This bias is the tendency to attribute greater accuracy to the opinion of an authority figure and be more influenced by that opinion.

You are more likely to take advice from a person wearing a suit than from a person in worn-out jeans and a T-shirt. That’s less true these days with Silicon Valley culture, but you get the point, don’t you?

Read my full article on authority bias here.

DUNNING-KRUGER EFFECT


Daniel Kahneman notes that knowing only a little makes it easier to fit everything you know into a coherent pattern.

The Dunning-Kruger effect is a cognitive bias in which people with low ability at a task overestimate their ability. It is related to the cognitive bias of illusory superiority and stems from people’s inability to recognize their own lack of ability.

In simple terms: the person in your office who is least knowledgeable on a subject will talk a lot and throw around mainstream buzzwords, while the more knowledgeable person stays quiet because of impostor syndrome.

FALSE CONSENSUS EFFECT

Also known as consensus bias, this is a pervasive cognitive bias in social inference, in which people tend to “see their own behavioural choices and judgments as relatively common and appropriate to existing circumstances”.

An example would be the people who were shocked to see Donald Trump beat Hillary Clinton in the 2016 election. I agree other forces contributed to his victory, but we are discussing the effect here.

In today’s age of filter bubbles, this bias can be dangerous. Hence the saying: always understand both sides of an argument.

NORMALCY BIAS

This is a tendency for people to believe that things will function in the future the way they normally have functioned in the past and therefore to underestimate both the likelihood of a disaster and its possible effects.

We can see this bias in today’s world in how various governments handled the COVID-19 pandemic.

Risk management, stoicism, and antifragility could help you break this bias.  

POISONING THE WELL

This is an informal fallacy in which irrelevant adverse information about a target is presented with the intention of discrediting them. For example: “The senator was caught doing drugs in college, so his climate-change policy cannot be supported.”

COBRA EFFECT

This can be a side effect of applying the incentives mental model badly. It essentially means that trying to solve a problem with the wrong incentives can make the problem worse.

The idea comes from an old Indian anecdote: a village was infested with cobras, so the ruler offered a bounty for dead cobras. The village entrepreneurs simply started breeding more cobras to collect it.

QUESTIONING FALLACIES AND BIASES


I want to list some questions that can help you break through these biases and apply the mental models discussed in the previous article. Here are some of my notes:

  1. Is someone doing something out of ill intent or carelessness? – Something goes wrong in your life and you think someone else is to blame. Ask whether that person acted out of malice or mere carelessness, and whether you might have done the same. This breaks the actor-observer bias.
  2. Would you still do ____ if ___? – This is an open-ended question when you are making a decision. Would you still break the quarantine if your parents were on the frontline? Would you still mandate the company-wide policy if you were a clerk and not the CEO?
  3. What did I learn today? – This is my go-to question when I am updating my journal. What did I do today that helped me progress?
  4. If you’re doing something one way and hit stagnation, what could happen if you do the exact opposite? – This question comes from Tim Ferriss. It helps you prepare for the worst scenario; personally, it helps me a lot with anxiety. “I have to work through December; I can’t attend my best friend’s wedding.” What could happen if you don’t work and actually attend the wedding? The worst-case scenarios in our heads are rarely that bad in reality.
  5. Can I achieve my 10-year goal in 6 months, and what would that require of me? – I like to set annual goals and 3-year plans. They never work out. Ever. But I still set them, and this question helps you start acting on your goals with actual short-term actionable items. Most of the time you will realize you can achieve your 1-year goal in 6 months.
  6. Are you hunting boars or field mice? – Ask this when you feel tired and drained by everything you have to complete. It will tell you whether you’re actually working on productive things or just being busy, and it helps me prioritize. This question complements the ‘opportunity cost’ model.
  7. The 5 whys – I don’t understand ____; why is ____? – This is a famous technique for pressure-testing conclusions and breaking your assumptions. The 5-whys technique leads you to first-principles thinking, and it will also help you question existing biases and fallacies.

FINAL WORD

Per Daniel Kahneman, our first instinct is to use the intuitive part of our brain, System 1, for most tasks. This is the hunter-gatherer’s brain. Mental models and the study of biases help you use System 2 much more efficiently.

It is important to understand the biases and fallacies that can trap you into decisions you will regret. Mental models can help you avoid them.

Always try to amplify your strengths and reduce your mistakes. Mental models will help you do that!

Be multi-disciplinary – read from different sources, try to learn from different fields, listen to informative discussions, have insightful conversations; these will help you build your latticework of mental models and fallacies.  

Keep an open mind when learning. Form your own judgments, but always listen with an open mind. Always remember: strong opinions, loosely held.

Be organized and note down your models and biases to better reflect on them. 

Mental models are theoretical. You will get better at making decisions with practice.

Be careful not to jump to conclusions. Always try to take a step back and use System 2 when making important judgments. Keep your fallacies and biases in check.

ADDITIONAL RESOURCES

I want to share a list of resources that helped me build this article. I hope you find them insightful.

  1. Farnam Street – As I said before, my journey with mental models began with this blog. Shane Parrish has been researching mental models for more than 15 years and his articles are insightful. I highly recommend signing up for the newsletter and listening to his podcast, The Knowledge Project.
  2. Gabriel Weinberg, the founder of DuckDuckGo, wrote Super Thinking [check the book on Amazon]. The book details all the mental models and logical fallacies he has learned over the years. A very good read.
  3. Tools of Titans by Tim Ferriss [check it on Amazon here] is my repository of knowledge. So many insightful conversations compiled in one book.
  4. James Clear is the author of the NYT best-seller Atomic Habits [check the book on Amazon here]. His blog has some great articles about mental models.
  5. Thinking, Fast and Slow [check it on Amazon] – Daniel Kahneman published his seminal research as Thinking, Fast and Slow. The book introduces two systems of thinking: System 1 is quick and lazy, while System 2 is more deliberate and calculative. Decision-making is about balancing the two systems across tasks.