“The human mind treats a new idea the same way the body treats a strange protein; it rejects it.” — P.B. Medawar.
Psychological biases can greatly affect our thinking, decision making, and behavior.
In this list, we will explore some of the most common psychological biases, from the Actor-Observer Bias to the Von Restorff Effect, and explain them in simple language that anyone can understand.
By recognizing these biases, we can become more aware of our own tendencies and make more informed choices in our personal and professional lives.
What are Psychological Biases?
Psychological biases are systematic errors in thinking and decision-making driven by psychological and cognitive factors rather than by rational, objective evaluation.
These biases affect our perception, judgment, and decision-making, often leading us to deviate from normative standards of reasoning and introducing errors or distortions into our thinking.
They can arise from factors such as emotions, beliefs, heuristics, and cognitive limitations, and they touch many areas of life, including our relationships, work, and personal well-being.
Psychological Biases
Here are some common psychological biases:
- Actor-Observer Bias: We tend to attribute our own actions to external factors and the actions of others to internal factors.
- Anchoring Bias: We rely too heavily on the first piece of information we receive when making decisions.
- Availability Heuristic: We judge how likely or common something is by how easily examples come to mind.
- Backfire Effect: We resist changing our beliefs even in the face of evidence that contradicts them.
- Bandwagon Effect: We are more likely to adopt a belief or behavior if many others share it.
- Base Rate Fallacy: We overweight specific, vivid information and underweight general statistical information (the base rate).
- Clustering Illusion: We see patterns in random events even if they don’t exist.
- Confirmation Bias: We look for information that confirms our existing beliefs and ignore information that contradicts them.
- Dunning-Kruger Effect: We overestimate our abilities and knowledge, especially in areas where we have little expertise.
- Endowment Effect: We value things we own more than things we don’t own.
- Escalation of Commitment: We continue to invest time, money, or resources in a failing project or idea because we’ve already invested so much.
- False Consensus Effect: We overestimate the number of people who share our beliefs or opinions.
- False Uniqueness Effect: We underestimate the number of people who share our abilities or achievements.
- Framing Effect: We are influenced by how information is presented to us, even if the information is the same.
- Fundamental Attribution Error: We explain other people’s behavior by their character or disposition and underestimate the influence of their situation.
- Gambler’s Fallacy: We believe that past events affect the probability of future events, even if the events are independent.
- Halo Effect: We assume that people who have one good quality must have other good qualities too.
- Illusory Superiority Bias: We overestimate our abilities and think we’re better than average.
- Just-World Bias: We believe that people get what they deserve, and we blame victims for their misfortunes.
- Law of Unintended Consequences: Our actions often produce outcomes we did not intend or foresee, and we tend to underestimate this risk.
- Mental Accounting Bias: We treat money differently depending on where it comes from or how it’s spent.
- Mental Time Travel Bias: We underestimate how much we’ll change in the future and overestimate how much we’ve changed in the past.
- Negativity Bias: We pay more attention to negative information than positive information.
- Optimism Bias: We believe that good things are more likely to happen to us than to other people.
- Planning Fallacy: We underestimate how long it will take to complete a task or project.
- Recency Bias: We overvalue the most recent information and ignore earlier information.
- Representativeness Heuristic: We make judgments based on stereotypes or prototypes, even if the information doesn’t support them.
- Self-Serving Bias: We attribute our successes to internal factors and our failures to external factors.
- Status Quo Bias: We prefer to keep things the way they are, even if change would be beneficial.
- Sunk Cost Fallacy: We let costs we have already paid and cannot recover drive our decisions about the future, rather than judging each option on its remaining costs and benefits.
- Survivorship Bias: We focus on the people or things that succeed and ignore the ones that fail.
- Von Restorff Effect: We remember things that stand out or are different more than things that are ordinary.
Examples of How to Use These Psychological Biases to Think and Do Better
Here are some examples of how knowing these psychological biases can help you think and do better:
- Actor-observer bias: Be aware of the different perspectives that can arise from being an actor or an observer in a situation, and try to consider both. For instance, if we fail to complete a task, we tend to blame external factors such as a lack of resources or distractions, yet when someone else fails we point to internal factors such as a lack of ability or motivation. The pattern flips for success: we credit our own abilities and hard work, but attribute other people’s success to luck or outside help.
- Anchoring bias: Be aware of the first piece of information you receive, and try to gather additional information before making a decision. An example of anchoring bias is when a home buyer first views an overpriced property and then judges every other home against that price. For instance, if a buyer views a property listed at $1 million and then views a similar property listed at $900,000, the second home may feel like a bargain, even if it is itself overpriced, because the $1 million listing became their anchor point.
- Availability heuristic: Take a step back and consider whether the information that is readily available to you is truly representative of the situation. An example of the availability heuristic is when people estimate the likelihood of an event based on how easily examples come to mind. For instance, after seeing several news stories about shark attacks, a person may overestimate the risk of a shark attack, even though the probability of being attacked by a shark is extremely low compared to other risks like getting into a car accident or contracting a disease.
- Backfire effect: Be willing to reconsider your beliefs and opinions in the face of new evidence, and avoid becoming defensive or entrenched in your positions. The backfire effect occurs when a person’s core beliefs are challenged by contradictory evidence, and they react by becoming more entrenched in their beliefs rather than changing them. Suppose you have a friend who is very passionate about a particular political issue, and they try to convince you to adopt their viewpoint. However, you have a different opinion on the matter and present counterarguments to challenge their position. If your friend becomes even more convinced of their initial stance and becomes defensive, even in the face of evidence to the contrary, that would be an example of the backfire effect.
- Bandwagon effect: Don’t simply go along with the majority without considering whether it is truly the best decision or option. An example of the bandwagon effect could be a person who decides to support a political candidate simply because they are leading in the polls, even if they may not necessarily agree with all of their policies or values. This person may feel that supporting the candidate who is ahead in the polls is the popular or safe choice, rather than making an independent decision based on their own beliefs and values.
- Base rate fallacy: Don’t solely rely on anecdotal evidence or isolated incidents when making decisions, and consider the broader trends and statistics. An example of the base rate fallacy is when a person believes that they are more likely to win the lottery because they know someone who won, even though the actual odds of winning remain extremely low. This bias disregards the base rate, or prior probability, of an event and instead overemphasizes anecdotal evidence and isolated incidents; a short worked example appears after this list.
- Clustering illusion: Be aware of the tendency to perceive patterns in random events, and try to evaluate things objectively. For instance, a baseball player who has hit three home runs in the last three games may believe he has a hot hand and is more likely to hit another. The clustering illusion can lead him to overlook the fact that his overall batting average has not significantly improved over the same period, producing a false belief in a pattern that is not statistically meaningful (see the short simulation after this list).
- Confirmation bias: To avoid this bias, try to seek out alternative viewpoints and evidence that challenges your beliefs. An example of confirmation bias is a person who believes that a certain alternative medicine works, and then only searches for information that confirms their belief, while ignoring or dismissing any information that contradicts it.
- Endowment effect: Be aware of the biases that can arise from being emotionally attached to something, and try to evaluate it objectively. An example of the endowment effect is valuing an item you own, such as a family heirloom, more than you would value a similar item for sale on the market. For example, you may treasure a watch you inherited from your grandfather far beyond what a comparable watch would cost, because the sentimental value it holds for you outweighs its market value.
- Escalation of commitment: Be willing to cut your losses and make a change if a particular course of action is not working out. An example of escalation of commitment is when a company continues to invest in a project even when it is not yielding the desired results or returns. For instance, a company may have invested a significant amount of money and resources into a project that is not performing well, but instead of cutting their losses and reallocating the resources, they continue to invest more in the hope that it will eventually turn around. This can lead to a situation where the company is pouring more resources into a project that is unlikely to succeed, resulting in significant losses for the organization.
- False consensus effect: Be aware that your own beliefs and opinions may not be shared by everyone, and avoid assuming that they are. An example of false consensus effect is when a person assumes that their beliefs, opinions, and values are shared by a majority of people, even when evidence suggests otherwise. For instance, a person who strongly believes that a certain political party is the best might assume that most people around them share the same belief, even though this might not be the case.
- False uniqueness effect: Avoid assuming that you are better or more capable than others, and recognize the value of diverse perspectives and abilities. An example of the false uniqueness effect is when someone believes that their abilities, traits, or opinions are more unique or uncommon than they actually are. For instance, a student who receives an A on an exam may believe that they are the only one in their class who achieved such a high grade, when in reality, others may have performed just as well or even better. This bias can lead to overestimating one’s own individuality and underestimating the similarities and shared experiences among others.
- Framing effect: Be aware of the potential biases that can arise from the way information is presented, and try to evaluate it objectively. A classic example of the framing effect comes from studies of medical decisions: the same treatment is described either as having a 90% survival rate or as having a 10% mortality rate. Although the two descriptions are statistically identical, far more people choose the treatment when it is framed in terms of survival than when it is framed in terms of mortality. This demonstrates how the way information is presented can affect people’s decisions even when the underlying facts are the same.
- Fundamental attribution error: Try to consider external factors that may be influencing a person’s behavior, rather than solely attributing it to their character or personality. An example of the fundamental attribution error is when a driver cuts you off on the road, and you assume they are a bad and reckless driver. In reality, the driver may have been in a rush to get to the hospital or may have misjudged the distance. Rather than considering external factors, you attribute the behavior solely to the driver’s character or personality.
- Gambler’s fallacy: Don’t assume that the outcome of a random event is influenced by previous outcomes, and recognize that each event is independent. An example of the gambler’s fallacy is when a person believes that a specific outcome is more likely to occur because it hasn’t happened in a while, even though the probability remains the same. For instance, if a person believes that a coin is more likely to land on heads because it has landed on tails several times in a row, they are committing the gambler’s fallacy. In reality, the probability of the coin landing on either heads or tails is always 50/50, regardless of previous outcomes (see the short simulation after this list).
- Halo effect: Try to evaluate people or things on specific criteria rather than being swayed by one positive attribute. An example of the halo effect could be a situation where a person who is physically attractive is assumed to also have other positive attributes, such as being intelligent or competent, even though there may not be any evidence to support this assumption. For instance, a job interviewer may rate a candidate highly based on their physical appearance, and assume that they are also a good fit for the job, without considering other factors such as their experience or qualifications.
- Illusory superiority bias: Avoid assuming that you are better or more capable than others, and recognize the value of diverse perspectives and abilities. An example of illusory superiority bias is when someone overestimates their abilities or qualities in comparison to others. For instance, a person may believe that they are the best driver on the road or the most skilled employee in their workplace, despite objective evidence to the contrary. This bias can lead to overconfidence and a lack of awareness of one’s limitations, potentially leading to poor decision-making or performance.
- Mental accounting bias: Be aware of the tendency to treat money differently based on how it was acquired or will be used, and try to evaluate it objectively. Mental accounting bias is the tendency for individuals to treat money differently based on the source, intended use, and how it is categorized. For example, if someone receives a bonus from work, they may be more likely to spend it on a luxury item than if they had earned the same amount through regular salary. Another example is when someone keeps a strict budget for their groceries but is willing to spend much more money on a vacation or other discretionary expenses. This bias can lead to inefficient use of resources and missed opportunities for savings or investments.
- Negativity bias: Try to focus on the positives in a situation and not allow negative events or emotions to overshadow them. An example of negativity bias is a person who receives a performance evaluation with mostly positive feedback and one negative comment. Despite the abundance of positive feedback, the person may dwell on the negative comment and remember it more vividly, which can trigger a negative emotional response and affect their self-esteem.
- Optimism bias: Be aware of the potential biases that can arise from being overly optimistic, and try to consider potential challenges and obstacles. An example of optimism bias is a student who believes that they will do well on a test, despite not having studied enough, simply because they have done well on previous tests without much preparation. This bias can lead to overconfidence and underestimation of potential risks or obstacles.
- Planning fallacy: Be realistic in your estimations of how long a particular project or task will take, and build in extra time for unexpected delays or complications. An example of the planning fallacy is when a student underestimates the amount of time it will take to complete a school project, and as a result, ends up rushing to finish it at the last minute. Despite having experienced similar situations before, the student is still overly optimistic about how much time they will need, leading to poor time management and potentially subpar work.
- Recency bias: Don’t solely rely on recent events when making decisions, and consider the broader context and history. Recency bias is the tendency to place more importance or weight on recent events or information, even if they are not necessarily more relevant or accurate. For example, in the context of evaluating job candidates, a recruiter may be more impressed with a candidate who has performed well in recent job interviews, even if their overall experience and qualifications are not as strong as another candidate who performed well in the past but had a less successful recent interview.
- Self-serving bias: Try to be aware of your own biases and motivations, and consider alternative perspectives and viewpoints. An example of self-serving bias is when an individual attributes their successes to their own abilities and efforts, but their failures to external factors outside of their control. For instance, a student may attribute their good grades to their intelligence and hard work, but their poor grades to a difficult teacher or an unfair grading system. This bias can also manifest in the workplace, where an employee may attribute their successes to their own skills and expertise, but their failures to a lack of support or resources from the organization.
- Status quo bias: Consider whether sticking to the status quo is truly the best option, and weigh the potential risks and benefits of making a change. An example of status quo bias is when a company continues to use outdated technology or methods because they are familiar and comfortable with them, even if newer and more efficient options are available. This can lead to missed opportunities for improvement and hinder the company’s progress.
- Sunk cost fallacy: Don’t continue investing time or resources into a project simply because you have already invested a significant amount. Suppose a company pours a large amount of money and resources into a project that is not producing the desired results. Even though continuing would require still more resources, the company keeps funding it, reasoning that abandoning the project now would make the earlier investment a waste. In reality, the money already spent is gone either way, and the company would be better off redirecting its resources toward a project that is more likely to succeed.
- The law of unintended consequences: Be aware that actions can have unintended and unpredictable outcomes, and consider potential risks and side effects. An example of the law of unintended consequences is the introduction of non-native species to an ecosystem. For instance, introducing a new species of fish to a lake in order to improve fishing opportunities for humans may seem like a positive change, but it can have unintended consequences on the lake’s ecosystem. The new fish may outcompete native species for food and habitat, leading to a decline in their populations. This can also have ripple effects on other species that rely on the native fish as a food source, ultimately resulting in a less diverse and less stable ecosystem.
- The Von Restorff effect: Be aware of the potential biases that can arise from focusing on unique or unusual elements, and try to evaluate things holistically. An example of the Von Restorff effect: when a person is given a list of items to remember and one item is distinct or stands out in some way, they are more likely to remember that item than the others. For example, if one word in a list is printed in a different font or color, people are more likely to recall that word. This effect is often used in advertising and marketing to make a product or message stand out and be more memorable to consumers.
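To make the base rate fallacy entry above concrete, here is a minimal sketch in Python. The numbers are hypothetical (a screening test with 99% sensitivity, a 5% false-positive rate, and a condition with a 0.1% base rate), chosen only to illustrate the arithmetic: ignoring the base rate makes a positive result look like near-certain proof, while Bayes’ theorem shows it is anything but.

```python
# Minimal sketch of why base rates matter (all numbers are hypothetical).
base_rate = 0.001        # prior probability of having the condition (0.1%)
sensitivity = 0.99       # P(test positive | condition present)
false_positive = 0.05    # P(test positive | condition absent)

# Total probability of testing positive.
p_positive = sensitivity * base_rate + false_positive * (1 - base_rate)

# Bayes' theorem: probability of actually having the condition given a positive test.
p_condition_given_positive = (sensitivity * base_rate) / p_positive

print(f"P(positive result)      = {p_positive:.4f}")
print(f"P(condition | positive) = {p_condition_given_positive:.3f}")  # roughly 0.019
```

Intuition anchored on the “99% accurate” figure suggests a positive result settles the question, but once the 0.1% base rate is factored in, a positive result corresponds to only about a 2% chance of actually having the condition.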
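In the same spirit, the gambler’s fallacy and the clustering illusion can be checked with a short simulation. This is only an illustrative sketch (the coin, the streak length, and the number of flips are arbitrary choices): it estimates how often heads follows a run of three tails, and finds the longest streak of identical outcomes in a purely random sequence.

```python
import random

random.seed(42)  # fixed seed so the sketch is reproducible

# Simulate one million fair coin flips.
flips = [random.choice("HT") for _ in range(1_000_000)]

# Gambler's fallacy check: after three tails in a row, is heads "due"?
after_three_tails = [flips[i + 3] for i in range(len(flips) - 3)
                     if flips[i:i + 3] == list("TTT")]
heads_rate = after_three_tails.count("H") / len(after_three_tails)
print(f"P(heads | previous three flips were tails) = {heads_rate:.3f}")  # stays near 0.500

# Clustering illusion check: long streaks show up in random data all the time.
longest, current = 1, 1
for prev, cur in zip(flips, flips[1:]):
    current = current + 1 if cur == prev else 1
    longest = max(longest, current)
print(f"Longest streak of identical flips: {longest}")  # typically around 20
```

The conditional frequency of heads stays at about 50% no matter what came before, and streaks of twenty or so identical flips appear routinely in a million random flips, which is why streaks alone are weak evidence of a genuine pattern or a “hot hand.”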
Know Your Psychological Biases to Think and Do Better
Psychological biases are ubiquitous in our daily lives, and understanding them is crucial to making better decisions and becoming more critical thinkers.
By being aware of these biases and learning how to mitigate their effects, we can become more objective and rational in our judgments and actions.
Remember, being aware of our own biases is the first step to avoiding their pitfalls and becoming more successful in life, work, and relationships.