“The first principle is that you must not fool yourself—and you are the easiest person to fool.” — Richard Feynman
Do you sometimes feel like you’re making irrational decisions, or find yourself wondering why you made a particular choice?
Decision-making biases could be the reason behind it.
These cognitive shortcuts and thinking patterns can affect how we process information and make decisions. In this list, we’ll explore some common decision-making biases, how they work, and how to avoid them.
By understanding and identifying these biases, you can become a more thoughtful and rational decision-maker.
What are Decision-Making Biases?
Decision-making biases are cognitive biases that influence and distort the decision-making process, leading to systematic errors in judgment.
These biases can affect any type of decision, from simple everyday choices to complex strategic decisions, and can have significant consequences on the outcome of those decisions.
Some decision-making biases are related to our perception and interpretation of information, while others are influenced by our emotions, beliefs, values, and past experiences.
Here are common decision-making biases:
- Actor-Observer Bias: Tendency to attribute our own behavior to situational factors and the behavior of others to dispositional factors.
- Anchoring Bias: Tendency to rely too heavily on the first piece of information encountered when making decisions.
- Availability Heuristic: Tendency to overestimate the likelihood of events that are easily remembered or recalled.
- Backfire Effect: Tendency to reject information that contradicts our beliefs and, in doing so, to hold those beliefs even more strongly.
- Bandwagon Effect: Tendency to adopt beliefs or behaviors because others around us are doing so.
- Barnum Effect: Tendency to accept vague, general statements about our personalities as uniquely accurate descriptions of ourselves.
- Base Rate Fallacy: Tendency to ignore statistical information in favor of anecdotes or personal experiences when making decisions.
- Bystander Effect: Tendency to assume someone else will take action in an emergency situation, leading to inaction.
- Clustering Illusion: Tendency to see patterns where none exist.
- Confirmation Bias: Tendency to seek out and interpret information in a way that confirms our existing beliefs.
- Dunning-Kruger Effect: Tendency to overestimate one’s own abilities or knowledge.
- Endowment Effect: Tendency to overvalue something simply because we own it.
- Escalation of Commitment: Tendency to continue investing in a decision, even when it’s not paying off, because we’ve already invested so much.
- False Consensus Effect: Tendency to overestimate the extent to which others share our beliefs or opinions.
- False Uniqueness Effect: Tendency to underestimate the extent to which others share our unique characteristics or abilities.
- Framing Effect: Tendency to be influenced by the way information is presented or framed.
- Frequency Illusion: Tendency to notice things more when we’re paying attention to them, leading us to believe they happen more frequently than they actually do.
- Fundamental Attribution Error: Tendency to overemphasize dispositional factors when explaining the behavior of others.
- Gambler’s Fallacy: Tendency to believe that past events will influence the probability of future events, even when they are independent.
- Halo Effect: Tendency to believe that a person who possesses one positive attribute must possess other positive attributes as well.
- Hindsight Bias: Tendency to believe that an event was predictable after it has occurred.
- Illusory Superiority Bias: Tendency to overestimate one’s own abilities or characteristics.
- Just-World Bias: Tendency to believe that the world is just and that people get what they deserve.
- Law of Unintended Consequences: Tendency for our actions to have unintended consequences.
- Loss Aversion Bias: Tendency to place more weight on avoiding losses than on acquiring gains.
- Mental Accounting Bias: Tendency to treat money differently depending on where it comes from or how it’s spent.
- Mental Time Travel Bias: Tendency to overestimate the extent to which we will feel the same way in the future as we do now.
- Negativity Bias: Tendency to give more weight to negative information than positive information.
- Normalcy Bias: Tendency to underestimate the likelihood of a disaster occurring simply because it has never happened before.
- Optimism Bias: Tendency to believe that good things are more likely to happen to us than bad things.
- Overconfidence Effect: Tendency to overestimate one’s own abilities or the accuracy of one’s own beliefs.
- Planning Fallacy: The tendency to underestimate the time and resources needed to complete a task, leading to poor planning and execution. Example: A student may underestimate the time needed to complete a research paper, leading to last-minute rush and poor quality.
- Recency Bias: The tendency to place greater weight and importance on recent events or information, often overlooking historical data or trends. Example: A person may believe that a company is performing poorly because of a recent decline in stock prices, despite strong long-term growth.
- Regression to the Mean: The tendency for extreme or exceptional events to regress toward the average over time. Example: A sports team may have a great season, but the following season their performance may not be as exceptional.
- Representativeness Heuristic: The tendency to make judgments or assumptions based on stereotypical or prototypical representations, rather than considering all available information. Example: A person may assume that a professor is unapproachable and harsh based on their appearance and demeanor, without getting to know them personally.
- Self-Serving Bias: The tendency to attribute positive outcomes to personal abilities or traits, while blaming negative outcomes on external factors. Example: A student may attribute a good grade to their intelligence and hard work, but a bad grade to a difficult exam or unfair grading.
- Status Quo Bias: The tendency to prefer things to stay the same, even in the face of potential benefits from change. Example: A company may resist implementing new technology or processes because they are comfortable with their current systems.
- Survivorship Bias: The tendency to focus on the success or survival of a small group, while overlooking the failures or losses of a larger group. Example: A person may believe that entrepreneurship is easy because they know successful entrepreneurs, while ignoring the many failed ventures.
- Sunk Cost Fallacy: The tendency to continue investing resources in a project or decision because of previously invested resources, even when the expected benefits do not justify the additional costs. Example: A company may continue investing in a failing product because of the money already spent on research and development.
- Von Restorff Effect: The tendency to remember and give greater weight to unique or distinct information or events, rather than the mundane or common.
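Some of these biases, such as the base rate fallacy, have a precise statistical counterpart that a short calculation makes vivid. The sketch below uses hypothetical numbers (a 1% base rate and an imperfect diagnostic test; these figures are chosen for illustration, not drawn from real data) to show how ignoring the base rate leads to badly overestimating a probability:

```python
# Hypothetical figures for illustration only, not real test statistics.
prevalence = 0.01      # 1% of people have the condition (the base rate)
sensitivity = 0.99     # P(positive test | condition)
specificity = 0.95     # P(negative test | no condition)

# Bayes' theorem: P(condition | positive test)
p_positive = prevalence * sensitivity + (1 - prevalence) * (1 - specificity)
posterior = prevalence * sensitivity / p_positive

print(f"P(condition | positive test) = {posterior:.1%}")  # ≈ 16.7%
```

Even with a test that is 99% sensitive, a positive result here means only about a one-in-six chance of having the condition, because the base rate is so low. Intuition that ignores the 1% prevalence lands far from the true answer.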
How To Use Decision-Making Biases to Think Better
Here are some examples of how knowing these decision-making biases can help us become better critical thinkers and make more informed decisions:
- Anchoring Bias: When negotiating a salary, don’t let the first offer you receive set the tone for the entire negotiation. Instead, research the market rate for your position and use that as your reference point. For example, a real estate agent may list a property at an unrealistically high price and then gradually reduce it until it sells. Potential buyers anchored to the original high price perceive the reduced price as a good deal, even if it is still above the property’s actual value, and may make an offer they would never have considered had the property been priced correctly from the start.
- Availability Heuristic: Don’t make decisions based solely on the most recent news story or the loudest voice in the room. Consider all available information, including historical data and less prominent sources. An example of Availability Heuristic is when someone evaluates the likelihood of an event based on how easily they can bring to mind similar examples. For instance, someone may overestimate the likelihood of a plane crash because it has been covered heavily in the news recently, even though the actual probability of a plane crash is quite low. In this case, the person is relying on the availability of information rather than objective probability estimates.
- Base Rate Fallacy: When making hiring decisions, consider both the qualifications of the candidate and the likelihood of their success based on past experience in similar roles. An example of the base rate fallacy is when a person assumes that an event is likely to occur because it is frequently reported in the media or highly publicized. For instance, someone may avoid flying because they believe it is dangerous, despite statistics showing that car accidents are more common and pose a greater risk. In this case, the person is relying on vivid stories about plane crashes rather than on the overall statistics and probabilities.
- Clustering Illusion: When analyzing data, look for patterns, but be cautious about reading too much into them, as apparent patterns may not reflect the full picture. The clustering illusion is the tendency to see patterns in random events. An example is a person believing that their lucky numbers have a higher chance of being chosen in a lottery, despite the odds being equal for all numbers. This bias can be addressed by understanding probability and statistics and avoiding decisions based on perceived patterns or coincidences.
- Confirmation Bias: Don’t seek out information that supports your existing beliefs while ignoring information that contradicts them. Instead, actively seek out information that challenges your beliefs to make a more informed decision. An example of confirmation bias is when a person believes that a particular brand of product is the best and only looks for positive reviews and information that supports that belief, while ignoring negative information or reviews about the same product. They may even defend their belief despite contradictory evidence.
- Endowment Effect: Don’t overvalue something just because you own it. Consider the actual value of the item and avoid making emotional decisions based on personal attachment. An example of the Endowment Effect could be a person who is selling their used car. They may value the car more than its market value simply because they own it and have become attached to it. As a result, they may have an unrealistic expectation of what buyers are willing to pay, leading them to overprice the car.
- Framing Effect: The way information is presented can heavily influence decision making. Try to consider information from multiple perspectives and avoid being swayed by overly emotional or sensationalized framing. An example of the Framing Effect is when a store advertises a product as “80% fat-free” instead of “contains 20% fat.” The first phrasing emphasizes the positive aspect of the product, while the second phrasing emphasizes the negative aspect.
- Gambler’s Fallacy: To avoid making decisions based on the Gambler’s Fallacy, it’s important to recognize that past outcomes do not necessarily predict future outcomes, and each event is independent of the previous one. Instead, base decisions on current information and probabilities, rather than on past outcomes. For example, a salesperson should not assume that a customer will make a purchase just because they have made several purchases in the past. They should focus on the customer’s current needs and preferences to make the best decision for both the customer and the business.
- Hindsight Bias: Don’t let hindsight cloud your judgment. Avoid thinking that events were inevitable or that you “knew it all along” when making decisions based on past events. An example of hindsight bias is a stockbroker who predicts that the stock market will decline, only to watch it increase instead. Looking back, the stockbroker becomes convinced they knew the market would rise all along, despite having predicted the opposite. This bias can lead to overconfidence in one’s ability to predict future events and can be detrimental to decision-making.
- Just-World Bias: When analyzing a situation, avoid assuming that good things happen to good people and bad things happen to bad people as it can lead to unjust conclusions. An example of just world bias is when someone believes that bad things happen only to bad people, and good things happen only to good people. For instance, a person may believe that someone who is struggling financially must not be working hard enough, rather than considering other factors such as systemic inequalities or personal setbacks. This bias can lead to victim-blaming and a lack of empathy for those who are experiencing difficult situations.
- Law of Unintended Consequences: When making policy decisions, consider the potential unforeseen consequences and weigh them against the intended benefits. An example of the Law of Unintended Consequences is the banning of single-use plastic bags in stores. While the intention was to reduce plastic waste, it has led to unintended consequences such as increased use of paper bags which are more resource-intensive to produce, and increased use of reusable bags which need to be washed and can potentially harbor bacteria. Additionally, some consumers are simply opting not to bring their own bags and are carrying their purchases out of the store without a bag, which can lead to dropped or damaged items.
- Mental Time Travel Bias: When making plans, consider potential future events that could impact the outcome and adjust accordingly. Mental time travel bias is the tendency to overestimate the extent to which we will feel the same way in the future as we do now. An example of this bias could be when someone decides to take a job that offers a higher salary but requires longer hours and more stress, thinking that they will be able to handle it. However, in reality, they may find that the increased stress and lack of work-life balance have a more significant impact on their well-being than they initially anticipated.
- Normalcy Bias: When assessing risk, don’t assume that things will always remain the same and be prepared for unexpected events. An example of normalcy bias is when people living in areas prone to natural disasters, such as hurricanes or earthquakes, underestimate the potential damage and fail to take necessary precautions or evacuate. They may believe that everything will be fine and continue with their daily routines, despite warnings from authorities and evidence of previous disasters. This can result in individuals being caught off guard and unprepared for the actual impact of the disaster, leading to injuries and even death.
- Overconfidence Effect: Don’t assume that you know more than you actually do. Stay humble and continuously seek out new information and perspectives to make better-informed decisions. An example of the overconfidence effect can be seen in stock market trading, where traders become overconfident in their own abilities and make risky investments that lead to losses. They may believe their predictions are more accurate than they actually are and take unnecessary risks as a result, leading to significant financial losses. To avoid this bias, objectively evaluate your abilities and keep an honest record of your past decisions.
- Planning Fallacy: Don’t underestimate the time and resources required to complete a task or project. Consider past experiences and be realistic in your planning and decision-making. An example of the planning fallacy is when a student underestimates the amount of time needed to complete an assignment and ends up rushing to finish it at the last minute. This could have been avoided by considering past experiences and allowing for extra time in the planning process.
- Recency Bias: To overcome recency bias, it’s important to consider information from a wider time frame and not rely solely on recent events or data. When evaluating performance, consider the entire body of work rather than just recent events to avoid overestimating or underestimating someone’s abilities. Additionally, consciously taking time to reflect on past experiences and decisions can help avoid the influence of recency bias. An example of recency bias is a job interviewer who only remembers the most recent candidate they interviewed, even though an earlier candidate may have been a better fit for the job.
- Sunk Cost Fallacy: Don’t let past investments or decisions influence current decisions if they are no longer relevant. Instead, focus on the current situation and make decisions based on current information. An example of the sunk cost fallacy is when a person continues to invest time, money, or resources into a project or decision that is no longer viable, solely because of the amount of time, money, or resources already invested. For instance, a business owner may continue to invest in a product line that is not selling well, simply because they have already invested a large amount of money in developing and marketing the product.
- Survivorship Bias: Don’t make decisions based solely on successful outcomes. Consider both successful and unsuccessful outcomes to make a more informed decision. One example of survivorship bias is the study of successful entrepreneurs. Researchers may only look at successful entrepreneurs who have made it big and ignore the ones who failed. This can lead to the assumption that certain personality traits or behaviors are essential for success, while ignoring the importance of chance or luck in the equation. In reality, there may be many unsuccessful entrepreneurs who exhibited the same traits, but were not fortunate enough to succeed.
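The gambler’s fallacy entry above can also be checked empirically. This small simulation (a sketch using a fixed but arbitrary random seed, not tied to any real data) flips a fair coin many times and measures how often heads follows a streak of five heads. Because flips are independent, the streak has no predictive power and the rate stays near 50%:

```python
import random

random.seed(42)  # fixed seed so the illustration is reproducible

# Simulate one million fair coin flips (True = heads).
flips = [random.random() < 0.5 for _ in range(1_000_000)]

# Collect the outcome immediately following every run of five heads.
after_streak = [flips[i + 5]
                for i in range(len(flips) - 5)
                if all(flips[i:i + 5])]

rate = sum(after_streak) / len(after_streak)
print(f"P(heads after five heads in a row) ≈ {rate:.3f}")  # stays near 0.5
```

If past streaks influenced future flips, the measured rate would drift away from 0.5; it doesn’t, which is exactly why “I’m due for a win” reasoning fails for independent events.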
Know Your Biases to Make Better Decisions
Decision-making biases are inherent to human nature, and we need to be aware of them to make better-informed decisions.
By recognizing these biases and understanding how they influence our thinking, we can become more critical thinkers and make better decisions.
This, in turn, can help us in all areas of life, from work to personal relationships, and lead us to greater success and fulfillment.