A product journey is typically fraught with technical, analytical, and design challenges. Your biggest defense against them is a rock-solid team of passionate players who are excited about learning and solving problems. While someone's years of experience, unparalleled technical knowledge, or effortless business acumen can seem foolproof, it's important to remind ourselves that we are, in fact, human. That means we are susceptible to the occasional bugs (!) that crop up in our brains. These glitches that alter the way we perceive reality are called cognitive biases.
According to Wikipedia, a cognitive bias is a systematic pattern of deviation from rationality in judgement. To put it simply, a cognitive bias could be called a misconception or a lapse in judgement. What's interesting is that we make these errors in judgement again and again, repeating the same patterns, somehow unable to perceive the lack of logic in our actions. The term was coined by Amos Tversky and Daniel Kahneman in 1972, when they were researching human judgement and decision-making. Since then, several types of cognitive biases have been identified and documented.
We've put together a list of cognitive biases that a product team is likely to encounter in their journey, along with what each could look like and how to overcome it.
Confirmation bias
This is a tendency to seek out new information in a way that aligns with your existing beliefs and opinions. Famously known as the father of cognitive biases, confirmation bias is one of the most widespread and impactful biases on this list. Whether or not you believe in climate change, or think the government is evil, you're more likely to find information that affirms your opinions and theories. With surveillance capitalism, this bias is encoded into algorithms, effectively turning virtual content spaces into echo chambers.
Confirmation bias isn't merely seeking new information that affirms your beliefs; it's also the tendency to skew or reinterpret information so that it aligns with your worldview. If you find your user research perfectly matching your product roadmap, or if you tend to dismiss disagreeable data as 'outliers', you may be a victim of confirmation bias.
Sunk cost fallacy
When you've invested money, effort, or time into something, it becomes hard to abandon it, even if you know rationally that it's beyond saving. Because you've already invested resources, you feel obliged to keep going in an attempt to justify those efforts instead of admitting that you made a bad call. This is the sunk cost fallacy at work. If you think this sounds ludicrous, consider this: most people sit through a bad movie because they've paid for the tickets, and are more likely to keep attending a class they paid for, even if it's been unhelpful. The rational thing would be to abandon both, thereby saving your time. Your money isn't coming back.
Developing new features for a product past its prime in an effort to revive it, or adding enhancements to a poorly received feature because your team worked really hard on it, are examples of the sunk cost fallacy. Granted, there are times when sticking with a feature or product could lead to rewarding payoffs. So how do you decide? To avoid the sunk cost fallacy, base the decision on the likelihood of future rewards, not on the weight of past efforts.
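That decision rule is simple enough to write down. Here's a toy sketch in Python (the function name and every dollar figure are invented for illustration): money already spent is accepted as an argument but deliberately ignored, because only future costs and rewards should drive the call.

```python
def should_continue(expected_future_reward, expected_future_cost, sunk_cost=0):
    """Continue only if expected future rewards outweigh expected future costs.

    `sunk_cost` is accepted but intentionally unused: resources already
    spent should not influence the decision.
    """
    del sunk_cost  # explicitly discarded
    return expected_future_reward > expected_future_cost


# A feature we've already poured $50k into, needing $20k more to finish,
# with an estimated $15k of future payoff:
print(should_continue(expected_future_reward=15_000,
                      expected_future_cost=20_000,
                      sunk_cost=50_000))  # prints False: the rational call is to stop
```

The $50k already spent never enters the comparison; swapping it for $500k wouldn't (and shouldn't) change the answer.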
Availability bias
If you have friends who have been in road accidents, you're more likely to be concerned about road safety. If you have a friend who smokes and still runs 5K races, you're less likely to think smoking is dangerous. We construct a reality based on the examples that are available to us, and we constantly miscalculate the risks of things around us, like plane crashes, terrorist attacks, or pianos falling on our heads.
Our brain finds it easier to fall back on information that is readily available. In a product team, availability bias can look like focusing excessively on existing users. While it's crucial to take user feedback into account, it doesn't give the team the complete picture. Users who dropped off the sales funnel (people who evaluated the product but didn't buy it, people who weren't persuaded by the ad campaign, people who signed up but never bought or used the product) can provide critical information that is tough to collate and easy to overlook.
Bandwagon effect
Often called 'herd mentality' or 'groupthink', the bandwagon effect is the tendency to do something because everyone else is doing it. There's an abundance of examples all around us. Our world is full of trends—if you watch enough people 'bus it', you start to think there's something inherently cool about it.
'Groupthink' can look like a reluctance to say no when everyone is agreeing about something. In a product team, there are a lot of decisions that could benefit from some healthy questioning and different perspectives. Does your team quickly come to an agreement because there's an implied pressure to do so or because they're all too similar? If you find your team getting a little too homogeneous, try playing the devil's advocate and see if the resulting decision changes.
Authority bias & Overconfidence effect
These two biases are often found together in a heady cocktail, so we decided to list them together. Authority bias is the tendency to attribute greater importance to an authority figure's opinion, and a reluctance to question it. People often think authority bias belongs only in places like the army, which demands blind, unquestioning execution of orders. However, it can be found everywhere in less lethal doses—corporate offices, stock markets, or even our beloved, overbearing families xP
The overconfidence effect is our systematic tendency to overestimate our abilities. Funnily enough, experts (in all fields) are more susceptible to this bias. This could mean that team members with more experience are likely to question themselves less, even though they have the same margin of error as everyone else. You can see how a mixture of authority bias and overconfidence effect could be deadly for any product team.
Outcome bias
This is a tendency to judge a decision based on its outcome instead of the factors that led to it. Alarmingly deceptive, this bias is hard to detect. Let's say you explore a new demographic for your product and run a paid advertising campaign. The campaign could be a success and rake in new customers, or it could fall flat on its face. If the former happens, the idea is hailed as 'genius.' If the latter happens, everyone wants to know which idiot pitched it.
The decision to explore a new demographic has some calculated risks, and should be judged according to that. If you stop experimenting with advertising campaigns after this incident, you're taking away the wrong lesson. Good decisions can lead to bad outcomes, and sometimes poor decisions can lead to desirable outcomes. Outcome bias leads you to dismiss the role of chance and makes you judge a decision solely on the outcome it produced.
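The split between decision quality and outcome quality is easy to demonstrate with a little Python (the probabilities and payoffs here are made up): a bet with positive expected value is a good decision at the moment you take it, even though many individual runs of it still lose.

```python
import random

# Hypothetical campaign: a 40% chance of winning $100k in new business
# and a 60% chance of burning $50k. The decision's quality lives in its
# expected value, fixed at the moment you decide; any single outcome is
# partly chance.

def expected_value(p_win, win, loss):
    """Expected payoff of taking the bet once."""
    return p_win * win - (1 - p_win) * loss

def run_once(p_win, win, loss, rng):
    """One realized outcome of the same bet."""
    return win if rng.random() < p_win else -loss

rng = random.Random(42)
ev = expected_value(0.4, 100, 50)  # roughly +10: worth taking
outcomes = [run_once(0.4, 100, 50, rng) for _ in range(10)]
print(f"expected value: {ev:+.0f}k, sample outcomes: {outcomes}")
# Individual runs often land on -50, yet the decision itself was sound.
```

Judging the person who pitched the campaign by one row of `outcomes` instead of by `ev` is outcome bias in miniature.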
Fallacy of the single cause
Our world is becoming increasingly complicated, but our brains haven't changed much. The Trojan war, a stock market crash, and global warming—what do they all have in common? All of them can't be attributed to a single cause. If we ponder for a while, we might realize that even seemingly simple things could have more than a single cause. However, our stone age brains aren't built for complexity and prefer to retain the single-cause narrative.
When you've had a bad sales quarter, it's tempting to think it's because of an understaffed sales team or wrong advertising or even the economy. It's usually a combination of factors. In his book The Art of Thinking Clearly, Rolf Dobelli suggests you "...take a sheet of paper and sketch out all the potential reasons. Do the same for the reasons behind these reasons. After a while, you will have a network of possible influencing factors. Second, highlight those you can change and delete those you cannot (such as “human nature”). Third, conduct empirical tests by varying the highlighted factors in different markets. This costs time and money, but it’s the only way to escape the swamp of superficial assumptions."
Cognitive psychology is a fascinating field with implications across all professions. If you're interested in learning more about how cognitive biases influence your everyday decisions, we recommend Thinking, Fast and Slow by Daniel Kahneman. The book documents Kahneman's Nobel Prize-winning research on heuristics and decision-making. If you prefer your wisdom in nugget-sized bites, we recommend picking up The Art of Thinking Clearly by Rolf Dobelli.
Cognitive biases can be very sneaky; if you're not on your guard, you're likely to miss these reflex decisions. A little self-awareness can go a long way in helping you recognize your thought patterns and tackle biases. Question your assumptions, don't make a decision without supporting data, and don't be afraid of a little self-doubt!
We hope you've become a tiny bit wiser by the end of this article, and that you can spot these systematic biases when you next encounter them.
Now place a hand over your heart and tell us honestly how many of these cognitive biases you've experienced in your product journey :)