Obtaining reliable and helpful data can be a difficult task. Every step of the process - from writing the right questions, to asking them, to examining the answers - can be influenced by preconceived notions. Why is this so? As human beings, we all have different experiences that shape our personal opinions and, in turn, our behavior.
No matter how neutral surveyors try to be, survey designers are still humans with inherent viewpoints and unconscious habits of thought.
This is the loophole that produces a form of survey bias known as confirmation bias, sometimes also called research observation bias. Let us take a closer look at it.
What is confirmation bias?
This is a type of bias that creeps in at the back end of a survey. When researchers evaluate survey data, they may look for patterns that prove a point they already believe to be true - something ingrained in their belief system. In such cases, they tend to read the data selectively, judging the survey by their own ideas of what is right rather than by what the responses actually say.
What makes it trickier to recognize is that it occurs beyond survey design - you cannot technically spot it in any element of the survey itself. It is a human error that creeps in during the analysis stage.
Surveyors have to be cautious, since confirmation bias comes up at the final stage of surveying. Even after all the effort of creating your survey - crafting the right questions and sending it to the right people - if you fail to interpret your data correctly, all that effort goes to waste.
One famous example of confirmation bias is the work of Cyril Burt, a research psychologist best known for his work on the heritability of IQ.
Burt believed that children from lower socioeconomic backgrounds were likely to be less intelligent than the children of parents with high social status.
The result was a two-tier educational system in England, where the upper classes went to the elite schools while the children from lower economic strata were sent to less-desirable ones.
Later, his work was debunked when it was found that he had committed data fraud. Today, it is understood that intelligence is shaped by both heredity and environment, not determined by social class.
Keep in mind that this type of bias is not always intentional. Often it is quite inherent: the researcher does not deliberately decide to bias the results; it simply happens because of preconceived notions, which often have to do with the subject matter of the data.
The person is basically giving more weight to data that supports their position and overlooking evidence from the same data set that may contradict what they want to prove.
You'd be surprised to find out that some of the world's best researchers have been victims of this bias.
The World Bank conducted a study to test whether its own professional staff would fall into this trap. Staff were given identical data sets, described in one case as measuring the effectiveness of a skin cream and in the other as the impact of minimum wage on poverty rates.
Controlling for seniority and cognitive ability, the people given the neutral skin-cream scenario interpreted the data much more accurately than those given the more contentious minimum-wage-and-poverty scenario, despite the data set being the same.
Why is there a discrepancy in interpretation when the data set is the same?
As humans, we all have personal beliefs that affect how we approach data and shape our interpretations. We may be so enthusiastic about a survey or its expected findings that we focus on the sections of the data that favor a particular outcome. This makes all kinds of surveys susceptible to confirmation bias.
So what's the reason behind this?
Confirmation bias can come into play when a surveyor is highly enthusiastic and anticipates that the answers will confirm their hypothesis, overlooking the fact that surveys are based on only a handful of respondents. Improper analysis techniques or a confusing data set then lead to an incorrect interpretation of the survey results.
Tips to avoid confirmation bias:
Create a data analysis plan before sending your survey.
The key reason this type of bias happens is the lack of a data strategy before researchers send out the survey. Decide in advance which questions your analysis will answer and how; once the responses come in, interpret them against that plan rather than against your expectations.
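One lightweight way to commit to such a plan is to write it down as code before any responses arrive. The sketch below is a minimal illustration; the question name, threshold, and sample size are all hypothetical, and the point is simply that they are fixed up front, so the analysis cannot be bent to fit the results afterwards.

```python
# A minimal sketch of a pre-registered analysis plan. All field names,
# thresholds, and sample sizes are hypothetical - what matters is that
# they are written down BEFORE any responses arrive.

ANALYSIS_PLAN = {
    "metric_question": "checkout_rating",  # hypothetical 1-5 rating question
    "success_threshold": 4.0,              # what "supported" means, fixed up front
    "min_responses": 100,                  # decided in advance, not after peeking
}

def evaluate(responses, plan):
    """Apply the pre-registered plan to the collected responses, unchanged."""
    ratings = [r[plan["metric_question"]] for r in responses]
    if len(ratings) < plan["min_responses"]:
        return "inconclusive: sample too small"
    mean_rating = sum(ratings) / len(ratings)
    return "supported" if mean_rating >= plan["success_threshold"] else "not supported"
```

Because the threshold and minimum sample size are declared before the data exists, a disappointing result cannot be quietly reframed as a success after the fact.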
Include questions in your survey that will allow easy interpretation of the results.
The arrangement and sequence of the questions should reflect the analysis plan you have in mind.
Use a multiple-choice question if you want quantitative results. This way, you will know for certain which answer each respondent chose.
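To illustrate why this helps (the question and answer options below are made up), multiple-choice responses tally into unambiguous counts, leaving little room for selective reading:

```python
from collections import Counter

# Hypothetical responses to "How often do you use the product?"
answers = ["Daily", "Weekly", "Daily", "Monthly", "Daily", "Weekly"]

# Each respondent's choice is counted exactly once - no interpretation needed.
tally = Counter(answers)
print(tally.most_common())  # [('Daily', 3), ('Weekly', 2), ('Monthly', 1)]
```

Open-ended answers, by contrast, must be coded into categories by a human, which is exactly where a preconceived notion can tilt the results.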
Please keep this in mind:
Put a lot of thought in before you launch your survey. Take some time to consider how confirmation bias could seep into your results, make sure you have closed all the loopholes, and steer clear of any kind of controversial questions. Once you know your questionnaire is clear and good to go, you can send it out.
Test with a pilot survey before sending it to your respondents.
This allows you to check your survey for errors before actually sending it. A pilot is a litmus test for every aspect of your survey: question flow, language, wording, and more. It helps you identify and fix issues so you don't end up with poor-quality data. It's like putting your survey through a simulator to see what is right and what is wrong. Feedback from the pilot session ensures those mistakes don't make it into the real survey.
Confirmation bias is a little difficult to grasp at first, mostly because it doesn't emerge until the last stage of the survey. But if you remain true to the survey's actual purpose and develop a firm understanding of the relationships you want to research and the topics you want to cover, you'll be well on your way towards avoiding confirmation bias.
Apart from confirmation bias, there are several other forms of bias, like response bias, that can affect your survey data. You want to avoid all of them so you can survey the right way and get the data you need.