Explicit bias is a conscious preference for, or aversion to, a person or group. With explicit bias, we are aware of the attitudes and beliefs we hold towards others. These beliefs can be either positive or negative and can cause us to treat others unfairly.
Example: Explicit bias
Your teacher has graded your math exams and is handing out your papers. As they walk by an Asian-American student, the teacher makes an offhand remark about how they had expected better results from that student because ‘Asians are good at math’.
Expressions of explicit bias can seem innocuous, like the example above, but they can also include hate speech, physical harassment, or discriminatory policies that target and exclude individuals or groups.
Primacy bias is the tendency to more easily recall information that we encounter first. In other words, if we read a long list of items, we are more likely to remember the first few items than the items in the middle.
Example: Primacy bias
You are attending a lecture at school. At the beginning, you feel like you can absorb all the information and follow the topic. After a while, your mind starts to wander, and only when the lecture is drawing to an end do you tune in again. Later that day, as you try to explain to a friend what the lecture was about, you realize that you can vividly recall the first part of the lecture but not the middle.
We also tend to assume that what is at the beginning of a list is of greater importance or significance. Due to this, primacy bias (or primacy effect) has far-reaching consequences in different contexts, such as job interviews, education, and advertising.
Affinity bias is the tendency to favour people who share similar interests, backgrounds, and experiences with us. Because of affinity bias, we tend to feel more comfortable around people who are like us. We also tend to unconsciously reject those who act or look different to us.
Example: Affinity bias
Your company has hired several new people. During a team meeting, all the new colleagues take turns introducing themselves. One of them is your age, and it turns out that you both studied product design and have worked at similar companies. You instantly feel that this particular person is a good fit for the team.
Affinity bias is also known as similarity bias. It can lead to the exclusion of individuals or groups.
A double-barrelled question forces respondents to provide a single answer to two or more separate issues. Presenting multiple topics at the same time, even inadvertently, can be a problem in survey research as it makes it harder for respondents to give a meaningful answer.
Example: Double-barrelled question
In a customer satisfaction survey, you encounter the following Likert scale question:
‘How useful do you find our help center topics and our email support center?’
Extremely useful
Useful
Neutral
Not very useful
Not useful at all
Suppose that you did find the help center topics useful, but the email support center was rather slow and didn’t really help you solve your issue. This question doesn’t allow you to rate the two separately.
As a result, the question does not capture the constructs you are trying to measure, potentially leading to biased research results and confusion. Double-barrelled questions are also called compound or double-direct questions.
Actor-observer bias is the tendency to attribute the behaviour of others to internal causes, while attributing our own behaviour to external causes. In other words, actors explain their own behaviour differently than how an observer would explain the same behaviour.
Example: Actor-observer bias
As you are walking down the street, you trip and fall. You immediately blame the slippery pavement, an external cause. However, if you saw a random stranger trip and fall, you would probably attribute this to an internal factor, such as clumsiness or inattentiveness.
Because actor-observer bias can influence how we perceive and interact with other people, it can lead us to inaccurate assumptions and misunderstandings.
An ecological fallacy is a logical error that occurs when the characteristics of a group are attributed to an individual. In other words, ecological fallacies assume what is true for a population is true for the individual members of that population.
Example: Ecological fallacy
You are reading a news story about the wealthiest states in the country. Assuming that wealthier states contain more wealthy people is an ecological fallacy. In reality, a state’s high overall wealth could be due to a small number of extremely rich individuals.
The ecological fallacy can be problematic for any research study that uses group data to make inferences about individuals. It has implications in fields such as criminology, epidemiology, and economics.
A ceiling effect occurs when too large a percentage of participants achieve the highest score on a test. In other words, when the scores of the test participants are all clustered near the best possible score, or the ‘ceiling’, the measurement loses value. This phenomenon is problematic because it defeats the purpose of the test, which is to accurately measure something.
Example: Ceiling effect
On a midterm math exam, in which the highest possible score is 100 points, 90% of the students score at least 98 out of 100. This means that the majority of the students obtained a top score, and the clustering of the scores near the top is evidence of a ceiling effect. This suggests the exam was too easy.
A ceiling effect can be observed in surveys, standardised tests, or other measurements used in quantitative research.
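The exam example above can be checked numerically: a ceiling effect shows up as a large share of scores sitting at or just below the maximum. The following is a minimal sketch with hypothetical scores and a hypothetical two-point margin; the function name and thresholds are illustrative, not a standard measure.

```python
# Minimal sketch: flagging a possible ceiling effect in a set of scores.
# The scores, the 2-point margin, and the function name are hypothetical.

def ceiling_share(scores, max_score, margin=2):
    """Return the fraction of scores within `margin` points of the maximum."""
    near_top = [s for s in scores if s >= max_score - margin]
    return len(near_top) / len(scores)

# Hypothetical midterm results out of 100 points
scores = [98, 100, 99, 98, 100, 98, 99, 98, 85, 100]
share = ceiling_share(scores, max_score=100)
print(f"{share:.0%} of scores are near the ceiling")  # prints "90% of scores are near the ceiling"
```

When most scores cluster this close to the maximum, the test cannot distinguish between good and excellent performers, which is exactly why the measurement loses value.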
The affect heuristic occurs when our current emotional state or mood influences our decisions. Instead of evaluating the situation objectively, we rely on our ‘gut feelings’ and respond according to how we feel. As a result, the affect heuristic can lead to suboptimal decision-making.
Example: Affect heuristic
You have been applying for jobs for the past few months. Your last application successfully landed you an interview at a big tech company, but you didn’t make it to the second round of interviews.
You were very excited about the opportunity, and now you feel disheartened.
A friend forwards you another job posting for a similar position at a smaller company. You decide not to apply, even though you are qualified. Because of your state of mind, you feel that there is a good chance that you won’t get that job either.
The representativeness heuristic occurs when we estimate the probability of an event based on how similar it is to a known situation. In other words, we compare it to a situation, prototype, or stereotype we already have in mind.
Example: Representativeness heuristic
You are sitting at a coffee shop and you notice a person in eccentric clothes reading a poetry book. If you had to guess whether that person is an accountant or a poet, you would most likely guess that they are a poet. In reality, there are far more accountants in the population than poets, which means that such a person is more likely to be an accountant.
Although representativeness provides a quick and efficient way to make decisions, it can cause us to overlook important information and draw incorrect conclusions.
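The coffee-shop example works because of base rates, and the arithmetic is easy to make concrete. The sketch below uses entirely hypothetical numbers (1,000,000 accountants, 10,000 poets, and invented rates of eccentric dress) to show how the larger group can still dominate even when the stereotype fits the smaller group far better.

```python
# Minimal sketch of the base-rate logic behind the accountant/poet example.
# All numbers are hypothetical illustrations, not real statistics.

accountants = 1_000_000
poets = 10_000

p_eccentric_given_accountant = 0.01  # assume 1% of accountants dress eccentrically
p_eccentric_given_poet = 0.30        # assume 30% of poets do

# Expected number of eccentric dressers in each group
eccentric_accountants = accountants * p_eccentric_given_accountant  # 10,000
eccentric_poets = poets * p_eccentric_given_poet                    # 3,000

# Probability the eccentric person is an accountant (Bayes' rule)
p_accountant = eccentric_accountants / (eccentric_accountants + eccentric_poets)
print(f"P(accountant | eccentric) = {p_accountant:.2f}")  # prints "P(accountant | eccentric) = 0.77"
```

Even though eccentric dress is thirty times more common among poets in this toy setup, the accountant population is a hundred times larger, so the eccentric stranger is still more likely to be an accountant. Ignoring that base rate is precisely the error the representativeness heuristic produces.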
Anchoring bias describes people’s tendency to rely too heavily on the first piece of information they receive on a topic. Regardless of the accuracy of that information, people use it as a reference point, or anchor, to make subsequent judgements. Because of this, anchoring bias can lead to poor decisions in various contexts, such as salary negotiations, medical diagnoses, and purchases.
Example: Anchoring bias
You are considering buying a used car, and you visit a car dealership. The dealer walks you around, showing you all the higher-priced cars, and you start worrying that you can’t afford a car after all.
Next, the car dealer walks you toward the back of the lot, where you see more affordable cars. Having seen all the expensive options, you think these cars seem like a good bargain. In reality, all the cars are overpriced. By showing you all the expensive cars first, the dealer has set an anchor, influencing your perception of the value of a used car.