📓 Addressing Implicit Bias
This lesson is part of our regular Diversity, Equity, and Inclusion curriculum.
In Epicodus' code of conduct, we emphasize the importance of being kind to others, respecting our differences, and assuming that other students have good intentions. By consciously making a choice to do these things, we can be better pair programmers and a positive part of the Epicodus community even as we strengthen connections and improve the social skills that are instrumental to getting a good job.
However, it's also important to be aware of our implicit biases. According to the Perception Institute:
"Thoughts and feelings are 'implicit' if we are unaware of them or mistaken about their nature. We have a bias when, rather than being neutral, we have a preference for (or aversion to) a person or group of people. Thus, we use the term 'implicit bias' to describe when we have attitudes towards people or associate stereotypes with them without our conscious knowledge."
As the Perception Institute goes on to detail, most of our actions occur unconsciously, not consciously. This means that implicit bias can have a strong hold over our actions even if we are consciously trying to be inclusive and avoid stereotypes.
For example, the study "Weapon Bias: Split-Second Decisions and Unintended Stereotyping" by B. Keith Payne demonstrates that stereotypes about race can lead people to think they see a weapon in a Black person's hand when there is no weapon. This is a particularly insidious example that has led to tragic real-world consequences.
Implicit bias is common across all groups, and so it's not surprising that teachers at Epicodus have seen and experienced implicit bias in our student body. The most common example we see can be boiled down to the following statement: "I don't want to pair with person X because they are 'difficult' to pair with and/or they are a poor coder." This perception disproportionately affects students from underrepresented groups.
Here are a couple of real examples we have seen:
A male student didn't listen to a female teacher's answer to his question, even though she was the most informed teacher on the topic. He then tried to redirect his question to a male teacher, even though that teacher told him he was less informed on the topic.
A group of male students did not want to pair with a female student because they thought she was a poor coder. However, based on teacher assessment, she was a stronger coder than any of the male students that didn't want to pair with her.
In both of the examples above, the students engaging in implicit bias were hard workers who were determined to make the best of their experiences at Epicodus. Assuming their best intentions, they likely did not mean to act in a biased way. However, the behavior still had a negative impact on the other students, teachers, and the Epicodus community at large.
How can we face our implicit biases if we don't even know we have them? There are a few steps we can take.
First, we can acknowledge that we all have implicit biases. The human mind has a tendency to generalize and categorize; these skills have traditionally been essential for survival. However, these skills aren't always beneficial in our modern day-to-day lives.
We can take an Implicit Association Test. There are more than a dozen tests on implicit biases ranging from race to gender to age. While the test is not without controversy and results can change over time, taking one or more of the tests can be a constructive way to think about implicit bias.
We can be aware of how prevalent the stereotypes around tech and programming are. Studies show that children as young as 6 have internalized the belief that boys are better at programming and robotics, and that this affects girls' interest in learning about the field.
We can question our assumptions. When we notice ourselves making judgments about someone being a good coder or not, we can ask ourselves what specifically led to that belief. Have we actually paired with that person and seen their code often? Were there specific actions or statements that made us believe they're bad at coding (or vice versa)? And if someone of a different age, gender, or race did the same thing, would we have interpreted it in the same way?
Ready to Write Your Reflection?
There is a reflective assignment for this lesson. If you are ready to write your reflection, head on over to Epicenter to find the prompt. If you are logged in to Epicenter, you can access the prompt by navigating to this link:
Reflection Prompt: Addressing Implicit Bias
Otherwise, you can find detailed instructions on accessing the reflection prompts in the DEI Reflective Assignments lesson.
Do you have feedback?
We want to hear about your experience of the DEI curriculum. We outline all of the ways you can give feedback in the student handbook.