Unlearning Unconscious Bias
It’s one thing to be aware of unconscious biases. It’s another to unlearn them.
Pragya Agarwal, PhD, investigates the science behind unconscious, or implicit, bias, and the effect it has on the way we perceive the world. In her book, Sway, she draws upon countless studies that reveal how unconscious bias impacts a wide range of behavior, including in the medical field, the police force, and social media.
Bias training is not as simple as taking a test or attending a seminar. It’s ongoing work. And it’s critical.
A Q&A with Pragya Agarwal, PhD
We all carry certain biases. Some of these biases help us make decisions, because we are bombarded with information from all sides, and our brains cannot process it at the same rate it comes in. Often we have to process some of that information on a very superficial level. In that case, we match it to existing templates—patterns that we have already learned from past experiences. These templates become cognitive biases that we fall back on. I use implicit and unconscious bias interchangeably because some of these biases are things that we are not aware of. Explicit bias would be if I chose a particular flavor of ice cream that I loved, like pistachio, and I always chose pistachio ice cream.
Implicit biases are usually when people are either not aware of their biases or they don’t want to voice those biases because they’re worried about how they would be seen or viewed if they did. These biases affect our decisions and actions, especially actions that are taken hastily.
Children start making these kinds of judgments from a very young age. From the age of six months, children start forming these kinds of in-group versus out-group associations. They start recognizing who is familiar to them, beginning with their parents, and they learn this through facial cues. This ends up creating a fear of the unfamiliar. By the age of three, they start forming more concrete judgments about what things are preferable and how people behave. Children use something called transductive reasoning, which means they can’t really make rational judgments about everybody on an individual level, so they create associations. For example: They might read books and make an association that blonde hair and blue eyes are pretty because of what they see in fairy tales, where Whiteness is usually the norm.
They are constantly picking up cues from parents, educators, and television. They are like little sponges. They are making judgments and forming their preferences. Slowly, these can become really deep prejudices if these messages are deeply entrenched in them. Jane Elliott’s “blue eyed, brown eyed” experiment is such a landmark because it shows how quickly children can learn the notion of superior versus inferior. These prejudices and racist ideas can become deeply ingrained as they grow older if they are not checked. That’s why it’s so important for us to be cautious and careful as parents and educators. We need to have conversations with our children about diversity and race, racial justice, and equality.
The more we talk openly about our biases—individual, interpersonal, and systemic—the more it helps, of course, as does having more conversations involving different communities. We also need to disaggregate the data. Here in the UK and also the US, data tends to get divided into White or non-White. It is not broken down into Black, Hispanic, South Asian, etc. As a result, we do not see how bias is affecting different communities. “People of color” is quite a big term, and it homogenizes disparate groups, also making Whiteness the norm against which things are measured.
While it’s important to have the correct stats and data, we also need to have more conversations in different domains in an open and nonjudgmental way, such as in the legal system, policing, artificial intelligence, and medical care. We have to think about how we can incorporate bias into medical training, for instance, and how stereotypes and generalized assumptions about different communities can be addressed. This will involve looking at words and images that perpetuate and reinforce these stereotypes, and it will require the media to take more responsibility and be accountable.
There is research I discuss in my book showing that when undergraduate students were primed with photographs of Black people or White people and then shown black-and-white drawings of people’s faces, they found Black faces to be more threatening. A number of research studies have shown that there is a strong stereotype about Black men: that they are more aggressive and threatening. There is also the trope of the angry Black woman, which is often used in media. This whole notion of aggressiveness associated with both Black men and women means that people, especially police officers, who are making these decisions quite instinctively and quickly, are falling back on these stereotypes even when a person is not actually threatening. We have seen that again and again in a number of cases.
There was another experiment done with a video game simulation of a shooting scenario. If the participant thought the man was armed, they had to press a “shoot” button, and if they didn’t think he was armed, they pressed a “don’t shoot” button. This decision had to be made very quickly. The study showed the participants falling back on implicit biases. The published results showed that participants were more likely to shoot a Black person in error than a White person. This is one of the reasons there is so much greater incidence of police brutality and violence against Black people than against White people.
The “shoot versus don’t shoot” simulation was a research lab study, although its results are backed up by data analysis: In 2012, an analysis of FBI data showed that Black people made up 31 percent of the victims of police killings, even though they make up just 13 percent of the US population. In 2015, The Guardian found that 102 of the 460 people killed by police in the US were unarmed, and that Black Americans are more than twice as likely as White people to be unarmed when killed.
I always say that bias training is not a magical cure that can cure somebody of unconscious biases. Bias training could be a daylong course or a one-hour talk. In some cases, participants are asked to take an online test and are then given a score of their implicit associations. The test is useful in trying to understand what kinds of implicit associations they make. But that is not training. There are many misconceptions around the test and what it accomplishes.
A test is not training. Bias training has to be an ongoing process. So yes, to start: Talking about unconscious bias will make you more aware of your implicit biases. When you see the facts, the research, the evidence, and the case studies, you realize how insidious and pervasive biases can be. They affect us in so many ways that we don’t realize. Take ageism, accents, looks, and height—and think about dating. So much of the dating technology being created is not only incorporating these biases but also perpetuating them.
“I always say that bias training is not a magical cure that can cure somebody of unconscious biases.”
We need to become aware of—and reflect on—our biases. It has to be an ongoing process of educating ourselves and watching ourselves. Then we have to make really important decisions. We have to try to neutralize our stereotypes by making sure that we don’t fall back on them. Especially in hiring and recruitment, people tend to follow confirmation bias. We are more attracted to people who perhaps went to the same university as we did, who come from the same town, who dress a certain way or speak a certain way. When a person walks into a room, you’ve already made judgments about whether you like them or not.
So how do we put processes in place so that we can minimize these effects? That is what organizations and workplaces have to think about in order to counter some of these systemic problems. Are we actively being allies? Are we actively promoting and encouraging people from marginalized and vulnerable groups? Are we actively speaking up? On an individual level, we all have a responsibility to counter these stereotypes. To make sure that we expose ourselves to as much reading as possible. To get our information from different role models.
When I was trying to understand where this in-group/out-group mentality comes from, I looked at the evolutionary basis of bias. Because a lot of our biases around people are due to the fear of the unfamiliar, the fear of the unknown, the fear of our status quo being disrupted, the fear of our privileges being taken away, or the fear of somebody coming into a space and threatening us. Evolutionary theories say that in the distant past, our ancestors had limited resources. So they had to make quick decisions about how to preserve those resources, make the most out of them, and survive. Which meant they also had to make quick decisions about who belonged to the group and who was a threat. In that way, that instinct is primal when it comes to forming communities. We do that still on social media.
On social media, we form communities, and we stick within them. This forms a comfort zone for us. We surround ourselves with people who talk like us, think like us, and act like us. We do this in real life as well. This enables confirmation bias. We hear our views being echoed back at us: This is called the echo chamber. Any dissenting voices are filtered. When we exist in these bubbles, the fear we hold against people who do not belong to our group is heightened.
I am positive, and cautiously optimistic. That said, writing this book was quite depressing at times. It was triggering, and I felt desperate, wondering: Is this ever going to change? This creates a crisis for me at times, and I can find myself spiraling. But I also firmly believe that humans are kind and good. We want to see the good in other people. We want to do the right thing. We might not know that we are doing the wrong thing sometimes, but most often, people want to do the right thing. Sometimes people don’t have the right tools, vocabulary, and mechanisms. Or they might not take the time to think about how their actions are impacting others. And so I am doing all that I can to raise awareness, and to engage in discussion and debate.

I don’t believe we can excuse our biases and prejudices by saying that they are hardwired within us. We all have to take responsibility and accountability. We have to learn about the history and legacy of oppression and how that has affected our present. We need to work on ourselves while also keeping an eye on the wider systemic and structural biases. But I am hopeful that there is a real need and desire for people to do the right thing and change and be kind.
Pragya Agarwal, PhD, is a behavioral and data scientist and a freelance journalist. She was a senior academic in universities in the UK and US for over fifteen years after completing a PhD at the University of Nottingham. Agarwal writes regularly about bias, diversity, and inclusivity. She is the author of Sway: Unravelling Unconscious Bias.