This week, we were honored to have Brian Nosek, UVa Professor of Psychology and founder of the Center for Open Science, present some of his research on implicit bias. Given the importance of this topic to our mission, we opened the meeting to all and were delighted with a record turnout!
Here is a summary of the presentation.
Meet Your Unconscious
The challenge of perception is that your mind has no direct access to the outside world. Rather, the brain receives information through bodily senses and that information is subsequently constructed into an experience. These constructs govern our operations and instincts.
Although our very survival as a species was secured by the information received and processed by early human brains, it remains true that representations are not the same as reality. Mental processing sits in the middle, and in that middle there is a great deal of interpretation and inference.
Mind the Gap: Experience vs. Reality
Many simple experiments have demonstrated how our senses fool us or deliver conflicting information. In what is known as the McGurk Effect, what sound we hear can be influenced by what we see. Competing pieces of information produce inconsistent interpretations, even interpretations that correspond to neither the actual sound nor the actual sight! Yet without mental shortcuts like these, which resolve conflicting information quickly and definitively, we would be constantly reinventing our understanding of the world.
Another aspect of our brains’ efficiency is that first impressions stick. If you are shown an ambiguous optical illusion even for a fraction of a second, whatever image your brain sees first is the one it will see again on a second viewing; you have to force your brain to redo its interpretive work in order to arrive at a different interpretation.
Speaking of interpretation, you don’t get to decide how you experience the world. Consider Adelson’s checker-shadow illusion, in which two squares of the same shade of grey appear to be different. Even though it can be easily proven that the two squares are in fact the same color, you cannot override your experiential impression that they are different. Your brain in effect compartmentalizes knowledge and experience: facts aren’t enough to revise impressions that come from the processing of sensory information.
Finally, you cannot stop your brain from ingesting information it recognizes. This is why it is much harder to name the ink color of a word when the word spells out the name of a different color. Your brain cannot help reading the word “green,” which interferes with your ability to quickly say “red” (the Stroop Effect). It takes extra effort to override that instant recognition.
Bias: the Brainchild of Evolution
The cultural consequences of how our mind works (interpreting sensory data, contextualizing information, maximizing efficiency) include our personal habits as well as implicit bias. When shown the same video of a baby reacting to a Jack-in-the-box toy, respondents thought baby Joan was afraid while baby John was angry. The baby was the same, but being told the baby’s name (and by inference, gender) changed the viewers’ impression of the emotions they attributed to the baby.
A common test that researchers use to measure implicit bias is word association. The theory goes that implicit bias can be detected using experiments that leverage the Stroop Effect: the faster we match a word to an association, the less implicit bias is at work; the slower we are, the more. This conclusion is based on the idea that we are quicker to make associations that align with our preconceived notions than ones that run contrary to them.
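The reaction-time logic above can be sketched in code. This is a minimal illustration, not the Center for Open Science’s actual scoring procedure: it assumes hypothetical per-trial reaction times (in milliseconds) for “congruent” pairings (matching the stereotype) and “incongruent” ones, and computes a score in the spirit of the standard IAT D measure, i.e., the difference in mean reaction times scaled by the pooled variability.

```python
from statistics import mean, stdev

def iat_style_score(congruent_ms, incongruent_ms):
    """Difference in mean reaction times (incongruent minus congruent),
    scaled by the standard deviation of all trials combined.
    A positive score means the respondent was slower on pairings that
    run against the stereotype, suggesting implicit bias at work."""
    pooled_sd = stdev(congruent_ms + incongruent_ms)
    return (mean(incongruent_ms) - mean(congruent_ms)) / pooled_sd

# Hypothetical reaction times for one respondent (milliseconds):
congruent = [650, 700, 620, 680, 640]    # e.g., male+career / female+family
incongruent = [820, 790, 860, 810, 830]  # e.g., male+family / female+career
score = iat_style_score(congruent, incongruent)
```

Here the respondent is markedly slower on the incongruent block, so the score comes out positive; a respondent equally fast on both blocks would score near zero.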
Implicit Gender Bias in Career and Tech
After this fascinating, interactive introduction to implicit bias, Brian went on to share results about implicit bias as it relates to the work world in general and the tech workforce in particular. In these word association tests, respondents pair gendered words with career and family words: first male pronouns with career words and female pronouns with family words, then the reverse combination, female pronouns with career words and male pronouns with family words. Most people have an easier time pairing male pronouns with career words and female pronouns with family words than the other way around. This implicit bias doesn’t correlate with respondents’ circumstances or personal values.
As for tech, the same holds true: both male and female respondents have an easier time associating men with STEM and women with the humanities. Again, such implicit bias does not mean that people consciously believe men are better at tech than women (i.e., a belief in the inferiority of women or the superiority of men that would be labeled sexist), but all the same, these gender stereotypes absolutely influence people’s behavior. For example, the more strongly a woman exhibits an implicit bias toward the male/tech stereotype, the less likely she is to major in a STEM field in college. Conversely, men who exhibit a strong implicit bias toward the male/tech stereotype are more likely to pursue a STEM major.
Culture and circumstances also influence the strength of a stereotype, which in itself shows that implicit biases can evolve over time as culture does. Implicit bias follows explicit bias: indeed, the only significant shift in implicit bias over the last 15 or so years concerns attitudes toward gays and lesbians, a probable result of the significant shift in explicit cultural attitudes toward homosexuality.
One’s own circumstances also play a strong role in one’s implicit bias. Take tech, for example: the biggest factor influencing the level of bias is belonging to a STEM discipline, meaning that women who are in tech are not as implicitly biased toward the male/tech stereotype. This may seem to some extent a tautology, but it suggests that bias takes hold where cultural attitudes overwhelm personal exposure, and that the more personal experience girls and women are given in tech, the less likely it is that the prevailing stereotype will influence them.
So now what?
If the bad news is that you have implicit biases, the good news is that you have a conscious mind. It is important that we not view the presence of implicit bias as a moral issue. There are, after all, many things about ourselves over which we have no control but that are not assumed to have moral (good/bad) implications (e.g., whether we require glasses to see well).
Unfortunately, we do not give ourselves the same latitude when it comes to bias, which is a shame, because it is not possible to rid ourselves of bias by intention alone! Our minds simply didn’t evolve that way; instead they evolved to judge and discriminate, and to do so quickly and beneath our conscious awareness. Fairness, for better or worse, was not relevant to most of our evolutionary upbringing; it is a luxury of modern humans. But we can organize what we do and the decisions we make in acknowledgement of this evolutionary inheritance.