From “Between the World and Me” to “Whistling Vivaldi”: How implicit bias trips up our brains…and what we can do about it

Marcelo Vinces, CLEAR and CTIE, November 21, 2016

Last year a group of faculty, led by Pam Brooks of the Africana Studies Department, planned and implemented a series of discussion groups to read and explore Ta-Nehisi Coates’s Between the World and Me, a book that Toni Morrison rightfully called “revelatory” and “required reading.” Written in the form of a letter to his black teenage son, the book was published at a time when the country was gripped with stories of violence set upon unarmed black bodies by police, in Ferguson, Baltimore, New York City, and elsewhere around the United States.

Paolo B, “Tra me e il mondo,” Flickr CC

In one striking passage, Coates relates to his son the particular burden of black Americans that is nearly invisible to all others, a ubiquitous manifestation of the fear and force that put a distance between them and the world and represent an ever-present draining of human vitality and potential:

This need to be always on guard was an unmeasured expenditure of energy, the slow siphoning of the essence. It contributed to the fast breakdown of our bodies. So I feared not just the violence of this world but the rules designed to protect you from it, the rules that would have you contort your body to address the block, and contort again to be taken seriously by colleagues, and contort again so as not to give the police a reason… This is how we lose our softness. This is how they steal our right to smile… It struck me that perhaps the defining feature of being drafted into the black race was the inescapable robbery of time, because moments we spent readying the mask, or readying ourselves to accept half as much, could not be recovered. The robbery of time is not measured in lifespans but in moments (pp. 90-1).

This past September CTIE hosted a workshop on implicit bias led by Cindy Frantz and Nancy Darling of the Psychology Department. Workshop participants received copies of Claude Steele’s Whistling Vivaldi: How Stereotypes Affect Us and What We Can Do. The above passage from Coates’s masterful epistolary work resonates with much of Steele’s account of the research on stereotypes and the harm they inflict on the human psyche. Steele, a social psychologist, pioneered this field of research. In the chapter “The Mind on Stereotype Threat: Racing and Overloaded,” for example, Steele summarizes how researchers came to understand that the fear of negative stereotypes tied to identity, whether someone is aware of it or not, is sufficient to cause physiological stress reactions that can interfere with performance and cognition. That “robbery of time,” of softness, of the right to smile that Coates writes about has been documented in countless studies that show the harmful effects negative stereotypes have on the mind and body.

Mahzarin Banaji, Experimental Psychologist. From an Independent Lens-PBS production (Feb. 24, 2015): “American Denial”

It is troubling that, just as our brains function less optimally under the threat of stereotype (which disrupts working memory and executive function), they are also wired in ways that make them prone to perpetuating biases at an unconscious, implicit level. Scientists believe such wiring is an evolutionary adaptation from our ancestors’ survival in the wild. Associating certain places or sounds with danger came in handy for survival, and the less (conscious) cognitive effort it took to make such connections, the better. But when such associations are made between categories of people and negative traits, even without our being aware that connections are being made, they are maladaptive, and they linger despite our knowing this. As we go about our business, such implicit biases, left unchecked, can affect our judgment even when we think we’re being fair. In other contexts, such as police work, acting on unconscious decisions based on lingering biased associations can make the difference between a peaceful outcome and a bullet fired at an unarmed individual. Our work in the classroom thankfully does not carry such high costs, but implicit bias can nonetheless insert itself: in the way we grade, the opportunities we afford students, or the subtle ways we regard individuals based on an identity they hold. These biases can, to return to Coates, rob some students of precious time, of the right to smile.

Taking action to counter implicit bias

So how do we counteract our brains’ tendency to make unconscious associations that perpetuate biased perspectives? As one way to answer this, let’s turn to a very different context, the court system, where the influence of implicit bias can have profound consequences on people’s lives. The National Center for State Courts (NCSC) published a report in 2012 detailing “seven general approaches to address implicit bias in the courts based on scientific research.” These strategies are:

  1. Raise awareness of implicit bias.

Attending workshops such as the one we sponsored last September, reading this article or other materials on the topic (some of which are cited here), or discussing these issues with colleagues informally or at a department meeting are examples of first steps in raising awareness. We can only work to correct for sources of bias when we are aware they exist, and learning about their potentially harmful effects on judgment and behavior can motivate us to pursue corrective action.

  2. Seek to identify and consciously acknowledge real group and individual differences.
DbDuo Photography

A “color blind” approach, though it may sound idealistic and well-meaning on the surface, does not work to eliminate unconscious biases. In fact, “color blindness” actually produces greater implicit bias than strategies that acknowledge race (Apfelbaum, Sommers, & Norton, 2008). The NCSC report lists some practical approaches that an individual or an organization can take that are far more effective than a well-meaning “neutral” approach, including seeking out and electing to participate in diversity training seminars, seeking out the company of other professionals who demonstrate egalitarian goals, and investing extra effort in identifying the unique attributes of stigmatized group members.

  3. Routinely check thought processes and decisions for possible bias.

One of my former colleagues, discussing a co-worker we both felt animosity towards, said to me, “I have thought about whether or not I dislike this person because she was a woman. Would I have a problem with this person if they were male? When the answer was yes, without hesitation, I knew it was not a gender bias at work.” I found two things remarkable about this conversation: the first was my colleague’s self-awareness and honesty about her judgment possibly being biased by a stereotype she may have harbored. The second was that, despite being female herself, she still felt the need to check whether her judgment was being influenced by negative gender stereotypes. Indeed, research has confirmed that being a member of a stereotyped group does not immunize us from stereotyping or being biased against members of our own group (see Moss-Racusin, Dovidio, Brescoll, Graham & Handelsman, 2012, for an excellent example involving academics). That conversation has stayed with me for years, and to this day I quietly ask myself what hidden biases might be creeping in when I arrive at judgments about someone or in how I treat an individual.

  4. Identify distractions and sources of stress in the decision-making environment and remove or reduce them.
Michael Teuber, “Rush,” Flickr CC

“Decision makers who are rushed, stressed, distracted, or pressured,” the NCSC report observes, “are more likely to apply stereotypes – recalling facts in ways biased by stereotypes and making more stereotypic judgments – than decision makers whose cognitive abilities are not similarly constrained.” To cite just one very common example: when we are in a total crunch and yet must write any number of letters of reference, we may be more prone to insert biased language, thereby (and probably unintentionally) weakening a student’s chances of entering a graduate program, getting a fellowship, or landing a job. “She was one of the best women in this advanced math course,” we might write. Ouch! Studies have shown that regardless of the gender of the writer, the language used to describe male candidates differs from that used for female candidates (see, for example, Schmader, Whitehead, & Wysocki, 2007). Precisely because a stress-free environment for letter writing is not often possible, it is all the more important to be aware of the possible effects of stress on bias in order to write fairer and more effective letters for our students. Nick Petzak, director of Fellowships and Awards at Oberlin, has prepared a document containing guidelines and recommendations for writing effective letters of recommendation in a way that minimizes gender bias in the language we use.

  5. Identify sources of ambiguity in the decision-making context and establish more concrete standards before engaging in the decision-making process.

Moments when selections take place are prone to the introduction of hidden biases, especially when criteria are not well defined to begin with. But they are also vulnerable to the bias inherent in what researchers of consumer behavior call the “evoked set.” An example of an evoked set is nicely illustrated in a 2003 New York Times article, “The Lessons of the Grocery Shelf Also Have Something to Say About Affirmative Action” (Postrel, 2003):

Last summer [we] ran an article on Hollywood’s search for young action heroes. Old standbys like Arnold Schwarzenegger and Harrison Ford were getting a bit long in the tooth, leading studios to turn to newcomers like Matt Damon and Vin Diesel. The piece left the impression of a vast generation gap, with no heroes from the latter half of the baby boom. But one huge action star was inconspicuously absent: Wesley Snipes, born in 1962. Another, Will Smith, born in 1968, was mentioned only in passing. The evoked set of ‘action stars’ didn’t overlap with the evoked set of ‘black movie stars.’ There was no racial hostility at work, just the limits of human minds and the categories they create.

The writer continues that, during hiring searches or when deciding whom to invite as a guest speaker, “if you are looking for the best possible conference lineup, just listing the speakers who immediately come to mind may inadvertently exclude good candidates. You should also search through the other categories your mind uses to classify people.”

  6. Institute feedback mechanisms.

The NCSC report suggests that “transparent feedback from regular or intermittent peer reviews that raise personal awareness of biases could prompt those with egalitarian motives to do more to prevent implicit bias in future decisions and actions (e.g., Son Hing, Li, & Zanna, 2002). This feedback should include concrete suggestions on how to improve performance (cf. Mendoza, Gollwitzer, & Amodio, 2010; Kim, 2003) and could also involve recognition of those individuals who display exceptional fairness as positive reinforcement.” CTIE can help provide such peer feedback by arranging for a classroom observation of one or more of your classes. These involve a pre-observation interview, the observation itself, and a post-observation discussion and write-up. Classroom observations are confidential and not reported to departments or the dean’s office. Videotaping of a classroom session can also be arranged, as can opportunities for other faculty to sit in on your classes or for you to observe others. These observations can be arranged to address particular areas of concern to you.

  7. Increase exposure to stigmatized group members and counter-stereotypes and reduce exposure to stereotypes.

Here I will allow myself to get personal and vulnerable, as we all must when confronting implicit bias in the work we do. I grew up a scrawny introvert, a bookworm who could barely achieve one chin-up in the high school gym, much to the chagrin of the physical education teachers and the scorn of my more athletic peers. I thus grew to hate athletics in general, and have since held a mild distrust of athletes in particular. I was not quite aware of this latter bias with regard to students at Oberlin, and did not feel it was relevant to the work I do here. My bias, however, was revealed in two ways. First, upon invitation by some of the student athletes I work with in CLEAR (Center for Learning, Education, and Research in the Sciences), I began attending their games and found myself astounded at spotting some team members I had never imagined to be student athletes simply because, to me, they “did not seem the type.” Here was an overt stereotype being remolded. Second, subsequent and more frequent attendance at these games has allowed me to make greater connections with student athletes and to get to know them better, and in doing so I have found aspects of them that further exploded my stereotypes about athletes and athletics. In my short time at Oberlin I have come a long way from disengagement from athletics to becoming a huge fan and advocate of our student athletes and their matches (ask anyone on the women’s volleyball team, for example). Only greater exposure, and more conversations with student athletes and their coaches, have led to this dramatic conversion, and I hope they have helped mitigate lingering implicit biases I may harbor in how I regard and treat our student athletes.


Conclusion

There are numerous practices that faculty at Oberlin and elsewhere already follow that reduce the influence hidden biases can have on our thinking and judgment. Some examples:

  • Try blind grading when possible. Because biased associations occur rapidly at an unconscious level, even seeing a name may activate stereotyped associations about ability and diminish objectivity when grading.
  • Use constructive feedback not only to communicate high standards for performance but also to provide assurances that the student is capable of meeting those high standards (Cohen, Steele, & Ross, 1999).
  • Help individuals think of themselves in ways that reduce the salience of a threatened identity. Women encouraged to think of themselves in terms of their valued and unique characteristics were less likely to experience stereotype threat in mathematics (Ambady, Paik, Steele, Owen-Smith, & Mitchell, 2004).
  • Address the fairness of the test, even if you retain its diagnostic nature; doing so can alleviate stereotype threat in a testing situation. For example, testing procedures could include a brief statement that the test, although diagnostic of underlying mathematics ability, is gender-fair or race-fair.

I hope the above approaches, taken from a report issued for a very different context, are comforting, and that they convince you that, although our brains are wired in a way that makes implicit bias a fact of life, we are not slaves to these hidden influences and there are effective approaches to addressing them.
 

References

Apfelbaum, E., Sommers, S., & Norton, M. (2008). Seeing race and seeming racist? Evaluating strategic colorblindness in social interaction. Journal of Personality and Social Psychology, 95, 918-932.

Ambady, N., Paik, S. K., Steele, J., Owen-Smith, A., & Mitchell, P. (2004). Deflecting negative self-relevant stereotype activation: The effects of individuation. Journal of Experimental Social Psychology, 40, 401–408.

Cohen, G., Steele, C. M., & Ross, L. D. (1999). The mentor’s dilemma: Providing critical feedback across the racial divide. Personality and Social Psychology Bulletin, 25, 1302-1318. http://psp.sagepub.com/cgi/reprint/25/10/1302.pdf

Frantz, C., Cuddy, A. J. C., Burnett, M., Ray, H., & Hart, A. (2004). A threat in the computer: The Race Implicit Association Test as a stereotype threat experience. Personality and Social Psychology Bulletin, 30, 1611-1624.

Kim, D. (2003). Voluntary controllability of the implicit association test (IAT). Social Psychology Quarterly, 66, 83-96.

Mendoza, S., Gollwitzer, P., & Amodio, D. (2010). Reducing the expression of implicit stereotypes: Reflexive control through implementation intentions. Personality and Social Psychology Bulletin, 36, 512-523.

Moss-Racusin, C. A., Dovidio, J. F., Brescoll, V. L., Graham, M. J., & Handelsman, J. (2012). Science faculty’s subtle gender biases favor male students. Proceedings of the National Academy of Sciences, 109(41), 16474-16479.

National Center for State Courts (2012). Strategies to Reduce the Influence of Implicit Bias.

Postrel, V. (2003). The Lessons of the Grocery Shelf Also Have Something to Say About Affirmative Action, New York Times, January 30, p. C2.

Schmader, T., Whitehead, J., & Wysocki, V. H. (2007). A linguistic comparison of letters of recommendation for male and female chemistry and biochemistry job applicants. Sex Roles, 57(7-8), 509-514.

Son Hing, L., Li, W., & Zanna, M. (2002). Inducing hypocrisy to reduce prejudicial response among aversive racists. Journal of Experimental Social Psychology, 38, 71-77.
