Implicit bias training (or unconscious bias training) programs are designed to help individuals become aware of their implicit biases and equip them with tools and strategies to act objectively, limiting the influence of those biases.[1] Some researchers say implicit biases are learned stereotypes that are automatic, seemingly associative,[2] unintentional, deeply ingrained, and universal, and that they can influence behavior.[3]
A critical component of implicit bias training is creating awareness of implicit bias, and recent evidence indicates that understanding of implicit biases has grown.[4] Since 1998, the online Implicit-Association Test (IAT) has provided a platform for the general public to assess their implicit biases. Although the IAT has come under considerable scrutiny regarding its scientific reliability and validity,[5] it has also sparked conversation about implicit bias in both popular media and the scientific community.[6]
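The IAT infers bias from differences in how quickly respondents sort paired concepts (for example, faces of different racial groups together with pleasant or unpleasant words). In a simplified form that omits the error penalties and latency trimming of the full scoring algorithm, the resulting D score can be written as

D = \frac{\bar{t}_{\text{incompatible}} - \bar{t}_{\text{compatible}}}{SD_{\text{pooled}}}

where \bar{t} is the mean response latency in each pairing condition and SD_{\text{pooled}} is the standard deviation of latencies pooled across both conditions; larger positive values indicate a stronger implicit association in the stereotype-consistent direction.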
Many implicit bias training programs have been created in recent years.[7] Facebook designed a webpage to make implicit bias training videos widely available. Google has put about 60,000 employees through a 90-minute implicit bias training program. The United States Department of Justice has trained 28,000 employees on techniques to combat implicit bias.[8]
There is a wide variety of implicit bias training programs, but they tend to follow a basic three-step method: a pretest to assess baseline levels of implicit bias (often the IAT), the training intervention itself, and a posttest to reassess implicit bias after training.
Frequently, follow-up tests of implicit bias are administered days, weeks, or months after the completion of a training program to examine its long-term benefits.[9][10][11][12] It remains uncertain whether these programs are effective, and researchers continue to test them.[13]
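As an illustration only, this pretest/training/posttest structure with delayed follow-ups can be sketched in Python; the functions measure_bias and training_program are hypothetical placeholders rather than any published protocol, and negative change scores stand for reduced measured bias.

# Illustrative sketch of the common pretest / training / posttest design,
# with follow-up measurements administered days or weeks later.
# measure_bias and training_program are hypothetical placeholders.
def evaluate_training(participant, training_program, measure_bias, follow_up_days=(7, 30, 90)):
    pretest = measure_bias(participant)        # e.g. a baseline IAT score
    training_program(participant)              # the intervention itself
    posttest = measure_bias(participant)       # immediate re-assessment
    # Each follow-up test is administered d days after training.
    follow_ups = {d: measure_bias(participant) for d in follow_up_days}
    return {
        "immediate_change": posttest - pretest,
        "delayed_change": {d: score - pretest for d, score in follow_ups.items()},
    }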
According to a meta-analysis of 17 implicit bias interventions, counterstereotype training is the most effective way to reduce implicit bias.[14] In the area of gender bias, techniques such as imagining powerful women, hearing their stories, and writing essays about them have been shown to reduce levels of implicit gender bias on the IAT.[15] Dasgupta and Asgari (2004) found that real-life counterstereotypes, such as attending a women's college or having female professors, can decrease bias because the idea that women are intelligent and hard-working is repeatedly reinforced.[16] Regarding racial bias, several studies have replicated the finding that training participants to pair counterstereotypical traits such as "successful" with images of Black individuals is an effective tool for reducing implicit racial bias.[17][18][13][19]
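As a rough illustration (not the protocol of any specific study cited here), a counterstereotype training block repeatedly pairs images of members of a stereotyped group with counterstereotypical traits. The image labels below are hypothetical stand-ins for validated stimulus sets; the trait words are taken from the examples above.

import random

# Hypothetical stimulus set pairing group-member images with counterstereotypical traits.
COUNTERSTEREOTYPE_PAIRS = [
    ("photo_black_individual", "successful"),
    ("photo_woman", "intelligent"),
    ("photo_woman", "hard-working"),
]

def run_training_block(pairs, repetitions=20):
    """Present each counterstereotypical pairing many times, in random order."""
    trials = pairs * repetitions
    random.shuffle(trials)
    for image, trait in trials:
        # Stand-in for on-screen presentation in experiment software.
        print(f"Show {image} together with the word '{trait}'")

run_training_block(COUNTERSTEREOTYPE_PAIRS)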
Kawakami, Dovidio, and van Kamp (2005) challenged the effectiveness of counterstereotype training when they found that participants actually showed an increase in gender bias after training. Rather than using the IAT to assess levels of implicit bias, the researchers asked participants to read a resume and decide whether the applicant was qualified for a leadership job, because "when ambiguity exists in an individual's qualifications or competence, evaluators will fill the void with assumptions drawn from gender stereotypes".[20] The participants received one of four resumes describing equally qualified candidates. The only difference between the four resumes was the applicant's name: two had female names and two had male names. When participants were administered the job application task immediately after counterstereotype training, they were more likely to pick the male candidates over the female candidates, making it appear that the counterstereotype training was ineffective. However, when the researchers added a distractor task between the counterstereotype training and the job application task, participants selected male and female candidates at an equal rate. When participants had to engage in a cognitive task while simultaneously selecting a candidate, they were more likely to select female applicants.[21]
The researchers conducted a follow-up study with a slightly different procedure to determine why bias increased in some conditions and decreased in others. They followed the same counterstereotype training procedure but divided the job application task into two distinct parts. Participants were either asked to first pick the best candidate for the job and then rank each candidate on sixteen traits (half stereotypically female and half stereotypically male), or to complete the tasks in the opposite order. Regardless of the order, participants were consistently biased against women in the first task but not in the second. The researchers hypothesized that participants were able to discern that the purpose of the study was to reduce gender bias, so they showed increased bias in the first task to compensate for the researchers' attempt to influence their behavior.[22] Further research is necessary to determine why participants showed decreased bias on the second task and whether the decrease has an enduring effect.
Hu and colleagues (2015) created a form of counterstereotype training intended to unlearn implicit bias during sleep.[18] Participants completed the typical counterstereotype training task of pairing images of people of different genders and races with counterstereotypical traits. Their study differed from previous research in that two distinct sounds were played after each successful pairing, one for gender counterstereotypes and one for racial counterstereotypes. After the training task, participants took a 90-minute nap while their sleep was monitored with EEG. Once participants entered slow-wave sleep, the researchers played either the sound that had followed correct gender counterstereotypes or the one that had followed racial counterstereotypes during training. Follow-up assessments showed that bias was reduced selectively according to the sound played during sleep: participants who heard the sound associated with gender counterstereotypes showed reduced gender bias but not racial bias, and vice versa. The specific reduction in bias remained when participants returned to the lab a week after the initial training and testing.[18]
Negation training decreases implicit bias by actively rejecting information that reinforces stereotypes, thereby breaking the habit of stereotyping.[19] Kawakami, Dovidio, Moll, Hermsen, and Russin (2000) conducted one of the first studies to test the effects of negation training on reducing implicit bias. In their study, participants were presented with pictures of Black and White individuals along with a word representing a stereotype. The participants were instructed to press "NO" during stereotype-consistent trials (for example, a Black person and the word "lazy") and "YES" during stereotype-inconsistent trials (a Black person paired with "successful"). Participants showed significant decreases in automatic bias from the pretest to the posttest.[23]
However, Gawronski, Deutsch, Mbirkou, Seibt, and Strack (2006) hypothesized that negation training was not only ineffective but could actually strengthen implicit biases. They argued that Kawakami and colleagues obtained positive results only because participants responding "YES" to stereotype-inconsistent word-picture pairings were in effect using counterstereotyping rather than negation. To test these claims, the researchers created separate counterstereotype and negation conditions. The counterstereotype condition was instructed to press "YES" for stereotype-inconsistent information, while the negation condition was told to press "NO" for stereotype-consistent information. The results showed that the counterstereotype condition decreased implicit bias, but the negation condition increased bias. A possible explanation for the increase in bias with negation training is the level of control required during memory retrieval. During negation training, the memory of a previously held stereotype is activated, and the trainee must then purposefully reject its meaning. Participants repeatedly activated the memory of the stereotype, which strengthened it, and they were not able to replace the stereotype with a positive counterstereotype. In counterstereotyping, by contrast, no such control is needed to reject a memory, because a new and separate memory for stereotype-inconsistent information is formed.[24]
More recently, Johnson, Kopp, and Petty (2018) attempted to reconcile the discrepant results of the previous research. They argued that the negation response was not meaningful and that participants were not adequately motivated to reduce their implicit biases. The researchers introduced a condition in which participants were told to think "that's wrong!" in response to stereotype-consistent information, while other participants continued to use the typical form of negation and simply responded "no" to stereotype-consistent information. The researchers hypothesized that "no" is an ambiguous and weak response to stereotypes, whereas "that's wrong!" is a specific and morally charged response that is hard to ignore. When participants were told to think "that's wrong!" in response to stereotype-consistent information, there was a decrease in implicit bias that was not observed in the condition that simply thought "no". Additionally, the researchers found that motivation plays a role in the effectiveness of implicit bias training programs. After the negation training tasks, participants took the Motivation to Control Prejudiced Reactions Scale (MCPR) to measure their drive to change their implicit biases. People who scored particularly high on the MCPR showed reduced bias regardless of condition. Thus, when people are determined to reduce their implicit biases and think "that's wrong" rather than "no", negation training shows promising results for decreasing implicit racial bias.[19]
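The differences between these training conditions come down to the response rule applied to each picture-word pairing. The sketch below is an illustrative summary of those rules rather than code from any of the cited studies; the example pairing is taken from the descriptions above.

# Illustrative response rules for the training variants discussed above.
# A stereotype-consistent trial pairs a group member with a stereotypical trait
# (e.g. a Black person with "lazy"); an inconsistent trial uses a
# counterstereotypical trait (e.g. a Black person with "successful").
def trained_response(condition, stereotype_consistent):
    """Return the response participants are trained to give, or None if the
    condition requires no response on this type of trial."""
    if condition == "counterstereotype":
        # Affirm stereotype-inconsistent pairings.
        return None if stereotype_consistent else "YES"
    if condition == "negation":
        # Reject stereotype-consistent pairings.
        return "NO" if stereotype_consistent else None
    if condition == "strengthened_negation":
        # Johnson, Kopp, and Petty (2018): a forceful, morally charged rejection.
        return "that's wrong!" if stereotype_consistent else None
    raise ValueError(f"unknown condition: {condition}")

# Example: a trial pairing a Black individual with the word "lazy".
print(trained_response("negation", stereotype_consistent=True))              # NO
print(trained_response("strengthened_negation", stereotype_consistent=True)) # that's wrong!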
Perspective-taking creates a sense of empathy for a stereotyped group, which has been shown to improve attitudes towards individual group members as well as the group as a whole.[25] Typically, perspective-taking studies follow a three-step procedure. First, participants are exposed to the target minority group, for example by watching a video that depicts racial discrimination or by viewing a photograph of an individual from the group. Then participants are told to reflect on that person's life and emotions or to imagine themselves as the main character. A separate control group watches the same video or views the same photograph but is not given any additional perspective-taking instructions. Lastly, participants' biases are reassessed through questionnaires, retaking the IAT, or engaging in specific tasks. This prototypical form of perspective-taking has been shown to effectively reduce racial bias.[26][27]
Dovidio and colleagues (2004) found that a diverse group of strangers can come together as a unified group if they believe they share a common threat. Evoking a perceived common threat can reduce bias because people are less likely to be biased against members of their own group.[26] Todd, Bodenhausen, Richeson, and Galinsky (2011) showed participants an image of a Black man, had them write an essay about a day in his life, and then observed the participants interacting with a Black researcher. The face-to-face interactions were more successful and natural for participants in the perspective-taking condition than for the control group.[27]
Another example of perspective-taking was tested by Shih, Stotzer, and Gutiérrez (2009). Participants watched a film clip showing an Asian American being discriminated against and were then told to read a college admissions folder and decide whether the student should be admitted. The admission profiles were identical except that one version listed the applicant's ethnicity as White while the other listed Asian American. Participants in the perspective-taking condition demonstrated greater empathy towards the Asian American applicant and were more likely to accept him than the control condition.[25] In 2013, the researchers conducted an additional study in which they flashed the pronoun "us" or "them" before showing an adjective with a good or bad connotation. They found that participants in the control group quickly associated good adjectives with "us" and bad adjectives with "them", while the perspective-taking group did not show a significant time difference between the two categories. The researchers concluded that empathy and perspective-taking could reduce prejudice towards discriminated groups.[28]
Kaatz and colleagues (2017) had participants play a video game in which they take on the role of Jamal, a Black graduate student working towards a degree in science. Throughout the game, players had to complete tasks such as selecting an advisor, attending conferences, and publishing papers. During each task, players experienced hardships due to racial discrimination and learned about implicit bias. To successfully complete the game, players had to learn to recognize, label, and talk about bias. After completing the game, participants filled out surveys about their experiences, and most agreed that the game was an effective strategy for reducing implicit bias.[29] Further research is necessary to objectively measure its effectiveness.
Meditation has become integrated into a variety of Western therapeutic practices due to its benefits of enhanced well-being, reduced depression and anxiety, and overall mood improvement.[30] In 2008, meditation was incorporated into implicit bias training using loving-kindness meditation (LKM), which "aims to self-regulate an affective state of unconditional kindness towards the self and others".[30] Meditation studies follow the format of a pretest IAT, participation in an LKM program, and a posttest IAT. Hutcherson, Seppala, and Gross (2008) showed that a few minutes of LKM could create a sense of empathy and compassion for a neutral target, which inspired the idea of using meditation as an implicit bias training technique.[31] Stell and Farsides (2016) found that after only seven minutes of LKM, implicit racial bias towards a targeted group was reduced.[30] Kang, Gray, and Dovidio (2014) found that participants who attended a seven-week meditation course showed a significant decrease in implicit bias towards African Americans and homeless people. Notably, participants who took part in a discussion based on the loving-kindness philosophy for seven weeks but did not practice meditation showed no reduction in bias after the seven weeks.[32]
Implicit bias workshops often employ a range of strategies designed to mitigate implicit biases. Devine, Forscher, Austin, and Cox (2012) devised a workshop that incorporates five distinct techniques to address bias: stereotype replacement, counterstereotype training, individuation, perspective-taking, and increased exposure to minority groups.[9]
Stereotype replacement involves participants recognizing their own stereotypes, reflecting on their origins, contemplating ways to avoid these stereotypes in the future, and formulating unbiased responses to replace them. Counterstereotype training prompts participants to envision counterstereotypical examples. For instance, if the counterstereotype relates to intelligence, participants might be encouraged to visualize intelligent individuals from a stereotyped group, such as President Obama or a personal acquaintance.
Individuation involves participants receiving specific information about members of a stereotyped group so that they remember each person as an individual rather than seeing the group as a single, undifferentiated unit.
Perspective-taking involves the participant imagining themselves as a member of a stereotyped group.
Lastly, participants are provided with opportunities to have positive interactions with members of minority groups. Four and eight weeks after completing the workshop, participants' implicit bias (as measured by the IAT) was reduced.[9]
In 2016, Moss-Racusin and colleagues created a 120-minute workshop called "Scientific Diversity" that was aimed at reducing gender bias.[33] During the workshop, instructors presented empirical evidence on implicit bias, encouraged active group discussion, and helped participants practice techniques for creating an accepting environment. To assess bias, participants took pretest and posttest questionnaires. The posttest questionnaires revealed that participants experienced increased diversity awareness and decreased subtle gender bias.[33]
According to Gonzales, Kim, and Marantz (2014), the recognition of bias cannot be taught in a single session. Researchers have thus created workshops or class curricula that span days, semesters, or even years.[34] Hannah and Carpenter-Song (2013) created a semester-long course focused on introspection: students are encouraged to look within themselves to examine their own biases, values, and, most importantly, blind spots. During each class, students discuss articles about various forms of bias and participate in interactive exercises designed to promote perspective-taking and empathy. Evaluations of the course showed that students with an active interest in learning about implicit bias were able to successfully reduce their levels of bias. However, a subset of students showed no reduction, or even an increase, in bias after the course; because the program was mandatory, these students were not incentivized to change their thoughts and behaviors.[10]
Van Ryn and colleagues (2015) started a course for medical students examining disparities in minority health care.[11] The researchers implemented various forms of the class in forty-nine medical schools and collected data from 3,547 students. During the class, students read articles about implicit bias, held group discussions, and gained experience interacting with racial minorities. Participants took the IAT during their first and last semesters of medical school to assess the effectiveness of the program. Though most reductions in implicit bias were small, they were significant and affected behavior: students reported feeling more comfortable working with minorities and keeping implicit biases in mind when treating minority patients.[11]
Stone, Moskowitz, and Zestcott (2015) conducted a workshop for medical students that used self-reflection techniques to motivate healthcare providers to address their implicit biases. First, participants took an IAT (without receiving feedback) and read an article about implicit bias in medicine. A week later, participants attended a lecture about implicit bias and watched a classroom demonstration of an IAT. Two days after that, participants met in small groups to discuss strategies for reducing bias, seeking common identities, and taking the perspective of patients. When participants retook the IAT three to seven days after the workshop, there was a significant decrease in implicit bias.[35]
Kulik et al. found that, in a sample of 2,000, implicit bias training increased bias against older candidates.[36]
Noon argues that implicit bias training initiatives are still in their infancy and require further research.[3]
Social psychology research has indicated that individuating information (any information about an individual group member other than category information) may eliminate the effects of implicit bias.[37]
Individuals' scores on the implicit bias test (the IAT) have not been found to correlate strongly with the degree of bias shown in their behavior towards particular groups.
It came as a major blow when four separate meta-analyses undertaken between 2009 and 2015 (each examining between 46 and 167 individual studies) all showed the IAT [implicit bias test] to be a weak predictor of behavior. Proponents of the IAT tend to point to individual studies showing strong links between test scores and racist behavior. Opponents counter by highlighting those that, counterintuitively, show a link between biased IAT scores and less discriminatory behavior.[38]
Goff's work points to studies showing that police officers with high anti-black IAT scores are quicker to shoot at African Americans. That finding, though, has been countered by research showing the exact opposite.[38]
"There is also little evidence that the IAT can meaningfully predict discrimination," notes one paper, "and we thus strongly caution against any practical applications of the IAT that rest on this assumption."[38]
Interventions that reduce implicit bias have not been shown to produce corresponding changes in behavior:
A 2017 meta-analysis of 494 previous studies from several researchers, including Nosek (at the time under peer review and not yet published in a journal), found that reducing implicit bias did not affect behavior.[38]
Another meta-analysis has since been published in 2019, reviewing 492 previous studies. It found that: "Our findings suggest that changes in implicit measures are possible, but those changes do not necessarily translate into changes in explicit measures or behavior."[39]