The Oberlin Review
News, November 18, 2005

Alum Points to Racism Among Police
Lecturer Argues for Police Reforms

To what extent do stereotypes influence police officers’ snap decisions as to whether or not to shoot a suspect? The work of Dr. Ashby Plant, OC ’92, examines exactly this question.

On Monday, Plant presented her findings in a lecture in Severance Hall titled “Eliminating Racial Bias in Decisions to Shoot at Criminal Suspects.”

Plant introduced her lecture with the story of Amadou Diallo. In New York City in February of 1999, Diallo opened his door to four police officers searching for a rape suspect. When Diallo, a black man, reached into his jacket, the officers shot at him 41 times and hit him 19 times. Diallo was only reaching for his wallet.

Did the officers think Diallo was reaching for a gun because of his race? While it’s difficult to say, Plant spoke to the case’s possible implications. “Tragic cases such as these have led many people to wonder whether decisions to shoot criminal suspects come from racial stereotypes.”

Plant, along with two of her graduate students at Florida State University, decided to investigate further how much racial bias influences these decisions and what can be done to correct it.

Before discussing her own work, Plant reviewed the seminal work of B. L. Duncan. In one study, Duncan showed participants videotapes of black and white males pushing others, then asked whether each man was shoving aggressively or just pushing. Participants were more likely to call the man aggressive if he was black.

“When people are unsure of what is happening, they’ll rely on their expectation to inform what is going on,” Plant told the audience.

Since black males are often stereotyped as “violent” and “criminal,” a police officer, unsure as to whether a suspect has a gun, might use these stereotypes to make a judgment about the suspect’s dangerousness.

To test this hypothesis, Plant set up a computer program that overlaid a picture of either a gun or a neutral object, such as a cell phone, with the face of either a black male or a white male. Each race made up 50 percent of the photos, and within each race, a gun appeared 50 percent of the time.

Undergraduate students worked through these faces and had only an instant to decide whether or not to shoot each individual before the next face came up. If they were incorrect — shooting an unarmed man or failing to shoot an armed man — the computer told them they had made an error.
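The trial structure described above can be sketched in a few lines of code. This is an illustrative simulation, not Plant's actual software; the trial count, object names and balanced 50/50 design are drawn from the description in this article, while the function names and random seed are assumptions made for the sketch.

```python
import random

def build_trials(n_trials=160, seed=0):
    """Return a shuffled, balanced list of (race, armed) trials:
    each race is 50% of trials, and within each race a gun
    appears 50% of the time, as in the design described above."""
    rng = random.Random(seed)
    trials = []
    for race in ("black", "white"):
        for armed in (True, False):
            trials += [(race, armed)] * (n_trials // 4)
    rng.shuffle(trials)
    return trials

def score_trial(armed, shot):
    """An error is shooting an unarmed man or not shooting an armed one."""
    return "correct" if shot == armed else "error"
```

For example, `score_trial(armed=True, shot=False)` returns `"error"`, the "failing to shoot an armed man" case that Plant notes is an equally grave mistake for a police officer.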

Plant hypothesized that participants would shoot unarmed black men significantly more often than unarmed white men, and that they would too often fail to shoot armed white men — potentially an equally grave mistake for a police officer. Both hypotheses proved correct.

“However, this was not where we wanted to end the work,” Plant said. “My goal when I do research is not only to point out bias, but also to fix it.”

In order to see if individuals were able to overcome their biases, Plant looked at how the participants’ errors changed over the course of 160 faces.

“Over the course of trials, it should be clear that race does not help in making a decision,” Plant said.

As she expected, participants learned to ignore race by the end of the trials, making equivalent rates of error for each race. Interestingly, few participants were aware of the adjustment that had allowed them to perform better.

“Our brains are very good at detecting when there is a conflict and will work to fix any conflict,” Plant said.

Anticipating the audience’s question, Plant went on: “This is nice with undergrads, but what happens with real police officers?”

To test this, Plant brought in 50 Florida police officers. The officers ranged in age, background and race.

As a group, the law enforcement officials performed similarly to the undergraduates.

Plant also asked this set of participants about their racial attitudes and their contact with individuals of other races, then compared their performance on the task against those answers.

The study found that officers with more experience erred less.

“Over time, officers have learned that it is not effective to let bias influence their decision,” said Plant.

Plant also found that officers who reported more personal contact with individuals of other races overcame their biases more easily.

“Police officers are constantly telling us how their experiences with people on the job are overwhelmingly negative,” she said. “So if an officer’s only contact with African Americans is on the job, his opinion of them will be negative.”

The big question for Plant and audience members alike was how well these results translate to real life.

“What we haven’t got to see is the long-term efficacy of our program,” said Plant when asked if the results could be used as a diversity training tool.

Plant said that she has already seen that participants continue to ignore race if they take the test again after 24 hours.

“We are taking the baby-steps approach with this. Next, we’ll test 72 hours,” she told the audience.

Plant said she was optimistic that the program could be used as training for real-life police situations. At one point, she tested another program in which participants saw entire bodies holding either neutral objects or guns in various positions. Those who had used the original program did not exhibit racial biases when using the new one.

There was also evidence that the trials reduced how much participants thought about race generally. After the trial, Plant gave participants a list of word fragments that could be completed as racial concepts (for example, “__ack”). Those who had used the program completed fewer fragments as racial words than did a sample who had not used the program and a sample who had seen only 80 faces. This showed that race was not as much on their minds, or was perhaps being “actively suppressed,” as Plant speculated.
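The scoring behind this word-fragment measure is simple to sketch. The fragment lists and completion words below are hypothetical stand-ins (only “__ack” appears in the article, and Plant's actual materials are not described), so this shows the shape of the measure rather than her real stimuli.

```python
# Hypothetical fragment -> race-related completions; only "__ack" is
# attested in the article, the rest is assumed for illustration.
RACIAL_COMPLETIONS = {
    "__ack": {"black"},
    "wh___": {"white"},
}

def count_racial_completions(answers):
    """Count how many fragments a participant completed as a
    race-related word. `answers` maps fragment -> completed word."""
    return sum(
        1
        for fragment, word in answers.items()
        if word.lower() in RACIAL_COMPLETIONS.get(fragment, set())
    )
```

A participant answering “black” for “__ack” but “wheel” for “wh___” would score 1; Plant's finding was that trained participants scored lower on this kind of count than untrained ones.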

Plant said there has been a great deal of interest in her work among police departments since it was published.

“[Police departments] know we’re not trying to point fingers, but to actually try and reduce bias,” she said.