Wednesday, May 21, 2014

Why We Resist Unconscious Bias


By Meg Urry, Yale University, Department of Physics and Department of Astronomy. Reproduced from the January 2014 Issue of STATUS: A Report on Women in Astronomy

About ten years ago, I sat down at my computer to take the Implicit Association Test devised by Mahzarin Banaji, then my colleague at Yale University, now at Harvard University. I had just read a story in The New York Times about how she and her colleagues measure reaction times for paired words and images, calibrating the experimental subject (in this case, me) on innocuous images while the subject types “yes” or “no” to indicate whether the word belongs with the image. For example, you would type “yes” for a flower paired with the word “beautiful,” and “no” for an iceberg paired with the word “hot.”
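For the curious, the logic behind the measurement is simple: responses tend to be a fraction of a second slower when a pairing clashes with an implicit association. Here is a toy sketch of that idea in Python (my own illustration, not Prof. Banaji’s software or her scoring method; all names and numbers below are invented):

from statistics import mean

def mean_response_time(trials):
    # Average the recorded response times (in milliseconds), counting correct answers only.
    return mean(rt for rt, correct in trials if correct)

# Hypothetical trials, each recorded as (response_time_ms, answered_correctly).
congruent_block = [(612, True), (598, True), (640, True), (575, True)]     # pairing fits the association
incongruent_block = [(701, True), (688, True), (655, False), (720, True)]  # pairing clashes with it

# A positive gap means slower answers on the clashing pairings: the extra split second.
gap_ms = mean_response_time(incongruent_block) - mean_response_time(congruent_block)
print(f"Average slowdown on incongruent pairings: {gap_ms:.0f} ms")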

I was pretty nervous. Prof. Banaji isn’t interested in flowers or icebergs. She wants to know whether we are as color-blind and gender-blind as we would like to be. Has our society progressed to a point where we treat everyone the same way? Where women and men have equal opportunities to become physicists and homemakers?

Since then, I’ve read lots of related research. Experiment after experiment suggests we underestimate the qualifications of women in male-dominated fields. For example, Moss-Racusin et al. (2012) found that science professors rated women science students lower, were less likely to mentor them, and offered them much lower starting salaries. Uhlmann & Cohen (2005) showed that men were more likely to be selected for a job as police chief, whether or not they had the key qualities defined at the outset by the reviewers. In both studies, men and women raters showed the same bias against women in male-dominated fields.

It’s all pretty consistent and repeatable. It’s hard to escape the conclusion that these biases are real and that we all have them. But I’m a scientist – a physicist. Our core value is objectivity. To do our jobs well, we have to be objective. So the Implicit Association Test threatens us where we live. The very last thing I want is to be biased.

Okay, enough procrastinating. I started the test. Calibration run: fine, I’m good at this video game. Now the “money” test: what would my fingers do when I saw a black man in a white lab coat and the word “scientist”? I hurried to push “yes” as quickly as I could – as fast as I did when the person in the white coat was white. When it was all over, it turned out I had managed to fool the machine. The software reported that my bias was “undetermined” – not obviously absent and not obviously present.

Whew! “Undetermined” was like a passing grade.

But I could feel the difference. I could feel the extra split second it took to push “yes” when the figure in the white coat was black. I might have been fast enough to beat Prof. Banaji’s test, but I couldn’t deny what I knew.

Ever since that moment, I’ve encouraged countless doubting Thomases – often scientists like me – to take the test at implicit.harvard.edu. Check it out. Maybe you too can fool the test. But I doubt you can fool yourself.

Some years ago, I had an interesting discussion with a group of colleagues. We had all served on a major university committee, and the provost invited us to a celebratory dinner. I had just read a fascinating sociology paper about the reactions of psychology professors to identical CVs, one bearing a woman's name and the other a man's, ostensibly candidates for a faculty job (Steinpreis et al. 1999). The men and women professors identified the same qualities as desirable in a new faculty hire, but, at a high level of statistical significance, most rated the male candidate higher. This was independent of the gender of the professor doing the rating.

For me, the only woman faculty member in the Yale Physics Department when I arrived in 2001, this evidence of bias was a revelation. I thought it (along with a ton of other similar experiments) showed that, perhaps, the reason women weren’t progressing in science at the same rate as men was not that they weren’t brilliant and dedicated, but that none of us could quite assess their value on the same scale we used for the men. There was some unarticulated, difficult-to-spot unconscious bias holding women back in male-dominated professions like academia.

When I described the Steinpreis et al. study to a colleague across the table, he and I soon became the focus of this small dinner of a dozen faculty. He couldn’t be biased, he shouted: he had hired one woman, tenured another, and maybe, he said, had had a woman student; I don’t remember it all. I was horrified and embarrassed and wished I had never said a word.

However badly that interaction went, the clearest thing was how wounded my colleague felt at the implication that he might harbor unacknowledged biases. It’s obviously not delightful to hear such a claim. It’s only marginally better if the accuser fesses up to the same sin. Pretty much everyone is going to be offended.

But it’s surprising how strongly offended scientists are. At first, I thought this was just defensiveness – after all, women are far below numerical parity in science, especially in physics. So yes, maybe the leaders of those fields should feel defensive. For decades they’ve missed out on mentoring and working with some of the greatest minds of our time. (Read Sharon Bertsch McGrayne’s Nobel Prize Women in Science for some inspiring but often infuriating stories.)

But that ought to apply to bankers and financiers too, and to lawyers, welders, and policemen. Yet it is scientists who proclaim angrily, “I am gender blind. I am color blind. I’m totally objective.” So here is what may be at the heart of the matter: when we talk about implicit or unconscious bias, the listener hears, “You are not a good scientist.”

It took me a long time to figure this out because I was excited to learn about bias and I loved reading about these experiments. I thought they explained so much of the physics world I lived in. I had not taken sociology or psychology in college; a counselor had tried to point me in that direction because I was a gregarious “people person,” but I rebelled. I was determined to study physics because it was so simple and profound and powerful. Had I known just how cool the social sciences were, I definitely would have taken those classes!

So I was a late learner, starting in my postdoc years and continuing to the present day. In explaining the Steinpreis et al. (1999) experiment to my senior colleague across the table at dinner, I felt I was sharing with him the excitement of discovery, just as if I’d described one of my own astrophysics experiments. But instead of delight and interest, I provoked anger. Now I see how it must have looked: me, a senior woman in the field, scolding a male colleague for being biased. (I forgot to tell him that women raters do just as poorly in these experiments. It’s not men having unconscious bias against women; it’s all of us having that bias.)

My biologist colleague, Prof. Jo Handelsman, says she encounters people who dismiss the social science results. Those of us with advanced degrees in science, they say, were trained to be objective and couldn’t possibly be biased. It’s an affront to the very idea of being a scientist. How can you be a good scientist if you are not objective? So pointing out that we might all be biased is equivalent to saying that we are all bad scientists.

Don’t get me wrong. I’m convinced the social science is right. It has all the hallmarks of what we call science: there is a hypothesis, it is tested, the result is clear, and the result is reproducible. When these experiments are assessed according to the usual scientific criteria, most hold up as reliable and repeatable.

Many colleagues have seen the light. At a recent picnic at work, a postdoc talked to me about how his outlook had evolved. He listened to a talk I gave, he said, and then took the implicit.harvard.edu test. Like me, he was anxious not to be biased, but he couldn’t escape the conclusion that he did have these biases. (By the way, Mahzarin Banaji reportedly flunked her own test. So none of us should feel too bad.)

Why could that postdoc look objectively at this issue and come to an uncomfortable conclusion? Why can other colleagues not consider the possibility of bias?

The neat final point of the Uhlmann and Cohen (2005) paper was that those who said they were objective were more likely to change the stated criteria in mid-stream than those who said they were not objective. This suggests that being aware of bias makes one more careful about decisions. It’s the people who are convinced of their own objectivity that we should be most worried about.

Where does this leave us?

Maybe not every scientist is biased. Some people might have been raised in countries or in families with minimal gender inequity. When leaders everywhere include plenty of women, maybe this bias will become less common. But, in today's world, why rule out the possibility ab initio when the evidence is so overwhelming? Why would any good scientist not evaluate the experiments? Isn’t it time to act like the very best scientists and keep an open mind?

Reading the social science literature and doing our own experiments may be the only way some of us will be convinced that we have a problem. And then we can talk about how to fix it.

Additional Reading

C. A. Moss-Racusin, J. F. Dovidio, V. L. Brescoll, M. J. Graham, and J. Handelsman 2012, “Science faculty's subtle gender biases favor male students,” PNAS, 109, 16474-16479.

E. L. Uhlmann and G. L. Cohen 2005, “Constructed criteria: Redefining merit to justify discrimination,” Psychological Science, 16, 474-480.

R. E. Steinpreis, K. A. Anders, and D. Ritzke 1999, “The Impact of Gender on the Review of the Curricula Vitae of Job Applicants and Tenure Candidates: A National Empirical Study,” Sex Roles, 41, 509-528.

Sharon Bertsch McGrayne, Nobel Prize Women in Science, Joseph Henry Press, 2nd ed., 1998.
