Wednesday, May 24, 2017

Peer Review as a Lens Into Bias

I have been working to incorporate themes of equity and inclusion into my physics classroom teaching. I’ve blogged about it a bit and some of you have kindly shared your ideas (see here, here, and here). 

In a recent upper-level Astrophysics course, I assigned students a term paper, and required that they participate in a double-blind peer review for their first drafts. (We used a tool called “Peerceptiv” because my University has integrated it into our learning management system, but there are many ways to include peer review in your curriculum.) I wasn’t originally intending this assignment to lead to a conversation about bias, but my students came to me with concerns about the “fairness” of the process: What if another student had a poor opinion of the topic they selected? What if their reviewers didn’t do a good job? Why were we doing it blind, so they didn’t know whose review to take more or less seriously, based on their experience of that student? How could they properly review the paper if they didn’t know who wrote it?

This opened a fantastic conversation about peer review and also about implicit bias in science. We discussed double-blind and single-blind review processes and the role each plays (at least in astro) in the publication of manuscripts and on our telescope allocation committees. We discussed the pros and cons of each. We talked about the classic orchestra example. My students couldn’t understand why scientists don’t adopt double-blind peer review (I ask myself this often!), yet they also wanted to know who was giving them feedback on their work, particularly if that feedback played a role in their final marks. We also discussed how these biases can make science a more hostile environment for minoritized populations, particularly those belonging to a visible minority. It was amazing to have students engage with these issues so authentically, rather than looking at me blankly when I try to explain, in the abstract, how important they are, often too early in the term before we have a rapport.

I followed up our great in-class conversation with an email that included links to definitions of single-blind and double-blind review (email appended below, with resources). I suggested they all take an implicit bias test or two, to encourage self-reflection. I assigned myself the work of learning more about peer assessment techniques and what strategies/questions can make peer review more powerful in the classroom. My University runs workshops on peer review and yours may too.

Having this conversation arise out of a real assignment, a real scientific activity, made our discussion much more interesting and impactful. It was a great reminder for me that getting students to DO science, or something very close to it, causes them to butt up against the same issues we all face as active scientists, and engages them with important questions more authentically.

———————————————————————
My follow-up email to students, with references:
———————————————————————

Hi folks,

I appreciated our conversation about peer review yesterday and am following up with additional resources.

Here are definitions for two of the terms I used to describe peer review (credit: Lluís Codina, https://www.lluiscodina.com):

* Double-blind: the evaluators are unaware of the identity of the authors, and the authors are unaware of the identity of the evaluators.

* Single-blind: the evaluators are aware of the identity of the authors, but authors are unaware of the identity of the evaluators.

In a non-blind or single-blind peer review, issues can (and often do) arise from "implicit bias", i.e., attitudes or stereotypes that affect our understanding, actions, and decisions in an unconscious manner (http://kirwaninstitute.osu.edu/research/understanding-implicit-bias/). These unconscious biases affect all of us and make us less impartial when we act as evaluators (https://www.aaas.org/news/journals-and-funders-confront-implicit-bias-peer-review).

For example, there is a paper out this week showing a bias against citing papers written by women in astronomical journals (https://arxiv.org/abs/1610.08984): "... papers authored by females receive 10.4+/-0.9% fewer citations than what would be expected if the papers with the same non-gender specific properties were written by the male authors." (See also their Figure 6, attached.) And the same issues creep in during peer review.

There are two main ways conscientious scientists work to combat their own implicit biases (we all have them): (1) double-blinding the review process, and (2) educating themselves and putting in place processes (e.g., explicit review criteria) that keep the reviewer's focus on the work instead of on the author/applicant. Our Astrophysics class is going with the first option, i.e., a double-blind peer review (using pseudonyms).

If you want to understand (and possibly root out) some of your own biases, i.e., get working on the second option, I encourage you to try taking an "Implicit Association Test" or two (or 7!):

Best,
-Prof. Haggard

PS Here are some additional resources (also from Lluís Codina) discussing the pros and cons of single- and double-blind peer review in gory detail:

1. Nature article in 2014 argues for a double-blind system:

2. Nature article in 2015 announcing optional double-blind review:

3. Richard Snodgrass's article on single- versus double-blind reviewing:

4. Inside Higher Ed article in 2011 arguing against double-blind review, but for single-blind:

5. A 2011 blog post by Daniel Lemire (a computer scientist at Quebec U!) arguing against double-blind and for single-blind review:
