Banishing bias: New tool offers fairer research metrics across disciplines, genders and experience

It’s a long-standing issue facing research communities: how to objectively assess the relative merits of research across disciplines, make fair comparisons between early-career and established researchers, and evaluate work from researchers of different genders, when outside factors can throw a wrench in the gears of typical assessment metrics.
Stefani Crabtree, assistant professor of social-environmental modeling in USU’s Department of Environment and Society, and colleagues have developed a tool to assess research performance more fairly, one they hope will level the playing field. The new index is a ranking algorithm that can be standardized across disciplines, corrected for career breaks, and anchored to a sample-specific threshold that shows whether an individual’s performance is greater or less than expected relative to the other researchers in the sample. The index reduces the systemic biases of judging researchers’ work by raw citation counts, instead applying career-stage, gender, and opportunity corrections to citation-based performance metrics.
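To make the idea concrete, here is a minimal sketch of the general approach described above: correct a researcher’s citation output for career length, rank the sample, and measure each person against the sample’s own trend. It is a simplified illustration only, not the published index; the function name, the citations-per-year correction, and the log-linear fit are assumptions for demonstration.

```python
# Illustrative sketch only: a simplified, hypothetical version of a
# career-corrected, sample-specific comparison. This is NOT the formula
# from the PLOS ONE paper, which should be consulted for the real method.
import numpy as np

def relative_performance(citations, years_active):
    """Rank researchers by citations per active year, fit the expected
    log-trend across the ranked sample, and return each researcher's
    residual. Positive = above expectation for this sample; negative =
    below expectation."""
    citations = np.asarray(citations, dtype=float)
    years_active = np.asarray(years_active, dtype=float)

    # Opportunity correction: scale output by active career length
    # (career breaks would reduce years_active).
    rate = citations / years_active

    # Rank the sample (rank 1 = highest rate) and fit a simple linear
    # model of log-rate against rank to get the sample-specific expectation.
    order = np.argsort(-rate)
    ranks = np.empty_like(order)
    ranks[order] = np.arange(1, len(rate) + 1)
    slope, intercept = np.polyfit(ranks, np.log(rate), 1)
    expected = slope * ranks + intercept

    # The residual is the distance from the sample's own trend line, so the
    # threshold (zero) is specific to the researchers being compared.
    return np.log(rate) - expected

# Example: three hypothetical researchers at different career stages.
print(relative_performance([1200, 300, 90], [20, 8, 3]))
```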
“This not only levels the playing field for women and early career professionals, but also enables comparisons of researchers across disciplines—an inherently hard task,” said Crabtree. “As we move toward more multidisciplinary approaches in research, we need better ways to assess the work people do.”
The tool is freely available as an app: users input data for a sample of researchers from open-access databases like Google Scholar, and the app does the work, enabling comparison of researchers at any stage of their career and from any discipline on the same scale.
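For readers who want to assemble such a sample programmatically, a rough sketch follows using the third-party scholarly Python package to pull public Google Scholar profile data. This is not the authors’ app; the package choice, the profile fields queried, and the researcher names are assumptions for illustration only.

```python
# Minimal sketch of gathering Google Scholar citation data for a sample of
# researchers, assuming the third-party "scholarly" package. The resulting
# table would then be supplied to a comparison tool such as the app above.
from scholarly import scholarly

def fetch_profile(name):
    """Return basic citation data for the first Google Scholar profile
    matching `name` (hypothetical helper for illustration)."""
    author = next(scholarly.search_author(name))
    author = scholarly.fill(author, sections=["basics", "indices"])
    return {
        "name": author.get("name"),
        "total_citations": author.get("citedby"),
        "h_index": author.get("hindex"),
        "i10_index": author.get("i10index"),
    }

# Build a sample of researchers to compare (names are placeholders).
sample = [fetch_profile(n) for n in ["Researcher One", "Researcher Two"]]
print(sample)
```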
It’s a boon for anyone who wishes to use an objective metric to rank researchers, whether it be for grant applications, job interviews, promotions and awards, or even as a staff performance indicator, said Crabtree.
A fairer way to compare researchers at any career stage and in any discipline using open-access citation data: Corey J. A. Bradshaw, Justin M. Chalker, Stefani A. Crabtree, Bart A. Eijkelkamp, John A. Long, Justine R. Smith, Kate Trinajstic, and Vera Weisbecker. PLOS ONE, September 2021. DOI: 10.1371/journal.pone.0257141