Researchers at Harvard University and Germany's Technische Universität Berlin, analyzing how "fair" ranking algorithms affect gender bias, found that such algorithms rank job candidates inconsistently.
The team reviewed algorithms used on TaskRabbit, a marketplace that matches users with jobs by leveraging programs to sift through available workers and produce a ranked list of suitable candidates.
The researchers explored how gender biases arise on TaskRabbit and how they influence hiring decisions, modeling several interacting sources: the type of ranking algorithm, the job context, and employers' own prejudices.
The team determined that while fair or de-biased ranking algorithms can help boost the number of underrepresented candidates hired, their efficacy is constrained by the job contexts in which employers favor particular genders.
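A de-biased ranking of the kind described above can be illustrated with a simplified greedy re-ranker that enforces a minimum number of protected-group candidates in the top-k results. This is a minimal sketch with hypothetical data and parameter names, not the actual algorithms the researchers studied on TaskRabbit:

```python
def fair_top_k(candidates, k, min_protected):
    """Return up to k candidates, guaranteeing at least `min_protected`
    come from the protected group; otherwise follow score order.

    candidates: list of (name, score, is_protected) tuples, any order.
    """
    ranked = sorted(candidates, key=lambda c: c[1], reverse=True)
    protected = [c for c in ranked if c[2]]
    others = [c for c in ranked if not c[2]]

    result = []
    while len(result) < k and (protected or others):
        slots_left = k - len(result)
        still_needed = min_protected - sum(1 for c in result if c[2])
        # Force a protected candidate when the remaining slots
        # barely cover the quota, or when one simply scores highest.
        if protected and (still_needed >= slots_left or not others
                          or protected[0][1] >= others[0][1]):
            result.append(protected.pop(0))
        else:
            result.append(others.pop(0))
    return result

# Hypothetical candidate pool: three from the majority group, two protected.
pool = [("a", 0.9, False), ("b", 0.8, False), ("c", 0.7, False),
        ("d", 0.6, True), ("e", 0.5, True)]
top3 = fair_top_k(pool, k=3, min_protected=1)  # → [a, b, d]
```

Even with such a constraint in place, the finding above still applies: if employers skip over the boosted candidates because of the job context, the quota in the ranking alone does not change who gets hired.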
The researchers said, "We hope that this work represents a step toward better understanding how algorithmic tools can [or cannot] reduce gender bias in hiring settings."
Abstracts Copyright © 2020 SmithBucklin, Washington, DC, USA