
Communications of the ACM

ACM TechNews

Researchers Find Even 'Fair' Hiring Algorithms Can Be Biased


[Image: job candidates under a magnifying glass. Credit: blog.capterra.com]

Researchers have found even purportedly "fair" algorithms are inconsistent in their ranking of job candidates.

Researchers at Harvard University and Germany's Technische Universität Berlin, analyzing how "fair" ranking algorithms affect gender bias, found that such algorithms rank job candidates inconsistently.

The team reviewed algorithms used on TaskRabbit, a marketplace that matches users with jobs by using ranking programs to sift through available workers and produce an ordered list of suitable candidates.

The researchers explored how gender biases arise on TaskRabbit and how they affect hiring decisions by examining several interacting sources, including the type of ranking algorithm, the job context, and employers' prejudices.

The team determined that while fair or de-biased ranking algorithms can help boost the number of underrepresented candidates hired, their efficacy is constrained by the job contexts in which employers favor particular genders.
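To make the idea of a "fair" or de-biased ranking concrete, the sketch below shows one common family of approaches: greedily re-ranking a score-sorted candidate list so that every prefix of the ranking contains a minimum share of an underrepresented group. This is an illustrative assumption, not the specific algorithm from the study; the function name, the `min_share` parameter, and the sample data are all hypothetical.

```python
import math

def fair_rerank(candidates, protected, min_share):
    """Greedily re-rank candidates so that every prefix of length k
    contains at least floor(min_share * k) members of the protected
    group. `candidates` is a list of (name, group) tuples already
    sorted by relevance score, best first.

    Illustrative sketch only -- not the algorithm evaluated by the
    Harvard / TU Berlin researchers.
    """
    prot = [c for c in candidates if c[1] == protected]
    other = [c for c in candidates if c[1] != protected]
    order = {c: i for i, c in enumerate(candidates)}  # original rank
    ranked = []
    while prot or other:
        k = len(ranked) + 1
        need = math.floor(min_share * k)          # required protected count
        have = sum(1 for _, g in ranked if g == protected)
        # Take the next protected candidate if the prefix constraint
        # demands it, if no others remain, or if they are the best
        # remaining candidate anyway; otherwise take the best overall.
        take_prot = bool(prot) and (
            have < need
            or not other
            or order[prot[0]] < order[other[0]]
        )
        ranked.append(prot.pop(0) if take_prot else other.pop(0))
    return ranked
```

With `min_share = 0.4`, for example, a list whose top three candidates all belong to the majority group gets re-ordered so that a protected-group candidate appears by position three, trading a small amount of score-order fidelity for prefix-level representation. The study's finding is that such guarantees help only when employers actually pick from the re-ranked list rather than favoring a gender regardless of rank.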

The researchers said, "We hope that this work represents a step toward better understanding how algorithmic tools can [or cannot] reduce gender bias in hiring settings."

From VentureBeat

Abstracts Copyright © 2020 SmithBucklin, Washington, DC, USA

