Communications of the ACM

ACM Careers

Auditors Are Testing Hiring Algorithms for Bias


[Image: AI bias, illustration. Credit: Getty Images]

More and more companies are using AI-based hiring tools to manage the flood of applications they receive. A survey of human-resources managers by Mercer found that the proportion who said their department uses predictive analytics jumped from 10% in 2016 to 39% in 2020.

Many kinds of AI hiring tools are in use today. They include software that analyzes a candidate's facial expressions, tone, and language during video interviews as well as programs that scan résumés, predict personality, or investigate an applicant's social media activity.

Researchers have found that some hiring tools produce biased results, inadvertently favoring men or people from certain socioeconomic backgrounds, for instance. Many are now advocating for greater transparency and more regulation. One solution is proposed again and again: AI audits, though their ability to detect and protect against bias remains unproven.
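To make the idea of an audit concrete, here is a minimal illustrative sketch (not from the article, and not any specific auditor's method) of one simple check such an audit might run: comparing selection rates across demographic groups using the "four-fifths rule" heuristic from U.S. employment guidelines. All data and function names below are made up for illustration.

```python
# Illustrative sketch of one bias check an AI audit might perform.
# The four-fifths rule flags a group whose selection rate falls below
# 80% of the highest group's rate. All data here is fabricated.

from collections import defaultdict

def selection_rates(records):
    """records: iterable of (group, selected) pairs; returns rate per group."""
    counts = defaultdict(lambda: [0, 0])  # group -> [selected, total]
    for group, selected in records:
        counts[group][0] += int(selected)
        counts[group][1] += 1
    return {g: sel / total for g, (sel, total) in counts.items()}

def four_fifths_check(rates):
    """True for groups at or above 80% of the top selection rate."""
    top = max(rates.values())
    return {g: (r / top >= 0.8) for g, r in rates.items()}

# Hypothetical hiring outcomes: group A selected 3 of 4, group B 1 of 4.
applicants = [("A", True), ("A", True), ("A", False), ("A", True),
              ("B", True), ("B", False), ("B", False), ("B", False)]
rates = selection_rates(applicants)
passes = four_fifths_check(rates)
```

A real audit would go far beyond this single ratio (statistical significance, intersectional groups, proxy features), which is part of why, as the article notes, the effectiveness of audits remains an open question.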

From Technology Review
