Amazon discontinued an artificial intelligence recruiting tool developed by its machine learning specialists to automate the hiring process after determining the system was biased against women.
Starting in 2014, a group of Amazon researchers created 500 computer models focused on specific job functions and locations, training each to recognize about 50,000 terms that showed up on past Amazon job candidates' resumes.
However, because most resumes submitted to Amazon had come from men, the models tended to favor candidates who described themselves using verbs more commonly found on male engineers' resumes, such as "executed" and "captured."
In addition, the program penalized resumes that included the word "women's" and downgraded graduates of two all-women's colleges.
Amazon declined to comment on the technology's issues, but said the tool was "never used by Amazon recruiters to evaluate candidates."
Abstracts Copyright © 2018 Information Inc., Bethesda, Maryland, USA