Communications of the ACM

ACM TechNews

IBM Facial Recognition Dataset Aims to Remove Gender, Skin Bias

Providing more diverse information from which facial recognition systems can learn.

IBM is releasing a new dataset called Diversity in Faces in the hope that it will help developers tackle gender and skin type biases in facial recognition software.

Credit: Computer Business Review

IBM will provide a new million-image dataset to the global research community to help developers eliminate gender and skin-type biases from facial recognition software.

The Diversity in Faces dataset uses publicly available images from the YFCC-100M Creative Commons dataset, annotated using 10 facial coding schemes, as well as human-labeled gender and age annotations.

The coding schemes include facial symmetry, facial contrast, pose, and craniofacial (bone structure) areas, in conjunction with conventional age, gender, and skin-tone schemes.

IBM's John R. Smith said, "The [artificial intelligence] systems learn what they're taught, and if they are not taught with robust and diverse datasets, accuracy and fairness could be at risk."

The researchers said the new dataset is designed so facial recognition "performance should not vary for different individuals or different populations."

From Computer Business Review

Abstracts Copyright © 2019 SmithBucklin, Washington, DC, USA


No entries found

Sign In for Full Access
» Forgot Password? » Create an ACM Web Account