Communications of the ACM

ACM News

When AI Sees a Man, It Thinks 'Official.' A Woman? 'Smile'


AI services tend to notice different things about women and men.

Credit: Getty Images

Men often judge women by their appearance. Turns out, computers do too.

When U.S. and European researchers fed pictures of congressmembers to Google's cloud image recognition service, the service applied three times as many annotations related to physical appearance to photos of women as it did to men. The top labels applied to men were "official" and "businessperson"; for women they were "smile" and "chin."

"It results in women receiving a lower status stereotype: that women are there to look pretty and men are business leaders," says Carsten Schwemmer, a postdoctoral researcher at GESIS Leibniz Institute for the Social Sciences, and a co-author of  "Diagnosing Gender Bias in Image Recognition Systems," published in the journal Socius: Sociological Research for a Dynamic World.

The study adds to evidence that the AI image services of Google, Amazon, and Microsoft do not see the world with mathematical detachment but instead tend to replicate or even amplify historical cultural biases.

From Wired