When Bias Is Coded Into Our Technology

This is a preview. View original post on this site


By Jennifer Lee

Facial recognition systems from large tech companies often misclassify Black women as male, including Michelle Obama, Serena Williams and Sojourner Truth. That’s according to Joy Buolamwini, whose research drew wide attention in 2018 with “AI, Ain’t I a Woman?”, a spoken-word piece based on her findings at the MIT Media Lab.

The video, along with the accompanying research paper co-authored with Timnit Gebru of Microsoft Research, prompted many tech companies to reassess how their facial recognition data sets and algorithms perform on darker-skinned and female faces.

“Coded Bias,” a documentary directed by Shalini Kantayya, which …

Read Complete Article



