While working on an assignment involving facial-recognition software, the M.I.T. Media Lab researcher Joy Buolamwini found that the algorithm couldn't detect her face — until she put on a white mask. As she recounts in the documentary "Coded Bias," Buolamwini soon discovered that most such artificial-intelligence programs are trained to identify patterns based on data sets that skew light-skinned and male. "When you think of A.I., it's forward-looking," she says. "But A.I. is based on data, and data is a reflection of our history."

Directed by Shalini Kantayya, "Coded Bias" explores how machine-learning algorithms — now ubiquitous in advertising, hiring, financial services, policing and many other fields — can perpetuate society's existing race-, class- and gender-based inequities.
Source: International New York Times November 11, 2020 18:33 UTC