Many Facial-Recognition Systems Are Biased, Says U.S. Study


The majority of commercial facial-recognition systems exhibit bias, according to a study from a federal agency released on Thursday, underscoring questions about a technology increasingly used by police departments and federal agencies to identify suspected criminals. The systems falsely identified African-American and Asian faces 10 to 100 times more often than Caucasian faces, the National Institute of Standards and Technology reported on Thursday. In a database of photos used by law enforcement agencies in the United States, the highest error rates came in identifying Native Americans, the study found. The new report comes at a time of mounting concern among lawmakers and civil rights groups over the proliferation of facial recognition. Tech companies market it as a convenience that can be used to identify people in photos or, in lieu of a password, to unlock smartphones.


Source: The New York Times, December 19, 2019, 22:04 UTC

