Many facial recognition tools show racial bias, study finds


A new NIST study looks into the biases of various facial recognition systems. (Getty Images)

Many facial recognition systems misidentify African-American, Asian and Native American faces more often than Caucasian faces, according to a study released Thursday by the National Institute of Standards and Technology. The study found higher rates of false positives for these groups when an algorithm tries to confirm whether a photo matches another image of the same person in a database. This task, known as one-to-one matching, is often used for jobs like unlocking a phone or checking a passport.
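To make the one-to-one task concrete: most modern systems reduce each photo to a numeric "embedding" and declare a match when two embeddings are similar enough. The Python sketch below is only an illustration of that idea, not code from the NIST evaluation; the embedding inputs, the cosine-similarity score and the 0.6 threshold are all assumptions.

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity between two face embeddings, in the range [-1, 1]."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def verify(probe: np.ndarray, enrolled: np.ndarray, threshold: float = 0.6) -> bool:
    """One-to-one ("verification") check: does the probe photo match the
    single enrolled photo, as when unlocking a phone or checking a passport?
    A false positive is this returning True for photos of two different people."""
    return cosine_similarity(probe, enrolled) >= threshold
```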

The study looked at 189 software algorithms from 99 developers, including Microsoft, Intel and Panasonic. Amazon didn't submit its algorithm for testing, NIST said, according to The Washington Post. Amazon's Rekognition software has been criticized for showing gender and racial bias. The e-commerce giant didn't immediately respond to a request for comment.

False positives were higher among women than men, a finding that was consistent across algorithms and datasets, according to NIST. There were also more false positives among the elderly and children. 
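The comparisons behind findings like these come down to measuring the false match rate separately for each demographic group. The sketch below shows that bookkeeping in schematic form; the tuple layout and group labels are hypothetical stand-ins, not NIST's actual data format.

```python
from collections import defaultdict

def false_match_rate_by_group(trials):
    """trials: iterable of (group, same_person, said_match) tuples, where the
    last two items are booleans. Returns, per demographic group, the share of
    impostor comparisons (photos of two different people) that the system
    wrongly accepted as a match."""
    false_matches = defaultdict(int)
    impostor_pairs = defaultdict(int)
    for group, same_person, said_match in trials:
        if not same_person:               # only impostor pairs can produce false positives
            impostor_pairs[group] += 1
            if said_match:
                false_matches[group] += 1
    return {g: false_matches[g] / impostor_pairs[g] for g in impostor_pairs}
```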

“While it is usually incorrect to make statements across algorithms, we found empirical evidence for the existence of demographic differentials in the majority of the face recognition algorithms we studied,” Patrick Grother, a NIST computer scientist and the report’s primary author, said in a statement. “While we do not explore what might cause these differentials, this data will be valuable to policymakers, developers and end users in thinking about the limitations and appropriate use of these algorithms.”

A second task, called one-to-many matching, involves determining whether someone in an image has any match in a database. This can be used to identify a person of interest. The study found that with one-to-many matching, there were higher rates of false positives for African-American females. This is an important concern because it could lead to false accusations, NIST says.
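Again as a rough sketch rather than anything from the study itself: a one-to-many search scores a probe photo against every entry in an enrolled gallery and returns a ranked candidate list, which is why a false positive can put the wrong person in front of investigators. The gallery layout, cosine score and threshold below are assumptions for illustration.

```python
import numpy as np

def identify(probe: np.ndarray, gallery: dict, threshold: float = 0.6, top_k: int = 5):
    """One-to-many ("identification") search: score the probe face against
    every enrolled identity and return the candidates that clear the
    threshold, best match first. `gallery` maps an identity label to its
    embedding. A false positive here puts the wrong person on the candidate
    list that investigators then review."""
    def cosine(a, b):
        return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))
    scored = [(name, cosine(probe, emb)) for name, emb in gallery.items()]
    candidates = [(name, s) for name, s in scored if s >= threshold]
    return sorted(candidates, key=lambda pair: pair[1], reverse=True)[:top_k]
```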

“In a one-to-one search, a false negative might be merely an inconvenience — you can’t get into your phone, but the issue can usually be remediated by a second attempt,” Grother said in the release. “But a false positive in a one-to-many search puts an incorrect match on a list of candidates that warrant further scrutiny.”

The study notes, though, that “not all algorithms give this high rate of false positives across demographics in one-to-many matching.” Ultimately, “Different algorithms perform differently.” 

While a handful of previous studies had looked into the demographic effects of one-to-one matching, none had explored the demographic effects of one-to-many matching until now, NIST says. 

To conduct the study, NIST used four collections of images with 18.27 million photos of 8.49 million people. They were pulled from databases provided by the State Department, the Department of Homeland Security and the FBI. 

source: cnet.com