MIT research finds ethnic and gender bias in Amazon Rekognition
However, AWS said MIT's testing was "ill-advised" and that its software was not used as intended
The Massachusetts Institute of Technology (MIT) has found that Amazon's Rekognition facial recognition platform may not identify race or gender accurately or fairly. Researchers at the university reported that, in their tests, the technology was less effective at identifying some races and genders than others.
For example, it misidentified some pictures of women as men, and this was more prevalent with pictures of darker-skinned women: it made this error 31% of the time, compared with an error rate of 1.5% for Microsoft's rival software.
However, AWS said that its software isn't true "facial recognition" software but "facial analysis". It is designed to identify facial expressions rather than ethnicity or gender, which AWS says is why it appears less accurate than its competitors.
"[F]acial analysis [is] usually used to help search a catalog of photographs," Dr. Matt Wood, general manager of deep learning and AI at AWS said in a statement to VentureBeat. "[F]acial recognition is a distinct and different feature from facial analysis and attempts to match faces that appear similar. This is the same approach used to unlock some phones, or authenticate somebody entering a building, or by law enforcement to narrow the field when attempting to identify a person of interest."
He added that it is "ill-advised" to use the software in the way that MIT did, because it has not been designed to identify criminals. AWS said that in tests of the latest version of the software, using data from parliamentary websites, it recorded no false positive matches at a 99% confidence threshold.
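The 99% confidence threshold AWS mentions is a filter applied to candidate matches: any match scored below the threshold is discarded, trading recall for fewer false positives. Below is a minimal, hypothetical sketch of that idea; the function name, record shape, and `confidence` field are illustrative assumptions, not the actual Rekognition response format.

```python
def filter_matches(candidates, threshold=99.0):
    """Keep only candidate matches at or above the confidence threshold.

    Raising the threshold reduces false positives at the cost of
    rejecting some genuine matches (false negatives).
    """
    return [c for c in candidates if c["confidence"] >= threshold]


# Illustrative candidate matches, as a face-matching API might score them.
candidates = [
    {"name": "subject_a", "confidence": 99.4},
    {"name": "subject_b", "confidence": 87.2},
    {"name": "subject_c", "confidence": 99.1},
]

# At a 99% threshold, only the two highest-scoring candidates survive.
matches = filter_matches(candidates)
```

Lowering the threshold (for example, to 80%) would admit `subject_b` as well, which is why AWS's reported accuracy figures are only meaningful alongside the threshold used.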
However, it seems the company needs to better communicate its software's purpose to its shareholders, as some have asked it to stop selling its facial recognition service on the grounds that it violates civil rights.
Clare is the founder of Blue Cactus Digital, a digital marketing company that helps ethical and sustainability-focused businesses grow their customer base.
Prior to becoming a marketer, Clare was a journalist, working at a range of mobile device-focused outlets including Know Your Mobile before moving into freelance life.
As a freelance writer, she drew on her expertise in mobility to write features and guides for ITPro, as well as regularly writing news stories on a wide range of topics.