Facial recognition software displays racial bias by disproportionately flagging black students

Racial bias in technology has become all the more important to identify and address. A case in point is remote learning, where facial recognition software from Proctorio has been used during exam sessions to ensure learners remain at their PC or notebook for the duration of the examination. While the software sounds useful, it has been found to disproportionately flag people of colour as being away from their devices compared with white students.

According to a report by Motherboard, a student researcher figured out how the software performs its facial recognition, finding that it failed to recognise a black person’s face more than half of the time it was tested.

“I decided to look into it because (Proctorio has) claimed to have heard of ‘fewer than five’ instances where there were issues with face recognition due to race,” Akash Satheesan, the researcher in question, told the publication. “I knew that from anecdotes to be unlikely … so I set out to find some more conclusive proof and I think I’m fairly certain I did,” he added.

Satheesan documented his research methods and findings in a number of blog posts. In these he analysed the code found in Proctorio’s Chrome browser extension, finding that the file names associated with the tool’s facial recognition were identical to those published by OpenCV, an open-source computer vision software library.

“Satheesan demonstrated for Motherboard that the facial detection algorithms embedded in Proctorio’s tool performed identically to the OpenCV models when tested on the same set of faces,” adds the publication.

The researcher also explained that the Proctorio software not only failed to recognise faces of colour, it also struggled to recognise faces of any ethnicity, with the highest hit rate coming in at under 75 percent.
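Motherboard’s description suggests a fairly straightforward benchmark: run a face detector over a set of face images and count how often it finds a face at all. Below is a minimal sketch of such a hit-rate test using OpenCV’s stock Haar cascade face detector; the model choice, the test_faces/ folder and the detection parameters are illustrative assumptions, not the researcher’s actual methodology.

```python
# Minimal sketch of a face-detection "hit rate" test, assuming OpenCV's
# bundled Haar cascade model. The model, the test_faces/ folder and the
# detection parameters are assumptions for illustration only.
import glob

import cv2

# Load the frontal-face Haar cascade shipped with OpenCV.
detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

image_paths = glob.glob("test_faces/*.jpg")  # assumed: one face per image
hits = 0

for path in image_paths:
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) > 0:  # the detector found at least one face in the image
        hits += 1

if image_paths:
    print(f"Detection rate: {hits / len(image_paths):.1%}")
```

Running a test like this over separate sets of images grouped by ethnicity is one way differences in detection rates, such as those Satheesan reported, would become visible.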

“While the public reports don’t accurately capture how our technology works in full, we appreciate that the analyses confirm that Proctorio uses face detection,” Proctorio spokesperson, Meredith Shadle wrote to Motherboard in an email.

As such, the racial bias within the software looks set to remain in place, with no official statement about addressing the findings having been made at the time of writing.

With more businesses and institutions looking to this type of software to monitor employees and students alike, a practice that is ill-advised in its own right, it is important for those who choose to implement such solutions in their own environments to be aware of the racial bias that likely exists.

[Image – Photo by Surface on Unsplash]
[Source – Motherboard]
