Some test takers of California's remote bar exam, particularly people of color, reported problems with ExamSoft's facial recognition system. Some nonwhite test takers said the AI software failed to recognize them, a recurring problem with the technology, and they were forced to submit selfies repeatedly, call tech support, or shine a light on their faces to verify their identities during this week's exam.
- Test taker Shameek Aahil Nazeer, who is Black, told the SF Chronicle that the facial recognition software repeatedly failed to recognize him and that his computer crashed. He was forced to submit selfies to verify his face before each section of the test.
- Before the test, the American Civil Liberties Union and other civil rights and privacy groups raised concerns about ExamSoft's facial recognition system, which they said has embedded racial biases and would disproportionately flag people of color.
- It remains unclear what level of discrimination may have occurred during the test. "Humans are biased, and that's why the facial recognition algorithm is biased," said Jennifer Jones, a technology and civil liberties fellow at the ACLU of Northern California.
- The State Bar told the ACLU that any software issue would be reviewed by four reviewers before a violation is determined to have occurred. ExamSoft denied that its software is biased or "disproportionately harms any individual of color."
- A federal study released last year found that facial recognition algorithms are less accurate at identifying African American and Asian faces than Caucasian faces. The systems also varied widely in their accuracy, according to the NIST study.