Corsight Achieves Demographic FMR Parity
♦ Corsight Achieves Demographic FMR Parity – Corsight AI reports its facial recognition technology topped the NIST FRVT 1:1 Verification test for proactively reducing bias and has achieved demographic false match rate (FMR) parity.
According to Corsight AI, the results demonstrate progress in reducing discrimination within its algorithm. The company's false match rate for black female and male subjects is now identical to that for white female and male subjects, which Corsight says shows its solution carries less bias than competing systems.
In NIST testing, bias is measured by comparing the false match rate (FMR) across demographic groups. If the FMR for black male and female subjects is higher than the FMR for white male and female subjects, the algorithm is more likely to misidentify members of that demographic.
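To make that comparison concrete, the short Python sketch below shows one way a per-group FMR could be computed and compared for parity. It uses made-up scores, a hypothetical decision threshold and illustrative group labels; it is not NIST's evaluation code or Corsight's method.

# Minimal sketch (not NIST's evaluation code): compute a false match rate
# (FMR) per demographic group from impostor (different-person) comparison
# scores at a fixed decision threshold, so the rates can be compared for parity.

from collections import defaultdict

def fmr_by_group(impostor_scores, threshold):
    """impostor_scores: list of (group_label, similarity_score) pairs for
    comparisons between different people. Returns the FMR for each group."""
    matches = defaultdict(int)   # false matches per group
    totals = defaultdict(int)    # impostor comparisons per group
    for group, score in impostor_scores:
        totals[group] += 1
        if score >= threshold:   # different people wrongly declared a match
            matches[group] += 1
    return {group: matches[group] / totals[group] for group in totals}

# Example with invented numbers: parity means the per-group FMRs are (near) equal.
scores = [
    ("black_female", 0.31), ("black_female", 0.72),
    ("black_male",   0.28), ("black_male",   0.64),
    ("white_female", 0.33), ("white_female", 0.70),
    ("white_male",   0.29), ("white_male",   0.66),
]
print(fmr_by_group(scores, threshold=0.6))
# {'black_female': 0.5, 'black_male': 0.5, 'white_female': 0.5, 'white_male': 0.5}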
“We’re absolutely thrilled with these results…this is another step forward in countering claims that bias is damaging the effectiveness of facial recognition technology,” said Tony Porter, chief privacy officer at Corsight AI. “The argument that facial recognition software is not capable of being fair is frozen in time and the performance of Corsight’s latest submission demonstrates that.
“In relation to our most recent privacy release, this NIST test goes to show that Corsight is stepping towards a 360-degree approach to trustworthy facial recognition. Our solution is accurate, it’s fast, and now, according to NIST, it can deliver fairness. We are proud of the results…we will continue working hard to extend test cases with NIST to provide a balanced real-world solution.”
NIST is recognized for setting national and international standards in the facial recognition industry, and its testing protocols aim to identify and address bias and discrimination in facial recognition algorithms.
#SEN #SENnews #security #electronics