
Washington (AFP) – Police equipment manufacturer Axon said Thursday it decided against deploying facial recognition on its body cameras after an ethics review found the technology “is not yet reliable enough.” 

Axon, formerly known as Taser and maker of the eponymous nonlethal police weapon, made its decision after a report by its ethics board that came amid intense debate over the use of face matching technology for law enforcement and surveillance.

“Current face matching technology raises serious ethical concerns,” chief executive and founder Rick Smith said in a statement.

“In addition, there are technological limitations to using this technology on body cameras. Consistent with the board’s recommendation, Axon will not be commercializing face matching products on our body cameras at this time.”

Smith said face matching technology “deserves further research” to help understand issues around error and algorithmic bias in the technology, which is based on artificial intelligence.

“Our AI team will continue to evaluate the state of face recognition technologies and will keep the board informed about our research.”

The Axon board report said face recognition technology “is not currently reliable enough to ethically justify its use on body-worn cameras” and added, “at the least, face recognition technology should not be deployed until the technology performs with far greater accuracy and performs equally well across races, ethnicities, genders, and other identity groups.”
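The board’s benchmark of performing “equally well across races, ethnicities, genders, and other identity groups” is typically checked by measuring error rates separately for each group. The sketch below is a minimal, hypothetical illustration of that kind of audit; the group labels, trial data, and the false-match metric are invented for the example and are not drawn from the Axon report.

```python
# Hypothetical sketch of a per-group error-rate audit for a face matching system.
# The trial data below is invented purely for illustration.

from collections import defaultdict

# Each record: (group label, system said "match", ground truth is same person)
trials = [
    ("group_a", True, True), ("group_a", True, False), ("group_a", False, False),
    ("group_b", True, True), ("group_b", False, True), ("group_b", True, False),
]

false_matches = defaultdict(int)    # system matched two different people
impostor_trials = defaultdict(int)  # trials where the pair was different people

for group, predicted_match, same_person in trials:
    if not same_person:
        impostor_trials[group] += 1
        if predicted_match:
            false_matches[group] += 1

# A system that "performs equally well" should show similar rates across groups.
for group in sorted(impostor_trials):
    rate = false_matches[group] / impostor_trials[group]
    print(f"{group}: false match rate = {rate:.2f}")
```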

Facial recognition technology uses a scan of a person’s face to create a mathematical template that is compared against a database. It can be used to unlock a smartphone or vehicle, pay in retail stores, verify identities at bank machines or develop customized fashion or beauty recommendations.
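For readers curious about the matching step described above, the sketch below is a simplified, hypothetical illustration: the face scan is reduced to a numeric template (an “embedding”) and compared against stored templates with a similarity score. Real systems derive these templates from neural networks; the vectors, names, and threshold here are made up.

```python
# Hypothetical illustration of comparing a face template against a database.

import math

def cosine_similarity(a, b):
    """Similarity between two templates; values near 1.0 indicate a close match."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def best_match(probe, database, threshold=0.9):
    """Return the closest identity in the database, or None if nothing clears the threshold."""
    best_name, best_score = None, threshold
    for name, template in database.items():
        score = cosine_similarity(probe, template)
        if score > best_score:
            best_name, best_score = name, score
    return best_name, best_score

# Toy database of stored templates (invented numbers).
database = {
    "alice": [0.12, 0.98, 0.31],
    "bob":   [0.85, 0.10, 0.52],
}

probe = [0.11, 0.97, 0.33]  # template extracted from a new face scan
print(best_match(probe, database))
```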

But its use in law enforcement has created the greatest outcry among rights activists because of the potential for errors and mismatches, and because the technology relies on vast databases that may have little or no oversight.

San Francisco’s board of supervisors voted earlier this year to ban the use of facial recognition by law enforcement.

A 2016 study by Georgetown University researchers found about 64 million Americans were included in at least one facial recognition database, forming “a virtual line-up,” even though most had no criminal record, and that there was little knowledge of whether these systems were accurate or unfairly impacted minorities.
