
Can the Biases in Facial Recognition Be Fixed; Also, Should They?

By Paul Marks

Communications of the ACM, Vol. 64 No. 3, Pages 20-22


In January 2020, Robert Williams of Farmington Hills, MI, was arrested at his home by the Detroit Police Department. He was photographed and fingerprinted, had his DNA taken, and was then locked up for 30 hours. His crime? He had not committed one; a facial recognition system operated by the Michigan State Police had wrongly identified him as the thief in a 2018 store robbery. Williams, however, looked nothing like the perpetrator captured in the surveillance video, and the case was dropped.

A one-off case? Far from it. Rewind to May 2019, when Detroit resident Michael Oliver was arrested after being identified by the very same police facial recognition unit as the person who stole a smartphone from a vehicle. Again, however, Oliver did not even resemble the person pictured in a smartphone video of the theft. His case, too, was dropped, and Oliver has filed a lawsuit seeking reputational and economic damages from the police.



ACM, through its US Technology Policy Committee, urged the suspension of facial recognition technologies on June 30, 2020. The news item linking to the statement is here: https://www.acm.org/articles/bulletins/2020/june/ustpc-statement-on-facial-recognition-technologies

Ricardo Baeza-Yates
ACM Fellow and member of the ACM US TPC Subcommittee on AI & Algorithms.
