Police officers stopped Randal Quran Reid on a highway outside of Atlanta and told the 29-year-old he had two theft warrants in Louisiana. One problem: Reid had never been in the state.
Nevertheless, Reid, who was on his way to a Thanksgiving gathering with his mother, was jailed for almost a week and had to spend thousands of dollars to be released.
The mistaken identity reportedly stemmed from faulty facial recognition software.
As the New York Times noted, Reid discovered that he was suspected of the crime because he “bore a resemblance to a suspect who had been recorded by a surveillance camera.”
As the Times-Picayune states, “[t]echnology has given police vast reach to compare the faces of criminal suspects against a trove of mug shots, driver’s licenses, and even selfies plucked from social media.”
The algorithm’s match between Reid and the suspect was taken at face value to secure the warrant for Reid’s arrest; no one questioned its accuracy. Reid was never told the technology had been used, and none of the documents supporting his arrest mentioned that facial recognition had produced the match.
The faulty technology is raising concerns among advocates.
“It’s untenable to me as a matter of basic criminal procedure that people who are subject to arrest are not informed of what got them there,” constitutional law professor Barry Friedman told the New York Times.
“In a democratic society, we should know what tools are being used to police us,” said ACLU lawyer Jennifer Granick.
Reid was able to retain an attorney, and the warrants were dropped. However, other wrongful arrests have been made using the technology, including four publicly known cases “that appear to have involved little investigation beyond a face match, all involving Black men,” a Georgetown Law researcher told the Times.
This reflects a wider problem. As Harvard scholar Alex Najibi noted:
Police use face recognition to compare suspects’ photos to mugshots and driver’s license images; it is estimated that almost half of American adults – over 117 million people, as of 2016 – have photos within a facial recognition network used by law enforcement. This participation occurs without consent, or even awareness, and is bolstered by a lack of legislative oversight.
With policing already prone to racial discrimination, it is alarming that technology appears to be furthering that discrimination rather than minimizing it.
As Najibi stated, “the current implementation of these technologies involves significant racial bias, particularly against Black Americans.”