The Hidden Role of Facial Recognition Tech in Many Arrests
Quinn says the spread of facial recognition technology has led investigators to believe there will be suitable digital evidence in every case, similar to the way the TV show CSI led people to believe there would always be DNA or physical forensic evidence. In reality, security camera images can be grainy, low quality, or shot from odd angles, and poor lighting can make a good match difficult.
Given widespread mistrust of police in some areas, “we really need to put it out there and help educate our communities as to the value of this stuff and how we’re using it,” Quinn says. Referring to bans on facial recognition use in some cities, he says it otherwise “becomes very easy to discuss these technologies in terms of all or nothing.”
As more states and cities consider restricting the technology, a September report by the Center for Strategic and International Studies, a think tank, suggests that Congress create national standards to prevent a patchwork of regulation. Lead author James Lewis says he supports facial recognition and thinks its spread is inevitable but that there should be transparency around how the technology is used in criminal investigations. Seven US states and cities, including Boston and San Francisco, have adopted full or partial bans on the use of facial recognition by government agencies. Lewis doesn’t think Congress will follow suit, in part because of the January 6 attack on the US Capitol and the ensuing investigation, saying, “I think that’s influential, when you have to hide in a closet.”
An analysis by the Human Rights Law Review at Columbia University concluded that “defendants face meaningful barriers to challenging” the technology and called on Congress to pass a law requiring disclosure. The report also called for procedural safeguards, such as regular testing and a minimum threshold for the accuracy of facial recognition systems.
White House science and tech policy leaders endorsed more disclosure around the use of artificial intelligence as part of an AI Bill of Rights last fall. Regulation of facial recognition technology has drawn bipartisan support in Congress, but there are no federal restrictions on use of the tech by law enforcement, despite a documented lack of guardrails for federal agencies using the tech.
The National District Attorneys Association (NDAA) says it instructs its more than 5,000 members to use “professional judgment and discretion” when it comes to divulging the use of facial recognition and to consider issues like public safety, privacy, and relevance when making these decisions. NDAA officials did not respond to requests for examples of how disclosing facial recognition use in a criminal investigation could threaten public safety.
“The longer things remain secret, the harder it is to challenge them, and the harder it is to challenge them, the longer police go without courts putting limits on what they can do,” says Nathan Wessler, who leads the Speech, Privacy, and Technology Project at the ACLU.
An Attempt to Learn More
Defense attorneys say their best hope of getting police and prosecutors to reveal that facial recognition helped identify a suspect rests on a 1963 Supreme Court decision. In Brady v. Maryland, the court ruled that the prosecution must turn over to a defendant any evidence in its possession that could exonerate that defendant.
The best-known case involving facial recognition and the Brady decision is that of Willie Allen Lynch, a Florida man convicted in 2016 of selling $50 in crack cocaine, based in part on facial recognition, and sentenced to eight years in prison. During his trial, Lynch, who represented himself for part of the proceedings, argued he should be able to cross-examine a crime analyst who had performed the facial recognition scan and sent a single photo of Lynch to investigators. In a pretrial deposition, the analyst testified that she didn’t fully understand how the facial recognition program worked.
In December 2018, a Florida appeals court denied Lynch’s appeal, ruling that he had failed to demonstrate on Brady grounds that materials such as photos of other potential matches would have changed the outcome of the trial.
Lynch then appealed to the Florida Supreme Court, seeking more information about how facial recognition was used in his case, including pictures of other potential matches and the software behind the algorithm. The appeal was supported by groups including the ACLU, Electronic Frontier Foundation, Georgetown Law Center on Privacy and Technology, and the Innocence Project. They argued that the uncertainty in facial recognition results should be treated like that of an eyewitness who says they aren’t sure they would recognize the person who committed a crime. The Florida Supreme Court declined to hear the case.