Facial Recognition's False Positives: How AI Misidentification Is Sending Innocent People to Jail
#Privacy

Chips Reporter

A Tennessee grandmother spent nearly six months behind bars after facial recognition software wrongly identified her as the suspect in a North Dakota bank fraud case, highlighting the growing problem of AI-driven wrongful arrests.

A Tennessee grandmother spent nearly six months in jail after police in Fargo, North Dakota, used facial recognition software to identify her as the primary suspect in a bank fraud case, according to reporting by WDAY News. The case of Angela Lipps, 50, represents the latest in a documented pattern of wrongful arrests driven by facial recognition technology deployed without adequate investigative follow-up.

Angela Lipps speaks with WDAY News during an interview about her wrongful arrest.

(Image credit: Matt Henson / WDAY)

Fargo police were investigating a series of bank fraud incidents in April and May last year, in which a woman used a fake U.S. Army ID to withdraw tens of thousands of dollars. Detectives ran surveillance footage through facial recognition software, which returned a match to Lipps. A detective then compared her Tennessee driver's license and social media images to the suspect and concluded that she was the perpetrator based on facial features, body type, and hair.

The fundamental problem: Lipps had never been to North Dakota, and bank records would later confirm she was more than 1,200 miles away at the time of the alleged crimes. Nobody from the department contacted Lipps before U.S. Marshals arrested her at gunpoint on July 14 while she was babysitting four children.

Lipps sat in a Tennessee county jail for 108 days before North Dakota officers transported her to Fargo. Her attorney, Jay Greenwood, immediately requested her bank records. When Fargo police finally met with Greenwood and Lipps on December 19, five months after her arrest, the records showed she had been buying cigarettes and depositing Social Security checks in Tennessee at the very time police placed her in Fargo. The case was dismissed on Christmas Eve, but the damage was done: released with no money, no coat, and no way home, Lipps subsequently lost her house, her car, and her dog.

This case exemplifies a systemic failure in law enforcement's use of facial recognition technology. A January 2025 Washington Post investigation documented at least eight instances of Americans wrongfully arrested after police found a possible FRT match, and in every case, investigators skipped fundamental steps like checking alibis and comparing physical descriptions that would have cleared the suspect before arrest.

The facial recognition vendors themselves, such as Clearview AI, attach explicit caveats to their systems. Clearview requires agencies to acknowledge that results "are indicative and not definitive" and that officers must conduct further research before acting on them. According to an April 2024 ACLU submission to the U.S. Commission on Civil Rights, in at least five of seven wrongful arrest cases, police had received explicit warnings that FRT results don't constitute probable cause but made arrests anyway.

Robert Williams, whose 2020 wrongful arrest in Detroit was the first publicly reported FRT false-positive case, reached a landmark settlement with the city in June 2024 that now requires independent corroborating evidence before any FRT match can be used to seek an arrest warrant. However, only 15 states had enacted any FRT legislation covering law enforcement at the start of 2025, and North Dakota is not among them.

As for Lipps, she is now back home in Tennessee, still waiting for an apology from the Fargo Police Department. Her case raises serious questions about the reliability of AI-driven identification systems used without proper verification protocols, and about the devastating consequences when technology outpaces the human judgment needed to use it responsibly.
