A 50-year-old Tennessee grandmother spent more than five months behind bars after an AI facial recognition program misidentified her as a bank fraud suspect in a state she says she has never set foot in.
According to reports, Angela Lipps was at home on July 14, 2025, babysitting four young children when a team of U.S. Marshals arrived at her door and took her into custody. The arrest was sudden and disorienting: no prior phone call, no interview, no warning of any kind. Officers placed her in handcuffs and led her away while the children she was watching looked on.

The incident traces back to a series of bank fraud cases in Fargo, North Dakota. Authorities there were investigating a woman suspected of using a fraudulent military ID to withdraw tens of thousands of dollars from multiple banks. Investigators fed surveillance footage through Clearview AI, a commercial facial recognition tool, which flagged Lipps as a potential match.
The problem? Lipps lives in Tennessee, roughly 1,500 miles from Fargo, and insists she has never once traveled to the state. “I have never been to North Dakota or even any of the surrounding states,” she said.
What followed was a months-long ordeal that upended her life entirely. Held without bail in a Tennessee county facility, Lipps says she was never interviewed by investigators or given any meaningful explanation of the case against her.
“I sat in a county jail in Tennessee for 108 days. I had no bail. No one interviewed me. I just sat and waited,” she said.
During that time, she was reportedly unable even to obtain her dentures.
On October 30, authorities flew her to North Dakota for a hearing. It was, according to Lipps, her first time ever boarding a plane. She described the experience as terrifying. But the case against her unraveled almost immediately upon arrival.
“It took five minutes for the whole thing to fall apart,” she said.
Her bank records, showing she was in Tennessee at the time of the Fargo incidents, were sufficient to establish what a basic preliminary investigation might have confirmed from the start. The charges were dismissed on Christmas Eve.
Rather than being offered assistance returning home, Lipps says she was released onto the street in the middle of winter wearing only summer clothing, with no transportation arranged.
The time she spent incarcerated had cascading consequences across every area of her life. Her rental housing was lost. Her Social Security income was cut off. Her health insurance lapsed. Her car was gone, as was her dog. Her family had placed her belongings in storage, but the bills went unpaid and the contents were reportedly forfeited.
A verified GoFundMe established on her behalf notes that Lipps had not only never visited North Dakota, but had never even been on an airplane before her involuntary trip across the country to face charges for a crime she did not commit.
At a press conference, Fargo Police Chief Dave Zibolski acknowledged that Clearview AI had been used without his knowledge. He stated he “would not have allowed [it] to be used” and confirmed the technology “has since been prohibited.” The department’s own statement conceded only that the software “identified a potential suspect with similar features to Angela Lipps.”
Civil rights advocates and legal scholars have long raised alarms about the use of facial recognition in policing, particularly when outputs from these tools are treated as grounds for arrest rather than as preliminary leads requiring verification. Misidentifications by similar systems have been documented repeatedly, with real and lasting consequences for ordinary people left without meaningful recourse while the process ran its course.
Lipps’ attorneys are currently exploring civil rights claims, though no lawsuit has been filed as of this writing.