In February, a woman told the police that a delivery man had exposed himself to her in a Manhattan building. He was about 5 feet 6 inches tall.
Two months later, evidence shows, the police arrested the wrong man. He was 6-foot-2.
The man, Trevis Williams, was driving from Connecticut to Brooklyn on the day of the crime, and location data from his phone put him about 12 miles away at the time. But a facial recognition program plucked his image from an array of mug shots and the woman identified him as the flasher.
Like Mr. Williams, the culprit was Black, had a thick beard and mustache, and wore his hair in braids. Physically, the two men had little else in common. Mr. Williams was not only taller; he also weighed 230 pounds, while the victim said the delivery man appeared to weigh about 160. But Mr. Williams still spent more than two days in jail in April.
“In the blink of an eye, your whole life could change,” Mr. Williams said.
The algorithms that run facial recognition technology can outstrip fallible human eyewitnesses, and law enforcement agencies say the results are not decisive on their own. But the case against Mr. Williams, which was dismissed in July, illustrates the perils of a powerful investigative tool that can lead detectives far astray.
Researchers at the National Institute of Standards and Technology have found in repeated testing that facial recognition technology identifies the correct person a vast majority of the time. But that research typically involved images taken under controlled conditions, not the grainy, blurry images of surveillance footage.
Nationwide, at least 10 people have been wrongly arrested after being identified with facial recognition technology, according to news reports. “We’ve seen this over and over across the country,” said Nathan Wessler of the American Civil Liberties Union. “One of the primary dangers of this technology is that it often gets it wrong.”
The New York Police Department uses the technology thousands of times a year, but does not tally its successes and failures. Calling facial recognition an essential tool, a spokesman said that the department never relied on it alone to make an arrest, and that the victim’s identification of Mr. Williams provided the evidence to charge him.
Other police departments, in Detroit and Indiana, require investigators to gather more facts before putting a suspect identified by facial recognition into a photo lineup. The New York Police Department does not have that rule.
When New York detectives questioned the victim, she said she had seen the man who flashed her before. She described him as an Amazon worker who delivered packages around the area of the apartment building on East 17th Street where she worked cleaning and removing recycling and trash, according to an interview and police documents.
The man, who wore thick, black-framed glasses, made her uncomfortable, staring at her and lingering in the halls. At 4:30 p.m. on Feb. 10, the woman was cleaning when she looked up and saw the man reflected in a hallway mirror. He was staring at her, his pants undone and genitals exposed.
The woman said that she had screamed and that the man had fled. The New York Times is withholding her name because she was the victim of a sex crime.
At the same time, Mr. Williams, 36, was parking his car in the Marine Park neighborhood of Brooklyn, where he was meeting a friend. He had just come back from his job in Connecticut, where he worked with autistic adults.
A week earlier, he had been arrested on a misdemeanor assault charge and photographed for a mug shot. Mr. Williams said he had gotten into a fight with a man who had been dating his ex-girlfriend. That case was dismissed in March.
But his image was still in the department’s system when the police received the call about the flasher on East 17th Street. The police pulled surveillance video, interviewed the woman and canvassed the area, looking for more witnesses, according to police documents.
They also had an investigator run a facial recognition search.
That meant uploading a still from the surveillance footage to a system that uses algorithms to render a face’s contours into data points, and then looks for the faces that are the most statistically similar. The Police Department is supposed to limit those searches to people who have been arrested in New York or New Jersey, according to the agency’s inspector general.
Ultimately, a human selects the best candidate. The examiner chose Mr. Williams and generated a report that warned that he was only a “possible match” and that alone was “not probable cause to arrest.”
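The process described above amounts to converting each face into a list of numbers and ranking stored mug shots by statistical similarity. The sketch below is purely illustrative, not the Police Department's actual software: the candidate names, the four-dimensional vectors, and the use of cosine similarity are all assumptions made to show the general idea.

```python
import math

# Illustrative sketch of an embedding-based face search. In a real system,
# a neural network converts each face image into a high-dimensional vector
# ("embedding"); the short vectors here are made up for demonstration.

def cosine_similarity(a, b):
    """Statistical similarity between two embeddings (1.0 = same direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def search(probe, gallery, top_k=3):
    """Rank stored mug-shot embeddings by similarity to the probe embedding."""
    scored = sorted(
        ((name, cosine_similarity(probe, emb)) for name, emb in gallery.items()),
        key=lambda pair: pair[1],
        reverse=True,
    )
    return scored[:top_k]

# Hypothetical gallery of mug-shot embeddings. Two entries score almost
# identically against the probe, which is why the output is a ranked list
# of "possible matches" for a human examiner to review, not a definitive
# identification.
gallery = {
    "candidate_a": [0.90, 0.10, 0.30, 0.50],
    "candidate_b": [0.20, 0.80, 0.60, 0.10],
    "candidate_c": [0.85, 0.15, 0.35, 0.45],
}
probe = [0.88, 0.12, 0.32, 0.48]  # embedding from a surveillance still

for name, score in search(probe, gallery):
    print(name, round(score, 3))
```

A low-quality surveillance still shifts the probe vector, so several gallery entries can land within a hair of one another, which is one reason the report generated in such searches warns that the result is only a possible match.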
His photo was then placed among six pictures — all Black men with dreadlocks and facial hair — that investigators showed to the victim.
She picked Mr. Williams’s photo, image No. 2, and signed her name underneath, according to police documents. “Confident it is him,” a detective wrote in the case report.
On April 21, the police caught Mr. Williams entering the subway through an exit gate in Brooklyn, and learned he was wanted for questioning in the Feb. 10 episode. They took him into custody.
When they interrogated him, Mr. Williams told the police that he had started making deliveries for Amazon on April 1. He explained that he had worked in an Amazon warehouse during the Covid-19 pandemic and in Atlanta, but that at the time of the crime, he had been working his Connecticut job with autistic adults.
“That’s not me, man,” he said when the police showed him surveillance images of the flasher. “I swear to God, that’s not me.”
“Of course you’re going to say that wasn’t you,” the detective responded. He then asked what would happen if he pulled Mr. Williams’s employment records.
“Pull it,” Mr. Williams replied. “Please look it up.”
The police charged him the following day.
“The victim positively identified Mr. Williams,” said Brad Weekes, a spokesman for the Police Department. Mr. Weekes said the victim had told detectives she was “confident that was the same person, and only then was probable cause established to make an arrest.”
He said the victim’s photo identification, the questioning of Mr. Williams and his work history at Amazon were all part of the investigation. The police did not contact Amazon to find out the identity of the delivery man. An Amazon spokeswoman, Sharyn Ghacham, said the company would have cooperated.
In a statement, the Manhattan district attorney’s office said it could not comment because the case was sealed following its dismissal in July.
The New York police have been using facial recognition since 2011, with investigators running thousands of annual searches that have led to matches in cases as serious as rapes and murders. Mr. Weekes said that it was “factually inaccurate” to say that the police had made false arrests based on the technology.
But Legal Aid, the public defenders group that represented Mr. Williams, said that it knew of cases where the wrong person had been arrested following an identification that started with facial recognition. In 2022, the organization said, a man was accused of attempted murder and held for more than a month. The case was dismissed after the man proved that he had been elsewhere at the time, Legal Aid said.
“We are gravely concerned that the cases we have identified are only the tip of the iceberg,” Legal Aid said in a letter sent Monday to the city’s Department of Investigation asking the agency to look into police use of the technology.
The American Civil Liberties Union has called for a ban on the use of facial recognition by the police because of the risk of misidentification, said Mr. Wessler, deputy director of its Speech, Privacy and Technology Project.
Combining the technology with eyewitness identifications, which research has shown are often faulty, “compounds the problem,” said Karen Newirth, a lawyer who specializes in eyewitness identification.
In 2023, the National Institute of Standards and Technology found that the leading face recognition algorithms were able to produce a correct match for 99.9 percent of searches they performed on a mug shot database of 12 million people.
But the “accuracy may drop significantly when low-quality or uncontrolled images are used, as is often the case in real-world investigations,” said Michael King, an expert on the technology who has advised federal law enforcement on its uses and studied the report.
In Indiana and Detroit, the police are prohibited from putting facial matches in a lineup unless additional evidence — like fingerprints, DNA or cellphone data — connects a suspect to a crime.
In the case of Mr. Williams, the police failed to investigate beyond the photo array, said Mr. Williams’s lawyer, Diane Akerman. “Traditional police work could have solved this case or at least saved Mr. Williams from going through this,” she said.
A forensic analysis that Mr. Williams’s lawyers made of his phone records showed that at the time of the crime, his phone had communicated with cell towers around Brooklyn, according to data reviewed by The Times. His lawyers were prepared to use that evidence in court, but the case was dismissed before they could present it.
The victim said that Mr. Williams’s arrest had made her feel calm.
When The Times told her the charges had been dropped, that tranquillity vanished, she said.
Mr. Williams, who has a 12-year-old son, feared that the charges could get him placed on a sex offender registry, and that he would be unable to find work or pick up his child from school. He felt humiliated and angry, and he worried that everyone he knew would find out about the shameful charges.
“I despise people who do stuff like that,” he said.
He said he remained afraid he could be arrested again. “Sometimes, I just feel like I’m having panic attacks,” Mr. Williams said.
As for the public lewdness case, the police said it had been closed.
Kirsten Noyes contributed reporting.
Maria Cramer is a Times reporter covering the New York Police Department and crime in the city and surrounding areas.
Kashmir Hill writes about technology and how it is changing people’s everyday lives with a particular focus on privacy. She has been covering technology for more than a decade.
The post How the N.Y.P.D.’s Facial Recognition Tool Landed the Wrong Man in Jail appeared first on New York Times.