Los Angeles Police Commission President Eileen Decker said Tuesday that she will direct a subcommittee to prepare a report on law enforcement officers’ use of facial recognition software to identify potential suspects.
Decker said she wants the commission subcommittee to work with the Los Angeles Police Department to report on what oversight of the software is in place and to analyze policies other cities have adopted with regard to facial recognition.
“I’m aware a number of cities have taken different actions,” Decker said. “Of course, the state of California prohibits the use of body-worn (cameras) for the purpose of facial recognition, but I think it’s a good time to take a global look at this issue and to have the subcommittee do a deeper dive on this and bring back to the commission whatever might be necessary.”
The Los Angeles Times reported Monday that LAPD officers have run facial recognition software nearly 30,000 times in the last 11 years to help track down suspects, despite concerns about its reliability, especially when identifying people of color.
Hundreds of officers have worked to match images from surveillance footage and elsewhere against a database of nearly 9 million booking photos, even as the department has denied using the technology, according to the newspaper.
LAPD Assistant Chief Horace Frank told The Times the disparity was due to error, not any attempt to cover up the department’s use of facial recognition, saying that he had testified to its use in front of the Police Commission two years ago. Civil rights activists said the department only corrected earlier misstatements after The Times questioned their veracity.
At Tuesday morning’s Police Commission meeting, LAPD Chief Michel Moore said the system is used to recognize a person’s facial structure and does not profile suspects based on race. Moore said only officers who have received training are permitted to use the software.
He said using the system must be “tied to a specific criminal investigation.”
“There is a login system that tracks access to the system by whom and for what purpose,” Moore said, adding that if there were any allegations of misuse, the LAPD would investigate them as a personnel matter.
LAPD spokesman Josh Rubenstein told The Times he could not establish how often the use of facial recognition technology resulted in an arrest, but said the software has helped identify perpetrators of gang crimes where witnesses were afraid to come forward. He said the technology had been used by investigators seeking to arrest individuals involved in arsons, burglaries and other crimes committed during protests earlier this summer.
“(It is) only used to develop investigative leads, not to solely identify a suspect in a crime,” Rubenstein told the newspaper. “No individuals are arrested by the LAPD based solely on facial recognition results.”
An internal department memo quoted by The Times states the technology “shall not be utilized to establish any database or create suspect identification books … Additionally, FRT shall not be used as a general identification tool, when there is no investigative purpose, or as the sole source of identification for a subject’s identity.”
A state law that took effect in January bans the software from being used with police body cameras. In advocating for that law, the American Civil Liberties Union of Southern California pointed to an August 2019 test in which the software inaccurately “matched” 26 California legislators with mugshots.
Last year, San Francisco prohibited the use of facial recognition technology.
Civil liberties advocates say the software systems are less accurate for people of color, women and children. An M.I.T. study reported by The New York Times in 2018 concluded that racial disparities exist because the artificial intelligence is “taught” using a supply of photos featuring many more white men than Black women, for example.
The technology used by the LAPD is provided by DataWorks Plus, a South Carolina company that moved into facial recognition from mugshot services. In 2019, a federal study of more than 100 facial-recognition systems, including software used by DataWorks Plus, determined that they falsely identified Black and Asian faces 10 to 100 times more often than white faces, with the highest false-positive rates among Native Americans, according to The Times.
Todd Pastorini, executive vice president and general manager of DataWorks Plus, told The Times he has worked with forensic facial-recognition systems in California for more than a decade, and has “never observed a racial bias.”
Pastorini said his company has installed three facial-recognition search engines for the Los Angeles County Regional Identification System, or LACRIS. The technology returns the top 250 potential facial matches for each image submitted by a police officer, leaving the officer to follow up on those leads, he said.
Rubenstein said 330 LAPD employees have access to the county system, which is provided to the LAPD at no cost, and that the department does not use the software to scan crowds or in any live-streaming capacity. He said the department regularly audits its use of the system and that he is not aware of any instances in which it has been misused by LAPD officials.
LACRIS is used by dozens of county law enforcement agencies, but it is not known how many specifically utilize the facial recognition software. The Los Angeles County Sheriff’s Department declined to answer The Times’ questions about its use of the system.