AI-powered police body cameras, once taboo, get tested on Canadian city’s ‘watch list’ of faces

Police body cameras equipped with artificial intelligence have been trained to detect the faces of about 7,000 people on a “high risk” list in the Canadian city of Edmonton. It's a live test of whether facial recognition technology, shunned as too intrusive, could have a place in policing across North America.

But six years after leading body camera maker Axon Enterprise, Inc. said police use of facial recognition technology raises serious ethical issues, a pilot project launched last week is raising alarms far beyond Edmonton, the continent's northernmost city with a population of more than 1 million.

The former chairman of Axon's artificial intelligence ethics board, which forced the company to temporarily abandon facial recognition in 2019, told The Associated Press that he is concerned that the Arizona-based company is moving forward without sufficient public comment, testing and peer review of the social and privacy risks.

“It is very important not to pursue these technologies, which have very real costs and risks, until there is clear evidence of the benefits,” said former board chairman Barry Friedman, now a law professor at New York University.

Axon founder and CEO Rick Smith says the Edmonton pilot is not a product launch, but an “early stage of field research” that will evaluate the technology's effectiveness and identify the safety measures needed to ensure its responsible use.

“By conducting testing in real-world settings outside the United States, we can gather independent insights, strengthen oversight systems, and apply this knowledge to future evaluations, including in the United States,” Smith wrote in a blog post.

The pilot project is designed to help keep Edmonton patrol officers safer by allowing their body cameras to detect anyone whom authorities classify as having a “flag or warning” under categories such as “violent or aggressive; armed and dangerous; weapon; risk of escape; and a high-risk offender,” said Kurt Martin, acting superintendent of the Edmonton Police Service. There are currently 6,341 people on the watch list, Martin said at a Dec. 2 news conference. He said 724 people with at least one serious felony warrant have been added to a separate watch list.

“We really want to make sure that we're talking about people who have committed serious offenses,” said Ann-Lee Cook, Axon's director of responsible artificial intelligence.

If the pilot project expands, it could have a major impact on policing around the world. Axon, the publicly traded firm best known for developing the Taser stun gun, is the dominant supplier of body cameras in the U.S. and increasingly sells them to police agencies in Canada and other countries. Last year, Axon beat out its closest competitor, Chicago-based Motorola Solutions, for a contract to supply body cameras to the Royal Canadian Mounted Police.

Motorola said in a statement that it, too, can integrate facial recognition technology into police body cameras but, based on its ethical principles, “intentionally refrains from using this feature for proactive identification,” though it left open the possibility of doing so in the future.

In 2023, the Alberta government introduced body cameras for all police agencies in the province, including its capital, Edmonton, calling it a transparency measure to document police interactions, collect better evidence and speed up turnaround times for investigations and complaints.

While many communities in the U.S. have also welcomed body cameras as an accountability tool, the prospect of real-time facial recognition identifying people in public places has been unpopular across the political spectrum. A backlash from civil liberties advocates and a broader conversation about racial injustice helped push Axon and Big Tech companies to halt sales of facial recognition software to police.

Among the biggest concerns were studies showing that the technology was flawed and produced biased results based on race, gender and age. It also could not match faces in real-time video streams as accurately as it could match faces posed for ID cards or captured in still police photos.

Several U.S. states and dozens of cities have moved to restrict police use of facial recognition, although President Donald Trump's administration is now trying to block or dissuade states from regulating AI.

The European Union has banned police use of real-time facial scanning in public across the 27-nation bloc, except in cases involving serious crimes such as kidnapping or terrorism.

But in the United Kingdom, which is no longer part of the EU, authorities began testing the technology on the streets of London a decade ago and have used it to make 1,300 arrests in the past two years. The government is considering expanding its use across the country.

Many details of Edmonton's pilot project have not been publicly disclosed. Axon does not create its own artificial intelligence model for facial recognition, but declined to say which third-party vendor it uses.

Edmonton police say the pilot project will run until the end of December and will operate only during daylight hours.

“Apparently it gets dark pretty early here,” Martin said. “Lighting conditions, low temperatures in the winter, all of these factors will influence what we consider in terms of a successful proof of concept.”

Martin said the 50 or so officers testing the technology won't be told whether the software has made a facial recognition match. The results will be analyzed later at the station. In the future, however, the technology could help police detect whether a potentially dangerous person is nearby so they can call for help, Martin said.

Scanning is only expected to happen if officers have begun an investigation or are responding to a call, rather than simply walking through a crowd. Martin said officers responding to a call can switch their cameras from passive to active recording mode, producing higher-resolution images.

“We really want to respect people's rights and privacy interests,” Martin said.

Alberta Information and Privacy Commissioner Diane McLeod's office said it received a privacy impact assessment from Edmonton police on Dec. 2, the same day Axon and police officials announced the program. The office said Friday it is now reviewing the assessment, a requirement for projects that collect “highly sensitive” personal data.

University of Alberta criminology professor Temitope Oriola said he's not surprised the city is experimenting with real-time facial recognition, given that the technology is already commonly used in airport security and other environments.

“Edmonton is the laboratory for this tool,” Oriola said. “This may well be an improvement, but we don't know for sure.”

Oriola said the police service had sometimes had a “chilly” relationship with Indigenous and Black residents, particularly after officers fatally shot a member of the South Sudanese community last year, and that it remained to be seen whether facial recognition technology would make policing safer or improve interactions with the public.

Axon has faced backlash over its technology before: in 2022, Friedman and seven other members of Axon's AI ethics board resigned in protest over concerns about a drone equipped with a stun gun.

In the years since Axon abandoned facial recognition, the company has “continued controlled laboratory studies” of the technology, which has “become significantly more accurate” and is now ready for real-world testing, according to Axon CEO Smith.

But Axon acknowledged in a statement to the AP that all facial recognition systems are affected by “factors such as distance, lighting and angle, which can disproportionately affect accuracy for people of color.”

Each match requires human review, Axon said, and part of its testing also “examines what training and supervision human reviewers should have to mitigate known risks.”

Friedman said Axon should disclose those findings. He would like to see more evidence that facial recognition has improved, since his board concluded it was not reliable enough to ethically justify its use in police cameras.

Friedman said he is also concerned that police agencies are allowing the technology to be used without discussion by local lawmakers or rigorous scientific testing.

“This decision should not just be made by police agencies and certainly not by vendors,” he said. “The pilot project is a great idea. But there needs to be transparency and accountability. … There's none of that here. They're just moving forward. They found an agency that's willing to move forward, and they're just moving forward.”

AP writer Kelvin Chan in London contributed to this report.