A US teenager was handcuffed by armed police after an artificial intelligence (AI) system mistakenly reported that he was carrying a gun when in fact he was holding a bag of chips.
“The police showed up, eight police cars, and then they all came out with guns pointed at me and talked about getting on the ground,” 16-year-old Baltimore student Taki Allen told local publication WMAR-2 News.
The Baltimore County Police Department said its officers “responded appropriately and proportionately based on the information provided at that time.”
It said an AI alert had been sent to human reviewers, who found no threat, but the principal missed the cancellation and contacted the school security team, who eventually called the police.
But the incident has prompted some calls to review school procedures for using such technology.
Mr Allen told local news that he had finished a pack of Doritos after football practice and put the empty pack in his pocket.
Armed police arrived 20 minutes later, he said.
“He told me to get on my knees, arrested me and handcuffed me,” he said.
Baltimore County police told BBC News that Mr Allen was handcuffed but not arrested.
“The incident was resolved safely after it was determined there was no threat,” the statement said.
Mr Allen said he now waits inside after football practice because he doesn't think it's “safe enough to go outside, especially eating a bag of chips or drinking anything.”
In a letter to parents, Principal Kate Smith said the school's safety team “quickly reviewed and canceled the initial alarm after being satisfied there was no weapon.”
“I contacted our school resource officer (SRO) and told him about it, and he reached out to the local precinct for additional support,” she said.
“Police officers arrived at the school, searched the student and quickly confirmed that he did not have any weapons.”
However, local politicians have called for further investigation into the incident.
“I encourage Baltimore County Public Schools to review its procedures regarding its AI-based gun detection system,” Baltimore County Board Member Izzy Pakota wrote on Facebook.
Omnilert, a provider of the artificial intelligence tool, told BBC News: “We regret that this incident occurred and want to express our concern to the student and the wider community affected by the events that followed.”
The company said its system initially detected what appeared to be a firearm and its image was subsequently verified by its review team.
This, Omnilert said, was then relayed to the Baltimore County Public Schools (BCPS) security team along with additional information “within seconds” for their assessment.
The security firm said its involvement in the incident ended once it was marked as resolved in its system, adding that overall it “operated as intended.”
“Although the object was later determined not to be a firearm, the process worked as intended: prioritizing safety and awareness through rapid human screening,” the statement said.
Omnilert says it is the “leading provider” of AI-assisted gun detection, citing a number of US schools among the case studies on its website.
“Detecting firearms in the real world is no easy task,” it says.
But Mr Allen said: “I don't think any chip bag should be mistaken for a gun at all.”
The adequacy of AI to accurately identify weapons has come under scrutiny.
Last year, the American weapons-scanning company Evolv Technology was prohibited from making unsubstantiated claims about its products after it said its AI scanner, used at entrances to thousands of US schools, hospitals and stadiums, could detect any weapon.
A BBC News investigation showed those claims to be false.