It’s one thing to use these systems in K-12 schools, where firearms are generally prohibited, but what happens when they are rolled out in places where we can exercise our Second Amendment rights? Even if AI successfully determines someone has a pistol, it can’t know whether that person is carrying lawfully. Suspicion of carrying a firearm alone shouldn’t be reason enough to stop and search someone, but that doesn’t mean it doesn’t happen, especially in places where concealed carry licensees are uncommon.
Superintendent Defends Detection System That Misidentified Bag of Chips as a Gun
Superintendent Dr. Myriam Rogers defended Baltimore County’s AI detection system after it misidentified a bag of chips as a gun, resulting in a 16-year-old being ordered to the ground at gunpoint.
WMAR reported that 16-year-old Taki Allen was waiting outside his school after football practice. While waiting, he ate a bag of Doritos, then stuffed the empty bag into his pocket.
About 20 minutes later, police arrived on the scene in response to a warning sent by the school’s AI detection system.
Allen said, “Police showed up, like eight cop cars, and then they all came out with guns pointed at me talking about getting on the ground. I was putting my hands up like, ‘what’s going on?’ He told me to get on my knees and arrested me and put me in cuffs.”
The detection system had flagged the empty bag, labeling it a gun.
Superintendent Rogers defended the system: “The program is based on human verification and in this case the program did what it was supposed to do which was to signal an alert and for humans to take a look to find out if there was cause for concern in that moment.”
TechCrunch noted the system alert had actually been canceled upon review, but the principal, who had not learned of the cancellation, reported the alert to the school resource officer. The resource officer subsequently called local police.
