AI Gun Detection System Blows It In Nashville School Shooting
A school shooting in Nashville made a lot of headlines, but it wasn’t quite what most people picture as a school shooting these days. Yes, it was a shooting. Two people died, one of whom was apparently the shooter, and one person was injured, but it wasn’t Uvalde or Virginia Tech. It was, however, awful for everyone present that day, and an innocent person lost their life.
It wasn’t the first school shooting in Nashville in recent years, either.
After a shooting at the Covenant School, a lot of places stepped up their efforts to fortify schools. This is something I’ve personally been an advocate for.
The problem is that we need to use proven strategies or, if we’re going to rely on new technology, we need proven backups as well. One of those unproven technologies we’ve talked a lot about here at Bearing Arms is AI gun detection systems, such as those deployed on the New York City subway.
I’m just not convinced they’re ready for primetime.
In Nashville, it seems that, once again, the skeptics were right.
The technology system meant to prevent school shootings failed to detect the Antioch High School shooter’s gun, an official confirms.
A Metro Nashville Public Schools spokesperson says that, based on the camera’s location and the shooter’s position relative to it, the system did not detect the weapon.
MNPS adds that the camera did trigger an alarm when law enforcement and school resource officers arrived with their weapons.
The technology, Omnilert, is an artificial intelligence (AI) gun detection system used in all Metro Schools.
Look, I like being right as much as the next guy, but I hate seeing the proof that I was right unfold like this. I’m not alone in my skepticism, either, but I’m pretty sure everyone else who had concerns feels the same way.
Omnilert is, of course, just one company. However, Evolv, the company behind the detectors in the NYC subway system, also had major problems.
A third company, ZeroEyes, has been lobbying states to restrict tax dollars to companies with certain credentials, which, coincidentally, only it holds. I don’t like the practice, but it’s possible its product would work better.
What people call AI today isn’t really artificial intelligence. Most of it is just software with a bunch of if/then statements that winnows down the possibilities very quickly. Yet, like any software, it’s garbage in, garbage out. It’s only as good as the programmers behind it, and while large language models can learn from the inputs they receive, there’s no indication this software can.
Or maybe it does.
What we do know is that in a key moment, the very moment this system was designed to prevent, it failed spectacularly.
It also seems that guns had been found on campus before, according to one parent who voiced security concerns about the school and begged for metal detectors to be installed.
That’s right. It seems the schools decided AI was all that was needed and not something tried-and-true like metal detectors.
Technology is great, and while I may be skeptical of taxpayer dollars going toward experimental technology, there’s nothing inherently wrong with trying new things. However, relying on these unproven technologies almost exclusively, as seems to have happened here, isn’t the answer.
The only backup seems to have been two school resource officers who were in a completely different part of the school when the incident happened and who arrived after the killer took his own life.
But I can’t help but wonder how things would have gone if Nashville schools had respected teachers’ right to keep and bear arms and an armed staff member had been present. Sure, the shooter would probably still have died, but no one else would have.