Warning: I saw a facial recognition system flag the wrong person at a city council meeting

I was at a public meeting in Springfield six months ago where the police were testing a new AI system to scan attendees. It matched a local teacher to a 'person of interest' from a different state, based on a 70% match score. The officer had to pull him aside in front of everyone, causing a huge scene and delaying the meeting for 20 minutes. The vendor later admitted the training data was flawed and biased toward certain facial structures. This wasn't just a glitch; it was direct harm to an innocent person's reputation and rights. Has anyone else seen public AI tools fail this badly in real time?
4 comments

quinn_burns
How is this even legal? I mean, a 70% match is basically a coin flip, and they're treating it like a sure thing. It just proves these systems aren't ready to be used on real people.
1
vera195 · 24d ago · Top Commenter
Right, because a 70% match is a sure thing. That's like saying a weather app is always right because it guessed rain once. They act like the computer said it, so it must be true. Never mind the real person on the other side of that percentage. It's lazy. They just want a quick answer, not the right one.
4
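The base-rate point in that comment can be made concrete with a quick back-of-the-envelope sketch. Every number here (watchlist size, attendance, per-comparison false-match rate) is a hypothetical assumption for illustration, not a figure from the incident:

```python
# Hypothetical base-rate sketch: even a small per-comparison false-match
# rate produces many wrong flags when each face is compared against a
# large watchlist. All numbers below are assumptions.

watchlist_size = 10_000        # hypothetical out-of-state watchlist entries
attendees = 200                # hypothetical meeting attendance
false_match_rate = 0.00001     # hypothetical per-comparison false-positive
                               # rate at a loose match threshold

# Probability that one innocent attendee matches at least one watchlist entry
p_flagged = 1 - (1 - false_match_rate) ** watchlist_size
expected_false_flags = attendees * p_flagged

print(f"P(innocent attendee flagged): {p_flagged:.3f}")
print(f"Expected false flags in the room: {expected_false_flags:.1f}")
```

Under these made-up numbers, roughly one in ten innocent attendees would trip the system, which is the "coin flip treated as a sure thing" problem in miniature: the headline match score says nothing about how often the system fires on the wrong person when scanned against thousands of candidates.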
reese_rodriguez86
Remember when those DNA tests said I was part velociraptor?
1
faith_hart20
What happens when they lower the match threshold to save money? They'll just arrest more innocent people based on even worse guesses.
1