This post was submitted on 24 Oct 2024
I mean, in terms of performance, I'd be more concerned about the false positive rate than the false negative rate, given the context. Like, if you miss a gun, whatever. That's at worst just the status quo, which has been working; some money gets wasted on the machine. But if you're incorrectly stopping more than 1 in 25 New Yorkers from getting on their train, and you apply that rate to all subway riders, that sounds like a monumental mess.
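To put a rough number on that, here's a minimal back-of-the-envelope sketch in Python. The daily-ridership figure is an assumed round number for illustration, not something from the comment or the article:

```python
# Back-of-the-envelope scale of a >1-in-25 false positive rate
# if applied to every subway rider.

daily_riders = 3_500_000      # assumption: rough NYC subway weekday ridership
false_positive_rate = 1 / 25  # "more than 1 in 25", per the comment above

false_alarms_per_day = daily_riders * false_positive_rate
print(f"Stops per day if every rider were scanned: {false_alarms_per_day:,.0f}")
# -> 140,000 riders per day wrongly flagged, each needing a bag check or pat-down
```

Even if the real scan volume were a fraction of that, the number of innocent people stopped would dwarf the number of weapons found.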
With how trigger-happy police are, the false positives would lead to more deaths than the scanners prevent. And police would claim it's justified because the machine told them so.
Facial recognition confirmed he was a criminal and the scanner confirmed he had a gun! Of course we opened fire instantly. How could we have known it was just some guy with a water bottle?
We used advanced colorimetry to determine he was a criminal!
False positives are fine because they only happen to poors.
"Not Hotdog"