this post was submitted on 05 Feb 2024
324 points (98.8% liked)

Technology

[–] Dukeofdummies@kbin.social 4 points 5 months ago (2 children)

I am so confused by this. Why does there need to be AI involved in this at all?

If somebody has a complaint, pull the footage; then the complainant goes over it and makes their case against the police officer. Why would an AI be necessary to surface incidents that nobody is complaining about?

I feel like it's a technology solution for what should be a "more transparency and a better system" solution. Make filing complaints easier and reduce the fear factor of making them.

[–] bhmnscmm@lemmy.world 20 points 5 months ago (1 children)

The people most likely to be abused by police are the least likely to be able or willing to file a formal complaint.

[–] Dukeofdummies@kbin.social -5 points 5 months ago (1 children)

So fix that. Don't make an AI to dole out justice against police like some messed-up lottery. This is such a hollow solution in my mind. AI struggles to identify a motorcycle, yet people expect it to identify abuse?

[–] quirzle@kbin.social 10 points 5 months ago* (last edited 5 months ago) (1 children)

> So fix that.

Were it so simple, it would have been fixed decades ago. The difference is that having AI review the footage is actually feasible.

[–] Dukeofdummies@kbin.social 1 points 5 months ago (1 children)

Until you realize that the people who make the final call on whether something the AI flagged really went too far are the exact same people making that decision now. All we've succeeded in doing is creating a million-dollar system that makes it look like they're trying to change.

[–] quirzle@kbin.social 6 points 5 months ago

So what's your proposed solution? Your directive to "fix that" was a bit light on details.

This is a step in the right direction. The automated reviews will supplement, not replace, the review triggered by the manual reports you supported in your initial comment. I'd argue the pushback from police unions is a sign that it actually might lead to some change, given the reasoning they give in the article.

[–] RobotToaster@mander.xyz 4 points 5 months ago* (last edited 5 months ago)

I'm just theorising about how AI could be used, but consider a situation where someone makes a complaint yet doesn't remember the exact time of the incident (say they can only narrow it down to a six-hour window), or what the officer looked like.

Say there are (for example) 20 officers on duty it could potentially have been; across a six-hour window that's 120 hours, or 5 days, of footage. An AI can use facial recognition to find the complainant within minutes.
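
For what it's worth, here's a minimal sketch of that kind of search, assuming the open-source face_recognition and OpenCV libraries; the file names and sampling interval are purely illustrative, not anything from the article:

```python
# Rough sketch: scan hours of bodycam video for frames containing the
# complainant's face. Uses the open-source face_recognition and OpenCV
# libraries; file paths and the sampling interval are hypothetical.
import cv2
import face_recognition

# Reference photo of the complainant (hypothetical file; assumes one clear face).
reference = face_recognition.load_image_file("complainant.jpg")
reference_encoding = face_recognition.face_encodings(reference)[0]

def find_matches(video_path, sample_every_s=2.0, tolerance=0.6):
    """Return timestamps (in seconds) of sampled frames where the face matches."""
    matches = []
    cap = cv2.VideoCapture(video_path)
    fps = cap.get(cv2.CAP_PROP_FPS) or 30.0
    step = max(1, int(fps * sample_every_s))  # only check one frame every few seconds
    frame_idx = 0
    while True:
        cap.set(cv2.CAP_PROP_POS_FRAMES, frame_idx)
        ok, frame = cap.read()
        if not ok:
            break
        rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)  # OpenCV gives BGR, library expects RGB
        for encoding in face_recognition.face_encodings(rgb):
            if face_recognition.compare_faces([reference_encoding], encoding, tolerance)[0]:
                matches.append(frame_idx / fps)
                break
        frame_idx += step
    cap.release()
    return matches

# Check each on-duty officer's footage and report candidate timestamps.
for path in ["officer_01.mp4", "officer_02.mp4"]:  # hypothetical clips
    for t in find_matches(path):
        print(f"{path}: possible match at {t:.0f}s")
```

Sampling one frame every couple of seconds keeps it to a few thousand comparisons per hour of footage rather than the roughly hundred thousand frames actually recorded, which is what makes churning through 120 hours practical.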