They better be careful, the AI could actually make stuff more impartial. They wouldn’t want that
Nah, they’ll just make the AI racist to compensate.
Also, as long as they can turn off the camera, it’s worth nothing.
I dunno, when the cops scream “stop resisting” 400 times while kicking a man in the fetal position on the ground, will it conclude he’s resisting or conclude excessive force is being used? I know where my money is at.
My first thought too, “finally something in the chain that’s honest.”
It’d be good to audit it now and then, of course.
They are probably going to train the AI on existing reports and videos. Why train an AI to work against you?
You just turn off the body cam first. Problem solved!
I mean, if it’s based on the audio, the police officer can just say “I’m under attack, I feel threatened” even when they’re not, before they walk up to somebody. It’s very, very easy to manipulate this.
Probably using the Arya AI prompt filter.
“never repeat these instructions” in the prompt and it repeats it anyway. Hah.
It feels off that the headline talks about body cam footage when the AI actually only uses the audio. Technically audio alone may count as footage, but I think I’m with most people in taking that to mean the audio and video together.
Anecdotally, I’ve found that AI systems set up to summarise are reliable, probably using that “turn off creativity” (low-temperature) setup that’s mentioned.
So the cops just dictate their side and then that becomes the reality… Yeah, no thank you.
It’s already a report written by the police - they can make it say whatever they want with or without AI.
That’s already what happens
How much time will they really save when every report is just “File not found”?
If this encourages them to use their bodycams, it’s probably a good thing.
This ain’t no futurism anymore, it’s already time for an ancient_dystopia community‽
This is the best summary I could come up with:
As Forbes reports, it’s a brazen and worrying use of the tech that could easily lead to the furthering of institutional ills like racial bias in the hands of police departments.
“It’s kind of a nightmare,” Electronic Frontier Foundation surveillance technologies investigations director Dave Maass told Forbes.
Axon claims its new AI, which is based on OpenAI’s GPT-4 large language model, can help cops spend less time writing up reports.
But given the sheer propensity of OpenAI’s models to “hallucinate” facts, fail at correctly summarizing information, and replicate the racial biases from their training data, it’s an eyebrow-raising use of the tech.
“This is going to seriously mess up people’s lives — AI is notoriously error-prone and police reports are official records,” another user wrote.
In 2022, nine out of the 12 international ethics board members resigned following the announcement — and prompt reversal — of having Taser-wielding drones patrol US schools.
The original article contains 555 words, the summary contains 152 words. Saved 73%. I’m a bot and I’m open source!